I'm an AI Engineer and Researcher with a current focus on Computer Vision and Generative AI.
Learn more about my approach to AI and work experience on my personal website.
- Simplify the complex: Aim to explain challenging concepts in plain language, avoiding the dense jargon often found in academic papers.
- Avoid unnecessary work: Before starting a task, take the time to assess whether it's truly worth doing and how to approach it most efficiently.
- Organize for success: Time spent organizing and maintaining clean code and workflows pays off not just in the long term but immediately.
- Respect technological boundaries: Strive to recognize the limits of technology and avoid playing the role of omnipotent creator.
- Stay humble: A little less ego goes a long way toward better results.
- PEFT Method Overview [implementing Adapters in PyTorch]
- Physical Symbol Systems and the Language of Thought
- Building a Transformer (Cross-Attention and MHA Explained)
- Understanding the Byte-Pair Encoding Algorithm
- Can AI Achieve True Creativity?
- Python, C++, Wolfram, LaTeX, Git
- NumPy, Pandas, Matplotlib, Plotly
- PyTorch, Lightning, Hugging Face libraries, OpenCV
- MLflow, DVC, Weights & Biases, Hydra, Optuna, Prometheus, Grafana
- Flask, FastAPI, Docker, CI/CD, AWS SageMaker, Gradio, Streamlit, vLLM
- Transformer Architectures Course: Deep exploration of transformer-based architectures such as BERT, GPT, T5, and others.
- VisualTransformer: Adaptation and Fine-tuning with JAX/Flax
I'm working on adapting the Vision Transformer (ViT) model for computer vision tasks and optimizing it using JAX and Flax.
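As a rough illustration of the core step behind ViT adaptation, the input image is split into fixed-size patches that are flattened and linearly projected into token embeddings before entering the Transformer. The sketch below uses NumPy for clarity; the function name, patch size, and random projection are illustrative assumptions, not the project's actual JAX/Flax code:

```python
import numpy as np

def patchify(image, patch_size=16):
    """Split an (H, W, C) image into flattened non-overlapping patches."""
    h, w, c = image.shape
    gh, gw = h // patch_size, w // patch_size
    patches = image[: gh * patch_size, : gw * patch_size].reshape(
        gh, patch_size, gw, patch_size, c
    ).transpose(0, 2, 1, 3, 4)  # group pixels by patch
    return patches.reshape(gh * gw, patch_size * patch_size * c)

rng = np.random.default_rng(0)
image = rng.standard_normal((224, 224, 3))     # standard ViT input resolution
patches = patchify(image)                      # (196, 768): 14x14 patches, 16*16*3 each
proj = rng.standard_normal((768, 768)) * 0.02  # stand-in for the learned projection
tokens = patches @ proj                        # token embeddings fed to the Transformer
print(tokens.shape)                            # (196, 768)
```

In the actual model this projection is a learned layer (plus a class token and position embeddings); only the patch-flattening geometry is shown here.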