- Currently studying at IIT Guwahati
- Passionate about AI, Quantum Computing, and Biotechnology
- I'm currently working on an exciting project: lox
- In the future, I plan to work on creating an Artificial Brain using AI and quantum mechanics
- Skilled in Python and AI development
- Cuda_ML_Library: CUDA-powered ML library that makes GPU-accelerated machine learning in Python easy and accessible.
- flash_attention: Flash Attention implementation from scratch using CUDA + Python for high-performance attention computation.
- Transformers: Transformers built from scratch with GQA, RoPE, and RMSNorm, then trained end-to-end.
- Universal_Transformers: Decoder-only Universal Transformer architecture implemented in pure Python.
- REPO-Attention: Implementation inspired by Sakana AI's RePo (Context Re-Positioning for language models).
- TRM-Tiny-Recursive-Model: Tiny recursive model implementation focused on compact, experimental ML architecture design.
- MIRNET_plus_ESRGAN: Low-light enhancement + super-resolution pipeline in Jupyter Notebook.
- More projects coming soon...
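As a tiny illustration of one of the components mentioned in the Transformers project above, here is a minimal RMSNorm sketch in NumPy (a hedged example, not code from the repo itself — function and variable names are my own):

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # RMSNorm: scale by the root-mean-square of the features.
    # Unlike LayerNorm, it does not subtract the mean or add a bias.
    rms = np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    return (x / rms) * weight

x = np.array([[3.0, 4.0]])   # one token, two features
w = np.ones(2)               # learnable gain, initialized to 1
y = rms_norm(x, w)           # divides by sqrt((9 + 16) / 2)
```

Dropping the mean-centering step makes RMSNorm cheaper than LayerNorm while working comparably well in practice, which is why it shows up in modern decoder-only transformers.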
- Email: dinmaybrahma@outlook.com
- GitHub: dino65-dev
Fun fact: I can talk about neural networks and CRISPR over coffee like it's casual small talk!
Thanks for visiting!