
March 11, 2026
A GPT-style transformer language model built from scratch in PyTorch and trained on OpenWebText using the GPT-2 tokenizer. The project demonstrates the full LLM pipeline: tokenization, transformer architecture, training optimization, and autoregressive text generation.
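The autoregressive generation step mentioned above can be sketched as a simple loop: the model predicts one token, which is appended to the context and fed back in. This is a minimal pure-Python sketch; `toy_next_token` is a hypothetical stand-in for the real PyTorch model's forward pass, which is not shown in this post.

```python
def toy_next_token(context):
    # Hypothetical stand-in for a model forward pass: predicts the
    # next token as (last token + 1) modulo a small vocabulary of 10.
    return (context[-1] + 1) % 10

def generate(prompt, steps, next_token_fn):
    """Autoregressively extend `prompt` by `steps` tokens.

    Each iteration feeds the growing token sequence back into the
    model, exactly as a GPT-style decoder does at inference time.
    """
    tokens = list(prompt)
    for _ in range(steps):
        tokens.append(next_token_fn(tokens))
    return tokens

print(generate([3], 4, toy_next_token))  # → [3, 4, 5, 6, 7]
```

In the real model, `next_token_fn` would run a forward pass over the context and sample from the resulting logits (greedy, top-k, or temperature sampling) instead of applying a fixed rule.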

February 13, 2026
I have failed many times in life. But that doesn't stop me.