Nowadays I focus on low-level deep learning: ML compilers and systems
My favorite tech: C++, CUDA, Python, PyTorch, Linux, ML compilers, LLMs, Rust, WebGPU, and TypeScript
I'm a regular author for Paged Out! magazine, a cool nerdy zine covering hacking, programming, and all things computer-related
I like math and try to find time every week to learn something new. I'm currently working through "How to Prove It" by Daniel Velleman (an undergraduate intro to proofs), and it's lit
I want to be a research scientist one day, doing research at the intersection of math, AI, and low-level systems
Night job:
tiny-vllm - a high-performance LLM inference engine and younger sibling of vLLM, implemented natively in C++ and CUDA
torch-webgpu - a PyTorch compiler and WebGPU runtime
PyTorch - mainly the TorchDynamo and TorchInductor compilers; contributing less frequently as of 2026