Deep Dive into LLMs like ChatGPT with Andrej Karpathy
This is a general-audience deep dive into the Large Language Model (LLM) AI technology that powers ChatGPT and related products. It covers the full...
The CME 295 syllabus lists nine lectures from September 26 to December 5, 2025, covering the Transformer architecture, LLMs, training methods such as pretraining and LoRA, tuning via RLHF and DPO, reasoning, agents, RAG, and evaluation.
Each lecture includes slides, a Panopto recording of roughly 1:45-1:50 hours, and a YouTube video from the course playlist. Resources also include a midterm on October 24 (exam and solutions), plus a final on December 10.