What Will You Learn?
In this course, you'll build your own large language model from the ground up, covering data handling, the underlying mathematics, and transformer architectures. Everything is implemented in Python, giving you hands-on experience with the mechanics of large language models.
About This Course
Provider: YouTube
Format: Online
Duration: Approximately 6 hours
Target Audience: Beginners
Learning Objectives: Upon completing this free course, you will be able to construct a large language model from scratch, gaining expertise in data handling, mathematical concepts, and transformer architectures, all implemented in Python.
Course Prerequisites: NA
Assessment and Certification: NA
Instructor: freeCodeCamp
Key Topics: Large Language Model, Python Programming, Artificial Intelligence
Topics Covered:
- Introduction
- Install Libraries
- pylzma build tools
- Jupyter Notebook
- Download Wizard of Oz
- Experimenting with text file
- Character-level tokenizer
- Types of tokenizers
- Tensors instead of Arrays
- Linear Algebra heads up
- Train and validation splits
- Premise of Bigram Model
- Inputs and Targets
- Inputs and Targets Implementation
- Batch size hyperparameter
- Switching from CPU to CUDA
- PyTorch Overview
- CPU vs GPU performance in PyTorch
- More PyTorch Functions
- Embedding Vectors
- Embedding Implementation
- Logits and Reshaping
- Generate function and giving the model some context
- Logits Dimensionality
- Training loop + Optimizer + zero_grad explanation
- Optimizers Overview
- Applications of Optimizers
- Loss reporting + Train vs Eval mode
- Normalization Overview
- ReLU, Sigmoid, Tanh Activations
- Transformer and Self-Attention
- Transformer Architecture
- Building a GPT, not a Transformer model
- Self-Attention Deep Dive
- GPT architecture
- Switching to MacBook
- Implementing Positional Encoding
- GPTLanguageModel initialization
- GPTLanguageModel forward pass
- OpenWebText download and Survey of LLMs paper
- How the dataloader/batch getter will have to change
- Extract corpus with WinRAR
- Python data extractor
- Adjusting for train and val splits
- Adding dataloader
- Training on OpenWebText
- Training works well, model loading/saving
- Pickling
- Fixing errors + GPU memory in Task Manager
- Command line argument parsing
- Porting code to script
- Prompt: Completion feature + more errors
- nn.Module inheritance + generation cropping
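To give a flavor of the early topics above, here is a minimal Python sketch of a character-level tokenizer, a train/validation split, and the shifted-by-one inputs/targets setup of a bigram model. All names (`encode`, `decode`, `stoi`, `itos`) and the sample text are illustrative assumptions, not the course's actual code.

```python
# Sample text standing in for the course's "Wizard of Oz" corpus.
text = "hello wizard of oz"

# Character-level tokenizer: map each unique character to an integer id.
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}  # string -> int
itos = {i: ch for ch, i in stoi.items()}      # int -> string

def encode(s):
    """Turn a string into a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Turn a list of integer token ids back into a string."""
    return "".join(itos[i] for i in ids)

data = encode(text)

# Train/validation split: hold out the last 10% of the data for validation.
n = int(0.9 * len(data))
train_data, val_data = data[:n], data[n:]

# Inputs and targets for a bigram-style model: the target sequence is the
# input sequence shifted one position to the right.
block_size = 4
x = train_data[:block_size]
y = train_data[1:block_size + 1]

# Round-tripping through the tokenizer recovers the original text.
assert decode(encode(text)) == text
```

The same encode/decode pattern scales to larger vocabularies; the course's later OpenWebText sections replace this toy list with tensors and batched dataloaders.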