Let's build the GPT Tokenizer
Date: 2024-02-20
Description
In this lecture, Andrej Karpathy builds from scratch the Tokenizer used in OpenAI's GPT series. Along the way, he shows that many weird behaviors and problems of LLMs actually trace back to tokenization, explains why tokenization is at fault, and argues that someone out there should ideally find a way to delete this stage entirely.
Watch and like on YouTube
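At the heart of the lecture is byte-pair encoding (BPE): start from the raw UTF-8 bytes of the text, repeatedly find the most frequent adjacent pair of tokens, and replace it with a new token id. The sketch below illustrates that core loop; the sample string and the number of merges are illustrative choices, not values from the lecture.

```python
from collections import Counter

def most_common_pair(ids):
    """Return the most frequent adjacent pair of token ids."""
    return Counter(zip(ids, ids[1:])).most_common(1)[0][0]

def merge(ids, pair, new_id):
    """Replace every occurrence of `pair` in `ids` with `new_id`."""
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

# Start from raw bytes (ids 0-255), then iteratively merge the most common pair.
text = "aaabdaaabac"              # illustrative sample, not from the lecture
ids = list(text.encode("utf-8"))  # 11 byte-level tokens
vocab_size = 259                  # 256 byte tokens + 3 learned merges
for new_id in range(256, vocab_size):
    pair = most_common_pair(ids)
    ids = merge(ids, pair, new_id)

# After 3 merges the 11 byte tokens compress to 5 tokens.
```

Training a real GPT tokenizer follows this same loop, just over a large corpus and with tens of thousands of merges; decoding simply replays the merges in reverse.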