4Bit QLoRA - 13B & 33B on 24GB VRAM or less?
Date : 2023-06-16

Introduction

Simple blog post from Elinas (Zeus Labs) explaining the key steps for 4-bit QLoRA fine-tuning. It's not overly technical and it's very clear. By the end, you'll feel like you need a 3090 to train your own model. See the link below.
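
For orientation, here is a minimal sketch of what a 4-bit QLoRA setup typically looks like with Hugging Face transformers, peft, and bitsandbytes. This is not the blog post's exact recipe; the model name, LoRA rank, and target modules below are illustrative assumptions.

# Minimal QLoRA sketch (assumptions: model id, rank, target modules).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "huggyllama/llama-13b"  # assumption: any 13B-class model from the Hub

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize base weights to 4-bit
    bnb_4bit_quant_type="nf4",              # NormalFloat4, the QLoRA quantization type
    bnb_4bit_use_double_quant=True,         # also quantize the quantization constants
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 during forward/backward
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Freeze the quantized base model and prepare it for k-bit training.
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=64,                                   # adapter rank (hyperparameter)
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # typical LLaMA attention projections
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small LoRA matrices are trainable

Only the LoRA adapter matrices are trained while the 4-bit base weights stay frozen, which is why a 13B (and, with care, a 33B) model can fit in roughly 24GB of VRAM.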


Link