LoRA explained (and a bit about precision and quantization)
DeepFindr

Published on Aug 26, 2023

▬▬ Papers / Resources ▬▬▬
LoRA Paper: https://arxiv.org/abs/2106.09685
QLoRA Paper: https://arxiv.org/abs/2305.14314
Huggingface 8bit intro: https://huggingface.co/blog/hf-bitsan...
PEFT / LoRA Tutorial: https://www.philschmid.de/fine-tune-f...
Adapter Layers: https://arxiv.org/pdf/1902.00751.pdf
Prefix Tuning: https://arxiv.org/abs/2101.00190


▬▬ Support me if you like 🌟
►Link to this channel: https://bit.ly/3zEqL1W
►Support me on Patreon: https://bit.ly/2Wed242
►Buy me a coffee on Ko-Fi: https://bit.ly/3kJYEdl
►E-Mail: [email protected]

▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from #Uppbeat (free for Creators!):
https://uppbeat.io/t/danger-lion-x/fl...
License code: M4FRIPCTVNOO4S8F

▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬
All Icons are from flaticon: https://www.flaticon.com/authors/freepik

▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:20 Model scaling vs. fine-tuning
00:58 Precision & Quantization
01:30 Representation of floating point numbers
02:15 Model size
02:57 16 bit networks
03:15 Quantization
04:20 FLOPS
05:23 Parameter-efficient fine-tuning
07:18 LoRA
08:10 Intrinsic Dimension
09:20 Rank decomposition
11:24 LoRA forward pass
11:49 Scaling factor alpha
13:40 Optimal rank
14:16 Benefits of LoRA
15:20 Implementation
16:25 QLoRA
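As a rough companion to the "LoRA forward pass" and "Scaling factor alpha" segments above, here is a minimal NumPy sketch of the adapted forward pass h = W0·x + (alpha/r)·B·A·x from the LoRA paper. The dimensions, alpha value, and initialization scale are illustrative choices, not values from the video; LoRA's standard init (A random, B zero) means the adapter starts as an exact no-op.

```python
import numpy as np

# Hypothetical shapes for illustration (not from the video).
d, k, r = 8, 8, 2        # output dim, input dim, LoRA rank (r << d)
alpha = 4                # LoRA scaling factor

rng = np.random.default_rng(0)
W0 = rng.normal(size=(d, k))          # frozen pretrained weight
A = rng.normal(size=(r, k)) * 0.01    # trainable, small random init
B = np.zeros((d, r))                  # trainable, zero init -> delta starts at 0

def lora_forward(x):
    # h = W0 x + (alpha / r) * B A x  -- base path plus low-rank update
    return W0 @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(k,))
# With B initialized to zero, the adapted output equals the base output.
assert np.allclose(lora_forward(x), W0 @ x)
```

Only A and B (2·r·max(d, k) parameters here, versus d·k for the full matrix) would receive gradients during fine-tuning, which is what makes the method parameter-efficient.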

▬▬ My equipment 💻
- Microphone: https://amzn.to/3DVqB8H
- Microphone mount: https://amzn.to/3BWUcOJ
- Monitors: https://amzn.to/3G2Jjgr
- Monitor mount: https://amzn.to/3AWGIAY
- Height-adjustable table: https://amzn.to/3aUysXC
- Ergonomic chair: https://amzn.to/3phQg7r
- PC case: https://amzn.to/3jdlI2Y
- GPU: https://amzn.to/3AWyzwy
- Keyboard: https://amzn.to/2XskWHP
- Bluelight filter glasses: https://amzn.to/3pj0fK2
