Power Each AI Agent With A Different LOCAL LLM (AutoGen + Ollama Tutorial)

Published on Nov 29, 2023

In this video, I show you how to power AutoGen AI agents with a separate open-source model for each agent. This is shaping up to be the tech stack for running AI agents locally: the models are served by Ollama, and each one's API is exposed through LiteLLM.
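As a rough sketch of how this fits together: LiteLLM exposes each Ollama model behind an OpenAI-compatible endpoint, and each AutoGen agent is pointed at a different endpoint via its `llm_config`. The model names and ports below are illustrative assumptions, not taken from the video, and the exact config keys (`base_url` vs. `api_base`) vary between AutoGen versions.

```python
# Assumption: two LiteLLM proxies are already running locally, one per
# Ollama model, e.g. started with:
#   litellm --model ollama/mistral   --port 8000
#   litellm --model ollama/codellama --port 8001
# (model names and ports are hypothetical examples)

def local_llm_config(port: int) -> dict:
    """Build an AutoGen-style llm_config that targets a local LiteLLM proxy."""
    return {
        "config_list": [
            {
                "model": "local-model",                    # label only; the proxy decides the model
                "base_url": f"http://localhost:{port}",    # LiteLLM's OpenAI-compatible endpoint
                "api_key": "not-needed",                   # local proxies don't require a real key
            }
        ],
    }

# One config per agent, each pointing at a different local model:
assistant_config = local_llm_config(8000)  # e.g. a general-purpose model
coder_config = local_llm_config(8001)      # e.g. a code-focused model
```

With `pyautogen` installed, these configs would then be passed to the agents, e.g. `autogen.AssistantAgent("coder", llm_config=coder_config)`, so each agent talks to its own locally served model.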

Enjoy :)

Join My Newsletter for Regular AI Updates 👇🏼
https://www.matthewberman.com

Need AI Consulting? ✅
https://forwardfuture.ai/

Rent a GPU (MassedCompute) 🚀
https://bit.ly/matthew-berman-youtube
USE CODE "MatthewBerman" for 50% discount

My Links 🔗
👉🏻 Subscribe:    / @matthew_berman  
👉🏻 Twitter:   / matthewberman  
👉🏻 Discord:   / discord  
👉🏻 Patreon:   / matthewberman  

Media/Sponsorship Inquiries 📈
https://bit.ly/44TC45V

Links:
Instructions - https://gist.github.com/mberman84/ea2...
Ollama - https://ollama.ai
LiteLLM - https://litellm.ai/
AutoGen - https://github.com/microsoft/autogen
   • AutoGen Agents with Unlimited Memory ...  
   • AutoGen Advanced Tutorial - Build Inc...  
   • Use AutoGen with ANY Open-Source Mode...  
   • How To Use AutoGen With ANY Open-Sour...  
   • AutoGen FULL Tutorial with Python (St...  
   • AutoGen Tutorial 🚀 Create Custom AI A...  

