Session 6 — LangChain Multi-Provider
Same code, three providers: OpenAI, Anthropic, and Gemini.
60–90 minutes • 1 motivation reading + 10 hands-on exercises
What you'll be able to do by the end
- ✓ Explain why LangChain exists and when NOT to use it
- ✓ Swap between OpenAI, Anthropic, and Gemini by changing one string
- ✓ Stream tokens live and batch many prompts concurrently
- ✓ Define tools with @tool and run the full execution loop with LangChain message classes
- ✓ Produce typed Pydantic objects or dict-shaped data with one line of code
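As a taste of the second bullet, here is a minimal sketch of the one-string provider swap, assuming `langchain` is installed and the relevant API keys are set in your environment (the model names below are illustrative placeholders, not a fixed course requirement):

```python
# Provider swap sketch: the "provider:model" string is the only thing that
# changes per backend. Model names are illustrative placeholders.
PROVIDERS = {
    "openai": "openai:gpt-4o-mini",
    "anthropic": "anthropic:claude-3-5-haiku-latest",
    "google": "google_genai:gemini-1.5-flash",
}

def ask(question: str, provider: str = "openai") -> str:
    """Send one prompt to the chosen provider and return the reply text."""
    from langchain.chat_models import init_chat_model  # lazy: needs langchain installed
    llm = init_chat_model(PROVIDERS[provider])  # one string picks the backend
    return llm.invoke(question).content

if __name__ == "__main__":
    print(ask("In one sentence, what is LangChain?"))
```

The calling code never imports a provider SDK directly; that is the whole point of the abstraction.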
Start with the motivation
Before any exercise, read *Why LangChain?* It answers the question "I just learned raw SDKs in Session 5 — why am I learning a wrapper now?" It takes 15 minutes, has zero code to run, and sets up everything else.
Prerequisites
- Finished Session 5 — especially Phase 2 (tool calling) and Phase 3 (Pydantic parse)
- The Session6 folder has its own .venv with requirements.txt installed
- OPENAI_API_KEY set in Session6/.env (Anthropic and Google keys are optional)
- Comfortable with Pydantic BaseModel and type hints
The learning arc
Why (lesson 00) → Basics (lessons 00–04) → Tools (lessons 05–07) → Structured (lessons 08–09) → Exercises
Start here
Why you need LangChain in the first place.
Basics
init, invoke, stream, batch — one interface, many providers.
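As a preview of those calls, a sketch of all three styles on one model object (assumes `langchain` is installed and `OPENAI_API_KEY` is set; the model name is illustrative):

```python
def demo_calls() -> None:
    """invoke / stream / batch on a single chat model object."""
    from langchain.chat_models import init_chat_model
    llm = init_chat_model("openai:gpt-4o-mini")  # illustrative model name

    # invoke: one prompt in, one AIMessage out
    print(llm.invoke("Name one ocean.").content)

    # stream: print tokens as the provider emits them
    for chunk in llm.stream("Count to five."):
        print(chunk.content, end="", flush=True)
    print()

    # batch: many prompts run concurrently, results come back in input order
    replies = llm.batch(["Capital of France?", "Capital of Japan?"])
    print([r.content for r in replies])

if __name__ == "__main__":
    demo_calls()
```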
Tool calling
Declare tools with @tool, run the full execution loop.
Structured output
Pydantic classes or JSON Schema — your choice.
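For the Pydantic path, the "one line" is `with_structured_output`. A sketch, assuming `langchain` and `pydantic` are installed and `OPENAI_API_KEY` is set (the `Movie` schema and model name are illustrative):

```python
from pydantic import BaseModel, Field

class Movie(BaseModel):
    """Typed shape we want back from the model."""
    title: str
    year: int = Field(description="Release year")

def extract_movie(text: str) -> Movie:
    """Ask the model to fill the Movie schema from free text."""
    from langchain.chat_models import init_chat_model
    llm = init_chat_model("openai:gpt-4o-mini")     # illustrative model name
    structured = llm.with_structured_output(Movie)  # the one-line upgrade
    return structured.invoke(text)                  # returns a Movie instance

if __name__ == "__main__":
    m = extract_movie("Ridley Scott's Alien came out in 1979.")
    print(m.title, m.year)
```

Passing a JSON Schema dict instead of the Pydantic class yields plain dicts rather than typed objects.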