
Python AI Lab

Hands-on Python examples for building AI apps with Azure, OpenAI, LangChain, RAG, and AI Agents

License: MIT


A collection of practical, runnable Python examples that walk you through core AI concepts (e.g. OpenAI API calls, RAG with Azure Cosmos DB, AI agents with Microsoft Agent Framework).

Each module is self-contained and ready to run with a single `poetry run` command.

Modules · Getting Started


Modules

OpenAI API Basics

Direct interaction with the OpenAI chat completions API. Sends the same prompt 5 times to compare response variance. A practical way to understand model temperature and non-determinism.

poetry run call-to-openai
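The repeated-prompt experiment can be sketched as below. The model name and prompt are illustrative, and the `openai` v1 SDK plus an `OPENAI_API_KEY` are assumed; `distinct_replies` is a hypothetical helper for measuring variance, not part of this repo.

```python
from collections import Counter

def distinct_replies(replies):
    """Count unique responses across runs -- a rough measure of variance."""
    return len(Counter(replies))

def run_experiment(n=5, temperature=1.0):
    """Send the same prompt n times and collect the replies.

    Assumes the `openai` v1 SDK and OPENAI_API_KEY; model and prompt
    are illustrative, not the module's actual values.
    """
    from openai import OpenAI
    client = OpenAI()
    replies = []
    for _ in range(n):
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": "Name one prime number."}],
            temperature=temperature,
        )
        replies.append(resp.choices[0].message.content)
    return replies
```

At temperature 0 you would expect `distinct_replies(run_experiment())` to be close to 1; at higher temperatures it tends to grow.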

LangChain

Two examples showing how LangChain abstracts LLM interactions.

poetry run langchain-basics # invocation, messages, system prompts
poetry run langchain-lcel # LCEL (LangChain Expression Language) chains
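An LCEL chain pipes a prompt template into a model into an output parser with the `|` operator. The composition idea can be mimicked in plain Python, and a real chain (model and prompt here are illustrative; requires `langchain-openai` and an API key) looks like this:

```python
def pipe(*steps):
    """Plain-Python stand-in for LCEL's `|`: run callables left to right."""
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

def build_lcel_chain():
    """A real LCEL equivalent (needs langchain-openai and an API key;
    model and prompt are illustrative, not this repo's actual chain)."""
    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI
    prompt = ChatPromptTemplate.from_template("Summarize in one line: {text}")
    return prompt | ChatOpenAI(model="gpt-4o-mini") | StrOutputParser()

# Composition runs left to right, like an LCEL chain's invoke():
shout = pipe(str.strip, str.upper)
```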

Vector Databases (Azure Cosmos DB)

End-to-end workflow for storing and querying vector embeddings using Azure Cosmos DB as a vector database.

  1. Generate embeddings - uses Azure OpenAI's text-embedding-3-large model to create vector representations of text
  2. Store in Cosmos DB - uploads both the original text and its embedding as a single document
  3. Query with vector search - performs server-side VectorDistance similarity search
poetry run insert-embeddings    # Step 1-2: embed + store
poetry run query-embeddings     # Step 3: similarity search
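`VectorDistance` defaults to cosine distance (the metric is set in the container's vector policy). The math it runs server-side, plus a sketch of the query, can be shown locally; the field names below are assumptions, not necessarily this repo's schema:

```python
import math

def cosine_distance(a, b):
    """Local version of the cosine distance VectorDistance computes
    server-side: 1 - cos(angle between the two vectors)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (norm_a * norm_b)

# Sketch of the server-side similarity query (field names `c.text` and
# `c.embedding` are illustrative; run via azure-cosmos with the
# COSMOSDB_* variables from the Configuration section):
VECTOR_QUERY = """
SELECT TOP 5 c.text, VectorDistance(c.embedding, @query_vector) AS score
FROM c
ORDER BY VectorDistance(c.embedding, @query_vector)
"""
```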

RAG (Retrieval-Augmented Generation)

A full RAG pipeline that lets you ask questions against your own documents, powered by Azure Cosmos DB vector search and LangChain.

  1. Setup - creates a Cosmos DB container configured for vector indexing
  2. Ingest - reads PDFs, chunks them, generates embeddings, and stores everything in Cosmos DB
  3. Query - embeds your question, retrieves the most relevant chunks via vector search, and uses GPT-4 with the retrieved context to answer
poetry run setup-cosmosdb-for-rag   # One-time container setup
poetry run insert-rag-files          # Ingest your documents
poetry run ask-rag-question          # Ask questions against your docs
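The chunking in the ingest step can be sketched as a sliding window with overlap; the sizes below are illustrative defaults, not the module's actual settings:

```python
def chunk_text(text, size=500, overlap=50):
    """Split text into fixed-size chunks; the overlap preserves context
    that would otherwise be cut off at chunk boundaries."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks
```

Each chunk is then embedded and stored; at query time the question is embedded the same way, and the nearest chunks are passed to the model as context.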

AI Agent Framework (Microsoft Agent Framework)

Getting started with Microsoft's Agent Framework for building AI agents on Azure AI Foundry. Includes both synchronous and streaming patterns.

poetry run agent-framework-chat-stream   # first agent: responds to a single prompt with streaming (token-by-token) output
poetry run agent-framework-tool          # improved agent that uses custom tools
poetry run agent-framework-multi-chat    # multi-turn agent: hold a conversation and keep the session in memory
poetry run agent-framework-persistence   # agent with memory and persistence
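Conceptually, the multi-chat and persistence examples revolve around keeping a message history per session, so the agent sees prior turns each time it responds. A framework-agnostic sketch of that idea (this is not the Agent Framework API; names are illustrative):

```python
class ChatSession:
    """Framework-agnostic session store: the agent receives the full
    history each turn, which is what makes multi-turn conversation work."""

    def __init__(self):
        self.messages = []

    def add_turn(self, role, content):
        self.messages.append({"role": role, "content": content})

    def history(self):
        return list(self.messages)

session = ChatSession()
session.add_turn("user", "Hi, I'm Mario.")
session.add_turn("assistant", "Hello Mario!")
session.add_turn("user", "What's my name?")  # answerable only via the stored history
```

Persistence then amounts to serializing this history (e.g. to disk or a database) and reloading it when the session resumes.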

Getting Started

Prerequisites

  • Python 3.13+
  • Poetry
  • Azure subscription
  • OpenAI API key

Installation

git clone https://github.com/MarioCodes/python-ai-lab.git
cd python-ai-lab
poetry install

Configuration

Each module uses environment variables for secrets. Set them before running:

# OpenAI (direct API)
export OPENAI_API_KEY=your-key
 
# Azure AI Foundry (RAG, vector DBs, agents) & Agent Framework
export FOUNDRY_URL=https://your-project.services.ai.azure.com/...
export FOUNDRY_KEY=your-key
 
# Azure Cosmos DB
export COSMOSDB_URL=https://your-account.documents.azure.com:443/
export COSMOSDB_KEY=your-key
 
# Agent Framework
export FOUNDRY_POC_URL=https://your-foundry-project.services.ai.azure.com/api/projects/your-project

Then run any module with the corresponding poetry run command listed above.
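A fail-fast check for the variables above can save a confusing stack trace later. A minimal sketch (variable names taken from this section; the helper itself is not part of the repo):

```python
import os

REQUIRED = ["OPENAI_API_KEY", "FOUNDRY_URL", "FOUNDRY_KEY",
            "COSMOSDB_URL", "COSMOSDB_KEY"]

def missing_vars(required, env=None):
    """Return the names of required variables absent (or empty) in env."""
    env = os.environ if env is None else env
    return [name for name in required if not env.get(name)]

missing = missing_vars(REQUIRED)
if missing:
    print("Missing environment variables:", ", ".join(missing))
```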


If you find this useful, consider giving it a star.