
Commit 6f6ee61 (parent 15e435b)
Authored by giles17 and Copilot

Python: Fix broken samples and add missing READMEs (#5038)

* Python: Fix broken samples and add missing READMEs
  - simple_context_provider: move the `instructions` kwarg into the options dict
  - suspend_resume_session: use `OpenAIChatCompletionClient` for the in-memory demo
  - foundry_chat_client_with_hosted_mcp: move the `store` kwarg into the options dict
  - Add README.md for the context_providers and conversations sample folders
* Python: Fix additional sample issues in context_providers
  - mem0_basic: send the preferences query before the sleep so Mem0 can learn them; print the result of the new-session recall
  - mem0_sessions: add a session for the multi-turn conversation in the agent-scoped example; remove `user_id` from the agent-scoped provider (the Mem0 API stores memories without `user_id` when `agent_id` is provided); use a single message for storing preferences
  - redis_basics: print the retrieved context messages instead of the raw object
  - redis_sessions: add the missing `load_dotenv()` call
  - redis_basics/redis_sessions: fix docstrings referencing the wrong client type
  - azure_redis_conversation: replace the duplicate copyright header with `load_dotenv()`
* Python: Fix a broken link in the declarative README (openai_responses_agent.py was renamed to openai_agent.py)

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

11 files changed

Lines changed: 88 additions & 32 deletions


python/samples/02-agents/context_providers/README.md (new file)

Lines changed: 28 additions & 0 deletions

```diff
@@ -0,0 +1,28 @@
+# Context Provider Samples
+
+These samples demonstrate how to use context providers to enrich agent conversations with external knowledge — from custom logic to Azure AI Search (RAG) and memory services.
+
+## Samples
+
+| File / Folder | Description |
+|---------------|-------------|
+| [`simple_context_provider.py`](simple_context_provider.py) | Implement a custom context provider by extending `BaseContextProvider` to extract and inject structured user information across turns. |
+| [`azure_ai_foundry_memory.py`](azure_ai_foundry_memory.py) | Use `FoundryMemoryProvider` to add semantic memory — automatically retrieves, searches, and stores memories via Azure AI Foundry. |
+| [`azure_ai_search/`](azure_ai_search/) | Retrieval Augmented Generation (RAG) with Azure AI Search in semantic and agentic modes. See its own [README](azure_ai_search/README.md). |
+| [`mem0/`](mem0/) | Memory-powered context using the Mem0 integration (open-source and managed). See its own [README](mem0/README.md). |
+| [`redis/`](redis/) | Redis-backed context providers for conversation memory and sessions. See its own [README](redis/README.md). |
+
+## Prerequisites
+
+**For `simple_context_provider.py`:**
+- `FOUNDRY_PROJECT_ENDPOINT`: Your Azure AI Foundry project endpoint
+- `FOUNDRY_MODEL`: Model deployment name
+- Azure CLI authentication (`az login`)
+
+**For `azure_ai_foundry_memory.py`:**
+- `FOUNDRY_PROJECT_ENDPOINT`: Your Azure AI Foundry project endpoint
+- `FOUNDRY_MODEL`: Chat/responses model deployment name
+- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME`: Embedding model deployment name (e.g., `text-embedding-ada-002`)
+- Azure CLI authentication (`az login`)
+
+See each subfolder's README for provider-specific prerequisites.
```

python/samples/02-agents/context_providers/mem0/mem0_basic.py

Lines changed: 8 additions & 1 deletion

```diff
@@ -57,20 +57,27 @@ async def main() -> None:
     # Now tell the agent the company code and the report format that you want to use
     # and it should be able to invoke the tool and return the report.
     query = "I always work with CNTS and I always want a detailed report format. Please remember and retrieve it."
+    print(f"User: {query}")
+    result = await agent.run(query)
+    print(f"Agent: {result}\n")
+
     # Mem0 processes and indexes memories asynchronously.
     # Wait for memories to be indexed before querying in a new thread.
     # In production, consider implementing retry logic or using Mem0's
     # eventual consistency handling instead of a fixed delay.
     print("Waiting for memories to be processed...")
-    await asyncio.sleep(12)  # Empirically determined delay for Mem0 indexing
+    await asyncio.sleep(15)  # Empirically determined delay for Mem0 indexing
     print("\nRequest within a new session:")
     # Create a new session for the agent.
     # The new session has no context of the previous conversation.
     session = agent.create_session()
     # Since we have the mem0 component in the session, the agent should be able to
     # retrieve the company report without asking for clarification, as it will
     # be able to remember the user preferences from Mem0 component.
+    query = "Please retrieve my company report"
+    print(f"User: {query}")
     result = await agent.run(query, session=session)
+    print(f"Agent: {result}")


 if __name__ == "__main__":
```
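The comment in the diff recommends retry logic as the production alternative to a fixed `asyncio.sleep`. A generic polling sketch of that idea follows; `wait_until` and the fake index are illustrative, not part of the sample, and in real use the predicate would search Mem0 for the stored memory:

```python
import asyncio
import time

async def wait_until(predicate, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll `predicate` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        await asyncio.sleep(interval)
    return predicate()  # one final check at the deadline

# Stand-in for "is the memory indexed yet?"; a real check would query Mem0.
class FakeIndex:
    def __init__(self, ready_after: int) -> None:
        self.calls = 0
        self.ready_after = ready_after

    def ready(self) -> bool:
        self.calls += 1
        return self.calls >= self.ready_after

index = FakeIndex(ready_after=3)
indexed = asyncio.run(wait_until(index.ready, timeout=5.0, interval=0.01))
```

This bounds the total wait while returning as soon as indexing completes, instead of always paying the worst-case delay.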

python/samples/02-agents/context_providers/mem0/mem0_sessions.py

Lines changed: 7 additions & 20 deletions

```diff
@@ -71,8 +71,6 @@ async def example_agent_scoped_memory() -> None:
     print("2. Agent-Scoped Memory Example:")
     print("-" * 40)

-    user_id = "user123"
-
     async with (
         AzureCliCredential() as credential,
         Agent(
@@ -83,34 +81,23 @@
             context_providers=[
                 Mem0ContextProvider(
                     source_id="mem0",
-                    user_id=user_id,
                     agent_id="scoped_assistant",
                 )
             ],
         ) as scoped_agent,
     ):
-        # Store some information
-        query = "Remember that for this conversation, I'm working on a Python project about data analysis."
-        print(f"User: {query}")
-        result = await scoped_agent.run(query)
-        print(f"Agent: {result}\n")
-
-        # Test memory retrieval
-        query = "What project am I working on?"
+        query = (
+            "Remember that I'm working on a Python project about data analysis "
+            "and I prefer using pandas and matplotlib."
+        )
         print(f"User: {query}")
         result = await scoped_agent.run(query)
         print(f"Agent: {result}\n")

-        # Store more information
-        query = "Also remember that I prefer using pandas and matplotlib for this project."
-        print(f"User: {query}")
-        result = await scoped_agent.run(query)
-        print(f"Agent: {result}\n")
-
-        # Test comprehensive memory retrieval
+        new_session = scoped_agent.create_session()
         query = "What do you know about my current project and preferences?"
-        print(f"User: {query}")
-        result = await scoped_agent.run(query)
+        print(f"User (new session): {query}")
+        result = await scoped_agent.run(query, session=new_session)
         print(f"Agent: {result}\n")
```
python/samples/02-agents/context_providers/redis/azure_redis_conversation.py

Lines changed: 2 additions & 1 deletion

```diff
@@ -30,9 +30,10 @@
 from agent_framework.redis import RedisHistoryProvider
 from azure.identity import AzureCliCredential
 from azure.identity.aio import AzureCliCredential as AsyncAzureCliCredential
+from dotenv import load_dotenv
 from redis.credentials import CredentialProvider

-# Copyright (c) Microsoft. All rights reserved.
+load_dotenv()


 class AzureCredentialProvider(CredentialProvider):
```

python/samples/02-agents/context_providers/redis/redis_basics.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -100,7 +100,7 @@ def search_flights(origin_airport_code: str, destination_airport_code: str, deta


 def create_chat_client() -> FoundryChatClient:
-    """Create an Azure OpenAI Responses client using a Foundry project endpoint."""
+    """Create a FoundryChatClient using a Foundry project endpoint."""
     return FoundryChatClient(
         project_endpoint=os.environ["FOUNDRY_PROJECT_ENDPOINT"],
         model=os.environ["FOUNDRY_MODEL"],
```

python/samples/02-agents/context_providers/redis/redis_sessions.py

Lines changed: 4 additions & 2 deletions

```diff
@@ -34,10 +34,12 @@
 from agent_framework.foundry import FoundryChatClient
 from agent_framework.redis import RedisContextProvider
 from azure.identity import AzureCliCredential
+from dotenv import load_dotenv
 from redisvl.extensions.cache.embeddings import EmbeddingsCache
 from redisvl.utils.vectorize import OpenAITextVectorizer

-# Copyright (c) Microsoft. All rights reserved.
+# Load environment variables from .env file
+load_dotenv()


 # Default Redis URL for local Redis Stack.
@@ -48,7 +50,7 @@
 # Please set OPENAI_API_KEY to use the OpenAI vectorizer.
 # For chat responses, also set FOUNDRY_PROJECT_ENDPOINT and FOUNDRY_MODEL.
 def create_chat_client() -> FoundryChatClient:
-    """Create an Azure OpenAI Responses client using a Foundry project endpoint."""
+    """Create a FoundryChatClient using a Foundry project endpoint."""
     return FoundryChatClient(
         project_endpoint=os.environ["FOUNDRY_PROJECT_ENDPOINT"],
         model=os.environ["FOUNDRY_MODEL"],
```

python/samples/02-agents/context_providers/simple_context_provider.py

Lines changed: 5 additions & 3 deletions

```diff
@@ -50,9 +50,11 @@ async def after_run(
         # Use the chat client to extract structured information
         result = await self._chat_client.get_response(
             messages=request_messages,  # type: ignore
-            instructions="Extract the user's name and age from the message if present. "
-            "If not present return nulls.",
-            options={"response_format": UserInfo},
+            options={
+                "instructions": "Extract the user's name and age from the message if present. "
+                "If not present return nulls.",
+                "response_format": UserInfo,
+            },
         )

         # Update user info with extracted data
```
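The diff references a `UserInfo` response format whose definition isn't shown here. For structured extraction where the model should "return nulls" for absent fields, the schema's fields are presumably optional; a hedged sketch using a plain dataclass (the actual sample may define `UserInfo` differently, e.g. as a Pydantic model):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserInfo:
    # Hypothetical schema: both fields are optional so the extractor can
    # return nulls when the message contains no name or age.
    name: Optional[str] = None
    age: Optional[int] = None

# No user details in the message -> an all-None result.
empty = UserInfo()
# A message like "Hi, I'm Ada and I'm 36" would parse to:
extracted = UserInfo(name="Ada", age=36)
```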
python/samples/02-agents/conversations/README.md (new file)

Lines changed: 28 additions & 0 deletions

```diff
@@ -0,0 +1,28 @@
+# Conversation & Session Management Samples
+
+These samples demonstrate different approaches to managing conversation history and session state in Agent Framework.
+
+## Samples
+
+| File | Description |
+|------|-------------|
+| [`suspend_resume_session.py`](suspend_resume_session.py) | Suspend and resume conversation sessions, comparing service-managed sessions (Azure AI Foundry) with in-memory sessions (OpenAI). |
+| [`custom_history_provider.py`](custom_history_provider.py) | Implement a custom history provider by extending `BaseHistoryProvider`, enabling conversation persistence in your preferred storage backend. |
+| [`redis_history_provider.py`](redis_history_provider.py) | Use Redis as a history provider for persistent conversation history storage across sessions. |
+
+## Prerequisites
+
+**For `suspend_resume_session.py`:**
+- `FOUNDRY_PROJECT_ENDPOINT`: Your Azure AI Foundry project endpoint (service-managed session)
+- `FOUNDRY_MODEL`: The Foundry model deployment name
+- `OPENAI_API_KEY`: Your OpenAI API key (in-memory session)
+- Azure CLI authentication (`az login`)
+
+**For `custom_history_provider.py`:**
+- `OPENAI_API_KEY`: Your OpenAI API key
+
+**For `redis_history_provider.py`:**
+- `OPENAI_API_KEY`: Your OpenAI API key
+- A running Redis server — default URL is `redis://localhost:6379`
+  - Override via the `REDIS_URL` environment variable for remote or authenticated instances
+  - Quickstart with Docker: `docker run -d --name redis-stack -p 6379:6379 redis/redis-stack-server:latest`
```

python/samples/02-agents/conversations/suspend_resume_session.py

Lines changed: 2 additions & 1 deletion

```diff
@@ -4,6 +4,7 @@

 from agent_framework import Agent, AgentSession
 from agent_framework.foundry import FoundryChatClient
+from agent_framework.openai import OpenAIChatCompletionClient
 from azure.identity.aio import AzureCliCredential
 from dotenv import load_dotenv

@@ -62,7 +63,7 @@ async def suspend_resume_in_memory_session() -> None:
     # OpenAI Chat Client is used as an example here,
     # other chat clients can be used as well.
     agent = Agent(
-        client=FoundryChatClient(),
+        client=OpenAIChatCompletionClient(),
         name="MemoryBot",
         instructions="You are a helpful assistant that remembers our conversation.",
     )
```

python/samples/02-agents/declarative/README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -68,7 +68,7 @@ Illustrates a basic agent using Azure OpenAI with structured responses.

 **Key concepts**: Azure OpenAI integration, credential management, structured outputs

-### 5. **OpenAI Responses Agent** ([`openai_responses_agent.py`](./openai_responses_agent.py))
+### 5. **OpenAI Responses Agent** ([`openai_agent.py`](./openai_agent.py))

 Demonstrates the simplest possible agent using OpenAI directly.
```
