# Agent Framework Integrations
Neo4j Agent Memory provides official integrations for popular agent frameworks and cloud platforms, enabling persistent memory storage backed by Neo4j's graph database.
## Available Integrations

| Framework | Use Case | Status |
|---|---|---|
| LangChain | Chains and agents with conversation memory | ✅ Production Ready |
| PydanticAI | Modern type-safe agents with automatic tracing | ✅ Production Ready |
| LlamaIndex | RAG applications with document + graph retrieval | ✅ Production Ready |
| CrewAI | Multi-agent systems with shared memory | ✅ Production Ready |
| OpenAI Agents SDK | OpenAI function calling with persistent memory | ✅ Production Ready |
| Google Cloud | Vertex AI embeddings, ADK agents, MCP server | ✅ Production Ready |
| AWS Strands | AWS Strands SDK with Bedrock and Context Graph tools | ✅ Production Ready |
| AWS Bedrock | Titan and Cohere embeddings via Bedrock | ✅ Production Ready |
| AWS Hybrid Memory | AgentCore + Context Graphs combined | ✅ Production Ready |
## Quick Start

Choose your framework and install:

```bash
# LangChain
pip install neo4j-agent-memory[langchain]

# PydanticAI
pip install neo4j-agent-memory[pydantic-ai]

# LlamaIndex
pip install neo4j-agent-memory[llamaindex]

# CrewAI
pip install neo4j-agent-memory[crewai]

# OpenAI Agents SDK
pip install neo4j-agent-memory[openai-agents]

# Google Cloud (Vertex AI + ADK + MCP)
pip install neo4j-agent-memory[google,mcp]

# AWS Strands Agents
pip install neo4j-agent-memory[aws,strands]

# All frameworks
pip install neo4j-agent-memory[all]
```
## Quick Examples

The snippets below assume `client` is an already-constructed memory client connected to your Neo4j instance.

### LangChain

```python
from neo4j_agent_memory.integrations.langchain import Neo4jAgentMemory

memory = Neo4jAgentMemory(memory_client=client, session_id="user-123")
context = memory.load_memory_variables({"input": "query"})
memory.save_context({"input": "Hello"}, {"output": "Hi!"})
```
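To make the load/save contract above concrete, here is a minimal, dependency-free sketch of what those two calls do conceptually. This is illustrative only: `SketchMemory` is an invented stand-in, and the real `Neo4jAgentMemory` persists turns to Neo4j rather than a local list.

```python
# Illustrative stand-in for the LangChain-style memory contract.
# The real integration stores turns in the graph, keyed by session_id.
class SketchMemory:
    def __init__(self, session_id: str):
        self.session_id = session_id
        self.turns: list[dict] = []

    def save_context(self, inputs: dict, outputs: dict) -> None:
        # Record one human/AI exchange.
        self.turns.append({"input": inputs["input"], "output": outputs["output"]})

    def load_memory_variables(self, inputs: dict) -> dict:
        # Return prior turns formatted for prompt injection.
        history = "\n".join(
            f"Human: {t['input']}\nAI: {t['output']}" for t in self.turns
        )
        return {"history": history}

sketch = SketchMemory(session_id="user-123")
sketch.save_context({"input": "Hello"}, {"output": "Hi!"})
print(sketch.load_memory_variables({"input": "query"}))
```

The shape is the same in every framework adapter: writes go through a save hook after each turn, and reads return prior context in whatever form the framework injects into the prompt.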
### PydanticAI

```python
from neo4j_agent_memory.integrations.pydantic_ai import MemoryDependency

deps = MemoryDependency(memory_client=client, session_id="user-123")
context = await deps.get_context("query")
```
### LlamaIndex

```python
from neo4j_agent_memory.integrations.llamaindex import Neo4jLlamaIndexMemory

memory = Neo4jLlamaIndexMemory(memory_client=client, session_id="user-123")
nodes = memory.get(input="query")
```
### CrewAI

```python
from neo4j_agent_memory.integrations.crewai import Neo4jCrewMemory

memory = Neo4jCrewMemory(memory_client=client, crew_id="my-crew")
memory.remember("Important fact", metadata={"type": "fact"})
```
### OpenAI Agents SDK

```python
from neo4j_agent_memory.integrations.openai_agents import Neo4jOpenAIMemory

memory = Neo4jOpenAIMemory(memory_client=client, session_id="user-123")
messages = await memory.get_conversation()
```
### Google ADK

```python
from neo4j_agent_memory.integrations.google_adk import Neo4jMemoryService

memory_service = Neo4jMemoryService(memory_client=client, user_id="user-123")
await memory_service.add_session_to_memory(session)
results = await memory_service.search_memories("query")
```
### AWS Strands Agents

```python
from strands import Agent

from neo4j_agent_memory.integrations.strands import context_graph_tools

tools = context_graph_tools(
    neo4j_uri="neo4j+s://...",
    neo4j_password="...",
    embedding_provider="bedrock",
)
agent = Agent(model="anthropic.claude-sonnet-4-20250514-v1:0", tools=tools)
```
## Choosing a Framework

Not sure which integration to use? See the Framework Comparison Guide for a detailed comparison of features, performance, and use cases.
## Common Patterns

All integrations share these common capabilities:

### Three-Layer Memory

- **Short-Term**: Conversation history within a session
- **Long-Term**: Entities, preferences, and facts across sessions
- **Reasoning**: Task traces and tool usage patterns
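The key difference between the layers is scope. As a rough, self-contained illustration (the class and field names here are invented for the sketch; the real library stores all three layers in Neo4j):

```python
from dataclasses import dataclass, field

@dataclass
class ThreeLayerMemorySketch:
    # Short-term: turns within the current session only.
    short_term: list[str] = field(default_factory=list)
    # Long-term: entities, preferences, and facts that survive across sessions.
    long_term: dict[str, str] = field(default_factory=dict)
    # Reasoning: traces of tasks and tool calls.
    reasoning: list[str] = field(default_factory=list)

    def end_session(self) -> None:
        # Only the short-term layer is session-scoped; ending a session
        # clears it while long-term facts and reasoning traces persist.
        self.short_term.clear()

mem = ThreeLayerMemorySketch()
mem.short_term.append("User: book a flight")
mem.long_term["preferred_airline"] = "ACME Air"
mem.reasoning.append("tool_call: search_flights(...)")
mem.end_session()
# Short-term is cleared; the other two layers carry over to the next session.
print(len(mem.short_term), len(mem.long_term), len(mem.reasoning))
```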
## Integration Guides

- LangChain Integration - Memory and retriever for chains
- PydanticAI Integration - Type-safe agents with tracing
- LlamaIndex Integration - RAG with TextNode support
- CrewAI Integration - Multi-agent shared memory
- OpenAI Agents Integration - Function tools for OpenAI
- Google Cloud Integration - Vertex AI, ADK, MCP server
- AWS Strands Integration - Bedrock-powered agents with Context Graph
- AWS Bedrock Embeddings - Titan embedding configuration
- AWS Hybrid Memory - Combined memory providers