# Choosing an Agent Framework Integration
Neo4j Agent Memory provides integrations for six popular agent frameworks. This guide helps you choose the right integration for your use case.
## Feature Comparison Matrix

| Feature | LangChain | PydanticAI | LlamaIndex | CrewAI | OpenAI Agents | Strands |
|---|---|---|---|---|---|---|
| Memory Class | `Neo4jAgentMemory` | `MemoryDependency` | `Neo4jLlamaIndexMemory` | `Neo4jCrewMemory` | `Neo4jOpenAIMemory` | `context_graph_tools` |
| Retriever Support | ✅ | ❌ | ✅ Native | ❌ | ❌ | ❌ |
| Built-in Tools | ❌ Custom required | ✅ | ❌ Custom required | ❌ Custom required | ✅ | ✅ |
| Trace Recording | ❌ Manual | ✅ | ❌ Manual | ❌ Manual | ✅ | ❌ Manual |
| Multi-Agent | ⚠️ Limited | ⚠️ Limited | ⚠️ Limited | ✅ Native | ⚠️ Limited | ⚠️ Limited |
| Async Support | ✅ Via wrapper | ✅ Native | ✅ Via wrapper | ✅ Via wrapper | ✅ Native | ✅ Sync (async internal) |
| Test Coverage | ✅ 39 tests | ✅ 38 tests | ✅ 25 tests | ✅ 30 tests | ✅ 35 tests | ✅ 95 tests |
## Quick Decision Guide

### Use LangChain if:

- You have an existing LangChain application
- You need retriever integration for RAG pipelines
- You want the most battle-tested integration
- You're building chains with conversation memory

```python
from neo4j_agent_memory.integrations.langchain import Neo4jAgentMemory

memory = Neo4jAgentMemory(memory_client=client, session_id="user-123")
context = memory.load_memory_variables({"input": "query"})
```
### Use PydanticAI if:

- Starting a new project from scratch
- You want automatic reasoning trace recording
- You prefer modern dependency injection patterns
- You need type-safe tool definitions

```python
from neo4j_agent_memory.integrations.pydantic_ai import MemoryDependency, create_memory_tools

deps = MemoryDependency(memory_client=client, session_id="user-123")
tools = create_memory_tools(deps)
context = await deps.get_context("query")
```
### Use LlamaIndex if:

- Building RAG applications
- You need `TextNode` compatibility
- Combining document retrieval with graph knowledge
- Using LlamaIndex chat engines or agents

```python
from neo4j_agent_memory.integrations.llamaindex import Neo4jLlamaIndexMemory

memory = Neo4jLlamaIndexMemory(memory_client=client, session_id="user-123")
nodes = memory.get(input="query")  # Returns TextNode objects
```
### Use CrewAI if:

- Building multi-agent systems
- Need shared memory across agents
- Want agent-specific context generation
- Using CrewAI's crew and task abstractions

```python
from neo4j_agent_memory.integrations.crewai import Neo4jCrewMemory

memory = Neo4jCrewMemory(memory_client=client, crew_id="research-crew")
memory.remember("Finding from research", metadata={"type": "fact"})
results = memory.recall("previous findings")
```
### Use OpenAI Agents SDK if:

- Using OpenAI's official agent framework
- You want function calling tools in OpenAI format
- Prefer OpenAI message format for conversations
- Building with GPT-4 or GPT-3.5

```python
from neo4j_agent_memory.integrations.openai_agents import Neo4jOpenAIMemory, create_memory_tools

memory = Neo4jOpenAIMemory(memory_client=client, session_id="user-123")
tools = create_memory_tools(memory)  # OpenAI function format
messages = await memory.get_conversation()  # OpenAI message format
```
### Use Strands Agents if:

- Building on AWS with Amazon Bedrock
- You want pre-built tools for context graph operations
- You prefer a simple tool-based API without managing memory classes
- You need Bedrock embedding support out of the box

```python
from strands import Agent
from neo4j_agent_memory.integrations.strands import context_graph_tools

tools = context_graph_tools(
    neo4j_uri="bolt://localhost:7687",
    neo4j_password="password",
    embedding_provider="bedrock",
)
agent = Agent(
    model="anthropic.claude-sonnet-4-20250514-v1:0",
    tools=tools,
)
```
## Memory Types Support

All integrations support the three-layer memory architecture:

| Memory Type | Description | Use Case | All Integrations |
|---|---|---|---|
| Short-Term | Conversation history | Session context | ✅ |
| Long-Term | Entities, preferences, facts | Persistent knowledge | ✅ |
| Reasoning | Task traces, tool usage | Learning from past tasks | ✅ |
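As an illustration, the three layers feed a single combined context block for the LLM prompt. The class and method names below are hypothetical sketches, not the library's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class ThreeLayerMemory:
    """Illustrative sketch of the three-layer architecture (hypothetical names)."""
    short_term: list[str] = field(default_factory=list)  # conversation history
    long_term: list[str] = field(default_factory=list)   # entities, preferences, facts
    reasoning: list[str] = field(default_factory=list)   # task traces, tool usage

    def build_context(self) -> str:
        # Combine the non-empty layers into one prompt-ready block.
        sections = [
            ("Recent conversation", self.short_term[-5:]),
            ("Known facts", self.long_term),
            ("Past reasoning", self.reasoning),
        ]
        return "\n\n".join(
            f"## {title}\n" + "\n".join(items)
            for title, items in sections
            if items
        )

memory = ThreeLayerMemory(
    short_term=["user: I prefer Python", "assistant: Noted."],
    long_term=["User prefers Python"],
)
print(memory.build_context())
```

Each real integration does the equivalent assembly against Neo4j rather than in-process lists, but the shape of the combined context is the same.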
## Installation

Install with your preferred framework:

```bash
# LangChain
pip install neo4j-agent-memory[langchain]

# PydanticAI
pip install neo4j-agent-memory[pydantic-ai]

# LlamaIndex
pip install neo4j-agent-memory[llamaindex]

# CrewAI
pip install neo4j-agent-memory[crewai]

# OpenAI Agents SDK
pip install neo4j-agent-memory[openai-agents]

# Strands Agents (AWS)
pip install neo4j-agent-memory[strands]

# All frameworks
pip install neo4j-agent-memory[all]
```
## Detailed Comparison

### Context Retrieval

All integrations provide a way to get combined context for LLM prompts:

| Framework | Method |
|---|---|
| LangChain | `load_memory_variables()` |
| PydanticAI | `deps.get_context()` |
| LlamaIndex | `memory.get()` |
| CrewAI | |
| OpenAI Agents | |
| Strands | Via tools |
### Message Storage

| Framework | Method |
|---|---|
| LangChain | |
| PydanticAI | |
| LlamaIndex | |
| CrewAI | `remember()` |
| OpenAI Agents | |
| Strands | Via tools |
### Search Operations

| Framework | Method |
|---|---|
| LangChain | Via retriever |
| PydanticAI | Via memory tools or direct client access |
| LlamaIndex | `memory.get()` |
| CrewAI | `recall()` |
| OpenAI Agents | |
| Strands | Via tools |
## Performance Considerations

### Async vs Sync

| Framework | API Style | Notes |
|---|---|---|
| LangChain | Sync (async under the hood) | Uses `ThreadPoolExecutor` for async bridging |
| PydanticAI | Async native | Best performance for async applications |
| LlamaIndex | Sync (async under the hood) | Uses `ThreadPoolExecutor` for async bridging |
| CrewAI | Sync (async under the hood) | Uses `ThreadPoolExecutor` for async bridging |
| OpenAI Agents | Async native | Best performance for async applications |
| Strands | Sync (async internal) | Tools are sync; the async `MemoryClient` runs in a thread pool |
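The sync-over-async bridging pattern can be sketched roughly as follows. This is a minimal illustration of the technique, not the library's actual implementation; `search_memory` is a hypothetical stand-in for an async `MemoryClient` call:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for an async MemoryClient method.
async def search_memory(query: str) -> list[str]:
    await asyncio.sleep(0)  # placeholder for real network I/O
    return [f"result for {query}"]

class SyncBridge:
    """Expose a sync API over async calls, as the sync integrations do."""

    def __init__(self) -> None:
        # A single worker thread hosts each coroutine run.
        self._executor = ThreadPoolExecutor(max_workers=1)

    def run(self, coro):
        # Submit asyncio.run to the worker thread and block on the result,
        # so callers never need a running event loop of their own.
        return self._executor.submit(asyncio.run, coro).result()

bridge = SyncBridge()
print(bridge.run(search_memory("previous findings")))
```

Running the coroutine on a dedicated thread avoids the "event loop is already running" error when the caller itself sits inside an async framework, at the cost of one thread hop per call.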
## Best Practices

### Choose Based on Existing Stack

If you already use a framework, choose that integration:

- LangChain project → LangChain integration
- LlamaIndex project → LlamaIndex integration
- CrewAI project → CrewAI integration
- OpenAI-native → OpenAI Agents integration
- AWS/Strands project → Strands integration
### For New Projects

Consider PydanticAI or OpenAI Agents for new projects:

- Modern async-native design
- Built-in trace recording
- Type-safe tool creation
- Better IDE support
### For RAG Applications

The LlamaIndex integration works best with document-based RAG:

- Native `TextNode` support
- Combines document and graph retrieval
- Works with LlamaIndex indices and query engines
## Troubleshooting

### Import Errors

Each integration requires its framework to be installed:

```bash
# If you get "ImportError: No module named 'langchain_core'"
pip install langchain-core

# If you get "ImportError: No module named 'llama_index'"
pip install llama-index-core

# If you get "ImportError: No module named 'pydantic_ai'"
pip install pydantic-ai

# If you get "ImportError: No module named 'crewai'"
pip install crewai

# If you get "ImportError: No module named 'openai'"
pip install openai

# If you get "ImportError: No module named 'strands'"
pip install strands-agents
```
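To check which frameworks are importable in an environment before wiring up an integration, the standard library suffices. The mapping below mirrors the extras in this guide and uses the module names from the import errors above:

```python
import importlib.util

# Extras from this guide mapped to the module each integration imports.
FRAMEWORK_MODULES = {
    "langchain": "langchain_core",
    "pydantic-ai": "pydantic_ai",
    "llamaindex": "llama_index",
    "crewai": "crewai",
    "openai-agents": "openai",
    "strands": "strands",
}

def available_integrations() -> list[str]:
    """Return the extras whose framework module can be imported."""
    return [
        extra
        for extra, module in FRAMEWORK_MODULES.items()
        if importlib.util.find_spec(module) is not None
    ]

print(available_integrations())
```

`importlib.util.find_spec` only inspects import machinery, so the check is cheap and does not actually import (or initialize) any framework.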
## Summary

| Use Case | Recommended Framework |
|---|---|
| Existing LangChain app | LangChain |
| New project | PydanticAI or OpenAI Agents |
| RAG with documents | LlamaIndex |
| Multi-agent systems | CrewAI |
| OpenAI function calling | OpenAI Agents |
| AWS / Bedrock ecosystem | Strands |
| Maximum flexibility | Direct `MemoryClient` |