Build GenAI Apps
for the Enterprise
Return complete answers with LLMs backed by the only knowledge graph with native vector search.
POWER LLMS WITH ENTERPRISE KNOWLEDGE
GenAI, Meet Knowledge Graphs
Completeness of Answers
Produce accurate, relevant responses that are always explainable.
Lightning-Fast Responses
Return answers with queries that don’t require joins. Ever.
Dev-Friendly Schema
Easily add new and different data without rebuilding from scratch.
Privacy
Safeguards
Integrate access policies and add rules for granular control.
Completeness of Answers.
Every Time
Accurate
Return True Facts
Neo4j’s knowledge graph grounds LLM responses in validated facts using retrieval augmented generation (RAG). Easily add and update data, indexes, and RAG sources with our developer-friendly schema that never requires redesign. Graph structure makes it possible to find factual answers to multi-hop questions through traversals that follow a breadcrumb trail of connections.
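The multi-hop idea can be sketched with a toy in-memory traversal. The names and triples below are invented for illustration; a real application would express these hops in Cypher against the knowledge graph.

```python
# Toy in-memory graph: (node, relationship, node) triples.
# Illustrative only -- in Neo4j these hops would be a Cypher traversal.
TRIPLES = [
    ("Alice", "WORKS_FOR", "Acme"),
    ("Acme", "HEADQUARTERED_IN", "Berlin"),
    ("Berlin", "LOCATED_IN", "Germany"),
]

def multi_hop(start, hops):
    """Follow a breadcrumb trail of relationships from a start node."""
    node, path = start, [start]
    for rel in hops:
        nxt = next((o for s, r, o in TRIPLES if s == node and r == rel), None)
        if nxt is None:
            return None  # trail breaks: no factual answer to return
        path.append(nxt)
        node = nxt
    return path

# "What country is Alice's employer headquartered in?" -- a three-hop question
print(multi_hop("Alice", ["WORKS_FOR", "HEADQUARTERED_IN", "LOCATED_IN"]))
```

Each hop narrows the answer to facts actually connected in the data, which is what keeps a multi-hop response grounded rather than guessed.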
Contextual
Essence of Meaning
Knowledge graphs provide explicit accuracy from your data, while vector search offers implicit responses using semantic meaning. Neo4j offers explicit and implicit responses from one database to deliver the best possible answers to the user. Storing embeddings as node properties gives users full context in addition to explicit responses native to knowledge graphs.
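How the implicit (vector) and explicit (graph) steps combine can be sketched in plain Python. The nodes, embeddings, and cosine helper below are illustrative stand-ins for a native vector index over embeddings stored as node properties.

```python
from math import sqrt

# Toy nodes: the embedding lives as a property alongside explicit data.
# Illustrative sketch only; Neo4j would search these with a native vector index.
nodes = [
    {"name": "running shoe", "category": "footwear",  "embedding": [0.9, 0.1]},
    {"name": "rain jacket",  "category": "outerwear", "embedding": [0.1, 0.9]},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def best_match(query_embedding):
    # Implicit step: rank nodes by semantic similarity ...
    node = max(nodes, key=lambda n: cosine(n["embedding"], query_embedding))
    # ... explicit step: return the factual properties attached to that node.
    return node["name"], node["category"]

print(best_match([1.0, 0.0]))
```

The semantic match finds the right node; the explicit properties on that node supply the factual answer, both from one store.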
Relevant
Individualized Results
Ground your LLM in a knowledge graph that represents your company-specific data so responses match your business. Apply data governance policies to control information flow and integrate with your identity and access management provider for continuous governance. Define access policies by role or user and add constraints on nodes, labels, relationships, or properties – you can even wall off specific parts of the graph or limit traversal depth.
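A rough sketch of label-based visibility, with hypothetical roles and policies. Neo4j enforces such rules inside the database itself; this toy filter only illustrates the idea of walling off parts of the graph by role.

```python
# Hypothetical role policy: which node labels each role may read.
# Illustrative only -- real access rules live in the database, not app code.
POLICIES = {
    "analyst": {"Product", "Category"},
    "intern":  {"Product"},
}

def visible(nodes, role):
    """Return only the nodes whose label the role is allowed to read."""
    allowed = POLICIES[role]
    return [n for n in nodes if n["label"] in allowed]

nodes = [
    {"label": "Product",  "name": "shoe"},
    {"label": "Category", "name": "footwear"},
    {"label": "Salary",   "name": "payroll-2024"},
]
print([n["name"] for n in visible(nodes, "intern")])
```

In production the same effect is achieved declaratively, so every query any application runs is constrained by the user's role.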
Explainable
Traceable Data Sources
Verify enriched LLM responses using linked data sources and origins. Neo4j’s knowledge graphs can capture metadata like sources and organizing principles, simplifying source identification. Users can view and interact with data sources through graph visualizations to better understand how data is connected.
Ground LLMs with Knowledge Graphs:
Step By Step
Use Neo4j directly in orchestration frameworks like LangChain, LlamaIndex, and others
Add and index vector embeddings in the Neo4j knowledge graph
Generate embeddings for user inputs with any model provider, cloud or local
Find the most relevant nodes with similarity search in the vector index and retrieve contextual information from the knowledge graph
Prompt any LLM, cloud or local, with the user question for natural language searches
Ground the LLM with that contextual information with retrieval augmented generation
import neo4j
from langchain.embeddings import OpenAIEmbeddings
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

emb = OpenAIEmbeddings()  # VertexAIEmbeddings() or BedrockEmbeddings() or ...
llm = ChatOpenAI()  # ChatVertexAI() or BedrockChat() or ChatOllama() ...

vector = emb.embed_query(user_input)

vector_query = """
// find products by similarity search in vector index
CALL db.index.vector.queryNodes('products', 5, $embedding) YIELD node AS product, score
// enrich with additional explicit relationships from the knowledge graph
MATCH (product)-[:HAS_CATEGORY]->(cat), (product)-[:BY_BRAND]->(brand)
MATCH (product)-[:HAS_REVIEW]->(review {rating:5})<-[:WROTE]-(customer)
// return relevant contextual information
RETURN product.Name, product.Description, brand.Name, cat.Name,
       collect(review { .Date, .Text })[0..5] AS reviews, score
"""

# your connection details
driver = neo4j.GraphDatabase.driver(NEO4J_URI, auth=(NEO4J_USER, NEO4J_PASSWORD))
records, _, _ = driver.execute_query(vector_query, embedding=vector)
context = format_context(records)

template = """
You are a helpful assistant that helps users find information for their shopping needs.
Only use the context provided, do not add any additional information.
Context: {context}
User question: {question}
"""

chain = ChatPromptTemplate.from_template(template) | llm
answer = chain.invoke({"question": user_input, "context": context}).content
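The example above calls a format_context helper it never defines. A minimal sketch, assuming each record exposes the fields the Cypher RETURN clause produces:

```python
def format_context(records):
    """Flatten query records into a plain-text context block for the prompt.
    Assumes each record carries the fields from the Cypher RETURN clause."""
    lines = []
    for r in records:
        reviews = "; ".join(rev.get("Text", "") for rev in r.get("reviews", []))
        lines.append(
            f"{r['product.Name']} ({r['cat.Name']}, {r['brand.Name']}): "
            f"{r['product.Description']} Reviews: {reviews}"
        )
    return "\n".join(lines)

# hypothetical record shaped like the query output
sample = [{
    "product.Name": "Trail Runner",
    "product.Description": "Lightweight shoe.",
    "brand.Name": "Acme",
    "cat.Name": "Footwear",
    "reviews": [{"Date": "2024-01-01", "Text": "Great grip."}],
}]
print(format_context(sample))
```

Any serialization works here; the only requirement is that the prompt receives the retrieved facts as readable text.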
Neo4j Integrates with LLMs and Data Ecosystems
Neo4j’s scalable and flexible database and analytics technologies seamlessly integrate with generative AI tools like LangChain, LlamaIndex, Hugging Face, Ollama, and beyond. This unlocks a universe of information, while setting a new standard for AI accuracy, transparency, and explainability.
What people are saying about LLMs with Knowledge Graphs
"Before you can get the value of your AI, you have to fix the data. Our journey with Neo4j began several years ago with a data liberalization journey across more than 250 entities that did not share data… Our enterprise intelligence hub today is trained on knowledge graphs powered by Neo4j to co-exist and perform based on how our different departments and teams actually think and work. We have multiple deployments in production now…and it’s enabling us to do generative AI at scale."
Chief Digital Officer
Fortune 500 energy leader