Build GenAI Apps
for the Enterprise

Return complete answers with LLMs backed by the only knowledge graph with native vector search.


Ground LLMs with Knowledge Graphs: Step By Step

Use Neo4j directly from orchestration frameworks such as LangChain and LlamaIndex (a LangChain sketch follows the code example below)

Add and index vector embeddings in the Neo4j knowledge graph (see the index-creation sketch after this list)

Generate embeddings for user input with any model provider, cloud or local

Find the most relevant nodes with a similarity search on the vector index and retrieve contextual information from the knowledge graph

Prompt any LLM, cloud or local, with the user's question for natural-language search

Ground the LLM in that contextual information through retrieval-augmented generation
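
The second step happens before query time: the graph needs a vector index, and each node needs a stored embedding. Below is a minimal sketch, assuming Neo4j 5.x, Product nodes with a Description property, a 'products' index, and 1536-dimensional OpenAI embeddings; the labels, property names, dimensions, and credentials are assumptions to adjust to your own schema and deployment.

from neo4j import GraphDatabase
from langchain.embeddings import OpenAIEmbeddings

driver = GraphDatabase.driver("neo4j://localhost:7687", auth=("neo4j", "password"))

# create the vector index once (Neo4j 5.x syntax)
driver.execute_query("""
CREATE VECTOR INDEX products IF NOT EXISTS
FOR (p:Product) ON (p.embedding)
OPTIONS {indexConfig: {`vector.dimensions`: 1536, `vector.similarity_function`: 'cosine'}}
""")

# embed each product description and store it on the node
emb = OpenAIEmbeddings()
products, _, _ = driver.execute_query(
    "MATCH (p:Product) RETURN elementId(p) AS id, p.Description AS text")
for row in products:
    driver.execute_query("""
    MATCH (p:Product) WHERE elementId(p) = $id
    CALL db.create.setNodeVectorProperty(p, 'embedding', $vector)
    """, id=row["id"], vector=emb.embed_query(row["text"]))

With the index in place, the retrieval and generation steps look like the example below.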

Python

from neo4j import GraphDatabase
from langchain.embeddings import OpenAIEmbeddings
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate

# connect to Neo4j (adjust URI and credentials to your deployment)
driver = GraphDatabase.driver("neo4j://localhost:7687", auth=("neo4j", "password"))

emb = OpenAIEmbeddings() # VertexAIEmbeddings() or BedrockEmbeddings() or ...
llm = ChatOpenAI() # ChatVertexAI() or BedrockChat() or ChatOllama() ...

vector = emb.embed_query(user_input)

vector_query = """
// find products by similarity search in vector index
CALL db.index.vector.queryNodes('products', 5, $embedding) YIELD node AS product, score

// enrich with additional explicit relationships from the knowledge graph
MATCH (product)-[:HAS_CATEGORY]->(cat), (product)-[:BY_BRAND]->(brand)
MATCH (product)-[:HAS_REVIEW]->(review {rating:5})<-[:WROTE]-(customer) 

// return relevant contextual information
RETURN product.Name, product.Description, brand.Name, cat.Name, 
       collect(review { .Date, .Text })[0..5] as reviews, score
"""

records, _, _ = driver.execute_query(vector_query, embedding=vector)
context = format_context(records)  # your own helper that turns the records into a text block for the prompt

template = """
You are a helpful assistant that helps users find information for their shopping needs.
Only use the context provided, do not add any additional information.
Context:  {context}
User question: {question}
"""

chain = ChatPromptTemplate.from_template(template) | llm

answer = chain.invoke({"question":user_input, "context":context}).content
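
As the first step notes, the same retrieval flow can be driven from an orchestration framework instead of raw driver calls. Here is a rough sketch using LangChain's Neo4jVector store; import paths vary between LangChain versions, and the index name, credentials, and user_input below are assumptions carried over from the example above.

from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Neo4jVector

# wrap the existing 'products' vector index as a LangChain vector store
store = Neo4jVector.from_existing_index(
    OpenAIEmbeddings(),
    url="neo4j://localhost:7687",
    username="neo4j",
    password="password",
    index_name="products",
)

# similarity search embeds the question and returns the most relevant nodes as documents
docs = store.similarity_search(user_input, k=5)
context = "\n".join(doc.page_content for doc in docs)

The resulting documents can feed the same prompt template and LLM chain shown above; LlamaIndex offers an equivalent Neo4j vector store integration.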

What people are saying about LLMs with Knowledge Graphs

“Before you can get the value of your AI, you have to fix the data. Our journey with Neo4j began several years ago with a data liberalization journey across more than 250 entities that did not share data… Our enterprise intelligence hub today is trained on knowledge graphs powered by Neo4j to co-exist and perform based on how our different departments and teams actually think and work. We have multiple deployments in production now…and it’s enabling us to do generative AI at scale.”

Chief Digital Officer
Fortune 500 energy leader