Introducing a New GenAI Stack for Developers


Today we are launching a new stack for building GenAI-powered applications, together with our friends at Docker, LangChain, and Ollama. The GenAI Stack is a great way to get started quickly. It includes Neo4j as the default database for vector search and knowledge graphs, and it’s completely free.

As this post goes live, Docker’s CTO Justin Cormack is talking about it on stage at DockerCon 2023 alongside LangChain founder Harrison Chase, Ollama co-founder Jeff Morgan, and Neo4j graphista extraordinaire and my good friend Michael Hunger. Please check out the livestream here: https://www.dockercon.com

There are three things that I LOVE about the GenAI Stack:

    1. It comes bundled with the core components you need to get started, already integrated and set up for you in Docker containers
    2. It makes it really easy to experiment with new models, hosted locally on your machine (such as Llama 2) or accessed via APIs (like OpenAI’s GPT)
    3. It is already set up to use the Retrieval Augmented Generation (RAG) architecture for LLM apps, which, in my opinion, is the easiest way to integrate an LLM into an application and give it access to your own data (there’s a short sketch below)

All of this, available at your fingertips with a simple docker compose up!
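
To make points 2 and 3 concrete, here’s a minimal RAG sketch in the spirit of what the stack wires up for you. It’s an illustration, not the stack’s exact code: the connection URL, credentials, and model name below are assumed defaults, so check the Compose file for the real values.

    # A minimal RAG loop: embed and index text in Neo4j's native vector index,
    # retrieve the most relevant chunks, and let a local LLM answer from them.
    # The URL, credentials, and "llama2" model name are assumptions, not stack guarantees.
    from langchain.chat_models import ChatOllama      # swap in ChatOpenAI for a hosted model
    from langchain.embeddings import OllamaEmbeddings
    from langchain.vectorstores import Neo4jVector
    from langchain.chains import RetrievalQA

    embeddings = OllamaEmbeddings(model="llama2")

    # Store documents and their embeddings in Neo4j
    store = Neo4jVector.from_texts(
        ["The GenAI Stack bundles Docker, Neo4j, LangChain, and Ollama."],
        embedding=embeddings,
        url="bolt://localhost:7687",
        username="neo4j",
        password="password",
    )

    # Retrieval Augmented Generation: ground the answer in the retrieved text
    qa = RetrievalQA.from_chain_type(
        llm=ChatOllama(model="llama2"),
        retriever=store.as_retriever(),
    )
    print(qa.run("What does the GenAI Stack bundle?"))

Note how point 2 falls out for free: swapping the local Llama 2 model for a hosted one is just a matter of replacing ChatOllama with ChatOpenAI.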

If you want to get started right away, check out the technical blog post from the team who built the example support agent application.

If you want to hear a bit about the backstory of how this came about, keep reading!

Ok, How DID This Come About?


GenAI and LLMs are powerful, mind-bending technologies that offer a ton of promise. But it’s also an incredibly fast-moving space that developers are learning as they go. Even as you read this blog post, LangChain, for example, has probably shipped two more releases!

About a month ago, I had a conversation with Scott Johnston, the CEO of Docker. We talked about life, the universe, running developer-centric companies… and about how hard it is for developers to build GenAI-backed applications today.

Scott shared with me that he wanted to create a pre-built GenAI stack of best-in-class technologies that are well integrated, come with sample applications, and make it easy for developers to get up and running. We started exploring what a stack like that should include.

Let’s Add Some Knowledge to That LLM of Yours!


Here at Neo4j, we’ve spent a lot of time thinking about LLM-backed applications and what role graphs can play in them. Fundamentally, all data represents something in the real world, and almost everything in the real world is connected in some way. Graphs can represent these hidden patterns and complex relationships within data, enabling GenAI models to better understand the world, and in particular your data and the view of the world it represents.

The powerful combination of graphs and LLMs is why we’ve seen huge uptake of Neo4j for building LLM-backed applications. Usage skyrocketed when we added native vector search as a core capability, combining the implicit relationships uncovered by vectors with the explicit, factual relationships and patterns illuminated by graphs.
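
Here’s a hedged sketch of what that combination can look like through LangChain’s Neo4jVector wrapper: the vector index surfaces semantically similar chunks, and a custom retrieval query then walks the graph around each hit. The index name, the MENTIONS relationship, and the credentials are illustrative assumptions, not fixed parts of the stack.

    # Sketch: vector similarity plus graph traversal in a single retrieval step.
    # Index name, credentials, and the MENTIONS relationship are made up for illustration.
    from langchain.embeddings import OllamaEmbeddings
    from langchain.vectorstores import Neo4jVector

    # The vector index hands each match to this query as `node` (with `score`);
    # from there we follow explicit graph relationships to enrich the result.
    retrieval_query = """
    OPTIONAL MATCH (node)-[:MENTIONS]->(entity)
    RETURN node.text AS text, score,
           {entities: collect(entity.name)} AS metadata
    """

    store = Neo4jVector.from_existing_index(
        OllamaEmbeddings(model="llama2"),
        url="bolt://localhost:7687",
        username="neo4j",
        password="password",
        index_name="vector",              # assumed index name
        retrieval_query=retrieval_query,
    )

    for doc in store.similarity_search("How do I configure the stack?", k=3):
        print(doc.page_content, doc.metadata)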

While LLMs are great at language skills, they hallucinate because they lack grounding in truth. Neo4j also allows users to create knowledge graphs, which provide exactly that grounding: they anchor LLMs in factual relationships, enable customers to get richer insights from semantic search and generative AI applications, and improve accuracy.
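
As a toy illustration of that grounding, you can pull facts out of the graph first and hand them to the LLM as context, so the model answers from explicit relationships rather than guessing. The (:Product)-[:HAS_COMPONENT]->(:Component) schema below is invented for the example; the pattern is the point.

    # Toy grounding sketch: fetch verifiable facts from the knowledge graph,
    # then constrain the LLM's answer to them. The schema here is hypothetical.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

    def component_facts(product: str) -> list[str]:
        # Explicit graph relationships serve as the source of truth
        with driver.session() as session:
            result = session.run(
                "MATCH (p:Product {name: $name})-[:HAS_COMPONENT]->(c:Component) "
                "RETURN c.name AS component",
                name=product,
            )
            return [record["component"] for record in result]

    # Prepend the facts to the prompt, then send it to the LLM as in the earlier sketch
    facts = component_facts("GenAI Stack")
    prompt = (
        f"Answer using only these facts: {facts}\n"
        "Question: What components does the product include?"
    )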

Putting the Full Stack Together


We combined the power of Docker’s world-leading platform with our own unique strengths: graph database technology with native vector search and knowledge graphs. The goal was a world-class experience that gets developers up and running with GenAI immediately. We also brought in Harrison and Jeff to integrate two critical components into the stack: a programming and orchestration framework for LLMs (LangChain) and the means to run and manage those LLMs locally (Ollama).

It was an intense collaboration. We wanted to ship in time for DockerCon 2023: a moment when thousands of developers would hear about it and could be the first to actually start using it!

The GenAI Stack is now available as a Docker Compose file in the Docker Desktop Learning Center and on GitHub. It ships with several configurations designed to address popular GenAI use cases, built on trusted open source content from Docker Hub. Its components include pre-configured open source LLMs served via Ollama, Neo4j’s graph database and knowledge graphs, LangChain orchestration, and a series of supporting tools, code templates, how-tos, and GenAI best practices.

And it is super easy to get started: docker compose up!

You can learn more about it from our blog post and Docker’s blog. Michael and Harrison will also co-present a talk at DockerCon following the keynote, and we’ll go deeper on all of this at NODES 2023, Neo4j’s free 24-hour online conference for developers, data scientists, architects, and data analysts across the globe.



Final Thoughts


It’s crucial that we make GenAI more accessible and user-friendly for developers if we are to democratize the technology and enable everyone to harness its potential, from content generation to problem-solving.

The GenAI Stack is a great step towards that, and I’m so proud of our involvement in it. I couldn’t be more excited to see what you all will build with it.