From Zero to Database-backed Support Bot - Using the new GenAI Stack from Docker, LangChain, Ollama, and Neo4j

With the breakthrough of large language models, adding generative AI capabilities to your applications is now possible for every developer. But where to start? In a partnership between Docker, Neo4j, LangChain, and Ollama, we created the GenAI Stack for building database-backed GenAI applications. With a single "docker compose up", everything is up and running: you can start importing data, creating vector embeddings, and using an example chatbot application that answers natural-language questions with a combination of a large language model and a knowledge graph.

In this session, we will look behind the scenes at the containers of the GenAI Stack, how they work together, and how the LangChain and Streamlit Python apps are implemented. We will use data from Stack Overflow, so you can fetch the topics you're interested in. But we won't stop there! Based on the existing code, we will build and run our own GenAI app that extends the existing functionality, and thanks to the quick Docker setup, you can code along. This should give you the setup, confidence, and convenience to get going with your first applications that use large language models.
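The quick start described above can be sketched roughly as follows; the repository URL is an assumption based on the public docker/genai-stack project and is not part of this abstract:

```shell
# Sketch of a GenAI Stack quick start (repo URL assumed, not from this abstract).
git clone https://github.com/docker/genai-stack.git
cd genai-stack
docker compose up   # brings up Neo4j, Ollama, and the LangChain/Streamlit example apps
```

Once the containers are running, the example Streamlit chatbot and the Stack Overflow data loader are reachable in the browser; exact ports and service names depend on the compose file in the repository.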
Learn more: https://bit.ly/3Q4Ug8j
#GenAI @DockerInc @LangChain #LLMs #Graphdatabase