Langchain v0.1 – Updating GraphAcademy Neo4j & LLM Courses


Langchain has released the first stable version, v0.1.

Langchain v0.1.0

It is an important release and introduces a number of significant changes.

I recently updated the Neo4j GraphAcademy courses, Neo4j & LLM Fundamentals, and Build a Neo4j-backed Chatbot using Python, to use Langchain v0.1. I want to share some of my findings.

It’s important to note that Langchain v0.1 is backward compatible with the previous version, so there is no need to rush into an upgrade. That said, v0.1 marks a significant change and brings a whole host of benefits, so having a migration plan in place is a good idea.

Before updating the GraphAcademy courses, running the Langchain examples in the course produced many deprecation warnings.

LangChainDeprecationWarning: The class `ChatOpenAI` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use langchain_openai.ChatOpenAI instead.

While these are just deprecation warnings, they are a distraction, and if you are trying to learn something new, they could dent your confidence.

“Hang on, is this right?”
“Did I do something wrong?”

This was my key motivation for upgrading as soon as possible.

Here are three key changes I came across when updating the courses.

Structure

The internal structure of Langchain has changed, making the core leaner and more concise. As a result, LLM integrations and community tools have moved into their own packages.

For example, if you were previously using Langchain with OpenAI and Neo4j, the majority of what you needed would have been installed with the langchain, openai, and neo4j packages.

pip install langchain openai neo4j

Post v0.1, you need the langchain_openai and langchain_community packages.

pip install langchain openai neo4j langchain_openai langchain_community

These changes are also reflected in how you import the modules. For example, the OpenAI class was previously part of the langchain.llms module:

from langchain.llms import OpenAI

OpenAI is now part of the langchain_openai integration package:

from langchain_openai import OpenAI

The same is true for other LLM integrations and community modules. For example, the Neo4jGraph class, which was previously included in langchain.graphs:

from langchain.graphs import Neo4jGraph

Neo4jGraph is now in the langchain_community.graphs module:

from langchain_community.graphs import Neo4jGraph

Importing modules from the older packages will result in deprecated warning messages with advice about how to install the new package, similar to this:

LangChainDeprecationWarning: Importing chat models from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

`from langchain_community.chat_models import ChatOpenAI`.

To install langchain-community run `pip install -U langchain-community`.

The new package structure gives greater flexibility and also improves dependency management.

Invoking

Many Langchain objects (LLMs, chains, agents, etc.) are callable. For example, you can pass a prompt directly to an LLM object:

llm = OpenAI(openai_api_key="sk-…")
response = llm("What is Neo4j?")

In v0.1, this has been deprecated:

LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use invoke instead.

The recommended approach is to use the invoke method.

llm = OpenAI(openai_api_key="sk-…")
response = llm.invoke("What is Neo4j?")

Passing prompt parameters as kwargs has also been deprecated; this, too, is standardized through invoke.

For example, consider a chain that previously received the prompt parameter fruit via the run method:

from langchain_openai import OpenAI
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

llm = OpenAI(openai_api_key="sk-…")

template = PromptTemplate.from_template(
    """
    Tell me about the following fruit: {fruit}
    """)

llm_chain = LLMChain(
    llm=llm,
    prompt=template
)

response = llm_chain.run(fruit="apple")

Parameters should now be passed as a dictionary through the invoke method.

response = llm_chain.invoke({"fruit": "apple"})

The same pattern is used throughout Langchain, simplifying and standardizing the API, making it easier to integrate and work with.

Agents

From my perspective, the process of creating and working with agents is the most significant change in Langchain v0.1.

Previously, you could create a runnable agent using the initialize_agent function:

from langchain.agents import AgentType, initialize_agent

agent = initialize_agent(
    tools,
    llm,
    memory=memory,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
)

The initialize_agent function has now been deprecated:

The function `initialize_agent` was deprecated in LangChain 0.1.0 and will be removed in 0.2.0. Use Use new agent constructor methods like create_react_agent, create_json_agent, create_structured_chat_agent, etc. instead.

Langchain v0.1 introduces more agent types, and they are created using specific functions, such as create_react_agent.

In addition to using the new agent-type-specific functions, you will need to follow a three-step process for creating an agent:

  1. Create an agent prompt
  2. Create an agent
  3. Create an agent executor

from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent

agent_prompt = hub.pull("hwchase17/react-chat")
agent = create_react_agent(llm, tools, agent_prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, memory=memory)

A key change worth noting is that an agent now requires a prompt. This gives greater control and allows you to instruct the LLM on how to select and deal with tools.

You can create a custom prompt, but in this example, the code pulls a pre-existing prompt, hwchase17/react-chat, from the LangSmith Hub.

Because the agent prompt is now so concise, I also found I needed to give the agent better tool names and descriptions for the LLM to select the correct tool.

Langchain 0.1 is an important update. You can learn about how to use Langchain and LLMs alongside Neo4j on our GraphAcademy courses:

  • Neo4j & LLM Fundamentals — In this course, you will learn how to integrate Neo4j with Generative AI models using Langchain and why graph databases are a reliable option for grounding Large Language Models (LLMs).
  • Build a Neo4j-backed Chatbot using Python — In this hands-on course, you will use the knowledge gained in the Neo4j & LLM Fundamentals course to create a Movie Recommendation Chatbot backed by a Neo4j database.

What Is Neo4j GraphAcademy?

At Neo4j GraphAcademy, we offer a wide range of courses completely free of charge, teaching everything from Neo4j Fundamentals to how to develop software that connects to Neo4j.


Langchain v0.1 — Updating Graphacademy Neo4j & LLM courses was originally published in the Neo4j Developer Blog on Medium.