Neo4j Streams Kafka Integration

Neo4j Streams integrates Neo4j with Apache Kafka event streams. It can serve as a source of data, for instance change data capture (CDC), or as a sink to ingest any kind of Kafka event into your graph.
Our Kafka Connect Plugin offers the sink functionality.
The Neo4j Server Extension provides both sink and source, as it also has access to the transaction events. It also adds procedures you can call to send data to and receive data from Kafka.
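As a sketch of the procedure-based usage, the extension's documentation describes streams.publish and streams.consume procedures; the topic name and payload shape below are illustrative:

```cypher
// Publish a payload to a Kafka topic from Cypher
CALL streams.publish('my-topic', {id: 1, name: 'Alice'});

// Consume events from a topic and ingest them into the graph
CALL streams.consume('my-topic', {timeout: 5000}) YIELD event
CREATE (p:Person {id: event.data.id, name: event.data.name});
```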
Neo4j Streams has graduated from Neo4j Labs and is now a fully supported component of Neo4j for Enterprise customers.
Availability & Installation
The Kafka Connect Neo4j Sink Plugin works with Confluent Cloud and with most Apache Kafka distributions, including on-premises deployments, using a separate Kafka Connect framework and infrastructure. This approach is recommended for those who need data sink capabilities.
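A minimal sink connector configuration might look like the following sketch. The property names follow the plugin's documentation; the server URI, credentials, topic name, and Cypher mapping are illustrative placeholders:

```json
{
  "name": "Neo4jSinkConnector",
  "config": {
    "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
    "topics": "my-topic",
    "neo4j.server.uri": "bolt://localhost:7687",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "password",
    "neo4j.topic.cypher.my-topic": "MERGE (p:Person {id: event.id}) SET p.name = event.name"
  }
}
```

The topic-to-Cypher mapping is the key idea: each Kafka record arriving on the named topic is bound to `event` and applied to the graph via the template.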
Alternatively, you can add the Neo4j Server Extension to your own installation: download the matching release version into the plugins folder and configure it according to the documentation (see below for both). This approach is recommended only for those who need data source capabilities.
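The extension is configured in neo4j.conf. The following sketch shows the general shape of the source and sink settings described in the documentation; the broker address, topic names, label, and Cypher template are illustrative:

```properties
# Kafka connection (illustrative broker address)
kafka.bootstrap.servers=localhost:9092

# Source: publish transaction events for Person nodes to the "people" topic
streams.source.topic.nodes.people=Person{*}

# Sink: apply a Cypher template to events arriving on "my-topic"
streams.sink.enabled=true
streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name
```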
Enterprise users may use Neo4j Streams under their existing support agreements. Neo4j Community users may also use Neo4j Streams, with community support.
Relevant Links
- Releases
- Documentation
- Enterprise Customer Support
- Community Support
- Authors: Michael Hunger, David Allen, and Andrea Santurbano and Mauro Roiter from our partners Larus BA, Italy
- Source
- Overview: https://github.com/neo4j-contrib/neo4j-streams/blob/master/readme.adoc
- Issues
Recent Articles
- Confluent Blog: Using Graph Processing for Kafka Stream Visualizations
- How to embrace event-driven graph analytics using Neo4j and Apache Kafka
- How to produce and consume Kafka data streams directly via Cypher with Streams Procedures
- How to leverage Neo4j Streams and build a just-in-time data warehouse with Apache Kafka