apoc.import.json

Procedure

apoc.import.json(urlOrBinaryFile ANY, config MAP<STRING, ANY>) - imports a graph from the provided JSON file.

Signature

apoc.import.json(urlOrBinaryFile :: ANY, config = {} :: MAP) :: (file :: STRING, source :: STRING, format :: STRING, nodes :: INTEGER, relationships :: INTEGER, properties :: INTEGER, time :: INTEGER, rows :: INTEGER, batchSize :: INTEGER, batches :: INTEGER, done :: BOOLEAN, data :: STRING)

Input parameters

Name             Type  Default

urlOrBinaryFile  ANY   null
config           MAP   {}

Config parameters

This procedure supports the following config parameters:

Table 1. Config parameters

unwindBatchSize (INTEGER, default: 5000)
    The batch size of the UNWIND.

txBatchSize (INTEGER, default: 5000)
    The batch size of the transaction.

importIdName (STRING, default: neo4jImportId)
    The name of the property to be populated with the "id" field present in the JSON. For example, the row {"type":"node", "labels":["Language"], "id":"10"}, imported with importIdName: `foo`, will create the node (:Language {foo: "10"}).

nodePropertyMappings (MAP, default: {})
    The label/property name/property type mapping for custom Neo4j types (point, date). E.g. { User: { born: 'Point', dateOfBirth: 'Datetime' } }.

relPropertyMappings (MAP, default: {})
    The relationship type/property name/property type mapping for custom Neo4j types (point, date). E.g. { KNOWS: { since: 'Datetime' } }.

compression (Enum[NONE, BYTES, GZIP, BZIP2, DEFLATE, BLOCK_LZ4, FRAMED_SNAPPY], default: null)
    Allows taking binary data, either not compressed (value: NONE) or compressed (other values).

cleanup (BOOLEAN, default: false)
    Removes the "id" field from the JSON, if present. Note that with this option relationships cannot be imported, because the "id" field is needed to connect the nodes.

nodePropFilter (MAP<STRING, LIST<STRING>>, default: {})
    A map with labels as keys and the lists of property keys to skip during the import as values. For example, { User: ['name', 'surname'], Another: ['foo'] } will skip the properties 'name' and 'surname' of nodes with the label 'User' and the property 'foo' of (:Another) nodes.
    Note that if a node has multiple labels, in this example (:User:Another {}), the properties of both labels will be filtered, that is 'name', 'surname', and 'foo'.
    The key _all can also be used to filter the properties of all nodes, for example {_all: ['myProp']}.

relPropFilter (MAP<STRING, LIST<STRING>>, default: {})
    A map with relationship types as keys and the lists of property keys to skip during the import as values. For example, { MY_REL: ['foo', 'baz'] } will skip the properties 'foo' and 'baz' of [:MY_REL] relationships.
    The key _all can also be used to filter the properties of all relationships, for example {_all: ['myProp']}.
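
For instance, a call combining importIdName with the property filters might look like the following sketch (the file name, labels, relationship type, and property names are illustrative assumptions):

// Hypothetical file and property names, for illustration only
CALL apoc.import.json("file:///data.json", {
  importIdName: 'importId',
  nodePropFilter: { User: ['name', 'surname'], _all: ['tempProp'] },
  relPropFilter: { KNOWS: ['internalFlag'] }
})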

nodePropertyMappings and relPropertyMappings support the following Neo4j types:

  • Point

  • Localdate

  • Localtime

  • Localdatetime

  • Duration

  • Offsettime

  • Zoneddatetime
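
A call applying such mappings might look like the following sketch (the file name, label, relationship type, and property names are illustrative assumptions):

// Hypothetical file and property names, for illustration only
CALL apoc.import.json("file:///data.json", {
  nodePropertyMappings: { User: { born: 'Localdatetime', place: 'Point' } },
  relPropertyMappings: { KNOWS: { bffSince: 'Duration' } }
})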

Output parameters

Name           Type

file           STRING
source         STRING
format         STRING
nodes          INTEGER
relationships  INTEGER
properties     INTEGER
time           INTEGER
rows           INTEGER
batchSize      INTEGER
batches        INTEGER
done           BOOLEAN
data           STRING

Usage Examples

The apoc.import.json procedure can be used to import JSON files created by the apoc.export.json.* procedures, exported with the config parameter jsonFormat: 'JSON_LINES' (the default).

all.json contains a small sample graph of User nodes and was generated by apoc.export.json.all.

all.json
{"type":"node","id":"0","labels":["User"],"properties":{"born":"2015-07-04T19:32:24","name":"Adam","place":{"crs":"wgs-84","latitude":13.1,"longitude":33.46789,"height":null},"age":42,"male":true,"kids":["Sam","Anna","Grace"]}}
{"type":"node","id":"1","labels":["User"],"properties":{"name":"Jim","age":42}}
{"type":"node","id":"2","labels":["User"],"properties":{"age":12}}
{"id":"0","type":"relationship","label":"KNOWS","properties":{"bffSince":"P5M1DT12H","since":1993},"start":{"id":"0","labels":["User"],"properties":{"born":"2015-07-04T19:32:24","name":"Adam","place":{"crs":"wgs-84","latitude":13.1,"longitude":33.46789,"height":null},"age":42,"male":true,"kids":["Sam","Anna","Grace"]}},"end":{"id":"1","labels":["User"],"properties":{"name":"Jim","age":42}}}

We can import this file using apoc.import.json.

CALL apoc.import.json("file:///all.json")
Table 2. Results
file                source  format  nodes  relationships  properties  time  rows  batchSize  batches  done  data
"file:///all.json"  "file"  "json"  3      1              15          105   4     -1         0        TRUE  NULL
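
After the import, each node carries the import id in the neo4jImportId property (unless importIdName or cleanup is configured). As a quick check, a query like the following sketch could be run against the data imported from all.json:

// The User label and name property come from all.json above
MATCH (u:User)
RETURN u.neo4jImportId AS importId, u.name AS name
ORDER BY importId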

Binary file

You can also import a file from a binary byte[], either not compressed (the default behaviour, with config {compression: 'NONE'}) or compressed (the allowed compression algorithms are GZIP, BZIP2, DEFLATE, BLOCK_LZ4, FRAMED_SNAPPY). That is:

CALL apoc.import.json(`binaryFileNotCompressed`, {compression: 'NONE'})

or:

CALL apoc.import.json(`binaryGzipByteArray`, {compression: 'GZIP'})

For example, this works well together with the apoc.util.compress function:

WITH apoc.util.compress('{"type":"node","id":"2","labels":["User"],"properties":{"age":12}}', {compression: 'DEFLATE'}) AS jsonCompressed
CALL apoc.import.json(jsonCompressed, {compression: 'DEFLATE'})
YIELD source, format, nodes, relationships, properties
RETURN source, format, nodes, relationships, properties
Table 3. Results
source    format  nodes  relationships  properties
"binary"  "json"  1      0              2