apoc.import.json

Procedure APOC Core

apoc.import.json(urlOrBinaryFile,config) - imports nodes and relationships from the provided JSON file or binary data

Signature

apoc.import.json(urlOrBinaryFile :: ANY?, config = {} :: MAP?) :: (file :: STRING?, source :: STRING?, format :: STRING?, nodes :: INTEGER?, relationships :: INTEGER?, properties :: INTEGER?, time :: INTEGER?, rows :: INTEGER?, batchSize :: INTEGER?, batches :: INTEGER?, done :: BOOLEAN?, data :: STRING?)

Input parameters

Name | Type | Default
urlOrBinaryFile | ANY? | null
config | MAP? | {}

Config parameters

This procedure supports the following config parameters:

Table 1. Config parameters

name | type | default | description
unwindBatchSize | Integer | 5000 | the batch size of the unwind
txBatchSize | Integer | 5000 | the batch size of the transaction
importIdName | String | neo4jImportId | the name of the property to be populated with the "id" field present in the JSON. For example, the row {"type":"node", "labels":["Language"], "id":"10"}, with importIdName: 'foo', will create the node (:Language {foo: "10"})
nodePropertyMappings | Map | {} | the mapping of label / property name / property type for custom Neo4j types (Point, Date, etc.), e.g. { User: { born: 'Point', dateOfBirth: 'Datetime' } }
relPropertyMappings | Map | {} | the mapping of relationship type / property name / property type for custom Neo4j types (Point, Date, etc.), e.g. { KNOWS: { since: 'Datetime' } }
compression | Enum[NONE, BYTES, GZIP, BZIP2, DEFLATE, BLOCK_LZ4, FRAMED_SNAPPY] | null | allows taking binary data, either uncompressed (value: NONE) or compressed (other values). See the Binary file example below

nodePropertyMappings and relPropertyMappings support the following Neo4j types (a usage sketch follows this list):

  • Point

  • Localdate

  • Localtime

  • Localdatetime

  • Duration

  • Offsettime

  • Zoneddatetime
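
As a hedged sketch of how these options can be combined (the file name users.json, the property name foo, and the mapped property names are illustrative assumptions, not taken from the example below):

CALL apoc.import.json("file:///users.json", {
  importIdName: 'foo',
  nodePropertyMappings: { User: { born: 'Point', dateOfBirth: 'Datetime' } },
  relPropertyMappings: { KNOWS: { since: 'Datetime' } }
})

With a config like this, each imported node keeps its original export id in the foo property, and the born, dateOfBirth and since values are stored as Point and Datetime values rather than as plain strings.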

Output parameters

Name | Type
file | STRING?
source | STRING?
format | STRING?
nodes | INTEGER?
relationships | INTEGER?
properties | INTEGER?
time | INTEGER?
rows | INTEGER?
batchSize | INTEGER?
batches | INTEGER?
done | BOOLEAN?
data | STRING?

Usage Examples

The apoc.import.json procedure can be used to import JSON files created by the apoc.export.json.* procedures.

all.json contains a small graph of User nodes and a KNOWS relationship, and was generated by apoc.export.json.all.
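
A minimal sketch of how such a file could have been produced (the config shown is illustrative; any of the apoc.export.json.* procedures can generate compatible files):

CALL apoc.export.json.all("all.json", {useTypes: true})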

all.json
{"type":"node","id":"0","labels":["User"],"properties":{"born":"2015-07-04T19:32:24","name":"Adam","place":{"crs":"wgs-84","latitude":13.1,"longitude":33.46789,"height":null},"age":42,"male":true,"kids":["Sam","Anna","Grace"]}}
{"type":"node","id":"1","labels":["User"],"properties":{"name":"Jim","age":42}}
{"type":"node","id":"2","labels":["User"],"properties":{"age":12}}
{"id":"0","type":"relationship","label":"KNOWS","properties":{"since":1993,"bffSince":"P5M1DT12H"},"start":{"id":"0","labels":["User"]},"end":{"id":"1","labels":["User"]}}

We can import this file using apoc.import.json.

CALL apoc.import.json("file:///all.json")
Table 2. Results
file | source | format | nodes | relationships | properties | time | rows | batchSize | batches | done | data
"file:///all.json" | "file" | "json" | 3 | 1 | 15 | 105 | 4 | -1 | 0 | TRUE | NULL
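
To verify the import, a follow-up query such as the following can be used (a sketch based on the properties present in all.json above):

MATCH (u:User)
RETURN u.name AS name, u.age AS age
ORDER BY age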

Binary file

You can also import from a binary byte[], either uncompressed (the default, with config {compression: 'NONE'}) or compressed (supported compression algorithms: GZIP, BZIP2, DEFLATE, BLOCK_LZ4, FRAMED_SNAPPY). That is:

CALL apoc.import.json(`binaryFileNotCompressed`, {compression: 'NONE'})

or:

CALL apoc.import.json(`binaryGzipByteArray`, {compression: 'GZIP'})

For example, this works well in combination with the apoc.util.compress function:

WITH apoc.util.compress('{"type":"node","id":"2","labels":["User"],"properties":{"age":12}}', {compression: 'DEFLATE'}) AS jsonCompressed
CALL apoc.import.json(jsonCompressed, {compression: 'DEFLATE'})
YIELD source, format, nodes, relationships, properties
RETURN source, format, nodes, relationships, properties
Table 3. Results
source | format | nodes | relationships | properties
"binary" | "json" | 1 | 0 | 2