12.9. Using Periodic Commit

Note

See Section 12.8, “Importing CSV files with Cypher” for how to import data from CSV files.

Updating very large amounts of data (for example when importing) with a single Cypher query may fail due to memory constraints, manifesting as an OutOfMemoryError.

For this situation only, Cypher provides the global USING PERIODIC COMMIT query hint for updating queries. You can optionally set the limit for the number of rows per commit like so: USING PERIODIC COMMIT 500.

Periodic commit processes rows until the number of rows reaches the limit. The current transaction is then committed and replaced with a newly opened transaction, and processing continues. If no limit is set, a default value is used.
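As an illustrative sketch of the hint combined with a CSV import (the file URL, the header name, and the :Person label are assumptions, not part of this section), such a query might look like:

```
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM 'file:///people.csv' AS line
CREATE (:Person { name: line.name });
```

Here a commit is issued after every 500 rows, so memory usage stays bounded regardless of the size of the input file.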

See the section called “Importing large amounts of data” in Section 11.6, “Load CSV” for examples of USING PERIODIC COMMIT with and without setting the number of rows per commit.

Important

Using periodic commit prevents running out of memory when updating large amounts of data. However, it also breaks transactional isolation, so it should only be used where needed.