What I'm trying to do right now is load data for millions of people into a Neo4j graph database quickly and efficiently. The data files exceed 170 GB in total. Can you give me an effective description of how I should go about achieving this? Obviously the person nodes need to be connected to other nodes of different types. Thank you in advance!
What is the most efficient and fast way to load very large volumes of data into a Neo4j graph database?
Hi @kostas_vavouris !
Based on my experience, you can handle this as a CSV import.
Using PERIODIC COMMIT is important in order to reduce RAM usage. Avoid trying to do everything in a single query. First, guarantee the creation of your base nodes (Person), with CREATE if possible. After that, create the indexes, and then the relationships.
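A minimal sketch of those three steps, assuming hypothetical file names (`persons.csv`, `knows.csv`), property names (`id`, `name`), and a `KNOWS` relationship type -- adjust these to your actual data. Note that `USING PERIODIC COMMIT` applies to Neo4j 3.x/4.x; in newer versions it has been replaced by `CALL { ... } IN TRANSACTIONS`, and the index syntax also differs by version:

```
// Step 1: create the base Person nodes in batches of 10k rows.
USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM 'file:///persons.csv' AS row
CREATE (:Person {id: row.id, name: row.name});

// Step 2: index the key you will match on when wiring up relationships.
CREATE INDEX ON :Person(id);

// Step 3: create relationships, using the index to find both endpoints.
USING PERIODIC COMMIT 10000
LOAD CSV WITH HEADERS FROM 'file:///knows.csv' AS row
MATCH (a:Person {id: row.from_id})
MATCH (b:Person {id: row.to_id})
CREATE (a)-[:KNOWS]->(b);
```

For a one-off initial load at the 170 GB scale, the offline `neo4j-admin import` tool may also be worth looking at, since it bypasses the transaction layer entirely and is designed for bulk-loading an empty database.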
Thanks! I will try your solution!