Importing multiple JSON files (JSON bulk import)

Hello Everyone,

I'm modeling datasets for our network infrastructure and have modeled much of it already, but I'm stuck on bulk-importing the JSON interface files.
I tried the import tool with a filename wildcard, but it returned an error:

bash-3.2$ ./neo4j-admin import --nodes:Interface="../import/device-interfaces_*" --ignore-missing-nodes
Expected '--nodes' to have at least 1 valid item, but had 0

It appears that the import tool only accepts CSV files, so a conversion may be required.

The JSON files look like this:
"response": [
"status": "up",
"adminStatus": "UP",
"macAddress": "",
"ipv4Address": null,
"voiceVlan": null,
"description": " some desc",
"className": "SwitchPort",
"interfaceType": "Physical",
"speed": "100000",
"portMode": "access",
"portType": "Ethernet Port",
"duplex": "FullDuplex",
"vlanId": "",
"portName": "GigabitEthernet0/1",
"id": ""
Any advice on the best way to bulk-import these JSON files, given that every node will carry the same label 'Interface'?


Thank you,

How many nodes/relationships are you trying to create from those JSON files?

A possible option is to use the Load JSON procedure (apoc.load.json) from APOC, the Neo4j standard library. You can learn about that here -

That is executed while the database is running, though, which is different from how the neo4j-admin import tool works.

If you want to use neo4j-admin import, perhaps you could use the jq tool to convert your JSON files into CSV format?
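If jq isn't available, a short Python script can do the same flattening. Here's a minimal sketch, assuming each file matches `device-interfaces_*.json` and holds a top-level "response" array of interface objects like your sample (the column list and output path are assumptions to adjust; note that neo4j-admin import will also expect an `:ID` header column in the final CSV):

```python
import csv
import glob
import json

# Columns taken from the sample interface object; adjust to the real schema.
FIELDS = ["portName", "status", "adminStatus", "macAddress", "ipv4Address",
          "speed", "portMode", "duplex", "vlanId", "interfaceType"]

def json_to_csv(pattern, out_path):
    """Flatten every 'response' array in the matching JSON files
    into one CSV that neo4j-admin import can consume."""
    with open(out_path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=FIELDS)
        writer.writeheader()
        for path in sorted(glob.glob(pattern)):
            with open(path) as f:
                data = json.load(f)
            for iface in data.get("response", []):
                row = {}
                for k in FIELDS:
                    v = iface.get(k)
                    # Replace nulls/missing values with empty strings
                    # so the CSV stays clean.
                    row[k] = "" if v is None else v
                writer.writerow(row)

# Example (hypothetical paths):
# json_to_csv("../import/device-interfaces_*.json", "../import/interfaces.csv")
```

One CSV per label keeps the neo4j-admin invocation simple: a single `--nodes:Interface=interfaces.csv` argument instead of a wildcard over JSON files.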

Hi Mark,

Thanks for the input. I imported the datasets using apoc.load.json against the API, but it returned lots of NULLs for various properties. My temporary fix (painstaking though it was) was to save the API datasets to a JSON file, correct/set the values by hand, and then import the file. However, I still need a way to condition my script so that when I import from the API and a property comes back null, it is set to the value I want.
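One way to script that defaulting step is to apply per-field fallback values to each record before it is loaded. A minimal sketch, assuming the API payload matches the sample above (the fallback values themselves are placeholders to replace with whatever fits your model):

```python
# Fallback values to substitute when the API returns null or "".
# These defaults are placeholders; pick what makes sense for your model.
DEFAULTS = {
    "macAddress": "unknown",
    "ipv4Address": "0.0.0.0",
    "voiceVlan": "none",
    "vlanId": "0",
    "description": "no description",
}

def clean_interface(iface):
    """Return a copy of an interface record with null/empty
    properties replaced by the configured defaults."""
    cleaned = dict(iface)
    for key, fallback in DEFAULTS.items():
        value = cleaned.get(key)
        if value is None or value == "":
            cleaned[key] = fallback
    return cleaned

# Usage: nulls and empty strings are replaced before import.
record = {"portName": "GigabitEthernet0/1", "ipv4Address": None, "vlanId": ""}
cleaned = clean_interface(record)
```

Running each API record through a function like this before creating nodes means the import never sees a null in the first place, instead of patching the saved JSON by hand.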