Python Script to Import CSV from Local Files to a Remote Connection

I'm trying to create a Python script that queries a Redshift database, transforms the data into a CSV file, and uploads it to a remote Neo4j connection. I am using the Neo4j driver for Python: neo4j · PyPI.

When running the Cypher command LOAD CSV, I get the following error:

neo4j.exceptions.ClientError: {code: Neo.ClientError.Statement.ExternalResourceFailed} {message: Couldn't load the external resource at: file:/var/lib/neo4j/import/file.csv}

I have edited the config file at /etc/neo4j/neo4j.conf, changing the following lines:

# dbms.directories.import=/var/lib/neo4j/import

dbms.security.allow_csv_import_from_file_urls=true

Even after editing the conf file as described in LOAD CSV - Cypher Manual, the default path for loading CSVs is still /var/lib/neo4j/import/. Is there a way to load a CSV from my local file system rather than from the Neo4j server's import directory? And is it possible to load a CSV object directly in the Python script?

Hey,

For loading CSVs from your local machine into a remote database, this thread could help: .net - Can't load CSV data into GrapheneDB Instance - Stack Overflow. However, you'd have to serve the CSV from your local machine and make sure that your local file server is reachable from the machine running the Neo4j DB.
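
A minimal sketch of that approach, assuming placeholder connection details, host, file name, node label, and property names (LOAD CSV is executed by the Neo4j server itself, which then fetches the file over HTTP from your machine):

```python
# Sketch only: URI, credentials, host, file name, :Record label, and the
# id/name properties below are all placeholders, not values from this thread.
# First, on your local machine, expose the directory containing the CSV, e.g.:
#   python3 -m http.server 8000 --directory /path/to/csvs
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://remote-host:7687",
                              auth=("neo4j", "password"))

# The Neo4j server fetches the file itself, so http://<your-local-ip>:8000
# must be reachable from the Neo4j machine, not just from your laptop
# (watch out for NAT and firewalls).
load_csv = """
LOAD CSV WITH HEADERS FROM 'http://<your-local-ip>:8000/file.csv' AS row
CREATE (:Record {id: row.id, name: row.name})
"""

with driver.session() as session:
    session.run(load_csv)
driver.close()
```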

This all seems a little complicated, though. It might be easier to just load the CSV locally with Python and create the data in the DB with Cypher. For that, you might want to have a look at the csv library that is bundled with Python: csv — CSV File Reading and Writing — Python 3.11.1 documentation
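
Something like the following could work; again, the URI, credentials, file name, :Record label, and the id/name properties are placeholder assumptions:

```python
# Sketch of the csv-module approach: read the file locally, then send the
# rows to the remote DB as a query parameter.
import csv
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://remote-host:7687",
                              auth=("neo4j", "password"))

# Read the CSV from the local file system into a list of dicts,
# one dict per row, keyed by the header names.
with open("file.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# UNWIND iterates over the parameter list and creates one node per row,
# all in a single round trip to the server.
create_nodes = """
UNWIND $rows AS row
CREATE (:Record {id: row.id, name: row.name})
"""

with driver.session() as session:
    session.run(create_nodes, rows=rows)
driver.close()
```

For large files you'd want to split `rows` into batches of a few thousand per UNWIND rather than sending everything in one transaction.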

Hello, thank you for your response. I used the Python csv library, and it worked well as a solution.