Import the complete Wikidata into Neo4j

I used

to import the whole of Wikidata into Neo4j 3.5 as an educational resource, but it does not work with Neo4j 4.3. Is there a way to import Wikidata into Neo4j 4.3, maybe via n10s?
Best,
Andreas

What exactly doesn't work, Andreas?
Perhaps we can just fix it, depending on which APIs it uses.

Sorry, this is beyond my competence ;-) Is there a chance to use n10s for the whole import?

Here is a blog post from Tomaz Bratanic

And here's one from @jesus.barrasal; we've recently talked about importing a data dump from Wikipedia with neosemantics.

And a how-to guide:

Thanks :slight_smile:
They all focus on importing a subset of Wikidata. It would be great to have a way to import the 60 GB Turtle dump from here: Index of /wikidatawiki/entities/
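
Before throwing the full 60 GB at an importer, it might be worth a cheap sanity check that the Turtle parses at all. A minimal sketch, assuming n10s is installed and that n10s.rdf.stream.fetch (which, as far as I know, parses triples without writing anything to the graph) honours a limit option; the file path is just a placeholder for a local copy of the dump:

// Parse only a small sample of the dump, persisting nothing
CALL n10s.rdf.stream.fetch(
  "file:///import/latest-all.ttl",   // placeholder path
  "Turtle",
  { limit: 100 }
)
YIELD subject, predicate, object
RETURN subject, predicate, object;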

Do you have a good way to import the complete dataset? latest-all.json is so large.

Unfortunately not ... it really takes some time ;-)

A new attempt could be made with neosemantics. We tried this approach:

// Create a uniqueness constraint on resource URIs (required by n10s)
CREATE CONSTRAINT n10s_unique_uri FOR (r:Resource) REQUIRE r.uri IS UNIQUE;

// Set the initial graph config
CALL n10s.graphconfig.init({
  handleVocabUris: 'MAP',        // vocabulary URI handling mode
  handleMultival: 'ARRAY',       // store multi-valued properties as arrays
  keepLangTag: true,             // keep language tags on string literals
  keepCustomDataTypes: true,     // keep custom datatype annotations
  applyNeo4jNaming: true         // apply Neo4j naming conventions to labels and properties
});

// Import the Turtle dump
CALL n10s.rdf.import.fetch(
  "file:///data/neo4j-community-5.7.0/import/latest-all.ttl",
  "Turtle",
  { verifyUriSyntax: false }
)
YIELD terminationStatus, triplesLoaded, triplesParsed, namespaces, extraInfo
RETURN terminationStatus, triplesLoaded, triplesParsed, namespaces, extraInfo;

This trick comes from Jesus: { verifyUriSyntax: false }
Thanks for that!
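
If throughput drops as the graph grows, the batch size might also be worth experimenting with. A variant of the call above, assuming n10s's commitSize option sets the number of triples committed per transaction (the value here is just a guess, not a benchmark):

// Same import with an explicit transaction batch size
CALL n10s.rdf.import.fetch(
  "file:///data/neo4j-community-5.7.0/import/latest-all.ttl",
  "Turtle",
  { verifyUriSyntax: false, commitSize: 50000 }  // commitSize is an assumption we haven't tested
)
YIELD terminationStatus, triplesLoaded
RETURN terminationStatus, triplesLoaded;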
The import has been running for 24 hours; we have over 120,000,000 nodes now and it has not finished yet ;-)
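
For checking progress without slowing the import down, a per-label node count is cheap in Neo4j because it is answered from the count store:

// Fast progress check; count(n) with a single label hits the count store
MATCH (n:Resource)
RETURN count(n) AS resourcesSoFar;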