Limiting the number of results per request

Let's say I send a query to fetch data and the answer contains 100,000 objects. If every user sends such requests, memory will quickly run out. My task is for the user to submit a request and get 10 results; then, when they click Next (or scroll), they get the next 10 results. This task is similar to fetching friends on Facebook. How can such a query be made optimal?
Building a new query each time for the next 10 records seems very sub-optimal to me; at the same time, keeping a large cache is very expensive.

@89099124147

Do you have more details on why the memory will quickly run out?

Doesn't Neo4j cache query results?
Neo4j does cache recently accessed data, governed by the size of dbms.memory.pagecache.size. If, for example, you configure this to 5G and you access 10G worth of data, it will only keep the most recently used 5G of data.
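For reference, a sketch of that setting in neo4j.conf (the 5G figure is just the example above, not a recommendation):

```
# neo4j.conf: cap the page cache; least recently used pages are evicted first
dbms.memory.pagecache.size=5G
```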

And yes, your description may result in a memory issue at the client end, i.e. how to display 100,000 objects. See the Cypher SKIP and LIMIT clauses: SKIP - Cypher Manual
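For example, a minimal sketch of offset-based paging with SKIP and LIMIT, assuming a hypothetical (:User)-[:FRIEND]->(:User) model ($userId and $skip are illustrative parameters, not from this thread):

```
// Return one page of 10 friends; ORDER BY is required so the
// pages stay stable between requests.
MATCH (u:User {id: $userId})-[:FRIEND]->(f:User)
RETURN f.name AS name
ORDER BY f.name
SKIP $skip    // e.g. 0 for page 1, 10 for page 2, 20 for page 3
LIMIT 10;
```

Each request then materializes only one page on the server, so neither side has to hold the full 100,000-row result.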

How do I correctly compose a query so that data is sent to the client in chunks?
After I serve the first chunk, the last element needs to be remembered, and so on. If my query returns a large amount of data, maybe I should limit it to a specified number of items? For example, I limit it to 1000 records and give the client 10 at a time; once the client has looked through those 1000, I transfer the next 1000. Is there an example of how to implement this properly?

See SKIP - Cypher Manual
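One caveat: SKIP still walks past every skipped row, so very deep pages get progressively slower. The "remember the last element" idea in the question maps to keyset paging: pass the last value you returned and continue from there. A minimal sketch, assuming friends are ordered by a unique name property ($userId and $lastName are illustrative parameters):

```
// First page: no cursor yet.
MATCH (u:User {id: $userId})-[:FRIEND]->(f:User)
RETURN f.name AS name
ORDER BY f.name
LIMIT 10;

// Subsequent pages: $lastName is the last name returned by the
// previous page; seeking past it replaces the linear SKIP.
MATCH (u:User {id: $userId})-[:FRIEND]->(f:User)
WHERE f.name > $lastName
RETURN f.name AS name
ORDER BY f.name
LIMIT 10;
```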