Very slow performance with large (local) graph

I just started using LogSeq more extensively and decided to move over my Roam graph, which is fairly large (used for over a year) but not huge. I'm using the newest desktop release on Mac and have imported my Roam JSON. I'm surprised at just how slow basic use of my graph becomes once I import the Roam data. E.g. expanding/collapsing items on a page in the sidebar takes 10+ seconds to execute. It makes the graph basically unusable.

Anyone else experience this type of performance lag? Or have suggestions? I was using a cloud drive for my data files, but have also tried a fully local graph; both are very slow.


Yes, I'm having the same problem. My workaround is to open some large notes in an external editor and continue from there. There was a version that improved performance a bit, but only a bit; it was before the refactoring. Since then it has been as slow as it is now.

Thanks for the reply, Zab. The performance issue persists for me and unfortunately makes LogSeq unusable right now. I'll check back down the line and see if this gets resolved/improved.

Same performance issues here. A real shame, as I had hoped that Logseq and Obsidian could have been a really good option for me.

Sorry to hear you're having the same performance issues, @Jules. I have continued to update with new releases, but so far none has seemed to help with the slow performance.

Did you manage to pinpoint the main cause of the bad performance? Mine takes a long time to reindex/parse, but once that's done I can type/fold at normal speeds; the graph view takes forever, though, and opening the sidebar can be slow.
I found that files with >500 bullets or files with lots of code blocks were slow.
Do you have more details to share? Is it choking on specific files or patterns?
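
For anyone trying to narrow that down, here is a minimal sketch of a scan over a local graph folder that flags pages matching those patterns. It only assumes a folder of Markdown files; the 500-bullet and code-block thresholds (and the path handling) are illustrative placeholders, not anything Logseq itself defines:

```python
#!/usr/bin/env python3
"""Rough scan of a local graph folder to spot pages matching the
"slow page" patterns mentioned above (many bullets, many code blocks)."""
import re
import sys
from pathlib import Path

BULLET_RE = re.compile(r"^\s*[-*] ")   # a Markdown bullet at any indent level
FENCE_RE = re.compile(r"^\s*```")      # start/end of a fenced code block

BULLET_THRESHOLD = 500   # from the observation above; purely a heuristic
FENCE_THRESHOLD = 10     # arbitrary cut-off for "lots of code blocks"

def scan(graph_dir: Path) -> None:
    for md in sorted(graph_dir.rglob("*.md")):
        lines = md.read_text(encoding="utf-8", errors="replace").splitlines()
        bullets = sum(1 for line in lines if BULLET_RE.match(line))
        code_blocks = sum(1 for line in lines if FENCE_RE.match(line)) // 2
        if bullets > BULLET_THRESHOLD or code_blocks > FENCE_THRESHOLD:
            print(f"{md.name}: {len(lines)} lines, {bullets} bullets, "
                  f"{code_blocks} code blocks")

if __name__ == "__main__":
    # Pass the path to your graph folder, e.g. the one containing pages/ and journals/
    scan(Path(sys.argv[1] if len(sys.argv) > 1 else "."))
```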


For me the graph functions OK on new pages, but after importing my Roam DB, any of my somewhat large Roam pages (with a good number of bullets) are very, very slow to load, type in, or use in any meaningful way, either in the sidebar or the main window. New daily pages and other new pages are better, but I need to be able to work across the whole DB or the value of the graph just isn't there. Reindexing doesn't help.

It's a known issue with large pages that have many blocks; we're going to improve this soon, possibly next week.


I have the same problem with the Linux app with notes in a local folder. It’s so slow, it’s unusable.

Great to hear that, @tienson. I'd love to be using LogSeq, so hopefully that fix will make it usable for me. I'll look for the update.

Hi @tienson, is the 0.2.10 release the one that is supposed to address the slow performance? I see the release notes mention that long pages should load faster. If it's helpful to know, I updated and see no improvement in performance with my large graph pages. For example, it took 10+ seconds to open a large page in the sidebar, and expanding/collapsing blocks in that page takes 10+ seconds…

That’s very encouraging!
I’ve been a Clojure enthusiast for many years and Dynalist is a deep part of my workflow; naturally, I would love to use logseq, but my main outline is unusably slow as things stand.

Could you share specific metrics about a slow page? How many bullets/lines, does it contain a lot of links, indentation levels, code blocks, …

In 0.2.10 it seems there is now a pagination system that fetches around 200 bullets per 'page' (not sure about the exact number; it's an approximation from my quick experiment).
I tested a medium file with 3577 lines spread across approx. 500 bullets, no indentation/sub-levels, only 1st-level bullets + properties. The initial loading time is 5-6 seconds, then switching pages with more/prev takes about 4 s. (This is a noticeable improvement over the previous version with this specific file; before, I think it took between 15 and 20 s to load.)
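
For anyone who wants to share the page metrics asked about above (lines, bullets, links, indentation levels, code blocks) for a single slow page, here is a quick sketch. The [[wiki-link]] pattern and the depth calculation (one level per tab or four spaces) are rough approximations, not Logseq's actual parsing rules:

```python
#!/usr/bin/env python3
"""Print rough size metrics for one Markdown page: lines, bullets,
[[links]], maximum indentation depth, and fenced code blocks."""
import re
import sys
from pathlib import Path

BULLET_RE = re.compile(r"^(\s*)[-*] ")
LINK_RE = re.compile(r"\[\[[^\]]+\]\]")
FENCE_RE = re.compile(r"^\s*```")

def metrics(path: Path) -> dict:
    lines = path.read_text(encoding="utf-8", errors="replace").splitlines()
    bullets, max_depth, links, fences = 0, 0, 0, 0
    for line in lines:
        m = BULLET_RE.match(line)
        if m:
            bullets += 1
            indent = m.group(1).replace("\t", "    ")  # treat a tab as one level
            max_depth = max(max_depth, len(indent) // 4 + 1)
        links += len(LINK_RE.findall(line))
        fences += bool(FENCE_RE.match(line))
    return {
        "lines": len(lines),
        "bullets": bullets,
        "max_indent_level": max_depth,
        "links": links,
        "code_blocks": fences // 2,
    }

if __name__ == "__main__":
    # usage: python page_metrics.py path/to/page.md
    for key, value in metrics(Path(sys.argv[1])).items():
        print(f"{key}: {value}")
```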