Data loss happened twice, I can't trust Logseq anymore

Exact same thing happened to me. I was on a 4-hour flight and did a ton of work in Logseq. As soon as I connected to wifi, my laptop synced (using Logseq’s sync) and then all my work was gone. I’ve been looking for a third-party sync to use so I can dump Logseq Sync. Even iCloud sync seemed to be more stable than this.

@lsdjr72 @arudiuk
Sorry to hear about that, have you guys tried page history?

Also, please make sure smart merge is enabled on all your devices.

See here for instructions on how to enable Smart Merge.

Smart merge is enabled on all devices. So far I have managed to recover everything, but I make changes across multiple pages and it takes quite a bit of time to go to each page and recover. I even find myself having to recover/correct a page multiple times.
I also worry that if I don’t catch a sync mistake, I will forget I ever added anything and lose that information.

Lost a day’s worth of work today. The bak folder was no help either. Have been using Logseq for a bit over a month and have really enjoyed it. Not using any sync features, just have the files in the Document folder on my Mac (and only using Logseq on this one computer). Not sure if I will keep using Logseq after this, feels really scary.


I know this should not be necessary for an app you want to rely on, but I would urge everyone suffering data loss issues to set up a git repository. Install git, go to your Logseq graph folder in the terminal and type git init. Then turn on auto commit in settings->Version control->Enable Git auto commit. Set the time in seconds (I have mine set to 60 just to be safe).
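The setup described above boils down to a few commands (the graph path is an example; adjust it to where your graph lives):

```shell
# one-time git setup for a Logseq graph folder
cd ~/Documents/logseq-graph      # example path: your graph folder
git init                         # create the repository
git add -A                       # stage every page and asset
git commit -m "initial snapshot" # first snapshot of the whole graph
```

After this, Logseq's "Enable Git auto commit" setting takes over and commits on the interval you set.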

This will pretty much ensure that if you ever see data missing, you can quickly and easily recover it. You can even view page history directly within Logseq, although it’s not the best experience. Something like VS Code will highlight changes between versions of the note, which is helpful.
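Recovery from the terminal is also straightforward. A rough sketch, assuming your pages live under `pages/` and using `pages/projects.md` as an example filename:

```shell
# list the commits that touched a page
git log --oneline -- pages/projects.md

# see what the most recent commit changed in it
git diff HEAD~1 -- pages/projects.md

# restore the page as it was one commit ago
git checkout HEAD~1 -- pages/projects.md
```

Replace `HEAD~1` with any commit hash from `git log` to go further back.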

I guess for the people who have Logseq sync wiping out their data, this could also affect the git folder, which would not be good. But if you’re just working on one machine, this is a good solution.


I just recently experienced this. I think making it much more obvious when you’ve been logged out would be a UX improvement. It’s really not that visually obvious currently, and there is no popup notifying you to log in again.

Experienced major data loss today because of this Sync issue. The order of events is as follows:

  1. Logseq Sync is enabled
  2. Logseq keeps prompting me there is a conflict between cloud version and the local version as I type.
  3. I disable Sync and keep writing
  4. Before I stop for the day, I re-enable Sync
  5. The next thing I see is my day’s work is gone. The cloud version has overwritten the local version.
  6. I manage to recover some of it by restoring the last version from page history.

The lesson here, I guess, is either never use Sync or, if you do use it, never disable it.

Another day, another data loss. This time page history didn’t save me, either. I lost an hour of my notes. Hoping the team finds a solution for this issue quickly. Losing data on a daily basis undermines the credibility of the entire project in my mind. I still use it because I have multiple backups, and time machine, but I’m still losing data on a daily basis.

So… this critical issue not being resolved, and there being no commentary here about it – this makes me feel like something significant has been abandoned. Is there any consensus as to the state of the Logseq project?

The team is working on the DB version, which is the only way to improve syncing and solve issues like yours.


not to be picky but it isn’t the only way to improve sync (there are mountains of file-based sync implementations that do this just fine without catastrophic loss…) and a db-backed system doesn’t actually solve these issues inherently. it sounds like a good way to go and the way they’ve chosen to do it, but a db-backed sync process is going to have its own liabilities and issues that are not dissimilar from a text/block sync pattern, and on top of that it will have to solve sync issues between the files on disk and the db. a db backend will likely improve performance for large datasets and synchronization – not improve accuracy or reduce failures.

additionally this bug does not look like an actual sync issue. sync works great except when the system logs you out without notification. this could certainly be handled better than a silent failure.

sadly (very sadly) i think we’ve found the end of my road with logseq. i will dearly miss the superior implementation of a great number of things, but if this is the way the dev team is handling this issue, i’ve run enough software projects to have an intuition as to where this is going. what a shame. Roam is run by people whose politics I can’t personally abide, and the other tool that works the way I like is Logseq, which has caused the loss of more than 50 hours of my work (even with Time Machine and scripted backups… my losses were work that Logseq never wrote to disk!)

lack of transparency, failure to pull or repair a horrible bug, pushing resolution of an issue to an eventual/planned retooling or rewrite… i get it and i’ve been there, but c’mon.


Happened to me here again too, twice now. It really is foundational for this to work, especially when I’m paying for it!

I’m not even asking for smart syncing, I just want it to fail safe and not delete, rather than silently deleting whole chunks. If it’s diffing between files, why can’t it go chunk by chunk and keep whichever side adds, never removing data? Then flag it and ask if you want it to start removing things. It’s so much easier to look and say “oh yeah, I wanted to remove that chunk but it got restored, let me press delete” than to recreate, days later, text that took hours to write.
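A fail-safe, additive merge like this can be sketched in a few lines. This is a hypothetical illustration of the idea, not Logseq’s actual merge code; `additive_merge` is an invented name:

```python
import difflib

def additive_merge(local, remote):
    """Union merge of two lists of lines: keep text present on either
    side, never silently drop anything; flag differing chunks so the
    user can decide what to delete."""
    sm = difflib.SequenceMatcher(a=local, b=remote)
    merged, flagged = [], []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            merged.extend(local[i1:i2])
        else:
            # one side added or changed something: keep both versions
            merged.extend(local[i1:i2])
            merged.extend(remote[j1:j2])
            flagged.extend(local[i1:i2] + remote[j1:j2])
    return merged, flagged
```

For example, merging `["a", "b", "c"]` with `["a", "c", "d"]` keeps all four lines and flags `"b"` and `"d"` for review instead of discarding either side.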

And for the times when it’s a 50/50 diff, say a word replaced here or there, I’d still rather have a more interactive method than the app just deciding. But that case is so much rarer that it’s not worth holding up a fix for the broader issue.

So incredibly frustrating to lose work and see new changelogs which address anything but that. I’ve seen so many forum posts about this, if it happens again and it’s not in my backups, I’m gone :frowning: I really like it too


You are not, please read this comment:


Same, I have experienced multiple sync failures and am concerned about data integrity. Often, I am working locally (e.g. when on a plane), and when I am back online, the cloud version overrides my local copy, destroying all of the work I did offline.

This becomes even more of a problem when I add/edit multiple pages, and when I’m back online, and it syncs, I have no idea which pages got overridden/reverted, and which ones haven’t. So I’m completely in the dark.

And just today, another issue cropped up where I’m getting an error connecting to the AWS server. When I click on the URL in the error message box, it says something about being disabled.

Data integrity is extremely important and I hope the Logseq dev team prioritize making sure the sync feature is rock solid so that data loss is no longer an issue.


Chiming in because I’ve also experienced a huge data loss. Nothing written in the past 2 weeks has been saved – not even in Sync’s history or bak. I don’t even know what to do… :pensive:

I wanted to report that I had significant data loss for unknown reasons. Many of my journal entries were completely wiped, but the files remain. Luckily I had backups. I’ll test git soon to see if it fixes this issue; otherwise I’d be forced to use another sync service.


Upon further investigation, I believe the smart merge feature is what caused the data loss, as it seems the remote graph was overwriting my data with empty entries.

How did your test of Git go? I am interested in trying it out.

I’ve been using Git to sync a MacBook Pro and an iPhone for about 6 months. Lots of TODO and issue tracking. No sign of data loss yet.
As noted in that thread above, I am now testing syncing two different graphs using Git.
If there’s a downside to Git sync, it’s having to be careful not to have both devices open at once.


I’d argue that using multiple devices at once is less of an issue when you’re syncing with git than with other third-party solutions. You’re likely to end up with conflicting copies of files with something like Syncthing or cloud services, whereas git will merge the differences, and if there are conflicts it will show you exactly what they are in the same file so you can fix them manually.
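For illustration, an unresolved git conflict keeps both versions inline in the page (the content below is hypothetical), so nothing is silently lost:

```
<<<<<<< HEAD
- TODO call the bank on Monday
=======
- DONE call the bank
>>>>>>> origin/main
```

You resolve it by deleting the markers and whichever lines you don’t want, then committing the result.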