Logseq Sync is not Beta software

In our development shop, you are not allowed to call software beta if it corrupts data or loses data. Logseq Sync both corrupts data and loses data. This service is alpha at best.

I ran across another significant data-loss defect, so I downgraded to 0.9.19. I’m currently studying for a certification exam and don’t have time to fix all the corruption or watch out for data loss, so I turned Logseq Sync off for two days.

I also use Git on my laptop and sync that to GitHub, so that was my safety net for when Logseq Sync was turned back on. Once I was ready to re-enable Sync, I tagged my Git repo so I could always get back to the prior state, ya know, just in case.

The only edits to my graph were on my laptop, where Sync was turned off; I stopped using my other devices while Sync was off. So, if it were working properly, Sync shouldn’t have made any changes to my local graph.

(narrator: it totally did make changes to my local graph).

I turned Sync on, then went back and ran a git diff to see what Sync had done to my graph. Logseq Sync corrupted ~200 files in my local graph.
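For anyone who wants to replicate that check, it’s just a couple of git commands. A rough sketch (the path and tag name are placeholders, adapt them to your own graph):

```sh
# Snapshot and tag the graph before re-enabling Sync (names are placeholders)
cd ~/logseq-graph
git add -A && git commit -m "snapshot before re-enabling Logseq Sync"
git tag pre-sync-snapshot

# ...turn Logseq Sync back on and let it finish...

# Show every change Sync made to the local files since the tag
git diff pre-sync-snapshot
```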

Mostly, on H1 blocks and property blocks, it added back old values of those blocks. All over my graph, including pages I didn’t update in the last 3 days. So now I have hundreds of pages with corrupted property blocks and resurrected old blocks that had long since been edited.

At what point is the Logseq team going to take data stewardship seriously? This whole application and sync service feels like cowboy coding. For the life of me, I can’t imagine how any worthwhile automated test suite in CI/CD wouldn’t catch really straightforward defects like this.

I have zero confidence in this application right now. I’m going to stay on 0.9.19 for the foreseeable future. And I’m going to stop donating to Logseq. They were funded with a multi-million dollar seed round. They can spend some of that money and hire some people who can actually implement a robust quality strategy.

Sorry, I’m super salty. I’m seriously invested in Logseq and I’m now realizing what a mistake that was.

In your place I’d be mad too.

It sounds like something Smart Merge could cause, and IIRC they enabled it by default in 0.9.20. You have it enabled in Settings > Features > Sync, right? Did you have any problems with it disabled?

Funny… I thought the same thing. When I turned Sync on, I made sure to leave Smart Merge disabled. I can repeat the test with Smart Merge enabled and see if I get different results. I’m not sure how much effort I want to put into troubleshooting a defect that the Logseq team is likely never going to read about, let alone fix.

If I thought the Logseq team engaged with these posts, I’d definitely do that to give them more information to help troubleshoot. I’m just super skeptical that the team is doing anything other than building more shiny, poorly implemented features to woo investors into the next funding round.

1 Like

If you check the activity on GitHub, you will see most of the team has been working on the feat/db branch, i.e. the “database version”, for the last year or so.

I think they realized the current storage implementation is impossible to fix, so they decided to introduce a DB (it seems to be SQLite compiled to WebAssembly) and use it as the reliable storage. Then, to my understanding, they will develop a background service that syncs the DB with the Markdown/Org files, so changes propagate in both directions.

While I totally understand your frustration and I am very scared to lose data, I think we can say the team is aware of the situation and their focus is on fixing the root cause of all these issues.

I tried the DB version in the browser and the new properties are exactly what I wanted. Also, to my surprise, when I reopened the web page days later, my test graph was still there, probably stored locally in my browser.

I can’t wait for reliability and the new properties (I really need them now), but we have to wait and, in the meantime, be careful when using Logseq.

3 Likes

I appreciate the info, thanks @alex0.

1 Like

Bizarre, I’ve been using Sync religiously for the last 5 months for both personal and business purposes - and have not had any problems like this.

Are there any other factors in play here, such as having Logseq open/active on another device at the same time, or opening more than one instance of Logseq on the laptop? I once opened two instances of Logseq with sync enabled (one just for read-only use on another monitor) and that caused me to receive a merge popup quite frequently.

I’ve had Logseq Sync turned off for 5 days. Just turned it back on and watched it wipe out literally every single edit I made today. I watched it do it. I have git backups on GitHub, so I’ll just roll back my git repo to the commit before Sync destroyed my data.
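The rollback itself is nothing fancy. Roughly what I do, where the commit hash is obviously a placeholder:

```sh
# Find the last commit made before Sync was turned back on
git log --oneline

# Discard everything Sync wrote after that commit
# (--hard also throws away uncommitted changes, so commit or stash first)
git reset --hard <last-good-commit>
```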

I don’t have multiple windows open. The only plugin I’m using is AwesomeStyler. Everything else is vanilla out of the box. And this isn’t some weird, anomalous, non-deterministic defect. I can repeat this behavior every time.

  • Turn sync off
  • Make edits to some pages or journals
  • Turn sync back on
  • Watch edits disappear

Every single time.

It doesn’t seem to happen if I have Smart Merge turned on. If I have Smart Merge on, then all sorts of blocks and page property values get randomly duplicated all over my graph. So I can choose data loss or data corruption. Those are my only options with Logseq Sync. Which I pay money for by the way.

To put it politely, this feature is hot f__king garbage. The fact that they don’t have this as an automated test case gives me exactly zero faith in the Logseq team. Or even worse, they do have it as a test case and released this anyway. I’ve logged a crap ton of issues in their GitHub repo and have gotten exactly no replies. No triage. No follow-up. And they certainly haven’t actually addressed any of the issues I’ve logged.

I hate that I’m so invested in this software. It’s so very poorly implemented.

1 Like

A little more info to add: Logseq Sync was turned on, and I reset my git repo to restore all of the changes that Sync had wiped out. All my data came back.

AND THEN SYNC WIPED IT OUT AGAIN.

I’m so f__king fed up with this pile of shit software. I’m just so fed up.

You’ve got it wrong: you are not paying for the Sync service, you are contributing to Logseq’s “community fund”, and as a reward they let you try beta or alpha features (depending on how much you donate). You are supposed to test them, they are supposed to consider your feedback, and only later will Sync become a paid service.

The whole of Logseq is in beta, not only Sync. We are not supposed to use it in production until version 1.0 is released. It’s just that Logseq is already good enough that we all want to use it in production. But the truth is that we are early adopters helping with the development of a future product.

Is Logseq taking too much time to release a 1.0? Maybe, maybe not: it has been ~3 years since the first release and ~2 years since a team was put together. 2-3 years is about how long most commercial proprietary software needs before launch anyway.

So the way I see it, the FOSS model is giving us the privilege of trying some software while it is being developed and of contributing to it.

There is still a chance the Logseq team will fail and Logseq will be abandoned, but that doesn’t seem to be the case at the moment. They have a plan, it’s a long-term one, and only time will tell whether they were right to invest so much time, particularly this last year, on the DB version.

5 Likes

When you require that someone give you money in order to access a service or product, the term we use for that in English is “paid for”. Give it whatever apologist semantics you want. I’m giving them money in exchange for a service which has been touted as “beta”. In my 30-year career in software development, I’ve never been allowed to call software “beta” if it destroys or corrupts users’ data. That’s table stakes.

And in case anyone’s keeping track at home, the data I had to restore twice yesterday is … gone again. I had to restore again today. So not only does it destroy data, it destroys data that I’ve recreated. 3 times.

4 Likes

Early access to new features is a common thing; Obsidian, for example, requires a one-time payment of $25.

It seems you have hit a bug that triggers this behavior, but as you may know, the team can’t do anything until they are able to reproduce it. And as I said, their plan seems to be releasing the DB version as soon as possible rather than trying to fix all the bugs people are finding now.

Yet another thread of people losing their work (plus time, etc)…

And I just realized that I have resigned myself to only using it for stuff I can afford to lose, which ends up in duplication across other programs, making me doubt myself (did I write that down? Where did I put it? Oh gosh, maybe Logseq lost it?), stopping me from using it across devices, etc.

In other words, sync is worse than useless like this. So ironic to pay for an organizer that ends up making things messier. The Logseq website pitches it with “constantly afraid of losing your thoughts?” Well yes, thanks to Logseq!

And devs stay mum.

Enough complaining, I’m cancelling my subscription right now.

Logseq is free, there is no subscription, you can only donate. See my previous reply:

Thank you so much for the correction. You’re right, Open Collective calls it a “periodic contribution”, not a subscription.
Not sure why that matters, but I cancelled it anyway.

Let’s see if Logseq / its back end is coded well enough to even notice before the periodic re-login request :laughing:

Trying beta features is a reward to thank you for the donation; I wouldn’t be surprised if they never coded the suspension for when you stop donating.

Synchronization of information is one of the difficult problems in computing, so I’m not surprised by these issues.

I’m personally moving from Joplin not only because of the superior knowledge management Logseq offers, but also because their synchronization methods are failing in my case (I have more than 5000 references; it gets confusing).

I’m perfectly happy using git: commit-push for external storage, and pull-commit-push for synchronization with a second computer. I advise everyone, even those using another method, to use git as a means of fast data recovery.
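In case it helps anyone, the routine is trivial. A rough sketch (remote and branch names are whatever your repo uses; mine are just examples):

```sh
# On the main machine, after an editing session: commit and push to the remote
git add -A && git commit -m "notes update" && git push origin main

# On the second computer, before editing: pull the latest state first
git pull --rebase origin main
# ...edit in Logseq, then commit and push as above...
git add -A && git commit -m "edits from the second machine" && git push origin main
```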

1 Like

This is a fair comment, but their website doesn’t mention this on the home page. Indeed, it labels two specific features (sync and whiteboards) as being in Beta, which to me would naturally imply everything else is out of beta.

My personal issue is that Logseq used to be pretty reliable, I used it daily, and now it’s not reliable at all. New releases seem to introduce new critical bugs, to the point that I have had to switch to another solution.

The website is indeed misleading; the releases page on GitHub marks all the releases as “beta”:

CC @Ramses

1 Like

It may be worth noting that it is possible to agree with both sides of this conversation: corruption of data is a major deal, and the website should be clearer about the current state of the software, so users are aware of the risk level and maybe which specific operations to avoid to improve their chances.

At the same time, Logseq is still in its early days, and I’d guess that many people who donate to the project do so because they realize it takes a lot of work to get something like this off the ground, and we’d all benefit if such key daily-driver software continues to be developed in the open and reaches maturity. We’re not paying to use the software. We’re rather contributing so that the software can exist in its intended form.

4 Likes

I stopped using the internal sync and started using Syncthing for peer-to-peer synchronization. The next step is having my own Raspberry Pi as a data center for distribution and backup. That is currently my way of dodging the unreliability of the internal sync feature. Anyway, I actually intend not to rely on a centralized sync service.

2 Likes