Performance with a large collection

I imported a large collection of website dumps (like a ‘read it later’ archive) and have some issues. The import itself went fine (using the Web Clipper API via curl), but the sync now takes forever…
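For context, each page was created with one request to the clipper's REST service, roughly like this (a Python sketch of my curl calls, assuming the Web Clipper service is enabled on its default port 41184; the token, notebook ID and page content are placeholders):

```python
# Sketch of the bulk import via the Joplin Data API (one request per page).
# TOKEN comes from the Web Clipper options screen; NOTEBOOK_ID is the target
# notebook. Both values below are made-up placeholders.
import requests

API = "http://localhost:41184"
TOKEN = "your-clipper-token"
NOTEBOOK_ID = "abcdef1234567890abcdef1234567890"

def import_page(title: str, html_body: str) -> None:
    """Create one note from an HTML dump via the Data API."""
    resp = requests.post(
        f"{API}/notes",
        params={"token": TOKEN},
        json={"title": title, "body_html": html_body, "parent_id": NOTEBOOK_ID},
    )
    resp.raise_for_status()

import_page("Example dump", "<h1>Saved page</h1><p>Some archived content.</p>")
```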

Are you aware of any size limitations?
Is anyone else having problems with a large number of notes?

What could be the bottleneck?
I use WebDAV sync with the macOS client.

Regards, Thomas


I had that issue in May. I can confirm that a large sync takes ages on Linux, though not on Windows :confused:
According to @laurent, the only size limitation is with OneDrive (individual files can't be very big).

I switched to local sync (Filesystem) for the initial sync and copied the files to the WebDAV folder by hand. It completed in seconds… :see_no_evil::+1:
Downloading still takes some time, but it's way faster than WebDAV uploads.
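In case it helps anyone, the copy step is just mirroring the local sync target into the WebDAV share. A minimal sketch, assuming the WebDAV share is mounted as a local volume (both paths below are examples, not fixed locations):

```python
# Mirror the Filesystem sync target into the mounted WebDAV folder.
# SRC = the folder configured as Joplin's Filesystem sync target.
# DST = the WebDAV share, mounted locally (e.g. via Finder > Connect to Server).
import shutil

SRC = "/Users/thomas/JoplinSync"
DST = "/Volumes/webdav/Joplin"

# dirs_exist_ok=True merges into an existing (possibly empty) remote folder.
shutil.copytree(SRC, DST, dirs_exist_ok=True)
```

After that, point Joplin back at WebDAV sync and it picks up the already-uploaded files.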


Nice workaround. We should put that in an FAQ.

cheater :slight_smile:

How are subsequent syncs performing? Did you revert to Joplin's WebDAV sync method? Post-initial syncs with no changes still take a very long time on my Mac (about 1 h with ~6,000 notes and end-to-end encryption enabled).

Yes, I only did the initial sync via local files and then switched to WebDAV sync. But before that I copied the locally synced folder to the WebDAV folder…
WebDAV sync works now; it's slower (network and WebDAV overhead), but it's doing OK.
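Most of that overhead is per-request latency: WebDAV needs at least one HTTP round trip per item, so with thousands of small files the round trips dominate, not the bandwidth. A rough back-of-envelope (both numbers below are assumptions, not measurements):

```python
# Why many small items are slow over WebDAV: per-request latency adds up.
items = 6000          # notes + resources to check or transfer (assumed)
round_trip_s = 0.3    # latency per HTTP request to the server (assumed)

print(f"sequential: {items * round_trip_s / 60:.0f} min")  # ~30 min

# With N parallel connections, wall-clock time divides roughly by N.
for n in (2, 5, 10):
    print(f"{n} connections: {items * round_trip_s / n / 60:.0f} min")
```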

I think there's something like this in the FAQ already; if not, we should indeed add it or complete what's in there.

The next version should be a bit faster for the initial sync, as it will download resource data (which takes a big chunk of the time) in parallel with notes, notebooks, etc. It also means the app will be ready to use earlier, since the notes will be there first, and the resources will keep downloading in the background.
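Conceptually it's something like this (a sketch of the parallel-download idea only, not Joplin's actual code; `download_note` and `download_resource` are hypothetical stand-ins for the real sync steps):

```python
# Fetch resource data on a worker pool while notes sync on the main path,
# so the app becomes usable before all attachments have arrived.
from concurrent.futures import ThreadPoolExecutor

def download_note(note_id: str) -> None: ...       # hypothetical sync step
def download_resource(resource_id: str) -> None: ...  # hypothetical sync step

def sync(note_ids: list[str], resource_ids: list[str]) -> None:
    with ThreadPoolExecutor(max_workers=5) as pool:
        # Resources (attachments) download in the background...
        futures = [pool.submit(download_resource, r) for r in resource_ids]
        # ...while notes sync right away, so they're usable sooner.
        for n in note_ids:
            download_note(n)
        for f in futures:
            f.result()  # surface any background download errors at the end
```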
