Hello,
I like to keep a copy (and sometimes many copies over time) of the URLs I link in a note, in case the URL disappears from the internet, or so I can find the content exactly as it was on the day I took the note.
So I wrote this script: GitHub - eromawyn/joplin-webarchive: joplin-webarchive.py
At the beginning I thought it would take me only 2 or 3 days, but in the end I spent much more time on it. As you can see, it's even configurable per regex on the URL. The code is not very high quality, but someone might find this script useful, so I'm happy to publish it.
Maxime.
Why not just use the web clipper? Or is there something I don't get about this script? 
I like to store links in my notes, and I don't want to save every linked URL manually with the clipper. So this creates a copy of my links without my having to use the clipper: I just add (cache) or [cache] after a link, and some URLs (those matching a regex) are even recognized and saved automatically.
Plus, it automatically retrieves a new version of the page every day (the number of days kept is configurable globally, per regex, or per link), which is nice since I often also want to keep track of how a page evolves.
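For anyone curious how the marker idea could work, here is a minimal sketch of the matching logic described above. This is an illustration only, with assumed names and patterns, not the actual joplin-webarchive code:

```python
import re

# Hypothetical sketch of the marker-based selection described above.
# LINK_RE and AUTO_ARCHIVE_PATTERNS are assumptions for illustration.

# Markdown link, optionally followed by a "(cache)" or "[cache]" marker.
LINK_RE = re.compile(r'\[[^\]]*\]\((https?://[^)]+)\)\s*(\(cache\)|\[cache\])?')

# URL patterns archived automatically, even without an explicit marker.
AUTO_ARCHIVE_PATTERNS = [re.compile(r'https?://(www\.)?youtube\.com/')]

def urls_to_archive(note_body: str) -> list[str]:
    """Return the linked URLs in a note body that should be cached."""
    urls = []
    for match in LINK_RE.finditer(note_body):
        url, marker = match.group(1), match.group(2)
        if marker or any(p.match(url) for p in AUTO_ARCHIVE_PATTERNS):
            urls.append(url)
    return urls
```

A scheduler would then re-fetch each selected URL daily and keep the configured number of snapshots.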
I see... That's basically the opposite of what I use web clipping for. I just want to keep certain pages the way they are because they may change or disappear over time. Thanks for explaining your use case 
Another thing I like to do, which to my knowledge the clipper does not: keeping YouTube videos!
And/or a screenshot of the URL.