downloading a neocities site with wget

neocities includes a handy button to download a copy of your blog. however i've had it disappear on me before, and searching shows it's happened to other people too, so i wanted an alternate method just in case.

the command line program wget downloads files from the internet. it's especially useful for mass downloads, can grab full copies of static websites, and works nicely on neocities. it's available for windows, mac and linux, though installation is different for each one. macos can get it from the homebrew package manager, linux can also get it from homebrew but likely has it pre-installed already, windows... idk :p
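for example, assuming you're on macos with homebrew or a debian/ubuntu flavour of linux with apt, installing it looks something like this:

    brew install wget       # macos, via homebrew
    sudo apt install wget   # debian/ubuntu linux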

open your terminal/command prompt and change directory into wherever you'd like to save your site, such as cd ~/Downloads (that command works in both linux/macos bash and windows powershell). test wget on a single image if you've never tried it before. once you're comfortable, enter the following (case-sensitive!) command with your site's url: wget -mpE https://yoursite.neocities.org/.
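to give a rough idea, a full session might look something like this (yoursite and the test image are just made-up placeholders, swap in your own):

    cd ~/Downloads                                # where the copy will be saved
    wget https://yoursite.neocities.org/cat.png   # test run on a single file
    wget -mpE https://yoursite.neocities.org/     # grab the whole site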

explanation for the arguments used:

-m, aka --mirror, turns on recursion and timestamping so wget crawls every linked page on the site instead of just the one url.
-p, aka --page-requisites, also grabs the images, stylesheets and other files each page needs to display properly.
-E, aka --adjust-extension, saves pages with a .html extension so they open in a browser as expected.

the hyperlinks will still point to your site hosted on neocities, which is ideal for a backup that you can simply reupload if something goes wrong with the original. however, if you'd like a fully offline browsable version, change the argument to -mpEk. -k, aka --convert-links, rewrites links so they point to the relative paths of the downloaded files instead of the absolute urls on your website (meaning you can also move the folder around after downloading it). since most neocities sites are under 1 gb, there's little reason not to grab both versions. (you also may not need this option if you wrote your site with relative links to begin with)
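so if you want the offline-browsable copy too, it's just the same command with -k tacked on, something like:

    wget -mpEk https://yoursite.neocities.org/    # mirrored copy with links rewritten for offline browsing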

ymmv regarding download time: on my laptop near the router, my image-heavy transformers fansite took 2 minutes to download; on my desktop in my room it took 40 minutes.