You don’t just want an article or an individual image; you want the whole website. What’s the easiest way to siphon it all?
Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
SuperUser reader Joe has a simple request:
How can I download all pages from a website?
Any platform is fine.
Every page, no exception. Joe’s on a mission.
SuperUser contributor Axxmasterr offers an application recommendation:
HTTRACK works like a champ for copying the contents of an entire site. This tool can even grab the pieces needed to make a website with active code content work offline. I am amazed at the stuff it can replicate offline.
This program will do all you require of it.
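For readers who prefer the terminal, HTTrack also ships a command-line tool. A typical mirroring invocation looks something like the following; the URL and output directory are placeholder examples, not values from the answer above:

```shell
# Mirror the site into the ./site-mirror directory.
# "http://site.com/" and "./site-mirror" are placeholders.
httrack "http://site.com/" -O "./site-mirror"
```

HTTrack rewrites links as it downloads, so the copy in `./site-mirror` should browse normally when opened locally.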
We can heartily recommend HTTRACK. It’s a mature application that gets the job done. What about archivists on non-Windows platforms? Another contributor, Jonik, suggests another mature and powerful tool:
You’d do something like:
wget -r --no-parent http://site.com/songs/
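Here `-r` tells wget to recurse into linked pages, and `--no-parent` keeps it from climbing above the `/songs/` directory. For a more faithful offline copy you might add a few more of wget’s standard mirroring flags; this is a sketch of one common combination, with the example URL carried over from the answer above:

```shell
# --mirror          : recursive download with timestamping, infinite depth
# --no-parent       : never ascend above the starting directory
# --convert-links   : rewrite links so the local copy browses offline
# --page-requisites : also fetch images, CSS, and scripts each page needs
# --adjust-extension: save HTML/CSS with matching file extensions
# --wait=1          : pause one second between requests, to be polite
wget --mirror --no-parent --convert-links --page-requisites \
     --adjust-extension --wait=1 http://site.com/songs/
```

The `--wait` delay is optional, but it keeps your mirror job from hammering the server.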
Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.