🧾 Command:
wget -p -k -E -U mozilla https://example.com/
🧩 Option Breakdown:
| Option | Description |
|---|---|
| `-p` (`--page-requisites`) | Downloads everything needed to display the page properly: images, CSS, JavaScript. |
| `-k` (`--convert-links`) | Rewrites the links in the downloaded page so they point to the local copies, making it viewable offline. |
| `-E` (`--adjust-extension`) | Adjusts the saved file's extension to `.html` if needed (useful when the URL doesn't end in a file extension). |
| `-U mozilla` | Sets the User-Agent string to `mozilla` so the request looks like it's coming from a regular browser. |
| `https://example.com/` | The target URL to download. |
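As a quick sketch, here is the same command written with the long-form spellings from the table above (assuming GNU Wget; behavior is identical to the short-option version):

```shell
# Equivalent invocation using the long-form options
wget --page-requisites \
     --convert-links \
     --adjust-extension \
     --user-agent=mozilla \
     https://example.com/
```

The long forms are easier to read in scripts and documentation; the short forms are handier at an interactive prompt.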
🛠️ What This Command Actually Does:
This command downloads a single web page (here, https://example.com/) along with all of its visual resources, and saves everything so you can open and view the page offline, looking almost exactly as it does online.
- It grabs the HTML and downloads the images, CSS, and JavaScript files referenced by that page.
- It then rewrites all the links in the HTML so your browser loads the local files, not the original URLs.
🧪 Example Use Case:
You want to save a nicely formatted article or a simple static homepage to read offline or archive; this is the perfect command for that.
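For instance, archiving a single article might look like this (the URL and paths here are hypothetical, and Wget's default is to save files under a directory named after the host):

```shell
# Save the article plus its images/CSS/JS, with links rewritten for offline viewing
wget -p -k -E -U mozilla https://blog.example.org/posts/my-article.html

# Open the local copy in a browser (Linux; use `open` on macOS or `start` on Windows)
xdg-open blog.example.org/posts/my-article.html
```

Because `-k` rewrote the links, the page loads its stylesheets and images from the saved local files rather than reaching back to the original site.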