Using wget to download a page and browse it locally


Run:

[carpincho@bender]$ wget -r --convert-links url

What the arguments do:

  • -r (same as --recursive): turns on recursive retrieving. The default maximum depth is 5.

  • -k (same as --convert-links): after the download is complete, converts the links in the documents to make them suitable for local viewing. This affects not only the visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets, hyperlinks to non-HTML content, etc.

  • url (obvious :P) — see the example below.
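
As a concrete illustration (example.com/docs/ is just a placeholder address, not from the original post), this mirrors a documentation section and rewrites its links so the pages open correctly from disk:

[carpincho@bender]$ wget -r -k https://example.com/docs/

When it finishes, open the downloaded index.html under the example.com/docs/ directory in a browser; internal links will point at the local copies instead of the live site.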
Note: You can run it on any Unix-based OS, or on macOS after first installing wget with MacPorts.
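
For example, with MacPorts already set up, installing wget is a single command (a sketch assuming the standard wget port):

[carpincho@bender]$ sudo port install wget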
