
Download ASP Site


    #11
    Originally posted by MarillionFan
    Came across that, but I don't have a Linux setup.
    Can you install Cygwin to give you access to Linux tools on your Windows box?
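    For instance (a rough sketch, assuming you tick the wget package in the Cygwin installer), from a Cygwin bash prompt something like:

    # mirror a site into ~/site, rewriting links so they work offline
    # (the URL here is just a placeholder)
    wget -e robots=off --mirror -p -w 2 --convert-links -P ~/site http://www.example.com/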



      #12
      You can download wget for Windows.
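      Once it's installed and on your PATH, a quick sanity check from a command prompt (exact output varies by build):

      wget --version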



        #13
        You want Teleport Pro.

        Does exactly what you want.

        TM



          #14
          Originally posted by MarillionFan
          Does anyone know how to download an ASP website?

          I have been using WinHTTrack, but it can only go so far. The website is written in ASP, and I want to follow all of the links and download the whole site locally.

          Anyone have any ideas?
          One word: wget

          For example, if you wanted to snaffle a local copy of R F Streater's physics site:

          wget -e robots=off --mirror -p -w 2 --convert-links -P C:\streater http://www.mth.kcl.ac.uk/~streater/

          In this example the options are:
          -e robots=off => ignore robots.txt settings (naughty!)
          -m / --mirror => mirror the site onto the local drive (recursive download with infinite depth and time-stamping)
          -p => get all the bits a page needs, like images and CSS files
          -w # => pause # seconds between each download (to avoid snowing the site under with requests)
          --convert-links => rewrite links in the downloaded pages so they work when browsing locally
          -P dir => save everything under directory dir (C:\streater above)
          -r => recursive (grab sub-links on the same site recursively; implied by --mirror)
          -U mozilla => present wget as a Mozilla browser

          I think these options should do what you want, without trying to mirror the whole Internet in the process. It is slightly confusing, though, as different versions of wget use different combinations of options to achieve the same end.
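          On a reasonably recent build, for instance, the same command can be spelled out with long options, which should be less version-dependent (a sketch along the lines of the example above, not tested on every build):

          wget -e robots=off --recursive --level=inf --timestamping --page-requisites --wait=2 --convert-links --user-agent="Mozilla/5.0" --directory-prefix=C:\streater http://www.mth.kcl.ac.uk/~streater/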

          Needless to say, it can only fetch a "connected" set of web pages, i.e. those reachable by following links from the starting page. So if a site comprises more than one link-disjoint set of pages, or a partially ordered set with more than one "maximal" page (i.e. no single page's downward links recursively span every page on the site), you'll need to run wget against a set of starting pages that does span the site, as sketched below.
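          One way to do that (assuming your wget supports -i, i.e. --input-file) is to put one starting URL per disjoint chunk into a text file and point wget at the lot:

          wget -e robots=off --mirror -p -w 2 --convert-links -P C:\streater -i entrypoints.txt

          (entrypoints.txt is just a hypothetical file name here, one URL per line.)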

          edit: Sorry, I see several people have already mentioned wget. But as Bunk pointed out, you can get a Windows version.

          There's also cURL; but I haven't used that recently.
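          Bear in mind cURL only fetches the URLs you give it and won't crawl links itself, so it's more for grabbing individual pages, e.g. something like:

          curl -o streater.html http://www.mth.kcl.ac.uk/~streater/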
          Last edited by OwlHoot; 4 September 2009, 22:04.
