Previously on "How to download the entire contents of an ASP website?"


  • TheMonkey
    replied
    Originally posted by swamp
    Perl.
    WHY is there always ONE! wget is far better than hacking some piece of sh1t together with LWP.



  • swamp
    replied
    Perl.



  • TheMonkey
    replied
    Originally posted by MarillionFan
    Thanks Monkey. Much appreciated. Way behind today - the baby has been screaming all day and it is doing my head in, and I haven't managed to finish the first piece of work, let alone browse the net and see which software does the biz.
    Let's help some more:

    http://users.ugent.be/~bpuype/cgi-bi...=wget/wget.exe <-- download
    http://users.ugent.be/~bpuype/wget/#usage <-- usage

    Works wonders on pr0n.

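    As a reference for the usage link above, the mirroring case boils down to a single command. This is a minimal sketch, assuming a publicly reachable site; the URL is a placeholder and the flags are standard GNU wget options:

        # Recursively fetch the site, keep timestamps, rewrite links so the
        # copy can be browsed offline, and pull in the images/CSS each page needs.
        wget --mirror --convert-links --page-requisites --no-parent http://www.example.com/

    Here --mirror is shorthand for recursive retrieval at infinite depth plus timestamping, and --no-parent keeps the crawl from wandering above the starting directory.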


  • MarillionFan
    replied
    Thanks Monkey. Much appreciated. Way behind today - the baby has been screaming all day and it is doing my head in, and I haven't managed to finish the first piece of work, let alone browse the net and see which software does the biz.



  • TheMonkey
    replied
    GNU wget or curl (google for them)

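    To illustrate the difference between the two suggestions (placeholder URL): curl fetches one URL per invocation, while wget can follow links recursively, which is what grabbing a whole site needs.

        # curl: grab a single page - no recursion.
        curl -o index.html http://www.example.com/catalogue/

        # wget: follow links two levels deep without climbing above the start path.
        wget -r -l 2 --no-parent http://www.example.com/catalogue/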


  • AtW
    replied
    Hit some nerve again, just like with the cold beans?



  • MarillionFan
    replied
    "MF sounds like your day job is consulting - learn how to make your own decisions ffs!"

    So I take it the fkcuing Russian know-it-all doesn't know. So f-ck off ***** and don't respond to posts you know f-ck all about - yet again.

    In fact don't respond to any of my posts again. Pr1ck.
    Last edited by MarillionFan; 8 October 2006, 15:40.



  • AtW
    replied
    MF sounds like your day job is consulting - learn how to make your own decisions ffs!



  • MarillionFan
    replied
    Before I root around for software, can anyone recommend one?



  • hattra
    replied
    MF

    have a look on download.com - they have a number of utils on there that might do what you want. I think there was one called freedownloader or something like that (and there are probably others) that you could rip off, er sorry, download complete sites with - whether it does everything you want, I don't know, but possibly worth a look.

    HTH



  • MarillionFan
    replied
    It's an online catalogue of products - about 5,000 of them. I took a snapshot last year by doing it manually, but it took forever. It's passworded and I am allowed to take the photos, but it takes forever.

    I can write something in VB (yes, VB), but I was hoping there was some software I could point at a site and it would crawl every link etc.

    Come on, it cannot be that difficult!!!!

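    Since the catalogue is password-protected, the crawl has to authenticate. A hedged sketch, assuming the site uses plain HTTP authentication - the URL and credentials are placeholders, and older wget builds spell the password flag --http-passwd. If the site uses a form-based login instead, wget's cookie options (--post-data, --save-cookies, --load-cookies) would be needed rather than these flags.

        # Mirror the ~5000-page catalogue, sending HTTP credentials and
        # pausing a second between requests so the crawl stays polite.
        wget --mirror --convert-links --page-requisites \
             --http-user=USERNAME --http-password=PASSWORD \
             --wait=1 http://www.example.com/catalogue/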


  • bogeyman
    replied
    Preferably ASP?

    It's ok to do this for static sites, but it makes no sense for dynamic ones. You can never be sure you have captured the 'entire' site.

    If it's a dynamic site then you're only going to get a snapshot of whatever URIs you request.

    Also, dynamic sites can have an infinite (unknowable) number of pages (e.g. they could be DB-driven) so you don't know how much you're going to get for the 'entire site'.

    Never mind. I'm sure AtW's SKA will do the job perfectly.

    Why do you need to do this anyway?

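    The point about dynamic sites applies to any of the wget commands in this thread: a DB-driven site has no fixed set of pages, so a crawl only captures whatever is linked from the pages it actually requests, and it is worth putting explicit limits on it. A sketch with placeholder values:

        # Cap the crawl: follow links at most 5 levels deep and stop after
        # roughly 200 MB, so a DB-driven site can't drag things on forever.
        wget -r -l 5 -Q 200m --no-parent http://www.example.com/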


  • AtW
    replied
    SKA v2.0 will do it.

    hth



  • How to download the entire contents of an ASP website?

    OK, flame me if you like. I don't know.

    Can someone tell me a way, or a piece of software, that I can point at someone's website (preferably ASP) to download everything?

    Cheers

    MF
