Download ASP Site

Originally posted by MarillionFan
Does anyone know how to download an ASP website? I have been using WinHTTrack, but it can only go so far. The website is written using ASP and I want to follow all of the links and download them locally. Anyone have any ideas?
Can you install cygwin to give you access to Linux tools on your Windows box?
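If you do go the Cygwin route, here's a minimal sketch of pulling in wget that way, assuming the current Cygwin installer (setup-x86_64.exe) and its -q (unattended) and -P (package selection) options, with the default C:\cygwin64 install path:

setup-x86_64.exe -q -P wget
C:\cygwin64\bin\wget.exe --version

The second line is just a sanity check that the package landed; adjust the path to wherever you installed Cygwin.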
One word: wget

Originally posted by MarillionFan
Does anyone know how to download an ASP website? I have been using WinHTTrack, but it can only go so far. The website is written using ASP and I want to follow all of the links and download them locally. Anyone have any ideas?
For example, if you wanted to snaffle a local copy of R F Streater's physics site:
wget -erobots=off --mirror -p -w 2 --convert-links -P C:\streater http://www.mth.kcl.ac.uk/~streater/
In this example the options are:
-e robots=off => ignore the site's robots.txt settings (naughty!)
-m / --mirror => mirror the site on the local drive (shorthand for -r -N -l inf --no-remove-listing, i.e. recursive retrieval with timestamping)
-p => get all page requisites, such as images and CSS files
-w # => pause # seconds between each download (to avoid snowing the site under with requests)
--convert-links => rewrite links in the downloaded pages so they work locally
-P dir => save everything under the given directory (C:\streater above)
You can also add -U mozilla to present wget as a Mozilla browser, if the site objects to wget's default user agent.
I think these options should do what you want, without trying to mirror the whole Internet. It can be slightly confusing though, as different versions of wget use slightly different option combinations to achieve the same end.
But needless to say, it can only fetch a "connected" set of web pages. So if a site happens to comprise more than one link-disjoint set of pages, or a partially ordered set with more than one "maximal" page (for example, downward links that don't collectively reach every page on the site), you'll need to run wget once for each of a set of start pages that between them span the site, if that makes sense (see the sketch below).
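On that note, one way to handle the multi-start-page case is wget's -i option, which reads its start URLs from a file. A minimal sketch, where urls.txt and the second URL are hypothetical placeholders for whatever pages span your site:

wget -erobots=off -m -p -w 2 --convert-links -P C:\streater -i urls.txt

with urls.txt containing something like:

http://www.mth.kcl.ac.uk/~streater/
http://www.mth.kcl.ac.uk/~streater/some-unlinked-page.html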
edit: Sorry, I see several people have already mentioned wget. But as Bunk pointed out, you can get a Windows version.
There's also cURL, but I haven't used that recently.

Last edited by OwlHoot; 4 September 2009, 22:04.
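For completeness: cURL has no recursive mirroring mode, so for a whole site it only really covers fetching one page at a time. A minimal single-page fetch (the output filename is just an example) would be:

curl -o streater.html http://www.mth.kcl.ac.uk/~streater/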