I really should get back to testpleasedelete.com... and I think of this after spending the whole evening thinking about writing a bunch of code and related articles about implementing real-time 3D graphics in modern browsers via the canvas element... and assorted wondrous approaches to optimisation... and how tricks that were dirty even in FORTH can get even dirtier in object-oriented LISP (which is what JavaScript is)...
I'll get the new spider going over the weekend - a Bank Holiday weekend, when traffic on the site is low, is a good time to crawl, as it avoids overloading the server... not that the spider would overload it, given that it adds maybe 0.00001% load to the server (if not less).
Basically, it's equivalent to a user loading a page every 30 seconds, but not downloading any images or script files, so it's about the same as a user loading a page every minute or two (it's more complex than that in terms of impact on the server, but it's the humans that hit it harder, not the spider. I can produce a graph, if necessary).
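For what it's worth, the pacing described above can be sketched in a few lines. This is just an illustration under assumptions, not the actual spider: the user-agent string, the helper names, and the fixed 30-second delay are mine, and a real crawler would also honour robots.txt.

```javascript
// Hypothetical sketch of the polite-crawl pacing described above:
// one page request every 30 seconds, HTML only (no images or scripts).
const CRAWL_DELAY_MS = 30_000;

// Requests per hour at a given delay -- at 30 s that's 120/hour,
// roughly what one unhurried human browsing the site produces.
function requestsPerHour(delayMs) {
  return Math.floor(3_600_000 / delayMs);
}

async function politeCrawl(urls) {
  for (const url of urls) {
    const res = await fetch(url, {
      // Assumed identifier so the server's logs can tell spider from human
      headers: { "User-Agent": "testpleasedelete-spider/0.1" },
    });
    // Take the HTML only; never follow <img> or <script> references,
    // which is where most of the page weight lives.
    const html = await res.text();
    console.log(url, html.length, "bytes");
    await new Promise((resolve) => setTimeout(resolve, CRAWL_DELAY_MS));
  }
}
```

At that rate the spider's share of traffic is negligible next to the human visitors, which is the point being made above.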
And if they restructured the templates to use HTML and CSS optimally, they could cut their page weight down massively and save a fortune (well, save a bit) on yearly bandwidth costs... as well as making it easier for my parser to digest the pages.
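To make the restructuring concrete, here's an invented before/after fragment of the kind of template trimming meant here: presentational markup repeated on every element versus one CSS rule. The class name and colours are assumptions for illustration, not taken from the actual site.

```
<!-- Before: presentational attributes repeated on every cell -->
<td bgcolor="#eeeeee" align="center"><font face="Arial" size="2">Rate</font></td>

<!-- After: one rule in the stylesheet, markup carries only structure -->
<style>td.rate { background: #eee; text-align: center; font: 0.8em Arial; }</style>
<td class="rate">Rate</td>
```

Multiply that saving across every cell of every table on every page view and the bandwidth difference adds up, and the markup a parser has to wade through shrinks with it.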
I will do this for them, if they ask... as the barman said to the neutron, "No charge."