Got a phone call on behalf of the liquidator of my rented flat
7 next month - nearly 10 people employed now though
7 years to produce an application that reads a web page, stores each discovered link in a DB, then reads each stored link and recurses.
It is a screen scraper with a connection to a database. A 300-line function.
I am not surprised he is tulipting himself that he cannot afford a flat.
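For what it's worth, the crawler described above (read a page, store each discovered link in a DB, then fetch each stored link in turn) really is a small amount of code. Here is a minimal sketch assuming Python with an in-memory SQLite table; the `fetch` callable is a stand-in for the HTTP layer so the sketch needs no network access, and all names (`crawl`, `LinkParser`) are illustrative, not anyone's actual code:

```python
import sqlite3
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, db=":memory:"):
    """Store each discovered link in a DB, then read each stored
    link and repeat until no unvisited links remain.
    `fetch` is any callable url -> HTML string (hypothetical stand-in
    for a real HTTP GET)."""
    conn = sqlite3.connect(db)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS links "
        "(url TEXT PRIMARY KEY, visited INTEGER DEFAULT 0)"
    )
    conn.execute("INSERT OR IGNORE INTO links (url) VALUES (?)", (start_url,))
    while True:
        row = conn.execute(
            "SELECT url FROM links WHERE visited = 0 LIMIT 1"
        ).fetchone()
        if row is None:
            break  # nothing left to visit
        url = row[0]
        conn.execute("UPDATE links SET visited = 1 WHERE url = ?", (url,))
        parser = LinkParser()
        parser.feed(fetch(url))
        # PRIMARY KEY + INSERT OR IGNORE deduplicates URLs for us
        for link in parser.links:
            conn.execute("INSERT OR IGNORE INTO links (url) VALUES (?)", (link,))
    urls = [r[0] for r in conn.execute("SELECT url FROM links ORDER BY url")]
    conn.close()
    return urls

# Usage with three fake pages instead of the live web:
pages = {
    "a": '<a href="b">one</a><a href="c">two</a>',
    "b": '<a href="a">back</a>',
    "c": "no links here",
}
print(crawl("a", lambda u: pages.get(u, "")))  # → ['a', 'b', 'c']
```

The `visited` flag in the table is what makes the recursion terminate: a URL is fetched at most once, even when pages link back to each other.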
That wasn't the clever bit though. The clever bit was getting thousands of geeks worldwide to run his scraper for free, using the geeks' bandwidth and electricity to deliver terabytes of data for nothing. That was the genius bit.
All under the premise that the data was going to be used to create an "open" Google rival, which it never was; instead it was used commercially to help spammers get to the top of the search engines.
Hats off to AtW on that one. He makes Duncan Bannatyne look like Mother Teresa.
Can't think of a more useless piece of software. At least my Plan D may be useful to manufacturing companies.
It's a never ending search isn't it? Are those geeks worldwide still being 'used', or do AtW's servers do all the searching now?
7 years!?!
That is a morning's work.
It was funny the first time. 2 years on, not so much.
Two years ago we became only the second company to find 1 trillion unique URLs - the first was Google, so yes, we are rivals, just not in a way your pea-sized brain can appreciate
People who contributed to the project will be getting first payout money finalised this weekend...
That is a big fat porky.
You want to tell me that you have one over on Yahoo for URL crawls?
Bing?
Bollocks.
This "only second company to find 1 trillion unique URLs" is a complete lie.