Originally posted by bobspud
In the '90s I worked at the Mail group for a while, putting in some interesting stuff to do with the picture desk systems. At the time the story was that when they were in Fleet Street there were over 3,000 men involved just in printing the Daily Mail. When they moved to Surrey Quays and sorted out their print presses it was down to 1,500 across the whole group of companies.
I can tell you there were a lot of very surprised-looking middle-aged guys left wondering where their union job for life had just vanished to.
In London quite a few of them seem to have ended up doing the Knowledge, from what I can tell. Shame Uber is about to screw them over all over again...
The missing part of the puzzle was AI, and the fact that machines needed human intervention at certain points. As an example of where this is becoming less of an issue, there is software that can listen to your phone call and estimate how truthfully you are answering based on the way you answer questions.
It is possible to use standard webcams to monitor tiny changes in your face and pick up changes in your wellness or stress; that too can tell if we are lying.
Soon it will be possible to remove call desks entirely, using functionality like Siri that can listen and respond to commands remarkably well. If you couple a Siri-style machine with business rules and lie detection, it will become far better than a human, as it will be able to measure responses and aggregate trend data to effectively game the humans using it...
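To make that concrete, here is a toy sketch in Python of the "business rules plus lie detection" part: a made-up rule function that combines the transcript from a speech recogniser with a voice stress score. All the names, phrases and thresholds are invented for illustration; this is not taken from any real call-centre product.

```python
from dataclasses import dataclass

@dataclass
class CallerResponse:
    transcript: str        # what the speech recogniser heard
    stress_score: float    # 0.0 (calm) to 1.0 (highly stressed), from voice analysis

def route_claim(response: CallerResponse) -> str:
    """Apply simple business rules to decide how a claim call is handled.

    Keywords and thresholds are invented for illustration only.
    """
    suspicious_phrases = ("I don't remember", "roughly", "someone else was driving")
    flagged = any(p.lower() in response.transcript.lower() for p in suspicious_phrases)

    if response.stress_score > 0.8 or (flagged and response.stress_score > 0.5):
        return "refer to human investigator"
    if flagged:
        return "request written statement"
    return "auto-approve"

# Example: a calm, straightforward answer gets approved automatically.
print(route_claim(CallerResponse("The other car hit me at the junction", 0.2)))
```

In a real system the rules engine and the stress model would be far more involved, but the shape of the decision is the same: score the answer, apply the rules, route the call.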
It will be possible to use web forms together with webcams to spot lying claimants from many miles away. Why would Admiral need a private eye to come and interview you after an accident when its claim form can see that you just turned bright baboon-arse red when it asked you a direct question? The changes in colour are imperceptible to you, but a filter on the camera that magnifies the variation in your skin tone many thousands of times will pick them up no problem.
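The trick being described is essentially what the Eulerian video magnification work does: treat each pixel's colour as a signal over time, band-pass it, amplify it and add it back. A minimal sketch along those lines, assuming OpenCV and NumPy are available and with the gain and filter constants being nothing more than illustrative guesses:

```python
import cv2
import numpy as np

# Toy Eulerian-style colour magnification: amplify slow changes in the red
# channel (a flush or heartbeat shows up as a small, slow colour variation).
# All constants here are illustrative guesses, not tuned values.
GAIN = 50.0          # amplification factor for the filtered signal
ALPHA_FAST = 0.5     # fast exponential moving average (short time constant)
ALPHA_SLOW = 0.05    # slow exponential moving average (long time constant)

cap = cv2.VideoCapture(0)          # any standard webcam
fast_avg = slow_avg = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    red = frame[:, :, 2].astype(np.float32)   # OpenCV stores frames as BGR

    if fast_avg is None:
        fast_avg = red.copy()
        slow_avg = red.copy()

    # Two EMAs; their difference is a crude temporal band-pass filter.
    fast_avg += ALPHA_FAST * (red - fast_avg)
    slow_avg += ALPHA_SLOW * (red - slow_avg)
    band = fast_avg - slow_avg

    # Add the amplified band-passed signal back onto the original frame.
    boosted = frame.astype(np.float32)
    boosted[:, :, 2] = np.clip(red + GAIN * band, 0, 255)

    cv2.imshow("colour-magnified", boosted.astype(np.uint8))
    if (cv2.waitKey(1) & 0xFF) == 27:          # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```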
How many use cases are there for spotting people who are lying on the forms they fill in, from tax to benefits? Imagine a situation where you no longer sign a tax return but instead make a web filing that includes a webcam statement that your forms are true and accurate. How many call centres will HMRC need to spot the bastards then?
We already know that low-latency share trading makes more money than the idiots sitting on the trading floors, and most back-office work has been cut from hundreds of clerks to a few dozen at most.
Many of the jobs we have are, truthfully, non-jobs; the gap between employed and screwed has never been so perilously narrow.
I'm pretty sure it would be possible to use a mix of induction heating and the skin-tone camera to cook a steak far better than most of the idiots in a pub kitchen.