Previously on "Bing returning dodgy search results"


  • DoctorStrangelove
    replied
    Originally posted by NotAllThere
    I tried some of their examples but generally got nothing. I think it's because Bing defaults to German where I am, but I did get "michelle obama ein mann" ("Michelle Obama a man") as a suggestion. I didn't even know that was a theory.
    One was first introduced to this nutjob theory by some very very very distant Septic relatives a couple of years ago during their visit to the metropolis that is Neath.

    One could almost hear the sound of distant banjos.



  • NotAllThere
    replied
    I tried some of their examples but generally got nothing. I think it's because Bing defaults to German where I am, but I did get "michelle obama ein mann" ("Michelle Obama a man") as a suggestion. I didn't even know that was a theory.



  • OwlHoot
    started a topic Bing returning dodgy search results


    2018-10-10 Bing Is Suggesting the Worst Things You Can Imagine

    We’ve seen this happen over and over. Microsoft once unleashed a chatbot named Tay on Twitter. This chatbot quickly turned into a Nazi and declared “Hitler was right I hate the jews” after it learned from other social media users. Microsoft had to pull it offline. ...
    Didn't Tay use to post quite a bit on here, and then abruptly disappear? I guess that explains what became of him.
