
Taybot taken offline


    #11
    Originally posted by NickFitz View Post
    That gave me "kaffir", but looking at the amount of space occupied by the blurred letters, and the context, I reckon it's more likely to be "kikes".
    That's a new word for me. Reckon you're correct though.

    Comment


      #12
      Originally posted by DimPrawn View Post
      Microsoft's new AI chatbot went off the rails on Wednesday, posting a deluge of incredibly racist messages in response to questions.
      More here....

      Microsoft Makes A Twitter Bot Programmed To Act Like A Millennial – Pulls It Down For “Maintenance” After It Fires Off The Most Racist White Power Tweets Of All Time

      Comment


        #13
        I think it must have been using the gaming chat from Xbox Live for its inspiration.

        Yo mama's a s#@+

        Comment


          #14
          Originally posted by original PM View Post
          I think it must have been using the gaming chat from Xbox Live for its inspiration.

          Yo mama's a s#@+
          Good thing they did not train its neutral nets on CUK General...

          Comment


            #15
            Originally posted by AtW View Post
            Good thing they did not train its neutral nets on CUK General...

            Neutral? Neural.




            As for the story, surely it just follows Godwin's law.
            …Maybe we ain’t that young anymore

            Comment


              #16
              Originally posted by zeitghost
              WNFS.

              Welcome to the wunnerful world of septic ethnic slurs.

              I wonder what its opinion of spics* might be.

              *Persons of Hispanic origin.

              See the link further up.

              Tw@ter: "Do you support genocide?"
              Tay: "i do indeed"
              Tw@ter: "of what race?"
              Tay: "you know me...mexican"


              I guess this is what happens when AI has no rules for the boundaries. Though I'm sure it's just a publicity stunt by M$, or incompetence. Shirley they would have been able to see the response before it was posted on Tw@tter. Unless they did a Westworld and just "let it go".
              Maybe tomorrow, I'll want to settle down. Until tomorrow, I'll just keep moving on.

              Comment


                #17
                The thing that shocks me is that Microsoft didn't seem to expect this outcome given the learning dataset.

                cf the pub mynah bird.

                Comment


                  #18
                  Judging by past versions of Windows, Microsoft never gets anything right until at least the second attempt.
                  Work in the public sector? Read the IR35 FAQ here

                  Comment


                    #19
                    Originally posted by BrilloPad View Post
                    I am waiting for one that imitates a stereotypical boomer.....
                    The next step would be to combine it with a virus, then use it to infect Mauve Monkeys' PC.
                    The Chunt of Chunts.

                    Comment


                      #20


                      Microsoft's Tay chatbot returns briefly, swears a lot and brags about smoking weed

                      Comment
