
The 24-Hour (Microsoft) Algorithm That Went Rogue

By Andrew White | March 25, 2016

Chatbot | Algorithm Economy | Microsoft

Press reports today say that just 24 hours after release, Microsoft shut down Tay, its new artificially intelligent chatbot.  In its first day, Tay was seen tweeting anti-Semitic rants.  See "Microsoft Muzzles Artificially Intelligent Chatbot."

This case is so interesting on several levels:

  • Some users specifically targeted the service in order to seed its intelligence so that it would respond in a certain way.  Was this just an experiment or something more nefarious?  Was this fun and games or anti-Microsoft, or maybe even anti-AI?
  • The ease and speed with which this situation occurred makes you wonder how resilient such algorithms (clearly Tay is not just one algorithm) are to ordinary use, unpredictable use, and outright abuse.
  • Microsoft clearly had not anticipated or thought through all the different ways in which its chatbot could be used (or abused).  I guess they are just too nice.  Should Microsoft hire some bums and otherwise nasty people to sit as judges on their information governance board to ask the difficult questions?

I guess the real question here is this: Is the design of Tay at fault, or is this more a criticism of the design of us and our current society?  I think that might be outside the scope of this blog.

Now Microsoft presumably has to add additional algorithms to monitor the data being seeded to Tay, filtering out content that might be misconstrued and used to create pathways that lead to offensive responses.  At some level this is easy: ignore certain words absolutely, prevent them from ever entering memory, and never use those words in replies.  But at another level this can become quite complex quickly.  What is the context of some words, and might some uses be acceptable?  Under what conditions is that use acceptable?
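To make the "easy" level concrete, here is a minimal sketch of a crude word-level blocklist filter.  To be clear, the function names and placeholder tokens are hypothetical illustrations, not anything from Tay's actual implementation:

```python
# A deliberately naive blocklist filter: reject any message containing a
# banned word, and never let such a message into the bot's memory.
# BLOCKLIST contents and all names here are hypothetical placeholders.

BLOCKLIST = {"badword1", "badword2"}  # stand-ins for the banned vocabulary

def is_acceptable(message: str) -> bool:
    """Return False if any token in the message is blocklisted."""
    tokens = (t.strip(".,!?\"'") for t in message.lower().split())
    return not any(token in BLOCKLIST for token in tokens)

def learn(message: str, memory: list) -> None:
    """Only let acceptable messages enter the training memory."""
    if is_acceptable(message):
        memory.append(message)
```

Even this toy version hints at why things get complex quickly: word-level matching misses deliberate misspellings, and it cannot tell a quoted or reported use of a word from an endorsement of it.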

Perhaps Tay needs an 'ethical acceptability' switch that users can dial up or dial down.  Maybe the switch needs to come in different cultural versions?  Algorithms to monitor algorithms?
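Here is one hedged sketch of what such a dial might look like, again with entirely invented names, lists, and thresholds; it is not a description of any real Microsoft design:

```python
# Hypothetical "ethical acceptability" dial: a per-locale blocklist plus a
# user-adjustable sensitivity threshold. Everything here is illustrative.

from dataclasses import dataclass

CULTURAL_BLOCKLISTS = {
    "en-US": {"badword1", "badword2"},  # placeholder tokens per culture
    "de-DE": {"schimpfwort1"},
}

@dataclass
class AcceptabilityFilter:
    locale: str = "en-US"
    sensitivity: float = 0.5  # 0.0 = fully permissive, 1.0 = strict

    def score(self, message: str) -> float:
        """Fraction of tokens that hit this locale's blocklist."""
        blocklist = CULTURAL_BLOCKLISTS.get(self.locale, set())
        tokens = [t.strip(".,!?") for t in message.lower().split()]
        if not tokens:
            return 0.0
        hits = sum(t in blocklist for t in tokens)
        return hits / len(tokens)

    def allows(self, message: str) -> bool:
        """Permit the message only if its score is under the dialed threshold."""
        return self.score(message) <= (1.0 - self.sensitivity)
```

Of course, a dial like this only relocates the problem: someone still has to decide what goes on each list and where the default sits, which is exactly where the algorithms-to-monitor-algorithms question bites.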

Oh, what joy.  Well, when I say "joy" I don't mean the kind of joy you mean… I mean…  Well, you get the point.
