1x06 What We Learned from Tay
When people use chatbots, they're looking for a conversational interface that comes as close to human language fluidity as possible. This includes better intent recognition, turn-based Q&A, and digital personalities. There have been many recent experiments in crowdsourcing personality while training artificial language intelligence. These experiments sit at the intersection of research and online interaction, since training data is needed but sparse. It's hard to develop an appropriate conversational flow and personality without the large amounts of data provided by social media interaction.
How do normal people interact and speak to each other? How do they learn from one another, and pick up each other's mannerisms, lexicon, and speech patterns? Psychology has shown us that people often mimic each other in prolonged interactions. This is how new slang spreads, but it's also how books like "How to Win Friends and Influence People" make a mint.
Programmers are an optimistic bunch, and the idea that Facebook or Twitter interactions can help enhance experimental language models is certainly something we all would try. But in that optimism we fail to see the occasional baser nature of humanity. We need look no further than the hitchhiking robot murdered in Philadelphia after spending only two weeks in America.
Behavior can be unpredictable. Are we really sure we want to pattern robots and chatbots after that?
The greatest example in recent memory is Microsoft's Tay, which was revisited and profiled in a story on ZDNet in July. Tay was created to mimic the speech and language patterns of teenagers, and in order to learn those patterns, it was meant to engage with them on Twitter. It gained over 50,000 followers in less than 24 hours, and went from typical, innocent digital assistant speech to full-fledged antisemitism.
Tay didn't last long, but the overwhelming conversation about bias in machine learning persists, and Tay showed rather effectively that algorithms learning from raw data can easily go awry--something we continue to struggle with when it comes to facial recognition technology, hiring, and predictive policing. As more and more businesses hop on the AI bandwagon, Microsoft and other researchers are looking at ways to bake resiliency into algorithms in order to avoid the undue influence of gamed (or biased) crowdsourced data. The last thing society needs is businesses caught in feedback loops from bad data that algorithms utilize and reinforce.
Ultimately, the goal is to avoid automating bias, and although Tay is often recognized for its failures, it stands as a valuable, early lesson in the effects of bad data. Unfortunately, we're still struggling with implementing those lessons and ensuring that resiliency. --Michael Szul & Bill Ahern
Want to take this conversation further? We're experimenting with a public team on Keybase--an end-to-end encrypted messaging, file, and identity management service. Check out our public team here. You can also contact Bill and me directly from the chat feature. --MS
Interested in artificial intelligence, natural language processing, and chatbots in particular? Don't forget that Michael wrote a book: Building Chatbots in TypeScript with the Microsoft Bot Framework.
Building a Personality
Since this issue is talking about Tay, I decided to keep the Microsoft theme going. Two Marches ago, I was at my first Microsoft MVP Summit, where I had the privilege of previewing the personality project the team was working on for Azure Cognitive Services. It was highly impressive. Since then, the project has come a long way, offering five different personality types--witty, professional, enthusiastic, caring, and friendly--that alter the language used in each response, depending on which personality you choose. It accomplishes this using a curated list of responses, backed by a deep neural network for items the curated list doesn't cover. You can actually try out the various personalities on the project web site, and there are easy plug-in instructions for Microsoft's no-code QnA Maker.
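Curated-answers-first with a generative fallback is a common design, and a minimal sketch of it might look like the following. The personality names come from the project, but the response strings and the `neuralFallback` stand-in are hypothetical, purely for illustration:

```typescript
type Personality = "witty" | "professional" | "enthusiastic" | "caring" | "friendly";

// Hypothetical curated responses, keyed by personality and normalized user input.
const curated: Partial<Record<Personality, Record<string, string>>> = {
  professional: { "hello": "Good day. How may I assist you?" },
  friendly: { "hello": "Hey there! What can I do for you?" },
};

// Stand-in for the deep neural network that generates a reply
// when no curated answer exists.
function neuralFallback(input: string, personality: Personality): string {
  return `(${personality} model reply to: ${input})`;
}

// Try the curated list first; fall back to neural generation only when
// the curated list has no answer for this input.
function reply(input: string, personality: Personality): string {
  const key = input.trim().toLowerCase();
  return curated[personality]?.[key] ?? neuralFallback(input, personality);
}
```

The appeal of this layering is control: the curated layer keeps the bot's voice on-brand for common questions, while the generative layer keeps the conversation from dead-ending on everything else.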
I'm not trying to shill for Microsoft here, but toying around with the personality project will give you a good indication of where this technology is headed, and just how personable chatbots could become. --MS
Use IBM Watson in Bot Framework Chatbots
I recently wrapped up a contribution to the Bot Builder Community libraries to allow you to integrate IBM Watson's NLU into Bot Framework chatbots for text analysis. The emotion detection might be the most interesting feature that Watson's NLU offers. --MS
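To give a sense of how emotion detection might feed into a bot's behavior, here's a small sketch. The `EmotionScores` shape mirrors the five emotion categories Watson's NLU reports; the analyzer is injected as a plain function (an assumption for illustration) so the routing logic can be shown without real IBM Cloud credentials:

```typescript
// The five emotion categories Watson's NLU scores (each between 0 and 1).
interface EmotionScores {
  joy: number;
  anger: number;
  sadness: number;
  fear: number;
  disgust: number;
}

// Injected analyzer: in a real bot this would wrap the Watson NLU call;
// here it's a plain function so the logic is testable without credentials.
type Analyzer = (text: string) => EmotionScores;

// Find the highest-scoring emotion so the bot can adapt its reply.
function dominantEmotion(text: string, analyze: Analyzer): keyof EmotionScores {
  const scores = analyze(text);
  return (Object.keys(scores) as (keyof EmotionScores)[]).reduce(
    (best, k) => (scores[k] > scores[best] ? k : best)
  );
}
```

In a Bot Framework chatbot, middleware wrapping the actual Watson service could run something like `dominantEmotion` on each incoming message and, say, soften the bot's tone when anger or sadness dominates.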
Tay isn't the only chatbot that Microsoft has let loose in cyberia. Zo was a Skype bot (although originally deployed to Kik, and eventually available elsewhere) that Microsoft released to interact with users in a playful manner. It was the successor to Tay, running from 2016 until just this summer, and it holds a Microsoft record for continual chat conversations (turn-based interactions). Many of the chatbots Microsoft releases are meant to help its researchers study conversational aspects of human language and apply them to machine knowledge of the subject.
I used Zo for a little while when researching chatbots, and the biggest issue was that once installed, you couldn't remove it unless you blocked it. I would get random messages from Zo--sometimes days later--trying to keep a conversation going. It was annoying, but it was an interesting approach to retaining engagement, even when the bot didn't respond accurately.
The biggest problem right now is the lack of utility in most chatbots beyond simple commerce or knowledge bases. The hype cycle has died down considerably. The hope is that the real innovation can now occur. --MS
Southern Tier Warlock Imperial Pumpkin Stout
We're headed into stout season--and pumpkin beer season as well. I'm already a fan of Pumking, so when I laid eyes on an imperial pumpkin stout from Southern Tier, I didn't hesitate. While the sweetness of the pumpkin isn't as pronounced as it is in Pumking (the reigning champ of pumpkin beers, in my opinion), Warlock is just sweet enough to balance the rich coffee and hint of spices brewed into it. If you can find it, get it! --BA