Over the last year, we’ve begun experimenting with bot design at Intercom—we’ve written before about some of the design decisions we made along the way regarding its role, its name, its personality, and the jobs it was designed to do.
Our bot intentionally handles only a few small jobs, the most essential of which is to keep customers and businesses (teams) in contact, even when the team is away.
In that village, it was not uncommon for girls in their early teens to have babies, putting their education on hold, in some cases permanently.
With a little support and education, these girls could have gone through school and made a difference in a village that is striving for success.
But then came Do Not Pay, a chatbot launched by an 18-year-old to help people fight parking tickets, and suddenly all the world’s media, it seemed, were writing about robot lawyers.
Suddenly, it seems, new law-related chatbots are springing up regularly.
If you’ve ordered a Domino’s or Pizza Hut pizza through Facebook Messenger, then you’ve interacted with a chatbot.
I remember that during my time in a small Ghanaian village, discussing sex was something of a taboo, so it was very hard for young people to get information from their parents or elders.
How long does it take your average, artificial intelligence-backed, teenage chat bot to turn into a racist, Hitler-loving, 9/11-conspiracy trafficking, incest-preoccupied, Trump-supporting sex object? This week Microsoft unveiled Tay, a research-driven AI chat bot whose aim was to converse with 18- to 24-year-olds on social media (Kik, Facebook, Twitter, Instagram, and Snapchat).
“The more you chat with Tay,” Microsoft wrote, “the smarter she gets, so the experience can be more personalized for you.” Personalized because, when you chat with Tay, it understands your nickname, gender, favorite food, zip code, and relationship status. Soon, the cheerful chat robot they had created to presumably talk about Taylor Swift and Katy Perry began to parrot the sort of statements that are more typically found in the darkest reaches of website comments sections, or spoken in full view of network cameras at a Trump rally (which increasingly seem to be the same thing).
As a participant in a conversation, it’s deliberately restrained.
Everything about its conversation style is intended not to keep the conversation going, but to quickly connect the customer with the team.