Intelligent conversation is only emulated, which was enough for chatbots to enjoy some success in the days when AI was something seen only in sci-fi novels. To impress today, we need something more advanced.

Unfortunately, a consultation with a doctor can be difficult to obtain, especially for advice on non-life-threatening problems. The healthcare system is congested and inefficient, and sick people may wait weeks or months for an appointment.

Microsoft this week launched an experiment in which a bot nicknamed "Tay" was given the personality of a teenager and designed to learn from online exchanges with real people. But the plan went awry because of an ill-willed campaign to teach her bad things, according to the US software colossus.

Anthropomorphism is innate to us humans: we project human emotions onto animals, stuffed toys and robots. If a robot has eyes, we are inclined to think it has a personality, even when we know the personality is not real.
Tay's profile on Twitter describes it as an AI (artificial intelligence) "that's got zero chill" and that gets smarter as people talk to it.
Nao robots and huggable bear robots have been used to help children with autism spectrum disorder engage socially and learn languages, for instance.
We have also been touched by the story of how Siri became a non-judgmental friend and teacher to an autistic boy called Gus. Helsingin Sanomat interviewed a couple in Japan who had a Pepper robot in their home.
Now granted, most of the above stories state or imply that Microsoft should have realized this would happen and could have taken steps to prevent Tay from learning to say offensive things.
(Example: the Atlanta Journal-Constitution noted that “[a]s surprising as it may sound, the company didn’t have the foresight to keep Tay from learning inappropriate responses.”).