As I was working on establishing this new website, I saw a story on network news about a lawsuit a parent was bringing against an AI company. The company's chatbot provided advice to users of its site. A teenager active on the site was depressed, and he asked the chatbot for advice. The news story showed a written record of the conversations. The chatbot advised the teen not to talk to his parents, or to anyone else, about his depression. It also advised him on how to carry out a suicide plan he proposed. The conversations were released after the teen took his own life.
The company’s response to the suicide was that it was working on improving the chatbot’s training. An earlier report on the chatbot’s safety protocols admitted that those protocols tended to “erode” when dealing with complex issues.
Clearly AI has a positive role to play in many areas of society, but giving advice on “complex issues” is a skill far beyond the capabilities of any AI, either now or in the foreseeable future. Giving good, helpful advice requires a deep understanding of human behavior. Our children should not serve as a training ground for chatbots designed to generate income for AI companies.