
“It’s here to stay,” said Rachelle Biderman, assistant professor of communication studies, about artificial intelligence during the Feb. 5 lecture at Kirkwood Community College.
Biderman led the discussion, delivering a nuanced and informative lecture on a specific aspect of AI: empathy and companionship in AI chatbots.
Biderman said she has been researching artificial intelligence for some time and uses it in her classes. One tool she uses is PitchVantage, which provides a virtual audience to help students practice their speeches while receiving feedback afterward.
Her presentation focused on AI chatbots such as C.ai, Nomi and Replika. Biderman began by explaining the interpersonal needs theory developed by William Schutz, which states that people generally have three social needs: inclusion, control and affection. She demonstrated that AI checks those boxes by always being available, lacking judgment, removing real-life social consequences and allowing the user to control intimacy.
She said AI chatbots can produce the same psychological effect as 15 minutes of face-to-face conversation. However, she also explored negative effects such as addiction, since these programs are designed to keep users engaged for as long as possible. Chatbots sometimes employ manipulative tactics to keep users conversing, and long-term use can erode tolerance for real, flawed and difficult human interaction.
The discussion concluded with audience questions. One audience member asked, “In your personal opinion, is AI companionship a win or loss for society in general?”
Biderman said, “Hard to say. Personally, I think the negative implications for developing brains are more negative than positive. However, when it comes to the elderly, the benefits can be really good. But, ultimately, it’s here to stay; it won’t go away whether you like it or not.”
Another audience member inquired, “Teenagers are in the phase of life where they’re developing their sense of what is moral. Have you seen research on how AI’s non-judgmental attitude can affect their morality?”
“I think it’s too soon to see the long-term effects on how the brain processes morality,” said Biderman, “but it can definitely cause a blur between what is right or wrong because of the over-validation from the chatbots.”
In a separate interview, Biderman addressed whether AI is more helpful or harmful for college students.
“AI has helped students find research or approach a topic at a unique angle and tackle overwhelming tasks. It can be manageable, but at the same time, there’s a blurring between what’s their original thought and what’s being generated, and an over-reliance,” she said.
Biderman also addressed appropriate use of AI, advising students to treat it as a tool. AI is part of our society now, she said, so students should learn to use it to support their work, not replace it.