Chat Bots and Voice Agents
- Sophia Hettich

- 26 Sept. 2024
- 3 min read
Can voice agents and chatbots be understood as “conversational”? Should they? What experience does a human-like system bring?
In the times we live in, most people are not familiar with the specifics of voice agents (VAs), chatbots and AI in general, mainly because they don’t need to be. To them, using a VA can feel like having an actual conversation with a real person, and getting smart or even just funny answers to their questions and requests leaves them impressed. For many people, the whole concept of AI is mysterious and maybe slightly scary. I have lost count of how many conversations I’ve had with friends and relatives in which I’ve tried to explain that our world will not be taken over by robots tomorrow, and that the “intelligence” part of AI is very relative, considering that if we break it down, everything is still human-made.

That is why I have trouble putting VAs and chatbots in the category of “conversational”. Yes, technically a user can have a simplified version of a conversation with their VA, but nothing comes from an independently formed thought of the AI. The conversations are always shaped by the person who created the VA: their biases, the social structures they live in and their personal experience, even if that happens subconsciously. Current VAs and chatbots are also not released with the purpose of being conversational. Their purpose is to aid us in our everyday lives by providing a service of some kind.

This brings up the issues that arise when systems like these try to become more human-like. Most VAs on the market at the moment are gendered female, which, as noted by many authors, among them Søndergaard & Hansen [3], reinforces societal views on women and builds on outdated stereotypes. As mentioned before, VAs are supposed to be of service to us humans, which would mean that if they do become more human-like, they would also become more servant-like, but without the boundaries that we experience in human-human interaction.

However, voice is a very natural aspect of human communication, and it is also considered the most natural way of interacting with computers [3]. Studies [1, 2] have shown that when human characteristics are applied to VAs and chatbots, users were, for example, more likely to share intimate issues with them. Especially when the agents themselves revealed information, this facilitated the users’ self-disclosure [1]. But as Sannon et al. [2] point out, the personification of chatbots might also set different expectations about the agent’s expertise and ability to address certain topics.

Of course, all of these discussions are based on the experiences of people who choose to have voice agents and chatbots in their lives, because they have a choice about whether they want that assistance or not. But as highlighted in one of this week’s readings [3], some people do not have a choice about using (human) assistance, because for them it is a vital part of managing their everyday lives. The struggles they face show that the human factor is a necessity, and one that, at least at the moment, cannot be replaced by “human-like” VAs. A human connection goes beyond practical tasks and the physical help needed; it includes mental support, social needs and challenging world views [3].

At this moment, voice agents and chatbots cannot be understood as conversational, and in my opinion they shouldn’t be, as that gives the wrong impression of what AI is and can do. If we were to develop more human-like systems, it is absolutely crucial that we pay more attention to the issues that can arise from them, such as giving VAs a gender.
I also think that, apart from the many aspects to consider with human-like agents, it should be made clear to users that the VAs are in fact not human and should not be considered a replacement for real human relations.
References
[1] Lee, Y., Yamashita, N., Huang, Y., & Fu, W. (2020). “I Hear You, I Feel You”: Encouraging Deep Self-disclosure through a Chatbot. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3313831.3376175
[2] Sannon, S., Stoll, B., DiFranzo, D., Jung, M., & Bazarova, N. (2018). How Personification and Interactivity Influence Stress-Related Disclosures to Conversational Agents. Companion of the 2018 ACM Conference on Computer Supported Cooperative Work and Social Computing. https://doi.org/10.1145/3272973.3274076
[3] Søndergaard, M., & Hansen, L. (2018). Intimate Futures. Proceedings of the 2018 Designing Interactive Systems Conference. https://doi.org/10.1145/3196709.3196766