Behind The Feminine Facade: Gender Bias in Virtual Assistants and Its Effect on Users

Keywords

Virtual Assistants
Gender bias
Stereotype
Linguistic equity
Digital interaction

How to Cite

Singh, S., Kumar, A., & Bose, S. (2024). Behind The Feminine Facade: Gender Bias in Virtual Assistants and Its Effect on Users. Journal of Ecohumanism, 3(4), 351–356. https://doi.org/10.62754/joe.v3i4.3592

Abstract

This paper examines gendered voice in virtual assistant responses. It investigates the gender biases these systems display by looking at popular assistants such as Siri, Alexa, Cortana and Google Assistant, all of which default to female voices. Using both quantitative and qualitative methods, the study explores how gender bias in virtual assistant design affects user experience. It also examines how virtual assistants can be programmed and designed for inclusive interaction that challenges societal gender norms. Drawing on diverse fields such as pragmatics, sociolinguistics, and gender studies, the study analyzes how these assistants encode and reproduce gender bias, and it highlights the importance of linguistic equity in digital interaction.


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.