Deng, M. & Guo, Y. A study of safety acceptance and behavioral interventions for autonomous driving technologies. Sci. Rep. 12, 17891. https://doi.org/10.1038/s41598-022-22720-0 (2022).
Khan, S. K., Shiwakoti, N., Stasinopoulos, P. & Warren, M. A multinational empirical study of perceived cyber barriers to automated vehicles deployment. Sci. Rep. 13, 1842. https://doi.org/10.1038/s41598-023-29018-9 (2023).
Zhang, Q., Yang, X. J. & Robert, L. P. Drivers’ age and automated vehicle explanations. Sustainability 13(4), 1948. https://doi.org/10.3390/su13041948 (2021).
Schwarting, W., Pierson, A., Alonso-Mora, J., Karaman, S. & Rus, D. Social behavior for autonomous vehicles. Proc. Natl. Acad. Sci. 116, 24972–24978 (2019).
Jian, J.-Y., Bisantz, A. M. & Drury, C. G. Foundations for an empirically determined scale of trust in automated systems. Int. J. Cogn. Ergon. 4, 53–71 (2000).
Lee, J. D. & See, K. A. Trust in automation: Designing for appropriate reliance. Hum. Factors 46, 50–80 (2004).
Mayer, R. C., Davis, J. H. & Schoorman, F. D. An integrative model of organizational trust. Acad. Manag. Rev. 20, 709–734 (1995).
Makovi, K., Sargsyan, A., Li, W., Bonnefon, J.-F. & Rahwan, T. Trust within human-machine collectives depends on the perceived consensus about cooperative norms. Nat. Commun. 14, 3108. https://doi.org/10.1038/s41467-023-38592-5 (2023).
Du, N. et al. Look who’s talking now: Implications of AV’s explanations on driver’s trust, AV preference, anxiety and mental workload. Transp. Res. Part C: Emerg. Technol. 104, 428–442. https://doi.org/10.1016/j.trc.2019.05.025 (2019).
Tan, H. et al. Knowledge as a key determinant of public support for autonomous vehicles. Sci. Rep. 14, 2156 (2024).
Zhang, T. et al. Automated vehicle acceptance in China: Social influence and initial trust are key determinants. Transp. Res. Part C: Emerg. Technol. 112, 220–233 (2020).
Koo, J. et al. Why did my car just do that? Explaining semi-autonomous driving actions to improve driver understanding, trust, and performance. Int. J. Interact. Des. Manuf. (IJIDeM) 9, 269–275. https://doi.org/10.1007/s12008-014-0227-2 (2015).
Körber, M., Prasch, L. & Bengler, K. Why do I have to drive now? Post hoc explanations of takeover requests. Hum. Factors 60, 305–323. https://doi.org/10.1177/0018720817747730 (2018).
Ruijten, P. A. M., Terken, J. M. B. & Chandramouli, S. N. Enhancing trust in autonomous vehicles through intelligent user interfaces that mimic human behavior. Multimodal Technol. Interact. 2, 62. https://doi.org/10.3390/mti2040062 (2018).
Hatfield, N. A. The Effects of Automation Transparency and Ethical Outcomes on User Trust and Blame Towards Fully Autonomous Vehicles (2018).
Forster, Y., Naujoks, F. & Neukum, A. Increasing anthropomorphism and trust in automated driving functions by adding speech output. In 2017 IEEE Intelligent Vehicles Symposium (IV), 365–372. https://doi.org/10.1109/IVS.2017.7995746 (2017).
Zhang, Q., Yang, X. J. & Robert, L. P. What and when to explain? A survey of the impact of explanation on attitudes toward adopting automated vehicles. IEEE Access 9, 159533–159540. https://doi.org/10.1109/ACCESS.2021.3130489 (2021).
Montoya, R. M. & Horton, R. S. A meta-analytic investigation of the processes underlying the similarity-attraction effect. J. Soc. Pers. Relationships 30, 64–94 (2013).
Kim, J. K., Harold, C. M. & Holtz, B. C. Evaluations of abusive supervisors: The moderating role of the abuser’s gender. J. Organ. Behav. 43, 465–482 (2022).
Lee, S., Ratan, R. & Park, T. The voice makes the car: Enhancing autonomous vehicle perceptions and adoption intention through voice agent gender and style. Multimodal Technol. Interact. 3, 20 (2019).
Vilage, G. Voice Control In Cars: Where Are We Headed? (2023).
AI Voice Assistants to Push Success of Autonomous Driving, Software-defined Vehicle.
Samuel, S. Sexist AI: Siri and Alexa reinforce gender stereotypes, says UN study – Vox (2019).
The voices on NYC subway? They come from Bloomberg Radio – Talking Biz News.
Zhang, Q., Yang, X. J. & Robert, L. P. Jr. Finding the right voice: exploring the impact of gender similarity and gender-role congruity on the efficacy of automated vehicle explanations. Proc. AAAI Symp. Ser. 2, 219–223. https://doi.org/10.1609/aaaiss.v2i1.27675 (2023).
Broverman, I. K., Vogel, S. R., Broverman, D. M., Clarkson, F. E. & Rosenkrantz, P. S. Sex-role stereotypes: A current appraisal 1. J. Soc. Issues 28, 59–78 (1972).
Brooks, A. W., Huang, L., Kearney, S. W. & Murray, F. E. Investors prefer entrepreneurial ventures pitched by attractive men. Proc. Natl. Acad. Sci. 111, 4427–4431 (2014).
Eagly, A. H. & Steffen, V. J. Gender stereotypes stem from the distribution of women and men into social roles. J. Pers. Soc. Psychol. 46, 735 (1984).
Eagly, A. H., Makhijani, M. G. & Klonsky, B. G. Gender and the evaluation of leaders: A meta-analysis. Psychol. Bull. 111, 3 (1992).
Eagly, A. H. & Karau, S. J. Role congruity theory of prejudice toward female leaders. Psychol. Rev. 109, 573 (2002).
Paek, H.-J., Nelson, M. R. & Vilela, A. M. Examination of gender-role portrayals in television advertising across seven countries. Sex Roles 64, 192–207. https://doi.org/10.1007/s11199-010-9850-y (2011).
Lewis, J. D. & Weigert, A. Trust as a social reality. Soc. Forces 63, 967–985. https://doi.org/10.1093/sf/63.4.967 (1985).
McAllister, D. J. Affect- and cognition-based trust as foundations for interpersonal cooperation in organizations. Acad. Manag. J. 38, 24–59. https://doi.org/10.5465/256727 (1995).
Fox, J. & Gambino, A. Relationship development with humanoid social robots: Applying interpersonal theories to human–robot interaction. Cyberpsychol. Behav. Soc. Netw. 24, 294–299 (2021).
Nass, C., Steuer, J. & Tauber, E. R. Computers are social actors. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’94, 72–78 (ACM, 1994).
Nass, C. I. & Brave, S. Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship (MIT Press, 2005).
Edwards, C., Edwards, A., Stoll, B., Lin, X. & Massey, N. Evaluations of an artificial intelligence instructor’s voice: Social Identity Theory in human-robot interactions. Comput. Hum. Behav. 90, 357–362 (2019).
Nass, C. & Gong, L. Speech interfaces from an evolutionary perspective. Commun. ACM 43, 36–43 (2000).
Nass, C. et al. Improving automotive safety by pairing driver emotion and car voice emotion. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems, 1973–1976 (ACM, 2005).
He, F. & Burns, C. M. A battle of voices: A study of the relationship between driving experience, driving style, and in-vehicle voice assistant character. In Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 236–242 (ACM, 2022). https://doi.org/10.1145/3543174.3546845.
Loideain, N. N. & Adams, R. From Alexa to Siri and the GDPR: The gendering of virtual personal assistants and the role of data protection impact assessments. Comput. Law & Secur. Rev. 36, 105366 (2020).
Feine, J., Gnewuch, U., Morana, S. & Maedche, A. A taxonomy of social cues for conversational agents. Int. J. Hum.-Comput. Stud. 132, 138–161 (2019).
Koenig, A. M. Comparing prescriptive and descriptive gender stereotypes about children, adults, and the elderly. Front. Psychol. 9, 1086 (2018).
Dong, J., Lawson, E., Olsen, J. & Jeon, M. Female voice agents in fully autonomous vehicles are not only more likeable and comfortable, but also more competent. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 64, 1033–1037 (SAGE Publications, 2020).
Lynch, S. & Campbell, M. Adolescents voice preference in auditory advertisements: A study in gender stereotypes and multi-media marketing. J. Stud. Res. 10 (2021).
Lee, E. J., Nass, C. & Brave, S. Can computer-generated speech have gender? An experimental test of gender stereotype. In CHI ’00 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’00, 289–290 (Association for Computing Machinery, New York, NY, USA, 2000). https://doi.org/10.1145/633292.633461.
Eyssel, F., Kuchenbrandt, D., Bobinger, S., de Ruiter, L. & Hegel, F. ‘If you sound like me, you must be more human’: On the interplay of robot and user features on human-robot acceptance and anthropomorphism. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI ’12, 125–126. https://doi.org/10.1145/2157689.2157717 (Association for Computing Machinery, New York, NY, USA, 2012).
Koch, A. J., D’Mello, S. D. & Sackett, P. R. A meta-analysis of gender stereotypes and bias in experimental simulations of employment decision making. J. Appl. Psychol. 100, 128 (2015).
Nag, P. & Yalçın, N. Gender stereotypes in virtual agents. 1–8 (2020).
Habler, F., Schwind, V. & Henze, N. Effects of smart virtual assistants’ gender and language. In Proceedings of Mensch und Computer 2019, 469–473 (ACM, Hamburg, Germany, 2019). https://doi.org/10.1145/3340764.3344441.
Danielescu, A. Eschewing gender stereotypes in voice assistants to promote inclusion. 1–3 (2020).
Nass, C. I., Moon, Y. & Morkes, J. Computers are social actors: A review of current research. In Human Values and the Design of Computer Technology, 137–162 (CSLI Publications, 1997).
Tay, B., Jung, Y. & Park, T. When stereotypes meet robots: The double-edge sword of robot gender and personality in human–robot interaction. Comput. Hum. Behav. 38, 75–84 (2014).
Choi, H. et al. On the use of simulation in robotics: Opportunities, challenges, and suggestions for moving forward. Proc. Natl. Acad. Sci. 118, e1907856118 (2021).
Zhang, Q., Yang, X. J. & Robert, L. P. Jr. From the head or the heart? An experimental design on the impact of explanation on cognitive and affective trust. arXiv preprint arXiv:2110.03433 (2021).
Lee, J.-G. & Lee, K. M. Polite speech strategies and their impact on drivers’ trust in autonomous vehicles. Comput. Hum. Behav. 127, 107015. https://doi.org/10.1016/j.chb.2021.107015 (2022).
Gambino, A., Fox, J. & Ratan, R. A. Building a stronger CASA: Extending the computers are social actors paradigm. Hum.-Mach. Commun. 1, 71–85 (2020).
Lee, J.-E. R. & Nass, C. I. Trust in computers: The computers-are-social-actors (CASA) paradigm and trustworthiness perception in human-computer communication. In Trust and Technology in a Ubiquitous Modern Environment: Theoretical and Methodological Perspectives, 1–15 (IGI Global, 2010).
Moussawi, S. & Benbunan-Fich, R. The effect of voice and humour on users’ perceptions of personal intelligent agents. Behav. Inf. Technol. 40, 1603–1626. https://doi.org/10.1080/0144929X.2020.1772368 (2021).
Borau, S., Otterbring, T., Laporte, S. & Fosso Wamba, S. The most human bot: Female gendering increases humanness perceptions of bots and acceptance of AI. Psychol. Mark. 38, 1052–1068 (2021).
Byrne, D. E. The Attraction Paradigm Vol. 462 (Academic Press, 1971).
Wetzel, C. G. & Insko, C. A. The similarity-attraction relationship: Is there an ideal one?. J. Exp. Soc. Psychol. 18, 253–276 (1982).
Heilman, M. E. Gender stereotypes and workplace bias. Res. Organ. Behav. 32, 113–135 (2012).
Heilman, M. E., Block, C. J. & Martell, R. F. Sex stereotypes: Do they influence perceptions of managers?. J. Soc. Behav. Pers. 10, 237 (1995).
Lyness, K. S. & Heilman, M. E. When fit is fundamental: performance evaluations and promotions of upper-level female and male managers. J. Appl. Psychol. 91, 777 (2006).
Arnett, J. J. Developmental sources of crash risk in young drivers. Inj. Prev. 8, ii17–ii23. https://doi.org/10.1136/ip.8.suppl_2.ii17 (2002).
Irwin, M. R., Cole, J. C. & Nicassio, P. M. Comparative meta-analysis of behavioral interventions for insomnia and their efficacy in middle-aged adults and in older adults 55+ years of age. Health Psychol. 25, 3–14. https://doi.org/10.1037/0278-6133.25.1.3 (2006).
Whittaker, L., Kietzmann, J., Letheren, K., Mulcahy, R. & Russell-Bennett, R. Brace yourself! Why managers should adopt a synthetic media incident response playbook in an age of falsity and synthetic media. Bus. Horiz. https://doi.org/10.1016/j.bushor.2022.07.004 (2022).
Latinus, M. & Taylor, M. J. Discriminating male and female voices: Differentiating pitch and gender. Brain Topogr. 25, 194–204. https://doi.org/10.1007/s10548-011-0207-9 (2012).
Leung, Y., Oates, J. & Chan, S. P. Voice, articulation, and prosody contribute to listener perceptions of speaker gender: A systematic review and meta-analysis. J. Speech Lang. Hear. Res. 61, 266–297. https://doi.org/10.1044/2017_JSLHR-S-17-0067 (2018).
Pernet, C. R. & Belin, P. The role of pitch and timbre in voice gender categorization. Front. Psychol. 3, 23 (2012).
Schoettle, B. & Sivak, M. A survey of public opinion about autonomous and self-driving vehicles in the US, the UK, and Australia (University of Michigan, Ann Arbor, Transportation Research Institute, 2014).
Hoy, M. B. Alexa, Siri, Cortana, and more: An introduction to voice assistants. Med. Ref. Serv. Q. 37, 81–88 (2018).
Zhang, Z., Tian, R. & Duffy, V. G. Trust in automated vehicle: A meta-analysis. In Human-automation interaction: Transportation (eds Duffy, V. G. et al.) 221–234 (Springer International Publishing, 2023). https://doi.org/10.1007/978-3-031-10784-9_13.
Bagozzi, R. P., Yi, Y. & Phillips, L. W. Assessing construct validity in organizational research. Adm. Sci. Q. 36, 421–458 (1991).
Fornell, C. & Larcker, D. F. Structural equation models with unobservable variables and measurement error: Algebra and statistics. J. Mark. Res. 18, 382–388 (1981).
Netemeyer, R., Bearden, W. & Sharma, S. Scaling Procedures: Issues and Applications (Sage Publications, 2003).
Streiner, D. L. Starting at the beginning: An introduction to coefficient alpha and internal consistency. J. Pers. Assess. 80, 99–103 (2003).
Cronbach, L. J. Coefficient alpha and the internal structure of tests. Psychometrika 16, 297–334 (1951).
Snijders, T. A. & Bosker, R. Multilevel Analysis: An Introduction to Basic and Advanced Multilevel Modeling (Sage, 2011).
Lorah, J. Effect size measures for multilevel models: Definition, interpretation, and TIMSS example. Large-Scale Assess. Educ. 6(1), 1–11 (2018).
Cohen, J. Quantitative methods in psychology: A power primer. Psychol. Bull. 112, 155–159 (1992).
Ye, X., Bhatti, S. & Robert, L. Gender and Security Robot Interactions: A Brief Review and Critique. AMCIS 2024 Proceedings (2024).