
Conference dates
October 26-28, 2023
Authors
Sara Choi, Cornell University
Dawn E. Schrader, Cornell University
Background
Research premise
In the race to incorporate AI technologies, I've observed how design choices have long-term consequences for how people trust and for their sense of personal agency, identity, and belonging. With AI systems capable of emotionally persuasive interactions, there is growing concern about their potential to exploit cognitively and psychologically vulnerable populations. This led me to collaborate with my moral psychology professor to examine how intentional design and moral education frameworks can mitigate the risks of normalized human-AI interaction for those still learning how to trust, particularly children and users with psychological vulnerabilities.
Paper session
Abstract
Technology has demonstrable effects on learning, decision making, personal agency, and sociality in interpersonal, community, and global spheres. It influences our engagement with the world and how we express our identities through emotions, including happiness and sadness, as well as motivation and a sense of belonging. As generative and agentic AI comes to resemble humans more closely, ethical challenges arise for developers and deployers of the technology as well as for those interacting with it. This is of particular importance for youth because they are born into an AI-driven world of social robots and are unfamiliar with an existence untethered from technology. Notably, youth and digital natives engage in anthropomorphism, imbuing nonhuman things with human characteristics. This paper identifies ethical and normative principles underlying the idea of recreating humans in digital or robotic form, as well as the moral and ethical impact of human engagement with embodied AI (e.g., robots, agents, avatars) and disembodied AI (e.g., chatbots, Alexa, Siri). Considering AI's structural makeup, functions, and smart capabilities relative to humans, we examine the potentialities and dangers of normalized human interaction with AI, especially for youth.
We propose rethinking the content and forms of moral education for youth and vulnerable populations to protect children from potential exploitation at the point of "the singularity," where human and smart systems become indistinguishable. This indistinguishability could be merely perceived, existing in the minds of both youth and adults, or actual, realized in the development of AI systems. We posit that the form of embodiment of AI will play an increasingly less relevant role for children and digital natives, and conclude that cognitive developmental moral education of AI developers should focus on moral sensitivity to these psychological factors of youth interaction. We advocate for the industry to consider not just direct users but also those affected by users' decisions, and their goals in striving for AI alignment.
S. Choi and D. Schrader. 2023. "Embodiment of AI and its Social, Cognitive, and Moral Impacts on Youth."
Presented at the 49th Association for Moral Education Conference, October 26-28, Fort Worth, TX, USA.
Notes
This paper is not published. Please reach out to me or Dr. Schrader for more information about our research.
Learn more about the Association for Moral Education and past conferences on their website.