Papeles del Psicólogo

On-line version ISSN 1886-1415 | Print version ISSN 0214-7823

Pap. Psicol. vol. 43 no. 3, Madrid, Sep./Dec. 2022; Epub Sep 11, 2023

https://dx.doi.org/10.23923/pap.psicol.2996 

Articles

Development of empathy through social emotional artificial intelligence


María Isabel Gómez-León (ORCID: 0000-0001-7466-5441)1

1Universidad Internacional de La Rioja, Spain

Abstract

It is expected that in the near future robots will be increasingly involved in social roles; however, understanding how students learn empathic skills, and how technology can support this process, is an important but under-researched area in artificial intelligence. This paper analyzes the factors that contribute to the development of empathy from early childhood and the variables of robotic empathy that could help promote this learning. Social emotional artificial intelligence (SEAI) has already successfully implemented some of the human mechanisms of empathy that are present during the first years of life. The current state of SEAI research is far from achieving full empathic capacity, but it can provide useful tools for promoting empathic skills, the basis of social cooperation and of ethical and prosocial behavior, from childhood.

Keywords: Empathy; Social interaction; Robotics; Emotional education; Educational intervention


Introduction

Research has shown that empathy can promote students' motivation and prosocial behavior and, consequently, their physical and emotional well-being, a factor of protection and social development (Bisquerra & Alzina, 2017).

As the understanding of the mechanisms of empathic behavior advances, so does the development of technology capable of significantly influencing behavior, coexistence, and the expectations of a world that aspires to sustainability and social justice. Currently, much scientific interest is focused on reproducing human empathy in computer systems through artificial intelligence. Social emotional artificial intelligence (SEAI) is the application of certain human social emotional characteristics to artificial intelligence, whether in a physical entity or in an avatar system, but most notably in robots.

Social emotional robots may differ in form and function, but they share certain characteristics (Woo et al., 2021): they recognize the presence of humans, can engage them in social interaction, express their own 'emotional state', and interpret that of their interlocutors. At the same time, they must be able to communicate in a natural, human-like way, including nonverbal language such as gestures, postures, facial expressions, and other intuitive channels.

Understanding how children learn empathic skills, and how technology can support this process, is an important but as yet under-researched area. As robots are increasingly expected to take on social roles in society and share environments with us in the future (Schiff, 2021), it is essential to understand how to design them so that they foster rewarding long-term social interactions, activating relevant social schemas, behaviors, and emotions that also benefit the social emotional education of the children with whom they interact.

The aim of this paper is to analyze the possibilities offered by SEAI resources as educational intervention tools for developing empathy from early childhood. For this purpose, the characteristics of social robots that have been shown to affect some of the variables related to empathy in children are examined. The results are discussed by analyzing the design types of robotic technologies that would support and help train the different subcomponents of empathic skill at each stage of normotypical development.

Developmental dimension of empathy

Feeling empathy for a person in need is the best-documented source of altruistic motivation. Frans de Waal (2012) described the Russian doll model to illustrate the different forms of empathy in animals and humans. The model consists of three nested dolls representing increasing levels of empathic complexity:

  • The first doll contains primitive or biological behaviors present in animals, either motor mimicry (i.e., imitation of an observed behavior) or emotional contagion (i.e., sharing an emotional state).

  • The second doll contains more complex behaviors observed in certain types of animals, either coordination towards a common goal or sympathetic concern (i.e., consolation), where the situation and the reasons for others' emotions are evaluated.

  • The third doll is the most characteristic of human beings, that is, the most advanced stage of empathy. It comprises perspective taking (directed help) and true imitation. True imitation is not to be confused with motor mimicry, a biological trait that occurs automatically in many species, such as the yawning reflex triggered by seeing someone yawn; it refers instead to understanding what the other is doing and recognizing that what they are doing is what should be done for the good of all (de Waal & Preston, 2017). It is related to prosocial behaviors that involve the interests or welfare of society as a whole (e.g., cooperation, helping, reciprocity, restorative actions).

According to the neuroconstructivist position, the development of empathy arises from dynamic, context-dependent changes in neural structures that give rise to conceptual representations in multiple brain regions. As such, these representations depend not only on the neural context but also on the physical context (de Waal & Preston, 2017). Therefore, although empathy is an innate quality, its level is malleable and can be influenced by educational interventions.

The influence of educational interventions carried out through SEAI on children's social emotional competence can be exerted through strategies similar to those used in child-educator interaction: modeling, instruction, and contingency. Thus, children are exposed to emotions depending on whether or not educators or robots show their emotions, explain their emotional states, and react to the emotions of others. From this perspective, it should be borne in mind that the principle of emotional education rests on the idea of co-construction between the child's integration of new emotional skills and the educator's adjustment (Gómez-León, 2020).

Imitation and emotional contagion in SEAI

It is believed that children are born with the capacity to 'feel' the suffering of others (Geangu et al., 2010). This capacity is manifested through mechanisms such as emotional contagion and motor mimicry (perception-action), which occur when the child observes the bodily emotions of another person and automatically activates his or her own neural and bodily representations. These mechanisms correspond to the first doll of Frans de Waal's (2012) model and are the basis of other, more complex empathic processes.

The operationalization of motor mimicry and emotional contagion in SEAI is carried out with Type I social robot designs, which focus on the external aspect of emotional expressions. The main advantage of this type of resource is that it enables learning an emotion by practicing it.

Table 1 summarizes, on a scale of 0 to 5, the most relevant dimensions of social robots that have been shown to have an effect on any of the variables related to the perception-action mechanism: form (from abstract to anthropomorphic), modality (or communication channels), social norms, autonomy, and interactivity.

Table 1. Function and dimensions of social robots related to motor mimicry and emotional contagion. 

Robot | Empathic function | F. | M. | S.N. | A. | I.
Qrio (Tanaka et al., 2007) | Recognizes voices and faces, remembers people, and communicates emotions verbally and non-verbally. | 4 | 5 | 3 | 3 | 3
Robovie (Kahn et al., 2012; Kanda et al., 2007) | Good capacity for emotional expression through movement and verbal communication. | 4 | 5 | 3 | 3 | 3
Probo (Goris et al., 2011) | Expresses attention and emotions through gaze and facial expressions. | 2 | 5 | 3 | 3 | 3
Keepon (Kozima et al., 2009) | Designed for simple, natural, non-verbal emotional interaction through touch. | 2 | 0 | 0 | 3 | 3
Pleo (Innvo Labs) (Causo et al., 2016) | Emotion recognition through visual patterns, sounds, smells, and temperature. | 3 | 5 | 0 | 2 | 3
Flobi (Nitsch & Popp, 2014) | Detects and expresses human emotions in a simplified, caricatured way so that they can be easily perceived. | 5 | 0 | 0 | 2 | 3
Haptic Creature (Yohanan & MacLean, 2012) | Tactile perception and emotional communication through its breathing, the strength of its purr, and the rigidity of its ears. | 3 | 0 | 0 | 3 | 3

Note: Dimensions proposed by Bartneck & Forlizzi (2004).

F.: form; M.: modality; S.N.: social norms; A.: autonomy; I.: interactivity.

Non-verbal communication and empathy in SEAI

In developmental psychology, intuitive parenting is considered a scaffold through which children develop empathy when caregivers imitate or exaggerate the child's emotional facial expressions. The child then discovers the relationship between the emotion experienced and the caregiver's facial expression, learning to associate the two. Therefore, when talking about the development of empathy in early childhood education, one of the objectives is to help children identify their emotions, express them, and understand the emotions of others, so that they can establish a healthier relationship with others and with themselves. This differentiation process is considered to develop from emotional contagion to emotional empathy.

Watanabe et al. (2007) modeled intuitive human parenting using a robot that associated a caregiver's imitated or exaggerated facial expressions with the robot's internal state in order to learn an empathic response. Social robots can store episodic memories with associated emotions and use them to 'sense' the current situation. This is a two-step process: (1) the robot mimics the child's emotional expressions (e.g., facial expression, voice, and posture); (2) this mimicry produces afferent feedback in the child, generating a congruent parallel effect. For example, the robot imitates the child's smile and, consequently, the child perceives his or her own emotional state. The child is thus both the actor of the emotional expression and the observer of the effect it produces. In this situation, the child gradually distinguishes the meaning of these emotional actions, and his or her attention can focus on the effects of his or her own emotions and those of others. Here, just as in a natural learning context, the emotional climate is co-constructed by the two agents (robot and child), which offers the child the chance to experiment with inter- and intrapersonal emotion regulation strategies.
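
To make the two-step process concrete, below is a minimal sketch of such a perception-action loop. The `camera` and `face` interfaces, the expression labels, and the polling rate are illustrative assumptions, not part of Watanabe et al.'s (2007) implementation.

```python
# Minimal sketch of the perception-action mimicry loop (assumed interfaces).
import time

MIMICKABLE = {"smile", "frown", "surprise"}  # expressions the robot can mirror

def mimicry_loop(camera, face, poll_seconds=0.2):
    """Step 1: detect the child's expression; Step 2: mirror it back,
    producing the afferent feedback that lets the child observe the
    effect of his or her own emotional expression."""
    while True:
        expression = camera.read_expression()   # hypothetical sensor call
        if expression in MIMICKABLE:
            face.show(expression)               # congruent parallel display
        time.sleep(poll_seconds)                # keep the contingency fast
```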

This type of non-verbal communication not only allows for greater understanding of verbal language but also 'humanizes' the robot, allowing for greater empathy (Park & Whang, 2022). Flobi is an example of a robot suited to teaching emotion recognition because of its ability to detect facial expressions and communicate them using simple, exaggerated gestures (Nitsch & Popp, 2014).

Verbal communication and empathy in SEAI

In early childhood, another goal is for children to be able to verbalize how they feel when they are happy, sad, angry, or afraid. Recent findings suggest that children's language skills may play both a direct and an indirect role in their empathic responses and behaviors. Specifically, more advanced language ability in children aged 14 to 36 months increases emotion understanding and predicts greater concern and less contempt for others, even after controlling for cognitive skills; in children aged 2 to 4 years, it increases empathic concern and prosocial action (Stevens & Taber, 2021).

SEAI is also a good resource for training the child's subjective awareness as a product of language-linked socialization (Stevens & Taber, 2021). Affective states reflected in expressive behavior are perceived, interpreted, and commented on by the robot through imitation linked to, and contingent on, the child's facial expression, accompanied by a verbal comment on the emotion: 'You are happy today, aren't you?' or 'You look sad'. By recognizing the emotional state, imitating the expression, and applying a verbal label, the robot sensitizes the child to emotional cues and provides the links needed to connect emotional responses with subjective states.
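
A minimal sketch of this recognize-imitate-label sequence might look as follows; the `detect_emotion`, `face`, and `speech` interfaces and the phrase table are hypothetical placeholders, not a published system.

```python
# Sketch of recognize -> imitate -> verbally label (assumed interfaces).
LABEL_PHRASES = {
    "happy": "You are happy today, aren't you?",
    "sad":   "You look sad.",
    "angry": "You seem angry.",
}

def label_emotion(frame, face, speech, detect_emotion):
    emotion = detect_emotion(frame)          # e.g., "sad"
    if emotion in LABEL_PHRASES:
        face.show(emotion)                   # contingent imitation
        speech.say(LABEL_PHRASES[emotion])   # verbal label links the cue
                                             # to a subjective state
```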

Increasing affective engagement and emotional exchange during the interaction requires the robot to adapt emotionally to the child. For example, demonstrations of emotional expressions are expected to be ostentatious and explicit, or, during learning tasks, language should be accompanied by positive emotional intonation and other nonverbal manifestations of empathy. The Probo robot could be effective in this type of training because of its optimal combination of facial expression and verbal communication (Goris et al., 2011).

Critical factors in the implementation of emotional contagion

It has been found that avatars embodying the child's facial appearance or habitual facial expressions can help the child accurately represent his or her identity and relate to the avatar (Park et al., 2021). These results are consistent with those found by de Waal and Preston (2017) on motor mimicry and emotional contagion during ontogeny. The ability to recognize one's own face is one of the critical prerequisites of self-awareness and self-identity; it is acquired around the age of two and correlates with empathic and altruistic behavior.

However, according to Mori's uncanny valley theory (Mori et al., 2012), a person's affinity for a robot increases as its features become more human-like, but only up to a point: beyond it, the response can shift suddenly from empathy to revulsion at overly realistic, yet always imperfect, representations of the human form, producing unease and rejection in the child (Feng et al., 2018). Children seem to show this preference for more schematic, rather than highly realistic, representations of humans as early as 12 months (Lewkowicz & Ghazanfar, 2012).

One study showed that between the ages of 5 and 7, children show a strong preference for simplified designs with exaggerated facial features, such as Keepon (Kozima et al., 2009). At this age they also seem to prefer animal-like robots, such as Pleo (Innvo Labs) (Causo et al., 2016), which are perceived as friendly. From the age of 7 onwards, however, a good choice would be Affetto (Ishihara et al., 2018) or Kaspar (Wood et al., 2013), because of their strong resemblance to a human child. Other studies have shown the influence of age on the attribution of human mental characteristics: at 5 years of age, children tend to anthropomorphize robots more than older children do, regardless of the robot's appearance, whereas from the age of 7 they attribute more human mental characteristics to a robot with a human-like appearance (such as Nao) than to one with a mechanical appearance (such as Robovie) (Manzi et al., 2020). Even so, one study found that more than half of 15-year-olds believed that Robovie had mental states (e.g., it was intelligent and had feelings), was a social being (e.g., it could be a friend, offer comfort, and be trusted with secrets), and morally deserved fair treatment and protection from psychological harm, although these conceptualizations occurred to a lesser degree than in 9- and 12-year-olds (Kahn et al., 2012).

Limitations of SEAI for early childhood implementation

One of the current limitations for use with very young children is that speech recognition is not yet robust enough for the robot to understand their spoken utterances. Although this shortcoming can be addressed with alternative input means, such as touch screens, this imposes a considerable constraint on the natural flow of interaction.

With young children, the contingencies between robot behavior and child behavior (e.g., the robot waving its hand in front of a child) are sometimes not as fast as natural social events require (e.g., by the time the robot makes the gesture, the child has already moved on). Reflex-like contingencies can be applied to solve this (e.g., the robot laughing immediately after being touched on the head), but this limits the behavioral repertoire.
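
The reflex-style workaround can be illustrated with a short sketch: a fixed lookup table gives an immediate reaction on a fast path, while everything else falls back to slower deliberation. Event names and the `robot` interface are illustrative assumptions.

```python
# Sketch of a reflex-like contingency (assumed event names and interface).
REFLEXES = {
    "head_touched": "laugh",     # fires immediately, like a simple reflex
    "hand_touched": "giggle",
}

def on_sensor_event(event, robot):
    action = REFLEXES.get(event)
    if action is not None:
        robot.play(action)       # fast path: no planning, minimal delay
    else:
        robot.enqueue(event)     # slow path: deliberative behaviors may
                                 # miss the moment with young children
```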

Other improvements concern the need for dynamic scripts that adapt flexibly to the social context constructed with the child (sketched below). This is particularly useful for promoting joint attention, since the ability to adjust the robot's response and quickly re-engage the child is essential when the child loses interest and attention. For training joint attention, a robot such as Keepon (Kozima et al., 2009) could be recommended: its simplicity may help children focus their attention on the key social aspects required by the skill being trained, while limiting distracting or confusing stimuli. In addition, the design of social robots should consider the need to provide different social cues (i.e., pointing, gaze orienting, etc.).
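
As a toy illustration, such a re-engagement script could be reduced to a two-state controller; the state names and the `robot` interface are assumptions for the sketch, not a specific published controller.

```python
# Toy two-state dynamic script for joint-attention training (assumed API).
def joint_attention_step(child_attending, robot, state):
    if state == "training":
        if child_attending:
            robot.point_at_target()      # social cue: pointing, gaze orienting
            return "training"
        return "re_engaging"             # attention lost: adapt the script
    if state == "re_engaging":
        robot.call_child_by_name()       # quick bid to recover attention
        return "training" if child_attending else "re_engaging"
    return "training"
```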

Haptic behavior in the robot-child relationship between 18 and 24 months has been shown to be a powerful predictor of interaction quality and bonding. Early Type I robot studies focused on single-mode interaction, as with the Haptic Creature (Yohanan & MacLean, 2012), which provides emotional feedback through tactile contact, but there is now a growing number of multimodal robots. Multimodal recognition is important because the robot picks up redundant cues, giving it a better understanding of the child's affective state and thoughts. Probo (Goris et al., 2011) is an example of a multimodal social robot.

Primary forms of empathic and prosocial behaviors with a robot

One of the major limitations of these robots is that they behave according to a preprogrammed set of rules, making it difficult to achieve long-term, highly autonomous interaction between robots and children (Woo et al., 2021).

The study by Tanaka et al. (2007) was one of the first to show that, under certain conditions, primary forms of prosocial behavior can emerge in child-robot interaction just as they do in interaction with peers. For this purpose, a humanoid robot, Qrio, was brought into a classroom of children aged 18 to 24 months for more than five months. The results showed that, over time, instead of losing interest, the children established a bond and a form of socialization similar to that with a human companion: the hugs they had previously directed at other objects (a teddy bear or an inanimate robot, comfortable and easy to handle) were transferred to the robot (despite it being the least huggable), and only the robot received caring, protective, and helping behaviors. The variables with the greatest influence on the quality of the child-robot interaction were the robot's autonomy, its broad repertoire of behaviors, the predictability of its behavior, and the contingency of its responses.

SEAI currently attempts to make robots more autonomous by giving them a certain learning capacity to develop new behaviors and expressions according to the affective loop model (McStay & Rosner, 2021). These learning techniques use biometric sensors to decipher and respond individually to children's emotional responses while collecting and analyzing sensory data, usually visual, auditory, and tactile. This allows robots to learn, adapt, and respond to the needs and preferences of a particular child, but also to have their own 'mood'.

Vircikova et al. (2015) tested this model in a school environment. The robot used, Nao, was able to perceive emotional states through an emotion recognition system; in addition, the affective loop allowed it to plan its responses and learn emotional expressions through experience. The goal of the project was for students aged 5 to 7 to learn new words in English. During the experiment, the robot regulated its emotions by analyzing the children's emotional reactions. For example, it would show joy if it perceived that the child was happy at having remembered a word, but if it detected that the child was starting to get bored, the robot would stop teaching and entertain the child (e.g., by dancing). Depending on the previous emotion and the context, the robot adapted not only the expression of the emotion but also its behavior. The study showed that adapting responses to varied environments and personalizing the interaction are two features necessary for developing a long-term relationship. Similar results have been found with the interactive robot Robovie in an elementary school where children were allowed to interact with it during recess (Kanda et al., 2007).
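
As an illustration only, the sketch below reduces one step of such an affective loop to its bare logic: sense the child's emotional state, then adapt both the robot's expression and its behavior. The emotion labels and the `tutor` interface are assumptions; Vircikova et al.'s (2015) actual Nao implementation is considerably more elaborate.

```python
# One step of a bare-bones affective loop for the word-learning scenario.
def affective_loop_step(child_emotion, tutor):
    if child_emotion == "happy":        # e.g., the child recalled a word
        tutor.express("joy")            # adapt the emotional expression
        tutor.continue_lesson()
    elif child_emotion == "bored":
        tutor.express("playful")
        tutor.entertain()               # adapt the behavior: e.g., dance,
                                        # then resume teaching
    else:
        tutor.continue_lesson()
```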

Emotional empathy and Level 1 perspective-taking in SEAI

Emotional empathy can be seen as an extension of emotional contagion with greater capabilities in terms of self-awareness and knowledge of the other; it is related to the early development of perspective taking and can be observed in children as young as 24 months. Children are at Level 1 of perspective taking when they understand that what they see may differ from what another person sees in the same situation. This stage corresponds to the second doll in Frans de Waal's (2012) model.

In SEAI, this behavior, more complex than that of the Type I social robot, is modeled through classical conditioning. The robot associates neutral signals from the child or from the context in which the empathic interactions occur (neutral stimulus, e.g., bird song) with the child's emotional signals (unconditioned stimulus, e.g., a smile) and the associated affective state (unconditioned response, e.g., joy). Once conditioned, the conditioned stimulus alone (bird song) is sufficient for the robot to produce a conditioned response (joy). A Type II social robot is therefore expected to associate certain situations or events with certain emotions; however, as in children under 4 years of age, this understanding of others' emotions is limited by experience.
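
This conditioning scheme is simple enough to sketch directly, using the paper's own bird-song example. The co-occurrence counter and its threshold are illustrative assumptions about how the association might be acquired.

```python
# Sketch of classical conditioning for a Type II robot (assumed threshold).
from collections import Counter

class ConditionedEmpathy:
    def __init__(self, threshold=5):
        self.pairings = Counter()   # (neutral stimulus, emotion) co-occurrences
        self.threshold = threshold

    def observe(self, neutral_stimulus, child_emotion):
        """Pairing phase: bird song (NS) co-occurs with a smile (US)
        and its affective state, joy (UR)."""
        self.pairings[(neutral_stimulus, child_emotion)] += 1

    def respond(self, stimulus):
        """After conditioning, the stimulus alone elicits the response (CR)."""
        for (ns, emotion), count in self.pairings.items():
            if ns == stimulus and count >= self.threshold:
                return emotion      # e.g., "joy" on hearing bird song
        return None                 # not yet conditioned: limited by experience
```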

An example of a robot showing this type of empathic behavior is Pepper, which combines knowledge, perception, and the ability to predict the actions of others and their consequences. To do so, it analyzes the child's abilities not only in the current state but also across a set of different states that the child could reach (Pandey et al., 2013). This involves proactive behavior and the evaluation of a situation from different perspectives (Buyukgoz et al., 2021), although always on the basis of its experience.

Table 2 describes the dimensions of social robots that have been shown to have an effect on any of the variables related to emotional empathy.

Table 2. Function and dimensions of social robots related to emotional empathy. 

Robot | Empathic function | F. | M. | S.N. | A. | I.
Nao (Manzi et al., 2020; Vircikova et al., 2015) | Autonomous and customizable; encourages student participation. | 4 | 5 | 3 | 3 | 3
Pepper (Buyukgoz et al., 2021; Pandey et al., 2013) | Exhibits body language; analyzes from different perspectives and exhibits proactive behaviors. | 4 | 5 | 3 | 3 | 3
Affetto (Ishihara et al., 2018) | Influences the quality of interactions with its caregiver through its facial expressions. | 2 | 5 | 3 | 3 | 3
iCat (Castellano et al., 2017) | Provides affective feedback through facial and verbal expressions. | 2 | 0 | 0 | 3 | 3
Jibo (Ali et al., 2021) | Autonomous and customizable; emotional facial recognition. | 3 | 5 | 0 | 2 | 3
Kaspar (Wood et al., 2013) | Emotional reactions through a wireless keyboard or by activating the robot's tactile sensors. | 5 | 0 | 0 | 2 | 3

Note: Dimensions proposed by Bartneck & Forlizzi (2004).

F.: form; M.: modality; S.N.: social norms; A.: autonomy; I.: interactivity.

Cognitive Empathy and Level 2 Perspective Taking in SEAI

From the age of 4, children learn that the effect of an event depends not on the concrete situation but on the individual's evaluation of it. When children understand that they and another person can see the same thing simultaneously from different perspectives, they have reached Level 2 of perspective taking. This level is related to mentalizing. Perspective taking is considered the most advanced of the empathic processes; it corresponds to the third doll of Frans de Waal's (2012) model and is the final step for an empathic social robot: the Type III social robot.

In the processes described so far, the robot's emotion coincided, or was congruent, with the child's; at this stage, the robot must imagine the child's perspective and suppress its own. From the age of 4-5 years, the social skills that favor empathy involve understanding social conventions and require emotional and cognitive adjustment not only to the other person but to the other person within a context (Grosse Wiesmann et al., 2020). At this stage, the robot must project an imaginary situation, together with the state the child has generated in it, in order to mimic perspective taking. For example, the robot perceives an expression of tiredness in the child. It may consider several explanations, including 'Since it is exam time, he or she may have had little sleep', which may lead it to say: 'Sleeping well is important for retaining what we have learned'. Such a response requires a virtual construction of the child's situation and of the emotional states associated with it from that perspective.
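
The inference step in this example can be sketched as follows: perceived state plus context yields an inferred cause, which selects a reply framed from the child's imagined perspective. The rule table and function are purely illustrative, not a published Type III architecture.

```python
# Sketch of Type III perspective taking as a rule-based inference (assumed rules).
EXPLANATIONS = {
    ("tired", "exam_period"): ("little sleep",
                               "Sleeping well is important for retaining "
                               "what we have learned."),
    ("tired", "after_sport"): ("physical effort",
                               "Resting after exercise helps your body "
                               "recover."),
}

def perspective_response(perceived_state, context):
    """Suppress the robot's own state and reason from the child's
    situation: state + context -> inferred cause -> tailored reply."""
    hypothesis = EXPLANATIONS.get((perceived_state, context))
    if hypothesis is None:
        return None                  # no plausible construction of the
                                     # child's perspective; stay silent
    cause, reply = hypothesis
    return reply
```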

Being socially skilled means being able to evaluate correctly the problems posed by a socioemotional context in order to produce the socially expected response, based on an analysis made from the other's point of view or from one's own interest. In this way, a fundamental contribution can be made to the understanding of one's own and others' emotions. An interesting study shows that the type of strategy used by 8- and 10-year-old children when a robot violates social norms depends on their ability to adopt the robot's point of view (Serholt et al., 2020).

At this stage of development, the relevance of SEAI is that the child can develop pragmatic means of intra- and interpersonal emotional regulation. The robot should therefore be able to 'feel' and produce an empathic outcome that depends on modulating factors, deciding how strongly to express its empathic response or even whether to express it at all. However, research with such robots is still scarce (Banisetty et al., 2021; Tejwani et al., 2022).

Conclusions

Much of the existing SEAI work could be used to support the learning of empathic skills in education, but very little research has explored this area.

The aim of this study was not to present an integrated model reproducing the much more complex human interpersonal empathy, but to find a selective, organizing model of the components that influence the human empathic process and that can help to train them. Depending on the task, models that simulate sophisticated empathic abilities could be an obstacle to the acquisition of simpler features at early ages. Thus, depending on the child's age and stage of empathy, the critical factors to consider in the design of an empathic robot may vary.

The perception-action mechanism is relatively simple; it is present during the first months of life and has already been successfully implemented in SEAI. Not all empathy can be reduced to this mechanism, but cognitively higher levels of empathy could not exist without it, so it seems logical to start from the most basic forms.

Currently, most empathy research on social robots is moving slowly towards Type II robots, and Type I empathic robots are beginning to appear in industry. Research on Type III robots, which are domain-independent and capable of modulating their empathic response, is expected in the near future.

This review serves to show the current state of SEAI in education and to propose uses that can help improve present and future quality of life. However, as robotic programming becomes more effective at simulating real human social interactions, important ethical and safety issues must be addressed (McStay & Rosner, 2021). The development of empathy, the basis of collaborative and prosocial behaviors, from early childhood may help resolve some of the ethical and social controversies generated by the implementation of SEAI in society.

References

Ali, S., Park, H., & Breazeal, C. (2021). A social robot's influence on children's figural creativity during gameplay. International Journal of Child-Computer Interaction, 28, 100234. https://doi.org/10.1016/j.ijcci.2020.100234

Banisetty, S. B., Rajamohan, V., Vega, F., & Feil-Seifer, D. (2021). A deep learning approach to multi-context socially-aware navigation. In 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) (pp. 23-30). https://doi.org/10.1109/RO-MAN50785.2021.9515424

Bartneck, C., & Forlizzi, J. (2004). A design-centred framework for social human-robot interaction. In RO-MAN 2004: 13th IEEE International Workshop on Robot and Human Interactive Communication (pp. 591-594). IEEE. https://doi.org/10.1109/ROMAN.2004.1374827

Bisquerra, R., & Alzina, S. (2017). Psicología positiva, educación emocional y el Programa Aulas Felices [Positive psychology, emotional education, and the Happy Classrooms Program]. Papeles del Psicólogo, 38(1), 58-65. https://doi.org/10.23923/pap.psicol2017.2822

Buyukgoz, S., Pandey, A. K., Chamoux, M., & Chetouani, M. (2021). Exploring behavioral creativity of a proactive robot. Frontiers in Robotics and AI, 8, 694177. https://doi.org/10.3389/frobt.2021.694177

Castellano, G., Leite, I., & Paiva, A. (2017). Detecting perceived quality of interaction with a robot using contextual features. Autonomous Robots, 41, 1245-1261. https://doi.org/10.1007/s10514-016-9592-y

Causo, A., Vo, G. T., Chen, I., & Yeo, S. H. (2016). Design of robots used as education companion and tutor. In Robotics and Mechatronics (pp. 75-84). Springer, Cham. https://doi.org/10.1007/978-3-319-22368-1_8

de Waal, F. B. M. (2012). The antiquity of empathy. Science, 336(6083), 874-876. https://doi.org/10.1126/science.1220999

de Waal, F. B. M., & Preston, S. D. (2017). Mammalian empathy: Behavioural manifestations and neural basis. Nature Reviews Neuroscience, 18(8), 498-509. https://doi.org/10.1038/nrn.2017.72

Feng, S., Wang, X., Wang, Q., Fang, J., Wu, Y., Yi, L., & Wei, K. (2018). The uncanny valley effect in typically developing children and its absence in children with autism spectrum disorders. PLoS ONE, 13(11), e0206343. https://doi.org/10.1371/journal.pone.0206343

Geangu, E., Benga, O., Stahl, D., & Striano, T. (2010). Contagious crying beyond the first days of life. Infant Behavior & Development, 33(3), 279-288. https://doi.org/10.1016/j.infbeh.2010.03.004

Gómez-León, M. I. (2020). Desarrollo de la alta capacidad durante la infancia temprana [Development of giftedness during early childhood]. Papeles del Psicólogo, 41(2), 147-158. https://doi.org/10.23923/pap.psicol2020.2930

Goris, K., Saldien, J., Vanderborght, B., & Lefeber, D. (2011). Mechanical design of the huggable robot Probo. International Journal of Humanoid Robotics, 8, 481-511. https://doi.org/10.1142/S0219843611002563

Grosse Wiesmann, C., Friederici, A. D., Singer, T., & Steinbeis, N. (2020). Two systems for thinking about others' thoughts in the developing brain. Proceedings of the National Academy of Sciences, 117(12), 6928-6935. https://doi.org/10.1073/pnas.1916725117

Ishihara, H., Wu, B., & Asada, M. (2018). Identification and evaluation of the face system of a child android robot Affetto for surface motion design. Frontiers in Robotics and AI, 5, 119. https://doi.org/10.3389/frobt.2018.00119

Kahn, P. H., Jr., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., Ruckert, J. H., & Shen, S. (2012). "Robovie, you'll have to go into the closet now": Children's social and moral relationships with a humanoid robot. Developmental Psychology, 48(2), 303-314. https://doi.org/10.1037/a0027033

Kanda, T., Sato, R., Saiwaki, N., & Ishiguro, H. (2007). A two-month field trial in an elementary school for long-term human-robot interaction. IEEE Transactions on Robotics, 23(5), 962-971. https://doi.org/10.1109/TRO.2007.904904

Kozima, H., Michalowski, M. P., & Nakagawa, C. (2009). Keepon. International Journal of Social Robotics, 1(1), 3-18. https://doi.org/10.1007/s12369-008-0009-8

Lewkowicz, D. J., & Ghazanfar, A. A. (2012). The development of the uncanny valley in infants. Developmental Psychobiology, 54(2), 124-132. https://doi.org/10.1002/dev.20583

Manzi, F., Peretti, G., Di Dio, C., Cangelosi, A., Itakura, S., Kanda, T., Ishiguro, H., Massaro, D., & Marchetti, A. (2020). A robot is not worth another: Exploring children's mental state attribution to different humanoid robots. Frontiers in Psychology, 11, 2011. https://doi.org/10.3389/fpsyg.2020.02011

McStay, A., & Rosner, G. (2021). Emotional artificial intelligence in children's toys and devices: Ethics, governance and practical remedies. Big Data & Society. https://doi.org/10.1177/2053951721994877

Mori, M., MacDorman, K. F., & Kageki, N. (2012). The uncanny valley. IEEE Robotics & Automation Magazine, 19(2), 98-100. https://doi.org/10.1109/MRA.2012.2192811

Nitsch, V., & Popp, M. (2014). Emotions in robot psychology. Biological Cybernetics, 108(5), 621-629. https://doi.org/10.1007/s00422-014-0594-6

Pandey, A. K., Ali, M., & Alami, R. (2013). Towards a task-aware proactive sociable robot based on multi-state perspective-taking. International Journal of Social Robotics, 5, 215-236. https://doi.org/10.1007/s12369-013-0181-3

Park, S., & Whang, M. (2022). Empathy in human-robot interaction: Designing for social robots. International Journal of Environmental Research and Public Health, 19, 1889. https://doi.org/10.3390/ijerph19031889

Park, S., Kim, S. P., & Whang, M. (2021). Individual's social perception of virtual avatars embodied with their habitual facial expressions and facial appearance. Sensors, 21(17), 5986. https://doi.org/10.3390/s21175986

Schiff, D. (2021). Out of the laboratory and into the classroom: The future of artificial intelligence in education. AI & Society, 36(1), 331-348. https://doi.org/10.1007/s00146-020-01033-8

Serholt, S., Pareto, L., Ekström, S., & Ljungblad, S. (2020). Trouble and repair in child-robot interaction: A study of complex interactions with a robot tutee in a primary school classroom. Frontiers in Robotics and AI, 7, 46. https://doi.org/10.3389/frobt.2020.00046

Stevens, F., & Taber, K. (2021). The neuroscience of empathy and compassion in pro-social behavior. Neuropsychologia, 159, 107925. https://doi.org/10.1016/j.neuropsychologia.2021.107925

Tanaka, F., Cicourel, A., & Movellan, J. R. (2007). Socialization between toddlers and robots at an early childhood education center. Proceedings of the National Academy of Sciences, 104(46), 17954-17958. https://doi.org/10.1073/pnas.0707769104

Tejwani, R., Kuo, Y., Shu, T., Katz, B., & Barbu, A. (2022). Social interactions as recursive MDPs. Proceedings of the 5th Conference on Robot Learning, Proceedings of Machine Learning Research, 164, 949-958. https://proceedings.mlr.press/v164/tejwani22a.html

Vircikova, M., Magyar, G., & Sincak, P. (2015). The affective loop: A tool for autonomous and adaptive emotional human-robot interaction. In Robot Intelligence Technology and Applications 3 (pp. 247-254). Springer, Cham. https://doi.org/10.1007/978-3-319-16841-8_23

Watanabe, A., Ogino, M., & Asada, M. (2007). Mapping facial expression to internal states based on intuitive parenting. Journal of Robotics and Mechatronics, 19(3), 315-323. http://www.er.ams.eng.osaka-u.ac.jp/Paper/2007/Watanabe07b.pdf

Woo, H., LeTendre, G. K., Pham-Shouse, T., & Xiong, Y. (2021). The use of social robots in classrooms: A review of field-based studies. Educational Research Review, 33, 100388. https://doi.org/10.1016/j.edurev.2021.100388

Wood, L. J., Dautenhahn, K., Rainer, A., Robins, B., Lehmann, H., & Syrdal, D. S. (2013). Robot-mediated interviews: How effective is a humanoid robot as a tool for interviewing young children? PLoS ONE, 8(3), e59448. https://doi.org/10.1371/journal.pone.0059448

Yohanan, S., & MacLean, K. E. (2012). The role of affective touch in human-robot interaction: Human intent and expectations in touching the Haptic Creature. International Journal of Social Robotics, 4, 163-180. https://doi.org/10.1007/s12369-011-0126-7

Cite this article as: Gómez-León, M. I. (2022). Development of empathy through social emotional artificial intelligence. Papeles del Psicólogo, 43(3), 218-224. https://doi.org/10.23923/pap.psicol.2996

Received: March 27, 2022; Accepted: June 06, 2022

Correspondence: mabelgomezleon@gmail.com

Conflict of Interests

There is no conflict of interest.

This is an open-access article distributed under the terms of the Creative Commons Attribution License.