Fig 1 - uploaded by Jonathan Gratch
Methodology to get normal maps for wrinkle patterns. 

Source publication
Conference Paper
Wrinkles, blushing, sweating and tears are physiological manifestations of emotions in humans. Therefore, the simulation of these phenomena is important for the goal of building believable virtual humans which interact naturally and effectively with humans. This paper describes a real-time model for the simulation of wrinkles, blushing, sweating an...

Context in source publication

Context 1
... normal maps for the wrinkle patterns are created. For each, the following procedure is carried out, Fig. 1: (a) a picture is taken of a person making the respective wrinkle configuration; (b) the picture is cropped and converted to grayscale; (c) the picture is composited onto the virtual human texture; (d) the composited picture is edited to remove color information everywhere except in the wrinkle region; (e) a Gaussian filter is applied to blur the image, and the side borders are faded into the background color; (f) finally, NVIDIA's normal map tool is used to create the normal map. ...
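Steps (b), (e), and (f) of the procedure above amount to blurring a grayscale wrinkle image and then converting it, treated as a height map, into a tangent-space normal map. A minimal sketch of that conversion, using Pillow and NumPy; the function name and parameters are illustrative, and the gradient-based conversion stands in for NVIDIA's normal map tool, not a reproduction of it:

```python
import numpy as np
from PIL import Image, ImageFilter

def wrinkle_normal_map(img, blur_radius=2, strength=4.0):
    """Grayscale + Gaussian blur (steps b, e), then a height-map to
    normal-map conversion (step f). `strength` scales wrinkle depth."""
    gray = img.convert("L").filter(ImageFilter.GaussianBlur(blur_radius))
    h = np.asarray(gray, dtype=np.float32) / 255.0
    # Finite-difference gradients of the height field.
    dy, dx = np.gradient(h)
    nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(h)
    norm = np.sqrt(nx**2 + ny**2 + nz**2)
    # Pack unit normals into tangent-space RGB ([-1, 1] -> [0, 255]).
    rgb = np.stack([nx, ny, nz], axis=-1) / norm[..., None]
    return Image.fromarray(((rgb * 0.5 + 0.5) * 255).astype(np.uint8), "RGB")
```

On a flat region the gradients vanish, so the output is the familiar uniform blue of an "empty" normal map; wrinkle creases perturb the red and green channels.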

Similar publications

Article
In this paper we present different aspects of rendering character emotions, as well as the outcome of an experimental study to find out whether considering skin changes can help improve the perception of certain emotions. While posture and mimics have been the subject of extensive research, we focus on dynamic skin changes in order to express stron...

Citations

... Each expression was defined as a set of Action Units of FACS, a commonly used system to describe facial expressions [131]. [125]; B, Perception of emotion can be enhanced by rendering tears, wrinkles, and blushing [126]; C, Bodily expression combined with facial expression can improve emotion recognition [127]; D, Touch can convey different degrees of valence and arousal [128]; E, Robotic manipulators can be augmented with colored LEDs to communicate emotion [42], [129]. ...
... Skin rendering: Most models presented so far have focused on defining the muscular activity on the face and the dynamics of movement. But emotional expressions may also change the rendering of the face, with the appearance of tears or wrinkles, or a change of skin color [126] (Figure 3-B). With advances in rendering techniques, highly realistic results are obtained, even in real time [137], [138]. ...
Article
Virtual humans and social robots frequently generate behaviors that human observers naturally see as expressing emotion. In this review article, we highlight that these expressions can have important benefits for human–machine interaction. We first summarize the psychological findings on how emotional expressions achieve important social functions in human relationships and highlight that artificial emotional expressions can serve analogous functions in human–machine interaction. We then review computational methods for determining what expressions make sense to generate within the context of interaction and how to realize those expressions across multiple modalities, such as facial expressions, voice, language, and touch. The use of synthetic expressions raises a number of ethical concerns, and we conclude with a discussion of principles to achieve the benefits of machine emotion in ethical ways.
... They showed that visualizing sweat using textures and particles can create realistic sweating animations. Other researchers simulated physical properties of water drops to create credible tear and sweating animations [25,97]. de Melo and Gratch [25] designed different emotional expressions of virtual faces using tears to display sadness and sweat to display fear. ...
... Other researchers simulated physical properties of water drops to create credible tear and sweating animations [25,97]. de Melo and Gratch [25] designed different emotional expressions of virtual faces using tears to display sadness and sweat to display fear. The authors showed that visualizing sweat and tears improves identification of the underlying emotions of the virtual human. ...
... 2019.3.3). In line with de Melo and Gratch [25], we simulated material properties of water using textures with high specular and low diffuse values in the sweating area on the face to create static beads of sweat. We designed wet hair using 3D-models and textures with a high level of smoothness. ...
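The wet-skin look described in the excerpt above relies on a specular-dominant material: low diffuse, high specular, and a tight highlight. A minimal scalar Blinn-Phong sketch of that trade-off; the function and its default coefficients are illustrative, not the authors' actual Unity material settings:

```python
import numpy as np

def blinn_phong(normal, light_dir, view_dir,
                diffuse=0.1, specular=0.9, shininess=64):
    """Scalar Blinn-Phong intensity. Low diffuse plus high specular and
    high shininess yields the tight, bright highlight of wet skin."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    half = (l + v) / np.linalg.norm(l + v)   # half vector
    diff = diffuse * max(float(n @ l), 0.0)
    spec = specular * max(float(n @ half), 0.0) ** shininess
    return diff + spec
```

Viewed along the mirror direction the highlight dominates; a few degrees off, the specular term collapses and only the weak diffuse term remains, which is why sweat beads read as bright, localized glints.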
Conference Paper
Avatars are used to represent users in virtual reality (VR) and create embodied experiences. Previous work showed that avatars' stereotypical appearance can affect users' physical performance and perceived exertion while exercising in VR. Although sweating is a natural human response to physical effort, surprisingly little is known about the effects of sweating avatars on users. Therefore, we conducted a study with 24 participants to explore the effects of sweating avatars while cycling in VR. We found that visualizing sweat decreases the perceived exertion and increases perceived endurance. Thus, users feel less exerted while embodying sweating avatars. We conclude that sweating avatars contribute to more effective exergames and fitness applications.
... However, there is more to facial expressions than the facial muscle actions. Autonomic expressions such as pupil size, blushing and sweating are linked to arousal states and to emotions (de Melo and Gratch, 2009) and are potentially perceivable by observers (Kret and De Dreu, 2017; Kret et al., 2014; Kret, 2015; Prochazkova and Kret, 2017), even by young infants (Kelsey et al., 2019; Aktar et al., 2020). These physiological responses reflect autonomic nervous system activity and are non-intentional, yet potentially informative for observers. ...
... For example, many people are nervous while giving a speech to an audience. While some get red blotches on the neck, others may get dark sweat stains under the armpits of their shirt (de Melo and Gratch, 2009). Extreme, high-arousal emotion states such as laughter or fear can cause the bladder to empty. ...
Article
Humans and great apes are highly social species, and encounter conspecifics throughout their daily lives. During social interactions, they exchange information about their emotional states via expressions through different modalities including the face, body and voice. In this regard, their capacity to express emotions, intentionally or unintentionally, is crucial for them to successfully navigate their social worlds and to bond with group members. Darwin (1872) stressed similarities in how humans and other animals express their emotions, particularly with the great apes. Here, we show that emotional expressions have many conserved, yet also a number of divergent features. Some theorists consider emotional expressions as direct expressions of internal states, implying that they are involuntary, cannot be controlled and are inherently honest. Others see them as more intentional and/or as indicators of the actor's future behavior. After reviewing the human and ape literature, we establish an integrative, evolutionary perspective and provide evidence showing that these different viewpoints are not mutually exclusive. Recent insights indicate that, in both apes and humans, some emotional expressions can be controlled or regulated voluntarily, including in the presence of audiences, suggesting modulation by cognitive processes. However, even non-intentional expressions such as pupil dilation can nevertheless inform others and influence future behavior. In sum, while showing deep evolutionary homologies across closely related species, emotional expressions show relevant species variation.
... Researchers explored the expression of emotions and perception of agents through adding wrinkles, blushing, sweating, and tears [34] and experimented with real time facial animations such as realistic wrinkles [28], pleasure-arousal-dominance models linking gaze behaviors to emotional states through head movement and body gestures during gaze shifts [62], and implementing complex facial expressions such as superposition and masking [73]. ...
Conference Paper
The field of intelligent virtual agents (IVAs) has evolved immensely over the past 15 years, introducing new application opportunities in areas such as training, health care, and virtual assistants. In this survey paper, we provide a systematic review of the most influential user studies published in the IVA conference from 2001 to 2015 focusing on IVA development, human perception, and interactions. A total of 247 papers with 276 user studies have been classified and reviewed based on their contributions and impact. We identify the different areas of research and provide a summary of the papers with the highest impact. With the trends of past user studies and the current state of technology, we provide insights into future trends and research challenges.
... • Screen size - The large screen size expanded the interface between humans and screens [31]. The default size, or the resolution of the entire screen, was 780x440 pixels [32]. This size was selected because the full-screen display was convenient for the students. ...
Article
Quizzes are a useful assessment method for evaluating overall comprehension of what has been taught. Quizzes delivered in varied interactive ways can encourage students to pay closer attention and indirectly improve their comprehension of the subject. Hence, a two-dimensional (2D) virtual agent multimedia quiz application (app) was developed. Accordingly, a 2D female virtual agent with a Malaysian native look was designed to conduct the quiz through the app. The quiz app comprises 20 multiple-choice questions. The participants who took this quiz were fourth- and fifth-semester students from the electrical engineering department of Malaysia's polytechnics. The quiz app was developed based on several theories, principles, and a literature review. The app provides different functions across several screens, namely the home, question, feedback, hint, reward, and review screens. Additionally, this paper reports the outcomes of the usability and user-satisfaction tests conducted.
... More specifically, the sense of powerlessness postulated to be associated with crying (Vingerhoets, 2013) should conversely result in a perceived lack of emotional self-control. Since the present study used avatars rather than photographs, the addition of tears might add to their emotional believability (de Melo & Gratch, 2009). Finally, items for happiness and shame were included as potential candidates of other emotional states potentially associated with tears (Reed et al., 2015). ...
... the present study provided some initial evidence for the potential usefulness of generating tears on realistic 3-D models. Building, for example, upon the pioneering work of de Melo and Gratch (2009), realistically animated models of human emotional crying should soon allow testing of crying dynamics, including the shedding of further light on the mechanisms underlying potential evolutionary relationships between emotional tears and pupil constriction. ...
Article
Small pupils elicit empathic socioemotional responses comparable to those found for emotional tears. This might be understood in an evolutionary context. Intense emotional tearing increases tear film volume and disturbs tear layer uniformity, resulting in blurry vision. A constriction of the pupils may help to mitigate this handicap, which in turn may have resulted in a perceptual association of both signals. However, direct empirical evidence for a role of pupil size in tearful emotional crying is still lacking. The present study examined socioemotional responses to different pupil sizes, combined with the presence (absence) of digitally added tears superimposed upon expressively neutral faces. Data from 50 subjects showed significant effects of observing digitally added tears in avatars, replicating previous findings for increased perceived sadness elicited by tearful photographs. No significant interactions were found between tears and pupil size. However, small pupils likewise elicited a significantly greater wish to help in observers. Further analysis showed a significant serial mediation of the effects of tears on perceived wish to help via perceived and then felt sadness. For pupil size, only felt sadness emerged as a significant mediator of the wish to help. These findings support the notion that pupil constriction in the context of intense sadness may function to counteract blurry vision. Pupil size, like emotional tears, appears to have acquired value as a social signal in this context.
... For example, four different emotions, upset, anger, rage, and violence, show different intensity levels for the same basic emotion, "anger". Effectiveness is another subclass of cues [13] used to derive the intensity of interactive emotions. For example, the presence of tears emphasizes sadness. ...
... Current technologies for facial animation synthesis are unable to generate realistic facial expressions containing basic emotions [19]. de Melo and Gratch simulated facial emotional expressions for blushing, sweating, tears of fear, anger, and wrinkles to express physiological emotions in humans [20]. They nevertheless relied on an unrealistic, static, texture-based method. ...
... The high speed at which we process the database indicates that memory suffices for valuing (Cannon 1927). Nevertheless, the body may also react at "low" speed when we recognize something or somebody as important, for example by sweating or blushing, and more generally in terms of stress, intentionality, or relaxation (Faigman et al. 2003; de Melo & Gratch 2009). This reaction may play a role in body language (Gelder 2006) and presumably also in new learning. ...
Article
This article presents a new conceptual view on the conscious will. This new concept approaches our will from the perspective of the requirements of our neural-muscular system and not from our anthropocentric perspective. This approach not only repositions the will at the core of behavior control, it also integrates the studies of Libet and Wegner, which seem to support the opposite. The will does not return as an instrument we use to steer, but rather as part of the way we learn new automatic behavior and of how our neural system steers us. The new concept suggests that understanding of our will is more about understanding of our daily behavior than about the will itself.
... At the same time, there has been vast research on architectures capable of communicating emotions to human interaction partners. Published work ranges from realistic expressions of basic emotions [3] down to selective gazing behavior [4] and other detailed micro-expressions. The goal is to express simulated emotions as naturalistically as possible. ...
Conference Paper
Research on emotionally expressive embodiments for simulated affective behavior has resulted in impressive systems that allow characters to be perceived as more natural. Consequently, we want to explore what happens if these characters gain the human ability to hide their true emotions. More particularly, we are looking at the change in emotional engagement and the change in perceived naturalism. The Partial Poker-Face will allow virtual characters to recognize and control their own emotional expressive behaviour in order to influence how they are perceived by others. While our research assumes applications in virtual storytelling, we are confident that it will also be of value for other areas, like human-robot interaction.