Figure 1 - uploaded by Rene Te Boekhorst
Aibo playing the "peekaboo" game. Left: Sony Aibo with human partner. Right: using a static image. (Top: hiding head with front leg; bottom: Aibo's view, showing face detection.)


Source publication
Article
Full-text available
The game peekaboo, ordinarily played between an adult and baby, is used as a situation where a robot may develop social interaction skills such as rhythm, timing and turn taking, using its experience and history of interactions over different temporal horizons. We present experiments using a robot that explore the length of experiences in an...

Contexts in source publication

Context 1
... take the view that appropriate mechanisms, while based in innate abilities, should largely develop through ontogeny. Our approach is to conduct experiments on a physical robot (see Figure 1) to examine these mechanisms for development. ...
Context 2
... robot stays in a "sitting" position throughout the experiments with the forelegs free to move, facing a picture of a face (see Figure 1) at a fixed distance of 40 cm. A picture was used rather than an interaction partner in these particular experiments to allow analysis of the robot's interactions in isolation when comparing horizon lengths, and for experimental repeatability. ...

Similar publications

Article
Full-text available
South American subterranean rodents are mainly described as solitary, and mutual synchronization was never observed among individuals maintained together in the laboratory. We report that a single birth event was capable of disrupting the robust nocturnal activity rhythm of singly housed tuco-tucos from northwest Argentina. "Around-the-clock activity"...
Article
Full-text available
Understanding the actions of others is a necessary foundational cornerstone for effective and affective social interactions. Such understanding may result from a mapping of observed actions as well as heard sounds onto one's own motor representations of those events. To examine the electrophysiological basis of action-related sounds, EEG data were...
Article
Full-text available
Movements and behavior synchronize during social interaction at many levels, often unintentionally. During smooth conversation, for example, participants adapt to each other's speech rates. Here we aimed to find out to what extent speakers adapt their turn-taking rhythms during a story-building game. Nine sex-matched dyads of adults (12 males, 6 f...
Article
Full-text available
Behavioral rhythms synchronize between humans for communication; however, the relationship of brain rhythm synchronization during speech rhythm synchronization between individuals remains unclear. Here, we conducted alternating speech tasks in which two subjects alternately pronounced letters of the alphabet during hyperscanning electroencephalogra...
Article
Full-text available
Symmetry formation, symmetry breaking, and the strength of symmetric coupling in social interaction are investigated using motion capture data from pairs of individuals dancing to repeating rhythms. Repeating auditory rhythms are a simple form of temporal symmetry in which local entropy can be controlled. Spatio--temporal symmetry is formed when an...

Citations

... Experience is operationalized as the temporally extended flow of information across sensory, motor and internal variables mediating interaction. Such raw uninterpreted experience can be used with biologically plausible information-theoretic methods to self-structure sensory fields, and derive dynamical regularities relating motor actuation to what will be sensed, resulting in actively guided perception and action (e.g. the discovery of optical flow and the capacity for visually guided movement [1]). Without employing any notion of representation or static, truth-functionalist semantics, on several robots and humanoids we have harnessed enactive cognitive architectures based on interaction histories and shown that such temporally extended experiences can structure the acquisition of self-regulating skills such as turn-taking games in social engagement [2], [3], with the acquisition of new behaviours, and the dynamic switching between them adaptively depending on details of ongoing interactions with humans [4], [5]. ...
... where p(x, y) is the joint distribution of X and Y. The information distance between X and Y is then defined as ...
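The excerpt above cuts off before the formula; the quantity it refers to is the standard information distance, d(X, Y) = H(X|Y) + H(Y|X). As a concrete sketch of computing it from a joint distribution table (the function name and the array encoding of p(x, y) are our illustrative choices, not from the source):

```python
import numpy as np

def information_distance(p_xy):
    """Standard information distance d(X, Y) = H(X|Y) + H(Y|X),
    computed in bits from a joint probability table p_xy
    (2-D array with rows indexed by x and columns by y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)   # marginal distribution of X
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y

    def entropy(p):
        p = p[p > 0]                       # ignore zero-probability cells
        return -np.sum(p * np.log2(p))

    h_xy = entropy(p_xy.ravel())           # joint entropy H(X, Y)
    # H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X)
    return (h_xy - entropy(p_y)) + (h_xy - entropy(p_x))
```

The distance is 0 when X and Y determine each other and grows as the variables become independent, which is what makes it useful for self-structuring sensory fields by relatedness.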
... Using social cues such as turn-taking in various modalities, detection of social engagement, and feedback from vocal, face- and gaze interaction to help acquire and select action, these cognitive architectures have allowed robots to acquire behaviours such as predictive gaze and peek-a-boo [2]. Moreover, on the basis of social feedback our humanoids have first developed distinct interactive drumming behaviour and peek-a-boo interactive behaviours, and then were able to autonomously switch between these behaviours, depending on social cues and engagement with a human interaction partner [4], [5]. In our experiments, notably, peek-a-boo turn-taking behaviour could not be stably acquired if action selection was random, i.e. not based on prior experience, or if the temporal horizon of experiences was too small or too large [2], [5], and the same is true for interactive drumming, which additionally required short-term memory of recent social engagement to be acquired [5]. ...
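The horizon-dependent acquisition described above can be illustrated with a minimal, hypothetical sketch: store fixed-length sensorimotor windows together with the action taken and the feedback received, and reuse the action of the most similar rewarded past window. The class, its fields, and the nearest-window selection rule are illustrative assumptions, not the authors' interaction-history architecture:

```python
import numpy as np

class ExperienceHistory:
    """Hypothetical horizon-based action selection: experiences are
    fixed-length windows over recent sensor values; action selection
    reuses the action of the closest rewarded past experience, and
    falls back to random exploration when no such experience exists."""

    def __init__(self, horizon, n_actions, rng=None):
        self.horizon = horizon          # length of each experience window
        self.n_actions = n_actions
        self.memory = []                # list of (window, action, reward)
        self.rng = rng or np.random.default_rng(0)

    def select_action(self, recent_sensors):
        window = np.asarray(recent_sensors[-self.horizon:], dtype=float)
        rewarded = [(w, a) for w, a, r in self.memory
                    if r > 0 and len(w) == len(window)]
        if not rewarded:                # no usable experience yet: explore
            return int(self.rng.integers(self.n_actions))
        # reuse the action of the nearest rewarded window (squared distance)
        w, a = min(rewarded, key=lambda wa: np.sum((wa[0] - window) ** 2))
        return a

    def record(self, recent_sensors, action, reward):
        window = np.asarray(recent_sensors[-self.horizon:], dtype=float)
        self.memory.append((window, action, reward))
```

The sketch makes the failure modes in the excerpt visible: with random selection (empty or ignored memory) no stable turn-taking can form, while a horizon that is too short cannot distinguish interaction phases and one that is too long dilutes the relevant recent context.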
Conference Paper
Full-text available
Abstract—We overview how sensorimotor experience can be operationalized for interaction scenarios in which humanoid robots acquire skills and linguistic behaviours via enacting a “form-of-life” in interaction games (following Wittgenstein) with humans. The enactive paradigm is introduced which provides a powerful framework for the construction of complex adaptive systems, based on interaction, habit, and experience. Enactive cognitive architectures (following insights of Varela, Thompson and Rosch) that we have developed support social learning and robot ontogeny by harnessing information-theoretic methods and raw uninterpreted sensorimotor experience to scaffold the acquisition of behaviours. The success criterion here is validation by the robot engaging in ongoing human-robot interaction with naive participants who, over the course of iterated interactions, shape the robot’s behavioural and linguistic development. Engagement in such interaction exhibiting aspects of purposeful, habitual recurring structure evidences the developed capability of the humanoid to enact language and interaction games as a successful participant.
Article
Full-text available
Early infant emotional development concerns the interactive emergence of emotional states that may motivate approach and withdrawal in epigenetic systems. Different patterns of infant facial expressions, vocalization, and gazing emerge within dyadic interactions in the first 10 months of life. Concretely, the interplay of a limited number of interactive parameters creates complex real-time patterns which change over developmental time. I describe these phenomena using statistical simulations, continuous ratings, computer vision modeling, and old-fashioned observation, ponder their psychological meaning, and indicate how they are ripe for formal modeling.