International Mini-Symposium on Cognitive Robotics


Date: Friday, September 14th, 2018
Place: Conference Room at the Center for Information and Neural Networks (CiNet),
National Institute of Information and Communications Technology (NICT), Osaka, Japan
Access to CiNet

Aim and scope

Cognitive robotics is an interdisciplinary research area integrating robotics, cognitive science, and neuroscience. Researchers aim to understand human cognition by means of a constructive approach and/or to design cognitive mechanisms for robots inspired by human cognition. This mini-symposium provides a great opportunity to learn about state-of-the-art research in this area. Five excellent international researchers will present their work on robotic and computational models of perception and motor learning, symbolic representation, physical and social interaction with humans, and more. You are cordially invited.

Speakers

Jan Babic
Associate Professor, “Jozef Stefan” Institute

Title
Human Perspective on Dynamical Contacts in Cognitive Humanoids

Abstract
In this talk I will present a series of human behavioral studies that we conducted within the scope of the European project CoDyCo, which dealt with the control and cognitive understanding of robust, goal-directed whole-body motion and interaction with multiple contacts in cognitive humanoids. The main research questions that I will address are: “What are the effects of contacts on postural control?” “How do humans utilize goal-oriented contacts?” “How can we classify contacts?” and “How fast do we move, and why?” I will conclude my talk with an introduction to the follow-up project AnDy, which aims to go beyond the state of the art in understanding collaborative tasks by combining human ergonomic models with cognitive predictive models of human dynamic behavior.

Erhan Oztop
Associate Professor, Ozyegin University

Title
Human Adaptation to Human-Robot Shared Control

Abstract
Human-in-the-loop robot control is an effective method for synthesizing skilled behaviors for robots. It can be used as a type of learning-by-demonstration method in which the human operator acts as an adaptive controller in the control loop of the robot. An important direction in robotics now is to develop robotic systems that are geared not exclusively toward full autonomy but rather toward human compatibility, including the ability to learn and co-adapt with humans for better synergistic performance. For the success of such systems, it is critical to understand how humans adapt within a human-in-the-loop shared control system. In this talk, I will present our recent study aimed at answering this question.

Bio
Erhan Oztop earned his Ph.D. at the University of Southern California in 2002. In the same year, he joined the Computational Neuroscience Laboratories at the Advanced Telecommunications Research Institute International (ATR) in Japan, where he served first as a researcher and later as senior researcher, group leader, and vice department head. In 2011, he moved to the Computer Science Department at Ozyegin University in Istanbul, where he has served as Department Head since 2014. He also directs the newly founded AI-Lab and co-directs the Robotics Laboratory at Ozyegin University.

Emre Ugur
Assistant Professor, Bogazici University

Title
Principles of predictability and verifiability in developmental systems

Abstract
The aim of this talk is to discuss the important roles that the predictability and verifiability of created structures play in the life-long, unbounded development of robots, where the learning targets are not defined a priori. Such problems typically involve unsupervised organization of the low-level sensorimotor space into higher levels of abstraction for efficient and effective reasoning, planning, and communication. Generic unsupervised dimensionality reduction techniques, such as autoencoders or various clustering methods, process information independently of the tasks the robot may face in the future, and the representations they generate are not guaranteed to be suitable for upcoming tasks and problems. I will discuss how the principles of predictability and verifiability can be exploited to scaffold these unsupervised learning methods in generating progressively higher-level abstractions. In particular, I will summarize our previous experiments that exploited these principles in generating affordance and effect categories that the robot can perceive and reason about, discovering symbols and rules that are used for planning and plan execution, and partitioning the agent's sensorimotor space for learning and exploring different regions in intrinsically motivated systems.
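As a rough, hypothetical illustration of the predictability principle described above (not code from the talk), the Python sketch below first clusters observed action effects without supervision and then accepts the resulting effect categories only if they can be reliably predicted from the pre-action object features. The synthetic data, the classifier, and the acceptance threshold are all invented for the example.

# Minimal sketch of the "predictability" principle (hypothetical, not the speaker's code):
# effect categories found by unsupervised clustering are accepted only if they can be
# reliably predicted from the pre-action object features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
features = rng.normal(size=(200, 4))             # pre-action object features (synthetic)
effects = features @ rng.normal(size=(4, 2))     # observed action effects (synthetic)
effects += 0.1 * rng.normal(size=effects.shape)  # sensor noise

# 1) Unsupervised step: cluster raw effects into candidate effect categories.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(effects)

# 2) Predictability check: keep the categorization only if object features
#    predict the effect category well above chance.
score = cross_val_score(LogisticRegression(max_iter=1000), features, labels, cv=5).mean()
predictable = score > 0.8  # hypothetical acceptance threshold
print(f"category predictability: {score:.2f} -> {'accept' if predictable else 'refine'}")

If the check fails, the categorization would be refined (e.g., a different number of clusters or a richer feature set) rather than passed on to planning, which is the scaffolding role the abstract alludes to.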

Pablo Lanillos
Postdoctoral Researcher, Technical University of Munich

Title
Is this my body? Studying the sensorimotor self by replicating the rubber-hand illusion on a robot

Abstract
How do humans learn, adapt, and perceive their body as a unity during interaction with the environment? Can these mechanisms be applied to robots?
Under the umbrella of the SELFCEPTION project (www.selfception.eu), we are developing computational models for robotic self/other distinction in order to improve interaction under uncertainty in complex scenarios. This interdisciplinary project, which combines cognitive psychology and robotics, seeks (i) to enable robots to learn to recognize their own body while differentiating it from the rest of the environment, and (ii) to investigate the mechanisms of the minimal self in humans using synthetic models. In this talk, I will introduce the sensorimotor or bodily self, how to approach it from the point of view of probabilistic perception, and how we can systematically validate the proposed models through body illusions. I will mathematically describe body perception for robots and humans against the backdrop of predictive processing, one of the most promising theories under the Bayesian brain assumption. Using that framework, I will detail how we were able to replicate, for the first time, the rubber-hand illusion on a multisensory humanoid robot. Finally, I will discuss the challenges of developing a full-fledged artificial sensorimotor self.
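For readers unfamiliar with the predictive-processing framing mentioned above, a textbook-style sketch (not necessarily the exact formulation used in the talk) treats the belief \mu about the latent body state as being updated by precision-weighted prediction errors from each modality, here visual (v) and proprioceptive (p):

\[
F(\mu) = \tfrac{1}{2}\,(s_v - g_v(\mu))^\top \Sigma_v^{-1} (s_v - g_v(\mu))
       + \tfrac{1}{2}\,(s_p - g_p(\mu))^\top \Sigma_p^{-1} (s_p - g_p(\mu)),
\]
\[
\dot{\mu} \;\propto\; -\frac{\partial F}{\partial \mu}
 = \frac{\partial g_v}{\partial \mu}^{\top} \Sigma_v^{-1} \big(s_v - g_v(\mu)\big)
 + \frac{\partial g_p}{\partial \mu}^{\top} \Sigma_p^{-1} \big(s_p - g_p(\mu)\big).
\]

Here s_v and s_p are the visual and proprioceptive observations, g_v and g_p are forward (generative) models predicting those observations from the body state, and \Sigma_v, \Sigma_p are sensor noise covariances. In this reading, a rubber-hand-type illusion arises when a precise visual prediction error pulls the body-state estimate toward the seen (artificial) hand.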

Bio
Pablo Lanillos is a Marie Skłodowska-Curie postdoctoral researcher leading the MSCA EU H2020-funded project SELFCEPTION (www.selfception.eu) at the Institute of Cognitive Systems (ICS) of the Technical University of Munich (Germany), directed by Prof. Gordon Cheng. He holds an M.Sc. degree in computer science and a Ph.D. degree in robotics. Interests: embodied artificial intelligence, sensorimotor self, self/other distinction, multisensory models of attention, bio-inspired computational modelling, and decision making.
Personal website: www.therobotdecision.com

Eli Sheppard
PhD Student, Heriot-Watt University

Title
Multimodal Representation Learning

Abstract
In this talk I will discuss how multimodal convolutional autoencoders can be used to learn a joint embedding of different sensory percepts. The joint embedding can be used to recover missing modalities as well as to explore the interactions between modalities. Through this exploration, we will see how the learning of different percepts can be decomposed into high-level attributes (shape, color, size, etc.) and how the use of multimodality facilitates easy manipulation of the latent space to generate novel combinations of attributes.
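As a minimal, hypothetical sketch of the joint-embedding idea above (a fully-connected PyTorch stand-in for the convolutional architecture discussed in the talk, trained on synthetic data), two modality-specific encoders map into a shared latent code, both modalities are reconstructed from it, and a missing modality can be recovered by encoding the available one alone:

# Hypothetical two-modality joint-embedding autoencoder (illustrative only).
import torch
import torch.nn as nn

class MultimodalAE(nn.Module):
    def __init__(self, dim_a=32, dim_b=16, dim_z=8):
        super().__init__()
        self.enc_a = nn.Sequential(nn.Linear(dim_a, 64), nn.ReLU(), nn.Linear(64, dim_z))
        self.enc_b = nn.Sequential(nn.Linear(dim_b, 64), nn.ReLU(), nn.Linear(64, dim_z))
        self.dec_a = nn.Sequential(nn.Linear(dim_z, 64), nn.ReLU(), nn.Linear(64, dim_a))
        self.dec_b = nn.Sequential(nn.Linear(dim_z, 64), nn.ReLU(), nn.Linear(64, dim_b))

    def forward(self, x_a, x_b):
        # Fuse the two percepts into one joint embedding (here: mean of the two codes).
        z = 0.5 * (self.enc_a(x_a) + self.enc_b(x_b))
        return self.dec_a(z), self.dec_b(z), z

model = MultimodalAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_a, x_b = torch.randn(128, 32), torch.randn(128, 16)  # synthetic percepts

for _ in range(100):  # tiny training loop: reconstruct both modalities from the joint code
    rec_a, rec_b, _ = model(x_a, x_b)
    loss = nn.functional.mse_loss(rec_a, x_a) + nn.functional.mse_loss(rec_b, x_b)
    opt.zero_grad(); loss.backward(); opt.step()

# Recovering a missing modality: encode modality A alone, decode modality B from it.
with torch.no_grad():
    x_b_hat = model.dec_b(model.enc_a(x_a))

In practice, randomly dropping one modality during training (modality dropout) helps the single-modality encodings align with the joint code; that detail is omitted here for brevity.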

Program

13:30-14:10 “Principles of predictability and verifiability in developmental systems”
Emre Ugur
14:10-14:50 “Human Perspective on Dynamical Contacts in Cognitive Humanoids”
Jan Babic
14:50-15:30 “Human Adaptation to Human-Robot Shared Control”
Erhan Oztop
15:30-15:45 Break
15:45-16:25 “Is this my body? Studying the sensorimotor self by replicating the rubber-hand illusion on a robot”
Pablo Lanillos
16:25-17:05 “Multimodal Representation Learning”
Eli Sheppard

Sponsor

  • CREST “Cognitive Mirroring: Assisting people with developmental disorders by means of self-understanding and social sharing of cognitive processes”

Co-sponsor

  • Center for Information and Neural Networks, National Institute of Information and Communications Technology