September 2, 2020 
16:30 – 20:00 (KST)

Naples, Italy (Virtual Conference)

The 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN 2020) invites papers for a Workshop on Social Human-Robot Interaction of Human-care Service Robots.

Service robots with social intelligence are coming to the human world, and they will help us make our lives better. We are organizing an exciting workshop at RO-MAN 2020 oriented towards sharing the ideas of participants with diverse backgrounds, ranging from Human-Robot Interaction design and social intelligence to decision making, social psychology, and robotic skills. The purpose of this workshop is to explore how social robots can interact with humans socially and to facilitate the integration of social robots into our daily lives.

This workshop focuses on three social aspects of human-robot interaction: (1) technical implementation of social robots and products, (2) form, function and behavior, and (3) human behavior and expectations as a means to understand the social aspects of interacting with these robots and products.

This workshop will focus on current advances in the areas of social Human-Robot Interaction, social intelligence, social skills, and their applications, including clinical evaluations and user studies exploring consumer acceptance of social robots. Papers are solicited on all areas directly related to these topics, including but not limited to:

  • Social perception and context awareness
  • Short/long-term behavior recognition
  • Social expression and interactive behavior
  • Social task modelling and management
  • Social grasping and navigation skills
  • Social humanoid robot design
  • Human-robot interaction design
  • Emotion recognition and model design
  • Dialogue based interaction
  • User evaluation
  • Applications in areas such as healthcare, reception, and education

Authors should prepare their papers based on the template provided by IEEE (LaTeX or Microsoft Word). All papers must be written in English and submitted electronically as an A4 PDF.

Submission Instructions

We invite short papers of 2-4 pages in the standard double-column IEEE RO-MAN conference format. Submissions should be in accordance with the topics of interest, and accepted papers will be available on the workshop’s website.

Email your papers to Ho Seok Ahn ( by the submission deadline. You can download the CFP here.


Presentation Instructions

Each technical presentation is 7 minutes, including 2 minutes of Q&A.

Each invited talk is 25 minutes, including 5 minutes of Q&A.

As RO-MAN 2020 is a virtual conference, sessions will be held on the conference platform. Please remember that attendance at the workshops, as well as at the main conference, is free, but to get access credentials you have to register using the form available at


Important Dates

  • Paper submission: August 15, 2020
  • Notification of acceptance: August 20, 2020
  • Final camera-ready manuscript: August 25, 2020
  • Final video/slide file: August 25, 2020

Workshop Program (KST)

  • 16:30 – 16:40 Greeting & Opening (Chair: Dr. Minsu Jang)
  • 16:40 – 17:05 Invited Talk I: Minas Liarokapis, University of Auckland, New Zealand (Chair: Prof. Ho Seok Ahn)
  • 17:05 – 17:30 Invited Talk II: Jung Kim, KAIST, Korea (Chair: Dr. Jongsuk Choi)
  • 17:30 – 17:55 Invited Talk III: Mary Ellen Foster, The University of Glasgow, UK (Chair: Dr. Minsu Jang)
  • 17:55 – 18:10 Break
  • 18:10 – 19:45 Technical Session (in order of the technical papers) (Chair: Prof. Ho Seok Ahn)
  • 19:45 – 20:00 Closing (Chair: Dr. Jongsuk Choi)

Invited Speakers 

  • Minas Liarokapis, University of Auckland, New Zealand, “Increasing the Dexterity of Humans and Robots: From Robust Grasping and Dexterous Manipulation to Haptic Object Identification”
    Abstract: The human hand is Nature’s most versatile and dexterous end-effector. Its sensory and motor capabilities endow it with unique properties and a degree of specialization that is not evident in other human body parts. The Greek philosophers Anaxagoras and Aristotle debated in their works whether the dexterity of the human hand helped humans develop a superior brain or whether a superior brain increased the dexterity of the human hand. This mechanical dexterity is nowadays recognized as a key characteristic that facilitated the development of a superior brain in Homo sapiens. For this reason, the human hand has always been a source of inspiration for roboticists, who are still struggling to equip robots with humanlike grasping and manipulation capabilities. Recently, a new class of adaptive, underactuated, and compliant robot mechanisms has been introduced. The reduced number of actuators and the structural compliance of these mechanisms tend to significantly simplify the grasping and manipulation problems, relaxing the computational complexity and the required control effort. Adaptive robot mechanisms provide a robust and affordable alternative to the sophisticated, expensive, fully-actuated devices that are typically considered for complex tasks. This talk will focus on how adaptive robot hardware can increase human and robot dexterity, facilitating robust grasping, dexterous manipulation, and haptic object identification.


  • Jung Kim, KAIST, Korea, “Tactile sensing and sensors for human robot interactions”
    Abstract: Although human touch contributes significantly to various interactions between humans and robots, there are still many challenges to be addressed compared to the vision and hearing modalities. One of these is the challenge of developing reliable tactile sensing structures, including soft skin with the four types of embedded mechanoreceptors found in its human counterpart. In this talk, I will present two current works from the Biorobotics Lab at KAIST. Firstly, a large-area robotic skin will be presented. The skin mainly detects dynamic tactile stimuli and interprets the signals as tactile cues for nonverbal sensory communication. This large tactile sensor covers the robot’s passive body parts using a few sparsely distributed microphones, covering a large area in a cost-efficient manner, combined with an FPGA system for computational efficiency and high communication bandwidth. Secondly, the use of tactile sensors for human-like grasping will be presented. The tactile sensor detects high-frequency and low-frequency tactile signals separately, similar to the roles of the mechanoreceptors. The tactile signals at different grasping state transitions are memorized as predicted tactile events and used as triggers in our tactile-event-based grasping. This method allows for grasping objects that are mislocalized from their expected positions. The simulation and experimental methods and results will be presented in the talk.


  • Mary Ellen Foster, The University of Glasgow, UK, “MuMMER: Socially Intelligent Human-Robot Interaction in Public Spaces”
    Abstract: In the EU-funded MuMMER project, we have developed a social robot designed to interact naturally and flexibly with users in a public space. The robot system encompasses state-of-the-art components for audio-visual sensing, social signal processing, conversational interaction, perspective taking, geometric reasoning, and motion planning. The final MuMMER robot system was deployed in a shopping mall in Finland for 14 weeks, where it interacted with a wide range of customers. In this talk, I will describe the components of the MuMMER system and the supported robot behaviours and scenarios. I will also present the details, results, and lessons learned from the final long-term robot deployment.

Technical papers

  • Elahe Bagheri, Oliver Roesler and Bram Vanderborght, Universiteit Brussel, Belgium, “Toward a Reinforcement Learning Based Framework for Learning Cognitive Empathy in Human-Robot Interactions” [Paper] [Slide]
  • Woo-Ri Ko, Minsu Jang, Jaeyeon Lee, and Jaehong Kim, ETRI, Korea, “Adaptive Behavior Generation of Social Robots Based on User Behavior Recognition” [Paper] [Slide]
  • Dohyung Kim, Jinhyeok Jang, Minsu Jang, Jaeyeon Lee, and Jaehong Kim, ETRI, Korea, “3D Daily Activity Recognition Dataset for Elderly-care Robots” [Paper] [Slide]
  • Chankyu Park, Minsu Jang, Jaeyeon Lee, and Jaehong Kim, ETRI, Korea, “Deep Multi Class-wise Clothing Attributes Recognition for the Elderly Care Robot Environment” [Paper] [Slide]
  • Jaeyeon Lee, Dohyung Kim, Minsu Jang, and Jaehong Kim, ETRI, Korea, “Fashion Small Talk: Generating Friendly Comments on the Attire of the Interacting Person by using Image Captioning Technology” [Paper] [Slide]
  • Bok Cha, and Joo-Haeng Lee, ETRI, Korea, “Initial Design on VR-based Data Acquisition Environments for Human-Robot Interaction: Scenarios and Functional Requirements” [Paper] [Slide]
  • Ung Park, Eui Jun Hwang, and JongSuk Choi, KIST, Korea, “Automatic Generation of Eye Expressions with End-to-End Learning” [Paper] [Slide]
  • Jeongho Lee, Myoungha Song, SeungHyun Kang, Changjoo Nam, Changhwan Kim, and Dong Hwan Kim, KIST, Korea, “Human-like Object Manipulation Based on Object Affordance Detection and 3D Shape Analysis for Social Robot” [Paper] [Slide]
  • Hanna Lee, Seo-young Lee, Jongsuk Choi, Jee Eun Sung, Heuiseok Lim and Yoonseob Lim, KIST, Korea, “Analyzing the Rules of Social Dialogue and Building a Social Dialogue Model in Human Robot Interaction” [Paper] [Slide]
  • Adriana Lorena González, Denise Y. Geiskkovitch, and James E. Young, University of Manitoba, “When Can I Get a Robot for my Home?: A Constrained Design Approach to Feasible, Deployable Companion Robots” [Paper] [Slide]
  • Rahatul Amin Ananto, and James E. Young, University of Manitoba, “Robot Pets for Everyone: The Untapped Potential for Domestic Social Robots” [Paper] [Slide]
  • Deborah Johanson, Jong Yoon Lim, and Ho Seok Ahn, The University of Auckland, New Zealand, “Evaluation about forward lean, self-disclosure, and voice pitch changes” [Paper] [Slide]
  • Jingwen Mao, and Ho Seok Ahn, The University of Auckland, New Zealand, “How to measure personality traits of characters?” [Paper] [Slide]

Workshop Organizers

  • Ho Seok Ahn, University of Auckland, New Zealand (hs.ahn[at]
  • Jongsuk Choi, KIST, Korea (pristine70[at]
  • Minsu Jang, ETRI, Korea (minsu[at]
  • Yoonseob Lim, KIST, Korea (yslim[at]
  • Hyungpil Moon, SungKyunKwan University, Korea (hyungpil[at]
  • Sonya S. Kwak, KIST, Korea (