ICSR 2017 WORKSHOP ON SOCIAL ROBOT INTELLIGENCE FOR SOCIAL HUMAN-ROBOT INTERACTION OF SERVICE ROBOTS

November 22, 2017

Room 401, International Congress Center Epochal Tsukuba, Tsukuba, Japan

The 2017 International Conference on Social Robotics (ICSR 2017) invites papers for a Workshop on Social Robot Intelligence for Social Human-Robot Interaction of Service Robots.

Service robots with social intelligence are coming into the human world, and they will help make our lives better. We are organizing an exciting workshop at ICSR oriented towards sharing ideas among participants with diverse backgrounds, ranging from robotics, machine learning, and computer vision to social psychology and Human-Robot Interaction design. The purpose of this workshop is to explore how social robots can interact with humans socially, and to facilitate the integration of social robots into everyday life.

This workshop will focus on the current advances in the area of social Human-Robot Interaction, social intelligence, social skills, and their applications including clinical evaluations. Papers are solicited on all areas directly related to these topics, including but not limited to:

  • Social perception and context awareness
  • Short/long-term behavior recognition
  • Social expression and interactive behavior
  • Social task modeling and management
  • Ontology-based decision making
  • User evaluation
  • Emotion recognition
  • Emotion models
  • Dialogue-based interaction
  • Script language design
  • Human-robot interaction design
  • Healthcare applications
  • Receptionist applications
  • Education applications

Prospective authors are invited to submit short papers (1-2 pages) in the ICSR 2017 Workshop on Social Human-Robot Interaction of Service Robots format by the paper submission deadline, and the slide file (ppt or pdf) by the slide submission deadline. You can download the workshop template here: [ICSR_Workshop_Form]

Please submit your paper and slides to Dr. Ho Seok Ahn (hs.ahn@auckland.ac.nz) with the subject line format: [ICSR2017 Workshop] Author_Title

Important Dates

  • Paper submission: October 30, 2017
  • Notification of acceptance: November 10, 2017
  • Slide (ppt or pdf) submission: November 20, 2017 (please re-submit your slides on the workshop day if you update them after submission)
  • Workshop: November 22, 2017

Workshop Organizers

  • Jongsuk Choi (KIST, Korea)
  • Ho Seok Ahn (The University of Auckland, New Zealand)
  • Ho-Sub Yoon (ETRI, Korea)
  • Yong-Suk Choi (Hanyang University, Korea)
  • Jaeho Lee (University of Seoul, Korea)
  • Sonya S. Kwak (Ewha Womans University, Korea)
  • Yoonseob Lim (KIST, Korea)

Invited Speakers

  • John-John Cabibihan (Qatar University, Qatar)
  • Yunkyung Kim (NASA, USA)
  • Amit Kumar Pandey (SoftBank Robotics)
  • Roy Amir (Intuition Robotics)

Presentations 

  • Jongsuk Choi, Yoonseob Lim, “Introduction to Social Human-Robot Interaction and the Related Korean Project” (KIST) – Short talk / Submission / Slide
    Robots play important roles in our daily lives. Some robots are designed to lift and assemble heavy machine parts, while others are closely tied to daily human jobs such as cleaning floors and guiding customers in shops. However, most robots follow pre-designed interaction rules and cannot behave adaptively under dynamic human-robot interaction situations. Recently, we have proposed a new project whose goal is to build robot intelligence for social human-robot interaction under diverse interaction conditions. In this presentation, we outline the overall architecture of the system and possible interaction scenarios with the proposed system.
  • Minsu Jang, Hosub Yoon, Jaehong Kim, “Developing a software framework for social robots: Some issues and experiences” (ETRI) / Submission / Slide
    We succinctly introduce some issues we experienced while designing and experimenting with a software framework for social robots. Our framework, inspired by Aldebaran’s NAOqi framework, provides a chatbot-oriented interaction system interfaced with multimodal event handling based on rich perception and behavior control features. Building simple interaction scenarios was easy in our framework, but enriching them with more complex and nuanced situations was very hard due to technical issues such as a lack of perceptual continuity, false positives in perception, crisp decisions, and the difficulty of implementing multiparty interaction (see the first sketch after this list). These issues can downgrade the overall quality of social robot experiences and make engaging people in long-term repetitive interactions very difficult.
  • Hwawoo Jeon, Jongsuk Choi, Yoonseob Lim, “Ontology for caring dementia patient” (KIST) / Submission / Slide
    In this paper, we have designed a knowledge base for caring for dementia patients. The proposed knowledge system contains an ontology that describes knowledge about the dementia patient, the indoor environment, and various interaction situations between human and robot. By implementing the ontology with additional reasoning rules programmed in Prolog, we have shown the feasibility of the proposed knowledge base under simple human-robot interaction scenarios. For example, the robot can find an alternative way of drinking water, such as using a straw, if the patient cannot drink water directly from a cup, or it can help the patient find a straw based on the spatial relation between the patient and the object (see the second sketch after this list). We are going to expand the knowledge on robots and human-robot interactions, and test the validity of the proposed knowledge with help from experts in dementia care facilities in Korea.
  • Geon Kim, Doo Soo Chang, Yong Suk Choi, “Ontology based Human-Robot Interaction Knowledge for Intelligent Services” (Hanyang University) / Submission / Slide
    Most existing robot knowledge is domain-specific and not suitable for personalized services, because it focuses on performing specific service tasks and lacks knowledge descriptions of users. To overcome these limitations, we propose an ontology-based human-robot interaction knowledge base categorized into five ontology models, chosen to avoid service dependency and to maintain extensibility: user, robot, perception, environment, and action. In addition, we develop a knowledge management system to manage the extension and inference of the knowledge.
  • Seonghark Hong, Byunggi Choi, Jaeho Lee, “Ontology-based Service Model for Social Robot Service Description” (University of Seoul) / Submission / Slide
    This paper presents an ontology-based service model for describing the social human-robot interactions of service robots. The ontological description allows us to develop an open and extensible service model suitable for building social robots in a multi-agent framework. We use the Gaia Methodology as the foundation of the model to augment social human-robot interaction. Experimental results show that our ontology-based model can be extended towards an integration framework that lets social robots utilize existing robotic systems.
  • Ho-Sub Yoon, Jaeyoon Jang, Jaehong Kim, “The build of new short-term social action DB using RGB-D cameras” (ETRI) / Submission / Slide
    Many action DBs that include 3D depth information exist for testing and verifying action recognition technology. However, databases focused on social behavior recognition, between people or between people and robots, are rare. In this paper we aim to construct a new action database specialized for these social behaviors. To do this, we define various short-term social behaviors in groups according to their meaning, and describe the surrounding environment for acquiring them. Also, to support a future recognition system based on deep learning algorithms, we describe a DB generation method that considers various subject ages and camera angles in order to acquire a large-capacity database. Finally, the short-term social action DB proposed in this paper will play an important role in recognizing and understanding social interaction between humans and robots.
  • SunKyoung Kim, Dahyun Kang, Go Woon Kim, Sonya S. Kwak, “Verbal and Nonverbal Greetings as a Unit of Social Interaction between Human and Robot” (Ewha Womans University) / Submission / Slide
    This investigation considers verbal and nonverbal greetings as a unit of social interaction between human and robot. The social interaction process is divided into three stages: initiation, continuation, and termination (see the third sketch after this list). In this research, we explore each stage of social interaction in the context of sharing greetings. The greeting process continues or ends depending on the level of interest expressed by the user. The challenge of this research is to find efficient verbal and nonverbal greeting expressions for each stage in order to design social human-robot interaction.
  • Byeong-Kyu Ahn, Ho Seok Ahn, Jong Yoon Lim, Elizabeth Broadbent, Bruce MacDonald, “A multi-modal behavior generator for social robots” (The University of Auckland) / Submission / Slide
    This paper presents a behavior generator for social robots that produces multi-modal expressions. Generating natural expressions is not easy, as social robots must coordinate different body parts and skills when interacting with humans. Therefore, we suggest common elements required for interaction with humans, as well as an abstracted command interface for each element (see the fourth sketch after this list). We develop the suggested engines and apply them to two different robot platforms: EveR and Silbot3.
  • Hyunwook Bin, Yoonseob Lim, Jongsuk Choi, “Automated psychophysical personality data acquisition system for human-robot interaction” (KIST) / Submission / Slide
    This paper introduces an experimental method for collecting various types of non-verbal cues for personality recognition that could be used for reliable human-robot interaction. The proposed system records raw video, audio, and physiological signals through 9 different interaction scenarios. Each scenario is presented through a customized program built on PsychoPy, which is freely available for psychology studies, while ROS is used to control the robot and sensing hardware such as cameras and microphones (see the fifth sketch after this list). All the acquired data are automatically organized by modality to find possible correlations between different non-verbal cues and human personality scores from the Big Five personality model. So far, 10 people have participated in the experiment, and responses from more participants will be added to the dataset. In the future, we are going to analyze the obtained dataset to find correlations between non-verbal human behavior and human personalities.
  • YuJung Chae, HunSeob Sin, ChangHwan Kim, Sung-Kee Park, “Phrase-based Gesture Type Inference and Gesture Generation for a robot” (KIST) / Submission / Slide
    Robot gestures enhance a user’s comprehension, help to recall things from memory, and reduce confusion in communication when a robot and a human work together. In addition, robot gestures give users a sense of familiarity with and likability of a robot. Therefore, generating robot gestures is essential for interacting socially with users in Human-Robot Interaction. In this regard, there are two important issues. First, the appropriate start and end timing of a gesture must be recognized in order to generate natural robot gestures and clearly emphasize the important parts of the spoken dialogue. Second, gesture types should be considered, since the outcome of the interaction between a human and a robot differs according to gesture type. This paper introduces a new methodology for generating co-speech gestures suited to the spoken dialogue of a robot, for the sake of a user’s comprehension. Our method automatically adjusts a gesture to the play time of the phrase and infers gesture types with an ensemble learning method using grammatical information such as part-of-speech tags, constituents, and morphemes (see the last sketch after this list). As a result, we verified good accuracy for the recognition of phrases and the inference of gesture types, and received a positive evaluation in user tests.
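To make the perceptual-continuity and false-positive issues from the ETRI framework talk concrete, here is a minimal, illustrative Python sketch (not the authors' code) of one common mitigation: smoothing a noisy boolean percept with a sliding-window vote plus hysteresis, so a single dropped or spurious detection does not flip the interaction state. The window size and thresholds are arbitrary choices for illustration.

```python
from collections import deque

class SmoothedPercept:
    """Debounce a noisy boolean percept (e.g. 'face visible') with a
    sliding-window majority vote and hysteresis."""

    def __init__(self, window=10, on_ratio=0.7, off_ratio=0.3):
        self.history = deque(maxlen=window)
        self.on_ratio = on_ratio    # fraction of positives needed to switch on
        self.off_ratio = off_ratio  # fraction below which we switch off
        self.state = False

    def update(self, detected: bool) -> bool:
        self.history.append(1 if detected else 0)
        ratio = sum(self.history) / len(self.history)
        # Hysteresis: different thresholds for switching on and off,
        # so the state does not chatter around a single boundary.
        if not self.state and ratio >= self.on_ratio:
            self.state = True
        elif self.state and ratio <= self.off_ratio:
            self.state = False
        return self.state

# Example: a raw detector that flickers still yields a stable state.
percept = SmoothedPercept()
for frame in [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]:
    stable = percept.update(bool(frame))  # stays True once switched on
```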
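The dementia-care talk describes Prolog reasoning rules over an ontology, e.g. offering a straw when the patient cannot drink directly from a cup. The following Python sketch paraphrases that example rule over hypothetical facts; the actual system encodes such knowledge in an ontology and derives conclusions with Prolog.

```python
# Hypothetical facts; the real system stores these in an ontology
# and queries them through Prolog rules.
facts = {
    ("patient", "can_drink_from", "cup"): False,      # cannot use a cup directly
    ("straw", "located_in", "kitchen_drawer"): True,  # spatial relation to an object
}

def drinking_assistance() -> str:
    """Suggest an alternative drinking aid when direct drinking fails,
    using the known location of a helpful object."""
    if facts[("patient", "can_drink_from", "cup")]:
        return "hand the cup to the patient"
    # Alternative: find a straw and report where it is located.
    for (subj, rel, obj), holds in facts.items():
        if subj == "straw" and rel == "located_in" and holds:
            return f"offer a straw from the {obj}"
    return "ask a caregiver for help"

print(drinking_assistance())  # -> offer a straw from the kitchen_drawer
```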
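The Ewha talk divides greeting interactions into initiation, continuation, and termination, driven by the user's expressed interest. A minimal state-machine sketch of that three-stage flow follows; the 0-1 interest score and its 0.5 threshold are illustrative assumptions, since the abstract leaves the concrete measure open.

```python
from enum import Enum, auto

class GreetingStage(Enum):
    INITIATION = auto()
    CONTINUATION = auto()
    TERMINATION = auto()

def next_stage(stage: GreetingStage, interest: float) -> GreetingStage:
    """Continue the greeting while the (hypothetical) interest score
    stays high; otherwise terminate the exchange."""
    if stage is GreetingStage.TERMINATION:
        return GreetingStage.TERMINATION
    return GreetingStage.CONTINUATION if interest > 0.5 else GreetingStage.TERMINATION

# Example: an initially interested user who then disengages.
stage = GreetingStage.INITIATION
for interest in [0.8, 0.7, 0.3]:
    stage = next_stage(stage, interest)  # CONTINUATION, CONTINUATION, TERMINATION
```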
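The Auckland talk's abstracted command interface can be pictured as a platform-independent behavior command translated per robot. The sketch below is only a guess at the general shape, not the actual EveR or Silbot3 API: one abstract interface, two hypothetical platform back-ends.

```python
from abc import ABC, abstractmethod

class BehaviorBackend(ABC):
    """Abstracted command interface: the behavior generator emits
    platform-independent commands; each robot maps them to its own API."""

    @abstractmethod
    def speak(self, text: str) -> None: ...

    @abstractmethod
    def gesture(self, name: str) -> None: ...

class EveRBackend(BehaviorBackend):       # hypothetical stand-in
    def speak(self, text): print(f"[EveR TTS] {text}")
    def gesture(self, name): print(f"[EveR motion] {name}")

class Silbot3Backend(BehaviorBackend):    # hypothetical stand-in
    def speak(self, text): print(f"[Silbot3 TTS] {text}")
    def gesture(self, name): print(f"[Silbot3 motion] {name}")

def greet(robot: BehaviorBackend):
    # The same multi-modal behavior runs on either platform.
    robot.gesture("wave")
    robot.speak("Hello, nice to meet you!")

greet(EveRBackend())
greet(Silbot3Backend())
```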
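The KIST data-acquisition talk combines PsychoPy for presenting scenarios with ROS for recording sensors. A minimal sketch of that pairing is below; the topic name, message type, node name, and prompt text are assumptions, and the real rig records many more modalities than a single camera.

```python
import rospy
from sensor_msgs.msg import Image   # assumed camera topic type
from psychopy import visual, core

def on_frame(msg):
    # In the real system frames would be written to disk, tagged with
    # the current scenario; here we only log their size.
    rospy.loginfo("got frame %dx%d", msg.width, msg.height)

rospy.init_node("personality_recorder")
rospy.Subscriber("/camera/image_raw", Image, on_frame)

# Present one (hypothetical) scenario prompt while ROS callbacks record.
win = visual.Window(fullscr=False)
prompt = visual.TextStim(win, text="Please describe your last weekend.")
prompt.draw()
win.flip()
core.wait(5.0)  # record for a fixed interval; callbacks run in background
win.close()
```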
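Finally, the phrase-based gesture talk infers gesture types from grammatical features with ensemble learning. The sketch below shows the general idea with a bag-of-POS-tags feature and a random forest from scikit-learn; the tags, labels, and toy data are invented for illustration and are not the paper's dataset or exact feature set (which also includes constituents and morphemes).

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

# Toy phrases represented by POS-tag sequences, with invented
# gesture-type labels.
pos_sequences = ["PRP VBP DT NN", "VB RB", "NNP VBZ JJ NN", "VB PRP DT"]
gesture_types = ["iconic", "beat", "deictic", "beat"]

model = make_pipeline(
    CountVectorizer(token_pattern=r"\S+"),                      # bag of POS tags
    RandomForestClassifier(n_estimators=100, random_state=0),   # ensemble learner
)
model.fit(pos_sequences, gesture_types)
print(model.predict(["VB PRP RB"]))  # infer a gesture type for a new phrase
```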

Workshop Program

       9:00 – 9:30    Registration & Greeting
       9:30 – 9:40    Opening
       9:40 – 11:00   Invited Talk I: Research Group
       11:00 – 11:10  Break
       11:10 – 12:40  Session I: Framework and Ontology
       12:40 – 14:00  Lunch
       14:00 – 15:40  Session II: Social Aspects and Behavior
       15:40 – 16:00  Break
       16:00 – 16:50  Invited Talk II: Industry Group
       16:50 – 17:10  Final Remarks
       17:10 – 17:20  Closing

       Opening / Closing (Jongsuk Choi)

      Invited Talk I: Research Group (Chair: Ho Seok Ahn)
       9:40 – 10:20 John-John Cabibihan (Qatar University, Qatar)
       10:20 – 11:00 Yunkyung Kim (NASA, USA)

      Invited Talk II: Industry Group (Chair: Ho Seok Ahn)
       16:00 – 16:30 Amit Kumar Pandey (SoftBank Robotics)
       16:30 – 16:50 Roy Amir (Intuition Robotics)

      Session I: Framework and Ontology (Chairs: Yong Suk Choi / Yoonseob Lim)
       11:10 – 11:20 Jongsuk Choi, Yoonseob Lim, “Introduction to Social Human-Robot Interaction and the Related Korean Project” (KIST)
       11:20 – 11:40 Minsu Jang, Hosub Yoon, Jaehong Kim, “Developing a software framework for social robots: Some issues and experiences” (ETRI)
       11:40 – 12:00 Hwawoo Jeon, Jongsuk Choi, Yoonseob Lim, “Ontology for caring dementia patient” (KIST)
       12:00 – 12:20 Geon Kim, Doo Soo Chang, Yong Suk Choi, “Ontology based Human-Robot Interaction Knowledge for Intelligent Services” (Hanyang University)
       12:20 – 12:40 Seonghark Hong, Byunggi Choi, Jaeho Lee, “Ontology-based Service Model for Social Robot Service Description” (University of Seoul)

      Session II: Social Aspects and Behavior (Chairs: Jongsuk Choi / Hosub Yoon)
       14:00 – 14:20 Ho-Sub Yoon, Jaeyoon Jang, Jaehong Kim, “The build of new short-term social action DB using RGB-D cameras” (ETRI)
       14:20 – 14:40 SunKyoung Kim, Dahyun Kang, Go Woon Kim, Sonya S. Kwak, “Verbal and Nonverbal Greetings as a Unit of Social Interaction between Human and Robot” (Ewha Womans University)
       14:40 – 15:00 Byeong-Kyu Ahn, Ho Seok Ahn, Jong Yoon Lim, Elizabeth Broadbent, Bruce MacDonald, “A multi-modal behavior generator for social robots” (The University of Auckland)
       15:00 – 15:20 Hyunwook Bin, Yoonseob Lim, Jongsuk Choi, “Automated psychophysical personality data acquisition system for human-robot interaction” (KIST)
       15:20 – 15:40 YuJung Chae, HunSeob Sin, ChangHwan Kim, Sung-Kee Park, “Phrase-based Gesture Type Inference and Gesture Generation for a robot” (KIST)

       Final Remarks (Chair: Yoonseob Lim)

Contact Information

  • Jongsuk Choi (KIST, pristine70[at]gmail.com)
  • Ho Seok Ahn (The University of Auckland, hs.ahn[at]auckland.ac.nz)