3W for HRI
Project Outline
- Research Period: 2012.6 ~ 2017.5
- Funded by the Ministry of Trade, Industry and Energy (Grant No: 10041629)
- Members: Sung-Kee Park, Ph.D., Yoonseob Lim, Ph.D., Sang-Seok Yun, Ph.D., Junghoon Kim, M.S., Hoang Minh Do (UST), Donghui Song (UST), Hyeonuk Bhin (UST), Gyeore Lee (UST)
Introduction and Research Targets
1. This project aims to implement technologies for identifying people (WHO), recognizing their behavior (WHAT), and localizing them (WHERE) by fusing data from a perception sensor network.
2. In this project, we develop a robot-assisted management system for promptly coping with abnormal events in classroom environments.
- Reliably detect the occurrence of human-caused emergency situations via audio-visual perception modules
- Send an urgent SMS to notify a remote user that an emergency has occurred, and relay spot information to them
- Perform an immediate reaction to the event on behalf of the remote user, using robot navigation and interaction technologies (see the sketch below)
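The intended control flow can be summarized as: detect an abnormal event with the audio-visual perception modules, notify a remote user by SMS with spot information, then dispatch the robot to the spot. The minimal sketch below illustrates this loop; the perception, SMS-gateway, and robot interfaces are hypothetical placeholders, not the project's actual APIs.

<syntaxhighlight lang="python">
# Hypothetical control loop tying the three targets together: audio-visual
# detection of an abnormal event, SMS notification of a remote user, and
# dispatch of the robot to the spot. All module interfaces are placeholders.
import time


def handle_emergencies(perception, sms_gateway, robot, remote_user_number):
    """Poll the perception network and react to human-caused emergencies."""
    while True:
        event = perception.poll_event()          # e.g. a fall, fight, or scream
        if event is not None and event.is_emergency:
            # 1) Notify the remote user and relay spot information.
            sms_gateway.send(
                to=remote_user_number,
                text=f"Emergency '{event.label}' detected at {event.location}",
            )
            # 2) Drive the robot to the event location and start interacting.
            robot.navigate_to(event.location)
            robot.start_interaction(event.person_id)
        time.sleep(0.1)                          # avoid busy-waiting
</syntaxhighlight>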
Results
Developing Core Technologies
DETECTION
- WHERE: Human detection and localization
- WHO: Face recognition and ID tracking
- WHAT: Recognition of individual and group behavior
- 3W data association on the perception sensor network
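As an illustration of the 3W data association step, the sketch below attaches WHO (identity) and WHAT (behavior) observations to the nearest WHERE track by floor position, with a simple gating distance. This gated nearest-neighbor scheme is an assumption made for illustration; the association method actually used on the project's sensor network may differ.

<syntaxhighlight lang="python">
# A minimal sketch of 3W data association, assuming every perception module
# stamps its observations with a 2-D floor position: WHERE provides person
# tracks, WHO provides identities, WHAT provides behavior labels.
import math
from dataclasses import dataclass


@dataclass
class Track:
    track_id: int
    position: tuple             # (x, y) in metres, from the WHERE module
    identity: str = "unknown"   # filled in from WHO observations
    behavior: str = "unknown"   # filled in from WHAT observations


def associate(tracks, observations, gate=1.0):
    """Attach WHO/WHAT observations to the spatially closest WHERE track."""
    for obs in observations:    # obs = {"kind": "who"|"what", "label": ..., "position": (x, y)}
        best, best_dist = None, gate
        for track in tracks:
            dist = math.dist(track.position, obs["position"])
            if dist < best_dist:
                best, best_dist = track, dist
        if best is None:
            continue            # no track within the gate; discard the observation
        if obs["kind"] == "who":
            best.identity = obs["label"]
        elif obs["kind"] == "what":
            best.behavior = obs["label"]
    return tracks
</syntaxhighlight>

A probabilistic association (for example, per-track likelihoods over identities and behaviors) would be a natural refinement, but the gated nearest-neighbor rule already captures the idea of fusing the three channels in a common spatial frame.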
AUTOMATIC SURVEILLANCE
- Automatic message transmission for human-caused emergencies
- Analysis of student attitude
- Remote monitoring via Web technologies and stream server
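For the remote-monitoring item, one common way to expose camera streams over the Web is an MJPEG endpoint. The sketch below uses Flask and OpenCV; the page only mentions "Web technologies and stream server", so this particular stack is an assumption, not the system the project actually deployed.

<syntaxhighlight lang="python">
# A minimal MJPEG streaming server for remote monitoring, assuming Flask and
# OpenCV. Each camera frame is JPEG-encoded and pushed to the browser as a
# multipart/x-mixed-replace stream.
import cv2
from flask import Flask, Response

app = Flask(__name__)
camera = cv2.VideoCapture(0)   # first camera of the surveillance network


def mjpeg_frames():
    """Yield JPEG-encoded frames in multipart/x-mixed-replace format."""
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")


@app.route("/monitor")
def monitor():
    return Response(mjpeg_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
</syntaxhighlight>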
ROBOT REACTION
- Human-friendly robot behavior
- Gaze control and robot navigation
- Human following with recovery mechanism
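The "human following with recovery mechanism" item can be read as a small state machine: follow the person while the tracker holds the target, and when the target is lost, revisit the last known position and rotate in place to search until the track is reacquired. The sketch below encodes that logic; the robot and tracker interfaces are hypothetical placeholders.

<syntaxhighlight lang="python">
# A minimal state-machine sketch of human following with recovery, assuming
# a person tracker that returns the current target (or None when lost) and a
# robot with basic navigation primitives.
import time

FOLLOW, RECOVER = "follow", "recover"


def follow_with_recovery(robot, tracker, lost_timeout=2.0):
    state = FOLLOW
    last_seen_pos, last_seen_time = None, None
    while True:
        person = tracker.get_target()            # None when the person is lost
        now = time.time()
        if person is not None:
            state = FOLLOW
            last_seen_pos, last_seen_time = person.position, now
            robot.move_towards(person.position, keep_distance=1.0)
        elif state == FOLLOW and last_seen_time is not None \
                and now - last_seen_time > lost_timeout:
            state = RECOVER
            robot.navigate_to(last_seen_pos)     # revisit the last known position
        elif state == RECOVER:
            robot.rotate_in_place()              # scan until the tracker reacquires
        time.sleep(0.1)
</syntaxhighlight>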