Various media appearances
Amazon Astro
There has been plenty of press around Astro, so I won't link it all here. The following blog post highlights the latest developments in Astro's functionality: pet detection, support for small and medium-sized businesses, and more!
How Amazon is enhancing Astro for the home and beyond | Amazon News, 2022
Huawei's Xiaoyi Wizard home educational robot for children
Here are a couple of articles presenting the children's educational robot we developed at Huawei.
The robot was launched in April 2021, and was capable of spoken interaction, a range of multimodal emotional expressions and gestures using face and body motion, visual recognition and tracking of users, full-body gaze control, connectivity to external devices, and a variety of games and educational applications.
Huawei has launched an AI-equipped robot that you can interact with | GizChina.it, 2021
Huawei Children's robot, a different kind of artificial intelligence | DayDayNews, 2021
Erica: Man Made (Guardian video documentary)
Professor Ishiguro and I were recently featured in a Guardian video documentary about ERICA.
The Guardian: Documentaries - Erica: Man Made
Robot's Delight - A Lyrical Exposition on Learning by Imitation from Human-Human Interaction
Best Video Award, HRI 2017
Our latest video won the Best Video Award at the 2017 ACM/IEEE International Conference on Human-Robot Interaction in Vienna! In the form of a musical tribute to The Sugarhill Gang’s 1979 hit “Rapper’s Delight”, this video features Robovie and ERICA rapping in English to outline our recent research into learning-by-imitation of human-human conversational interaction.
In one study, we asked participants to role-play interactions between a shopkeeper and customer in a camera shop. We captured their motion and speech data with sensors, and we applied unsupervised learning techniques to reproduce the shopkeeper's behaviors with Robovie.
In a second study, we applied this technique to a stationary android, ERICA. In this case, since spatial cues were unavailable, we needed to develop a new technique for identifying topic patterns based on modeling the interaction structure.
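As a toy illustration of the retrieval idea behind this kind of data-driven approach (a simplified sketch, not the actual unsupervised method from our papers), one could match a new customer utterance against recorded human-human exchanges and replay the paired shopkeeper response. The corpus and the word-overlap similarity measure below are hypothetical:

```python
# Toy sketch of behavior retrieval from human-human example data.
# The corpus and similarity measure are illustrative only.

def jaccard(a, b):
    """Word-overlap similarity between two utterances."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def learned_response(customer_utterance, corpus):
    """Retrieve the shopkeeper response paired with the most
    similar customer utterance in the training corpus."""
    best = max(corpus, key=lambda pair: jaccard(customer_utterance, pair[0]))
    return best[1]

# Hypothetical recorded shopkeeper-customer exchanges:
corpus = [
    ("do you have any waterproof cameras", "Yes, the one over here is waterproof."),
    ("how much is this camera", "That model is 20,000 yen."),
    ("where is the register", "The register is right over there."),
]

print(learned_response("how much does this camera cost", corpus))
```

In the real systems, the behaviors were additionally clustered with unsupervised learning rather than replayed verbatim, so this sketch only captures the retrieval step.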
The Extended Abstract can be found here:
Dylan F. Glas, Malcolm Doering, Phoebe Liu, Takayuki Kanda, and Hiroshi Ishiguro, Robot's Delight - A Lyrical Exposition on Learning by Imitation from Human-Human Interaction, in Companion Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2017), pp. 408, Vienna, Austria, March 2017. (Video submission)
Extended Abstract
The related research can be found in the following journal paper, as well as other work currently under review:
Phoebe Liu, Dylan F. Glas, Takayuki Kanda, and Hiroshi Ishiguro, Data-Driven HRI: Learning Social Behaviors by Example from Human-Human Interaction, in IEEE Transactions on Robotics, Vol. 32, No. 4, pp. 988-1008, 2016.
DOI: 10.1109/TRO.2016.2588880
Authors' Preprint
Erica Demonstration at Miraikan, August 2015
On August 3, 2015, we unveiled our new android, Erica! Her name stands for "ERATO Intelligent Conversational Android". At a press conference and open symposium at "Miraikan", the Japanese national science museum in Tokyo, members of the press and the public came on stage, where Erica answered their questions about her interests, hobbies, dreams, and so on. Prof. Hiroshi Ishiguro from Osaka University and Prof. Tatsuya Kawahara from Kyoto University presented the vision and goals of the ERATO Ishiguro Symbiotic Human-Robot Interaction Project, and the other core members of the ERATO team presented various elements of the android system and their research objectives in this five-year project.
For this demonstration, Erica used our "ATRacker" pedestrian tracking system, based on laser range finders, in conjunction with a microphone array for sound source localization to identify who was talking to her at any given time. Although many androids are teleoperated, Erica was developed to be fully autonomous; the only human input was selecting among her interaction modes, e.g. "Q+A Mode", "Listening Mode", and "Idle Mode", based on the schedule of events in the symposium.
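A rough sketch of how this kind of sensor fusion can work (a simplified illustration, not Erica's actual implementation): match the microphone array's direction-of-arrival estimate against the bearings of people reported by the tracking system, and pick the best match within a tolerance. All positions, angles, and thresholds below are hypothetical:

```python
import math

def identify_speaker(people, robot_pos, sound_angle_deg, tolerance_deg=15.0):
    """Pick the tracked person whose bearing from the robot best matches
    the sound-source direction; return None if nobody is close enough.

    people: dict of id -> (x, y) positions from the tracking system
    sound_angle_deg: direction-of-arrival estimate from the mic array
    """
    best_id, best_err = None, tolerance_deg
    for pid, (x, y) in people.items():
        bearing = math.degrees(math.atan2(y - robot_pos[1], x - robot_pos[0]))
        # Wrap the angular difference into [-180, 180] before comparing.
        err = abs((bearing - sound_angle_deg + 180) % 360 - 180)
        if err < best_err:
            best_id, best_err = pid, err
    return best_id

people = {"A": (2.0, 0.0), "B": (0.0, 2.0)}   # hypothetical tracked positions
print(identify_speaker(people, (0.0, 0.0), 85.0))
```

With the sound source estimated at 85 degrees, the person at bearing 90 degrees ("B") is selected.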
The core android control software, gaze control, execution and blending of explicit gestures and facial expressions, and various nonverbal behaviors such as blinking, breathing, and backchannel nodding, were developed by our team at ATR. The dialogue management framework was developed by a team from Kyoto University using Julius for speech recognition. Speech synthesis was performed using a custom voice developed for Hoya's VoiceText software.
Erica is one of a set of three sister robots, one each at Osaka University, Kyoto University, and ATR. In the future, we plan to develop Erica's personality and capabilities further, creating more engaging, humanlike, and expressive interactions, and to lay a framework for developing interactive android applications for a variety of scenarios.
English and other languages:
- [Video] Say konnichiwa to Erica, the android who can have “completely natural” conversations RocketNews, 5 Aug, 2015
- Meet Japan's Latest Android, Erica Anime News Network, 4 Aug, 2015
- Erica, the beautiful android robot that's good at talking Merahputih, 6 Aug, 2015 [Indonesian]
- [Video] Her favorite type is "someone who speaks clearly": conversational robot with a beautiful face developed Asahi, 3 Aug, 2015 [Japanese]
- [Video] Humanoid robot "Erica" unveiled at Miraikan Mainichi Shimbun, 3 Aug, 2015 [Japanese]
- [Video] Rich expressions of emotion: Osaka and Kyoto Universities develop autonomous conversational android "ERICA" Nikkan, 4 Aug, 2015 [Japanese]
- "Beautiful", human-like android "ERICA" developed by JST, Osaka University, and others, toward natural dialogue with humans IT Media News, 3 Aug, 2015 [Japanese]
- JST Official Press Release, 3 Aug, 2015
- "Erica", the strikingly beautiful autonomous conversational android PC Watch, 3 Aug, 2015 [Japanese]
- Beautiful robot "ERICA" is 23 years old and converses naturally with AI; developed by Osaka and Kyoto Universities Sankei West, 3 Aug, 2015 [Japanese]
- Human-like expressions, answers powered by AI: robot unveiled, developed by Osaka and Kyoto Universities Nikkei, 3 Aug, 2015 [Japanese]
- "Her looks are second to none": 23-year-old beautiful robot "ERICA" unveiled, developed by Osaka and Kyoto Universities (images) 2chan, 3 Aug, 2015 [Japanese]
Miraikan ASIMO Tour Guide Demonstration, 2013
In October, 2013, we demonstrated the results of our collaboration with Honda. Using our ambient intelligence systems and several new algorithms developed in the project, we deployed Honda's ASIMO robot as an interactive tour guide in Tokyo's Miraikan science museum.
In this demonstration, the robot relied on our 3D human tracking system to precisely track where and how a visitor was walking, and this data was used to infer the visitor's level of interest in different exhibits and determine the optimal standing position and communication strategy for the robot. For repeat visitors, the robot used information about previous visits to enable richer autonomous spoken interactions.
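A minimal sketch of how interest might be inferred from trajectory data (a simplified stand-in for the actual models used in the project): treat lingering, i.e. slow movement near an exhibit, as evidence of interest. The score, thresholds, and sample path below are all hypothetical:

```python
import math

def interest_score(trajectory, exhibit_pos, radius=1.5, slow_speed=0.3, dt=0.5):
    """Crude interest estimate: the fraction of time steps a visitor
    spends near the exhibit while moving slowly (lingering).

    trajectory: list of (x, y) positions sampled every dt seconds
    """
    lingering = 0
    for prev, cur in zip(trajectory, trajectory[1:]):
        dist = math.hypot(cur[0] - exhibit_pos[0], cur[1] - exhibit_pos[1])
        speed = math.hypot(cur[0] - prev[0], cur[1] - prev[1]) / dt
        if dist < radius and speed < slow_speed:
            lingering += 1
    return lingering / max(len(trajectory) - 1, 1)

# A visitor who pauses in front of an exhibit at (0, 0), then walks away:
path = [(0.0, 0.0), (0.05, 0.0), (0.1, 0.0), (5.0, 0.0)]
print(interest_score(path, (0.0, 0.0)))
```

A score near 1 would suggest strong interest in that exhibit; in practice such an estimate would feed into the robot's choice of standing position and communication strategy.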
Note: Our work should not be confused with an unrelated ASIMO demonstration conducted at Miraikan in July 2013, which received some negative reviews when it mistook people holding up their smartphones for people raising their hands to ask a question.
Thank you very much, Mr. Roboto | World Future Society, 2011
Ishiguro-sensei and I were featured in an article in THE FUTURIST Magazine, published by the World Future Society.
The article covers several projects involving Robovie, Geminoid, and our Ambient Intelligence work, describes some of the motivations behind our research, and offers an interesting perspective on it.
Thank you very much, Mr. Roboto | World Future Society
APiTA Town Keihanna Shopping Center Wheelchair Demonstration, 2011
On March 30, 2011, we demonstrated the latest developments in the Ubiquitous Network Robot project, this time featuring a robotic wheelchair in a shopping mall.
This demo presented new applications of networked robot systems, demonstrating autonomous planning and safety using ubiquitous sensor networks, location-based services, and integration with remote teleoperators and mobile devices over the internet, all to ensure the safety and ease of use of the robotic wheelchair.
The system successfully enabled the customer to move freely throughout the shopping mall by herself, giving a new level of independence to someone who would typically depend on a caretaker to accompany her, if she went out at all.
- NCPAD: "Robotic Wheelchair: Mall shoppers dream come true?", 31 May 2011
- UberGizmo: "Robotic wheelchair offer independence to the elderly", 26 May 2011
- Newlaunches.com: "Robot-wheelchair makes life easier for the elderly", 20 May 2011
- JapanTrends: "Robotic Wheelchair Gives Elderly Independence", 19 May 2011
- IEEE-RAS TC Networked Robots blog: "Robotic Wheelchair Takes Elderly Customers Shopping", 19 May 2011
- Robonable: ATR demonstrates a shopping-support system for elderly people with walking difficulties, guided by a robotic wheelchair, 07 Apr 2011 [Japanese]
- Kyoto Shimbun: Wheelchair robot supports shopping: ATR experiment in Seika, 03 Mar 2011 [Japanese]
- ATR Press Release: Robots support shopping for wheelchair users!, 28 Mar 2011 [Japanese]
Nara Tourist Information Center, 2010
In December 2010, we placed a Robovie-R3 robot in the Nara Tourist Information Center, near JR Nara Station, to demonstrate a prototype system enabling elderly operators to control a conversational robot over the internet. The goal of the system is to enable people with limited mobility, such as retirees or parents raising young children, to easily take on part-time work using teleoperated robots.
The teleoperators in the demonstration were members of Suzaku, an association of retirees who act as volunteer tour guides in the Nara area. One operator was located in Nara and the other at ATR in Kyoto, and they took turns controlling the robot to answer questions, give directions, and tell entertaining stories about famous sights in Nara, such as the Great Buddha at Todaiji temple and the deer in Nara Park.
- IEEE-RAS TC Networked Robots blog: "Robot Teleoperation for Everyone", 27 Dec 2010
- PlasticPals.com: Robovie R3: Telepresence Tour Guide, 17 Dec 2010
- Spotsu: Robovie R3 entra al mundo de la telepresencia, 18 Dec 2010
- Jiji.com: Tour-guide robot developed in Nara, complete with gestures and ad-libs, 15 Dec 2010 [Japanese]
- Nara Shimbun Web: "Tourist guidance through a robot: social participation via teleoperation / ATR demonstration", 16 Dec 2010 [Japanese]
- Mediajam: "Tourist guidance through a robot: social participation via teleoperation / ATR demonstration", 16 Dec 2010 [Japanese]
- Asahi.com: "To difficult questions... 'I'm still studying': guide robot at Nara Station", 25 Jan 2011 [Japanese]
APiTA Town Keihanna Shopping Center, 2009-2010
In 2009-2010, we placed robots and environmental sensors in the APiTA Town shopping mall. The target application: helping elderly people with their shopping. This field demonstration showcased several new technologies, such as new, portable sensors for our laser-based human tracking system and smartphone integration with robot services.
- IEEE-RAS TC Networked Robots blog: "Robovie II helps elderly customers in a supermarket", 15 Apr 2010
- Robovie Goes Shopping with Grandma, 2 Jan 2010
- DigitalTrends: Robovie II Assists the Elderly in Supermarkets, 18 Jan 2010
- ElderGadget: Robovie II: Shopping Made Easy for Seniors, 19 Jan 2010
- RobotWatch: ATR begins experiments in Soraku District, Kyoto, supporting elderly people's daily lives through multiple cooperating robots, 14 Dec 2009 [Japanese]
Universal CityWalk Osaka, 2008-2009
In 2008-2009, we performed a series of field studies in which four robots patrolled a part of the shopping area in Universal CityWalk Osaka, greeting customers, recommending shops, and giving directions. During these studies, we demonstrated our multi-robot teleoperation systems, our human tracking and motion primitive analysis systems, a global service allocation and path planning system, and several other technologies.
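To give a flavor of what global service allocation involves (a deliberately simplified sketch, not the planner actually deployed in these studies), a greedy scheme might assign the closest free robot to each pending service request. The robot names and coordinates are hypothetical:

```python
def allocate_services(robots, requests):
    """Greedy global allocation: assign the closest free robot
    to each pending service request, in request order.

    robots: dict of name -> (x, y) position
    requests: list of (x, y) request locations
    Returns a list of (robot_name, request_index) assignments.
    """
    free = dict(robots)
    assignments = []
    for i, (rx, ry) in enumerate(requests):
        if not free:
            break  # more requests than robots; leave the rest unserved
        # Squared Euclidean distance is enough for choosing the minimum.
        name = min(free, key=lambda n: (free[n][0] - rx) ** 2 + (free[n][1] - ry) ** 2)
        assignments.append((name, i))
        del free[name]
    return assignments

print(allocate_services({"r1": (0, 0), "r2": (10, 0)}, [(9, 0), (1, 0)]))
```

A real system would also account for path costs through the mall, service priorities, and ongoing tasks, rather than straight-line distance alone.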
Robot Services
Integration with environmental sensor networks and motion primitive analysis enables us to target robot services to the people who appear most likely to need them.
- PinkTentacle: "Robovie droid helps lost shoppers" 25 Jan 2008
- BotJunkie: "Robovie Finds Lost Shoppers, Eats Their Souls" 28 Jan 2008
Network Robot System Demonstration
Final demonstration of the Network Robot project, including simultaneous teleoperation of four robots, centralized dynamic service allocation and path planning, robot-robot collaboration, and much more.
Robovie and ASIMO - "Robot Cafe"
Combining the strengths of the two robots, Robovie and Honda's ASIMO work together in a cafe demonstration: Robovie chats with customers and takes orders, while ASIMO walks around delivering drinks. The collaboration was enabled by the Network Robot Platform, which mediated messages between the two robots.
- Botropolis: Asimo & Friend Run a Cafe Together, 6 Jan 2009
- Nikkei BPnet: Experiment in which ASIMO works in cooperation with other robots, demonstrated by ATR and others, 24 Dec 2008 [Japanese]
- RoboNable: Honda and ATR run a cooperative service with ASIMO and Robovie-II, responding to the surroundings through environmental sensing, 26 Dec 2008 [Japanese]
- RoboNable: ATR unveils an Italian trash-collection robot for the first time, demonstrating a cooperative service with Robovie, 25 Dec 2008 [Japanese]
Robovie and DustCart Collaboration
DustCart, the autonomous mobile trash-can robot developed as part of the DustBot project by SSSA, came to Japan for a collaboration with Robovie. The two robots demonstrated a luggage-carrying service scenario mediated by the Network Robot Platform.
RoboPal, 2007
This was the first robot I worked on at ATR. It was designed to be a daily-life companion for elderly people. I developed the entire robot control architecture, including map-based localization, path planning, a safety system, a scripting language for developing applications, and a graphical interface for teleoperation and system monitoring.
- PlasticPals: "RoboPal", 16 Mar 2010
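To give a flavor of what an application scripting layer for such a robot can look like (a hypothetical miniature, not RoboPal's actual language), here is a tiny line-based interpreter that dispatches commands like goto and say to handler functions:

```python
def run_script(script, handlers):
    """Minimal interpreter for a line-based behavior script:
    each line is 'command arg', dispatched to a handler function.
    Returns the log of handler results, in execution order."""
    log = []
    for line in script.strip().splitlines():
        cmd, _, arg = line.strip().partition(" ")
        log.append(handlers[cmd](arg))
    return log

# Hypothetical commands, loosely in the spirit of a companion-robot app;
# real handlers would call navigation and speech subsystems instead.
handlers = {
    "goto": lambda arg: f"navigating to {arg}",
    "say":  lambda arg: f"speaking: {arg}",
    "wait": lambda arg: f"waiting {arg}s",
}

script = """
goto kitchen
say It is time for your medicine.
wait 2
"""
for entry in run_script(script, handlers):
    print(entry)
```

Separating the script from the handlers keeps application logic readable while the underlying navigation and safety systems do the real work.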