FROG (project)

From Wikipedia, the free encyclopedia

FROG is a collaborative project that proposes to develop a guide robot that engages tourists in the exploration of outdoor attractions. It is funded by the European Commission under the Seventh Framework Programme.

The acronym stands for Fun Robotic Outdoor Guide.

Context

In outdoor settings such as historical and cultural sites, theme parks and zoos, the amount of information that can be consumed is limited. Content is mostly static, usually conveyed only by text or still images. Human guides may provide in-depth information, but their tours are pre-set and do not allow for much personal exploration. They are limited in number and duration and are usually aimed at large groups of visitors rather than at a visitor's own group of friends or family. Moreover, the quality of the tour and the enjoyability of the experience depend greatly on the individual tour guide's experience, capabilities and personality.

The use of technologies to support the exploration of such sites has been limited mainly to multimedia GPS-based guides, particularly audio guides. Their entertainment value depends strongly on their content, and they do not offer a shared experience for visitors, who are more likely to visit in small groups than individually. Furthermore, such tools do not adapt to the users' responses.

This project focuses on outdoor guide robots, an emerging class of intelligent robot platforms. The personal and service robotics market is expected to reach $16 billion[1] by 2025, and the augmented reality market is expected to reach between $350 million and $732 million[2] by 2014. Outdoor robotic services such as environmental monitoring robots, autonomous transport carts, mobile information kiosks and surveillance robots are altering the way our society experiences services and outdoor environments. FROG aims to turn autonomous outdoor robots into viable and significant location-based outdoor service providers.

Objectives

The main goal of the project is to deliver a robust autonomous mobile robot that engages visitors in the exploration of outdoor sites. Specifically, FROG aims to advance the state of the art in the following fields:

  • Reliable autonomous outdoor robots.
  • Vision-based recognition of spontaneous displays of affective signals including agreement, disagreement, interest and confusion in outdoor environments.
  • Localization and navigation in populated and dynamic environments.
  • Cognitive social-psychological evaluation of user responses to the robot's adaptive behaviors.

Project Description

FROG's key components are intended to:

  • Detect obstacles, recognize certain objects such as pedestrians, and track them;
  • Constantly localize the robot in the environment;
  • Detect affective behaviors of humans;
  • Plan routes for the robot, taking into account the humans and their behaviors;
  • Execute the planned trajectories, considering all navigation-related issues such as dynamic obstacles;
  • Provide users with relevant content;
  • Interface with users in an engaging way.

The project faces challenges regarding navigation in crowded scenarios: the robot chassis and locomotion must be adequate for the terrain in which the robot operates, under the constraint that it will have to work in an unstructured environment and interact with humans.[3] Robots that have been successfully deployed in crowded scenarios usually navigate by computing the shortest path to the next goal and using local reactive navigation to reach it.[4][5]
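The cited navigation scheme (global shortest-path planning combined with local avoidance of dynamic obstacles) can be illustrated with a minimal grid-world sketch. This is not project code; the grid representation, the BFS planner and the replan-on-obstacle strategy are illustrative simplifications:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS shortest path on a 4-connected grid; 0 = free cell, 1 = blocked."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:          # walk back to the start
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None  # goal unreachable

def navigate(grid, start, goal, dynamic_obstacles):
    """Follow a globally planned path, replanning whenever currently
    observed dynamic obstacles (e.g. pedestrians) block the route."""
    blocked = set(dynamic_obstacles)
    pos, visited = start, [start]
    while pos != goal:
        # treat observed pedestrians as temporary obstacles and replan
        local = [[1 if (r, c) in blocked else grid[r][c]
                  for c in range(len(grid[0]))] for r in range(len(grid))]
        path = shortest_path(local, pos, goal)
        if path is None:
            return None  # no socially acceptable route available
        pos = path[1]    # take one step along the replanned path
        visited.append(pos)
    return visited
```

A real deployment would replace BFS with a costmap planner and the single-step replan with a continuous local controller, but the division of labor (global plan, local reaction) is the same.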

Regarding navigation, the main objective is the development of socially acceptable, efficient path planning and execution in crowded scenarios, both for robot navigation and for guiding groups of people. Research has found that people unfamiliar with a robot preferred interacting with a moody robot,[6] probably because the display of emotions was a novelty; frequent visitors, on the other hand, preferred interacting with the positive version of the robot, probably because they felt a sense of common ground when they saw its happy expression. When robot behaviors were consistent with human personality types along the extraversion–introversion dimension, participants responded better to robots whose designed 'personality' matched their own. It has been shown that users perceive personalities in robot behaviors and appearances.[7]

The work plan for this project includes tasks such as:

  • design of a robust outer shell;
  • development of socially acceptable robot behavior, based on the results of the URUS project (navigation) and MoVeME (obstacle avoidance research);
  • detection and tracking of nonverbal human communicative cues;
  • human-aware guidance built upon state-of-the-art approaches to pedestrian detection.[8]

Consortium

The FROG consortium consists of three technical beneficiaries and two SME partners from four European countries: the United Kingdom, Portugal, Spain and the Netherlands.

  • The University of Amsterdam
The Intelligent Autonomous Systems (IAS) group at the University of Amsterdam (UvA) is part of the Intelligent Systems Laboratory and carries out research in the field of robotics and autonomous systems. The group has participated in programmes such as the European IST ITEA program (‘Ambience’ project) and the IST FP6 program (‘COGNIRON’ project).
  • YDreams
YDreams develops interactive environments, products and intellectual property in interaction technology and design. It will be responsible for developing an interactive AR application that presents entertaining information to users and for creating an intelligent agent architecture that will integrate the different robot components.
  • IDMind
IDMind is a Portuguese SME, founded in April 2000. It developed a set of robotic platforms for research and edutainment:
  • (2009) FollowMe Robots. Integrated in a joint interactive project with YDreams, IDMind developed a fleet of autonomous mobile robots that guide visitors at the Visitor's Center of the "Ciudad Grupo Santander", near Madrid, Spain.
  • (2006–2009) ROBOSWARM – Knowledge Environment for Interacting Robot Swarms (European IST Project – FP6). IDMind was responsible for the specification of the hardware and led the integration of all the robot components (software and hardware).
  • (2003–2005) RAPOSA – Semi-Autonomous Search & Rescue Robot
  • Universidad Pablo de Olavide
The Robotics, Vision and Control Group will participate in the project. The group comprises more than 10 full-time researchers from two universities, the University of Seville and the Pablo de Olavide University. The researchers at the Pablo de Olavide University have expertise in decentralized sensor fusion, multi-robot systems, decision-making under uncertainty, and outdoor robot localization, mapping and navigation.
  • Imperial College London
The primary interest of the Visual Information Processing section in this project is to pursue fundamental research and real-world applications of computer vision. ICL's research has led to over 600 publications, including over 250 peer-reviewed journal papers.

References

  1. Robotic Trends, via Japan Robotics Association, 2006
  2. According to Juniper Research and ABI Research, 2009
  3. Borenstein et al., 1996; Siegwart et al., 2004
  4. Burgard et al., 1999
  5. Siegwart et al., 2003
  6. Gockley et al., 2006
  7. Walters et al., 2008; Butler and Agah, 2001
  8. Enzweiler & Gavrila 2009
