Development of a Google Glass App to Improve Robot Safety
Wednesday, September 14, 2016
Building 3 Auditorium - 11:00 AM
(Coffee and cookies at 10:30 AM)
While NASA has taken extensive measures to ensure the safety and security of its employees, the use of large industrial robots still carries inherent risks. These are of prime concern to the Satellite Servicing Capabilities Office (SSCO), which is developing numerous robotic systems for the Robotic Refueling Mission (RRM), the Asteroid Redirect Robotic Mission (ARRM), and the Restore-L mission. When controlling these systems, robot operators must split their attention between the robotic arm and the robot control and telemetry displays. It is vital that the operator maintain awareness of the robot's position with respect to hazards in the environment in order to prevent incidents. To aid in this, the use of a Google Glass device was proposed, leveraging the voice recognition and Heads-Up Display (HUD) features of the Glass to alert the operator to potential dangers. While the operator keeps the robot in their direct line of sight, data and alerts appear in the top-right corner of their vision via the Glass's holographic display. Emergency alerts for hazardous conditions are immediate and intrusive, with both visual and auditory cues, notifying the operator before any damage occurs. The software was developed in Android Studio using Java, XML, and the open-source GraphView API. Telemetry was collected as UDP and TCP packets, parsed and evaluated, and then displayed in dynamic numerical and graphical representations to give the operator an intuitive understanding of the robot's performance. A survey of the robotic arm operators showed overwhelming support for the software, as well as for future development of more complex graphical features. The software provides a safer simulation environment, in which the operator is more aware of the robot and its surroundings, and greatly mitigates the potential for harm to both machinery and robotics personnel.
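To illustrate the kind of pipeline described above, the following is a minimal Java sketch of parsing a telemetry packet payload and deciding whether it warrants an intrusive HUD alert. The packet format, channel names, and threshold values here are hypothetical, chosen for illustration only; the talk abstract does not specify the actual telemetry protocol.

```java
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the parse-then-alert step: a UDP payload is
// assumed to carry a comma-separated "CHANNEL,value" pair. The real
// packet format and alert thresholds are not described in the abstract.
public class TelemetryAlert {

    /** One parsed reading from a single telemetry packet. */
    public static class Reading {
        public final String channel;
        public final double value;
        public Reading(String channel, double value) {
            this.channel = channel;
            this.value = value;
        }
    }

    /** Parse a payload assumed to look like "JOINT2,87.5". */
    public static Reading parse(byte[] payload) {
        String text = new String(payload, StandardCharsets.UTF_8).trim();
        String[] parts = text.split(",");
        return new Reading(parts[0], Double.parseDouble(parts[1]));
    }

    /** Decide whether a reading should raise an intrusive HUD alert. */
    public static boolean isEmergency(Reading r, double limit) {
        return r.value > limit;
    }

    public static void main(String[] args) {
        byte[] payload = "JOINT2,87.5".getBytes(StandardCharsets.UTF_8);
        Reading r = parse(payload);
        // In the Glass app, an emergency reading would trigger both a
        // visual overlay and an auditory cue rather than console output.
        System.out.println(r.channel + " = " + r.value
                + (isEmergency(r, 80.0) ? "  ** ALERT **" : ""));
    }
}
```

In a deployed app, the payload would arrive via a `DatagramSocket` receive loop on a background thread, with emergency readings routed to the HUD rendering layer.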
Additionally, the SSCO is investigating Augmented Reality (AR) advancements beyond the capabilities of the Google Glass, including devices such as Microsoft's HoloLens, to directly overlay telemetry and safety information on the robots. The goal of these efforts is to improve and streamline the workflow for robot operators while ensuring the safety of both personnel and facilities. Ultimately, such technologies could be leveraged by robot operators, including astronauts, controlling robots in space, either locally or remotely.
William Gallagher is a Robotics Engineer at NASA Goddard Space Flight Center in the Satellite Servicing Capabilities Office in Greenbelt, MD. He received his B.S. in aerospace engineering from the University of Notre Dame in Notre Dame, Indiana, in 2007, then received his M.S. in aerospace engineering and Ph.D. in robotics from Georgia Institute of Technology in Atlanta, Georgia, in 2008 and 2013, respectively. His work focuses on advancing robotic capabilities for on-orbit servicing, assembly, and repair of satellites, as well as developing robotic systems for missions to solar system bodies. He is currently working on the design of the robotic components of the Asteroid Redirect Mission to acquire and return a boulder-sized sample from an asteroid, as well as the robotic hardware for the Restore-L mission for satellite servicing. His technical interests include increased autonomy for space-based robots, as well as improved human-robot interaction both on Earth and in space.
IS&T Colloquium Committee Host: Keith Keller
Sign language interpreter upon request: 301-286-7040