
Barajas et al., US Patent 9,387,589 Granted

Jul 23rd, 2016 | Category: Featured Articles, NASA, Patents, Publications, Robonaut 2

United States Patent 9,387,589
Barajas, et al. July 12, 2016

Visual debugging of robotic tasks

Download: US Patent 9,387,589  [text]

Abstract
A robotic system includes a robot, sensors which measure status information including a position and orientation of the robot and an object within the workspace, and a controller. The controller, which visually debugs an operation of the robot, includes a simulator module, action planning module, and graphical user interface (GUI). The simulator module receives the status information and generates visual markers, in response to marker commands, as graphical depictions of the object and robot. An action planning module selects a next action of the robot. The marker generator module generates and outputs the marker commands to the simulator module in response to the selected next action. The GUI receives and displays the visual markers, selected future action, and input commands. Via the action planning module, the position and/or orientation of the visual markers are modified in real time to change the operation of the robot.
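
The abstract describes a closed loop among sensors, a simulator module, an action planning module, a marker generator module, and a GUI. The sketch below is a minimal illustration of how such modules might fit together; all class, method, and marker names are assumptions made for this example, not identifiers from the patent.

```python
from dataclasses import dataclass

# Illustrative types only; none of these names come from the patent.
@dataclass
class Pose:
    position: tuple      # (x, y, z) in workspace coordinates
    orientation: tuple   # quaternion (qx, qy, qz, qw)

@dataclass
class MarkerCommand:
    marker_id: str       # e.g. "target" or "approach"
    pose: Pose           # where the marker should be drawn


class SimulatorModule:
    """Turns sensor status and marker commands into drawable visual markers."""
    def __init__(self):
        self.markers = {}
        self.robot_pose = None
        self.object_pose = None

    def update_status(self, robot_pose: Pose, object_pose: Pose):
        # In the patented system this status arrives from sensors in real time.
        self.robot_pose, self.object_pose = robot_pose, object_pose

    def apply(self, command: MarkerCommand):
        self.markers[command.marker_id] = command.pose


class ActionPlanningModule:
    """Selects the robot's next action from the current marker set."""
    def next_action(self, markers: dict):
        target = markers.get("target")
        return ("move_to", target) if target is not None else ("idle", None)


class MarkerGeneratorModule:
    """Emits marker commands for the selected next action."""
    def commands_for(self, action):
        kind, pose = action
        return [MarkerCommand("target", pose)] if kind == "move_to" else []


# Closed loop: markers feed the planner, and the planner's choice is
# rendered back through the marker generator and simulator.
sim = SimulatorModule()
planner = ActionPlanningModule()
generator = MarkerGeneratorModule()

sim.apply(MarkerCommand("target", Pose((0.4, 0.1, 0.2), (0.0, 0.0, 0.0, 1.0))))
action = planner.next_action(sim.markers)
for cmd in generator.commands_for(action):
    sim.apply(cmd)
```

The point of the loop is that the markers shown in the GUI are the same objects the planner reads, so editing a marker changes what the robot does next.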


Inventors: Barajas; Leandro G. (Harvest, AL), Payton; David W (Calabasas, CA), Ku; Li Yang (Amherst, MA), Uhlenbrock; Ryan M (Calabasas, CA), Earl; Darren (Los Angeles, CA)
Applicant: GM GLOBAL TECHNOLOGY OPERATIONS LLC (Detroit, MI, US)
Assignee: GM Global Technology Operations LLC (Detroit, MI)
Family ID: 53485175
Appl. No.: 14/189,452
Filed: February 25, 2014

US Patent 9,387,589, 2016, Barajas et al., Visual Debugging of Robotic Tasks

A robotic system comprising:

    a robot responsive to input commands;

    sensors which measure a set of status information in real time, including a position and an orientation of the robot and an object within the workspace; and

    a controller having a processor and memory on which is recorded instructions for visually debugging an operation of the robot, thereby allowing a user to change the robot’s behavior in real time, the controller including:

        a simulator module in communication with the sensors, wherein the simulator module receives the set of status information from the sensors in real time and generates visual markers, in response to marker commands, as graphical depictions of the object and the robot in the workspace, wherein the visual markers provide graphical representations of current and future actions of the robot;

        an action planning module configured to select a next action of the robot;

        a marker generator module in communication with the action planning module, and configured to generate and output the marker commands to the simulator module in response to the selected next action of the robot; and

        a graphical user interface (GUI) having a display screen, wherein the GUI is in communication with the simulator module, and is operable to receive and display the visual markers, and to receive the input commands and modify, via the action planning module, at least one of the position and orientation of the visual markers to change the operation of the robot in real time, thereby visually debugging the operation;

    wherein the visual markers include a target marker indicating a desired target of an end effector of the robot, trajectory markers indicating an approach and a departure trajectory of the end effector, an objective marker indicating where the object will be in the future, and a collision marker indicating where the end effector will collide with the object.
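
The final wherein clause names four kinds of visual markers: a target marker, approach and departure trajectory markers, an objective marker for the object's predicted pose, and a collision marker. The sketch below, using hypothetical names and poses, illustrates how a GUI edit to one of these markers could trigger replanning; the collision marker is omitted here on the assumption that it would be derived by the simulator rather than edited directly. This is not the patent's implementation.

```python
# Hypothetical marker set mirroring the marker types named in the claim.
# Poses are (x, y, z, qx, qy, qz, qw) tuples; values are illustrative only.
markers = {
    "target":    (0.40, 0.10, 0.20, 0.0, 0.0, 0.0, 1.0),  # desired end-effector goal
    "approach":  (0.40, 0.10, 0.35, 0.0, 0.0, 0.0, 1.0),  # approach trajectory waypoint
    "departure": (0.30, 0.10, 0.35, 0.0, 0.0, 0.0, 1.0),  # departure trajectory waypoint
    "objective": (0.45, 0.05, 0.20, 0.0, 0.0, 0.0, 1.0),  # predicted future object pose
}

def on_marker_moved(marker_id, new_pose, replan):
    """Illustrative GUI callback: the edited pose replaces the old one and the
    planner is asked for a new next action against the updated marker set."""
    markers[marker_id] = new_pose
    return replan(markers)

# Example: raise the target marker by 5 cm and replan toward the new pose.
next_action = on_marker_moved(
    "target",
    (0.40, 0.10, 0.25, 0.0, 0.0, 0.0, 1.0),
    replan=lambda m: ("move_to", m["target"]),
)
print(next_action)  # ('move_to', (0.4, 0.1, 0.25, 0.0, 0.0, 0.0, 1.0))
```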
