Space Robots, Machine Vision, And Object Interaction

With the successful launch of SpaceX’s Falcon Heavy rocket and the recovery of its boosters, now is an excellent opportunity to talk about robots, machine vision, and their roles in expanding space research and exploration.

 

a still of the Starman mannequin and Tesla Roadster with Earth in the background from the SpaceX.com live feed.

A Stunning View Of Earth Captured Following The Falcon Heavy Launch. Photograph: SpaceX.com Live Feed.

 

Space robots. Modeled after us in morphology and size, they surpass industrial robots in versatility and capability. While they may not yet look as advanced or operate as nimbly as their counterparts in sci-fi features from the Star Wars and Star Trek franchises, that gap is quickly shrinking. Taking on repairs and other tasks deemed too dangerous for astronauts, these specialized robots are the obvious candidates for many of the precarious activities taking place beyond the relative comfort of Earth.

R2 Goes To The International Space Station

The first humanoid robot in space, Robonaut 2, R2 for short, was developed by the Dexterous Robotics Laboratory at Johnson Space Center (JSC) in Houston, Texas. R2 emerged earlier this decade as the latest subject of robotics research in space. Originally consisting of only an upper torso and arms, R2 has since been equipped with two climbing manipulators, read as legs, capable of providing mobility in zero-g environments to complement the dexterous arms and digits that handle intricate tasks. R2’s evolution is a marvel for researchers and enthusiasts to behold, but what’s more impressive than the improvements over its predecessor, R1, are the advanced sensing capabilities that allow R2 to truly perform in one of the most challenging environments imaginable.

A picture of Robonaut 2 aboard the International Space Station.

Robonaut 2 Working Tirelessly Aboard The International Space Station.

Machine Vision, Sensing, And Perception

The abilities to touch and see are perhaps the most extraordinary of these robots’ capabilities. Vision and sensing components relay complex sets of data, such as the identity, position, and orientation of objects in an image. Powerful industrial machine vision and process guidance systems give next-generation robots the ability to evaluate and react effectively in real time.

Without machine vision, robots are little more than extensions of their controllers and the setpoints governing automated tasks. In R2’s case, 3D vision is the component of machine vision that allows it to perform complex tasks in a semi-autonomous fashion. R2 is capable of both remote control by operators and semi-autonomous operation using advanced software that lets R2 “think” through the solution to a given task, and software updates regularly expand the depth and breadth of R2’s capability. R2’s vision is governed by five cameras in all: two provide stereo vision for the robot and its operators, two serve as auxiliary backups, and a fifth infrared camera, contained within the mouth area, aids in depth perception. Stereo vision allows images from two vantage points to be compared, effectively allowing R2 – and us – to see in 3D. All vision components are housed within the cranium, while R2’s “brain” is located within the robot’s torso. R2 can look up and down, left and right, to fully gauge its surroundings.
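The stereo principle at work here can be sketched in a few lines: two cameras a known distance apart see the same point at slightly different horizontal positions (the disparity), and depth follows from similar triangles. This is a toy illustration only – the focal length, baseline, and disparity below are hypothetical values, not R2’s actual camera parameters:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Depth (meters) of a point seen by two parallel cameras.

    focal_length_px: camera focal length in pixels (assumed value)
    baseline_m:      distance between the two cameras in meters (assumed)
    disparity_px:    horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    # Similar triangles: depth = f * B / d
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 10 cm baseline, 35 px disparity
print(depth_from_disparity(700, 0.1, 35))  # prints 2.0 (meters)
```

Nearby objects produce large disparities and distant ones small disparities, which is why a wider camera baseline improves depth accuracy at range.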

 

a picture of R2 with legs attached

R2 Equipped With Two Climbing Manipulators, Read As Legs, Capable Of Providing Mobility In Zero-G Environments.

 

A prime example of cooperative robotics at work, R2 interacts with the astronauts on the ISS much the way another person might. Operating at a pace comparable to humans, R2 has a soft, padded skin equipped with sensing systems that allow it to react when encountering a person. Force control is provided by torsion springs inside the robot that allow R2 to react to influence from the environment: when a person pushes an arm away, R2 gives to the motion and lets the person by. This sensing capability also provides R2 with continuous awareness of its orientation and of the location of its limbs relative to the environment and surrounding people.
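The compliant behavior described above – sensing external force from the deflection of an internal spring, then yielding rather than fighting – can be sketched as follows. The stiffness, threshold, and yield rule here are illustrative assumptions, not NASA’s actual controller:

```python
# Series-elastic idea: a torsion spring sits between motor and link, so
# measuring the spring's deflection reveals the torque applied by the
# environment. Above a threshold, the joint moves with the push.

SPRING_STIFFNESS = 50.0   # N·m per radian of spring deflection (assumed)
YIELD_TORQUE = 5.0        # torque above which the arm gives way (assumed)

def commanded_velocity(motor_angle, link_angle, gain=0.5):
    """Return a joint velocity command that yields to external pushes."""
    deflection = link_angle - motor_angle
    external_torque = SPRING_STIFFNESS * deflection
    if abs(external_torque) > YIELD_TORQUE:
        # Move in the direction of the push, proportional to its strength.
        return gain * external_torque
    return 0.0  # hold position when undisturbed

print(commanded_velocity(0.0, 0.2))   # strong push: yields, prints 5.0
print(commanded_velocity(0.0, 0.05))  # small disturbance: holds, prints 0.0
```

The dead band below the threshold keeps the robot steady against small bumps while still letting a person move its arm aside with a deliberate push.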

Object Interaction

As for Robonaut’s interaction with its environment, its hands work a bit differently than both humans’ and industrial robots’. The key difference resides in R2’s tendon-driven robotic fingers. Typically, robots control such joints via tension controllers located on each tendon individually; put another way, desired joint torque translates into tendon tension. This posed a problem in the case of R2: the resulting coupling between joint displacement and tendon tension had to be addressed for R2 to be able to interact with unfamiliar objects in an array of positions. This is in stark contrast to R2’s industrial cousins, which operate in uniform spaces with familiar objects. The solution to R2’s dilemma came when NASA and GM engineers devised a joint-based torque control method that decouples joint torques from individual tendon tensions. All this talk about torque is of particular importance for R2, as well as many other humanoid robots, because interacting with a variety of objects large and small demands an adaptable grip.
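A rough sketch of the underlying math: desired joint torques tau relate to tendon tensions f through a moment-arm matrix R (tau = R·f), and every tension must stay positive because a tendon can only pull. One common approach, shown here with made-up numbers rather than R2’s real geometry, is to find a minimum-norm solution and then shift it along the null space of R, which changes no joint torque but raises all tensions above a pretension floor:

```python
import numpy as np

R = np.array([[0.010, -0.010, 0.008],
              [0.005,  0.005, -0.010]])  # moment arms (m) of 3 tendons
                                         # about 2 finger joints (assumed)
tau = np.array([0.02, 0.005])            # desired joint torques, N·m (assumed)
f_min = 1.0                              # pretension floor, N (assumed)

# Minimum-norm tendon tensions producing tau (entries may be negative).
f0, *_ = np.linalg.lstsq(R, tau, rcond=None)

# One-dimensional null space of R: adding any multiple of n to f0
# leaves the joint torques unchanged.
n = np.linalg.svd(R)[2][-1]
n = n if n.min() > 0 else -n             # orient so all entries are positive
                                         # (holds for this assumed geometry)
alpha = max(0.0, ((f_min - f0) / n).max())
f = f0 + alpha * n

print(f)      # tendon tensions, all >= f_min
print(R @ f)  # reproduces the desired joint torques
```

Real tendon-driven hands add tension limits and dynamics on top of this, but the core idea is the same: control the joint torques while using the extra tendon as internal pretension.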

A picture of Robonaut 2 holding different tools

Robonaut 2 Is Capable Of Working With An Array Of Tools. Photographer: Robert Markowitz.

What’s Next For The ISS And Non-Human Crewmembers

The most recent iteration of Robonaut to come from Houston’s JSC is R5, or Valkyrie. Built to compete in the 2013 DARPA Robotics Challenge (DRC) Trials, Valkyrie was designed over a 15-month period and improved on the electronics, actuators, and sensing capabilities of earlier generations of JSC humanoid robots. In particular, R5’s vision and sensing systems are a tremendous advancement over those found in R2. Valkyrie’s redesigned head sits on a neck possessing three degrees of freedom and features a Carnegie Robotics MultiSense SL, a tri-modal (laser, 3D stereo, and video), high-resolution, high-data-rate, and high-accuracy 3D range sensor, as the main perceptual sensor. Additional modifications include infrared-structured-light point cloud generation beyond the laser and passive stereo methods already implemented, as well as front and rear “hazard cameras” positioned in the torso.

A picture of Robonaut 5 with hands on hips.

The Latest Iteration Of Robonaut, Robonaut 5, Is Also Referred To As Valkyrie And Features The Latest Tech In Robotics For Space Applications.

As research advances technology here on the ground, components and software can be sent to the ISS for utilization. Once proven to operate effectively on the ISS, NASA and other robotics laboratories hope that innovative robotics and associated technologies can be applied further in the depths of space. In the future, thermal resistance for robots will likely be a main focal point for researchers.

About Encompass Solutions

Encompass Solutions, Inc. is an ERP consulting firm that offers professional services in business consulting, project management, and software implementation. Whether undertaking full-scale implementation, integration, and renovation of existing systems or addressing the emerging challenges in corporate and operational growth, Encompass provides a specialized approach to every client’s needs. As experts in identifying customer requirements and addressing them with the right solutions, we ensure our clients are equipped to match the pace of industry.
