
NASA Using Kinect 2 on Xbox One to Control a Robot's Arm in Lab

Thursday, 02 Jan 2014 11:05 AM

By Clyde Hughes

NASA is turning to gaming technology for its next steps in space exploration, using the Xbox One's Kinect 2 sensor to control a robot's arm at its Jet Propulsion Laboratory in Pasadena, Calif.

The Kinect tracks its user's movement, specifically how the arm moves, rotates and clenches. The robot arm follows, though with a slight lag due to the interface and the latency involved with operating a robot so far away.
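The basic idea the article describes, turning tracked human arm motion into commands for a robotic arm, can be sketched in a few lines. This is an illustration only, not NASA's code: the joint names, servo limits, and data format are all hypothetical stand-ins for whatever the JPL rig actually uses.

```python
# Illustrative sketch (not NASA's implementation): map tracked human
# arm joint angles to commanded servo targets, clamping each command
# to the robot's mechanical range so an exaggerated human motion
# cannot drive a servo past its limits. All names are hypothetical.

SERVO_LIMITS = {                 # degrees, per hypothetical servo
    "shoulder_pitch": (-90.0, 90.0),
    "shoulder_yaw":   (-90.0, 90.0),
    "elbow":          (0.0, 135.0),
    "wrist_roll":     (-180.0, 180.0),
}

def clamp(value, lo, hi):
    """Keep a commanded angle inside the servo's mechanical range."""
    return max(lo, min(hi, value))

def map_pose_to_servos(tracked_angles):
    """Convert one frame of tracked joint angles into safe servo targets."""
    return {joint: clamp(angle, *SERVO_LIMITS[joint])
            for joint, angle in tracked_angles.items()
            if joint in SERVO_LIMITS}

# Example frame from the tracker: the elbow is bent past the arm's range,
# so its command is clamped from 150 to 135 degrees.
frame = {"shoulder_pitch": 30.0, "shoulder_yaw": -10.0,
         "elbow": 150.0, "wrist_roll": 45.0}
print(map_pose_to_servos(frame))
```

The lag the article mentions would show up between frames like this one: each tracked pose is captured, mapped, and transmitted before the arm can follow.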


The Kinect's software gathers information regarding motion-capture of actual moving things in real-life scenarios, according to Jameco.com. It then processes the data using an artificial intelligence algorithm that allows Kinect to map the visual data to models representing people.
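Once the sensor has fit a skeleton model to a person, each joint is available as a 3-D position, and quantities like "how bent is the elbow" fall out of simple vector math. The sketch below is an assumption-laden illustration of that step, not Kinect SDK code: it computes the elbow angle from three tracked joint positions using the dot product.

```python
# Illustrative sketch: given three tracked 3-D joint positions of the
# kind a skeletal tracker exposes (shoulder, elbow, wrist), compute the
# elbow flexion angle as the angle between the upper-arm and forearm
# vectors. This is generic geometry, not the Kinect SDK's API.
import math

def elbow_angle(shoulder, elbow, wrist):
    """Return the elbow angle in degrees from three (x, y, z) positions."""
    upper = [s - e for s, e in zip(shoulder, elbow)]   # elbow -> shoulder
    fore = [w - e for w, e in zip(wrist, elbow)]       # elbow -> wrist
    dot = sum(a * b for a, b in zip(upper, fore))
    norms = math.dist(shoulder, elbow) * math.dist(wrist, elbow)
    return math.degrees(math.acos(dot / norms))

# A fully extended arm: shoulder, elbow, and wrist are collinear.
print(round(elbow_angle((0, 0, 0), (1, 0, 0), (2, 0, 0))))  # 180
```

Angles like this, computed every frame, are what a rig such as JPL's could forward to a robotic arm.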

The laboratory has built a rig that allows an operator to use the Oculus Rift virtual reality headset to see through a robot's image sensors, according to Tom's Guide. Invented by virtual reality enthusiast Palmer Luckey, the Oculus Rift is a set of virtual reality goggles that is compatible with a computer or mobile device.

"(NASA) received their (development) kit in late November, and after a few days of tinkering, were able to hook up an Oculus Rift with the Kinect 2 in order to manipulate an off-the-shelf robotic arm," Engadget.com reported last month. "According to our interview with a group of JPL engineers, the combination of the Oculus's head-mounted display and the Kinect's motion sensors has resulted in 'the most immersive interface' JPL has built to date."

NASA researchers told Engadget.com that the technology could one day allow humans to interact with the Martian surface through robots without actually being there, and that the hardware to do it has been available all along in consumer gaming.

"We're able for the first time, with a consumer-grade sensor, (to) control the entire orientation rotation of a robotic limb," Alex Menzies, a human interfaces engineer, said. "Plus we're able to really immerse someone in the environment so that it feels like an extension of your own body -- you're able to look at the scene from a human-like perspective with full stereo vision. … It feels very natural and immersive. I felt like you have a much better awareness of where objects are in the world."



© 2014 Newsmax. All rights reserved.

