GPUs and Deep Learning Could Help Control Robots in Space
Jun 01, 2017

A six-person Seattle startup has developed augmented telerobotics software that gives humans finer control of remotely operated robots, a capability that could prove useful for exploring Mars and other planets.
BluHaptics specializes in robotic control for underwater environments, but with a recently awarded NASA grant, the company is now applying its software to robotic operations in space: performing maintenance on satellites, servicing the International Space Station, exploring planets, and eventually building habitats for humans.
Their technology uses real-time modeling, haptic feedback, and deep learning to allow users to control remotely operated robots with a video game controller.
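As a rough illustration of the control side, here is a minimal Python sketch of how a game controller's analog sticks might be mapped to robot velocity commands. The VelocityCommand type, deadzone, input shaping, and speed caps are hypothetical assumptions for the sketch, not BluHaptics' actual interface.

```python
from dataclasses import dataclass

@dataclass
class VelocityCommand:
    vx: float   # m/s, forward/back
    vy: float   # m/s, left/right
    vz: float   # m/s, up/down
    yaw: float  # rad/s

DEADZONE = 0.1      # ignore small stick noise near center
MAX_LINEAR = 0.25   # m/s cap, conservative for teleoperation
MAX_YAW = 0.5       # rad/s cap

def _shape(axis: float) -> float:
    """Apply a deadzone, then square the input (preserving sign) so
    small stick deflections map to fine, precise motions."""
    if abs(axis) < DEADZONE:
        return 0.0
    return axis * abs(axis)

def gamepad_to_command(lx: float, ly: float, rx: float, ry: float) -> VelocityCommand:
    """Map two analog sticks (each axis in [-1, 1]) to an
    end-effector velocity command."""
    return VelocityCommand(
        vx=_shape(-ly) * MAX_LINEAR,   # push left stick forward -> move forward
        vy=_shape(lx) * MAX_LINEAR,
        vz=_shape(-ry) * MAX_LINEAR,
        yaw=_shape(rx) * MAX_YAW,
    )

# Left stick pushed forward -> vx ~ 0.16 m/s; lx inside deadzone -> vy = 0.0
print(gamepad_to_command(0.05, -0.8, 0.3, 0.0))
```

Squaring the stick input is one common teleoperation choice: it keeps slow, precise motion available near the stick's center while still allowing full speed at full deflection.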
Using an NVIDIA TITAN Xp GPU and cuDNN with the CNTK deep learning framework, the team trained their software to precisely recognize different objects on the intended target.
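For readers unfamiliar with CNTK, the sketch below shows what defining and training a small image classifier in CNTK 2.x looks like, with convolutions running on the GPU via cuDNN. The layer sizes, input resolution, class count, and synthetic data are placeholders for illustration, not BluHaptics' actual network.

```python
import numpy as np
import cntk as C

num_classes = 5
x = C.input_variable((3, 64, 64))   # 3-channel 64x64 image
y = C.input_variable(num_classes)   # one-hot label

# A small CNN: two conv/pool stages followed by dense layers.
with C.layers.default_options(activation=C.relu):
    z = C.layers.Sequential([
        C.layers.Convolution2D((5, 5), 32, pad=True),
        C.layers.MaxPooling((2, 2), strides=(2, 2)),
        C.layers.Convolution2D((5, 5), 64, pad=True),
        C.layers.MaxPooling((2, 2), strides=(2, 2)),
        C.layers.Dense(256),
        C.layers.Dense(num_classes, activation=None),
    ])(x)

loss = C.cross_entropy_with_softmax(z, y)
error = C.classification_error(z, y)
lr = C.learning_rate_schedule(0.01, C.UnitType.minibatch)
trainer = C.Trainer(z, (loss, error), [C.sgd(z.parameters, lr)])

# One synthetic minibatch, just to show the training call.
images = np.random.rand(16, 3, 64, 64).astype(np.float32)
labels = np.eye(num_classes, dtype=np.float32)[
    np.random.randint(0, num_classes, 16)]
trainer.train_minibatch({x: images, y: labels})
```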
“There’s this gap that exists between manual control of robots and automation,” explains BluHaptics CEO Don Pickering. “We’re seeing better sensors and better hardware come along, but computer vision and control have always been somewhat disparate. By connecting what a robot sees to what it does, we can help users accomplish complex tasks more easily and achieve a level of safety and efficiency that wasn’t possible before.”
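One way to read "connecting what a robot sees to what it does" is shared control: the perception system modulates the operator's commands rather than replacing them. The minimal sketch below shows that idea under stated assumptions; the Detection type, confidence thresholds, and assisted_speed function are illustrative, not BluHaptics' method.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float  # classifier score in [0, 1]

def assisted_speed(requested_speed: float,
                   detections: list,
                   target_label: str,
                   full_speed_confidence: float = 0.9,
                   creep_speed: float = 0.02) -> float:
    """Shared-control gate: allow the operator's full requested speed
    only when the perception model confidently sees the intended
    target; otherwise slow toward a safe 'creep' speed."""
    best = max((d.confidence for d in detections if d.label == target_label),
               default=0.0)
    if best >= full_speed_confidence:
        return requested_speed
    # Blend between creep and requested speed as confidence rises.
    return creep_speed + (requested_speed - creep_speed) * best

print(assisted_speed(0.25, [Detection("valve", 0.95)], "valve"))  # 0.25
print(assisted_speed(0.25, [Detection("valve", 0.50)], "valve"))  # 0.135
```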
