The human touch is incredibly complicated, so much so that most contemporary robots and artificial intelligence systems struggle to comprehend how it functions.

Machines are already fairly good at grasping and placing items, but the ability to manipulate those items further (assembling, reorienting, and packaging them) has remained mainly elusive. Recently, however, scientists developed an incredibly dexterous robot hand after realizing it required fewer, rather than more, sensory inputs. Created by Columbia Engineers, it combines an improved sense of touch with motor-learning algorithms and does not rely on vision to manipulate objects.

“[This is] a novel method for achieving dexterous manipulation of complex objects, while simultaneously securing the object without the use of passive support surfaces,” the researchers said in a study recently posted to the preprint server arXiv.

“[We used] the strength of both RL [reinforcement learning] and SBP [sampling-based planning] methods to train motor control policies for in-hand manipulation with finger gaiting,” the researchers said. “We aim to manipulate more difficult objects, including concave shapes, while securing them at all times without relying on support surfaces.”

Robotic Commands

Developing a set of instructions is simple for an AI: it can tell a robot what to do, but most robots are not very good at sensing feedback. The new robot hand goes beyond that, with fingertips that can precisely feel what they are touching and discern an object’s movement and location. Making use of that feedback required another algorithm, the rapidly exploring random tree (RRT), which lets the hand grasp more challenging objects. RRT grows a tree of reachable states and follows the branch that leads from the current state to a state in which the job is complete.
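To give a flavor of how an RRT search works, here is a minimal illustrative sketch in plain Python for a simple 2-D state space. This is not the Columbia team’s implementation; the state space, step size, goal tolerance, and goal-biasing rate are all assumptions chosen for clarity.

```python
import math
import random

def rrt(start, goal, step=0.5, goal_tol=0.5, max_iters=2000, seed=0):
    """Minimal RRT sketch: grow a tree from `start` by repeatedly
    sampling a random state, extending the nearest tree node toward
    the sample, and stopping once a node lands within `goal_tol` of
    `goal`. Returns the branch from start to goal, or None."""
    rng = random.Random(seed)
    nodes = [start]          # tree vertices (2-D points)
    parent = {0: None}       # node index -> parent index

    for _ in range(max_iters):
        # Sample a random state; bias 10% of samples toward the goal.
        if rng.random() < 0.1:
            sample = goal
        else:
            sample = (rng.uniform(0, 10), rng.uniform(0, 10))

        # Find the tree node nearest to the sample.
        i_near = min(range(len(nodes)),
                     key=lambda i: math.dist(nodes[i], sample))
        near = nodes[i_near]

        # Take one bounded step from that node toward the sample.
        d = math.dist(near, sample)
        if d == 0:
            continue
        t = min(step / d, 1.0)
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        nodes.append(new)
        parent[len(nodes) - 1] = i_near

        # Goal reached: walk parent pointers back to recover the branch.
        if math.dist(new, goal) <= goal_tol:
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None

path = rrt(start=(0.0, 0.0), goal=(9.0, 9.0))
```

In a real manipulation setting the “states” would be hand and object configurations rather than 2-D points, and each extension would be checked for physical feasibility, but the sample-extend-backtrack loop is the same.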

Read More: Coca-Cola Releases Its First-Ever AI-Generated Ad

Stay tuned to Brandsynario for the latest news and updates.