
Humanoid Robot Holy Grail: Decoding the Language of Human Touch

Advances in tactile sensing and AI are giving robots real-world capabilities that go beyond sci-fi speculation about humanoids.


From the narratives of Ex Machina and Blade Runner to the claims this year by Google engineer Blake Lemoine that the company's AI was sentient, the concept of machines with emotions has always sparked people's imaginations. While the truth of these tales remains uncertain, a tangible new reality is emerging: Robots have gained the sense of “feeling,” moving into the realm of human touch.

For years, machines have perfected the art of seeing, using advancements in computer vision and laser technologies to enhance their precision. But now, a new frontier has opened: comprehending tactile sensations. Imagine a robotic finger gliding across a glass surface, identifying subtle imperfections, discerning temperature variations, and effortlessly distinguishing between metal and wood.

In the dynamic landscape of today's technological frontiers, touch stands as a game-changer for robots and the industrial sectors they serve. Let’s delve into the latest advances in human-like touch and how they are propelling the world of robotics forward at an astonishing pace.

Robotic hands gain human-like receptors

The profound connection between the human experience and the nuances of touch is well known. Our tactile senses, orchestrated by a network of intricate receptors known as the somatosensory system, allow us to comprehend the world around us. This system enables us to differentiate between sensations like hot and cold, rough and smooth, and even pain and tickling.

Given the sophistication of this sensory process, replicating it within robotics, particularly in their extremities, has historically been a challenge. However, new approaches to introducing human-like receptors into the hands of robots are turning what was once considered science fiction into reality.

At the forefront of this disruption is GelSight, a company that recently emerged from Hexagon's accelerator program. GelSight's innovative approach involves engineering a transparent pad using a rubbery thermoplastic elastomer coated with metallic paint on one side.

When this pad is fitted to a robotic arm and pressed against an object, it conforms precisely to the object's shape, providing a surface that a camera can capture with exceptional accuracy. This method effectively eliminates distortions caused by reflective or transparent elements on the object, revolutionizing the way robots can perceive and interact with objects within a warehouse or factory.
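
To ground the idea, here is a minimal, hypothetical sketch in Python of the kind of image comparison such a camera-based tactile sensor enables: a capture of the deformed pad is compared against a flat-contact reference to flag likely surface imperfections. The array sizes, threshold, and synthetic data are illustrative assumptions, not GelSight's actual processing pipeline.

```python
import numpy as np

def find_surface_defects(contact_img: np.ndarray,
                         reference_img: np.ndarray,
                         threshold: float = 0.15) -> np.ndarray:
    """Flag pixels where the tactile image deviates strongly from a
    flat-contact reference, a rough proxy for surface imperfections.

    Inputs are grayscale images (H x W); the threshold is an
    illustrative, uncalibrated assumption.
    """
    # Normalize out global lighting differences between captures.
    contact = (contact_img - contact_img.mean()) / (contact_img.std() + 1e-8)
    reference = (reference_img - reference_img.mean()) / (reference_img.std() + 1e-8)

    # Large local deviations suggest dents, scratches, or debris
    # pressed into the elastomer pad.
    deviation = np.abs(contact - reference)
    return deviation > (threshold * deviation.max())

# Example usage with synthetic data standing in for real captures.
rng = np.random.default_rng(0)
flat = rng.normal(0.5, 0.01, size=(480, 640))
dented = flat.copy()
dented[200:210, 300:320] += 0.2   # simulated imperfection
print("defect pixels:", int(find_surface_defects(dented, flat).sum()))
```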

In the future, GelSight’s technology may not be limited to robotic hands; it may also be used in robotic feet.

In fact, the company’s static 3D tactile sensing device is already being used by the Norwegian ski team to measure the snow consistency on the slopes each day to help their skiers understand how the surface of their skis will interact with snow. It’s not hard to hypothesize how these measurements could be made in real-time on robotic feet in the future to adapt traction or tread to best suit the current environment.

Bringing dexterity to robotic fingers with AI

Once a robot detects an object through its hand, the next crucial step is enabling it to effectively interact with that object. This dexterity challenge becomes particularly evident in the factory of the future, where a robot must identify and then handle hundreds of objects of different shapes and sizes.

The key to solving this challenge is developing robotic fingers and hands that mirror the finesse of human-like dexterity. This requires meticulous engineering to enable machines to understand the nuances of hand and finger movements and provide the intelligence to execute touch-driven tasks with precision.

Researchers at Columbia University are among the latest to address these challenges. They have demonstrated that robotic fingers can be trained to execute complex in-hand manipulations of intricate objects, relying solely on the sense of touch to guide their movements. Eliminating the dependence on visual input demonstrates the true potential of tactile-driven robotics.

That work follows advances in dexterity in recent years from the likes of both OpenAI and MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). While the former is better known today for ChatGPT, in 2018 it made headlines by training a human-like robot hand to manipulate physical objects with unprecedented dexterity using a reinforcement learning algorithm.

CSAIL has also relied on advances in deep learning, training a robotic hand to reorient more than 2,000 different objects in-hand. As the pace of AI innovation continues to accelerate, we’re not far from seeing deep learning models for robotic dexterity deployed in real-world assembly lines.
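
As an illustration of the general recipe rather than the specific methods used by OpenAI, Columbia, or CSAIL, the sketch below pairs a tactile-only policy network with a single REINFORCE-style policy-gradient update in PyTorch. The sensor and joint dimensions, network architecture, and placeholder reward signal are all assumptions made for the example.

```python
import torch
import torch.nn as nn

# A hypothetical tactile-only policy: fingertip pressure readings in,
# joint commands out. All dimensions are illustrative assumptions.
class TactilePolicy(nn.Module):
    def __init__(self, n_taxels: int = 32, n_joints: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_taxels, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, n_joints), nn.Tanh(),  # bounded joint commands
        )

    def forward(self, tactile: torch.Tensor) -> torch.Tensor:
        return self.net(tactile)

policy = TactilePolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=3e-4)

# One REINFORCE-style update on a synthetic "rollout": actions that earned
# higher reward are nudged to become more likely. A real system would get
# tactile readings and rewards from a physics simulator or the robot itself.
tactile_obs = torch.randn(64, 32)        # 64 timesteps of touch data
mean_action = policy(tactile_obs)
dist = torch.distributions.Normal(mean_action, 0.1)
actions = dist.sample()
rewards = torch.rand(64)                 # placeholder reward signal
log_prob = dist.log_prob(actions).sum(dim=-1)
loss = -(log_prob * (rewards - rewards.mean())).mean()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```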

Giving robots awareness of their hands and feet

The final sense of touch that humans need to pass on to robots is self-awareness akin to human proprioception. Proprioception, often called the “sixth sense,” is vital in enabling humans to navigate their environment precisely and gracefully by understanding how their limbs move. It's an internal compass that allows us to move seamlessly without needing constant conscious monitoring of every movement.

Recent breakthroughs in robotic proprioception at Carnegie Mellon University have propelled us closer to imbuing robots with a simulated version of this remarkable ability. Picture a robot that can not only comprehend its physical form, but also construct an intricate internal model of its limbs and their positions within three-dimensional space.
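
A simple way to picture such an internal model is forward kinematics: given the angle of each joint and the length of each link, a robot can compute where every part of its limb sits in three-dimensional space. The Python sketch below is a deliberately simplified, hypothetical illustration of that idea, not a description of the Carnegie Mellon system.

```python
import numpy as np

def rot_z(theta: float) -> np.ndarray:
    """Rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def limb_positions(joint_angles, link_lengths):
    """Forward kinematics for a simple serial chain: returns the 3D
    position of each joint, given each joint angle (about z) and the
    length of each link. Angles and lengths here are illustrative."""
    positions = [np.zeros(3)]
    orientation = np.eye(3)
    for theta, length in zip(joint_angles, link_lengths):
        orientation = orientation @ rot_z(theta)
        positions.append(positions[-1] + orientation @ np.array([length, 0.0, 0.0]))
    return positions

# Example: a two-link "arm" with joints at 30 and 45 degrees.
for p in limb_positions([np.radians(30), np.radians(45)], [0.3, 0.25]):
    print(np.round(p, 3))
```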

This development isn't about instilling a sense of self in machines; it's about equipping them with a foundational framework to execute tasks with safe precision around human co-workers.

Visualize a manufacturing facility where robots and humans work hand in hand, using virtual proprioception to execute intricate tasks with less need for external safeguards. These scenarios are no longer confined to the distant future.

Furthermore, breakthroughs in robotic proprioception could also save lives in the real world in the future. Imagine a robot navigating effortlessly through an area devastated by a natural disaster, adeptly recognizing obstacles and tactically planning its path to reach survivors.

These replications of human functions in robots aren't signs of a robotic takeover or a post-apocalyptic future; instead, they're steps on the path to human-robot harmony. Step by step, progress in touch brings us closer to a harmonious coexistence of humans and machines.

The convergence of tactile understanding and robotics promises increased productivity. Ultimately, the union of robotics and the sense of touch could shift the limits of machine capabilities. With technologists pushing the boundaries of tactile perception, the dream of robots with an intricate grasp of human touch is no longer distant.

Milan Kocic, Hexagon Sixth Sense

About the author

With more than 25 years of experience in advanced manufacturing, Milan Kocic is the founder and head of Hexagon's Sixth Sense accelerator program and open innovation platform. His stated mission is to create an environment where startups can explore possibilities for growth with Hexagon’s network of products, services, and customers.

Kocic has spent the past 15 years bringing new Hexagon products and services such as PULSE and MyCare to market. He holds more than 30 patents in many different areas of advanced manufacturing.

