The robotic hand can sense and rotate objects in any direction and orientation, a task the Bristol team said it can perform even when the hand is upside down, something that had never been done before.
According to the researchers, improving the dexterity of robot hands could have significant implications for automating tasks such as handling goods for supermarkets or sorting through waste for recycling.
In 2019, OpenAI became the first to demonstrate human-like feats of dexterity with a robot hand, though its 20-strong robotics team was disbanded soon after the launch. OpenAI's set-up used a cage holding 19 cameras and over 6,000 CPUs to train the huge neural networks that controlled the hand, an operation that came at significant cost.
Professor Nathan Lepora and his colleagues said that they wanted to see if similar results could be achieved using simpler and more cost-efficient methods.
Indeed, in the past year, four university teams (Bristol, along with MIT, Berkeley, and Columbia University in the US) have demonstrated complex feats of robot hand dexterity, from picking up and passing rods to rotating children's toys in-hand, and all have done so using simple set-ups and desktop computers.
According to the Bristol team, the key advance behind these results was that all of the teams built a sense of touch into their robot hands, enabled by advances in smartphone cameras, which are now small enough to fit inside a robot fingertip.
“In Bristol, our artificial tactile fingertip uses a 3D-printed mesh of pin-like papillae on the underside of the skin, based on copying the internal structure of human skin,” Professor Lepora said in a statement.
“These papillae are made on advanced 3D-printers that can mix soft and hard materials to create complicated structures like those found in biology.
“The first time this worked on a robot hand upside-down was hugely exciting as no-one had done this before. Initially the robot would drop the object, but we found the right way to train the hand using tactile data and it suddenly worked, even when the hand was being waved around on a robotic arm.”
Looking ahead, the researchers aim to go beyond pick-and-place or rotation tasks and move on to more advanced examples of dexterity, such as assembling items like Lego.
The research, funded through a Leverhulme Trust Research Leadership Award, is available to read online.