From swarming devices that get your online food order ready for delivery to the autonomous vehicles that will bring it to your front door, robots are poised to revolutionise the retail sector. Helen Knight reports.
A warehouse technician takes out a component for a maintenance check. Without a word, his eager assistant immediately slides over to offer another pair of hands with the task. Unlike most assistants, this one never gets tired or has to nip off for a comfort break, because ARMAR-6 is a robot.
The prototype robot was recently delivered to Ocado Technology’s robotics research lab, where the online grocer’s team of engineers will experiment with the use of the technology in maintaining and repairing automation equipment.
The robot is the first prototype developed as part of the EU-funded SecondHands project, which is aiming to develop collaborative bots that can assist technicians working in Ocado’s automated warehouses, known as customer fulfilment centres (CFCs).
More widely, ARMAR-6 is part of a growing robot workforce that is changing the way the retail industry operates, whether it is in the warehouse, on the road, or in the store.
At Ocado, for example, as well as designing a second pair of hands for the company’s maintenance crew, roboticists are developing robots to pick and pack the 50,000 different items the grocer stocks.
The company has recently developed an articulated robot arm equipped with a suction cup and a 3D vision system that allows it to pick up thousands of different objects without damaging them, according to Graham Deacon, the robotics research team leader at Ocado Technology.
Rather than creating a model of each item to be picked, which would be extremely time-consuming, the engineers have developed a vision system that can identify the best grasp point on any object it sees. The system then lowers the articulated arm into the crate, where the suction cup, which is connected via a pipe to an air compressor, creates an airtight seal with the item’s surface.
Sensors ensure the arm does not damage the item during picking, and the vision system then determines the right orientation to rotate it to, before placing it in the bag. “We are still in the process of quantifying how many different items the robot can pick up, but we expect it to be able to handle thousands of items,” said Deacon.
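The pipeline Deacon describes — find a graspable patch, lower the cup, form a seal — can be sketched in miniature. The snippet below is purely illustrative and not Ocado’s system (the height map, scoring rule and function names are all invented): it favours the highest, flattest point in a crate, since a suction cup needs a level surface to form an airtight seal.

```python
def local_variance(height_map, r, c):
    """Variance of the 3x3 neighbourhood around (r, c) — a crude flatness measure."""
    vals = [height_map[r + dr][c + dc] for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
    mean = sum(vals) / 9
    return sum((v - mean) ** 2 for v in vals) / 9

def best_grasp_point(height_map):
    """Return the (row, col) of the highest, flattest interior cell.

    A stand-in for the learned grasp-point selection described in the
    article: high means the item is on top of the pile, flat means the
    suction cup can seal against it.
    """
    best, best_score = None, float("-inf")
    rows, cols = len(height_map), len(height_map[0])
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            score = height_map[r][c] - local_variance(height_map, r, c)
            if score > best_score:
                best, best_score = (r, c), score
    return best

# A toy depth map of a crate: a single raised, flat item in the middle.
crate = [
    [0, 0, 0, 0, 0],
    [0, 5, 5, 5, 0],
    [0, 5, 5, 5, 0],
    [0, 5, 5, 5, 0],
    [0, 0, 0, 0, 0],
]
print(best_grasp_point(crate))  # the flat centre of the item
```

In the real system a learned model scores candidate grasps from 3D vision; the flatness heuristic here just makes the idea concrete.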
Similarly, the team has been working on a soft-handed picking robot, capable of handling even delicate items such as fruit and vegetables without damaging them, as part of the EU-funded SoMA project. The project is investigating the use of a compliant gripper such as the RBO Hand 2, developed by the Technische Universität Berlin, which uses flexible rubber materials and pressurised air to passively adapt its grasp.
“If the robot is going to pick up a bunch of bananas, it will shape itself to the particular bunch it is picking up,” said Deacon.
The robot’s vision system is being designed to analyse the environment in which the object is placed, to determine if there is anything the gripper can use to help it pick up the item, such as the surface on which it is sitting, he said.
Ocado has invested heavily in robotics in recent years. Its fulfilment centres are highly automated, in particular its Andover and soon-to-be-opened Erith facilities, which are equipped with technology known as the Ocado Smart Platform (OSP). In the OSP, a swarm of robots picks items from a 3D grid, or hive, said Greg Hutton, head of construction and engineering.
“Our bots sit on rails and move left and right,” said Hutton. “They can lower a gripper down to the box, which lifts it up into the belly of the bot, and then moves it into another position, or to a pick station or outlay point,” he said.
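Hutton’s description — bots traversing rails above a grid of stacked boxes, lifting the top box into their belly — might be modelled like this. This is a toy sketch for illustration only; the class, the hive layout and the method names are invented, and real bots must plan paths and avoid one another:

```python
class GridBot:
    """Toy model of an OSP-style grid bot: it moves along x/y rails above
    a hive of stacked boxes and lifts the top box of a stack into its belly."""

    def __init__(self, x=0, y=0):
        self.x, self.y = x, y
        self.carrying = None

    def move_to(self, x, y):
        # Real bots move one rail axis at a time and coordinate with the
        # swarm to avoid collisions; that is all omitted here.
        self.x, self.y = x, y

    def retrieve(self, hive, x, y):
        """Drive over stack (x, y) and lift its topmost box."""
        self.move_to(x, y)
        self.carrying = hive[(x, y)].pop()  # top box rises into the bot
        return self.carrying

# The hive is just stacks of boxes keyed by grid position.
hive = {(2, 3): ["flour", "milk"]}
bot = GridBot()
bot.retrieve(hive, 2, 3)  # lifts "milk", the box on top of the stack
```

The key property the sketch captures is that only the top of a stack is reachable, which is why the swarm spends much of its time shuffling boxes to dig items out.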
Similarly, robots are now used to move shelves to the human pickers in a handful of Amazon’s 16 UK fulfilment centres. The robots, called drives, slide under the shelves and move them around the facility as needed.
Outside its fulfilment centres, the online retailer is also developing the Prime Air service, which it hopes will ultimately see packages weighing up to 2.3kg delivered to customers by autonomous, GPS-guided drones in 30 minutes or less.
The drones have been carrying out test deliveries to a small group of customers in Cambridgeshire in the UK, as part of a private trial, and the company hopes to widen their use soon.
Starship Technologies, meanwhile, is running trials of autonomous delivery robots on the streets of the UK, Germany and Switzerland. Working with German retailer Metro Group, as well as takeaway delivery company Just Eat and parcel service Hermes, Starship has deployed dozens of robots in five cities to run test deliveries.
The robots are designed to deliver groceries, food and packages to consumers within a two-to-three-mile radius, driving autonomously while being monitored by human operators in a control centre.
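A delivery radius like that is easy to reason about with the standard haversine great-circle formula. The sketch below is not Starship’s actual logic — the hub coordinates and the three-mile cut-off are illustrative — but it shows how a service might decide whether an address is within a robot’s range:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, via haversine."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def serviceable(hub, address, max_miles=3.0):
    """True if the address lies within the delivery radius of the hub."""
    return distance_miles(*hub, *address) <= max_miles

hub = (51.50, 0.00)                       # hypothetical robot base
print(serviceable(hub, (51.52, 0.00)))    # ~1.4 miles away: in range
print(serviceable(hub, (51.56, 0.00)))    # ~4.1 miles away: out of range
```

In practice road distance, not straight-line distance, would set the real limit, so a production check would query a routing engine instead.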
Starship recently announced a partnership with Mercedes-Benz Vans to develop ‘Robovan’, a specially adapted van designed to carry eight autonomous delivery robots. The van will drive to a city or town and stop in a designated location, said Noel Sharkey, emeritus professor of artificial intelligence and robotics at the University of Sheffield.
“The idea is they will drive to the outskirts of a town or city, and then release all of the robots to deliver the goods,” he said.
Last summer, Ocado also ran an autonomous delivery trial in south-east London, using a self-driving truck developed by Oxford University spin-out Oxbotica.
Robots are even finding their way on to the shopfloor itself. In the US, robotics firm Bossa Nova is testing autonomous service robots in 50 Walmart stores throughout the country. The robots travel up and down the aisles, taking images of the shelves and using AI to calculate the status of different products, including their location, price and any that are out of stock.
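Once the computer vision has identified what is actually on each shelf, the Bossa Nova robots’ job reduces to reconciling that against what should be there. A hypothetical sketch — the planogram format and field names are invented, not Bossa Nova’s — makes the bookkeeping concrete:

```python
def audit_shelf(planogram, detected):
    """Compare a shelf plan against what the robot's cameras saw.

    planogram: slot -> product expected in that slot
    detected:  slot -> product actually seen there, or None if the slot is empty
    """
    report = {"out_of_stock": [], "misplaced": []}
    for slot, expected in planogram.items():
        seen = detected.get(slot)
        if seen is None:
            report["out_of_stock"].append(expected)
        elif seen != expected:
            report["misplaced"].append((slot, expected, seen))
    return report

plan = {"A1": "beans", "A2": "soup", "A3": "rice"}
seen = {"A1": "beans", "A2": None, "A3": "pasta"}
print(audit_shelf(plan, seen))
# soup is out of stock; slot A3 holds pasta instead of rice
```

A real deployment would also flag pricing-label mismatches, which the article notes the robots check for as well.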
In Japan, SoftBank Robotics’ Pepper robot is already being used by more than 2,000 companies, for tasks such as communicating with customers about services and products offered by the retailer, and guiding them around the store. Oliver Lemon, leader of the Interaction Lab research group at Heriot-Watt University, is experimenting with Pepper as part of the four-year, EU-funded MuMMER (MultiModal Mall Entertainment Robot) project involving SoftBank Robotics Europe.
The project, which also includes researchers from Glasgow University, VTT Technical Research Centre of Finland, LAAS-CNRS in France, and the Idiap Research Institute in Switzerland, is aiming to develop a humanoid robot, based on the Pepper platform, which can interact autonomously and naturally with shoppers within the unpredictable environment of a public mall.
“We’re hoping to build robots that can help people find their way around a big shopping mall, or find products in the supermarket, while being entertaining and fun to use,” said Lemon.
The researchers have been carrying out experiments, including a recent week-long stint in an Edinburgh supermarket, where they have been gathering data on how robots should best interact with people.
“These are things that as humans we don’t even think about. You simply walk up towards someone and start talking, but there are lots of signals going on, such as eye gaze, body orientation and distance,” said Lemon. “They seem mundane, but they’re incredibly important to get right, because otherwise people might find it frightening if a robot drives towards them at high speed.”
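One way to encode the “don’t charge at people” concern Lemon raises is to cap a robot’s speed as a function of its distance to the nearest person. The toy policy below is not MuMMER’s actual controller — every threshold is hypothetical — but it illustrates the proxemics idea: full speed when far away, a smooth slowdown on approach, and a complete stop outside a respectful “social distance”.

```python
def approach_speed(distance_m, social_distance=1.2, max_speed=0.7):
    """Speed cap (m/s) for a robot approaching a person distance_m away.

    Stops entirely inside the social distance, then ramps up linearly
    with distance until the platform's maximum speed is reached.
    (All numbers are illustrative, not from the MuMMER project.)
    """
    if distance_m <= social_distance:
        return 0.0
    return min(max_speed, 0.25 * (distance_m - social_distance))

print(approach_speed(10.0))  # far away: full speed, 0.7 m/s
print(approach_speed(2.0))   # closing in: slowed to 0.2 m/s
print(approach_speed(1.0))   # inside social distance: stopped
```

A real controller would fold in the other cues Lemon mentions — gaze and body orientation — to decide whether the person wants to be approached at all.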
But if the move towards the greater use of robotics in retail continues at its current pace, we may all have to start getting used to robots driving up to us to deliver our groceries, or to point us in the direction of the chilled food aisle.