Michael Kenward visits Amazon’s Baltimore technology test-bed and explains how robotics underpins a fundamental change in its operations.
After you negotiate airport-style security and walk past the posters advertising free flu jabs, the first impressions on entering Amazon’s high-tech warehouse near Baltimore are the noise and the lack of people.
The din comes from miles of conveyors that carry the many thousands of packages that leave each ‘fulfilment centre’ – as Amazon dubs its warehouses – every day.
A product’s journey through Amazon’s system begins with what at first glance appears to be a counter-intuitive approach to stowing: inventory entering the warehouse is deliberately distributed at random by teams of “Stowers” into bins within 2-metre-high storage pods carried by mobile robot drive units.
Such a chaotic approach to storing inventory – dubbed random stow by Amazon – would once have been unthinkable. But by distributing items randomly in this way, and using an overarching IT system known as Amazon Web Services (AWS) to bring together the data on every single item in the system, pickers can locate products far more quickly than if they had to visit a dedicated shelf for each product.
The system also makes far more efficient use of every available inch of space. “We can squeeze more stuff into the same footprint,” explained Tye Brady, Chief Technologist of Amazon Robotics. “We want our large objects to be mixed with our small objects to be mixed with our medium objects because it volume optimises.”
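The logic of random stow can be sketched in a few lines. This is a hypothetical illustration, not Amazon’s implementation: the `Warehouse` class, its bin-volume model and the `pod_distance` callback are all invented for the example. The key idea from the article survives, though – any bin with room is a valid home for any item, a central record tracks every copy, and a picker is sent to the nearest one.

```python
import random

class Warehouse:
    """Toy model of random stow: no dedicated shelf per product."""
    def __init__(self, n_pods, bins_per_pod, bin_volume):
        # free volume remaining in each bin, keyed by (pod, bin) location
        self.free = {(p, b): bin_volume
                     for p in range(n_pods) for b in range(bins_per_pod)}
        self.locations = {}  # item id -> list of (pod, bin) holding it

    def stow(self, item, volume):
        # any bin with enough room will do, chosen at random
        candidates = [loc for loc, v in self.free.items() if v >= volume]
        loc = random.choice(candidates)
        self.free[loc] -= volume
        self.locations.setdefault(item, []).append(loc)
        return loc

    def pick(self, item, pod_distance):
        # the central record lets a picker go to the *closest* copy
        return min(self.locations[item], key=pod_distance)
```

Mixing large, small and medium items across bins is what lets the scheme “volume optimise”: every stow simply looks for leftover space rather than a matching shelf.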
Although the system is underpinned by robotics and automation, Baltimore’s technology test-bed still employs around 3,000 workers, or “associates” as the company calls them, most of whom stand at picking stations waiting for the robots to glide up bearing their pods.
Pickers then consult a screen and select from the appropriate bins whatever customers have ordered. These items are placed into “totes” that travel along conveyor belts to packing stations where workers put orders into the familiar cardboard boxes that go back on to the conveyors where more robots label the boxes for their destinations.
Baltimore, 10/18/18: operations at Amazon’s Baltimore Fulfillment Center. Credit: Marty Katz/baltimorephotographer.com

The rise of the robots at Amazon started in earnest in 2012, when the company paid $775 million for the robotics company Kiva Systems (now known as Amazon Robotics). The acquisition, Amazon’s second biggest deal at the time, was for a company, formed in 2003, that had previously raised a mere $33 million in funding. Amazon Robotics has since produced more than 100,000 mobile robots or drive units for the company.
Since the acquisition the technology has evolved, and today’s fourth generation robots have more intelligence and carrying capacity in smaller devices than earlier models.
The robots are part of what Tye Brady describes as “a symphony of humans and machines working together”. But in reality, ‘working together’ means that robots and humans are kept well apart. The robots carry those heavy pods around huge, football-pitch-sized caged areas, on floors with bar codes that tell them exactly where they are, and communicate over a WiFi network with a central control system.
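The floor barcodes effectively turn the warehouse into a grid of known cells, so navigation reduces to stepping from one cell to the next. The sketch below is an assumption about how such a scheme could work in its simplest form – a one-axis-at-a-time Manhattan walk, with each “step” standing in for the robot reading the next floor barcode; the real control system is far more sophisticated.

```python
def next_step(pos, goal):
    """One-axis-at-a-time Manhattan step from pos toward goal (grid cells)."""
    x, y = pos
    gx, gy = goal
    if x != gx:
        return (x + (1 if gx > x else -1), y)
    if y != gy:
        return (x, y + (1 if gy > y else -1))
    return pos  # already at goal

def drive(start, goal):
    """Walk the grid, recording each barcode cell the robot crosses."""
    pos, path = start, [start]
    while pos != goal:
        pos = next_step(pos, goal)  # equivalent to scanning the next barcode
        path.append(pos)
    return path
```

Because every cell is uniquely labelled, the central system always knows each unit’s exact position from its last scan – there is no drift to correct, which is what makes coordinating thousands of robots in one cage tractable.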
Robots can be dangerous, especially when carrying heavy pods and moving around close to one another. Operators can’t just walk into a cage to pick up things that have fallen out of a tote or to fix ailing robots.
Until recently, staff had to use a tablet to plot their intended path through the enclosure before they could enter the robots’ work zone. That would then tell the control system to keep the devices away from the person. But Amazon has now started trials of a new robot avoidance technology in Baltimore, with RF beacons built into the drive units. Instead of mapping a route on a tablet, the operator dons a “robotic vest” with an RF transmitter. Tapping a tablet creates an RF safety shield around the human worker and the control network causes the robots to slow down, and if necessary halt operation, as the human approaches.
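The vest’s “safety shield” can be pictured as concentric rings around the worker’s beacon: robots slow inside an outer ring and halt inside an inner one. The thresholds and speed factors below are invented for illustration – Amazon has not published these values – but the slow-then-stop behaviour matches the article’s description.

```python
# Hypothetical thresholds; the real system's values are not public.
SLOW_RADIUS = 8.0   # metres: reduce speed inside this ring
STOP_RADIUS = 3.0   # metres: halt entirely inside this ring

def speed_command(full_speed, distance_to_worker):
    """Speed a drive unit should adopt given its range to the RF beacon."""
    if distance_to_worker <= STOP_RADIUS:
        return 0.0                  # halt operation
    if distance_to_worker <= SLOW_RADIUS:
        return full_speed * 0.25    # creep while the human is near
    return full_speed               # normal operation
```

The advantage over the old tablet-plotted route is that the shield moves with the worker: no path has to be declared in advance, and a wrong turn inside the cage is no longer dangerous.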
The Baltimore site is also a proving ground for new technologies to track the bar codes on the assorted bins, totes, robots, and products flowing in and out of the warehouse – a situation made all the more challenging by the firm’s random stow approach to storage.
Automated optical tracking is making deeper inroads into the process in an attempt to reduce the need to use handheld scanners to read codes to record the movements of products. Cameras also scan codes on totes to monitor their progress through the system of conveyor belts. An optical computer vision system also monitors the contents of every bin in a pod.
Hand-held barcode scanners can get in the way of picking and packing. As Eli Gallaudet, a software development manager at Amazon, described it, the sight of pickers tucking a scanner under their chin as they tried to stow items in bins prompted the search for better ways of keeping track of items. The quest was on to “get this hand-held scanner out of the way,” explained Gallaudet.
The idea of “wiring up” workers with wrist straps did not go down well when news got out. Instead, the engineers set out to devise a computer vision system that watched workers’ hands as they stowed items. “It has to be highly accurate,” said Gallaudet. The answer was to use two cameras and complex algorithms to track hands and objects, match that to the layout of the bins on a pod, and figure out which bin an item went into. Gallaudet wouldn’t disclose how accurate the system is, but admitted that initially it was not accurate enough.
The next move was to enlist Amazon’s machine learning tools and Amazon Web Services (AWS), the company’s cloud-based computer system, to train the algorithms to predict which bin an item would end up in. The end result, said Gallaudet, was “a very high accuracy on a large percentage of stows”.
The power of AWS also sits behind Amazon’s design and implementation of roboticised warehouses. It is impractical to test 5000 robots, so before a robot gets anywhere near a warehouse, it is modelled to death. “Before a building goes live we have simulated the whole building,” said Joe Quinlivan, President and COO at Amazon. “When a building goes up, we know it is going to work.”
Software is also central to the running of a robotic warehouse. The drive units can diagnose their own health, and can work out when they need to go offline and recharge their batteries. The AWS central “nervous system” also deploys machine learning and artificial intelligence to predict when a robot needs a motor changed.
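That self-monitoring loop can be boiled down to a simple status decision per robot. The thresholds here are invented for illustration – in reality the maintenance side is a learned prediction, not a fixed cutoff – but the sketch shows the shape of the logic: recharge takes priority, then wear flags, then normal service.

```python
# Hypothetical thresholds; real values and models are not public.
LOW_BATTERY = 0.2        # recharge below 20% charge
MOTOR_DRAW_LIMIT = 1.3   # flag a motor drawing >30% above its baseline

def robot_status(battery_frac, motor_current, baseline_current):
    """Decide a drive unit's next state from its own telemetry."""
    if battery_frac < LOW_BATTERY:
        return "recharge"      # take itself offline to charge
    if motor_current > MOTOR_DRAW_LIMIT * baseline_current:
        return "maintenance"   # predicted motor wear: schedule a swap
    return "in_service"
```

Running this decision centrally over the fleet’s telemetry, rather than on each robot alone, is what lets the system schedule motor swaps before a unit actually fails on the floor.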
Amazon’s use of technology to monitor its workforce regularly creates headlines. The robots would attract far less media interest if automation did not threaten to take people out of the equation. Brady, who has clearly been asked many times whether robots will replace people, rebutted the idea forcefully. “Humans will always be central to our equation,” he insisted. “There is no such thing as a ‘lights out’ fulfilment centre, it just doesn’t work.”
John Felton, the company’s vice-president of Global Customer Fulfilment, threw in some numbers. Over the period in which it has introduced robots, he said, the company has also added 300,000 people to its workforce. His take on new technology is that it is there “to improve the associates’ experience”. What about productivity? That too, he agreed, but when it comes to putting numbers on the gains, Amazon won’t let on.