UK, July 23, 2017. – The arrival of robots in the supply chain has been a reality for years, and day by day the automation of logistics processes with robots increases. Here we have talked about autonomous trucks, autonomous ships, drones for warehouse inventory, and more. Today we reflect on the collaboration between logistics professionals and robots. A very current example is the Amazon robots that help prepare orders in the company's warehouses.
Amazon, the e-commerce giant, now has 45,000 robots across 20 fulfilment centres. That is a 50% increase from the same time the year before, when the company said it had 30,000 robots working alongside 230,000 people. Amazon bought a robotics company called Kiva Systems in 2012 for $775 million (£632 million). Kiva's robots automate the picking and packing process at large warehouses in a way that stands to help Amazon become more efficient. The robots, 16 inches tall and weighing almost 145 kilograms, can move at 5 mph and haul packages weighing up to 317 kilograms.
When Amazon acquired Kiva, Phil Hardin, Amazon's director of investor relations, said: "It's a bit of an investment that has implications for a lot of elements of our cost structure, but we're happy with Kiva. It has been a great innovation for us, and we think it makes the warehouse jobs better, and we think it makes our warehouses more productive."

Amazon also uses other types of robots in its warehouses, including large robotic arms that can move large pallets of Amazon inventory. The company has been adding about 15,000 robots year-on-year, according to multiple reports. At the end of 2014, Amazon said it had 15,000 robots operating across 10 warehouses. In 2015, that number rose to 30,000, and now Amazon has 45,000. "We've changed, again, the automation, the size, the scale many times, and we continue to learn and grow there," Amazon CFO Brian Olsavsky said at a conference last April, according to The Times. Olsavsky added that the number of robots used varies from warehouse to warehouse: some facilities are "fully outfitted" with robots, while others don't have the "robot volume" for economic reasons.
Learning to work with robots
Perhaps we are not yet prepared to work with a robot as if it were a human colleague. Perhaps this is a challenge to be solved in the coming decades. Perhaps our sons and daughters, as digital natives, will be the ones who work best with robots. Below we summarize an interesting reflection on the interaction between robots and humans.
Robot panic seems to move in cycles, as new innovations in technology drive fear about machines that will take over our jobs, our lives, and our society, only to collapse as it becomes clear just how far away such omnipotent robots are. Today's robots can barely walk effectively, much less conquer civilization.

But that doesn't mean there aren't good reasons to be nervous. The more pressing problem today is not what robots can do to our bodies and livelihoods, but what they will do to our brains. "The problem is not that if we teach robots to kick they'll kick our ass," Kate Darling, an MIT robot ethicist, said Thursday at the Aspen Ideas Festival, which is co-hosted by the Aspen Institute and The Atlantic. "We have to figure out what happens to us if we kick the robots."

That's not just a metaphor. Two years ago, Boston Dynamics released a video showing employees kicking a dog-like robot named Spot. The idea was to show that the machine could regain its balance when knocked askew. But that wasn't the message many viewers took away. Instead, they were horrified by what resembled animal cruelty. PETA even weighed in, saying that "PETA deals with actual animal abuse every day, so we won't lose sleep over this incident," but adding that "most reasonable people find even the idea of such violence inappropriate."
The Spot incident, along with the outpouring of grief for the "Hitchbot" (a friendly android that asked people to carry it around the world, but met an untimely demise in Philadelphia), shows the strange ways humans seem to bond with robots. Darling reeled off a series of other examples: People name their Roombas and feel pity for them when they get stuck under furniture. They are reluctant to touch the "private areas" of robots, even only vaguely humanoid ones. Robots have been shown to be more effective than traditional methods in helping people lose weight, because of the social interaction involved.
People are more forgiving of robots’ flaws when they are given human names, and a Japanese manufacturer has its robots “stretch” with human workers to encourage the employees to think of the machines as colleagues rather than tools. Even when robots don’t have human features, people develop affection toward them. This phenomenon has manifested in soldiers bonding with bomb-dismantling robots that are neither anthropomorphic nor autonomous: The soldiers take care of them and repair them as though they were pets. “We treat them as though they’re alive even though we know perfectly well they’re machines,” Darling said.
That can be good news, whether robots serve as weight-loss coaches or as therapy aides for autistic children, but it also opens up unexplored ethical territory. Human empathy is a volatile, unpredictable force, and if it can be manipulated for good, it can be manipulated for bad as well. Might people share sensitive personal information more readily with a robot they perceive as partly human than they would ever be willing to share with a "mere" computer?