The future of robotics: A convergence of the physical and digital
Over the last 15 years, robotics has changed the way we work, adding never-before-seen value to society.
Robotics, what’s it all about? Well, at a general level it’s about automating functions across the social and enterprise spectrum, reducing the cost of goods and freeing up employees and citizens to become more efficient and care-free.
The impact of this robotic revolution is growing, and now being felt in multiple channels. You only have to look at the incredible rise of robotic process automation companies, such as UiPath, Blue Prism and Automation Anywhere, to truly understand the scale of this industry.
“With RPA, there is no limit to where it can spread; there is no part of any business that cannot benefit. RPA is therefore right at the start of its life. Ultimately, we believe that every person will have their own robot,” said Guy Kirkwood, chief evangelist at UiPath, during a recent interview with Information Age.
Rate of change
On the software side there are technologies such as RPA. On the hardware side, we’re talking about drones, driverless cars and even military robots.
Industrial robots have been around for some time, but the boundaries between humans and robots were once distinct. Now, the line is blurring as advances in technologies, such as chatbots, come to the fore.
At LGIM’s Annual Investment Conference, Dr Illah Nourbakhsh, professor of robotics at Carnegie Mellon University, discussed the ways in which this line is fading and how robotics is changing both business and society by fusing the physical and digital worlds.
During his talk, Dr Illah Nourbakhsh highlighted how robots and humans are becoming indistinguishable.
Don’t get distracted
With computers increasing in processing power on a Moore’s Law trajectory, the capabilities of robots are going to keep improving, but will they ever reach the point of ‘self-awareness’? Nourbakhsh disagrees, and dismisses the idea of machine consciousness: “it’s distracting us talking about that from what actually is happening vis-à-vis power relationships and robotics,” he said.
Instead, “we’re going to solve many social problems with machines.”
Some technologists also believe that we’re on the verge of an era of immortality, though not in the traditional sense of the word.
The hybridisation of robots and humans is already under way. There are ‘simple’ examples in healthcare, where robotic limbs are connected to the user’s brain so that an arm, for example, feels like their own. It becomes an extension of that person, and adds to their sense of being.
On the other side, robots are increasingly acting like humans, because the best way to interact with humans is to imitate them. But, to get this right we need to eradicate racial, gender, sexual and background prejudices that have stained society. Why? Because our robots and AI systems are going to use the social norms that we’re used to.
Advancing human life, safety and convenience
Some 300 years ago, prison architecture changed so that a guard could always see the prisoners, a design known as the panopticon. This, in turn, changed the prisoners’ behaviour.
Information and knowledge about individuals can change their behaviour. In London, for example, the installation and positioning of some 420,000 cameras has changed the dynamics of crime.
But, and this is where robotics and automation come in, there is too much data to process manually.
Computer vision technology and data are colliding to create incredible insights. Computer vision is becoming genuinely useful because computers are getting faster and can collect behavioural data on millions of people, which AI can then use to tailor services and help law enforcement keep citizens safe. This is a good example of the virtual impacting the physical.
Nourbakhsh points to another example at McDonald’s. Certain locations experience very high demand, so the company wanted to be able to detect at what time of day different demographics were using the fast food restaurant’s service, which tells staff how many of which food products to prepare.
Computer vision can detect what food the passengers in a specific car are more likely to buy: Toyota Prius owners, for example, are more likely to buy fish filet burgers, or so the data suggests. This information is then relayed to the kitchen, and employees know what to cook before the car has even placed its order. “It’s a triple win: less waste, less of a queue and it’s more cost-efficient,” said Nourbakhsh.
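In spirit, the system Nourbakhsh describes boils down to a lookup from a detected car model to the items the kitchen should start preparing. The sketch below is purely illustrative: the mapping, function name and all entries except the Prius-to-fish-filet pairing cited in the talk are assumptions, not McDonald’s actual system.

```python
# Hypothetical sketch of the drive-through prediction step.
# Only the Prius example comes from Nourbakhsh's talk; everything
# else here is invented for illustration.

LIKELY_ORDERS = {
    "toyota_prius": ["fish filet burger"],  # example cited in the talk
    "ford_f150": ["double cheeseburger"],   # assumed for illustration
}

DEFAULT_ORDER = ["cheeseburger"]  # fallback when the car model is unknown


def predict_order(detected_model: str) -> list[str]:
    """Return the items the kitchen should start preparing for this car."""
    return LIKELY_ORDERS.get(detected_model, DEFAULT_ORDER)


# A camera feed would supply the detected model; here we call it directly.
print(predict_order("toyota_prius"))
```

In a real deployment the lookup table would be replaced by a model trained on historical order data, but the shape of the pipeline (detect, predict, pre-prepare) is the same.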
Finally, let’s look to Davos, and AI’s role in saving the world.
The mood, according to Nourbakhsh, was “serious and strange, and the parties were less absurd”.
Attendees applauded David Attenborough and the other environmentalists who spoke at the World Economic Forum, yet some 1,500 private jets flew in for the annual forum, and those in attendance were called out for it.
Source: Information Age