Pepper, a humanoid robot, was created in 2014. He enjoyed a brief wave of hype, including a visit to the Financial Times to meet the editor. “This is a robot that acts autonomously out of love,” declared Masayoshi Son, head of SoftBank, a major backer. Alibaba and Foxconn also invested hundreds of millions of dollars in efforts to embed robotics in everyday life. It was not to be. Pepper can still sometimes be found in public libraries in Japan, unplugged and head bowed, like a 4-foot-tall Pinocchio who dreamed of becoming a real boy but never could. Production was halted in 2021, after only 27,000 units had been built.
But the vision of humanoid robots, machines that can do all the jobs we don’t want to do, just as we would, is too appealing to give up for long. Recent dramatic advances in artificial intelligence have sparked a new wave of enthusiasm for robotics. “The next wave of AI is physical AI. AI that understands the laws of physics, AI that can work among us,” Jensen Huang, chief executive of chip designer Nvidia, said earlier this year. Nvidia has ridden the boom in AI model training to become the world’s second-largest company by market capitalization.
Billions of dollars in venture capital are being poured into robotics startups. They aim to apply the same kinds of model-training techniques that let computers predict how proteins fold or produce surprisingly realistic text. The goal is, first, to enable a robot to understand what it sees in the physical world and, second, to enable it to interact naturally with that world, performing simple actions such as picking up and manipulating objects. In effect, trained models would replace the enormous amount of explicit programming such tasks would otherwise require.
It is a seductive dream. But today’s investors and entrepreneurs are likely to end up just as disappointed as those who backed Pepper. That is not because AI is useless. Rather, the obstacles to creating economically viable robots that can make dinner and clean toilets are problems of hardware, not just software, and AI by itself cannot solve them.
These physical challenges are many and difficult. Human arms and legs are moved by muscles, whereas robot limbs must be driven by motors, one for each axis of motion the limb needs. All of this is doable, as factory robot arms demonstrate, but it requires high-performance motors, gears and transmissions, making the machines bulky, costly, power-hungry and prone to component failure.
Once the desired motion is achieved, there is the challenge of sensing and feedback. When you pick up a piece of fruit, the nerves in your hand tell you how soft it is and how hard you can squeeze it. You can taste whether food is cooked and smell whether it is burnt. None of these senses is easy to give a robot, and each adds yet more cost. Machine vision and AI may help, by noting whether the fruit is crushed or the food in the pot is the right color, but they are imperfect substitutes.
Next, there is the issue of power. Autonomous machines need their own energy source. A factory robot arm is connected to mains power; it cannot move around. Humanoid robots will most likely use batteries, which involve trade-offs in size, power, strength, flexibility, operating time, service life and cost. These are just some of the issues. Many smart people are working on them, and progress is being made. But the point is that these are physical challenges, longstanding and difficult, and the AI revolution will not make them go away.
So what does AI make possible in the physical world? Rather than imagining how the technology will enable entirely new machines, it is more realistic to ask how AI will change existing machines when applied to them.
A simple example is self-driving cars. Here the machine itself needs no modification at all: the car’s movement through the physical world and its power source work as before, while the sensing involved in driving is almost entirely visual. The new AI craze has, if anything, dampened the hype cycle around self-driving cars, when it should have done the opposite. Autonomous driving is a huge market and the real-world challenge that AI can most readily tackle, something anyone looking to invest in other applications of robotics should ponder.
It also makes sense to consider how existing robots will evolve, from industrial robot arms to robot vacuum cleaners. AI-powered machine vision quietly expands the range of tasks robotic arms can perform and makes it safer for them to work alongside humans. Lightweight, single-purpose devices such as robot vacuums will become steadily more useful. In China, for example, it is already quite common for hotels to use robots to make deliveries to guest rooms. This kind of limited, controlled autonomy is the easiest to provide.
In this way, AI will gradually bring us closer to androids. Unfortunately for robots like Pepper, it is far easier to build a machine that writes bad poetry than one that can clean a toilet, and that is unlikely to change any time soon.
robin.harding@ft.com