9 Sources
[1]
Robot can beat elite players at table tennis
Few sports require a more finely honed combination of speed, perception and skill than table tennis. Watching a professional game is a jaw-dropping experience that shows how a player can, through training and physical prowess, estimate ball speed and spin with astounding accuracy, and, by combining this with fast reflexes and agility, achieve the tactical gameplay and incredible pace the sport is known for. Writing in Nature, Dürr et al. introduce a new player that has come to the fore: an artificial-intelligence agent that combines a robotic arm with an AI-based control system. The system, called Ace, can not only challenge professional players, but also provide valuable insights into human strategy and movement.

Designing and making a robot that can play professional-standard table tennis is no simple endeavour: building artificial entities that can detect an environmental change, decide how to react and then implement that reaction at speeds that enable them to compete with humans is a challenge across many fields of engineering. One relevant example is car-racing simulations, a contest in which other agents perform the same or competing actions. In 2022, researchers at the multinational firm Sony AI reported an AI system called GT Sophy that could beat championship players at the racing simulation game Gran Turismo. But in the work by Dürr and colleagues, the battlefield for assessment wasn't a simulation but a real table-tennis table, complete with real rackets and ball.

Ace comprised three modules: a high-speed perception system, a control system and a robotic arm. The perception system used conventional cameras to locate the ball and three 'gaze control systems' that estimated the rate at which it was spinning, known as its angular velocity. The direction and rate of a table-tennis ball's spin determine its trajectory -- a skilled player can give the ball a desired spin to deliver shots that are difficult for their opponent to return.
The perception system's cameras, which were all located outside the court, covered the entire playing area. This information was used by the AI-based control system to direct serves and returns during gameplay. The control system's decisions were enacted by the hardware of the robotic arm: a custom platform with eight independently controlled joints, designed to deliver shots in a manner comparable to those of professional players.

Every device and capability that must be integrated into Ace adds complexity, but two aspects of the system deserve particular attention. First, there is the use of AI. The angular velocity of the ball was estimated using a type of algorithm called a convolutional neural network (CNN), which is typically used to classify images. Information on the position and speed of the ball was then passed to the control system, which was trained to play rallies using simulated table-tennis shots. In Dürr and colleagues' approach, which is an example of a process called 'deep reinforcement learning', the decision-making part of the algorithm, called the actor, was scored by another program, called the critic. Through this process, the system learnt actions that enabled it not only to rally, but also to give its returning shots desired characteristics such as topspin. Finally, the authors used a genetic algorithm -- which finds the best solutions to a problem by mimicking biological evolution -- to develop a library of serves for the system to use.

The second noteworthy aspect of the AI-based system was the role of humans in its development. In table tennis, players serve by tossing the ball up before striking it with the racket. Ace's tosses were based on human demonstrations, adapted to the robot's motion capabilities so that the final serve adhered to the official rules of the game. Expert players informed the genetic algorithm, determining which of the possible serve strategies were challenging enough to be used.
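The actor-critic process described above can be sketched in miniature. The toy example below is illustrative only and is not Sony's system: the "action" is a single hypothetical racket-angle number, the actor is a Gaussian policy over that number, and the critic is a running estimate of expected reward that serves as the actor's baseline.

```python
import random

# Toy actor-critic loop in the spirit of the deep-reinforcement-learning
# process the article describes. All quantities here are invented for
# illustration; a real controller acts on high-dimensional joint commands.

random.seed(0)

TARGET_ANGLE = 0.3   # hypothetical ideal racket angle for a topspin return

def reward(angle):
    """Higher reward the closer the chosen angle is to the (unknown) target."""
    return -(angle - TARGET_ANGLE) ** 2

actor_mean = 0.0     # the actor's policy: mean of a Gaussian over angles
critic_value = 0.0   # the critic's running estimate of expected reward
ACTOR_LR, CRITIC_LR, NOISE = 0.05, 0.1, 0.1

for _ in range(20000):
    action = actor_mean + random.gauss(0.0, NOISE)  # actor explores
    r = reward(action)
    advantage = r - critic_value                    # critic scores the action
    # Move the policy toward actions that scored better than the baseline
    actor_mean += ACTOR_LR * advantage * (action - actor_mean)
    critic_value += CRITIC_LR * (r - critic_value)  # critic tracks the average

print(round(actor_mean, 2))  # converges near TARGET_ANGLE
```

The same structure (a policy improved by a learned value estimate) scales up to the deep networks and simulated rallies the article describes.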
If, during training with a human coach, a particular serve succeeded at least 95% of the time across 20 attempts, it became part of the robot's serve set. Ace played against five elite and two professional players (Fig. 1), beating three of the elite players. It lost to both professional players, winning one game out of seven in the matches played against them. Ace's performance was mainly due to its ability to generate different kinds of spin and its consistency in returning the ball, rather than the use of faster-than-human shots. This is noteworthy, because it might have been expected that specialized machines capable of generating extremely high speeds would rely predominantly on power.

The authors report that Kinjiro Nakamura, a table-tennis player who competed in the 1992 Barcelona Olympics, commented as he watched Ace perform a particular shot: "No one else would have been able to do that. I didn't think it was possible. But the fact that it was possible ... means that there is a possibility that a human could do it too."

Overall, the authors report a successful implementation of a fast-acting AI-based system that operates in a real environment. It must be stressed, however, that Ace relied on guidance and assessment from humans who understood the situations and interactions involved in table tennis. Nevertheless, it is remarkable that human specialists such as Nakamura might learn new skills just by playing against and observing Ace, suggesting that AI-controlled robotic systems could be an arena for human development beyond table tennis.

In 1997, the chess-playing system Deep Blue defeated world champion Garry Kasparov in a six-game match. Ace has yet to reach the equivalent level of performance, and even if it were capable of beating a world-champion table-tennis player, the system is far from being humanoid. For example, unlike human players, it observes the game from multiple points at once.
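The serve-acceptance rule at the start of this passage is simple enough to state as code. A minimal sketch, with a function name and test data of my own choosing rather than the authors':

```python
def accept_serve(attempts):
    """Admit a serve into the robot's repertoire if it succeeded in at
    least 95% of 20 training attempts, per the rule described above.

    attempts: list of 20 booleans, True meaning the serve succeeded.
    """
    assert len(attempts) == 20, "the rule is defined over 20 attempts"
    return sum(attempts) / len(attempts) >= 0.95

# 19/20 successes (exactly 95%) clears the bar; 18/20 (90%) does not.
print(accept_serve([True] * 19 + [False]))       # True
print(accept_serve([True] * 18 + [False] * 2))   # False
```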
AI-based chess engines can now be run on a mobile phone rather than the specialized computer required for Deep Blue -- as autonomous systems become more advanced, Ace might also one day become outdated. Nevertheless, like Deep Blue, Ace is an important milestone, showcasing the potential of the next generation of high-quality, competitive agents that interact with physical environments.
[2]
Table tennis-playing robot on track to becoming world champion
A robot built by Sony AI is rapidly learning how to beat the world's very best table tennis players

Ace, an autonomous robot powered by AI, cutting-edge sensors and an extremely dexterous arm with eight joints, has played competition-rule table tennis and beaten elite human competitors. The robot is the first machine to excel at the sport. It was the cerebral game of chess that was first disrupted by computers, but Ace's success suggests physical sports may be about to have their "Deep Blue" moment - the day, in 1997, that a machine of that name beat world chess champion Garry Kasparov.

"Games have long served as benchmarks for AI, including chess for Deep Blue, but also other games in more recent breakthroughs, like [the Go-playing AI] AlphaGo," says Peter Dürr at Sony AI in Zurich, Switzerland, who led the team that built Ace. But he says those earlier AI milestones were played out online. Ace represents an important advance because it has taken on real-world, professional table tennis champions and held its own. "Ace offers something that has simply never been captured before: a robot and a human in genuine athletic competition," says Dürr.

Ace boasts three main advancements in autonomous robotics, he says. Firstly, it uses "event-based sensors", which means that the robot focuses on certain regions of the images its cameras capture - those indicating changes in motion or brightness, which are critical to tracking the path of the table tennis ball. Next, the robot's table tennis skills are built using "model-free reinforcement learning", which means, says Dürr, the robot "learns through experience in simulation rather than adopting a model of how table tennis should be played". This process was similar to having the robot play a table tennis computer game, and the robot notched up several thousand hours of training during the process. And finally, the team has deployed high-speed robot hardware that allows Ace to play with "human-like agility", says Dürr.
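The "event-based sensors" Dürr mentions report only the pixels whose brightness changed significantly, rather than streaming full frames. A toy sketch of that idea, using tiny synthetic frames and an arbitrary threshold (none of these values come from the study):

```python
# Illustrative event-camera model: compare two frames and emit an "event"
# (row, col, polarity) only where brightness changed by more than a threshold.
# Real event sensors do this asynchronously per pixel in hardware.

THRESHOLD = 10  # arbitrary brightness-change threshold for this toy example

def events(prev_frame, frame):
    """Return (row, col, polarity) for pixels with a significant change."""
    out = []
    for r, (p_row, row) in enumerate(zip(prev_frame, frame)):
        for c, (p, v) in enumerate(zip(p_row, row)):
            if abs(v - p) > THRESHOLD:
                out.append((r, c, 1 if v > p else -1))
    return out

prev = [[100, 100], [100, 100]]
curr = [[100, 130], [80, 100]]   # one pixel got brighter, one darker
print(events(prev, curr))         # [(0, 1, 1), (1, 0, -1)]
```

Because only the changing pixels are reported, a fast-moving ball produces a sparse, low-latency stream of events instead of full images, which is what makes this style of sensing attractive for tracking.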
In some ways, it is even more agile than a human: athletes require around 230 milliseconds to react, he says, whereas the total latency of Ace is only around 20 milliseconds. Currently, the robot looks like something from a factory floor, and relies on a network of cameras and sensors surrounding the table tennis arena. But as the technology advances, the researchers expect Ace will eventually be embodied in a humanoid form.

For the matches played as part of a study published today, Japanese professional table tennis league rules applied as Ace competed against five elite but non-professional players, each of whom had competed for at least a decade and trained 20 hours per week. The robot also took on two professionals. Ace lost only two of its five matches against elite players, but lost both of its matches against professional players. It did, however, achieve a win in one game within one of the professional matches.

Another advantage that Ace has over humans is that it does not give away any tells of its next move. On the other hand, it lacks the capacity to read the body language of its human opponents. "Some of the athletes involved in our experiments commented that they are usually watching their opponent's face - which Ace does not have," says Dürr. Others were surprised by Ace's ability to read the spin of their serves, despite their attempts to hide it with different motions. The robot also confounded its inventors - especially when it was able to hit balls that bounced off the net, which was not a skill it had trained for. This was a skill that just "emerged", says Dürr. Over the past year, since the study was completed, the team has continued to improve Ace's abilities.
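To put the latency figures above in perspective, here is a back-of-envelope calculation of how far the ball travels before each side can begin to react. The ball speed is an assumed round number for illustration, not a measurement from the study.

```python
# How far does the ball travel during each player's reaction latency?
BALL_SPEED = 15.0        # m/s, assumed illustrative shot speed (not from the study)
HUMAN_LATENCY = 0.230    # s, the article's figure for human athletes
ROBOT_LATENCY = 0.020    # s, the article's figure for Ace's total latency

human_travel = BALL_SPEED * HUMAN_LATENCY  # distance covered before a human reacts
robot_travel = BALL_SPEED * ROBOT_LATENCY  # distance covered before Ace reacts

print(f"human: {human_travel:.2f} m, robot: {robot_travel:.2f} m")
# human: 3.45 m, robot: 0.30 m
```

At that assumed speed, a human's reaction window costs more than the full length of a 2.74 m table, while Ace commits to a response with the ball barely off the racket.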
In December 2025, Ace beat a professional player for the first time, and in March 2026, Ace won matches against three more professional players: a female professional, Miyuu Kihara, who is ranked in the top 25 in the World Table Tennis ranking, as well as two male professionals, Tonin Ryuzaki and Fumiya Igarashi. "With further improvements, it should be possible to outperform even the world champion," says Dürr. And improvements go both ways, he says. "Former Olympian Kinjiro Nakamura noted that before watching Ace, he thought a certain shot was impossible, but having seen it, he believes human athletes could replicate this technique."
[3]
Ping Pong Robot Uses Agentic AI to Beat Expert Human Players
Scientists at Sony AI have developed a table tennis robot with enough speed and precision to beat even some expert ping pong players in the latest matchup between biological and artificial intelligence. The robot, dubbed "Ace," combines vision sensors, model-free reinforcement learning and high-speed robotic hardware in a crane-like lever with a ping pong bat attached. The resulting system can autonomously locate a ping pong ball in space, determine the correct technique needed to return it to an opponent's side and repeat the process until a play is over. Robots like Ace are known as AI agents, systems that can reason and take action to solve multi-step problems with limited human supervision. Researchers at Sony AI, a subsidiary of Sony Group Corp., tested Ace in a series of games against highly ranked table tennis players. Without any prior knowledge of the opponents' style of play, the robot won seven out of 13 games against five different elite athletes, defined as those with more than 10 years of intensive training. Against two professional players in officially recognized leagues, Ace won one of seven games. Table tennis requires precise movements, split-second decisions and powerful reactions, making it a case study for how AI systems interact in complex physical situations. While previous robots have been able to hit a ping pong ball back and forth, none have surpassed an amateur level.
The new research, published in Nature, marks the first time a robot has been competitive with experienced players. Scientists have long tried to design robots that can compete against humans in sports, with the aim of developing one with the physical skills required to interact in real time with humans. On April 20, a Chinese humanoid robot completed a half marathon against human racers, winning the competition with a time about seven minutes faster than the men's world record. Electric vehicle maker Tesla Inc. has also pivoted to developing humanoid robots that can perform repetitive or dangerous tasks for humans. The research shows "that an AI system can perceive, reason, and act effectively in complex, rapidly changing real-world environments that demand precision and speed," said Peter Stone, chief scientist at Sony AI, in a statement. "Once AI can operate at an expert human level under these conditions, it opens the door to an entirely new class of real-world applications that were previously out of reach."
[4]
AI robot outplays humans in table tennis milestone
An AI-powered robot has beaten expert table tennis players in a landmark machine-over-human triumph in a major competitive sport. The mechanical maestro, known as Ace, uses a network of cameras and AI to achieve the rapid planning and reaction times needed to compete. The invention made by Japanese tech group Sony highlights how researchers are using AI to improve robots' ability to adapt to physical tasks they have struggled with, particularly those involving people. It follows a race in China at the weekend in which some automated robots outperformed runners, a dramatic improvement on last year. "Table tennis is a game of enormous complexity that requires split-second decisions as well as speed and power," said Peter Dürr, director of Sony AI in Zurich, who led the Ace project. "This research breakthrough highlights the potential of physical AI agents to perform real-time interactive tasks, and represents a significant step toward creating robots with broader applications in fast, precise and real-time human interactions." Ace beat three out of five elite table tennis players who had more than ten years of training, and scored 48 points versus 70 in two defeats to professionals, according to a paper published in Nature on Wednesday. The robot had improved further, Sony said: since the paper's submission, it had played four further matches against humans, beating two elite players and winning one out of two matches against professionals. The robot handled spin and unexpected changes in trajectory caused by the ball clipping the net on its way over, the researchers said. It outscored the elite players in aces -- points won directly from serve -- by 16 to eight. The results show how robots are starting to challenge human sporting capabilities after demonstrating superiority in cognitive and screen-based pursuits such as chess, Go and video games. 
In a half-marathon in Beijing on Sunday, a robot developed by the smartphone company Honor finished in 50 minutes and 26 seconds -- almost seven minutes faster than the men's world record. Researchers not involved in the Ace project hailed it as a milestone but noted the complexity of the visual inputs needed to track the ball's location and velocity. The system included nine cameras and three so-called gaze-control systems, positioned around the table. A big hurdle that remained for robot interactions with humans was how machines behaved when "information is incomplete, situations are ambiguous, and mistakes have serious consequences", said Johannes Köhler, an assistant professor in Imperial College London's Department of Mechanical Engineering. "Those challenges are largely not present here: the robot can see everything it needs to see, the task is highly structured, physical interaction is minimal, and there is little inherent danger," he said. "As a result, while the work is technically impressive, I am not convinced it addresses the core safety and uncertainty issues that currently limit autonomous robots operating around people."
[5]
Ping-pong robot Ace makes history by beating top-level human players
April 22 (Reuters) - An autonomous robot ping-pong player dubbed Ace has achieved a milestone for AI and robotics in Tokyo by competing against and sometimes defeating top-level human players at table tennis, a feat that could presage an array of other applications for similarly adept robots. Ace, created by the Japanese company Sony's (6758.T) AI research division, is the first robot to attain expert-level performance in a competitive physical sport, one that requires rapid decisions and precision execution, the project's leader said. Ace did so by employing high-speed perception, AI-based control and a state-of-the-art robotic system. There have been various ping-pong-playing robots since 1983, but until now they were unable to rival highly skilled human competitors. Ace changed that with its performances against human elite-level and professional players in matches following the rules of the International Table Tennis Federation, the sport's governing body, and officiated by licensed umpires. "Unlike computer games, where prior AI systems surpass human experts, physical and real-time sports such as table tennis remain a major open challenge due to their requirements for fast, precise and adversarial interactions near obstacles and at the edge of human reaction time," said Peter Dürr, director of Sony AI Zurich and leader for Sony AI's project Ace. The project's goal was not only to compete at table tennis but to develop insights into how robots can perceive, plan and act with human-like speed and precision in dynamic environments, Dürr said.
"The success of Ace, with its perception system and learning-based control algorithm, suggests that similar techniques could be applied to other areas requiring fast, real-time control and human interaction - such as manufacturing and service robotics, as well as applications across sports, entertainment and safety-critical physical domains," said Dürr, lead author of a study describing Ace's achievements published on Wednesday in the journal Nature.

In matches detailed in the study, Ace in April 2025 won three of its five matches against elite players and lost its two matches against professional players, the top skill level in the sport. Sony AI said that since then Ace beat professional players in December 2025 and last month. Companies worldwide are making advances with robots. On Sunday, for instance, robots outran human runners in a half-marathon race in Beijing.

'A BLUR TO THE HUMAN EYE'

AI systems already have excelled in digital domains in strategy games such as chess and Go and at complex video games. While video games take place in simulated environments, table tennis requires rapid decision-making, precise physical execution and continuous adaptation to an unpredictable opponent, Dürr said. The ball moves at high speeds with complex spins and trajectories, pushing humans and robots to operate at the limits of sensing, prediction and motor control, Dürr said. Ace's architecture integrates nine synchronized cameras and three vision systems to track a spinning ball with exceptional accuracy and speedy processing time. "This is fast enough to capture motion that would be a blur to the human eye," Dürr said. The researchers developed a custom robot platform featuring eight joints. This was, Dürr said, the minimum number necessary to execute competitive shots: three for the racket's position, two for its orientation and three for the shot's speed and strength.
Mayuka Taira, a professional table tennis player who lost a match to Ace last December, said in comments provided by Sony AI that the robot's strengths "are that it is very hard to predict, and it shows no emotion." "Because you can't read its reactions, it's impossible to sense what kind of shots it dislikes or struggles with, and that makes it even more difficult to play against," Taira said.

Rui Takenaka, an elite-level player who has won and lost matches against Ace, said in comments provided by Sony AI: "When it came to my serve, if I used a serve with complex spin, Ace also returned the ball with complex spin, which made it difficult for me. But when I used a simple serve - what we call a knuckle serve - Ace returned a simpler ball. That made it easier for me to attack on the third shot, and I think that was the key reason why I was able to win."

Ace has room for improvement, Dürr said. "Ace has a superhuman ability to read the spin of incoming balls, and superhuman reaction time. As it learns to play not from watching humans play, but is trained by itself in simulation, it also reacts differently from human players and creates surprising situations," Dürr said. "At the same time, professional human athletes are very good at adapting to their opponent and finding weaknesses, which is an area that we are working on."

Reporting by Will Dunham in Washington; Editing by Daniel Wallis
[6]
A robot is beating human pros at table tennis. Its maker calls it a milestone for machines
A paddle-wielding robot is so adept at playing table tennis that it is posing a tough challenge to elite human players and sometimes defeating them, according to a new study that shows how advances in artificial intelligence are making robots more agile. Japanese electronics giant Sony built the robotic arm it calls Ace and pitted it against professional athletes. Ace proved a worthy adversary, though one with some non-human attributes: nine camera eyes positioned around the court and an uncanny ability to follow the ball's logo to measure its spin. The robot learned how to play the sport using the AI method known as reinforcement learning. "There's no way to program a robot by hand to play table tennis. You have to learn how to play from experience," said Sony AI researcher Peter Dürr, co-author of the study published Wednesday in the science journal Nature. To conduct the experiments, Sony built an Olympic-sized table tennis court at its headquarters in Tokyo to give professional and other highly skilled athletes a "level playing field" with the robot, Dürr said in an interview with The Associated Press. Some of the athletes said they were surprised by Ace's prowess. Sony says it is the "first time a robot has achieved human, expert-level play in a commonly played competitive sport in the physical world -- a longstanding milestone for AI and robotics research." The custom-built robot has eight joints that direct its movements, or degrees of freedom, enabling it to position the racket, execute shots and swiftly respond to its opponent's rallies. "Speed is really one of the fundamental issues in robotics today, especially in scenarios or environments that are not fixed," said Michael Spranger, president of Sony AI, in an interview. "We see a lot of robots that are in factories that are very, very fast," Spranger said. "But they're doing the same trajectory over and over again. 
With this technology, we show that it's actually possible to train robots to be very adaptive and competitive and fast in uncertain environments that constantly change." Spranger said such technology could play a role in manufacturing and other industries. It's also not hard to imagine how such high-speed and highly perceptive hardware could be used in war.

A humanoid robot ran faster than the human world record in a half-marathon race for robots in Beijing on Sunday, but getting a machine to interact and compete at split-second speeds with skilled human athletes is in some ways a more difficult challenge. Spranger said it was important for researchers to not give the robot too unfair of an advantage and make its speed, arm's reach and performance comparable to a skilled athlete who trains at least 20 hours a week. It plays by official table tennis rules on a typically sized court. "It's very easy to build a superhuman table tennis robot," Spranger said. "You build a machine that sucks in the ball and shoots it out much faster than a human can return it. But that's not the goal here. The goal is to have some level of comparability, some level of fairness to the human, and win really at the level of AI and the level of decision-making and tactics and, to some extent, skill." That means, he said, that "the robot cannot just win by hitting the ball faster than any human ever could, but it has to win by actually playing the game."

AI researchers have long used board games like chess as benchmarks for a computer's capabilities. They later moved into more open-ended video game worlds. But moving AI from simulated environments to the physical world has long been the gold standard for robot makers. The past year has marked a "kind of ChatGPT moment for robotics," Spranger said, with new, AI-driven approaches to teach robots about their real-world environments and task them with physically demanding activities, like backflips.
Sony is hardly the first to tackle robots in table tennis. John Billingsley helped pioneer such contests in 1983 in a paper titled "Robot Ping-Pong." More recently, Google's AI research division DeepMind has also tackled the sport. And while impressive, Billingsley said Sony's all-seeing computer vision and motion detection capabilities make it hard for a two-eyed human to stand a chance. "I would not want to belittle the achievement, but they have gone at the task mob-handed, and used sledgehammer techniques," Billingsley, a retired mechatronics professor at the University of Southern Queensland in Australia, said in an email to the AP. He added, however, that it adds to the lesson that "true progress comes out of contests, whether they involve hitting a ball or setting foot on Mars." Japanese professional players Minami Ando and Kakeru Sone were among those who competed against Sony's robot. Two umpires from the Japanese Table Tennis Association judged the games. After submitting the paper to peer review ahead of its publication in Nature, Sony researchers kept experimenting and said Ace accelerated its shot speeds and rallies and played even more aggressively and closer to the table edge. Competing against four high-skill players, Sony said Ace defeated all but one of them in December. Another expert player, Kinjiro Nakamura, who competed in the 1992 Barcelona Olympics, told researchers after observing Ace play a shot that "no one else would have been able to do that. I didn't think it was possible." But the robot now having done it "means that there is a possibility that a human could do it too," he said, in remarks published in the Nature paper. ___ AP journalists Yuri Kageyama and Javier Arciga contributed to this report.
[7]
Watch an AI-powered table tennis robot beat elite players
Using high-precision cameras and an AI system, Sony AI's Ace is revealing how far robotics has advanced.

The world of table tennis may be in for a shake-up after Sony's AI division unveiled Ace -- an autonomous robot that can compete with expert table tennis players. Using a combination of high-speed cameras and proprietary state-of-the-art hardware, Ace scored 16 unchallenged points, or "aces," after serving against multiple elite players. This is the first time a robot has achieved "expert-level play in a commonly played competitive sport in the physical world," Sony AI representatives said in a statement. "This breakthrough is much bigger than table tennis," Peter Stone, chief scientist at Sony AI, said in the statement. "It represents a landmark moment in AI research, showing, for the first time, that an AI system can perceive, reason, and act effectively in complex, rapidly changing real-world environments that demand precision and speed." A study detailing how the robot works was published April 22 in the journal Nature.

Where hardware meets software

AI systems have already shown prowess in strategy games, such as Go, chess and role-playing games. However, moving AI into a robotic body, where quick reflexes are paired with physical movements, can be more challenging. Here, the software and hardware components have to work seamlessly -- and in table tennis, where speed and hand-eye coordination are vital, this pairing has to work to win. "Table tennis is a game of enormous complexity that requires split-second decisions as well as speed and power," Peter Dürr, director of Sony AI in Zurich and project lead for Ace, said in the statement. "This research breakthrough highlights the potential of physical AI agents to perform real-time interactive tasks, and represents a significant step toward creating robots with broader applications in fast, precise, and real-time human interactions."
Ace's strategy builds on Sony AI's previous research on its AI agent Gran Turismo Sophy, with Ace using advanced sensors and high-speed software to perceive its environment. These sensors include nine active pixel sensor cameras that help Ace identify the ball's exact position in 3D space, along with three gaze systems that use mirrors and event-based vision cameras to measure the ball's spin and angular velocity as it moves through the air. Running these cameras is Sony AI's proprietary AI control system, which is based on model-free reinforcement learning, where an AI agent learns directly from interactions in its environment without making a predictive model first. This technology allows Ace to adapt and make decisions faster, without relying on a preprogrammed model. Lastly, Ace's robotic body, which includes a swiveling arm with a paddle-like appendage at its end, was created with the company's robotic hardware.

Beating the pros

In April 2025, scientists had Ace play against five elite players (each with over 10 years of experience and around 20 hours of weekly training) and two professional table tennis players (Minami Ando and Kakeru Sone, both in the Japanese professional league). While players in both tiers are skilled at table tennis, professional athletes make their living by playing the sport, whereas elite players have not reached the level needed to make it their livelihood. Ace won three out of five matches with the elite players and boasted a 75% serve return rate. Its autonomous system also allowed the robot to return unusual shots, such as balls bouncing off the net. It lost both matches against the pros, however. Then, in December 2025, Sony AI had Ace play a series of separate matches in which it competed against two professional and two elite players. This time, Ace beat both elite players and one of the professionals.
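Locating the ball with several calibrated cameras, as described above, comes down to triangulation: each camera sees the ball along a sightline, and the 3D position is the point closest to all of those rays in a least-squares sense. A sketch with toy geometry (the camera positions and ball location are invented, not Sony's layout):

```python
import numpy as np

# Least-squares intersection of camera sightlines, the textbook form of the
# multi-camera localization problem. Rays here are noise-free for clarity.

def triangulate(origins, directions):
    """Return the point minimizing the summed squared distance to each ray
    (given as origin + t * direction)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

ball = np.array([1.37, 0.5, 0.3])        # hypothetical ball position (metres)
origins = [np.array([0.0, -2.0, 1.0]),   # three invented camera positions
           np.array([3.0, 2.0, 1.5]),
           np.array([-1.0, 3.0, 2.0])]
directions = [ball - o for o in origins]  # perfect sightlines toward the ball

est = triangulate(origins, directions)
print(np.round(est, 2))  # recovers the ball position, about [1.37, 0.5, 0.3]
```

With real cameras the sightlines are noisy, so the least-squares point no longer lies exactly on any ray; the same solve then yields the best compromise, and more cameras shrink the error.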
Company representatives said the robot moved closer to the table edge, had higher shot speeds and launched faster-paced volleys against its opponents. Given that less than two years ago Google DeepMind's table tennis robot was defeated by elite players, Ace's victories show how quickly this field of robotics is advancing. "Once AI can operate at an expert human level under these conditions, it opens the door to an entirely new class of real-world applications that were previously out of reach," Stone said in the statement.
[8]
Watch Sony’s AI Robot Compete With -- and Beat -- Elite Table Tennis Players
Ace is the first robot that can match serves with some of the best pro players in the world, a new study shows. Watch out Marty Supreme, there's a new contender for the throne of table tennis champ, and it's not human. Research out today showcases a robot that can match and even best elite human players. Scientists at Sony's AI division developed the autonomous robotic system, dubbed Ace. Their study details how Ace won a majority of its matches against table tennis players with extensive experience, though it came up short against professional athletes. Novelty aside, the software and hardware that make the robot possible could have many other uses, its creators say. "The results of our work on Ace highlight the potential of physical AI agents to perform complex, real-time interactive tasks, suggesting broader applications in domains requiring fast, precise human-robot interaction," lead author Peter Dürr told Gizmodo. Systems based on artificial intelligence can now regularly beat people at all sorts of tasks, including various games. Historically, though, it's been a challenge to design robots smart and nimble enough to surpass humans at physical sports. Table tennis in particular requires fast reaction times and the ability to deliver accurate, yet difficult-to-return, high-spin balls to opponents. Scientists have been tinkering with the possibility of tennis robots since the 1980s, but Ace represents an important step forward for both artificial intelligence and robotics, according to Dürr. "Sony AI conducted this research to study how AI could operate safely and effectively in the physical world, where perception, control, and agility must come together in real time," he said. "Unlike simulated environments where AI can rely on perfect information, real-world sports like table tennis demand rapid decision-making based on state estimation from noisy sensors and adversarial human interactions."
Unlike past experiments, the researchers judged Ace's performance against humans using the actual rules of the International Table Tennis Federation (ITTF); they also recruited licensed umpires to oversee the games. In the present study, conducted in April 2025, the researchers pitted Ace against five players deemed elite, defined as people who had at least 10 years of playing experience and regularly trained 20 hours a week on average. It also faced off against Minami Ando and Kakeru Sone, two players active in Japan's professional table tennis league. Ace won three of the five matches against elite players. It won one game against a pro, though it ultimately lost both matches to Ando and Sone. And throughout the matches, the robot displayed agile moves and could consistently serve and return high-speed and high-spin balls. The team's findings were published Wednesday in the journal Nature. The team's experiments didn't stop there. Ace had another set of matches in December 2025, where it was able to beat both elite and professional players (it won one of the two pro matches). In March 2026, it won three matches against professionals, including Miyuu Kihara, currently a top 25 player in the World Table Tennis rankings for women's singles. During these matches, Ace displayed improved performance, shooting balls faster and more aggressively closer to the table edge, according to Dürr. Still, Ace probably isn't going to take over the world of table tennis. The project was devised as a way for the researchers to push the individual technologies driving Ace as far as they could, rather than to pursue any specific goal. But the lessons learned from Ace might allow scientists to create better robotic systems for various "applications across sports, entertainment, and other safety-critical physical domains," Dürr said. Thankfully, I've always been complete trash at table tennis/ping pong, so I'm already happy to accept Ace as our new robotic overlord just in case.
[9]
AI-powered robot beats elite table tennis players
In a feat hailed as a milestone in robotics, Sony AI's Ace wins three out of five matches played under official rules. An AI-powered robot has beaten elite players at table tennis in a landmark achievement for a machine facing a human athlete in a real-world competitive sport. Named Ace, the robotic system developed by Sony AI won three out of five matches against elite players, but lost the two it played against professionals, clawing back only one game in the seven contests. The feat has been hailed as a milestone for robotics, a field that has long seen table tennis - and the lightning-fast reactions, perception and skill it demands - as one of the toughest tests of how far the technology has advanced. In the matches, played under official competition rules, Ace displayed a mastery of spin, handled difficult shots, such as balls catching on the net, and pulled off one rapid backspin shot that a professional had thought impossible. A research paper on the robot was published in Nature on Wednesday, but scientists working on the project said Ace had improved since the report was submitted. "We played stronger and stronger players and we beat stronger and stronger players," said Peter Dürr, the director of Sony AI in Zurich and project lead for Ace. AI researchers use games, from chess and Go to poker and Breakout, to teach programs how to make decisions in complex situations. Building an intelligent robot takes the challenge to the next level by requiring the machine to enact decisions effectively. Ace sidesteps some tricky aspects of table tennis by having an eight-jointed arm on a movable base that does not have to stand on two legs. And instead of seeing the ball with two eyes, it draws on images from multiple cameras that view the entire court from different angles and track the position and spin of the ball.
By zooming in on the ball's logo, the camera system can estimate the ball's spin and axis of rotation in the milliseconds it takes to reach Ace's end of the table. How to deal with spin, and which shots to play, were honed during 3,000 hours of games played in a computer simulation. Other skills, such as serves, were drawn from those used by expert players. Ace was not a table tennis ace from the start. Early on, it had problems facing slow balls with minimal spin, returning them weakly and being punished for the slip. But it was impressive at tricky shots, such as when the ball catches on the net, with Ace responding extremely fast to the altered trajectory. "If I used a serve with complex spin, Ace also returned the ball with complex spin, which made it difficult for me," said Rui Takenaka, an elite player. "But when I used a simple serve - what we call a knuckle serve - Ace returned a simpler ball. That made it easier for me to attack on the third shot, and I think that was the key reason why I was able to win." When Ace played an unusual shot, intercepting the ball early and imparting backspin, the former Olympic table tennis player Kinjiro Nakamura said he had not thought it possible, but now believed that humans could learn the shot. One difficulty in playing Ace is that the robot has no eyes to look into, no body language to read, and does not succumb to pressure when a game is tied 10-10. Dürr said: "The players want to see the eyes of their opponent. And the eyes of Ace are all around the court and they don't show any intention or feeling." Jan Peters, a professor of intelligent autonomous systems at the Technical University of Darmstadt in Germany, has worked on table tennis robots. He called the project "truly impressive", but said research on table tennis would not solve some of the significant challenges in robotics, such as manipulating objects. To be "useful for the general public, a lot of good old-fashioned engineering is needed", Peters added.
"There will be a moment in the next decade which will change the world as much as ChatGPT did in 2022. That moment may be closer to now than to 2036."
Sony AI unveiled Ace, an AI robot that defeated elite table tennis players in competition-rule matches. The system uses high-speed perception, reinforcement learning, and an eight-jointed robotic arm to achieve expert-level performance. Ace won three out of five matches against elite players and has since beaten professional players, marking the first time a robot has competed at this level in a physical sport.
Sony AI has developed Ace, an AI robot capable of competing against and defeating elite table tennis players in official matches, marking a significant advance in autonomous robotics [1]. The system combines high-speed perception, AI-based control, and specialized hardware to play at a level previously unattainable by machines. In matches conducted under International Table Tennis Federation rules with licensed umpires, Ace won three out of five games against elite players who trained 20 hours per week for over a decade [2]. Against two professional players, the robot lost both matches, winning only one of seven games during initial testing [3].
Source: AP
Peter Dürr, director of Sony AI Zurich who led the project, describes table tennis as "a game of enormous complexity that requires split-second decisions as well as speed and power" [4]. The ping pong robot represents what researchers call a "Deep Blue" moment for physical sports, referencing the 1997 chess match in which IBM's computer defeated world champion Garry Kasparov [2].
The system integrates nine synchronized cameras and three gaze-control systems positioned around the table to track ball spin and trajectory with exceptional accuracy [5]. These event-based sensors focus on regions indicating changes in motion or brightness, critical for tracking the ball's path [2]. A convolutional neural network estimates the angular velocity of the ball's spin, which determines its trajectory and enables skilled players to deliver difficult shots [1].
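The principle behind spin estimation can be illustrated with a small sketch. This is not Sony AI's actual pipeline (which uses a convolutional neural network on event-camera data); it is a minimal illustration of the underlying idea, with hypothetical frame timings: if the camera can read the logo's orientation in two frames a known time apart, the angular velocity is simply the change in angle divided by the elapsed time.

```python
def angular_velocity(theta1_deg, theta2_deg, dt_s):
    """Estimate spin rate (revolutions per second) from two
    logo-orientation samples taken dt_s seconds apart.

    theta1_deg, theta2_deg: logo orientation in each frame, degrees
    """
    delta = (theta2_deg - theta1_deg) % 360  # assume <1 revolution between frames
    omega_deg_per_s = delta / dt_s
    return omega_deg_per_s / 360.0           # convert deg/s to rev/s

# Hypothetical example: two frames 2 ms apart, logo rotated 36 degrees.
spin = angular_velocity(0.0, 36.0, 0.002)
print(round(spin, 1))  # 50.0 rev/s, i.e. 3,000 rpm
```

Note the hedge in the comment: if the ball spins more than a full revolution between frames, the estimate aliases, which is one reason a real system needs very high frame rates and a learned model rather than this two-sample arithmetic.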
Source: Bloomberg
Ace's table tennis skills were built using model-free reinforcement learning, meaning the robot learned through thousands of hours of simulated experience rather than adopting a predetermined model of gameplay [2]. The decision-making algorithm, called the actor, was scored by another program called the critic through deep reinforcement learning [1]. A genetic algorithm developed a library of serves by mimicking biological evolution, with expert players determining which strategies were challenging enough for competition use [1].
The custom robotic platform features eight independently controlled joints, the minimum necessary to execute competitive shots covering racket position, orientation, speed and strength [5]. While human athletes require around 230 milliseconds to react, Ace's total latency is only 20 milliseconds [2]. This processing speed captures motion that would be a blur to the human eye, according to Dürr [5].
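The actor-critic arrangement described above, one program choosing actions and another scoring them, can be sketched in miniature. This is a generic one-step actor-critic on a toy two-action problem, not Sony AI's training code; the reward values and learning rates are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy problem: a single state with two actions; action 1 pays more on average.
def reward(action):
    return 1.0 if action == 1 else 0.2

prefs = [0.0, 0.0]   # actor: action preferences, turned into a softmax policy
value = 0.0          # critic: estimated value of the (single) state
alpha, beta = 0.1, 0.1  # actor and critic learning rates

def policy():
    exps = [math.exp(p) for p in prefs]
    total = sum(exps)
    return [e / total for e in exps]

for _ in range(2000):
    probs = policy()
    a = 0 if random.random() < probs[0] else 1
    r = reward(a)
    td_error = r - value        # the critic scores the actor's choice
    value += beta * td_error    # critic update toward observed reward
    # actor update: shift the taken action's preference along the TD error
    for i in range(2):
        grad = (1.0 if i == a else 0.0) - probs[i]
        prefs[i] += alpha * td_error * grad

print(f"P(better action) = {policy()[1]:.2f}")
```

After training, the actor strongly prefers the higher-paying action. Ace's real system replaces this single-state table with deep networks over the perception system's state estimates, but the actor/critic division of labour is the same.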
Source: Nature
Ace's performance relied on generating different kinds of spin and on consistency in returning the ball rather than on faster-than-human shots [1]. The robot outscored elite players in aces by 16 to eight and handled unexpected changes when balls clipped the net, a skill that simply emerged without specific training [4]. Professional player Mayuka Taira, who lost to Ace in December 2025, noted the robot's strengths: "it is very hard to predict, and it shows no emotion" [5].
Since the Nature study was completed in April 2025, Ace has continued advancing [1]. In December 2025, it beat a professional player for the first time, and in March 2026, Ace won matches against three more professionals, including Miyuu Kihara, ranked in the top 25 of the World Table Tennis rankings [2]. "With further improvements, it should be possible to outperform even the world champion," Dürr stated [2].
Former Olympian Kinjiro Nakamura, who competed in the 1992 Barcelona Olympics, observed Ace perform a shot he had thought impossible, commenting that "the fact that it was possible means that there is a possibility that a human could do it too" [1]. Peter Stone, chief scientist at Sony AI, suggests the research shows "that an AI system can perceive, reason, and act effectively in complex, rapidly changing real-world environments," opening doors to applications in manufacturing, service robotics, sports, entertainment and safety-critical domains [3].