3 Sources
[1]
Disney and Nvidia Combine on Robotics and AI to Bring Olaf Droid to Life
While much of the United States suffers through an extended winter, in Glendale, California, it's another story. Sunny and warm on an afternoon at Walt Disney Imagineering R&D in mid-March, it was the perfect setting for meeting summer-loving snowman Olaf of Disney's juggernaut Frozen franchise.

Olaf -- in robotic form and primed for theme park appearances -- debuted last week in Los Angeles and is now being showcased at Nvidia's GTC 2026 AI conference in the San Francisco Bay Area. Nvidia CEO Jensen Huang will be joined onstage by Olaf during his keynote Monday kicking off the company's annual tech conference. The robotic snowman runs simulations on Nvidia GPUs and is powered by Nvidia chips. Olaf was brought to life using the Newton Physics Engine, an open-source system developed by Nvidia, Google DeepMind and Disney Research that enables high-performance robot simulations to run quickly on GPUs.

Much like the free-roaming droids developed by Walt Disney Imagineering that you can find in Star Wars Galaxy's Edge at both Disneyland and Disney World, Olaf will walk around theme parks to greet guests and interact with them. Walt Disney Studios animators helped create the training data used to teach Olaf to walk in simulation -- not just to walk, but to emulate his signature snowman shuffle.

When CNET senior producer Jesse Orrall interacted with Olaf during a press preview at Imagineering R&D HQ last week, the AI-powered robot was controlled by an operator. His voice responses were selected by that operator, and he was somewhat limited in what he could say. Olaf can also follow a script as part of a show, with plans to expand those capabilities over time. "We had our sights set on creating a real-life Olaf for a long time," Josh Gorin, Disney executive R&D Imagineer, told Orrall. "Technology finally caught up." He lamented that you can't give the Olaf robot warm hugs just yet -- though Disney acknowledges it is very huggable and hopes that will change.
Iridescent fibers woven into Olaf's body give him a sparkly, snowy look. His mouth and eyes are fully articulated, while his stick arms, carrot nose, buttons, eyebrows and hair attach magnetically, allowing for character gags in which his body parts can be pulled off and reattached. Olaf debuts March 29 with the opening of the World of Frozen land at Disneyland Paris and will later appear at Hong Kong Disneyland. There's no word yet on when the beloved snowman might make appearances stateside at Disneyland or Walt Disney World.
[2]
I met Olaf -- the Frozen robot who might be the future of Disney Parks
You know Olaf. Before K-Pop Demon Hunters, before Wicked, it was Disney's Frozen that blasted show tunes like "Let it Go" and "Into the Unknown" into our lives. My little girls loved belting those tunes. So when I met Olaf, the Disney Imagineering robot, I kept thinking: I can't wait for my kids to meet him too.

It's a weird thought, really, because this Olaf isn't a "he" and can't carry on a conversation. Why do I keep thinking "I met him" when he's largely a remote-controlled puppet teleoperated by a Steam Deck gaming handheld? I think the answer is that Olaf -- coming to Disneyland Paris on March 26th and Hong Kong Disneyland this summer -- is the rare robot that crosses the uncanny valley as long as he keeps moving. And that's because Disney animators helped him train himself, sticking 100,000 virtual copies of the physical Olaf robot into an Nvidia-powered simulation and rewarding him for screen-accurate moves. It took just two days to train Olaf with an Nvidia RTX 4090 GPU.

"This absolutely is the future of how we're building robot characters," Disney Imagineering SVP of R&D Kyle Laughlin tells The Verge. He says reinforcement learning is the "true unlock" that could let Disney populate entire lands full of interactive characters, now that entire robots can be built in months instead of years. And while Disney Imagineering has done some of this with its Star Wars droids before, those were "robots being robots," says Laughlin. "This is our first animated character that we brought to life."

To be crystal-clear, Olaf is not artificially intelligent. The 35-inch tall, 33-pound robot may have 25 actuators and three computers including an Nvidia Jetson Orin NX and a Raspberry Pi, but it's not speaking for itself. It plays pre-recorded lines from what sounds like Olaf's voice actor Josh Gad while it performs animated moves. While Olaf blinks autonomously, it can't "see" you to look at you -- that requires the operator to flick a joystick.
The Steam Deck's other joystick tells Olaf where to walk, and the operator can swipe across a touchpad to quickly access page after page of conversation options. In my early demo, it wasn't yet enough to carry on a conversation: a quick "Of course!" or "Sure!" was often all I got. But whenever Olaf is moving, I can't take my eyes off him, and I automatically find myself writing "him" again and again. He waddles around so convincingly! When I can't quite put my finger on why, Disney Research lab director Moritz Bächer explains a big part of it in eight words: "The eyes go first, and the body follows." We automatically assume we're looking at a living being, because the eyes appear to be mentally controlled. (It also doesn't hurt that Olaf's four-way-stretch costume, built atop foam "snowballs," sparkles like fresh snow as the light shines through. Olaf's carrot, sticks, and buttons are all magnetic, so they can be easily reattached or even intentionally detached as a gag.)

While Disney is notorious for protecting its intellectual property, it sees its robot research differently. Last March, it partnered with Nvidia and Google DeepMind to release the Newton Physics Engine as an open-source project managed by the Linux Foundation; now, Disney Research is also contributing Kamino, the simulation tool it developed to train "extremely complex mechanical assemblies" like Olaf and other robots to come, including a simple thermal dynamics model to keep joints from overheating prematurely.

Olaf was a challenge, the team says, because robots traditionally don't have big weighty heads that rest on a small neck. It puts a lot of strain on that joint, making it prone to overheating. Olaf's clomp-clomp-clomp walking was a noisy challenge to solve, too. But in the reinforcement learning simulation, Disney was simply able to reward the 100,000 virtual Olafs who move without overheating that joint or making too much noise.
"It's like telling my six-year-old to stop running through the house: Can you just be a little bit quieter? That's pretty much what we had to do for Olaf," says Laughlin. Bächer tells me these tools are designed to interface with the ones that animators already use, including Maya, so animators can create motions that target emotions, letting the physics simulation do the work of figuring out what the bots can actually do.

I have to admit the illusion breaks down a lot when Olaf stops moving, and Disney Research isn't saying when it might make these robots truly autonomous. It sounds like that technology isn't up to Disney's standards yet. Bächer says "believable autonomy" is the goal: "It needs to be something that you believe is real." But it won't always be a human with a Steam Deck at the controls, either. Olaf can be part of time-coded performances, tied directly into Disney's live entertainment choreography systems, Laughlin says, and that's one of the first ways he'll appear at Disneyland Paris. He'll be performing on a boat in a lagoon in front of the castle. "We built a mock boat in our R&D lab to simulate the considerable amount of rocking back and forth in this boat, and Olaf does an incredible job staying afloat," says Laughlin. "He's got his sea legs."

Those performances may get even more intriguing, Laughlin suggests, as Disney creates more robots. "You can expect to see more robots from franchises together so that they can interact." "The real power is going to come from Olaf interacting with characters that he knows and loves. Not only performers, but also other characters that we haven't been able to bring to life without robotics," he hints.

Disney Research has published an eight-page whitepaper on how it created Olaf, including some of the exact components and formulas it used. You can read it below.
[3]
We Met Disney's Most Advanced Robot Yet: Olaf From 'Frozen'
When you get an email saying Olaf is in town for one day only, you drop what you're doing and make plans to meet him. That's what happened last week, as io9/Gizmodo was invited to the research and development section of Walt Disney Imagineering to meet the company's most advanced robotics creation to date: Olaf, the magical snowman from Frozen.

The Olaf robot was first introduced to the world back in November, with the promise to debut in the World of Frozen areas at both Disneyland Paris and Hong Kong Disneyland Resort. But, before that happens, we and a small group of journalists and influencers got to talk to Kyle Laughlin, SVP of R&D at Walt Disney Imagineering, and Josh Gorin, VP of R&D at Walt Disney Imagineering and Disney Live Entertainment Innovation. They're two of the many Imagineers involved with the project, which has been in the works for a while. "We sat with all of our parks partners across the world, and the overwhelming character that they wanted for their lands, regardless of whether they had a Frozen World or not, was Olaf," Laughlin said.

When Olaf enters a room (or, eventually, a theme park), the first thing you notice is the walk. Being a magical talking snowman, the character has a very recognizable and distinguished gait, and Gorin explained that nailing that was one of the first things they knew they had to get right. It was achieved with the aid of an AI technique called "reinforcement learning" that they previously used to bring mobile characters, such as the BD-X droids in Star Wars Galaxy's Edge, to life. "Instead of hand animating every possible pose you would need to do a walk, we actually take the characters, and we bring them into a simulation environment, and they actually learn to walk," Gorin said. "So what we do is, we do a full 3D CG simulation of the character -- every motor, every wire, every bolt -- and together that goes into a space with gravity and actual physics.
The same way we all learned how to walk, it tries and tries again -- millions and millions of times -- until it learns how to stand upright and walk." Other robotics companies have also done this. But Disney, being the massive company that it is, can give the process a very unique touch. "The Disney difference is, and I think what differentiates our robotic characters so much from other sorts of robots you might see in the industry, is story and personality and character," Gorin continued. "We don't want our guests to see technology. We want our guests to fall in love with their favorite characters in the film. So, to do that, we bring in animation training data. We get the actual animators who worked on the Frozen films to create the poses and the movements, and we feed that into the training data. So Olaf doesn't just learn how to stand and walk; he learns how to stand and walk like Olaf."

You can see some of that in this video below, provided by Disney. https://www.youtube.com/watch?v=msL1UmUWPw8

He also better talk like Olaf, which, obviously, he does. "We work directly with Josh Gad on all of these lines," Laughlin said. "We brought him into the studio to record as if he was recording for our animated features, so we're continuing to build a library of lines that Josh is recording with us as a part of the performance." That means, yes, Olaf has plenty to say now, but he'll have even more to say as his adventures continue.

Those adventures began this week with an appearance at NVIDIA GTC 2026, which will then be followed by the character making his Disney Parks debut as part of the opening of World of Frozen in Disney Adventure World at Disneyland Paris on March 29. There, the character will largely be seen on a boat, along with the rest of the cast, so he can be visible to a lot of people at the same time.
"What's really exciting about these characters is that, unlike an animatronic that's sort of designed for one scene and one ride, this can be used in [many different ways]," Gorin said. "Parades and meet-and-greets, in atmospheric entertainment, in shows. This allows us to truly think about them as total and complete characters. And so you're going to see him doing more and saying more over time." That means, unfortunately, guests themselves won't get to spend a lot of time with Olaf one-on-one. At least, not for now. "We don't have a date to announce yet, but as you can imagine, the hug-ability of our character is incredibly powerful," Laughlin said. "And that's ultimately our North Star. That's where we're headed, and we're doing everything we can to try and make that happen as fast as possible."

io9/Gizmodo did, however, get to have that interaction, and it was simultaneously wonderful and slightly disappointing. Like many of Disney's current robotic creatures, Olaf is operated by a human with basically a big remote control somewhere off to the side so as not to ruin the illusion. That person is the one who keys in what the creature says when talking to people. And when talking to a group of journalists, it sure did sound like Olaf didn't know a lot. We didn't hear every single thing Olaf could possibly say, and human response time plays a role here as well. But a few of the questions asked felt like very easy ones Olaf should have had an answer to. And yet, he did have four different ways to say "I don't know," which is impressive in its own right. That said, there were more than enough magical moments to cover the few small hiccups. Simply seeing the character walk around and glimmer (thanks to iridescent fibers in his snow) was a sight to behold. He truly does move and look like the character from the film.
If you're lucky enough to get a picture, Olaf will shift his body slightly, spreading his twig arms and saying, "Cheese!" Before he leaves, he might sing a few lines of "Into the Unknown" from Frozen 2. And for every question he may not have had an answer to, he had two or three fantastic things to say on other subjects. "This wandering thing is great," Olaf said while walking around. "Oaken! I get it now," he shouted, referencing one of the other characters in the film. Ask him about the whereabouts of Anna and Elsa, and he might say, "They're probably doing official royal queen stuff." Or he might suggest they're off eating chocolate. At one point, Olaf even melted hearts when he said, "Want to know a secret? We're friends now."

And of course, this is just the beginning. Laughlin called Olaf "bleeding edge" in terms of Disney's character creation. Basically, he's the culmination of everything they've done so far. The best of what's possible. But also, that's just today. Who knows what the future holds? "Reinforcement learning allows us to develop these robots faster than ever, which gives us the opportunity to potentially deliver these experiences day and date," Laughlin said. "We used to wait years, potentially, after a film was out to be able to deliver those experiences. Now, the technology is at a place where we can debut a robot and a character alongside the story that people have just seen." That makes Olaf the most advanced Disney robot ever, until the next one.
Disney and Nvidia have partnered to create an advanced Olaf robot that will interact with guests at theme parks. Using reinforcement learning and the Newton Physics Engine, 100,000 virtual Olafs trained in just two days on Nvidia GPUs to perfect the character's signature waddle. The 35-inch robot debuts at Disneyland Paris on March 29 and represents a shift in how Disney builds characters, from years to months.
Walt Disney Imagineering has unveiled the Olaf robot, a 35-inch tall, 33-pound AI-powered robot that brings the beloved Frozen snowman to life in theme parks [1]. The collaboration between Disney and Nvidia showcased the character at Nvidia's GTC 2026 AI conference, where CEO Jensen Huang presented the robot during his keynote [1]. This marks Disney's first animated character brought to life through robotics, distinct from previous "robots being robots" like the Star Wars droids at Galaxy's Edge [2].
Source: CNET
The Olaf robot runs simulations on Nvidia GPUs and is powered by Nvidia chips, utilizing the Newton Physics Engine, an open-source system developed by Nvidia, Google DeepMind, and Disney Research [1]. Josh Gorin, Disney executive R&D Imagineer, explained that "technology finally caught up" to realize their long-held vision of creating a real-life Olaf [1].

The breakthrough in creating the Olaf robot lies in reinforcement learning, which Disney Imagineering SVP of R&D Kyle Laughlin calls the "true unlock" for populating entire lands with interactive characters [2]. Instead of hand-animating every pose, Disney placed 100,000 virtual copies of the physical Olaf robot into an Nvidia-powered simulation where they learned to walk through millions of attempts [3]. Using an Nvidia RTX 4090 GPU, this training took just two days [2].
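As a rough illustration of the massively parallel training idea described above, the sketch below perturbs a pose vector across many simulated copies at once and keeps whichever candidate best matches a reference pose. It is a toy random-search stand-in for the real GPU-accelerated reinforcement learning pipeline: the environment count matches the reported figure, but the pose dimension, reward, and update rule are illustrative assumptions.

```python
import numpy as np

NUM_ENVS = 100_000   # one simulated Olaf per environment, evaluated in parallel
POSE_DIM = 25        # one target per actuator (illustrative; matches the actuator count)

rng = np.random.default_rng(0)
# Stand-in for a single animator-authored gait frame; the real training data
# is motion created by the Frozen animators.
reference_pose = rng.uniform(-1.0, 1.0, POSE_DIM)

def reward(poses: np.ndarray) -> np.ndarray:
    """Score every simulated robot at once: closer to the reference is better."""
    return -np.linalg.norm(poses - reference_pose, axis=1)

# Crude improvement loop standing in for a real RL algorithm: perturb the
# current pose in every environment simultaneously and keep the best candidate.
pose = np.zeros(POSE_DIM)
for _ in range(50):
    candidates = pose + 0.1 * rng.standard_normal((NUM_ENVS, POSE_DIM))
    pose = candidates[np.argmax(reward(candidates))]

final_error = float(np.linalg.norm(pose - reference_pose))
initial_error = float(np.linalg.norm(reference_pose))
```

Because every candidate is scored in one vectorized call, the loop evaluates five million pose samples in a few seconds; the production system makes the same trade, running all environments on the GPU at once.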
Source: The Verge
What differentiates Disney's approach is incorporating animation training data from Walt Disney Studios animators who worked on the Frozen films [1]. This ensures the robot doesn't just walk; it emulates Olaf's signature snowman shuffle with authentic personality [3]. The simulation also addressed technical challenges like preventing joint overheating from Olaf's heavy head on a small neck and reducing noisy clomp-clomp-clomp walking by rewarding quieter movements [2].

The Olaf robot features 25 actuators and three computers, including an Nvidia Jetson Orin NX and a Raspberry Pi [2]. Iridescent fibers woven into his four-way-stretch costume create a sparkly, snowy appearance, while his mouth and eyes are fully articulated [1]. His carrot nose, stick arms, buttons, eyebrows, and hair attach magnetically, allowing for character gags where body parts can be intentionally detached and reattached [1].
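The reward shaping described above, favoring motions that match the animators' reference while discouraging joint overheating and loud footsteps, can be sketched as a weighted sum of penalty terms. The function below is a hypothetical illustration; the term names, weights, and temperature limit are assumptions, not Disney's actual formula.

```python
def shaped_reward(gait_match: float, joint_temp_c: float, foot_impact: float,
                  temp_limit_c: float = 70.0) -> float:
    """Combine a base imitation reward with comfort/safety penalties.

    gait_match:   how closely the motion matches the animator reference (higher is better)
    joint_temp_c: simulated neck-actuator temperature in deg C
    foot_impact:  proxy for footstep noise (e.g. peak foot contact force)
    """
    overheat_penalty = max(0.0, joint_temp_c - temp_limit_c)  # only penalize past the limit
    noise_penalty = foot_impact ** 2                          # quadratic: loud stomps cost a lot
    return gait_match - 0.5 * overheat_penalty - 0.1 * noise_penalty

# A quiet, cool gait scores higher than an equally accurate but noisy, overheating one:
quiet = shaped_reward(gait_match=1.0, joint_temp_c=55.0, foot_impact=0.5)
loud = shaped_reward(gait_match=1.0, joint_temp_c=80.0, foot_impact=3.0)
```

With terms like these, the trainer never has to script "walk quietly"; candidates that stomp or strain the neck joint simply earn less reward and are selected against.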
Source: Gizmodo
Currently, the robot is teleoperated using a Steam Deck gaming handheld, with an operator controlling movement via joysticks and selecting pre-recorded dialogue from Josh Gad across touchpad pages [2]. While Olaf blinks autonomously, he cannot yet "see" guests or carry on autonomous conversations [2]. Disney Research director Moritz Bächer explains the convincing illusion comes from a simple principle: "The eyes go first, and the body follows," making observers automatically perceive a living being [2].
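The operator workflow described above, with joysticks for walking and gaze and a touchpad that pages through pre-recorded lines, suggests a simple paged-lookup structure. Everything below, including the class name and the sample lines, is a hypothetical sketch of how such a dialogue deck might be organized.

```python
from dataclasses import dataclass

@dataclass
class DialogueDeck:
    """Pages of pre-recorded lines the operator can flip through and trigger."""
    pages: list[list[str]]
    page: int = 0

    def swipe(self, direction: int) -> None:
        # Wrap around so the operator can keep swiping in one direction.
        self.page = (self.page + direction) % len(self.pages)

    def select(self, slot: int) -> str:
        """Return the line to play from the current page."""
        return self.pages[self.page][slot]

deck = DialogueDeck(pages=[
    ["Hi, I'm Olaf!", "I like warm hugs!"],  # greetings page
    ["Of course!", "Sure!"],                 # quick-answer page
])
deck.swipe(+1)          # flip to the quick-answer page
line = deck.select(1)   # the line the operator triggers
```

The design goal such a structure serves is latency: the operator's response time is part of the illusion, so lines must be reachable in one or two swipes rather than through menus.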
Despite Disney's reputation for protecting intellectual property, the company is open-sourcing its robot research [2]. In March 2025, Disney partnered with Nvidia and Google DeepMind to release the Newton Physics Engine as an open-source project managed by the Linux Foundation [2]. Now Disney Research is contributing Kamino, the simulation tool developed to train "extremely complex mechanical assemblies" like Olaf, including thermal dynamics models to prevent premature joint overheating [2].

These tools interface with existing animation software like Maya, enabling animators to create motions targeting specific emotions while the physics simulation determines what robots can physically accomplish [2]. This approach reduces robot character development timelines from years to months [2].
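The details of Kamino's "simple thermal dynamics model" aren't spelled out here; a common minimal stand-in is a first-order lumped model in which electrical losses heat the joint while heat leaks to ambient. The parameters below are illustrative assumptions, not values from the whitepaper.

```python
def step_joint_temp(temp_c: float, current_a: float, dt_s: float,
                    resistance_ohm: float = 2.0,  # motor winding resistance (assumed)
                    thermal_res: float = 4.0,     # deg C per watt to ambient (assumed)
                    thermal_cap: float = 20.0,    # joules per deg C (assumed)
                    ambient_c: float = 25.0) -> float:
    """One Euler step of a first-order lumped thermal model for a motor joint."""
    heating_w = current_a ** 2 * resistance_ohm      # I^2 R electrical losses
    cooling_w = (temp_c - ambient_c) / thermal_res   # heat leaking to ambient
    return temp_c + dt_s * (heating_w - cooling_w) / thermal_cap

# Holding a heavy head up means sustained current; the joint settles toward a
# steady state of ambient + I^2 R * thermal_res = 25 + 8 * 4 = 57 deg C here.
t = 25.0
for _ in range(10_000):
    t = step_joint_temp(t, current_a=2.0, dt_s=0.1)
```

A model this cheap can run inside every one of the parallel training environments, which is what lets a reward term penalize gaits that would overheat the neck joint long before real hardware is at risk.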
The Olaf robot debuts March 29 at the World of Frozen land opening in Disney Adventure World at Disneyland Paris, where he'll initially perform on a boat in a lagoon in front of the castle as part of time-coded performances tied into Disney's live entertainment choreography systems [2][3]. He will later appear at Hong Kong Disneyland this summer [2]. No announcement has been made regarding appearances at US Disney Parks [1].
Unlike an animatronic designed for one scene, this robot can be deployed across parades, meet-and-greets, atmospheric entertainment, and shows [3]. However, one-on-one guest interaction and warm hugs remain future goals. Laughlin acknowledged the "hug-ability of our character is incredibly powerful" and represents their "North Star," though no date has been announced [3]. Disney continues building a library of lines recorded by Josh Gad specifically for the robot's performances [3].

The technology signals a shift in how theme parks might populate entire lands with interactive characters, though Disney Research hasn't specified when robots might achieve truly autonomous "believable autonomy" that meets Disney's standards [2]. For now, the combination of advanced robotics, simulation, and GPUs demonstrates how AI is reshaping entertainment experiences beyond screens.

Summarized by Navi