Curated by THEOUTPOST
On Sat, 2 Nov, 12:05 AM UTC
4 Sources
[1]
Runway Adds Precise Camera Controls to its AI Video Editor
Runway has added a precise camera control feature to its video platform, allowing editors to pan, track, and zoom around AI-generated subjects. Advanced Camera Control began rolling out to Gen-3 Alpha Turbo users over the weekend, building on the Act-One feature announced just a few weeks earlier, which can transform an actor into a cartoon from a single video. On the social media platform X, Runway explains that Advanced Camera Control allows users to "choose both the direction and intensity of how you move through your scenes for even more intention in every shot." The feature allows for panning around AI subjects, which keep their consistency even as the camera moves, and for zooming out to "reveal new context and story." Digital Trends notes that Advanced Camera Control is restricted to Gen-3 Alpha Turbo, which costs $12 a month to subscribe to. The new camera controls may be of interest to Hollywood. Runway partnered with Lionsgate -- the film studio behind The Hunger Games, John Wick, and The Twilight Saga franchises -- in a deal that gave Runway permission to train custom video models on Lionsgate's extensive Hollywood catalog, thought to span over 20,000 films and TV shows. Runway is positioning itself as a tool for filmmakers after the release of Act-One last month, which takes a video of someone talking -- shot on as little as a smartphone -- and uses the performance as an input to create compelling animations. But Runway CEO Cristóbal Valenzuela caused controversy last week after comparing the current state of artificial intelligence to the invention of daguerreotypes in the 19th century.
[2]
Runway brings precise camera controls to AI videos | Digital Trends
Content creators will have more control over the look and feel of their AI-generated videos thanks to a new feature set coming to Runway's Gen-3 Alpha model. Advanced Camera Control is rolling out on Gen-3 Alpha Turbo starting today, the company announced via a post on X (formerly Twitter). The new Advanced Camera Controls expand on the model's existing capabilities. With them, users can "move horizontally while panning to arc around subjects ... Or, move horizontally while panning to explore locations," per the company. They can also customize the direction and intensity of how the camera moves through a scene "for even more intention in every shot," while combining "outputs with various camera moves and speed ramps for interesting loops." Since the new feature is restricted to Gen-3 Alpha Turbo, you will need to subscribe to the $12-per-month Standard plan to access that model and try out the camera controls for yourself. Runway debuted the Gen-3 Alpha model in June, billing it as a "major improvement in fidelity, consistency, and motion over Gen-2, and a step towards building General World Models." Gen-3 powers all of Runway's text-to-video, image-to-video, and text-to-image tools. The system is capable of generating photorealistic depictions of humans, as evidenced in the X post, as well as creating outputs in a wide variety of artistic styles. Advanced Camera Controls arrive roughly a month after Runway revealed Gen-3's new video-to-video capabilities in mid-September, which allow users to edit and "reskin" a generated video in another artistic style using only text prompts. When combined with Apple's Vision Pro AR headset, the results are striking. The company also announced the release of an API so that developers can integrate Gen-3's abilities into their own apps and products.
The new camera controls could soon be put to use by film editors at Lionsgate, the studio behind the John Wick and The Hunger Games franchises, which signed a deal with Runway in September to "augment" humans' efforts with AI generated video content. The deal reportedly centers on the startup building and training a new generative AI model fine-tuned on Lionsgate's 20,000-title catalog of films and television series.
[3]
Runway's Gen-3 Alpha Turbo Now Offers Advanced Camera Control Tool
It is the latest frontier video generation AI model by Runway

Runway, the video-focused artificial intelligence (AI) firm, released a new feature on Friday. The feature, dubbed Advanced Camera Control, gives users granular control over camera movement in an AI-generated video. The capability is being added to the AI firm's Gen-3 Alpha Turbo model, which was released in June. The feature supports text, image, and video inputs, and allows users to select both the direction and the intensity of the camera movement. It is available to both free and paid subscribers of the platform. While AI video models have made significant progress since reaching the mainstream, one area where they still struggle is granular control of camera movement. Users can control the style, objects, and focus, as well as finer details, but asking the AI to pan the camera or zoom in on a shot produces randomised results. In a post on X (formerly known as Twitter), the AI firm detailed the Advanced Camera Control feature, which aims to tackle this challenge. The company shared several video examples of how the tool lets users manually control how the camera moves in a shot. The feature lets users zoom in and out on a subject or an object in the frame. The camera can also be moved horizontally, vertically, or diagonally based on specific prompts. Videos with panning shots that reveal more context to the scene can also be generated. Another notable aspect of Advanced Camera Control is that users can also control the intensity of the movement, picking either a slow pan or a fast move for the desired effect. Further, multiple movements can be combined to generate a free-flowing effect. These features are only available on the Gen-3 Alpha Turbo model, which is open to both free users and paying subscribers.
However, those on the free tier will get a limited number of tokens to try out the video generation model. Runway's paid subscription starts at $12 (roughly Rs. 1,000) a month per user.
[4]
Runway goes 3D with new AI video camera controls for Gen-3 Alpha Turbo
As the AI video wars continue to rage, with new, realistic video-generating models being released on a near-weekly basis, early leader Runway isn't ceding any ground in terms of capabilities. Rather, the New York City-based startup -- funded to the tune of $100M+ by Google and Nvidia, among others -- is deploying new features that help set it apart. Today, for instance, it launched a powerful new set of advanced AI camera controls for its Gen-3 Alpha Turbo video generation model. Now, when users generate a new video from text prompts, uploaded images, or their own video, they can also control how the AI-generated effects and scenes play out far more granularly than with a random "roll of the dice." Instead, as Runway shows in a thread of example videos uploaded to its X account, the user can zoom in and out of a scene and its subjects, preserving even the AI-generated character forms and the setting behind them, realistically placing them and their viewers into a fully realized, seemingly 3D world -- as if they were on a real movie set or on location. As Runway CEO Cristóbal Valenzuela wrote on X, "Who said 3D?" This is a big leap forward in capabilities. Even though other AI video generators, and Runway itself, previously offered camera controls, they were relatively blunt, and the resulting videos were often seemingly random and limited -- trying to pan up or down or around a subject could deform it, flatten it to 2D, or produce strange glitches.

What you can do with Runway's new Gen-3 Alpha Turbo Advanced Camera Controls

The Advanced Camera Controls include options for setting both the direction and intensity of movements, providing users with nuanced capabilities to shape their visual projects.
Among the highlights, creators can use horizontal movements to arc smoothly around subjects or explore locations from different vantage points, enhancing the sense of immersion and perspective. For those looking to experiment with motion dynamics, the toolset allows various camera moves to be combined with speed ramps. This is particularly useful for generating visually engaging loops or transitions, offering greater creative potential. Users can also perform dramatic zoom-ins, navigating deeper into scenes with cinematic flair, or execute quick zoom-outs to introduce new context, shifting the narrative focus and giving audiences a fresh perspective. The update also includes options for slow trucking movements, which let the camera glide steadily across a scene. This provides a controlled and intentional viewing experience, ideal for emphasizing detail or building suspense. Runway's integration of these diverse options aims to transform the way users think about digital camera work, allowing for seamless transitions and enhanced scene composition. These capabilities are now available to creators using the Gen-3 Alpha Turbo model. To explore the full range of Advanced Camera Control features, users can visit Runway's platform at runwayml.com.

Positioning Runway to continue being the AI provider and toolmaker of choice for filmmakers of all levels

While we haven't yet tried the new Runway Gen-3 Alpha Turbo model, the videos showing its capabilities indicate a much higher level of precision and control, and should help AI filmmakers -- including those from major legacy Hollywood studios such as Lionsgate, with whom Runway recently partnered -- realize major motion picture quality scenes more quickly, affordably, and seamlessly than ever before. Asked by VentureBeat over direct message on X whether Runway had developed a 3D AI scene generation model -- something currently being pursued by other rivals from China and the U.S.
such as Midjourney -- Valenzuela responded: "world models :-)." Runway first mentioned it was building AI models designed to simulate the physical world back in December 2023, nearly a year ago, when co-founder and chief technology officer (CTO) Anastasis Germanidis posted on the Runway website about the concept, stating: "A world model is an AI system that builds an internal representation of an environment, and uses it to simulate future events within that environment. Research in world models has so far been focused on very limited and controlled settings, either in toy simulated worlds (like those of video games) or narrow contexts (such as developing world models for driving). The aim of general world models will be to represent and simulate a wide range of situations and interactions, like those encountered in the real world." As evidenced in the new camera controls unveiled today, Runway is well along on its journey to build such models and deploy them to users.
Runway has added precise camera control features to its Gen-3 Alpha Turbo AI video editor, allowing users to manipulate AI-generated scenes with unprecedented control over camera movements and perspectives.
Runway, a leading AI video generation company, has introduced a groundbreaking feature called Advanced Camera Control for its Gen-3 Alpha Turbo model. This new tool allows users to manipulate AI-generated video scenes with unprecedented precision, marking a significant advancement in AI-powered video editing technology [1][2].
The new feature offers a range of sophisticated camera movements, including:

- Horizontal moves that arc around subjects or explore locations from new vantage points
- Dramatic zoom-ins and quick zoom-outs that reveal new context
- Vertical and diagonal camera moves driven by specific prompts
- Slow trucking movements that glide steadily across a scene
- Adjustable movement intensity, from slow pans to fast moves, with speed ramps for loops and transitions
These controls enable users to create more intentional and cinematic AI-generated videos, preserving the consistency of AI subjects even as the camera moves through the scene [1][3].
This development represents a significant step forward in AI video generation capabilities. Content creators now have more granular control over the look and feel of their AI-generated videos, allowing for:

- More intentional, cinematic shot composition
- Smooth loops and transitions built from combined camera moves and speed ramps
- Consistent AI subjects and settings even as the camera moves through a seemingly 3D scene
The Advanced Camera Control feature is currently available exclusively for Gen-3 Alpha Turbo users. Interested creators can access this model through Runway's Standard plan, priced at $12 per month [2][3].
Runway's latest innovation comes on the heels of its partnership with Lionsgate, a major Hollywood studio. This collaboration allows Runway to train custom video models on Lionsgate's extensive catalog of over 20,000 films and TV shows [1][4].
The new camera controls could potentially revolutionize film editing and production processes, offering filmmakers powerful tools to augment their creative workflows [2][4].
Cristóbal Valenzuela, Runway's CEO, has positioned the company as a tool for filmmakers, drawing parallels between AI's current state and the invention of early photography techniques [1]. The company's focus on building "world models" – AI systems that can represent and simulate a wide range of real-world situations – underpins their approach to advancing AI in video generation [4].
As AI video generation technology continues to evolve, Runway's advancements in camera control and 3D-like capabilities signal a new frontier in digital filmmaking. The integration of these tools with existing film production processes could lead to more efficient, cost-effective, and creatively expansive video content creation across various industries [4].
© 2024 TheOutpost.AI All rights reserved