5 Sources
[1]
Adobe Firefly now supports prompt-based video editing, adds more third-party models | TechCrunch
Adobe is updating its AI video-generation app, Firefly, with a new video editor that supports precise prompt-based edits, as well as adding new third-party models for image and video generation, including Black Forest Labs' FLUX.2 and Topaz Astra. Until now, Firefly only supported prompt-based generation, so you would have to recreate the entire clip if any part of the video was not to your liking. With the new editor, you can use text prompts to edit video elements, colors, and camera angles, and we also get a new timeline view that lets you adjust frames, sounds and other characteristics easily. The company first announced the new video editor in October in private beta, and now it is rolling out to all users.

The company said that using Runway's Aleph model, users can give Firefly specific instructions, such as "Change the sky to overcast and lower the contrast" or "Zoom in slightly on the main subject." And with Adobe's own Firefly Video model, users can now do things like upload a start frame and a reference video of a camera motion and tell it to recreate that camera angle for the video they're working on. The company also said users can now use Topaz Labs' Astra model to upscale videos to 1080p or 4K. Black Forest Labs' FLUX.2 image generation model is coming to the app, too, along with a collaborative boards feature. The company said FLUX.2 will be available on Firefly across platforms immediately, and Adobe Express users will be able to use FLUX.2 from January.

With competitors releasing new models for image and video generation, Adobe wants to lure users into engaging with its app more. Along with the new updates to the Firefly app, the company said subscribers of the Firefly Pro, Firefly Premium, 7,000-credit, and 50,000-credit plans will get unlimited generations from all image models and the Adobe Firefly Video Model in the Firefly app until January 15. Adobe has made a ton of changes to its Firefly models and apps this year.
In February, the company launched a subscription that let users access various levels of image and video generation; then it launched a new Firefly web app along with mobile apps later in the year, and has added support for more third-party models within the Firefly app.
[2]
Adobe Firefly's New AI Editing Tools Are a Step Toward More Precise AI Video
Anyone who has used an AI image or video generator knows that these tools are often the opposite of precise. If you're using an AI generator with a specific idea in mind, you'll likely need to do a lot of work to bring that exact vision to life. Adobe's convinced that it can make its AI hub, Firefly, a place where AI can be customized and precise, which is what the company aims for with the release of new AI video editing tools on Tuesday.

Over the course of 2025, Adobe has quietly emerged as one of the best places to use generative AI tools. With Firefly subscriptions starting at $10 per month, it's an affordable program that provides integration with top models from Google, OpenAI, Runway, Luma and several other leading AI companies. It's expanding its roster with Topaz Labs' Astra (available in Firefly Boards) and FLUX.2 from Black Forest Labs, available in Firefly and Photoshop desktop.

The partnerships are helping to make Firefly an all-in-one hub for creators to leverage AI, said Steve Newcomb, vice president of product for Firefly, in an exclusive interview. Just as Photoshop is the "career partner" of photographers, Firefly aims to become a partner for AI video and image creators. "If you're a photographer, [Photoshop] has everything that you could ever want. It has all the tooling, all the plugins, in one spot with one subscription. You don't have to subscribe to 25 different photo things," Newcomb said. "So for us, Firefly, our philosophy is, how do we be that home?"

One way is through partnerships with AI companies, similar to Photoshop plug-ins. Precise editing tools are another, he said. That's why Adobe is trying to make it easier to edit AI-generated content. Hallucinations are common in AI-generated images and videos, such as disappearing and reappearing objects, weird blurs and other inaccuracies. For professional creators who use Adobe, the inability to edit out hallucinations makes AI almost unusable for final projects.
In my own testing, I've often found that editing tools are basic, at best. At worst, they're entirely absent, particularly for newer AI video technologies. Firefly's new prompt-based editing for AI videos, announced on Tuesday, is a way to get that hands-on control.

If you've edited images in Firefly via prompting, the video setup will feel very familiar. Even if you haven't, prompt-based editing is essentially a fancy term for asking AI to modify things as you would when talking with a chatbot. Google's Nano Banana Pro in Gemini is one example of an AI tool that allows you to edit through prompts. Firefly's video prompt editing has the added bonus of allowing you to switch between models for edits: You can generate with Firefly and edit with Runway's Aleph, for example. Like any AI chatbot or tool, prompt-based editing isn't always accurate. But it's a nice option that doesn't require leaving Firefly for Premiere Pro.

The plan is to go beyond just prompt-based editing, Newcomb said. More AI-based precision editing tools for Firefly will be important, allowing you to make even more minute changes. What makes it possible is something called layer-based editing, a behind-the-scenes technology that enables easier, detailed changes in AI-generated images and videos. Adobe plans to implement layer-based editing down the road, and it will likely form the foundation for future AI video editing tools. The goal is to make it easier to stay working in Firefly "until the last mile" of editing, Newcomb said. "We can run the gamut of the precision continuum all the way to the end, and just think of prompting as being one of many tools. But it is absolutely not the only tool," said Newcomb.

For now, there is another piece of video editing news that could help you build more precise AI videos. Adobe is also bringing its full AI video editor into beta on Tuesday, the next step toward making editable and, therefore, usable AI video.
Debuted at the company's annual Max conference in October, the video editor is now launching in a public beta. It sits between basic video editors and the feature-stuffed Premiere Pro. It'll be great for AI enthusiasts who want more editing firepower than you get with OpenAI or Google, without needing expertise in Premiere Pro.

The video editor is meant to help you put all the pieces of your project together in one place. It has a multitrack timeline for you to compile all your clips and audio tracks. That's especially important because, while you can create your own AI speech and soundtracks, Firefly AI videos don't natively generate with sound. (You can use Veo 3 or Sora 2 in Firefly to generate those initial clips with audio, though.) You can also export in a variety of aspect ratios.

"Think of the video editor as being one of our cornerstone releases that is helping us move toward being one place, one home, where you can have one subscription and get to every model you ever needed to get the job done," Newcomb said.
[3]
Adobe Firefly adds AI video features most generators lack -- here's what's rolling out today
Adobe is giving users more control with its prompt-to-edit update

Those who have spent any time with AI video generators know just how frustrating it can be when what you've prompted and what you've generated are completely different. It's frustrating when it comes time to edit those videos, especially when you've generated something impressive but need to tweak a single thing, such as removing a person or changing the lighting. Adobe is trying to fix that.

Today, Adobe is rolling out a set of video-focused features for Firefly designed to make AI video editing feel more like real editing -- iterative, controlled and integrated into a broader workflow. The updates include prompt-based video edits, camera motion references, a browser-based Firefly video editor and built-in upscaling powered by a partner model.

Users can now prompt to edit their videos. Instead of regenerating an entire clip every time you want a change, Firefly now lets you edit an existing AI-generated video using text prompts. That means you can make targeted changes, such as removing an object or adjusting the lighting, with a single prompt. This is a big shift. Most AI video tools still treat every change as a full regeneration, which makes fine-tuning nearly impossible. Adobe's approach is closer to how creators already think: generate first, then refine.

Firefly's Video Model now supports camera motion reference, which solves another common AI video problem: inconsistent movement. You can upload a start-frame image (your scene) and a reference video that shows the camera movement you want. Firefly then applies that camera motion -- pan, dolly, zoom, tilt -- to your generated clip while keeping the subject and framing anchored. This makes it far easier to create cinematic-looking shots that don't feel jittery or random, especially for B-roll, transitions or short-form video.

Adobe is also integrating Topaz Astra, a partner AI model focused on video enhancement and upscaling.
With this integration, users can now enhance low-resolution or older footage, upscale clips to 1080p or 4K and queue multiple upscales while continuing to work. This is aimed at creators who already have footage (not just AI-generated clips) and want a quick way to improve quality without leaving Adobe's ecosystem.

Firefly is also getting a browser-based video editor, now available in public beta. It supports two workflows: timeline editing, with multitrack control for video and audio, and text-based editing, where you edit video by working with a transcript (ideal for interviews, explainers or talking-head content). The editor lets users combine their own footage with Firefly-generated video, plus music and audio.

Adobe's long-term vision is clear: AI shouldn't replace human editors; it should work with them. By focusing on prompt-based edits, camera consistency and practical tools like upscaling and transcript editing, Adobe is positioning Firefly as a creative hub rather than a novelty generator.

To encourage experimentation, Adobe is also offering a limited-time unlimited-generations promotion (through January 15) for certain paid Firefly plans, including Pro, Premium and high-credit tiers. The promo covers Firefly image models, Firefly video models and select partner image models. Unfortunately, it doesn't apply to free users, but it does lower the friction for creators who want to seriously test Firefly's new video tools without worrying about credits.
[4]
Adobe Wants to Make Editing AI Videos More Like Editing Real Videos
Adobe launched Firefly Video back in February to criticism. In the months since, the company has continued to update its platform and today unveiled new AI video editing features and partner AI models inside Firefly. When Firefly Video arrived, creators could only generate AI video clips using prompts. Once the videos were created, they either took them as they were or had to start over from scratch.

At the recent Adobe MAX, Adobe showcased new editing tools in Firefly that let users edit their video creations without starting over. These tools are now available to everyone via a public beta. With the new "Prompt to Edit" controls, Firefly users can edit generated video clips using text prompts, including ones that simulate virtual camera control. This feature, powered by Runway's Aleph model, enables people to remove or add specific objects to their scene, replace backgrounds, change the lighting and conditions, and adjust "focal length."

"Firefly makes those changes directly in the existing clip. You're no longer at the mercy of the next random generation. You're directing the scene. And you can continue to refine by adding sound effects or music tracks, and edit further within Firefly video editor or Premiere desktop -- all built to give creators full control from idea to execution," Adobe explains.

The browser-based Firefly video editor lets users combine generated clips with their own real footage to create a final edited video. The multi-track timeline looks like a slightly simplified version of Premiere Pro. Beyond timeline editing, users can also edit content like talking-head segments or interview clips by editing text in a video's transcript.

Speaking of third-party models like Runway Aleph, Adobe also today brought AI technology from Topaz Labs into Firefly. As Adobe notes, "generative AI isn't only about generating content -- it's also about the tools that make it adaptable, integrated, and actionable across workflows."
Topaz Astra is now available in Firefly Boards, letting users upscale footage, whether AI-generated or captured by real cameras. Low-resolution footage can be upscaled in Firefly to Full HD or 4K, ensuring it is usable across a broader range of platforms. Topaz Astra can also be used to restore "older or low-quality footage" inside Firefly.

Another third-party model, Black Forest Labs' FLUX.2, is also now available in Firefly. This is an AI image model that promises photorealistic images, advanced text rendering, and support for up to four reference images. FLUX.2 is now available in Firefly's Text to Image module, Prompt to Edit, and Firefly Boards, plus as a model choice for Photoshop's Generative Fill feature. FLUX.2 will be added to Adobe Express next month.

From now until January 15, 2026, customers with Firefly Pro, Firefly Premium, 7,000-credit, and 50,000-credit plans get unlimited image and video generations inside Firefly. Firefly Pro starts at $19.99 per month.
[5]
Adobe Firefly's newest update enables users to direct, rather than generate, AI video - SiliconANGLE
Adobe Inc. is revamping the video creation capabilities within its Firefly suite of artificial intelligence models. In a major update to its AI video generation tool, it's adding more powerful editing capabilities and precise motion controls, along with video upscaling in Firefly Boards. In addition, it's announcing the full beta launch of Firefly video editor, which is being pitched as a new "creative assembly space" for generative AI storytelling.

The updates were unveiled along with a limited-time promotion: the company is offering current subscribers on eligible paid plans unlimited access to both Firefly's video generation model and all of its image-generation models.

For professional video creators, the addition of more precise editing tools is probably the biggest deal with today's update. Adobe said it's adding new "prompt-to-edit" controls to its AI video generator that enable creators to make precision refinements to any video they generate.

Adobe explained it's trying to tackle a common problem with AI video generation. For instance, someone might generate a video of a coffee shop that looks just the way they want, except for some random object that appears on a table and spoils the vision they had in mind. Normally, they'd have to regenerate the entire clip to fix this, but in doing so, they could end up with an entirely different, less-than-ideal scene. With the new tools, users can now generate a video and "surgically remove" the things they don't like using Runway AI Inc.'s Aleph video model. All they have to do is enter a prompt, such as "remove the person on the left side of the frame" or "change the sky to overcast." They can also enter commands such as "zoom in slightly on the main subject," Adobe said. The changes will immediately be applied to the existing video, as if users are truly directing the scene, rather than having to regenerate everything from scratch and hope it comes out better.
As well as the precise edits, Adobe is adding a feature called the camera motion reference workflow, which allows users to upload a start frame reference image, as well as a video that shows the camera motion they intend to recreate. The model will then generate both the scene and the desired cinematic movements. Adobe said it's hoping to save creators hours of time spent on trial and error as they struggle to get their videos looking exactly the way they want.

For creators hoping to generate slicker, more professional-looking videos, the new Topaz Astra tool in Firefly Boards may be just what they were looking for. Firefly Boards is an AI-native software workspace for creators to explore, iterate and collaborate in real time, and Topaz Astra is being made available there first. It's essentially an upscaling tool that allows users to transform low-resolution clips into 1080p or 4K quality, making them sharp enough to share publicly. Adobe said it can also help restore older, lower-quality video footage by enhancing the clarity and detail to make it appear as if it were shot using modern camera equipment.

In addition, Firefly Boards is getting a new model in the shape of FLUX.2, the most advanced image-generation model from Black Forest Labs Inc., providing photorealistic detail and advanced text rendering, along with support for up to four reference images at once. The company said FLUX.2 is available now in Firefly Boards, as well as Photoshop and the Firefly Text-to-Image module.

Firefly's video editor is a browser-based video-generation tool that allows creators to get to work on any device, without downloading any of Adobe's full applications. Available in public beta now, it supports complete video generation and editing, and allows users to add music tracks and other visuals to their footage and shape them on a lightweight, multitrack timeline.
Subscribers will be able to enjoy unlimited access to Adobe's new video generation tools for a limited time thanks to the company's latest promotion, which runs until Jan. 15. As part of the offer, customers who are subscribed to Firefly Pro, Firefly Premium, or the 7,000-credit and 50,000-credit personal plans will benefit from unlimited image and video generations in the Adobe Firefly application. The promotion spans every model available in Firefly, including Adobe's own and third-party offerings like the new FLUX.2, Google LLC's Nano Banana and OpenAI Group PBC's GPT-Image model.
Adobe is transforming its Firefly platform with new prompt-based video editing tools that let creators refine AI-generated clips without starting over. The update includes Runway's Aleph model for precise edits, camera motion references, and third-party integrations like Topaz Astra for upscaling to 4K. A browser-based video editor enters public beta, while subscribers get unlimited generations until January 15.
Adobe Firefly is addressing one of the biggest frustrations in AI video editing: the inability to make targeted changes without regenerating entire clips. The platform now supports prompt-based editing through its new video editor, which exits private beta and rolls out to all users [1]. Until now, Adobe Firefly only allowed prompt-based generation, forcing creators to recreate entire videos if any element didn't match their vision. The new approach treats AI video editing as an iterative experience, where creators can refine specific elements using text commands like "remove the person on the left side of the frame" or "change the sky to overcast and lower the contrast" [1][5].
The update integrates multiple third-party AI models to position Firefly as a comprehensive hub for AI-generated video content. Using Runway Aleph, creators can issue specific instructions to modify video elements, colors, and camera angles without leaving the platform [1]. Black Forest Labs' FLUX.2 arrives as an image generation model offering photorealistic detail, advanced text rendering, and support for up to four reference images [4]. Topaz Astra enables AI video upscaling, allowing users to enhance low-resolution footage to 1080p or 4K quality, making it suitable for professional distribution. Steve Newcomb, vice president of product for Firefly, explained the strategy: "If you're a photographer, [Photoshop] has everything that you could ever want. It has all the tooling, all the plugins, in one spot with one subscription. You don't have to subscribe to 25 different photo things" [2].
Adobe's Firefly Video Model now supports camera motion references, solving a persistent problem in AI video generation: inconsistent or random movement. Creators can upload a start frame image showing their desired scene and a reference video demonstrating the camera movement they want to replicate. The model then applies that motion (pan, dolly, zoom, or tilt) to the generated clip while maintaining subject and framing consistency [1]. This feature saves hours previously spent on trial and error, enabling creators to produce cinematic-looking shots for B-roll, transitions, or short-form content without the jittery or random movements typical of AI-generated footage [5].

The Firefly video editor, first announced at Adobe's MAX conference in October, is now available in public beta [2]. This browser-based editor positions itself between basic AI generators and the feature-rich Premiere Pro, offering a middle ground for creators who need more control than chatbot-style tools provide but don't require professional-grade software expertise [2]. The editor includes timeline editing with multitrack control for video and audio, plus text-based editing where users can modify content by working with transcripts, ideal for interviews, explainers, or talking-head segments. Users can combine their own footage with AI-generated clips, add music and sound effects through the new timeline view, and export in various aspect ratios [1].
To drive adoption of these new capabilities, Adobe is offering unlimited generations from all image models and the Adobe Firefly Video Model until January 15 for subscribers of the Firefly Pro, Firefly Premium, 7,000-credit, and 50,000-credit plans [1]. The promotion covers Adobe's own models plus third-party offerings like Black Forest Labs' FLUX.2, Google's Nano Banana, and OpenAI's GPT-Image model [5]. While the offer doesn't extend to free users, it removes credit constraints for paid subscribers wanting to test Firefly's expanded video tools. Firefly Pro starts at $19.99 per month, with subscriptions beginning at $10 per month for basic access [4][2].

Adobe's roadmap suggests even more granular control is coming. Newcomb indicated that prompt-based editing represents just one tool in a broader precision continuum, with layer-based editing technology planned for future implementation [2]. This behind-the-scenes technology would enable more detailed changes in AI-generated images and videos, allowing creators to stay within Firefly "until the last mile" of editing before moving to Premiere Pro if needed [2]. The company's vision centers on making AI work alongside human editors rather than replacing them, with Firefly serving as a creative hub that addresses common AI video problems like hallucinations, disappearing objects, and other inaccuracies that currently make AI-generated content difficult to use in final professional projects [2].

Summarized by Navi