Adobe Firefly adds prompt-based video editing and third-party AI models for precise control

Reviewed by Nidhi Govil


Adobe is transforming its Firefly platform with new prompt-based video editing tools that let creators refine AI-generated clips without starting over. The update includes Runway's Aleph model for precise edits, camera motion references, and third-party integrations like Topaz Astra for upscaling to 4K. A browser-based video editor enters public beta, while subscribers get unlimited generations until January 15.

Adobe Firefly Introduces Prompt-Based Video Editing for Precise Control

Adobe Firefly is addressing one of the biggest frustrations in AI video editing: the inability to make targeted changes without regenerating entire clips. The platform now supports prompt-based editing through its new video editor, which exits private beta and rolls out to all users [1]. Until now, Adobe Firefly only allowed prompt-based generation, forcing creators to recreate entire videos if any element didn't match their vision. The new approach treats AI video work as an iterative editing experience, where creators can refine specific elements using text commands like "remove the person on the left side of the frame" or "change the sky to overcast and lower the contrast" [1][5].
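To make the shift from one-shot generation to iterative refinement concrete, here is a minimal Python sketch that models such an edit session as data. The `VideoEditSession` and `EditStep` names, fields, and behavior are illustrative assumptions for this article, not a published Adobe Firefly API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: modelling an iterative, prompt-based edit flow as data.
# Class and field names are illustrative, not an Adobe API.

@dataclass
class EditStep:
    prompt: str            # natural-language instruction, e.g. "remove the person on the left"
    status: str = "pending"

@dataclass
class VideoEditSession:
    clip_id: str                              # identifier of the previously generated clip
    steps: list[EditStep] = field(default_factory=list)

    def refine(self, prompt: str) -> EditStep:
        """Queue a targeted edit instead of regenerating the whole clip."""
        step = EditStep(prompt=prompt)
        self.steps.append(step)
        return step

session = VideoEditSession(clip_id="clip-001")
session.refine("remove the person on the left side of the frame")
session.refine("change the sky to overcast and lower the contrast")

for i, step in enumerate(session.steps, start=1):
    print(f"edit {i}: {step.prompt} [{step.status}]")
```

The point of the sketch is the accumulation of small, targeted instructions against one existing clip, rather than a fresh generation for every change.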

Source: Tom's Guide

Third-Party AI Models Expand Creative Capabilities

The update integrates multiple third-party AI models to position Firefly as a comprehensive hub for AI-generated video content. Using Runway Aleph, creators can issue specific instructions to modify video elements, colors, and camera angles without leaving the platform [1]. Black Forest Labs FLUX.2 arrives as an image generation model offering photorealistic detail, advanced text rendering, and support for up to four reference images [4]. Topaz Astra enables AI video upscaling, letting users enhance low-resolution footage to 1080p or 4K quality suitable for professional distribution. Steve Newcomb, vice president of product for Firefly, explained the strategy: "If you're a photographer, [Photoshop] has everything that you could ever want. It has all the tooling, all the plugins, in one spot with one subscription. You don't have to subscribe to 25 different photo things" [2].

Source: TechCrunch

Camera Motion References Enable Cinematic Consistency

Adobe's Firefly Video Model now supports camera motion references, solving a persistent problem in AI video generation: inconsistent or random movement. Creators can upload a start frame image showing their desired scene and a reference video demonstrating the camera movement they want to replicate. The model then applies that motion (pan, dolly, zoom, or tilt) to the generated clip while maintaining subject and framing consistency [1]. This feature saves hours previously spent on trial and error, enabling creators to produce cinematic-looking shots for B-roll, transitions, or short-form content without the jittery or random movements typical of AI-generated footage [5].
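As an illustration of the two inputs this feature works from, the sketch below assembles a hypothetical generation request pairing a scene-defining start frame with a motion-reference clip. The function name and field names are assumptions made for illustration, not Adobe's actual request schema.

```python
from pathlib import Path

# Hypothetical sketch of the inputs behind camera motion references:
# a start frame that fixes the scene, plus a reference video that supplies the motion.
# Field names are illustrative; Adobe has not published this as a request schema.

def build_motion_reference_request(start_frame: Path, motion_reference: Path,
                                   prompt: str) -> dict:
    """Pair a scene-defining start frame with a clip demonstrating the desired
    camera move (pan, dolly, zoom, tilt) so generated motion stays consistent."""
    return {
        "start_frame": str(start_frame),                    # image that locks subject and framing
        "camera_motion_reference": str(motion_reference),   # video whose movement is replicated
        "prompt": prompt,                                    # optional text guidance for the shot
    }

request = build_motion_reference_request(
    Path("storefront_establishing.png"),
    Path("slow_dolly_in.mp4"),
    "slow dolly-in toward the storefront at dusk",
)
print(request)
```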

Browser-Based Video Editor Enters Public Beta

The Firefly video editor, first announced at Adobe's MAX conference in October, is now available in public beta [2]. This browser-based editor positions itself between basic AI generators and the feature-rich Premiere Pro, offering a middle ground for creators who need more control than chatbot-style tools provide but don't require professional-grade software expertise [2]. The editor includes timeline editing with multitrack control for video and audio, plus text-based editing where users can modify content by working with transcripts, which is ideal for interviews, explainers, or talking-head segments. Users can combine their own footage with AI-generated clips, add music and sound effects through the new timeline view, and export in various aspect ratios [1].
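To show how transcript edits can map back to timeline cuts in general, here is a small, self-contained Python sketch assuming word-level timestamps. The data shapes and the `keep_segments` helper are hypothetical and are not drawn from Firefly's implementation.

```python
# Conceptual sketch of transcript-driven editing: each transcript word carries
# timestamps, so deleting text maps directly to cutting the corresponding
# stretch of the timeline. Data shapes here are illustrative, not Firefly's format.

words = [
    {"text": "Welcome", "start": 0.0, "end": 0.4},
    {"text": "um",      "start": 0.4, "end": 0.7},
    {"text": "to",      "start": 0.7, "end": 0.9},
    {"text": "the",     "start": 0.9, "end": 1.1},
    {"text": "demo",    "start": 1.1, "end": 1.6},
]

def keep_segments(words: list[dict], removed: set[str]) -> list[tuple[float, float]]:
    """Merge consecutive kept words into contiguous (start, end) timeline ranges."""
    segments: list[tuple[float, float]] = []
    for w in words:
        if w["text"].lower() in removed:
            continue
        if segments and abs(segments[-1][1] - w["start"]) < 1e-6:
            segments[-1] = (segments[-1][0], w["end"])  # extend the current range
        else:
            segments.append((w["start"], w["end"]))
    return segments

# Deleting the filler word "um" from the transcript yields two clip ranges to keep.
print(keep_segments(words, removed={"um"}))
```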

Source: PetaPixel

Unlimited Generations Promotion Encourages Experimentation

To drive adoption of these new capabilities, Adobe is offering unlimited generations from all image models and the Adobe Firefly Video Model until January 15 for subscribers of the Firefly Pro, Firefly Premium, 7,000-credit, and 50,000-credit plans [1]. The promotion covers Adobe's own models plus third-party offerings like Black Forest Labs FLUX.2, Google's Nano Banana, and OpenAI's GPT-Image model [5]. While the offer doesn't extend to free users, it removes credit constraints for paid subscribers wanting to test Firefly's expanded video tools. Firefly Pro starts at $19.99 per month, while basic subscriptions begin at $10 per month [4][2].

Future Plans Point Toward Layer-Based Editing

Adobe's roadmap suggests even more granular control is coming. Newcomb indicated that prompt-based editing represents just one tool in a broader precision continuum, with layer-based editing technology planned for future implementation [2]. This behind-the-scenes technology would enable more detailed changes in AI-generated images and videos, allowing creators to stay within Firefly "until the last mile" of editing before moving to Premiere Pro if needed [2]. The company's vision centers on making AI work alongside human editors rather than replacing them, with Firefly serving as a creative hub that addresses common AI video problems like hallucinations, disappearing objects, and other inaccuracies that currently make AI-generated content difficult to use in final professional projects [2].
