
On June 11, Meta announced a slick new AI-powered video editor that works across three of its platforms.
Available in the Meta AI app, the Meta.AI website, and the Edits app, this feature lets users edit short videos using over 50 preset prompts.
In just seconds, it can change your outfit, alter the scene or background, adjust the lighting, or apply an entirely new visual style to your clip.

Image credits: Meta
Meta says it designed these prompts based on feedback from creators to make editing fast and fun.
“We built this feature so that everyone can experiment creatively and make fun, interesting videos to share with their friends, family and followers,” the company wrote in its announcement.
Meta worked with video creators to choose prompt styles that would appeal to their audiences and integrated the tool into the Edits app to keep it a seamless part of the creative process.
How to use the AI video editor?
All you need to do is upload a video (the AI uses up to the first 10 seconds of it), pick from the dozens of preset prompts, and boom – your video gets a fun makeover!
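If you want control over which 10 seconds the tool works with, one option is to trim the clip yourself before uploading. Here is a minimal sketch using the ffmpeg command-line tool (this is not part of Meta's apps; it assumes ffmpeg is installed, and the file names and start time are placeholders):

```python
# Pre-trim a clip to the 10-second window you want the AI editor to work on.
# Assumes ffmpeg is installed and on your PATH; paths below are placeholders.
import subprocess

def trim_clip(src: str, dst: str, start: float = 0.0, duration: float = 10.0) -> None:
    """Cut `duration` seconds from `src` starting at `start` and write to `dst`."""
    subprocess.run(
        [
            "ffmpeg",
            "-y",               # overwrite the output file if it already exists
            "-ss", str(start),  # seek to the segment you actually want edited
            "-i", src,
            "-t", str(duration),
            "-c", "copy",       # stream copy: fast, no re-encode (cuts land on keyframes)
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    trim_clip("my_clip.mp4", "my_clip_10s.mp4", start=5.0)
```

Stream-copying keeps the trim fast, but it can only cut at keyframes; re-encode instead if you need a frame-accurate cut.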

Edited videos can be shared directly to Facebook or Instagram from the Meta AI app or Edits app, and to the Discover feed on Meta.AI.
The feature is rolling out in the US and more than a dozen other countries.
It's free to try for now, though Meta says that's only for a ‘limited time’, and each session is limited to one 10-second clip.
How the feature works under the hood
The new video editor is built on several AI models Meta has been developing over the past few years.
In 2022, Meta launched a project called Make-A-Scene, a series of generative models for creating images, audio, video, and 3D animations.
Later, Meta developed its Llama Image foundation models to produce higher-quality images and short videos, and in 2023 it introduced the Emu system.
For example, Emu Video can generate a short clip from a text description, while Emu Edit applies quick, instruction-based AI edits to images.
In 2024, Meta combined all of this into a model called Movie Gen. It can generate new videos or edit existing ones using simple text inputs.
Movie Gen can also produce personalized videos from a photo and even create matching sound effects and soundtracks.
The AI video editing feature now rolling out is the first real product to come from this research, bringing Meta’s latest video generation technology into users’ hands.
Role of the Edits app
It’s no coincidence that Meta chose to integrate this AI feature into Edits, its standalone mobile video creation app.
Edits, launched in April 2025 as a full-featured video editor for creators, offers green-screen tools, AI animation, auto captions, and high-quality, watermark-free exports. It topped app store charts and saw over 7 million downloads in its first week of release.
The new generative AI editing function slots into Edits as another creative tool, so creators can apply an AI prompt to a clip as part of their normal editing workflow.
Why it matters
Meta’s new tool makes AI video editing accessible to everyone, no special skills required. You can take a simple clip and turn it into something fun in just one tap.
Many other tech companies, including Google and Adobe, are racing to add AI to their editing tools. Meta is doing the same, but with a focus on making the feature work seamlessly inside its own apps.
You can edit in the Meta AI app or Edits app, then share the video straight to Facebook, Instagram, or Meta’s Discover feed. It only takes a few taps.
Meta says this is just the beginning. They worked with creators to pick prompts that would appeal to people. The goal is to make video editing simple and fun for anyone, and to keep users coming back to Meta’s apps.
As AI tools get better, this could change how more people create and share videos online.
What early users are seeing
This feature is still brand new, and early tests show the results can be a mixed bag.
In some cases the AI’s visual changes look impressive; in others, they can appear a bit uncanny or off-target.
For example, Emma Roth at The Verge tried making her dogs look like they were in a desert, but Meta’s AI turned the ground bright orange, added some oddly purple cacti, and gave the dogs a strange shimmery glow.

In another test, asking for an “anime” style gave the user unnaturally fluorescent pink eyes and lips.
That said, early users are finding the tool fun and easy to use.
What comes next
Right now, you’re limited to the built-in prompt presets.
Meta has already announced that more advanced features are on the way.
“Later this year, you’ll be able to edit videos alongside Meta AI with your own text prompts to make your video edits exactly as you imagine them,” the company said.
In other words, instead of picking from pre-made styles, you will be able to type in a custom prompt describing the effect you want, and the AI will try to apply it. This will give users much more control to get specific or creative with their video edits.
Meta also plans to broaden the availability of the feature to more countries as it fine-tunes the technology.
Since the AI editor is part of Meta’s broader Meta AI assistant initiative, we might eventually see it integrate with other Meta AI capabilities (for example, combining video edits with AI image generation or using Meta’s AI for captions and descriptions). For now, Meta is positioning this launch as just the first step in a larger journey.




