Adobe is moving beyond generative image tools and into broader workflow automation with the debut of its Firefly AI Assistant. This new "creative agent" introduces a conversational interface to the Creative Cloud, allowing users to coordinate complex, multi-step tasks across apps including Firefly, Photoshop, Illustrator, Premiere, Lightroom, and Express using natural language.
The assistant is an evolution of Project Moonlight, which Adobe previewed last year. The goal is to let creators start from the outcome while the agent handles the steps behind the scenes. Instead of hunting through menus or manually clicking through repetitive steps, users can tell the assistant what they want to achieve. Because the agent works directly with native file formats, every change it makes stays fully editable, so a designer can still jump in and fine-tune vectors or pixels by hand. Adobe says users remain in control throughout the process, with the ability to step in and adjust outputs at any stage.
Firefly AI Assistant is designed to maintain context as you work. If you start a project in Firefly and then open Photoshop for detailed editing, the assistant moves with you, remembering the conversation and the project's history across apps and sessions so you don't have to explain the task a second time. The assistant will first arrive within Adobe Firefly, the company's all-in-one creative AI studio.
Adobe is also introducing "Creative Skills," which function as automated playbooks for specific goals. A creator could use a social media skill to have the assistant take a single image, crop it for different platforms, use Generative Extend to expand the background, and then optimize and save the files to the cloud in one go. It can also turn a still image into an animation. Over time, the system is expected to learn a user's preferred tools and workflows, acting more like a personalized extension of their creative process.
The assistant is also aware of what is on screen. It understands the type of content being edited, including images, video, and design assets. For an editor working on a product shot in a specific environment, the assistant might offer a slider to adjust elements like trees or lighting, rather than requiring multiple manual edits. For teams, it integrates with Frame.io to turn reviewer feedback into actual edits, cutting down the time it takes to reach a final version.
Adobe says it is also working to expand access to third-party AI models such as Anthropic's Claude, which could bring additional AI capabilities into its creative tools.
Firefly AI Assistant is expected to enter public beta in the coming weeks.