AI Started Making Creative Decisions For You
How to keep your perspective in the work when AI handles the execution
You use AI to start a project. It gives you something back that’s actually pretty good, maybe better than what you had going in. And then somewhere in the middle of the work, you realize you’ve shifted into response mode.
You’re editing what it gave you instead of executing what you had in mind. So the work is still moving forward, but you’re not sure anymore how much of the direction is actually yours.
That’s the moment your role quietly changed from creator to reviewer.
That shift can happen faster than you expect. And if you’re still building your creative voice, you can end up pretty far into a direction that came from the tool before you realize it’s happening.
Adobe and Parsons School of Design spent a full semester studying how AI was affecting the creative process across a group of design students at different stages of development, documented in Adobe’s report Creativity in the Age of AI.
What they found is a specific observation about what happens when a tool starts operating between you and your own judgment, and what actually needs to be in place before that becomes useful instead of disorienting.
AI Entered Early
When the program started, students weren’t coming in cold. Across every experience level, most had already worked AI into their process before the semester began.
They used it early: getting ideas out faster, visualizing concepts that would’ve taken longer to prototype by hand, and exploring across mediums in ways that pushed their thinking into territory it might not have reached otherwise. For the early, wide-open stages of a project, it fit naturally.
What stood out was what happened as the work progressed. The closer students got to a final decision, to the part of a project where authorship and personal direction are most visible, the more they stepped back from the tools and made the call themselves. They’d use AI to build momentum and then take over to land it. The fast start and the final call had different rules, and students knew it.
That instinct to grab the wheel at the end turned out to be one of the most consistent patterns across the group, and it pointed at something worth looking at before it became a problem.
The Hidden Cost of AI-Assisted Creativity
The conversation around AI in creative work tends to focus on output quality. This program was built to observe the creative process itself.
The goal was to watch where AI supported the work, where it started steering decisions students hadn’t fully made yet, and where students stepped back in to take control again.
Students worked with Firefly, Firefly Boards, Firefly Video Editor, Photoshop, Lightroom, Premiere, and Content Credentials through the Content Authenticity Initiative.
The Tool Picks a Direction Without You
Once students were deep into the work, something started coming up across the group. AI was moving their projects in directions they hadn’t chosen.
Several of the group described moments where they couldn’t separate their own instinct from what the tool had introduced. They’d made decisions throughout the process, but they weren’t sure how many were actually theirs. The work looked like their work, but the path to it was harder to trace.
The mechanism is pretty simple: whatever you bring into the tool gets reflected back. Come in with a clear direction, and AI extends it. Come in with gaps, and AI fills them. The tool responds to what’s there. It works with what you gave it, not what you meant.
If your direction is vague, the output won’t fix that. It will confidently build on top of it.
Brooke Hopper, Senior Principal Designer of Machine Intelligence and New Technology at Adobe, has a name for what usually catches this problem in a slower creative process: creative friction.
It’s the part of the process where you have to choose between two directions that both have potential, push past the obvious first result, or sit with something that isn’t working and figure out why. That friction is often how taste develops. It’s where point of view gets built.
When a tool moves fast enough to smooth over that friction entirely, you can lose your reference point for your own judgment without noticing it’s gone.
Having to slow down and figure out what you actually think is the friction, and that process is where a lot of the real work happens.
“The system is operating faster than your ability to engage with it critically.”
— Brooke Hopper, Senior Principal Designer, Adobe
Speed works when you’ve already set your direction and runs ahead of your judgment when you’re still figuring it out. And once it’s moving, it’s very easy to mistake momentum for clarity.
AI Fills in the Gaps
Students in this program didn’t all have the same experience with the same tools, and the difference came down to clarity.
If someone entered the workflow with a strong direction already established, AI accelerated it. More output, faster execution, and more room to expand ideas they had already decided on.
But when the direction itself was still vague, AI started filling in the missing pieces.
Models are designed to complete patterns. If your instruction is incomplete, the AI keeps generating anyway. The less defined the direction is, the more influence the system has over where the work ends up going.
Several students described moments where they realized the tool had started making decisions in areas they hadn’t fully resolved themselves yet. Not because the AI was “wrong,” but because the gaps were still open enough for the model to start interpreting them.
For students who entered the process with a clearer direction, the exact same tools behaved differently. AI accelerated decisions they had already made instead of making them for them.
“AI can get you somewhere interesting fast, but that doesn’t mean the work is finished. It encouraged me to ask harder questions about the story behind the idea, what decisions were actually mine, and where I needed to step back in to direct the work toward my own perspective.”
— Kiara Chang, Student, Parsons School of Design
What each student brought in determined everything they got back, and the ones who understood that had the most control over where their work went.
The Ask Was Ownership
When students described the tools they wanted, they described something more connected to their own thinking.
They wanted creative ownership. They wanted space to stay close to their own process while they worked, rather than arriving at a result they weren’t sure they’d actually directed. The ask was ownership over their own decision-making, built into the output itself.
The distinction students were drawing was between AI that works alongside your process and AI that replaces part of it. They wanted the first one, and they were specific: tools that responded to their thinking rather than running ahead of it.
Hopper frames this as a design goal for the tools themselves. Moving someone forward is one part of it, but keeping them close enough to their own ideas to keep directing the work is the other part. Those two things don’t automatically come together just because a tool is fast.
AI literacy, in her view, is about knowing when to bring these tools in and when to pull back, and being deliberate about which parts of your process you’re handing over and which parts you’re keeping.
For students in this program, the process itself is often what they’re in creative work for. The back and forth, the exploration, and the part where something isn’t working yet and you push through it anyway, because that’s where a lot of the development actually lives. A tool that compresses all of that into fast output can produce work that feels less personal, and the students here were specific about wanting something different.
Proving the Work Is Yours
These students are stepping into a professional field full of AI-generated content, and they already know it. That context shaped how they engaged with one specific tool in the program.
Content Credentials, through the Content Authenticity Initiative, lets creators attach verified attribution directly to their work. You can connect your name, your social accounts, and other attribution details as verifiable metadata so the provenance of your work travels with it. Whoever receives the work can see how it was made and edited.
For this group, it didn’t land as abstract or technical. It landed as something they needed.
Being able to prove the work is theirs, in a field where AI-generated content is everywhere and authorship is increasingly hard to establish, gave them a concrete way to show up with confidence. To this generation, provenance is a practical tool for professional identity, and they engaged with it exactly that way.
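For readers curious what that attribution actually looks like under the hood: Content Credentials are built on the C2PA standard, where attribution lives in a signed manifest attached to the file. The sketch below is illustrative only; it follows the general shape of a C2PA manifest definition, but the exact fields depend on the tool and spec version, and the name shown is a hypothetical placeholder.

```json
{
  "claim_generator": "ExampleApp/1.0",
  "assertions": [
    {
      "label": "stds.schema-org.CreativeWork",
      "data": {
        "@context": "https://schema.org",
        "@type": "CreativeWork",
        "author": [{ "@type": "Person", "name": "Jane Doe" }]
      }
    },
    {
      "label": "c2pa.actions",
      "data": {
        "actions": [
          { "action": "c2pa.created" },
          { "action": "c2pa.edited" }
        ]
      }
    }
  ]
}
```

On export, a tool that supports Content Credentials signs a manifest like this and embeds it in the asset, so anyone who receives the file can verify who made it and what edits were applied along the way.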
For those of us scaling businesses and brands, the lesson is clear: Speed is a liability when you're still defining your direction.
Don't let the tool outrun your judgment.
Curiosity Showed Up
Hopper came in expecting some hesitation. Students arrived curious instead.
They weren’t just exploring what the tools could do. They were interrogating them, asking hard questions about ownership, intent, and where the line between useful and overreaching actually sits. They wanted to understand how these tools connect to authorship, responsibility, and what it means to develop as a creative when the tools available to you keep shifting.
Several students came up to the Adobe team after the program and thanked them, not for the tool access, but for being willing to engage on those harder questions instead of stepping around them. What they were really thanking Adobe for was honesty about the limits of the tools, about what’s still being figured out, and about the fact that these are genuinely hard questions without settled answers inside the companies building them. Students noticed the complicated parts weren’t treated as off-limits.
Hopper also noted something that reframes the whole picture: not every designer working on generative AI is fully committed to an AI-first direction, and she names that as a strength. Having people inside the building process who are willing to question assumptions, push back on decisions, and ask where something might go too far means more care goes into the work. The tension students were working through in the classroom is the same tension designers are carrying at their desks.
The students finishing this program aren’t waiting to see what generative AI becomes. They’re already part of the conversation that shapes it.
This is exactly what we train for inside AI Flow Club.
How to use AI without handing over your judgment. How to move faster without losing your own direction in the process.
Join us at AI Flow Club.