If AI Is Driving the Campaign, What’s the Job of a Brand Team?
The platforms aren’t shy about where this is heading. Meta says all you need is a goal and a budget and the system will “do the rest.” Google’s pitch sounds the same. And early results are wild: a 30% spike in advertisers adopting Meta’s AI tools, video ads being reformatted automatically with generative pixels, and reported lifts as high as 46% in incremental conversions.
This is the part where CMOs are supposed to celebrate. The promise is seductive: launch faster, scale wider, personalize everything, and never touch an A/B test again.
Yet every marcomm leader I talk with has the same concerns. Output goes up, clarity goes down, and unease sets in when no one can name who is steering the story.
If AI is driving the campaign, what’s the job of your brand team?
Spoiler: “Do the rest” is not a strategy. And it’s not accountability.
The Illusion of Full Automation
Marina Cooper, Senior Associate Vice President for Integrated Marketing and Brand at Johns Hopkins University, hears the same platform promises we all do. And when we spoke earlier this year on my podcast, her first instinct wasn’t panic.
These tools help entrepreneurs and small teams who need a marketing engine they don’t have to build themselves. But for an institution with a complex brand, diverse audiences, and real reputational risk, automation raises the bar for judgment.
JHU, like many universities, markets to a teenager in the inquiry phase and a 72-year-old donor in the same hour. No algorithm can hold that nuance alone. And it shouldn’t.
When platforms say, “trust us,” what they mean is: We’ll optimize for the metric. You own the consequences.
Brand Teams Only Lose Control If They Hand It Over
The fear isn’t irrational. If AI is optimizing for clicks, not narrative, it will drift toward the familiar traps: short-term performance over long-term meaning, cleverness over clarity, and sometimes outright brand misalignment. Higher ed leaders know exactly what’s at stake. One off-tone ad can undo months of trust-building with faculty, boards, legislators, and alumni.
And Marina was blunt about the operational reality: real-time creative changes sound great, until something unfolds on campus that requires tone awareness an algorithm doesn’t have. Institutions live in a world where context shifts by the hour. Your campaign has to shift with it. Machines don’t know when not to publish.
Brand safety stops being a marketing issue and becomes a governance issue.
The Brand Team’s Role Is Shifting
If AI is driving the campaign, the brand team’s job becomes three things:
1. Define the guardrails before the campaign launches
Guardrails are boundaries that keep institutional trust intact. Leaders need to set:
Creative limits: What can change (layout, weight, hierarchy) and what stays static (tone, core photography, message pillars)?
No-go zones: Sensitive campus topics, crisis contexts, political references, or fragile narratives the machine cannot touch.
Brand safety rules: Where ads can and cannot appear, and what automated systems must never generate (for example, fabricated university imagery — an actual risk Marina called out).
This front-end clarity prevents back-end chaos.
2. Establish oversight you can actually execute
Once dynamic creative is in motion, your oversight model matters more than your asset library. You need:
A monitoring cadence that matches the speed of the algorithm. Daily for some teams. Hourly for others.
A human failsafe capable of hitting pause when performance looks great but the narrative feels off.
A clear owner — and this is critical. Platforms don’t own the outcome of your ad. Vendors don’t own it. Your team does.
Or as Marina framed it: the marketing team must be “ultimately accountable” for what the machine puts into the world.
3. Turn the machine’s signals into strategic intelligence
This is the overlooked upside. When AI recombines assets at scale, you get a high-resolution view of what resonates across segments.
Which images pull in prospective students? Which phrases convert adult learners? Which formats spark donors?
This is where brand teams earn authority. You translate performance signals into upstream decisions:
Stronger program positioning
Smarter creative briefs
More relevant storytelling
Clearer message hierarchy
In other words: you turn machine learning into institutional learning.
Real-Time Optimization Requires Real-Time Leadership
There’s a cost to all of this speed. Marina put it plainly:
“Not being able to answer the why or how we did something puts us in jeopardy of not being able to repeat and again own the success. We know the rules change on these social platforms all the time, so we have to be smart about what we’re investing in and make sure our strategies are repeatable and truly delivering the results we need.”
A black-box result puts leaders at risk — not just of bad performance, but of not being able to repeat success or defend it internally. Higher ed operates under scrutiny. Presidents want explanations. Boards want rationale. Faculty want transparency. “The algorithm said so” is not a governance model.
Your team’s role is to preserve explainability in an environment designed to obscure it.
Ethics Isn’t a Sidebar: It’s the Operating System
We are entering a moment where ethics shows up in every tactical choice. Marina calls this out directly when she says there’s “a little bit of ethics involved in all of this” … and we both know she was being generous. There’s a lot.
Ethics shows up when:
AI adds pixels to a video and subtly alters the meaning of a scene. The platform sees an optimization; your faculty see misrepresentation.
An automated system keeps pushing creative during a campus crisis because the algorithm only knows momentum, not context.
Optimization skews toward one segment in ways that disadvantage another, a risk every enrollment leader understands in their bones.
These are the predictable outcomes of tools trained to reward performance over nuance. And this is where brand teams earn their keep.
Your job is to decide where automation ends. You define what’s permissible, what’s off-limits, and what requires a human with institutional awareness to step in. You protect the meaning behind the metrics. You guard the lived experiences of your students, alumni, and faculty from being flattened into performance data.
So What Is the Job of a Brand Team in 2026?
The job is to shape the inputs, define the boundaries, and lead the oversight. AI can execute, remix, and optimize. But it can’t make the judgment calls that protect reputation, context, and mission.
It can’t decide when speed matters more than tone.
It can’t weigh a 5% lift against a long-term narrative cost.
It can’t feel the temperature of a campus moment.
Leaders can. And the teams who learn to run alongside the machines (not behind them) will move faster, stay safer, and gain the strategic ground everyone else is ceding.
A Question for Your Team
If your campaign changed hundreds of times today and you didn’t touch a thing: Who owns the outcome?
If you don’t have that answer, you don’t have a strategy. You have a system on autopilot.
Now is the moment to decide what you are accountable for.