Problem Statement

DCE users find it hard to understand and manage the Learning Phase because the current process is unclear and inflexible. They need a simpler, more visible way to plan, test, and control how ads run, so they can make smarter decisions and improve campaign performance.

About The Product

DCE is a campaign management platform used by advertisers to plan, schedule, and distribute video ads across different partners. It helps brands test ad variations during a "Learning Phase" to understand what performs best.

Clarifying Scope & Success Metrics

To improve the DCE experience, I focused on making the Learning Phase simpler, clearer, and more flexible for users. The goal was to help them plan, schedule, and test ad variations without confusion or manual back and forth.

I defined clear goals to measure whether the new experience was actually helping users work faster and smarter.

Scope

  • Improve how users set up and manage the Learning Phase from the dashboard.

  • Support both automated and manual ad scheduling.

  • Help users easily understand when an ad is going live and how it's performing.

  • Reduce dependency on backend teams.

Success Metrics

  • Users can complete Learning Phase setup without help or confusion.

  • Campaigns show multiple ad variations during testing.

  • Manual scheduling requests drop significantly, saving time for users and developers.

  • Time to schedule creatives is reduced compared to the previous workflow.

Empathy Mapping

Persona Focus: Campaign Manager / Ad Ops Strategist.


Goal: Confidently schedule and test multiple ad variations during the Learning Phase with minimal confusion or manual work.

Thinks & Feels

  • “I want control, but I also want the system to help me.”

  • “If I can’t see what’s scheduled, how do I trust that it’s right?”

  • “This should be faster; I’m doing the same thing every day.”

  • “I don’t know if all my ad variations are actually going live.”

Says & Does

  • Tries to work manually using spreadsheets.

  • Checks ad placements and delivery settings multiple times.

  • Asks colleagues or developers to confirm or fix ad delivery.

Sees

  • A list of ads with no indication of what’s scheduled or live.

  • No timeline or calendar to review ad delivery.

  • Repetitive backend requests to change and schedule creatives manually.

  • A complex UI with unclear flow or next steps.

Hears

  • “The client wants more ad versions early in the campaign.”

  • “You’ll have to ask a developer to manually change or schedule that.”

  • “Why isn’t there a better way to see what’s running?”

  • “We missed performance data on some variations again.”

Actionable Takeaways

  • Build an automated system to distribute multiple ad variations without manual effort.

  • Ensure the system selects a diverse set of creatives to maximize learning.

  • Provide a calendar view displaying upcoming ad schedules, live statuses, and available slots.

  • Show impression count, engagement, and creative performance in one view.

  • Alert users if any ad is rejected.

  • Allow users to replace an ad.

Empathy Interviews

An empathy interview is a one-on-one conversation with a user to learn about their experiences, needs, and goals. The aim is to surface insights into needs and desires that may be unspoken, or even unrecognized by the user.

In an empathy interview, a researcher asks a single user open-ended questions about a specific topic. The questions are designed to encourage the user to share their stories and experiences in their own words.


Participants

  • Creative Strategist: Age 35, manages global ad campaigns.

  • Campaign Manager: Age 29, responsible for ad scheduling and performance tracking.

  • Can you walk me through how you currently set up ads during the Learning Phase?

    "We upload manually and schedule each creative day by day, but I can’t see what’s going out tomorrow or even if the right variations are being tested."

  • What’s the biggest pain point when scheduling or rotating creatives?

  • How do you know if a creative is performing or not during Learning?

  • Do you prefer automation or manual control, or both?

Understanding other available tools

I wanted to understand how other platforms like Smartly.io and Celtra solve similar problems in ad scheduling and automation. By studying their platforms, I was able to identify how they streamline the learning phase, manage creative variety, and offer visibility into ad performance.

| Feature | DCE (Current) | Celtra | Smartly.io |
| --- | --- | --- | --- |
| Learning Phase Control | Basic, unclear process | Strong creative testing tools | Both manual and auto, with control |
| Creative Variation in Learning | Only first clip varied | Full creative testing & rotation | Strong variation testing |
| Manual + Automated (mix) | Not supported (yet) | Manual setup required | Fully supports both |
| Ad Scheduling Visibility | Hidden backend logic | Visual creative planner | Drag-and-drop calendar for creatives |
| Ease of Creative Change | Manual, through the dev team | Visual editor with instant updates | Easy reorder and replace features |
| Impression-based Option | Not available | Custom triggers | Supports both impression- and time-based |
| UI Simplicity | Needs improvement | Modern, sleek creative focus | Clean, intuitive dashboard |

User Needs

  • Clear workflow guidance: Users need step-by-step guidance to move through ad setup and the Learning Phase.

  • Control over the Learning Phase: Users need the flexibility to choose between automated and manual setup when pushing ads live.

  • Ad scheduling visibility: Users want to see what is scheduled so they can avoid last-minute changes.

  • Easy ad management: Users need an intuitive way to search, drop, or replace ads within a calendar.

User Pains

  • Confusing workflow: Users get lost when they reach the “Versions” section without understanding how it connects to the rest of the setup.

  • Limited ad testing: Only the first creative is rotated; the rest must be swapped manually, which hurts data quality.

  • No visibility: There is no clear visual indication of which ads are active or scheduled, making it hard to optimize ad performance.

  • Unclear impression goals: Users are unsure when their impression goals are met, or how to continue testing after hitting the threshold.

User Persona

Meet Sarah, a Digital Marketing Manager who uses DCE to run ad campaigns across different platforms. She needs flexibility, visibility, and the ability to test a wide range of ad variations without relying on daily manual swaps or backend support. Through this persona, I was able to empathize with the challenges users face, especially around unclear workflows.

Goal

  • Run highly effective ad campaigns with data-backed decisions

  • Test a wide variety of creative formats during the Learning Phase

  • Minimize manual work while keeping full visibility and control

  • Meet impression goals while ensuring brand messaging is consistent

Frustrations

  • “I don’t know which ads are live right now or how they’re rotating.”

  • “I want to test more variations, but the system limits me.”

  • “The interface does not tell me where I am in the setup process.”

  • “I always have to rely on a developer to swap ads, why can’t I just do it myself?”

Behavior

  • Likes to plan campaigns a week in advance.

  • Needs transparency and performs checks before launch.

  • Depends on performance metrics and impressions to make decisions.

  • Prefers tools that offer both automation and manual control.

Name: Sarah

Role: Digital Marketing Manager at a retail e-commerce company

Experience: 8+ years in media planning and digital campaigns

Tools used: DCE, Smartly.io, Excel, Slack, Google Analytics

Persona Validation (Human-Centered Approach)

To ensure our persona truly reflected real users, we conducted validation interviews and feedback sessions with Publicis Groupe stakeholders and Digital Marketing Managers, the primary users of DCE.

We took an empathetic approach by asking them to step into the shoes of the persona and walk us through how closely it reflected their daily workflows, pain points, and expectations.

Validation

  • User interviews: We held 1:1 conversations and informal feedback sessions.

  • We asked users to react to the persona: "Does this sound like your day?"

Outcome

  • Users confirmed the persona was highly relatable and accurately described their struggles with workflow clarity and control during the Learning Phase.

  • This gave us confidence that the persona was a strong foundation for design decisions.

User Flow

I created a user flow for the DCE Learning feature, keeping our user persona at the heart of it. The flow is designed to feel intuitive, like a quiet assistant that supports the user in making better, faster, and more thoughtful decisions.

Final Design


Before: Users had to schedule ads manually with no clear guidance, making the process confusing and time-consuming.

After: The flow is intuitive, with visual cues, automation, and a calendar that makes scheduling quick and stress-free.

Before: Users had to schedule one ad set at a time, which was time-consuming and repetitive.

After: Users can plan and manage multiple ad sets at once, which saves effort and improves efficiency.

Before: Users couldn’t easily tell which ad was scheduled or live, and key options were hard to find.

After: Everything is clearly visible, including scheduled ads, live status, and available actions, making campaigns easy to track and manage.

Usability Testing Report

Objective:

To evaluate how intuitive and helpful the redesigned Learning Phase feature is for scheduling and managing creative distribution across campaigns.


Method and Focus Areas:

We conducted interviews with 5 users from Publicis Groupe, all Digital Marketing Managers and Ad Ops Specialists with 3 to 7 years of experience in campaign planning. Each participant was asked to complete specific tasks covering:

  • Ease of understanding the Learning Phase flow.

  • Ability to schedule and preview creatives.

  • Manual vs Automated control experience.

  • Visibility of live ads and scheduling pipeline.

What Worked Well

  • Users found ad set scheduling intuitive and faster than before.

  • The Learning Phase tab gave clear visibility into scheduled ads and saved manual checks.

  • Real-time status indicators such as “Live”, “Scheduled”, “Delivered”, and “Rejected” helped users feel in control.

  • Impression-based optimization was easy to understand.

Accessibility

We wanted to make sure the DCE experience worked not just for expert users but for everyone, including people with visual or cognitive challenges. Our users come from diverse agencies, work under pressure, and need to act fast. If the tool isn't clear, intuitive, and inclusive, it slows them down or even locks them out. Accessibility testing helped us uncover those invisible barriers.

A clear, usable interface helps all users move faster and make smarter decisions, regardless of their ability level.

Use of Color:

  • We used color to communicate ad status (e.g., Live, Scheduled, Delivered) and paired each color with icons and labels for clarity.

Visual hierarchy:

  • All headings, cards, and contrast levels were tested for readability.

Navigation:

  • Interactive elements such as buttons were built to be easily navigable.

Text and Icons:

  • Fonts and icons were optimized for clarity across devices and lighting conditions.

Testing result

What I Learned

Here's what I personally learned from working on the DCE project.

Clarity beats complexity:

  • I learned that users engage better with a clear layout offering fewer, smarter options, which improves both their confidence and speed.


Early testing matters:

  • Talking to users early helped me catch gaps in flow and missing information that weren’t obvious on first drafts.


Design for real work:

  • Users don’t always have time to explore. I had to design flows that fit into a busy, high-pressure marketing environment.
