Media Buyer Interview Questions

Prepare for your Media Buyer interview with common questions and expert sample answers.

Media Buyer Interview Questions & Answers: Complete Preparation Guide

Landing a media buyer role means proving you can blend analytical thinking with strategic negotiation skills—and your interview is where that proof begins. Whether you’re applying to an agency, tech company, or brand marketing department, you’ll face questions designed to assess everything from your campaign planning abilities to your comfort with data and pressure situations.

This guide walks you through the media buyer interview questions you’re likely to encounter, complete with realistic sample answers you can adapt to your experience. We’ve organized these by question type so you can practice the areas where you need it most.

Common Media Buyer Interview Questions

What’s your approach to defining and understanding a target audience?

Why they ask: Interviewers want to see that you don’t just launch campaigns blindly. They’re looking for evidence that you understand how to use data and research to make informed decisions about where to spend money.

Sample Answer:

“I start with the research the client or marketing team has already done—demographics, psychographics, past customer data. Then I layer in what I’m seeing in the platforms themselves. For example, in one campaign targeting women aged 25-40 interested in fitness, I didn’t just trust the demographic data. I spent time looking at engagement patterns on Facebook and Instagram, analyzed search trends, and even looked at the content our audience was sharing. I noticed the highest engagement came during early morning hours on weekdays, and the messaging that performed best was results-focused rather than inspiration-focused. So I adjusted my media plan to prioritize those early-morning weekday dayparts and briefed creative toward results-focused messaging. That level of audience understanding directly translated to a 22% lower CPC than our initial projections.”

Personalization tip: Replace the fitness example with a real campaign you worked on. If you don’t have direct experience, talk about how you’d approach it step-by-step, mentioning specific tools like Google Analytics, social listening platforms, or survey data.

How do you approach budget allocation across different media channels?

Why they ask: Media buying is fundamentally about spending money wisely. Interviewers need to know you think strategically about channel mix and don’t just default to “spend everywhere equally.”

Sample Answer:

“I start by understanding what each channel does best for the specific objective. If we’re driving awareness among a cold audience, I’ll typically weight toward channels with broad reach—maybe 40% toward display and social. If the goal is conversion, I shift heavier toward search and retargeting, maybe 50% toward those high-intent channels. Then I look at historical performance data. If we’ve run similar campaigns before, I use those benchmarks. If it’s new territory, I’ll allocate a test budget—usually 10-15%—to explore underutilized channels before committing the bulk. I also build in flexibility. After the first two weeks of a campaign, I analyze performance and reallocate toward the channels actually delivering the best ROI, even if that means pulling budget from what I initially thought would work best. For a B2B tech client last year, I initially allocated 30% to LinkedIn. Performance data showed their audience was actually more active on Google Search and YouTube, so by week three, I had shifted that to 50% search, 20% YouTube, and 15% LinkedIn. The adjusted mix improved our cost per lead by 31%.”

Personalization tip: Mention the specific metrics you’d look at (CPA, ROAS, CTR) based on your campaign goals. If you’re new to the role, walk through your decision-making framework rather than a specific result.
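If you want to sanity-check the reallocation logic behind an answer like this, it reduces to simple arithmetic: weight each channel by its efficiency and split the next period's budget accordingly. A minimal Python sketch—the channel names, CPL figures, and the inverse-CPL weighting are all illustrative assumptions, not any platform's actual API or method:

```python
def reallocate(budget, cpl_by_channel):
    """Shift budget toward channels with a lower cost per lead (CPL).

    Weights each channel by 1/CPL, so cheaper leads earn a larger
    share of the next period's budget. Hypothetical illustration;
    real reallocation would also consider volume caps and lag.
    """
    efficiency = {ch: 1 / cpl for ch, cpl in cpl_by_channel.items()}
    total = sum(efficiency.values())
    return {ch: round(budget * eff / total, 2)
            for ch, eff in efficiency.items()}

# Invented week-two CPL figures for a B2B campaign
cpl = {"linkedin": 120.0, "search": 60.0, "youtube": 80.0}
print(reallocate(10_000, cpl))
```

Search, with the cheapest leads, ends up with the largest share—the same direction of shift described in the sample answer above.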

Walk me through how you’d analyze an underperforming campaign.

Why they ask: This tests your troubleshooting ability and whether you can handle pressure calmly. They want to see that you dig into data rather than panic or make emotional decisions.

Sample Answer:

“I’d break it down systematically. First, I’d look at the macro numbers—are we hitting impressions? Getting clicks? Converting? This tells me whether it’s a reach problem, engagement problem, or conversion problem. Then I’d dig channel by channel. I worked on a campaign once where overall ROAS was down 40% from projections. Top-level it looked terrible, but when I segmented by channel, I found that search was actually performing well, but programmatic display was tanking. So it wasn’t a strategy failure—it was a placement or targeting issue in one channel. I pulled the programmatic data and found we were getting a lot of impressions on low-quality sites. I adjusted our site exclusions, tightened our contextual targeting, and reallocated some of that underperforming budget to our best-performing placements. We recovered 80% of the lost ROAS within two weeks. The lesson I learned is that ‘underperforming campaign’ rarely means the whole thing is broken. Usually there’s a specific lever that needs adjusting.”

Personalization tip: Choose a real example where you actually fixed something. Explain the specific metrics you looked at and what you changed. If you’re entry-level, describe a hypothetical but realistic scenario.

Tell me about a time you negotiated a better rate or placement with a media vendor.

Why they ask: Negotiation directly impacts profitability and ROI. They want proof that you don’t just accept rate cards—you advocate for your clients or company.

Sample Answer:

“I was working on a year-long campaign with a solid budget for an online publication we wanted to use heavily. Instead of just booking at their standard rates, I approached them with our data. I showed them our previous performance on their platform, our audience overlap, and the fact that we’d be committing to a large, consistent spend. I also came prepared with what competitor sites were charging for similar placements. I asked for a 20% discount on their rate card. They countered at 12%. We landed at 18%. I gave up two points from my initial ask, but here’s what made the trade work: I committed to a guaranteed monthly spend level and gave them a two-quarter commitment upfront. In exchange for predictability on their side, they gave me better rates. We maintained that relationship for two years, and because I’d negotiated good rates early, I was able to increase our frequency throughout the year without blowing the budget. The campaign’s ROAS actually improved as we optimized based on performance data.”

Personalization tip: Include the specific outcome. How much did that negotiation save the company? How did it impact the campaign? If you haven’t negotiated directly, talk about what you’d prepare (competitor pricing, performance data, commitment level).

How do you measure campaign success?

Why they ask: This reveals whether you’re data-driven and outcome-focused. The “right” metrics depend on the business goal, so they want to see strategic thinking, not rote answers.

Sample Answer:

“It depends entirely on the campaign objective. For a brand awareness campaign, I’m looking at reach, frequency, and brand lift if we’re running a study. For lead generation, it’s CPA and cost per qualified lead—and I always push to understand what ‘qualified’ means on the business side so I’m measuring the right thing. For ecommerce, it’s ROAS, period. I also look at diagnostic metrics. If ROAS is down, I want to know why: Is it because CTR dropped? Is the audience narrow? Did creatives fatigue? I track those metrics—CTR, engagement rate, conversion rate—so I can actually diagnose problems, not just report that something underperformed. In my last role, I worked on a campaign where our KPI was revenue generated from ad spend. I didn’t just track revenue; I also tracked metrics like view-through rate and time-to-conversion to understand the full customer journey. That insight helped me adjust my media mix in real time to prioritize channels that were driving higher-quality customers, not just cheaper conversions. We beat our revenue target by 23%.”

Personalization tip: Name the specific metrics you’ve used and why they matter. Show that you understand the difference between vanity metrics and business metrics.
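The diagnostic metrics named in this answer are simple ratios, and computing them together makes it easy to trace a drop in one back through the funnel. A minimal Python sketch with invented figures (the numbers are illustrative only):

```python
def funnel_metrics(impressions, clicks, conversions, spend, revenue):
    """Core diagnostic ratios for a paid campaign.

    Inputs are raw counts and currency; the example values below
    are made up for illustration.
    """
    return {
        "ctr": clicks / impressions,    # click-through rate
        "cvr": conversions / clicks,    # conversion rate
        "cpa": spend / conversions,     # cost per acquisition
        "roas": revenue / spend,        # return on ad spend
    }

m = funnel_metrics(impressions=100_000, clicks=2_000, conversions=100,
                   spend=5_000, revenue=20_000)
print(m)  # ctr 0.02, cvr 0.05, cpa 50.0, roas 4.0
```

If ROAS falls, checking CTR and CVR side by side tells you whether the problem is upstream (ads not earning clicks) or downstream (clicks not converting)—exactly the diagnosis the sample answer describes.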

What’s your experience with programmatic buying and DSPs?

Why they ask: Programmatic is table stakes in modern media buying. They want to know your comfort level and whether you understand how it works beyond just “automated buying.”

Sample Answer:

“I’ve worked with programmatic buying for about three years now, across several DSPs—primarily The Trade Desk and DV360, but I’ve also used smaller platforms depending on client needs. I understand the basics: you’re bidding in real time on ad inventory, usually using first-party data for targeting and exclusions. What I’ve learned is that programmatic isn’t a set-it-and-forget-it channel. Early on, I made that mistake. I set up campaigns, and they performed okay but not great. Then I started treating programmatic like any other channel—reviewing daily performance, adjusting creative, testing different audience segments, and refining site exclusions and contextual targeting. The difference was huge. Once I started actively managing it, performance improved by about 40%. I also learned the importance of brand safety. I always build out thoughtful exclusion lists and use contextual targeting to avoid placements that don’t align with the brand. That’s saved clients from some really bad situations. I’m also staying on top of the privacy shifts—less reliance on third-party cookies, more first-party data integration. I’ve started implementing server-side tracking and testing cohort-based audiences instead of individual targeting.”

Personalization tip: Mention specific DSPs you’ve used and name one concrete optimization you made. If you’re new to programmatic, talk about what you want to learn and why.

Describe your experience with A/B testing and creative optimization.

Why they ask: Testing is how media buyers improve performance. They want to see that you’re systematic about it and that you understand statistical significance.

Sample Answer:

“I A/B test constantly, but not randomly. I’m very deliberate about what I’m testing and why. I always test one variable at a time so I can actually attribute results to something specific. For example, I might test two different headlines on search ads across the same audience, same placement, same time period—just the headline changes. I also make sure I’m giving the test enough time and volume to matter. I won’t call a result significant until I hit statistical significance, which usually means a 95% confidence level for me. Early in my career, I’d jump on early performance signals too fast and make decisions based on 500 clicks when I should have waited for 5,000. I learned that lesson the hard way. In a recent campaign, I tested video creative length—15 seconds versus 30 seconds—for a mobile audience. The 15-second version had a higher click-through rate initially, but once I looked at the data after a full week across a larger sample, the 30-second version actually drove more conversions despite a lower CTR. That’s why I wait for the full picture. I also test continuously: different CTAs, different audience segments, different placements. It’s part of how I optimize campaigns week over week.”

Personalization tip: Describe a specific test you ran and the actual outcome—even if it surprised you. Show that you understand the difference between statistical significance and random noise.
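The 95% confidence threshold mentioned in this answer usually means a two-proportion z-test on the variants' conversion rates. Here is a standard-library-only Python sketch; the click and conversion counts are invented, and this is the textbook test, not any ad platform's built-in significance calculator:

```python
from statistics import NormalDist

def significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test: is the difference in conversion rates
    between variants A and B statistically significant at alpha?

    Returns (is_significant, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value < alpha, p_value

# Invented counts: 5,000 clicks per variant
print(significant(250, 5_000, 310, 5_000))
```

Run the same rates through one-tenth the sample (25/500 vs 31/500) and the difference is no longer significant—the "500 clicks versus 5,000" lesson from the answer above in miniature.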

How do you stay current with industry trends and platform changes?

Why they ask: The media landscape changes constantly. They want proof that you’re a learner and that you’ll adapt to their specific needs.

Sample Answer:

“I’m obsessive about this because the tools and platforms change so fast. I subscribe to eMarketer’s daily insights, and I follow key voices on LinkedIn—practitioners covering Meta ad platform changes, folks from Google covering search updates. I attend AdWeek when I can afford it. I also join Slack communities and forums where media buying professionals share real-world insights, not just marketing speak. The theory is interesting, but the practical stuff—‘Hey, we’ve noticed Facebook’s performance has dropped since the iOS update, here’s how we adjusted’—that’s gold. I’ve also gotten more hands-on about testing new features. When Google Ads rolled out Performance Max, I set up a small test campaign rather than waiting to hear how it worked. I spent a few grand to figure out if it made sense for our clients. For some, it’s been great. For others, the loss of control over placement isn’t worth it. I wouldn’t know the difference without testing it. Recently, I’ve been diving into zero-party and first-party data strategies because third-party cookies are going away. I’ve implemented custom audiences based on customer data, and I’m learning about contextual targeting alternatives. It’s not just trends for trends’ sake—it’s about staying ahead of the changes that’ll impact our ability to target effectively.”

Personalization tip: Name specific publications, people, or communities you follow. Mention one recent change in platforms you’ve already adapted to or plan to explore.

How would you handle a client or manager pushing to spend money on a media channel you don’t think will work?

Why they ask: They’re testing your confidence and your ability to advocate professionally, even when you disagree. They want someone who stands by data, not someone who just agrees with authority.

Sample Answer:

“I’ve been in this situation multiple times. My approach is to never just say ‘no, that won’t work.’ Instead, I come with data. I’d say something like, ‘I understand why you want to explore that channel—here’s what we’ve seen in our data on similar audiences, here’s what competitors are reporting, and here’s why I think it’s not the best use of this budget.’ Then I offer alternatives. Maybe instead of pushing back with a hard ‘no,’ I propose a small test. ‘Let’s allocate 5% of the budget to test it. We’ll give it three weeks, see if our assumptions hold up, and if it works, we’ll scale it. If it doesn’t, we’ll reallocate.’ That way, they get to try their idea, but we’re not blowing the whole budget. I had a manager once who was convinced TikTok was where we needed to be for a B2B financial services client. My data said it wasn’t a fit. I could have fought it, but instead I said, ‘Let’s test it. $2,000, four weeks.’ We did, and the data confirmed my hypothesis—terrible fit, high CPAs, low conversion. But because we tested it instead of me just being stubborn, they learned it themselves. More importantly, they trusted my judgment going forward because I was willing to prove my point with data, not just opinion. I’d rather lose some battles on small tests than win arguments and have the client distrust my recommendation on bigger spends.”

Personalization tip: Show that you’re confident but not arrogant. Use phrases like “I understand why” and “Let me show you the data.” Offer a compromise (the small test) rather than a brick wall.

Describe your experience working with different types of clients or campaigns.

Why they ask: They want to know if you’re versatile and if you can adapt your strategy to different business models. A strategy for ecommerce is different from B2B lead gen, which is different from brand awareness.

Sample Answer:

“I’ve worked across B2B, ecommerce, and DTC brands, and the approach really does differ. Early in my career, I worked on a B2B software campaign, and I learned that longer sales cycles and multiple decision-makers completely change how you think about media. We couldn’t just measure to first click or last click—we had to understand the full funnel and attribution across touchpoints. Awareness campaigns to build familiarity with the brand, mid-funnel content to educate, conversion campaigns to close. It took months from first touch to closed deal, so I had to get comfortable with reporting to clients on leading indicators—engagement metrics, demo requests, not just immediate sales. Then I moved to ecommerce, and it’s almost the opposite. I’m in a sprint mindset. Quick tests, rapid optimization, daily monitoring. The feedback loop is fast; I can see results in days, not months. With DTC specifically, I’ve learned that customer acquisition cost is only part of the equation—I need to understand lifetime value, repeat purchase behavior, and how to balance acquisition with retention spending. The creative changes, the channels change, the metrics change. What’s constant is understanding the business model first, then building the media strategy around it. For this role, I’d want to understand what type of campaign I’d be managing so I can talk about relevant experience or quickly get up to speed on the nuances.”

Personalization tip: If you’ve worked across multiple types of campaigns, highlight the differences you’ve learned. If you specialize in one area, be specific about how you’d approach a different type.

What tools and platforms are you most comfortable using?

Why they ask: They need to know if you’ll require training on their specific stack or if you can jump in quickly. They also want to assess how much hands-on work you do versus delegating.

Sample Answer:

“I’m very hands-on, which I think is important for a media buyer. I work regularly with Google Ads and Google Analytics, The Trade Desk for programmatic, DV360, Facebook Business Manager, and LinkedIn Campaign Manager. I’m comfortable pulling custom reports from these platforms and I’ve spent time with Google Sheets and Excel to build dashboards that help clients see performance at a glance. I’ve also used Tableau for more complex analysis and reporting. On the analytics side, I’m solid with Google Analytics 4—I understand events, conversion tracking, and how to set up proper UTM parameters so that my media data actually connects to what’s happening on the website. I’m not a data scientist, but I’m comfortable enough with SQL to pull data from a CDP if I need to. I’ve also used marketing automation platforms like HubSpot to understand how leads flow through the system, which helps me understand what ‘qualified lead’ means and optimize toward that. That said, I’m a fast learner with tools. If your team uses platforms I haven’t worked with before, I can get up to speed quickly. What’s more important to me is understanding the strategic principles—how to think about audience targeting, how to interpret data, how to test and optimize—because the specific platform is just the vehicle for executing that strategy.”

Personalization tip: Be specific about what you can do with each tool (not just “I know Google Ads”). Mention one tool they use and how you’d apply your experience to their specific needs. Also note where you’re open to learning.

Can you walk me through a full campaign you managed from start to finish?

Why they ask: This is often asked near the end of an interview, and they’re looking for evidence of end-to-end thinking. Can you define goals, plan, execute, and measure success?

Sample Answer:

“Sure. I managed a campaign for a SaaS company selling project management software. The goal was to drive demo requests from mid-market companies. Here’s how I approached it. First, I spent time understanding the audience—who are the decision-makers? I learned it was usually project managers and operations managers at 50-500 person companies. I worked with the marketing team to understand messaging. We had different value propositions for each role. Then I built the media plan. We went with search—because people were already actively looking for solutions—LinkedIn for account-based targeting of specific companies, and display remarketing. I allocated about 50% to search, 30% to LinkedIn, 20% to display. I set up tracking so we could measure from click to demo request, and I worked with the sales team to define what a qualified lead looked like. We ran for 90 days. I monitored performance daily, testing different ad copy, different landing pages, adjusting bids based on performance. Search was working great from day one, but LinkedIn was underperforming initially. Instead of killing it, I dug in. I realized our audience targeting was too broad. I refined it to only target people with certain job titles at companies of a certain size. That adjustment took about two weeks to implement and approve, but it improved our cost per demo by 35% on LinkedIn. After 90 days, we’d generated 200 demo requests at a cost of about $45 per demo, which was well below the target of $65. Total campaign spend was about $9,000. We also learned a ton about what messaging resonated and what channels worked best for this audience, which made future campaigns more efficient. We ultimately extended the campaign for another two quarters.”

Personalization tip: Use a real campaign if possible. If you don’t have extensive experience, walk through what you’d do hypothetically, step by step. Include both strategy and execution details, plus at least one optimization you made during the campaign.

How do you handle pressure and tight deadlines?

Why they ask: Media buying often involves last-minute changes, urgent campaigns, and reporting deadlines. They want to know you stay focused and don’t make reckless decisions under pressure.

Sample Answer:

“I actually perform well under pressure, which is probably why I’m drawn to this role. That said, I try to prevent unnecessary crisis situations through good planning. I build in buffer time for approvals, I anticipate questions before they come up, I check campaigns daily so small issues don’t become big ones. But when there is genuine urgency—which happens—I have a system. I get clear on what’s actually urgent and what just feels urgent. I ask clarifying questions, understand the real deadline and why, and then I prioritize ruthlessly. For example, we once had a campaign that needed to launch in 48 hours instead of the planned two weeks. Instead of panicking, I figured out what was essential for launch—core creative, basic targeting, baseline budget—versus what could come later—the full optimization, advanced segmentation. We launched with the essentials, then spent the next week dialing in optimization. Did it perform perfectly? No. Did it perform well enough? Yes. It drove 85% of what we projected, and by week two it had caught up. What I won’t do is make rushed decisions that create bigger problems. I’d rather explain why we need more time than ship something that hurts the brand or wastes budget. That’s a conversation I’ve had with managers before, and honestly, I respect managers who appreciate that distinction.”

Personalization tip: Give a real example of a tight deadline situation. Show your thought process, not just that you delivered. What did you prioritize and why?

Behavioral Interview Questions for Media Buyers

Behavioral questions ask you to describe past situations using the STAR method: Situation (set the context), Task (what you were responsible for), Action (what you did), Result (what happened). These answers should be specific and measurable.

Tell me about a time you had to work across departments to align on media strategy.

Why they ask: Media buyers don’t work in a vacuum. They collaborate with creative teams, product, sales, and marketing leadership. This tests your communication and alignment skills.

STAR Framework:

  • Situation: Briefly set up the context. What was the campaign? Who were the other departments?
  • Task: What was the specific challenge with alignment? What did you need to accomplish?
  • Action: What did you do to bridge the gap? How did you communicate?
  • Result: What was the outcome? How did it impact the campaign?

Sample Answer:

“We were launching a new product line, and the creative team had developed ads focused on design and aesthetics. But when I talked to sales, they told me the real objection they were hearing from prospects was about price and value for money—not design. The creatives were proud of their work, and I didn’t want to kill their momentum, but I knew the messaging wasn’t aligned with what would actually drive conversions. I set up a meeting with creative, marketing leadership, and a couple of sales reps. I came prepared with win/loss data showing what messaging resonated with prospects and what didn’t. I positioned it as ‘here’s what I’m hearing from the market, and I want to make sure we’re not leaving money on the table.’ We ended up in a collaboration where creative kept the visual aesthetic they loved but we rewrote the ad copy to focus on value and specific use cases. We ran both versions side by side in the first week of the campaign. The value-focused version outperformed the aesthetic-focused version by 3:1 on conversions. Once the team saw the data, everyone was on board. It actually led to a standing practice where we’d share sales call insights with the creative team weekly. Better alignment overall.”

Personalization tip: Include the specific data or insight that helped gain alignment (sales data, performance data, customer feedback). Show that you were collaborative, not dictatorial.

Describe a time you failed or made a mistake. What did you learn?

Why they ask: They want to know if you can own mistakes, reflect on them, and actually change your behavior. No one wants someone who blames externals or repeats the same mistakes.

STAR Framework:

  • Situation: What were you doing? What was the campaign or metric?
  • Task: What happened? What was your mistake?
  • Action: How did you respond? What did you do to fix it and improve?
  • Result: What changed? How do you do it differently now?

Sample Answer:

“Early in my career, I over-relied on demographic targeting and didn’t test behavior-based segments enough. I was running a campaign for a financial services company and I confidently set up audience targeting based just on age, income, and location. The campaign launched, and I didn’t even check it for the first week. When I finally looked at performance, the CPCs were 40% higher than expected, and the conversion rate was half what the historical benchmark was. I panicked. I immediately analyzed what went wrong and realized my demographic assumptions were off—the audience I targeted on paper wasn’t actually converting. The real converters had certain behavioral traits I hadn’t included in my original targeting. I was upset with myself for not testing sooner, but I immediately split the audience and started testing behavioral segments. Within two weeks, I’d identified the right audience definition, optimized toward those segments, and brought the CPA back down to expected levels—actually lower than the historical benchmark. What I learned wasn’t just ‘test early,’ though that’s true. I learned that my job is to challenge my own assumptions before launching. I started building in a mandatory review period—no campaign goes live without me stress-testing the core assumptions first. I also learned the importance of checking performance early and often, not just waiting for a full week of data. That’s a habit I’ve kept. It’s saved me from much bigger mistakes.”

Personalization tip: Pick a real failure with stakes. Show how you responded and what changed as a result. Don’t minimize the mistake or make excuses.

Give me an example of how you’ve handled a conflict with a vendor or media contact.

Why they ask: Media buying requires strong negotiation and relationship-building skills. They want to know you can advocate for your company while maintaining professional relationships.

STAR Framework:

  • Situation: Who was the vendor? What was the conflict about (pricing, placement, performance)?
  • Task: What were you responsible for? What outcome did you need?
  • Action: How did you approach the conversation? What was your strategy?
  • Result: How did it resolve? Did you maintain the relationship?

Sample Answer:

“I had been working with a publisher for several quarters, and we had a good relationship. They offered a guaranteed placement and performance level in our contract. About halfway through, they told us they couldn’t deliver the performance we’d agreed to due to ‘unexpected traffic changes.’ They offered to move our ads to a less desirable placement or reduce our spend proportionally. I was frustrated because this was their commitment, not a market change. But I approached it professionally. I scheduled a call, and before the call, I gathered all the data: our contract terms, their stated performance metrics, our actual performance to date compared to benchmarks. During the call, I didn’t come in hot. I acknowledged that traffic shifts happen, and I asked them to walk me through what they were seeing. They explained the situation. I then showed them the data on performance and said, ‘I understand the challenge. Here’s what I think we should do: let’s move to a different placement on your site that historically performs well for similar advertisers, and you cover the performance gap for the next month as we stabilize there.’ They agreed. We ended up getting great performance in the new placement, and I maintained the relationship. In fact, because I’d handled the conflict professionally rather than being adversarial, they offered us a better rate in the next quarter.”

Personalization tip: Show that you can be firm on business terms while remaining professional. Mention something you learned about the vendor’s perspective or constraints.

Tell me about a time you had to learn a new skill or technology quickly.

Why they ask: Media buying tools and platforms change constantly. They want proof that you can learn quickly and adapt rather than getting stuck.

STAR Framework:

  • Situation: What was the tool or skill? Why did you need to learn it?
  • Task: How much time did you have? What was expected?
  • Action: How did you go about learning it? What resources did you use?
  • Result: How did you apply it? What was the business impact?

Sample Answer:

“When I switched agencies, my new team was using The Trade Desk, which I’d never worked with before. They had a campaign launching in two weeks that needed heavy programmatic spend. I told them honestly, ‘I’m not familiar with The Trade Desk yet, but I can learn it.’ I spent my first week basically doing a crash course: I did their training modules, I watched YouTube tutorials, I shadowed a senior person for a day who walked me through a campaign setup. But I didn’t just watch—I set up a small test campaign myself to understand the interface and see how the bid optimization worked. Once I got hands-on, things clicked. I had about 100 questions during that week, and I asked them all. The team was patient because they could see I was genuinely trying to ramp quickly. By week two, I was confident enough to manage the launch campaign myself. It actually performed really well—we hit performance targets by day three. What I learned is that I don’t need to know everything before jumping in. I need to know enough to get started, be willing to ask questions, and learn by doing. That account is now one of our best-performing.”

Personalization tip: Pick a specific tool and show your learning process (training, hands-on practice, asking for help). Include the business result, not just that you learned it.

Tell me about a time you had to influence a decision you disagreed with or advocate for an unpopular idea.

Why they ask: They want to know if you can think independently, communicate persuasively, and handle disagreement professionally. This separates people who are just executors from people who drive strategy.

STAR Framework:

  • Situation: What was the decision? Why did you disagree?
  • Task: What outcome were you trying to drive? What was at stake?
  • Action: How did you build your case? How did you communicate it?
  • Result: Did you influence the decision? What happened?

Sample Answer:

“We were planning to consolidate spending into three channels that were already proven performers. Most of the team wanted to go that route—it’s simpler and lower risk. But I noticed we weren’t testing anything new, and I thought we were missing opportunities. I prepared a case for allocating 15% of budget to test emerging channels—specifically YouTube and TikTok for a younger demographic. I came with data: I showed them growth trends on these platforms for our target audience, I pulled case studies from similar brands, and I was honest about the risk—‘This could underperform. But not testing means we might be missing a 2x opportunity.’ The leadership team was skeptical. We went back and forth a couple of times. What changed their minds was when I committed to a framework: ‘Let’s test it for 30 days. If the performance is within 20% of our benchmark CPA, we scale. If it’s worse, we kill it and reallocate.’ That gave them a clear exit and a deadline. We tested it. TikTok underperformed, but YouTube actually outperformed our benchmarks by 12%. Because I’d pushed to test, we found a new efficient channel. The company expanded YouTube spend across multiple campaigns after that. If I’d just gone along with consolidating spend, we’d have left money on the table.”

Personalization tip: Show that you came prepared with data, not just opinion. Also show that you were open to feedback and willing to structure a test if people were hesitant.

Technical Interview Questions for Media Buyers

Technical questions dig into your hands-on expertise. These questions often have multiple valid approaches, so focus on showing your thinking process rather than one “right” answer.

How would you calculate ROAS for a campaign? Walk me through your thinking.

Why they ask: ROAS (Return on Ad Spend) is fundamental to media buying. This tests whether you understand the concept and can apply it correctly, including the subtleties (what counts as revenue? what about attribution?).

How to Approach:

  1. Start with the basic formula: Revenue attributed to the campaign ÷ Total ad spend = ROAS
  2. Then address the complexity: What counts as revenue? Is it all conversions, or only conversions that meet quality criteria? What’s your attribution model?
  3. Give a concrete example with specific numbers
  4. Mention limitations of ROAS so they know you think critically

Sample Answer:

“The simple formula is revenue attributed to the campaign divided by ad spend. So if I spend $10,000 and generate $50,000 in revenue, that’s a 5:1 ROAS. But here’s where it gets complicated. First, what revenue counts? If I’m selling a product with a high return rate, do I count revenue at purchase or after the return window closes? For most of my work, I count revenue at the point of sale, but some companies track it differently. Second, attribution. Am I using last-click attribution, which gives all the credit to the last touchpoint? Or am I using a model that credits the entire customer journey? I usually try to get the best attribution model the company has—multi-touch if possible. Because if I’m relying on last-click attribution, I might be overvaluing my media spend if customers saw an ad, didn’t convert, then came back directly and bought. My media gets credit it didn’t entirely deserve. Third, I always segment ROAS by channel. Overall ROAS might be 4:1, but search might be 6:1 and display might be 2:1. That breakdown tells me where to spend more money. And I’m honest about the limitations: ROAS doesn’t tell me about profit margin—a 5:1 ROAS sounds great, but if my product margin is only 15%, I might be losing money. So I try to understand the relationship between ROAS and actual profitability for the business. A concrete example: Last year, I managed an ecommerce campaign with $50,000 spend. We generated $210,000 in attributed revenue. That’s 4.2:1 ROAS. But when I broke it down, search was 5.8:1 and display was 2.3:1. So I shifted $10,000 of budget from display to search, and next month, the overall ROAS improved to 4.7:1. That’s how I use ROAS in practice.”
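The channel-level segmentation in that answer can be sketched in a few lines. The per-channel spend and revenue figures below are hypothetical, chosen to reproduce the 5.8:1 and 2.3:1 ratios mentioned; they are not the actual campaign numbers:

```python
# Hypothetical per-channel figures illustrating channel-level ROAS.
channels = {
    "search":  {"spend": 30_000, "revenue": 174_000},  # 5.8:1
    "display": {"spend": 20_000, "revenue": 46_000},   # 2.3:1
}

def roas(revenue: float, spend: float) -> float:
    return revenue / spend

for name, c in channels.items():
    print(f"{name}: {roas(c['revenue'], c['spend']):.1f}:1")

total_spend = sum(c["spend"] for c in channels.values())
total_revenue = sum(c["revenue"] for c in channels.values())
print(f"overall: {roas(total_revenue, total_spend):.1f}:1")

# ROAS alone ignores margin: at a 15% product margin, the
# break-even ROAS is 1 / 0.15, about 6.7:1 -- so even a
# healthy-looking 5:1 ROAS would lose money.
break_even_roas = 1 / 0.15
print(f"break-even ROAS at 15% margin: {break_even_roas:.1f}:1")
```

The break-even line makes the answer's margin caveat concrete: dividing 1 by the product margin gives the minimum ROAS at which the campaign stops losing money.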

Personalization tip: Show that you understand both the straightforward calculation and the nuances (attribution, revenue definition, segment analysis). If you have access to real numbers from your experience, use them.

How would you approach setting up a new paid search campaign from scratch?

Why they ask: Paid search is a foundational media buying skill. This tests your ability to think through strategy, keyword research, budgeting, and success metrics in a logical order.

How to Approach:

  1. Start with the goal and audience — What are we trying to achieve and for whom?
  2. Keyword research and structure — What keywords indicate intent? How would you organize campaigns?
  3. Budget and bid strategy — How much to spend? What’s the bidding approach?
  4. Conversion tracking and measurement — How will you measure success?
  5. Timeline — When would you expect to see data? When would you make your first optimizations?
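The structure in steps 1–3 can be sketched as a simple campaign outline. Everything here is illustrative—the goal, budget, keywords, and ad group names are assumptions, not a real account:

```python
# Illustrative outline of a new paid search campaign. Grouping keywords
# by intent keeps each ad group's ads and landing pages tightly matched.
campaign = {
    "goal": "lead generation",
    "daily_budget": 10_000 / 30,  # hypothetical $10k/month spread over 30 days
    "ad_groups": {
        "brand":           ["acme software", "acme pricing"],
        "high_intent":     ["buy project management tool", "project tool pricing"],
        "research_intent": ["best project management software", "acme alternatives"],
    },
}

for group, keywords in campaign["ad_groups"].items():
    print(f"{group}: {len(keywords)} keywords")

print(f"daily budget: ${campaign['daily_budget']:.2f}")
```

Walking an interviewer through a structure like this—goal first, then intent-based keyword groups, then budget pacing—mirrors the logical order the question is probing for.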
