Episode 143
The Long & Short of Measurement with Matt Hultgren
Measuring marketing's impact is hard. There's no silver bullet. And if someone tells you there is, they're probably selling you something that only tracks clicks.
This week, Elena, Angela, and Rob are joined by Chief Analytics Officer Matt Hultgren to tackle one of marketing's most persistent challenges: measurement. They explore why so many campaigns fail before they even launch, how to balance short-term performance with long-term brand building, and why the best marketers use multiple models to find the truth.
Topics Covered
• [02:00] Why human behavior makes measurement messy
• [04:00] The planning problem causing measurement failures
• [06:00] Choosing your North Star metric
• [08:00] Balancing immediate CAC with long-term brand growth
• [10:00] Using multiple models to triangulate the truth
• [13:00] Quantifying TV's halo effect across channels
• [15:00] Incrementality testing vs MMM vs synthetic controls
Resources:
2025 Marketing Architects Report
Today's Hosts
Elena Jasper
Chief Marketing Officer
Rob DeMars
Chief Product Architect
Angela Voss
Chief Executive Officer
Matt Hultgren
Chief Analytics Officer
Transcript
Matt: It's proven time and time again in research, in the field, you name it. So for some of the more old school approaches, there's a reason they were doing things that way back in the day. It was working. Brands have been built over time on TV, on other channels, and sometimes it's even the less sophisticated ones that prevail.
Elena: Hello and welcome to the Marketing Architects, a research-first podcast dedicated to answering your toughest marketing questions. I'm Elena Jasper on the marketing team here at Marketing Architects, and I'm joined by my co-host Angela Voss, the CEO of Marketing Architects, and Rob DeMars, the Chief Product Architect of Misfits and Machines.
Rob: Hello.
Elena: And we're joined by Matt Hultgren, the Chief Analytics Officer at Marketing Architects.
Rob: Thanks for having me back. Magic, Matt.
Angela: About a minute.
Elena: We're back with our thoughts on some recent marketing news, always trying to root our opinions in data, research, and what drives business results. And today we are gonna be on theme. We're talking about measurement. We'll explore what's broken in measurement today, how to balance short-term performance with long-term brand building, and why the best marketers use multiple models to find the truth. But I'll kick us off, as I always do, with some research, and we have some of our own to share today.
It's our report that dropped this week, and it's called Measuring the Long and the Short. It tackles one of marketing's toughest challenges: proving what's really working. It argues that TV, and marketing as a whole, isn't unmeasurable. It's just being measured the wrong way. It explains why so many campaigns fail before they even launch, from siloed teams to unclear goals, and lays out how to design measurement into a campaign from the start. It also explores how to capture both short-term response and long-term brand impact, because real growth comes from balancing the two. And let's actually start with you, Angela. Why is measurement such a persistent challenge in marketing?
Angela: How long do we have here? This is a big loaded question. Not that long. Okay. All right. I'll try to keep it as succinct as possible. Human behavior is very messy. It's contextual. It's influenced by far more than one touchpoint, and I think even with better data, better tools, more sophisticated models, people just don't make decisions in a super clean, linear way. People make choices based on their emotions, their habits, their memory, their friends, their identity, social cues, none of which is easily visible in data, and all of which kind of shifts over time.
And I think beyond that, marketing effects are, or at least the majority of them can be, delayed and cumulative. A single ad just really rarely causes a predictable purchase, but instead builds that familiarity and memory over weeks and months, which is something that most measurement systems struggle to capture. Right, Matt?
Matt: A hundred percent.
Elena: And thank goodness I'm not managing that part of our business. That's Matt, who is very smart, has a big brain, and is also, I'd say, probably more of a skeptic than the average measurement thought leader we've had on the show. We have a lot of questions for you, Matt, so we're gonna pepper you with them today. What is the biggest misconception brands have about attribution in general?
Matt: So I don't know who to point the blame at. I don't know if we go all the way back to Steve Jobs, Google, Meta. But what those three things have in common is the need for speed, whether it was the iPhone and just having everything in your hands at once. Google, you can track clicks and immediately get feedback in terms of how things are performing. Meta just piled onto Google. We call 'em the digital divas, but everyone wants results now.
And if you can't get it to me now, you're broken. And everyone just wants the one answer that's gonna tell them, how's my campaign performing? And it's that need for speed where people sometimes forget how offline channels work, how consumers interact with media. So the number one thing we come up against is just people wanting that silver bullet solution. And I think it comes from being obsessed with those digital metrics that are so easily tangible all over the place.
Elena: Google and Meta have sold the idea that you can have a result immediately and it's gonna be a hundred percent accurate, but that is not how marketing works, as you said, or how consumers work. One thing from the report that we've talked about a little bit before on the podcast is this planning problem. So I wanted you to explain what that is and why it causes so many measurement failures before campaigns even start.
Matt: I don't even think I knew this existed until I had seen it in the wild numerous times now where it's like teams can be so disjointed and they're like, "Hey, we're gonna do TV," but it's just like this one person within a business. They didn't talk to their other fellow teams to figure out how is everything gonna synergize together, and it's like, oh, I got 50 grand.
I'm just gonna throw it at TV and see if that works. Is 50 grand enough? Is digital set up in a way to actually help TV do better? And there aren't these conversations happening. What are the KPIs? What does your boss want? What does the CMO, CEO want? And it's like if you don't start from day one being very crystal clear, aligned on: here's my KPIs, here's what I'm trying to measure, here's how I'm going to measure.
You might go to the market at the wrong time during peak seasonality, and you're for sure not gonna measure anything. You might go to the market with the wrong budget where there's just no chance you're gonna be able to see the lift that you're hoping to see. And it's like there's just so many campaigns that just aren't thought out. It's like, "Hey, performance isn't good. We haven't tried TV. I'm just gonna throw some money at it and hope it works." Usually doesn't pan out very well.
Elena: What's that saying? If you fail to plan, you plan to fail?
Rob: Sounds right to me.
Matt: Yep.
Rob: I like it. I think I have that tattoo.
Elena: Oh, great. No one needs to see that. Okay. Part of planning is what you've called the North Star metric, and I like this a lot, 'cause I think it's nice on the podcast to try to have some practical takeaways. You've talked about the importance of aligning around that. First of all, what is it, and then how could a CMO or marketing team go about choosing it?
Matt: Yeah, I think a lot of times we're kind of obsessed with data now, right? Like I have 50 KPIs, I gotta track 'em all. And oh no, this one doesn't look good. What does that mean? And sometimes simplification can really help. It's probably gonna look different for Google and branded paid search than it does for TV. So sometimes it's good to compartmentalize. This is the upper funnel channel. What is the most important thing I'm hoping this channel does for my business? Obviously, Google's gonna be there to collect. People are searching for my brand, I'm getting purchases. You're gonna be hyper-focused on bringing in the ROAS you need.
But when it comes to TV, are you looking for awareness? Are you looking for traffic to your site? Are you looking to maybe pay Google less by having more people searching for you than generic terms and making sure you're again, aligned with your internal team on what's your intent of going to market with an offline channel? Is it building mental availability and awareness? Is it—I do need revenue and we don't have unlimited cash to fund this? So making sure you know what that North Star metric is going into a campaign so you don't get into the campaign and wonder, oh no, like some metrics look good, some metrics look bad. What does success look like? I think you just have to be really aligned on that before you even start.
Angela: Okay, Matt, I'm the marketer and my North Star metric is day-one CAC. So, what's your advice for me? I'm under pressure to drive immediate results. How can we think about balancing and proving short-term ROI with building that long-term brand growth?
Elena: So you're one of Matt's favorite people.
Matt: You must spend a little bit with Google and Meta there, Ang. Very—
Angela: Just trying to pay the bills.
Matt: Very—
Angela: do it. Very accountably.
Matt: There's definitely businesses that are in a position where they do need CAC, and we're not here to say that is wrong. At the end of the day, TV does a lot more than just that, and there's a chance that you turn off TV because you're not hitting the CACs you need. And yeah, you can't just put in more budget. And we always tell our clients, we're gonna treat your dollars like they're our own. At the same time, we do know the power of building a brand, whether it's pricing power, having more loyal customers, growing your market share. I don't think anyone's gonna complain about that. And I think, again, you have to figure out where you are in your business lifecycle and whether you can afford to invest in a brand channel. We do believe TV also drives immediate results. We see it every single day where there's spikes in traffic and people making purchases. Sometimes that ROI is net profitable day one. There's other instances where we have brands that are investing in that awareness, and the long-term payoff oftentimes surpasses what they see up front. So there's different situations that kind of play out live in the field.
Angela: Well, I think one of the key challenges and opportunities, especially of a channel like TV, is that it is full funnel and it does drive both immediate sales, to your point, so you can measure CAC, as well as building that long-term demand from a brand perspective. But one of the challenges I know we've seen with clients as they come in is just the silos that you mentioned earlier between maybe a brand team, performance team, and analytics team when it comes to measurement. What's your advice for breaking down some of those silos?
Matt: I will say, being a TV-only agency, again, we've seen this play out where the brand team doesn't talk to the performance team, which doesn't talk to the digital team. We try to come to the table and say, "Everyone, please come to the table. Can we have conversations about how to make this campaign work?" TV's a pretty big investment. Like you'd think you would have all the key stakeholders at the table. There's instances where we couldn't even convince them of that. But it really comes down to this: if you want TV to work as hard as it possibly can for you, remember that TV is driving people to search for you on the web.
I would sure hope you have your digital teams at that table to have that conversation about how are we gonna have these campaigns synergized between the two, and everyone should be there thinking through all those intangible elements you're talking about, Ang. It's not just a performance channel, it's not just a brand channel. It truly is full funnel.
Angela: Totally. One of our distinctive assets, I think, on this podcast is the saying: "All models are wrong, some are useful," and you may have been the first one to have said that on this podcast, maybe, I don't know, a year ago. But how do you think about—
Rob: I've got that tattoo as well.
Matt: Oh gosh. It's probably a triangle if I had to guess.
Angela: You may be listening to this in the morning, but I'm just gonna tell all the listeners right now, it's too late in the day for all these jokes from Rob about his tattoos. How do you think about that triangulation, Matt, using multiple models or methods to build confidence in results?
Matt: This is not a fun topic. I usually start off anytime I'm talking to a client by saying, "Measuring TV's hard. Like it's not easy." And if we can even go back to earlier in the pod, the reality is there isn't a silver bullet. You're gonna have to be comfortable with the ambiguity that is TV. Each model is gonna have its strengths, each model is gonna have its weaknesses, and it's probably best you know what those are in each circumstance so you don't go make bad marketing decisions. But it is kind of a massive puzzle. There's a lot of different models you can use, and each one is gonna help you fill in that puzzle in terms of: Are people responding to my ads? Is different creative or media performing well?
How is it affecting my overall top line? And again, there's certain models we use to help understand the insights of what media is working best. Those are not the models we use to help our clients understand how TV's performing overall. So I definitely would recommend making sure you have a really good feel and sense of what each model does for your business before even going into TV and trying to make decisions.
Angela: From your perspective, given there's so many options and the channel is full funnel, what short-term metrics do you believe really signal that effectiveness versus perhaps just being noise that's going to distract from the measurement story? And then what long-term metrics have you seen actually translate into that business growth that you mentioned earlier?
Matt: I think coming back to the planning process too, like you don't want just a quick spike and be like, "Oh, it's working," or "Oh, it's not." Like time is a really important part of this equation. You wanna make sure that you're starting to see trends and you're not getting like a false positive or a false negative. I think that's one of the first big pieces of it. In addition to that, for longer-term brand metrics, I hope you're not gonna go spend for four weeks on TV and expect your awareness to shoot up.
I'm sorry, I don't remember what I had for breakfast yesterday, but I do remember Geico commercials and maybe that's an extreme example of "15 minutes could save you 15% or more on your car insurance." And it takes time for ads to wear into people's minds. If you are looking for those bigger moves, it's a big bet and it takes time to plant those seeds and to let them grow and get the ultimate payoff.
Angela: We're heading into the holiday season. Just recently I saw an ad for a new jewelry company that I didn't know of, but I did take note of it. Didn't take any action on it, didn't go out to the website, but I happened to get a catalog about a week later in the mail. And we have seen the impact of a top of funnel channel like TV driving a halo impact to other investments across the marketing mix: digital, direct mail, et cetera. So how should marketers look to quantify that halo effect of brand investments across all their channels?
Matt: The invisible hand of brand building, that's what I like to call it. And we've seen it and tested it a lot of different ways. Setting up incrementality tests is a fundamental way to help you understand how TV and paid, whether it's paid social, paid digital, a lot of different things, are synergizing together. It can be really hard if you're like, "Hey again, I'm just gonna go with this flat budget nationally and hope I see all these different cross channel synergies." A lot of times it does take having tests in individual markets. We do have other instances where a campaign ran on TV for 12 months.
And what do you know? They were able to pull back on their Google budget for the first time in five years. They had been growing sequentially, 30% more to Google, then 30% more to Google again. And all of a sudden they were like, "Hey, people are searching for my brand name. They're not searching for our generic product," and they were able to actually reallocate that budget and continue to invest in their own brand. So it plays out in different ways for different brands, but a lot of it's making sure you are looking at it the right way and you're not just going in and saying, "Oh, it's gonna lift all boats, and it should be easy to see," 'cause usually if you say it's easy, it's probably not true.
Rob: All right, Matt, enough of these softball questions. Angela and Elena, they're just feeding you the easy ones. I got some toughies for you. You ready?
Matt: Rob? Did you come up with these yourself?
Rob: Oh, no way. No. No way possible. All right, so question one, you can only invest in one. You can either pick media mix modeling, incrementality testing, or synthetic controls. Which one are you going with, Matt?
Matt: They all serve different purposes. I would say right now I am gonna hop on the bandwagon of what feels like the industry and hop on incrementality testing. I think this can take lots of different shapes, sizes, and forms, but having a controlled experiment where you do the planning and set it up in a way that you know, "Hey, this happened because I spent on TV" is a good feeling. I always say, I don't like to live in the gray zone. If a test fails, I just wanna know it failed so I can move on, learn from it, and go to the next step. And that's the whole point of incrementality testing. You do the math ahead of time, you figure out what the right budget is, and either you hit your metric or you don't, and you move on. That's my favorite way to do it.
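A quick note for readers who want to see what "doing the math ahead of time" can look like: below is a minimal, hypothetical Python sketch of a pre-test power check for a geo incrementality test. The order counts, the noise figure, and the normal-approximation shortcut are illustrative assumptions, not the math described on the show.

```python
# A minimal, hypothetical sketch of "doing the math ahead of time" for a geo
# incrementality test. The numbers and the simple normal-approximation logic
# are illustrative assumptions only, not the methodology used on the show.
import math

baseline_weekly_orders = 400   # average weekly orders in the test markets (assumed)
weekly_std_dev = 60            # week-to-week noise in those orders (assumed)
test_weeks = 8                 # planned length of the TV flight
z_alpha, z_beta = 1.64, 0.84   # ~5% one-sided significance, ~80% power

# Smallest total lift in orders the test could reliably detect over the flight,
# treating the sum of weekly orders as approximately normal.
detectable_orders = (z_alpha + z_beta) * weekly_std_dev * math.sqrt(test_weeks)
detectable_pct = detectable_orders / (baseline_weekly_orders * test_weeks)

print(f"Need roughly {detectable_orders:.0f} incremental orders "
      f"({detectable_pct:.1%} lift) for a clear read.")

# If the planned budget can't plausibly drive that much lift, the test is
# underpowered before it launches, which is exactly the planning problem
# Matt describes.
```

In other words, the budget and the flight length largely decide up front whether a clean yes-or-no answer is even possible.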
Elena: Hey Matt, for my sake and the listener's sake, can you quickly summarize what MMM and synthetic controls are too?
Rob: Basically saying, can you define that for Rob, please?
Matt: I'll do my best, Rob. I'll do my best. I mean, MMM's been around for a really long time. I feel like it's made a little bit of a resurgence now with some of the open source models that are out there from both Facebook and Google. But MMM is basically: throw all your data at AI and let it look for changes over time. And usually this is taking two to three years of your historic data. So key point number one: if you don't have two to three years of good, consistent data, don't waste your time. But from there, as you introduce new channels, whether it's TV, whether it's social, whatever it is, it will look at, when you did make that investment, how did things shift around? As for synthetic controls, it's kind of a new buzzword, I'd say, up there with incrementality. It's usually just a fancy way to say, "How can we test for cheaper?" And what you're doing is you're hyper-focusing on a smaller subset of, usually, the country, and you look for alike audiences. So maybe you pick five, ten DMAs and you pair them up by likeness. You expose half of them, you don't expose the other half, and you look at how your exposed group is doing compared to the unexposed one. It can be a way to reduce budgets and try to test TV or CTV for cheaper.
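To make the matched-market idea concrete, here is a minimal sketch of the comparison Matt describes. The DMA names, order counts, and the simple difference-in-growth readout are hypothetical; a real synthetic control would weight and combine the holdout markets rather than just averaging them.

```python
# A hypothetical matched-market comparison: pair similar DMAs, expose half to
# the TV flight, hold out the other half, and compare growth. All names and
# numbers are made up for illustration.
exposed_pre  = {"Des Moines": 1000, "Tulsa": 890, "Boise": 930}   # orders before the flight
exposed_post = {"Des Moines": 1180, "Tulsa": 960, "Boise": 1010}  # orders during the flight

holdout_pre  = {"Omaha": 1010, "Wichita": 905, "Spokane": 925}    # matched twins, before
holdout_post = {"Omaha": 1020, "Wichita": 900, "Spokane": 940}    # matched twins, during

def avg_growth(post, pre):
    """Average post/pre growth rate across a group of markets."""
    return sum(post[m] / pre[m] for m in post) / len(post)

# Difference in growth rates: how much faster did exposed markets grow than
# their matched holdouts over the same window?
lift = avg_growth(exposed_post, exposed_pre) - avg_growth(holdout_post, holdout_pre)
print(f"Estimated incremental lift from the flight: {lift:.1%}")
```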
Rob: Matt, I'm gonna say you did a pretty good job answering that question. You pass. You get to move on to the next, the next juicy one.
Matt: Rob, I appreciate it.
Rob: What do you think is the hardest marketing channel to measure?
Matt: I feel like I'm biased, but I haven't measured out of home. I can only imagine how hard that is. But being that I work for a TV agency and I know how important sometimes multimillion dollar budgets are, not only is TV hard to measure, but there's just the critical pressure to make sure you get it right. I am gonna say TV. It's just there's so many eyeballs on it. Usually the CEO's right around the corner lurking, wanting to know, "Is this working or not?" So you combine being in a pressure cooker along with it being an offline channel. I usually get in the hot seat when month one rolls around. "How are things performing? Please answer that." It can be kind of fun sometimes, sometimes not so fun.
Rob: Absolutely. All right, let's hear from Matt's BS meter here for a minute. What's one metric or model you think marketers should just stop relying on period?
Matt: I will say, this is probably not gonna be a popular choice. But everyone is obsessed with new CAC, new CAC, new CAC. I think I'm going against every venture capital firm right now, so I might be voted off this podcast forever, but TV does so much. It definitely drives new customers, but it also invigorates existing ones too. And you have to look at the whole picture. So I think I'm less anti-new CAC and more so: don't rule out different segments of people, whether it's new, whether it's existing. If it's driving revenue, I think you should be happy. And people will hyperfocus on whether it's new or existing. I just don't like metrics that say, "Hey, let's ignore this half of the picture because my boss told me this was the only thing to look at." Because a lot of times we find the full value in TV across more things than just one.
Rob: I love that answer, more because I just hate the acronym.
Elena: Yeah, Rob's trying to get that acronym banned.
Rob: I'm sorry, CAC sounds like something you do when you have a hairball. So I'm just, I would love to eliminate intelligent people in boardrooms saying the word CAC. It just shouldn't happen. So I'm with you on that one. How about the other direction? What do you think is the most underrated?
Matt: There's just these massive correlations that exist that I think sometimes people fight against. It's as simple as, "Hey, when you spend on TV, people search for your brand. Hey, when you have a higher share of voice, your market share grows." While these maybe aren't as sexy as "Oh, my ROAS is 100," there's just correlations you can't deny. And it's just simple math. And it's proven time and time again in research, in the field, you name it. So for some of the more old school approaches, there's a reason they were doing things that way back in the day. It was working. Brands have been built over time on TV, on other channels, and sometimes it's even the less sophisticated ones that prevail.
Rob: Good answer. All right.
Matt: Are you gonna gimme a score? Like, was that a seven in your book, Rob?
Elena: And now Rob's gonna answer the same question.
Rob: Now I'm going to absolutely tell—
Angela: Let's do, let's turn it on, Rob.
Rob: Turn these on.
Angela: Rob, many marketers are exploring AI attribution tools. What do you think they should be excited about, Rob? What do you think they should be cautious about?
Rob: I mean, you know what? I just, I love it. I would say M&Ms and CAC. I mean, I would just, that'd be my answer.
Matt: I think you just made up a new acronym. Can you please define that for us?
Angela: You're talking about candy, M&Ms?
Rob: Exactly. M&Ms and CAC.
Matt: Do you want me to take—
Rob: I do love me, I do love me, I love me some AI. I do love me some AI and I hear there's AI attribution tools. Haven't played with them myself. Wouldn't know the names of them, but I know you know all of them. So are you excited about 'em? Are you like, "Eh," what? What's Matt's hot take?
Matt: It's almost three years since ChatGPT changed the way everyone thinks about everything. And I will say for us, data people out there, it's taken some time for the cooler gadgets to come out. At first it was, "Oh, I'm talking to what feels like a human." It's getting to a point now where we're starting to feel the speed more than we were on the front end of that three-year journey. So definitely excited by the way it's able to talk across data sets, across organizations. It's able to tap into more and more every single day. And I think the speed and automation—it's getting intense.
We're feeling it, we're loving it. At the same time, garbage in, garbage out. There's been a fair amount of warnings out there in terms of that too. And people are now trying to plant seeds to throw models off and there is some of that. If you bring that within your own business, you have to make sure that whatever you're feeding AI is of high quality and it's making the right decisions. More often than not, we'll get data sets and it's like, "Oh, I made a little mistake when I pulled that and sent that to you guys," and it's like, "Okay, if we had fed that to an AI model, it just would've assumed you're giving it the right information."
Like no, it would go off and make the wrong decision. So quality control is 100% a concern. You have to make sure you have really tight constraints around that and you're only feeding it good information. Super excited just to see where it can go in terms of modeling and performance. On the flip side, where does privacy go, and what can you give it? What can't you give it? So it's continuing to evolve. It's exciting. I don't think marketing's gonna look the same in five years, and two years after that, it'll look dramatically different again. I think this is gonna continue to evolve really, really rapidly.
Rob: The future's coming fast at us, Matt. Where do you see it going in the next few years?
Matt: We're already feeling it out there, like having designed frameworks for testing. Again, five, ten years ago, people would just come to TV and throw money at it and see what happened. People are now realizing that might have been a naive decision and are making sure they have design going into their tests. Like measurement doesn't just happen. You can't just run a test and then measure it after the fact.
You have to go in intentionally and there's a lot more third-party attribution companies popping up that help you think through design. We walk each one of our clients through: how can you design a test that you can measure? Because that's the most important thing. There's so many people that go test TV and they're like, "I don't know if it worked or didn't work. Can you help me do it better next time?" And it's like, "Wow, that was six figures, seven figures, eight figures," you name it, right out the door.
Rob: All right.
Elena: Well, Matt, thank you. Thank you so much for answering all of our questions. I feel like there's a lot of really practical takeaways there for any marketer. To wrap us up with something a little more fun, what is the most surprising thing data has ever taught you about yourself?
Matt: I shouldn't say this is surprising, but I've felt it this year more than ever. I don't know if it's because I have two little kids. I don't know if it's because I'm getting old. But there is a real link between sleep and your physical performance. I ventured off to do a half marathon this year, and I will say the days I only got a couple hours of sleep, the running the next day was not so hot. And now with watches and gadgets all over the place, there's a clear correlation between sleep and performance in terms of running.
Rob: I love sleep. It's so good. I sleep like it's my job.
Angela: I am glad you have a job, Rob. Yeah, mine, what's that saying? If you wanna know what people value, look at their bank account. It's that, only in my head it's about time: what are the most important things that you're working on? Like I love a good time tracking exercise to actually do a bit of a check and balance on whether you're really spending the right amount of time on the things that are so important in your life. Sometimes we all need a little check.
Matt: You tell me to put my phone down more often, Ang.
Angela: I need to put my phone down more often. I feel like—all right, Rob. When I saw this question, I was like, "Oh God, what are we gonna learn about Rob?" I—
Rob: Well, I just, so I had a blood panel done recently for food intolerance and two days ago I learned I'm extremely gluten intolerant, so that's a big, that was a big data point that I didn't know about myself and explains a lot about my visits to the restroom. I'm 51 years old.
Angela: I thought we were gonna get through it without that, but—
Rob: I'm 51. This is quite an age to learn such a factoid.
Angela: Elena, please—
Matt: Follow that up.
Elena: Mine's not gonna be anything like that. We had a company onsite, and we had a speaker there who asked everyone to go look at the screen time on their phones. I think that data was very interesting for me, like a wake-up call about just the amount of time you spend on your phone. We use our phones for work and different parts of life, so it can be hard, but definitely after I saw that, I'm setting time limits on some of these apps before I start just doom scrolling. So I'd recommend that. Go look at your screen time, and look at averages too, because that was scary. He had people sit down in the room as he got to better, like less, screen time.
Angela: Yeah, he started with 10 hours or something and then worked his way backwards, and I think there were, what, two out of all of us that were maybe at less than an hour? Is that what we ended up with? Yeah. That was pretty amazing.
Rob: It has to be the first product feature that shames you for using the product. Let's shame people for using our product.
Elena: You know how addictive it is. They weren't worried about it.
Rob: That's true. That's true.
Elena: Alright, great. Matt, thank you so much for joining us.
Angela: Thanks, Matt.
Matt: Thanks for the invite. Appreciate it, guys.
Rob: Thank you. Thank you.