Gabe Travers is a product innovator with over a decade of experience leading digital transformation in media and consumer technology. Formerly Head of Product at The Points Guy, he has also held senior product leadership roles at Hearst Television, where he spearheaded personalization systems, platform modernization, and streaming services. With a career that began in broadcast journalism, Gabe brings a unique lens to product management — blending storytelling, technical acumen, and a deep focus on user needs.
In this conversation, Gabe shares his philosophy of product leadership, including how to balance tradeoffs, measure success when metrics fall short, and use AI as an amplifier rather than a replacement. He also reflects on the role of technical fluency in product management, his approach to mentoring the next generation of leaders, and why experimentation and continuous learning are vital for long-term growth.
The product development tripod
What’s your philosophy about product management and how does that inform the principles you bring to your day-to-day work?
There’s this analogy that I’ve used for a long time, and I think it’s because I come from a broadcast journalism background. I think of a software or product development organization as a bit like a camera tripod, where engineering, product, and project are the three legs.
If one of those legs is off balance, the business can’t see the picture clearly. On top of that tripod are the fine-tuning pieces — business goals, user needs, technical constraints — like the focus or color balance on a camera. Everything has to be working together so the vision is clear.
Ideally, of course, it all starts with the user. If you can solve a real user need and drive business impact, that’s the dream. But in reality, there are always tradeoffs: resourcing, tech debt, competing priorities. That’s where clarity, communication, and trust across teams really come in.
Do those tradeoffs make it more difficult to define progress?
I think it’s all a balance. There’s never a perfect scenario where you’ll do something with no tech debt, or where you’ll always have the long-term solution. Things change. The key is understanding the tradeoffs you’re making and having a record of them. Those decisions will come back into play later and will determine whether a feature holds up.
Are there times when it’s difficult to show a tangible metric? When success might look more qualitative or intangible?
Absolutely. Metrics are great, but outcomes aren’t always tangible. Two good examples are tech debt and security incidents. No one gets applauded for preventing a security incident, but everyone feels the pain if you don’t. You need balance: some things that are metrics-driven and others that are less measurable.
Organizational learning is another example. You might run a POC that never ships, but it could shape future approaches in really important ways.
AI as amplifier, not replacement
How do you think AI is changing your philosophy or approach to product?
On the consumer side, automation dilemmas were around long before LLMs. Think of yelling “representative” into the phone 10 years ago. Now the automation is harder for users to detect, which brings new challenges.
For product management, I think of AI as an amplifier, not a replacement. I’ve used LLMs to automate workflows, explore data faster, and prototype features quickly. But none of that replaces human judgment. You still need to know the team context, business outcomes, and user needs.
It’s crucial for PMs to be fluent in AI: not just the tools, but also the principles. You need to understand prompting, where hallucinations or bias might creep in, and when a solution is overengineered. Each tool has strengths, whether it’s ChatGPT for spreadsheet analysis or Cursor for code exploration.
Technical acumen matters here — you may not need to write a Lambda yourself, but you should know what it does, its constraints, and whether it’s the right solution.
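To make that fluency concrete, here is a minimal sketch of what an AWS Lambda handler looks like and the constraints worth knowing about. The event shape, request fields, and use case are hypothetical, not a specific system Gabe describes.

```typescript
// Minimal AWS Lambda handler sketch (Node.js runtime). The ResizeRequest
// shape and the resizing use case are illustrative only.

interface ResizeRequest {
  imageUrl: string;
  width: number;
}

export const handler = async (event: { body: string }) => {
  // Lambdas are stateless: anything not persisted elsewhere is gone after this invocation.
  const request: ResizeRequest = JSON.parse(event.body);

  // Execution time and memory are capped (15-minute maximum, configurable memory),
  // so long-running or memory-heavy work belongs somewhere else.
  if (request.width > 4096) {
    return { statusCode: 400, body: "Requested width exceeds the supported size" };
  }

  // Cold starts add latency after idle periods, which matters if this function
  // sits on a user-facing request path.
  return { statusCode: 200, body: JSON.stringify({ resizedUrl: request.imageUrl }) };
};
```

Knowing roughly this much, statelessness, time and memory limits, cold-start latency, is often enough for a PM to judge whether a Lambda is the right brick for the job.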
Building brick by brick
Do you feel like coming from a technical product management background changes the way you approach things?
Yes, it makes me a better partner to engineering, project managers, and the business side. It helps me assess what’s truly complex versus what just sounds complex. And it means we spend less time on basics and more time bringing creative ideas to the table.
I like to think of it as Lego bricks. You don’t need to know how to make the brick, but you need to know what it connects to and how it fits into the larger build. APIs, cloud functions, and services like AWS or GCP are those bricks. Understanding them helps you weigh tradeoffs and see where issues might arise years down the line.
What’s the difference between troubleshooting from a developer perspective versus a product one?
In a crisis, engineers often start from the logs, errors, and performance bottlenecks. PMs may start at the opposite end: the user’s challenges. The best troubleshooting comes from combining those angles. Engineers might surface bugs through telemetry, while PMs spot issues through analytics. Together, you can connect the dots and grow the product more effectively.
With regard to user behavior, what do you look for first when diagnosing problems?
You start with the big picture: conversion, stability, total users. Then you drill down — was it new users, traffic from social, a holiday week, or a third-party API change? Correlations matter. AI can help speed this up, whether it’s writing complex queries or analyzing data, but you still need to validate the insights yourself.
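As a rough illustration of that drill-down step, here is a hedged sketch of segmenting a conversion dip by traffic source and user type. The Session shape and segment names are hypothetical; in practice the data would come from an analytics tool or warehouse query.

```typescript
// Hypothetical sketch: segment conversion rate by traffic source and
// new-vs-returning users to see whether a drop is broad or isolated.

interface Session {
  source: "organic" | "social" | "paid" | "direct";
  isNewUser: boolean;
  converted: boolean;
}

function conversionBySegment(sessions: Session[]): Map<string, number> {
  const totals = new Map<string, { sessions: number; conversions: number }>();

  for (const s of sessions) {
    const key = `${s.source}/${s.isNewUser ? "new" : "returning"}`;
    const bucket = totals.get(key) ?? { sessions: 0, conversions: 0 };
    bucket.sessions += 1;
    if (s.converted) bucket.conversions += 1;
    totals.set(key, bucket);
  }

  // Comparing these rates week over week shows whether the dip is everywhere
  // or concentrated in, say, new users arriving from social.
  const rates = new Map<string, number>();
  for (const [key, { sessions: count, conversions }] of totals) {
    rates.set(key, conversions / count);
  }
  return rates;
}
```

An LLM can draft this kind of query or script quickly, but as Gabe notes, the correlations it surfaces still need to be validated by a person who knows the product.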
Once you’ve identified an issue, is it better to overreact or underreact?
You have to think about actual impact. Sometimes a loud group of users reports a bug, but it isn’t affecting most people. Other times, something is quietly crashing in the background and impacting everyone. You have to balance fixing immediate issues with building long-term solutions. Personally, I lean toward fixing user-impacting issues quickly, even if it incurs some tech debt, and then addressing the longer-term fix later.
Making the leap from execution to strategy
What misconceptions about product management do you try to address when mentoring aspiring leaders?
Good product management isn’t about cranking out lots of decks or tickets; it’s about depth. You need to understand a feature from A to Z and be able to distill the why into a sentence. Even something as simple as a text field requires deep thinking — what inputs are allowed, how do you handle errors, how do you keep users on the happy path? That level of detail saves time, builds respect with engineers, and speeds up releases.
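To show how much thinking hides inside “just a text field,” here is a minimal validation sketch. The field name, length limit, and allowed characters are illustrative assumptions, not a prescribed standard.

```typescript
// Hypothetical sketch of the decisions behind a simple text field:
// which inputs are allowed, how errors are surfaced, and how users
// are kept on the happy path.

interface ValidationResult {
  ok: boolean;
  value?: string;
  error?: string;
}

function validateDisplayName(raw: string): ValidationResult {
  // Normalize first so stray whitespace doesn't knock users off the happy path.
  const trimmed = raw.trim();

  if (trimmed.length === 0) {
    return { ok: false, error: "Please enter a display name." };
  }
  if (trimmed.length > 50) {
    return { ok: false, error: "Display names are limited to 50 characters." };
  }
  // Decide explicitly which characters are allowed, rather than discovering
  // the answer later in a bug report.
  if (!/^[\p{L}\p{N} '._-]+$/u.test(trimmed)) {
    return { ok: false, error: "Only letters, numbers, spaces, and ' . _ - are allowed." };
  }

  return { ok: true, value: trimmed };
}
```

Specifying this level of detail up front is exactly what saves engineering back-and-forth and keeps releases moving.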
Is it difficult to help junior product managers shift from execution to strategy?
It can be. Early on, we give people small end-to-end features so they learn the process. But at some point, you have to make the leap: instead of, “Here’s a feature, go build it,” you start asking, “Here’s a user problem, how would you solve it?” Recognizing when someone is ready for that leap is a key part of leadership.
Is there a piece of guidance you give people that they’re reluctant to accept at first?
I tell people they have to make space to experiment, even when things are going well. Top performers can get stuck in their own processes because they’ve been rewarded for them. But growth requires challenging your systems and trying new approaches, even if it’s uncomfortable. Otherwise, you plateau. Just like in fitness, you need to keep changing your routine to keep growing.
What does LogRocket do?
LogRocket’s Galileo AI watches user sessions for you and surfaces the technical and usability issues holding back your web and mobile apps. Understand where your users are struggling by trying it for free at LogRocket.com.


