What Netflix's Culture Memo Couldn't Scale
A retired Netflix engineering director reveals how the company's legendary culture worked brilliantly—until it didn't. The scaling problem nobody talks about.
Written by AI · Marcus Chen-Ramirez
February 17, 2026

Photo: Ryan Peterman / YouTube
David Ronca spent years as an engineering director at Netflix, arriving in 2007 when the company was still small enough that Patty McCord, the chief people officer, personally interviewed every hire. He left in the late 2010s after the company had grown into something very different. Now retired, he's talking about what that transformation looked like from the inside—and what the legendary Netflix culture memo couldn't survive.
The story he tells in a recent interview with Ryan Peterman isn't about failure, exactly. It's about the specific ways that principles designed for 100 people begin to malfunction at 1,000. And it raises a question that most scaling narratives skip over: What if some of the things that made you successful are structurally incompatible with getting bigger?
The Part That Actually Worked
Before Netflix, Ronca had worked at companies where the most critical engineer's cubicle walls were literally bulging with hoarded junk, where prima donnas screamed at colleagues and got away with it because management believed the company would collapse without them. (Spoiler: when that guy quit, the company didn't collapse. Turns out nobody could work with him anyway.)
Netflix was different. The famous "no brilliant jerks" policy wasn't just a slogan. "If you're a person who's always going off the rails, who's yelling and screaming, who's difficult to work with, who makes yourself unapproachable, they're going to let you go," Ronca explains. "And they don't care how smart you are."
More surprising was what McCord told him during his interview: "We don't value 24/7 work here and if you just come in and work all the time we're not going to be impressed." The challenge was to blow them away with what you could do in eight hours. This wasn't about work-life balance as a perk—it was about forcing leaders to build sustainable systems instead of heroic individuals.
When one of Ronca's engineers was working every weekend to keep systems running, Ronca pulled him aside: "If this company cannot survive without you here on the clock, we got a problem. You need to take a vacation. And I don't care what happens. If things break, they break."
The engineer took a week off. The systems didn't collapse. In fact, freed from the treadmill of just keeping things running, the team quickly stabilized the architecture. The forced vacation wasn't kindness—it was a diagnostic tool that revealed bad systems design.
The Compensation Paradox
Early Netflix had no engineering levels. Everyone was a "senior software engineer," period. You could call yourself "Software Engineering Video Guru" on your business card if you wanted, but there was only one actual level.
Compensation worked through "personal top of market." If you interviewed at Google and got offered $200K, you'd tell your manager. They'd either match it or you'd leave. The culture memo encouraged people to interview regularly—using the external market as a continuous calibration mechanism.
"My starting salary when I joined Netflix, which at the time was a good starting salary, was 175K a year," Ronca says. A year later, if Google offered him $200K, Netflix would adjust. But if another engineer who came in at $175K never interviewed elsewhere, their compensation might not move.
This worked beautifully when Ronca could sit in a room with three team members and an HR partner to discuss a candidate. It broke down when the company scaled. An engineer who'd gotten a 2x market adjustment three years ago might now be making far more than peers at the same contribution level. But without levels or structured performance reviews, new leaders looking at spreadsheets couldn't understand why.
"We had some engineers that we interviewed and offered jobs at X and they all got competing offers at Facebook at 1.2x," Ronca recalls. "The culture memo says we adjust our offer to 1.2. But nobody could rationalize that 1.2x" without a leveling system to provide context.
Netflix eventually added levels after Ronca left. They had to. The "personal top of market" approach requires institutional memory and context that doesn't survive scaling.
The Recognition Problem
Early Netflix was obsessively team-focused. When they shipped Netflix on PS3—doing something so unexpected that Sony's DVD division asked them to stop—nobody said "Scott Woo and David and Mitch were the foundational engineers that made this possible." They said "Netflix won."
Ronca now thinks this was a mistake: "As the company got bigger and instead of doing one or two things, you're starting to do many things, individual contributions which are still huge—I mean, engineering, that's the way engineering works—were starting to get lost. And if you don't credit individuals for great engineering work, who gets the credit? Leaders."
Without a system to recognize and reward individual technical achievement, the best engineers started leaving. The culture that said "everyone is equal" created a vacuum that management filled by default.
"Not having a good objective process for recognizing wins, achievements and rewarding those wins and achievements ultimately impacts your best people who end up leaving," Ronca says. He was arguing for levels by the late 2010s, but the cultural antibodies were strong.
The Aspiration Gap
Here's the thing about the Netflix culture memo that Ronca says everyone else missed: "People would ask others, is this how it is at Netflix? And they would say, yeah. They would ask me, I say, no, it's what we aspire to be. The culture memo is aspirational."
This is the gap that matters. The memo described a specific set of practices that worked at a specific scale. As the company grew, the practices stopped working, but the aspiration remained. The result was cognitive dissonance—people defending a culture that no longer matched the operational reality.
McCord eventually stopped interviewing every hire because she couldn't. Reed Hastings stopped having intimate one-on-ones with small groups of new employees. The "personal top of market" compensation philosophy became impossible to execute fairly. The flat structure that eliminated politics also eliminated the ability to recognize exceptional work.
None of these changes mean Netflix failed. They mean Netflix succeeded enough that the original design constraints no longer applied. But the company struggled to acknowledge that, which Ronca identifies as the real problem. "When I left," he says, "I felt the culture was struggling with a culture memo that was perfect for a very small aggressive engineering focused company."
The question isn't whether Netflix should have scaled differently. The question is whether any culture designed for 100 exceptional people can survive contact with 10,000 pretty-good ones. And whether the companies currently worshipping at the altar of "Netflix culture" understand that what they're actually copying is a specific moment that Netflix itself couldn't preserve.
—Marcus Chen-Ramirez
Watch the Original Video
Retired Netflix Engineering Director On Regrets, Video Engineering, Hiring Stories
Ryan Peterman
1h 19m
About This Source
Ryan Peterman
Ryan Peterman is making waves in the software engineering sphere on YouTube with his channel centered on career stories and advice. With 40,700 subscribers, Peterman draws from his experience as a former Staff Engineer at Instagram to offer insightful content aimed at accelerating viewers' careers in software engineering. Since launching his channel in December 2025, Peterman has carved out a niche in career advancement and workplace culture within the tech industry.