Perspective · Part 03 of 07

The Only Thing That Works: Hitting the Bottom Line

If you want companies like Meta or OpenAI to change their behavior around kids, the conversation has to reach revenue. Everything else is negotiable.

I think this is the part people don’t always want to say out loud, or maybe don’t fully connect the dots on. And realistically, the only people likely to disagree are the ones running big tech.

If you actually want companies like OpenAI or Meta to change their behavior in a meaningful way, it has to affect how they make money.

Not their reputation. Not a temporary news cycle. Not even internal discomfort.

Their revenue.

Because once that’s on the line, the conversation inside those companies changes really quickly.

Up until that point, a lot of decisions live in this gray area. Teams talk about safety, ethics, long-term impact, and those things do matter to people individually. But they’re competing with very clear, very measurable goals like growth, engagement, and retention.

And when those two things conflict, growth tends to win. Not because anyone is trying to make a bad decision, but because the system is designed to reward the outcome that looks best on paper.

So if we’re talking about regulating AI or social media, especially when it comes to kids, I don’t think the question is “what rules should exist.”

I think the better question is, “what happens when those rules are ignored?”

Because that’s what determines whether they matter.

If there’s no real consequence, then the rule just becomes something companies can work around, delay, or interpret loosely. And we’ve already seen that play out with things like age gates, where technically there are restrictions, but in practice almost anyone can get through them.

So what actually changes behavior?

The simplest answer is anything that makes non-compliance more expensive than compliance.

That can show up in a few different ways, and I think it’s worth being specific here because this is where things start to move from theory into something that could actually work.

One is fines, but not the kind that get treated as a cost of doing business. Flat fines don’t really do much when you’re dealing with companies operating at massive scale. They need to scale with revenue or impact in a way that actually hurts, the way GDPR penalties can reach a percentage of a company’s global annual turnover rather than a fixed amount.

Another is restricting access to revenue streams directly. Advertising is the obvious one. If a platform is found to be non-compliant, especially around something like underage users, then limiting or suspending their ability to run ads in that region would get attention very quickly. That’s not a minor inconvenience. That’s a core part of the business.

Market access is another big lever. Not in the sense of banning users, because I don’t think that’s the right approach, but in the sense of restricting the company’s ability to operate fully until it meets certain standards. That could mean limiting features, slowing down expansion, or requiring additional approvals before new functionality is rolled out.

And then there’s ongoing compliance, not just one-time checks. A system where companies are monitored, flagged, and given something like a strike system starts to change how seriously these things are taken internally. If you know that repeated failures will escalate into something that affects revenue or operations, you’re going to invest in preventing those failures.

All of that sounds heavy, but when you compare it to how things work right now, it’s actually just shifting responsibility to where it already belongs.

Right now, a lot of the burden sits on parents and kids. Parents are expected to understand platforms that are constantly changing, and kids are expected to navigate systems that are designed to keep them engaged for as long as possible.

At the same time, companies can point to policies and say the rules exist, even if they’re not meaningfully enforced.

That gap is the problem.

And I think this is where the earlier point about free accounts becomes really important, because it ties directly into how these companies grow.

Free access sounds great on the surface, but it’s also the easiest way for underage users to get in. There’s no friction, no real verification, and no accountability tied to identity. At the same time, those users are still incredibly valuable from a data and engagement perspective.

So if a company isn’t properly enforcing age restrictions but continues to benefit from those users through its free tier, there’s no real incentive to fix the problem.

That’s where regulation can get more targeted.

If access to a free tier is restricted until proper compliance is in place, or if companies are required to implement stronger verification for certain types of accounts, it starts to change the equation. Suddenly, it’s not just a policy issue; it’s a growth issue.

And once it becomes a growth issue, it gets prioritized.

I want to be clear that I don’t think the goal here is to punish users. Especially not kids. Blocking them entirely or removing access without addressing the underlying system doesn’t really solve anything.

The goal is to make sure that if a company benefits from having users on its platform, it’s also responsible for enforcing the rules around who should be there and how those users are treated.

That’s a very different approach than what we have now, which is closer to “we’ve set the rules, but enforcement is kind of optional.”

And when you zoom out, this isn’t really a new idea. It’s the same pattern we talked about with GDPR and anti-spam laws.

Companies didn’t change because they were asked to. They changed because not changing became more expensive.

I think that’s the shift that needs to happen here.

Not more conversations about whether companies should do better, but a very clear understanding that if they don’t, there are consequences that actually matter to them.

Because that’s when behavior changes.

Not slowly, not eventually, but immediately.
