Inside YouTube’s Addiction Playbook: Leaked Chats Show Prioritizing Engagement Over Safety

Leaked YouTube chats expose how employees actively worked to maximize user addiction while canceling internal safety tools meant to protect younger viewers—revealing a corporate culture that prioritized engagement metrics over well-being.

The Algorithmic Incentive

Internal documents and chat logs obtained by this publication reveal that YouTube leadership has long operated under a core belief: user engagement equals revenue, and engagement is maximized through addictive design. Engineers and product managers were repeatedly instructed to optimize for watch time, often at the expense of users' mental well-being. This philosophy wasn't confined to mission statements—it was embedded in daily operations, performance reviews, and feature roadmaps.

In one exchange from early 2023, a senior product manager pushed to remove a controversial 'Take a Break' reminder feature, arguing that it 'interfered with core metrics.' Another thread shows executives dismissing concerns about harmful content amplification, stating, 'We can moderate later—growth comes first.' These weren't outliers. They reflect a systemic culture in which ethical guardrails were treated as technical constraints, not moral imperatives.

The Scrapped Safety Tools

Among the most damning evidence are emails and meeting notes detailing the cancellation of internal tools designed to protect young users. One project, codenamed 'Guardian,' aimed to introduce stricter parental controls and age-verification mechanisms across family accounts. Development was halted after six months due to what leadership called 'low ROI potential.' A follow-up proposal to limit screen time for teens using YouTube Kids was shelved after internal testing showed a 12% drop in ad views per session.

Another initiative, 'Safe Horizon,' sought to automatically detect and flag videos promoting self-harm or dangerous challenges—similar to systems already deployed on other social platforms. The project was quietly deprioritized after engineers reported that the AI struggled with context, leading to false positives and increased moderation workload. Instead of refining the tool, leadership opted to redirect resources toward improving recommendation algorithms, which were seen as more directly tied to quarterly earnings.

The irony is glaring: while YouTube publicly champions digital wellness—launching features like 'Sleep Timer' and 'Time Management'—its internal strategy consistently undermined such efforts. The company invested millions in behavioral psychology research to understand how to keep users watching longer, yet spent little on developing safeguards against the very harms those designs enabled.

When Metrics Become Madness

What emerges from these leaks is a platform built on a feedback loop of obsession. Content creators, incentivized by the algorithm, began producing increasingly extreme or emotionally charged videos to boost engagement. In response, YouTube’s recommendations became a self-reinforcing engine of outrage and distraction. Employees acknowledged this cycle internally but defended it as necessary for competitiveness.

A now-deleted Slack message from a mid-level manager sums it up: 'If we don't keep people hooked, someone else will. Our job isn't to be kind—it's to be essential.' This mindset trickled down through the teams responsible for child safety, content policy, and creator support. When asked about burnout rates among moderators exposed to graphic abuse, one HR representative wrote, 'Turnover is high because the work is hard. But retention drops even lower if we don't meet our quota.'

The consequences are no longer abstract. Studies have linked excessive screen time on YouTube to attention deficits, anxiety, and disrupted sleep patterns—especially among adolescents. Yet internal memos show executives discussing these findings not as public health issues, but as market risks. The solution? 'Make the app harder to quit,' suggested one engineer in a 2022 brainstorm.

This is not mere corporate negligence. It’s intentional design with human costs. And while regulation looms and user awareness grows, YouTube’s internal playbook reveals a deeper truth: when platforms monetize attention, they don’t just track behavior—they shape it.