Susan Wojcicki is pushing to make YouTube responsible. PewDiePie is not pleased
YouTube spent 2019 answering critics with some of the most drastic changes in its 15-year history. With each step, it gave those critics, from activists to regulators and lawmakers, more reasons to attack its freewheeling, user-generated business model.
Susan Wojcicki, YouTube’s chief executive, announced her goals in April. “My top priority,” she wrote, “is responsibility.”
Her company spent the year trying to walk an almost impossible tightrope: nurture a growing community of demanding creators while pledging to police troubling videos and protect millions of underage users who officially shouldn’t even be watching. The efforts pleased almost no one and highlighted an existential quandary: every time YouTube tries to fix something, the company, an arm of Alphabet Inc.’s Google, risks losing the neutrality it needs to thrive.
“They know that every time they are successful catching problematic content or removing it, this just raises expectations,” said Mike Godwin, a senior fellow at think tank R Street Institute and a trustee of the Internet Society. “It’s a never-ending cycle of increasing demands for these dominant platforms to operate fairly.”
As 2020 begins, the largest online video service is being dragged deeper into political fights over privacy, copyright and content moderation. In response, YouTube is trying to preserve the sanctity of its status as an online platform with little liability for what happens on its site. Instead, that burden is increasingly falling on the shoulders of regulators, video creators and other partners.
YouTube will launch a separate website for children after it was criticized and investigated for showing inappropriate videos to kids on its main site.
Nowhere is that more evident than in YouTube’s approach to kids. A landmark privacy settlement this year with the Federal Trade Commission is forcing YouTube to split its massive site in two. Starting in January, every clip must be designated as “made for kids” or not. The overhaul puts billions of ad dollars at stake and has sparked panic among creators, who also now face new legal risk. The company isn’t offering creators legal advice or ways to salvage their businesses. It isn’t even defining what counts as a “made for kids” video on YouTube, and it has argued to the government that it shouldn’t have to.
“Creators will make those decisions themselves,” Wojcicki said recently. “Creators know their content best.”
YouTube privately considered taking more control. Earlier this year, it assembled a team of more than 40 employees to brace for the FTC decision. The team was code-named Crosswalk — as in a way to guide kids across YouTube’s chaotic streets.
Among its proposals was a radical one, at least by the standards of Silicon Valley: YouTube would screen every video aimed at kids younger than 8 in its YouTube Kids app, ensuring that no untoward content crept into the feed of millions of tots around the world. A news release was even drafted in which Wojcicki said professional moderators would check each clip, according to people familiar with the plans. Yet at the last minute, the CEO and her top deputies ditched the plan, said the people, who asked not to be identified discussing private deliberations.
The rationale was clear to some at YouTube, one person involved in the project recalled. Hand-picking videos, even for kids, made YouTube look too much like a media company, not a neutral platform. A YouTube spokeswoman denied the idea was turned down because it put the company in charge of programming, but she declined to comment further on the decision. In a recent interview, Wojcicki made it clear that her content-moderation push only goes so far, telling CBS News that even being liable for video recommendations would destroy the essence of the service.
“If we were held liable for every single piece of content that we recommended, we would have to review it,” she said. “That would mean there would be a much smaller set of information that people would be finding. Much, much smaller.”
YouTube’s balancing act between media publisher and hands-off internet bulletin board has sparked intense debate internally. For some business partners and employees, this year’s decisions leave them with the impression that the company is unable to take a serious stand.
“What is the mission of this company? People don’t even know,” said Claire Stapleton, a former YouTube marketing manager who left this year after clashing with Google over employee protests. “YouTube is so ill-equipped to manage these massive challenges.”
The YouTube spokeswoman said the company has made significant investments to better protect its online community. Over the last 18 months, she said, that effort has cut views of videos that violate YouTube’s policies by 80% and lifted viewership of videos from “authoritative news publishers” by 60%.
“While there will always be healthy debate around this work, we’ll continue to make the hard decisions needed to better protect the openness of the YouTube platform and the community that depends on it,” she added in a statement.
No episode in 2019 typified YouTube’s arduous search for middle ground more than the Maza affair. In June, gay journalist and YouTube creator Carlos Maza accused Steven Crowder, a conservative YouTuber, of repeated harassment. The Vox reporter put together a montage of clips from Crowder’s YouTube channel to highlight what Maza said were homophobic and racist insults.
After saying it would review Maza’s complaints, YouTube concluded the comments were not in violation of its policies, angering some of its own employees. YouTube staff held a private call to explain its rationale to Maza, who remained unconvinced. “It was very awkward,” he recalled.
Crowder, meanwhile, devoted a 21-minute video to rehashing his comments. After days of criticism, YouTube removed ads from his videos, angering him.
At a conference about a week later, Wojcicki apologized to the LGBTQ community but defended YouTube’s decision to keep Crowder’s videos on the site. Removing his clips, or banning him from YouTube, would have put the company in an untenable situation, with millions of viewers asking, “What about this one?” for hundreds of comedy, hip-hop and late-night TV show videos, the CEO said.
Two months later, a group of LGBTQ YouTube creators filed a class-action lawsuit accusing the company of discrimination. The case mirrored a complaint from across the ideological aisle: PragerU, a conservative video channel, has accused YouTube of censorship in a suit of its own. In fact, the lawsuits were brought by the same attorney.
“It just looks like YouTube is taking the maximum amount of time for a solution that pleases no one,” said Stapleton, the former employee.
YouTube spent the months after the Maza episode rewriting its harassment policy. The update, announced earlier this month, set new rules that would now treat Crowder’s videos as violations subject to removal. Like clockwork, the decision riled other creators. Felix Kjellberg, YouTube’s biggest star, who posts as PewDiePie, declared he was leaving the video site and blamed the new policy.
“We have this anarchy system, OK,” he said. “If YouTube knows what’s good for them, they’ll keep their [expletive] hands out.... Don’t come and ruin it for us.”
While criticism comes from all sides, YouTube’s challenge is practically insurmountable: More than 500 hours of video are uploaded every minute. And the company’s software is still unable to gain a thorough understanding of the content before people start watching.
“You are trying to keep free speech going and, at the same time, you’re trying to make sure crud doesn’t get in, and trying to make sure that people who watch aren’t getting affected. It’s a really, really, really hard problem,” said Diya Jolly, a former YouTube executive who left in 2017. “Susan is doing an awesome job.”
Wojcicki’s task is set to become even more difficult. The European Parliament has approved rules that make YouTube liable the moment anyone uploads a video that violates a copyright. That could force YouTube to take down content from popular creators, while hiking its legal bills and hurting ad sales. Wojcicki used Google’s political muscle and invited creators to lobby against the regulation, but she failed to stop it. According to one former senior employee, the fight often claimed as much of the executive team’s attention in 2019 as the more-public battles over children’s privacy and inappropriate content.
Even in the U.S., the walls are closing in around YouTube. Republican and Democratic lawmakers have proposed peeling back protections that have shielded internet companies from liability for decades. YouTube’s dominance may draw antitrust scrutiny. Lawmakers are also considering tougher copyright laws, egged on by YouTube’s rivals in media and music. “That’s where there is a lot of money at stake, and people have valid objections,” said Jeff Kosseff, an assistant professor at the U.S. Naval Academy and an expert on internet law.
For now, though, YouTube’s biggest challenge is kids’ privacy. In September, the FTC fined Google for illegally tracking children for its ad business, forcing significant changes to YouTube’s operations. On Nov. 13, YouTube sent an email to tens of thousands of creators about the coming “made for kids” designation. If marked as “made for kids,” videos will lose lucrative personalized ads and other valuable features, including user comments. If clips aren’t labeled this way, and the government decides the video is indeed reaching children, creators can be fined thousands of dollars.
“We know this won’t be easy for some creators, and that this required change is going to take some getting used to,” the company wrote in the email. YouTube has also advised many of them to “lawyer up,” according to partners. A recent regulatory filing went further, with Google estimating the changes will mean YouTube creators “who make mostly child-directed content will likely lose a majority of their revenue.”
In contrast, YouTube itself emerged relatively unscathed. Google paid a $170-million fine, a tiny sliver of its profit. The FTC settlement on the Children’s Online Privacy Protection Act, or COPPA, focused on YouTube, not other parts of Google. The internet giant worked hard to limit any broader effects on the rest of its businesses, according to one former executive. Best of all for YouTube, it doesn’t need to screen clips before they go up, nor is it liable for any infringing videos.
The FTC is now rewriting its COPPA rules and has invited public comment. In a filing, Google told the agency it was worried about any laws forcing it to “identify and police” videos aimed at kids. The company was, in effect, arguing it couldn’t know for sure the age of its audience and shouldn’t be punished for that.
Critics were appalled. Lindsey Barrett, a staff attorney at Georgetown Law’s Communications & Technology Clinic who worked with complainants in the FTC case, found it hard to imagine the contortions required for Google to make this argument. “Our entire business is based on being able to slice and dice our audience, and see who’s watching what,” she said, parodying the company’s position. “But we couldn’t possibly tell you if there’s a child here!”
The YouTube spokeswoman said the company has done its best to comply with its COPPA obligations, as it understands them, and has asked the FTC for more clarification on the rules.
The company is “not answering the questions everyone wants,” said Greg Alkalay, chief executive of BatteryPop, a children’s media company. “YouTube’s success comes from its creators. They built a beast and don’t know how to wrangle it.”
Bergen and Shaw write for Bloomberg.