Posted 5/5/21
Social media companies, particularly YouTube, have been called out for using suggestion algorithms that lead to radicalization. The premise is that YouTube suggestions (or auto-play functionality) prompt users with increasingly extreme political content in an effort to increase engagement. The longer you stay on the platform, the more advertisements you’ll watch, and the more money YouTube makes. The same premise underlies the content feeds of Facebook, Twitter, TikTok, and so on. Generally, pushing users towards neo-Nazi content to maximize ad revenue is seen as a Bad Thing. The usual first responses are “we should regulate social media companies and forbid them from using auto-radicalization algorithms”, or “we should dispose of the surveillance-advertisement framework, running services like YouTube on a subscription model instead.” These are both problematic strategies, and I’ll argue that we can do better by abolishing YouTube altogether in favor of distributed hosting. Let’s jump in!
The first approach is fraught with challenges: many suggestion-feed algorithms are machine-learning black boxes that don’t “understand” they’re amplifying extreme political content; they just see correlations between keywords or specific video viewership and increased engagement. “Users that watched video X were likely to also watch video Y and therefore sit through another ad.” Legislating the computer-calibrated feed algorithms of social media companies would be an impressive feat. Even if it could be done, social media companies would be incentivized to skirt or ignore such regulations, since it’s in their financial interest to keep user engagement as high as possible. (We see a similar trend in content policies, where social media companies are incentivized to remove anything the public might see as morally objectionable, while avoiding any actions that could be perceived as political censorship. The result is censorship of LGBTQ+ and sex-positive content, and minimal intervention in white supremacist rhetoric until it crosses a clear legal line.)
The second approach is more satisfying: if we pay YouTube or Facebook directly, then their financial future is secured and they no longer need advertisements. Without advertisements, they no longer need suggestion algorithms to drive up engagement. Better yet, they can take down their creepy surveillance infrastructure, since they no longer need to sell user data to advertisers to turn a profit! The viewer is now the customer rather than the product, hooray! Sort of. Putting aside whether enough users would pay to make this business model viable, it doesn’t actually solve the original issue: social media companies are still incentivized to maximize engagement, because users that regularly spend time on their platform are more likely to keep up their subscriptions. We’d have a similar problem if the service were “free” with micro-transactions: the incentive is still to foster addictive behavior, drawing users in until they’re willing to spend more money. Maybe the feed algorithms would get a little better, but they’d remain fundamentally the same.
But who said we have to pay for our content with money? If the core issue is “YouTube needs income from somewhere to cover their infrastructure costs”, could we instead donate our computers to serve as that infrastructure?
This could take a few forms:
For every video you watch, you must host that video for a time, and seed it to X other users before deleting it. This would prioritize redundancy for popular content, but give no guarantee that obscure videos remain available. Maybe this is a good thing? Content is “forgotten” by the network unless it’s actively viewed, or someone feels strongly that the content should remain available and explicitly chooses to host it.
For each video you watch, you are assigned a second video that you must host for X number of views or Y amount of time. This resolves the “no hosts for obscure content” problem at the cost of additional overhead.
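To make the bookkeeping behind these obligations concrete, here’s a minimal sketch of a client-side ledger covering both schemes. Every name and number here (SEED_TARGET, ASSIGNED_HOST_DAYS, and so on) is hypothetical, not part of any existing protocol:

```python
import time
from dataclasses import dataclass

SEED_TARGET = 3              # scheme 1: seed each watched video to 3 other users
ASSIGNED_HOST_DAYS = 30      # scheme 2: host an assigned video for 30 days

@dataclass
class Obligation:
    video_id: str
    seeds_remaining: int = 0   # scheme 1: uploads still owed before deletion
    host_until: float = 0.0    # scheme 2: unix timestamp we must host until

class Ledger:
    """Tracks what this client owes the network before it may delete content."""

    def __init__(self) -> None:
        self.obligations: dict[str, Obligation] = {}

    def on_watch(self, watched_id: str, assigned_id: str) -> None:
        # Scheme 1: owe SEED_TARGET uploads of the video we just watched.
        self.obligations[watched_id] = Obligation(watched_id, seeds_remaining=SEED_TARGET)
        # Scheme 2: also host an assigned (possibly obscure) video for a while.
        self.obligations[assigned_id] = Obligation(
            assigned_id, host_until=time.time() + ASSIGNED_HOST_DAYS * 86400)

    def on_upload(self, video_id: str) -> None:
        # Called each time we successfully seed the video to another peer.
        ob = self.obligations.get(video_id)
        if ob and ob.seeds_remaining > 0:
            ob.seeds_remaining -= 1

    def may_delete(self, video_id: str) -> bool:
        ob = self.obligations.get(video_id)
        if ob is None:
            return True
        return ob.seeds_remaining == 0 and time.time() >= ob.host_until
```

Of course, a purely local ledger is only honored by honest clients; in practice the network would have to enforce it by favoring peers that actually seed, which is exactly the reciprocity described next.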
Both suggested systems work similarly to current torrent technology, where peers preferentially share content with you based on your past sharing behavior. “Good citizens” who host lots of video content are able to view videos more quickly in turn: shorter delays before playback starts, enough bandwidth to load higher-quality versions of videos, and so on. Users that host content only briefly, or refuse to host videos altogether, are given lower priority, and so see long video load times, especially for higher-quality versions of videos. A distributed hash table tracks which videos are currently hosted by which nodes, providing both a redundant index and redundant video storage.
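As a rough illustration (not a real protocol), the two moving parts described here — an index mapping video IDs to the nodes hosting them, and a reciprocity score for prioritizing peers — could look something like this. A real network would spread the index across a distributed hash table; a plain dictionary stands in for it below, and all names are hypothetical:

```python
from collections import defaultdict

class VideoIndex:
    """Stand-in for the distributed hash table: video ID -> hosting nodes."""

    def __init__(self) -> None:
        self.hosts: defaultdict[str, set[str]] = defaultdict(set)

    def announce(self, video_id: str, node: str) -> None:
        self.hosts[video_id].add(node)        # node starts hosting the video

    def withdraw(self, video_id: str, node: str) -> None:
        self.hosts[video_id].discard(node)    # node stops hosting the video

    def lookup(self, video_id: str) -> set[str]:
        return set(self.hosts[video_id])      # who can serve this video right now?

def peer_priority(bytes_uploaded: int, bytes_downloaded: int) -> float:
    # Peers that have shared roughly as much as they've consumed get served
    # first; freeloaders aren't cut off, just pushed to the back of the queue.
    return bytes_uploaded / max(bytes_downloaded, 1)

# A node with spare upload capacity might then order incoming requests:
#   requests.sort(key=lambda r: peer_priority(r.peer.uploaded, r.peer.downloaded),
#                 reverse=True)
```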
From the user perspective, they’re not “paying” anything to watch YouTube videos, and there are no advertisements. If they checked, they’d see that their hard drive has lost some free space temporarily, but unless they’ve almost filled their computer, they’re unlikely to notice.
What are the limitations of this strategy? Well, first, we’ve said nothing about YouTube “turning a profit”, only covering their infrastructure costs. Indeed, there’s little to no profit in this, and YouTube would become a collective public service rather than a corporation. Replacing YouTube as a company with shared volunteer labor sounds radical, but there’s precedent in both torrenting communities and Usenet. People are generally willing to give back to the community, or at least there are enough willing people to keep the community going, especially if it costs them very little. Losing some hard drive space for a while to share that great music video you just watched definitely qualifies as “very little”. OurTube :)
This leaves two large sociotechnical hurdles:
Unequal access. Users that live in urban areas with higher bandwidth, and can afford additional storage capacity, receive preferential treatment in a reciprocal hosting network. This is not great, but users with low bandwidth already have an inferior YouTube experience, and data storage is getting cheaper and cheaper (you can get a 1 TB hard drive for under $50 now!). At best we’re not making this problem worse.
Content discovery. We don’t need a recommendation algorithm to drive engagement anymore - engagement no longer brings anyone wealth - but we do need some mechanism to find videos to watch! Video titles and descriptions are searchable, but that’s easily gamed (remember early search engine optimization, websites stuffing hundreds of tiny words in the bottom of the HTML to coax search engines into increasing their relevancy scores?), and sorting search results by views doesn’t solve the problem.
I’m hopeful that the second problem could become a boon, and we could see a resurgence of curation: Users put together channels of videos by topic, amassing subscribers that enjoy the kinds of videos in the list. There could be group voting and discussion on videos, subreddits dedicated to an interest, users submitting videos and arguing over content and relevancy. The solutions seem more human than technical, and exist outside the video-hosting software itself.
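That voting and discussion layer could live entirely outside the hosting software, but as a sketch of how little a curated channel would actually need to store (all names hypothetical, purely illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class Submission:
    video_id: str
    submitted_by: str
    votes: int = 0
    comments: list[str] = field(default_factory=list)

@dataclass
class Channel:
    topic: str
    curators: set[str] = field(default_factory=set)
    submissions: list[Submission] = field(default_factory=list)

    def submit(self, video_id: str, user: str) -> None:
        self.submissions.append(Submission(video_id, user))

    def ranked(self) -> list[Submission]:
        # Order by community votes rather than by an engagement metric.
        return sorted(self.submissions, key=lambda s: s.votes, reverse=True)
```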
A better Internet is possible, one that isn’t covered in unskippable double advertisements, doesn’t push teens towards white supremacy videos, and doesn’t have an American corporation as adjudicator over what content is morally acceptable.