Was Youtube’s Christmas Crypto Purge Illegal?

This article considers the legal implications of Youtube’s notorious purge of crypto channels on Christmas Eve. What legal context induces Youtube and other social media giants to operate as they do? The article does not explore whether it is morally proper to terminate a contract without cause or explanation and to threaten people’s livelihoods; it is not. Nor are political implications, such as Youtube’s liberal bias, discussed. The legal context surrounding Google’s Youtube purge is important, and users should know what is happening and why.

Also read: Youtube ‘Christmas Purge’ Has Content Creators Pointing to These Alternate Platforms

What Happened?

On Christmas Eve, Bitcoin.com contributor Graham Smith warned, “At least six crypto Youtube channels have reported in recent hours that their content is being removed under the site’s ‘harmful and dangerous’ policy, with one popular channel claiming Youtube pointed to a ‘sale of regulated goods’.” The purged channels received no warning and no plausible explanation. Presumably, the “harmful and dangerous” conduct so vaguely referenced by Youtube was an alleged violation of Section 17(b) of the Securities Act of 1933, which provides:

It shall be unlawful for any person … to publish, give publicity to, or circulate any notice, circular, advertisement, newspaper, article, letter, investment service, or communication which, though not purporting to offer a security for sale, describes such security for a consideration received or to be received, directly or indirectly, from an issuer, underwriter, or dealer, without fully disclosing the receipt, whether past or prospective, of such consideration and the amount thereof [italics added].

The original purpose of Section 17(b) was to make it illegal for anyone to promote a stock without disclosing any consideration they may have received from an issuer, underwriter, or dealer in the stock.

The Christmas Purge is not the first time Google has removed crypto material. In June 2018, Google followed Facebook’s lead in banning crypto-related advertising. CNBC reported, “Even companies with legitimate cryptocurrency offerings won’t be allowed to serve ads through any of Google’s ad products, which place advertising on its own sites as well as third-party websites.” The legality of crypto and the reputation of the advertiser were irrelevant. Three months later, Google’s outright ban ended, but a new policy was instituted. Forbes explained, “regulated cryptocurrency exchanges” could “buy ads in the U.S. and Japan … Ads for initial coin offerings (ICOs), wallets, and trading advice will remain banned … with the updated policy applying to advertisers all over the world, though the ads will only run in the U.S. and Japan.” The stated reason for the ban and restriction was a desire to shut down illegal activities connected to crypto for which Google could have been held liable.

Also read: A Look at the Top Cryptocurrency Markets From Christmases Past

Happy Christmas, the Purge Is Over

The Christmas Purge now seems to be resolved and the channels restored—at least, according to Youtube. But Chris Dunn, owner of a channel with more than 200,000 subscribers, claims that his videos have not been completely restored; other crypto Youtubers echo this complaint, and some report an inability to post new videos or to insert links within the ones they do put up. Meanwhile, Youtube ascribes the incident to an “error.”

Even those without complaints should note Youtube’s new Terms of Service (November 11, 2019). “Youtube may terminate your access, or your Google account’s access to all or part of the Service if Youtube believes, in its sole discretion, that provision of the Service to you is no longer commercially viable.” Most centralized social media have similar terms.

What is the legal context of such authoritarianism?

Publisher or Platform?

The key question is “publisher or platform?”

A publisher edits and controls the material it issues, which means it assumes legal liability. Under the traditional common law of defamation, for example, a publisher who issues the defamatory statement of another party can bear the same legal liability as the original speaker. This is because the publisher has the knowledge, ability, and opportunity to control what appears.

For years, Facebook has described itself as a platform—that is, an array of services, tools, and products through which other parties independently create and publish content. Except for minimal, common-sense restrictions, such as the prohibition of illegal activity, Facebook claimed it did not edit or control content. It was a platform, a distribution mechanism. A platform is no more legally responsible for the content it hosts than a phone company is responsible for the conversations that flow over its lines.

This immunity comes from Section 230 of the Communications Decency Act, subtitled “Protection for private blocking and screening of offensive material.” It reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The provision gives all interactive online services broad immunity from tort liability arising from material posted by third parties.

Establishing Where Liability Lies

The legal difference between a platform (a distributor) and a publisher is significant and centers on the issue of liability. The Digital Media Law Project explained, “Distributor liability is much more limited. Newsstands, bookstores, and libraries are generally not held liable for the content of the material that they distribute. The concern is that it would be impossible for distributors to read every publication before they sell or distribute it, and that as a result, distributors would engage in excessive self-censorship. In addition, it would be very hard for distributors to know whether something is actionable defamation; after all, speech must be false to be defamatory.” A publisher, by contrast, could be held liable.

Social media—platform or publisher?—was the gist of an early court battle over online content. Cubby v. CompuServe, Inc. (1991) revolved around allegedly defamatory statements made by posters on a bulletin board. Subscribers to Compuserve could access more than 150 forums that were managed by third parties. When Compuserve was sued over defamatory statements that appeared in the so-called “Rumorville” forum, it claimed the immunity of a distributor because it did not review posts before they appeared. The court agreed and ruled in Compuserve’s favor.

But the Compuserve ruling did not create settled law. In Stratton Oakmont v. Prodigy (1995), a court found that a computer network did exercise editorial control over messages through its content guidelines and software screening filter. Prodigy was found legally liable for content on its site.

The Digital Media Law Project commented on the conflicting decisions. “The perverse upshot of the CompuServe and Stratton decisions was that any effort by an online information provider to restrict or edit user-submitted content on its site faced a much higher risk of liability if it failed to eliminate all defamatory material than if it simply didn’t try to control or edit the content of third parties at all.” Social media is in the tense position of wishing to be legally viewed as a platform while exercising the control vested in a publisher. Facebook and its ilk try to straddle the fence, with a foot planted on either side. The footing is not solid.

For one thing, the situation is more complicated than a clean platform-publisher divide. On civil liability, for example, the current Section 230 includes a “Good Samaritan” clause that states, “No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Social media does not incur civil liability for exercising “good faith” efforts to patrol content, at least in theory. But the civil immunity is not absolute, and courts have shown a tendency to chip away at the broad protection. In Barnes v. Yahoo!, Inc., for example, the court found that a failure to comply with promises to remove material could invalidate Section 230 protection.

Besides which, many accusations against crypto are criminal, not civil—and Section 230’s immunity does not extend to federal criminal law.

Social Media’s Schizophrenia

Since the civil liability of social media is evolving law, the greatest safety resides in being a “platform” rather than a “publisher.” But as the social media companies grow in wealth, influence, and arrogance, the lines blur.

According to Bloomberg Business, Susan Wojcicki, Youtube’s chief executive officer, has been “trying to traverse an almost impossible tightrope: nurture a growing community of demanding creators, while pledging to police troubling videos … The efforts pleased almost no one and highlighted an existential quandary. Every time Youtube tries to fix something, the company, an arm of Alphabet Inc.’s Google, risks losing the neutrality that it needs to thrive.” The neutrality comes from acting as a platform. And yet, Youtube also acts as a publisher by censoring material. When caught doing so, Youtube reverts to being a platform again by ascribing the “mistake” to a computer glitch, human error, or an act of God.

Facebook is similarly schizophrenic; that is, it wants the legal protections of a platform while functioning as a publisher. Facebook has been leaning more openly toward a publisher status, however.

In April 2018, while testifying before the Senate, Facebook CEO Mark Zuckerberg stated that his company “is responsible for the content.” In response, Senator Ted Cruz spelled out the legal implications of this statement by asking Zuckerberg, “Are you a First Amendment speaker expressing your views [a publisher], or are you a neutral public forum allowing everyone to speak [a platform]? … The predicate for Section 230 immunity under the CDA is that you’re a neutral public forum.” Cruz’s question must have been for the record, because Zuckerberg was surely familiar with the legal distinction.

Of more significance: Facebook represented itself as a First Amendment speaker—a publisher—in a 2018 court proceeding. The Guardian reported, “In a small courtroom in California … attorneys for the social media company … repeatedly argued, [Facebook] is a publisher, and … makes editorial decisions, which are protected by the first amendment.”

Complying with the demands the state makes upon a publisher is costly, inconvenient, and disempowering, and such demands are becoming more frequent. Bloomberg Business reported on just one policy the state has imposed on Youtube for 2020:

Youtube’s approach to kids. A landmark privacy settlement this year with the Federal Trade Commission is forcing Youtube to split its massive site in two. Every clip, starting in January, must be designated as “made for kids” or not. The overhaul puts billions of ad dollars at stake and has sparked panic among creators, who also now face new legal risk. The company isn’t offering creators legal advice or ways to salvage their businesses. It isn’t even defining what a “made for kids” video is on Youtube.
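
To see what this compliance burden looks like from a creator’s side, below is a minimal sketch of how a channel owner might record the new designation programmatically. It assumes Python with the google-api-python-client library, OAuth credentials already authorized for the channel, and a placeholder video ID; it illustrates the mechanics only, not which videos legally qualify as “made for kids.”

# Minimal sketch: a creator self-declares a video "made for kids"
# through the YouTube Data API v3 (videos.update, "status" part).
# Assumes google-api-python-client and pre-authorized OAuth credentials.
from googleapiclient.discovery import build

def mark_made_for_kids(credentials, video_id, made_for_kids=True):
    """Set the self-declared 'made for kids' flag on an existing video."""
    youtube = build("youtube", "v3", credentials=credentials)
    response = youtube.videos().update(
        part="status",
        body={
            "id": video_id,
            # The API only records the creator's own designation; deciding
            # which videos actually qualify is left to the creator.
            "status": {"selfDeclaredMadeForKids": made_for_kids},
        },
    ).execute()
    # Youtube's effective determination comes back as status.madeForKids.
    return response["status"].get("madeForKids")

# Hypothetical usage: mark_made_for_kids(creds, "VIDEO_ID", True)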

Youtube will pass the cost of compliance on to the user, of course. One cost will be the arbitrary suspension of “suspicious” accounts—suspicious by the unstated or vague definition of the social media company. If some accounts that are merely “offensive” fall victim, so be it.

Conclusion

Social media does not have to be this way. It should not involve the exercise of arbitrary power over dutiful customers. The unthinking or biased bureaucracy of the giants results from several factors. The companies are highly centralized, which makes them unresponsive and autocratic. They are financially supported by the state through tax grants and privileges, contracts and exemptions; in short, they do not need to provide a competitive service in the marketplace. They strip privacy away from users in order to sell it or share it with the state—neither of which benefits users.

The function and legal context of the social media giants are not likely to change, except for the worse. The solution is obvious, however. Graham Smith laid it out in “Youtube ‘Christmas Purge’ Has Content Creators Pointing to These Alternate Platforms.” The crypto-powered video-sharing platforms he describes deserve an immediate look before the social media giants oblige content creators and their fans to adopt them en masse.

What are your thoughts on Youtube’s Christmas crypto purge? Let us know in the comments section below.

Op-ed disclaimer: This is an Op-ed article. The opinions expressed in this article are the author’s own. Bitcoin.com is not responsible for or liable for any content, accuracy or quality within the Op-ed article. Readers should do their own due diligence before taking any actions related to the content. Bitcoin.com is not responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any information in this Op-ed article.


