
Wednesday, May 1, 2019

( Facebook As Suckface ) Patcnews: May 1, 2019 The Patriot Conservative News Tea Party Network Reports Facebook As Suckface © All Copyrights Reserved By Patcnews

The Facebook free speech battle, explained


Is Facebook a platform or a publisher? When users are getting banned, it makes a difference.

The Facebook logo is displayed during the F8 Facebook Developers conference on April 30, 2019, in San Jose, California.
Justin Sullivan/Getty Images

Facebook booted a hodgepodge of extremist figures last week, inflaming sentiment among some on the right and raising new questions about what is and isn’t protected speech on digital platforms.

But the current debate over speech and social media is far bigger than Facebook. It’s a product of social media companies skirting a fundamental question for more than a decade: Are they platforms — like Amazon Kindle or a cellphone network provider — or are they publishers, like Vox, Infowars, or the Washington Post?
For years, social media giants tried to avoid the question altogether, recognizing that under American law, digital platforms have unique protections that guard against lawsuits aimed at the content posted on those platforms. But users complained about extremism and misinformation weaponized on Facebook and elsewhere, putting Facebook, Twitter, and other tech companies under immense pressure to increase moderation and close the accounts of bad actors — the same way a publisher might reject an article or a writer.
In doing so, they’ve gotten sucked into the political fray they wanted to avoid. Conservatives, pointing out that Facebook and Twitter are self-described platforms, are arguing that banning some users while permitting others based on a “vague and malleable” rubric is infringing on free expression on sites that they view as more like a town square where all voices should be heard.
As President Donald Trump tweeted on Friday, “I am continuing to monitor the censorship of AMERICAN CITIZENS on social media platforms. This is the United States of America - and we have what’s known as FREEDOM OF SPEECH!”

How Facebook’s ban on extremist users exploded online

Facebook announced on Thursday that the platform was banning a grab bag of users: failed white nationalist House candidate Paul Nehlen, pundit Milo Yiannopoulos, right-leaning YouTube personality Paul Joseph Watson, alt-right political activist Laura Loomer, Nation of Islam leader Louis Farrakhan, and Alex Jones and his Infowars media outlet.
In a statement, Facebook said, “We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology. The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today.”
But that’s not how many — conservative, libertarian, or otherwise — have interpreted Facebook’s decision, which comes after years of what reporter Charlie Warzel called in August 2018 “vague content rules and arbitrary enforcement” regarding bad actors on its platform.
Combined with recent Twitter suspensions of right-leaning figures like actor James Woods and ongoing drama over alleged “shadowbanning” of conservative social media users, Facebook’s decision last week just added to the sense among some on the right that social media companies unfairly treat right-wing users. (This is debatable.)
That’s prompted outrage from not just the media figures and outlets who have been banned or suspended but prominent conservative politicians as well, including Trump, Newt Gingrich, and Sen. Ted Cruz (R-TX).
The problem Facebook is facing now is one largely of its own making. But some of its loudest critics are making the wrong argument. Trump and others are fighting over who has been banned from Facebook (and Twitter), when perhaps they should be fighting over the fact that self-described “platforms” banned anyone at all.

Is Facebook a public square or a publisher?

Infowars is a publisher. Alex Jones, who has been the publisher and director of Infowars since its launch in 1999, can publish what he wants on it. If I pitched Alex Jones on an article for Infowars, he would be under no obligation whatsoever to publish it.
Amazon Kindle is a platform, which means Amazon provides the means by which to create or engage with content, but it doesn’t create most of the content itself — or do a lot of policing of it. If I wanted to read Mein Kampf on my Amazon Kindle, Amazon would be unlikely to stop me from doing so.
An even better example of a platform might be a company like Verizon or T-Mobile, which provides software and the network for you to make phone calls or send texts, but doesn’t censor your phone calls or texts even if you’re arranging to commit a crime.
But for Facebook, and all similar social media sites, this seemingly dense legal question is deeply important — for how it treats figures like Jones and Farrakhan, and even how Facebook arbitrates speech at all.
If Facebook is a platform, it then has legal protections that make it almost impossible to sue over content hosted on the site. That’s because of Section 230 of the Communications Decency Act, which protects websites like Facebook from being sued for what users say or do on those sites.
Passed in 1996, the act reads in part, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Back in 2006, the act protected the website MySpace when it was sued after a teen met an adult male on the site who then sexually assaulted her. The court found that the teen’s claims that MySpace failed to protect her would imply MySpace was liable for content posted on the site — claims that butted up against Section 230.
But if Facebook is a publisher, then it can exercise editorial control over its content — and for Facebook, its content is your posts, photos, and videos. That would give Facebook carte blanche to monitor, edit, and even delete content (and users) it considered offensive or unwelcome according to its terms of service — which, to be clear, the company already does — but it would make Facebook vulnerable to the same types of lawsuits that media companies face.
If the New York Times or the Washington Post published a violent screed aimed at me or published blatantly false information about me, I could hypothetically sue the New York Times for doing so (and some people have).
So instead, Facebook has tried to thread an almost impossible needle: performing the same content moderation tasks as a media company might, while arguing that it isn’t a media company at all.

Facebook is trying to have its cake and eat it too

At times, Facebook has argued that it’s a platform, but at other times — like in court — that it’s a publisher.
In public-facing venues, Facebook refers to itself as a platform or just a “tech company,” not a publisher. Take this Senate committee hearing from April 2018, for example, where Facebook CEO Mark Zuckerberg argues that while Facebook is responsible for the content people place on the platform, it’s not a “media company” or a publisher that creates content. 



 And it certainly hasn’t been alone. In 2007, Wired magazine referred to the site as a “full-fledged platform that organizes the entire Internet.” And AdAge argued in 2012 that “what Facebook’s critics don’t understand” is that “it’s a platform, not a publisher.” 

Questions answered by Facebook, “Facebook: Transparency and Use of Consumer Data,” June 29, 2018.

But in court, Facebook’s own attorneys have argued the opposite. In court proceedings stemming from a lawsuit filed by an app developer in 2018, a Facebook attorney argued that because Facebook was a publisher, it could work like a newspaper — and thus have the ability to determine what to publish and what not to. “The publisher discretion is a free speech right irrespective of what technological means is used. A newspaper has a publisher function whether they are doing it on their website, in a printed copy or through the news alerts.”

And even the language Zuckerberg has used about Facebook when appearing before Congress, as he did last spring, shows that he thinks of the service as a publisher — while his company simultaneously argues that it’s not.

In his opening statement to committee members, Zuckerberg said, “We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.” And then he added, “I agree we are responsible for the content” on Facebook, while noting again that Facebook doesn’t produce content itself.

Why the line between platform and publisher matters


Facebook is far from alone in attempting to walk an almost impossible line between responding to users’ demands for moderation and editing while attempting to avoid the legal responsibilities of being a publisher.

Take Tumblr’s recent ban on nudity, Twitter’s continued back-and-forth on suspending and banning extremist users, and Facebook’s recent efforts to curtail misleading ads that may have contributed to misinformation surrounding the 2016 presidential campaign: all of these moderating efforts are attempts to get out ahead of users who are dismayed by a constant cavalcade of bad actors and bots that make these sites less enjoyable to use (and less profitable for the advertisers that post on these platforms, and thus for the platforms themselves).

And with the threat of impending regulations arising from European courts where American digital media protections don’t exist, Facebook is keener than ever to stay within the good graces of American users — and politicians.

So companies like Facebook, YouTube, and Tumblr are trying to be more, as Zuckerberg put it, “responsible.” But that’s landed them in a supercharged political environment, drawing the ire of the figures they’ve deemed dangerous and many others. For companies like Facebook, they’re damned if they do moderate content — both legally and politically — and damned if they don’t.

“You’re next”


Many of those most enraged by Facebook’s decision to ban some users it views as “extremist” don’t actually have a problem with Facebook playing more of a “publisher” role — unless it applies to them.

Conspiracy theorist Paul Joseph Watson, who was among those banned, posted a video on YouTube in which he appeared to argue that some people clearly deserve to be banned from Facebook, but not him:

They put me on a list with terrorists, human traffickers and serial killers, because I criticize modern art and modern architecture. Because I dare criticize mass immigration. Because I dare criticize a belief system, yet you still host Antifa accounts which threaten to assassinate the president. You still host accounts belonging to the sicko who sent death threats to Ben Shapiro’s family.


To be fair, others are making a better case: that the rules Facebook is using to decide what is “hate speech” and what isn’t constitute “selective silencing,” particularly since figures like Farrakhan and Jones have been using Facebook and other platforms for years to spread anti-Semitic bile and conspiracy theories, raising the question of why these figures were banned now and not, say, in 2012, when Jones was arguing that the Sandy Hook shootings were a “false flag.”

What was newly offensive about Laura Loomer, best known for spreading conspiracy theories and chaining herself to the doors of Twitter’s New York headquarters while wearing a yellow Star of David, that wasn’t offensive in 2016?

Others are making a far more direct appeal, one aimed at conservative social media users: if Facebook and Instagram can take down pages for Infowars and Loomer (who went on Infowars after the ban and screamed about how “ruined” her life is now), what will stop the sites from removing pages for Trump supporters, or for conservatives more generally? That’s the argument being made by Donald Trump Jr., who said on Twitter that the “purposeful & calculated silencing of conservatives” should “terrify everyone,” adding, “ask yourself, how long before they come to purge you?”

Or, as Yiannopoulos said in an email when I asked for comment on his ban from Facebook, “You’re next.”


Facebook wants to be a publisher that isn’t a publisher


But this doesn’t get at the real issue: Facebook wants to enjoy the benefits of being a content publisher — major moderation and editing powers along with the power to ban users for whatever reasons it wants — while also accessing the legal freedoms that come with being a platform under American law. And right now, Facebook is basically a publisher that keeps arguing that it isn’t.

That muddy legal territory has people worried that the social media giant will fail on both counts — that it won’t handle material on its site as responsibly as a media outlet might, but will also stop providing an online “town square” where controversial voices can be heard. Since Facebook is now apparently reviewing the actions of users even when they’re not on Facebook, some are arguing that the stated terms of service that should dictate what’s permitted on Facebook and Instagram don’t do so in reality. That’s why organizations focused on digital civil liberties are just as concerned about Facebook’s decisions as some on the right.

Jillian York, an Electronic Frontier Foundation director, said in a statement, “Given the concentrated power that a handful of social media platforms wield, those companies owe their users a clear explanation of their rules, clear notice to users when they violate those rules, and an opportunity to appeal decisions.”

In April 2018, Facebook launched the “Facebook Here Together” campaign, stating that Facebook would “do more to keep you safe” from privacy violations and seemingly from bad content.

But that’s the role of a publisher — one that Facebook has argued time and time again that it doesn’t have. And that’s a big, big problem for the world’s most powerful social media company.

Mark Suckerberg CEO OF Suckface (facebook) Hands Up Don't Shoot 

Innovation interrupted: Facebook lawsuit diverts attention from F8 2019

F8, Facebook’s annual developers conference, wrapped up Wednesday afternoon. But news from a Delaware courtroom captured part of the attention usually garnered by discussions of innovation.
“It’s true that we, and I, have learned a lot of hard lessons over the last few years,” said Mike Schroepfer, Facebook’s chief technology officer, during the morning’s keynote address.
In a Delaware courtroom, Facebook investor Robert Feuer filed suit against the company. His 193-page complaint alleges that Mark Zuckerberg committed insider trading by selling stock ahead of 2018’s data breach controversy and that Facebook’s board took actions that ultimately undermined the value of the stock, and of the company itself.
“They have some exposure, because one of the things we’re looking for from our executives in leadership in companies is transparency and sort of making timely disclosure on things that could affect the company’s future. Certainly investors have a right to expect that,” said Ann Skeet, the senior director of the Markkula Center for Applied Ethics at Santa Clara University.



Wednesday’s suit follows multiple missteps, both perceived and real, by the social networking giant. As recently as 2014, Facebook faced fire for a mood manipulation experiment on half a million unsuspecting users. User information was plucked by apps and then shared without the consent of those affected. And last year’s massive data theft led to greater congressional oversight and the looming prospect of a five-billion-dollar fine.
“Everybody that has a stake in Facebook needs to be concerned whenever they make a misstep, whether they’re a stockholder or user or regulator. Facebook isn’t entitled to stay on top forever. If they continue to have missteps, they could face the same fate as MySpace did,” said Larry Magid, a tech analyst and the CEO of ConnectSafely.
Facebook executives and presenters didn’t address the new lawsuit during the keynote discussion at F8. An emailed statement from an anonymous spokesperson said, “This lawsuit is without merit.” The rank and file at F8 didn’t seem concerned.
“I care more about the privacy, and the other parts that they’re working on, rather than stocks and selling, and shareholders themselves,” said Ionut Ciobotaru, a Facebook advertiser attending from Romania.
Facebook continues to push ahead with changes it says will provide greater privacy and security for its users, business partners, and the public at large.

 Email us 
 patcnews.patriot@aol.com
 patcnewsconservative1@aol.com



LLC 501(c)(4) UCC 1-308. ALL RIGHTS RESERVED WITHOUT PREJUDICE




Content and Programming Copyright 2019 By Patcnews The Patriot Conservative News Tea Party Network © LLC UCC 1-308.ALL RIGHTS RESERVED WITHOUT PREJUDICE All copyrights reserved By Patcnews The Patriot Conservative News Tea Party Network Copyright 2019 CQ-Roll Call, Inc. All materials herein are protected by United States copyright law and may not be reproduced, distributed, transmitted, displayed, published or broadcast without the prior written permission of CQ-Roll Call. You may not alter or remove any trademark, copyright or other notice from copies of the content.  © All Copyrights Reserved By Patcnews The Patriot Conservative News Tea Party Network
