Some of the people who literally helped write Facebook’s community standards say Zuckerberg is wrong on Trump posts
Thirty-three of Facebook’s earliest employees write letter objecting to hands-off policy
Facebook names 20 people to its 'Supreme Court' for content moderation
The list includes nine law professors, a Nobel Peace Prize laureate from Yemen and journalists but no disinformation experts.
By David Ingram
Facebook on Wednesday appointed 20 people from around the world to serve on what will effectively be the social media network’s “Supreme Court” for speech, issuing rulings on what kind of posts will be allowed and what should be taken down.
The list includes nine law professors, a Nobel Peace Prize laureate from Yemen, journalists, free speech advocates and a writer from the libertarian Cato Institute.
Absent, however, was any prominent expert in studying disinformation. Facebook has struggled to contain state-based manipulation efforts as well as hoaxes on subjects like false cures.
Helle Thorning-Schmidt, a former prime minister of Denmark and one of four co-chairs of the board, said they would consider such expertise in recruiting more members.
“We have tried to consider all communities and also people who have been critical of Facebook in the past,” she said. The number of members will rise to 40 over time, she said.
The oversight board is more than two years in the making, its creation prompted by CEO Mark Zuckerberg, who said in 2018 that he wanted to create “some sort of structure, almost like a Supreme Court,” for users to get a final judgment call on what is acceptable speech and relieve the company's executives of having to decide.
Social media networks dating back to MySpace have struggled to write rulebooks that are easy to understand and consistently enforceable and yet cover the varied material that people try to post online.
The rules, including Facebook’s “community standards,” have evolved to prohibit not only illegal images such as child pornography but also hate speech, harassment and, most recently, false information about the coronavirus pandemic.
The questions often become political footballs, as lawmakers in Washington and elsewhere have turned their fire on Zuckerberg when they believe they or their supporters are being unfairly censored.
The creation of Facebook’s oversight board is designed to effectively hand the last word over to the expert panel, possibly taking Zuckerberg and other Facebook executives out of the picture on writing speech rules — and sparing them having to answer questions from users, lawmakers and journalists.
But one of the co-chairs, former federal judge Michael McConnell, said he expected the board to have a steep learning curve.
“We are not the internet police,” McConnell said. “Don’t think of us as sort of a fast-action group that’s going to swoop in and deal with rapidly moving problems. That’s not our job.” The job, he added, was to hear appeals of decisions that Facebook has already made.
The board’s decisions will be binding “unless implementation could violate the law,” Facebook said. The decisions will also apply to Facebook-owned Instagram but not initially to WhatsApp, where content is generally encrypted. Membership on the board is part-time. The board isn’t disclosing its compensation.
Facebook has taken steps to try to make the board independent, creating a $130 million trust to pay for its operation and pledging that it cannot remove members from the board. Facebook will refer cases to the board for its consideration when the company considers them “significant and difficult,” and Facebook users will be able to suggest cases through an online portal.
“All Members are committed to free expression, and reflect a wide range of perspectives on how to understand the principle and its limits,” Facebook said in a statement.
“Some have expressed concerns with the dangers of imposing restrictions on speech, and allow for only very narrow exceptions. Others make comparatively greater accommodations to a range of competing values, including safety and privacy,” the company said.
Day-to-day enforcement of the rules will still be up to Facebook, which uses a combination of computer algorithms and human moderators to decide which posts violate its rules.
One reason that moderating content online is so complicated is that companies such as Facebook tailor their rules to specific countries based on local law. Facebook, with 2.6 billion people across its apps, has users in nearly every country.
Americans are the best-represented nationality on the oversight board, with at least five members. No other country has more than one. Facebook said the members chosen collectively have lived in more than 27 countries and speak at least 29 languages.
Not all are avid Facebook users. “I myself am not really an Instagram or Facebook user,” said Jamal Greene, a Columbia Law School professor. But he said he appreciated that “Facebook’s decisions affect people all over the world and can affect people in profound ways.”
Of the 20 members so far, half are male and half female.
Two of the lawyers joining the board have been discussed as potential U.S. Supreme Court nominees: Pamela Karlan, a Stanford law professor who’s a favorite of liberals, and McConnell, also a Stanford professor and a conservative former judge appointed by President George W. Bush.
McConnell told reporters on a conference call that he viewed the board as ensuring Facebook is a neutral platform — a contentious idea as hoaxes and other misinformation have spread on the network. “One of the fruits of this if we do our jobs right is that this will bring about a degree of political and cultural neutrality,” he said.
“It is our ambition and goal that Facebook not decide elections,” he said.
The Nobel Peace Prize laureate is Tawakkol Karman, who won the award in 2011 for her role in organizing protests against the Yemeni government as part of the pro-democracy Arab Spring.
Among the other members are Alan Rusbridger, a former editor of Britain’s Guardian newspaper who oversaw the newspaper’s coverage of U.S. spying based on leaked documents from Edward Snowden, and John Samples, a Cato Institute vice president who has argued against government censorship of social media.
Below is Facebook's list of the 20 members of the Facebook Oversight Board:
- Afia Asantewaa Asare-Kyei - A human rights advocate who works on women’s rights, media freedom and access to information issues across Africa at the Open Society Initiative for West Africa.
- Evelyn Aswad - A University of Oklahoma College of Law professor who formerly served as a senior State Department lawyer and specializes in the application of international human rights standards to content moderation issues
- Endy Bayuni - A journalist who twice served as the editor-in-chief of The Jakarta Post, and helps direct a journalists’ association that promotes excellence in the coverage of religion and spirituality.
- Catalina Botero Marino, co-chair - A former special rapporteur for freedom of expression of the Inter-American Commission on Human Rights of the Organization of American States who now serves as dean of the Universidad de los Andes Faculty of Law.
- Katherine Chen - A communications scholar at the National Chengchi University who studies social media, mobile news and privacy, and a former national communications regulator in Taiwan.
- Nighat Dad - A digital rights advocate who offers digital security training to women in Pakistan and across South Asia to help them protect themselves against online harassment, campaigns against government restrictions on dissent, and received the Human Rights Tulip Award.
- Jamal Greene, co-chair - A Columbia Law professor who focuses on constitutional rights adjudication and the structure of legal and constitutional argument.
- Pamela Karlan - A Stanford Law professor and Supreme Court advocate who has represented clients in voting rights, LGBTQ+ rights, and First Amendment cases, and serves as a member of the board of the American Constitution Society.
- Tawakkol Karman - A Nobel Peace Prize laureate who used her voice to promote nonviolent change in Yemen during the Arab Spring, and was named as one of “History's Most Rebellious Women” by Time magazine.
- Maina Kiai - A director of Human Rights Watch’s Global Alliances and Partnerships Program and a former U.N. special rapporteur on the rights to freedom of peaceful assembly and of association who has decades of experience advocating for human rights in Kenya.
- Sudhir Krishnaswamy - A vice chancellor of the National Law School of India University who co-founded an advocacy organization that works to advance constitutional values for everyone, including LGBTQ+ and transgender persons, in India.
- Ronaldo Lemos - A technology, intellectual property and media lawyer who co-created a national internet rights law in Brazil, co-founded a nonprofit focused on technology and policy issues, and teaches law at the Universidade do Estado do Rio de Janeiro.
- Michael McConnell, co-chair - A former U.S. federal circuit judge who is now a constitutional law professor at Stanford, an expert on religious freedom, and a Supreme Court advocate who has represented clients in a wide range of First Amendment cases involving freedom of speech, religion and association.
- Julie Owono - A digital rights and anti-censorship advocate who leads Internet Sans Frontières and campaigns against internet censorship in Africa and around the world.
- Emi Palmor - A former director general of the Israeli Ministry of Justice who led initiatives to address racial discrimination, advance access to justice via digital services and platforms and promote diversity in the public sector.
- Alan Rusbridger - A former editor-in-chief of The Guardian who transformed the newspaper into a global institution and oversaw its Pulitzer Prize-winning coverage of the Edward Snowden disclosures.
- András Sajó - A former judge and vice president of the European Court of Human Rights who is an expert in free speech and comparative constitutionalism.
- John Samples - A public intellectual who writes extensively on social media and speech regulation, advocates against restrictions on online expression, and helps lead a libertarian think tank.
- Nicolas Suzor - A Queensland University of Technology Law School professor who focuses on the governance of social networks and the regulation of automated systems, and has published a book on internet governance.
- Helle Thorning-Schmidt, co-chair - A former prime minister of Denmark and one of the board’s four co-chairs.
Harvard University Political Economics Professor David Cutler Says Zuckerberg's Facebook Is Screwing Up
Over 140 scientists who have received funding from the philanthropic group, the Chan Zuckerberg Initiative, sent Facebook founder and CEO Mark Zuckerberg a letter on Saturday calling the social networking company out for its recent stance on President Donald Trump’s inflammatory posts about police brutality protesters.
“As scientists, we are dedicated to investigating ways to better our world,” reads the letter obtained by The Washington Post. “The spread of deliberate misinformation and divisive language is directly antithetical to this goal and we are therefore deeply concerned at the stance Facebook has taken.” The letter specifically calls out Trump’s social media post which said “when the looting starts, the shooting starts,” defining it as a “clear statement of inciting violence.”
Facebook and Zuckerberg himself have been criticized in recent days for defending the company’s decision to keep the post on its website. The inaction on the social networking site’s part was in direct conflict with the response of other platforms, such as Twitter, which hid the tweet behind a warning label.
“We urge you to consider stricter policies on misinformation and incendiary language that harms people or groups of people, especially in our current climate that is grappling with racial injustice,” the letter concluded. Also among its signatories were scientists funded by the Zuckerberg Biohub, which has recently been working on expanding testing for the coronavirus.
Employees at Facebook have spoken out against the company’s decision to leave up Trump’s post. Some even staged a digital walkout. At least two Facebook engineers resigned from the company outright. One software developer shared an email turning down a Facebook recruiter’s offer for a job opportunity with the company.
“The Chan Zuckerberg Initiative is a philanthropic organization started by Priscilla Chan and Mark Zuckerberg that is separate from Facebook,” said a spokesperson for the group in a statement provided to the Post. “We have a separate staff, separate offices, and a separate mission: to build a more inclusive, just, and healthy future for everyone through our work in science, education, and on issues related to justice and opportunity. We are grateful for our staff, partners and grantees in this work and we respect their right to voice their opinions, including on Facebook policies.”
Zuckerberg originally doubled down and tried to defend the company position. As of Friday, however, it appears he’s flipped, saying the company will now review its policies, particularly those concerning state violence.
The Chan Zuckerberg Initiative was originally founded in 2015 by the Facebook CEO and his wife, Priscilla. The organization has pledged $3 billion in order to “eradicate all disease.”
In the letter, the scientists also highlighted another important mission defined by the organization, one that runs in contrast to what Facebook’s policies facilitate on its platform. That mission is to use technology “to help solve some of our toughest challenges — from preventing and eradicating disease, to improving learning experiences for kids, to reforming the criminal justice system” and "to build a more inclusive, just, and healthy future for everyone.”
These scientists sure do have a good point.
Facebook Still Doesn’t Get It
No one man should have all that power.
Rashad Robinson, who has helped organize a high-profile advertising boycott of Facebook during the month of July, believes the social-media giant doesn’t really care about getting rid of hate on its platform. On the latest episode of New York’s Pivot Podcast, Kara Swisher and Scott Galloway talk to Robinson, who has helped spearhead the effort, about the gap between the company’s rhetoric and its actions.
Kara Swisher: Rashad Robinson is the executive director of Color of Change, the country’s largest racial-justice organization. Last week, he was part of a meeting with Facebook executives about the July ad boycott of Facebook, to discuss the demands he and those companies have made to the social-media platform. Mark Zuckerberg and Sheryl Sandberg were on the call, and he was not impressed by Zuckerberg’s performance. So Rashad, why don’t you give us a rundown of what happened.
Rashad Robinson: Before the meeting, we had shared the list of demands again, and the demands are not complicated. They’d been part of ongoing meetings and protests. Some of them have been highlighted in previous versions of the civil-rights audit that have come out over the past year and a half, two years. So we got there really with the goal of having them tell us what they thought and where they were heading, because they actually requested the meeting.
And you know, I’ve been in a lot of meetings with Facebook. I go into meetings with a lot of corporations, and they get trained on how to run out the clock. They have these strategies on how to have a meeting where they get you to talk a lot and then they don’t actually have to tell you anything new. And so I took the lead. I really sort of pushed him, like, “Hey, you’ve got the demands. We actually want to go through them.”
Swisher: Give me an example of one or two of the demands.
Robinson: So one is bringing in a C-suite civil-rights leader that has the budget and the ability to oversee and weigh in on product and new policy. Another was specifically to deal with their political-exemption policy and the way they talk out of both sides of their mouth.
On one hand, they’ll say there’s a political exemption, but they don’t really use it, and no one ever gets exempted. And then Donald Trump will get exempted. And then they’ll say, “Well, that’s because he didn’t violate the policy,” but they can’t ever tell you when he will violate the policy. It’s just like you’re talking in circles. That’s just another example of how you end up with the situation where we have spent years working on getting rules in place only for them to not enforce them when it actually matters.
And so I wanted them to go through this. My last meeting with Mark and Sheryl was on June 1, right after the “looters and shooters” post, right after those posts around voter suppression, where I, at the end of the meeting, was like, “What are we doing here? Why are we continuing to meet if I don’t feel like anything’s happening and if you’re trying to just explain to us why you’re working hard?” They spent a lot of time in the meeting telling us why they’re doing more than all the other social-media platforms.
Swisher: They’ve gone around to advertisers and said that too.
Robinson: They’re so much better. They’re working so much harder. They have done things that other folks won’t do. This is the kind of constant line. At some point, someone in the meeting said, “So, I guess what you’re saying is that you’re doing everything right and that we’re just crazy.” They’re like, “No, no, that’s not what we’re saying.” I’m like, “Well, what are you saying?”
Swisher: Their own audit said exactly what you were saying, which was that they have created a really dangerous situation by favoring their version of free speech over civil rights. Why do you think that is? You have spent time with them. If you were them, what would you do to fix their structure?
Robinson: I would separate the decisions about moderation and content from his global policy shop. There is not a scenario moving forward where Joel Kaplan overseeing this is going to be fine with anyone. If Zuckerberg replaces Joel Kaplan with someone else that has to oversee their relationships in Washington, other folks are not going to be comfortable with that.
The fact of the matter is if these decisions are made through the lens of how to keep policy-makers and policy leaders happy, then you’ve actually violated one of the tenets of fostering connection, because you are making decisions rooted in keeping powerful people and powerful forces comfortable and happy. It happens here in the United States, and we have a particular experience with it. But folks in other parts of the world have a different experience, where protests might be illegal, where speaking out might be illegal. The fact of the matter is that Facebook will tell us one thing about their intentions, but every single decision is rooted in profit and growth. Every single decision is through that lens.
Galloway: 100 percent.
Robinson: And so in order to keep profit and growth going, they actually have to stay friends with those in power.
Swisher: This is Scott’s opening, because this is one of the main points he makes all the time.
Galloway: First off, kudos to you and Color of Change. I was really skeptical that this boycott was going to have any impact, but it’s had more impact than almost any other effort I can see today. So first off, well done. Secondly, quite frankly, I’m not sure it’s going to do anything. Let’s speculate that if you call on Facebook’s better angels, no one’s home — and that you have to move back to applying financial pressure. Can you give us a sense of the state of the boycott and how you put pressure on the better angels of the people at organizations that spend money on Facebook?
Robinson: I think financial pressure is important as well as hopefully changing the political levers in Washington. That to me is the long game, because even this type of effort feels like something that we just can’t be constantly doing, going against the largest advertising platform the world has ever known. It just can’t simply be about asking advertisers to walk away. I’ve had a lot of conversations with advertisers, a lot of conversations recently with the Madison Avenue firms who manage advertisers, trying to continue to get a pulse of where folks are at. I think one thing that’s been really helpful here is that this conversation has trickled up to the board level at a lot of companies.
I also think that some of the things that Mark has said about advertisers coming back, some of the flip ways he has responded to this — it’s one thing for Mark to call us weak, for us to say he doesn’t have to think about what we are demanding. But you know, a bunch of corporate CEOs, at what point are you all going to stand up? At what point are you going to say that you’re not going to let this person walk all over you? I think that has been part of Facebook’s missteps. They have stepped on the ego of a lot of folks who have ego and who don’t want to be treated like they’re not valuable or their opinions don’t matter.
Swisher: One of the things is they don’t like Facebook. You can talk to most of them — they tolerate it because they need it, because it’s the only game in town.
So, two things I’d love to know. What do you think the impact right now is of what Facebook is doing on people of color? Because you have a group that’s not just people of color — you have the ADL, you’ve got the NAACP, you’ve got so many groups you’re working with. What is the impact on society right now for these continued — I would call them — abuses by Facebook?
Robinson: The technology that’s supposed to bring us into the future is in so many ways dragging us into the past. We had created a sense of social contracts around the ways that white nationalists could organize, right? They can’t organize at the Starbucks in a public space and have a meeting. They couldn’t do things out in public, but the incentive structures at Facebook have allowed people to not only organize, but … A 15-year-old that is searching for one thing runs into some white-nationalist content and then goes down a hole because they get served more and more of this content. Because of the way the algorithms are set up, people are almost indoctrinated into these ideas that we’ve tried to put at the margins. Facebook has created a space that feels like home, that makes these things comfortable, that makes these things acceptable. And to that extent, they’ve been damaging.
At the same time, Facebook has refused to be accountable. I was having a conversation with Alicia Garza, who’s one of the co-founders of Black Lives Matter. Alicia famously posted “Black Lives Matter” on Facebook right after the Zimmerman verdict.
Swisher: Which got it started.
Robinson: Mark talks about it. He talked about it in his free-expression speech at Georgetown. And Alicia gets regular death threats on Facebook. She has to go through the same decision tree that anyone else has to go through. She’s had about 20 death threats over the last several months. And Facebook has declined to take action on every one of them through automation. They say something about how it doesn’t violate terms. And she’s never gotten a phone call from Facebook, no outreach, no engagement that one would expect. This is Alicia, who’s on TV, who is well known — and Facebook actually uses her name. They use her work in the cases they make around this, and they don’t even respond to the attacks that she’s getting. It’s because they don’t care. The same way Mark can say that these Fortune 500 advertisers don’t matter, he’s on the other hand saying that Black activists’ voices don’t matter either. This is, one would imagine, how he would have treated SNCC organizers, how he would have treated the civil-rights leaders that we lionize today in terms of the ways in which they were attacked and targeted. All of this is because you’ve got this person that has far too much control and believes that they, and they alone, understand what’s right. We don’t actually have the leverage to challenge them. And so I really appreciate what you said around the boycott. I feel really proud of what we’ve been able to do. But part of this, from my perspective, has always been about raising the level of attention and energy and focus so that we can advance the real conversation about 21st-century rules of the road. It’s not just Facebook. It is that all of these platforms, if left to their own devices, will rely on the wrong set of incentive structures because profit and growth are key drivers to why they exist.
Galloway: What are the one or two things any of the 3.5 billion Facebook users could do right now if they wanted to be supportive of your actions? What’s the call to action?
Robinson: A couple of things. I think that folks need to, first and foremost, vote in this upcoming election. I think that people need to make sure that politicians know that we want to hold big institutions accountable and that we vote, because the long game is a new set of rules and we just don’t get that by wishing. The second thing, I think, for folks who are actively using Facebook, is that if they see negative content, if they see content that’s hateful and they see an advertiser next to it, send that to the advertiser. Advertisers need to consistently hear from consumers — why are you sponsoring this type of content? Why do you have your brands next to this type of content? The vast majority, the overwhelming majority of advertisers are not trying to have their stuff next to this.
But Facebook is telling them one story and there’s a totally different story that’s actually happening. And then finally, I think that all of us have to be really active users about the content that’s coming our way. What are we clicking on? What are we sharing? What are we engaging with? Because the level of disinformation and misinformation that’s going to be on the platform as we head into this election is going to be outrageous. We all, in our day-to-day lives, can play a role in disrupting that and pushing back on that.
Swisher: And what is your next move? More boycotts? Continuing the pressure?
Robinson: Continuing the pressure. July 27, Mark testifies in front of Congress on antitrust issues. A corporation that has become so big and powerful that they don’t listen to major corporations, where they don’t have to listen to social-justice leaders, means that there are questions about whether the platform has become too powerful, and whether it needs new rules. I think that’s the next phase in this work. The problem for Facebook is that they are asking people to trust them and big companies to trust them. And I think the message I have for big companies is: Do you think that they’re going to embarrass you? Because I have a quick answer for you. They will. And so just know that time and time again, they have no problem with embarrassing you, embarrassing your brand.
Swisher: Rashad, thank you so much. I don’t know what to say. It’s great to hear a voice like yours. Your whole group is fantastic. You all should pay attention, and advertisers should absolutely be paying attention to this as we’re going forward. And anything we can do to help, we certainly will.
Robinson: Appreciate you. Thank you.
Pivot is produced by Rebecca Sananes. Erica Anderson is the executive producer.
This transcript has been edited for length and clarity.
Sen. Ted Cruz on Fox Blasts Big Tech Censorship: We Are Seeing Silicon Valley Billionaires Drunk With Power
Sen. Ted Cruz: Big Tech is Drunk On Their Power
Appears on CNBC, Fox News, 'The Dana Show,' and Axios podcast Re:Cap to discuss developments in Big Tech censorship
HOUSTON, Texas - U.S. Sen. Ted Cruz (R-Texas) spoke with Fox News' Harris Faulkner, Dana Loesch, Axios' Dan Primack, CNBC's Kelly Evans and Dom Chu about Big Tech and Twitter's unprecedented censorship of the New York Post's reporting on the Bidens.
On Big Tech's blatant election interference:
"The New York Post has the fourth largest circulation of any newspaper in America, and right now, Jack Dorsey is just behaving as Joe Biden's press secretary. It is censorship. It is wrong." (Sen. Cruz, Fox News, 10/15/2020)
"It's not their role to determine who gets to speak and who doesn't. And by the way, if the emails are fake, you know what? The New York Post can be sued for defamation. If a media outlet puts out defamation, they can be sued. You know who can't be sued? Twitter. They're protected by Section 230. [...] Big Tech has a monopoly position. We have right now roughly 70 percent of Americans getting their political news via social media, so if Big Tech blocks it - and on Twitter, not only can you not tweet it out, they're blocking the New York Post from reaching their own followers and tweeting out their own story." (Sen. Cruz, Axios' Re:Cap, 10/15/2020)
"We're 19 days out from an election, and so the reason I focused on Jack Dorsey and Twitter is because Twitter is blocking the China story. Facebook right now is not blocking the China story. So Dorsey has doubled down. I think Big Tech, frankly, is just drunk on their power." (Sen. Cruz, Axios' Re:Cap, 10/15/2020)
On potential remedies for Big Tech's censorship:
"Big Tech has this special immunity from liability, Section 230. [...] They don't have any entitlement to that subsidy. They shouldn't be immune from liability, particularly if they're going to engage in blatant censorship. So that's number one.
"Number two is the antitrust laws, and the antitrust laws have been on the books for decades. And if you look at the Big Tech giants, you look at Google, YouTube, Facebook, Twitter, they have power. They have a market cap that by any measure, they're bigger and more powerful than Standard Oil was when it was broken up under the antitrust laws. [...] You can't abuse monopoly power, and that's existing law.
"And third is consumer fraud and deception. And their basic promise is that if you follow someone, you'll see what they say. If they follow you, they'll see what you say. They're breaking their promise. They're deceiving consumers, and if you lie to consumers, you can be held liable for that fraud." (Sen. Cruz, The Dana Show, 10/15/2020)
"I expect trial lawyers will be making the argument that they've abandoned the protections of Section 230. [...] But number two, from the Department of Justice and the Federal Trade Commission, and I have spoken repeatedly with the Attorney General, the Deputy Attorney General, the Assistant Attorney General for Antitrust, the Chairman of the FTC, the President of the United States, the Vice President; all about directing the enforcement authority of the administration to go after Big Tech and stop them from abusing their monopoly power, and they are monopolies under the antitrust laws, they are abusing their monopoly power." (Sen. Cruz, CNBC, 10/15/2020)
On the questions Jack Dorsey needs to answer:
"What are your standards, Mr. Dorsey? Why are you choosing to silence this media story and not other stories? How do you determine who gets to speak in America and who doesn't? And how do you get to decide that you can use your corporate treasury to give what is, in effect, a multi-million dollar campaign donation to the Biden campaign, just 7 days out from an election?"