Facebook sure does love free $peech
If Facebook just wants to allow politicians to spout mistruths and conspiracy theories without fact checks on its platforms as status updates from their personal pages, then some of these claims could be taken more seriously, but Facebook is getting paid to push these messages to its users. It’s algorithmically deciding where these messages go based on parameters set by the campaigns via a system it designed.
Before you sound off, yeah, political advertising isn’t anything new. I am well aware that TV channels and newspapers have carried messy attack ads and hauled in the advertising revenues for decades, but Facebook is a platform designed around scale. Scale has allowed the company to tap massive revenue streams, but it has also opened the company up to critique. The company has learned to respect this scale after sizable external pressure, but it has always defaulted to dated comparisons when doing so is profitable.
Newspaper and TV political ads are painted with a wider brush and are subject to more stringent laws, but there’s a responsibility in Facebook’s precise ad-targeting that the company still doesn’t seem to respect. The company has the tools to push out judgment calls on content, and it could still do so on a case-by-case basis. Some truths are buried in more nuance than others, but by painting all political claims with the same brush of indifference to truth, Facebook is abusing its scale and creating a platform where a politician’s speech is exempt, as if political leaders aren’t the ultimate primary sources on politically contentious matters.
Political advertising legislation is going to take far too long to catch up to the current landscape of technology platforms — it would be nice if we could trust Facebook to stay at a moral forefront that isn’t legally mandated. Twitter and YouTube aren’t immune to this same criticism either, but Facebook is operating in broad daylight, believing it can reverse-engineer a free-expression mission statement to keep its responsibility-free revenues from leaking out.
Facebook hired me to head Elections Integrity ops for political ads. I asked if we could scan ads for misinfo. Engineers had great ideas. Higher ups were silent. Free speech is b.s. answer when FB takes $ for ads. Time to regulate ads same as tv and print.
CHP chief investigated over transphobic Facebook post about Caitlyn Jenner
A California Highway Patrol chief is under investigation after sharing a post on social media that demeaned transgender star Caitlyn Jenner and her gender transition.
The investigation into Chief Mark Garrett was initiated Monday after a Times reporter showed officials at CHP headquarters a message that the veteran highway patrol supervisor posted on his personal Facebook page.
The entry, which Garrett posted in April 2017, shows a photo of Jenner that is overlaid with a transphobic and vulgar message. In bold type on Jenner’s image, it reads, “Anyone who says I’m not a lady can,” and then suggests the reader perform a sex act.
Jenner, formerly known as Bruce Jenner, underwent sex reassignment surgery in January 2017 and transitioned to become Caitlyn Jenner, a trans woman. The former Olympic decathlon champion and parent of Kylie and Kendall Jenner of “Keeping Up With the Kardashians” fame has become one of the public faces of the transgender community.
When confronted about the post, Garrett initially said he did not remember it.
“I have no recollection of it,” he said. “I am on Facebook very rarely.… If I shared it, I shared.”
He later acknowledged that he knew the woman who originally posted the image but said he did not recall sharing it, or noticing the comments that several of his friends added further demeaning transgender people.
Garrett said the posting doesn’t “reflect how I feel” or the department’s values.
“That was a personal Facebook page, and it has nothing to do with the CHP,” said Garrett, the chief of the Los Angeles-area CHP.
The department has launched an investigation into the post and denounced the image. The probe comes as prosecutors are reviewing whether CHP officers in the East L.A. station broke the law, following an announcement in February by Garrett that dozens of officers had been relieved of duty amid allegations of falsified overtime.
“While the post in question appeared on a personal Facebook page, which CHP policy does not specifically address, the post is not consistent with the department’s organizational values,” spokeswoman Fran Clader said.
“The CHP is an organization of inclusiveness, and any posts made on an employee’s personal social media page do not reflect the diversity, views and background of the more than 11,000 men and women of the California Highway Patrol who work for this department.”
The Times attempted to contact Garrett on Tuesday and received an automated reply stating the chief would be out of the office through June 25.
Julie Callahan, the founder of TCops International — Transgender Community of Police and Sheriffs, said the posting “is obviously offensive and vulgar.”
But Callahan, a retired San Jose police detective, said the greater problem is the larger message it sends because it was posted by a police leader.
“It is of concern because he has transgender officers in his area,” she said. She added that she has counted a dozen executive-level cops who have lost their positions over things they shared on social media.
While the CHP says it has no specific policy on personal social media use by officers, many police agencies have adopted strict guidelines on online activity. The New York Police Department disciplined 17 officers in 2012 over offensive comments about a West Indian American Day parade.
A court of appeal last year, in upholding a five-day suspension for a Los Angeles police officer over a remark he wrote on Facebook, said the LAPD had the right to discipline him for his online conduct.
Ed Obayashi, a Plumas County deputy and legal advisor, said he is surprised the CHP doesn’t have a personal social media policy in place, because it is probably the hottest issue in internal affairs.
“That is the best practices standard in the industry, to have a clear social media policy for officers’ personal usage,” he said. Obayashi said that, given the role of social media, officers need to understand what they should not post.
Obayashi said comments of a discriminatory nature undermine the very job officers do. “Regardless of officers having 1st Amendment rights, whether on or off duty, 1st Amendment considerations have to be balanced against the legitimate department need to be impartial.”
In recent years, agencies have struggled with officers’ use of social media. In North Charleston, S.C., a police officer was fired for posting a photo of himself wearing Confederate flag underwear. The post was discovered in the wake of the killing of nine black churchgoers by a white supremacist. The officer sued over his termination and was awarded a settlement.
A Philadelphia attorney has launched an extensive examination of the private social media accounts of nearly 3,000 law enforcement officers from eight departments nationwide. The resulting database is intended to show how hundreds of racist or bigoted comments and images from officers’ posts undermine the public trust, the project’s website says.
Mayor deletes Facebook account after posting hate-filled rant
The mayor of a small Alabama town deleted his Facebook account
amid public outrage over a hate-filled post where he suggested society
should deal with “homosexuals” and “baby killers” by “killing them out.”
“We live in a society where homosexuals lecture us on morals, transvestites lecture us on human biology, baby killers lecture us on human rights and socialists lecture us on economics,” Carbon Hill mayor Mark Chambers fumed in a Facebook post last Friday, according to WBRC.
A Facebook commenter added fuel to the fire, responding to his post, “By giving the minority more rights than the majority. I hate to think of the country my grandkids will live in unless somehow we change and I think that will take a revolution.”
Chambers continued his rant by replying to the comment, “The only way to change it would be to kill the problem out. I know it’s bad to say but without killing them out there’s no way to fix it.”
The mayor’s Facebook post — and now his account — have since been deleted, but not before Chambers first denied making the remarks to a WBRC reporter. He later admitted to and defended his actions.
“I don’t think I posted that. I think that’s somebody else’s post,” he initially told the reporter in a Monday phone call, before hanging up.
But minutes later, Chambers phoned the reporter back, called immigrants “ungrateful,” and claimed his “killing them out” comments were in reference to a potential American civil war where everybody would die.
“That’s in a revolution. That’s right! If it comes to a revolution in this country both sides of these people will be killed out,” he said.
I had enough of Cory Gestapo Booker and Kamala Communist Harris calls to break up Facebook damn more Government control of the damn internet in the great words of James Traficant ' Beam Me Up Mr. Speaker'
Facebook does not need Government Control.
What Facebook needs to block terrorists groups!
James Traficant 'Scotty Beam Me Up Mr. Speaker' facebook twitter google+ Is allowing terrorists groups without being block
I will Say this Again In the Words OF James Traficant 'Beam Me Up Mr. Speaker'
Cory Booker and Kamala Harris address calls to break up Facebook
Calls to break up Facebook are increasing, but Booker and Harris aren’t sure that’s the right approach.
Two Democratic presidential candidates, Sens. Cory Booker and Kamala Harris, faced questions about Facebook’s size and power Sunday following the publication of an op-ed by Facebook co-founder Chris Hughes, who wrote that Facebook CEO Mark Zuckerberg has been put into a position of “unprecedented and un-American” power.
Hughes went on to explicitly call for the company to be broken up. Democratic presidential candidate Sen. Elizabeth Warren has made it clear she is for breaking up the social media company. Booker and Harris have not gone as far, and both have links to the tech industry that Warren lacks; Booker has close political and financial ties to Silicon Valley, and Harris actually represents the industry’s center in Congress.
It has become clear that Facebook’s role will be a hot issue in the presidential campaign, and if Democrats win in 2020, a new Democratic administration would have a number of options to choose from when it comes to tech titans, from pushing for new regulations to working to break up current Silicon Valley giants. In his piece, Hughes advocated for the latter option, in part because he believes regulations would not be enough to disrupt Facebook’s current ways of doing business.
“Facebook isn’t afraid of a few more rules,” Hughes wrote. “It’s afraid of an antitrust case and the kind of accountability that real government oversight would bring.”
Booker: Calling for Facebook’s breakup ‘sounds more like a Donald Trump thing’
Appearing on ABC’s This Week, Booker was asked about Hughes’s piece. Booker’s own background as a Stanford classmate of many of the modern Internet’s founding figures — and as someone who had friendly relations with Facebook and other companies in the pre-Trump era — puts him in a position of having to balance his personal relationships with the sorts of calls for change Hughes and Warren have added to political discourse.
Initially, Booker framed Facebook’s dominance as part of a wider issue of corporate consolidation across multiple sectors.
“I don’t care if it’s Facebook, the pharma industry, even the agricultural industry,” the senator said. “We’ve had a problem in America with corporate consolidation, that is having really ill effects. It’s driving out the independent family farmer. It’s driving up our prescription drug costs. And in the realm of technology, we’re seeing ... one or two companies, controlling a significant amount of the online advertising.”
When pressed on the specific idea that Facebook should be broken up, Booker signaled he believes tech regulation ought to be a task shared by a number of agencies. Specifically, he said he would have the Justice Department pursue appropriate antitrust investigations and hold industry accountable.
When asked about Warren’s proposal to break up tech giants, Booker said he is not completely in agreement with the idea.
“I don’t think that a president should be running around, pointing at companies and saying break them up without any kind of process here,” Booker said. “Do I think it is a massive problem in America, corporate consolidation? Absolutely. It’s about making sure that we have a system that works.”
He continued: “It’s not me and my own personal opinion about going after folks. That sounds more like a Donald Trump thing to say: I’m going to break up you guys, I’m gonna break – no.”
Jonathan Karl, who was conducting the interview, was clearly surprised, and responded, “You just compared Elizabeth Warren to Donald Trump.”
“I — I — I most certainly did not, she is my friend,” Booker said, before going on to add, “Let her discuss and debate her positions. I’m telling you right now, we do not need a president that is going to use their own personal beliefs and tell you which companies we should break up. We need a president that’s going to enforce antitrust laws in this country, and I will be that person.”
Harris: Facebook is a ‘utility that has gone unregulated’
Meanwhile, on CNN’s State of the Union, Harris (D-CA) — who was being interviewed not too far away from Silicon Valley itself, as host Jake Tapper noted — raised a possibility for a major regulatory change: treating Facebook as a public utility.
“I think that Facebook has experienced massive growth, and has prioritized its growth over the best interests of its consumers — especially on the issue of privacy. There is no question in my mind that there needs to be serious regulation, and that has not been happening. There needs to be more oversight; that has not been happening.”
Sen. Kamala Harris says “Facebook has experienced massive growth and has prioritized its growth over the best interests of its consumers … It is essentially a utility that has gone unregulated.” #CNNSOTU pic.twitter.com/LADYpEaT7h — CNN Politics (@CNNPolitics), May 12, 2019
Harris declined to explicitly say that Facebook should be broken up, but she did say she remains open to exploring that possibility.
“I think we have to seriously take a look at that, yes. When you look at the issue, they’re essentially a utility,” Harris said. “There are very few people that can actually get by, and be involved in their communities and society, or in whatever their profession, without somehow, somewhere using Facebook. It’s very difficult for people to be engaged in any level of commerce without it. So, we have to recognize it for what it is: It is essentially a utility that has gone unregulated. And as far as I’m concerned, that’s got to stop.”
As presidential candidates discuss Facebook, the FTC is also debating what to do about the company
While Harris, Booker, and Warren have different theoretical approaches to dealing with Facebook, the FTC is wrestling with what to do with Facebook now.
The company is set to pay a fine of anywhere from $3 billion to $5 billion to the Federal Trade Commission (FTC) over its handling of user data and privacy violations in the wake of a scandal involving Cambridge Analytica, a data firm that worked for the Trump presidential campaign.
While the fine under consideration is incredibly large, it might not actually hurt a company of Facebook’s size.
Not only can Facebook pay the money, but its stock price actually went up after the expected fine was revealed. Such fines can become a normal cost of doing business — and potentially provide a market benefit by getting rid of uncertainty.
Two of Harris, Booker, and Warren’s colleagues, Sens. Richard Blumenthal (D-CT) and Josh Hawley (R-MO), wrote a bipartisan letter to the FTC chairman last week, declaring that the fine under consideration is not enough, while arguing the commission needs to levy a more punishing fine that would send a message to Facebook and other tech companies.
The letter also requested that concrete rules be put in place to restrict Facebook’s collection of data and sharing across its different platforms.
Blumenthal and Hawley further added ominously that individual executives such as Zuckerberg should be personally on the line for any future violations:
As important as remedies on Facebook as a company are, the FTC should impose tough accountability measures and penalties for individual executives and management responsible for violations of the consent order and for privacy failures. Personal responsibility must be recognized from the top of the corporate board down to the product development teams.
The FTC has been investigating Facebook for more than a year; it is not clear when it will make an announcement about the fine, although commissioners are expected to make a decision ahead of the 2020 election.
It’s Time to Break Up Facebook
By Chris Hughes
May 9, 2019
Credit: Jessica Chou for The New York Times (Zuckerberg); Damon Winter/The New York Times (Hughes)
The Privacy Project
The last time I saw Mark Zuckerberg was in the summer of 2017, several months before the Cambridge Analytica scandal broke. We met at Facebook’s Menlo Park, Calif., office and drove to his house, in a quiet, leafy neighborhood. We spent an hour or two together while his toddler daughter cruised around. We talked politics mostly, a little about Facebook, a bit about our families. When the shadows grew long, I had to head out. I hugged his wife, Priscilla, and said goodbye to Mark.
Since then, Mark’s personal reputation and the reputation of Facebook have taken a nose-dive. The company’s mistakes — the sloppy privacy practices that dropped tens of millions of users’ data into a political consulting firm’s lap; the slow response to Russian agents, violent rhetoric and fake news; and the unbounded drive to capture ever more of our time and attention — dominate the headlines.
It’s been 15 years since I co-founded Facebook at Harvard, and I haven’t worked at the company in a decade. But I feel a sense of anger and responsibility.
Mark is still the same person I watched hug his parents as they left our dorm’s common room at the beginning of our sophomore year. He is the same person who procrastinated studying for tests, fell in love with his future wife while in line for the bathroom at a party and slept on a mattress on the floor in a small apartment years after he could have afforded much more. In other words, he’s human. But it’s his very humanity that makes his unchecked power so problematic.
Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government. He controls three core communications platforms — Facebook, Instagram and WhatsApp — that billions of people use every day. Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares.
Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.
Mark is a good, kind person. But I’m angry that his focus on growth led him to sacrifice security and civility for clicks. I’m disappointed in myself and the early Facebook team for not thinking more about how the News Feed algorithm could change our culture, influence elections and empower nationalist leaders. And I’m worried that Mark has surrounded himself with a team that reinforces his beliefs instead of challenging them.
The government must hold Mark accountable. For too long, lawmakers have marveled at Facebook’s explosive growth and overlooked their responsibility to ensure that Americans are protected and markets are competitive. Any day now, the Federal Trade Commission is expected to impose a $5 billion fine on the company, but that is not enough; nor is Facebook’s offer to appoint some kind of privacy czar.
After Mark’s congressional testimony last year, there should have been calls for him to truly reckon with his mistakes. Instead the legislators who questioned him were derided as too old and out of touch to understand how tech works. That’s the impression Mark wanted Americans to have, because it means little will change.
We are a nation with a tradition of reining in monopolies, no matter how well intentioned the leaders of these companies may be. Mark’s power is unprecedented and un-American.
It is time to break up Facebook.
We already have the tools we need to check the domination of Facebook. We just seem to have forgotten about them.
America was built on the idea that power should not be concentrated in any one person, because we are all fallible. That’s why the founders created a system of checks and balances. They didn’t need to foresee the rise of Facebook to understand the threat that gargantuan companies would pose to democracy. Jefferson and Madison were voracious readers of Adam Smith, who believed that monopolies prevent the competition that spurs innovation and leads to economic growth.
A century later, in response to the rise of the oil, railroad and banking trusts of the Gilded Age, the Ohio Republican John Sherman said on the floor of Congress: “If we will not endure a king as a political power, we should not endure a king over the production, transportation and sale of any of the necessities of life. If we would not submit to an emperor, we should not submit to an autocrat of trade with power to prevent competition and to fix the price of any commodity.”
The Sherman Antitrust Act of 1890 outlawed monopolies. More legislation followed in the 20th century, creating legal and regulatory structures to promote competition and hold the biggest companies accountable. The Department of Justice broke up monopolies like Standard Oil and AT&T.
For many people today, it’s hard to imagine government doing much of anything right, let alone breaking up a company like Facebook. This isn’t by coincidence.
Starting in the 1970s, a small but dedicated group of economists, lawyers and policymakers sowed the seeds of our cynicism. Over the next 40 years, they financed a network of think tanks, journals, social clubs, academic centers and media outlets to teach an emerging generation that private interests should take precedence over public ones. Their gospel was simple: “Free” markets are dynamic and productive, while government is bureaucratic and ineffective. By the mid-1980s, they had largely managed to relegate energetic antitrust enforcement to the history books.
This shift, combined with business-friendly tax and regulatory policy, ushered in a period of mergers and acquisitions that created megacorporations. In the past 20 years, more than 75 percent of American industries, from airlines to pharmaceuticals, have experienced increased concentration, and the average size of public companies has tripled. The results are a decline in entrepreneurship, stalled productivity growth, and higher prices and fewer choices for consumers.
The same thing is happening in social media and digital communications. Because Facebook so dominates social networking, it faces no market-based accountability. This means that every time Facebook messes up, we repeat an exhausting pattern: first outrage, then disappointment and, finally, resignation.
In 2005, I was in Facebook’s first office, on Emerson Street in downtown Palo Alto, when I read the news that Rupert Murdoch’s News Corporation was acquiring the social networking site Myspace for $580 million. The overhead lights were off, and a group of us were pecking away on our keyboards, our 21-year-old faces half-illuminated by the glow of our screens. I heard a “whoa,” and the news then ricocheted silently through the room, delivered by AOL Instant Messenger. My eyes widened. Really, $580 million?
Facebook was competing with Myspace, albeit obliquely. We were focused on college students at that point, but we had real identities while Myspace had fictions. Our users were more engaged, visiting daily, if not hourly. We believed Facebook surpassed Myspace in quality and would easily displace it given enough time and money. If Myspace was worth $580 million, Facebook could be worth at least double.
Journalists kept close watch as social media became big business.
From our earliest days, Mark used the word “domination” to describe our ambitions, with no hint of irony or humility.
Back then, we competed with a whole host of social networks, not just Myspace, but also Friendster, Twitter, Tumblr, LiveJournal and others.
The pressure to beat them spurred innovation and led to many of the features that distinguish Facebook: simple, beautiful interfaces, the News Feed, a tie to real-world identities and more.
It was this drive to compete that led Mark to acquire, over the years, dozens of other companies, including Instagram and WhatsApp in 2012 and 2014. There was nothing unethical or suspicious, in my view, in these moves.
One night during the summer of the Myspace sale, I remember driving home from work with Mark, back to the house we shared with several engineers and designers. I was in the passenger seat of the Infiniti S.U.V. that our investor Peter Thiel had bought for Mark to replace the unreliable used Jeep that he had been driving. As we turned right off Valparaiso Avenue, Mark confessed the immense pressure he felt. “Now that we employ so many people …” he said, trailing off. “We just really can’t fail.”
Facebook had gone from a project developed in our dorm room and chaotic summer houses to a serious company with lawyers and a human resources department. We had around 50 employees, and their families relied on Facebook to put food on the table. I gazed out the window and thought to myself, It’s never going to stop. The bigger we get, the harder we’ll have to work to keep growing.
Over a decade later, Facebook has earned the prize of domination. It is worth half a trillion dollars and commands, by my estimate, more than 80 percent of the world’s social networking revenue. It is a powerful monopoly, eclipsing all of its rivals and erasing competition from the social networking category. This explains why, even during the annus horribilis of 2018, Facebook’s earnings per share increased by an astounding 40 percent compared with the year before. (I liquidated my Facebook shares in 2012, and I don’t invest directly in any social media companies.)
Facebook’s monopoly is also visible in its usage statistics. About 70 percent of American adults use social media, and a vast majority are on Facebook products. Over two-thirds use the core site, a third use Instagram, and a fifth use WhatsApp. By contrast, fewer than a third report using Pinterest, LinkedIn or Snapchat. What started out as lighthearted entertainment has become the primary way that people of all ages communicate online.
Dominating the Market
The total number of users across Facebook’s platforms far exceeds the number on any rival platform.
[Chart of monthly active users. Platforms owned by Facebook: Facebook, 2.3 billion; Messenger, 1.3 billion; Instagram, 1.1 billion. Rivals: YouTube, 1.9 billion; WeChat, 1.1 billion; TikTok, 500 million; LinkedIn, 303 million; Snapchat, 287 million.]
Even when people want to quit Facebook, they don’t have any meaningful alternative, as we saw in the aftermath of the Cambridge Analytica scandal. Worried about their privacy and lacking confidence in Facebook’s good faith, users across the world started a “Delete Facebook” movement. According to the Pew Research Center, a quarter deleted their accounts from their phones, but many did so only temporarily. I heard more than one friend say, “I’m getting off Facebook altogether — thank God for Instagram,” not realizing that Instagram was a Facebook subsidiary. In the end people did not leave the company’s platforms en masse. After all, where would they go?
Facebook’s dominance is not an accident of history. The company’s strategy was to beat every competitor in plain view, and regulators and the government tacitly — and at times explicitly — approved. In one of the government’s few attempts to rein in the company, the F.T.C. in 2011 issued a consent decree that Facebook not share any private information beyond what users already agreed to. Facebook largely ignored the decree. Last month, the day after the company predicted in an earnings call that it would need to pay up to $5 billion as a penalty for its negligence — a slap on the wrist — Facebook’s shares surged 7 percent, adding $30 billion to its value, six times the size of the fine.
The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved. Neither Instagram nor WhatsApp had any meaningful revenue, but both were incredibly popular. The Instagram acquisition guaranteed Facebook would preserve its dominance in photo networking, and WhatsApp gave it a new entry into mobile real-time messaging.
Now, the founders of Instagram and WhatsApp have left the company after clashing with Mark over his management of their platforms. But their former properties remain Facebook’s, driving much of its recent growth.
When it hasn’t acquired its way to dominance, Facebook has used its monopoly position to shut out competing companies or has copied their technology. The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2013, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.
Snapchat
posed a different threat. Snapchat’s Stories and impermanent messaging
options made it an attractive alternative to Facebook and Instagram. And
unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem;
there was no obvious way to handicap the company or shut it out.
So
Facebook simply copied it.
Facebook’s
version of Snapchat’s stories and disappearing messages proved wildly
successful, at Snapchat’s expense.
At an all-hands meeting in 2016, Mark
told Facebook employees not to let their pride get in the way of giving
users what they want. According to Wired magazine, “Zuckerberg’s message became an informal slogan at Facebook: ‘Don’t be too proud to copy.’”
(There
is little regulators can do about this tactic: Snapchat patented its
“ephemeral message galleries,” but copyright law does not extend to the
abstract concept itself.)
As
a result of all this, would-be competitors can’t raise the money to
take on Facebook. Investors realize that if a company gets traction,
Facebook will copy its innovations, shut it down or acquire it for a relatively modest sum.
So despite an extended economic expansion, increasing interest in
high-tech start-ups, an explosion of venture capital and growing public
distaste for Facebook, no major social networking company has been
founded since the fall of 2011.
As
markets become more concentrated, the number of new start-up businesses
declines. This holds true in other high-tech areas dominated by single
companies, like search (controlled by Google) and e-commerce (taken over
by Amazon). Meanwhile, there has been plenty of innovation in areas
where there is no monopolistic domination, such as in workplace
productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber,
Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).
I
don’t blame Mark for his quest for domination. He has demonstrated
nothing more nefarious than the virtuous hustle of a talented
entrepreneur. Yet he has created a leviathan that crowds out
entrepreneurship and restricts consumer choice. It’s on our government
to ensure that we never lose the magic of the invisible hand. How did we allow this to happen?
Since the 1970s, courts have become increasingly hesitant
to break up companies or block mergers unless consumers are paying
inflated prices that would be lower in a competitive market. But a
narrow reliance on whether or not consumers have experienced price
gouging fails to take into account the full cost of market domination.
It doesn’t recognize that we also want markets to be competitive to
encourage innovation and to hold power in check.
And it is out of step
with the history of antitrust law. Two of the last major antitrust
suits, against AT&T and IBM in the 1980s, were grounded in the
argument that they had used their size to stifle innovation and crush
competition.
As the Columbia law professor Tim Wu writes,
“It is a disservice to the laws and their intent to retain such a
laserlike focus on price effects as the measure of all that antitrust
was meant to do.”
Facebook is the
perfect case on which to reverse course, precisely because Facebook
makes its money from targeted advertising, meaning users do not pay to
use the service. But it is not actually free, and it certainly isn’t
harmless.
Facebook’s
business model is built on capturing as much of our attention as
possible to encourage people to create and share more information about
who they are and who they want to be. We pay for Facebook with our data
and our attention, and by either measure it doesn’t come cheap.
I
was on the original News Feed team (my name is on the patent), and that
product now gets billions of hours of attention and pulls in unknowable
amounts of data each year. The average Facebook user spends an hour a day
on the platform; Instagram users spend 53 minutes a day scrolling
through pictures and videos.
They create immense amounts of data — not
just likes and dislikes, but how many seconds they watch a particular
video — that Facebook uses to refine its targeted advertising. Facebook
also collects data from partner companies and apps, without most users knowing about it, according to testing by The Wall Street Journal. Some
days, lying on the floor next to my 1-year-old son as he plays with his
dinosaurs, I catch myself scrolling through Instagram, waiting to see
if the next image will be more beautiful than the last. What am I doing?
I know it’s not good for me, or for my son, and yet I do it anyway. The
choice is mine, but it doesn’t feel like a choice. Facebook seeps into
every corner of our lives to capture as much of our attention and data
as possible and, without any alternative, we make the trade. The
vibrant marketplace that once drove Facebook and other social media
companies to compete to come up with better products has virtually
disappeared. This means there’s less chance of start-ups developing
healthier, less exploitative social media platforms. It also means less
accountability on issues like privacy. Just last month, Facebook seemingly tried to bury news
that it had stored tens of millions of user passwords in plain text
format, which thousands of Facebook employees could see. Competition
alone wouldn’t necessarily spur privacy protection — regulation is
required to ensure accountability — but Facebook’s lock on the market
guarantees that users can’t protest by moving to alternative platforms.
The most problematic aspect of Facebook’s power
is Mark’s unilateral control over speech. There is no precedent for his
ability to monitor, organize and even censor the conversations of two
billion people.
Facebook engineers
write algorithms that select which users’ comments or experiences end up
displayed in the News Feeds of friends and family. These rules are
proprietary and so complex that many Facebook employees themselves don’t
understand them.
In 2014, the rules
favored curiosity-inducing “clickbait” headlines. In 2016, they enabled
the spread of fringe political views and fake news, which made it
easier for Russian actors to manipulate the American electorate.
In
January 2018, Mark announced that the algorithms would favor non-news
content shared by friends and news from “trustworthy” sources, which his
engineers interpreted — to the confusion of many — as a boost for anything in the category of “politics, crime, tragedy.”
Facebook
has responded to many of the criticisms of how it manages speech by
hiring thousands of contractors to enforce the rules that Mark and
senior executives develop. After a few weeks of training, these
contractors decide which videos count as hate speech or free speech,
which images are erotic and which are simply artistic, and which live
streams are too violent to be broadcast. (The Verge reported
that some of these moderators, working through a vendor in Arizona,
were paid $28,800 a year, got limited breaks and faced significant
mental health risks.)
As if Facebook’s opaque algorithms weren’t enough,
last year we learned that Facebook executives had permanently deleted
their own messages from the platform, erasing them from the inboxes of
recipients; the justification was corporate security concerns. When I
look at my years of Facebook messages with Mark now, it’s just a long
stream of my own light-blue comments, clearly written in response to
words he had once sent me.
(Facebook now offers a limited version of
this feature to all users.)
The most extreme example of Facebook manipulating speech happened in Myanmar in late 2017. Mark said in a Vox interview
that he personally made the decision to delete the private messages of
Facebook users who were encouraging genocide there.
“I remember, one
Saturday morning, I got a phone call,” he said, “and we detected that
people were trying to spread sensational messages through — it was
Facebook Messenger in this case — to each side of the conflict,
basically telling the Muslims, ‘Hey, there’s about to be an uprising of
the Buddhists, so make sure that you are armed and go to this place.’
And then the same thing on the other side.”
Mark
made a call: “We stop those messages from going through.” Most people
would agree with his decision, but it’s deeply troubling that he made it
with no accountability to any independent authority or government.
Facebook could, in theory, delete en masse the messages of Americans,
too, if its leadership decided it didn’t like them.
Mark
used to insist that Facebook was just a “social utility,” a neutral
platform for people to communicate what they wished. Now he recognizes
that Facebook is both a platform and a publisher and that it is
inevitably making decisions about values.
The company’s own lawyers have
argued in court that Facebook is a publisher and thus entitled to First Amendment protection.
No
one at Facebook headquarters is choosing what single news story
everyone in America wakes up to, of course. But they do decide whether
it will be an article from a reputable outlet or a clip from “The Daily
Show,” a photo from a friend’s wedding or an incendiary call to kill
others.
Mark knows that this is too
much power and is pursuing a twofold strategy to mitigate it. First, he
is pivoting Facebook’s focus toward encouraging more private,
encrypted messaging that Facebook’s employees can’t see, let alone
control. Second, he is hoping for friendly oversight from regulators and
other industry executives.
Late
last year, he proposed an independent commission to handle difficult
content moderation decisions by social media platforms. It would afford
an independent check, Mark argued, on Facebook’s decisions, and users
could appeal to it if they disagreed. But its decisions would not have
the force of law, since companies would voluntarily participate.
In an op-ed essay in The Washington Post
in March, he wrote, “Lawmakers often tell me we have too much power
over speech, and I agree.” And he went even further than before, calling
for more government regulation — not just on speech, but also on
privacy and interoperability, the ability of consumers to seamlessly
leave one network and transfer their profiles, friend connections,
photos and other data to another.
I
don’t think these proposals were made in bad faith. But I do think
they’re an attempt to head off the argument that regulators need to go
further and break up the company. Facebook isn’t afraid of a few more
rules. It’s afraid of an antitrust case and of the kind of
accountability that real government oversight would bring.
We
don’t expect calcified rules or voluntary commissions to work to
regulate drug companies, health care companies, car manufacturers or
credit card providers. Agencies oversee these industries to ensure that
the private market works for the public good. In these cases, we all
understand that government isn’t an external force meddling in an
organic market; it’s what makes a dynamic and fair market possible in
the first place.
This should be just as true for social networking as it
is for air travel or pharmaceuticals.
In the summer of 2006, Yahoo offered us $1 billion for
Facebook. I desperately wanted Mark to say yes. Even my small slice of
the company would have made me a millionaire several times over. For a
22-year-old scholarship kid from small-town North Carolina, that kind of
money was unimaginable.
I wasn’t alone — just about every other person
at the company wanted the same. It
was taboo to talk about it openly, but I finally asked Mark when we had a
moment alone, “How are you feeling about Yahoo?” I got a shrug and a
one-line answer: “I just don’t know if I want to work for Terry Semel,”
Yahoo’s chief executive.
Outside of a
couple of gigs in college, Mark had never had a real boss and seemed
entirely uninterested in the prospect. I didn’t like the idea much
myself, but I would have traded having a boss for several million
dollars any day of the week. Mark’s drive was infinitely stronger.
Domination meant domination, and the hustle was just too delicious.
Mark
may never have a boss, but he needs to have some check on his power.
The American government needs to do two things: break up Facebook’s
monopoly and regulate the company to make it more accountable to the
American people.
First,
Facebook should be separated into multiple companies. The F.T.C., in
conjunction with the Justice Department, should enforce antitrust laws
by undoing the Instagram and WhatsApp acquisitions and banning future
acquisitions for several years.
The F.T.C. should have blocked these
mergers, but it’s not too late to act. There is precedent for correcting
bad decisions — as recently as 2009, Whole Foods settled antitrust complaints by selling off the Wild Oats brand and stores that it had bought a few years earlier.
There is some evidence that we may be headed in this direction. Senator Elizabeth Warren has called for reversing the Facebook mergers, and in February, the F.T.C. announced the creation of a task force to monitor competition among tech companies and review previous mergers.
How would a breakup work? Facebook would have a brief period to spin off
the Instagram and WhatsApp businesses, and the three would become
distinct companies, most likely publicly traded.
Facebook shareholders
would initially hold stock in the new companies, although Mark and other
executives would probably be required to divest their management
shares.
Until recently, WhatsApp and
Instagram were administered as independent platforms inside the parent
company, so that should make the process easier. But time is of the
essence: Facebook is working quickly to integrate the three, which would
make it harder for the F.T.C. to split them up.
Some economists are skeptical that
breaking up Facebook would spur that much competition, because
Facebook, they say, is a “natural” monopoly. Natural monopolies have
emerged in areas like water systems and the electrical grid, where the
price of entering the business is very high — because you have to lay
pipes or electrical lines — but it gets cheaper and cheaper to add each
additional customer.
In other words, the monopoly arises naturally from
the circumstances of the business, rather than a company’s illegal
maneuvering. In addition, defenders of natural monopolies often make the
case that they benefit consumers because they are able to provide
services more cheaply than anyone else.
Facebook
is indeed more valuable when there are more people on it: There are
more connections for a user to make and more content to be shared. But
the cost of entering the social network business is not that high. And
unlike with pipes and electricity, there is no good argument that the
country benefits from having only one dominant social networking
company.
Still
others worry that the breakup of Facebook or other American tech
companies could be a national security problem. Because advancements in
artificial intelligence require immense amounts of data and computing
power, only large companies like Facebook, Google and Amazon can afford
these investments, they say. If American companies become smaller, the
Chinese will outpace us.
While
serious, these concerns do not justify inaction. Even after a breakup,
Facebook would be a hugely profitable business with billions to invest
in new technologies — and a more competitive market would only encourage
those investments. If the Chinese did pull ahead, our government could
invest in research and development and pursue tactical trade policy,
just as it is doing today to hold China’s 5G technology at bay.
The
cost of breaking up Facebook would be next to zero for the government,
and lots of people stand to gain economically. A ban on short-term
acquisitions would ensure that competitors, and the investors who take a
bet on them, would have the space to flourish. Digital advertisers
would suddenly have multiple companies vying for their dollars.
Even
Facebook shareholders would probably benefit, as shareholders often do
in the years after a company’s split. The value of the companies that
made up Standard Oil doubled within a year of its being dismantled and
had increased by fivefold a few years later. Ten years after the 1984
breakup of AT&T, the value of its successor companies had tripled. But
the biggest winners would be the American people.
Imagine a competitive
market in which they could choose among one network that offered higher
privacy standards, another that cost a fee to join but had little
advertising and another that would allow users to customize and tweak
their feeds as they saw fit.
No one knows exactly what Facebook’s
competitors would offer to differentiate themselves. That’s exactly the
point.
The Justice Department faced
similar questions of social costs and benefits with AT&T in the
1950s. AT&T had a monopoly on phone services and telecommunications
equipment. The government filed suit under antitrust laws, and the case
ended with a consent decree that required AT&T to release its
patents and refrain from expanding into the nascent computer industry.
This resulted in an explosion of innovation, greatly increasing
follow-on patents and leading to the development of the semiconductor
and modern computing. We would most likely not have iPhones or laptops
without the competitive markets that antitrust action ushered in.
Adam Smith was right: Competition spurs growth and innovation.
Just breaking up Facebook is not enough.
We need a new agency, empowered by Congress to regulate tech companies. Its first mandate should be to protect privacy.
The
Europeans have made headway on privacy with the General Data Protection
Regulation, a law that guarantees users a minimal level of protection.
A
landmark privacy bill in the United States should specify exactly what
control Americans have over their digital information, require clearer
disclosure to users and provide enough flexibility to the agency to
exercise effective oversight over time.
The agency should also be
charged with guaranteeing basic interoperability across platforms. Finally,
the agency should create guidelines for acceptable speech on social
media.
This idea may seem un-American — we would never stand for a
government agency censoring speech. But we already have limits on
yelling “fire” in a crowded theater, child pornography, speech intended
to provoke violence and false statements to manipulate stock prices.
We
will have to create similar standards that tech companies can use. These
standards should of course be subject to the review of the courts, just
as any other limits on speech are. But there is no constitutional right
to harass others or live-stream violence.
These are difficult challenges. I worry that government regulators will not be able to keep up with the pace of digital innovation.
I worry that more competition in social networking might lead to a conservative Facebook and a liberal one, or that newer social networks might be less secure if government regulation is weak.
But sticking with the status quo would be worse: If we don’t have public servants shaping these policies, corporations will.
Some
people doubt that an effort to break up Facebook would win in the
courts, given the hostility on the federal bench to antitrust action, or
that this divided Congress would ever be able to muster enough
consensus to create a regulatory agency for social media.
But
even if breakup and regulation aren’t immediately successful, simply
pushing for them will bring more oversight.
The government’s case
against Microsoft — that it illegally used its market power in operating
systems to force its customers to use its web browser, Internet
Explorer — ended in 2001 when George W. Bush’s administration abandoned
its effort to break up the company.
Yet that prosecution helped rein in
Microsoft’s ambitions to dominate the early web.
Similarly,
the Justice Department’s 1970s suit accusing IBM of illegally
maintaining its monopoly on computer sales ended in a stalemate. But
along the way, IBM changed many of its behaviors.
It stopped bundling
its hardware and software, chose an extremely open design for the
operating system in its personal computers and did not exercise undue
control over its suppliers.
Professor Wu has written that this
“policeman at the elbow” led IBM to steer clear “of anything close to
anticompetitive conduct, for fear of adding to the case against it.” We can expect the same from even an unsuccessful suit against Facebook.
Finally,
an aggressive case against Facebook would persuade other behemoths like
Google and Amazon to think twice about stifling competition in their
own sectors, out of fear that they could be next.
If the government were
to use this moment to resurrect an effective competition standard that
takes a broader view of the full cost of “free” products, it could
affect a whole host of industries.
The
alternative is bleak.
If we do not take action, Facebook’s monopoly
will become even more entrenched. With much of the world’s personal
communications in hand, it can mine that data for patterns and trends,
giving it an advantage over competitors for decades to come.
I take responsibility for not sounding the alarm earlier.
Don Graham, a former Facebook board member, has accused
those who criticize the company now of having “all the courage of the
last man leaping on the pile at a football game.”
The financial rewards I
reaped from working at Facebook radically changed the trajectory of my
life, and even after I cashed out, I watched in awe as the company grew.
It took the 2016 election fallout and Cambridge Analytica to awaken me
to the dangers of Facebook’s monopoly. But anyone suggesting that
Facebook is akin to a pinned football player misrepresents its
resilience and power.
An era of
accountability for Facebook and other monopolies may be beginning.
Collective anger is growing, and a new cohort of leaders has begun to
emerge.
On Capitol Hill, Representative David Cicilline has taken a
special interest in checking the power of monopolies, and Senators Amy
Klobuchar and Ted Cruz have joined Senator Warren in calling for more
oversight.
Economists like Jason Furman, a former chairman of the
Council of Economic Advisers, are speaking out about monopolies, and a
host of legal scholars like Lina Khan, Barry Lynn and Ganesh Sitaraman
are plotting a way forward.
This
movement of public servants, scholars and activists deserves our
support. Mark Zuckerberg cannot fix Facebook, but our government can.
Chris
Hughes, a co-founder of Facebook, is a co-chairman of the Economic
Security Project and a senior adviser at the Roosevelt Institute.
Correction:
An
earlier version of this essay misidentified the type of computers at
issue in a 1970s antitrust case against IBM. They were mainframe
computers, not personal computers.
INTRODUCTION
Every
day, people come to Facebook to share their stories, see the world
through the eyes of others, and connect with friends and causes. The
conversations that happen on Facebook reflect the diversity of a
community of more than two billion people communicating across countries
and cultures and in dozens of languages, posting everything from text
to photos and videos.
We recognize how important it is for Facebook to be a place where people feel empowered to communicate, and we take our role in keeping abuse off our service seriously. That’s why we have developed a set of Community Standards that outline what is and is not allowed on Facebook. Our Standards apply around the world to all types of content. They’re designed to be comprehensive – for example, content that might not be considered hate speech may still be removed for violating our bullying policies.
The goal of our Community Standards is to encourage expression and create a safe environment. We develop our policies based on input from our community and from experts in fields such as technology and public safety. Our policies are also rooted in the following principles:
Safety:
People need to feel safe in order to build community. We are committed
to removing content that encourages real-world harm, including (but not
limited to) physical, financial, and emotional injury.
Voice:
Our mission is all about embracing diverse views. We err on the side of
allowing content, even when some find it objectionable, unless removing
that content can prevent a specific harm. Moreover, at times we will
allow content that might otherwise violate our standards if we feel that
it is newsworthy, significant, or important to the public interest. We
do this only after weighing the public interest value of the content
against the risk of real-world harm.
Equity:
Our community is global and diverse. Our policies may seem broad, but
that is because we apply them consistently and fairly to a community
that transcends regions, cultures, and languages. As a result, our
Community Standards can sometimes appear less nuanced than we would
like, leading to an outcome that is at odds with their underlying
purpose. For that reason, in some cases, and when we are provided with
additional context, we make a decision based on the spirit, rather than
the letter, of the policy.
Everyone on Facebook plays a part in keeping the platform safe and respectful. We ask people to share responsibly and to let us know when they see something that may violate our Community Standards. We make it easy for people to report potentially violating content, including Pages, Groups, profiles, individual content, and/or comments to us for review. We also give people the option to block, unfollow, or hide people and posts, so that they can control their own experience on Facebook.
The consequences for violating our Community Standards vary depending on the severity of the violation and a person's history on the platform. For instance, we may warn someone for a first violation, but if they continue to violate our policies, we may restrict their ability to post on Facebook or disable their profile. We also may notify law enforcement when we believe there is a genuine risk of physical harm or a direct threat to public safety.
Our Community Standards, which we will continue to develop over time, serve as a guide for how to communicate on Facebook. It is in this spirit that we ask members of the Facebook community to follow these guidelines.