Thursday, March 16, 2017

( Facebook as Suckface 2017 News Report ) Patcnews March 16, 2017 The Patriot Conservative News Tea Party Network Reports Facebook as Suckface 2017 © All Copyrights reserved By Patcnews

  Mark Zuckerberg Addresses 'Breach Of Trust' In Facebook User Data Crisis

Facebook CEO Mark Zuckerberg spoke out on the Cambridge Analytica data leak for the first time on Wednesday.
After days without comment from Facebook's top executives on public revelations that Cambridge Analytica collected the personal information of tens of millions of Facebook users without their consent, the social network's CEO Mark Zuckerberg has finally spoken out about the privacy crisis.
Zuckerberg, who is expected to appear in a CNN interview airing Wednesday evening, said in a Facebook post on Wednesday that he takes responsibility for protecting users' data and has been working to understand exactly what happened, to ensure a similar leak doesn't happen again. He also outlined steps Facebook has taken to address the data leak and the company's plans to prevent the abuse of users' privacy moving forward. Cambridge Analytica helped run President Donald Trump's 2016 presidential campaign.
"This was a breach of trust between Kogan, Cambridge Analytica and Facebook," Zuckerberg said in Wednesday post. "But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that."

Zuckerberg confirmed that in 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz that was installed by about 300,000 people who shared their data and some of their friends' data, giving Kogan access to the data of tens of millions of Facebook users. In 2014, Zuckerberg said, Facebook changed its policies to stop allowing app creators to access data about a person's friends unless those friends had also authorized the app. In 2015, Facebook learned from journalists at The Guardian that Kogan had violated Facebook's policies by sharing data from the social network with the data-profiling firm Cambridge Analytica. Facebook then banned Kogan from the platform and demanded that he and Cambridge Analytica "formally certify" that they had deleted the data, and both parties "provided these certifications," Zuckerberg said.
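To make the mechanics concrete, here is a minimal illustrative sketch of the kind of request the pre-2014 platform permitted. This is not Kogan's actual code; the endpoint reflects the long-retired v1.0 Graph API style, and the token value is a placeholder.

import requests

# Placeholder token belonging to one consenting quiz-taker.
TOKEN = "access-token-of-a-user-who-installed-the-quiz"

# Under the old policy, extended permissions such as friends_likes let an
# app read fields belonging to the user's friends, none of whom ever
# installed the app themselves.
resp = requests.get(
    "https://graph.facebook.com/v1.0/me/friends",
    params={"fields": "name,likes", "access_token": TOKEN},
)
friends = resp.json().get("data", [])

# Roughly 300,000 installs, each exposing a few hundred friends, scales to
# the tens of millions of affected profiles described above.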
He confirmed that last week, Facebook learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica "may not" have deleted the data as it had claimed. Facebook then banned Cambridge Analytica from using any of its services and hired a forensics firm to audit the company. Zuckerberg said Facebook is "working with regulators to investigate what happened." (The U.S. Federal Trade Commission is reportedly investigating Facebook's handling of user data, and the European Commission is asking data protection authorities to investigate the leak.)
Moving forward, Zuckerberg said Facebook will investigate all apps that had access to "large amounts of information" before the social network changed its data policies in 2014. Facebook said it will audit any app "with suspicious activity" and ban any developers that do not agree to be reviewed. If Facebook finds that developers misused any personal information, it will ban the app creators and alert all users affected by those apps, including users affected by Kogan's leak.
Facebook said it will also restrict developers' access to data more broadly. The company will cut off a developer's access to a user's data if the person hasn't used the app in three months, and will limit the data users hand over at login to just their name, profile photo and email address. Developers will need to get approval from users, and sign a contract with Facebook, before asking any user for access to their posts or other private data.
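For contrast, here is a minimal sketch of what a default login yields under the new rules. This is an illustration, not Facebook's official sample code; the API version string and token are placeholders.

import requests

TOKEN = "user-access-token"  # placeholder obtained via Facebook Login

# Only the fields every app now receives by default: name, photo, email.
# Anything beyond this requires the user's explicit approval and a signed
# contract with Facebook, per the policy described above.
resp = requests.get(
    "https://graph.facebook.com/v2.12/me",
    params={"fields": "name,email,picture", "access_token": TOKEN},
)
profile = resp.json()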
To make it easier for Facebook users to control their information, Facebook will add a new tool to the top of the News Feed showing people which apps can access their data, including a feature to revoke an app's access. (This feature is currently available in users' privacy settings; the new placement will simply make it easier to find.) Zuckerberg said the company plans to announce more data policy changes over the coming days.
"I started Facebook, and at the end of the day I'm responsible for what happens on our platform," Zuckerberg wrote on Wednesday. "I'm serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn't change what happened in the past."
"We will learn from this experience to secure our platform further and make our community safer for everyone going forward," he added.
On Wednesday, Facebook COO Sheryl Sandberg also addressed the Cambridge Analytica controversy for the first time, echoing Zuckerberg's acknowledgment that the social network had violated users' trust.
"As he said, we know that this was a major violation of people's trust, and I deeply regret that we didn't do enough to deal with it," Sandberg said in a Facebook post

 

 






Facebook confirms test of a downvote button for flagging comments





How can Facebook promote meaningful interaction between users? By letting them downvote inappropriate comments to hide them. Facebook is now testing a downvote button on a limited set of public Page post comment reels, the company confirms to TechCrunch. But what Facebook does with signals about problematic comments could raise new questions about censorship, and its role as a news editor and media company.
A Facebook spokesperson tells TechCrunch that the motivation behind the downvote button is to create a lightweight way for people to provide a signal to Facebook that a comment is inappropriate, uncivil, or misleading.
Here’s the statement Facebook provided: “We are not testing a dislike button. We are exploring a feature for people to give us feedback about comments on public page posts. This is running for a small set of people in the U.S. only.”
When tapped, the downvote button hides a comment, and gives users additional reporting options like “Offensive”, “Misleading”, and “Off Topic”. Those could help Facebook figure out if the comment is objectionable, a form of “fake news”, or just irrelevant. Facebook already has a “Hide” button for comments, but it’s usually hidden behind the drop-down arrow on comments rather than immediately clickable.

(Screenshots from Christina Hudler showed the downvote button on a comment thread and what happens when you click it.)


According to Facebook, this is a short-term test that doesn’t affect the ranking of the comment, post, or Page. It’s designed as a way to give feedback to Facebook, not the commenter, and there will be no publicly visible count of how many downvotes a comment gets. The test is running for 5% of Android users in the U.S. with language set to English. The downvote button only appears on public Page posts, not on posts by Groups, public figures or users. There’s currently no plan to expand the test as is.
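As a thought experiment, the behavior Facebook describes could be modeled roughly like this. Every name below is hypothetical, since Facebook has published none of the internals; the sketch only encodes what the test reportedly does: the signal goes to Facebook, the comment is hidden for the reporter alone, and ranking and public counts are untouched.

from dataclasses import dataclass
from enum import Enum
from typing import Dict, List, Set

class FeedbackReason(Enum):
    # The three reporting options observed in the test.
    OFFENSIVE = "offensive"
    MISLEADING = "misleading"
    OFF_TOPIC = "off_topic"

@dataclass
class CommentFeedback:
    comment_id: str
    reporter_id: str
    reason: FeedbackReason

class DownvoteTest:
    def __init__(self) -> None:
        self.signals: List[CommentFeedback] = []   # feedback for Facebook, not the commenter
        self.hidden_for: Dict[str, Set[str]] = {}  # comment_id -> users who downvoted it

    def downvote(self, fb: CommentFeedback) -> None:
        self.signals.append(fb)
        self.hidden_for.setdefault(fb.comment_id, set()).add(fb.reporter_id)

    def is_visible(self, comment_id: str, viewer_id: str) -> bool:
        # Only the reporter's own view changes; there is no public tally
        # and no effect on how the comment ranks for anyone else.
        return viewer_id not in self.hidden_for.get(comment_id, set())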

Not A Dislike Button

A dislike button has long been the most requested Facebook feature, but Facebook has never given in.
Back in 2015, CEO Mark Zuckerberg responded to a Q&A question about it, saying:

“We didn’t want to just build a Dislike button because we don’t want to turn Facebook into a forum where people are voting up or down on people’s posts. That doesn’t seem like the kind of community we want to create.”
Instead, Facebook built the Reactions options that let you respond to posts and comments with love, wow, haha, sad or angry emoji. Facebook also built reactions into Messenger with the option to give messages a thumbs-up or thumbs-down so you could show agreement or disagreement.





But the new downvote button is the closest Facebook has come to actually giving people a dislike button. Downvoting was popularized on Reddit for crowdsourced comment ranking, and Reddit co-founder Alexis Ohanian weighed in on Facebook's version via Twitter.

The downvote button ties in with Facebook's recent push to enhance its users' well-being by prioritizing News Feed content that drives meaningful interactions instead of passive, zombie browsing. That led Facebook to show fewer viral videos, which in turn contributed to a decrease of 700,000 daily active users in the U.S. and Canada, the company's first decline ever in any region, as well as the slowest DAU growth rate it has ever reported.
But one way Facebook could generate more meaningful interaction without losing time on site could be by ensuring the most interesting comments are at the top of posts. Facebook already ranks comments by relevancy based on Likes and replies. But the downvote button could ensure that if objectionable comments rise up and stall discussion, Facebook will know.
That could eventually lead to a way for Facebook to bury these comments, or the people that post them. However, this will only open up more questions about censorship and what qualifies as inappropriate at a time when Facebook is already struggling to manage its responsibilities as what Zuckerberg calls “not a traditional media company.”
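To illustrate what burying could look like, here is a hypothetical relevancy-style scoring sketch. Facebook confirms it ranks comments using signals like Likes and replies; the downvote term and all weights below are speculative, following the article's premise rather than anything Facebook has disclosed.

def comment_score(likes: int, replies: int, downvotes: int = 0) -> float:
    # Illustrative weights only: replies count more than likes as a proxy
    # for "meaningful interaction", and downvote reports drag a comment down.
    return 1.0 * likes + 2.0 * replies - 3.0 * downvotes

comments = [
    {"text": "thoughtful question", "likes": 8, "replies": 5, "downvotes": 0},
    {"text": "spam link", "likes": 20, "replies": 0, "downvotes": 12},
]
ranked = sorted(
    comments,
    key=lambda c: comment_score(c["likes"], c["replies"], c["downvotes"]),
    reverse=True,
)
# "thoughtful question" (score 18.0) now outranks "spam link" (score -16.0).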














Facebook Watch Suffers the Scroll

When Facebook this year named its YouTube-style video destination Watch, it was like a command for viewers to stop scrolling so fast and just ... watch.
Watch basketball's Ball family, watch reality dating shows, watch people travel and cook. Many of the shows came from top publishers like Business Insider, Hearst, The Atlantic and Time Inc., all striving for commercial viability in digital video.
But people have not tuned in to the Watch hub as expected, with the "vast majority" of video views still coming from the News Feed, publishers say.
"They are really struggling with how to figure out how to get users to consume the videos in Watch," says a publishing executive, speaking on condition of anonymity. "All of our views come from News Feed and not the Watch tab."
Facebook is now moving quickly to improve its offer to partners. It is considering hiking the share of ad revenue that Watch creators get, currently 55 percent, a person familiar with the negotiations said.
Facebook will also test letting partners sell their own ad inventory, according to people familiar with the strategy. A Facebook representative confirmed that the company will run a limited test next year.
Letting partners sell their own shows to advertisers might help attract high-powered networks and studios to Watch, because some of the more prestigious media outlets prefer to control their content and ads on digital platforms.
Partners have asked for a while for the ability to sell their own media on Facebook. The social network has allowed them to sell brand integrations directly in videos, but otherwise has handled ad sales itself and split the revenue.
The rep declined to comment on the revenue split beyond saying it "would be competitive to other platforms."
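For a sense of the money at stake, here is the arithmetic of the current split; the revenue figure below is hypothetical, but the 55 percent creator share comes from the report above.

ad_revenue = 10_000        # hypothetical ad revenue on a Watch show, in dollars
creator_share = 0.55       # the current split reported above
creator_take = ad_revenue * creator_share   # $5,500 to the creator
facebook_take = ad_revenue - creator_take   # $4,500 retained by Facebook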
Facebook has already announced a number of new initiatives to drive more viewers to Watch shows as well as a test of ads before programs start.
This is the dilemma: Facebook lured publishers to make and post shows to the Watch section. But issues like difficulty getting exposure in the hub have made the News Feed the easiest way to find viewers.
And that matters because the News Feed doesn't encourage the viewing habits that Facebook wants. One of the driving motivations for the video hub was to get people to turn on the volume, sit through entire shows, stick around even during ad breaks and perhaps watch something else after that. Ads on video in the hub seem to command a higher price too, twice as much as the same ads in News Feed, according to one digital video ad buyer.
In the News Feed, viewers are doing drive-bys, usually without sound, and can quickly scroll away if a commercial starts. They're also less likely to get addicted to a series. Don't even think about binge-watching.
Facebook last week acknowledged the challenges with updates in the way it promotes shows—in the feed and the hub. "We will show more videos in News Feed that people seek out or return to watch from the same publisher or creator week after week," Facebook's product team said in a post. Watch's "Discover" tab will also prioritize "shows that people come back to," the post said.
In other words: Anyone who can hook viewers will reap the rewards.
"The plans they approached partners with when they pitched the video tab and the deals for Watch programming, none of that applies today," says an executive at another publishing partner, one that has gone through two rounds of negotiations with Facebook to develop Watch programs. "It's all thrown out."


____________________________________


Wichita Islamic Society responds to threatening Facebook post

WICHITA, Kan. (KWCH) A Facebook post has caused concern in Wichita's Muslim community after a man suggested using the Wichita Islamic Society building as a shooting range.
For Hussam Madi, the idea that anyone would want to attack his place of worship is frightening.
So when someone made a post on Facebook alluding to using the Islamic Society building as a shooting range, it's something he and society members took very seriously.
"To display it where some people have their families, and human beings, and children and people that are coming to worship only - to insinuate that that's your shooting range, absolutely horrendous and not acceptable towards anybody. I prayed for the guy because he's ignorant."
He says regardless of the poster's intent, it's something that deserves looking into.
"It could be a joke, I don't know. Unfortunately, it happened in other communities, where people went into a church, a mosque, or a synagogue and they shot people."
For Hussam, it's nothing to joke about, and he hopes others will think before posting threatening things.

"We're all together in this, the city of Wichita - we're all Americans, and there is nobody that needs to be targeted like that."
Hussam says police did contact the person who posted the picture to see if it was a viable threat. Eyewitness News also reached out to that person but has not heard back.

 

Facebook bravely admits that it is a problem, and suggests we spend more time on Facebook

Yesterday (Dec. 15), a strange post went up on Facebook’s corporate blog. It was strange because it suggested that Facebook might, in fact, be bad for you.
What solution can the social network provide? The same answer it gives to every question: namely, more Facebook.
The post was the latest in Facebook’s somewhat new series, “Hard Questions.” This set of blog posts aims to address concerns that social media broadly, and Facebook specifically, might be having a negative impact on society. Topics include “Hate Speech,” “How We Counter Terrorism,” and the latest one, “Is Spending Time on Social Media Bad for Us?”


The structure of these posts is usually the same. Step one: identify some ill in society. Step two: admit that people think technology, and Facebook, might be contributing to that ill. Step three: assert that more Facebook, not less, is the cure for said ill.
In the new post on the potential downside of social media, the authors, who are researchers at Facebook, begin by correctly saying that people are worried about the effect social media has on relationships and mental health. They then point to research that suggests scrolling through Facebook, and blindly hitting the “like” button, makes people feel like crap. “In general, when people spend a lot of time passively consuming information—reading but not interacting with people—they report feeling worse afterward,” they write.
The key phrase is “passively consuming.” The authors’ solution to this problem is not, as you might think, using Facebook less. It is using it more, and more actively. Instead of just liking things, and scrolling through our feeds, they suggest that we should be all-in. Send more messages, post more updates, leave more comments, click more reaction buttons. “A study we conducted with Robert Kraut at Carnegie Mellon University found that people who sent or received more messages, comments and Timeline posts reported improvements in social support, depression and loneliness,” they cheerily note.
They then add a caveat that "simply broadcasting status updates wasn't enough; people had to interact one-on-one with others in their network." But wait. Isn't Facebook a social network, connecting me to hundreds or thousands of other people? I don't need Facebook to interact one-on-one, over text, email, or coffee.


Facebook might admit it has some negative effects, but it is unwilling to face up to the fact that the solution might be using it less. This latest post mentions Facebook’s “take a break” feature. This will hide your ex-partner’s profile updates for you after a break-up, to help in “emotional recovery.” Because, sure, that seems healthier than just not using Facebook at all for a little while.
Pretty much every Facebook post about the ill effects of the platform follows this formula. Hate speech on Facebook is a problem. The solution? Use Facebook more to tag hate speech, so we can get rid of it. Kids are on Facebook, and it might not be good for them. The solution? Give them Facebook Messenger Kids, a new app made just for them. Facebook is causing political divisiveness in America. The solution? Use Facebook to build digital “communities.”
Turns out Facebook’s “hard questions” are actually pretty easy. The answer, after all, is always the same.
___________________________________

Facebook just admitted that using Facebook can be bad for you

  • Facebook said on Friday that there are certain use cases of the social network that can be bad for your health.
  • It also found that some use cases can be positive, specifically social interaction, and said it's going to work to improve those features.


Facebook admitted on Friday that using its social network can be bad for you in some instances.
Facebook's director of research David Ginsberg and research scientist Moira Burke published a post in which they addressed questions about the impact Facebook has on our moods, and revealed some compelling information.
"University of Michigan students randomly assigned to read Facebook for 10 minutes were in a worse mood at the end of the day than students assigned to post or talk to friends on Facebook," the blog post said. "A study from UC San Diego and Yale found that people who clicked on about four times as many links as the average person, or who liked twice as many posts, reported worse mental health than average in a survey."





In other words, if you're using Facebook to mindlessly browse through your feed or click posts, you may end up in a foul mood after. Facebook also worked with Carnegie Mellon University for additional insight, and found that "people who sent or received more messages, comments and timeline posts reported improvements in social support, depression and loneliness." Likewise, Facebook said students at Cornell who used Facebook for 5 minutes while viewing their own profiles saw "boosts in self-affirmation," while folks who looked at other profiles did not.
In other words, using Facebook to interact with people -- as opposed to just "browsing" as the University of Michigan study analyzed -- seemed to have a positive effect on people.
Facebook's blog post follows criticisms from former Facebook exec Chamath Palihapitiya, who said recently that social networks such as Facebook are "starting to erode the social fabric of how society works" and that they're "ripping apart" society. Palihapitiya has since walked back those remarks.
Facebook says it's going to take this data and work to encourage more social interaction among users, in an effort to cut down on the kind of passive time-wasting that ultimately leaves people feeling worse.


 

The Patriot Conservative News Tea Party Network


      liberalism + Socialism = Terrorism 
                          Thanks for your Support







 © All copyrights reserved By Patcnews
 liberalism + Socialism = Terrorism

 






LLC 501(c)(4) UCC 1-308. ALL RIGHTS RESERVED WITHOUT PREJUDICE



Content and Programming Copyright 2017 By Patcnews The Patriot Conservative News Tea Party Network © LLC UCC 1-308. ALL RIGHTS RESERVED WITHOUT PREJUDICE All copyrights reserved By Patcnews The Patriot Conservative News Tea Party Network Copyright 2017 CQ-Roll Call, Inc. All materials herein are protected by United States copyright law and may not be reproduced, distributed, transmitted, displayed, published or broadcast without the prior written permission of CQ-Roll Call. You may not alter or remove any trademark, copyright or other notice from copies of the content. © All Copyrights reserved By Patcnews The Patriot Conservative News Tea Party Network


