#In Mark Zuckerberg We Trust? The State and Future of Facebook
fumpkins · 2 years
US sues cryptocurrency exchange run by Winklevoss twins
Tyler and Cameron Winklevoss founded the crypto exchange Gemini Trust Co. after suing their one-time Harvard classmate Mark Zuckerberg over who originated the idea for Facebook.
US regulators on Thursday said they are suing the Gemini Trust cryptocurrency exchange, run by Cameron and Tyler Winklevoss, for giving misleading answers in 2017 about a proposed bitcoin futures product.
The Commodity Futures Trading Commission complaint, filed in federal court in New York, accuses Gemini of not being upfront about how easy it would be to manipulate a bitcoin futures contract proposed at the time, the agency said in a statement.
The futures contract launched at the end of 2017 and stopped trading two years later, according to posts from Gemini and a partner company.
Making false or misleading statements to the commission undermines its work to protect market participants, prevent price manipulation, and promote fair competition, acting director of enforcement Gretchen Lowe said in the statement.
“This enforcement action sends a strong message that the Commission will act to safeguard the integrity of the market oversight process,” Lowe stated.
The US agency is seeking monetary penalties, the surrender of any ill-gotten gains, and an injunction barring Gemini from such conduct in the future, it said.
Gemini defended its record when asked about the suit.
“We have an eight year track-record of asking for permission, not forgiveness, and always doing the right thing,” it told AFP, adding: “We look forward to definitively proving this in court.”
Cameron and Tyler Winklevoss, twin Harvard classmates of Mark Zuckerberg who sued him over claims he took the idea for Facebook from them, founded and run New York-based Gemini.
The brothers told Gemini employees on Thursday that about 10 percent of them were being laid off as staff is cut to weather a “crypto winter” likely to last for some time, according to a copy of the email posted online by the company.
“The crypto revolution is well underway and its impact will continue to be profound, but its trajectory has been anything but gradual or predictable,” the brothers said.
The market remains in a “contraction phase that is settling into a period of stasis—what our industry refers to as ‘crypto winter'” compounded by macroeconomic and geopolitical turmoil, they added.
© 2022 AFP
Citation: US sues cryptocurrency exchange run by Winklevoss twins (2022, June 2), retrieved 3 June 2022 from https://techxplore.com/news/2022-06-sues-cryptocurrency-exchange-winklevoss-twins.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
futuresin-blog1 · 5 years
Twitter Did the Right Thing Banning Political Ads
As the advertising industry is forced to question its ethics following Elizabeth Warren’s pressure on Mark Zuckerberg to change Facebook’s ad policies, Twitter says it will now ban all political ads from its platform.
Facebook’s political ads skip even basic fact-checking and are thus prone to misinformation and hate speech.
If Jack Dorsey thinks political messages should be earned and not bought, will Facebook and Google follow suit? I think advertising without fact-checking is ridiculous and it’s not a free speech story. My generation needs politicians to be held accountable.
In an era where AOC, Bernie Sanders, and other politicians go viral on Twitter, and Donald Trump wields unheard-of power over the stock markets in 2019 with headline bait, you have to wonder how this helps left-leaning politicians who already have a secure foothold on Twitter.
AOC has roughly 6 million Twitter followers, boosting Twitter considerably and achieving organic reach through the platform. The same could be said for the Yang Gang’s efforts. These two young politicians obviously have bright futures, in 2024 for Yang and 2028 for AOC, seeing that automation and capitalism are likely to worsen economic conditions for young people.
So Twitter did the right thing, even if for many people the platform has become more or less a degraded loudspeaker.
Twitter’s reasoning appears to be that politics should not be hacked unfairly in a way that could be dangerous for voters. There’s also considerable backlash over 2016 and how Trump was elected after a few swing states were targeted by foreign bad actors.
What’s funny for me is how this is a dig on Facebook’s ethics. Facebook is the one that allows even the most threatening and false advertisements that make politics seem like a brutal game of punishing opponents instead of talking about issues.
If only Instagram followed in the footsteps of Twitter. So many people in California dislike what Facebook has become that the pressure to break it up and rein in its monopoly power may increase in 2020.
At least Jack is trying to be transparent.
Twitter is but a small piece of the ad puzzle of politics but still highly relevant for breaking news and political messaging. Donald Trump has popularized Twitter as a platform about the world and geopolitical affairs. Twitter trying to clean up its bots, fake accounts and Ads seems to be a step in the right direction.
Will this move make communications teams for politicians get better at viral and word-of-mouth marketing? They must make better tweets that resonate with large groups of their core constituents. Young people barely trust politicians or social media platforms anymore.
We live in a world where Facebook’s Mark Zuckerberg stands by his decision to run unchecked ads, where one person’s arrogance or misunderstanding of what’s good for democracy or capitalism can hold us back.
Twitter, however, did the right thing to ban political advertisements.
mariaaklnthony · 6 years
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiatives to restrict data access and also a renewed selfie awareness to focus efforts on protecting people on the platform. What’s more notable however is that Mark Zuckerberg also hosted a last-minute, rare town hall with media and analysts to explain these efforts and also take tough questions for the better part of an hour.
Let’s start with the company’s news on data restrictions.
To better protect Facebook user information, the company is making the following changes across nine priority areas over the coming months (Sourced from Facebook):
Events API: Until today, people could grant an app permission to get information about events they host or attend, including private events. Doing so allowed users to add Facebook Events to calendar, ticketing or other apps. According to the company, Facebook Events carry information about other people’s attendance as well as posts on the event wall. As of today, apps using the API can no longer access the guest list or posts on the event wall. (A minimal code sketch of this call pattern appears after this list.)
Groups API: Currently apps need permission of a group admin or member to access group content for closed groups. For secret groups, apps need the permission of an admin. However, groups contain information about people and conversations and Facebook wants to make sure everything is protected. Moving forward, all third-party apps using the Groups API will need approval from Facebook and an admin to ensure they benefit the group. Apps will no longer be able to access the member list of a group. Facebook is also removing personal information, such as names and profile photos, attached to posts or comments.
Pages API: Previously, third party apps could use the Pages API to read posts or comments from any Page. Doing so lets developers create tools to help Page owners perform common tasks such as schedule posts and reply to comments or messages. At the same time, it also let apps access more data than necessary. Now, Facebook wants to ensure that Page information is only available to apps providing useful services to our community. All future access to the Pages API will need to be approved by Facebook.
Facebook Login: Two weeks ago, Facebook announced changes to Facebook Login. As of today, Facebook will need to approve all apps that request access to information such as check-ins, likes, photos, posts, videos, events and groups. Additionally, the company will no longer allow apps to ask for access to personal information such as religious or political views, relationship status and details, custom friends lists, education and work history, fitness activity, book reading activity, music listening activity, news reading, video watch activity, and games activity. Soon, Facebook will also remove a developer’s ability to request data people shared with them if there has been no activity on the app in at least three months.
Instagram Platform API: Facebook is accelerating the deprecation of the Instagram Platform API effective today.
Search and Account Recovery: Previously, people could enter a phone number or email address into Facebook search to help find their profiles. According to Facebook, “malicious actors” have abused these features to scrape public profile information by submitting phone numbers or email addresses. Given the scale and sophistication of the activity, Facebook believes most people on Facebook could have had their public profile scraped in this way. This feature is now disabled. Changes are also coming to account recovery to reduce the risk of scraping.
Call and Text History: Call and text history was part of an opt-in feature for Messenger or Facebook Lite users on Android. Facebook has reviewed this feature to confirm that it does not collect the content of messages. Logs older than one year will be deleted. More so, broader data, such as the time of calls, will no longer be collected.
Data Providers and Partner Categories: Facebook is shuttering Partner Categories, a product that once let third-party data providers offer their targeting directly on Facebook. The company stated that “although this is common industry practice…winding down…will help improve people’s privacy on Facebook”
App Controls: As of April 9th, Facebook will display a link at the top of the News Feed for users to see what apps they use and the information they have shared with those apps. Users will also have streamlined access to remove apps that they no longer need. The company will also tell users whether their information may have been improperly shared with Cambridge Analytica.
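To make the Events API change above a bit more concrete, here is a minimal, hypothetical sketch in Python (using the requests library) of the kind of Graph API call pattern that is affected. The Graph API version, event ID, and access token below are placeholder assumptions for illustration, not values taken from Facebook's documentation, and the exact response a restricted third-party app receives may differ (empty data or a permissions error).

import requests

GRAPH = "https://graph.facebook.com/v2.12"   # assumed API version, for illustration only
EVENT_ID = "1234567890"                      # hypothetical event ID
ACCESS_TOKEN = "EAAB..."                     # placeholder token obtained via Facebook Login

# Basic event metadata (name, time, place) remains readable by an authorized app.
meta = requests.get(
    f"{GRAPH}/{EVENT_ID}",
    params={"fields": "name,start_time,place", "access_token": ACCESS_TOKEN},
)
print(meta.json())

# Before the change, a third-party app could also walk attendance edges such as
# /attending on the event node to read the guest list. After the change, those
# edges no longer return guest-list data to third-party apps.
guests = requests.get(
    f"{GRAPH}/{EVENT_ID}/attending",
    params={"access_token": ACCESS_TOKEN},
)
print(guests.status_code, guests.json())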
Cambridge Analytica may have had data from as many as 87 million people
Facebook also made a startling announcement. After a thorough review, the company believes that Cambridge Analytica may have collected information on as many as 87 million people. 81.6% of these users resided in the United States, with the rest of the affected users scattered across the Philippines, Indonesia, the United Kingdom, Mexico, Canada and India, among others. Original reports from the New York Times estimated that the number of affected users was closer to 50 million.
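For scale, and taking the company’s own figures at face value, that breakdown works out to roughly 0.816 × 87,000,000 ≈ 71 million US-based accounts, leaving roughly 16 million affected users spread across the other countries listed.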
Mark Zuckerberg Faces the Media; Shows Maturity and Also Inexperienced Leadership
In a rare move, Mark Zuckerberg invited press and analysts to a next-day call where he shared details on the company’s latest moves to protect user data, improve the integrity of information shared on the platform and protect users from misinformation. After initially going AWOL following the Cambridge Analytica data snafu, he’s since been on a whirlwind media tour. He genuinely seems to want us to know that he made mistakes, that he’s learning from them and that he’s trying to do the right thing. On our call, he stayed on beyond his allotted time to answer tough questions for the better part of 60 minutes.
From the outset, Mark approached the discussion by acknowledging that he and the rest of Facebook hadn’t done enough to date to prevent its latest fiasco, nor had it done enough to protect user trust.
“It’s clear now that we didn’t do enough in preventing abuse…that goes for fake news, foreign interference, elections, hate speech, in addition to developers and data privacy,” Zuckerberg stated. “We didn’t take a broad enough view what our responsibility is. It was my fault.”
He further pledged to right these wrongs while focusing on protecting user data and ultimately their Facebook experience.
“It’s not enough to just connect people. We have to make sure those connections are positive and that they’re bringing people closer together,” he said. “It’s not enough to give people a voice. We have to make sure that people aren’t using that voice to hurt people or spread disinformation. And it’s not enough to give people tools to manage apps. We have to ensure that all of those developers protect people’s information too. We have to ensure that everyone in our ecosystem protects information.”
Zuckerberg admitted that protecting data is just one piece of the company’s multi-faceted strategy to get the platform back on track. Misinformation, security issues and user-driven polarization still threaten facts, truth and upcoming elections.
He shared some of the big steps Facebook is taking to combat these issues. “Yesterday we took a big action by taking down Russian IRA pages,” he boasted. “Since we became aware of this activity…we’ve been working to root out the IRA to protect the integrity of elections around the world. All in, we now have about 15,000 people working on security and content review and we’ll have more than 20,000 by the end of this year. This is going to be a major focus for us.”
He added, “While we’ve been doing this, we’ve also been tracing back and identifying this network of fake accounts the IRA has been using so we can work to remove them from Facebook entirely. This is the first action that we’ve taken against the IRA and Russia itself. And it included identifying and taking down a Russian news organization. We have more work to do here.”
Highlights, Observations and Findings
This conversation was pretty dense. In fact, it took hours to pore over the conversation just to put this article together. I understand if you don’t have time to read through the entire interview or listen to the full Q&A. To help, I’ve compiled some of the highlights, insights and takeaways from our hour together.
Mark Zuckerberg wants you to know that he’s very sorry. He articulated on several occasions that he feels the weight of his mistakes, mischaracterizations and gross misjudgments on everything…user data, fake news, election tampering, polarization, data scraping, and user trust. He also wants you to know that he’s learning from his mistakes and his priority is fixing these problems while regaining trust moving forward. He sees this as a multi-year strategy of which Facebook is already one year underway.
Facebook now believes that up to 87 million users, not 50 million, mostly in the US, may have been affected by Kogan’s personality quiz app. Facebook does not know the extent to which user data was sold to or used by Cambridge Analytica. This was not a data breach, according to the company. People willingly took Kogan’s quiz.
Facebook has also potentially exposed millions of user profiles to data scraping due to existing API standards on other fronts over the years. The extent of this scraping and how the data was used by third parties is unknown. Facebook has turned off access. Even so, it is unacceptable that this wasn’t taken seriously before. Facebook must own its part in exposing data to bad actors who scraped information for nefarious purposes.
Mark believes that Facebook hasn’t done a good enough job explaining user privacy, how the company makes money and how it does and doesn’t use user content/data. This is changing.
Mark, and the board/shareholders, believe he’s still the right person for the job. Two reporters asked directly whether he’d step down by force or choice. His answer was an emphatic, “no.” His rationale is that this is his ship and he is the one who’s going to fix everything. He stated on several occasions that he wants to do the right thing. While I applaud his “awakening,” he has made some huge missteps as a leader that need more than promises to rectify. I still believe that Facebook would benefit from seasoned, strategic leadership to establish/renew a social contract with users, Facebook and its partners. The company is, after all, fighting wars on multiple fronts. And the company has demonstrated a pattern of either negligence or ignorance in the past and then apologizing afterward. One can assume that this pattern will only continue.
There’s still a fair amount of naïveté in play here when it comes to user trust, data and weaponizing information against Facebook users. Even though the company is aiming to right its wrongs, there’s more that lies ahead that the company and its key players cannot see yet. There’s a history of missing significant events here. And Mark has a history of downplaying these events, acting too late and apologizing after the fact. “I didn’t know” is not a suitable response. Even though the company is making important strides, there’s nothing to make me believe that sophisticated data thieves, information terrorists and shape-shifting scammers aren’t already a step or two ahead of the Facebook team. Remember, following the 2016 election, Mark said it was “crazy” that fake news could somehow sway an election. He’s since recanted that reaction, but it was still his initial response and belief.
Facebook is already taking action against economic actors, government interference and lack of truthfulness, and promises to do more. It has since removed thousands of Russian IRA accounts. Russia has responded that it considers Facebook’s moves “censorship.”
Not everything is Facebook’s fault, according to Facebook. Mark places some of the onus of responsibility on Facebook users who didn’t read the ToS, manage their data settings or fully understand what happens when you put your entire life online. In his view, and it’s a tough pill to swallow, no one forced users to take a personality quiz. No one is forcing people to share every aspect of their life online. While the company is making it easier for users to understand what they’re signing up for and how to manage what they share, people still don’t realize that with this free service comes an agreement that as a user, they are the product and their attention is for sale.
Moving forward, Facebook isn’t as worried about data breaches as it is about user manipulation and psyops. According to Mark, users are more susceptible to “social engineering” threats than to hacking and break-ins. Social engineering is the use of centralized planning and coordinated efforts to manipulate individuals into divulging information for fraudulent purposes. It can also be aimed at manipulating individual perspectives and behaviors and influencing social change (for better or for worse). Users aren’t prepared to fully understand if, when and how they’re susceptible to manipulation, and I’d argue that research needs to be done to understand how we’re influencing one another based on our own cognitive biases and how we choose to share and perceive information in real time.
Facebook really wants you to know that it doesn’t sell user data to advertisers. But, it also acknowledges that it could have done and will do a better job in helping users understand Facebook’s business model. Mark said that users want “better ads” and “better experiences.” In addition to fighting information wars, Facebook is also prioritizing ad targeting, better news feeds, and the creation/delivery of better products and services that users love.
Even though upwards of 87 million users may have been affected by Kogan’s personality quiz, some of that user information was sold to and used by Cambridge Analytica, and user data was also compromised in many other ways for years, the #deletefacebook movement had zero meaningful impact. Still, Mark says that the fact the movement even gained any momentum is “not good.” This leads to a separate but related conversation about user addictiveness and dependency on these platforms, which kill movements such as #deletefacebook before they gain momentum.
Users cannot rely on Facebook, YouTube, Twitter, Reddit, et al., to protect them. The respective leaders of each of these platforms MUST fight bad actors to protect users. At the same time, they are not doing enough. Users are in many ways unwitting pawns in what amounts to not only social engineering, but full-blown information warfare and psyops meant to cause chaos, disruption or worse. Make no mistake: people, their minds and their beliefs are under attack. It’s not just the “bad actors.” We are witnessing true villains who, regardless of intent, damage, abuse and undermine human relationships, truth, and both digital and real-world democracy.
People and their relationships with one another are being radicalized and weaponized right under their noses. No one is teaching people how this even happens. More so, we are still not exposing the secrets of the social design that makes these apps and services addictive. In the face of social disorder, people are still readily sharing everything about themselves online and believe they are in control of their own experiences, situational analyses and resulting emotions. I don’t know that people could really walk away even if they wanted to, and that’s what scares me the most.
Q&A in Full: The Whole Story According to Zuckerberg
Please note that this call was 60 minutes long and what follows is not a complete transcript. I went through the entire conversation to surface key points and context.
David McCabe, Axios: “Given the numbers [around the IRA] have changed so drastically, why should lawmakers and why should users trust that you’re giving them a full and accurate picture now?”
Zuckerberg: “There is going to be more content that we’re going to find over time. As long as there are people employed in Russia who have the job of trying to find ways to exploit these systems, this is going to be a never-ending battle. You never fully solve security, it’s an arms race. In retrospect, we were behind and we didn’t invest in it upfront. I’m confident that we’re making progress against these adversaries. But they’re very sophisticated. It would be a mistake to assume that you can fully solve a problem like this…”
Rory Cellan-Jones, BBC: “Back in November 2016, [you] dismissed as crazy the idea that fake news could have swung the election. Are you taking this seriously enough…?”
Zuckerberg: “Yes. I clearly made a mistake by just dismissing fake news as crazy as [not] having an impact. What I think is clear at this point, is that it was too flippant. I should never have referred to it as crazy. This is clearly a problem that requires careful work…This is an important area of work for us.”
Ian Sherr, CNET: “You just announced 87 million people affected by Cambridge Analytica, how long have you known this number because the 50 million number has been out there for a while. It feels like the data keeps changing on us and we’re not getting a full forthright view of what’s going on here.”
Zuckerberg: “We only just finalized our understanding of the situation in the last couple of days. We didn’t put out the 50 million number…we wanted to wait until we had a full understanding. Just to give you the complete picture on this, we don’t have logs going back for when exactly [Aleksandr] Kogan’s app queried for everyone’s friends…We wanted to take a broad view and a conservative estimate. I’m quite confident given our analysis, that it is not more than 87 million. It very well could be less…”
David Ingram, Reuters: “…Why weren’t there audits of the use of the social graph API years ago, between the 2010–2015 period?”
Zuckerberg: “In retrospect, I think we should have been doing more all along. Just to speak to how we were thinking about it at the time, as just a matter of explanation, I’m not trying to defend this now…I think our view in a number of aspects of our relationship with people was that our job was to give them tools and that it was largely people’s responsibility in how they chose to use them…I think it was wrong in retrospect to have that limited of a view but the reason why we acted the way that we did was because I think we viewed when someone chose to share their data and then the platform acted in a way that it was designed with the personality quiz app, our view is that, yes, Kogan broke the policies. And, he broke expectations, but also people chose to share that data with them. But today, given what we know, not just about developers, but across all of our tools and just across what our place in society is, it’s such a big service that’s so central in people’s lives, I think we understand that we need to take a broader view of our responsibility. We’re not just building tools that we have to take responsibility for the outcomes in how people use those tools as well. That’s why we didn’t do it at the time. Knowing what I know today, clearly we should have done more and we will going forward.”
Cecilia King, NY Times: “Mark, you have indicated that you could be comfortable with some sort of regulation. I’d like to ask you about privacy regulations that are about to take effect in Europe…GDPR. Would you be comfortable with those types of data protection regulation in the U.S. and with global users.”
Zuckerberg: “Regulations like the GDPR are very positive…We intend to make all the same controls and settings everywhere not just Europe.”
Tony Romm, Washington Post: “Do you believe that this [data scraping] was all in violation of your 2011 settlement with the FTC?”
Zuckerberg: “We’ve worked hard to make sure that we comply with it. The reality here is that we have to take a broader view of our responsibility, rather than just legal responsibility. We’re focused on doing the right thing and making sure people’s information is protected. We’re doing investigations, we’re locking down the platform, etc. I think our responsibilities to the people who use Facebook are greater than what’s written in that order and that’s the standard that I want to hold us to.”
Hannah Kuchler, Financial Times: “Investors have raised a lot of concerns about whether this is the result of corporate governance issues at Facebook. Has the board discussed whether you should step down as chairman?”
Zuckerberg: “Ahhh, not that I’m aware of.”
Alexis Madrigal, Atlantic: “Have you ever made a decision that benefitted Facebook’s business but not the community?”
Zuckerberg: “The thing that makes our product challenging to manage and operate are not the trade offs between people and the business, I actually think that those are quite easy, because over the long term the business will be better if you serve people. I just think it would be near sighted to focus on short term revenue over what value to people is and I don’t think we’re that short-sighted. All of the hard decisions we have to make are actually trade-offs between people. One of the big differences between the type of product we’re building, which is why I refer to it as a community and what do I think some of the specific governance issues we have are that different people who use..
aracecvliwest · 6 years
Text
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiativesto restrict data access and also a renewed selfie awareness to focus efforts on protecting people on the platform. What’s more notable however is that Mark Zuckerberg also hosted a last-minute, rare town hall with media…
View On WordPress
0 notes
jeanshesallenberger · 6 years
Text
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiatives to restrict data access and also a renewed selfie awareness to focus efforts on protecting people on the platform. What’s more notable however is that Mark Zuckerberg also hosted a last-minute, rare town hall with media and analysts to explain these efforts and also take tough questions for the better part of an hour.
Let’s start with the company’s news on data restrictions.
To better protect Facebook user information, the company is making the following changes across nine priority areas over the coming months (Sourced from Facebook):
Events API: Until today, people could grant an app permission to get information about events they host or attend, including private events. Doing so allowed users to add Facebook Events to calendar, ticketing or other apps. According to the company, Facebook Events carry information about other people’s attendance as well as posts on the event wall. As of today, apps using the API can no longer access the guest list or posts on the event wall.
Groups API: Currently apps need permission of a group admin or member to access group content for closed groups. For secret groups, apps need the permission of an admin. However, groups contain information about people and conversations and Facebook wants to make sure everything is protected. Moving forward, all third-party apps using the Groups API will need approval from Facebook and an admin to ensure they benefit the group. Apps will no longer be able to access the member list of a group. Facebook is also removing personal information, such as names and profile photos, attached to posts or comments.
Pages API: Previously, third party apps could use the Pages API to read posts or comments from any Page. Doing so lets developers create tools to help Page owners perform common tasks such as schedule posts and reply to comments or messages. At the same time, it also let apps access more data than necessary. Now, Facebook wants to ensure that Page information is only available to apps providing useful services to our community. All future access to the Pages API will need to be approved by Facebook.
Facebook Login: Two weeks, Facebook announced changes to Facebook Login. As of today, Facebook will need to approve all apps that request access to information such as check-ins, likes, photos, posts, videos, events and groups. Additionally, the company no longer allow apps to ask for access to personal information such as religious or political views, relationship status and details, custom friends lists, education and work history, fitness activity, book reading activity, music listening activity, news reading, video watch activity, and games activity. Soon, Facebook will also remove a developer’s ability to request data people shared with them if there has been no activity on the app in at least three months.
Instagram Platform API: Facebook is accelerating the deprecation of the Instagram Platform API effective today.
Search and Account Recovery: Previously, people could enter a phone number or email address into Facebook search to help find their profiles. According to Facebook, “malicious actors” have abused these features to scrape public profile information by submitting phone numbers or email addresses. Given the scale and sophistication of the activity, Facebook believes most people on Facebook could have had their public profile scraped in this way. This feature is now disabled. Changes are also coming to account recovery to also reduce the risk of scraping. 
Call and Text History: Call and text history was part of an opt-in feature for Messenger or Facebook Lite users on Android. Facebook has reviewed this feature to confirm that it does not collect the content of messages. Logs older than one year will be deleted. More so, broader data, such as the time of calls, will no longer be collected.
Data Providers and Partner Categories: Facebook is shuttering Partner Categories, a product that once let third-party data providers offer their targeting directly on Facebook. The company stated that “although this is common industry practice…winding down…will help improve people’s privacy on Facebook”
App Controls: As of April 9th, Facebook display a link at the top of the News Feed for users to see what apps they use and the information they have shared with those apps. Users will also have streamlined access to remove apps that they no longer need. The company will reveal if information may have been improperly shared with Cambridge Analytica.
Cambridge Analytica may have had data from as many as 87 million people
Facebook also made a startling announcement. After thorough review, the company believes that Cambridge Analytica may have collected information on as many as 87 million people. 81.6% of these users resided in the United Sates with the rest of the affected users scattered across the Philippines, Indonesia, United Kingdom, Mexico, Canada, India, among others. Original reports from the New York Times estimated that the number of affected users was closer to 50 million.
Mark Zuckerberg Faces the Media; Shows Maturity and Also Inexperienced Leadership
In a rare move, Mark Zuckerberg invited press and analysts to a next-day call where he shared details on the company’s latest moves to protect user data, improve the integrity of information shared on the platform and protect users from misinformation. After initially going AWOL following the Cambridge Analytic data SNAFU, he’s since been on a whirlwind media tour. He genuinely seems to want us to know that he made mistakes, that he’s learning from them and that he’s trying to do the right thing. On our call, he stayed on beyond his allotted time to answer tough questions for the better part of 60 minutes.
From the onset, Mark approached the discussion by acknowledging that he and the rest of Facebook hadn’t done enough to date to prevent its latest fiasco nor had it done enough to protect user trust.
“It’s clear now that we didn’t do enough in preventing abuse…that goes for fake news, foreign interference, elections, hate speech, in addition to developers and data privacy,” Zuckerberg stated. “We didn’t take a broad enough view what our responsibility is. It was my fault.”
He further pledged to right these wrongs while focusing on protecting user data and ultimately their Facebook experience.
“It’s not enough to just connect people. We have to make sure those connections are positive and that they’re bringing people closer together,” he said. “It’s not enough to give people a voice. We have to make sure that people aren’t using that voice to hurt people or spread disinformation. And it’s not enough to give people tools to manage apps. We have to ensure that all of those developers protect people’s information too. We have to ensure that everyone in our ecosystem protects information.”
Zuckerberg admitted that protecting data is just one piece of the company’s multi-faceted strategy to get the platform back on track. Misinformation, security issues and user-driven polarization still threaten facts, truth and upcoming elections.
He shared some of the big steps Facebook is taking to combat these issues. “Yesterday we took a big action by taking down Russian IRA pages,” he boasted. “Since we became aware of this activity…we’ve been working to root out the IRA to protect the integrity of elections around the world. All in, we now have about 15,000 people working on security and content review and we’ll have more than 20,000 by the end of this year. This is going to be a major focus for us.”
He added, “While we’ve been doing this, we’ve also been tracing back and identifying this network of fake accounts the IRA has been using so we can work to remove them from Facebook entirely. This is the first action that we’ve taken against the IRA and Russia itself. And it included identifying and taking down a Russian news organization. We have more work to do here.”
Highlights, Observations and Findings
This conversation was pretty dense. In fact, it took hours to pour over the conversation just to put this article together. I understand if you don’t have time to read through the entire interview or listen to the full Q&A. To help, I’ve some of the highlights, insights and takeaways from our hour together.
Mark Zuckerberg wants you to know that he’s very sorry. He articulated on several occasions that he feels the weight of his mistakes, mischaracterizations and gross misjudgments on everything…user data, fake news, election tampering, polarization, data scraping, and user trust. He also wants you to know that he’s learning from his mistakes and his priority is fixing these problems while regaining trust moving forward. He sees this as a multi-year strategy of which Facebook is already one year underway.
Facebook now believes that up to 87 million users, not 50, mostly in the US, may have been affected by Kogan’s personality quiz app. Facebook does not know the extent of which user data was sold to or used by Cambridge Analytica. This was not a data breach according to the company. People willingly took Kogan’s quiz.
Facebook has also potentially exposed millions of user profiles to data scraping due to existing API standards on other fronts over the years. The extent of this scraping and how data was used by third parties is unknown. Facebook has turned off access. Even still, it is unacceptable that it wasn’t taken seriously before. Facebook must own its part in exposing data to bad actors who scraped information for nefarious purposes.
Mark believes that Facebook hasn’t done a good enough job explaining user privacy, how the company makes money and how it does and doesn’t use user content/data. This is changing.
Mark, and the board/shareholders, believe, he’s still the right person for the job. Two reporters asked directly whether he’d step down by force or choice. His answer was an emphatic, “no.” His rationale is that this is his ship and he is the one who’s going to fix everything. He stated on several occasions that he wants to do the right thing. While I applaud his “awakening,” he has made some huge missteps as a leader that need more than promises to rectify. I still believe that Facebook would benefit from seasoned, strategic leadership to establish/renew a social contract with users, Facebook and its partners. The company is after all, fighting wars on multiple fronts. And the company has demonstrated a pattern of either negligence or ignorance in the past and then apologizing afterward. One can assume that this pattern will only continue.
There’s still a fair amount naïveté in play here when it comes to user trust, data and weaponizing information against Facebook users. Even though the company is aiming to right its wrongs, there’s more that lies ahead that the company and its key players cannot see yet. There’s a history of missing significant events here. And, Mark has a history of downplaying these events, acting too late and apologizing after the fact. “I didn’t know” is not a suitable response. Even though the company is making important strides, there’s nothing to make me believe that sophisticated data thieves, information terrorists and shape-shifting scammers aren’t already a step or two ahead of the Facebook team. Remember, following the 2016 election, Mark said it was “crazy” that fake news could somehow sway an election. He’s since recanted that reaction, but it was still his initial response and belief.
Facebook is already taking action against economic actors, government interference and lack of truthfulness and promises to do more. Its since removed thousands of Russian IRA accounts. Russia has responded that Facebook’s moves are considered “censorship.”
Not everything is Facebook’s fault, according to Facebook. Mark places some of the onus ofresponsibility on Facebook userswho didn’t read the ToS, manage their data settings or fully understand what happens when you put your entire life online. In his view, and it’s a tough pill to swallow, no one forced users to take a personality quiz. No one is forcing people to share every aspect of their life online. While the company is making it easier for users to understand what they’re signing up for and how to manage what they share, people still don’t realize that with this free service comes an agreement that as a user, they are the product and their attention is for sale.
Moving forward, Facebook isn’t as worried about data breaches as it is about user manipulation and psyops. According to Mark, users are more likely susceptible to “social engineering” threats over hacking and break-ins. Social engineering is the use of centralized planning and coordinated efforts to manipulate individuals to divulge information for fraudulent purposes. This can also be aimed at manipulating individual perspectives, behaviors and also influencing social change (for better or for worse.) Users aren’t prepared to fully understand if, when and how they’re succeptible to manipulation and I’d argue that research needs to be done in understanding how we’re influencing one another based on our own cognitive biases and how we choose to share and perceive information in real-time.
Facebook really wants you to know that it doesn’t sell user data to advertisers. But, it also acknowledges that it could have done and will do a better job in helping users understand Facebook’s business model. Mark said that users want “better ads” and “better experiences.” In addition to fighting information wars, Facebook is also prioritizing ad targeting, better news feeds, and the creation/delivery of better products and services that users love.
Even though upwards of 87 million users may have been affected by Kogan’s personality quiz and some of that user information was sold to and used by Cambridge Analytica, and that user data was also compromised in many other ways for years, the #deletefacebook movement had zero meaningful impact. But still, Mark says that the fact the movement even gained any momentum is “not good.” This leads to a separate but related conversation about useraddictiveness and dependencyon these platforms that kill movements such as #deletefacebook before they gain momentum.
Users cannot rely on Facebook, Youtube, Twitter, Reddit, et al., to protect them. Respective leaders of each of these platforms MUST fight bad actors to protect users. At the same time, they are not doing enough. Users are in many ways, unwitting pawns in what amounts to not only social engineering, but full-blown information warfare and psyops to cause chaos, disruption or worse. Make no mistake, people, their minds and their beliefs are under attack. It’s not just the “bad actors.” We are witnessing true villains, regardless of intent, damage, abuse and undermine human relationships, truth and digital and real-world democracy.
People and their relationships with one another are being radicalized and weaponized right under their noses. No one is teaching people how this even happens. More so, we are still not exposing the secrets of social design that makes these apps and servicesaddictive. In the face of social disorder, people ar still readily sharing everything about themselves online and believe they are in control of their own experiences, situational analyses and resulting emotions. I don’t know that people could really walk away even if they wanted to and that’s what scares me the most.
Q&A in Full: The Whole Story According to Zuckerberg
Please note that this call was 60 minutes long and what follows is not a complete transcript. I went through the entire conversation to surface key points and context.
David McCabe, Axios: “Given the numbers [around the IRA] have changed so drastically, Why should lawmakers and why should users trust that you’re giving them a full and accurate picture now?”
Zuckerberg: “There is going to be more content that we’re going to find over time. As long as there are people employed in Russia who have the job of trying to find ways to exploit these systems, this is going to be a never-ending battle. You never fully solve security, it’s an arms race. In retrospect, we were behind and we didn’t invest in it upfront. I’m confident that we’re making progress against these adversaries. But they’re very sophisticated. It would be a mistake to assume that you can fully solve a problem like this…”
Rory Cellan-Jones, BBC: “Back in November 2016, dismissed as crazy that fake news could have swung the election. Are you taking this seriously enough…?”
Zuckerberg: “Yes. I clearly made a mistake by just dismissing fake news as crazy as [not] having an impact. What I think is clear at this point, is that it was too flippant. I should never have referred to it as crazy. This is clearly a problem that requires careful work…This is an important area of work for us.”
Ian Sherr, CNET: “You just announced 87 million people affected by Cambridge Analytica, how long have you known this number because the 50 million number has been out there for a while. It feels like the data keeps changing on us and we’re not getting a full forthright view of what’s going on here.”
Zuckerberg: “We only just finalized our understanding of the situation in the last couple of days. We didn’t put out the 50 million number…we wanted to wait until we had a full understanding. Just to give you the complete picture on this, we don’t have logs going back for when exactly [Aleksandr] Kogan’s app queried for everyone’s friends…We wanted to take a broad view and a conservative estimate. I’m quite confident given our analysis, that it is not more than 87 million. It very well could be less…”
David Ingram, Reuters: “…Why weren’t there audits of the use of the social graph API years ago between the 2010 – 2015 period.
Zuckerberg: “In retrospect, I think we should have been doing more all along. Just to speak to how we were thinking about it at the time, as just a matter of explanation, I’m not trying to defend this now…I think our view in a number of aspects of our relationship with people was that our job was to give them tools and that it was largely people’s responsibility in how they chose to use them…I think it was wrong in retrospect to have that limited of a view but the reason why we acted the way that we did was because I think we viewed when someone chose to share their data and then the platform acted in a way that it was designed with the personality quiz app, our view is that, yes, Kogan broke the policies. And, he broke expectations, but also people chose to share that data with them. But today, given what we know, not just about developers, but across all of our tools and just across what our place in society is, it’s such a big service that’s so central in people’s lives, I think we understand that we need to take a broader view of our responsibility. We’re not just building tools that we have to take responsibility for the outcomes in how people use those tools as well. That’s why we didn’t do it at the time. Knowing what I know today, clearly we should have done more and we will going forward.
Cecilia King, NY Times: “Mark, you have indicated that you could be comfortable with some sort of regulation. I’d like to ask you about privacy regulations that are about to take effect in Europe…GDPR. Would you be comfortable with those types of data protection regulation in the U.S. and with global users.”
Zuckerberg: “Regulations like the GDPR are very positive…We intend to make all the same controls and settings everywhere not just Europe.”
Tony Romm, Washington Post: “Do you believe that this [data scraping] was all in violation of your 2011 settlement with the FTC?”
Zuckerberg: “We’ve worked hard to make sure that we comply with it. The reality here is that we have to take a broader view of our responsibility, rather than just legal responsibility. We’re focused on doing the right thing and making sure people’s information is protected. We’re doing investigations, we’re locking down the platform, etc. I think our responsibilities to the people who use Facebook are greater than what’s written in that order and that’s the standard that I want to hold us to.”
Hannah Kuchler, Financial Times, “Investors have raised a lot of concerns about whether this is the result of corporate governance issues at Facebook. Has the board discussed whether you should step down as chairman?”
Zuckerberg: “Ahhh, not that I’m aware of.”
Alexis Madrigal, Atlantic: “Have you ever made a decision that benefitted Facebook’s business but not the community.”
Zuckerberg: “The thing that makes our product challenging to manage and operate are not the trade offs between people and the business, I actually think that those are quite easy, because over the long term the business will be better if you serve people. I just think it would be near sighted to focus on short term revenue over what value to people is and I don’t think we’re that short-sighted. All of the hard decisions we have to make are actually trade-offs between people. One of the big differences between the type of product we’re building, which is why I refer to it as a community and what do I think some of the specific governance issues we have are that different people who use..
https://ift.tt/2uM8Fio
0 notes
waltercostellone · 6 years
Text
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiatives to restrict data access and also a renewed selfie awareness to focus efforts on protecting people on the platform. What’s more notable however is that Mark Zuckerberg also hosted a last-minute, rare town hall with media and analysts to explain these efforts and also take tough questions for the better part of an hour.
Let’s start with the company’s news on data restrictions.
To better protect Facebook user information, the company is making the following changes across nine priority areas over the coming months (Sourced from Facebook):
Events API: Until today, people could grant an app permission to get information about events they host or attend, including private events. Doing so allowed users to add Facebook Events to calendar, ticketing or other apps. According to the company, Facebook Events carry information about other people’s attendance as well as posts on the event wall. As of today, apps using the API can no longer access the guest list or posts on the event wall.
Groups API: Currently apps need permission of a group admin or member to access group content for closed groups. For secret groups, apps need the permission of an admin. However, groups contain information about people and conversations and Facebook wants to make sure everything is protected. Moving forward, all third-party apps using the Groups API will need approval from Facebook and an admin to ensure they benefit the group. Apps will no longer be able to access the member list of a group. Facebook is also removing personal information, such as names and profile photos, attached to posts or comments.
Pages API: Previously, third party apps could use the Pages API to read posts or comments from any Page. Doing so lets developers create tools to help Page owners perform common tasks such as schedule posts and reply to comments or messages. At the same time, it also let apps access more data than necessary. Now, Facebook wants to ensure that Page information is only available to apps providing useful services to our community. All future access to the Pages API will need to be approved by Facebook.
Facebook Login: Two weeks, Facebook announced changes to Facebook Login. As of today, Facebook will need to approve all apps that request access to information such as check-ins, likes, photos, posts, videos, events and groups. Additionally, the company no longer allow apps to ask for access to personal information such as religious or political views, relationship status and details, custom friends lists, education and work history, fitness activity, book reading activity, music listening activity, news reading, video watch activity, and games activity. Soon, Facebook will also remove a developer’s ability to request data people shared with them if there has been no activity on the app in at least three months.
Instagram Platform API: Facebook is accelerating the deprecation of the Instagram Platform API effective today.
Search and Account Recovery: Previously, people could enter a phone number or email address into Facebook search to help find their profiles. According to Facebook, “malicious actors” have abused these features to scrape public profile information by submitting phone numbers or email addresses. Given the scale and sophistication of the activity, Facebook believes most people on Facebook could have had their public profile scraped in this way. This feature is now disabled. Changes are also coming to account recovery to also reduce the risk of scraping. 
Call and Text History: Call and text history was part of an opt-in feature for Messenger or Facebook Lite users on Android. Facebook has reviewed this feature to confirm that it does not collect the content of messages. Logs older than one year will be deleted. More so, broader data, such as the time of calls, will no longer be collected.
Data Providers and Partner Categories: Facebook is shuttering Partner Categories, a product that once let third-party data providers offer their targeting directly on Facebook. The company stated that “although this is common industry practice…winding down…will help improve people’s privacy on Facebook.”
App Controls: As of April 9th, Facebook will display a link at the top of the News Feed for users to see what apps they use and the information they have shared with those apps. Users will also have streamlined access to remove apps that they no longer need. The company will reveal if information may have been improperly shared with Cambridge Analytica.
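To make the practical effect of these restrictions more concrete, here is a minimal, hypothetical Python sketch of the kind of third-party Graph API call that now sits behind Facebook’s app review. The API version, endpoint edge name and error shape below are illustrative assumptions, not details taken from Facebook’s announcement or documentation.

```python
# Hypothetical sketch only: the Graph API version, edge name and error shape
# here are assumptions for illustration, not verified Facebook specifics.
import requests

GRAPH_BASE = "https://graph.facebook.com/v2.12"  # assumed version string


def fetch_event_guest_list(event_id: str, access_token: str) -> list:
    """Try to read an event's guest list on behalf of an app.

    Before these changes, an app a user had authorized could pull other
    attendees' information this way; afterwards, unapproved apps should
    expect an error payload instead of data.
    """
    response = requests.get(
        f"{GRAPH_BASE}/{event_id}/attending",   # illustrative edge name
        params={"access_token": access_token},
        timeout=10,
    )
    payload = response.json()
    if "error" in payload:
        # Unreviewed apps land here rather than silently receiving
        # other people's attendance data.
        raise PermissionError(payload["error"].get("message", "access denied"))
    return payload.get("data", [])


if __name__ == "__main__":
    # Usage sketch; "EVENT_ID" and "APP_TOKEN" are placeholders.
    try:
        guests = fetch_event_guest_list("EVENT_ID", "APP_TOKEN")
        print(f"{len(guests)} guests visible to this app")
    except PermissionError as err:
        print(f"Blocked by platform policy: {err}")
```

The point of the sketch is the shape of the change: a request that once quietly returned other people’s data should now fail for apps Facebook has not approved.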
Cambridge Analytica may have had data from as many as 87 million people
Facebook also made a startling announcement. After thorough review, the company believes that Cambridge Analytica may have collected information on as many as 87 million people. 81.6% of these users resided in the United States, with the rest of the affected users scattered across the Philippines, Indonesia, United Kingdom, Mexico, Canada and India, among others. Original reports from the New York Times estimated that the number of affected users was closer to 50 million.
Mark Zuckerberg Faces the Media; Shows Maturity and Also Inexperienced Leadership
In a rare move, Mark Zuckerberg invited press and analysts to a next-day call where he shared details on the company’s latest moves to protect user data, improve the integrity of information shared on the platform and protect users from misinformation. After initially going AWOL following the Cambridge Analytica data snafu, he’s since been on a whirlwind media tour. He genuinely seems to want us to know that he made mistakes, that he’s learning from them and that he’s trying to do the right thing. On our call, he stayed on beyond his allotted time to answer tough questions for the better part of 60 minutes.
From the outset, Mark approached the discussion by acknowledging that he and the rest of Facebook hadn’t done enough to date to prevent its latest fiasco, nor enough to protect user trust.
“It’s clear now that we didn’t do enough in preventing abuse…that goes for fake news, foreign interference, elections, hate speech, in addition to developers and data privacy,” Zuckerberg stated. “We didn’t take a broad enough view what our responsibility is. It was my fault.”
He further pledged to right these wrongs while focusing on protecting user data and ultimately their Facebook experience.
“It’s not enough to just connect people. We have to make sure those connections are positive and that they’re bringing people closer together,” he said. “It’s not enough to give people a voice. We have to make sure that people aren’t using that voice to hurt people or spread disinformation. And it’s not enough to give people tools to manage apps. We have to ensure that all of those developers protect people’s information too. We have to ensure that everyone in our ecosystem protects information.”
Zuckerberg admitted that protecting data is just one piece of the company’s multi-faceted strategy to get the platform back on track. Misinformation, security issues and user-driven polarization still threaten facts, truth and upcoming elections.
He shared some of the big steps Facebook is taking to combat these issues. “Yesterday we took a big action by taking down Russian IRA pages,” he boasted. “Since we became aware of this activity…we’ve been working to root out the IRA to protect the integrity of elections around the world. All in, we now have about 15,000 people working on security and content review and we’ll have more than 20,000 by the end of this year. This is going to be a major focus for us.”
He added, “While we’ve been doing this, we’ve also been tracing back and identifying this network of fake accounts the IRA has been using so we can work to remove them from Facebook entirely. This is the first action that we’ve taken against the IRA and Russia itself. And it included identifying and taking down a Russian news organization. We have more work to do here.”
Highlights, Observations and Findings
This conversation was pretty dense. In fact, it took hours to pore over the conversation just to put this article together. I understand if you don’t have time to read through the entire interview or listen to the full Q&A. To help, I’ve pulled together some of the highlights, insights and takeaways from our hour together.
Mark Zuckerberg wants you to know that he’s very sorry. He articulated on several occasions that he feels the weight of his mistakes, mischaracterizations and gross misjudgments on everything…user data, fake news, election tampering, polarization, data scraping, and user trust. He also wants you to know that he’s learning from his mistakes and his priority is fixing these problems while regaining trust moving forward. He sees this as a multi-year strategy of which Facebook is already one year underway.
Facebook now believes that up to 87 million users, not 50 million, mostly in the US, may have been affected by Kogan’s personality quiz app. Facebook does not know the extent to which user data was sold to or used by Cambridge Analytica. This was not a data breach, according to the company. People willingly took Kogan’s quiz.
Facebook has also potentially exposed millions of user profiles to data scraping due to existing API standards on other fronts over the years. The extent of this scraping and how data was used by third parties is unknown. Facebook has turned off access. Even still, it is unacceptable that it wasn’t taken seriously before. Facebook must own its part in exposing data to bad actors who scraped information for nefarious purposes.
Mark believes that Facebook hasn’t done a good enough job explaining user privacy, how the company makes money and how it does and doesn’t use user content/data. This is changing.
Mark, and the board/shareholders, believe he’s still the right person for the job. Two reporters asked directly whether he’d step down by force or choice. His answer was an emphatic “no.” His rationale is that this is his ship and he is the one who’s going to fix everything. He stated on several occasions that he wants to do the right thing. While I applaud his “awakening,” he has made some huge missteps as a leader that need more than promises to rectify. I still believe that Facebook would benefit from seasoned, strategic leadership to establish/renew a social contract with users, Facebook and its partners. The company is, after all, fighting wars on multiple fronts. And the company has demonstrated a pattern of either negligence or ignorance in the past and then apologizing afterward. One can assume that this pattern will only continue.
There’s still a fair amount of naïveté in play here when it comes to user trust, data and weaponizing information against Facebook users. Even though the company is aiming to right its wrongs, there’s more that lies ahead that the company and its key players cannot see yet. There’s a history of missing significant events here. And Mark has a history of downplaying these events, acting too late and apologizing after the fact. “I didn’t know” is not a suitable response. Even though the company is making important strides, there’s nothing to make me believe that sophisticated data thieves, information terrorists and shape-shifting scammers aren’t already a step or two ahead of the Facebook team. Remember, following the 2016 election, Mark said it was “crazy” that fake news could somehow sway an election. He’s since recanted that reaction, but it was still his initial response and belief.
Facebook is already taking action against economic actors, government interference and lack of truthfulness, and promises to do more. It’s since removed thousands of Russian IRA accounts. Russia has responded that Facebook’s moves amount to “censorship.”
Not everything is Facebook’s fault, according to Facebook. Mark places some of the onus of responsibility on Facebook users who didn’t read the ToS, manage their data settings or fully understand what happens when you put your entire life online. In his view, and it’s a tough pill to swallow, no one forced users to take a personality quiz. No one is forcing people to share every aspect of their life online. While the company is making it easier for users to understand what they’re signing up for and how to manage what they share, people still don’t realize that with this free service comes an agreement that as a user, they are the product and their attention is for sale.
Moving forward, Facebook isn’t as worried about data breaches as it is about user manipulation and psyops. According to Mark, users are more susceptible to “social engineering” threats than to hacking and break-ins. Social engineering is the use of centralized planning and coordinated efforts to manipulate individuals into divulging information for fraudulent purposes. It can also be aimed at manipulating individual perspectives and behaviors and at influencing social change (for better or for worse). Users aren’t prepared to fully understand if, when and how they’re susceptible to manipulation, and I’d argue that research needs to be done on how we influence one another based on our own cognitive biases and on how we choose to share and perceive information in real time.
Facebook really wants you to know that it doesn’t sell user data to advertisers. But, it also acknowledges that it could have done and will do a better job in helping users understand Facebook’s business model. Mark said that users want “better ads” and “better experiences.” In addition to fighting information wars, Facebook is also prioritizing ad targeting, better news feeds, and the creation/delivery of better products and services that users love.
Even though upwards of 87 million users may have been affected by Kogan’s personality quiz, some of that user information was sold to and used by Cambridge Analytica, and user data was also compromised in many other ways for years, the #deletefacebook movement had zero meaningful impact. Still, Mark says that the fact the movement even gained any momentum is “not good.” This leads to a separate but related conversation about user addictiveness and dependency on these platforms, which kill movements such as #deletefacebook before they gain momentum.
Users cannot rely on Facebook, YouTube, Twitter, Reddit, et al., to protect them. Respective leaders of each of these platforms MUST fight bad actors to protect users. At the same time, they are not doing enough. Users are in many ways unwitting pawns in what amounts to not only social engineering, but full-blown information warfare and psyops to cause chaos, disruption or worse. Make no mistake, people, their minds and their beliefs are under attack. It’s not just the “bad actors.” We are witnessing true villains who, regardless of intent, damage, abuse and undermine human relationships, truth and digital and real-world democracy.
People and their relationships with one another are being radicalized and weaponized right under their noses. No one is teaching people how this even happens. More so, we are still not exposing the secrets of social design that makes these apps and services addictive. In the face of social disorder, people are still readily sharing everything about themselves online and believe they are in control of their own experiences, situational analyses and resulting emotions. I don’t know that people could really walk away even if they wanted to, and that’s what scares me the most.
Q&A in Full: The Whole Story According to Zuckerberg
Please note that this call was 60 minutes long and what follows is not a complete transcript. I went through the entire conversation to surface key points and context.
David McCabe, Axios: “Given the numbers [around the IRA] have changed so drastically, why should lawmakers and why should users trust that you’re giving them a full and accurate picture now?”
Zuckerberg: “There is going to be more content that we’re going to find over time. As long as there are people employed in Russia who have the job of trying to find ways to exploit these systems, this is going to be a never-ending battle. You never fully solve security, it’s an arms race. In retrospect, we were behind and we didn’t invest in it upfront. I’m confident that we’re making progress against these adversaries. But they’re very sophisticated. It would be a mistake to assume that you can fully solve a problem like this…”
Rory Cellan-Jones, BBC: “Back in November 2016, [you] dismissed as crazy [the idea] that fake news could have swung the election. Are you taking this seriously enough…?”
Zuckerberg: “Yes. I clearly made a mistake by just dismissing fake news as crazy as [not] having an impact. What I think is clear at this point, is that it was too flippant. I should never have referred to it as crazy. This is clearly a problem that requires careful work…This is an important area of work for us.”
Ian Sherr, CNET: “You just announced 87 million people affected by Cambridge Analytica, how long have you known this number because the 50 million number has been out there for a while. It feels like the data keeps changing on us and we’re not getting a full forthright view of what’s going on here.”
Zuckerberg: “We only just finalized our understanding of the situation in the last couple of days. We didn’t put out the 50 million number…we wanted to wait until we had a full understanding. Just to give you the complete picture on this, we don’t have logs going back for when exactly [Aleksandr] Kogan’s app queried for everyone’s friends…We wanted to take a broad view and a conservative estimate. I’m quite confident given our analysis, that it is not more than 87 million. It very well could be less…”
David Ingram, Reuters: “…Why weren’t there audits of the use of the social graph API years ago, during the 2010–2015 period?”
Zuckerberg: “In retrospect, I think we should have been doing more all along. Just to speak to how we were thinking about it at the time, as just a matter of explanation, I’m not trying to defend this now…I think our view in a number of aspects of our relationship with people was that our job was to give them tools and that it was largely people’s responsibility in how they chose to use them…I think it was wrong in retrospect to have that limited of a view but the reason why we acted the way that we did was because I think we viewed when someone chose to share their data and then the platform acted in a way that it was designed with the personality quiz app, our view is that, yes, Kogan broke the policies. And, he broke expectations, but also people chose to share that data with them. But today, given what we know, not just about developers, but across all of our tools and just across what our place in society is, it’s such a big service that’s so central in people’s lives, I think we understand that we need to take a broader view of our responsibility. We’re not just building tools; we have to take responsibility for the outcomes in how people use those tools as well. That’s why we didn’t do it at the time. Knowing what I know today, clearly we should have done more and we will going forward.”
Cecilia Kang, NY Times: “Mark, you have indicated that you could be comfortable with some sort of regulation. I’d like to ask you about privacy regulations that are about to take effect in Europe…GDPR. Would you be comfortable with those types of data protection regulation in the U.S. and with global users?”
Zuckerberg: “Regulations like the GDPR are very positive…We intend to make all the same controls and settings everywhere not just Europe.”
Tony Romm, Washington Post: “Do you believe that this [data scraping] was all in violation of your 2011 settlement with the FTC?”
Zuckerberg: “We’ve worked hard to make sure that we comply with it. The reality here is that we have to take a broader view of our responsibility, rather than just legal responsibility. We’re focused on doing the right thing and making sure people’s information is protected. We’re doing investigations, we’re locking down the platform, etc. I think our responsibilities to the people who use Facebook are greater than what’s written in that order and that’s the standard that I want to hold us to.”
Hannah Kuchler, Financial Times: “Investors have raised a lot of concerns about whether this is the result of corporate governance issues at Facebook. Has the board discussed whether you should step down as chairman?”
Zuckerberg: “Ahhh, not that I’m aware of.”
Alexis Madrigal, Atlantic: “Have you ever made a decision that benefitted Facebook’s business but not the community?”
Zuckerberg: “The thing that makes our product challenging to manage and operate are not the trade-offs between people and the business, I actually think that those are quite easy, because over the long term the business will be better if you serve people. I just think it would be nearsighted to focus on short-term revenue over what value to people is and I don’t think we’re that short-sighted. All of the hard decisions we have to make are actually trade-offs between people. One of the big differences between the type of product we’re building, which is why I refer to it as a community and what do I think some of the specific governance issues we have are that different people who use…
https://ift.tt/2uM8Fio
0 notes
joannlyfgnch · 6 years
Text
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiatives to restrict data access and also a renewed selfie awareness to focus efforts on protecting people on the platform. What’s more notable however is that Mark Zuckerberg also hosted a last-minute, rare town hall with media and analysts to explain these efforts and also take tough questions for the better part of an hour.
Let’s start with the company’s news on data restrictions.
To better protect Facebook user information, the company is making the following changes across nine priority areas over the coming months (Sourced from Facebook):
Events API: Until today, people could grant an app permission to get information about events they host or attend, including private events. Doing so allowed users to add Facebook Events to calendar, ticketing or other apps. According to the company, Facebook Events carry information about other people’s attendance as well as posts on the event wall. As of today, apps using the API can no longer access the guest list or posts on the event wall.
Groups API: Currently apps need permission of a group admin or member to access group content for closed groups. For secret groups, apps need the permission of an admin. However, groups contain information about people and conversations and Facebook wants to make sure everything is protected. Moving forward, all third-party apps using the Groups API will need approval from Facebook and an admin to ensure they benefit the group. Apps will no longer be able to access the member list of a group. Facebook is also removing personal information, such as names and profile photos, attached to posts or comments.
Pages API: Previously, third party apps could use the Pages API to read posts or comments from any Page. Doing so lets developers create tools to help Page owners perform common tasks such as schedule posts and reply to comments or messages. At the same time, it also let apps access more data than necessary. Now, Facebook wants to ensure that Page information is only available to apps providing useful services to our community. All future access to the Pages API will need to be approved by Facebook.
Facebook Login: Two weeks, Facebook announced changes to Facebook Login. As of today, Facebook will need to approve all apps that request access to information such as check-ins, likes, photos, posts, videos, events and groups. Additionally, the company no longer allow apps to ask for access to personal information such as religious or political views, relationship status and details, custom friends lists, education and work history, fitness activity, book reading activity, music listening activity, news reading, video watch activity, and games activity. Soon, Facebook will also remove a developer’s ability to request data people shared with them if there has been no activity on the app in at least three months.
Instagram Platform API: Facebook is accelerating the deprecation of the Instagram Platform API effective today.
Search and Account Recovery: Previously, people could enter a phone number or email address into Facebook search to help find their profiles. According to Facebook, “malicious actors” have abused these features to scrape public profile information by submitting phone numbers or email addresses. Given the scale and sophistication of the activity, Facebook believes most people on Facebook could have had their public profile scraped in this way. This feature is now disabled. Changes are also coming to account recovery to also reduce the risk of scraping. 
Call and Text History: Call and text history was part of an opt-in feature for Messenger or Facebook Lite users on Android. Facebook has reviewed this feature to confirm that it does not collect the content of messages. Logs older than one year will be deleted. More so, broader data, such as the time of calls, will no longer be collected.
Data Providers and Partner Categories: Facebook is shuttering Partner Categories, a product that once let third-party data providers offer their targeting directly on Facebook. The company stated that “although this is common industry practice…winding down…will help improve people’s privacy on Facebook”
App Controls: As of April 9th, Facebook display a link at the top of the News Feed for users to see what apps they use and the information they have shared with those apps. Users will also have streamlined access to remove apps that they no longer need. The company will reveal if information may have been improperly shared with Cambridge Analytica.
Cambridge Analytica may have had data from as many as 87 million people
Facebook also made a startling announcement. After thorough review, the company believes that Cambridge Analytica may have collected information on as many as 87 million people. 81.6% of these users resided in the United Sates with the rest of the affected users scattered across the Philippines, Indonesia, United Kingdom, Mexico, Canada, India, among others. Original reports from the New York Times estimated that the number of affected users was closer to 50 million.
Mark Zuckerberg Faces the Media; Shows Maturity and Also Inexperienced Leadership
In a rare move, Mark Zuckerberg invited press and analysts to a next-day call where he shared details on the company’s latest moves to protect user data, improve the integrity of information shared on the platform and protect users from misinformation. After initially going AWOL following the Cambridge Analytic data SNAFU, he’s since been on a whirlwind media tour. He genuinely seems to want us to know that he made mistakes, that he’s learning from them and that he’s trying to do the right thing. On our call, he stayed on beyond his allotted time to answer tough questions for the better part of 60 minutes.
From the onset, Mark approached the discussion by acknowledging that he and the rest of Facebook hadn’t done enough to date to prevent its latest fiasco nor had it done enough to protect user trust.
“It’s clear now that we didn’t do enough in preventing abuse…that goes for fake news, foreign interference, elections, hate speech, in addition to developers and data privacy,” Zuckerberg stated. “We didn’t take a broad enough view what our responsibility is. It was my fault.”
He further pledged to right these wrongs while focusing on protecting user data and ultimately their Facebook experience.
“It’s not enough to just connect people. We have to make sure those connections are positive and that they’re bringing people closer together,” he said. “It’s not enough to give people a voice. We have to make sure that people aren’t using that voice to hurt people or spread disinformation. And it’s not enough to give people tools to manage apps. We have to ensure that all of those developers protect people’s information too. We have to ensure that everyone in our ecosystem protects information.”
Zuckerberg admitted that protecting data is just one piece of the company’s multi-faceted strategy to get the platform back on track. Misinformation, security issues and user-driven polarization still threaten facts, truth and upcoming elections.
He shared some of the big steps Facebook is taking to combat these issues. “Yesterday we took a big action by taking down Russian IRA pages,” he boasted. “Since we became aware of this activity…we’ve been working to root out the IRA to protect the integrity of elections around the world. All in, we now have about 15,000 people working on security and content review and we’ll have more than 20,000 by the end of this year. This is going to be a major focus for us.”
He added, “While we’ve been doing this, we’ve also been tracing back and identifying this network of fake accounts the IRA has been using so we can work to remove them from Facebook entirely. This is the first action that we’ve taken against the IRA and Russia itself. And it included identifying and taking down a Russian news organization. We have more work to do here.”
Highlights, Observations and Findings
This conversation was pretty dense. In fact, it took hours to pour over the conversation just to put this article together. I understand if you don’t have time to read through the entire interview or listen to the full Q&A. To help, I’ve some of the highlights, insights and takeaways from our hour together.
Mark Zuckerberg wants you to know that he’s very sorry. He articulated on several occasions that he feels the weight of his mistakes, mischaracterizations and gross misjudgments on everything…user data, fake news, election tampering, polarization, data scraping, and user trust. He also wants you to know that he’s learning from his mistakes and his priority is fixing these problems while regaining trust moving forward. He sees this as a multi-year strategy of which Facebook is already one year underway.
Facebook now believes that up to 87 million users, not 50, mostly in the US, may have been affected by Kogan’s personality quiz app. Facebook does not know the extent of which user data was sold to or used by Cambridge Analytica. This was not a data breach according to the company. People willingly took Kogan’s quiz.
Facebook has also potentially exposed millions of user profiles to data scraping due to existing API standards on other fronts over the years. The extent of this scraping and how data was used by third parties is unknown. Facebook has turned off access. Even still, it is unacceptable that it wasn’t taken seriously before. Facebook must own its part in exposing data to bad actors who scraped information for nefarious purposes.
Mark believes that Facebook hasn’t done a good enough job explaining user privacy, how the company makes money and how it does and doesn’t use user content/data. This is changing.
Mark, and the board/shareholders, believe, he’s still the right person for the job. Two reporters asked directly whether he’d step down by force or choice. His answer was an emphatic, “no.” His rationale is that this is his ship and he is the one who’s going to fix everything. He stated on several occasions that he wants to do the right thing. While I applaud his “awakening,” he has made some huge missteps as a leader that need more than promises to rectify. I still believe that Facebook would benefit from seasoned, strategic leadership to establish/renew a social contract with users, Facebook and its partners. The company is after all, fighting wars on multiple fronts. And the company has demonstrated a pattern of either negligence or ignorance in the past and then apologizing afterward. One can assume that this pattern will only continue.
There’s still a fair amount naïveté in play here when it comes to user trust, data and weaponizing information against Facebook users. Even though the company is aiming to right its wrongs, there’s more that lies ahead that the company and its key players cannot see yet. There’s a history of missing significant events here. And, Mark has a history of downplaying these events, acting too late and apologizing after the fact. “I didn’t know” is not a suitable response. Even though the company is making important strides, there’s nothing to make me believe that sophisticated data thieves, information terrorists and shape-shifting scammers aren’t already a step or two ahead of the Facebook team. Remember, following the 2016 election, Mark said it was “crazy” that fake news could somehow sway an election. He’s since recanted that reaction, but it was still his initial response and belief.
Facebook is already taking action against economic actors, government interference and lack of truthfulness and promises to do more. Its since removed thousands of Russian IRA accounts. Russia has responded that Facebook’s moves are considered “censorship.”
Not everything is Facebook’s fault, according to Facebook. Mark places some of the onus ofresponsibility on Facebook userswho didn’t read the ToS, manage their data settings or fully understand what happens when you put your entire life online. In his view, and it’s a tough pill to swallow, no one forced users to take a personality quiz. No one is forcing people to share every aspect of their life online. While the company is making it easier for users to understand what they’re signing up for and how to manage what they share, people still don’t realize that with this free service comes an agreement that as a user, they are the product and their attention is for sale.
Moving forward, Facebook isn’t as worried about data breaches as it is about user manipulation and psyops. According to Mark, users are more likely susceptible to “social engineering” threats over hacking and break-ins. Social engineering is the use of centralized planning and coordinated efforts to manipulate individuals to divulge information for fraudulent purposes. This can also be aimed at manipulating individual perspectives, behaviors and also influencing social change (for better or for worse.) Users aren’t prepared to fully understand if, when and how they’re succeptible to manipulation and I’d argue that research needs to be done in understanding how we’re influencing one another based on our own cognitive biases and how we choose to share and perceive information in real-time.
Facebook really wants you to know that it doesn’t sell user data to advertisers. But, it also acknowledges that it could have done and will do a better job in helping users understand Facebook’s business model. Mark said that users want “better ads” and “better experiences.” In addition to fighting information wars, Facebook is also prioritizing ad targeting, better news feeds, and the creation/delivery of better products and services that users love.
Even though upwards of 87 million users may have been affected by Kogan’s personality quiz and some of that user information was sold to and used by Cambridge Analytica, and that user data was also compromised in many other ways for years, the #deletefacebook movement had zero meaningful impact. But still, Mark says that the fact the movement even gained any momentum is “not good.” This leads to a separate but related conversation about useraddictiveness and dependencyon these platforms that kill movements such as #deletefacebook before they gain momentum.
Users cannot rely on Facebook, Youtube, Twitter, Reddit, et al., to protect them. Respective leaders of each of these platforms MUST fight bad actors to protect users. At the same time, they are not doing enough. Users are in many ways, unwitting pawns in what amounts to not only social engineering, but full-blown information warfare and psyops to cause chaos, disruption or worse. Make no mistake, people, their minds and their beliefs are under attack. It’s not just the “bad actors.” We are witnessing true villains, regardless of intent, damage, abuse and undermine human relationships, truth and digital and real-world democracy.
People and their relationships with one another are being radicalized and weaponized right under their noses. No one is teaching people how this even happens. More so, we are still not exposing the secrets of social design that makes these apps and servicesaddictive. In the face of social disorder, people ar still readily sharing everything about themselves online and believe they are in control of their own experiences, situational analyses and resulting emotions. I don’t know that people could really walk away even if they wanted to and that’s what scares me the most.
Q&A in Full: The Whole Story According to Zuckerberg
Please note that this call was 60 minutes long and what follows is not a complete transcript. I went through the entire conversation to surface key points and context.
David McCabe, Axios: “Given the numbers [around the IRA] have changed so drastically, Why should lawmakers and why should users trust that you’re giving them a full and accurate picture now?”
Zuckerberg: “There is going to be more content that we’re going to find over time. As long as there are people employed in Russia who have the job of trying to find ways to exploit these systems, this is going to be a never-ending battle. You never fully solve security, it’s an arms race. In retrospect, we were behind and we didn’t invest in it upfront. I’m confident that we’re making progress against these adversaries. But they’re very sophisticated. It would be a mistake to assume that you can fully solve a problem like this…”
Rory Cellan-Jones, BBC: “Back in November 2016, dismissed as crazy that fake news could have swung the election. Are you taking this seriously enough…?”
Zuckerberg: “Yes. I clearly made a mistake by just dismissing fake news as crazy as [not] having an impact. What I think is clear at this point, is that it was too flippant. I should never have referred to it as crazy. This is clearly a problem that requires careful work…This is an important area of work for us.”
Ian Sherr, CNET: “You just announced 87 million people affected by Cambridge Analytica, how long have you known this number because the 50 million number has been out there for a while. It feels like the data keeps changing on us and we’re not getting a full forthright view of what’s going on here.”
Zuckerberg: “We only just finalized our understanding of the situation in the last couple of days. We didn’t put out the 50 million number…we wanted to wait until we had a full understanding. Just to give you the complete picture on this, we don’t have logs going back for when exactly [Aleksandr] Kogan’s app queried for everyone’s friends…We wanted to take a broad view and a conservative estimate. I’m quite confident given our analysis, that it is not more than 87 million. It very well could be less…”
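To make the kind of broad, conservative estimate Zuckerberg describes a little more concrete, here is a minimal sketch of how such an upper bound might be computed: count every installer of the app plus every friend of an installer, deduplicated, and assume the worst case that the app pulled each full friend list. This is purely illustrative; the function, data structures and numbers are invented for the example and say nothing about Facebook’s actual data model or methodology.

```python
# Illustrative only: a hypothetical worst-case estimate of how many users an
# app could have reached, given its installers and their friend lists.
# This is NOT Facebook's real data model or methodology.

def conservative_reach_estimate(installers, friends_of):
    """Count every installer plus every friend of an installer, deduplicated."""
    reached = set(installers)  # installers shared their own data directly
    for user in installers:
        # Worst case: assume the app queried the user's full friend list.
        reached.update(friends_of.get(user, ()))
    return len(reached)

# Toy example with made-up users.
friends = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
}
print(conservative_reach_estimate(["alice", "bob"], friends))  # -> 4
```

Because overlapping friend lists are deduplicated and every friend is counted whether or not their data was actually pulled, an estimate built this way can only overshoot, which is consistent with Zuckerberg’s point that the real figure “very well could be less.”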
David Ingram, Reuters: “…Why weren’t there audits of the use of the social graph API years ago, between 2010 and 2015?”
Zuckerberg: “In retrospect, I think we should have been doing more all along. Just to speak to how we were thinking about it at the time, as just a matter of explanation, I’m not trying to defend this now…I think our view in a number of aspects of our relationship with people was that our job was to give them tools and that it was largely people’s responsibility in how they chose to use them…I think it was wrong in retrospect to have that limited of a view, but the reason why we acted the way that we did was because I think we viewed when someone chose to share their data and then the platform acted in a way that it was designed with the personality quiz app, our view is that, yes, Kogan broke the policies. And, he broke expectations, but also people chose to share that data with them. But today, given what we know, not just about developers, but across all of our tools and just across what our place in society is, it’s such a big service that’s so central in people’s lives, I think we understand that we need to take a broader view of our responsibility. We’re not just building tools; we have to take responsibility for the outcomes of how people use those tools as well. That’s why we didn’t do it at the time. Knowing what I know today, clearly we should have done more and we will going forward.”
Cecilia Kang, NY Times: “Mark, you have indicated that you could be comfortable with some sort of regulation. I’d like to ask you about privacy regulations that are about to take effect in Europe…GDPR. Would you be comfortable with those types of data protection regulations in the U.S. and for global users?”
Zuckerberg: “Regulations like the GDPR are very positive…We intend to make all the same controls and settings everywhere, not just in Europe.”
Tony Romm, Washington Post: “Do you believe that this [data scraping] was all in violation of your 2011 settlement with the FTC?”
Zuckerberg: “We’ve worked hard to make sure that we comply with it. The reality here is that we have to take a broader view of our responsibility, rather than just legal responsibility. We’re focused on doing the right thing and making sure people’s information is protected. We’re doing investigations, we’re locking down the platform, etc. I think our responsibilities to the people who use Facebook are greater than what’s written in that order and that’s the standard that I want to hold us to.”
Hannah Kuchler, Financial Times: “Investors have raised a lot of concerns about whether this is the result of corporate governance issues at Facebook. Has the board discussed whether you should step down as chairman?”
Zuckerberg: “Ahhh, not that I’m aware of.”
Alexis Madrigal, Atlantic: “Have you ever made a decision that benefitted Facebook’s business but not the community?”
Zuckerberg: “The things that make our product challenging to manage and operate are not the trade-offs between people and the business; I actually think those are quite easy, because over the long term the business will be better if you serve people. I just think it would be near-sighted to focus on short-term revenue over what is valuable to people, and I don’t think we’re that short-sighted. All of the hard decisions we have to make are actually trade-offs between people. One of the big differences between the type of product we’re building, which is why I refer to it as a community, and what I think some of the specific governance issues we have are, is that different people who use…
https://ift.tt/2uM8Fio
0 notes
aaronbarrnna · 6 years
Text
In Mark Zuckerberg We Trust? The State and Future of Facebook, User Data, Cambridge Analytica, Fake News, Elections, Russia and You
In the wake of Cambridge Analytica, data misappropriation, #deletefacebook, calls for regulation and pending testimony to U.S. Congress, Facebook announced a series of initiatives to restrict data access and a renewed selfie-awareness to focus efforts on protecting people on the platform. What’s more notable, however, is that Mark Zuckerberg also hosted a last-minute, rare town hall with media…
View On WordPress
0 notes
primorcoin · 2 years
Photo
New Post has been published on https://primorcoin.com/why-we-must-fight-for-a-decentralized-future/
Why we must fight for a decentralized future
If you’re into cryptocurrency or blockchain, there’s a good chance I don’t have to spell out the benefits of decentralization. You’re a first-generation user of a technology that will increasingly define the future of the internet, and you have front-row seats to the world premiere of Web3.
The internet’s use and control weren’t always as centralized as they are now. In the early days, under the stewardship of the United States Department of Defense, the network needed to be able to operate without relying on one core computer. What if a terrorist attack or missile strike took down the principal node? Individual parts of the network had to communicate without depending on a single computer, precisely to reduce that vulnerability.
Later, the unincorporated Internet Engineering Task Force, which facilitated the development of all internet protocols, worked ceaselessly to prevent private companies or particular countries from controlling the network.
Today, centralized app nodes are controlled and operated by the planet’s richest organizations, collecting and storing billions of people’s data. Private companies control the user experience on apps and can incentivize and manipulate behavior. From a reliability standpoint, billions lose their primary means of communication when centralized nodes go down — as in recent incidents with Facebook, Instagram, WhatsApp and Messenger in October 2021.
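To put a rough number on that reliability argument, consider a back-of-the-envelope comparison (the failure rates below are invented purely for illustration): a single central node takes the whole service down every time it fails, while a network that only needs a few of its many independent nodes to stay up almost never goes dark.

```python
# Back-of-the-envelope availability comparison; the numbers are illustrative.
from math import comb

def p_too_few_up(n_nodes, p_down, min_up):
    """Probability that fewer than `min_up` of `n_nodes` independent nodes
    are available, when each node is down with probability `p_down`."""
    return sum(
        comb(n_nodes, k) * (1 - p_down) ** k * p_down ** (n_nodes - k)
        for k in range(min_up)
    )

p = 0.01  # assume any single node is down 1% of the time
print(f"single central node down:     {p:.2%} of the time")
print(f"fewer than 3 of 10 nodes up:  {p_too_few_up(10, p, 3):.2e} of the time")
```

The exact figures matter less than the shape of the argument: redundancy across independent operators is what keeps a decentralized service reachable when any individual node, or company, has a bad day.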
We have also seen how little the tech behemoths think of our privacy when dollar signs appear in their eyes: They harvest and sell our data on an industrial scale. After 10-plus years of using people as advertisers’ products, Mark Zuckerberg has brazenly co-opted the metaverse. Google and Apple, meanwhile, continue their incessant mission to enter every corner of our lives.
Related: The data economy is a dystopian nightmare
We also know what happens when authoritarian governments come knocking on the doors of these centralized mega-warehouses of data, fed by our devices that function as a surveillance army. We’ve seen in Ukraine the awful, large-scale violence that can be excused or hidden when media and military power comes under authoritarian control. In some countries, the state has unprecedented access to every aspect of citizens’ behavior, monitoring everything from internet search history to minor social infractions. Systems that would horrify even George Orwell are only possible because of centralization.
Even in Silicon Valley, ensconced within Western notions of freedom and individuals’ rights, tech empires rarely choose a principled stance over a large, lucrative market. When centralized powers such as Moscow, Beijing or Istanbul ask for censorship and control, they usually get it. Fundamentally, we cannot trust the tech giants with the innermost details of our lives; the centralization of control over the internet is undermining or forestalling democracy everywhere.
Taking our power back
We should not be surprised that tech behemoths have become the natural enemies of decentralization: Centralization is a natural instinct for those in control. Until the advent of the internet and the blockchain, centralization often meant convenience and simplicity. In the Middle Ages, a distributed system of vassal lords meant the monarchy lacked control, and money seeped through the cracks of corruption.
With time and distance no longer problematic in the internet age, Big Tech’s drive toward centralization is less surprising. Can we be astonished by the horrific results of attention-grabbing algorithms, such as attempted genocides or political manipulation based on psychometric analysis of user data? Centralization has consequences.
Distributed ledger technology provides a practical alternative. Social media, messaging, streaming, searching and data-sharing on the blockchain can be fairer, more transparent and accessible, and less centralized. And that does not mean data has to be less private.
In the case of XX Messenger, which my team and I launched in January, XX Network nodes process anonymous messages worldwide, shredding metadata such as recipients and timestamps. With XX, there is both privacy and decentralization. In time, this new paradigm of communications and information-sharing makes a significant extension and reinvention of democracy possible.
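As a rough illustration of what “shredding metadata” means in practice, the sketch below shows a toy relay that accepts a batch of messages, strips sender identities and timestamps, and shuffles the batch before forwarding, so an observer of its output can no longer tell who sent a message or when. It is a deliberately simplified stand-in for the general mixnet idea, not the actual XX Network protocol; every name and structure here is invented for the example.

```python
# Toy mixnet-style relay: a simplified illustration of metadata shredding.
# This is NOT the real XX Network protocol, only the general idea.
import random
from dataclasses import dataclass

@dataclass
class InboundMessage:
    sender: str       # metadata a conventional server would log
    timestamp: float  # metadata revealing when the message was sent
    recipient: str
    ciphertext: bytes # payload, already encrypted by the sender

def mix_batch(batch):
    """Forward only (recipient, ciphertext) pairs, in shuffled order.

    Sender and timestamp never leave the node, and shuffling breaks the
    correlation between arrival order and departure order.
    """
    stripped = [(m.recipient, m.ciphertext) for m in batch]
    random.shuffle(stripped)
    return stripped

inbound = [
    InboundMessage("alice", 1714000000.0, "bob", b"..."),
    InboundMessage("carol", 1714000001.5, "dave", b"..."),
]
for recipient, ciphertext in mix_batch(inbound):
    print(recipient, ciphertext)
```

Real mix networks go much further, layering encryption across many independently operated nodes so that no single operator ever sees both sender and recipient; the point of the sketch is only that messages can be routed without retaining the metadata that makes mass surveillance cheap.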
Related: Blockchain-based decentralized messengers: A privacy pipedream?
There are moments in history when two separate events combine to tell a greater truth. In 2008, when Lehman Brothers Holdings Inc. crashed in the wake of the Great Recession, it seemed to be the death knell of centralized financial institutions, despite the economic pain it would herald. Then, little more than a month later, Satoshi Nakamoto published the Bitcoin (BTC) white paper, the revolutionary blueprint for modern peer-to-peer currency. There’s an important connection between these two momentous events, yet the words “Bitcoin,” “blockchain” and “cryptocurrency” draw eye-rolls from those who misunderstand centralization’s issues.
The autumn of 2008 was the opportunity to begin telling a story: it is up to us — the cryptographers, privacy lovers, traders, developers, activists and converts — to carry the torch of decentralization and democracy. If there was ever a tale that deserved to be told, beginning to end, it is this one.
Join me in telling it.
This article does not contain investment advice or recommendations. Every investment and trading move involves risk, and readers should conduct their own research when making a decision.
The views, thoughts and opinions expressed here are the author’s alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.
David Chaum is one of the earliest blockchain researchers and a world-renowned cryptographer and privacy advocate. Known as “The Godfather of Privacy,” Chaum first proposed a solution for protecting metadata with mix-cascade networks in 1979. In 1982, his dissertation at the University of California, Berkeley became the first known proposal of a blockchain protocol. Chaum developed eCash, the first digital currency, and made numerous contributions to secure voting systems in the 1990s. Today, Chaum is the founder of Elixxir, Praxxis and the XX Network, which combine his decades of research and contributions in cryptography and privacy to deliver state-of-the-art blockchain solutions.
Source link
#Blockchain #BTC #CryptoNews
0 notes