Facebook Terms of Service Need Work--Figuratively and Literally
Facebook "terms of service" take on new meaning as the company faces backlash for seeming to turn a blind eye to questionable use of customer data.
If one were to take Mark Zuckerberg's words at face value, Facebook was created to bring us all closer together as a global community. To quote Zuckerberg directly from a recent post on Facebook responding to the emerging revelations around Cambridge Analytica's alleged misappropriation, use and retention of Facebook user data obtained through a third-party application: “In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends' birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.”
The complete text of the post can be found on Zuckerberg's Facebook page.
At the center of this issue are the revelations that in 2015, Facebook was made aware through British journalists at the Guardian that Cambridge Analytica, a data mining firm, had obtained data collected two years earlier by an application hosted on the Facebook platform. Under guidelines and frameworks established by Facebook in 2014, this data should never have been shared with Cambridge Analytica. In 2015, according to Zuckerberg, the firm was formally approached by Facebook and signed affidavits attesting that it had destroyed all copies of said user data. It has since come to light that Cambridge Analytica not only lied about destroying the data, but that the data also fueled various political campaigns (allegedly the Trump and Ted Cruz campaigns, among others) through the 2016 election cycle.
I've been a data professional for close to 20 years. The majority of that time was spent in the healthcare industry, where the security and stewardship of customer data are held in as high regard as the data's use to save lives and prevent illness.
Data security should be the paramount concern for any data professional--and the expectation of the people generating that (often sensitive) data. Indeed, from the insights currently available on this emerging issue, users perceive a level of data stewardship on Facebook's part that doesn't seem to exist within the company itself. Rather, Facebook demonstrates a continued reactive approach to fixing issues inherent in the platform's structure.
In 2014, Facebook enacted changes to limit the data apps could obtain about users and about users' friends, but it appears there was insufficient due diligence to ensure that applications already on the platform were held to the new standards. Those standards are only now being enforced, four years later, because of what has come to light in the recent allegations against Cambridge Analytica. Furthermore, when Russian meddling in United States elections was alleged, Facebook's response was precisely that: a reaction after the fact.
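For readers who never worked with the platform, here is a minimal sketch of what that 2014 change meant in practice, assuming a Python client and a placeholder access token. Under the now-retired Graph API v1.0, an app granted extended friend permissions could read profile fields for all of a user's friends; from v2.0 onward, the same endpoint returns only friends who also installed the app.

```python
import requests

GRAPH = "https://graph.facebook.com"
ACCESS_TOKEN = "EAAB..."  # hypothetical user access token -- placeholder only

# Pre-2014 (Graph API v1.0, now retired): an app granted extended friend
# permissions such as friends_birthday could read profile fields for ALL
# of a user's friends -- the loophole the Cambridge Analytica app exploited.
legacy = requests.get(
    f"{GRAPH}/v1.0/me/friends",
    params={"access_token": ACCESS_TOKEN, "fields": "name,birthday"},
)

# Post-2014 (Graph API v2.0 and later): the friends_* permissions are gone,
# and the same endpoint returns only friends who also installed the app.
current = requests.get(
    f"{GRAPH}/v2.0/me/friends",
    params={"access_token": ACCESS_TOKEN, "fields": "name"},
)
print(current.json())
```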
And now we have the next round of reactive changes, implemented as a result of the Cambridge Analytica issue. According to Zuckerberg in an interview with Wired editor-in-chief Nick Thompson, Facebook first has to live up to its own runbook from the 2014 platform changes:
"The first action that we now need to go take is to not just rely on certifications that we've gotten from developers, but actually need to go and do a full investigation of every single app that was operating before we had the more restrictive platform policies--that had access to a lot of data--and for any app that has any suspicious activity, we're going to go in and do a full forensic audit," Zuckerberg said. "And any developer who won't sign up for that, we're going to kick off the [Facebook] platform."
Zuckerberg went on to say: "That's the step that I think we should have done for Cambridge Analytica ... we're now going to go do it for every developer who is on the platform who had access to a large amount of data before we locked things down in 2014."
My question is this: What's "a lot" of data? One million users' records? One thousand? What if your data were among those thousand records? I bet you'd consider that important to address.
Facebook is walking a fine line. Zuckerberg is telling us, the users, what we want to hear, but that message directly contradicts Facebook's business model.
To simplify the topic: Facebook states that it has a single product, its platform. I contend that the platform is the honeypot that attracts the company's true product: its users.
We are Facebook's product, and our collective data is the company's most valuable asset. As Facebook found its footing and business model around advertising to users, it found even better financial success in providing targeted audiences for companies looking to market to hyper-detailed pools of consumers. This model pushes intrusion into users’ profiles, posts and relationships to the legal limits. That reality is at odds with what Zuckerberg continues to state publicly to assuage the user base of Facebook: The company is aghast that its user data is being misappropriated for political or economic gain.
From an economic point of view for Facebook, our data exists only because it can be monetized for the company's paying customers. They, in turn, use it to target us where we spend an ever-increasing amount of time: on Facebook.
I'm also a business owner and have used Facebook to target users who meet certain criteria (for example, around interests and location) for my Tech Outbound SQL Cruise events. I've seen success with those targeted ads, and I believe there is a place for this business model. However, it absolutely requires controls to protect users and their data--even if that means compromising the effectiveness of the data mining outcomes.
In addition to the retroactive analysis Zuckerberg proposed--auditing apps that had access to large amounts of data, isolating suspicious activity and banning developers who refuse audits of the data they collected--he also vowed to:
Restrict developers' data access further to prevent other kinds of abuse
Reduce the data users must hand over to an application when they sign up (a data-minimization principle sketched in code after this list)
Increase the visibility of tools that let users revoke specific applications' access
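To make that second item concrete, here is a minimal Python sketch of data minimization at sign-up, using a hypothetical app ID and redirect URI: the app builds its Facebook Login URL with only the permission scopes it actually uses, and later asks the Graph API for only the fields it needs.

```python
from urllib.parse import urlencode

APP_ID = "1234567890"                               # hypothetical app ID
REDIRECT_URI = "https://example.com/auth/callback"  # hypothetical callback

# Data minimization: request only the permissions the app actually needs
# (here, the default public profile plus email) -- not every scope the
# platform will grant.
MINIMAL_SCOPES = ["public_profile", "email"]

login_url = "https://www.facebook.com/v2.12/dialog/oauth?" + urlencode({
    "client_id": APP_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": ",".join(MINIMAL_SCOPES),
})
print(login_url)

# After login, request only the fields the app will actually use:
#   GET https://graph.facebook.com/v2.12/me?fields=id,email&access_token=...
# Every extra field collected is liability, not value.
```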
These actions, again, are reactive and incomplete. There is no acceptable threshold for the amount of data misused: one misappropriated account is too many. Additionally, if apps are allowed to request more data points than they need in order to function, Facebook should already have been monitoring for such situations. In my opinion, much of this comes down to a corporate decision by Facebook to turn a blind eye to these issues, because limiting what users share limits the profits Facebook can realize.
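That kind of monitoring isn't exotic. As a purely hypothetical illustration--the app names, permissions and usage data below are invented, not a real Facebook system--a platform could routinely compare the permissions each app has been granted against the data it actually reads, and flag the over-privileged ones:

```python
# Hypothetical platform-side audit: flag apps whose granted permissions
# exceed the data they actually read. App names, permissions and usage
# data are invented for illustration.
granted = {
    "quiz_app_123":   {"public_profile", "email", "user_friends", "user_likes"},
    "calendar_app_9": {"public_profile", "user_birthday"},
}
fields_read = {
    "quiz_app_123":   {"public_profile"},  # never touched friends or likes
    "calendar_app_9": {"public_profile", "user_birthday"},
}

for app_id, scopes in granted.items():
    unused = scopes - fields_read.get(app_id, set())
    if unused:
        print(f"{app_id}: over-privileged -- candidate to revoke {sorted(unused)}")
```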
Time will tell how any granted rights to data security will stack up against the rights of businesses to turn a profit in an age of unread terms and conditions, GDPR and competing regulations still in the works. The matter will likely be decided in multiple courts over the next decade (conservatively), and will face challenges with every advancement in technology. Until then, Facebook and entities like it will continue to walk the line between altruism and capitalism.
Where do you stand on this issue? What makes a good data steward? Let us know in the comments section below.