The Cambridge Analytica breach has the whole world talking – again – about data and privacy. The debacle reinforces a simple point: data can be your friend or your worst enemy – and it’s up to each of us to choose which.
Questions are being raised about the ethics of campaign targeting.
The #DeleteFacebook campaign is spreading across Twitter.
And angry Facebook users are deleting their accounts.
On the surface, what Cambridge Analytica did is not uncommon among marketers and political candidates who run ads on Facebook. It’s considered a best practice to use data to get your ads in front of your target audience. So why are Facebook and Cambridge Analytica in hot water?
In a nutshell, Cambridge Analytica gained access to the private information of over 87 million Facebook users, according to the most recent estimates. (The number could still grow, but Mark Zuckerberg has said he believes this is the final figure, according to reports from NPR.org.) The data they collected, as reported by The New York Times, included details on users’ identities, friend networks, and “likes.” That data was then used to build personas (something every marketer does) to map personality traits they could target with digital ads.
Here’s the thing: only 270,000 of those users consented to the collection of their data, and they consented under the pretense that it was for academic research. The users who didn’t consent? Their data was collected simply because they were friends of the 270,000 people who did.
What’s even worse: once Cambridge Analytica had used that personal data to build personality profiles of likely voters on both sides of the aisle, it pitched politicians on ads tailored to those personality types – ads designed to psychologically manipulate specific groups of voters during the election cycle.
The result: possibly one of the first real-life examples of how data can be weaponized and used against a population.
Cambridge Analytica is in trouble for doing it, and Facebook is in trouble for letting the breach happen and for not keeping tighter control over users’ data.
What are the takeaways for businesses, for us as individual users, and for parents with children who are on social media?
Businesses: Big data means big responsibility
We love data at RTS Labs. We use it to draw deep insights for our clients, so they can run their businesses more efficiently. But collecting data comes with responsibility – from privacy, security, and ethical standpoints.
As a business, if you’re going to collect data, it’s crucial to have systems in place that safeguard it. That means following cybersecurity protocols and being sensitive to whom you employ and who on your team has access to information.
Beyond the security of your data, you need to make sure it’s ethically collected and used. Data can help you. But when it’s used in the wrong way, it can go against you, as it did in this case. Cambridge Analytica not only collected its data unethically, it also sold that data to political candidates who used it to target erratic personalities in an effort to spread fake news and deepen divisiveness during the campaign.
Ultimately, it’s up to businesses and researchers alike to be good stewards of the data they’re collecting.
Users: What’s our responsibility?
Here’s the thing about the digital platforms that have made their way into our everyday lives (and are now as commonplace as eating): ALL OF THEM collect data about you.
These free platforms we all use are not exactly free. The price of admission is permission to use and sell your data. Yep. Everything you do on these channels is tracked and used by advertisers.
There are so many ways advertisers can target you and other users: Age, gender, and sexual orientation; household income; education; people who “like” certain Facebook pages; people who are interested in specific activities, movies and other pop culture trends; people who show specific online behaviors such as staying up late or regularly sharing news stories; and the list goes on and on. Platforms like Facebook don’t get that information by asking people to take surveys. They get it by tracking everything their users do.
This is great for advertisers. But for users who are unaware, it can be very alarming, which is part of the reason why there is so much outrage over the Cambridge Analytica incident (in addition to how the data was then used).
That quiz you took last week on Facebook about what state you should live in or what Disney princess you are? The developer who made that may not have done it for “funsies.” It may have been commissioned and developed in order to harvest your data. Did you read the permissions you consented to? You may have inadvertently given them permission to collect all your information or even track your location and online activities.
This is the world we live in now. Harvested data is the raw material that powers artificial intelligence and machine learning. As a user, it’s important to be mindful of the data you share. Read user agreements. Question permission requests. Find out what companies are collecting and why.
Parents: What does this mean for our children?
With all we do online, we have made ourselves so publicly available. And so have our children.
RTS Labs founder Jyot Singh is concerned about our youth and the future they face. “Kids are super vulnerable these days,” he says. “When they grow up, they may be in for a shock when they apply for jobs because their lives are so exposed.”
Jyot predicted that marketing would become more powerful this year – scary powerful. In his article, “2018 Tech Trends and Predictions,” he wrote, “Marketing is going to become deeper and more data driven in 2018. Online behaviors will be tracked more closely and selling/marketing will become better but more intrusive. Social media platforms like Facebook, Instagram, and YouTube will get more powerful because of the sheer amount of data they’ve been collecting about users.”
So, if you’re a parent, make it a point to talk to your children about data, privacy, and being mindful of the information they put out there.
As a business, it is crucial to be data driven. But if you’re collecting data from people, take that responsibility seriously. As a consumer, be mindful of what you put out there and careful about whom you give access to your information.
Want to see firsthand what positive, responsible data analysis and business intelligence look like? Download our case study below!