Humans are used to being watched by things like security cameras, speed cameras, and security officials themselves. However, the recent rise of big data surveillance powered by machine learning presents something entirely new: the aggregation of small bits of freely provided information about an individual into an incredibly accurate portrait of their life and tendencies. Frightening as they are, such technologies also offer extraordinary opportunities. But for their continued use to be morally justified, a conscious effort must be made to educate people on exactly what information they are giving up, what it is being used for, and the capabilities of big data as a whole.
Most of the data used in this new surveillance seems harmless on its own, so people have no qualms about its collection. The issue is that people tend not to understand what can be learned from seemingly irrelevant bits of data. As the article on Facebook likes shows, bizarre predictors exist (such as liking curly fries being a signal for high intelligence), and the aggregation of such predictors can start to paint a creepily accurate picture of a stranger's life.
The Target pregnancy advertising campaign illustrates the biggest problem with big data advertising: it tends to produce positive results using methods with which we are uncomfortable. Though it would not alleviate every concern, better informing people that their data is being used for surveillance would at least eliminate the surprise many feel when they realize their data is being mined. A big piece of the problem is that consent for data collection is often buried in fine print that almost nobody reads. This means we are legally consenting to our data being taken without consciously knowing it, which I believe is a corrupted view of what consent entails. I strongly believe that affirmative consent must be given before the distribution and use of someone's personal data is acceptable. A positive change would be to require everyone to actively consent to data collection with some sort of check box reading something like "I acknowledge that the data generated by this action is the property of the Company and can be disseminated as the Company likes." The cost of declining these terms might be the inability to purchase or use a product, but at least this would ensure that everyone knows their data is being collected.
We have already seen a concerted effort to regulate data usage in order to prevent abuses, especially following the Cambridge Analytica Facebook scandal from the 2016 US election. In fact, the European Union has taken the lead in doing so with its recent passage of the GDPR, which significantly raises the standards for the protection of consumer data and threatens stiff penalties for violators. Notably, it follows my insistence on informed consent and ensures that "companies will no longer be able to use long illegible terms and conditions full of legalese, as the request for consent must be given in an intelligible and easily accessible form, with the purpose for data processing attached to that consent." Such regulation is exactly what we need to enjoy the incredible possibilities of big data surveillance while maintaining respect for privacy, and the adoption of similar standards elsewhere is crucial as we move forward in our increasingly data-driven world.
The biggest counter-argument to a backlash against targeted advertising is that this advertising allows us to access content for free. If we take the revenue source away from content creators and social media platforms alike, we will either lose access to the content and platforms or be forced to pay for them. We see some of this happening with ad blockers, which are starting to create a tragedy of the commons on the Internet. Everyone likes free stuff, and almost no one likes advertising, so any individual can get the best of both worlds by using an ad blocker. However, if enough people use blockers, the revenue source dries up, and content will either disappear or move behind paywalls. I don't think there is anything unethical about using ad blockers, especially given the privacy concerns I have raised and the fact that sites are figuring out ways to get around the blockers, but I think it is important for people to understand that content has costs.