If you scour the internet for your personal data, stop. It’s already out there in the hands of companies, and no number of removal requests will change that. What you should be worried about is whether executives at these companies are asking themselves the right question: What constitutes ethical use of consumer data?
What if you discovered, for example, that data your company has could cure cancer? Would you have an ethical obligation to disseminate that data even if your data source would prefer you didn’t? Is it OK if you’re profiting off of someone’s data without that person’s consent? What if you work at a health insurance company that denies someone coverage because he or she Googled “cancer” one too many times? What should you do?
That, in a nutshell, is the data dilemma that business leaders are facing. Collection isn’t the issue; use is.
A Question of Right and Wrong
When it comes to their data, many Americans see the issue in black and white. In fact, 43 percent of them dislike their digital devices monitoring their activities, even when that data could be helpful on a personal or societal scale.
But how can data collection be morally “wrong” when it’s the backbone of so many services we use? How many lives and gallons of fuel have been saved by Google Maps’ turn-by-turn directions? How many jobs have been found by software that matches applicants’ attributes to open positions? How many human connections have been built through social media platforms that suggest friends?
When used responsibly, data can do a lot of good. In fact, it’s vitally important to today’s economy. Just as oil fueled the Industrial Revolution, data makes possible personalized digital services from Spotify to Google to Amazon. Outlawing collection would irreparably stifle innovation and progress.
But data can also do a lot of damage. Dictators love data because it makes cracking down on dissent incredibly easy. Social media platforms use data to sell targeted ads to actors such as Russia that divide societies with inflammatory, eye-catching propaganda. Over and over again, companies such as Equifax spill hundreds of millions of Americans’ financial data all over the web.
Mere collection of consumer data is morally neutral, and ending it would imperil the world’s economy. When companies, governments, or other entities use that data in ways that benefit them while adversely affecting others, that’s when ethical problems arise. That’s what we must find a way to regulate.
A Framework for Ethical Data Use
On May 25, 2018, the first large-scale attempt to balance data’s innovative power with its potential for abuse went into effect. After a two-year transition period, the European Union rolled out the General Data Protection Regulation (GDPR).
Although time will tell how effectively governments can regulate and enforce data protections, GDPR will create a formal system of checks and balances. Individuals in the EU will gain certain rights over their data, such as the “right to be forgotten,” “right to access,” “right to correct,” and “right to object.” The companies collecting or processing their data will be held responsible for protecting consumers’ privacy and preventing breaches.
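To make those rights concrete, here is a minimal sketch of how a service might wire each one to its user store. The class, the in-memory store, and the field names are hypothetical illustrations, not anything GDPR itself prescribes.

```python
from dataclasses import dataclass


@dataclass
class UserRecord:
    user_id: str
    email: str
    ad_profile: dict  # data held for ad targeting


class DataSubjectRequests:
    """Hypothetical handler mapping GDPR data-subject rights to storage operations."""

    def __init__(self, store: dict):
        self.store = store  # user_id -> UserRecord

    def access(self, user_id: str) -> UserRecord:
        # Right to access: hand the user everything held about them.
        return self.store[user_id]

    def correct(self, user_id: str, **fields) -> None:
        # Right to correct: overwrite inaccurate fields on request.
        record = self.store[user_id]
        for name, value in fields.items():
            setattr(record, name, value)

    def object(self, user_id: str) -> None:
        # Right to object: halt a specific processing purpose, here ad targeting.
        self.store[user_id].ad_profile.clear()

    def forget(self, user_id: str) -> None:
        # Right to be forgotten: erase the record entirely.
        del self.store[user_id]
```

The point of the sketch is that each right maps to a concrete, auditable operation a company can actually build, rather than a clause buried in a privacy policy.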
But the world can’t wait decades to find out whether GDPR works. The truth is that the companies that collect and use data need to take responsibility now. Today, too many are mining consumers’ data without having a real reason to do so. Some don’t even know how they’ll use the information in the future.
Facebook is a particularly avid data collector. The company knows the car you drive, your favorite foods, how often and where you travel, the type of phone you use, the charitable donations you make, your political leanings, and much, much more. Facebook might even know you better than you know yourself. Does Facebook need all that data? At best, no. At worst, Facebook is selling private information to companies and governments that don’t have its users’ best interests at heart.
Not only is collecting unnecessary information unethical, but it’s also dangerous for both the consumer and the company doing the collecting. Every piece of stored data is one that could be leaked or stolen. That creates headaches for users, corporate compliance officers, and PR teams.
Before mining consumer data, online or off, corporate leaders must consider whether that data aligns with their business’s mission and vision. If it’s not usable today, it shouldn’t be stockpiled for a rainy day. If it is useful today, it should be anonymized, used in a way that benefits the consumer, and disposed of when it’s no longer needed.
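As one illustration of what that lifecycle could look like in code, here is a minimal sketch. The 90-day retention window, the field names, and the functions are assumptions, and salted hashing is pseudonymization rather than true anonymization, so a real deployment would need stronger measures.

```python
import hashlib
import time

RETENTION_SECONDS = 90 * 24 * 3600  # assumed 90-day retention window


def pseudonymize(value: str, salt: str) -> str:
    # One-way hash so analytics can group by user without storing who they are.
    return hashlib.sha256((salt + value).encode()).hexdigest()


def record_event(events: list, user_email: str, action: str, salt: str) -> None:
    # Store only what today's use case needs: a pseudonymous ID, the action,
    # and a timestamp that drives disposal later.
    events.append({
        "user": pseudonymize(user_email, salt),
        "action": action,
        "recorded_at": time.time(),
    })


def purge_expired(events: list) -> list:
    # Dispose of data once it's no longer needed.
    cutoff = time.time() - RETENTION_SECONDS
    return [e for e in events if e["recorded_at"] >= cutoff]
```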
Of course, most companies aren’t doing those things. No wonder just one in four consumers think that most companies handle their sensitive data responsibly, while one in 10 think they have complete control over their data.
To close the trust gap, companies must be transparent. Individuals whose data is collected deserve to know why it’s collected, how it will benefit them, and what steps the collector takes to strip it of identifying information. People will only start trusting companies with their data when those companies give them a reason to do so.
But don’t companies already seek consent and share data use details through end user licensing agreements (EULAs) and similar privacy agreements? You know, the kind you mindlessly scroll through and then click “I accept” at the end?
Technically, they do. But beyond the fact that most people would need a lawyer to understand them, EULAs tend to be overly broad, permitting “any use” the company deems acceptable. And once a user has accepted one, he or she has no real way of opting out of collection. Almost across the board, EULAs strip consumers of the leverage they’d otherwise have against a company that uses their data unethically or unlawfully.
Even if they collect data only on an as-needed basis and improve consumer protections, however, companies can’t totally prevent breaches. So why not make breached information worthless? In other words, why not anonymize and open-source all consumer data?
This would be a radical departure from how we currently handle data. Think of it as a centralized database or an open-sourced ombudsman in which no one entity owns the information. Everyone would be able to see what data is collected, by whom, how it’s used, and who it’s shared with. Rules about how data should or shouldn’t be used would be made collectively rather than by one company or government.
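As a thought experiment, here is a sketch of how a record might be prepared before entering such an open pool: direct identifiers are dropped outright, and quasi-identifiers are coarsened so that combinations of fields can’t single a person out. The field names and generalization rules are invented for illustration.

```python
def generalize_for_open_release(record: dict) -> dict:
    # Drop direct identifiers outright.
    released = {k: v for k, v in record.items()
                if k not in ("name", "email", "phone")}
    # Coarsen quasi-identifiers that could re-identify someone in combination.
    released["zip"] = record["zip"][:3] + "**"      # 5-digit ZIP -> region
    released["age"] = (record["age"] // 10) * 10    # exact age -> decade
    return released


record = {"name": "Ada", "email": "ada@example.com", "phone": "555-0100",
          "zip": "60614", "age": 37, "searches": ["maps", "flights"]}
print(generalize_for_open_release(record))
# {'zip': '606**', 'age': 30, 'searches': ['maps', 'flights']}
```

Whether generalizing records this way preserves enough utility to be worth sharing is exactly the kind of rule such a collective registry would have to settle.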
Will we get to a point when all data is openly shared? Probably not for decades or perhaps not at all. But we need a better answer than we have today. For the information economy to work, consumers must be able to trust companies with their data. Until we develop greater protections, consumer distrust will hinder innovation, just as banning all data collection would.
Data’s ethical dilemma won’t be solved today or tomorrow, but it will be solved sooner or later. If companies don’t take better care of consumers’ data, governments or the free market will do it for them.