Privacy is dead, long live privacy


In May, businesses saw Y2K remastered. Europe’s General Data Protection Regulation arrived — and nothing happened. Companies worldwide and across sectors scurried to reach compliance, fearful of steep fines and consumer wrath.

But nearly three months after the regulation took effect, there has been little enforcement action.

Though industry has not yet seen ramifications from the regulation, GDPR caused many organizations to rethink how they collect and use data. More companies are treating privacy as a business issue in its own right, not an afterthought.

And one concept is making it easier to understand how data should be treated, injecting more privacy along the way: handling data as a currency.

Putting a price on data

The concept of data as currency is the successor to a more physical metaphor: the phrase "data is the new oil." Michelle Dennedy, VP and chief privacy officer at Cisco, coined the phrase 20 years ago in Europe, declaring data the new oil because it flowed throughout systems and was more valuable than gold or other currencies.

If data were the new oil, then companies would only need security to manage it, ensuring it does not leak and spark fires. But if, instead, data is seen as a currency, it is "wholly dependent on time, cultural understanding, conditions and context," Dennedy said in an interview with CIO Dive.

Every currency has a "wobble," Dennedy said. Take the euro, for example, whose fluctuations illustrate how election cycles can influence currency valuations.

Organizations achieve success when they learn to value assets. If data is treated carelessly, and internal or external factors make an impact, organizations could find themselves in the crosshairs of regulators.

If data is seen as a currency, it is “wholly dependent on time, cultural understanding, conditions and context.”

Michelle Dennedy

VP and chief privacy officer at Cisco

"If you look at your sensitive data as an asset that causes you as much harm if it's compromised as your actual funds, your dollars, then you will behave differently," said Tanya Forsheit, partner and chair of the privacy and data security group at law firm Frankfurt Kurnit Klein and Selz, in an interview with CIO Dive.

While the concept is gaining mainstream support, industry is not there yet. Companies considering data as currency quickly revert to associations with risk, believing data is something to lose.

The other constraint is how regulations define personal data. GDPR offered a broad definition.

In the U.S., personal data is considered personally identifiable information (PII). But under GDPR, personal data is any information that could be used to identify an individual, including device IDs and IP addresses.

Broadening the scope of personal data complicates its treatment as currency. A Social Security number, for example, holds far more value than an email address. This means treating data as currency requires assigning each type of data a value.

If there are mechanisms to treat IP addresses as pennies and social security numbers as hundred dollar bills, then it has meaning, Forsheit said. “It’s a difficult mindset for someone to get ahold of.”
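Forsheit's pennies-and-hundred-dollar-bills analogy can be sketched as a simple valuation ledger. The data types and cent values below are illustrative assumptions for the sake of the example, not a published standard; the point is only that classifying fields by "denomination" lets an organization weigh how much a given record is worth, and therefore how carefully to guard it.

```python
# Hypothetical "denominations" for personal-data fields, in cents.
# These figures are invented for illustration; any real valuation
# would depend on context, regulation and the data's use.
SENSITIVITY_CENTS = {
    "ip_address": 1,        # pennies: weakly identifying on its own
    "device_id": 50,
    "email_address": 100,
    "ssn": 10_000,          # a hundred-dollar bill: directly identifying
}

def record_value_cents(record: dict) -> int:
    """Sum the 'denomination' of every known personal-data field in a record."""
    return sum(SENSITIVITY_CENTS.get(field, 0) for field in record)

# A record holding an IP address and an SSN is "worth" far more, and so
# warrants far stronger controls, than one holding only an IP address.
```

Working in integer cents rather than floating-point dollars keeps the totals exact, the same reason financial code avoids floats.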

Companies starved for data

If firms did not over collect data, semantics surrounding its treatment and definition would be pointless. But alas, that’s not the case.

In the mid-1990s the internet started pivoting toward commerce and it became easier to obtain data, said Rebecca Herold, CEO of consulting practice The Privacy Professor and co-founder and president of SIMBUS, a privacy and security management consulting firm.

Before the internet, companies had to rely on hard copy ads and mailing to reach potential buyers, said Herold, in an interview with CIO Dive. But the rise of internet commerce overhauled marketing efforts and companies no longer had to ask for customer data. Instead, people simply gave information away.

“Companies in the U.S. historically are data hoarders. That’s what they do. They gather tons and tons of data, sometimes even without necessarily knowing what their ultimate goal is.”

Tanya Forsheit

Partner and chair of the privacy and data security group at Frankfurt Kurnit Klein and Selz

The industry saw “how eager organizations were to start gathering more data than what they really needed,” said Herold.

The 1990s served as foreshadowing. Today, companies are gathering and storing more data than they know what to do with, hoping big data analytics and artificial intelligence will make analysis easier. This overcollection has had a direct impact on privacy.

“Companies in the U.S. historically are data hoarders. That’s what they do,” said Forsheit. “They gather tons and tons of data, sometimes even without necessarily knowing what their ultimate goal is.”

GDPR is working to change how companies interact with data, stopping the use of personal data in ways consumers didn't expect or didn't know were possible, according to Forsheit. By connecting disparate data sets, analysts can derive personal information without a user's knowledge, an action GDPR is trying to prevent.
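The linkage risk described above is mechanically simple: two data sets that each look harmless can identify a person once joined on a shared key. The data sets, field names and the person below are invented for illustration.

```python
# Invented example data: neither set alone names a person AND a sensitive fact.
ad_clicks = [   # e.g. collected by an ad network: "just" device data
    {"device_id": "d-42", "zip": "50309", "interest": "oncology clinics"},
]
loyalty_cards = [  # e.g. collected by a retailer: a name tied to the same device
    {"device_id": "d-42", "name": "Jane Doe"},
]

def link(a: list[dict], b: list[dict], key: str) -> list[dict]:
    """Naive inner join on a shared key: the core of re-identification."""
    index = {row[key]: row for row in b}
    return [{**row, **index[row[key]]} for row in a if row[key] in index]

joined = link(ad_clicks, loyalty_cards, "device_id")
# "joined" now ties Jane Doe to a sensitive health interest, information
# neither data set disclosed on its own.
```

A one-line join is all it takes, which is why GDPR's definition of personal data reaches indirect identifiers such as device IDs and IP addresses.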

Is privacy even possible?

Data can have a positive and negative impact, and those organizations fearful of regulatory repercussions and steep fines are working to rethink data collection and treatment.

Increased regulation beyond GDPR is also making an impact. U.S. lawmakers are stepping up to create an ecosystem that weighs the privacy impacts of services offered by internet giants, as with California's recent legislation.

Industries across sectors are in an "awakening period" for data use, Herold said. Facebook's recent, well-publicized data use has put the industry on notice. Facebook was "asleep at the wheel" when it mapped out how to sell data, Herold said. "They were too trusting."

There are two main problems with data, Herold said:

  • Organizations are making too many assumptions about what can and cannot be considered personal data. And those same firms don't believe people could analyze data sets to derive personal insights about an individual.  

  • Most app developers and many tech companies do not spend enough time engineering privacy controls into their solutions and products. Instead, they do the minimum required by law.

This highlights the gap between what companies are legally obligated to do and what they should do, Herold said.

Certainly privacy is possible, but companies lack the incentive to make it a reality.

“The bad things that have happened are not because we don’t have laws or not because we don’t have regulators who care,” Forsheit said. “It’s because companies have been data hungry, and in some cases greedy, and have swept up as much as they could and then tried to leverage that as much as they can until they get caught because that is, in many ways, the American way.”