Sacred Data for Sale, steal it while you can!

Lex Sokolin
7 min read · Aug 5, 2019

Here is our factbase. Capital One recently suffered a data breach resulting from poor security practices that exposed 100 million credit card applications and accounts. They expect the breach to cost the company $150 million. Two years back, Equifax lost 140 million identities, again from poor security practices. At the time, I said that according to GDPR this should cost them $150 million. They have since settled for about $600 million — though some of that seems to be in-kind services coverage like free credit monitoring (lol!). Separately, Facebook has settled for a $5 billion fine associated with the Cambridge Analytica privacy “breach”.

What’s it Worth?

As a percentage of revenue, $5 billion out of $60 billion (~8% for Facebook) or $600 million out of $3.5 billion (~17% for Equifax) seems to be of a similar magnitude. Capital One's estimate of $150 million on $28 billion in revenue (~0.5%) seems off, to say the least. But let's get some macro data out there before thinking more deeply about the issue. Identity and data, and in particular financial identity and data, are valuable. On average, a stolen digital human fetches about $200 on the black market, and the per-capita cost of a data breach to the company is roughly the same. Cyber insurance, which in the aggregate is supposed to counteract these damages for companies, runs at least a couple of billion in annual premia, amounting to probably a few dozen billion in coverage.
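
If you want to check the arithmetic, here is a quick Python sketch of those fine-to-revenue ratios, using the rounded figures cited above (approximations, not audited filings):

```python
# Fine or breach cost vs. annual revenue, both in USD millions.
# Figures are the article's rounded approximations.
fines_vs_revenue = {
    "Facebook":    (5_000, 60_000),  # $5B fine, ~$60B revenue
    "Equifax":     (600,    3_500),  # ~$600M settlement, ~$3.5B revenue
    "Capital One": (150,   28_000),  # $150M estimated cost, ~$28B revenue
}

for company, (cost, revenue) in fines_vs_revenue.items():
    print(f"{company}: {cost / revenue:.1%} of revenue")

# Facebook: 8.3%, Equifax: 17.1%, Capital One: 0.5%
# Capital One is the outlier by more than an order of magnitude.
```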

So here’s the issue I have. There is a lazy thing to say, and I said it in 2017 about Equifax. It goes like this. Look at all those hypocrites in the large financial companies! They point to Fintechs and Crypto, the innovative parts of our economy, and accuse them of poor practices. They insist on inequitable, overly heavy-handed regulation and security expectations that stifle young companies. And yet, only 2% of all Bitcoin transactions have anything to do with illicit activity, no different than the traditional economy, which sees 2–5% of GDP pass through money laundering. And yet, they keep losing our most important data by the millions, never having to face repercussions for their sins.

Thinking with Context

That’s a fun, accurate, finger-wagging argument to make. But it doesn’t do any work. It is useless. Instead, let’s take a more systemic approach. We can acknowledge that crime, theft, and mutual destruction are human attributes, not some externality of a technology. Yes, we would like to minimize the crime. But it is endemic to all human systems; it is a part of us. Therefore, we have to accept that some percentage of our data, money, privacy, and other valuables will be stolen, misappropriated, or destroyed. We will fight that, but some amount, let’s say 2%, will slip through. The issue is about the actors in the system itself, and today the problem is merely becoming more transparent.

The second step is to think about our rationality versus our feelings (if you want to read 1,000 pages on this topic as prep, I recommend Yudkowsky). From an economic perspective, the following two scenarios are identical. In scenario A, you lose 2% of your data with 100% certainty. Imagine this as losing a non-core credit card once per year, and then having to cancel it with the bank. Inconvenient, but nothing to worry about. In scenario B, you lose 100% of your data with 2% certainty. The expected loss in this scenario is exactly the same, but I would guess that most of us would pay far more to avoid it, because we are risk-averse animals. Any chance that you will lose everything you have is terrifying, and much harder to remedy.
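
To make the equivalence (and why it fails psychologically) concrete, here is a minimal Python sketch; the 100-unit "data wealth" and the log-utility function are illustrative assumptions of mine, not a model from any of the cases above:

```python
import math

WEALTH = 100.0  # all of your data, in arbitrary units

# Scenario A: lose 2% of your data with 100% certainty.
expected_loss_a = 1.00 * (0.02 * WEALTH)
# Scenario B: lose 100% of your data with 2% certainty.
expected_loss_b = 0.02 * (1.00 * WEALTH)
print(expected_loss_a, expected_loss_b)  # 2.0 2.0 -- identical expected loss

# A risk-averse agent (log utility, with a tiny floor so log(0) is defined)
# still strongly prefers scenario A.
def utility(wealth: float) -> float:
    return math.log(max(wealth, 1e-6))

eu_a = utility(0.98 * WEALTH)                        # certain small loss
eu_b = 0.98 * utility(WEALTH) + 0.02 * utility(0.0)  # small chance of ruin
print(eu_a > eu_b)  # True: same expected loss, very different expected utility
```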

Another dimension I want you to think about is “sacredness”. Something is sacred, in the sense I am using the word, when the cultural significance attached to it precludes an economic discussion. For example, human lives are sacred. No amount of insurance will make up for an outcome where a person is killed! And yet, governments make these calculations all the time when evaluating policies on topics like speeding, smoking, and water safety. Further, some things are sacred to some people, but not to others. What is a political cartoon to one person is a declaration of religious war to another. To bring us back down to Fintech and cybersecurity, my main point is that *privacy* and *personal data* could be sacred in one context (e.g., an American high-income person who studied constitutional law at Yale), and not as sacred in another (e.g., a farmer in China who gets loans from the government).

Sacredness is a multiplier on how important something is to the person within their context. For many of us, we are fine losing social media photos, Twitter puns, or even our passwords. But financial information can be much more personal and embarrassing: take, for example, the fact that we still do not have Donald Trump’s tax returns. I would bet that he finds those to be a sacred screed. Similarly, Google has a lot of sacred data. Imagine exposing to the world all of your search history, or having that search history be the basis for eligibility to get a bank account. Ok. So with these tools, let’s put together a framework.
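
One illustrative way to write that multiplier down before we formalize anything; the function, weights, and contexts below are my own hypothetical, not a model from the article:

```python
# Sacredness scales the harm of losing the same data in different contexts.
def expected_harm(p_loss: float, share_lost: float,
                  base_value: float, sacredness: float) -> float:
    """Expected harm = probability of loss x share lost x value x sacredness."""
    return p_loss * share_lost * base_value * sacredness

# Same breach odds, same data, very different contexts (multipliers made up):
constitutional_lawyer = expected_harm(0.02, 1.0, base_value=100, sacredness=5.0)
subsidized_farmer     = expected_harm(0.02, 1.0, base_value=100, sacredness=1.0)
print(constitutional_lawyer, subsidized_farmer)  # 10.0 vs. 2.0
```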

The Framework

What does this tell us? First, the Capital One and Equifax breaches are negatively surprising, but in the way that losing a gambling bet is negatively surprising. We have always known that there is some low chance of loss, and we have known that the data at stake is our financial data. We took the gamble of a 2% chance at a 100% cost, and when that loss materialized, we felt bad. The outrage we see today is a response to experiencing the cost. Perhaps we thought the chance of loss was lower, or we are appalled at the technical incompetence of the humans involved in those cases. But there’s nothing deeper there, in my view.

The correct outcome is to improve the quality control of the system. This can be done perhaps by forcing cloud providers like Amazon to have more safety limitations out of the box, or by moving more of our information onto blockchain-based systems where individuals control their own data. At least in that case, the losses will be internalized to each individual at the time of their personal failure (lost my keys!), rather than correlated and externalized to the entire group whose data a centralized party (e.g., Capital One) is managing. But we cannot fix human society structurally just by asking people to download wallets. We cannot change our lazy, careless nature.

The second thing the framework tells us concerns the opposite scenario: a 100% loss at what we perceived as a 2% cost. We used to believe that Facebook and Google had our information, but that it wasn’t particularly valuable, personal, or sacred. This is of course entirely wrong. We have learned the hard way that the Tech giants have everything, and that the more sacred it is, the more they want it. Second, we used to believe that what they have is relatively secure and inaccessible to others. This too is incorrect. By opening up the honeypot to Cambridge Analytica, Facebook made it a core business practice to bleed out what we want to protect.

I would say what we have lost is the right and the ability to think our own thoughts. To make up our own opinions, crushed as we are in the maw of algorithmic advertising and propaganda.

Remedies

This second thing is far worse than a hack, and should be punished far more severely. A systemic design that takes the probability of loss and turns it into a business model is flawed, and one we should abhor deeply. I don’t have to persuade people to be outraged at Facebook; they already are, for far less clearly articulated reasons. But this thought process has helped me identify why invisible microthefts are a problem, and how to fix them. We see Facebook addressing the issue by both (1) lowering the chance of loss, by saying that the open developer program that powered Cambridge Analytica is now closed (or better monitored), and (2) lowering the value of the loss, by re-focusing on privacy and submitting itself to increasing regulation. And yet here they are, trying to start a new global currency!

The good news is that people are finally waking up to the fact that they have made a bad bargain. We recognize that the faces of our children are used to power machine-vision artificial intelligence algorithms, that our location and shopping data can be used to discriminate in access to financial services products, and that our searches and conversations are neither private nor fully protected. With this recognition comes a sense of cost: how much are we willing to give up, now that we see that things are not free? Listen, all technology and human processes are fallible, and so we should not aim for perfection. We should aim at the intersection of marginal cost and marginal benefit around security, privacy, liberty, and convenience. We should assume the risk and sail into the Great Beyond.


Lex Sokolin

Entrepreneur building next-gen financial services @Consensys @Autonofintech @Advisorengine, JD/MBA @columbia_biz, editor and artist @inkbrick