
Data Privacy And GDPR: Treading Carefully Is Still The Best Course

As the furor over the Facebook/Cambridge Analytica situation continues, calls for much more rigorous regulation of tech companies are becoming increasingly common. On the surface, this seems reasonable: it's hard to deny that many companies' handling of users' data remains messy, with users often left confused and frustrated, having no idea about the scope of the data they're sharing with those companies. And yet, I am going to argue that we — as users, customers and society as a whole — stand to lose a lot if we act purely on instinct here: excessive regulation, if handled poorly, can harm the market immensely in the years to come, and ultimately leave us worse off, not better.

The current discussion around data privacy didn't actually start with the recent Facebook scandal. Over the last few weeks, you might have received notices from multiple tech companies about updated terms of service — those are driven by the companies' preparations for the General Data Protection Regulation, or GDPR, a new set of rules governing data privacy in the EU, set to take effect on May 25th this year. If you're interested, here are a couple of decent pieces providing an overview of GDPR, from TechCrunch and The Verge.

Now, GDPR is still an EU regulatory framework, so naturally, it only governs the handling of data belonging to users who reside in the European Union, which prompts the question: why should people in other geographies bother to learn about it? Well, to answer that, here's a quote from the recent The Verge article:

"The global nature of the internet means that nearly every online service is affected, and the regulation has already resulted in significant changes for US users as companies scramble to adapt."

And that's exactly right: while GDPR only applies to data belonging to EU residents, it's often hard, if not altogether impossible, to build a separate set of processes and products for a subset of your users, especially a subset as large, diverse and interconnected as the European one. Therefore, quite a few companies have already announced their intention to use GDPR as the "gold standard" for their operations worldwide, rather than just in the EU.

Quite a few things about GDPR are great: the new terms of service are about to become significantly more readable; companies will be required to ask users to explicitly opt in to data sharing arrangements, instead of opting them in by default and forcing them to hunt for buried "opt out" options; and the opportunity for users to request from any company a snapshot of all the data it holds on them is likely to prove extremely useful. Abuse like that in the Facebook/Cambridge Analytica case (irrespective of who's to blame there) is also about to become much harder, not to mention much costlier for the companies involved (under GDPR, maximum fines can reach 4% of a company's global annual turnover, or €20 million, whichever is larger).

So what's the problem then? Well, first of all, GDPR compliance is going to be costly. Europe has already witnessed the rise of a large number of consultants helping companies satisfy all of GDPR's requirements before it kicks in in May. The issue is that large companies can typically afford to pay the consultants and the lawyers to overhaul their processes, while smaller companies and emerging startups often can't afford the costs of becoming fully compliant with the new regulations.

That, in turn, can mean one of two things: either the authorities choose not to enforce the new laws to the full extent for companies below a certain threshold in terms of revenue or number of users, or GDPR threatens to seriously thwart competition, aiding the incumbents and harming the emerging players. The second scenario is hardly something that the regulators, not to mention ordinary citizens, can consider a satisfactory outcome, especially in light of the recent outcry over Facebook, Google and a few other big tech companies — most people have no desire to see these companies become even more powerful than they are today, and yet that's exactly what GDPR might end up accomplishing if it's enforced in the same fashion for all companies, irrespective of their size or influence.

The second problem is that while the first of GDPR's principles, "privacy by design", isn't really new to the market, the second, "privacy by default", is a significant departure from how many tech companies, in particular those in the marketing/advertising space, operate today. In short, GDPR puts significant restrictions on the data about a user that companies are allowed to collect, and on the situations in which they're allowed to share it with their partners (in most cases, they'd need to obtain explicit consent from the user before her data could be shared). That potentially puts the entire marketing industry at risk, as most of the current advertising networks employ various mechanisms to track users across the internet, and routinely acquire data from third parties on users' activities and preferences in order to enable more effective targeted advertising. Right now, this way of doing things seems to be under direct threat from GDPR.

Now, there are plenty of people who believe that the current advertising practices of many companies are shady at best and downright outrageous at worst, and that any regulation that forces companies to rethink their business models should be welcomed. To that end, I want to make three points on why the situation isn't necessarily that simple:

1. Advertising is what makes many of the services we routinely use free. Therefore, if the current business model of the vast majority of those companies comes under threat, we need to accept that we'll be asked to pay for many more of the services we engage with than we do now. The problem, of course, is that most consumers, for better or worse, really hate paying for the services they use online, which means that a lot of companies might find themselves without a viable business model to fall back on.

2. The incumbents are the ones who stand to win here. What comes to mind when you think of companies that don't need to rely on third-party data about their users to successfully advertise to them? Facebook, LinkedIn, Google. Those companies already possess huge amounts of information about their users, so they'd actually be the least threatened by tightened regulations on data sharing, and they'd likely become even stronger if their competitors for advertising dollars are put out of business.

3. A "separate web" for the EU users. Right now, it looks like many companies are inclined to treat GDPR as the "gold standard". However, it's worth remembering that they still have another option. If GDPR compliance proves too harmful for their businesses, instead of adopting it globally, they might choose to go to the trouble of creating a separate set of products and processes for the EU users. That, of course, would most likely mean that those products would receive less attention than their counterparts used by the rest of the world, and would feature more limited functionality, harming the users who reside in the EU. It would also harm the competitiveness of European companies, as well as their ability to scale globally: unlike their foreign-based peers, they would face more restrictive and costly-to-comply-with regulations from the start, while, say, their U.S. peers would have the luxury of scaling in more loosely regulated markets first, before expanding to Europe — at which point they'd be more likely to have the resources necessary to withstand the costs of compliance.

Once all of this is taken into consideration, I'd argue it becomes clear that the benefits that come with stricter regulation, however significant, don't necessarily outweigh the costs and the long-term consequences. Data privacy is, of course, a hugely important issue, but there is little to be gained from pursuing it above everything else, and a lot to lose. With GDPR, the EU has chosen to put itself through a huge experiment whose outcome is far from certain; the rest of the world might benefit from watching how the situation around GDPR unfolds, waiting to see the first results, and learning from them, before rushing similar proposals in their home countries.

Cambridge Analytica Crisis: Why Vilifying Facebook Can Do More Harm Than Good

Throughout the week, I've been following the Facebook and Cambridge Analytica scandal as it's raged on, growing more and more incredulous. Yes, this is a pretty bad crisis for Facebook (which it inadvertently made even worse with its clumsy actions last week). But it still felt to me that the public outrage was overblown and, to a significant degree, misdirected. Here are the key things that contributed to those feelings:

1. Don't lose sight of the actual villains. Aleksandr Kogan and Cambridge Analytica are the ones truly responsible for this, not Facebook. Facebook's practices for managing users' data might have been inadequate, but it was Kogan who passed the data to Cambridge Analytica in violation of Facebook's policies, and then Cambridge Analytica that chose to keep the data instead of deleting it in compliance with Facebook's requests.

2. Nobody has a time machine. It might seem almost obvious that Facebook should have reacted differently when it learned in 2015 that Kogan had passed the data to Cambridge Analytica — an extensive audit of Cambridge Analytica's machines, for example, would certainly have helped. The problem is, it's always easy to make such statements now, yet nobody has a time machine to go back and adjust her actions. Was Facebook sloppy and careless when it decided to trust the word of a company that had already been caught breaking the rules? Sure. Should it be punished for that? Perhaps, but rather than using the benefit of hindsight to argue that it should have acted differently in this particular case, it seems more worthwhile to focus on how most companies dealing with users' data approach those "breach of trust" situations in general.

3. Singling out Facebook doesn't make sense. To the previous point, Facebook isn't the only company operating in such a fashion. If one wants to put this crisis to good use, it makes more sense to demand more transparency and better regulatory frameworks for managing users' data than to single out Facebook and argue that it alone needs to be regulated and/or punished.

4. Don't lose sight of the forest for the trees. It's also important to remember that data privacy regulation is a two-way street, and by making the rules tighter, we might actually make the Facebooks of the world stronger, not weaker, harming the emerging startups instead. This is a topic for another post, but in short, strict data regulation usually aids the incumbents while harming the startups, which find it more difficult to comply with all the requirements.

5. Data privacy is a right — since when? Finally, while the concept of data privacy as a right certainly seems attractive, it's not as obvious as it might appear. Moreover, it raises an important question: when exactly did data privacy become a right? This isn't a rhetorical question either. It certainly wasn't treated as one in the past: many of the current incumbents have enjoyed (or even continue to enjoy) periods of loose data regulation (e.g. Facebook in 2011-2015, or so). So if we pronounce data privacy to be a right today, we are essentially stifling competition going forward by denying the startups of today similar opportunities. Does this sound nice? Of course not, but that's the reality of the market, and we have to own it before making any rash decisions, even if some things seem long overdue.

Overall, this crisis is indicative of multiple issues around data management, and it can serve to launch a productive discussion on how we might address data privacy concerns going forward. At the same time, it doesn't do anyone any good to vilify Facebook beyond what's necessary (and some of the reporting these days has been utterly disgusting and irresponsible); the #deletefacebook campaign doesn't really seem justified (again, why not get rid of the vast majority of apps then, given that Facebook isn't that different from the rest); and any further discussion about data privacy should be carefully managed to avoid potentially harmful consequences: most of us have no desire to find ourselves in a world where we have perfect data privacy, and no competition.