I have a strong memory from early childhood of reactions to the word “dog”. We didn’t have a dog at home; neither did our neighbours; but they were around. However, I quickly discovered that I could say the word “dog” and suddenly everyone around me would start looking around – even when there wasn’t one there. Such power from a three-letter word! Such fun! And the fun was at its best precisely when there wasn’t a dog to be seen. This was an early introduction to – and fascination with – language.
(Please stay with me: you are on the right blog – humour me).
As toddlers, beginning to find our way around the world, we come to understand that ‘things’ out there all have names, labels that we can use to refer to things, even those that can’t be seen, either presently or at all. I had struggled to understand the word “electric” (and pity my poor parents here) but finally gathered that it was something about being shiny, new, and dangerous to touch. “Electricity”, I assumed, meant “electric city” and, since I knew I lived in a city, meant I was in a dangerous place. Thankfully the delusion didn’t have lasting consequences, but you can start to see how a little understanding can be a dangerous thing.
In today’s increasingly digitally driven world, I feel that I am in a place dominated by toddlers, all struggling with vocabulary for this complex digital world growing around us. We think we understand some words – often, reasonably enough, based on our existing ‘world model’ or our attempts to have the word make sense according to that model – only to be let down at some decisive moment.
Two words clearly represent for me such a struggle: “data” and “privacy”. Together with the myriad phrases that can be built from those two innocuous words (“we value your privacy”, “your data is safe”), we are left – if we care to probe deeper – with a sense of inadequacy, of illiteracy even.
I should explain.
On Wednesday morning, I hosted my annual “Privacy Breakfast” on the eve of the IAPP Global Privacy Summit in Washington, D.C. As on previous occasions, I brought together around 25 friends and colleagues from Europe and the US, from the private and public sectors, and from legal, research and explicitly privacy-related professions. The theme for this year’s discussion (held under the Chatham House Rule, so no personal attribution of comments) was the role of Standards in ensuring privacy in cloud-based services.
As more and more services available to us through PCs, laptops, tablets and smartphones are delivered to us from cloud-based platforms, bewilderment and fascination grow around the types and scales of data that flow between our devices and these services – particularly data that might or does impact upon our privacy as individuals.
Whether as a response to public outcry, as a matter of principle or good business practice, we are faced with ever more notices that “we value your privacy” or that “your data is safe with us”. While laudable in their own right, and certainly preferable to silence on the matter, such statements of principle are often not enough, unless there is a pre-existing and strong bond of trust between the parties to any transaction. So how is anyone to trust the claims made by an unknown or untried merchant?
In medieval times, as cities grew, the number of merchants selling wares at a city market (and often coming to the city expressly and uniquely for such market days) far outstripped the small number known personally to any citizen. Reputation obviously played a part but could take time to cultivate and be lost overnight. Weights and measures were the early Standards at such markets – ensuring the customer received the length of cloth they paid for; and the merchant received payment in legally recognised coinage.
The statement of principle, “a yard of linen for only three groats!”, was backed up by the recognition of the unit of currency; the quality of the cloth validated by a Guild; and an objective measure of the lengths involved: the yardstick, the ell, and so forth. The statement of principle was given teeth by having such a measure against which to verify, validate, or refute the claim.
If the claims didn’t measure up, customers could run the merchant out of town, have them suspended from their Guild, or turn to the law or public authority to resolve any dispute. Again, claims could be tested against the objective measures at hand.
In our conversation on Wednesday, we returned time and again to the role that Standards can play here, but also to the lack of a vocabulary for the digital age. The absence of a common vocabulary, even for critical issues such as cross-border data flows, is damaging. In its absence, we are all forced to rely on extremely subjective (and possibly self-serving) definitions. “It is like competing for privacy in a Tower of Babel”. Nor are customers making the connections on their own between claims about privacy and the daily realities of intense and often intrusive personal data mining and profiling (see Jeff Gould’s article for such an example).
There is a world of difference, therefore, between a cloud-service provider making the sincere but ultimately consequence-free statement “we value your privacy” and one that can demonstrate – often with third-party certification – conformance and compliance with an objective set of tests and criteria. In the pre-digital economy, we have grown accustomed to relying on Standards for purchases of whole ranges of goods and services, from a humble lightbulb to a house: we don’t expect the lightbulb to explode; we do expect it to fit the socket; and we expect that the shop only stocks ones that conform to accepted Standards.
In the digital economy, it should be little different: it shouldn’t be down to the end-user or customer to have to read through reams of privacy statements or check millions of lines of code to be sure that their privacy really is being protected. Simple, certifiable and certified statements to this effect ought to be the norm.