Anonymity and privacy are hot debate topics these days, especially when it comes to IoT, payments, and cryptocurrencies. However, the focus on privacy may blur more important issues. The fact that consent is a mandatory requirement under GDPR is a positive evolution, but it gives you only a certain level of control over the data you would like to transmit. In the hyper-connected world of IoT, smart wristbands, connected cars, and smart houses, anonymity might be difficult to achieve. Today, a quest for anonymity has set in, for some inspired by GDPR, for others by the idea of absolute freedom.
In the world of machine-to-machine communication, devices communicate all the time and everywhere. This world is one of constantly exchanged data: location, time, and perhaps other identifiers that tell the IoT space what device you are, where you are going, how heavy the device is, its velocity; the list can be endless. Eventually it will be anything but anonymous, if it isn't already. The fact of the matter is that correlation of this data aims to detect patterns, and such a pattern could potentially put anonymity at stake.
Instantaneous photographs and newspaper enterprise have invaded the sacred precincts of private and domestic life; and numerous mechanical devices threaten to make good the prediction that "what is whispered in the closet shall be proclaimed from the house-tops." ("The Right to Privacy", Warren & Brandeis, 15 December 1890)
In the world of automation and robotization, overcoming privacy issues seems a daunting task. These debates could slow down or even halt technological progress. Strangely enough, we readily accept breaches of privacy in our physical world. We accept being filmed by cameras on the highway, and undergoing security measures at airports that contribute more to security theatre than to real safety. When it comes to technology, the bar is much higher. We insist on knowing how privacy is enforced, who may access our data, and what purposes it will be used for. This contrast is especially striking given that these physical security measures and their outcomes all end up in a machine that analyzes them.
Balancing the impossible
Finding the balance between reality and the perception of that reality is key in the debate on connected technologies. However, what we accept as a society is not the same as what we interpret or consider reality. We accept rigorous controls in our daily lives, but we experience them as invasive when technology is involved. Once machines are introduced into the story, we might even consider them dangerous to our future existence as a species. A balance between trust and control is required: how much control is acceptable, and what controls are required to provide trust?
“Trust is the glue of life. It’s the most essential ingredient in effective communication. It’s the foundational principle that holds all relationships.” (Stephen R. Covey)
If we are to find a balance one day, we must ensure that society can control the controller at all times, and that a controller can never tamper with data that highlights inappropriate behaviour. The good thing is that Distributed Ledger Technology can provide just that proof. Control is not enforced by one person, but by a ledger, like the Tangle. The difficulty lies in defining a tolerable level of control and providing an acceptable, neutral controller. Whenever governments step in as controller, it can and will be considered Big Brother; the controller therefore needs to be non-invasive and neutral in order to receive general acceptance.
Is true anonymity the ultimate goal modern society desires to attain? Perhaps it is sufficient to gain control over your identity, virtual or real, combined with control over the data you would like to share and how it will be shared.
"Privacy is not something that I'm merely entitled to, it's an absolute prerequisite." (Marlon Brando)
But what if safeguarding anonymity is a utopia in a connected world? We might as well monetize our data in order to control the level of privacy we desire, instead of giving it away freely. For example, an algorithmic model might pay a sample pool of users for their data (like a data marketplace), bringing the value of the data back to the user, instead of offering a service in return for giving up privacy and/or data control.
Today, we give away privacy-sensitive data in exchange for a free service, instead of profiting from our own data. A feasible financial model might be paying a sample pool of users, rather than everyone, which would not scale. The Internet of Things may open a free-market economy that determines how much each user's data is worth.
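The "sample pool" model above could be sketched as follows. This is a hypothetical illustration only: the user list, the 1% sample size, and the flat per-user price are all invented assumptions, not a description of any real data marketplace.

```python
import random

# Hypothetical sketch of the "pay a sample pool" model: instead of
# buying data from every user, a buyer draws a random sample of users,
# pays each a fixed price, and works with that representative subset.
# All names and prices here are assumptions for illustration.

users = [f"user-{i}" for i in range(10_000)]
PRICE_PER_USER = 0.50  # assumed flat rate per user's data stream, in EUR

random.seed(42)  # fixed seed so the sketch is reproducible
sample_pool = random.sample(users, k=100)  # a 1% sample instead of everyone

payout_total = len(sample_pool) * PRICE_PER_USER
print(f"{len(sample_pool)} users paid, total cost EUR {payout_total:.2f}")
```

The point of the sketch is the economic shape of the model: the buyer's cost scales with the sample, not the population, and each sampled user is compensated directly rather than paying with their privacy.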
What data are we giving away today (and why)?
- Phone (calls, contacts, physical metrics)
- Social networks (messages, who your friends are, what your opinions are)
- Search queries (what you care about)
- Online shopping (what you like to consume)
- Geo-location (where you are)
- Off-phone conversations (who you talk to, about what)
- Emotional state (when is the best time to sell you something)
- Political opinions (which way you lean)
What data might we give away in the future?
- Biometrics (blood pressure, heart rate, health info, sporting habits (cross-referenced with your fridge))
- Smart car (location, speed, music preferences, driving quality, who you talk to and about what, car mechanical state)
- Smart houses (when you come home, what time of day you open your fridge, how long do you sleep, what you watch on TV)
If we sell the above data streams separately to different organizations, we might be able to protect our identity; it might be difficult for the organization buying your data to find out who you really are. However, if a data broker collects these streams and starts correlating them, it might become very dangerous. We might become identifiable as a person: our habits will tell them when we are at home, when we are on holiday, what we eat, how much TV we watch, and so on. The result could be quite accurate, along the lines of “single male, 43 years old, regular beer drinker, cigar smoker, healthy eater, lives in Germany, never home on Friday evenings”. This type of data can become your digital nightmare when you are looking for health insurance, employment, or housing for that matter. Even when DLT provides some measure of anonymity, the data you sell does not.
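The correlation risk described above can be made concrete with a toy sketch: two data streams that each look harmless on their own are joined on the attributes they share (so-called quasi-identifiers such as region, age, and gender), yielding a composite profile. The streams, field names, and records below are all invented for illustration.

```python
# Toy sketch: how a data broker might re-identify a person by joining
# separately purchased data streams on overlapping quasi-identifiers.
# All records and field names are hypothetical.

fitness_stream = [  # e.g. sold by a wearable vendor
    {"region": "DE-BY", "age": 43, "gender": "M", "beers_per_week": 4},
]
smart_home_stream = [  # e.g. sold by a smart-home platform
    {"region": "DE-BY", "age": 43, "gender": "M", "home_friday_evening": False},
]

def correlate(stream_a, stream_b, keys):
    """Join two streams on the quasi-identifiers they share."""
    merged = []
    for ra in stream_a:
        for rb in stream_b:
            if all(ra.get(k) == rb.get(k) for k in keys):
                merged.append({**ra, **rb})
    return merged

profile = correlate(fitness_stream, smart_home_stream,
                    keys=("region", "age", "gender"))
print(profile[0])
# Each stream alone reveals little; the join reconstructs the kind of
# composite profile described above: male, 43, lives in Germany, regular
# beer drinker, never home on Friday evenings.
```

Each record here matches exactly one in the other stream, but the same nested-loop join over millions of records is what makes a broker's correlated dataset so much more revealing than any single stream it bought.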
Privacy is an illusion that is kept alive with rules and regulations in the hope of protecting it, even though we readily give it up by unconsciously sharing our data. It will be virtually impossible not to be tracked, but you had better know who tracks you, when, and to what extent.
* Opinions are my own and not the views of my employer or any organisation I'm working for or with.
Koen Maris is Director Cyber Security at PwC Luxembourg, transforming ideas into new services that help customers embed cyber security enterprise-wide.
Koen Maris started his IT career as a software developer. This experience provided a solid background in complex environments and a basis for the roll-out of challenging IT projects. After a few years, he swapped development for ethical hacking out of a natural curiosity about flaws in systems. This was the start of a technical career in IT security; due to the rise of security problems, his career evolved from ethical hacking to security solutions integration and eventually to the managerial side of security.
He was CISO and CTO at an international IT service provider preceding his current role. He advises large organizations in a multi-industry environment to think long-term about cyber security, and addresses complex security topics in layman's terms for boards of directors and executive committees.
Koen Maris serves as a trusted advisor for many organizations and is becoming a well-known speaker who challenges his audience and questions currently applied security models.