Different kinds of privacy, empowerment and autonomy – centralized versus decentralized


In an article in the Guardian last week, Professor Alex ‘Sandy’ Pentland mooted the potential for Google to cleave in two, with one part dedicated to providing a regulated, bank-like service for data. Pentland directs the MIT Human Dynamics Lab and co-leads both the Big Data and the Personal Data and Privacy initiatives of the World Economic Forum, and his name crops up surprisingly often in my hi:project-related research. Yet I find it difficult to reconcile his observation here with his fluency in the power of decentralized networks:

Social physics strongly suggest that [Adam Smith’s] invisible hand is more due to trust, cooperation and robustness properties of the person-to-person network of exchanges than it is due to any magic in the workings of the market. If we want to have a fair, stable society, we need to look to the network of exchanges between people, and not to market competition.

Pentland continues under the heading: How can we move from a market-centric to a human-centric society?

Social physics suggests that the first step is to focus on the flow of ideas rather than on the flow of wealth, since the flow of ideas is the source of both cultural norms and innovation. A focus on improving idea flow, rather than financial flows, will allow individuals to make better decisions and our society to develop more useful behavioral norms. A key insight from social physics is that it is critical that the flow and exchange of ideas be inclusive, because insufficiently diverse idea flow leads to rigid and insular societies, and insular communities often inflict terrible damage on weaker communities with whom they share resources.

Having written about influence flows in my 2011 book, I couldn’t agree more with this observation, and I find myself wondering why Pentland might then resign himself to such a centralizing vista as a Google data bank. Is such continued centralization unavoidable? Are banks in the monetary sense so successful, so resistant to decentralization, as to provide a valid model for data?

Apple’s news this month provides an interesting example for discussion.


Asked in the Guardian article whether Apple may be up for something similar, Pentland responds:

Apple is extraordinarily non-transparent and they would have potentially a lot of difficulty.

Earlier this month, Apple announced ResearchKit:

Medical researchers are doing some of the most important work in the world, and they’re committed to making life-changing discoveries that benefit us all. To help, we’ve created ResearchKit, an open source software framework that makes it easy for researchers and developers to create apps that could revolutionize medical studies, potentially transforming medicine forever.

In other words, Apple is transforming the iPhone and imminent Apple Watch into sources of real-time, continuous data streams for medical research. ResearchKit sits on the previously announced HealthKit:

HealthKit allows apps that provide health and fitness services to share their data with the new Health app and with each other. A user’s health information is stored in a centralized and secure location and the user decides which data should be shared with your app.

Apple is then ticking some of those important boxes. User decides what? Tick. User decides who? Tick. Secured and hidden from Apple’s eyes and ears? Tick. Now let’s run it by the transdisciplinary perspectives of some hi:project members – in terms of quantified self, health data and the internet of things.

Quantified Self (QS)

Adriana Lukas, founder and organiser of London Quantified Self group:

It is indisputable that analysis of aggregate health data is beneficial. We could argue whether it is the most effective way of helping the individual – it may be that aggregation of individual insights [ie, post-analysis] would arrive at fundamental innovation in health and wellbeing faster, and therefore we need to provide individuals with the capabilities for uncovering such insights.

Adriana proselytizes self-managed QS, a future in which “expertise is supplied rather than outsourced”, where each of us acquires “agency as sense-maker”. When it comes to ResearchKit, she recognises that:

… in the absence of any clear progress in having access to sophisticated analysis at individual level, a well designed system for aggregating and analysing health data is potentially a Good Thing.

She defines ‘well designed’:

    1. Meeting the legal requirement of not using user data without consent.
    2. Not merely following the ‘letter of the law’ of user consent and opt-in request but also the ‘spirit of the law’ with transparent UI/UX making it obvious at each step what is being consented to.
    3. Going beyond simply making the act of consent clear, by giving the user a much fuller understanding of what their data does or could reveal about them, who has or could have access to their data and to the results of analysis, how such aggregation impacts their privacy and, for good measure, that anonymisation is an argument used only by people who merely pay lip service to privacy.

No matter how beneficial Apple ResearchKit will turn out to be – and there’s little doubt that if it achieves half its expected potential it will be an improvement – the downside for people like me is that its success will crowd out alternative approaches to medical research and personal health data. The big clever algorithm in the sky will improve our collective understanding of disease and health and the benefits will trickle down to each individual. It will not help me frame my own questions to ask my own data and arrive at my own understanding of my own health and wellbeing. And that is the future I am hoping for.

Health data

Adrian Gropper, CEO, HealthURL Consulting:

With the ResearchKit announcement, Apple said two things that no major company or institution in healthcare has ever said: (1) Apple will not see your data; and (2) ResearchKit is open source software. HealthKit already introduced the third key leg of the foundation by making Pairwise Pseudonymity the default.

I wasn’t too fluent with pairwise pseudonymity, although I look forward to dropping it into polite conversation henceforth. A pairwise pseudonymous identifier (PPID) identifies an individual to a third party (known as the “relying party”) in a way that cannot be matched with the individual’s PPID used with another third party, all in the pursuit of avoiding de-anonymization.
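The derivation behind a PPID can be sketched in a few lines. This is an illustrative toy, not how HealthKit or any particular identity provider actually implements it: here an identity provider holds a master secret and derives a distinct, stable identifier per relying party with an HMAC.

```python
import hashlib
import hmac

def ppid(master_secret: bytes, user_id: str, relying_party: str) -> str:
    """Derive a pairwise pseudonymous identifier: stable for one
    (user, relying party) pair, but unlinkable across relying
    parties by anyone who lacks the master secret."""
    message = f"{user_id}|{relying_party}".encode()
    return hmac.new(master_secret, message, hashlib.sha256).hexdigest()

secret = b"identity-provider-secret"  # held only by the identity provider

id_for_lab = ppid(secret, "alice", "research-lab.example")
id_for_clinic = ppid(secret, "alice", "clinic.example")

# Each relying party sees a different identifier for the same person,
# so the two parties cannot trivially join their records on it.
assert id_for_lab != id_for_clinic
# Yet the identifier is stable per pair, so a relying party can
# recognise a returning individual.
assert id_for_lab == ppid(secret, "alice", "research-lab.example")
```

The unlinkability is exactly what frustrates the cross-service aggregation that de-anonymization attacks rely on.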

Together, these three principles become the foundation for technology that moves beyond opaque and coercive concepts like “consent” to effective transparency and authorization by the individual subject. Consent … is still an institutional tool aimed at justifying a priesthood or a brand.


We need a standards-based alternative to Apple’s walled garden where different communities of interest, be they patients or licensed clinicians, can set their own governance rules and certify their own open source apps. Individual doctors and patients must have the opportunity to choose apps and authorization standards at “the edge” of the network on a day-by-day basis.

This is what we’re doing in the OpenID HEART project. We are profiling the use of a personal authorization server [see UMA FAQ] to provide limited and automated policy-based authorizations for clinical and research access to personal data wherever it sits. We take pairwise pseudonymity for granted as a means to reduce or even prevent hidden surveillance and aggregation. The charter is designed to make open source, and therefore fully owned, authorization servers first-class citizens next to institutional ones like Apple’s so the institutions have to compete for our trust without coercive consent policies.
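The personal authorization server Gropper describes can be pictured roughly as follows. This is a deliberately minimal sketch of the idea, not the UMA or HEART API; every class, method and resource name here is illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class PersonalAuthorizationServer:
    """Toy model: the individual sets policies once; access decisions
    are then made automatically, without a per-request consent dialog."""
    # resource name -> set of parties the individual has authorized
    policies: dict = field(default_factory=dict)

    def set_policy(self, resource: str, allowed_parties: set) -> None:
        self.policies[resource] = set(allowed_parties)

    def authorize(self, resource: str, party: str) -> bool:
        # Limited, automated, policy-based authorization
        return party in self.policies.get(resource, set())

pas = PersonalAuthorizationServer()
pas.set_policy("glucose-readings", {"my-clinician", "diabetes-study"})

assert pas.authorize("glucose-readings", "my-clinician")      # clinical access
assert pas.authorize("glucose-readings", "diabetes-study")    # research access
assert not pas.authorize("glucose-readings", "ad-network")    # everyone else
```

The point of the sketch is the shift in control: the policy lives with the individual, not with the institution holding the data, so institutions must compete for access on the individual’s terms.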

Readers familiar with the hi:project’s vision will appreciate immediately why the OpenID HEART project is an important partner.

Internet of Things

Rob van Kranenburg, Sociotal FP7 ICT Objective 1.4 IoT, and Founder of the Internet of Things Council:

Internet of Things (IoT) is in essence the seamless flow between the:

  • BAN – body area network eg, the ambient hearing aid, the smart t-shirt, Google Glass
  • LAN – local area network eg, the smart meter as a home interface
  • WAN – wide area network eg, telematics, intelligent transport systems, connected car
  • VWAN – very wide area network eg, the smart city, e-gov services no longer tied to physical locations.

Whoever ensures traceability, sustainability and security is able to offer the best possible feedback on physical and mental health, the best possible household decisions based on real time monitoring for resource allocation, the best possible decision-making based on real-time data and information from open sources, and the best possible alignments of local energy providers with the global potential of wider communities. That is what #IoT can be.

This corresponds to the hi:project’s submission to the UN’s Data Revolution Group in which we assert: “Personal data must be allowed to breathe for it to be of most value to the individual and society, and the corresponding parameters are best set by the individual in question with clear appreciation for the mutual value thus realised or suppressed.”

Rob concludes:

Google is rolling out Glass and Lens, the Google Power meter and Nest, the car and automotive, open data initiatives, and cultural hegemony through google.org. These gateways, the platform and the ‘app’ or service store, should be in public hands.

Given Pentland’s comparison of Google’s and Apple’s openness earlier, you might wonder why Rob apparently isn’t ‘getting’ open. I needn’t explain here, however, as this post does the job perfectly: Apple Research Kit is Open Source But Is It “Open”?

On the road to decentralization

The sentiment of this post is simple – there is centralized privacy, empowerment and autonomy, and then there’s decentralized privacy, empowerment and autonomy. Both improve on the status quo, but we must regard the former as simply a signpost pointing towards the latter. We must not be complacent and settle for the centralized version. Critically, the hi:project helps for-profit entities, at least those not benefiting from centralization today, take a lead with us in establishing decentralized privacy, empowerment and autonomy to mutual advantage.

Evgeny Morozov is a contributing editor at The New Republic and the author of The Net Delusion: The Dark Side of Internet Freedom and To Save Everything, Click Here: The Folly of Technological Solutionism. In a recent article, When apps are driven by the market, there’s only one winner. It’s not you … he asserts:

It could be that the worst excesses of capitalism were manageable, at least on a psychic level, precisely because we could occasionally shelter ourselves in various hermetic zones that did not bend to the logic of supply and demand. These zones, impervious to the rhythms of globalisation, reassured us that a personal autonomy outside the market bubble was a feasible objective.

It seems to me that Morozov may share Pentland’s emphasis on the network of exchanges between people over market competition. He continues:

At a time when values such as solidarity, fairness and diversity are under constant attack, the ability to incorporate more information into our decision-making is only hastening their demise. Ignorance can, in fact, be bliss – especially if what awaits us on the side of knowledge is the imperative to become more efficient, competitive and profitable. In the absence of other radical projects to challenge the status quo, ignorance – or, rather, the informed refusal to know – can be a powerful antidote to the efforts to reduce everything to a knowable price whose very existence already formats citizens into consumers.

Perhaps the hi:project vision is just such a radical project, although I hesitate to posit the idea in response to a critic of technological solutionism!

With decentralization we move from deliberate big data to emergent masses of small data. We move from surveillance to socioveillance. We move forwards (or is it backwards?) from the consumer, from the user, to the human.

Image source: Apple press info

What do you think?