Battle in Print: Rethinking privacy and trust

Norman Lewis, 20 October 2009

One of the most confusing things about the question of privacy, and what makes it so elusive today, is that it is far from evident how people regard their right to privacy or how these attitudes translate into day-to-day behaviours. The concept of privacy, once thought of simply as the right to be left alone, has been transformed as we have become more information oriented and digital technologies ensure that almost everyone now has a digital fingerprint. Complicating matters further, in recent decades the boundary between private and public has become blurred. A new age of disclosure has emerged in which reality TV and social networking sites now represent the acceptable face of public scrutiny.

Can one seriously argue that privacy is regarded as something important today? At one level it is clear that contradictory attitudes and practices co-exist (often in the same individuals). Some people, for example, while concerned about data collection and the potential abuse of power by the state, are quite happy to reveal their deeply personal and innermost thoughts on social networking sites. People who are concerned about privacy seem willing to bargain away very personal information in exchange for relatively small rewards (often financial). Others, who are keen to protect their sexual or medical histories, will gladly disclose vital details of their financial circumstances on commercial websites. Others who reveal themselves on social networking sites are paranoid about online transactions, fraud and identity theft. When it comes to security, even those who regard privacy as intrinsic to personal liberty are willing to accept encroachments on their liberties and rights by the state with little protest or outcry – in the name of anti-terrorism or anti-crime (just think of the deployment of CCTV cameras in the UK). And, even more worrying, many appear to have no problem in any of these spheres.

A convenient trade-off?

What appears to be happening is that privacy is increasingly being regarded as an area of trade-offs, rather than a political principle that needs to be defended or upheld in all circumstances, particularly against the state:

• Privacy will be traded for free content online;
• Personal information will be shared with some institutions for some other benefit;
• State surveillance and data capture will be accepted as a trade-off for personal security (although, as the authors of Database State argue, giving up privacy does not necessarily enhance security (1)).

There are numerous studies showing the differential attitudes people have towards sharing information in different circumstances. The recently published Economic and Social Research Council study Assessing Privacy Impact reveals this clearly. According to Dr Ian Brown of the Oxford Internet Institute, their latest annual survey of internet use found that while more people were now happy to provide email addresses and names to e-commerce websites, public concern over data collection by state institutions (beyond concerns about competence) remains very high: ‘the public is unhappy about extensive sharing, even for purposes such as counter-terrorism and medical research’ (2).

So how does one begin to understand what is really happening here, let alone what this might mean for the future? It appears that people’s willingness to share information about themselves depends to a large degree on who they are sharing that information with. It is precisely because people have different levels of trust (or confidence) in different institutions that these differences in attitude and behaviour can co-exist. This is what makes anticipating privacy behaviour so elusive: how much information people will disclose, or how they will regard a breach of data protection, depends not only on their prior attitude towards privacy, but also on how much they trust the beneficiary of their self-disclosure or the holder of the breached data. Risk trade-off behaviour is mediated through a trust relationship between the discloser and the recipient of information. The distinction between sharing information with people in a social network and sharing it with institutions sheds considerable light on the behaviours noted above.

Trust and confidence

One of the most important studies of trust, by Professor Adam B Seligman and published in 1997, draws and insists upon a distinction between trust and confidence (3). This is a very important and helpful distinction for our purposes.

Seligman argues that there is a fundamental difference between trust in people (interpersonal relationships) and confidence in institutions. (The same would apply to technological systems, though these are not Seligman’s focus.) This goes to the heart of what trust actually is – a relationship which is not based upon reciprocal calculation but is open-ended. Seligman argues convincingly that if a trusting act were based upon a calculation of expected outcomes, or on the rational expectation of a quantified outcome, it would not be an act of trust at all but an act based on confidence: confidence in the existence of a system that delivers what it promises. The suspension of reciprocal calculation is precisely what defines trusting relationships.

Trust not only entails negotiating risk; it implies risk (by definition, if it is a means of negotiating that which is unknown). But the risk is specific. It is based upon the implicit recognition of others’ capacity to act freely and in unexpected ways. Unconditionality and engagement sit at the heart of trust relations. As Seligman notes, if all actions were constrained or regulated there would be no risk, only confidence or a lack thereof. In relations of confidence, roles are prescribed and passivity defines behaviour: we give data to the state, and it acts upon that data, more often than not, outside of our control. Data protection legislation protects data and prescribes what can or cannot be done with it. We are passive onlookers who give up that data willingly or inadvertently.

So, in our interpersonal relationships – in the realm of trust – we act as free individuals and recognise others’ free agency as well. But when we act in predefined ways (that is, when we are constrained), trust is neither called for nor established. Confidence that everyone will act in accordance with the law or existing moral standards suffices. It is only when aspects of behaviour transcend this that trust emerges systematically as an aspect of social organisation.

Thus, the origins of trust are rooted in our recognition of others’ freedom to act. Trust is linked to free will: the ability to act autonomously, to recognise that ability in others, and to act outside of predefined or ascribed roles.

Trust is therefore a very rare thing, and because it is based on free will, trust cannot be demanded, only offered and accepted. Trust and mistrust thus develop in relation to free will and the ability to exercise that will, as different responses to aspects of behaviour that can no longer be adequately contained within existing norms and social roles.

High trust – low privacy

This provides some important insights into the complexities surrounding the contradictory privacy behaviours mentioned above. Sharing of personal information on Facebook is thus a fundamentally different act from allowing one’s personal data to be used by the National Health Service or any other government institution.

In the first instance, social networking sites are voluntary. Joining and participating is based upon free will and the ability to act autonomously. By adding friends to our network, we implicitly recognise this freedom in our friends and acknowledge their ability to act outside of predefined roles. Reciprocity is an outcome, rather than a condition for participation. We gain acknowledgement from our peers without assuming in advance what form that should take. Outcomes cannot be predetermined. It is a trust relationship because it is open-ended: individuals are free to control their identities, to decide how they present themselves and what they share, and to recognise the same capacities in their friends. For younger people in particular, this is perhaps their only truly private space. Not even their homes or bedrooms are as private as this. It is thus a high-trust space where privacy concerns have a negligible impact on behaviour.

Low trust – high privacy

The opposite pertains to environments in which requests for information are made by the institutions and organisations we interact with. From what has been said above, it should be clear that our relationships with state institutions are based upon confidence rather than trust: roles are ascribed, and outcomes are intended and expected. Transgressions are resolved through the legal system. There is neither unconditionality nor active engagement, but a passive, prescribed role relationship that is not subject to change or control. Such passivity, and the expectation that trust will simply be delivered, is anathema to the establishment of real trust relations.

In these circumstances it is clear that privacy concerns will come to the fore and influence behaviour far more than in the case of social networks. The blurring of the boundaries between public and private today, and the general disengagement of atomised citizens from the political process, mean that confidence in the institutional frameworks of society is extremely low. Where confidence, indeed trust, is lacking, privacy concerns come to the fore.

The future

The tentative conclusion, and the fundamental insight this approach offers, is that privacy attitudes and behaviours will change according to the level of trust or mistrust people have in the people or institutions they are interacting with. How much they trust the potential beneficiary of their self-disclosure is now the overriding motivator of behaviour.

This suggests that any discussion which does not take this as a starting point will inevitably get it wrong. Regulation and legislation (data protection legislation, for example) or technology-based solutions (identity management systems, privacy policies and so on) can exacerbate rather than allay fears because they fail to take into account the trust relations underpinning them. This is a problem of perception and social attitudes, not something that is susceptible to legal or technical fixes. For regulation or technological solutions to be effective, they have to start from the prior question of how much trust the public places in the given institution or system.

Understanding how people develop their perceptions of trust and mistrust must be the starting point for any rethinking of the question of privacy. This is the challenge.

Author

With over ten years’ experience in telecoms innovation, Norman is recognised worldwide as an expert on future trends and user behaviour around Voice and Messaging and digital lifestyles. He has been a keynote speaker on these topics at O’Reilly Emerging Telephony, eComm, Telco2.0, 3G World Congress and Mobility Marketplace, the Global Billing Association and many other events. Norman is also Chief Strategy Officer of Wireless Grids Corporation, USA.

Footnotes

1) Database State, Joseph Rowntree Reform Trust, 2009
2) Assessing Privacy Impact, ESRC, October 2009, p11.
3) Seligman, Adam B (1997) The Problem of Trust, Princeton University Press, Princeton, New Jersey.
