
How incomplete privacy laws build echo chambers


  • Josh Entsminger
  • Mark Esposito
  • Terence Tse
  • Danny Goh

Can we liberate and protect our digital usage? Or will a misguided policy response end up making it all the more confined?

Your online choices are rarely private. With the recent revelations of the ‘do not track’ scandal, consumers need to be more aware of whether the privacy options they are offered actually perform as advertised. More importantly, consumers need to understand that the wrong policy response may help to improve some versions of privacy at the expense of accelerating other, more insidious trends that are shaping how online communities are organized and created.

Giving people control of their data without giving them the tools to understand that control may end up doing harm – not because control is negative in and of itself, but because the process does not go far enough. With new data laws, we are being given more choice as to how and when our data is used – or at least more visibility into its use – and shown how the ease of creating echo chambers exceeds the ease of creating the spaces that connect us. The implication is that the very process we expect to liberate and protect our digital usage may end up making it more confined.

For the most part, we meet people by accident and without prior design. But who we choose to become friends with is never accidental. We tend to pick people who agree with us in some regard. The same goes for the sites we repeatedly visit, the news we read, the items we buy, how we find apartments, how we find online dates – the list goes on.
We can curate our reality more readily and more precisely, but under such trends the nature of that curation, and our control over it, depend on what kinds of information we allow the algorithms to use.

We adapt to the spaces available, to the tools given. Whether or not we intend it, we live in a marketplace of social judgement. We avoid some kinds of relationships and choose others, often by how they make us feel – comfortable, part of a community, or whatever it may be that makes us repeatedly pick that site or person over another.

How businesses see us is increasingly a matter of the data they have access to and the inferences that they feel comfortable making with that data. Producers at Netflix are known to base production decisions heavily on data, identifying how the choices they make translate into viewer losses. Indeed, Netflix is less a game of content than of producing a menu for various forms of disengagement – what to leave on in the background, what to let people feel comfortable with – tuned at a fine-grained level in the script and structure of the content itself.

The process of self-selection means businesses might be incentivized to curate personalized experiences in a way that mirrors echo chambers – competing to be the most inviting, the most ‘like-you’, the least judgmental. We pick experiences that affirm us in some sense – and often this means affirming our beliefs.

This process of self-selection is increasingly shaped by how we respond to privacy, or how we understand it at all. The nature of the data profiles which businesses and governments have on us – or which we may eventually choose to allow – is already shifting the framework of who best understands us, who has access to the data that predict our choices and preferences, and, furthermore, who might be able to judge us least. Echo chambers are only incidentally digital: they stem from a more fundamental habit of following the path of greatest convenience and the desire to curate our experiences accordingly.

Nike’s recent venture into political marketing offers a clear view of this trend without the personalization. Can Nike be everything for everyone? Can it selectively appeal to the right and the left with targeted political marketing and a neutral, widespread campaign?

Beyond business, we can push deeper into the question of our personal lives. Socially, every person we meet has a different mental projection of us, created from different experiences and from their own biases; the same goes for businesses and governments. Minute differences in data sets and proprietary algorithms create – however subtly – different experiences across these firms.

The worry is less that this process will accelerate how echo chambers come to outnumber the spaces that connect us than that it will push us to a place where we cannot tell the difference.

As more control over data is ceded to citizens, they can opt into different arrangements and businesses based on how they believe their data will be used. When we self-select how algorithms view us, choosing the data available for them to predict and perceive us, we need to consider not only how we want to be judged, but also, unfortunately, what that judgement serves to do in the context of the larger social dynamics at play.

We should beware a world where the likelihood of unexpected connections continues to decrease. The attempt to correct the imbalance between firms and citizens in data control may end up consolidating control with businesses – a path which many have tried to avoid. We are moving to a world by design, one where serendipity holds less sway.

Our guest bloggers Joshua Entsminger, Mark Esposito, Terence Tse, and Danny Goh all work at Nexus Frontier Tech. They have previously written for the RSA about what government should take into consideration when designing a roadmap for ethical AI.

