
In the wake of the Facebook emotion-gate scandal, I claim that we need to 'reclaim the algorithm'. The problem is, I'm not quite sure what that means! (Here is something we prepared earlier).

Algorithms are everywhere. You can't see them, touch them or taste them; but they see you, and they likely know both what you'd like to touch, own or buy and what your tastes are. Go onto Amazon and it will recommend purchases based on your browsing history. Go onto Twitter and it will recommend people based on who you're already connected to. Ask an investment banker a question and it will be the algorithms doing most of the answering.
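None of this is magic. The Twitter example, for instance, boils down to a rule of thumb: suggest the accounts that the people you already follow are following. Here is a minimal sketch of that rule in Python; the follow graph is invented for illustration, and real recommenders weigh many more signals, but the shape of the rule is the same.

```python
# Toy "who to follow" recommender. The follow graph is invented.
# Rule of thumb: suggest accounts followed by the people you follow.
from collections import Counter

follows = {
    "you":   {"alice", "bob"},
    "alice": {"bob", "carol"},
    "bob":   {"carol", "dave"},
}

def suggest(user: str) -> list[str]:
    """Rank candidate accounts by how many of the user's friends follow them."""
    counts = Counter()
    for friend in follows.get(user, set()):
        for candidate in follows.get(friend, set()):
            if candidate != user and candidate not in follows[user]:
                counts[candidate] += 1
    return [name for name, _ in counts.most_common()]

print(suggest("you"))  # -> ['carol', 'dave'] (carol is followed by both friends)
```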

Ask better questions: algorithms for everyone!

Algorithms are in the news again. As you will probably have heard, a study has been published showing the effects of Facebook filtering what you see in your newsfeed, by removing either negative posts or positive posts. It finds that altering the emotional make-up of what you read alters the emotional make-up of what you post, by a very small amount (re: the 'online social networks change your emotions' headlines, it is worth highlighting that posting is not feeling, and seeing people's reported feelings is not really being part of a social network).
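To make the mechanism concrete, here is a toy sketch of that kind of filtering in Python. The word list and posts are invented for illustration; the actual experiment scored posts with the LIWC word-counting software and withheld only a fraction of matching posts, rather than all of them.

```python
# Toy sketch of an emotion-filtered newsfeed (illustration only).
NEGATIVE_WORDS = {"sad", "awful", "angry", "terrible"}  # invented word list

def is_negative(post: str) -> bool:
    """Flag a post that contains any word from the negative list."""
    words = {word.strip(".,!?").lower() for word in post.split()}
    return bool(words & NEGATIVE_WORDS)

def filter_feed(posts: list[str]) -> list[str]:
    """Drop flagged posts, leaving a more positive-looking feed."""
    return [post for post in posts if not is_negative(post)]

feed = ["What a terrible day", "Lovely walk in the park", "So angry right now"]
print(filter_feed(feed))  # -> ['Lovely walk in the park']
```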


As @dingtweets highlights, people's responses to this seem to be linked to their sector. Academics, for example, are up in arms over a perceived lack of informed consent (a camp I would place myself in), whilst others are most interested in the Upworthy angle (you won't believe what these researchers found?!) or the "ARGH, massive surveillance" side of things. As Tal Yarkoni pointed out, the actual effects the researchers found were teeny-tiny:

To put it in intuitive terms, the effect ... is roughly comparable to ... increas[ing] the average height of the male population in the United States by about one twentieth of an inch.

@Zeynep's response was incredibly important, highlighting the potential political implications:

So, yes, I say we should care whether Facebook can manipulate emotions, or voting behaviour [sic], or whether it can model our personality, or knows your social network, regardless of the merits or strength of finding of one study. We should care that this data is proprietary, with little access to it by the user, little knowledge of who gets to purchase, use and manipulate us with this kind of data.

What emotion-gate highlights is how far the vast majority of us have fallen behind how quickly new technology is moving: we are angry about the UK government's handling of care.data, perhaps confused about the NSA, indignant about phone-hacking … but we hand over the very footprint of our lives to the likes of Google and Facebook, often without reading their pretty comprehensive T&Cs.

Technological changes carve out huge new swathes of uncharted social and economic territory. The new frontier is no longer the conquest of land or a bubble of space rubble, but a continual battle to keep up with the geeks. Networks get there first, markets marketise thereafter, institutions dither. And you?

What does informed choice mean when it comes via reams of terms and conditions and a hastily selected 'tick here' box? What does opt-out mean when your granddaughter's baby pictures are only to be found on Facebook or Picasa?

As John Oliver highlighted in his pretty hilarious piece on net neutrality, most people do not keep up with big legal and technological changes: they seem boring, not for them, too complicated, or just too far removed from daily life. Still going to the shops, still the same nine-to-seven grind; there are just different ways in which you can pass the time: Kindle, iPad, self-monitoring app and smartphone.

The way in which your technology works has massive social, economic and cultural effects, and we need to start reclaiming it. Web 2.0 is ascendant, with its associated petitions and selfies and posts and tweets and messages and Buzzfeeds and Upworthys and cats and sometimes revolutions, even as voting rates and trust in institutions fall through the floor. Bitcoin went from one person's idea, to hundreds of people who saw its value, to thousands who made it happen: from zero to global crypto-currency in two years. Companies that act in networked ways – from Valve to Instagram – are taking the bottom of the market away from old 'market leaders' (poor Kodak!).

We are at an odd juncture in history: modern information technologies have transformed the lives of everyone, everywhere, over the past decade, but we run the risk of forgetting some fundamentals – a recent UN study highlighted that more people now have access to mobile phones than to working sanitation. People power is no panacea: the century so far has seen 15 years of massive people movements – Occupy, the Arab Spring – and 15 years of massively violent responses from state apparatuses.

Now is the time to reclaim the humble algorithm. Algorithms are not that scary: they are basically rules of thumb, scaled up; "a process or a set of rules to be followed in calculations or other problem-solving operations, especially by a computer."
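To illustrate, here is one such rule of thumb written down precisely enough for a computer to follow, then applied mechanically to as many cases as you like (the rule and the forecast data are invented for illustration):

```python
# A rule of thumb, scaled up: one precise rule, applied mechanically
# to any number of cases. The forecasts are invented for illustration.

def take_umbrella(chance_of_rain: float) -> bool:
    """The rule of thumb: pack an umbrella if rain is more likely than not."""
    return chance_of_rain > 0.5

forecasts = {"Monday": 0.2, "Tuesday": 0.7, "Wednesday": 0.55}
for day, chance in forecasts.items():
    print(day, "umbrella" if take_umbrella(chance) else "no umbrella")
```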

But what does that mean? Honestly, I'm not quite sure, but here is something we made earlier.

 


Gaia Marcus is a Senior Researcher on the RSA Connected Communities project.

She is an Edgeryder and an UnMonk advisor, founded the RSA Social Mirror project and is ¼ of the ThoughtMenu.

This summer she will be cycling 1,000km in memory of our RSA colleague Dr Emma Lindley and in aid of Mind.

You can find her on Twitter: @la_gaia

 
