’Tis the season to be algorithmically presented

The end of the year presents us with an opportunity to look back and reflect on the past year. Fortunately, if you happen to be on Facebook, they provide the review for you in their "Year in Review" app. The app gathers your status updates from the year, puts them together, and presents your year to you. Then you can share it. To be extra kind, Facebook even pre-fills the text of the status update.

In many ways, one could consider this a nice gesture and a nice piece of software from Facebook. Now we don't have to do the hard work of reflecting on our year. No need to think about and decide on the important points. In true Silicon Valley solutionism, the problem is solved. Of course, I know I do not have to share the presentation if I don't want to, but it keeps popping up in my feed time and time again, and every time it sends shivers down my spine.


Well, first, it reminds me of how much this big corporation knows about me, or thinks it knows. And the information served to us in the form of the yearly review is just a fraction of the data Facebook collects about us (and sells to its advertisers). This yearly presentation shows us only a small glimpse of the algorithms working in the background, churning away quietly, patiently collecting all the little bits and pieces we pour into them.

It is not that algorithms are bad in themselves; they are essentially just pieces of code, instructions to perform a predefined task. But the ways and cases in which we use them trouble me. Why aren't there more open algorithms aimed at enhancing the common good and wellbeing? This video by Harlo Holmes, for example, is just a tiny peek at what we could achieve with algorithms when they are put to good use. Why is it that the most sophisticated algorithms are used to gather information about us, categorising us like a herd and then selling this information to advertisers? You can be single, married, divorced, male, female, white, gay, in your mid-twenties, and so on. You might also be profiled to like a certain genre of clothing, music, or movies. You are most active with these friends, and then your friends' data is compared to yours, and so on. All this to build a profile of the way you act and what makes you happy: how to deliver an advertisement that speaks to you. For your benefit, naturally. Another technological solution to a problem that doesn't need solving: how to deliver ads so that they are effective and not annoying. (IMHO, all ads are annoying by nature.)
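To make the profiling concrete: the kind of categorisation described above can be sketched as a simple aggregation of interaction events into an "interest profile". This is a minimal illustrative sketch, not Facebook's actual system; the function name, event categories, and weights are all my own invented examples.

```python
from collections import Counter

def build_ad_profile(events, top_n=3):
    """Aggregate a user's interaction events into a crude ad-targeting profile.

    `events` is a list of (category, weight) pairs, e.g. ("indie_music", 1.0)
    for a like, or ("running_gear", 2.0) for a click on a sponsored post.
    Heavier interactions count for more. The advertiser ends up seeing only
    a handful of top "interests" -- not the person behind them.
    """
    scores = Counter()
    for category, weight in events:
        scores[category] += weight
    return [category for category, _ in scores.most_common(top_n)]

# A toy year of activity (all categories hypothetical):
events = [
    ("indie_music", 1.0),   # liked a band page
    ("running_gear", 2.0),  # clicked a sponsored post
    ("indie_music", 1.0),   # shared a concert video
    ("travel", 0.5),        # lingered on a photo album
]
print(build_ad_profile(events))  # → ['indie_music', 'running_gear', 'travel']
```

The point of the sketch is how little "you" survives the process: a year of living gets flattened into three labels chosen by a counter.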

There are naturally many more uses for algorithms; Christopher Steiner gives a nice overview of them in his book Automate This: How Algorithms Came to Rule Our World.

He writes: "Algorithms normally behave as they're designed, quietly trading stocks or, in the case of Amazon, pricing books according to supply and demand. But left unsupervised, algorithms can and will do strange things. As we put more and more of our world under the control of algorithms, we can lose track of who — or what — is pulling the strings. This is a fact that had sneaked up on the world until the Flash Crash shook us awake."

Second, they force on me a solution I don't want or need. They present me with a template for, and of, my life. And all of it from within the bounds of Facebook. That's where everything happens and where we are our true selves, naturally. They want to engage me further in their ecosystem. On a side note, have you noticed how difficult it is to share anything you find on Facebook outside of Facebook? Especially on mobile devices. It might be nice to look back at your status updates yourself and see what you have done. This is similar to ThinkUp, which might give you some meaningful insight into your use of social media. But it is worth asking: why share it with everyone? Or why does Facebook want us to share it with everyone? In a way, this reminds me of how Facebook slowly and quietly invades our lives more and more. Who remembers the Beacon episode? Since then, Facebook has launched more and more ways to gather knowledge about us, always in a very quiet fashion. Of course, sometimes something spills out.

Why is all this a problem? Maybe it isn't; it depends on the angle from which you look at it. Personally, I find that all this limits my freedom as a "user". And this is true of most social media sites and beyond. (Yes, I'm looking at you, Google, the behemoth.) These companies keep their algorithms and source code so secret that you might suspect some sinister magic lies behind them. Probably there is, too. But the more down-to-earth point is this: if we want to live in a digital world and interact with each other in it, wouldn't it make sense to do it either in a public, open way, on an open platform, or in your own style? I don't recall ever sending my friends letters that were pre-filled. Why do it in the digital world? Because it is easy? Really?

Third, the yearly review has no thought in it — if we don't count the countless hours some developers poured into it. The working product is just algorithms, code. The process is thoughtless, emotionless. This can lead to inadvertent algorithmic cruelty, as in Eric Meyer's case. And such inadvertent cruelty may be even more common, according to Dale Carrico in his post at the World Future Society.

Eric Meyer writes: "Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person 'thoughtless' is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves."
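Meyer's "thoughtless decision flow" is easy to picture in code. The sketch below is a hypothetical highlight picker of my own invention, not the actual Year in Review logic: it ranks posts purely by engagement, and that single blind rule is exactly how a tragedy with many condolence comments can end up as the "highlight" of someone's year.

```python
def pick_highlights(posts, n=3):
    """A thoughtless decision flow: rank posts purely by comment count.

    `posts` is a list of dicts like {"text": ..., "comments": int}.
    The rule never asks *why* a post got attention, so grief and
    celebration are indistinguishable to it.
    """
    return sorted(posts, key=lambda p: p["comments"], reverse=True)[:n]

year = [
    {"text": "New job!", "comments": 12},
    {"text": "Our cat turned ten", "comments": 4},
    {"text": "Sad family news", "comments": 87},  # condolences, not celebration
]
print(pick_highlights(year, n=1))  # the saddest post "wins" the review
```

Once `pick_highlights` is running, no more thought occurs — whatever judgment exists was frozen in at design time, which is precisely Meyer's point.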

After all this lecturing and preaching, I must say that I am not opposed to seeing people share their yearly reviews. (I must admit, I do not look at them.) I just hope that if you want to share your yearly review, it would be genuine and hand-made, not a feel-good ego boost done in a double click. What the digital world does not need is for us to automate more of our emotions and empathy; what it does need is handcrafted, individual work made by humans, something that can be felt.

Thoughts on digital citizenship

Last week I took part in the Creative Citizens conference at the Royal College of Art in London. It was two days filled with interesting talks and presentations, all based on creative citizenship. Creative Citizens is a research project with three main themes: hyperlocal publishing, community-led design and creative networks. I think all of the themes were nicely portrayed, and the talks offered a nice variety of topics. These sorts of projects, fusing together fields from politics and education to art, are something I would like to see happen in Finland as well.
Nevertheless, after the conference I started to think (well, I am a Finn after all…) about digital citizenship as portrayed through social media platforms, and some particular questions came to mind: How much is using social media as an active, critical citizen just playing with the tools that we have been given to play with? Are we just playing in the sandbox of largely censored, monitored products that turn our messages into money? How much more effective could our criticality be if we had a say over the platforms? Or even: if we could be part of building them?
Jean Burgess, professor at the Queensland University of Technology, gave a very good talk on creative citizenship and social media. She offered silly citizenship (a great name and concept!) as one example of how to engage critically on social media platforms. Silly citizenship uses social media's own tools as a humorous way to engage in politics. One meme she mentioned was a photo of British Prime Minister David Cameron on the phone to Barack Obama about the situation in Ukraine. The photo was shared on Twitter and was quite comical. The internet quickly followed with its own versions of it, and so a meme was born. According to Burgess, this kind of "silly citizenship" can be one way to engage with the world around us: even though the message is funny and entertaining, it still carries a political message.

I do think that this kind of activity is good and should be encouraged. Still, I keep pondering the platform itself*: this kind of activity seems like very little of what one could do. Naturally, there are ways to engage more, but why are they so difficult? How is the relationship between the new media corporations sitting in the US and the rest of us different from that of the Western missionaries going to Africa in the olden days, bringing shiny new things in exchange for something much more valuable? The thing is that most of the platforms we use (in Western countries) are made by a tiny number of US-based young, often white, male engineers. Many platforms still don't have their revenue plan worked out (Twitter, anyone?), and hardly any of them have given serious thought to civic engagement on their platform, let alone done any research on what a good platform would be. And of course, why should they? These are startups and corporations looking to make the most money for themselves and their investors. We are offered free products in exchange for our information. These products are not designed to be democratic or equal, but entertaining and engaging to use.
What will the future look like for us? In 20 years, will we, as citizens, be colonized and depleted by the algorithms of social media corporations?
This is one of the reasons I think we need a deeper understanding of the digital world; in a way, we need to be code literate in order to be equal in the digital world. We would also very much need an open, democratically funded social media platform to succeed. Then, I think, we could use social media much more effectively.

* I don't mean the above pondering as a critique of Burgess's talk; rather, I found it inspiring, and it gave me a good starting point for my own train of thought.