’Tis the season to be algorithmically presented

The end of the year presents us with the opportunity to look back and reflect on the past year. Fortunately, if you happen to be on Facebook, it provides the review for you in its "yearly review" app. The app gathers your status updates from the year, puts them together, and presents your year back to you. Then you can share it. To be extra kind, Facebook has also pre-filled the text of the status update.

In many ways, one could consider this a nice gesture and a nice piece of software from Facebook. Now we don’t have to do the hard work of reflecting on our year. No need to think about and decide on the important points. In true Silicon Valley solutionism style, the problem is solved. Of course, I know I do not have to share the presentation if I don’t want to, but it keeps popping up in my feed time and time again, and every time it sends shivers down my spine.


Well, first, it reminds me of how much this big corporation knows about me, or thinks it knows. And the information served to us in the form of the yearly review is just a fraction of the data Facebook collects about us (and sells to its advertisers). This yearly presentation shows us only a little glimpse of the algorithms working in the background, churning away quietly, patiently collecting all the little bits and pieces we pour into them.

It is not that algorithms are bad in themselves; they are essentially just pieces of code, instructions to perform a predefined task. But the ways and cases in which we use them trouble me. Why aren’t there more open algorithms aimed at enhancing the common good and our wellbeing? This video by Harlo Holmes, for example, is just a tiny peek at what we could achieve with algorithms when they are put to good use. Why is it that the most sophisticated algorithms are used to gather information about us, categorising us like a herd and then selling this information to advertisers? You can be single, married, divorced, male, female, white, gay, in your mid twenties, and so on. You might also be profiled as liking a certain genre of clothing, music, or movies. You are most active with these friends, and then your friends' data is compared to yours, and so on. All this to build a profile of the way you act and what makes you happy: how to deliver an advertisement that speaks to you. For your benefit, naturally. Another technological solution to a problem that doesn’t need solving: how to deliver ads so that they are effective and not annoying. (IMHO: all ads are annoying by nature.)
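To make the point concrete: the kind of profiling described above is, at bottom, just simple code. Here is a minimal toy sketch of rule-based ad targeting. Every name, rule, and campaign in it is hypothetical, invented purely for illustration, and says nothing about how any real ad platform actually works.

```python
# Toy illustration of rule-based ad targeting: a person is reduced to a
# bag of labels, which advertisers' target segments are matched against.
# All attributes, rules, and campaigns here are made up.

def build_profile(user):
    """Reduce a person to a set of labels, the way ad profiling does."""
    labels = set(user.get("likes", []))
    labels.add(user.get("relationship", "unknown"))
    if 20 <= user.get("age", 0) < 30:
        labels.add("mid-twenties")
    return labels

def match_ads(profile, campaigns):
    """Return the campaigns whose target labels all appear in the profile."""
    return [name for name, targets in campaigns.items()
            if targets <= profile]  # subset test: every target label present

user = {"age": 26, "relationship": "single",
        "likes": ["indie music", "running"]}
campaigns = {
    "running shoes": {"running", "mid-twenties"},
    "wedding venues": {"engaged"},
}

print(match_ads(build_profile(user), campaigns))  # ['running shoes']
```

A handful of `if` statements and set operations: nothing sinister in the code itself, which is rather the point. The trouble lies in what is fed into it and who profits from the output.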

There are naturally many more uses for algorithms; Christopher Steiner gives a nice review of them in his book Automate This: How Algorithms Came to Rule Our World.

He writes: "Algorithms normally behave as they’re designed, quietly trading stocks or, in the case of Amazon, pricing books according to supply and demand. But left unsupervised, algorithms can and will do strange things. As we put more and more of our world under the control of algorithms, we can lose track of who, or what, is pulling the strings. This is a fact that had sneaked up on the world until the Flash Crash shook us awake."

Second, it forces on me a solution I don’t want or need. It presents me with a template for and of my life, where everything comes from within the bounds of Facebook. That’s where everything happens and where we are our true selves, naturally. They want to engage me more deeply in their ecosystem. On a side note, have you noticed how difficult it is to share anything you find on Facebook outside of Facebook? Especially on mobile devices. It might be nice to look back at your status updates yourself and see what you have done. This is similar to ThinkUp, which might offer some meaningful glimpse into your use of social media. But it is worth asking: why share it with everyone? Or why does Facebook want us to share it with everyone? In a way, this reminds me of how Facebook slowly and quietly invades our lives more and more. Who remembers the Beacon episode? Since then, Facebook has launched more and more ways to gather knowledge about us, but in a very quiet fashion. Of course, sometimes something spills out.

Why is all this a problem? Maybe it isn’t; it depends on the angle you look at it from. Personally, I find that all this limits my freedom as a "user". And this is true of most social media sites and beyond. (Yes, I’m looking at you, Google, you behemoth.) These companies keep their algorithms and source code so secret that you might suspect some sinister magic lies behind them. Probably there is, too. But the more down-to-earth point is this: if we want to live in a digital world and interact with each other in it, wouldn’t it make sense to do so in a public, open way, on a theoretically open platform, or in your own style? I don’t recall ever sending my friends letters that were pre-filled. Why do it in the digital world? Because it is easy? Really?

Third, the yearly review does not have any thought in it, if we don’t count the countless hours some developers poured into it. The finished product is just algorithms, code. The process is thoughtless and emotionless. This can lead to inadvertent algorithmic cruelty, as in Eric Meyer’s case. And such inadvertent cruelty may be even more common, according to Dale Carrico in his post at the World Future Society.

Eric Meyer writes: "Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person 'thoughtless' is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves."
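The thoughtless decision flow Meyer describes can be sketched in a few lines. This is a hypothetical illustration, not how Facebook's actual feature works: imagine a "highlight of the year" picker that ranks posts purely by engagement. The data below is invented.

```python
# A minimal sketch of a "thoughtless" highlight picker: it selects the
# year's top post purely by interaction count, with no notion of why a
# post drew attention. All posts and numbers here are made up.

def pick_highlight(posts):
    """Return the post with the most interactions. That is all it knows."""
    return max(posts, key=lambda p: p["likes"] + p["comments"])

posts = [
    {"text": "Holiday photos from Italy", "likes": 40, "comments": 5},
    {"text": "We lost someone dear to us today", "likes": 120, "comments": 80},
]

# Condolences generate many interactions, so the saddest post "wins"
# and would be served back, cheerfully framed, as the year's highlight.
print(pick_highlight(posts)["text"])
```

Once the code runs, no further thought occurs; the cruelty is not malicious, just the arithmetic doing exactly what it was told.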

After all this lecturing and preaching, I must say that I am not opposed to seeing people share their yearly reviews (though I must admit I do not look at them). I just hope that if you want to share your yearly review, it would be genuine and hand-made, not a feel-good ego boost produced in a double click. What the digital world does not need is for us to automate more of our emotions and empathy; what it does need is handcrafted, individual work made by humans, something that can be felt.