Is ignorance bliss? Or would better understanding of technology alleviate our privacy concerns?
Are we over-sharing our data?
Facebook’s study about emotional contagion in June of this year raised questions about privacy, consent and how much data we’re sharing online without even knowing.
The data scientists behind the study deliberately manipulated the news feeds of almost 700,000 Facebook users for a week in January 2012 to see whether happy and sad content in people’s feeds would affect their own posts. Some people were shown more positive content, whilst others were shown sadder posts. When the week was over, the results were conclusive: the Facebook news feed does influence our moods.
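To get a feel for how a study like this could measure ‘happy’ and ‘sad’ content at scale, here’s a deliberately simplified word-counting sketch. The real study used standard word-counting software; the word lists and posts below are made up for illustration:

```python
# Toy illustration (not the study's actual method): score a post's mood
# by counting words from small positive and negative word lists.

POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "lonely"}

def mood_score(post):
    """Return positive-word count minus negative-word count."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

feed = ["What a wonderful day, I love this!", "Feeling sad and lonely today."]
print([mood_score(p) for p in feed])  # first post scores positive, second negative
```

Score every post a user writes during the week, average the scores, and you can compare users who saw cheerful feeds against users who saw gloomy ones.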
What part do algorithms play?
The Facebook study’s co-author, Cornell professor Jeff Hancock, asks his new students to complete an experiment at the beginning of the semester. It starts with the students Googling exactly the same search term, then turning to the person next to them to compare the results on their screens. Inevitably, they find that instead of matching their neighbour’s, their search results can be wildly different. And then comes the realisation that Google searches – much like everything else online – are governed by algorithms: the invisible systems built around our personal information.
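You can picture why that happens with a tiny sketch: imagine each result page is tagged with topics, and pages matching your interest profile get boosted. Google’s real ranking is vastly more sophisticated; the tags and scoring here are invented purely for illustration:

```python
# Hypothetical sketch of why two people see different results for the
# same query: results are re-ranked against each user's interest profile.

def personalised_rank(results, interests):
    """Sort results so pages tagged with the user's interests come first."""
    def score(page):
        return sum(tag in interests for tag in page["tags"])
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Jaguar the car", "tags": {"cars", "luxury"}},
    {"title": "Jaguar the animal", "tags": {"wildlife", "big cats"}},
]

petrolhead = personalised_rank(results, {"cars", "motorsport"})
naturalist = personalised_rank(results, {"wildlife", "zoos"})
print(petrolhead[0]["title"])  # "Jaguar the car"
print(naturalist[0]["title"])  # "Jaguar the animal"
```

Same query, same pages – but two students with different browsing histories end up looking at two different first results.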
This sounds a little like Big Brother in real life, doesn’t it? But isn’t it better to have an online experience tailored to you? Algorithms are all around us, and they aren’t new – they’re just getting smarter.
Who does Google think you are?
At this point you might be wondering what Google knows about you already. Dave Thier, a contributor writing for Forbes, found an easy way to figure this out. The link in Dave’s article takes you to Google’s Ad Settings, which were pretty accurate for him.
However, it’s plain to see that Google doesn’t always get it right. I took the test and, based on the websites I’ve visited, Google thinks I’m aged between 35 and 44 (I’m 24!) and male. And whilst some of the interests listed did make sense (like computers and electronics, smartphones, social networks, SEO and marketing, technology news and travel), some were a bit weird. For example, soccer, cooking and astronomy all appeared on the list. Just for the record, I can’t stand football, I’m an abysmal cook and I definitely won’t be identifying constellations any time soon.
What about giving Google our consent?
Hancock suggests that because algorithms are such a big part of the way the internet works ‘we may have passed the point where it’s possible for people to reasonably expect they’d have to give consent before a corporation messes with the algorithmic filters that affect the information they see online.’
The prevalence of algorithms, and the frequency with which they change, make opting out impractical at best. As Hancock puts it, what would obtaining consent even look like? The algorithm behind Google search, for example, is tested and tweaked all the time, but we don’t see opt-out notifications every time it’s updated.
What about giants like Facebook, Amazon and Apple?
A marketing stunt for Watch Dogs, an Ubisoft game released earlier this year, asks for permission to access your Facebook account. From there, the Digital Shadow site pulls personal information to build a comprehensive profile of you, as if you were an assassin’s target – just like in the game. Whilst some view it as just a bit of fun, it’s actually quite scary.
And yes, in the interest of writing this blog I decided to be a willing guinea pig for the campaign, which is still running. You can get your profile here if you want to. I thought I was quite savvy with my privacy settings on Facebook. Oh how wrong I was! Just using the information publicly available on my Facebook account, Digital Shadow could see:
- Some of my photos
- Which of my friends I interact with most and those who I rarely speak to
- Words commonly used by me and my friends
- When I’m most active on Facebook (between 7am and 8am on a Monday)
- Where I’ve been … complete with a photo of McDonald’s in Radcliffe. Guilty.
- An estimated annual salary based on my location, age, work and education
- Password possibilities generated by my interests and close friends
- And an estimate of the value of my accessible, private data generated online: $49,269, although I’m not sure whether that’s good or bad.
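That ‘password possibilities’ point is worth pausing on, because it’s an old attacker’s trick: combine names, interests and dates into a candidate wordlist. Here’s a toy sketch of the idea – Digital Shadow’s actual rules aren’t public, so the combinations below are guesses at the general technique:

```python
from itertools import product

# Toy wordlist generator in the spirit of Digital Shadow's "password
# possibilities" (its real rules are unknown; this illustrates the idea).

def candidate_passwords(interests, friends, years):
    """Combine profile words with years into likely password guesses."""
    words = {w.lower() for w in interests + friends}
    candidates = set()
    for word, year in product(words, years):
        candidates.add(word + str(year))
        candidates.add(word.capitalize() + str(year))
    return candidates

cands = candidate_passwords(["football"], ["Alice"], [1990, 2014])
print(sorted(cands))  # includes guesses like "Alice1990" and "football2014"
```

If your password really is a pet’s name plus a birth year, a list like this gets worryingly close – which is exactly the point the campaign is making.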
It’s very much the same story with other big tech giants like Amazon and Apple. Just think about the recommendations you see when you log in to your Amazon account or open iTunes. They’re based on products you’ve bought before, things you looked at but didn’t buy, and items related to your purchase history. Of course, this whole personalised experience is governed by algorithms, which work in the background collecting information as we shop. The same is true when you browse your news feed on Facebook, or search for something on Google.
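A stripped-down version of the idea behind those recommendations: rank items you haven’t bought by how many categories they share with items you have. Amazon’s real system is far more elaborate; the catalogue and scoring here are invented for illustration:

```python
# Minimal content-based recommender sketch (not Amazon's actual system):
# suggest items sharing the most categories with your purchase history.

CATALOGUE = {
    "wireless mouse": {"computers", "accessories"},
    "sci-fi novel": {"books", "sci-fi"},
    "mechanical keyboard": {"computers", "accessories"},
    "cookbook": {"books", "cooking"},
}

def recommend(purchased, top_n=2):
    """Rank unpurchased items by category overlap with purchase history."""
    liked = set().union(*(CATALOGUE[item] for item in purchased))
    candidates = [i for i in CATALOGUE if i not in purchased]
    return sorted(candidates,
                  key=lambda i: len(CATALOGUE[i] & liked),
                  reverse=True)[:top_n]

print(recommend(["wireless mouse"]))  # the keyboard ranks first
```

Swap ‘categories’ for browsing history, ratings and what similar shoppers bought, and you’re approaching the personalised storefront you see every day.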
Algorithms are set to change our lives in more ways than just streamlined shopping, too. The tech giants have recognised niches in other markets and pushed the boundaries of innovation. A great example is Facebook’s acquisition of Oculus VR, maker of the virtual reality headset that’s been making serious waves in the gaming industry. Meanwhile, Google made headlines with its investment in driverless car technology and drones, and Apple’s plans seem to be moving towards the healthcare sector with wearables.
Trust and technology
Hancock acknowledged there is often a state of mistrust surrounding new technologies, saying ‘it goes back to Socrates and his distrust of the alphabet, [the idea that] writing would lead us to become mindless … It’s the same fear, I think. Because I can’t see you, you’re going to manipulate me, you’re going to deceive me.’
Maybe it’s just a case of needing clearer communication and open discussion. If more people knew what algorithms are and how they work, it might dispel that fear of what we don’t understand.
To some people, all of this really does sound like Big Brother is watching you. We believe that raising awareness of the digital footprint you create is an important lesson for us all. It’s about making informed choices about what you do and don’t share online.
More from the Digital Privacy series: