There is a lot of hype about big data privacy, and double standards are evident among many of those expressing the loudest concerns. There is widespread worry about what global corporations and governments do with our data and whether our rights are adequately protected, yet little attention is paid to the trend for people in modern society to wilfully disclose seemingly everything about themselves. This is an example of the privacy paradox.
People often express fears about the way social media providers use our data, with concerns centring on how that data is used to target us with advertising and promoted posts. There are of course potential advantages to targeted advertising, including being shown things we are more likely to be interested in.
There are also concerns about how our social media data is used by third parties. A recent example was Admiral Insurance's plan to offer insurance quotes based on our Facebook posts, which Facebook prevented from proceeding. Setting aside whether the algorithms used to derive those quotes would have been fair and reasonable, the quotes would only have been generated when people granted access to their Facebook profiles for this purpose, i.e. when they felt there might be a benefit in allowing Admiral the requested access. While this case was portrayed as people's privacy being intruded upon for the benefit of a corporation, there was no recognition that it would also be an opportunity for people to engineer social media profiles purely to obtain cheaper insurance. If this business model became widespread, people might curate their social media presence for such purposes, or companies might spring up to engineer such profiles and sell them on to people looking to take advantage of such methods.
The key point to remember is that we have a large degree of control over how our social media data is shared through diligent use of the privacy settings. Admittedly, social media companies regularly revise the way these settings work, but we still have the opportunity to use appropriate settings to minimise use of our data in ways we do not want.
Similarly, we are all presented with Terms & Conditions to accept when signing up to social media services, yet the majority of people simply tick the box without actually reading them.
I also find it astounding how many people post extensive text and pictures on social media about what they are doing and, more importantly, about where they will be going and when. If someone keeps posting about how much they are looking forward to going abroad for two weeks in a month's time, and then posts while away about how great it is, it is not entirely surprising that some people may wish to use this information to their advantage, for example knowing that their house is empty.
While social media companies could make it easier for us to take control of the use of our data, we do have a responsibility to take that control ourselves. Responsible use of social media can greatly protect us from many of the perceived risks.
Wider digital footprint
Anyone who carries a smartphone at all times is leaving a very precise track of everywhere they go, along with a precise location for where they are at any given moment. GPS is one method that allows such tracking, and this is compounded by phone apps that post your location if you allow them to, e.g. showing where you sent your social media message from. You could turn off GPS to prevent this.
Your position can also be identified by triangulating your mobile via the cellular network. This information is less accessible to most people, but is still available to your service provider.
Finally, people often overlook that, even with GPS switched off, organisations map the positions of wireless networks and access points and can therefore estimate your position when you connect to one of these.
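As a rough illustration of how such position estimation can work in principle (this is a simplified sketch, not any provider's actual method), suppose the coordinates of nearby access points are known and rough distances to each can be estimated from signal strength. The device's position can then be found by minimising the distance error. All coordinates and distances below are invented for illustration.

```python
# Hypothetical sketch of Wi-Fi positioning: estimate a device's (x, y)
# position from known access-point locations and distance estimates,
# by gradient descent on the squared distance error.

def estimate_position(aps, distances, steps=2000, lr=0.01):
    """Estimate (x, y) given access-point coordinates and distances."""
    # Start from the centroid of the access points.
    x = sum(ax for ax, ay in aps) / len(aps)
    y = sum(ay for ax, ay in aps) / len(aps)
    for _ in range(steps):
        gx = gy = 0.0
        for (ax, ay), d in zip(aps, distances):
            dx, dy = x - ax, y - ay
            r = (dx * dx + dy * dy) ** 0.5 or 1e-9
            # Gradient of (r - d)^2 with respect to x and y.
            gx += 2 * (r - d) * dx / r
            gy += 2 * (r - d) * dy / r
        x -= lr * gx
        y -= lr * gy
    return x, y

# Three access points at known positions; distances measured to a
# device actually located at (2, 1).
aps = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
true = (2.0, 1.0)
distances = [((ax - true[0]) ** 2 + (ay - true[1]) ** 2) ** 0.5
             for ax, ay in aps]

x, y = estimate_position(aps, distances)
print(round(x, 2), round(y, 2))  # prints approximately: 2.0 1.0
```

Real systems face noisy signal-strength readings and moving devices, so they use more robust estimators, but the principle is the same: a handful of known access points is enough to place you on a map.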
Again, no-one is forced to carry a smartphone and you don’t have to have it switched on at all times.
While people often focus their concerns on large corporations like Facebook and Google, many install smartphone apps and grant permissions to all manner of things without a second thought, e.g. access to their phone contacts, camera, or microphone. Such permissions are often required for the application to provide its functionality, but people need to be more aware that malicious software could use them for unexpected purposes, e.g. to listen to you at all times through your microphone or to film at all times through your camera(s). While the risk of most apps doing this is minimal, people need to be more aware of what permissions they are giving away and what data may be produced, and to make a reasoned judgement on how much they trust the application.
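Those permissions are not hidden: on Android, for instance, every permission an app requests is declared up front in its AndroidManifest.xml, so they can be inspected before deciding how far to trust the app. The manifest below is a made-up example for illustration; a real one would come from the app package itself.

```python
# Sketch: list the permissions an Android app requests by parsing its
# manifest. The manifest content here is an invented example.
import xml.etree.ElementTree as ET

MANIFEST = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.RECORD_AUDIO"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""

# Attributes like android:name live in the Android XML namespace.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

root = ET.fromstring(MANIFEST)
permissions = [el.get(ANDROID_NS + "name")
               for el in root.iter("uses-permission")]
print(permissions)
# prints: ['android.permission.CAMERA', 'android.permission.RECORD_AUDIO',
#          'android.permission.READ_CONTACTS']
```

An app requesting camera, microphone and contacts access may be perfectly legitimate, but seeing the full list at a glance makes the trust judgement above a conscious one rather than a reflex tap on "accept".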
To a large extent, the benefits to people of using social media, smartphone applications and other technology outweigh the likely risks arising from the data produced. People will often take larger risks than they might rationally accept because they want to join the majority who use these technologies. There is also a view that if you have nothing to hide, then you have nothing to fear.
However, as a society, we need to become more savvy: to read terms and conditions, and to consider what permissions we agree to before using the technologies and apps available to us. Safety in numbers does not apply; just because many people use something does not mean it is safe and secure. While people should ask questions about how corporations and governments use their social media, application and technology data, we also need to take responsibility for our own actions and judge for ourselves whether the benefits outweigh the risks for the technology and apps we use.
Blog post by Tom Howard (Project Manager at the Business and Local Government Data Research Centre), please email us if you have any questions about the contents of this post.
Published 13 December 2016