Technology, Privacy and Innovation in 2014

Prediction lists for the coming year are always revealing, though perhaps more of the current public mood than of the future. A write-up of the tech trends for 2014 by Fast Company's design blog is hardly controversial, but what is interesting is how the areas they’ve chosen highlight two wider and seemingly divergent technological trends. This apparent conflict in the way technology is heading is far from problematic. On the contrary, it shows our success in adapting to and experimenting with new ideas in response to shifts in the social and political context, without the need for any central guidance.

One thing clear from Fast Company's list is that 2014 will bring a continued increase in the volume and depth of the personal data we create. Things like Google Glass, the ‘quantified self’, hyperpersonalised online experiences and the interconnectivity of the Internet of Things all create new reasons and mechanisms for data capture. This in turn increases the value of our data to ourselves, the companies with access to it and, in some situations, the state.

However, the article also predicts that 2014 will see increasing concerns over cyber-privacy and a movement towards greater digital anonymity. Users will increasingly choose to control their own data and who profits from it, whilst we will begin to discover the joy of ‘disconnecting’ from the digital world and see the creation of intentional blackspots.

The fact that we seem to be embracing deeper technological integration yet simultaneously finding ways to mitigate and avoid its consequences is certainly interesting. Does this show that we’ve raced forward too fast and are trying to claw back a space we’re realising we’ve lost? Perhaps, but far from giving us cause for concern, the two-track path we’re seeing shows the ability of consumers and the tech sector to adapt over time, and in turn offers some hints on optimal tech policy.

Reservations about an increasingly digitized and tech-heavy world are common, be it concerns over ‘hyper-stimulation’, the aggressive monetization of our digital footprint, or wide-scale data collection and its abuse by unscrupulous firms and governments. Concerns often partner with conservatism: a desire to slow down the pace of technological rollout and impose prior restrictions on how things may be used. More often than not, government regulations and restrictions are cited as the way to keep technology in check and keep us safe.

For example, Google's announcement that it would purchase the home thermostat company Nest was met with calls for a "much-needed conversation about data privacy and security for the internet of things". However, despite the fact that this conversation hasn’t actually taken place yet, the same article expresses dismay and concern that the US government has been reluctant to legislate in this fledgling area.

Clearly, security breaches and the abuse of sensitive information are unwanted, and the more data collected, the larger a slip-up could be. However, as Adam Thierer points out, “conjectural fears and hypothetical harms should not drive regulation”.

Even when a problem can be identified, it’s unlikely that a committee of concerned yet under-informed policymakers is best placed to deal with it. A case in point is the EU’s Privacy Directive, the progress of which has been continually stalled by conflicting interests and general confusion. Moreover, the pace of government action often runs well behind business and societal developments, and policies forged to address a pressing issue today may be redundant in five years’ time.

Worse still, restrictions dampen innovation and risk choking off the next big breakthrough – clearly advances are less likely to come about if we can’t use our resources creatively. This is particularly true in fast-moving and dynamic technology sectors. It’s hard to imagine the success of the internet if companies and experiments had been subject to governmental approval and top-down control.

Ultimately, however, the main reason we should be reluctant to adopt state-imposed ‘solutions’ to technological problems is that the market is actually incredibly good at dealing with these issues itself.

This is exactly what the two sides of 2014’s tech trends show. 2013 gave us reasons to be more wary about what we give away about ourselves and put online – and developers have taken note. If we feel at the mercy of data-sucking giants, we can begin to avoid them. As the public tires of Facebook, alternative social networks centred upon privacy and control continue to emerge. Hate search engines knowing what you’re looking for? Try out DuckDuckGo. Want greater control over your data? Look out for indiePhone and OS. This new wave of open-source and privacy-conscious technologies is marked by an increasingly sleek user experience as it moves out of the realm of geeks and into the mainstream.

Of course, not everybody will care about these things, and neither should they have to. The beauty of a world where experimentation is encouraged is that people can pick and choose which things (anonymity, relevant ads, seamlessly connected devices and so forth) are important to them, and make their tech usage decisions accordingly. In contrast, government restrictions impose a cost on the whole of society and assume that we all hold the same preferences and level of risk aversion. When faced with new dimensions to questions like ‘How should companies use my data?’ and ‘Is it wise to let technology do x?’, we’re more likely to find answers we’re happy with through personal experimentation and adaptation than by taking the word of interest groups and politicians.

We might get things wrong along the way, and maybe even double back on ourselves, but it’s clear that so long as we continue to innovate, we’re likely to solve our own problems and satisfy a range of preferences.
