Automated ethics

My latest essay for Aeon magazine asks when it’s ethical to hand our decisions over to machines, and when such automation becomes a step too far. The first few paras are below: read the rest on the magazine’s site.

For the French philosopher Paul Virilio, technological development is inextricable from the idea of the accident. As he put it, each accident is ‘an inverted miracle… When you invent the ship, you also invent the shipwreck; when you invent the plane, you also invent the plane crash; and when you invent electricity, you invent electrocution.’ Accidents mark the spots where anticipation met reality and came off worse. Yet each is also a spark of secular revelation: an opportunity to exceed the past, to make tomorrow’s worst better than today’s, and on occasion to promise ‘never again’.

This, at least, is the plan. ‘Never again’ is a tricky promise to keep: in the long term, it’s not a question of if things go wrong, but when. The ethical concerns of innovation thus tend to focus on harm’s minimisation and mitigation, not the absence of harm altogether. A double-hulled steamship poses less risk per passenger mile than a medieval trading vessel; a well-run factory is safer than a sweatshop. Plane crashes might cause many fatalities, but refinements such as a checklist, computer and co-pilot insure against all but the wildest of unforeseen circumstances.

Similar refinements are the subject of one of the liveliest debates in practical ethics today: the case for self-driving cars. Modern motor vehicles are safer and more reliable than they have ever been – yet more than 1 million people are killed in car accidents around the world each year, and more than 50 million are injured. Why? Largely because one perilous element in the mechanics of driving remains unperfected by progress: the human being.

Continue reading the essay here

Big data and artificial idiocy

I wrote a piece earlier this year for the Guardian about the perils and delights of big data, and the special stupidity it can breed. The first few paras are below: click through for the whole piece.

Massive, inconceivable numbers are commonplace in conversations about computers. The exabyte, a one followed by 18 zeroes’ worth of bytes; the petaflop, one quadrillion calculations performed in a single second. Beneath the surface of our lives churns an ocean of information, from whose depths answers and optimisations ascend like munificent kraken.

This is the much-hyped realm of “big data”: unprecedented quantities of information generated at unprecedented speed, in unprecedented variety.

From particle physics to predictive search and aggregated social media sentiments, we reap its benefits across a broadening gamut of fields. We agonise about over-sharing while the numbers themselves tick upwards. Mostly, though, we fail to address a handful of questions more fundamental even than privacy. What are machines good at; what are they less good at; and when are their answers worse than useless?

Click here to read the whole piece on the Guardian site

Technology’s greatest myth

I wrote this at the end of last year as my final column for BBC Future, aiming to make 2014 a year for longer essays and projects (and paying attention to my young son). It’s a reflection on a couple of years of fortnightly writing about technology, ideas, and tech’s larger place in our sense of the world.

Lecturing in late 1968, the American sociologist Harvey Sacks addressed one of the central failures of technocratic dreams. We have always hoped, Sacks argued, that “if only we introduced some fantastic new communication machine the world will be transformed.” Instead, though, even our best and brightest devices must be accommodated within existing practices and assumptions in a “world that has whatever organisation it already has.”

As an example, Sacks considered the telephone. When it was introduced into American homes during the last quarter of the 19th century, instantaneous conversation across hundreds or even thousands of miles seemed close to a miracle. For Scientific American, editorializing in 1880, this heralded “nothing less than a new organization of society — a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications…”

Yet the story that unfolded was not so much “a new organization of society” as the pouring of existing human behaviour into fresh moulds: our goodness, hope and charity; our greed, pride and lust. New technology didn’t bring an overnight revolution. Instead, there was strenuous effort to fit novelty into existing norms. Continue reading

On video games: difficulty is the point, not the problem

Here’s a piece exploring the difficulties of discussing games compared to other media. It was written first for the book Early Modernity and Video Games (Cambridge Scholars Publishing, February 2014), then republished with Wired and Ars Technica – and, now, here.

Difficulty is built into video games in a way that it is not built into any other medium.

A movie may be difficult, conceptually or in terms of subject matter; it may be hard to understand or to enjoy. Yet all you have to do to access its entirety is to sit and watch from beginning to end.

Written words can be still more difficult. For these, you need a formidable mastery of language, concepts and context; you must convert text into sense. Still, the raw materials are all there for you to work with. You do not have to pass a tricky test in order to get beyond the first chapter, or find yourself repeatedly sent back to the start of a book if you fail. You do not have to practise turning a page at precise moments in order to progress.

Yet this is what the difficulty of many video games embodies: a journey that the majority of players will not complete, filled with trials, tribulations and inexorable repetitions. Continue reading

Clouds, autocomplete and tweets

I’ve been experimenting with re-publishing a few of my recent columns on Medium. If you’ve arrived here in hope of reading them, simply follow (and share!) the links below.

I am the algorithm – on language, thought and ten more years of Twitter

Is autocomplete evil? – how a machine’s whispers in my ears are changing the way I think

The tyranny of the cloud – why you should think twice before you hit “upload”

Why computers will become invisible – our extraordinary intimacy with unseen technology

The meaning of Medium

I’ve been experimenting recently with the young writing space Medium (on which my own words are gathered here). Brought to you by Twitter co-founder Ev Williams, its potted pitch is “a better place to read and write” – and it certainly delivers in terms of interface and ease of reading.

Behind the scenes, composing on Medium takes What You See Is What You Get to an elegant extreme, with as few options as sensibly possible left to the author: you can determine your text, title, subtitle, illustrations, bold, italics, quotations, links, and that’s about it. Almost every other aspect of formatting is automatic, with acres of white space in the best modern taste atop responsive design fit for any device.

All this, Williams explains, is about creating “a beautiful space for reading and writing — and little else. The words are central. They can be accompanied by images to help illustrate your point. But there are no gratuitous sidebars, plug-ins, or widgets. There is nothing to set up or customize.”

For a writer, it’s a little addictive. I’m typing this into the back end of my own website, which runs on WordPress. The interface is functional, the options and sidebars useful and not too overwhelming. I have a fairly good idea of what my post will look like once I hit “publish.” After Medium, though, it all feels a little weighty: a series of small barriers between these thoughts and their audience that, once lowered, loom disconcertingly large when they return. Continue reading

The attention economy: what price do we pay?

I’ve written my third essay this year for Aeon magazine, exploring the idea of the attention economy – and what it may cost us to pay for content with our time, clicks, tweets and endlessly aggregated attention.

How many other things are you doing right now while you’re reading this piece? Are you also checking your email, glancing at your Twitter feed, and updating your Facebook page? What David Foster Wallace, five years ago, labelled ‘Total Noise’ — ‘the seething static of every particular thing and experience, and one’s total freedom of infinite choice about what to choose to attend to’ — is today just part of the texture of living on a planet that will, by next year, boast one mobile phone for each of its seven billion inhabitants. We are all amateur attention economists, hoarding and bartering our moments — or watching them slip away down the cracks of a thousand YouTube clips.

If you’re using a free online service, the adage goes, you are the product. It’s an arresting line, but one that deserves to be put more precisely: it’s not you, but your behavioural data and the quantifiable facts of your engagement that are constantly blended for sale, with the aggregate of every single interaction (yours included) becoming a mechanism for ever-more-finely tuning the business of attracting and retaining users.

Read the entire essay online at Aeon here

The main reason for IT project failures? Us.

My latest BBC column, republished here for UK readers, looks at some of the dispiritingly enduring human reasons behind IT project failures.

The UK’s National Health Service may seem like a parochial subject for this column. But with 1.7 million employees and a budget of over £100 billion, it is the world’s fifth biggest employer – beaten only by McDonald’s, Walmart, the Chinese Army, and the US Department of Defence. And this means its successes and failures tend to provide salutary lessons for institutions of all sizes.

Take the recent revelation that an abandoned attempt to upgrade its computer systems will cost over £9.8 billion – described by the Public Accounts Committee as one of the “worst and most expensive contracting fiascos” in the history of the public sector.

This won’t come as a surprise to anyone who has worked on large computing projects. Indeed, there’s something alarmingly monotonous to most litanies of tech project failure. Planning tends to be inadequate, with projected timings and budgets reflecting wishful thinking rather than a robust analysis of requirements. Communication breaks down, with side issues dominating discussions to the exclusion of core functions. And the world itself moves on, turning yesterday’s technical marvel into tomorrow’s white elephant, complete with endless administrative headaches and little scope for technical development. Continue reading

Blocking net porn: worse than pointless

My latest BBC Future column, reproduced here for UK readers, looked at the perversities of censorship and online pornography.

What is the most searched-for term on the web? Contrary to popular myth, it’s not “sex”, “porn”, “xxx”, or any other common search term for pornography. Instead, as a quick glance at services like Google Trends shows, terms like “Facebook” and “YouTube” comfortably beat all of the above – as does “Google” itself. Onscreen as in life, it’s sport, celebrities and global news that command the most attention.

In fact, looking at lists of the world’s most-visited websites compiled by companies like Alexa, there’s strikingly little “adult” content. Search engines, social media, news and marketplaces dominate, with the world’s top pornographic site coming in at number 34, and just six others breaking into the top one hundred. As an excellent analysis by the Ministry of Truth blog notes, “overall, adult websites account for no more than 2-3% of global Internet traffic, measured in terms of both individual visits to websites and page views.”

All of this sits slightly strangely alongside recent hysteria and headlines (and dubious maths) in Britain. If you missed it, Prime Minister David Cameron announced his intention to prevent online pornography from “corroding childhood” by making internet service providers automatically block pornographic websites. Britain is set to become a place where internet users have to “opt in” to view pornography – a moral beacon in a world increasingly alarmed by the filth pouring out of its screens. Continue reading

In conversation with Mark Cerny

Last week, I profiled the PlayStation 4’s lead system architect, Mark Cerny, for the Independent. The profile is online here, and outlines his background and role. For those interested in a little more detail, this is an edited transcript of some of the key points from our conversation when we met in London in July.

In 1982, aged 17, Mark Cerny quit university in his native California to work as a designer and programmer for the era’s most important games company, Atari. By 1984, he had created his first hit game, Marble Madness. By 1985, he had moved to Japan to work with gaming’s rising giant, Sega, where he worked on both games and the cutting edge of console design – a combination that saw him leave in the 1990s to develop for one of the world’s first CD-based consoles, the 3DO Interactive Multiplayer.

The 3DO failed to take off – but by 1994 Cerny had become one of the first non-Japanese developers to work on Sony’s new PlayStation, and a major player in Sony’s success. His games Crash Bandicoot (1996), Spyro the Dragon (1998) and their sequels sold over 30 million copies. Cerny founded his own consultancy in 1998, and has since helped produce, programme and design a gamut of key titles for three generations of PlayStations.

Perhaps his most important work of all, though, is only just coming to fruition: the PlayStation 4, the hardware on which many of Sony’s hopes for this decade rest. Since 2008, Cerny has worked as the machine’s lead system architect – a job he himself pitched to Sony Computer Entertainment’s senior management – as well as directing the development of one of its key launch titles, Knack.

Tom Chatfield: Historically, the original PlayStation came out in 1994. What’s your take on how it changed the games industry? Continue reading