Trigger Happy 2.0: The Art and Politics of Videogames

‘Wonderfully energising… focused literary joy’ — Eurogamer

Why can’t a wargame be anti-war? Why does “gamification” spit on the downtrodden? And why do so many videogames take the form of boring jobs? Investigating the aesthetics, politics, and psychology of modern videogames, this follow-up to 2000’s Trigger Happy collects edited and revised essays from my columns for Edge magazine. In it, you’ll find out why the Tomb Raider series is like the oeuvre of Mark Rothko, why Nietzsche might have enjoyed Donkey Kong, and what “self co-op”, “cognitive panic” and “unreliable agency” mean when you’re gripping a joypad or clawing at a mouse.

Available exclusively as an ebook through Amazon worldwide (can be read on any computer, smartphone or tablet using the free Kindle app) at £5.99 | €7.99 | $9.99 etc. Links: US, UK; search to find in European and other stores.

Nudge units, behavioural economics, political speech, advertising, and popular psychology — they are all trying to convince us that we’re irrational, helpless victims of our badly wired brains. Why should such an anti-humanist message have become the received wisdom of our age? That is the subject of my recent talk on BBC Radio 4’s “Four Thought”, which you can listen to here.

29 May 2013

On Big Data and its discontents

Data will save us. All we need to do is measure the world. When we have quantified everything, problems both technical and social will melt away. That, at least, is the promise of “Big Data” – the buzzphrase for the practice of collecting mountains of data about a subject and then crunching away on it with shiny supercomputers. The term has lately become so ubiquitous that people make wry jokes about “small data”. But Big Data is not only something geeks do in the science lab or the start-up company; it affects us all. So we had better understand what its plans are.

Miraculous things can already be accomplished. By analysing web-searches tied to geographical location, Google Flu Trends can track the spread of an influenza epidemic in near-real time, thus helping to direct medical resources to the right places. Another of the company’s services, Google Translate, is so effective not because it understands language to any degree, but because it holds huge corpuses of written examples in various tongues and knows statistically which phrase of the sample text is most often translated by which phrase in the target language. Meanwhile, aircraft and other complex engineering projects can be made more reliable once components are able wirelessly to phone home information about how they are functioning. This mammoth store of telemetry data can be analysed to predict part failures before they happen.
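To see in miniature how this kind of statistical translation works, here is a toy sketch in Python – invented data and invented names, nothing like the scale or sophistication of Google’s actual system. The whole trick is counting which rendering of a phrase appears most often, with no grammar or meaning anywhere in sight:

from collections import Counter

# Toy phrase table: how often each French phrase was observed translated
# by each English phrase in a parallel corpus. The counts are invented
# for illustration; a real system holds billions of examples.
phrase_table = {
    "bonne nuit": Counter({"good night": 9120, "goodnight": 4480, "good evening": 310}),
    "pomme de terre": Counter({"potato": 8800, "apple of earth": 12}),
}

def translate(phrase: str) -> str:
    """Pick the statistically most frequent translation of a phrase."""
    candidates = phrase_table.get(phrase)
    if candidates is None:
        return phrase  # unseen phrase: leave it untranslated
    return candidates.most_common(1)[0][0]

print(translate("pomme de terre"))  # -> potato

No understanding required: just a very large amount of counting.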

But Big Data is not just an approach that improves uncontroversially useful systems. It’s also a hype machine. IBM, for instance, offers to furnish companies with its own Big Data platform, the PR material for which is a savoury mix of space-age techspeak and corporate mumbo-jumbo. “Big data represents a new era of computing,” the company promises, “an inflection point of opportunity where data in any format may be explored and utilised for breakthrough insights – whether that data is in-place, in-motion, or at-rest.” It sounds rather cruel to disturb data that is “at-rest”, presumably power-napping, but inflection points of opportunity wait for no man or megabyte.

Another platform capable of changing the game, albeit in an unhappily permanent manner, might be the Big Data skiing goggles marketed by the tech-shades manufacturer Oakley. Rather as the computerised spectacles known as Google Glass promise to do for the whole world, these $600 goggles project into your eyes all kinds of fascinating information about your skiing, including changes in speed and altitude, and can even display incoming messages from your mobile.

Of course, it might happen that while reading a titillating sext from a co-worker you ski at high speed into a tree. And so the goggles are sold with a splendidly self-defeating warning on the box: “Do not operate product while skiing.” Clearly, all the information all of the time is not always desirable. Still less so when Big Data’s tendrils move out from cool gadgets or website tools into our personal lives, the workplace and government – notwithstanding the Panglossian boosters of global datagasm.

Read the rest at the New Statesman.

14 May 2013

Inferno
by Dan Brown

The tall writer Steven Poole opened the wooden door of the strong house and peered at the small figure on the stone doorstep. It was a boy. Cradled in his palms the boy nervously proffered a startling object. It was the new book by the famous novelist Dan Brown.

The tall writer took the precious artefact from the nervous boy’s hands and thanked him. The miniature human scuttled off. An idling engine revved into life. The writer glanced down the street, then retreated into the residential building. He knew he had better get to work. Looking at his Tag Heuer Swiss watch, he calculated that he had only 48 hours to decode the arcane puzzle of the bestselling author’s latest novel.

Read the rest at the Guardian.

13 May 2013

In central London this spring, eight of the world’s greatest minds performed on a dimly lit stage in a wood-panelled theatre. An audience of hundreds watched in hushed reverence. This was the closing stretch of the 14-round Candidates’ Tournament, to decide who would take on the current chess world champion, Viswanathan Anand, later this year.

Each round took a day: one game could last seven or eight hours. Sometimes both players would be hunched over their board together, elbows on table, splayed fingers propping up heads as though to support their craniums against tremendous internal pressure. At times, one player would lean forward while his rival slumped back in an executive leather chair like a bored office worker, staring into space. Then the opponent would make his move, stop his clock, and stand up, wandering around to cast an expert glance over the positions in the other games before stalking upstage to pour himself more coffee. On a raised dais, inscrutable, sat the white-haired arbiter, the tournament’s presiding official. Behind him was a giant screen showing the four current chess positions. So proceeded the fantastically complex slow-motion violence of the games, and the silently intense emotional theatre of their players.

When Garry Kasparov lost his second match against the IBM supercomputer Deep Blue in 1997, people predicted that computers would eventually destroy chess, both as a contest and as a spectator sport. Chess might be very complicated but it is still mathematically finite. Computers that are fed the right rules can, in principle, calculate ideal chess variations perfectly, whereas humans make mistakes. Today, anyone with a laptop can run commercial chess software that will reliably defeat all but a few hundred humans on the planet. Isn’t the spectacle of puny humans playing error-strewn chess games just a nostalgic throwback?
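To make the in-principle point concrete, here is a toy solver in Python – my own illustrative sketch, not anything resembling a chess engine. It exhaustively scores single-heap Nim, a game small enough to enumerate completely; the identical brute-force logic would in principle solve chess too, whose game tree, though finite, is astronomically too large to walk, which is exactly why real engines prune and approximate instead:

def solve(heap: int) -> int:
    """Exhaustively score single-heap Nim (take 1-3 counters per turn;
    whoever takes the last counter wins), from the point of view of the
    player to move: +1 means a forced win, -1 a forced loss."""
    if heap == 0:
        return -1  # the previous player just took the last counter and won
    # Try every legal move; our score is the negation of the opponent's.
    return max(-solve(heap - take) for take in (1, 2, 3) if take <= heap)

for heap in range(1, 9):
    print(heap, "forced win" if solve(heap) == 1 else "forced loss")
# Heaps that are multiples of 4 come out as forced losses for the
# player to move, which matches the known theory of the game.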

Such a dismissive attitude would be in tune with the spirit of the times. Our age elevates the precision-tooled power of the algorithm over flawed human judgment. From web search to marketing and stock-trading, and even education and policing, the power of computers that crunch data according to complex sets of if-then rules is promised to make our lives better in every way. Automated retailers will tell you which book you want to read next; dating websites will compute your perfect life-partner; self-driving cars will reduce accidents; crime will be predicted and prevented algorithmically. If only we minimise the input of messy human minds, we can all have better decisions made for us. So runs the hard sell of our current algorithm fetish.
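The “book you want to read next” trick, for instance, can be as humble as counting co-purchases. A hypothetical toy version in Python – invented baskets, not any retailer’s real system – makes the if-then machinery plain:

from collections import Counter
from itertools import combinations

# Toy purchase histories (invented for illustration).
baskets = [
    {"Moby-Dick", "Bleak House", "Middlemarch"},
    {"Moby-Dick", "Middlemarch"},
    {"Bleak House", "Dracula"},
    {"Moby-Dick", "Dracula", "Middlemarch"},
]

# Count how often each pair of books is bought together.
co_bought = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_bought[(a, b)] += 1

def recommend(book: str, n: int = 2) -> list[str]:
    """'Customers who bought X also bought...': rank books by how
    often they co-occur with `book` in past baskets."""
    scores = Counter()
    for (a, b), count in co_bought.items():
        if a == book:
            scores[b] += count
        elif b == book:
            scores[a] += count
    return [title for title, _ in scores.most_common(n)]

print(recommend("Moby-Dick"))  # e.g. ['Middlemarch', 'Bleak House']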

Read the rest at Aeon magazine.

13 April 2013

Holy Sh*t: A Brief History of Swearing, by Melissa Mohr

Did he just say what I think he said? In late 2010, Britain erupted in merriment when a radio interviewer attempted to introduce his guest, “the culture secretary Jeremy Hunt,” and accidentally replaced the first letter of the man’s surname with an earlier-used consonant. Swearwords, even in our proudly informal age, have lost none of their power to offend or amuse. Last year, the Supreme Court earnestly discussed whether the Federal Communications Commission could punish broadcasters for “fleeting expletives”—words that “unexpectedly” arise during live conversation. For the moment, the court decided, the FCC can, though it was advised to reconsider its overall “indecency” policy.

One long-standing response to regulation and social censure has been to adopt an innocent expression and just change a letter or two. Norman Mailer’s World War II novel, The Naked and the Dead, featured salty-tongued soldiers saying “fug” and “fugging.” More recently, Syfy’s much-admired TV series Battlestar Galactica had swearers in space saying “frak” and “frakking.” (Today these words are more likely to evoke a method of getting at shale gas.) One can be even more direct with homophony. Generations of students have giggled in not-quite-innocent pleasure over Hamlet’s asking Ophelia: “Do you think I meant country matters?”

We can safely assume that humans have been both reveling in and claiming to be offended by language deemed “obscene” for as long as they have been talking. Or at the very least, as Melissa Mohr demonstrates in her intelligent and enjoyable new book, since Roman times, when there were already a variety of names for acts and body parts, from proper to very lewd (the guessable “cunnus” and “futuo”; the more obscure “landica” and “irrumo”). In Holy Sh*t: A Brief History of Swearing, Ms. Mohr leads us on an often ear-boggling tour of verbal depravity, through the medieval and early-modern periods (via a fascinating analysis of scatological phrasing in early Bible translations) to the Victorian era and then our own time. She also makes a serious point, cutely captured in the book’s title. Our idea of “swearing” is irredeemably muddled—caught between the sacred, as in the taking of oaths (the title’s “Holy”), and the profane, as in the use of terms for evacuatory and erotic adventure (the title’s other word).

Read the rest at the Wall Street Journal.

Feeling at home with Facebook

The first mobile-phone call was made 40 years ago this week, by a Motorola engineer roaming the streets of New York. Phones have made amazing advances since then: I for one would be lost without Google Maps, literally and all the time. Having something called a “smartphone” makes me feel… well, smart. (Non-smartphones are known in the industry as “feature phones”.) And now the latest exciting evolution of the phone has just been announced: Facebook Home. Premiered on a new phone, the HTC First, it’s a forthcoming Android app that replaces your “home screen” with direct Facebook access. Wake up your phone and your Facebook news feed is right there. OMG, “Like”! Right?

Facebook promises that this will result in a “great, living, social phone”, which gives me alarming mental images of something alive wriggling around in my pocket, connected directly to Mark Zuckerberg’s brain. The instantly available news feed is apparently “for those in-between moments like waiting in line at the grocery store or between classes when you want to see what’s going on in your world”, which oddly implies that “your world” is not what is actually going on around you – which you could, after all, see by simply staring at it rather than fumbling for your phone. No, “your world” is Facebook’s world. Welcome to it!

Read the rest at the Guardian.

27 March 2013

Sound and its discontents through history
Noise: A Human History of Sound and Listening, by David Hendy (Profile)

During a classical music concert, a cough is rarely just a cough. According to a recent paper by the economist Andreas Wagener, people are twice as likely to cough during a concert as at other times. Furthermore, they are more likely to cough during modern, atonal music than during better-known repertoire and they cough more during slow or quiet passages than during fast and loud ones.

The classical cough, then, is no accident but rather a form of communication disguised as involuntary physiological tic. “Because of their ambiguity – they may always be forgiven as bodily reflexes – coughs are a noisy substitute for direct, verbal communication and participation,” Wagener writes. “They allow for social interaction up to contagious herding, propagate (possibly incorrect) assessments of the performance and reassure concert-goers in their aesthetic judgements.”

Coughers might thus be rebelling nonverbally against the hierarchy imposed on them – that of powerful, noise-making performers and submissive, silent audience. Wagener’s paper is too recent to have found its way into David Hendy’s book, but it neatly reflects one of Noise’s major themes – that social groups struggle for supremacy using sound as a proxy.

Read the rest at the New Statesman.