How Soon is Now? The Perpetual Present

When I was growing up, the year 2000 was the temporal touchstone everyone used to mark the advances of modern life. Oh, by then we’d be doing so many technologically enabled things: Cars would fly and run on garbage, computers would run everything, school wouldn’t exist. We were all looking forward, and Y2K gave us a point on the horizon to measure it all by. When it came and went without incident, we were left with what we had in the present. In Present Shock: When Everything Happens Now (Current, 2013), Douglas Rushkoff argues that the flipping of the calendar to the new millennium turned our focus from the future to the never-ending now. “We spent the latter part of the 20th Century leaning towards the year 2000, almost obsessed with the future, the dot-com boom, the long boom, and all that,” he tells David Pescovitz. “It was a century of movements with grand goals, wars to end wars, and relentless expansionism. Then we arrived at the 21st, and it was as if we had arrived.”

“We spent centuries thinking of hours and seconds as portions of the day,” he continues. “But a digital second is less a part of a greater minute, and more an absolute duration, hanging there like the number flap on an old digital clock.” A digital clock is good at accurately displaying the time right now, but an analog clock is better at showing you how long it’s been since you last looked. Needing, wanting, or having only the former is what present shock is all about. It’s what Rushkoff calls elsewhere “a diminishment of everything that isn’t happening right now — and the onslaught of everything that supposedly is.” As the song goes, when you say it’s gonna happen “now,” well, when exactly do you mean?

Michael Leyton (1992) calls us all “prisoners of the present” (p. 1), like runners on a temporal treadmill. He argues that “all cognitive activity proceeds via the recovery of the past through objects in the present” (p. 2), and those objects often linger longer than they once did thanks to recording technologies. In 1986 Iain Chambers described the persistence of the present through such media, writing,

With electronic reproduction offering the spectacle of gestures, images, styles, and cultures in a perpetual collage of disintegration and reintegration, the ‘new’ disappears into a permanent present. And with the end of the ‘new’ – a concept connected to linearity, to the serial prospects of ‘progress’, to ‘modernism’ – we move into a perpetual recycling of quotations, styles, and fashions: an uninterrupted montage of the ‘now’ (p. 190).

Needless to say, the situation has only been exacerbated by the onset of the digital. In one form or another, Rushkoff has been working on Present Shock his whole career. In it he continues the critical approach he’s sharpened over his last several books. Where Life, Inc. (Random House, 2009) tackled the corporate takeover of culture and Program or Be Programmed (OR Books, 2010) took on technology head-on, Present Shock deals with the digital demands of the now. A lot of the dilemma is due to the update culture of social media. No one reads two-week-old Tweets or month-old blog posts. If it wasn’t posted today, in the last few hours, it disappears into irrelevance. And if it’s too long, it doesn’t get read at all. These are not rivers or streams; they’re puddles. All comments, references, and messages, and no story. The personal narrative is lost. It’s the age of “tl;dr.” The 24-hour news cycle, a present made up of the past, and advertising that interrupts everything are also all about right now, but our senses of self may be the biggest victims.

“Even though we may be able to be in only one place at a time,” Rushkoff writes, “our digital selves are distributed across every device, platform, and network onto which we have cloned our virtual identities” (p. 72). Our online profiles give us an atemporal agency whereby we are there but not actually present. On the other side, our technologies mediate our identities by anticipating or projecting a user. As Brian Rotman (2008) writes, “This projected virtual user is a ghost effect: an abstract agency distinct from any particular embodied user, a variable capable of accommodating any particular user within the medium” (p. xiii). Truncated and clipped, we shrink to fit the roles the media allow.

Mindfulness is an important idea cum buzzword in the midst of all this digital doom. Distraction may be just attention to something else, but what if we’re stuck in a permanently distracted present with no sense of the past and no time for the future? If you’ve ever known anyone who truly lives in the moment, nothing matters except that moment. It’s the opposite of The Long Now, what Rushkoff calls the “Short Forever.” Things only have value over time. Citing the time binding of Alfred Korzybski, the father of general semantics, Rushkoff illustrates how we bind the histories of past generations into words and symbols. The beauty is that we can leverage the knowledge of that history without going through it again. The problem is that without a clear picture of the labor involved, we risk mistaking the map for the territory.

James Gleick summed it up nicely when he told me in 1999, “We know we’re surrounding ourselves with time-saving technologies and strategies, and we don’t quite understand how it is that we feel so rushed. We worry that we gain speed and sacrifice depth and quality. We worry that our time horizons are foreshortened — our sense of the past, our sense of the future, our ability to plan, our ability to remember.” Well, here we are. What now?

The existence of this book proves we can still choose. In the last chapter of Present Shock, Rushkoff writes,

…taking the time to write or read a whole book on the phenomenon does draw a line in the sand. It means we can stop the onslaught of demands on our attention; we can create a safe space for uninterrupted contemplation; we can give each moment the value it deserves and no more; we can tolerate uncertainty and resist the temptation to draw connections and conclusions before we are ready; and we can slow or even ignore the seemingly inexorable pull from the strange attractor at the end of human history (pp. 265–266).

We don’t have to stop or run; we can pause and slow down. Instant access to every little thing doesn’t mean we have to forsake attended access to a few big things. Take some time, read this book.


Chambers, Iain. (1986). Popular Culture: The Metropolitan Experience. New York: Routledge.

Leyton, Michael. (1992). Symmetry, Causality, Mind. Cambridge, MA: The MIT Press.

Morrissey, Steven & Marr, Johnny (1984). How Soon is Now? [Recorded by The Smiths]. On Hatful of Hollow [LP]. London: Rough Trade.

Rotman, Brian. (2008). Becoming Beside Ourselves: The Alphabet, Ghosts, and Distributed Human Being. Durham, NC: Duke University Press.

Rushkoff, Douglas. (2013). Present Shock: When Everything Happens Now. New York: Current.

Expanding Minds: Books on Hacking Your Head

Thinking about our own minds often seems so pataphysically impossible as to be useless and silly, but, to paraphrase Steven Johnson (again), trying to understand the brain is trying to understand ourselves. By contrast, trying to expand and enhance it seems much easier. You can expand your mind without really understanding how it happens. There are many ways to make your brain feel bigger, and these three new books provide several steps in that direction.

Upgrade your grey matter because one day it may matter.
— Deltron 3030

Mindhacker: 60 Tips, Tricks, and Games to Take Your Mind to the Next Level by Ron Hale-Evans and Marty Hale-Evans (Wiley, 2011) is the “unofficial sequel” to Ron’s previous book, Mind Performance Hacks: Tips & Tools for Overclocking Your Brain (O’Reilly, 2006), which I mentioned previously. From the sublime to the silly, extensive lists of mental activities, experiments, and games comprise these books, and they’re as fun as they are fertile.

Many of the hacks here take advantage of the fact that the ways you see your mind and your world are often radically related, if not the same thing. What I mean is that a lot of these are not just mental exercises, but tricks for productivity, ways to communicate better, hacks for breaking bad habits, tips for time management, and creative ways to be more creative. It’s not just about the hacks though. Mindhacker is also stocked with other (re)sources: Relevant URLs, books, and articles are listed on every page, along with the stories of the hacks’ origins, and the book’s website has even more, including pieces of code as well as complete programs.

Speaking of programs, Andy Hunt’s Pragmatic Thinking and Learning (Pragmatic Bookshelf, 2008) tackles maximizing the mind from a programmer’s point of view, and it overlaps and complements the books mentioned above nicely. Maps, models, recipes, and other scripts and schedules are a part of Hunt’s push, but you don’t have to be a code nerd to get plenty out of this book. It has helpful tips for everyone. Chapter four, “Get in Your Right Mind,” even suggests rock climbing, which I regularly use to clear my mind’s cache.

From the grounded to the grandiose, Supersizing the Mind: Embodiment, Action, and Cognitive Extension by Andy Clark (Oxford University Press, 2011) stretches the mind in multiple manners, also blurring the line between the brain and the world. Clark’s extended mind thesis posits the mind beyond the body… Sometimes. That is, sometimes we perform a Dawkinsian flip, seeing the biosphere as an endless network of DNA regardless of organismal boundaries; sometimes our brains and the brains of others are emphatically embodied. It’s a simple but sizable distinction. Where we draw those lines changes everything about how we see the mind and the world.

Other than a few minor missteps (e.g., in his conclusion, Clark unfortunately defines the mind as a “mashup,” when really he just means that it’s extremely diverse, infinitely adaptable, and ultimately mysterious), Supersizing the Mind is one of the better books I’ve seen in the neurosciences in a while.

If you want a brain book that’s handy and fun, I definitely recommend Mindhacker and Pragmatic Thinking and Learning. Those two, along with Dan Pink’s book, A Whole New Mind (Riverhead, 2006), will get you a long way toward optimizing your cognitive output. If you want something a bit more theoretical, check out Supersizing the Mind. Either way, get to mining and minding your mind. It is still legal.

Obscured by Crowds: Clay Shirky’s Cognitive Surplus

In The Young & The Digital (Beacon, 2009), Craig Watkins points out an overlooked irony in our switch from television screens to computer screens: We gather together around the former to watch passively, while we individually engage with the latter to actively connect with each other. This insight forms the core of Clay Shirky’s Cognitive Surplus: Creativity and Generosity in a Connected Age (Penguin, 2010). Shirky argues that the web has finally joined us in a prodigious version of McLuhan’s “global village” or Teilhard de Chardin’s “Noosphere,” wherein everyone online merges into one productive, creative, cooperative, collective consciousness. If that seems a little extreme, so are many of Shirky’s claims. The “cognitive surplus” marks the end of the individual literary mind and the emergence of the Borg-like clouds and crowds of Web 2.0.

Okay, not exactly, but he does argue for the potential of the cognitive collective. So, Wot’s… Uh, the deal?

Is Clay Shirky the new Seth Godin? I’d yet to read anything written by him that didn’t echo things I’d already read from David Weinberger or Howard Rheingold (or Marshall McLuhan, of course), and I hoped Cognitive Surplus would finally break the streak. Well, it does, and it doesn’t. As Shirky put it in his previous book, Here Comes Everybody (Penguin, 2008), “society doesn’t change when people adopt new tools; it changes when people adopt new behaviors.” This time around he argues that we adopt new behaviors when provided with new opportunities, which, by my estimate, are provided by new tools — especially online.

Steve Jobs once said that the computer and the television would never converge because we choose one when we want to engage and the other when we want to turn off. The problem with Shirky’s claims is that he never mentions this disparity of desire. A large percentage of people, given the opportunity or not, do not want to post things online, create a Facebook profile, or engage in any of a number of other web-enabled sharing activities. For example, I do not like baseball. I don’t like watching it, much less playing it. If all of a sudden baseballs, gloves, and bats were free, and every home were equipped with a baseball diamond, my desire to play baseball would not increase. Most people do not want to comment on blog posts, video clips, or news stories, much less create their own, regardless of the tools or opportunities made available to them. Cognitive surplus or not, its potential is just that without the collective desire to put it into action.

Shirky’s incessant lolcat bashing and his insistence that we care more about “public and civic value” come off as “net” elitism at its worst. The wisdom of crowds, in James Surowiecki’s phrase, doesn’t necessarily lead to the greater good, whatever that is. You can’t argue for bringing brains together and then expect them to “do right.” Are lolcats stupid? Probably, but they’re certainly not ushering in the end of Western civilization. It’s still less popular to be smart than it is to be a smartass, but that’s not the end of the world, online or off-. The crowd is as wise as the crowd does. Glorifying it as such, as Jaron Lanier points out in You Are Not a Gadget (Knopf, 2010), is just plain wrong-headed.

The last chapter, “Looking for the Mouse,” is where Shirky shines though. [Although its namesake echoes a story by Jaron Lanier from a 1998 Wired article about children being smarter and expecting more from technology. Lanier wrote, “My favorite anecdote concerns a three-year-old girl who complained that the TV was broken because all she could do was change channels.” Shirky’s version involves a four-year-old girl digging in the cables behind a TV, “looking for the mouse.”] His ability to condense vast swaths of knowledge into a set of tactics for new media development in this last chapter is stunning compared to the previous 180 pages. Perhaps he is the new Seth Godin after all.


Lanier, J. (1998, January). “Taking Stock.” Wired, 6.01.

Lanier, J. (2010). You Are Not a Gadget: A Manifesto. New York: Knopf.

Shirky, C. (2010). Cognitive Surplus: Creativity and Generosity in a Connected Age. New York: Penguin.

Surowiecki, J. (2005). The Wisdom of Crowds. New York: Anchor.

Watkins, S. C. (2009). The Young & The Digital. New York: Beacon.

Mirroring Minds

In researching technological mediation (which many of you know has been my most intense intellectual jones over the past few years), I started looking internally a year and a half or so ago. By internally, I mean cognitively, thinking that quite a lot of the process I’m trying to figure out is going on inside our heads. I first read about mirror neurons when David Byrne and Daniel Levitin were in Seed Magazine‘s “The Seed Salon,” and I immediately knew I’d stumbled across something I couldn’t ignore.

Mind Wide Shut: Recent Books on Mind and Metaphor

Scientists have used metaphors to conceptualize and understand phenomena since early Greek philosophy. Aristotle used many anthropomorphic ideas to describe natural occurrences, but the technology of the time, needing constant human intervention, offered little in the way of metaphors for the mind. Since then, theorists have compared the human mind to the clock, the steam engine, the radio, the radar, and the computer, all of increasing complexity.