Isolator is a small menu bar application that helps you concentrate. When you're working on a document, and don't want to be distracted, turn on Isolator. It will cover up your desktop and all the icons on it, as well as the windows of all your other applications, so you can concentrate on the task in hand.
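(As an aside, the effect is simple enough that it can be sketched in a few lines of code. The snippet below is purely my own hypothetical illustration of the idea, a fullscreen, semi-transparent, always-on-top window that dims everything else, and has nothing to do with Isolator’s actual implementation.)

# A minimal, hypothetical sketch of an Isolator-style screen dimmer (not Isolator's own code).
# It covers the screen with a semi-transparent black window; press Escape to dismiss it.
import tkinter as tk

def main():
    root = tk.Tk()
    root.attributes("-fullscreen", True)   # cover the entire screen
    root.attributes("-topmost", True)      # stay above every other window
    root.attributes("-alpha", 0.85)        # let the desktop show through faintly
    root.configure(bg="black")
    tk.Label(root, text="Focus.", fg="white", bg="black",
             font=("Helvetica", 32)).pack(expand=True)
    root.bind("<Escape>", lambda event: root.destroy())
    root.mainloop()

if __name__ == "__main__":
    main()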
I like very much the idea of a piece of software that puts everything out of focus except for the document that you’re working on. Equipped also with a pair of noise-cancelling headphones, one could really get some work done. Except I think that very soon I would start obsessing about those blurry symbols, those muffled sounds. I’d want to know what goes on in the space at the edge of my attention, precisely because it has been artificially suppressed. I’d want to know what it is that I’m concealing from myself.
But then Nicholas Carr thinks that I have a problem. The internet is impairing my ability to concentrate and therefore to think in linear, analytical fashion. Always craving the stimulation of a thousand information sources, I’m no longer able to read a single text for long, uninterrupted periods of time, and engage deeply with its subject matter. He has a name for the place I inhabit: the shallows.
It is not an altogether new idea. Besides Carr’s own intensely debated 2008 article for The Atlantic entitled Is Google Making Us Stupid?, one could cite Maryanne Wolf’s research on the reading brain in Proust and the Squid, as well as Mark Fisher’s observations in Capitalist Realism on the difficulties some of his students have in coping with substantive written texts, later expanded to include some symptoms of his own:
I know that I would be more productive (and less twitchily dissatisfied) if I could partially withdraw from cyberspace, where much of my activity - or rather interpassivity - involves opening up multiple windows and pathetically cycling through twitter and email for updates, like a lab rat waiting for another hit.
This chimes very precisely with some of Stephen Judd’s recent self-observations, which also employ the image of the lab rat (as indeed does Carr on page 117 of The Shallows). Perhaps most troubling of all is Fisher’s account, in the blog post cited above, of the father of one of his students, who told him in despair ‘that he had to turn off the electricity at night in order to get her to go to sleep.’ Here the search for a non-metaphorical isolation results in a highly symbolic gesture designed to make the information age and modernity itself go away, if only for the space of one night.
There are many more examples one could bring up, but I think that these are enough to suggest that the stuff is real. We may not all feel quite the same way as Carr, Fisher or Judd, but self-reported experience isn’t without merit or value, and besides I would suggest that all of us may at least have an inkling of what they are talking about. I know for instance that I have been quite deliberate in never fully embracing Twitter, precisely because I am wary of the consequences of opening yet another channel. For much the same reason, you won’t see me carry a smartphone any time soon. And my writing in this space too is regulated by a discipline that seeks to counter some of the pressures of the medium, especially the pressure to speak often. I feel the need not only to switch off, but also to be less intensely connected generally; and even if this reticence applied to a tiny minority of internet users, it wouldn’t make it any less real or meaningful or worthy of comment and analysis.
For these reasons, as in the case of Jaron Lanier’s You Are Not a Gadget, I was initially well disposed towards Carr’s book; and yet in this instance too I was ultimately frustrated by it.
I happen to think that the manner in which we articulate our critiques of digital ideology is going to be crucial not only for how we grow cyberspace and resist its corporatisation, but also for our politics. That’s why we need analyses that enrich our understanding of its fluid and tangled phenomena, as opposed to reducing historical, technological and social change to a set of comforting and mutually exclusive binaries.
In Carr’s case, it’s the deep vs. shallow dichotomy. The literacy promoted by print culture, following Maryanne Wolf, is where one finds depth, that is to say, the means for long-term learning and reasoned, linear argumentation. Therefore the internet must be the place that leads to forgetfulness, shallow thinking and muddled logic, due to the fragmented, constantly updating, forcefully distracting nature of its un-literacy.
The two prongs of the argument are Marshall McLuhan and neuroscience, and they operate in seamless unison. So whereas McLuhan understood the transforming power of media to operate primarily at the level of epistemology, not neural circuitry, Carr claims that the taking hold of the ‘new intellectual ethics’ of the Internet is synonymous with the rerouting of the pathways in our brains (pages 3 and 77). Indeed in a couple of key passages I half expected Mr McLuhan to walk into the shot and exclaim ‘You know nothing of my work!’ Not because those neurobiological implications are wholly absent in Understanding Media – whose original subtitle after all was ‘The Extensions of Man’ – but because they are secondary to the notion of media as metaphors that organise thought.
Having framed as ‘the crucial question’ what science can ‘tell us about the actual effects that Internet use is having on the way our minds work’ (115), Carr finds that yes, of course, science confirms the hunch: experiments have shown heavy internet use to shift the areas of activity in the brain, and to reinforce certain processing functions at the expense of others. While the breadth of experimental evidence is impressive, Carr never interrogates the nature of the data, nor questions the researchers’ assumptions as to what constitutes comprehension or learning. “Studies show…” is his default, un-nuanced position. This is also true of the experiments that may give us a little pause, such as the one suggesting that spending time immersed in nature can sharpen our powers of concentration, but that the same benefit can also be gained by staring at pictures of nature from the comfort of one’s own home. (With the possible, unstated implication that so long as you download some sort of bucolic screensaver, you’re good to go.)
When Carr turns his attention to the epistemological question, that is to say how media – new and old – are implicated in how a society constructs and expresses its ideas about truth, the conclusions are less clear-cut and the exposition a little, well, shallow, leaving one to wonder whether the author came to Cartesian dualism by way of Wikipedia, or about the extent of the deep thinking that underlies some of his central claims. Thus for instance the proposition that
[w]hen people start debating (as they always do) whether the medium's effects are good or bad, it's the content they wrestle over (2) would appear to be contradicted by Carr's requisite, diligent and extensive treatment of Plato’s argument against writing in the Phaedrus – the granddaddy of all debates on the effect of media – which is in fact preoccupied exclusively with form. Similarly, the claim that ‘[t]he intellectual ethic of a technology is rarely recognized by its inventors’ (45) is pointedly belied, as far as the internet is concerned, by the towering figure of Norbert Wiener, the father of cybernetics, as well as by that of Tim Berners-Lee.
Other pronouncements are little more than irritating, unargued clichés. Thus we are informed that ‘[a]s social concerns override literary ones, writers seem fated to eschew virtuosity and experimentation in favor of a bland but immediately accessible style […]’, and thus ‘[w]riting will become a means for recording chatter’ (107), or that ‘[o]ur indulgence in the pleasures of informality and immediacy has led to a narrowing of expressiveness and a loss of eloquence’ (108), something that no doubt will come as a surprise to your friends who are honing their aphoristic skills on Twitter, or their broader array of rhetorical skills on blogs and discussion forums.
This last point in fact is a key to understanding the limits of the book’s perspective, for Carr almost always takes the media consumer to be a reader – as is the case with the near totality of people in the medium of print – as opposed to a writer; whereas in fact one of the most notable features of cyberspace is that it makes writers of many if not most of its users. And so in order to make any sort of informed, useful statement on the relative depth or shallowness of the new medium, one would have to evaluate their literacy not just in terms of reading, but of writing as well, and seriously examine the kinds of knowledge produced across all media over the last two decades, a task in which science is unlikely to be able to supply the necessary value judgments.
What’s required is cultural work of the most serious and pressing kind, whose outcome is difficult to predict. We might find that our twitchy interpassivity has insinuated itself into the deep language structures of the web, curtailing our capacity for expression; or that new forms of rhetoric have begun to emerge to match the repackaging of the world’s pre-digital knowledge into a single, infinitely searchable platform. In the meantime, it pays to heed the symptoms reported by Carr and the others while we still can, before they become an invisible part of the experience of being awake in a digital world, and not to dismiss their subjective experience, that feeling of vertigo: it may yet hold a key to understanding and designing out some of the most insidious aspects of the new medium, and therefore of our new selves.
Nicholas Carr. The Shallows: How the internet is changing the way we think, read and remember. London: Atlantic, 2010.
By coincidence, I was listening to a podcast which touched on the Atlantic article and the book ('A podcast to remember', Stuff You Should Know), but with less of a critical eye than your blog.
They did make the interesting point that the internet was largely non-fiction writing and reading, whereas when we pick up books we're much more likely to pick up fiction writing.
Their leap was that fiction writing exercises those imaginative bits of the brain more than the non-fiction - they didn't really cite anything to back it up but it would be an interesting point.
Does spending time on the internet mean we're reading the same/more amount of non-fiction, but less fiction? And if so, what does that mean?
"‘[o]ur indulgence in the pleasures of informality and immediacy has led to a narrowing of expressiveness and a loss of eloquence’ (108), something that no doubt will come to a surprise to your friends who are honing their aphoristic skills on Twitter, or their broader array of rhetorical skills on blogs and discussion forums."
+1
I'm not unsympathetic to the arguments of this book. I feel a pronounced restriction of thought to 140-character bursts.
Huge question, and hardly one to leave the internet out of by discussing it in print. Thank you for bringing it to us, Gio.
I found it quite easy to resist Twitter, strangely. I don't need to impose any discipline at all, I simply don't feel any desire to use it unless there's some urgent bit of news I want to keep up with.
Similarly with Facebook, it's just something I casually check once a day.
Blogs, on the other hand, can dominate my attention. But the single-minded focus is not of a shallow variety at all; it's more like the most intense reading of a short piece of work I've ever done, and my writing in it has about 10 times the care I would ever have put into anything I wrote at University. It's much more like a hard-out philosophical discussion I might have had in a tutorial or between lectures with friends, something I loved doing, which is probably why I do it now.
I don't know if that means I'm not neurotypical or something. It does, however, mean I'm off-task a lot.
I can see where the guy is coming from, though, most likely something that involves a lot of brain use is going to have an effect on our brains. The important question is what is good and what is bad about what has changed. It's a massive simplification to just say that our thinking has become more shallow.
"I don't know if that means I'm not neurotypical or something."
I was meaning to get into this: one of the most disappointing aspects of Carr's critique - although he shares it with so many commentators and researchers it almost seems unfair to single him out - is the extent to which it assumes that neurotypical reading and comprehension (let alone communication) are the only kind there are. Anybody who has been shown what assistive technologies like tablet computers can achieve with children who otherwise give little or no sign of being able to read and write would know better than that.
Indeed. Mere changes in brain usage are interesting but not in themselves bad. Education changes the way brains work too. And there's a huge range, both in people who simply use their brains differently (like our particular children - I can hardly expect Marcus to be using the part of his brain that was damaged by his stroke), and in people who are famous geniuses.
I think our brains have adapted (and will continue to adapt) to the world they find themselves in. We're living in a time saturated in information, and I think that what's going on in our brains reflects that. We aren't required to tax our memories in the same way, but who is to say that doesn't actually free up capacity for alternate processing? I don't find myself committing much to memory any more, but that is mostly because I am extracting the information, and I know that if I need to find the actual data again, it can be done with much higher reliability than wasting concentration and effort cramming irrelevant things in there.
Which is not to say that memory is of no use; I'm just saying that it adapts to purpose - people in different lines of work have different kinds of memories. The memory of a graphic artist is probably quite different to that of a writer, although each can learn to think like the other with training too.
Furthermore, at a functional level, aren't books and the internet extensions of our memory? In fact, it goes much deeper, at least for me, my entire office is an extension of my memory, which is part of the reason I don't like it when people muck around with it; it makes me less efficient, just as it would a carpenter if you moved every tool around in his workshop.
"Furthermore, at a functional level, aren't books and the internet extensions of our memory? In fact, it goes much deeper, at least for me, my entire office is an extension of my memory"
I always think of this as an example of Dawkins' Extended Phenotype* - we outsource some of the protection of our internal organs to clothing and buildings; we outsource some of the digestion of our food to heat and chemicals; so we outsource some of the workings of our minds to the written word and, more recently, computers. We've been doing most of these things for thousands of years, and in the case of cooking and clothing/shelter, they have demonstrably affected our evolution - so at what point do I draw the line and say "this is part of me; this is not"?
*Disclaimer - my only knowledge of The Extended Phenotype (and, indeed, many of the topics on which I like to hold forth) comes from Wikipedia. Make of that what you will.
I think our brains have adapted (and will continue to adapt) to the world they find themselves in. We're living in a time saturated in information, and I think that what's going on in our brains reflects that. We aren't required to tax our memories in the same way, but who is to say that doesn't actually free up capacity for alternate processing?
Nicholas Carr, apparently, along with a whole lot of very keen neuropsychologists. And I’m not against people studying the problem and coming to their own conclusions, either, except that, as I think you are right to suggest, the complexities go beyond our very limited capacity to replicate real world conditions in a laboratory, or even to define the terms broadly and inclusively enough.
The thing about Plato’s warning in the Phaedrus is that he was right: externalising human memory on tablets and scrolls gradually made the great ancient art of memory disappear, by removing the cultural and technological conditions for its existence. We can say in hindsight that the overall cultural effects were still overwhelmingly positive; scholars have come to the conclusion that Plato probably did too. But the great merit of that argument – which was really more of a parable – was to highlight the profound, intimate effects of new media technologies on the mind, a full twenty-four centuries before McLuhan. And it reassures us still that it’s not wrong to fret, that the outcomes aren’t predetermined (will it be an overall gain, or a loss?), and that certain biases of the new media can and likely ought to be resisted, but that at any rate it’s imperative to make them visible. Carr is just not very effective at this, but that’s not to say that the intent isn’t worthwhile.
We've been doing most of these things for thousands of years, and in the case of cooking and clothing/shelter, they have demonstrably affected our evolution - so at what point do I draw the line and say "this is part of me; this is not"?
I think if asked we’d all come up with slightly different lines of demarcation, but one thing I found in my research is that the idea of an inside and an outside is quite persistent even if we all recognise, from the blind man’s cane down to our iPads, that it’s hard to separate the tool from the user. The problem, I think, comes when you get past description and assign value to the assemblages. Which are the extensions that make us work better, feel better, think better? And is it even possible or desirable to discriminate?
I'd dispute that the ancient art of memory has disappeared. There are still mnemonists, and the odd prodigy. But the requirement for all educated people to also be mnemonists is lessened. People still seem to remember most of what they need to remember, though. Also, there is a big overlap period, in case there really is something disastrous about the technology. Oral history was still huge for a long time after writing was discovered, and still exists today. If there is something awful to be found in our cyber minds, it's not like we can't go back.
But yes, there are very few lossless changes. Carr needs to do more work to show that more is lost than gained.
Agreed. I think the world of internet critics is crying out for a Neil Postman (indeed some aspects of Postman's critique of television, such as the "Now... THIS" model for organising the delivery of news, can be recycled more or less wholesale, as Stephen has argued).
(You're right about the other thing too: the art of memory isn't quite dead and in fact it's being revived in interesting ways.)
@kylejits
"Does spending time on the internet mean we're reading the same/more amount of non-fiction, but less fiction? And if so, what does that mean?"
Is it me, or have adults been complaining that they no longer read fiction since before the days of the internet? I suspect that the supposed inability to read long books may be something that comes with a certain age and its related work pressures too.
I'm not sure that we can draw inferences one way or another, at any rate, insofar as Carr doesn't lament a loss of creativity but of our capacity for concentration and analysis (and if that is the case, shouldn't we be more easily drawn to something with a narrative than to non-fiction?). On this score I must admit I could never quite understand why television critics insisted in the early days of reality television that scripted drama was a superior form to quasi-documentary.
(And thank you for the tip, I'll look for that podcast.)
I noticed that I wasn't reading fiction in the mid-90s. For me, reading had lost quite a lot of its pleasure at University. Also, I simply came to like non-fiction a lot more than previously, and would read it purely for pleasure. This was well before I was reading a lot on the internet.
The Extended Phenotype
As far as knowledge goes, in The Science of the Discworld I think they call it 'extelligence'. For added relevance, I might add that this factoid took quite some remembering for me.
Extelligence is a philosophical and scientific term, produced in a different discipline but a similar milieu to the extended phenotype. I'm not sure how Pratchett used it, but for its originators it was essentially synonymous with what Carr calls externalised memory. And that's a problem in this area of speculation in and of itself: that when we say that we externalise these mental faculties we're seldom very clear on whether we mean intelligence or memory, and if we say memory whether we actually mean identity. Carr is no exception.
There is an unfortunate association here with Pratchett that helps us see the limits of the optimistic view embodied by terms like extelligence and the extended phenotype. People who suffer from Alzheimer's disease - which is still commonly understood to primarily affect memory, as opposed to cognition - are helped very little by external memory devices and artefacts. Which suggests to me that we still need the bulk of our knowledge, and most especially of our knowledge of self, to reside in the brain in order to have a functioning mind.
>that when we say that we externalise these mental faculties we're seldom very clear on whether we mean intelligence or memory, and if we say memory whether we actually mean identity. Carr is no exception.
I think we're seldom clear because the differences actually aren't that clear. Intelligence is a particularly poorly defined word, and human memory is clearly very different to digital storage, although it can be used for that purpose. It can be tempting in a digital age to try to categorize our muscle memory as something akin to data, but it's not like we can use our minds to look up the potentials and connections of our neurons.
human memory is clearly very different to digital storage
And yet it’s amazing how many people who should know better (I’ve talked about one of them here, but there are many others) don’t seem to be able to grasp that difference. Carr does, to his credit, and articulates it quite persuasively.
Hi,
Check it out: Nicholas Carr talks about Cloud consumerism, enterprises' growing interest in Cloud, and Nick's upcoming projects: http://bit.ly/ixv4OV