Is Google Making Us Stupid?

From Wikipedia, the free encyclopedia

Is Google Making Us Stupid? What the Internet Is Doing to Our Brains
Cover of The Atlantic issue featuring "Is Google Making Us Stoopid?"
Writer: Nicholas G. Carr
Categories: Advocacy journalism
First issue: Published in The Atlantic, July 1, 2008
Website: Cover story

Is Google Making Us Stupid? What the Internet Is Doing to Our Brains (alternatively Is Google Making Us Stoopid?) is a magazine article by technology writer Nicholas G. Carr that is highly critical of the Internet's effect on cognition. It was published in the July/August 2008 edition of The Atlantic magazine as a six-page cover story.[1] Carr's main argument is that the Internet might have detrimental effects on cognition that diminish the capacity for concentration and contemplation. Despite the title, the article is not specifically targeted at Google but at the cognitive impact of the Internet and World Wide Web in general.[2][3] Carr expanded his argument in The Shallows: What the Internet Is Doing to Our Brains, a book published by W. W. Norton in June 2010.

The essay was extensively discussed in the media and the blogosphere, with reactions to Carr's argument being polarised. At the Britannica Blog, part of the discussion focused on the apparent bias in Carr's argument toward literary reading. In Carr's view, reading on the Internet is generally shallower than reading printed books, which he believes exercises a more intense and sustained form of reading.[4] Elsewhere in the media, the Internet's impact on memory retention was discussed; and, at the online scientific magazine Edge, several argued that it was ultimately the responsibility of individuals to monitor their Internet usage so that it does not impact their cognition.

While long-term psychological and neurological studies have yet to yield definitive results justifying Carr's argument, a few studies have provided glimpses into the changing cognitive habits of Internet users.[5] A UCLA study, whose functional MRI scans showed a breadth of brain activity while users performed Internet searches, led some to wonder whether that activity actually facilitated reading and cognition or instead overburdened the mind, and what quality of thought was indicated by the additional activity in brain regions known to control decision-making and complex reasoning.

Background

Prior to the publication of Carr's Atlantic essay, critics had long been concerned about the potential for electronic media to supplant literary reading.[6] In 1994, American academic Sven Birkerts published a book titled The Gutenberg Elegies: The Fate of Reading in an Electronic Age, consisting of a collection of essays that declaimed against the declining influence of literary culture—the tastes in literature that are favored by a social group—with a central premise among the essays asserting that alternative delivery formats for the book are inferior to the paper incarnation.[7][8][9] Birkerts was spurred to write the book after his experience with a class he taught in the fall of 1992, where the students had little appreciation for the literature he had assigned them, stemming from, in his opinion, their inaptitude for the variety of skills involved in deep reading.[10][11][12] In "Perseus Unbound", an essay from the book, Birkerts presented several reservations toward the application of interactive technologies to educational instruction, cautioning that the "long-term cognitive effects of these new processes of data absorption" were unknown and that they could yield "an expansion of the short-term memory banks and a correlative atrophying of long-term memory".[13]

In 2007, developmental psychologist Maryanne Wolf took up the cause of defending reading and print culture in her book Proust and the Squid: The Story and Science of the Reading Brain, approaching the subject matter from a scientific angle in contrast to Birkerts' cultural-historical angle.[2][8][14][15] A few reviewers were critical of Wolf for only touching upon the Internet's potential impact on reading in her book;[16][17][18] however, in essays published concurrent with the book's release she elaborated upon her worries. In an essay in The Boston Globe, Wolf expressed her grave concern that the development of knowledge in children who are heavy users of the Internet could produce mere "decoders of information who have neither the time nor the motivation to think beneath or beyond their googled universes", and cautioned that the web's "immediacy and volume of information should not be confused with true knowledge".[19] In an essay published by Powell's Books, Wolf contended that some of the reading brain's strengths could be lost in future generations "if children are not taught first to read, and to think deeply about their reading, and only then to e-read".[20] Preferring to maintain an academic perspective, Wolf firmly asserted that her speculations have not yet been scientifically verified but deserved serious study.[21][22]

In Carr's 2008 book The Big Switch: Rewiring the World, From Edison to Google, the material in the final chapter, "iGod", provided a basis for his later Atlantic magazine article titled "Is Google Making Us Stupid?"[23] The inspiration to write "Is Google Making Us Stupid?" came from the difficulties Carr found he had in remaining engaged with not only books he had to read but even books that he found very interesting.[3] This kind of sustained, absorbed reading is sometimes called deep reading, a term coined by academic Sven Birkerts in The Gutenberg Elegies and later defined by developmental psychologist Maryanne Wolf with an added cognitive connotation.[11][21][22][24][25]

Synopsis

"Is Google Making Us Stupid?" is a 2008 article written by technology writer Nicholas Carr for The Atlantic, later expanded into a book published by W. W. Norton. The essay investigates the cognitive effects of technological advancements that relegate certain cognitive activities, namely knowledge-searching, to external computational devices. It received mainstream recognition for interrogating the assumptions people make about technological change and for advocating a measure of personal accountability in our relationships with devices.

Carr begins the essay by saying that his recent problems with concentrating on reading lengthy texts, including the books and articles that he used to read effortlessly, stem from spending too much time on the Internet. He suggests that constantly using the Internet might reduce one's ability to concentrate and reflect on content. He introduces a few anecdotes taken from bloggers who write about the transformation in their reading and writing habits over time. In addition, he analyzes a 2008 study by University College London about new "types" of reading that will emerge and become predominant in the information age. He particularly refers to the work of Maryanne Wolf, a reading behavior scholar, which includes theories about the role of technology and media in learning how to write new languages. Carr argues that while speech is an innate ability that stems directly from brain structure, reading is conscious and taught. He acknowledges that this theory has a paucity of evidence so far, but refers to such works as Wolf's Proust and the Squid, which discusses how the brain's neurons adapt to a creature's environmental demands to become literate in new problem areas. The Internet, in his opinion, is just another kind of environment that we will uniquely adapt to.

Carr discusses how concentration might be impaired by Internet usage. He cites the historical example of Nietzsche, who in the 1880s adopted the then-new typewriter; Nietzsche's writing style reportedly changed after he began using it. Carr presents this example as demonstrative of neuroplasticity, the scientific theory that neural circuits are contingent and in flux. He invokes sociologist Daniel Bell's idea that technologies extend human cognition, arguing that humans unconsciously conform to the very qualities, or patterns, involved in these devices' functions. He uses the clock as an example of a device that has both improved and regulated human perception and behavior.

Carr argues that the Internet is changing behavior at unprecedented levels because it is one of the most pervasive and life-altering technologies in human history. He suggests that the Internet engenders cognitive distractions in the form of ads and popups. These concentration-altering events are only worsened by online media as they adapt their strategies and visual forms to those of Internet platforms to seem more legitimate and trick the viewer into processing them.

Carr also posits that people's ability to concentrate might decrease as new algorithms free them from knowledge work; that is, the process of manipulating and synthesizing abstract information into new concepts and conclusions. He compares the Internet with industrial management systems, tracing how they caused workers to complain that they felt like automata after the implementation of Taylorist management workflows. He compares this example with the modern example of Google, which places its computer engineers and designers into a systematized knowledge environment, creating robust insights and results at the expense of creativity. Additionally, Carr argues that the Internet makes its money mainly by exploiting users' privacy or bombarding them with overstimulation, a vicious cycle where companies facilitate mindless browsing instead of rewarding sustained thinking.

Carr ends his essay by tracing the roots of the skeptical trend. He discusses events where people were wary of new technologies, including Socrates's skepticism about the use of written language and a fifteenth-century Italian editor's concern about the shift from manually written to printed works. All of these technologies indelibly changed human cognition, but also led to mind-opening innovations that endure today. Still, Carr concludes his argument on an ambivalent note, quoting playwright Richard Foreman's lament over the erosion of educated and articulate people. Though Google and other knowledge-finding and knowledge-building technologies might speed up existing human computational processes, they might also foreclose the human potential to easily create new knowledge.

Reception

We can expect … that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.

Nicholas Carr, "Is Google Making Us Stupid?".[24]

Carr's essay was widely discussed in the media both critically and in passing. While English technology writer Bill Thompson observed that Carr's argument had "succeeded in provoking a wide-ranging debate",[2] Damon Darlin of The New York Times quipped that even though "[everyone] has been talking about [the] article in The Atlantic magazine", only "[s]ome subset of that group has actually read the 4,175-word article, by Nicholas Carr."[26] The controversial online responses to Carr's essay were, according to Chicago Tribune critic Steve Johnson, partly the outcome of the essay's title "Is Google Making Us Stupid?", a question that the article proper does not actually pose and that he believed was "perfect fodder for a 'don't-be-ridiculous' blog post"; Johnson challenged his readers to carefully consider their online responses in the interest of raising the quality of debate.[3]

Many critics discussed the merits of Carr's essay at great length in forums set up formally for this purpose at online hubs such as the Britannica Blog and publisher John Brockman's online scientific magazine Edge, where the roster of names quickly took on the semblance of a Who's Who of the day's Internet critics.[27][28][29][30] Calling it "the great digital literacy debate", British-American entrepreneur and author Andrew Keen judged the victor to be the American reader, who has a wide range of compelling writing from "all of America's most articulate Internet luminaries".[30]

Book critic Scott Esposito pointed out that Chinese characters are incorrectly described as ideograms in Carr's essay, an error that he believed undermined the essay's argument.[31] The myth that Chinese script is ideographic had been effectively debunked in scholar John DeFrancis' 1984 book The Chinese Language: Fact and Fantasy;[32] DeFrancis classifies Chinese as a logosyllabic writing system.[33] Carr acknowledged that there was a debate over the terminology of 'ideogram', but in a response to Esposito he explained that he had "decided to use the common term" and quoted The Oxford American Dictionary to demonstrate that it likewise defines Chinese characters as instances of ideograms.[34]

Writer and activist Seth Finkelstein noted that predictably several critics would label Carr's argument as a Luddite one,[35] and he was not to be disappointed when one critic later maintained that Carr's "contrarian stance [was] slowly forcing him into a caricature of Luddism".[36] Then, journalist David Wolman, in a Wired magazine piece, described as "moronic" the assumption that the web "hurts us more than it helps", a statement that was preceded by an overview of the many technologies that had been historically denounced; Wolman concluded that the solution was "better schools as well as a renewed commitment to reason and scientific rigor so that people can distinguish knowledge from garbage".[37]

Several prominent scientists working in the field of neuroscience supported Carr's argument as scientifically plausible. James Olds, a professor of computational neuroscience, who directs the Krasnow Institute for Advanced Study at George Mason University, was quoted in Carr's essay for his expertise, and upon the essay's publication Olds wrote a letter to the editor of The Atlantic in which he reiterated that the brain was "very plastic"—referring to the changes that occur in the organization of the brain as a result of experience. It was Olds' opinion that given the brain's plasticity it was "not such a long stretch to Carr's meme".[38] One of the pioneers in neuroplasticity research, Michael Merzenich, later added his own comment to the discussion, stating that he had given a talk at Google in 2008 in which he had asked the audience the same question that Carr asked in his essay. Merzenich believed that there was "absolutely no question that our brains are engaged less directly and more shallowly in the synthesis of information, when we use research strategies that are all about 'efficiency', 'secondary (and out-of-context) referencing', and 'once over, lightly'".[39] Another neuroscientist, Gary Small, director of UCLA's Memory & Aging Research Center, wrote a letter to the editor of The Atlantic in which he stated that he believed that "brains are developing circuitry for online social networking and are adapting to a new multitasking technology culture".[40]

Testimonials and refutations

In the media, there were many testimonials and refutations given by journalists for the first part of Carr's argument, regarding the capacity for concentration; treatments of the second part of his argument, regarding the capacity for contemplation, were, however, far rarer.[41] Although columnist Andrew Sullivan noted that he had little leisure time at his disposal for contemplation compared with when he grew up,[42] the anecdotes provided by journalists that indicated a deficiency in the capacity to contemplate were described only in the context of third parties, such as columnist Margaret Wente's anecdote about how one consultant had found a growing tendency in her clients to provide ill-considered descriptions of their technical problems.[41][43]

Columnist Leonard Pitts of The Miami Herald described his difficulty sitting down to read a book, in which he felt like he "was getting away with something, like when you slip out of the office to catch a matinee".[44] Technology evangelist Jon Udell admitted that, in his "retreats" from the Internet, he sometimes struggled to settle into "books, particularly fiction, and particularly in printed form".[45] He found portable long-form audio to be "transformative", however, because he can easily achieve "sustained attention", which makes him optimistic about the potential to "reactivate ancient traditions, like oral storytelling, and rediscover their powerful neural effects".[9][45]

Also writing in The Atlantic, a year after Carr, the futurist Jamais Cascio argued that human cognition has always evolved to meet environmental challenges, and that those posed by the internet are no different. He described the 'skimming' referred to by Carr as a form of attention deficit caused by the immaturity of filter algorithms: "The trouble isn't that we have too much information at our fingertips, but that our tools for managing it are still in their infancy... many of the technologies that Carr worries about were developed precisely to help us get some control over a flood of data and ideas. Google isn't the problem; it's the beginning of a solution."[46] Cascio has since modified his stance, conceding that, while the internet remains good at illuminating knowledge, it is even better at manipulating emotion. "If Carr wrote his Atlantic essay now [2020] with the title 'Is Facebook Making Us Stupid?' it would be difficult to argue in favor of 'No.'".[47]

Cascio's and Carr's articles have been discussed together in several places. Pew Research used them to form a tension-pair survey question that was distributed to noted academics. Most responded in detail, concurring with the proposition "Carr was wrong: Google does not make us stupid".[48] In The Googlization of Everything, Siva Vaidhyanathan tended to side with Carr. However, he thought both arguments relied too much on determinism: Carr in thinking that an over-reliance on Internet tools will inevitably cause the brain to atrophy, and Cascio in thinking that getting smarter is the necessary outcome of the evolutionary pressures he describes.[49] In From Gutenberg to Zuckerberg, John Naughton noted that, while many agreed Carr had hit on an important subject, his conclusions were not widely supported.[50]

Firmly contesting Carr's argument, journalist John Battelle praised the virtues of the web: "[W]hen I am deep in search for knowledge on the web, jumping from link to link, reading deeply in one moment, skimming hundreds of links the next, when I am pulling back to formulate and reformulate queries and devouring new connections as quickly as Google and the Web can serve them up, when I am performing bricolage in real time over the course of hours, I am 'feeling' my brain light up, I and [sic] 'feeling' like I'm getting smarter".[2][51] Web journalist Scott Rosenberg reported that his reading habits are the same as they were when he "was a teenager plowing [his] way through a shelf of Tolstoy and Dostoyevsky".[52] In book critic Scott Esposito's view, "responsible adults" have always had to deal with distractions, and, in his own case, he claimed to remain "fully able to turn down the noise" and read deeply.[31][41]

Analysis

Carr's critique of the rise of Internet-based computing raised the philosophical question of whether or not a society can control technological progress. At the online scientific magazine Edge, Wikipedia co-founder Larry Sanger argued that individual will was all that was necessary to maintain the cognitive capacity to read a book all the way through, and computer scientist and writer Jaron Lanier rebuked the idea that technological progress is an "autonomous process that will proceed in its chosen direction independently of us".[29] Lanier echoed a view stated by American historian Lewis Mumford in his 1970 book The Pentagon of Power, in which Mumford suggested that the technological advances that shape a society could be controlled if the full might of a society's free will were employed.[23][53] Lanier believed that technology was significantly hindered by the idea that "there is only one axis of choice", either pro- or anti-, when it comes to technology adoption.[29] Yet Carr had stated in The Big Switch that he believed an individual's personal choice toward a technology had little effect on technological progress.[23][54] According to Carr, the view expressed by Mumford about technological progress was incorrect because it regarded technology solely as advances in science and engineering rather than as an influence on the costs of production and consumption. Economics was a more significant consideration in Carr's opinion because, in a competitive marketplace, the most efficient methods of providing an important resource will prevail. As technological advances shape society, an individual might be able to resist the effects, but his lifestyle will "always be lonely and in the end futile"; despite a few holdouts, technology will nevertheless shape economics which, in turn, will shape society.[23][54]

A focus on literary reading

The selection of one particular quote in Carr's essay from pathologist Bruce Friedman, a member of the faculty of the University of Michigan Medical School, who commented on a developing difficulty reading books and long essays and specifically the novel War and Peace, was criticized for having a bias toward narrative literature. The quote failed to represent other types of literature, such as technical and scientific literature, which had, in contrast, become much more accessible and widely read with the advent of the Internet.[24][55] At the Britannica Blog, writer Clay Shirky pugnaciously observed that War and Peace was "too long, and not so interesting", further stating that "it would be hard to argue that the last ten years have seen a decrease in either the availability or comprehension of material on scientific or technical subjects".[36] Shirky's comments on War and Peace were derided by several of his peers as verging on philistinism.[25][56][57] In Shirky's defense, inventor W. Daniel Hillis asserted that, although books "were created to serve a purpose", that "same purpose can often be served by better means". While Hillis considered the book to be "a fine and admirable device", he imagined that clay tablets and scrolls of papyrus, in their time, "had charms of their own".[29] Wired magazine editor Kevin Kelly believed that the idea that "the book is the apex of human culture" should be resisted.[7] And Birkerts differentiated online reading from literary reading, stating that in the latter the reader is directed within themselves and enters "an environment that is nothing at all like the open-ended information zone that is cyberspace" in which he feels psychologically fragmented.[27][58]

Coping with abundance

Abundance of books makes men less studious.

Hieronimo Squarciafico, a 15th-century Venetian editor, bemoaning the printing press.[59][60]

Several critics theorized about the effects of the shift from scarcity to abundance of written material in the media as a result of the technologies introduced by the Internet. This shift was examined for its potential to lead individuals to a superficial comprehension of many subjects rather than a deep comprehension of just a few subjects. According to Shirky, an individual's ability to concentrate had been facilitated by the "relatively empty environment" which had ceased to exist when the wide availability of the web proliferated new media. Although Shirky acknowledged that the unprecedented quantity of written material available on the web might occasion a sacrifice of the cultural importance of many works, he believed that the solution was "to help make the sacrifice worth it".[36] In direct contrast, Sven Birkerts argued that "some deep comprehension of our inheritance [was] essential", and called for "some consensus vision among those shapers of what our society and culture might be shaped toward", warning against allowing the commercial marketplace to dictate the future standing of traditionally important cultural works.[61] While Carr found solace in Shirky's conceit that "new forms of expression" might emerge to suit the Internet, he considered this conceit to be one of faith rather than reason.[25] In a later response, Shirky continued to expound upon his theme that "technologies that make writing abundant always require new social structures to accompany them", explaining that Gutenberg's printing press led to an abundance of cheap books which were met by "a host of inventions large and small", such as the separation of fiction from non-fiction, the recognition of talents, the listing of concepts by indexes, and the practice of noting editions.[55]

Impact of the web on memory retention

As a result of the vast stores of information made accessible on the web, several critics pointed to a decrease in the desire to recall certain types of information, indicating, they believed, a change in the process of recalling information, as well as in the types of information that are recalled. According to Ben Worthen, a Wall Street Journal business technology blogger, the growing importance placed on the ability to access information, rather than the capacity to recall it straight from memory, would, in the long term, change the type of job skills that companies hiring new employees find valuable. Due to an increased reliance on the Internet, Worthen speculated that before long "the guy who remembers every fact about a topic may not be as valuable as the guy who knows how to find all of these facts and many others".[41][62] Evan Ratliff of Salon.com wondered if the usage of gadgets to recall phone numbers, as well as geographical and historical information, had the effect of releasing certain cognitive resources that in turn strengthened other aspects of cognition. Drawing parallels with transactive memory, a process whereby people remember things in relationships and groups, Ratliff mused that perhaps the web was "like a spouse who is around all the time, with a particular knack for factual memory of all varieties".[27] Far from conclusive, these ruminations left the web's impact on memory retention an open question.[27]

Themes and motifs

Effect of technology on the brain's neural circuitry

An 1878 model of the Malling-Hansen Writing Ball, which Nietzsche began using in 1882 when his poor eyesight made it difficult for him to write by hand[63][64]

In the essay, Carr introduces the discussion of the scientific support for the idea that the brain's neural circuitry can be rewired with an example in which philosopher Friedrich Nietzsche is said to have been influenced by technology. According to German scholar Friedrich A. Kittler in his book Gramophone, Film, Typewriter, Nietzsche's writing style became more aphoristic after he started using a typewriter. Nietzsche had begun using a Malling-Hansen Writing Ball because his failing eyesight had made it difficult for him to write by hand.[23][65] The idea that Nietzsche's writing style had changed for better or worse when he adopted the typewriter was disputed by several critics. Kevin Kelly and Scott Esposito each offered alternate explanations for the apparent changes.[29][31][66] Esposito believed that "the brain is so huge and amazing and enormously complex that it's far, far off base to think that a few years of Internet media or the acquisition of a typewriter can fundamentally rewire it."[31] In response to Esposito's point, neuroscientist James Olds stated that recent brain research demonstrated that it was "pretty clear that the adult brain can re-wire on the fly". In The New York Times it was reported that several scientists believed it was certainly plausible that the brain's neural circuitry may be shaped differently by regular Internet usage than by the reading of printed works.[6]

Although there was a consensus in the scientific community that the brain's neural circuitry can change through experience, the potential effect of web technologies on the brain's neural circuitry was unknown.[38][39] On the topic of the Internet's effect on reading skills, Guinevere F. Eden, director of the Center for the Study of Learning at Georgetown University, remarked that the question was whether or not the Internet changed the brain in a way that was beneficial to an individual.[6] Carr believed that the effect of the Internet on cognition was detrimental, weakening the ability to concentrate and contemplate. Olds cited the potential benefits of computer software that specifically targets learning disabilities, stating that some neuroscientists believed neuroplasticity-based software was beneficial in improving receptive language disorders.[38] Olds mentioned neuroscientist Michael Merzenich, who had formed several companies with his peers in which neuroplasticity-based computer programs had been developed to improve the cognitive functioning of children, adults, and the elderly.[38][67] In 1996, Merzenich and his peers had started a company called Scientific Learning, where neuroplasticity research was used to develop a computer training program called Fast ForWord that offered seven brain exercises to improve language impairments and learning disabilities in children.[68] Feedback on Fast ForWord showed that these brain exercises even had benefits for autistic children, an unexpected spillover effect that Merzenich attempted to harness by developing a modification of Fast ForWord specifically designed for autism.[69] At a subsequent company that Merzenich started, Posit Science, Fast ForWord-like brain exercises and other techniques were developed with the aim of sharpening the brains of elderly people by retaining their plasticity.[70]

HAL in 2001: A Space Odyssey

In Stanley Kubrick's 1968 science fiction film 2001: A Space Odyssey, astronaut David Bowman slowly disassembles the mind of an artificial intelligence named HAL by sequentially unplugging its memory banks. Carr likened the despair expressed by HAL as its mind is disassembled to his own cognitive difficulties, at the time, in engaging with long texts.[2] He felt as if someone was "tinkering with [his] brain, remapping the neural circuitry, reprogramming the memory".[24] HAL had also been used as a metaphor for the "ultimate search engine" in a PBS interview with Google co-founder Sergey Brin, as noted in Carr's book The Big Switch, and also in Brin's TED talk. Brin was comparing Google's ambitions of building an artificial intelligence to HAL, while dismissing the possibility that a bug like the one that led HAL to murder the occupants of the fictional spacecraft Discovery One could occur in a Google-based artificial intelligence.[23][71][72] Carr observed in his essay that throughout history technological advances have often necessitated new metaphors, such as the mechanical clock engendering the simile "like clockwork" and the computer age engendering the simile "like computers". Carr concluded his essay with an explanation of why he believed HAL was an appropriate metaphor for his argument. He observed that HAL showed genuine emotion as its mind was disassembled while, throughout the film, the humans aboard the spacecraft appeared to be automatons, thinking and acting as if they were following the steps of an algorithm. Carr believed that the film's prophetic message was that as individuals increasingly rely on computers for an understanding of their world, their intelligence may become more machinelike than human.[2][24]

Developing view of how Internet use affects cognition


The brain is very specialized in its circuitry and if you repeat mental tasks over and over it will strengthen certain neural circuits and ignore others.

— Gary Small, a professor at UCLA's Semel Institute for Neuroscience and Human Behavior.[73]

After the publication of Carr's essay, a developing view unfolded in the media as sociological and neurological studies surfaced that were relevant to determining the cognitive impact of regular Internet usage. Challenges to Carr's argument were made frequently. Kevin Kelly appealed to Carr and Birkerts, the two most outspoken detractors of electronic media, to each formulate a more precise definition of the faults they perceived in electronic media so that their beliefs could be scientifically verified.[74] While Carr firmly believed that his skepticism about the Internet's benefits to cognition was warranted,[25] he cautioned in both his essay and his book The Big Switch that long-term psychological and neurological studies were required to definitively ascertain how cognition develops under the influence of the Internet.[3][24][75]

Scholars at University College London conducted a study titled "Information Behaviour of the Researcher of the Future", the results of which suggested that students' research habits tended towards skimming and scanning rather than in-depth reading.[76] The study prompted serious reflection among educators about its implications for instruction.[77]

In October 2008, new insights into the effect of Internet usage on cognition came from the results, reported in a press release,[78] of a study conducted by UCLA's Memory and Aging Research Center. The study tested two groups of people between the ages of 55 and 76, only one of which consisted of experienced web users. Their brain activity was monitored with functional MRI scans while they read books or performed assigned search tasks. The scans revealed that reading and web searching both engage the same language, reading, memory, and visual regions of the brain; however, web searching stimulated additional decision-making and complex reasoning regions, with a twofold increase in activity in these regions in experienced web users compared with inexperienced web users.[79][80][81][82] Gary Small, the director of the UCLA center and lead investigator of the study, released the book iBrain: Surviving the Technological Alteration of the Modern Mind, co-authored with Gigi Vorgan, concurrently with the press release.

While one set of critics and bloggers used the UCLA study to dismiss the argument raised in Carr's essay,[83][84] another set took a closer look at the conclusions that could be drawn from the study concerning the effects of Internet usage.[85] Among the reflections on the study's possible interpretations were whether the greater breadth of brain activity while using the Internet, in comparison with reading a book, improved or impaired the quality of a reading session; and whether the decision-making and complex reasoning skills apparently involved in Internet search, according to the study, suggest a high quality of thought or simply the use of puzzle-solving skills.[86][87] Thomas Claburn, in InformationWeek, observed that the study's findings regarding the cognitive impact of regular Internet usage were inconclusive and stated that "it will take time before it's clear whether we should mourn the old ways, celebrate the new, or learn to stop worrying and love the Net".[5]

See also


References

  1. ^ Nicholas Carr (June 12, 2008). "Pages and "pages"". Rough Type. Retrieved November 1, 2008.
  2. ^ a b c d e f Bill Thompson (June 17, 2008). "Changing the way we think". BBC News.[permanent dead link]
  3. ^ a b c d Steve Johnson (June 18, 2008). "Read this if you're easily distracted lately". Chicago Tribune. Archived from the original on July 18, 2011. Retrieved February 10, 2009.
  4. ^ David Aaronovitch (August 13, 2008). "The internet shrinks your brain? What rubbish". The Times. Retrieved December 1, 2008.[dead link]
  5. ^ a b Thomas Claburn (October 15, 2008). "Is Google Making Us Smarter?: UCLA researchers report that searching the Internet may help improve brain function". InformationWeek. Archived from the original on October 27, 2008. Retrieved November 1, 2008.
  6. ^ a b c Motoko Rich (July 27, 2008). "Literacy Debate: Online, R U Really Reading?". The New York Times. Retrieved November 1, 2008.
  7. ^ a b Kevin Kelly (July 25, 2008). "The Fate of the Book (and a Question for Sven Birkerts)". Britannica Blog (originally posted at Kelly's blog The Technium). Archived from the original on January 9, 2020. Retrieved January 7, 2018.
  8. ^ a b Bernard Sharratt (December 18, 1994). "Are There Books in Our Future". The New York Times. Retrieved November 1, 2008.
  9. ^ a b John Naughton (June 22, 2008). "I Google, therefore I am losing the ability to think". The Observer. Retrieved October 20, 2008.
  10. ^ Birkerts 1994, pp. 17–20
  11. ^ a b Birkerts 1994, pp. 146–149
  12. ^ John Walsh and Kate Burt (September 14, 2008). "Can intelligent literature survive in the digital age?". The Independent. Retrieved October 20, 2008.
  13. ^ Birkerts 1994, pp. 138–139
  14. ^ William Leith (March 28, 2008). "We were never meant to read". The Daily Telegraph. Archived from the original on March 25, 2008. Retrieved November 1, 2008.
  15. ^ Wolf 2007, p. 17
  16. ^ Guy Dammann (March 28, 2008). "Stumbling over books". The Daily Telegraph. Archived from the original on April 10, 2008. Retrieved November 1, 2008.
  17. ^ Michael Dirda (September 2, 2007). "Reading is hard work for the brain, as this book proves". The Washington Post. Retrieved November 1, 2008.
  18. ^ Guy Dammann (April 5, 2008). "We're on a scroll". The Guardian. Retrieved November 1, 2008.
  19. ^ Maryanne Wolf (September 5, 2007). "Learning to think in a digital world". The Boston Globe. Retrieved November 1, 2008.
  20. ^ Maryanne Wolf (2007). "Reading Worrier". Powell's Books. Archived from the original on August 29, 2011. Retrieved October 13, 2007.
  21. ^ a b Veronica Rueckert (July 18, 2008). "Does spending time online change the way we think? (with guests Maryanne Wolf and Nicholas Carr)". Wisconsin Public Radio (Podcast).
  22. ^ a b Malcolm Ritter (December 3, 2008). "Scientists ask: Is technology rewiring our brains?". International Herald Tribune. Associated Press. Retrieved February 10, 2009.
  23. ^ a b c d e f Nicholas Carr (August 7, 2008). "'Is Google Making Us Stupid?': sources and notes". Rough Type. Retrieved November 1, 2008.
  24. ^ a b c d e f Carr, Nicholas (July 2008). "Is Google Making Us Stupid?". The Atlantic 301 (6). Retrieved October 6, 2008.
  25. ^ a b c d Nicholas Carr (July 17, 2008). "Why Skepticism is Good: My Reply to Clay Shirky". Britannica Blog. Archived from the original on November 12, 2017. Retrieved January 7, 2018.
  26. ^ Damon Darlin (September 20, 2008). "Technology Doesn't Dumb Us Down. It Frees Our Minds". The New York Times.
  27. ^ a b c d Evan Ratliff (August 14, 2008). "Are you losing your memory thanks to the Internet?". Salon.com.
  28. ^ "'Is Google Making Us Stupid?' (Britannica Forum: Your Brain Online)". Britannica Blog. July 17, 2008. Archived from the original on September 23, 2015. Retrieved November 1, 2008.
  29. ^ a b c d e "The Reality Club: On 'Is Google Making Us Stupid' by Nicholas Carr". Edge. July 10, 2008. Archived from the original on November 7, 2008. Retrieved November 1, 2008.
  30. ^ a b Andrew Keen (July 27, 2008). "Is the Internet killing the American reader?". The Great Seduction. Archived from the original on September 20, 2008. Retrieved October 15, 2008.
  31. ^ a b c d Scott Esposito (June 20, 2008). "Friday Column: Is Google Making Us Read Worse?". Conversational Reading. Archived from the original on March 7, 2009. Retrieved November 26, 2008.
  32. ^ Unger 2004, pp. 2–5
  33. ^ Wolf 2007, pp. 35–37
  34. ^ In a comment from Nicholas Carr Archived March 7, 2009, at the Wayback Machine on Book critic Scott Esposito's column concerning his criticism of Carr's usage of the term 'ideogram', Carr said: "As to 'ideogram,' I agree that there's debate on terminology, but in my article I decided to use the common term. The Oxford American Dictionary defines ideogram in this way: 'a written character symbolizing the idea of a thing without indicating the sounds used to say it, e.g., numerals and Chinese characters.'"
  35. ^ Seth Finkelstein (June 9, 2008). "Nick Carr: 'Is Google Making Us Stupid?', and Man vs. Machine". Infothought.
  36. ^ a b c Clay Shirky (July 17, 2008). "Why Abundance is Good: A Reply to Nick Carr". Britannica Blog. Archived from the original on January 2, 2018. Retrieved January 7, 2018.
  37. ^ David Wolman (August 18, 2008). "The Critics Need a Reboot. The Internet Hasn't Led Us Into a New Dark Age". Wired.
  38. ^ a b c d Andrew Sullivan (June 20, 2008). "Not So Google Stoopid, Ctd". The Daily Dish. Archived from the original on March 5, 2009. Retrieved January 15, 2009.
  39. ^ a b Michael Merzenich (August 11, 2008). "Going googly". "On the Brain" blog. Posit Science Web site. Retrieved November 1, 2008.
  40. ^ Gary Small (October 2008). "Letters to the Editor: Our Brains on Google". The Atlantic. Retrieved November 1, 2008.
  41. ^ a b c d Compiled (with help from Google) by Evan R. Goldstein (July 11, 2008). "CRITICAL MASS: Your Brain on Google", The Chronicle of Higher Education. NOTE: Contains excerpts from columnist Margaret Wente, author Jon Udell, blogger Matthew Ingram, book critic Scott Esposito, blogger Seth Finkelstein, technology analyst Bill Thompson, blogger Ben Worthen, and senior editor Andrew Sullivan.
  42. ^ Andrew Sullivan (June 15, 2008). "Google is giving us pond-skater minds". The Times. Archived from the original on October 7, 2008. Retrieved November 1, 2008.
  43. ^ Margaret Wente (June 17, 2008). "How Google ate my brain". The Globe and Mail. Retrieved July 1, 2008.
  44. ^ Leonard Pitts, Jr. (June 15, 2008). "Reader finds satisfaction in a good read". Miami Herald. Retrieved February 10, 2009.
  45. ^ a b Jon Udell (June 10, 2008). "A quiet retreat from the busy information commons". Strategies for Internet citizens. Retrieved November 1, 2008.
  46. ^ Cascio, Jamais (July 2009). "Get Smarter". The Atlantic Monthly. Retrieved February 4, 2016.
  47. ^ Anderson, Dan (December 2020). "New Imagining the Internet report: Digital Life 2020". Elon University. Retrieved February 9, 2021.
  48. ^ Janna Anderson; Lee Rainie (February 19, 2010). "Future of the Internet IV. Part 1: A review of responses to a tension pair about whether Google will make people stupid". Pew Research Center. Retrieved February 4, 2016.
  49. ^ Siva Vaidhyanathan (March 13, 2012). The Googlization of Everything: (And Why We Should Worry). University of California Press. pp. 181–. ISBN 978-0-520-95245-4. Retrieved January 27, 2016.
  50. ^ John Naughton (December 22, 2011). From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet. Quercus Publishing. pp. 26–. ISBN 978-0-85738-547-5.
  51. ^ John Battelle (June 10, 2008). "Google: Making Nick Carr Stupid, But It's Made This Guy Smarter". Searchblog. Retrieved November 1, 2008.
  52. ^ Scott Rosenberg (June 11, 2008). "Nick Carr's new knock on the Web: does it change how we read?". Wordyard. Retrieved November 1, 2008.
  53. ^ Carr 2008, pp. 21–22
  54. ^ a b Carr 2008, pp. 22–23
  55. ^ a b Clay Shirky (July 21, 2008). "Why Abundance Should Breed Optimism: A Second Reply to Nick Carr". Britannica Blog. Archived from the original on November 14, 2017. Retrieved January 7, 2018.
  56. ^ Larry Sanger (July 18, 2008). "A Defense of Tolstoy & the Individual Thinker: A Reply to Clay Shirky". Britannica Blog. Archived from the original on July 11, 2018. Retrieved January 7, 2018.
  57. ^ Larry Sanger (July 30, 2008). "The Internet and the Future of Civilization". Britannica Blog. Archived from the original on January 8, 2018. Retrieved January 7, 2018.
  58. ^ Sven Birkerts (July 25, 2008). "Reading in the Open-ended Information Zone Called Cyberspace: My Reply to Kevin Kelly". Britannica Blog. Archived from the original on January 8, 2018. Retrieved January 7, 2018.
  59. ^ Ong 1982, p. 79
  60. ^ Lowry 1979, pp. 29–31
  61. ^ Sven Birkerts (July 18, 2008). "A Know-Nothing's Defense of Serious Reading & Culture: A Reply to Clay Shirky". Britannica Blog. Archived from the original on January 8, 2018. Retrieved January 7, 2018.
  62. ^ Ben Worthen (July 11, 2008). "Does the Internet Make Us Think Different?". The Wall Street Journal.
  63. ^ Kwansah-Aidoo 2005, pp. 100–101
  64. ^ "Friedrich Nietzsche and his typewriter - a Malling-Hansen Writing Ball". The International Rasmus Malling-Hansen Society. Retrieved November 26, 2008.
  65. ^ Kittler 1999, pp. 203, 206
  66. ^ Kevin Kelly (June 11, 2008). "Will We Let Google Make Us Smarter?". The Technium. Retrieved November 1, 2008.
  67. ^ Doidge 2007, pp. 45–48
  68. ^ Doidge 2007, pp. 70–72
  69. ^ Doidge 2007, pp. 74–83
  70. ^ Doidge 2007, pp. 84–91
  71. ^ Carr 2008, p. 213
  72. ^ Spencer Michaels, "The Search Engine that Could Archived January 22, 2014, at the Wayback Machine", The NewsHour with Jim Lehrer, November 29, 2002.
  73. ^ Belinda Goldsmith (October 27, 2008). "Is surfing the Internet altering your brain?". Reuters. Retrieved November 1, 2008.
  74. ^ Kevin Kelly (July 25, 2008). "Time to Prove the Carr Thesis: Where's the Science?". Britannica Blog. Archived from the original on September 21, 2015. Retrieved November 1, 2008.
  75. ^ Carr 2008, p. 227
  76. ^ "Information Behaviour of the Researcher of the Future: A Ciber Briefing Paper" (PDF). University College London. January 11, 2008. Archived from the original (pdf) on October 17, 2012.
  77. ^ Meris Stansbury (October 15, 2008). "Rethinking research in the Google era: Educators ponder how the internet has changed students' reading habits". eSchool News. Archived from the original on October 20, 2008. Retrieved November 1, 2008.
  78. ^ Rachel Champeau (October 14, 2008). "UCLA study finds that searching the Internet increases brain function" (Press release). UCLA Newsroom. Retrieved November 1, 2008.
  79. ^ Mary Brophy Marcus (October 15, 2008). "Internet search results: Increased brain activity". USA Today. Retrieved November 1, 2008.
  80. ^ Madison Park (October 14, 2008). "Study: Google does a brain good". CNN. Retrieved October 20, 2008.
  81. ^ Jeneen Interlandi (October 14, 2008). "Reading This Will Change Your Brain". Newsweek. Retrieved November 1, 2008.
  82. ^ "Internet use 'good for the brain'". BBC News. October 14, 2008. Retrieved November 1, 2008.
  83. ^ John Battelle (October 14, 2008). "Google Makes You Smarter? Hey, Who Said That?". Searchblog. Retrieved November 1, 2008.
  84. ^ Betsy Schiffman (October 15, 2008). "Study: Google Makes You Smart". Wired. Retrieved November 1, 2008.
  85. ^ Steve Johnson (October 28, 2008). "Searching for meaning in brain scans of seniors". Chicago Tribune. Retrieved November 1, 2008.
  86. ^ Nicholas Carr (October 17, 2008). "Googling and intelligence". Rough Type. Retrieved November 1, 2008.
  87. ^ "Is Google Making Us Smarter?". The New York Times. October 16, 2008. Retrieved November 1, 2008.

Bibliography


Further reading
