Nicholas Carr Famous Quotes
In one recent experiment, Damasio and his colleagues had subjects listen to stories describing people experiencing physical or psychological pain. The subjects were then put into a magnetic resonance imaging machine and their brains were scanned as they were asked to remember the stories. The experiment revealed that while the human brain reacts very quickly to demonstrations of physical pain - when you see someone injured, the primitive pain centers in your own brain activate almost instantaneously - the more sophisticated mental process of empathizing with psychological suffering unfolds much more slowly. It takes time, the researchers discovered, for the brain "to transcend immediate involvement of the body" and begin to understand and to feel "the psychological and moral dimensions of a situation." (p220)
Culture is sustained in our synapses ... It's more than what can be reduced to binary code and uploaded onto the Net. To remain vital, culture must be renewed in the minds of the members of every generation. Outsource memory, and culture withers.
Google is neither God nor Satan, and if there are shadows in the Googleplex they're no more than delusions of grandeur. What's disturbing about the company's founders is not their boyish desire to create an amazingly cool machine that will be able to outthink its creators, but the pinched conception of the human mind that gives rise to such a desire.
There is no economic law that says that everyone, or even most people, automatically benefit from technological progress.
He's not seeking some greater truth beyond the work. The work is the truth.
In the quiet spaces opened up by the prolonged, undistracted reading of a book, people made their own associations, drew their own inferences and analogies, fostered their own ideas. They thought deeply as they read deeply.
[Patricia Greenfield] concluded that "every medium develops some cognitive skills at the expense of others." Our growing use of the Net and other screen-based technologies has led to the "widespread and sophisticated development of visual-spatial skills." We can, for example, rotate objects in our minds better than we used to be able to. But our "new strengths in visual-spatial intelligence" go hand in hand with a weakening of our capacities for the kind of "deep processing" that underpins "mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection."
During the twentieth century, neuroscientists and psychologists also came to more fully appreciate the astounding complexity of the human brain. Inside our skulls, they discovered, are some 100 billion neurons, which take many different shapes and range in length from a few tenths of a millimeter to a few feet.4 A single neuron typically has many dendrites (though only one axon), and dendrites and axons can have a multitude of branches and synaptic terminals. The average neuron makes about a thousand synaptic connections, and some neurons can make a hundred times that number.
As social concerns override literary ones, writers seem fated to eschew virtuosity and experimentation in favor of a bland but immediately accessible style. Writing will become a means for recording chatter.
When, in an 1892 lecture before a group of teachers, William James declared that "the art of remembering is the art of thinking," he was stating the obvious.14 Now, his words seem old-fashioned. Not only has memory lost its divinity; it's well on its way to losing its humanness. Mnemosyne has become a machine.
The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits. In the worst cases, the mind essentially trains itself to be sick. Many addictions, too, are reinforced by the strengthening of plastic pathways in the brain. Even very small doses of addictive drugs can dramatically alter the flow of neurotransmitters in a person's synapses, resulting in long-lasting alterations in brain circuitry and function. In some cases, the buildup of certain kinds of neurotransmitters, such as dopamine, a pleasure-producing cousin to adrenaline, seems to actually trigger the turning on or off of particular genes, bringing even stronger cravings for the drug. The vital path turns deadly.
Although neuroplasticity provides an escape from genetic determinism, a loophole for free thought and free will, it also imposes its own form of determinism on our behavior. As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit. The paradox of neuroplasticity, observes Doidge, is that, for all the mental flexibility it grants us, it can end up locking us into "rigid behaviors."33 The chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they've formed. Once we've wired new circuitry in our brain, Doidge writes, "we long to keep it activated."
Jordan Grafman, head of the cognitive neuroscience unit at the National Institute of Neurological Disorders and Stroke, explains that the constant shifting of our attention when we're online may make our brains more nimble when it comes to multitasking, but improving our ability to multitask actually hampers our ability to think deeply and creatively.
Instead of requiring us to puzzle out where we are in an area, a GPS device simply sets us at the center of the map and then makes the world circulate around us.
The brain's plasticity is not limited to the somatosensory cortex, the area that governs our sense of touch. It's universal. Virtually all of our neural circuits - whether they're involved in feeling, seeing, hearing, moving, thinking, learning, perceiving, or remembering - are subject to change. The received wisdom is cast aside.
In the long run a medium's content matters less than the medium itself in influencing how we think and act. As our window onto the world, and onto ourselves, a popular medium molds what we see and how we see it-and eventually, if we use it enough, it changes who we are, as individuals and as a society.
Technology isn't what makes us "post-human" or "transhuman," as some writers and scholars have recently suggested. It's what makes us human. Technology is in our nature. Through our tools we give our dreams form. We bring them into the world. The practicality of technology may distinguish it from art, but both spring from a similar, distinctly human yearning.
Experiments show that just as the brain can build new or stronger circuits through physical or mental practice, those circuits can weaken or dissolve with neglect.
Another experiment, conducted by Pascual-Leone when he was a researcher at the National Institutes of Health, provides even more remarkable evidence of the way our patterns of thought affect the anatomy of our brains. Pascual-Leone recruited people who had no experience playing a piano, and he taught them how to play a simple melody consisting of a short series of notes. He then split the participants into two groups. He had the members of one group practice the melody on a keyboard for two hours a day over the next five days. He had the members of the other group sit in front of a keyboard for the same amount of time but only imagine playing the song--without ever touching the keys. Using a technique called transcranial magnetic stimulation, or TMS, Pascual-Leone mapped the brain activity of all the participants before, during, and after the test. He found that the people who had only imagined playing the notes exhibited precisely the same changes in their brains as those who had actually pressed the keys. Their brains had changed in response to actions that took place purely in their imaginations--in response, that is, to their thoughts. Descartes may have been wrong about dualism, but he appears to have been correct in believing that our thoughts can exert a physical influence on, or at least cause a physical reaction in, our brains. We become, neurologically, what we think. (p33)
Try reading a book while doing a crossword puzzle; that's the intellectual environment of the Internet.
When a printed book - whether a recently published scholarly history or a two-hundred-year-old Victorian novel - is transferred to an electronic device connected to the Internet, it turns into something very like a Web site. Its words become wrapped in all the distractions of the networked computer. Its links and other digital enhancements propel the reader hither and yon. It loses what the late John Updike called its "edges" and dissolves into the vast, rolling waters of the Net. The linearity of the printed book is shattered, along with the calm attentiveness it encourages in the reader.
In a talk at a recent Phi Beta Kappa meeting, Duke University professor Katherine Hayles confessed, "I can't get my students to read whole books anymore."10 Hayles teaches English; the students she's talking about are students of literature.
The brighter the software, the dimmer the user.
You can take a book to the beach without worrying about sand getting in its works. You can take it to bed without being nervous about it falling to the floor should you nod off. You can spill coffee on it. You can sit on it. You can put it down on a table, open to the page you're reading, and when you pick it up a few days later it will still be exactly as you left it. You never have to be concerned about plugging a book into an outlet or having its battery die.
The mechanical clock changed the way we saw ourselves. And like the map, it changed the way we thought. Once the clock had redefined time as a series of units of equal duration, our minds began to stress the methodical mental work of division and measurement.
Although we don't tend to think of libraries as media technologies, they are. The public library is, in fact, one of the most important and influential informational media ever created - and one that proliferated only after the arrival of silent reading and movable-type printing. A community's attitudes and preferences toward information take concrete shape in its library's design and services. [ ... ] The library provides, as well, a powerful symbol of our new media landscape: at the center stands the screen of the Internet-connected computers; the printed word has been pushed to the margins.
What determines what we remember and what we forget? The key to memory consolidation is attentiveness. Storing explicit memories and, equally important, forming connections between them requires strong mental concentration, amplified by repetition or by intense intellectual or emotional engagement. The sharper the attention, the sharper the memory. "For a memory to persist," writes Kandel, "the incoming information must be thoroughly and deeply processed. This is accomplished by attending to the information and associating it meaningfully and systematically with knowledge already well established in memory."35 If we're unable to attend to the information in our working memory, the information lasts only as long as the neurons that hold it maintain their electric charge - a few seconds at best. Then it's gone, leaving little or no trace in the mind.
The mounting evidence of an erosion of skills, a dulling of perceptions, and a slowing of reactions should give us all pause. As we begin to live our lives inside glass cockpits, we seem fated to discover what pilots already know: a glass cockpit can also be a glass cage.
Some of the test subjects were given cards that had both words printed in full, like this:
Hot: Cold
Others used cards that showed only the first letter of the second word, like this:
Hot: C
The people who used the cards with the missing letters performed much better in a subsequent test measuring how well they remembered the word pairs. Simply forcing their minds to fill in a blank, to act rather than observe, led to stronger retention of information.
As these companies had expanded their operations in the wake of the Industrial Revolution, they'd found it necessary to collect, store, and analyze ever larger amounts of data - on their customers, their finances, their employees, their inventories, and so on. Electrification allowed the companies to grow larger still, further expanding the information they had to process. This intellectual work became as important, and often as arduous, as the physical labor of manufacturing products and delivering services.
Calm, focused, undistracted, the linear mind is being pushed aside by a new kind of mind that wants and needs to take in and dole out information in short, disjointed, often overlapping bursts - the faster, the better.
The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can't even get started.
Our writing equipment takes part in the forming of our thoughts.
Lawyer and technology writer Richard Koman argued that Google has become a true believer in its own goodness, a belief which justifies its own set of rules regarding corporate ethics, anti-competition, customer service and its place in society.
We cannot go back to the lost oral world, any more than we can turn the clock back to a time before the clock existed.40 "Writing and print and the computer," writes Walter Ong, "are all ways of technologizing the word" and once technologized, the word cannot be de-technologized.41 But the world of the screen, as we're already coming to understand, is a very different place from the world of the page. A new intellectual ethic is taking hold. The pathways in our brains are once again being rerouted.
We've reached the point where a Rhodes Scholar like Florida State's Joe O'Shea - a philosophy major, no less - is comfortable admitting not only that he doesn't read books but that he doesn't see any particular need to read them.
Whenever we turn on our computer, we are plunged into an ecosystem of interruption technologies.
Our desire to segregate the mind's cogitations from the body's exertions reflects the grip that Cartesian dualism still holds on us. When we think about thinking, we're quick to locate our mind, and hence our self, in the gray matter inside our skull and to see the rest of the body as a mechanical life-support system that keeps the neural circuits charged. More than a fancy of philosophers like Descartes and his predecessor Plato, this dualistic view of mind and body as operating in isolation from each other appears to be a side effect of consciousness itself. Even though the bulk of the mind's work goes on behind the scenes, in the shadows of the unconscious, we're aware only of the small but brightly lit window that the conscious mind opens for us. And our conscious mind tells us, insistently, that it's separate from the body.
Should the Egyptians learn to write, Thamus goes on, "it will implant forgetfulness in their souls: they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks."
Donald T. Campbell explained in a renowned 1976 paper, "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."
We're still a long way from knowing where our clicks will lead us. But it's clear that two of the hopes most dear to the Internet optimists - that the Web will create a more bountiful culture and that it will promote greater harmony and understanding - should be treated with skepticism. Cultural impoverishment and social fragmentation seem equally likely outcomes.
THE TROUBLE with automation is that it often gives us what we don't need at the cost of what we do.
Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections - a new context. As Joseph LeDoux explains, "The brain that does the remembering is not the brain that formed the initial memory. In order for the old memory to make sense in the current brain, the memory has to be updated."30 Biological memory is in a perpetual state of renewal.
The Net grants us instant access to a library of information unprecedented in its size and scope, and it makes it easy for us to sort through that library - to find, if not exactly what we were looking for, at least something sufficient for our immediate purposes. What the Net diminishes is Johnson's primary kind of knowledge: the ability to know, in depth, a subject for ourselves, to construct within our own minds the rich and idiosyncratic set of connections that give rise to a singular intelligence.
Our brains turn into simple signal-processing units, quickly shepherding information into consciousness and then back out again.
In the YouTube economy, everyone is free to play, but only a few reap the rewards.
What the Net seems to be doing is chipping away my capacity for concentration and contemplation. Whether I'm online or not, my mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
"A drawn line can be many things," he says, whereas a digitized line has to be just one thing.27
The Net's interactivity gives us powerful new tools for finding information, expressing ourselves, and conversing with others. It also turns us into lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment.
"Our conventional response to all media, namely that it is how they are used that counts, is the numb stance of the technological idiot," he wrote. The content of the medium is just "the juicy piece of meat carried by the burglar to distract the watchdog of the mind." (p4)
The old botanical metaphors for memory, with their emphasis on continual, indeterminate organic growth, are, it turns out, remarkably apt. In fact, they seem to be more fitting than our new, fashionably high-tech metaphors, which equate biological memory with the precisely defined bits of digital data stored in databases and processed by computer chips. Governed by highly variable biological signals, chemical, electrical, and genetic, every aspect of human memory - the way it's formed, maintained, connected, recalled - has almost infinite gradations. Computer memory exists as simple binary bits - ones and zeros - that are processed through fixed circuits, which can be either open or closed but nothing in between.
The mind of the experienced book reader is a calm mind, not a buzzing one. When it comes to the firing of our neurons, it's a mistake to assume that more is better.
Looking ahead to future applications of electronics, [de Forest] grew even gloomier. He believed that 'electron physiologists' would eventually be able to monitor and analyze 'thought or brain waves', allowing 'joy and grief to be measured in definite, quantitative units.' Ultimately, he concluded, 'a professor may be able to implant knowledge into the reluctant brains of his 22nd century pupils. What terrifying political possibilities may be lurking there! Let us be thankful that such things are only for posterity, not for us.'
All technological change is generational change.
The value of a well-made and well-used tool lies not only in what it produces for us but what it produces in us.
The more you multitask, the less deliberative you become; the less able to think and reason out a problem.
The bond between book reader and book writer has always been a tightly symbiotic one, a means of intellectual and artistic cross-fertilization. The words of the writer act as a catalyst in the mind of the reader, inspiring new insights, associations, and perceptions, sometimes even epiphanies. And the very existence of the attentive, critical reader provides the spur for the writer's work. It gives the author confidence to explore new forms of expression, to blaze difficult and demanding paths of thought, to venture into uncharted and sometimes hazardous territory. "All great men have written proudly, nor cared to explain," said Emerson. "They knew that the intelligent reader would come at last, and would thank them."
When our brain is overtaxed, we find distractions more distracting.
The mind is not sealed in the skull but extends throughout the body. We think not only with our brain but also with our eyes and ears, nose and mouth, limbs and torso. And when we use tools to extend our grasp, we think with them as well. "Thinking, or knowledge-getting, is far from being the armchair thing it is often supposed to be," wrote the American philosopher and social reformer John Dewey in 1916. "Hands and feet, apparatus and appliances of all kinds are as much a part of it as changes in the brain."51 To act is to think, and to think is to act.
To date, there is no strong empirical support for claims that automating medical record keeping will lead to major reductions in health-care costs or significant improvements in the well-being of patients. But if doctors and patients have seen few benefits from the scramble to automate record keeping, the companies that supply the systems have profited. Cerner Corporation, a medical software outfit, saw its revenues triple, from $1 billion to $3 billion, between 2005 and 2013. Cerner, as it happens, was one of five corporations that provided RAND with funding for the original 2005 study. The other sponsors, which included General Electric and Hewlett Packard, also have substantial business interests in health-care automation. As today's flawed systems are replaced or upgraded in the future, to fix their interoperability problems and other shortcomings, information technology companies will reap further windfalls.
We want to be interrupted, because each interruption brings us a valuable piece of information. To turn off these alerts is to risk feeling out of touch, or even socially isolated.
"But except in rare circumstances, you can train until you're blue in the face and you'd never be as good as if you just focused on one thing at a time." What we're doing when we multitask "is learning to be skillful at a superficial level." The Roman philosopher Seneca may have put it best two thousand years ago: "To be everywhere is to be nowhere."
One of its major recent thrusts has been to place a greater priority on what it calls the "freshness" of the pages it recommends. Google not only identifies new or revised Web pages much more quickly than it used to - it now checks the most popular sites for updates every few seconds rather than every few days - but for many searches it skews its results to favor newer pages over older ones.
The intellectual ethic of a technology is rarely recognized by its inventors. They are usually so intent on solving a particular problem or untangling some thorny scientific or engineering dilemma that they don't see the broader implications of their work. The users of the technology are also usually oblivious to its ethic. They, too, are concerned with the practical benefits they gain from employing the tool. Our ancestors didn't develop or use maps in order to enhance their capacity for conceptual thinking or to bring the world's hidden structures to light. Nor did they manufacture mechanical clocks to spur the adoption of a more scientific mode of thinking. These were by-products of the technologies. But what by-products! Ultimately, it's an invention's intellectual work ethic that has the most profound effect on us.
The near-continuous stream of new information pumped out by the Web also plays to our natural tendency to "vastly overvalue what happens to us right now," as Union College psychologist Christopher Chabris explains. We crave the new even when we know that "the new is more often trivial than essential.
Insull was a serious and driven young man. Born into a family of temperance crusaders, he spent his boyhood poring over books with titles like Lives of the Great Engineers and Self-Help.
When an inscrutable technology becomes an invisible technology, we would be wise to be concerned. At that point, the technology's assumptions and intentions have infiltrated our own desires and actions. We no longer know whether the software is aiding us or controlling us. We're behind the wheel, but we can't be sure who's driving.
Google, as the supplier of the Web's principal navigational tools, also shapes our relationship with the content that it serves up so efficiently and in such profusion. The intellectual technologies it has pioneered promote the speedy, superficial skimming of information and discourage any deep, prolonged engagement with a single argument, idea, or narrative.
The faster we surf across the surface of the Web - the more links we click and pages we view - the more opportunities Google gains to collect information about us and to feed us advertisements. Its advertising system, moreover, is explicitly designed to figure out which messages are most likely to grab our attention and then to place those messages in our field of view. Every click we make on the Web marks a break in our concentration, a bottom-up disruption of our attention - and it's in Google's economic interest to make sure we click as often as possible.
Quite a few people still listen to vinyl records, use film cameras to take photographs, and look up phone numbers in the printed Yellow Pages. But the old technologies lose their economic and cultural force. They become progress's dead ends. It's the new technologies that govern production and consumption, that guide people's behavior and shape their perceptions. That's why the future of knowledge and culture no longer lies in books or newspapers or TV shows or radio programs or records or CDs. It lies in digital files shot through our universal medium at the speed of light.