Question Center

Edge 320 — June 21, 2010
11,130 words


By John Brockman


By Stewart Brand
Introduction By Kevin Kelly

By Steven Pinker


Nicholas Carr, Douglas Rushkoff, Evgeny Morozov
on Steven Pinker's "Mind Over Mass Media"


New Scientist (UK), Poder 360 (Chile), Die Welt (Germany), L'Actualite (France), Gulf News (UAE)



[EDITOR'S NOTE: Last month I received an email from Melissa Ludtke, editor of Nieman Reports:

Writing to you as the editor of Nieman Reports, www.niemanreports.com, certainly not the most trendy Web site you've ever seen, but we hope one offering something of value, primarily I suspect for journalists, though a few others venture our way, too.

Heading toward our Summer 2010 issue — in the planning stage now, and so I'd welcome the chance to talk with you. Topic: your 2010 "big question" — many of the answers to which I've read on your Web site — which draws a direct line to the core of what we are going to be exploring — through the voices and experiences of journalists and others — in the Summer 2010 issue of our magazine, to be published in June.

The edition has been published, and my essay on the Edge Question, along with pieces by Nicholas Carr, Douglas Rushkoff, Sherry Turkle, and Esther Wojcicki, is available at the link below in the Summer 2010 issue of Nieman Reports, a lively, timely, and interesting publication. — JB]

[Our sister publication Nieman Reports is out with its latest issue, and its focus is the new digital landscape of journalism. There are lots of interesting articles, and we'll be highlighting a few here over the next few days. Here, John Brockman writes about how he came to ask a passel of intellectual luminaries how the Internet is changing how they think. —Josh]



Edge posed this question; discover how a wide range of thinkers responded.

By John Brockman

As each new year approaches, John Brockman, founder of Edge, an online publication, consults with three of the original members of Edge—Stewart Brand, founder and editor of Whole Earth Catalog; Kevin Kelly, who helped to launch Wired in 1993 and wrote "What Technology Wants," a book to be published in October (Viking Penguin); and George Dyson, a science historian who is the author of several books including "Darwin Among the Machines." Together they create the Edge Annual Question—which Brockman then sends out to the Edge list to invite responses. He receives these commentaries by e-mail, which are then edited. Edge is a read-only site. There is no direct posting nor is Edge open for comments.

Brockman has been asking an Edge Annual Question for the past 13 years. In this essay, he explains what makes a question a good one to ask and shares some responses to this year's question: "How is the Internet changing the way you think?"


It's not easy coming up with a question. As the artist James Lee Byars used to say: "I can answer the question, but am I bright enough to ask it?" Edge is a conversation. We are looking for questions that inspire answers we can't possibly predict. Surprise me with an answer I never could have guessed. My goal is to provoke people into thinking thoughts that they normally might not have.

The art of a good question is to find a balance between abstraction and the personal, to ask a question that has many answers, or at least one for which you don't know the answer. It's a question distant enough to encourage abstractions and not so specific that it's about breakfast. A good question encourages answers that are grounded in experience but bigger than that experience alone.

Before we arrived at the 2010 question, we went through several months of considering other questions. Eventually I came up with the idea of asking how the Internet is affecting the scientific work, lives, minds and reality of the contributors. Kevin Kelly responded:

John, you pioneered the idea of asking smart folks what question they are asking themselves. Well I've noticed in the past few years there is one question everyone on your list is asking themselves these days and that is, is the Internet making me smarter or stupid? Nick Carr tackled the question on his terms, but did not answer it for everyone. In fact, I would love to hear the Edge list tell me their version: Is the Internet improving them or improving their work, and how is it changing how they think? I am less interested in the general "us" and more interested in the specific "you"—how it is affecting each one personally. Nearly every discussion I have with someone these days will arrive at this question sooner or later. Why not tackle it head on?

And so we did.

Yet, we still had work to do in framing our question. When people respond to "we" questions, their words tend to resemble expert papers, public pronouncements, or talks delivered from a stage. "You" leads us to share specifics of our lived experience. The challenge then is to not let responses slip into life's more banal details.

For us, discussion revolved around whether we'd ask "Is the Internet changing the way we think?" or probe this topic with a "you" focused question. Steven Pinker, Harvard research psychologist, author of "The Language Instinct" and "The Blank Slate," and one of several distinguished scientists I consult, advised heading in the direction of "us."

I very much like the idea of the Edge Question, but would suggest one important change—that it be about "us," not "me." The "me" question is too easy—if people really thought that some bit of technology was making their minds or their lives worse, they could always go back to the typewriter, or the Britannica, or the US Postal Service. The tough question is "us": if every individual makes a choice that makes him or her better off, could there be knock-on effects that make the culture as a whole worse off (what the economists call "externalities")?

Ultimately it's my call, so I decided to go with the "you" question in the hope that it would attract a wider range of individualistic responses. In my editorial marching orders to contributors, I asked them to think about the Internet—a much bigger subject than the Web, recalling that in 1996 computer scientist and visionary W. Daniel Hillis presciently observed the difference:

Many people sense this, but don't want to think about it because the change is too profound. Today, on the Internet the main event is the Web. A lot of people think that the Web is the Internet, and they're missing something. The Internet is a brand-new fertile ground where things can grow, and the Web is the first thing that grew there. But the stuff growing there is in a very primitive form. The Web is the old media incorporated into the new medium. It both adds something to the Internet and takes something away.

Early Responders

Framing the question and setting a high bar for responses is critical. Before launching the question to the entire Edge list, I invited a dozen or so people who I believed would have something interesting to say; their responses would seed the site and encourage the wider group to think in surprising ways. Here are some of these early responses:

  • Playwright Richard Foreman asks about the replacement of complex inner density with a new kind of self, evolving under the pressure of information overload and the technology of the instantly available. Is it a new self? Are we becoming Pancake People—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button?

  • Technology analyst Nicholas Carr, who wrote The Atlantic cover story, "Is Google Making Us Stupid?," asks whether the use of the Web has made it impossible for us to read long pieces of writing.

  • Social software guru Clay Shirky says the answer is "'too soon to tell.' This isn't because we can't see some of the obvious effects already, but because the deep changes will be manifested only when new cultural norms shape what the technology makes possible. ... The Internet's primary effect on how we think will only reveal itself when it affects the cultural milieu of thought, not just the behavior of individual users."

  • Web 2.0 pioneer Tim O'Reilly ponders if ideas themselves are the ultimate social software. Do they evolve via the conversations we have with each other, the artifacts we create, and the stories we tell to explain them?

  • Stewart Brand, founder of Whole Earth Catalog, cannot function without the major players in his social extended mind—his guild. "How I think is shaped to a large degree by how they think," he writes. "Thanks to my guild's Internet-mediated conversation, my neuronal thinking is enhanced immeasurably by our digital thinking."

  • Hillis goes a step further by asking if the Internet will, in the long run, arrive at a much richer infrastructure in which ideas can potentially evolve outside of human minds. In other words, can we change the way the Internet thinks?

The Conversation

The 2010 question elicited, in all, 172 essays that comprised a 132,000-word manuscript published online by Edge in January.

One contributor speaks about a new type of mind, amplified by the Internet, evolving, and able to start a new phase of evolution outside of the body. In "Net Gain," evolutionary biologist Richard Dawkins looks 40 years into the future when "retrieval from the communal exosomatic memory will become dramatically faster, and we shall rely less on the memory in our skulls." Nassim Taleb, author of "The Black Swan," writes about "The Degradation of Predictability—and Knowledge" as he asks us to "consider the explosive situation: More information (particularly thanks to the Internet) causes more confidence and illusions of knowledge while degrading predictability."

Nick Bilton, lead writer of The New York Times's Bits blog, notes that "[the] Internet is not changing how we think. Instead, we are changing how the Internet thinks." Actor Alan Alda worries about "[speed] plus mobs. A scary combination." He wonders, "Is there an algorithm perking somewhere in someone's head right now that can act as a check against this growing hastiness and mobbiness?" New York Times columnist Virginia Heffernan writes that "we must keep on reading and not mistake new texts for new worlds, or new forms for new brains."

Numerous artists responded in enlightening ways, as their evocative headlines suggest.

My Favorites

I enjoyed the juxtaposition of responses by psychologist Steven Pinker, "Not At All," and Chinese artist and cultural activist Ai Weiwei, "I Only Think on the Internet." The response I most admired is George Dyson's "Kayaks vs. Canoes." It is a gem:

In the North Pacific Ocean, there were two approaches to boatbuilding. The Aleuts (and their kayak-building relatives) lived on barren, treeless islands and built their vessels by piecing together skeletal frameworks from fragments of beach-combed wood. The Tlingit (and their dugout canoe-building relatives) built their vessels by selecting entire trees out of the rainforest and removing wood until there was nothing left but a canoe.

The Aleut and the Tlingit achieved similar results—maximum boat/minimum material—by opposite means. The flood of information unleashed by the Internet has produced a similar cultural split. We used to be kayak builders, collecting all available fragments of information to assemble the framework that kept us afloat. Now, we have to learn to become dugout-canoe builders, discarding unnecessary information to reveal the shape of knowledge hidden within.

I was a hardened kayak builder, trained to collect every available stick. I resent having to learn the new skills. But those who don't will be left paddling logs, not canoes.

What do you think?


Origins of Edge

The Edge project was inspired by the 1971 failed art experiment entitled "The World Question Center" by the late James Lee Byars, John Brockman's friend and sometime collaborator. Byars believed that to arrive at an axiology of societal knowledge it was pure folly to go to Widener Library at Harvard and read six million volumes. Instead, he planned to gather the 100 most brilliant minds in the world in a room, lock them behind closed doors, and have them ask each other the questions they were asking themselves. The expected result, in theory, was to be a synthesis of all thought. But it didn't work out that way. Byars identified his 100 most brilliant minds and called each of them. The result: 70 people hung up on him.

A decade later, Brockman picked up on the idea and founded The Reality Club, which, in 1997, went online, rebranded as Edge.


By Stewart Brand

An afterword blurs a book in time. My final draft of April 2009 is here made unfinal. And what you have here is only a sample of the time smear I'm attempting with the online version of the book at www.sbnotes.com, where the text (much of it) dwells in a living thicket of its origins and implications. Instead of static footnotes there are live links to my sources, including some better ones that turned up after the writing. You should be able to follow my quotes upstream to the articles and Google Books pages they come from. There you can conduct your own version of my research and perhaps draw different conclusions. I continue to add updates in the margins of the text, along with pages of photographs, diagrams, and videos, plus the kind of additions that usually go in an appendix. I'll try to maintain the service as long as it has traffic. Maybe all nonfiction books will soon offer such online immersive versions of their material.

Foreword to Afterword
By Kevin Kelly

Information wants to be free, but it doesn't want to be final. The merry superconductivity of a bit of information means that updates, corrections, additions, deletions, re-interpretations, misinterpretations, anti-information, and denials of that same bit quickly follow.

The blessing, and curse, of a printed paper book is that its words, once stamped in ink, are fixed. But the rest of our fast-forward lives, and the slippery digital universe we swim in, tear at that fixity and demand that books keep improving, just like our iPhones do. Can books be upgraded?

Many readers of Stewart Brand's recent book, Whole Earth Discipline, praise it for its heretical synthesis of "edgy" ideas on a wide range of frontiers. And that it is. But I found Brand's book far more interesting as a case study in how one can use information to adopt a permanent, mindful stance of flexibility. On every vector within his book, Brand traced how his thinking was changed by a steady stream of informational evidence. Sometimes he altered his position more than once. The thrill of the book was watching how a top-notch thinker kept upgrading his views.

Whole Earth Discipline was published in the autumn of 2009. Nine months later, whole worlds of science have lurched forward, digital news has accelerated, and "what we know" is now different. If information wants to change, shouldn't an author have different ideas from the now-frozen book he previously wrote?

Someday keeping a text constantly fresh will become both routinely possible and a chore for all of us. While a few authors and publishers have successfully created eternal ebooks, Brand has written a marvelous Afterword to his book which does several things. First, in great detail it updates the news he first reported. This update is so well written that it can be appreciated even if you have not read the original book. But more importantly, and most remarkably, Brand courageously indicates how this news has changed his mind since he wrote the book.

When the liquid containers of electronic texts demand that we revise them yet again, I hope we can use Stewart Brand's "Afterword" as an inspiration to not only upgrade our facts, but also upgrade our made-up minds.

— Kevin Kelly, Editor-At-Large, Wired; Author, What Technology Wants

STEWART BRAND is cofounder and co-chairman of The Long Now Foundation. He is the founder of the Whole Earth Catalog, cofounder of The Well, and cofounder of Global Business Network.

He is the original editor of The Whole Earth Catalog (winner of the National Book Award). The Afterword is written for the paperback edition of his latest book, Whole Earth Discipline: An Ecopragmatist Manifesto, which will be published in September.

Stewart Brand's Edge Bio Page

Kevin Kelly's Edge Bio Page



May 2010 . . .



What belongs in an afterword? I did promise in this book that I would change my mind as needed, and I can already report a couple of such veerings. Of course history that has moved on from what I described in 2009 should be indicated. And books have come along that expound some of my topics better than I; I wish I'd had them in hand before.

Start, as the book does, with climate. In December 2009, the UN Climate Change Conference in Copenhagen was undermined by a suspiciously sophisticated hack of emails among climatologists at the University of East Anglia, England. Once again, climate change deniers dominated the public discourse and prevented action on greenhouse gases. I responded with a New York Times op-ed titled "Four Sides to Every Story," suggesting that it helps to distinguish four kinds of views about global warming according to whether they are driven mainly by ideology or by evidence. "Denialists" and "Skeptics" both have doubts about climate change, but only the science-based Skeptics change their opinions with changing evidence. Likewise, ideological "Calamatists" and scientific "Warners" are alarmed about climate, but only the Warners respond to contradictory evidence.

James Lovelock, for example, a Warner, has softened his sense of alarm about the pace of climate change. He is persuaded by "sensible skeptic" Garth Paltridge's book The Climate Caper (2009) that climate scientists have become overly politicized, and a paper in Science by Kevin Trenberth, head of Climate Analysis at the U.S. National Center for Atmospheric Research, led Lovelock to conclude, "The solar energy is coming in but much of it is going to some unknown destination. Sea level rise shows the Earth is warming as expected, but surface temperatures do not rise as they should." Something unknown appears to be slowing the rate of global warming.

In the first chapter of this book I emphasized the many unknowns in climate dynamics that could trigger "abrupt" climate change — positive feedbacks and tipping points. Let me add further current unknowns in the climate system that might drive the pace of warming slower or faster than we expect. Trenberth (and Lovelock) is puzzled by the "missing energy" in the global net energy budget. There is also a large and mysterious sink of carbon that varies from year to year. That "missing carbon" might be absorbed by woody plants or by microbes in the ocean or soils. We don't know yet, so we don't know how to assist the process. Climatologist James Hansen deplores our lack of good data on aerosols, and thus the overall impact of "global dimming" is uncertain. We're not sure yet whether an increase in clouds has a negative or positive feedback effect, and the same goes for the added moisture in the air that warming brings — it all depends on research that remains to be done on altitude effects. In other words, the progress of climate science is likely to keep on alternately terrifying and mollifying us till midcentury at least.

As Lovelock put it to me by email in May 2010:

The plot thickens. We do not know when the heat will turn on.

The missing energy: Downwelling would be intellectually satisfying, especially since it would probably require a corresponding upwelling of cold bottom water, which stays at 4 degrees C in the lower parts of the ocean. It would help explain the current cool spell.

The aerosols over East and South Asia could be a cause of global cooling. I discussed this in The Revenge of Gaia. The effect of clouds is difficult to distinguish from the aerosol effect.

Increased biotic uptake seems unlikely but could be an arctic surprise as more cool algal-rich water is exposed by melting surface ice.

Increased atmospheric moisture, especially in the upper troposphere and lower stratosphere, could have a large positive effect on heating. Do not forget that much of the water in the stratosphere comes from methane oxidation.

Richard Betts from the Hadley Centre was here yesterday and had the good news that their huge model that includes Gaia has just been turned on and after a year of tests and settling down should be giving results.

Apart from a few friends like Richard Betts my name is now mud in climate science circles for having dared to consort with sceptics. Amazing how tribal scientists are.

The take home message is that it is now even more unwise for government to spend heavily on renewable energy and other green dreams. Use the gain in time to prepare for sensible adaptation.

All that I would add to my city chapters may be found in two outstanding books published after mine. Kevin Kelly's What Technology Wants (2010) can be read as a companion volume to Whole Earth Discipline, because he makes it inescapably clear that biophilia and technophilia are not contradictory, but both are part of one long continuity. "Cities are technological artifacts," Kelly writes, "the largest technology we make." Humanity pours into cities by the millions for the simple reason that, like all technology, cities offer more options.

A book by science journalist Fred Pearce, The Coming Population Crash: And Our Planet's Surprising Future (2010), was full of revelations for me. He tracks the eugenicist agenda behind most population-control theory, the life-sapping depression in areas losing population (such as eastern Europe), the smart ambition of migrants, the room for growth in Africa, and the possibility that the permanent aging of society will be a boon.

Nuclear has the most news. President Obama shut down Yucca Mountain and assigned a blue ribbon committee to come up with a practical nuclear waste storage policy for the United States. One intriguing alternative being explored uses deep borehole technology developed by the oil and gas industry. At any reactor site you can drill a hole three miles deep, a foot and a half wide. Down there in the basement rock the water is heavily saline and never mixes with surface fresh water. You can drop spent fuel rods down the borehole, stack them up a mile deep, pour in some concrete, and forget about the whole thing.

Obama also committed $54 billion in loan guarantees to cover the building of up to ten new reactors to restart the industry in America. That settled the argument within the administration about expanding nuclear power. Outsiders like Al Gore and Amory Lovins lobbied against it, but pronuclear insiders like Energy Secretary Steven Chu and science adviser John Holdren prevailed. Also leaders in the Democratic Party, such as House Speaker Nancy Pelosi and energy bill coauthor John Kerry, pushed nuclear in new legislation. Republicans have always been pronuclear.

Amory Lovins attempted a preemptive strike on my nuclear chapter on the day of the book's publication with a 20,000-word critique titled "Four Nuclear Myths: A Commentary on Stewart Brand's Whole Earth Discipline and on Similar Writings," plus a summary at Grist.org. You can download the paper from Rocky Mountain Institute. It suffers a bit because Lovins had not read the rest of the book, nor did he know there was a Web site with all of my source material. He rightly busts me for misspelling a name and for two misuses of technical terminology — corrected in this edition. The rest of his argument was the familiar Lovins deluge; I didn't respond to it because I already had in the chapter. At the Nuclear Energy Institute blog David Bradish wrote a detailed countercritique of Lovins's paper.

The one surprise was that Lovins did not address my material on microreactors — "small modular reactors," or "SMRs," as they're called these days. Elsewhere, though, he dismissed them as "fundamentally a fantasy." In March 2010 Secretary Chu wrote an op-ed for the Wall Street Journal promoting small reactors and noting, "In his 2011 budget request, President Obama requested $39 million for a new program specifically for small modular reactors." A new player in the emerging industry is Babcock & Wilcox, builder of U.S. Navy reactors for half a century. The company is designing a 125-megawatt manufacturable light water reactor it calls "mPower."

One new book does an expert job of shattering Lovinsesque hopes that a stringent program of conservation, wind, and solar is all we need to make our energy climate-safe. It is Sustainable Energy: Without the Hot Air (2009), by David MacKay (pronounced "ma-KIE"), who is a Cambridge physicist and chief scientist for Britain's Department of Energy and Climate Change. The book provides ruthless analysis, winningly told and illustrated, of what it will take for Great Britain to reduce its greenhouse gas emissions enough to make a difference to climate. As in the analyses by his ally Saul Griffith, the needed measures are horrifying to contemplate in aggregate, but they can get the job done. A quote of his that has gone viral is, "I'm not trying to be pro-nuclear, I'm just pro-arithmetic."

I owe to MacKay one of my changes of mind since finishing this book. On page 103 I'm pretty dismissive of "clean coal." Over dinner MacKay persuaded me that coal will keep being burned by nearly everybody, especially China and India, because it is so cheap. Therefore we have to figure out a way to burn it cleanly, capturing the carbon dioxide and burying it, or bonding it into concrete, or whatever it takes. In that light, Al Gore's expensive TV ads deriding clean coal are a public disservice.

In another shift, my fond hopes for space-based solar (page 81) have been dashed by Elon Musk, CEO of rocket-launching SpaceX and chairman of SolarCity. He informed me vehemently that even if access to orbit were free, the inefficiencies of energy collection and transmission rule space solar out as a viable source of baseload power on the ground.

In a final energy comeuppance, I came to regret leaving fusion out of my nuclear chapter. Like most, I figured it was too good to be possible — zero mining (the fuel is hydrogen), zero greenhouse gases, zero waste stream, zero meltdown capability, zero weaponization. Then I visited the National Ignition Facility at Lawrence Livermore National Laboratory in California. There a vast array of lasers aims to focus 500 terawatts of energy for a billionth of a second on a BB-sized target made of hydrogen isotopes and ignite it in a fusion reaction. Impressive early tests suggest that successful ignition could occur by 2011. From that point it might be as short as a decade to a working prototype of a 1-gigawatt fusion power plant.

There's been significant news in biotech as well. The environmental and economic benefits of GE crops in the United States were confirmed by an authoritative 250-page study from the National Academy of Sciences. It reported that GE farmers have the advantage of lower costs, higher yields, and greater safety than non-GE farmers, and that significant environmental gains come from their use of fewer pesticides, less toxic herbicides, and especially from no-till farming enabled by herbicide-resistant GE crops.

The next generation of transgenic crops is now called "functional foods," described as "any modified food or food ingredient that may provide a health benefit beyond that of the traditional nutrients it contains." A Pew Research Center survey of current GE research noted that "food enhancements cover a wide range, including improved fatty acid profiles for more heart healthy food oils, improved protein content and quality for better human and animal nutrition, increased vitamin and mineral levels to overcome widespread nutrient deficiencies throughout the world, and reduction in anti-nutritional substances that diminish food quality and can be toxic." Organic farmers should be allowed to grow those crops. If they can't, they may be left with nothing but a diminishing nostalgia market of people willing to pay extra for less healthy food.

In a book called Hybrid: The History and Science of Plant Breeding (2009), by Noel Kingsbury, I found a story that belongs in this book, so I'll add it here. Back in 1998 in India, while Mahyco-Monsanto was running test plots of Bt cotton, Vandana Shiva was denouncing the technology as "seeds of suicide, seeds of slavery, seeds of despair." Meanwhile, Kingsbury writes:

Farmers . . . were desperate to obtain cotton that would not fall victim to bollworm and to avoid the costs and dangers of using pesticides. . . Seeds of the Bt cotton "escaped" from Mahyco-Monsanto's test plots and were used to breed new "unofficial" Bt cotton varieties. . . .

By 2005, it was estimated that 2.5 million hectares were under "unofficial" Bt cotton, twice the acreage as under the ones that had been sown from Monsanto's packets. . . . A veritable cottage industry had sprung up, a state described as "anarcho-capitalism," whereby small-scale breeders were crossing reliable local varieties with the caterpillar-proof Bt plant. The world's first GM landraces had arrived. . .

Shiva's "Operation Cremate Monsanto" had spectacularly failed; its anti-GM stance, borrowed from Western intellectuals, had made no headway with Indian farmers, who showed they were not passive recipients of either technology or propaganda, but could take an active role in shaping their lives. What they did is also perhaps more genuinely subversive of multinational capitalism than anything GM's opponents have ever managed.

"Synbio" crossed the threshold into "synlife" with the announcement in May 2010 that Craig Venter's team had successfully booted up a living, replicating cell with a genome totally created by means of chemistry and computers. The team's paper in Science noted, "If the methods described here can be generalized, design, synthesis, assembly, and transplantation of synthetic chromosomes will no longer be a barrier to the progress of synthetic biology."

I suspect that, decades ago, environmentalists would have risen up in outrage and alarm against technology like Venter's, but I have found them surprisingly noncommittal about synthetic biology, even while they continue to complain about transgenic crops. While the uproar about nuclear power persists (though it is fading into a more primary focus on coal plants), I bet that fusion will be largely welcomed by Greens, if it comes to pass. Legacy resistance against old new tech continues, but new new tech appears not to arouse the fears and activism of old.

I should add an excellent online source for environmental news: Environment 360 — "Opinion, Analysis, Reporting & Debate" — run by the Yale School of Forestry and Environmental Studies.

There was significant geoengineering news. A step toward asteroid control was taken by the Obama administration. While canceling a return to the moon by NASA, the president proposed that the next deep-space human mission should be to an asteroid, which could occur by about 2025. His science adviser John Holdren remarked that developing the ability to nudge asteroids "would demonstrate once and for all that we're smarter than the dinosaurs and can therefore avoid what they didn't."

Two good books on geoengineering finally arrived: How to Cool the Planet (2010) by Jeff Goodell and Hack the Planet (2010) by Eli Kintisch. Both writers talked to most of the early players: Ken Caldeira, Lowell Wood, John Latham, Stephen Salter, Russ George, David Keith, James Lovelock, and David Victor. One new scheme has been put forward by Harvard's Russell Seitz to brighten parts of the ocean by aerating the water with microbubbles.

Geoengineers gathered in cautionary mode at the Asilomar Conference Center in California, echoing the recombinant DNA gathering there back in 1975. Environmental organizations were invited, and so was I. The conference adopted terminology from an influential report by the Royal Society, noting that geoengineering comes in two major forms — solar radiation management (SRM) and carbon dioxide removal (CDR). The view emerged that carbon dioxide projects would necessarily be slow and in most cases benign, and therefore in less need of global regulation, whereas the opposite is true of efforts to manage sunlight with stratospheric sulfur dust or brightened clouds. The three days of discussion basically reaffirmed the "Oxford Principles" first proposed in a 2009 memorandum to the British Parliament by Steve Rayner of Oxford University:

  • Geoengineering regulated as a public good

  • Public participation in geoengineering decision-making

  • Disclosure of geoengineering research and open publication of results

  • Independent assessment of impacts

  • Governance before deployment

In other words, one way to geoengineer wrong would be for a private company to start injecting sulfur dioxide into the stratosphere without disclosing research plans or research results, without outside monitoring of effects, and without permission of a public governance body.

At the same time that the hardcover edition of this book was making its way in the world, a film called Earth Days, on the origins of the contemporary environmental movement, was released in theaters and on TV. I'm in it, along with others from this book such as Paul Ehrlich and Rusty Schweickart. The movie is really carried by Earth Day founder Denis Hayes and energy maven Hunter Lovins (Amory's former wife), but director Robert Stone gave me the concluding statement. What I said over a photograph of the Earth there will perhaps serve here as well:

We're engaging in a set of activities which go way beyond the individual life span, way beyond children, grandchildren, way beyond parents, grandparents, great-grandparents, to the whole frame of at least civilizational life. Once you get comfortable with that, then you start to go further out still, to three and a half billion years of life on Earth, and maybe we'll do another three and a half billion years. That's kind of interesting to try to hold in your mind. And once you've held it in your mind, what do you do on Monday?

By Steven Pinker

New forms of media have always caused moral panics: the printing press, newspapers, paperbacks and television were all once denounced as threats to their consumers' brainpower and moral fiber.

So too with electronic technologies. PowerPoint, we're told, is reducing discourse to bullet points. Search engines lower our intelligence, encouraging us to skim on the surface of knowledge rather than dive to its depths. Twitter is shrinking our attention spans. ...

...The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.

STEVEN PINKER is the Johnstone Family Professor in the Department of Psychology at Harvard University and is the author of six books, including The Language Instinct, How the Mind Works, Words and Rules, The Blank Slate, and The Stuff of Thought.

Steven Pinker's Edge Bio Page

THE REALITY CLUB: Nicholas Carr, Douglas Rushkoff, Evgeny Morozov



[STEVEN PINKER:] New forms of media have always caused moral panics: the printing press, newspapers, paperbacks and television were all once denounced as threats to their consumers' brainpower and moral fiber.

So too with electronic technologies. PowerPoint, we're told, is reducing discourse to bullet points. Search engines lower our intelligence, encouraging us to skim on the surface of knowledge rather than dive to its depths. Twitter is shrinking our attention spans.

But such panics often fail basic reality checks. When comic books were accused of turning juveniles into delinquents in the 1950s, crime was falling to record lows, just as the denunciations of video games in the 1990s coincided with the great American crime decline. The decades of television, transistor radios and rock videos were also decades in which I.Q. scores rose continuously.

For a reality check today, take the state of science, which demands high levels of brainwork and is measured by clear benchmarks of discovery. These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying. Other activities in the life of the mind, like philosophy, history and cultural criticism, are likewise flourishing, as anyone who has lost a morning of work to the Web site Arts & Letters Daily can attest.

Critics of new media sometimes use science itself to press their case, citing research that shows how "experience can change the brain." But cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill the wiring of the brain changes; it's not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.

Experience does not revamp the basic information-processing capacities of the brain. Speed-reading programs have long claimed to do just that, but the verdict was rendered by Woody Allen after he read "War and Peace" in one sitting: "It was about Russia." Genuine multitasking, too, has been exposed as a myth, not just by laboratory studies but by the familiar sight of an S.U.V. undulating between lanes as the driver cuts deals on his cellphone.

Moreover, as the psychologists Christopher Chabris and Daniel Simons show in their new book "The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us," the effects of experience are highly specific to the experiences themselves. If you train people to do one thing (recognize shapes, solve math puzzles, find hidden words), they get better at doing that thing, but almost nothing else. Music doesn't make you better at math, conjugating Latin doesn't make you more logical, brain-training games don't make you smarter. Accomplished people don't bulk up their brains with intellectual calisthenics; they immerse themselves in their fields. Novelists read lots of novels, scientists read lots of science.

The effects of consuming electronic media are also likely to be far more limited than the panic implies. Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of "you are what you eat." As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.

Yes, the constant arrival of information packets can be distracting or addictive, especially to people with attention deficit disorder. But distraction is not a new phenomenon. The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life. Turn off e-mail or Twitter when you work, put away your Blackberry at dinner time, ask your spouse to call you to bed at a designated hour.

And to encourage intellectual depth, don't rail at PowerPoint or Google. It's not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate. They are not granted by propping a heavy encyclopedia on your lap, nor are they taken away by efficient access to information on the Internet.

The new media have caught on for a reason. Knowledge is increasing exponentially; human brainpower and waking hours are not. Fortunately, the Internet and information technologies are helping us manage, search and retrieve our collective intellectual output at different scales, from Twitter and previews to e-books and online encyclopedias. Far from making us stupid, these technologies are the only things that will keep us smart.

Author, The Big Switch; The Shallows: What the Internet Is Doing to Our Brains

Steven Pinker is too quick to dismiss people's concerns over the Internet's influence on their intellectual lives. He asserts that digital media "are the only things that will keep us smart." But the evidence he offers to support the claim consists largely of opinions and anecdotes, plus one Woody Allen joke.

On neuroplasticity, Pinker expresses the skepticism characteristic of evolutionary psychology advocates. When faced with suggestions that "experience can change the brain," he writes, "cognitive neuroscientists roll their eyes." But is his opinion really shared so universally? In the reports on the Net's cognitive effects published in the Times last week, scholars like Russell Poldrack, Clifford Nass, Nora Volkow, and Adam Gazzaley offered views that conflict with Pinker's. He may disagree with these views, but to pretend they don't exist is misleading.

In considering "intelligence," Pinker paints with too broad a brush. He writes that "if electronic media were hazardous to intelligence, the quality of science would be plummeting." Intelligence takes many forms. Electronic media may enhance some aspects of intelligence (the ability to spot patterns, for example, or to collaborate at a distance) while at the same time eroding others (the ability to reflect on our experiences, say, or to express ourselves in subtle language). Intelligence can't be gauged by a single measure.

Pinker notes that IQ scores rose during the decades of TVs and transistor radios. But that rise, which began in the early 1900s, is largely attributable to gains in visual acuity and abstract problem-solving. Measures of other components of intelligence, including verbal skill, vocabulary, basic arithmetic, memorization, critical reading, and general knowledge, have been stagnant or declining.

Pinker argues that "the effects of experience are highly specific to the experiences themselves." But that's why some of us are deeply concerned about society's ever-increasing devotion to the Net and related media. Given that the average American now spends 8.5 hours a day peering at screens, it seems likely that we're narrowing the scope of our intellectual experiences. We're training ourselves, through repetition, to be facile skimmers and message-processors — important skills, no doubt — but, perpetually distracted, we're not learning the quieter, more attentive modes of thought: contemplation, reflection, introspection, deep reading. 

Pinker is right that "genuine multitasking" is a myth. But that's why many experts on multitasking are concerned about its increasing prevalence. People may think, as they juggle emails, texts, tweets, and glances at websites, that they're adeptly doing a lot of stuff at once, but actually they're switching constantly between different tasks, and suffering the accompanying cognitive costs. The fact that people who fiddle with cell phones drive poorly shouldn't make us less concerned about the cognitive effects of media distractions; it should make us more concerned.

We should celebrate the benefits that the Net and related media have brought us. I've enjoyed those benefits myself over the last two decades. But we shouldn't be complacent when it comes to the Net's ill effects. As Patricia Greenfield, the UCLA developmental psychologist, wrote in a Science article last year, research suggests that our growing use of screen-based media is weakening our "higher-order cognitive processes," including "abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination."

Media Analyst; Documentary Writer; Author, Life Inc.: How the World Became a Corporation and How to Take It Back

The main value in Pinker's statement is the implied notion that media technologies cannot be evaluated in a vacuum. Taken alone, neither a Twitter account nor a Facebook profile will diminish one's capacity to think or interact.

But nothing ever happens alone. These media are arising in contexts of business, economics, and other social factors. No one — or at least no one smart — is saying that PowerPoint reduces discourse to bullet points. What they are saying is that, combined with the workplace's bias for tangible metrics and easy slogans over long-term planning and complex solutions, the bias of PowerPoint toward bullet points can exacerbate the worst existing tendencies in business. It turns out that PowerPoint is not the best tool for every purpose.

So while it would be incorrect to blame PowerPoint for the collapse of competency in America, or the continuing fall of corporate profits over assets, a re-evaluation of the program's universal application in all contexts is overdue.

Likewise, Facebook — as a way for college kids to meet and greet one another — was a terrific program. As a mirror through which young people forge an identity, however, the program is lacking the nuance of real life. Facebook — more than a program to be feared for its code — is a business plan to be feared for its ubiquity. The object of Facebook is to monetize social interactions. This is the bias of the program, and a bias of which most people are painfully unaware.

Meanwhile, the positive effects of new media — such as their destabilization of centralized currencies and challenge to the forced monopolization of value creation — will remain unrecognized until we move beyond our artificially polarized reaction to the tools, and engage in a more qualitative study of their influences in different circumstances.

The real power of our computers and networks to expand human capacity, promote a global consciousness, and catalyze the evolution of our species will only be realized if we rise above this endless tit-for-tat between "pro" and "anti" technology camps, and instead begin to reckon with the very real biases of these media, as well as how they amplify or diminish the biases of the systems in which they are operating.

Commentator on Internet and politics "Net Effect" blog; Contributing editor, Foreign Policy

Unless I misread the zeitgeist, Steven Pinker's "Mind Over Mass Media" appears to be a rebuttal of Nicholas Carr's two-pronged lament about the Internet: namely, that (1) "the logic of the Internet" dictates that it become an "engine of distraction," what with all those links, clicks and pop-ups, and (2) as the Internet plays an increasingly prominent role in our public life, such distraction will inevitably undermine its intellectual foundations (e.g., we won't finish all those lengthy Russian and French novels, nor will we have the time to reflect, assess, and ponder). There are many things that I find problematic in Carr's thesis; Pinker's rebuttal fails to address most of them and exhibits quite a few problems of its own.

First, a brief note on Carr's overarching argument. Carr's mission is doomed, for he is essentially attempting to prove that poetry still matters through a series of complex mathematical equations. Clearly, that won't do. For once, I also happen to believe that "poetry still matters" and that it might be coming under attack from the Internet. However, the way to demonstrate this is not by embracing the science of looking inward — i.e. studying the subtle changes in the wiring of our brains — but by embracing the science of looking outward — i.e. analyzing the impact that the Internet has on the flow and the quality of ideas.

In other words, Carr needs to ditch neuroscience and embrace sociology, for one can't write an effective defense of humanism in the ink of the amino acid glutamate. The intellectual limitations of Carr's project become apparent the moment one moves from the descriptive to the prescriptive: short of establishing the "dictatorship of the neurons", we can't legislate with a view to optimizing the well-being of our brains alone — the public usually has many more priorities and needs.

Had Carr looked beyond the neuroscience, he may have found that many of the problems that he blames on the Internet — constant busyness, shrinking attention spans, less and less time for concentration and contemplation — are rooted in the nature of working and living under modern capitalism rather than in information technology or gadgetry per se. In fact, as Pinker correctly points out, Carr's are very old complaints.

Exhibit A: back in 1881 the prominent New York City physician George Beard published "American Nervousness", a book about the sudden epidemic of "nervousness" sweeping America, which he blamed, in part on the telegraph and the daily newspaper (the book later proved a great influence on Freud).

Exhibit B: in 1891, almost 120 years before The Atlantic published Carr's "Is Google Making Us Stupid?", the same magazine ran "Journalism and Literature", an essay by the polymath William Stillman, in which he attacked the cultural change brought about by telegraph-enabled journalism much in the same vein that Carr attacks the Internet. Stillman complained that "we develop hurry into a deliberate system, skimming of surfaces into a science, the pursuit of novelties and sensations into the normal business of our lives".

The Internet may be amplifying each of these problems, but it surely did not cause them. When the famed sociologist Manuel Castells speaks of the "black holes of information capitalism", there is as much emphasis on "capitalism" as there is on "information".

Now, onto Pinker's rebuttal. Pinker, I fear, falls into the same conceptual trap as Carr, i.e. he sets to measure the Internet against the printing press, the comic book, and television. However, by viewing the Internet as just another medium, both Carr and Pinker end up significantly downplaying its importance.

But the Internet is not just another medium. Rather, it's a full-blown brand-new dimension to human affairs — and it is poised to profoundly affect all other dimensions. The proper analogy, thus, is not to the newspaper or the telegraph, but to religion and nationalism. However, just like one could not assess the overall impact of religion by looking at the rates of dissemination of religious literature, one cannot assess the impact of the Internet by looking at such a narrow slice of its impact as the consumption of information by its users (and Carr makes that slice even thinner by assuming that the Internet has a "logic" that is not malleable by the social, cultural, and economic environments in which it operates — an assumption I find rather dubious). Just like with religion or nationalism, there is absolutely no guarantee that the vector of social change unleashed by the Internet would be either positive or negative; most certainly, it will be both — so the sooner we find a way to diagnose and minimize its negative effects, the better.

As such, determining whether the Internet strengthens or erodes the intellectual foundations of our culture cannot just be inferred by studying what happens to the brains of its users. Instead, one needs to embark on a much broader (and certainly more painful) structural analysis that would also examine what the Internet does to the production of different kinds of information (will book reviews still be published in 2020?), how it changes the depth and the quality of access to information (how many more people will have access to how many more resources by 2020?) and so forth. The exclusive focus on how humans consume electronic media blinds Carr to some of the positive aspects of change induced by the Internet — surely, there is more to Google Books' growing role in intellectual production than just inducing a culture of shallow skimming? — while also blinding Pinker to its numerous negative aspects.

Both Carr and Pinker are too extreme in their portrayal of potential human responses to technology. Carr, the skeptic, follows in the steps of Jacques Ellul and paints the modern human as a pathetically impotent creature, completely enthralled by technology and unable to resist its allure. Pinker, the optimist, presents us with the opposite image: for him, the human is in ultimate control, able to turn off the pesky email whenever it gets too distracting.

I find such glorification of human agency — Pinker's belief in our ability to "develop strategies of self-control" — quite disturbing, even more so when it comes from such a distinguished student of human nature as Pinker. Surely, self-control would be a great strategy to fight many other ills of human civilization, from obesity to pollution, except that it almost never works as advertised.

Fortunately, most societies no longer entertain such illusions and there are more and more subtle and effective ways in which we, the consumers, can be saved from ourselves. Perhaps, some form of soft paternalism — for example, finding a way to display the "calorie intake" of the information we consume somewhere in our browsers — might be in order. That consumers won't win this fight is made obvious by the fact that we are up against powerful corporate interests.

The political economy of today's Internet is such that Google, Facebook, and Twitter, having found a way to commodify and capitalize on our distractions, are the ones who stand to benefit the most. No wonder our information diet is so unhealthy: it's our eyeballs, not our minds, that are of primary interest to Internet companies. It's not exactly an environment conducive to practicing self-control. For all his insights into the human psyche, Pinker seems to have missed the real "invisible gorillas" of today's Internet; a hint: they all have headquarters in the Bay Area. Carr might be wrong to focus on neuroscience but some insights from psychiatry (not to mention the legal theory of consumer protection) would be much appreciated.

However, one doesn't need to subscribe to any such conspiracy theories to notice that the Internet has triggered many disturbing socio-economic processes that may not be particularly favorable to the production of new and iconoclastic ideas. Take Pinker's argument that accomplished people become accomplished by immersing themselves in their fields — "novelists read lots of novels, scientists read lots of science". At first sight, Pinker seems right — but absolutely nothing about today's culture suggests that such "laws" would still be applicable a decade from now.

For example, I'm less optimistic about the future of free-roaming novelists (or, for that matter, any other kind of literary intellectuals who are unaffiliated with academic institutions) than Pinker. It's quite possible that in just a few years "lots of novelists" won't be reading "lots of novels" because (a) fewer readers would be eager to pay big bucks for their novels, making it quite hard to finance novelists' life-supporting advances; (b) freelancing opportunities would shrink or become even worse paid, as some magazines and newspapers shut down and others shift to repackaging free content they find online; (c) novelists, forced to prostitute themselves on Twitter, Facebook, and FourSquare in order to secure a deal or make a sale, simply wouldn't have much time for reading.

Similarly, Pinker is correct in saying that "special knowledge must be acquired in special institutions, which we call universities" — but one of the implications of the digital revolution might be that universities would also end up being the only places where new ideas could be produced. It works well for scientists like Pinker but I am not sure that every non-academic intellectual would be thrilled to be faced with such a choice.

All in all, the debate between Carr and Pinker confirms my long-running suspicion that one can't grapple with the macro-level social implications of the Internet by operating on the micro-level of neuroscience or psychology. These disciplines do provide useful insights — but we need a brand-new Internet-centric social science to make sense of them.

June 17, 2010

By Liz Else, associate editor and Shaun Gamble, contributor

"If you're confused by climate change, baffled by biodiversity and puzzled by particle physics, join us at Speakers' Corner to cut out the middle man and get the truth behind the headlines."

That was the invitation and challenge from the Zoological Society of London, the folks that run London Zoo. Just show up at the few square metres in London's Hyde Park that have become synonymous with freedom of expression, and look out for a bunch of scientists on soapboxes.

Fifteen scientists and science popularisers turned up on Monday to help invent a new form of science communication. This was the kind of public exposure that would make even an experienced stand-up comedian anxious, so wisely they all came armed with props, from a giant plastic ladybird to a blow-up globe.

The speakers' remit was to talk about the science the public care about most — or perhaps, more honestly, ought to care about most. So the kick-off session was Earth Evolution with talks including "Life on Mars from life on Earth", "Where do species come from anyway?" and "Pheromones: Smells at the heart of life".

Is this something that ought to happen more often? Chatting afterwards, one of the speakers, Exeter University professor Stephan Harrison, said he had come round to the view that engaging with the public was not just an important thing to do, it is a scientist's obligation.

New Scientist's own senior consultant Alun Anderson - whose "Vanishing Arctic" talk was guaranteed to appeal to a public in love with polar bears — agreed, adding that this kind of one-on-one connection could be positively "life-changing". ...

...That's all the more reason for doing more of these events, as some of the speakers are apparently now thinking of doing, according to the event's organiser Seirian Sumner, whose team for the event also included Charlotte Walters and Kate Jones. Sumner is a featured essayist on how social insects got to be social in Max Brockman's book What's Next?, a who's who of science's next generation. ...


[EdgeLink: Seirian Sumner: A Cooperative Foraging Experiment—Lessons from Ants]

June 2010

Not a typical column of innovation (No una típica columna de innovación)
By Carlos Osorio

"Dangerous ideas" are those that emerge to challenge the validity of a paradigm, and are rejected by the establishment of the day for their potential to change things.

Most innovation columns are dedicated to presenting and discussing cases and drawing conclusions that may be applicable to decision makers. This is fun at first, but soon ends up boring both author and readers. So this column will be different.

Here we will apply design and innovation approaches to analyze and discuss current events, presenting them from different and unexplored points of view, in order to identify and discover some of the "dangerous ideas" associated with them, as defined by Steven Pinker of Harvard University.

What are dangerous ideas? By this term Pinker does not mean ideas that cause harm to society, such as racist or fascist ideologies, or weapons of mass destruction. Quite the contrary. He defines them as those that emerge to challenge the validity of a paradigm that has come to be regarded as normal and accepted, and that, being a threat, are rejected by the establishment of the day for their potential to change things.

Why call them dangerous? They challenge the status quo and the economic, moral, political, or religious stability of an industry or sector. They are dangerous not because they may be "wrong" but because — oh, paradox — they could be "correct." These ideas are dangerous because, in challenging an institutionalized idea, they promise to render obsolete much of what has been invested in creating the system that maintains its validity. ...

...The aim of this column is to stimulate discussion and action along these dimensions. To learn more, read "What Is Your Dangerous Idea?", edited by John Brockman.

Spanish Language Original | Google Translation

June 6, 2010

As if it were a part of you, my love (Als wär's ein Teil von dir, my love)

... In the annual survey of the science platform edge.org, the question was "What will change everything?"

The 87-year-old Princeton physicist Freeman Dyson regrets that he will not experience it himself, but he believes in a revolution through radiotelepathy. With this technology, the brain would be covered with microwave sensors that register all neural activity and can transmit it to another person. It would then be possible to know what and how another person thinks, what and how another person feels. One would be connected with another human being in an almost unimaginably intimate way. One would, as Freeman Dyson at least hopes, truly understand the other.

Before the first tests can take place, two further enabling technologies still have to be invented, according to Dyson. He expects that it will take 80 years before the first radiotelepathy subjects are left wide-eyed.

But when it comes to this kind of brain-to-brain transmission, today's users of transitional objects are already pretty close.

German Language Original | Google Translation

May 25, 2010

Science / Le blogue de Valérie Borde

What Ethics for synthetic biology? (Quelle éthique pour la biologie synthétique?)

...a study published in the journal Science reports how 24 scientists from the J. Craig Venter Institute designed by computer, synthesized, and then assembled a small chromosome, which they then transferred into a cell previously stripped of all genetic material.

Driven by this bit of totally synthetic DNA, the cell expressed the instructions encoded in the new genome and multiplied. Easier said than done, as you will read in this article.

I preferred to wait a few days before commenting on this announcement, as the reactions it provokes are almost as interesting as the study itself.

Obama himself has specifically asked the Presidential Commission for the Study of Bioethical Issues, established last November, to look quickly into Craig Venter's findings (Obama letter in pdf).

In France, the association Vivagora is also deeply concerned about the ethical issues raised by synthetic biology, which it summarizes on its website. See also, if you read English, some very enlightening views on this development, such as those of the evolutionary biologist Richard Dawkins, presented on the website of the Edge Foundation.

French Language Original | Google Translation

May 21, 2010

UAE | Heritage and Culture

Translation Initiative Looks to the Future
Adach releases Arabic versions of unpublished works of 25 leading scientists and thinkers

Abu Dhabi: Kalima, the translation initiative of the Abu Dhabi Authority for Culture and Heritage (Adach), has published the Arabic version of The Next Fifty Years: Science in the First Half of the Twenty-first Century, edited by John Brockman which contains the unpublished work of 25 leading scientists and thinkers.

Brockman is the founder of the non-profit Edge Foundation and editor of edge.org, the website devoted to discussions of cutting edge science.

The book, translated into Arabic by Fatima Ganem, offers 25 original, never-before-published essays about the advances in science and technology that we may see within our lifetimes.



Edited by John Brockman

"An intellectual treasure trove"
San Francisco Chronicle


Harper Perennial



Contributors include: RICHARD DAWKINS on cross-species breeding; IAN McEWAN on the remote frontiers of solar energy; FREEMAN DYSON on radiotelepathy; STEVEN PINKER on the perils and potential of direct-to-consumer genomics; SAM HARRIS on mind-reading technology; NASSIM NICHOLAS TALEB on the end of precise knowledge; CHRIS ANDERSON on how the Internet will revolutionize education; IRENE PEPPERBERG on unlocking the secrets of the brain; LISA RANDALL on the power of instantaneous information; BRIAN ENO on the battle between hope and fear; J. CRAIG VENTER on rewriting DNA; FRANK WILCZEK on mastering matter through quantum physics.

"a provocative, demanding clutch of essays covering everything from gene splicing to global warming to intelligence, both artificial and human, to immortality... the way Brockman interlaces essays about research on the frontiers of science with ones on artistic vision, education, psychology and economics is sure to buzz any brain." (Chicago Sun-Times)

"11 books you must read — Curl up with these reads on days when you just don't want to do anything else: 5. John Brockman's This Will Change Everything: Ideas That Will Shape the Future" (Forbes India)

"Full of ideas wild (neurocosmetics, "resizing ourselves," "intuit[ing] in six dimensions") and more close-to-home ("Basketball and Science Camps," "solar technology"), this volume offers dozens of ingenious ways to think about progress" (Publishers Weekly — Starred Review)

"A stellar cast of intellectuals ... a stunning array of responses ... Perfect for: anyone who wants to know what the big thinkers will be chewing on in 2010." (New Scientist)

"Pouring over these pages is like attending a dinner party where every guest is brilliant and captivating and only wants to speak with you—overwhelming, but an experience to savor." (Seed)

* Based on The Edge Annual Question — 2009: "What Will Change Everything?"

Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.

John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

Alexandra Zukerman, Assistant Editor
contact: editor@edge.org
Copyright © 2010 by Edge Foundation, Inc.
All Rights Reserved.