
Just yesterday, Chris Anderson and Michael Wolff’s article “The Web Is Dead. Long Live the Internet” was officially posted on Wired.com. In it, they argue that the rise of apps for smartphones and iPads, RSS feeds, proprietary platforms like the Xbox and so on signals the end of the Web as the center of most people’s online lives. In many ways, the piece argues loosely for a vindication of Wired’s notorious “PUSH!” cover story from 1997, which also argued for the end of browsers’ relevance.

But honestly, between Alexis Madrigal’s beautiful rebuttal at TheAtlantic.com, Rob Beschizza’s devastating graphics on Boing Boing (reworking Anderson and Wolff’s own choice of data), and other quick rebuttals springing up, has any ambitious piece of Internet-related punditry died a faster, more ignominious death? It seems as though the plausibility of this idea has been drained away even before issues of the paper magazine could have reached subscribers.

I don’t think Anderson and Wolff’s argument is entirely without merit, but the somewhat more nuanced version of it that seems more resilient is one that Farhad Manjoo endorsed in a couple of columns for Slate early this year, “Computers Should Be More Like Toasters” and “I Love the iPad” (both concurrent with the debut of Apple’s iPad, which I don’t see as even remotely coincidental). Manjoo was arguing for a simpler, more appliance-level interface for computers rather than the death of something as rich and vital as the Web still is. In effect, unlike Anderson and Wolff, Manjoo avoided the trap of arguing for the historically disproven “zero sum game” model of newer technologies driving older ones to extinction, which Madrigal debunks. Even if interfaces and operating systems become more app-like, the browser-mediated Web may continue to be a crucial part of most users’ lives.

There’s no reason to think the Web as we know it won’t eventually die; most things do. And apps seem destined to play a more ubiquitous role in all our lives for some time to come; terrific. It seems doubtful those two propositions are causally linked, however.

Update (added 8/21): Far more sophisticated discussion of the points raised in the Wired article, including additional smart rebuttals, is available in this series of exchanges between Chris Anderson, Tim O’Reilly and John Battelle—served up by Wired.com itself, to its credit.

Ray Kurzweil, the justly lauded inventor and machine intelligence pioneer, has been predicting that humans will eventually upload their minds into computers for so long that I think his original audience wondered whether a computer was a type of fancy abacus. It simply isn’t news for him to say it anymore, and since nothing substantive has happened recently to make that goal any more imminent, there’s just no good excuse for Wired to still be running articles like this:

Reverse-engineering the human brain so we can simulate it using computers may be just two decades away, says Ray Kurzweil, artificial intelligence expert and author of the best-selling book The Singularity is Near.

It would be the first step toward creating machines that are more powerful than the human brain. These supercomputers could be networked into a cloud computing architecture to amplify their processing capabilities. Meanwhile, algorithms that power them could get more intelligent. Together these could create the ultimate machine that can help us handle the challenges of the future, says Kurzweil.

This article doesn’t explicitly refer to Kurzweil’s inclusion of uploading human consciousness into computers as part of his personal plan for achieving immortality. That’s good, because the idea has already been repeatedly and bloodily drubbed—by writer John Pavlus and by Glenn Zorpette, executive editor of IEEE Spectrum, to take just two recent examples. (Here are audio and a transcription of a conversation between Zorpette, writer John Horgan and Scientific American’s Steve Mirsky that further kicks the dog. And here’s a link to Spectrum’s terrific 2008 special report that puts the idea of the Singularity in perspective.)

Instead, the Wired piece restricts itself to the technological challenge of building a computer capable of simulating a thinking human brain. As usual, Kurzweil rationalizes placing this accomplishment around 2030 by pointing to exponential advances in technology, as famously embodied by Moore’s Law, and to this bit of biological reductionism:

Read Full Article
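Since the excerpt stops short of the arithmetic, here is a minimal back-of-envelope sketch, entirely my own illustration rather than anything from Kurzweil or Wired, of the kind of extrapolation that “exponential advances” implies. The two-year doubling period and the 2010-to-2030 window are assumptions chosen only to show the shape of the argument.

```python
# Back-of-envelope Moore's Law-style extrapolation (illustrative only).
# Assumptions: computing capacity doubles every two years; the start and
# end years are stand-ins for "now" and Kurzweil's roughly-2030 target.

def projected_growth(start_year: int, end_year: int, doubling_period: float = 2.0) -> float:
    """Return the factor by which capacity grows between start_year and end_year."""
    return 2 ** ((end_year - start_year) / doubling_period)

if __name__ == "__main__":
    factor = projected_growth(2010, 2030)
    print(f"Assumed capacity growth, 2010-2030: about {factor:,.0f}x")  # ~1,024x
```

Twenty years of doubling every two years works out to roughly a thousandfold increase in raw capacity, which is the kind of compounding the 2030 date leans on.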

Previously, I slightly differed with David Crotty’s good post about why open blogging networks might be incompatible with the business models of established publishing brands, particularly scientific ones, for which credibility is king. David had correctly diagnosed the very real sources of conflict, I thought, but those problems should only become unmanageable with networks whose pure, principled openness went beyond anything publishers seemed interested in embracing anyway. The more important consideration, in the eyes of the bloggers and the increasingly digital-savvy audience, will be how the brand handles the problems that will inevitably arise, just as the way publications handle corrections and mistakes already becomes part of their reputation.

Or to put it another way: the openness of a blogging network doesn’t imperil a brand’s editorial value so much as it helps to define it. (Would that I had thought to phrase it so pithily before!)

Nevertheless, I do think blogging networks potentially pose at least two other types of problems that could be more seriously at odds with commercial brands’ plans. Neither is exclusive to blogging networks, though; both are business problems common to many digital publishing models, and the presence of a blogging network just amplifies them.

Read Full Article

Inspired by David Crotty’s post at the Scholarly Kitchen, the indomitable blogfather Bora Z. tweets:

For the most part, I agree with the individual points and criticisms that David raises. Whether I agree with his bottom-line conclusion that open networks are incompatible with established brands, and maybe most especially with brands built on scientific credibility, depends on the purity of one’s definition of open.

Unquestionably, leaving a troop of bloggers to their own scruples while publishing under your banner is fraught with risk, but as problems go, it’s neither unprecedented nor unmanageable in publishing. In fact, I’d say the open blogging network problem is really just a special case of the larger challenge of joining with fast-paced, out-linking (and poorly paying) online publishing culture. Some of the best prescriptions seem to be what David is suggesting or implying, so perhaps any disagreement I have with him is really over definitions rather than views.

Read Full Article

This sobering video by Isao Hashimoto speaks for itself. You’ll need about 15 minutes to watch the whole thing, but if you want a reminder of what the pace of testing in the nuclear age has been, this is well worth it.

The reminder that more than 1,000 nuclear explosions took place in the American Southwest almost leaves you wondering how the entire region isn’t an atomic wasteland, doesn’t it?

The official page for this video on the site of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization has more information about the artist.

(As the page notes, the video does not show two nuclear tests that occurred after 1998, both by North Korea, in October 2006 and May 2009.)

By rights, I should have seen and commented on this article in the New York Times days ago, and perhaps a better blogger would now shrug it off as a lost opportunity. But even if it is only a fairly trivial example of the chronic problem with false balance that dogs reporting on the climate policy debates, this particular instance of it still annoys me so much that I need to vent it out of my system anyway.

Tom Zeller, Jr., writes:

In any debate over climate change, conventional wisdom holds that there is no reflex more absurd than invoking the local weather.

And yet this year’s wild weather fluctuations seem to have motivated people on both sides of the issue to stick a finger in the air and declare the matter resolved — in their favor.

Last February, for example, as a freak winter storm paralyzed much of the East Coast, relatives of Senator James M. Inhofe, the Oklahoma Republican who is a skeptic of climate change, came to Washington and erected an igloo.

They topped it with a cheeky sign asking passers-by to “Honk if you ♥ global warming.” Another sign, added later, christened the ice dome “Al Gore’s new home.”

Now, with record heat searing much of the planet from Minnesota to Moscow, people long concerned with global warming seem to be pointing out the window themselves.

“As Washington, D.C., wilts in the global heat wave gripping the planet, the Democratic leadership in the Senate has abandoned the effort to cap global warming pollution for the foreseeable future,” wrote Brad Johnson at the progressive Wonk Room blog, part of the Center for American Progress.

Is it really necessary to point out that the climate deniers and the environmentalists are “invoking the local weather” in completely different, nonequivalent ways?

Inhofe and the sign makers were implying that the cold weather disproved claims of global warming and showed they were ridiculous. The scientific fraudulence of that argument is precisely why using isolated weather incidents to argue about climate is absurd.

In contrast, even though the Times titled this story, “Is It Hot in Here? Must Be Global Warming,” no one in the story makes that argument. Johnson did not say the heat wave proved global warming was real. Instead, he highlighted the irony of lawmakers forsaking a sound response to the problem while the world was suffering from extreme heat—a kind of heat that will only become more commonplace and extreme as global warming continues.

One side illegitimately used weather as evidence. The other used weather as an example. Were Zeller and his editors too foolish to recognize the difference, or was the thrill of getting to say that both sides are doing it just too sweet to ignore?

JR Minkel, who blogs as only he can over at A Fistful of Science, recently brought to my attention this Paul Adams article for Popular Science (and indirectly, this news story in the Guardian) about the underappreciated importance of insects as a food source for many people around the world. That prompted me to dig out this recollection of my own foray into eating insects, which I wrote up years ago.

Memoirs of an Entomophage

My reputation in some circles as a person who eats bugs has been blown out of proportion. Yes, I have knowingly and voluntarily eaten insects, but I wish people wouldn’t pluck out that historical detail to epitomize me (“You remember, I’ve told you about John—he’s the bug-eater!”). It was so out of character for me. As a boy, I was fastidious to the point of annoying priggishness; other children would probably have enjoyed making me eat insects had the idea occurred to them, but I wouldn’t have chosen to do so myself. Bug eating was something I matured into, and performed as a professional duty, even a public service.


Here’s how it happened. Back in 1992, the New York Entomological Society turned 100 years old, and decided to celebrate with a banquet at the map-and-hunting-trophy bedecked headquarters of the Explorers Club on East 70th Street. Yearning for attention, the Society’s leaders had the inspiration to put insects not only on the agenda but also on the menu. For hors d’oeuvres, you could try the mini fontina bruschetta with mealworm ganoush, or perhaps the wax worm fritters with plum sauce. Would you care for beetle bread with your potatoes, or are you saving room for the chocolate cricket torte? Waiter, could I get more mango dip for my water bug?

Mind you, eating insects is not so bizarre and alien a concept in most of the world. According to Gene DeFoliart, the editor of the Food Insects Newsletter (that’s right, they have a newsletter), societies outside of Europe and North America routinely eat at least some insects, sometimes because they are the closest thing to livestock that’s available. Most of the world does not share our squeamishness about eating things with antennae. Moreover, the consequences of our cultural bigotry can be serious. The U.S. and Europe largely drive the budgets for food-related research around the world, which means that most spending on raising better food animals goes to studying cows, chickens and the like. Millions if not billions of people in Africa, Asia and Latin America, however, would get much more direct benefit from knowing how to improve the fauna with six legs (or more) that provide much of their protein.

Then, too, it’s not as though most of us in America haven’t ever eaten insects. Eight million of us live in New York alone, after all, and the Board of Health can’t be everywhere. The key difference is how many insects we’ve eaten, and how aware we were of it at the time.

Read Full Article