John Hudson at The Atlantic with the round-up
Chris Anderson at Wired:
You wake up and check your email on your bedside iPad — that’s one app. During breakfast you browse Facebook, Twitter, and The New York Times — three more apps. On the way to the office, you listen to a podcast on your smartphone. Another app. At work, you scroll through RSS feeds in a reader and have Skype and IM conversations. More apps. At the end of the day, you come home, make dinner while listening to Pandora, play some games on Xbox Live, and watch a movie on Netflix’s streaming service.
You’ve spent the day on the Internet — but not on the Web. And you are not alone.
This is not a trivial distinction. Over the past few years, one of the most important shifts in the digital world has been the move from the wide-open Web to semiclosed platforms that use the Internet for transport but not the browser for display. It’s driven primarily by the rise of the iPhone model of mobile computing, and it’s a world Google can’t crawl, one where HTML doesn’t rule. And it’s the world that consumers are increasingly choosing, not because they’re rejecting the idea of the Web but because these dedicated platforms often just work better or fit better into their lives (the screen comes to them, they don’t have to go to the screen). The fact that it’s easier for companies to make money on these platforms only cements the trend. Producers and consumers agree: The Web is not the culmination of the digital revolution.
A decade ago, the ascent of the Web browser as the center of the computing world appeared inevitable. It seemed just a matter of time before the Web replaced PC application software and reduced operating systems to a “poorly debugged set of device drivers,” as Netscape cofounder Marc Andreessen famously said. First Java, then Flash, then Ajax, then HTML5 — increasingly interactive online code — promised to put all apps in the cloud and replace the desktop with the webtop. Open, free, and out of control.
But there has always been an alternative path, one that saw the Web as a worthy tool but not the whole toolkit. In 1997, Wired published a now-infamous “Push!” cover story, which suggested that it was time to “kiss your browser goodbye.” The argument then was that “push” technologies such as PointCast and Microsoft’s Active Desktop would create a “radical future of media beyond the Web.”
“Sure, we’ll always have Web pages. We still have postcards and telegrams, don’t we? But the center of interactive media — increasingly, the center of gravity of all media — is moving to a post-HTML environment,” we promised nearly a decade and a half ago. The examples of the time were a bit silly — a “3-D furry-muckers VR space” and “headlines sent to a pager” — but the point was altogether prescient: a glimpse of the machine-to-machine future that would be less about browsing and more about getting.
Michael Wolff at Wired:
An amusing development in the past year or so — if you regard post-Soviet finance as amusing — is that Russian investor Yuri Milner has, bit by bit, amassed one of the most valuable stakes on the Internet: He’s got 10 percent of Facebook. He’s done this by undercutting traditional American VCs — the Kleiners and the Sequoias who would, in days past, insist on a special status in return for their early investment. Milner not only offers better terms than VC firms, he sees the world differently. The traditional VC has a portfolio of Web sites, expecting a few of them to be successes — a good metaphor for the Web itself, broad not deep, dependent on the connections between sites rather than any one, autonomous property. In an entirely different strategic model, the Russian is concentrating his bet on a unique power bloc. Not only is Facebook more than just another Web site, Milner says, but with 500 million users it’s “the largest Web site there has ever been, so large that it is not a Web site at all.”
According to Compete, a Web analytics company, the top 10 Web sites accounted for 31 percent of US pageviews in 2001, 40 percent in 2006, and about 75 percent in 2010. “Big sucks the traffic out of small,” Milner says. “In theory you can have a few very successful individuals controlling hundreds of millions of people. You can become big fast, and that favors the domination of strong people.”
Milner sounds more like a traditional media mogul than a Web entrepreneur. But that’s exactly the point. If we’re moving away from the open Web, it’s at least in part because of the rising dominance of businesspeople more inclined to think in the all-or-nothing terms of traditional media than in the come-one-come-all collectivist utopianism of the Web. This is not just natural maturation but in many ways the result of a competing idea — one that rejects the Web’s ethic, technology, and business models. The control the Web took from the vertically integrated, top-down media world can, with a little rethinking of the nature and the use of the Internet, be taken back.
This development — a familiar historical march, both feudal and corporate, in which the less powerful are sapped of their reason for being by the better resourced, organized, and efficient — is perhaps the rudest shock possible to the leveled, porous, low-barrier-to-entry ethos of the Internet Age. After all, this is a battle that seemed fought and won — not just toppling newspapers and music labels but also AOL and Prodigy and anyone who built a business on the idea that a curated experience would beat out the flexibility and freedom of the Web.
Matt Buchanan at Gizmodo:
Chris Anderson’s new Big Idea—that the open web is giving way to a mere transport system for closed or semiclosed platforms like Facebook or iPhone apps from the App Store—is not very new. In its current iPhone-y, app-y incarnation, it’s at least a couple of years old. Wired even participates in the very phenomenon it bemoans, with its very fancy iPad app. (Because it has to: “The assumption had been that once the market matured, big companies would be able to reverse the hollowing-out trend of analog dollars turning into digital pennies. Sadly that hasn’t been the case for most on the Web, and by the looks of it there’s no light at the end of that tunnel.”) And the general idea itself goes back even further—Wired proclaimed the browser was dead in 1997, as he points out.
It’s true that the open, free-for-all web is besieged, but in a lot of ways Anderson doesn’t mention, like the potential neutering of net neutrality principles or the ongoing bandwidth crimp that could hamper innovative-but-data-intensive services—and, in turn, push users toward the kind of boxed services (cable VOD or ISP preferred content) that has Anderson so nerve-wracked. Like Comcast giving preferred access to NBC’s content by not counting it toward your monthly data allowance (since Comcast owns half of NBC now), or Verizon speeding up YouTube over Vimeo. You can look at it as a hardware problem vs. a software problem—and if the hardware is screwed, so is the software.
Erick Schonfeld at TechCrunch:
These shifts happen in waves. First the browser took over everything, then developers wanted more options and moved to apps (desktop and mobile), but the browser will eventually absorb those features, and so the leapfrogging continues. The ubiquity of the browser overcomes most of its technical deficiencies. Even in mobile, people will become overwhelmed by apps and the browser will make a comeback.
Rob Beschizza at Boing Boing:
Wired uses this graph to illustrate Chris Anderson and Michael Wolff’s claim that the world wide web is “dead.”
Their feature, The Web is Dead. Long Live the Internet, is live at Wired’s own website.
Without commenting on the article’s argument, I nonetheless found this graph immediately suspect, because it doesn’t account for the increase in internet traffic over the same period. The use of proportion of the total as the vertical axis, instead of the actual total, is an interesting editorial choice.
You can probably guess that total use increases so rapidly that the web is not declining at all. Perhaps you have something like this in mind:
In fact, between 1995 and 2006, the total amount of web traffic went from about 10 terabytes a month to 1,000,000 terabytes (or 1 exabyte). According to Cisco, the same source Wired used for its projections, total internet traffic rose then from about 1 exabyte to 7 exabytes between 2005 and 2010.
So with actual total traffic as the vertical axis, the graph would look more like this.
Clearly on its last legs!
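Beschizza’s arithmetic can be sketched in a few lines of Python. The 1-exabyte and 7-exabyte totals are the Cisco figures quoted above; the web-share percentages are hypothetical stand-ins for the Wired graph, chosen only to show how a halved share can coexist with sharply growing absolute traffic:

```python
# Illustrative sketch of Beschizza's point: a category's *share* of
# traffic can fall while its *absolute* volume grows.
# Totals (EB/month) are the Cisco figures cited above; the web-share
# percentages are assumed values for illustration only.
totals = {2005: 1.0, 2010: 7.0}        # total internet traffic, exabytes/month
web_share = {2005: 0.50, 2010: 0.25}   # hypothetical share labeled "web"

for year in (2005, 2010):
    absolute = totals[year] * web_share[year]
    print(f"{year}: web share {web_share[year]:.0%}, "
          f"absolute web traffic {absolute:.2f} EB/month")
```

Under these assumptions, the web’s share halves while its absolute traffic more than triples, from 0.50 to 1.75 exabytes a month — which is exactly why plotting proportions rather than totals makes the web look moribund.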
Matthew Ingram at Gigaom:
As with some of his other popular writings, Anderson seems to be coming to this realization rather late in the game, and has resorted to a sensationalized headline to grab some attention. We at GigaOM (and plenty of others who cover the web and technology space) have been writing and talking about the rise of the app economy — and particularly the rise of mobile apps thanks to the iPhone, as well as the iPad and Google’s Android platform — for more than two years now. As Om has pointed out on a number of occasions, the success of Apple’s iPhone and application store has accelerated the evolution of the web from a free-for-all to a selection of specific apps for specific needs.
Om’s favorite comparison is to the real world of home appliances: we don’t just have a single all-purpose appliance — instead, we have toasters and coffee-makers and can-openers and other devices that perform specific tasks. So, too, we now have applications for maps, applications for photos, applications for reading books, and apps for video and location-based “check-ins” and dozens of other things. That doesn’t mean the web is dead; it means that the web, and the way we use it, is evolving. Instead of wandering around on the web looking for interesting websites by using services such as Yahoo or AOL, we’re using task-specific devices in a sense.
Anderson is right in a technical sense when he says that the web is “just one of many applications that exist on the Internet, which uses the IP and TCP protocols to move packets around.” But he also gets it wrong when he conflates the demise of the web browser with the demise of the web itself. Plenty of applications are using web technologies such as HTTP and REST, just as web browsers do. In a sense, they’re like mini-browsers for discrete applications, and although it’s almost a footnote in the Wired piece, HTML5 has the potential to allow developers to create (as some already have) websites that look and feel and function exactly like apps do. (For more on that, read our recent GigaOM Pro piece on the potential of HTML5.) Where does that fit in the “web is dead” paradigm?
It’s also worth noting (as others have as well) that the chart Wired uses with its story is misleading, or at least the way it’s being portrayed is misleading. (It also has the wrong dates, according to TechCrunch.) It shows the amount of total U.S. Internet traffic that different types of content have accounted for over the last decade (as calculated by Cisco). At the far right-hand side of the graph, video is seen as making up a large proportion of that traffic, while something called “the web” makes up a much smaller proportion than it did in 1995. But this does little to prove Anderson’s thesis, since the bulk of video is still viewed using websites such as YouTube and Hulu — and the fact that we have a lot more video traffic than we used to isn’t exactly a revelation.
Choire Sicha at The Awl:
Between 2000 and 2010, Americans with Internet access went from 124 million to 230 million.
(The world at large, by the way, went from 393 million Internet users to 1.5 billion, but let’s keep the focus on America, right Wired? Because we’re so much more interesting and also we buy iPads.)
Rob Beschizza made a related point extremely well. He notes: “According to Cisco, the same source Wired used for its projections, total internet traffic rose then from about 1 exabyte to 7 exabytes between 2005 and 2010.”
So, just in terms of the basic Internet-using population, as “web use” “declined” by half over the last ten years as a percentage of traffic, according to Wired, real-world web activity presumably, at the same time, “stayed constant due to the doubling of Internet users” in the U.S.
Except use of the web blew up far more than that.
There are a number of other questions I have about these numbers, which are almost the only numbers in the piece, apart from a claim by Morgan Stanley that in five years, more people will use the Internet over mobile devices than over PCs.
For instance: doesn’t this chart measure data usage as traffic? Would that perhaps be why the “video” section is so swollen?
Alexis Madrigal at The Atlantic:
The problem is Anderson’s assumption about the way technology works. Serious technology scholars long ago discarded the idea that tech was just a series of increasingly awesomer things that successively displace each other. Australian historian Carroll Pursell, in reviewing Imperial College London professor David Edgerton’s The Shock of the Old, summarized the academic thinking nicely:
An obsession with ‘innovation’ leads to a tidy timeline of progress, focusing on iconic machines, but an investigation of ‘technology in use’ reveals that some ‘things’ appear, disappear, and reappear…
Edgerton has the same flair for the flashy stat that Anderson does. For example, to illustrate the point that newer and older technologies happily coexist, he notes that the Germans used more horses in World War II than the British did in World War I. More prosaically, some of the electricity for your latest gadget was probably made in a power plant that’s decades old. Many ways to bind pieces of paper — staplers, binders, paper clips, etc. — remain in common usage (“The Paperclip Is Dead!”). World War I pilots used to keep homing pigeons tucked inside their cockpits as communication tools (see above). People piloting drones and helicopters fight wars against people who use machetes and forty-year-old Soviet machine guns; all these tools can kill effectively, and they all exist right now together.
But that’s not how Anderson presents technology in this article. Instead, technologies rise up and destroy each other. And there’s nothing you or I can do to change the course of these wars. This is the nature of technology and capitalism, and there is not much room for individual decisionmaking or social influence in the algorithm.
Ryan Tate at Gawker:
Where did this argument first appear? Funny you should ask!
- Irony 1: Wired released its cover story package first to the Web, on Wired.com. You won’t find it in Wired’s iPad edition, and it’s not out in print yet. The death of the web might be the “inevitable course of capitalism,” but it apparently pays better to deliver that news via a dying medium.
- Irony 2: Revenue is up at Wired’s profitable website this year, despite a fairly severe reduction in staff last year. Yet Anderson, who has no control over Wired.com, writes that most Web publishers haven’t been able to “reverse the hollowing-out trend of analog dollars turning into digital pennies… and by the looks of it there’s no light at the end of that tunnel.” That tunnel being the one Wired, itself, is not in, apparently.
- Irony 3: At the same time, circulation — and thus revenue, almost surely — is down for Wired’s iPad edition, which was approaching (and possibly even surpassing) 100,000 copies for the debut issue but has since fallen off — to less than a fourth of what it was, one source claims. However large or small the decline, it could certainly be corrected; dropping off from a big-bang launch is common enough in print and online media alike. But Wired’s iPad tumble does raise the possibility that Anderson is speaking as much from his hopes as from his analysis when he writes, “We are choosing a new form of Quality of Service: custom applications that just work.” The iPad team belongs to Anderson, after all (unlike, again, the web team).
- Irony 4: Isn’t this the guy who wrote a book called Free and noted, “You know this freaky land of free as the Web. A decade and a half into the great online experiment, the last debates over free versus pay online are ending”? Eh, maybe not so much; Anderson today writes, “Much as we love freedom and choice, we also love things that just work, reliably and seamlessly. And if we have to pay for what we love, well, that increasingly seems OK.”
To his credit, Anderson also runs a feature in which publishers Tim O’Reilly and John Battelle get the opportunity to basically tell the editor he’s nuts. (Battelle: “Splashing ‘The Death of the Web’ on the cover might be, well, overstating the case just a wee bit.”) In the online package, Wired.com editor Evan Hansen does likewise (“the web is far too powerful to be replaced by an alternative that gives away so much of what developers and readers have come to love and expect”).
Like any provocative editor, in other words, Anderson has people talking. (See also this take from Rob Beschizza at BoingBoing and from blogging pioneer Dave Winer; TechMeme has more reaction.) Now we get to sit back and watch as the author/consultant/editor tries to explain why nearly the entire conversation about the Death of the Web is happening on the Seemingly Quite Alive Web. That should be, at the very least, entertaining.