There are quite a few subjects in the post that deserve treatment, but one theme that runs through them all is choice.
If you are conscious of your habits, it is not difficult to immerse yourself in a deep and lengthy work. However, people, and students in particular, choose not to read information in full context. How much of that is due to the way they have been taught to read and how much to the style of the writing is debatable.
To tell you the truth, the last sentence of the post is probably my favorite:
"(I note a third hypothesis, also related to economics. Since information has become more important and more valuable, some authors have gotten much better at producing it. High quality information lowers patience with the lower quality forms. Once you've read a really well-written microeconomics textbook, say, it's much harder to wade through a poorly-written one, or a poorly written text on any subject.)"
Television is a good parallel that I can use as an example of why I think this is true. OMG, I didn't mean that. I am giving an example of production volume, not quality. Don't worry, I will get to finding quality a bit further on.
As with the Internet, cable and satellite bandwidth far outstrips the content a given individual wishes to consume.
Viewing habits are a good indicator. I survey my freshmen every semester on how many network channels they watch in a given week.
In a class of 25, the median is 7 to 9 with very little variation. There will be one student with no television, one watching around 15, and one watching over 20.
This seems at odds with the availability of several hundred alternatives. Because I am a technology and society guy, I only survey news content viewing as a measure of choice variation. Again, there is very little spread. 80% of the average class watches one of only three channels out of approximately 25.
How much of the choice is due to poor interfaces? A Finnish researcher claimed in 2005 that our interactive (haptic) interfaces haven't changed in over 100 years. He based this on comparing current technology with turn-of-the-century mechanical arcade games that used pressure sensors to deliver feedback.
In the case of television, development capital is scarce because of the low chance of economic success. Consequently both content and quality are scarce in comparison to bandwidth. (Think repeat of last season yet again.) In the case of Internet content, the development capital threshold is so low my students can afford it. Content creation has been democratized to an amazing extent. This makes the process of making a reading or viewing choice so difficult that we need the help of a search engine. Quality has to be treated differently, as you will see.
The famous failure of full text search to live up to expectations is also to blame for a good portion of the reading statistics. Searches turn up hundreds of thousands of hits, a good number of which must be called up just to be eliminated from consideration. That behavior shows up in the so-called "bounce rate," but a bounce is not a meaningful signal when a reader lands on exactly the page he is looking for and leaves satisfied. Bounce rate is really a measure of depth of site penetration, not of content quality or relevance.
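To make the metric concrete: bounce rate is usually computed as the share of visits that view only one page before leaving. Here is a minimal sketch of that calculation, using entirely hypothetical session data, which shows why a single-page visit says nothing about whether the reader was satisfied or frustrated.

```python
# Minimal sketch: bounce rate as the share of sessions that
# view exactly one page. The session data below is hypothetical.
sessions = [
    ["/home", "/about", "/contact"],  # three pages viewed
    ["/article"],                     # a "bounce": one page, then gone
    ["/search", "/article"],
    ["/article"],                     # another single-page visit
]

bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # 2 of 4 sessions -> 50%
```

Note that the two "bounced" visitors here may have found precisely what they wanted; the metric cannot tell.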
So Google does have something to do with it (!), because Google is unable to search for a concept that can be described in many different ways. Take, for instance, a full-text search for an emergent concept that people label or describe differently. But what the heck, it isn't magic, it's just a computer. The user had better be able to bring something to the table.
To some extent, that's why Google bought GapMinder. You can see the nations of the world compared by average income, infant mortality, longevity, and GDP over time. It is easy to see that if a population is healthy, average income will increase more rapidly than it does in a country where health care remains poor while economic development is happening.
GapMinder is an example of the semantic web, and of how it may well cause an accelerated discovery revolution comparable to the rapid-discovery revolution of the 1600s.
Filtering noise effectively will be the next challenge, and meeting it will allow the stress to subside. But the challenge is huge. How do you decide, in a meaningful way, whether to exclude news from Fox or the Competitive Enterprise Institute?
I feel that there are two things that will reduce the stress of overload. First, open access to professional communities. Second, semantic utilities with tiered relationships (tiered in terms of technical knowledge, with links outward to either collections of data or critical evaluations of relationships).
Is this just another whiz-bang computer nerd technology solution? That's what I heard in my SIG meeting when I proposed we sponsor a DOE symposium on one of their collaboration pieces. But Raymond Orbach (DOE) and Abe Leiderman (Deep Web Technologies) have produced good work with federated data previously and I see no reason to think that extending it a bit is unreasonable.
Hyperlinks and search results are seductive in the way wandering in the library is. Debating the percentage of the population susceptible to wandering off topic in a given circumstance is probably not productive. Again, it's choice.
Does Google make us stupid? Only to the extent that using a tool without knowing how full-text search and popularity ranking work can produce ineffective and flawed work. (Student papers. rofl rofl) Using other techniques, such as reading in full context and gauging professional reputation, can help stabilize the work. In other words, making good choices is central to success in search.
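Knowing even roughly how popularity ranking works helps a searcher read results critically. Here is a minimal sketch of the idea in the PageRank style, run on a hypothetical three-page link graph; pages collect rank from the pages that link to them, which is why heavily linked pages rise to the top regardless of quality.

```python
# Minimal sketch of popularity ranking in the PageRank style,
# on a hypothetical three-page link graph.
links = {
    "A": ["B", "C"],  # page A links to B and C
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
ranks = {page: 1 / len(links) for page in links}

for _ in range(50):  # power iteration until the ranks settle
    new_ranks = {page: (1 - damping) / len(links) for page in links}
    for page, outlinks in links.items():
        share = damping * ranks[page] / len(outlinks)
        for target in outlinks:
            new_ranks[target] += share
    ranks = new_ranks

# C ends up ranked highest: both other pages link to it.
print(sorted(ranks, key=ranks.get, reverse=True))
```

The point for a student: the top result is the most linked-to page, not necessarily the most accurate or relevant one, which is why reading in full context still matters.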