A major refrain in Vannevar Bush's article As We May Think is the notion of consulting – how are we to wrangle, sift, or otherwise make sense of the growing mountains of data humans are creating? Throughout his exploration of possibilities, Bush prophesies the advent of many current and emerging technologies: the digital camera – although he didn't have the terminology (page 4), the photocopier (page 4), the scientific calculator (page 7), the home computer (page 8), credit cards (page 9), the scanner (page 10), a Windows-like operating system (page 10), and arguably more – including the birth of the semantic web (page 11). Seeing how much of the world I know in 2009 was still a dream in the making in 1945, I took a brief survey of the advancements in various technologies as I have experienced them in my 38 years. Bear with me... this is illuminating considering how many of Bush's ideas have since come to exist:
The year before I was born, the 5¼-inch floppy disk was invented, followed shortly by the dot-matrix printer and the microprocessor. And although the existence of the VCR coincides with my birth, it wasn't until the late '70s that it hit the commercial market.
1972 marked the appearance of the first word processor – and the first video game, Pong. The next year, Xerox announced the creation of Ethernet. In 1975, the laser printer was invented, followed by the ink-jet printer in '76.
The first spreadsheet was released in 1978, the year before cell phones and the Cray supercomputer were invented. It's no surprise that MS-DOS and the first IBM PC share a birth year (1981). Three years later, Apple introduced the Macintosh, and CD-ROMs hit the streets. Windows was released in 1985, during my last year of middle school. By the time I graduated, digital cellular phones and high-definition television had been invented.
My freshman year of high school, I owned a Kaypro 4-84 with two 5¼-inch floppy drives, a monochrome monitor, an internal modem running at 96k baud, and a processor speed of 4 MHz. It was a portable computer weighing in at 36 pounds.
The year after I graduated, Tim Berners-Lee created the World Wide Web, along with its protocol (HTTP) and its language (HTML). Five years later, it was truly world wide. About that time, the DVD came into existence. In 2001, our relationship to information changed when Apple introduced the iPod. Half a decade later, around 2006, YouTube and Twitter both hit big.
After all that has come to pass – much of it predicted by Vannevar Bush – we are still precisely in the same situation as we were in 1945 with regard to consulting (or navigating) the great storehouse of human knowledge. We've moved past film and microfiche – both on the road to technological extinction. We're in digital land now, and we've got more information than can be conceived. Part of our modern quandary is that it isn't just scientists creating and parsing the data. The digital realm is far more egalitarian – anyone with access can create content. More than at any point in our history, the catastrophe of “truly significant attainments becom[ing] lost in the mass of the inconsequential” is a risk, especially as we move fully into the age of crowdsourcing.
We are now living in the information age, and many great minds are bent to the task of sorting out how best to deal with the volume of information our age is producing. As with many of Bush's predictions, I think he hit the nail on the head with regard to creating associative links between points of knowledge. We're already doing that with tags. Tagging may be the way information is cultivated in the future: “Selection by association, rather than by indexing, may yet be mechanized” (10).
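To make that idea a little more concrete, here is a minimal sketch of how selection by association could be mechanized with tags. It is a hypothetical Python example – the documents and tag names are invented for illustration, not drawn from any particular system:

    from collections import defaultdict

    # Each document carries free-form tags instead of a single index heading.
    # (Hypothetical documents and tags, invented for this example.)
    documents = {
        "memex-essay": {"hypertext", "information-retrieval", "bush"},
        "tagging-notes": {"tags", "information-retrieval", "folksonomy"},
        "crowdsourcing-post": {"folksonomy", "collaboration"},
    }

    # Inverted index: tag -> the documents that carry it.
    tag_index = defaultdict(set)
    for doc, tags in documents.items():
        for tag in tags:
            tag_index[tag].add(doc)

    def associated(doc):
        """Other documents sharing at least one tag with `doc`,
        ranked by how many tags they share."""
        shared = defaultdict(int)
        for tag in documents[doc]:
            for other in tag_index[tag]:
                if other != doc:
                    shared[other] += 1
        return sorted(shared, key=shared.get, reverse=True)

    print(associated("memex-essay"))  # ['tagging-notes']

Following a shared tag from one document to the next, and then out along that document's own tags, is – in miniature – the associative trail Bush imagined for the memex.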
Logging off the memex,
Mark Oppenneer