Overview: The Information Age
The Information Age, also known as the Computer Age or Digital Age, is a term for the current period in history, in which individuals can transfer information freely and have instant access to it. The Information Age came about by capitalizing on advances in computer microminiaturization, with a transition spanning from the advent of the personal computer in the late 1970s to the Internet reaching critical mass in the early 1990s, followed by the public's adoption of these technologies in the two decades after 1990. By driving a rapid evolution of technology, the Information Age has enabled global communications and networking to shape modern society.
The Internet
While the roots of innovations like the personal computer and the Internet go back to the 1960s and massive Department of Defense spending, it was in the 1980s and 1990s that these technologies became part of everyday life. The Internet was first conceived as a fault-tolerant network that could connect computers while remaining resistant to the failure of any single point: the Internet cannot be totally destroyed in one event, and if large areas are disabled, information is easily rerouted around them. At this initial stage, its only software applications were e-mail and computer file transfer.
Though the Internet itself had existed since 1969, it was the invention of the World Wide Web in 1989 by two computer scientists, Tim Berners-Lee and Robert Cailliau, followed by its implementation in 1991, that made the Internet a truly global network. Today, the Internet has become the ultimate platform for accelerating the flow of information. It is presently the fastest-growing form of media and is gradually pushing many other forms of media toward obsolescence.
Progression
Library expansion was calculated in 1945 by writer, inventor, and librarian Fremont Rider to double in capacity every 16 years, if sufficient space were made available. Rider advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on demand for library patrons and other institutions. He did not foresee the digital technology that would follow decades later and replace analog microform with digital imaging, storage, and transmission media. Automated, potentially lossless digital technologies allowed vast increases in the rapidity of information growth. Moore's law, the observation that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years, was first formulated by Gordon Moore in 1965.
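Both doubling rules above (library capacity every 16 years, transistor counts roughly every two years) are instances of simple exponential growth, which can be sketched as follows. The starting values here, a hypothetical one-million-volume library and the roughly 2,300-transistor Intel 4004 of 1971, are illustrative assumptions rather than figures from the text:

```python
def doublings(start_value, years, doubling_period):
    """Project a quantity that doubles every `doubling_period` years."""
    return start_value * 2 ** (years / doubling_period)

# Rider's rule: a library of 1 million volumes, doubling every 16 years.
# 80 years gives 5 doublings, i.e. a 32-fold increase.
library = doublings(1_000_000, 80, 16)
print(f"Library volumes after 80 years: {library:,.0f}")

# Moore's law: a chip of ~2,300 transistors (Intel 4004, 1971),
# doubling every 2 years. 50 years gives 25 doublings.
chip = doublings(2300, 50, 2)
print(f"Transistors after 50 years: {chip:,.0f}")
```

The contrast in doubling periods is the whole story: five doublings in 80 years for print libraries versus twenty-five doublings in 50 years for integrated circuits.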
The proliferation of smaller and less expensive personal computers and improvements in computing power by the early 1980s resulted in a sudden access to and ability to share and store information for more and more workers. Connectivity between computers within companies led to the ability of workers at different levels to access greater amounts of information.
Information Transmission
The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of optimally compressed information in 1986, 715 exabytes in 1993, 1.2 zettabytes in 2000, and 1.9 zettabytes in 2007 (the informational equivalent of 174 newspapers per person per day). The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of optimally compressed information in 1986, 471 petabytes in 1993, 2.2 exabytes in 2000, and 65 exabytes in 2007 (the informational equivalent of 6 newspapers per person per day).
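These figures imply very different growth rates for broadcast and two-way capacity. A minimal sketch of the implied compound annual growth rate (CAGR), computed from the 1986 and 2007 endpoints quoted above:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Broadcast: 432 exabytes (1986) -> 1.9 zettabytes = 1,900 exabytes (2007)
broadcast = cagr(432, 1900, 2007 - 1986)

# Two-way: 281 petabytes = 0.281 exabytes (1986) -> 65 exabytes (2007)
two_way = cagr(0.281, 65, 2007 - 1986)

print(f"Broadcast capacity: ~{broadcast:.0%} per year")
print(f"Two-way capacity:   ~{two_way:.0%} per year")
```

This works out to roughly 7% annual growth for one-way broadcast versus roughly 30% for two-way exchange, consistent with the broader picture of interactive networks expanding far faster than broadcast media.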
In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses, at home, and around the globe. Technology was developing so quickly that a computer costing $3,000 in 1997 would cost $2,000 two years later and only $1,000 the following year.
The Rise of Information-Intensive Industry
Industry is becoming more information-intensive and less labor- and capital-intensive. This trend has important implications for the workforce: workers are becoming increasingly productive as the value of their labor decreases. However, there are also important implications for capitalism itself; not only has the value of labor decreased, but the value of capital has also diminished. In the classical model, investments in human capital and financial capital are important predictors of the performance of a new venture. However, as demonstrated by Mark Zuckerberg and Facebook, it now seems possible for a group of relatively inexperienced people with limited capital to succeed on a large scale.
Technology and Culture
Like most technology-driven periods of transformation, the information age was greeted with a mixture of hope and anxiety upon its arrival. In the late 1970s and early 1980s, computer manufacturers like Apple, Commodore, and Tandy began offering fully assembled personal computers. (Previously, personal computing had been accessible only to those adventurous enough to buy expensive kits that had to be assembled and programmed.) In short order, computers became a fairly common sight in businesses and upper-middle-class homes. Soon, computer owners, even young kids, were launching their own electronic bulletin board systems, small-scale networks that used modems and phone lines, and sharing information in ways not dreamed of just decades before. Computers, it seemed, held out the promise of a bright, new future for those who knew how to use them.
[Image: the rise of the personal computer; an advertisement for the Apple II that appeared in Byte magazine in 1977.]
Casting shadows over the bright dreams of a better tomorrow were fears that the development of computer technology would create a dystopian future in which technology became the instrument of society's undoing. Film audiences watched a teenaged Matthew Broderick hack into a government computer and nearly start a nuclear war in WarGames, Angelina Jolie being chased by a computer genius bent on world domination in Hackers, and Sandra Bullock watching helplessly as her life is turned inside out by conspirators who manipulate her virtual identity in The Net. Clearly, the idea of digital network connections as the root of our demise resonated in this period of rapid technological change.