Thursday, March 15, 2018

History of Computing II: Mouse forward

The move toward the possibility of a computer that could truly be called "personal" begins in many ways with Douglas Engelbart's question: "If in your office, you as an intellectual worker were supplied with a computer display backed up by a computer that was alive for you all day and was instantly responsive, how much value could you derive from that?" The question was posed on December 9, 1968, at what has come to be called the "Mother of All Demos," where Engelbart and his team from the Augmentation Research Center at Stanford Research Institute in Menlo Park, California, presented the system they had been building for years. Unlike anyone else in 1968, Engelbart had some concrete answers to this seemingly abstract question: he was seated at a console that included a chorded keyboard (not unlike the one employed by the operator of the "Voder" at the 1939 World's Fair) as well as the first operational three-button mouse, which Engelbart had designed together with Bill English starting in 1963. Using this interface, along with an audio and video projector, Engelbart demonstrated the other capacities of his system, which included collapsible and relational menus, a simple mapping system, a text editor, and basic programming tools.

Of course, the computers that backed up Engelbart's console were still massive, and required a number of other human operators and technicians (he chats with several of them in the course of the demo). It would be nearly another sixteen years before advances in microchips, display screens, and hardware would enable the production of the Apple Macintosh, the first commercially successful computer to incorporate a mouse along with a graphical user interface (GUI) and some degree of WYSIWYG ("what you see is what you get") graphics. It was these technologies, much more than earlier screen-and-keyboard machines, that turned the modest interest in home computers into the revolution in personal computers that enabled the "Internet" age.

Interestingly, although Engelbart was actually depending on a remote set of machines connected over a cable, it would be a long time before the personal computer became not only an independent platform but also a means of communication. Early modems were slow and unreliable, and took hours to send long files; even then, they more often connected to a remote "host" which was itself isolated from the 'net, such as a BBS (bulletin board system). The earliest version of the Internet, known as ARPANET, grew out of plans developed for the US Department of Defense; its architecture, which linked DoD facilities with contractors and research universities, used a "distributed" set of nodes of the kind the RAND Corporation had identified as the most likely to survive a Soviet nuclear attack. Even well into the late 1980s, when I sent my first e-mail (I was then a grad student doing a work-study job at Brown's Graduate School offices), 90% of the traffic on the Internet went from one big host computer to another at universities and research institutes. I remember sending a message to someone with an odd-sounding hostname, and finding out only later that the user was in Tel Aviv, Israel!

The Internet was not opened to commercial traffic until the early 1990s, around the same time that Sir Tim Berners-Lee released his hypertext "World Wide Web" protocol and Mosaic, the first widely-used browser, came into use. This software, because it enabled computer-to-computer communication through an interface that worked in much the same way as the GUIs of individual computers (well, Macs in any case!), was the key step toward the 'net becoming a true mass medium. And it was only then that the answer, or rather answers, to Engelbart's question became clear, with billions of Internet users worldwide and global e-commerce quickly becoming a dominant means of trade and exchange throughout the developed world. And, of course, the humble "intellectual worker" -- such as yours truly -- has derived, and continues to derive, great value from all this; in the case of my most recent book, which took about six years to research, I'd estimate that, without access to Internet-based historical materials, the project would have taken at least twice as long, and cost tens of thousands of dollars in airfare to travel to and search through archives around the world.

3 comments:

  1. In terms of social value, the personal computer serves professional and academic needs as well as personal communication. It stands out as a military-based technology that has improved the human condition. It does have its downside, though, in the disposal of so many units and the ecological impact of extracting the raw materials required for their construction.

    I can relate to the professor's realization of the difference in time and space between the analogue and digital eras. Without the internet, there is a great chance I would not have been able to attain my associate or bachelor's degree. It has also proven indispensable for my music and writing projects. Research has truly been made mobile, as opposed to the days when I had to spend endless hours in libraries looking for texts and going over microfilms. While there is an air of nostalgia about those hard-copy media days, I've had no qualms about adapting to the digital, virtual materials of recent history.

      No question, the digitization of older material -- books and newspapers and academic journals -- has been a huge boon. The books I've written in the Internet era would have taken twice as long to complete without these resources. On the other hand, this boon has led libraries around the world to get rid of an increasing portion of their physical book collections, on the rationale that they're available digitally -- that, I think, is a colossal mistake.

  2. I find it interesting that the government grabs hold of and funds most important technologies. There is so much capacity for evil in the ability to instantly share information of any kind, as we've found out with the revelations about the lack of privacy in the Facebook realm. The internet and computers seem impossible to fully rein in, and I'm surprised they have become so universally used given that fact. I would have thought that governments would want to heavily regulate them in the name of "safety." The business of cyber security is certainly fascinating.
