Doug Engelbart: The Interview
We often forget that people conceived and designed the tools we use, and that a window or a mouse was not predestined to be the computer metaphor of our age.
In 1968, in front of 2,000 people, Douglas Engelbart introduced concepts that, for millions of people, now define the experience of using a computer. At the Fall Joint Computer Conference in San Francisco, he gave the first public demonstration of the "mouse," the "window," and the "point-and-click" metaphor. This "Mother of All Demos" reverberated around the world and permanently altered how we experience computers.
The most remarkable aspects of his vision were its clarity -- that computers could help everyday people work and think -- and the way he went about turning this idea into a reality.
Doug's projects were funded by the same agency that funded the creation of the Internet. You could say that he worked on the "front end" -- the software that would allow the network to be experienced by non-experts. To a large degree, he was successful: hypertext, the web, and the icons that you click on were all inspired by Doug's work.
Howard Rheingold's Tools For Thought gives some good background on Doug Engelbart, especially in Chapter Nine, "The Loneliness of the Long Distance Thinker."
Doug now leads The Bootstrap Institute. I spoke with him recently for an hour or so. We discussed his early days in computing, how he created his milestone metaphors, and whether today's world of web, Windows, and mice matches his expectations.
Here's the first part of a three-part interview with Doug Engelbart, in his own words.
David Bennahum: What was it about computing that drew you in? What had you read about computers that inspired you to think about them as a tool to help people think better?
Doug Engelbart: Well, I'd read some things beginning in '48, just semi-popular things about how digital computers were emerging and some of the things they could do. And then I'd read a book called The Giant Brain. It was a popularized picture of what computers could do. You got the picture that a computer could read punched tape or punched cards, and likewise punch cards or paper tape, or print on paper. I had enough technical background to know that if a computer could print on paper or punch cards, it could print anything you want to appear on a cathode ray tube. And if it could read those things, it could read contacts and other instruments as well, so it could enhance what you were doing. So it could interact with you: whatever you did on the keyboard (or any other things you could wiggle or push), it could compute and show on the screen.
I saw that this could let people start exploring brand-new kinds of symbology and knowledge portrayal. Then I realized: If people are equipped with similar workstations (whatever I was thinking of calling them, terminals; I'm not sure), then they could be tied to the