
Mar 21, 2020

Today we're going to celebrate an article called "As We May Think" and its author, Vannevar Bush.

Imagine it's 1945. You see the future: instant access to all of the information in the world from a device that sits on every person's desk at the office. The modern interpretations of that future would be the Internet and personal computing. But it's 1945. Microfiche wouldn't come along for another 14 years. There is no transistor, and none of the miniaturization that would lead to microchips. Still, you've seen ENIAC, you see a path ahead, and you know where the world is going. And you share it.

That is exactly what happened in "As We May Think," an article Vannevar Bush published in The Atlantic.

Vannevar Bush was one of the great minds in early computing. He got his doctorate jointly from MIT and Harvard in 1916 and went into the private sector. During World War I he built a submarine detector, then went back to MIT, splitting his time between academic pursuits, inventing, and taking inventions to market. He worked with the American Radio and Research Corporation (AMRAD), made millions off an early thermostat company, and co-founded the American Appliance Company, now known as the defense contracting powerhouse Raytheon.

By 1927 computing began to tickle his fancy, and he built a differential analyzer: a mechanical analog computer to do all the maths! He would teach at MIT, penning texts on circuit design; his work would influence the great Claude Shannon, and his designs would be used in early codebreaking computers. He would become a Vice President of MIT as well as the Dean of the MIT School of Engineering.

Then came World War II. He went to work at the Carnegie Institution of Washington, now the Carnegie Institution for Science, where he was exposed to even more basic research than during his time at MIT. He then sat on, and later chaired, the National Advisory Committee for Aeronautics, which would later become NASA, helping get the Ames Research Center and Glenn Research Center started.

Seems like a full career? Nah, just getting started! 

In 1940 he went to President Roosevelt and got the National Defense Research Committee approved. There, they developed antiaircraft guns and radar, and funded the development of ENIAC. Roosevelt then made him head of the Office of Scientific Research and Development, which worked on developing the proximity fuse. He also recruited Robert Oppenheimer to run the Manhattan Project and was there in 1945 for the Trinity test, to see the first nuclear bomb detonated.

And that is when he lost a major argument. Rather than treat nuclear weapons the way the international community had treated biological weapons, the world would enter a nuclear arms race. We still struggle with that fallout today.

He would publish "As We May Think" in The Atlantic that year and inspire the post-World War II era of computing in a few ways. The first was funding. He was the driving force behind the National Science Foundation. And he advised a lot of companies and US government agencies on R&D through his remaining years, sitting on boards, acting as a trustee, and even serving as a regent of the Smithsonian.

Another was inspiration. "As We May Think" laid out a vision. Based on all of the basic and applied research he had been exposed to, he was able to see the convergence that would come decades later. ENIAC would usher in the era of mainframes. But things would get smaller. Cameras and microfilm and the parsing of data would put more information at our fingertips than ever. An explosion of new information would follow out of all of this research, and we would need to parse it with a machine he called the memex: the collective memory of the world.

But he warned that an arms race could lead us to destroy the world first.

Ironically, it was the arms race that in many ways made Bush's predictions come true. The advances made in computing during the Cold War were substantial. The arms race wasn't just about building bigger and deadlier nuclear weapons; it also brought us into the era of transistorized computing, then minicomputers, and of course ARPANET.

And then, around the time the government was defunding basic research due to Vietnam, costs had come down enough for Commodore, Apple, and RadioShack to flood the market with inexpensive computers, and for the various networks to be merged into the Internet. The course we are on today was set.

I can almost imagine Bush sitting in a leather chair in 1945 trying to figure out if the powers of creation or the powers of destruction would win the race to better technology. And I’m still a little curious to see how it all turns out. 

The part of his story that is so compelling is information. He predicted that machines would help unlock even faster research, let us make better decisions, and ultimately elevate human consciousness. Doug Engelbart saw it. The engineers at Xerox saw it. Steve Jobs made it accessible to all of us. And we should all look to further that cause.

Thank you for tuning in to yet another episode of the History of Computing Podcast. We are so very lucky to have you.