
Oct 8, 2020

The prefix “cyber” is pretty common in our vernacular today. Actually, it was more common in the 90s and now seems reserved mostly for governmental references. But the prefix has a rich history. We got cyborg in 1960 from Manfred Clynes and Nathan S. Kline. And X-Men issue 48 in 1968 introduced a race of robots called Cybertrons, likely the inspiration for the name of the planet the Transformers would inhabit as they morphed out of the Japanese Microman and Diaclone toy lines.

We got cyberspace from William Gibson in 1982 and cyberpunk from the underground art scene in the 1980s. We got cybersex in the mid-90s with AOL. The term cybercrime rose to prominence in that same timeframe, its use formalized by the G8 Lyon Group on High-Tech Crime. And we got cybercafes, cyberstalking, cyberattack, cyberanarchism, cyberporn, and even cyberphobia, although some of those sound kinda ick.

And so today, the prefix cyber evokes the culture of computers, information technology, and virtual reality, and the meaning is pretty instantly identifiable. But where did it come from? The word is actually short for cybernetic, from the Greek for skilled in steering or governing.

Cybernetics is a multi-disciplinary science, or pseudo-science depending on who you talk to, that studies systems. And it's defined in its truest form by the original 1948 definition from the author who pushed it into the mainstream, Norbert Wiener: “the scientific study of control and communication in the animal and the machine.”

Aaaactually, let's back up a minute. French physicist André-Marie Ampère coined the term cybernétique in 1834 as part of his attempt to classify human knowledge. His work on electricity and magnetism would earn him the honor of having the ampere named after him. But jump forward to World War Two. After huge strides in General Systems Theory, negative feedback loops, and the amazing work done at Bell Labs, MIT's Jay Forrester (who would invent magnetic-core computer memory) and Gordon Brown, who defined automatic-feedback control systems and solidified servomechanisms, or servos, were applying systems thinking all over the place. That same thinking led Forrester to management, resulting in the MIT Sloan School of Management. Deming applied these concepts to process, resulting in Total Quality Management, which has been a heavy influence on what we call Six Sigma today. And John Boyd would apply systems thinking and feedback loops to military strategy. So a lot of people around the world were taking a deeper look at process, feedback, loops, and systems in general.
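The core idea behind those automatic-feedback control systems and servos is simple enough to sketch in a few lines. Here's a minimal, hypothetical illustration (the names are my own, not from any particular control library) of a proportional negative feedback loop: measure the error between a setpoint and the current state, then correct against that error.

```python
# A minimal sketch of a negative feedback loop, the principle behind
# servomechanisms: measure the error between where you want to be and
# where you are, and apply a correction proportional to that error.

def feedback_loop(setpoint, state, gain=0.5, steps=20):
    """Drive `state` toward `setpoint` using proportional correction."""
    history = [state]
    for _ in range(steps):
        error = setpoint - state   # measure the deviation
        state += gain * error      # correct against it (negative feedback)
        history.append(state)
    return history

# The state converges toward the setpoint rather than drifting away.
trajectory = feedback_loop(setpoint=100.0, state=0.0)
print(round(trajectory[-1], 2))  # → 100.0
```

With each pass through the loop the error shrinks, which is exactly the self-correcting behavior that a thermostat, an autopilot, or an anti-aircraft gun predictor exploits.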

During World War II, systems thinking was on the rise. And seeing the rise of the computer, Norbert Wiener worked on anti-aircraft guns and was looking into what we now call information theory at about the same time Claude Shannon was. Whereas Shannon went on to formalize Information Theory, Wiener formalized his work as cybernetics. He had published “A Simplification of the Logic of Relations” back in 1914, so he wasn't new to this philosophy of melding systems and engineering. But things were moving quickly. ENIAC had gone live in 1946. Claude Shannon published a paper in 1948 that would emerge as the book “The Mathematical Theory of Communication” by 1949. So Wiener published his own book, Cybernetics: Or Control and Communication in the Animal and the Machine, in 1948. And Donald MacKay was releasing his work on multiplication and division by electronic analogue methods in 1948 in England. Turing's now famous work during World War II had helped turn the tides, and after the war he was working on the Automatic Computing Engine. John von Neumann had gone from developing game theory, to working on the Manhattan Project and nuclear bombs, to working with ENIAC, to working on computing at Princeton and starting to theorize on cellular automata. J.C.R. Licklider was just discovering the computer while working on psychoacoustics research at Harvard - work that would propel him to become the Johnny Appleseed of computing and the instigator at the center of what we now call the Internet and personal computers.

Why am I mentioning so many of the great early thinkers in computing? Because while Wiener codified Cybernetics, he was not solely responsible for it. In fact, Cybernetics was also the name of a set of conferences held from 1946 to 1953 and organized by the Josiah Macy, Jr. Foundation. These conferences, that foundation, and the principles that sprang from them and spread around the world were far more influential on Western computing in the 50s and 60s than they are usually given credit for.

All of those people mentioned, and dozens of others responsible for so many massive discoveries, were at those conferences and in the clubs around the world that sprang up from their alumni. The organizers were looking for polymaths who could connect dots and for deep thinkers in specialized fields, to bring science forward through an interdisciplinary lens. In short, we had gone beyond a time when a given polymath could excel at every aspect of the physical sciences and into a world where we needed brilliant specialists connected with those polymaths to make quantum leaps in one discipline, effectively from another.

And so Wiener took his own research, sprinkled in bits from others, and formalized Cybernetics in his groundbreaking book. From there, nearly every discipline integrated the concept of feedback loops. Plato, to whom the concept can be traced back, would have been proud. And from there, the influence was massive.

The Cold War military-industrial-university complex was coming into focus. Paul Baran at RAND would read McCulloch and Pitts' work from cybernetics on neural nets and use it as inspiration for packet switching. That work, and the work of many others in the field, is now the basis for how computers communicate with one another. The Soviets, beginning with Glushkov, would bury cybernetics and dig it up from time to time, restarting projects to network their cities and automate the command and control economy. Second-order cybernetics would emerge to address observing systems, and third-order cybernetics would emerge as applied cybernetics built on the first and second orders. We would get system dynamics, behavioral psychology, cognitive psychology, organizational theory, neuropsychology, and the list goes on.

The book would go into a second edition in 1961. While at MIT, Wiener was also influential in early theories around robotics and automation - applied cybernetics. But at the Dartmouth workshop in 1956, John McCarthy, along with Marvin Minsky and Claude Shannon, would effectively split off part of the field into what they called artificial intelligence. The book Emergence is an excellent look at applying these philosophies to ant colonies and analogizing what human enterprises can extract from that work. Robotics is made possible by self-correcting mechanisms, in the same way learning organizations and self-organization factor in. Cybernetics led to control theory, dynamic systems, and even chaos theory. We've even grown to bring biocybernetics into ecology, synthetic and systems biology, engineering, and even management.

The social sciences have been heavily inspired by cybernetics. Attachment theory, the cognitive sciences, and psychovector analysis are areas where psychology took inspiration. Sociology, architecture, law. The list goes on. 

And still, we use the term artificial intelligence a lot today. This is because we are more focused on productivity gains and the truths the hard sciences can tell us with statistical modeling than with the feedback loops and hard study we can apply to correcting systems. I tend to think this is related to what we might call “trusting our guts.” Or just moving so fast that it’s easier to apply a simplistic formula to an array to find a k-nearest neighbor than it is to truly analyze patterns and build feedback loops into our systems. It’s easier to do things because “that’s the way we’ve always done that” than to set our ego to the side and look for more efficient ways. That is, until any engineer on a production line at a Toyota factory can shut the whole thing down due to a defect. But even then it’s easier to apply principles from lean manufacturing than to truly look at our own processes, even if we think we’re doing so by implementing the findings from another. 
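To make that contrast concrete, here's just how simplistic the "formula applied to an array" can be. This is a hypothetical, brute-force k-nearest-neighbors classifier I sketched for illustration, not code from any particular library: compute distances, sort, and take a majority vote.

```python
# Brute-force k-nearest neighbors: no feedback loop, no self-correction,
# just a distance formula applied to an array and a majority vote.
from collections import Counter

def knn_predict(points, labels, query, k=3):
    """Classify `query` by the majority label of its k nearest points."""
    # Squared Euclidean distance from the query to every known point
    dists = sorted(
        (sum((p - q) ** 2 for p, q in zip(point, query)), label)
        for point, label in zip(points, labels)
    )
    # Majority vote among the k closest neighbors
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

points = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(points, labels, (0.5, 0.5)))  # → a
```

A dozen lines, no analysis of the underlying system, and no mechanism to correct itself when it's wrong - which is exactly the trade-off between statistical shortcuts and cybernetic feedback being described here.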

I guess no one ever said organizational theory was easy. And so whether it's the impact on the Internet, the revolutions inspired in the applied and social sciences, or just that Six Sigma Black Belt we think we know, we owe Wiener and all of the others involved in the early and later days of Cybernetics a huge thank you. The philosophies they espoused truly changed the world.

And so think about this. The philosophies of Adam Smith were fundamental to a new world order in economics. At least, until Marx inspired Communism and the Great Depression inspired English economist John Maynard Keynes to give us Keynesian economics. Which is still applied to some degree, although one could argue it's applied incorrectly with stimulus checks when compared to the New Deal. Necessity is the mother of invention. So what are the new philosophies emerging from the hallowed halls of academia? Or from the rest of the world at large? What comes after Cybernetics and Artificial Intelligence? Is a tough economy when we should expect the next round of innovative philosophy, the kind that could then be applied to achieve the same productivity gains we got out of the digitization of the world? Who knows. But I'm optimistic that we can get inspired - or I wouldn't have asked.

Thank you for tuning in to this episode of the history of computing podcast. We are so lucky to have you. Have a great day.