The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution
H**T
The Complexity and Beauty of Innovation according to Walter Isaacson
The Innovators by Walter Isaacson is a great book because of its balanced description of the role of geniuses or disruptive innovators as much as of teamwork in incremental innovation. “The tale of their teamwork is important because we don’t often focus on how central their skill is to innovation. […] But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology evolution was fashioned.” [Page 1] He also goes deeper: “I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.” [Page 2] “Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences.” [Page 5]

The computer

I was a little more cautious with chapter 2, as I have the feeling that the story of Ada Lovelace and Charles Babbage is well known. I may be wrong. But chapter 3, about the early days of the computer, was mostly unknown to me. Who invented the computer? Probably many different people in different locations in the US, the UK and Germany, around WWII. “How did they develop this idea at the same time when war kept their two teams isolated? The answer is partly that advances in technology and theory made the moment ripe. Along with many innovators, Zuse and Stibitz were familiar with the use of relays in phone circuits, and it made sense to tie that to binary operations of math and logic. Likewise, Shannon, who was also very familiar with phone circuits, would be able to perform the logical tasks of Boolean algebra. The idea that digital circuits would be the key to computing was quickly becoming clear to researchers almost everywhere, even in isolated places like central Iowa.” [Page 54]

There would be a patent fight I did not know about. Read pages 82-84. You can also read the following on Wikipedia: “On June 26, 1947, J. Presper Eckert and John Mauchly were the first to file for patent on a digital computing device (ENIAC), much to the surprise of Atanasoff. The ABC [Atanasoff–Berry Computer] had been examined by John Mauchly in June 1941, and Isaac Auerbach, a former student of Mauchly’s, alleged that it influenced his later work on ENIAC, although Mauchly denied this. The ENIAC patent did not issue until 1964, and by 1967 Honeywell sued Sperry Rand in an attempt to break the ENIAC patents, arguing the ABC constituted prior art. The United States District Court for the District of Minnesota released its judgement on October 19, 1973, finding in Honeywell v. Sperry Rand that the ENIAC patent was a derivative of John Atanasoff’s invention.” [The trial had begun in June 1971, and the ENIAC patent was therefore invalidated.]

I also liked his short comment about complementary skills. “Eckert and Mauchly served as counterbalances for each other, which made them typical of so many digital-age leadership duos. Eckert drove people with a passion for precision; Mauchly tended to calm them and make them feel loved.” [Pages 74-75]

Women in Technology and Science

It is in chapter 4, about programming, that Isaacson addresses the role of women. “[Grace Hopper’s] education wasn’t as unusual as you might think. She was the eleventh woman to get a math doctorate from Yale, the first being in 1895. It was not at all uncommon for a woman, especially from a successful family, to get a doctorate in math in the 1930s. In fact, it was more common than it would be a generation later. The number of American women who got doctorates in math during the 1930s was 133, which was 15 percent of the total number of American math doctorates. During the decade of the 1950s, only 106 American women got math doctorates, which was a mere 4 percent of the total. (By the first decade of the 2000s things had more than rebounded and there were 1,600 women who got math doctorates, 30 percent of the total.)” [Page 88]

Not surprisingly, in the early days of computer development, men worked more in hardware whereas women would be in software. “All the engineers who built ENIAC’s hardware were men. Less heralded by history was a group of women, six in particular, who turned out to be almost as important in the development of modern computing.” [Page 95] “Shortly before she died in 2011, Jean Jennings Bartik reflected proudly on the fact that all the programmers who created the first general-purpose computer were women. ‘Despite our coming of age in an era when women’s career opportunities were generally quite confined, we helped initiate the era of the computer.’ It happened because a lot of women back then had studied math and their skills were in demand. There was also an irony involved: the boys with their toys thought that assembling the hardware was the most important task, and thus a man’s job. ‘American science and engineering was even more sexist than it is today,’ Jennings said. ‘If the ENIAC’s administration had known how crucial programming would be to the functioning of the electronic computer and how complex it would prove to be, they might have been more hesitant to give such an important role to women.’” [Pages 99-100]

The sources of innovation

“Hopper’s historical sections focused on personalities. In doing so, her book emphasized the role of individuals. In contrast, shortly after Hopper’s book was completed, the executives at IBM commissioned their own history of the Mark I that gave primary credit to the IBM teams in Endicott, New York, who had constructed the machine. ‘IBM interests were best served by replacing individual history with organizational history,’ the historian Kurt Beyer wrote in a study of Hopper. ‘The locus of technological innovation, according to IBM, was the corporation. The myth of the lone radical inventor working in the laboratory or basement was replaced by the reality of teams of faceless organizational engineers contributing incremental advancements.’ In the IBM version of history, the Mark I contained a long list of small innovations, such as the ratchet-type counter and the double-checked card feed, that IBM’s book attributed to a bevy of little-known engineers who worked collaboratively in Endicott.

The difference between Hopper’s version of history and IBM’s ran deeper than a dispute over who should get the most credit. It showed fundamentally contrasting outlooks on the history of innovations. Some studies of technology and science emphasize, as Hopper did, the role of creative inventors who make innovative leaps. Other studies emphasize the role of teams and institutions, such as the collaborative work done at Bell Labs and IBM’s Endicott facility. This latter approach tries to show that what may seem like creative leaps – the Eureka moment – are actually the result of an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together. Neither way of looking at technological advancement is, on its own, completely satisfying. Most of the great innovations of the digital age sprang from an interplay of creative individuals (Mauchly, Turing, von Neumann, Aiken) with teams that knew how to implement their ideas.” [Pages 91-92]

Google about Disruptive and Incremental Innovation

This is very similar to what I read about Google: “To us, innovation entails both the production and implementation of novel and useful ideas. Since “novel” is often just a fancy synonym for “new”, we should also clarify that for something to be innovative, it needs to offer new functionality, but it also has to be surprising. If your customers are asking for it, you aren’t being innovative when you give them what they want; you are just being responsive. That’s a good thing, but it’s not innovative. Finally, “useful” is a rather underwhelming adjective to describe that innovation hottie, so let’s add an adverb and make it radically useful. Voilà: For something to be innovative, it needs to be new, surprising, and radically useful.” […] “But Google also releases over five hundred improvements to its search every year. Is that innovative? Or incremental? They are new and surprising, for sure, but while each one of them, by itself, is useful, it may be a stretch to call it radically useful. Put them all together, though, and they are. […] This more inclusive definition – innovation isn’t just about the really new, really big things – matters because it affords everyone the opportunity to innovate, rather than keeping it to the exclusive realm of these few people in that off-campus building [Google[x]] whose job is to innovate.” [How Google Works – Page 206]
J**R
Detailed and informative narrative on the foundations of technology (Kindle: great; hardcover: even better)
In a classic retelling of the story of the digital revolution, Isaacson makes broader comments on the importance of collaboration and tries to de-romanticize the notion of innovation happening as a series of significant breakthroughs emanating from lone geniuses. In that sense, one could see that themes introduced in Where Good Ideas Come From and How We Got to Now: Six Innovations That Made the Modern World are (deliberately or not) explained well in the context of the digital revolution: more specifically, the often 'incremental' nature of innovation, the significant gaps between an invention and others' realization of its importance, the impact of developments in unrelated fields, and the very nature of collaboration. Later on in the book, Isaacson quotes a Twitter co-founder: "....they simply expand on an idea that already exists". The author also makes an important point in reminding us that corporations (IBM, Intel, Bell Labs, Honeywell, etc.) played a significant role in these developments, but their stories oftentimes unfairly get discounted in the face of narratives centered around individuals.

Trying to balance interpretive historical narration with cataloging key details pertinent to the digital revolution, Isaacson weaves a (mostly) linear, complex storyline, starting with Ada and reaching more recent topics such as IBM's Jeopardy machine. Throughout these often dense chapters, a patient reader is able to understand the core tenets of computers, programming, and the Web itself, and how they evolved over time. The calibration, refinement, and sometimes negation of these ideas over time, as with most understanding in science we take for granted, is well documented and very informative. The fairly long chapters on computers and programming could test the patience of a reader early on, but they lay the foundation for the chapters describing the dramatic growth seen in the past few decades.

One could argue that books such as The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company; Tubes: A Journey to the Center of the Internet; numerous biographical sketches of Ada Lovelace; and Crystal Fire: The Invention of the Transistor and the Birth of the Information Age (Sloan Technology Series) covered some of these topics with greater technical and/or biographical depth. However, most of these attempts have been stymied by a crucial fault: they all told history from a single point of view. In this book, there is no protagonist per se. That choice gives the author a dispassionate vantage point that allows for more incisive analysis, though he doesn't necessarily capitalize on it. The discussion of who should be given credit for the first computer is a rare example where the author manages to inject his own analysis.

Given the vast research that went into this book and the access to some of the key technology leaders of the time, one wishes the author had attempted to predict the next few decades or hypothesize on what's required to make the next few steps in this field. Leveraging Ada's story to begin and end the narration gives a unique sense of closure for the reader, and a very stark reminder that despite all the advances we've seen so far, we are still far away from machines that can think (this last chapter, the shortest and most succinct, aptly titled 'Ada Forever', is one of the better-written chapters).
The hype-free narration, the systematic building of key concepts, the careful relating of developments across the decades, and the investigative path traced to where we are today make this a very compelling read for anyone interested in technology. 4.5 stars.

(The Kindle version on the iPad app worked great, though the layout of the photographs and the initial detailed timeline with rare pictures are much better in the hardcopy. It would've been great if the timeline at the outset of the book were available as a pullout.)
C**N
Wow!!!
This book tells the complete history of the digital world as we know it so far. It is a masterful, in-depth account of the contributions of all the great minds in the cooperative hive of digital creation. Walter Isaacson thanks his wife in the acknowledgments. With all the time and effort he spent on this book, I'm surprised that he still has a wife!!!