Guest Host: Jen Golbeck
The best inventions take teamwork. So says journalist and author Walter Isaacson in a new book tracing the stories of the innovators behind the personal computer and the Internet. Isaacson explains how the digital age grew out of creative collaborations among contemporaries and across generations. He joins us to examine the unsung heroes of the Internet age who stood at the intersection of vision and implementation.
Biographer Walter Isaacson on the story of how Steve Jobs and Steve Wozniak, with help from Bill Gates, created the desktop computer. “Out of that moment…you see the birth of Microsoft and the birth of Apple,” Isaacson said.
MS. JEN GOLBECKFrom WAMU 88.5 at American University in Washington, welcome to "The Kojo Nnamdi Show," connecting your neighborhood with the world. I'm Jen Golbeck from the University of Maryland, sitting in for Kojo. It's been said that visions without execution are hallucinations. We all know Steve Jobs was the visionary whose elegant computing devices changed the way we live. But without Steve Wozniak at his side in the early days, rigging up circuit boards and hardware, Apple might not exist.
MS. JEN GOLBECKA new book by journalist and author, Walter Isaacson, says it's not the big-name superstars who deserve the credit for the digital revolution, but the teams of engineers, programmers, dreamers and business people who collaborated both in person and across generations. Isaacson recounts the development of the computer and the Internet through teams of innovators in government, academia and industry, who refined each others' ideas and built on concepts of those who came before them. Walter Isaacson joins me to look at the unsung heroes of the Internet age and the teamwork that revolutionized electronic communication. Walter Isaacson, it's good to have you here.
MR. WALTER ISAACSONIt's good to be here, Jen, especially with an expert like yourself.
GOLBECKI -- I will tell...
ISAACSONThis is your field.
GOLBECKI will tell everyone listening that I am a computer scientist. And so I will have some personal thoughts to share with you, kind of blurring my line as host.
ISAACSONWell, I'll blur the line. I'll just ask you questions the whole time.
GOLBECKThat would be great. I'd love it. A little bit of your bio for everyone. Walter Isaacson, CEO of the Aspen Institute, former chairman of CNN and former managing editor of Time Magazine. And his new book is "The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution." You can also watch a live video stream of our conversation on our website, kojoshow.org. And we'd love to hear from you. What creative team do you think had the biggest impact on the digital revolution? Do you work better on your own or as part of a group? Give us a call at 1-800-433-8850. Or email us at kojo@wamu.org.
GOLBECKWalter, you've had a long career in journalism. But from an early age, you were no stranger to circuit boards and transistors. How did being a bit of a tech geek yourself pique your interest in this topic and help you identify with the people you write about?
ISAACSONYeah, I was sort of a geek growing up. I loved soldering circuits, making ham radios and Heathkits. And, you know, one of the things about it -- this is something Wozniak talks about more eloquently -- is that once you've built circuit boards, you have a visual sense and almost a feel for how on-off switches and logic gates can do logic for you. They can do what a computer does. Nowadays, I worry a little bit, because not only do we not open up our computers and sort of hack and jack into them the way we did when I was a kid, you're not even allowed to change the battery anymore in most of your devices.
GOLBECKThat's right.
ISAACSONAnd so I wanted to show the excitement that people had during the digital revolution of this notion of what electronic circuits can do.
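What Isaacson describes, on-off switches wired into logic gates that "can do logic for you," can be made concrete with a small sketch. The Python below is purely illustrative (nothing like it appears in the book or the broadcast): it builds AND, OR and XOR gates from 0/1 values and chains them into an adder, which is the kind of thing a soldered circuit board does physically.

```python
# A minimal sketch (illustrative only): on-off values (0/1) pushed through
# a few logic gates are already enough to add numbers, which is all a
# circuit board of switches is really doing.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two one-bit inputs; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

# 0b101 (5) + 0b011 (3) = 0b1000 (8), one bit at a time, like a ripple of switches.
a_bits, b_bits = [1, 0, 1], [0, 1, 1]   # most-significant bit first
carry, out = 0, []
for a, b in zip(reversed(a_bits), reversed(b_bits)):
    s, carry = full_adder(a, b, carry)
    out.insert(0, s)
print([carry] + out)   # -> [1, 0, 0, 0]
```

Chaining enough of these gates together is essentially what an arithmetic unit does; the feel for that wiring is what Isaacson says hobbyists used to get by building the circuits themselves.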
GOLBECKYour original plan for this book was to examine the origin of the Internet. So how did you decide to add the evolution of personal computers to the story?
ISAACSONYeah, I started this in the early 1990s when the Internet was beginning to explode with the World Wide Web in '93 and '94. And I was at Time Magazine in charge of all of our digital media. We were moving from the online services, like AOL, to the Web. And one of my bosses asked me, "Who owns the Internet?" And I thought, Wow, what a clueless question. And then he said, "Well, how'd the Internet get built?" And I thought, Well that -- and then instead of thinking it was clueless, I sort of pondered on it a bit. And I realized that I had a lucky thing in my life. I got to meet all these people.
ISAACSONI mean, I got to see Gordon Moore, Andy Grove or Tim Berners-Lee and Vint Cerf -- the people who were there doing it -- Marc Andreessen, you know, a young kid who was pitching his Netscape browser. And so I thought I would gather string and try to put together a history of how the Internet was done. When I interviewed Bill Gates, he said, "No, you got it wrong. It's not just the Internet. It's like the industrial revolution in which, when you combine the steam engine with mechanical processes, you get a revolution. In this case, it was combining the Internet with the personal computer. You should do a joint history of how we built PCs but also how the Internet came to be and how it combined. And that was a combustible combination."
ISAACSONAnd I decided, okay, cool. I'll do that.
GOLBECKTechnology is central to our lives today, but Al Gore jokes aside, we can't point to a single individual who invented the Internet. How did the Internet age reflect teamwork more than individual achievement?
ISAACSONWell, first of all, I'm going to give Al Gore a shout-out. Because remember I said, '92, '93, '94, we were moving to the Internet? That was because of the Gore Act of 1992. He shouldn't have misspoken the way he did, you know, when he talked about, I was there at the creation. But he was the person who made it legal for normal people like me to go on the Internet, instead of just people who were at research centers. The Internet was created collaboratively and it still bears the fingerprints of that collaborative creation. They were doing it as what was called the ARPANET, from the Defense Department. But, being research centers funded by the Defense Department, they delegated it to a lot of graduate students.
ISAACSONSomebody right here in Washington named Steve Crocker was just a graduate student back then at UCLA, and he'd gathered a whole lot of graduate students to make the protocols. Those are little instruction sets that tell the packets how to scurry through the network and how to, you know, recombine to make a message. And they did it through a process that Steve Crocker named Requests for Comments. In other words, whenever they had a proposal, an idea, or a rule or a regulation for the Internet, instead of calling it that, he would just call it a request for comments, because it made it seem more collaborative and collegial. And everybody felt they could be a part of building this thing.
ISAACSONSo it was built in this collaborative way with no on-off switch, no central hubs, nobody in control. And that's really cool.
GOLBECKAnd that's something that we still see today on the Internet, these requests for comments on all of the standards.
ISAACSONYeah, it's still the way they build the Internet. It's still done collaboratively. And if you're trying to figure out -- say, for example, they wanted to decide whether to put bitcoin as a payment system, as part of a protocol on the Web, you know, you can't just call up the Commerce Department or the government and say why don't we do that? Or lobby Congress. It's done in this collaborative way by these non-governmental organizations. And this is why the Internet is so magical that you actually want to know who created the magic.
GOLBECKAnd I confess I have had that same conversation as you early in the days of the Web, like who owns it? Does the telephone company own it? Who owns it?
ISAACSONWell, the telephone company could have owned it when Steve Crocker and all were -- and then Vint Cerf and Bob Kahn, as you know, created the Internet protocols. And when they're doing packet switching, this notion of instead of having a direct circuit where your voice is -- takes up all the circuit line, you just break the message up into packets and let it scurry through the Internet. They tried to convince AT&T to do it. And AT&T spent, you know, weeks explaining to Paul Baran and other people who had created the notion of packet switching why it would never work. And after they explained it to them, they said, "Now do you see why we're not going to do it?" And he said, "Nope." And AT&T could have owned it.
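The packet-switching idea described here, break the message into numbered pieces, let each piece travel on its own, and reassemble them at the far end, can be sketched in a few lines. This is an illustration only, not Paul Baran's design or the actual protocols Cerf and Kahn wrote:

```python
import random

# A toy illustration of packet switching (not the real ARPANET/Internet
# protocols): break a message into numbered packets, let them arrive in
# any order, and reassemble them by sequence number.

def to_packets(message, size=8):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("Visions without execution are hallucinations.")
random.shuffle(packets)        # packets take different routes and arrive out of order
print(reassemble(packets))     # the receiver still rebuilds the original message
```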
GOLBECKAnd there's a lot of these stories throughout your book of people who developed things and the big companies or organizations say no.
ISAACSONYou know, it's kind of cool that things are disruptive. Some big companies do well. IBM was slow off the mark doing a personal computer. It was done, as you know, by the people like Altair and Apple and all these people in garages and strip malls. But IBM, pretty soon after Apple comes out with the Apple II, you know, they decide, okay, we're going to create our own little group down in Boca Raton, Fla., because we know if we do it at headquarters, it'll get messed up with the bureaucracy. So it's interesting to watch how larger companies occasionally can do something right, because disruption, you know, only works when some of the big companies are fighting back.
GOLBECKLet's look at some of the innovators you discuss, often in pairs, starting with Ada Lovelace and Charles Babbage. Who were they and how did they collaborate on a 19th century conception of a computer?
ISAACSONYeah, that's the preface of the book, because it's a hundred years before we get the first real electronic computers. In the -- Ada Lovelace was Lord Byron's daughter. And being Lord Byron's daughter, she was rather poetic. But her mother, Lady Byron, was not particularly fond of Lord Byron when Ada was growing up. And if you knew about Lord Byron, you'd know he was too much of a romantic poet to be a good husband. So she had Ada tutored mainly in mathematics, as if that were some antidote to cure her of being poetic. Instead, she creates what she -- she loves what she calls poetical science, combining the humanities with technology.
ISAACSONShe loves the way punch cards are instructing the looms of England to do beautiful patterns. Her father, Lord Byron, was a Luddite -- and I mean that literally -- he -- his only speech in the House of Lords defended Ned Ludd and his followers who were breaking all the looms, thinking they were putting people out of work. But Ada loved the way the punch cards were doing that. And she has this friend Babbage, Charles Babbage, who was making a calculating machine to do numbers. And it had punch cards. And she said, "Well, with the punch cards, this calculating machine can do more than just numbers. It can do anything that can be noted in symbols. It can do art and it can do music. It can do words." In other words, she envisions the computer.
ISAACSONAnd she writes a scientific paper -- not often done in the 1830s and 1840s by a woman -- publishes it, even writes a computer program showing how you'd program Babbage's machine, the first published computer program we have. So she really is the -- probably the first computer programmer, if you want to count only those who publish their programs.
GOLBECKAnd about a hundred years after that, we get Howard Aiken and Grace Hopper.
ISAACSONYou know, they almost are echoes. Because Grace Hopper is another cool woman. Women have been written out of the history of technology. I try to, you know, give them their due here, because there aren't enough role models. And I know you teach at the University of Maryland in the Computer Science Department...
GOLBECKAnd my dog is named Hopper, by the way, after Grace Hopper.
ISAACSONAfter Grace Hopper, of course. And we just had Grace Hopper Day not -- oh, actually, Grace Hopper Day is in about three weeks. I'm giving a talk at Harvard, where she programmed the Mark I computer at Harvard during World War II. Because she was a professor of math, had a PhD in Math from Yale. But when Pearl Harbor happened, she's, you know, typical of some of the geeks in the book. She dumps her husband, runs away, quits her job and joins the Navy, thinking she's going to help during World War II. And indeed she does. She's assigned to program this new computer they're building at Harvard, the Mark I.
ISAACSONAnd when Grace Hopper and Howard Aiken do it, they look back and they find in the attic of the Harvard Science Building a piece of Charles Babbage's, you know, machines.
GOLBECKThat's right. They have the actual thing.
ISAACSONA hundred -- they actually have the wheels and the cogs which they actually put -- you can go and see it in the Science Center at Harvard, because they put it inside the computer they're building. So, you know, it'd be -- it's right mounted on the side of it. And they write a manual -- actually Grace Hopper writes a manual -- and the whole first chapter is on Babbage and Lovelace. So you see how people who have a feel for history get to repeat it.
GOLBECKAnd Grace Hopper is another one of these examples of someone who comes up with something and people tell her it won't work. Because she invented the first compiler, which takes the sort of English-looking computer code that even non-programmers have probably seen and turns it into something a computer can understand. And it took her years to get people to believe that it would work, because they told her, computers don't understand English. It won't work.
ISAACSONWell, you know, it's an incredibly important thing to come up with programming languages that can be compiled. And as you know, with a compiler, it works across different pieces of hardware. So the boys with their toys -- whether it's the men building ENIAC at Penn or doing the Mark I at Harvard -- they kind of thought that the programming was, you know, sort of clerical work. And that's why women were doing it. And they didn't realize that the hardware would become somewhat unimportant in terms of being interchangeable. It didn't matter whether you're using a UNIVAC, a Sperry-Rand or Honeywell or whatever. What mattered was the software -- the operating systems and the languages.
ISAACSONAnd so people like Grace Hopper, who did compilers where you could write across platforms, they were the ones who actually defined what the computer revolution became.
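A toy example may make the compiler idea concrete. The sketch below is not Grace Hopper's A-0, FLOW-MATIC or COBOL; it is only an illustration of the principle she championed: English-looking statements get translated into a small set of machine-independent operations, so the same source can run on whatever hardware understands those operations.

```python
# A toy "compiler" sketch (illustrative only, not Grace Hopper's actual
# systems): English-like statements are translated into a tiny set of
# machine-independent operations, which are then executed.

SOURCE = [
    "SET TOTAL TO 0",
    "ADD 5 TO TOTAL",
    "ADD 7 TO TOTAL",
    "PRINT TOTAL",
]

def compile_line(line):
    words = line.split()
    if words[0] == "SET":        # SET <var> TO <number>
        return ("store", words[1], int(words[3]))
    if words[0] == "ADD":        # ADD <number> TO <var>
        return ("add", words[3], int(words[1]))
    if words[0] == "PRINT":      # PRINT <var>
        return ("print", words[1], None)
    raise SyntaxError(line)

def run(program):
    memory = {}
    for op, var, value in program:
        if op == "store":
            memory[var] = value
        elif op == "add":
            memory[var] += value
        elif op == "print":
            print(memory[var])   # -> 12

run([compile_line(line) for line in SOURCE])
```

The SOURCE never changes; only the little run() back end would differ from machine to machine, which is the portability point being made here.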
GOLBECKAnd at this point in the history, we're in World War II. Can you talk about World War II and the Defense Department, both here and in the U.K. -- what role did they play in furthering the development of computers and also a way to network them?
ISAACSONWell, you're going to see a movie in a few weeks called "The Imitation Game," which is about Alan Turing. Alan Turing was also a fan of Grace Hopper -- I mean a fan of Ada Lovelace. And, you know, he comes up with the notion of the universal computing machine, one that can do any logical sequence. But then, during the war, he works at Bletchley Park, a secret facility in England, to break the German wartime codes. And they build, you know, some mechanical computers. And finally Colossus, which is this big electronic computer using vacuum tubes.
ISAACSONAnd so that was the example of the wartime spending creating a computer in England. The computers we just talked about, Mark I at Harvard and ENIAC at the University of Pennsylvania, were also funded by the U.S. Defense Department. And there used to be -- all the way through the Eisenhower years in particular, because Eisenhower loved scientists -- a sort of collaboration between research universities and the government, each funding basic research and working together. And that's how the internet, the computer, the laser, the microchip, all these things come about.
GOLBECKAnd let's stick on Alan Turing for a second. As a computer scientist he's one of these amazing figures right at the beginning of the field because he didn't just come up with one part of computer science. He came up with things that touched this whole giant field in a lot of ways. And one of those is artificial intelligence. So...
ISAACSONYeah, he has -- it comes again from Ada Lovelace. One of the things Ada Lovelace wrote in that paper I mentioned is that machines will be able to do everything. As I said, they'll do music, they'll do words, they'll do art, but then she puts a caveat. She says, but they won't think. They'll never be able to originate thought or be creative. Alan Turing, 100 years later, says, how do we know that? And he labels it Lady Lovelace's Objection and he invents a test which you call the Turing Test now, but he called it the Imitation Game, which is the name of the movie coming out.
ISAACSONAnd it's a test for artificial intelligence where you put a machine and a human in separate rooms and you send in questions. And if you can't tell the difference between the machine and the human, he says there's no reason to say the machine's not thinking. Now his own life, in my opinion, is somewhat of a deeply tragic and heroic refutation of that thought because deep inside, all the way through his life, he was homosexual. And he knew that, you know, for him at least, free will is an important thing, but he's wondering, are we preprogrammed?
ISAACSONIt's all part of his psyche trying to figure this out. And so when he does this, he comes up with the imitation game. But a little bit later he gets arrested because of his homosexuality and he gets convicted. And they sentence him to have hormone treatments as if he were a machine, as if you could just sort of put new inputs in and get a different output. And he goes along with it and he seems to be taking it in stride. But then he takes an apple, dips it into cyanide, bites into it and commits suicide.
ISAACSONIt's incredibly tragic but it also makes you think, is that something a machine would've done? And isn't there some basic difference between humans and machines? And that is the core of the argument that we still have about the Turing test and about artificial intelligence.
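For readers who want the shape of the test itself, here is a toy sketch, an assumption of mine rather than anything from Turing's 1950 paper: an interrogator receives typed answers from two hidden respondents and must guess which one is the machine; when the answers are indistinguishable, the guess is no better than a coin flip.

```python
import random

# A toy sketch of the shape of the Imitation Game (illustrative only):
# the interrogator sees typed answers from two hidden respondents and
# must guess which one is the machine.

def human_respondent(question):
    return "Let me think about that for a moment..."

def machine_respondent(question):
    # A machine that passes the test imitates the human's answers.
    return "Let me think about that for a moment..."

def run_round(question):
    players = [human_respondent, machine_respondent]
    random.shuffle(players)
    hidden = {"A": players[0], "B": players[1]}   # assignment is hidden from the interrogator
    answers = {label: respond(question) for label, respond in hidden.items()}
    guess = random.choice(list(hidden))           # identical answers leave only a coin flip
    return answers, guess, hidden[guess] is machine_respondent

answers, guess, correct = run_round("Will you ever originate a thought of your own?")
print(answers)
print("Interrogator guesses the machine is", guess, "-- correct:", correct)
```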
ISAACSONAda Lovelace would say, no, machines and human minds are different. Human minds can be creative. And there's no reason to try to create artificial intelligence. You should create machines that partner with humans, that become complements to us or form a symbiosis with us, connecting the humanities to technology, whereas the artificial intelligence strand keeps looking for ways to have machines that will think without us.
GOLBECKAnd the movie that you mentioned, "The Imitation Game," in which Alan Turing is played by Benedict Cumberbatch, will be out soon. And anyone who wants to know more about this part of computing and World War II, and also a really tragic story, should check it out. It's going to be a great movie.
ISAACSONYeah, I have a whole chapter in my book about Turing and his imitation game and the notion of how that ties into the development of computing. And I must say I'm quite excited that even though in my book I try to make him understandable, Benedict Cumberbatch is going to make him famous again.
GOLBECKThat's right. We're going to take a quick break but we'd like to hear from you. Give us a call at 1-800-433-8850. I'm Jen Golbeck sitting in for Kojo and we'll be right back.
GOLBECKWelcome back. I'm Jen Golbeck from the University of Maryland sitting in on "The Kojo Nnamdi Show." I'm talking with Walter Isaacson about his new book "The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution." Walter, one theme in your book is that innovation happens in stages. How does the development of the transistor at Bell Labs in the late '40s demonstrate the way the pieces come together from invention to production to marketing?
ISAACSONWell, I think the great thing about Bell Labs is you had theorists who sort of hung out with experimentalists as well as business people as well as pole climbers who had grease under their fingernails from trying to amplify telephone signals. And they're all in this great place called Bell Labs. And they're sick of the vacuum tube, which is an expensive, horrible device. And they're trying to do something solid-state.
ISAACSONAnd so they look at silicon and other forms of semiconducting material, which means that they don't really conduct electricity well but they also don't resist it well. And you can dope silicon or germanium with things like boron and it becomes a better or worse conductor. And so doing all that, especially with a theorist like John Bardeen who understands the surface state of silicon and how the electrons are dancing, they can figure out how to take a little tiny chip of silicon and make it an on-off switch or an amplifier.
ISAACSONAnd so that makes computers personal. You no longer have to have these big old vacuum tubes. And eventually they're able to etch a lot of these transistors on a single piece of silicon. And that's where we get the microchip. So to me that type of mix of theory and practice and practicality, that's the type of team that led to big leaps. Bell Labs was great when it was doing that.
GOLBECKSo you've talked about how we are now moving from these big kind of room-sized computers to something that will become the personal computer. Let's look at the other side of this. In the '60s we get an MIT psychologist and technologist named J. C. R. Licklider. And he proposed an intergalactic computer network, which is a great phrase. How was his vision ultimately realized at the Pentagon as ARPANET, and why does he get credit for being one of the fathers of the internet?
ISAACSONYou know, I had to answer a question recently for the New Republic of who was the most important unsung hero. And to me it's J. C. R. Licklider. Licklider was this Missouri guy, ah, shucks, you know. He loved giving credit more than taking it. And he comes up with three great concepts. First of all, he's in charge of an air defense system. So instead of having that old fashioned batch processing where you gave your punch cards to a computer and you finally got the answer back, he said, you got to have instant interactive computing. Because a missile's coming in, you can't wait for the batch process.
ISAACSONYou also have to have wonderful graphical display so that you can tell a plane from a missile, from a pigeon or whatever it may be and which way they're traveling. And finally, he says, we've got to network them together. So he comes up with -- jokingly, he calls it the intergalactic computer network. And then he gets to move to the Pentagon and he says, well, let me fund this thing. And it becomes ARPANET, which becomes the internet. It's a way of tying together computers. And that's when you first get this connection between networks and computers.
ISAACSONHe also was one of those people in the tradition of Ada Lovelace who really cared about your particular field, which is human computer interaction. Licklider was a psychologist and then he becomes a computer specialist. And so what he says is, we've got to make it easy to have human computer interaction. He wasn't one of these Alan Turing guys who thought that the computers would be off doing artificial intelligence on their own. He wanted to have that close, intimate, personal relationship with you and your computer.
GOLBECKAnd that's something that I am a human computer interaction researcher and...
ISAACSONTell me what -- your lab at Maryland is called what?
GOLBECKThe Human Computer Interaction Lab and we're about 25 faculty who study all aspects of how people interact with computers, whether it's social media, which is what I do, or visualization, looking at pictures of things to understand data better, all the way down to education.
ISAACSONWell, you know, in my book, that tradition of human computer interaction, which Maryland was great at, MIT was, there's a whole stream of them starting with Licklider. Doug Engelbart who does the great -- who invents the mouse but also, you know, easy graphical interfaces so people can understand the computers well. And then of course Alan Kay at Xerox PARC who does, you know, trash cans and icons. And then of course Steve Jobs who goes to Xerox PARC and basically borrows it or steals it or whatever. And you get the Macintosh.
ISAACSONAnd that's why the history of what you do, human computer interaction, is to me the great theme of the digital age. And you say you do it with social media but also with the whole visual display too, right?
GOLBECKEverything from -- yeah, how you interact with your mobile device, how you talk to your computer and what you do on it.
ISAACSONAnd the people who are doing that like yourself and people at the University of Maryland, to me, they're in the forefront of computing, not the people trying to create artificial intelligence where we won't interact with our computers ever again.
GOLBECKAnd this is an interesting point that comes up in the book, this contrast between Turing and Lovelace because Ada Lovelace really had this attitude that people and computers put together would be more creative and be able to come up with more things than just a computer by itself. And Turing sort of argued against that but now in the space where I do research, we're still having this conversation about, look, if you put people with the computers, don't just have the computers by themselves, put people in the loop, you can come up with much more incredible things than a computer can do on its own.
ISAACSONYou're singing my song because every time I go on the radio people ask me, well, aren't we going to have artificial intelligence? I say, no, why would you want to take humans out of the loop? The wetware of our carbon-based analog brains, you know, we add something to the party too that's much different than the silicon, you know, digital brain of a computer.
ISAACSONAnd when you combine the two -- Garry Kasparov figured this out. He was the guy who got beaten in chess by IBM's Deep Blue. And he said, well, wait a minute. And then he took just regular computers doing chess and pretty good chess players and paired them. And always they could beat either the best machine or the best grandmaster. Likewise, Watson, the IBM computer that wins in Jeopardy -- Ginni Rometty and IBM are now pairing or combining it with doctors to do cancer diagnosis, trying not just to replicate or replace the doctor but say, let's form a symbiosis. That was the word Licklider used or, you know, a partnership, we might call it, between the human mind and the machine.
GOLBECKAnd there are so many interesting stories of this in computing, where some of the early artificial intelligence would, say, examine slides or X-rays to try to find cancer. It was more accurate than doctors were, but if you showed it to doctors, they started getting worse, because they would just rely on the artificial intelligence. So the things that they were good at, that their human perception was good at, they would override because they thought, well, the computer must have it right. But when you put them together, the things that humans are good at perceiving, they can correct errors that a computer will never be able to detect on its own.
ISAACSONNo. And that's why your field of human computer interaction, Jen, is so important. You know, you look at -- every now and then I go look at all these computers that are supposed to do things. Like, you go to Applied Minds and they show you the robot. And then the robot can't actually walk across an unfamiliar room and pick up a crayon and write his name. I go to the Total Domain Awareness System in lower Manhattan, where they have all the cameras in Grand Central. But I say, okay, can you pick a mother's face out of the crowd? No, it can't do that.
ISAACSONI could give you 100 examples, and you know the odd thing about each one of these examples is a four-year-old kid can easily do those things. And so the combination of the human mind and the computer still seems to me, at least for the next four or five centuries, to be more powerful than just saying, let's create machines that replace us.
GOLBECKYeah, and, you know, I think the core issue there is that we don't yet understand the human mind in a way that we can replicate it in a computer.
ISAACSONAnd you always read -- starting in 1950 when Alan Turing does his imitation game, you can read in any -- the New York Times almost every three or four years has something like the Perceptron. That was a computer in the 1950s that was supposed to replicate the human brain, the neural nets in the human brain. And it was going to do -- it was going to think just the way a human brain did so it could do language.
ISAACSONAnd you read the exact same story, you know, just three months ago. They're having -- oh, you know, these chip companies are doing neuromorphic chips which basically try to replicate the human brain's pathways. And I say, fine, fine but I've read this story, you know, from every decade since 1950. And it's always 20 years away. I'm beginning to suspect it's a mirage and totally replicating the human brain will always be 20 years away.
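For context, the Perceptron Isaacson mentions was, at bottom, a single artificial neuron whose weights are nudged after every mistake. The sketch below is a modern Python illustration of that learning rule (not Rosenblatt's 1950s hardware), shown learning the simple AND function:

```python
# A minimal perceptron sketch (modern Python, not the original 1950s machine):
# one artificial "neuron" learns the AND function by nudging its weights
# whenever it answers wrong.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias, rate = [0.0, 0.0], 0.0, 0.1

def predict(x):
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

for _ in range(20):                       # a few passes over the data is enough here
    for x, target in examples:
        error = target - predict(x)
        weights[0] += rate * error * x[0]
        weights[1] += rate * error * x[1]
        bias += rate * error

print([predict(x) for x, _ in examples])  # -> [0, 0, 0, 1]
```

A single neuron like this famously cannot learn some patterns at all, XOR being the textbook example, which is part of why the brain-in-twenty-years predictions Isaacson describes kept slipping.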
GOLBECKWell, and the issue there is that even if you can replicate the exact physical way the human brain works, that doesn't get you to the human mind, right, the sorts of perceptions and things related to that.
ISAACSONI believe that too. You'll get a lot of debate on that but you and I will be on the same side of that debate.
GOLBECKYou're welcome to debate that if you're listening. You can give us a call at 1-800-433-8850. And I'd like to come back to where we were. So we have computers getting smaller. We've got ARPANET being created. And it leads us into the story of Steve Jobs and Steve Wozniak. For those who haven't read your biography of Steve Jobs, describe the teamwork that led them to develop a desktop computer.
ISAACSONYou know, what's really cool is that Intel, which is formed out of the group at Bell Labs that was doing transistors -- and then Shockley forms a group and it explodes because he's a really bad leader. So you have Gordon Moore and Andy Grove and Bob Noyce. They do Intel and they get a microprocessor, putting on a chip all the elements of a, you know, computer central processing unit. And when they do it, all these hobbyists and hackers and hippies out in the Bay Area in California, they want control of their own computers. They don't want the big honking computer to be owned by corporations and the Pentagon.
ISAACSONSo they're always looking for this hobbyist computer. And finally in the early 1970s they come along with the Intel 8080 microprocessor. And they show them off, the Altair being the first one of them. It's on the cover of Popular Electronics. Two things happen from that cover. First of all, I was in college. I go, wow. But somebody else was in college who was even cooler and 100 times and 1,000 times smarter than me.
ISAACSONBill Gates is sitting there and his friend Paul Allen brings him Popular Electronics in December of '74 and says, this thing is happening without us. So Bill Gates drops out of Harvard and starts writing BASIC for the Altair. They bring the Altair to the Homebrew Computer Club in Palo Alto. First of all, the people in the Homebrew Computer Club believe software should be free. So they take a copy of Bill Gates' tape, make 70 copies and give them away for free, which infuriates Bill Gates.
ISAACSONBut something else even more important happened, which is Wozniak is there at the meeting and he says, whoa, I can make a better one than this. I can put a monitor on it. I like the specs of this microprocessor. And so he makes one and he connects it to a TV monitor and he gets his friend from down the street, Steve Jobs, to carry the TV monitor for him. They show it off at the Homebrew meeting.
ISAACSONAnd of course Woz, being kind of a -- you know, a hacker hippie, wants to give it away for free, hands out the specs for this new thing. And Steve Jobs says, wait a minute. We can go to my parents' basement -- I mean, my parents' garage and we can make these things and we can sell them and we will make money. And so just out of that one moment, where the Altair is shown at the Homebrew Computer Club, you see the birth of Microsoft and the birth of Apple.
GOLBECKYour biography of Steve Jobs portrays him as a brilliant visionary but also a prickly and headstrong person. At the same time he led one of the most successful teams in tech history. What made him a good team leader?
ISAACSONYou know, at first when I was reporting on Steve Jobs, everybody would tell me the stories about how prickly he was. In fact, the words they used you probably should not use even on a radio show like this. So -- and I thought, well, that's kind of odd because he did have a very loyal team. But over and over again people would tell me, especially from the original Macintosh team in the early '80s but also the Apple team he left behind, you know, in 2011 when he died, they'd say, yeah, he drove me nuts. Yeah, he drove me to distraction, but he drove me to do things I never thought I'd be able to do. And I would not give up having worked for him for anything in the world.
ISAACSONAnd as he was, you know, in his last year I asked him, what was the most -- what was the product you were proudest of? And I thought he'd say the original Mac or maybe the iPod or iPhone. He said, no, those were hard products to make but what's really hard is making a team that will continue to make good products. So my -- the product I'm proudest of is Apple, the company.
ISAACSONAnd whether it was in the early 1980s, where he got a team of 30 engineers, the original Mac team, and he got a pirate flag and they put it above the Bandley Building in Cupertino for their, you know, merry band of pirates, or the team he left behind at Apple with great people like Tim Cook and Jony Ive and Phil Schiller, his ability to create teams transcended his ability to annoy people, I'll say politely.
GOLBECKWe're going to have to take a quick break but you can join the conversation. And you can also watch a live video stream of our conversation on our website kojoshow.org. I'm Jen Golbeck sitting in for Kojo and we'll be right back.
GOLBECKWelcome back. I'm Jen Golbeck from the University of Maryland sitting in for Kojo Nnamdi. I'm talking with Walter Isaacson about his new book "The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution." You can join the conversation. Give us a call at 1-800-433-8850 or send us an email to kojo@wamu.org. And you can also watch a live video stream of our conversation on our website kojoshow.org.
GOLBECKSo we've talked kind of about the internet, the computer. We're kind of up into the '70s. I'm almost born now at this point in our history. But one of the things that you emphasized in this book that we have talked a little bit about here with Steve Jobs and Steve Wozniak is the importance of collaboration and how these teams are actually critical to innovation. And so I thought maybe you could give us kind of a high-level view of that important theme in the book.
ISAACSONWell, I think, you know, one of the things that surprised me -- I'm a biographer, and we biographers realize we distort history a little bit. We make it seem like there's some gal or guy in a garage or a garret and they have a light-bulb moment, in this lone-inventor version of history. If you take the original computer, there are two people in the United States you could sort of say helped create the computer. One is a guy named John Vincent Atanasoff at Iowa State who's working almost alone in a basement at Iowa State.
ISAACSONAnd, you know, he really comes up with the whole concept but he can't get the punch card burners to work. He's only got one graduate student helping him out. So when he leaves to join the Navy they just throw the machine away. But there's a guy named John Mauchly who is the ultimate of the other extreme. He travels around to the 1939 World's Fair, even drives out four days to Iowa to look at this other computer. And he creates a team of people at the University of Pennsylvania that includes a great engineer, Presper Eckert. Because, as you said at the beginning of the show, you know, vision without execution is hallucination. And the six women who were the programmers of ENIAC, whom we talked about. As well as, you know, mechanics -- you know, all sorts of people.
ISAACSONAnd you see that all the way through, whether it's Mark Zuckerberg bringing in Sheryl Sandberg to help manage Facebook, or Sergey Brin and Larry Page bringing in Eric Schmidt, this notion of creating the right team is always crucial to execution.
GOLBECKSo a lot of research has shown that diversity is actually a really important part of collaboration. And so I wanted to ask you some questions and get your thoughts about the role of women in computer science and the current culture in these companies, which I know is not exactly in your book, but I think because collaboration is such an important theme that you touch on, that you might have some thoughts to share here. And so, of course, I'm coming from a place of being a woman in the field of computing.
GOLBECKAnd I've certainly been in academia, but what we've seen in a lot of these, you know, start-up companies especially, but even in some of the bigger corporations, is what's kind of a frat boy culture. Whitney Wolfe, she's a co-founder of Tinder, the dating app. And she was actually stripped of her co-founder title because the chief marketing officer, a guy named Justin Mateen, thought "having a young female co-founder makes the company seem like a joke and devalues the company." So why do you think that some of these…
ISAACSONI have not heard of that company, but I'll not use it.
GOLBECKTinder. Yeah, I don't…
ISAACSONActually, I have heard of it, but I won't use it.
GOLBECK…think you need to use it. But…
ISAACSONYou know, this is a really bad problem, which is the gamer/hackathon mentality, frat boy mentality. And as you say, you can't have a great revolution where half the people are left out. That's why I looked, you know, when I was doing this history, to try to emphasize the role of Ada Lovelace, the six women of ENIAC, Grace Hopper and others, because my daughter, you know, who is a computer science geek and studied computer science at college, she said, you know, "I didn't know that we had role models until I read a Batman comic and there was a woman who was a programmer.
GOLBECKWow.
ISAACSON"It's the first I knew that women could code." And so I do think that not having role models is a problem. Fortunately today, you have great role models who should become, you know, a little bit more famous I would think. Obviously, Sheryl Sandberg is, to some extent Marissa Mayer who runs Yahoo. And now we have Megan Smith, a great engineer, a wonderful person, who had been at Google and Google (x) for a long time, is now the chief technology officer for the United States.
ISAACSONAnd so people like that I think will create role models and I hope will get the geek frat boy, anti-woman sentiment that may have existed in sort of the -- in some of the realms of hackerdom -- it'll blow that away.
GOLBECKYeah, that sort of bro culture is a hard thing to look into. Klout has this poster that they used for their recruiting events. It said, "Want to bro down and crush code? Klout is hiring." Which is…
ISAACSONWhat's it like at the University of Maryland? Do you have a lot of women doing, now, human/computer interaction?
GOLBECKIt's interesting. So I did my Ph.D. at Maryland. And now I'm on the faculty there. And I did my Ph.D. in the computer science department and I had an amazing advisor, Jim Hendler, who was there for about 30 years. And I felt very welcome in that space. At the same time, there are very few faculty members in -- female faculty members in the computer science department, which is something that they're actively trying to work on.
GOLBECKOn the other hand, I have switched departments. So my faculty appointment is in an interdisciplinary space called Information Studies. And we're half women. The human/computer interaction lab is at least half women on our faculty, I think. And it's a space where it's not kind of this fraternity, competitive culture like you see in a lot of these startups. It's not a place where you could get away with, you know, posting pictures of women in bikinis -- not that you could in the C.S. department, either.
GOLBECKBut the human/computer interaction space, I think very much values the role of all kinds of users. And looks at their ethics and the way they interact. And that's treated as important and I think that's something that attracts women who, you know, I know I had these experiences as a student where I was told that well, women aren't very good coders because they're not competitive in the way men are. I mean, I had a guy say that to me in a class once in front of all the other men in the class.
ISAACSONWow. You know, but, I mean, people like Danah Boyd who were pioneers in your field. You know, it's interesting because the next phase of the digital revolution, I think, will not just be engineering driven, but it will be the combination of engineering with some of the more creative disciplines, as well as things, as you say, like information studies or for that matter, arts and literature, whatever it may be. And I think we're going to have to see greater diversity because you can't have real creativity in cross-disciplinary fields without having different types of people there.
ISAACSONBut one of the most depressing statistics for me was in 1984, I think 38 percent of people getting computer science degrees at American universities were women. And now it's 17 percent. It's shocking. It's gone the exact wrong direction.
GOLBECKYeah, and it's -- part of it is that there's a pipeline that women go through, where they come in as undergraduates and then some of them drop off at the masters level and some drop off at the Ph.D. level. So when you look at female faculty in computer science, the percentages are even smaller. So it's hard to see people who look like you in that space.
ISAACSONAnd I think it's really important, again, back to the people I wrote about in the book -- I had, you know, my father, two uncles who are electrical engineers. They, you know, they were role models. You know, I think it is important -- take the women of ENIAC or take Grace Hopper, these are people most people don't really know about. They're not huge popular names. And yet they are the role models.
ISAACSONAnd I hope, you know, just by giving them a prominent place in this history, people will see anybody can code. And anybody can, you know, combine -- as Ada Lovelace did -- the beauty of the humanities with the real beauty of math and science.
GOLBECKYou can also join our conversation and share your thoughts on this issue, and also talk about collaboration and making successful businesses and innovations. Give us a call at 1-800-433-8850. And remember that you can see a live stream of us on our website at kojoshow.org. So if we come back to the book, I'd like to get your thoughts on whether computers and the internet are some of the most game-changing innovations that we're going to see in our lifetimes. Or do you think that there's more to come on that level?
ISAACSONWell, I think that we'll now see the combination of the creative industries with computers and the internet. Especially in a collaborative way. I took parts of this book, many of the chapters, posted them online and had people collaborate. You know, 18,000 people in one week gave me stories, anecdotes and corrected some of my pieces about the 1970s. So I think, you know, maybe for non-fiction writing, collaborative books will be the next big thing.
ISAACSONHopefully, we'll -- it'll disrupt the payment and financial system with things like bitcoin, so that we can have easy payment systems, so we can all share the revenue from a book or something or for that matter a play or a LARP or role-playing game, whatever people want to create collaboratively. So I think that's among the things we'll see in the next phase of the revolution. But also we'll see it connecting to biotechnology, as well as -- I hope the physical industries.
ISAACSONThere's something harder about doing something big and physical than there is about doing something like Facebook, which you can do in your dorm room. And people like Elon Musk, who are trying to create cars and batteries -- those are fields, like the financial sector and the health sector, that have not yet been disrupted as much as they should be by the digital revolution.
GOLBECKWe have an email from Will, in Adelphi, who says, "What are some of the stories about equally brilliant people who were doing similar things but failed to make it big? Was the difference luck, social skills or something else?"
ISAACSONWell, obviously, there's some luck involved. I do think it's mainly a matter of execution. There are so many people -- you know, I'll walk around a TechCrunch conference and people are, you know, grabbing onto you and showing you their new thing or their innovation or their plan or whatever. And I'll pull out my iPhone and say, okay. Let me download it and see how it works. They say, oh, no, no. It's just a concept.
ISAACSONWell, you know, concepts are a dime a dozen. What usually fails is when people can't create the right team and that's what we've said over and over again. They can't find the right engineers or the marketers or the business people or the sales people. Even America Online had this great visionary Bill Von Meister, but it took Jim Kimsey who, you know, ran bars here on Connecticut Avenue, and Steve Case, you know, to sort of say, oh, I can turn this into a real service. So I think that's the big leap. It ain't luck. It's execution.
GOLBECKLet's take a call from Ken, in King George, Va. Ken, you're on the air. Go ahead.
KENHi. Good program. I've been working with computers and software development since the mid-'60s on the Navy base down here. And so I've gotten to see a lot of changes. And I'm still kind of a geek. And so I try to keep up with the technology. And what I see are new things like Google Glass, and eventually -- and I agree with you that I think the future is with computers coming together with humans. And that's my question: given things like Google Glass, the embedded computer technology it will have in it will help us talk to each other and collaborate much quicker with even larger groups. Where do you see us going? It looks like a new world brain almost.
ISAACSONWell, I think that's a very important concept, which, as we said, computers do become more intimate. I've already got my Google Glass. I can't wait for my Google -- I mean, for my Apple Watch. And, you know, I have my computer -- two computers in my pocket, basically, my iPhone and my other device, which have more computing power than my first PC had. So I do think mobile and social networks are the phase we're in right now. And it helps reinforce that the Holy Grail is not artificial intelligence, but human computer interaction.
GOLBECKWe've talked on this show about the tension in American schools today between STEM education and the humanities. And you write about the need for both visionaries and doers. Do you think the schools are doing a good job of nurturing both of those tendencies?
ISAACSONYeah, I think we're starting to see schools teach collaboration and teamwork. You know, back when I went to school that was called cheating. But now, you know, saying, okay, do this in a team, like the Computer Science 50 course at Harvard. Everybody's supposed to do their work as a team, not, you know, try to each do individual projects. I also think that we're good at, you know, sparking creativity. We allow people to -- kids to question authority, which actually is important for the digital age, and be comfortable with the free flow of information.
ISAACSONHowever, I mean, there are a couple of things I worry about. One is that we used to have the best education system in the world, measured by graduation or math or reading rates. Now we're about 17th, 18th, 19th, however you want to measure it. And so we're not going to be the most productive and innovative economy if we don't have the best K-12 education. And equally frightening is that there's become a gap between the good education rich kids can get and the education poorer kids get.
ISAACSONAnd I think when you have a divide like that it's not good for having a revolution, especially in information technology, where people are going to have to have good feel for how information technology works to succeed in the new economy.
GOLBECKAnd following on our conversation about women, we had a call from Olivia, who couldn't stay on the line. And she said, "In terms of college preparedness for computer science, how many women might be shut out from college courses because they're not educated in computers in high school?" And this I think is an issue not just in high school, but going back to what you talked about at the beginning of the show, where you're in the basement as a kid and you're taking stuff apart and you're building all kinds of crazy things. That is traditionally something that boys are encouraged to do more than girls. And that plays into this whole STEM debate.
ISAACSONYeah, I mean, I would hope that everybody should have access to computers. And I think the bigger problem now is between, you know, people who were born in more privileged backgrounds -- they have Wi-Fi, they have laptops -- and many kids, you know, like the kids in New Orleans that I work with, because that's my hometown, where just getting a cellphone, getting Wi-Fi at home is pretty difficult.
ISAACSONSo I think we have to make sure that we don't create a digital divide. I also think that, you know, this notion that, well, the humanities and arts are important -- yeah, we've got to teach the humanities and the arts because that's what makes us both creatively good at working with our computers, but also ethically good. We have to keep up ethically. You know, the first half of your show was about moral issues we deal with on privacy and everything else.
ISAACSONSo that's an important thing on the humanities side. But the flip side is also true. I love people who applaud the humanities and arts, and then I say, yeah, but, you know, you would be appalled if somebody didn't know the difference between Macbeth and Hamlet. But yet, you kind of brag that you don't know the difference between a gene and a chromosome or a transistor and a resistor or an integral equation and a differential equation.
ISAACSONSo the notion that some girls or some people should grow up learning arts, but not also immersing themselves in the beauty of science and math, I think that's a dangerous concept. We should all -- like Ada Lovelace did -- think, oh, I get it. A line of math is just something I can visualize and it's as beautiful as some of my dad's poetry. Like "she walks in beauty, like the night," you know, Ada could visualize that. An algorithm, she could visualize that. So I think that's what brings people in more inclusively.
GOLBECKAnd I agree. I think if we could teach kids in elementary and high school the beauty of an equation or an algorithm, which really, truly is there…
ISAACSONAnd to visualize it. Just like it's the rosy-fingered dawn. You know, I hate the way we teach math now, where you'd even be doing calculus or algebra and you don't realize that it's a brush stroke painting something beautiful in nature.
GOLBECKI wish we could talk more, but we're out of time. I'd like to thank Walter Isaacson, author of the new book, "The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution." Thanks for joining us. I'm Jen Golbeck, sitting in on "The Kojo Nnamdi Show." Thanks for listening.