This is the Moraine Valley Library Podcast. I'm Troy.

And I'm Tish, and today we're going to talk about the upcoming One Book, One College program on Isaac Asimov's I, Robot. We are joined today by two faculty members.

Hi, I'm Carey Millsap-Spears, and I teach composition and literature.

I am Amani Wazwaz, and I also teach composition and literature.

Thank you both for being here. Just as a quick overview, I want to read a little bit about the book I, Robot, because I think it's a historic piece and really is foundational in science fiction. This is from the Encyclopaedia Britannica page; I thought they did a very nice job: "I, Robot is a collection of nine short stories by science fiction writer Isaac Asimov that imagines the development of the positronic brain and the development of robots, and wrestles with the moral implications of that technology. The stories originally appeared in science fiction magazines between 1940 and 1950; in 1950 they were first published together in book form. Asimov's treatment of robots as being programmed with ethics rather than as marauding metal monsters was greatly influential in the development of science fiction over the 20th century." So it seems like an important book for us to pick for this One Book as technology is more and more intertwined in our lives. Although this book is really focused on technology, robots, and the creation of robotics, Amani, could you talk a little bit about why this book is significant in helping us understand ourselves, actually, more than understanding robots?

All right, Tish, this selection for this year is extraordinary, and it's very significant for the very reason that you mentioned. I'll backtrack just a little bit and say that in the 1940s, when Isaac Asimov came up with the idea of robots, these stories were earth-shattering. They were new.
Now granted, much, much earlier, Mary Shelley in Frankenstein had come up with this phenomenal idea of humans creating a creature that thinks on its own, and she delved very deeply into the psychology of this creature, who becomes very humanistic, very humanoid. Then we have a lapse of years and years, and in the 1940s Isaac Asimov comes up with these incredible, very thought-provoking stories. What I love about the fact that Moraine will be reading these is that this is a very thought-centered collection of stories.

What's beautiful about Asimov is that in creating these stories, he helps us understand ourselves: who are we? We have been developing technologies, but the idea of the thinking being, the thinking robot, is something very significant, and Asimov starts to have us try to understand our own ways of thinking, for example, by putting us in these situations where we're interacting with robots. He allows us to understand: why is it that we develop attachments to such robots? And we see that it's our humanity; it's the way that we interact with robots. We assign them particular traits and we make memories with them, and it's for the very fact that we create memories and interact with them that we come to care for them, that we become attached to them. We become attached to our very own creations, and he's asking us to think about that, and to think about what it is that makes a social human being. Just because a child, a teenager, a young adult is interacting with technology and communicating with it, does that make that human being less social, less of a social being? Does it take two human beings interacting with one another to create a fully social, fully functioning human being? Is technology bad? He's opening our hearts and souls to these questions. You get characters who are terrified of robots.
Robots are not good, they're not good creatures, they don't have a built-in sense of ethics or morals. So Asimov is also pushing us to ask whether maybe what we should do is take a look at our own ways of thinking: what is it that makes us uncomfortable with these robots? Is it actually them, or is it actually us, that we should be terrified of?

Something that is fascinating about Asimov is the role of spirituality and religion as well. You get his robots acting so much like us, acting so much like human beings. We as humans refuse to accept that we are just skin, bones, and muscle, that we're workers; oh no, there's got to be a greater reason for our existence. And so when you have Asimov's robots thinking this way, you know this is us. It's Asimov trying to have us understand ourselves on a deeper level: that we want to assign a greater purpose to our existence. So he's having us ask these questions: how is it that we reason the purpose of our being into our world? It's as if Asimov is telling us, oh, it's these robots who are thinking that way, but we know, we know this is us. And here we are, human beings, thinking, oh my god, we created something and we've lost control. We should have been the gods, but they've developed a different divinity, different divine beings for them to bow down to, and not us. We have become the jealous gods, in other words. In that way I see Asimov poking fun at human beings, at how arrogant we are.

The robots ultimately are following the rules that are embedded within their robotic brains, and the rules go like this. Rule number one: you should not hurt any human being. Rule number two: you should obey orders, unless that violates rule number one. And rule number three: you should not let yourself get hurt as a robot, unless that violates rule number one or rule number two. And so we're embedding these rules in the minds of these robots so that we human beings are going to be protected.
But wait a second: our rules of morality, are they absolute truths? The philosophical rules that we stand by, are they absolutes, or are they subject to context? Are they subject to a variety of different conflicts and a variety of different settings? And Asimov, in a very genius fashion, shows that there are no absolutes. Sometimes, in order to obey rule number two, you've got to somehow violate rule number one or rule number three. And so what we see is that at times the robots cannot function; they just sit there, and then they shake, and they twirl around, and they dance in circles, because they don't know what to do. I think this is a very important conversation that Asimov brings to the table. He brings in beautiful ideas; later on, Stephen Hawking would say, be careful, artificial intelligence may take over, and Asimov also brings in people's fears like that. But as I mentioned earlier, it's more like, yes, because we want to stay in control.

Well, how do you think that then translates back to Frankenstein? You bring up that Shelley raises this first. So what kind of connection do you make?

Okay, the connection I make between the two, Carey, is the following. Mary Shelley humanizes the monster very deeply, and the monster becomes a human being, and the monster becomes so incredibly lonely and lonesome and feels such depth of rejection from Dr. Frankenstein. I've taught Frankenstein, and I've taught it in a religion and literature class, a multi-faith, multicultural religion and literature class. And what I see between both of them is, for Mary Shelley's monster, it's: you have forsaken me. You have created me, and you have left me alone, and here I am, you know, I am frail. I have my own vulnerabilities, and you have left me, and you have discarded me. How dare you.
And Mary Shelley beautifully speaks to how we create and then let go of our creations, how we do not take responsibility for what we have created. And we see that here as well: I've created something, and how dare it not respond the way that I envisioned it. You know, it's almost like a dysfunctional parent-child relationship. And when I think of spirituality and religion, it's almost like the fall from grace. Human beings are created, expected to be perfect, and then when they're not, they're cast down from the Garden of Eden. We see that here too. It's almost scary: we human beings are the robots.

You know, I one hundred percent agree with you, and when I read the first chapter of I, Robot, all I could imagine was the creature in Frankenstein, and how in the novel he has a voice. If you watch the James Whale movies, Frankenstein, The Bride of Frankenstein, and all the other monster films of that era, the creature doesn't have a voice. And he's very similar to Robbie the robot, right? He exhibits emotions, he can emote, but he can't speak. But one of my favorite passages in the world would be when the creature meets Victor in the countryside and begs him for a mate. He says, "Have I not suffered enough, that you seek to increase my misery? Life, although it may only be an accumulation of anguish, is dear to me, and I will defend it. Remember, thou hast made me more powerful than thyself; my height is superior to thine; my joints more supple. But I will not be tempted to set myself in opposition to thee. I am thy creature, and I will be even mild and docile to my natural lord and king if thou wilt perform thy part, the which thou owest me." And it reminded me so much of when Robbie is cast out of the family, right?
So, for those of you who haven't read Frankenstein, the creature lives next door to a family in volume 2, and he is essentially cast out from the family when they see him. No one goes after the creature; no one helps the creature. But in I, Robot, in the chapter "Robbie," the little girl, Gloria, says to her mother: "He was not a machine," screamed Gloria. "He was a person just like you and me and he was my friend." I mean, I think Asimov is giving... I don't know, I'm more Team Creature than Team Victor, because I think Victor does everything wrong in Frankenstein. But that's another conversation. I think Asimov gives the creature a little bit of a voice in Robbie, and I just loved that chapter. It was probably my favorite in I, Robot. I don't know, did you have another one that you liked?

I loved the "Robbie" chapter as well. I loved the innocence of Robbie. I loved so much that Robbie is such a sweetheart and that he wants to listen to Cinderella.

Yes, over and over and over again.

I love that as well, Carey. I love the "Robbie" chapter, and I also loved "Reason" tremendously, and I loved the "Liar!" chapter a lot. I loved it because the robot basically lies to the human beings and tells them what's on their minds, what their wishes are, and at first the human beings are like, oh my god, their wishes have come true. And then when they realize the robot has just been lying to them, they cannot handle it. That robot, I think his name was Herbie, right?

I think so.

Okay, so I loved Robbie, and I loved Herbie as well. It's sort of like, oh my god, how dare you do that to Herbie, you know? He was being good, he was being kind and compassionate and telling you your dreams. How dare you do that to him? But again, you know, in our world, Asimov is asking us: what does it mean to be a compassionate human being?
What does it mean to be a good human being? Herbie was good. Herbie was compassionate.

Maybe too compassionate, do you think? You know, Asimov gives them these great names, Robbie and Herbie, very childlike names.

And they're cute, yeah, like Cutie. They have these elements to them that made me really think about Rousseau and the idea of childhood, the idea that people are born good and society makes them evil. I think that kind of plays out in I, Robot in a lot of these different stories. And I study a lot of popular culture; as you know, I'm a nerd and a big science fiction fan, and I guess if one could be a scholar of Star Trek, I would be one of those. In Star Trek: The Next Generation there are a pair of android brothers, quote-unquote brothers, both played by Brent Spiner: Data and his brother Lore. They're actually sort of the twin and the evil twin incarnate, and one actually dresses in black; it's very on the nose as far as that goes. The positronic brain is part of their makeup, and they were created to be better than humans. Data can feel no emotions, but Lore can feel all the emotions, and that's where their conflict comes from throughout the series. And I find it interesting that they even mention Asimov in some of the episodes discussing these androids. In one episode in particular, Data is put on trial to find out if, in fact, he is real enough to be sentient. This idea of sentient beings, can we create them? Again, I think Asimov starts those conversations that are pretty typical in popular culture today, everything from Battlestar Galactica and the Cylons to, you know,
Star Trek and the different incarnations of technology through that. But you even have it in more, you know, realistic stories, with Westworld being another example. Those androids are created for entertainment. Some people just go to Westworld to be entertained by these, you know, proto-humans. They look like humans, they feel like humans, but they're not. So I think popular culture has taken it into a whole other realm, in some ways even farther than what Asimov envisions in this particular set of short stories. But I want to ask you one other thing. You mentioned religion in a couple of ways, and this idea of taking on the role of God. Do you think the robots see it that way? Or just the creators?

Do I think the robots see it that way?

Yeah, when they see the rules, are the rules sort of religious in nature? Or do you think they're just sort of guidelines that they follow?

Okay, concerning the rules, I think they just follow them. They just follow the rules, and they don't see them as a kind of religion. But in "Reason," which is one of my favorite stories, what happens is Cutie becomes a thinking, reasoning robot, and he starts thinking about his higher purpose and his place of existence. So when he himself starts asking about his place of existence in the world, he refuses to believe that he is there to replace a worker,
to be just a worker, and that human beings are just going to go away. He even says, you know, I realized that once I started thinking, there is more to me than my place of existence. So it's very interesting that Asimov has the robot thinking this way, but I think what Asimov is trying to say is: we think this way. We human beings are meaning-makers; we want a higher purpose. And I think he continues a conversation that many, many writers and thinkers have been having for the longest time. I think his genius is that instead of having human beings talk about this in a Shakespearean play, or in a regular novel or short story, he has our mirror reflections, which are robots, thinking like us, and this is what I find fascinating. So in terms of the rules, I think they've been embedded into them, but particularly in the "Reason" chapter they start to switch it around: oh, you know, we are the better ones, we are the superior beings. And "thou shalt not hurt"? Well, I'm smarter than you; I, the robot, am smarter than you, and I will not hurt you. I will do whatever I can to not hurt you. I'll control you.

Right. I mean, I think that's sort of the core of these things: you take these ideas to a certain extreme, and then you have the aftermath of that extreme positioning in these kinds of narratives. To go back to my example of those androids, the same thing happens with the android brothers. One decides that he will become the ruler of everyone because he's the smartest; he has the fastest-thinking brain, he can do all these things, and he's superhuman in his strength and all of that. But it comes from a position not of wanting to help other people; it's just wanting to rule them. So I think that definitely speaks to what you're saying. And I wanted to mention that you also bring up the idea that popular culture gives us these mirrors that are
different from, you know, reading Shakespeare or any other very difficult texts. I've met faculty from other places and other disciplines who sometimes think pop culture is not a way to talk about things at a college or university level, but I would say the opposite is true. It gives us a way into a conversation; it kind of puts our foot in the door of a big conversation. So if I were to bring I, Robot into my classroom, and we were to read the first chapter, "Robbie" (spoiler alert, that'll probably happen in the fall), then I might pair that with a discussion of, say, James Whale's version of the creature, and have students start thinking about that before they would even attempt to read Frankenstein, for example, because it's sort of far removed from us. So can you think of any other examples that you would pair I, Robot with?

Okay, what I thought of was pairing I, Robot with the movie A.I. Artificial Intelligence, because in that film the little boy robot starts to become a feeling being and starts to hunger for the sense of a mother so deeply that he, okay, spoiler, spoiler, I apologize, he even starts to hallucinate a mother figure in the end.

And that's Spielberg's, yes?

Yeah, Spielberg's A.I. Artificial Intelligence. I thought of pairing this book for my COM 101; I am going to be teaching this in COM 101, and in COM 102 as well. So I thought of Spielberg's A.I. Artificial Intelligence, and I also thought of Frankenstein, because the creature is such a humanized monster. I also thought of pairing this with writings by Stephen Hawking, where he warns against artificial intelligence taking over humanity. Another one is a documentary called Prophets of Doom, where in a certain part they warn against robots taking over.
Well, I think it's kind of funny, because we have all these warnings against these things, you know, we shouldn't have artificial intelligence, robots are going to take over. I often show a commercial in my COM 101 to teach the straw man fallacy. It's an old Dodge Charger commercial, and it actually says, we've seen that movie, robots are going to harvest our bodies, or whatever, and it's basically arguing that if you have too much technology in your car, your car is going to break down; that's the metaphor used in the commercial. So we have all these warnings against these things, but yet we still do it. I went to Lowe's and there are no cashiers; I have to self-checkout everything. And the automation of work, I think, is something else that you brought up, and I think it's also present in Frankenstein, this idea that we can create beings to take on the jobs that we don't want to do. I can give you an example from another science fiction series that I'm obsessed with, Battlestar Galactica, and it's the rebooted one, not the one from the '70s where they were way too groovy. In the rebooted one, they built all these machines to take on their jobs, and essentially the machines become more human, more fully realized. They think themselves perfect, and they actually create their own religion, to go back to that. So they do everything that Amani said earlier, and then they come back to basically wreak havoc on their creators out of revenge. It's sort of this idea that the machines take on all of the work that no one else wants to do, and they have to do all that work. So do you see this novel, this collection of stories, I, Robot, as sort of a warning against automation, or technology, or our becoming more like cyborgs? I mean, I'm wearing a Fitbit, I have a phone in my pocket that connects to my car when I get into it, so what am I now, right? I really found the last two stories so
compelling, right, asking the question: could you tell the difference between a human and a robot? And also, what is work, what is the economy, and what if our technology just decides for us that we aren't needed in the economy? I think those are key themes that we wanted to talk about over this next year, which is part of why we selected the book. I don't know what that means for our students, what kind of work will be available for them in the future. I don't know; I think it's an interesting question, but I do think it's something else that's brought up in this novel.

Okay, for me this has begun the conversation, and I look forward so much to August, September, October, November. I'm really looking forward to seeing how this is going to branch out.

Well, I'm really looking forward to the discussion that we could have between humanities faculty and STEM faculty about this kind of story, about this discussion of technology and humanity. Because I think in the humanities we've been fighting for our existence lately, arguing that we are relevant, that we have a place in the world. And yes, you can get a great job studying engineering, and I love engineers; my son's going to be an engineer, so that's not the issue. But I think we also have to discuss the fact that at the core of engineering there are human discussions, there are human feelings, there are things we need to be aware of. So I'm most excited to talk about the intersection between the humanities and STEM in stories like this. I think that would be fun.

Thank you both so much. I am super excited about thinking about all of these things over the next year, and I have some great ideas and great questions that I can't wait to pose to you later, and to all of our other faculty and our students. I really enjoyed this conversation.

Yes, thank you so much.

Thanks.