Matthew
Hello, welcome to this episode of Superhero Ethics. Today we have a great guest coming back, Rob McKenzie. This is probably his fourth or fifth time on, maybe even more. He’s going to help me continue the series that I started with Paul Hoppe about a month ago: What Made You Rebel Scum? Rob, how are you doing today?
Rob
I’m doing pretty well. How are you doing, Matthew?
Matthew
I’m good. I was in Palm Springs over the weekend doing some judge stuff. Palm Springs is proof that literal hell on earth does exist. It is both pretty and horrifically hot. And then, just to help you understand that you are now in hell, the airport is open air. There’s no air conditioning. There are gates that have a little bit of coverage, but no air conditioning there, and then you go outside and it’s just hell. It’s just the huge ball of fire in the sky telling you, you came to a desert. So, I mean, there are strong opinions and other people really like it. It was beautiful to look at from the few places on earth that were air conditioned, but a little hot.
Rob
And we’re in the future. You could use solar panels to air condition now.
Matthew
So it could have been a thing. It could have been a thing. But anyway, putting all that aside, we’re continuing a fun series that we had. Oh, I’m sorry. Rob, tell me more about how you were doing though.
Rob
I’ve been doing pretty well. Since last time I was on the show, I moved positions within my job. I’m managing judges in the Magic Judge Program, and it’s been fun.
Matthew
Sounds good. Well, you are someone who I’ve been having the kind of ethical and moral and fandom conversations that this podcast is about for years and years, and I always love hearing your perspectives. I know that you and I don’t always line up on every social and moral and political issue, but I think we’re both the kind of people that the right would not like very much — we’re somewhere on the woke scale, however you want to define it. And the thing I’ve been doing with this series, titled What Made You Rebel Scum?, is looking back at what media, whatever Rebel Scum means to you, helped shape who you are and what your values are — how you see the world politically, morally, ethically, socially, all of that. I wanted to throw it out to you: when I say, what helped shape Rob into the person who believes the things that he believes, what comes to mind?
Rob
There are a couple of big ones. My favorite author is Isaac Asimov. He was an avowed pacifist for his whole life. He lived through World War II, but he was a chemist, and he volunteered to work in a chemistry lab instead of going out and doing the fighting. He believed it was important to do things for the war, but he couldn’t stomach being violent. And he’s got a couple of things that have always kind of stuck with me. There’s a phrase, clearly kind of tongue in cheek, from one of his characters in the Foundation books, Salvor Hardin: violence is the last refuge of the incompetent. The structure of the story is that Terminus, the Foundation, has no political power, no military power. Everything it achieves is due to being smarter than everybody and using social pressure. And Salvor Hardin juggles four major warring powers and ends up in control of the region, despite Terminus not having any resources, precisely because he doesn’t have the power to do violence. But on the flip side, violence in Asimov’s books really doesn’t help the people doing it for the most part. It’s always the villains who resort to it. It’s always wrong to make the violent choice, at least through his early works. It becomes more complicated as he gets older.
Foundation is clearly written when he’s reasonably young. But in his robot stories, there are the Three Laws of Robotics: first, a robot may not harm a human or, through inaction, allow a human to come to harm; second, it has to obey; third, it has to protect its own existence. Susan Calvin says that robots are better than us: they can’t hurt you. That number one — don’t hurt people — is like the golden rule of everything. And the robot stories are very complicated in that there’s a lot of: what happens if you have to make a choice that’s going to hurt somebody? There are these trade-offs. But thinking in a frame of what hurts people the least is a good frame, I think, a lot of the time, and I don’t think enough people make those choices.
Matthew
Yeah, it really is. And I’ve heard the Laws of Robotics applied in many situations — some of them by Asimov himself, and others by people taking it further. One of my favorites is a little meme that says: the first law we taught the robots was, do not allow humanity to be harmed through your inaction. We didn’t teach them that capitalism was the exception. The implication being that the robots were just like, well, capitalism harms people, imperialism harms people — we weren’t programmed to accept the claim that those systems minimize harm, because that’s often a lie. So the robots just say, no more capitalism, no more imperialism.
Rob
There’s a story I mentioned before — “The Evitable Conflict,” not “inevitable,” “evitable” — where AIs that follow the Laws of Robotics are shaking loose these problematic elements of society. Nobody gets consulted: a factory owner suddenly finds his widgets aren’t necessary anymore. The factory dries up and he doesn’t have resources or power anymore because the economics of the situation changed. And the AIs are trying to do the least harm. They’re like, he will do more harm later — if we hurt him a little bit now, he’ll harm fewer people down the line. Which is both good and creepy. In the real world, I don’t actually trust humans for the most part to consistently apply that level of judgment, and so everything should have checks and balances.
Matthew
And that’s one of my main questions — when you start with the idea of let’s figure out how to do the most good for the most people and the least harm to the least people, there’s a lot of value in that kind of thinking. But it can also become utilitarianism where you’re trying to do the most good for the most people, but if you can save a thousand by murdering thirty, then it is perfectly okay to murder thirty. And people who believe in utilitarianism are now yelling at me — I’m obviously taking it to an extreme — but that is often a justification used for some of the worst people in history who have claimed they’re going to make everything better by just killing this one group or enforcing this kind of thing. So in Asimov’s work, how is that balanced? Because maybe actually the best thing to do is to kill all the billionaires — that’s becoming more and more of a real idea for some people. What would Asimov’s response be?
Rob
That’s the thread that runs through The Robots of Dawn and Robots and Empire. So you said, we taught them not to harm humanity. Humanity and humans are different.
Matthew
Okay. Say more.
Rob
The robots are taught not to harm humans, but not humanity. But Daneel Olivaw sits and figures out — I don’t think that’s good enough. He thinks humanity needs to come before individual humans, which is very utilitarian. And he makes a choice to irradiate Earth, to drive everybody off of it, because he believes that people need to leave the nest. Baby birds are happier and safe in the nest, but they’ve got to fly — and some baby birds don’t make it when you push them out. Some don’t fly and get caught when they hit the ground. And he’s like, I can’t prevent all harm. I can’t stop everyone from being harmed everywhere. So I have to make the choice that I think is best. And that choice does kill a lot of people and harm a lot of people. He’s like, I can make the choice about sacrificing this person for the greater good — but it’s a fairly clear greater good from his perspective. The people who died on Earth would probably consider that he could have made a better choice.
And Asimov is obviously conflicted, and I’m obviously conflicted. You have to try to not do active evil, but there’s a lot of gray. My day job — I work with companies that produce military things. I support them because they’re one of our many manufacturing customers. I can’t refuse to support a company that makes missiles or drones or guns or whatever.
Matthew
I’m going to push back a bit there — and I’m not questioning your choices — but I do just want to make sure we’re clear on the language: you could choose to refuse, but it would mean putting you and your family in economic danger.
Rob
Well, it’s not even just economic danger. I think it’s more complicated than that. Companies aren’t exclusively one thing or another, just like people aren’t. Situations aren’t exclusively black and white. As much as there’s a group of people in power that are trying to make it very clear that cruelty is the point — we’re going to keep sending thugs into Minneapolis until morale improves — there are a lot of these black-and-white situations on the surface. But almost all things are gray to some degree. Very few companies do just one thing. Do you want the ability to produce both an air purifier for your house and a robotic missile system? Well, you should go to Honeywell, because they manufacture both. That kind of complexity is also what makes pushing back against bad decisions and evil decisions important. You have to figure out where your gray line is and how you improve it.
Matthew
I think it’s very true. And my point was about moral purity — the other side of it is, let’s say you decide to quit your job. Someone else is just going to take that job. Is that a sacrifice that’s actually going to do good, as opposed to something else?
Rob
And sometimes it does. Sometimes the job just doesn’t have to exist. You can make different choices. That’s another thing I’ve taken away from science fiction: you can always make choices to improve things, and you can take agency in a lot of these decisions, even if you’re in a relatively small position. Like, I push back against management at my job, I push back on things in the state. I’ve started going to state conventions for the Democratic Party, and I’ve been messaging my soon-to-be state senator — who’s running uncontested in the Democratic primary and is very likely to just roll through — because I want more worker protections. I want laws that improve the lives of people. I’d love it if the laws changed so that your time off had to be paid out at the end of every job. I think it’s a bad thing for workers that time off, which is part of your compensation, can just get thrown away by the company. And you can make these pushes and changes. You have the agency to do so.
Matthew
Yeah. Well, I think it’s such an important thing — figuring out where can I use my agency, where can I do the most good, where am I helping a system I don’t like but where stepping away isn’t actually going to change anything? Can I do something else?
I want to go to a different part of this though. Something I haven’t read much of, but have read some of and certainly read a lot about. Part of what I really like about that idea — the robots being programmed not to allow humans to be harmed by inaction, and thus being compelled to make us stop all the things we do all the time to harm humans without thinking — to me, that connects to how I really like science fiction that pushes us to think: what are the times when we limit our own agency because there are choices we are not allowing ourselves to consider?
And I want to give another example that I see a lot as a polyamorous person, though I think it applies to a lot of other situations in life. Sometimes I’ll talk to someone and they’ll say, I wish I was polyamorous like you, because I’m with this person — it’s an okay relationship, but I’m in it and want to make it work — but there’s another person, my work crush or whatever, and I just have such a crush on them and I wish I could be with them, but I can’t. And what I might say, if a friend is coming to me for advice, is: it might be helpful to reframe this. You could have absolutely cheated on your partner. You made a choice — you are so dedicated to this relationship that that is basically something outside of the realm of possible choices that you allow yourself. And that’s a completely reasonable choice, but own that. Or, on the flip side, ask yourself: is it correct that I’m saying I cannot quit my job, cannot leave this relationship, cannot stop talking to my parents or my sibling who abuses me? The ability to ask, what if we don’t make that assumption? — to me, that’s something really powerful about imagining the robotic mind. If you give it a conclusion humans have come to but not the steps that got there, it’s often not going to make the same assumptions.
Rob
And it isn’t going to follow the same thought process. You can even see this with people. With all stories, it’s an exploration of people. Someone else is going to make totally different decisions than you — they might not come from the same assumptions. And the robots make it really clear that they have a strong ethical boundary. But other people’s ethical boundaries are more porous. And so you get to see what different types of ethical porousness look like in comparison. Making the decision not to do a thing — taking the example of not cheating on your partner — you’ve made a strong ethical commitment to a particular course of action. What does that mean? What does that imply about the system you’re in? What are you gaining? What are you losing? What does the commitment mean? If robots never had to step in front of a speeding car to save a child, it wouldn’t be a big deal. If there’s no challenge, the ethical line doesn’t mean a lot. It’s when you push on it that it matters. And this is an abstraction of don’t harm humans, but you can think about it as don’t cheat on your partner. Are there circumstances where a robot maybe should harm a human? Are there circumstances where you maybe should cheat on your partner? Maybe — I don’t necessarily agree — but that’s what the decisions and thoughts are about.
Matthew
And especially because the statement “do not harm a human or allow a human to be harmed” — first you have to talk about what does harm mean? But also, what does it mean if I’m not supposed to allow a human to be harmed, but the only way to avoid five humans being harmed is to harm one human? If the only way you can stop a person from pressing the button that sets off the nuke is to kill them, is harming that one human to prevent the harming of five million justified?
One of my favorite courses in ethics was in grad school. The class started by talking about whether there can be an act that is purely, always a moral good or a moral bad. One of the acts suggested was: taking a knife to cut another human being. And because we were kind of dumb and trying to figure this out, we were like, yeah, that’s a clear moral bad. And the professor says, what if a surgeon is cutting you open to take out your appendix? Is that harming them to avoid harm? The act of cutting someone open in our day and age is about 99.5% safe, but infections can still happen. And a scenario I’m thinking of — because I’ve seen it on so many ER-type shows — a person is in a building that’s falling down, something incredibly heavy has fallen on their leg, and the only way to save their life is to amputate it. Can an Asimov robot come to the conclusion that cutting off this leg is objectively harming this person, but if I fail to do that, the person will die? Is that a kind of compromise an Asimov robot can make?
Rob
That’s what I like about these stories, because there’s a progression. There’s a company that makes robots, and they become better and more sophisticated over time — there are a lot of lessons learned, and the robots become more and more sophisticated as they go on. In early stories, the robot definitely would have let the human die, and its mind would have broken — it would have locked up. In later stories, it would have made the choice to cut the leg, and probably still would have locked up as a result. In much later ones, it would have just been like, yep, this is a sensible decision, and just done it. So it’s also about where they fall on a social comprehension spectrum. Child morality is very different from adult morality in a lot of circumstances. “Never take cookies from the cookie jar” is a very good maxim for a child, but it’s not as useful for an adult.
Matthew
Right. Your diabetic friend is going in and needs sugar ASAP. And Les Misérables is definitely one of the stories that most shaped me — that whole idea of how do you understand stealing when someone is starving? Because I’m now in a lot of parental discussion groups, there’s a meme that goes around: if you see someone stealing diapers, no you didn’t. And I always love that. I often have the caveat of: if I see them stealing from a mom-and-pop grocery store, I might just go and quietly give an extra $30 to the store to cover the cost. If they steal from Walmart, I definitely don’t care. Someone who needs to steal diapers — something is so broken that they need to do that — I’m not going to stop it.
Rob
I’m also going to make the contention that anything you’d probably class as a non-taxable item — like food; in Minnesota, we don’t tax food — I think stealing it is socially wrong, but not morally wrong. Because socially, what ends up happening is if people steal the same thing over and over, either stores stop stocking it and other people lose out, or they do the crazy thing where they put everything behind glass and you have to ask somebody. I went to a hardware store at one point where there had been so much shoplifting they chained the aisles and didn’t allow you to go in alone. They were like, we can’t stop this otherwise. From a corporate-store perspective, they can stand to have the shrink. But then other people get punished functionally one way or another. So if you see somebody stealing it, saying, hey, if you feel like you need to steal that, here’s the money for it — I think that’s the maximum good, because then it prevents the other harms I see.
Matthew
So there are a couple other things I know have been influential to you that I want to get to. Any last things you want to say about Asimov and the Laws of Robotics and Foundation?
Rob
Just that Asimov is not a flowery writer, but he is a writer who tries to make you realize that other people have different perspectives and makes you think. There are very few scenery-chewing villains in Asimov. There’s a lot of: I’ve got a totally different motive and I’m pushing to do a different thing.
Matthew
I’ve been working my way through the Foundation TV show, which I’m sure you have strong opinions on, one way or another — probably both. I think we’ll definitely do an episode on that. I’m going to guess that another person in your ethical constellation — and I don’t know if it helped shape your thoughts or if you enjoyed it because it confirmed them — is someone primarily known for his humor but one of the foremost ethical writers in science fiction and fantasy: Terry Pratchett.
Rob
Yeah. He’s one of the ones on my bullet-pointed list, and my notes on him say: righteous anger. Pratchett has a lot of characters that are very unhappy that the world isn’t the way they want it to be, that it isn’t morally aligned to their goals or dreams. And some of them have very simple ethical systems.
Matthew
I should say he’s primarily known for writing the Discworld series — that long series of books. He co-authored a number of books with Neil Gaiman, including Good Omens, although all evidence suggests that Terry had no idea about some of the stuff that has gotten Neil Gaiman into trouble. And he had a hand in a number of other science fiction and fantasy things. But the Discworld collection is where he quite openly explores a lot of these ethical questions.
Rob
Right. He openly explores a lot of them. The cleanest moral statement he makes in any of them is when Granny Weatherwax says that evil comes from treating people as things. And somebody’s like, it’s more complicated than that. And she says, it’s really not. Treat people as people and not as things. But there are characters that are angry, that are frustrated that the world isn’t the way it should be. There are characters that are changing. There are characters that try and fail. The big thing Pratchett has going for him is that everybody is just trying to live. As weird as everybody is — as strange as dwarves are, as much as werewolves exist — they’re just trying to live and get by. Everybody just wants to live life, be happy, have a family, have a circle of friends they defend. And he does have scenery-chewing villains — there are plenty of those in Pratchett, too.
But there’s one that nobody ever really talks about, which is a book called Strata by Terry Pratchett. It’s a science fiction book that is a pastiche of Ringworld by Larry Niven. And it is laser focused on me, because Ringworld is great. But the thing is, Strata is a story about how people think that everybody thinks like them. There are all these stories about aliens that are just kind of humans in rubber suits with very humanish motives — Star Trek, for all the things it does right, has aliens that are mostly exaggerations of a human characteristic, in order to tell a story. And in Strata, the point that keeps being made over and over is: the aliens don’t think like humans. You pretend that they do because it helps you get by as a human, but it’s not true. You’ve made these aliens into fake humans.
And it’s the same with people. It’s a common fallacy to think that everybody thinks and behaves like you. If everybody just didn’t drink very much and wasn’t very interested in smoking pot, it’d be a much better world — but I’ve come to terms with the fact that people are going to have different things that they like to do than me. That’s the way it is. You have to come to terms with the fact that people are going to be different from you.
Matthew
And I think that’s a fascinating counterpoint to Asimov, because one of the places where I often get squirrelly about the idea of robotic ethics is this: if you have the kind of perfect robotic, unbiased mind, it will come to the objectively proper ethical or moral conclusion. And that’s an idea I always think is deeply flawed. When I was in graduate school — I was at a school in California and there were a lot of students from Asia. I was in grad school to become a pastor, so we were talking about the Bible a lot. We talk a lot about how America is an incredibly individualistic society — there’s a lot of good in that, and also some problems. And friends of mine who were from Japan talked pretty openly about how the society they came from is a much more communitarian-focused society, in some ways a lot better than the US, in some ways also problematic. It was interesting when we realized we were both reading the same text but from very different lenses: are we looking at this in terms of the harm the community is doing to the individual, or the responsibility the community has to the individual, or the responsibility the individual has to the community? And more than anything, what came out of that was: there is no one way that is objectively better. I mean, some ethical systems are clearly worse than others — the Nazis thought they were doing the objectively right thing, and clearly that was not the case. But there’s real value in learning how someone else thinks and learning the values they start from, even if you’re not going to decide they’re your values.
Rob
And the instrumental goals they’re trying to achieve with those values, because there are values and there are the goals you want to achieve alongside them. No ethical system is objectively better, but it might be objectively more effective at achieving a particular task. Like, an ethical system that is a lot more community-property-focused and has a lot fewer individually owned pieces of property is a lot better in a resource-scarce system. If we’re on a shared floor of a building with a common central bathroom or central cooking area, who owns that is the question you ask. Like in a townhome, who owns the yards? Ownership isn’t necessarily relevant in a lot of situations where you’re working in a community system. If you’re trying to make sure that everybody gets their fair share, you might not have ownership. If you’re trying to make sure that everybody produces the maximum they can, you need some kind of system that encourages them to produce more.
Matthew
And that’s actually the perfect example, because that’s the kind of thing where conflict occurs a lot — situations where there is a communal resource that people are still approaching from an individual basis. Anyone who’s worked in an office with a shared kitchen: if you have people who have a shared feeling of all of us should be contributing to the upkeep of this kitchen for the general good, it’s often going to work out well. If you have people who say, this is my corner of the counter and this is my shelf in the fridge — well, each person has more freedom; maybe they put in things that are going to make everything else smell, but it’s their part of the fridge and they get to do what they want. I have a bias in that regard. But the example I can use is: if you treat the shared kitchen as I have one-seventh of it so I can do whatever the hell I want, that’s not so good. But if you’re seven homes in an HOA and you want some individuality in how you paint your house, and the HOA is saying no, no, no, we all have to paint our houses the same — that feels to me like the community infringing on individual rights quite a bit. How do you find that line? That’s a pretty hard thing.
Rob
Because that’s the question, and sometimes those rules are Chesterton’s fences. The HOA might have a rule that you’re not allowed to park on the street overnight — which sounds like a stupid rule, but it might be because they’ve had issues with people parking vehicles out there for long periods and blocking emergency vehicles when there was a fire. There might be this Chesterton’s fence of: we don’t want to block the road because the road is actually narrow. Chesterton’s fence is when you come upon a fence in a field across a hill or out in the woods, and you’re like, I don’t know why this fence is here, so you remove it. But instead you should be asking: why is this fence here? There’s a reason it was put up. I should figure out what the reason is before I remove it, before I change the rule. And so if you find out that the rule is you can’t park on the street overnight because the street is actually too narrow and they can’t get emergency vehicles down it — that might be a good rule. If painting your house a certain way is prohibited because they’re associated with an eldercare community next door and they need everything to look consistent, there might be a reason for it. But sometimes it’s a stupid reason — nobody likes the color blue, so you can’t paint things blue. Sometimes the reasons are bad. But finding out the reason for a rule — and even then, challenging those reasons — that’s another thing I’ve learned from science fiction. A lot of times the societal rules we have are there for no good reason whatsoever. If you put up a sign that says “no blacks allowed” in the United States, people understand really quick what’s going on. And Pratchett’s world pushes on this a lot: is racism against vampires racism?
Matthew
Yeah, because it’s the question of — and I think X-Men is another interesting way of dealing with this — one of the fundamental reasons why I think people should be against racism is that there actually is no scientific basis for it. There is no scientific thing you can point to that says this race is good and this race is bad, or even that races are clear and easy boxes that you can drop people into.
Rob
They’re a bundle of characteristics, but that isn’t the same as being the entirety of a person.
Matthew
Whereas to say — because people are different, there should be a different legal system that deals with them — that’s going to raise your hackles. But if I say this group of people are super-powered and can destroy any existing legal system, and therefore we need a different legal system to deal with them — as in X-Men — that plays with it. I like systems that — and again, ACAB, I don’t want any of these kinds of systems — but I like when science fiction does that, when it says: we know discrimination is bad, but what if it’s vampires, where there is a threat you can point to? How much is that still assumptions being made about the way they’re going to feed or the way they’re going to do harm? So many of these stories that we love get at: how do we ask more questions about things that might have made sense to us once but now we just take for granted?
Two of my favorite examples: one is ancient history, and one is directly in the work you and I do that we met through. In the Bible, so many of the rules about how you can act come from one of two specific reasons. One: without modern scientific knowledge, telling people that shellfish, or mixing cheese and meat, or eating pork is dangerous isn’t a good way to get them to stop — but if you say God doesn’t want you to do it, people often will stop. Two: your people are living in another land because you’ve been taken into exile, all the people around you do those things, and you want to stay separate — so you don’t do them. Okay, that makes sense in that one specific context. But here we are two, three, four thousand years later, still trying to apply these rules without really thinking about whether they still make sense.
The other, more common example — you and I are both part of judge programs. You were talking earlier about judging Magic, which is where I started. I mostly now judge Star Wars. And one thing I really like about Star Wars is seeing — the Magic judge program has been around for thirty years, and there’s a lot it has figured out very well. But there are also some things it does because they were created at a time when the technology available to your judge out on the floor was nothing like what we have today. And we’ve watched the judge program change slowly, kicking and screaming. How many events have you been to that still have a paper team, even though we’re not passing out paper match slips anymore? It’s all electronic, but they still call it a paper team. Star Wars — we’ve never had a paper team because we started without it. I’m not saying it’s better, but it’s an interesting example of: what are the things that people have been doing for twenty, thirty, three hundred, three thousand years, and it’s hard to get people to think, do we still need this? Maybe that fence was needed a thousand years ago. Is it still needed today?
Rob
Reevaluating your priors on a lot of these things — not just whether you need a team called “paper,” but whether the tasks and stuff you do regularly need to exist at all. Like, this fence has always been here. Have we asked why it’s here? That’s the other half of it.
Pratchett challenges that. Everybody’s always hated vampires. But eventually you have to come to terms with the fact that vampires are people, and everybody’s just people. Once you start introducing people into an economic system where there’s a lot of exchange and people want to do more with the resources in front of them, you run into the fact that vampires are just people who want the same thing. And you might say, well, they’re blood suckers that are going to dominate everybody. And the answer is: okay, so you have the choice to drive them all out, eradicate them, or figure out a way to learn to live with them. Figuring out a way to learn to live with them is the only one that works in the long term. That’s the way it always is in the world right now.
Matthew
Well, especially when he often talks about how much these things can become self-perpetuating problems. If you don’t allow vampires to exist in society and don’t allow them safe ways to feed themselves with consenting people, guess what — they’re going to have to resort to something else.
One of the great arguments that really shaped how I see things was a person who came and talked to us about racism and statistics in high school. He went through a lot of the common things that racists will use and would say: okay, so look at test scores academically. And this was in the ’80s and early ’90s. You’d say, okay, well, different racial groups have different overall averages of test scores, and whites or Asians tend to score higher and other groups tend to score lower. We could look at that in a vacuum and say, therefore, this group doesn’t need good schooling because they’re not smart enough — or some idiotic thing like that. Or we could look at the fact that the people we’re testing have been in substandard schools for generations, and the tests we’ve designed specifically test for kinds of knowledge you’re much more likely to have if your parents had a higher level of education. And how do we look at those things and say: oh, yeah, the ways we treat people we don’t like create the situations that we then use to justify that treatment. Like Jean Valjean — you can look at him and say, look, you’re just a petty criminal. But you’ve starved them enough that the only way they can get bread is to steal.
Rob
There are “race realists” out there who make the case that on average there are measurable differences — they’ll use height as a proxy for this, saying that on average Japanese people are shorter than people from Kenya, for instance. That is true. But it’s also only loosely heritable; you can actually change that very quickly. And there are a lot of complex other social dynamics that cause a lot of these situations. The brain is a lot more complicated than the body in terms of large, inheritable things. And also, when you boil a whole group of people down to nobody from this group is going to be good enough — that’s treating them a lot like they’re not fully people. And I’m not a fan.
Matthew
I’ve always loved Pratchett’s trolls. The books I know best are the Night Watch books, but I’ve read others. Trolls in the world most of the characters exist in seem very, very dumb — you know, they’re constantly knocking themselves unconscious by saluting and hitting themselves in the head, and it’s played for comedy. But one of the things they find out is that at one point one of them gets locked in a freezer unit and all of a sudden becomes utterly brilliant — because their brains are designed to function at extremely cold temperatures, like you often find high up atop mountains. When they come to our circumstances, their brains don’t work in the same way. And to me it’s another one of those things of: yes, they don’t have intelligence in a way we see, because in our environment their brains aren’t going to work very well. When we are in their environment, our hearts are probably going to stop working before our brains, so we don’t seem stupid — we just seem dead. But we don’t judge ourselves by how we’d function at the top of Mount Everest. We judge the trolls by how they function in our world.
Rob
Right, exactly. There’s a lot of this like “judging a fish for its tree-climbing ability.” People are more complicated than that. And that’s Pratchett’s take: meet people where they are. They’re more complicated. They want the same thing as you.
Matthew
Surprise to no one who has heard me and Rob talk about this before, but we’ve already gone almost an hour and covered probably only some of what I wanted to get to. Just to keep us from going too long, I’m going to throw one more out there. I know you’re a big fan of Star Trek and at least a significant part of the more recent Star Wars. Pick one of those and tell me how it has shaped who you are and your values.
Rob
Trek is obviously the default one. I was of the right age to be watching Next Generation and then Deep Space Nine on television as they were coming out. There’s obviously a lot all over the place — you’ll have back-to-back episodes of “The Magnificent Ferengi” and “Far Beyond the Stars,” which is like the biggest tonal shift between two consecutive episodes of a show. But the push to challenge the bounds of what is okay — people come back and look at Trek nowadays and say, well, why didn’t they push the bounds on non-binary people back in the day? And I’m like, they did. You have to realize that The Next Generation started in 1987. When you talk about non-binary stuff, you’re talking about thirty years of societal change. Of course they had episodes where they dealt with this, but they were also a product of their time. And if you look at it from the lens of today, it’s going to be different, but they pushed the bounds on everything. They had one of the first interracial kisses on American television, one of the first same-gender kisses. They asked really deep questions about when you should let people live or die. They had “Let That Be Your Last Battlefield,” where the two characters look exactly the same but are racist about which side of their face is which color. And especially in Next Gen, Picard’s motto was: we can try to be better. Whatever decisions we made in the past, sometimes we’ll be wrong, and we can be better.
Matthew
And I love that because you can also see Star Trek in a meta way through that, because there are also some ways in which Star Trek was trying to push the boundaries and got it pretty terribly wrong. They tried, in an episode early in season one, a story about a planet that was basically a white stereotype of African tribes. I understood what they were trying to get at, but what you wind up with is a very sexist, racist tale of a white woman who’s supposed to be the most badass combat officer on the ship being enthralled that the pseudo-African tribal leader has kidnapped her. And similarly — people had a critique that ever since the original series and then into TNG, women characters were put in very short skirts. And so they thought, let’s be egalitarian, and all of a sudden some of the male characters were in very short skirts as well.
Rob
They had that in TNG. There’s a standard uniform that is a short skirt uniform, and there are males wearing it in season one.
Matthew
Yeah, it lasted for about five episodes and they realized this isn’t working. But they tried.
Rob
And they took it off everybody. Everybody moved away from that uniform. They showed it and then got rid of it because it sucked as a casting and propping thing for the show — not necessarily because they didn’t want to represent it.
Matthew
And that idea of: we always try to do the best we can and hope that the people who come after us do better. That’s the debate that happens in so many fandoms, but especially in Star Trek. You have a lot of the fandom who had their initial ideas confirmed and affirmed by the earlier shows, and now the later shows are doing the exact same thing, just pushing further — as you said, The Next Generation started in the late ’80s and ran through the early and mid ’90s. It’s over thirty-five years old. Of course we see some things differently now. And so stories told today are in some ways critical of the original shows. And people are furious about it.
Rob
TNG is critical of the original series, probably. It’s a conversation with the past, saying: we understand that you tried to do these things, but we can do better. And it’s hard — you grew up with a certain piece of media and you don’t like to see its flaws. But everything is created by people, and nothing’s perfect.
Matthew
There will always be a part of me that is an eight-year-old when I hear “Who you gonna call? Ghostbusters!” Like, the eight-year-old in me will always love that movie. Even though the part of me that isn’t eight years old watches Bill Murray go full-on stalker of Sigourney Weaver and goes, this is not a good movie in that regard.
Rob
Yeah, it’s the most libertarian movie of all time. The villain is an EPA inspector.
Matthew
It’s such an ’80s thing. If you understand Reaganomics and Reagan-era ideas of rugged individualism and screw the man — like, no, actually, I don’t want four scientists who can’t hold down a university job because they keep trying to seduce the test subjects to decide whether or not to have a ghost infestation in their laboratory. I don’t think that’s good.
Rob
He’s like, do you have a permit for this? Because it’s a big, dangerous thing. He’s debauched.
Matthew
I think Star Trek has evolved. I’m not saying every new show is perfect — Enterprise wasn’t my favorite. And I think there are many legitimate reasons to not like some of the newer shows. But I will call out the argument that goes: TNG challenged ethical ideas the right way, DS9 did it the right way, but now the franchise has values I don’t like.
Rob
Yeah, that drives me crazy. Because if you look at the time, some of the episodes of TNG were really problematic for them to air on television. Same with Deep Space Nine. They said, are we allowed to air this? And the showrunners just said, we’ll do it. What are they going to do — cancel Star Trek for showing a future that might be better or might be worse? There are definitely some decisions in Star Trek that would suck if you actually lived in that world. But yeah.
Matthew
Did you follow what happened with Starfleet Academy?
Rob
I haven’t followed that very much. People have blasted a bunch of the actors for not being traditionally body-shaped, which drives me crazy.
Matthew
Yeah, there’s a lot of that. There’s a gay Klingon that people are like, no, you can’t have that. And he’s also kind of a pacifist Klingon, which I think is awesome. But more to the point — you’re right. With DS9 and Next Generation, people might not have liked certain things, but there was a general feeling from the network of: it’s Star Trek, we’re going to do it. Starfleet Academy definitely pushed those bounds in its own way as well. I like the show a lot. I think there are legit critiques of it. I think there are also a lot of very illegitimate critiques. One of the people who was very loud in his critiques, calling it woke garbage, was Elon Musk — and Donald Trump echoed some of those. It is owned by Paramount, who is trying to undertake this very big deal of buying Warner Brothers. And it quickly became clear that the uptick of assaults on the show from Musk and from Trump — Trump making statements about how he wants to support good patriotic American broadcasting companies — it was pretty soon after that that Paramount canceled the show.
I don’t want to be a conspiracy theorist, but I don’t think this even counts as a conspiracy theory. It is fairly widely understood that Paramount canceled the show in part because of pressure from folks like Musk and Trump. And maybe the viewership numbers were really down because Musk said don’t watch it — I don’t know, I haven’t seen them. But certainly a lot of people were watching it. And it is, to me, a concerning thing that we’re no longer in a media landscape where people will say, alright, well, TNG is challenging things, or this kiss between Kirk and Uhura is going to make those broadcasters in the South mad — ah well, we’re still going to do it. That’s not the media landscape we’re in right now.
Rob
I hope that changes back. It’s always been the case that shows had to deal with censorship and similar pressures. When you talk about that interracial kiss, it was literally censored and literally caused the broadcasters a huge pile of headaches. So I think we are in that same exact landscape — the showrunners made the choices that they made and got the show canceled as a result. So I think they made that ethical decision. And sometimes you do. I don’t think it’s good that they got canceled for it.
Matthew
I mean, I forget more than that — I’m not as knowledgeable in the weeds of it — but certainly Star Trek wasn’t canceled as a show because of the kiss, even though a lot of the Southern broadcasters didn’t like it. But I hear what you’re saying. I think it’s a difference of degree, not a complete equivalence.
Rob
A bunch of the affiliates contacted the network and said they weren’t going to carry the show anymore because they couldn’t do it in their area.
Matthew
Right. But the network didn’t stop production on it because of that. And that’s what’s happened here — they basically said, even before the second season airs, there will not be a third season, no matter how popular the second season is.
Rob
Another one that I want to talk about that we haven’t touched on much is Piers Anthony. I don’t know if you’ve read him.
Matthew
Yeah, I know very little.
Rob
His work is not the most highbrow of fantasy fiction, but the theme is: people are different, and that doesn’t make them wrong. That’s a lot of what goes on in Piers Anthony’s stuff. A lot of these authors were a lot more libertarian than I’ve ended up being, but that’s fine. I disagree with a lot of them, but that’s very reasonable.
Matthew
One thing that I think about a lot — and this is a whole other subject, we should probably do another episode about it — I feel like there are a lot of ideas that were just baked into society that we took for granted and are now being challenged. You talked about Asimov’s pacifism. To me, it’s an interesting cultural moment that Magneto has become more of a sympathetic anti-hero, if not an actual hero, rather than just seen as a pure villain. And people watch Black Panther and a lot of them were like, well, Killmonger’s not entirely wrong. And within political circles, there’s a lot of talk about: yes, non-violence has a lot of value and a lot of power, but also, the whole point is that they listen to us — and if they start just ignoring us, then does violence become an option? That’s a much deeper ethical conversation, but I think non-violence held a different place in our society when a lot of those authors were writing than it does today. And that’s another interesting thing to look at.
All right, Rob, where can people find the stuff you do?
Rob
I record a lot with Good Luck High Five, which is a YouTube channel that does Magic: The Gathering content. I’m especially on Commander Arcade a lot, recording videos of Commander gameplay with them. If you want to see more of me online, come tune into those things. That’s basically my home for content production.
Matthew
Definitely check all those things out. They’re really enjoyable, especially if you play Magic, but I think there’s a lot of fun things to pick up even if you don’t. I am TheEthicalPanda.com — that’s where you’ll also find the Star Wars Generations podcast. We’ve done episode-by-episode coverage of Maul and we’re going to have a lot more to say about it. That is a show that just utterly blew the doors off of Star Wars and raised so many interesting questions, especially around: can you tell a villain story without it necessarily being a redemption story and yet still make the character sympathetic or interesting? Great storytelling. A lot that we’ll have to talk about. Send your thoughts and your questions — we’d love to hear them. Matthew at TheEthicalPanda.com, or any of the contact information on the website. On behalf of myself and Rob, thank you all so much for listening. May the Force be with you.