Ars Technica: It’s interesting you mentioned neurodivergence. I would hesitate to draw a direct comparison because it’s a huge spectrum, but there are elements of Murderbot that seem to echo autistic traits to some degree.
Paul Weitz: People look at something like the autism spectrum, and they inadvertently erase the individuality of people who might be on that spectrum because everybody has a very particular experience of life. Martha Wells has been quoted as saying that in writing Murderbot, she realized that there are certain aspects of herself that might be neurodivergent. So that kind of gives one license to discuss the character in a certain way.
That’s one giant and hungry worm monster. Credit: Apple TV+
Murderbot to the rescue! Credit: Apple TV+
Murderbot needs a bit of TLC after his encounter with the worm monster. Credit: Apple TV+
Chris Weitz: I don’t think it’s a direct analogy in any way, but I can understand why people from various areas on the spectrum can identify with that.
Paul Weitz: I think one thing that one can identify with is somebody telling you that you should not be the way you are, that you should be a different way—and that’s something that Murderbot neither likes nor does.
Ars Technica: You said earlier, it’s not human, but a person. That’s a very interesting delineation. What are your thoughts on the personhood of Murderbot?
Chris Weitz: This is the contention that you can be a person without being a human. I think we’re going to be grappling with this issue the moment that artificial general intelligence comes into being. Throughout the series, Martha brings up different kinds of sentience and different kinds of personhood that aren’t standard human issue. It’s a really fascinating subject because it is, in part, our future: learning how to get along with intelligences that aren’t human.
Paul Weitz: There was a New York Times journalist a couple of years ago who interviewed a chatbot—
Chris Weitz: It was Kevin Roose, and it was Sydney the Chatbot. [Editor: It was an AI chatbot added to Microsoft’s Bing search engine, dubbed Sydney by Roose.]
Paul Weitz: Right. During the course of the interview, the chatbot told the journalist to leave his wife and be with it, and that he was making a terrible mistake. The emotions were so all over the place, so specific and quirky and slightly scary, but also very, very recognizable. Shortly thereafter, Microsoft shut down the ability to talk with that chatbot. But I think that somewhere in our future, general intelligences will have these sorts of messy emotions and weird, unique personalities. And it does seem like something where we should entertain the thought that, yeah, we’d better treat everyone as a person.