S2E16 - Reality Resists Simplicity, Embrace Nuance

Transcript
Yeah, it was just a lot of trouble. Well, hello audience. Welcome back to behind the Locked Doors, a weekly check in that we pretend is a real podcast.
Speaker B:A weekly check in? Question: isn't it biweekly?
Speaker A:Biweekly, Yes.
Speaker B:I wasn't here last week. I don't know about y'all.
Speaker A:Yeah, no, no, I released the other one so late that it has just been a week since we released it. So that being said, it'll probably take me forever. I am Scott Paladin. I am working on a horny werewolf audio drama called It Takes a Wolf.
Speaker C:And I'm Sam Stark. I am really not working on Azim West right now, but I will, I promise I'm going to get back to.
Speaker A:It someday in the future. It'll come back.
Speaker B:Hi, I'm Jack. Same kind of, same deal. I feel like you and me, Sam, both have like full plates of other shit going on right now, making it tough to actually put meaningful work in on Essence. But we are like, in the broad sense, making an audio drama called Essence.
Speaker C:Yes.
Speaker A:You're sitting by the window as it rains outside and waiting for the side project to come home from the war.
Speaker C:Yeah, no, that's 100% what's happening. And I want to just put it out here, on this check in that we record like a podcast: I do think about Azim West, like, a lot. It's in my brain all the time. It is a pure passion project. Like, I love it and I feel very strongly about it. It's just one of those where I have other things that I'm being paid for that I have to do first. And so, deadlines. Yeah, exactly. I chip away at Azim West when I can. And it's like my fun, relax and, like, create whatever I want kind of a thing. So, I mean, it doesn't happen often. It's going very, very slow. But it does bring me a lot of joy to do it. So it's fine. I don't mind that it's taking a while. I don't know about anybody else, but it's fine with me.
Speaker B:Yeah, we don't have like, release deadlines for this at all. Like, when we make it, we'll make it, but we're not in a hurry.
Speaker A:Yeah, yeah, yeah. That's just how it goes sometimes. I'm in the world of, like, constructing deadlines for myself and then, like, hearing them whoosh as they go past. Which is also, again, only self imposed. There's no. I haven't done anything. So actually we should do our check ins real quick.
Speaker B:Yeah.
Speaker A:My goal was to have my casting call done as of today. That turned out not to be the case, because I didn't know everything I needed to do for it. The vast majority of it is done: I wrote almost all of my character descriptions, and I did all of my line selection for all of the characters except for one. And that's been the problem one, which is the accomplice. And then I was like, oh yeah, I need to rewrite all of her lines because they don't work. Like, the bulk of her dialogue was written before I decided it was going to be in a conlang. I was like, oh, I'll just write it. And I tried to translate it, and it didn't work the way I needed it to later, when I had a character in mind for that. So I need to go back and redo her thing. And then I was like, well, I also need to write a pronunciation guide for it. So there's, like, another, you know, 10 hours of work or something like that. I don't know what it actually is, but some amount of work that needs to go into it before I can actually release it. But pretty much everything else is done. And for reasons that I won't say on the podcast, I can't put the casting call out until September anyway. So that is fine. Like, we're still on schedule and all that stuff. It's fine. I just was like, oh, I just discovered work as I was doing it.
Speaker B:Hate it when that happens.
Speaker A:So how did y' all do?
Speaker B:Okay, so real quick, the only thing I accomplished was uploading the second half of the pilot to the shared Google Drive that Sam so kindly set up. Can't find the current version of the first half anywhere. Like, it's so weird. Like, the old version's in my Google Drive.
Speaker A:Yeah.
Speaker B:Because at some point, Sam, you sent it to me to do, like, dialogue tweaks, whatever, and I did them, but then we didn't end up using that version. And the new version, the link to it in our Discord channel is broken. It only goes to a "this has been deleted" page. Like, yeah.
Speaker A:Oh, lovely.
Speaker C:So I finally watched the movie Deadpool & Wolverine last night because I'm really, really behind.
Speaker B:And how was that?
Speaker A:It was.
Speaker B:Was great.
Speaker C:It was really, really good. It was really fun, especially for a comic book lover like myself. And then I, like, totally passed out on the. On the couch. Like, I had put everybody to bed, and I was like, yeah, this is such a nice, fun evening. Zonk.
Speaker B:Just unconscious on the couch.
Speaker C:Yeah, yeah, yeah. And I woke up, like, somewhere between, like, 2 and, like, 3:30, because the dog was, like, stepping on my stomach. Of course. That's what dogs do. And I was like, okay, let's go outside, you know? And I had my phone while I was outside, and I saw your message, Jack, and I saw the, like, I can't... whatever it said, like, I can't open, or something's gone, or something's deleted. And in my daze of being basically still asleep, I tried to go into the Google Drive, but I couldn't read anything. Like, I was still asleep. So I was getting, like, dreamily annoyed, like, Jack, what the fuck are you talking about? Nothing works. What did you do?
Speaker B:What did you break, Jack?
Speaker A:What the.
Speaker C:Exactly. So then I brought the dog back inside, I went back to bed, and then I woke up this morning and, like, the first thought I had, I was like, you know, pulling myself out of the sleep and, you know, just kind of, you know, slowly wake up. Jack. Oh, God. And like, I. But it. But for some reason, my brain had made it into this story of something was really wrong and, like, something we had lost something or whatever. And then as I woke up, I was like, oh, no, no, no. They just can't get into the. Everything is fine.
Speaker B:Oh, my God. Okay, I just can't access the current version for whatever reason.
Speaker C:Yes. And I will fix all that today. It was just my brain decided to take me on a journey. And, oh my God, it was a 3am journey.
Speaker B:I hate that when that happens. It's crazy, I think, like, especially when you pass out on the couch or, like, in a less than opportune location.
Speaker C:Yeah.
Speaker B:At a weird time of night. And then you wake up in the middle of the night, totally disoriented. Like, what planet am I on? What year is it? Where am I? What's happening? Like, that always makes my brain construct the most bonkers scenarios about anything and everything. It just. It really does happen like that.
Speaker A:It's like, yeah. It's like trying to learn how to be a person all over again. You're like, I am a blank slate. It's 11pm and I just woke up from, like, an eight hour nap. And, like, the world is like, what the hell?
Speaker B:The devil's nap?
Speaker C:Absolutely. It's now my first day on earth.
Speaker A:Or like the compounding thing with it. Have you ever gone to, like, a movie theater when it's still bright outside, like it's 5 or 6pm or whatever? And then it's a movie, so you fall asleep during it, and you wake up and the movie's over, and you leave and it's dark and you don't even know what time it is.
Speaker B:You come out and you're like, I'm in the Twilight zone. Yeah.
Speaker C:You're like. I think it's actually weirder if you come out of the theater and it's still bright out. I just went and saw the new 28 Years Later, and we saw it really, really early in the day, because I just don't have $75 to see a movie. And so we did a matinee, and I didn't even fall asleep in the middle of that movie, because it was a great movie. And we come out and it's bright ass daytime and we're, like, totally discombobulated. It's so weird. But yeah, yeah, it's all good.
Speaker B:Anyways, we'll figure out where that file is and I will make the dialogue edits I was supposed to make. After we've done that.
Speaker C:Yay. And I will someday get to the parts that I was supposed to do. I went to Worldcon.
Speaker B:Yeah. How was that?
Speaker C:It was really fun. And I have so many stories and so many fun things that I could share, but here's the one I'm actually going to share. There was a booth that had two guys that were, like, I guess, advocating for AI narration.
Speaker B:Oh my God.
Speaker C:In the middle of this fantasy, sci fi and horror literary con.
Speaker B:That's crazy.
Speaker C:They had people like George R.R. Martin and Neil Clarke from Clarkesworld and Becky Chambers and, like, you know, writers that write books with actual human hands. And we had me and Shirumi and a bunch of other really great narrators there. And, like, we're walking past this booth like, what the fuck? And it inspired several. Not like anybody got in, like, a fisticuffs kind of a fight, but there was a lot of, like, shouting around there.
Speaker B:I bet there was. Yeah.
Speaker C:It was Wednesday, Thursday, Friday, Saturday, Sunday. It was five days long. I had to actually count that on my hand just now. And, like, all five days there was some sort of, like.
Speaker B:Like kerfuffle.
Speaker C:Yeah, there was some sort of kerfuffle at that tent. And then apparently they had a panel or whatever. Like, I understand. I understand that everybody's really upset about it, and obviously I am too, because it's, like, taking jobs from me. But I also wish that people could be like, you know, maybe we could just talk about it, not shout at people, because that's not gonna help anybody. But I didn't have the brain or, like, the mental fortitude to go and talk to them, because I would have just been like, the. But I wish that I'd had a little bit of time to, like, maybe go over there and get, like, where they were in their brains. Because it would have been interesting to know, like, why are you here? You are in the midst of, like, people that legitimately hate what you do. What made you decide to have a booth here?
Speaker A:Yeah.
Speaker B:That's crazy.
Speaker A:I wonder if that's just general, like, tech-bro unawareness. Like, a lot of the people who are really, really into AI don't have the perspective to understand how other people view it.
Speaker C:Yeah.
Speaker A:And so they can't imagine the reaction to it, or they dismiss it as being like, you know, oh, well, these people are just behind the times or whatever. They're just stuck in their ways or whatever. And they don't really understand what the objections are to it. So it could be just that. Or, I mean, there's also the people who are like, any attention is good attention, whether it's negative or not. And so they might have been deliberately provoking people. Yeah, yeah. AI is one of those things where, like, I'm probably not as hardline as the rest of the community that I see, the, like, my section of the Internet. There are a lot of people who are against all uses of any kind of AI, which doesn't really mean anything, that word has been completely robbed of meaning anymore. Yeah. But, like, there are a lot of people who are like, any use of anything, this is, like, 100% a non starter for me. But, like, I remember those same arguments about Photoshop about 25 years ago. Like, this isn't real art. And the same thing has happened with, like, music production and all that stuff, with DAWs and everything. So, like, there will be a place for these kinds of tools in production chains, and the world will have to get used to some of this stuff. But the way that it's being shoved into everything without you asking for it.
Speaker B:Without asking and replacing human people who need to do their craft and like.
Speaker A:So far the only thing it can produce is like slop. Right.
Speaker B:And it's not even much of a useful tool. There are things it really could be useful for. This is what drives me up the wall.
Speaker A:I do occasionally see, like, an interesting AI video generation thing. It's one of those things where, like, the amount of man hours it would take to create wouldn't be worth the return for, like, experimenting. So, like, there are these videos of people, like, they're like the ASMR videos, but they're, like, people cutting clouds and stuff, or, like, molten lava. Things that you would have to spend thousands upon thousands of dollars to generate with traditional visual effects tools. Not that it wouldn't be possible, but it would just be much more difficult. Yeah, those are interesting, because they're stuff where the cost would be prohibitive otherwise, right? Like, you're doing a thing that would be too expensive to do with the manual tools, but this new tool is good at it. But then when you hear AI narration, because, you know, anybody scrolling on any kind of short form video app these days is going to run across, you know, like, AI voices, it's like, this doesn't work. Like, this isn't good enough.
Speaker B:It's not.
Speaker C:Yeah, yeah, it's.
Speaker B:It just is not good enough to replace humans for sure.
Speaker A:And like, the idea that you want to replace actual creative input from humans with just, like, prompt engineering and seeing what a random noise generator kind of outputs doesn't super appeal to me. But like I said, these tools do have a place within a production chain. Like I said, I've used the Adobe AI voice enhancer, as they call it, and I don't know how much, I mean, they're doing some amount of actual, like, neural network reprocessing, and they're obviously keeping that all behind locked doors, so I don't know. But it's not just that. It's also got some standard compression settings and volume leveling and stuff like that. But it is using some sort of a, quote unquote, AI reprocessing. And I've used that. But, like, that doesn't replace anybody. That's not going to take somebody's job away, because that's really just saving some audio that I would have had to throw away otherwise.
Speaker C:Yeah.
Speaker B:Starting with something that a human created, that there was, like, a quality issue with or a technical issue with, and repairing it with a tool. I feel like that is so different than firing a human who was going to do work for you and being like, we'll just let the computer do it. Or even.
Speaker C:Or even not look for a human in the first place.
Speaker B:Right.
Speaker C:Never even have considered that. Now, I'm still tired, so, I probably wouldn't have had as much of a problem with them if they were like, AI, how can we use it? Or, you know, like, what are the uses for it? But their booth thing actually said, like, AI narration or whatever. It was, like, actually using an AI to narrate your books. And that's what just made everybody so.
Speaker B:Right.
Speaker C:Yeah. And that's why I was like, what.
Speaker B:The, how did you get here? Right.
Speaker C:And, and I actually read a couple of articles. One was really incredible. I will try and find it and then you can like, link it somewhere. Scott.
Speaker A:Sure.
Speaker C:And it was an English professor, and I don't remember what university it was. It was, like, Stanford or Harvard, one of the big ones, or Cambridge maybe. And he had his class all write papers using AI. They did a round of, like, using AI, and they were fine papers, but when they read them together, it sounded like the same paper, even though nobody collaborated. And then he had them write another paper on anything they wanted in the whole wide world. And he said, use AI tools to research, to get ready, and then write the paper yourself. And everybody wrote the best paper they had ever written in their entire life. And so he was making this argument, like, it's a tool. Don't let it replace your writing, but, like, use it to help you. And I was like, wow, that's really smart. I moderated the Batman panel at Worldcon, and I was having a lot of trouble, like, structuring the panel, because none of us were, like, Batman industry people. We were just fans. And I was like, well, I can't ask any, like, you know, what was it like when you penned this? You know, how did you feel when you were writing blah, blah, blah. So I had to just think up questions or think up topics about Batman that would be just fun for everybody. And so I used, like, an AI thing to come up with questions. And I didn't end up using any of the questions that it generated, but it helped me kind of go, oh, maybe instead of that we can do this. And so it was kind of like another person was bouncing ideas off of me.
Speaker A:Yeah.
Speaker C:Or I was bouncing off of them. And that worked really well. And so like, I just, I'm also like you, Scott. I'm not going to be like absolutely no AI ever. Because there are tools, AI tools that are really, really helpful. Just don't, don't replace people doing the art.
Speaker A:Yeah, well, like, the most useful. It's one of those things. One of the principles that sort of guides me is the idea that I would never, or at least at the moment, with the way the industry is, I could never use AI to produce something that could then be shown to the customer, end user, audience, whoever, shown publicly. Right? Like, I'm never going to have it write copy for me, like ad copy or text, or generate voices that I'm going to use in a thing, or use AI generation to create a logo or something like that. But what it's good at, and part of this is just because all the other tools have gotten so shitty, is generating things that you can then react to. Like, I used to be able to go to Pinterest or some other website, you know, Flickr, there were a bunch of image sites out there, and say, mid century logos, right? And just see a sea of real vintage logos. And that stuff is so hard to find now, and most of the time when you go looking for it, you're going to get a bunch of AI slop anyway, that I might as well just go to an image generation tool and say, hey, show me a bunch of ideas for logos. And then from there go, well, I don't like any of this crap, but there are some ideas in here that I could then use in my own work. Or, you know, having something like ChatGPT or one of those other ones ask you questions that you're answering is also useful. Where you're like, I'm approaching an idea, ask me questions about it so that you're eliciting the responses that I wouldn't otherwise think of. Right? Like, using it as a sort of a sounding board or something like that. Those are, I think, completely legitimate uses for something like this. When you can't, like, bother another person about something. 
It's, you know, 2:00am and you're like, I need, I need, I need to.
Speaker C:Talk that 3am brain we were talking about.
Speaker A:Yeah, yeah. So, like, there are definitely places for it. It's just, I don't feel comfortable ever using it to produce something that I would then show to somebody and be like, here is a thing I made. Because you didn't make it, right? Also, I know just enough about the under the hood stuff that, like, the fundamental ideas behind the way AI generation stuff works right now involve random noise, basically. And if you look at random noise zoomed out, like, you can go into Photoshop right now and generate a random noise image, right? It's just static. It's flat. There's no, like, dynamism to it. And that leaks into everything that AI actually generates. Like, if you've ever heard any of the AI generated music, if you look at enough AI generated art, you're like, this is all kind of flat. There aren't really bright and dynamic sections of it, usually. It's all kind of flat. It's all kind of the same. And that's because, at the moment, they haven't solved the problem. Perhaps it will change in the future. But you just can't get real weird creativity out of something that is using this flat noise as an input. It's like a garbage in, garbage out kind of thing.
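The "flat" claim above can be sketched in a few lines of Python: generate a random noise image, like Photoshop's Add Noise filter would, then measure the contrast of every patch. Every region comes out statistically identical, which is the flatness being described. (A minimal illustration only, not anything from the episode; the image and patch sizes are arbitrary choices, and uniform noise is assumed.)

```python
import numpy as np

# A 256x256 image of uniform random noise -- grayscale static.
rng = np.random.default_rng(0)
noise = rng.random((256, 256))

# Split it into 16x16 patches and measure each patch's standard
# deviation (its local "contrast").
patches = noise.reshape(16, 16, 16, 16).swapaxes(1, 2).reshape(-1, 16, 16)
stds = patches.std(axis=(1, 2))

# Uniform noise has std of about 1/sqrt(12) ~ 0.289 everywhere, so the
# per-patch contrasts barely vary: no bright or dynamic regions at all.
print(f"mean patch contrast: {stds.mean():.3f}")
print(f"spread across patches: {stds.std():.4f}")
```

The tiny spread across patches is the point: zoomed out, every part of the noise looks like every other part.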
Speaker C:Yeah.
Speaker B:Probably out of the three of us, I'm the most, like, hard line, no AI use. But mostly because I'm trained as an artist, like a visual artist. And there's so much stuff with, like, copyright and, like, ownership of work and stuff like that, where the AI usage in the visual arts sphere is really just, like, a plagiarism machine.
Speaker C:Yep.
Speaker B:Where it's eating art that humans have created and spitting out salad, like, visual salad. And people are going, I made this. And I'm like, actually, you stole art from all these other people who didn't consent and fed it into the wood chipper, and it chipped out a bunch of, like, you know, turds, and you said, okay, I've made art, and I basically cut up a bunch of other artists' work to make it. And I'm like, okay, well, that's hard. And also it's, like, a big environmental and resource issue, where it's using tons and tons of processing power and electricity and water and all of this to produce basically garbage. So those are my, like, personal issues with it. But I do think that there are, like Scott said, production chain uses, where if we got it to the point where we were being a little more ethical in how we collected the information that it uses, it could be useful for that stuff. It's not at that point right now.
Speaker C:No.
Speaker B:Like, every time you ask ChatGPT a question, you've used, like, 30 gallons of water or whatever.
Speaker C:You know what I mean?
Speaker B:Like, we just haven't figured that stuff out yet.
Speaker A:The thing is, the unfortunate part of that is that all of your objections are 100% true, about, like, copyright problems and environmental problems. And the thing that kills me is that those are not fundamentally different from working with any other big tech industry, right? Like, okay, yeah, it uses more power than, like, you know, going on Reddit or whatever. But, like, Reddit, Facebook, Amazon, AWS, all of these things, they're already doing all of the things that the AI companies are also doing. It's just, like, an accelerator. There's nothing fundamentally different about AI versus just using, you know, a really high computational power resource for other things, right?
Speaker B:Like, well, that's the same thing with like the bitcoin farms and stuff where they're using up colossal amounts of energy.
Speaker A:Yeah, yeah, yeah. And there's all kinds of knock-on effects from all of that. It stresses the grid. It, you know, is, like, the kind of thing that can be put overseas to, like, put a lot of extra stress on developing nations.
Speaker B:Oh yeah, that's the other thing that you hear about is like when you find out that somebody was using an AI generator to do blah, blah, blah, and it turns out it was just some guy getting paid 2 cents an hour sitting there typing away at his computer in Bangladesh or somewhere.
Speaker A:And it's one of those things where, like, none of those problems, which I agree are, like, big and real, are fundamentally different from all of the other big problems that happen in your late stage capitalism. And, like, you know, it's all late.
Speaker B:Stage capitalism all the way down, baby.
Speaker A:You want to get into copyright talk? Where, like, you know, despite the fact that we live in a world where, like, you know, copyright used to be, like, 21 years with an opportunity to extend it for, like, you know, seven more. So, like, Star Wars would be public domain if lobbyists hadn't gone in and, like, extended the copyright and stuff. And so, like, there's all these ways in which, like, capitalist structures have bought and stolen cultural property, things that should have been, like, property of humanity as a whole, and are now saying, no, we own this. Disney owns a huge part of culture, because they bought Star Wars and Marvel and all of these other things that they own, and they say that nobody else can use them. And then, weirdly enough, they're the ones who are capitalizing on AI stuff, because they can just rob from everybody else, because they're powerful enough that nobody can sue them, you know, because the small guys don't have money.
Speaker B:Who's going to sue Disney?
Speaker A:Right? And so, like, AI is just one of those things. Like, it's the tip of a spear, and that spear is this big problem all the way back down to its roots, and.
Speaker B:Right.
Speaker A:Like, again, I don't think there's anything fundamentally different about this technology in that way. It's just a matter of degrees, and all of the objections to it are true of other things in different amounts. But that being said, you know, the idea of using it to replace people wouldn't even be a problem if putting people out of a job didn't rob them of food and health care and a place to live.
Speaker B:A job they want to do and have been trained to do because they enjoy it and because they're good at it.
Speaker A:Like, yeah, yeah, but even jobs that people don't really want to do, but that they do because they're lucrative. Like truck driving. Long distance truck driving is one of the last careers that you can go into without even a high school diploma and make a living at it, and you can make good money at it.
Speaker B:And now it's a necessary like. Right. Service that you're providing.
Speaker C:Well, I did it for a while. It is incredibly good money.
Speaker A:Yeah. And like, although trains would be better. But that being said.
Speaker B:Right, right, right.
Speaker A:But now there's a huge pushback from the trucking industry about replacing jobs with, like, a self driving long distance truck or something like that. But in a properly set up world, you could be like, okay, all of the truck drivers go home, we're going to give you the money anyway. You don't have to do this thing. You're freed up to go do something else. Because if we can replace a truck driver with a machine that does the same job, just like we replaced somebody who used to weave fabric by hand with a loom, right, then that frees up that person who used to have to make cloth by hand for, you know, 2 cents an hour to go do something else that society would benefit more from. Like, they can do work with their mind. They can just sit back and relax. They can take care of their families or elderly relatives. We as an entire species would be better off if our human hours were spent doing that. And if the additional efficiencies that came from technology were distributed evenly, then everybody would have a six hour workweek instead of a 40 hour workweek. And, like, you could have a huge amount of the society idle, you know, like, they wouldn't be working for money, but they would instead still be adding value to humanity as a whole. But instead what happens is that everybody works more and more efficiently, and all of those resources then go to making mega yachts for really, really rich people. Or, you know. Yeah.
Speaker B:And, like, part of the issue, right, as an addendum, is that a lot of the stuff we have automated as a society at this point is still run by people making 2 cents an hour. It's just that the machine that weaves the cloth has to be overseen by humans. So when it jams five times an hour, the human standing there can reach their arm into the machine and risk getting their limb chopped off to unjam the machine. Like, all the stuff still requires human input. We don't have the automation to the level that it needs to be at to actually replace a human worker.
Speaker A:Well, no, we do. From each human man hour invested, we get a lot more output. It's a lot more efficient, right? Because instead of having 200 people manually operating looms or manually weaving cloth, you have one guy walking down a row of machines, and when one of them jams up, he stops the machine and fixes it. You've reduced the number of person hours required to produce products. The thing is, we are so far afield from where we started. Like, we're talking about fundamental problems with, like, technological advancement in a capitalistic structure. And that's what I'm talking about. Not all of the problems, but most of the problems that exist within this space are bigger societal problems that we are also facing. This is just the newest, sharpest point of that. And, like, what are we going to do? Rewrite society? I don't know.
Speaker C:It'd be nice. Yeah, something that worked.
Speaker B:That'd be cool.
Speaker A:That would be great. But like also, you know, like, the revolution's not going to come, guys. We gotta like, work on something so.
Speaker C:Well, I'm sorry to have started a whole. No, no, no, it worked out really well. I'm glad everybody got to say stuff about it. But I do kind of wish that I had stopped by the booth and talked to the AI tech boys, because maybe they just had a really terrible sign, and what they meant was, you know, we think that there are AI tools that you could use to write more efficiently or get better ideas from or whatever. But I did not give them that chance. One, because I was way too busy, and two, they made me sort of nauseous. So.
Speaker B:Yeah, valid.
Speaker C:Yeah, I. Next time I think I'm going to take the time to try and talk to them.
Speaker A:Yeah.
Speaker C:Because I would. There's another big thing coming up, Author Nation. And it's, like, the business side of self publishing, and I've been invited a few times. I have a very good friend who's, like, really into all of that and in with the staff people and stuff. And there's so many AI people there. But my friend who works on the staff is like, no, no, no, they're not, like, let's use AI narration. They're like, let's see how AI can help artists do the thing that they do. And I'm like, okay, I'm a little bit interested in maybe talking about that. But like you were saying, Jack, I just want to put in my quick 2 cents here in the realm of visual art, be that movies or actual, like, paint on canvas or any of those types of art. You know what I mean when I say art?
Speaker B:Yeah, yeah, right.
Speaker C:AI should not be there. It just absolutely should not be over there right now.
Speaker B:Except maybe in the future, to, like, do it ethically.
Speaker C:There's almost no way. Maybe in the future, when AI is suddenly sentient and we need to let them know that they are their own beings and they can paint pictures if they want to. But we don't have androids trying to, like, find sentience yet. So for now it needs to stay out of the visual art medium completely, totally, 100%.
Speaker A:I mean, I can't argue with you on that either. I think that, especially with the visual stuff, it is so computationally expensive to produce things like that, and video is even worse, because you're just doing the same thing a lot. I'm not gonna say there's absolutely no place for it, but the vast majority of what we're doing with it right now.
Speaker C:It's wrong.
Speaker A:Yeah.
Speaker B:We are not at the point where it could even be much of a useful tool, because yeah, it does at the moment only involve stealing from human work and spitting out garbage. Like, that is all that it is good for at the moment. I would love to see it become a useful workflow tool someday, when all of those problems have been solved, or at least mitigated substantially from where they are.
Speaker C:Like, maybe it could help mix paints. Like, I need a really weird green, let's mix greens really fast, and you could do it really, really quick. Because the one time that I painted a picture in my entire life, I spent like 20 minutes trying to get the right shade of blue that I wanted. So if you had an AI to be like, can you help me figure out how to mix this blue? Let's do this together. You know, something like that.
Speaker A:What you're talking about, Sam, is like the step from physical media to Photoshop, right? Where now you're just like, I select the blue, the green I want, out of the color picker.
Speaker B:Yeah, we actually have that right now. And you do it.
Speaker C:See how much I know nothing. I know nothing.
Speaker A:I'd be really interested to see what y'all two thought of these. Have you seen the Corridor Crew anime Rock Paper Scissors videos?
Speaker C:No.
Speaker A:The videos themselves are whatever. You have to like that style to think they're interesting. But the behind-the-scenes of their work process, where they are essentially using AI as a filter over the top of video. They're basically using it to. Oh, what's the word? When you make an animation by tracing over a live action video. I forgot the word.
Speaker B:Rotoscoping.
Speaker A:Rotoscoping. So they're essentially using AI to rotoscope over live action video that they themselves recorded. In their initial version of it, they used stills from an anime. When they got a lot of flak on that one, they went back and hired an artist to create the assets that then got fed into their pipeline, A, to give themselves their own style, and B, so that an artist actually generated the style for them. But they're essentially using it as, like I said, a giant rotoscoping machine that comes in and draws over the top of a live action video to produce something that feels more animated. That's the kind of thing that is much closer to a real useful way to use it. And again, humans are involved in every step of this process, except for the actual tracing. Like.
Speaker B:Except for the.
Speaker C:Yeah.
Speaker B:Let me tell you what I once animated. I rotoscoped seven seconds of live footage, and it took about 40 hours of my time to do that. Yeah, it's really different to do it in an intentional way, where you are specifically including humans to make the stuff, and not just using it as a shortcut to cut humans out of the pipeline or cut humans out of getting paid. Because what I usually see it used for in my spheres is that you hire an artist to do a sketch commission for you, and then you use an AI to finish the drawing and not pay that artist however much more money they would have made if you had them finish the drawing the way they normally would have. Stuff like that. That's the most common usage I am seeing. And I'm like, there's no way to wrangle this back. The cat is so far out of the bag. Yeah, yeah.
Speaker A:It is a. It is a huge mire of crap.
Speaker B:It's a mire.
Speaker A:It is a mire. Yeah. It's a bog in a lot of ways. And everybody just has to kind of figure out where their line in the sand is, and then do that, and hopefully, you know, you can live with yourself afterwards.
Speaker C:You know, I did, I did get asked one sort of interesting question and I want your guys answer.
Speaker B:Yeah, yeah, sure.
Speaker C:So, they asked me specifically, like, if I was making a podcast, because that's, you know, what I do. They were like, what if you were making a podcast, and it was like a sci-fi thing, and you had a whole bunch of actors that were playing all the parts, and you had writers, and you were paying everybody, and it was all very ethical, but one character was an AI, and you had the AI voiced by an AI. Is that unethical, or is it own voices?
Speaker B:Oh, my God. Okay, this is my two cents. AI is not a person right now. We don't have true artificial intelligence existing on this earth right now. Therefore, it's not anybody's own voice. It's voices stolen from living humans who are people.
Speaker A:Yes.
Speaker B:Sorry, I hit my pop filter because I'm so passionate about this. But I really feel like, when we get to the point (and I imagine we will get there) where we have a Mass Effect style robot voice who is a person with a mind of their own, I think that's probably a thing that will happen in the timeline of this earth. We are not there right now. And I don't think a lot of lay people understand that when they hear that robot voice talking, it's not a person, and it's not speaking from its own mind. It's garbling back a word salad that it has been fed by someone who actually wrote it with their human hands and brain. Do you know what I mean? We just actually aren't there yet. We don't have true artificial intelligence right now.
Speaker A:Yeah. My particular opinion on that is: is what you're doing the best way to do it? Not really.
Speaker B:Like, because, ethical concerns aside.
Speaker A:Yeah. Like, I don't know. There's a vast difference between, like, I am going to type into ChatGPT, make me a podcast about blah, blah, blah, and then just taking whatever comes out, versus this being one step of a much larger process that a bunch of humans are involved with. If you know all of the words were written by a human, and then you pass it to Microsoft Sam, which is a different kind of text-to-speech generator than the AI ones you use now, and it produces a result, but that's not the result you're looking for, you need something different. The mere presence of AI doesn't bother me that much. The real thing, other than concerns that you're now not hiring a voice actor you would have hired before, is: what is this replacing? Is it replacing Microsoft Sam, or is it replacing, you know, Sam Stark? Right? Like, is it replacing a TTS, a different text-to-speech, or is it replacing a VA, a voice actor? But how can you determine how things would have gone in the alternate universe where you didn't have access to this tool? I don't know.
Speaker B:Like, I think the tools would have to be better at what they do before you'd get to the point where using the tool is better than hiring a human. Like, sure. When you write a part for a ship AI.
Speaker A:Yeah.
Speaker B:And you let the AI voice it, and you let a human actor voice it, there's a decent shot that the human actor will do a better job imitating a computer voice than the AI, or at least.
Speaker A:And yeah. And to the extent that you need that character to sound better than current text-to-speech, because it's the future and stuff like that, there's all kinds of stuff there. I think an interesting take on that one is there are several services now that you can feed a human performance into, and it will try to spit out that performance in a different voice. Right? They've trained it on the way to transform one person's voice into another person's voice.
Speaker B:Well, like basically voice modulation.
Speaker A:Yeah, voice modulation. There are at least a couple of tools that I know of that exist that way. And the question is, well, a human was still involved in this process, but you are resulting in something different. So, like, I don't know. It gets.
Speaker B:It's so muddy. The lines are just so wiggly everywhere.
Speaker A:To me, at least, there's no specific litmus test about what makes this good or bad. Right? We're in a world where both, like, the people who hate all "AI". I keep putting scare quotes on.
Speaker B:You can't see Scott making the scare quotes, but he is making them.
Speaker A:Yes. There are people out there who are like, everything that involves quote-unquote AI is bad, and they're not right. But also the tech bros who just want to fucking do everything with it all of the time and replace all of the humans and make it all AI are also not right. The middle point is not clear. There's no Rubicon. There's no line in the sand.
Speaker B:It's a huge sliding scale.
Speaker A:Yeah. And circumstances are different everywhere. And all of that to say, there's a real human instinct, when you come across something like this that is big and complex with no right answer, to sort of write the whole thing off and do whatever the fuck you want, which is the absolute wrong instinct. If you're presented with a situation where you're trying to figure out, is modulating a human voice with AI evil, or is generating a voice with AI evil when I wrote the words and I'm the one tweaking the sliders or whatever, when you're down to that point, yeah, you have to get into it and start figuring out where these lines are.
Speaker B:It'll get sorted out.
Speaker C:There's no way.
Speaker B:You have to think with your human brain about how you feel about it.
Speaker A:It's a complex issue. So there's no simple solution. And that is not an excuse to give up and stop looking for answers.
Speaker B:Yeah. And what I really don't recommend doing is asking ChatGPT, hey, how should I feel about this issue? Like, please think with your brain. Please think with your human brain.
Speaker C:Do not figure it out.
Speaker A:Do not talk to it like other people.
Speaker C:Like a therapist. This person that asked me came at it with such, I don't want to say fake sincerity, but he came at me like he really, really cared about the own-voices issue. And because I had just talked about accessibility through audiobooks on a panel, of course he came up to me and talked to me about this. He was talking about accessibility, and he was like, well, what about people that have always wanted to voice act in something and can't, that can't vocalize, that can't talk? They could use AI. The AI could be their voice. And I just kept coming back to: we're not there yet. We cannot do that yet, and you're still stealing from people right now. That's not a thing. I answered his question a lot of the ways that you guys did, and I really focused on the fact that people who watch and listen to sci-fi stuff right now have an AI voice that they are expecting. And if you have the current, real AI-generated voice do that ship computer, it's going to sound like a real AI voice, and it's going to sound fake. A lot of people are listening for that kind of almost sassy lady talking in the ship, you know, and you're not going to get that, because it's not a real person doing this AI voice. I kind of just stuck with that, because I didn't want to get too deep into that conversation with him. But it was really kind of sleazy, him coming at me trying to make it about accessibility. So, yeah, people beware: now they're trying to speak languages that you understand.
Speaker B:That has been growing in the visual arts space from the very beginning, because it's people going, well, I shouldn't have to go to art school to paint a beautiful picture. I just want to be able to do it in five minutes by knowing what prompts to put in. And I'm like, take a crayon, take a piece of paper, learn like a baby, from the beginning, like we all should do as part of the human experience. You know what I'm saying?
Speaker C:Well, yeah.
Speaker A:There's like an entire other conversation to be had about, like, don't fear being bad.
Speaker B:Like, yes, it is so important to the human condition to start something and be bad at it and then improve and get better by practicing it.
Speaker A:Yep. Well, and also just reveling in it. The artists I love the most are the ones who revel in their own naivete. They go, I'm not going to try to please your design style, I'm going to do my own thing. Even when what they're creating is kind of off, or it's discouraged, or it doesn't look traditionally good, they've decided what they're going to do, and they're fucking going whole hog. Yeah, those guys are heroes, because they're pushing back. They're the most artist an artist can be: they're not conforming to other people's rules.
Speaker B:It's really great making things from like the absolute core of their being. You know what I mean?
Speaker A:Can I tell you guys, I don't know if I've mentioned this on the podcast, but since we're talking about AI, I have to vent this one frustration.
Speaker C:Spill. Spill it.
Speaker A:In 2021? 2022? Something like that. I was working on an anthology audio drama. I had several episodes written entirely around AI stories.
Speaker B:Wait a minute. I remember you telling me this at some point. Go on.
Speaker A:Yeah. And I put it on the shelf for a minute to work on other stuff. And by the time I got back to it, the word AI had been ruined. None of the stories that I want to tell work anymore under the current idea of AI. But the one I had, and I have most of an episode, I don't know if it's finished, is entirely about a small artificial intelligence that is instantiated specifically to sit with somebody while they die. Right? The whole idea is that that way, nobody dies alone. They just have a little companion. And the twist on it is, at the end, when the person dies, this little AI is shut down, and because of HIPAA compliance rules, they can't incorporate that patient data back into the main part of the AI. So this little version of the AI also dies at the end. Right.
Speaker B:And that's really poignant and beautiful.
Speaker C:So good.
Speaker A:I love this idea. And I don't have a place for it anymore, you know? And I've got other ones about, you know, future artificial intelligences looking back at all the media of humanity and trying to reconstruct lost media that they know existed but don't know what it is. There are all these really good ideas there, but I can't market that anymore, because a podcast about AI means something different than it did four years ago.
Speaker C:I think, though. I think, though, Scott, that, like, there will be a time for those stories.
Speaker B:I want this.
Speaker C:And I want them. I want them really bad. Especially because I understand that you would have just the worst time marketing this, and a lot of people would be really mad. But if you wait until things kind of settle, until we figure out what we're doing and where we're going, and you put those out. I mean, obviously humanizing AI is always going to be interesting and fun and poignant. So please, please save them for me.
Speaker A:I've got my scripts somewhere in my stash, and I've got my other ideas and all that stuff written down. If I could think of a way to present it that evokes the old version of AI, right? Like, I don't want to use robots or androids or one of those other things. If I just find the way to present it to people that I think would be usable, I think it could still be very good.
Speaker B:I can see the alternate version of this story, about the bedside companion who sits with you when you die, being a clone who's grown in a tube and comes out with exactly the knowledge they need to comfort this person, and then just dies a natural death along with the person at the end. Like, you could do that instead, but that's not really what you want to do.
Speaker A:No, that's not the story. And possibly, if I wanted to present this as an anthology, the other way to do it is to make that a prominent theme, but don't make it part of the pitch. Right? Instead you treat it something more like Love, Death & Robots or Heavy Metal or whatever, where it's like, these are just sci-fi and fantasy ideas that we have. There's no theme, it's just stuff.
Speaker B:It's totally the theme, quote unquote.
Speaker A:Well, yeah. But then that also frees you up to do other things, right? Like, you do have to put some of the other stuff in there.
Speaker B:You could.
Speaker C:Yeah, you could pepper some of these AI stories in with other things that have nothing to do with AI and then it's just a sci fi anthology.
Speaker A:Yeah, then it's just a sci-fi anthology. Okay, but that's deep in the vault. I have to finish the one I'm working on now first.
Speaker B:Yeah, you got horny werewolves to do right now.
Speaker A:Okay, so I think we've had a rousing conversation, and I'm now both angry and disappointed about the existence of AI in our world, about a bunch of different facets of it.
Speaker B:A lot of things. But I'm really happy to be making stuff with the two of you and other people I know who like to make things with their human hands and brains.
Speaker C:Yes.
Speaker A:Yeah, yeah. There is nothing within the space of using AI that doesn't feel like you're working alone, really. And all of the best things I've ever gotten out of making things creatively have come when I collaborate with other humans and get their input and talk through it. We are not meant to work alone. Work with other people. Like, do it. Go collaborate, make some friends, find
Speaker B:Community, work with other artists. It's the best, most fulfilling feeling in the world anyway.
Speaker A:But we, we need to, we need.
Speaker C:To wrap this up.
Speaker A:Yeah, we need to wrap it up and set our goals. I, I need to find out when our next date is because I need to know if it's in.
Speaker B:Oh, good question. When is our next one?
Speaker A:And I just want to look at a calendar. You used to be able to click on the, like, day on the computer and just get a little calendar, and I could see what it was, and it's not there. They robbed me of it in Windows 11, and so now I have
Speaker B:To like Google the fourth. The next day we're usually on is the fourth.
Speaker A:Okay. So my goal: I will definitely have the casting call done by the fourth. I would really love to launch the casting call on September 1st. That's a Monday. That would actually be really cool. I don't know that I can do that, but I'm really thinking about it. Yeah. So I will definitely be done. There's a possibility I will kick the bucket, kick the butt. That's all.
Speaker B:Oh my God.
Speaker C:Wait a minute.
Speaker A:No, no. There's a possibility I'll kick the can down the road to whatever it is, the 8th or whatever, like the next weekend, to launch. I will definitely have the casting call done, and I'll be working on my revision. So that's what I'm finishing. And I gotta finish Bex the accomplice and, you know, rewrite her lines so that she has lines for people to say, and I can actually have people audition. Man, I do not envy. I don't know. Actually, I'm gonna say I totally envy the people who get a conlang as their audition piece. If I got that from somebody else, I would be so jazzed doing that. So maybe I'm wrong and shouldn't feel bad for these people. Maybe I'm giving them a delicious.
Speaker B:Giving them a treat.
Speaker A:Okay, so that's where I'm at. Do you guys have goals for the next two weeks?
Speaker C:Yes. So I am going on a. Like a retreat with my very, very good friends. Also, slash.
Speaker B:You're always going places.
Speaker C:I know. I'm always doing something.
Speaker A:I'm the jetsetter.
Speaker C:The weekend is going to be obviously a hangout and, like, relax and kind of recharge, but we're also going to be working together, because we just started an LLC together and we were marketing it at Worldcon. We've started a press, and we're doing a short story anthology, and I am the head of the audio department. I'm going to be doing the podcast, and I'm going to be narrating the stories along with a couple other people. And I completely forgot that I also need to write a story that's going to be in this anthology. I've written a couple, but they're way, way, way too long. I only get like 5 to 10,000 words, and the first one I wrote was like 27,000 or something, so that's not going to work. So this weekend I'm going to be writing a story for the anthology, but I'm not going to be writing the anthology story the whole time. And I was thinking maybe some as in west stuff would be really nice, kind of like a palate cleanser, like a little break between this other stuff. So I would love to come back on the fourth and have not only my story for the anthology, because it's due on the 1st, but also the conversation between Walker and the bot that I don't know what happened to. And I would like to look through all of the dialogue we've got so far for west, and I will make sure that everything is available in the Google Drive.
Speaker B:Perfect.
Speaker C:And then probably have, like, a sit-and-brainstorm session about the next few episodes. Because, you know, I'm going to be doing a lot of just sort of relaxing by the water or in the forest, wherever we are, and that would be such a great time to just take a notebook and, like, just sort of free.
Speaker A:Free.
Speaker B:And if you want to do, like, another 30-minute phone call with me during one of those brainstorming sessions, that would be dope. I know we talked a little bit ahead about episode three, or whatever's the next one. But yeah, I would love to speak about both that and the following episode.
Speaker C:Yep, yep. So that was really sort of all over the place, but they're in my brain. They are very like.
Speaker B:Like they're rotating like a rotisserie chicken.
Speaker C:It's like three or four little balls, and they're doing, like, the atom thing where they're just circling around something. Yeah. So that's what I'm going to try and do. Cool.
Speaker A:Jack, how about you?
Speaker B:Yeah. Once I have access to that first half of the episode, I will make whatever dialogue tweaks I want to make, and then I'll find a time to talk to Sam before the fourth to talk about the next couple episodes.
Speaker A:Cool. Cool, cool.
Speaker B:Cool.
Speaker A:All right, well, in that case, we will catch the audience in two weeks. Bye.
Speaker C:Bye.
Speaker A:Once again, I put off recording this outro until my apartment is noisy. So short outro noises here. Bye.
WARNING! SPOILERS FOR UPCOMING PROJECTS CONTAINED BEHIND THESE LOCKED DOORS.
Corridor Crew BTS for Rock Paper Scissors Anime
Library of Cursed Knowledge Patreon
Support Behind the Locked Doors by contributing to their tip jar: https://tips.pinecast.com/jar/behind-the-locked-doors
Find out more at http://behind.library.horse