The AI Book Scandal Rocking Publishing

Listen to this episode

Speaker A: Hey, I’m Kate Lindsey, and you’re listening to ICYMI, or In Case You Missed It, Slate’s podcast about Internet culture. And joining me today is Slate contributing writer Imogen West-Knights. Hi, Imogen.

Speaker B: Hi, Kate.

Speaker A: Imogen is also the author of the novel Deep Down. But we are here today to talk about a different novel, one that was actually pulled from publication after the Internet alleged it was written by AI. But before we talk about that, Imogen, it is your first time on ICYMI. I’ve already had it teased to me that you have a really good answer to this question. So we have to ask: what is your first Internet memory?

Speaker B: I’m not sure it’s a really good answer so much as quite a strange answer. And I was thinking about it. I was like, wow, that is.

Speaker A: That’s perfect.

Speaker B: So my dad was, like, a really early Internet guy, so he was kind of plugged into it all before other people were. And so I used to go to his office, where he had, like, the Internet, and make him let me play this game called Tamale Loco, which was a browser game. And you played... it was probably quite racist, now that I’m thinking about it. It was, like, a Mexican mouse.

Speaker A: Oh, of course. Yeah.

Speaker B: And you had to, like, collect the ingredients to make various different kinds of burritos, and you would, like, jump around, and he would scream the names of the ingredients. And I just thought this was the best thing in the universe, and I insisted on playing it all of the time.

Speaker A: So it was on. It was on the Internet. It was like.

Speaker B: It was. Yeah. It wasn’t a downloaded game. You had to, like, find it on the Internet and play it. And it would be so laggy, and I couldn’t save it anywhere. So I’d have to start the game from the beginning every time I played it.

Speaker A: Yeah.

Speaker B: But it was just. It was captivating to me. Also, Mexican food. I’d never seen or heard of Mexican food at the time. It was very exotic.

Speaker A: Right. Because I was gonna ask, was this in the UK?

Speaker B: Yeah, exactly.

Speaker A: That’s very. Yeah. I was like, what the h*** is this?

Speaker B: I was like, a burrito.

Speaker A: Wow. Yeah. Yeah. So your dad was an early Internet adopter. Is he tech savvy still?

Speaker B: He is still tech savvy, although he shocked me the other day when he said he’s been asking Grok things, and I was like, don’t talk to Grok.

Speaker A: That is, I think, the secret thing that’s happening that we aren’t talking about enough: the actual victims of these AI chatbots are parents, like boomer parents. You’ll notice, because they’ll randomly be sending you AI images of your cats or something.

Speaker B: Yes. My mom, in fact, only yesterday started a small situation in our family WhatsApp by sending a picture of my brother and his new girlfriend as the king and queen.

Speaker A: Yes. This is something that they love to do. They’re like, we want to take.

Speaker B: Where are they even doing this?

Speaker A: There’s like, we made you as this thing. And it’s like, okay. Yeah. No, not quite as tuned in. Yeah. My mom worked in IT, and so her main job was getting the viruses off the computer that I would download by trying to download The Sims add-ons and things like that. But no, my dad was not an early adopter of the Internet. And his most famous technology claim to fame is he once b***-dialed himself. Wow. By sitting on his cell phone, and it rang our landline, and he answered and was like, hello. Yeah. So.

Speaker B: To his own b***. Wow.

Speaker A: To his own b***. Yeah. Oh my God. I’ll know if he listens to this, ’cause he’s really tired of that being brought up. But, like, b***-dialing yourself is too perfect. Well, today’s episode is about, you know, something a little more literary than my dad’s b***. It is about a self-published BookTok sensation that was then picked up by a publisher in the UK. Now, a little over a year later, it’s been pulled from the shelves and its US publication has been canceled. When we come back: how did an allegedly AI-written novel make it through a Big Five publisher? And is this only just the beginning? And we’re back. Now, I had never heard of this book before this controversy, but it sounds like it was one of those self-published sensations that we’re seeing a lot these days, where people are publishing it on places like Wattpad or on their own, but then it catches the attention of traditional media. So the book is called Shy Girl, by author Mia Ballard. Mia Ballard is American. She lives in North Carolina. It was self-published in February 2025, but then Hachette Book Group published it in the UK in November, and it was slated for US publication this spring. Imogen, you read the book for a piece you wrote for Slate. We’ll get into what happened later, but first I want to know: what is this book about?

Speaker B: So the book is. And the sort of conceit of it sounds good. You know, it sounds pretty interesting. You’d be like, oh, it’s about a woman who is down on her luck financially and she signs up for a sugar baby website, gets put in touch with this man, and eventually he takes her prisoner and keeps her as a pet, basically.

Speaker A: Yeah.

Speaker B: And she begins to develop, like, animal qualities. She kind of turns into a dog. So it’s kind of a Nightb****-type vibe.

Speaker A: Netflix easily snapping the dog.

Speaker B: Right. So it’s quite, you know, like a snappy premise.

Speaker A: Well, yeah. So what we’re getting at is that in January 2026, and perhaps a little earlier, because you mentioned in your piece that it was popping up. But I think one thing that people cite as the inciting incident for this is a Reddit post on r/horrorlit. And this person did a long post, and they write: Shy Girl by Mia Ballard, does anyone else think this was written by ChatGPT? And basically, this person’s evidence is that they used to work with AI-generated creative writing in their job as a book editor. And so they have experience with what this sounds like. And so mostly they just walk through some of the hallmarks of spotting LLM writing. There’s: every noun has an adjective, which is something that, ever since I read this to research this, I was writing this morning and I was like, oh my God, do all my nouns have adjectives? Like, now I’m getting self-conscious. But every noun has an adjective. It’s repetitive. The syntax is very, like, poetic and high-drama. And then they put some sections of the book in there to kind of show, like, here’s how I’m seeing this show up. And so this, I know, takes off on Reddit, and then it ends up prompting a YouTube video that has over 1.4 million views by a user named Frankie’s Shelf. It’s titled I’m Pretty Sure This Book Is AI Slop, and I’ll just play a little clip from it right now.

Speaker C: The book itself is nothing. It does nothing. It says nothing. I have spent valuable hours of my life reading it, and I feel worse off for it. It’s so empty. It’s flat in every way. Themes, characters, plot, writing. And that is because, in my opinion, large chunks of this book were written with assistance from a generative AI. And generative AI does not think or feel or have opinions.

Speaker A: So, like we said up top, you have read this book, Imogen. Does this match your take? What did you think of it as a book?

Speaker B: I thought it was terrible. I have to say, I really did not think it was good. And so, I mean, it was so interesting to read it knowing already that there had been all of this drama, and the implication was that it was written, at least in part, by AI. Because I do think I would have noticed without that context. This whole issue goes so deep, it’s fascinating. Like, all those tells you were talking about, they do exist, but there’s something that happens when you read an entire book that is largely written by AI, this sort of feeling you get of an emptiness that’s quite hard to describe. You really can tell there is no mind behind it. And I’m sure people have felt this when they’ve read, like, AI-generated LinkedIn slop or whatever. This feeling of, like, there’s no texture here, there’s no personality. But to read an entire book with that alarm bell going off all of the time was really quite a bleak experience. I can see why. Because that video is, like, three hours long.

Speaker A: Yes. No, right. I should have said that is so exhaustive.

Speaker B: Yeah. And I can understand how they got themselves into a froth enough that they were like, I’m gonna do a three-hour video about this because I’m p***** off.

Speaker A: I know. What’s also so interesting is that both the Reddit post and this video went up in January. And so this clearly was, like, simmering for a little bit, because nothing had really happened. But then on Thursday, March 19, the New York Times published the news that Hachette was pulling the book from publication. And in the piece, all they say is that the New York Times brought their own findings to Hachette. I don’t know what they used or what they did, but I think it’s one of those cases that can happen in journalism where, basically, the moment the place in question realizes a publication like the New York Times is about to do something about it, there’s suddenly movement. And so they bring this to Hachette. And Hachette tells the Times that, so this is under the Orbit imprint, which is the imprint that was going to publish it in the US, they decided not to publish Shy Girl, which was due out this spring, after conducting a thorough and lengthy review of the text. Hachette said it will also discontinue the book in the UK. And in their statement, they wrote that Hachette remains committed to protecting original creative expression and storytelling, and that Hachette requires all submissions to be original to the authors and asks authors to disclose to the company whether they are using AI during the writing process. Which, I will say, those two things actually kind of seem to contradict one another. Like, they don’t allow it, but you have to tell them if you use it? I guess so. But the question you kind of get to in your piece, and that your investigation is hinged on, is that if a Big Five publisher, which is basically just the term for the major publishing houses, has these types of AI restrictions and presumably the resources to enforce them, how did this get through?

Speaker B: Well, that’s the interesting thing, that you mentioned resources, because I think that’s what they don’t really have, on a number of different levels. None of us do, in a way. Right? Like, it’s so new, this whole phenomenon of suddenly there being these washes of non-human-generated content on the Internet, and we’re all probably getting a bit better at recognizing it, but it’s very, very new. Publishing houses are these big, massive corporations that are a bit slow-moving, and AI detection software is not that reliable. Like, it’s not always gonna pull up the examples of things that it should. And then on top of that, the people who are the first line of defence against these things are the editors. And what really, I don’t know whether it shocked me, I sort of half knew this and dared not think about it because it was so depressing, but editors mostly don’t have that much time for editing.

Speaker A: Right, right. And especially with what we were talking about earlier, that when you read it, it was just, like, a feeling of emptiness. That’s a hard thing, I feel, if you’re an editor reading it, to be confident enough about to say, this must mean it’s AI. Because another point that your piece brings up, that I thought was super interesting, was, the way you kind of politely phrase it in your piece, this more recent emphasis on, quote, idea-driven books, which basically means acquiring a book because you know it’ll sell, but not really because the writing is great. And so essentially the problem that editors are running into is, like: is this writing just bad because it’s a bad writer, or is it bad because it’s AI? Yeah, I thought that was super interesting.

Speaker B: And also, like, what probably happened, and, you know, Hachette is being very tight-lipped about the process of it all, is that it was a self-published book. Lots of people read it, wherever it was they were reading it, and said that they thought it was fun or good or whatever. And, you know, the pipeline from self-published book to traditionally published book can be a really great thing for the self-published author and for the publishing house. Like, okay, people already like this, it’s probably going to sell, the author gets money from a book deal. Win-win. But the danger then is that there’s probably even less editing going on with a self-published book that gets picked up, because it’s like, well, it already exists and people already like it.

Speaker A: Right, right. It’s like even if they did have edits they would make, it’s like you don’t want to risk losing whatever it is that got.

Speaker B: Exactly.

Speaker A: Yeah. So like we said, this book did not come out through traditional publishing. It came out from being self-published, getting popularity, and then was picked up by Hachette. Imogen, as someone who has, I assume, published a book traditionally, your book.

Speaker B: Yes, in fact, with Hachette. So I can really speak. I can really speak.

Speaker A: Oh my God. Okay. I would. Wait. Yes. All right. So you went, you literally went undercover for this story. You got a book published with Hachette. What does that process look like versus what happened with Shy Girl?

Speaker B: So when a book is self-published, I mean, the clue’s in the name: you put it out there, wherever that is. You can put it on Amazon, you can put it on places like Wattpad, you can literally just put it on your website, whatever. That may have been edited: you could have hired someone to edit it for you if you wanted, you could have just edited it yourself, it could have been just a first draft. Anything goes, really. Then when it gets picked up for publication by a traditional publishing house, it will go through. And I mean, I have nothing bad to say about my editor at Hachette, who was great, and I got, like, all the bells and whistles. She was really engaged. I think maybe I even just got lucky with, I don’t know, either the timing or her as a person. But in an ideal world it should go like this: they take the book in, and the editor goes through and does kind of macro-to-micro edits. So they’ll start with the structural stuff and move things around, character stuff, anything big-level. And then it will get smaller and smaller and smaller until you’re down at the sentence level, and then it will get copyedited after that, and then it goes to publication in book form. That should be a quite involved process, and every word should be being raked over.

Speaker A: And even to get to that point, I can speak more from this experience, it’s difficult, because you have to get the book in front of a publisher, and that often requires an agent. You have to have an agent that wants to read your work. There are so many barriers, which in many ways is what’s great about self-publishing, because those barriers can be prohibitive to certain groups of people. Self-publishing is great in many ways, but without the sort of more stringent editing, yeah, something like an AI-generated book can just slide right through. But there is a plot twist when we come back. The only thing more complicated than a publisher pulling a book for being written by AI is a publisher pulling a book for being written by AI even when the author says it wasn’t. Hey there. Are you ever listening to an episode of ICYMI and going, gosh, I wish I could see Kate’s face? Well, you’re in luck, because the ICYMI YouTube channel has launched, and our Chappell Roan episode featuring creator Josh Laura is live now, and it includes some extended bits that didn’t make it into the audio episode. So after you finish listening to this, you can head over to icymipod on YouTube to see everything we’ve been cooking up. All right, back to my conversation with Imogen. And we’re back. And one of the things that I like about your piece is that these questions about technology, and especially AI, often do come down to something very human. And your piece determines that really this issue comes down to trust. And that is one of the things making this specific issue really complicated, because Mia Ballard herself disputes that this was written by AI, or at least not entirely. She responded to the New York Times request for comment saying that an acquaintance she hired to edit the self-published version of the novel had used AI. So we’ll get to that in a second.
But what she said is: this controversy has changed my life in many ways, and my mental health is at an all-time low, and my name is ruined for something I didn’t even personally do. So we don’t really get any information about what she means by edited using AI, because she said she could not elaborate, since she’s pursuing legal action. But it sounds like she’s perhaps contending that, through editing, AI was inserted into her book.

Speaker B: Yeah.

Speaker A: How did that development, like, sit with you?

Speaker B: I mean, it’s. It’s got a pretty strong flavor of like, my friend took my phone and sent that text message, you know.

Speaker A: Right. Yeah.

Speaker B: But I mean, it’s really interesting, because if she does take legal action about it... it’s not impossible that she had, you know, a friend, maybe from her writing group or whatever, do a pass over the book and give her some feedback. It strikes me as, look, I don’t know, not true. But then, like, I don’t know.

Speaker A: Yeah, it’s.

Speaker B: It’s what you would say, I suppose, because. But then how do you not, you know, how do you. How would you get the book back from this supposed friend and not be like, hey, you’ve changed, like everything and it sucks now.

Speaker A: Right. I think, again, it’s like a trust thing of, well, how does this work? Because already it’s not happening in the traditional publishing industry way, you know, it’s self-published first, and then they’re not really touching it. And so I think there’s a lot of, sort of, a curtain behind which we just have to kind of take her word for it. I’d love to hear from this friend.

Speaker B: Oh, yeah.

Speaker A: But I can’t imagine that friend’s gonna volunteer themselves anytime soon. But something I think is interesting, and bear with me for this comparison, but we recently did an episode on Taylor Frankie Paul, who is on The Secret Lives of Mormon Wives and was the Bachelorette. And then after this video came out of a domestic violence incident, ABC axed her, got rid of her. And it was this whole thing. The sort of issue that people had with that was that the domestic violence allegations, the dynamic of the relationship she was in where this video was taken, were something ABC was very aware of, because it was very public. And so it felt very much like only when it hurt ABC did they act. And so, coming back to this: on that Frankie’s Shelf video, after it had come out that the book was AI, Frankie themselves commented some reflections on their video. They wrote: now that Hachette has pulled Shy Girl in such a massive public way, I think the conversation has totally changed since I made this video, and we really need to be questioning them even more so than Ballard. Obviously she f***** up big time, but Hachette, as a Big Five publisher, is just getting away with all their f***-ups in this situation while throwing a Black woman under the bus for it. And that is insanity. Hachette picked up this truly God-awful book from self-publication to make a profit from it. Put nothing back in, a little formatting maybe, but no editing, no due diligence whatsoever, no responsibility for what they’re publishing under their name. And this is all what Frankie alleges, I think, comparing the self-published version to what they read in the published one.

Speaker B: Right.

Speaker A: And when they were caught, they got to pretend like they were making this honorable, noble move and taking a stance against AI, while letting the author they failed get raked over the coals and take all the heat. So I think this is a really interesting point, especially in light of what we were speaking about when it came to why people turn to self-publishing, which is that someone like a Black woman noticeably and notoriously does not get the same attention from traditional publishing as, obviously, a white man or a white woman. And so it’s understandable that people from a more marginalized background would use self-publishing, and use more unconventional resources, like having their friend edit, as a way to get their work out there. And it gets out there and it is successful. But then it’s a shame when this does finally open the door to a Big Five publishing opportunity, and then something like this happens, and it’s just really unclear where the blame lies. Imogen, what is your reaction to this perspective?

Speaker B: I think everybody has got a bit of a share of the blame here, because, again, back to it being so new, it’s like: what is and is not acceptable? So back to what you were saying about Hachette’s guidelines, that it all has to be original work and they have to disclose if they use AI. That kind of does make sense, because what they’re hedging for there is the fact that lots of authors are using ChatGPT for research, say. But then the slope seems to get slippery for some people. So maybe they might be like, oh, I’m really having trouble with the transition between these two sections, maybe I’ll just see what ChatGPT has to say about it, get some ideas. And I think, like, I wouldn’t want to do that myself, but I would find it difficult to argue that that’s immoral, you know. Yeah, so where’s the line?

Speaker A: What qualifies as something you need to disclose? Yeah. And do you have a sense that, because, you know, Hachette, like you said, has a specific policy. I was kind of looking into what, broadly, the publishing industry’s policy is, and I think, maybe for the reasons you’ve just said, it doesn’t seem like they’ve landed on one. At least there’s no consistency, because I also saw this piece in an outlet called The Bookseller that says, quote, some editors “uploading confidential manuscripts to ChatGPT” to read quickly, agent claims. Which makes me think of what you were saying, which is that editors appear to not have the time to edit. And so it’s possible that they are also using ChatGPT to speed things along.

Speaker B: Like, everyone’s using ChatGPT in the publishing industry. I would be flabbergasted if, like, sales and marketing were not using it to do all the bumps that they need to do. And it’s not a publishing problem, it’s an everyone problem, that we’ve been handed this tool that’s supposed to make our lives easier but is also making our lives quite s*** and making our work quite s***. And maybe that will improve and change as the software either develops or we bury it all in the desert. But for now, I feel like there’s this sense that people want to make their working lives easier and are falling on their faces a lot in the pursuit of doing that.

Speaker A: Yeah, like, I’m not in the publishing industry, but I very much am seeing this type of unsteadiness, where no one quite knows where the line is, in the journalism industry, which I’m sure you can also speak to. Because it felt like, for the first few years of generative AI’s existence as a commonplace tool, it was very much, we are not using it, we’re not touching it. But then last month there was kind of this slew, including this controversy, of reveals, or cracks that were showing, that actually people, I think, were maybe kind of always using it and are now, okay, not everyone, of course, but, you know, starting to test the waters. Because the Wall Street Journal profiled this Fortune editor who they describe as all in on AI, because he uses it to basically write seven stories a day and just, I guess, feeds prompts into it. And he gets this whole photo shoot where he’s, like, at his desk doing nothing.

Speaker B: It’s like, okay now, right?

Speaker A: Yeah. And it’s like, bragging that he’s not doing anything, whatever. I find that whole thing crazy, also because my whole thing is: you’re using AI and you only got seven stories a day? Like, that was the amount that, in sort of my early days in content farms, I wrote just as a person, and all my other colleagues wrote, too. Not saying those stories should exist, but it’s like, you only did seven with AI? I could run a whole website with it. But then, at the same time, this guy’s getting a photo shoot, and at the New York Times, they actually ended up letting go a book reviewer because they used an AI tool for their review. It’s not quite specified what the tool is, but it says the tool ended up pulling verbatim from a Guardian review of the same novel he was writing about. And so basically what happened is people were reading the New York Times review, and it had, like, full sentences that were taken from the Guardian review.

Speaker B: Do you know, that one p***** me off so bad because it’s like being a freelance book reviewer for the New York Times, you don’t make any d*** money doing that. You do it for love of the game. And yet he can’t even love the game.

Speaker A: Right. I know it’s like very. The motivations are very confusing. And then I will admit that what makes it more confusing is that one person doing that is getting a photo shoot and the other person doing that is getting fired.

Speaker B: Yeah, yeah.

Speaker A: And so it creates just, like, this very unsteadying environment. I mean, I would say that the easiest thing to do in this scenario is just not use it at all. But I think it’s time to acknowledge that there are people, for valid reasons, like editors who are overworked but being told they still need to meet these demands, who find that these tools help. But they make the art, as this case perhaps allegedly proves, they make the art much worse.

Speaker B: But I don’t think any of them want to be doing it. I mean.

Speaker A: Yeah.

Speaker B: Oh, it’s a mess. It’s a mess.

Speaker A: Yeah, yeah. I think the question that hasn’t been as talked about in this controversy, though, is: there’s the question of, should writers be allowed to use AI? Should publishing companies have better policies? But what about readers? Like, do readers care about consuming AI-written work? Were it not taboo, do you think, I’m thinking of, obviously, people in the book world flagged this because it felt hollow and empty, but also enough people were reading it initially that it got picked up by a big publisher. Like, do you have any sense of what the reader attitude towards AI-written content is?

Speaker B: It’s really hard to say. And it’s hard to say without being a b****. I think it’s like, I suppose, there are all different kinds of books, right? Like, some of them, what did I call them? Idea-driven. Idea-driven, where it’s more like the point is the plot, and that’s kind of what you’re there for. The quality of the writing is kind of neither here nor there to you. You might be a big fan of stuff like that, and then maybe that person wouldn’t care as much, or wouldn’t be bothered that a robot had had a hand in writing the thing that they were reading. But if you’re talking about books as, like, art, I think that people who read books as artworks and works of literature do care. I mean, I do, because I’m a snob, basically. I want it to be, like, a beautiful thing that a human being has poured their blood, sweat, and tears into. And I suppose even if I couldn’t, quote unquote, tell, if ChatGPT or whatever got so good that it could write something that was indistinguishable from great literature, I still would not want it. And maybe that is an indefensible position. But I want the sense of a mind, and effort, and of a coherent vision behind something that I’m reading.

Speaker A: No, I think one of the things that the proliferation of AI has taught me, at least, and I think a lot of other people, is that we never before had to really think about so much of the value of what we were consuming, what we were reading, watching, listening to. The fact that humans came together and made it was always a given. And then when that’s taken away, it’s what you were saying when you were picking up on an emptiness: you feel that absence of humanity, and, surprise, we like it when there’s humanity. And it’s almost like, what’s the point of this thing if someone didn’t put in the effort to make it?

Speaker B: I would rather read bad human slop than good AI slop. Because bad human slop is. Is bad in beautiful, fascinating, different ways.

Speaker A: Yes.

Speaker B: And there’s so much, like, every bad writer is a bad writer in their own way, and they will turn out something interesting, if bad. Whereas AI stuff is just, it’s the one-note-ness of it that I find very off-putting.

Speaker A: I like that. I like that. Like, let’s not let the perfect be the enemy of the good. Bring back human slop.

Speaker B: Yes.

Speaker A: Okay, that’s the show. Thank you so much to Imogen for joining us. And we’ll be back in your feed on Saturday, so definitely subscribe; that way you never miss an episode. Leave us a rating and a review on Apple or Spotify, and tell your friends about us. To see all the visuals referenced in today’s episode, you can follow us on Instagram at icymipod, and you can always drop us a note at icymi@slate.com. ICYMI is produced by Vic Whitley-Berry and me, Kate Lindsey. Daisy Rosario is our senior supervising producer, Mia Lobel is Slate’s executive producer of podcasts, and Hillary Frey is Slate’s editor in chief. See you online, or b***-dialing yourself.