After months of complaints from the Authors Guild and other groups, Amazon.com has started requiring writers who want to sell books through its e-book program to tell the company in advance that their work includes artificial intelligence material.
The Authors Guild praised the new regulations, which were posted Wednesday, as a “welcome first step” toward deterring the proliferation of computer-generated books on the online retailer’s site. Many writers feared computer-generated books could crowd out traditional works and would be unfair to consumers who didn’t know they were buying AI content.
In a statement posted on its website, the Guild expressed gratitude toward “the Amazon team for taking our concerns into account and enacting this important step toward ensuring transparency and accountability for AI-generated content.”
Considering what a total wasteland Amazon’s self-published section is, I don’t know that it could be much worse.
Of course any author with an IQ over 70 would have the good sense to never disclose they were using AI.
What’s worse is AI-generated books on mushrooms, etc., which can be literally deadly (and yes, such books have already been published!)
Darwin in action. Anyone who’d use a guidebook to figure out what mushrooms to eat is gonna have a bad time regardless; it’s not really something you can sum up in a book in a safe way.
My mother was originally going to self-publish her first novel on Amazon, but she realized what a scam it was. I’m glad she found a real publisher. She’s on her sixth book now. They aren’t bestsellers or anything, but people actually buy and read them. eBooks and physical copies.
This is absolutely a good move, though I don’t know how effective it’ll be on its own. Unfettered AI garbage “content” is soon going to flood every storefront and service around, and the only way to really solve it is to close things down and move to more highly curated platforms. I wish that wasn’t the case, but I can imagine a future where it’s hard to find anything worthwhile in a sea of AI-generated junk.
some?
Hopefully the mushroom foraging guides. Amazon is going to have a pretty big lawsuit if AI hallucinations start mixing up Death Caps with choice edible mushrooms and get someone killed.
why not all though?
Well all would be preferable.
wait some? the fuck they mean some, why not all
Good. Unless you’re using an AI you yourself cultivated using your own creations: you’re plagiarizing with extra steps.
isn’t everything just plagiarising with extra steps
You both owe Rick and Morty royalties
You joke, but you bring up an excellent point as to why I dislike the “AI is plagiary” argument that I see a lot of these days.
Everything is plagiarized in some way. No thought is truly original. Unless you spend your whole life with zero contact to anyone or anything and consume zero media of any form (in which case, have fun conveying your original thoughts with the language you’ve had to invent for yourself that nobody else could possibly translate), then every idea is based on another idea before it. Every single thought has an inspiration behind it. LLMs aren’t just copy/pasting content; the actual logic behind generative content production is incredibly similar to how people form thoughts and ideas of their own.
That said, if you’re writing a book using AI, I’d argue a case for laziness more than plagiary. Though I don’t see an inherent problem in using AI to help write a story. But if the whole book is AI-generated, I can’t imagine it will be good enough to sell enough to justify the time and effort it takes to produce that amount of text and have it published, so I wouldn’t foresee it being a very widespread problem just yet.
deleted by creator
As of now it looks like the Copyright Office is taking the position that if the AI wrote all or most of the work, then it’s not eligible for copyright.
Because only human authors/artists can obtain a copyright and if the AI wrote it then a human did not.
The courts will have to determine how much of the work needs to be done by a human to consider the AI just a tool used and not the creator.
deleted by creator
Taking inspiration from something is different than creating a Frankenstein’s monster. AIs replicate; they do not create.
That’s not actually how generative AI works. LLMs don’t copy/paste material unless deliberately instructed to. And even then, most are built in a way that they still will not reproduce their training material word-for-word.
Yes, change a few words here and there: it totally isn’t plagiarism!
I’m not arguing this with people who have likely never created anything other than code.
Again, not how LLMs work. Maybe before you decide who you do and don’t argue with, you should decide if you even should argue something you don’t understand in the first place.
Overly simplistic outlook.
If you provide the sources and direct the LLM to use those sources, and then proofread the damn thing and cite the sources, it flat out is not plagiarism.
It’s as much plagiarism as using a calculator is to find square roots.
Removed by mod
The article really isn’t about this so this comment probably would have been better as a response to some of the bitching above.
If AI writes something, it should be flagged as AI. Right now there are AI-written mushroom foraging books on Amazon. If those books weren’t proofread by a mycologist or skilled forager and someone trusts that information, an AI hallucination could get them killed.