{"id":61580,"date":"2023-09-13T05:48:48","date_gmt":"2023-09-13T05:48:48","guid":{"rendered":"https:\/\/jurn.link\/tentaclii\/?p=61580"},"modified":"2023-09-13T06:13:37","modified_gmt":"2023-09-13T06:13:37","slug":"ai-declaration-rules-at-amazon","status":"publish","type":"post","link":"https:\/\/jurn.link\/tentaclii\/index.php\/2023\/09\/13\/ai-declaration-rules-at-amazon\/","title":{"rendered":"AI declaration rules at Amazon"},"content":{"rendered":"<p>Amazon has new &#8216;AI declaration&#8217; rules, currently only being applied to ebooks&#8230;<\/p>\n<blockquote><p>We define AI-generated content as text, images, or translations created by an AI-based tool. If you used an AI-based tool to create the actual content (whether text, images, or translations), it is considered \u201cAI-generated,\u201d even if you applied substantial edits afterwards.<\/p><\/blockquote>\n<p>Book creators must declare any such use of generative AI, even if the output was later heavily edited by a human. <\/p>\n<p>Thus it seems important not to have book covers that include any AI-created elements. Even if you use a stock AI-created backdrop for a book cover, and it&#8217;s only 20% of the final cover, Amazon requires the whole book be labelled &#8220;AI generated&#8221;.<\/p>\n<p>If not labelled, there seems a real risk the book will be pulled from the store. This will become especially relevant when AI watermarking is rolled out, as Amazon&#8217;s bots will then be able to auto-detect the AI. In the meantime there&#8217;s also a risk with content that might attract the attention of activists of either the right or the left, seeking a way to have it &#8216;cancelled&#8217;. They might pounce on an undeclared use of AI.<\/p>\n<p>AI translation is also covered. Thus if a scholar uses an AI-powered translation service to translate just one required quote (from Latin, say), then presumably again the whole book has to be labelled &#8220;AI generated&#8221;. 
AI-made abstracts, tables-of-contents, cover blurbs (and eventually AI-generated back-of-the-book indexes) could also fall foul of the new rules, even if heavily edited by a human. <\/p>\n<p>And you might say&#8230; how will they tell? Ah, well&#8230; AI output from the main corporate tools is set to be invisibly watermarked, with Google already rolling out its version of the watermarking last week. Nvidia <a href=\"https:\/\/www.neowin.net\/news\/nvidia-adobe-ibm-and-others-promise-the-white-house-steps-to-mitigate-ai-risks\/\">just signed up to watermarking<\/a>, raising the prospect of embedding at the graphics-card level. Steganography&#8230; look it up.<\/p>\n<p>And where such labelling leads is very uncertain. For instance, having your book labelled &#8220;AI generated&#8221; might soon mean it doesn&#8217;t appear in search, or can only be found on the Amazon store with difficulty. You may even find it&#8217;s blocked by some third-party Web browser add-on, cooked up by an AI-hater.<\/p>\n<p>An example is DeviantArt&#8217;s AI declaration, required of people posting pictures. This seemed benign at first&#8230; until it wasn&#8217;t. Some weeks later, users found they could block all those &#8220;AI&#8221;-tagged images. 
Those who had been honest and trusting of the company suddenly found their work being automatically &#8216;disappeared&#8217;.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Amazon has new &#8216;AI declaration&#8217; rules, currently only being applied to ebooks&#8230; We define AI-generated content as text, images, or &hellip;<\/p>\n<p><a href=\"https:\/\/jurn.link\/tentaclii\/index.php\/2023\/09\/13\/ai-declaration-rules-at-amazon\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[37,21],"tags":[],"class_list":["post-61580","post","type-post","status-publish","format-standard","hentry","category-ai","category-odd-scratchings"],"_links":{"self":[{"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/posts\/61580","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/comments?post=61580"}],"version-history":[{"count":14,"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/posts\/61580\/revisions"}],"predecessor-version":[{"id":61594,"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/posts\/61580\/revisions\/61594"}],"wp:attachment":[{"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/media?parent=61580"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/jurn.link\/tentaclii\/index.php\/wp-json\/wp\/v2\/categories?post=61580"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/jurn.link\/tentaclii
\/index.php\/wp-json\/wp\/v2\/tags?post=61580"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}