{"id":17847,"date":"2025-08-29T23:23:51","date_gmt":"2025-08-29T21:23:51","guid":{"rendered":"https:\/\/plus.maciejpiasecki.info\/index.php\/2025\/08\/29\/your-claude-chats-are-now-training-ai-unless-you-take-action\/"},"modified":"2025-08-30T06:00:07","modified_gmt":"2025-08-30T04:00:07","slug":"your-claude-chats-are-now-training-ai-unless-you-take-action","status":"publish","type":"post","link":"https:\/\/plus.maciejpiasecki.info\/index.php\/2025\/08\/29\/your-claude-chats-are-now-training-ai-unless-you-take-action\/","title":{"rendered":"Your Claude Chats Are Now Training AI, Unless You Take Action"},"content":{"rendered":"<p>If you\u2019re a user of Anthropic\u2019s popular AI assistant, Claude, there\u2019s a new update you need to be aware of. The company has made some big changes to Claude AI\u2019s consumer privacy policy. Now, you have to decide whether your chats will be used to train future AI models.<\/p>\n<p>Previously, Anthropic had a policy of not using consumer chats for training purposes. But that\u2019s all changing. The firm now wants to use your conversations and coding sessions to improve its AI. If you agree, your data will be retained for up to five years. For current users, the default setting is already switched \u201con\u201d in a new pop-up window, so you\u2019ll have to actively switch it off if you don\u2019t want to participate.<\/p>\n<p>You have a deadline to opt out of Anthropic\u2019s new privacy policy for Claude AI<\/p>\n<p>The deadline to make your choice is September 28. If you choose not to accept the new policies, your chats will continue to be deleted after 30 days.<\/p>\n<p>So, why is this happening? Anthropic\u2019s official reason is that your data will \u201chelp us improve model safety\u201d and lead to a more accurate and better-performing Claude for everyone. 
By using real-world data, the company can make the AI better at everything from reasoning and analysis to writing code.<\/p>\n<p>While that all sounds great, the full truth is likely a little simpler: in the race to build the best AI, every major company needs a vast amount of high-quality data to stay competitive. Millions of real-life conversations from its own users are a goldmine for Anthropic, and the company may need them to keep competing with rivals like OpenAI and Google.<\/p>\n<p>This move mirrors a broader industry trend. OpenAI, which similarly exempts its enterprise customers from its consumer data policies, has faced legal battles over its own data retention practices.<\/p>\n<p>Many could miss this setting<\/p>\n<p>The most concerning part of this change is how easy it is for users to miss. New users will be prompted with the choice during sign-up. Meanwhile, existing users will see a pop-up with a large \u201cAccept\u201d button and a much smaller toggle switch for data sharing that is enabled by default. This design has raised concerns that users might click through without realizing they are agreeing to a major change in how their data is used.<\/p>\n<p>The bottom line is simple: if you use Claude, check your privacy settings before the September 28 deadline. That quick check will ensure your data is handled exactly the way you want.<br \/>\nThe post Your Claude Chats Are Now Training AI, Unless You Take Action appeared first on Android Headlines.<br \/>\n<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/plus.maciejpiasecki.info\/wp-content\/uploads\/2025\/08\/Anthropic-claude-scaled-1.jpg\" width=\"2560\" height=\"1440\"><br \/>\nSource: androidheadlines.com<\/p>\n","protected":false},"excerpt":{"rendered":"<p>If you\u2019re a user of Anthropic\u2019s popular AI assistant, Claude, there\u2019s a new update you need to be aware of. 
[&hellip;]<\/p>\n","protected":false},"author":67,"featured_media":17848,"comment_status":"false","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-17847","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-bez-kategorii"],"_links":{"self":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/17847","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/users\/67"}],"replies":[{"embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/comments?post=17847"}],"version-history":[{"count":1,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/17847\/revisions"}],"predecessor-version":[{"id":17849,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/posts\/17847\/revisions\/17849"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/media\/17848"}],"wp:attachment":[{"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/media?parent=17847"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/categories?post=17847"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/plus.maciejpiasecki.info\/index.php\/wp-json\/wp\/v2\/tags?post=17847"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}