Curated by THEOUTPOST
On Tue, 8 Oct, 4:10 PM UTC
20 Sources
[1]
Adobe wants to make it easier for artists to blacklist their work from AI scraping
The web app, called Adobe Content Authenticity, allows artists to signal that they do not consent to their work being used by AI models, which are generally trained on vast databases of content scraped from the internet. It also gives creators the opportunity to add what Adobe is calling "content credentials," including their verified identity, social media handles, or other online domains, to their work.

Content credentials are based on C2PA, an internet protocol that uses cryptography to securely label images, video, and audio with information clarifying where they came from -- the 21st-century equivalent of an artist's signature. Although Adobe had already integrated the credentials into several of its products, including Photoshop and its own generative AI model Firefly, Adobe Content Authenticity allows creators to apply them to content regardless of whether it was created using Adobe tools. The company is launching a public beta in early 2025.

The new app is a step in the right direction toward making C2PA more ubiquitous and could make it easier for creators to start adding content credentials to their work, says Claire Leibowicz, head of AI and media integrity at the nonprofit Partnership on AI. "I think Adobe is at least chipping away at starting a cultural conversation, allowing creators to have some ability to communicate more and feel more empowered," she says. "But whether or not people actually respond to the 'Do not train' warning is a different question."

The app joins a burgeoning field of AI tools designed to help artists fight back against tech companies by making it harder for those companies to scrape their copyrighted work without consent or compensation. Last year, researchers from the University of Chicago released Nightshade and Glaze, two tools that let users add an invisible poison attack to their images. One causes AI models to break when the protected content is scraped, and the other conceals someone's artistic style from AI models.
Adobe has also created a Chrome browser extension that allows users to check website content for existing credentials.
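To make "cryptographically signed metadata" concrete, here is a toy sketch of the core idea. This is an illustrative assumption, not the real C2PA format: C2PA uses X.509 certificate chains and COSE signatures rather than the shared-key HMAC used below, and the field names here (`assertions`, `asset_sha256`) are invented for the example. What the sketch does show accurately is the principle that a signature covers both the asset's hash and its credentials, so editing either one breaks verification.

```python
# Toy illustration of signed provenance metadata -- NOT the C2PA format.
# C2PA uses certificate-based COSE signatures; we use an HMAC for brevity.
import hashlib
import hmac
import json

def sign_credentials(content: bytes, credentials: dict, key: bytes) -> dict:
    """Build a manifest whose signature covers the asset hash and metadata."""
    manifest = {
        "assertions": credentials,  # e.g. creator name, links, AI-training preference
        "asset_sha256": hashlib.sha256(content).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return manifest

def verify(content: bytes, manifest: dict, key: bytes) -> bool:
    """Any change to the content or the metadata invalidates the signature."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    if unsigned["asset_sha256"] != hashlib.sha256(content).hexdigest():
        return False  # the content itself was altered
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])
```

In a real deployment the signer would hold a private key and anyone could verify with the matching certificate; the tamper-evidence property, though, is the same one the sketch demonstrates.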
[3]
Adobe's App Lets Creators Protect Work from Generative AI training
Adobe announced a free web app called Adobe Content Authenticity on October 8 that allows content creators to add "Content Credentials" to their work. These credentials are metadata that contain information about the creator and the context in which they created or edited the content. Creators can also use them to signal that they do not want their content used to train generative AI models. The company stated that it intended the app to empower creators and help them take credit for their work.

The web app is an extension of the existing Adobe Content Credentials program, which was released in 2019. The app will serve as a centralised hub for managing Content Credentials and will integrate with Adobe Creative Cloud apps like Photoshop and Lightroom. It will help creators apply Content Credentials in batch to sign their digital work, including images, audio and video files. These credentials can include details such as their name, website and social media accounts. Adobe stated that the credentials "combine digital fingerprinting, invisible watermarking and cryptographically signed metadata" and stay connected to the content even through screenshots. Adobe also released a Google Chrome extension that can recover and display available credentials associated with any content on the internet.

Creators can also signal that they do not want their content used to train generative AI, including by Adobe Firefly, the company's own AI model. Training datasets for large language models (LLMs) often include publicly available data scraped from the internet, frequently without the consent of the creators. A July investigation from Proof News revealed that Silicon Valley tech giants like Apple and Nvidia trained their AI models on YouTube videos without the consent of the creators.

A number of news organisations and authors have already filed copyright infringement lawsuits against AI developers like OpenAI and Nvidia, including the New York Times and the Center for Investigative Reporting.
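As a rough sketch of what "applying Content Credentials in batch" might look like from a tooling perspective, the snippet below writes the same creator record against a list of media files. Everything here is a hypothetical illustration: the field names, the `not_allowed` training preference value, and the JSON-sidecar approach are assumptions made for the example, whereas Adobe's actual credentials are embedded in the asset and bound to it with fingerprinting and watermarking.

```python
# Hypothetical batch-signing sketch -- field names and the sidecar-file
# approach are assumptions for illustration, not Adobe's actual format.
import json
from pathlib import Path

CREATOR = {
    "name": "Example Artist",                 # hypothetical creator details
    "website": "https://example.com",
    "social": ["@example"],
    "generative_ai_training": "not_allowed",  # the "do not train" preference
}

def apply_credentials(files, creator=CREATOR):
    """Write one JSON sidecar per media file, all carrying the same record."""
    written = []
    for f in map(Path, files):
        sidecar = f.parent / (f.name + ".credentials.json")
        sidecar.write_text(
            json.dumps({"file": f.name, "credentials": creator}, indent=2)
        )
        written.append(sidecar)
    return written
```

The point of the batch model is exactly this shape: one declaration of identity and preferences, fanned out over many images, audio, and video files at once instead of per-file manual entry.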
[4]
Adobe has a new tool to protect artists' work from AI
The Content Authenticity web app can be used to widely apply attribution data to content, including the creator's name, website, social media pages, and more. It also provides an easier way for creatives to opt their work out of AI training en masse, compared to laboriously submitting individual protections for their content to each AI provider.

The web app will act as a centralized hub for Adobe's existing Content Credentials platform. Content Credentials are tamper-evident metadata that can be embedded into digital content to disclose who owns and created it and whether AI tools were used to make it. The web app will integrate with Adobe's Firefly AI models, alongside Photoshop, Lightroom, and other Creative Cloud apps that already support Content Credentials individually. And importantly, the hub will allow creatives to apply Content Credentials to any image, video, and audio file -- not just those made using Adobe's apps.
[5]
How is Adobe Content Authenticity web app protecting artists' work from AI?
Adobe is taking another step in supporting creative professionals with its new Content Authenticity web app, designed to help creators protect and attribute their digital work, and opt it out of AI training. This free tool streamlines how creatives manage and safeguard their images, videos, and audio content in the age of AI.

The Content Authenticity web app is part of Adobe's expanded Content Credentials initiative, which allows creators to embed important information, such as their name, website, or social media profiles, into their digital work. More importantly, it gives creatives a streamlined way to label their work with a "do not train" tag, helping them opt out of generative AI models that scrape the internet for content.

The web app enables creators to quickly apply attribution data across multiple files at once, saving time compared to manually embedding credentials into individual pieces. This attribution includes essential details like creator names, links to portfolios, and more.

Adobe recognizes the concerns many creatives have about their work being used to train AI models. The web app allows users to opt their content out of AI training, signaling that their images, videos, and audio should not be used without permission. The app is compatible not only with Adobe's tools like Photoshop and Lightroom but also with any other digital content, making it a powerful hub for creators to manage their digital assets across platforms.

Adobe has integrated technologies like digital fingerprinting and cryptographic metadata, which help restore Content Credentials even if someone tries to remove them. This adds an extra layer of protection for creatives worried about their work being misused or uncredited. In addition, the app includes a feature to inspect Content Credentials on websites. Creators and viewers can check if content has been tagged with the metadata, even on platforms that don't natively display it.
A Google Chrome extension is also in beta, allowing users to directly inspect credentials from a webpage. Adobe has been proactive in addressing the growing concerns within the creative community about AI's impact on intellectual property. The Content Authenticity web app is launching as part of a broader effort to support creators who want to protect their work from being used without consent.

Houston-based photographer Alexsey Reyes says: "There's a blanket of security I didn't know I needed as an artist sharing online until looking into the benefits of adding Content Credentials to my art. Using Adobe Content Authenticity is like one of those 'things I wish I knew when I started' moments."

Although Adobe's AI models, like Firefly, train only on licensed or public domain content, the web app's protection features are aimed at influencing the broader AI market, encouraging other AI developers to adopt similar ethical standards. One of the key benefits of the web app, and perhaps the most important, is its ability to opt creators out of generative AI training en masse, rather than requiring individual submissions to different platforms.

While the Content Authenticity web app represents an advancement in protecting digital work, Adobe still faces the challenge of getting more tech and AI companies on board with these protections. Currently, only a few companies, such as Spawning, support these opt-out features. However, Adobe is actively working to expand industry-wide adoption of Content Credentials and AI protections. As AI continues to grow, tools like Adobe's Content Authenticity web app could play a critical role in maintaining ethical standards and safeguarding the rights of creators. Whether it's ensuring proper attribution or protecting work from unauthorized AI usage, this web app marks a pivotal moment in digital content management.
Launching in public beta in early 2025, the Content Authenticity web app will be free for all users with an Adobe account, making it more accessible to the creative community. As artists struggle to find a balance between AI and digital ownership, Adobe's new tool offers a powerful solution to help them retain control over their work.
[6]
Adobe proposes a way to protect artists from AI ripoffs | TechCrunch
As the engine powering the world's digital artists, Adobe has a big responsibility to mitigate the rise of AI-driven deepfakes, misinformation, and content theft. In the first quarter of 2025, Adobe is launching its Content Authenticity web app in beta, allowing creators to apply content credentials to their work, certifying it as their own.

This isn't as simple as altering an image's metadata -- that kind of protection is too easily thwarted by screenshots. Content credentials take provenance a step further. Adobe's system uses digital fingerprinting, invisible watermarking, and cryptographically signed metadata to more securely protect an artwork, including images, video, and audio files. Invisible watermarking makes changes to pixels that are so minute that they evade detection by the human eye. The digital fingerprint operates similarly, encoding an ID into the file to make sure that even if the content credentials are removed, the file can still be identified as belonging to its original creator.

Adobe's senior director of Content Authenticity, Andy Parsons, told TechCrunch that with this kind of technology, Adobe can "truly say that wherever an image, or a video, or an audio file goes, on anywhere on the web or on a mobile device, the content credential will always be attached to it."

Opt-in initiatives like this are only as strong as their adoption. But if any company can reach a quorum of digital artists and creators, it's Adobe, which has 33 million paying subscribers. And even artists who aren't Adobe users can use the web app to apply content credentials. Then there's the issue of making content credentials accessible across the internet.
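The idea of invisible watermarking -- pixel changes too small for the eye to notice that still encode an identifier -- can be illustrated with a classic least-significant-bit scheme. This is a textbook toy, not Adobe's scheme: Adobe's watermarking is proprietary and far more robust (it is designed to survive screenshots, which a naive LSB mark would not). The sketch only shows the principle of hiding an ID in the lowest bit of each pixel value, changing each pixel by at most 1 out of 255.

```python
# Toy least-significant-bit watermark over raw 8-bit pixel values.
# Illustrates the concept only; real schemes survive resizing/screenshots.
def embed_id(pixels: bytes, creator_id: str) -> bytearray:
    """Hide an ID in the lowest bit of each pixel; each change is at most 1."""
    bits = []
    for byte in creator_id.encode():
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))  # MSB first
    if len(bits) > len(pixels):
        raise ValueError("image too small to hold this ID")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the last bit
    return out

def extract_id(pixels: bytes, id_len: int) -> str:
    """Read the hidden ID back out of the low bits, 8 bits per character."""
    chars = []
    for c in range(id_len):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[c * 8 + i] & 1)
        chars.append(byte)
    return bytes(chars).decode()
```

Because the mark lives in the pixel data rather than the metadata, stripping the metadata (or re-saving the file) does not remove it, which is the property the article's "digital fingerprint" description is getting at.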
Adobe co-founded two industry groups that work to preserve content authenticity and bolster trust and transparency online -- their membership includes camera manufacturers representing 90% of the market, content-creation tools from Microsoft and OpenAI, and platforms like TikTok, LinkedIn, Google, Instagram, and Facebook. These companies' membership doesn't mean that they will integrate Adobe's content credentials into their products, but it does mean Adobe has their ears. Still, not all social media platforms and websites visibly display provenance information.

"In the meantime, to bridge that gap, we're going to release the Content Authenticity browser extension for Chrome as part of this software package, and also something we call the Inspect tool within the Adobe Content Authenticity website," Parsons said. "These will help you discover and display content credentials wherever they are associated with content anywhere on the web, and that can show you again who made the content, who gets credit for it."

Ironically, AI is not very good at telling whether something is AI-generated. As it becomes harder to distinguish real images from synthetic ones, these tools could offer a more concrete method of determining an image's origin (so long as it has credentials). Adobe isn't against the use of AI. Rather, the company is trying to make it clear when AI is used in an artwork, and to prevent artists' work from being used in training datasets without consent. Adobe even has its own generative AI tool called Firefly, which is trained on Adobe Stock images. "Firefly is commercially safe, and we only train it on content that Adobe explicitly has permission to use, and of course, never on customer content," Parsons said. Although artists have shown great resistance to AI tools, Parsons says that Adobe's Firefly integrations in apps like Photoshop and Lightroom have received positive feedback.
The generative fill feature in Photoshop, which can extend images through prompting, saw a 10x adoption rate compared with a typical Photoshop feature, Parsons said.

Adobe has also been working with Spawning, another tool to help artists retain control over how their works are used online. Through its "Have I Been Trained?" website, Spawning allows artists to search to see if their artworks are present in the most popular training datasets. Artists can add their works to a Do Not Train registry, which signals to AI companies that this work shouldn't be included in training datasets. This is only effective if AI companies honor the list, but so far, Hugging Face and Stability are on board.

On Tuesday, Adobe is launching the beta version of the Content Authenticity Chrome extension. Creators can also sign up to be notified when the beta for the full web app launches next year.
[7]
Adobe Content Tool Helps Protect Art From AI Training -- But Will It Work?
Adobe unveils a Chrome extension and Content Authenticity, a web app that lets artists add details to their images and videos, including whether they can be used for AI training.

Adobe has announced a new web app, dubbed Adobe Content Authenticity, that image creators and artists can use to label their work, with the goal of helping artists retain authority over it online. It's also launching a companion Chrome extension that lets internet users check any webpage for existing labels on content.

Adobe Content Authenticity lets users add Content Credentials to their images and videos. Content Credentials is a standard developed by the Coalition for Content Provenance and Authenticity (C2PA), a nonprofit group with corporate members like Adobe, Google, and Meta that is trying to get internet users to voluntarily label their AI-generated creations as such. Adobe's new platform for adding Content Credentials allows creators to add a range of different tags to their work regardless of whether it was AI-generated. The labels can then appear via an expandable button in the top corner of the work on supported platforms. Creators can add their names, websites, social media profiles, and other details, like when the piece of media was created and how exactly it was made or edited.

They can also use Adobe's tool to "signal" if they don't want AI models to train on their work. "Adobe is actively working to drive industry-wide adoption of this preference so that other generative AI models that support it do not train on or utilize creators' work," the company adds. But, as Adobe says, this is just a "preference," not a guarantee. Data-poisoning tools like Nightshade, which can garble some AI outputs, may be a more effective solution for concerned artists. Nightshade doesn't ask politely -- it effectively forces AI models not to swipe the work by discreetly "poisoning" the art behind the scenes.
As for Adobe itself, the company previously faced controversy over a policy change that allowed it to "access" and "view" users' content in apps like Photoshop or Premiere Pro -- and run that content through its own AI tools. But Adobe said the situation was a big misunderstanding, and that it never uses customer-generated content to train its AI models or tools like Firefly. Adobe has also taken action against Adobe Stock accounts that tried to sell AI-generated imitations of famous artists' work.

While Content Credentials is a step in the right direction, it's unclear to what extent the general public will demand credentials on every image they see online. There's also the chance bad actors could misuse the feature to assign an incorrect name or false labels to a piece of work. The opt-in feature isn't compatible with every site or social media platform, so Adobe is launching a Content Authenticity extension for the Google Chrome browser. That will let you see whether a piece of art has been assigned a label at some point, but it means you'll have to install the extension (if you use Chrome).

Adobe claims it's difficult to remove Content Credentials from images or videos once they've been added -- even if someone takes a screenshot. "Content credentials combine digital fingerprinting, invisible watermarking and cryptographically signed metadata, helping to ensure that Content Credentials remain intact and verifiable across the digital ecosystem," Adobe's announcement reads.

The tool doesn't directly address the problem of deepfakes -- AI-generated images crafted and published with the intent to mislead. But if the data in Content Credentials were verified by a trusted third party and became the expectation and the norm across the entire internet, then unlabeled content might become inherently distrusted. That could take some time, and would require the cooperation of some of the biggest companies in the world.
Adobe's Content Authenticity web app is expected to go live early next year in a free public beta. The Chrome extension is available in beta now.
[8]
Adobe Launches Content Authenticity Web App to Protect Creators' Work from Generative AI Misuse
Adobe's web app includes a feature that allows creators to signal whether they want their work to be used by AI models.

Adobe has unveiled the Adobe Content Authenticity web app, a free tool designed to protect creators' work and ensure proper attribution. This new app enables users to easily apply Content Credentials -- metadata that serves as a "nutrition label" for digital content -- ensuring their creations are safeguarded from unauthorised use. Supported by popular Adobe Creative Cloud apps such as Photoshop, Lightroom, and Firefly, Content Credentials provide key information about how digital works are created and edited, offering creators ways to claim ownership and protect their creations.

The company launched its Content Authenticity Initiative in 2019. With over 3,700 members backing this industry standard, the initiative aims to combat misinformation and AI-generated deepfakes. Adobe's new web app builds on this legacy, offering a centralised platform where creators can apply, manage, and customise their Content Credentials across multiple files, from images to audio and video.

A recent Adobe study revealed that 91% of creators want a reliable method to attach attribution to their work, with over half expressing concerns about their content being used to train AI models without their consent. In response, Adobe's web app includes a feature that allows creators to signal whether they want their work used by AI models, ensuring their rights are respected.

"Adobe is committed to responsible innovation centered on the needs and interests of creators," said Scott Belsky, chief strategy officer at Adobe. "By offering a simple, free way to attach Content Credentials, we are helping creators preserve the integrity of their work, while enabling a new era of transparency and trust online."

The app also offers features such as batch credential application and the ability to inspect content for associated credentials through a Chrome extension.
This ensures that the information remains visible, even if platforms or websites fail to retain it. With this new tool, Adobe is not only empowering creators to protect their work but also driving a broader push for transparency across the digital ecosystem.

The company has gone all in on generative AI. Last month, it introduced new features in Adobe Experience Cloud, including Adobe Content Analytics and real-time experimentation tools. These will help personalise, test, and evaluate AI-generated content across various channels while offering actionable insights to improve marketing performance and boost customer engagement.
[9]
Adobe unveiled a new tool to help protect artists' work from AI - and it's free
Adobe understands that an artist's work should be protected from being used to train AI large language models (LLMs). Here's how to get access to its new free tool, which helps do just that.

Artists around the world have been up in arms about having their work used to train AI. Once an AI service has used an artist's work to train its LLM, that work can be incorporated into the results created by user prompts. To some, that's akin to copyright infringement.

To combat this, Adobe has developed Content Credentials so that artists can be credited for their work, consumers can easily identify what is AI-generated and what isn't, and artists' work stays protected. Adding to this, Adobe has now launched a free web-based application that allows users to apply specific information to their work (images, video, and audio) and even opt their work out of generative AI models.

"By offering creators a simple, free, and easy way to attach Content Credentials to what they create, we are helping them preserve the integrity of their work while enabling a new era of transparency and trust online," said Scott Belsky, Chief Strategy Officer and EVP, Design & Emerging Products at Adobe. "The Adobe Content Authenticity web app will not only benefit creators but also help consumers navigate the digital ecosystem with greater clarity."

The only caveat to Content Credentials is that an AI service has to support the feature for it to work. In other words, if an AI service isn't on board with what Adobe is doing, it can circumvent these protections. Content Credentials are automatically applied to content created by the Adobe Creative Cloud apps, including Photoshop, Lightroom, Firefly, and other third-party partners.
A free public beta of the Adobe Content Authenticity web app will launch during Q1 of 2025. If you're interested in joining, sign up for the waitlist so you can be informed about the release. There is also a free beta of the Content Authenticity extension for Google Chrome available starting today.
[10]
Adobe's New App Hopes to Clear Up Some AI Confusion
In this era of generative AI, it's harder than ever to discern whether an image is real or created by AI. That's why Adobe is launching a new web app called Content Authenticity that lets anyone with a free Adobe account create content credentials for their work. These next-generation credentials are more durable (meaning tamper-resistant), and artists can use them to indicate they want their work excluded from AI training.

Content credentials are like a digital nutrition label you can read to understand the origins of a piece of content. They're a robust kind of digital signature that creators will be able to manage in the web app, adding information like their websites and social media. Beyond giving artists proper credit for their work, Adobe's credentials can also show an image's editing history, including whether AI was used -- something that's getting harder for us to determine on our own as AI services improve. Adobe's also working on a Google Chrome browser extension that will recognize credentials for content you come across online.

These updates are part of Adobe's work with the Coalition for Content Provenance and Authenticity, or C2PA, which builds content credential tech, and the Content Authenticity Initiative, which was founded by Adobe and promotes the use of the C2PA. "This is all in service of a basic right we think people have to understand what the content is that they're looking at," Andy Parsons, Adobe's senior director of the CAI, told CNET.

Previously, only Creative Cloud users working in Photoshop could create credentials for their projects. However, anyone will be able to use the new web app -- no subscription required.
Creators will be able to apply credentials to existing work, and in a new update, they will be able to add a preference indicating they don't want that work used for training AI models. Checking this preference won't ward off every AI company, particularly ones that scrape the open web. But Adobe confirmed that these labels will be honored by Spawning, one of the biggest AI opt-out aggregators.

The updated content credentials are also more durable -- meaning credentials are more tightly tied to an image's data and therefore harder to tamper with. Adobe will use a combination of digital fingerprinting, invisible watermarking and cryptographic metadata to securely attach credentials to content. That way, when someone takes a screenshot of a photo, the credentials stay attached -- thwarting a frustratingly common issue for creators who share their work online.

Credentials are one tool in artists' arsenal against the threats AI poses to creative industries. Artists are increasingly concerned about misuse and misattribution of their work. Adobe found that 44% of US creators had encountered what they believed was AI-generated work similar to their own, and 91% said they would use a tool to verify work as their own online. Some of these artists are duking it out in court with tech companies over copyright infringement.

You can sign up to preview the Content Authenticity app today, with a public beta expected to be released sometime in early 2025. The Chrome browser extension beta version is also available starting today.
[11]
Adobe's new free web app will protect your content from being stolen by AI online
It used to be fairly easy to spot AI images -- everybody had six fingers and an extra limb or two -- but as AI evolves it's getting harder and harder to tell the difference between real art and AI. Worse still, AI is being trained on artists' work without their permission and without them being compensated. Creatives aren't necessarily opposed to AI -- many can see the benefits of using AI tools, especially when it comes to automating mundane tasks -- but there are real concerns that AI is stealing people's artwork, learning from it, then regurgitating knock-off copies in an artist's style.

In an effort to protect artists' work, Adobe is launching a new, free web app designed to protect artists online. We talked to Andy Parsons, Senior Director and Head of the Adobe Content Authenticity Initiative, about its new Content Credentials scheme. "Transparency is the most important foundational concept to provide an objective shared sense of what's fact, what's real, who gets credit for things," said Parsons. "We've been working on developing an open source code, open standards that will be ratified by the International Standard Organization. Fingers crossed, that will happen this year or next year."

Adobe's answer is to attach Content Credentials to images so that you can see at a glance how they were created and by whom. The way Adobe wants you to think about Content Credentials is like a nutrition label on your groceries, showing you where the food came from. "We think the same is true and very necessary for digital content, where we want the judgment about where something came from, whether to trust it or not, whether it is a photograph or not a photograph, to be really in the hands of the consumer," explained Parsons. To show an image has Content Credentials, a little CR icon appears over the image and can be clicked on.
At the moment the CR icon requires a Google Chrome extension to be visible, but once clicked on you get a whole range of information about the providence of the image. Adobe has high hopes for the widespread adoption of the technology. "We think this could become as ubiquitous as a copyright symbol", explains Parsons. When it comes to protecting artists' work online from being crawled by AI bots and used to train AI models, Adobe thinks its solution can come to the rescue there as well. You can indicate in Content Credentials if you don't want your artwork to be used in training AI models. Obviously that's going to depend on how scrupulous the AI companies are and if they respect the wishes of the content creators, but Adobe has got many of the big names to sign up to its plan. "We are working hard to get this adopted by the biggest names in AI. We're not necessarily seeing any resistance, but there is a sense that some will wait for legislation. Others will go ahead and say 'Yeah, you know this is a reasonable thing that creators want and we'd like to provide it'. So, I think we'll see broad adoption and it will take us some time but we are working with [...] specific folks like Spawning, the world's largest opt-out registry. Whenever spawning encounters this setting, when people opt-out, it will add that to its registry. So we're excited about that. We know some social media sites don't show content credentials in a visible way yet so we're bridging that gap with our extension for Chrome." The Adobe-led Content Authenticity Initiative has 3,700 members including Microsoft, Amazon, Google, NVIDIA, OpenAI, and more who are putting Content Credentials to work in their products. Notably the list includes about 90% of the cameras market, including Sony, Leica, and Fuji, with several cameras coming to market this year that have Content Credentials built into the device. 
Social media apps like Facebook, Instagram, Threads, TikTok, LinkedIn, and YouTube will also be embracing Content Credentials. Adobe's solution is open source and works independently of its Creative Cloud applications, so it can be applied to any image, for free. From today you'll be able to sign up to be notified when the beta is available; we expect it to go live between December and February. Access to a free beta version of the Content Authenticity extension for Google Chrome is available today. With Content Authenticity you can apply Content Credentials in batch to your images, audio, and video, and once a credential has been added, it can't be removed. Parsons describes it as "durable". We did wonder if there would be an arms race between companies like Adobe working to add Content Credentials and hackers trying to remove them. "This is always a concern, but this is a very strong countermeasure against copyright theft and the ability to be recognized through your work," he replied.
[12]
This New Adobe Web App Will Let Creators Apply AI Labels to Content
Content Credentials are supported in Adobe Creative Cloud apps. Adobe Content Authenticity, a free web app that allows users to easily add Content Credentials as well as artificial intelligence (AI) labels, was introduced on Tuesday. The platform is aimed at helping creators with their attribution needs. It works on images, videos, and audio files and is integrated with all of the Adobe Creative Cloud apps. Alongside adding attribution, creators can also use the platform to opt out of having their content used to train AI models. It is currently available as a Google Chrome extension in beta. In a newsroom post, Adobe detailed the new platform. Notably, while it is available as a Chrome extension currently, a free web app will be available in public beta in the first quarter of 2025. Users can sign up on Adobe's website to be notified when the beta is available. The company highlighted that the platform is aimed at "helping creators protect their work from misuse or misrepresentation and build a more trustworthy and transparent digital ecosystem for everyone." The app will act as a one-stop shop for all the attribution needs of creators. They can use it to add Content Credentials, information added to a file's metadata that highlights details about its creator. The app can apply these attributions to a batch of files. Creators can also choose the information they want to share, which can include their name, website, and social media accounts. Adobe said that Content Credentials can protect creators from unauthorised use or misattribution of their work. Interestingly, while the web app supports all the Adobe Creative Cloud apps, content not created with Adobe tools can also be attributed. This goes for images, videos, and audio files. Apart from attribution, the web app will also let users mark if they do not want their content to be used by or to train AI models.
The company highlighted that it only trains Adobe Firefly, its in-house family of generative AI models, on content that is either publicly available or that it has permission to use. Adding the AI label can also protect the creator from other AI models in the market. However, that will only work if those companies decide to respect Content Credentials. Currently, only Spawning, the generative AI opt-out aggregator, has committed to recognise this attribution. Adobe said it is actively working to drive industry-wide adoption of this preference. There is a downside, however: if a creator does not allow their work to be used for AI training, the content will not be eligible for Adobe Stock.
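The scheme described above boils down to signed metadata attached to a media file: creator details plus a "do not train" flag. The sketch below is a toy illustration only, not the real C2PA format - actual Content Credentials use X.509 certificate chains and a standardized manifest structure, whereas this uses a plain JSON dict with an HMAC standing in for the signature. All names here (`make_content_credential`, `claim_generator`, the demo key) are invented for the example.

```python
import hashlib
import hmac
import json

def make_content_credential(file_bytes: bytes, creator: dict,
                            allow_ai_training: bool, signing_key: bytes) -> dict:
    """Build a simplified, C2PA-style credential for a media file.

    Illustrative only: real Content Credentials are signed with
    certificate chains, not a shared-secret HMAC.
    """
    manifest = {
        "claim_generator": "example-app/1.0",          # hypothetical tool name
        "asset_hash": hashlib.sha256(file_bytes).hexdigest(),
        "creator": creator,                            # name, website, socials
        "ai_training_opt_out": not allow_ai_training,  # the "do not train" signal
    }
    # Sign the manifest so tampering with any field can be detected.
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return manifest

# Batch application, as the web app does for many files at once:
files = {"a.jpg": b"\x01\x02", "b.png": b"\x03\x04"}   # stand-in file contents
creds = {
    name: make_content_credential(
        data,
        {"name": "Jane Doe", "website": "example.com"},
        allow_ai_training=False,
        signing_key=b"demo-key",
    )
    for name, data in files.items()
}
print(creds["a.jpg"]["ai_training_opt_out"])  # True: "do not train" is set
```

Because the asset hash and the opt-out flag are inside the signed payload, a downstream tool (or an opt-out registry like Spawning) can check the signature before trusting either the attribution or the training preference.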
[13]
Can Adobe repair its AI reputation with its new Content Authenticity web app?
Several companies have experienced reputational issues as a result of their approach to AI in 2024. Chief among them is Adobe, whose Terms of Service debacle in the summer led to a fierce backlash from creatives. But the brand's latest move might just help ingratiate the company with AI-sceptics. The company has announced a new Content Authenticity web app designed to "champion creator protection and attribution". An extension of the Content Authenticity Initiative we've already seen Adobe develop over the last couple of years, the new web app offers a free way for creators to apply Content Credentials to their work. As well as adding credentials, the web app lets creators set AI training and usage preferences, essentially letting artists opt out of having their work used to train models. Of course, this relies on AI companies recognising the preference, but Adobe is "actively working to drive industry-wide adoption of this preference." A free, public beta of the Adobe Content Authenticity web app will be available in Q1 2025, and users can sign up on Adobe's site for more information. As well as the web app, there'll also be a handy Chrome extension for inspecting credentials. "Adobe's new tool for signing artwork is a game-changer," gushes illustrator Pepita Sandwich via the Adobe website. "It makes adding Content Credentials to digital pieces easy and professional. The interface is intuitive and ensures seamless integration, while the security features give me peace of mind. I also love that it offers the option to opt out of AI training -- perfect for protecting my creative rights." It's certainly a timely move for a company that has endured its fair share of criticism from artists this year, as well as facing the small matter of getting sued by the US government over its convoluted cancellation processes. But with the brand having previously committed to "doing the right thing" when it comes to AI, focussing its efforts on the Content Authenticity Initiative seems like a wise move.
[14]
Adobe is giving creators a way to prove their art isn't AI | Digital Trends
With AI slop taking over the web, being able to confirm a piece of content's provenance is more important than ever. Adobe announced on Tuesday that it will begin rolling out a beta of its Content Authenticity web app in the first quarter of 2025, enabling creators to digitally certify their works as human-made, and is immediately launching a Content Authenticity browser extension for Chrome to help protect content creators until the web app arrives. Adobe's system relies on a combination of digital fingerprinting, watermarking, and cryptographic metadata to certify the authenticity of images, video, and audio files. Unlike traditional metadata, which is easily circumvented with screenshots, Adobe's system can still identify the creator of a registered file even when the credentials have been scrubbed. This enables the company to "truly say that wherever an image, or a video, or an audio file goes, on anywhere on the web or on a mobile device, the content credential will always be attached to it," Adobe Senior Director of Content Authenticity Andy Parsons told TechCrunch. Both the Chrome extension and the forthcoming web app will be available to the public, whether or not you're one of Adobe's 33 million paying subscribers or Firefly users. "We're going to release the Content Authenticity browser extension for Chrome as part of this software package, and also something we call the Inspect tool within the Adobe Content Authenticity website," Parsons said. "These will help you discover and display content credentials wherever they are associated with content anywhere on the web, and that can show you again who made the content, who gets credit for it." This announcement comes amid Adobe's larger push for building trust through transparency in digital content.
The company has founded a pair of industry groups -- the Content Authenticity Initiative (CAI) and the foundational open standards consortium the Coalition for Content Provenance and Authenticity (C2PA) -- to spur adoption of its Content Authenticity initiative. To date, the groups have attracted more than 2,000 signatories, including nearly every major camera manufacturer, along with AI front runners Microsoft and OpenAI, as well as social media platforms including TikTok, Instagram, and Facebook.
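The "durable" attribution described above rests on fingerprints computed from the pixels themselves rather than from strippable metadata fields. The toy average-hash below illustrates the general idea only; Adobe's actual fingerprinting is proprietary and far more robust, and every name here is invented for the example.

```python
def average_hash(pixels):
    """64-bit fingerprint of an 8x8 grayscale grid: one bit per pixel,
    set when that pixel is at or above the mean brightness."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

original = [(10 * i) % 256 for i in range(64)]      # stand-in 8x8 image
reencoded = [min(255, p + 3) for p in original]     # slight brightness shift,
                                                    # e.g. a screenshot/re-encode
# The fingerprint is derived from content, not metadata, so stripping
# EXIF/XMP fields or screenshotting the image does not change it much.
dist = hamming(average_hash(original), average_hash(reencoded))
print(dist)  # 0 here: the uniform shift flips no threshold bits
```

A lookup service could match a scrubbed file's fingerprint against a registry of signed credentials and re-attach the attribution, which is the behaviour Parsons describes.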
[15]
Adobe Debuts App to Protect Against AI Content Copycats | PYMNTS.com
Adobe has introduced an app to help creators protect their work against AI copycats. Content Authenticity, unveiled Tuesday (Oct. 8), is a free web app that lets creators receive attribution for their work with Content Credentials, which the tech company calls a "nutritional label" for digital materials. "As concerns over misinformation and AI-generated deepfakes have grown, Content Credentials have become a valuable tool for publishers, allowing them to provide key information about digital content to help consumers assess its trustworthiness," Adobe said in a news release. "Now with the web app, Adobe is unlocking the full potential of Content Credentials, helping creators protect their work from misuse or misrepresentation and build a more trustworthy and transparent digital ecosystem for everyone." According to the release, the app lets creators apply Content Credentials in batch to sign their digital work -- such as images, audio and video files. Creators retain control of the information included in Content Credentials, like their name, website and social media accounts, with the company planning to eventually offer more customization options. "Attaching this information can help creators receive attribution for their work, protecting it from unauthorized use or misattribution and ensuring recognition," Adobe said. The launch of Content Authenticity comes weeks after Adobe debuted tools to help brands demonstrate the business impact of their AI-generated content and adjust and optimize their campaigns in real time. "Marketers are being challenged to ensure that AI-generated variants also resonate with customers, and Adobe's latest innovations will help brands meet the moment through real-time experimentation and actionable insights," said Amit Ahuja, senior vice president, digital experience business at Adobe.
In other AI news, PYMNTS wrote earlier this week about a new survey by the National Cybersecurity Alliance showing a startling trend: 38% of employees share sensitive work information with AI tools without their employer's consent. The problem is especially pronounced among younger workers, with 46% of Generation Z and 43% of millennials admitting to the practice, compared to 26% of Generation X and 14% of baby boomers. Dinesh Besiahgari, a front end engineer at Amazon Web Services (AWS) with a background in AI and healthcare, warned of the risks behind seemingly innocuous AI interactions. "What stands out most is the scenario where employees use chatbots to make payments or make any form of financial transactions where they have to give out payment details and other account information," Besiahgari told PYMNTS.
[16]
Adobe to offer free app to help with labeling AI-generated content
(Reuters) - Adobe said on Tuesday it will offer a free web-based app starting next year, aimed at helping the creators of images and videos to get credit for their work used in AI systems. Since 2019, Adobe and other technology companies have been working on what the firms call "Content Credentials," a sort of digital stamp for photos and videos around the web to denote how they were created. TikTok, which is owned by China's ByteDance, has already said it will use Content Credentials to help label AI-generated content. San Jose, California-based Adobe said it will offer a free service to allow the creators of photos and videos to affix Content Credentials to their work. In addition to indicating that they authored the content, the creators can also use the free app to signal if they do not want their work to be used by AI training systems that ingest huge amounts of data, the company said. The use of data in AI training systems has sparked legal responses in multiple industries, with publishers such as the New York Times suing OpenAI, while some other firms have opted to work out licensing deals. As yet, no large AI company has agreed to abide by Adobe's system for transparency. In a release, Adobe said it was "actively working to drive industry-wide adoption" of its standards. "By offering creators a simple, free and easy way to attach Content Credentials to what they create, we are helping them preserve the integrity of their work, while enabling a new era of transparency and trust online," Scott Belsky, chief strategy officer and executive vice president for design and emerging products at Adobe, said in a statement. (Reporting by Stephen Nellis in San Francisco; Editing by Shinjini Ganguli)
[18]
Adobe is debuting a new 'nutrition label' for digital content
Adobe announced a free web app for creators to apply credentials to their work, in an effort to help protect content from unauthorized use and ensure creators receive proper attribution. Adobe said its Content Credentials are to be thought of as a sort of "nutrition label" for digital content, creating metadata that creators can attach to their work to share information about themselves and provide context on how specific content was created and edited. Creators can also choose to opt specific content out from being used to train generative AI. Adobe is also releasing a free Google Chrome browser extension for consumers to see which assets on websites, like Facebook or X, have credentials and were shaped by artificial intelligence. A free beta of the Adobe Content Authenticity web app will go live in the first quarter of 2025. The Content Authenticity extension will be available starting Tuesday, the company said.
[19]
This Move by Adobe Will Help Creators Get Credit for Work Used by AIs
TikTok, which is owned by China's ByteDance, has already said it will use Content Credentials to help label AI-generated content. San Jose, California-based Adobe said it will offer a free service to allow the creators of photos and videos to affix Content Credentials to their work. In addition to indicating that they authored the content, the creators can also use the free app to signal if they do not want their work to be used by AI training systems that ingest huge amounts of data, the company said.
[20]
Adobe will allow creators to digitally sign their art
Why it matters: The tool, which enters private beta today and will be made more broadly available next year, is part of Adobe's broader strategy to help authenticate digital content by showing how it was captured or created and indicating any changes made using AI. Driving the news: The free public beta of the Adobe Content Authenticity web app is planned for the first quarter of next year, and will allow content creators to add credentials to their work. The big picture: It's part of Adobe's broader content authenticity initiative, which also aims to verify digital content from the moment of capture, in the case of photos and videos, or from the moment of creation, in the case of digitally created works. Yes, but: Apple has yet to publicly commit to Adobe's approach. Google has joined the Adobe effort, but has not yet built content credentialing into the Android camera process.
Adobe introduces a new web app that allows artists to add content credentials to their work and opt out of AI training, aiming to protect digital creations in the age of generative AI.
Adobe has announced the launch of a new web app called Adobe Content Authenticity, set to enter public beta in early 2025. This free tool aims to empower digital creators by providing them with enhanced control over their work in the age of generative AI [1][2].
The web app introduces several crucial features:
Content Credentials: Creators can add metadata to their digital content, including their name, website, and social media profiles. This serves as a digital signature for the 21st century [1][3].
AI Training Opt-Out: Artists can signal that they do not consent to their work being used to train AI models, addressing concerns about unauthorized use of copyrighted material [1][4].
Batch Processing: The app allows creators to apply Content Credentials to multiple files simultaneously, streamlining the process of protecting their work [4][5].
Cross-Platform Compatibility: Content Credentials can be applied to any digital content, not just those created with Adobe tools [2][4].
Adobe has incorporated sophisticated technologies to ensure the robustness of Content Credentials: the system combines digital fingerprinting, watermarking, and cryptographic metadata, so a registered file can still be identified even after its credentials have been stripped.
The introduction of Adobe Content Authenticity comes at a time when the creative industry is grappling with the implications of AI for intellectual property: AI models are typically trained on vast databases of content scraped from the internet, and publishers such as the New York Times have sued OpenAI over the use of their work, while other firms have struck licensing deals.
While Adobe's initiative is a step forward, challenges remain: the opt-out preference is only effective if AI companies choose to honor it, and as yet no large AI company has agreed to abide by Adobe's system, with the opt-out registry Spawning the main party committed to recognizing the preference so far.
As AI continues to evolve, tools like Adobe Content Authenticity may play a pivotal role in maintaining ethical standards and protecting creators' rights in the digital landscape [5].
Reference
[1] MIT Technology Review | Adobe wants to make it easier for artists to blacklist their work from AI scraping
© 2024 TheOutpost.AI All rights reserved