BEST OF THE WEB

AI-generated images set to be labeled

Microsoft, Adobe, and other big names have pledged to add metadata to their AI-generated images so that future compatible apps can flag them as machine-made with a special symbol.

The goal is to give people a way to see whether a picture was made by a model or by a human, and how. The symbol, described as an “icon of transparency,” depicts the lowercase letters “cr” inside a speech-mark-like bubble. It was created by the Coalition for Content Provenance and Authenticity (C2PA).

C2PA’s Content Credentials metadata can be used for any picture, but it is particularly useful for AI-generated images. The metadata includes information about the source of the image, the AI model used to generate it, and the time and date of creation.
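
For illustration, here is a minimal Python sketch of the kind of record such metadata describes. The field names and structure below are hypothetical, chosen only to mirror the details mentioned above (image source, generating model, and creation time); they are not the actual C2PA manifest schema.

```python
# Hypothetical sketch only: these field names are illustrative and do not
# reproduce the real C2PA manifest schema. They mirror the kinds of details
# the article says Content Credentials can record about an image.
from datetime import datetime, timezone

content_credentials = {
    "issuer": "Example Image Generator",        # tool that attached the credentials
    "generator": "example-diffusion-model-v1",  # AI model used to make the image, if any
    "created": datetime(2023, 10, 12, 9, 30, tzinfo=timezone.utc).isoformat(),
    "ai_generated": True,                       # would drive an app's machine-made label
}

def describe(credentials: dict) -> str:
    """Summarize how a picture was made, based on its (hypothetical) metadata."""
    if credentials.get("ai_generated"):
        return f"AI-generated by {credentials['generator']} at {credentials['created']}"
    return "No AI-generation claim recorded"

print(describe(content_credentials))
```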

Microsoft and Adobe have promised to include Content Credentials metadata in their AI image generators at some point in the future. Once that happens, users will be able to tell whether a picture was generated by AI simply by looking for the “cr” symbol in a compatible app.

This is an important step in the fight against deepfakes, AI-generated videos or images designed to look real. Deepfakes can be used to spread misinformation or to damage someone’s reputation.

The sources for this piece include an article in The Register.

IT World Canada Staff