Meta is incorrectly marking real photos as 'AI generated'


Several photographers have shared examples in recent months. Meta flagged a photo of a basketball game taken by former White House photographer Pete Souza as AI-generated, and in another recent case it incorrectly added the label to an Instagram photo of the Kolkata Knight Riders winning the Indian Premier League cricket tournament. In both cases, the label only appears when viewing the images in the mobile apps, not on the web.

Souza says he tried to uncheck the label but couldn't. He suspects that using Adobe's cropping tools and flattening images before saving them as JPEGs may be what triggers Meta's algorithm.

However, Meta has also falsely flagged real photos as AI-made when photographers remove even the smallest objects using generative AI tools such as Adobe's Generative Fill, as PetaPixel reported. The publication tested this for itself by using Photoshop's Generative Fill tool to remove a blemish from an image, which Meta then marked as “Made with AI” on Instagram. Strangely, however, Meta did not add the label when PetaPixel took that same edited file, copied and pasted it into a blank document in Photoshop, and saved it again.

Many photographers have expressed their frustration that minor edits like these are being unfairly attributed to AI.

“If 'retouched' photos are 'AI generated', then the term doesn't mean anything,” photographer Noah Kalina wrote on Threads. “If they're serious about people's safety, they should auto-tag every photo 'not an accurate representation of reality.'”

In a statement to The Verge, Meta spokesperson Kate McLaughlin said the company is aware of the issue and is evaluating its approach “so that [its] labels reflect the amount of AI used in an image.”

“We rely on industry-standard indicators that other companies include in content from their tools, so we're actively working with these companies to improve the process so our labeling approach matches our intent,” McLaughlin said.

In February, Meta announced it would begin adding a “Made with AI” label to photos uploaded to Facebook, Instagram and Threads ahead of this year's election season. Specifically, the company said it would add the label to AI-generated photos created with tools from Google, OpenAI, Microsoft, Adobe, Midjourney and Shutterstock.

Meta hasn't revealed exactly what triggers the “Made with AI” label, but all of these companies have added, or are working on adding, metadata to image files that indicates the use of AI tools, which is one way Meta identifies AI-generated photos. Adobe, for example, started embedding information about an image's origin in its metadata with the release of its Content Credentials system last year.
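Provenance data like Content Credentials lives inside the image file itself, which is why an edit that re-encodes the pixels without carrying the metadata over (as in PetaPixel's copy-and-paste test) can make the label disappear. As a rough illustration, here is a minimal Python sketch that scans a JPEG's raw bytes for common provenance markers. The specific byte strings it looks for (the C2PA identifier, the XMP namespace, and the IPTC "trainedAlgorithmicMedia" value) are illustrative assumptions, not the signals Meta actually keys on, which the company hasn't disclosed.

```python
# Minimal sketch: check whether a JPEG appears to contain embedded
# provenance metadata (e.g. Adobe Content Credentials / C2PA, XMP).
# It only detects the presence of marker strings; it does not parse
# or verify any embedded manifest.
from pathlib import Path

# Marker byte strings commonly present when provenance metadata is embedded.
# These are assumptions for demonstration, not Meta's actual criteria.
MARKERS = {
    b"c2pa": "C2PA manifest (Content Credentials)",
    b"http://ns.adobe.com/xap/1.0/": "XMP metadata block",
    b"trainedAlgorithmicMedia": "IPTC digital source type for AI-generated media",
}


def scan_for_provenance(path: str) -> list[str]:
    """Return a note for each provenance marker found in the file's bytes."""
    data = Path(path).read_bytes()
    return [label for marker, label in MARKERS.items() if marker in data]


if __name__ == "__main__":
    # Hypothetical file name; replace with a real image path.
    for note in scan_for_provenance("photo.jpg"):
        print("found:", note)
```

Copying pixels into a new document and re-saving produces a file without these markers, which is consistent with the label not appearing in that case.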


