
OpenAI, Google, and other leading AI companies have pledged to watermark AI-generated content for safety, the White House announced today!

This is certainly a start!

Next, we must work to gain agreement on these principles internationally!

I have been saying loudly and clearly, since discussions began on the ratification of the latest DGA contract, that protections must include a watermark on all media we in the entertainment industry create for public display. I have proposed that such a watermark contain (or link to) references to anyone who retains creative and/or financial rights, such as residuals, covering the reuse of those files for any purpose that includes public display.

In doing so, we would retain control over the manipulation of our intellectual property by AI or any other future technology.
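
To make the idea concrete, here is a minimal sketch of the kind of rights record such a watermark might carry or point to. Every field name, the registry link, and the shares below are my own hypothetical illustrations, not an existing standard, registry, or guild system.

    from dataclasses import dataclass, field

    @dataclass
    class RightsHolder:
        name: str
        role: str              # e.g. "director", "writer", "performer"
        residual_share: float  # fraction of the residual pool (illustrative)

    @dataclass
    class WatermarkRecord:
        work_id: str       # identifier embedded in or derived from the media file
        registry_url: str  # hypothetical registry the watermark links to
        rights_holders: list = field(default_factory=list)

    # What a decoder might recover from (or look up for) a marked file.
    record = WatermarkRecord(
        work_id="WM-2023-000123",
        registry_url="https://registry.example/works/WM-2023-000123",
        rights_holders=[
            RightsHolder("Jane Doe", "director", 0.25),
            RightsHolder("John Roe", "writer", 0.15),
        ],
    )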

Articles from The New York Times, this one from The Economic Times, and others do indeed address these matters... in part!


"As part of the effort, the seven companies committed to developing a system to "watermark" all forms of content, from text, images, audios, to videos generated by AI so that users will know when the technology has been used.


This watermark, embedded in the content in a technical manner, presumably will make it easier for users to spot deep-fake images or audios that may, for example, show violence that has not occurred, create a better scam or distort a photo of a politician to put the person in an unflattering light.

It is unclear how the watermark will be evident in the sharing of the information.

The companies also pledged to focus on protecting users' privacy as AI develops and on ensuring that the technology is free of bias and not used to discriminate against vulnerable groups. Other commitments include developing AI solutions to scientific problems like medical research and mitigating climate change."


What our negotiators, and especially those currently engrossed in the protections for our actor and writer colleagues, MUST do is draw upon this concept and agree with the AMPTP to integrate additional technology that links to these watermarks, so that all media can be traced and every replay tracked for the purpose of processing all accrued residual payments.
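
Purely as an illustration, and not a description of any system the guilds or the AMPTP actually operate, the bookkeeping becomes simple once every public replay can be reported against the embedded identifier. The one-cent-per-replay rate below is an assumption made only for this sketch; the real figure would come out of negotiation.

    from collections import Counter

    RATE_CENTS_PER_REPLAY = 1  # assumed rate, in cents per replay, for illustration only

    replay_counts = Counter()

    def record_replay(work_id):
        """Log one public replay reported against a watermark identifier."""
        replay_counts[work_id] += 1

    def accrued_residual_cents(work_id):
        """Residuals accrued so far for one marked work, under the assumed rate."""
        return replay_counts[work_id] * RATE_CENTS_PER_REPLAY

    for _ in range(1000):
        record_replay("WM-2023-000123")
    print(accrued_residual_cents("WM-2023-000123"))  # 1000 cents, i.e. $10.00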

The technology that companies using AI, or any future innovation, deploy internationally must be compatible with, and interact with, the watermarks we create somewhere within the production or postproduction chain to protect our media files. This must become an industry standard, just as timecode is standard in motion media and metadata is in digital still photographs.
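
One way to picture the "industry standard" part: the identifier should travel with the file as routinely as timecode and EXIF data already do. The sidecar file below is only one possible carrier, chosen to keep the sketch self-contained; a real standard would embed the mark in the media essence itself so that it survives transcoding and sharing.

    import json
    from pathlib import Path

    def write_watermark_sidecar(media_path, work_id, registry_url):
        """Attach the watermark record to a media file as a JSON sidecar."""
        sidecar = Path(media_path + ".wm.json")
        sidecar.write_text(json.dumps({"work_id": work_id, "registry_url": registry_url}))
        return sidecar

    def read_watermark_sidecar(media_path):
        """Recover the record so downstream players can report replays."""
        return json.loads(Path(media_path + ".wm.json").read_text())

    write_watermark_sidecar("final_cut.mov", "WM-2023-000123",
                            "https://registry.example/works/WM-2023-000123")
    print(read_watermark_sidecar("final_cut.mov")["work_id"])  # WM-2023-000123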

I welcome your thoughts and comments!
