News organizations, camera makers, and tech companies have created Verify, a free web tool for checking the authenticity of images. It's being adopted by Nikon, Sony, and Canon.
Camera Companies Fight AI-Generated Images With 'Verify' Watermark Tech
I guess this is better than nothing, but what happens if you take a photo of a generated image? With the right setup, it will be impossible to tell that the result is a photo of a photo, and then the camera will digitally sign the fake photo as real.
Consoles (Xbox, Nintendo, PlayStation) all get hacked eventually. All that will happen here is that someone hacks a camera to sign any image sent to it.
I think this tech (signed pictures) is just going to make the problem worse. Once a camera is hacked, its output is "signed" but fake... the same spot we're in now, except now we have fake pictures that verify as real.
And consoles are a walled garden; here you would have to build a resilient trust network across all camera manufacturers, where if any single private key leaks, the whole system is compromised.
And I'll be sure to let them know that I use windower add-ons and DAT mods when playing FF11. Maybe they'll ban my PS2/PlayOnline from any future updates?
It's not just a signature on the image, but on the metadata as well. Harder to fake time + place if they implement it thoroughly. (I.e., they would have to make it only trust GPS and verify against an internal clock, I suppose, and not allow updating time and location manually.)
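A minimal sketch of that idea, with all names hypothetical: the camera signs a hash of the image bytes together with the canonicalized metadata, so neither can be altered independently. (A real camera would use an asymmetric signature from a key in a secure element, not a shared-secret HMAC as here.)

```python
import hashlib
import hmac
import json

# Hypothetical per-camera secret. A real design would use a private key
# in tamper-resistant hardware, with a published certificate for verification.
CAMERA_KEY = b"secret-burned-into-camera"

def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    """Sign the image and its metadata as one payload, so editing
    the time/GPS (or stripping it) breaks verification."""
    payload = (hashlib.sha256(image_bytes).digest()
               + json.dumps(metadata, sort_keys=True).encode())
    return hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict, sig: str) -> bool:
    return hmac.compare_digest(sign_capture(image_bytes, metadata), sig)

photo = b"...raw sensor data..."
meta = {"time": "2023-12-01T10:00:00Z", "gps": [35.68, 139.69]}
sig = sign_capture(photo, meta)

assert verify_capture(photo, meta, sig)                 # intact: verifies
tampered = dict(meta, gps=[0.0, 0.0])
assert not verify_capture(photo, tampered, sig)         # edited location: fails
```

The point is that the metadata is inside the signed payload, not just sitting alongside it in EXIF, so "update the location manually" becomes "forge a signature".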
...including the date and time a photo was taken as well as its location and the photographer...
Not including GPS and time makes this worse, but including it makes it useless, because you can't ever verify a photo sent across social media: the EXIF tags will be stripped.