Signed images don’t get you much. You can just hardwire a computer in place of the image sensor and have the camera sign whatever raw pixels you feed it.
reply
Is the situation brighter for a company that owns both the hardware and the software, like Apple?

Setting aside taking a picture of an AI-generated image, could Apple theoretically attest to the origin of photos taken in the native camera app and uploaded to iCloud?

Fascinating, by the way, thank you!

reply
Make cameras tamper resistant, like POS terminals.
reply
Ultimately, even with that tech, you can still take a photo of an AI-generated scene. Coupled with geolocation data in the signature or something similar, it might work.
reply
Any thoughts on multi-camera/360-degree camera solutions? They could make it cost-prohibitive to generate convincing fakes… for a little while.

Kind of like showing the proctor around your room with your webcam before starting the exam.

I think legacy media stands a chance of coming back, as long as it maintains a reputation for rigorously verifying images and not being fooled.

reply
I see signing chains as the way to go here. Your camera signs the image, you sign the signed image, your client or editor signs the image you signed, and so on. We might finally have a use for blockchain.
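A minimal sketch of the chaining idea, using HMAC as a stand-in for the asymmetric signatures a real system (e.g. C2PA) would use; the keys and payloads here are made up for illustration:

```python
import hashlib
import hmac

def sign(key: bytes, payload: bytes) -> bytes:
    # HMAC-SHA256 stands in for a real asymmetric signature (e.g. Ed25519).
    return hmac.new(key, payload, hashlib.sha256).digest()

image = b"raw pixel data"

# 1. Camera signs the image it captured.
camera_sig = sign(b"camera-key", image)

# 2. Photographer signs the image *plus* the camera's signature,
#    binding their attestation to the earlier link.
photographer_sig = sign(b"photographer-key", image + camera_sig)

# 3. Editor extends the chain the same way.
editor_sig = sign(b"editor-key", image + camera_sig + photographer_sig)

# Verification replays the chain: each link only validates if every
# earlier signature is intact, so tampering anywhere breaks the chain.
assert hmac.compare_digest(camera_sig, sign(b"camera-key", image))
assert hmac.compare_digest(photographer_sig,
                           sign(b"photographer-key", image + camera_sig))
assert hmac.compare_digest(editor_sig,
                           sign(b"editor-key", image + camera_sig + photographer_sig))
```

Because each signature covers all earlier signatures, altering the image or stripping a link invalidates everything downstream, which is the property that makes the chain useful for provenance.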
reply
Yes, I think they have been for years. C2PA Content Credentials are already supported in some cameras and phones today.
reply