@[email protected] to No Stupid [email protected] • 1 year ago

From a legal/evidence perspective, what is going to happen when it becomes impossible to tell the difference between a video generated by AI and the real thing?
@[email protected] • 1 year ago

No need to stream the whole video externally; you could just send the checksums every X minutes and then provide the video with that checksum later.

It doesn't entirely stop the problem, though, as you could still insert faked videos into the stream. You just couldn't do it retroactively.
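The checksum idea above can be sketched in a few lines. This is a minimal illustration, not a production design: it assumes SHA-256 as the hash, a trusted external party that records the (timestamp, checksum) pairs, and stand-in byte strings in place of real video segments; all names are hypothetical.

```python
import hashlib
import time

def segment_checksum(segment_bytes: bytes) -> str:
    """Return a SHA-256 hex digest of one video segment."""
    return hashlib.sha256(segment_bytes).hexdigest()

# Simulated camera feed: each element stands in for X minutes of raw footage.
segments = [b"segment-0 raw frames", b"segment-1 raw frames"]

# Published log: only small (timestamp, checksum) pairs leave the building,
# not the full video stream.
published_log = [(time.time(), segment_checksum(s)) for s in segments]

def verify(segments, log):
    """Check each segment against the checksum published for it earlier."""
    return all(segment_checksum(s) == digest
               for s, (_, digest) in zip(segments, log))

# The genuine footage matches the log...
assert verify(segments, published_log)

# ...but a segment faked after the fact does not, because its checksum
# differs from the one that was already on record.
tampered = [b"segment-0 raw frames", b"FAKE footage inserted later"]
assert not verify(tampered, published_log)
```

This matches the limitation noted in the comment: a forger who controls the feed *at recording time* can still commit checksums of fake segments, but nobody can swap in fake footage retroactively without the mismatch being detectable.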
@[email protected] • 1 year ago

Sure, but robbing a store and simultaneously hacking their video feed is harder than robbing a store and retroactively creating fake footage.
@[email protected] • 1 year ago

I'm not particularly worried about robbery. There are far more sophisticated ways to attack an organization or person.