Not sure how that relates at all... completely different situations. This is going to give everyone a tool to create fake videos for all kinds of nefarious purposes, with zero actual skill or time required... you can't close Pandora's box again.
Yeah this is what I see as the real issue.
Deepfakes of famous folks and politicians will be less of an issue... they'll sort that out because it affects them.
The real issue is the everyday person being a target.
Take a look at seaart.ai
They have a face swap tool
You simply upload a video and an image and it replaces the face. The results are pretty good, though still not perfect.
I used one of the boy band samples they had and replaced the face with Iger; it took 5 minutes at most to process.
Unfortunately I can't upload a video here.
Thing is, you don't really need any sort of skill for that. For now it's mostly harmless (don't panic).
Using LoRA models to generate images of a specific person still takes a bit of effort, especially to get them looking right. It's not simple for the average user, but it's getting better and better every month.
With a service like seaart you don't need any kind of hardware (it's all on their backend); you can use a phone.
Hands are still iffy, but again improving.
It would not take a lot of effort for someone motivated by revenge to gather a bunch of photos of a person, generate explicit images, and then share them with that person's workplace or post them on an adult site and circulate the link at their work.
You could even create the background of their specific office.
Yes, this type of stuff has always gone on with photography, but it wasn't point-and-click that anyone could do, and the output usually had flaws on close inspection.
As this gets better and easier, finding those flaws will get tough. Especially if it's just some rando in an office trying to explain to HR that a fake image going around the office, or posted on a site they have nothing to do with, isn't real.