One small addition to my stock rules, because apparently we live in a dystopian cyberpunk nightmarescape now:
My stock, including photos, video, and my personal likeness, may never be used in the development or production of so-called "deep fake" media technology. Any deep fake media produced with my stock or using my likeness is a violation of my terms.
While I am certainly impressed with the advancement of imaging technology in 2018, I have seen and read enough to conclude that I don't want to take part in the development or production of deep fakes. I know my face can be seen from many angles, which can be helpful for artists learning to draw and sketch, but it also leaves my likeness vulnerable to being placed in non-consensual pornography, and I just ain't about that business.
If you're not in the habit of placing non-consenting women into digitally manipulated porn videos, you have nothing to worry about, and I hope y'all are having a great day!
"If you're not in the habit of placing non-consenting women into digitally manipulated porn videos, you have nothing to worry about, and I hope y'all are having a great day!" ... wait... WHAT?!? (goes back and reads it again)... At the very least, people who do that should be slapped upside the head and told to go sit in a corner for... let's say 10 years or so... to think about what they did. That is just... pathetic... and sad... to put it mildly. I had no idea this was even being done! You have enlightened me once again, much like you have with your previous journal entries from way back...
Holy cow! I knew about the technology behind this, but I never thought about how damaging it could be. I think a good starting point for making this illegal would be to expand the definition of the crime of slander (the action or crime of making a false spoken statement damaging to a person's reputation). That won't catch all of the damaging deep fakes, but as I said, I think it's a good place to start. The real problem after that will be pinpointing the culprit.
This video by the Wall Street Journal is pretty eye-opening. I'm glad the woman who was a victim of this technology was able to get laws passed about non-consensual deep fakes in Australia, but currently there are no laws on the books in the USA or Canada.
Just like with revenge porn, the law is far, far behind what technology is capable of doing, for both good and ill. I think deep fakes have huge creative potential in entertainment, gaming, and virtual reality media, but their use in pornography and fake political videos, and people's inability to tell them from the real thing, is really frightening.