I want to assume that the dozen or so people that the Onion uses for "people-in-the-street" fake comments and interviews (with hundreds or thousands of fake, sometimes hilarious, names) all have signed releases and receive residuals for the use of their faces. Do they get extra-duty pay for especially idiotic comments or truly insulting fake names? I certainly hope so.
Scammy fake endorsements have a very long history (which the Onion mocks), but AI has militarized and industrialized them. The simplest solution is to believe nothing in advertising or on social media, and to seek large summary judgments against anyone who uses someone's face or name without permission. Fake, minimally altered avatars on web sites, for example.
This feels like a good post to leave this: "Silicon Valley crosswalk buttons apparently hacked to imitate Musk, Zuckerberg voices" https://www.paloaltoonline.com/technology/2025/04/12/silicon-valley-crosswalk-buttons-apparently-hacked-to-imitate-musk-zuckerberg-voices/
From the article: In one video, taken on Saturday morning at the corner of Arguello Street, Broadway and Marshall Street in Redwood City, a voice claiming to be Zuckerberg says that “it’s normal to feel uncomfortable or even violated as we forcefully insert AI into every facet of your conscious experience. And I just want to assure you, you don’t need to worry because there’s absolutely nothing you can do to stop it.”
"factually dubious answers to our Google queries"
I skip right over those AI generated crapulosas.
I’m the guy in the video. But my name is Rich Roll. Not Rich Rolls and definitely not Rich Ross. So perhaps update the article?
"Rick Roll"?
Huh. Weird how Meta specifically says public person. Private citizens, it would seem, are fine to rip off. Probably because they don't have the lawyers to sue.
The minute they use my image, I will commit some heinous crime so that I am forever associated with their product.
Like Jared and Subway, but NOT THAT.
I have zero followers on TikTok, or even an account. I honestly feel like I wouldn’t care if someone wanted to use my ugly mug to sell…I don’t know…catheterization products? OK, I guess I would care, but only enough to say, “Gimme a share.”
Ta, Robyn. What passes for AI was supposed to be used to replace humans at mind-numbingly boring jobs, e.g. answering the phone at a call center (which has largely been offshored to India, anyway). The reason given was so humans could have more time for their LIVES, including our creative pursuits. Instead, AI is being used to rip off artists, writers, and musicians. LLMs spell death to human creativity, and I hope Nvidia keeps failing.
I'd rather watch an all robot performance at the ballet. At least that might be visually interesting.
Artists will continue to make art. It is what we do. I have been writing words and music for 40 years, never made a dime off it, and still do it.
I need an AI of myself to send to people I don't want to talk to that lets them down easy with a good excuse like me being flown out to Ibiza by a tech billionaire to be his pampered lover.
If famous people have “image rights” - the ability to control who can use a picture of you and for what purpose - so does everybody else.
Good luck asserting those rights if you don't have the money to pay a lawyer.
Not dunking on you, but some day someone is going to say "you need money for justice" one more time and it will be "Luigi solutions" up and down the block.
I'm not victim blaming here, but given how much of themselves people share online - all the videos and TikToks and things - no one who does this should be surprised that their content is stolen and used without permission.
No, you are very much victim blaming here.
"I'm not victim blaming here, but given how short her skirt was, no one who wears that kind of clothing should be surprised they get harassed."
No, I'm not. I'm not implying they deserved it (per your example); I'm saying that people don't take personal data safety seriously enough. People put their likeness all over the place and share so much of their personal data and lives that it was only a matter of time before someone's face and voice was stolen. There are probably hundreds of people in the same boat, probably with fewer followers, so no one has noticed or told them. They don't deserve it, and they absolutely should have all the support to make it right. They are victims. It's the lack of data awareness that got them there, though.

I wouldn't blame a tourist for being mugged after going to the wrong place, but I would question whether they were blindly trusting that everything is as safe as it is at home (especially Western tourists). Regular posters on this site have had real-world impacts because of things they've posted here, and they've learned not to do that. They didn't deserve it, they should never have experienced it, and they've learned a painful lesson from it.

I'm not blaming; I'm pointing out to everyone here that our data and our information is not as safe as we believe. I've never thought anyone should be harassed because of what they wear, nor do I believe that your gender or sexuality should be used against you, and I'm disappointed that you chose to take a tone that was not implied.

There is an inherent risk in social media that needs addressing beyond the bullying, the shaming, the lies and disinformation, the harassment, hate, intolerance, and the fascism: what you put out there can be stolen, just like leaving a wallet on the table or the keys in the ignition. These are all mistakes that can be costly and something we need to think about.
And if you’re old enough, you remember the outrage when Fred Astaire was Forrest Gumped into dancing with a vacuum cleaner.
I say we let AI rip off the Bible and create its own religion.
Surely there would be no objection to that.
I’m seeing a number of ads before YouTube videos that I believe bear the hallmarks of AI, such as photorealistic faces that barely seem to move except for the lips and not-quite-realistic voices. I can’t quite tell if these faces are completely artificial (remixing features of real people’s faces) or stolen from online/public figures I can’t quite place. Always pushing shady sh*t like “stock market advice” or worse.
Sure sucks that privacy laws are under attack—do ordinary individuals even have a protected right to their own likeness?
This is actually a very salient question. I've had to research this in another life, and this is what I gleaned from a conversation with a respected IP lawyer. I was photographed for a Japanese magazine (made the cover) and there was some question as to whether the release I signed covered everything in the photo. I'll spare you the details about why that was an issue, but I was wearing a shirt with my company's brand clearly visible and their lawyer had questions as to whether this required a separate release.
With the proviso that I am not a lawyer, I believe the answer is no in most cases. The rationale is that you allowed your features to be seen by the public, and this allows reproduction in one of several forms.
An artist could remember your face in enough detail to paint you realistically, but the expression would belong to the artist, not you.
The exceptions come in when the reproduction is intended to be published or for commercial use. In that case, depending on the jurisdiction, the artist/photographer needs a model release that states the terms of use.
When AI is involved things get a lot trickier. If I ask an AI to give me a likeness of a person well known enough to be recognizably generated, then I'm not taking an image of a person at all, I'm creating a synthetic image of a public figure. Then, as I understand it, there are a lot fewer restrictions, with the obvious exceptions for representations that could cause reputational damage, porn or the like. The law seems to be undeveloped in this area.
A lot of the context has to do with how the image is presented: an image of Elon Musk with the caption "Elon Musk says buy Bitcoin now" is a different case than one where he just appears in the background. A secondary consideration is the jurisdictional rules in effect where the image is provided from; Eastern Europe has relatively few rules, while the West has more.
If we’re not going to have immigrants anymore and Americans won’t work for immigrant-level wages, and the tariffs keep us from importing cheap stuff from foreign countries, the job creators are going to need some kind of a work-around.
Prison labor.
And PAB is going to arrest everybody, so there will be lots of workers.
Capitalism for the win!
Wait! You’re saying there’s a product that Oprah hasn’t endorsed?
It's important to call out the people and companies who write the code that uses stolen images and stolen work to "train" their machine learning/AI models. They should be sued, as should people who buy those programs and use them without checking to make sure they aren't using material that is private or belongs to someone else. Adobe is careful to train its AI on its own stock (already licensed) or on common-usage/out-of-copyright images. Adobe states that it wants to help creators, not replace them, which makes sense since creators are its customers. There may be other companies that are ethical as well. Every country that cracks down on AI theft learning will help. AI is the classic two-edged sword: extremely useful for good stuff, and for seriously bad shit. It doesn't care what you do with it, but the people behind it can be made to care a lot. It won't disappear, but a major peer and legal push for ethical use is critical.