There's one photographer, François Brunelle, who has a project where he takes pictures of doppelgängers: http://www.francoisbrunelle.com/webn/e-project.html
Some more examples:
https://www.wbur.org/hereandnow/2024/10/14/francois-brunelle...
https://www.reddit.com/r/BeAmazed/comments/1cimhns/canadian_...
Dunno. Grit?
AFAIK, copyright allows for independent creation (unlike patents), so unless one person had deliberately copied the other's appearance, there should be no problem.
I was just recently trying to find an associate from my past with an unfortunately common full name in his language, and I was rather surprised at how many of the people depicted online under his name looked extremely similar to him but, on closer inspection, were surely not him. How do you discern that a “deepfake” (what a dumb term) is similar to you and not just similar to anyone else?
Also, what if an AI is simply trained on images of you? The resulting image will likewise only be inspired by you, not be you; it is not even the same as using your images to graft a very similar facial feature onto a picture or map it onto a video.
This is in fact also what artists do in physical media: they look at something or someone and are inspired to create an illusion that gives the impression of similarity, yet it is not that thing or person. Will this new law possibly make art illegal too, because people have not thought this through?
On a digital screen it is of course also not you at all; it is individual pixels that fool the mind or create an illusion. It is really a pernicious muddling of reality and logic we have allowed to emerge, where the impression of a depiction is someone's property even though it is not that person, and only if it serves as a means of control, i.e. money. Mere peasants have no control over images of them taken in public.
The Sphere in Vegas is another good example of this on a large scale: each “pixel” is roughly 6” from its neighbors and about 2” in diameter, for all intents and purposes a separate object, each projecting only one color value within a matrix of individual LEDs. Up close it looks no different from a colored LED matrix; only when you stand sufficiently far away is your mind tricked into believing you see something that is not really there.
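A minimal back-of-envelope sketch of how far "sufficiently far away" might be, assuming roughly 20/20 visual acuity of about one arcminute (my assumption, not from the comment) and the ~6” pixel pitch mentioned above:

```python
import math

# Rough estimate of the distance at which ~6-inch-spaced LED "pixels" blend
# into a continuous image. The acuity figure is an assumption for illustration.
pixel_pitch_m = 6 * 0.0254          # ~6 inch spacing between "pixels", in meters
acuity_rad = math.radians(1 / 60)   # ~1 arcminute, typical 20/20 visual acuity

blend_distance_m = pixel_pitch_m / acuity_rad
print(f"Pixels blend into an image beyond roughly {blend_distance_m:.0f} m")  # ~520 m
```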
Frankly, these moves to “protect” are very much a direct assault on free expression and may even create unintended consequences if art exceptions no longer apply either. Is it now illegal for me to paint a nude? How about from an image I took of someone? What if I do it really well from my own memory? What if I use a modeling tool to recreate such a nude as a digital 3D object from images, or even from memory? Is AI not also simply a tool? Or is it more?
Presumably the only reason to use a deepfake of a specific person is to produce things specifically in relation to that person. Otherwise, why bother? So “is this about the individual or just coincidence?” isn’t likely to be a factor in any complaint made. This seems like a hypothetical rather than something that is likely to need answering in practice.
You presume both too much and not enough.
How are you going to do that unless it actually looks like you?
Then we see how they're doing and decide: hey, let's not be like them.
Otherwise everyday photography in public spaces would become legally risky or impractical, especially in crowded areas where avoiding all faces is nearly impossible and where the focus clearly isn't on the individuals but the landmark or scene itself.
If a deepfake is made of someone, that person was clearly the subject of the image/video, so it violates his/her privacy. This extra legislation would help in cases where the original image/video was taken with consent (so there is no privacy issue).
Imagine I drew a Coca-Cola logo in paint. Now I own the copyright to my picture of the Coca-Cola logo. Next I stick it on my new brand of soda. That’s not allowed.
Coca-Cola owns the rights to its logo. You should own the rights to your face and voice.
I think your conclusion is correct, but the fair use mentioned in the child comments would not apply, because fair use is a copyright concept, not a trademark concept. And I'm really only familiar with US law, so I'm not even sure whether fair use is a concept in Denmark. They have a different notion of copyright than we do in the US.
That said, I think Denmark did the wrong thing here. Face and identity are much closer to a trademark than to a created piece of artwork. Trademark law is a little narrower: it allows you to use a trademark to refer to the company, but prohibits you from using it to confuse customers about whether something is an authentic product. That feels quite analogous to the issue Denmark is trying to address with deepfakes, so I'm a little surprised by the copyright angle, since I can imagine a lot of legitimate uses for taking a picture of someone and distributing it without needing their permission: CCTV, traffic cams, mug shots, police body cams, and the ever-increasing trend of recording in any situation that becomes tense or dangerous. Will a civilian be told by a cop that they're violating copyright for filming an interaction with the police? This reminds me of when cops would play Taylor Swift loudly in the background so that people couldn't post videos of the interaction on YouTube.
Andy Warhol drew images of Campbell's soup cans in paint. https://en.wikipedia.org/wiki/Campbell%27s_Soup_Cans
He controlled the copyright to that painting. That's transformative, and the result does not meaningfully affect Campbell's ability to trade.
Quoting that Wikipedia link: "Although Campbell's never pursued litigation against Warhol for his art, United States Supreme Court justices have stated that it is likely Warhol would have prevailed.", with two quotes from two Supreme Court cases.
The second such quote is from Neil Gorsuch: "Campbell's Soup seems to me an easy case because the purpose of the use for Andy Warhol was not to sell tomato soup in the supermarket...It was to induce a reaction from a viewer in a museum or in other settings."
On the other hand, were Warhol to stick his copyrighted images on a new brand of soup, that would violate trademark law as it would confuse buyers.
Let's say, as an example, a married politician having an affair with someone. Generally, news sites will publish photos with the face of the politician visible but would blur the other person. The former is clearly a person of public interest, the latter is not. Even if it's a photo taken in a public space.
The issue of money gatekeeping legal rights is another matter entirely, which should be addressed for everything, not just this specific problem. It's also, in my opinion, a lot more prevalent in the US than in the rest of the first world.
So it’s not a photo of a politician doing something bad. It’s an AI recreation of what they are alleged to have done.
The law does have to be written very carefully.
Then, when someone uses their face to promote something, someone else can repeat the face with what it promotes.
So I think the whole thing actually works in this particular case.
What specific behaviors does this forbid that weren't already forbidden?
But if I (not really) had a website (I don't) where I trash politicians (I don't) and posted a photo of said politician eating poop, that should be 'frowned upon'. (Or worse: using it to shame an ex-gf or a colleague who 'won't yield to my sexual advances'.)
While reading the article, though, I thought of the cases where a paparazzo takes a photo of CelebrityA, then CelebrityA posts said photo to her Insta (without getting permission from the agency) and the agency sues her. Now (in Denmark) CelebrityA can sue the paparazzo for taking her photo in the first place (right?). This would protect people from having uncomfortable photos taken of them.
What we'll probably see is, celebrity look-alikes will be contacted to license out their own "features".
What Denmark is proposing is simply an extension of the existing safeguards that many EU countries, as well as EU legislation generally, already provide regarding citizens' control over their likeness. It is not really revolutionary, but a sensible addition in light of technological advancements.
https://rinckerlaw.com/name-image-and-likeness-how-to-protec...
But Jakob Engel-Schmidt has been talking about this in the Danish news since April/May, back when two opposing political parties created a computer-generated fake video depicting Mette Frederiksen saying things that would have outraged voters.
There is no creativity involved whatsoever. Plenty of people look similar enough that they would share "copyrighted" features. Are cartoons of prominent people now copyright infringement? (Europe has a long history of judgments and precedents holding that prominent people can be parodied etc.; how will that square with a fancy copyright protection?) You can, in principle, make money from your copyright, so if one twin "sells" their face rights and the other twin demands a share, then what?
Just make deepfakes a specific crime and do not mess with IP any further. It is already a mess.
Whereas in the West people have no respect for other people and don't bother to blur anything. I think it would be better for everyone if you couldn't post photos of other people without their permission and if annoying YouTubers went to jail.
Also when talking about some celebrity on TV they often show a drawing if they could not obtain rights to a photo.
Big overgeneralization. Here in Germany the "Recht am eigenen Bild" (literally, the right to your own image) has existed for decades, and, similar to Japan, publishing images of others comes with some pretty big limitations: without consent it is usually restricted to places or persons of public interest, to the chagrin of Google Street View and Twitch streamers.
Am I missing something, or is this just plain racism? There are lots of Japanese people who don't look Japanese, foreigners who are permanent residents, and Japanese-looking people who aren't Japanese. How is it respectful to protect only a certain ethnic group's privacy?
Businesses don't exist to respect or care about people; they exist to generate profit, so the idea of a business "respecting" something is not even realistic.
The law is what outlines the limits and guarantees basic rights to everyone.
> Businesses don't exist to respect or care about people; they exist to generate profit, so the idea of a business "respecting" something is not even realistic.
People work at businesses, and those people make decisions that are influenced by their personal feelings. In a free market people will boycott businesses for doing bad things: racist businesses get outed, receive bad press and lose revenue. In this scenario it sounds like the Japanese are okay with that. Being respectful is profitable.
> A paper signed by someone in the government doesn’t make you Japanese.
You claim that the government of Japan does not have the power to bestow citizenship on individuals?
Please stop confusing the US (and maybe the UK, Canada, Australia and NZ) with "the West".
Countries like Germany and France have very strong privacy protections.
Every burglar's wet dream. I have no idea what crime is like in Japan, but in the EU this is not an option.
"If, for example, you use a continuous recording of the road in which other vehicles' license plates are visible to defend yourself against a traffic ticket, you could be violating data protection, a serious offense that could be punishable by a fine of up to 300,000 euros."
Public photography? Does this mean your image can't be sold if taken in public? I'm sure there are many other scenarios that would be interesting to argue about as well.
This right is restricted for people of public interest. An important politician might be an "absolute" person of public interest and as long as they remain in their public position, certain private acts might still be judged to be in the public interest to be documented and published.
But anybody can become a "relative" person of public interest, e.g. you were one of the people climbing the Berlin Wall on the night it fell. If your picture is taken at that point in time, there is an obvious public interest in publishing it. But just because you participated in that one public event, a reporter can't snap your picture and publish it a week later.
There is also an exception for "panorama" images, i.e. a person appearing in an image of a public place "by chance". In that scenario the person isn't allowed to be the main subject; their presence must be incidental.
All of this has been hashed out over decades by lawmakers and courts and it is quite easy to understand if you read up on it even a tiny bit. A common sense approach will get you 95% there.
(I'm talking philosophically by the way, not legally)
For someone like Cormac McCarthy, whose sparse punctuation, biblical cadences, and apocalyptic imagery create an unmistakable "voice," the argument seems strong. His style is as identifiable as vocal timbre: readers recognize McCarthy's prose instantly, just as they'd recognize his speaking voice.
Your likeness, on the other hand, is pretty much unique for most people in the world, and the technological ability to produce convincing, arbitrary copies of it has very obvious and frightful consequences. Resolving this problem is of tremendous importance for social cohesion.
Is it a moral right which cannot be transferred? Is there a time limit? Does it expire upon death? If not, who inherits the right?
Or is it more like an economic right, which may be transferred?
Or is the author using "copyright" in a very broad and non-legal sense?
California, for example, has laws concerning the misappropriation of likeness, but these are not copyright laws.
Does the proposed Danish law allow deepfake use by consent, and what counts as consent? If clause §123/43.b of the Microsoft MacGoogleMeta user agreement says "by agreeing to this service you allow us to make and distribute deepfakes" - does that count as consent?
I guess the worry is about stuff like: person A looks like celebrity B and sells their image for, say, frosty frootloop commercials. As long as A is not impersonating B, i.e. claiming to be B, I can't see a problem. "Hi, my name is Troy McClure, you may know me for looking like Serena Williams." I guess it will be the decade of the doppelgänger agencies, like in Double Trouble ;) [1]
[1] https://www.imdb.com/title/tt0087481/?ref_=nv_sr_srsg_1_tt_8...
Same situation as today: if you have a lookalike out there who does pornography and somebody you know runs across it, they'll think it's you, and there's not much you can do about that except explain.
Dollars to doughnuts this law gets used against people who aren't misrepresenting themselves but happen to look like famous people.
There have been many cases where a company wanted to hire, say, actor X to voice their commercial, the actor refused, so they hired someone else with a nearly identical voice; the original actor sued and won (!!!!!) because apparently it's their "signature" voice.
I disagree, because that obviously means the other person now has no right to make money using their own voice, through no fault of their own?
But yeah, I'd imagine you'd have the same problem here: you can't generate a picture of, say, Brad Pitt, even if you say "well, actually this isn't Brad Pitt, it's just a person who happens to look exactly like him" (which is obviously entirely possible and could happen).
(In music, some other cases have been about suspected misuse of actual recordings, e.g. a cover band being sued because the original musician believes they actually used one of their recordings, and disproving that can be tricky. I don't think that can happen as easily with look-alikes.)
Hosting your stuff outside the E.U. does not really protect you from this. More likely than not, your content relies on other companies (host, ISP, etc.) to be delivered to your audience and those companies either do have a presence in the E.U. or want to retain a working relationship with E.U. authorities. It's simply a question of how much energy E.U. authorities would be willing to invest in enforcement.
From Wikipedia: "Public figures can be photographed as part of their function or professional activity... A photograph of a public figure taken as part of his private life therefore still requires explicit authorization for publication. Thus, the Prime Minister cannot oppose a journalist photographing him at the exit of the Council of Ministers or during an official lunch, but he can prohibit the publication of photographs representing him at an event in his private life, such as a family reunion.”
https://fr.wikipedia.org/wiki/Droit_à_l%27image_des_personne...
For actual photographs of real persons, Denmark and many other E.U. countries already have a broad system of legislation that governs your right to your own likeness. It generally makes it illegal to take and/or publish an image of a random person in a place or during an activity where they would have a reasonable expectation of privacy (e.g. in their backyard), or one taken in a manner explicitly designed to circumvent reasonable precautions to ensure privacy.
This is counterbalanced by the public interest, which can cover certain activities of public persons (e.g. a prominent politician having an affair) or public activities of random persons (e.g. you taking part in a protest march). It's decided on a case-by-case basis, based on quite detailed laws and decades of jurisprudence.
Is a Donald Trump impersonator (example[1]) copying the creative performance of Donald Trump (president)? What if someone did intend to create deepfakes of Donald Trump (president) and instead of using an image or audio of Donald Trump (president) as source material, use the Donald Trump impersonator as the source material?
Win-win. For the lawyers.
Like how Arnie wouldn't allow his likeness in the C64 Predator game. (Which also had backstory not in the movie; it blew my mind that games could build on movies and that actors had rights to the likeness of a movie character they played.)
Does this mean corporations can't CCTV me, the same way I can't film in a theater?
A lot of problems with this, and the real privacy benefits won't be enforced. We will see what happens.
Or if I get a tattoo with a logo, is that "my own feature" and do I now have copyright?!
This is like giving copyright to a name; there will be collisions and conflicts.
Punitive damages are very rare or non-existent depending on the country and the loser of the case usually has to pay the winning party's legal fees. There just isn't the incentive to sue someone over something silly like what you've mentioned.
I sure as hell don't.
It's also why the idea that "code is law", popular in certain circles, was always misguided.
We moved past content scarcity decades ago and we are squarely in the attention scarcity regime. We use copyright against itself to have open source. We prefer interactivity and collaboration, as in open source, social networks or online games. Copyright stands in the path of collaboration and interaction.
Will companies now need to license "the likeness" of people too? Will "likeness" be property to be sold or rented?
- either the famous person cannot use their look if a lookalike refuses to agree
- or they have to pay all lookalikes to use their own image
- or the lookalikes get less protection under this law
- a person might lose their look-rights if they change their appearance to look like someone else
- someone who wants to go into acting might not get hired if they look too much like a famous actor
They already do.