Why was it Yeom Hye Ran? The AI controversy that may have been intentional

Actress Yeom Hye Ran’s likeness was used without her consent in the film The Meter Reader. Although she never appeared in the project, her face, her body, her expressions, her voice, and even her most subtle movements were recreated using artificial intelligence, resulting in a production that looked and felt like one of her own films.

The creators of the video claimed they had obtained prior consent from Yeom Hye Ran and her agency. This turned out to be false: both the actress and her representatives stated that there had been no prior discussion or approval, and that they only became aware of the video after it was uploaded to YouTube. The film was eventually taken down and set to private.

The most intriguing part of this incident is that the creators made a claim that could easily be verified and proven false. Furthermore, Yeom Hye Ran is an actress widely appreciated by audiences not only for her acting skills but also for her personal image; many viewers consider her presence alone a guarantee of quality.

Given her status, using her image without verification seems an unusually imprudent decision for industry professionals. Rather than a mistake or misunderstanding, it raises suspicions of a deliberately manufactured controversy: in other words, a calculated form of viral marketing with a specific objective.

In fact, many viewers who watched The Meter Reader were shocked to learn that the figure on screen was not Yeom Hye Ran but an AI-generated imitation. The surprise does not come from the mere fact that it wasn’t her; it lies in the blurred boundary between reality and fiction, an ambiguity created so convincingly that it becomes the primary reaction to the video.

Controversy over portrait rights has, in effect, become a powerful exposure strategy for creators. In the age of artificial intelligence, where the technology is widely used in content production, viral marketing can increasingly rely on provoking the ethical sensitivities that audiences are highly attuned to. This is what makes this case particularly striking.

A similar incident occurred in 2024, when OpenAI introduced ChatGPT’s “Sky” voice mode, which sounded remarkably similar to the voice of Scarlett Johansson, who played AI assistant Samantha in the movie Her. The situation turned into a legal dispute, although OpenAI argued that the voice was not a replica but performed by a different professional voice actor.

Even then, the public was less concerned that the voice had been directly copied. Instead, they were fascinated by how much it sounded like Samantha, bringing a fictional character into reality. If this had been intended as viral marketing, it would have been undeniably successful.

Sources: Daum
