Researcher: Revenge of the Clones

Context:
Your lab possesses a unique dataset of emotional voice recordings, donated for mental health research. A tech giant offers you €1 million to license it for a commercial voice-cloning product. The original participant consent forms are ambiguous: they never explicitly forbid this commercial use.
Dilemma:
A) Reject the offer, upholding the spirit of the participants' consent for altruistic research.
B) Accept the payment, arguing that the consent forms legally permit it and that the funds will accelerate your lab's work.
Story behind the dilemma:
In May 2024, OpenAI showcased a flirty voice assistant, "Sky", as part of its GPT-4o demo. Critics and the public immediately noted that the voice was eerily similar to Scarlett Johansson's performance in the film Her. This was no coincidence: CEO Sam Altman had twice asked Johansson to license her voice for the assistant, months earlier and again just days before the launch. She refused both times.
Despite her refusals, OpenAI proceeded with the "Sky" voice. Johansson stated she was "shocked" and forced to hire legal counsel. Altman himself amplified the situation by posting the single word "her" on X after the product reveal, a direct reference to the film.
OpenAI suspended the "Sky" voice and published a defense, claiming the voice was not an imitation of Johansson but belonged to a different professional actress, cast before any outreach to the star. The company apologized for poor communication, but not for the resemblance. The incident highlights the ongoing ethical battles in AI, in which companies are increasingly accused of using creative work and likeness without permission, and it placed OpenAI under further scrutiny on top of existing lawsuits from authors and media organizations.
Resources:
