The Perils of Artificial Intelligence and Estate Disputes

In the ever-evolving landscape of technology, artificial intelligence (AI) has emerged as both a tool for innovation and a source of ethical dilemmas. One such dilemma that has recently come to light is the unauthorised use of voices through AI, raising concerns about privacy, consent, and the protection of individuals’ rights. In Australia, where the debate over AI ethics is gaining traction, it’s crucial to address these issues head-on to safeguard against potential abuses and ensure accountability in the digital world.

The unauthorised use of voices in AI applications presents a multifaceted challenge with far-reaching implications. Whether it’s replicating a celebrity’s voice for commercial purposes or synthesising the voice of an acquaintance without their consent, such actions raise serious ethical questions about the boundaries of technology and the rights of individuals.

At the heart of this issue is the concept of consent. Every individual has the right to control how their voice is used and disseminated, and this right should be respected in the digital world as much as it is in the physical world. Unauthorised use of voices through AI not only violates individuals’ privacy but also undermines their autonomy and agency over their own identity.

Recently, lawyers for the estate of the late comedian George Carlin successfully resolved a lawsuit against the makers of a podcast who used generative artificial intelligence to impersonate the late stand-up comic’s voice and style for an unauthorised special. This case marks what is believed to be the first resolution of a lawsuit over the misappropriation of a celebrity’s voice or likeness using AI tools. It comes as Hollywood sounds the alarm over the use of the technology to exploit the personal brands of actors, musicians, and comics, among others, without consent or compensation.

Prominent Australians such as Gina Rinehart, Dr. Karl, Andrew Forrest, and Margot Robbie have also raised concerns over unauthorised alterations of their voices and images through AI, alongside the illicit use of their likenesses in the promotion of goods or fraudulent schemes.

In Australia, a Will must be in writing to be valid, so wishes expressed only verbally cannot be enforced as a legal Will. However, the Courts have accepted a Will maker’s recorded video or audio statements to clarify the wishes of a deceased person.

Accordingly, it is foreseeable that disputes will arise where a disappointed beneficiary produces a recording of the deceased that purportedly alters the terms of the Will in that beneficiary’s favour, but which was in fact generated by AI using the deceased’s voice and likeness.

In this day and age, then, it is important for the executor or administrator of a deceased estate to seek legal advice if they suspect that AI-based subterfuge pertaining to a Will is afoot.

Steven Hodgson, senior associate, and the Estates Team at Salerno Law have assisted many individuals and couples in understanding their options and planning their estates in accordance with their testamentary wishes, as well as in dealing with disputes that arise after death. Based on that experience, the team at Salerno Law are practised in determining what each client really wants and excel at helping them navigate the issues that may arise in estate planning, administration, and disputes, especially in the uncharted territory of AI.

Want to know more about the George Carlin estate matter? Read more here: Hollywood Reporter Article

Author: Steven Hodgson