The law will have to radically adapt to the way new digital technologies are changing not only the use and exploitation of property but even its definition, writes Nick De Marco KC
From the mundane, using social media to socialise or video conferencing for work, to the more extravagant, investing in virtual property, Non-Fungible Tokens (NFTs) or expensive virtual handbags for virtual personas, our social and economic lives are increasingly lived in a digital world. We are all having to adapt to this new way of living and working, but the law needs to develop new and creative approaches if it is to remain relevant to these innovations.
The challenges posed by the metaverse, AI and the new digital world are substantial. New forms of virtual property and exchange, the increasing value of data and its exploitation, the dangers as well as benefits of AI, the challenges to personal privacy, the control of open spaces for political discourse, and the undermining of intellectual property as we have known it are just some of the most obvious and immediate legal issues.
Take a few recent examples. In April this year, the TikTok producer ‘Ghostwriter’ received millions of hits for his AI-generated song, ‘Heart on My Sleeve’, in which the vocals were made to sound like the well-known artists Drake and The Weeknd. Ghostwriter did not directly copy the singers’ voices, but by feeding an AI model with voice samples he was able to generate new vocals with a similar sound. While Universal managed to have the song taken down on many platforms, the legal position remains uncertain. Copyright law protects original creative works, such as literature, songwriting, art and so on. There is no copyright in the normal sound of a voice. The law of passing off might apply, but not if the producer makes clear the song is not by someone else but only made to sound like them.
Other examples arise from the world of photography and art. I asked the app ‘Starryai’ to create photographs from various prompts I entered and quickly got results of varying quality and realism. Likewise, another app, ‘Photoleap’, produced paintings ‘in the style of’ well-known artists. The AI can do this because it has been fed hundreds of thousands of individual (and often copyright) photographs and artworks. That is one of the issues in Getty Images (US) Inc v Stability AI Inc in the US (and similar litigation in England). The photo agency claims Stability AI misused more than 12 million Getty photographs to train its AI image-generation system. The legal issue in the case is likely to be whether the machine’s use of Getty’s images falls within the ‘fair use’ exception.
In the UK, the ‘fair dealing’ exception to copyright infringement is narrower than the more fluid ‘fair use’ exceptions in the USA. What’s called ‘text and data mining’ (TDM) is currently restricted to use for non-commercial research. The UK government originally proposed to extend the ‘fair dealing’ exception to TDM of copyright material for any purpose, going even further than the EU’s 2019 amended rules, which permitted TDM while allowing artists to ‘opt out’. The backlash from the UK’s creative industries led the government to scrap the plans, at least for now.
The recent Hollywood actors’ and writers’ strike raised similar issues. One of the strikers’ major concerns is the producers’ plans to use unapproved, AI-generated likenesses of actors for the rest of eternity with little or no compensation.
Data rights in sport are another current hot topic. In England, ‘Project Red Card’ has focused on the commercial exploitation of players’ personal data, especially by betting companies and sports analysts using AI algorithms to set odds and provide analysis. Thousands of professional football players have had their performance and tracking data used by third-party betting companies without their express knowledge, consent or any remuneration. The sports betting industry is one of the most sophisticated and profitable in the world, and it uses this data to set odds on bets. The data is personal data under data protection law, so it cannot be used without players’ consent or some other legitimate basis. Players agree, by standard professional contracts, that the FA or the applicable league can use their data, and there are legitimate interests, such as improving health and fitness, for doing so. But some of those bodies then sell the data on to third-party businesses for profit, and the legal basis for that onward use remains untested.
In 2021, the UK Supreme Court considered a class action brought on behalf of unidentified Apple iPhone users who had their tracking data unlawfully used by Google to target advertisements at them. The Court ruled that damages for ‘loss of control’ of data under the applicable data protection law could not be awarded without showing material damage or distress. That might be right for unidentified iPhone users, but what about professional sportspeople who have clear value in their own personal data which they could sell to third parties had the data not already been traded without their consent? The law must surely provide protection here.
It is clear that the law will have to radically adapt to the way new digital technologies are changing not only the use and exploitation of property but even its definition. Non-Fungible Tokens (NFTs) have been another controversial example. But beyond the hype, mystery and scams surrounding NFTs, many see them as a way in which creatives and others might be able to protect their own creations and data in ways current laws do not. The regulation (or not) of cryptocurrencies and NFTs, and whether they ought to be treated as gambling or financial products, is another topic outside the scope of this article – but it underlines the point: the law needs to develop if it is to remain relevant in the new digital world.
A recent decision of the Civil Court of First Instance of Florence, Italy, demonstrates the extent to which jurists have creatively developed the law in this space. GQ magazine superimposed the image of a model onto an image of Michelangelo’s David statue for a front cover. While the statue is in the public domain and not subject to IP law, the court found GQ’s use in breach of the Italian Constitution and ordered it to pay damages. It did so by applying a creative interpretation of art. 9 of the Italian Constitution, which protects Italy’s historical and artistic heritage, read with art. 10 of the Civil Code, which protects the ‘image rights’ of individuals (who can sue for damages where their image has been published without their consent). The court found that GQ’s use of the image of the statue was ‘detrimental to the image of cultural heritage as an expression of the cultural identity of the nation’, and therefore violated the ‘collective identity of citizens’ of Italy.
Providing artworks with their own (or the community’s) image rights may strike many lawyers as a step too far, but new and innovative thinking is needed more than ever if the law is to be as relevant in the digital world as it is in the ‘real’ world.