AI Consultation Faces Human Rights Problem

Experts find hole in controversial UK government proposals

By Chas Rowe | UK male voice-over artist

Big Ben and the UK Houses of Parliament. Photo by Daria Agafonova

Legal consultants have argued that the UK Government’s proposals to enable AI companies to train models on copyrighted material under a so-called “data-mining exception” could be challenged under human rights law.

Handley Gill, which describes itself as a “legal, regulatory and compliance consultancy”, contends that the implications of the government’s Consultation on Copyright and AI may fall foul of Protocol 1, Article 1 of the European Convention on Human Rights.

The protocol concerns the protection of property, which covers not only physical property but can also include intellectual property such as licences and patents.

Scales of justice. Photo by Katrin Bolovtsova

Protocol One

The protocol, as set out in the Human Rights Act 1998, states that:

“Every natural or legal person is entitled to the peaceful enjoyment of his possessions.

No one shall be deprived of his possessions except in the public interest and subject to the conditions provided for by law and by the general principles of international law.”

While the law does allow for certain limited exceptions to this principle, the firm has indicated in its response to the AI consultation that copyright holders’ property rights may be infringed by entities “expropriating or otherwise controlling their use”.

The proposals, say lawyers, “would enable AI developers to utilise published copyright works… to train their artificial intelligence (AI) models without having to remunerate rights holders”.

Statement by Handley Gill Limited, shared on its LinkedIn page

Precedent over Property

As well as pointing to the Act, lawyers cite case law from 2007, when the European Court of Human Rights in Strasbourg recognised in Anheuser-Busch Inc v Portugal that the protocol “is applicable to intellectual property”.

The firm therefore considers that the government’s proposals “would constitute an interference with copyright holders’ possessions”.

It also explained that any interference must be “in the legitimate public or general interest”, suggesting that the interests of the “highly profitable” and “well financed” AI industry do not meet that test.

A further contention was that any public or general interest would need to be “proportionate” and “strike a fair balance” between the needs of the affected parties.

The emergence of fairness and justice. Photo by Tara Winstead

Notoriously Laborious

The government’s 52-question AI consultation ran from 17th December 2024 until 25th February 2025 – a total of 10 weeks.

The online form covered many areas of concern surrounding AI, including, among others, proposals for affected individuals and companies to opt in to, or out of, data mining by AI companies.

Further questions aimed to investigate regulation and compliance, asking what the “legal consequences” should be for companies that ignore rights holders who want to reserve their rights.

The AI consultation also invited discussion around the issues of “digital replicas”, which include deepfakes and vocal clones – the latter of which have caused consternation among celebrities such as Stephen Fry and David Attenborough, whose ‘voice prints’ have been made and published without permission or payment.

Copyright use and payments going hand in hand. Image by Mohamed Hassan from Pixabay

Balancing Interests

The preface of the AI consultation outlined that “copyright is a key pillar” of the creative economy.

However, it also made clear the government’s determination to “build on the strengths of the UK AI sector”, saying that both industries “are essential to drive economic growth”.

The government’s AI consultation has drawn praise and criticism from across society.

The BBC, for example, has said that it is “positive about the potential of AI”, but has called for “fair licensing arrangements and the authorised use of content”.

Meanwhile, more than 1,000 musicians have released a “silent album” of background noise from “empty studios and recording spaces” to symbolise the predicted effect of artists not being able to work, if copyright laws were changed.

In a UK first, national newspapers also carried identical front-page advertising (MAKE IT FAIR), following the end of the AI consultation on Tuesday 25th February 2025.

Chief Executive Officer of News UK, David Dinsmore, told the Chris Evans Breakfast Show on Virgin Radio: “from a journalistic point of view, it’s really important that you have up-to-the-moment, verified information – or these [AI] models simply don’t work.”

“We want to license and sell our content and we already have deals with some companies… that’s a good outcome.”

The union for performers and actors, Equity, has called for an “artist-centred approach” and has outlined eight principles for members and engagers when working with AI. These include consent, limited licensing and fair payment.

As for Handley Gill, in one of its LinkedIn posts it has encouraged the UK government “to establish and encourage mechanisms for licensing, including exploring subsidies, to grow the creative and professional industries at the same time as UK AI, as well supporting enforcement against infringements”.

Handley Gill Limited is a London-based legal and regulatory compliance consultancy, advising on areas including data protection, online safety, artificial intelligence and human rights.

Thanks for reading – and in the meantime, happy voice-over hiring.

If you liked this article, please share it via the social media buttons above. A credit, link and a thank you are always appreciated.

About the author

Chas Rowe is a UK voice-over artist, writer, former radio journalist and newsreader, and an advocate for best practice in professional voice-over production and hiring. 

Chas holds a BA in French and German, an MA in Film & Television Studies from the University of Warwick, and an MA in Multimedia Journalism from Bournemouth University. 

To hire Chas for your next voice-over project, or to discuss syndication of this article, please email: [email protected]

© Copyright Chas Rowe 2025