Artificial Intelligence

All posts in this category.

UK Creators Demand Prime Minister Recognise Creators’ Human Rights And Protect Copyright Now

Over 70 leading UK creators and creative organisations, including Elton John, Sir Paul McCartney, and Getty Images, have signed an open letter urging the Government to defend copyright holders’ rights and stop Big Tech AI firms from exploiting their work. The signatories accuse ministers of ignoring international human rights obligations and prioritising US trade deals over the UK’s creative economy.

IPTC publishes best-practice guidance on Generative AI opt-out for publishers

The IPTC has released best-practice guidelines that publishers can follow to express that they reserve data-mining rights on their copyrighted content.

Regulation on artificial intelligence: Rachida Dati, Minister of Culture, reacts to the joint declaration of European and international rights holders

France's Minister of Culture, Rachida Dati, emphasises the need for fair remuneration for creators and rights holders in the context of the European AI regulation. The government is actively working with cultural and media sectors and AI developers to establish a fair remuneration model, ensuring the protection of copyright in the evolving AI landscape.

A Missed Opportunity to Enforce Meaningful Transparency Obligations under the EU AI Act

In July 2025, the European Commission released documents intended to ensure transparency for General-Purpose AI providers under the EU AI Act, but CEPIC argues they fall short of protecting Europe's creative and media sectors. The voluntary Code of Practice, insufficient Transparency Template, and overly lenient Guidelines for GPAI providers fail to provide meaningful protection for rights holders, prompting CEPIC to call for stronger, enforceable measures.

Joint statement by a broad coalition of rightsholders regarding the AI Act implementation measures adopted by the European Commission

A broad coalition of rightsholders from the cultural and creative sectors in the EU has expressed dissatisfaction with the European Commission's implementation measures of the AI Act, particularly the GPAI Code of Practice and Guidelines. They argue that the measures fail to address key concerns about the protection of intellectual property rights and do not fulfil the promises of the AI Act, which was intended to safeguard European copyright holders in the age of generative AI.

Don't Let the AI Training Fair Use Headlines Get You Down

Joe Naylor discusses two recent legal rulings that address AI's use of copyrighted works for training large language models (LLMs). While the decisions in Bartz v Anthropic and Kadrey v Meta favour AI companies in some respects, they also highlight potential future challenges for AI training on copyrighted content, with Judge Chhabria’s ruling offering a roadmap for future plaintiffs to pursue infringement claims.

Combating Online Misinformation: Launch of "Provenance for Trust"

Provenance for Trust," a new initiative from TrustMyContent, UncovAI, and the Journalism Trust Initiative, aims to restore trust in media by ensuring content authenticity and traceability. The program addresses key issues like AI transparency and copyright protection, providing solutions to combat misinformation.

Major publishers call on the US government to ‘Stop AI Theft’

Hundreds of US publishers, including The New York Times and The Washington Post, have launched an ad campaign urging the government to ensure Big Tech compensates creators for content used in AI training. The "Support Responsible AI" campaign calls for fair compensation, mandatory attribution, and the protection of copyrighted content from unauthorised use by AI companies.

CEPIC 2025

CEPIC 2025 promises even more business opportunities – and more enjoyment – than ever before. Set in the South of France, it’s an unmissable opportunity to network with an array of visual media professionals from across the globe, make contacts and do good business.