Benjamin Field

Our Generative AI Policy


Deep Fusion Films: Our Approach to Generative AI

Leading with Integrity in a Changing Industry

At Deep Fusion Films (DFF), we believe the future of storytelling lies at the intersection of innovation and responsibility. Generative AI (Gen AI) is already transforming how we create, deliver, and experience content. It offers powerful new possibilities - but also brings with it serious ethical, creative, and environmental questions.

We don’t shy away from those questions. Over the past 18 months, we have been a visible and vocal advocate for ethical AI in the creative industries - speaking at WIPO events, giving evidence to a UK Government Select Committee, and helping shape policy for industry unions and representative bodies. We are not passive adopters of AI. We are actively helping define how it should be used - and how it should not.

This document sets out how we use Gen AI at Deep Fusion Films. It is a living commitment: not a marketing statement, but a framework of accountability to our industry, our collaborators, and our audiences.

1. The Reality of the Industry Today

Broadcasters and networks are already leaning into Gen AI - especially when it:

  • Speeds up legitimate production processes

  • Reduces cost or post-production time

  • Is used transparently and doesn’t mislead audiences

To stay relevant, trusted, and commissionable, we must evolve with the industry - but not at the expense of ethics or trust. That’s why we’re choosing to lead from within. We use Gen AI responsibly, with human direction, visible boundaries, and creative intent at the core. We acknowledge the imperfections within the system and are actively involved in building a future with equitable remuneration for creatives, data provenance and copyright protections.

2. Why We Use Gen AI

We use Gen AI to enhance and accelerate pre-existing production processes - not to replace the work of human creatives.

Our current uses include:

  • Animating licensed stills

  • Converting artwork or illustrations into photorealistic images

  • Generating background plates where traditional filming is impractical

  • Assisting with transcription, subtitling, and localisation

  • Speeding up previsualisation and asset development

These uses follow a well-established lineage of production tools - like colourisation, motion interpolation, rotoscoping, and digital VFX. Gen AI is the next step in that evolution.

We do not use Gen AI to generate entire films, stories, or performances from scratch unless it is for Research and Development. Our internal development work may be released online for the purpose of informed audience feedback.

3. Human Creativity Is Always Central

Every Gen AI output at DFF is:

  • Initiated by human creatives

  • Directed, approved, and integrated within a human editorial vision

  • Reviewed for tone, accuracy, and audience perception

We never use Gen AI to:

  • Replace scriptwriters, editors, or performers

  • Mimic living individuals without explicit consent

  • Generate misleading visual material for factual productions

When AI is used in the creative process, we credit the human, not the tool. AI assists. It never authors.

4. Consent, Estates, and Likeness Rights

We do not and will not use Gen AI to recreate the likeness, voice, creative style, or identity of any living person without their explicit, contractual consent.

Where deceased individuals or estates are involved, we obtain written permissions and engage in equitable partnership models to ensure their legacy is respected.

5. Addressing the Training Data Dilemma

We acknowledge that many Gen AI tools have been trained on vast datasets that likely include copyrighted content or third-party materials without consent. This is a fundamental and unresolved issue - one that requires transparency and reform.

At the same time, Gen AI models must learn from real-world patterns to function. An AI doesn’t understand how trees move, how faces light, or how emotions play out unless it has seen millions of examples. That’s modelling, not mimicry.

There’s nothing copyrightable about gravity or motion - but we recognise that creative work is part of what taught these systems how to see.

That belief directly informs our reinvestment strategy (Section 6).

Additionally, while Gen AI is capable of imitation and replication, and therefore capable of infringing on existing copyright or IP, we will never knowingly use Gen AI tools or processes to create new material based on existing intellectual property without permission. 

We carry out due diligence to clear the rights in content we use in AI-assisted processes.

6. Reinvesting in the Industry

We believe that if AI tools benefit from the creative ecosystem, they should help sustain it.

That’s why we:

  • Donate a portion of profits from AI-assisted productions to the TV & Film Charity, supporting industry professionals in need

  • Deliver a dedicated training programme in partnership with the National Film and Television School (NFTS), focused on the responsible, creative, and practical use of Gen AI in production

Through both financial support and knowledge sharing, we aim to ensure the future of the industry is shaped by those who understand it - not just those who code it.

7. Transparency and Guardrails

We operate with full transparency. On every project, we:

  • Declare the use of Gen AI to commissioners, collaborators, and clients from the outset

  • Track all use of Gen AI internally via a tiered risk framework

  • Label and disclose AI-generated or enhanced materials clearly in documentation or credits

  • Disclose use to the audience where it may impact understanding - especially in factual or documentary content.

Our tiered framework:

  • Tier 1 – Technical enhancement only (e.g. transcription, stabilisation)

  • Tier 2 – Augmenting licensed material (e.g. animating stills, background creation)

  • Tier 3 – High editorial sensitivity (e.g. recreating people, historical events) - used only with sign-off and clear labelling
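As a purely hypothetical illustration of how a tiered framework like this could be tracked in an internal tool, the sketch below encodes the three tiers as a data structure with a simple compliance check. The class names, fields, and the Tier 3 sign-off rule are assumptions for illustration, not DFF's actual implementation.

```python
from dataclasses import dataclass
from enum import IntEnum

class Tier(IntEnum):
    # Tier levels mirror the three-tier risk framework above
    TECHNICAL_ENHANCEMENT = 1       # e.g. transcription, stabilisation
    LICENSED_AUGMENTATION = 2       # e.g. animating stills, background creation
    HIGH_EDITORIAL_SENSITIVITY = 3  # e.g. recreating people or historical events

@dataclass
class GenAIUse:
    project: str
    description: str
    tier: Tier
    signed_off: bool = False
    labelled: bool = False

    def is_compliant(self) -> bool:
        # Tier 3 uses require explicit sign-off and clear labelling;
        # lower tiers are logged but need no extra gate in this sketch.
        if self.tier == Tier.HIGH_EDITORIAL_SENSITIVITY:
            return self.signed_off and self.labelled
        return True

use = GenAIUse("Doc series", "Animate licensed stills", Tier.LICENSED_AUGMENTATION)
print(use.is_compliant())  # True: Tier 2 needs no extra gate in this sketch
```

A registry of such records would make every project's Gen AI use reviewable and auditable, which is the stated goal of the internal tooling described below.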

We have evaluated the Gen AI tools we use and we never commercially use software or platforms where the materials we input - such as scripts, stills, or footage - are used to train or improve that provider’s models. We work only with services that offer clear contractual assurances that user-uploaded content remains private and proprietary.

We are also developing internal tools to ensure this tracking is consistent, reviewable, and auditable.

8. Responsible Use in Factual and Historical Storytelling

We do not use Gen AI to generate misleading “archive” material or to create visuals that could distort real events.

If synthetic material is used in factual work:

  • It will be clearly labelled

  • It must be justifiable

  • It must not risk misleading the viewer

History is not a prompt. We treat it with respect.

9. Our Role in the Labour Market

We do not use AI to cut corners or reduce headcount.

Gen AI should support, not replace, skilled labour. We build our workflows to keep editors, writers, designers, and performers in the loop - and to free them from repetitive or non-creative tasks.

We will not use AI to eliminate core creative roles purely for cost savings.

10. Environmental Awareness

AI systems - especially visual generation - are compute-intensive. We are committed to reducing the environmental cost of our work.

We:

  • Prioritise efficient models and workflows

  • Minimise unnecessary rendering or compute cycles

  • Work with partners to explore energy offsetting

In addition, we have:

  • Created an internal environmental impact calculator to measure our AI energy use on a per-project basis

  • Commissioned an internal study into the full environmental cost of our AI pipeline, with the intention to publish and share insights with the wider industry
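A per-project energy estimate of the kind such a calculator produces can be approximated from accelerator hours, hardware power draw, and data-centre overhead (PUE). The sketch below is a generic illustration with assumed figures - the default wattage, PUE, and grid carbon intensity are placeholders, not DFF's internal tool or measurements.

```python
def estimate_energy_kwh(gpu_hours: float, gpu_watts: float = 400.0,
                        pue: float = 1.4) -> float:
    """Rough per-project energy estimate.

    gpu_hours: total accelerator hours spent on generation/rendering.
    gpu_watts: assumed average board power draw (illustrative figure).
    pue: power usage effectiveness, the data-centre overhead multiplier.
    """
    return gpu_hours * gpu_watts / 1000.0 * pue

def estimate_co2_kg(energy_kwh: float, grid_kg_per_kwh: float = 0.2) -> float:
    # Grid carbon intensity varies by region and hour; 0.2 kg CO2e/kWh
    # is an assumed placeholder, roughly in the range reported for the UK grid.
    return energy_kwh * grid_kg_per_kwh

energy = estimate_energy_kwh(gpu_hours=120)  # 120 h x 400 W x 1.4 PUE
print(f"{energy:.1f} kWh, {estimate_co2_kg(energy):.1f} kg CO2e")
```

Measuring real accelerator hours and sourcing region-specific carbon intensity would be the substance of any serious per-project calculator; the arithmetic itself is this simple.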

Innovation must also be sustainable. We’re making sure it is.

11. A Living Commitment

This policy is not a statement of perfection - it’s a declaration of direction.

We continually take advice from respected third-party professionals - including legal, insurance, and policy experts - to inform and test our thinking. This ensures we are not operating in a bubble or echo chamber. We welcome challenge, scrutiny, and new perspectives, and we actively learn from them. Our position continues to evolve because we believe ethical leadership requires constant listening, not static rules.

As technology evolves, so will we. We will continue to:

  • Refine our internal tooling and tracking

  • Build AI products that embed ethical principles by design

  • Participate in public discourse, industry panels, and policy conversations that shape the future of creativity

We are not just responding to the wave - we are helping navigate it.

The Bottom Line

We will use Gen AI.

We will use it responsibly.

We will use it to speed up work - not to replace people.

We will give back to the creative ecosystem.

We will build tools with accountability in mind.

We will educate others and challenge ourselves.

We don’t hide what we do - we label it, explain it, and stand behind it.

AI won’t define Deep Fusion Films.

But the way we use it will.


30/06/2025

AI SPECIALIST DEEP FUSION FILMS STRIKES ARC PARTNERS PACT

UK-based Deep Fusion Films has its sights set on deeper U.S. traction.

The production house has teamed with bi-coastal management and advisory services firm ARC Partners to expand its networks, build strategic relationships with buyers and production companies and deepen its position as a key player in AI policy, ethics and legislation. ARC is led by former WME and Gersh agent Collin Reno out of New York and Ivo Fischer in L.A., and is known for advising U.S. and U.K. studios, branded entertainment companies, production firms, and creative talent.

“Deep Fusion’s unique approach – where cutting-edge technology meets compelling storytelling, underpinned by a strong ethical commitment – really stood out to us,” said Reno, founder and Partner at ARC Partners. “Their blend of creativity and innovation aligns perfectly with our expertise in guiding clients through change and building strategic partnerships. Together, we’re focused on supporting Deep Fusion’s ambitious initiatives, unlocking new opportunities, and thoughtfully shaping the future of content in an evolving media landscape.”

“Deep Fusion has grown rapidly over the past few months and, as we enter this new frontier of media innovation, we need to work with people who see a different way of working within the traditional landscape – people who are not afraid to break with tradition to forge a new pathway,” added Deep Fusion co-founder and CEO Benjamin Field.

“I’ve known Ivo for a couple of years and have huge respect for the way both he and Collin think. I’m genuinely excited to be partnering with ARC as we continue to build something meaningful at the intersection of media and responsible AI.”

The partnership comes soon after Fischer joined ARC Partners, which in May rebranded from The ARC Representatives. Fischer, also a former long-serving WME agent, had exited Buchwald in November 2024 after a two-year run at the agency as Head of Unscripted Content and Talent. Since being founded in 2023, Deep Fusion’s management have authored several policy papers on AI, as the entertainment world continues to wrestle with how to approach the technology ethically. Deep Fusion’s work as a contributor to the UK government and international organisations has centred on resolving emerging ethical, legal and sustainability challenges around the use of AI and finding fair and equitable solutions.

Furthermore, Deep Fusion has placed itself in the middle of the generative AI (genAI) debate, examining ethical grey areas and IP risks. Its genAI workflow has also been offered insurance, in what could be an industry first.

Elsewhere, the company has launched Verbl, an AI-supported transcription, translation and dubbing service that it claims “combines ethically driven and fully licensed machine efficiency with human oversight to ensure accuracy, cultural sensitivity and editorial integrity.”

This comes after the launch of Deep Fusion Technologies in May. Obinna Emmanuel Obi-Akwari has been named as the new tech division’s UI/UX media and innovation lead. This follows the appointment in April of former ITV and BBC development exec Charissa Coulthard, who took on a newly-created role as development executive, charged with shaping the next generation of unscripted formats across the company’s digital-first and traditional broadcast slates.

16/05/2025

DEEP FUSION FILMS LAUNCHES TECH DIVISION WITH DUBBING SERVICE ‘VERBL’

Obi-Akwari hired to maximise user experience and lead AI media innovation

Deep Fusion Films, the production company and innovative technology specialist, has announced the launch of a new technology division, Deep Fusion Technologies (DFT). 

DFT has been established to supercharge the company’s commitment to reshaping how content is developed, produced and delivered—by building tools that meet the needs of today’s creators, not today’s software.

Obinna Emmanuel Obi-Akwari has been appointed as UX / AI Media & Innovation Lead, reporting to Deep Fusion Films’ head of creative AI Christian Darkin, who joined the company last September. Obi holds a multi-disciplinary background in front-end and back-end development, education, law and UX design.

The first product to launch under the DFT banner is Verbl—an AI-supported transcription, translation and dubbing service designed to make localisation faster, more cost-effective, and more accessible. Verbl combines ethically driven and fully licensed machine efficiency with human oversight to ensure accuracy, cultural sensitivity, and editorial integrity—enabling content to be easily monetised in global markets.

Benjamin Field, Co-founder and director at Deep Fusion Films, said: “Welcoming Obi to the team at such a pivotal moment makes him ideally placed to help shape the next generation of creative tools. His talent and energy are already shaping the way we think about the future of storytelling tools and celebrate the creative outcomes of cross-fertilising knowledge and skills. With the launch of Verbl, we’re proving that AI can drive opportunity—not redundancy. This isn’t about replacing people; it’s about building better workflows, supporting the creative economy, and creating skilled jobs in the process.”

Deep Fusion Technologies will operate under a 'Creative First' ethos, developing software that enhances the creative process across all Deep Fusion Films productions. Where existing tools do not fulfil creative demands, DFT will build bespoke solutions—designed not to replace human creativity, but to unlock it.

This is the second strategic hire in three months for Deep Fusion Films following the appointment of former ITV/BBC development executive Charissa Coulthard to the newly created role of development executive, charged with shaping the next generation of unscripted formats across Deep Fusion’s digital-first and traditional broadcast slates.

Deep Fusion intends to focus on strong, responsible growth in a sector often overshadowed by fears of AI-induced job losses: creating jobs, building capacity, and placing talented people at the heart of technological change.

06/05/2025

NFTS LAUNCHES GROUND-BREAKING COURSE IN AI FOR FILM AND TELEVISION, IN PARTNERSHIP WITH DEEP FUSION FILMS

The National Film and Television School (NFTS), one of the world’s leading training grounds for film, television and games, is proud to launch its first-ever course focused on the transformative and growing role of Artificial Intelligence (AI) in the screen industries.

Developed in partnership with Deep Fusion Films, producers of Virtually Parkinson and Hammer: Heroes, Legends and Monsters, and pioneers in the ethical use of AI within the creative process, the ground-breaking new Certificate in AI Protocols and Practices for Film and Television will be delivered online weekly over six months.  


Created in response to a rapidly evolving landscape, the course explores the growing need for critical, creative and ethical engagement with AI, equipping participants to enhance and streamline their creative practice in film and television while examining the responsibilities and challenges that come with AI’s use.

Applications are now open for the part-time Certificate in AI Protocols and Practices for Film and Television, designed for industry professionals, freelancers, and emerging creatives eager to deepen their understanding of AI’s growing influence across the film and television pipeline. Led by Ben Field, co-founder and CEO of Deep Fusion Films, the course will feature weekly sessions combining workshops, masterclasses and group tutorials.

Participants will gain hands-on experience with cutting-edge AI tools and workflows, alongside critical insights into the legal, ethical and cultural considerations they raise. From copyright and consent to bias, authorship and intellectual property, the course tackles the real-world issues shaping the future of storytelling.

Whether streamlining production, enhancing post, or unlocking new creative formats, the course goes beyond technical training, providing insights into not just how to use AI, but when, why, and with what impact.

While much of the discourse around AI focuses on efficiency or disruption, this unique course offers a more expansive view. It positions AI not as a threat, but as a transformative opportunity. Its aim is to teach that, if guided by ethical awareness and creative insight, AI use can break down barriers, spark collaboration, and enable entirely new storytelling possibilities. Covering every stage of the process from development through to distribution, the course equips participants with both the practical skills and critical thinking needed to lead in a fast-changing industry.

The course opens for applications just ahead of the Creative Cities Convention’s inaugural Skills Summit taking place in Bradford on 7th and 8th May, supported by NFTS Leeds and other leading industry partners. The National Film and Television School’s Director of Curriculum Mark Readman will join Deep Fusion’s Ben Field for a live session exploring the intersection of AI, creativity and screen industry skills on 7th May.   

To support greater accessibility and industry inclusion, the NFTS is partnering with PACT, the UK’s leading screen sector trade body, to offer two 50% scholarships for applicants from PACT member companies.

Jon Wardle, Director of the NFTS, commented:
“At the NFTS, we’re committed to preparing storytellers not just for today’s industry, but for where it is headed. While much of the current conversation around AI focuses on disruption, we see its true potential as a creative enabler if used responsibly. This new course is about more than mastering tools, it’s about developing the critical thinking and ethical awareness needed to harness AI in ways that will open doors, spark collaboration and unlock entirely new forms of storytelling.”

Benjamin Field, co-founder and CEO of Deep Fusion Films added: “AI is already reshaping the screen industries, not in some distant future, but right now, on real productions, affecting real jobs. This course isn’t about fear or hype. It’s about giving creatives the tools, context and critical confidence to work with AI responsibly and artistically. We’re not teaching machines to be creative, we’re helping people use machines to unlock new creative possibilities, while protecting the values and rights that underpin the industry.”

Applications are now open, with the course commencing in September 2025. Apply via the NFTS website: nfts.co.uk/ai-protocols-and-practices-film-and-television