Developments in AI licensing: what the Disney-OpenAI deal would have meant
As tensions between artificial intelligence (AI) developers and IP rights holders start to shape the future of creative industries, high-profile partnerships have begun to signal how such competing interests may be reconciled. The recently scrapped deal between Disney and OpenAI attracted significant attention as a potential template for IP licensing in the age of generative AI.
Disney and OpenAI’s highly publicised agreement was notable, not because it was the first of its kind, but because it illustrated the direction of travel towards bespoke, negotiated licensing arrangements between AI developers and major rights holders. In light of the impending shutdown of Sora, the deal is perhaps best understood as a snapshot of how the AI-IP landscape was evolving and what it may still become.
Key terms and safeguards (as announced)
Disney and OpenAI announced a three-year licensing agreement in December 2025, under which Sora (a generative AI video creation platform) was to be permitted to use over 200 animated, masked and creature characters (as well as costumes, props, vehicles and environments) from Disney, Marvel, Pixar and Star Wars.
Rather than adopting a traditional licensing model based on upfront or recurring royalties, Disney made a strategic bet on artificial intelligence, reportedly securing a $1 billion equity stake in OpenAI as well as stock warrants allowing it to purchase additional shares if OpenAI’s valuation increased. The deal, as announced, would also have seen Disney integrating OpenAI’s technology, including ChatGPT, into internal workflows for research, marketing and production, with a view to improving efficiency.
The parties’ public statements emphasised a shared commitment to advancing “human-centred AI that respects the creative industries”, with some clear boundaries: notably, the exclusion of real actors’ voices or likenesses, and mutual commitments to maintaining robust controls to prevent the generation of illegal or harmful content.
In the wake of Sora’s shutdown, announced on 24 March 2026, spokespersons for Disney indicated that the entertainment giant would “continue to engage with AI platforms to find new ways to meet fans where they are while responsibly embracing new technologies that respect IP and the rights of creators”.
A note on the technology
Rather than storing or retrieving existing video clips or character models, platforms such as Sora generate new video frames by predicting what each frame should look like based on patterns learned during training. During training, such models learn statistical relationships between visual features and language descriptions.
Systems of this kind typically rely on diffusion models to generate the end product. Diffusion models work by gradually refining random visual noise over many steps until it becomes a coherent image that aligns with the user’s text prompt and the visual patterns learned during training. For video, such systems must also ensure that successive frames remain consistent so that characters, environments and camera angles do not change unpredictably between frames.
In other words, such models do not insert “stored” copyright-protected works (or other IP assets) into videos, but iteratively generate images that become progressively closer to the outputs described in the prompt, based on what the model learned during training.
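The iterative refinement described above can be illustrated with a deliberately simplified sketch. This is not a real diffusion model: `guess_clean_image` is a hypothetical stub standing in for a trained neural network, and `TARGET` stands in for the visual pattern a text prompt "describes". The point is purely structural: the output is built up step by step from random noise, rather than retrieved from storage.

```python
import random

# Toy illustration of iterative denoising, the core idea behind
# diffusion-based generators. All names here (guess_clean_image, TARGET)
# are invented for this sketch; a real system uses a trained neural
# network and operates on millions of pixels across many video frames.

TARGET = [0.2, 0.8, 0.5, 0.9]  # the pattern the "prompt" describes

def guess_clean_image(noisy, step, total_steps):
    """Stub for the learned denoiser: blends the noisy input toward TARGET,
    trusting the 'model' more as the denoising steps progress."""
    weight = step / total_steps
    return [(1 - weight) * n + weight * t for n, t in zip(noisy, TARGET)]

def generate(total_steps=50, seed=0):
    rng = random.Random(seed)
    image = [rng.uniform(0, 1) for _ in TARGET]  # start from pure noise
    for step in range(1, total_steps + 1):       # refine over many steps
        image = guess_clean_image(image, step, total_steps)
    return image

print([round(x, 3) for x in generate()])
```

Nothing in the loop copies a stored asset into the output; each step only moves the current image closer to what the (here, stubbed) model predicts. The legal questions discussed below arise because, in a real system, those predictions are shaped by the training data.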
This distinction is legally significant. Rather than reproducing specific IP assets, such systems generate new content that may nonetheless resemble protected characters if trained on relevant material. In the enforcement context, this raises a number of issues, including: how likely infringing outputs are to reach end users; whether the learned parameters of AI models constitute “infringing copies” for the purposes of secondary copyright infringement; and the extent to which AI developers should bear responsibility for user prompts that result in infringing content.
The Disney-OpenAI agreement can be seen as an attempt to contractually navigate such grey areas by authorising the use of high-value IP in the training and output stages.
Broader legal context
Amid a backdrop of AI-related litigation around the world (including the high-profile dispute between Getty Images and Stability AI) and ongoing debates regarding regulatory intervention, Disney and OpenAI’s agreement reflected a growing expectation among rights holders that the use of their high-value IP by AI systems should be subject to negotiated control and remuneration.
It would likely have contributed to the gradual normalisation of AI-generated franchise content and may have served as a precedent for character licensing in an AI context. It also formed part of a wider trend towards legitimising AI outputs through private contractual frameworks in the absence of comprehensive statutory regimes.
Ongoing considerations
While the agreement was set to offer legal certainty to Disney and OpenAI, entertainment unions such as Equity emphasised that the underlying content of such agreements is the product of professional work by creatives whose rights must be protected. Such comments underscore the importance of giving proper consideration to key issues, including authorship, originality and attribution rights, in any licensing or dispute context involving creative works that are covered by a variety of intellectual property rights.
Key takeaways
For AI developers, the deal demonstrated the commercial and reputational advantages of securing licensed access to premium, IP-rich content. For rights holders, it reinforced the strategic value of established IP portfolios as licensable assets, and signalled a growing reliance on private contractual frameworks in the absence of comprehensive AI regulation.
With the benefit of hindsight, the agreement can be seen as an indicator of a possible future in which bespoke licensing agreements could emerge as the preferred mechanism for managing AI use of high-value IP. The shutdown of Sora does not negate that trajectory but highlights the uncertainty and volatility that continue to shape the evolution of AI and IP collaboration. We will continue to monitor developments in this area as the legal, commercial and technological landscape evolves.
Related article
The future of AI and copyright law: GEMA v OpenAI and Getty Images v Stability AI, 05 January 2026: dycip.com/ai-copyright-gema-openai