AI Licensing, Skills, and Readiness: Why UK SMEs Are Caught in the Gap
Note: This article was originally published by Tina Saul on LinkedIn.
UK creative SMEs sit in an AI policy gap where complex licensing questions and outdated skills frameworks collide, creating a governance deficit. The resulting mix of legal risk and unsupported self-teaching holds back confident AI investment, especially for businesses committed to a human-first approach.
Last week, two conversations collided in ways that should concern anyone running a creative business.
In one room, Creative UK's Skills and Futures Network was wrestling with a familiar problem: how do you build a workforce for a creative sector where 94% of firms are SMEs, 30% of workers are freelancers, and the formal skills system barely acknowledges you exist? The Lifelong Learning Entitlement (LLE), launching in 2027, currently omits creative disciplines entirely from its modular offer. Meanwhile, one in three creative vacancies goes unfilled due to skills shortages.
In the other, the Lords Communications Committee was hearing evidence on AI and the creative industries, surfacing structural problems that most creative businesses recognise but cannot solve alone: who owns what, who gets paid, and what transparency actually looks like when your work is being scraped at industrial scale.
These are not separate problems. They are the same problem, viewed from different angles.
How is AI Regulation Affecting UK SME Operations?
Creative SMEs are caught in the middle of a policy vacuum. Licensing frameworks are stuck between big tech operating at planetary scale and individual creators demanding fair compensation. The conversation assumes two parties, but most creative businesses are neither. They are the studios, agencies, and production houses who both create original work and use tools trained on data of uncertain provenance. They face risk on both sides, and clarity on neither.
Crucially, skills frameworks remain decades behind the reality of generative AI. Government assumes formal training pathways exist for this technology. They do not. According to sector research, 97% of creative practitioners are currently self-teaching AI. Fewer than 40% of teams have any structured support.
This is not a training gap. It is a governance gap disguised as a skills problem.
What UK Leaders Actually Need
The agency principals and creative directors I speak to are not asking for more tools. They are asking for clarity.
Can I use this? What are my obligations? How do I protect my own IP while using systems trained on everyone else's? What happens when a client asks whether our work was AI-assisted, and I am not sure of the answer?
These are not abstract policy questions. They are operational decisions that land on a founder's desk every week. And right now, most leaders are making them alone, without clear guidance, in a regulatory environment that changes faster than anyone can track.
New research from the Creative PEC analysing 168 million UK job postings found something telling: since the launch of ChatGPT, demand for creativity and AI skills increasingly appears together, particularly in high-skilled roles located within established creative clusters. The job market is not replacing creativity with AI. It is seeking people who can do both, in tandem, with judgement.
But here is the reality. That co-occurrence is concentrated in regions that already have thick labour markets and creative density. Left-behind areas are not catching up. They are falling further behind. The researchers call it the Matthew Effect: those who have, get more. Those who do not, lose ground.
The Case for Practical Clarity
For creative SMEs, the path forward is not waiting for policy to catch up. It is building internal clarity now, grounded in obligations that already exist, such as the UK's Data Protection Act 2018.
That means understanding your own readiness before investing in tools. It means having honest conversations about where AI helps and where it introduces risk. It means governance that is proportionate to your size, not borrowed wholesale from enterprise playbooks that do not fit.
At KINTAL, we call this human-first AI. Not because it sounds good, but because it describes what actually works. Respecting rights. Clarity of usage. Clean workflows. Responsible production. These are not optional extras for creative businesses. They are the foundations of sustainable adoption.
Most creators would license their work if they were paid fairly, a pattern DACS surveys consistently show. Most teams would adopt AI more confidently if they understood the boundaries. The lack of clarity, not the technology, is the main obstacle.
What Comes Next
I will not pretend this is simple. But I do know that creative SMEs deserve better than being caught between parliamentary inquiries and policy frameworks that were not designed with them in mind.
What they need is not more noise. It is practical clarity on how to work with AI safely and confidently.
If that resonates, we would genuinely like to hear how you are navigating this. What questions are landing on your desk that you cannot answer yet? What would practical support actually look like for your team?
Get a free AI check in just two minutes.

