30th March 2026
“Welcome to the March 2026 edition of our Technology & Digital round-up. This month we’re looking at further proposed amendments to the EU AI Act, the CMA’s guidance on using agentic AI and Ofcom’s calls on major platforms for tougher age checks.”
If you’d like to receive the Technology & Digital round-up and other similar updates direct to your inbox, please click here.
Ready to protect your business against cyber-attacks? Click here to access our cybersecurity and data protection tool.
Get in touch with Sally Mewies, Andrew Northage, Nick Stubbs, Paul Armstrong, Luke Jackson, Matthew Lingard or any member of our Technology & Digital team if you have any queries or need advice or assistance.
Here are your top stories for March.
The Council of the EU has agreed its position on targeted amendments designed to simplify and harmonise implementation of the AI Act as part of the wider “Omnibus VII” simplification package. The aim is to reduce administrative burden and give businesses clearer, more proportionate rules ahead of the Act’s phased roll‑out.
A key addition is the introduction of an explicit ban on AI systems generating non‑consensual sexual or intimate content or child sexual abuse material, strengthening the Act’s safeguards against the harmful use of AI. The Council has also confirmed fixed compliance deadlines for the high‑risk regime – 2 December 2027 for stand‑alone high‑risk AI systems and 2 August 2028 for high‑risk AI systems embedded in products – providing greater legal certainty around compliance timelines.
Further changes include reinstating the obligation for providers to register their AI systems in the EU database for high-risk systems, even where they self-assess the system as not high-risk.
Following the Council’s approval, the amendments will now be considered by the European Parliament.
The Competition and Markets Authority (CMA) has issued new guidance for businesses using agentic AI in customer interactions, stressing that consumer law applies equally to AI and human agents, and that businesses remain responsible for their AI systems’ actions. The guidance highlights four core expectations of businesses deploying AI agents in consumer-facing roles.
The CMA also published accompanying research showing that current use of agentic AI is limited to controlled functions – such as customer service and internal business operations – but may expand into agentic commerce, where AI agents monitor prices or contract terms and trigger actions over time. This guidance has been described as “initial,” signalling that further AI‑specific guidance is likely to follow.
“The CMA’s latest publications show that the regulator is still feeling its way through the implications of agentic AI, but one message is already unmistakable: consumer law applies in full, whether a decision is made by a person or by an AI system. Businesses remain responsible for the actions and outputs of their AI agents, must be transparent about their use, and must ensure those systems don’t mislead, manipulate or erode consumer choice.”

Ofcom has ordered Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube to show how they will stop under‑13s accessing their services and strengthen wider child‑safety measures, giving them a deadline of 30 April 2026 to explain their plans. Ofcom will then report on these responses and the next steps in May.
Although Ofcom has already taken enforcement action under the Online Safety Act, including disrupting access to child sexual abuse material and securing age checks on high‑risk services, it warns that industry efforts remain inadequate – many firms are still failing to make children’s safety a core design priority.
The regulator has set out four required actions: 1) enforcing effective minimum‑age policies, 2) putting in place failsafe grooming protections, 3) ensuring safer algorithmic feeds, and 4) ending product testing on children, with mandatory risk assessments before major updates.
“Ofcom’s stance makes clear that child safety can no longer be treated as a secondary consideration. This reinforces the need to build robust age assurance, safer algorithms and responsible product design into your platforms from the outset, or face heightened regulatory scrutiny.”

If you have queries about any of the points covered in this edition of the Technology & Digital round-up, or need further advice or assistance, please get in touch with Sally, Andrew, Nick, Paul, Luke, Matthew or one of our Technology & Digital experts.
Want to watch a previous webinar? Visit our digital academy, home to a library of digital content including webinars, bite-sized video nuggets and podcasts, as well as our 60-second videos on what an NFT is and what a blockchain is.
Want to learn more from our Technology & Digital experts and be the first to receive important updates, developments and events from the team? Then visit our #WMTechTalk page or sign up for our newsletter, the Technology & Digital round-up here.
“These amendments are intended to strike the right balance between innovation and accountability, but it remains uncertain whether they will be adopted. For now, if you’re looking for regulatory certainty, you may have to wait a little longer.”
– Paul Armstrong, Director, Commercial