Comment & Opinion

Technology & Digital round-up: March 2026

“Welcome to the March 2026 edition of our Technology & Digital round-up. This month we’re looking at further proposed amendments to the EU AI Act, the CMA’s guidance on using agentic AI and Ofcom’s calls on major platforms for tougher age checks.”

- Luke Jackson, Director, Commercial

If you’d like to receive the Technology & Digital round-up and other similar updates direct to your inbox, please click here.

Ready to protect your business against cyber-attacks? Click here to access our cybersecurity and data protection tool.

Get in touch with Sally Mewies, Andrew Northage, Nick Stubbs, Paul Armstrong, Luke Jackson, Matthew Lingard or any member of our Technology & Digital team if you have any queries or need advice or assistance.

Here are your top stories for March.

#1: EU Council advances amendments to streamline the AI Act

The Council of the EU has agreed its position on targeted amendments designed to simplify and harmonise implementation of the AI Act as part of the wider “Omnibus VII” simplification package. The aim is to reduce administrative burden and give businesses clearer, more proportionate rules ahead of the Act’s phased roll‑out.

A key addition is the introduction of an explicit ban on AI systems generating non‑consensual sexual or intimate content or child sexual abuse material, strengthening the Act’s safeguards against the harmful use of AI. The Council has also confirmed fixed compliance deadlines for the high‑risk regime – 2 December 2027 for stand‑alone high‑risk AI systems and 2 August 2028 for high‑risk AI systems embedded in products – providing greater legal certainty around compliance timelines.

Further changes include reinstating the obligation for providers to register AI systems in the EU database for high-risk systems, even where they self-determine that their AI system is not high-risk.

Following the Council’s approval, the amendments will now be considered by the European Parliament.

“These amendments are intended to strike the right balance between innovation and accountability, but it remains uncertain whether they will be adopted. For now, if you’re looking for regulatory certainty, you may have to wait a little longer.”

Paul Armstrong, Director, Commercial

#2: CMA publishes new guidance on responsible use of agentic AI

The Competition and Markets Authority (CMA) has issued new guidance for businesses using agentic AI in customer interactions, stressing that consumer law applies equally to AI and human agents, and that businesses remain responsible for their AI systems’ actions. The guidance highlights four core expectations:

  • Be transparent with your customers when AI agents are used.
  • Train AI agents to comply with consumer law and avoid misleading interactions.
  • Monitor AI performance, with human oversight where needed.
  • Refine AI agents quickly when errors or risks emerge.

The CMA also published accompanying research showing that current use of agentic AI is limited to controlled functions – such as customer service and internal business operations – but may expand into agentic commerce, where AI agents monitor prices or contract terms and trigger actions over time. This guidance has been described as “initial,” signalling that further AI‑specific guidance is likely to follow.

“The CMA’s latest publications show that the regulator is still feeling its way through the implications of agentic AI, but one message is already unmistakable: consumer law applies in full, whether a decision is made by a person or by an AI system. Businesses remain responsible for the actions and outputs of their AI agents, must be transparent about their use, and must ensure those systems don’t mislead, manipulate or erode consumer choice.”

Della Heptinstall, Associate, Competition

#3: Ofcom demands tougher action to keep under-13s off major platforms

Ofcom has ordered Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube to show how they will stop under‑13s accessing their services and strengthen wider child‑safety measures, giving them a deadline of 30 April 2026 to explain their plans. Ofcom will then report on these responses and the next steps in May.

Although Ofcom has already taken enforcement action under the Online Safety Act, including disrupting access to child sexual abuse material and securing age checks on high‑risk services, it warns that industry efforts remain inadequate – many firms are still failing to make children’s safety a core design priority.

The regulator has set out four required actions: 1) enforcing effective minimum‑age policies, 2) putting in place failsafe grooming protections, 3) ensuring safer algorithmic feeds, and 4) ending product testing on children, with mandatory risk assessments before major updates.

“Ofcom’s stance makes clear that child safety can no longer be treated as a secondary consideration. This reinforces the need to build robust age assurance, safer algorithms and responsible product design into your platforms from the outset, or face heightened regulatory scrutiny.”

Sally Mewies, Partner and Head of Technology & Digital

More recent updates…

  • The ICO has issued an open letter to major social media and video‑sharing platforms, calling on them to strengthen age‑assurance measures and move beyond easily bypassed self‑declaration to prevent under‑13s accessing services not designed for them. This reinforces Ofcom’s parallel demands for stricter age checks and safer design, signalling a coordinated regulatory push to ensure tech firms finally put meaningful, enforceable protections at the heart of their products.
  • The UK Government has announced a £2 billion “Quantum Leap” programme to make the UK the first country to deploy quantum computers at scale by the early 2030s. The investment aims to drive breakthroughs in healthcare, high‑paid job creation, and national security, positioning quantum as a defining technology of the future.
  • Ofcom has set out its final rules to drive the UK’s full‑fibre rollout to completion, aiming to boost productivity by expanding access to faster, more reliable broadband. Nearly 78% of homes now have full‑fibre, with networks on track to reach around 29 million properties by 2027. The updated framework supports further investment and could allow gradual deregulation of Openreach – but only where strong competition emerges.
  • UK businesses are being urged to check their company records after a major Companies House glitch briefly allowed logged‑in users to view or edit other firms’ confidential details, including directors’ home addresses and emails. The incident was reported to the ICO and NCSC, and Companies House says the flaw – introduced during a 2025 system update – has now been resolved, with investigations ongoing into whether any unauthorised changes were made.
  • AI company Anthropic is seeking to hire a chemical weapons and explosives expert to help strengthen safeguards that prevent its systems from being misused to provide dangerous weapons‑related information. This reflects growing industry concern about catastrophic misuse of AI, but involving AI systems in handling sensitive weapons knowledge could itself pose safety risks in the absence of international standards or oversight.

…and in other news

  • Organisations worldwide are rushing to create a globally recognised “AI‑free” or “human‑made” label, as demand grows for clearer signals that content or products were produced without generative AI.
  • The UK Government is establishing a £40m AI research lab to accelerate fundamental breakthroughs and tackle core issues such as hallucinations and unpredictable reasoning in current AI systems.
  • UK organisations are being urged by the NCSC to review and strengthen their cyber security posture in light of the evolving conflict in the Middle East, with a heightened risk of indirect cyber threats to those with operations or supply chains in the region.
  • The Business and Trade Committee has launched an inquiry into the opportunities and costs of AI for businesses and the workforce. It will examine AI’s future trajectory, its adoption across the UK economy, impacts on work, skills needs, and whether the UK’s regulatory framework and government strategy are adequate to manage AI’s risks and opportunities.

How we can support you

If you have queries about any of the points covered in this edition of the Technology & Digital round-up, or need further advice or assistance, please get in touch with Sally, Andrew, Nick, Paul, Luke, Matthew or one of our Technology & Digital experts.

Want to watch a previous webinar? Visit our digital academy, home to a library of digital content including webinars, our bite-sized video nuggets and podcasts, including our 60-second videos on what an NFT is and what a blockchain is.

Want to learn more from our Technology & Digital experts and be the first to receive important updates, developments and events from the team? Then visit our #WMTechTalk page or sign up for our newsletter, the Technology & Digital round-up here.

Our people

  • Sally Mewies – Partner, Head of Technology & Digital
  • Nick Stubbs – Partner
  • Andrew Northage – Partner, Regulatory & Compliance
  • Paul Armstrong – Director, Commercial
  • Matthew Lingard – Director, Intellectual Property, Trade Marks & Designs
  • Luke Jackson – Director, Commercial
  • Della Heptinstall – Associate, Competition