Most organisations overestimate how well they understand AI. The result? They buy tools that quietly create new legal and reputational risks.

This is why we wrote the AI Procurement chapter in Global Legal Insights' AI, Machine Learning & Big Data 2026, which was published on 11 May 2026. Whether you are beginning to explore AI tools or already deep into vendor selection, the chapter offers a structured, practical framework to help legal, compliance and procurement teams ask the right questions, avoid common pitfalls, and put robust contractual protections in place before commitments are made.

What’s in the chapter

The chapter is a practical guide, taking you from preparatory groundwork to implementing contractual frameworks. Here’s what we cover:

  • the readiness gap: just over half of workers expect to use AI in the next 12 months, yet only 21% feel confident doing so. We explain why that gap stalls procurement;
  • common failures: buying tools before defining the problem; unclear data ownership; no adoption plan; weak risk assessments. We set out how these are all avoidable;
  • the people problem: nearly half of global leaders expect cultural resistance to slow AI adoption. If your people won’t use it, you’ve bought expensive shelfware;
  • due diligence checklist: this includes issues such as financial viability; training data provenance; data retention; security architecture; regulatory readiness across the EU, US and UK; incident reporting; and AI insurance;
  • board-level questions: what problem are we trying to solve? Who owns the output? How do we pause or decommission safely?;
  • risk assessment framework: this includes proportionate assessments covering data sensitivity; harm potential; deployment scale; regulatory exposure; and supplier maturity;
  • contractual controls: AI moves faster than the rules that govern it. Contracts are now the primary mechanism for allocating risk. We cover data training restrictions; IP warranties; bias controls; audit rights; explainability; incident escalation and model change governance; and
  • negotiation realities: large LLM providers rarely negotiate their terms. Smaller suppliers’ terms often contain gaps, but those contracts are usually more negotiable.

Why now?

There’s no single AI rulebook. Contracts are now typically how you operationalise your AI strategy and enforce your risk appetite on your own terms. Organisations that get governance right early move faster and avoid costly missteps.

One action today: download the chapter and share it with whoever owns AI purchasing decisions in your organisation. Business teams want AI deployed fast, so they need to understand the risks and opportunities before the ball starts rolling.

Access now

Please get in touch with us directly to discuss how this applies to your organisation.

Authors