Artificial Intelligence · December 2, 2025


Human in the Loop AI in Public Procurement

Human in the loop AI is going to be the difference between “useful assistant” and “future PACAC inquiry exhibit”. Under the Procurement Act 2023, that line really matters.

Below is a practical guide, focused on the UK public sector, to building human in the loop AI for procurement, tied explicitly to the new regime.

1. Why the Procurement Act 2023 changes the AI conversation

The Procurement Act 2023 came fully into force on 24 February 2025, replacing the previous patchwork of EU-derived regulations and creating a single regime for most public procurement in the UK.

Section 12 of the Act requires contracting authorities, when carrying out a covered procurement, to have regard to the importance of:

  • delivering value for money
  • maximising public benefit
  • sharing information so suppliers and others can understand policies and decisions
  • acting, and being seen to act, with integrity
  • reducing barriers for SMEs(GOV.UK)

The Government’s Transforming Public Procurement programme and the new Central Digital Platform also raise the bar on transparency, with more notices and more data about how decisions are made.(GOV.UK)

At the same time, the UK’s AI regulation white paper sets out five cross-cutting principles for regulators and organisations using AI: safety and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress.(GOV.UK)

Put together, this means: if you use AI anywhere in your procurement lifecycle, you will need a clear story about oversight, fairness, transparency and accountability. That is exactly what “human in the loop” is for.

2. What “human in the loop” should mean in public procurement

The ICO’s guidance on AI and data protection distinguishes between:

  • AI assisted decisions where humans remain clearly in charge
  • AI influenced decisions where a system provides a score or recommendation that humans usually follow
  • Solely automated decisions where no meaningful human involvement takes place(ICO)

Under UK GDPR, individuals have specific rights where decisions are based solely on automated processing and have legal or similarly significant effects. Award, exclusion and debarment decisions in public procurement will almost always fall into that “significant effect” category.(ICO)

The ICO has also set a high bar for “meaningful human intervention”. It is not enough for a person to click “approve” at the end of an opaque automated process. Meaningful intervention requires:

  • interface design that allows the reviewer to understand and challenge outputs
  • training so reviewers know when to question or override AI
  • authority and support for staff to change the outcome when needed(ICO)

So in public procurement, “human in the loop” should mean at least this:

AI may propose. A trained, empowered and accountable human must decide.

Anything weaker risks both legal challenge and reputational damage.

3. A simple AI use case taxonomy for procurement teams

To make governance workable, it helps to classify AI use cases by risk and match them to clear rules.

Tier 1 – Low risk “productivity assistant”

AI supports routine work, with no direct legal or commercial effect:

  • drafting early-stage specifications, market engagement notices or emails for buyer review
  • summarising supplier questions, tender documents or evaluation comments
  • searching past contracts, framework call-offs or policies
  • spend classification and coding where humans sign off category structures and edge cases

Guardrails

  • Approved tools and prompt templates
  • Clear statement that outputs are drafts only
  • No personal data beyond ordinary business contact details without a Data Protection Impact Assessment (DPIA) check

Tier 2 – Medium risk “decision support”

AI informs decisions that matter, but humans must clearly retain control:

  • flagging potential conflicts of interest, abnormal bids or pricing outliers
  • highlighting evaluation comments that may be inconsistent with published criteria
  • supplier risk scoring that blends financial, performance and ESG indicators
  • predicting where contract performance issues are likely, for management attention

Guardrails

  • Mandatory DPIA where personal data is used, as recommended by the ICO for major AI projects(ICO)
  • Documented methodology for scores, thresholds and weights
  • Formal sign-off that humans understand the model’s limitations and do not treat outputs as binding
  • Clear record of how AI-generated insights were considered, accepted or rejected

Tier 3 – High risk “decision automation”

AI is involved directly in:

  • supplier exclusion and debarment decisions
  • evaluation scoring that directly affects rankings
  • award recommendations
  • major contract management actions such as termination or step-in

Guardrails

For most authorities, the safest position under the Act and UK GDPR is:

No fully automated decisions for Tier 3. Ever.

If AI is used at all here, it should be strictly as decision support, for example:

  • pre-scoring sections as a draft that evaluators must review and edit
  • providing consistency checks, such as flagging where scores and narrative do not align
  • highlighting potential patterns of bias or inconsistency across evaluators

Every final score, ranking and award decision should be traceable to a human decision maker, not a model.
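To make the tiering workable day to day, it can help to hold use cases in a structured register rather than a policy document. The sketch below is illustrative only: the class names, fields and guardrail strings are assumptions, not a standard, and the mapping simply mirrors the guardrails listed above.

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    """Risk tiers from the taxonomy above."""
    PRODUCTIVITY_ASSISTANT = 1   # low risk: drafting, summarising, search
    DECISION_SUPPORT = 2         # medium risk: flags, scores, predictions
    DECISION_AUTOMATION = 3      # high risk: never fully automated

@dataclass
class AIUseCase:
    name: str
    tier: Tier
    uses_personal_data: bool

    def required_guardrails(self) -> list[str]:
        """Map each tier to the minimum controls described above."""
        guardrails = ["approved tools only", "outputs are drafts"]
        if self.uses_personal_data:
            guardrails.append("DPIA check")
        if self.tier is Tier.DECISION_SUPPORT:
            guardrails += ["documented methodology",
                           "record of accept/reject decisions"]
        if self.tier is Tier.DECISION_AUTOMATION:
            guardrails.append("decision support only - "
                              "human makes every final decision")
        return guardrails

# Example: a supplier risk-scoring tool lands in Tier 2
use_case = AIUseCase("Supplier risk scoring", Tier.DECISION_SUPPORT,
                     uses_personal_data=True)
print(use_case.required_guardrails())
```

Even a register this simple forces the two questions that matter: which tier is this, and does it touch personal data?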

4. Connecting human in the loop to the Section 12 objectives

4.1 Delivering value for money

AI can help analyse options, benchmark prices and identify patterns of waste or duplication. That directly supports the value for money objective in section 12.(GOV.UK)

Human oversight is needed to:

  • check that “cheapest” is not being confused with “best overall outcome”
  • weigh wider social value, environmental and innovation factors now emphasised in the National Procurement Policy Statement and social value guidance(socialvalueportal.com)
  • sense-check model outputs against market knowledge and clinical or operational realities

4.2 Maximising public benefit

AI could support better public benefit by:

  • identifying suppliers with stronger local employment, skills or sustainability offers
  • modelling whole life carbon or social value impacts

But if models are trained mainly on historic data, they can easily “bake in” the under-representation of SMEs, VCSEs or newer market entrants. The Act explicitly expects authorities to consider and reduce barriers for SMEs.(GOV.UK)

Human reviewers should therefore challenge:

  • whether training data and assumptions unfairly favour large incumbents
  • whether thresholds, risk scores or financial ratios are proportionate for SMEs

4.3 Sharing information and transparency

The Act’s transparency ambition includes extensive notice publication on the Central Digital Platform across the procurement lifecycle.(GOV.UK)

That naturally extends to AI:

  • authorities should be ready to explain in plain language where and how AI was used
  • evaluation reports should note where AI tools assisted, and how human judgement was applied
  • internal audit files should include enough documentation to reconstruct decisions

This dovetails with the AI White Paper’s emphasis on appropriate transparency and explainability, tailored to the risks in each context.(GOV.UK)

4.4 Acting, and being seen to act, with integrity

The Act puts integrity at the heart of the new regime.(GOV.UK)

Unchecked AI use can undermine that in several ways:

  • “black box” scoring that cannot be explained to unsuccessful bidders
  • unrecorded use of generative AI to draft sensitive correspondence or evaluation narratives
  • procurement data being used by external AI suppliers in ways that conflict with confidentiality obligations

Human in the loop controls help preserve integrity by ensuring:

  • clear lines of accountability for every decision
  • full audit trails of prompts, outputs and human changes
  • conscious choices about the trade-off between convenience and fairness

5. A practical human in the loop control framework

You do not need a 100-page policy to get started. Focus on clear roles, a simple lifecycle and robust documentation.

5.1 Roles and responsibilities

At minimum, define:

  • Senior Responsible Owner for AI in Procurement
    • accountable for how AI tools are used across commercial activity
    • signs off higher risk use cases and associated DPIAs
  • AI Product Owner or System Owner
    • responsible for day-to-day operation of specific tools or models
    • maintains documentation, training material and user access lists
  • Data Protection Officer / Information Governance
    • advises on DPIAs, lawful bases and controller / processor allocations, drawing on ICO guidance(ICO)
  • Procurement and Evaluation Leads
    • ensure tender documents and evaluation processes reflect AI use
    • provide feedback on how AI tools work in practice

Where external platforms or models are used, make sure controller and processor roles are properly mapped and recorded. The ICO highlights that AI deployments often result in complex joint controller and processor relationships, so you should not rely on generic SaaS contract templates.(Society for Computers & Law)

5.2 Lifecycle controls for AI in procurement

Use a simple lifecycle that mirrors the way you already manage procurements.

  1. Design and justification
    • define the specific procurement problem you are trying to solve
    • decide which tier (1, 2 or 3) the use case falls into
    • check alignment with the AI regulation principles, especially fairness, accountability and contestability
  2. Data and training
    • understand what training data or document corpora are being used
    • check for personal data, special category data or commercially sensitive material
    • where personal data is involved and risks are high, complete a DPIA and follow ICO guidance on AI specific DPIA considerations(ICO)
  3. Testing and validation
    • test on real but controlled examples before deploying at scale
    • compare AI suggested scores or rankings with human baselines
    • check for systematic differences in how SMEs, VCSEs or newer suppliers are treated
  4. Deployment with human in the loop
    • design user interfaces so evaluators can see why a suggestion has been made, not just the output
    • make it easy to disagree with the AI, and require a note where the human accepts it for high impact decisions
    • ensure reviewers have training in both the tool and the relevant legal framework
  5. Monitoring and review
    • track how often humans override AI suggestions and why
    • review models and prompts after each major procurement round
    • feed issues back into corporate AI governance and risk registers
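The override tracking in step 5 does not need specialist tooling. A minimal sketch of a log with a periodic summary, where the record fields and reason strings are assumptions for illustration:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class OverrideLog:
    """Records whether a human accepted or overrode each AI suggestion."""
    entries: list = field(default_factory=list)  # (procurement_ref, overridden, reason)

    def record(self, procurement_ref: str, overridden: bool, reason: str = "") -> None:
        self.entries.append((procurement_ref, overridden, reason))

    def override_rate(self) -> float:
        """Share of suggestions the human changed; 0.0 if nothing logged."""
        if not self.entries:
            return 0.0
        return sum(1 for _, overridden, _ in self.entries if overridden) / len(self.entries)

    def top_reasons(self, n: int = 3):
        """Most common override reasons, to feed back into governance reviews."""
        return Counter(r for _, o, r in self.entries if o and r).most_common(n)

log = OverrideLog()
log.record("PROC-001", overridden=False)
log.record("PROC-002", overridden=True, reason="score inconsistent with criteria")
log.record("PROC-003", overridden=True, reason="score inconsistent with criteria")
print(f"Override rate: {log.override_rate():.0%}")  # Override rate: 67%
print(log.top_reasons())
```

An override rate near zero is itself a warning sign: it may mean reviewers are rubber-stamping rather than exercising the meaningful intervention the ICO expects.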

6. What documentation should you keep?

The accountability principle in UK GDPR expects you to be able to show how you complied, not merely claim that you did. The ICO explicitly recommends DPIAs and AI-specific documentation as evidence of good governance.(ICO)

For each AI use case, keep at least:

  • AI use case summary
    • what the tool does, who uses it and which tier it is in
    • which stages of the procurement lifecycle it touches
  • Risk assessment and DPIA (where applicable)
    • legal bases, categories of personal data, and key risks
    • mitigation measures, including human in the loop arrangements
  • Technical and procedural notes
    • description of models or services used, including whether supplier fine tunes on your data
    • version history of prompts, templates and model settings
  • Decision logs
    • for significant procurements, records showing how AI suggestions were considered
    • examples where humans overrode the AI and why
  • Supplier facing transparency text
    • short paragraphs you use in procurement documents explaining whether and how AI tools are being used

This documentation can be proportionate. A simple spreadsheet or OneNote notebook linked to your Procurement Act file structure is far better than nothing.
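If a spreadsheet is your register, it can still be generated from structured records so the columns stay consistent across use cases. A minimal sketch, where the column names and example rows are assumptions for illustration:

```python
import csv
import io

# Hypothetical register rows mirroring the documentation list above
register = [
    {"use_case": "Tender summarisation", "tier": 1, "dpia_needed": "No",
     "lifecycle_stage": "Evaluation", "human_in_loop": "Buyer edits all outputs"},
    {"use_case": "Supplier risk scoring", "tier": 2, "dpia_needed": "Yes",
     "lifecycle_stage": "Contract management", "human_in_loop": "Category manager reviews scores"},
]

# Write the register as CSV, ready to open as a spreadsheet
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=register[0].keys())
writer.writeheader()
writer.writerows(register)
print(buffer.getvalue())
```

The point is not the tooling but the discipline: every use case gets the same fields, so gaps (a missing DPIA, an undefined human in the loop arrangement) are visible at a glance.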

7. Practical guardrails for procurement teams

To keep things real, it helps to translate policy into a simple “do” and “do not” list.

7.1 Things AI can do with light touch controls

  • Drafting early versions of specifications, PINs and clarifications, for buyers to edit
  • Summarising lengthy supplier submissions or policy documents
  • Suggesting evaluation questions that align with published award criteria
  • Classifying spend and suppliers to support pipeline and category strategies

7.2 Things AI should only do with strong human in the loop controls

  • Flagging potential conflicts of interest or abnormally low tenders
  • Identifying patterns in pricing or performance data that might signal risk
  • Suggesting draft scores on qualitative questions for evaluator review
  • Prioritising suppliers for contract management attention

These require at least a structured review process and, where personal data features heavily, a DPIA.

7.3 Things AI should not do in the current regime

  • Make final award, exclusion or debarment decisions
  • Apply pass or fail decisions to mandatory criteria without human review
  • Generate evaluation narratives that are sent to suppliers without careful editing
  • Ingest or generate highly sensitive personal data (for example health information or detailed employee records) without explicit legal analysis and safeguards

These are the areas most likely to attract scrutiny from auditors, courts or the ICO if something goes wrong.

8. Where to start in the next 90 days

If your organisation is only beginning to formalise its approach, a pragmatic three-step plan might look like this:

  1. Take stock of reality
    • survey commercial, procurement and IT teams to understand which AI tools are already in use
    • classify each into Tier 1, 2 or 3, and pause anything that looks like Tier 3 automation
  2. Pick two safe high value pilots
    • for example, document summarisation and searching previous contracts during planning, using clearly approved tools
    • wrap them in simple human in the loop controls and document the approach
  3. Update your core artefacts
    • add a short AI section to your procurement strategy or standing orders
    • build a one page “AI in procurement” guidance note for buyers and evaluators
    • adjust evaluation and award templates so they can record where AI has supported the process

From there, you can iteratively add more sophisticated use cases, knowing that you have a governance spine that stands up against both the Procurement Act and the UK’s AI regulatory direction of travel.

Used well, AI can absolutely help you deliver the Act’s objectives. The key is to make sure it is seen for what it should be in public procurement: a powerful assistant, not an invisible decision maker.


References

  1. Legislation.gov.uk, Procurement Act 2023, Section 12 – Covered procurement objectives
    https://www.legislation.gov.uk/ukpga/2023/54/section/12
  2. Legislation.gov.uk, Procurement Act 2023 – Contents and status
    https://www.legislation.gov.uk/ukpga/2023/54/contents
  3. NHS Shared Business Services, Procurement Regulations – Transforming Public Procurement – The Procurement Act 2023
    https://www.sbs.nhs.uk/services/procurement-services/procurement-regulations/
  4. Cabinet Office, Procurement Act 2023 – Guidance documents
    https://www.gov.uk/government/collections/procurement-act-2023-guidance-documents
  5. Government Commercial Function, The Official Transforming Public Procurement Knowledge Drops
    https://www.gov.uk/guidance/the-official-transforming-public-procurement-knowledge-drops
  6. UK Government, Department for Science, Innovation and Technology, AI regulation: a pro-innovation approach – White paper
    https://www.gov.uk/government/publications/ai-regulation-a-pro-innovation-approach/white-paper
  7. UK Government, A pro-innovation approach to AI regulation: government response to consultation (PDF)
    https://assets.publishing.service.gov.uk/media/65c1e399c43191000d1a45f4/a-pro-innovation-approach-to-ai-regulation-amended-governement-response-web-ready.pdf
  8. Information Commissioner’s Office, Guidance on AI and data protection
    https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/artificial-intelligence/guidance-on-ai-and-data-protection/
  9. Information Commissioner’s Office, What are the accountability and governance implications of AI?
    https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/artificial-intelligence/guidance-on-ai-and-data-protection/what-are-the-accountability-and-governance-implications-of-ai/
  10. Information Commissioner’s Office, How to use AI and personal data appropriately and lawfully (PDF)
    https://ico.org.uk/media/for-organisations/documents/4021860/how-to-use-ai-and-personal-data-appropriately-and-lawfully.pdf
  11. Information Commissioner’s Office, Governance and accountability in AI – data protection audit framework toolkit
    https://cy.ico.org.uk/for-organisations/advice-and-services/audits/data-protection-audit-framework/toolkits/artificial-intelligence/governance-and-accountability-in-ai/
  12. UK Government, Information and guidance for suppliers – Procurement Act 2023
    https://www.gov.uk/government/collections/information-and-guidance-for-suppliers
  13. Social Value Portal, The Procurement Act 2023 and Social Value – what authorities and suppliers need to know
    https://www.socialvalueportal.com/news-and-insights/the-procurement-act-2023-social-value-what-authorities-and-suppliers-need-to-know
  14. NEUPC, Procurement Act 2023
    https://www.neupc.ac.uk/procurement-act-2023