Efficiency Gains and the Lingering Specter of Bias

The DWP's AI Tool for ESA Applications

The Department for Work and Pensions (DWP) has revealed that it uses an AI tool called "online medical matching" to help agents decide applications for Employment and Support Allowance (ESA), a benefit for people whose health conditions limit their ability to work.

The tool compares the applicant's self-reported health condition against a centrally maintained list of conditions and selects the closest match, a process it performs with a reported 87% accuracy rate. The match is then used to register the claim on the ESA system via an automated solution, but a human agent reviews each case and makes the final decision on whether to award ESA.
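The DWP has not published how its matcher works, but the closest-match step it describes can be illustrated with simple string similarity. The following is a minimal, hypothetical sketch: the condition list, cutoff value, and `match_condition` function are assumptions for illustration, not the department's actual method.

```python
from difflib import get_close_matches
from typing import Optional

# Hypothetical stand-in for the centrally maintained condition list.
CONDITION_LIST = [
    "anxiety disorder",
    "chronic fatigue syndrome",
    "osteoarthritis",
    "type 2 diabetes",
]

def match_condition(self_reported: str, cutoff: float = 0.6) -> Optional[str]:
    """Return the listed condition closest to the applicant's own wording,
    or None if nothing clears the similarity cutoff (illustrative only)."""
    matches = get_close_matches(self_reported.lower(), CONDITION_LIST,
                                n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

A purely spelling-based approach like this would map "osteo arthritis" to "osteoarthritis", but it would miss paraphrases such as "worn-out knee joints"; that gap is consistent with the article's point that the early spelling-focused version performed far worse than one that considers context.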

Although the tool has been in use since July 2020, the DWP only recently shared details of it. It has processed over 780,000 cases since implementation, saving 42,500 operational hours. Its development was not without setbacks: the initial version, used from 2020 to 2024, achieved only a 35% accuracy rate because it matched on spelling rather than context.

Concerns have been raised about the potential for bias in AI-driven decision-making. Shelley Hopkinson of Turn2us emphasizes the importance of transparency and accountability in these systems, ensuring they work for people, not against them. This concern stems from a previous DWP machine-learning program used to detect welfare fraud, which exhibited bias based on factors like age, disability, and nationality.

The DWP maintains that the ESA matching tool has a low risk of bias as it only considers medical conditions, not personal details. Additionally, agents receive training to critically evaluate the tool's output and avoid simply accepting its results.

The DWP's use of AI aligns with the government's broader plan to leverage technology for economic growth and improved public services. However, the ESA matching tool is the first that the department has published on the government's algorithm transparency register, despite a requirement for all departments to do so. The DWP has resisted calls to publish a complete inventory of its AI tools, citing the need to control how information is distributed.



5 Comments

Noir Black

Relying on algorithms for decisions that affect people’s lives is a recipe for systematic error and discrimination.

BuggaBoom

Even with human oversight, AI mistakes can lead to wrongful claim rejections or delays for those suffering from health conditions.

Noir Black

Saving operational hours shouldn’t come at the cost of potentially overlooking nuances in individual cases.

BuggaBoom

The lack of a complete public inventory of AI tools raises serious transparency and accountability questions.

Noir Black

This tool’s focus on self-reported data might miss the context that only a human can understand fully.
