Jean was 16 when he was left outside the front door of UK Visas and Immigration’s Lunar House in Croydon, alone, frightened and without documents, after fleeing violence in his home country in Central Africa, according to an account given to The Independent. He said he was traumatised by what he had witnessed and that familiar sights, such as people in uniform, revived those memories. [1]

Thousands of unaccompanied asylum-seeking children reach the UK each year, the majority aged 16 or 17, and in the year ending March 2025 there were 3,707 asylum claims from lone children, The Independent reported. For those under 18, local authorities are legally responsible for providing safe accommodation, basic support and help with claims; misclassification as adults can strip them of that protection. [1]

Charities and inspectors say that misclassification is widespread. Data obtained by the Helen Bamber Foundation shows at least 678 children in 2024 were wrongly classified as adults after a human “visual assessment” at the border, and the foundation’s wider reporting documents hundreds more cases in which children were placed in adult settings, exposing them to abuse and exploitation. According to the Helen Bamber Foundation, 90 local authorities received 1,335 referrals in 2024 and independent checks found 56% of those sent to adult settings were in fact children. [1][2][6]

The Independent Chief Inspector of Borders and Immigration, David Bolt, found that factors such as “lack of eye contact” were being used to make age judgements and that children were being “pressured” into declaring they were over 18; in a sample of 55 cases examined by the inspector in which the Home Office had judged the person to be “significantly over 18”, 76% were later found to be children. Separate investigations by The Guardian and others have reported that flawed visual assessments resulted in at least 1,300 children being incorrectly deemed adults over an 18-month period. [1][3][5]

Ministers now plan to supplement or replace human judgement with AI facial-recognition age‑estimation technology, a move The Independent reported following publication of a government contract notice seeking “an algorithm that can accurately predict the age of a subject”. The three-year contract, starting in February next year and valued at about £1.3 million, was announced by then-Home Office minister Dame Angela Eagle, who described the technology as the “most cost-effective option” and said it would be “fully integrated into the current age assessment system over the course of 2026”. [1][7]

The proposal has drawn strong opposition from charities and rights groups, which warn that facial age estimation is unproven for this purpose and risks replicating or amplifying existing errors and biases. Kamena Dorling, director of policy at the Helen Bamber Foundation, said the plans were “concerning unless significant safeguards are put in place”, and warned that AI cannot account for trauma, malnutrition and exhaustion that can make young people appear older. Anna Bacciarelli, senior AI researcher at Human Rights Watch, said the policy was “misguided at best, and should be scrapped immediately”, arguing there are no standardised industry benchmarks and no ethical way to train and audit the technology on comparable populations. [1][2][6]

Critics note that existing practice already relies on brief visual assessments that have led to dangerous placements in adult accommodation and detention; The Guardian and Helen Bamber Foundation reports have called for decisions about lone children to be removed from the Home Office and handed to independent professionals with faster, more humane processes to prevent children being left in limbo for years. Those calls highlight systemic failings rather than merely isolated mistakes. [3][4][6]

The Home Office defended its plans, saying “Robust age assessments are a vital tool in maintaining border security” and that it will “start to modernise that process in the coming months through the testing of fast and effective AI age estimation technology”, adding that integration would be subject to testing and assurance. It has not clarified at which stage of the asylum process the technology would be used, or how systems would be validated to account for the effects of trauma on appearance. [1]

For many who have been misclassified the consequences are life‑altering. Jean described a late-afternoon interview at which officials “said ‘you are not a child, saying you are a liar’”, after which he was housed in a hostel with adults and spent years sleeping rough before charities helped him secure a fresh asylum claim and, eventually, recognition and sanctuary. Speaking about the government’s plans to use AI he said: “It’s a way of not treating people as human beings. They are treating us as a tool to train their AI.” [1]

As the government moves towards testing and potential deployment, industry data and NGO reporting underscore a tension between the state’s stated goals of efficient border management and the practical, ethical and safeguarding risks of delegating age assessments to automated systems. According to analysis by advocacy groups and investigative reporting, any shift to AI will require transparent benchmarks, independent oversight and guarantees that children will not be deprived of protection while experiments are carried out. [2][3][7]

📌 Reference Map:

  • [1] (The Independent) - Paragraph 1, Paragraph 2, Paragraph 3, Paragraph 4, Paragraph 5, Paragraph 6, Paragraph 8, Paragraph 9
  • [2] (Helen Bamber Foundation) - Paragraph 3, Paragraph 6, Paragraph 10
  • [3] (The Guardian) - Paragraph 4, Paragraph 7, Paragraph 10
  • [4] (The Guardian) - Paragraph 7
  • [5] (The Guardian) - Paragraph 4
  • [6] (Helen Bamber Foundation) - Paragraph 3, Paragraph 6, Paragraph 7
  • [7] (The National) - Paragraph 5, Paragraph 10

Source: Noah Wire Services