The UN’s refugee data shame, and what needs to be done

Thursday June 24 2021

A Rohingya refugee holds ID cards, in Shamlapur refugee camp, Cox’s Bazar, Bangladesh on 25 March, 2018. PHOTO | FILE


I saw this coming, and I wish I had been wrong.

Back in 2017, I wrote of the risks of the UN’s refugee agency, UNHCR, collecting biometric registration data from Rohingya refugees, noting that the data could be used to drive unwilling repatriation; that collecting such data may make refugees believe their access to aid depends upon providing such data; and that – once collected or shared – such biometric data is virtually impossible to get rid of.

Nearly four years later, a report from Human Rights Watch (HRW) says these worst-case scenarios have come true: A detailed database of the Rohingya refugee population has been handed over to Myanmar’s government, which drove them across the border into Bangladesh almost four years ago. The same military that conducted the (most recent) genocide against the Rohingya now holds the biometric data of the population it has tried to eradicate.

A UN investigation team described Myanmar’s treatment of its minority Rohingya population as war crimes and crimes against humanity, characterised by brutal physical force, civilian casualties, villages razed to the ground, as well as internet shutdowns and information blackouts.

Refugees in the camps in Bangladesh told HRW they accepted the need to be registered with UNHCR to be recognised as refugees and get services. But they said they didn’t get a chance to opt out of a government-backed digital identity card – a process UNHCR also handled.


HRW reports that the data connected to those “Smart Cards”, including biometric scans, was shared with Myanmar, often against the refugees’ wishes. UNHCR denies HRW’s claims that it misled refugees or broke its own data protection rules.

There are so many failures in what has happened: institutional failures to abide by organisational policies and guidelines; decision-making failures in permitting the data-sharing to happen; moral failures in UNHCR’s irresponsible actions and continued denial of its role in what happened; and sector-wide failures in the utter lack of accountability mechanisms – to name just a few.

These failures have real impacts, and biometric data is immutable – it will stay linked to the refugees’ bodies until they die. Now that the data has been shared, it is impossible to take it back.

According to HRW, the data included not only the biometric data of some 830,000 individuals, but also details of their family compositions, their places of origin, and information on their relatives overseas.

Once they learned their biometric data had been shared, refugees interviewed by HRW said they feared being targeted for forced repatriation, or for retribution if they return to Myanmar. Some went into hiding when Bangladesh attempted to begin returns in 2019, using names from the database.

Targeted identification of persecuted populations to facilitate killings and violence has long been a tactic of genocidal regimes; only this time the data is digitised – fast to access, quick to scale, and easy to share. Meanwhile, the Myanmar military has been purchasing spyware that can “extract data from smartphones, access phone conversations, and monitor people’s movements”, thus demonstrating an appetite to use technical tools for repression.

In its rebuttal, UNHCR says refugees consented. But real consent in situations of such power asymmetry is practically impossible. And there is a deeper, more insidious harm here.

That data was taken from the bodies of human beings, and shared with those who explicitly want to cause them harm. It is a betrayal of their right to self-determination, of dignity, of their very personhood. At a time when personal data rights are a core part of data protection legislation in areas of the world where UNHCR and other humanitarian agencies have their headquarters, the act of treating personal data so callously and with such little regard for the wishes and well-being of the people it came from is inexcusable.

There is no way that the personal data of nearly a million European people would be treated like this without a massive outcry, without resignations and policy overhauls, without fines, firings, and legal ramifications.

It should go without saying, but here goes: A person’s nationality, citizenship or lack thereof, country of origin, or ethnicity should not affect the way in which they are treated, and this extends to the way in which their data is treated. And instead of courts and data commissioners, what recourse (other than social media outrage, articles like this one, or reports like HRW’s) do the Rohingya have?