An Algorithm Aimed To Help Jordan’s Poor. It Excluded Some In Need, Report Finds

Published 10 months ago
By Forbes | Rashi Shrivastava
Roman Theater in Amman
(Photo by Ali Balikci/Anadolu Agency via Getty Images)

As international organizations push for automation in social assistance programs, a report by Human Rights Watch finds that such algorithms can't always identify the most vulnerable households.


The World Bank is increasingly incentivizing countries to develop technologies that can find and rank people in poverty so they can be provided with cash transfers and social assistance, according to a report by Human Rights Watch. But an algorithm used by the Jordanian government, which ranked households on 57 socio-economic factors in order to distribute funds from the World Bank, was found to be inaccurate and biased, the report says.

A cash transfer program for low-income households in Jordan, called Takaful, uses an algorithm to track indicators like property, car and business ownership, the number of members in a household, and electricity and water consumption to assess which families qualify for assistance. The program, launched by the National Aid Fund (NAF) in Jordan in 2019, was responsible for the allocation of roughly $1 billion provided by the World Bank and has 220,000 families enrolled as beneficiaries. Human Rights Watch, a nonprofit that researches and advocates for human rights, found that the algorithm failed to take into account the economic complexities faced by Jordanians living under the poverty line, leading to some people in dire need being excluded.

Lead researcher Amos Toh, whose team conducted 70 interviews with families and business owners in Jordan since late 2021, says some factors took the country’s social realities into account. For instance, because women comprise a small part of the Jordanian labor force, households that are headed by a woman were more likely to qualify for the program. But other indicators that influence the algorithm, such as asset ownership and electricity consumption, are not appropriate measures of poverty and can make the program seem like a “lottery,” he says.



The study found that some families who were disqualified had recently inherited property but didn’t have the financial means to make ends meet. Other applicants who owned a car that was less than five years old or a business that was worth over $4,300 were automatically excluded by the algorithm, even if the car owner couldn’t afford petrol or the business was failing.
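Based on HRW’s description, these disqualifications appear to operate as hard cut-offs rather than contextual judgments. The sketch below is a hypothetical illustration of how such rules behave; the function, field names and structure are assumptions, and only the two thresholds (a car newer than five years, a business worth over roughly $4,300) come from the examples in the report.

```python
# Illustrative sketch of hard exclusion rules as described in the HRW report.
# The is_excluded() helper and its field names are hypothetical; only the two
# thresholds are drawn from the report's examples, not from NAF's actual code.

def is_excluded(car_age_years=None, business_value_usd=None):
    """Return True if a household is automatically disqualified."""
    if car_age_years is not None and car_age_years < 5:
        return True  # a newer car disqualifies, even if petrol is unaffordable
    if business_value_usd is not None and business_value_usd > 4300:
        return True  # a business over the threshold disqualifies, even if it is failing
    return False

# A household with a three-year-old car it relies on for commuting and hauling
# water is excluded regardless of whether it can afford to run the car.
print(is_excluded(car_age_years=3))            # True
print(is_excluded(business_value_usd=2000))    # False
```

A rule-based filter like this has no way to register that the car is a necessity or that the business is losing money, which is the loss of nuance Toh describes.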

Mariam, a woman who lives with her family outside the city and needs a car for commuting and transporting water, was dropped from the Takaful program in 2022, according to her interview with Human Rights Watch. She says that owning a car, which is one of the criteria factored into the algorithm, affected her chances of getting financial aid. “The car destroyed us,” she told Human Rights Watch.

Jordan is not the first country whose government has used an algorithm to make crucial decisions about people’s lives. An algorithm used by the Dutch city of Rotterdam that ranked welfare recipients by their supposed risk of fraud was found to be biased and flawed. Crime prediction software used by police departments in the U.S. has been repeatedly found to be racially biased and problematic. In 2016, the Australian government used an automated system to calculate overpayments and issue debt notices to welfare recipients, wrongly accusing about 400,000 people.


“When we were in Jordan and interviewing people, they kept talking about how the algorithm is depriving them of support,” Toh told Forbes. “These rigid indicators and measurements that comprise the algorithm lead to a loss of nuance and aren’t able to capture what any family applying is going through at one point in time.”

The algorithm was developed by the National Aid Fund, a social protection agency in Jordan. It pulls 80 percent of the application data from the National Unified Registry, a database of applicants’ information that includes their income, employment and education. The World Bank gave a loan of $2.5 million to create the registry with the help of a third-party contractor called Optimiza. Also known as Al-Faris National Investment Company, Optimiza is an Amman-based software provider and tech consultancy that processes the data and manages the database. The company, which confirmed that NAF is a client, declined to comment.


Toh says human-induced inaccuracies and inconsistencies in compiling the data could affect how the algorithm ranks individuals. The surveys and mass registration exercises used to collect that data are often conducted infrequently and cover limited areas, the study found. Another flaw in the program was that it didn’t accept applications in which an individual’s expenses exceeded their income by more than 20 percent. To work around this, applicants had to arbitrarily inflate their reported income or lower their expenses, Toh says.
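As a rough illustration of that form check, the sketch below rejects an application whose declared expenses exceed declared income by more than 20 percent. The function name and structure are assumptions; only the 20 percent rule comes from the report.

```python
# Illustrative sketch of the reported income/expense consistency check.
# Only the 20 percent rule is drawn from the HRW report; the names and
# structure are assumed for illustration.

def application_accepted(monthly_income, monthly_expenses):
    """Reject applications where expenses exceed income by more than 20%."""
    return monthly_expenses <= monthly_income * 1.2

# A struggling household whose expenses far outstrip its income is rejected
# outright, which is why applicants reportedly inflated income or understated
# expenses just to get past the form.
print(application_accepted(monthly_income=200, monthly_expenses=300))  # False
print(application_accepted(monthly_income=250, monthly_expenses=300))  # True
```

A validation rule like this treats an implausible-looking budget as an error in the application rather than as evidence of hardship, pushing the most indebted applicants to misreport their own finances.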


In a detailed response to Human Rights Watch, the National Aid Fund said that each of the 57 indicators held a certain weight and the indicators were developed based on the concept of “multidimensional poverty.” NAF said it conducted house visits to verify data collected in the form and that it has a complaints and appeals process for citizens. The World Bank said in response that targeting through algorithms and formulas is an “effective tool… to make the most of constrained fiscal space” and reduce poverty.
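NAF’s description of 57 weighted indicators suggests a weighted scoring model that produces a single poverty rank per household. The sketch below is a minimal, hypothetical illustration of that general design; the indicators, weights and scoring direction are invented, since NAF has not published the real ones.

```python
# Hypothetical sketch of ranking households by a weighted indicator score.
# The indicators and weights are invented for illustration; the real model
# uses 57 indicators whose weights NAF has not made public.

WEIGHTS = {
    "electricity_use": 0.3,   # higher consumption read as less poor
    "owns_car": 0.4,
    "household_size": -0.2,   # larger households scored as poorer
    "female_headed": -0.3,    # female-headed households scored as poorer
}

def poverty_score(household):
    """Lower score = ranked poorer = more likely to qualify for aid."""
    return sum(WEIGHTS[k] * household.get(k, 0) for k in WEIGHTS)

households = [
    {"id": "A", "electricity_use": 1.0, "owns_car": 1, "household_size": 3, "female_headed": 0},
    {"id": "B", "electricity_use": 0.4, "owns_car": 0, "household_size": 7, "female_headed": 1},
]

# Rank from poorest (lowest score) upward; a cut-off down this ordered list
# decides which households receive cash transfers.
for h in sorted(households, key=poverty_score):
    print(h["id"], round(poverty_score(h), 2))
```

Whatever the exact weights, a single ranked list of this kind is what Toh means when he says the algorithm pits one household’s needs against another’s.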

The World Bank has also given loans for similar projects to upgrade technologies and databases in seven other countries in the region, including Egypt, Iraq, Lebanon, Morocco, Yemen and Tunisia. HRW found that the World Bank has loaned $100 million to Tunisia to develop and integrate machine learning and a targeting AI model into the country’s social registry to detect fraud and improve compliance.

“At the end of the day, the algorithm generates a crude ranking of people’s poverty that essentially pits one household’s needs against another’s for support and generates social tension,” Toh says.
