World Health Organization warns of ageism in artificial intelligence

Mar 7, 2022 | Biz/Tech

Elderly people are at significant risk of being discriminated against by artificial intelligence in healthcare, the World Health Organization said in a report.

Vânia de la Fuente-Núñez, who worked on the report for the WHO’s Global Campaign to Combat Ageism, said developers of new technologies must ensure the data used by AI is representative of older populations; otherwise, it becomes a source of bias.

De la Fuente-Núñez, part of the WHO’s Demographic Change and Healthy Aging Unit, which researches and promotes strategies for healthy aging, said that as more AI is used in healthcare, developers should ensure they aren’t replicating the biases of the data sources used to train it.

“We have evidence showing that older adults tend to be excluded from health research, including clinical trials, even though they actually account for a disproportionate share of the total burden of disease and the use of prescription medicines,” she said.

“What happens is that this entire process of designing, testing and implementation tends to exclude older people,” De la Fuente-Núñez said.

The report identifies drug development and the monitoring of long-term care homes as intersections between AI and the elderly population, along with growing use in diagnostics and resource management. It says ageism is already embedded in the healthcare system, and the biomedical datasets used in AI development will carry that bias.

“Chronological age is often used to determine who receives certain medical procedures and treatments, including things like ventilator support or access to intensive care units,” de la Fuente-Núñez said. “[If] these older populations received inadequate treatment compared to other people, then the AI technology will replicate these biases.”

Ageism won’t just be limited to healthcare, says Charlene Chu, an assistant professor at the University of Toronto’s Faculty of Nursing.

“All of the social determinants of health are all being infiltrated by AI, and so we can’t just look at AI in relation to health,” Chu said. “We have to look at it in relation to everything.”

Chu was the principal investigator on a study about digital ageism, a term she coined for ageism embedded in technology as a whole.

She said the data used to create AI is part of a larger pattern of technology designed without elderly adults in mind, even when they are users. The elderly have less access to the internet and digital technology, which results in a “cycle of injustice” built on unrepresentative datasets, Chu said.
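The pattern Chu describes — unrepresentative data feeding a cycle of exclusion — can be made concrete. The sketch below, which is illustrative only and uses entirely made-up numbers rather than any real health dataset, compares the age distribution of a hypothetical training dataset against assumed population shares to flag under-represented groups:

```python
# Illustrative sketch: flag age groups whose share of a dataset falls well
# below their share of the population. All figures are hypothetical.

reference_shares = {"18-39": 0.35, "40-64": 0.40, "65+": 0.25}  # assumed population shares
dataset_ages = [23, 31, 45, 52, 38, 29, 61, 47, 55, 34, 70, 28]  # hypothetical records

def age_group(age):
    if age < 40:
        return "18-39"
    if age < 65:
        return "40-64"
    return "65+"

counts = {group: 0 for group in reference_shares}
for age in dataset_ages:
    counts[age_group(age)] += 1

total = len(dataset_ages)
for group, expected in reference_shares.items():
    observed = counts[group] / total
    # Crude threshold: flag a group represented at less than half its expected share.
    if observed < 0.5 * expected:
        print(f"{group}: {observed:.0%} of data vs {expected:.0%} of population -- under-represented")
```

In this toy dataset, only one of twelve records is from someone 65 or older, far below the assumed one-quarter population share, so the 65+ group is flagged — the kind of gap the report warns is routinely overlooked.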

She said the gaps in data are either ignored or filled by development teams whose knowledge of the elderly is often based on stereotypes.

“There’s a lack of diversity and representation in the data, so recognizing that first and also correcting it, and correcting the lack of representation of older adults in the teams developing and testing the actual technology,” Chu said.

Those needs are not folded into the technology being designed, she said, so the effects of excluding older adults carry through the development process.

In 2016, the U.S. government conducted a study into discrimination in AI-driven automation. It found that algorithms used for a variety of functions, including bank loans, hiring, and criminal justice, discriminated on the basis of race and gender.

Chu said ageism in AI should be recognized the same way. Discrimination can overlap, she said, with people facing ageism alongside racism and/or sexism in automated systems. If these issues aren’t addressed during development, they could affect many aspects of life.

“If AI is able to determine who goes to what school or who gets what job, that indirectly will impact somebody’s well-being and health. We need to be able to identify these biases and also try to rectify the biases so that it’s not just amplified,” Chu said.

The WHO report outlines eight considerations that could address ageism in AI and include elderly people in its development. Some solutions include more age-related data, governance policies, a more robust ethics policy, and an accessible digital infrastructure designed for the elderly.

De la Fuente-Núñez said much of the solution lies in raising awareness of the issue so programmers know what to look for in the data. She said developers should ensure elderly adults are involved in the AI development process, especially when they are heavily affected by the results.