An algorithm that screens for child neglect raises concerns

The latest tools in social workers’ arsenals are algorithms that help decide which families get investigated for child neglect. The AP has identified issues with the tools’ reliability and their potential to exacerbate racial disparities in the system.

https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who audited the county’s algorithm.

The developers have described using such tools as a moral imperative, saying child welfare officials should use whatever they have at their disposal to make sure children aren’t neglected. In addition, the county confirmed to the AP that for more than two years, a technical glitch in the tool sometimes presented social workers with the wrong scores, either underestimating or overestimating a child’s risk. County officials said the problem has since been fixed.

“When you have technology designed by humans, the bias is going to show up in the algorithms,” said Nico’Lee Biddle, who has worked for nearly a decade in child welfare, including as a family therapist and foster care placement specialist in Allegheny County. “If they designed a perfect tool, it really doesn’t matter, because it’s designed from very imperfect data systems.”

In a memo last year, the U.S. Department of Health and Human Services cited racial disparities “at nearly every major decision-making point” of the child welfare system, an issue Aysha Schomburg, the associate commissioner of the U.S. Children’s Bureau, said leads more than half of all Black children nationwide to be investigated by social workers. “Over surveillance leads to mass family separation,” Schomburg wrote in a recent blog post.

“We know there are many other child welfare agencies that are looking into using risk assessment tools and their decisions about how fully to automate them really vary,” said Stapleton. “Had Allegheny County used it as a fully automated tool, we would have seen a much higher racial disparity in the proportion of kids who are investigated.

“We encourage agencies to listen to those critical voices and to make leadership decisions themselves,” she said. In several public presentations and media interviews, Vaithianathan and Putnam-Hornstein said they want to use public data to help families in need. But when AP asked county officials to address Carnegie Mellon’s findings on the tool’s pattern of flagging a disproportionate number of Black children for a “mandatory” child neglect investigation, Allegheny County questioned the researchers’ methodology by saying they relied on old data.

Dalton said her team wants to keep improving the tool and is considering new updates, including adding available private insurance data to capture more information about middle-class and upper-income families, as well as exploring other ways to avoid needless interventions.

“If it goes into court, then there’s attorneys on both sides and a judge,” Dalton said. “They have evidence, right?”

The county initially considered including race as a variable in its predictions about a family’s relative risk but ultimately decided not to, according to a 2017 document. Critics say even if race is not measured outright, data from government programs used by many communities of color can be a proxy for race. In the document, the developers themselves urged continued monitoring “with regard to racial disparities.”
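The proxy concern the critics raise can be shown with a small, self-contained sketch. The example below is purely illustrative and assumes nothing about the Allegheny tool: the feature name, group labels, and all numbers are invented, and scikit-learn’s logistic regression stands in for whatever model an agency might actually use. It only demonstrates that a model trained without any race column can still produce risk scores that differ sharply by group when one of its inputs is correlated with group membership.

```python
# Illustrative sketch only -- NOT the Allegheny Family Screening Tool or its data.
# Synthetic example of a "proxy" variable: a feature correlated with group
# membership lets a model's scores differ by group even though the group
# column is never given to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical population: group membership (never shown to the model).
group = rng.integers(0, 2, n)  # 0 or 1

# A proxy feature: contact with public systems, made more common for
# group 1 purely by construction in this synthetic data.
public_records = rng.poisson(lam=np.where(group == 1, 4.0, 1.0))

# Outcome generated from the proxy feature alone.
p = 1 / (1 + np.exp(-(0.5 * public_records - 2.0)))
outcome = rng.binomial(1, p)

# Train WITHOUT the group column -- only the proxy feature is used.
X = public_records.reshape(-1, 1)
model = LogisticRegression().fit(X, outcome)
scores = model.predict_proba(X)[:, 1]

# Despite never seeing "group", the average risk score differs by group.
print("mean score, group 0:", scores[group == 0].mean())
print("mean score, group 1:", scores[group == 1].mean())
```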

“Given that our data is drawn from public records and involvement with public systems, we know that our population is going to garner scores that are higher than other demographics, such as white middle class folks who don’t have as much involvement with public systems,” Noel said.

“It’s their life and their history,” said Thad Paul, a manager with the county’s Child, Youth & Family Services. “We want to minimize the power differential that comes with being involved in child welfare … we just really think it is unethical not to share the score with families.”

For years, California explored data-driven approaches to the statewide child welfare system before abandoning a proposal to use a predictive risk modeling tool Putnam-Hornstein’s team developed in 2019. The state’s Department of Social Services spent $195,273 on a two-year grant to develop the concept.
