Can an Algorithm Tell When Kids Are in Danger?
For two months following Emily Lankes's visit to the home of the children who had witnessed an overdose death, she tried repeatedly to get back in touch with the mother to complete her investigation — calling, texting, making unannounced visits to the home.

In August 2016, Allegheny County became the first jurisdiction in the United States, or anywhere else, to let a predictive-analytics algorithm — the same kind of sophisticated pattern analysis used in credit reports, the automated buying and selling of stocks, and the hiring, firing and fielding of baseball players on World Series-winning teams — offer up a second opinion on every incoming call, in hopes of doing a better job of identifying the families most in need of intervention.

“But the human brain is not that deft at harnessing and making sense of all that data. What predictive analytics provides is an opportunity to more uniformly and evenly look at all those variables.”

Beginning in 2012, two pioneering social scientists working on opposite sides of the globe — Emily Putnam-Hornstein, of the University of Southern California, and Rhema Vaithianathan, now a professor at the Auckland University of Technology in New Zealand — began asking a different question: Which families are most at risk and in need of help? She and Putnam-Hornstein linked many dozens of data points — just about everything known to the county about each family before an allegation arrived — to predict how the children would fare afterward.

Having demonstrated in its first year of operation that more high-risk cases are now being flagged for investigation, Allegheny’s Family Screening Tool is drawing interest from child-protection agencies around the country. In particular, he told me, the kids who were screened in were more likely to be found in need of services, “so they appear to be screening in the kids who are at real risk.”
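The article describes the researchers' approach only at a high level: link administrative data points about a family, use them to predict a later outcome, and turn that prediction into a screening score for the worker handling an incoming call. As a rough illustration of that idea only, the sketch below trains a logistic-regression model on synthetic records; every feature name, the simulated outcome, the model choice, and the 1-to-20 score scaling are assumptions made for this example, not details of Allegheny County's actual Family Screening Tool.

```python
# Illustrative sketch only: a generic predictive risk model of the kind the
# article describes. Features, labels, model choice and score scaling are
# assumptions for this example, not the county's actual tool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Hypothetical "linked administrative data points" for families with past referrals.
X = np.column_stack([
    rng.poisson(1.5, n),       # prior child-welfare referrals
    rng.binomial(1, 0.10, n),  # any prior foster placement
    rng.binomial(1, 0.15, n),  # parental jail stay on record
    rng.poisson(0.8, n),       # behavioral-health contacts
    rng.integers(0, 18, n),    # age of youngest child
])

# Hypothetical outcome observed later (e.g. re-referral or placement),
# simulated here so the example runs end to end.
logits = (0.5 * X[:, 0] + 1.2 * X[:, 1] + 0.8 * X[:, 2]
          + 0.4 * X[:, 3] - 0.05 * X[:, 4] - 2.0)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

model = LogisticRegression(max_iter=1000).fit(X, y)

def screening_score(call_features):
    """Map the model's predicted probability to a 1-20 score a screener might see."""
    p = model.predict_proba(np.asarray(call_features).reshape(1, -1))[0, 1]
    return max(1, int(np.ceil(p * 20)))

# A "second opinion" on two incoming calls with very different histories.
print(screening_score([4, 1, 1, 2, 3]))    # family with many recorded risk markers
print(screening_score([0, 0, 0, 0, 15]))   # family with few recorded risk markers
```

A deployed tool would of course be trained on real linked county records and validated far more carefully; the sketch only shows the shape of the computation, in which a single score summarizes dozens of variables that a call screener could not weigh consistently by hand.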