Jayden is 16, has dropped out of school after being suspended multiple times, and can’t really say what his plans are for the future. His mum is fed up – why doesn’t he get a job down at a fast food outlet while he decides what to do, or talk to his uncle about getting an apprenticeship?
Many of Jayden’s family have had difficulties finding work, and right now, Jayden’s prospects aren’t looking too great.
Jayden is a hypothetical 16-year-old, so let’s say that one day his hypothetical mum gets a call from MSD’s Youth Service, with the offer to put Jayden in touch with a youth coach. It’s Jayden’s choice whether he takes up the offer of help. But the youth coach can work with him to figure out what he wants to do next, and what training or job experience can help him get there.
After a couple of months working with the youth coach, Jayden decides to start training to get into construction. For the first time he can see himself in a job he is keen on.
Jayden’s experience is real for thousands of young people who end up working with a youth coach after getting a phone call out of the blue from the Youth Service. That initial contact is the result of predictive modelling, in which we use data matching and models to identify those who are most likely to benefit from assistance. We identify young people at risk of being on a benefit for more than three months by the age of 19.
In the year to June, 3302 young people ended up getting back into education, training or work-based learning after being contacted through the use of this model. Many of these young people are now getting their lives onto a positive track.
At MSD, as well as paying benefits and pensions to more than a million people each year, we work closely with those who are searching for work, or who need training or other assistance to get ready for work. Big employers such as Downer, Fletcher Building and Accor Hotels offer job opportunities for our clients.
One of the problems we encounter is that sometimes those who would benefit most from our help haven’t asked for it yet, or don’t know what we have on offer. Left to his own devices, Jayden may not have come to us until he was 18 and applying for a benefit. He would have lost valuable time.
We are amongst more than a dozen government agencies that use tools like this to support operational decisions. Statistics NZ and the Department of Internal Affairs have released a stocktake of the use of algorithms by government agencies.
We have always used tools to support our operations, tools that have become more sophisticated over time. We believe predictive analytics should be part of the toolkit, but only if there are clear benefits for New Zealanders, and it is underpinned by strong ethics, privacy and human rights protections. So how does the model result in that phone call to Jayden’s mum?
The model starts with data from the Ministry of Education on young people who leave school before they turn 18, and then looks at other known risk factors. These are whether their parents are on a benefit, any history of involvement with Oranga Tamariki, and their school history. Young people are rated as having a high, medium, low, or very low risk. The majority are rated as very low risk, and no further action is taken on those.
Contact details for those rated high, medium or low are passed on to a Youth Service provider. The provider can then make contact and offer help. The young person can choose whether or not to accept that help.
The personal and confidential information that goes into determining the risk rating is not passed on to the provider, and is stored securely.
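To make the process above concrete, here is a deliberately simplified sketch in code. The feature names, weights and score thresholds are hypothetical illustrations, not MSD's actual model; the point is the shape of the pipeline: risk factors go in, a coarse rating comes out, and only the contact details of those rated above very low risk are passed on.

```python
from dataclasses import dataclass

@dataclass
class SchoolLeaver:
    name: str
    contact: str               # the only detail a provider would receive
    parents_on_benefit: bool   # risk factor: parents on a benefit
    oranga_tamariki_history: bool  # risk factor: prior involvement
    suspensions: int           # a stand-in for "school history"

def risk_band(p: SchoolLeaver) -> str:
    """Combine the known risk factors into one coarse rating.
    Weights and cut-offs here are invented for illustration."""
    score = 0.0
    if p.parents_on_benefit:
        score += 0.4
    if p.oranga_tamariki_history:
        score += 0.4
    score += min(p.suspensions, 3) * 0.1
    if score >= 0.7:
        return "high"
    if score >= 0.4:
        return "medium"
    if score > 0.0:
        return "low"
    return "very low"

def referrals(leavers):
    """Pass only name and contact details to the Youth Service
    provider; the inputs behind the rating stay behind."""
    return [(p.name, p.contact) for p in leavers
            if risk_band(p) != "very low"]
```

Note that `referrals` returns nothing but contact details: the confidential inputs that produced the rating never leave the model, mirroring the separation described above.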
We also use predictive analytics in a second area, to prioritise additional work-focused help for MSD clients who are on a benefit. This targeted approach involves looking at each client’s situation, such as their age, the length of time they have been on a benefit, any part-time or full-time work obligations, benefit type, and the age and number of children.
The model takes that information and compares it with the experiences of previous clients with similar profiles. If the model suggests they would benefit from more intensive support, they are moved up the queue for access to a work-focused case manager.
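The idea of comparing a client's profile with the experiences of similar past clients can be sketched as a simple nearest-neighbour calculation. Everything below is an illustrative assumption — the profile fields, the distance function, and the use of k nearest neighbours stand in for whatever MSD's actual model does:

```python
from dataclasses import dataclass

@dataclass
class Client:
    age: int
    months_on_benefit: int
    has_work_obligations: bool
    num_children: int

def distance(a: Client, b: Client) -> float:
    """Crude measure of how unalike two client profiles are
    (smaller = more similar); scaling factors are invented."""
    return (abs(a.age - b.age) / 50
            + abs(a.months_on_benefit - b.months_on_benefit) / 60
            + (a.has_work_obligations != b.has_work_obligations)
            + abs(a.num_children - b.num_children) / 5)

def predicted_gain(client, history, k=3):
    """Average the recorded benefit-of-support for the k past
    clients whose profiles most resemble this one."""
    nearest = sorted(history, key=lambda h: distance(client, h[0]))[:k]
    return sum(gain for _, gain in nearest) / len(nearest)

def prioritise(queue, history):
    """Order waiting clients by predicted gain, highest first —
    i.e. move those likely to benefit most up the queue."""
    return sorted(queue, key=lambda c: predicted_gain(c, history),
                  reverse=True)
```

Here `history` is a list of `(Client, gain)` pairs — past clients paired with how much the extra support helped them — and a high predicted gain moves a client up the queue, as described above.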
It’s important to note that any client can also get access to this help by making an appointment at their local service centre.
Typically the extra support will involve the case manager talking to the client about their goals and aspirations and making a plan with them on what the next steps should be towards those goals. It may include looking at what training or other support is needed in hunting for jobs; discussing the client’s personal circumstances and how that will fit in with work options; and having a discussion on what additional assistance may be available in the transition to work, such as childcare assistance.
Gathering together people’s information in new ways needs to be justifiable, and meet privacy, human rights and ethics considerations. We have developed a framework known as PHRaE, which stands for Privacy, Human Rights and Ethics. This framework ensures that for any new service or process, we carefully think through questions such as: What are the benefits of the proposal? Do we even need to use personal information for it? Are there unintended harms that could result? What safeguards are in place?
Our use of predictive models is still developing. We believe that with their use comes a need to be open about how they are being used and why.
One of the questions we’ve grappled with is how to achieve real transparency in relation to these sorts of complex analytic techniques. We’ve been looking at how we explain our use of predictive models to those with a non-technical background. Making the code or technical documentation available might appear transparent, but in reality only a small number of people would have the technical background to understand them.
Describing the model on its own also doesn’t tell you perhaps the thing people care most about: how it is actually being used operationally. We also need to make sure we have information at the right level of detail – what people who use MSD services want to know and understand will differ from what an informed commentator or researcher might want.
We have published information on our website, including videos and material aimed at clients, on how the models work. While we know we won’t have the level of explanation exactly right for everyone, we’ve done our best to explain the models simply. We welcome discussion on how we continue to improve transparency over time.
In the meantime, for all the Jaydens out there, predictive models are already making a difference in people’s lives.