Why Machine Learning In Your ATS Might Just Be Biased

Applicant Tracking Systems have been around since the game changed and multiple applications could be fired off at the click of a button from a job board. The mass influx of applicants into the recruiter's inbox meant that technology had to adapt. Technology created the 'Bouncer': the muscle at the door making sure that while everyone was invited to the party, not everyone was let in. The ATS has been evolving ever since.

The technology keeps evolving

The pioneering applicant tracking technologies had complicated application forms, often requiring the candidate to painfully re-type their resume. The frustrating process meant that many candidates deselected themselves, and the ones who made it past the hurdles were often the most patient, not necessarily the most suitable. Next came the Resume Parser, extracting the useful information from the resume and building the database. It sounds great, and it worked well for standard-format resumes. The robot would do a lot of the selection heavy lifting: analyse the text and rank the candidates on keywords. The ATS 'Bouncer' is now a machine learning, A.I. robot.
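To make that keyword-ranking step concrete, here is a minimal sketch of how a parser-style screen might score resumes. It is purely illustrative, not Vieple's or any vendor's actual algorithm, and the keywords and resumes are invented for the example.

```python
# Minimal, illustrative keyword scorer - not any vendor's actual algorithm.
# Assumes resumes have already been parsed into plain text.
import re
from collections import Counter

TARGET_KEYWORDS = {"python", "sql", "stakeholder", "forecasting"}  # example keywords

def keyword_score(resume_text: str) -> int:
    """Count how many times any target keyword appears in the resume."""
    words = re.findall(r"[a-z]+", resume_text.lower())
    counts = Counter(words)
    return sum(counts[kw] for kw in TARGET_KEYWORDS)

resumes = {
    "candidate_a": "Experienced analyst: Python, SQL, forecasting, stakeholder reporting.",
    "candidate_b": "Led a data team; strong communicator, mentor and coach.",
}

# Rank candidates by raw keyword count.
ranking = sorted(resumes, key=lambda name: keyword_score(resumes[name]), reverse=True)
print(ranking)  # ['candidate_a', 'candidate_b']
```

Notice that wording, not ability, drives the rank: a strong candidate who phrases things differently simply falls to the bottom of the list.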

Where the bias comes into play

As the ATS tech stack evolved, so did the tools that make selection easier: more machine learning and a massive uptake of video interviews. We are seeing A.I. in every component of the selection process. The machines are now choosing the talent, so naturally you would think this should eliminate unconscious bias and promote diversity. How can machines have bias, right?

Amazon certainly didn’t find that to be the case. They had to scrap their machine learning recruiting tool after they found it didn’t like women. The A.I. analysed historical hiring data, saw that the majority of past hires for data scientist and engineering roles were men, and so learned that men were the best hires. As a result, Amazon no longer uses A.I. to screen candidates for these roles.
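The mechanism is easy to reproduce in miniature. The toy sketch below is our own illustration, not Amazon's system: the data, the "mentions women's club" feature and the use of scikit-learn are all invented for the example. It trains a simple classifier on historical hiring decisions where almost all past hires were men, and the model learns to use a gendered signal as a shortcut for "good hire".

```python
# Toy illustration of how biased training data produces a biased model.
# Not Amazon's system; features and data are invented for the example.
from sklearn.linear_model import LogisticRegression

# Each row: [mentions_womens_club, years_experience]; label: 1 = was hired.
# Historical data in which nearly all past hires happened to be men.
X = [
    [0, 5], [0, 3], [0, 7], [0, 4], [0, 6],   # past hires
    [1, 6], [1, 5], [1, 7], [0, 1], [0, 2],   # past rejections
]
y = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]

model = LogisticRegression().fit(X, y)

# The fitted model puts a negative weight on the "women's club" feature,
# even though it says nothing about ability.
print(model.coef_)

# Two equally experienced candidates now get different scores.
print(model.predict_proba([[0, 6], [1, 6]])[:, 1])
```

Nothing in that code is malicious; the model simply echoes whatever pattern sits in its training data, which is exactly the trap the Amazon tool fell into at a much larger scale.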

The Vieple difference

Vieple strives to make the recruiter's job more efficient and has therefore built features and assessments to help screen out unsuitable candidates faster. We are, however, very careful about introducing technology that unfairly rejects candidates or has built-in bias.

Video interviewing has exploded in popularity, and it is great at helping to reduce unconscious bias in the interview process. Every candidate has the opportunity to answer the same set of questions, without recruiter interaction. Removing the human tendency to build rapport with some candidates and not others minimises unconscious bias. It is important to note that it is impossible to remove unconscious bias completely, and that unconscious bias training for reviewers may help reduce it further.

A.I. is now being used in video interviewing to analyse internal data and build a profile of the 'ideal candidate'. So what happens if the machine simply learns to hire people who are just like everyone already in your organisation? That has clear implications for diversity, and it is not a great predictor of future performance. The risk is that the thing video interviewing was once great at (removing unconscious bias) is being replaced by machine-driven unconscious bias.

At Vieple we carefully curate a bulk recruitment campaign for you, from Application Page to Job Offer, with the assistance of an Organisational Psychologist who specialises in Diversity initiatives.

To find out how Vieple can automate some of the screening process for you, while keeping the details that often fall victim to unconscious bias blind to reviewers, contact us at hello@vieple.com.