
The Chicken or the Egg?
One of the most frustrating things right now for job seekers is that employers are screaming they cannot find enough employees. Yet, thousands of well-qualified people remain unable to find a job.
Why?
The reasons are many, including bias. Age discrimination, for example, is real. During the pandemic it got worse: many older employees lost their jobs and have found it very difficult to find new ones.
Another stubborn (and less publicized) bias is the bias against the unemployed.
One common saying is that it is always easier to find a new job while you have a job. Sadly, that is true. Many companies automatically screen out people who have been unemployed for six months or more, regardless of the reason. There is no good justification for screening out everyone who has been out of work that long. That kind of blanket rule reflects all kinds of bias. For example, it screens out people who have taken time out of the workforce to raise a family. It screens out people who needed time off for a serious health condition, or to care for family members with one. When those people try to return to work, they hit the brick wall of bias against the unemployed.
A new area getting a lot of attention right now is artificial intelligence in the hiring process. Most job search boards and many large companies use artificial intelligence and algorithms to help screen the job applications they receive.
While AI is undoubtedly a helpful tool for companies to use in screening applicants, it can create many problems. Applicants have to figure out the “magic” word that will cause the AI to catch their resume or job application. If an applicant does not use just the right word to trigger a hit, a human will never see that application.
Because the AI is programmed to look for certain experience levels, it misses strong applicants who are not a perfect match on paper but could be excellent employees if given a chance and a bit of training. Intangible skills don’t necessarily translate well to AI.
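To make the problem concrete, here is a minimal sketch of the kind of blunt keyword-and-threshold filter described above. The keywords, the experience cutoff, and the sample resume are all hypothetical, not any particular vendor’s actual system.

```python
# Hypothetical sketch of a blunt keyword/experience screen (not a real vendor's tool).
REQUIRED_KEYWORDS = {"project management", "agile", "stakeholder"}  # assumed keywords
MIN_YEARS_EXPERIENCE = 5  # assumed cutoff

def passes_screen(resume_text: str, years_experience: int) -> bool:
    """Return True only if every required keyword appears and the experience bar is met."""
    text = resume_text.lower()
    has_keywords = all(keyword in text for keyword in REQUIRED_KEYWORDS)
    return has_keywords and years_experience >= MIN_YEARS_EXPERIENCE

# A strong candidate who writes "led cross-functional teams" instead of the
# exact word "stakeholder" never reaches a human reviewer.
resume = "Led cross-functional teams using agile methods; ran the project management office."
print(passes_screen(resume, years_experience=7))  # False: missing the magic word
```

A filter this literal has no way to weigh the intangibles, which is exactly the gap described above.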
Unfortunately, research shows that bias can permeate the algorithms used by those AI tools. This may cause discrimination against all kinds of protected groups. For example, years ago, Amazon had to scrap a recruiting tool it had designed to screen resumes for top talent. When it tried to use the tool, it found the AI screened out qualified women because the algorithm was based on ten years of patterns in resumes submitted to the company. Because most of those resumes came from men (due to the pattern of male dominance in the tech industry), the system did not rate candidates in a gender-neutral way and had to be tossed.
A company can test the algorithms it uses in job screening for bias, but many companies do not. It would be smart for them to start.
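As a rough sketch of what such a test could look like, the snippet below compares selection rates across two groups using the “four-fifths rule” often cited in adverse-impact analysis. The counts are invented for illustration; a real audit would involve far more data and legal review.

```python
# Hypothetical adverse-impact check using the "four-fifths" (80%) rule of thumb.
# The applicant and pass counts below are made up for illustration only.

def selection_rate(passed: int, applied: int) -> float:
    """Fraction of applicants in a group that the screen passed through."""
    return passed / applied

group_a_rate = selection_rate(passed=120, applied=400)  # e.g., men: 30%
group_b_rate = selection_rate(passed=45, applied=300)   # e.g., women: 15%

# Compare the lower rate to the higher one; a ratio under 0.8 is a common red flag.
impact_ratio = min(group_a_rate, group_b_rate) / max(group_a_rate, group_b_rate)
print(f"Impact ratio: {impact_ratio:.2f}")  # 0.50 here, well below the 0.8 threshold
if impact_ratio < 0.8:
    print("Potential adverse impact: the screening tool deserves a closer look.")
```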
One of the next waves of litigation will be challenges to biased tools in hiring. Smart companies will make sure their AI can pass, and has passed, tests for bias before that wave hits.
Though AI can save time, it is clear it can never replace the human touch. Humans are so much more than words on a piece of paper screened electronically. Sometimes it takes a little time, effort, and luck to find the perfect person for the role. I’d like to see every candidate get a fair chance at a job by eliminating bias, particularly the bias against the unemployed. What a great world that would be.