Artificial intelligence continues to grow in both technical capability and scale, and companies across a wide range of industries are incorporating the technology into their hiring processes. The pace of innovation and adoption has tended to outpace the guidelines and laws governing it.
Recently, the Partnership on Employment & Accessible Technology, an organization funded by a wing of the U.S. Department of Labor, released guidance on using AI in hiring. The framework focused on making hiring processes that use AI more inclusive for disabled individuals and covered 10 focus areas.
Heather Vickles, partner at Venable, told Law Week via email that the framework is intended to curb disability discrimination in hiring practices that use AI. She noted that the framework is voluntary and that while she found some sections useful for companies to know, there were others she wouldn’t apply.
“To briefly summarize the guidance, we are taking the beginning steps in a very long journey of AI in the workplace,” Vickles said. “Employers should pace themselves and move strategically when deploying AI.”
Moving forward starts with knowing the objectives and goals the business is trying to achieve, according to Vickles. Once the goals are set, she noted, employers should use diverse teams to develop their systems, and the practices and policies a company builds around those systems should be subject to regular oversight.
“Employers cannot simply deploy AI, kick back, and relax; human involvement and oversight throughout the life cycle of AI systems is important for mitigating disability discrimination and providing reasonable accommodations,” Vickles said. “Employers should remember that AI in the workplace can help mitigate, or exacerbate, other forms of discrimination.”
“I think the #1 takeaway from the Department of Labor is that AI in hiring should not be wholly automated. Without proper human oversight, systems that are designed to mitigate bias may create the opposite outcomes,” Vickles added.
To ensure that there is oversight in an AI system, Vickles recommended that companies adopt a structured approach rooted in trust, responsibility and fairness. And when it comes to decisions about humans, Vickles added that humans, along with internal oversight or audits run by vendors or third parties, should play a role in the oversight.
For employers interested in bringing AI into their hiring process, Vickles said there are vendors and third parties companies can turn to, rather than reinventing the wheel by building their own hiring systems.
“Of course, the key will be to identify reputable vendors and third parties and consider AI governance as these systems are employed,” Vickles noted.
When bringing in third-party vendors, Vickles emphasized two points. “Employers should review vendor attestations describing compliance and risk mitigation and look for providers with good and established industry reputations,” Vickles said.
Second, she noted that contracting with a third party doesn’t relieve the employer of liability.
“If the vendor’s system results in a discriminatory outcome and the employer is sued, the employer likely cannot get off scot-free,” Vickles said. “While liability cannot be contracted away, a good indemnification clause can provide the employer with some assurances.”
Where Vickles disagreed with the framework was on the subject of information sharing. She explained that the PEAT framework advised employers to publicly share incident information.
“Of course, there is a time and a place where employers are required to disclose incidents of discrimination, but there are many instances where an employer would not benefit from publicizing an incident of discrimination, regardless of whether the incident was an unintentional result of AI,” Vickles said.
Another part of the guidance advised employers to develop methods for conducting impact assessments that included job seekers.
“This is a new frontier. While it is important to include a wide array of diverse voices in developing impact assessments, there are many circumstances where it would not be appropriate for a company to open its internal processes to job seekers,” Vickles said.
She noted that while external transparency could be a positive from a broader social perspective, employers building new systems would benefit more from focusing on internal transparency and on transparency with their service providers and vendors.
“Put simply, the key to transparency is appropriate disclosure; more is not necessarily better,” Vickles explained.