Warning -- I'm about to go on a rant.
Do you ever read something in the news that just makes you go, "Sheesh, people!!!" Or words to that effect?
And, no, I am not talking about the Presidential Election.
The Wall Street Journal had an article this week about employers who use artificial intelligence to determine whether their executives are at risk of developing dementia. Here's a link, but you may need a paid subscription to access it.
The technology, I admit, sounds pretty cool in some ways. The AI can apparently tell from people's patterns of speech whether they are at risk . . . long before a qualified human physician would be able to diagnose the condition.
Although impressed with this development from a tech standpoint, in my head I was screaming, "What about the ADA? What about the ADA? Has anybody thought about the ADA?"
The article did not discuss the possibility that an employer's use of AI in this way might violate the Americans with Disabilities Act. But I think it is a big risk for employers. A bigger risk, I'd argue, than the risk that a perfectly compos mentis executive might develop dementia six or seven years down the road.
The article says that the AI is correct in about 80 percent of cases. In other words, the AI is wrong in about 20 percent, or one-fifth, of cases. And, of course, the employer won't realize that the AI was wrong until it's too late because the AI is predicting future dementia, not diagnosing current dementia.
I guess I don't have dementia (yet) because I am able to recall that in early 2023, I asked ChatGPT to write a blog post for me on Groff v. DeJoy, a religious accommodation case that at the time was going to be heard by the U.S. Supreme Court. (That case has since been heard and decided.) ChatGPT did a nice job writing my post, except for one little detail . . . it said that Groff was a disability accommodation case under the Rehabilitation Act of 1973 instead of a religious accommodation case under Title VII. It also said that the case had multiple plaintiffs instead of a single plaintiff. Here's the quote it gave me:
The Supreme Court has recently announced that it will review the case of Groff v. DeJoy, a case that has the potential to significantly impact *the rights of individuals with disabilities* in the workplace. This case was brought forth by *a group of individuals with disabilities* who argue that the United States Postal Service (USPS) failed to accommodate their disabilities *in violation of the Rehabilitation Act of 1973*.
(Emphasis was mine.)
At least ChatGPT got the name of the case right.
Since I wrote that post, we've been hearing about lawyers writing briefs with the "help" of AI. Then they end up being sanctioned by the courts because the AI made up case law, meaning the lawyers were citing nonexistent court decisions in support of their clients' positions. As a result, many courts now have rules requiring AI-using attorneys to check their cases the old-fashioned way before submitting their briefs, and to certify to the courts that they have done so.
And we want to use AI to predict whether a person will develop a devastating medical condition at some indeterminate time in the future? And we want to use that "information" in making employment decisions? Is that going to get employers into legal trouble?

Um, yes, it will.
Seven reasons why this probably violates the ADA
Here is why using AI in this way is going to get employers in trouble under the ADA and also under many state disability protection laws:
No. 1: Dementia is a disability, as are many other medical conditions.
No. 2: I feel sure that the U.S. Equal Employment Opportunity Commission, which enforces the employment provisions of the ADA, will say that a medical assessment conducted by AI is a "medical examination." Heck, it's an ADA "medical examination" for a frontline supervisor to casually ask an employee if she's limping because she has a bad hip.
No. 3: The ADA prohibits employers from requiring job applicants to undergo any sort of "medical examination" before a conditional offer of employment has been made.
No. 4: The ADA allows employers to conduct "medical examinations" after a conditional offer of employment has been made, but the information obtained cannot be used to disqualify the offeree. The only exception applies if the medical examination indicates that the offeree cannot perform the essential functions of the job, with or without a reasonable accommodation. I don't think a four-out-of-five chance of getting dementia in six years is going to cut it.
No. 5: Generally, it violates the ADA for an employer to discriminate against an applicant, offeree, or employee based on a concern that the individual "might" develop a medical condition in the future.
No. 6: An employer may not require a current employee to undergo a "medical examination" unless the examination is "job-related and consistent with business necessity." In other words, there has to be a job-related reason for requiring the medical examination, such as a performance issue or behavior concern that could reasonably be attributed to a medical condition. Sending an executive (or any other employee) for a medical examination to determine whether the individual is at risk for developing a medical condition in the future is not going to cut it.
No. 7: Merely requiring these examinations (or making disability-related inquiries) without a legal justification is itself an ADA violation, even if the employer never actually uses the information against the employee. And, of course, if the information is used against the employee -- look out!
I will end on a positive note. If an employee is showing objective signs of developing dementia (or some other medical condition that seems to be affecting job performance or behavior), the ADA would allow the employer to send the employee for a medical examination to determine
- whether the employee can perform the essential functions of the job,
- whether reasonable accommodation is necessary or possible, and
- the types of accommodations that might be advisable.
In this context, the medical examination is likely to be "job-related and consistent with business necessity." And the use of AI to assist with the diagnosis (or reasonable accommodation recommendations) should not create an ADA problem.
*whew* Thanks, you guys. I feel better now.
Robin E. Shea, Partner

Robin Shea has more than 30 years' experience counseling employers and representing them before government agencies and in employment litigation involving Title VII, the Age Discrimination in Employment Act, and the Americans with Disabilities Act (including the Amendments Act).