Legal updates and opinions
Do Androids Dream of Unfair Dismissal?
A federal judge in California recently considered an employment discrimination claim involving AI. Although the circumstances were novel and fairly unique, disputes of this kind are likely to become more common in the workplace and may find application in the South African context in future. The court considered a proposed class action filed by a job applicant, Derek Mobley, against Workday Inc., an American vendor of on‑demand financial management, human capital management and student information system software. The lawsuit claimed that Workday's AI-powered hiring software perpetuates existing biases against job applicants based on race, age and disability. Mobley claimed that he was overlooked for over a hundred jobs due to these biases.
The ruling is groundbreaking. Workday was not screening Mobley for direct employment with it; instead, it relied on the results of its AI-powered hiring software to determine whether to put Mobley forward as a candidate to its clients, who were seeking candidates for employment. In that sense, Workday was acting in the traditional role of a recruitment agency. Nevertheless, the California court determined that Workday could be seen as an employer under federal anti-discrimination laws because it handles screening tasks typically done by its clients. While the court dismissed claims of intentional discrimination and ruled that Workday is not an "employment agency", it held that the company's AI tools could make it liable for discrimination as they perform crucial hiring functions. The court took into account that Workday's customers delegated their traditional hiring functions, including rejecting applicants, to the algorithmic decision-making tools provided by Workday.
This finding could have an impact on South African employers which use service providers such as Workday, either by engaging the service provider directly as a recruiter for an open position, or by utilising software, tools or platforms bought or licensed from the service provider to identify or select candidates for employment. This is because, in South Africa, employees are protected against unfair discrimination in terms of the Employment Equity Act (EEA), which provides that no person may unfairly discriminate, directly or indirectly, against an employee, in any employment policy or practice, on one or more grounds, including race, gender, sex, pregnancy, marital status, family responsibility, ethnic or social origin, colour, sexual orientation, age, disability, religion, HIV status, conscience, belief, political opinion, culture, language, birth or any other arbitrary ground. The EEA extends this protection to applicants for employment.
Given these protections against unfair discrimination, which are already entrenched in South African labour law and practice, and the broad scope of the protection intended, a South African employer which engages a third party such as Workday as a recruiter, or which itself uses software, tools or platforms provided by such a third party, could in future be drawn into unfair discrimination claims. Where the employer engages a recruitment service, a court considering whether the recruitment service's use of AI or algorithmic decision-making tools results in unfair discrimination could take into account that the EEA restricts not just an employer, but "no person", from unfairly discriminating against even an applicant for employment at the recruiter's client. Any such claim would likely be brought against the potential employer, which could notionally, under South African law, be held responsible for the acts of the recruiter as its agent, or on the basis that it relied on or used the discriminatory results of the algorithmic decision-making tools. In addition, if the potential employer uses software, tools or platforms provided by a third party such as Workday, and these tools produce discriminatory results, not only could the potential employer face legal claims, but the provider of the algorithmic decision-making tools could itself face claims as the developer of the tool. This aligns with the legislative landscape taking shape in various countries around the globe to manage the potential harms that can arise from the development and use of AI systems, under which the developers and deployers of AI systems could be held liable for, among other things, biases in their AI tools.
For example, the EU AI Act classifies AI systems intended for the recruitment or selection of employees, and for employee performance monitoring and evaluation, as high-risk AI. AI systems that use automated decision-making to profile individuals in, among other contexts, the workplace are also high-risk AI. This class of AI attracts stringent obligations under the legislation, including in respect of the quality of the data used in relation to the AI tool, to prevent biases, ensure accuracy and address cybersecurity risks; to provide transparency on how the AI tool works; and to maintain human oversight to ensure consistency, prevent harms and ensure responsible use.
Algorithmic decision-making tools may also need to comply with specific South African laws, including the Protection of Personal Information Act, under which such AI tools may not be permitted if they derive decisions and create a profile solely on the basis of automated processing of personal information.
Although the decisions of courts in foreign jurisdictions are not binding on South African courts, where a novel point of law falls to be considered, foreign jurisprudence may be taken into account and can be persuasive. As such, the approach of the California court in the Workday case should be noted.
South African employers should note that the use of AI by recruiters could inadvertently draw them into claims of unfair discrimination by employees or applicants for employment; recruiters could face similar claims directly. Not only should algorithmic decision-making tools be carefully calibrated to remove any biases that could constitute discriminatory grounds and to ensure compliance with applicable laws and regulations, but the contracts for services between the parties should also be carefully drafted to provide commercial protection in the event that such legal risks materialise.