President Biden’s AI executive order will impact private entities
The executive order is lengthy and identifies several areas of concern with the use of AI, including cybersecurity, competition, labor, health, privacy, and education.
On October 30, 2023, President Biden took a major step toward regulating the use of artificial intelligence (AI) with the issuance of the “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.” While the executive order is focused on directing federal agencies to achieve the goals outlined within the document, it is evident that it will have a broad impact on private entities as well.
Some of the executive order’s impact is evident from the text itself. Private entities will undoubtedly be affected by the wage and hour and labor union implications contained in the order. For instance, the Secretary of Labor is tasked with consulting with labor unions, workers, and other outside entities to “develop and publish principles and best practices for employers that could be used to mitigate AI’s potential harms to employees’ well-being and maximize its potential benefits.”
The executive order also directs the Secretary of Labor to issue guidance ensuring that any employer using AI tools to monitor or augment its employees’ work is compensating those employees for all hours worked under the Fair Labor Standards Act. This marks the first time the Biden administration has expressed concerns about how AI is being used in the wage and hour context, and it signals that the Department of Labor’s Wage and Hour Division will likely issue guidance on the interplay between AI and wage and hour laws.
Further, the executive order directs the U.S. Department of Justice to increase coordination with federal civil rights offices on AI and algorithmic discrimination issues, “improve external stakeholder engagement to promote public awareness of potential discriminatory uses and effects of AI; and develop, as appropriate, additional training, technical assistance, guidance, or other resources[.]” This will likely lead to increased enforcement activity related to algorithmic discrimination, a concern the Equal Employment Opportunity Commission, Federal Trade Commission, and Department of Justice have already expressed. Additionally, companies that develop (or intend to develop) AI tools for the federal government are required to provide the government with certain information, reports, and records regarding those tools.
Private entities that contract with the federal government will also be directly impacted by the executive order. The document directs the Secretary of Labor to publish guidance for federal contractors regarding “nondiscrimination in hiring involving AI and other technology-based hiring systems.” As a result, the Department of Labor’s Office of Federal Contract Compliance Programs will likely issue guidance soon to clarify how the agency will apply existing guidance, including the Uniform Guidelines on Employee Selection Procedures, to hiring decisions involving AI and other emerging technologies. Any guidance issued by Department of Labor agencies related to AI will outline important standards that employers should consider incorporating into their practices.
The executive order also gives agencies a way to regulate the use of AI indirectly. Because the federal government is one of the largest purchasers of technology, the order leverages the government’s pocketbook to encourage compliance by private companies. Private entities will likely, and arguably should, look to the federal government as a “model employer,” and many will align their own practices with those the federal government follows to ensure compliance with the principles outlined in the executive order.
Private entities will also likely be called upon to push certain objectives over the finish line and will be influential in deciding how aspects of the regulation outlined in the executive order are implemented. That role is fitting, as the executive order is the byproduct of continued cooperation with some of the leading AI companies: in the summer of 2023, the White House announced that it had secured voluntary commitments from fifteen leading AI companies in an effort to control the risks posed by AI tools. As the use of AI continues to evolve, private entities will likely be called upon to help address the ever-changing AI landscape.
Bradford J. Kelley is a shareholder in Littler Mendelson P.C.’s Washington, D.C. office. Previously, he was a senior official at the U.S. Equal Employment Opportunity Commission (EEOC) and the U.S. Department of Labor’s Wage and Hour Division.
Kellen Shearin is an associate at Littler Mendelson P.C. and represents and advises employers on all aspects of labor and employment law.