
WHD Warning to Employers Regarding the Use of Artificial Intelligence

The Wage and Hour Division (WHD) of the U.S. Department of Labor issued a Field Assistance Bulletin (FAB), cautioning employers about the potential legal ramifications of utilizing Artificial Intelligence (AI) and other automated systems.

The bulletin highlighted that such technology might inadvertently lead to violations of laws enforced by the WHD. Among these laws, the Family and Medical Leave Act (FMLA) stands out as particularly susceptible to being affected by the integration of AI into Human Resources departments or other operational areas of a company.

The Spread of AI in the Workplace


AI's presence in professional sectors is expanding swiftly, offering both opportunities and challenges for businesses. While its integration into the workplace is undeniable, the consequences of this trend remain a topic of ongoing debate. Despite the technology's benefits, the recent FAB warns against indiscriminate use, highlighting the legal ramifications that can follow when it is not applied appropriately.


The increasing prevalence of artificial intelligence in professional environments is driven by the growing recognition of its potential for enhancing data management efficiency. For employers facing constraints on time and workforce availability for tasks like monitoring worker performance, scheduling, task assignment, and complex HR functions, automated systems can appear highly attractive.



However, the Department of Labor's latest Field Assistance Bulletin strongly advises against relying solely on such systems, citing potential legal consequences. It might be wiser for businesses to prioritize investing in additional staff for data management rather than risk compliance issues arising from inaccuracies in AI-generated reports.


The Utilization of AI in Business Operations


As previously mentioned, integrating AI systems can significantly enhance data management efficiency for employers. To streamline processes effectively, employers must implement the changes needed to complete essential tasks more easily, which in turn supports growth.


AI systems currently implemented in many businesses are designed to track employee information in various ways, depending on the specific data being monitored and integrated into the system.



For instance, AI and other technologies can assign tasks and set work schedules. In the hospitality industry, automated systems in some hotels independently prioritize and assign tasks to housekeeping staff. When a guest checks out or requests service, these systems automatically delegate the cleaning task to an available worker based on various factors.


In other settings, AI systems perform multiple functions for employers, such as tracking work hours, measuring worker performance, and executing complex human resources functions. Other tools may specialize in more specific or limited functions.


Many of these tools are used with primarily remote or hybrid workforces, which already rely on technology for their professional tasks. However, AI is not confined to remote work; similar systems are also deployed in onsite workplaces, including offices, restaurants, retail stores, call centers, and warehouses.


The aforementioned AI systems could create compliance or legal issues when implemented without human oversight. Additionally, many AI tools used for Human Resources tasks can produce inaccurate reporting, which may adversely affect employees or limit their access to benefits. Further issues could arise when AI systems are not updated as frequently as federal, state, and county laws change.


For instance, the Field Assistance Bulletin (FAB) posted by the Wage and Hour Division (WHD) highlights concerns about how AI and automated systems handle leave requests and medical certifications, as well as the potential for FMLA interference and retaliation.



Another option for streamlining your business, aside from using AI systems, is to hire a Professional Employer Organization (PEO). With a PEO managing data efficiently, employers and employees can focus on their preferred day-to-day tasks while the PEO facilitates operations behind the scenes. A PEO also helps ensure compliance and provides expert human interaction backed by decades of experience, unlike AI systems that lack the knowledge and expertise of PEO teams.


Examples of Risks Stated in the Field Assistance Bulletin (FAB):


The FAB outlines various examples illustrating the risks linked to AI and automated systems. Some of those examples are highlighted below to clarify why this information is being shared.


  • Risk from Timekeeping and Monitoring AI Systems

    • Relying on automated timekeeping systems without adequate human oversight can cause compliance issues with federal wage and hour laws. If an AI program misclassifies compensable hours as non-work time based on its own analysis, workers can go unpaid for hours they actually worked.

    • AI-powered timekeeping systems can predict and auto-fill entries based on past data and schedules. However, employers must still ensure record accuracy and pay employees for all hours worked. AI doesn't exempt employers from ensuring accurate records of breaks and proper compensation for all hours worked under FLSA regulations.

  • Risk from Assigning Tasks and Setting Work Schedules

    • In the context of housekeeping workers, modern systems dynamically assign tasks based on real-time data and offer managers live tracking of employee activities. Periods in which employees cannot use the time effectively for their own purposes, are not completely relieved of duty, or must remain near their workstation without a definite time to return are deemed "engaged to wait" under the FLSA and constitute hours worked. Employers must diligently record both the waiting periods between tasks and the time spent on assigned tasks to ensure FLSA compliance, regardless of the technological tools used for task allocation, scheduling, or management.

  • Risk from Monitoring Employees' Locations to Track Work Hours

    • An automated system tracks employees' "work hours" based on their location upon entering and exiting the job site. However, relying solely on location monitoring to determine work hours can lead to compliance issues. For instance, if a construction worker starts their workday before arriving at the designated site by picking up tools or purchasing supplies, or if they continue working after leaving the site to unload supplies or complete tasks elsewhere, these hours may not be accurately recorded. Such systems may overlook travel time between work sites or hours spent working at different locations, potentially resulting in violations of minimum wage or overtime pay regulations.

  • Risk Specific to FMLA

    • Automated timekeeping programs may inaccurately count employee hours, leading to errors in determining FMLA eligibility. Moreover, systems that test for FMLA eligibility more often or more stringently than the law allows could wrongfully deny leave. They might also miscalculate available FMLA leave, potentially resulting in unauthorized denials. While such violations could arise from human error, AI or automated systems can amplify these issues across the entire workforce. (A brief sketch of the eligibility arithmetic appears after this list.)

    • Employers utilizing AI for FMLA leave management also risk violating the FMLA's certification requirements when determining leave eligibility.


  • Risk in Determining Rates of Pay with AI Systems

    • Some AI systems autonomously compute workers' pay rates by analyzing various data points such as demand, customer flow, location, performance, and task type. These systems can adjust pay rates throughout the day, producing rates that fluctuate from week to week. Similarly, automated task assignment systems adjust tasks based on metrics, potentially affecting pay rates for piecework or task-based pay. Employers must provide proper oversight to ensure these systems adhere to federal wage standards, including minimum wage and overtime pay. Compliance with the FLSA and other relevant laws remains essential regardless of whether AI or other technology is used to calculate and determine wages. (A pay rate sketch appears after this list.)


  • Risk for Nursing Mothers

    • AI and other technologies are employed to monitor employee work hours and schedules, assign tasks, manage breaks, and assess productivity. However, automated systems that restrict nursing employees' pump breaks violate the FLSA's requirement to provide reasonable break time. Similarly, systems that penalize workers for reduced productivity due to pump breaks also violate the FLSA.


  • Risk from Body Movement Recognition by AI Systems

    • Certain AI technologies can analyze eye measurements, voice patterns, micro-expressions, and body language to detect deception. However, under the Employee Polygraph Protection Act (EPPA), any use of lie detector tests, including those incorporating AI, is prohibited unless a specific exemption applies.

    • Artificial intelligence or monitoring systems that track keystrokes, eye movements, or internet browsing to measure productivity do not, by themselves, determine "hours worked" under the FLSA. These metrics cannot replace the analysis of whether the employee was permitted or required to work during that time, which is what makes the time "hours worked" under the FLSA.
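
To make the FMLA eligibility arithmetic referenced above concrete, here is a minimal sketch in Python. It uses hypothetical hour totals and a hypothetical helper function, none of which come from the FAB, and it checks only the 1,250-hour test; the 12-month service and worksite-size tests are assumed to be satisfied.

# Minimal sketch (hypothetical figures and helper name, not from the FAB):
# how an automated timekeeper that undercounts compensable hours can flip
# the FMLA 1,250-hour eligibility test and wrongly deny leave.

FMLA_HOURS_THRESHOLD = 1250  # hours worked in the 12 months before leave

def meets_hours_test(hours_last_12_months: float) -> bool:
    """Return True if the 1,250-hour FMLA eligibility test is satisfied."""
    return hours_last_12_months >= FMLA_HOURS_THRESHOLD

# Hours the system logged vs. hours actually worked, e.g. because the
# system auto-filled scheduled shifts and dropped off-schedule work time.
system_recorded_hours = 1230.0  # what the automated timekeeper recorded
actual_hours_worked = 1270.0    # includes compensable time the system missed

print("System's determination:", meets_hours_test(system_recorded_hours))  # False
print("Correct determination: ", meets_hours_test(actual_hours_worked))    # True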
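
Similarly, the pay-rate oversight described above can be reduced to simple weekly arithmetic. The sketch below, again in Python with hypothetical figures not taken from the bulletin, computes the FLSA regular rate from a week of fluctuating task-based pay, checks it against the federal minimum wage, and estimates the overtime premium owed for hours beyond 40; state and local law may impose stricter requirements.

# Minimal sketch (hypothetical figures): a weekly check on AI-set,
# fluctuating task-based pay. For piece or task pay, the regular rate is
# generally total weekly straight-time earnings divided by total hours
# worked; hours over 40 earn an additional 0.5x regular-rate premium, and
# the regular rate must not fall below the applicable minimum wage.

FEDERAL_MINIMUM_WAGE = 7.25  # federal floor; state or local law may set a higher one

def weekly_compliance_check(task_earnings: float, hours_worked: float) -> dict:
    regular_rate = task_earnings / hours_worked
    overtime_hours = max(0.0, hours_worked - 40.0)
    overtime_premium_due = overtime_hours * 0.5 * regular_rate
    return {
        "regular_rate": round(regular_rate, 2),
        "meets_minimum_wage": regular_rate >= FEDERAL_MINIMUM_WAGE,
        "overtime_premium_due": round(overtime_premium_due, 2),
    }

# A week in which the system's fluctuating task rates produced $330 for
# 46 hours of work: the regular rate (about $7.17) falls below the federal
# minimum, so the pay is non-compliant even before the overtime premium is added.
print(weekly_compliance_check(task_earnings=330.0, hours_worked=46.0))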



When used responsibly, AI has the potential to enhance legal compliance. Without proper human oversight, however, these technologies can put workers' rights under labor standards at risk, potentially resulting in violations of laws enforced by the WHD.


Further, the implementation of AI in the workplace poses the risk of systemic violations across the workforce. Employers remain accountable when their use of AI or other automated systems leads to violations of laws enforced by the WHD. Ultimately, employers must ensure the responsible deployment of AI to maintain compliance with WHD-enforced laws.



