
IN AN ERA where the lines between work and personal life are increasingly blurred, the rise of new worker surveillance and control technologies is creating a dystopian reality that demands urgent attention from policymakers.
Legislation filed on Beacon Hill, An Act Fostering Artificial Intelligence Responsibility, known as the FAIR Act, would provide Massachusetts workers with much-needed protection against reckless and harmful uses of “bossware” technologies. Employers use these electronic and algorithmic decision systems to automate managerial functions, including determining whether workers get a job, tracking workers’ locations and communications throughout — and sometimes even after — the workday, and deciding how much workers get paid and whether they get promoted, demoted, or fired.
To workers across sectors — in warehouses, offices, factories, and job sites across the country — these technologies and their wide-ranging risks are not theoretical or distant. Call center workers have long experienced continuous, intrusive monitoring. AI-powered cameras monitor delivery drivers, who have even been forced to sign biometric information consent forms. The Trump administration and its Department of Government Efficiency (DOGE) have installed software that tracks federal workers’ mouse movements and keystrokes and can even remotely activate their webcams.
Unchecked surveillance threatens workers’ privacy, and research shows that it increases the risk of accidents, injuries, and numerous mental and physical disorders. Automated decision systems have caused workers to unfairly lose job opportunities and face pay cuts and termination based on inaccurate or incomplete data, often because those systems are deeply flawed and biased.
The FAIR Act (SD.838/HD.1458) would establish crucial safeguards, including limiting electronic monitoring to situations necessary for a legitimate business purpose, such as quality control and network security. Employers would have to notify workers of their electronic monitoring activities. The bill would also restrict the sale or transfer of employee data and prohibit the collection of sensitive information like biometric data, including fingerprint, face, and iris scans.
A recent study by the Center for Democracy & Technology and Coworker.org found that workers desire protections like those the FAIR Act offers. After participating workers discussed various types of workplace surveillance among themselves, the results were striking: overwhelming majorities strongly supported requiring greater transparency regarding employers’ surveillance and data collection practices, prohibiting off-clock surveillance, limiting location and biometric tracking, and barring employers from engaging in productivity monitoring that would harm workers’ mental or physical health.
Notably, this support transcended traditional political divides. For example, nearly identical and overwhelming majorities of liberal (96 percent), moderate (95 percent), and conservative (94 percent) participants favored requiring employers to disclose what types of employee data they collect and how they collect it. The FAIR Act’s electronic monitoring protections are thus not merely the right thing to do for workers; they also have the support of workers from all political backgrounds.
Of course, new technologies can also help workers perform their jobs better — but only if workers have input and autonomy in how those technologies are used. The FAIR Act thus protects workers’ roles in decision-making by shielding employees from adverse actions if they reasonably refuse to follow the output of an AI system because it would lead to a negative outcome. This ensures decision-making authority ultimately stays with humans rather than machines. That benefits not only workers, but the clients, customers, and patients their employers serve.
Importantly, the FAIR Act would also tackle automated decision systems by requiring independent impact assessments before employers implement such systems, banning their use in predicting employee behavior or interfering with protected activities, and ensuring workers receive meaningful notice before employers use them to make key decisions about them.
Bossware practices, and the dystopian workplaces they create, do not align with Massachusetts values. Unfortunately, we cannot expect the Trump administration to adopt rules or regulations addressing these practices — but as technology reshapes our workplaces, our laws must evolve to protect workers’ rights and dignity.
The FAIR Act represents a thoughtful, balanced approach to addressing the challenges posed by bossware systems. By passing this legislation, Massachusetts can set a national standard for worker protection in the digital age.
Chrissy Lynch is president of the Massachusetts AFL-CIO. Amanda Ballantyne is executive director of the AFL-CIO Technology Institute. Matthew Scherer is senior policy counsel for workers’ rights and technology at the Center for Democracy & Technology. Lynch is a member of the board of MassINC, the nonprofit civic organization that publishes CommonWealth Beacon.
The post We need to protect workers from dangerous ‘bossware’ technology appeared first on CommonWealth Beacon.