
Your Office Doesn’t Need To Be a Permanent Surveillance State to Stop Coronavirus

July 30, 2020

As workplaces begin to reopen, one question seems top of mind: How do I ensure that my employees aren’t sick? In a rush to answer this question, tech companies have begun marketing surveillance measures that claim to track an employee’s risk of coronavirus infection. Unfortunately, these unproven technologies have not been shown to halt the spread of the virus. What’s worse, these measures pose serious threats to workers’ privacy and autonomy. No federal privacy law currently regulates the use of these surveillance devices in the workplace. This means that employees returning to their workplaces will experience more surveillance than ever before — without sufficient legal protections or regulatory guardrails to protect them.

Employee surveillance isn’t new. Employers have used wearables, management software, and even remote activity trackers to monitor workers’ productivity. The data derived from these tools have been used to judge an employee’s behavior, personality, and even fitness for employment. Little has been done to address how pervasive tracking may harm employees and lead to workplace discrimination. The push to adopt coronavirus surveillance in the workplace will only worsen this phenomenon, and the existing legal framework is not up to the task of addressing these problems.

Temperature Checks 

Many have proposed adopting temperature checks at the workplace to identify those infected with the coronavirus. Temperature checks are not unique to the business world, with some countries instituting these checks at borders and other hubs of transportation. However, temperature checking has proven ineffective.

First, not all people with coronavirus develop a fever. Second, checking someone’s temperature, especially with a ‘temperature gun,’ is not as easy as it seems. Depending on how closely you hold the device to a person’s forehead, the reading generated can be too high or too low, causing an increase in both false positives and false negatives. Lastly, and most importantly, surveillance companies are flooding the market with temperature cameras and thermal screeners not designed for medical use — while advertising these devices as ways to detect workers carrying the coronavirus. Unfortunately, these devices only measure skin temperature as opposed to core body temperature, resulting in less accurate readings. To add insult to injury, the warehouses, factories, or other large plants installing these less accurate temperature cameras then send “infected” workers home, and often without pay. This means the least effective tool for taking a person’s temperature is being used at workplaces where being defined as “sick” has the most negative consequences.

Coronavirus Screening Questionnaires

Coronavirus screening tools are mandatory questionnaires that employees must complete before starting their day, asking whether they have been experiencing common coronavirus symptoms. Employee health providers or insurers, or sometimes even employers themselves, administer these tools. The idea behind these applications is that if a person “passes” the questionnaire by not self-reporting symptoms, then they can safely go to work. Employees who “fail” the questionnaire get sent home.

These questionnaires raise several potential privacy concerns. They collect a significant amount of data, including health data. The Equal Employment Opportunity Commission has made it clear that health data must be kept confidential and separate from normal personnel data, as the collection of this data carries risks to the employee. However, regular screenings may lead the employer to discover underlying health conditions that the employee did not wish to disclose. Additionally, the collected data, once stored, is vulnerable to security breaches. Furthermore, any employee who isn’t paid while sick is incentivized to lie about their symptoms, drastically reducing the effectiveness of questionnaires while simultaneously increasing their invasiveness. In short, these tools aren’t worth the risk.

Intra-Office Contact Tracing

Contact-tracing applications, like the one developed by PwC, are being marketed to businesses as a way to monitor the spread of coronavirus within an office. The application is loaded onto the employee’s phone and first determines whether the employee is on work property using geolocation data. Once the application determines that the employee is at work, it uses a combination of Bluetooth and Wi-Fi signals to track the employee’s interactions throughout the day. When an employee reports testing positive for the coronavirus, the application then allows the employer to see who has had significant interactions with the infected employee. The employer can then take protective action, like notifying other employees who may have been exposed, scheduling deep cleanings for the affected areas, or closing the office entirely.
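The matching step described above — logging proximity events and then, on a positive report, surfacing everyone with “significant” contact — can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical `ContactLog` design and a 15-minute exposure threshold; it is not PwC’s actual implementation, whose details are not public.

```python
from collections import defaultdict
from datetime import datetime, timedelta

class ContactLog:
    """Hypothetical store of proximity events between employees.

    Each event records that two phones were within Bluetooth/Wi-Fi
    range for `minutes` starting at `when`.
    """

    def __init__(self):
        # employee -> list of (other_employee, when, minutes)
        self._events = defaultdict(list)

    def record(self, a, b, when, minutes):
        # Log the contact symmetrically for both employees.
        self._events[a].append((b, when, minutes))
        self._events[b].append((a, when, minutes))

    def exposed(self, infected, since, min_minutes=15):
        """Return employees whose cumulative contact time with
        `infected` since `since` meets the exposure threshold."""
        totals = defaultdict(int)
        for other, when, minutes in self._events[infected]:
            if when >= since:
                totals[other] += minutes
        return {emp for emp, total in totals.items() if total >= min_minutes}
```

Note what even this toy version makes plain: the employer’s log necessarily contains every interaction, not just the ones later flagged as exposures — which is exactly the privacy concern discussed below.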

These contact-tracing apps sound eerily similar to the ones built using Google and Apple’s privacy-protective application programming interface — yet these apps aren’t designed to protect an individual’s privacy. The company using the app gets to see every interaction an infected employee has had over the course of weeks, if not months or years. Ostensibly, this is so a company can easily contact its employees to tell them they may be at risk. However, this technology also gives employers extraordinary insight into how workers go through their day, from how many times they get up to get a drink of water to who they’re speaking with, when, and how much. Companies large and small could find many uses for this data. More alarming still, this kind of technology could be used to discover whether employees are thinking of unionizing or taking action against their employer — information that could jeopardize their very jobs once employers find out and intervene.

Recommendations

This is only a taste of the kind of workplace surveillance that is likely to arise during the pandemic. We need policymakers to take action to mitigate the harms we are likely to see arise from these initiatives.

  1. Any privacy law that covers data collected due to the pandemic must include protections for data collected by employers about their employees. This is where the most surveillance and data collection is likely to take place for a majority of Americans. Employers should not be exempt from the basic obligations required of all data collectors, such as data minimization, retention limits, and deletion requirements.
  2. The White House must prioritize publishing industry-specific guidance for businesses reopening. In its reopening guidelines, the White House says that businesses need to “monitor workforce for indicative symptoms.” This guidance is too vague to be useful. In order for industry to reopen safely, the executive branch should be publishing reviews of different COVID-19 surveillance technology, focusing on effectiveness and data protection, explaining risk profiles for different industries, and publishing case studies on effective methods of curtailing transmission as they become available. Much of the enthusiasm for these surveillance technologies stems from the fact that the White House has provided only this vague guidance.
  3. Make it economically viable for workers to call in sick. Many workers in the United States are not paid unless they work, especially hourly and low-wage workers. This means that for every screening tool, temperature check, and monitoring application, there will be employees who will attempt to circumvent these systems so that they don’t lose their paycheck. Congress needs to address this problem, because unless employees are incentivized to stay home from work when sick, coronavirus transmissions will continue.

About Sara Collins

Sara Collins joins Public Knowledge as a Policy Counsel focusing on all things privacy. Previously, Sara was a Policy Counsel on Future of Privacy Forum’s Education & Youth Privacy team and specialized in higher education. She has also worked as an investigations attorney in the Enforcement Unit at Federal Student Aid, as well as the Director of Legal Services for Veterans Education Success. Sara graduated from the Georgetown University Law Center in 2014, where she was the symposium editor of the Journal of Gender and the Law. After graduating law school, she completed a Policy & Law Fellowship at the Amara Legal Center, an organization dedicated to fighting domestic sex trafficking within the DMV area. Originally from Chicago, Sara attended the University of Illinois, where she received a B.A. in both Political Science and English.