With home working widespread post-Covid and businesses gauging options to monitor staff, we outline the key legal risks
The Future of Work Report 2021: Remote/Controlled shows that a majority of employers expect employee monitoring to increase in the post-COVID-19 world of working from home and hybrid working. Why? Working from home requires trust, and this is apparently not a given for every employer. Some may feel they lose control over employees who work from home rather than in the office, a loss they may seek to compensate for by using employee monitoring software. Tech companies have developed a long list of monitoring and control tools, including: checks of log-in data, movement profiling via work mobile phones, monitoring of PC cameras, browsing history and emails, keylogger software, data loss prevention tools and facial recognition systems, among many others.
What should tech companies consider when developing monitoring software from a legal perspective?
“Tech companies will need to carefully consider the complex web of regulations that employers must navigate when using the technology."
When developing their technologies, tech companies will need to carefully consider the complex web of regulations that employers must navigate when using the technology. When using monitoring software and technologies, employers must respect privacy, data protection, telecommunications and cybercrime laws and, in some countries, the rights of employee representatives. Employers often walk a fine line between using data to monitor misconduct and violating these laws and regulations. For multinational organisations, this can be a particularly complex area, as there are often subtle differences in regulations as well as in local expectations and norms.
Different requirements in different legal markets
If tech companies take into account the different requirements of different legal markets when developing their products, this will certainly give them an advantage over competitors who choose a one-size-fits-all approach. Monitoring software (such as facial recognition systems which, after verifying the employee's identity, assess their level of attention or distraction and lock an employee out of the work network if the system judges that they are not focusing enough) may be permissible in the US, for example, but must be measured against the strict limits of the General Data Protection Regulation (GDPR) in the EU and the UK GDPR in the UK.
In addition, when processing employee data in the EU and/or the UK, data protection requirements such as data minimisation, transparency, purpose limitation and storage limitation must be observed by the employer and should be 'built in' when developing monitoring software (thus adhering to the principle of privacy by design). Tech companies should also note that there are – even in the EU – countries in which employee data protection is subject to strict local law. For example, in Germany, the constitutionally guaranteed right to informational self-determination also applies in the employment relationship. As a principle, employers may process personal data for employment-related purposes only where strictly necessary for hiring decisions or for carrying out or terminating an employment contract.
According to German case law, hidden employee monitoring measures are typically only permissible if they are based on a specific suspicion that an employee has committed a severe breach of contractual duty or a crime, and even then their use is subject to a balancing of interests test. However, openly communicated surveillance measures can be permitted if required to protect the employer's legitimate interests, depending on the scope and circumstances as well as on the available alternatives.
In addition to the strict requirements from a data protection perspective, the implementation of employee monitoring measures triggers works council co-determination rights in Germany.
Requirements across Asia will vary. For instance, under China's Personal Information Protection Law (PIPL), which came into effect on 1 November 2021, processing employees' biometric or image data may require their clear, voluntary and informed consent (which can be withdrawn) before such data can be processed for any specific purpose. There are also very strict requirements for transferring employees' personal data outside China.
Case Study: key logger software
In 2017, the German Federal Labour Court held that the hidden use of keylogger software which captured any keyboard input on the PC and which, in the case at hand, created screenshots of the computer screen at regular intervals for the purpose of employee monitoring, violated data protection law. It further ruled that findings obtained by such monitoring were barred as evidence in court proceedings. According to the judgment, hidden use of keylogger software in an employment relationship requires a specific suspicion that the employee in question has committed a crime or another serious breach of duty.
Consequences for introducing monitoring software that violates the GDPR
If an employer introduces monitoring software that violates the GDPR, administrative fines of up to EUR 20 million or up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher, could be imposed. An anonymous complaint by a dissatisfied employee would be sufficient reason for a data protection authority to start an investigation which could ultimately result in a fine. In addition, both the GDPR and the UK GDPR include a compensation regime enabling individuals (such as disgruntled employees) to claim compensation for any illegal use of personal data leading to material or non-material damage.
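As a rough illustration of how the "whichever is higher" fine cap works in practice, the following sketch computes the upper limit for two hypothetical turnover figures (the company figures are invented for illustration only; actual fines are set by the supervisory authority and are typically well below the cap):

```python
# Illustrative sketch of the GDPR maximum-fine cap for serious
# infringements: the higher of EUR 20 million or 4% of total
# worldwide annual turnover of the preceding financial year.

def max_gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Return the upper limit of the administrative fine in EUR."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Hypothetical company with EUR 300m turnover: 4% is EUR 12m,
# which is below EUR 20m, so the EUR 20m figure applies.
print(max_gdpr_fine_cap(300_000_000))    # 20000000.0

# Hypothetical company with EUR 2bn turnover: 4% is EUR 80m,
# which exceeds EUR 20m, so the turnover-based figure applies.
print(max_gdpr_fine_cap(2_000_000_000))  # 80000000.0
```

The point of the two-limb cap is that the turnover-based limb bites only for larger businesses; for smaller ones, the fixed EUR 20 million ceiling is the relevant maximum.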
Social and ethical dimension to consider
“Employers need to ensure that the relationship of trust is not undermined by the use of such software."
There is also the moral or social dimension that tech companies should consider when developing their products. Employers need to ensure that the relationship of trust is not undermined by the use of such software. Transparency here is key, and we would always recommend that employers inform their employees in advance about the use of any such controls, as well as updating their internal guidelines on what is allowed while working on the company's computer and network.
Risk of employee activism
According to our Future of Work Report, the increasing use of monitoring software could trigger employee activism in the workplace. This includes actions which could have a detrimental effect on corporate reputation or which directly give rise to legal liability. In jurisdictions such as Japan and Korea, which seek to tightly regulate working hours, employee whistleblowers may seek access to monitoring data to support complaints for breach of labour laws and underpayment claims. Monitoring and control tools may therefore need to consider not only productivity but also assisting employers to comply with local labour laws.
Employers should therefore consider putting in place measures to control or restrict their employees' actions to the extent permissible.
In Germany, this risk is limited, as the introduction of monitoring software is subject to co-determination by the works council, which ensures that employee rights are respected.
In conclusion, whilst it seems that employee monitoring may be on the rise, with multiple sophisticated technology systems available to support this changing demand for employee transparency, both technology developers and their employer customers need to ensure that they remain within the boundaries of a myriad of regulatory regimes when implementing any such monitoring. In addition, the fundamental employment relationship of trust should be maintained, something that no law can regulate for.