Technological change: a social process
With Covid-19, the digitisation of work has accelerated sharply. Much of this is useful: during the government-imposed lockdowns, many firms flocked to Zoom, Microsoft Teams and their equivalents to allow employees to stay connected and continue working from home. At the same time, there is increasing evidence that digital technology in the workplace is undermining workers’ labour conditions by enabling round-the-clock surveillance and fine-grained automated management and control.
Some of the excesses are well known, for instance Amazon’s use of algorithms to push US warehouse workers beyond their limits, which results in high rates of physical injury.[1] But evidence is accumulating that these negative trends are much more widespread than is often assumed, in Europe as well. An insightful reference here is the recent report from Cracked Labs on digital surveillance and control at work.[2]
In business and policy circles, it is often argued that undermining worker autonomy is an inevitable consequence of the march of progress: without detailed monitoring and control of workers, managers cannot improve efficiency. Proponents of this view are willing to accept certain legal limits to preserve a minimum of human dignity, but these restrictions are often considered to come at the cost of firms’ competitiveness. In the same view, it is acknowledged that automation may lead to job losses and deskilling, but these are said to be necessary to improve productivity and, ultimately, societal wellbeing.
However, this view is too simplistic. There should be much more scrutiny, debate and worker influence over the digital transition. While technology can be used to increase productivity, that is often not its sole purpose. Just as often, technology is introduced with a separate objective: to enhance management’s capacity to exert power over the workforce. Policy makers frequently conflate these two goals by assuming that increasing management’s power will automatically lead to increases in productivity. But this is incorrect. Often that power is used to selectively automate tasks in ways that deskill labour, make jobs more monotonous and hence allow firms to lower their labour costs.
As W. Brian Arthur observed, “technology is a means to fulfil a human purpose”.[3] Indeed, but whose purpose specifically? It matters who designs and implements the technology. If your focus is reducing labour costs, you will likely favour technology that breaks individual jobs down into their tiniest parts, making workers replaceable and labour cheaper. If you are a worker, you may have more of an interest in reducing the tedious, dangerous and repetitive tasks of your job, and in using technology to enhance its most interesting, valuable parts. To push for a digital transition in which technology complements labour, rather than replacing and undermining it, workers must therefore have more of a say.
Exploring existing tools for worker influence: the GDPR
The question then becomes how to give workers more of a say in the ongoing digitisation of the workplace. While some believe this requires new legislation at EU level, for instance a specific directive on artificial intelligence in employment[4], it is important not to lose sight of existing legislation that can provide tools now, not in four to five years’ time, when technological developments and businesses’ incentives will likely have crystallised around unsustainable models of automated management and surveillance.
The General Data Protection Regulation (GDPR), in particular, can become a strong instrument for workers to co-shape technological developments in the workplace. In essence, the digitisation of the workplace revolves around collecting workers’ data and using it for a variety of monitoring and digital management systems. The rules that determine how such data can be collected, and what rights workers have over it, are therefore of crucial importance. The legal provisions of the GDPR not only allow individual workers to make defensive use of their data rights, for instance by objecting to a specific instance of illegitimate processing of their personal data; they also allow workers’ representatives to influence which technologies are implemented at the workplace.
For instance, many of the systems that rate employees’ performance, track their whereabouts or monitor their communications rely on workers’ personal information. This means employers need a legal basis to process such data. A popular legal ground is consent, but because the employer-employee relationship is hierarchical, individual employees cannot freely give their consent in this context. Workers’ representatives, such as works councils, can therefore use the opportunity to negotiate collective agreements with management that stipulate the exact conditions for collection, storage and use, and which would provide a more appropriate basis for data processing in accordance with the law.
In addition, under the GDPR, employers need to carry out data protection impact assessments when implementing novel technology that involves the systematic surveillance and performance assessment of workers. On a reasonable interpretation of the law, to which multiple Data Protection Authorities subscribe, employees need to be consulted in this process. This provides an entry point for workers’ representatives to discuss the purposes of new systems and to ensure that their operation and use also take workers’ interests into account.
Taking action
However, it is true that, right now, enforcement of the GDPR is grossly insufficient. Data Protection Authorities are unable to fulfil their tasks, and the workplace is rarely a priority for them. Yet this should not be a reason for the labour movement to resign itself to the widespread use of illegal systems; rather, it is a reason for unions to become much more active in this domain. They can hire technical and legal specialists, provide training to works councils and shop stewards, assist individual workers with advice, and bring cases to court.
Beyond that, unions should reach out to other groups with similar or shared aims and build flexible coalitions. Such partnerships can include academia, civil society groups that protect citizens’ and consumers’ (digital) rights, and relevant authorities. This can help build the expertise and impact needed to address issues in an increasingly opaque and globally operating software market that may be difficult for unions or works councils to tackle alone.
1. Strategic Organizing Center (2021) ‘Primed for Pain: Amazon’s Epidemic of Workplace Injuries’, May, accessed at: https://thesoc.org/wp-content/uploads/2021/02/PrimedForPain.pdf.
2. Cracked Labs (2021) ‘Digitale Überwachung und Kontrolle am Arbeitsplatz’ [Digital surveillance and control at the workplace], September, accessed at: https://crackedlabs.org/daten-arbeitsplatz.
3. W. Brian Arthur (2009) The Nature of Technology: What It Is and How It Evolves (London: Penguin Books), p. 28.
4. Aída Ponce Del Castillo (2021) ‘The AI Regulation: entering an AI regulatory winter? Why an ad hoc directive on AI in employment is required’, ETUI Policy Brief, July.
Digital Policy Analyst at the Foundation for European Progressive Studies (FEPS)