A presumption of employment and rights on algorithmic management are at the heart of the revived platform-work directive.
Autonomy or algorithmic control? The directive will give persons performing platform work transparency rights over the algorithmic management to which they are subject.
Last Monday, European Union member states reached a provisional agreement on the compromise text of the directive on platform work. This momentous decision comes more than two years after the European Commission’s initial proposal and just weeks before the legislative window closes ahead of the elections in June to the European Parliament. While some formal steps remain to be completed, the imminent adoption of the directive comes as a surprise to many observers and insiders alike.
Politically, it is a landmark success for the co-legislators. The commission can rightfully claim to have delivered on the European Pillar of Social Rights. The parliament has secured a victory, despite the big gap between the final text and its preferred version. Even the Council of the EU, where an unprecedented ‘qualified majority’ was reached at the expense of the French and German governments, can present this as a significant step forward.
Presumption of employment
Indeed, the digital platforms were caught off guard. Despite their spin aimed at minimising the impact of the new text, the keystones of the directive represent a realistic and concrete compromise to curb unscrupulous business models based on bogus self-employment, pervasive surveillance and capricious decision-making. While the presumption of employment could have been stronger and the chapter on algorithmic management could have prompted fully fledged collective-bargaining rights, overall the directive is a step forward.
First, member states will be required to adopt an effective presumption of employment for platform workers. This does not equate to automatic reclassification as employees. Rather, it serves as a procedural tool to facilitate the determination of employment status, as defined by national law and collective agreements in conjunction with EU jurisprudence—thereby ensuring access to employment rights for those who are de facto employees, even if their contractual agreements suggest otherwise. The model should not be burdensome for claimants or respondents.
Importantly, this rebuttable and adaptive presumption of employment status will have to be based on ‘facts indicating control and direction’ rather than legal criteria or indicators, as in the commission’s and council’s earlier texts. This will help platform workers claiming reclassification: courts will have to verify concretely the working conditions of platform workers when deciding on employment status, ‘irrespective of how the relationship is classified in any contractual arrangements’ agreed upon by the parties. This privileging of substance over form is a wise application of the principle enshrined in the International Labour Organization’s Employment Relationship Recommendation, 2006 (No. 198).
The presumption, whose contours have been discussed at length, is less stringent than the strong legal tool sought by the parliament but more effective than what member states, and initially the commission, proposed. ‘Control and direction’ of the performance of work is a dynamic notion which can take several forms, direct and indirect. Paradoxically, the more abstract formulation of the presumption and the absence of rigid factors could prevent the legal criteria from quickly becoming obsolete in a fast-paced business environment. At the same time, this open definition is less prone to being gamed by unscrupulous platform operators. Necessity may be the mother of positive invention.
Each member state will have the autonomy to define the method for activating the presumption of employment and, in the process, shifting the burden of proof towards the platform. So there will be no standardised set of criteria across the EU, which means that those countries already ensnared by the platforms’ lobbyists could be still more lenient towards ‘gig’ companies. In all cases, however, the platform will have the opportunity to prove the autonomous nature of the relationship, rebutting the presumption. National measures, their wording and mechanics will be crucial to give substance to the EU’s objectives.
Platforms have a remarkable record of ‘arbitrage entrepreneurialism’ deployed to evade regulatory compliance, and their creativity knows no bounds: recourse to intermediaries and undeclared work are just two examples. The directive’s drafters are aware of this, however, and member states are encouraged to adopt measures to prevent the directive from being reduced to a mere formality. It is difficult to imagine a return to the status quo.
The proof of the directive’s success will thus be in the transposition and implementation. Guidance for social partners and the training of competent authorities, such as labour inspectorates, are mandated by the new text. After this leap forward, energies can be invested in a regulatory and enforcement agenda for improving working conditions for all non-standard workers.
Algorithmic management
The directive has a chapter on algorithmic management, unpacked into ‘automated monitoring systems’ and ‘automated decision-making systems’. The former are used to support or conduct monitoring, supervision or evaluation of the performance of ‘persons performing platform work’. The latter are employed to take or support decisions that significantly affect persons performing platform work, including their working conditions: recruitment, access to and organisation of work assignments, earnings and prices, safety and health, working time, training, promotion or its equivalent, contractual status (including account restriction) and suspension or termination.
This chapter stands out as unique. It introduces rights for persons performing platform work, thus including self-employed workers, to receive adequate information about the algorithms used to hire, monitor and discipline them. The scope of this imperative of transparency is wide. All workers, regardless of their employment status, must be informed about the categories of decisions made or supported by these technologies. It is for platforms to disclose, besides the very existence of automated monitoring and decision-making, the types of actions monitored, the purposes of the monitoring and the recipients of such information.
When it comes to automated human-resources management, the categories of decisions supported by, or outsourced to, software must be revealed to workers, along with the underlying parameters and their relative importance. Transparency measures also cover the grounds for decisions to restrict, suspend or terminate accounts or to refuse payment, as well as those affecting contractual status or otherwise having a critical impact on individuals’ lives and livelihoods.
The breakthrough here lies in the limits placed on data collection and processing in monitoring and decision-making. Platforms are banned from using automated systems to process data on the emotional and mental states of workers, data concerning the exercise of their rights to bargain collectively and to strike or their conversations with representatives, and any data generated when they are not logged on. Prohibited too is the processing of sensitive data covering grounds traditionally protected under non-discrimination law. Borrowing from the General Data Protection Regulation model, a data-protection impact assessment must be carried out when algorithm-based practices result in a high risk to rights and freedoms.
The directive gives workers the right to obtain justifications, to request human review and to challenge, or seek rectification of, decisions which infringe their rights. Such ‘due process’ safeguards are much needed in the digital arena. Automated systems will have to be overseen closely by platforms’ employees, with the involvement of worker representatives, to avoid discrimination and occupational hazards.
Notably, according to the directive, workers’ representatives will receive relevant information, in a complete, accessible and detailed way, and will enjoy information-and-consultation rights on when and how automated systems are deployed. Platforms will be obliged to assess, together with the representatives of platform workers, the risks of discrimination that may arise from the use of algorithm-based technology. They must also ensure that algorithms do not push workers to adopt an unsustainable work tempo which puts them at physical or psychosocial risk. The promotion of collective bargaining in platform work resonates with the commission’s guidelines on agreements regarding the working conditions of solo self-employed persons.
Era of accountability
Next year, the EU estimates, there could be 43 million platform workers across the union. Large chunks of the social acquis should in principle cover them, but the challenges posed by platform work require specific measures. The directive, once approved, will arguably inaugurate a new era of accountability. As a consequence, platforms will have the opportunity to improve their business models to afford genuine autonomy to workers, or to opt for employment contracts when they are willing to exercise direction and control over the workforce.
They cannot represent this as the end of the world. All responsible businesses in conventional sectors and a handful of food-delivery platforms operate in this way without major headaches, in most cases profitably. Even some chief executives have welcomed the creation of a level playing field.
The directive’s uncontroversial chapter on data-driven practices is its most promising: it could pave the way for more specific safeguards and interventions, especially with regard to artificial intelligence and algorithms assuming managerial functions. Concerns have been raised about the adequacy of the GDPR and the AI Act to govern data processing and decision-making in work environments, given the abundance of exceptions and the absence of a strong collective dimension.
The startling result is that platform workers could mobilise stronger data-protection rights than workers in conventional labour-market sectors. Indeed the directive could serve as a ‘pilot’ for a broader tool to regulate the uses of technology in the workplace. Yet another reason to celebrate a good compromise.
Source: Social Europe https://www.socialeurope.eu/gig-workers-in-europe-the-new-platform-of-rights
Antonio Aloisi is Marie Skłodowska-Curie fellow and assistant professor of European and comparative labour law at IE Law School, IE University, Madrid.
Valerio De Stefano is a law professor at Osgoode Hall Law School, York University, Toronto.