New guide promotes transparency in the use of algorithms in the workplace, and the Max algorithm is launched to prosecute overtime fraud
The government, at the initiative of the Minister of Labour and Social Economy, Yolanda Díaz, has published a practical guide and an accompanying tool on companies' obligation to report their use of algorithms in employment. The next innovation in the fight against fraud has also been announced: the Max algorithm.
It is becoming increasingly common for companies to use algorithms or artificial intelligence systems to make automated decisions that affect employees in terms of hiring, scheduling, performance evaluation, productivity control, promotions, dismissals, etc.
These methods may sometimes be unknown to the workers or job candidates themselves, who may believe the decisions are being made by human beings.
The proliferation of intelligent technology and automated decision-making systems can therefore pose a risk to workers’ fundamental rights.
Specifically, there is a risk of infringing individuals’ fundamental rights to privacy (Article 18.1 of the Spanish Constitution, EC), protection of personal data (Article 18.4 EC), equality and non-discrimination (Article 14 EC) and occupational health and safety (Article 15 EC).
What are the current regulations concerning the use of algorithms and artificial intelligence systems in companies?
In response, current legislation establishes corporate information obligations so that the persons concerned know that their data are being processed automatically and that certain elements of their employment relationship (hiring, working conditions, dismissal, etc.) depend on an algorithm.
For this reason, the Ministry of Labour and Social Economy has published a guide whose aim is to bring together in a single document the existing obligations and rights regarding algorithmic information in the Spanish labour law system.
The guide serves two purposes: on the one hand, it sets out the information the company must provide to the legal representatives of the workforce and to the workers themselves; on the other, it indicates what information those representatives and workers may request under the regulations in force. It also analyses the company’s obligations regarding negotiation of the algorithm, audits and impact assessments.
Likewise, the guide includes a tool that aims to specify and systematise the information obligations regarding the use of algorithms and automated decision-making systems in the workplace.
The tool is organised into four headings: general information, information on the system’s logic and operation, information on its consequences, and other relevant information.
Each heading contains a series of questions, such as: what kind of technology the algorithm uses, who developed it, which decisions it makes or supports, whether and to what extent people are involved in the decision process, what training data have been used, and whether candidates have been informed about the use of algorithms.
It is clear, therefore, that planning and building an automated decision system involves many design decisions, and the resulting system depends on each of them. The information gathered through the tool will help combat so-called “algorithmic discrimination”, which reproduces classic patterns of discrimination.
In this context, it is the government’s stated desire and responsibility to put technological advances at the service of the social majority and of social justice, through innovation and by offering protection and stability to working people.
After all, in a context in which companies increasingly use algorithms to make automated decisions, and given that this may jeopardise workers’ fundamental rights, the Ministry of Labour and Social Economy wants to make algorithmic regulation a reality in order to minimise the inequality and precariousness that the use of technology may create.
The Max algorithm
Along these lines, it has already been announced that the Labour and Social Security Inspectorate (ITSS) is developing the Max algorithm to combat both excessive overtime and unpaid overtime.
According to ITSS sources, the Max algorithm will be fed information in various ways. On the one hand, it will automatically cross-reference data to look for discrepancies or inconsistencies (for example, it will be able to examine the computerised working-time records that some companies keep). On the other hand, it will derive indications from a series of variables, such as workforce size, turnover or business volume, that can point to undeclared overtime.
For example, if a company produces or invoices much more than it should according to the size of its workforce, the Max algorithm will alert the Labour Inspectorate, which will assess and carry out inspections to detect possible fraud.
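The check described above can be sketched in code. This is purely an illustrative sketch: the real Max algorithm's rules, data fields and thresholds have not been published, so every name, field and cutoff below (the `CompanyRecord` fields, the sector benchmark, the `tolerance` factor) is a hypothetical assumption, not the Inspectorate's actual method.

```python
# Illustrative sketch only: the Max algorithm's actual rules are not public.
# All field names, thresholds and the sector benchmark are invented.
from dataclasses import dataclass


@dataclass
class CompanyRecord:
    name: str
    employees: int                   # declared workforce size
    annual_revenue: float            # declared turnover, in euros
    declared_overtime_hours: float   # overtime reported in time records


def flag_for_inspection(record: CompanyRecord,
                        sector_revenue_per_employee: float,
                        tolerance: float = 1.5) -> bool:
    """Flag a company when its revenue per employee exceeds the sector
    benchmark by more than the tolerance factor while little or no
    overtime is declared - a possible sign of undeclared hours."""
    revenue_per_employee = record.annual_revenue / record.employees
    output_anomaly = revenue_per_employee > tolerance * sector_revenue_per_employee
    little_overtime_declared = record.declared_overtime_hours < 1.0
    return output_anomaly and little_overtime_declared


# A company invoicing far more than its workforce size would suggest,
# with no declared overtime, is flagged for a manual inspection.
suspect = CompanyRecord("ExampleCo", employees=10,
                        annual_revenue=2_000_000.0,
                        declared_overtime_hours=0.0)
print(flag_for_inspection(suspect, sector_revenue_per_employee=100_000.0))
```

Note that, as the article says, such a flag would only trigger an assessment and possible inspection by the Labour Inspectorate, not an automatic sanction.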
When clearly fraudulent situations are found, the ITSS will send an automated report directly to the companies concerned, as is currently done in the campaigns on domestic workers and fraudulent temporary hiring.
It is worth remembering that in the Spanish productive fabric, 6.6 million hours are worked outside the working day each week, and 44% of them are not paid, according to the latest Labour Force Survey (EPA). Given this reality, the algorithm is currently in the testing phase, with pilot trials under way in companies.
The AddVANTE labour management department remains at your disposal for further information or to resolve any queries that may arise in connection with this article.