The use of algorithms by companies to achieve specific results has become increasingly common. Facebook's algorithm, for example, is among the best known, defining what is displayed in each user's feed. Little has been discussed, however, regarding the liability of companies and the consequences of using algorithms.

Recently, the Section Specialized in Individual Disputes (Subsection II) of the Court of Labor Appeals for the 1st Circuit authorized an expert examination of the source code of an application in order to determine whether an employment relationship exists between the company and a driver who works through the application.

The intention is to identify the artificial intelligence data that influence the employment relationship, such as the distribution of ride requests, the pricing of services, the passing over or preferential treatment of certain workers based on the ratings provided by consumers, and even the possible application of sanctions, such as blocking the worker from the platform.
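Purely for illustration, the minimal Python sketch below shows what such rating-driven preference and blocking logic could look like. Every name and parameter in it (Driver, dispatch_priority, RATING_BLOCK_THRESHOLD, the weights) is hypothetical, since the platform's actual rules are precisely what the expert examination is meant to reveal.

```python
from dataclasses import dataclass

# Hypothetical threshold below which a driver is blocked; the real
# platform's rules and parameters are unknown.
RATING_BLOCK_THRESHOLD = 4.0

@dataclass
class Driver:
    name: str
    consumer_rating: float   # average rating given by consumers
    acceptance_rate: float   # share of ride requests accepted

def is_blocked(driver: Driver) -> bool:
    """Sanction: drivers below the rating threshold stop receiving calls."""
    return driver.consumer_rating < RATING_BLOCK_THRESHOLD

def dispatch_priority(driver: Driver) -> float:
    """Preference: higher-rated, more responsive drivers are ranked first."""
    return 0.7 * driver.consumer_rating + 0.3 * (driver.acceptance_rate * 5)

drivers = [
    Driver("A", consumer_rating=4.8, acceptance_rate=0.95),
    Driver("B", consumer_rating=3.7, acceptance_rate=0.99),  # would be blocked
]

eligible = [d for d in drivers if not is_blocked(d)]
ranked = sorted(eligible, key=dispatch_priority, reverse=True)
print([d.name for d in ranked])
```

In a sketch like this, the consumer rating simultaneously determines who receives calls first and who is excluded altogether, which is exactly the kind of influence on the working relationship the examination seeks to document.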

Although the result of the expert report is not yet available, the mere granting of the plaintiff's request is enough to prompt some reflections on the impact of algorithms on the employment relationship, this time as evidence of that very relationship.

According to the judge who drafted the appellate decision authorizing the expert examination, information rigorously recorded in technological instruments could not be disregarded in favor of "weaker" means of proof, such as oral evidence, given the fallibility of human memory and the variability of perception of the facts, which allows several different narratives about the same reality.

In other words, although the legislation provides that there is no hierarchy among forms of evidence, one of the grounds supporting the authorization of the expert opinion was the alleged robustness of expert evidence over the other means of proof. When it comes to algorithms, however, this premise can easily be disproved.

The algorithm's purpose is dissociated from the purpose of the Labor Courts, which is why the number of hours the driver was logged on may not correspond to the number of hours he was actually working, for example.
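A trivial, hypothetical calculation illustrates the gap. The session and trip records below are invented, but they show how "logged on" and "working" can answer different questions even when both come from the same system.

```python
from datetime import datetime, timedelta

# Hypothetical log entries: the platform records when the driver is
# logged on and when trips occur; the two measures need not coincide.
login_sessions = [
    (datetime(2023, 5, 2, 8, 0), datetime(2023, 5, 2, 18, 0)),  # 10 h logged on
]
trips = [
    (datetime(2023, 5, 2, 9, 15), datetime(2023, 5, 2, 10, 0)),
    (datetime(2023, 5, 2, 13, 30), datetime(2023, 5, 2, 14, 10)),
]

logged_on = sum((end - start for start, end in login_sessions), timedelta())
on_trip = sum((end - start for start, end in trips), timedelta())

print(f"Logged on: {logged_on}")  # 10:00:00
print(f"On trips:  {on_trip}")    # 1:25:00 -> the records answer different questions
```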

In addition, we cannot forget that, for an algorithm to obtain data, someone must define the structure of the database in which the data will be stored, and someone must supply that data, even when it comes from virtual sensors. This means that algorithms can be manipulated, since they necessarily involve human participation, whether in setting the applicable rules or in organizing, developing, and governing the information, so they too are fallible.
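A hypothetical ingestion sketch makes the point. The schema, the filter, and the rounding below are all invented, but each represents a human choice that determines what the stored data can later prove.

```python
# Hypothetical ingestion step: a person decides which fields exist,
# how values are rounded, and which events are kept at all. Those
# choices shape everything the "algorithm" can later show.
SCHEMA = {"driver_id": str, "event": str, "minutes": int}  # human-defined structure

def store_event(raw: dict, database: list) -> None:
    if raw.get("event") not in {"trip", "login"}:  # human-defined filter
        return                                     # idle time is never recorded
    database.append({
        "driver_id": str(raw["driver_id"]),
        "event": raw["event"],
        "minutes": round(raw["duration_minutes"]),  # human-defined granularity
    })

db: list = []
store_event({"driver_id": 7, "event": "waiting", "duration_minutes": 42.0}, db)
store_event({"driver_id": 7, "event": "trip", "duration_minutes": 18.4}, db)
print(db)  # the 42 minutes of waiting simply do not exist in the record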

That is where the second discussion on the subject comes in: the liability of developers and companies. By accepting an expert examination of algorithms as evidence, as though the system were fully automated and reliable, the Labor Courts themselves create a distance between the programmers who build algorithmic systems and the companies that use them.

The distance created by the Labor Courts weakens the very accountability of programmers, who tend not to feel morally and legally responsible for the negative effects of the algorithm, among them the use of biased databases, a bias that may originate as early as data collection and trigger the problem of algorithmic discrimination, known as machine bias or algorithmic bias. For this reason, the Labor Prosecutor's Office has set up a group against algorithmic discrimination in order to investigate companies that use biased algorithms.
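A deliberately simplified, hypothetical sketch of how such bias can enter through data collection follows. The neighborhoods, ratings, and the apparently "neutral" blocking rule are all invented, yet the rule faithfully reproduces the skew already present in the collected ratings.

```python
from statistics import mean

# Hypothetical historical data: consumer ratings are systematically lower
# for drivers from neighborhood "X" for reasons unrelated to service quality.
history = [
    {"neighborhood": "X", "rating": 3.8},
    {"neighborhood": "X", "rating": 3.9},
    {"neighborhood": "Y", "rating": 4.7},
    {"neighborhood": "Y", "rating": 4.8},
]

# A rule derived from that data -- block anyone below the overall average --
# looks neutral but reproduces the bias carried in by the ratings themselves.
threshold = mean(r["rating"] for r in history)

def blocked(driver: dict) -> bool:
    return driver["rating"] < threshold

for d in history:
    print(d["neighborhood"], "blocked" if blocked(d) else "active")
# Every driver from X ends up blocked; the bias came in with the data collection.
```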

Although the distance between programmers and companies may suggest an apparent absence of liability on the part of these professionals, every employee, like every individual, is subject to the general rule of civil liability in article 927 of the Civil Code, under which anyone who, through an unlawful act, causes damage to another is obliged to repair it.

The company itself may also be held liable, as controller of the data subjects' personal data, under article 42 of Law No. 13,709/18 (LGPD), according to which a controller who, in carrying out personal data processing activities, causes material, moral, individual, or collective damage to another person, in violation of the personal data protection legislation, is obliged to repair it.

Thus, it will be incumbent on companies to take the precautions necessary to avoid this liability, including with respect to the programmers, who make the algorithmic system as fallible as testimonial evidence, for example.

To this end, it is important that the creation of innovation areas within companies be accompanied by specific employment contracts with these professionals, containing express provisions on cases of employee liability, as well as by internal policies of their own that must be observed for this purpose.

This means that the implementation of innovation areas to optimize a company's business model demands certain labor precautions, whose observance can prevent the liability of companies and programmers for the algorithmic system used, a risk that, more often than not, goes unnoticed in internal policies.