Algorithms are sequences of actions performed by software to solve a problem or achieve a certain result. They are used, for example, to automatically search large databases for previously defined profiles in order to obtain specific data for a search, a job application, the filling of a vacancy, or even the diagnosis of diseases.

For the data to be obtained by the algorithm, it is essential to have a person responsible for defining the structure of the database in which the data will be stored and a person responsible for supplying it, even if through virtual sensors, according to previously established parameters. Human participation necessarily occurs at one of these moments, either to define the guidelines applicable to the algorithm or to organize, develop, and govern the information.

There are many examples of the use of algorithms in labor relations, such as the automated screening of job applicants' resumes or the monitoring of employees' activities against productivity targets set by the company for purposes of bonuses or dismissal.
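For illustration only, such a screening rule can be as simple as a filter applied to each application. The sketch below is hypothetical: the record fields, required skills, and experience threshold are assumptions, not the criteria of any real system.

```python
# A hypothetical resume-screening filter. The fields ("skills",
# "years_experience"), the required skills, and the threshold are
# assumptions for illustration, not any company's actual criteria.

REQUIRED_SKILLS = {"python", "sql"}
MIN_YEARS_EXPERIENCE = 3

def matches_profile(candidate: dict) -> bool:
    """Return True if the candidate meets the predefined profile."""
    skills = {s.lower() for s in candidate.get("skills", [])}
    years = candidate.get("years_experience", 0)
    return REQUIRED_SKILLS <= skills and years >= MIN_YEARS_EXPERIENCE

candidates = [
    {"name": "A", "skills": ["Python", "SQL"], "years_experience": 5},
    {"name": "B", "skills": ["Java"], "years_experience": 10},
]

shortlist = [c["name"] for c in candidates if matches_profile(c)]
print(shortlist)  # ['A']; candidate B is excluded with no human review
```

The point of the sketch is that every decision flows from criteria a person defined in advance, which is precisely why the legal analysis that follows turns on who set those criteria and on what data they operate.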

The question, however, is whether the use of algorithms is, in itself, incompatible with Law No. 13,709/18, the General Data Protection Law (LGPD), which governs the processing of personal data, including in digital media, by individuals or legal entities, in order to safeguard the privacy of data subjects.

Before assessing any compliance situation, it is necessary to identify whether personal, sensitive, or anonymized data are involved, and whether the LGPD provides a legal basis for their processing.

Personal data may only be processed in the following cases: fulfillment of a legal obligation; the carrying out of studies by a research body; the performance of a contract, or of preliminary procedures related to a contract, to which the data subject is a party; the regular exercise of rights; legitimate interest; the protection of credit; the processing and shared use of data necessary for the execution of public policies; the protection of life; the protection of health; and with the data subject's consent.

Sensitive data, on the other hand, cannot be processed on the basis of the performance of a contract or the protection of credit (absent the data subject's consent), nor on the basis of legitimate interest; it may be processed only in the other cases listed above, or to ensure the prevention of fraud and the security of the data subject in processes of identification and authentication of registration in electronic systems.

If the data processed by the algorithm are anonymized, and therefore cannot be linked to an identifiable individual, they are not considered personal or sensitive and may be used. Because most algorithms process anonymized data, it is commonly assumed that the legislation places no restriction on such processing, but this view is largely mistaken.
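The technical boundary matters here and is easy to get wrong. The sketch below, with hypothetical field names, shows a common identifier-stripping step; as the comments note, a reversible transformation of this kind is generally pseudonymization rather than anonymization in the LGPD's sense, so the resulting data may still qualify as personal data.

```python
# A minimal sketch of stripping direct identifiers from a record before
# further processing. Field names are hypothetical. Note: under the LGPD,
# data only count as anonymized if reidentification is not possible by
# reasonable means; a salted hash like this one is generally mere
# pseudonymization, because the link to the person can be restored.

import hashlib

DIRECT_IDENTIFIERS = {"name", "cpf", "email"}

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace direct identifiers with a salted hash; keep other fields."""
    out = {}
    for key, value in record.items():
        if key in DIRECT_IDENTIFIERS:
            out[key] = hashlib.sha256((salt + str(value)).encode()).hexdigest()
        else:
            out[key] = value
    return out

employee = {"name": "Maria", "cpf": "123.456.789-00", "department": "sales"}
print(pseudonymize(employee, salt="use-a-random-secret"))
```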

Anonymization does not prevent a given processing operation from being considered discriminatory, which is prohibited by articles 3, IV, and 5, XLI, of the Federal Constitution and article 6, IX, of the LGPD. It must be kept in mind that algorithms, even when operating on anonymous data, are not impartial and may reflect prejudices rooted in the history of the data, as demonstrated by various recently reported cases that companies should review.
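A toy example makes the mechanism concrete. In the sketch below (entirely fabricated data, not a real model), the protected attribute never appears in the dataset, yet a proxy variable assumed to correlate with it carries the historical disparity into new decisions.

```python
# A toy illustration of machine bias: the training data contain no
# protected attribute, but a proxy feature ("district", assumed here to
# correlate with a protected group) reproduces the historical disparity.

from collections import defaultdict

# Fabricated historical decisions; 'hired' reflects past (biased) practice.
history = [
    {"district": "north", "hired": 1}, {"district": "north", "hired": 1},
    {"district": "north", "hired": 1}, {"district": "north", "hired": 0},
    {"district": "south", "hired": 0}, {"district": "south", "hired": 0},
    {"district": "south", "hired": 0}, {"district": "south", "hired": 1},
]

# "Training": estimate the historical hiring rate per district.
totals, hires = defaultdict(int), defaultdict(int)
for row in history:
    totals[row["district"]] += 1
    hires[row["district"]] += row["hired"]

def score(candidate: dict) -> float:
    """Score a new candidate by the historical rate of their district."""
    d = candidate["district"]
    return hires[d] / totals[d]

print(score({"district": "north"}))  # 0.75
print(score({"district": "south"}))  # 0.25: same resume, lower score
```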

The problem of algorithmic discrimination through the use of biased databases, known as machine bias or algorithmic bias, can also originate in data collection, including with human participation, and has become increasingly common. For this reason, the Labor Prosecutor's Office recently set up a group against algorithmic discrimination in order to investigate companies that use biased algorithms.

The removal of bias from algorithms is an issue that has been widely discussed by companies, which should review their governance and the role of human participation in the use of the technology in order to legitimize it.

Facebook recently announced the launch of a board of experts from around the world with multidisciplinary and multicultural backgrounds, called the Oversight Board. The board is currently composed of 20 members and will be responsible for defining, for example, what type of content should or should not be removed from the social network, according to what is considered inappropriate, irrelevant, or excessive. It is an independent body that seeks to enhance the integration between human judgment and artificial intelligence.

Another problem with algorithms that process apparently anonymized data concerns the decisions arising from their use. According to article 20 of the LGPD: “The data subject has the right to request a review of decisions made solely on the basis of automated processing of personal data that affect his or her interests, including decisions intended to define his or her personal, professional, consumer, or credit profile, or aspects of his or her personality.”

This means that if a given algorithm has led to discriminatory treatment of an employee, whether for recruitment or dismissal purposes, for example, the company must respect the principle of transparency provided for in article 6, VI, of the LGPD and provide all necessary information on the processing of the data on which the decision was based, under penalty of being audited by the National Data Protection Authority (ANPD).
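In practice, meeting that obligation presupposes that the decision's inputs and criteria were recorded when the decision was made. The minimal sketch below, with illustrative structure and field names of my own choosing, logs the basis of a hypothetical automated decision so it can later be explained or reviewed by a human.

```python
# A minimal sketch of recording the basis of an automated decision, so
# the information contemplated by articles 6, VI, and 20 of the LGPD can
# be produced on request. Structure and field names are illustrative only.

import json
from datetime import datetime, timezone

def decide_and_log(candidate_id: str, features: dict,
                   threshold: float, score: float) -> dict:
    record = {
        "candidate_id": candidate_id,          # pseudonymous reference
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "inputs_used": sorted(features),       # which data fed the decision
        "criteria": {"threshold": threshold},  # the rule actually applied
        "score": score,
        "decision": "approved" if score >= threshold else "rejected",
        "solely_automated": True,              # the case article 20 covers
    }
    # Persist the record (printed here) so a reviewer can explain it later.
    print(json.dumps(record, indent=2))
    return record

decide_and_log("cand-001", {"years_experience": 5, "skills_match": 0.8},
               threshold=0.7, score=0.8)
```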

To confirm the legality of an algorithm, therefore, two questions must be considered and analyzed with great caution: the classification of the data involved (personal, sensitive, or anonymized) and whether their processing is in fact legitimate under that classification.

Otherwise, companies will be in breach of the LGPD and must review their use of the technology in accordance with the guidelines of the new regulation, under penalty of the administrative sanctions provided for by law (which may be applied as of August 1, 2021), as well as the payment of compensation for moral damages in the event of litigation in the labor courts.