Trade secrets: an obstacle to algorithmic transparency?
Recently, news broke of an alleged leak of internal Google data revealing confidential information about how its famous search engine works. According to the report, the data suggests inconsistencies in the company’s official statements about its search algorithm. Leaving aside the question of the information’s authenticity, although the company has confirmed that the leaked documents are real, the broader point is that, in general, little is known about how the algorithms that permeate our lives actually work.
From the way search results are presented to users to the screening of résumés for job vacancies, there is a lack of transparency about how most automated decisions are currently made, turning algorithms into true “black boxes”[1] with a highly destructive capacity for society[2]. In this opaque scenario, companies typically argue that disclosing this information would expose their trade secrets and leave their systems vulnerable to attacks by external agents, compromising the security of their services and products and their capacity to innovate. As the sector often frames it, trade secrets thus become a genuine obstacle to algorithmic transparency.
According to Elisabeth Fekete, trade secrets correspond to confidential information that is useful to a business activity and has a certain degree of originality, but is not protected by patents. The author also emphasizes that the information must have economic value, be transferable or alienable, and have restricted access, secured through the adoption of reasonable measures to keep it confidential. In addition, she stresses that the information must be lawful, since it would make no sense to protect data that violates the law.
Trade secrets are divided into two types: industrial secrets (technical information, formulas, manufacturing methods, etc.) and commercial secrets (client lists, advertising projects, market studies, software, research data, etc.). They offer weaker protection than traditional intellectual property instruments such as patents and copyright, but they involve no registration costs and no obligation to disclose the information, which can therefore remain protected indefinitely[3].
Despite this doctrinal conception, there is no legal definition of the term in the Brazilian legal system[4]. Art. 195, XI, of the Industrial Property Law (LPI) treats the matter as a crime of unfair competition, taking into account the usefulness, non-publicity and non-obviousness of the information or knowledge in order to attract the protection of the norm. At the international level, Brazil has been a signatory to the Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) since 1994. In addition to the requirements mentioned above, art. 39 of TRIPS sets out further criteria: the information must be confidential, its economic value must derive from its being kept secret, and its owner must adopt measures to ensure that its confidentiality is maintained[5].
The General Data Protection Law (LGPD) also refers to the expression several times throughout its text. Under art. 6, VI, of the law, the principle of transparency guarantees data subjects access to clear, precise and easily accessible information about the processing of their personal data, while also ensuring the protection of companies’ commercial and industrial secrets. This protection is reinforced in the provisions governing the National Data Protection Authority (ANPD), which must, in the exercise of its powers, observe commercial and industrial secrecy.
According to Ana Frazão, although the lack of a legal definition of the expression could generate conflicts between transparency obligations and the protection of companies’ business secrets, the LGPD already imposes certain limits that prevent companies from invoking this protection indiscriminately, especially the right to an explanation and to review of automated decisions provided for in its art. 20. The LPI, the author notes, also restricts industrial and commercial secrecy by allowing this confidential information to be disclosed in the course of legal proceedings, provided they are conducted under judicial secrecy, as set out in art. 206 of that law.
Furthermore, Frazão argues that the ANPD is competent, under art. 20, §2, of the LGPD, to conduct audits into discriminatory aspects of the automated processing of personal data when a company refuses to make information available on the grounds of its commercial and industrial secrets. While acknowledging that this may not be enough to resolve the issue entirely, the author also highlights the importance of independent external audits, which could lend greater reliability to the information companies disclose, and the need for further debate on reversing the burden of proof and adopting rebuttable presumptions in complex cases involving conflicting legitimate interests.
In this context, it is worth noting that Bill 2630/2020, popularly known as the “Fake News Bill”, also sought to introduce new transparency duties for large technology companies. Under the text of the proposal, providers’ terms of use would have to present the parameters used in their content recommendation and targeted advertising systems, including general information about the algorithms used, the criteria that determine which content is recommended or targeted to a user, and alternative configuration options for users, subject to reservations protecting the companies’ commercial and industrial secrets (art. 21).
The bill also imposed a duty on providers to produce clear and accessible semi-annual transparency reports including information on, for example, the criteria, methodologies and metrics used in their systems, the number of active users and user profiles, and the number of complaints, notifications and content moderation procedures (art. 23). It further required an annual external and independent audit to assess compliance with the law (art. 24) and stipulated that providers should give free access to aggregated data for academic research purposes (art. 25).
While the bill was under discussion, these obligations were not without controversy, especially for a large part of the private sector, which argued, among other points, that the proposal required the detailed disclosure of strategic information about how their systems worked, affecting the usefulness and security of their services and products. Given the technical complexity of algorithmic processes and the difficulty of translating mathematical formulas into clear and accessible explanations, disclosing certain information about how algorithmic systems work may indeed weaken protection against external agents and limit the economic exploitation of the products and services developed, harming innovation in the sector.

Total opacity, on the other hand, can be used deliberately to secure unfair competitive advantages, to allow companies to evade regulation and potential liability, and to hide errors, inconsistencies, manipulative methods, and unethical or discriminatory models. Without transparency about how algorithms work, there is no way to assess whether they are truly aligned with democratic values or in compliance with the law. Writing on the use of risk assessment tools in the US criminal justice system, Taylor Moore warns of the dangers that unrestricted trade-secret protection of algorithms can pose to the exercise of fundamental rights. The author argues that mechanisms of social balancing are needed to reconcile the interests at stake, even if this may discourage innovation, since it is counterproductive to encourage the development of a technology with the potential to produce more discrimination.
In this sense, experts argue that the current scenario of opacity may be, to some extent, the result of a lack of regulation and of obsolete legislation that fails to address the problem adequately. One possible path, then, is to invest in greater transparency through regulation designed to balance the interests at stake. Here, Moore cautions against binary responses that force a choice between total secrecy and total transparency. Along the same lines, Pasquale highlights independent audits as a way of mitigating these problems without disclosing information that may be commercially sensitive.
The discussion, however, is far from having a single, definitive answer, even if the issue is regulated. Indeed, transparency itself, defended throughout this text, should not be seen as an end in itself[6]. Rather, it should be understood as a means to intelligibility, so that we can grasp, at the very least, the main aspects behind automated decisions and have effective ways to challenge the injustices that algorithmic systems may cause[7]. What matters most, in short, is ensuring that innovation serves social well-being rather than becoming yet another way of perpetuating and reproducing prejudice and inequality.
[1] PASQUALE, Frank. The Black Box Society: the secret algorithms that control money and information. Cambridge, MA: Harvard University Press, 2015.
[2] O’NEIL, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. 1st Brazilian ed. Santo André, SP: Editora Rua do Sabão, 2020.
[3] RAYMUNDI DOS SANTOS, Gabriela. The protection of commercial and industrial secrets in the General Data Protection Law. Undergraduate thesis (Bachelor of Laws). Federal University of Rio Grande do Sul, Faculty of Law. Porto Alegre, 2020.
[4] Ibidem.
[5] Ibidem.
[6] PASQUALE, Frank. The Black Box Society: the secret algorithms that control money and information. Cambridge, MA: Harvard University Press, 2015.
[7] Ibidem.
Rhaiana Valois
Holds a law degree from the Federal University of Pernambuco (UFPE); participant in the 41st Exchange Program of the Administrative Council for Economic Defense (PinCade); former member of the Legal Design Laboratory at USP and of the Law and Information Technology Commission (CDTI) of the OAB/PE. At IP.rec, she works in the area of Digital Platform Regulation.