Uruguay is poised to develop a “Facial Identification” database for public safety purposes under the Ministry of the Interior. This system was approved using the National Budget Act as an “omnibus law,” thus preventing proper discussion about the issue due to the tight deadlines for approval of this type of law.
Development of this database will be the responsibility of the Ministry of the Interior, drawing on the database currently controlled by the National Directorate of Civil Identification, the body in charge of issuing identification cards. The database will include facial images of adults, first and last names, sex, date of birth, nationality, and identification card number, along with its issue and expiration dates. The Ministry of the Interior has already purchased automated facial recognition software and currently operates a system of 8,433 cameras distributed across the country’s 19 departments, in addition to private surveillance systems. The national government has admitted that the intended use of this facial identification database is automated surveillance using facial recognition algorithms.
Particularly concerning is the broad discretion granted to the Ministry of the Interior over possible uses of this facial identification database: it covers any use for public safety purposes that falls within the missions of the Organic Police Law. The concept of “public safety” is so broad that it places no meaningful limits on public authorities’ use of personal data.
What could go wrong?
Several recent studies warn that most commercial facial recognition systems exhibit significant bias and remain immature technology. Biased facial recognition is particularly problematic for public safety uses because errors can lead to false accusations and unjustified arrests.
But let’s suppose that the facial recognition algorithms work correctly, and the database is managed carefully from a technical point of view by the Ministry of the Interior. That would mean state surveillance systems could identify each individual perfectly. In that case, the question is: do we really want to go there?
The use of this technology entails great risks: it may be used to find and arrest protesters or protest organizers, or it may be used to track people remotely without their knowledge, among other concerning uses. In addition, living in a surveillance society affects people’s privacy and can also affect freedom of expression, movement and assembly, in ways we do not yet suspect. How will facial identification affect the behavior of Uruguayans? Has the impact of the possible social consequences of using biometrics in the public space been analyzed? Is it necessary and proportional?
In Uruguay the topic was included in a Budget Act (omnibus law) with no public discussion, but in other countries legislators have proposed or even approved laws prohibiting government use of facial recognition to surveil citizens, including bans on other biometric technologies such as voice recognition, gait recognition, and recognition of other immutable physical characteristics. Several organizations working on human rights and technology issues in Latin America have highlighted problematic cases involving facial recognition and have warned about the risks this technology poses to the population.
In addition, it is important to emphasize that international human rights organizations have warned about the potential for abuse and recommend that countries regulate the technology by law, analyzing in detail the scope of its use and its necessity and proportionality. Along these lines, in 2020, renowned tech companies placed moratoriums on offering their facial recognition solutions to governments, requesting that such use be regulated through parliamentary procedure.
Warning of civil society
On October 13, 2020, the two articles establishing the facial identification database were approved by the Chamber of Representatives without debate of any kind. When the Budget Act passed to the Chamber of Senators (during October and November), the DATYSOC team warned about the potential dangers of using automated facial identification for public safety purposes and about the excessive discretion granted to the Ministry of the Interior, managing to place the issue on the media’s agenda. In turn, along with over 20 organizations from Uruguay and the region, we sent a letter to the Chamber of Senators of Uruguay asking that these articles be removed from the draft Budget Act.
Based on these warnings from civil society, several senators took a position in favor of separating out these articles, or requested a requirement that a prior court order authorize any use of facial identification data for public safety purposes. Unfortunately, given the short discussion period a Budget Act allows, no agreement was reached. The articles, which grant the Ministry of the Interior carte blanche, were approved without amendment by both chambers, foreclosing the in-depth parliamentary debate the issue requires.
Having exhausted the options to influence the parliamentary discussion process surrounding the Budget Act and seeking possible pathways to avoid greater harm, we at DATYSOC have decided to press for the inclusion of this issue in the 5th Open Government Action Plan 2021-2025. We are seeking a commitment from the Ministry of the Interior that allows, at minimum, the possibility of an informed debate with the participation of the many interested parties prior to its implementation.
Furthermore, we are closely analyzing the impact of these measures on human rights, seeking the greatest possible transparency in the process and its implementation, to keep the population informed.
Learn more about the issue
Selected Projects 2020
Digital and physical spaces are increasingly connected. Political and social tensions, the public’s relationship with the state, and the state’s use of technology are raising new and complex challenges to digital rights. The ongoing pandemic, and related state responses, are creating further cause for concern: throughout the region, we are seeing widespread misuse of personal data, limits on expression, a lack of information and knowledge reaching vulnerable communities, and many other alarming developments.
To support the advancement of digital rights in the region, Indela opened its second open call in 2020. We received 138 proposals from 15 Latin American countries.
Today, the Indela team is pleased to announce the six projects selected for its second funding cycle. We are very proud to support these innovative initiatives, which will work on free and fair copyright reform, reducing online gender-based violence, localizing public data protection policies, and user-centric cybersecurity laws, among other issues.
These six projects will receive funding for 12 to 18 months, as well as customized support to strengthen the impact of their work.
The final selections from Indela’s 2020 open call are as follows:
- “REMIX: discussing copyright and the Internet,” by Agência Lema and InternetLab, will foster a public conversation about copyright in Brazil, and the need for progressive reforms.
- “Supporting Victims of Online Gendered Violence,” by the Cultivando Género Civil Association, will support women and girls in Aguascalientes, Mexico, who have been targeted by digital violence, helping them learn about the legal options available to them and make informed decisions in exercising their rights.
- “DATYSOC: towards a Comprehensive Digital Rights Agenda in Uruguay,” by DATA Uruguay, will strengthen Uruguay’s legislative digital rights agenda by advocating for public interest copyright regulation and internet intermediary liability policies.
- “Multicultural digital rights frameworks for indigenous and afro-descendent communities in Bolivia: comparative analysis and public policy advocacy,” by Asociación Aguayo and Fundación InternetBolivia.org, will work to develop contextualized regulatory frameworks for internet access and personal data protection in selected Bolivian municipalities.
- “A Multi-sector Initiative for Information Security and Fundamental Rights,” by Vía Libre Foundation, is a collaboration between public- and private-sector actors to develop policies that safeguard digital assets (including personal data and critical infrastructure) in Argentina.
- “Building bridges between Latin America’s digital rights and consumer defense communities,” by the Brazilian Institute for Consumer Defense (IDEC), will coordinate the digital rights work of consumer defense advocates with the strategies of the region’s digital rights community. In particular, the project will focus on personal data protection policies and their enforcement.
Congratulations to the organizations selected in Indela’s 2020 Open Call!
Ramiro Alvarez Ugarte
The privacy movement has always waged battle in difficult terrain, marked by the objective willingness of citizens to cede their data in exchange for benefits that they perceive as useful. The COVID-19 pandemic poses a new challenge that must be viewed as part of this long history in order to be effectively understood, and I believe that the challenge has never been so great.
Let me begin by presenting the adversary in the best possible light. A massively adopted surveillance technology already exists in society, on top of which an additional layer is being proposed for deployment in exchange for two concrete benefits: slowing the spread of the virus and lifting quarantine measures more efficiently. The promised benefits are significant and should not be quickly cast aside. Fewer deaths and less sustained economic damage appear as desirable objectives in the context of a rather frightening situation that has literally all of us locked up in our homes.
The new layer of surveillance on offer could take different forms. In China, it has manifested as a digital passport accessible via a popular e-wallet system that classifies people according to nebulous criteria. For the West, large companies are promising solutions that are more respectful of citizens’ privacy. Meanwhile, several governments have looked for some kind of technological solution to the problem. All these scenarios have their disadvantages: the ones that are specific to Latin America have already been pointed out by local organizations. In this brief space, I would like to draw attention to a structural dimension that I believe should guide us in reacting to this challenge. To do so, I’ll employ the old concept of layers, from the “internet world.”
Indeed, I believe that the best way to approach the newly proposed surveillance is to understand it as “one more layer” that would be deployed over a series of underlying layers. The image is useful because the underlying layers largely define how the top layer operates; they determine what the top one can achieve and what it asks of us in return. That is to say, the new layer of surveillance is deployed on top of an existing infrastructure, on top of specific socioeconomic situations, and, crucially, on top of the operating patterns of democratic institutions, both in terms of decision-making and mechanisms for accountability.
It is this layer of “democratic governance” that I want to focus on, since I sense that the new layer of surveillance being offered to us is more or less inevitable, in part because it already exists: Google Maps users around the world already share their location data in return for a richer user experience. Who wouldn’t allow access to their location data in exchange for their neighbors’ survival? This focus therefore asks us to make a tactical pivot: to reduce the damage, or to influence the decision-making process that would (or would not) lead to a new layer built on sensitive data, with the manifest intention of making that data “shareable” so that other people and/or health authorities can more closely monitor the population during times of quarantine.
The first step in this direction requires assuming that surveillance technologies indeed operate in one way or another in accordance with the function of the underlying layers. Thus, for example, an invasive system of mass surveillance of communications may be questionable in and of itself, but its consequences will simply be different according to whether it’s implemented by a dictatorship or by a democracy. Similarly, a CCTV system can have different consequences if the authority in charge of it is, for instance, accountable on a regular basis to a legislative oversight committee, versus if it has absolute discretion to expand, use, and enhance that system.
The second step is, then, to analyze and study the aspects of democratic governance that will determine the function of the new layer of surveillance and take direct actions specific to them.
• The personal data infrastructure. What kind of protection exists in our country? Is it effective? Is the current legislation up to date? How do the bodies responsible for protection or enforcement work? In many countries, the answers to these questions are disappointing, but there is also special protection for “sensitive” data. Another issue that needs to be examined is related to existing mechanisms for accountability. Could the judiciary branch, for example, exercise effective control over contagion tracking systems? What about the legislative branch?
• The reach of the exception. It is fundamental that we assess the extent to which our institutions are capable of creating policies of “exception” and respecting that exceptional character. In my country, for example, rules for exceptional situations have often been created and then normalized. Can we prevent that from happening? On the other hand, it is also important to pay attention to the various instruments that can be used to define the limits of the state of emergency (time limits/expirations, objective criteria such as numbers of cases, etc.). Therefore, it is essential to evaluate the current constitutional mechanisms that allow the state of emergency to be legally “produced” and the controls that regulate these mechanisms.
• Science. Another possible focus is to partner with epidemiologists, who know that an app could never be the sole tool used to manage a pandemic, a complex equation of which contact tracing is only a small part. For example, everything indicates that without a substantial increase in a society’s testing capacity, a tracking application could yield erroneous results (false positives and negatives). This is a typical tactical approach: focus on the underlying vulnerability of a proposal that is simpler and cheaper, and which surfaces as an alternative intervention when a more complex and effective proposal is too costly.
• Constitutionality. There is no reason why, when confronted with the deployment of technological tools, we cannot demand the least possible invasion of privacy. Here, the constitutional arguments demanding that rights-restricting measures be as limited as possible, without sacrificing their capacity to meet their objective, should be helpful. The analysis of these limitations should take into account the amount of information collected, the guarantees established, the voluntary nature of monitoring programs, etc. Local constitutional law should be the basis of these indirect strategies, even more so, I believe, than international human rights standards.
• Transparency. All actions to be taken by governments must be transparent; they should also be designed and implemented for the benefit of citizens, with sufficient political, legal, and social controls. It is likewise essential to consider the ways in which an additional layer of control can be set up, quickly and at no additional cost, and placed on top of the new layer of surveillance, e.g. via workplaces or traffic on public roads, where public authorities or private entities oversee the right of entry. Thus, if a digital passport system is established, what prevents a supermarket from requiring the credential to be displayed before entering? This possibility should lead us to a second and vital issue: the multiple discriminatory impacts that such technologies could have.
Those who have been working on digital rights in Latin America for years will see that what has been said up to this point looks like a hasty repackaging of old challenges. But that is because these challenges are persistent: despite advocacy efforts, increased public awareness, legal changes, and occasional scandals, invasive technologies seem to be advancing ever more rapidly. COVID-19 presents an even more difficult scenario, in which existing models intrude even deeper in exchange for concrete benefits. Never has the offer of those asking us to exchange our privacy for some possible benefit been so tempting. It would be unacceptable political innocence not to see the issue from the point of view of the political leaders currently in office, who face an unexpected situation with an impact that, while still uncertain, will be massive. And it would be inept of us not to adjust our tactics (and strategies) to rise to the new challenge.
Ramiro Alvarez Ugarte is an associate professor of constitutional law at Universidad de Buenos Aires and a professor of law and social change at Universidad de Palermo (Buenos Aires). He is currently pursuing a JSD at Columbia Law School. Previously, he worked as a human rights lawyer at the Inter-American Commission on Human Rights (2009-2011) and at Asociación por los Derechos Civiles in Argentina (2011-2014), where he developed its privacy agenda. He holds an LLM from Columbia Law School (2009), where he was both a Harlan Fiske Stone Scholar and Fulbright Visiting Scholar.