18 August, 2019

Tech versus human rights

Claire Downing

Imagine that your government could identify you and track your movements digitally, solely based on your physical appearance and perceived ethnicity or race. This is not the stuff of dystopian science fiction, but is happening now due to the widespread use of artificial intelligence (AI) tools. One of the most egregious examples of the abuse of AI tools like facial recognition is their use in China’s repression of the Uighurs, an ethnic minority group that lives mostly in the country’s far-western Xinjiang region. From police checkpoints to detention camps where at least one million people are incarcerated, horrific stories have emerged about China’s effort to “reeducate” the mostly Muslim minority. Chinese authorities have even designed an app specifically to monitor the Uighurs’ activities.

But this phenomenon is not confined to China. Facial recognition software presents one of the largest emerging AI challenges for civil society, and new surveillance technologies are quietly being implemented and ramped up around the globe in order to repress minority voices and tamp down dissent. Authoritarian countries like the UAE and Singapore have jumped on the facial recognition bandwagon. Despite the serious concerns these technologies raise over privacy and human rights, the international response to their use has been tepid.

In the United States, reaction to this technology has been mixed. A New York school district will soon become the first in the country to implement facial recognition software in its schools. Meanwhile, San Francisco recently became the first city to ban facial recognition software due to the potential for misuse by law enforcement and violations of civil liberties, and the Massachusetts town of Somerville just followed suit. In short, some local and national governments are moving ahead with facial recognition while others are cracking down on it.

So why is this uneven response problematic? The short answer is that the same software that is used to help track and detain Uighurs in China can be employed elsewhere without proper technological vetting. While facial recognition software may be touted as a more efficient way to track and catch criminals or to help people get through airports more easily, it is not a reliable or well-regulated tool. Human rights organizations have raised major concerns about government use of such technologies, including accuracy issues with facial recognition software and the software’s propensity to reinforce bias and stereotypes. Last year, a researcher at the Massachusetts Institute of Technology found that while commercially available facial recognition software can recognize a white face with almost perfect precision, it performs much worse for people of color, who are already disproportionately affected by over-policing.


As governments embrace facial recognition software, some tech companies have taken notice of the related human rights issues. Microsoft recently refused to partner with a law enforcement agency in California over concerns about potential misuse of its products in policing minorities. An Israeli startup has developed a tool to help consumers protect their photos from invasive facial recognition technology that can violate their privacy. Still, in most cases, companies cannot be trusted to regulate themselves. Amazon, which developed the facial recognition software Rekognition, offered to partner with US Immigration and Customs Enforcement (ICE), raising concerns that its technology could be used to target immigrants. There is still insufficient oversight of these companies and, more importantly, of the governments that continue to partner with them. As a result, these companies are complicit in the repression of groups vulnerable to this technology.

So what can policymakers and others do to combat the challenges presented by facial recognition technology? First, lawmakers around the globe need to craft legislation that limits their respective governments’ use of facial recognition software and limits companies’ ability to export these tools abroad, as has been the case with other invasive tech tools.

Second, individual cities and countries across the world, beyond liberal bastions like San Francisco, should prohibit police from using facial recognition tools. Seattle and several cities across California have adopted similar policies but have not gone as far as San Francisco.

Third, international bodies like the United Nations should take a more active role in advising governments on the intersection of tech tools and human rights. As Philip Alston, the UN special rapporteur on extreme poverty and human rights, recently noted, “Human rights is almost always acknowledged when we start talking about the principles that should govern AI. But it’s acknowledged as a veneer to provide some legitimacy and not as a framework.” The UN is well-placed to provide an international framework for tech governance, and should do so.

Finally, human rights organizations have been raising concerns about facial recognition software and other AI tools for years, but instead of focusing exclusively on legislative fixes, they need to increase investment in public information campaigns.

 Fair Observer

 
