4/26/2019
Profiled article
Knight, Will. “How to Hide from the AI Surveillance State with a Color Printout.” MIT Technology Review, 23 Apr. 2019, www.technologyreview.com/f/613409/how-to-hide-from-the-ai-surveillance-state-with-a-color-printout/.
About the article author
Will Knight is MIT Technology Review’s Senior Editor for Artificial Intelligence. He covers the latest advances in AI and related fields, including machine learning, automated driving, and robotics.
Summary
Artificial intelligence (AI) powers video recognition technology that tracks and identifies our faces and bodies everywhere we go. In some cases, this technology is used to enhance our lives, with applications like eliminating the need for a checkout line at the grocery store. It is also emerging, however, as a powerful tool for government policing and surveillance, which drastically reduces privacy. There may be a simple workaround.
Researchers from the Belgian university KU Leuven have shown that a person can hide from an AI video recognition system with the aid of a simple color printout. They designed an image that, when printed on a person’s clothing or held in front of the body, effectively hid the person from the AI-powered person detector they tested. The image was designed to trick only that one specific detection algorithm, but the researchers suggest the approach could be adapted to hide people from multiple AI surveillance detectors at the same time.
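The article itself contains no code, but the core idea, optimizing a printable image until a detector’s “person” confidence drops, can be sketched roughly as below. This is a minimal illustration in PyTorch, not the KU Leuven team’s actual method: person_detector, photos, the fixed torso placement, and all hyperparameters are hypothetical stand-ins, and the published work involves additional refinements not shown here.

```python
# Minimal sketch of the adversarial-patch idea (PyTorch), NOT the researchers'
# published code. `person_detector` is a hypothetical stand-in for any
# differentiable detection model that returns a scalar "person" confidence
# for a batch of one image; `photos` are training images of the wearer.
# Placement, sizes, and hyperparameters are illustrative guesses.
import torch

def optimize_patch(person_detector, photos, patch_size=64, steps=500, lr=0.03):
    """Learn a printable patch that lowers a detector's 'person' confidence.

    photos: float tensor of shape (N, 3, H, W) with values in [0, 1].
    Returns a (3, patch_size, patch_size) patch in [0, 1].
    """
    patch = torch.rand(3, patch_size, patch_size, requires_grad=True)
    optimizer = torch.optim.Adam([patch], lr=lr)

    for _ in range(steps):
        total_loss = 0.0
        for img in photos:
            img = img.clone()
            # Paste the patch over a rough torso region (simplified: a real
            # attack would place and transform the patch per detected person).
            h, w = img.shape[1:]
            top, left = h // 3, (w - patch_size) // 2
            img[:, top:top + patch_size, left:left + patch_size] = patch.clamp(0, 1)

            # Objective: minimize the detector's confidence that a person
            # is present in the patched image.
            total_loss = total_loss + person_detector(img.unsqueeze(0))

        optimizer.zero_grad()
        total_loss.backward()
        optimizer.step()

    return patch.detach().clamp(0, 1)
```

After training, the patch would be printed and worn; how well it transfers to the real world depends on factors such as lighting, printing quality, and camera angle that this sketch does not model.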
If this technique is developed further, clothes designed with such images or patterns could conceivably let anyone, including criminals, hide from many kinds of AI surveillance and maintain privacy even in plain sight.
Response
When AI-powered recognition technology is implemented responsibly, it can enhance society, making our lives more convenient and safer by serving as a tool to catch terrorists and criminals. If the technology is used in oppressive or corrupt ways, however, the ability to hide and maintain privacy through special images is comforting. But evildoers would have access to this same safety blanket, undermining the positive impact of AI-powered recognition technology. Should these special images, which can nullify AI recognition systems, be publicly available to everyone? Or should all members of society remain visible to these systems, bringing safety and convenience to our lives at the cost of our privacy?
Related Links
For more info on: