Is the dystopian future already here? Delve into some of China’s emotional surveillance and citizenship assessment technologies and find out whether Black Mirror is coming true.

Seeker
Published: July 10, 2018

It’s safe to say that our private information isn’t private anymore. New technologies are collecting data to be sold or shared between companies every day. While we entertain ourselves with novels, TV shows, movies, and video games depicting dystopian futures, we may be closer to some of those fictional realities than we realize.

No country is exploiting repositories of personal data quite like China. Factories, state-owned enterprises, and sections of the Chinese military are outfitting workers with EEG-like hats that feed information about their emotional states through AI models designed to detect spikes in anger, depression, or anxiety during the workday. The technology reportedly helps employers identify who is stressed and adjust break times accordingly, and it is even being applied in industries like transportation and medicine, for applications ranging from assessing fatigue in high-speed train drivers to remotely monitoring the conditions of hospital patients.

While technology that mines personal data for patterns can have many positive, prosocial applications, large databases of highly personal information can have more sinister implications as well. Take China’s social credit score, a system similar to the personal rating assigned to individuals in the Black Mirror episode “Nosedive.” Like the American credit score, China’s social credit score takes financial transactions into account, but it goes much further, assessing purchase types (diapers vs. video games, for example), criminal records, charitable donations, political loyalties, and even the social credit scores of the people with whom an individual associates most. In 2016, a man was denied a plane ticket only to find out that a judge had deemed a court apology of his ‘insincere’ (possibly in part because it happened to be delivered on April Fools’ Day), sending his social credit score plummeting.

Regulations on technologies like these are few and far between. How can we define ‘insincerity’? How can we mitigate the effects of social biases we’re already grappling with as a society, while allowing those challenges room to change over time? How can we ensure the privacy of individual citizens’ sensitive data while using the resources we do have to understand broad-scale social trends to our advantage as a collective? These are questions we must consider; for now, perhaps it’s best to focus on the present rather than on a dystopian fantasy that may be closer to reality than we realize.

This video first appeared on seeker.com.
