Backlash against Facial Recognition Only Possible When People Are Given the Chance to Remain Anonymous
The 24-hour gym that I go to near my apartment is moving up the technology chain. To replace the contactless card key system that lets users come and go, the gym recently introduced a facial recognition system that lets registered users open the door simply by scanning their faces. It is more convenient for users who would rather not carry their card keys around. And from the gym's perspective, it effectively prevents users from lending their card keys to non-paying friends. The goal is eventually to entice all users to switch over to facial recognition and phase out card keys entirely.
Some users seem to be quite resistant. Even though the gym has sent multiple emails asking users to register their faces for the new system, and posted the same notice on the gym's premises, the majority of users continue to show up with their card keys. Even though gym staff are around when these users visit, the users show no intention of speaking with staff to complete the facial registration. Nor do they seem particularly bothered that the card key system will soon be removed, making it a last-minute hassle for them to switch over.
Indeed, some gym users vocally oppose the facial recognition system. One middle-aged woman, in particular, had a stern conversation with staff members. Expressing concerns about data security, she spoke about her discomfort with having a photo of her face taken just to enter the gym. Reiterating that no one knows she goes to this gym and that she wants to keep it that way, she demanded strict assurances that her photo would be used only for this purpose and would not be leaked anywhere else, whether within the gym's database or to external parties.
The discomfort gym users have with facial recognition may in some ways be idiosyncratic to Japan. Japanese mass media and the general public tend to be particularly careful about taking and using pictures of people's faces without their consent. The right not to be identified in public is taken quite seriously: TV stations, in particular, use distorted voices, pseudonyms, and blurred physical surroundings, in addition to hiding faces, to minimize the chance that individuals who did not consent to being identified can be recognized in public broadcasts.
The care individuals take to minimize the public exposure of others also extends to themselves. The greater popularity in Japan of Twitter and Instagram (where nicknames are common) compared to Facebook and LinkedIn (which value real names) shows, in part, that people are more willing to share their views when they know their real identities are not disclosed. News articles and TV interviews are likely to cite "an anonymous source," even if faces and employers are shown, simply so that franker views can be drawn out.
But the concerns with facial recognition are also universal. As facial recognition becomes more prevalent not just for personal identification, but as a way to enter restricted spaces, obtain legal permissions, and even make financial payments, there are understandably more concerns about who has access to one's pictures. For people like me, who have been exceedingly willing to upload pictures of ourselves on social media and other online platforms, the growing importance of pictures as identification raises the question of who these uploaded pictures should be shared with.
As the woman at the gym so rightly alluded, there need to be concrete solutions and assurances that personal photos will not be abused in ways that harm users' privacy or their physical and financial security. From a technological perspective, that means stronger data security around cloud storage, limits on who, even among company employees, can access certain private information, and systems that permanently erase certain types of data from databases once they are no longer needed.
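To make that last point concrete, here is a minimal, purely illustrative sketch of the kind of safeguards meant above: purpose-limited access to biometric records and automatic deletion after a retention period. All names here (face_records, ALLOWED_ROLES, the 180-day window) are hypothetical assumptions for illustration, not a description of any real gym's system.

```python
from datetime import datetime, timedelta

# Hypothetical in-memory store of face templates; a real system would use an
# encrypted database with audit logging.
face_records = {
    "member_123": {
        "template": b"\x00\x01",            # biometric template, not the raw photo
        "purpose": "door_access",            # the only use the member consented to
        "last_used": datetime(2023, 1, 10),
    },
}

ALLOWED_ROLES = {"door_controller"}          # ordinary staff accounts are excluded
RETENTION = timedelta(days=180)              # assumed retention period


def lookup_template(member_id, requester_role, purpose):
    """Return a template only for the consented purpose and an allowed role."""
    if requester_role not in ALLOWED_ROLES:
        raise PermissionError("role not permitted to read biometric data")
    record = face_records[member_id]
    if purpose != record["purpose"]:
        raise PermissionError("use outside the consented purpose")
    record["last_used"] = datetime.now()
    return record["template"]


def purge_stale_records(now):
    """Permanently delete records not used within the retention window."""
    stale = [m for m, r in face_records.items() if now - r["last_used"] > RETENTION]
    for member_id in stale:
        del face_records[member_id]


# Example: a record unused for more than 180 days is erased.
purge_stale_records(datetime(2023, 12, 1))
print(face_records)   # {} — the stale record has been removed
```

The point of the sketch is simply that both restrictions, who may read the data and how long it survives, can be enforced in code rather than left to policy documents alone.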
But technological solutions may not be enough. Especially with facial recognition, there should be more recognition among its promoters and implementers that those who oppose its wider use may have very valid points about privacy concerns and potential violations. People should be given the choice to opt out of facial recognition systems if they are not comfortable with them, without suffering excessive consequences such as losing access to a gym they have gone to regularly for years. Giving people options, as well as assurances, will allow cutting-edge technologies like facial recognition to gain acceptance without popular backlash.