
Harmful effects of facial biometric systems on society

Alright, you need to protect your privacy.

 

Well, your life and that of your loved ones depends on privacy. My suggestion is not to cover your face with a mask all day but to be more vigilant about facial biometric systems. Understand that when you come in contact with a facial biometric system, you are giving up your most intimate data with no control over how your face is used. Ask your employer to replace the facial biometric system with a made-in-India non-biometric, non-touch attendance system. The responsibility of stopping your office from taking your facial data lies solely with you.
 

Privacy is a fundamental human right

 

Of course, your employer does have an obligation to protect your rights. Are you okay with being stalked by a biker? Or are you okay with someone following you to keep a check on whom you are dating? So why let a camera in your office take your picture and map it to your attendance system? Without the protection of this fundamental right, technology like facial biometric attendance will end up harming you and extending surveillance into every level of your life. I am sure you don’t want to work in an office where your privacy is compromised. Unless and until we have laws to protect your facial data, my strong advice is to opt for a contactless, non-biometric attendance system. You have to adopt systems with more privacy and transparency.

 

You don’t want to be tracked everywhere you go.

 

So be clear with your HR and employer that you don’t want to compromise your privacy. As an employer or HR professional, safeguarding your employees’ privacy shows that you care.

 

I am not surprised at all that computer software can be just as biased in decision-making as its human programmers.

 

MIT researcher Joy Buolamwini found that “on the simple task of guessing the gender of a face, all companies’ technology performed better on male faces than on female faces and especially struggled on the faces of dark-skinned African women”. In 2018, a trial of a facial recognition system by one of the police departments in India was unsuccessful, with an accuracy rate of just 2%. In 2019, the Ministry of Women and Child Development reported that the system could not even accurately distinguish between boys and girls, as the accuracy rate was less than 1 per cent.
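
To make this concrete, here is a minimal, illustrative sketch of how accuracy can be broken down by demographic group to expose this kind of bias. The group labels and predictions below are hypothetical and are not taken from any of the studies cited above.

```python
# Illustrative only: hypothetical predictions, not data from the studies cited above.
from collections import defaultdict

# Each record: (demographic_group, true_label, predicted_label)
records = [
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned male",    "male",   "male"),
    ("darker-skinned female",  "female", "male"),    # misclassified
    ("darker-skinned female",  "female", "male"),    # misclassified
    ("darker-skinned female",  "female", "female"),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, truth, prediction in records:
    total[group] += 1
    correct[group] += (truth == prediction)

# A single overall accuracy figure hides the disparity;
# per-group accuracy makes it visible.
for group in total:
    print(f"{group}: {correct[group] / total[group]:.0%} accurate")
```

The point of the sketch is simply that an aggregate accuracy number can look acceptable while one group bears almost all of the errors, which is exactly the pattern the studies above describe.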

 

Women, children, racial minorities and non-binary people are inaccurately recognised by facial biometric systems globally.

 

These biases make facial biometric attendance a particularly dangerous application, because gender bias already exists at the workplace. Efficiency is an unfulfilled promise, while discrimination and unreliability are the most dangerous effects. San Francisco banned police use of facial recognition technology in 2019. Police trials of facial recognition systems in the United Kingdom have been a flop show, with the technology failing 80 per cent of the time. Moreover, facial biometric attendance systems exist with no law governing them and no oversight of their use. Someone rightly said it’s policing without constraint, not policing by consent.

 

So be ready for inaccurate results from facial biometric systems if you have darker skin. Facial biometric systems are encroaching on your daily life despite warnings of discrimination and inaccuracy from lawyers, human rights activists and technology experts. Research shows that facial recognition technology is less accurate when seeking to identify darker-skinned faces.

 

So do you think a racially biased and gender-insensitive facial recognition system is worth the privacy risk?