
Facial recognition to make smart glasses even more of a privacy nightmare

In the latest Watch Dogs video game, the protagonists have a wealth of technology at the push of a button. They can quickly scan the people around them with a mobile phone, pulling up details like their name, occupation and personality traits.

That might seem like the conjuring of an overactive imagination, but Watch Dogs has always been somewhat rooted in reality. In fact, the first game in the series drew a lot of inspiration from the devastating real-world Stuxnet worm.

Think we are still years away from the technology in Watch Dogs? Think again. The tech already exists, and you’ll never know whether it has captured you. Read on to see how smart glasses with facial recognition technology could be a privacy nightmare.

Here’s the backstory

If the name Clearview AI rings a bell, it should. The company has been at the center of many privacy concerns. It was thrust into the spotlight in 2020, when a New York Times report revealed that law enforcement agencies had access to Clearview’s facial recognition system.

Has your local police department used facial recognition software? Tap or click here to check this database and find out.

While some agencies have solved crimes using the technology, there are fears that it could soon become available to the public. The biggest concern is how the technology works and how little information it needs to get results.

The tech is so powerful that it can identify a person from a single photograph. It can also scan government databases to gather relevant information.
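
To get a feel for how little input this kind of matching needs, here is a minimal sketch of one-photo face matching using the open-source face_recognition Python library. This is a general illustration only, not Clearview AI's system, and the file names are hypothetical placeholders.

# Minimal sketch: compare one "probe" photo against one known photo.
# File names are placeholders; this is not Clearview AI's software.
import face_recognition

probe_image = face_recognition.load_image_file("probe_photo.jpg")
known_image = face_recognition.load_image_file("known_person.jpg")

# Each detected face is reduced to a 128-number encoding (an "embedding").
probe_encodings = face_recognition.face_encodings(probe_image)
known_encodings = face_recognition.face_encodings(known_image)

if probe_encodings and known_encodings:
    # compare_faces reports True when the two encodings are close enough together.
    match = face_recognition.compare_faces([known_encodings[0]], probe_encodings[0], tolerance=0.6)
    print("Same person?", match[0])

A commercial system works at a vastly larger scale, comparing one probe photo against billions of stored encodings, but the basic idea is the same: a single clear photograph is enough to generate a searchable fingerprint of a face.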

Unfortunately, with so many public-record photos, such as driver’s license and passport photos, available, it is only a matter of time before someone uses it for nefarious purposes. The FBI has a database of over 700 million photos.

But the technology is now moving into even more dangerous territory. The New York Times recently revealed that Clearview AI signed a $50,000 contract with the U.S. Air Force Research Laboratory to incorporate the tech into augmented reality glasses.

The intent is to increase security around airfields and military bases by having officers wear augmented reality glasses that could, in theory, quickly scan nearby faces for identification. However, the research lab promptly pointed out that no devices have been delivered under the contract.

The lab stressed that its collaboration with Clearview AI is to assess the “scientific and technical merit and feasibility” of such smart glasses.

What you can do about it

Thankfully, there isn’t anything you need to do. Yet. The technology may still be under development, but with so many government agencies eager to fully (or stealthily) adopt it, it might be here sooner than you think.

The bulk of Clearview AI’s data comes, unsurprisingly, from social media networks. Concerned that your image could end up in such technology? Now might be a good time for a digital cleanout.

We’ve highlighted ways to delete yourself from the internet, but there are other ways to keep your data out of other people’s hands.

For example, if deleting a social media profile seems a bit drastic, you can make it private instead. Check your security settings, especially on Facebook and Instagram, and adjust who can tag you in photos. Tap or click here for Facebook privacy settings: Most important security checks to do now.

