Covert Facial Recognition: is it harming our eyes?

Near Infrared (NIR) lasers can cause us harm, such as 'glassblower's cataract' [source]

There have been concerns raised recently about AI facial recognition technology being rolled out in all kinds of everyday places, like our sports centres and supermarkets. The UN WEF Agenda 2030 propaganda focuses on 'keeping us safe' from crime by identifying shoplifters. Yeah, nah. In case you're not yet aware, the real reason for this BigTech intervention is data-harvesting for wealthy impact investors. [Find out what that means here]. But what about covert facial recognition tech, like that found in vehicles?

An Australian/UK AI company (also registered in NZ) called Seeing Machines secured significant investment recently and is growing fast. It designs and sells in-vehicle monitoring systems that track not only a driver's face (even with a mask), but also tiny eye movements, using invisible near infrared (NIR) monitoring technology, linked to AI algorithms that constantly assess and judge driving:

Screenshot from a Seeing Machines demo [source]

But before I go into what I've learned about NIR, let me backtrack a little...

Last year some vehicle manufacturers finally (hooray!) acknowledged what we drivers have been complaining about for years: info-tainment screens are dangerous. Manual, tactile controls for simple in-journey adjustments of temperature, music volume or lighting significantly reduce driver distraction. Well blow me down with a feather, who would have guessed?! Let's go 'back to the future' with common sense: bring back buttons.

The Benefits of Physical Buttons

1. Enhanced Safety – Unlike touchscreens that require drivers to take their eyes off the road to navigate menus, physical buttons provide a tactile experience, enabling adjustments by feel.
2. Improved Usability – Traditional buttons and dials allow for quicker, more precise adjustments, reducing driver frustration.
3. Better Accessibility – Physical controls cater to a wider range of users, including older drivers or those who may struggle with touchscreen sensitivity.
4. Increased Reliability – Unlike touchscreens, which can become unresponsive due to software glitches or wear, mechanical buttons tend to have a longer lifespan and more consistent performance.
Photo by Johnathan Ciarrocca on Unsplash

In contrast to this better-late-than-never, customer-focused decision, and claiming to address a 'serious problem' on our roads that doesn't really exist, a 2021 European Union Directive (for your 'safety') has mandated that all new vehicles be fitted with Advanced Driver Distraction Warning (ADDW) systems by this July, 2026. And beware: it's going global. The ridiculous objective is 'Vision Zero', i.e. by 2030 no deaths on the roads due to driver distraction or fatigue (that's assuming we still have fuel to get around by then)! And this is why companies like Seeing Machines and others like Neonode have gained market approval worldwide for AI 'solutions'.

In London, for example, a tram system has recently been fitted with ADDW, which not only alerts employers and fleet managers (and AI) if a driver is drunk, falling asleep or distracted, but also provides health insights:

...previously undiagnosed medical conditions – including overactive thyroid, sleep apnoea, and diabetes – have been identified early through [Seeing Machines] Guardian’s data insights, potentially preventing life-threatening events. These early interventions have not only improved individual wellbeing but also reinforced confidence in Tram Operations’ safety culture. [source]

Have these machines really 'improved individual wellbeing'? Maybe this snippet provides us with a clue about the real reason for this intrusive surveillance?

What is Near Infrared (NIR) AI technology?

Near infrared is a type of optical radiation, and (over)exposure to natural or artificial sources of optical radiation brings risks. That's why infrared sauna users are advised to wear protective goggles. But as this author conveniently explains when he analyses the function of sensors on our garden lights, NIR is invisible to us, because human eyes can only see within a fairly narrow range of 400–700 nm.

"Near infrared (NIR) is just slightly above red, in the 700–900 range. However, it turns out [LED] cameras weren’t specifically designed to see in NIR, it was just a fortunate side effect of how they were made. You would think a camera would just stop at 700 nm because that is all humans can see, so why make the sensor see past that? Well, it turns out to be expensive to make a sensor that cuts off sharply at one point like that — it’s simpler and easier to make it respond across the spectrum so that you cover the region you are interested in. To get the color red at 700 nm, you naturally “bleed” over into NIR for the sensor."

Here's an image from that article to help our understanding:
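The wavelength bands described above can be sketched in a few lines of code. This is purely an illustration of the ranges quoted in the article (roughly 400–700 nm visible, 700–900 nm NIR); the boundary values are approximate figures, not precise physical limits.

```python
# Approximate bands as quoted in the article (not precise physical limits).
VISIBLE_NM = (400, 700)   # range the human eye can see
NIR_NM = (700, 900)       # near-infrared band used by driver-monitoring cameras

def classify_wavelength(nm: float) -> str:
    """Label a wavelength (in nanometres) according to the bands above."""
    if VISIBLE_NM[0] <= nm <= VISIBLE_NM[1]:
        return "visible"
    if NIR_NM[0] < nm <= NIR_NM[1]:
        return "near-infrared (invisible to humans)"
    return "outside the bands discussed here"

# A sensor without a sharp IR cut-off filter responds across both bands,
# which is why ordinary cameras 'accidentally' see NIR.
for nm in (550, 700, 850):
    print(nm, "->", classify_wavelength(nm))
```

As the quoted article notes, a camera sensor that must respond all the way up to red at 700 nm will, unless deliberately filtered, keep responding into the NIR band just above it.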

So one of the problems with this type of AI facial recognition technology is that we cannot see this light, which is directed at drivers' eyes and pulsed many times per second. The 'experts' behind the NIR systems claim that 'independent studies show' >100 hours of being subjected to this invisible flashing monitoring is equivalent to one hour of sunlight. Yeah, nah. Of course, the difference is that when we can see and feel that a light is potentially damaging, we instinctively turn away, squint or close our eyes to protect ourselves. Our eyes also have a built-in safety system: our pupils contract, reducing the light reaching the sensitive retina. None of those defences work against light we cannot perceive. We already know that long-term exposure to NIR can damage the eyes, including a condition known as 'glassblower's cataract'. But because we cannot see NIR, if and when our eyes are damaged we're unlikely to be aware of it straight away, and the biological and chemical processes can take a long time to show up, by which time it's too late.
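Claims like '>100 hours of pulsed NIR equals one hour of sunlight' rest on duty-cycle arithmetic: a pulsed source only emits for a fraction of each second, so its time-averaged irradiance is much lower than its peak. Here is a minimal sketch of that calculation, using entirely hypothetical numbers; none of these figures comes from any real monitoring system.

```python
# Illustrative only: every number below is a made-up placeholder, not a
# measured value from any real NIR driver-monitoring system.
def average_irradiance(peak_irradiance_w_m2: float,
                       pulse_width_s: float,
                       pulse_rate_hz: float) -> float:
    """Time-averaged irradiance of a pulsed source = peak * duty cycle."""
    duty_cycle = pulse_width_s * pulse_rate_hz  # fraction of time the source is on
    return peak_irradiance_w_m2 * duty_cycle

# Hypothetical example: 10 W/m^2 peak, 1 ms pulses at 60 Hz
avg = average_irradiance(10.0, 0.001, 60)
print(avg)  # duty cycle 0.06, so 0.6 W/m^2 averaged over time
```

This is the shape of the argument the vendors make, but note it only accounts for average energy delivered; it says nothing about whether a given tissue responds differently to brief intense pulses than to steady light, which is exactly the kind of question the safety literature would need to answer.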

Now obviously I'm no ophthalmologist, but with this concern front of mind, I did a search using various keywords to investigate NIR health and safety information. There didn't appear to be any human-based research, but this study on 15 poor Kiwi rabbits seemed to be cited often (concluding that yes, it's harmful). I also searched the EU H&S databases, the Canadian one, and our own NZ WorkSafe. Zilch. And all of the guidelines for NIR risk assessments I found were more than ten years out of date. This is all very worrying for drivers, especially those with modern vehicles, who may be particularly vulnerable to these risks. And what about the risks for military and airline pilots, who are subjected to this technology too?

And wait, there's more bad news. At the moment, Seeing Machines' cameras are often retro-fitted to planes and to vehicles like trucks, delivery vans or school buses. Paradoxically, these cameras, wires and flickering lights clutter up an already distracting cabin. But because Seeing Machines' significant shareholder is Japanese BigTech giant Mitsubishi Electric (yes, a US military partner), these systems are already an integral part of many vehicles' dashboards. In fact, at a conference last year, Seeing Machines revealed its latest innovation, an NIR machine hidden within a rearview mirror:

I'm intrigued by this unethical strategy: collecting personal data from our bodies without our fully informed consent. And once again we get to play guinea pig in a global health data-harvesting experiment for BigTech investors that is invasive, exploitative and controlling.

Do you have experience of a Seeing Machines 'Guardian' NIR camera or similar in your vehicle? Do you have expertise about NIR health and safety? Let me know in the comments below or respond to me privately.