The Surveillance Vending Machines

In Canada, students on a university campus had the unpleasant surprise of discovering that their vending machine was out of order, displaying a strange error message that included the words “Facial Recognition” (FR). The students were shocked; they hadn’t even considered it possible that a vending machine could use FR. Why even‽ It’s not a question of being uneducated and unaware of the possibility, but simply that it’s beyond all common sense. So what could possibly motivate the presence of FR? Looking into the company responsible for the vending machine, we discover that it uses FR for targeted advertising. That’s right! A large retailer in Canada turned out to be doing something very similar, with a similar motivation.

K Kiosk in Switzerland plans to replace all its vending machines in train stations with similar systems. We’ll have a national FR-enabled camera network in our train stations. And not even under the pretext of “safety”, just to sell more ads.

I do wonder: am I the only one so thoroughly shocked by such disregard for privacy? The seemingly pervasive lightness with which customer privacy¹ is approached in business makes me think so. Yet, reading the articles and government reports, their authors are as shocked as I am, so I’m not alone. Regardless, it’s important to spell out the objections. After all, some people judged it was a good idea, otherwise it wouldn’t have happened. What would a system respectful of the public look like? Let’s see.

¹ Specifically customer privacy, because corporate secrecy is actually not taken lightly at all.

The shocking nature of the news comes from surprise. The students were observed and categorized by the vending machine, and they had no idea it was happening. Logically, if they had known about it, they wouldn’t have been surprised to discover that an FR system was deployed in the vending machine. This is also what the Canadian government reproached Fairview for in its privacy commissioner’s report: that Fairview’s customers neither gave their consent nor were aware of the FR. Not that the students wouldn’t have been shocked; they would have been shocked, and they would have protested when they learned that someone wanted to install such a system, i.e. before the actual privacy violation occurred, not after. I’m stating this as fact because that’s exactly what happened in Switzerland when the Ktipp magazine revealed that the federal train company had issued a call for tenders for a camera network with FR features. Again, not for “security”, but strictly for demographic detection and selling more ads. They had to retract their tender and give up the project.

I think it should be mandatory to put a prominent sticker on any system that uses FR, stating that it does and for what purpose.

While awareness is necessary, it is not sufficient. Consent should be mandatory. Would the students have accepted if the vending machine retailer had asked them whether they agreed to a system with FR being installed in their university?

Well, I doubt it. But maybe the retailer could have discussed it with the students, coming to a common understanding, so that the costs are not all borne by the students and the benefits entirely captured by the retailer. Maybe a balanced system where the students benefit from FR could have been devised. It would also have forced the retailer to confront the fact that their system would have been rejected by their clients.

Sobriety

Besides the fact that the FR capabilities of the system were hidden from the students, the surprise comes from the application. We know the implications of facial recognition. What immediately comes to mind is a dystopian future of total surveillance on the level of Orwell’s 1984. Maybe the vendor would protest that they are not using FR to track dissidents or criminals (although FR has been used by a ticketing company to refuse employees of litigating law firms entry to events). But the capability is there. Whether they want it or not, they are building a national network of FR cameras. And for what? To sell “personalized ads”.

When we ask the question of proportionality, there are two sides to weigh: benefit against cost. If the benefit is high, a high cost may be acceptable. In Moscow, the installation of FR cameras in the metro network was advertised as (1) smoothing out fare payment: no need for a card or ticket, just show your face to the camera and you are let through; and (2) improving security, following a particularly deadly terrorist attack in the metro. Of course, the actual motivation may differ from the stated one, but the stated motivations were enough to build consent. I don’t think the Moscow residents got the better end of the deal, but they at least get some advantages from the FR system.

In the case of the vending machines, the user gains absolutely nothing from the addition of FR. Or rather, one thing: being more accurately targeted by advertising. People don’t want to be better targeted by advertising. In 2022, when Apple gave its users the choice to opt out of sharing their personal data with Facebook for improved advertising, 97% of them opted out. The only reason this exists is that there was no opportunity to refuse. Looking at how it benefits the vending machine retailer, the picture is also dire. They take on the added cost and operational capital of a massive screen, set in a public space, not to mention the heat it generates right next to a fridge; the added fickleness of software (the fact that the vending machine crashed is proof of that); the added electricity consumption; the added liability and complexity of an online device; the added cost of a network connection; the added liability of holding personal information; the added liability of toxic materials in the circuit boards; and reputational liability.

What was the cost-benefit analysis like? In my mind, what happened is that the revenue from advertising was estimated, and afterward the accountant’s eyes turned into giant dollar signs, completely overlooking the rest of the equation.

This is visible in the brochure from the vending machine manufacturer. We see the “additional revenue from advertising” accompanied by an infinity sign. Of course you can’t earn infinite money; it’s meant to be cheeky, but it’s telling of the mindset.

Public discussion

In modern business management practice, there is clearly a bias toward using technology at all costs, and toward completely ignoring the drawbacks as soon as tech is involved. Or the costs are assumed to be nil, since the company deploying the technology does not pay them; the general public does.

A solution to reduce this bias is to make the cost explicit. As a technologist, I’m absolutely not thrilled by this, but to get businesses to act responsibly with new tech, rather than like preteen children presented with a noisy, bright new toy with flashing lights, a tax on new technology could help. A tax on user data would also help. The only way to avoid such a tax would be for businesses to police themselves and place more value on sobriety.

Bringing the debate into the boardroom, getting the people affected by business decisions to give their point of view, would reduce negative outcomes. I don’t say that just because I’m Swiss and I value a democracy built on compromise and public debate. I say it because there is scientific evidence that decisions made after contradictory debate are better. I’ll refer to Dan Sperber and Hugo Mercier and their excellent book “The Enigma of Reason”. They posit that cognitive bias is simply human nature: however smart you are, however trained you are, you are subject to it. And it’s also human nature to make better choices when confronted with people who do not agree with you.

In a boardroom, decisions are always taken by a single interested party. No debate is possible without dissent, and therefore it’s easy to make decisions without the insight of reason built on consensus, which is always superior to self-interested reasoning.

This is especially important as those vending machines occupy public space. There should absolutely be a democratic debate over the deployment of technologies with large drawbacks in public spaces. FR is a cost borne by every person passing in front of the vending machine; the people who pay that cost need to consent to it and have the opportunity to say “no”, or to ask for the economic equation to be tilted more in their favor.

Overall, what I deplore, and what worries me, is not new technology or the use of facial recognition, but rather the total absence of public debate on the question. Our society is built on the assumption that we have a say in decisions that concern us: through the democratic process for decisions relating to the public sphere, and through market choice for the private goods and services we use. This is less and less true.

We need to increase this ability to weigh in meaningfully on the things that impact us, not decrease it. Otherwise the social contract is breached, all justifications for power relationships go down the toilet, and we open the door to outright social revolution.