At the Olympics, AI Is Watching You

“What we’re doing is transforming CCTV cameras into a powerful monitoring tool,” says Matthias Houllier, cofounder of Wintics, one of four French companies that won contracts to have their algorithms deployed at the Olympics. “With thousands of cameras, it’s impossible for police officers [to react to every camera].”

Wintics won its first public contract in Paris in 2020, gathering data on the number of cyclists in different parts of the city to help Paris transport officials as they planned to build more bike lanes. By connecting its algorithms to 200 existing traffic cameras, Wintics’ system—which is still in operation—is able to first identify and then count cyclists in the middle of busy streets. When France announced it was looking for companies that could build algorithms to help improve security at this summer’s Olympics, Houllier considered this a natural evolution. “The technology is the same,” he says. “It’s analyzing anonymous shapes in public spaces.”

After training its algorithms on both open-source and synthetic data, Wintics adapted its systems to, for example, count the number of people in a crowd or the number of people falling to the floor, alerting operators once that number exceeds a certain threshold.

“That’s it. There is no automatic decision,” explains Houllier. His team trained interior ministry officials in how to use the company’s software, and the officials decide how they want to deploy it, he says. “The idea is to raise the attention of the operator, so they can double check and decide what should be done.”
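In rough terms, the alerting Houllier describes reduces to a count compared against a limit, with a person deciding what happens next. The short Python sketch below is a hypothetical illustration of that pattern, not Wintics’ software; the camera IDs, threshold value, and function names are invented for the example.

```python
# A hypothetical sketch of threshold-based alerting, loosely modeled on the
# behavior described above: the software compares a per-frame person count
# against a limit and, if it is exceeded, flags a human operator. It takes no
# action on its own. Names, IDs, and the threshold are illustrative only.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Alert:
    camera_id: str
    count: int
    threshold: int
    message: str


def check_crowd_threshold(camera_id: str, person_count: int,
                          threshold: int = 300) -> Optional[Alert]:
    """Return an Alert for operator review, or None if the count is within limits."""
    if person_count <= threshold:
        return None
    return Alert(
        camera_id=camera_id,
        count=person_count,
        threshold=threshold,
        message=(f"Camera {camera_id}: {person_count} people detected "
                 f"(threshold {threshold}). Operator review requested."),
    )


if __name__ == "__main__":
    # Simulated counts from an upstream detector (not part of this sketch).
    frames = [("cam-12", 180), ("cam-12", 320), ("cam-07", 95)]
    for cam, count in frames:
        alert = check_crowd_threshold(cam, count)
        if alert:
            # In practice this would surface in a monitoring console, where a
            # person decides whether anything should actually be done.
            print(alert.message)
```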

Houllier argues that his algorithms are a privacy-friendly alternative to controversial facial recognition systems used by past global sporting events, such as the 2022 Qatar World Cup. “Here we are trying to find another way,” he says. To him, letting the algorithms crawl CCTV footage is a way to ensure the event is safe without jeopardizing personal freedoms. “We are not analyzing any personal data. We are just looking at shapes, no face, no license plate recognition, no behavioral analytics.”

However, privacy activists reject the idea that this technology protects people’s personal freedoms. In the 20th arrondissement, Noémie Levain has just received a delivery of 6,000 posters, which her group plans to distribute, warning her fellow Parisians about the “algorithmic surveillance” taking over their city and urging them to refuse the “authoritarian capture of public spaces.” She dismisses the idea that the algorithms are not processing personal data. “When you have images of people, you have to analyze all the data on the image, which is personal data, which is biometric data,” she says. “It’s exactly the same technology as facial recognition. It’s exactly the same principle.”

Levain is concerned the AI surveillance systems will remain in France long after the athletes leave. To her, these algorithms enable the police and security services to impose surveillance on wider stretches of the city. “This technology will reproduce the stereotypes of the police,” she says. “We know that they discriminate. We know that they always go in the same area. They always go and harass the same people. And this technology, as with every surveillance technology, will help them do that.”

As motorists in the city center rage at the security barriers blocking the streets, Levain is one of many Parisians planning to decamp to the south of France while the Olympics takes over. Yet she worries about the city that will greet her on her return. “The Olympics is an excuse,” she says. “They—the government, companies, the police—are already thinking about after.”
