London, United Kingdom – “First we pick up on a planned protest,” said Barry Millet, an information security manager at Mitie, pointing to a computer screen showing a small protest.
“Our guys will analyze it, look at the different feeds and calculate how many people are attending. What actions have these groups taken in the past? Are they passive? Are they more direct action? Is it likely they are trying to get into the building?” he continued, demonstrating his company’s monitoring software from a trade booth at the 2021 International Security Expo (ISE) in London, at the Olympia Exhibition Center.
The recent event showcased everything from cell phone trackers and electrified motion-sensitive border fences to facial recognition software and nano-drones for government and corporate buyers.
Mitie has a range of contracts in the UK, providing security tools to immigrant detention centers and supermarkets.
At the expo, it showcased a sprawling open-source data aggregator, using information about the protest posted to Twitter, Facebook and Instagram, as well as real-time video from vloggers live-streaming the rally.
As UK police become increasingly reliant on big data, several new bills enacted to expand police powers will boost real-time and retroactive data surveillance, as well as the use of facial recognition software and artificial intelligence (AI).
The most notable of these changes in British policing is the Police, Crime, Sentencing and Courts Bill, which reached its third reading in the House of Commons on 5 July.
Monumental in scope, the bill would give authorities the power to criminalize selected marches and protests, with violations carrying a maximum penalty of 10 years in prison.
It also aims to increase arrest and search powers for people who have previously committed violent crimes.
‘Safest place in the world online’
The government claims its draft online safety law will make Britain "the safest place in the world online" and is likely to rely heavily on AI-powered content moderation.
Martyn's Law, proposed in the wake of the 2017 Manchester Arena bombing that killed 22 people, would require enhanced security surveillance and strategic planning at public venues.
Much of the technology on display at the ISE is currently being used by law enforcement agencies in the UK and around the world.
With the increased UK police budget for 2021-22, which will bring total counter-terrorism funding to £914 million ($1.2 billion), several of the security and policing solutions on display may begin to shift from a last resort to a first response.
“An hour of CCTV video from a single camera can take a police officer from one to four hours to view,” said Fariba Hozhabrafkan, chief commercial officer of SeeQuestor, a facial recognition software company contracted by the British police to use its AI for missing persons, rape and murder cases.
“An incident can contain six or seven hundred hours of CCTV footage,” she said.
SeeQuestor's software allows real-time and recorded surveillance camera images to be uploaded to the program, where it performs facial recognition on individuals of interest, linking them to existing or live images, as well as to police and Home Office databases.
If a possible match is found, the AI gives investigating officers a confidence score.
It can also perform targeted searches based on gender, race and clothing – and is trained to recognize inanimate objects such as guns, backpacks or abandoned suitcases at airports and train stations.
However, despite government and company assurances, there are major privacy concerns related to surveillance technology. Critics question the ethical use, data storage, and potential impact and biases of AI.
The London Metropolitan Police came under fire in early 2020 for the unannounced use of facial recognition, manufactured by Japanese technology company NEC, outside the busy Oxford Circus station in central London. Human rights organizations disputed the legality of its use.
While the technology is claimed to make the streets safer, researchers at the University of Essex, commissioned by the Met, found that the software not only failed 80 percent of the time but also exhibited extreme racial bias.
Nevertheless, in August the Met approved a four-year contract with NEC to deploy retrospective facial recognition software across London.
"The surveillance market is booming," said Silkie Carlo, director of the privacy and civil liberties campaign group Big Brother Watch. "Different [UK] authorities keep buying and testing [devices] with a high degree of secrecy, and often on a very shaky or non-existent legal basis.
“When live facial recognition is used, you see people on watchlists for no good reason, including activists, people with mental health problems, people who have not committed any crime.”
But according to Sabrina Wagner of Videmo, a German facial recognition software company, despite all the concerns, “[AI] usage will increase as the data only grows.”
“The police today don’t have terabytes — they have petabytes — of data to be stored as evidence.”
Videmo’s software — which can partially identify hidden faces — was used by German police after the 2017 G20 protests in Hamburg and led to 15 arrests. Authorities then created a portal where people could upload videos they captured of protesters and combined this information with video cameras and open source footage.
“I think it was 60 years of video that they had to go through,” Wagner said.
Videmo's software can make a match from the eye area alone, making it of particular interest to law enforcement pursuing protesters in balaclavas.
Videmo is now working on gait recognition technology. According to Wagner, early studies indicate that it is possible to accurately identify a person by gait and limb proportions alone.
Carlo, the campaigner, warned: "In America, people have gone to jail because of facial recognition, because people are tempted to treat it like DNA. People think it's very, very accurate. It's not. And people can be wrongly flagged quite easily. Again, the problem is that there is no legal framework for it."
The IMSI catcher, a piece of hardware with a history of deployment at protests, was also on display at the London event.
Known in the US as a Stingray, the device intercepts, tracks and monitors cell phones en masse by tricking them into connecting to it as if it were a phone mast.
Stingrays came to public attention in the United States through their use against Black Lives Matter protesters, and are rumored to have been used by the FBI for years.
They were also recently discovered in Mexico City, where giant fake antennas illegally monitored phones.
British company Revector has managed to shrink the IMSI catcher and mount it on a drone that can fly for eight hours at a time.
"The first use of IMSI catchers was in prisons, to find illegal cell phones. They were the size of a truck. Then they were the size of a car. And we've had this [drone] for two or three years... And we're looking to get things smaller than that," said a company spokesperson.
The company's drone-mounted IMSI catcher has found applications in mountain rescues and emergency services in remote areas without phone signal, as well as in the fight against wildlife trafficking, where it is used to locate poachers.
The company also works with the police.
The use of IMSI catchers is currently prohibited in the UK without authorization from the Home Secretary.
Privacy International filed a freedom of information request about UK law enforcement's use of IMSI catchers; the request was denied.
Tom Drew, head of counter-terrorism for data science company Faculty, said AI should be implemented with great care.
Faculty works with the UK Home Office, Serious Fraud Office and Passport Office, and is the official data science partner of the National Crime Agency.
“Something that’s incredibly important when thinking about public data sets is privacy,” he said.