IBM appears to have withdrawn a controversial video surveillance product from the market.

The surveillance industry trade website IPVM first reported on the company’s withdrawal of its intelligent video analytics product in May. Reporter Charles Rollet started poking around after IBM failed to appear at a major security and surveillance industry exhibition, attended by more than 30,000 security professionals this year, where the company had been scheduled to promote the video analytics software.

IPVM noted that “IBM’s video analytics have faced recent media scrutiny over human rights concerns,” citing our investigations, published with The Intercept, and the product’s “poor market performance,” as gauged by IPVM’s own market research, as reasons for the withdrawal. In those investigations, reporter George Joseph found that IBM’s video analytics technology was implicated in racial profiling in New York City and human rights violations in the Philippines.

The software uses facial- and object-recognition algorithms in real time to identify people and things in surveillance camera feeds. In March, Joseph revealed that IBM had sold intelligent video analytics software to the brutal Duterte administration in Davao City, Philippines, in 2012. At the time, Human Rights Watch had already found evidence that Davao City law enforcement appeared to be complicit in death squads targeting alleged drug dealers. IBM documents, used to promote the company’s “Smarter Cities” platform, confirmed that the video analytics technology was actively being used in drug-related enforcement operations in Davao City. IBM declined to respond to questions about whether the company had conducted any human rights due diligence before selling the software to Davao City.

Last fall, Joseph revealed that IBM had partnered with the New York City Police Department to develop video surveillance software that enabled users, including select NYPD counterterrorism officials, to search footage of pedestrians by skin tone, hair color, and facial hair. The opaque program had access to hundreds of cameras across the city. While civil liberties advocates expressed concern about the software’s potential to facilitate mass racial profiling, a department spokesperson said NYPD officers had been instructed not to use the skin tone search feature and that intelligent video analytics technology had been phased out of the city’s larger surveillance architecture in 2016.

In a little-publicized April 23 announcement on the IBM website, the company said it was withdrawing intelligent video analytics “from marketing” without replacement. IBM also confirmed the withdrawal in a tweet to Rollet. But over several weeks, IBM declined to respond to multiple requests from Type Investigations for comment on the status of the product.