
Stalking fears over the PimEyes facial search engine

An illustration of face analysis

Privacy campaign group Big Brother Watch has filed a legal complaint against facial recognition search engine PimEyes.

PimEyes allows people to search for faces in publicly posted images on the internet.

Big Brother Watch claims it facilitates harassment and has complained to Britain’s data and privacy watchdog.

But PimEyes chief executive Giorgi Gobronidze says it poses less risk of harassment than social media or other search engines.

Mr Gobronidze told the BBC that because PimEyes only searches publicly posted images, anyone who misuses it “only gets information that is available on the open internet”.

Enabling surveillance

Big Brother Watch’s complaint to the Information Commissioner’s Office (ICO) claims that PimEyes has enabled “surveillance and harassment on a scale previously unimaginable”.

From a person’s photo, PimEyes finds other photos of them published online. This may include images on photo-sharing sites, in blog posts and news articles, and on other websites.

Big Brother Watch says that by piecing together information associated with these images — for example, the text of a blog post or a photo on a work website — a stalker could determine “a person’s workplace, or indications of the area in which they live”.

“Images of anyone, including children, can be searched for and tracked across the internet,” wrote Madeleine Stone, legal and policy officer at Big Brother Watch, announcing the complaint.

She argued that the tool could be used covertly by potential employers, college admissions officers, domestic abusers or stalkers, and said it threatened to “end anonymity as we know it”.

Campaigners accuse PimEyes of illegally processing the biometric data of millions of British citizens – arguing that it does not obtain permission from those whose images are analysed.

However, PimEyes told the BBC it was technically impossible to say how many faces of UK citizens it had analysed.

Mr Gobronidze also responded to accusations that his search engine breached data protection law. He claimed it was “technically impossible to reconstruct a single photo” from the data held by the company, “even if our entire database were put on the open web”.

No monitoring

To make full use of PimEyes, users must buy one of three paid subscriptions.

In its terms and conditions, the site states that it is intended to allow people to search for publicly available information about themselves.

“PimEyes is not intended for, and is not designed for, monitoring others,” it says.

But Big Brother Watch says there is no guarantee against that. There are also concerns that the tool could be used to uncover the real identities of sex workers.

However, Mr Gobronidze says PimEyes’ “data security unit” looks for suspicious activity, such as a male user repeatedly searching for women, or a user uploading pictures of children.

And the site allows people to opt out of having their image appear in the results.

A collage of images of faces

The company argues that there are positive uses for the tool, telling the BBC that it:

  • assists investigative journalists

  • is in negotiations with German, Italian, UK and US law enforcement agencies to help fight crimes against children, human trafficking and terrorism

  • has helped women and girls remove thousands of images linked to non-consensual pornography

  • works with humanitarian organisations investigating war crimes and crimes against humanity in Ukraine

Another facial recognition search engine, however, has already found itself in legal hot water.

Clearview AI’s tool is not available to the general public, and the company says it is offered only to law enforcement. Nevertheless, it was fined £7.5m by the ICO.

Responding to Big Brother Watch’s call for an investigation, the ICO said: “We are aware of this matter and are evaluating the information provided.”