By Beth Timmins, Business reporter, BBC News
Facebook has announced it will no longer use facial recognition software to identify faces in photographs and videos.
There have been growing concerns about the ethics of facial recognition technology, with questions raised over privacy, racial bias, and accuracy.
Regulators had not yet provided a clear set of rules over how it should be used, the company said.
The company has faced a barrage of criticism over its impact on users.
Until now, users of the social media app could opt in to a feature that scanned their faces in pictures and notified them if someone else on the platform had posted a picture of them.
In a blog post, Jerome Pesenti, vice president of artificial intelligence at the firm, said: “Amid this ongoing uncertainty, we believe that limiting the use of facial recognition to a narrow set of use cases is appropriate.”
In 2019, a US government study suggested facial recognition algorithms were far less accurate at identifying African-American and Asian faces than Caucasian faces.
African-American women were even more likely to be misidentified, according to the study conducted by the National Institute of Standards and Technology.
Last year, Facebook also settled a long-running legal dispute about the way it scans and tags photos.
The case had been ongoing since 2015, and under the settlement the firm agreed to pay $550m (£421m) to a group of users in Illinois who argued its facial recognition tool was in violation of the state’s privacy laws.
Other tech firms, such as Amazon and Microsoft, have suspended sales of facial recognition products to police as use of the technology has become more controversial.
Facebook, which as well as running the world’s largest social media network also owns Instagram and the messaging service WhatsApp, has come under growing pressure from regulators and politicians.
It is facing increased scrutiny from regulators including the US Federal Trade Commission, which has filed an antitrust lawsuit alleging anti-competitive practices.
And last month, a former employee accused the company of unethical behaviour. Frances Haugen released a cache of internal documents which she said showed Facebook had put profit before user safety.
Chief Executive Mark Zuckerberg said Ms Haugen’s claims were part of a co-ordinated effort to “paint a false picture” of the company.
The firm recently announced a new name, Meta, for the broader parent company following a series of negative stories about Facebook.
Mr Zuckerberg said the existing brand could not “possibly represent everything that we’re doing today, let alone in the future” and needed to change.