Google will use hashes to find and remove nonconsensual intimate imagery from Search


On Wednesday, Google announced a partnership with StopNCII.org to combat the spread of nonconsensual intimate imagery (NCII). Over the next few months, Google will start using StopNCII's hashes to proactively identify nonconsensual images in Search results and remove them. Hashes are algorithmically generated unique identifiers that allow services to recognize and block known images without storing or sharing the images themselves.
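
To illustrate the general idea, here is a minimal Python sketch of hash-based matching. It is not Google's or StopNCII's implementation: real systems use perceptual hashes (such as PDQ) so that resized or re-encoded copies still match, while this sketch uses SHA-256 and a made-up hash list purely to keep the example self-contained.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes shared by a service like StopNCII.
# The value below is a placeholder, not a real identifier.
KNOWN_NCII_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def hash_image(path: Path) -> str:
    """Return a hex digest identifying the file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def should_remove(path: Path) -> bool:
    """Flag an image if its hash matches a known identifier."""
    return hash_image(path) in KNOWN_NCII_HASHES


if __name__ == "__main__":
    candidate = Path("example.jpg")  # placeholder filename for illustration
    if candidate.exists() and should_remove(candidate):
        print(f"{candidate} matches a known hash and would be filtered from results")
```

The key property, and the reason services like StopNCII share hashes rather than images, is that only the identifier changes hands: the matching side never needs to receive or retain the original imagery.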