With HMA, platforms will be able to scan for any violating content and take action as needed.
Posted on Wed 14 Dec 22 at 01:45pm

San Francisco: Meta has launched a new open-source software tool called “Hasher-Matcher-Actioner” (HMA) that will help platforms stop the spread of terror content, child exploitation material, and other violating content.
HMA builds on Meta’s earlier open-source image- and video-matching software and can be applied to any type of violating content.
“Meta spent an estimated $5 billion globally on security last year and had more than 40,000 employees dedicated to it,” the company said.
“Among them, we have a team of hundreds of people dedicated to counterterrorism work, with expertise ranging from law enforcement and national security to academic research on counterterrorism intelligence and radicalization,” it added.
The new tool allows platforms to create and run their own databases, while also allowing them to leverage existing hash databases.
According to the company, this means platforms don’t need to store offending images or videos themselves; everything can be run through the hash database they use to detect posts that violate their rules.
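The hash-database approach described above can be sketched in a few lines. This is an illustrative toy, not HMA’s actual API: every name below is hypothetical, and it uses an exact SHA-256 hash for simplicity, whereas real matching systems use perceptual hashes (such as Meta’s PDQ) so that slightly altered copies of an image still match.

```python
import hashlib

def hash_content(data: bytes) -> str:
    """Hash the raw bytes of an uploaded image or video segment."""
    return hashlib.sha256(data).hexdigest()

class HashDatabase:
    """A database of hashes of known violating content.

    The key property: the platform stores only hashes,
    never the offending media itself.
    """
    def __init__(self):
        self.banned: set[str] = set()

    def add(self, data: bytes) -> None:
        self.banned.add(hash_content(data))

    def matches(self, data: bytes) -> bool:
        return hash_content(data) in self.banned

def review_upload(db: HashDatabase, data: bytes) -> str:
    """The 'actioner' step: decide what to do with an upload."""
    return "remove" if db.matches(data) else "allow"

db = HashDatabase()
db.add(b"bytes of a known violating image")
print(review_upload(db, b"bytes of a known violating image"))  # remove
print(review_upload(db, b"bytes of a harmless photo"))         # allow
```

Because only hashes are exchanged, multiple platforms can share one such database, which is the point of the shared industry databases the article mentions.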
Meta shared the tool shortly before it takes over the chairmanship of the board of the Global Internet Forum to Counter Terrorism (GIFCT) next month.
GIFCT is an organization the company formed with Twitter, YouTube and Microsoft in 2017 to combat extremism online.
GIFCT is a non-governmental organization that brings together technology companies to tackle terrorist content online through research, technical collaboration and knowledge sharing.
