Tech Company Says Its AI Can Forecast Crime

Voyager Labs claims its artificial intelligence (AI) technology can predict crimes, and police departments across the US have paid the company millions for its software. However, Meta is now suing the company.

In 2018, the New York Police Department (NYPD) agreed to pay Voyager Labs almost $9 million after the company claimed its products could predict fraud and other crimes by analyzing online behavior. In 2021, the department renewed the contract for more than $1.6 million. According to a report by The Guardian, internal documents obtained in 2021 by The Brennan Center, a civil rights organization, show Voyager Labs claims its analytics software can map a person's posts and social media connections, allowing it to "unearth previously unknown middlemen" involved in crimes.

The Los Angeles Police Department (LAPD) and other departments across the country have also begun using the company's software. The NYPD has not revealed exactly what it does with the software, but has said the tools uncover "information relevant to investigations" and also address other concerns.

While law enforcement agencies appear satisfied with the programs, Meta is not so thrilled. In January, the company, which owns Facebook and Instagram, filed a lawsuit against Voyager Labs seeking to ban it from accessing the social media giant's services. Meta alleges the AI company created more than 40,000 fake Facebook and Instagram accounts in an attempt to harvest data from the platforms. Since then, Meta has allegedly discovered 17,000 more fake accounts.

Voyager denies making fake accounts and has filed a motion to dismiss the case. However, The Guardian reports that documents show the company tells clients they can create fake social media accounts to "collect and analyze information that is otherwise inaccessible."

William Colston, a spokesperson for Voyager Labs, told reporters the company is proud of its law enforcement contracts and pleased that agencies have used its technology successfully.
