The application “Live Time Intelligence” from the controversial US company Banjo, which according to the company was supposed to produce real-time intelligence with the help of artificial intelligence (AI) and thus enable “predictive policing”, contains no trace of AI. A government audit report states: “Banjo does not use techniques that meet the industry definition of Artificial Intelligence.”
Automatic crime detection promised
The big data company hit the headlines last year when it became known that the state of Utah had signed a $20.7 million contract with it. Cities and municipalities were to give the startup access to video recordings from government surveillance cameras, emergency call systems, movement data from public vehicles, and other sensitive information. In return, Banjo promised to recognize signs of dangerous scenarios such as child abduction, terrorism, shootings, or organized crime related to drug trafficking, and to raise the alarm in advance.
In May it was leaked that the company’s cofounder Damien Patton had been a member of the Ku Klux Klan as a youth and had been involved in a shooting attack on a synagogue in Nashville. Utah’s Attorney General Sean Reyes then suspended the contract. The Republican ordered that an independent body should check the alleged AI software for possible biases and data protection violations.
State Auditor John Dougall, who was assigned the task, published his report last week, which he prepared together with a specially established commission for privacy protection and against discrimination. According to it, the auditors could not discover any bias in the system, since it did not use an algorithm that could have been fed with correspondingly skewed training data.
Promised functions are missing
In many other respects, too, the program was vaporware that did not contain the promised functionality. “Banjo stated that they had an agreement to collect data from Twitter,” wrote Dougall in the report to his party colleague Reyes. “But there was no evidence that Twitter data was integrated into Live Time.” According to the report, former Banjo employees are said to have founded an offshoot, the company Pink Unicorn Labs, which secretly collected data from social networks using a series of inconspicuous-looking tracking apps.
The example advertised by Banjo, in which the system helped respond to a simulated child abduction, was never validated by the Attorney General’s office but simply taken at face value, Dougall criticizes. The result was probably delivered by a skilled human data analyst, “because Live Time lacked the advertised AI technology”.
“Certain important security features” were evidently disregarded when configuring the software, the auditors note. Banjo’s approach “was not in line with best practice”. A significant risk was that Live Time could make direct database queries to authorities. In theory, Banjo could thus have gained unauthorized access to other sensitive personal information without the knowledge of the emergency call centers. Some agencies also provided more data than was necessary, according to the report.
Don’t just believe promises
Dougall and the commission attached a long list of recommendations for procuring software to the report. In it, they urge government agencies to scrutinize technological claims and test the announced capabilities. Many vendors promise that the “magic” of AI and machine learning will solve complex problems. However, these technologies are “only just emerging and, as a rule, only specialists are familiar with the details”. At the same time, the risk of abuse is great. With big data, authorities would also have to keep an eye on the risk of deanonymization.
Attorney General Reyes feels vindicated by the findings that Live Time “neither invades privacy nor has racist or religious prejudice”. Sensitive personal information was not shared with Banjo during the testing stage. It was clear that further data protection precautions would have to be taken in order to fully exploit the potential of the system. The Republican also maintained that “the terrible mistakes of the founder’s youth” never carried over to Banjo in any way. The company now has a new managing director and a new name, SafeXai, which directly references AI (“Artificial Intelligence”).
(mho)