More Trust from Intel Agencies Needed for AI to Work for Government

Darktrace, a cyber AI company whose technology is used to protect against threats, recently presented at the AI World Government conference.

The intelligence community needs to work closely with the private sector for AI adoption by the federal government to be successful, said Justin Fier, director of cyber intelligence and analytics at Darktrace, during a panel at the recent AI World Government conference.

“You can’t test [machine learning]-based solutions the way you tested old software,” Fier said in an account in FedScoop. “I need real user data.”


The model of how technology is acquired by the government may need to be rewritten for AI, one speaker suggested. Agencies need staff with a higher level of technical acumen working with the technologies, said Todd Myers, automation lead at the National Geospatial-Intelligence Agency.

“There will always be this rubber band pulling you back,” Myers said about the outdated acquisition process. “There has got to be a complete paradigm shift.”

Disinformation and misinformation campaigns are a threat AI can help combat, another panelist said. Campaigns like Russia’s interference in the 2016 U.S. presidential election amount to “a war of cognition,” said Brett Horvath, president of Guardians.ai.

Read the source post in FedScoop.