The federal government is awash in data, to the point that it's becoming physically impossible for human eyes to analyze it all. Salvation might be found in artificial intelligence, but while agency leaders may recognize the technology's potential, adopting and applying it has been a bit of a slow burn.

Culture is one of the big reasons why, according to technology leaders in both the public and private sectors who gathered Wednesday at ACT-IAC’s AI forum to illustrate how machine learning could streamline future agency operations.

“I think AI is in a really weird place right now,” said Aaron Dant, chief data scientist at information technology company ASRC Federal. “When the cloud became the big thing, everybody was like, ‘We’ve got to get to the cloud. We’re going to get these big streams of data and it’s going to be this big thing.’”

Cloud technology progressed quickly “from a research notion to something that was product-oriented,” he said, but AI is different.

“AI is totally ‘Mad Max’ right now. Everybody is on their own crazy cart right now in the desert, trying to figure out what direction things are going to go,” Dant said. “There’s no productized framework that’s just the solution for AI that everybody uses. That’s the biggest gap right now.”

While the path forward remains unclear, agencies do recognize that they need AI, even if implementing it might follow the typically laborious process of federal IT advancements. A survey by enterprise software company ServiceNow found that fewer than one in five agencies are currently using advanced automation, but that 77 percent of respondents said they will need it in the next five years to keep up with the pace of work.

Respondents to the ServiceNow survey said the top barriers to adoption were the cost to replace legacy systems to be able to run automation, as well as the cost to staff the new systems.

Some agency leaders have sought to get a jump on AI, using the technology in pilot programs centered on predictive analytics. They say widespread adoption could be slowed not by the technology's development, but by the culture that has to embrace it.

Kenneth Walton, an urban design architect at the National Capital Planning Commission, has proposed using predictive analytics to determine what skills are needed in developing apprentice-based programs, as well as infrastructure workforce development programs. He said capitalizing on AI's benefits requires federal leaders to experiment beyond some traditional roles.

“The biggest problem I would have would be trying to get out of the mission statement,” he said. “Being able to cross over from doing whatever is in your particular mission — AI doesn’t necessarily fit in with planning, architecture and urban design. So, if you are not going to have a champion in your agency who’s going to buy into it … there’s got to be a way to reduce the amount of red tape and policies that are in front of being able to try new things that don’t necessarily fit into the agency mission.”

Marilyn Miller, program director for the Genetics of Alzheimer’s Disease component of the National Institutes of Health’s Alzheimer’s Disease Sequencing Project, said that she was met with intense skepticism when she proposed using AI to crunch genomic datasets for analysis.

“Looking across NIH, and academic institutions in general, I think the problem is the public and the infrastructure at NIH is pretty stodgy,” she said. “I think there have to be successes pretty soon in other areas. Because it’s going to be tough to get this broken through, the way the community thinks, the way academic institutions are, and scientists are frankly afraid of being scooped by a machine.”
