Government agencies, by their very nature, deal with tremendous amounts of data. While some functions are more data-heavy than others, essentially every public sector department collects, stores, manages, distributes, analyzes or otherwise interfaces with data. This isn’t a news flash.
What’s startling is what’s happening — or not happening — with that data. While other industries foresee a five-fold expansion in their volume of data over the next five years, the public sector, according to Splunk’s new Data Age report, anticipates a smaller increase. That smaller increase, however, won’t alleviate the pressures government employees already feel at work, where unnecessarily manual data management processes persist and inhibit the development of tech skills.
Why? A lack of resources is an issue for 81% of the report’s respondents, and more than 70% cited shortages in funding, training and understanding as barriers to AI and machine learning. The introduction of new technology in general presents hurdles at 81% of respondents’ departments.
One thing is certain: any increase will magnify the public sector’s struggles to manage and leverage the data it already has — which is a lot.
According to the Department of Commerce, that agency alone handles more than 20 terabytes of data every day — more than double the amount of data in the entire printed collection of the Library of Congress. The National Oceanic and Atmospheric Administration generates tens of terabytes of data daily from satellites, radars, ships and other sources. But given the sheer volume, most of it is essentially inaccessible.
That magnitude was cited by 83% of public sector Data Age survey respondents as another major hurdle for managing and leveraging data. Leaders in the federal government seem to agree.
“How do we take the data that we have — which is ubiquitous and it’s incredible across the federal government — understand it, be able to leverage it at every step in the chain?” Margie Graves, then deputy federal CIO, said at last year’s ACT-IAC Executive Leadership Conference.
Part of the answer lies in the Federal Data Strategy (FDS). This governmentwide, multipronged action plan for modernizing technology, accountability, transparency and the federal workforce establishes long-term strategies and infrastructure for leveraging data to advance government operations.
Public sector meets the dawn of the Data Age
The workforce seems to understand the importance of this. In Splunk’s report, public sector participants noted the value of data in operations (77% of respondents), innovation (69%), cybersecurity (75%) and overall success (77%). The vast majority, 79%, also said they believe their organization is future-proofing to deal with changes in data volume.
Nonetheless, public sector participants' responses highlighted ongoing struggles within their organizations that hamstring their ability to manage and leverage data. In addition to the aforementioned barriers, 77% said a shortage of in-house skills is another major challenge.
At the federal level, that’s one key area the FDS prioritizes. Action 4, “Identify Opportunities to Increase Staff Data Skills,” requires all 24 CFO Act agencies to comprehensively assess the critical data skills they need, evaluate current staff capacity to meet those needs, analyze gaps in data skills to prioritize needs, and identify and execute approaches to ensure adequate capability.
People as the answer to the public-sector data problem
The reality is nearly all future government jobs will have some element of data involved. It’s critical for public sector personnel to know about and access the growing number of reskilling and training opportunities — and to see the top-level recognition that “reskilling existing workers is a preferable alternative to hiring whenever possible,” as the CIO Council noted in its June Future of the Federal IT Workforce Update.
The CIO Council’s extensive focus on opportunities, support and investment reinforces that people are the public sector’s No. 1 asset. The dawn of the Data Age should accelerate investment of time and training in the public sector workforce. That includes providing modern tools that use AI, machine learning and automation to free valuable personnel from the impossible work of manually processing data.
The global pandemic’s seismic shift in workforce operations has driven the adoption of emerging technologies — and that widespread adjustment illustrates how change is possible in public sector operations.
“Since March, we’ve seen a fascinating curve. [We’ve seen] the adoption of technologies in our common workflows that, behind secure walls, we weren’t able to incorporate … it’s really enabled us to start operating at the speed of industry. We’ve just begun to adapt our processes as much as our technology,” David Spirk, Defense Department chief data officer, said in September at the Billington Cybersecurity Summit. “The crisis taught us we have an opportunity to leverage commercially available tools.”
Spirk’s remarks underscore a concept that’s true across the public sector: The forthcoming surge in data will also bring opportunities for personnel — opportunities to learn the data fluency and skills that will become necessary in all jobs, especially those in government; to spend less time on rote tasks and more on interesting and impactful work; and to contribute to a better-informed, more effective government of the future. This data-driven government won’t just benefit federal employees, but all Americans who rely on government services.
Bill Wright is Splunk’s director of federal government affairs. He previously served as staff director and general counsel for two U.S. Senate subcommittees focused on homeland security and government IT. He is also a former special operations officer at the Office of the Director of National Intelligence’s National Counterterrorism Center.