Technological advancements and the exponential growth of information are reshaping operational practices across numerous sectors, including government. Government data generation and digital archiving rates are climbing, driven by the rapid expansion of mobile devices and applications, smart sensors and IoT devices, cloud computing solutions, and citizen-facing portals. As digital information grows more expansive and complex, so does the work of managing, processing, storing, securing, and disposing of it. New tools for capture, search, discovery, and analysis are enabling organizations to derive meaningful insights from unstructured data. The government sector is reaching a critical juncture, recognizing information as a strategic asset. To better serve the public and fulfill mission requirements, governments must protect, leverage, and analyze both structured and unstructured information. As government leaders work to evolve their organizations into data-driven entities, they are laying the foundation to correlate dependencies among events, personnel, processes, and information.
High-value government solutions will emerge from the integration of several disruptive technologies:
- Mobile devices and applications
- Cloud services
- Social business technologies and networking
- Big Data and analytics
Big Data is a transformative approach that enables government agencies to make better decisions by acting on patterns revealed through the analysis of vast volumes of data, whether related or unrelated, structured or unstructured.
However, achieving these outcomes requires more than merely accumulating large amounts of data. "Making sense of these volumes of Big Data requires cutting-edge tools and technologies that can analyze and extract useful knowledge from vast and diverse streams of information," wrote Tom Kalil and Fen Zhao of the White House Office of Science and Technology Policy in a post on the OSTP Blog.
The White House took a significant step toward helping agencies identify these technologies by launching the Big Data Research and Development Initiative in 2012. The initiative committed more than $200 million to maximizing the potential of the Big Data explosion and the tools needed to analyze it.
The challenges posed by Big Data are nearly as daunting as its promise is encouraging. Efficient data storage is one such challenge. With budgets remaining tight, agencies must minimize the per-megabyte cost of storage while keeping data easily accessible, so users can retrieve information when and how they need it. Backing up massive quantities of data further intensifies this challenge.
Effective data analysis presents another major hurdle. Many agencies utilize commercial tools to sift through vast amounts of data, identifying trends that enhance operational efficiency. (A recent study by MeriTalk revealed that federal IT executives believe Big Data could help agencies save over $500 billion while simultaneously fulfilling mission objectives.)
Custom-developed Big Data tools are also enabling agencies to meet their analytical needs. For instance, the Computational Data Analytics Group at Oak Ridge National Laboratory has made its Piranha data analytics system available to other agencies. This system has assisted medical researchers in identifying links that can alert doctors to aortic aneurysms before they occur. It is also employed for routine tasks, such as screening resumes to connect job candidates with hiring managers.
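To give a concrete sense of what automated resume screening can involve, the sketch below scores resumes against a job description using TF-IDF weighting and cosine similarity. This is a minimal illustration of the general text-similarity technique, not a description of how Piranha actually works; the sample job text and resumes are invented for the example.

```python
# Minimal sketch of text-similarity scoring for resume screening.
# Assumption: this illustrates generic TF-IDF/cosine matching, not the
# actual algorithm used by ORNL's Piranha system.
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into alphabetic word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def tf_idf_vectors(docs):
    """Return a sparse TF-IDF vector (term -> weight) for each document."""
    tokenized = [tokenize(d) for d in docs]
    n = len(docs)
    # Document frequency: number of documents containing each term.
    df = Counter(term for tokens in tokenized for term in set(tokens))
    vectors = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vec = {t: (count / len(tokens)) * math.log((1 + n) / (1 + df[t]))
               for t, count in tf.items()}
        vectors.append(vec)
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical job description and candidate resumes.
job = "data analyst with experience in statistics and python"
resumes = [
    "python developer, statistics background, data analysis experience",
    "chef with ten years of experience in fine dining",
]
vecs = tf_idf_vectors([job] + resumes)
scores = [cosine(vecs[0], r) for r in vecs[1:]]
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # index of the best-matching resume
```

In practice, production systems layer entity extraction, skill taxonomies, and clustering on top of this kind of baseline, but the core idea of ranking documents by weighted term overlap is the same.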