Although the term Machine Learning (ML) was coined in 1959, its advancement and development have never been more critical than they are today, particularly within government agencies. As the amount of data being produced, manipulated, and stored exponentially increases, so does the very real threat of cyber-security breaches and fraud. Meanwhile, federal budgets and staff resources continue to decrease. ML can provide high-value services for federal agencies including data management and analytics, security threat detection, and process improvement—but the list does not stop there.
A recent study of big data initiatives in 65 cities offers interesting guidance for federal big data initiatives. The researchers studied how data is collected and then used for decision making in what they called “the framework for Big Data initiatives.” There are two major cycles in the framework: “The data cycle governs the tools and processes used to collect, verify, and integrate data from multiple sources. Because of the variety of data sources involved, data teams in this cycle are [sic] often composed of representatives from multiple departments to leverage their field expertise and insider understanding of the data.
In December, I plan to write two posts detailing a scenario analysis for the next ten years of the Federal government’s data technologies. Governments are on the cusp of amazing technological advances propelled by artificial intelligence, blockchain technologies, and the Internet of Things. Also, governments will face new challenges such as the recent global cyber attack that took down Twitter and Netflix. I want to invite you, the reader, to also send in your predictions for the future of Federal government data.
Analytics and “big data” seem to be the next frontier in a number of arenas. Data researchers can use the large, real-time data sets that are available today to facilitate scientific discovery, improve the flow of traffic, and increase energy efficiency, among many other things. Last year, the White House appointed the first federal Chief Data Scientist. And a few months ago, the federal government released a strategy for big data research and development.
The debate between responsive websites and mobile apps took a decisive turn this week when the United Kingdom’s Digital Service (UKDS) banned the creation of mobile apps. In an interview with GovInsider, the founder of UKDS, Ben Terrett, explained that mobile apps were too expensive to build and maintain. Responsive websites are easier to build, and updates need only be made to a single platform. “For government services that we were providing, the web is a far far better way… and still works on mobile,” Terrett said.
Business processes have fascinated me since I took an undergraduate philosophy course in modern business management. A part-time professor who was a management consultant by day taught this unusual class. Perhaps business management thinking was then first experimenting with ideas that would later lead to today’s agile and lean movements. From this class I learned that nearly all organizational issues could be traced back to bad processes rather than poor workers.
Few other federal agencies deal with as much data as the National Aeronautics and Space Administration (NASA). Big science creates big data, and NASA manages many of the biggest science projects in world history. Even in its early days, NASA pioneered new ways to create and store data. So, in the world of the cloud, Internet of Things, and intelligent agents, how does NASA deal with its big data needs?
The Congressional Research Service recently released a report (PDF, 688 kb, 17 pages, January 2016) describing the big data ecosystem for U.S. agriculture. The purpose of the report was to understand the federal government’s role in emerging big data sources and technologies involved in U.S. agriculture. As the report author, Megan Stubbs, points out, there is not even a standard definition of big data. “Big data may significantly affect many aspects of the agricultural industry although the full extent and nature of its eventual impacts remain uncertain.
The Digital Analytics Program (DAP) provides a wealth of standard Web analytics reports within its current Web analytics tool (Google Analytics Premium). Yet, navigating through big data with a standard report can be a challenge and definitely takes a few clicks. To quickly get to the insights of your agency websites’ traffic, building your own custom reports and segments is the way to go. As part of its ongoing effort to educate and empower DAP users with Web analytics knowledge, the DAP team has put together the DAP Custom Reporting Catalog with many of the frequently used custom templates.
By now, you are familiar with “big data” or datasets that are so large that they cannot be analyzed by conventional analytical methods. You may have heard of “long data” which is data that has a temporal context. I work with long data when I analyze hiring patterns over time in workforce data. There is also “small data.” Small data are datasets that describe a current condition. For example, if you have a smart home appliance such as a smart thermostat or a home security system, that appliance is constantly monitoring data such as temperature or if a door is open.
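The distinction between small data and long data can be sketched in a few lines of Python. This is a minimal illustration, not a real smart-home API; the thermostat readings and field names are made up for the example:

```python
from datetime import datetime

# "Small data": a snapshot describing a current condition,
# e.g. the latest reading from a smart thermostat.
small_data = {"sensor": "thermostat", "temp_f": 68, "door_open": False}

# "Long data": the same kind of measurements with a temporal context,
# e.g. readings collected over time so trends can be analyzed.
long_data = [
    {"time": datetime(2016, 1, 1, 8), "temp_f": 65},
    {"time": datetime(2016, 1, 1, 12), "temp_f": 70},
    {"time": datetime(2016, 1, 1, 18), "temp_f": 68},
]

# A temporal question only long data can answer: how did the
# temperature change between the first and last reading?
change = long_data[-1]["temp_f"] - long_data[0]["temp_f"]
print(change)  # 3
```

The snapshot tells you the state of the house right now; only the time-indexed series lets you ask how that state has changed, which is the same distinction I rely on when analyzing hiring patterns over time in workforce data.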
Over the last several years, continuing advances in computer processing power and storage have brought about the growth of what some call big data. Mobile and wearable devices now also generate large amounts of data via our interaction with various apps and our geographic location. This endless stream of information is being harnessed to create extremely informative dashboards like analytics.usa.gov and helping make advances in medicine and even farming possible.
Data. Security. Privacy. These are the cornerstones of many discussions concerning technology. The security of citizen information when interacting with the federal government will be increasingly important as we progress into the future. A few agencies have begun to use Hyper Text Transfer Protocol Secure (HTTPS) in lieu of the standard HTTP. For these agencies, this transition to HTTPS is seen as a step in the right direction and is one way for the government to address the security of citizen information.
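At its simplest, the HTTP-to-HTTPS transition means every plain-HTTP address should resolve to its secure equivalent. The sketch below shows just the URL-rewriting half of that idea in Python's standard library (`example.gov` is a placeholder); in practice, agencies implement this with server-side 301 redirects and HSTS headers rather than client-side rewriting:

```python
from urllib.parse import urlsplit, urlunsplit

def upgrade_to_https(url: str) -> str:
    """Rewrite an http:// URL to https://, leaving other schemes alone."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        # SplitResult is a namedtuple, so _replace returns a modified copy.
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)

print(upgrade_to_https("http://example.gov/data"))   # https://example.gov/data
print(upgrade_to_https("https://example.gov/data"))  # already secure; unchanged
```

A redirect like this only helps if the server actually terminates TLS; the rewrite is the visible part of a migration that also involves certificates and configuration.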
According to an article from Readwrite, the amount of money going to big data projects is steadily increasing even though many projects fail to deliver results. For big data-related projects in global organizations, a total of $31 billion was spent in 2013, and that amount is expected to top $114 billion by 2018. Organizations recognize that big data is important, but the results of big data projects have not yet fully borne that out.
As we move into 2015, the amount of data available in the digital ecosystem will increase very rapidly because of the Internet of Things (IoT), social media, and wearable tech. In the future, the challenge lies not only in collecting data, but in what one does with it. Big data, one of the recurring buzzwords of the digital century, will remain important, but it will force us to answer that very question.
In January on DigitalGov, we’ll highlight pieces looking at trends we see coming in the digital government space in 2015 and beyond. We have lined up articles around: customer service, data, 3D printing at NIH and NASA, accessibility, mobile, and training. Check back Monday, when we kick off the month with 15 Government Customer Service Trends. And you can look at some of our most recent monthly theme articles in: crowdsourcing, user experience, and mobile.
Open data and big data—and the responsible management and protection of that data—are key components of the President’s agenda to drive innovation and economic growth. On Thursday, June 19, leaders from civil society, industry, academia, and 40 federal departments and agencies met at Georgetown University’s McCourt School of Public Policy’s Massive Data Institute to discuss how federal agencies can continue to unlock government data to drive innovation and improve services. Drawing from the White House Working Group report, Big Data: Seizing Opportunities, Preserving Values, this event focused on opening and using government data, while appropriately protecting privacy and preventing the use of data to discriminate against vulnerable populations in our society.
Are you looking for the “golden metric” that is the best measure of your agency’s website performance and cross-comparable across .gov websites? If so, stop looking. The concept of the golden metric is a dangerous one because it oversimplifies performance analysis of your website and overlooks the truth hidden behind other, more relevant metrics. Don’t get me wrong—it is easy to fall for the concept of the golden metric.
On June 19, the Obama Administration will continue the conversation on big data as we co-host our fourth big data conference, this time with the Georgetown University McCourt School of Public Policy’s Massive Data Institute. The conference, “Improving Government Performance in the Era of Big Data: Opportunities and Challenges for Federal Agencies,” will build on prior workshops at MIT, NYU, and Berkeley, and continue to engage both subject matter experts and the public in a national discussion about the future of data innovation and policy.
Government Web pages are found mainly through search engines. Google recently redesigned its search results page and there are quite a few small, but impactful, changes in this latest redesign. Specifically, it affects how page titles are displayed. Many experts now recommend even shorter page titles. Below are a couple of articles (plus tools) to see how the change may affect your page titles: Page Title & Meta Description By Pixel Width In SERP Snippet
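The articles above measure titles by pixel width, since that is what search result pages actually truncate on. A character count is only a rough proxy for that, but it makes for a simple first-pass check. The 55-character budget below is an illustrative assumption, not a figure from Google:

```python
def title_fits(title: str, max_chars: int = 55) -> bool:
    """Rough check that a page title fits a typical search snippet.

    Real tools measure pixel width; character count is only a proxy,
    and the 55-character budget here is an illustrative assumption.
    """
    return len(title) <= max_chars

def truncate_title(title: str, max_chars: int = 55) -> str:
    """Trim an overlong title at a word boundary and add an ellipsis."""
    if title_fits(title, max_chars):
        return title
    # Cut one short of the budget to leave room for the ellipsis,
    # then back up to the last full word.
    cut = title[: max_chars - 1].rsplit(" ", 1)[0]
    return cut + "…"

long_title = ("Open Government Initiative Energy Data Downloads "
              "and Consumption Statistics")
print(title_fits(long_title))      # False
print(truncate_title(long_title))  # trimmed at a word boundary, ends with …
```

A pass like this can flag candidates for rewriting in bulk; the pixel-width tools linked above remain the authoritative check before you actually shorten a title.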
Shortly after taking office in 2009, President Obama launched the Open Government Initiative, an effort to increase transparency, participation, and collaboration in the federal government. The initiative introduced a number of websites and strategies to offer raw government data, including research grant information on data.gov. For energy gurus, data.gov/energy offers downloads of energy-related data such as energy use and consumption in the U.S. Yet the mere provision of big data is not enough; a key component of making big data accessible is providing context and meaning to that data to enable the public to solve problems, identify patterns, and draw conclusions.