Since 2007, a major consulting firm has conducted an annual survey on organizations’ “Digital IQ.” After ten years of organizations grappling with digital transformation, what has been learned? From the report: Focus on the human experience [emphasis in the original]: Rethink how you define and deliver digital initiatives, consider employee and customer interactions at every step of the way, invest in creating a culture of tech innovation and adoption, and much more.
When people think of government software, they often think of COBOL and PowerBuilder 5, with manual software deploys every three to six months on a fixed number of machines in a government-run data center. This perception is sometimes justified, but sometimes entirely wrong. Regardless, the perception makes many developers reluctant to work for the government because they worry about the frustrations of getting stuck in the bureaucracy instead of being able to iterate rapidly, ship products, and deliver value.
I recently had the chance to talk with the legendary Vint Cerf, one of the founding fathers of the internet. We had a wide-ranging discussion about the past, present, and future of the internet, network security, and what it would take to successfully, safely, and reliably merge the digital and physical worlds, a concept known as the “Internet of Things,” or IoT. As its name suggests, the Internet of Things will connect all kinds of things, bringing us a wealth of data about, well, everything, that we can use to improve our lives.
To folks new to government, one of the most surprising differences between our work and work in the private sector is the set of barriers to accessing commercially available software, and commercially available Software-as-a-Service (SaaS) in particular. There are good reasons for these barriers: the government places premiums on considerations such as security, privacy, accessibility, license management, and competition. It takes great care to work within those considerations while also providing digital teams with great tools to get work done.
Summary: Building on efforts to boost Federal cybersecurity, and as part of National Cybersecurity Awareness Month, today we’re releasing proposed guidance to modernize Federal IT. America’s spirit of ingenuity and entrepreneurship created the world’s most innovative economy and keeps us dominant in today’s digital age. Indeed, in 1985 about 2,000 people used the Internet; today, 3.2 billion people do. What started out as a useful tool for a few is now a necessity for all of us—as essential for connecting people, goods, and services as the airplane or automobile.
The Data Briefing: The Federal Data Cabinet—Promoting Data Literacy, Cultural Change, and the Federal Data Applications Ecosystem
Last Wednesday, the White House held the first Open Data Summit to showcase the open data accomplishments of the Obama Administration. One of the highlights was the formation of a government-wide “data cabinet.” Announced by Chief Data Scientist DJ Patil, the data cabinet is essentially a community of practice comprising the Federal agencies’ data professionals. As Dr. Patil explains, the real issues concerning technical projects revolve around cultural issues. I couldn’t agree more.
This is the final post in the 5-part series, The Right Tools for the Job: Re-Hosting DigitalGov Search to a Dynamic Infrastructure Environment. Federal websites are required to implement DNSSEC, which relies on knowing exactly what server is responding to a request. In Amazon Web Services (AWS), the problem of unreliable servers is solved by Elastic Load Balancing (ELB). An ELB containing one or more servers is presented to the world as a single hostname — say, usasearch-elb.
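The idea of an ELB — one stable hostname that hides a changing pool of backend servers — can be sketched in a few lines. This is an illustrative model only (the class name, hostname, and addresses are invented for the example; it is not the AWS API):

```python
from itertools import cycle

class LoadBalancer:
    """Toy model of a load balancer: one public hostname, many backends."""

    def __init__(self, hostname, backends):
        self.hostname = hostname          # the only name clients ever see
        self._pool = cycle(backends)      # rotate across healthy servers

    def route(self):
        """Return the backend that will serve the next request."""
        return next(self._pool)

elb = LoadBalancer("usasearch-elb.example.com",
                   ["10.0.1.5", "10.0.2.7", "10.0.3.9"])
# Four consecutive requests rotate through the pool and wrap around,
# while every client connects to the same single hostname.
print([elb.route() for _ in range(4)])
```

Backends can be added to or removed from the pool without clients ever noticing, which is exactly what makes the single stable hostname useful when individual servers are unreliable.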
Note: This is a guest blog post by Amando E. Gavino, Jr., Director, Office of Network Services, ITS/FAS/GSA. He is responsible for a portfolio of telecommunication acquisition solutions that provide government agencies the ability to meet their diverse set of telecommunication requirements. Acquisition solutions include Networx, Enterprise Infrastructure Solutions – EIS (the future replacement for Networx), SATCOM, Enterprise Mobility, Connections II, Federal Strategic Sourcing Initiative – Wireless (FSSI-W), and the Federal Relay Service.
This is post 4 in the 5-part series, The Right Tools for the Job: Re-Hosting DigitalGov Search to a Dynamic Infrastructure Environment. This post references the previous posts frequently, so please read those before reading this one if you haven’t done so already. In addition to the DNS challenges created by offering “masked” domains such as nasasearch.nasa.gov, we also had to solve the problem of how to maintain SSL certificates for the main search.
This is post 3 in the 5-part series The Right Tools for the Job: Re-Hosting DigitalGov Search to a Dynamic Infrastructure Environment. “All problems in computer science can be solved by another level of indirection, except of course for the problem of too many indirections.” – David Wheeler The simplest of our four requirements was to allow customers to choose whether to use the search.usa.gov domain for their search results page, or create a “masked” domain name such as search.
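Wheeler’s quip about indirection maps directly onto how a “masked” domain works: the agency’s name is an alias (a CNAME-style record) that points at the shared service, and resolution follows the chain until it reaches an address. Here is a minimal sketch of that lookup logic — the record names and address are hypothetical, and real resolution is of course performed by DNS, not application code:

```python
# Hypothetical records for illustration; "search.example.gov" stands in
# for an agency's masked domain pointing at the shared search service.
records = {
    "search.example.gov": ("CNAME", "search.usa.gov"),
    "search.usa.gov": ("A", "192.0.2.10"),
}

def resolve(name, table, max_hops=10):
    """Follow alias indirection until an address record is reached."""
    for _ in range(max_hops):
        rtype, value = table[name]
        if rtype == "A":
            return value
        name = value  # one more level of indirection
    raise RuntimeError("too many indirections")

print(resolve("search.example.gov", records))  # 192.0.2.10
```

The `max_hops` guard is the punchline of the quote: indirection solves the naming problem, but unbounded chains of it become a problem of their own.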
This is post 2 in the 5-part series The Right Tools for the Job: Re-Hosting DigitalGov Search to a Dynamic Infrastructure Environment. The last major infrastructure upgrade that DigitalGov Search had was in 2010. Not only has technology evolved significantly since then, but so have business models for right-sizing costs. Moving to Amazon Web Services (AWS) infrastructure allowed us to improve reliability by creating self-healing servers and distributing the service across four physically isolated datacenters, and reduce datacenter costs by 40% per month — no longer do we have to pay for peak throughput capacity overnight, on weekends, or during other predictably low-traffic periods.
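The cost argument above comes down to simple arithmetic: a fixed datacenter must be provisioned for peak load around the clock, while a pay-as-you-go cloud bill tracks actual demand. The sketch below makes that concrete with invented numbers (server count, hourly rate, and traffic profile are all assumptions for illustration, not DigitalGov Search’s actual figures):

```python
HOURS_PER_MONTH = 730
RATE = 0.10        # assumed cost in $/server-hour
peak_servers = 20  # assumed capacity needed at peak

# Fixed infrastructure: pay for peak capacity every hour of the month.
fixed_cost = peak_servers * HOURS_PER_MONTH * RATE

# Elastic infrastructure: assumed profile of 8 busy hours/day for 30 days,
# scaling down to 30% of peak overnight and on weekends.
busy_hours = 8 * 30
quiet_hours = HOURS_PER_MONTH - busy_hours
elastic_cost = (peak_servers * busy_hours
                + int(peak_servers * 0.3) * quiet_hours) * RATE

savings = 1 - elastic_cost / fixed_cost
print(f"fixed ${fixed_cost:.0f}/mo vs elastic ${elastic_cost:.0f}/mo "
      f"({savings:.0%} savings)")
```

Under these assumed numbers the elastic model comes out well under half the fixed bill; the actual savings depend entirely on how pronounced the off-peak troughs are.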
Summary: Today, we’re launching the M3 Framework to provide agencies with leading best practices for mission-support function modernizations and migrations. The government’s internal operations have a powerful impact on service to its citizens, and this Administration has made transformation of management practices within the Federal Government a key priority. By sharing and streamlining mission support services and retiring or modernizing inefficient legacy IT systems, we’re better able to overcome the challenges of large-scale mission support projects, support core agency missions, and make our IT infrastructure more secure — all while delivering a more efficient and effective government for the American people.
This is the first post of a 5-part series. DigitalGov Search is a commercial-grade search engine provided as a shared-service by the United States General Services Administration. We power about 2,300 search configurations for hundreds of federal, state, and local government agencies. Using our platform, agencies can easily configure a search experience for the public that brings together resources from across their many publishing platforms: websites, blogs and feeds, social media, and government-specific resources like rules and notices from the Federal Register, and posts from USAJobs.
This is the fifth in a series describing how the Social Security Administration is working towards a more modern IT infrastructure. You can find part 1 here, part 2 here, part 3 here, and part 4 here. In the next three posts we will consider the problem of modernizing legacy software. In this post we will begin discussing why modernizing software matters and what to think about first.