The FTC Is More Responsive than NASA

A naively relaunched site might go down on the first cron run in a flood of scheduled posts and emails

The United States Congress managed to avoid a default with a last-minute agreement. But the reboot's still in progress, and many federal government servers and services remain shut down. The cause of the online blackout is this set of guidelines released by the Office of Management and Budget, and in particular their answer to this question:

Q5: What if the cost of shutting down a website exceeds the cost of maintaining services?
A5: The determination of which services continue during an appropriations lapse is not affected by whether the costs of shutdown exceed the costs of maintaining services.

This might seem ridiculous at first glance, but anyone who builds websites shouldn't be surprised that they included this directive. Keeping a site running isn't just a matter of paying hosting bills, and even the most well-crafted architectures never stop needing a hand at the wheel (especially where security is concerned). This is an unusual time, and it comes with unusual traffic patterns: the role of government is being questioned, and nothing brings in pageviews like national political scrutiny. Having a .gov domain is a major liability when you don't have staff waiting to perform disaster recovery.

So things look bad now - sites down, no timeline for return. The shutdown guidelines didn't instruct agencies to create plans to relaunch their web properties, but they'll need to have one in place if they want it to go smoothly. A naively relaunched site might go down on the first cron run in a flood of scheduled posts and emails. "Open data" government sites will get slammed by scrapers trying to make up for lost time. Exciting new bugs will pop up for data-driven government sites that never made plans for backfilling missing data or coping with null values. And of course, every site will have to deal with the traffic from rubberneckers - as soon as the news story breaks that a given agency's site is up, people who would otherwise never consider looking at it will scope it out.

This has led to an interesting question around the office: which agencies are best prepared to turn the lights back on? We do synthetic monitoring, so we're prepared to figure it out.

AppView Web synthetic user experience data for a set of government websites affected by the shutdown.

The first step in setting up synthetic monitoring is to choose a source. I went with our web monitor in Ashburn, Virginia because its proximity to Washington, D.C. makes it a solid proxy for the congressional staffers who drive a disproportionately high amount of traffic to breaking stories.

Next up, I chose several monitoring targets based on stories from VentureBeat, the Washington Post, Computerworld, and Politico. Many government sites could've made the list, but I narrowed it down to five with a high impact on researchers, scientists, and the open government movement: FTC.gov, Census.gov, Data.gov, NASA.gov, and NIST.gov.

Finally, I defined each site's scripted transaction with our Selenium-based Firefox script recorder plugin. Since each site's layout is unique, I went with simple transactions across the board. That meant focusing on common user actions (browsing for recent news) instead of more complicated workflows (registering for accounts or logging in).
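For a sense of what those recordings capture, here's a minimal hand-written sketch of a "browse recent news" transaction, assuming Python with Selenium WebDriver; the URL, link text, and CSS selector are hypothetical stand-ins rather than any agency's actual markup:

```python
# A minimal "browse recent news" transaction in Python with Selenium
# WebDriver. The URL, link text, and selector are hypothetical stand-ins.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Firefox()
try:
    # Step 1: load the agency homepage.
    driver.get("https://www.example.gov/")

    # Step 2: navigate the way a real visitor would - by clicking through
    # to the news section rather than deep-linking.
    WebDriverWait(driver, 10).until(
        EC.element_to_be_clickable((By.LINK_TEXT, "News"))
    ).click()

    # Step 3: only count the transaction as complete once the news
    # listing has actually rendered.
    WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, ".news-listing"))
    )
finally:
    driver.quit()
```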

Pretty straightforward, right? There's just one caveat to be aware of: I opted to record these scripts on the Internet Archive's cached, pre-shutdown versions of the sites. That means that they'll intentionally fail when they're run against a shutdown splash page. By setting it up like this, I'll only see a green light when everything is back to business as usual.
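The same idea can be sketched without a browser at all: pick some marker text that the cached, pre-shutdown page contains but the splash page doesn't, and refuse to report success unless it's present. The URL and marker string here are invented for the example:

```python
# Sketch of "only go green at business as usual": check for content that
# the pre-shutdown page has but the shutdown splash page lacks.
# The URL and marker string are hypothetical examples.
import urllib.request

EXPECTED_MARKER = "Recent News"  # seen on the cached, pre-shutdown copy

def back_to_normal(url: str) -> bool:
    with urllib.request.urlopen(url, timeout=30) as response:
        body = response.read().decode("utf-8", errors="replace")
    # A splash page is often served as 200 OK, so an HTTP status check
    # alone would report the site "up" while it's still shut down.
    return EXPECTED_MARKER in body

if __name__ == "__main__":
    print(back_to_normal("https://www.example.gov/news"))
```

The design choice matters because a shutdown notice typically isn't an HTTP error; only a content-level check can tell "serving something" apart from "back to business as usual."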

And, well, it didn't take long to get there! My guess had been that NASA would be back first, but in the end, it was actually the FTC website that crossed the finish line less than two hours into the 17th. Someone, somewhere stayed up all night to press that button and turn the site back on again - and even though I wasn't watching, I know right when it happened and what impact it had on response time.

The FTC site has sub-one-second response time.

On the other hand, it looks like Census.gov and Data.gov were scheduled to open for business at 9 AM this morning:

Census.gov saw a brief latency spike a few transactions after it first started serving pages.

Data.gov came back online at 9 AM as well.

It seems that being a tech-savvy agency doesn't have much impact on responsiveness, because NASA and NIST are still down as of this writing:

NASA's site won't get past the homepage as of this writing.

NIST's site may actually be running from some locations, as our web monitor in Virginia can access part of it even though I see a "no-pay wall." However, it took over a minute to load a largely static page.

All told, I think the FTC deserves some real credit for their user experience. It takes less than one second to get information on making a FOIA request, and they were ready for business almost eight hours before some other government sites. On the other hand, I expected a much better showing from NASA and NIST. I've already guessed wrong once, so what's your take on when they'll be back up and running?

More Stories By James Meickle

James started as a hobbyist web developer, even though his academic background is in social psychology and political science. Lately his interests as a professional Drupal developer have migrated towards performance, security, and automation. His favorite language is Python, his favorite editor is Sublime, and his favorite game is Dwarf Fortress.