FOIA Audit: Report From the Trenches
Six months ago, when I came on board the Knight Open Government Survey project, I had only the slightest sense of what I was getting into. When the last survey was released in March 2010, I was only an intern, warm and safe in the bubble of the intern room/proto-document vault. I had little knowledge of the survey beyond skimming an article in the New York Times. Within a couple of days on the job, I began to grasp the significance of the project.
The largest and most time-consuming task I had to undertake was the day-to-day management of the 90-agency FOIA survey. From day one, FOIA responses from agencies began arriving, and a basic filing and indexing system had to be established. Once a degree of crowd control was in place, I could easily determine agencies’ most basic level of compliance with the FOIA: responsiveness.
Agencies that responded were checked off and filed until a final response arrived. Agencies that did not respond within the time limit set by the FOIA were contacted by phone. One of the better results of this series of calls came from the Department of Agriculture (see video). One of the most disappointing came from the more obscure Court Services & Offender Supervision Agency. On the numerous occasions that I called the number provided by the Department of Justice and the agency’s own FOIA website, I dealt with one of the infamous automated operators employed by nearly every federal agency and private-sector corporation. I was never able to win the battle with my robotic nemesis at the Court Services & Offender Supervision Agency, which does not speak well of that particular agency’s customer service.
As final responses started coming in, I needed to create a metric for judging each agency’s level of compliance. Fortunately, the Emanuel-Bauer memorandum, on which the original FOIA request was built, presented a framework around which a metric could easily be constructed. Tracking agency compliance with the “two steps” was an elegant solution, but the diverse range of responses made separating agencies into discrete categories difficult. I ended up borrowing the basic response metrics of the 2010 Knight Open Government Survey and layering agency compliance with the “two steps” on top. The mindset behind the method was to minimize, if not eliminate, the number of subjective judgments applied to agency responses. One good thing to come out of this mindset was the rediscovery of my Microsoft Excel spreadsheet skills, which had languished in the unused, cobwebbed corners of my brain since I left engineering school in 2006.
As with any major project, the last few weeks before the deadline were a flurry of action. A few “buzzer-beater” agency responses gave me an excuse to review all the agency responses for what seemed like the 20th time. After a week of reviewing, writing, editing, and document formatting, I relished getting home, sitting (read: sprawling) on my couch, and watching hockey with a cold beverage, proud of having contributed to a Survey of the state of FOIA during the Obama Administration.