CIA’s Overclassification Practices Declassified
The Reducing Over-Classification Act of 2010, a byproduct of the 9/11 Commission Report’s finding that overclassification and a lack of intelligence sharing contributed to the 2001 terrorist attacks, mandated reductions in overclassification. The Act, however, never defined what “overclassification” means, and it missed the larger problem with the current classification system: too many documents technically meet the standards for classification but should never be classified in the first place. Thanks to “Secrecy vs. Disclosure: A Study in Security Classification,” a 1976 CIA Center for the Study of Intelligence monograph, we now know this has long been a flaw of the classification system.
The CIA document (a significant disclosure by the Agency, for which it deserves recognition) is an internal history billed as the first study of the “historical development of classification, seeking to isolate its endemic problems and to gain a fresh perspective on a procedure that has become hackneyed for most of us.” According to the study’s foreword:
“Throughout the history of the classification system, going back even to its nineteenth century precursors, certain negative traits—overclassification, unnecessary classification, vagueness of the classification criteria, and accumulation of vast quantities of classified paper—have always been present. Unfortunately, they are all part of the classification experience of the CIA today. In recent years, the negative traits have attracted public and congressional attention and have resulted in a series of studies, regulations, and reforms intended to improve the system. There has been some improvement, but not much.”
Reading this document, it is striking how relevant the 1976 findings are to the current classification crisis, and how similar the warnings sound. The study discusses the long-running credibility problems classifying agencies have encountered, especially in the wake of the Pentagon Papers, and how the public and Congress came to regard classification decisions as a means to hide illegality and embarrassment rather than to protect true secrets. It also takes an in-depth look at unnecessary classification at the CIA in particular, noting that bureaucratic pressures and precedent perpetuate classification errors and unnecessary classification.
The report also notes that a few years prior to its completion, use of the Confidential classification was relatively rare in the Agency and use of the Secret classification far more common, in part because “[t]here were offices at that time where one could not find a Confidential stamp. It had become a reflex action to place Secret upon a piece of paper. If the officer forgot to do it, the secretary did so automatically.” This reflex to stamp everything Secret, regardless of whether the person marking the paper was authorized to classify it, made it difficult to reduce classification errors or the volume of classified paper simply by cutting the number of government employees authorized to classify documents. In 1957, for example, 1.5 million employees were authorized to classify documents (there are now 4.2 million). And Congress, believing there to be:
“a causal connection between the expanding volume of classified information and the number of classifiers, they have demanded sharp reductions in the number of the latter. Thus, the executive branch was induced to reduce the number of classifiers from 1.5 million in 1957 to 55,000 in 1971, the year prior to the issuance of E.O. 11652…But there is little evidence of a corresponding decline in the birth rate of classified paper.”
The CIA example gives good reason to question the belief that limiting the number of classifiers results in fewer classifications.
Another problem contributing to overclassification at the Agency was the practice of bureau offices intentionally:
“publish[ing] at the highest level on the grounds that their high-level customers have all the clearances and that this gives greater status to the report. In one instance it was standard practice to include some codeword material so that the report could be printed the same day in the quick seventh-floor press setup, rather than sent to the Printing Shop, which would have entailed a delay of a day or so.”
Troublingly, the report also notes that in the early 1970s:
“William G. Florence, a retired Air Force classification official, estimated that the Defense Department had at least 20 million classified documents and was of the opinion that 99 percent of them did not merit classification in terms of a strict interpretation of the Executive order. A State Department witness gave an estimate of 35 million classified documents for his department. And Dr. James B. Rhoads, the United States Archivist, said he was responsible for 470 million pages of classified material covering the period 1939-1954. It would probably be safe to say that total classified holdings of [the] executive branch at this time have long since passed the billion mark.”
The 1976 study also provides a sometimes humorous look at the history of overclassification, including the complaints made by the Chief of Artillery in 1907 about the “indiscriminate use of” Confidential markings on Army documents, “citing in particular one that contained the formula for making whitewash,” which the study cites as possibly the first documented instance of unnecessary classification.
Fast forward forty years, and take a look at the Public Interest Declassification Board’s November 2012 Report to the President, Transforming the Security Classification System. In one startling passage the report notes:
“At one intelligence agency alone, it is estimated that approximately 1 petabyte of classified records data accumulates every 18 months. One petabyte of information is equivalent to approximately 20 million four-drawer filing cabinets filled with text, or about 13.3 years of high-definition video. Under the current declassification model, it is estimated that one full-time employee can review 10 four-drawer filing cabinets of text records in one year. In the above example, it is estimated that one intelligence agency would, therefore, require two million employees to review manually its one petabyte of information each year. Similarly, other agencies would hypothetically require millions more employees just to conduct their reviews.”
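The board’s arithmetic is easy to verify: at the stated review rate, one petabyte of records implies roughly two million employee-years of manual review. A quick back-of-the-envelope check (the figures are the report’s; the variable names are mine):

```python
# Figures from the PIDB report's estimate
cabinets_per_petabyte = 20_000_000          # four-drawer filing cabinets of text per petabyte
cabinets_reviewed_per_employee_year = 10    # cabinets one full-time reviewer clears per year

# Employee-years needed to manually review one petabyte
employee_years = cabinets_per_petabyte // cabinets_reviewed_per_employee_year
print(employee_years)  # 2000000 -- two million employees to review it within a year
```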
The study contains a wealth of interesting material, and the gems highlighted here are only a fraction of them. Read the study for yourself, and let us know what you think. In the meantime, I’ll end with what might be my favorite quote from the document:
“In the long run overclassification saps the defense of classification. Bad secrets depreciate good secrets. Thus, many within the Agency have come to believe that Secret alone has lost much of its meaning and some, out of zeal for security, buttress an item requiring only the protection of Secret with supplementary controls and markings or even find a way of insinuating it into one of the codeword compartments. This escalating protection accorded the ‘real’ secrets, tends to drain the classification criteria of their prescribed meaning.”