25 May 2012. A sends:
Nine years ago you posted Chapter 9 of James Bamford's Puzzle Palace at http://cryptome.org/nsa-v-all.htm, and put a request for digitized copies of two documents at the bottom of the page. While I don't know the details of the campaign against intensified classification, you might link the two citations to the relevant URLs for the convenience of researchers:

http://www.nsa.gov/public_info/_files/cryptologic_quarterly/digitalcomputer_industry.pdf
http://www.intelligence.senate.gov/pdfs/95nsa.pdf
Cryptome mirrors, respectively:
15 June 2003
Provided as background on the NSA's current endeavor to reduce the information it releases under FOIA, and on the Bush administration's increased secrecy surrounding national and homeland security programs and the related secret contracts awarded to industry and educational institutions.
Excerpted from The Puzzle Palace: Inside the National Security Agency,
America's Most Secret Intelligence Organization, James Bamford, Penguin,
1982, pp. 426-57.
"DEAR SIR," EARNEST began with an overly large D and a small s, "I have in my possession a unsolveable code which I would like to sell." Noting his system's "superior power," the eleven-year-old Minnesotan firmly informed the government that he would "take not less than $20,000.00" for the code and advised that they write rather than come for it in person.
J. W. Hough also had a code he thought the government might like to consider. "It is, I believe, an improvement over any code system I have ever seen," he wrote, and to demonstrate, he enciphered the last portion of Lincoln's Gettysburg Address. Attached to the letter was a form entitled "Request for Extra Writing Privilege" with Mr. Hough's return address: United States Penitentiary, Leavenworth, Kansas.
For much of this century the only outsiders interested in the esoteric art of cryptography were a handful of hobbyists testing one another's skill with simple cryptograms, and a few entrepreneurs trying to make a business from selling commercial telegraph codes. Those code systems submitted to the government either for comment or for sale would inevitably wind up in a place known, according to one of the NSA's early pioneers, as the "nut file."
Such arrogance and contempt for those outside the barbed wire fences was easily affordable in the early days of pencil-and-paper cryptology. There being no high technology involved nor any scientific interest in the subject, the government code experts were permitted a near-monastic existence. The ascetic life ended, however, when both codemaking and codebreaking became increasingly mechanized during the late 1930s. A decade later, with computers replacing punched-card machines, cryptology had developed into a full-fledged science. Where once outside interest was rejected, if not ridiculed, the constant need to push outward the boundaries of mathematics, engineering, and telecommunications now required the establishment of a close yet secret alliance with America's academic and industrial communities.
To forge this alliance, the NSA, soon after it was formed, established the National Security Agency Scientific Advisory Board (NSASAB), a ten-member panel of science wizards plucked from ivy-covered campuses, corporate research labs, and sheepskin-lined think tanks. Twice a year they would converge on Fort Meade, join with senior NSA scientists, and then split into groups like the SIGINT Exploitation Advisory Panel and the Electromagnetic Reception Advisory Panel, where they would discuss the application of the latest theories in science and technology to eavesdropping, codebreaking, and cryptography.
Among the early members of the board was Stewart S. Cairns, who had earned his doctorate at Harvard and was chairman of the mathematics department at the University of Illinois at Urbana (the same school where William Martin, not long before his defection, would be sent on a two-year scholarship). Chairman of the NSASAB during the mid-1950s was Dr. Howard P. Robertson, professor of theoretical mathematics at the California Institute of Technology and later a science adviser to President Kennedy.
When Vice Admiral Laurence Frost arrived at the Puzzle Palace in the fall of 1960, he found relations between the board and NSA strained and bitter. Agency officials were charging the board members with not putting enough time and effort into some of the projects; the board hit back at the NSA leadership for its lack of guidance and its declining support.
In an effort to reduce the acrimony and mend the fences, Frost, much to everyone's surprise, appointed Robert F. Rinehart chairman of the NSASAB. A fifty-three-year-old mathematics professor at the Case Institute of Technology, Rinehart was the junior member of the board, having been appointed only a few months earlier. Yet, as he indicated in a letter to his fellow board members shortly after his selection, the "lack of previous NSA experience, implying absence of preacquired biases," was precisely the reason for his selection. "It is quite possible that the principle 'ignorance is beneficial,' " he added with tongue only slightly in cheek, "may have been carried to its ridiculous extreme."
Although some members remain on the board for only a year or two, others have stayed on for more than a decade. Among the latter, and dominating the advisory board during much of the 1960s and early 1970s, were Dr. William O. Baker of Bell Labs, Dr. Robert P. Dilworth of Cal Tech, Arnold I. Dumey of the Communications Research Branch of the Institute for Defense Analyses at Princeton, Dr. Joseph J. Eachus of Honeywell, and Dr. Richard C. Raymond of General Electric. Another long-time member, and chairman from January 1967 until at least the mid-1970s, was Dr. Willis H. Ware, a research executive at the RAND Corporation. From 1951 to 1971 Ware was head of RAND's Computer Sciences Department and subsequently was named a deputy vice president. He first joined the NSASAB in 1964 as a member of the Data Processing Panel.
The Scientific Advisory Board was one of the NSA's earliest efforts to employ the talents of nongovernmental elements of the scientific and academic worlds, but it was far from the last.
In 1956 Dr. Howard T. Engstrom, a computer wizard and vice president of Remington Rand, took over NSA's research and development organization. The following year he was appointed deputy director of NSA and a year later returned to Remington Rand.
Joseph H. Ream, executive vice president at CBS, was imported to replace Engstrom as deputy director. He, too, left after a year; he headed up CBS's Washington office and later CBS-TV's programming department. Ream's interlude at NSA is listed on his CBS biography simply as "retirement."
Three months before Ream gave up codebreaking for "I Love Lucy," one of the most important meetings in the history of the Agency took place in a clapboard structure on Arlington Hall Station known as B Building. On July 18, 1957, a handful of the nation's top scientists crowded together in NSA's windowless Situation Room to present a blueprint for the Agency's future technological survival.*
____________________
* Present were Dr. Hendrik W. Bode, a physicist and communications engineering specialist from Bell Labs; Dr. David A. Huffman, an expert in information theory and signal design from MIT; Dr. Luis Alvarez, associate director of the University of California's Lawrence Radiation Lab at Berkeley and a future Nobel laureate in physics; Dr. Richard L. Garwin, a communications specialist from IBM's Watson Lab; mathematics professor Andrew M. Gleason of Harvard; Dr. John R. Pierce, Bell Labs director of communications research; Claude E. Shannon of MIT and Bell Labs, who pioneered the fields of information theory and speech privacy; and Oliver Selfridge, a computer specialist from MIT's Lincoln Labs. (Baker Committee, photo with attachments, Friedman Papers, photo file, George C. Marshall Research Library.)
Chaired by Dr. William O. Baker, forty-two, the brilliant research chief of Bell Labs, the committee had spent months analyzing the capabilities, direction, and potential of America's cryptologic resources. They concluded that the operations performed by NSA constituted one of America's most valuable assets and one of the most important weapons in the Cold War. To fall behind would be to invite another Pearl Harbor. They therefore recommended the initiation of a Manhattan Project-like effort to push the USA well ahead of the Soviet Union and all other nations in the application of communications, computer science, mathematics, and information theory to cryptology. Such a goal, the Baker Committee suggested, could be accomplished only by a continuous transfusion of brainpower from the reservoir of scientific genius beyond Fort Meade.
NSA had now reached a crossroads. One year earlier, General Canine had begun laying the groundwork for Project Lightning, the five-year, $25 million program to increase computer speed a thousandfold. Research contracts had gone out to industrial and academic research groups. But this was to be open research, with the results reported in the literature and made freely available. In fact, research produced by Lightning contributed to over 160 technical articles, 320 patent applications, and 71 university theses.
Now the decision had to be made about whether to continue funding, as with Lightning, generalized, public research or to begin to direct those funds toward secret, specialized, cryptologic research. It was a choice between an open bridge and a hidden tunnel between the Agency and the outside scientific community. Following the Baker report, the decision was to use the tunnel. The vehicle would be Project Focus.
On the morning of October 22, 1960, a small group of invited guests sat quietly on folding chairs as they listened to Princeton University president Robert F. Goheen dedicate the latest building on his expansive campus. John von Neumann Hall was a contemporary, red-brick, two-story building with a pleasant, tree-shaded patio surrounded by an eight-foot-high brick wall. It might have been a new science building or possibly a student center.
It was neither.
Named after the brilliant mathematician who pioneered computer logic, John von Neumann Hall was, in effect, the academic world's entranceway into the NSA's secret tunnel.
A product of the Baker Committee, Project Focus involved the establishment of a private, independent think tank dedicated exclusively to aiding the NSA in discovering solutions to advanced cryptologic-related problems. One year earlier, in 1956, Secretary of Defense Charles E. Wilson had turned to James R. Killian, Jr., president of the Massachusetts Institute of Technology as well as chairman of the President's Board of Consultants on Foreign Intelligence Activities, and asked his help in bringing under one roof a permanent corps of civilians to assist the Joint Chiefs of Staff's Weapons Systems Evaluation Group in arbitrating the Pentagon's numerous internal battles over such problems as which missile system to fund. The result was the Institute for Defense Analyses, an academic think tank formed originally by MIT, the California Institute of Technology, the Case Institute of Technology, Stanford, and Tulane, and later joined by seven other universities.
Following the Baker Committee report, Killian, who was now the chairman of the board of IDA, was asked to establish a similar organization for the NSA. He agreed to do so; and following the receipt of $1.9 million in 1958, IDA's Communications Research Division was formed, and planning began for the building of offices and laboratories on Princeton's campus.
Despite the assertion of one official of the institute that IDA has always been "completely independent of the government" in order to ensure that the institute would be "able to carry out studies that don't merely support some preconceived idea of the government," the CRD has always had the most intimate ties with the NSA. Selected as CRD's first director was Dr. J. Barkley Rosser, fifty, a professor of mathematics at Cornell and a specialist in numerical analysis. Chosen as his deputy, however, was Dr. Richard A. Leibler, forty-four, a five-year employee of the Puzzle Palace and a chief architect of Project Focus. A former mathematician with the Sandia Corporation who had also taught, at various times, at the University of Illinois (where he became friends with another math professor, Dr. Louis W. Tordella), Purdue, and Princeton, Leibler was primarily interested in probability and statistics. He apparently enjoyed what he once referred to as "our lonely isolation in Princeton." In reference to NSA, he once wrote to William F. Friedman, "For reasons which you must appreciate, I try to get down there and back as soon as possible, so I usually manage to do all my work in a single day."
On September 12, 1961, A. Adrian Albert, aged fifty-five, chairman of the University of Chicago's mathematics department, replaced Barkley Rosser as head of the CRD. One of cryptology's earliest visionaries, Albert had seen the correlation between cryptography and higher algebra as early as 1941. In a paper entitled "Some Mathematical Aspects of Cryptography," he wrote, "It would not be an exaggeration to state that abstract cryptography is identical with abstract mathematics."
Like that of his predecessor, Albert's tenure at CRD was also short. In 1963 Deputy Director Leibler dropped the "deputy" from his title and moved into the director's office, thus tying the knot between the NSA and CRD all the tighter. The relationship must have been a good one. Leibler continued as director for the next fourteen years, leaving Princeton only in 1977 to return to the NSA as chief of the Office of Research within the Research and Engineering Organization.
Leibler was replaced by Dr. Lee P. Neuwirth, forty-three, who had served as deputy director for the previous twelve years. He had first joined CRD as a mathematician in 1961, two years after receiving his Ph.D. from Princeton.
Labeled "the most secret of the major think tanks" by Paul Dickson in his book Think Tanks, IDA has its headquarters in a ten-story, concrete-and-glass high-rise across an acre of parking lots from the Pentagon. Eschewing even the smallest sign, IDA makes a point of not advertising its existence.
In the fall of 1967, partly as a result of this penchant for secrecy and the institute's heavy involvement in the Vietnam War, members of Princeton's chapter of the Students for a Democratic Society staged a demonstration in front of Von Neumann Hall, demanding that the university sever its ties with the institute. The students argued that Princeton's participation in IDA necessarily involved them indirectly in the war and also tended to compromise academic freedom by involving academicians in secret scientific projects whose findings could not be widely shared.
The protest against IDA spread to other campuses, and in the spring of 1968 it became one of the rallying cries of the students who staged the eight-day takeover at Columbia University.
The issue was finally resolved by the universities' agreeing to drop their official links with the institute while continuing to permit a representative of each school to serve in an individual capacity as a trustee.
Following the protests, the CRD quietly packed up and moved into a boxy, three-story brick building virtually hidden in an isolated wooded area off campus. Windowless except for the third floor, the building has, again, absolutely no signs indicating the name of the occupant.
By February 1970, IDA had grown into five divisions and three groups, comprising 285 professionals and a support staff of 274, with an annual income of a little more than $13 million. CRD, one of the smallest divisions, consisted of twenty-seven professionals (up only three from its first year) and thirty-three support personnel. The professionals, mostly mathematicians, were normally borrowed from universities on one-year contracts and were usually given far more latitude in attacking problems than their counterparts in NSA's Research and Engineering Organization.
The CRD's statistics are, however, a bit misleading. Shortly after the division's birth, several programs were launched to bring into the secret fraternity several dozen of the nation's most outstanding academic minds in mathematics and languages. The programs formed what must have been the country's most exclusive summer camp. Known as SCAMP, for Summer Campus, Advanced Mathematics Program, and ALP (the CRD wisely dropped the first two letters) for Advanced Language Program, the projects involved bringing together a wide variety of senior university scholars, introducing them to the mysteries of cryptology, and applying their collective genius to some of the NSA's most perplexing riddles.
Cleared and indoctrinated for top secret SIGINT and COMSEC material, the SCAMP and ALP participants, usually tenured professors from some of the nation's best schools, would arrive with their families in early summer and attend symposia and lectures in a specially built, heavily secured building on the campus of the University of California at Los Angeles. To avoid creating suspicion, the participants would be paid directly by UCLA, which in turn was reimbursed by IDA-CRD, which in turn was paid by NSA.
With the ending of Project Lightning in 1962, so too ended NSA's support of unclassified public research. Lightning had helped prime the scientific pump, and competition within private industry, it was felt, would ensure that the flow of technological advances in the computer and associated fields would continue to pour out. The Puzzle Palace, through CRD, SCAMP-ALP, and a select number of key consultants and contractors, could now focus its full attention, as well as its dollars, on a science where there was no competition, where NSA alone controlled a monopoly: cryptology.
But along came Lucifer.
"NSA," the Agency declared with all due modesty, "certainly hastened the start of the computer age." Among the early products of that age was the use of computers in the banking industry for everything from the massive transfers of money between banks and other financial institutions, to the simple recording of a midnight transaction at a remote automatic teller. But there was another product: computer crime. With sufficient knowledge and the access to a terminal, one could trick the computer into transferring funds into a dummy account or tickle a cash-dispensing machine into disgorging its considerable holdings.
To counter such possibilities, and realizing that data communications held enormous market potential, IBM board chairman Thomas Watson, Jr., during the late 1960s set up a cryptology research group at IBM's research laboratory in Yorktown Heights, New York. Led by Horst Feistel, the research group concluded its work in 1971 with the development of a cipher code-named Lucifer, which it promptly sold to Lloyd's of London for use in a cash-dispensing system that IBM had developed.
Spurred by the success of Lucifer, IBM turned to Walter Tuchman, a thirty-eight-year-old engineer with a doctorate in information theory, then working at the company's Kingston development lab. A sixteen-year veteran of IBM, Tuchman was asked to head up a data security products group that would transform Lucifer into a highly marketable commodity.
Aided by Carl Meyer, then forty-two, a German-born electrical engineer who had earned his doctorate from the University of Pennsylvania, Tuchman soon discovered that Lucifer would require considerable strengthening before it could withstand massive commercial use. The team spent the following two years tearing the cipher apart and putting it back together again, over and over, trying each time to give it more complex functions. The process included intense "validation," whereby experts would bombard the cipher with sophisticated cryptanalytic attacks. Finally, in 1974, the cipher was ready for market.
At about the same time that IBM was turning its attention to cryptography, another group was beginning to study the subject with great interest. In 1968 the National Bureau of Standards, charged since 1965 with developing standards for the federal government's purchase and use of computers, initiated a series of studies to determine the government's need for computer security. As a result of the studies, the NBS decided to search for an encryption method, or algorithm, that could serve as a governmentwide standard for the storage and transmission of unclassified data. Solicitation for such an encryption algorithm took place in May 1973 and August 1974.
The timing could not have been better for IBM, which submitted for consideration its Lucifer cipher. Labeled by David Kahn "the tiniest known 'cipher machine' ever produced," Lucifer actually consisted of a thumbnail-sized silicon "chip" containing an extremely complex integrated circuit. The "key" to the cipher was a long string of "bits" -- 0's and 1's -- the combination of which would vary from user to user just as the grooves in front-door keys will vary from neighbor to neighbor.
Like the door key going into the lock, the cipher key goes into a series of eight "S-boxes," actually supercomplex mathematical formulas that, when combined with the particular key, transform intelligible data into indecipherable bits -- and then perform the reverse magic on the other end.
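To make the lock-and-key metaphor concrete, here is a minimal Python sketch of the substitution idea. The lookup table happens to be the first row of DES's real S1 box, but the 4-bit width and the simple XOR key mixing are simplifications invented purely for illustration:

```python
# Minimal sketch of an S-box: a fixed nonlinear lookup table that, combined
# with key bits, turns intelligible data into scrambled bits -- and back.
# The 4-bit width and XOR key mixing are illustrative simplifications.

SBOX = [14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7]  # first row of DES S1
INV = {v: i for i, v in enumerate(SBOX)}    # the table is a permutation, so invertible

def encrypt_nibble(data: int, key: int) -> int:
    """Mix in key bits, then substitute through the S-box."""
    return SBOX[(data ^ key) & 0xF]

def decrypt_nibble(cipher: int, key: int) -> int:
    """Run the table backward, then strip the key bits off."""
    return INV[cipher] ^ key

assert decrypt_nibble(encrypt_nibble(0b1010, key=0b0110), key=0b0110) == 0b1010
```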
Just as more grooves on the key usually means a more difficult lock to pick, more bits in the cipher key will decrease the chances of successful cryptanalysis. For this reason IBM developed Lucifer with a key 128 bits long. But before it submitted the cipher to the NBS, it mysteriously broke off more than half the key.
From the very beginning, the NSA had taken an enormous interest in Project Lucifer. It had even indirectly lent a hand in the development of the S-box structures. "IBM was involved with the NSA on an ongoing basis," admitted Alan Konheim, a senior employee at IBM's Yorktown Heights lab. "They [NSA employees] came up every couple of months to find out what IBM was doing."
For the first time in its long history, NSA was facing competition from within its own country. The outsiders were no longer mere hobbyists but highly skilled professionals, supported by unlimited funds and interested more in perfection than in speed.
Viewed from within the NSA's barbed and electrified fences, the dangers were real. For years the Puzzle Palace had been growing more and more dependent on the ever-widening stream of international data communications flowing invisibly through the air. That air was alive with messages about oil from the Middle East, financial transactions from Europe, and trade strategies from Japan. The NSA simply had to stretch out its electronic net and pull in the most valuable economic intelligence.
Of equal or greater importance was diplomatic and military intelligence plucked from the Third World. Encrypted, for the most part, on antique, inexpensive, or unsophisticated machines, most communications from Africa, South America, and Asia were easy pickings for the NSA. Through the back door would occasionally pass such jewels as a Third World minister's report to his Foreign Office of an intimate exchange with a Soviet or Chinese counterpart. "It goes on all the time," said G Group chief Frank Raven. "You'd be astonished the amount of information that you get on Communist targets from non-Communist countries."
But the development and widespread use of an economical, highly secure data encryption device threatened to turn the NSA's well-stocked stream into a dry riverbed.
Nervousness over competition from the outside was not limited, however, to the codebreakers of PROD. The cryptographers of COMSEC were just as worried, although for the opposite reason. For them, the major fear was that, by accident, the outside researchers might stumble across methods the NSA itself used, thereby compromising the Agency's codes.
As a result of closed-door negotiations with officials of the NSA, IBM agreed to reduce the size of its key from 128 bits to 56 bits. The company also agreed to classify certain details about its selection of the eight S-boxes for the cipher.
After the company submitted the now-truncated cipher, the Bureau of Standards passed it on to the NSA for what it called a "deep analysis." The Agency, in turn, certified the algorithm as "free of any statistical or mathematical weaknesses" and recommended it as the best candidate for the nation's Data Encryption Standard (DES).
The decision set off an immediate firestorm within the scientific community. Some critics, pointing to the shortened key, charged the NSA with ensuring that the cipher was just long enough to prevent penetration by corporate eavesdroppers but was just short enough for the NSA's own codebreakers. Others pointed to the Agency's tinkering with the critical S-boxes and expressed fears that it may have installed a mathematical "trap door," enabling it to spring open the cipher with little difficulty. Hence the insistence on classification.
The reason for such actions, said the critics, was simple. Since the DES would eventually be manufactured commercially and installed on a wide assortment of equipment sold abroad, the NSA would not want to cut its own throat by permitting the foreign proliferation of an unbreakable cipher. Yet weaknesses in the cipher would still allow the Agency to penetrate every communications link and data bank using the DES, American as well as foreign.
Code expert David Kahn theorized that there was a secret debate within the NSA over the DES. "The codebreaking side wanted to make sure that the cipher was weak enough for the NSA to solve it when used by foreign nations and companies," he reasoned. "The codemaking side," on the other hand, "wanted any cipher it was certifying for use by Americans to be truly good. The upshot was a bureaucratic compromise. Part of the cipher -- the 'S-boxes' that performed a substitution -- was strengthened. Another part -- the key that varied from one pair of users to another -- was weakened."
Leading the charge against the DES were Professor Martin E. Hellman and researcher Whitfield Diffie of Stanford University. The two computer experts argued that a computer capable of breaking the code could be built using one million special search chips, each chip capable of testing one million possible solutions per second.
Known within the Puzzle Palace as a "brute force" attack, the method involves acquiring a number of deciphered messages from the particular target and then, using high-speed computers, matching them against intercepted, encrypted versions of the very same messages. If an encrypted message is attacked with every possible version of the code, at some point it will match its unencrypted counterpart. At that point the code evaporates, and all further messages are there for the taking.
How long it takes to break the code depends on the length of the key. For a 56-bit key, the number of possible combinations would be about seventy quadrillion. But using a computer with a million of the special-purpose chips, capable of testing one trillion possible keys per second, the entire range of possible keys could be searched in seventy thousand seconds -- or less than twenty hours. On the average, however, Hellman noted, only about half of the keys would have to be tried before the appropriate key was found, making the average search less than half a day.
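The arithmetic is easy to check. Here is a minimal Python sketch, assuming only the machine Hellman and Diffie hypothesized (a million chips, each testing a million keys per second):

```python
# Reproducing the Hellman-Diffie estimate: a million special-purpose chips,
# each testing a million keys per second, exhausting a 56-bit key space.

keyspace = 2 ** 56                    # ~7.2e16 keys -- "seventy quadrillion"
rate = 1_000_000 * 1_000_000          # 1e12 keys/second for the whole machine

full_search = keyspace / rate         # ~72,000 seconds
average_search = full_search / 2      # half the space on average

print(f"full search:    {full_search / 3600:.1f} hours")     # ~20.0
print(f"average search: {average_search / 3600:.1f} hours")  # ~10.0
```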
Then there is the matter of cost. According to the two Stanford scientists, the chips themselves could be produced for about $10 each, or about $10 million altogether. Allowing a factor of two for design, control hardware, power supplies, and the like, they concluded that such a computer could be built for about $20 million. Depreciated over five years, Hellman and Diffie maintained, the daily operating cost drops to about $10,000 a day, meaning that each solution would cost about $5000.
Looking further into the future, they noted that the cost of computation and hardware had decreased by a factor of ten every five years since the 1940s. Thus, if the trend continued, the machine would cost only $200,000 in ten years and each solution a mere $50.
But suppose IBM had ignored the NSA and instead submitted its original 128-bit key? The results, said Hellman and Diffie, would have been dramatically different. As opposed to the moderate $5000 price tag, each solution would cost an unimaginable $200 septillion, or $200,000,000,000,000,000,000,000,000.
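The cost side follows from the same assumptions. The sketch below reruns that arithmetic; the dollar amounts are Hellman and Diffie's published estimates, not independently verified costs, and the final line shows the factor by which the original 128-bit key would have multiplied the work:

```python
# Rerunning the Hellman-Diffie cost estimate for the hypothetical machine.

chips = 1_000_000
machine_cost = chips * 10 * 2            # $10 per chip, doubled for design,
                                         # control hardware, power: $20 million
daily_cost = machine_cost / (5 * 365)    # five-year depreciation: ~$11,000/day
cost_per_solution = daily_cost / 2       # an average search takes ~half a day

print(f"~${cost_per_solution:,.0f} per 56-bit solution")  # ~$5,500

# A 128-bit key would multiply the search effort (and so the cost) by 2^72:
print(f"128-bit penalty factor: {2 ** 72:.1e}")           # ~4.7e21
```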
Because of these concerns, the Bureau of Standards sponsored two workshops on the DES. At the first, in August 1976, NBS officials defended their choice for the DES and said that, given their existing machinery, a brute force attack would take seventeen thousand years. Yet the make-up of the NBS's DES team cast a shadow over the impartiality of the judgments. The leader of the bureau's computer security project was Dennis Branstad, a former employee of the NSA; Arthur Levenson, an NBS consultant on the project, had been, at least through the late 1960s, one of the NSA's most senior codebreakers as a group chief in PROD before he, apparently, left for IBM.
Other participants at the workshop disagreed on the costs, development time, and exhaustion time necessary to construct such a computer, with the majority suggesting ranges of two to ten years for construction, six months to ten years for exhaustion, and a price tag of $10 million to $12 million.
Also coming to the defense of the IBM algorithm were the cipher's inventors, Walter Tuchman and Carl Meyer. They said the cost of constructing a specialized DES codebreaking machine would be closer to $200 million and that DES-based devices could be designed to encipher messages twice using two different keys, thus effectively doubling the key size to 112 bits. But as a trio of experts from Bell Labs pointed out, "Most data terminals will not be set up to do this."
Finally, the Senate Intelligence Committee looked into the controversy and concluded that, although the "NSA convinced IBM that a reduced key size was sufficient [and] indirectly assisted in the development of the S-box structures," they could find no wrongdoing.
The controversy over the DES has, for the time being, ebbed. On July 15, 1977, it became the official government civilian cipher, and a half-dozen firms began turning it out for private industry. Some, such as the American Bankers Association, have endorsed it, but others, like the Bell Telephone Company, have resisted it.
But there is one thing on which all sides in the debate agree: the issue will soon arise again. As new advances in computer speeds and capabilities, and new technologies, such as Josephson junctions, cryogenics, and bubble memory, come into play, the safety margin offered by the cipher will gradually disappear. Some give the cipher five years; others give it ten; few give it more. The intervening years will be decisive.
In the same way that companies today are beginning to market the DES, an enterprise known as the Code Compilation Company opened its doors for business in New York in the 1920s. Located in a tall, gray office building at 52 Vanderbilt Avenue, the company compiled and sold a wide assortment of codes to various trade and other businesses. In the back of the company offices was a heavy, locked door, through which only company employees were allowed to pass. The reason for such secrecy was that the back room held the headquarters for Code Compilation's parent: Herbert Yardley's Black Chamber, America's secret codebreaking organization.
The question the nation must resolve during the years before a second-generation DES is developed is whether, as in the Code Compilation Company, there will be a secret door between public and governmental cryptology.
***
Victorious in its battle over the DES, the NSA now set its sights on an even greater potential threat: academic research into cryptology. Suddenly, on numerous campuses across the country, mathematicians and computer scientists began devoting substantial energies and resources to a subject that was once merely a strange word locked away in a dictionary. The research was both theoretical and applied; some scientists developed actual hardware, and others peered ever deeper into mathematical conundrums. So far had the interest gone that several colleges even offered courses in the subject, and in 1977 a scholarly journal devoted exclusively to cryptology was born.*
____________________
* One such course was taught by mathematics professor Cipher A. Deavours, Kean College of New Jersey, Union, New Jersey. The journal is Cryptologia, founded and edited by David Kahn, Cipher A. Deavours, Louis Kruh, and Brian J. Winkel.
What must have been the NSA's biggest blow came in 1976, when the two anti-DES scientists from Stanford, Hellman and Diffie, came up with what David Kahn called "the most revolutionary new concept in the field since polyalphabetic substitution emerged in the Renaissance." Later refined by three scientists at MIT, Ronald Rivest, Adi Shamir, and Leonard Adleman, the system was labeled public-key cryptography and offered a radical new twist to an old concept. Rather than using the same key, as with the DES, to encrypt and decrypt, the public-key system allows for two separate keys -- one limited to encryption and the other to decryption. What this means is that a person can now freely distribute his computer's encryption key, such as in a national directory, permitting anyone to send secret information to him. But only he is able to decrypt the messages, since he alone has the decrypt key. An added bonus of the system is that it also permits the sender of the messages to sign, in effect, in indelible code, thus ensuring the authenticity of the author.
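The two-key mechanics can be illustrated with textbook RSA, the MIT refinement. The Python sketch below uses deliberately tiny primes and omits the padding and key lengths any real system requires; it demonstrates the principle only:

```python
# Textbook RSA with toy numbers: publish (e, n), keep d secret.

p, q = 61, 53                # two secret primes (absurdly small, for clarity)
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public encryption exponent, coprime to phi
d = pow(e, -1, phi)          # private decryption exponent: 2753 (Python 3.8+)

m = 65                       # a message encoded as a number below n

# Anyone may encrypt with the published key; only d recovers the message.
c = pow(m, e, n)
assert pow(c, d, n) == m

# Signing runs the keys the other way: transform with the private key,
# and anyone can verify the result against the public one.
sig = pow(m, d, n)
assert pow(sig, e, n) == m
```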
The problem faced by the NSA in trying to halt this research, and thus restore its own hegemony, was that, unlike the industrial world, with its heavy dependence on defense contracts and its conservative boards of directors, the university researchers were independent. The cozy relationship that the Agency had fostered with IBM would be impossible with the free-wheeling academics. Nevertheless, virtually all the researchers had an Achilles' heel: the National Science Foundation.
Set up as an independent government agency to foster research into basic scientific knowledge, the NSF accomplished its goals by awarding grants and contracts to universities and non-profit research organizations. It was through such grants and contracts that most of the nongovernmental cryptologists received their funding. Thus, it was reasoned at NSA, if the Agency could wrest control of all cryptologic funding, it would effectively control almost all outside research in the subject. Such control could come to NSA if the NSF turned over to the Agency all responsibility for the cryptologic programs or, if this was not feasible, if it granted to the NSA the right to classify any program it deemed should be kept secret.
The first step in the NSA's power grab took place on April 20, 1977, when Assistant Deputy Director for COMSEC Cecil Corry (who by then had been with the Agency for thirty-five years) and one of his assistants, David G. Boak, journeyed down to Washington to meet with an NSF official at the foundation's G Street headquarters. The official was Dr. Fred W. Weingarten, the special projects program director of the Division of Computer Research, and the purpose of the meeting was to discuss the NSF's support of cryptographic research.
Corry, one of COMSEC's founding fathers and at this time the number two man in the organization, wasted little time in getting down to business. Soon after the meeting began, he suggested to Weingarten that an unspecified presidential directive gave the NSA "control" over all cryptographic work and that Weingarten and his foundation were operating outside that directive.
It was not the first time Weingarten had heard such a charge. Almost two years earlier, in June 1975, one of the NSF's grantees who also worked for the NSA told him that NSA "had sole statutory authority to fund research in cryptography, and, in fact, that other agencies are specifically enjoined from supporting that type of work." Weingarten, fearing that he may have been operating outside the law in awarding cryptographic grants, immediately suspended any new awards in the field and shot off a memo to NSF's general counsel asking for an opinion on the issue. Unable to find any such statute, Assistant General Counsel Jesse E. Lasken telephoned NSA's legal office, but no one there could find the supporting statute. The scare over, Weingarten's cryptographic funding continued.
Now he was hearing the threats again, but this time he told his two NSA visitors that his general counsel had looked into the matter nearly two years before and found no such directive involving research. The bluff having failed, Corry mumbled that they would have to get such a law passed and then resorted to his alternative strategy. Hoping to win for the NSA the power of the classification stamp over all cryptographic proposals, Corry suggested that the two agencies "coordinate" the review process for the proposals.
Because the NSA did have the only reservoir of expertise in the field, Weingarten agreed to send over the proposals, but only, he warned, so that NSA could provide its expert opinion on the technical quality of the work. Further, he told Corry that his division would continue to consider proposals in the field of cryptology, that it would operate in as open a manner as possible, that it would decline proposals only for fully documented scientific reasons, and that he would not permit any "secret reviews -- reviews of the form 'Don't support this, but I can't tell you why.'"
Weingarten could see the beginnings of a major power play involving the open research policy of the foundation and the national security claims of the NSA. Several days after the meeting he recorded his views concerning the looming battle in an internal memorandum for the record:
First -- NSA is in a bureaucratic bind. In the past the only communications with heavy security demands were military and diplomatic. Now, with the marriage of computer applications with telecommunications in Electronic Funds Transfer, Electronic Mail, and other large distributed processing applications, the need for highly secure digital processing has hit the civilian sector. NSA is worried, of course, that public domain security research will compromise some of their work. However, even further, they seem to want to maintain their control and corner a bureaucratic expertise in this field. They point out that the government is asking NSA help in issues of computer security. However, unquotable sources at OMB tell me that they turned to NSA only for the short-term, pragmatic reason that the expertise was there, not as an expression of policy that NSA should have any central authority.

It seems clear that turning such a huge domestic responsibility, potentially involving such activities as banking, the US mail, and cable television, to an organization such as NSA should be done only after the most serious debate at higher levels of government than represented by peanuts like me.
Furthermore, no matter what one's views about the role of NSA in government, it is inescapable that NSF relations with them be formal. Informal agreements regarding support of areas of research or individual projects need to be avoided.
Apparently not having gotten the message, Corry wrote to Weingarten's boss, Dr. John R. Pasta, director of the NSF's Division of Mathematical and Computer Sciences, to say "We are grateful for your willingness to cooperate with us in considering the security implications of grant applications in this field."
Pasta sent the letter up through channels at the NSF, noting that his division had made no such agreement. Later he replied to Corry, "clarifying" the arrangements and adding that any review the NSA made on proposals would become part of the public record.
Meanwhile, on July 5, 1977, as each side continued to posture, Vice Admiral Bobby Inman moved into the Puzzle Palace, replacing newly promoted General Lew Allen, Jr. Inman's initiation into the battle was to be a quick one. The day after his arrival, one of his civilian employees, Joseph A. Meyer, having decided that the issue of public research into cryptology was destined to be as ignored under Inman as it was under Allen, took action. Without any authorization, he wrote a threatening letter to the Institute of Electrical and Electronic Engineers (IEEE), the nation's largest professional engineering society (of which he was a member), warning that those planning to participate in an upcoming IEEE symposium on cryptology might violate the law.
Among those who would be speaking and presenting papers at the October gathering were public-key originators Martin Hellman and Ronald Rivest. What bothered Meyer so much was not only that the meeting was going to be open to the public, but that a number of foreign guests would also be attending and participating. Worse, there were plans to send copies of the talks, before they were delivered, to the Soviet Union under a general umbrella agreement made by the IEEE several years earlier.
In his single-spaced, one-and-a-half-page letter, Meyer brought up the International Traffic in Arms Regulations (ITAR), through which the State Department controls the export of arms, ammunition, and "implements of war," like jet fighters and warships, which are listed in a document known as the U.S. Munitions List. Also on the list, under Auxiliary Military Equipment, were cryptographic devices, speech scramblers, privacy devices, and their associated specialized equipment.
What Meyer emphasized was that ITAR covered the export not only of actual hardware, but also of unclassified technical data associated with the restricted equipment. He claimed that holding symposia and publishing papers on cryptology were the same as exporting the information. Thus, he concluded, "unless clearances or export licenses are obtained" on some of the lectures and papers, "the IEEE could find itself in technical violation of the ITAR."
He had a point. The ITAR did cover any "unclassified information that can be used, or adapted for use, in the design, production, manufacture, repair, overhaul, processing, engineering, development, operation, maintenance, or reconstruction" of the listed materials, as well as "any technology which advances the state-of-the-art or establishes a new art in an area of significant military applicability in the United States." And export did include transferring the information both by writing and by either oral or visual means, including briefings and symposia in which foreign nationals are present.
But followed literally, the vague, overly broad regulations would seem to require that anyone planning to write or speak out publicly on any technology touching the Munitions List must first get approval from the State Department -- a chilling prospect clearly at odds with the First Amendment and one as yet untested by the Supreme Court.
Nevertheless, the letter had its desired effect on the IEEE. Officials of the organization urged participants in the upcoming symposium to clear any questionable material with the State Department's Office of Munitions Control -- thus, in effect, clearing it through the NSA.
Despite the fact that Meyer had penned his letter as a private citizen and a member of the IEEE, it was only a matter of days before it was discovered that he worked for NSA -- which, of course, led many to believe that the letter was simply the NSA's covert way of stifling public cryptographic research.
Following a storm of embarrassing publicity over the incident, the Agency denied any connection with the letter, but for many the denial was insufficient. As a result, Inman sought help from a group he had come to know quite well during his years as director of Naval Intelligence and vice director of the Defense Intelligence Agency: the Senate Intelligence Committee. Inman asked the committee to conduct an impartial review of the Meyer affair and several other issues. The committee agreed, and in April 1978 it issued two reports, one classified and one unclassified, that acquitted the Agency of any involvement in the Meyer letter.
Possibly buoyed by the congressional vote of confidence, that same month the NSA took another giant step toward silencing the competition.
Six months earlier a foursome of inventors in Seattle, working in their spare time in the back of a garage, managed to develop a new type of voice scrambler. Led by thirty-five-year-old Carl Nicolai, a job-shopper, or technical "Kelly girl," the group called its new invention the Phasorphone and submitted a patent application in October 1977. In April 1978, Nicolai finally received a response from the Patent Office. But when he opened the letter, he was stunned. Instead of a patent, his hands held a strange form with the words SECRECY ORDER in large bold letters across the top.
Nicolai had suddenly been assaulted with one of the oldest weapons in the nation's national security arsenal: the Invention Secrecy Act. Passed in 1917 as a wartime measure to prevent the publication of inventions that might "be detrimental to the public safety or defense or might assist the enemy or endanger the successful prosecution of the war," the measure ended with the conclusion of World War I. The act was resurrected in 1940 and was later extended until the end of the Second World War. Then, like the phoenix, it once again rose from the ashes with the passage of the Invention Secrecy Act of 1951, which mandated that secrecy orders be kept for periods of no more than one year unless renewed. There was a catch, however. The act also said that a secrecy order "in effect, or issued, during a national emergency declared by the President shall remain in effect for the duration of the national emergency and six months thereafter." Because no one ever bothered to declare an end to President Truman's 1951 emergency, the emergency remained in effect until September 1978.
Nicolai's secrecy order told him little except that he faced two years in jail and a $10,000 fine for disclosing any aspect of his device "in any way to any person not cognizant of the invention prior to the date of the order." Nowhere on the order did it say why it was issued or who ordered the action.
Unknown to the Seattle inventor, the patent application for his backyard scrambler had traveled through one of the government's least-known bureaucratic labyrinths -- one littered with such security classifications as SUPER SECRECY and BLUE SLIP. Once submitted to the Patent and Trademark Office, it, like all other applications, was sent to a unit called the Special Laws Administrative Group, better known as the Secret Group. Here, several dozen specially cleared examiners separate the applications into chemical, electrical, or mechanical inventions and then, using guide lists provided by the various defense agencies, determine whether any contain national security information. Those they suspect are passed on to the Pentagon's Armed Services Patent Advisory Board (ASPAB), a sort of clearinghouse for secrecy orders, which then requests an opinion from the appropriate agency and coordinates the decision to invoke secrecy.
When Nicolai's Phasorphone reached the ASPAB, there was disagreement. The middle-level official at NSA responsible for such decisions wanted the secrecy order issued (although others within the Agency disagreed), and he was supported by the Air Force and Navy representatives. But the Army saw no reason for such a move, so the decision was kicked up to NSA's Director Inman for a final decision. He gave the go-ahead to the order.
Nicolai had thus become, in the slang of the ASPAB, a "John Doe." Of the three hundred or so secrecy orders issued each year, all but a very few are either on inventions the government has originated itself and already classified, or on inventions somehow connected with the government. A John Doe is one of the few outside this circle. In this instance, John Doe was hopping mad.
The object of Nicolai's patent application and the NSA's anxiety was a voice privacy system that relied, apparently, more on the science of transmission security than on cryptography. As opposed to cryptography, which merely renders the contents of a message unintelligible to those without the key, transmission security conceals the very existence of the message itself. The seed for the Phasorphone was planted in 1960 in an article on communications security by Alfred Pfanstiehl for Analog magazine. Pfanstiehl suggested that instead of the traditional method of transmission, where signals are sent between transmitter and receiver over a single frequency, a system of pseudorandom wave forms be used. Under such a system a code could be devised using pseudorandom alterations of the frequency spectrum exactly synchronized between transmitter and receiver. The system held promise for an area particularly vulnerable to eavesdropping: CB and marine band radio. But it could also be modified for telephone.
What was so worrisome to the NSA, it seems, was the movement by the private sector into yet another once-exclusive domain. For years the Agency had been putting strong emphasis on the marriage of cryptography and transmission security for hidden communications with submarines and clandestine agents in hostile foreign countries.* Such techniques included frequency-hopping, where messages are bounced from frequency to frequency at more than a thousand times a second; burst communications, where a message is supercompressed into a brief "squirt"; and spread spectrum techniques, where a signal is first diluted to a millionth of its original intensity and then intermingled with background noise.
____________________
* In 1973 TRW began designing a satellite system for use by the CIA in communicating with agents in "denied areas." Code-named Pyramider, the system employed frequency-hopping. This provided the agent with large "safe areas" in cities, where the signals could be hidden among random urban radio transmissions. The system was also capable of reducing aircraft interception in remote areas to a radius of twenty nautical miles. (See Robert Lindsey, The Falcon and the Snowman [New York: Simon & Schuster, 1979], page 218.)
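The synchronization at the heart of frequency-hopping is easy to sketch: transmitter and receiver seed identical pseudorandom generators and thereby agree on the channel schedule, while an eavesdropper sees only brief, scattered bursts. In the minimal Python sketch below, the channel count, hop count, and seed are all invented for illustration and drawn from no real system:

```python
# Sketch of synchronized frequency-hopping: a shared secret seed drives
# identical pseudorandom channel schedules at both ends of the link.

import random

def hop_schedule(shared_seed: int, channels: int = 1000, hops: int = 10):
    """Return the pseudorandom sequence of channels derived from the seed."""
    rng = random.Random(shared_seed)
    return [rng.randrange(channels) for _ in range(hops)]

transmitter = hop_schedule(shared_seed=2718)
receiver = hop_schedule(shared_seed=2718)
assert transmitter == receiver   # same seed, same schedule: perfect sync

print(transmitter)               # a new channel for every time slice
```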
To add insult to injury, Nicolai was planning to market his Phasorphone at a price most buyers could easily afford, about $100, thus increasing the interest in the technology.
That the NSA was suddenly attempting to flex its muscles in the patent area could be seen in the fact that the very day Nicolai's secrecy order was issued, another inventor was opening a secrecy order on yet another invention. Dr. George I. Davida, a professor of electrical engineering and computer science at the University of Wisconsin, had submitted a patent application for a "stream" cipher device, incorporating advanced mathematical techniques, about the same time Nicolai submitted his Phasorphone application. Now, like his Seattle counterpart, Davida had also become a John Doe.
Whatever the NSA had hoped to accomplish by its rapid one-two punch was lost in the embarrassing public battle that followed. Soon after Davida received his secrecy order, Werner Baum, chancellor of the University of Wisconsin's Milwaukee campus, sent off a letter to the director of the National Science Foundation, which sponsored Davida's project, denouncing the secrecy order and calling for "minimal due process guarantees." He then told Science magazine that he regarded the order as an invasion of his faculty's academic freedom and said it smacked of McCarthy-era tactics against universities.
After first winning the support of Senator Warren Magnuson, Nicolai also turned to Science and later charged that the order "appears part of a general plan by the NSA to limit the privacy of the American people." He added, "They've been bugging people's telephones for years and now someone comes along with a device that makes this a little harder to do and they oppose this under the guise of national security."
The warfare was soon publicized by the national media, and the NSA was forced to sound retreat. It rescinded Davida's secrecy order on June 15, blaming it on a "very well-meaning, middle-level bureaucrat" at NSA and an "attempt to hold the line that had clearly already been passed." Nicolai, however, did not have his secrecy order cancelled until October 11. "There I was, faced with a split decision inside NSA over whether the Nicolai invention represented a threat," Inman said later. "From dealing day by day with the Invention Secrecy Act, you have to make a quick, snap decision."
The bruises the Puzzle Palace had received in struggles over the DES, the Meyer letter, and the secrecy orders had taken their toll on both the Agency and its director. Inman believed the NSA had received a "bum rap" and was afraid the one-sided controversy was having a demoralizing effect throughout the Agency and that it would frighten away many promising recruits. Even worse, he told a closed-door meeting with employees in the Friedman Auditorium, some of the news stories were threatening to cause "immediate damage" to the Agency's sensitive "sources and methods."
The director's tactic was to launch a counterattack on two fronts -- one in the open and the other behind the scenes. On the open front, Inman decided to convert to his own use what he believed was his opponents' biggest weapon: the media. Both Nicolai and Davida, he felt, had used the press to manipulate the NSA. Now he himself would begin manipulating the press for the Agency's benefit.
The first round in Inman's public relations war was fired shortly after he rescinded Nicolai's secrecy order. For the first time in the NSA's twenty-six-year history, an incumbent director would grant an interview to a member of the press. Inman told Science magazine's Deborah Shapley that because of the "burgeoning interest" in cryptology, he felt it was necessary to
find a way into some thoughtful discussion of what can be done between the two extremes of "that's classified" and "that's academic freedom" . . . Security has served the national interest with respect to the NSA extraordinarily well over a long period. So a whole series of directors have taken the view that "no comment" was the best response. But as we moved into burgeoning public interest in public cryptography, a substantial volume of unfavorable publicity has occurred with no counterbalance ... to point out that there are valid national security concerns.
Despite the ballyhoo, Inman's "interview" was little more than a monologue; the only area he would discuss was NSA's side of the Davida-Nicolai controversy. With regard to Davida, Inman said, the order was a "bureaucratic error, because, as it turned out, the material had already appeared in the open literature and so could not be classified." In Nicolai's case, he admitted that there was disagreement among the reviewers and that he had opted to "err on the side of national security." As a result of the publicity brought on by the two cases, Inman said, he had instituted a new procedure whereby any pro-secrecy order decision is automatically reviewed by a high-level committee. Nevertheless, he noted, "we would . . . [classify] any application where we feel there is a valid national security use or concern."
But Inman's most telling comment was his statement to Shapley that he would like to see the NSA receive the same authority over cryptology that the Department of Energy enjoys over research into atomic energy. Such authority would grant to NSA absolute "born classified" control over all research in any way related to cryptology.
"A public address by an incumbent Director of the National Security Agency on a subject relating to the Agency's mission is an event which if not of historic proportions is at least, to my knowledge, unprecedented."
So began Admiral Inman's second venture into the spotlight. "Traditionally," he continued, "NSA had maintained a policy of absolute public reticence concerning all aspects of our mission." But now, he said, "the Agency's mission no longer can remain entirely in the shadows."
Speaking before a symposium of the Armed Forces Communications Electronics Association at the State Department in January 1979, Inman elaborated on the dangers inherent in "unrestrained public discussion of cryptologic matters" to which he had previously only alluded. Warned Inman:
Application of the genius of the American scholarly community to cryptographic and cryptanalytic problems, and widespread dissemination of resulting discoveries, carry the clear risk that some of NSA's cryptanalytic successes will be duplicated, with a consequent improvement of cryptography by foreign targets. No less significant is the risk that cryptographic principles embodied in communications security devices developed by NSA will be rendered ineffective by parallel nongovernmental cryptologic activity and publication ... All of this poses clear risks to the national security [and places the mission of the NSA] in peril.
Following his "sky is falling" address, Inman again called for increased governmental controls over outside cryptologic research. "While some people outside NSA express concern that the government has too much power to control nongovernmental cryptologic activities," he said, "in candor, my concern is that the government has too little."
As a result of Inman's call for a "dialogue" between the NSA and the academic community, the American Council on Education established a Public Cryptography Study Group to investigate, and recommend possible solutions to, the problems facing both groups. Cochairmen of the group were Werner A. Baum, the former University of Wisconsin chancellor who went to Davida's rescue and is now a dean at Florida State University, and Ira Michael Heyman, chancellor of the University of California at Berkeley. The seven other members were mostly professors of mathematics and computer science from various universities. Representing the NSA was General Counsel Daniel C. Schwartz.*
____________________
* The scientist-members were David H. Brandin, vice president, Computer Science and Technology Division, SRI International (nominated by the Association for Computing Machinery); Professor R. Creighton Buck, Department of Mathematics, University of Wisconsin (nominated by the American Mathematical Society); Professor George I. Davida, Department of Electrical Engineering and Computer Science, University of Wisconsin (nominated by the Computer Society of the IEEE); Professor George Handelman, Department of Mathematical Sciences, Rensselaer Polytechnic Institute (nominated by the Society for Industrial and Applied Mathematics); Professor Martin E. Hellman, Department of Electrical Engineering, Stanford University (nominated by the IEEE); Professor Wilfred Kaplan, Department of Mathematics, University of Michigan (nominated by the American Association of University Professors).
Inman proposed that the group consider the feasibility of a statute permitting the NSA to exercise prepublication censorship over a "central core" of nongovernmental technical information relating to cryptology. Such a statute, the group concluded, could be implemented either by making it a crime to disseminate cryptologic information or by requiring prepublication review by a government agency, such as the NSA. Under the first approach, the NSA would monitor virtually all published information and recommend criminal prosecution in any instance where restricted cryptologic information had been published. Under the second, anyone publishing cryptologic information without first having it cleared by the NSA would face a jail sentence.
Recognizing the constitutional questions involved in such drastic actions, the study group decided on a middle ground: a system of voluntary censorship. Under such a system, the NSA would reserve the right to notify anyone working on cryptologic writings (authors, researchers, publishers) of its desire to review such information before publication. The Agency would then request those individuals whose writings contained material that the NSA desired withheld to voluntarily refrain from publishing them. In case of disagreement, a five-member advisory committee appointed by the director of NSA and the science adviser to the President would make the ultimate recommendation.
The final vote on the voluntary system passed with near-unanimity on February 7, 1981. The sole dissent came from George I. Davida, one of only two specialists in data security in the group. Davida believed the decision of the study group was both unwise and dangerous and would set a precedent for future nonvoluntary intervention by the NSA in the academic community.
Two years from now [he later wrote], if the NSA decides that it does indeed wish to impose restraints, the question will no doubt receive a hearing in Congress. It is easy to imagine the NSA offering the decision of our study group to Congress as evidence that academicians do indeed agree with the NSA-that our work could compromise the national security ... It would be only too easy for us to lose our constitutional freedoms in bits and pieces . . . One gets the impression that the NSA is struggling to stand still, and to keep American research standing still with it, while the rest of the world races ahead ... The NSA can best perform its mission in the old-fashioned way: Stay ahead of others.
Meanwhile, as the Agency continued its public campaign, it was also making significant headway on its second front: the quiet, behind-the-scenes effort to wrest control of all cryptologic funding from the National Science Foundation. By controlling the dollars, the NSA would control the research.
Following a briefing on the NSA's operations in September 1978, NSF director Richard C. Atkinson suggested to Admiral Inman that one way to help alleviate the problem of the foundation's research impinging on the NSA's "sensitive areas" would be to have the Agency begin a small ($2 to $3 million), unclassified research support program at various universities. In this way, Atkinson suggested, the NSF could shift its effort away from the cryptologic area as the NSA took up the slack.
It was the opening Inman had been hoping for, and he quickly replied to Atkinson that his offer was "most attractive" but that before any program was implemented, "some homework needs to be done here."
If there was any doubt as to Inman's ultimate intentions, it was dispelled during the first meeting of the Public Cryptography Study Group on May 31, 1980. Among those attending the meeting as an "authorized observer" was Richard Leibler, who was listed simply as "Chief, Office of Research, Department of Defense." In fact, Leibler, having served as director of the Agency's think tank, IDA-CRD, for fourteen years, was now head of NSA's Office of Research. In a remark that was never included in the minutes of the meeting, Leibler noted that the "NSA would take over the funding of cryptographic research grants from the NSF, assuming there are no legal impediments to such transfer and the study group produces worthwhile recommendations on how to effect it."
Apparently the study group never considered the proposal. Still, Inman was just about ready to begin his coup; all he needed was the right research project to come along. Two and a half months later, that project appeared. On Thursday, August 14, Leonard Adleman, a theoretical computer scientist at MIT and one of the fathers of public-key cryptography, received a telephone call from Bruce Barns of the NSF. To his surprise, Barns told him that the NSF had decided not to fund part of his grant proposal. When asked why, Barns merely said something about an "interagency matter."
The following day, Adleman received another call, this one from Inman himself, who explained that the NSA wanted to fund his proposal. It was an unsettling experience, and Adleman wanted nothing to do with the Agency. "In the present climate I would not accept funds from the NSA," he later told Science. He said that he was concerned about the terms the Agency might extract, whether his funds would be cut off if the NSA insisted on classification and he refused, and whether he would be denied due process. He added, "It's a very frightening collusion between agencies."
The sudden intervention of the NSA apparently caught the NSF off guard. NSF director Atkinson had resigned only six weeks earlier, and the acting director, Donald Langenberg, had become deputy director just a few weeks before Atkinson departed.
Following the incident, both Inman and Langenberg met with White House science adviser Frank Press. It was decided that, at least for the time being, all proposals for cryptographic research would go first to the NSF and then to the NSA for technical review. Should the NSA find a proposal it wished to fund, it would so notify the NSF, which would offer the researcher a choice of accepting funds from either agency.
In reviewing the three and a half years of bureaucratic footwork engaged in by the NSA and the NSF, a congressional committee concluded that the history reflected "not that of two agencies at loggerheads, but of the mission-oriented NSA having sent the NSF a message in bureaucratic code that the latter is still struggling to decipher. The record leaves little doubt about NSA's intentions."
On September 15, 1981, Lieutenant General Lincoln D. Faurer, Inman's successor as DIRNSA, transmitted another message, this one to America's computer industry. Unlike the message to the NSF, however, this one went unencrypted.
Two months earlier, the NSA had unveiled its newest addition: the Computer Security Technical Evaluation Center. Its purpose is to analyze computer hardware and software produced by private industry and rate the products in terms of their potential vulnerability to penetration. Although submission was supposed to be strictly voluntary, Faurer, addressing a meeting of the IEEE, left little doubt that to ignore the Center would be to risk saying good-by to lucrative government contracts. "Frankly," the NSA chief warned, "our intention is to significantly reward those DOD suppliers who produce the computer security products that we need."
By using this carrot-and-stick approach, the NSA hoped to rapidly push ahead the development of secure computer systems by private industry and, at the same time, through the Center, encourage the industry to share its innovations with the NSA. Lack of such cooperation, CIA Deputy Director Inman said at the Center's opening, "might lead to a highly undesirable situation where private-sector users (e.g., banks, insurance companies) have higher integrity systems than the government."
Despite their warnings, if industry's reaction to the Center proves anything like the professional societies' reaction to the Public Cryptography Study Group, Inman and Faurer are in for a disappointment. As of the spring of 1982, the voluntary review system of the Study Group had been all but officially ignored by its member societies. Most have taken a position similar to that of the IEEE, which leaves the decision of whether or not to submit entirely to the individual scientist and makes no recommendation one way or the other.
Such lack of enthusiasm for the program may have prompted Admiral Inman's blast, in January 1982, that if the scientists did not agree to the voluntary review of their work by the intelligence agencies, they would face a "tidal wave" of public outrage that would lead to laws restricting the publication of scientific work that the government might consider "sensitive" on national security grounds.
As a result of what he called the "hemorrhaging of the country's technology," Inman warned a meeting of the American Association for the Advancement of Science that "the tides are moving, and moving fast, toward legislated solutions that in fact are likely to be much more restrictive, not less restrictive, than the voluntary" censorship system of the Study Group.
Thus far, however, Inman's "tidal wave" of public outrage has yet to dampen the soles of his shoes.
25 May 2012. Sources for the two documents listed at top.
Among Bamford's footnotes are these documents, which Cryptome would like to publish or, if they are already online, provide pointers to. If on paper, send via the mail or fax information on the Cryptome home page. If digitized, send to jya@pipeline.com.
National Security Agency, Influence of U.S. Cryptologic Organizations on the Digital Computer Industry, May 1977. (Declassified)
U.S. Senate, Select Committee on Intelligence, Unclassified Summary: Involvement of NSA in the Development of the Data Encryption Standard, 1978.