No Security in Obscurity
Judging cryptography with Kerckhoffs’s Principle, and what that means for security and open source
We have a lot of standards in security. Regulations like GDPR, COPPA, PCI-DSS, and HIPAA are so common that they make their way into the plots of TV shows and movies. But while we have a sea of acronyms to describe and measure the security of areas like identity, access control, and even threat intelligence, cryptography remains comparatively ill-defined.
This is not to say there are no standards for cryptography. NIST’s FIPS 140-2, the US federal government’s standard for cryptographic modules, is frequently used to judge the strength of a cryptographic implementation.
But there are many controversial aspects of FIPS (most notably known vulnerabilities in NIST-certified crypto modules and their potential use in government surveillance) that have made some corporations reluctant to embrace it. And as countries like China, which use non-Western cryptography, gain economic prominence, there is an increasingly dire need for an international standard for judging the security of a cryptographic system.
Surprisingly, the standard we’re looking for may lie in a 19th-century article on the Franco-Prussian War.
Military Grade Cryptography
Auguste Kerckhoffs was a 19th-century Dutch linguist who taught at the École des Hautes Études Commerciales in Paris.
In the 1870s, France fought and lost the Franco-Prussian War, a preview of the disruptive impact of technology on warfare that would later be horrifyingly realized in World War 1. Technologies like smokeless powder, the telegraph, and the railroad dashed aside traditional conventions of war and strategy that had remained unchanged for centuries.
One area heavily disrupted by this technology was military intelligence, in particular cryptography. Cryptography in the early 19th century was much the same as it had been in the late 16th: spymasters and intelligence officers crafted cryptograms using polyalphabetic ciphers to encrypt sensitive communications, which were ferried by rider to the front. Copies of keys were distributed to officers in the field.
The 19th century saw the introduction of two disruptive forces to this model of early wartime cryptography: mathematics and the telegraph.
Mathematicians like Charles Babbage introduced new computational techniques that they applied to analyze and break common military ciphers. The resulting discipline, cryptanalysis, ensured that nearly all of the world powers’ cryptography was vulnerable to espionage, without anyone having to kill and rob officers of their keys on the field of battle.
The telegraph dramatically reduced the cost and time necessary to send long distance communications. This was critical for maintaining strategy and prosecuting strategic warfare across a theater. But the telegraph also introduced new vectors for eavesdropping, as spies could infiltrate and monitor communications over the wire. Using cryptanalysis, mathematicians could decode these messages.
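The kind of attack these mathematicians pioneered can be sketched in a few lines of code. The example below is illustrative (the message and cipher are hypothetical, and the shift cipher shown is far simpler than the polyalphabetic systems of the period): simple letter-frequency analysis recovers the key from the ciphertext alone.

```python
from collections import Counter

def caesar_encrypt(plaintext: str, shift: int) -> str:
    # Shift each letter forward by `shift` positions in the alphabet.
    return "".join(
        chr((ord(c) - ord("A") + shift) % 26 + ord("A")) if c.isalpha() else c
        for c in plaintext.upper()
    )

def break_caesar(ciphertext: str) -> int:
    # Guess the shift by assuming the most frequent ciphertext letter
    # stands for 'E', the most common letter in English prose.
    letters = [c for c in ciphertext if c.isalpha()]
    most_common = Counter(letters).most_common(1)[0][0]
    return (ord(most_common) - ord("E")) % 26

message = "THE ENEMY WILL ATTACK THE EASTERN BRIDGE AT DAWN SEND REINFORCEMENTS"
ciphertext = caesar_encrypt(message, 7)
recovered_shift = break_caesar(ciphertext)
print(recovered_shift)  # prints 7: the key, recovered without ever stealing it
```

The same statistical idea, applied with more machinery, is what let analysts strip the protection from intercepted telegraph traffic at scale.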
To meet these challenges, new cryptography was necessary to protect sensitive communication. Mathematicians and linguists worked to perfect systems to keep secrets safe in plain sight, and there was a boom in late-19th century cryptography that attracted the attention of academics like Kerkhoff.
Kerckhoffs’s contribution to cryptography and military intelligence was an 1883 article in the Journal des Sciences Militaires titled La Cryptographie Militaire (Military Cryptography). Military Cryptography is a collection and analysis of common period military ciphers, reviewed in the wake of the Franco-Prussian War, and it remained one of the key manuals on cryptographic security until the end of World War 2, more than sixty years later.
Far outlasting the transposition and substitution ciphers Kerckhoffs presents in Military Cryptography are his principles for good cryptosystem design. According to Kerckhoffs, any good military cryptosystem must meet the following criteria to stand up to espionage and scrutiny in war:
1. The system should be, if not theoretically unbreakable, unbreakable in practice.
2. The design of the system should not require secrecy, and compromise of the system should not inconvenience the correspondents.
3. The key should be memorable without notes and should be easily changeable.
4. The cryptograms should be transmittable by telegraph.
5. The apparatus or documents should be portable and operable by a single person.
6. The system should be easy to use, requiring neither knowledge of a long list of rules nor mental strain.
The second criterion (the design of a system should not require secrecy, and compromise of the system should not inconvenience the correspondents) has become immortalized as Kerckhoffs’s Principle. It is a fundamental part of information security and military intelligence, having been translated over the years into various forms such as “the enemy will eventually know everything” and “there is no security in obscurity.”
Cryptography today is very different from cryptography in the late 19th century. But whether we’re talking about AES-256-GCM, SM2, Enigma, or the Vigenère cipher that Kerckhoffs presents in Military Cryptography, Kerckhoffs’s Principle remains an excellent way of analyzing the security of a crypto system.
Simply put: if an adversary had full access to the design of your system and it didn’t give them an advantage, you’ve got a pretty dope crypto system.
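The Vigenère cipher makes a tidy illustration of the principle. The algorithm below is completely public, and a message’s secrecy is meant to rest entirely on the key. A minimal sketch (the key and message are illustrative):

```python
def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    # Polyalphabetic substitution: each letter is shifted by the
    # alphabet position of the matching (repeating) key letter.
    key_shifts = [ord(k) - ord("A") for k in key.upper()]
    result, i = [], 0
    for c in text.upper():
        if c.isalpha():
            shift = key_shifts[i % len(key_shifts)]
            if decrypt:
                shift = -shift
            result.append(chr((ord(c) - ord("A") + shift) % 26 + ord("A")))
            i += 1
        else:
            result.append(c)  # leave spaces and punctuation untouched
    return "".join(result)

# The algorithm is fully public; only the key "LEMON" is secret.
ciphertext = vigenere("ATTACK AT DAWN", "LEMON")
print(ciphertext)                                    # LXFOPV EF RNHR
print(vigenere(ciphertext, "LEMON", decrypt=True))   # ATTACK AT DAWN
```

Of course, the Vigenère cipher famously fails Kerckhoffs’s first criterion: Babbage and Kasiski showed how to break it once the key length is deduced, which is exactly why public scrutiny of an algorithm matters.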
Ghosts in the Shell
Kerckhoffs’s Principle is important because, historically, teams and organizations have (accidentally or on purpose) introduced vulnerabilities that adversaries could exploit to break cryptography.
Some are the result of over-optimizing for performance or other non-security characteristics, such as Telegram’s vulnerabilities in MTProto. Others are the result of implementation issues within otherwise cryptographically sound algorithms. The OpenSSL crypto module integrates industry-leading cryptography but, due to the massive and monolithic nature of its stack, has contained a number of vulnerabilities that a sufficiently skilled adversary could exploit to spy on network communication.
Detecting and exploiting vulnerabilities in crypto systems requires a rare skill set. Cryptanalysis demands fairly deep knowledge of mathematics such as combinatorics and number theory, which keeps architectural attacks on mathematical flaws in popular cryptography rare. Detecting and exploiting side channel vulnerabilities (attacks on the implementation of otherwise sound algorithms that let adversaries “skip” the cryptography altogether) requires additional expertise in areas like computer architecture, networking, and operating systems.
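A classic member of the side channel family is a timing leak in tag verification: the math is sound, but a byte-by-byte comparison that returns early leaks, through its running time, how many leading bytes of a forged tag are correct, letting an attacker recover a valid tag without touching the underlying algorithm. A hedged sketch (the function names and tag value are illustrative):

```python
import hmac

def leaky_compare(expected: bytes, provided: bytes) -> bool:
    # Returns at the first mismatch: running time grows with the number
    # of leading bytes guessed correctly, creating a timing side channel.
    if len(expected) != len(provided):
        return False
    for x, y in zip(expected, provided):
        if x != y:
            return False
    return True

def safe_compare(expected: bytes, provided: bytes) -> bool:
    # Constant-time comparison from the standard library: running time
    # does not depend on where (or whether) the inputs differ.
    return hmac.compare_digest(expected, provided)

secret_tag = b"expected-mac-value"
assert not leaky_compare(secret_tag, b"expected-mac-wrong")
assert safe_compare(secret_tag, b"expected-mac-value")
```

Spotting that the first function is exploitable requires systems knowledge, not number theory, which is precisely why side channel review is a distinct specialty.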
The difficulty of finding vulnerabilities in cryptography ensures that researchers investigating crypto systems need to work together to validate a system’s security.
Free as in Beer
To maximize the security of a crypto suite, it is critical that researchers and organizations have transparent access to a crypto system’s design.
Given this, one’s knee-jerk reaction might be to call for all cryptography to be open source. This makes sense: there is no greater transparency than simply open sourcing the entirety of a crypto system’s internals and letting the community scrutinize its contents.
But what does that mean in practice? Open sourcing professional cryptography raises serious questions about which open source license to use.
Depending on the license selected, the author of that code (and anyone using it later) may be subject to potentially onerous requirements. Strong copyleft licenses like the GPL family may require you to open source everything that uses the cryptography. Activist licenses may bar certain use cases entirely; the GPU License, for example, states that code authored under it, and any later use of that code, cannot be used by any military.
Where cryptography begins and ends can be complicated. A physical crypto module may have clearly defined dimensions: the rotors of an Enigma machine are contained in a certain box, and the physical chip implementing Skipjack is contained within a certain sector of a PCB.
In software this is much more complicated. Software can mix and match different crypto modules to compose a single encryption or decryption workflow. Programming languages and frameworks are prime examples of this, and many common libraries and open source projects combine elements of popular open source cryptographic modules to present a single API for developers.
The choice of open source license is critical here and can have a dramatic impact on the use of your cryptography. If you use a copyleft-licensed cryptographic module in your code, you are then required to distribute all copies of your code under the provisions of that license. If you use an activist license, you must constrain your use of the cryptography to certain use cases or styles of monetization (or none at all). All of this is orthogonal to cryptographic security; it is solely a function of the legal requirements of your software license.
A rising convention in open source is to implement Kerckhoffs’s Principle via a combination of open source and external audit:
- Open source the crypto system’s design and crypto module: publish the actual crypto module containing the algorithm, and the design of that algorithm, under a non-copyleft license. This gives the broader open source and research community the opportunity to scrutinize the system’s architecture and math, hopefully reducing the possibility of a cryptanalytic or side channel vulnerability in the module itself.
- Subject the cryptography’s use to regular external audit and publish the findings: for business or personal reasons it may be challenging to open source the actual use of that cryptography in code. Retaining a skilled external auditor with access to your closed-source repositories is a popular way (and one required by many security regulations) to minimize side channel attacks in the implementation. Publishing those findings is critical to maintaining transparency, and it allows researchers and users to investigate findings further without constant access to the source code.
Auditors and contracted security researchers provide another unique benefit beyond discretion about IP when it comes to reviewing and publishing findings on closed source code: they are skilled enough to detect and help remediate vulnerabilities.
While broadly open sourcing all source code means the community may discover vulnerabilities, professional auditors and researchers will almost certainly discover issues, and they will likely be much more effective in helping you understand and remediate them.
It’s one thing to file a terse GitHub ticket. It’s another to sit on a thirty-minute Zoom call reviewing a potential software vulnerability with the authors of that code and working together toward a remediation. The latter is how you solve problems without introducing new ones.
Kerckhoffs was right. There is no security in obscurity. If you want to build a strong crypto system, that system needs to stand up to the scrutiny of a skilled adversary who, given enough time and resources, will have complete access to your system’s architecture.
But while this principle is just as applicable today for reviewing architectural security, the world has changed dramatically since the late 19th century. Applying Kerckhoffs’s Principle now means navigating complex licensing and economic issues related to open source.
Navigating those issues is critical to building strong privacy and security in software. And it’s par for the course in cryptography; after all, working in crypto is all about sweating the small stuff.