This is not a novel suspicion nor an exotic point of view within the actual[i] computer security community. Nevertheless I wouldn't have said anything, because honestly... who cares. Well meaning people keep recommending[ii] his stuff to me however, which is getting annoying. I would like a ready reference to be able to explain my position with a single link in the future, so here we go.
Consider his most recent article, Air Gaps.
Since I started working with Snowden's documents, I have been using a number of tools to try to stay secure from the NSA. The advice I shared included using Tor, preferring certain cryptography over others, and using public-domain encryption wherever possible.
The recommendation to still use Tor[iii] is already a declaration of affiliation. Which, whatever... everyone's free to choose a camp in this world.
In principle preferring certain types of cryptography over others is sound. The problem seems to be that Schneier prefers exactly the wrong types (ECC). I don't know that there's anything more suspicious in all crypto currently than ECC, and while suspicion and educated guesswork might not interest you, the fact still remains that ECC is, historically, the most frequently attacked by the NSA. Review the record.
Public domain encryption is probably the correct choice, but it should not in any case be regarded as ironclad. The "million eyes" theory is mostly a myth[iv], as well exemplified by the Bitcoin codebase. While perfectly open, and while offering clever folks a great financial incentive to find flaws, it nevertheless took two years before anyone read the database code it was relying on closely enough to understand how it was being initialised. And it took a massive fork that nearly killed the project back in March to get people even looking at that part of the code. Thus your notion that public domain software has already been reviewed by countless experts at no cost or expense to you may well be misplaced.
I also recommended using an air gap, which physically isolates a computer or local network of computers from the Internet. (The name comes from the literal gap of air between the computer and the Internet; the word predates wireless networks.)
This is a great recommendation. Air gaps are an essential building block of COMSEC - or at least of any COMSEC that actually works.
Air gaps might be conceptually simple, but they're hard to maintain in practice. The truth is that nobody wants a computer that never receives files from the Internet and never sends files out into the Internet. What they want is a computer that's not directly connected to the Internet, albeit with some secure way of moving files on and off.
This is also largely correct, minus the insistence on "files". You can have perfectly functional airgaps through the use of a QR reader gun on each end. I know at least some MPEx users employ this method to pass signed orders to their live connection and receipt responses back to their airgapped machine. It's a really neat solution that however involves no files - and in fact I would hold that filelessness as a strong point of the entire system - darned hard to hack a QR gun via a QR code.
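The fileless property is easy to picture. The sketch below is purely illustrative - the function names and the checksum scheme are invented for the example, actual MPEx orders being cleartext-signed GPG messages - but it shows the shape of the thing: an order travels as a short character stream a QR gun can scan, never touching a filesystem on either side:

```python
import base64
import hashlib
import json

def encode_order(order: dict, secret: bytes) -> str:
    # Serialise an order into a short text payload a QR gun can scan.
    # Nothing is written to disk: the payload exists only as a
    # character stream on its way across the gap.
    body = json.dumps(order, sort_keys=True).encode()
    # Illustrative integrity tag only -- NOT the real MPEx scheme,
    # which uses GPG signatures.
    tag = hashlib.sha256(secret + body).hexdigest()[:16]
    return base64.b64encode(body).decode() + "." + tag

def decode_order(payload: str, secret: bytes) -> dict:
    # Run on the other side of the gap; a bad scan or a tampered
    # payload fails the tag check and is discarded, never parsed further.
    blob, tag = payload.rsplit(".", 1)
    body = base64.b64decode(blob)
    if hashlib.sha256(secret + body).hexdigest()[:16] != tag:
        raise ValueError("tampered or corrupted payload")
    return json.loads(body)
```

A corrupted scan simply fails the check; there is no file for the receiving machine to open, parse or execute.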
The intervening considerations are also correct, sure, Osama used one, all that jazz.
And air gaps have been breached. Stuxnet was a US and Israeli military-grade piece of malware that attacked the Natanz nuclear plant in Iran. It successfully jumped the air gap and penetrated the Natanz network. Another piece of malware named agent.btz, probably Chinese in origin, successfully jumped the air gap protecting US military networks.
This, to the best of my knowledge, is factually incorrect. While it's possible Schneier knows something I don't, as far as I know Stuxnet did not cross any airgap. It was carelessly spread by poor security procedures. I suppose you can take the hardline that "if there's an airgap at any place in the design and stuff nevertheless gets across you may claim the airgap was breached, even if it was in fact the techs bypassing the airgap". This argument is perfectly solid in some contexts, such as in the post-mortem analysis of the defeated system. This argument is utterly facetious in a general discussion of airgaps, about akin to saying that "passwords offer no security because in at least one case an employee kept a post-it with his passwords on his monitor and so the workstation was trivially subverted". Obviously that subversion didn't involve password security, it was a social engineering attack through and through[v].
Agent.btz on the other hand definitely did not cross any airgaps, it simply benefited from the atrociously careless and irresponsible behaviours of contemporary US SIGINT people.[vi]
Back to reality : breaching an airgap is insanely difficult, and - much like RSA factorisation - an unsolved problem.
Since working with Snowden's NSA files, I have tried to maintain a single air-gapped computer. It turned out to be harder than I expected, and I have ten rules for anyone trying to do the same:
1. When you set up your computer, connect it to the Internet as little as possible.
No. To quote myself,
Any piece of hardware that was ever at any point in its life connected to the Internet can no longer be used as part of an airgapped system. Period.
And since we're on the topic, any girl that has been having as little sex as possible while taking great care for every round to last less than thirty seconds still isn't a virgin. Not no more. Funny how that works, and weird that I should have to point this out.
It's impossible to completely avoid connecting the computer to the Internet
This is, of course, false. Completely, patently, outrageously and mindnumbingly false.
For one, there existed computers before there was an Internet. They worked just fine - heck, in many cases they worked a lot better than the current Windows / Cloud crapola.
I purchased my computer off-the-shelf in a big box store
This is a good idea.
then went to a friend's network and downloaded everything I needed in a single session
This is a horribly bad, retarded idea. For one, the "friend" betrays gross unfamiliarity with the principled core of COMSEC. That is to say, if you use a third party, it's either an unrelated third party or else it's related for a very good reason that's part of the objectives. Thus, using your own computer, or using a random wireless connection, are the available options. Using the customer's connection, if you're working for one and they want you to, may also be fine. Using "a friend", ie a related entity that's related in a way that's neither related to your project nor justified by it, is exactly, but I say exactly, the hole through which the night comes in. Just go wan raiding already.
For the other, you really don't need to download anything on your airgapped computer. You can always download your favourite Linux CD distro, checksum it, burn it and live a happy life subsequently. And if even for a moment you believe yourself involved in COMSEC off Windows computers you really have to re-read footnote #5 above.
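"Checksum it" means comparing the image you downloaded against the digest the distro publishes, before burning. A minimal sketch (the filename and the published digest below are placeholders; use whatever your distro actually signs):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    # Hash the image in 1 MB chunks so arbitrarily large ISOs
    # don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Usage (hypothetical names): burn only if the digests match.
# assert sha256_of("distro.iso") == published_sha256_digest
```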
The ultra-paranoid way to do this is to buy two identical computers, configure one using the above method, upload the results to a cloud-based anti-virus checker, and transfer the results of that to the air gap machine using a one-way process.
This is so stupid a plan it beggars belief. For one, anything "cloud-based" is so antithetical to what we're doing here that I'm beginning to suspect a certain three letter agency has a deal with the various cloud operators to somehow diddle everything on the go. There's just no way to explain why anyone would be proposing this cargo cult voodoo otherwise. Be that as it may, do not, under any circumstances and for any reason, involve "the cloud" in your airgapping efforts. It does not belong there.
And for that matter, antiviruses are a myth. They roughly work in the same manner US budget ceilings work, their only security implication being that if you're in a space that uses antiviruses you know you're in a space that's never going to be secure.
2. Install the minimum software set you need to do your job, and disable all operating system services that you won't need. The less software you install, the less an attacker has available to exploit.
This is good advice.
I downloaded and installed OpenOffice, a PDF reader, a text editor, TrueCrypt, and BleachBit.
This is patent nonsense, antithetical to the aforementioned good advice. For one, OpenOffice is a dead project, and has been for a long long time. This means they're no longer keeping up with the world. For the other, a PDF reader is in no conceivable way needed on an airgapped machine, ever. Also, the PDF format is so fucking dangerous that it, together with Windows style WYSIWYG editors and their braindamaged attempts to become emacs through misimplementing macros and the like, creates most of the ownage cases that don't involve a browser. Get rid of them, you don't need bulleted lists and power point presentations on your airgapped machine. Airgap it from stupid too, since you're going to all this trouble.
(No, I don't have any inside knowledge about TrueCrypt, and there's a lot about it that makes me suspicious. But for Windows full-disk encryption it's that, Microsoft's BitLocker, or Symantec's PGPDisk -- and I am more worried about large US corporations being pressured by the NSA than I am about TrueCrypt.)
Windows full disk encryption does not work. At all. It's pure snake oil.
Linux full disk encryption does not really work, for that matter. If the sorts of problems that full disk encryption is intended to resolve actually are security considerations for you, the correct approach is something akin to keeping a half gallon bottle of oil of vitriol close by at all times and the hard drive exposed and within reach, because no "full disk encryption" will keep your data safe from an attacker with physical access to your full disk and enough time and gadgetry[vii] on their hands.
The correct solution for the vast majority of applications is, of course, to simply gpg whatever you want to keep secret and forget about it. A properly made passphrase will keep your attacker sucking air out of a can indefinitely.
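"Properly made" is a matter of arithmetic, not faith. A rough back-of-the-envelope, assuming a diceware-style passphrase of words drawn uniformly at random, and granting the attacker a deliberately generous guess rate:

```python
import math

WORDLIST = 7776  # standard diceware list size (6^5)

def passphrase_bits(words: int, wordlist: int = WORDLIST) -> float:
    # Entropy of a passphrase of `words` words, each drawn
    # uniformly at random from the wordlist.
    return words * math.log2(wordlist)

def years_to_exhaust(bits: float, guesses_per_sec: float = 1e12) -> float:
    # Time to try the entire keyspace at a (very generous)
    # trillion guesses per second.
    return 2 ** bits / guesses_per_sec / (3600 * 24 * 365)
```

Six random words come to roughly 77.5 bits; even at a trillion guesses per second, exhausting that space takes thousands of years. Hence the can of air.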
3. Once you have your computer configured, never directly connect it to the Internet again. Consider physically disabling the wireless capability, so it doesn't get turned on by accident.
For that matter, consider driving a nail through the wireless controller.
If you need to install new software
You don't. Ever. An airgapped system may never be upgraded no matter what happens, come hell or high water. It's salted and brined as is.
If you absolutely must have some spanking new feature, you're making a new airgapped machine. If making a new airgapped machine isn't worth the hassle you don't need whatever feature.
Turn off all autorun features. This should be standard practice for all the computers you own, but it's especially important for an air-gapped computer. Agent.btz used autorun to infect US military computers.
Right. So much derp & lol in here I don't even.
Minimize the amount of executable code you move onto the air-gapped computer. Text files are best. Microsoft Office files and PDFs are more dangerous, since they might have embedded macros. Turn off all macro capabilities you can on the air-gapped computer. Don't worry too much about patching your system; in general, the risk of the executable code is worse than the risk of not having your patches up to date. You're not on the Internet, after all.
Just get rid of the damned Microsoft stuff, and of the damned "office" crap.
Only use trusted media to move files on and off air-gapped computers. A USB stick you purchase from a store is safer than one given to you by someone you don't know -- or one you find in a parking lot.
This is true, if the file obsession is a little out of place. Seriously, the world is not made out of files, the world is made out of data streams.
For file transfer, a writable optical disk (CD or DVD) is safer than a USB stick. Malware can silently write data to a USB stick, but it can't spin the CD-R up to 1000 rpm without your noticing. This means that the malware can only write to the disk when you write to the disk. You can also verify how much data has been written to the CD by physically checking the back of it. If you've only written one file, but it looks like three-quarters of the CD was burned, you have a problem. Note: the first company to market a USB stick with a light that indicates a write operation -- not read or write; I've got one of those -- wins a prize.
This is pure voodoo, seriously. So it can only write when you write, what sort of defense is that ? You will write, won't you ? You've written "one file" and 3/4 of the CD appears burned to visual inspection ? What's this, an application of the "64kb files should be enough for everyone" principle in tandem with the "Holy Mary Full of Grace" approach to security/birth control ?
Visual inspection of a CD is about as likely to catch that three and a half kb piece of malware as TSA screening is to catch terrorists. Forget about nonsense ritual and do things that have an impact on actual security rather than your subjective impression of security. Stroking that later is how the former gets compromised. Fucking CD auguration, I can not believe this.
What if the CD is sorta greenish ? Is that bad ? Cause you know, green is for poison, maybe we should only use blue tinted CDs. Those are cool, right ?
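If you actually care what went onto the disc, compare checksums on both sides of the gap instead of squinting at the dye. A minimal sketch, assuming the disc is mounted somewhere readable: compute the manifest on the machine that burned it, compute it again on the airgapped machine, and compare:

```python
import hashlib
import os

def manifest(root: str) -> dict:
    # sha256 of every file under `root`, keyed by relative path.
    # Any file the malware added, removed or rewrote shows up as a
    # difference between the two manifests.
    out = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            out[os.path.relpath(path, root)] = digest
    return out
```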
When moving files on and off your air-gapped computer, use the absolute smallest storage device you can. And fill up the entire device with random files. If an air-gapped computer is compromised, the malware is going to try to sneak data off it using that media. While malware can easily hide stolen files from you, it can't break the laws of physics.
Nor can it rewrite anything. It is a point of fact that no malware ever rewrites anything. Like, ever. At all. It just starts new processes and creates new "files" for everything. Always. It's in the rules, when Schneier's God created Schneier's Enchanted Universe of Strangely Insecure Computer Security, he made this a rule of Schphysics.
Consider encrypting everything you move on and off the air-gapped computer. Sometimes you'll be moving public files and it won't matter, but sometimes you won't be, and it will. And if you're using optical media, those disks will be impossible to erase. Strong encryption solves these problems. And don't forget to encrypt the computer as well; whole-disk encryption is the best.
Whole disk encryption is retarded in all those implementations which require the system to have the keys in order to use that disk. Strong encryption of individual files is always a good idea. That strong encryption means one thing and one thing only : gpg --encrypt --armor -r yourkey
One thing I didn't do, although it's worth considering, is use a stateless operating system like Tails. You can configure Tails with a persistent volume to save your data, but no operating system changes are ever saved. Booting Tails from a read-only DVD -- you can keep your data on an encrypted USB stick -- is even more secure. Of course, this is not foolproof, but it greatly reduces the potential avenues for attack.
I have no idea about Tails. Maybe. It's in general a bad idea to make exotic systems your team doesn't understand well central to your security. Debian Sarge is, for this reason, a much better choice than whatever Tails.
Yes, all this is advice for the paranoid
Sadly, a large chunk of it is advice for the delusional. They do go together, delusional and paranoid, but not to any sort of useful effect.
And it's probably impossible to enforce for any network more complicated than a single computer with a single user. But if you're thinking about setting up an air-gapped computer, you already believe that some very powerful attackers are after you personally. If you're going to use an air gap, use it properly.
Security is never "probably impossible". Anything you wish to do is quite possible, just as long as you don't expect Windows, whole disk encryption and the cloud to do it for you. That aside, yes, if you're going to the trouble of doing anything, anything whatsoever, do it properly. That includes not relying on Windows etc.
Of course you can take things further. I have met people who have physically removed the camera, microphone, and wireless capability altogether. But that's too much paranoia for me right now.
I've not met any people who haven't. If you use a laptop for the purpose of being your cold computer, you definitely want to rape the cam and the mic (some models come with a 2nd mic hidden in the chassis btw). And drive a nail through the fucking wireless controller.
In closing, I would like to stress that none of this negates Schneier's past accomplishments or achievements. Applied Cryptography is still a great book. It's just that I don't trust him, not anymore.
———-
i. As distinguished from people who read and perhaps write blogs on the topic of computer security.
ii. Most recently, half an hour ago.
iii. From Dear Guardian : stop being retarded :
That contrary to planted disinformation of which the Guardian article is a fine example, the NSA has complete and unlimited, instantaneous access to any and all information passed through the TOR network in its entirety, as a matter of course and by design.
That article even includes an oblique reference to Schneier in footnote 3.
iv. Roughly a restatement of the well known Parkinson's law of triviality. While it's true that banal parts of the code garner a lot of posturing and discussion (chiefly from people desperate for a little bit of spotlight), it's equally true that the "obscure", the unsexy parts of the code are passed over by everyone, in the hollow faith that "someone else", "everyone else" will not be quite as "clever". Seriously, that's the thought process, "I will pass over this function here because reading it is unlikely to make me famous and everyone else has read it anyway. Because everyone else is not at all likely to think the same way, because I am an unique snowflake of brilliance with my own, personal and independent ideas, quite unlike everyone else."
v. And by the way, allow me to remind you : social engineering is the #1 threat you face. Which social engineering includes COMSEC disinformation planted by enemy agents. Jus' sayin'.
vi. Since we're on the topic : the French were pretty much isolated in the NATO SIGINT community of the Cold War days, because everyone suspected they'd leak whatever they were given. And so, as long as there were French officers in the room, nobody discussed any serious intelligence.
Let it be clearly stated that while on occasion they did have trouble keeping a lid on things, it was rare that they actually lost anything. By contrast, the Americans at the time were regarded as grossly incompetent, and with few and far between exceptions basically worthless as field agents. Nevertheless, they did have deep pockets, a propensity for cool gadgets and a solid capacity for keeping data safe. Consequently, they were accepted as equals by their more able European counterparts, those people who made it their daily business to confront the devilishly competent Russians and the even worse Romanians.
And yet here we are, the US leaking reams of data on a yearly basis. Derpies, you're worse than the French ever were by orders of magnitude! Wake up, do something, you're becoming a laughingstock. There are fucking African states with a much better grasp on things, what the hell are you doing!
vii. Like, you know, a can of cold air. Herp.