U.S. Government to Encrypt All Laptops

This is a good idea:

To address the issue of data leaks of the kind we’ve seen so often in the last year because of stolen or missing laptops, writes Saqib Ali, the Feds are planning to use Full Disk Encryption (FDE) on all Government-owned computers.

“On June 23, 2006 a Presidential Mandate was put in place requiring all agency laptops to fully encrypt data on the HDD. The U.S. Government is currently conducting the largest single side-by-side comparison and competition for the selection of a Full Disk Encryption product. The selected product will be deployed on Millions of computers in the U.S. federal government space. This implementation will end up being the largest single implementation ever, and all of the information regarding the competition is in the public domain. The evaluation will come to an end in 90 days. You can view all the vendors competing and list of requirements.”

Certainly, encrypting everything is overkill, but it’s much easier than figuring out what to encrypt and what not to. And I really like that there is an open competition to choose which encryption program to use. It’s certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. I’ve long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they’ll do better next time.

Side note: Key escrow is a requirement, something that makes sense in a government or corporate application:

Capable of secure escrow and recovery of the symetric [sic] encryption key
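The escrow requirement is conceptually simple: the disk’s symmetric key is never stored bare, but wrapped under both a user secret and a recovery secret. Here is a toy Python sketch of the idea; XOR stands in for real authenticated key wrapping, and all names and passphrases are made up:

```python
import hashlib
import os
import secrets

def kek(passphrase: str, salt: bytes) -> bytes:
    # Derive a 32-byte key-encryption key from a passphrase.
    # Toy KDF: a real product would use PBKDF2 or scrypt.
    return hashlib.sha256(salt + passphrase.encode()).digest()

def wrap(key: bytes, kek_bytes: bytes) -> bytes:
    # XOR "wrapping" for illustration only -- NOT real key wrapping.
    return bytes(a ^ b for a, b in zip(key, kek_bytes))

unwrap = wrap  # XOR is its own inverse

salt = os.urandom(16)
disk_key = secrets.token_bytes(32)  # the symmetric FDE key

# The same disk key is escrowed under two independent secrets:
user_blob = wrap(disk_key, kek("user passphrase", salt))
escrow_blob = wrap(disk_key, kek("escrow master secret", salt))

# Normal boot: the user recovers the disk key...
assert unwrap(user_blob, kek("user passphrase", salt)) == disk_key
# ...and the escrow authority can recover it without the user.
assert unwrap(escrow_blob, kek("escrow master secret", salt)) == disk_key
```

In a real deployment the recovery secret would live on a hardened escrow server, not anywhere near the laptop, and the wrapping would be authenticated encryption rather than XOR.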

I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.

Posted on January 3, 2007 at 2:00 PM • 74 Comments

Comments

McGavin January 3, 2007 2:10 PM

“I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.”

Probably for the former question.

Probably not the full story for the latter question. Not to protect the vendors, but to protect the techniques used to find the problems.

Israel Torres January 3, 2007 2:41 PM

What good is the encryption if the keys are most likely on a USB key in the same laptop bag the laptop is in? Sure, add factors, but anything that can be written down more than likely will be written down and kept in the same laptop bag as the encrypted drive, along with the tokens necessary to decrypt it.

Time to patent a locking handcuff-to-laptop interface, because those days are not far away once everyone realizes people are the flaw.

Israel Torres

Ritchie January 3, 2007 3:03 PM

Apple provides FileVault (account directory encrypted with AES-128) under OS X. Seems pretty good!

On top of that, it also provides secure deletion (1, 7 or 35 passes) and encrypted disk image copies (also AES-128).

All of this at no extra cost and very simple to use.

Hopefully, Apple will be on the short list!

Cheers,

Ritchie

REDACTED January 3, 2007 3:20 PM

@ Ritchie

Since FileVault only supports one of the twelve listed operating systems, it is unlikely to even get on the list. In any event, it’s not full disk.

Another reason to go full disk rather than per-file or per-directory is the annoying habit of some OSes (cough) to make “temporary” copies of files in unexpected places.

Now, “Does not modify the GINA.dll”, that’s going to be a tough requirement to meet.

Bryan Sowell January 3, 2007 3:27 PM

I wonder why PGP Corp or Entrust didn’t show up on the list with their product lines? Entrust’s FDE product is based on PointSec, which does show up, but you’d think at least PGP would have been on there.

Cerebus January 3, 2007 3:40 PM

“I wonder if the NSA is involved in the evaluation at all, and if its analysis will be made public.”

No. This solution is only for unclassified systems, so no suite-A encryption involved. FIPS 140-2 level 1 is a requirement, but NIST and the Cryptval labs do that stuff.

Petréa Mitchell January 3, 2007 3:41 PM

So, what do you want to bet that the first time a laptop gets stolen after all this is implemented, it turns out to have had the FDE password/token/whatever attached?

Cerebus January 3, 2007 3:42 PM

“What good is the encryption if the keys are most likely to be on a usb key in the same laptop bag the laptop is in?”

The RFP requires using the DoD Common Access Card (a smartcard with DoD PKI certificates on it) to protect the bulk encryption keys.

Cerebus January 3, 2007 3:45 PM

“Now, “Does not modify the GINA.dll”, that’s going to be a tough requirement to meet.”

Nope. Most FDE solutions use a pre-boot mini-OS to handle authentication and a FS shim to handle encryption/decryption on the fly. There’s no need to modify GINA unless you want the pre-boot authentication to also authenticate to Windows (single sign-on), which is not a requirement and not desired.

Brian January 3, 2007 4:25 PM

Full disk encryption can be easily circumvented by “over the shoulder” techniques, i.e., key logging and other methods.

But hey, at least the data will be somewhat safe when not in use.

Israel Torres January 3, 2007 4:35 PM

@Cerebus
“The RFP requires using the DoD Common Access Card (a smartcard with DoD PKI certificates on it) to protect the bulk encryption keys.”

I’ve seen plenty of CACs in laptop bags with their PINs taped on them… how was this secure again? The idea of n-factor authentication being successful is that the attacker does not have access to all the factors at the same time. Since PKI is solely people-policy-based, all it takes is for people not to follow policy and you have an instant breach of protocol; therein lies the vulnerability. Once the drive has been decrypted and copied, no one will be the wiser but only feel that oh-so-trademarked sense of security… and pass the buck when it all hits the fan.

Israel Torres

Cerebus January 3, 2007 4:35 PM

“Full disk encryption can be easily circumvented by “over the shoulder” techniques. IE, key logging and other methods.”

Not when a smartcard is used for pre-boot authentication. You require both the PIN and the card. Since the DoD CAC is also the DoD identity card, users will notice its absence.

Israel Torres January 3, 2007 4:49 PM

@Cerebus
“Since the DoD CAC is also the DoD identity card, users will notice its absence.”

Realistically, anyone with a smartcard, mspaint and a smartcard printer can easily whip up something that convincingly resembles the real card, at least until it is used, which buys the attacker more time (maybe just enough) before further protocols are initiated. This allows the attacker to switch cards, use the real one with the known PIN, and switch them back without ever being noticed and/or detected.

Israel Torres

Kevin Davidson January 3, 2007 4:59 PM

Credant Technologies has been awarded one of the contracts to encrypt 12,000 laptops according to Government Computer News:

http://www.gcn.com/online/vol1_no1/42857-1.html?topic=security&CMP=OTC-RSS

“Our focus is on encrypting data as it moves on and off devices,” he said. “The software extends encryption policies to personal digital assistants, USB drives and external hard drives. The software ensures these components have security on them, and the data is encrypted before they are allowed to sync [with the server].”

Paul Mendoza January 3, 2007 5:02 PM

File Phantom (http://www.filephantom.com) is a pretty good piece of software for doing encryption of just specific files. Encrypting everything is just way too much overkill. Besides, if every company goes with the same encryption provider, it’s not going to take long before someone comes up with a crack for it.

Matt from CT January 3, 2007 5:09 PM

Moving to encrypting the hard drives is a reasonable step to prevent UNINTENTIONAL data loss.

If someone intends to steal it…any system can be compromised if you have the time and resources to do so.

What this prevents is having to say “There might have been a data loss” when equipment is stolen by street criminals or simply misplaced, to saying “No, the hardware was lost, but since the keys weren’t compromised, no data was lost.”

Dan Esparza January 3, 2007 5:21 PM

The NSA does appear to be involved, as well as all major branches of the military. (Look at the spreadsheet you linked to, and there appears to be some very interesting contact information in it under the ‘Government’ tab)

j January 3, 2007 5:37 PM

“No, the hardware was lost, but since the keys weren’t compromised, no data was lost.” :

“was lost” –> “is expected to have been lost”

Josua January 3, 2007 5:51 PM

Israel: Sure, and if the possessor leaves StateSecrets.doc open on his screen at the coffee shop when he goes for a piss, anybody who walks by can read it. That’s obvious. Nobody expects Perfect Security; certainly nobody who reads this blog with any regularity should.

The point is that requiring FDE for everybody reduces the possibilities for compromising security. Even if only 10% of users are smart enough not to keep their CAC in the same place as their laptop (with their PIN taped on it), which I’m quite sure is well below the real number in the field, you’re still better off than not using encryption at all.

Every system fails if you move the goalposts far enough back. Security nihilism is not the only alternative to security theater.

Ben January 3, 2007 5:57 PM

Will they be etching the password onto the side of the laptop’s display in-house, or subcontracting?

Sceptic January 3, 2007 6:55 PM

Several commentators have noted the problem that users tend to keep crypto passwords and dongles close to the laptop for convenience. I agree this is a very real problem.
Perhaps a solution would be to adopt a crypto solution that requires a hardware token/fob of some sort and mandate the following rule:

If the laptop is lost, the user MUST return the token/fob device when reporting the loss or explain in an interview why both items are gone.

After making an example of the first few victims who lose both devices, the laptop users might start to take this problem a bit more seriously.

@Ritchie

“On top of that, it also provides secure deletion (1, 7 or 35 passes)”

Every so often, I come across a warning that just wiping a disk once isn’t good enough to ensure that the data cannot be recovered. For older disks (let’s say before year 2000) I accept this but can anybody tell me more about recovering files on overwritten disks using modern hardware? Just curious.

Stefan Wagner January 3, 2007 9:37 PM

@sceptic: I tend to agree.
If data recovery after 1, 6, or 34 overwrites is possible, why doesn’t the manufacturer sell his 100 GB drive, with a special driver, as a 34×100 GB device?

Wouldn’t it be sufficient to write random patterns 35 times to the whole disk before the first use?
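The multi-pass overwrite itself is trivial to sketch; the hard part is everything the filesystem and drive firmware do behind your back. A minimal Python illustration, operating at the file level only (which real wiping tools cannot rely on):

```python
import os

def wipe(path: str, passes: int = 3) -> None:
    """Overwrite a file in place with random data, `passes` times.
    Sketch only: a real tool must also defeat filesystem journaling,
    wear leveling, and remapped sectors, none of which this can reach."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push this pass out to the device

# usage: create a file holding a secret, then scrub it
with open("secret.txt", "wb") as f:
    f.write(b"the disk key was here")
wipe("secret.txt", passes=3)
```

Note that on a journaling filesystem or an SSD, the old bytes may survive elsewhere no matter how many passes you make over the file.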

Gopi Flaherty January 3, 2007 10:48 PM

@Stefan:
If data recovery after 1, 6, or 34 overwrites is possible, why doesn’t the manufacturer sell his 100 GB drive with a special driver as a 34×100 GB device?

The paper I read a while ago dealt with recovering data that had been sitting on the drive for a long time. The bits were written in narrow tracks; over time, they got wider. Overwriting lots and lots of times would only get you new narrow bits. If you had a very high-res probe, you could recover the older bits.

One way to deal with this problem would be to have a one-time pad on a per-sector basis. Divide the disk up into a few sections, say 10. 10 different pads. When the disk was idle, create a new pad for one of the sections and rewrite its blocks with the new pad.

Even if your data stayed the same, it would be constantly over-written with different data.

Another strategy would be to have a physical:logical mapping that changed constantly, moving sectors around. Note that this scheme wouldn’t guarantee that the bits would always be changing: if you had a very empty disk, with lots of sectors of zeros, you might find that, say, alternating between an all-zero block and a block with the bank account number you’re hiding would result in enough long-term spread that it could be recovered. Again, only an issue IMHO if you had a nearly empty disk. Writing random data into empty blocks might help.
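The first idea, refreshing a per-sector pad so the physical bits keep changing while the logical data stays put, can be sketched like this (toy 16-byte sectors, in-memory only):

```python
import os

SECTOR = 16  # toy sector size

class PaddedSector:
    """Store a sector XORed with a one-time pad; refreshing the pad
    rewrites the physical bytes without changing the logical data."""

    def __init__(self, data: bytes):
        self.pad = os.urandom(SECTOR)
        self.stored = bytes(a ^ b for a, b in zip(data, self.pad))

    def read(self) -> bytes:
        return bytes(a ^ b for a, b in zip(self.stored, self.pad))

    def refresh(self) -> None:
        data = self.read()
        self.pad = os.urandom(SECTOR)
        self.stored = bytes(a ^ b for a, b in zip(data, self.pad))

s = PaddedSector(b"account 12345678")
before = s.stored
s.refresh()                   # the physical bytes change...
assert s.stored != before     # (with overwhelming probability)
assert s.read() == b"account 12345678"  # ...the logical data does not
```

On a real drive the pads themselves would have to be stored somewhere, so this only relocates the remanence problem; it doesn’t eliminate it.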

This reminds me of a computer a friend of mine had. The floppy drive was supposed to be double sided – you could write to both sides without flipping the disk. In fact, he described his as “four sided.” One of the heads was mis-aligned enough that, if you flipped the disk, you could write two more sides of data. Presumably, the tracks were narrow enough that this didn’t cause problems. I don’t know how he fared with pre-written disks, or what the error rate was…

Jungsonn January 3, 2007 11:13 PM

I don’t get it. Really.

What is wrong with AES 128/256 bit ?
And why are they consulting Microsoft on this? Are they insane? Yeah, I doubt they lobbied with the NSA on this one.

quincunx January 4, 2007 12:00 AM

Bruce,

No government can run like a competitive business as long as it has the non-businesslike abilities to a) conscript money from its subjects, b) dilute its value by issuing credit and paper tickets, and c) borrow with the proceeds of (a) as the security.

The fact that you praise what is corporate welfare is quite revealing.

“And I really like that there is a open competition to choose which encryption program to use.”

An open competition, for one winner?

I suspect it is more likely that decisions will be made by political bickering, campaign contributions, and kickbacks rather than by real competitive merit. How could it be otherwise?

“It’s certainly a high-stakes competition among the vendors, but one that is likely to improve the security of all products. ”

No, this view is incorrect. The incentives fostered here are to make improvements in an effort to get a profit-guaranteed stream from the government; once the effort fails for many, there is no improvement elsewhere. Government, being the monopsonist that it is, cannot bargain like other firms; otherwise it could drive the profit down to zero. Because that would discourage the business, the government must GUARANTEE a profit that is basically suitable to encourage a certain amount of competition, competing against all other industries in other fields. This is primarily how the ‘public’ utilities work. Those suckers make 9-12% profits, no matter what! How sweet it must be to not compete.

The government will set that rate way above what real markets would set, and can decide how many bidders there will be in the first place. Not all improvements will pay off; alternatives cannot be tried because only one gets the gig, and the others have no extra outlets for their work.

Thus, just like a patent, innovation is stimulated before the patent is granted, and then reduced afterwards. After all, your monopoly lasts 20 years. Hell, why not not use it at all, and then sue some bastards when they do.

Do you really believe that a 90-day competition will suddenly increase investment into some sort of R&D, or something? Inventors will pull magic formulas out of their ass, that they never possessed before?

“I’ve long said that one of the best things the government can do to improve computer security is to use its vast purchasing power to pressure vendors to improve their security. ”

Whoa, hold off there Bruce, now you are acting crazy. This is exactly the opposite of what public choice economics tells us. Guaranteed income to select corporations does not result in efficiency; rather it results in the exact opposite: waste and inefficiency. No effort at improving is made once you get the government teat to fill your coffers. Why bother, really?

Maybe one should look to see what happens in other industries before jumping the gun.

If you want to increase inefficient output, subsidize it.

If you want to decrease efficient output, tax it.

“I would expect the winner to make a lot of sales outside of the contract, and for the losers to correct their deficiencies so they’ll do better next time.”

Bruce, there are already many ‘winners’, that can be calculated by the profits they are making. Just because they are not picked by the biggest gang in town to be endowed with what is essentially corporate welfare, doesn’t make them worse, in a business or security sense.

BeauKey January 4, 2007 12:09 AM

The TPM is a ‘desired’ requirement. Is the TPM not mature yet? Vista (Bitlocker) can use the TPM (including key recovery).

annoyed January 4, 2007 12:17 AM

This is one more step into a swampy cesspool which the Office of Management and Budget dug when it overreacted to the VA laptop incident by issuing the infamous OMB 06-16 memo:

http://www.whitehouse.gov/omb/memoranda/fy2006/m06-16.pdf

This declared an open season of vast government waste spending on redundant encryption purchases on tight deadlines to protect a class of data that had not been clearly defined, while simultaneously enshrining a nonsense phrase–“personally identifiable information”, a.k.a. PII–as the code word for government stupidity in FY 2007.

Just think about that phrase for a minute. It means “information that is personally identifiable”. It’s gibberish. What they perhaps meant was “personally identifying information”, and it’s not clear how this is distinct in their tiny little minds from good old “personal information”. But now thousands of tie-wearing, headless chickens are running around the halls of government crying PII, PII, without any real thought given to what the real objective is–presumably, that government programs not contribute to identity fraud through mismanagement of collected data.

The result is a room at every agency filled with amateur philosophers, all incompetently and futilely trying to suss out the nature of OMB’s fabulous beast, the dreaded PII. Because of the diverse sophomoric output of these committees, definitions of PII now in practice usually include things as aggressively moronic as publicly available names and telephone numbers, and thus we end up with these sorts of expensive mandates to encrypt all mobile devices, instead of a more sensible mandate to simply stop putting the so-called PII on mobile devices in the first place.

Meanwhile, the idiots at the top utterly fail to address the continuing real loss of personal information through myriad avenues such as improperly secured paper files, telework conducted from insecure home PCs, failure to encrypt emails between HR personnel, failure to sanitize media on excess equipment, failure to properly shred documents before disposal, etc, etc, etc.

It’s absolutely, utterly, infuriating. This is the classic exemplar of a poor security tradeoff, and I hope Bruce reconsiders his endorsement.

supersnail January 4, 2007 2:22 AM

Sorry guys, but everything about this seems a sensible response to a real security problem.

Problem: lots of confidential data on portable devices which are easy to steal; once you have possession of the hard disk you can read the data.

Solution: encrypt all the data.
Furthermore, put the tender out to open competition and choose the product that best suits your requirements.

I don’t see how anyone could gripe about this, and I think this should be a model for any organisation that routinely holds confidential data on portable computers (i.e. 90% of the companies listed on the NYSE and NASDAQ!).

I hope this scheme is widely adopted, because if it becomes “standard practice within the industry” it gives you an opening to sue any institution that doesn’t encrypt data and then lets it loose due to poor security practices.

Paul Larson January 4, 2007 3:02 AM

Sorry to be cynical but…

I wonder which “Bush Pioneer” will get the contract for their totally inadequate and overpriced encryption system!

annoyed January 4, 2007 3:15 AM

@supersnail
“Problem: lots of confidential data on portable devices which are easy to steal; once you have possession of the hard disk you can read the data.
“Solution: encrypt all the data.”

Uh, no. The only problem that solves, if everything goes right, is theft of the device. What about viruses/trojans/back doors? What difference does full-disk encryption make when the system containing it is part of a botnet?

And you didn’t state the problem correctly.

Problem: lots of personal information defined by who knows what criteria, collected by who knows what programs, stored on who knows what systems, protected by who knows what measures, backed up on who knows what media, printed on who knows what paper, shared with who knows what contracting companies, remotely accessed by who knows what teleworkers, disposed of by who knows what measures.

Guess what? The DOD has had an information classification system for many years. All of this nonsense is a poorly disguised, ill-conceived, crude, and garishly inept attempt to define a similar classification system (“PII”) for the civilian sector, with the following obvious flaws:
– No definition of PII.
– No uniform guidelines for defining PII across government agencies.
– No protection for PII stored on backup tapes, paper files, compromised computer systems (other than stolen laptops), etc.
– No safeguards for transmission of PII over networks.
– No requirements for labeling PII to indicate its status as such.

Full-disk encryption of laptops doesn’t even strike the real problem obliquely. It’s not even wrong. It shouldn’t be a model for anything other than a Three Stooges short, because it’s nothing more than pure security theatre.

One correct solution would be to define another classification in the DOD classification system, and require all government agencies to adopt that classification and all appropriate practices to implement it. This might include physically separate networks for transmission of PII, clear labeling of all media containing PII, rules governing disclosure of PII, a method for declassifying PII, etc.

Of course, one would have to actually think about the problem to solve it. OMB doesn’t trouble themselves with that whole thinking thing–it’s just too hard. Much easier to just say we’re encrypting all laptops, isn’t it?

HAL January 4, 2007 3:32 AM

First, not really, other than to coordinate/attend meetings. It is ‘suite B’ crypto. Second, yes, as it will show up on all sorts of things, but the eval will not be released, in order to prevent damage to a vendor if findings are negative. The vendor will be provided a copy to fix/address issues, if desired.

greg January 4, 2007 5:44 AM

Remember, it’s a response to a particular type of theft. The thief just wants the laptop. But he’s a thief, so you can’t trust ’em with the data. Encryption is a good idea.

Please take the tin foil hats off guys.

The real irony is that companies are allowed to sell this type of data to other companies anyway. So why all the fuss about security when the law is the problem?

Andy B January 4, 2007 6:28 AM

Seems a good idea to me. In the UK we’ve a news story, oh, every few months about somebody’s laptop with personal/financial/defence details being nicked. An encryption policy as described, while perhaps not stopping an actual attack, will stop a typical criminal with a brick and a keen eye for the contents of people’s cars. And by applying it universally, it’ll cost less than trying to select what needs to be encrypted.

And you’ll still have your ‘Classified’ system for the things you really need to keep secret.

It’s just raising the bar: stopping casual thieves, who’re much more common than key-logging, trojan-wielding uber-hackers.

Elliott January 4, 2007 7:20 AM

Israel Torres wrote:
“Time to patent a locking handcuff to laptop interface…”

Well, I would not want to use one because motivated criminals are more than willing to cut hands off.

Elliott January 4, 2007 7:25 AM

@Stefan: Presumably the electron microscope needed to recover overwritten data would be too slow and too expensive to bundle it with the hard disk. Also, the head movement is not precise enough to reliably NOT overwrite old data.

Elliott January 4, 2007 7:52 AM

@quincunx: “Do you really believe that a 90-day competition will suddenly increase investment into some sort of R&D, or something? … No effort at improving is made once you get the government teat to fill your coffers.”

I agree that competition would be over before it even began if only one vendor is selected for a long time to come. The alternative is, of course, an infinite loop:

scope = select_initial_scope();
criteria = define_initial_criteria(scope);
for (;;) {
    product = select_product(criteria);
    buy(product, scope);
    rollout(product, scope);
    scope = select_new_scope(scope, criteria);
    criteria = improve_criteria_for_scope(scope, criteria);
}

Would it make our data more secure, for less money? I am inclined to think so, given that all selection functions are properly implemented. Especially the selection of scope has a strong impact on security and cost, because the criteria should depend on the scope, and the people working within that scope should not need training for new software too often.

Elliott January 4, 2007 8:13 AM

@Gopi Flaherty: “Another strategy would be to have a physical:logical mapping that changed all the time, moving sectors around all the time.”

That would make selective zeroization much harder, if not impossible.

Say you encrypt a disk with symmetric key s, then encrypt s with n password hashes h1..hn and store the resulting enc(s,h1)..enc(s,hn) on disk. This scheme allows n users to access the disk, each with their own password. Since no user knows s or the passwords of others, you can easily revoke access for any user by zeroizing her copy, enc(s,hx).

If the disk moves logical sectors around, you can zeroize one copy of enc(s,hx), but not all copies, because you don’t know where other, not yet overwritten copies are stored.

Modern drives have a number of spare sectors to remap bad sectors, so we already have this problem, but we can deal with k spare sectors by splitting enc(s,hx) into k+1 (or more) blocks, where k blocks are filled with random data r1,r2,…,rk and one contains enc(s,hx) xor r1 xor r2 xor … xor rk. Now, even if k of the k+1 sectors get remapped, at least one is not, so overwriting all k+1 sectors guarantees that for at least one of the blocks there is no unoverwritten copy left. Unless, of course, someone got a copy before it was overwritten…
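The k+1 split is easy to demonstrate: k random blocks plus one block chosen so that all of them XOR back to the wrapped key, so destroying any single surviving block destroys the secret. A small Python sketch (toy sizes):

```python
import os
import secrets

def split(blob: bytes, k: int) -> list[bytes]:
    # Split `blob` into k+1 shares: k random blocks plus one block
    # chosen so that all k+1 XOR back to `blob`. Every share must
    # survive unoverwritten for the secret to be recoverable.
    shares = [secrets.token_bytes(len(blob)) for _ in range(k)]
    last = blob
    for r in shares:
        last = bytes(a ^ b for a, b in zip(last, r))
    return shares + [last]

def join(shares: list[bytes]) -> bytes:
    # XOR all shares together to recover the original blob.
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

wrapped_key = os.urandom(32)       # stand-in for enc(s, hx)
shares = split(wrapped_key, k=4)   # tolerate up to 4 remapped sectors
assert join(shares) == wrapped_key

# Zeroizing ANY single share destroys the secret (w/ overwhelming prob.):
shares[2] = bytes(32)
assert join(shares) != wrapped_key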

Cerebus January 4, 2007 8:52 AM

@Israel: I cannot prevent people who intentionally aim at their extremities from shooting themselves in the foot. So the idiot who writes his 6-digit PIN on the card which he uses every damn day yet can’t seem to remember and keeps the card in the bag despite explicit training to the contrary simply can’t be stopped. But I can stop compromise from most people who are far smarter than that. Would you rather nothing was done? That seems to be the only course that would make you happy.

No solution is 100%. But for the threat being addressed (which is, as some above seem to understand but others do not, the threat to FOUO data through casual system theft, not the threat to FOUO data from targeted compromise) FDE is the best way to go. When coupled with off-system strong two-factor authentication, it’s the best we’re going to get given the systems we have available.

In re: your convoluted and artificial scenario of stealing my card and temporarily replacing it with a look-alike (but, I might stress, not a function-alike), I challenge you to carry that attack off. Let me know when you succeed. You’ll pardon me if I don’t wait around, mmm’kay?

@whoever was asking about TPM: BitLocker does secure boot (which is good), but the BitLocker pre-boot OS doesn’t do smartcards (which is bad). In addition, BitLocker provides no way to recover systems without domain admin involvement (which is a law enforcement requirement), and BitLocker boot PINs are shared among all users of the device, so there’s no user attribution at boot time (which is a forensic requirement). Some flaws there. In addition, TPM pre 1.2 isn’t worth it, and most gov’t systems are old enough that they don’t have a TPM chip at all–and the product selected has to be deployed now, not in 3 years when everyone has a TPM 1.2 chip. Rest assured that the next competition in a few years will have more stringent requirements.

Cerebus January 4, 2007 9:02 AM

@annoyed:

“- No definition of PII.”

Incorrect. However, as a new definition, it takes a little time to settle down.

“- No uniform guidelines for defining PII across government agencies.”

Different agencies typically deal with different data and will need different rules. Insofar as there are common data (addresses, SSNs, etc.) there are already rules established through vehicles like the Privacy Act and HIPAA.

“- No protection for PII stored on backup tapes, paper files, compromised computer systems (other than stolen laptops), etc.”

Umm, WTF? How do you protect information on a compromised system? In re: paper, gov’t has rules on protecting that data (Privacy Act requirements). In re: backups and other electronic records, these things are being addressed, but only so much can be done at once.

This isn’t to say the rules are any good or that there’s a good track record, just that there are rules.

“- No safeguards for transmission of PII over networks.”

Incorrect. Data has to be protected while in motion, and there are rules and standards for this. The RFP being discussed is for data at rest, which is a newly recognized requirement.

“- No requirements for labeling PII to indicate its status as such.”

Privacy Act again. What, you’ve never seen a Privacy Act cover sheet?

bob January 4, 2007 9:29 AM

Basically a good idea. But when you create 10 million laptops with the same encryption protocol you are painting a BIG target on that protocol.

Elliott January 4, 2007 9:54 AM

@bob: Yes, monoculture makes big targets.
But unencrypted notebook drives make even bigger targets.

This is an opportunity to get a lot of security for very little money, with no damage to privacy and civil liberties. A better deal than most these days.

Ian Ringrose January 4, 2007 10:26 AM

There is one easy way to make people keep the smart card with the PIN safe. Just require that the same card + PIN can be used to take cash out of an ATM that is then deducted from their next pay packet!

Valdis Kletnieks January 4, 2007 11:10 AM

Regarding the recovery of overwritten data:

Nobody outside the spook world can do it, at least not in anything resembling an economically feasible way.

Proof: Assume the contrary. Go look at all the disk-recovery companies that can recover deleted material off drives that have been through floods, fires, and other disasters. We’re talking here companies that will charge you several hundred/thousand dollars just to look at the drive.

And yet, not one single company claims to be able to recover from even a “single-overwrite” condition. Even though there’s a big market for it. (Note that nobody outside the commercial arena, such as academia or non-spook government, is doing it either – there’d still be a big market for that ability there, in the form of “published papers”).

Conclusion: The fact that nobody’s trying to make money off doing it is proof that it’s not feasibly doable.

Counter-proof: Feel free to cite a single verifiable example of somebody who has recovered data off a drive made in the last decade after a single-overwrite.

Israel Torres January 4, 2007 11:21 AM

@Josua
“The point is that requiring FDE for everybody reduces the possibilities for compromising security.”

I am sure everyone here would agree that something is better than nothing, with the following understanding:

@Cerebus
“No solution is 100%.”

Exactly, and advocating otherwise only demonstrates ignorance of the real world.

@Cerebus
“Let me know when you succeed.”

The whole idea is that you as a target aren’t supposed to know.

Israel Torres

annoyed January 4, 2007 11:58 AM

@Cerebus
“‘- No definition of PII.'”
“Incorrect. However, as a new definition, it takes a little time to settle down.”

Kindly point us to the useful definition of PII. The Privacy Act is tangentially germane, but when it gets down to brass tacks, agencies are having to define PII for themselves, and they’re doing a lousy job. You’re obviously not having to deal with dozens of reports of a stolen cell phone or PDA that “might have had some home phone numbers on it”. The failure to define PII in any rational way is wasting incident response resources in a profligate way, all the way up to US-CERT. Call US-CERT and ask them what proportion of their report-handling time is now spent on stolen mobile devices.

“How do you protect information on a compromised system?”

First of all, a stolen laptop is a compromised system. And you miss the point, which is that if we are defining a class of sensitive data, it needs to be tracked and protected everywhere it goes, not just on laptops. The only reason a blanket purchase is under discussion here–Bruce himself says it–is because it’s easier than determining which laptops need protection. That’s not a crypto problem; it’s a management problem. No one knows what or where the PII is, so purchasing FDE for laptops is an expensive way to sidestep the real problem.

Imagine if DOD were putting plaintext Secret data on a few laptops, and no one knew which ones. Would it make sense for them to buy FDE for every laptop they own, when only a few had the data? Wouldn’t the real problem be, obviously, that Secret data was not being handled properly?

The right way to solve this problem wouldn’t waste money purchasing, installing, and maintaining FDE and a keystore infrastructure for hundreds of thousands, perhaps millions, of laptops. Instead, it would establish rules for tracking all the PII and appropriately protecting it wherever it is stored. If PII has to be stored on a laptop (e.g. by Census), then FDE could be purchased for that laptop, along with appropriate other countermeasures (anti-virus, anti-spyware, firewall, etc.). A blanket purchase accomplishes the opposite of what is needed, because now every drone with a laptop will think it’s safe to put PII on his laptop, and storage of PII on laptops will increase, not decrease. And laptops, even with FDE, are the least secure systems in any enterprise because they keep getting connected to insecure networks.

“In re: paper, gov’t has rules on protecting that data (Privacy Act requirements).”

Do you work at a government agency? If so, what care do your HR and physical security people take with personnel files? What email encryption do your HR and physical security people use? How does your incident response team deal with theft of paper files, or are they even consulted? If not, why is PII theft on a laptop handled by one group, while PII theft on paper is handled by someone else? Shouldn’t there be uniform handling of PII theft, no matter how it occurs?

“In re: backups and other electronic records, these things are being addressed”

Not in any uniform or even rational way.

Remember that all this stupidity was in reaction to one theft of a laptop that was not supposed to have the data on it in the first place. Meanwhile, no one keeps track of which other systems have these data on them, and laptop thefts are just the tip of the iceberg, even where mobile devices are concerned–how many PDA cell phones have been stolen in the past year? Why hasn’t that number been reported in the media, when laptops have? (Answer: because it’s theatre.)

“‘- No safeguards for transmission of PII over networks.'”
“Incorrect. Data has to be protected while in motion, and there are rules and standards for this.”

Really? And which of those rules and standards apply to PII? Why not enforce those rules?

Oh, wait–what is PII, anyway?

“The RFP being discussed is for data at rest which is a newly recognized requirement.”

The RFP is for a tiny fraction of the real problem, and is going to waste a large proportion of everyone’s security budget on theatre. Just setting up the FDE on all those laptops is going to chew up major resources, and then we have follow-on costs with maintaining laptop software, infrastructure hardware and software, and help desk support for the thousands of technical problems which will arise. And what evidence do we have that there has ever been any identity fraud based on data from a stolen government laptop anyway?

“‘- No requirements for labeling PII to indicate its status as such.'”
“Privacy Act again. What, you’ve never seen a Privacy Act cover sheet?”

Not on a laptop. Not on a web server. Actually, not anywhere, no. And I work for the government.

annoyed January 4, 2007 12:11 PM

@Elliott
“Yes, monoculture makes big targets.
But unencrypted notebook drives make even bigger targets.”

Wrong. The implementation of this will include a network software component on every laptop, because of the need for key escrow and centralized management. That network component alone will be a bigger target than all of the government laptop thefts in history ever were.

Elliott January 4, 2007 1:15 PM

@annoyed: “That network component alone will be a bigger target…”

My answer to bob related only to the local encryption stack and software, not the remote access facility that is specified to be part of the solution. Sorry for not being clear.

I do expect that it is economically impossible to design and implement a network service free of vulnerabilities.

First I felt that the key management could be designed to limit the scope of leaked key data /and other exploits/ to only one notebook, and use secret splitting to allow key escrow only if multiple guarantors cooperate.
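The secret splitting Elliott mentions is essentially Shamir's threshold scheme: split the escrowed disk key so that recovery requires any `threshold` of `n` guarantors to cooperate. A toy Python sketch of the idea (illustrative only; the field size, share counts, and function names are hypothetical, not anything from the actual RFP):

```python
# Minimal Shamir secret sharing: an escrowed key split into n shares,
# any `threshold` of which reconstruct it; fewer reveal nothing.
import random

PRIME = 2**127 - 1  # field modulus; must exceed the secret

def split(secret: int, n: int, threshold: int):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)          # the escrowed disk key
shares = split(key, n=5, threshold=3)  # five guarantors, any three suffice
assert recover(shares[:3]) == key
assert recover(shares[2:]) == key
```

Any two shares alone leave the key information-theoretically hidden, which is exactly the "multiple guarantors must cooperate" property described above.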

But if there is an access path from the network service to the disk decryption key, which seems plausible if the remote service is supposed to access the data when the regular user forgot her passphrase, then an exploit over the network could use that path as well. And it would work on all notebooks that are equipped with a similar software version.

An interesting variant might be if the symmetric key is stored in a TPM, and can be retrieved either with the user passphrase, or with an escrowed key that works only for that machine. But then again, who guarantees that there is no way to access the user passphrase or any derived state, or to circumvent the protection of the TPM? Plus, without a non-TPM access path, the TPM might be used in a DoS attack to make many systems inaccessible at once.
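The dual-unlock idea can be sketched as one disk-encryption key (DEK) stored twice: once wrapped under a key derived from the user's passphrase, once under a per-machine escrow key. The Python below is a toy illustration only; XOR with a PBKDF2-derived pad stands in for real key wrapping (production systems use AES key wrap), and every name here is hypothetical:

```python
# One DEK, two independently unwrappable copies: user path and escrow path.
import hashlib
import secrets

def derive(passphrase: bytes, salt: bytes) -> bytes:
    # PBKDF2-SHA256 gives a 32-byte key-encryption key (KEK).
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def wrap(dek: bytes, kek: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(dek, kek))  # toy wrap; not AES-KW

unwrap = wrap  # XOR is its own inverse

dek = secrets.token_bytes(32)        # the actual disk-encryption key
salt = secrets.token_bytes(16)
escrow_kek = secrets.token_bytes(32)  # held centrally by the escrow service

user_blob = wrap(dek, derive(b"correct horse battery", salt))
escrow_blob = wrap(dek, escrow_kek)

# Normal boot: the user's passphrase unwraps the DEK.
assert unwrap(user_blob, derive(b"correct horse battery", salt)) == dek
# Forgotten passphrase: the escrow service's KEK unwraps the same DEK.
assert unwrap(escrow_blob, escrow_kek) == dek
```

Elliott's worry applies directly: any code path that can reach `escrow_kek` over the network can reach the DEK, on every machine running the same software.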

Scary stuff. Thanks for pointing this out.

annoyed January 4, 2007 2:15 PM

@Elliott
“But if there is an access path from the network service to the disk decryption key”

Even if there isn’t a path to the decryption key, there’s one more piece of crap vendor software presenting a new attack and fingerprinting surface on every government laptop. If there’s a network vulnerability in that code, it doesn’t matter whether the key is recoverable, because the laptop is already running the OS.

FDE only helps when the laptop has been turned off. Once the OS is running, FDE is nothing more than a drag on system performance.

WhatIsBeingProtected January 4, 2007 4:57 PM

The company I work for just did this, by using PGP encryption of the entire laptop contents. However, it was not done to protect the contents. It was done to avoid fines that can be assessed if the laptop is stolen and contents weren’t encrypted.

Nicolai January 4, 2007 6:21 PM

FDE is being implemented in response to the lost-laptop reports so popular last summer. In fact, the deadline to have them encrypted has now passed. They are moving on to PDAs and will be coming back to encrypt the desktops. It isn’t to protect the data on a running system; it is so they can say “we lost a laptop, but it’s okay, the disk was encrypted.” FDE is being implemented because there is no file storage discipline in Windows applications. Might be in the user’s directory or might be in the root directory. The only solution is to encrypt them all and let God sort it out.

PII is defined in the privacy act but as always people start fantasizing movie plots and so expand the definition until it includes even a person’s initials in some randomized order.

Microsoft advises against FDE in the Vista EULA (under the virtual system restrictions). I guess they’ll be changing that bit of legalese and hopefully the OS won’t choke since it never has had to deal with security before.

Sceptic January 4, 2007 7:18 PM

Thanks to everyone who posted comments to my disk data recovery question.

@Nicolai

“FDE is being implemented because there is no file storage discipline in Windows applications”

That seems a bit harsh on Windows to me. I’d suggest that the problem is a lack of user discipline about what they put on the laptop in the first place. It’s so easy to dump a large amount of data to a laptop (and memory sticks etc.) for convenience; that’s why we will keep hearing about data loss on lost and improperly disposed-of laptops and disks.

Even if we all used Linux or whatever, this would still be a problem.

Saqib Ali January 4, 2007 7:46 PM

@Sceptic

“That seems a bit harsh on Windows to me.”

Not really. One problem with Windows is that it really doesn’t care where it creates tmp files or what the permissions on those tmp files are. Windows tends to be more open about where a user can save files, compared to Linux and Unix.

Having said that, even Unix/Linux systems create files in /tmp or /var/tmp, but on a more limited basis.
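On the permissions point: POSIX temp-file APIs at least default to owner-only access, so a plaintext spill into /tmp is access-controlled while the OS is running. A quick check (POSIX-only; Windows uses ACLs, which this doesn't cover):

```python
# Python's tempfile.mkstemp creates files mode 0600 regardless of umask,
# so only the owning user can read a temp file it creates.
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()
try:
    mode = stat.S_IMODE(os.stat(path).st_mode)
    assert mode == 0o600  # readable/writable by owner only
finally:
    os.close(fd)
    os.remove(path)
```

Of course, none of this matters once the disk leaves the machine, which is FDE's whole point.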

averros January 4, 2007 11:35 PM

…and of course, nobody asks the ONLY relevant question: what exactly are the secrets the government is so keen to guard?

Plans for new offensive wars? The evidence of massive corruption and waste?

Oh, sure, encrypt everything, to reduce the chances of accidental glimpses by the general public into the inner workings of The Gang.

Any sane person would be very concerned about this “security” initiative. Because the next logical step after encryption of data will be prohibition of keeping the same data on paper. To save the trees, no doubt.

Here goes your FOIA, and the evidence of malfeasance can now go away in a second, with a key “lost” or “accidentally destroyed” – with no piles of shredded paper. Neat. Clean. Unaccountable. The vaunted Democracy is well on its way to quite ordinary fascism.

…and the key escrow is a whip over low-level flunkies, so they won’t be tempted to double-cross their political overlords. How convenient that it is “trusted” to some shadow agency. So they can read everything and assemble dossiers on potential challengers to the ruling few.

I think this scheme would make any dictator wannabe piss himself from happiness.

Ossifer X January 5, 2007 1:07 AM

I work for the government. My agency is still using DOS-based applications. This all sounds really neato, but I shake my head when I think of the implementation. The gubmint is an ungainly, plodding behemoth.

annoyed January 5, 2007 3:08 AM

@Nicolai
“PII is defined in the privacy act but as always people start fantasizing movie plots and so expand the definition until it includes even a person’s initials in some randomized order.”

PII is never mentioned in the Privacy Act, which was written in 1974. The closest term that does occur is “individually identifiable information”, which is just as much gibberish as PII, and occurs only once.

The most applicable term that is defined in the Privacy Act is a “record”:

“the term ‘record’ means any item, collection, or grouping of information about an individual that is maintained by an agency, including, but not limited to, his education, financial transactions, medical history, and criminal or employment history and that contains his name, or the identifying number, symbol, or other identifying particular assigned to the individual, such as a finger or voice print or a photograph;”

Note the “including, but not limited to…” This is completely open-ended, which is why agencies don’t know where to stop.

Meanwhile, a crucial term that is never defined in the Privacy Act is “identify”. It could mean “identify an individual uniquely,” but a name doesn’t do that in most cases, and a name is an alternative to other “identifying” information. So it could also mean “to narrow the field of candidate individuals”, which is the sense in the common term “identifying mark”. But then, how much does information have to narrow the field before we consider it “identifying”? Is hair color enough? Hair and eye color? Last four digits of phone number? MD5 digest of the street address? IP address? (If you think IP address can’t identify an individual, go talk to someone who’s been sued by the RIAA.) Who knows?

Now look at the definition of “record” and tell me what a “record” is when we don’t know what “identify” means.

It should be no surprise that, with this gobbledygook as starting point, agencies are defining PII to include things such as “a name combined with a telephone number”. On the one hand, they’re being instructed to protect all “PII” on mobile devices; on the other, they’re being given no practical guidance whatsoever on what PII is. Sure, if you have medical records and SSN, that’s PII, but most agencies don’t have that. They have names, phone numbers, email addresses, IP addresses in logs. Are web logs PII? What about email headers?

This isn’t movie-plot fantasy–it’s simply the chaos that results when the White House says “Jump!” without saying how high. It’s the fault of OMB for issuing broad mandates with short deadlines and no forethought. Agencies have already spent tens of millions trying to implement this crap. With this blanket purchase, they’ll spend hundreds of millions, perhaps billions more. All that money could be used effectively if it could be targeted to protect the information that is deemed so important. But without defining that information in an actionable way, all the government can do is shovel money into the boiler and hope nobody asks any questions.

It’s time to start asking questions. Now.

Sceptic January 5, 2007 7:18 AM

@Saqib Ali

“One problem with Windows is that it really doesn’t care where it is creating tmp files and what the permissions on those tmp”

I think you have missed my point (I am quite happy to concede that Windows OS and Apps tend to be a bit sloppy about file management).

Scenario:
Hard-working company man is running out of time for his report at night so he dumps a copy of a fileshare/database/whatever onto his laptop to take it home. The laptop is then lost and the company may have a problem.

That’s what laptop FDE is intended to address – not the finer points of TMP/swapfile forensics and suchlike. It doesn’t matter what OS the employee in the scenario is using; if the data is lost then it’s lost.

Note that the list of requirements for the FDE competition also requests a solution for UNIX and Linux.

annoyed January 5, 2007 10:57 AM

@Sceptic
“Hard-working company man is running out of time for his report at night so he dumps a copy of a fileshare/database/whatever onto his laptop to take it home… That’s what laptop FDE is intended to address…”

Better solution: put hard-working company man in prison for mishandling sensitive data. I mean, if you actually care about the problem.

Peter January 5, 2007 2:44 PM

We use safeboot for one client’s laptops. The purpose of key escrow is when an (ex)employee forgets or refuses to unlock the company’s laptop.

This isn’t significantly different from GLBA requirements to keep “non-public information” protected, as the only real interpretation of that part of GLBA is the requirement of encryption. Most folks interpret that to mean that if you have names and passwords, such as in the ubiquitous “users” table, you’re going to need to encrypt some or all of the contents of that table.

http://en.wikipedia.org/wiki/Gramm-Leach-Bliley_Act
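For the password column of that ubiquitous "users" table specifically, the usual interpretation of "protected" is salted one-way hashing rather than reversible encryption. A stdlib sketch of that approach (the iteration count and function names are this example's own choices, not anything GLBA mandates):

```python
# Salted PBKDF2 password hashing: store (salt, digest), never the password.
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple:
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("hunter2")
assert verify("hunter2", salt, digest)
assert not verify("hunter3", salt, digest)
```

Names and other genuinely identifying columns, by contrast, have to remain recoverable, which is where reversible encryption and all its key-management baggage come in.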

Sceptic January 5, 2007 8:14 PM

@annoyed

Did you read my earlier posts?

“I’d suggest that the problem is a lack of user discipline about what they put on the laptop in the first place.”

“After making an example of the first few victims who lose both devices, the laptop users might start to take this problem a bit more seriously.”

I have no problem about user education and a bit of thoughtfully applied pressure to persuade users not to copy sensitive data to removable storage but getting users to take this seriously is difficult. In the commercial world, jailing an employee for a mistake (unrealistic) or taking revenge by sacking etc will not affect the damage of the data leak after the event. Encrypting laptops is a step towards mitigating the damage caused by leaking data through lost/stolen laptops which might be justifiable on a cost/risk analysis.

I don’t know how the law works in the US but jail for accidental information disclosure for a public sector employee also seems a bit unlikely to me.

By definition, if you read this blog then you are more security aware than the average (US?) citizen. It is worth remembering that most public sector employees involved in data processing roles are constantly bombarded with moronic bureaucracy; it’s hardly surprising that they don’t pay attention to the finer points of losing government data.

annoyed January 6, 2007 2:58 AM

@Sceptic
“Did you read my earlier posts?”

Indeed I did, but I suspect you didn’t read mine.

“In the commercial world, jailing an employee for a mistake (unrealistic) or taking revenge by sacking etc will not affect the damage of the data leak after the event.”

With appropriate penalties, there won’t be any data leak after the event, because there won’t be sensitive data on the stolen equipment.

The specific penalty is not important; it’s not even the point.

The White House has unconsciously defined a new classification for sensitive information in a completely idiotic series of memos that mandate broad technical controls for a small collection of computer systems, rather than the information they pretend they want to protect. These drooling incompetents at OMB are concerned strictly with appearances, and are doing nothing to address the real problem, which is that agencies are collecting and storing personal information without knowing whether or how it is to be protected.

I have nothing against FDE per se–indeed, I would demand it for every laptop that a government agency decides this data MUST be stored on. But first I would demand that they identify where the data is already, decide whether it needs to be there, label it appropriately, protect it, and train everyone in how to handle it. Instead, they’re blowing their budget on protecting the laptops they use to give their pathetic PowerPoint presentations. The vast majority of these systems don’t have any personal information on them anyway, and the vast majority of those that do, shouldn’t.

“It is worth remembering that most public sector employees involved in data processing roles are constantly bombarded with moronic bureaucracy; it’s hardly surprising that they don’t pay attention to the finer points of losing government data.”

I work in IT security for a federal government agency and I’m keenly aware of the paucity of technical knowledge. This is why a full-fledged classification, not some silly parlour game, needs to be brought to bear on this problem. The Navy doesn’t have a problem with compromise of Secret data stored on laptops. That’s not because they’re encrypting the laptops, though they may well be, and it certainly isn’t because everyone who handles Secret data is an expert in IT security. They have no problem because they know where the Secret data is being stored, they keep track of it, and they control access to it. The secrets are safe because there are real penalties for failing to handle them properly.

Meanwhile, the rest of the government is blithely mishandling an ocean of personal information, and no one is doing a damned thing about it. They’re just wasting money–a lot of money–on theatre.

Sceptic January 6, 2007 4:58 PM

@annoyed

Your posts are interesting and you seem to be (rightly) frustrated about the way the government approaches data processing and lifetime handling.

You use the US Navy system for Secret data processing as an example of how things should be done. Fine. Judging by your description, the Navy have a good working system. I presume that anybody caught breaking the rules is subjected to military disciplinary proceedings (btw, I cannot help wondering how many mistakes get covered up by sailors anxious to avoid punishment but that’s another story).

Can these rigorous standards be applied to everything, including systems run by Federal/state-employed civilians and contractors? I doubt it. It is well known that there are tradeoffs between security and convenience, and I suggest that the amount of Secret data processing carried out by the Navy is rather small in comparison with less restrictively classified data.

“The specific penalty is not important; it’s not even the point.”
“The secrets are safe because there are real penalties for failing to handle them properly.”

Sorry, but that looks like a contradiction to me. If you could apply military-style discipline to all staff employed in government data processing then you would be onto a winner (and the US would be some sort of Stalinist state).

“an ocean of personal information”

That phrase reminded me of Bruce’s comment about data being one of the pollutants of the 21st century:
http://www.schneier.com/blog/archives/2006/10/schneier_lectur.html

Let’s imagine you are given Presidential Authority and a big budget to tighten government data processing security. I suggest that before starting the data and systems classification, procedural changes and punitive regime required to meet your standards, you might want to identify and eliminate all the unnecessary data that the government keeps first.

Consider what the chances are that the government will agree to stop collecting and retaining unnecessary data. I’m sorry but I just don’t believe it. As the technology becomes cheaper and more efficient, the government will keep collecting data. In my country (UK), the government seems to be obsessed with storing more and more detailed records about its citizens.

I can see your point about the failure to consistently apply data handling security standards but I think you are underestimating the practical difficulties involved in applying your principles to government data processing.

I suppose my comments above could be seen as a rallying cry for apathy, which isn’t my intention. I suggest that more effort should be given to educating the public about the nature of modern IT systems and the new risks they pose as well as technical and procedural guards for data processing security. The sooner the public appreciate the problem then the sooner you will be able to get people to buy into your ideas (without threatening penalties). We don’t all need to be IT experts but perhaps we do all need to understand modern privacy and security.

This is my last post on this topic.

Regards.

Jeff January 6, 2007 10:31 PM

This is a good initiative.

No, it’s not going to provide 100% security against every possible attack, but it will provide very good security in general, and excellent security for cases of laptop loss or theft by ordinary thieves.

Also, we’re talking about unclassified laptops here. The kind of super-sophisticated thieves who are going to forge and secretly replace CACs, surreptitiously obtain PINs, etc., aren’t going to go through all this trouble to get a list of veterans’ names.

Ron January 7, 2007 10:27 PM

I work for a major manufacturer of commercial and military aircraft. We have had a couple of laptop thefts make the news over the last few years and we are switching to whole disk encryption on all of our laptops.

The key is assigned by company security so no token is required. Although the laptop will boot without any need to enter or have a key, you still need a domain or local account to log in. If you use a program like Norton Commander or some Linux boot CD, you cannot use the utilities to change the passwords or view the files on the drive because the drive is encrypted and therefore unreadable without booting from the drive first.

Ouroboros January 10, 2007 3:30 AM

What few people are noticing is there is not one, but two kinds of pork in this competition. One is for the likely overinflated purchase and ongoing encryption support costs. The second, and likely more lucrative, is the IT support services that have to support FDE machines. Without encryption, if the drive gets borked, you had a standing chance of recovering data without too much trouble. With FDE, the IT support service costs for dying and dead drives will be enormous (for the work alone, or for the sane alternative of a consistent incremental backup policy and hardware). I’d keep a close eye on which vendors have cosy relationships with backup/storage/NAS/SAN vendors. If they ignore the corruption problem, they’ll only shoot themselves in the foot.

Still, identity theft is largely an external cost for the government, and good luck busting them for violations.

aeschylus January 10, 2007 3:41 PM

“What few people are noticing is there is not one, but two kinds of pork in this competition.”

Sounds delicious! :^)

Mike January 15, 2007 9:01 AM

If they could get the fundamental security right, I could see it actually making life easier for government employees. Could full disk encryption be combined with some form of multi-boot to end the need for some government employees to carry up to three different computers for classified, unclassified, and personal use?

The Army’s practice of putting the private keys on ID cards seems like a good solution. It is highly unlikely any Army officer is going to put their ID in their laptop bag. But it is a pain to have to get it out every time you need to do some little personal task.

Reid January 16, 2007 4:48 PM

Israel:

I am a GOVie implementing an in-house answer to the whole-disk encryption requirement list. Bear in mind that my opinions posted here are not spoken on behalf of my employer :).

I appreciate your insight into the technical deficiencies of the requirements list.

I guess I should explain our intentions. It’s been said that no solution offers 100% coverage. This is especially true where physical access to a machine can be gained by an adversary (as in a laptop’s hard disk). What we’re trying to do is minimize the risk.

I think you’re running into the classic butting heads of policy versus reality. Policy states that our secure laptops are not to be carried in the same container as our CAC. Policy also states that our CAC PIN is not to be written down (let alone written down and taped onto our CAC).

Reality dictates that there are probably violators out there, true. The risk is minimized through policy, though. The intersection of people that carry their laptops and CACs in the same container is very small. The intersection of that small group with people that write their PIN on their CAC is even smaller. The intersection of this very, very small group with people whose laptops get stolen is hopefully 0, or somewhere very close to it. Further, the intersection of those stolen laptops with thieves that care about the CAC + PIN is even smaller — they’re probably most interested in the value of the machine. This is what I mean by risk minimization. It’s still possible for someone to get the laptop + CAC + PIN, but the chances of them doing this successfully and knowing what they’ve got are very, very, very (did I mention very?) small, because most .GOV workers follow policy.
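Reid's intersection argument is just multiplication of rates, assuming the violations are independent. All numbers below are made up purely for illustration:

```python
# Hypothetical per-laptop rates; the product is the chance all of them
# coincide for one machine in a given year (independence assumed).
p_same_bag = 0.05    # carries laptop and CAC together, against policy
p_pin_on_cac = 0.02  # writes the PIN on the card
p_stolen = 0.01      # laptop stolen that year
p_cares = 0.10       # thief wants the data, not just the hardware

p_compromise = p_same_bag * p_pin_on_cac * p_stolen * p_cares
print(f"per-laptop annual compromise probability: {p_compromise:.2e}")
# on the order of 1e-6 with these made-up inputs
```

One caveat on the math: the independence assumption is generous, since someone who ignores one policy is probably likelier to ignore the others, so the true product sits somewhat higher than a naive multiplication suggests.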

It’s true that an adversary could print up a fake CAC with a custom applet on it that grabs the user’s PIN. The user will know something is up, though: they won’t be able to sign in to the laptop, they won’t be able to VPN back to home base, etc., because the fake CAC won’t have their key in its private memory. They’ll call their help desk (hopefully) and their CAC will be determined dead; it will be revoked and added to the certificate revocation list. A new card will be issued, with a new PIN. It’s hard for a laptop’s disk encryption scheme to actually obey the CRL, as it has to decrypt the hard disk before OS services are available, so the adversary could still steal the laptop and use the original CAC to decrypt it, I suppose. Of course, the adversary could also rig up a custom laptop with a custom CCID reader and custom CAC, and leave the old CAC plugged in somewhere, allowing the new laptop to do a kind of man-in-the-middle…

Still, anyone capable of performing this type of “fake CAC” feat has significant resources behind them. They aren’t your common thief, they likely know what they’re trying to get (nation-state actor or something like it). Laptop hard disk encryption is not meant to protect against this kind of adversary. Data that must be protected against this kind of adversary should be classified at a sufficient level, as in SECRET or above (technically the classification is a measure of damage that the data could do to the US if it is leaked, but if a resourceful actor is attempting to gain the data, it is highly probable that this is the case). Classified data is not allowed on a laptop used in an unclassified environment (e.g. outside of a classified facility, like your home or starbuck’s). In order for such an actor to gain access to such a device, they would have to have a security clearance, would have to get past armed guards, etc…insider threat and armed enemy combatants are also threats that this solution is not meant to protect against.

A different variety of safeguards are put into place on machines with classified data. The protection provided is commensurate with the security classification of the data on the device. Laptop disk encryption is meant for unclassified data, where harm will not cause significant damage to operations of the US government. As such, it does not require the more stringent safeguards, and disk encryption should suffice.

I hope this provides a little more insight into the rationale behind the list, and I hope that it dispels the idea that we’re trying for a total solution. We recognize the problems; we’re just trying to make it very unlikely for petty theft a la the VA laptop case to put unclassified but “for official use only” data at risk in the future.

Cheers, and thanks for the input,
Reid

