The CIA's "Development Tradecraft DOs and DON'Ts"

Useful best practices for malware writers, courtesy of the CIA. Seems like a lot of good advice.

General:

  • DO obfuscate or encrypt all strings and configuration data that directly relate to tool functionality. Consider de-obfuscating strings in memory only at the moment the data is needed. When a previously de-obfuscated value is no longer needed, it should be wiped from memory.

    Rationale: String data and/or configuration data is very useful to analysts and reverse-engineers.

  • DO NOT decrypt or de-obfuscate all string data or configuration data immediately upon execution.

    Rationale: Raises the difficulty for automated dynamic analysis of the binary to find sensitive data.

  • DO explicitly remove sensitive data (encryption keys, raw collection data, shellcode, uploaded modules, etc) from memory as soon as the data is no longer needed in plain-text form. DO NOT RELY ON THE OPERATING SYSTEM TO DO THIS UPON TERMINATION OF EXECUTION.

    Rationale: Raises the difficulty for incident response and forensics review.

  • DO utilize a deployment-time unique key for obfuscation/de-obfuscation of sensitive strings and configuration data.

    Rationale: Raises the difficulty of analysis of multiple deployments of the same tool.

  • DO strip all debug symbol information, manifests (an MSVC artifact), build paths, and developer usernames from the final build of a binary.

    Rationale: Raises the difficulty for analysis and reverse-engineering, and removes artifacts used for attribution/origination.

  • DO strip all debugging output (e.g. calls to printf(), OutputDebugString(), etc) from the final build of a tool.

    Rationale: Raises the difficulty for analysis and reverse-engineering.

  • DO NOT explicitly import/call functions that are not consistent with a tool’s overt functionality (e.g. WriteProcessMemory, VirtualAlloc, CreateRemoteThread, etc. – for a binary that is supposed to be a notepad replacement).

    Rationale: Lowers potential scrutiny of binary and slightly raises the difficulty for static analysis and reverse-engineering.

  • DO NOT export sensitive function names; if exports are required for the binary, use an ordinal or a benign function name.

    Rationale: Raises the difficulty for analysis and reverse-engineering.

  • DO NOT generate crashdump files, coredump files, “Blue” screens, Dr Watson or other dialog pop-ups and/or other artifacts in the event of a program crash. DO attempt to force a program crash during unit testing in order to properly verify this.

    Rationale: Avoids suspicion by the end user and system admins, and raises the difficulty for incident response and reverse-engineering.

  • DO NOT perform operations that will cause the target computer to be unresponsive to the user (e.g. CPU spikes, screen flashes, screen “freezing”, etc).

    Rationale: Avoids unwanted attention from the user or system administrator to tool’s existence and behavior.

  • DO make all reasonable efforts to minimize binary file size for all binaries that will be uploaded to a remote target (without the use of packers or compression). Ideal binary file sizes should be under 150KB for a fully featured tool.

    Rationale: Shortens overall “time on air”: not only the time to get the tool on target, but also the time to execute functionality and clean up.

  • DO provide a means to completely “uninstall”/“remove” implants, function hooks, injected threads, dropped files, registry keys, services, forked processes, etc. whenever possible. Explicitly document (even if the documentation is “There is no uninstall for this”) the procedures, permissions required and side effects of removal.

    Rationale: Avoids unwanted data left on target. Also, proper documentation allows operators to make better operational risk assessment and fully understand the implications of using a tool or specific feature of a tool.

  • DO NOT leave dates/times such as compile timestamps, linker timestamps, build times, access times, etc. that correlate to general US core working hours (i.e. 8am–6pm Eastern time).

    Rationale: Avoids direct correlation to origination in the United States.

  • DO NOT leave data in a binary file that demonstrates CIA, USG, or its witting partner companies involvement in the creation or use of the binary/tool.

    Rationale: Attribution of binary/tool/etc by an adversary can cause irreversible impacts to past, present and future USG operations and equities.

  • DO NOT have data that contains CIA and USG cover terms, compartments, operation code names or other CIA and USG specific terminology in the binary.

    Rationale: Attribution of binary/tool/etc by an adversary can cause irreversible impacts to past, present and future USG operations and equities.

  • DO NOT have “dirty words” (see dirty word list – TBD) in the binary.

    Rationale: Dirty words, such as hacker terms, may cause unwarranted scrutiny of the binary file in question.
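
The string-handling guidance at the top of this list (deployment-unique key, decode only at the moment of use, explicit wipe) can be sketched in a few lines. This is an illustrative Python sketch only: the key, function names, and sample string are all hypothetical, and a real tool would be written in a compiled language where zeroing memory is far more dependable than in a garbage-collected runtime.

```python
import secrets

# Deployment-time unique key (per the guidance: one key per deployment).
KEY = secrets.token_bytes(16)

def obfuscate(plain: bytes, key: bytes = KEY) -> bytes:
    """Build-time: XOR the plaintext with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plain))

def deobfuscate(blob: bytes, key: bytes = KEY) -> bytearray:
    """Runtime: recover the value into a mutable, wipeable buffer."""
    return bytearray(b ^ key[i % len(key)] for i, b in enumerate(blob))

def wipe(buf: bytearray) -> None:
    """Explicitly zero the buffer rather than relying on the OS."""
    for i in range(len(buf)):
        buf[i] = 0

stored = obfuscate(b"c2.example.org")  # what would ship in the binary
secret = deobfuscate(stored)           # decoded only at the moment of use
assert bytes(secret) == b"c2.example.org"
wipe(secret)                           # wiped as soon as no longer needed
assert all(b == 0 for b in secret)
```

Note that Python strings are immutable and may be copied by the interpreter, which is exactly why the sketch uses `bytearray`; even so, this only illustrates the pattern, not a forensically sound implementation.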

Networking:

  • DO use end-to-end encryption for all network communications. NEVER use networking protocols which break the end-to-end principle with respect to encryption of payloads.

    Rationale: Stifles network traffic analysis and avoids exposing operational/collection data.

  • DO NOT solely rely on SSL/TLS to secure data in transit.

    Rationale: Numerous man-in-middle attack vectors and publicly disclosed flaws in the protocol.

  • DO NOT allow network traffic, such as C2 packets, to be re-playable.

    Rationale: Protects the integrity of operational equities.

  • DO use IETF RFC-compliant network protocols as a blending layer. The actual data, which must be encrypted in transit across the network, should be tunneled through a well-known and standardized protocol (e.g. HTTPS).

    Rationale: Custom protocols can stand-out to network analysts and IDS filters.

  • DO NOT break compliance of an RFC protocol that is being used as a blending layer. (i.e. Wireshark should not flag the traffic as being broken or mangled)

    Rationale: Broken network protocols can easily stand-out in IDS filters and network analysis.

  • DO use variable size and timing (aka jitter) of beacons/network communications. DO NOT predictably send packets with a fixed size and timing.

    Rationale: Raises the difficulty of network analysis and correlation of network activity.

  • DO proper cleanup of network connections. DO NOT leave around stale network connections.

    Rationale: Raises the difficulty of network analysis and incident response.
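
The anti-replay bullet above is the one most amenable to a small sketch. One common construction (hedged: the document does not specify a mechanism, and all field names here are hypothetical) is a monotonically increasing counter authenticated under a shared key, so a captured C2 packet cannot be usefully resent.

```python
import hmac
import hashlib
import struct

KEY = b"per-deployment-shared-secret"  # hypothetical; unique per deployment

def make_packet(counter: int, payload: bytes, key: bytes = KEY) -> bytes:
    """Sender: counter (8 bytes) || payload || HMAC-SHA256 over both."""
    body = struct.pack(">Q", counter) + payload
    return body + hmac.new(key, body, hashlib.sha256).digest()

def accept_packet(packet: bytes, last_counter: int, key: bytes = KEY):
    """Receiver: verify the MAC, then reject stale (replayed) counters.
    Returns (payload_or_None, updated_last_counter)."""
    body, tag = packet[:-32], packet[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, body, hashlib.sha256).digest()):
        return None, last_counter              # forged or corrupted
    counter = struct.unpack(">Q", body[:8])[0]
    if counter <= last_counter:
        return None, last_counter              # replay: counter already seen
    return body[8:], counter

pkt = make_packet(1, b"beacon")
payload, last = accept_packet(pkt, 0)
assert payload == b"beacon"
replayed, last = accept_packet(pkt, last)      # same packet captured and resent
assert replayed is None                        # rejected as a replay
```

A real design would also encrypt the payload end to end, per the first Networking bullet; the counter/MAC layer only addresses replayability.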

Disk I/O:

  • DO explicitly document the “disk forensic footprint” that could be potentially created by various features of a binary/tool on a remote target.

    Rationale: Enables better operational risk assessments with knowledge of potential file system forensic artifacts.

  • DO NOT read, write and/or cache data to disk unnecessarily. Be cognizant of 3rd party code that may implicitly write/cache data to disk.

    Rationale: Lowers potential for forensic artifacts and potential signatures.

  • DO NOT write plain-text collection data to disk.

    Rationale: Raises difficulty of incident response and forensic analysis.

  • DO encrypt all data written to disk.

    Rationale: Disguises intent of file (collection, sensitive code, etc) and raises difficulty of forensic analysis and incident response.

  • DO utilize a secure erase when removing a file from disk that wipes, at a minimum, the file’s filename, datetime stamps (create, modify and access) and its content. (Note: The definition of “secure erase” varies from filesystem to filesystem, but at least a single pass of zeros over the data should be performed. The emphasis here is on removing all filesystem artifacts that could be useful during forensic analysis.)

    Rationale: Raises difficulty of incident response and forensic analysis.

  • DO NOT perform disk I/O operations that will cause the system to become unresponsive to the user or alert a system administrator.

    Rationale: Avoids unwanted attention from the user or system administrator to tool’s existence and behavior.

  • DO NOT use a “magic header/footer” for encrypted files written to disk. All encrypted files should be completely opaque data files.

    Rationale: Avoids signature of custom file format’s magic values.

  • DO NOT use hard-coded filenames or filepaths when writing files to disk. This must be configurable at deployment time by the operator.

    Rationale: Allows the operator to choose a filename that fits within the operational target.

  • DO have a configurable maximum size limit and/or output file count for writing encrypted output files.

    Rationale: Avoids situations where a collection task gets out of control and fills the target’s disk, which would draw unwanted attention to the tool and/or the operation.
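
The secure-erase bullet translates into a short best-effort routine. This Python sketch follows the bullet's own definition (single pass of zeros, scrub the filename and timestamps); as the note in the list warns, journaling filesystems, snapshots and SSD wear-leveling can retain copies the application never sees, so treat this as illustrative.

```python
import os
import secrets
import tempfile

def secure_erase(path: str) -> None:
    """Best-effort single-pass erase: zero the content, scrub the
    timestamps and original filename, then unlink. Filesystem-dependent:
    journals and wear-leveling may still retain old copies."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"\x00" * size)          # single pass of zeros over the data
        f.flush()
        os.fsync(f.fileno())             # push the overwrite to disk
    os.utime(path, (0, 0))               # scrub access/modify timestamps
    anon = os.path.join(os.path.dirname(path) or ".", secrets.token_hex(8))
    os.rename(path, anon)                # replace the tell-tale filename
    os.remove(anon)

# Demo on a throwaway temp file.
fd, path = tempfile.mkstemp()
os.write(fd, b"raw collection data")
os.close(fd)
secure_erase(path)
assert not os.path.exists(path)
```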

Dates/Time:

  • DO use GMT/UTC/Zulu as the time zone when comparing date/time.

    Rationale: Provides consistent behavior and helps ensure “triggers/beacons/etc” fire when expected.

  • DO NOT use US-centric timestamp formats such as MM-DD-YYYY. YYYYMMDD is generally preferred.

    Rationale: Maintains consistency across tools, and avoids associations with the United States.
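
Both Dates/Time bullets fit in a few lines of Python (the trigger date below is a made-up example, not anything from the document):

```python
from datetime import datetime, timezone

# Compare times in UTC so triggers/beacons fire when expected regardless
# of the target's local time zone.
now = datetime.now(timezone.utc)
trigger_after = datetime(2017, 3, 13, tzinfo=timezone.utc)  # hypothetical
armed = now >= trigger_after

# Format dates as YYYYMMDD rather than the US-centric MM-DD-YYYY.
stamp = now.strftime("%Y%m%d")
assert len(stamp) == 8 and stamp.isdigit()
```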

PSP/AV:

  • DO NOT assume a “free” PSP product is the same as a “retail” copy. Test on all SKUs where possible.

    Rationale: While PSP/AV products from the same vendor may appear to have the same features across different SKUs, they are not the same, and their detection behavior can differ.

  • DO test PSPs with a live (or recently live) internet connection where possible. NOTE: This can be a risk-vs-gain balance that requires careful consideration and should not be done haphazardly with in-development software. It is well known that PSP/AV products with a live internet connection can and do upload software samples based on varying criteria.

    Rationale: PSP/AV products exhibit significant differences in behavior and detection when connected to the internet versus not.

Encryption: NOD publishes a Cryptography standard: “NOD Cryptographic Requirements v1.1 TOP SECRET.pdf”. Besides the guidance provided here, the requirements in that document should also be met.

The crypto requirements are complex and interesting. I’ll save commenting on them for another post.

News article.

Posted on March 13, 2017 at 12:00 PM

Comments

Jim P. March 13, 2017 12:11 PM

for binary that is supposed to be a notepad replacement

This is why Microsoft needs to step up their game and provide users with a tool that will validate all install applications/binaries. Gabe does it, so it’s not impossible.

orcmid March 13, 2017 12:18 PM

These aren’t such bad ideas for privacy-by-design approaches to software as well, especially those that employ security primitives.

Rhys March 13, 2017 12:19 PM

Um. Gee. When you get a marked ‘sensitive’ document, is your first thought to publish it?

Or delete it?

At least establish that its classification has been changed.

ab praeceptis March 13, 2017 12:25 PM

Hahaha! The dos and don’ts were among the first I looked at.

And I enjoyed what they said re. ssl/tls.
Very funny, but shhhh, don’t tell anyone of the laaarge “ssl solves all problems!” crowd or else they will hunt you (they don’t care much about facts; it’s more of a sect thing).

Bobby March 13, 2017 12:42 PM

2015-03-09 16:50 [User #3375388]:

How about: DO NOT write your own crypto, unless you must?

Hee hee. Even the CIA needs to be reminded of this from time to time.

hawk March 13, 2017 1:28 PM

Notice bad guy stuff better if secret, but good guy stuff better if not secret. Proves commercial / academic cryptography = bullshit.

albert March 13, 2017 2:28 PM

Interesting reading indeed.

Assuming that these dos and don’ts are the bible for malware writers, I would assume that a malware’s code and that code’s behavior would make attribution extremely difficult. Especially for state actors. The mere reuse of code proves nothing.

“…Rationale: Maintains consistency across tools, and avoids associations with the United States….”

And that’s the point. Not only can any individual or state use these techniques, but they can implant cues that point to someone else.

. .. . .. — ….

Rien March 13, 2017 2:49 PM

Unexpected hat-tip from the CIA? That NOD Crypto standard recommends reading Practical Cryptography by Schneier and Ferguson.

Bob Dylan's Facial Hair March 13, 2017 3:18 PM

The most important thing about this list is not the DOs and DON’Ts but the rationales supplied. Security advice today is overrun with lists, directions, “top ten” nonsense, and other such rot that treats the user like a puppet doll to be jerked around at the so-called experts’ whim. Anyone who cannot explain WHY an action should be taken and the action’s BENEFIT in less than twenty words should be shown the way to the door to the gulag in Siberia. We are at the meta stage of the game where the best advice one can give is advice on how to tell good advice from bad advice.

So I applaud the CIA for taking the time to spell out the rationales for their behaviors. If only we could force a business like Comodo to do the same.

Ergo Sum March 13, 2017 3:32 PM

@albert…

It would be pretty surprising to me if other state actors don’t have similar/same dos and don’ts in place. And if they do, state actors’ activities, or rather the remnants they leave for forensics, do look very similar. At which point it is anyone’s guess who the state actor had been…

CallMeLateForSupper March 13, 2017 4:16 PM

Well, alrighty then…. policies made and published. Tick! we be done.

Just how widely and precisely are those policies followed?
Is there any audit to measure compliance? If not, why not? If so, are records retained or destroyed and forgotten?

Jonathan Wilson March 13, 2017 4:35 PM

Not sure if it captures ALL binaries but the system file checker tool (sfc.exe) produced by Microsoft is pretty good at detecting and fixing missing/bogus system files.

Lawrence D’Oliveiro March 13, 2017 4:36 PM

The original article describes some of these guidelines as “dated”, while Bruce says they are “good advice”.

ASmith March 14, 2017 1:41 AM

Nearly all of the Gestapo/Stasi-2.0 CIA tradecraft dos and don’ts would be voided by Free and Open Source Software. Most forms of obfuscation would be easily unobfuscated by the NSA, CIA, or Mossad, which have nearly unlimited US taxpayer funds and resources to defeat them, leaving obfuscation useful only against non-state-sponsored attackers. Parts of the CIA dos outlined here appear to have been used in the CIA/Mossad stuxnet, duqu, gauss and flame worm/virus programs, with recognizable Mossad-coded encryption in portions of those linked viruses, which were deliberately spread globally. I wonder how many American industrial workers lost their lives or had their hands and arms crushed by those CIA/Mossad virus-infected Siemens electronic control modules so commonly found across heavy industry in America and globally?

Who? March 14, 2017 4:43 AM

@ Ion

It looks like they need some real programmers. None of the Open Source variety.

Seriously, what is wrong with open source community? We write high quality software!

Who? March 14, 2017 4:52 AM

@ My Info

http://man.openbsd.org/changelist.5

/etc/changelist is used to track a set of files that are important to the OpenBSD operating system. It does not validate the files themselves. In fact, most of these files are intended to be customized by system administrators. It is not a security feature either, as it can be easily defeated by anyone with the right privileges.

Michael Moser March 14, 2017 5:21 AM

Interesting, I bet they have automated tools that check if these recommendations have been followed (at least some of them). I like this document; it provides rationales for each point. That’s way better than most coding conventions.

Ralph Bolton March 14, 2017 7:22 AM

Obfuscating config data in memory (and elsewhere) seems like a good precaution in lots of cases. I’d be interested in the expert opinion here, but I wonder if a simple XOR would be sufficient obfuscation to make forensic in-memory analysis very hard. For strings hard-coded in a binary, the same would make the ‘strings’ tool (and others) miss it, and changing the XOR ‘key’ on each compile or deploy would probably throw off a lot of the tools that look for signatures and the like.

Of course, before believing anything I’m saying you’d need to test it thoroughly because I haven’t 😉

Also, the practice of writing zeros over your data before deallocating the memory for it seems like good advice. I guess relatively easy to do at the low-level before calling free(). I wonder if any high level languages do/will do this as a matter of course? It’d be nice to think that deleting a hash key would zero out the actual RAM previously occupied by the key and value (seeing as that may not cause an actual free() to take place).

Tatütata March 14, 2017 8:34 AM

DO NOT have “dirty words” (see dirty word list – TBD) in the binary.

TBD? What a bummer, I would have liked to see what this list would have included (beyond Carlin’s seven words), or see the minutes of the meetings that produced it and how its members were selected. Let’s invite Gary to our committee, he’s a real expert at cussing. Do they have a pre-processor that bleeps like network TV whenever such a word is detected? Or is this document merely a code-review checklist? (That is, if they have code reviews.)

DO NOT use US-centric timestamp formats such as MM-DD-YYYY. YYYYMMDD is generally preferred.

Alleluia! I also hope that through their intercepts, they learned to love and use metric units. (“veer off rocket 100km”)

but I wonder if a simple XOR would be sufficient obfuscation to make forensic in-memory analysis very hard.

Always XORing with the same byte doesn’t help. It will still show up immediately in a hex dump if you have repeated data, such as a buffer filled with $00 or $20, or tell-tale combinations like $0d$0a (particular to the Winsoft world).
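
Tatütata’s point is easy to demonstrate: XOR maps a run of identical bytes to a run of identical bytes, so a zeroed buffer hands the analyst the key byte directly. The key byte below is arbitrary.

```python
key = 0x5A
zeroed = bytes(16)                         # a zero-filled buffer, common in dumps
obfuscated = bytes(b ^ key for b in zeroed)
# Every output byte IS the key, since 0x00 ^ 0x5A == 0x5A.
assert obfuscated == bytes([key]) * 16

# Tell-tale pairs like CRLF survive as an equally tell-tale repeating pattern.
crlf_block = b"\r\n" * 8
masked = bytes(b ^ key for b in crlf_block)
assert masked == bytes([0x0D ^ key, 0x0A ^ key]) * 8
```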

Couldn'tPossiblyComment March 14, 2017 10:10 AM

Some great advice in that list for general security as well.

I wonder how many people will actually take notice of:

DO NOT solely rely on SSL/TLS to secure data in transit. Rationale: Numerous man-in-middle attack vectors and publicly disclosed flaws in the protocol.

1Password did prior to this being public. I still talk to plenty of engineers and architects who think enabling TLS is a bulletproof shield behind which everything can be sent in plaintext.

Tatütata March 14, 2017 10:17 AM

I still talk to plenty of engineers and architects who think enabling TLS is a bulletproof shield behind which everything can be sent in plaintext.

Or worse. I’m thinking here of “OAUTH 2.0″… I won’t add anything more before I find my barf bucket.

Clive Robinson March 14, 2017 10:23 AM

@ Bruce,

Some of the rationales have “meta-meta-data” like information. For instance

    Rationale: Shortens overall “time on air”: not only the time to get the tool on target, but also the time to execute functionality and clean up.

At first sight it appears benign enough. But it actually gives you information about operational aspects of the tool’s users. Which in turn gives you other insights into that part of the organisation as well…

@ All,

The thing about these dos and don’ts is that they are very broad in scope; nearly all would apply to sensible development of things other than spy/malware.

What I do see as missing are the things Adam Young and Moti Yung developed and put in their Cryptovirology book. I find it odd that such things are missing from the list, especially when it mentions obfuscation/crypto of setup/config files and in-program data.

Thus the question is whether this is a novice/beginner’s or contractor’s list, and whether there is a more interesting NOBUS insider list.

Clive Robinson March 14, 2017 10:41 AM

@ Couldn’tPossiblyComment,

I wonder how many people will actually take notice of…

Darn few. It’s the same problem with all security end points and an openly accessible comms channel…

albert March 14, 2017 12:01 PM

@Ergo Sum,

…and not even state actors. -Anyone- writing malware should use these procedures. That’s the scary part that can affect most people. Although technology applies to all, dealing with state actors is mostly geo-political and strategic. Attribution in individual cases is often international in scope, but stopping those individuals is a prime concern for private citizens (like us).

No matter where we live, we’re all in this together.

. .. . .. — ….

Hang em High March 14, 2017 12:48 PM

Re Vault7 provenance: With the SCO members explicitly allied and acutely aware they’re facing war with the US, how do they prepare?

Winning the war is the easy part. When China and Russia stop US aggression, they must be able to decapitate CIA publicly and juridically. Or else an armistice or nonaggression pact is not worth the paper it’s written on. CIA will just morph and come after them again and again and again. So what do they need?

(1) CIA torture records documenting elements of crimes to ICC standards;

(2) Nuremberg Count 1, the plan or conspiracy for war: material showing the purpose of systematic and widespread torture was to fabricate a casus belli for aggression in multiple nonbelligerent countries;

(3) Documented CIA command responsibility for drone murder war crimes for, at the minimum, SCO member Pakistan.

Does anybody think that OPM adjudication dirt is all that China got? These are the thoroughest people in the thoroughest country in the world. CIA got their asses pwned and they’re losing their shit. That’s where Vault7 comes from, and it’s only a teaser. They’re holding onto the best stuff. That’s for the tribunal.

Craig McQueen March 14, 2017 5:18 PM

DO use variable size and timing (aka jitter) of beacons/network communications. DO NOT predictably send packets with a fixed size and timing.

Rationale: Raises the difficulty of network analysis and correlation of network activity.

I thought timing jitter would increase the ability of analysis by correlation of network activity. Timing jitter would create a unique fingerprint of the data that can make it easier to trace it through network routers, Tor, etc. At least that’s what I would instinctively think; I don’t know much about Tor or gov’t abilities to track data by such network oversight.

Carbon-14 March 14, 2017 5:31 PM

@Hang em High,

True or not, it changes absolutely nothing.

You admit, hydra will live on and on and on.

You’re busy cannibalizing the remains now. 😉

Carbon-14 March 14, 2017 5:35 PM

I’ll tell you the same thing I’ve told a thousand others:

Any good idea is just like a virus. (Think about it, it’s pretty catchy.)

Good, of course, is relative, and as per whatever type of RNA Class I, II, III or IV bug we were or weren’t referencing, I might add that by tempting one’s self with others’ feces one COFF runs the risk of catching worms.

😉

Bon’ Apetit.

Dan H March 15, 2017 6:47 AM

@Outtern
CIA’s crimes against humanity? That is absurd.

Where’s your condemnation of Kim? He’s just plain nuts.

Of Putin? He kills political rivals.

Of Hillary who started a war in Libya where it is more dangerous than before?

Of the Mexican drug cartel leaders? Yes, the drug cartels have done worse than the CIA and do so everyday. In some areas of Mexico the people have started their own patrols with roadblocks to protect their towns from cartels. They behead people and commit mass murder. You do realize the CIA also works to stop drugs in Mexico, Colombia, and other countries?

C U Anon March 15, 2017 9:01 AM

@Dan H:

You do realize the CIA also works to stop drugs in Mexico, Colombia, and other countries?

To be honest we don’t know what the CIA does, not even the CIA. If you have a look into known history you will find US involvement in drug shipping via the likes of the CIA.

I suspect that parts of the CIA lust after those “Glory Days” when they could overthrow countries’ leaders for those that US corporates approved of. All so cheap resources keep flowing into the US etc. to maintain the overly high lifestyles, and US political parties got their kickbacks as they looked the other way. So shipping drugs for cartels etc. certainly used to be on the CIA list of ways to get funds to supply small organisations with serious weapons, so that they would destabilize or overthrow governments that did not kiss US corporate feet on demand.

Which of course means that the question is what dirty dealing by the CIA is going to get exposed next, and will the politicos do anything other than lift a corner of the carpet to sweep it under?

Mindraker March 15, 2017 5:28 PM

“US-centric” issues like dates, timestamps, and terms are waaaaaaay beyond the average user. We’re still dealing with inches and feet over here. People just have no idea how much the world has left them behind.

Jack March 15, 2017 9:50 PM

@ ab praeceptis,

From the looks of it, I guess they are very fond of extremely long switch-case statements.

Clive Robinson March 15, 2017 11:23 PM

@ Mindraker,

We’re still dealing with inches and feet over here. People just have no idea how much the world has left them behind.

Some of us still need to use fathoms, nautical miles, rods, poles, perches, furlongs, links, chains and Ramsden’s chains, for surveying.

Then there are “barleycorns” used for shoe sizes, and all the weights, liquids and dry goods measures.

GreenSquirrel March 17, 2017 10:41 AM

@Craig McQueen

I thought timing jitter would increase the ability of analysis by correlation of network activity.

It depends what you are trying to evade.

If you send out a regular beacon, then it is easy to detect on edge services (e.g. firewall logs). The regular heartbeat of malware is one of the main things people look for in their SIEMs (or they should be doing this).

If you jitter the beaconing, then it is less likely to get spotted at the perimeter but, as you say, may increase the risk of correlation between malware samples. However, if you avoid detection this risk might not matter.
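
GreenSquirrel’s point can be sketched numerically: the defender’s heuristic is roughly “are the inter-arrival times suspiciously uniform?”, which is exactly what jitter defeats. The function name and threshold below are illustrative, not drawn from any real SIEM product.

```python
import random
import statistics

def beacon_like(timestamps, cv_threshold=0.1):
    """Flag a connection whose inter-arrival times are nearly constant,
    i.e. whose coefficient of variation falls below a threshold. This is
    a common (simplified) beacon-detection heuristic."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.stdev(gaps) / statistics.mean(gaps) < cv_threshold

# A fixed 60-second heartbeat is trivially flagged...
fixed = [i * 60.0 for i in range(20)]
assert beacon_like(fixed)

# ...while the same average rate with +/- 50% jitter is not.
rng = random.Random(7)
jittered, t = [], 0.0
for _ in range(20):
    t += rng.uniform(30, 90)
    jittered.append(t)
assert not beacon_like(jittered)
```

As Craig McQueen notes above, the same variability can become a correlation fingerprint in other contexts, so which risk matters depends on what you are evading.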

D. Alexander August 27, 2017 8:45 PM

I took 11 computer security classes through the DOD (unfortunately I have forgotten most of it). Even though the classes were declassified (no pun intended), it seemed that military computers are not as well protected as they should be. Setting up a DMZ with outside servers, with the hardware firewall on the outside and a software firewall on the inside, seems like a good idea to stop application-proxy spoofing attacks, but these systems still have some problems. They are vulnerable to user-authentication flooding attacks that use up bandwidth and crash systems, and to botnets set up on insecure machines to run a DDoS. Military computers are more likely to be targeted by a DDoS than computers in the private sector. Also, tunneling with HTTPS/SSL encryption rather than some form of VPN is probably a bad idea. A hacker can spoof an out-of-date and unpatched version of SSL, and often communicating with the most authentic version will cause the newer version to trust the older one and pass the keys downward… NMAP and other tools actually have exploits built in to do this.

I agree that encryption is important, but being as code-challenged as I am, I have not installed it yet. Some email clients seem to have at least SSL. Codes can be broken by hashing code variations with rainbow tables, but rainbow tables do not work on all codes. Encryption can be broken, but code that is not crackable with rainbow tables is almost impossible to crack without finding vulnerabilities in the systems that create the code themselves.
