version 2019-08-19 by Artem Pronichkin
This is about creating great tools (including scripts.) If you develop, maintain, or own a tool or utility, repeatedly ask others to use some tool, or build requirements for the tools used in your processes, please consider the criteria below.
These requirements are based on years of experience working with the most security-cautious customers, such as banks and government agencies across the world.
In short, all of the below can be reduced to: do not require or assume more than needed, and document everything that is truly required.
When supporting customers or working with commercial support organizations on behalf of customers, it's quite common to be asked to run a certain tool to troubleshoot or gather logs. It sounds very easy to support people, and hence they tend to “overrequest”—that is, ask customers to run more tools than are needed at a given point.
Unfortunately, not all tools are created equal, and some of them might be difficult to run in certain environments—especially fairly locked-down ones. While certainly possible, a request to run such tools might delay troubleshooting significantly, because it would require changing the production system configuration (e.g. including a specific binary in a whitelisting policy.) Hence, we should avoid such tools (that is, tools which fail to comply with the requirements provided below)—unless it is absolutely necessary and reasonable to proceed.
I'm not saying that each and every customer will impose the constraints listed below. In fact, most customers will not, or they will have only one or two of these requirements. However, the list below is meant to be comprehensive and failsafe—that is, if you manage to comply with all of these requirements, you can be confident your tool won't face any obstacles and can be recommended for broad use.
These requirements are not specific to troubleshooting or diagnostic tools. I mention those types of tools first because I have personally been bitten by this many times before. When you have a time-sensitive problem and call for support, the last thing you want is to be asked to run a tool which fails to execute, so that instead of troubleshooting the actual problem, you start troubleshooting the tool. And in the end that might require changes to the production environment, which are often not easy (either technically or operationally) or might impact the repro.
However, these requirements are in fact universal and can (or even should) be applied to any tools or utilities that we, as an industry, expect customers to run—such as monitoring, hardware configuration, audit assessment, system deployment and configuration, etc.
Let me put this straight. If your tool fails to comply with the requirements listed below, running it might be painful and take a long time for certain customers. They might spend hours or even days figuring out how to make the tool work. Even if it's not precisely your fault (the tool works as designed and as needed), you could probably make life easier for all parties if you were at least aware of the constraints of some customers' environments.
That's the whole point of this writeup. It's really not to put blame on anyone. (Certainly not on you.) It's to raise awareness.
Last but not least, I am a Windows guy and I work for Microsoft. Hence the explanations below are sometimes Windows-centric and mention several Windows-specific technologies as examples. However, the nature of the various constraints imposed by customers is really cross-platform, regardless of the operating system or tools being used. Please consider the requirements below even if your tools are intended to run on a different operating system—though you might need to substitute some technology names with their equivalents or counterparts.
Ideally, the tool should come inbox with the operating system (or other product that it's intended to work with—e.g. database software.) Even if you plan to update it frequently, having the initial version inbox in the first place (merely as a placeholder) helps include it in a whitelisting policy. In fact, anything coming inbox will likely be whitelisted by default. And due to the way many whitelisting solutions work, that would automatically cover future versions.
If you cannot ship your tool inbox, that's of course understood and totally fine. However, in this case you at least need to comply with the following publishing requirements.
You can casually examine the values of such fields in the “File Properties” box in Windows Explorer, or by exploring the “VersionInfo” property in PowerShell.
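For example, a quick inspection might look like the following sketch (the file path is illustrative, not a real tool):

```powershell
# Inspect the version resource of a binary (the path below is a placeholder)
$file = Get-Item -Path 'C:\Tools\MyTool.exe'
$file.VersionInfo |
    Format-List -Property ProductName, CompanyName, FileDescription, FileVersion, ProductVersion

# The digital signature can be examined the same way
Get-AuthenticodeSignature -FilePath $file.FullName |
    Format-List -Property Status, SignerCertificate
```

A signature `Status` of `Valid` is what whitelisting policies built on publisher rules ultimately depend on.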
On the other hand, the CAB format is known not to support folders. So, if your tool relies on a complex directory structure, please evaluate other options—such as MSI or MSIX.
All of the above requirements are imposed by whitelisting solutions such as AppLocker or Windows Defender Application Control (WDAC), also known as “Device Guard User Mode Code Integrity” (UMCI).
If you are not familiar with WDAC, you can learn about it here. However, you don't have to become a WDAC expert. If you comply with the requirements listed above, you can rest assured that your tool does its best to comply with WDAC.
If you happen to ship a kernel-mode driver (which is very uncommon for troubleshooting tools), please make sure you comply with HVCI requirements as well.
The above requirements can be skipped only if the tool is a very simple PowerShell script which can be executed in Constrained Language Mode. If your script is fully functional under Constrained Language Mode, the above requirements are “nice to have”—but not strictly necessary.
If you are not sure whether your script supports Constrained Language Mode, please do not assume it does.
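One simple way to find out is to downgrade a test session and run the script there. A minimal sketch (the script path is a placeholder; note that lowering the language mode this way is a testing convenience, not a security boundary):

```powershell
# Check which language mode the current session runs in
$ExecutionContext.SessionState.LanguageMode

# For testing, downgrade the session. This is effectively one-way:
# once a session is genuinely locked down, you cannot raise the mode back.
$ExecutionContext.SessionState.LanguageMode = 'ConstrainedLanguage'

# Now exercise your script and see what breaks, e.g.
# .\MyScript.ps1   # (illustrative path)
```

Typical casualties under Constrained Language Mode are `Add-Type`, COM objects, and arbitrary .NET method calls, so scripts that stick to cmdlets and simple expressions fare best.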
The tool has to support the Server Core installation option. This typically means no GUI assumptions or dependencies—or, more precisely, no explorer.exe (the Windows shell), no WPF, no hardware graphics acceleration, etc.
You can learn more about Server Core here. Please note that it's not a separate edition or “SKU” of Windows. It is an installation option available in all editions of Windows Server (Standard and Datacenter), and it is recommended by default for most use cases.
Unfortunately, as of today, there's no easy way to check whether your tool is compatible with Server Core (other than testing it.) But we're working on it and appreciate your diligence.
The above does not mean your tool has to be command-line only. You are very welcome to provide a GUI—as long as it's not the only mode of operation. However, all functionality should also be available from the command line.
It is very beneficial if the tool supports connecting to remote systems. In fact, that's one of the recommendations provided below. However, you should not assume that fresh credentials are always required for that (or for any other operations.)
There may be scenarios where the current user cannot supply credentials on demand—especially if you prompt for a user name and password and do not support smart cards. Some users are “smartcard-only,” which means they have no password whatsoever. (Well, technically, they still have a password—however, they do not know it.) Another somewhat similar scenario is Remote Credential Guard, where the user is not supposed to enter any credentials interactively, even though they technically can.
For these reasons, the tool should always try using the current user's credentials first. (That is typically a Kerberos ticket obtained from the current logon session—however, normally you do not need to code anything special for this.) Only prompt for credentials optionally, or if everything else fails.
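In PowerShell this pattern usually amounts to making the credential parameter optional, as in this sketch (the computer name is illustrative):

```powershell
# Sketch: accept a credential only optionally; default to the caller's
# current logon session (typically a Kerberos ticket, no prompt needed)
param
(
    [System.Management.Automation.PSCredential]
    $Credential
)

$session = if ($Credential)
{
    # Explicit credentials were supplied by the caller
    New-PSSession -ComputerName 'Server01' -Credential $Credential
}
else
{
    # No prompt: works for smartcard-only users and Remote Credential Guard
    New-PSSession -ComputerName 'Server01'
}
```

The important part is the default branch: the tool never demands credentials up front, it merely accepts them when the caller chooses to provide some.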
Some customers operate fully air-gapped (that is, isolated) networks. For this reason, the tool should not assume it can always download components from the Internet, or upload diagnostic data directly to the vendor. This behavior can be optional, and of course many customers would appreciate it. However, there should also be an option to supply all the required components (e.g. baselines, metadata, schemas, updates, etc.) offline, and to emit results as a set of files which can be transferred to the vendor manually if needed.
Generally, the number of dependencies should be as small as possible. One approach to achieve that is making the tool modular. E.g. it's certainly helpful if you support gathering data using several external tools (e.g. Sysinternals.) But it also helps if these operations are optional and can be omitted. Of course, running with reduced functionality will gather less data—but it might still be enough for some cases, or at least at the beginning. So, please do not assume you can only do “all or nothing.”
The reason for this requirement is, again, application whitelisting. If your tool is a simple script (which, let's say, grabs some logs), it might not need to be included in the whitelisting policy at all, and hence would not require policy changes. However, if any dependency is a binary, it will certainly require a policy modification. Any such occasion imposes additional operational overhead and might slow down or complicate troubleshooting efforts (which may be time-sensitive.)
Even if your tool needs to be whitelisted by itself, adding fewer external items to the policy is often faster and easier to justify than adding a large number of dependencies with opaque scope.
No dependency should be taken for granted. Each one needs to be explicitly documented (in the tool's readme, an accompanying email template, etc.) This even applies to dependencies on inbox components, such as PowerShell modules or the 32-bit subsystem (WoW64.) Even though these components are included by default, if they can be uninstalled, there will be customers who have removed them for one reason or another.
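Beyond documenting a dependency, it helps to declare it in the script itself and fail early with an actionable message. A sketch (the module name is illustrative):

```powershell
# Declare the dependency so PowerShell refuses to start the script without it
#Requires -Modules 'Storage'

# Alternatively, probe for it yourself and fail with a helpful message
if (-not (Get-Module -Name 'Storage' -ListAvailable))
{
    throw 'The inbox "Storage" PowerShell module is required. Please see the readme for prerequisites.'
}
```

Either way, the customer learns what's missing immediately, instead of hitting an obscure error halfway through a diagnostic run.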
The tool needs to support at least one of the following. (Bonus points if you manage to satisfy both requirements.)
Todo: add more technical details on what's available and what's not. Such dependencies should generally be avoided—unless clearly necessary and unavoidable. (In which case, the requirements should be documented.)
It is understood if your tool requires local Administrator permissions (or equivalent) to configure something on the machine. However, please be mindful about your requirements. If there are some operations that do not actually require those permissions (e.g. just analyzing the current configuration), you should allow your tool to run as a standard user. In either case, please document the requirements explicitly—especially if you require special privileges (such as “Debug Programs” or “Generate Security Audits”.) Do not assume every user who runs your tool can elevate their privileges on the fly.
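One way to honor this is to detect elevation at startup and degrade gracefully instead of refusing to run. A minimal sketch:

```powershell
# Detect whether the current process is elevated, then adapt instead of exiting
$principal = [Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()
$isAdmin   = $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)

if (-not $isAdmin)
{
    # Hypothetical behavior split: analysis still runs, configuration is skipped
    Write-Warning 'Running without Administrator permissions. Configuration changes will be skipped; read-only analysis will still run.'
}
```

Compare this to adding `#Requires -RunAsAdministrator` at the top of a script, which blocks even the read-only scenarios that never needed elevation in the first place.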
One specific example is when your tool runs on one machine to configure remote systems (as explained in 7a above.) In this case it's almost certain that your tool does not need Administrator permissions on the “management” machine. And if some of your customers have adopted certain Credential Theft Mitigation (CTM) best practices (such as “Privileged Access Workstations,” or PAWs), it means they definitely won't have Administrator permissions on their local machines (even though they are administrators on the remote machines being managed.)
It is fine if your tool requires Administrator permissions for installation and updating. Just do not assume the same permissions are available all the time even for running the tool, and do not require process elevation unless actually needed.
If possible, offer an automated distribution mechanism for your tool which is native to the operating system (e.g. Microsoft Store) and/or the runtime environment (e.g. PowerShell Gallery.)
Depending on the nature and target environments of your tool, this may or may not be the only way to obtain it. E.g. the Microsoft Store is not available on Windows Server, and it also cannot be easily used by customers who run air-gapped networks. However, it is a perfect solution for many users running Windows 10. So, it's perfectly fine if you provide multiple alternative installation vehicles. Having more options is generally better than fewer.
Besides obvious benefits such as automatic updating offered by the Store, packaging for it (or for the PowerShell Gallery) implicitly makes you comply with the digital signature requirements listed above.
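For the PowerShell Gallery, the publishing and consumption sides are both a single cmdlet. A sketch (the module name and API key are placeholders):

```powershell
# Publish a module to the PowerShell Gallery
# ('MyDiagnosticTool' and $apiKey are illustrative placeholders)
Publish-Module -Name 'MyDiagnosticTool' -NuGetApiKey $apiKey

# Customers can then install and update it natively
Install-Module -Name 'MyDiagnosticTool'
Update-Module  -Name 'MyDiagnosticTool'
```

`Save-Module` is also worth mentioning in your docs: it downloads the module to a folder without installing it, which is exactly what air-gapped customers need for offline transfer.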