Adversary Engagement, A Practical Guide by MITRE
MITRE | Engage
Table of Contents
Introduction
  1.1 Cyber Denial, Deception, and Adversary Engagement
  1.2 Getting Started with Adversary Engagement
MITRE Engage™ Matrix
  2.1 The Structure of Engage
    2.1.1 Strategic and Engagement Actions
    2.1.2 Engage Goals
    2.1.3 Engage Approaches
    2.1.4 Engage Activities
  2.2 ATT&CK® Mappings
  2.3 Operationalizing the Engage Matrix
  2.4 Integrating the Engage Matrix into Your Cyber Strategy
Elements of an Adversary Engagement Operation
  3.1 Element 0: Operational Objective
  3.2 Element 1: Narrative
  3.3 Element 2: Environment
    3.3.1 Isolated Environments
    3.3.2 Integrated Environments
    3.3.3 Relationship Between Pocket Litter and Environment
    3.3.4 Pocket Litter versus Lures
  3.4 Element 3: Monitoring
  3.5 Element 4: Analysis
Adversary Engagement Operational Security (OPSEC)
The Process of Adversary Engagement
  5.1 Step 1: Assess knowledge of your adversaries and your organization
  5.2 Step 2: Determine your operational objective
  5.3 Step 3: Determine how you want your adversary to react
  5.4 Step 4: Determine what you want your adversary to perceive
  5.5 Step 5: Determine presentation channels to engage with the adversary
  5.6 Step 6: Determine the success and gating criteria
  5.7 Step 7: Execute the operation
  5.8 Step 8: Turn raw data into actionable intelligence
  5.9 Step 9: Feedback intelligence
  5.10 Step 10: Analyze successes and failures to inform future operations
List of Figures
Figure 1: Adversary engagement is the use of cyber denial and deception, in the context of strategic planning and analysis, to drive up the cost while driving down the value of the adversary's operations.
Figure 2: Operators viewing denial and deception activities through the lens of planning and analysis can identify opportunities for adversary engagement.
Figure 3: The MITRE Engage Matrix bookends engagement activities with strategic planning and analysis, to ensure that every action is goal driven.
Figure 4: Strategic Actions are taken to support your operational strategy
Figure 5: Engagement Actions are the actions taken against the adversary that are more often associated with cyber deception.
Figure 6: The Engage goals
Figure 7: Intelligence gathered before and after nine elicitation operations
Figure 8: Engage Approaches are the ways in which you drive progress towards a selected goal
Figure 9: By mapping to ATT&CK, Engage ensures that each engagement activity is driven by real adversary behavior.
Figure 10: Use the MITRE Engage Cycle to integrate Engage into your cyber strategy
Figure 11: Questions to consider when thinking about adversary engagement operations
Figure 12: The MITRE Engage 10-Step process of adversary engagement
Figure 13: Organization template for the revealed and concealed facts and fictions

List of Tables
Table 1: Summary of the Engage Approaches
Table 2: Template Mission Essential Task List for self-infection elicitation operations
Introduction
1.1 Cyber Denial, Deception, and Adversary Engagement
Cyber defense has traditionally focused on the use of defense-in-depth technologies to deny the adversary
access to an organization’s networks or critical cyber assets. In this paradigm, any time the adversary can
access a new system or exfiltrate a piece of data from the network, they win. However, when a defender
introduces deceptive artifacts and systems, it immediately increases ambiguity for the adversary. Is the
system the adversary just accessed legitimate? Is the piece of data the adversary just stole real? Questions
such as these begin to drive up the cost to operate, while driving down the value of the adversary’s cyber
operations.
Figure 1: Adversary engagement is the use of cyber denial and deception, in the context of strategic planning and analysis, to drive up the cost while driving down the value of the adversary's operations.
Cyber Denial is the ability to prevent or otherwise impair the adversary’s ability to conduct their operations.
This disruption may limit their movements, collection efforts, or otherwise diminish the effectiveness of their
capabilities. In Cyber Deception we intentionally reveal deceptive facts and fictions to mislead the
adversary. In addition, we conceal critical facts and fictions to prevent the adversary from forming correct
estimations or taking appropriate actions. When cyber denial and deception are used together, and within
the context of strategic planning and analysis, they provide the pillars of Adversary Engagement.
The main goals of adversary engagement can be any combination of the following: to expose adversaries
on the network, to elicit intelligence to learn more about the adversary and their Tactics, Techniques, and
Procedures (TTPs), or to affect the adversary by impacting their ability to operate. Adversary engagement
operations provide opportunities for the defender to demonstrate tools, test hypotheses, and improve their
threat models, all with the added benefit of negatively impacting the adversary.
Figure 2: Operators viewing denial and deception activities through the lens of planning and analysis can identify opportunities
for adversary engagement.
Denial, deception, and adversary engagement technologies are far from novel. Honeypots, or dummy
systems set up for the sole purpose of studying adversary interactions, have been in use for decades.
However, they often suffer from the shortcomings of being unrealistic, easily signatured, and general-purpose. In contrast, the engagement environments used in adversary engagement operations are
carefully tailored, highly instrumented environments designed on an engagement-by-engagement basis.
Often, these environments are seamlessly integrated into a production network. However, these
environments need not be complex or even highly technical. Suppose an organization is worried about
intellectual property theft. This organization could set up a series of shared directories across the corporate
network that are accessible and contain apparently sensitive, but fake, corporate data. If these directories
are accessed, the defender will receive a high-fidelity alert. Anyone in the organization can access the
information; however, if no one in the organization has a business need to access this sensitive data, any
user activity associated with the data warrants a review. These simple decoy directories are an example of
utilizing adversary engagement as part of a larger intellectual property protection strategy.
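As a minimal sketch of how such a high-fidelity alert could be wired up, the snippet below takes a baseline of a decoy directory's file access times and flags any file that has since been touched. The directory and filenames are invented for illustration, and access-time polling is only a stand-in: many systems relax atime updates, so a production deployment would hook proper file-access auditing (for example, Windows object-access auditing or Linux auditd) instead.

```python
import os

def snapshot(decoy_dir):
    """Record the last-access time of every file in the decoy directory."""
    return {
        os.path.join(decoy_dir, name): os.stat(os.path.join(decoy_dir, name)).st_atime
        for name in os.listdir(decoy_dir)
    }

def accessed_decoys(decoy_dir, baseline):
    """Return decoy files whose access time moved past the baseline.

    Any hit is a high-fidelity alert candidate: no one has a business
    reason to open these files, so any access warrants review.
    """
    current = snapshot(decoy_dir)
    return sorted(p for p, atime in current.items() if atime > baseline.get(p, atime))
```

The defender would take a snapshot at deployment time and periodically compare against it; the alerting and review workflow around these two functions is where the real operational design lives.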
Adversary engagement is an iterative, goal driven process, not merely the deployment of a technology
stack. It is not enough to deploy a decoy and declare success. Rather, organizations must think critically
about what their defensive goals are and how denial, deception, and adversary engagement can be used to
drive progress towards these goals. Unlike other defensive technologies, such as antivirus (AV), adversary
engagement technologies cannot be considered “fire and forget” solutions. In the previous decoy directory
example, while a relatively trivial operation, the defender still needed to think carefully about their goals
when designing the engagement.
Let’s continue to explore this example to understand how goals shaped and drove the operational design.
The company was worried about a particular threat. In this case, the threat was a malicious insider. They
identified critical cyber assets of value to their adversary. Specifically, they were concerned about the
valuable intellectual property associated with the latest R&D project. Using these facts, the organization
was able to create decoy directories that would be appealing to the adversary, by filling the decoys with
fake results from the latest R&D tests. The organization also identified key locations likely to be noticed by
anyone poking around the network. In this case, they chose the CEO’s private SharePoint. With few
resources, this exemplar company was able to deploy deceptive artifacts that will likely result in high fidelity
alerts to insider threat behavior. As a reminder, this should not be the totality of this company’s insider
threat protection program. Rather, this adversary engagement operation should augment the larger
strategic goals of the program.
While it is easy to imagine the value of successfully deceiving the adversary over a long period of time, given the current state of adversary engagement tooling, especially automated tooling deployed at scale, the
adversary may eventually recognize the deception. However, even a discovered deception may still provide defensive benefits. For
example, the adversary may now doubt the validity of any compromised system on your network.
Additionally, the defenders learn what steps the adversary takes to verify a system and can use that
information both to identify malicious activity and to better tailor environments in future operations.
It is important to recognize that adversary engagement operations are scalable. This guide will walk
through a template for more complicated team operations. However, the basic idea of an adversary
engagement operation is simple, as illustrated in the decoy directory example. Additionally, while deception-for-detection use cases are often associated with low-complexity deceptions, they are not the only simple
adversary engagement opportunities. All types of operations can be run effectively, even by teams
with limited experience or resources. For example, take a computer, connect it to a network, run the
adversary's malware, and study the result: you could start an elicitation operation with nothing more than a laptop and a non-enterprise Wi-Fi connection. As your adversary engagement goals and capabilities mature, operations can
grow quickly in complexity. However, it is important to remember that, as you learn, so will your
adversaries. That single laptop probably won’t get you far, but it will give you a chance to get your feet wet
and begin to experiment with how adversary engagement strategies could inform your organization’s
defensive strategy.
Figure 3: The MITRE Engage Matrix bookends engagement activities with strategic planning and analysis, to ensure that every action is goal
driven.
Figure 4: Strategic Actions are taken to support your operational strategy
Figure 5: Engagement Actions are the actions taken against the adversary that are more often associated with cyber deception.
By dividing the Matrix into Strategic and Engagement categories, we highlight the importance of
approaching adversary engagement as a thoughtful and intentional process, guided by the defender’s
goals.
Figure 6: The Engage Goals help defenders drive progress towards strategic outcomes
The Prepare and Understand goals focus on the inputs and outputs of an operation. While the Matrix is
linear, it is essential to understand that within and across operations, this process is cyclical. For each
operation, you will be continuously iterating and improving as the operation progresses. This iterative cycle
will demand that you constantly think about how to Prepare for the next set of operational actions and how
to Understand operational outcomes in the context of your larger cyber strategy. This process ensures that
you are constantly aligning and realigning your actions to drive progress towards your Engagement Goals.
The Engagement Goals are Expose, Affect, and Elicit. These Goals focus on actions taken against your
adversary. Let’s explore each of these Goals in more detail.
We can Expose adversaries on the network by using deceptive activities to provide high fidelity alerts when
adversaries are active in the engagement environment. There is often overlap between practices that are
considered good cyber hygiene and techniques used to expose adversaries on the network. For example, if
the defender collects and analyzes system logs and identifies malicious behavior, that is not an example of
using the Engage activity System Monitoring to Expose the adversary. That is just good cyber security
practice! However, if a defender places decoy credentials on the network, and monitors the system logs for
the use of those credentials elsewhere in the network, that is an example of using Lures and System
Monitoring to Expose the adversary using adversary engagement activities. It should be mentioned that, at
the time of writing this handbook, most of the commercial vendor offerings focus on deception to Expose
adversaries on a system.
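The decoy-credential pattern above can be reduced to a very small amount of logic once the credentials are planted. The sketch below assumes a simplified event shape; the account names and the dict keys are invented stand-ins for whatever your log pipeline actually emits (for example, parsed Windows Security logon events 4624/4625).

```python
# Honeytoken account names planted as Lures; these names are invented examples.
DECOY_ACCOUNTS = {"svc_backup_legacy", "da_jsmith"}

def decoy_credential_hits(auth_events):
    """Filter authentication events down to uses of decoy credentials.

    `auth_events` is an iterable of dicts like
    {"user": "...", "src_host": "...", "outcome": "success"}, a simplified
    stand-in for real SIEM records. Legitimate users never authenticate
    with a decoy account, so every hit is a high-fidelity alert.
    """
    return [e for e in auth_events if e.get("user") in DECOY_ACCOUNTS]
```

The value of the technique is in the planting, not the matching: the decoy credentials must be placed where an adversary harvesting credentials would plausibly find them.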
We can Affect adversaries by having a negative impact on their operations. Affect activities are ultimately
about changing the cost-value proposition in cyber operations for the adversary. The defender may want to
increase the adversary’s cost to operate or drive down the value they derive from their operations. For
example, the defender can negatively impact the adversary’s on-network operations to drive up the
resource cost of doing operations by slowing down or selectively resetting connections to impact
exfiltration. This type of activity increases the adversary’s time on target and wastes their resources. To
drive down the value of stolen data, a defender could provide an adversary deliberately conflicting
information. Providing such information requires the adversary to either choose to believe one piece of data
over another, disregard both, collect more data, or continue with uncertainty. All these options increase
operational costs and decrease the value of collected data. It is important to note that we limit all Affect
activities to the defender’s network. We are NOT talking about hack back or any activities in the adversary’s
space. This distinction is important to ensure that our defense activities are legal! However, this does not
mean the impact of our activities are confined to just the time the adversary is operating within our network.
For example, if we provide the adversary with false or misleading data, we may impact their future
operations by influencing capability development, targeting, and/or infrastructure requirements. This type of
information manipulation of our own data is legal and can have longer term impact on adversary behavior.
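To make the "slow down exfiltration" idea concrete, here is a toy pacing function. It is not a real network control: `send_fn` stands in for a socket or proxy write, and the chunk size and delay are arbitrary illustrative values. In practice this decision would live in a proxy or network appliance and apply only to flows already judged suspicious.

```python
import time

def throttled_send(send_fn, payload, chunk_size=512, delay_s=0.25):
    """Relay a payload in small chunks with an artificial pause between them.

    A toy illustration of the Affect idea of slowing suspected exfiltration
    inside your own network: the data still flows, but the adversary's
    time-on-target and resource cost go up.
    """
    for i in range(0, len(payload), chunk_size):
        send_fn(payload[i:i + chunk_size])
        time.sleep(delay_s)
```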
We can Elicit information about the adversary to learn about their tactics, techniques, and procedures
(TTPs). By creating an engagement environment that is uniquely tailored to engage with specific
adversaries, the defenders can encourage the adversary to reveal additional or more advanced
capabilities. To do this, the defender may need to use a combination of documents, browser artifacts, etc.
to reassure an adversary and reduce suspicion, while adding enticing data and exploitable vulnerabilities to
motivate the adversary to operate. Observing an adversary as they operate can provide actionable cyber
threat intelligence (CTI) data to inform the defender’s other defenses. Many years ago, when MITRE began
our adversary engagement program, we used malware self-infection operations to form the basis of most of
our CTI knowledge. As seen in Figure 7, in the set of operations that MITRE ran, we gained on average 40
additional IOCs, adversary files, signatures, or new decoders per operation. While some of these
operations require a higher level of technical maturity (such as writing custom decoders), you can easily
get valuable IOCs and adversary files just by running and observing an operation.
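The kind of gain summarized in Figure 7 can be measured with nothing more elaborate than set arithmetic over your indicator store before and after an operation. The indicator strings below are invented examples.

```python
def new_indicators(before, after):
    """Indicators present after an operation that were unknown before.

    `before` and `after` are sets of IOC strings (hashes, domains, IPs);
    the set difference is the raw intelligence gain from the operation.
    """
    return set(after) - set(before)
```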
At this point, it is important to highlight the difference between these Engagement Goals, your operational
objective, and your organization’s larger adversary engagement goals. For a given operation, you will
define one or more operational objectives. These objectives are the specific, measurable actions that will
enable you to accomplish your organization’s larger adversary engagement goals. For example, your goal
may be to fill in a known gap in your knowledge of a specific APT. Over the course of one or more
adversary engagement operations, you can make progress towards this goal by setting operational
objectives such as “Identify at least X many new indicators” or “engage the adversary to obtain a second
stage malware file.” Goals set direction; objectives take steps in that direction. It is also important to
remember that, while we identify three high level Engagement Goals, where adversary engagement fits into
your organization’s cyber strategy may mean you have organizational goals that fall on a spectrum or build
on one another. For example, an operation might allow the adversary to engage freely for a time to elicit
new TTPs, before identifying an opportunity to influence or control an adversary’s actions to meet a larger
strategic goal for your organization. When using the Engage Matrix to plan an operation, remember that
these categories are intended to provide a language through which your goals can be clearly articulated
and not to limit or box in your use of adversary engagement operations.
Figure 8: Engage Approaches are the ways in which you drive progress towards a selected goal
Strategic Approaches help you to focus on the steps you must complete before, during, and after an
operation to ensure that your activities are aligned with your overall strategy. Strategic Approaches help
ensure that your operations of today inform your operations of tomorrow.
Engagement Approaches help you to identify what actions you would like to take against your adversary
and help you to drive progress towards that impact. As seen in Figure 8, Engage outlines 9 approaches to
drive progress towards the various goals.
Table 1: Summary of the Engage Approaches

Plan: Identify and align an operation with a desired end-state
Collect: Gather adversary tools, observe tactics, and collect other raw intelligence about the adversary's activity
Detect: Establish or maintain awareness of adversary activity
Prevent: Stop all or part of the adversary's ability to conduct their operation as intended
Direct: Encourage or discourage the adversary from conducting their operation as intended
Figure 9: By mapping to ATT&CK, Engage ensures that each engagement activity is driven by real adversary behavior.
Figure 10: Use the MITRE Engage Cycle to integrate Engage into your cyber strategy
Figure 10 outlines the MITRE Engage Cycle and illustrates how the Engage Matrix can be operationalized.
This cycle has no defined beginning or end, but for the sake of walking through the model, we will start with
collecting raw data from sensors. This collection is tool agnostic—it simply refers to your collection methods
regardless of how that collection happens. These collection tools can range from low-cost solutions such as
collecting Windows System Monitor (Sysmon)/Auditd/etc., to vendor Endpoint Detection and Response
(EDR) solutions. The next step in the cycle is analyzing raw data in the context of existing CTI data. Here
you can use tools such as MITRE ATT&CK to contextualize this new data. By analyzing adversary actions
and comparing this data to past behavior, the defender can identify patterns, such as common actor TTPs,
that offer indications about the adversary’s current, and possibly future activities. Armed with this
knowledge, the defender can use the Engage Matrix to identify opportunities for engagement to meet
strategic defensive goals. It should be noted that, as you begin your deception program, you may rely
heavily on intelligence feeds, open-source reporting, and/or information sharing groups to learn about the
behavior of your target adversary, or the intended target of your deception. As you build your deception
capabilities, you can complement this CTI data with insights gained by watching adversaries in your own
environment. After you have found opportunities, it is time to implement your engagements. At this stage
the deceptive assets are deployed and the engagement begins.
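The "analyze raw data in the context of existing CTI" step often begins as a simple mapping from observed behaviors to ATT&CK technique IDs. The sketch below uses a tiny hand-picked mapping with real ATT&CK IDs; the event labels and dict keys are invented, standing in for parsed Sysmon or EDR telemetry.

```python
# A hand-picked subset of MITRE ATT&CK technique IDs, keyed by a simplified
# event label. A real pipeline would map parsed Sysmon/EDR telemetry fields.
TECHNIQUE_MAP = {
    "powershell_exec": "T1059",    # Command and Scripting Interpreter
    "lsass_read": "T1003",         # OS Credential Dumping
    "smb_lateral_logon": "T1021",  # Remote Services
}

def contextualize(events):
    """Tag raw event labels with ATT&CK technique IDs where a mapping exists."""
    return [
        {**e, "attack_id": TECHNIQUE_MAP[e["label"]]}
        for e in events
        if e.get("label") in TECHNIQUE_MAP
    ]
```

Once events carry technique IDs, comparing them against known actor TTP profiles is what surfaces the engagement opportunities the cycle describes.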
As previously discussed, adversary behavior should drive the engagement. Each time the adversary
interacts with the engagement environment, the expert defender refines operational activities to manage
the operation most effectively. Did the adversary ignore or overlook a deployed lure? The operator may
move or change the lure to better encourage engagement. Did the adversary display a new behavior?
There may be new opportunities for deception given this new activity. Did the adversary drop a new file?
The operator may pause the operation to analyze the file and ensure that future engagement activities are
aware of the capabilities of this new piece of malware. As an example, if the adversary is operating in a
nonproduction environment for an elicitation operation, the defender may need to make adjustments if a
piece of deployed malware has capabilities to capture audio or video. In addition to making refinements
during an operation, the expert operator will also make refinements across operations. At the conclusion of
each operation, lessons learned, distilled intelligence, and other operational outcomes should drive future
operations.
This cycle can continue indefinitely; the defender collects and analyzes adversary behavior from an
operation, which supplies new engagement opportunities, which yield further data to collect and analyze,
and the cycle continues. This active defense allows the defender to be proactive rather than reactive to
adversary actions. Whether or not your organization is already utilizing cyber threat intelligence (CTI); cyber denial, deception, and adversary engagement; or other active defense principles, the Engage Matrix can be used to begin or mature your strategy.
Driven by the defender’s goals, we designed Engage to complement a traditional cyber defense strategy.
The Engagement Goals, Expose, Affect, and Elicit, are not inherently deceptive. It is therefore easy to
imagine how these goals may already be guiding an organization's security practices. Are you worried
about insider threats? Look at the Expose activities to start adding deceptive artifacts around your critical
assets. Do you have legacy systems that can no longer be updated? Look at the Affect activities to
understand how to place decoys that direct adversaries away from these vulnerable systems. Do you feel
exhausted by the endless game of CVE whack-a-mole? Look at the Elicit activities to begin to generate
your own CTI feed to drive your defense by the adversary’s TTPs and not the CVE of the day. Whatever
your defensive goals, the Engage Matrix can help you find complementary engagement activities to ensure
that if a defense-in-depth approach fails, you remain in control.
While we often consider adversary engagement as a distinct security practice, the most effective and
mature implementations are seamlessly integrated into the very culture of an organization. Just as we train
our workforce in good cyber hygiene habits, we must train the security community to consider deception as
a best practice. In the United States, we have pathologized deception; we consider deception as inherently
negative, sneaky, and dishonest. However, Engage enables defenders to normalize denial and deception
activities as routine, essential, and intelligent security practices.
Now that we have explored how adversary engagement can be part of a larger strategic plan, we will zoom
back in to understand how to run an engagement operation. The remainder of this handbook will explore
the process of adversary engagement and discuss how you can organize a team to begin using adversary
engagement as part of your organization’s cyber strategy.
Figure 11: Questions to consider when thinking about adversary engagement operations
is not clearly defined through goal setting. The remaining operational elements should always be informed
by and aligned with the operational objective.
It should be noted that there is, of course, a fine line between creating an environment that is so glaringly
vulnerable and high value that an adversary immediately recognizes it as a deception and creating an
environment so locked-down that the adversary is discouraged from attempting to find and exploit a
vulnerability. This is a difficult line to walk, and the balance will vary greatly from adversary to adversary.
What might scare one adversary away may be the ideal environment for a second adversary. This is yet
another example of why setting operational goals and using CTI data to understand your target adversaries
is essential. We will discuss this further in Section 5, when we explore how these elements are combined in
the process of deception.
When creating an isolated engagement environment, the defender must consider what the adversary
expects to find. Understanding these expectations is essential to create an environment that will minimize
the risk that the adversary detects the deception. By playing into the adversary’s cognitive biases, the
defender can lower the risk that the deception will be discovered and help reassure the adversary that they
are in a legitimate environment. For example, if the adversary expects to find a researcher’s computer,
including relevant backdated documents may assuage any doubts the adversary may have that this is a
real victim. Understanding how much the adversary knows about the environment, and what they expect to
find, can also help the defender prioritize their resources. For example, many ransomware samples will not
execute if the network does not contain a domain controller. Therefore, setting up a domain controller is an
essential task if the goal is to detonate ransomware. As another example, an external threat may not know
what the production systems should look like. However, an insider threat will have exquisite knowledge
about what the production network and systems contain. Therefore, much higher levels of realism are
required in the latter operation than in the former.
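The "backdated documents" technique mentioned earlier can be sketched in a few lines: on most filesystems the access and modification timestamps can be set directly, so decoy documents can be made to look months old. The paths in the test are invented; note the caveat in the comment about creation time.

```python
import os
import time

def backdate(path, days_ago):
    """Set a decoy file's access and modification times `days_ago` in the past.

    os.utime rewrites atime and mtime only; NTFS creation time or APFS
    birth time is untouched and would need platform-specific handling,
    otherwise a careful adversary may spot the inconsistency.
    """
    past = time.time() - days_ago * 86400
    os.utime(path, (past, past))
```

Spreading backdated timestamps realistically across a document tree, rather than stamping every file with the same date, is part of meeting the adversary's expectations.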
When considering the deployment of deceptive artifacts in integrated environments, it is also important to
consider if there is already an adversary who is currently in, or was previously in, the environment. As
discussed before, meeting the adversary’s expectations in terms of which assets are in an environment is
crucial to maintaining the deception. If an adversary is already familiar with an environment, they may more
easily identify deceptive assets. As a result, deployed tripwires may be rendered ineffective. However, even
if the adversary is aware of the deception, they may still suffer a resource cost due to the increased
ambiguity of the network.
There is a continuum of options between the two extremes of never speaking about adversary engagement
and openly presenting at conferences and releasing tools. For example, some organizations may find
middle ground by sharing operations with a closed group of trusted partners. No matter where your
organization falls on this spectrum, there are pros and cons. By keeping a closed program, you may have
more opportunities to engage unsuspecting adversaries. When adversaries enter an environment without
expecting deceptive assets, their natural biases may increase their tolerance for imperfections in the
environment and improve the overall believability of the ruse. With an externally known program, you may
have the opportunity to deter adversaries through increased ambiguity; if they know deception is in play,
they must question everything on your network. As an additional consideration, participating in adversary
engagement communities of interest or sharing groups may expose you to new technologies and research,
as well as enable you to compare operational data to complement and enhance your own findings.
Whatever you choose, it is important to gather stakeholders and consider your options carefully at the
outset of developing your program.
There are additional OPSEC concerns surrounding operational details. It is important to define guidelines
around topics such as operational data storage, sharing, and analysis. Additional considerations are
required when working through the operational details for insider threat detection programs. While specific
OPSEC requirements for operational data are outside of the scope of this document, we hope this brief
discussion has highlighted the importance of defining OPSEC guidelines early in the development of an
adversary engagement program.
Organizations often fail to adequately plan how and where denial, deception, and adversary engagement
will be utilized on their networks. As previously stated, adversary engagement is an iterative, goal driven
process, not merely the deployment of a technology stack. Organizations must think critically about what
their defensive goals are and how denial, deception, and adversary engagement can be used to drive
progress towards these goals. The Engage 10-Step Process helps organizations consider engagement
activities as part of just such a process.
The 10-Step Process is particularly important to organizations with limited resources or less mature security
programs. Looking at available vendor technologies, it can be tempting to believe that the only solution lies
in sophisticated and expensive appliances. However, when you step back and clearly define goals, it may
be possible to scope down engagements to tightly align with those goals. Suddenly, even small
organizations can begin integrating adversary engagement into their defensive strategies with limited
resources!
As seen in Figure 12, the 10 Steps are broken into three categories: Prepare, Operate, and Understand.
These three categories mirror the three components of the Engage Matrix. The strategic approaches and
activities under the Engage Prepare goal correspond to steps 1-6. The engagement approaches and
activities under the Engage Expose, Affect, and Elicit goals correspond to step 7. And finally, the strategic
approaches and activities under the Engage Understand goal correspond to steps 8-10. It is not an
accident that the engagement activities fall under the smallest category in the 10-Step Process. While
these activities often get the spotlight, strategic actions taken to prepare for and understand the results of
your engagement operations are the most important elements of any operation.
This process is not unique to cyber deception. In The Art and Science of Military Deception, Barton Whaley
wrote a chapter entitled, “The Process of Deception.” There, Whaley laid out a ten-step process
for creating military deceptions. We refined and repurposed Whaley’s work for the cyber domain. We will
now explore each step in this process in more detail.
If you know the enemy and know yourself, you need not fear the result of a hundred battles.
If you know yourself but not the enemy, for every victory gained you will also suffer a
defeat. If you know neither the enemy nor yourself, you will succumb in every battle.
This statement rings true in the world of cyber operations. Creating a threat model to understand the risks,
vulnerabilities, and strengths of an organization is foundational to planning an effective adversary
engagement operation. As part of this model, defenders must identify the organization’s critical cyber
assets. Likewise, defenders should use Cyber Threat Intelligence to understand the threat landscape.
5.10 Step 10: Analyze successes and failures to inform future operations
Whenever a gating criterion is reached, it is essential to analyze the operational successes and failures.
This retrospection is an opportunity for the team to review the events of the operation to ensure progress
towards operational objectives. It can include a review of the entire operational process, from planning and
implementation through engagement activity and impact. In addition to the operation itself, it is an important time
to assess the communication and teamwork of the operations team and all contributing stakeholders. While
such a review should always occur at the end of an operation, periodic reviews during long-running
operations, and especially whenever a gating criterion is reached, are vital to ensure alignment and
progress towards the operational objectives.
Operational Template
Now that we have explored the high-level process of adversary engagement, we can create an operational
template designed to help teams get started running adversary engagement operations. This template is
meant to be a guide. When applying this template to your organization, you will need to adapt and refine
the structure to fit your organizational needs and resources. It is important to note that, while this template
is organized around a team, it is entirely possible to run adversary engagement operations with a single
individual. In that case, consider these various roles and responsibilities as guides to help you identify the
various tasks you will need to complete.
6.1.1 Mission Essential Task List (METL) and Mission Essential Tasks (METs)
The METL is a structure by which you identify training requirements and qualifications, establish a team
purpose, and drive progress toward accomplishing your goals. The METL concept used by the MITRE Engage team
has been adapted from the process used by much larger teams across the U.S. Government.
The METL is made up of a series of Mission Essential Tasks (METs). These tasks are the essential
activities that must be completed during the planning, execution, or analysis phases of an engagement
operation. Not all adversary engagement activities are considered METs, as some, such as malware
analysis and decoder authoring, require significantly more specialized skills than others. Therefore, you will
never have 100% knowledge of all skills across the entire team. However, METs ensure that, where possible,
team members are trained to be proficient in all core competencies. While not everyone is expected to be
an expert at every MET, each team member is expected to be able to accomplish each MET. In Table 2,
we share the Engage METL Summary used by the Engage team when running self-infection operations for
elicitation goals. See the Engage METL Template for a more detailed explanation of this METL template.
Adjust this template based on your organizational needs and goals.
Table 2: Template Mission Essential Task List for self-infection elicitation operations
Establish Gating Criteria: Define non-negotiable stops or pauses for an operation. These criteria often form the basis of the Rules of Engagement (RoE) document for the operation.

Create Engagement Narrative: Design the persona(s) that will be utilized to interact with the adversary and engagement environment. Design the storyboard of events that will drive the interactions during the operation.

Establish Monitoring System: Identify collection points that will ensure operational safety as outlined in the RoE document.

Build Out Victim Windows System: Provision Windows systems consistent with the established engagement narrative.

Build Out Victim Linux System: Provision Linux systems consistent with the established engagement narrative.

Deploy Monitoring System to Engagement Environment: Build out the collection system to ensure operational safety.

Deploy Persona(s) and Deceptive Assets to Engagement Environment: Build out the persona(s) and deceptive assets (Pocket Litter and Lures) consistent with the established engagement narrative.

Monitor Operational Activity: Observe operational activity within the safety constraints defined in the RoE.

Forensically Investigate Victim Post Operation: Use disk forensics and other IR techniques to gather data from the environment post operation.

Analyze Data from Live Operation and Forensic Investigation: Analyze data to distill actionable intelligence.

Conduct Open-Source Intelligence (OSINT) Searches Pre/Post Operation: Use OSINT sources to learn about your target adversary's Indicators of Compromise (IOCs) and TTPs and to hunt on new IOCs and TTPs you discover during your operation.
work on a single role. The purpose of defining these roles is to clarify responsibilities. Any changes to these
roles during an operation should be communicated to all those involved in the operation.
It is important to note that roles are not meant to be equal in effort over the course of an operation. Some
roles are only relevant during a single phase of the operation, whereas others are relevant throughout.
Some roles are only responsible for a couple of tasks, whereas others are responsible for many. With this
in mind, be sure to staff roles appropriately with individuals who have the necessary bandwidth, seniority,
and skillset to maximize likelihood of success of an operation.
o Coordinate with the Team Lead to deconflict with other simultaneous adversary
engagement operations as appropriate
o Hold ultimate responsibility for operational safety and initiate the appropriate response
when gating criteria are reached
template is not intended to be a one-size-fits-all solution. Think of this section as a guide to help you start
integrating adversary engagement operations in your organization. This template will show you one
Concept-of-Operations (CONOPS) that covers the planning that should be done before beginning an
operation, preparation steps that need to be taken before the environment can be constructed, deployment
of the environment to the field, daily operations and maintenance, daily monitoring and analysis, and how to
gracefully end the operation while capturing lessons learned.
It is important to note that the steps described below are not meant to be uniform in scope and effort. Some
may be as simple as filling out a form; others may demand hours of attention and effort. It is also important
to note that while some steps depend on previous steps being completed, others can be worked in parallel.
While this section identifies roles and responsibilities associated with each task, it should not be viewed as
a complete list, as necessary tasks may vary between operations, and optimal ways of facilitating and
coordinating this effort will vary by organization. (Refer to Section 6.2 for further details on the scope of
roles and responsibilities.)
For each of the various roles, a single individual should be identified as the responsible party. Even when a
responsibility should be worked in coordination with multiple people, the ultimate responsibility for
successful completion needs to be assigned to one person. This assignment ensures that person knows
they are accountable for the specific job being done, and done well. Once again, we will reiterate
that this should be considered as only a template for how your team can operationalize the Engage 10 Step
Process for adversary engagement. The exact implementation may vary significantly to fit the needs and
resources of your organization.
6.3.1 Planning
This phase comprises steps that should be accomplished before starting an operation.
A successful adversary engagement operation relies on a well thought out plan of action. The priorities set
during this phase will inform the remainder of operational activities.
1 https://www.mitre.org/publications/systems-engineering-guide/enterprise-engineering/systems-engineering-for-mission-assurance/crown-jewels-analysis
Some examples include concealing the fact that the environment is closely monitored, revealing that the
environment’s IP address falls in the true address space of the organization, revealing fictitious personas of
invented end users, and concealing PII of these fictitious end users (e.g., social security numbers) in order
to keep the barrier to identifying this environment as a hoax sufficiently high. Figure 13 illustrates some
examples of this type of information.
Figure 13: Organization template for the revealed and concealed facts and fictions
Related Steps: Determine what you want your adversary to perceive. Determine how you want your
adversary to react.
Personas serve to support the engagement narrative. In order to make the adversary believe that they are
operating on a real network, it is essential to create the illusion that real people are using the network. For
any hosts that may fall under scrutiny, assign a work function and some basic personality characteristics to
the fictitious person that would be using it. These will be leveraged later to inform what sorts of Pocket Litter
appear on the machine and what actions are taken during the op to make the user(s) look active.
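A lightweight way to record these personas is as structured data that later steps can consume when selecting Pocket Litter. The sketch below is one possible shape for such a record; the class name, fields, and litter mapping are all illustrative, not an Engage-defined format:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A fictitious end user supporting the engagement narrative.
    Field names and the litter mapping below are illustrative."""
    username: str
    work_function: str                           # drives Pocket Litter choices
    traits: list = field(default_factory=list)   # drives in-op user behavior

    def suggested_litter(self) -> list:
        # Map work function to files this user would plausibly keep.
        litter_by_function = {
            "researcher": ["draft_paper.docx", "grant_budget.xlsx"],
            "sysadmin": ["runbook.md", "inventory.csv"],
        }
        return litter_by_function.get(self.work_function, ["notes.txt"])

alice = Persona("a.chen", "researcher", traits=["works late", "messy desktop"])
print(alice.suggested_litter())
```

Capturing personas this way keeps the narrative, the Pocket Litter, and the in-operation user behavior consistent with one another across the team.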
the engagement environment. Coordinating the RoE with existing InfoSec teams is also essential to ensure
appropriate communication and understanding across the organization.
The RoE is meant to capture foreseeable bright red lines as well as guidelines for how to act under
unforeseen circumstances. It is not meant to be exhaustive, and the clarity and legibility of the document
should be prioritized.
prevention of accidents in the event that someone on the operations team accidentally contaminates a clean
corporate network with potentially malicious data from an engagement environment.
Pocket Litter, on the other hand, exists only to reinforce the engagement narrative. Therefore, Pocket Litter often
allows for more flexibility and lower fidelity than Lures.
Gathering artifacts, particularly Pocket Litter, is a step that can bloat to fill as much time as is available.
Therefore, it is important to rely on the target adversary profile developed in Section 6.3.1.4. Important
high-level considerations include the amount of scrutiny that a given file or process is expected to fall
under and whether the adversary’s operators usually exfiltrate data indiscriminately in bulk or target specific
documents relevant to their objectives. This understanding of the target adversary will inform tradeoffs
in quality vs. quantity, and how much time is necessary to spend on litter in general.
This step lends itself to automation, especially for gathering/generating Pocket Litter. For human language
content, such as documents or emails, consider generating content using open-source language models
such as GPT-2, or by scraping it from the web. For software, make sure to gather necessary installers and
licenses that match the software/host versions that support the narrative. Automation may be more
challenging for Lures, as these assets likely need to be specific content relating to a specific topic.
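As a minimal sketch of such automation, the template-based generator below produces reproducible email-style Pocket Litter. The templates are illustrative; as noted above, this function could be swapped for an open-source language model (e.g., GPT-2 via the Hugging Face transformers library) or for content scraped from the web:

```python
import random

# Templates are illustrative; higher-fidelity litter could come from a
# language model or web scraping, as discussed above.
SUBJECTS = ["Q3 budget review", "lab equipment order", "draft results"]
BODIES = [
    "Can you take a look at the attached {topic} before Friday?",
    "Following up on the {topic} we discussed last week.",
]

def generate_emails(n: int, seed: int = 0) -> list:
    rng = random.Random(seed)   # seeded so litter is reproducible on redeploy
    return [rng.choice(BODIES).format(topic=rng.choice(SUBJECTS))
            for _ in range(n)]

for email in generate_emails(3):
    print(email)
```

Seeding the generator means the same litter can be regenerated if the environment has to be redeployed from its baseline.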
The Threat Analyst will be responsible for selecting deceptive artifacts that will be enticing to the adversary.
The Blue Team should be familiar with what assets will be on the machine in order to better identify rogue
files and processes on the machine and adversary exfiltration.
This step of gathering Pocket Litter also offers opportunities to establish plausible deniability in the event
that the environment is compromised by a security researcher. For example, writing a detailed explanation
of the operation, and leaving it in the environment in an encrypted zip file with a sufficiently complex
password could be used to prove to a researcher that what appeared to be a security vulnerability in the
organization’s network was an intentional engagement environment. By providing the password to any
security researcher who exfiltrates the encrypted zip, the intentional deception can be revealed. Similar
opportunities include creating uncrackable user passwords containing phrases such as "honeytoken" or
"deception operation," which can be used to reveal post-operation that the environment was part of an
intentional deception operation.
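The marker-password idea can be sketched in a few lines; the function and marker names here are our own illustration of the technique, not a prescribed format:

```python
import secrets

def deniability_password(marker: str = "deception-operation") -> str:
    """Build an effectively uncrackable account password that embeds a marker
    phrase. Revealing the full password post-operation demonstrates that the
    environment was an intentional deception. Names here are illustrative."""
    # 32 random bytes of entropy; no realistic attacker guesses this suffix.
    return f"{marker}-{secrets.token_urlsafe(32)}"

pw = deniability_password()
print(pw.startswith("deception-operation-"), len(pw) > 40)
```

Because the random suffix makes the password uncrackable in practice, only the operations team can produce it later, which is what makes the reveal convincing.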
6.3.2 Deploying
This phase comprises steps to set up a deception environment such that it’s ready to go live, and then
kicking off the operation.
Role: Threat Analyst in coordination with the Reverse Engineer, System Administrator and Operational
User
Responsibility: Deploy Pocket Litter, Lures, and other deceptive artifacts onto the deception network.
Related Steps: Determine what you want your adversary to perceive. Determine how you want your
adversary to react. Determine channels to engage with your adversary.
This step includes placing content and software on machines in locations that will be observable to the
adversary, and with timestamps that support the engagement narrative.
The System Administrator is responsible for the action of placing or installing artifacts while the Threat
Analyst is responsible for advising where and when artifacts should be placed to entice the adversary.
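For example, backdating a decoy file's access and modification times can be sketched with Python's standard library. The file name and date below are illustrative; note that this covers atime/mtime only, and other timestamps, such as NTFS creation time, require platform-specific tooling:

```python
import os
import tempfile
from datetime import datetime, timezone

def backdate(path: str, when: datetime) -> None:
    """Set a file's access and modification times to a moment that supports
    the engagement narrative. Other timestamps (e.g., NTFS creation time)
    need platform-specific tooling."""
    ts = when.timestamp()
    os.utime(path, (ts, ts))

# Example: make a decoy document look over a year old.
path = os.path.join(tempfile.gettempdir(), "decoy_report.docx")
with open(path, "w") as f:
    f.write("quarterly figures...")
backdate(path, datetime(2021, 3, 15, 9, 30, tzinfo=timezone.utc))
print(os.path.getmtime(path))
```

Consistent timestamps across Pocket Litter are one of the first things a suspicious operator will check, so it is worth scripting rather than doing by hand.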
Responsibility: Snapshot and store each machine’s baseline state as a point of comparison for analysis
and in case there is the need to redeploy.
Related Step: Determine the gating criteria
This step should happen after Pocket Litter has been deployed on each of the machines and collection
systems have been configured on end hosts. Backups of each machine’s baseline should be stored off of
the deception network and should be available on short notice should the need to redeploy the environment
arise.
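Alongside full-machine backups, a simple file-hash baseline supports the analysis side of this step: comparing a post-operation snapshot against the baseline highlights files the adversary added, modified, or removed. This is a sketch for the comparison use case only; a disk image is still what you want for redeployment:

```python
import hashlib
import tempfile
from pathlib import Path

def snapshot_hashes(root) -> dict:
    """Record a SHA-256 hash for every file under root."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def diff_snapshots(before: dict, after: dict) -> dict:
    return {
        "added": sorted(set(after) - set(before)),
        "removed": sorted(set(before) - set(after)),
        "modified": sorted(k for k in before.keys() & after.keys()
                           if before[k] != after[k]),
    }

# Demo on a throwaway directory standing in for a victim host's disk.
root = Path(tempfile.mkdtemp())
(root / "a.txt").write_text("clean baseline file")
before = snapshot_hashes(root)
(root / "dropper.exe").write_text("adversary-dropped file")
after = snapshot_hashes(root)
print(diff_snapshots(before, after))
```

Like the backups themselves, the baseline hashes should be stored off the deception network.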
Time permitting, Red’s feedback should be used to tweak the environment such that it is more believable
and collection system configurations such that visibility is improved. All changes must be properly
documented.
6.3.3 Operating
This phase comprises steps that must be taken during an operation in order to maintain it and keep it
believable.
As is mentioned in more detail in Section 3, the Blue Team is responsible for maintaining situational
awareness of all NRT alerts, especially when there is novel or unexplained activity. The bulk of these will
likely be false positives, but understanding what false positives usually look like will allow further tuning
the collection system and enable the Blue Team to better distinguish when alerts look abnormal and
demand more attention.
The Blue Team is also responsible for taking any courses of action outlined in the RoE in response to
unexplained activity within the allowable timeframe.
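One lightweight way to encode that tuning is an allowlist of patterns that past triage has shown to be benign, so novel alerts stand out. The patterns and alert strings below are illustrative placeholders, not a defined alert format:

```python
import fnmatch

# Patterns that past tuning has shown to be benign in this environment.
# These values are illustrative; build your own list from observed
# false positives.
KNOWN_BENIGN = [
    "scheduled-task:*Microsoft*Windows*",
    "dns-query:*.windowsupdate.com",
]

def triage(alert: str) -> str:
    """Label an NRT alert as a known false positive or as needing attention."""
    for pattern in KNOWN_BENIGN:
        if fnmatch.fnmatch(alert, pattern):
            return "known-benign"
    return "needs-review"

print(triage("dns-query:ctldl.windowsupdate.com"))
print(triage("dns-query:evil.example.net"))
```

Anything labeled needs-review then gets the Blue Team's attention and, where appropriate, the courses of action defined in the RoE.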
6.3.4 Analyzing
This phase comprises steps that must be taken during an operation to maintain situational awareness of
adversary behavior in the environment.
Related Steps: Turn raw data into actionable intelligence, Feedback intelligence
The initial-stage malware may need to be analyzed to determine how to successfully execute it in the
deception environment. Some malware may have configuration or software dependencies, such as
Microsoft Word or .NET runtime. Sophisticated adversaries, particularly targeted threats, integrate
environmental checks into their malware.
If the adversary drops later-stage malware in the engagement environment, reverse engineering it may be
necessary to maintain situational awareness of what the adversary is doing or might do in the environment.
The amount of time and resources dedicated to reverse engineering should scale according to the
complexity of the malware and the organization’s risk tolerance. This effort can be continued after the
operation ends, provided a sample of the malware is preserved.
Role: Operational Coordinator in coordination with the Team Lead, Blue Team, and Threat Analyst
Responsibility: End the operation upon reaching a success state, end state, or unanticipated state per the
Gating Criteria
Related Steps: Turn raw data into actionable intelligence, Feedback intelligence, Analyze successes and
failures to inform future actions
While the Blue Team may end the operation when defined success or end states are met, per the RoE,
should the unexpected happen in the course of the operation (as is often the case), it is the Operational
Coordinator’s responsibility to decide whether continuing or ending the operation is in the proper spirit of
the RoE, consistent with legal guidance and management’s risk tolerance.
Ending the operation may be automated through programming machines to shut down or disconnect from
the network if certain conditions are detected. It can also be as manual as pulling the network cable out of a
victim machine if adversarial collateral damage is detected. In any case, all paths to the operation’s network
should be cut if it is time to end it.
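The automated end of that spectrum can be sketched as a watchdog: check observed events against the gating criteria, and cut network paths when a tripwire fires. The tripwire names and the interface in the command below are hypothetical placeholders for conditions and assets defined in your own RoE:

```python
import subprocess

# Tripwire conditions drawn from the gating criteria; the values here are
# illustrative placeholders for conditions from your own RoE.
TRIPWIRES = {"outbound-attack-traffic", "pii-exposure", "lateral-movement-to-prod"}

def gating_criterion_hit(observed_events: list) -> bool:
    return any(event in TRIPWIRES for event in observed_events)

def end_operation(dry_run: bool = True) -> str:
    """Cut all paths to the operation's network. A real implementation might
    down an interface or power off VMs; the interface name below is
    hypothetical."""
    cmd = ["ip", "link", "set", "dev", "eth-engagement", "down"]
    if dry_run:
        return "would run: " + " ".join(cmd)
    subprocess.run(cmd, check=True)
    return "interface downed"

if gating_criterion_hit(["beacon", "outbound-attack-traffic"]):
    print(end_operation(dry_run=True))
```

A dry-run mode like this is also useful for rehearsing the shutdown procedure before the operation goes live.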
Another relevant consideration at this step is the impression that the adversary should be left with. If the
operation is meant to be covert, and the adversary absolutely cannot discover the network to be a ruse,
then it may be better to end the operation gracefully (e.g., simulate the behavior of a hardware failure,
server crash, or incident response) if the adversary seems to be growing suspicious of the environment. If it
doesn’t matter whether the adversary’s suspicions are raised, it may make sense to keep the operation
going until they lose interest or attempt to perform actions outside of the RoE.
These sorts of considerations should be kept in mind while writing the RoE during the planning phase, but
since not every possible outcome of an operation can be thought through ahead of time, they are worth
reiterating here.
Role: Operational Coordinator in coordination with the Team Lead, Threat Analyst and Blue Team
Responsibility: Assess what led up to the end of the operation, regardless of whether the objective was
achieved.
Related Step: Analyze successes and failures to inform future actions
If the operations team was successful in achieving the operation’s goal, take time to review the implications
of the new intelligence, or the indicators of cost-imposition. It is very difficult to measure the effectiveness of
an operation. What is often more feasible is to show measures of work. Exemplar success metrics to
show these measures of work include the total number of C2 packets sent to the engagement
environment, the length of time of meaningful engagement, megabytes exfiltrated, novel tools collected,
domains/IP addresses/signatures collected, and new TTPs identified. Figure out how this value can be
communicated back to the rest of the organization, or other partnering organizations.
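Computing those measures of work from an operational event log can be as simple as the sketch below. The event structure and field names are assumptions for illustration, not a format defined by Engage:

```python
from datetime import datetime

# An illustrative operational event log; field names are assumptions.
events = [
    {"time": datetime(2022, 5, 1, 9, 0),  "type": "c2_packet", "bytes_exfil": 0},
    {"time": datetime(2022, 5, 1, 9, 5),  "type": "exfil",     "bytes_exfil": 2_400_000},
    {"time": datetime(2022, 5, 3, 16, 0), "type": "c2_packet", "bytes_exfil": 0},
]

def measures_of_work(events: list) -> dict:
    """Distill measures of work (not effectiveness) from logged events."""
    times = [e["time"] for e in events]
    return {
        "c2_packets": sum(1 for e in events if e["type"] == "c2_packet"),
        "mb_exfiltrated": round(sum(e["bytes_exfil"] for e in events) / 1_000_000, 1),
        "engagement_days": (max(times) - min(times)).days,
    }

print(measures_of_work(events))
```

A short summary like this is also a convenient artifact to hand to stakeholders and partnering organizations when communicating the operation's value.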
If the adversary engaged with the environment, but eventually lost interest, review adversary behavior
leading up to them abandoning the environment. Were they looking for certain kinds of information or
hosts? This can contribute to design considerations of future operations.
If the adversary identified the environment as being a deception environment, review their behavior leading
up to, and following, this determination. The more that is understood about how the adversary checks
the legitimacy of the environment, the more authentic-looking the environment can be made in future
operations. The more the operations team understands about how the adversary acts upon discovering
they are subject to a deception, the more intelligence can be fed into counter-deception TTPs.
Future Considerations
The current adversary engagement concept covers operations from simple, standalone computers on a
consumer grade Internet connection, to larger, bespoke networks mimicking more involved home networks,
small office networks, or enterprise level networks. Future work may consider automating the polymorphic
design and provisioning of large networks, automating persona generation, automating natural language
content generation, automating behaviors performed by operational users during deception operations, and
more.
Outside of the direct scope of running adversary engagement operations, future work may also investigate
the weaknesses inherent in one’s deception environments in order to identify areas where the environment
may be hardened, such as the hiding of virtualization or eliminating its use. Developing tradecraft for
identifying deception environments is also useful in the context of countering deception during offensive
cyber operations.
Acknowledgements
We would like to recognize and thank the following individuals who took the time to peer review this
document, or a previous iteration of this document, providing content, feedback, and many edits:
Bill Hill, Director of MITRE Information Security and CISO
Dan Perret, MITRE Principal Cyber Security Engineer
Adam Pennington, MITRE ATT&CK Lead
Andrew Sovern, Adversary Engagement Engineer
Dr. Frank Stech, Principal Behavioral Scientist

Additionally, we would like to thank the members of the MITRE Engage Team:
Maretta Morovitz
Gabby Raymond
Stan Barr
Dylan Hoffmann
Bryan Quinn
Brittany Tracy
Leslie Anderson
Olivia Hayward