
Red Team


From Wikipedia, the free encyclopedia

A Red Team activity is an unannounced assessment of security and readiness carried out by a team of operators unfamiliar to the target, conducted without the awareness or support of the assessed organization. Individuals engaged in this activity provide a unique understanding from a threat actor's point of view, in less contrived circumstances than exercises, role playing, or announced assessments allow. Red Team activities may involve interactions that trigger the active controls and countermeasures in effect within a given operational environment.

In wargaming, the opposing force (OPFOR) in a simulated military conflict may be referred to as a red team and may also engage in Red Team activity, which is used to reveal weaknesses in military readiness. The key theme is that the aggressor is composed of various threat actors, equipment, and techniques that are obscured from the defender's complete knowledge.

Red Team activities offer several benefits: they challenge preconceived notions by demonstration, and they elucidate the true problem state that planners are attempting to mitigate. They can also yield a more accurate understanding of how sensitive information is externalized, and highlight exploitable patterns and instances of undue bias in controls and planning.


United States Army

In the US Army, Red Teaming is defined as: “structured, iterative process executed by trained, educated and practiced team members that provides commanders an independent capability to continuously challenge plans, operations, concepts, organizations and capabilities in the context of the operational environment and from our partners’ and adversaries’ perspectives.” (TRADOC News Service, July 13, 2005) [1]

The Army Red Team Leaders Course is conducted by the University of Foreign Military and Cultural Studies (UFMCS) at Fort Leavenworth. The target students are graduates of the U.S. Army CGSC or equivalent intermediate and senior level school (Major through Colonel, and Chief Warrant Officer 3/4/5 with MEL IV qualification or equivalent).

The Red Team Leader’s Course (RTLC) is graduate-level education of 720 Academic Hours (18 weeks) designed to effectively anticipate change, reduce uncertainty, and improve operational decisions. The typical academic day is 8 hours and the typical reading load is 250 pages per night.

The University of Foreign Military and Cultural Studies was formed as an outgrowth of recommendations from the Army Chief of Staff's Actionable Intelligence Task Force. UFMCS, an element of the TRADOC (DCSINT) Intelligence Support Activity (TRISA) located at Fort Leavenworth, Kansas, is an Army-directed education, research, and training initiative designed to provide a Red Teaming capability to Army organizations and other joint and government agencies.

A UFMCS-trained Red Team is educated to look at problems from the perspectives of the adversary and our multinational partners, with the goal of identifying alternative strategies. The Red Team provides commanders with critical decision-making expertise during planning and operations. The team’s responsibilities are broad – from challenging planning assumptions to conducting independent analysis to examining courses of action to identifying vulnerabilities.

Red Team Leaders are expert in:

  1. Analyzing complex systems and problems from different perspectives, using theoretical models, to aid in decision making.
  2. Applying the concepts, theories, insights, tools, and methodologies of cultural and military anthropology to predict others' perceptions of our strengths and vulnerabilities.
  3. Applying critical and creative thinking in the context of the operational environment to fully explore alternatives to plans, operations, concepts, organizations, and capabilities.
  4. Applying advanced analytical skills and techniques from the tactical through the strategic level, and developing products that support command decision making and operational execution.

U.S. Joint Forces Command's Joint Enabling Capabilities Command

Two operational positions associated with red teaming, formerly called Blue-Red Planners, exist at the United States Joint Forces Command within the Standing Joint Force Headquarters (SJFHQ). These two positions, now called Red Team Leaders (RTLs), are designed to provide the Joint Task Force Plans and Operations Groups with insight into the adversary's political and military objectives and potential courses of action (COAs) in response to real or perceived Blue action. RTLs lead a Red Team (RT) Cell composed of operationally oriented experts who analyze Blue conditions-driven COAs from an adversary-based perspective. The RT Cell also anticipates potential adversary responses to counter Blue COAs and end-state objectives, identifies critical Blue vulnerabilities and potential operational miscues, and assists in war gaming and COA development early in the Joint Operation Planning Process (JOPP).

RTLs, in collaboration with the Combatant Commander's staff and Centers of Excellence, provide in-depth knowledge of the local political landscape and of the adversary's history, military doctrine, training, political and military alliances and partnerships, and strategic and operational objectives. The RTLs postulate the adversary's desired end state, as well as what the adversary may surmise Blue's desired end state or objectives to be. Finally, the RTLs help identify, validate, and re-scope potential critical nodes identified through a systems-based understanding of the operational environment.

United States Government

Red Teaming is normally associated with assessing vulnerabilities and limitations of systems or structures. Various watchdog agencies such as the Government Accountability Office and the National Nuclear Security Administration employ red teaming, sometimes with dramatic findings.

  • In exercises and war games, Red Teaming refers to the work performed to provide an adversarial perspective, especially when this perspective includes plausible tactics, techniques, and procedures (TTP) as well as realistic policy and doctrine.

Important cases

The FAA has been implementing Red Teams since the destruction of Pan Am Flight 103 over Lockerbie, Scotland. Red Teams conduct tests at about 100 US airports annually. Tests were on hiatus after September 11, 2001, and resumed in 2003.[1]

The FAA's use of red teaming revealed severe weaknesses in security at Logan International Airport in Boston, where two of the four hijacked 9/11 flights originated. Some former FAA investigators who participated on these teams believe that the FAA deliberately ignored the results of the tests and that this contributed in part to the 9/11 terrorist attacks on the US.

Other examples include:

  • Billy Mitchell - a passionate early advocate of air power - demonstrated the obsolescence of battleships in bombing tests against the captured World War I German battleship Ostfriesland and the U.S. pre-dreadnought battleship Alabama.
  • Rear Admiral Harry E. Yarnell demonstrated in 1932 the effectiveness of an attack on Pearl Harbor, foreshadowing almost exactly the tactics Japan would use to destroy the fleet in harbor nine years later. Although the umpires ruled the exercise a total success, the umpires' report on the overall exercises makes no mention of the stunning effectiveness of the simulated attack. Their conclusion to what became known as Fleet Problem XIII was surprisingly quite the opposite:
"It is doubtful if air attacks can be launched against Oahu in the face of strong defensive aviation without subjecting the attacking carriers to the danger of material damage and consequent great losses in the attack air force." [2]


  1. ^ Deborah Sherman (30 March 2007). "Test devices make it by DIA security". Denver Post.

