U.S. Military Lowered Its Computer Security Level the Night Before 9/11

Check out this curious incident relating to the 9/11 attacks. According to a new entry in the Complete 9/11 Timeline, at around 9 p.m. on the evening before 9/11--less than 12 hours before the attacks began--the U.S. military lowered its "Infocon" threat level to the lowest possible level, supposedly because of "reduced fears of attacks on computer networks."

The Infocon system is intended as "a structured, coordinated approach to defend against and react to attacks on Defense Department systems and networks." General Ralph Eberhart, the commander of NORAD, was responsible for issuing Infocons to the US military, and so he was presumably responsible for lowering the Infocon level on September 10. (See my previous blog entry for details of Eberhart's suspicious actions on the day of 9/11 itself.)

The Infocon level was raised again after the second plane hit the World Trade Center on 9/11.

Yet another strange "coincidence."

September 10, 2001: Military 'Infocon' Alert Level Reduced because of Perceived Lower Threat of Computer Attacks
The US military reduces the Information Operations Condition (Infocon) to Normal--the lowest possible threat level--less than 12 hours before the 9/11 attacks commence, reportedly due to reduced fears of attacks on computer networks.
Level Reduced Due to 'Decreased Threat' - The Infocon level is lowered to Normal, meaning there is no special threat, at 9:09 p.m. this evening. The reason for this, according to historical records for the 1st Fighter Wing at Langley Air Force Base, Virginia, is "a decreased threat from hacker and virus attacks on the computer networks across the US." [Colorado Springs Gazette, 5/3/2001; 1st Fighter Wing History Office, 12/2001] Since October 1999, the commander of the US Space Command has been in charge of Defense Department computer network defense, and has had the authority to declare Infocon levels. [IAnewsletter, 12/2000] General Ralph Eberhart, the current commander of both the US Space Command and NORAD, is thus responsible for evaluating the threat to US military computers and issuing information conditions--"Infocons"--to the US military. He is presumably therefore responsible for lowering the Infocon level this evening.
Higher Infocon Level Requires More Precautions - It is unclear what difference the reduced Infocon level makes. But an e-mail sent earlier in the year from Peterson Air Force Base in Colorado, where NORAD and the US Space Command are headquartered, revealed the steps to be taken when the Infocon level is raised one level from Normal, to Alpha. These steps include "changing passwords, updating keys used to create classified communication lines, minimizing cell phone use, backing up important documents on hard drive, updating virus protection on home computers, reporting suspicious activity, and reviewing checklists." [Colorado Springs Gazette, 5/3/2001]
Level Increased Earlier in Year - It is also unclear what the Infocon level was prior to being reduced this evening and why it had been at that raised level. Pentagon networks were raised to Infocon Alpha for the first time at the end of April this year, as a precaution against attacks on US systems, after Chinese hackers warned of such attacks in Internet chat room postings. [United Press International, 4/30/2001; Colorado Springs Gazette, 5/3/2001; United Press International, 7/24/2001] The Infocon level was raised to Alpha a second time in late July, due to the threat posed by the Code Red computer virus. [United Press International, 7/24/2001; US Department of Defense, 7/24/2001] It will be raised again, from Normal to Alpha, during the morning of September 11, immediately after the second attack on the World Trade Center takes place (see 9:04 a.m. September 11, 2001). [1st Fighter Wing History Office, 12/2001]
System Intended to Protect Defense Department Computers - The Joint Chiefs of Staff established the Infocon system in March 1999 in response to the growing and sophisticated threat to Defense Department information networks. The system is intended as a structured, coordinated approach to defend against and react to attacks on Defense Department systems and networks. Reportedly, it "provides a structured, operational approach to uniformly heighten or reduce defensive posture, defend against unauthorized activity, and mitigate sustained damage to the defense information infrastructure." It is analogous to other Defense Department alert systems, such as Defense Condition (Defcon) and Threat Condition (Threatcon). The Infocon system comprises five levels of threat, each with its own procedures for protecting systems and networks. These levels go from Normal, through Alpha, Bravo, and Charlie, up to Delta, which, according to Rear Admiral Craig Quigley, the deputy assistant secretary of defense for public affairs, is when "You're currently under an absolutely massive hack attack, from a variety of means, from a variety of sources. You're talking a very concerted, focused attack effort to get into [Defense Department] systems." [IAnewsletter, 12/2000; General Accounting Office, 3/29/2001; US Department of Defense, 7/24/2001]

Source: http://www.historycommons.org/context.jsp?item=a091001infoconnormal&scale=0
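The five-level scheme described in the timeline entry above (Normal through Alpha, Bravo, and Charlie, up to Delta) is, in effect, an ordered alert enumeration. A minimal sketch in Python, purely illustrative (the class and function names here are mine, not any Defense Department interface), shows how the September 10 change reads as a transition to the lowest posture:

```python
from enum import IntEnum

# Hypothetical model of the five Infocon levels named in the text above;
# the level names come from the source, the code structure is illustrative only.
class Infocon(IntEnum):
    NORMAL = 0   # no special threat
    ALPHA = 1    # heightened precautions: passwords, keys, backups, etc.
    BRAVO = 2
    CHARLIE = 3
    DELTA = 4    # "an absolutely massive hack attack"

def change_level(current: Infocon, new: Infocon) -> str:
    """Describe a transition between information conditions."""
    if new > current:
        return f"Raising posture: {current.name} -> {new.name}"
    if new < current:
        return f"Lowering posture: {current.name} -> {new.name}"
    return f"Holding at {current.name}"

# The September 10 change described above: down to the lowest possible level.
print(change_level(Infocon.ALPHA, Infocon.NORMAL))
# -> Lowering posture: ALPHA -> NORMAL
```

Because the levels are ordered, raising the condition one step (Normal to Alpha, as happened on the morning of 9/11) and lowering it (as happened the night before) are symmetric operations on the same scale.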

Doesn't come as a surprise to me

Doesn't come as a surprise to me in the least. The links below to commongroundcommonsense bring up blog entries I wrote a long time ago; they have since been picked up and mirrored by others. Note that many of the links inside those entries no longer work, but many of the documents were downloaded onto an old hard drive, and every item was read and re-read by me (and others) back then. (I had also merged these topics with situational awareness, TADMUS research, and other subjects, which I attempted to apply to the field of emergency management response; I have published in that field.)

It's a long read. Scan it once, read it again, and tuck it away.

It's about how the above fits, and ties in with PTech, PROmis, the OODA loop, netcentric warfare, Dick Cheney and more.

Google the OODA loop, pick up Coram's biography of Boyd (who is not involved or the target here), and perhaps spend some time looking at the many PowerPoints that have been put together on the OODA loop. It can become a snore, but if you get the essence of what the OODA loop can do (neatly expressed below by Bageant), then you've got the sense of it. Lt. Col. Boyd was the top Air Force fighter-weapons instructor before there was a Navy Top Gun school; he was known as "40-second Boyd." He could think faster in his butt than most people could with all brain cells firing.

“...we no longer take effective action, because it has become impossible to identify what we might do to change anything. Instead, we react to events. That is what the ruling class wants, because if we are reactive, then outcomes can be controlled by controlling the stimuli. Keep 'em dazzled with foot work. So the stimuli keep coming at us faster than we can think. And they are presented as fate, or the result of "fast changing world events," or a banking collapse no one could have predicted -- things to which we must respond immediately. Most of us just give up. Which again, is what the ruling class wants us to do -- become a uniformly pliant mass.”

~ Joe Bageant

Links to original blog entries of mine:


The whole enchilada is mirrored here and expanded:

Paraphrasing for the technically less familiar...

These articles suggest the 'power-down' was unusual/exceptional?

lessons learned

"Today, almost all command, control, communications, computer, intelligence, surveillance, and reconnaissance (C4ISR) systems are acquired via spiral development. In order for these systems to support warfighter missions, they must be networked together with various weapon systems.

Current operational testing is generally dedicated but limited in scope and is conducted by stimulating the system under test with master scenario event lists. Such testing is not fully operational and does not focus on effectiveness in accomplishing the warfighter mission.

This article proposes a strategy to improve the operational realism of testing net-centric C4ISR systems. The strategy is based on the United States Department of Defense’s lessons learned from the Year 2000 operational assessments conducted in 1999 in all combatant commands..."


From Wikipedia

From Wikipedia (http://en.wikipedia.org/wiki/Network-centric_warfare):

Network centric warfare can trace its immediate origins to 1996, when Admiral William Owens introduced the concept of a 'system of systems' in a paper of the same name published by the Institute for National Security Studies. Owens described the serendipitous evolution of a system of intelligence sensors, command and control systems, and precision weapons that enabled enhanced situational awareness, rapid target assessment, and distributed weapon assignment.

Also in 1996, the Joint Chiefs of Staff released Joint Vision 2010, which introduced the military concept of full-spectrum dominance. Full Spectrum Dominance described the ability of the US military to dominate the battlespace from peace operations through to the outright application of military power that stemmed from the advantages of information superiority.
Network Centric Warfare

The term "network-centric warfare" and associated concepts first appeared in the Department of the Navy's publication "Copernicus: C4ISR for the 21st Century." The ideas of networking sensors, commanders, and shooters to flatten the hierarchy, reduce the operational pause, enhance precision, and increase speed of command were captured in this document. Many of these ideas were further refined and developed through models developed by the consulting firm Booz Allen Hamilton working with the Joint Staff in the Pentagon in the mid-to-late 1990s.

As a distinct concept, however, network-centric warfare first appeared publicly in a 1998 US Naval Institute Proceedings article by Vice Admiral Arthur K. Cebrowski and John Garstka. The concepts were later given greater depth in the book Network Centric Warfare, coauthored by Garstka, David S. Alberts (Director of Research, OASD-NII), and Fred Stein of The MITRE Corporation.

Published by the Command and Control Research Program (CCRP), the book derived a new theory of warfare from a series of case studies on how business was using information and communication technologies to improve situation analysis, accurately control inventory and production, as well as monitor customer relations.
Understanding Information Age Warfare

Network-centric warfare was followed in 2001 by Understanding Information Age Warfare (UIAW), jointly authored by Alberts, Garstka, Richard Hayes of Evidence Based Research, and David S. Signori of RAND. UIAW pushed the implications of the shifts identified by network-centric warfare in order to derive an operational theory of warfare.

Starting with a series of premises on how the environment is sensed, UIAW posits a structure of three domains. The physical domain is where events take place and are perceived by sensors and individuals. Data emerging from the physical domain is transmitted through an information domain.

Data is subsequently received and processed by a cognitive domain where it is assessed and acted upon. The process replicates the "observe, orient, decide, act" loop first described by Col. John Boyd of the USAF.
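Boyd's "observe, orient, decide, act" cycle, and the UIAW domain mapping just described (sensing in the physical domain, transmission in the information domain, assessment and decision in the cognitive domain), can be read as a simple control loop. The following Python sketch is purely illustrative; all function names and the threshold value are my own, not anything from Boyd or UIAW:

```python
# Minimal illustrative sketch of Boyd's OODA loop as a control cycle.
# Sensing corresponds to the physical domain, context-building and
# decision-making to the cognitive domain; the data passed between the
# functions plays the role of the information domain.

def observe(environment: dict) -> dict:
    # Physical domain: take a raw reading from the "sensors".
    return {"reading": environment.get("signal", 0)}

def orient(observation: dict, history: list) -> dict:
    # Cognitive domain: place the observation in the context of what
    # has been seen before.
    baseline = sum(history) / len(history) if history else 0
    return {"deviation": observation["reading"] - baseline}

def decide(orientation: dict, threshold: float = 5.0) -> str:
    # Cognitive domain: choose a response based on the oriented picture.
    return "act" if abs(orientation["deviation"]) > threshold else "wait"

def act(decision: str) -> str:
    # Back into the physical domain: execute the chosen response.
    return f"executed: {decision}"

def ooda_cycle(environment: dict, history: list) -> str:
    obs = observe(environment)
    ori = orient(obs, history)
    history.append(obs["reading"])
    return act(decide(ori))

history = []
print(ooda_cycle({"signal": 1}, history))   # small deviation -> executed: wait
print(ooda_cycle({"signal": 20}, history))  # large deviation -> executed: act
```

Boyd's central claim was about tempo: whoever completes this cycle faster than the opponent forces the opponent into reacting to stale information, which is exactly the dynamic the Bageant quote above describes in political terms.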


Network-centric warfare/operations is a cornerstone of the ongoing transformation effort at the Department of Defense initiated by former Secretary of Defense Donald Rumsfeld. It is also one of the five goals of the Office of Force Transformation, Office of the Secretary of Defense.

See Revolution in Military Affairs for further information on what is now known as "defense transformation" or "transformation"...


"Data is subsequently received and processed by a cognitive domain where it is assessed and acted upon."

Puts a new wrinkle on the phrases "cognitive infiltration" and "crippled epistemology", doesn't it?

Does all this

Does all this suggest that the unusual reduction in the security level was to allow some ordinarily unacceptable program to be infiltrated into the military command system?

'the confluence between AI and systems monitoring'

"... At the time of 9-11, I was renting a duplex on the Upper West Side of Manhattan, and I had two roommates: one Pakistani and one Indian; both were British citizens. I knew them because the Pakistani gentleman was a client of SilverStream and the Head of Emerging Technology for Deutsche Bank. He led me into a company called Panacya (PANACYA), where I started work in November 2001 as the Director of Sales for Financial Services. At the time, Deutsche Bank was in the midst of a project called “Blue Sky”, which had a budget of approximately $200 million, and Panacya was one of the technologies in contention for part of the contract.

The Panacya product, called “b@ware”, was the confluence between artificial intelligence and systems monitoring software: it could be applied to observe a given environment and actually learn in a variety of ways, either through learned behavior based on observation and interaction, or through encapsulated instructions conditioned by an authoritative source. This learning and real-time observation was then compared against both real-time and historical information, its performance of course being dependent on the validity of the history which the software references to make its decisions.

Based on these premises, the software could be applied to a system wherein it begins to learn; parallel agents remotely observe the environment, share information in a peer-to-peer manner, and acclimate, both as a group and individually, to become a virtually decentralized network which produces quality intelligence.

When the software agent matures, it is said to graduate; in that state it can actively and accurately predict critical events BEFORE THEY OCCUR, and it has the ability to take preemptive action, through either learned or conditioned behaviors, to prevent critical events from affecting clients or transactions. Of course, if the agent is cut off from its authoritative components, it can only collect and store information until it reconnects with the authoritative engine, whereby action can be taken.

Allow me to restate that: The software gives the user the ability to predict critical events in the applied environment, BEFORE THEY OCCUR, for the purpose of taking pre-emptive action. In simplest terms, the Panacya product could be applied to any information technology architecture, monitor every aspect of that architecture, actively learn, and then consistently predict pending crashes, so that they could be fixed before they occurred and costly downtime was encountered.

Here’s the practical use: Let’s say a client of mine, Fidelity, processes $500,000,000 worth of transactions on a particular server per hour. What is the cost of that server being down for an hour, or a day? How much is it worth to a Corporation to be able to prevent losing $500,000,000 worth of transactions? ...and that’s just the savings realized from preventing one hour of downtime.

I would also ask the question: If software this powerful is being openly offered commercially, are we to believe that our intelligence community doesn’t have software at least this powerful? And under that supposition, that they indeed do, is it not fair to assume that the intelligence community was at the very least aware of some sort of impending critical event prior to 9-11? After all, in hindsight, the Administration itself has offered forth numerous memos and warnings that were received prior to 9-11…

In fact, in many instances, these memos and/or warnings were used as a basis of action by the current Administration… but unfortunately for the American People, the actions taken were not to warn the People of impending danger, but rather to capitalize on the tragedy, as the fraudulent transactions can be tracked back to intelligence community executives responsible for the protection of our own country.

I would also mention at this point, that the information technology systems used by our intelligence community became a virtual scapegoat post-9-11… under the guise of “intelligence failures”, wherein they claimed that 9-11 was a result of the intelligence agencies allegedly not being able to effectively share information across organizations.

In fact, the information systems and software tasked with preserving the security of our country performed without fail. All of the "failures" were choreographed and contrived so as to present an illusion, one that served as the propulsion mechanism through which the intelligence community could be brought under the control of one man, via the creation of a position called Director of National Intelligence, currently occupied by John Negroponte. Just look him up in Wikipedia if you’re not making the connection between the position of DNI and the 9-11 Commission recommendations.

Lastly on this topic, I would mention that the FBI recently had a $250 million software project to integrate their systems, and interestingly, like many projects of its kind, it was suspiciously cancelled, but only after receiving approval and funding. Where does all the money go?"

From the transcript provided to me by Richard Andrew Grove based on his appearance on the Maria Heller show... Track 8
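Stripped of the branding, what the transcript describes in b@ware, agents comparing real-time observations against a learned historical baseline and flagging critical events before a hard failure occurs, is in generic terms baseline anomaly detection. A minimal illustrative sketch follows; all names, window sizes, and thresholds are my assumptions, not anything from Panacya:

```python
from collections import deque
import statistics

class BaselineMonitor:
    """Illustrative monitor: learn a rolling baseline of a metric and warn
    when the current value drifts far from it, before a hard failure."""

    def __init__(self, window: int = 20, sigmas: float = 3.0):
        self.history = deque(maxlen=window)  # rolling "learned" history
        self.sigmas = sigmas                 # how far is "abnormal"

    def observe(self, value: float) -> str:
        # Until enough history accumulates, the agent is still "learning".
        if len(self.history) < 5:
            self.history.append(value)
            return "learning"
        mean = statistics.mean(self.history)
        stdev = statistics.stdev(self.history) or 1e-9  # avoid divide-by-zero
        verdict = "warn" if abs(value - mean) > self.sigmas * stdev else "ok"
        self.history.append(value)
        return verdict

monitor = BaselineMonitor()
# Normal load on a hypothetical transaction server...
for load in [100, 102, 99, 101, 100, 98, 103]:
    monitor.observe(load)
# ...then a sudden spike: flagged before the server actually falls over.
print(monitor.observe(250))  # -> warn
```

In the transcript's Fidelity example, the value being watched would be something like transactions per hour on the critical server: a warning fires while the machine is still up, which is where the claimed savings come from.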