by Loop Garou
This article is based on a comment LG posted on another 9/11 thread. We welcome replies and rebuttals; please send them to [email protected], marked “9/11”.
The National Institute of Standards and Technology (NIST) was engaged by Congress and by FEMA, shortly after the events of 9/11, to produce a report on the destruction of the three WTC towers.
While it did pursue some initial real-world experimentation (which will be discussed in turn), NIST based its conclusions about the collapses primarily on computer models.
It follows that those conclusions can only be as good as the models behind them.
Let me explain first how a predictive computer model works. It’s virtual reality. Whether you are building a model to predict the stock market or a building collapse, you are essentially giving a computer a set of rules that let it construct a simulation of your money markets or your building. The most important thing to understand is that the result you get is only as reliable as the data you put in, because computers are quick but not smart.
If you input garbage, you will output garbage. If you punch in wrong values, a computer won’t realise they make no sense; it will just run its program with those values and produce a result that has no connection to the real world, and can even be downright ridiculous. There’s no fail-safe or common-sense override. Punch the wrong data into your computer model and you will get “proof” that cars can drive on water, or that birds can fly through solid rock.
Any computer model of anything is only as good as the parameters fed into it.
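To make the “garbage in, garbage out” point concrete, here is a deliberately toy sketch (my own illustration, not anyone’s real engineering code). The buoyancy rule itself is sound physics; the point is that the program has no way to notice when the numbers fed into it are wrong:

```python
# Toy illustration of "garbage in, garbage out" (GIGO).
# Simple buoyancy rule: an object floats if its average density
# is below that of water (~1000 kg/m^3). The rule is fine; the
# model cannot sanity-check the inputs it is given.

WATER_DENSITY = 1000.0  # kg/m^3

def floats_on_water(mass_kg, volume_m3):
    """Return True if the object's average density is below water's."""
    return (mass_kg / volume_m3) < WATER_DENSITY

# Correct input: a 1 m^3 block of steel weighs about 7850 kg.
print(floats_on_water(7850, 1.0))   # False -- steel sinks, as expected

# Garbage input: the same block with a mistyped volume (10.0, not 1.0).
print(floats_on_water(7850, 10.0))  # True -- the model "proves" steel floats
```

The program runs happily either way; only the person choosing the inputs can keep the result connected to reality.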
NIST’s models can’t be assessed independently as a whole because NIST refuses to release any data about them. Their stated reason is that releasing the documents might endanger national security. However, NIST did disclose some limited information about their parameters in the body of their reports, the most perturbing and inexplicable item of which is their acknowledgement that they assigned all the steel in their WTC model a thermal conductivity of zero, or close to zero.
To explain what that means to readers without a science background, just consider what you would expect to happen if you placed one end of a steel bar in a fire and kept hold of the other end. Would you expect:
A) the end you were holding to gradually heat up to the point you could not keep it in your hand?
B) the end you were holding to remain cool no matter how hot the end in the fire becomes?
Believe it or not, NIST chose the second option. Here it is in their own words:
“The interior walls [including insulated steel columns] were assumed to have the properties of gypsum board [0.5 W/m/K].” NCSTAR 1-5F, p 52
“Although the floor slab actually consisted of a metal deck topped with a concrete slab…the thermal properties of the entire floor slab were assumed to be that of concrete [1.0 W/m/K].” NCSTAR 1-5F, p 52
You don’t need to be a professional scientist to know this is bunkum and a total disregard of basic physics.
Why does this matter? It matters a LOT. Changing the assumed conductivity of steel from its actual figure to zero would allow the model to produce much higher temperatures in the steel directly exposed to fire than would be possible in reality. It’s like calculating the amount of water you could get into a sieve at any one time by assuming the sieve has no holes. The model will show the sieve can be filled to the brim, but that is just so much garbage with no real-world application.
Just so with the temperatures of the steel. NIST needed a model in which relatively cool office fires of around 800 degrees could somehow produce enough heat in localised areas to weaken and buckle steel girders and struts. If they had allowed the steel to behave normally and wick the heat away along its length, they simply could not have achieved this. Only by turning the assumed thermal conductivity to zero (the equivalent of assuming the sieve has no holes) could they get their model to create enough localised heat to do the buckling and weakening.
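The steel-bar thought experiment above can be sketched numerically. This is a minimal one-dimensional heat-conduction simulation of my own, using textbook values for structural steel (density, heat capacity, and a conductivity of roughly 45 W/m/K) and an assumed setup: a half-metre bar, one end clamped at 800 °C by the fire, the other end insulated and starting at 20 °C. None of these figures come from NIST’s reports; the sketch only illustrates the qualitative difference between realistic conductivity and zero conductivity:

```python
# Minimal 1-D heat-conduction sketch (explicit finite differences).
# Assumed setup for illustration: 0.5 m steel bar, hot end held at
# 800 C, far end insulated, everything initially at 20 C.

def far_end_temp(k, hours=2.0):
    """Temperature (C) at the cool end of the bar after `hours` of heating.

    k : assumed thermal conductivity in W/m/K.
    """
    rho, c = 7850.0, 490.0        # steel density (kg/m^3), heat capacity (J/kg/K)
    alpha = k / (rho * c)         # thermal diffusivity (m^2/s)
    n, length = 21, 0.5           # grid nodes, bar length (m)
    dx = length / (n - 1)
    dt = 10.0                     # time step (s); alpha*dt/dx^2 < 0.5, so stable
    r = alpha * dt / dx ** 2
    T = [20.0] * n                # whole bar starts at room temperature
    T[0] = 800.0                  # hot end clamped to the fire temperature
    for _ in range(int(hours * 3600 / dt)):
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
        Tn[-1] = T[-1] + 2 * r * (T[-2] - T[-1])  # insulated far end
        T = Tn
    return T[-1]

print(far_end_temp(45.0))  # realistic steel: the held end gets too hot to touch
print(far_end_temp(0.0))   # zero conductivity: the held end never leaves 20 C
```

With a realistic conductivity the far end climbs well past 100 °C within a couple of hours, because conduction carries heat along the bar; set the conductivity to zero and the far end sits at room temperature forever, and, by the same token, all the fire’s heat stays trapped in the directly exposed region.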
This is a huge problem. In fact it could not be a bigger problem. This bogus assumption that steel has zero thermal conductivity not only renders the NIST report as a whole deeply suspect, it entirely nullifies even the flawed basis for its “collapse by fire” hypothesis.
This is why so many scientists are calling for another investigation. They aren’t saying the government did it, and they aren’t claiming a conspiracy; they just see huge errors in the previous investigation and want more work to be done.
The bottom line is that NIST punched in false data that totally invalidated their model. The zero-thermal-conductivity issue alone is sufficient grounds for a new investigation.