The environmental extremists and their intelligentsia want us to believe that every global warming prediction is 100% correct. But computer models can err and quickly draw wrong conclusions. This author has personally developed, and directed the development of, several computer models. It is very easy for a computer model to be wrong. Frankly, it is rather remarkable that they ever make correct predictions, given how many errors can creep into a model and cause it to predict erroneous results.
Second, the average computer modeler comes to model development with a particular bent: he wants to see a specific result. With that in mind, this author has jokingly said that he should sell his modeling skills to the highest bidder: “Tell me what you want to model and what you want it to predict, and I will build you a model.” That would be unethical, of course, but everyone this author has ever met who was developing a computer model wanted it to predict a particular result.
If it showed that result, the modeler could quit and call the model complete. If it didn’t, the modeler kept working to develop it further. Even when a particular result is not a conscious goal, subconsciously most modelers are looking for one. So in addition to all the possible errors that can affect model results, the modeler’s natural bent must always be considered. How ethical is the modeler or the modeling team? Would they intentionally slant a model to produce the results they want? We want to think most would not deliberately slant a model toward a desired result.
One must wonder about this — particularly in the global warming debate because all sorts of inappropriate, unethical tricks are used to declare predicted results absolute truth and discourage others from questioning those results. “The debate is over. Consensus has been achieved!” Science doesn’t work by consensus — and the debate is hardly ever over. “The Hollywood elite support the results!” Who cares what Hollywood thinks? “How dare you suggest these results are not accurate?” Well, some people know something about models and the model development process. They understand all the possible pitfalls of model development. “How dare you disagree with us?” We disagree for many reasons that have not been included in the debate. We disagree because the debate never occurred. If the intelligentsia is willing to play debating games and wants to stifle discussion when they think their side is in the lead, one must look carefully at all details and question all results.
A computer model is a program designed to simulate a particular phenomenon and make predictions of its expected behavior. For example, this author used computer models to predict the viscous behavior of fluids and suspensions in industrial systems. Computer models are also used to render computer-generated movies: complex algorithms simulate how light bounces from sources, reflects off shiny objects, and reaches the viewer’s eye. Only after the underlying models and algorithms correctly predicted light reflections could they be used to generate movies. The following list includes many of the pitfalls that can unintentionally hinder the success of computer models:
First, models are simplifications of real phenomena. The modeler(s) must determine the proper mathematics to simulate each phenomenon of interest. One usually selects the most straightforward mathematical algorithm that will perform the task. If one chooses incorrectly, the results will be in error. For example, some phenomena appear to have a linear behavior, but that linear behavior may change to non-linear behavior under certain extreme conditions. If that is not known in advance, the model may be asked to predict values in the extreme-conditions territory, and errors will result. This happens easily.
For example, the fluid viscosity of a suspension (powder mixed into a fluid) starts as a linear function of the concentration of powder added to the liquid. When the concentration of powder is small, the behavior is linear. But as the concentration of powder increases, the viscosity behaves in a non-linear manner. The initial linear behavior is relatively simple to program into a model, but the non-linear behavior is difficult to model accurately. It is easy to make programming mistakes and utilize the wrong mathematics. This second pitfall is closely related to the first: if you think you know how a particular phenomenon behaves, but you use the wrong equation, the model will predict erroneous values.
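The linear-versus-non-linear pitfall can be sketched numerically. The snippet below is purely illustrative and is not the author’s actual model: it compares the classic linear (Einstein) dilute-suspension relation against the non-linear Krieger–Dougherty relation, using textbook parameter values. The two agree at low powder concentrations and diverge badly at high ones, which is exactly how a linear model extrapolated into non-linear territory goes wrong.

```python
# Illustrative only: a linear suspension-viscosity model versus a
# non-linear one. Parameter values are textbook examples, not data
# from any real process.

def viscosity_linear(phi, eta0=1.0):
    """Einstein's dilute-suspension relation: valid only for small phi."""
    return eta0 * (1.0 + 2.5 * phi)

def viscosity_nonlinear(phi, eta0=1.0, phi_max=0.64):
    """Krieger-Dougherty relation: diverges as phi nears the packing limit."""
    return eta0 * (1.0 - phi / phi_max) ** (-2.5 * phi_max)

for phi in (0.01, 0.10, 0.40):
    lin = viscosity_linear(phi)
    nonlin = viscosity_nonlinear(phi)
    print(f"phi={phi:.2f}  linear={lin:.2f}  non-linear={nonlin:.2f}")
```

At phi = 0.01 the two relations agree to within about 0.1%; at phi = 0.40 the non-linear value is more than double the linear prediction, so a model built only on the linear relation would be badly wrong there.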
Some phenomena are difficult to model. Sometimes the combined result of a set of phenomena is not known in advance, and one must perform a complex calculation each time those phenomena are used. Rather than using a single mathematical equation to simulate a function, it may be necessary to simulate the underlying phenomena to arrive at the result. This can force a model within a model, adding complexity to the calculation.
For example, rather than using a simple mathematical equation to simulate how clouds affect sunlight, it may be necessary to model an individual raindrop’s behavior in sunlight, and then model the action of the bazillions of raindrops that form a cloud, to determine how an individual cloud will behave in sunlight. As one builds up to simulate the whole, the model can take on enormous proportions, and the calculation times can become too long. Having gone through such an exercise, one must still determine whether the equations and algorithms at each step were modeled accurately.
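A minimal sketch of such a “model within a model,” using invented numbers and deliberately toy physics (geometric droplet cross-sections fed into a simple Beer–Lambert attenuation law), not real cloud optics: a per-droplet sub-model supplies a quantity that the aggregate cloud model then consumes.

```python
import math

# Toy "model within a model": a per-droplet sub-model feeds an aggregate
# cloud model. All numbers are made-up illustrations, not real cloud physics.

def droplet_cross_section(radius_m):
    """Sub-model: geometric scattering cross-section of one droplet (m^2)."""
    return math.pi * radius_m ** 2

def cloud_transmittance(droplets_per_m3, radius_m, thickness_m):
    """Aggregate model: Beer-Lambert attenuation through the whole cloud."""
    sigma = droplet_cross_section(radius_m)          # re-invokes the sub-model
    return math.exp(-droplets_per_m3 * sigma * thickness_m)

t = cloud_transmittance(droplets_per_m3=1e8, radius_m=10e-6, thickness_m=50)
print(f"fraction of sunlight transmitted: {t:.3f}")
```

Here the nesting is only two levels deep and each level is one line of math; the point is that every added level multiplies the work, and each level’s equations must be verified separately.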
The memory capacity of a computer and its speed of computation can be limiting. This was more of a problem 20-30 years ago, but sizes and speeds can still be limiting today. On the early computers used by this author, one could model whatever one wished, as long as it fit into a 64,000-byte program (relatively small as computer programs go). Program sizes were limited, and the numbers of available memory locations were also limited. Computers have grown over the years, and most programs can now be so large that a programmer doesn’t need to be concerned with size limitations or memory capacity. But sometimes, these still need to be taken into account.
Computation times can grow exponentially with certain simulations, so one must determine how long a particular computation will take. If computation times for a specific phenomenon double with each new iteration, requirements can quickly outgrow the available memory and the allowable computation times, and models can reach those points within relatively few iterations. Suppose, for example, that it takes one full day to perform one simulation iteration, and the calculation time doubles with each new iteration. How long is the modeler willing to wait to complete the simulation? See how quickly this builds: one day, two days, four days, a week, two weeks, a month, two months, four months, eight months, 1 1/3 years, etc. Again: how long is the modeler willing to wait?
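The doubling arithmetic above is easy to verify: the k-th iteration takes 2^k days, so the tenth iteration alone takes 512 days (about 1 1/3 years), and the first ten together take 1,023 days.

```python
# The doubling-time arithmetic from the text: if the first iteration takes
# one day and each iteration takes twice as long as the last, the k-th
# iteration takes 2**k days.

def iteration_days(k, first_iter_days=1.0):
    """Length of the k-th iteration (k = 0, 1, 2, ...) when times double."""
    return first_iter_days * 2 ** k

for k in range(10):
    print(f"iteration {k + 1:2d}: {iteration_days(k):5.0f} days")

total = sum(iteration_days(k) for k in range(10))
print(f"total for ten iterations: {total:.0f} days")   # 1023 days
```

This matches the run of waits listed above: one day, two days, four days, roughly a week, two weeks, a month, and so on up to about 1 1/3 years for the tenth iteration alone.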
How many raindrops are needed to form a cloud? How many individual droplets must be simulated to model the behavior of a cloud adequately? How many droplets in combination are required to simulate the interaction of light with a cloud? If these simulations define a model, we’re talking about huge numbers of droplets, huge memory requirements, and overly long computing times. Even if this process started with an iteration taking a fraction of a second, it doesn’t take many doublings to reach the full day at which the list in the previous paragraph began.
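A back-of-envelope memory estimate makes the point concrete. Every number here is invented for illustration (the bytes stored per droplet, the droplet density, the cloud volume); real models and real clouds differ, but the orders of magnitude are the problem regardless of the exact values.

```python
# Back-of-envelope memory estimate for a droplet-by-droplet cloud model.
# All three constants are assumptions for illustration, not measured values.

BYTES_PER_DROPLET = 48      # e.g. position, radius, velocity as doubles
DROPLETS_PER_M3 = 1e8       # assumed droplet number density
CLOUD_VOLUME_M3 = 1e9       # a modest cloud of one cubic kilometer

droplets = DROPLETS_PER_M3 * CLOUD_VOLUME_M3
bytes_needed = droplets * BYTES_PER_DROPLET
print(f"droplets: {droplets:.0e}")
print(f"memory:   {bytes_needed / 1e12:,.0f} TB")
```

Under these assumptions a single one-cubic-kilometer cloud would demand on the order of 10^17 droplets and millions of terabytes of memory, which is why droplet-by-droplet simulation of whole skies is out of the question.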
In some cases, a modeler’s mathematical ability can limit a model’s complexity. Some phenomena are complicated to simulate mathematically. If the modeler cannot work out the mathematics, he cannot program that calculation into a computer. Some models require advanced calculus or other higher mathematics to solve a problem efficiently. If that level of math is beyond the modeler’s capabilities, a less elegant, longer method of calculation may be required. If that is not possible, it may be necessary to postpone finishing the model until the appropriate algorithms become available.
The fighter jet with its wings canted forward comes to mind. This is a fundamentally unstable configuration for an airplane; its natural tendency is to flip over and fly backward. Two technological advancements were needed to design and test such a plane. (1) It needed a controller that could adjust its control surfaces quickly enough for it to fly; pilots were not quick enough, so designers had to wait until fast computers were available to control the plane. (2) It needed light, stiff composite materials for the wings. Stresses on the wings of such an airplane are incredibly high, and for years there were no materials that could handle the stresses and still be light enough for use in a fighter jet. Designers had a great idea but needed to wait for the technology to catch up.
Computer modelers can have great ideas, too, but if they cannot code sufficiently complex mathematics, they may have to wait. An important phenomenon can also simply be overlooked. When problems randomly occur in an industrial process setting, it usually means one or more important phenomena have not been considered in the control schemes. Process engineers do their best to include ALL important phenomena in their control algorithms, but most processes still suffer from random, unpredictable problems. Most of these are blamed on Murphy, but most occur because critical control phenomena have been overlooked. In one particular plant control process, we thought we had considered all possible factors. Yet an occasional batch of raw materials didn’t follow expectations and caused enormous problems. When searching for an answer, we learned that a particular characteristic of the batch materials was responsible. In maybe 95% of all batches, this variable was not a problem, but in 5% of the batches, that particular characteristic was extreme, and many problems occurred.
This same behavior happens in computer models. For example, according to the ‘big boys’ in the global warming debate, the earth is not heating due to variations in solar radiation from the sun. So what if a computer modeler omits solar radiation from the earth’s temperature calculation because he is told the sun doesn’t matter? The results will be erroneous, because the sun certainly does affect the earth’s temperature.
There are lots of reasons why a modeler can overlook an important phenomenon. Sometimes one phenomenon is not known to affect another. When calculating the earth’s temperature, must one consider the area of paved parking lots?… auto emissions?… the heights of downtown buildings?… etc. It is easy to miss necessary phenomena simply because no one deemed them important enough for inclusion.
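The consequence of omitting a phenomenon can be shown with a toy zero-dimensional energy balance for the earth, built from the standard Stefan–Boltzmann radiative-equilibrium formula with textbook values (solar constant, albedo; greenhouse effects deliberately ignored). This is a classroom illustration, not a climate model: with the solar term included it gives the well-known effective temperature of about 255 K, and with the sun “forgotten” it gives absolute zero.

```python
# Toy zero-dimensional energy balance, to show how omitting one phenomenon
# (solar input) wrecks a model's answer. Textbook constants; greenhouse
# effects are deliberately ignored in this sketch.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0        # solar constant, W m^-2
ALBEDO = 0.30         # fraction of sunlight reflected back to space

def effective_temperature(solar_in):
    """Radiative-equilibrium temperature for a given incoming solar flux."""
    absorbed = solar_in * (1.0 - ALBEDO) / 4.0   # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

print(f"with the sun:    {effective_temperature(SOLAR):.0f} K")   # ~255 K
print(f"sun 'forgotten': {effective_temperature(0.0):.0f} K")     # 0 K
```

The omission doesn’t produce a subtly wrong answer; it produces a meaningless one, which is the fate of any model missing a dominant phenomenon.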
Are the mathematics of a phenomenon constant with time, or do they change? This question affects computer models that cover long time frames (like the global warmers’ models). Do atmospheric gasses absorb radiant energy today the same way they did thousands of years ago, and the same way they will thousands of years in the future? Lots of other phenomena should be questioned in this same way. Uniformitarian principles suggest that everything happens today as it happened in the distant past and as it will happen in the distant future. There are problems, though. According to the evidence, the earth’s magnetic field not only changed several times in the past but supposedly switched polarities several times (i.e., the north became south, and the south became north). If a phenomenon depends on the earth’s magnetic field, how does one handle that in a computer model?
Darwinian evolution and uniformitarianism are closely related. Both theories say that changes occurred slowly and that all phenomena behaved similarly throughout those eons. True? False? It depends on whom you ask, because creationists who believe in a young earth are grouped with catastrophists, who say that the world was formed by a series of catastrophes, not by gradual changes over eons. Even in that case, unless known otherwise, one must assume that all phenomena occurred in the past, and will occur in the future, as they appear today. But in that case, the models may only need to deal with thousands of years rather than millions or billions of years.
One more question needs to be taken into account: when computer models are developed, are they checked against real data? Are the results published for all to see? This author developed several computer models that were applied to ceramic process systems. Those results were all published in the technical ceramics literature, because they were only relevant to a small technical community. But each model had to be proven against real phenomena. Each model had to be tested to determine whether it accurately simulated the real behavior. When no prior data was available, the author had to perform experiments to verify that the computer’s predictions were correct. In some cases, the real results were already well known, or data was already available to show the behavior.
The models were then used to explain why the behavior occurred. Extra tests did not need to be run in those cases because the results were well known; the reasons why the effects occurred were the answers sought from the computer models. And then, depending on the nature of the models, the results were published in appropriate journals. In the case of global climate models, the results appear to be buried in the technical literature, and we are left with the media’s and the politicians’ explanations that dire events are soon upon us! If the models are so important that they will affect our economy and our lives, results that demonstrate the integrity of the models should be published in the open literature for all to see. If today’s mass media believes these models are so accurate that Washington will alter our behaviors in response, then we should not need to dig to find the articles that describe the models and prove the accuracy of their results.
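The validation step described above, checking a model’s predictions against measured data before trusting it, can be sketched as a simple error check. The data values and the pass/fail tolerance below are placeholders, not real measurements; the point is only the procedure: compute an error metric between prediction and measurement, and refuse to call the model proven until that error is acceptably small.

```python
import math

# Sketch of model validation: compare predictions against measured data
# and report an error metric. Data and tolerance are placeholders.

def rmse(predicted, measured):
    """Root-mean-square error between two equal-length sequences."""
    n = len(predicted)
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / n)

measured  = [10.1, 12.0, 13.9, 16.2]   # hypothetical experimental values
predicted = [10.0, 12.1, 14.0, 16.0]   # hypothetical model output

error = rmse(predicted, measured)
print(f"RMSE = {error:.3f}")
assert error < 0.5, "model fails validation against the data"
```

Publishing exactly this kind of comparison, model output next to independent measurements with a stated error, is what the open literature ought to contain for any model meant to drive policy.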
According to some, we have collected excellent satellite temperature data since 2002. Our best computer models should be tested against those satellite data to demonstrate that the models can accurately predict 2010 weather behavior, and those results should be published in the open literature for all to see. We should not need to take the word of politicians, environmental extremists, or the intelligentsia that we are in jeopardy of dire consequences from global warming. They should be willing to show these important results to all of us. That they are unwilling to do so lends credibility to the idea that global warming is nothing but a hoax, perpetrated to allow the redistribution of wealth from the “haves” like the US and Europe to the “have nots” like third world countries.
If results are published broadly, will we see solid, logical answers to our questions? If global warming is causing the too-violent hurricanes of the last several years (note: we haven’t had any, to this author’s knowledge), are the modelers going to give reasonable explanations for such predictions, or must we continue to hear only from the politicians and extremists, “Well, of course, global warming is to blame!”? That is no explanation, and computer modelers must have more substantial, logical answers for such claims than that. An “of course it is responsible” response is insufficient for us to believe that heat waves, cold waves, hurricanes, tornadoes, snowstorms, etc., result from global warming. If modelers believe this to be true, they must have better answers than just “Of course.”
Can a computer model predict climate events 10 to 50 years from now? Professor Cotton, a Professor of Atmospheric Science at Colorado State University [Cotton, W.R., Colorado State University, “Is the climate predictable on 10-50 year time table?”, 20 Jul 2010, PowerPoint presentation], concluded that it is not. According to Cotton, too many unpredictable phenomena affect our weather to make accurate predictions over that time frame possible. Have any of the other computer modelers asked and answered this question before beginning their computer modeling quests? Apparently, such thinking and questioning were insufficient to stop them from attempting to develop such models.
According to the Bible, God controls the wind and the rain. This means God controls the weather and the climate. If He wants rain, snow, hail, or drought at some particular location on the earth, He can make it so! Have computer modelers taken this into account in their models? This author has seen at least two managers exert such control over their processes that each manager effectively became an input variable in his operation. The engineers responsible for those processes had to take their managers’ decisions into account to control the processes successfully. This made the processes awkward to control, because the managers’ decisions were unpredictable. If God is in control of the wind and the rain in particular, and of the weather in general, how can a modeler take that into account in a model that predicts climate 50-100 years from now? The Bible says, “For who hath known the mind of the Lord?” [Rom 11:34] Man certainly doesn’t! So how can a computer model account for God’s decisions? It can’t! It is simply impossible!
There are lots of potential problems that computer modelers must face in the development of climate change models. Some are within their control; some are entirely beyond it. Some apply specifically to global climate change models, while most apply to all computer models. There are enough potential pitfalls in the development of such models that this author believes we should see detailed descriptions, results, and proofs of integrity in the open literature.
If environmentalists truly believe we face dire consequences shortly, all of these details, answers, and results should be out there where all can see them. That should be the case if they have nothing to hide and sincerely believe in their products. But the underhanded arguments and sneaky methods (“The debate is over!”) suggest there is more to these computer model results than meets the eye. When Phil Jones, former director of the University of East Anglia’s Climatic Research Unit [Petre, Jonathan, UK Daily Mail: “Climategate U-turn as Scientist at Centre of Row Admits: There Has Been No Global Warming Since 1995,” 11 Aug 2010], recently admitted that “there has been no ‘statistically significant’ warming over the past 15 years,” one begins to wonder what kind of shenanigans the politicians are trying to pull.
Computer models are beneficial for helping us understand all sorts of phenomena, and many have been developed and used to explain many different phenomena. Those who wish to model global climate change over the next 50-100 years should be eager to show the proof, testing, and use of their models. That the modelers are staying quiet and allowing the extremists, politicians, and intelligentsia to defend their models’ results suggests that something underhanded is up!
Dennis Dinger is a Christian who is Professor Emeritus of Ceramic and Materials Engineering at Clemson University. In 2008, he curtailed his ceramics career after being diagnosed with a blood cancer called Multiple Myeloma. By 2010, the cancer was in complete remission. Over the past three decades, he has directed many applied ceramic engineering research projects; he has been an active researcher and private consultant. He is the author of several ceramic engineering textbooks and several Christian books.
His book, Global Climate Change, the Bible, and Science, was written to enter the author’s thoughts and reasoning into the global warming debate. In this book, he shows the Bible references which support three important points: (1) God created, (2) God controls the day-to-day workings of the creation, and in particular, (3) God controls the wind and the rain (that is, God controls the weather and the climate). Also included are discussions of process control systems, an understanding of which is needed by those who want to create climate models; some important natural cycles which have been in balance (without humanity’s help) for years and years; and possible pitfalls for computer models. These and other related topics are discussed in this book.