Matsu told me on Friday that Edward Lorenz had died recently. Somehow, I had missed that.
Lorenz was one of the first folks to start talking about chaos theory, and did most of his early work in the area of weather prediction. Going beyond the standard knowledge that weather prediction is a bit hit or miss, Lorenz began to work on simulations of weather patterns, and observed that even tiny, supposedly insignificant changes in initial conditions made HUGE changes in the final outcome.
This was the so-called butterfly effect – that, theoretically, a butterfly flapping its wings in Mongolia could have a profound impact on weather patterns in Brazil in the months or years to follow. It’s not that the butterfly’s motion *causes* those changes, as some news stories oversimplified, but just that tiny changes in initial data have enormous impacts on the final output of the formula. Indeed, this is one of the definitions of chaos theory – sensitivity to initial conditions.
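You can see this for yourself with a few lines of code. Below is a minimal sketch (not Lorenz’s actual weather model) that integrates his famous 1963 system of three equations, using his classic parameter values and a simple Euler step, then runs two copies whose starting points differ by one part in a billion:

```python
# Lorenz's 1963 system:
#   dx/dt = sigma * (y - x)
#   dy/dt = x * (rho - z) - y
#   dz/dt = x * y - beta * z
# with his classic parameters sigma=10, rho=28, beta=8/3.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    # One crude Euler integration step -- enough to show the effect.
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def simulate(state, steps):
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = simulate((1.0, 1.0, 1.0), 3000)
b = simulate((1.0, 1.0, 1.0 + 1e-9), 3000)  # perturbed by one part in a billion

# After 30 simulated time units, the two trajectories have completely
# diverged, even though they started almost identically.
print(a)
print(b)
```

The tiny perturbation grows exponentially until the two runs bear no resemblance to each other, which is exactly what Lorenz saw when he restarted a simulation from rounded-off printout values.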
In college, I took a course in Chaos Theory, using Gleick’s book as our textbook. It was more of a research class than a lecture class – we were learning at the same time as our professor, and there were just two of us in the class. It was a truly fascinating class, not least because of looking at Lorenz’s research. He came up with results that were obviously wrong – at least given the knowledge of the day – and persisted in looking at them, rather than throwing them out like many of his colleagues did at the time. Indeed, many advances in the sciences are made by people who refuse to accept the common knowledge, and instead pursue results that are, by common wisdom, obviously wrong.
So, every time you look at those computer-generated weather prediction reports, you have Edward Lorenz to thank that they are as accurate as they are, and that weather prediction is anything more than a guess — even though it still feels that way most of the time.