Monday, May 11, 2020

Untapped Potential


About 15 years ago Janice and I made the decision to move back to Northwestern Ontario. At the time we were living in Northern Manitoba. One thing was certain, especially for me: we would not remain there, as the winters are too long and the summers too short. For me, returning to the area I grew up in was easy; I like the temperate climate with its full four seasons, the general beauty, and the broad spectrum of outdoor activities available, even if not as broad as before we left for Manitoba back in 1991.

Having returned I have noted many changes, some subtle and others not, but none that are positive. An illustration is the state of the forest industry that was once the keystone of the local economy: when we left Ontario there were 10 pulp and/or paper mills operating in the Kenora and Thunder Bay Districts; now there are only 3. Relatedly, I have served as a volunteer on the Lake Nipigon Forest Local Citizens Committee. Over my time on the committee I have come to realise that a major impediment to the forest industry is over-regulation, including the fact that any business interested in a forest-based activity has no choice but to follow the rules set out in the “Forest Management Planning Manual” and its related guides. Together these regulate how forestry practices must be carried out, with little if any consideration of economic sustainability. In other words, the government, through the Ministry of Natural Resources and Forestry, rather than just providing a framework of targets to guide development, is telling people how to run their business. And we all know, or should, that the one organisation that does not know how to do things efficiently is government.

The past Liberal government is largely to blame, but unfortunately, even if the current Conservative government had any desire to roll back these impediments, the current bureaucracy will not help, as they love these restrictive regulations.

To illustrate, I tried writing to the Minister of Natural Resources and Forestry about the above and several other matters, and included a suggestion on increasing revenues: I proposed that more recreational lots be made available. First, I doubt very much that the letter was ever seen by the minister, and secondly, the response to my suggestion was “Crown land is no longer actively marketed, rented or sold for private recreational or residential use”. As most of the land in our area is “Crown Land”, that does not bode well.

Look at what has taken place in Minnesota, just south of us, and it is like night and day. While Minnesotans are very anti-mining, even though Northern Minnesota was developed by mining and mining still contributes significantly to the local economy, they have developed their tourism industry immensely, through both government and private development. The Sibley Peninsula, closer to home, has some development, but not much. Then we get to the Black Bay Peninsula, a true gem in the rough with fantastic and impressive natural features that few ever get to see.

Through much lobbying by some local interests, Nipigon Bay has been declared a significant part of the Lake Superior National Marine Conservation Area, managed by Parks Canada. A “park” conserving what, I do not know. But why not transfer the Black Bay Peninsula, and even the islands bounding Nipigon Bay, to the federal government to make a real “National Park” rather than the current ephemeral “Conservation Area”? That, I think, would truly be a great boon, as an area of profound beauty would become accessible to the general populace, even if under controlled access. Our very own Banff.

An alternative that could have similar positive benefits for our area is if the signatories to the Robinson-Superior Treaty, the Ojibwe of the north shore of Lake Superior, decided to negotiate a modernisation of said treaty. Why do I even suggest this? Read the treaty and you quickly see that it amounts to putting all Ojibwe on welfare and under the protection of Queen Victoria and her successors, in effect treating them as children. Back in 1850 there is no question that the two cultures were quite different; I will not comment, as it simply was a different world then. But 170 years later it is time the treaty was updated, as both sides are now on equal footing when it comes to deciding what is right, and without question the Ojibwe are “all grown up” and should not be treated as children any more. They are very capable of being much better managers of their own future. A key to life in the present is control over the land you live on. So why not trade the annual welfare payments for restored title to lands the Ojibwe can then take responsibility for, such as the Black Bay Peninsula and associated islands?

The “modern” treaties, those signed in the latter half of the twentieth century, such as the Nunavut Land Claims Agreement and the James Bay and Northern Quebec Agreement, set the precedent: self-management of some of the “Crown” land. Just a thought for my fellow citizens of this country who are Ojibwe. I for one am open to anything that foretells a future of hope, not the current one of despair thanks to the disdain of those at Queen’s Park.

An Introduction to Data Analysis and Models


I graduated from Lakehead University in 1977, and ever since my career has focused on data collection, management, and analysis, with the last 25 or so years dealing primarily with computer modelling. As that topic has been much in the news for some time now, I thought it might be helpful to explain to the lay person some of its characteristics, along with its strengths and weaknesses.

When dealing with digital data a primary factor can be summarised thusly: garbage in, garbage out. In other words, if you use poor data, do not expect anything but poor results. What makes data “poor”, you ask? Combining data of different quality is one example. So is mixing data from different sources, collected using different methods, or inconsistency in data coverage (data clustering). All introduce severe biases.
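
To make the clustering point concrete, here is a minimal sketch in Python with made-up numbers (the values, cell size, and layout are purely hypothetical) showing how oversampling one small area skews a simple average, and how a basic cell-declustering weight corrects for it:

    # Hypothetical example: many samples clustered in one small, high-value
    # zone bias a naive average; weighting each sample by 1/(samples in its
    # cell) is a simple cell-declustering correction.
    from collections import Counter

    # (location, value) pairs: 6 clustered samples near x=0, 3 spread out
    samples = [(0.1, 9.0), (0.2, 9.5), (0.15, 9.2), (0.05, 9.4), (0.12, 9.1),
               (0.18, 9.3), (5.0, 2.0), (10.0, 1.5), (15.0, 2.5)]

    naive_mean = sum(v for _, v in samples) / len(samples)

    # Assign each sample to a 5-unit-wide cell, then weight by 1/count per cell
    cells = [int(x // 5) for x, _ in samples]
    counts = Counter(cells)
    weights = [1.0 / counts[c] for c in cells]
    declustered = sum(w * v for w, (_, v) in zip(weights, samples)) / sum(weights)

    print(f"naive mean:       {naive_mean:.1f}")  # 6.8, pulled up by the cluster
    print(f"declustered mean: {declustered:.1f}")  # 3.8, closer to representative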

Once a sample has been collected it may be sent to a laboratory for analysis. Here we add another layer of biases that complicate the quality of the data set, as there are a few different methods available and all have strengths and weaknesses, especially when what we are looking for is present in very tiny quantities, such as gold. This introduces the concepts of “accuracy” and “precision”. The former is the measure of how close the results are to the actual value; the latter, how repeatable the results are. Ideally you want both, but that does not always happen.
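
A quick numerical sketch of the distinction, again in Python (the assay numbers are hypothetical; assume the true gold grade of a reference standard is 1.00 g/t):

    # Hypothetical repeat assays of a standard whose true grade is 1.00 g/t.
    import statistics

    TRUE_VALUE = 1.00  # g/t, the known grade of the reference standard

    # Method A: tightly repeatable but offset high -> precise, not accurate
    method_a = [1.21, 1.19, 1.20, 1.22, 1.20]
    # Method B: centred on the truth but scattered -> accurate, not precise
    method_b = [0.80, 1.25, 0.95, 1.15, 0.85]

    for name, assays in (("A", method_a), ("B", method_b)):
        mean = statistics.mean(assays)
        spread = statistics.stdev(assays)
        print(f"Method {name}: mean = {mean:.2f} g/t "
              f"(accuracy error {mean - TRUE_VALUE:+.2f}), stdev = {spread:.2f}")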

Let us look at two real data sets: one is the average monthly temperature as reported by Hydro One and the other the average monthly temperature as reported by Enbridge, for the same residence and for the same period of November through March this past winter: -6.8, -11.7, -13.4, -12.15, -6.45; and -2, -9, -9, -11, -6. The first thing to note is the “precision” of the two data sets: one is reported to one or two decimal places whereas the other only as integers. The average of the first set is -10.10 and of the other -7.4. Both are supposedly for the same location and the same time, so why are they different? One bias has been introduced in that in both cases I divided the sum of each set by 5, the number of elements; but each month covered has a different number of days and we did not allow for that. We also do not really know where the temperatures were read, but I am going to speculate that the “smart” meter on my house has a temperature sensor whereas the Enbridge readings come from their facility in Thunder Bay. Amazing, then, the difference 100 kilometres makes!
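
To show what allowing for month length looks like, here is a minimal sketch in Python using the Hydro One numbers above (assuming the five values cover November 2019 through March 2020, so February has 29 days):

    # Day-weighted average vs. a simple average of the monthly means.
    temps = [-6.8, -11.7, -13.4, -12.15, -6.45]  # monthly means, deg C
    days = [30, 31, 31, 29, 31]                  # Nov, Dec, Jan, Feb, Mar

    simple_mean = sum(temps) / len(temps)
    weighted_mean = sum(t * d for t, d in zip(temps, days)) / sum(days)

    print(f"simple mean:   {simple_mean:.2f}")    # -10.10
    print(f"weighted mean: {weighted_mean:.2f}")  # -10.09, small but real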

We should not assign to the Enbridge data set a precision it does not have. If the source data are integers, then the average needs to be reported as an integer, -7. Similarly, for decimal data we should not report more decimal places than the input data carry. We should not imply a precision that is not there! And we have just scratched the surface of the complexities of data that need to be addressed!
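
In code, respecting the input precision is just a matter of rounding the result back to it (a minimal Python sketch):

    # Report a result only to the precision of the inputs.
    enbridge = [-2, -9, -9, -11, -6]  # integer-precision monthly means, deg C

    raw_mean = sum(enbridge) / len(enbridge)  # -7.4, implies precision we lack
    reported = round(raw_mean)                # -7, matches the integer inputs

    print(raw_mean, "->", reported)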

Now let us jump to computer modelling, the process that uses our input data. A computer model uses an algorithm, a set of mathematical equations, to process the data and generate an interpretation. The more complex the system being modelled, the more complex the equations, and many assumptions are made in choosing them. We process the data and then analyse the output. But is the output reliable? One way to find out is to validate the model against a data set where we already know the results. If the model prediction matches reality, then we have confidence in the algorithm used. To illustrate, if we have a temperature prediction model, we would feed it a set of historical temperature data and see how well its prediction for the present matches reality.
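
The skeleton of that validation step might look like this in Python (a deliberately simple, hypothetical “model” that projects a straight-line trend; the data and the pass/fail tolerance are made up for illustration):

    # Hindcast validation: fit on older data, predict the held-out recent
    # data, and only trust the model if its predictions match what happened.
    history = [10.1, 10.3, 10.2, 10.6, 10.5, 10.8, 10.9, 11.1, 11.0, 11.3]
    train, held_out = history[:7], history[7:]

    # Toy "model": least-squares straight line through the training points.
    n = len(train)
    x_mean = (n - 1) / 2
    y_mean = sum(train) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in enumerate(train))
             / sum((x - x_mean) ** 2 for x in range(n)))
    intercept = y_mean - slope * x_mean

    predictions = [intercept + slope * x for x in range(n, len(history))]
    errors = [abs(p - a) for p, a in zip(predictions, held_out)]

    print("predicted:", [round(p, 2) for p in predictions])  # [11.01, 11.15, 11.28]
    print("actual:   ", held_out)
    print("validated:", max(errors) < 0.3)  # True, within an arbitrary tolerance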

This is where scientific peer review comes in. If different researchers use the same algorithm but different data sets, all with known results, and they can duplicate those results, then we have confirmed the algorithm is reliable.

Unfortunately, in the real world we have “researchers” using bad data, such as mixing “proxy” temperature data with actual temperature readings. Then they do not adjust for the biases within the data. Next, they use a computer model that has not been validated. This generates bad results: garbage out. To complicate things further, they report the results to a precision not present in the input data. These severely flawed results are then given to the media, who strip off any caveats that may have been included, turning a speculative statement into one of “fact”. Matters get worse still when far too many people, through confirmation bias, use these results to prove their own flawed theses, and then the politicians get involved. As Churchill said, “A lie gets halfway around the world before the truth has a chance to get its pants on”.

In conclusion, unless a computer model has been validated against real data and has shown it can predict real events, it is not to be trusted. Any “researcher” who is unwilling to share their algorithm and data with others so they can be tested cannot be trusted. Nor can any politician who relies on the results of an unvalidated model. If you encounter any of these three situations, do not believe what you are being told; to do otherwise is at your own peril.

What Can We Learn?


This is a copy of a submission I made to the Thunder Bay Chronicle Journal, which they published on May 9, 2020.
 
Fear. An emotion shared by many living creatures, including ourselves. One that evolved as a means of protecting oneself from danger, real or perceived. When groups of creatures are together, as there is strength in numbers, it does not take much for the irrational actions fear instills to overwhelm rational ones. Take as an example a herd of bison, truly formidable creatures that should have little fear of anything else. Yet the aboriginal peoples of the plains could easily get a few to start running, the panic quickly spreading to the whole herd, which was then easily driven over a cliff to its death.

Humans, while still prone to irrational actions instigated by fear, have developed means of controlling that fear. Take for example your house: a structure built to withstand most of what the local environment can produce. Different areas have different types of structures depending on local conditions, but all share some basic features. Close the doors and windows and little, if anything, from outside can penetrate. As you become more assured that the outside elements do not pose the dangers you first identified, you can begin easing open these controls. Even within the bounds of your home you can exert still greater control over the local environment, closing the doors to individual rooms and changing the heat or other characteristics of each room as needed.

The nation state is merely an extension of the home: a controlled environment that protects its residents from external dangers. There are exceptionally good reasons we have secure borders: to protect us from unpredictable dangers from abroad.

Ironically, the herd instinct still prevails, to our detriment. Take for example our “pandemic”. A rational approach, once we were made aware of the potential danger of what is now called Covid-19, would have been to shut down our borders and isolate recent arrivals from China from the general population until we were certain of a few things: that the new arrivals did not have the virus, by giving their immune systems enough time to deal with any new disease, and that enough research had been done to determine what the true risks were.

I will not get into the details, as we all know both actions were not taken until far too late.

The latter, though, is more complicated. Computer models were used to “predict” how dangerous the new threat was. Unfortunately, far too many people forgot, or never accepted, a basic caution of any data-based process: garbage in, garbage out. To translate, a computer model relies on input data; if that input data is uncertain, or even wrong, the output will be no better. Yet we have had so many restrictions placed upon us based on computer models that have proven to be, how can I say it, far from accurate.

Every disaster provides an opportunity to learn and to avoid repeating the actions that proved most useless. The two lessons I am going to recommend are these: no computer model should ever be used to set policy, especially if its input data carries any uncertainty; and we must accept that there are very valid reasons why countries have borders, and act accordingly. Enough with this “post-national” nonsense.