The disappearing data and the cost to society

A National Weather Service surface data site collects information about temperature, humidity, wind, air pressure and more. Some public entities are installing their own high-resolution weather instrument networks, called mesonets…but are significantly, or even severely, restricting public access to them. Photo credit: NOAA

In mid-May of 2021, AllisonHouse was informed by the National Weather Service that the West Texas Mesonet would no longer be available to our customers without a high cost. Unfortunately, we had to stop receiving that data feed, as passing along the cost would have required us to raise subscription prices significantly.

Unfortunately, the West Texas Mesonet is not the only mesonet to do this. Several years ago, the Oklahoma Mesonet did the same thing. Although we tried, at the time, to offer a $2-a-month add-on for people to acquire that data, few people subscribed…and because we were losing money, we dropped the service.

These are two examples that are either recent or prominent to many of our subscribers. But they are far from the only two. Last year, a grand announcement was made for the New York Mesonet, which was supposed, in part, to help National Weather Service forecasters improve the accuracy of their snowfall predictions.

But much of that mesonet, paid for by tax dollars, was immediately placed behind a paywall. While the general public can access the data through a tedious process, you cannot use the data in any convenient form unless you pay a large amount of money.

I’m going to argue that this approach creates a multifold problem. First, data that only helps government agencies falls far short of helping and serving the public. Second, we the public paid for the data…so why can’t others, both public and private, use that data in applications and tools that reduce weather’s impact on society?

I get it. After working in the educational community for two decades, I know it goes something like this: we have the money to build a mesonet or acquire a radar through a grant from the National Science Foundation and/or local, state, or federal government…but we aren’t given the long-term funding to maintain or distribute the data. And, in many cases, projects must have a termination date.

But we, in government and in the private sector, need to look longer-term. Why not have a 20- or 30-year grant that pays for installation, maintenance, and distribution costs? Why, at the end of the grant period, must the network be terminated, discarded altogether, or put up for sale at exorbitant prices?

Solution: If you are a governmental agency (universities included), you need to pay for access and maintenance costs long-term, AND that has to be an option in a grant, or come by commitment from the hosting governmental agency. Nobody is served well by your mesonet or radar if only one agency can see its data displayed in a useful manner on their screens.

How much more could we have learned from Project VORTEX and its sequel if they had run for 10 years instead of just two?
And how much more would we know, and how much better would our forecasts be, if more than a select few could see crucial pieces of weather data without breaking the bank?

We can and must do better…much better than this. It’s a black eye on the scientific community, and on research in this country. As other countries embrace the model we once had of making data available to everyone, it would be a great disappointment…and a detriment to a public that makes decisions based on actual weather conditions…if we regress to locking data down and making it unavailable in real time. The thinking, not just the system, must change to favor long-term benefits instead of short-term gains.

Climate, models and apps: but what about the data?

WSR-88D Radar
The danger of a lack of data is here. What can be done to prevent imminent, serious issues?

It’s 2021, and here in America, the meteorological profession is seeing a lot of software development and climate research. What could we expect 20, 50, 100, or 500 years down the road in terms of climate? This ongoing research doesn’t just look at temperatures, either. Dr. Victor Gensini from Northern Illinois University has done research on long-term tornado forecasting and trends. One of his efforts is here:

And then there are the models. From seasonal to multi-decadal, much research is going on to see how the temperatures and weather are generally going to change in the years to come. How much climate change is going to occur, and how quickly?

Also going on in the field of meteorology are software and app improvements. AllisonHouse, which makes the Maps website; DTN, which makes RadarScope; and many other software makers and weather companies provide apps for your phone, desktop, or tablet; some are Web-based. We can now see in seconds whether we are in a tornado warning, down to within a city block, thanks to the plotting of warning polygons on GIS-heavy software like the Gibson Ridge (GR) products and AllisonHouse Maps. Programmers are writing phenomenal software that would have been unimaginable 10 years ago…and the innovations keep coming hot and heavy!
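The core of that "am I inside the warning polygon?" check is a classic point-in-polygon test. Here is a minimal ray-casting sketch (a generic illustration, not the actual Gibson Ridge or AllisonHouse implementation) that decides whether a point falls inside a warning polygon given as (lat, lon) vertices:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: does (lat, lon) fall inside the polygon?

    polygon is a list of (lat, lon) vertices in order (either winding).
    Casts a ray east from the point and counts edge crossings:
    an odd count means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        # Does this edge straddle the point's latitude?
        if (y1 > lat) != (y2 > lat):
            # Longitude where the edge crosses that latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside
```

Production GIS software adds map projections and edge-case handling, but the crossing count above is the essential idea behind plotting "you are in/out of the warning" to within a city block.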

But there’s one thing that supports them all…and without it, all of this becomes absolutely useless. And that is: data.
Without raw data to feed models, apps, and research, everything stops. Everything. And not just the examples listed above. If surface weather observations cannot be obtained from a major airport in a large city, you can have a partial or total ground stop. You can’t get tornado warnings to phones, apps, or the media. That alone can cost many lives, or cause financial ruin when trucks and trains drive into tornadoes or severe weather. Some carry hazardous materials or fuel, and an unwarned tornado hitting these in a city can cause major, life-threatening problems affecting many people.

The word “crisis” is thrown around a lot these days, sometimes way too loosely. But I would like to put forth a concern that we are approaching a data crisis, in terms of reliability.

No, I’m not saying that AllisonHouse is unreliable. Just the opposite: we have achieved nearly 100% uptime across all of our data feeds. But that depends on airports, individuals, the National Weather Service, and the FAA providing data across modern, reliable acquisition and distribution systems. And all of these, to one extent or another, are breaking down or beginning to.

But it’s not just a server/computer/fiber/Internet issue. Farmers who have been taking climate observations for decades are dying, and their farms are being sold to companies that are not interested in sharing their data with anyone; or the next generation of farmers decides they don’t want to do it anymore; or the farm gets sold to developers. Airports close. Chicago no longer has a staffed weather station downtown after the airport there was closed.

Then you have smaller airports, where the FAA uses temperature and humidity sensors that feed data to the National Weather Service. And then there are the Automated Surface Observing System (ASOS) weather stations, which still require human intervention 24/7 to make sure the data is accurate and going out. The FAA wants to eliminate all observers at the large airports, but political pressure has kept meteorologists in jobs there…for now.

And then there’s the National Weather Service distribution system. It is antiquated, and as more and more data gets pushed through it, complete breakdowns are now occurring on a semi-regular basis. The latest one, as of this blog post, was embarrassing and scary:

And just before that, their main data center in Silver Spring, Maryland was literally flooded after a water pipe broke above it (don’t get me started on how a water pipe got placed above a very critical data center). This (ironically) knocked out buoy data for a week; shipping interests were not amused. And when their system goes down, NOBODY (including us) gets data. The NOAA Weather Radio system is also stale in its technology. It is still the best, most reliable way to receive alerts, but it warns by county…even if you are outside the warning polygon. (Still, that is absolutely NO reason not to get one, and this is slowly changing as well.) Weather radio data is used by emergency managers, individuals, and radio and TV stations to get warnings out.

There are other blogs that have dealt with this in part, but one thing that has been neglected is the data itself. AllisonHouse IS data; without it, we are nothing. We provide the data to whoever wants it, in whatever way we can process it to make your app work, and our individual customers use it for weather spotting, monitoring, forecasting, or assisting in making weather-based decisions. And we use as many redundant paths for that data as reasonably possible to keep it flowing 24/7/365.

During the National Weather Service (NWS) outage mentioned above, NWS forecasters were left in the dark for 7 hours. Thanks to the redundancy in our systems, we lost anywhere from 20 minutes to 2 hours of data, depending on the feed. After doing this for 23 years, I know how to get data into our systems as reliably as any other weather company. We don’t have the “sun outages” every spring and fall that others do with data acquired via satellite; we have fiber connections to primary and backup NWS sources to keep the data flowing.

But when the system breaks down, my best efforts will fail. And at that point, nobody has data: not us, not the NWS offices, not the NWS websites, not the app on your phone, tablet, or computer. It happened for 7 hours on a relatively quiet night in March of 2021. It will happen again.

And that’s the near-crisis we are in. Unless we understand how much VALUE raw weather data provides to us as a nation, we are headed for a data outage…and the next one has a better chance of occurring during a tornado outbreak. A blizzard. An ice storm. It will hamper or destroy research, and our nation’s climatology. It will hurt, or kill, precious people. It WILL happen, unless we route some of these resources back to the government data distribution system, and data sources, NOW.

And the loss of data in more rural areas needs to be addressed as well. A lack of promotion of “co-op” weather stations within the National Weather Service in recent years means fewer climate stations will be around in generations to come. The NWS also needs to work with companies to make basic data available to researchers. Otherwise, climate forecasting will become even more of a challenge than it already is.

When a sports team is having a bad year, a good coach goes back to the basics to rebuild a team that produces wins. It’s time the United States, its citizens, and National Weather Service management fully realized how valuable raw data is, and quickly. Otherwise, the system will fail…and 23 years of experience on my end won’t be able to do a darn thing about it.

How do we fix the Weather Service part of the issue?

Read this:

Author’s note: We at AllisonHouse have data alarms that text and email me and our other employees when something goes down on our end, at the National Weather Service, or at a private data vendor. Until we intervene, we get alerts every 2 minutes until the issue is resolved one way or another. The number of alerts I have received over the last 6 months has been staggering…and completely unacceptable.
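For the curious, the logic of such an alarm loop can be sketched in a few lines. This is a hypothetical watchdog (the feed names, thresholds, and functions are illustrative, not our production code): it tracks when each feed last delivered data and re-alerts on a fixed interval until the problem is resolved or someone intervenes.

```python
ALERT_INTERVAL = 120  # seconds: re-alert every 2 minutes until resolved
STALE_AFTER = 300     # seconds: a feed is "down" if no data for 5 minutes

def check_feeds(last_seen, now):
    """Return the names of feeds whose newest data is older than STALE_AFTER.

    last_seen maps feed name -> timestamp of the most recent data received.
    """
    return [name for name, ts in last_seen.items() if now - ts > STALE_AFTER]

def monitor(last_seen, now, last_alerted):
    """One pass of the watchdog.

    Alerts on every stale feed, but no more than once per ALERT_INTERVAL
    per feed; last_alerted records when each feed was last alerted on.
    Returns the list of feeds alerted this pass.
    """
    alerted = []
    for name in check_feeds(last_seen, now):
        if now - last_alerted.get(name, 0) >= ALERT_INTERVAL:
            last_alerted[name] = now
            alerted.append(name)  # in production: send the text/email here
    return alerted
```

A real system would run `monitor` on a timer and wire the alert list to SMS and email; the key behavior is that a stale feed keeps generating alerts every cycle until data resumes or a human steps in.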

Breaking up is hard to do…but it can be a good thing.

This split-off program with GREarth will allow for a more manageable user experience, and in the future, more model options. Image courtesy Gibson Ridge / GRM

Sometimes, in a relationship, you can’t imagine breaking up with your significant other. And no, we’re not breaking up with Gibson Ridge: that relationship just keeps getting better! But in the world of software, it can be a good idea to break programs up.

Case in point…GREarth. It has current surface and upper-air data, as well as models, all in one program. And, as anyone who uses it can attest, that’s a lot of menu options. As more models and more data come online, it’s getting to be a big program. Too big, in fact.

So, Mike Gibson is going to break up the program. GREarth 3.0 will contain surface and upper-air data (including satellite and radar, of course!). But what will become of the model guidance?

Introducing GRModel, aka “GRM”.

To start, GRM will look and function like GRE. But the split will allow GRE and GRM each to expand in the future with more data and models, and it makes both menus much more manageable. Of course, you can run both at the same time if you have more than one monitor, viewing models and surface data in separate windows!

When will this happen? As I type this, GRM and GREarth 3.0 are in private beta testing and are getting close to public beta testing. The word Mike Gibson used is “soon”…close enough that I can let the cat out of the bag now. Both programs will be released simultaneously. Of course, watch our AllisonHouse Facebook page, and our blog here, for the latest updates on GRE and GRM!

Editor’s note: Your GREarth subscription will cover both programs. And, there will be no price increase! To clarify: models will be in GRM, and real-time data will be in GRE.

Author’s note: Both programs are now available for testing at the GRLevelX owner’s forum at: