Andrew's Weather Center

How Long Before All Meteorological Records are Unbreakable?

How can we know? Developing a Record Vulnerability Index may be a good starting point.

26 January 2019, Blog No. 2

 

Records are made to be broken, right? Sure. However, when "record-setting" weather threatens, the actual impacts can be extreme and can endanger life and property. In particular, excessive heat, record cold snaps, and historic rainfall all have dire consequences. Just how common is record-breaking weather? Is one type of record broken more often than others? These are the two questions I explore in this blog.

 

Record Climatology

I computed a basic, thirty-year climatology (1989-2018) of meteorological records at RDU International Airport in Raleigh, NC, which lies within the NWS Raleigh County Warning Forecast Area1 (CWFA). Each NWS office archives its meteorological records and extremes, and most are updated on a daily basis. Figure 1 shows the climatology below:


Some quick hits and takeaways I noticed from computing this climatology table: 1) every year since 1952 has contained at least one new daily rainfall record; 2) record maximum temperatures and record high minimum temperatures are the most frequently broken records; and 3) more records are being broken each year, especially record high minimum temperatures, a metric that is critical for agriculture and for growing sensitive crops and certain fruit cultivars.

 

To validate points 2) and 3) above, I created a pie chart showing the distribution of records by type (Fig. 2): the combination of maximum temperature records and record high minimum temperature records (warmest low temperatures on a calendar day) accounted for 278 of the 493 (56.4%) records broken or set over the past thirty years. To support point 3), I took moving averages of the number of records broken per year and compiled the results graphically into a sparkline chart showing the 1-year, 2-year, 3-year, 5-year, and 10-year moving averages (Fig. 3). The result was a clear uptick in broken weather records in recent years. Why? Well, that's for another blog.
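For anyone curious about the smoothing in Fig. 3, a trailing moving average is nearly a one-liner in Python with pandas. Here is a minimal sketch, using made-up yearly counts in place of the actual KRDU data (which live in my Excel sheet):

    import pandas as pd

    # Hypothetical stand-in for the yearly counts of records broken at KRDU.
    records_per_year = pd.Series(
        [12, 18, 9, 22, 16, 29, 11, 14],  # illustrative values only
        index=range(2011, 2019),          # years 2011-2018
    )

    # Trailing moving averages smooth out year-to-year noise (cf. Fig. 3).
    for window in (1, 2, 3, 5, 10):
        smoothed = records_per_year.rolling(window, min_periods=1).mean()
        print(f"{window}-year moving average:")
        print(smoothed)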

Figure 1. Meteorological record climatology by year at RDU International Airport (KRDU) from 1989-2018. Data: NWS Raleigh2

 

Cells highlighted in yellow represent maxima for each column.

 

At least one daily record precipitation event was recorded in every year since 1952 (data prior to 1989 are not shown, but this was verified on NWS Raleigh's climate archive).

 

2007 and 2015 each saw 29 meteorological records set, the most of any year in this thirty-year span.

 

Record minimum temperatures and record low maximum temperatures (coldest high temperatures) appear to be the most resilient records.

 

All official record data is compiled, maintained, and updated daily by NWS Raleigh at the following link, cited again as reference number 2 at the end of this blog:

https://w2.weather.gov/climate/index.php?wfo=rah

 

 

When it comes down to it, these numbers carry little value until they are tested and validated by tracking which records actually fall in 2019 and then examining the elapsed time between the previous record and the new benchmark. Consider that within the past thirty years, the fewest records set in a single year was 5, in 2003. On average, we set around 16 records per year, so here's to seeing what 2019 brings us. On the heels of Raleigh's wettest year since World War II8, I am sure we are in store for another wild weather year. In the meantime, head over to the newly minted Record Vulnerability page on the website and check out which records are most apt to fall this year.

 

Sláinte,

Andrew

 

 

References:

1. https://www.weather.gov/rah/virtualtourwhere

2. https://www.weather.gov/rah/rdutemperaturerecords

3. https://ggweather.com/enso/oni.htm

4. https://www.newsobserver.com/news/business/article135530323.html

5. https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1084&context=envstudtheses

6. https://www.britannica.com/topic/Gregorian-calendar

7. https://climate.ncsu.edu/cronos

8. https://www.wral.com/triangle-sets-record-for-wettest-year-and-more-rain-is-on-the-way/18090424/

 

Sources:

Microsoft Excel/Google Sheets

https://products.office.com/en-us/excel

 

Desmos Web Calculator and Web Application

https://www.desmos.com/calculator

 


Figure 2. Pie chart showing the distribution of meteorological records broken from 1989-2018 at RDU Int'l by type, across the five main categories of records kept by NWS Raleigh and most other NWS forecast offices.

Figure 3. Sparkline chart showing a slight upward trend in the quantity of new meteorological records, using 1-, 2-, 3-, 5-, and 10-year moving averages. The averaging is just a way to "smooth out" the data and make any trends easier to identify.


Figure 4. Line chart showing the five categories of meteorological records and the total number of records broken per year.

The plot in Figure 4 visually breaks down the table of data first shown in Figure 1. Note the increased frequency of record maximums (red curve) and record high minimums (violet curve) compared to the other categories. No real year-to-year trends or patterns are clear, and the number of records broken per year does not appear to correlate with other indices or parameters. My first inkling was to check the correlation with El Niño and La Niña occurrences to parse out any evident connections, but I was unable to confirm anything with this small sample size of thirty years at one point location. Future research may prove useful in further answering this question. For reference, this bookmarkable website, https://ggweather.com/enso/oni.htm3, outlines El Niño and La Niña years and intensities.

Figure 5. Waterfall chart showing annual quantity of record high minimum temperature (warmest low daily temperatures) records. Note the increased values upwards of 10 broken records per year since the beginning of the 2010s.

Record maximum temperatures are one thing, carrying an aspect of sensationalism: one extra degree Fahrenheit lets us exclaim that record heat occurred. However, the less popular and often forgotten cousin of the record maximum temperature, the record high minimum temperature, has real implications for a very important industry - agriculture. Warmer than average nights in the early or late growing season can endanger sensitive crops and fruits, essentially confusing the plants into blooming early or late and forcing growers to change their harvesting practices. For instance, this February 2017 News & Observer article4 describes the implications of warm nights for strawberry harvesting in southeastern Wake County, NC. Another write-up, a 2012 undergraduate thesis by a University of Nebraska-Lincoln student5, outlines the impacts of warming nighttime temperatures on crop health in the Midwest Corn Belt. So, while other meteorological records may be more glamorous, record high minimum temperatures are examples of record-setting weather with dire consequences.

Record Vulnerability

So, how do we go about determining which records are more likely to fall and which will stay untouched for years to come? I devised a mathematical method to calculate the vulnerability of a record, expressed as a percentage describing the likelihood of that record being matched or surpassed in a given year. The methodology and process are outlined below. Warning: long-winded discussion and mathematical jargon ahead!

 

I used a funnel approach to narrow down the specific probability values needed to calculate record vulnerability, so I will outline the grand scheme of my process first, starting broadly and increasing specificity as this section continues. The formula I used to calculate record vulnerability is as follows:

 

Vulnerability = (% chance event occurs on given day of month) * (% chance exceeding mean record value departure) * (exponential regression function of climatological data)

 

Starting with the first segment: a Gregorian calendar6 year, the calendar most of the world uses, including the United States, contains either 365 or 366 days, depending on the occurrence of a leap year. Given that meteorological records are kept on a daily basis, a 1 in 365.25 probability (0.27% chance) exists that a record will be set on a particular day. But wait! This assumption is generalized and does not account for the variability of months, seasons, or the weather itself. Defining the intricacies of weather variability on the daily scale is a tall order, and breaking down expected weather by season is a bit too broad, but segmenting weather variability by the twelve months of the year is a decent middle ground. So, in January and all other months with 31 days, a 1 in 31 probability (3.23%) exists for anticipating a record breaking on a certain day. The same logic follows for months with 30 days (3.33%), and for February, 1 in 28.25 (3.54%) is used because the leap year skews the number of days every four years: (28+28+28+29)/4 = 28.25. Thus, the first component of the vulnerability equation is set: a value of 3.23%, 3.33%, or 3.54% is plugged in for months with 31 days, months with 30 days, and February, respectively.
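As a quick sanity check on this first component, here is a minimal Python sketch (the month lengths are standard facts; the function name is my own):

    import calendar

    def day_of_month_probability(month: int) -> float:
        # First component of the vulnerability equation: the chance that a
        # record falls on one particular day of the given month.
        if month == 2:
            days = (28 + 28 + 28 + 29) / 4  # leap-year-averaged February = 28.25
        else:
            days = calendar.monthrange(2019, month)[1]  # 30 or 31 (2019 is not a leap year)
        return 1 / days

    for m in (1, 4, 2):  # a 31-day month, a 30-day month, and February
        print(m, f"{day_of_month_probability(m):.2%}")  # 3.23%, 3.33%, 3.54%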

 

Next, we know the values of both the climatological normal temperatures by day and the daily records themselves by looking them up on NWS Raleigh's website. In turn, we know the departure, or how far off each record is from the climatological normal. With this data, we can calculate the mean departure of record values from climatological normals for each day of the year. For RDU International Airport (KRDU), those mean departures came out as follows: 17.9°F for maximum temperature records, 19.9°F for minimum temperature records, and 2.08" of rain for precipitation records. In other words, on average in Raleigh, when a record high/maximum temperature is set, the temperature is likely about 17.9°F above the normal high temp for that day, and so on for the other two record categories. From there, I retrieved precipitation data from 1997-2018 and temperature data from 1949-2018 at RDU Int'l Airport7, counted the days whose departure from normal met or exceeded the mean record departure, and divided that count by the total number of days sampled - all with Microsoft Excel. These values vary by month because of seasonality: it is easier for temperatures to soar far above normal in the summer than to turn bitterly cold in the winter. I will spare the nitty-gritty details for this blog post, but this part of the process became seemingly endless number crunching: copying, pasting, and sorting a lot of decimal values by month, by record type, and by magnitude. In the end, I arrived at twelve values (one per month) for each record type to insert into the middle quantity of the vulnerability equation.
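Here is a minimal sketch of that middle quantity, assuming hypothetical arrays of observed daily highs and matching climatological normals (the real number crunching happened in Excel):

    import numpy as np

    # Hypothetical stand-ins: observed daily max temperatures and normals (deg F).
    observed = np.array([71.0, 88.0, 93.0, 75.0, 90.0, 68.0])
    normals  = np.array([70.0, 71.0, 72.0, 73.0, 72.0, 71.0])

    mean_record_departure = 17.9  # mean departure of KRDU max-temp records from normal

    # Fraction of sampled days whose departure met or exceeded the mean record departure.
    departures = observed - normals
    p_exceed = np.mean(departures >= mean_record_departure)
    print(f"{p_exceed:.2%}")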

 

Then came the fun part - determining a model function to allow me to input any of the record values and output the percentage/probability of that record occurring. I simply used the data points gathered in the middle step cited in the last paragraph and created a linear regression in Excel. No dice. I was getting wild values that made no sense - why? Weather is not linear! A linear relationship implies a one-to-one correspondence of values, namely a certain condition always producing a proportional outcome. Is this the case outdoors with Mother Nature? No way - the temperature could be 50°F and cloudy, windy, rainy, sunny, hazy, or foggy, and it could be January, or February, or March, or October, and so on - you get the idea. Weather is nonlinear, but that is about as specific as we can get. We cannot definitively deem weather to follow any one class of function, but the closest class of relationship would be exponential functions. An exponential function, in a VERY broad nutshell, describes very numerous occurrences of small events and a very small number of extreme events. This is more like weather, right? We will have upwards of 100 or so small rain showers through the year in North Carolina, but probably only a handful of severe thunderstorms or storms with torrential rain. Of course, the records lie within that zone of rare storms or days with exceptional weather conditions. So, scrapping the linear regression, I created an ab-exponential regression with the same data. Yahtzee! I now had a set of three realistic curves to describe the likelihood of max/min temperature and precipitation records (Fig. 6). The way these curves work is that our input value (x-axis) is the standing record for a given day. The output is the matching value on the y-axis, a decimal between 0 and 1. This value is the estimated probability of that record value being matched or surpassed, based on the empirical data we retrieved from RDU Int'l Airport. This is the third and final section of the vulnerability equation.
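For the curious, an ab-exponential regression fits y = a*b^x, and its coefficients can be recovered with a log-linear least-squares fit. Here is a minimal Python sketch with made-up data points standing in for my Excel values:

    import numpy as np

    # Hypothetical (record value, observed exceedance probability) pairs.
    x = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
    y = np.array([0.30, 0.08, 0.02, 0.005, 0.001])

    # Fit y = a * b**x by fitting ln(y) = ln(a) + x*ln(b) with ordinary least squares.
    slope, intercept = np.polyfit(x, np.log(y), 1)
    a, b = np.exp(intercept), np.exp(slope)
    print(f"y = {a:.3f} * {b:.4f}**x")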


Figure 6. The three calculated regression curves plotted using the Desmos web application. Each exponential function approximately describes the probability of matching or surpassing a record value for a given calendar day.

 

Green curve - Precipitation: y = 0.287 * 0.0979^x

Blue, dotted curve - Minimum Temperatures: y = 0.857 * 0.7701^x

Red, dashed curve - Maximum Temperatures: y = 0.783 * 0.7623^x
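Putting the three components together, here is a minimal sketch of the full vulnerability calculation using the precipitation curve above; the exceedance probability plugged in is an illustrative stand-in, not the actual monthly value from my spreadsheet:

    # Product of the three components of the Record Vulnerability Index.
    def vulnerability(p_day, p_exceed, a, b, record_value):
        return p_day * p_exceed * (a * b ** record_value)

    # Example: the 8 October daily rainfall record of 6.45" (October has 31 days).
    # p_exceed = 0.20 is a made-up placeholder for the monthly exceedance value.
    p = vulnerability(p_day=1/31, p_exceed=0.20, a=0.287, b=0.0979, record_value=6.45)
    print(f"{p:.2e}")  # a vanishingly small annual probability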

After all of this number crunching and one really, really busy Excel spreadsheet (Fig. 7), I had reasonably quantified the probability of certain meteorological records falling or standing. Assigning true values to something as unpredictable as the weather is virtually impossible, but this is a first step toward quantifying how resilient or vulnerable records may be.

Figure 7. A screenshot of the Excel spreadsheet that I utilized to calculate my Record Vulnerability Index. Data run from most empirical on the left to most theoretical and mathematically derived toward the right, ending at the columns highlighted in yellow. The rightmost section shows the breakdown of the daily probability of exceeding the mean record value departure, the second quantity in the equation.

When Will All Records Be Unbreakable?

The boring truth is that we truly cannot know. We cannot predict the weather with enough accuracy to know, and pinning down long-term trends and extremes decades in advance is physically impossible.

 

However, with numerical values now in place with the devised record vulnerability above, we can conjure up some far-fetched, unscientific, but fun numbers to satisfy our curious brains.

 

Probability is a function of time; that is, the idea of probability and chance is contingent upon repeated trials or occurrences of an event. Hence, with probability values in hand, we can extrapolate how many repeated iterations (or in this case, how many years) are needed to ensure, or at least make highly likely, that an event occurs (or, again in our case, that a record is broken).


Record vulnerability values range from 0.0000000656% [8 October: Hurricane Matthew daily rainfall (6.45")], a seemingly impossible record to break, to 11.67% [29 February: 24°F minimum temperature], a very easy record to break (we get into the mid 20s frequently in winter), skewed vastly by the leap day coming only every fourth year. Dividing each of these probabilities into 1 yields the number of years it would plausibly take for the record to fall. Of course, we know this is not strictly true because weather is intrinsically random, but this is just for fun at this point. So, 1/0.1167 = 8.57, rounded to 9 years. But let's multiply by four due to the leap year constraint, so within 36 years, hypothetically, in a perfect world, the record is likely to fall. In contrast, 1/0.000000000656 = 1,524,390,244 years (rounded). So, in jest, it may be over a billion years before that Hurricane Matthew rainfall record falls. Think about it: it would take a tropical cyclone with the same or similar track, on the same day of the year, with the same rainfall rates over the same area to repeat that record. Now we are getting a bit far out, so let's come back to reality!
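For completeness, the back-of-the-envelope conversion from annual probability to an expected wait, as a quick Python check:

    # Expected years until a record falls: roughly 1 / (annual probability).
    leap_day_low = 0.1167    # 29 Feb minimum temperature record (24 deg F)
    matthew_rain = 6.56e-10  # 8 Oct daily rainfall record (6.45", Hurricane Matthew)

    print(round(1 / leap_day_low))           # ~9 years; times 4 for leap days -> ~36 years
    print(f"{1 / matthew_rain:,.0f} years")  # ~1,524,390,244 years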
