An Even Greater American Eclipse

In 2017, I had a special opportunity to view the last total solar eclipse that passed over North America, dubbed the ‘Great American Eclipse’ for its sweeping cross-country path of totality from Oregon to South Carolina and unprecedented (at least for eclipses) social media virality. Now, another total solar eclipse will trace a path from Texas to Maine on Monday, with several million people expected to flock into the shadow of totality to catch a minutes-long glimpse of this rare celestial phenomenon. For weeks, I’ve seen article after article about the eclipse, how best to view it, how much economic benefit and traffic to expect; perhaps it’s just my algorithm, but I sense substantially more hype than last time. And I believe the hype is justified, since laying eyes on a total solar eclipse, however fleeting, is one of the few truly indescribable, soul-moving experiences I have had.

I’ve been asked several times where the best place would be to maximize one’s eclipse-viewing experience – in all honesty, getting a clear view of the sun at any given moment is largely up to chance. Cloud cover predictions are only dependable a few days in advance; the current forecasts indicate a cloudy day for Texas, while the rest of the path varies from sunny to partly cloudy. I will be spending the weekend in Lebanon, NH (all lodging was filled in the northern half of Vermont) and plan to drive to Waterbury, VT to view totality: by some strange luck, northern New England has the highest chances for a clear sky despite only ~30% of days being even partly sunny at this time of year. The region received about a foot of snow this week, which should help keep the sky clear by suppressing surface heating and the convective clouds that are most difficult to predict… I am keeping my fingers crossed that this will be enough to keep the clouds away from our view.

3-day cloud cover forecast (in % coverage, where white counterintuitively represents the clearest sky). Source: Pivotal Weather

While the rest of us gawk at the eclipse’s visual spectacle, the brief minutes of totality present a unique opportunity for scientists to conduct meaningful research. In the past, solar eclipses have provided a window for the first observation of coronal mass ejections, the discovery of helium, and the first proof of light bending around a massive object like our sun. Several interesting studies also came out of the last eclipse, raising new questions that could be answered during this one. For example, NASA will be taking broad-spectrum imagery of coronal behavior, which should be especially interesting near the maximum of this 11-year solar cycle. Another study aims instruments at the ionosphere, measuring its interaction with solar radiation to identify disruptions that could affect global communications. I will be particularly interested in the boundary layer observations that come from the eclipse path, since atmospheric soundings and remote sensing can reveal hidden airflows that inform turbulence and cloud formation models. As the next cross-country total eclipse is not until 2045, I hope we can make the most of this opportunity at all levels of observation!

Topography and Tornadoes: a Recent Case Study

In the ten years since I became interested in this question, there have been several significant advances in understanding the influence of topography on tornado development. The Midwest, a region known for flat and boring terrain, serves as an ideal testbed to observe these influences, offering the ability to isolate topographical variables against a flat control domain. This post seeks to identify topographical drivers and assess the physical mechanisms within a recent tornado outbreak that spawned over a dozen tornadoes across Ohio, Indiana, and Illinois on March 14th, 2024.

Zooming out to assess all possible terrain variables, Kellner (2012) employed GIS methodologies to establish statistical correlations between tornado touchdown points and several spatial features: elevation, land use, surface roughness, antecedent rainfall, and more. Using a high-resolution buffer analysis (sketched below), the study found that 64% of all tornado touchdowns occurred near urban land use and 42% near forests, a moderate-to-strong correlation given the overwhelming presence of flat farmland and rangeland. The consensus explanation for this correlation is that increased surface roughness generates more horizontal vorticity, which can contribute rotational energy when advected into a tornadic storm via the streamwise vorticity current, though I would also posit that urban heat island effects can add significant energy (localized surface-based CAPE). A strong example of this formation mechanism occurred when a tornado coalesced downwind of Muncie, IN as an EF2, briefly lifted off the ground over Farmland (that’s a town name, though the terrain type is implied), then churned at up to EF3 strength for another 40 miles into Ohio.
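For readers curious what a buffer analysis looks like in practice, here is a minimal sketch in the spirit of that method – not Kellner’s actual code – using the open-source geopandas package, with hypothetical shapefiles of touchdown points and land-use polygons (the file names, the “class” column, and the 1 km buffer radius are all assumptions):

```python
# Hedged sketch of a touchdown/land-use buffer analysis (illustrative only)
import geopandas as gpd

# Project both layers to CONUS Albers (EPSG:5070) so buffers are in meters
touchdowns = gpd.read_file("touchdowns.shp").to_crs(epsg=5070)  # points
landuse = gpd.read_file("landuse.shp").to_crs(epsg=5070)        # polygons

# Buffer each touchdown point by an assumed 1 km radius
touchdowns["geometry"] = touchdowns.geometry.buffer(1000)

# Spatial join: which land-use polygons intersect each touchdown buffer?
hits = gpd.sjoin(touchdowns, landuse[["class", "geometry"]],
                 how="left", predicate="intersects")

# Share of touchdowns with any urban land use inside their buffer
urban_frac = (hits.groupby(level=0)["class"]
                  .apply(lambda c: (c == "urban").any())
                  .mean())
print(f"{urban_frac:.0%} of touchdowns occurred near urban land use")
```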

Selma, IN (EF2) and Winchester, IN (EF3) tornado damage paths from the evening of 3/14/24 (NWS)

To dive deeper into the physics of tornado-ground interactions, Satrio (2020) used a well-established large-eddy simulation (LES) model to simulate how a tornado-like vortex moves over various hill configurations. Near-ground flows have been a popular research topic for their damage potential and effects on vortex stability, inspiring numerous observational studies using radar, photogrammetry, and debris impact analysis. Satrio’s simulation results paint a particularly cogent picture of tornado dynamics: uphill slopes can disrupt the low-level vortex, downhill slopes intensify the swirl, and vortices can bend to remain perpendicular to sloped terrain. All three of these effects can be clearly observed in the tornado that touched down near Hanover, IN: after crossing the Ohio River, the tornado weakened to an EF0 at the top of the first ridge, intensified to an EF2 as the track turned to follow the southern edge of the valley, weakened back to an EF0 over the second ridge, and reintensified to an EF2 upon crossing the valley again. Variations of this tornado evolution behavior have been corroborated numerous times in recent literature, most clearly by Lyza and Knupp (2018) on the periodic ridges of northeast Alabama, Bosart (2006) in the Hudson Valley, and Wagner (2018) near the Kansas River.

Hanover, IN tornado (3/14/24) damage path strongly influenced by Ohio River valley terrain (NWS)

However, tornado damage paths do not always show such visible signatures of terrain influence. Sometimes, as with the brief EF2 tornado near Plymouth, OH, there are absolutely no interesting topographical features in sight (though I would argue that the absence of terrain heterogeneities likely prevented further intensification within this supercell). In other instances, inferences about the nature of storm inflow are needed to consider any terrain interactions. For example, the long-track tornado that struck Lakeville, OH as an EF3 did not encounter any significant hills, cities, forests, or geographic boundaries along its path. But when the initial EF0-EF1 tornado crossed Grand Lake, the vortex turned northward to favor the inflow side. Several miles later, likely aided by the increased humid inflow, the tornado reformed as an EF3 near Wapakoneta and rumbled for almost an hour toward the northern suburbs of Columbus. The presence of urban/suburban roughness on the inflow side, along with the Scioto and Olentangy River valleys, likely gave this tornado the extra push to reform as a long-track EF1 near Delaware, OH. These terrain influences are more subtle and speculative, of course, but the potential for these types of observations is what sparked my interest in tornado simulation and prediction in the first place. I firmly believe that localized surface conditions play a major role in the formation of not just tornadoes but many weather phenomena, and it’s always gratifying to find physical evidence in support.

Grand Lake, OH EF1 tornado that preceded the Lakeville, OH EF3 on the evening of 3/14/24 (NWS)

Delaware, OH long-track EF1 tornado formed from the remains of Lakeville, OH tornado (NWS)

Bunch of nothing around Plymouth, OH EF2 tornado path, 3/14/24 (NWS)

Tornado path maps from the NWS Damage Assessment Toolkit, preliminary data

A Not-So-Super Conductor

Scientist or layperson alike, you probably felt some of the hype surrounding a novel, potentially game-changing superconducting material named “LK-99” over the last couple of weeks. Well, a series of experiments has swiftly disproven this material’s superconducting ability, hopefully serving a dose of caution to overambitious scientists and overeager investors everywhere. Having seen several cart-before-the-horse scenarios in tech before, I was skeptical from the beginning. Before accepting any of the claims from a viral internet puff piece at face value, I wanted a deeper look at the source material, which was posted to the preprint server arXiv on July 22nd. I’m not an expert on this family of materials, but I spent long enough researching mixed perovskite crystals in grad school to identify major red flags in this work immediately.

First, the choice to bypass peer review and post directly to arXiv is certainly questionable. Sure, the field of superconductor research is competitive, and being the first group to discover a room-temperature superconductor would be a monumental (likely Nobel Prize-worthy) achievement. There is a substantial risk when submitting to a peer-reviewed journal that either a) the paper gets initially rejected because there is no published precedent for your novel finding, or b) a competitor beats you to publication during the delays inherent to the peer-review process. A way to preempt this is to submit the article to a preprint server (like arXiv) with a responsible title that doesn’t oversell the controversial claims or draw too much attention. That is nothing remotely close to the title this group used: “The First Room-Temperature, Ambient-Pressure Superconductor.” It’s brazen confidence in a headline that reads almost like a joke, but this group was serious.

Second, the material is a mixed crystalline structure in which copper is doped into a lead apatite lattice. Again, I’m no expert in superconducting materials, but I took an analogous approach in my research when I doped bromide ions into a perovskite lattice of lead, iodide, and methylammonium cations. As the bromide dopant concentration increased, the crystalline lattice assumed more and more of the character of the bromide-based lattice: a higher bandgap, improved air stability, greater conductivity, and lower absorbance. Importantly, no new properties emerged; the bromide-doped perovskite simply displayed a mixture of the properties of iodide-based and bromide-based crystals depending on ionic concentrations. A copper dopant may break some of the insulating nature of the lead apatite by inserting its conduction band periodically within the lattice, but it would be a tough sell to convince me that the conductivity of this mixed material surpasses that of pure copper, much less displays superconductivity.

Ultimately, the false promise of the work was exposed when other superconductor experts reproduced the LK-99 material and tested it more rigorously for superconductivity. The preprint highlighted a critical magnetic field as evidence of zero resistivity, which is only one of several measurements necessary to prove a superconductor’s viability. While a critical magnetic field can arise from lead ions under stress, other key identifiers were missing, such as the steep drop in the voltage-current curve that indicates zero resistance. The group is now under investigation for fraud, as it is currently unclear whether the work was simply sloppy or a purposeful attempt to deceive. I hope this debacle reminds those who translate and amplify scientific stories to take all bold claims with a grain of salt. On the bright side, the general public is much more aware of the promise and importance of superconductors, perhaps one of the next great frontiers in materials science…just not yet.

Observing Meteors

While I was camping in central Texas last weekend, a bright fireball meteor streaked across the evening sky and exploded into several smaller fireballs before fizzling out. It was truly spectacular, the most impressive single meteor I’ve witnessed, and I was lucky to be looking at a dark patch of sky at that very instant. But it left me with questions, as I realized I knew very little about where meteors originate and why they behave as they do. This might be a divergence from the usual meteorology (the field concerns itself with hydrometeors; today we’re interested in fiery space rocks), but I learned enough in my deep dive that I wanted to drop a few nuggets here on this forum.

Fireball meteors generally come from a specific region of the asteroid belt where a resonance with Jupiter’s gravity scatters asteroid debris through the rest of the belt and beyond into the solar system. In fact, about a third of all meteorites found on Earth are thought to have originated from the same asteroid, Hebe, presumably broken off in a long-ago asteroid collision. The fireball I saw traversed the ecliptic, tracing the sky between Jupiter and Venus, which makes sense as a meteoroid from the asteroid belt would likely need to stay in the plane of the solar system to impact Earth. Under some definitions, this meteor qualifies as a rare bolide: it was brighter than Venus and detonated under the pressure of the atmosphere. At least 3 smaller meteors were visible after the explosion, which was followed by a faint echoic pop, like a cosmic firework. I reported the event to the American Meteor Society, which collects anecdotal data for verification and record-keeping. It was pretty cool to see that my sighting was corroborated by a guy over 100 miles away in San Antonio, and maybe others will add details to the report!

Most of my fireball sightings have occurred outside of a designated ‘meteor shower’. That’s because meteor showers happen by a different mechanism: Earth periodically passes through comet trails of ice and dust along its orbit, which is why their positions and peak intensity times follow a predictable annual schedule. Some showers can persist for decades or even centuries before debris trails dissipate – the Perseid meteor shower was documented by the ancient Chinese over 2,000 years ago, replenished by the regular passage of comet Swift-Tuttle every ~133 years. These meteor showers are more visible in the early morning hours, as this is when you look up into the direction of Earth’s travel. While the streaks of comet dust hitting Earth’s atmospheric windshield are still a sight to behold, fireballs are only marginally more likely to become visible meteors in the predawn hours, since large meteoroids already strike the atmosphere at speeds comparable to or exceeding Earth’s orbital velocity.

There’s quite an interest in studying large meteors, not just for their stellar beauty but also for the existential risk they pose to humanity. Surprisingly, there have been no documented deaths from a direct meteor strike, though a number of close calls have occurred over the years. Memorably, a house-sized meteor fell over a densely populated area near Chelyabinsk, Russia in 2013, creating a shockwave that injured some 1,500 people. Just this month, a 2.8-pound meteorite crashed into a lady’s bedroom in British Columbia. Although the impacts of comparatively small meteorites are random and impossible to mitigate, NASA and other space/defense agencies worldwide are diligently working on technologies to intercept the next ‘big one,’ which historically arrives roughly once every 250 years. Meanwhile, I will continue watching the skies, hoping that another impressive fireball might pass overhead.

New Normals

There’s been a lot of talk about a “new normal” as we emerge from the COVID-19 pandemic, one characterized by germophobia, physical distancing, remote work, and a reluctance to gather socially. The hesitancy may go on indefinitely; even though all Americans over 16 have access to a free vaccine, a large portion of the population will opt to go unprotected. It is truly confounding that those most staunchly anti-vaccine are likely living their full-contact lives while many vaccinated people still take precautions, and that events will remain limited until herd immunity is achieved even though, a few short weeks from now, practically no one left in the herd will be concerned about catching the virus. But I digress.

The new normal we should be talking about, in my view, is living with the irreversible changes brought by human activity. Last week, NOAA published new 30-year averages for temperature, precipitation, and other meteorological variables. To no one’s surprise, the new average temperatures reflecting 1991-2020 are substantially higher than the previous figures reflecting 1981-2010, showing a warming consistent with global temperature increases. Each of the last two normals updates has raised the average national temperature by nearly 0.5 °F, accounting for over half of the total 1.7 °F temperature increase since 1900. Mathematically, this means that the 2010s were 1.5 °F warmer than the 1980s, the 2000s 1.5 °F warmer than the 1970s, onward and upward (see the arithmetic below). Moreover, climatic trends that began around 50 years ago are exacerbated in this latest update: it’s becoming warmer and wetter across most of the country, and much hotter and drier in the desert southwest.
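To spell out that arithmetic: each update drops the oldest decade from the 30-year window and adds the newest, so the change in the mean is one-third of the difference between the swapped decadal averages (denoted $D$):

$$
\bar{T}_{1991\text{-}2020}-\bar{T}_{1981\text{-}2010}=\frac{D_{\mathrm{2010s}}-D_{\mathrm{1980s}}}{3}\approx 0.5\,^{\circ}\mathrm{F}
\quad\Longrightarrow\quad
D_{\mathrm{2010s}}-D_{\mathrm{1980s}}\approx 1.5\,^{\circ}\mathrm{F}
$$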

Link to Washington Post article

While global warming-driven climate change has seemed like a distant threat in the past, the 2010s represent a turning point in which the issue became imminently tangible. Whether it’s the proliferation of news media or simply an increased environmental interest worldwide, there have been a number of stories documenting startling changes in our Earth systems. Calving ice sheets, habitat loss, industrial spills and plastic waste pollution, natural disasters, water shortages, and more are broadcast with viral imagery (often sensationally, but that’s effective) and discussed in the proper context of historical records. I recently watched Chasing Coral, a shocking documentary showing in real time as major sections of the Great Barrier Reef died over the span of a few months. The fact that 1 °C of ocean warming could cause a loss of over 90% of coral reefs is an alarming testament to the fragility of our natural balance, and I fear the ripple effects across all ecosystems due to what once seemed like a minor perturbation in climate.

Moving forward, I want to better advocate for Earth-conscious policies and causes, rather than simply be a background observer as I have been for the past decade. The Paris Climate Accord, which seemed proactive at its nascence in 2015, can now be viewed as a no-brainer reaction to an existential threat and frankly may not go far enough to push developed countries to innovate a sustainable future. I regret my decision in 2012 to work all summer instead of learning scuba and seeing the coral reefs of the South Pacific when they were near full strength, not foreseeing that irreversible damage would be done in a few short years. I do not want to make the same mistake with Earth’s other disappearing treasures: indigenous villages and cultural enclaves assimilating into modern society, coastal cities and coral atolls sinking into the sea, glaciers melting and rainforests burning and previously habitable land desertifying. Even the pristine dark sky is vanishing due to the multiplying satellites in orbit, and an unlit sky is perhaps the most transfixing view on Earth. Perhaps an individual can’t slow the massive forces of change, but a little more worldly observation would suit us all, since what we enjoy about today may not be here forever.

An Odd Tornado Trend

The 2021 tornado season is already in full swing, with back-to-back weeks of high-risk convective outlooks in the Southeast. Numerous supercells spawned significant tornadoes across Mississippi, Alabama, and Georgia, culminating in a tragic EF4 tornado in Newnan, GA on Thursday evening. The recent rash of severe weather reminds me of a trend that I know has no scientific merit but strikes me as uncanny nonetheless: it seems like the major tornadic events of my lifetime have mainly occurred during odd-numbered years. Joplin and Tuscaloosa in 2011. Moore in 1999, 2003, and 2013. El Reno in 2011 and 2013. More recently, a rash of winter tornadoes in early 2017. Rare autumn tornadoes near Dallas in 2015 and 2019. Dayton and several others during a record wave in May 2019. As an armchair statistician with an interest in severe weather, I could make the list go on and on.

But while a list of examples does constitute an observation – a worldly observation at that – this is no way to make a scientific assertion. Experiential data can be biased based on one’s vantage point, preconceived notions, memory, or any number of factors. The cold, hard data tells the true story: that variations in tornado counts over time fall within statistical randomness. My little odd-year hypothesis was not unfounded for the last decade, though, when happenstance alone may account for spikes in 2011, 2015, 2017, and 2019 compared to surrounding years:

Annual tornado counts since 1950. The increase over time reflects progress in tornado observation and damage reporting, not climatic effects or other biases.

A count of all tornadoes doesn’t tell the full story, of course. About 90% of tornadoes are weak, rated EF0-EF1, often brief spin-ups within larger quasi-linear convective systems (QLCS) or tropical storm systems. If we only consider EF2 and stronger tornadoes, which account for more than 95% of tornado-related fatalities and economic damage, it becomes clear that 2011 was an outlier year due almost entirely to the April 27 super outbreak. 2015 actually registers as the lowest year for strong tornadoes since the EF scale was introduced, and 2017 and 2019 reflect only slightly above-average numbers of significant tornadoes:

Doppler-era strong tornadoes (EF2 and up) depict 2011 as an outlier year. Source: SPC

If it’s a trend at all, it’s a weird one that I’m willing to chalk up wholly to coincidence. Tornado climatologists have tried with limited success to link tornado occurrences to El Niño/La Niña, but that wouldn’t explain a two-year alternating cycle either. And it would be a minor trend at best: off the top of my head, I initially forgot some major tornado events from even years, like the strong tornadoes in Mayflower, AR and Tupelo, MS in 2014. In fact, last year was an even year headlined by a devastating tornado in Nashville and a large outbreak that I wrote about but completely neglected to remember (there’s that bias again – my memory was distracted by the onset of a global pandemic and associated life changes). Trend or no trend, severe year or less severe year, all tornado outbreaks inflict damage and affect the people living in their paths, irrespective of whether the event is “major” or memorable. That’s why I hope for a break in the trend in 2021: a less severe tornado season and a return to more scientific analyses.

The Worst Hard Time

In the spirit of “Well, it could be worse…”, I recently finished reading The Worst Hard Time by Timothy Egan. This book wraps first-hand accounts of the Dust Bowl into a chronological picture of the extreme hardship experienced by High Plains farmers and entrepreneurs throughout the 1930s. The storytelling is detached from the pain of its main characters, people trapped in an unimaginable hell that has largely worn away from our collective memory. Despite growing up in Oklahoma, I was surprised by the coarse brutality of dust storms and the rooted resilience of the people who survived.

First, dust storms are worse than I ever imagined – and I study severe storms! Not only did these ‘dusters’ whisk away topsoil and kill crops, but getting caught outside in one was practically a death sentence. Respiratory conditions like silicosis and dust pneumonia claimed hundreds of lives, a threat so serious and continuous that homesteaders would seal windows and doors with wet rags to impede the inevitable infiltration of insidious particles. The static electricity from blowing dust could incinerate vegetable gardens and short out vehicles, stranding motorists in remote dunes. Airborne dust twice drifted as far east as New York and Washington, coating cars and buildings with a grubby film of Heartland strife. Meteorologically, dust storms form when a gust front (or just persistent winds; it’s frequently windy in this part of the country) crosses an expanse of loose, desertified territory. There were a few in 1930 and 1931, 14 in 1932, 38 in 1933, then too many to count on windy days throughout 1934 and 1935. People could eventually tell a cloud’s origin by its color – black from Kansas and Nebraska, red from Oklahoma, gray from Colorado and New Mexico, sandy tan from Texas.

Black Sunday (April 14, 1935) duster descending on Rolla, Kansas. (History Channel)

Second, the arc of the crisis was unsettling – and in a way, prescient. The 1920s were a decade of plenty, as ‘nesters’ flocked to the high plains to take advantage of cheap land and high wheat prices. But when drought and dust storms ravaged the newly plowed land, many nesters moved on to greener pastures, and those who stayed plowed their plots with a stubborn urgency. Marketing campaigns urged more settlement and more planting, since “rain follows the plow.” A pyrotechnics expert was hired to bomb the sky to whip up some rain. When that didn’t work, the suffering was chalked up to God’s will, though never to the people’s own pride in trying to supersede nature. The federal government swooped in to offer New Deal subsidies for tree-planting and grassland conservation projects, while prominent voices representing the people rejected any government involvement on the grounds that they were tough people who could pull through on their own. Residents of Dalhart, Texas founded the fatalistic ‘Last Man Standing Club’ as they watched formerly prosperous ranching country turn into a lifeless wasteland.

Denial is often as dangerous as a crisis itself. After optimists said “this drought can’t possibly go on” in 1931, it was a full eight years until substantial rain fell and over 10 years until wheat became profitable again. Likewise, two months into the COVID-19 pandemic, there’s a growing sentiment that normal activities should be resumed despite the fact that case counts continue to increase. Rather than denying the scope of the problem, or simply returning to what we’ve always done, I would advocate for exploring every possible avenue for a solution. The Dust Bowl likely wouldn’t have lasted as long if there had been a concerted, government-led effort to either a) drill deep wells into the Ogallala Aquifer or b) restore native prairies and plant ‘shelterbelt’ forests prior to 1937. The COVID-19 pandemic could be similarly abated by a concerted, government-led effort to mandate mask-wearing, social distancing, and sanitization precautions in public places until a vaccine is available. Only if people accept the reality of the situation can we improve it, opening up in a way that optimizes the protection of lives while also preserving livelihoods.

Time for Leap Year

It’s time for our quadrennial bonus day! This go-round I am in Hawaii, which means I will be among the last in the world to experience the rare glory of February 29th. It also means that the lone drawback of this holiday, namely that it doesn’t extend June or October instead of gloomy February, is moot for me! Before I head to the beach, I have to take this once-in-four-years opportunity to talk about the time dimension of my weather modeling work.

Leap year is the perfect window for a discussion of time. Of course, the reason we have an extra day every four years is that Earth completes its orbit of the sun in roughly 365.2422 days. Since that’s not quite 365.25 days, we skip the leap day in years divisible by 100, unless the year is also divisible by 400, in which case we keep it (as we did in 2000). Because Earth’s solar orbit is affected by the moon and other planetary bodies, the length of a solar year can, in fact, vary slightly. Complicating the math further, the average day now lasts 24.0000006 hours, since the second is defined atomically rather than astronomically. The calendar is oh-so-gradually creeping forward (one could argue that this would be solved by universal adoption of the controversial-but-perhaps-overly-maligned leap second), but this won’t become a noticeable issue for another couple thousand years.
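The full Gregorian rule is compact enough to express in a few lines of Python – a quick sketch, not code from my model:

```python
# The Gregorian leap-year rule described above
def is_leap_year(year: int) -> bool:
    """True if `year` contains a February 29th under the Gregorian calendar."""
    if year % 400 == 0:    # every 400 years, keep the leap day (e.g., 2000)
        return True
    if year % 100 == 0:    # otherwise, century years skip it (e.g., 1900, 2100)
        return False
    return year % 4 == 0   # all other years: leap if divisible by 4

# 97 leap days per 400 years -> mean year of 365.2425 days, close to 365.2422
assert is_leap_year(2000) and not is_leap_year(1900) and is_leap_year(2024)
```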

Why do I care about any of this? The angle of the sun is an important variable in the heat balance of Earth’s surface, a core ingredient of all weather models. The leap year phenomenon causes the solar zenith angle (and corresponding sunrise/sunset times) to change a little bit from year to year. The annual sinusoidal movement of the sun’s apparent position is summed up by the solar zenith equations below, which do not consider the leap year variance. For the sake of exactitude, I modify my N input with a leap year correction term; this effectively sets the N+10 intercept to the exact time of day that the winter solstice occurred the previous year, whether on December 20th, 21st, or 22nd.
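Here is a minimal sketch of the standard four-equation solar position calculation (textbook approximations rather than my exact implementation; it ignores the equation of time, and my leap-year-corrected N would replace the plain day-of-year):

```python
# Standard solar zenith angle approximation (illustrative sketch)
import numpy as np

def solar_zenith(lat_deg, lon_deg, day_of_year, utc_hour):
    # 1) Day angle: fraction of the orbit completed, with the N + 10 term
    #    anchoring the cycle near the December 21st solstice
    gamma = 2 * np.pi * (day_of_year + 10) / 365.2422

    # 2) Solar declination: the sun's latitude, oscillating +/- 23.44 degrees
    decl = np.radians(-23.44) * np.cos(gamma)

    # 3) Hour angle: the sun's east-west offset from local solar noon,
    #    15 degrees per hour (solar time crudely approximated from longitude)
    solar_time = utc_hour + lon_deg / 15.0
    hour_angle = np.radians(15.0 * (solar_time - 12.0))

    # 4) Zenith angle from the spherical triangle of latitude, declination,
    #    and hour angle
    lat = np.radians(lat_deg)
    cos_zenith = (np.sin(lat) * np.sin(decl)
                  + np.cos(lat) * np.cos(decl) * np.cos(hour_angle))
    return np.degrees(np.arccos(np.clip(cos_zenith, -1.0, 1.0)))

# Example: local solar noon in Honolulu (21.3 N, 157.9 W) on February 29th
print(solar_zenith(21.3, -157.9, 60, 22.5))  # roughly 30 degrees
```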

Once you account for leap year, these four equations describe the incident angle of solar radiation for any location at any time. Of course the amount of solar radiation depends heavily on other atmospheric factors, but these few lines of code are still pretty powerful.

An exact expression of dates and times is also important when integrating data from numerous agencies and historical records. Data calls generally return the observation time along with the corresponding time zone information for the polled location, which is helpful. I then use the datetime package in Python to do the conversions for me – this normalizes every data input to the computer’s internal Unix time, which counts seconds since 00:00 UTC, January 1, 1970. Beyond preventing Y2K fears from being realized, this standardization (along with UTC time expression) has become absolutely crucial to how the world operates today, from telecommunications to air traffic control. It certainly beats the alternative of mass confusion resulting from differing conceptions of time, like the 300+ years when the world ran on both Julian and Gregorian calendars.
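A minimal sketch of that normalization step (the observation time and time zone here are made up for illustration):

```python
# Normalize a time-zone-aware observation to Unix time and UTC
from datetime import datetime
from zoneinfo import ZoneInfo

# A hypothetical observation reported at 3:15 PM Central on June 1, 2019
obs = datetime(2019, 6, 1, 15, 15, tzinfo=ZoneInfo("America/Chicago"))

unix_seconds = obs.timestamp()          # seconds since 1970-01-01 00:00 UTC
utc = obs.astimezone(ZoneInfo("UTC"))   # the same instant, expressed in UTC

print(unix_seconds, utc.isoformat())    # 1559420100.0 2019-06-01T20:15:00+00:00
```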

For most of my readers, this is probably more than you ever wanted to know about tracking time. I sincerely apologize, with the caveat that this day would not exist without the observations of early astronomers and many concerted efforts to standardize the calendar. It’s a bonus day; hopefully that means I get a pass!

Changing Currents

Two understated news stories crept onto my radar this week. First, and this is a fun one, a British Airways flight from New York to London set the subsonic flight record across the Atlantic, smashing the previous record by 17 minutes while clocking in at under 5 hours. To cut the flight time by a quarter from the usual ~6.5 hours, the Boeing 747 jetliner had to surf a wildly fast 200+ mph jet stream. This uncommonly intense tailwind was made possible by the pull of a strong low-pressure system over the British Isles, with credit to the polar vortex oscillating over Canada.

Second, not only are air currents setting records, but ocean currents are also speeding up. Increased winds and warming waters have caused a 5% acceleration per decade in mean oceanic flows, both at the surface and down to 2 km deep. Raising the average flow velocity from 1.0 mph to 1.1 mph since 2000 may seem like a minuscule change, but the sheer volume of water in the ocean makes this trend eye-opening. We don’t know what effects this flow increase has on regional climates or on the deep ocean, but they could become substantial if current trends continue.

These stories may not sound consequential on their own: two small, random scientific details on an enormous globe. But in my eyes, they are emblematic of what climate change really looks like. Individual hurricanes and winter storms may be the flash points for climate change dialogue today, but causation is impossible to prove with any certainty. Meteorology and climatology are statistical fields, after all: definitive proof is unattainable without a vast collection of data from many sources over a long time period. Though weather and climate experts agree that global warming/climate change is happening, the field shies away from predicting its impacts because of the variable nature of weather.

However, simultaneous worldwide observation is possible in the age of satellites and the internet, and a whole-earth view reveals several indicators of accelerating climate change. The polar vortex is growing more unstable by the decade; the jet stream is speeding up; mean winds have increased by about 1.9% globally in the last 10 years; ocean currents have sped up by 5% in the same time; the ozone hole over the southern hemisphere is slowly healing, but mean carbon dioxide concentrations hurtle past 400 ppm at a steady 2-3 ppm per year; and over 90% of the world’s glaciers are receding. The fact that big changes are happening slowly in the world’s climate should be indisputable, unless you reject hard scientific evidence. If the conversation focused on this type of evidence rather than post-disaster what-if conjecture, perhaps the topic wouldn’t be so controversial and we could plan for our future wisely.

Projecting onto the World

Sometimes I wish the world were flat. Not because I’m some kind of conspiracy wonk, just because I’m practical. Like your prototypical engineer, I love applying coordinate systems to, well, anything. A flat earth would be easy: model it with a 2D Cartesian coordinate space. We could even transform it into polar coordinates, say, if we wanted to superimpose a tornado above the origin. I thought that was a legitimate approach when I started conceptualizing the tornado prediction model a few years ago, but it really isn’t that simple. In fact, I found some of the intricacies fascinating. This post aims to break down my foray into GIS (geographic information systems) as my cartographic view evolved from “Let’s just call the earth flat” to “The ideal projection should minimize shape and size distortion to promote a balanced, unified worldview, but alas, such a projection is only possible in three dimensions.”

Earth’s geometry is complex, even if you ignore its movement through space and its relation to other celestial bodies. First off, it isn’t truly a sphere: it’s an oblate spheroid. With the centrifugal effect of planetary rotation bulging the equator, Earth’s diameter at the equator is about 43 kilometers greater than its diameter at the poles. Moreover, the axis of rotation differs from the magnetic axis, complicated by the fact that Earth’s magnetic poles are continuously on the move. This squished globe is our home, however, so cartographers and others have worked for centuries to develop accurate, catch-all representations of its irregular surface for navigational purposes and more.

Earth’s irregular form is governed by many forces. Source: ASU

Before projecting a coordinate system with accuracy, the three-dimensional reference form of Earth, known as a geodetic datum, must be established. Traditionally, this was done with simple ellipsoid geometry, and the spheroid parameters were periodically updated throughout the 19th and early 20th centuries with new geographic observations and theoretical advances. By the 1970s, the advent of GPS demanded an exactitude that motivated worldwide cooperation on a standard geodetic reference. So naturally, we came up with two of them: WGS84 (the World Geodetic System, used by most of the world) and NAD83 (the North American Datum, used by the United States). Both are accurate to a scale of inches over North America, and remote sensing data can take either reference as its basis. It’s best practice to run a conversion algorithm to transform all of your spatial layers into the same datum, minimizing any offset in your projections – see the sketch below.
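As one example of such a conversion, here’s a minimal sketch using the open-source pyproj package (the coordinates are arbitrary, and my pipeline’s details differ):

```python
# Datum conversion: NAD83 (EPSG:4269) -> WGS84 (EPSG:4326)
from pyproj import Transformer

# always_xy keeps coordinates in (longitude, latitude) order
nad83_to_wgs84 = Transformer.from_crs("EPSG:4269", "EPSG:4326", always_xy=True)

lon, lat = nad83_to_wgs84.transform(-96.0, 36.15)  # a point near Tulsa, OK
print(lon, lat)  # the offset is tiny, but keeping all layers consistent matters
```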

While an error in geodetic reference may be minor, your choice of projection can have hugely important consequences. It is, of course, difficult to visualize the surface of a sphere in a 2D medium, so large-scale map projections sacrifice accuracy in the portrayal of size, shape, or both. People have developed hundreds of projections over the years for different cartographic needs, many of them easily usable with open-source GIS functions. For my application, I require the spatial information in a square grid by latitude and longitude so that I can perform math on the grid cells. Despite the plethora of options at my disposal, I’ve found my wind modeling calculations easiest to visualize with a Mercator-type projection. Controversial history aside, I appreciate that the lat-lon coordinates are perpendicular and the relative shapes of surface features are preserved. Satellite remote sensing data is often released in spherical coordinates, and I use the USGS elevation product (a 1/3 arc-second resolution DEM) as a base layer without requiring a transformation.

The sailing hasn’t been smooth, however, mainly because the MRLC datasets – land cover, tree canopy, and ground imperviousness – that I use to describe the terrain come in an Albers equal-area projection. Because that projection maintains 30-meter square grid cells, the grid aligns with the compass rose along only one line, in my case the 96th west meridian between the latitudes of 29.5° N and 45.5° N. Fortunately this line runs directly through Tulsa and Omaha, limiting distortion in Tornado Alley. To accurately serve areas significantly east or west of that meridian, I have been experimenting with open-source algorithms to transform the conical raster data into the preferred lat-lon coordinates. The transformation is not trivial, converting squares into slightly convex trapezoids of differing size and centroid position. While I’m sure most of these algorithms work, the challenge lies in implementing a transformation that can crop the input data (so I never need to transform all 20 GB of the United States at once), executes with optimal computational efficiency, and coexists with any user’s Python software environment. After a few rounds of trial and debugging, I think I have found a suitable, if inelegant, solution in three steps: read a larger rectangular domain in raster coordinates, transform the raster to WGS84 coordinates, then crop to the requested lat-lon domain (sketched below). The wasted computation ranges between 0 and 30% due to this location-sensitive domain mismatch, but I’m willing to live with that tradeoff rather than coding an efficient data management algorithm myself.
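Here’s a rough sketch of that three-step workflow using the open-source rasterio package (my working code differs in its details; the function, file path, and padding value below are illustrative):

```python
# Read a window of an Albers raster, reproject to WGS84, crop to a lat-lon box
import numpy as np
import rasterio
from rasterio.warp import (calculate_default_transform, reproject,
                           Resampling, transform_bounds)
from rasterio.windows import from_bounds
from rasterio.transform import rowcol

def albers_to_latlon(path, west, south, east, north, pad=0.5):
    with rasterio.open(path) as src:
        # Step 1: read only a padded window of the source raster, found by
        # projecting the requested lat-lon box (plus padding for the curved
        # edges) into the raster's own Albers coordinates
        w, s, e, n = transform_bounds("EPSG:4326", src.crs,
                                      west - pad, south - pad,
                                      east + pad, north + pad)
        window = from_bounds(w, s, e, n, src.transform)
        data = src.read(1, window=window)
        win_transform = src.window_transform(window)

        # Step 2: reproject the windowed data into WGS84 lat-lon coordinates
        dst_transform, width, height = calculate_default_transform(
            src.crs, "EPSG:4326", data.shape[1], data.shape[0],
            left=w, bottom=s, right=e, top=n)
        dest = np.zeros((height, width), dtype=data.dtype)
        reproject(source=data, destination=dest,
                  src_transform=win_transform, src_crs=src.crs,
                  dst_transform=dst_transform, dst_crs="EPSG:4326",
                  resampling=Resampling.nearest)

    # Step 3: crop the reprojected grid to the exact requested box
    r0, c0 = rowcol(dst_transform, west, north)
    r1, c1 = rowcol(dst_transform, east, south)
    return dest[r0:r1, c0:c1], dst_transform
```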

I realize that the level of information went from basic to deep very quickly; that’s exactly how it happened for me. There’s such a wealth of GIS routines available for Python data processing, but the information is so decentralized that building my code has felt like a virtual treasure hunt. I’m still deep in the GIS developer’s rabbit hole, but at least I’m enjoying it down here. If you’d like to know more about the basics of projection, I love this video from Vox. And to play around with the shape distortions that accompany world map projections, I highly recommend this cool web application by Jason Davies. I’ll be back soon with pretty graphics to share!