J.C. Moore Online
Current Events from a Science Perspective

Sustainability Comes to Wichita

     Posted on Thu ,17/06/2021 by admin

The Wichita City Council voted on June 15, 2021, to form a Sustainability Board. This was due to the work of several environmental groups, especially SOAR, which had made that one of its main goals.

In 2017, a group of citizens in Wichita, Kansas formed the Society of Alternative Resources (SOAR) as a way to improve the sustainability of their city. The purpose of SOAR was to advise and assist local government, businesses, and residents on alternative resources, sustainability, and renewable energy issues. Its long-term goal is to ensure that our children and grandchildren have clean air, pure water, and a livable Earth.

SOAR decided to use the STAR communities rating system to interact with the local government. Below is the matrix that STAR uses to evaluate a community’s sustainability. It also acts as a guideline for ways to improve the community and evaluate its progress. 

Each item in the matrix has a further explanation in the STAR-V2 guidelines. Many cities invest millions to attract businesses and make their city more competitive in job creation, entrepreneurship, workforce development, and capital investment. They also need to invest in their communities’ sustainability. The things that attract millennials, talent, and young entrepreneurs to a city, and keep them there, fall under sustainability. Below is a letter published in the local Wichita Eagle newspaper to promote SOAR and its goals.

 How to Improve the Quality of Life in Wichita  11/10/2017

“Local Sustainability Issues” was the topic of the October Luzzati Lecture Series at WSU. Zach Baumer, Climate Program Manager of the Office of Sustainability in Austin, talked about the city’s effort to “green” its environment. Sustainable practices and a healthy environment are important issues for businesses, young professionals, and entrepreneurs when they consider locating in a city.

STAR ratings give an overall picture of the quality of life in a city and the desirability of living there. The STAR system considers a city’s progress in eight categories: Built Environment, Climate and Energy, Economy and Jobs, Education, Arts and Community, Equity and Empowerment, Health and Safety, Natural Systems, and Innovation and Processes. Austin rates as a four-star community with 476 points out of a possible 720. Wichita has a three-star rating with 231 points.
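
The point totals above can be turned into ratings with a short sketch. The 200/400/600 certification thresholds are my assumption about the STAR Communities system, not figures stated in the letter:

```python
# Sketch of STAR scoring: points out of 720 possible, mapped to a star level.
# The 200/400/600 cutoffs are assumed, not taken from the article.
def star_level(points: int) -> int:
    for stars, cutoff in ((5, 600), (4, 400), (3, 200)):
        if points >= cutoff:
            return stars
    return 0  # below the certification threshold

for city, points in (("Austin", 476), ("Wichita", 231)):
    print(f"{city}: {points}/720 ({100 * points / 720:.0f}%) -> {star_level(points)}-STAR")
```

With the assumed cutoffs, Austin's 476 points lands at four stars and Wichita's 231 at three, matching the ratings quoted above.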

Clearly, we have room to improve our community’s sustainable practices and our STAR rating. It will take effort and resources, but our businesses, city leadership, Chamber of Commerce, and our citizens should support improvements in the Wichita community. After all, we all have to live here.

(c) 2021 J.C. Moore

Mank: A Warning about Fake News

     Posted on Thu ,20/05/2021 by admin

“Though Mank was about the writing of Citizen Kane in 1940, it carries a valuable lesson about fake news that is relevant today.”

Mank is a movie about the life of Herman J. Mankiewicz, who collaborated with Orson Welles to write Citizen Kane. Citizen Kane was modeled on the life of media magnate William Randolph Hearst. Mank explores Hearst’s longtime friendship with one of Hollywood’s most powerful studio moguls, MGM’s Louis B. Mayer. Hearst’s newspapers helped Mayer ensure the success of his Hollywood films and stars for decades. Citizen Kane, though, was a most unflattering look at Hollywood’s powerbrokers. Before it was released, Mayer offered RKO, the studio that produced it, a million dollars if they would destroy it. Though that was a fortune in 1941 dollars, it is fortunate that RKO refused the offer. Citizen Kane has been acclaimed as one of the best movies of all time.

In his newspapers, Hearst had a reputation for going after anyone whom he wanted to target. One theme of the movie was Hearst and Mayer’s machinations to defeat Upton Sinclair in his 1934 campaign for governor of California. Sinclair had won national acclaim for his 1906 novel, The Jungle. It exposed the abuse of slaughterhouse workers and showed he was certainly no friend of the wealthy and powerful. The state’s Republican establishment, led by Hearst’s California-based papers and Mayer’s Hollywood studios, decided to do whatever it took to defeat Sinclair. They not only considered Sinclair a socialist, but they also feared his promises to raise their taxes. Back then, Mayer was the highest-salaried executive in the nation and the finance chair of the national Republican Party. Mayer was portrayed in the movie as using the Great Depression as an excuse to extort large salary cuts from the writers and actors guilds.

It was no great stretch, then, when Hearst’s California newspapers began running stories in 1934 that “reported” on Sinclair’s plans to expropriate small shops and homes, though Sinclair actually had no such plans. Perhaps the most consequential element of the campaign against Sinclair was a series of fake newsreels created by Hollywood film producer Irving Thalberg. These newsreels featured “reporters” speaking to “people on the street,” many of whom were actually small-time Hollywood actors reciting scripted remarks. Well-dressed individuals criticized Sinclair and praised his opponent. And there was footage of men jumping from freight cars, which the newsreel narrators said were shots of dangerous “hobos” arriving in California in anticipation of a Sinclair regime that would pay them to live off the state. The “hobos” were actually footage taken from the movie Wild Boys of the Road.

The newsreels depicted supporters of Sinclair’s opponent, Frank Merriam, as good, solid Americans and Sinclair supporters as foreign-accented Bolsheviks. This material was bundled together and presented as regular newsreels to the millions of Californians who went to the movies every week. It was all fake, but the public bought it – there it was in the newsreels. Thus bolstered, Merriam staged a remarkable come-from-behind victory in November’s general election.

Ironically, Mankiewicz was one of the very first film industry figures to sound the alarm about fake news. In the months after the Nazis came to power in 1933, he penned an anti-Hitler drama that predicted the murderous violence of the then-fledgling Third Reich. He wrote a script about the fake news that Josef Goebbels had produced and the anti-Semitic falsehoods that played a central role in the Nazis’ rise to power. Mankiewicz tried to find a studio with the courage to produce it, but the studios wouldn’t, as they feared losing their German market. Too bad; Mankiewicz’s script would certainly have made a timely movie, alerting the world to the dangers of Nazism. It should still be a warning to democracies today about the fake news created by extremists on the far right.

The Mythical Magic Hydrogen Economy

     Posted on Tue ,02/03/2021 by admin

There’s a little bit of truth to every myth, and the hydrogen economy is no different. Hydrogen fuel cells would be wonderful for the environment. They combine hydrogen with oxygen from the atmosphere to produce electricity, and they emit only pure water. The hydrogen can be made by electrolysis of water, and the energy for the electrolysis can be provided by renewable energy such as solar and wind. Though hydrogen must be stored at high pressures or low temperatures, it can be transported and used to replace fossil fuels in most of their applications. Then why are the fossil fuel companies so eager to transition to a hydrogen economy? They are now applying for grants from stimulus money for research on hydrogen power. There must be more to the story, and that is where the magic comes in.

It would take a tremendous amount of magic to make hydrogen a viable source of energy within 30 years. Currently, 95% of commercial hydrogen is made from fossil fuels, primarily methane. Producing hydrogen from methane is energy intensive. It requires that methane be reacted with steam heated to about 1100°C. That reaction produces hydrogen and carbon monoxide, which is then treated with additional steam at 380°C to convert the carbon monoxide to carbon dioxide. Not only is carbon dioxide produced as a byproduct, but it takes a tremendous amount of fossil fuel to heat the steam enough to carry out the reaction. Hydrogen produced in this way is called Brown hydrogen because of all the fossil fuels used. You’re probably beginning to see why fossil fuel companies are so eager to transition to a hydrogen economy.
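
The two reforming steps above add up to the overall reaction CH4 + 2 H2O → CO2 + 4 H2. A quick stoichiometric sketch (standard molar masses; this counts only the process CO2 and ignores the extra methane burned to raise the steam) shows how much CO2 is unavoidably produced per kilogram of hydrogen:

```python
# Overall steam-methane reforming: CH4 + 2 H2O -> CO2 + 4 H2.
# One mole of CO2 is produced for every four moles of H2.
M_CO2 = 44.01  # g/mol
M_H2 = 2.016   # g/mol

kg_co2_per_kg_h2 = M_CO2 / (4 * M_H2)
print(f"{kg_co2_per_kg_h2:.2f} kg of CO2 per kg of H2")  # ~5.46
```

So even before counting the fuel burned for process heat, reforming releases about five and a half kilograms of CO2 for every kilogram of hydrogen made.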

But wait. All we would need to do is capture the carbon dioxide produced in making Brown hydrogen and store it underground. Hydrogen could then be produced without adding more CO2 to the atmosphere, so it is called Blue hydrogen. Fossil fuel companies are now pursuing grants and subsidies to develop Carbon Capture and Storage (CCS) systems to do just that. But there are a few problems. Fossil fuel companies knew as far back as 1979 (see memo below) that adding more CO2 to the atmosphere would cause global warming and damage the environment. A CCS system requires little new technology, so why did they not develop CCS then, when global warming might never have become a problem? Fossil fuel companies did not do it because it would have made their products more expensive, and demand would have gone down. And they are even less serious about developing CCS systems now. With prices dropping on renewable technologies and energy storage systems, CCS would make carbon or hydrogen fuels so expensive that it would accelerate the transition to renewable energy and battery storage.

Though there are currently large supplies of methane available from fracking operations, using fracked methane to produce hydrogen just isn’t a good idea. The main problems associated with fracking are methane leaks and earthquakes (caused by the disposal of fracking fluids). It has been estimated that about 20% of the methane produced at the wellhead is lost through transmission losses and leaks. Because so much methane is lost during production, France has recently prohibited American fracked methane from being sold there. Though the amount of methane in the atmosphere is small, methane is 72 times more potent as a greenhouse gas than carbon dioxide over a 20-year period. As the graph below shows, the methane concentration in the atmosphere has grown exponentially, and it now accounts for about 1/4 as much global warming as carbon dioxide.
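
A back-of-the-envelope comparison shows why those losses matter so much. It uses the article's figures of 72 for methane's short-horizon warming potency and 20% wellhead losses; the per-kilogram arithmetic is mine:

```python
# CO2-equivalent impact per kilogram of methane produced.
# Leaked CH4 counts at ~72x CO2 (short horizon, per the article);
# burned CH4 yields CO2 by simple stoichiometry.
GWP_CH4 = 72                       # kg CO2e per kg of leaked CH4
CO2_PER_KG_BURNED = 44.01 / 16.04  # ~2.74 kg CO2 from burning 1 kg CH4

def co2e_per_kg_produced(leak_fraction: float) -> float:
    return leak_fraction * GWP_CH4 + (1 - leak_fraction) * CO2_PER_KG_BURNED

print(round(co2e_per_kg_produced(0.00), 1))  # 2.7 -- no leaks
print(round(co2e_per_kg_produced(0.20), 1))  # 16.6 -- ~6x the climate impact
```

With a 20% leak rate, each kilogram of methane produced carries roughly six times the near-term warming impact of leak-free gas.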

That brings us to hydrogen produced by electrolysis, called Green hydrogen. To create the infrastructure to produce enough Green hydrogen to transition to a hydrogen economy would take more than 30 years. To get there, we would have to start now. That would require Black and Brown hydrogen to be used while we develop a CCS system, and then Blue hydrogen could be used until we have a fully operational Green hydrogen infrastructure. We would be dependent on fossil fuels for at least 30 more years, and the concentration of carbon dioxide in the atmosphere would certainly go up. The best carbon capture systems are trees, oceans, and soils (through regenerative agriculture). Currently, those systems have not been able to keep up. Deforestation, commercial farming, and the acidification of the oceans are exhausting those systems’ abilities to capture CO2. The environment of the Earth cannot absorb much more carbon dioxide, and we certainly can’t wait 30 years on the chance that a commercial-scale CCS system will be developed.

Hydrogen is very useful for things such as welding, food processing, ammonia production, and rocket fuel, but it will never be useful to power our economy. That is because a hydrogen economy would be terribly energy inefficient. If you were to use electricity from wind to produce hydrogen, transport the hydrogen to where it is needed, and use hydrogen fuel cells to power your car, about two-thirds of the energy would be lost in the process. The electrical energy that would take you 300 miles in a battery-powered car would only take you 100 miles in a hydrogen-powered car. There is also no infrastructure in place to conveniently transport large volumes of hydrogen. Natural gas pipelines could not be used, as hydrogen reacts with metals and makes them brittle. In contrast, transmission lines for electricity are already in place and, if upgraded to handle the larger load, they could deliver power directly to your home and your car, and do it three times more efficiently.
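
That energy chain can be sketched with illustrative stage efficiencies. The individual numbers below are my assumptions; only the roughly-two-thirds-lost conclusion comes from the text:

```python
# Energy chain for electricity -> hydrogen -> fuel-cell car,
# with assumed (illustrative) stage efficiencies.
electrolysis = 0.70   # electricity to hydrogen
distribution = 0.80   # compression/liquefaction and transport
fuel_cell = 0.55      # hydrogen back to electricity in the car

overall = electrolysis * distribution * fuel_cell
print(f"hydrogen path delivers: {overall:.0%}")  # ~31%, i.e. ~2/3 lost

# Battery path for comparison: charger, battery round trip, motor.
battery_path = 0.95 * 0.90 * 0.90  # ~77%
range_ratio = battery_path / overall
print(f"battery advantage: {range_ratio:.1f}x")  # ~2.5x
```

With these assumed figures the battery path comes out about two and a half times more efficient; less favorable hydrogen assumptions push the ratio toward the article's three-to-one range comparison.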

Finally, hydrogen is explosive. If you have ever seen a hydrogen-filled balloon explode, you are probably aware of the tremendous power of a hydrogen explosion. Hydrogen explosions are rare, but they are bound to happen if hydrogen comes into wide use. A hydrogen explosion occurred in an AT&T uninterruptible power supply battery room in 2020. The explosion blew a 400-square-foot hole in the roof and collapsed walls and ceilings throughout a large portion of the 50,000-square-foot building. Fortunately, the computer/data center was vacant at the time and there were no injuries.

All things considered, unless you own a fossil fuel company or believe in magic, trying to convert to a hydrogen economy is a really bad idea. 

© 2021 – J. C. Moore. All rights reserved.

Thermodynamics: A Tour through the Three Laws

     Posted on Thu ,18/02/2021 by admin

I’m not sure how they got to be laws, but they do appear inviolable in most instances.

The First Law: “Energy is conserved, i.e., it can be neither created nor destroyed.”

However, it may change from one form to another, such as heat to work. This law allows you to trace energy as it changes from one form to another and to identify all the places it ends up in the environment.

 It was a little embarrassing when atomic physicists discovered you could convert mass to energy. Before that, there was also a Law of Conservation of Mass. However, the amount of energy produced is given by Einstein’s equation: E = mc². So mass is now considered to be another form of energy, and energy another form of mass. It is difficult to convert energy to mass, but then, again, there is the Creation Story.

The Second Law: “It is impossible to convert heat completely into work.” …Lord Kelvin

There are many different statements of the second law, all supposedly equivalent, although it may take several pages of equations to show it. The second law of thermodynamics was originally an empirical observation about the workings of heat engines. It was later realized that it was a fundamental law of nature, and it was most useful as it introduced the concept of entropy (S).

“In an isolated system, a process can occur only if it increases the total entropy of the system.” … Rudolph Clausius

One very useful derivation based on the second law is that, for an engine converting heat to work:

 Maximum Efficiency = (Th – Tc) / Th

Here, Th is the higher operating temperature of the engine and Tc is the colder temperature of the exhaust. Using this we can show that the maximum efficiency of a coal-fired power plant is about 35%, while an extension of the formula shows the maximum efficiency of an internal combustion engine is about 15%. This, for instance, means that an electric car reduces carbon emissions by about half – even if charged from a coal-fired power plant.
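
The formula itself is easy to evaluate. A sketch with assumed temperatures (roughly 540°C steam and a 30°C exhaust, typical of a superheated-steam plant); note that the formula gives only the ideal ceiling, which real plants fall well short of once boiler, turbine, and transmission losses are counted:

```python
# Carnot limit from the formula above: (Th - Tc) / Th, temperatures in kelvin.
# Real coal plants deliver far less than this ideal ceiling.
def max_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return (t_hot_k - t_cold_k) / t_hot_k

print(f"{max_efficiency(813.0, 303.0):.0%}")  # ~63% ideal ceiling
```

Raising Th or lowering Tc is the only way to raise the ceiling, which is why power plants run their steam as hot as their materials allow.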

Chemists have found entropy very useful, as it can be used to predict whether a chemical reaction will be spontaneous and how much product will be produced when equilibrium is reached:

”The entropy of an isolated system which is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium.”

With the advent of quantum mechanics, a better understanding came of how entropy relates to individual particles. Particles tend to arrange themselves in their energy levels in such a way as to reach a minimum in energy and a maximum probability. Which arrangement of particles below is more probable?

                              [Figure: two arrangements of particles in energy levels, labeled A and B]

Clearly, B is more probable as there are more ways to arrange the particles in the energy levels. The relationship between entropy and energy is made clear in this model. To move from arrangement A to arrangement B will require energy to move the particles up in the energy levels. The entropy of each arrangement can be calculated by:

                                  S = k ln(W)

where k is Boltzmann’s constant and W is the number of ways the set of particles may be arranged in the energy levels. Arrangement A is very interesting, as there is only one way, so S = k ln(1) = 0. That would be the arrangement at 0 K, and that leads us to the Third Law.
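
As a minimal sketch, the formula can be evaluated directly:

```python
import math

# Boltzmann's formula S = k ln(W). With only one possible arrangement,
# as in arrangement A, the entropy is exactly zero.
k_B = 1.380649e-23  # J/K, Boltzmann's constant

def entropy(w: int) -> float:
    return k_B * math.log(w)

print(entropy(1))   # 0.0, arrangement A
print(entropy(10))  # ~3.2e-23 J/K: more arrangements, higher entropy
```

The more ways there are to arrange the particles, the larger W and therefore the larger the entropy, which is exactly why arrangement B is favored.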

The Third Law:  “As the temperature of a system approaches absolute zero, all processes cease and the entropy of the system approaches a minimum value.”

In other words, at 0 K (–273°C), all particles are in their lowest energy states. At that temperature, all motion ceases, except for the vibration of molecules and the motion of electrons, and those energies are in their lowest possible states. Attempts to achieve 0 K have been unsuccessful, as cooling an object requires extracting energy from it and depositing it somewhere cooler. And there is nowhere cooler. The lowest temperature achieved has been a little less than 1 billionth of a degree Kelvin, which is cold enough for most purposes.

 It is convenient to have an absolute scale with which to measure thermodynamic properties, as absolutes are otherwise hard to find. I once witnessed an argument between a colleague, called Barracuda Beth by students, and a humanities professor about whether there were absolutes. My colleague won the argument by claiming that the atomic weight of oxygen-16 was absolutely 16 amu, and the humanities professor had no comeback. Unfortunately, she picked the wrong absolute. The next year, the standard for measuring atomic mass was changed to carbon-12, and oxygen-16 became 15.995 amu. Wisely, I did not mention that to my colleague, and that also explains the first sentence in this article.

Note: I have somewhat simplified the laws of thermodynamics and have avoided mathematical equations as much as possible. The goal was to give you a feeling for the laws and to entertain you. I hope you find it interesting.

© 2021 – J. C. Moore  All rights reserved.

Global Warming: The Rise of Methane

     Posted on Mon ,15/02/2021 by admin

Greenhouse gases play a huge role in keeping the surface of the Earth warm. Without the greenhouse effect, the temperature of the Earth would average about -18°C, and all the water on the Earth’s surface would be ice. The average temperature of the Earth’s surface is now about 15°C and rising. The graph below shows the concentration of the main greenhouse gases in the Earth’s atmosphere, and how they have changed in the last two millennia. 

Inarguably, an increase in the greenhouse gas concentrations will warm the Earth, and we are seeing that happen. The average temperature of the Earth is now 1.2°C warmer than it was in 1850. The temperature of the Earth was fairly constant over the thousand years before the industrial age, and people, plants, animals, and our agricultural practices have adapted to that temperature. What will happen as the Earth’s temperature rises? We are finding out, and the effects are alarming.

Of the greenhouse gases, water vapor accounts for about 70% of the greenhouse effect, carbon dioxide about 20%, methane 4%, nitrous oxide 1%, and the other greenhouse gases together about 5%. Our efforts to reduce global warming have focused mostly on carbon dioxide, as its concentration has increased over 40% from our use of fossil fuels. It will take time to phase out fossil fuels and transition to the use of renewable energy. The concentration of methane has grown appreciably in the last century, from about 800 parts per billion (ppb) to over 1900 ppb, and it is rising rapidly. Methane has about an eight-year half-life in the atmosphere before it is converted to carbon dioxide by natural processes, so the methane in the atmosphere would decrease quickly if we stopped putting it there. That is important, as methane has about 72 times the global warming potential of carbon dioxide over a 20-year period.
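
The eight-year half-life means a pulse of methane decays quickly; a small sketch:

```python
# Decay of a one-time pulse of methane, using the article's ~8-year half-life.
def fraction_remaining(years: float, half_life_years: float = 8.0) -> float:
    return 0.5 ** (years / half_life_years)

for yrs in (8, 16, 24):
    print(f"after {yrs} years: {fraction_remaining(yrs):.1%} remains")
# after 24 years only 12.5% remains, so cutting emissions pays off quickly
```

This is why methane cuts are so attractive: unlike CO2, which lingers for centuries, the methane already in the air largely clears itself within a few decades once emissions stop.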

The main causes of the rise in methane are commercial leaks, oil production, and fracking operations. In commercial sales, it is sometimes less expensive to ignore small leaks than to fix them. But many small leaks add up, and it has been estimated that about 10% of natural gas put into pipelines is lost before it reaches the end user. Some of those problems could be fixed. Methane is also produced as a byproduct of oil production. If the amount of gas is too small to be sold commercially, it is often flared, i.e., lit like a torch. That converts it into carbon dioxide, which is less damaging to the environment.

Fracking operations now produce a tremendous amount of natural gas for commercial use, and considerable amounts of methane escape into the atmosphere from the drilling operations and pipeline leaks. It requires effort and resources to contain the methane at the wellhead and to fix storage and transmission leaks. The EPA requires that leaks be self-reported, but often they are just ignored. Just recently, it has become possible to detect methane from GHGSat satellites. Below is a map that shows eight leaks in a 25 mi² area in Turkmenistan, as they were seen by satellite. Estimates were that those leaks accounted for about 10,000 kg of methane a day. The methane was from fracking operations, pipeline leaks, and unlit flares.

Before satellites, most methane emissions were discovered by infrared cameras. Using them, it was found that the methane emissions from the Permian basin in Texas and New Mexico were much greater than those reported. Much of that came from unlit flares, which could easily be corrected. One accident at a gas well in Ohio is now thought to be the largest methane leak ever in the United States. Three different oil and gas facilities in Algeria were found to be leaking methane amounts equivalent to the carbon dioxide produced by a medium-size coal-fired power plant. The detection of leaks has been spotty and regulation of leaks has been difficult in the past. There is considerable economic incentive for gas companies to reduce methane emissions from leaks. However, it is expensive to send out crews to detect and repair smaller leaks, and many companies have just let them go.

The EPA expects the oil and gas industry to self-report and to repair leaks, but many companies just don’t. There are plans to deploy seven more GHGSat satellites to monitor greenhouse gas emissions. With them, it will be possible to detect and enforce the regulation of many methane leaks. It has been estimated that cutting methane emissions by 40% would have the same effect as taking 60% of the world’s coal-fired power plants off-line. And, we could easily cut methane emissions by 40% within the next decade.

(C) 2021 J.C. Moore All rights reserved.

Global Warming and the Jet Stream

     Posted on Sun ,14/02/2021 by admin

The Arctic is much warmer now than it was 30 years ago. They even had 100°F days in Siberia last summer. The warming Arctic has caused changes in the jet stream, which controls the Northern Hemisphere’s weather. The Rossby waves in the jet stream, which move from west to east across the United States (see picture), now dip down farther south and move more slowly from west to east.

This means that the jet stream can sometimes pull Arctic air down from the Arctic region, an event called a polar vortex. The slower movement of the Rossby waves causes the extreme cold to persist longer. It is 3°F today in Kansas, the windchill is –15°F, and this cold spell will persist for about a week. If it is extremely cold and snowy where you live, you can thank global warming for that.

Rossby Waves of the Jet Stream

Apparatuses of Justification

     Posted on Fri ,05/02/2021 by admin

In his internationally renowned work, Capital in the Twenty-First Century, Thomas Piketty says that extreme economic inequality can only be sustained by “apparatuses of justification.” He states, “The existence of such ‘apparatuses’ can hardly be disputed; the notion that wealth rightly belongs to those who possess it, no matter the means by which they acquired it or the needs of others around the world, is certainly well within the mainstream of contemporary thought, especially in North America and Europe. Ideas such as this did not, however, permeate contemporary culture on their own. They are derived, developed, and distributed by corporations, government offices, ‘independent’ think-tanks, etc.” Two apparatuses of justification that immediately come to mind are trickle-down theory and the lies created by the Cornwall Alliance.

The trickle-down theory claims that the best way to promote economic prosperity for everyone is to give tax breaks to large corporations and to those already wealthy. The idea this promotes is that they will create jobs and provide opportunities for those less well off. It was tried on a large scale in the United States under Reagan, Bush II, and Trump. Over the years, many poor and middle-class citizens have voted for politicians advocating trickle-down theory. It is a flawed theory: wealth actually flows upward and pools at the top. Meanwhile, after 40 years, those citizens are still waiting for their share of the wealth to trickle down. The wealthy have become wealthier, the poor poorer, and economic inequality in the United States has grown to unacceptable levels, as shown in the graph below.

After all that time, many Americans still do not realize how they have been fooled, as the chart below shows.

The Cornwall Alliance was originally started to help poorer countries adapt to climate change. When E. Calvin Beisner took over as its spokesman, he interpreted that to mean that Third World countries needed to use more fossil fuels; never mind that they do not have the infrastructure or wealth to acquire and use them. Under his leadership, the Cornwall Alliance has come to be funded by dark money, most of which can be traced to fossil fuel companies. Who else? Beisner created the Green Dragon Monster, which he uses to represent environmentalists who want to reduce our dependence on fossil fuels. He uses “climate alarmist” to represent the 99.5% of climate scientists who have shown that climate change is caused by man’s activity, and “climate doomsayer” for those who agree with scientists that global warming is harming the Earth.

Beisner uses religious arguments to reach out to conservative Christians and solicit donations. There is little evidence that the money goes to the poor; it is used mostly to pay himself to distribute his message. He interprets God giving man dominion over the earth (Genesis 1:26-28) to mean that God has given man the right to exploit nature as he pleases. Apparently, he has very little understanding of ecology. Pope Francis’s encyclical on ecology, Laudato Si, says that “climate change is real and mainly a result of human activity” and that “the problem is urgent. Never have we so hurt and mistreated our common home as we have in the last two hundred years.” Beisner claims that Pope Francis was just wrong, which is probably news to most Catholics.

Beisner’s position is even at odds with his own Presbyterian faith. The Presbyterian Church is now recommending divestment from fossil fuels, and it was one of the first churches to address global warming. The Presbyterian Church first noted its serious concern over global warming at the 1990 General Assembly, when it warned that “the global atmospheric warming trend (the greenhouse effect) represents one of the most serious global environmental challenges to the health, security, and stability of human life and natural ecosystems.”

There are many other examples of apparatuses of justification. You may recognize them by their tendency to label their opponents with unflattering terms; by their opposition to scientific research; by their derision of mainstream religious leaders; and by their distortion of the truth. Ask yourself, “Who profits from their message?”, and if it is a special interest group, recognize it for what it is. And above all else, vote for political candidates opposed to those special interest groups.

© 2021 – J.C. Moore, All rights reserved.

The Problem with Addressing Induced Earthquakes

     Posted on Thu ,04/02/2021 by admin

Many people believe that man’s activities are so inconsequential that they could not possibly induce earthquakes. However, there have been cases as far back as the 1960s where the only reasonable explanation was that earthquakes were being induced by disposal wells. When the U.S. Army’s Rocky Mountain Arsenal built a disposal well in 1961 to get rid of waste fluids, the seismic activity in the area increased. The well was plugged and the earthquakes stopped. A study by the U.S. Army Corps of Engineers and the U.S. Geological Survey (USGS) determined that a “deep, hazardous waste disposal well at the Rocky Mountain Arsenal was causing significant seismic events in the vicinity of Denver.”  Another case of induced seismicity occurred in Kansas in 1989 near Palco, northwest of Hays. The largest earthquake had a magnitude of 4.0 and caused some minor damage. Several injection wells, used for the disposal of wastewater from conventional vertical oil wells, were located near a deeply buried fault zone. Investigators concluded that the earthquakes were likely induced.

Recent research shows that disposal wells are causing the earthquake swarms in Kansas and Oklahoma. There were only two or three quakes a year in Kansas and Oklahoma before 2009, when fracking operations started in the area and millions of gallons of wastewater began to be pumped into disposal wells each day. By 2015, there were about 4,500 Class II disposal wells in Oklahoma and about 1,600 in Kansas. Some Class II disposal wells, which are associated with oil and gas production, were injecting as much as 15,000 barrels of disposal fluids daily.

The graph below shows the number of earthquakes in the central United States from 1973 to 2019. The number of earthquakes, mostly in northern Oklahoma and southern Kansas, increased dramatically as the number of disposal wells increased after 2009. When, in 2015, the Oklahoma Corporation Commission (OCC) and the Kansas Corporation Commission (KCC) started putting limits on the amount of disposal fluid that could be injected into wells near earthquake epicenters, the number of earthquakes fell off appreciably. However, the same may not be true of their magnitude (M).

Figure 1.

Earthquake intensity is measured on the Richter scale. The Richter scale is logarithmic: an M3.0 quake produces ground motion 10 times stronger than an M2.0 quake and releases about 32 times more energy. Earthquakes over M2.0 can be felt, those over about M3.5 can cause minor damage, and those over M4.0 are strong enough to do structural damage to buildings and infrastructure. An earthquake near Pawnee, Oklahoma, in 2016 was an M5.8 earthquake. It caused millions of dollars of damage in Oklahoma, an estimated $600,000 in damages 110 miles away in Wichita, and was felt as far away as Illinois.
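
Those ratios follow directly from the logarithmic scale; the 10^1.5 energy relation is the standard one:

```python
# Each whole step on the Richter scale multiplies the measured ground-wave
# amplitude by 10 and the energy released by 10**1.5 (~32).
def amplitude_ratio(m1: float, m2: float) -> float:
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    return 10 ** (1.5 * (m1 - m2))

print(amplitude_ratio(3.0, 2.0))         # 10.0
print(round(energy_ratio(3.0, 2.0), 1))  # 31.6, the "32 times" above
print(round(energy_ratio(5.8, 4.0)))     # ~501: Pawnee's M5.8 vs an M4.0
```

The last line shows why magnitude matters more than quake counts: one M5.8 releases roughly five hundred times the energy of a structurally damaging M4.0.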

Most disposal wells are drilled into the Arbuckle zone, as it is porous enough to take up the fluid. The Arbuckle zone lies under the region at about 2,700 feet. Gravity acting on a column of saltwater that deep exerts a pressure on the order of 1,500 psi at the bottom. The disposal fluid, under that much pressure, has to go somewhere, so it migrates outward from the injection wells. As the fluid migrates, it causes an increase in pressure in the zone, labeled dP. When the increase in pressure, dP, reaches about 50 psi, it starts destabilizing ancient faults, causing earthquakes. The graph below shows how the dP has changed in south-central Kansas over the past several years, and its increase can be identified with new clusters of earthquakes.

Figure 2.

The pink area (dP > 50 psi) began increasing near the disposal wells in Harper and Sumner Counties and, by 2014, earthquakes began there. The pink area reached Reno County (RN) in 2017. Since then, that area has experienced 126 earthquakes of magnitude 2.0 or greater, and the magnitudes seem to be increasing over time. Hutchinson experienced an M4.2 quake in 2019 and an M4.6 earthquake on 01/19/2021, which was felt in 20 states. In 2018, Burrton, situated between Hutchinson and Wichita, had an M4.2 earthquake. One of Burrton's school buildings was damaged and hasn't been used since. The maps only go to 2017, but the disposal fluid has been migrating outward since then, with the pink area, dP > 50 psi, likely reaching Wichita in 2020. That's when earthquakes began occurring in Wichita.

The Wichita area has had very few earthquakes in the past. In the period 1990-2019, there was only one quake near Wichita, about 15 miles east. However, beginning in November 2020, a cluster of earthquakes occurred with epicenters under Northeast Wichita. There were 21 earthquakes greater than M2.5, the largest an M3.9 on December 30th, which could be felt as far as 30 miles away. Minor damage occurred, and Wichita citizens became concerned that there might be more and stronger quakes. Many people thought the earthquakes were caused by disposal wells in the area. The KCC and the Kansas Geological Survey (KGS) investigated and found six disposal wells within a 6-mile radius, together injecting a modest 9,000 barrels of wastewater a day. So the investigators concluded that there was very little link between the earthquakes and the disposal wells near Wichita.

And the investigators were mostly right. The earthquakes were likely caused by disposal wells much farther away. In 2016, the KCC limited the volume of fluid injected into disposal wells within a 6-mile radius of an earthquake's epicenter. That 6-mile distance, established for wells in south-central Kansas, is apparently not adequate. A 2018 research study published by the American Geophysical Union concluded that the earthquakes that occurred in Hutchinson were caused by an increase in fluid pressure from wells as much as 55 miles away. The graph of dP versus time from the KGS (Figure 2) makes it clear that the increased pressure migrating outward from disposal wells correlates with the clusters of earthquakes.

The induced earthquakes have done millions of dollars in damage to homes, public buildings, roads, and bridges. The disposal well companies should be liable for the damages, but lawsuits to recover them have been unsuccessful because it is not possible to link earthquake damage to any one disposal well. It has been proposed that the disposal companies carry liability insurance or voluntarily set up a fund to pay for damages. Since damages are caused by the total volume of fluid, it would be reasonable to apportion the cost among the disposal companies according to the amount of fluid they inject. Those proposals have not been well received.

Many people are now buying earthquake insurance for their homes when it was not needed before, but the insurance has been little help. After damages, many customers found their policies have a clause limiting payments to damage from naturally occurring earthquakes. Even policies covering induced earthquakes have been slow to pay, claiming the damage was caused by settling or poor construction. A professional engineer can determine whether damages were caused by an earthquake, and insurance companies should be required to pay promptly when the engineer certifies that is the case.

There have been efforts to put regulations in place to limit earthquake damage. Those have met with some success but are not adequate. Current KCC regulations impose volume limits on wells within 6 miles of a known earthquake epicenter, a distance the research shows is not sufficient. An effort to put limits on the volume of disposal fluid in all wells in Kansas, HB 2641, failed in the legislature after intense lobbying by the oil and gas industry. The KCC and KGS need to re-examine the research and put new guidance in place to protect private property and infrastructure and to guide the legislature in protecting the citizens of Kansas.

It is clear that both Kansas and Oklahoma need to put regulations in place to limit induced earthquakes and to pay for damages to infrastructure and homes. Either effective agreements or good legislation is needed to:

1) restrict the location of disposal wells.
2) limit the amount of wastewater that can be disposed of at a site.
3) limit the pressure which can be used to inject the wastewater.
4) require that any disposal well linked to significant seismic activity be further regulated.
5) require that disposal well companies form a fund or carry liability insurance to pay for earthquake damage, and pay claims promptly.

It would be best if the disposal well industry regulated itself by agreement. The industry would be happier with the outcome, and it would avoid the political pressure put on the state legislatures. So far, it is the Corporation Commissions that have put in place what regulations exist. Our best hope is that they will look at the most recent research and write regulations that take it into account.

(C) 2021 J.C. Moore All rights reserved.

Earthquakes in Wichita?

     Posted on Thu ,28/01/2021 by admin

Earthquakes in the Wichita area are very rare. However, beginning in November 2020, a cluster of earthquakes occurred with epicenters under Northeast Wichita. There were 21 earthquakes with a magnitude above 2.5 (M2.5) on the Richter scale. The largest was an M3.9 on December 30, which could be felt as far as 30 miles away. Many people thought the earthquakes were caused by disposal wells in the area, but the Kansas Corporation Commission (KCC) and the Kansas Geological Survey (KGS) investigated and found only six disposal wells within a 6-mile radius, together injecting a modest 9,000 barrels of wastewater a day. That may seem like a lot, but it is relatively small compared to disposal wells in Harper and Sumner Counties, where regulators found dozens of wells in 2016 pumping as much as 15,000 barrels a day. So the investigators concluded that there was no link between the earthquakes and the disposal wells near Wichita. And the investigators were mostly right. The earthquakes were likely caused by disposal wells as far as 55 miles away.

There were only two or three quakes a year in Kansas and Oklahoma before 2009. That was when fracking operations began in Oklahoma and millions of gallons of wastewater were pumped into disposal wells. The graph below shows the number of earthquakes per year in the central United States. The number of earthquakes, mostly in northern Oklahoma and southern Kansas, rose to over 1,000 a year by 2015, with one in Oklahoma in 2016 being an M5.8 earthquake. It was felt as far away as Illinois and caused an estimated $600,000 worth of damage in Wichita. When the Oklahoma Corporation Commission and the KCC put a limit on the amount of disposal fluid that could be injected into each well, the number of earthquakes fell off appreciably.

In Kansas, fracking started later, but earthquakes soon emerged as a problem as the number of disposal wells grew to more than 16,000 by 2015, some of them injecting as much as 15,000 barrels each day. In response to the induced earthquakes they caused, the KCC put a limit on the amount of fluid that could be injected into disposal wells within a 6-mile radius of earthquake epicenters. However, later research published by the American Geophysical Union found that earthquakes could be caused by disposal wells as far as 55 miles away.

Most disposal wells inject fracking fluids into the Arbuckle zone because it is porous enough to take up the fluid. The extra fluid has to go somewhere, so it migrates outward from the injection wells, causing an increase in the fluid pressure, dP, ahead of it. The Arbuckle zone lies under the entire region, and the increase in pressure is moving north and east in the zone. When the increase in pressure, dP, reaches about 50 psi, as shown in pink, it starts destabilizing ancient faults and causing slippage. The graph below shows how the dP has changed in south-central Kansas over the past several years, and its migration can be identified with new clusters of earthquakes.

When the pink area (dP > 50 psi) reached Reno County in 2017, the area began experiencing quakes. Since then, the county has experienced 126 earthquakes of magnitude 2.0 or greater, and the magnitudes seem to be increasing over time. Hutchinson experienced an M4.2 quake in 2019 and an M4.6 earthquake on 01/19/2021, which was felt in 20 states. In 2018, Burrton, situated between Hutchinson and Wichita, had an M4.2 earthquake. One of Burrton's school buildings was damaged and hasn't been used since. The map only goes to 2017, but the disposal fluid has been migrating outward since then, with the pink area, dP > 50 psi, likely reaching Wichita in 2020. And that's when the earthquakes began. Judging from Hutchinson's experience, future earthquakes in the area may be stronger.

Since the earthquakes are clearly caused by the volume of fluid injected by the disposal wells, the industry should be accountable for the damage done to private and public property. In the period 1990-2012, there were only 16 earthquakes of M2.5 or greater magnitude in Kansas. Only one of those was greater than M4.0, and only one was near Wichita, about 15 miles east. From 2013 until today, there have been 730 earthquakes of M2.5 or greater in Kansas, 220 of them M3.0 or greater, and 9 of them M4.0 or greater. The Richter scale is logarithmic, so an M3.0 quake produces 10 times the ground motion of an M2.0 quake and releases about 32 times more energy. Earthquakes over M2.0 can be felt, those over about M3.5 can cause minor damage, and those over M4.0 are strong enough to do structural damage to buildings and infrastructure.
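That logarithmic scaling is what makes the stronger quakes so much more destructive. A short sketch, using the standard Gutenberg-Richter energy relation rather than a figure from the article, compares the 2016 M5.8 Oklahoma quake with an M4.0 quake at the threshold of structural damage:

```python
def energy_ratio(m_big, m_small):
    # Gutenberg-Richter energy relation: log10(E) ~ 1.5*M + const;
    # the constant cancels when taking the ratio of two quakes.
    return 10 ** (1.5 * (m_big - m_small))

# The M5.8 quake released roughly 500 times the energy of an M4.0 quake.
print(round(energy_ratio(5.8, 4.0)))  # 501
```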

The earthquakes in Oklahoma have done millions of dollars in damages, but lawsuits to recover damages have been unsuccessful as it is not possible to link one disposal well with any given earthquake. One reasonable proposal was to have the disposal companies voluntarily create a fund which could be used to reimburse injured parties for damages. Since damages are caused by the total volume of fluid, it would be reasonable to apportion the cost among the disposal companies according to the amount of fluid they inject.

Certainly, regulations are needed to protect people's property in Kansas. Current regulations, imposed by the KCC, only limit disposal volumes within 6 miles of earthquake epicenters. An effort to put limits on the volume of disposal fluid, KS HB 2641, died in committee. The KCC and KGS need to re-examine the research and put new guidance in place to regulate disposal wells and guide the legislature in protecting public and private property in Kansas.

Kansas has not had exceptionally strong earthquakes yet. It has a unique opportunity to learn from what happened in Oklahoma and take action to limit induced earthquakes and any costs to Kansas citizens. Either effective agreements with the industry or good legislation is needed to:

  1. restrict the location of disposal wells.
  2. limit the amount of wastewater that can be disposed of at a site.
  3. limit the pressure which can be used to inject the wastewater.
  4. require that any disposal well linked to significant seismic activity be further regulated.
  5. require that disposal well companies create a fund or carry liability insurance to pay for earthquake damage.

Without effective action, the earthquakes are likely to grow worse. If the disposal wells cannot be regulated, then at least the disposal well companies should compensate people and taxpayers for the damage.

© 2021 – J.C. Moore All rights reserved.