"(T)o say that the individual is culturally constituted has become a truism. . . . We assume, almost without question, that a self belongs to a specific cultural world much as it speaks a native language." James Clifford

Wednesday, January 14, 2026

Global Warming Accelerating

When I took calculus in my first college-degree program, the graduate-student instructor didn’t bother to tell the class what a derivative signifies. A derivative is the rate at which a quantity changes; take the derivative of a rate, and you get the change in that rate (i.e., acceleration)—something much more difficult to detect empirically, as in watching an accelerating car. Formulae were the instructor’s focus, as if they constituted ends in themselves. By the time the climate numbers for 2025 came in, scientists could confidently say global warming was accelerating. The warming rate itself may have been increasing (i.e., a positive derivative of the rate), but media attention to that would have required a reform in how calculus is taught. We think in terms of speed more readily than acceleration, and in this respect we may be deficient in grasping climate change as it has been unfolding. More decades than I care to admit had passed by 2025 since I took that course in calculus; only now can I say that I have used the math, albeit conceptually rather than via formulae.

Looking at the numbers for average global temperature for 2023, 2024, and 2025, Robert Rohde, the chief scientist at Berkeley Earth, said in early 2026, “The last three years are indicative of an acceleration in the warming. They’re not consistent with the linear trend that we’ve been observing for the 50 years before that.”[1] A linear trend represents no acceleration, so the acceleration only became apparent in 2023. Relative to the prior years, the averages for 2023, 2024, and 2025 “seemed to jump up,” said NOAA climate-monitoring chief Russ Vose.[2] The average for 2024 was 1.6C above pre-industrial levels, hence slightly above the internationally agreed-upon limit of 1.5C, and the averages for 2023 (1.48C above) and 2025 (1.47C above) were essentially tied, so close to 1.5C that the average of the three years is above 1.5C. Even though the “leap” from the previous years since at least 2015 instantiates an acceleration, more years may be needed to assess whether the acceleration itself was increasing (mathematics majors would know to ask this). At the outset of 2026, the three preceding years appeared as a plateau rather than as evidence of continued acceleration, but a plateau can exist within a trend whose derivative is positive. My point is that we should be focused on changes in the rate of warming, for if the rate itself is increasing, then it will not be long until the threshold of 1.5C is surpassed and more extreme symptoms of climate change occur.
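The distinction between a rate and a change in a rate can be made concrete with finite differences. Below is a minimal sketch in Python; the 2023–2025 anomalies are the figures quoted above, while the 2020–2022 values are assumed, illustrative placeholders consistent with a roughly linear prior trend (they are not measurements).

```python
# Annual global temperature anomalies (degrees C above pre-industrial).
# 2023-2025 values are quoted in the text; 2020-2022 are illustrative
# assumptions consistent with a roughly linear prior trend.
anomalies = {
    2020: 1.25,  # assumed
    2021: 1.29,  # assumed
    2022: 1.33,  # assumed
    2023: 1.48,  # quoted
    2024: 1.60,  # quoted
    2025: 1.47,  # quoted
}

years = sorted(anomalies)
temps = [anomalies[y] for y in years]

# First differences approximate the warming rate (degrees C per year).
rates = [t2 - t1 for t1, t2 in zip(temps, temps[1:])]

# Second differences approximate the change in that rate --
# the "acceleration" that, I argue, we overlook.
accels = [r2 - r1 for r1, r2 in zip(rates, rates[1:])]

for y, r in zip(years[1:], rates):
    print(f"{y}: rate {r:+.2f} C/yr")
for y, a in zip(years[2:], accels):
    print(f"{y}: change in rate {a:+.2f} C/yr^2")
```

A steady linear trend makes every second difference zero; the jump in the first differences around 2023 is what registers as acceleration, and whether the second differences stay positive is the open question more years of data would settle.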

One of the weaknesses of democracy is that such symptoms may have to be experienced and seen before electorates treat climate change as an important issue in voting. Human nature itself, a product of natural selection, still prioritizes the immediate over the long-term, especially in regard to threats. Instant gratification, too, is “hard-wired” in us all, which is why we tend to vote to keep gas prices low rather than to cut off the further manufacture of gasoline-powered cars. These drawbacks in our nature, inherited from the gradual process of natural selection (mostly during our species’ hunter-gatherer period), have been associated with the lack of sufficient political will in the world since the 2015 Paris agreement to keep the average global temperature from surpassing 1.5C above the pre-industrial level. Our cognitive impairments, which also contribute, are less well-known. That is the idea explored here.

In addition to difficulties in conceptualizing, and keeping attuned to, what a derivative represents (i.e., the change in a rate, rather than the rate itself), our prideful arrogance about what we think we know also holds us back from grasping the magnitude of the human contribution to climate change. Just days before writing this essay, I heard a man aged 75 declare that climate change is “just the natural cycles.” I don’t know whether that person had gone to college, but I do know that he was not a scientist, so the man’s declaration rang out as out of place, given his actual level of knowledge of climate science. Similar to how we tend to focus on rates rather than on changes in those rates, most people would attend to the content of the man’s statement—that climate change is merely part of a long-term natural cycle that will eventually reverse itself—rather than to the declaratory form of speech with which he made it. It is difficult for us both to grasp changes in rates of acceleration and to detect the presumption of entitlement in the way a person makes a statement, whether written or verbal, and yet we tend not to realize that we have trouble with both. As one consequence, we understate the severity of climate change.

Lest anyone need a refresher, “Rohde said nearly all of the warming is from human-caused emissions of greenhouse gases. . . . Samantha Burgess, strategic climate head of the Copernicus service, said the overwhelming culprit is clear: the burning of coal, oil and natural gas.”[3] Lest it be conveniently assumed that the burning has been going on somewhere in nature away from humans, Burgess doesn’t mince words: “Climate change is happening. It’s here. It’s impacting everyone all around the world and it’s our fault.”[4] Climate change is not just part of a natural cycle that would be occurring even if Homo sapiens did not exist.

So Joe the plumber, a person, let’s say, who barely graduated from high school, would not only be incorrect in declaring that climate change is just part of a natural cycle; he would also be presumptuous in slighting the contradicting knowledge of climate scientists, whose years of study are indeed superior to Joe’s opinion. Like arrogance on stilts during a flood, Joe’s self-love issuing out in puffed-up “knowledge” may one day be underwater if he happens to live on a coast when enough of the polar ice has melted to raise the level of the oceans appreciably. That Joe would likely react angrily to being corrected, even though his declaration of knowledge actually has no foundation, is yet another indication of the presumptuousness that may be endemic to the human mind but seems more salient in uneducated people. Formerly known in Western civilization as the sin of pride, which Augustine and Paul set as the worst (and thus intractable) sin, treating one’s own opinion as a fact of knowledge can be added to the list of the deficiencies in our nature that may wind up causing the extinction of our species as the Earth’s climate approaches a new equilibrium sooner rather than later. How much sooner depends at least in part on whether the relevant derivative is positive.



1. Seth Borenstein, “Scientists Call Another Near-Record Hot Year a ‘Warning Shot’ of a Shifting, Dangerous Climate,” APnews.com, January 14, 2026.
2. Ibid.
3. Ibid.
4. Ibid. Italics added for emphasis.

Sunday, December 7, 2025

Holidays at U.S. Parks: Usurped by Partisan Ideology

In the United States, Christmas is the last official holiday of the calendar year, and Thanksgiving is the penultimate one. New Year’s Day is the first holiday of the year. Any other holidays among or between these are private rather than public holidays, and thus the public is not obliged to recognize them as if they were equivalent to public holidays. Although New Year’s Day has remained safe from ideological attack, neither Thanksgiving nor Christmas has. Nevertheless, their status as official U.S. holidays has remained, at least as of 2025, and so it remains proper and fitting for Americans to refer to those holidays by name rather than by the denialist, passive-aggressive expression, happy holidays, which conveniently disappears even from retail clerks just in time for New Year’s because that holiday is ideologically permissible. The problem writ large is the influx of ideology seeking to invalidate certain official United States holidays. By the end of 2025, the initial influx had triggered a counter-influx that is just as ideological, leaving certain (but not all) official holidays encircled by ideology. The underlying fault lies in using the creation of a holiday to promote an ideology.

Martin Luther King Day and Juneteenth were made official U.S. holidays to promote an ideology. This rationale for declaring a public holiday is problematic because such holidays should be acceptable beyond a partisan minority or even a simple majority of the public. This translates into requiring that both major parties agree (even beyond simple majorities) in Congress before a new holiday is declared.

With regard to existing official holidays that have long been on the books, the onus should be on efforts to remove those holidays, because ideologically-motivated change, being partisan, warrants strict scrutiny, whereas the holidays’ default status does not. In short, ideologically-motivated change should be subject to heightened scrutiny because ideologies are typically partisan rather than matters of unanimity.

That Martin Luther King Day and Juneteenth are arguably too duplicative or overlapping, thus contributing to there being too many public holidays at the expense of the Gross Domestic Product and thus prosperity (and employment), is an indication that both holidays came out of an ideological push rather than a national sense or identity. In other words, the excess alone is a sign that holiday-making had gotten out of hand. In 2025, U.S. President Trump argued that there had come to be “too many non-working holidays,” and that all the days off were costing the U.S. economy too much in lost productivity.[1] Doing ideology by creating holidays does not come cost-free in economic terms. If the selfish trend of making holidays in one’s own image continues, more and more holidays may be viewed as valid only by some Americans rather than by all, as if being an official U.S. holiday were not validating enough. This does not mean that every American must or should celebrate every holiday.

The trend can also be seen in the changes made to the holidays on which fees are waived in national parks. Firstly, that the “Trump administration removed Martin Luther King Jr. Day and Juneteenth from [the 2026] schedule of free entrance days for national parks” indicates that those two holidays are ideological, and thus partisan, in nature, and thus not fit to be public holidays.[2] Secondly, that the federal president then added his own birthday to the list of free-entrance days shows just how egocentric, and thus arbitrary (to other people), holiday-creation had become. Trump also removed the birthday of the Bureau of Land Management, which could reflect the president’s ideological dislike of regulatory agencies. Why not also remove the first day of National Park Week, Great American Outdoors Day, and National Public Lands Day as excessive losses of revenue, given that none of those constitutes even a minor holiday like MLK Day, Veterans Day, or Juneteenth? Removing such non-fee days would make sense from a financial standpoint, especially given Trump’s additions of President Theodore Roosevelt’s birthday and the Fourth of July, which make more sense anyway, given all that Roosevelt did for the national parks and the major status of Independence Day in anything governmental in the United States.

The president’s fiddling with the fee-free days at national parks goes to show that the questionable, ideologically-based rationale of holiday-creation can travel seamlessly alongside rationales that are more legitimate and credible from a national standpoint. So the interlarding of the former can easily go unnoticed and only be objected to after too many holidays have been added to the calendar. That conservatives were joining the game of ideological holiday giving-and-taking has effectively relativized, or flagged, what progressives had been doing in creating new national holidays and even in trying to outlaw Christmas, a national holiday, by castigating any mention of that major public holiday by name.

The addition of a counter-force could thus be efficacious if the objective is to sever holiday-construction from the tool-kit of partisan ideology. That the politicizing had already gone too far, with neither realization from the public at large nor any self-restraint by the expansionist ideologues themselves, is itself a problem worthy of notice and correction. Adding or ending a national holiday should receive the consent of a vast (super) majority of Americans at the very least, including both major political parties rather than just one plus a minority of the other. Opposing partisan ideologies can be fought over on the campaign trail and at the ballot box rather than through holidays, which, incidentally, can serve as respites from all the political turmoil. Treating holidays as political means rather than as ends in themselves, including what they stand for, has gone virtually unnoticed by Americans and their elected representatives. This takes a gradual and subtle yet important toll on the very notion of a public, official holiday, such that even the major holidays are subject to attack for ideological purposes. It is important to realize that any ideology is partial rather than holistic, because some values are emphasized more than others.

The guts that it took to risk treason by declaring the British colonies to be sovereign states, and President Lincoln’s benevolent declaration of one day to give thanks, came under attack in the early twenty-first century because American history is not salubrious with respect to American Blacks and Indians; counter-holidays, partisan in nature, were created, whether public or private (as if the two were the same). As a result, nearly every national holiday could be viewed as valid only for people of a certain ideology on one side or the other, rather than as what a public or national holiday should be. The vacuous, ideological expression in “wishing” someone “Happy holidays” is just one symptom of the underlying societal illness. Such a “greeting” fits with Nietzsche’s point that modern morality has been wielded like a club under the subterfuge of good-will. In other words, “Happy holidays” contains a virulent “Thou shalt not!” Unfortunately, the very notion of an official national holiday has become collateral damage for a people grown wary of too much ideological push. Is there any respite? At one time, holidays afforded such a rest. Put simply, spending weeks arguing, directly or by verbal passive-aggression, about a politically galvanized holiday is counter-productive from the standpoint of enjoying a day off work to relax and have fun. The tyranny of an ideological minority can be just as bad as that of the majority; holidays—July 4th at the very least—should be tyranny-free.



1. Pocharapon Neammanee, “Trump’s Birthday Added to National Park Free-Entry Days After Dropping MLK Day and Juneteenth,” The Huffington Post, December 6, 2025.
2. Ibid.

Monday, November 10, 2025

COP30: Is Symbolism Enough Amid Climate-Change?

With the U.S. fed up and only 100 governments left willing to attend COP30 in Brazil on combatting carbon emissions and the related global warming, the question arises of whether the basis of the annual conference, voluntary compliance, is sufficient and thus worth enabling with the staged meetings. Even continuing to hold the conferences annually can be viewed as part of a broader state of denial, given that the 1.5C maximum for the planet’s warming set at the Paris conference a decade earlier was by 2025 universally acknowledged by scientists to no longer be realistic; the target would almost certainly be surpassed. It is in this context that any progress from COP30 should be placed.

At the end of the pre-COP30 meetings, the “European Union and Brazil launched an appeal calling on other nations to recognize carbon pricing as a pragmatic way to cut emissions and fund the green transition.”[1] Crucially, the “declaration . . . is a symbolic way to encourage world nations to develop strategies and establish markets akin to the EU’s emissions trading scheme, ETS, in place since 2005. Under the ETS, the EU makes companies pay for the emissions they produce.”[2] Beneath the nice headline of the declaration and the assurances of “partnerships” lies the key word: symbolic. To characterize countries as partners is already a red flag, for a partnership is weaker even than an alliance, which can be broken at a moment’s notice with impunity.

Immediately after the “declaration” was made public, critics were saying “that putting the spotlight on carbon pricing could divert attention from real emissions-cutting, like investing in restoring natural carbon sinks, like forests and oceans.”[3] Yet even putting “real emissions-cutting” in terms of restoring forests and oceans—COP30 ironically being held near the increasingly deforested Amazon rain-forest—minimizes the urgency of keeping warming from greatly exceeding 1.5C. Real decreases in carbon emissions were needed, and yet only 100 national governments were meeting in Brazil to consider voluntary action at the country level.

The elephant in the living room, invisible to almost everyone, is the assumption that voluntary decisions by national governments, in the face of immediate economic and political costs, can be relied upon to solve the problem, even when it was clear in 2025 that the 1.5C maximum “decided” at COP21 in Paris would be surpassed. Like the tremendous risk of species-wide destruction from nuclear war—the belligerence of the Russian and Israeli governments over the two preceding years as of 2025 shows that an irrational decision to unleash nuclear weapons is not at all unrealistic—the risk to the species’ very survival from climate change justifies the establishment of a world federation with just enough governmental sovereignty, backed up militarily, to push back against wayward national governments and keep the worst of human nature from being unleashed with hitherto unimaginable ferocity and mass destructiveness. Anyone with the irrational fear that such a world federation, which Kant recommends in his writings, would produce the Anti-Christ might want to look at the Russians in Ukraine and the Israelis in Gaza as of 2025 for a clue as to where in the tiered system evil has already been manifest. Stalin and Hitler provide easy examples from the twentieth century.

In short, symbolic international conferences and absolute national sovereignty should no longer be relied on so heavily by our species if it hopes not to go extinct. If extinction does happen, the wound would almost certainly be self-inflicted. Yet even then, with blood dripping from the knife held by our species, word of the deed will not have reached us. As Nietzsche writes of the unconscious discrediting of God (a discrediting that Nietzsche opposed, for he was not an atheist), word of the deed did not reach the culprits, just as light from a far star may not yet have reached Earth even though the explosion has already happened. So too, our species has been oblivious concerning what would be sufficient to stave off the destruction even of the species itself. The human mind discounts even mass-destructive possibilities that are thought to be low-probability and far off in the future, and thus flinches from agreeing to set up adequate safeguards.

In issuing the warning here with an acknowledgement of utter futility, I may be writing only to future descendants who are already dead. I am time, the destroyer of worlds, Lord Krishna tells Arjuna in Hinduism’s Bhagavad Gita. Left to its own devices by a feckless, stubborn, and greedy species, time may indeed see the extinction of Homo sapiens, the “wise” species of Man, while the gods laugh at our primped-up seriousness as if we had been children pretending to be adults. Pathetically, we even take ourselves to be adults as we marvel at our own symbolic feats.



1. Marta Pacheco, “COP30: EU Back Global Carbon Market Alliance to Crack Down on CO2 Emissions,” Euronews.com, 10 November, 2025.
2. Ibid., italics added for emphasis.
3. Ibid.

Saturday, November 1, 2025

Accountability for the Rich and Famous: A Soft Landing for an Ex-Prince

In ancient Greek tragedy, it was not uncommon for a god or goddess to perform the function of a Greek chorus (i.e., a conscience) at the end of a play while being suspended above the stage by pulleys. Deus ex machina is the Latin phrase, meaning a god out of the machine. We get machine and mechanism from the Latin word machina. A movie entitled Ex Machina is about an AI android that seems full of life, even miraculous, from the “pulleys” inside its “body.” Ex-Prince Andrew of the U.K. (which seceded from the E.U.), or “Britain” informally, seemed to fly above the other actors in being able to land, fittingly around Christmas of 2025, at the monarch’s Sandringham estate in eastern England, rent-free and with King Charles funding his brother. Considering that Andrew Windsor should arguably have been sent to prison for having sex with a 17-year-old girl trafficked by the infamous Jeffrey Epstein, and that a large settlement paid by Queen Elizabeth II made Virginia Giuffre’s charges go away, as if magically, Andrew not only landed on his feet; he did so without touching the ground where we mere mortals make our way through life to survive and perhaps prosper.

The state’s palace office put out a statement claiming that royal sympathies are with the victims of abuse, “but if that were the case, the royal family could have acted more firmly . . . Distancing themselves from Andrew is not the same as calling for accountability,” one commentator countered.[1] This is not to imply that the royal family approved of Andrew’s behavior, whether in allegedly raping Giuffre or in allegedly having his police guards dig up dirt on her. But make no mistake: Andrew’s soft landing, wherein he is allowed to remain comfortably in the air above us mere mortals, does not divorce him from the luxurious life of royalty. Even though Andrew has been accused of using his public duties to enrich himself through his businesses, the King announced that he would be funding his brother going forward, even though questions about “how, exactly, Andrew affords his lavish lifestyle” could continue to be raised.[2]

When a prince himself, Charles could be said to have abused Diana emotionally by serially subjecting her to his rather blatant infidelity with Camilla. Additionally, the royal family refused to get Diana help for her mental illness. So, it would not be surprising were the King to actually have sympathy for his brother plagued by misdeeds of his own. Birds of a feather fly together, even when they appear to diverge publicly.

The Palace, I suspect, has become very savvy in using brand management to shore up the reputation of the royal family, as well as of the various actors therein. As one commentator wrote, “Distinguishing Andrew from the rest of the royal family is Windsor brand management after years of taint by association.”[3] Such taint includes Prince Harry’s revelation that Prince William physically attacked his younger brother because William was angry at, and disliked, Harry’s wife, a Californian and a former actress! So the Palace put out a video of William seemingly crying while listening to a subject’s sad story. The sudden show of emotion from a man who had otherwise looked staid and placid should have raised questions about manipulation of the public. That William could become king sooner rather than later, due to his father’s ongoing treatment for cancer (shown publicly in bloated, ruddy hands in photos), may have motivated the PR offensive. Such actually-offensive manipulation is sadly and typically lost on the public anywhere. Dazzled perhaps by the rich and famous soaring above us, we look up but strangely miss the sordid underbellies. Deus ex machina really does seem to apply to royalty especially, even when accusations of squalid, even illegal, conduct are too strong to ignore. It seems that the human mind, which is actually the brain, is too susceptible—too vulnerable—to being manipulated by forces whose power reigns on the public airwaves. If only you and I were as savvy as the rich and famous, accountability could be on the horizon. Surgite et adsequimini superos! (Rise up and overtake those above!)



1. Autumn Brewington, “UK’s Andrew Losing His ‘Prince’ Title Isn’t the End of the Story,” MSNBC.com, October 31, 2025.
2. Ibid.
3. Ibid.

Tuesday, August 19, 2025

Complexity in Global Warming: On the Imprint of Pride

It would be incorrect to claim that the planet’s atmosphere and oceans are both getting warmer in a linear, across-the-board way. Yet the existence of exceptions, such as slightly cooler average summers in some places in the interior of North America, does not lend credibility to climate-change denial, an agenda that was financed and promoted in part by fossil-fuel companies in the U.S. and E.U. before being totally repudiated by science. Indeed, the credibility of natural science vastly exceeds that of corporations with vested financial interests. Rather than discuss those interests, which have become better known to the public, I want to describe the sheer complexity of a generally warming planet, which is rarely adequately grasped by non-scientists, and delve into the sordid self-love that I contend is ultimately behind global warming.

Comparing the summers of the 30 years leading up to 2025 with the 1901–1960 average, the 48 contiguous American States showed “large changes in some regions, especially the West, and very muted ones in the central and southeast” States.[1] The “limited warming and even slight cooling in some locations” was “strikingly apparent.”[2] For example, Tuscaloosa County in Alabama cooled 0.6F since the first half of the twentieth century.[3] Lest this fact be taken as a rebuttal or counter-fact to “global warming,” Joseph Barsugli, a climate researcher at the University of Colorado, cautions, “There are not too many places on the planet that are showing this, honestly.”[4] In fact, locations in the interior of North America that have shown slight cooling are “an oddity amid a warming climate, as the general pattern is that the world’s land areas are warming up more quickly than the oceans. Europe, for instance, is one of the fastest warming land areas on the planet.”[5]

“According to the ERA5 dataset of the Copernicus Climate Change Service (C3S)” in 2025, Europe, rather than North America, had achieved the dubious honor of being the fastest-warming continent; in fact, it had been warming by “approximately 0.53C per decade since the mid-1990s,” and the Arctic was “warming even faster—around 0.69C per decade.”[6] Even the linearity in “0.53C per decade” is an over-simplification, and thus it should not be projected out from 2025 even to 2035. Although not directly impacting the summer heating, the Gulf Stream, an ocean current that brings relatively warm water from Florida to northern Europe, was still active, albeit slowing down, during the winter of 2024–2025. Should that conveyor belt cease to function due to the influx of cold fresh water from melting Arctic ice, European winters would be much colder, which would lower Europe’s overall annual warmth and could even have an impact on the summer heat—in which direction, I do not know; the complexity easily surpasses my ken, as I am not a climatologist.
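To see why the caveat about linearity matters, consider what a naive straight-line reading of “0.53C per decade” would imply. The back-of-envelope sketch below extrapolates from an assumed 2025 European anomaly; that baseline figure is chosen purely for illustration, not taken from the report, and the point is precisely that such projections should not be trusted.

```python
# Back-of-envelope: naive linear extrapolation of the quoted European
# trend (0.53 C per decade since the mid-1990s). The 2025 baseline
# anomaly below is an assumed, illustrative figure, not a measurement.
RATE_PER_DECADE = 0.53
BASELINE_2025 = 2.4  # assumption for illustration only

projections = {
    year: BASELINE_2025 + RATE_PER_DECADE * (year - 2025) / 10
    for year in (2035, 2045, 2055)
}

for year, anomaly in projections.items():
    print(f"{year}: {anomaly:.2f} C above pre-industrial (naive linear)")
```

The arithmetic is trivial on purpose: feedback loops, a faltering Gulf Stream, or accelerating emissions could each bend the curve away from any such straight line, in either direction.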

My point is precisely that the complexity, including feedback loops that were set in motion by warming before the mid-2020s and that, being entirely natural, cannot be stopped, still eludes even the grasp of the scientists who study the phenomenon of climate change. The E.U.’s southern states in particular were extraordinarily hot during the summer of 2025, and no one could then predict if or when the Gulf Stream might shut down in the Atlantic Ocean. The human impact on Earth’s atmosphere and oceans had already gone beyond what human minds or computers could project into the future. The negligence regarding population growth and business pollution around the world during the 20th century came with the convenient implicit assumption that the impact would not compromise or even extinguish our species—an assumption that included Homo sapiens being able to think its way out of such wide-scale effects. In fact, the inherent or innate underlying human problem goes beyond cognition being limited relative to the scale of the human impact.

The preeminence of the self, in a narrow self-love according to Augustine, is pride, “the beginning of all sin.”[7] In other words, “’the love of personal pre-eminence’ (amor excellentiae propriae) . . . means not simply ‘love of money’ but that ‘general avarice’ which makes a [person] seek for ‘something more than is fitting’, ‘for the sake of [one’s] own pre-eminence and through a kind of love of possession (quemdam propriae rei amorem)’.”[8] Whereas Aquinas would contend that greed is the prime sin, Augustine hung his hat on pride as the root of all evil, from which flows greed, the desire for more sans limit, as evinced not only by the executives and shareholders of fossil-fuel companies, but also by consumers who refused to constrain their use of electricity, or of gasoline for driving cars rather than taking buses, trolleys, and subways, in the last half of the 20th century and the first quarter of the next. The impulse preferring instant gratification for the self over the public or common good was continuously given enough rope even, possibly, to choke the species eventually.

The strategic use of regulation, and of contributions to political campaigns for influence, rendered culpable not only the people running corporations but also elected representatives and appointed regulators in government. The comfortable, mutual back-scratching relations between people in business and government could easily betray even the medium-term general good.

In other words, allowing a lower good to surmount a higher good—what Augustine called disordered love—for private, and thus narrow, immediate gain is incredibly short-sighted. Even in the mid-2020s, humanity was only coming to realize that time was catching up, and that it was already time to begin “paying the piper” for all the accumulated political, economic, and social/cultural lapses that had contributed to excessive CO2 emissions.

To be sure, the extent of the Earth’s surface and atmosphere relative to a human being could easily give people the misimpression that humans are unable to significantly alter the Earth’s atmosphere and oceans. But humanity should also have been sufficiently alarmed by the exponential increase in the human population through the 20th century, even with two world wars, to grasp the necessity of constraining that increase, as well as the consumption of energy by individuals and the pollution of companies geared to supplying the greatly expanded population’s consumption.



1. Chris Mooney, “The Strange Divide in How Americans Experience Summer Temperatures,” CNN.com, August 19, 2025.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.
6. The E.U. Climate Change Service, “Why Are Europe and the Arctic Heating Up Faster than the Rest of the World?” 14 July, 2025.
7. Augustine, De Gen. ad litt. XI. 18ff, as quoted in John Burnaby, Amor Dei: Study of the Religion of St. Augustine (London: Hodder & Stoughton, 1938), p. 120.
8. John Burnaby, Amor Dei: Study of the Religion of St. Augustine (London: Hodder & Stoughton, 1938), p. 120. Burnaby quotes from Augustine, De Gen. ad litt. XI. 18ff.

Sunday, July 6, 2025

Climate Change in Europe: On the Culpability of the Media

A report by the E.U. Copernicus Climate-Change Service in 2024 contains the finding that “Europe is the continent with the fastest-rising temperatures on Earth, having warmed twice as fast as the global average since the 1980s.”[1] Although “fastest-rising” and “twice as fast” are alarming expressions, no corresponding sense of urgency had translated into a political will capable of pushing through game-changing legislation and regulations in the European Union. The short-term financial interests of industry, cost-conscious consumers, workers not wanting to be laid off, and taxpayers would pale in comparison were a sense of emergency to take hold in the domain of politics. “Weak” states (i.e., governments) that are unwilling or even unable to resist short-term political pressures from an electorate exacerbate the problem even in the midst of climate change, which scientists decades earlier had predicted would really begin to move the needle on air temperatures globally in the 2020s (and just wait until the oceans become saturated with CO2!). You ain’t seen nothin’ yet may be the mantra for the 2030s.

It seems to be a case of the proverbial oblivious frog in gradually yet steadily warming water in a cooking pan on a stove, as the editors at journalistic media companies have been orienting their news to reporting on specific climate-related events that are disasters only in particular locales and thus do not point explicitly to global warming. For instance, on July 4, 2025, a wildfire in the E.U. state of Greece “prompted evacuations in coastal areas south of Athens” and mobilized “75 firefighters, including five elite ground teams . . . alongside fire engines, volunteers, four helicopters, and two aircraft” as well as municipal water trucks.[2] “(O)ngoing heatwaves, drought and strong winds” kept the fire-risk high in the area.[3] Only at the end of Euronews’ article on the fire is climate change mentioned, and then only in passing, as an aggravating factor: “While fires are common in the area, experts say climate change is exacerbating them.”[4] That is to say: Oh, by the way, the warming of the planet’s atmosphere and oceans is in play here. Besides relegating climate change thusly, the claim that it merely exacerbates wildfires in southern states such as Greece is a way of deflating claims that climate change ought to be handled as an emergency in terms of public policy. The media has thus been culpable.


1. David O’Sullivan, “Firefighters Battle Wildfires in Greece and Turkey, Prompting Evacuations and Emergency Response,” Euronews.com, July 4, 2025.
2. Ibid.
3. Ibid.
4. Ibid.

Friday, June 20, 2025

The Summer Solstice: Astronomy Is Not Meteorology

It boggles the mind that the same meteorologists who know that June, July, and August are the months counted when the average temperature for summer is calculated nonetheless broadcast the summer solstice, which falls three weeks into June, as the first day of summer. To do so in the context of weather forecasts is nothing short of intellectually dishonest. To an unfortunate extent, those meteorologists may simply be following the herd of tradition at the expense of thinking for themselves. The human brain is suited for much more than a herd-animal mentality.
 
“What’s the difference between meteorological and astronomical seasons? These are just two different ways to carve up the year. While astronomical seasons depend on how the Earth moves around the sun, meteorological seasons are defined by the weather. [Meteorologists] break down the year into three-month seasons based on annual temperature cycles. By that calendar, spring starts on March 1, summer on June 1, fall on Sept. 1 and winter on Dec. 1.”[1] Therefore, meteorologists who broadcast the summer solstice, which falls between June 20-22 depending on the year (as per leap years), as the first day of summer can be reckoned as intractable herd animals in their own profession. Distinct from the weather, as “the Earth travels around the sun, it does so at an angle relative to the sun. For most of the year, the Earth’s axis is tilted either toward or away from the sun. That means the sun’s warmth and light fall unequally on the northern and southern halves of the planet. The solstices mark the times during the year when this tilt is at its most extreme, and days and nights are at their most unequal. During the Northern Hemisphere’s summer solstice, the upper half of the earth is tilted toward the sun, creating the longest day and shortest night of the year. This solstice falls between June 20 and 22.”[2] This does not signify the first day of summer in terms of climate or weather, and yet too many meteorologists continue to mislead the public by stating on a weather graphic that the first day of summer falls on the summer solstice.
At most, meteorologists should constrain themselves on the summer solstice to announcing that daylight hours are longest on that day (and shortest on the winter solstice, which falls well into December rather than on December 1, the first day of winter as we know it). By June 20-22, summer as we know it here below on Earth is well underway. During that week in 2025, parts of North America and Europe were already in a heat wave, so it would be ludicrous for anyone, and especially for meteorologists, who should know better, to claim that meteorological summer had just begun. And yet the basic category mistake continued unabated.
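Because meteorological seasons are fixed three-month blocks, the season of any given date is a matter of simple lookup rather than astronomy. The following Python sketch illustrates the convention quoted above; the function name and mapping are illustrative, not drawn from any meteorological library:

```python
from datetime import date

def meteorological_season(d: date) -> str:
    """Return the Northern Hemisphere meteorological season for a date.

    Meteorological seasons are fixed three-month blocks:
    spring = Mar-May, summer = Jun-Aug, fall = Sep-Nov, winter = Dec-Feb.
    """
    seasons = {12: "winter", 1: "winter", 2: "winter",
               3: "spring", 4: "spring", 5: "spring",
               6: "summer", 7: "summer", 8: "summer",
               9: "fall", 10: "fall", 11: "fall"}
    return seasons[d.month]

# The solstice (June 20-22) falls three weeks INTO meteorological
# summer, not at its start:
print(meteorological_season(date(2025, 6, 1)))   # summer (the actual first day)
print(meteorological_season(date(2025, 6, 20)))  # summer (the solstice, well underway)
print(meteorological_season(date(2025, 12, 1)))  # winter
```

The lookup makes the category mistake plain: nothing about the Earth's axial tilt enters the calculation, because meteorological seasons are defined by annual temperature cycles alone.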



A television station in Boston, MA, misled the public by presenting June 20, 2025 as the first day of meteorological summer even though the graphic itself shows a heat wave coming up (left). On the right, Weather.com shows the forecast high for that day in London, UK. Obviously, June 20, 2025 was not the first day of meteorological summer in the Northern Hemisphere, so the claim to the contrary by meteorologists is nothing short of puffed-up ignorance based on a category mistake, broadcast publicly by people who should know better because weather is their profession. Meteorology and astronomy are distinct, albeit related, domains. Maybe meteorologists in London should have telephoned those in Boston to pass on the tip that 90F in London falls well into meteorological summer rather than on its first day.

That cognitive phenomenon is aptly described by Nietzsche in regard to his infamous claim that God is dead. Even though he states that he is referring to a particular conception of God, the Abrahamic one in which God is both vengeful and omnibenevolent, he has been thought to have been an atheist. Rather, his claim is that adding “Vengeance is mine, sayeth the Lord” to a deity that is omnibenevolent places an internal contradiction within that conception of the divine, as vengeance contradicts benevolence. The people responsible for this contradiction had no idea what they had done: their murderous act of discrediting an extant conception of the divine. Like light coming from the most distant star but not yet reaching Earth, news of their own deed had not yet arrived even as they had blood on their hands. Similarly, news of committing a category mistake has not reached the meteorologists who know that calculations regarding summer temperatures include June, July, and August and yet broadcast that the first day of meteorological summer doesn’t “arrive” until the astronomical “summer” solstice. News of their own confusion and conflation hasn’t reached them yet, even though their recurrent deed should be obvious to them especially, as their profession is meteorology. Perhaps astronomers could step in and change the names of the astronomical “seasons,” not even using that word, so the public might realize that astronomy and meteorology are two distinct, albeit related, domains. Even astronomy is misleading in this respect in calling quadrants of the Earth’s orbit “seasons.” Therefore, I submit that the professions of both meteorologists and astronomers are at fault in enabling the confusing category mistake wherein two distinct domains are conflated.



1. The Associated Press, “Sunshine Abounds as the Summer Solstice Arrives,” APnews.com, June 20, 2025.
2. Ibid.