"(T)o say that the individual is culturally constituted has become a truism. . . . We assume, almost without question, that a self belongs to a specific cultural world much as it speaks a native language." James Clifford

Thursday, August 17, 2023

Walmart: Encroaching on Employees' Private Lives

In 2023, Walmart relaxed its policy requiring anyone applying for a job at the company to get a drug test, including for marijuana, which at the time was legal in several U.S. member states. Once hired, however, employees were still subject to random testing. An employee in a member state in which the drug is legal could be fired even if the person is never affected by the drug while working. I contend that the practice is unfair, unethical, and an over-reach in terms of the nature of a labor contract. 

The ethical principle of fairness is violated because both marijuana and alcohol can impair the brain, and yet the company tests for only one of them even where both drugs are legal. An argument can be made that an alcoholic personality is less than suitable for the workplace, and yet it is taking marijuana outside of work (with no impact during work hours) that is treated as reason enough for an employee to be fired. Whereas alcohol can induce hostility and even aggression, marijuana has a calming effect—something that could actually help busy cashiers.

Besides being unfair, the policy of even random tests for marijuana is invasive, beyond the legitimate scope of an employer’s reach—assuming that the employee using marijuana is never “high” at work. In selling one’s labor, an employee does not agree to a company’s management being able to control the employee’s legal activities outside of work if those activities do not affect the employee’s work. Sam Walton, the founder of Walmart, was against marijuana; for him to impose his ideological opposition on others where the drug is legal was over-reaching and impious; he was not a god. An argument can also be made that it is none of the company’s business, literally and figuratively, whether an employee uses the drug where it is illegal, again as long as the employee is not “high” at work. Law enforcement is the job of police, not a company’s managers. Of course, if an employee is convicted of a crime, an employer may choose not to retain that employee. Walmart, however, hires people who have criminal records, which shows just how nonsensical the policy of random testing for marijuana is (especially as more and more U.S. member states legalize recreational use of the drug). In terms of a contract between an employer and an employee, an employer who presumes to dictate an employee’s recreational activities imposes a cost on employees that is not offset by the monetary compensation.

Imagine what would happen if a labor union informed a company’s management that an abrasive supervisor must be subject to drug and alcohol tests and fired for any positive results, or else the employees would strike. Suppose too that the supervisor does indeed have a problem with alcohol, but is not under its influence while at work. Still, the union insists that the company fire that person. Suddenly, the company’s management would object with a mighty roar: “How dare employees tell us what we cannot do on our days off! The nerve!” Well, it goes both ways, folks. The attitude is the same: the unethical vice of invasiveness (in people’s personal, not work-related, lives) is noxious and may even point to a toxic organizational culture.

See: Walmart: Bad Management as Unethical


Wednesday, August 16, 2023

Getting the Seasons Wrong: Purblind Meteorologists

You may think you know the answer to the question, “When is the autumn season?” But do you?  Watching the weather section of local news on television or the internet, you could be excused for getting the beginning date wrong because it is the meteorologist who has misled you. In itself, getting the exact day right is not a big deal; it is not as if the temperature can be expected to take a nose-dive on the first day of fall. The astonishing thing is that so many meteorologists either knowingly or out of ignorance present the astronomical beginning of the “autumn” quarter of the Earth’s orbit as the meteorological start of fall, for the two are different yet admittedly related.

“According to The Old Farmer’s Almanac, meteorological seasons are based on the temperature cycle in a calendar year.”[1] The first month of a given season tends to resemble the preceding season, and the last month anticipates the upcoming season. A season comes fully into its own in its second month. Each season lasts three months. “Meteorological fall begins on September 1” in the Northern Hemisphere and ends “exactly 90 days later, on November 30.” Winter then gets its three months. Growing up in a northern Midwestern (U.S.) state, I just assumed that snowy March was part of winter. It sure felt like that. Only later, while living in the Southwest, did I realize that temperatures do start to go up in March.
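To make the three-month convention concrete, here is a minimal sketch in Python (my own illustration, not anything from the Almanac or the cited article) that assigns a calendar date to its Northern Hemisphere meteorological season by month alone:

    from datetime import date

    # Each meteorological season spans three whole calendar months, so the
    # month alone determines the season (Northern Hemisphere convention).
    SEASON_BY_MONTH = {
        12: "winter", 1: "winter", 2: "winter",
        3: "spring", 4: "spring", 5: "spring",
        6: "summer", 7: "summer", 8: "summer",
        9: "fall", 10: "fall", 11: "fall",
    }

    def meteorological_season(d: date) -> str:
        return SEASON_BY_MONTH[d.month]

    print(meteorological_season(date(2023, 9, 22)))  # "fall" -- already three weeks into the season
    print(meteorological_season(date(2023, 12, 1)))  # "winter"

By this reckoning, the September 22nd date on the weather chart discussed below falls three weeks into meteorological fall rather than on its first day.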

Distinct from but having an impact on the meteorological seasons, the astronomical seasons are “based on the position of Earth in relation to the sun’s position.”[2] There are such seasons because of the tilt of the Earth in relation to the sun. On the summer solstice—the astronomical beginning of “summer”—the sun’s perpendicular rays reach their farthest point north; the Northern Hemisphere tilts toward the sun. The Southern Hemisphere tilts toward the sun on the winter solstice—the astronomical beginning of “winter.” Again, while growing up in the northern Midwest, I knew that winter could not possibly start well into December, just before Christmas, for winter cold was well ensconced by that time. The meteorological start of winter, on December 1st, is much more accurate in terms of temperature.

It is odd, therefore, that the start dates of the astronomical seasons are “more commonly celebrated” even to mark changes in weather.[3] A weather site or television broadcast using the astronomical start-date is inherently misleading, for the implication—that the weather changes with that date—is incorrect. Even though Accuweather.com states, “Astronomical autumn officially arrives on Saturday, Sept. 23 at 2:50 a.m. EDT, a few weeks after the arrival of meteorological fall,” the presentation of the two starts by a weather organization may be confusing, especially as the paragraph continues with: “Regardless of which date you celebrate the start of autumn, . . .”[4] In an article on the fall weather forecast, two start-dates for that season are given. At least Accuweather.com distinguishes the astronomical from the meteorological. Local meteorologists use the astronomical dates on charts of weekly weather forecasts.

On this weather chart, the Thursday (September 22nd) is labeled as "Fall." 

This is definitely misleading—or is it the case that local weather personalities do not realize their mistake? At worst, they knowingly mislead the public; at best, the television meteorologists astonishingly do not realize that they are giving false information. It is absurd, at least in the northern tier of U.S. member states, to say that summer does not begin until June 21st and that winter does not begin until a few days before Christmas. To stick with something that is so obviously absurd and incorrect when meteorologists should know better is precisely the cognitive phenomenon that I want to highlight here.

Perhaps the culprit is cognitive dissonance: the brain holding two contradictory thoughts at the same time. I know this date is astronomical AND I am using it on a weather forecast AND I know that the meteorological date is different. This weakness or vulnerability of the human brain may mean that there are others.

Regarding religious and political ideological beliefs, the brain may be susceptible to “short-circuiting” an internal check that would otherwise keep the brain from conflating belief with knowledge. A person once told me that Michelle Obama is really a man. I disagreed. The person replied, “That’s just your opinion; I have the facts.” I said that I did not want to discuss politics. “It’s not political,” she replied. My claim to the contrary was, again, “just an opinion.” I was stunned at such ignorance that could not be wrong. Next, she wrote a nonsensical “deep state” political coded message on an index card. That her brain would not entertain the possibility that it could be in error is precisely the vulnerability that I contend plagues the brain as it ventures into the political and religious domains of cognition. In short, I suggest that a healthy human brain has more cognitive weaknesses than merely being subjective, and that society is overwhelmingly oblivious to them.



[1] Amaya McDonald, “When and How to Watch the Perseid Meteor Shower,” CNN.com, August 11, 2023.
[2] Ibid.
[3] Ibid.
[4] Brian Lada, “AccuWeather’s 2023 US Fall Forecast,” Accuweather.com, July 26, 2023.


Tuesday, July 4, 2023

On the Decadence of American Journalism: Journalists as Celebrities

I submit that when a conveyer of the news becomes the story, something is wrong; in typing this sentence initially, I did not include “I submit that.” To state my thesis as if it were a fact of reason (Kant’s phrase) seemed to me rather heavy-handed (i.e., arrogant). Similarly, when some Americans insisted after the U.S. presidential election that Donald Trump had won, as if the asseveration were a fact of reason, I could sense aggressiveness along with the presumptuousness in treating one’s own opinion as a declaration of fact, especially when the actual fact—Joe Biden being sworn into the office—was otherwise. Opinion is one thing; fact is another. When a person misconstrues one’s opinion as fact, something is wrong. I believe this happens so often that it may be due to a problem innate in the human brain. Religious folks would not have to reach far to point out that in the story of Adam and Eve in the Garden of Eden, the sin of pride manifests in wanting to be omniscient; eating of that proverbial apple of the knowledge of good and evil ushers in original sin. A person perceiving one’s own opinion as fact, or even as important as fact, implicitly regards oneself as God. A journalist who interlards one’s role in conveying the news with one’s own commentary, and an editor who then makes that commentary the point of a story, both treat a means (i.e., the conveyer of news) as an end (i.e., the news itself). I contend that at least by 2023, American journalism had fallen into this hole with impunity, which involved a lack of industry self-regulation and individual self-discipline.

On July 4, 2023, The Huffington Post ran a story, “CNN Journalist Responds to Brazen Trump Campaign Claim with Disbelief.” The story begins with the following statement: “CNN’s Phil Mattingly on Monday couldn’t quite believe a Trump campaign response to a Washington Post question about the former president’s efforts to overturn the 2020 election result.”[1] Why should it matter whether a journalist can’t quite believe a statement made by a person being interviewed? CNN also ran the story, "Anderson Cooper Is Dumbfounded by Ron DeSantis' Bad Polling Excuse." The news network reported that the "CNN anchor was confused by the 2024 Republican presidential candidate's reason for falling behind Donald Trump." Why should it matter that the journalist was confused? Maybe he was not the brightest lightbulb. The network's message was obviously that DeSantis was to blame for the journalist's confusion, so the intent was to bias the viewers and readers against the presidential candidate. The inclusion of the word, excuse, in the story's title indicates the tenor of the bias in the "news" story.  

Is it ethical for a journalist to sway or bias the reactions of viewers or readers? Euronews, an E.U. rather than a U.S. news network, explicitly espouses impartiality so viewers can form their own opinions unimpeded by that of a journalist. That network even has a feature in which video is shown of events going on around the world, such as a political protest, with “No Comment” showing at the end. What a contrast to the American news media!

CNN’s obsession with the visible reactions of one of its news anchors, Anderson Cooper, to political statements even of people being interviewed illustrates my thesis. From his “news” show, Anderson Cooper 360, the network posted a video on CNN.com entitled “Watch Cooper’s Reaction to What Sondland Told Trump.” His reaction was visibly nothing spectacular. 

Another video had the title, “See Anderson Cooper’s Reaction to Ted Cruz ‘Groveling’ on Fox.” Again, the reaction was hardly noteworthy. 

Nevertheless, his show on the news network is all about the anchor, as the very name of the show makes explicit. CNN’s CEO must have thought that the anchor’s reactions would make good promotional material for the network. Strangely, a magazine’s editor even thought that Anderson Cooper’s reactions were of value apart from promotional purposes. People magazine posted a story online about the CNN anchor’s reactions in a New Year’s Eve broadcast from Times Square in New York City. The story, “Anderson Cooper Was All of Us with His Hilarious Reactions as He Took Shots to Bid 2020 Farewell,” featured the celebrity’s reaction to drinking a shot of alcohol on live television. After he and the other host drank a shot, “Cooper pursed his lips and coughed in seemingly slight discomfort, though he otherwise held it together.”[2] Hilarious.

It may be that Anderson Cooper’s minute reactions were such important fodder for publications because the anchor’s mother, Gloria Vanderbilt, was an heiress of an illustrious (and very rich) American family. She was a businesswoman, fashion designer, socialite, and writer in her own right. Interestingly, she had offered at the age of 85 to carry a baby for her gay son. That in itself was more newsworthy than any of her son’s muted political reactions on air. Of course, when Anderson Cooper came out as gay on air, that too was deemed to be newsworthy, in spite of the journalistic standard that the sexual orientation (or race, gender, or political ideology) of a conveyer of news should not have an impact on the presentation of the news.

When a journalist becomes the story, especially in expressing a personal opinion, news itself (and journalism) becomes obfuscated, diluted, and even toxic from the standpoint of the role of an electorate in a democracy. The societal justification in giving journalists an outsized mouthpiece in public discourse is predicated on their function in conveying the news. This does not extend to molding public opinion and being the news themselves. In a culture in which reality-shows spawn celebrities, perhaps it is only natural that anyone on television could be made into one even for displaying muted visible reactions.

Wednesday, June 14, 2023

Starbucks: A Racist Company Against Racism

In June 2023, Starbucks faced a unanimous jury verdict in favor of a regional manager whom Starbucks’ upper management had fired because she had resisted the company’s racist policy of punishing innocent Caucasian managers for the sake of good public relations. The CEO felt that such punishment was needed and appropriate after a store manager had legitimately called the police on two Black people in a Starbucks restaurant who presumed the right not only to sit in the restaurant without ordering anything (before Starbucks allowed this), but also to ignore the authority of the store’s manager. Starbucks caved to the unjust negative publicity, thus showing a lack of leadership, and went on to act unethically in wanting to show the world that the company could go after Caucasian employees. This racism is ironic, for several years earlier, Starbucks’ CEO had ordered employees at the store level to discuss racism with customers. Interestingly, the anti-racist ideology being preached was partial, and thus contained a blind spot wherein racism such as the company’s upper management would exhibit is acceptable.

As the CEO of Starbucks, Howard Schultz had employees promote his political ideology on two social issues: gay marriage and race. Regarding the latter, he ordered employees, whom he artfully called partners, to write race messages on cups so customers would unknowingly enable employees to impart Schultz’s position on the issue by raising the topic. I assume that the employees could not begin such conversations themselves. I have argued elsewhere that Schultz’s use of the employees for such a purpose was not only extrinsic to making coffee as per the employees’ job descriptions, but also unethical.[1] In terms of corporate governance alone, the shareholders, as the owners of the company, should have decided whether to have their company used to promote partisan positions on social issues. In 2023, Target and Budweiser would learn of the perils of straying from the knitting to get political on social issues. In terms of jurisprudence, the “right” of a company, a legal entity, to free speech is dubious, as abstract entities, even if legally recognized as such, are not human beings. Rather, the “free speech” claimed by companies is really that of the human beings who work for them. Using an abstract entity that itself cannot speak to gain additional publicity for one’s ideological views is unfair because the amplified speakers have not been elevated on any democratic basis. In short, why should Howard Schultz have had access to a megaphone and employees to propagate his political ideology on social issues, when you and I have no such means of self-amplification? Whether we agree or disagree with the former CEO’s political ideology on race is not relevant to my point. To be sure, that his employees were told to speak against racism is in my opinion much better than had they been told to advocate racism against Black people. That Starbucks would then engage in racism is that much harder to understand, but perhaps the hypocrisy reflects a hidden negative aspect of Schultz’s ideology on race. American society could benefit from having that aspect uncovered; such a benefit vastly outweighs any benefit to business. Even in a pro-business culture, a lower good should not be put over a higher one. Augustine refers to this error as disordered concupiscence.

In June 2023, a jury in New Jersey “found in favor of former Starbucks regional director Shannon Phillips, who sued the company for wrongfully firing her, claiming she was terminated for being White.”[2] The company’s position was that Phillips’ boss fired her because she had displayed weak leadership. The use of such vague jargon as leadership for what is actually management is itself problematic. Even if Phillips had “appeared overwhelmed and lacked awareness of how critical the situation had become,” as her boss presumably had written, that does not constitute weak leadership, for she was not in a leadership role;[3] instead, the company’s CEO should have gotten out in front of the issue and provided a vision for the company.[4] If Schultz was the CEO at the time, the failure of his leadership would be especially telling, considering his earlier foray into politics using the company to promote his ideology.

The triggering incident that, according to her boss, had overwhelmed Phillips involved two Black men who had refused to leave a Starbucks store in 2018 even though they would not purchase anything. (That boss, whom the CEO at the time must agree in retrospect failed as a supervisor, presumably was not fired.) The two men were thus not customers, and the incident occurred before the company allowed non-purchasers to be in the stores. That the two Black men refused to leave the company’s private property means they were trespassing, so the store manager was on solid legal ground in having the local police remove the men from the store. Being Black, even if that race has been (and is) subject to racism generally, does not give a person the right to trespass on private property, and efforts to remove trespassers are not racist, for anyone trespassing would be legally subject to removal from the property.

I contend that Howard Schultz’s notion of racial reconciliation suffers from the weakness of being blind to the racial presumption displayed by the two Black men. In having employees talk to customers about the need not to be racist, Schultz was assuming that racism is something that non-Blacks do to Blacks. Employees were not told to suggest to Black customers that being Black does not give them special exemptions from the law or in society. Schultz could have had employees suggest to Black customers that jay-walking between intersections on a major street even when cars are coming is not “a Black thing” that is justified because the race in general has been subject to discrimination. Furthermore, the use of the word nigga cannot be permitted only when the speaker is Black, for that would be a racist position. For a Black person who uses the word to become hostile or even aggressive toward an Indian, Asian, or Caucasian who also uses the word is itself racist (and of course the hostility is unjustified unless the related word nigger is used in a hostile manner). The U.S. Constitution does not indicate that free speech depends on or is limited by race; such a clause would be prima facie racist.

Phillips’ complaint, which the jury accepted unanimously, states that following the arrest of the two Black men, Starbucks “took steps to punish White employees who had not been involved in the arrests, but who worked in and around the city of Philadelphia, in an effort to convince the community that it had properly responded to the incident.”[5] Phillips was ordered “to place a White employee on administrative leave as part of these efforts, due to alleged discriminatory conduct which Phillips said she knew was inaccurate. After Phillips tried to defend the employee, the company let her go.”[6] It does not sound like Phillips was overwhelmed; in fact, she was being pro-active and ethical in defending an employee from an unjust punishment. The implication is that the person who fired Phillips acted unethically.

Moreover, in being willing to sacrifice Caucasian employees based on their race for good public relations, the company’s upper managers were being racist. An unseen implication is that those managers believed that the public reaction against the company for having the two Black men removed from the store in Philadelphia had some validity—that Black people should not be treated like that or that Black people deserve special treatment due to their race. But such a belief is itself racist. Schultz’s talking points for his employees to discuss with customers on race did not include mention of the racism in such beliefs. Moreover, he did not have the company’s employees talk about racism by Black people stemming from resentment. Any ideology is partial, rather than whole, and even a claim of being against racism can fall short. In going after Caucasian employees, including Phillips, Starbucks’ upper managers fell short; the failure of leadership ultimately belongs to the CEO at the time. Howard Schultz had been serving as interim CEO as recently as a few months before the trial.

Saturday, June 10, 2023

Gay Pride and Evangelical Christianity

Taylor Swift, an American singer and cultural icon in 2023, spoke “out against anti-queer legislation” during a concert in early June. “We can’t talk about Pride Month without talking about pain. There have been so many harmful pieces of legislation that have put [gay people] at risk. It’s painful for everyone. Every ally. Every loved one . . . ,” she said.[1] So much hurt. This motivated me to volunteer to carry a full-size gay flag in a gay Pride parade to the end of the route even though I am not gay. When I arrived in the morning, I thought the issue was political; by the time the parade began, religion had trumped the political. A small but vocal group of evangelical Christians and a larger group of young women wearing and carrying gay flags (in part to hide the Christians from view) were shouting at each other in an utter futility of noise. What if people used religion to dissolve religious and political anger, and even tension, instead of stoking them? Both sides missed an opportunity.

The full essay is at "Gay Pride and Evangelical Christianity."


[1] Shruti Rajkumar, “Taylor Swift Breaks Silence and Condemns Anti-LGBTQ Bills During Eras Tour,” The Huffington Post, June 3, 2023.

Thursday, May 18, 2023

Thanksgiving as a Day of Mourning: On the Instinctual Urge of Resentment

According to CNN’s website, the “sobering truth about the harvest feast that inspired Thanksgiving” is the fact that colonists killed Indians. According to an analyst at CNN, the American Indian Day of Mourning, established in 1970 for the fourth Thursday of November, turned Thanksgiving “into something more honest” than the Thanksgiving mythos of a peaceful feast in 1621 suggests.[1] The drenching of self-serving ideology in CNN’s “analysis,” like heavy, overflowing gravy obscuring the sight and taste of the underlying mashed potatoes, is something less than honest.

Historically, the feast in Plymouth Colony in 1621, exactly four hundred years before Thanksgiving in 2021, when I am writing this essay, was attended not only by the Pilgrims, but also by the Wampanoag Indians. The two peoples were then in an alliance. CNN’s Tensley attempts to derail the value of the cross-cultural feast by pointing out that initially, “the pious newcomers didn’t even invite the Wampanoags to the revelry.”[2] The value in the fact that the Indians reveled with the Pilgrims in feasting is not nullified by the fact that the Pilgrims had changed their minds on sending out an invitation. Also, that the invitation served strategic interests in strengthening the alliance is no vice, for the alliance was based on ensuring survival in a changing world.

Moreover, the Day of Mourning is itself partisan in that it tells only a partial truth—namely, that Pilgrims killed Indians in the colonial era of North America. Left unsaid is the equally valid point that Indians killed Pilgrims. That smallpox also led to the deaths of Indians was no fault of the Pilgrims, contrary to Tensley’s ideological resentment. Furthermore, that killing took place between Indian tribes and English colonies generally does not nullify the good that is in a shared feast, even among allies. Mourning the loss of American Indians generally as a replacement for Thanksgiving obscures that good and in fact implies that the killings in a broader war nullify the good that is even in an eventual invitation. It surely must not have been easy for either the Indians or the Pilgrims to sit down together for a feast, given the more general prejudice then existing between the Indian tribes and the English colonies.

Of course, the point of the national Thanksgiving holiday established by President Lincoln in the nineteenth century—namely, to give thanks to God—is of value in itself rather than being nullifiable by the prior conflict between the English colonies and Indian tribes in North America.

The 2021 Thanksgiving Day parade in New York City. (Source: CNN) Surely such happiness does not deserve to be sullied by sordid resentment as a will to power.

In short, the imposition of a day of mourning over Thanksgiving really misses the point. In actuality, the imposition is, in Nietzschean terms, an instinctual urge fueled by ressentiment. Nietzsche claims that more pleasure from power can be had by mastering such an intractable urge rather than letting it run wild. Such internal overcoming is the signature of strength—more so even than conquering enemies on a battlefield. Unfortunately, the weak are so oriented to their external enemies that even truth can suffer and giving thanks to God can be overlooked entirely. The weak are children of a lesser god, a god of projected ressentiment, which discredits the very notion of a benevolent deity. Like light from a distant star, Nietzsche writes, the news of the discrediting murder of the conception of a vengeful god of perfect goodness has not yet reached the murderers, whose hands are drenched with blood. To claim, “Out, out, damned spot!” yet not know the source of the blood must surely be a worrisome interim condition drenched with anxiety, which in turn can fuel an instinctual urge based in ressentiment of the strong—people willing and able to control even their most intractable instincts.
[1] Brandon Tensley, “National Day of Mourning Turns Thanksgiving into Something More Honest,” CNN.com, November 25, 2021 (accessed same day). CNN labels Tensley’s role as that of an analyst rather than an opinion-writer—the false attribution of fact belies the salience of the writer’s opinion-informed ideology.
[2] Ibid. Tensley’s own hostile ideological resentment is evident in his labeling of the Pilgrims as pious—here connoting a presumption of superiority. The Indians no doubt also felt superior, as it is only natural for any people, including the Pilgrims and Indians, to favor one’s own culture over others if one’s ideology (i.e., values and beliefs) is in sync with one’s culture.


Undermining Progress: Power Enforcing Infallible Ignorance

Bleeding to heal. The Earth is flat. Earth is at the center of the solar system. Zeus lives on Mount Olympus. The divine right of kings to act even as tyrants (e.g., Henry VIII of England). Hitler died in his bunker. Turning the heater on in a local bus kills coronavirus. These are things that were thought in their respective times to be incontrovertibly true. In some of these cases, the power of the establishment was not subtle in enforcing them even when they should have been questioned. How presumptuous this finite, mortal species is! If ignorance on stilts is bliss, then why is it so in need of power? Subconsciously, the human mind must realize that its assumption of not being able to be wrong is flawed. We are subjective beings with instinctual urges—one of which manifests in the unquestioned assumption that what we know cannot be wrong, and furthermore that we are entitled to impose our “facts” on others. As the homo sapiens (i.e., wise) species, we are too sure, and too proud, concerning our knowledge and especially our beliefs. We would like to have the certainty and objectivity that computers have, but we are subjective biological animals, not inert machines.

How much do we actually know? David Hume claimed that we do not really understand causation; we don’t get close enough to it to understand how one thing causes another thing. Worse still, we often take a positive correlation—that one thing is related to another (e.g., rain and seeing umbrellas)—as meaning that the one thing causes the other. Rain does not cause umbrellas; nor do umbrellas cause rain. Descartes was of a rare breed in that he was willing to critique his entire edifice of knowledge. With an open plain filled with the debris in front of him, he wrote that he could only be sure that he was thinking and therefore that he was existing. Cogito, ergo sum. I think, therefore I am. That he went on to reconstruct the very same edifice may suggest that he was still too taken with his previous knowledge. At the very least, his rebuilt edifice cannot be reckoned as progress.
Generally speaking, pride/ego plus knowledge is a retardant to progress and a sycophant to the status quo. New ideas must break the glass in order to breathe and circulate, even to reach people’s consciousness. Well-established beliefs clutch at us even in the face of strong arguments and empirical evidence to the contrary.

Hikes and stake-outs on Mount Olympus could have demonstrated to the ancient Greeks that immortal giants did not live there. The Greeks who scaled the peak tended to say that they felt the gods there—that the gods were invisible, as if they were merely spirit. Such contorting and even pruning when necessary is not uncommon in cases in which religion over-reaches; the core of the religious belief itself must endure even in the face of contravening empirical evidence. Sadly, not much progress has been made on the mind-game in the domain of religion; the human mind itself may be susceptible, with denial protecting the mind from recognizing its own susceptibility.

By the time that the ancient Greek religion became extinct, people were willing to conclude that no such gods existed (or had existed), and the belief that they lived on Olympus was simply wrong. Few if any people, however, were then able to consider that their own living religion could be wrong too. It’s the other guy who is wrong; this time, the deity really does exist. The firmness with which this belief is held, as if it were knowledge, is a sign of excessive defensiveness, and thus of unconscious doubt. Perhaps the unconscious is more honest with itself than consciousness is with us.
How many Christians consider that perhaps people could be wrong that Jesus literally rose from the dead (i.e., historically, as an empirical, historical fact)? How many Jews consider that historical evidence is lacking to support the belief that Moses was a historical person?  Josephus, an ancient Jewish historian who lived in the first century, wrote Antiquities, which refers to a man named Jesus (albeit with probable later Christian parenthetical additions that a Jewish historian would not have accepted). To go from a man named Jesus to Jesus Christ involves a religious claim/belief that Jesus is divine. We have left the territory of historical accounts, which are in the past tense, to make use of faith narratives, which, as myths, can be in the present tense. For example, myths such as the Christian Passion story can be reenacted in ritual each year as if Jesus’ passion is once again to be felt. The religious experience is presently experienced, having been triggered by myth (religious story) and ritual (couched in drama).

In short, in looking back at the ancient Greek religion, we dub the stories of Zeus and the other gods as myth. Yet we instinctively resist even the possibility that the ongoing religions could include myth, for myth and historical writing are two different genres, and we clutch at the added certainty that can be provided by historical accounts. Why is additional certainty believed to be so important? Religionists don’t want to even consider that their particular religious beliefs could be wrong or over-stretched. To be sure, a myth-writer (or orator) may reference historical events, but his point is not to convey the veracity of them. Rather, historical events may be used (and adapted) to make religious points. For example, the Gospels differ on when the Last Supper occurred relative to Passover because the writers wanted to make different religious points. None of the writers of the faith narratives would have subordinated religious points to historical accuracy. Therefore, the added certainty is a mirage. Rather than essentially reclassifying religious belief as knowledge (empirical or through reasoning), matching religious belief with its own kind of confidence would be more in keeping with the domain, and thus with human experience therein.

Unfortunately, religion does not rest with the exogenous certainty; the inhabitants of the domain not only try to conquer (and thus control) each other, but treat other domains as fair game too. Run through the circuits of a human brain, religion tends to be infused with pride such that the religious domain may have a propensity to encroach onto other domains, even assuming the prerogative to dominate them. How uncouth! Hence Christianity got into trouble when it tried to control science and claim history for itself. The assumption that religion should constrain scientific knowledge not only conflated two different categories, or domains, but also was ignorantly taken as infallibly true. Furthermore, a faith-belief could be taken as a historical fact, which in turn could be used to justify the belief. Such a closed, self-reinforcing cognitive loop is not easily broken open even to the scalpel of an inquisitive, self-questioning mind. How rare such minds have been and are, even in the midst of robust technological progress and greater knowledge available to mankind.
Christianity also got into trouble with itself, without realizing it, when it over-reached onto the military domain, which is not at all friendly to loving thy enemy. When the Roman Catholic popes became partisans in geo-political rivalries in Europe, the Church became closed in effect to its rivals and thus short-circuited its own mission—that is, the mission in the religious domain to save souls by leading people to Christ. We can count as progress the success of other domains in pushing religion back within the confines of its own turf. To presume to know the native flora of another land better than the native plants on one’s own land, and then to presume to weed that land without sufficiently weeding one’s own, is like arrogance on stilts; the toxic attitude of superiority should be underwater. Thus the high are made low, at least in theory.

In surveying world religions, I see progress at the point when the extant religions (with the exception of Satanism) came to no longer believe that human sacrifice appeases deities. When Judaism and Christianity had gained enough traction in ancient Greco-Roman culture that religion itself was no longer just a matter of ritual, but also had moral content (e.g., the Ten Commandments, the Beatitudes), religion itself may have progressed. Why am I not more definite? Friedrich Nietzsche, a nineteenth-century European philosopher, argues that modern morality is borne of weakness and foisted on the strong to make the latter voluntarily renounce acting on their strength. Meanwhile, the ascetic priests, who are weak (literally, in being celibate), are free to unleash their urge to dominate by controlling their respective herds and in confronting the strong with, “Thou Shalt Not!” Even our surest knowledge of progress can afford to be questioned.

Unfortunately, once the Greco-Roman religion that was merely ritual to appease the gods and included human sacrifice was extinct, continued progress has faced a strong headwind from the still-extant religions that were created roughly in the “second generation” (1800 BCE-650 CE). Even though the ancient cultures within which those religions formed are, by the twenty-first century, oceans of time from modern-day cultures, religious strictures grounded in the formative cultures die hard, if at all. These strictures are sustained at least in part out of a fear that beginning the project of separating the divine from (human) culture would lead to anything goes (i.e., cafeteria-style religion). What if the divine in revelation is itself culture reflected on high? Change itself faces an uphill battle even though the sheer difference between modern and ancient cultures suggests that changes are necessary in order that moderns not be held captive by the arbitrary limitations of long-ago cultures. This is particularly true in religious moralities. That Paul thought that women should not preach in Christian churches is not sufficient for churches today to be obligated to treat Paul’s opinion in his letters as if it were divine revelation. Even that Jesus’ disciples in the Gospels are men does not mean that Jesus sought to limit his disciples to men. Writings on Mary Magdalene discovered in the twentieth century support this point. Put another way, even mere opinions in ancient letters are held so firmly that human opinion is essentially divinized. As a writer, I am well aware that mistakes are made in writings. Correcting for those errors, such as the Christian overlay on Josephus’ historical account of Jesus, has largely been inoperative when the human mind entertains religious belief (i.e., dogma).

My point is that the self-retarding mechanisms of the human mind can slow down progress and enclose us in ignorance that cannot be wrong. We tend to overrate both the freedom of progress from human nature and the knowledge and beliefs we have, both individually and as a species. This is not to deny the existence of progress through history. Gladiators killing each other in stadiums has been replaced by football players (in both sports) fighting for a ball. A general increase in the value of human life has occurred in enough societies to suggest an upward trend for wayward dictators to measure themselves against. Nietzsche aside, moral progress has also occurred, again in enough societies to demonstrate an upward trend. The incredible technological advances in the twentieth century can also be taken as progress because they have expanded human potential. For one thing, people could write beyond daylight, electric lights being brighter than candles. Just think how long candles were relied on; then all of a sudden, at the turn of a switch, the initially-feared new light was on and could spread. The danger, it seems to me, lies in the assumption that the biological fixity of our species becomes less of a hindrance as technology becomes even more advanced.

The coronavirus pandemic in 2020 hit the species in spite of our technological advances, even in the field of medicine. Boris Johnson, the prime minister of the U.K. at the time, initially swore off precautions. The fact that he held high office did not prevent him from having to go into intensive care at a hospital. As far as a virus is concerned, we are not apart from Nature; rather, we are biological. Our minds, being corruptible in terms of knowledge and judgment, can limit what technology can do to stave off a pandemic.

For example, according to a local bus driver in Phoenix, Arizona, the bus company’s management was urging drivers to turn on the heat when the temperature outside was not prohibitive and to close the windows (hence trapping the airborne virus) because “the heat kills the virus.” The closed windows meant that plenty of airborne virus could be expected to be trapped in the buses. Perhaps the treatment of bleeding would have healed the brain-sickness of the managers. Unfortunately, they were able to use their authority to enforce their ignorance that could not be wrong. So could grocery-store managers there—in a state in which public education is ranked 49th out of the 50 States—who did not even notice that their own employees were not keeping a physical distance from each other and customers (who behaved as herd-animals incapable of altering a well-grooved habit even to protect themselves!). The improved knowledge available from medical experts didn’t matter. In fact, by the month of April, most customers and employees of grocery stores in Phoenix were wearing the surgical masks that the virus can easily pass through; such masks were meant to be used by the infected so that they would not spit on, and thus infect, the healthy. Of what value is progress in knowledge if a major metropolitan area in a developed country acts regardless of it? A meat manager at one grocery store there told me that one guy touched a number of meat packages after having gorged on some chocolate. The customer rebuffed the manager, saying, “My fingers going from my mouth to the packages won’t get anyone sick.” An uneducated opinion was presumptuously dismissing science. In this way and many others, the benefits of progress in human knowledge are held back by human nature—specifically, by ignorance that cannot be wrong, and that even presumes to trump knowledge.

It is ironic that progress has been extolled even in times held back by the status quo. “We are in an age of greater transparency,” a person interviewed by the BBC said just after the British government tried to have it that the prime minister, Boris Johnson, had been hospitalized merely for tests because he had symptoms. The lightness of this announcement is belied by the fact that he went to a hospital during the Queen’s speech. He surely would not have wanted to take away from the speech, and yet he was going in for tests, so why did he not wait until after the speech? Why the urgency if he was going in only for tests? The implication that his hospitalization was not urgent was undone the next day by press reports that he was then in intensive care. So much for transparency, at least from the government. The primitive instinct for security surreptitiously stepped back from, and thus nullified at least in part, the contribution that technology had made to transparency in the press on government affairs.

Similarly, even though a French agent reported to the French intelligence service that he had recently seen Adolf Hitler and his wife attending an opera during one of its three performances in South America after World War II, the world, including the U.S. Government, stuck publicly with the Soviets’ story that the couple had died and then been burned in Hitler’s bunker in April, 1945. Even after the Soviets tested the couple’s DNA and found that both people were women, the world and its governments continued with the story that Hitler and his wife had died in the bunker. That Hitler might have lived the rest of his life in South America, even conniving with his expert on dropping a nuclear bomb on New York City, apparently triggered the security instinct such that the progress in intelligence-gathering and analysis was for naught. The tyranny of the status quo against progress is subtle, yet more enduring than the rule of a tyrannical ruler.

Why was it insisted historically in Europe that the Earth is flat even without any evidence? The “scientific fact” was even defended by threats of death, but then it was more a matter of religious belief masquerading as fact. Why is the human mind so hesitant to say, “It’s a theory, but we really don’t know”? The pride of a mind and the fear of uncertainty are human qualities rooted in the instinct of self-preservation. Pride is thought to beget power, which aids self-preservation. So too does having greater certainty about one’s environment. Such bloated pride can motivate a Christian king to become convinced that the divine right of kings justifies even tyranny that is hardly in line with Jesus’ teachings. Even Christian clerics intoxicated with their temporal power may suppose that burning a scientist for claiming that the Earth orbits the Sun, rather than vice versa, is in line with loving one’s enemies. Being more in love with temporal power than with Jesus’ preachments is yet another example of the religious costs of trying to dominate another domain.

Whether in religion, politics, or higher education, does cognitive difference really make someone an enemy, or is the human brain prone to overstepping, emotionally speaking, in applying emotion to cognitive differences? We humans are overwhelmingly utterly unaware of the games our minds play on us. We assume that we are in control of what we think, and that we use reason impeccably. Nietzsche claimed that the content of ideas is instinctual urges, and thus reasoning is a subjective tussle within loose strictures that may themselves be urges. How much do we really know even about ourselves? Yet we would not tolerate someone saying that what we are absolutely sure we know may yet be incorrect. We are so sure that we grasp for authority to enforce what we know on others who resist. Hence, if we were to go back in time and refuse to be bled, a physician may dismiss our claim that bleeding actually weakens rather than cures a person and use his authority as a physician to subject us to the treatment. The weak—in this case, the ignorant with power—think nothing of dominating the strong; in fact, the resentful enjoy it.