"(T)o say that the individual is culturally constituted has become a truism. . . . We assume, almost without question, that a self belongs to a specific cultural world much as it speaks a native language." James Clifford

Wednesday, December 6, 2023

Time Magazine’s Person of the Year: Taylor Swift

Time magazine named the singer Taylor Swift as its person of the year for 2023. Her stadium-filling concerts that summer were such a force of nature that they triggered economic booms in the host cities. In Pittsburgh, Pennsylvania, for example, downtown hotel rooms went for as much as $2,500 on the night of the concert. In terms of American culture, the analogy of gravity waves may fit. During a television interview at her home (or one of her homes), Swift’s business acumen was very evident; her marketing prowess was extraordinary. She even re-released her own songs, resulting in a huge financial windfall for what are really the same songs merely re-sung. It is not as if she had grown a new voice. Swift personifies American culture, whose “movers and shakers” seem “happy-go-lucky” on stage yet, behind the scenes, tend to be laser-focused on the business end. In short, considerable distance may exist between the societal image and the private business practitioner, and the ethical element can get lost in the shuffle and excitement.

To be sure, economics was evident in the “Swiftie” phenomenon during the summer of 2023. According to Time, Swift “achieved a kind of nuclear fusion: shooting art and commerce together to release an energy of historic force.”[1] Her Eras concert tour “brought in a whopping $1.04 billion with 4.35 million tickets sold across 60 tour dates.”[2] Not just any singer can make such a haul, trigger municipal economic booms, and saturate the media’s attention worldwide simply by going on tour. The magazine is also clear that bringing in such a gargantuan amount of money is not “something we often chalk up to the alignments of planets and fates,” for “giving too much credit to the stars ignores [Swift’s] skill and her power.”[3] In particular, she focused intensely and persistently on every conceivable way to increase revenue: re-recording existing songs and bundling them (admittedly with some songs from her vault) into albums in their own right, attending to merchandise, and actively using the media for free publicity. In doing so, she leveraged market power that was unrivaled; she dominated the airwaves during the summer of 2023. The “Taylor’s Version” albums provide us with an interesting case study wherein hype, money, and ethics are all in the mix.
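For a sense of the per-ticket and per-date scale behind those cited totals, here is a minimal back-of-the-envelope sketch; the gross, ticket count, and number of dates are the figures quoted above from the AP report, and the averages below are simply derived from them.

# Back-of-the-envelope averages derived from the figures quoted above
# (AP, citing Pollstar): $1.04 billion gross, 4.35 million tickets, 60 dates.
gross_usd = 1.04e9
tickets_sold = 4.35e6
tour_dates = 60

print(f"Average gross per ticket: ${gross_usd / tickets_sold:,.0f}")  # about $239
print(f"Average gross per date:   ${gross_usd / tour_dates:,.0f}")    # about $17.3 million
print(f"Average tickets per date: {tickets_sold / tour_dates:,.0f}")  # about 72,500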

According to Time, “Swift began releasing re-recordings of her back catalog in 2021 in an effort to reclaim her original music, after her initial label Big Machine Records sold her masters to Scooter Braun’s Ithaca Holdings in 2019. ‘Now Scooter has stripped me of my life’s work, that I wasn’t given an opportunity to buy,’ Swift wrote. . . . ‘Essentially, my musical legacy is about to lie in the hands of someone who tried to dismantle it.’”[4] I don’t doubt the authenticity of her emotive motivation here. In the vernacular, she was pissed. Even so, if she had signed a contract with Big Machine Records giving it the unilateral right to sell the masters of her songs, and the purchaser had the legal right of use, then she had no legal or ethical claim to preempt the sale or be sold the masters outright. Of course, if labels write heavily one-sided contracts that essentially reflect the commercial interests of the labels, taking advantage of new signers’ lack of bargaining power, ethical critique is fair game.

By its very nature, a contract is a coming together of (at least) two interests, with consideration (money) given by one party to the other. A residential lease, for instance, should reflect both interests. It should not narrow the use of the premises to reflect only how the property owner would use the space or would like the space to be used. A property owner might prefer a “no guest” policy, but such a “policy”—the very word being presumptuous—violates reasonable use of the premises. Furthermore, the property owner’s personal religious or moral lifestyle, for instance, should not bind the counterparty as long as the property itself is not damaged. “I don’t believe in eating meat, so you are not allowed to use the kitchen of your apartment to cook meat,” for instance, is presumptuous and dogmatic. More to the point, such a clause would nullify the fact that in receiving rent, the property owner is selling the use of the space (as long as the property is not damaged). The mantra, “It’s my house,” taken as an absolute, is circumscribed when use is being sold for consideration (i.e., rent). Having it both ways is selfish and childish.

Whether or not Taylor Swift originally signed a one-sided contract is beyond my ability to investigate, given the information that I have. Her fans did not know either, and so, because of her emotional claim and her “star power,” her ethical cause resonated. Even so, it can be asked whether it is ethical to have hyped the “Taylor’s Version” albums to the extent that buyers were willing to pay the full price of an album even if they already had most of the songs. To be sure, the “Taylor’s Version” albums included “vault tracks”—songs not on the original albums. She also updated some lyrics. Even so, it can be asked whether the additional work justifies the full price of a new album. It can also be asked whether customers with receipts for the original albums, such as Fearless, should have been able to buy Taylor’s Version at a discount. I submit that such a discount would be reasonable, given both the amount of additional work on Taylor’s part and the substance of the product (i.e., the extent to which it differs from the originals). A few songs from the vault and some new lyrics do not render the albums commensurate with albums filled with previously unreleased songs.

If Swift’s motivation was indeed to gain control of her songs, she should have agreed to a discount. Fearless (Taylor’s Version) had the biggest debut of any album in 2021, with 722.7 million on-demand streams in the U.S. that year.[5] Surely at least some of those customers already possessed the original album. Of course, the irrational exuberance that would cause such a customer to buy the same songs again can also be criticized, but many of her customers were teenagers and thus easily taken in even by orchestrated hype of good feeling seemingly floating above the earthly taint of business strategizing. My point is that it is no accident that Taylor Swift made a lot of money essentially recycling songs ready for re-singing. She was not merely trying to regain control over her work. I submit that she was acting as a businesswoman, and a darn good one at that. Her true identity—her driving financial ambition—was practically hidden under the blinding glitter of the “nuclear fusion” that Time magazine describes. The resulting sonic boom, I contend, was orchestrated to coordinate and maximize both the hype and the revenue. Behind the moral cause, behind the curtains, Swift’s financial acumen could be said to be a subterranean force of nature.

Such a force tends to be obscured, obfuscated, or, more often, intentionally hidden in the American entertainment industry. Similarly, elected representatives in Congress or the White House keep both their foul tongues and their raw desire for power far away from the reach of microphones and cameras. In short, the sheer difference between private personas, including agendas, motivations, and even personalities, and the public images on the societal stage is astounding. Especially in politics in a representative democracy, this differential is a real problem that goes beyond the financial harm to young “Swifties” who have been subtly manipulated into buying (mostly recycled) songs at full price.


1. Jordan Valinsky, “Taylor Swift Named Time’s ‘Person of the Year,’” CNN.com, December 6, 2023.
2. Maria Sherman, “Taylor Swift’s Eras Tour Is the First Tour to Gross Over $1 Billion, Pollstar Says,” APNews.com, December 8, 2023.


Saturday, September 30, 2023

Exposing Yale’s Sordid Side: “The Inner Ring” by C. S. Lewis

C. S. Lewis aptly describes in one published lecture the nature of a very human game, which is really about how soft power, often buttressed by institutional position, works in any human organization. To use Nietzsche’s expression (which Lewis would hardly have appreciated), the dynamics of an inner ring are human, all too human, and thus can hardly be extracted from the human condition. Yet the phenomenon is much more salient, and arguably even dysfunctional, in some organizations than in others, especially those with an elite reputation such as Yale, whose essence, we shall investigate here, might be exclusion even within the university community, such that some vulnerable members are told they are not really members (but that their donations are welcome).

In my essay, “Yale’s Original Sin,” I describe Yale’s culture of inner-exclusion operating within the university, wherein some insiders are relegated by inner-insiders to the status of outsiders. During my stay as an alumnus doing research for a book I was writing, I was astonished to read emails from non-academic employees in which they bluntly stated that I was not a “member of the Yale community” because I was an alum. Unfortunately, and quite tellingly, those explicit statements were just the tip of the iceberg. Much more common, in more senses than one ironically, were the intentionally subtle hints given by some faculty, faculty-administrators, and even non-academic employees that I was not worth their time, whether in replying to an email message or in conversation. This extended to a faculty culture averse to allowing alumni (and other scholars, as a courtesy) to audit courses, and to clerical employees not recognizing alumni in residence for a term as members of the Yale community. This self-serving, arrogant, and deeply mistaken attitude extended, counter-productively, to charging alumni in residence $4 more than students, faculty, and the non-academic employees themselves for lunch at the university lunch hall known as Commons. A common mentality, to be sure.

In his lecture entitled “The Inner Ring,” C. S. Lewis describes the ubiquitous phenomenon that he calls the inner ring of an organization. “I can assure you,” he tells his audience, “that in whatever hospital, inn of court, diocese, school, business, or college you arrive . . . , you will find the Rings—what Tolstoy calls the second or unwritten systems.” In War and Peace, Tolstoy alludes to such an informal yet firmly hierarchical or concentric system: “(S)ide by side with the system of discipline and subordination which were laid down in the Army Regulations, there existed a different and more real system—the system which compelled a tightly laced general with a purple face to wait respectfully for his turn while a mere captain like Prince Andrey chatted with a mere second lieutenant like Boris.” The general was not royalty, and so he deferred to the prince even though the latter was of a lower rank. The general was thus an outsider in the immediate context of the prince’s conversation even though he was very much an insider among the military brass.

We mere humans detest being relegated to the status of outsiders; we very much want to be insiders. This is C. S. Lewis’ main point. “My main purpose in this address is simply to convince you that this desire is one of the great permanent mainsprings of human action.” Specifically, he means here “the desire to be inside the local Ring and the terror of being left outside.” This desire and fear can be distinguished from the desire for personal gain and the fear of going homeless out of financial ruin. “And you will be drawn in, if you are drawn in, not by desire for gain or ease, but simply because at that moment, when the cup was so near your lips, you cannot bear to be thrust back again into the cold outer world. It would be so terrible to see the other man’s face—that genial, confidential, delightfully sophisticated face—turn suddenly cold and contemptuous, to know that you had been tried for the Inner Ring and rejected.” In other words, wanting to feel oneself an insider and to avoid feeling like an outsider are desires that do not necessarily line up with, or reduce to, the desire for political or economic gain.

As with any desire, the desire to be an insider cannot be permanently satiated, even once one is inside. C. S. Lewis wrote, “As long as you are governed by that desire you will never get what you want. You are trying to peel an onion: if you succeed there will be nothing left. . . . Once the first novelty is worn off, the members of this circle will be no more interesting than your old friends.” Or perhaps a ring within that ring will emerge, and you will have a new impediment to feeling like an insider. Even if that is overcome, you would still suffer from the fear that you could become an outsider, for the grounds for relegating you are informal in this secondary system and thus secretive and hardly subject to the moral principle of fairness. C. S. Lewis goes so far as to declare, “Until you conquer the fear of being an outsider, an outsider you will remain.”

The combination of secrecy and informality in an inner ring, or circle, renders unfairness from personal likes and dislikes especially likely. An official hierarchy, in contrast, ideally operates on the basis of merit, with avenues for appeals. Less ideally, money in the form of bribes and political power can come into play in formal hierarchies. So too can friendships. But these more informal means of promotion and demotion are more the currency of inclusion and exclusion in informal hierarchies, such as C. S. Lewis describes. To be rejected for lack of merit is, I submit, easier to take than to be rejected by unfair means or for unfair reasons. The latter is more likely when the decision-makers are hidden from view and thus appeals to them cannot be made. This passage from C. S. Lewis describes the subtle mechanics of an inner ring very well:

“You are never formally and explicitly admitted by anyone. You discover gradually, in almost indefinable ways, that it exists and that you are outside it; and then later, perhaps, that you are inside it. . . . It is not easy, even at a given moment, to say who is inside and who is outside. Some people are obviously in and some are obviously out, but there are always several on the borderline. . . . There are no formal admissions or expulsions. People think they are in it after they have in fact been pushed out of it, or before they have been allowed in: this provides great amusement for those who are really inside. It has no fixed name.”

The subtle messages in the rude behavior from Yale faculty, academic administrators, non-academic employees, and even some students that I describe above and in “Yale’s Original Sin” are the means by which ill-favored Yalies gradually discover that they have already been rendered outsiders. That the realization of having been excluded can occur gradually opens the outsider up to embarrassment, for the insiders relish the spectacle, as if watching a blindfolded person stumble over furniture. The behavior could be regarded superficially as mere rudeness, so it can be difficult for someone on the receiving end to detect that he or she is being handed a hat on the way out. A person may just stand there, holding the hat, wondering why anyone felt the need to deliver it, even though the sender intended the message: you are no longer welcome here, but I can’t kick you out of the building. This is precisely the message that people in Yale’s inner rings (and there is more than one) want to send. Bottom line: such people refuse to tolerate even the very presence of a person they don’t like. This includes a person who holds a contrary opinion. The motive, in other words, goes beyond wanting to make sophomoric statements of superiority; the intention is also to convey to others that they are outsiders. Whereas gorillas establish superiority and push certain individuals out by physical means, our species is not so forthright and honest (or brave).

In the movie Contact (1997), Haddon, a millionaire, says to Ellie, a young astrophysicist who wants to be chosen to go on a space mission, “The powers that be have been very busy lately, falling over themselves to position themselves for the game of the millennium. Maybe I can help deal you back in.” By this he is referring to being dealt cards in a card game. Ellie takes the hint and replies, “I didn’t realize that I was out,” to which Haddon says, “Maybe not out, but certainly being handed your hat.”  Ellie has no idea that even her boss, in jockeying for position to be chosen as the astronaut, has been working to see to it that Ellie is eliminated from consideration by the inner ring of which the boss is an insider but Ellie is not. She has no access to that circle of power-elites, so she doesn’t even know that she needs to promote or defend herself, or even appeal. From outside the inner ring, its workings are shrouded with mystery, for outsiders are not privy to the phone calls and other conversations that take place within. As C.S. Lewis wrote, “There are no formal admissions or expulsions. People think they are in it after they have in fact been pushed out of it, or before they have been allowed in: this provides great amusement for those who are really inside. It has no fixed name.” Ellie’s boss gets pleasure from dealing her out, especially because this is being done without her knowledge. C. S. Lewis wrote, “It is not easy, even at a given moment, to say who is inside and who is outside. Some people are obviously in and some are obviously out, but there are always several on the borderline.”

Interestingly, C. S. Lewis must have known that the question of whether the phenomenon of the inner ring, even as manifested in a seminary, is evil was being asked. If, as I strongly suspect, exclusionary comments and actions are deliberately done at least in part to emotionally hurt other people, even just out of dislike, the question of whether such insiders are de facto evil is relevant. C. S. Lewis focuses his answer at the level of the ring, but with implications for its inhabitants. “I am not going to say that the existence of Inner Rings is an Evil. It is certainly unavoidable. There must be confidential discussions: and it is not only a bad thing, it is (in itself) a good thing, that personal friendship should grow up between those who work together.” But this is just one side of the coin, or ring. Lewis admits that the “genuine Inner Ring exists for exclusion. There’d be no fun if there were no outsiders. The invisible line would have no meaning unless most people were on the wrong side of it. Exclusion is no accident; it is the essence.” These last two sentences may aptly describe the dark side of Yale. This is not to say that the essence of Yale is exclusively exclusion, for that would imply that no other source of worth, in this case academic, exists in the organization. Even so, exclusion is an excessive, even hypertrophic, instinctual urge in many people at Yale, especially those who work there.

Lewis claims that the anguish in being reckoned as an outsider is a strong human motivating force in wanting to be counted as insiders. But if a group, or its inner ring, is filled with rude, petty elitists, wouldn't a normal person feel some solace and even self-esteem in being an outsider? I suppose whether this is one's own choice or that of the "members" of a ring makes a difference here. Nevertheless, Nietzsche wrote that the healthy should not visit the sick in hospital lest the healthy catch something. In Christianity, Paul warns about hanging out with fools. Depending on the group, a person might very well relish being an outsider, even if not by choice. Some rings have bad odors. 

A question posed by C. S. Lewis seems relevant: “I must not ask whether you have derived actual pleasure from the loneliness and humiliation of the outsiders after you, yourself were in: whether you have talked to fellow members of the Ring in the presence of outsiders simply in order that the outsiders might envy; whether the means whereby, in your days of probation, you propitiated the Inner Ring, were always wholly admirable. I will ask only one question—and it is, of course, a rhetorical question which expects no answer. In the whole of your life as you now remember it, has the desire to be on the right side of that invisible line ever prompted you to any act or word on which, in the cold small hours of a wakeful night, you can look back with satisfaction? If so, your case is more fortunate than most.” From this, I surmise that Lewis reckoned that such people are bad, and even malicious, but not evil, because what he was describing was human nature itself.

In Augustine’s theology, we are all subject to original sin. Proverbially, we are all sons and daughters of Adam and Eve. Evil, it seems to me, cannot simply be human nature itself, but, rather, an extreme in enjoying human suffering. But even this definition is problematic, for sociopathy is a psychological illness rather than a religious phenomenon, and evil is a distinctly religious term. I think the problem is psychological where exclusion is allowed to fill a void and become the essence of an organization. Taken to the extreme, exclusion as the substance or raison d’être of an organization, and thus its very essence, snuffs out other possible substances and so must ultimately collapse. Relatedly, M. Scott Peck writes in People of the Lie that a sense of inner emptiness lies at the core of malignant narcissism. Perhaps that is responsible for the dysfunctional organizational culture of inner-exclusion from within that has plagued Yale.

Thursday, August 17, 2023

Walmart: Encroaching on Employees' Private Lives

In 2023, Walmart relaxed its policy requiring anyone applying for a job at the company to get a drug test, including for marijuana, which at the time was legal in several U.S. member states. Once hired, however, employees were still subject to random testing. An employee in a member state in which the drug is legal could be fired even if the person is never affected by the drug while working. I contend that the practice is unfair, unethical, and an over-reach in terms of the nature of a labor contract. 

The ethical principle of fairness is violated because both marijuana and alcohol can impair the brain, and yet the company tests only for one even where both drugs are legal. An argument can be made that an alcoholic personality is less than suitable, and yet it is using marijuana outside of work (with no impact during work hours) that is treated as reason enough for an employee to be fired. Whereas alcohol can induce hostility and even aggression, marijuana has a calming effect—something that could actually help busy cashiers.

Besides being unfair, the policy of even random tests for marijuana is invasive, beyond the legitimate scope of an employer’s reach—assuming that the employee using marijuana is never “high” at work. In selling one’s labor, an employee does not agree to a company’s management being able to control the employee’s legal activities outside of work if those activities do not affect the employee’s work. Sam Walton, the founder of Walmart, was against marijuana; for him to impose his ideological opposition on others where the drug is legal was over-reaching and impious; he was not a god. An argument can also be made that it is none of the company’s business, literally and figuratively, whether an employee uses the drug where it is illegal, again as long as the employee is not “high” at work. Law enforcement is the job of police, not a company’s managers. Of course, if an employee is convicted of a crime, an employer may decide not to retain that employee. In the case of Walmart, the company hires people who have criminal records, which shows just how nonsensical the policy of random testing for marijuana is (especially as more and more U.S. member states legalize recreational use of the drug). In terms of a contract between an employer and an employee, an employer who presumes to dictate an employee’s recreational activities imposes a cost on employees that is not offset by the monetary compensation.

Imagine what would happen if a labor union informed a company’s management that an abrasive supervisor must be subject to drug and alcohol tests and fired for any positive results, or else the employees would strike. Suppose too that the supervisor does indeed have a problem with alcohol, but is not under its influence while at work. Still the union insists that the company fire that person. Suddenly, the company’s management would object with a mighty roar: How dare employees tell us what we cannot do on our days off! The nerve! Well, it goes both ways, folks. The attitude is the same: the unethical vice of invasiveness (in people’s personal, not work-related, lives) is noxious and may even point to a toxic organizational culture.

See: Walmart: Bad Management as Unethical


Wednesday, August 16, 2023

Getting the Seasons Wrong: Purblind Meteorologists

You may think you know the answer to the question, “When is the autumn season?” But do you?  Watching the weather section of local news on television or the internet, you could be excused for getting the beginning date wrong because it is the meteorologist who has misled you. In itself, getting the exact day right is not a big deal; it is not as if the temperature can be expected to take a nose-dive on the first day of fall. The astonishing thing is that so many meteorologists either knowingly or out of ignorance present the astronomical beginning of the “autumn” quarter of the Earth’s orbit as the meteorological start of fall, for the two are different yet admittedly related.

“According to The Old Farmer’s Almanac, meteorological seasons are based on the temperature cycle in a calendar year.”[1] The first month of a given season tends to resemble the preceding season, and the last month anticipates the upcoming season. A season comes fully into its own in its second month. Each season lasts three months. “Meteorological fall begins on September 1” in the Northern Hemisphere and ends “exactly 90 days later, on November 30.” Winter then gets its three months. Growing up in a northern Midwestern (U.S.) state, I just assumed that snowy March was part of winter. It sure felt like that. Only later, while living in the Southwest, did I realize that temperatures do start to go up in March.
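To make the convention concrete, here is a minimal sketch of the month-based rule described above; it is my own illustration for the Northern Hemisphere, not anything a weather service publishes.

# Meteorological seasons in the Northern Hemisphere run in whole calendar months:
# Dec-Feb = winter, Mar-May = spring, Jun-Aug = summer, Sep-Nov = fall.
import datetime

def meteorological_season(date: datetime.date) -> str:
    seasons = {12: "winter", 1: "winter", 2: "winter",
               3: "spring", 4: "spring", 5: "spring",
               6: "summer", 7: "summer", 8: "summer",
               9: "fall", 10: "fall", 11: "fall"}
    return seasons[date.month]

# Example: September 1 is already meteorological fall,
# weeks before the astronomical (equinox) date in late September.
print(meteorological_season(datetime.date(2023, 9, 1)))   # fall
print(meteorological_season(datetime.date(2023, 12, 1)))  # winter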

Distinct from, though related to, the meteorological seasons, the astronomical seasons are “based on the position of Earth in relation to the sun’s position.”[2] There are such seasons because of the tilt of the Earth in relation to the sun. On the summer solstice—the astronomical beginning of “summer”—the sun’s perpendicular rays reach their farthest point north; the Northern Hemisphere tilts toward the sun. On the winter solstice—the astronomical beginning of “winter”—the Southern Hemisphere tilts toward the sun. Again, while growing up in the northern Midwest, I knew that winter could not possibly start well into December, just before Christmas, for winter cold was well ensconced by that time. The meteorological start of winter, on December 1, is much more accurate in terms of temperature.

It is odd, therefore, that the start dates of the astronomical seasons are “more commonly celebrated” even to mark changes in weather.[3] A weather site or television broadcast using the astronomical start-date is inherently misleading, as the implied claim about the weather is incorrect. Even though Accuweather.com states, “Astronomical autumn officially arrives on Saturday, Sept. 23 at 2:50 a.m. EDT, a few weeks after the arrival of meteorological fall,” the presentation of the two starts by a weather organization may be confusing, especially as the paragraph continues with: “Regardless of which date you celebrate the start of autumn, . . .”[4] In an article on the fall weather forecast, two start-dates for that season are given. At least Accuweather.com distinguishes the astronomical from the meteorological. Local meteorologists use the astronomical dates on charts of weekly weather forecasts.

On this weather chart, the Thursday (September 22nd) is labeled as "Fall." 

This is definitely misleading—or is it the case that local weather personalities do not realize their mistakes? At best, the television meteorologists astonishingly do not realize that they are giving false information—that they are misleading the public. It is absurd, at least in the northern tier of U.S. member states, to say that summer does not begin until June 21st and that winter does not begin until a few days before Christmas. To stick with something that is so obviously absurd and incorrect when meteorologists should know better is precisely the cognitive phenomenon that I want to highlight here.

Perhaps the culprit is cognitive dissonance: the brain holding two contradictory thoughts at the same time. I know this date is astronomical AND I am using it on a weather forecast AND I know that the meteorological date is different. This weakness or vulnerability of the human brain may mean that there are others.

Regarding religious and political ideological beliefs, the brain may be susceptible to “short-circuiting” an internal check that would otherwise keep the brain from conflating belief with knowledge. A person once told me that Michelle Obama is really a man. I disagreed. The person replied, “That’s just your opinion; I have the facts.” I said that I did not want to discuss politics. “It’s not political,” she replied. My claim to the contrary was, again, “just an opinion.” I was stunned at such ignorance that could not even entertain the possibility of being wrong. Next, she wrote a nonsensical “deep state” political coded message on an index card. That her brain would not entertain the possibility that it could be in error is precisely the vulnerability that I contend plagues the brain as it ventures into the political and religious domains of cognition. In short, I suggest that a healthy human brain has more cognitive weaknesses than merely being subjective, and that society is overwhelmingly oblivious to them.



[1] Amaya McDonald, “When and How to Watch the Perseid Meteor Shower,” CNN.com, August 11, 2023.
[2] Ibid.
[3] Ibid.
[4] Brian Lada, “AccuWeather’s 2023 US Fall Forecast,” Accuweather.com, July 26, 2023.


Tuesday, July 4, 2023

On the Decadence of American Journalism: Journalists as Celebrities

I submit that when a conveyer of the news becomes the story, something is wrong; in typing this sentence initially, I did not include I submit that. To state my thesis as if it were a fact of reason (Kant’s phrase) seemed to me rather heavy-handed (i.e., arrogant). Similarly, when some Americans insisted after the 2020 U.S. presidential election that Donald Trump had won, as if the asseveration were a fact of reason, I could sense aggressiveness along with the presumptuousness in treating one’s own opinion as a declaration of fact, especially as the actual fact—Joe Biden being sworn into office—was otherwise. Opinion is one thing; fact is another. When a person misconstrues one’s opinion as fact, something is wrong. I believe this happens so often that it may be due to a problem innate in the human brain. Religious folks would not have to reach far to point out that in the story of Adam and Eve in the Garden of Eden, the sin of pride manifests in wanting to be omniscient; eating of that proverbial apple of the knowledge of good and evil ushers in original sin. A person perceiving one’s own opinion as fact, or even as important as fact, implicitly regards oneself as God. A journalist who interlards one’s role in conveying the news with one’s own commentary, and an editor who then makes that commentary the point of a story, both treat a means (i.e., the conveyer of news) as an end (i.e., the news itself). I contend that at least by 2023, American journalism had fallen into this hole with impunity, which involved a lack of industry self-regulation and individual self-discipline.

On July 4, 2023, The Huffington Post ran a story, “CNN Journalist Responds to Brazen Trump Campaign Claim with Disbelief.” The story begins with the following statement: “CNN’s Phil Mattingly on Monday couldn’t quite believe a Trump campaign response to a Washington Post question about the former president’s efforts to overturn the 2020 election result.”[1] Why should it matter whether a journalist can’t quite believe a statement made by a person being interviewed? CNN also ran the story, "Anderson Cooper Is Dumbfounded by Ron DeSantis' Bad Polling Excuse." The news network reported that the "CNN anchor was confused by the 2024 Republican presidential candidate's reason for falling behind Donald Trump." Why should it matter that the journalist was confused? Maybe he was not the brightest lightbulb. The network's message was obviously that DeSantis was to blame for the journalist's confusion, so the intent was to bias the viewers and readers against the presidential candidate. The inclusion of the word, excuse, in the story's title indicates the tenor of the bias in the "news" story.  

Is it ethical for a journalist to sway or bias the reactions of viewers or readers? Euronews, an E.U. rather than a U.S. news network, explicitly espouses impartiality so viewers can form their own opinions unimpeded by those of a journalist. That network even has a feature in which video is shown of events going on around the world, such as a political protest, with “No Comment” showing at the end. What a contrast to the American news media!

CNN’s obsession with the visible reactions of one of its news anchors, Anderson Cooper, to political statements even of people being interviewed illustrates my thesis. From his “news” show, Anderson Cooper 360, the network posted a video on CNN.com entitled “Watch Cooper’s Reaction to What Sondland Told Trump.” His reaction was visibly nothing spectacular. 

Another video had the title, “See Anderson Cooper’s Reaction to Ted Cruz ‘Groveling’ on Fox.” Again, the reaction was hardly noteworthy. 

Nevertheless, his show on the news network is all about the anchor, as the very name of the show makes explicit. CNN’s CEO must have thought that the anchor’s reactions would make good promotional material for the network. Strangely, a magazine’s editor even thought that Anderson Cooper’s reactions were of value apart from promotional purposes. People posted a story online about the CNN anchor’s reactions in a New Year’s Eve broadcast from Times Square in New York City. The story, “Anderson Cooper Was All of Us with His Hilarious Reactions as He Took Shots to Bid 2020 Farewell,” featured the celebrity’s reaction to drinking a shot of alcohol on live television. After he and the other host drank a shot, “Cooper pursed his lips and coughed in seemingly slight discomfort, though he otherwise held it together.”[2] Hilarious.

It may be that Anderson Cooper’s minute reactions were such important fodder for publications because the anchor’s mother, Gloria Vanderbilt, was an heiress of an illustrious (and very rich) American family. She was a businesswoman, fashion designer, socialite, and writer in her own right. Interestingly, she had offered at the age of 85 to carry a baby for her gay son. That in itself was more newsworthy than any of her son’s muted political reactions on air. Of course, when Anderson Cooper had come out as gay on air, that too was deemed to be newsworthy, in spite of the journalistic standard that the sexual orientation (or race, gender, or political ideology) of a conveyer of news shouldn’t have an impact on the presentation of the news.

When a journalist becomes the story, especially in expressing a personal opinion, news itself (and journalism) becomes obfuscated, diluted, and even toxic from the standpoint of the role of an electorate in a democracy. The societal justification in giving journalists an outsized mouthpiece in public discourse is predicated on their function in conveying the news. This does not extend to molding public opinion and being the news themselves. In a culture in which reality-shows spawn celebrities, perhaps it is only natural that anyone on television could be made into one even for displaying muted visible reactions.

Wednesday, June 14, 2023

Starbucks: A Racist Company Against Racism

In June 2023, Starbucks faced a unanimous jury decision in favor of a regional manager whom Starbucks’ upper management had fired because she had resisted the company’s racist policy of punishing innocent Caucasian managers for good public relations. The CEO felt such punishment was needed and appropriate after a store manager had legitimately called the police on two Black people in a Starbucks restaurant who presumed the right not only to sit in the restaurant without ordering anything (before Starbucks allowed this), but also to ignore the authority of the store’s manager. Starbucks caved to the unjust negative publicity, and thus showed a lack of leadership, and went on to act unethically in wanting to show the world that the company could go after Caucasian employees. This racism is ironic, for several years earlier, Starbucks’ CEO had ordered employees at the store level to discuss racism with customers. Interestingly, the anti-racist ideology being preached was partial, and thus contained a blind spot wherein racism such as the company’s upper management would exhibit is acceptable.

As the CEO of Starbucks, Howard Schultz had employees promote his political ideology on two social issues: gay marriage and race. Regarding the latter, he ordered employees, whom he artfully called partners, to write race messages on cups so customers would unknowingly enable employees to impart Schultz’s position on the issue by raising the topic. I assume that the employees could not begin such conversations. I have argued elsewhere that Schultz’s use of the employees for such a purpose was not only extrinsic to making coffee as per the employees’ job descriptions, but also unethical.[1] In terms of corporate governance alone, the shareholders, as the owners of the company, should have decided whether to have their company used to promote partisan positions on social issues. In 2023, Target and Budweiser would learn of the perils of straying from the knitting to get political on social issues. In terms of jurisprudence, the “right” of a company, a legal entity, to have free speech is dubious, as abstract entities, even if legally recognized as such, are not human beings. Rather, the “free speech” claimed by companies is really that of the human beings who work for the companies. Using an abstract entity that itself cannot speak to gain additional publicity for one’s ideological views is unfair because those whose speech is thereby elevated or amplified are not so entitled from a democratic standpoint. In short, why should Howard Schultz have access to a megaphone and employees to propagate his political ideology on social issues, when you and I have no such means of self-amplification? Whether we agree or disagree with the former CEO’s political ideology on race is not relevant to my point. To be sure, that his employees were told to speak against racism is in my opinion much better than had they been told to advocate racism against Black people. That Starbucks would then engage in racism is that much harder to understand, but perhaps the hypocrisy reflects a hidden negative aspect of Schultz’s ideology on race. American society could benefit by having that aspect uncovered; such a benefit vastly outweighs any benefit to business. Even in a pro-business culture, a lower good should not be put over a higher one. Augustine refers to this error as misordered concupiscence.

In June 2023, a jury in New Jersey “found in favor of former Starbucks regional director Shannon Phillips, who sued the company for wrongfully firing her, claiming she was terminated for being White.”[2] The company’s position was that Phillips’ boss fired her because she had displayed weak leadership. The use of such vague jargon as leadership for what is actually management is itself problematic. Even if Phillips had “appeared overwhelmed and lacked awareness of how critical the situation had become,” as her boss presumably had written, that would not constitute weak leadership, for she was not in a leadership role;[3] instead, the company’s CEO should have gotten out in front of the issue and provided a vision for the company.[4] If Schultz was the CEO at the time, the failure of his leadership would be especially telling, considering his earlier foray into politics using the company to promote his ideology.

The triggering incident that had overwhelmed Phillips, according to her boss (who, the CEO at the time must agree in retrospect, failed as a supervisor but presumably was not fired), involved two Black men who had refused to leave a Starbucks store in 2018 even though they would not purchase anything. They were thus not customers, and the incident occurred before the company allowed non-purchasers to be in the stores. That the two Black men refused to leave the company’s private property means they were trespassing, so the store manager was on solid legal grounds in having the local police remove the men from the store. Being Black, even if that race has been (and is) subject to racism generally, does not give a person the right to trespass on private property, and efforts to remove such trespassers are not racist, for anyone trespassing would be legally subject to removal from the property.

I contend that Howard Schultz’s notion of racial reconciliation suffers from the weakness of being blind to the racial presumption displayed by the two Black men. In having employees talk to customers about the need not to be racist, Schultz was assuming that racism is something that non-Blacks do to Blacks. Employees were not told to suggest to Black customers that being Black does not give them special exemptions from the law or in society. Schultz could have had employees suggest to Black customers that jay-walking between intersections on a major street even if cars are coming is not “a Black thing” that is justified because the race in general has been subject to discrimination. Furthermore, the use of the word, nigga, cannot be allowed only if the speaker is Black, for that would be a racist position. For a Black person who uses the word to become hostile or aggressive towards an Indian, Oriental, or Caucasian who also uses the word is itself racist (and of course the hostility is unjustified unless the related word nigger is used in a hostile manner). The U.S. Constitution does not indicate that free speech depends on or is limited by race; such a clause would be prima facie racist.

Phillips’ complaint, which the jury accepted unanimously, states that following the arrest of the two Black men, Starbucks “took steps to punish White employees who had not been involved in the arrests, but who worked in and around the city of Philadelphia, in an effort to convince the community that it had properly responded to the incident.”[5] Phillips was ordered “to place a White employee on administrative leave as part of these efforts, due to alleged discriminatory conduct which Phillips said she knew was inaccurate. After Phillips tried to defend the employee, the company let her go.”[6] It does not sound like Phillips was overwhelmed; in fact, she was being pro-active and ethical in defending an employee from an unjust punishment. The implication is that the person who fired Phillips acted unethically.

Moreover, in being willing to sacrifice Caucasian employees based on their race for good public relations, the company’s upper managers were being racist. An unseen implication is that those managers believed that the public reaction against the company for having the two Black men removed from the store in Philadelphia had some validity—that Black people should not be treated like that or that Black people deserve special treatment due to their race. But such a belief is itself racist. Schultz’s talking points for his employees to discuss with customers on race did not include mention of the racism in such beliefs. Moreover, he did not have the company’s employees talk about racism by Black people stemming from resentment. Any ideology is partial, rather than whole, and even a claim of being against racism can fall short. In going after Caucasian employees, including Phillips, Starbucks’ upper managers fell short; the failure of leadership ultimately belongs to the CEO at the time. At least at the time of the trial, Howard Schultz was the CEO.

Saturday, June 10, 2023

Gay Pride and Evangelical Christianity

Taylor Swift, an American singer and cultural icon in 2023, spoke “out against anti-queer legislation” during a concert in early June. “We can’t talk about Pride Month without talking about pain. There have been so many harmful pieces of legislation that have put [gay people] at risk. It’s painful for everyone. Every ally. Every loved one . . . ,” she said.[1] So much hurt. This motivated me to volunteer to carry a full-size gay flag in a gay Pride parade until the end of the route even though I am not gay. When I arrived in the morning, I thought the issue was political; by the time the parade began, religion had trumped the political. A small but vocal group of evangelical Christians and a larger group of young women wearing and carrying gay flags (in part to hide the Christians) were shouting at each other in an utter futility of noise. What if people used religion to dissolve the religious and political anger and even tension instead of stoking them? Both sides missed an opportunity.

The full essay is at "Gay Pride and Evangelical Christianity."


1. Shruti Rajkumar, “Taylor Swift Breaks Silence And Condemns Anti-LGBTQ Bills During Eras Tour,” The Huffington Post, June 3, 2023.

Thursday, May 18, 2023

Thanksgiving as a Day of Mourning: On the Instinctual Urge of Resentment

According to CNN’s website, the “sobering truth about the harvest feast that inspired Thanksgiving” is the fact that colonists killed Indians. According to an analyst at CNN, the American Indian Day of Mourning, established in 1970 for the fourth Thursday of November, turned Thanksgiving “into something more honest” than the Thanksgiving mythos of a peaceful feast in 1621 suggests.[1] The drenching of self-serving ideology in CNN’s “analysis,” like heavy, overflowing gravy obscuring the sight and taste of the underlying mashed potatoes, is something less than honest.

Historically, the feast at Plymouth Colony in 1621, exactly four hundred years before Thanksgiving in 2021, when I am writing this essay, was attended not only by the Pilgrims, but also by the Wampanoag Indians. The two peoples were then in an alliance. CNN’s Tensley attempts to derail the value of the cross-cultural feast by pointing out that initially, “the pious newcomers didn’t even invite the Wampanoags to the revelry.”[2] The value in the fact that the Indians reveled with the Pilgrims in feasting is not nullified by the fact that the Pilgrims initially had not intended to invite them; they changed their minds. Also, that the invitation served strategic interests in strengthening the alliance is no vice, for the alliance was based on ensuring survival in a changing world.

Moreover, the Day of Mourning is itself partisan in that it tells only a partial truth—namely, that Pilgrims killed Indians in the colonial era of North America. Left unsaid is the equally valid point that Indians killed Pilgrims. That smallpox also led to the death of Indians was no fault of the Pilgrims, contrary to Tensley’s ideological resentment. Furthermore, that killing took place between Indian tribes and English colonies generally does not nullify the good that is in a shared feast even among allies. Mourning the loss of American Indians generally as a replacement for Thanksgiving obscures that good and in fact implies that the killings in a broader war nullify the good that is even in an eventual invitation. It surely must not have been easy for either the Indians or the Pilgrims to sit down together for a feast, given the more general prejudice then existing between the Indian tribes and English colonies.

Of course, the point of the national Thanksgiving holiday established by President Lincoln in the nineteenth century—namely, to give thanks to God—is of value in itself rather than being nullified by the prior conflict between the English colonies and Indian tribes in North America.

The 2021 Thanksgiving Day parade in New York City. (Source: CNN) Surely such happiness does not deserve to be sullied by sordid resentment as a will to power.

In short, the imposition of a day of mourning over Thanksgiving really misses the point. In actuality, the imposition is, in Nietzschean terms, an instinctual urge fueled by ressentiment. Nietzsche claims that more pleasure from power can be had by mastering such an intractable urge than by letting it run wild. Such internal overcoming is the signature of strength—more so even than conquering enemies on a battlefield. Unfortunately, the weak are so oriented to their external enemies that even truth can suffer, and giving thanks to God can be overlooked entirely. The weak are children of a lesser god, a god of projected ressentiment, which discredits the very notion of a benevolent deity. Like light from a distant star, Nietzsche writes, the news of the discrediting murder of the conception of a vengeful god of perfect goodness has not yet reached the murderers, whose hands are drenched with blood. To cry, “Out, out damned spot!” yet not know the source of the blood must surely be a worrisome interim condition drenched with anxiety, which in turn can fuel an instinctual urge based in ressentiment of the strong—people willing and able to control even their most intractable instincts.
[1] Brandon Tensley, “National Day of Mourning Turns Thanksgiving into Something More Honest,” CNN.com, November 25, 2021 (accessed same day). CNN labels Tensley’s role as that of an analyst rather than an opinion-writer—the false attribution of fact belies the salience of the writer’s opinion-informed ideology.
[2] Ibid. Tensley’s own hostile ideological resentment is evident in his labeling of the Pilgrims as pious—here connoting a presumption of superiority. The Indians no doubt also felt superior, as it is only natural for any people, including the Pilgrims and Indians, to favor one’s own culture over others if one’s ideology (i.e., values and beliefs) is in sync with one’s culture.


Undermining Progress: Power Enforcing Infallible Ignorance

Bleeding to heal. The Earth is flat. Earth is at the center of the solar system. Zeus lives on Mount Olympus. The divine right of kings to act even as tyrants (e.g., Henry VIII of England). Hitler died in his bunker. Turning the heater on in a local bus kills coronavirus. These are things that were thought in their respective times to be incontrovertibly true. In some of these cases, the power of the establishment was not subtle in enforcing them even when they should have been questioned. How presumptuous this finite, mortal species is! If ignorance on stilts is bliss, then why is it so in need of power? Subconsciously, the human mind must realize that its assumption of not being able to be wrong is flawed. We are subjective beings with instinctual urges—one of which manifests in the unquestioned assumption that what we know cannot be wrong, and furthermore that we are entitled to impose our “facts” on others. As the homo sapiens (i.e., wise) species, we are too sure, and too proud, concerning our knowledge and especially our beliefs. We would like to have the certainty and objectivity that computers have, but we are subjective biological animals, not inert machines.

How much do we actually know? David Hume claimed that we do not really understand causation; we don’t get close enough to it to understand how one thing causes another. Worse still, we often take a positive correlation—that one thing is related to another (e.g., rain and seeing umbrellas)—as meaning that the one thing causes the other. Rain does not cause umbrellas; nor do umbrellas cause rain. Descartes was of a rare breed in that he was willing to critique his entire edifice of knowledge. With an open plain littered with the debris in front of him, he wrote that he could only be sure that he was thinking and therefore that he was existing. Cogito, ergo sum. I think, therefore I am. That he went on to reconstruct the very same edifice may suggest that he was still too taken with his previous knowledge. At the very least, his rebuilt edifice cannot be reckoned as progress.
Generally speaking, pride/ego plus knowledge is a retardant to progress and a sycophant to the status quo. New ideas must break the glass in order to breathe and circulate, even to reach people’s consciousness. Well-established beliefs clutch at us even in the face of strong arguments and empirical evidence to the contrary.

Hikes and stake-outs on Mount Olympus could have demonstrated to the ancient Greeks that immortal giants did not live there. The Greeks who scaled the peak tended to say that they felt the gods there—that the gods were invisible, as if they were merely spirit. Such contorting and even pruning when necessary is not uncommon in cases in which religion over-reaches; the core of the religious belief itself must endure even in the face of contravening empirical evidence. Sadly, not much progress has been made on the mind-game in the domain of religion; the human mind itself may be susceptible, with denial protecting the mind from recognizing its own susceptibility.

By the time that the ancient Greek religion became extinct, people were willing to conclude that no such gods existed (or had existed), and the belief that they lived on Olympus was simply wrong. Few if any people, however, were then able to consider that their own living religion could be wrong too. It’s the other guy who is wrong; this time, the deity really does exist. The firmness with which this belief is held, as if it were knowledge, is a sign of excessive defensiveness, and thus of unconscious doubt. Perhaps the unconscious is more honest with itself than consciousness is with us.
How many Christians consider that perhaps people could be wrong that Jesus literally rose from the dead (i.e., historically, as an empirical, historical fact)? How many Jews consider that historical evidence is lacking to support the belief that Moses was a historical person?  Josephus, an ancient Jewish historian who lived in the first century, wrote Antiquities, which refers to a man named Jesus (albeit with probable later Christian parenthetical additions that a Jewish historian would not have accepted). To go from a man named Jesus to Jesus Christ involves a religious claim/belief that Jesus is divine. We have left the territory of historical accounts, which are in the past tense, to make use of faith narratives, which, as myths, can be in the present tense. For example, myths such as the Christian Passion story can be reenacted in ritual each year as if Jesus’ passion is once again to be felt. The religious experience is presently experienced, having been triggered by myth (religious story) and ritual (couched in drama).

In short, in looking back at the ancient Greek religion, we dub the stories of Zeus and the other gods as myth. Yet we instinctively resist even the possibility that the ongoing religions could include myth, for myth and historical writing are two different genres, and we clutch at the added certainty that can be provided by historical accounts. Why is additional certainty believed to be so important? Religionists don’t want even to consider that their particular religious beliefs could be wrong or over-stretched. To be sure, a myth-writer (or orator) may reference historical events, but his point is not to convey the veracity of them. Rather, historical events may be used (and adapted) to make religious points. For example, the Gospels differ on when the Last Supper occurred relative to Passover because the writers wanted to make different religious points. None of the writers of the faith narratives would have subordinated religious points to historical accuracy. Therefore, the added certainty is a mirage. Rather than essentially reclassifying religious belief as knowledge (empirical or through reasoning), matching religious belief with its own kind of confidence would be more in keeping with the domain, and thus with human experience therein.

Unfortunately, religion does not rest with such exogenous certainty; the inhabitants of the domain not only try to conquer (and thus control) each other but also treat other domains as fair game. Run through the circuits of a human brain, religion tends to be infused with pride, such that the religious domain may have a propensity to encroach on other domains, even assuming the prerogative to dominate them. How uncouth! Hence Christianity got into trouble when it tried to control science and claim history for itself. The assumption that religion should constrain scientific knowledge not only conflated two different categories, or domains, but also was ignorantly taken as infallibly true. Furthermore, a faith-belief could be taken as a historical fact, which in turn could be used to justify the belief. Such a closed, self-reinforcing cognitive loop is not easily broken open, even by the scalpel of an inquisitive, self-questioning mind. How rare such minds have been, and are, even amid robust technological progress and the greater knowledge available to mankind.
Christianity also got into trouble with itself, without realizing it, when it over-reached onto the military domain, which is not at all friendly to loving thy enemy. When the Roman Catholic popes became partisans in geo-political rivalries in Europe, the Church became closed in effect to its rivals and thus short-circuited its own mission—that is, the mission in the religious domain to save souls by leading people to Christ. We can count as progress the success of other domains in pushing religion back within the confines of its own turf. To presume to know the native flora of another land better than the native plants on one's own land, and then to presume to weed that land without sufficiently weeding one's own, is arrogance on stilts; such a toxic attitude of superiority deserves to be put underwater. Thus the high are made low, at least in theory.

In surveying world religions, I see progress at the point when the extant religions (with the exception of Satanism) came to no longer believe that human sacrifice appeases deities. When Judaism and Christianity had gained enough traction in ancient Greco-Roman culture that religion itself was no longer just a matter of ritual but also had moral content (e.g., the Ten Commandments, the Beatitudes), religion itself may have progressed. Why am I not more definite? Friedrich Nietzsche, a nineteenth-century European philosopher, argues that modern morality is born of weakness and foisted on the strong to make the latter voluntarily renounce acting on their strength. Meanwhile, the ascetic priests, who are weak (literally, in being celibate), are free to unleash their urge to dominate by controlling their respective herds and confronting the strong with, "Thou Shalt Not!" Even our surest knowledge of progress can afford to be questioned.

Unfortunately, once the Greco-Roman religion, which was merely ritual to appease the gods and included human sacrifice, became extinct, continued progress has faced a strong headwind from the still-extant religions created roughly in the "second generation" (1800 BCE-650 CE). Even though the ancient cultures within which those religions formed are, by the twenty-first century, oceans of time from modern-day cultures, religious strictures grounded in the formative cultures die hard, if at all. These strictures are sustained at least in part out of a fear that beginning the project of separating the divine from (human) culture would lead to anything goes (i.e., cafeteria-style religion). What if the divine in revelation is itself culture reflected on high? Change itself faces an uphill battle, even though the sheer difference between modern and ancient cultures suggests that changes are necessary so that moderns are not held captive by the arbitrary limitations of long-ago cultures. This is particularly true of religious moralities. That Paul thought women should not preach in Christian churches is not sufficient for churches today to be obligated to treat Paul's opinion in his letters as if it were divine revelation. Even that Jesus' disciples in the Gospels are men does not mean that Jesus sought to limit his disciples to men; writings on Mary Magdalene discovered in the twentieth century support this point. Put another way, even mere opinions in ancient letters are held so firmly that human opinion is essentially divinized. As a writer, I am well aware that writings contain mistakes. Correcting for those errors, such as the Christian overlay on Josephus' historical account of Jesus, has largely been inoperative when the human mind entertains religious belief (i.e., dogma).

My point is that the self-retarding mechanisms of the human mind can slow down progress and enclose us in ignorance that cannot be wrong. We tend to overrate both the freedom of progress from human nature and the knowledge and beliefs we have, both individually and as a species. This is not to deny the existence of progress through history. Gladiators killing each other in stadiums have been replaced by football players (in both senses of the sport) fighting over a ball. A general increase in the value of human life has occurred in enough societies to suggest an upward trend against which wayward dictators can be measured. Nietzsche aside, moral progress has also occurred, again in enough societies to demonstrate an upward trend. The incredible technological advances of the twentieth century can also be taken as progress because they expanded human potential. For one thing, people could write beyond daylight, electric lights being brighter than candles. Just think how long candles were relied on; then, all of a sudden, at the turn of a switch, the initially-feared new light was on and could spread. The danger, it seems to me, lies in the assumption that the biological fixity of our species becomes less of a hindrance as technology becomes ever more advanced.

The coronavirus pandemic in 2020 hit the species in spite of our technological advances, even in the field of medicine. Boris Johnson, the prime minister of the UK at the time, initially swore off precautions. The fact that he held high office did not prevent him from having to go into intensive care at a hospital. As far as a virus is concerned, we are not apart from Nature; rather, we are biological. Our minds, being corruptible in terms of knowledge and judgment, can limit what technology can do to stave off a pandemic.

For example, according to a local bus driver in Phoenix, Arizona, the bus company's management was urging drivers to turn on the heat when the outside temperature was not prohibitive and to close the windows, because "the heat kills the virus." The closed windows meant that plenty of airborne virus could be expected to be trapped in the buses. Perhaps the treatment of bleeding would have healed the brain-sickness of those managers. Unfortunately, they were able to use their authority to enforce their ignorance that could not be wrong. So could grocery-store managers there—in a state in which public education is ranked 49th out of the 50 states—who did not even notice that their own employees were not keeping a physical distance from each other or from customers (who behaved as herd-animals incapable of altering a well-grooved habit even to protect themselves!). The improved knowledge available from medical experts didn't matter. In fact, by April, most customers and employees of grocery stores in Phoenix were wearing the surgical masks that the virus can easily pass through; such masks were meant to be worn by the infected so they would not spit on, and thus infect, the healthy. Of what value is progress in knowledge if a major metropolitan area in a developed country acts regardless? A meat manager at one grocery store there told me that one man touched a number of meat packages after having gorged on some chocolate. The customer rebuffed the manager, saying, "My fingers going from my mouth to the packages won't get anyone sick." An uneducated opinion was presumptuously dismissing science. In this way and many others, the benefits of progress in human knowledge are held back by human nature—specifically, by ignorance that cannot be wrong and even presumes to trump knowledge.

It is ironic that progress has been extolled even in times held back by the status quo. "We are in an age of greater transparency," a person interviewed by the BBC said just after the British government tried to have it that the prime minister, Boris Johnson, had been hospitalized merely for tests because he had symptoms. The lightness of this announcement is belied by the fact that he went to the hospital during the Queen's speech. He surely would not have wanted to take attention away from the speech, so why not wait until afterward? Why the urgency if he was only going in for tests? The implication that his hospitalization was not urgent was undone the next day by press reports that he was then in intensive care. So much for transparency, at least from the government. The primitive instinct for security surreptitiously stepped back from, and thus nullified at least in part, the contribution that technology had made to transparency in the press's coverage of government affairs.

Similarly, even though a French agent reported to the French intelligence service that he had recently seen Adolf Hitler and his wife attending one of three performances of an opera in South America after World War II, the world, including the U.S. government, stuck publicly with the Soviets' story that the couple had died and then been burned in Hitler's bunker in April 1945. Even after the Soviets tested the couple's DNA and found that both sets of remains were from women, the world and its governments continued with the story that Hitler and his wife had died in the bunker. That Hitler might have lived out the rest of his life in South America, even conniving with his expert on dropping a nuclear bomb on New York City, apparently triggered the security instinct such that the progress in intelligence-gathering and analysis was for naught. The tyranny of the status quo against progress is subtle, yet more enduring than the rule of a tyrannical ruler.

Why was it historically insisted in Europe that the Earth is flat, even without any evidence? The "scientific fact" was even defended by threats of death, but then it was more a matter of religious belief masquerading as fact. Why is the human mind so hesitant to say, "It's a theory, but we really don't know"? The pride of a mind and the fear of uncertainty are human qualities rooted in the instinct of self-preservation. Pride is thought to beget power, which aids self-preservation. So too does having greater certainty about one's environment. Such bloated pride can convince a Christian king that the divine right of kings justifies even tyranny that is hardly in line with Jesus' teachings. Even Christian clerics intoxicated with their temporal power may suppose that burning a scientist for claiming that the Earth orbits the Sun, rather than vice versa, is in line with loving enemies. Being more in love with temporal power than with Jesus' preachments is yet another example of the religious cost of trying to dominate in another domain.

Whether in religion, politics, or higher education, does cognitive difference really make someone an enemy, or is the human brain prone to overstepping by applying emotion to cognitive differences? We humans are overwhelmingly unaware of the games our minds play on us. We assume that we are in control of what we think and that we use reason impeccably. Nietzsche claimed that the content of ideas is instinctual urges, and thus reasoning is a subjective tussle within loose strictures that may themselves be urges. How much do we really know, even about ourselves? Yet we would not tolerate someone saying that what we are absolutely sure we know may yet be incorrect. We are so sure that we grasp for authority to enforce what we know on others who resist. Hence, if we were to go back in time and refuse to be bled, a physician might dismiss our claim that bleeding actually weakens rather than cures a person and use his authority as a physician to subject us to the treatment. The weak—in this case, the ignorant with power—think nothing of dominating the strong; in fact, the resentful enjoy it.