"(T)o say that the individual is culturally constituted has become a truism. . . . We assume, almost without question, that a self belongs to a specific cultural world much as it speaks a native language." James Clifford

Monday, January 8, 2024

On the Birth of Corporate Social Responsibility in 1869

Referring to the speculation in gold that was engineered by Jay Gould and others in 1869 to enrich themselves and the Erie Railroad, Henry Adams (1838-1918), a grandson of John Quincy Adams and great-grandson of John Adams, wrote at the time:

“For the first time since the creation of these enormous corporate bodies, one of them has shown its power for mischief, and has proved itself able to override and trample on law, custom, decency, and every restraint known to society, without scruple, and as yet without check. The belief is common in America that the day is at hand when corporations far greater than the Erie [Railroad] — swaying power such as has never in the world’s history been trusted in the hands of mere private citizens  . . . — will ultimately succeed in directing government itself. Under the American form of society, there is now no authority capable of effective resistance.” (1)

Gould had wanted the price of gold to rise not only because he had bought some to sell at a higher price, but also because, as a stockholder of the Erie, he would benefit from the railroad transporting more wheat from the Midwest to the East Coast for export. A higher price of gold meant a lower dollar. Because wheat was priced in dollars, a lower dollar meant more exports. The strategy was essentially to devalue the dollar, which Gould assured President Grant would be in the national economic interest. As the price of gold rose to $165 in 1869, Grant, fearing a bubble, pulled the plug by having the Treasury sell $4 million in gold. The collapse in the gold market triggered a drop in the stock market. Even if it might have been in the short-term interest of the speculators and railroads, the manufactured bubble was not in the national interest after all. Gould’s bribes of administration officials had been in vain.
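To make the export mechanism concrete, here is a minimal sketch, using hypothetical round numbers rather than the actual 1869 quotations, of why a cheaper greenback makes dollar-priced wheat cheaper for buyers who pay in gold:

```python
# Minimal sketch with hypothetical numbers: the 1869 "gold price" was quoted as
# the number of greenback dollars needed to buy $100 in gold coin.
def wheat_cost_in_gold(wheat_price_greenbacks: float, gold_price: float) -> float:
    """Cost of one bushel of wheat expressed in gold dollars.

    gold_price is the greenback price of $100 in gold, so one greenback
    is worth 100 / gold_price gold dollars.
    """
    return wheat_price_greenbacks * 100 / gold_price

wheat = 1.30  # hypothetical greenback price per bushel
print(round(wheat_cost_in_gold(wheat, 130), 2))  # gold at 130: about 1.00 gold dollars per bushel
print(round(wheat_cost_in_gold(wheat, 165), 2))  # gold at 165: about 0.79 gold dollars per bushel
# The dearer gold becomes in greenbacks, the cheaper American wheat looks to
# buyers holding gold, so exports (and rail shipments of wheat) rise.
```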

Henry Adams saw in the scandal the imprint of corporate power eviscerating both societal norms and democracy. In other words, the new-found corporate power gave birth to the need for corporate social responsibility amid capitalism’s eclipse of democracy. In academic terms, corporate social responsibility and (corporate) business & government, although discrete fields, were both first publicly recognized in 1869.

The corporate power occasioning Adams’s recognition was a novelty at the time, according to Brands, because the large corporation had only come into being as the railroads incorporated in the 1850s. Looking back after the Civil War, Henry Adams observed, "The last ten years had given to the great mechanical energies — coal, iron, steam — a distinct superiority in power over the old industrial elements — agriculture, handwork, and learning." (2)  The power of steam in particular translated into large, publicly held corporations, first in the railroad industry.

On account of their size and scope, and the associated equity-capital requirements given the risk faced by lenders, the railroads were the first large American corporations to be publicly traded. The diffusion of ownership — a consequence of the large capital demands — led to a separation of ownership from control and to a new ownership interest: that of the short-term-oriented speculator. A short-seller, for example, seeks lower corporate earnings in the future, while a long-term investor hopes for higher profits, and thus dividends. Managers can exploit this difference in order to pursue their own interests in the name of the corporation at the expense of societal norms and democratic governance.

Undergirding the managers’ claim to authority based on skill, the railroads were the first companies to develop the methods of modern corporate administration. For example, there were supervisors over supervisors—in other words, multilayered organizational charts. Furthermore, dovetailing with the need for safety and efficiency (given the competition), the railroads developed precise management of their operations, including standards for measuring performance. In short, the railroads were the first to develop a cadre of managers specializing in the administration of a particular industry. (3)

Regarding the private power based on technique (i.e., managerial power), Henry Adams announced in 1869 that there was no authority, whether in society or government, capable of resisting it. The normative call for corporate social responsibility and the political call for a resurgence of democracy amid the encroaching capitalism were born. In other words, with great power came a recognition of a need for great responsibility. The corporate social responsibility movement began as precisely this recognition even as the modern large corporation was in its second decade.

Punctum saliens: the large corporate form of commercial organization is itself inherently powerful relative to societal norms and even to potential governmental or regulatory restraints. That is to say, the invention of the large corporation may have been inherently problematic, essentially posing systemic risk to the republic itself on account of the private power of its managements. To paraphrase Nietzsche, power cannot but be powerful. To unleash an inherently powerful feeding machine and expect it not to eat the grass is naive, if not patently irresponsible. To expect the managements of extremely wealthy corporations to be willingly socially responsible, when their economizing and power-aggrandizing nature is to run through such non-constraints, is simply ideological, if not fanciful. Fundamentally, the problem with corporate management is its inherent proclivity to bristle at any external constraint. Its underlying maximizing egoism is innately antithetical to the limiting natures of government regulation and corporate social responsibility.

Endnotes:

1. Henry Adams, “The New York Gold Conspiracy,” in Charles F. Adams, Jr. and Henry Adams, Chapters of Erie (Ithaca: Cornell University Press, 1956), pp. 135-36.
2. Henry Adams, The Education of Henry Adams (1907; Boston: Houghton Mifflin, 1961), p. 238.
3. H. W. Brands, American Colossus: The Triumph of Capitalism 1865-1900 (New York: Doubleday, 2010), pp. 22-23.

Wednesday, December 6, 2023

Time Magazine’s Person of the Year: Taylor Swift

Time magazine named the singer Taylor Swift as its person of the year for 2023. Such a force of nature were her stadium-filling concerts during that summer that they triggered economic booms in the respective host cities. In Pittsburgh, Pennsylvania, for example, hotel rooms downtown went for as much as $2,500 on the night of the concert. In terms of American culture, the analogy of gravity waves may fit. During a television interview at her home (or one of her homes), Swift displayed savvy business acumen; her marketing prowess was extraordinary. She even re-released her own songs, resulting in a huge financial windfall for what are really the same songs merely re-sung. It is not as if she had grown a new voice. Swift personifies American culture, whose “movers and shakers” seem “happy-go-lucky” on stage yet, behind the scenes, tend to be laser-focused on the business end. In short, considerable distance may exist between the societal image and the private business practitioner, and the ethical element can get lost in the shuffle and excitement.

To be sure, economics was evident in the “Swiftie” phenomenon during the summer of 2023. According to Time, Swift “achieved a kind of nuclear fusion: shooting art and commerce together to release an energy of historic force.”[1] Her Eras concert tour "brought in a whopping $1.04 billion with 4.35 million tickets sold across 60 tour dates."[2] Not just any singer can make such a haul, trigger municipal economic booms, and saturate the media’s attention worldwide simply by going on tour. Also, the magazine is clear that such a gargantuan haul is not “something we often chalk up to the alignments of planets and fates,” for “giving too much credit to the stars ignores [Swift’s] skill and her power.”[3] In particular, her intense and sustained focus on every conceivable way to increase revenue, such as re-recording existing songs and bundling them (admittedly with some songs from her vault) into albums in their own right, attending to merchandise, and actively using the media for free publicity, leveraged a market power that was unrivaled; she dominated the airwaves during the summer of 2023. The “Taylor’s Version” albums provide an interesting case study in which hype, money, and ethics are all in the mix.

According to Time, “Swift began releasing re-recordings of her back catalog in 2021 in an effort to reclaim her original music, after her initial label Big Machine Records sold her masters to Scooter Braun’s Ithaca Holdings in 2019. ‘Now Scooter has stripped me of my life’s work, that I wasn’t given an opportunity to buy,’ Swift wrote. . . . ‘Essentially, my musical legacy is about to lie in the hands of someone who tried to dismantle it.’”[4] I don’t doubt the authenticity of her emotive motivation here. In the vernacular, she was pissed. Even so, if she had signed a contract with Big Machine Records giving it the unilateral right to sell the masters of her songs, and the purchaser had the legal right of use, then she had no legal or ethical claim to preempt the sale or to be sold the masters outright. Of course, if labels write heavily one-sided contracts that essentially reflect the labels’ commercial interests by taking advantage of the lack of bargaining power of new signees, ethical critique is fair game.

By its very nature, a contract is a coming together of (at least) two interests, with consideration (money) given by one party to the other. A residential lease, for instance, should reflect both interests. It should not narrow the use of the premises to reflect only how the property owner would use the space or would like the space to be used. A property owner might prefer a “no guest” policy, but such a “policy” (the very word being presumptuous) violates reasonable use of the premises. Furthermore, the property owner’s personal religious or moral lifestyle should not bind the counterparty as long as the property itself is not damaged. “I don’t believe in eating meat, so you are not allowed to use the kitchen of your apartment to cook meat,” for instance, is presumptuous and dogmatic. More to the point, such a clause would violate or nullify the fact that, in receiving rent, the property owner is selling the use of the space (as long as the property is not damaged). The mantra “It’s my house,” taken as an absolute, is circumscribed when use is being sold for consideration (i.e., rent). Having it both ways is selfish and childish.

Whether or not Taylor Swift originally signed a one-sided contract is beyond my ability to investigate, given the information that I have. Her fans did not know either, and so, because of her emotional claim and her “star power,” her ethical cause resonated. Even so, it can be asked whether it was ethical to hype the “Taylor’s Version” albums to the extent that buyers were willing to pay the full price of an album even if they had most of the songs already. To be sure, the “Taylor’s Versions” included “vault tracks,” songs not on the original albums. She also updated some lyrics. Even so, it can be asked whether the additional work justifies the full price of a new album. It can also be asked whether customers with receipts for the original albums, such as Fearless, should have been able to buy Taylor’s Version at a discount. I submit that such a discount would be reasonable, given both the amount of additional work on Taylor’s part and the substance of the product (i.e., the extent to which it differs from the originals). A few songs from the vault and some new lyrics do not render the albums commensurate with albums filled with previously unreleased songs.

If Swift’s motivation was indeed to gain control of her songs, she should have agreed to a discount. Fearless (Taylor’s Version) had the biggest debut of any album in 2021, with 722.7 million on-demand streams in the U.S. that year.[5] Surely at least some of those customers already possessed the original album. Of course, the irrational exuberance that would cause such a customer to buy the same songs again can also be criticized, but many of her customers were teenagers and thus easily taken in even by orchestrated hype of good feeling seemingly aloft from the earthly taint of business strategizing. My point is that it is no accident that Taylor Swift made a lot of money essentially recycling songs ready for re-singing. She was not merely trying to regain control over her work. I submit that she was acting as a businesswoman, and a darn good one at that. Her true identity, her driving financial ambition, was practically hidden under the blinding glitter of the “nuclear fusion” that Time magazine describes. The resulting sonic boom was orchestrated to coordinate and max out both the hype and the revenue. Behind the moral cause, behind the curtains, Swift’s financial acumen could be said to be a subterranean force of nature.

Such a force tends to be obscured, obfuscated, or, more often, intentionally hidden in the American entertainment industry. Similarly, elected representatives in Congress or the White House keep both their foul tongues and their raw desire for power far away from the reach of microphones and cameras. In short, the sheer difference between private personas, including agendas, motivations, and even personalities, and the public images on the societal stage is astounding. Especially in politics in a representative democracy, this differential is a real problem that goes beyond the financial harm to young “Swifties” who have been subtly manipulated into buying (mostly recycled) songs at full price.


1. Jordan Valinsky, “Taylor Swift Named Time’s ‘Person of the Year,’” CNN.com, December 6, 2023.
2. Maria Sherman, "Taylor Swift's Eras Tour Is the First Tour to Gross Over $1 Billion, Pollstar Says," APNews.com, December 8, 2023.


Saturday, September 30, 2023

Exposing Yale’s Sordid Side: “The Inner Ring” by C. S. Lewis

C. S. Lewis aptly describes in one published lecture the nature of a very human game, which is really about how soft power, often buttressed by institutional position, works in any human organization. To use Nietzsche’s expression (which Lewis would hardly have appreciated), the dynamics of an inner ring are human, all too human, and thus hardly extricable from the human condition. Yet the phenomenon is much more salient, and arguably even dysfunctional, in some organizations than in others, especially those with an elite reputation such as Yale. The essence of such an organization, I investigate here, might be exclusion even within the university community, such that some vulnerable members are told they are not really members (but that their donations are welcome).

In my essay “Yale’s Original Sin,” I describe Yale’s culture of inner-exclusion operating within the university, wherein some insiders are relegated by inner-insiders to being outsiders. During my stay as an alumnus doing research for a book I was writing, I was astonished to read emails from non-academic employees in which they bluntly stated that I was not a “member of the Yale community” because I was an alum. Unfortunately, and quite tellingly, those explicit statements were just the tip of the iceberg. Much more common, in more senses than one ironically, were the subtle, intentional hints given by some faculty, faculty-administrators, and even non-academic employees that I was not worth their time, whether in replying to an email message or in conversation. This extended to a faculty culture averse to allowing alumni (and other scholars, as a courtesy) to audit courses, and to clerical employees not recognizing alumni in residence for a term as members of the Yale community. The same self-serving, arrogant, and deeply mistaken attitude showed up, counter-productively, in charging alumni in residence $4 more than students, faculty, and the non-academic employees themselves for lunch at the university lunch hall known as Commons. A common mentality, to be sure.

In his lecture entitled “The Inner Ring,” C. S. Lewis describes the ubiquitous phenomenon that he calls the inner ring of an organization. “I can assure you,” he tells his audience, “that in whatever hospital, inn of court, diocese, school, business, or college you arrive . . . , you will find the Rings—what Tolstoy calls the second or unwritten systems.” In War and Peace, Tolstoy alludes to such an informal yet firmly hierarchical or concentric system: “(S)ide by side with the system of discipline and subordination which were laid down in the Army Regulations, there existed a different and more real system—the system which compelled a tightly laced general with a purple face to wait respectfully for his turn while a mere captain like Prince Andrey chatted with a mere second lieutenant like Boris.” The general was not royalty, and so he deferred to the prince even though the latter held a lower military rank. The general was thus an outsider in the immediate context of the prince’s conversation, even though he was very much an insider among the military brass.

We mere humans loathe being relegated to the status of outsiders; we very much want to be insiders. This is C. S. Lewis’ main point. “My main purpose in this address is simply to convince you that this desire is one of the great permanent mainsprings of human action.” Specifically, he means here “the desire to be inside the local Ring and the terror of being left outside.” This desire and fear can be distinguished from the desire for personal gain and the fear of homelessness from financial ruin. “And you will be drawn in, if you are drawn in, not by desire for gain or ease, but simply because at that moment, when the cup was so near your lips, you cannot bear to be thrust back again into the cold outer world. It would be so terrible to see the other man’s face—that genial, confidential, delightfully sophisticated face—turn suddenly cold and contemptuous, to know that you had been tried for the Inner Ring and rejected.” In other words, wanting to feel oneself an insider and wanting to avoid feeling like an outsider are desires that do not necessarily line up with, or reduce to, the desire for political or economic gain.

As with any desire, the desire to be an insider cannot be permanently satiated once achieved. C. S. Lewis wrote, “As long as you are governed by that desire you will never get what you want. You are trying to peel an onion: if you succeed there will be nothing left. . . . Once the first novelty is worn off, the members of this circle will be no more interesting than your old friends.” Or perhaps a ring within that ring will emerge, and you will have a new impediment to feeling like an insider. Even if that status is achieved, you would still suffer from the fear that you could become an outsider, for the grounds for relegating you are informal in this secondary system, and thus secretive and hardly subject to the moral principle of fairness. C. S. Lewis goes so far as to declare, “Until you conquer the fear of being an outsider, an outsider you will remain.”

The combination of secrecy and informality in an inner ring, or circle, renders unfairness from personal like and dislike especially likely. An official hierarchy, in contrast, ideally operates on the basis of merit, with avenues for appeals. Money in the form of bribes, as well as political power, can less ideally come into play in formal hierarchies. So too can friendships. But these informal means of promotion and demotion are more the currency of inclusion and exclusion in informal hierarchies such as C. S. Lewis describes. To be rejected for lack of merit is, I submit, easier to take than to be rejected by unfair means or for unfair reasons. The latter is evinced when the decision-makers are hidden from view and thus appeals to them cannot be made. This passage from C. S. Lewis describes the subtle mechanics of an inner ring very well:

“You are never formally and explicitly admitted by anyone. You discover gradually, in almost indefinable ways, that it exists and that you are outside it; and then later, perhaps, that you are inside it. . . . It is not easy, even at a given moment, to say who is inside and who is outside. Some people are obviously in and some are obviously out, but there are always several on the borderline. . . . There are no formal admissions or expulsions. People think they are in it after they have in fact been pushed out of it, or before they have been allowed in: this provides great amusement for those who are really inside. It has no fixed name.”

The subtle messages in the rude behavior from Yale faculty, academic administrators, non-academic employees, and even some students that I describe above and in “Yale’s Original Sin” are the means by which ill-favored Yalies gradually discover that they have already been rendered outsiders. That the realization of having been excluded can occur only gradually opens the outsider up to embarrassment, for the insiders relish watching, as if a blindfolded person were stumbling over furniture. The behavior could be regarded superficially as mere rudeness, so it can be difficult, if one is on the receiving end, to detect that one is being handed one’s hat on the way out. A person may just stand there, holding the hat, wondering why someone felt the need to deliver it, even though the intended message was: you are no longer welcome here, but I can’t kick you out of the building. This is precisely the message that people in Yale’s inner rings (and there is more than one) want to send. Bottom line: such people refuse to tolerate even the very presence of a person they don’t like, including a person who holds a contrary opinion. The motive, in other words, goes beyond wanting to make sophomoric statements of superiority; the intention is also to convey to others that they are outsiders. Whereas gorillas establish superiority and push certain individuals out by physical means, our species is not so forthright and honest (or brave).

In the movie Contact (1997), Haddon, a millionaire, says to Ellie, a young astrophysicist who wants to be chosen to go on a space mission, “The powers that be have been very busy lately, falling over themselves to position themselves for the game of the millennium. Maybe I can help deal you back in.” By this he is referring to being dealt cards in a card game. Ellie takes the hint and replies, “I didn’t realize that I was out,” to which Haddon says, “Maybe not out, but certainly being handed your hat.”  Ellie has no idea that even her boss, in jockeying for position to be chosen as the astronaut, has been working to see to it that Ellie is eliminated from consideration by the inner ring of which the boss is an insider but Ellie is not. She has no access to that circle of power-elites, so she doesn’t even know that she needs to promote or defend herself, or even appeal. From outside the inner ring, its workings are shrouded with mystery, for outsiders are not privy to the phone calls and other conversations that take place within. As C.S. Lewis wrote, “There are no formal admissions or expulsions. People think they are in it after they have in fact been pushed out of it, or before they have been allowed in: this provides great amusement for those who are really inside. It has no fixed name.” Ellie’s boss gets pleasure from dealing her out, especially because this is being done without her knowledge. C. S. Lewis wrote, “It is not easy, even at a given moment, to say who is inside and who is outside. Some people are obviously in and some are obviously out, but there are always several on the borderline.”

Interestingly, C. S. Lewis must have known that the question was being asked whether the phenomenon of the inner ring, even when it manifests in a seminary, is evil. If, as I strongly suspect, exclusionary comments and actions are deliberately done at least in part to emotionally hurt other people, even just out of dislike, the question of whether such insiders are de facto evil is relevant. C. S. Lewis focuses his answer at the level of the ring, but with implications for its inhabitants. “I am not going to say that the existence of Inner Rings is an Evil. It is certainly unavoidable. There must be confidential discussions: and it is not only a bad thing, it is (in itself) a good thing, that personal friendship should grow up between those who work together.” But this is just one side of the coin, or ring. Lewis admits that the “genuine Inner Ring exists for exclusion. There’d be no fun if there were no outsiders. The invisible line would have no meaning unless most people were on the wrong side of it. Exclusion is no accident; it is the essence.” These last two sentences may aptly describe the dark side of Yale. This is not to say that the essence of Yale is exclusively exclusion, for that would imply that no other source of worth, in this case academic, exists in the organization. Even so, exclusion is an excessive, or hypertrophic, instinctual urge in many people there, especially those who work there.

Lewis claims that the anguish of being reckoned an outsider is a strong motivating force behind the human desire to be counted as an insider. But if a group, or its inner ring, is filled with rude, petty elitists, wouldn't a normal person feel some solace and even self-esteem in being an outsider? I suppose whether this is one's own choice or that of the "members" of a ring makes a difference here. Nevertheless, Nietzsche wrote that the healthy should not visit the sick in hospital lest the healthy catch something. In Christianity, Paul warns about hanging out with fools. Depending on the group, a person might very well relish being an outsider, even if not by choice. Some rings have bad odors.

A question posed by C. S. Lewis seems relevant: “I must not ask whether you have derived actual pleasure from the loneliness and humiliation of the outsiders after you, yourself were in: whether you have talked to fellow members of the Ring in the presence of outsiders simply in order that the outsiders might envy; whether the means whereby, in your days of probation, you propitiated the Inner Ring, were always wholly admirable. I will ask only one question—and it is, of course, a rhetorical question which expects no answer. In the whole of your life as you now remember it, has the desire to be on the right side of that invisible line ever prompted you to any act or word on which, in the cold small hours of a wakeful night, you can look back with satisfaction? If so, your case is more fortunate than most.” From this, I surmise that Lewis reckoned that such people are bad, and even malicious, but not evil, because what he was describing was human nature itself.

In Augustine’s theology, we are all subject to original sin. Proverbially, we are all sons and daughters of Adam and Eve. Evil, it seems to me, cannot simply be human nature itself, but, rather, an extreme in enjoying human suffering. But even this definition is problematic, for sociopathy is a psychological illness rather than a religious phenomenon, and evil is a distinctly religious term. I think the problem is psychological: exclusion is allowed to fill an inner void until it becomes the essence of an organization. Taken to the extreme, exclusion as the substance or raison d’être of an organization, and thus its very essence, snuffs out other possible substances and thus must ultimately collapse. Relatedly, M. Scott Peck writes in People of the Lie that a sense of inner emptiness lies at the core of malignant narcissism. Perhaps that is responsible for the dysfunctional organizational culture of inner-exclusion that has plagued Yale.

Thursday, August 17, 2023

Walmart: Encroaching on Employees' Private Lives

In 2023, Walmart relaxed its policy requiring anyone applying for a job at the company to take a drug test, including for marijuana, which at the time was legal in several U.S. member states. Once hired, however, employees were still subject to random testing. An employee in a member state in which the drug is legal could thus be fired even if the person was never affected by the drug while working. I contend that the practice is unfair, unethical, and an over-reach in terms of the nature of a labor contract.

The ethical principle of fairness is violated because both marijuana and alcohol can impair the brain, and yet the company tests for only one of them, even where both drugs are legal. An argument can be made that an alcoholic personality is less than suitable for the workplace, and yet it is the use of marijuana outside of work (with no impact during work hours) that is treated as reason enough for an employee to be fired. Whereas alcohol can induce hostility and even aggression, marijuana has a calming effect, something that could actually help busy cashiers.

Besides being unfair, the policy of even random tests for marijuana is invasive, beyond the legitimate scope of an employer’s reach—assuming that the employee using marijuana is never “high” at work. In selling one’s labor, an employee does not agree to a company’s management being able to control the employee’s legal activities outside of work if those activities do not affect the employee’s work. Sam Walton, the founder of Walmart, was against marijuana; for him to impose his ideological opposition on others where the drug is legal was over-reaching and impious; he was not a god. An argument can also be made that it is none of the company’s business, literally and figuratively, whether an employee uses the drug even where it is illegal, again as long as the employee is not “high” at work. Law enforcement is the job of the police, not of a company’s managers. Of course, an employer may decline to retain an employee who is convicted of a crime. Walmart, for its part, hires people who have criminal records, which shows just how nonsensical the policy of random testing for marijuana is (especially as more and more U.S. member states legalize recreational use of the drug). In terms of a contract between an employer and an employee, an employer who presumes to dictate an employee’s recreational activities imposes a cost on the employee that is not offset by the monetary compensation.

Imagine what would happen if a labor union informed a company’s management that an abrasive supervisor must be subject to drug and alcohol tests and fired for any positive result, or else the employees would strike. Suppose too that the supervisor does indeed have a problem with alcohol, but is not under its influence while at work. Still, the union insists that the company fire that person. Suddenly, the company’s management would object with a mighty roar: How dare employees tell us what we cannot do on our days off! The nerve! Well, it goes both ways, folks. The attitude is the same: the unethical vice of invasiveness (in people’s personal, not work-related, lives) is noxious and may even point to a toxic organizational culture.

See: Walmart: Bad Management as Unethical


Wednesday, August 16, 2023

Getting the Seasons Wrong: Purblind Meteorologists

You may think you know the answer to the question, “When is the autumn season?” But do you?  Watching the weather section of local news on television or the internet, you could be excused for getting the beginning date wrong because it is the meteorologist who has misled you. In itself, getting the exact day right is not a big deal; it is not as if the temperature can be expected to take a nose-dive on the first day of fall. The astonishing thing is that so many meteorologists either knowingly or out of ignorance present the astronomical beginning of the “autumn” quarter of the Earth’s orbit as the meteorological start of fall, for the two are different yet admittedly related.

“According to The Old Farmer’s Almanac, meteorological seasons are based on the temperature cycle in a calendar year.”[1] The first month of a given season tends to resemble the preceding season, and the last month anticipates the upcoming season; a season comes fully into its own in its second month. Each season lasts three months. “Meteorological fall begins on September 1” in the Northern Hemisphere and ends exactly 90 days later, on November 30. Winter then gets its three months. Growing up in a northern Midwestern (U.S.) state, I just assumed that snowy March was part of winter. It sure felt like that. Only later, while living in the Southwest, did I realize that temperatures do start to go up in March.
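For concreteness, here is a minimal sketch (my own illustration, not from the Almanac) of the meteorological rule, which simply groups whole calendar months into three-month blocks in the Northern Hemisphere:

```python
# Minimal sketch of Northern Hemisphere meteorological seasons: whole calendar
# months grouped into three-month blocks (Dec-Feb winter, Mar-May spring,
# Jun-Aug summer, Sep-Nov fall), unlike the astronomical seasons, which start
# on the solstices and equinoxes.
import datetime

def meteorological_season(date: datetime.date) -> str:
    seasons = {12: "winter", 1: "winter", 2: "winter",
               3: "spring", 4: "spring", 5: "spring",
               6: "summer", 7: "summer", 8: "summer",
               9: "fall", 10: "fall", 11: "fall"}
    return seasons[date.month]

print(meteorological_season(datetime.date(2023, 9, 1)))    # fall, not Sept. 22 or 23
print(meteorological_season(datetime.date(2023, 12, 20)))  # winter, days before the solstice
```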

Distinct from the meteorological seasons, yet having an impact on them, the astronomical seasons are “based on the position of Earth in relation to the sun’s position.”[2] Such seasons exist because of the tilt of the Earth’s axis in relation to the sun. On the summer solstice—the astronomical beginning of “summer”—the sun’s perpendicular rays reach their farthest point north; the Northern Hemisphere tilts toward the sun. On the winter solstice—the astronomical beginning of “winter”—the rays reach their farthest point south, and the Southern Hemisphere tilts toward the sun. Again, while growing up in the northern Midwest, I knew that winter could not possibly start as late as a few days before Christmas, for the winter cold was well ensconced by that time. The meteorological start of winter, on December 1st, is much more accurate in terms of temperature.

It is odd, therefore, that the start dates of the astronomical seasons are “more commonly celebrated,” even to mark changes in weather.[3] A weather site or television broadcast using the astronomical start-date is inherently misleading, as the implication it conveys is incorrect. Even though Accuweather.com states, “Astronomical autumn officially arrives on Saturday, Sept. 23 at 2:50 a.m. EDT, a few weeks after the arrival of meteorological fall,” the presentation of the two starts by a weather organization may be confusing, especially as the paragraph continues with: “Regardless of which date you celebrate the start of autumn, . . .”[4] In an article on the fall weather forecast, two start-dates for that season are given. At least Accuweather.com distinguishes the astronomical from the meteorological; local meteorologists simply use the astronomical dates on charts of weekly weather forecasts.

On this weather chart, the Thursday (September 22nd) is labeled as "Fall." 

This is definitely misleading—or is it the case that local weather personalities do not realize their mistake? If the latter, then at the very least the television meteorologists astonishingly do not realize that they are giving false information—that they are misleading the public. It is absurd, at least in the northern tier of U.S. member states, to say that summer does not begin until June 21st and that winter does not begin until a few days before Christmas. To stick with something that is so obviously absurd and incorrect when meteorologists should know better is precisely the cognitive phenomenon that I want to highlight here.

Perhaps the culprit is cognitive dissonance: the brain holding two contradictory thoughts at the same time. I know this date is astronomical AND I am using it on a weather forecast AND I know that the meteorological date is different. This weakness, or vulnerability, of the human brain may mean that there are others.

Regarding religious and political ideological beliefs, the brain may be susceptible to “short-circuiting” an internal check that would otherwise keep it from conflating belief with knowledge. A person once told me that Michelle Obama is really a man. I disagreed. The person replied, “That’s just your opinion; I have the facts.” I said that I did not want to discuss politics. “It’s not political,” she replied. My claim to the contrary was, again, “just an opinion.” I was stunned by such ignorance that could not admit it might be wrong. Next, she wrote a nonsensical “deep state” political coded message on an index card. That her brain would not entertain the possibility that it could be in error is precisely the vulnerability that I contend plagues the brain as it ventures into the political and religious domains of cognition. In short, I suggest that a healthy human brain has more cognitive weaknesses than mere subjectivity, and that society is overwhelmingly oblivious to them.



[1] Amaya McDonald, “When and How to Watch the Perseid Meteor Shower,” CNN.com, August 11, 2023.
[2] Ibid.
[3] Ibid.
[4] Brian Lada, “AccuWeather’s 2023 US Fall Forecast,” Accuweather.com, July 26, 2023.


Tuesday, July 4, 2023

On the Decadence of American Journalism: Journalists as Celebrities

I submit that when a conveyer of the news becomes the story, something is wrong; in typing this sentence initially, I did not include I submit that. To state my thesis as if it were a fact of reason (Kant’s phrase) seemed to me rather heavy-handed (i.e., arrogant). Similarly, when some Americans insisted after the 2020 U.S. presidential election that Donald Trump had won, as if the asseveration were a fact of reason, I could sense aggressiveness along with the presumptuousness of treating one’s own opinion as a declaration of fact, especially when the actual fact—Joe Biden being sworn into the office—was otherwise. Opinion is one thing; fact is another. When a person misconstrues one’s own opinion as fact, something is wrong. I believe this happens so often that it may be due to a problem innate in the human brain. Religious folks would not have to reach far to point out that in the story of Adam and Eve in the Garden of Eden, the sin of pride manifests in wanting to be omniscient; eating of that proverbial apple of the knowledge of good and evil ushers in original sin. A person who perceives one’s own opinion as fact, or even as important as fact, implicitly regards oneself as God. A journalist who interlards the role of conveying the news with personal commentary, and an editor who then makes that commentary the point of a story, both treat a means (i.e., the conveyer of news) as an end (i.e., the news itself). I contend that at least by 2023, American journalism had fallen into this hole with impunity, which involved a lack of both industry self-regulation and individual self-discipline.

On July 4, 2023, The Huffington Post ran a story, “CNN Journalist Responds to Brazen Trump Campaign Claim with Disbelief.” The story begins with the following statement: “CNN’s Phil Mattingly on Monday couldn’t quite believe a Trump campaign response to a Washington Post question about the former president’s efforts to overturn the 2020 election result.”[1] Why should it matter whether a journalist can’t quite believe a statement made by a person being interviewed? CNN also ran the story, "Anderson Cooper Is Dumbfounded by Ron DeSantis' Bad Polling Excuse." The news network reported that the "CNN anchor was confused by the 2024 Republican presidential candidate's reason for falling behind Donald Trump." Why should it matter that the journalist was confused? Maybe he was not the brightest lightbulb. The network's message was obviously that DeSantis was to blame for the journalist's confusion, so the intent was to bias the viewers and readers against the presidential candidate. The inclusion of the word, excuse, in the story's title indicates the tenor of the bias in the "news" story.  

Is it ethical for a journalist to sway or bias the reactions of viewers or readers? Euronews, an E.U. rather than a U.S. news network, explicitly espouses impartiality so that viewers can form their own opinions unimpeded by those of a journalist. That network even has a feature in which video of events going on around the world, such as a political protest, is shown with “No Comment” displayed at the end. What a contrast to the American news media!

CNN’s obsession with the visible reactions of one of its news anchors, Anderson Cooper, to political statements even of people being interviewed illustrates my thesis. From his “news” show, Anderson Cooper 360, the network posted a video on CNN.com entitled “Watch Cooper’s Reaction to What Sondland Told Trump.” His reaction was visibly nothing spectacular. 

Another video had the title, “See Anderson Cooper’s Reaction to Ted Cruz ‘Groveling’ on Fox.” Again, the reaction was hardly noteworthy. 

Nevertheless, his show on the news network is all about the anchor, as the very name of the show makes explicit. CNN’s CEO must have thought that the anchor’s reactions would make good promotional material for the network. Strangely, a magazine’s editor even thought that Anderson Cooper’s reactions were of value apart from promotional purposes. People magazine posted a story online about the CNN anchor’s reactions in a New Year’s Eve broadcast from Times Square in New York City. The story, “Anderson Cooper Was All of Us with His Hilarious Reactions as He Took Shots to Bid 2020 Farewell,” featured the celebrity’s reaction to drinking a shot of alcohol on live television. After he and the other host drank a shot, “Cooper pursed his lips and coughed in seemingly slight discomfort, though he otherwise held it together.”[2] Hilarious.

It may be that Anderson Cooper’s minute reactions were such important fodder for publications because the anchor’s mother, Gloria Vanderbilt, was an heiress of an illustrious (and very rich) American family. She was a businesswoman, fashion designer, socialite, and writer in her own right. Interestingly, she had offered at the age of 85 to carry a baby for her gay son. That in itself was more newsworthy than any of her son’s muted political reactions on air. Of course, when Anderson Cooper came out as gay on air, that too was deemed newsworthy, in spite of the journalistic standard that the sexual orientation (or race, gender, or political ideology) of a conveyer of news should not have an impact on the presentation of the news.

When a journalist becomes the story, especially in expressing a personal opinion, news itself (and journalism) becomes obfuscated, diluted, and even toxic from the standpoint of the role of the electorate in a democracy. The societal justification for giving journalists an outsized mouthpiece in public discourse is predicated on their function of conveying the news; it does not extend to molding public opinion or to being the news themselves. In a culture in which reality shows spawn celebrities, perhaps it is only natural that anyone on television could be made into one, even for displaying muted visible reactions.

Wednesday, June 14, 2023

Starbucks: A Racist Company Against Racism

In June 2023, Starbucks faced a unanimous jury verdict in favor of a regional manager whom the company’s upper management had fired because she had resisted its racist policy of punishing innocent Caucasian managers for the sake of good public relations. The CEO had deemed such punishment needed and appropriate after a store manager legitimately called the police on two Black men in a Starbucks restaurant who presumed the right not only to sit in the restaurant without ordering anything (before Starbucks allowed this), but also to ignore the authority of the store’s manager. Starbucks caved to the unjust negative publicity, thus showing a lack of leadership, and went on to act unethically in wanting to show the world that the company could go after Caucasian employees. The racism is ironic, for several years earlier, Starbucks’ CEO had ordered employees at the store level to discuss racism with customers. Interestingly, the anti-racist ideology being preached was partial, and thus contained a blind spot in which racism of the sort the company’s upper management would exhibit is acceptable.

As the CEO of Starbucks, Howard Schultz had employees promote his political ideology on two social issues: gay marriage and race. Regarding the latter, he ordered employees, whom he artfully called partners, to write race messages on cups so that customers would unknowingly enable employees to impart Schultz’s position on the issue by raising the topic. I assume that the employees could not begin such conversations themselves. I have argued elsewhere that Schultz’s use of the employees for such a purpose was not only extrinsic to making coffee as per the employees’ job descriptions, but also unethical.[1] In terms of corporate governance alone, the shareholders, as the owners of the company, should have decided whether to have their company used to promote partisan positions on social issues. In 2023, Target and Budweiser would learn of the perils of straying from the knitting to get political on social issues. In terms of jurisprudence, the “right” of a company, a legal entity, to free speech is dubious, as abstract entities, even if legally recognized as such, are not human beings. Rather, the “free speech” claimed by companies is really that of the human beings who work for them. Using an abstract entity that itself cannot speak to gain additional publicity for one’s ideological views is unfair, because the amplified speakers are not elevated on any democratic basis. In short, why should Howard Schultz have access to a megaphone and employees to propagate his political ideology on social issues, when you and I have no such means of self-amplification? Whether we agree or disagree with the former CEO’s political ideology on race is not relevant to my point. To be sure, that his employees were told to speak against racism is, in my opinion, much better than had they been told to advocate racism against Black people. That Starbucks would then engage in racism is that much harder to understand, but perhaps the hypocrisy reflects a hidden negative aspect of Schultz’s ideology on race. American society could benefit from having that aspect uncovered; such a benefit vastly outweighs any benefit to business. Even in a pro-business culture, a lower good should not be put over a higher one, an error that the theological tradition calls disordered concupiscence.

In June 2023, a jury in New Jersey “found in favor of former Starbucks regional director Shannon Phillips, who sued the company for wrongfully firing her, claiming she was terminated for being White.”[2] The company’s position was that Phillips’s boss fired her because she had displayed weak leadership. The use of such vague jargon as leadership for what is actually management is itself problematic. Even if Phillips had “appeared overwhelmed and lacked awareness of how critical the situation had become,” as her boss presumably had written, that does not constitute weak leadership, for she was not in a leadership role;[3] instead, the company’s CEO should have gotten out in front of the issue and provided a vision for the company.[4] If Schultz was the CEO at the time, the failure of his leadership would be especially telling, considering his earlier foray into politics using the company to promote his ideology.

The triggering incident that had supposedly overwhelmed Phillips, according to her boss (who, the CEO at the time must agree in retrospect, failed as a supervisor, yet presumably was not fired), involved two Black men who had refused to leave a Starbucks store in 2018 even though they would not purchase anything. They were thus not customers, and the incident occurred before the company allowed non-purchasers to be in its stores. That the two Black men refused to leave the company’s private property means they were trespassing, so the store manager was on solid legal ground in having the local police remove the men from the store. Being Black, even if that race has been (and is) subject to racism generally, does not give a person the right to trespass on private property, and an effort to remove trespassers is not racist, for anyone trespassing would be legally subject to removal from the property.

I contend that Howard Schultz’s notion of racial reconciliation suffers from the weakness of being blind to the racial presumption displayed by the two Black men. In having employees talk to customers about the need not to be racist, Schultz was assuming that racism is something that non-Blacks do to Blacks. Employees were not told to suggest to Black customers that being Black does not give them special exemptions from the law or in society. Schultz could have had employees suggest to Black customers that jay-walking between intersections on a major street even as cars approach is not “a Black thing” that is justified because the race in general has been subject to discrimination. Furthermore, the use of the word nigga cannot be allowed only if the speaker is Black, for that would be a racist position. For a Black person who uses the word to become hostile or aggressive toward an Indian, Asian, or Caucasian who also uses the word is itself racist (and of course the hostility is unjustified unless the related word nigger is used in a hostile manner). The U.S. Constitution does not indicate that free speech depends on or is limited by race; such a clause would be prima facie racist.

Phillips’s complaint, which the jury accepted unanimously, states that following the arrest of the two Black men, Starbucks “took steps to punish White employees who had not been involved in the arrests, but who worked in and around the city of Philadelphia, in an effort to convince the community that it had properly responded to the incident.”[5] Phillips was ordered “to place a White employee on administrative leave as part of these efforts, due to alleged discriminatory conduct which Phillips said she knew was inaccurate. After Phillips tried to defend the employee, the company let her go.”[6] It does not sound as though Phillips was overwhelmed; in fact, she was being proactive and ethical in defending an employee from an unjust punishment. The implication is that the person who fired Phillips acted unethically.

Moreover, in being willing to sacrifice Caucasian employees based on their race for the sake of good public relations, the company’s upper managers were being racist. An unseen implication is that those managers believed that the public reaction against the company for having the two Black men removed from the Philadelphia store had some validity—that Black people should not be treated like that or that Black people deserve special treatment due to their race. But such a belief is itself racist. Schultz’s talking points for his employees to discuss with customers on race did not include mention of the racism in such beliefs. Moreover, he did not have the company’s employees talk about racism by Black people stemming from resentment. Any ideology is partial rather than whole, and even a claim of being against racism can fall short. In going after Caucasian employees, including Phillips, Starbucks’ upper managers fell short; the failure of leadership ultimately belongs to the CEO at the time. At least at the time of the trial, Howard Schultz was the CEO.