"(T)o say that the individual is culturally constituted has become a truism. . . . We assume, almost without question, that a self belongs to a specific cultural world much as it speaks a native language." James Clifford

Monday, October 21, 2024

On the Ethics of Marketing AI

The documentary Eternal You (2024) zeroes in on the use of AI to contact loved ones who have died. As the marketing departments of the tech companies providing these products would have it, AI can deliver on what religion has only promised: talking with people beyond the grave. Lest secular potential buyers be left out, AI can also provide us with “a new form of transcendence.” Never mind that transcendence, like divinity and evil, is an inherently religious word. Never mind, moreover, that the product is actually only a computer simulation of a person rather than the actual person direct from heaven or hell. The marketing is thus misleading. In the film, a woman asks her dead husband if he is in heaven. “I’m in hell with the other addicts,” he answers. She becomes hysterical. Even though the people who write the algorithms cannot be expected to anticipate every possible question that an AI could be asked and every response that it could give, government regulation keeping the marketing honest and accurate can significantly reduce the risk that stems from AI’s reliance on inference (induction) and probability, whose outputs lie beyond our ability to predict and even to understand.


The full essay is at "Eternal You."

Monday, October 14, 2024

October 12th: Happy Vikings Day!

I contend that the ideological war being waged in the United States by the 2010s over whether October 12th should be “Indigenous Peoples’” Day or Columbus Day became real in 2021, when President Biden issued a proclamation commemorating “Indigenous Peoples’ Day,” which, not coincidentally, falls on the same day as Columbus Day. Similarly, though only unofficially, the United American Indians of New England have labeled Thanksgiving Day “The National Day of Mourning” since 1970. The de facto hegemony of ideology in changing official U.S. holidays, including in the refusal of some people and even businesses to say “Christmas” even on Christmas Eve, has proceeded without any public debate over the premise that ideology should play such a role. Instead, the onslaught has been enabled by the vehemence of the conquerors in insisting that their decisions be recognized and not contradicted. I once went to a Unitarian “church” on a Thanksgiving expecting a spirit of gratefulness, as per President Lincoln’s proclamation establishing the date of the holiday after two years of brutal war between the CSA and the USA. The sermon was instead on the need for sorrow. I walked out, shaking my head in utter disbelief. Perhaps some Americans might one day insist that a similar mood be preached in churches on Christmas Day. Both the need and the insistence come with a tone of passive aggression, and are indeed power-grabs based in resentment, which Nietzsche argued is a major indication of weakness rather than of strength and self-confidence. Perhaps the manufactured dialectics, such as the one centered on October 12th, can be transcended, in a Hegelian rather than religious sense, at a higher level.

According to Britannica, Helge and Anne Ingstad discovered “the remains of a Viking encampment that they were able to date to the year 1000,” almost 500 years before Columbus’ landing on islands in the Bahamas (rather than on the mainland of North America).[1] The Graenlendinga Saga (Saga of the Greenlanders) has Bjarni Herjólfsson as the first European to see mainland North America, in 985. Around the year 1000 CE, Leif Eriksson, son of Erik the Red, “is reported to have led an expedition in search of the land sighted by Herjólfsson,” according to Eiríks Saga Rauda.[2] This is consistent with the empirical evidence found in Newfoundland. Leif Eriksson “found an icy barren land he called Helluland (‘Land of Flat Rocks’) before eventually travelling south and finding Vinland (‘Land of Wine’).”[3] Later, Leif’s brothers travelled to Vinland, where their expedition stayed for three years. This is certainly sufficient to refer to the holiday on October 12th as “Vikings’ Day,” or “Eriksson Day,” which would cover the brothers too. Columbus Day has been rendered antiquated by the discovery of Viking artifacts on the mainland of North America, in Newfoundland, which is much closer to where the Puritans settled than are the Caribbean islands, which of course are not on the mainland of North America.

To be sure, the peoples who came to be known by the Europeans as American Indians had come to North America thousands of years earlier, and thus were not indigenous either; they could be said to have been the first people to discover America, albeit from the vantage point of Asia rather than of Europe. But those people migrated gradually from east Asia over a land-bridge that then connected Alaska to Asia, rather than coming over after an expedition of discovery. In any case, the word “Indigenous” can be struck from “Indigenous Peoples’ Day” for greater accuracy.

In short, Vikings Day can safely be used for the holiday, even from the perspective of the hyperactive ideologies, as there is no evidence that the Vikings of the eleventh century mistreated any of the earlier arrivals from Asia. The dialectic of Columbus Day and American Indians Day can thus be transcended at a higher level of historical accuracy.

The role of ideology in making and remaking holidays in the U.S. can be seen in how the matter has played out at universities located in different member states. At Harvard, which is located in Massachusetts, classes and offices were closed on “Indigenous Peoples’ Day” in 2024, without any mention of “Columbus Day” in the academic calendar. The ideological preference is clear not only in which name the university used for the holiday (rather than using both names), but also in the fact that the university did not cancel classes for Veterans Day. Universities in the militaristic member-state of Arizona had classes on Columbus Day but not on Veterans Day. Whereas Harvard kept its libraries open on “Indigenous Peoples’ Day,” public universities in Arizona did not even do that for Veterans Day, in spite of the fact that students would obviously be studying on a one-day break from classes. Harvard is typically compared with Yale, yet not even Yale cancelled classes on the holiday (or on Veterans Day); instead, Yale, unlike Harvard, had a fall break of one week in October, which did not include Columbus Day.

In short, the respective university administrations, reflecting the political ideology that was most powerful locally, were making ideological statements in deciding whether and when not to hold classes on particular holidays. Harvard’s administration used the excuse that Cambridge, Massachusetts recognized “Indigenous Peoples’ Day” to justify recognizing that holiday and not Columbus Day, even though Columbus Day too was a national (and state) holiday. I would not be surprised if Americans of Italian ancestry felt a slap. Part of the problem with ideology-fueled resentment is that such collateral damage is ignored or even, in a twisted way, believed to be justified. Whether an ideology should be allowed to turn holidays into a battlefield is in dire need of debate in the public square in the United States, rather than being tacitly conceded on account of efforts to intimidate. “Thanksgiving IS a day of sorrow! You’d better not disagree!” Such has been the tone intended to thwart even debate on the matter.



1. Jeff Wallenfeldt, “Did the Vikings Discover America?” Britannica.com (accessed October 14, 2024).
2. Ibid.
3. Ibid.

Saturday, October 12, 2024

Starbucks Bucks Its Workers’ Labor Union

Even though more than 500 Starbucks shops had unionized by late 2024, it seems that the company’s management did not respect the new union very much. Unfortunately for the company, one implication that can be drawn is that the management didn’t respect federal labor law very much either. For in not respecting its union enough to negotiate with it over reducing employee work hours, the company violated federal law. The “smoking gun,” I submit, was that the management used dissimulation in responding to the government rather than addressing the complaint directly.

On October 10, 2024, the general counsel of the National Labor Relations Board “filed a complaint . . . alleging that Starbucks made the scheduling changes in late 2022 and early 2023” without consulting and negotiating with the union.[1] The complaint reads in part that Starbucks changed workers’ hours “without prior notice to the Union and without affording the Union an opportunity to bargain.”[2] As per federal labor law, Starbucks was required to give prior notice to the union and give it a chance to bargain, as well as to tell the union how the change in hours would impact the paychecks of the workers affected. In its written response, the company’s management ignored this requirement and instead defended a practice that was not against the law and thus was not at issue in the government’s complaint.

Starbucks stated, “We continuously review operations decisions to optimally address business needs and customer expectations, consistent with the law.”[3] Indeed, doing so does not in itself violate federal law, but the statement does not address the complaint. Next, the company tried to deflect the complaint, again without addressing anything illegal, by pointing out, “our decisions were made across our system, in unionized and non-unionized stores, and they were made without regard to organizing activity at Starbucks.”[4] Even if that were true, it does not address whether the management had informed the union and given it an opportunity to bargain in the cases of the unionized stores.

By not addressing the violations specified in the complaint, the company’s managers may have been dissimulating by changing the terms of the dispute, or they may have been trying to avoid the outright lie that denying the specific charges would have entailed. Either way, the mentality is sordid, and this in itself can be interpreted in line with the old adage, where there is smoke, there is fire. Where there is a devious mentality, there is likely to be a crime.

As a result of the violation, some employees lost their health-insurance benefit because they no longer worked enough hours per week. The union’s lawyer therefore said that damages could be more than merely the wages for the lost hours; a conservative estimate could be “north of $30 million.”[5] Lest this seem like enough of a disincentive for the management to begin to respect the union (and federal labor law), I submit that it is extremely difficult to change a company’s organizational culture.
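To see how an estimate of that magnitude could arise, here is a minimal back-of-the-envelope sketch. Every input below (affected workers per store, lost hours, wage, period, and insurance value) is a hypothetical figure chosen only for illustration, not a number from the complaint or the cited reporting; only the count of unionized stores echoes the figure cited above.

```python
# Hypothetical back-of-the-envelope estimate of potential damages.
# All inputs are illustrative assumptions, not figures from the NLRB
# complaint or the cited article.

unionized_stores = 500              # roughly the number of unionized shops cited above
affected_per_store = 10             # hypothetical baristas per store with cut hours
lost_hours_per_week = 4             # hypothetical average reduction in weekly hours
hourly_wage = 17.50                 # hypothetical average barista wage (USD)
weeks_affected = 90                 # hypothetical span from late 2022 onward
insurance_value_per_worker = 2_000  # hypothetical value of lost coverage per worker

lost_wages = (unionized_stores * affected_per_store
              * lost_hours_per_week * hourly_wage * weeks_affected)
lost_benefits = unionized_stores * affected_per_store * insurance_value_per_worker

total = lost_wages + lost_benefits
print(f"Lost wages:    ${lost_wages:,.0f}")    # about $31.5 million
print(f"Lost benefits: ${lost_benefits:,.0f}")  # about $10 million
print(f"Total:         ${total:,.0f}")          # on the order of $40 million
```

With those assumed inputs the wage component alone exceeds $30 million, which makes the lawyer’s “conservative” characterization easy to credit; different assumptions would of course move the total considerably.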

Today, I went to buy a product at a Target store. The shelf was empty, so I went to customer service, which the linguistically opportunistic management calls “guest” services. The employee was incorrect that I could not order the product online and have it delivered to the store; she was even wrong that the product was not in the back of the store. I went to a manager, who assured me that she would “coach” the employee. In a tone of “you’re not getting it,” I replied, “It was not just her mistakes; her mentality, her attitude, was terrible, and that can’t be coached away.” The manager didn’t say anything, but her facial expression was one of dismissiveness. Starbucks’ management at the corporate level needed more than coaching from the government.


1. Dave Jamieson, “Starbucks Could Owe Millions to Baristas Who Unionized,” The Huffington Post, October 11, 2024.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.

Friday, October 11, 2024

AI Facial-Recognition Software in China: Ethical Implications beyond Political Economy

By the 2020s, the Chinese government had made significant advances in applying computer technology to garden-variety surveillance. To do so, that government relied to a significant extent on Chinese companies, and this in turn encouraged innovation at those companies even for non-governmental applications. I contend that treating this as a case study in business and government without bringing in the ethical and political implications is a mistake. The ostensible “objectivity” of empirical social science may seem like a worthy objective for scholars, but I submit that bringing in political and ethical theory renders the analysis superior to what political economy alone can provide.

David Yang, who teaches economics at Harvard, spoke on a China panel on October 11, 2024 about why some Chinese companies were developing AI technology even though technological development generally tends to take place in democracies rather than in dictatorships. The reason for the exception, he said, is that the Chinese government had been buying facial-recognition software from companies in order to improve surveillance of the Chinese people. Ignoring the unsavory ethical implications of a more totalitarian surveillance, Yang characterized the relationship between the businesses and the government as a win-win: the government’s purchases give the companies the financial incentive and wherewithal to innovate AI for other, purely commercial purposes, and the government can more easily “restore order locally from social unrest.” Characterizing political protests as unrest could be said to come straight from an autocrat’s handbook, for it unfairly casts in a negative light what in a democracy is seen as healthy. Omitting the ethical implications from relations between business and government is thus partial in both senses of the word: incomplete and biased. Even though Yang tried to present the relationship between the Chinese companies working on facial recognition and the Chinese government buying the finished products objectively, his omission of the ethical dimension resulted in an incomplete explanation and a pro-autocratic bias. Even though such a bias could be said to be in sync with Harvard’s police-state, the hegemony of social order as the top value in political theory is problematic.

To be sure, social value could come from a government’s use of facial-recognition AI technology. Dave Davies, an American journalist who ventured inside China’s “surveillance state,” has argued that the Chinese Communist Party was “trying to internalize control. . . . Once you believe it’s true, it’s like you don’t even need the policeman at the corner anymore, because you’re becoming your own policeman.”[1] Once we shift from political protests to criminal activity, it is easier even for a democrat to see the value in prompting people to police themselves, so that a visible police-state apparatus in public would be less necessary. This is the antithesis of the “invisible man” question: What illegal thing would you do were you invisible, so that no one would see you and you wouldn’t get caught? Instead, we can ask: What wouldn’t you do, that you otherwise would do, if you thought the odds were high that you would be caught by police using facial-recognition AI technology?

But even with this internalization of control within an individual, which admittedly falls short of the individual no longer wanting to steal or injure someone, the loss of privacy in public can be reckoned as an ethical (i.e., undeserved) harm that outweighs the ethical benefit of internalized control. Ben Franklin, one of the founding fathers of the United States, famously said that people who would trade liberty for a little more security deserve neither liberty nor safety. Of course, liberty is severely repressed in a dictatorship, and such a government can freely reduce people’s privacy with impunity as it gets carried away with adding security measures. Ironically, such action by a pseudo-government can be observed at major universities in the States, including at Yale and Harvard, whose police departments do not have democratic legitimacy because the U.S. Constitution gives the police power to the state and federal governments rather than to even very wealthy private (“non-profit”) organizations.

Whether in the United States or China, a republic (of republics) or an autocracy (or dictatorship), the instinctual human urge for still more control can easily manifest in a slippery slope towards an excessively visible (and invisible), and thus impinging, police-state. The mind’s judgment concerning whether it has gone too far in this regard is susceptible to distorting or even suspending itself because of the allure of the pleasure of increasing control over a geographical area (or organization). In short, control is not easily internalized by the people in whose discretion security measures lie. It is ironic that the internalization of control comes more easily to the individuals being controlled. An external check on the minds of the controllers is thus strongly advisable, lest we all wake up one day in a world in which we are surrounded by manifestations of passive-aggression and even unaccountable police brutality by people drunk with power from having the legal right to use lethal weapons. Cameras in public places are certainly superior to an overwhelming visible police presence there (and at universities, whose atmospheres are at least in principle academic in nature), but even with the ethical and practical value of internalized control, the ethical costs in terms of the invasion of privacy should not be minimized or ignored outright. Achieving a policy that is balanced may not be easy, but I suspect it is best.


Saturday, August 24, 2024

Beyond Climate Change: Starbucks Awash in Cash

While it may be tempting to go after companies for hypocrisy on corporate social responsibility, an even deeper criticism may be closer to the bottom line, financially. Even though social media castigated Starbucks for its impact on carbon emissions in agreeing to fly its Southern Californian CEO, Brian Niccol, to Seattle on a company plane each week, I submit that the amount of spending entailed raises questions about cost-containment and even casts some doubt on whether the company’s price increases in 2024 were wholly justified, and thus on whether the industry was competitive or an oligopoly.

Before Niccol was to assume his role as CEO on September 9, 2024, Starbucks announced that he would “not be required to relocate to the company’s headquarters” during his employment with the company.[1] Because he would be expected to work at the Seattle office at least three days a week to comply with the company’s policy on hybrid working, he would be flying, on a company plane, a weekly distance greater than that between Berlin and Rome. Why could he not fly commercial (business class) and thereby save the company a lot of money? Is a CEO really above such flying?

I suspect that in the E.U. the answer would be more down-to-earth, or realistic, than in the U.S., where CEOs are more likely to be reckoned as akin to divine emperors. Whereas in Europe an aristocracy exists that can put the moneyed caste in its proper place, American CEOs reside at the top of the societal pyramid. Being consumed with thoughts of money is valued rather than looked down upon. This is not to say that inherited wealth is value-free and thus exempt from a different criticism. Rather, my point is that CEOs of American companies can get away with being treated like royalty on account of the relatively pro-business (or business-leaning) societal culture.

Rather than criticizing Starbucks for spending too much money on its CEO’s transportation, users of social media expressed anger over the company’s preachments on sustainability while its CEO was to be flown on a private plane weekly, burning thousands of liters of fuel into the atmosphere. On its website, the company claimed that it had “a bold aspiration to be a resource positive company.”[2] The CEO of Conservation International stated that the company was backing up its “commitments with immediate actions to reduce [its] footprint and invest in nature.”[3] The hypocrisy could easily have been obviated by having the CEO fly business class on a commercial airline.

It is not as if Niccol would be unable to afford the flights, as his annual salary was announced as $1.6 million, not including a possible performance-related bonus of up to $7.2 million and up to $23 million a year in company stock.[4] Of course, the company would no doubt cover the cost of its CEO’s commute, whether commercial or on a company plane, and such money, together with his compensation level, suggests that Starbucks had money to burn in 2024 even as it was increasing the prices of its drinks.

In 2023, the CEO-to-worker pay ratio in the United States had increased to 251:1, up 26% from 2022. Back in 1965, CEOs were paid on average just 21 times more than the median worker. In 2021, Chipotle, where Niccol had worked prior to becoming CEO of Starbucks, was at 2,998:1, the fifth-highest ratio in the United States. I suspect that he had rather high expectations in negotiating with Starbucks. That the company relented even as it felt the need to increase drink prices (presumably to keep afloat financially) is a point that the carbon-emission critics missed.
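For readers who want the arithmetic behind those figures, the following minimal sketch works only with the numbers already cited above; the computed values (the implied 2022 ratio and Niccol’s maximum annual package) are rough implications of those numbers, not separately reported figures.

```python
# Arithmetic on the compensation figures cited above.
ratio_2023 = 251            # CEO-to-worker pay ratio in 2023
increase_from_2022 = 0.26   # reported 26% increase over 2022

# Implied 2022 ratio: if the 2023 figure is 26% higher, divide by 1.26.
ratio_2022 = ratio_2023 / (1 + increase_from_2022)
print(f"Implied 2022 ratio: {ratio_2022:.0f}:1")  # roughly 199:1

# Niccol's maximum potential annual package, per the announcement cited above.
salary = 1.6e6      # base salary (USD)
bonus_max = 7.2e6   # maximum performance-related bonus
stock_max = 23e6    # maximum annual stock award
package_max = salary + bonus_max + stock_max
print(f"Maximum annual package: ${package_max / 1e6:.1f} million")  # $31.8 million
```

Against a package potentially approaching $32 million a year, the cost of weekly business-class commercial flights would be a rounding error, which only sharpens the cost-containment question raised above.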

Considering the rise in prices at restaurants and grocery stores since the pandemic of 2020, it is worthy of societal note that a company raising prices would have enough cash on hand to fly one person weekly on a company plane instead of having him fly commercial (and on his own dime!). That is to say, one might wonder how legitimate the rising prices of food (and drink) were even after the pandemic. In competitive markets, new entrants can offer lower prices and thus bring prices down generally in an industry, such that the incumbent companies cannot afford to be extravagant in their spending. Starbucks may simply have been raising prices because it could get away with it, and could thus afford to fly its CEO on a company plane weekly not only to the company’s headquarters, but also on regular visits to company stores and roasting facilities.


1. “Anger Boils Up over Starbucks CEO 1600km ‘Super Commute’ on Private Jet,” Euronews, August 23, 2024.
2. Ibid.
3. Ibid.
4. Ibid.

Sunday, June 16, 2024

On the Ethics of AI

In June 2024, at the international political meeting of the G7, a group of seven industrialized nations, the head of the Roman Catholic Church, Pope Francis, spoke on the ethical dimension of artificial intelligence, or machine learning. Regarding what the Pope called the “techno-human condition,” machines capable of AI are yet another manifestation of the human propensity, which our species has had since its inception, to use tools to mediate with the environment. Although tools can be thought of as extensions of our arms and legs, it is important to distinguish the human from the machine, even as we project human characteristics onto some advanced machines, such as computers. In the film 2001: A Space Odyssey, the computer HAL sounds human, and may even seem to have human motivations, but any such attributions come to an abrupt end when HAL is shut down. To say that HAL dies is to commit a basic category mistake. It would be absurd, for example, to claim that HAL has an afterlife. So too, I submit, is there a category mistake in taking the Pope’s talk on the ethics of AI as being religious in nature. Just as it is easy to imprint the human mind on a machine-learning computer, it can be tempting to superimpose the religious domain onto another domain. The Pope overreached in arbitrarily draping religious garb on what is actually an ethical matter in the “techno-human” world.


The full essay is at "The Pope on AI."

Tuesday, June 4, 2024

When Hollywood Gets Political: Partisan Profits

Entertainment celebrities and businesses alike risk losing customers, and thus revenue, by taking public positions on political issues. Fearing a surge from political parties on the far right, some large businesses in the E.U. took the unusual step of coming out against those parties, labeling them “extremist,” prior to the E.U. election in June 2024. Typically, businesses there limit their political stances to particular issues that bear on core functions. This is a prudent policy, for human beings, being of bounded rationality, can easily translate ideological disagreement into switching brands. Even universities can get bruised by becoming embroiled in a controversial domestic or international matter. Hence, after the contentious spring semester of pro-Palestine protests at Harvard (and many other universities), the university’s administration enacted a policy of not taking positions on issues that only indirectly touch, or do not affect at all, the university’s core functions. In creating a “marketplace” for academic freedom, universities are best positioned by themselves staying neutral. Although it is tempting for anyone (for oneself or one’s institution) who has access to media to sway public opinion on a political issue, I contend that the immediate self-gratification is usually outweighed by lost revenue and a reputation for being partisan. Applying strict scrutiny to one’s forays into controversial issues is harder to do if some vocal customers are demanding that a position be publicly taken. The silence of other customers, who would “vote with their purse or wallet” were an opposing position to be taken, should not be overlooked. The singer Taylor Swift and the actor Robert De Niro provide us with two illustrations. Stepping out of their respective domains comes at a cost in those domains, and thus should, I submit, be done prudently and seldom.

As Israel was bombing Rafah in Gaza in 2024, contravening two rulings of the International Court of Justice (i.e., the UN’s court), a significant number of “Swifties,” that is, fans of the singer Taylor Swift, pleaded on social media for the international celebrity to take a position against Israel’s aggression. One fan wrote, “Taylor, please say something. Your silence is hurting us. We need you to stand with Palestine and condemn the Israeli occupation and aggression.”[1] I submit that the alleged hurt was exaggerated by the teenager. I sincerely doubt that Taylor’s silence kept many Swifties from buying Swift’s recently released album. Had the singer taken a stand, on the other hand, her fans on the other side of the issue might have done more than block Swift on social media. That is to say, Swift’s financial bottom line would have been more impacted, and negatively so. It seems very improbable that increased purchases by Swifties in favor of Palestine would surpass the loss of revenue from Swifties on the other side “voting with their purses and wallets.” This lack of symmetry is behind my advice to celebrities not to take a position on a controversial political issue, or to do so knowing that a financial cost will come with the exercise of political influence.

To be sure, exercising political influence on a societal and even a world stage is tempting. As one Swiftie wrote on social media of Swift’s latent power, “if she can rally all of us to vote, she had the power to speak up about injustice.”[2] More bluntly stated, Taylor Swift had the power to significantly influence elections. The ideological benefit to her in doing so would not be trivial; my point is that in accruing such a benefit, she should know that it comes with a financial cost in terms of her core function. By 2024, she had made so much money that taking a position on Israel and Palestine, and thereby not earning as much as she otherwise could, could have made rational sense to her. Yet possible hits to her reputational capital could go beyond merely losing some customers of her music.

As Israel was bombing Gaza, former U.S. president Donald Trump was on trial for criminal fraud committed in furtherance of a political crime. Robert De Niro, a movie star, went to the courthouse and castigated Trump, calling him a monster.[3] As a result, the National Association of Broadcasters rescinded its Service to America Award, which the actor was to accept in just days. A spokesperson for the organization explained that it “is proudly bipartisan, uniting those from across the political spectrum to celebrate the impactful work of local broadcasters and our partners.”[4] De Niro would be a “distraction.”[5] Hence he was disinvited from even attending the event. De Niro took the high road and wished the organization well. For him, the loss of the award, and even any loss at the box office should Trump supporters “vote with their purses and wallets,” was worth it. Like Swift, De Niro no doubt had plenty of money and great star-power; he could take some of it out for a spin, like taking a new car out for a fast drive, without fear of ending up in the poorhouse. Even so, whether the hit to his personal “brand” was worth the financial and reputational cost is a question worth asking. Perhaps the answer is yes only if his public condemnation of Trump would end up making a difference in the election that was still half a year away. To De Niro, the answer could have been yes even if it did not, because of the psychological reward he felt from standing up for something important to him. Even so, rationally it would still be wise to keep an eye on the brand.

In short, it is human, all too human, to want to have political influence on a societal or even a global scale, and to enjoy the psychological pleasure that goes with the expenditure, even though it could mean fewer sales than would otherwise be the case and a hit to one’s reputational capital, or brand. Generally speaking, though, such immediate gratification is usually not worth the long-term costs, both tangible and intangible. Balancing the immediate with the long-term is not something that we humans are particularly good at, and natural selection in the process of evolution is to blame. The time-value of money, an economic concept, stems from the human preference for instant gratification. It is for this reason that I contend that celebrities should as a rule stick to their core functions (“stick to the knitting,” in the words of the business book In Search of Excellence) and branch out to “cash in” their influence on a political matter only rarely, if at all. Taylor’s silence wasn’t actually hurting anyone; she was being an astute businesswoman and thus acting in her best interest.
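As a brief aside on the concept just mentioned, the standard discounting formula makes the point concrete; here the discount rate stands in, loosely, for the strength of the preference for the present. The formula is the textbook one, not anything specific to the cases discussed above.

```latex
% Present value of a future payoff: the standard discounting formula.
% PV = present value, FV = value received t periods in the future,
% r = discount rate (a larger r encodes a stronger preference for the present).
\[
  PV \;=\; \frac{FV}{(1+r)^{t}}
\]
% Example: at r = 0.10, a payoff of 100 due in 3 years is worth
% 100 / (1.10)^3 \approx 75.13 today.
```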


1. David Mouriquand, “#SwiftiesForPalestine: Taylor Swift Urged to Speak Up on Gaza Conflict,” Euronews.com, May 29, 2024 (accessed June 3, 2024).
2. Ibid.
3. Dylan Donnelly, “Robert De Niro Has Award Withdrawn after Calling Donald Trump ‘Monster’ Outside Trial,” Sky News, June 2, 2024 (accessed June 3, 2024).
4. Ibid.
5. Ibid.