"The greatness and the genuine trait of your thought and writings lie on the fact that you positively and interestingly make use of philosophical thoughts and thoughtfulness in order to deeply and concretely cogitate about America's social issues. . . . This does not mean that your thought is reducible to your era: your thought, being inspired by issues characterizing your era . . . , overcomes your era and will still likely be up to date even after your era, for future generations." Bruno Valentin

Thursday, December 8, 2016

The Golden Age of Innovation Refuted

“By all appearances, we’re in a golden age of innovation. Every month sees new advances in artificial intelligence, gene therapy, robotics, and software apps. Research and development as a share of gross domestic product [of the U.S.] is near an all-time high. There are more scientists and engineers in the U.S. than ever before. None of this has translated into meaningful advances in Americans’ standard of living.”[1] The question I address here is why.
For one thing, big ideas are followed by a lag before they translate into gains in the productivity of labor and capital. Breakthroughs in electricity, aviation, and antibiotics did not reach their maximum impact until the 1950s, when “total factor productivity” grew at 3.4% a year.[2] In contrast, the figure for the first half of the 2010s was a paltry 0.5% a year. “Outside of personal technology, improvements in everyday life” were “incremental, not revolutionary,” during this period.[3] Houses, appliances, and cars looked much the same as they had twenty years earlier. Airplanes flew no faster than in the 1960s. This “innovation slump” is, according to the Wall Street Journal, “a key reason the American standards of living have stagnated since 2000.”[4] What had been revolutionary breakthroughs in the last century were still carrying the day into the next by means of incremental product improvements. It could take a few decades before the fruitful research in AI, gene therapy, robotics, and software apps reaches marketability and thus can raise productivity and radically alter daily life.
To be sure, the computer-tech revolution had altered daily life even by 2010; the smartphone is a case in point. Yet personal computers go back to the 1970s, so even in this respect the marketable innovations of Steve Jobs and Bill Gates can be viewed as incremental rather than revolutionary in nature. Software apps are themselves responsible for incremental changes built on the smartphone, which in turn is built on the personal computer. Even amid all the high-tech glitter, the developed world in 2016 stood like a surfer waiting between two giant waves for the next one to hit.
Artificial intelligence, gene therapy, robotics, and software apps were poised to give rise to the next wave—first one of revolutionary change and then, after a lag, another wave—one of raised living standards. Unfortunately, revolutionary innovation “comes through trial and error, but society has grown less tolerant of risk.”[5] Furthermore, regulations “have raised the bar for commercializing new ideas.”[6] Lastly, “a trend toward industry concentration may have made it harder for upstart innovators to gain a toehold.”[7] In other words, the concentration of private capital may forestall the switch from continued incrementalism to revolutionary change. Breaking up oligopolies as a matter of public policy thus has more to it than merely preventing monopoly.
To be sure, companies and entrepreneurs in 2016 were “making high-risk bets on cars, space travel, and drones.”[8] Add in advances in AI and medical research, which could make death no longer inevitable for human beings, and some serious changes in daily life and productivity can be predicted. Yet, as potentially momentous as these efforts at innovation are, decades could separate the inventions from their impacts on daily life and productivity. I suspect the world in 2016 was truly between two waves: the first of electricity, the telephone, sound recording, the automobile, the computer, and the airplane; the second of space exploration, medical research, and AI/computer technology. Colonizing Mars, rendering death no longer inevitable, and combining AI, robotics, and software apps could result in a wave that would dwarf that of the previous century. The advent of smartphones, which have all the glitter of a revolutionary product yet are an adaptation of the personal computer, whets appetites for the “big one” coming up on the horizon. Indeed, we have little time to waste; that wave could bring with it technology capable of arresting and even reversing the accumulation of CO2 and methane in the Earth’s atmosphere. Being able to save this planet just as we become able to colonize another parallels another interesting dynamic: making death something less than inevitable, through stem-cell research on organ replacement, gene therapy, and advances in curing diseases, just as we make the Earth habitable for our species for the indefinite future. Meanwhile, we are like children who have outgrown their clothes, still playing with adaptations of twentieth-century toys.

[1] Greg Ip, “Economic Drag: Few Big Ideas,” The Wall Street Journal, December 7, 2016.
[2] Ibid.
[3] Ibid.
[4] Ibid.
[5] Ibid.
[6] Ibid.
[7] Ibid.
[8] Ibid.

Wednesday, November 9, 2016

Societal Norms Understating Unethical Corporate Cultures: The Case of Wells Fargo

The case of Wells Fargo suggests that even when a massive scandal is revealed to the general public, the moral depravity of a company’s culture is skirted rather than fully perceived. Wells Fargo was fined a total of $185 million by regulatory agencies, including the Consumer Financial Protection Bureau, which had accused the bank of creating as many as 1.5 million deposit accounts and 565,000 credit-card accounts for which consumers never asked. The bank fired 5,300 employees over the course of about five years after it was revealed that those employees had opened the accounts and credit cards.[1] Wells Fargo’s CEO at the time, John Stumpf, “opted” for a cushy early retirement after an abysmal performance before a U.S. Senate committee; he walked away from the bank with around $130 million,[2] and none of the other members of senior management were fired or “retired,” obliterating any societal hope that any of the senior managers would be held accountable. This result is particularly troubling, given the extent to which that management had turned the bank into an ethically compromised organization.

"There is a serious problem with senior management at Wells Fargo," U.S. Senator Elizabeth Warren told CNBC in September 2016.[3] "You can't have a scandal of this size and not have some senior management who are personally responsible," she said.[4] With so many sham accounts and fake credit-card applications, the problem must have gone beyond particular executives giving orders. As a former bank employee told me, “When we went to work there, we knew we were selling our souls to the devil.” To be sure, being willing to be hired anyway is a choice worthy of blame. We can nonetheless be struck by the unexpected banality of the bank’s culture. It is truly remarkable both that a well-established institution could have such a sordid culture out of public view, and that senior managers could be fine with such “shared understandings” within the bank.

Besides the over-charged customers, aggrieved Wells Fargo workers—"people who say they were fired or demoted for staying honest and falling short of sales goals they say were unrealistic"—bore the brunt of the unethical senior and middle management.[5] For example, Yesenia Guitron, "a former banker, sued Wells Fargo in 2010—three years earlier than the bank has admitted it knew about the sham accounts . . . Intense sales pressure and unrealistic quotas drove employees to falsify documents and game the system to meet their sales goals, she wrote in her legal filing.” She “said she did everything the company had taught employees to do to report such misconduct internally. She told her manager about her concerns. She called Wells Fargo’s ethics hotline. When those steps yielded no results, she went up the chain, contacting a human resources representative and the bank’s regional manager. Wells Fargo’s response? After months of what Ms. Guitron described as retaliatory harassment, she was called into a meeting and told she was being fired for insubordination.”[6] Clearly, she had not gotten the memo on the requirement of selling her soul to the devil.

A memo to the rest of us could inform us that our designated watchdogs in the media do not go far enough in uncovering just how bad things are in companies run by unethical people. The extent of their moral depravity, and thus of the depravity of the organizational culture, is not reaching us. As a result, we cannot push our elected representatives hard enough, corporate lobbying notwithstanding, to enact legislation sufficient to meet such challenging cases. We suppose, for instance, that replacing a CEO can be enough to usher in restorative measures at the company, in spite of the extent of the depravity.

1. Jon Marino, “Bove: Wells Fargo Will Make Retail Banks ‘Rethink’ Compensation,” CNBC.com, September 14, 2016.
2. Matt Egan, “Wells Fargo CEO Walks with $130 Million,” CNN Money, October 13, 2016.
3. Jon Marino, “Elizabeth Warren.”
4. Marino, “Elizabeth Warren.”
5. Stacy Cowley, “Wells Fargo Workers Claim Retaliation for Playing by the Rules,” The New York Times, September 26, 2016.
6. Cowley, “Wells Fargo Workers.”

Saturday, October 29, 2016

An Anti-Obesity, Anti-Poverty Philanthropist Joins PepsiCo’s Board: A Case of Reform from Within

In October 2016, Darren Walker, president of the Ford Foundation, became the newest member of PepsiCo’s board of directors. Whereas Walker worked for a more just and equitable society, Pepsi was making the bulk of its money by selling sugary drinks and fatty snacks, and there is a well-established link between obesity and economic inequality. Would he be working at cross-purposes? “There’s a risk that he will be viewed as inconsistent,” said Michael Edwards, a former Ford Foundation executive.[1] The company itself could also be viewed as inconsistent—lobbying against anti-obesity public-health legislation while putting Walker on the board of directors.

To be sure, the Ford Foundation had not funded organizations working to combat obesity or diabetes, so there does not seem to be a direct conflict of interest for Walker.[2] Yet he did acknowledge, “I know that my own credibility and the credibility of the Ford Foundation is tied to this decision. Those of us in philanthropy have to be discerning about the corporate boards we join, and be discriminating to ensure that our service on a board is aligned with our values.”[3]

So rather than there being a conflict of interest, the issue for Walker was whether he could act as a reformer from within. Even though he planned to bring the perspective of a social-justice organization and his own perspective “as someone who is deeply concerned about the welfare of people in poor and vulnerable communities,” he would still bear responsibility should PepsiCo’s board go in another direction.[4] He would, after all, be just one director, not chairman of the board. That the company had just pledged to further reduce the amount of sugar, fat, and salt in its products by 2025, however, suggests an appetite for accommodating Walker’s perspective. Additionally, Walker would not be responsible for the company’s past unethical lobbying against anti-obesity legislation, use of unethical suppliers of palm oil, and deceptive marketing, and the company had since taken steps to remedy these ethical problems.[5]

As in politics, the matter for Walker and the other board members would be whether together they could forge compromises taking into account both Walker’s vantage point and the legal and ethical fiduciary duty to act as faithful stewards of the stockholders’ capital. Reform from “the inside,” moreover, can be more productive than merely staying in the philanthropic sphere. In American politics, the analogue would be moving from the Green Party, for instance, to the Democratic Party so as to work toward reform that could actually manifest in legislation. Admittedly, idealism is tested in such a strategy, but consequentialism tells us that even 50% of 10 is more than 0% of 10.

1. David Gelles, “An Activist for the Poor Joins Pepsi’s Board. Is That Ethical?,” The New York Times, October 28, 2016.
2. Ibid.
3. Ibid.
4. Ibid.
5. Ibid.

Tuesday, September 27, 2016

Facebook’s Zuckerberg Donates $3 billion to Medical Science: Some Major Implications

Facebook’s CEO, Mark Zuckerberg, and his wife, Priscilla Chan, announced in September 2016 that they would invest more than $3 billion over the next decade to build tools that can facilitate medical research on diseases. The first outlay of funds ($600 million) would create a research lab of engineers and scientists from the San Francisco Bay Area’s major research universities.[1] “This focus on building tools suggests a road map for how we might go about curing, preventing and managing all diseases this century,” Zuckerberg said at the announcement.[2] Moreover, the couple had announced a year earlier that they would give away 99% of their wealth over their lifetimes through the Chan Zuckerberg Initiative in the areas of education and healthcare. I would like to point out a few implications that may not be readily apparent.

Firstly, such funds going to preventing and curing disease could bring nearer the day when—along with advances in anti-aging and stem-cell research—death is no longer inevitable for a human being. Even before the Zuckerberg-Chan announcement, some scientists were openly predicting that that day might come as early as the 2050s. To be sure, being able to grow replacement organs, apply anti-aging treatments to the body’s cells, and prevent major diseases (I suspect the common cold will still be around, just to keep us humble) does not guarantee that death will be put off; being hit by a train or bus, or jumping off a high building, could still mean death. Nevertheless, the notion that death can be put off indefinitely dwarfs the combined impact of all the twentieth century’s technological progress.

Considering the costs involved, rendering death no longer inevitable would doubtlessly raise ethical issues of access and distribution. Moreover, ethical questions would suddenly arise concerning the species’ increasing population and reproduction rights. Secondary issues such as climate change could become even more pressing. It could be, for example, that a drastically increasing human population outstrips the planet’s food capacity as well as the capacity of the atmosphere to absorb the species’ waste, including greenhouse gases. It would be highly ironic were the feat of removing the threat of death a major contributor to the extinction of the species. In short, the story could go as follows: we maximize our species’ size, which means success genetically, only for the increased numbers to cause extinction, because the climate is no longer hospitable to human habitation or because a lack of food triggers wars ending in nuclear war. The first alternative would be particularly likely.

Secondly, that the couple could give up 99% of their wealth over their lifetimes may imply that they will have earned too much money, if being able to use it is at all relevant. Put another way, being able to give away almost all of one’s total earnings may suggest that one (namely Zuckerberg) earned too much. Does it even make sense for someone to acquire money beyond any capacity for it to be spent, even through inheritance?

One implication is the question of whether Zuckerberg’s employees at Facebook should get a significant share of what Zuckerberg earns, whether in salary or stock. Why such a huge difference in compensation? To be sure, ownership has its privileges, but is there no limit? The fact that Zuckerberg, Bill Gates, and Warren Buffett could give vast sums of money to charity raises the question of whether founders and CEOs shouldn’t face some limit on wealth, with a progressive tax system kicking in for multi-billionaires. Were elected representatives to decide how such vast sums should be spent, the legitimacy of the power behind such decisions would be greater.

1. Deepa Seetharaman, “Zuckerberg Fund to Invest $3 Billion,” The Wall Street Journal, September 22, 2016.
2. Ibid.

Wednesday, August 26, 2015

Mass Shootings in the U.S.: Why Are Americans So Angry?

Even though the United States accounts for less than 5% of the world’s population, 31% of the mass killings worldwide between 1966 and 2012 occurred there.[1] I contend that a rise in passive aggression, and the related intolerance, accounts for much of the difference. In other words, it could be that Americans generally are getting nastier and angrier at each other.

Although the incident was not a mass shooting, Vester Flanagan shot two former co-workers in August 2015. While it is easy to dismiss the story by simply concluding that the guy was nuts, a closer examination reveals the situation to be more complicated. The nuances may help us understand what lies behind the mass-killing violence that goes beyond the killers themselves and is disproportionately an American phenomenon. In analyzing the Flanagan case, I want to stress that even if his co-workers had been at fault, the double murder was completely unjustified. My analysis is oriented to uncovering a hidden trend in American society rather than to apportioning blame for the shooting.

In a lawsuit against another network in 2000, Flanagan had claimed that a producer had called him a "monkey" and that he had been "made aware that other black employees ... had been called monkeys by officials affiliated with defendant." He also claimed that a Caucasian "official" had told him that "it busted her butt that blacks did not take advantage of the free money," referring to scholarship funds. Additionally, he insisted that a supervisor at the station had said that "blacks are lazy," and that another employee had told a black tape-operator to "stop talking ebonics." WTWC-TV acknowledged that an employee "may have made similar comments to another employee," but denied that such comments were "indicative of unlawful employment practices." The case ended in an undisclosed settlement.[2] The admission of race-oriented comments to another employee lends some credibility to Flanagan’s assertions.

Even so, Flanagan may have made his own contribution to the workplace tension. The news station denied that his termination was the result of discrimination. It instead cited "poor performance," budgetary reasons and "misbehavior with regards to co-workers."[3] The latter in particular resonates with what he wrote regarding the cameraman (Adam) and reporter (Alison) from his next station. After announcing that he filmed the shooting, he wrote, “Adam went to hr on me after working with me one time!!!”[4] Either Adam had overreacted or Flanagan’s treatment of co-workers was incredibly bad. Flanagan also wrote, “Alison made racist comments” to him, and that he had filed an EEOC report.[5] It could be a case of “white privilege,” or simply that Alison was racist (or that she took sides with Adam).

In any case, the shooting stemmed from anger in the workplace—people not getting along and lacking the social skills to work things out rather than make them worse. Adam’s quick trip to the station’s human resources department may indicate a lack of tolerance, as well as a tendency to escalate matters rather than patiently work them out. If Alison made demeaning racial statements to Flanagan, then her attitude may have been condescending and thus inherently conflictual. Of course, both Adam and Alison may simply have been reacting to extraordinarily bad treatment from Flanagan—his report to the EEOC being an effort to go on the offensive rather than admit that he had treated his co-workers very badly.

I suspect that at least part of the problem is societal: Americans may have been becoming more passive aggressive, and this anger in turn may be kicking the outright aggression up a notch in some people. The lack of tolerance for disagreement shows up in the ideological fragmentation of the American news networks, for example, with Fox News and MSNBC employees on the air brazenly displaying utter disdain for progressives and conservatives, respectively. Dismissiveness toward others, or in other words being “too cool to talk,” stemming from an abject lack of respect for others, may have been increasing at least in the Millennial Generation. As this sordid attitude becomes more acceptable as a social more in America, increasing anger and ensuing aggression can be expected. “Road rage” is a case in point: extreme hostility toward other people over ruffled feathers. Why are so many Americans angry? This may be part of the reason why the U.S. has a disproportionate number of mass killings, and I suspect that the same holds for workplace (and former-workplace) violence.

[1] Stav Ziv, “Study: Mass Shootings ‘Exceptionally American Problem’,” Newsweek, August 23, 2015.
[2] Dana Liebelson and Jessica Schulberg, “Shooting Suspect Sued Another Newsroom for Racism, Claimed He Was Called a Monkey,” The Huffington Post, August 26, 2015.
[3] Ibid.
[4] Ibid.
[5] Ibid.

Saturday, August 22, 2015

Humans As the Intense Predator: Unbalancing the Food-Chain Unsustainably

By 2015, humans—the species Homo sapiens in particular—had become “the dominant predator across many systems”; that is to say, the species had become an unsustainable "super predator."[1] We have had a huge impact on food webs and ecosystems around the globe.[2] Moreover, we have been using more of the planet's resources than we should. By August 2015, for example, humans had already consumed a year's worth of the world's resources.[3] In the case of fossil fuels, that consumption has contributed to the warming of the Earth’s atmosphere and oceans. Behind human consumption are human beings, so the astonishing increase in human population is a major factor. Having been incredibly successful genetically over the previous five hundred years, in virus-like fashion, the species’ self-maximizing tendency, both in population ecology and in profit-maximization, may be the seed of the species’ destruction, and thus of long-term genetic failure.

According to one study, humans are "particularly intense" when it comes to hunting, and have used powerful killing technology (trawl nets, guns, and mechanized slaughterhouses, for example) to dominate other predators.[4]

Large-scale fishing does not distinguish between fertile adults, weak fish, and the young. (James Watt: Getty Images)

With the efficiency (i.e., profitability) of large-scale fishing businesses, we remove fish at 14 times the rate of marine predators.[5] The research confirms what many scientists had been warning about for years: if we don't stop overfishing, we may soon run out of animals to catch. The study reports that many fish populations had already been hunted to the brink of collapse, shark populations had been decimated, and less than 8 percent of southern bluefin tuna remained.[6] On land, humans had been killing top carnivores, such as bears, wolves, and lions, at nine times their own self-predation rate.[7] By 2015, the food chain had become so terribly unbalanced as to be unsustainable as a whole.

Applying business-efficiency principles to hunting, we can capture adult prey at minimal cost, and so gain maximum short-term reward. The cost being minimized is measured both in business terms and in risk to members of the species. On the latter, Chris Darimont, an author of the study, points out that "advanced killing technology mostly excuses humans from the formerly dangerous act of predation."[8] Hunters "'capture' mammals with bullets, and fishes with hooks and nets. . . . [Humans] assume minimal risk compared with non-human predators, especially terrestrial carnivores, which are often injured while living what amounts to a dangerous lifestyle."[9] To be sure, working on the deck of a commercial fishing boat in the north Pacific is one of the most hazardous jobs around, but the fishing businesses can externalize at least some of that cost (e.g., through insurance).

Even so, by not applying principles from population ecology, the businesses engaged in hunting, fishing, and farming animals have been undermining their own efficiency, and thus profitability. The study claims that, besides the problematic sheer number of animals that humans kill for food (56 billion farmed animals were being slaughtered annually at the time), “(h)umans focus on adult prey, unlike other predators. A full-grown lion, for example, often opts for the smaller, weaker juvenile zebra rather than an adult. This distinction makes it harder for animal populations to recover as breeding members are removed.”[10] Presumably, recovering populations are in line with sustainable profitability.

Tom Reimchen, a co-author on the study, “uses a financial analogy to explain the damaging consequences of hitting adult populations hardest. He calls the adults the system's ‘reproductive capital’—the equivalent of the capital held in a bank account or a pension fund. And he says we are eating into this capital when we should really be living off the interest—the juveniles, which many species will produce in colossal numbers, expecting a good fraction to be doomed from the moment they are born via predation, starvation, disease, accidents and more.”[11] “We are dialing back the reproductive capacity of populations,” Darimont said.[12]
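Reimchen’s capital-versus-interest analogy can be made concrete with a toy age-structured model. The sketch below is purely illustrative: the parameters (fecundity, survival rates, harvest size) are my own assumptions, not figures from the study. Removing a fixed number of adults each year collapses the simulated population, while removing the same number of juveniles leaves it growing.

```python
# Toy two-stage population model (illustrative assumptions only):
# adults are the "reproductive capital," juveniles the "interest."

def simulate(years, harvest_adults, harvest_n=300.0):
    """Return the adult population after harvesting a fixed number
    of either adults or juveniles each year."""
    juveniles, adults = 1000.0, 1000.0
    FECUNDITY = 2.0       # offspring per adult per year (assumed)
    JUV_SURVIVAL = 0.2    # fraction of juveniles reaching adulthood (assumed)
    ADULT_SURVIVAL = 0.8  # annual adult survival rate (assumed)
    for _ in range(years):
        if harvest_adults:
            adults = max(adults - harvest_n, 0.0)        # eat the capital
        else:
            juveniles = max(juveniles - harvest_n, 0.0)  # live off the interest
        births = FECUNDITY * adults
        adults = ADULT_SURVIVAL * adults + JUV_SURVIVAL * juveniles
        juveniles = births
    return adults

# Harvesting adults drives the breeding stock to collapse; harvesting the
# same number of juveniles leaves a growing population.
print(simulate(20, harvest_adults=True))   # collapses to zero
print(simulate(20, harvest_adults=False))  # keeps growing
```

Under these assumed parameters the adult-harvest run collapses within a decade while the juvenile-harvest run keeps growing, which is the gist of Reimchen’s point: juveniles are produced “in colossal numbers,” so a population can absorb their loss far more easily than the loss of its breeders.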

The doubtless unintentional self-defeating strategies of the businesses mirror that of the species itself. The failure to be prudent about population growth is also self-defeating, because the ecosystems, including the Earth’s atmosphere and oceans, are breached beyond their capacity to sustain our species when that species acts as a maximizing variable rather than tending toward an equilibrium. In short, the wise human species—Homo sapiens—is not so wise after all.

1 Chris Darimont et al., “The Unique Ecology of Human Predators,” Science, Vol. 349, No. 6250 (2015), pp. 858-860.
2 Ibid.
3 Jonathan Amos, “Humans Are ‘Unique Super-Predator’,” BBC News, August 20, 2015.
4 Nick Visser, “Thanks, Humanity. Now We’re Unsustainable ‘Super Predators’,” The Huffington Post, August 21, 2015.
5 Amos, “Unique Super-Predator.”
6 Visser, “Thanks Humanity.”
7 Amos, “Unique Super-Predator.”
8 Ibid.
9 Ibid.
10 Visser, “Thanks Humanity.”
11 Amos, “Unique Super-Predator.”
12 Visser, “Thanks Humanity.”

Tuesday, August 4, 2015

The Natural Wealth Model of the Modern Corporation: A Basis for Sustainable Organization

Turning to the humanities to construct a sustainable organization based on ecological theory, this essay presents a theory of the firm that is at odds with the profit-maximization premise. Specifically, I draw on the notion of the natural wealth of the Golden Age as depicted by such ancient Western poets as Ovid and Hesiod, who assumed such wealth to be devoid of greed, and combine it with ecological theory to produce an alternative theory of the firm as a sustainable organization.