Friday, January 7, 2022

Giving Up to Go Forward


Dinosaur Hall, Fort Worth Museum of Science and History

“The problem is never how to get more innovative thoughts into your mind, but how to get the old ones out.”  

                                                      -        Dee Hock, founder, Visa, Inc.


“…you can’t truly hope to beat alcohol until you give up all hope of beating alcohol.  This necessary shift in outlook generally happens as a result of ‘hitting rock bottom,’ which is AA-speak for when things get so bad that you’re no longer able to fool yourself.”

-        Oliver Burkeman, Four Thousand Weeks: Time Management for Mortals (2021)


Dry January began in 2013 with Alcohol Change UK, both as a public health campaign and as a prompt to think about drinking addiction.  The first of the famous Alcoholics Anonymous 12 Steps reads “We admitted we were powerless over alcohol—that our lives had become unmanageable.”  Only when the addict can admit that drinking makes any decent life impossible can the reality of addiction be faced—so that steps can finally be taken to devise a life that has a chance of working. 

Giving up on what is unsustainable doesn’t just apply to destructive drinking.  It is the foundation of an approach to problem-solving that begins with the destruction of what is not working. 

For AA members, the “admission statement” is designed to initiate what psychotherapists call a second-order change: a shift in perception (sometimes sudden, sometimes gradual) that results in a totally new appreciation of a situation or problem.  The second-order perception then opens an entire range of solutions that were not even visible under the first-order viewpoint, and it is from this newfound range that entirely new solutions can emerge. 

This is why “idea extinction” is so important: it is the destructive act necessary to rid ourselves of the delusion that the old idea can somehow be forced to become effective.  Nothing short of total annihilation can move the mind forward and away from what has proven a failed idea.  This is the “rock bottom” that serves as an education about maladaptive thinking.

Creative problem-solving consultant Steve Grossman explains why, for better ideas to be born and nurtured, worse ideas must be put to rest: not gently, but terminally and for good.  Once this occurs, the old ways of thinking can be buried.  When the problem is then reopened fresh, the technique of reversing assumptions can be applied.  Assumption Reversal exposes the unconscious assumptions supporting old solutions—which can then be examined and discarded.  The outcome: prospecting for value in completely new territory, building from a reopened base—a redefined concept of the problem to be solved.  This refreshed way of looking at the situation can then expose new potentials the newly opened mind can take full advantage of. 

Like a light suddenly switched off, old ways of resolving the problem are extinguished.  This act creates the potential to turn products and services to unexpected and more successful uses and directions.  (See his article “Extinction: A Power Tool to Source New Ideas,” 2019.)  He explains: “In helping businesses solve difficult and persistent problems, I have discovered that it is not any lack of ideas that prevents even very bright people from finding solutions. Instead, one of the biggest roadblocks to new and creative solutions is not conjuring new ideas but in ridding the brain of those already embedded.”

Examples of second-order thinking are the hallmark of invention and innovation.  Computers were assumed to be for scientists only, working in labs, crunching numbers; total demand was projected at 100,000 machines worldwide, all for scientists.  But the computer as mass media in every home shows what happens when machines are programmed for words and images rather than calculation.  On the road, vans and off-road vehicles served commercial and sporting purposes only before the minivan.  In 1983 Lee Iacocca saw the potential for a family vehicle with cargo and passenger space, saving Chrysler and shaping car design into the future—and the minivan is now making a comeback in sales.  (He also noted that focus groups are not the pathway to new concepts.) 

In education, before World War II, college-bound students were scholars headed for academic careers in teaching and research, not the general public.  College is now an expected stop on an extended learning curve.  Amusement parks were sketchy places far from family life and middle-class taste; their redesign as theme parks left Disney the world’s top entertainment brand.  All of these major innovations in the information, transportation, education, and entertainment fields have transformed life, and all involved a major rethinking of purposes and possibilities.  Add the 3D printing of body parts, the space station, interstellar exploration, smartwatches, Alexa, and even discount brokers (as everyone becomes a stock-market investor) to that list.

Failed products and initiatives suffer from an “optimism bias,” the conviction that every megaproject, ad campaign, or invention has a good chance of making it in the marketplace.  In reality, few building megaprojects meet their objectives of cost or demand, and some 90% of all new products fail.  A short list of these projects includes Afghanistan (unmanageable starting with Britain’s invasion in 1839), the Segway (which didn’t replace walking), the Orkut social network (too early to reach critical mass), Disney’s America (the infrastructure politics went wild), the metric system in the US, Prohibition, Kmart, the 2004 Athens Olympics (leading to Greece’s debt trap), the Chunnel (an ongoing net loss to the UK, along with Denmark’s Great Belt tunnel), and Enron (too good to be true). 

These cases became caught up in a downward spiral of cost overruns and benefit shortfalls they couldn’t recover from.  A perennial problem in project management is failure to look deeply at assumptions at the front end—the far cheaper alternative to launching and then making expensive fixes as things fall apart.  Analyzing how people will actually use these things, and the conditions needed to make their use possible, is far more desirable at the fuzzy front end than putting out fires in a system doomed to fail.  Many times, proposals on the table need to be promptly and permanently sunk to save millions or billions, so that better-conceived and better-designed proposals can use that same funding to launch and succeed.   

Sunday, December 5, 2021

Speech and Unconscious Bias


“Humans have evolved over the millennia to care deeply about the way people talk—and to use this marker as a quick-and-dirty signal of who is us and who is them.”       

                                        --Katherine D. Kinzler, How You Say It (2020)

Human differences are a critical part of our humanity, a fact gaining recognition and traction at work, at school, and in organized behavior generally.  This is the goal of DEI—Diversity, Equity, and Inclusion—as the principle seeks to inform human relations across the board.

The DEI movement is picking up speed and power not only in multinational business, where it first took root because businesses like IBM had global workforces, but in professional organizations, schools, nonprofits, companies and working groups of all sizes in the US and its multinationals. 

While gender and race, the starting points of DEI, are well acknowledged, other points of difference, like mental health states and (coming soon) Covid status as a long-term health issue, are less well known as human factors.  As points of difference, these factors are seen to unfairly tip the balance of power between people, their interactions, and their opportunities.  But here is yet another aspect to be explored, and it has even more social clout: speech.

Speech differences between groups are far older than our concept of race difference.  While we can’t determine how far back spoken language extends in the human record, speech is one of the first dividers of groups.  Children are language discriminators who instinctively gravitate to speakers who sound like their parents.  Language categories are intrinsic to our ideas about “others,” especially since any language other than our own blocks communication to some degree.  Speech is perhaps the most basic marker of who we are and how we relate to others through our individual as well as group voice.  It is the ultimate social skill – yet its many variations create mental categories we can’t help noticing and reacting to.

Speech bias is so ingrained in our daily interactions that we don’t recognize it’s there – or that it is in fact a prejudice that rules our opinions of others.  But this is more than personal preference for one “brand” of speech over another.  It drives our justice system, our social circles, educational attainment, housing (where people can buy or rent), medical treatment, job applications, careers, and the attainment of life goals.  The quality of our speech, and its perception by others, shapes the very goals we feel it is possible to achieve. 

In the US, there is race—and there is language.  Not just languages other than American English, the US’s unofficial tongue (with Spanish as its unofficial second), but variations like accent and dialect.  This scope would include Black or African-American English (AAE), as well as the accented English of immigrant communities (as in Chicano English).  Supreme Court Justice Clarence Thomas grew up speaking Gullah, the English creole of coastal Georgia.  He clearly was able to do what many other professionals have done – “code-switch” between the home style for which he was ridiculed and the mainstream standard of the bar and bench. 

Black children’s ability to code-switch between African American English (Ebonics) and standard English determines their school test scores and thus their educational upward mobility.  It also sets the trajectory for “the places these kids might want to go, to learn, work, and live…you’re handicapping them by not teaching them the two codes,” says Julie Washington, language specialist at Georgia State.  Beyond test scores, a level playing field rests on language even more than race, and speech discrimination is ongoing (Atlantic, April 2018, p. 20). 

Katherine Kinzler’s recent book How You Say It (Why You Talk the Way You Do and What It Says About You) points to speech as more important than race in the social hierarchy.  How people speak, no matter how they look, determines how much they are trusted, where they can live and work, where they go to school, and how they are treated by the justice system.  Their future prospects, and those of their children, depend on these factors, which are driven by accent, word usage, dialect, and related variant patterns measured against mainstream-educated (many would say White) standards.

This is all very unconscious.  We are constantly exercising our preferences for some kinds of speech over others, starting with preferring our mother tongue over all others.  We assign greater status to whoever we think is a better speaker and to the groups that speak as they do.  Think of how socially advantaged a posh accent (English or Mid-Atlantic) can be—while also alienating as a class signifier—versus a “street speech” repertoire.  National-origin prejudice is covered by EEOC rules, but not the persistent accent associated with immigrant status.  Linguistic bias is everyone’s unconscious prejudice, and we rarely consider why we like some people more than others simply based on the way they handle language.  A whole field has opened up, signaled by Kinzler’s work, to answer this question.  Nonstandard and socially marginalized speech is part of what Kinzler calls linguistic discrimination, which “should be part of our national and judicial consciousness, and we should make an effort to curb it in our legal system and our minds” (p. 150).

So it is surprising that speech is not included on the DEI spectrum.  This gap ignores a wide agenda of social effects—those that point to status on the hierarchy of social opportunity and potential.  Speech bias marks an ongoing major source of discrimination based on auditory, not visual, cues.  Stigmatized speech links to discrimination by gender and race, national origin, mental capacity, age, and class – in ways that may shed light on those more established categories as well. 

Sunday, November 7, 2021

Diversity Plus

                                                                                          Image by Gerd Altmann from Pixabay 

             “Diversity of perspective and thought is essential to understanding
             and interpreting the law.”

                                                            --Law School Admissions Council

“Diversity and inclusion, which are the real grounds for creativity, must remain at the center of what we do.”  

                                                 --Marco Bizzarri – President, Gucci

The percentage of women in US law schools rose above 50% in 2016 and has been increasing since, now around 54%.  Blacks are 43% of the armed forces, but hold just two of the 43 top four-star positions, and just one percent are heads of Fortune companies.  Women lawyers make up 18% of the top equity-partner class.

The DEI initiative—Diversity, Equity, and Inclusion—is intended to address the fairness ethic so central to US life by raising awareness of disproportionate ratios across the board: in companies, associations, schools, and board memberships.  DEI goes beyond the diversity, micro-aggression, and implicit-bias trainings now active in HR departments.  One reason is a 2019 Harvard study concluding that these initiatives have ranged from unproductive to counter-productive (Chang, Milkman, Gromet, et al.).  DEI is a wider mandate—not a formal program but a general theme—extending beyond the training platform as a universal design principle.  It is a business function, operating alongside recruitment and retention, member / worker satisfaction, and policies and procedures.  It is long-term, comprehensive, and company-wide, driven by a wider mandate to acquire Cultural Competence. 

Diversity as a social fairness principle began with President Truman’s integration of the army in 1948 and continued with the Civil Rights Act of 1964; gay marriage in all 50 states (unimaginable before) was decided by the Supreme Court in 2015.  Gender identity has taken center stage in the 21st century, along with mental health states, neurodiversity (autism and ADHD), and, coming up, long-term Covid health conditions.  Although gender and racial discrimination were the initial concerns, these have expanded into a broad variety of other issues, such as age discrimination.  California, the leading center of diversity attention, contributes to the 22% rate of marriages between ethnic groups in the Western US. 

This raises other fairness issues—as in Allan Bakke v. University of California (1978), which disallowed racial quotas in college admissions.  In our culture’s orientation to fairness, the playing field has tilted back in a reversal of elitism, to the extent that it is now a disadvantage to be a white male, as you will hear often in work circles.  This makes for a difficult situation for employers.  The Art Institute of Chicago just dismissed its entire corps of roughly 100 docents, White and mostly female, because they violated the diversity mandate just by being People Without Color.  Clearly this is a fairness issue. 

The concept of “intersectionality,” however, offers the opportunity to search out other diversity categories, such as age, disability, or mental health states, that would work as diversity qualifiers.  When any individual can have multiple qualifiers, the number of possible identities runs into the tens of thousands.  Women of color who are lesbians and lower-class; white males who are disabled, or veterans, and Jewish—the list builds, so that point systems for weighting one applicant against another may at some point need to become the standard for decisions in hiring, promotion, or membership candidacy.  How else to decide between the Black male and the White female applicant for a senior position?

Why has diversity been so long in taking hold?  Tribalism, which has ruled human groups for hundreds of thousands of years, is our natural social preference for dealing with people we are related to, either closely or loosely.  If others look, speak, and act like us, it fits our affinity for reading face and voice, the start of our ability to predict what people are thinking and what they will do in any situation.  Dealing with other ethnicities and cultures makes this more difficult.  Added to these differences is a leading one: gender, the only strictly biological difference between people. 

Language different from your own is a major non-starter for communication.  Within your own language, however, language styling and accent are leading class markers, even above ethnic difference.  Hierarchy is a natural outcome of status distinctions like wealth, education, opportunity, competence, and social acumen.  Voice and literacy are markers for all the status signals we look for in deciding someone’s class. 

At the opposite end of the “familiarity scale” from tribalism is cosmopolitanism – the assumption of being comfortable and culturally consonant with a range of otherness: ethnic, class, education, position on the hierarchy.  It is a sophistication always present in any culture as it reaches out to the world and welcomes what is different, including imported people and ideas.  In evolutionary terms, we were hunter-gatherers for most of human history, and only recently agriculturalists and town-dwellers.  Now over half the world lives in cities, and this share is predicted to be far higher as the century advances.  Ancient Rome hit one million in 133 BCE; London, the first modern city to reach that number, did so only around 1810, and New York followed in 1875.  Technology and communications have made possible megacities of over 10 million, of which China has the most.  

Diversity is the future, with the expansion of global industries, migration, communication, and identity.  The first companies to become diverse in their makeup were multinationals like IBM, the subject of a famous study of power distance by Geert Hofstede (1980).  Gradually but with increasing speed, companies at all levels are widening whom they hire, nurture, and promote into leadership as a growth strategy. 


Monday, October 4, 2021

Endogenous Growth Theory: Ideas as Wealth

"The question that I first asked was, why was progress . . . speeding up over time? It arises because of this special characteristic of an idea, which is if [a million people try] to discover something, if any one person finds it, everybody can use the idea."  

-- Paul Romer, on receiving the Nobel Prize in Economics in 2018

Economics is the study and practice of optimizing resources—allocating them in ways that maximize their use now and increase that use in the future.  In 1990 Paul Romer advanced the theory of endogenous growth: the value of ideas can explain growth.  The 20th century’s growth was based on human capital, innovation, knowledge, investment capital, and the ability to protect intellectual property—and on businesses founded on these dynamics.  Thinking in new ways, applied profitably, contains the power to transform human living conditions and much more: our collective options to build on affluence for groups and individuals.  Education, another idea source, has long been dedicated to this goal as its main outlook. 

This theory rests on the assumption that the flow of new and economically actionable ideas is unlimited, as a main product of the human imagination and ingenuity.  Throughout human history, ideas are in constant production.  Not all are actionable or successful.  But everything we have as part of culture, from art to tools to high concepts like language and mathematics, began as an idea in the human mind.  From fire to space exploration, ideas, as well as a way to make them reality, have ruled how we live, and our view of what kinds of futures are possible.  The human purpose on earth is an active agenda in the daydreaming of us all. 

“To my mind, intellectual creativity is one of a number of deep mysteries about human cognition, to which it may be vain to seek answers,” writes linguist Geoffrey Sampson in The Linguistics Delusion (2017).  Sampson draws parallels between linguistics and economics in their respective searches for the limits of human cognition: one in language, the other in the idea generation that drives long-term economic growth through productivity and technological change within an industry or economy.  Sampson points to Romer’s 1990 theory of innovation as a major advance in economics because it opens the floodgates to further thought by considering human cognition an engine of economic growth.  The digital economy, including the Internet, is a prime example, along with entrepreneurship as a main source of GDP strength, creating both efficiency and new job talents. 

The entrepreneurial revolution parallels the Experience Economy described by Pine and Gilmore in 1999, which widened the standard view of goods and services as the mainstay of economic thinking to include retail, movie-going, travel, medicine, and education.  Focus on experience as its own economic category has prompted the professionalizing of many spheres of consumer activity—medicine, car-buying, vacationing (resorts and theme parks)—with many new applications of design, as in architecture, to theme experiences of many kinds: working, shopping, learning, socializing, worshipping, and wellness, including death and dying.

Idea Generation

Endogenous growth is a breakthrough because it counts the output of the human brain as an endlessly renewable resource.  The 1990s was, coincidentally, the Decade of the Brain, underwritten by US government funding to discover much of what we now know about our most important asset.

The human brain is the most advanced processing machine in the known universe.  It contains 86 billion neurons, generating up to 3,000 thoughts per hour—nearly one per second—for roughly 70,000 every day.  Brainstorming in groups is much slower: the 6-3-5 method has 6 people each write 3 ideas every 5 minutes.  Memory capacity is estimated at a quadrillion (10^15) bytes, enough to store the entire Internet.  Processing speed is estimated at 10^13 to 10^16 operations per second—more than a million times the number of people on earth, and more than any existing supercomputer.
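Taken at face value, the figures above can be checked with simple arithmetic.  A quick sketch, using only the numbers quoted in this paragraph (the 6-round structure is an added assumption from the classic 6-3-5 brainwriting format, not stated in the text):

```python
# Back-of-the-envelope check of the thought-rate and 6-3-5 figures.
thoughts_per_hour = 3000
per_second = thoughts_per_hour / 3600        # ~0.83 thoughts per second
per_day = thoughts_per_hour * 24             # 72,000/day (the essay rounds to 70,000)

# Classic 6-3-5 brainwriting: 6 people, 3 ideas each, 5-minute rounds,
# conventionally run for 6 rounds (one full rotation of the sheets).
people, ideas_each, minutes_per_round, rounds = 6, 3, 5, 6
group_ideas = people * ideas_each * rounds           # 108 written ideas
session_minutes = minutes_per_round * rounds         # in 30 minutes
group_rate_per_hour = group_ideas * 60 / session_minutes  # 216 ideas/hour

print(f"{per_second:.2f} thoughts/s, {per_day:,} thoughts/day")
print(f"6-3-5: {group_ideas} ideas in {session_minutes} min "
      f"({group_rate_per_hour:.0f} ideas/hour for the whole group)")
```

Even on these generous assumptions, a full 6-person session produces in an hour fewer written ideas than one brain's claimed output in five minutes—which is the point of the comparison.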

Given that English has over 1 million words (including scientific and technical vocabulary), it is not hard to appreciate how many variations of any given idea statement could be developed.  The simple directive “Our industry needs to develop new ways of thinking about what we offer consumers” could be restated and expanded almost without limit to inspire new thoughts.  Taking this further, lateral thinking seeks out less familiar patterns over familiar ones.  Shifting a concept from one domain to inform another is a common brainstarter in corporate settings: “What ideas can be imported from an unlike source to produce new ways of operating and new product lines?”  Of course there is a vast difference between gross and net when setting ideas within language; the multiple iterations required to publish ideas in formal style are one example. 
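The scale behind “restated and expanded almost without limit” is easy to make concrete.  A toy calculation—the working-vocabulary figure is an illustrative assumption, and grammar is ignored entirely, so these are raw upper bounds on word strings, not counts of sensible sentences:

```python
# Raw combinatorics of word sequences: with a vocabulary of v words,
# there are v**n ordered n-word strings.  Grammar prunes this space
# enormously, but what survives is still astronomically large.

full_vocab = 1_000_000     # the essay's figure for English overall
working_vocab = 20_000     # assumed active vocabulary of an educated speaker

ten_word_full = full_vocab ** 10         # 10**60 raw strings
ten_word_working = working_vocab ** 10   # ~1.0 x 10**43 raw strings

print(f"10-word strings, full vocabulary:    {ten_word_full:.1e}")
print(f"10-word strings, working vocabulary: {ten_word_working:.1e}")
```

Even the restricted figure dwarfs any practical enumeration, which is why a single directive can seed effectively unlimited restatements.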

Neuromorphic engineering follows brain function, using what we know about processing speed to develop faster computers while keeping energy demands low.  The mind in fact uses two processing modes: the conscious and the subconscious.  The conscious mind is the rational “top of mind,” the logical one that can be trained to think, but its thinking is limited to the rational mode—vertical, using straight-line logic “in the box.”

Below the frontal lobes, operating in the horizontal sphere of lateral thinking “out of the box,” the subconscious mind is 80 times more powerful at processing, comparing, and elaborating – which is the reason 95% of our decisions are actually made subconsciously.  When we have to justify that 95% to the 5%, we recruit the conscious compartment to devise a rationale for our deeper-brain decisions. 

But these originate from a place far less open to metacognition, knowledge of how and why our thinking works.  It is also the mysterious, hard-to-examine seat of our most brilliant ideas. 

Friday, September 17, 2021

Linguistics + Diversity


 “Were language acquisition solely a question of learning by rote, it would in principle be impossible: one of the key distinguishing features of any given human language is that the number of expressions it contains is infinite.”

--David Shariatmadari, Don’t Believe a Word: The Surprising Truth about Language (2019, p. 246)


Starting around 1908, linguistics—the study of language and its structure—began trying to take apart this most sociable of human capabilities.  Departing from the academic study of specific tongues (ancient Latin and Greek; modern Spanish, French, and English; unwritten Asian, African, and Native American languages) and of language families, the field formed around courses taught by Ferdinand de Saussure at the University of Geneva in the century’s first decade.  Linguistics took off in US universities a half-century later, in the innovative 1960s, across the academic curriculum.      

How scientific can the study of language be?  That’s difficult to say, because science is about understanding the totality of things, and the infinite, varied nature of humankind’s spoken and written tongues—their sheer diversity—makes that nearly impossible.  Science also attempts to use what it knows to predict what can happen within any given domain—difficult against the constantly changing voice and rules of language as it moves forward in time and among various groups of speakers.  As Geoffrey Sampson puts it in his exposé of the field, The Linguistics Delusion (2017): “The heart of the problem is that linguistics sees itself as a science—the soundbite it has used since the 1960s to define itself is ‘the scientific study of language.’  That is a delusion.  Human language is not the kind of thing that can be studied by the methods of science” (p. 2). 

English is not the leading native language, spoken natively by only about 400 million people.  By sheer numbers, Mandarin Chinese is first, with over 700 million.  But English is spoken by 20% of the world population—the secondary language of up to 2 billion people—and serves as the common standard of science and technology.  The hegemony of English is ironic in that English is difficult to learn and master as languages go.  The ancient forms, Latin and Greek, are widely taught as international academic standards.  Spanish, French, and Russian are also internationally well-distributed Indo-European languages, with Arabic rising as the leading non-Indo-European contender (Western Semitic).  Half the world speaks an Indo-European language.  Sino-Tibetan is the second-largest language family, with over a billion Chinese speakers.  These places on the language racetrack change continuously, advancing and retreating with the way languages are used and the geopolitics of language learning. 

There are six leading language families, with many branches including extinct ones, accounting for two-thirds of all languages on earth.  A total of 142 families comprise the more than 7,000 languages now spoken, including very large speaking groups and very small ones (some 2,400 at risk of extinction).  One example is Tuyuca, spoken in the eastern Amazon by just 1,000 people and often cited as one of the world’s hardest languages to learn.  Archi, spoken in a few Dagestan villages by just 900 people, can generate over 1.5 million forms from a single verb.  Silbo Gomero, used on La Gomera in Spain’s Canary Islands, is whistled.  Xhosa, one of South Africa’s 11 official languages and official in Zimbabwe as well, uses clicks.  An entirely invented language is Esperanto, conceived in 1887 by Polish writer L. L. Zamenhof.  It was designed as a lingua franca, based on Germanic, Romance, and Slavic roots, but has just 100,000 speakers.  Klingon, the Star Trek invention, still has only 100 fluent speakers—we assume they speak mostly to each other.  And modern Hebrew is an ancient language revitalized, spoken by 9 million, almost all Israeli.

This variation, along with the infinite generative power of words, makes linguist Noam Chomsky’s Universal Grammar a difficult proposition to make and maintain.  Sampson maintains that the attempt at a comprehensive grammar has been abandoned, and that there is little sense of convergence toward this ultimate goal (p. 32).  It is not like saying that Homo sapiens share a common body, with variations not in structure but in weight, height, skin color, eye color, gender, and genetic code.  All people come into the world with the ability to learn any language at all—from Icelandic to any of the roughly 300 Australian Indigenous languages.  President Herbert Hoover knew Chinese from his mining engineering work and could communicate that way with his wife when they wanted to share private messages.  The Asia-Pacific Economic Cooperation (APEC) convention, even when held in Beijing, is conducted not in Chinese but in English. 

The many languages, and the wide diffusion of centuries and distances between them under many currents of change, showcase the diversity of speech, words, syntax, and semantics among them all.  Such diversity indicates the richness of the language toolbox itself: the ability to express an infinite number of ideas and realities.  Of course, language is also the largest barrier to communication and idea-sharing—a Tower of Babel—among the various families and subfamilies that have developed as expressions of culture.  But as Charles Darwin showed, looking for differences may be more instructive and produce better insights than looking for commonalities.  Indeterminism, rather than limits-seeking, is the rule in physics, history, biology, the arts, and economic life.  Language on the ground, rather than linguistically bound, is more generative, variable, creative, and less rule-bound.  And this seems to offer the gateway into creativity rather than chaos.



Thursday, August 19, 2021

Business Success: The bottom line is customer value

Amazon is the leading exemplar of company success based on four principles:  customer empowerment, good treatment, meeting unmet needs, and becoming a public good.  These precepts, set out by the Editor-in-Chief of Fortune, are the leading indicators for success, and they outrank the financial benchmarks relied on in classic business analysis. 

In his retirement essay as Editor-in-Chief of Fortune (June/July 2021), Clifton Leaf reviews the history of Amazon’s stunning but hardly instant success starting in 1996, fully a quarter century ago.  After all that time building a reputation for service, Amazon is Number 2 on the Fortune 500, behind only Walmart (the classic customer-oriented giant), with nearly $400 billion in yearly sales.  The pandemic helped its rise to the top and its competitive muscle against other online sellers.  As founder Jeff Bezos put it, “Most online businesses fail because they misestimate the value proposition.” 

Leaf continues:

What did give companies a genuine edge, and what still does today, is to empower consumers.  It sounds almost too obvious to say, and yet it’s a message that’s routinely forgotten.  Want to sell more stuff?  Make it as easy as possible for a customer to buy it…Take a page from the Steve Jobs playbook and make it as intuitive to use as possible (Fortune, p. 14).

Serving the customer interest--customer service in the largest sense--is key to that success. The proof is in Amazon's runaway position as the go-to internet retailer: it leads the field in devising ways to make buying practically anything as intuitive as Jobs made the smartphone. Since the mid-1990s, computer-based household purchases, now running to hundreds of dollars per month, have created an entirely new category for the US consumer and the global economy.

This kind of creativity goes beyond innovating on existing platforms.  It requires seeing the market for an altogether new category, then adapting the design so that anyone can learn it virtually unassisted.  The original industry projection for desktop computers was 100,000 units worldwide, to be used only by scientists and engineers for math applications.  Thanks to smartphones--computers with a phone function added--90% of computer-based communications in the world are now in words, not numbers. Even for the many who are not computer literate, the smartphone is the preferred means of communication at any given moment.

In its quest to make buying and selling easier as the Internet evolves (and, as an outcome, to become the go-to retailer for the planet), Amazon has pioneered on a grand scale better ways of engaging buyers and sellers. This leverages the Internet's customer advantage as online shopping begins to supplant brick-and-mortar shopping.  The signs were there decades ago that retail was outgrowing its department store and mall heritage.  The entire business model has been "unravelling" for a long time (GlobalData Retail, 2021).  The quest is on to repurpose many thousands of square feet of retail space, morphing mall space, for example, into charter schools, senior residences, Covid vaccination centers, and even newfound Amazon distribution warehouses. 

Others, like American Dream, a $5 billion mall in New Jersey, diversify with experience offerings like ski slopes, water parks, and roller coasters to attract shoppers beyond retail.  In effect, retail spaces are drawing on the proven success of Disney's theme-park artform to give customers what they want.  This is the reason theme parks have totally rewritten the rules of public space to become the leading edge of experience design and marketing.

Leaf comes away from his tenure at Fortune with four “comically simple” principles for conducting business—all customer-focused. 

1)      Empower customers

2)      Treat people well

3)      Meet an unmet need

4)      Make the world better (Become a public good)

Beyond price-earnings multiples and rates of sales growth, judging a business can be as simple as drawing conclusions about its relationship with its customers.   The Disney Company has come under sharp consumer criticism for its ticket pricing (which sets the leading indicator for all theme parks) as the daily gate outpaces fans' ability to continue their annual visits to Orlando and Anaheim.  This critique rests on the popular assumption that Disney is a public good rather than a profit-driven enterprise.

The New York Times has reported that hidden fees, or drip pricing, make the true cost of event tickets and hotel-room reservations difficult to determine or compare until the online transaction is complete--hardly a transparent transaction.  The US Department of Transportation banned drip pricing for airlines in 2011; Marriott and Hilton have more recently been sued by several states over their pricing practices.  Meanwhile the FTC, with broader powers, is considering a nationwide protection plan for consumers in all industries.

Wednesday, July 7, 2021

Cultural Logic and Decision-making

Culture is a broad-based ongoing set of assumptions about ourselves and the world that humankind has been building over millennia.  Culture stands over and above our biological and instinctual heritage.  It is the first and best invention of mankind, the longest-running, and the most influential set of decision-making guidelines ever devised.  It extends the individual brain by multiplying its power over the time and space of collective will and perception.

Culture’s purpose is simple: to make decision-making automatic, as free of thought as possible.  This is exactly why we know so little about the massive “C-force” that runs most of our thinking—it is so deep-seated that it never shows itself.  But an analytical understanding of its workings is exactly what is needed to appreciate why all decision-making is such a problematic and difficult behavior, and under what mechanisms it operates.


Decision, from the Latin decisio, literally “a cutting off,” shares the same root as “incisive,” which also means “to cut.”  It is about the knife-blade, and the moment of truth at its glinting edge.  The act of cutting by decision sorts things: one to keep, the other to cast away; one ranked above the other, each assigned to a separate category.  Some decisions are irrevocable and life-changing, while others are as routine and harmless as breakfast Cereal A versus Cereal B.  While this last choice has vast implications for General Mills, for the cereal-eater it is a trivial moment in the daily business of choosing Cheerios over Chex.  But any decision, not just cereal (of which General Mills alone has over 40 name brands on the market), is normally a cut between options already shaken out or preselected.

When we buy a house, we don’t examine and compare the hundreds, thousands, even millions of properties on the market nationwide. We have a much narrower field to choose from--only a tiny precut sampling of sizes, styles, addresses, price ranges, and features.  In housing, that cut is rarely related in any significant way to house type or cost, but to location alone. The most important decisions are actually made at the “pre-cut” stage, at the far front end.

Cultural logic

This is where the cultural logic process is most active, and it works in roughly 250 to 450 milliseconds (source: UC Irvine Vision Group).  That’s the time it takes to recognize a broad pattern: to determine whether a person, image, or concept is friendly or threatening, useful or irrelevant, or somewhere on the spectrum in between. It is the broad logic that tells us whether things “out there” fit with our own identity, our own current “brand” of gender, age stage, class, and community.  This covers a new business idea, a new decorating color for the home office, the decision to join a health club, accept a credit card offer, switch long-distance providers, or dedicate time to reading one book rather than another.  Rarely do we sit down to draw up a calculus of characteristics to be graded on a rating-point scale.  When we do, it is because the values involved are so similar that there is no intuitive way to favor Option A over Option B.

We might buy a toaster by ratings, or even an entertainment center, but not the things that really count in our lives:  the person we marry, the college we attend, the career we follow, the house we live in, or the car we drive.  That is because the more important the purchase, the less it has to do with qualities that can be measured or counted.  The truth is that we choose things because they reflect our internal value system, or sense of who we are, our individual and social identity (including our gender, age, and ties to others).  What this means is that we do use a logic system to make decisions, but it is not technical logic.  It is called Cultural Logic, and like all logic systems, it comes with a set of rules and principles that makes it useful to us.

Cultural logic is our thinking system for buying what matters most to us, but amazingly enough, it is an almost completely invisible system.  We rarely think about it – except when two similar options confront us and we must decide between them.  Decision making is a cultural process; if it weren’t, Scientific American would be in the top ten best-selling magazines in the US, right alongside People.  It’s not even in the top 100.

Curiously enough, although much attention is paid to the weighting method as “scientific” and superior to intuition, this charting happens only at the very end-stage of the process. At the front end, far more important as the “framing moment” of the decision, we know what to do because it “feels right,” evokes the right emotional response, and is consistent with the host of other choices we make throughout our day or our lifetime.  All our choices express these dimensions as expressions of the only true brand, the core of all brand choices: the one we call ME.

Only in the aftermath of the prime choice, which occurs in the pre-editing stage, does rational or process logic come into play.  As opposed to relational or cultural logic, this is the fine-tuned, rational process of combing through a set of similar choices to arrive at the one that seems to offer the highest reward--but by this time the difference is a matter of far smaller degree.  The value of the field of choices itself has already been established before the finalists are sorted through.  It is the difference between the original footage shot for a film and the final edits.  The shooting script has already narrowed what the film editor will work with.  And naturally, this framework is narrowed even further by the final cuts: far more footage is shot than ever makes its way to the screen. But theme, story, locale, and characters--all are set from the start.
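The two-stage process described here, a coarse cultural "pre-cut" followed by fine-grained rational comparison, can be sketched as a simple filter-then-rank pipeline. The listings, criteria, and function names below are purely hypothetical illustrations, not anything from the essay itself:

```python
# Toy model of the two-stage decision process: stage 1 (the "pre-cut")
# eliminates most options on coarse identity/fit grounds before any
# conscious comparison; stage 2 applies rational logic only to the
# few similar finalists that survive.

houses = [
    {"id": 1, "location": "Oak Park", "price": 310_000, "beds": 3},
    {"id": 2, "location": "Oak Park", "price": 295_000, "beds": 2},
    {"id": 3, "location": "Downtown", "price": 450_000, "beds": 1},
    {"id": 4, "location": "Suburbia", "price": 280_000, "beds": 4},
]

def pre_cut(options, location, max_price):
    """Stage 1: the coarse cultural filter -- most options never survive it."""
    return [h for h in options
            if h["location"] == location and h["price"] <= max_price]

def rank(finalists):
    """Stage 2: rational comparison among the remaining, similar options
    (here: most bedrooms first, then lowest price)."""
    return sorted(finalists, key=lambda h: (-h["beds"], h["price"]))

finalists = pre_cut(houses, location="Oak Park", max_price=320_000)
best = rank(finalists)[0]
```

The point of the sketch is that the ranking step only ever sees what the filter let through: the "value of the field" was fixed at the pre-cut, exactly as the essay argues.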

What this means is that the big cuts in ideas, people, actions, and products have been made tacitly by culture, not the decision-maker, through the process of cultural logic.  The thought process of that logic is a preexisting, invisible one, not one that consumers can reliably articulate in a focus group, survey, or even by observation or experience. People cannot readily tell you why they bought a particular product, only their experience with that product. 

What they cannot identify is the leading factor—those forces that drove them to buy it in the first place. Buyers can only rationalize their choices because the drivers of their decisions operate below their conscious horizon.  This is what marketers want to know, and it is the holy grail of consumer studies.  The closest approach we know to reading the collective mind is to delve into the decoding of that mind by cultural means.