Friday, December 8, 2023

Quotes on Magic and Science

 

There are thousands of quotable quotes associated with presidents, geniuses, educators, athletes, religious and military leaders, great builders, lawmakers, artists, authors, and actors.  But among these are certain quotations you know immediately to be false—my current favorite is “The problem with quotes found on the Internet is that they are often not true,” attributed to Abraham Lincoln.

Of course, that's an obvious joke, unlike the famous Gandhi quotation, “Be the change you wish to see in the world,” which is by many accounts a misattribution.  Yet the exhortation sounds impressive and true because it embodies the convictions and actions central to Gandhi's lifelong activist career. 

Truth value is contained in the context of the quotation itself, which can be judged to be off-center if it references the unknowable future or contradicts or compromises certified statements by the same person.  As humorous as Abe could be (and he frequently was), in no way could he have foreseen the advent of digital communication.  (He also said, referencing his homeliness, “Honestly, if I were two-faced, would I be showing you this one?” in the Lincoln-Douglas Debates of 1858.)

But there are cases less clearly resolved.  Just last month, Jamie O’Boyle and I presented in short form our findings on The Disney Effect, in honor of the centennial of the Walt Disney Company, at the Sixth TEAAS Themed Experience & Attractions Academic Symposium in Orlando, Florida. Part of our coverage was about Disney’s branding of Magic, starting with what magic actually means.  Among our citations was science-fiction writer Arthur C. Clarke, best known for his work on “2001: A Space Odyssey.”  In one of Clarke’s many media interviews, he is quoted as remarking, “Magic is just science that we don’t understand yet.”


Here is where the quote-check issue arose.  At one of the seminar tables, a person or persons objected, raising doubts that Clarke ever made such a statement.*  A cursory search later on to review our sources showed a significant number of citations of this statement, including one on a T-shirt.  My suspicion is that this critic was simply unfamiliar with this version of the meaning, assuming it to be a misquote or false attribution because it did not match the writer’s better-known and easier-to-find written quotation, his self-proclaimed Third Law: “Any sufficiently advanced technology is indistinguishable from magic.” 

Both quotations are freely cited online.  There is no misquotation or misattribution to be resolved.  (A misquotation, for example, would be “Money is the root of all evil,” instead of the actual phrasing “The love of money is the root of all evil” (1 Timothy 6:10, King James Version): A posing as B.)  But Clarke’s two quotes are clearly separate, consistent with one another, and neither is difficult to find. Clarke’s Third Law, his most famous, comes from Profiles of the Future (1973). The shorter, pithier one is a paraphrase by the same author that appeared in an Australian radio interview transcript also treating the future of technology.  Spoken-word quotations may be harder to locate and validate than the written kind, but it is often possible.

However, there is another angle involved here.  Our presentation was not a literary study; we simply cited the “Magic” statement as a way of moving thought forward.  The issue is not Arthur C. Clarke’s output or opinions, but the idea of what magic might be and how it operates.  The citation question is a sidebar to the theme of the presentation, which was the legacy of the Disney company and its innovations outside the parks, operating at full force in the world beyond.  The quotation was never itself an issue germane to our presentation, nor is that why it was featured.  Sourcing and authenticity shouldn’t become a stumbling block to understanding.  It is the idea of magic itself, and how its definition is constructed, that was central. 

Perhaps the principle to keep in mind is not to believe everything you hear at a conference, at a table or behind the lectern, and especially not if the authenticity inquiry creates a buzzy diversion from the main topic.  Now, if Arthur C. Clarke had been a Disney Imagineer?  That would be a different matter of fact-checking.  

The sheer density of Internet information also means a careful path to the truth can be blazed.  So always double-check the source of your quotes and keep in mind those profound, albeit fake, words of Abraham Lincoln: “The problem with quotes found on the Internet is that they are often not true.”  

*Thanks to friend and colleague Kile Ozier for bringing this issue to my attention.

Friday, October 6, 2023

The Inheritance Talk: Why Money Mindfulness Is So Difficult--and Important



"The New York Times Advertising: T Brand Studio seeking a psychologist or sociologist who can provide deeper scientific context on why people avoid having financial conversations, especially the inheritance talk, why talking about money triggers stress and how to start having these critical conversations. We'll also talk specifically about the inheritance talk and why the lack thereof can cause turmoil among family members (Sept. 12, 2023 email).

"We’re never really taught that we have to think about our work before we can do it…Thinking in a concentrated manner to define desired outcomes and requisite next actions is something few people feel they have to do (until they HAVE to).  But in truth, it is the most effective means available for making wishes a reality." ---David Allen, productivity expert, Getting Things Done (2015)  

“The Baby Boomer generation is expected to leave a significant amount of money to their Millennial children. It's estimated that more than $68 trillion will be bequeathed to their offspring. The great wealth transfer is expected to make Millennials the richest generation in American history” (Forbes, Aug. 9, 2023).


Cultural values are the shared reality that drives perception, decision-making, and action (see my website below).  Wealth and money are essential to group well-being.  Wealth transfer between generations takes knowledge, talent, diplomacy, and a wide view of the interests and rights of others.  The stakes are high.  But few have the vision or the practice to enter into the negotiations with parents and adult siblings that the inheritance talk requires. 

This is largely because money is one of the least-discussed topics in any relationship, talked about less than sex, religion, or politics. Why? Because finances are a minefield of uncritical thinking, involving history, status, hope, regret, self-worth, and how we value our relationships.  This may be the reason that only one out of three people in the US has any estate plan or even a will.    

Abe Lincoln died without a will (despite being a lawyer).  He didn’t expect to be assassinated. A century later: singer Bob Marley also left no will, and since his death in 1981 his complex estate, worth over $500 million today, has attracted dozens of claimants.  Jimi Hendrix likewise died without a will; after his death in 1970, the battle over his assets continued until the end of the century.  Howard Hughes died in 1976 at age 70, again without a will.  Although one was “discovered” at the Mormon headquarters in Salt Lake City, it proved to be a forgery.  Eventually the billionaire’s holdings were divided among 22 aides, charities, and former lovers.  (Source: LegalZoom.com)  

Family life operates by pre-assumptions—those habits of thinking that are unmindful about how things operate—especially around finances.  Very few families can be open and above board about how funds are managed or allocated. Even less-often shared is the rationale, the WHY behind HOW things operate—investments, expenditures, loans, savings, bequests, and long- and short-term goals. 

The Inheritance Talk needs to establish two things:  first, what the family has, and second, how these assets will be shared out, now and in the future when the elder parents pass away.  Both topics are hotbeds of potential conflict and confusion, leading to the need for clarification, correction, negotiation, and change. This is the time when assumptions are challenged and family secrets unveiled. None of this is anything family members seek out.  Nevertheless, The Conversation, freighted as it is with these perils, is essential to moving forward from past to future. 

This complicated talk about family money is far more than a meeting about numbers.  It can’t help being a judgment on the way Mom and Dad have handled—or failed to handle—assets and opportunities over time.  The discovery process is legitimate in its need to know how things stand and where they are headed.  Ideally, this scrutiny might go just fine, showing sound management over decades.  But more likely the close examination is going to reveal some flaws or possibly frauds—answering the question of why people are generally resistant to this audit process.  (We don’t like the IRS inviting us in to talk taxes, either.)  

Faults and inconsistencies will emerge, which means these can no longer be ignored or assumed to be unproblematic.  And financial arrangements between parents and siblings will come to light, raising jealousy over favored versus less-favored kids.  This is when you suddenly discover that the family house will not be sold and divided, but that a caretaker child, or one being cared for, has been given the right to live there indefinitely.

The Talk makes the operating assumptions clear and settles the ambiguity.  A common example of discovery is favoritism among adult children and uneven or strange-looking distributions.  Suddenly the actual operating assumptions showcase the unmistakable need to articulate reasoning, make fairness arguments, assign responsibility and/or blame, and press concerns about the future needs and rights of offspring (and surviving parents).  Now conflict avoidance is no longer possible, and the truth of things looms directly in the family’s faces, impossible to ignore.  It’s a moment of truth that often comes too late to save either feelings or finances.

This is why it is so critical to make things transparent all along, with advisors providing counsel: an estate attorney, a financial planner, a CPA, or a daily money manager.  Many parents really do hope they will be spared The Talk, safely gone before doing the right thing becomes necessary and unavoidable.  But the far better alternative is to get and keep the financial house in order long before that, getting everyone on the same page over time, through the inevitable ongoing changes in the family picture.  Everyone is aware of this principle.  Few see full disclosure practiced.

Financial literacy, the ability to understand and apply financial concepts to managing personal finances, peaks in the early 50s, the age when the fewest mistakes are made.  This is the age when people have “accumulated knowledge and experience about money, spending and saving, but haven’t begun losing key analytic cognitive skills” (ARC Centre of Excellence in Population Ageing Research, Australia, 2023).  This means adult children are at or approaching their peak ability to read and repair the family financial landscape, while at the same time their parents have passed that point and need help, whether requested or not.

In South Philadelphia, where I live, I ran into a seventy-something man on his way to Italy.  His mission is to try to get in writing his ninety-something father’s promise of properties in that country under Italian law.  I wonder how that’s going for them.

Photo: Pixabay



Thursday, September 14, 2023

The Disney Effect – Themeatics

 

“Sixty years of crowd management has made Disney operations the undisputed champion of event control and coordination….If you’re serious about solving [your traffic] problems, you go to the Disney Academy at Disney World.”

                                                -- Phil McKinney, innovation expert, Beyond the Obvious, 2012

                                                EPCOT:  Journey into Imagination pavilion

Over the past century, the Disney Corporation has exerted an outsized effect on popular culture and the popular arts.  Disney 100, this year’s centennial of the company’s 1923 founding, is a timely point for taking stock of the “Disney Effect” across major cultural domains, especially in “themeatics.”

Since 1955 at Disneyland, the Ur-theme park, the Disney model has guided design basic to the experience economy, with the parks acting as centers for creativity and innovation in the arts and technology, both popular and elite.  Themeing has been transforming public space in both design and use, enlarging the sphere of entertainment far “beyond the berm,” mostly unforeseen and not at all calculated on the part of D-Co.  My career knowledge in themeatics (as I’m calling the aggregated skill set that created theme parks) can be put to work to take a measure of the far-reaching effects of this legacy. 

Themeatics is the unified field theory for the arts, an artform that underpins design across fields from architecture to city planning to drama, audience experience, and graphic design. Every aspect of the designed environment has been informed by the Disney Metropolitan Deco template. Around 150 artistic subfields can be enumerated, a conservative estimate by the late Marty Sklar, President of Walt Disney Imagineering; and any artform ever devised, from ancient to state-of-the-art, can be seen within the fabric of the park design.

Walt Disney himself is considered a master innovator in the arts (all genres) and technology (high- as well as low-tech), with theme parks as the incubator enjoying a test audience of millions per month.  Far from Disney’s initial reputation as an animator (his true role was as a story editor and creator of the synchronized sound cartoon) and entertainer of the nation’s children with cartoons and audio animatronic rides, he headed a studio led by Imagineering (“imagination plus engineering”) that soon became a center for interdisciplinary creativity and innovation.  The outcomes transformed public space and the way it could be used, enlarging the entire scope and influence of simple entertainment. 

Central to this role was the development of hyperreality as the ultimate adaptable format for designing as well as experiencing art as a total-immersion, mixed-media, seamless experience.  Art and technology became permanently conjoined, using augmented reality (AR) from digital programs and applications, the basis of 5-D multi-media.  The outcome was environmental artworks, the most iconic being the theme park, aimed at brains and bodies of all ages. 

Hyperreality melds the real with fantasy and the subconscious so that these become indistinguishable in a new amalgam—as in the transformation of history on Main Street, USA.  As Christopher Finch put it in The Art of Walt Disney as early as the 1970s, “Disneyland and Walt Disney World are shows—a kind of total theater which exceeds the wildest dreams of avant-garde dramatists.”   Hyperreality is a concept in post-structuralism that refers to the process of the evolution of notions of reality, leading to a cultural state of confusion between signs and symbols invented to stand in for reality, and direct perceptions of consensus reality. Hyperreality is seen as a condition in which, because of the compression of perceptions of reality in culture and media, what is generally regarded as real and what is understood as fiction are seamlessly blended in experiences so that there is no longer any clear distinction between where one ends and the other begins. Hyperreality – established within the popular arts as well as the elite levels--works to integrate emotion, memory, rationality (as art history), and cultural values for both brain and body.

Culturally, at the theme parks, hyperreality acts as the enveloping artform to showcase themes important to cultural values for Americans; they express those values we most favor about ourselves and our national heritage: collective imagination (Fantasyland), our shared vision of the future (Tomorrowland), other people, places, and adventures important to us (Adventureland), and American history in Frontierland and Main Street, USA. 

At the theme park, the Disney Effect is an influence in entertainment and edutainment on all fronts.  Here the distinction between entertainment (as engagement) and amusement (as diversion) emerges. Disney productions and their methods are major instigators of the entire Experience Economy identified by Pine and Gilmore in 1999. 

Further, the Creative Economy, reflected in the Experience Economy, considers public space a closely designed and deliberate event-integrated vision--as seen in animation art. The Disney Imagineering team is cross-functional and interdisciplinary, a template copied across creative industries, for example in the use of storyboards to diagram character and action, an aspect of “blue-sky” open-ended idea generation.

The Disney legacy can be traced through the decades across a range of creative industries, tools, and technologies that inform the designed environment.  Theme parks in particular have become urban labs where concepts can be experienced by millions of visitors to test viability. 

Perhaps no other artform innovation approaches the reach, persistence, and inspiration as clearly as the legacy of this prototype.  Themeatics is a hyperreal mix of techniques borrowed from animation and filmmaking rather than architecture and urban planning: the familiar storyline, identifiable archetypal style, “not the design of space but the organization of procession” (architect Philip Johnson); stagecraft, iconography, special effects, audio-animatronics (3D animation), and color coordination, all led by the concept of “show” and “enhanced reality” (the late senior Imagineer John Hench’s term). Themeing is a tightly focused reality made to evoke specific times and places with strong cultural resonance. 

These distillations – from musical cueing and food to landscaping, lighting, scaling, signage, sound, surface, texture, and smell--play off perception and collective memory to create “instant moods.” These are achieved by motifs, layered detail (fractals), and multi-sensory environmental designs, favoring images over text to tell stories and give emotional direction. 

Inherent in themeing’s sense of place as a theater stage is the legacy of revival and nostalgia in late-twentieth-century design, and the multi-media assemblage of artforms and styles from many eras, traversing the evolutionary range from craft to high-tech.

Such a far-reaching and durable “Disney Effect” was unintended and unanticipated, co-evolving with the unforeseen ascendance of virtual reality as a new default and resulting in hyperreal environments.  The theme park model would recreate the real world both within and outside it, multiplying other worlds as themeatic offshoots.

            Journey into Imagination photo by J.G. O’Boyle

Monday, August 7, 2023

The Betty Crocker Legacy: A Century of the Homemaker’s Creed

                                 Betty Crocker portraits 1936 – 1996                       

 

This year marks the 150th anniversary of General Mills in Minneapolis, whose face since 1921 has been that of the archetypal homemaker, Betty Crocker. 

Betty Crocker took shape after Gold Medal Flour ran a contest in the Saturday Evening Post built around a jigsaw puzzle.  Along with the puzzle solutions, entries included questions from home cooks asking for baking advice.  Betty was created to provide answers from the staff of the Gold Medal test kitchen.  William Crocker was the popular company director who inspired the surname. Betty is an informal, family-style name, but still traditional (a nickname for Elizabeth, with Hebrew and English roots).  Its later resonance in the World War II years (think of movie headliners Bette Davis and Betty Grable) struck the right register for a close family advisor in the kitchen, one who became a household name.

 

The interaction with the baking public was an early example of social media, sampling the public (now called crowdsourcing) as a way to do market research on women’s issues in cooking, and using that consumer input as the basis for creating the beloved advisor.  Answering letters was already a tool of the company’s public relations department, but generating a completely new character was innovative marketing genius.  New to the airwaves, the first radio cooking show was the “Betty Crocker Cooking School of the Air” on a Minneapolis station.  Today we would go to YouTube; in 1924, there was no television. Instead, families and friends gathered around their radios and, yes, because we intuitively look toward the source of someone speaking, they actually watched the radio when important news came on.

 

By the late 1940s Betty became one of the earliest brand icons on television.  Women after World War II had married in unprecedented numbers. With their husbands’ benefits from the GI Bill, they could move into their own homes at an age unheard of before the war. Radio, and later television, replaced their mothers as a resource for advice on how to manage the home and particularly the kitchen. What was called “Home Economics” (cooking, budgeting, and sewing your own clothes, among other skills) was still being taught in high schools well into the late 1960s.

 

Betty Crocker was already a star presence from her radio days. According to Fortune magazine, by 1945 she was the best-loved female figure after First Lady Eleanor Roosevelt. A considerable accomplishment for someone who didn’t really exist. She filled a gap in the homemaking pantheon.  Preparing dishes using packaged mixes was still not the norm (compared to cooking from scratch) and Betty was the link, informing the nation of the DIY aspects of processed and packaged recipes.

 

For young mothers operating on their own, often very distant from their origin families, she was the trusted, home-wise senior female always ready with moral support as well as mastery of how modern cooking operated.  This was especially vital during World War II, when rationing dominated the home front and ingredients were limited, lower-quality, or nonexistent.

 

General Mills produced a guidebook in 1945 called Our Nation’s Rations to help customers cope.  As a wisdom figure on a par with Walter Cronkite (“The most trusted man in America”), Dale Carnegie, or Eleanor Roosevelt, she emerged from the ranks of American wisdom heroes (embodied first by Benjamin Franklin).  These are practical problem-solvers with grace and integrity whose positive outlook is won through experience and shared with a broad public.

 

The Betty Crocker Picture Cook Book from 1950 is a perennial best-seller over seventy years later.  “The Homemaker’s Creed,” about the pride and talents of making the ideal home for the family, came out during wartime in 1944—just as women were being encouraged to return to the kitchen from the wartime factories where so many had been working.  The Creed pledge [of the Home Legion], suitable for signing, begins with “I believe homemaking is a noble and challenging career….an art requiring many different skills and the best of my efforts, my abilities, and my thinking.”  It was co-signed by Betty Crocker as the icon of American homemaking.   

 

Invented in 1921, she has gradually been replaced by a secondary icon, the red cooking spoon, not a human icon with which anyone can identify.  Brand icons have two basic roles: they can represent either the product or the product user.  Over the decades after the 1960s, Betty’s image was steadily adapted to look younger and more cosmopolitan, as if she represented the consumer rather than the virtues of the product line itself.  The value proposition represented by Betty was rather the young grandmother with authority in the kitchen, who knows what her customers want and the techniques to get them there.  She embodied the authority and how-to inherent in the product.  A catch-phrase for her talents was “You can do it, and Betty can help you.” General Mills confused the two roles of the icon, meaning the Crocker image became less and less relevant for consumers when they saw a younger, less authoritative version.

 

Was Betty Crocker a real person?                                                                                                                                  

 

Not just one person, but a composite of several women originally drawn from the ranks of Gold Medal Home Service personnel.  For the 1980s revision, 75 separate photos were aggregated into a single image.  The seven portraits from the 1930s to the 1990s all show closely ranging features and coloring, looking like female relatives (much as Disney animated princesses could be cousins).  In this way, the Crocker image was an early version of hyperreality applied to portraiture.  But that image is responded to as a living person: an adaptable, far-sighted, consistent, in-control figure inspiring trust and confidence. 

 

The Crocker image has had amazing cultural value – among the top 20 most recognizable images in the world (topped by the leading corporate symbol, Mickey Mouse).  Through seven transformations, from 1936 to 1996, she maintained her recognizable middle-American “ageless 32” image; in fact, she looks progressively younger in each, and by the mid-90s had acquired a pan-ethnic olive complexion. In our research on the Crocker image, we found one consumer commenting that “the final image is no longer Betty Crocker, but Betty Rodriguez married to a Crocker.”

 

General Mills did not understand or appreciate that their customers did not want to BE Betty Crocker. They wanted Betty Crocker working for them in the kitchen. They were not identifying with the icon, but with the competence she represented – her skill as a baker who could turn out a perfect result every time.

 

This was the value center, and it is an important distinction in marketing.  Companies are always looking for ways to update their images; however, dropping the persona of Betty wasn’t the way to do this (just as Coca-Cola discovered with New Coke). Companies often tire of their own images because they see them every day. They get inured to them, thinking them too old-fashioned and time-worn.  In fact, that is often their true value: think of the preppy themeing of Abercrombie & Fitch or Ralph Lauren.  These fashion lines show lasting profitability for their references to classic design, time-tested as trustworthy. In music, the late Tony Bennett kept faith with the classic American songbook, which made him a lead performer well into his 80s.

 

After a century in media, Betty Crocker had become a fully vested American symbol of family, hearth, and home that many generations still treasure and look up to.  Her image is solidly emblematic of the middle class, productivity, and women as homemakers, values steadily central to middle-class aspiration. To the company, Betty’s image began to recede as a symbol invested in the World War II generation, one that has now almost totally passed away.  Not its influence, however.

 

But Betty represented a stable and reliable universe, one on which you could depend. Consumers could rely on media figures not to make rash or selfish decisions, to hold the right values, to be principled and rational, and to speak with the voice of reason and moral authority around home and family. Most of all, despite her image updates, she presented constancy; you knew what you were going to get every time. Betty would never let you down.  We still need such icons in our lives—now more than ever. 

 

Photo: Pixabay

Friday, July 21, 2023

Your Brain Is Not a Computer: Hard v. Soft Technology

    

                     “Your brain does not process information, retrieve knowledge, 
                        or store memories. In short, your brain is not a computer.” 


                                                                              - “Your Brain Is Not a Computer,”
                                                                           Robert Epstein, Aeon, May 16, 2016


“We suggest that the question for scientists should instead be: if we adopt the definition from computer science, then what kind of a computer are brains? For those using the definition from outside of computer science, they can be assured that their brains work in a very different way than their laptops and their smartphones—an important point to clarify as we seek to better understand how brains work.”  – “The Brain-Computer Metaphor Debate Is Useless,” Richards and Lillicrap, Frontiers in Computer Science, Feb. 8, 2022 

Stage 1 - Image from Aeon

Without looking in your wallet, try this:  Draw the portrait side (the “obverse”) of a $1.00 bill.  How did you do?  Does it look like a child drew it? (Stage 1)  How many times have you looked at this exact same object?  Certainly thousands and thousands.  Then why isn’t it stored somewhere in the brain, ready to leap onto the page?  (Robert Epstein took on this question and its implications in 2016 in an article called “Your Brain is Not a Computer” in Aeon). 

Although the metaphor is alive and active, the information-processing theory of the brain (current since the 1940s) is not only misleading, because our brains are not uploaded, downloaded, or a cluster of coded programs; it also keeps us from seeing our unique capabilities, which aren’t even parallel to those of computers.  Each human brain is uniquely shaped by its own experiences (not “inputs”) and in fact gets “rewritten” in different ways.  We each learn differently, creating new knowledge from our unique abilities to live and learn from those experiences.  We change and create constantly because we aren’t coded to one system for handling information, though we are biased socially in the direction of the culture we inhabit.  This is the reason our brains can’t be downloaded to a computer.  Brains don’t store words, images, or symbols, and we don’t retrieve or download memories.  It could be called the “one percent reality problem”: we live in our heads, not in anything like an objective reality sphere. 

Memory is one reason we operate day to day on incomplete information.  Our perception is sketchy, our memories are full of holes, and our general knowledge is studded with gaps. In trying to recall and draw a dollar bill, you will get a crude drawing with the main features only. 

Juries have this problem with evidence, as do employers looking for recruits and marketers looking for purchasing motives: all draw conclusions from limited cues.  This is what culture does: it helps us think and decide on the basis of very limited information.  Name bias, attempting to size people up by first or last name, is one example. 

In his book Things That Make Us Smart (1993), Don Norman says, “We are excellent perceptual creatures who see a pattern and immediately understand it.  Another common phrase used in psychology to describe this state is ‘going beyond the information given.’  A simple fragment of information and we immediately recognize the whole…Sometimes we can identify a friend or relative from a cough or footstep.”  Sampling yields errors for infrequent events and/or people.  The brain processes images against a stereotype list – a shortcut to pick out a person, object, place, or symbol.  It looks over this pattern list to match the perceived pattern with something already familiar.  In this way, acts of perception are always acts of plumbing the past to resolve unknowns in the present.  And as a whole class of studies has shown, this list is scattered, imprecise, and set up to be misleading.
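For readers who like a concrete analogy, here is a minimal, purely illustrative sketch of that kind of shortcut: label a new pattern with whichever stored prototype it sits closest to.  The prototypes, the two-number feature encoding, and the labels are all invented for the example; this is a loose analogy for the matching described above, not a model of the brain.

```python
# Toy "stereotype list" matcher: label a perceived pattern with the closest
# stored prototype. The features (rough size and loudness on 0-10 scales),
# the prototypes, and the labels are hypothetical, chosen only to illustrate
# nearest-prototype matching.

def distance(a, b):
    """Squared distance between two feature tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

prototypes = {
    "cat":   (2, 3),
    "dog":   (4, 6),
    "truck": (9, 8),
}

def recognize(perceived):
    """Return the prototype label nearest to the perceived pattern."""
    return min(prototypes, key=lambda label: distance(prototypes[label], perceived))

print(recognize((3, 5)))   # -> 'dog': the nearest familiar pattern
print(recognize((8, 9)))   # -> 'truck', even though the fit is imperfect
```

The second call makes the point of the paragraph: the matcher always returns some familiar label, even when the fit is poor, which is exactly why such shortcuts are fast, usually serviceable, and occasionally misleading.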

However, through the magic of adapting ideas to their current use, it works for us.  We know to look at a real dollar bill to resolve our mental picture of it to clarify relationships and correct errors. (Stage II)

 

                                                                              Stage II - Artist’s rendition of the $1 bill

Intelligence

Human intelligence lies not in the number of neurons in the brain but in the connective system between brain cells, which makes hundreds of thousands of connections.  This connectivity of ideas and images is the basis of adaptive intelligence and creativity.  But from a computing standpoint, this system is far from stable; it is prone to the fluidity of memory and subject to so many “errors” that it could never be considered an accurate system of fixed facts and figures.

Memory

Cognition is anything but a precise system.  We live in our heads, not only in our imaginations but in the imaginative reconstructions of reality that occur every time we remember and rebuild the reality we live in.  We process the raw materials of memory and what is around us (perception) to create a meta-reality, our version of reality, which corresponds only roughly to reality as it exists and as it exists for other people.  This is why culture exists as the web of ideas (illusions) that bonds us to the brains of others.  Culture is the shared idea of reality that we can reference and rely on, and reshape to any number of uses.  This is the way we develop our capacity to interact with the world effectively.  We don’t retrieve memories; we refashion them to our current needs.  

Hard v. Soft technology: Logic v. Language

Technological systems can be classified into two categories:  hard and soft.  Hard tech refers to systems that put technology first, with rigid, inflexible requirements for the human.  Soft tech refers to compliant, yielding systems that “informate,” providing a richer set of information and options than would otherwise be available and, most important of all, acknowledging the initiative and flexibility of the person.   

Norman continues by noting that the language of logic does not follow the logic of language.  Logic is a machine-controlled system in which every term has a precise interpretation, every operation is well-defined (rigor, consistency, no contradictions, no ambiguities).  Logic is very intolerant of error. A single error in statement or operation can render the results uninterpretable.  On the other hand, language is always open to interpretation and fine-tuning, which is the essence of dialogue and its logic of directed correction and clarification.   

Language is indeed quite different.  Language is a human-centered system that has taken tens of thousands of years to evolve to its current form, as exhibited in the multitude of specific languages across the globe.  Language has to serve human needs, which means it must allow for ambiguity and imprecision when they are beneficial, be robust in the face of noise and difficulties, and somehow bridge the tradeoff between ease of use and precision and accuracy (longer, more specific phrasing).  At its base, any language has to be learnable by young children without formal instruction, be malleable, continually able to change and adapt itself to new situations, as well as very tolerant of error.

Like language, then, pattern-making usually works well enough that we think of it as reliable. As larger-than-life patterns, stereotypes have earned a bad reputation for occasionally being wrong.  But against that liability is the evidence that they are usually reliable.  If this were not the case, they wouldn’t proliferate or have any reputation at all to worry about.

“Archetype” is a better way of thinking about our thinking – ideal prototypes (from the Greek “original pattern”) that represent whole categories.  Types are the basic currency in which our minds deal, and the cast of myths and storytelling.  Especially central in thinking about people, as in Jung’s 12 universals, they are balanced by the persona or self at the center—Latin for “mask.”  Understanding the world effectively has a strong link to drama and themeing—very far from the stage of computing.

Wednesday, June 14, 2023

Exogamy and Diversity


                                                                                                    *Image from Pixabay

Humans live in small groups dictated by Dunbar’s number, the limit of active relationships that can be managed as a mental system, around 150 people.  However, we need to go outside our close genetic lineage for marriage partners in order to opt for genetic diversity, a seeming contradiction to the close social bonds cultivated by territorial bias.  Exogamy, or outbreeding, is the fusion of reproductive cells from distantly related or unrelated individuals.  And it comes with a host of advantages beyond the genetic.

Humans are wild breeders, meaning that there is no set pattern to our marriage and reproductive choices, except that of excluding close relatives.  This means that we must actively seek out and recruit different genetics for reproduction, including racial and ethnic “others.”  Thus the appeal of the exotic man or woman – like Prince Harry’s choice of a mixed-heritage partner to bring into the British Royal family.  Did this cause a stir?  Indeed it did.  But the racial factor in Meghan’s makeup was the least controversial aspect of the coupling.  This was far more a matter of class. 

North American Inuit tribes live in paired kinship groups called moieties, two relatively even groups that marry each other under assigned totem names.  This keeps lineages separated until they pair off. Intermarriage long ago became an excellent way to produce in-laws from warring groups to keep the peace, with continuing exogamy as a primary tool for maintaining alliances between diverse groups.

Beyond the social and political bonds, the gene pool is greatly enriched by intermarriage, meaning more combinations leading to genetic innovation.  In addition, “once the members of human groups began to marry outsiders, and thus to spread beyond their own relatively narrow limits, the knowledge of one group became potentially the knowledge of all, and the possibility of human progress was vastly increased” (Life Nature Library, The Primates).  Diversity is a source of enrichment and biological progress, the banner of DEI programs so much a part of corporate awareness following George Floyd’s May 2020 death.  His long criminal record makes him a problematic champion as well.

Americans have long been multiracial, considering that race is an informal concept without any scientific validity or formal definition.  Ask Americans about their ancestry, and they will instantly cite the most divergent strand, not the boring mainstream example: the cattle rustler or bandito, the immigrant or eccentric, not the shopkeeper or accountant.  American culture favors the bottom caste, especially as a point of origin for later success.  There are no purebreds here, because we don’t depend on bloodlines to establish anything important, like citizen status, voting rights, property holding, or clan membership.  Our head of state was famously multiracial himself: Barack Obama. 

Since the late 1960s, with the SCOTUS decision in Loving v. Virginia (under Equal Protection and Due Process), all marriages between races have been legal, setting the precedent for legal same-sex marriage as well. Since 1967 there has been a steady increase in out-marriage between ethnic groups, from 3% in 1967 to 19% of all newlyweds now marrying someone of a different ethnicity.  The average across all married people is 10%, or about 11 million (Pew study, 2017), a fivefold increase since 1967.  In California in particular (a style leader for the nation), white-plus families abound, and it’s almost unnecessary to state “My son is married to a Japanese,” or Chinese, or Latino, Jew, Indian, Iranian, etc.  My own California family is heavily outmarried: 60% among five siblings, up to 80% if you let in the Irish. 

Still, whites are least likely to intermarry, Asians and Hispanics most likely.  Thirty percent of Asians are outmarried, and nearly as many Hispanics (who are, in fact, counted in the White classification).  This carries us beyond the European pale, where Italians used to be considered a sub-white group (along with Jews and the Irish).  The rate of outmarriage reliably rises alongside college education, in keeping with middle-class values prevailing over racial stigma.  Striving middle-classers tend to make race far less important than personal achievement.

When your in-laws are members of another group, your feelings about that group improve instantly. And by the time these half-other children begin to have their own children with other half-others, it almost becomes irrelevant to try to name the four to eight other groups that cover their offspring's heritage.  As the US becomes more middle-class, intermarriage will become more the norm and less the exception.  Middle-class mixed unions ignore race because it’s the class orientation that becomes the common bond.  For the middle class, race just mostly goes away.  Fifty-five years after Loving, public approval of interracial unions had risen from 5% in the 1950s to 95%, virtually universal, in 2021 (Gallup).  


Thursday, May 25, 2023

The Cost of Excellence


 “Excellence is never an accident. It is always the result of high intention, sincere effort, and intelligent execution; it represents the wise choice of many alternatives - choice, not chance, determines your destiny.”    ― Aristotle

            “The best is the mortal enemy of the good.” -- Montesquieu

Photo: Pixabay

Bias Part III

In the relentless pursuit of quality standards, and in competing to express them, we automatically show our bias against anything but best-in-class.  If we pursue the top nominee for “Best cat breeds for catching mice,” then we must discriminate against less talented mousers.  If we look only at top colleges, we ignore all other options.  We also daydream about absolute top quality in marriage partners, homes, career, and car – the top big-ticket decisions in a lifetime.  It would be rare for anyone to achieve top-quality results in all these categories; even the very successful can’t manage to pull that off. 

While working or waiting for ideal opportunities, there are many more decisions that are fated to yield less-than-stellar outcomes.  Rarely do all big-ticket criteria align for the perfect world we hold in our heads.  Aristotle championed the excellent while also promoting the Golden Mean as the avenue to avoid the extremes of the excellent and the abysmal.

In practice, though, of course, people can’t perform at their best or fit the top-ten criteria for everything, from driving to cooking, singing, organizing, playing bridge, managing a portfolio, or giving presentations.  We do below-best most of the time, and that has consequences across the board for quality of life and reputation. “Anything worth doing is worth doing well.”  True, but we don’t always choose to pay for that option.  The costs of operating at that level are too high.  Or we must concentrate on one area of life at the expense of others.  The cognitive strain exacted by excellence means we apply high effort only selectively.  On his site FergusonValues.com, Robert Ferguson notes that for the Forbes 500, Excellence is the third most popular core value, after Integrity and Respect.

Social scientist Herbert Simon articulated the cognitive limits to effort and focus in studying complex problems with high demands.  When things get too complex or hard to evaluate, we default to “satisficing,” making an effort good enough for the situation and its goals to get the job done, even if the outcome is not top-ranking.  Satisficing sees that the job is taken care of but doesn’t impose a mandate for excellence.  This departs from the classical Rational Man theory of economics, which assumes people know what they want and the logical price they are willing to pay for any given choice, like college. Too often we are dealing with incomplete information, with limited resources and energy.  In everyday situations, entropy rules over excellence.
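To make the contrast concrete, here is a minimal sketch, in Python, of satisficing versus maximizing over a set of options.  It is a toy illustration of Simon's idea under assumed inputs (the option list, the scoring function, and the aspiration threshold are all invented for the example), not an implementation drawn from his work.

```python
import random

def maximize(options, score):
    """Classical 'Rational Man': score every option and keep the single best."""
    return max(options, key=score)

def satisfice(options, score, aspiration):
    """Simon-style satisficing: accept the first option that clears the
    aspiration level; if none does, settle for the best seen so far."""
    best_so_far = None
    for option in options:
        if best_so_far is None or score(option) > score(best_so_far):
            best_so_far = option
        if score(option) >= aspiration:
            return option          # good enough for the situation and its goals
    return best_so_far

# Toy example: 1,000 hypothetical 'colleges', each with one quality score in [0, 1].
random.seed(1)
colleges = [random.random() for _ in range(1000)]
quality = lambda c: c

print(maximize(colleges, quality))        # examines all 1,000 options
print(satisfice(colleges, quality, 0.9))  # typically stops after a handful
```

The point of the sketch is the stopping rule, not the numbers: the satisficer trades the guarantee of the very best outcome for a large saving in search effort.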

In engineering and economics, this situation is called “theory of second-best.”  No system operates in all its parts and dynamics at top efficiency all the time, and any aspect that isn’t fully operational impacts the effect of every other aspect of the system, as in welfare economics entitlements. There are too many errors to make, and few ways to be top-notch, compared to hundreds or thousands of chances to be less than that.  A basic human brain problem is that there are two brains: we make decisions and take action both on the rational and the non-rational sides—the reason cognitive economics began to study both, venturing beyond the Rational Man theory.

Diversity programs in all sectors of society are dedicated to breaking down the hierarchy of success by insisting that the successful better represent subset groups within the culture.  To defuse class envy and inequality, Santa Monica High School in California has closed down its honors program in English, a radical move against excellence based on merit achievement.  As amazing as this sounds for an academic institution devoted to developing minds to their fullest extent, it is a logical step under the assumption that the top ranks of students express privilege based on unequal advantages such as educated parents in homes full of books.  SAMO’s home page declares its mission as “Extraordinary achievement for all students while simultaneously closing the achievement gap.”  This noble confusion might be rephrased as “Get great, but not so great as to be unequal.” 

On another front, Congress is debating a “Worst Passengers” list, a nationwide no-fly blacklist to bar unruly fliers.  “But in a perfect world, who else would be prevented from flying?  Chatty or entitled passengers? Babies?” (Elliott Advocacy).  The no-fly list is of cultural interest, because it reflects our collective ideas of profiling bad actors.  The nature of close quarters at high altitudes makes this profiling critical as compared to issues on the ground. One would think that suspected terrorists would come first, followed by anger-management failures, then on to the unruly.  Alcoholics, drug addicts, spastics, mental patients, maybe even the anxious and depressed could follow.  Babies and their behavior included.  Comfort animals other than dogs.  And yes, hygiene-compromised passengers as well.  This could become a long and inclusive list.  Any condition that promotes “disruptive” behavior would be eligible, and that, when you think about it, is a widely distributed trait: anyone who fails to fit “normal” parameters.  Exactly like high achievers, just at the other end of the scale.  

Excellence and the competition for virtuosity is the root cause of inequality.  Any effort to separate people based on merited achievement creates an obvious rift: the top 1% versus everyone else, as in the extreme wealth curve.  Sifting for criteria, either competence or character-based, is a discriminatory act.  This happens constantly at all levels of behavior, within our own actions and in the way we think about and judge others and their origin groups.  How are we to reconcile Excellence with Equity?

Monday, May 15, 2023

Ranking: Perils and potentials

“Without changing our patterns of thought, we will not be able to solve the problems that we created with our current patterns of thought.”     --Albert Einstein


Bias Part II 

THE JASTROW ILLUSION


Compare these two stacked curves.  Which is longer?  

This is a classic optical illusion, from the nineteenth century. In fact, the two are actually identical.  The illusion vanishes with a change in perspective to upright/vertical.  The human brain is automatically comparing everything it sees.

Ranking is a human proclivity, and it is all around us.  SEO (search engine optimization) ratings, US News Best Colleges, The Olympics, pro sports and amateur sports, Amazon product reviews, happiness rankings of countries worldwide, employee job applications, political candidates’ approval ratings, reputation polls.  In fact, it is impossible for anyone to examine two objects within the same category without ranking them in some way on some feature.  These can include reputation, performance, brand, cost, design, range of uses, aesthetics, color, size, speed, efficiency, and dozens of other basic aspects.  Think about the time and energy we all expend in comparing ourselves to others.  We compare along these lines and beyond – without having any way of confirming these ratings except a general anxiety about the need to do so.  Our social media scores are a simple example.

Dominance

Top Ten lists are everywhere and cover everything imaginable, including longest-reigning monarchs, youngest state leaders, no-hitter record pitchers, highest jumpers, most innovative countries, winning tips for college-level essays, video game characters, famous astronauts, hang-gliding champions, chess minds, Nobel Peace Prize winners, teams with the largest stadiums, quickest female Paralympians, and, of course, Best Top Ten lists.  The recent obituary of singer Harry Belafonte ranks him as the first Black Emmy and Tony Award winner as well as the first artist of any race to sell one million albums (“Calypso,” in 1956). (The Week, May 12, 2023)


Our hourly ruminations consist of searching for clues to our standing compared to others.  Talent, wealth, perception, power, influence, trustworthiness, and romantic interest are all rankings we seek to compete and excel in.  These form dominance hierarchies in every society, and they serve a purpose.  As systems expert Peter Erdi puts it in his book Ranking, “Dominance hierarchies are very efficient structures at very different levels of evolution.  They have a major role in reducing conflict and maintaining social stability…to regulate access to these resources [food and mates].”  

Dominance ranking is a great mechanism to maintain the status quo, so that people (and animals in general) have a good idea of where they stand, and where they would like to stand in the future. Dominance goes beyond power, leadership, and authority to include influence, expertise, competence (toward virtuosity), and trustworthiness (a brand of social equity).  Think of writers, athletes, musicians, artists, and inventors and their role as models of prestige.

Emergent properties

Ranking and valuing have their uses.  But what are the emergent properties, the unanticipated outcomes, of ranking competitions?  There are costs.  They begin with the constant need to measure and judge, ending often enough in an ongoing critical evaluation of the self as never good enough.  Constant comparison is the essential activity of social media worldwide.

The Zoom screen affords the opportunity-as-compulsion to see oneself alongside others.  The self-criticism and appraisal of our appearance against others in the screen meeting is one reason that remote meetings are as stressful as they are, regardless of the business at hand.  And while we are comparing ourselves to others on dozens of scales, they are doing the same.  No one entirely knows what their score is, but everyone acts as if they do.  Billionaire investor Charlie Munger (Warren Buffett’s business partner) declared, “The world is not driven by greed. It's driven by envy.”

The obsession with determining the best of everything is a form of “virtue bias,” the directive we all share to seek out a way of agreeing on rankings for everything from colleges to cars to cappuccinos.  So we curate “best of” lists for everything.  Whatever their standards, and whether or not those standards rest on tangible and provable truths, these lists take on a life of their own, reinforcing themselves in a self-fulfilling prophecy as the most-cited items become the most desired and best-selling. 

The cost of competition is then passed along to those beneath the top ranks: the second-place to mediocre to loser class.  Because so few of us are winners (on one scale, let alone several), we are all tarred by the bias against “second-best,” or, as a colleague once phrased it, “First Loser.”  That’s not a great-sounding placement, considering all the effort put out to make something of our lives and our reputations.  Just a reminder that talent is not equally distributed.  Neither is the work ethic necessary to maximize that talent.  This is why equality is such a tricky concept to pin down and engineer.  The social contest is not a level playing field, and some of that levelling is under our own control, while the start-points (family, location, culture, ethnicity, wealth, class) are more steeply slanted as well as harder to equalize later in life.

These contests, in operation in all domains of life, are one way to find information useful in making choices and investments of our time, money, and attention.  To this end, we seek out the best possible schools--including preschools--for our children, politicians who will represent our interests, cars we can rely on to confer status as well as deliver performance, and books that will reward the time invested in reading them.  We seek out friends who will enhance our efforts by reinforcing our values, making them worth the precious time invested in socializing.  We hope for college roommates whose good character and work habits will encourage our own school success (as important, some studies show, as the quality of the school attended).  President-to-be Franklin Pierce had such a roommate at Bowdoin College, one who fired up his ambition and work habits.  We choose homes in the most advantaged parts of town we can afford in order to enjoy quality neighbors; colleagues to match our interests, goals, and lifestyles; a marriage partner, ditto.  Such preferences are quality-control devices, deployed by our social group as systematic bias protection against poor judgments.

Ratings are supposed to help us distinguish between good and less effective use of our resources: time, wealth, energy, reputation.  Life is largely an efficiency game, one we seek to win at as often as possible, by aiming to win each time.

Outcomes and correctives

When recorded music became available on record and radio, everything else started to sound amateurish, homegrown, or less-than-professional (John Philip Sousa, consummate composer in many genres, predicted this effect of technology).  The music on the ground, as it migrated onstage, created its own recording traditions that nationalized the genres (folk, country, blues, and jazz), leading to their own “best-of” listings.  Belafonte’s signature “Banana Boat Song (Day-O),” a Jamaican work song out of the colonial island fields massaged by studio technologies, headed the charts in 1956.  Songwriters led by “the father of American music,” Stephen Foster, could be rewarded for their talents thanks to copyright and printing advances.

In the workplace, to compensate for the seller’s market in computer talent, companies are starting to adopt “skills-based hiring” to get around degree-based ranking of job applicants.  Applied computer skill doesn’t require the traditional four-year degree or professional title, and can be demonstrated online and at the associate level.  The focus is the distinction between certification and performance, opting for evidence-based performance over degree awards.  By the same mentality, merit-based admissions values achievement over race-based preference in college admissions.  Affirmative action continues to be an ongoing debate that pits achievement against adjustment in the cause of balance and fairness.  To erase any competition for recognition, Santa Monica High School in California has done away with its Honors program in English as an enabler of inequality, not without concern over the loss of opportunity for bright contestants who now miss out on this resume benefit.

Even bat-flipping in professional baseball, the practice of tossing the bat in the air to celebrate a home run, is a point of debate.  The practice was labelled disrespectful of the opposing team and the game itself.  More recently, flipping the bat is increasingly viewed as simple celebratory exhilaration and not an insult, realigning expectations and allowing for a more expressive game.  Even the slightest ritual carries with it a bias-based value.

All bias depends on expectations and context, as a culturally constructed virtue or vice.  Nonetheless, from the birth of human society to the present, physical height has been positively correlated with leadership potential and dominance in pecking orders.  Erdi notes that “the desire to achieve a higher social rank appears to be a universal, a driving force for all human beings.”