Friday, January 31, 2025

Bias Widening, Part 2 – Bias and Discovery

 

“Any choice of evidence depends upon the mindset of the observers.”
 --Robert Burton, MD, On Being Certain (2008)

 

Bias and discovery

Bias toward one set of ideas and against another can either keep knowledge inert—or lead to testing those beliefs to reveal new, contrary ones that work better to define and solve problems. Steven Johnson's social history of ideas (Where Good Ideas Come From, 2010) gives the example of Joseph Priestley's discovery that plants expel the oxygen that sustains earth's atmosphere. The common scientific wisdom until the late 1700s was that a plant sealed away from fresh air would, like an animal or a candle flame, soon expire.

But in a simple bell jar experiment, a sprig of mint proved this bias wrong by thriving, and by restoring the enclosed air so that a candle could burn in it again. This proof was part of Priestley's exploration of the nature of air itself. He is credited with isolating oxygen as an element, in effect “discovering” oxygen.

Idea bias

Reversing a fundamental bias has been basic to breakthrough paradigm shifts, described by Thomas Kuhn in The Structure of Scientific Revolutions as the recognition of data anomalies as clues that can reveal mistaken—or merely limited—idea bias. The move from an earth-centered to a sun-centered universe, Darwin's theory of evolution, and the discovery that DNA carries the genetic code illustrate the ability of scientists to overcome and surpass accepted truth by re-envisioning the world.

“In order to pursue long-range thoughts, we must derive sufficient reward from a line of reasoning to keep at the idea yet remain flexible and willing to abandon the idea once there is contrary evidence,” notes Robert Burton, MD, in On Being Certain (2008). Burton points to the central problem of certainty: feeling certain arises from “involuntary mental sensory systems,” neurological rather than rational, that attach to and become embedded in the ideas themselves. So “complete objectivity is not an option,” because we can't separate our ideas from our devotion to their “rightness.”

The feeling of knowing can in fact become the enemy of accuracy, because it regularly betrays us—as everyone knows who has bet on a sure thing that failed to come true: at the racetrack, the casino, on a career path, at the polls, or in a romantic engagement.

Contrary evidence must be overwhelming and undeniable—and yet, even in the face of contrary proof, we can maintain the original notion through the workings of cognitive dissonance. Bias is just one example of the ways that “feeling right” controls thinking. Better thinking, itself a function of feeling, might lead us to rationally debunk ideas that don't serve us well, but those ideas become built into the way we process information and experience. As Burton puts it, “The continuing belief that we can strip our ideas of biases runs deep and isn't limited to those with a marginal understanding of [brain] science” (p. 157).

Consciousness and culture

Neuroscientist Antonio Damasio, in Feeling and Knowing: Making Minds Conscious (2021), links the search for human consciousness in the brain to the rise of culture. He points to the role of pain and suffering in the start-up of creative problem solving. To counter the negative outcomes of physical and psychic damage (and death), our minds became attuned to finding solutions aimed at avoiding and preventing all manner of less-desirable states—physical, mental, and emotional. In this way, culture was the emergent outcome of our need for at least basic safety and security in all our many operations.

At the same time, well-being, pleasure, and joy inspired efforts and innovations to promote these states (beyond mere security) as the baseline for civilization and its promise of protections, again through creativity and shared beliefs. Our intellectual resources allowed a collective bias against pain and toward pleasure to prevail. The uniting consciousness of death, even beyond its suffering, is the powerful motive behind religion: the human attempt to deal with the inevitable yet unpredictable end of life—for the person, the end of everything—evident from Greek tragedy and the Bible onward.


DNA image – Pixabay   

Tuesday, December 31, 2024

Bias Widening by Connecting Ideas Part 1 - Idea Flow



                                            Italian Hill Village – Pixabay image

“By challenging basic assumptions, it's possible to stumble upon simple and unusual solutions to long-standing problems…. It's important to look for discoveries outside the usual suspects—[for] hypotheses worth disproving…. I'm able to pull from disciplines and subcultures that rarely touch one another….”

                                                                    --Tim Ferriss, transformation guru


Ideas can widen, traveling outward from original preconceptions—that is, bias—by immersion in idea networks that challenge and tweak that bias. For Europe, these networks were cultivated in the hill towns of northern Italy, birthplace of the Renaissance. This enterprising incubator brought together both distant and ancient cultures as sources for innovation and invention. The dominance of the church gave way to alternate worlds.

Exposure to other operating assumptions for doing things yields new concepts of problems and new approaches to solving them. Getting beyond the limiting influence of bias requires other assumptions, other ways of thinking and doing—that is, the igniting of an alternate bias. Such an alternative can appear in a dominating mind, a think tank, an imagination, or another cultural mindset—within a time, a people, or a profession.

Social historian Steven Johnson points to the idea-combining power of the modern city, starting with the Italian hill villages and their cultivation of ideas essential to the Italian Renaissance (Where Good Ideas Come From, 2010). The graph of human invention runs parallel to the growth of the pulsing, interactive structures of urban life. This dense, immediate interaction was a breakthrough development, distinct from the small, isolated hunter-gatherer groups of prehistory.

With the agricultural revolution came the marketplaces that anchored settlements of thousands, then many hundreds of thousands. Trade expanded to a regional, then a global scope. Ideas could flow between people, families, clans, and cultures. The social webs of large cities, in their billions of connections, parallel the 100 trillion neural connections of the brain—the most complex network we know. The fuzzy logic of search engines now likewise enables chance meetings between subjects far afield from one another.

Limits

The only limit on this flow of ideas was the set of pre-assumptions that constrain acceptance, or even consideration, of ideas different from the assumed truth—the shared beliefs that define and channel our thinking. Shared reality makes agreement and concerted action possible. That outcome is the strong suit of conformity.

Idea transformation works by mixing unlike or unlikely elements through deliberate idea cultivation. This is the opposite of the stovepipe or silo mentality of organizations devoted to keeping information under wraps, sequestered in need-to-know vaults and private channels, out of the flow of shared idea generation. Pooled insight, or a community of truth, is sustained by the intermingled thinking of diverse minds operating on the same circuit—a working definition of culture that also proves the value of thought diversity.

Johnson says on this point, “When you work alone in an office peering into a microscope, your ideas can get trapped in place, stuck in your own initial biases. The social flow of the group conversation turns that private solid state into a liquid network” (pp. 61-62). The preexisting preferences built into solo work are the essence of bias: they predetermine which ideas will be generated and which will be selected as the focus of the work. This is why artists in every field find it hard to resist revisiting the same themes over and over, a stickiness that trades creative development for the tried and true.

The “liquid network” of ideas is essential to the image of flow as a property of social networks as they cultivate concepts into forms with social value, powered by mobility. Johnson cites the “hybrid economy” as a liquid network combining group R&D efforts with individual ideas; open networks such as Nike's open R&D lab or Alex Osborn's brainstorming sessions rework the old model of the protected genius of private enterprise. Thomas Edison was famous for his idea brilliance, but it took his team of hardworking scientists to bring those ideas to fruition.

In his 2017 commencement address at Harvard, Meta/Facebook CEO Mark Zuckerberg deplored President Trump's stand for isolation and against “the flow of knowledge, trade, and immigration” that generates innovation and invention.


Saturday, November 30, 2024

Appropriation Reconsidered

ap·pro·pri·a·tion

/əˌprōprēˈāSH(ə)n/ noun

The action of taking something for one's own use, typically without the owner's permission. “The appropriation of parish funds.”

 

Example: critics have admonished non-Indigenous people for wearing feathered headdresses or traditional regalia as costumes for Halloween.


 

Reproductions of iconic landmarks in Las Vegas 

From language to the herding of food animals, inventions quickly become community property. Before the advent of intellectual property law, the term “cultural appropriation” did not exist. Culture is essentially the development of shared property, sourced between cultures, across time. It is not proprietary to individuals or even to groups.

The modern wish to own and control familiar icons, expressions, even possessions, and to determine how and by whom they can be used, is a limited, ahistorical, and idealized notion that pits control over artifacts and ideas against the natural need to innovate, borrow, and connect. With the advent of major cities from Rome onward, the world became a stage for models of idea-sharing and the dynamic of a creative commons.

Even the settlements of the most recent Ice Age saw a flowering of arts and invention, on the way to the shared wealth of cultural exchange. Since the Middle Ages, cultural acceleration worldwide has given us the mechanical clock, the printing press, the compass, and paper money (with China a major contributor). Historically, at least, it is not an act of bullying—in which what's yours is taken away to become mine, the “without permission” edge of appropriation—but the connectivity of cultures that has been key to their development and flourishing.

The shared mental construct of language (we don't know when or where language was first born) is the legacy of many thousands of years and billions of Homo sapiens. As dominant economies produce and disseminate products, ideas, images, and stories, it becomes clear that these have contrasting functions and meanings across cultures, and varying impacts as well. World culture is a marketplace of ideas that shapes both originators and adapters. Imports also show a valuing of the foreign for its own sake, as in French wine and cuisine, Russian caviar, and Italian cars. American films, jeans, tobacco, and music are world exports long shared as first-world status symbols. As Americans, we accept this wide adoption of our icons as a natural process, along with the spread of democratic rule.

The adaptation of Buddhism to American ideals, and its understanding in the contemporary US as against its ancient Indian origins and Chinese dominance, is another case in point. The Japanese tea ceremony is an unrepeatable experience of “transience.” Christianity is worldwide, but its practice and ideology look different even between North and South America, vividly apparent in the contrast between Anglo and Latin theming of churches.
 
Appropriation is a decision-making process of group evolution. Of all the possible choices we make, none are predetermined; experimentation leads to the imported solutions and accommodations that make the most sense and therefore, over time, catch on. When times change, the desire for and use of imports (like foreign loan words) change, leaving most in the dust. The universal bias toward copying, importing, sharing, and improving can be read in the history of trade (imports and exports).
 
This dissemination can even be handed back to the original source, sometimes without recognition that it was ever adapted by Culture B from Culture A. St. Patrick's Day, for example, is heavily celebrated in Ireland only because American tourists have come to expect it to be celebrated as it is in the US (which it never was, by tradition), making March 17 a major source of tourist revenue. Before this, it was a day of worship, marked by wearing a shamrock complete with roots and dirt. Tivoli Gardens, inspired by Italy, was built in Copenhagen, where it in turn inspired the theme park in the mind of a visiting tourist, Walt Disney. Inspiration and adaptation are in no way cultural appropriation but an example of the power of the open flow of ideas.
 
Seashell and cowrie (snail) beads have been described as the original personal decoration (“adornment” in anthropology). Both personal and social identity can be communicated—and still are—by the jewelry we wear. Such beads go back as much as 300,000 years and are touted as the starting point of self-awareness and social status. But this doesn't account for less durable goods, like flowers, seeds, wood, feathers, and other perishable “pre-jewelry.”

“Sudden events” have long obscured long-term origins, says archaeologist Alexander Marshack in The Roots of Civilization (1991). Marshack proposes the concept of “cognitive archaeology”: the record of how Homo sapiens have seen, abstracted, symbolized, and imaged their world in time and space as a way of dealing with reality. He calls this approach “a part of the ongoing and broadening inquiry into the nature of being human.” His analysis of the African Ishango bone for its mathematical genius, and of the transition from the lunar to the solar calendar, illustrates this cumulative appreciation of culture.

“Transhumance,” the seasonal migration of herders with their animals, is a practice mathematician and philosopher Jacob Bronowski took up in his account of cultural evolution beyond the biological, spanning many eons, back beyond Ice Age inventions. In his exploration of “great moments of human invention” he notes: “In every age there is a turning point, a new way of seeing and asserting the coherence of the world…. That series of inventions, by which man from age to age has remade his environment, is a different kind of evolution…. Man ascends by discovering the fulness of his own gifts” (The Ascent of Man, 2013).

Like following the reindeer herds, which the Lapps continue to do, group inventions quickly become community property. “Cultural appropriation” has only recently been invented, in the sense of unjustified purloining—which borrowing clearly is not. Denim jeans are an American creation; should they be restricted to the American-born? In some countries they are considered a status symbol, and you will be approached on the street with offers to buy the ones you are wearing. American cultural innovations endure over time as untitled public goods: the accumulated property of all generations. From language, fire, tools, and weapons to the pencil, the Mercator map, and the theory of evolution, this common wealth has accelerated the pace of progress from the calculating machine to the current artificial intelligence frontier.
 
The mark of multiculturalism is a global culture that emerged thousands of years before the digital age or the World Wars. This ongoing traffic in ideas, objects, and practices is a mapping of memes that enjoy a life of their own through repetition, imitation, and morphing. A French Academy can't hope for a pure French language or expect its speakers to bar “foreign” elements. When you teach me how to read, that transfer doesn't subtract from your own ability to read—nor from your further ability to teach innumerable others.

Wherever invented, no group or locale has a lock on the idea of written or spoken language, nor on what is written or spoken within them. This is the way culture works—and, in fact, what it is. Its higher values are what make it a unique civilization or a singular cultural influence on the rest of the planet, not just its content, its museum of people, places, and things. Do Brits complain about “colonials” holding high tea? The ritual looks different in Bangalore than in Birmingham because it is shaped by the borrowing culture, which keeps some elements while letting go of others less consonant with its needs. On the other hand, the pineapple and Boxing Day in Great Britain are Christmas symbols that never mingled with American yule traditions; these are rooted in invisible but highly active class values.

The complex nature of culture and cultures, now a global showcase for interacting material, ethical, and economic processes, makes the journey of appropriation widespread, rapidly evolving, and tricky to model or track. Overall, mass media and social media have made the mapping of material and social diffusion increasingly complicated. One thing is certain: cultural exchange is happening at an increasing rate and will not slacken anytime soon. We can only look to the collective fruits of that exchange over time as a promise, not a threat, of more to come.


Thursday, October 31, 2024

Lifecycle of Values: Human Transformation by Age Stages -- Why our buying values change over time

 

Cultural Studies & Analysis has developed a set of basic tools relating to human behavior in groups, the basic unit of culture.  The Age Stage Development system charts human development from birth into the mid-seventies (and beyond).  This chart is the secret weapon for the ad agency, consumer products company, experience design studio, and most recently, the transformation economy.  

We have distilled a large body of work by physiologists and psychologists documenting the evolution of human needs over a lifespan. By comparing this to the consistent cultural patterns of behavior over time, it is possible to develop a picture of the human lifespan as a decision-making process. In sum, we are all a different person every twenty years—in our thinking, our needs, and our buying lives.

This development process is not linear but cyclical: a repeating progression of awareness, learning, application (reconciling), and transformation. As we move from one developmental stage to the next, we edit our mental database: dropping what no longer works, making and testing new discoveries, adding information useful in our new operating environment, and developing new recognition patterns to direct new solutions to new problems.

Reconciliation stages, those of application, are particularly significant. During these periods the brain is preparing to reset for the coming stage of transformation. It subconsciously scans for significant patterns of the stage ahead, as well as for significant moments in the past. Nostalgia exerts a strong pull, as do positive visions of the future.

Every twenty years we all transition through a sort of “systems upgrade.” We emerge from the transformation stage of each cycle as a distinctive new entity: from teenager to young adult, from young adult to maturity, and from maturity into a cycle of reflection that resolves the contradictions within our 60-year-old identity toward final acceptance of our life story.
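
As an illustration only (the cycle boundaries and the four equal five-year phases are assumptions made for this sketch, not the published Age Stage chart), the cycle logic might be modeled like this:

    # Illustrative sketch of the age-stage cycle described above.
    # Assumed for illustration (not taken from the Age Stage chart itself):
    # four 20-year cycles starting at birth, each split into four equal 5-year phases.

    PHASES = ["awareness", "learning", "application (reconciling)", "transformation"]

    def age_stage(age: int) -> tuple[int, str]:
        """Return (cycle number, phase name) for a given age in years."""
        if age < 0:
            raise ValueError("age must be non-negative")
        cycle = min(age // 20, 3) + 1     # cycles 1-4; the fourth runs from 60 onward
        phase = PHASES[(age % 20) // 5]   # 5-year phases within each 20-year cycle
        return cycle, phase

    print(age_stage(23))   # a 23-year-old: cycle 2 (young adult), "awareness"
    print(age_stage(67))   # a 67-year-old: cycle 4 (reflection), "learning"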

Each transformation carries marked economic consequences. Ages 20 to 35 are the hot demographic for consumer sales because these are the most socially mobile years in American life. In the previous, adolescent transformation, teens develop an identity independent from their parents. Very soon, in the next stage, they also have several other identities thrust upon them in rapid succession—student, employee, peer-group member, partner, even parent themselves.

During this conflict of the early 20s, you can be — as far as your brain is concerned — a different person every few months. Whenever we change, our immediate environment must also transform to reflect and enhance our changing self-image. Socially mobile beings are more inclined to buy a new and wider range of “stuff” than the more self-actualized (i.e., older and more stable).

In the mid-30s, this social mobility slows. We start a second stage of reconciliation leading to transformation. We stop buying new stuff, skipping entire generations of technology before replacement. But while the things around us remain relatively stable, we start buying experiences—until our next transformation stage, when we start buying meaning.

At every stage of this process, brand loyalty can disappear as the consumer moves into a new stage; in the new mindset, values based on the product name no longer carry utility.

Age stage determines which values the consumer is drawn to at what age, and how those values are recognized and acted on. Understanding age as a process, rather than an event or a seamless progression, reveals not only why the majority of any particular age group behaves the way they do, but also where they are going, what they need, and what they will gravitate to in the future.


Age-stage Chart ©1996-2024 The Center for Cultural Studies & Analysis www.culturalanalysis.com

 


Monday, September 30, 2024

The Transformation Economy

 

 “Knowing reality means constructing systems of transformation that correspond, more or less adequately, to reality.” 

--Jean Piaget, psychologist

“[When I read a normal novel], I know what I am going to experience is reality, as expressed and transfigured through art. Reality translated to a higher plane, a more passionate intensity, than most of us can experience at all without the help of art or religion or profound emotion, but reality.  The shared world, the scene of our mortality.”

--Ursula K. Le Guin, science fiction writer

                    Earthrise image from Pixabay

Humans are famously averse to change, even positive change. We get used to things as they are and learn to deal with them that way. Just as we learn how to be a child, we must then learn to deal with adolescence. Change requires learning and focus, as well as a suspension of old truths so that new ways can begin to take hold. Yet our lives are constantly evolving, and at each new stage, transformation is the rule.

Transformation offerings start with coaching, tutoring, diet and exercise; learning new things, new situations, and new people; even a new career, improved athletic or speaking ability, and planning and seeking out the next opportunity. Even faith marketing is an offer built around the transformation of salvation.

Joe Pine, who with Jim Gilmore developed the Experience Economy in 1999, is now at work defining the next stage of economic value: the latest in a long progression from commodities to goods, to services, to experiences, and now to personal transformation—a state in which the transformed customer or client is the product.

In this trajectory of economic value, each stage is an evolution into a more sophisticated, higher-level format as the earlier stage becomes commodified—made less special and distinctive by concerns for efficiency, volume, and replicability (see Pine's Harvard Business Review article on the slippage of value at Starbucks, June 26, 2024). Businesses that cannot focus beyond the horizon to appreciate how customers seek out experiences as vehicles of transformation risk regressing back to commodities. The Starbucks premise is important because, for the cost of a latte, the high-end in-store experience confirms and encourages a quality-of-life interlude that validates a self-image of aspiration, aesthetic taste, and anticipation of a lifetime identity based on those class values. This is the same business model used by the upper-end mall, which creates an ambience of affluence to assure (or convince) customers that they can afford—and deserve to own—the upscale furniture, art, clothing, jewelry, cosmetics, shoes, and dining on offer. These offerings, properly analyzed, also contain seeds of transformation that can be identified, spotlighted, and cultivated by marketing.

At the individual level, traced by our own Cultural Studies age-stage development chart, transformation is the objective and endpoint of each developmental life cycle—awareness, learning, reconciliation, and finally transformation—repeated over four 20-year cycles from infancy to age 75 and beyond.

At the level of the economy, transformation reflects current globalization and technology growth, with increased trade and travel (the US travel industry is the world's largest), along with higher education, extreme experiences, and the search for meaning in all sorts of activities, skills, and achievements: starting with the high-stakes competition of college admission and moving across the generations to the search for transcendent meaning in longevity, what marketers call the “age wave.” In psychologist Abraham Maslow's pyramid of human needs, self-actualization is the top echelon and highest expression of human life: the transformed human being. One example is the US Army pledge to make you “Be all you can be.”

In folklore and film, the agent of change is a miracle worker or magician: Moses, Jesus, Cinderella's fairy godmother, Aladdin's genie, the Wizard of Oz, Pollyanna. Transformation was a gift. Now it is the gift we give ourselves. Learning is transformational: there is a reason the race to the top of college entry is the leading contest of life's second decade. (This makes the application essay that stage's most heavily weighted writing assignment, with a lifetime payoff.)

Pine and Gilmore's Chart

Transformation is inherent in human life across life stages. The journey from child to adolescent sets much of our personal history, with each adult stage built around a different set of needs and wants. More attention is now being paid to the aging process at the far end of life: coping with the losses and illness inherent in a final search for meaning. With the world population moving into its 60s and 70s, led by the Baby Boom, this stage focuses on spirituality, vitality, creativity, and coming to terms with dying in the final decades. Author Sebastian Junger's latest book, In My Time of Dying: How I Came Face to Face with the Idea of an Afterlife (2024), about his own near-death experience, is a current case in point.

The transformational medical spa industry has grown up to make over the look of aging; yoga and pickleball are athletic pursuits that can be practiced well into late age. The idea of age as a condemnation to infirmity and inactivity is fading as the lifespan grows through medical innovations, interventions, and their integration. People now expect to have second, third, and fourth careers; as early as the 1980s, an AARP headline asked, “What will you do for your second career?” Late bloomers are finding a welcome in industry and the arts, as well as in athletics. As for the young, expectations are just as trying: building the high-school resume designed for elite college applications, career-priming, and early entrepreneurship, as credentials for an ever-growing development of mind, body, and social network set to extend well into the eighth and ninth decades of life.

Life expectancy has more than doubled worldwide, from thirty-two at the start of the 20th century to seventy-one in 2000. With this aging revolution, the range of changes over a lifetime, including an extended old age, has multiplied, as explained in Ken Dychtwald's Age Wave: How the Most Important Trend of Our Time Will Change Your Future (1989). The world is only now figuring out what this will mean for experience and transformation as the global economy serves increasing numbers of over-65 customers and families come to include four or even five generations, blended by cross-ethnic parenthood as well as remarriages.

At least 100,000 years ago, the conquest of fire transformed human existence, channeling our diets toward cooked meat protein, expanding our brain power, and allowing us the control and security of a lighted and heated communal hearth. It exerted a kind of magic over human development by taking a natural element and creating a science and practice around it—a game changer, the kind people seek out in their personal journey of identity and self-invention. This is no mere hack or transition but an irreversible direction containing its own drivers. Even weight loss and the simple cosmetic makeover echo the transformation process: seeing the potential for change, basing the change on what is available and doable, and seeking out a new-and-improved version of the past and present self.
 
Anything that enables this process holds the potential for an economic niche, or a place in a wider field such as education, skill mastery, mate-seeking, wealth, wellness, or discovery (exploration, new skills, new connections, redefining the past). And transformations, lasting change catalyzed by experience, enjoy a higher price point and a long-term connection between the business and the customer. They also call for a deeper understanding of customer aspirations, of the development process at work over the lifespan, and of the rich potential of ongoing positive experiences for change within this ongoing relationship.

This kind of experience is transformational in itself, opening up the world through new ideas and identity-seeking: the life pilgrimage tour of Europe or of a personal heritage homeland like Israel, or 23andMe, which might transform your idea of who you are simply by naming your genetic places of origin. Among classic extreme expeditions, take climbing Mt. Everest, where enabling technology has invited many more average climbers to challenge the summit and contribute to the death count. The challenge proved to be more than amateurs could anticipate or even imagine. But for the survivors, extreme travel (even into space, at many thousands per seat) is part of the initiation value. Archaeological digs, deep-sea dives, Antarctic cruises, and safaris offer similar payoffs, and their marketing literature stresses the link between the customized extreme and benchmark or “bucket list” goals for achieving a personal best.

Membership clubs, like alumni groups, carry the experience beyond the moment into the future with the cachet of peers who have achieved the extraordinary and transformative—which can be based on talent, interest, participation, or simply cost. Transformation is the internal dimension of a social initiation—including college, graduate school, or a master class. Dale Carnegie discovered his calling as a transformer when he taught public speaking at a local YMCA.

Of course, education is a life investment in upward mobility, one that is not only super-expensive now but also requires inordinate time and focus to achieve, unlike the status of driving a Mercedes or owning a Rothko. Becoming a doctor or lawyer is a question of being, not just doing.

Consider how one kind of travel, the NASA program, has transformed our ideas of the possible by sending exploration technology into deep space and astronauts to the moon, where seeing their home planet from 250,000 miles away made for a life-changing shift of perspective. Suddenly the earth was not the central or sole concern. Along with Big Data and AI, the center of human affairs has realigned to encompass far more space and many more possible futures for the human race.

Sunday, August 25, 2024

Equal Opportunity – Santa Monica High School cancels honors classes

 

                                                                                Image by Pixabay

“Today, we live in a society structured to promote early bloomers.  Our school system has sorted people by the time they are 18, using grades and SAT scores.  Some of these people zoom to prestigious academic launching pads while others get left behind.  Many of our most prominent models of success made it big while young—Bill Gates, Mark Zuckerberg, Elon Musk, Taylor Swift, Michael Jordan.”

-- David Brooks, The Atlantic

 

Is there a way past and apart from horse racing as the model for demonstrating ability and achievement? Santa Monica High School has decided that on one educational platform there is. Since 2022-23 the school has suspended Honors courses for the freshman and sophomore years, breaking the tradition of double-tracking, or ability grouping. The impetus: to improve equity, cooperation, and participation through blended classes, leading to better class diversity. Reports are that opting out of Honors—de-tracking and blending ability levels in the lower two years—has increased enrollment in Advanced Placement college-level courses in the junior and senior years, which are also the college track. “It's about saying all students are capable—and we're going to meet them where they are,” said Sarah Rodriquez, one of the school's English teachers. The goal is to close the achievement gap between those culturally attuned and those less so, which could be called the color and class gap, in which Honors divides and stigmatizes the non-Honors student.

The school district decided that designating a set of courses as Honors unfairly selected out students of color and of less-preferred class, and so chose to keep the courses “blended” to avoid a bias in favor of better-performing, more academically fit 9th and 10th graders. Any preference for one group, it was reasoned, was in itself a denigration of less-preferred groups: the priority of “honors” conferred a dishonor on those excluded. As the school district's credo reads, “Extraordinary achievement for all students while simultaneously closing the achievement gap.” This high-tide-raises-all-boats strategy could be called “Survival for all, with the chance of fitness for everyone.”

The same idea is the force behind the prom as an event for singles or groups, expanded beyond the couples model. Anything that signals “there's a system working against us,” however much it may advance those with elite skills and aspirations, needs to be closely examined for this bias. But our meritocracy assumes that individuals are responsible for their own success, and that talent plus hard work will eventually sort the herd into its proper hierarchy by earned merit.

Also in 2022, Culver City, California, elected to level the field by eliminating the Honors label; the Sequoia Union and San Diego districts followed.

Against this move to flatten the field is a wave of opposition from both parents and students, who see subtracting Honors as a detriment to the intellectual achiever. Brainy, academically advanced students are not at the top of the high-school social hierarchy and are at risk of bullying in the early years, so the Honors separation provides some privilege in the form of protection, conferring separate status in a faster-paced classroom covering a wider scope of material.

“Comparing ourselves to others is an elementary human activity and we cannot avoid making comparisons and being compared. There is a tradeoff: favorable comparisons make us happier (at least in the short term), but unfavorable ones drive us to make things harder,” notes Peter Erdi in Ranking: The Unwritten Rules of the Social Game We All Play (2020, pp. 40-41). Wikipedia defines competition as “a rivalry where two or more parties strive for a common goal which cannot be shared where one's gain is the other's loss. The rivalry can be over attainment of any exclusive goal, including recognition.”

Sports is the poster example of achieving a clear goal.  Academics, the realm of the rational mind on a lifelong development arc, has a foggier profile.  Equality of opportunity faces a continuing debate about how to define, refine, and implement this difficult concept.

Is this a way around the traditional contest of student against student to showcase performance and achievement? The bias toward using competitive scores to gauge ability and bring out the best is long-standing as an efficient methodology. But of course this also means that “winners create losers.” Horse racing has a long history of using this measure to set records as well as stud prices. But as Laura Hillenbrand's Seabiscuit: An American Legend (2016) details, the toll of the racetrack can include injury and even death.

Still, handicapping has been a long-standing way of equalizing each competitor's chances of winning. In horse racing, extra weight is assigned to extra-capable mounts like Seabiscuit: better-performing horses carry extra weight to make the race a fairer contest, as well as more challenging for the bettors. Like affirmative action, handicapping is a method of calculating an advantage and the exact disadvantage needed to equalize it. (At one time West Point worried about a bias toward height as an unfair factor in promoting cadets.) Used in chess, golf, basketball, track and field, cycling, sailing, and auto racing, the practice treats time, distance, and points as adjustable quantities for equalizing the record of ability. Similar logic could be applied (if we could find an equitable formula) in academics to equalize opportunity.
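
A minimal sketch of that arithmetic, with invented numbers and a deliberately naive formula rather than any sanctioned handicapping system: measure each competitor's historical advantage over the field, then subtract exactly that amount from their raw result.

    # Naive handicapping sketch (illustrative only): offset each competitor's raw
    # score by their historical edge over the field, so past advantage no longer
    # decides the outcome. The names and scores are invented.

    history = {
        "Alice": [92, 95, 93],
        "Bora":  [78, 81, 80],
        "Chen":  [85, 84, 88],
    }

    all_scores = [s for scores in history.values() for s in scores]
    field_mean = sum(all_scores) / len(all_scores)

    # Handicap = how far each competitor's average sits above the field average.
    handicaps = {name: sum(s) / len(s) - field_mean for name, s in history.items()}

    def net_score(name: str, raw: float) -> float:
        """Raw performance minus the advantage the handicap is meant to cancel."""
        return raw - handicaps[name]

    for name in history:
        print(name, round(handicaps[name], 1), round(net_score(name, 90), 1))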

Countries famous for their educational elitism, such as England, Singapore, and Japan, are also known for a winner-take-all mentality, high-stress lifestyles, and even suicide upon failing to make the grade that will set the course for a lifetime. Are these consequences just a natural part of being in the ring, or should they spark concern about the wages of competition itself?

Many schools are also doing away with the SAT (Scholastic Aptitude Test) as a measure of ability that used to be standard for college applications. The SAT and the ACT (American College Testing) were conceived as ways to level the field for smart students without the class advantages of a literate background. Are schools waking up to the inefficiency of ranking tests at predicting future performance from present and past measures? While high SAT scores are useful for college entry, and essential for upper-echelon schools, effort and motivation are better predictors of long-term career success. Clearly, more ongoing studies of the social psychology of status in education will be in order. The national race to the top for elite college entry is just one example of how motivating this status can be.

Just a sampling of the competitions open to high schoolers: the Congressional App Challenge, the National Economics Challenge, the MIT Think Scholars competition, and the Computer Science Olympiad. It doesn't get more competitive than that.



Tuesday, July 30, 2024

The Moon Is a Cultural Force

 

                             Eclipsing Moon                                 Image: Pixabay



“Since the beginning of time, the moon has controlled life on earth and shepherded the human mind through a spectacular journey of thought, wonder, power, knowledge, and myth.”

--Rebecca Boyle, Our Moon: How Earth’s Celestial Companion Transformed the Planet, Guided Evolution, and Made Us Who We Are (2024)

 

Origin

The creation of the moon is a classic instance of destruction as a creative force. Four and a half billion years ago, the material of the earth and moon formed a single planet. Then a Mars-sized body called Theia (named for the Greek mother of the moon goddess Selene) collided with the earth at some 20,000 miles per hour, breaking both bodies apart. From the residue of dust and gas, gravity made our moon as well as our earth, meaning that our satellite's composition and motion can tell us about the earth's origins, too.

The philosopher Immanuel Kant called such chaos the source of creation. This is the giant-impact hypothesis, grounded in geochemistry, which also explains the moon's composition. The collision produced a giant spinning ring of vaporized rock and metal heated to four to six thousand degrees Fahrenheit—a new type of planetary object called a “synestia,” named for Hestia, goddess of hearth and home (Stewart and Lock, 2017). Eventually it cooled, and the earth emerged—after the moon formed.

Time cognition

Science writer Rebecca Boyle recently turned her sights on the moon, or rather the earth-sun-moon system, for its interest not just to science—which is quite considerable—but to culture and the making of civilizations. She begins by explaining how the moon was once part of earth. From there she points to the sophistication of prehistoric groups, who by “using the celestial bodies, learned how to grasp time, and how to control its use.” This endeavor was initiated by the moon-mound calendar at Warren Field in northern Scotland, dated to around 8,000 BCE, some ten thousand years ago.

This Mesolithic monument “marks the first time humans learned to orient ourselves in time, a major leap in cognition.” Humanity would go on to “use the moon to create religion and consolidate power through it, erecting the foundations of modern society.”

In prehistoric human minds, the moon started out as a fertility symbol, a time counter, and a form of notation. It soon progressed to a new role as a time reckoner, enabling people to orient themselves in time, imagining the future as well as recalling the past (p. 120).

Plato even asserted that the succession of days and nights, lit by the sun and moon, taught us how to count—and how to think (p. 17).

This analysis shows how a single artifact or element of the wider world can be mobilized to derive multiple levels of meaning, revealing the history and workings of culture. The moon as cultural artifact is one of many we live with every day and barely ever consider a serious cultural subject. Along with the sun, fire, water, ice, and air, these are elements of life on and off earth with deep implications for the way we think, act, organize, and imagine. They are part of our prehistoric and protohistoric cultural heritage, yet to be thoroughly analyzed as a way of exposing even our most basic operating assumptions.

Timekeeping

As the Neolithic age began twelve thousand years ago, the moon's timepiece enabled agriculture, with its seasonal monthly calendar, to replace hunting and gathering. Barley was first domesticated at Jericho. The beginnings of history as a written record, starting in Egypt around 3200 BCE, cultivated the ability to predict as well as to recall. Writing had its start in Sumer (now southern Iraq) around 3400 BCE, with cuneiform wedges on clay tablets, as did the base-60 numeric system behind our 60 minutes, 60 seconds, and 360 degrees. Uruk (now Warka) in Sumer had 80,000 residents at its height, making it among the first major literate civilizations of Mesopotamia in the fourth millennium BCE and the largest urban settlement in the world at the time.

With the launch of writing, timekeeping, land cultivation, trade, and law emerged as coevolving disciplines. The moon had already become a source of spiritual energy through moon gods and sky worship as the practice of religion; now those religions became the hierarchical order for empires. Close observation of lunar movements laid the groundwork for an observational science grafted onto religious ritual. Moon devotion and moon watching taught both a “new means of control and a new form of thinking.”

As interest in the moon's keys to understanding developed, that knowledge found applications in widening horizons down on earth. Big cities with thousands of residents dominated the ancient world. The first coins were minted and exchanged in the 7th century BCE; paper money would follow in China many centuries later. In sixth-century BCE Greece, Presocratic philosophy was born from a curiosity about the natural world and the nature of the cosmos. Meanwhile, the Persians were making advanced calculations, building on the astrological tables of the conquered Babylonians after Cyrus's victory in 539 BCE. Around 130 BCE, China's Han Dynasty opened the trade routes that would eventually link it to the Roman world. The Silk Road was actually a web of trade routes, by land and sea, that connected Asia, Africa, and Europe for nearly 1,500 years. Down its many expansive routes streamed a global civilized culture through exchange between distant groups, all of whom viewed the moon and its phases from their various positions on earth.

Science

The fifth-century BCE Presocratic Greek philosopher Anaxagoras went beyond astrological wisdom to seek globe-spanning universals; he was the first to explain eclipses and to describe the moon as “earthy” rather than made of light or vapor. His work drew away from the supernatural imagination and toward a colleagueship of rational thinking and observation, with the moon as his object of study. Thales of Miletus is reported to have predicted the solar eclipse of 585 BCE; how he accomplished this, though, is unknown. He might have deduced the pattern that solar and lunar eclipses tend to come in pairs about two weeks apart.

The high concept of creating knowledge and making it work as a wealth of opportunity in its own right would power the Enlightenment twenty centuries later. Driven by Copernicus, Kepler, and Galileo, the next great revolution would begin by breaking with the geocentric universe. Enter Aristarchus of Samos (310-230 BCE), who determined that the sun is far more distant than the moon and therefore, at the same apparent size, much larger. Given these distances and sizes, he reasoned, the earth must revolve around the sun, not the other way around. This revelation, now seemingly so obvious, took many centuries (into modernity) to establish by means of telescopic instruments. How enduring misassumptions can rule until reliable tests are devised to question them, and how credible alternatives are then proposed and proven, is the history of the scientific revolution.
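
A rough reconstruction of Aristarchus's geometry, in modern notation rather than his own: when the moon is exactly half lit, the angle at the moon between earth and sun is a right angle, so the angle φ between moon and sun as measured from earth fixes the ratio of the two distances.

    \[
    \frac{d_{\text{sun}}}{d_{\text{moon}}} = \frac{1}{\cos\varphi},
    \qquad \varphi \approx 87^{\circ} \;\Rightarrow\;
    \frac{d_{\text{sun}}}{d_{\text{moon}}} \approx 19
    \]

Aristarchus's measured angle was too small; the true value is about 89.85 degrees, which makes the sun nearly 400 times more distant than the moon. But the direction of the argument, farther and therefore (at equal apparent size) larger, survives the crude measurement.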

By the 17th century the sun-centered scheme of Nicolaus Copernicus could be proven scientifically, setting the stage for a new investigation of truth based not on faith or conviction but on observable evidence. This was not taken lightly by the Vatican, which famously persecuted Galileo for promoting the Copernican theory of the earth's revolution around the sun. The church eventually saw the light, building an observatory of its own (the forerunner of today's Vatican Observatory) as early as 1580, and Pope John Paul II apologized for the “Galileo case” on October 31, 1992. This was a mere (in historical church time) 359 years after the event, but he did say the church was sorry about being a little hasty in its judgment in that case.

The moon was central to proving the sun-centered order, based on mathematics, the telescopic lens, gravity, and motion. The moon's orbit and gravity remain critical tests of key assumptions of Einstein's general relativity. The geocentric bias was certainly the greatest barrier to thinking about the universe and our place in it. Lifting that barrier liberated all kinds of parallel thought—for example, considering the earth and moon not as two distinct systems but as a single dynamic. “Did Copernicus really understand that his certainty about the ‘chief world systems,’ as Galileo called the heliocentric and geocentric models, would upend society as he knew it?” (p. 190). Copernicus was over sixty when he made his late-blooming discovery, so perhaps he saw less at stake for himself; his major work was published in the year of his death, 1543.

Inspiration

The Apollo astronauts who went to the moon, the first to transcend earth’s boundaries, have often borne witness to the journey’s transformative impact.  This effect has come to outshine the more famous courage and farsightedness required to undertake such a momentous trip.

“Many report feeling an overwhelming sense of clarity and unity, a heart-swelling state of heightened awareness and togetherness that is common enough to have its own name: the ‘overview effect… the sense of boundaries evaporating….’ The missions even brought about a new awakening, in this case new knowledge and a different way of thinking about humanity's home and our shared experience” (Boyle, p. 235).

In July 1969 the Apollo team placed a pocket-novel-size reflector on the moon's surface that allows the earth-moon distance to be measured by laser to within a few millimeters, a precision never before possible. This new capacity is part of the moon's bounteous potential as an information package. Notes Boyle, “The moon still gives us everything it has ever given us. It reflects what we want it to reflect in our particular culture, in our particular time” (p. 245).
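
For a sense of the timing precision this implies, a back-of-the-envelope calculation (using the commonly cited average earth-moon distance, for illustration only):

    # Rough arithmetic for lunar laser ranging (illustrative, not mission data).
    C = 299_792_458.0          # speed of light, m/s
    MEAN_DISTANCE = 3.844e8    # average earth-moon distance, m (~384,400 km)

    round_trip = 2 * MEAN_DISTANCE / C   # pulse travel time out and back
    per_mm = 2 * 1e-3 / C                # timing shift for a 1 mm change in range

    print(f"round trip = {round_trip:.2f} s")          # about 2.56 s
    print(f"1 mm of range = {per_mm * 1e12:.1f} ps")   # about 6.7 picoseconds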

This includes information such as the fact that the moon's core is at least partially fluid. And a further revelation: the earth and moon are slowly but surely drifting apart, at about an inch and a half a year, with the eventual outcome of an ever-longer earth day. Over the billions of years this will take, tides will weaken and, from a smaller disc in the night sky, there will be less moonlight for night predators to hunt by. Eventually the moon will stop retreating and take up a stationary place in the sky, visible from only one side of the earth—our own version of the dark side of the moon.