Thursday, October 31, 2024

Lifecycle of Values: Human Transformation by Age Stages -- Why our buying values change over time

 

Cultural Studies & Analysis has developed a set of basic tools relating to human behavior in groups, the basic unit of culture.  The Age Stage Development system charts human development from birth into the mid-seventies (and beyond).  This chart is the secret weapon for the ad agency, consumer products company, experience design studio, and most recently, the transformation economy.  

We have distilled a large body of work by both physiologists and psychologists documenting the evolution of human needs over a lifespan. By comparing this to the consistent cultural patterns of behavior over time, it is possible to develop a picture of the human lifespan as a decision-making process. In sum, we all are a different person every twenty years—in our thinking, needs, and buying lives.

This development process is not linear, but cyclical: a repeating progression of awareness, learning, application (reconciling), and transforming. As we move from one developmental stage to the next, we edit our mental database, dropping what no longer works, making and testing new discoveries, adding useful information in our new operating environment, and developing new recognition patterns to direct new solutions to new problems.
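To make the cycle concrete, here is a minimal, illustrative sketch of how such a model might be coded. It assumes, beyond what the chart specifies, that each 20-year cycle divides evenly into four 5-year stages and that the cycle simply repeats from birth onward; the actual age-stage chart may draw these boundaries differently.

```python
# Minimal illustrative sketch of the cyclical age-stage model described above.
# Assumptions (not from the source chart): each 20-year cycle splits evenly into
# four 5-year stages, and cycles repeat from birth onward.

STAGES = ["awareness", "learning", "application (reconciling)", "transforming"]
CYCLE_YEARS = 20

def age_stage(age: int) -> dict:
    """Return the cycle number and stage for a given age under the assumptions above."""
    cycle = age // CYCLE_YEARS + 1                  # 1-based cycle: 0-19, 20-39, 40-59, 60-79...
    years_into_cycle = age % CYCLE_YEARS
    stage = STAGES[years_into_cycle // (CYCLE_YEARS // len(STAGES))]
    return {"age": age, "cycle": cycle, "stage": stage}

if __name__ == "__main__":
    for sample_age in (8, 22, 37, 63):
        print(age_stage(sample_age))
```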

Reconciliation stages, those of Application, are particularly significant. During these periods the brain is preparing to reset for the coming stage of transformation. It subconsciously scans for significant patterns of the stage ahead, as well as for significant moments in the past. Nostalgia exerts a strong pull, as do positive visions of the future.

Every twenty years we all transition through a sort of “systems upgrade.” We emerge from the transformation stage of each cycle as a distinctive new entity: from teenager to young adult, from young adult to maturity, and from maturity to a cycle of reflection that resolves the contradictions within our 60-year-old identity and brings final acceptance of our life story.

Each transformation carries marked economic consequences. Ages 20 to 35 are the hot demographic for consumer sales because these are the most socially mobile years in American life. In the previous, adolescent transformation, teens develop an identity independent of their parents. Very soon, in the next column of the chart, they have several other identities thrust upon them in rapid succession: student, employee, peer-group member, partner, even parents themselves.

During this conflict of the early 20s, you can be — as far as your brain is concerned — a different person every few months. Whenever we change, our immediate environment must also transform to reflect and enhance our changing self-image. Socially mobile beings are more inclined to buy a new and wider range of “stuff” than the more self-actualized (i.e., older and more stable).

In the mid-30s, this social mobility slows. We start a second stage of reconciliation leading to transformation. We stop buying new stuff, skipping entire generations of technology before replacement. But while the things around us remain relatively stable, we start buying experiences--until our next transformation stage, when we start buying meaning.

At every stage of this process, brand loyalty can disappear when the consumer moves into a new stage.  In the new mindset, values based on the product name no longer enjoy utility.

Age stage determines which values the consumer is drawn to at what age, and how those values are recognized and acted on. Understanding age as a process, rather than an event or a seamless progression, reveals not only why the majority of any particular age group behaves the way they do, but also where they are going, what they need, and what they will gravitate to in the future.


Age-stage Chart ©1996-2024 The Center for Cultural Studies & Analysis www.culturalanalysis.com

 


Monday, September 30, 2024

The Transformation Economy

 

 “Knowing reality means constructing systems of transformation that correspond, more or less adequately, to reality.” 

-        Jean Piaget, psychologist

“[When I read a normal novel], I know what I am going to experience is reality, as expressed and transfigured through art. Reality translated to a higher plane, a more passionate intensity, than most of us can experience at all without the help of art or religion or profound emotion, but reality.  The shared world, the scene of our mortality.”

-       Ursula K. Le Guin, science fiction writer

                    Earthrise image from Pixabay

Humans are famously averse to change, even positive change. We get used to things as they are and learn to deal with them that way. Just as we have learned how to be a child, we must start learning how to deal with adolescence. Change requires learning and focus, as well as a suspension of old certainties, so that the new ways can begin to take hold. Yet our lives are constantly evolving, and at each new stage, transformation is the rule.

Transformation offerings start with coaching, tutoring, diet and exercise, and learning new things, situations, and people; they extend to a new career, improved athletic or speaking ability, and planning and seeking out the next opportunity. Even faith marketing is an offering about the transformation of salvation.

Joe Pine, who with Jim Gilmore developed the Experience Economy in 1999, is now at work on defining the next stage of economic value: the latest in a long progression from commodities to goods, to services, to experiences, and now to personal transformation—a state where the transformed customer or client is the product.

In the trajectory of economic value, each stage is an evolution into a more sophisticated, higher-level format as the earlier stage becomes commodified, made less special and distinctive by concerns for efficiency, volume, and replicability (see Pine’s Harvard Business Review article on the slippage of value at Starbucks, June 26, 2024). Businesses that cannot look beyond the horizon to appreciate how customers seek out experiences as vehicles of transformation risk regressing back to commodities. The Starbucks premise is important because, for the cost of a latte, the high-end in-store experience confirms and encourages a quality-of-life interlude that validates the self-image of aspiration, aesthetic taste, and anticipation of a lifetime identity based on those class values. This is the same business model used by the upper-end mall, which creates an ambience of affluence to assure (or convince) customers that they can afford, and deserve to own, the upscale furniture, art, clothing styles, jewelry, cosmetics, shoes, and dining on offer. These offerings, properly analyzed, also contain seeds of transformation that can be identified, spotlighted, and cultivated by marketing.

At the individual level, traced by our own Cultural Studies’ original age-stage development chart, transformation is the objective and endpoint of a four-stage developmental cycle: awareness, learning, reconciliation, and finally transformation, repeated over each of four 20-year cycles from infancy to 75 years and beyond.

At the level of the economy, transformation reflects current globalization and technology growth, with increased trade and travel (the US travel industry is the world’s largest), along with higher education, extreme experiences, and the search for meaning in all sorts of activities, skills, and achievements: starting with the high-stakes competition of college admission and moving across generations to the search for transcendent meaning in longevity, the marketing term “age wave.” In psychologist Abraham Maslow’s pyramid of human needs, Self-Actualization is the top echelon, the most highly evolved state of human life: the transformed human being. One example is the US Army’s pledge to help you “Be all you can be.”

In folklore and film, the agent of change is a transforming figure: Moses, Jesus, Cinderella’s fairy godmother, Aladdin’s genie, the Wizard of Oz, Pollyanna. Transformation was a gift. Now, it is the gift we give ourselves. Learning is transformational: there is a reason the race to the top of college entry is the leading contest of life’s second decade. (This makes the application essay that stage’s most heavily weighted writing assignment, with a lifetime payoff.)

Pine and Gilmore’s Chart

Transformation is inherent in human life across life stages. The journey from child to adolescent sets much of our personal history, with each adult stage built around a diverse set of needs and wants. More attention is now being paid to the aging process at the far end of life, coping with the losses and illness inherent in a final search for meaning. With the world population moving into its 60s and 70s, led by the Baby Boom, this stage focuses on spirituality, vitality, creativity, and, in the final decades, dying itself. Author Sebastian Junger’s latest book, In My Time of Dying: How I Came Face to Face with the Idea of an Afterlife (2024), about his own near-death experience, is a current case.

The transformational medical spa industry has grown up around making over the look of aging; yoga and pickleball are athletic pursuits that can be practiced well into late life. The idea of age as a condemnation to infirmity and inactivity is fading as the lifespan grows through medical innovations, interventions, and their integration. People are expecting to have second, third, and fourth careers; in the 1980s, an AARP headline asked, “What will you do for your second career?” Late bloomers are finding a welcome in industry and the arts, as well as athletics. As for the young, expectations are just as demanding: building the high-school resume designed for elite college applications, career-priming, and early entrepreneurship, all credentials for the ever-growing development of mind, body, and social network set to extend well into the eighth and ninth decades of life.

Life expectancy has more than doubled worldwide from thirty-two at the start of the 20th century to seventy-one in 2000. With this aging revolution, the range of changes over a lifetime, including extended old age, have multiplied, as explained in Ken Dychtwald’s Age Wave: How the Most Important Trend of Our Time Will Change Your Future, in 1989. The world is only now figuring out what this will mean for experience and transformation as the global economy serves increasing numbers of over-65 customers, and families start to include four or even five generations, blended by cross-ethnic parenthood as well as re-marriages.

At least 100,000 years ago, the conquest of fire transformed human existence, channeling our diets toward cooked meat protein, expanding our brain power, and allowing us the control and security of a lighted and heated communal hearth. It exerted a kind of magic over human development by taking a natural element and creating a science and practice around it – a game changer, the kind people seek out in their personal journey of identity and self-invention. This is no mere hack or transition but an irreversible direction containing its own drivers. Even weight loss and the simple cosmetic makeover echo the transformation process: seeing the potential for change, basing the change model on what is available and doable, and seeking out a new-and-improved version of the past and present self.
 
Anything that enables this process holds the potential for an economic niche, or a place in a wider field like education, skill mastery, mate-seeking, wealth, wellness, or discovery (exploration, new skills, new connections, redefining the past). And transformations, lasting changes catalyzed by experience, enjoy a higher price point and a long-term connection between the business and the customer. Transformation also calls for a deeper understanding of customer aspirations, the development process at work over the lifespan, and the rich potential of ongoing positive experiences for change within this ongoing relationship.

This kind of experience is transformational in itself, opening up the world through new ideas and identity-seeking: the life pilgrimage tour of Europe or of personal heritage homelands like Israel, or the way 23andMe might transform your idea of who you are just by naming your genetic places of origin. Among classic extreme expeditions, take climbing Mt. Everest, where enabling technology has invited many more-average climbers to challenge the summit and contribute to the death count. The challenge proved to be more than amateurs could anticipate or even imagine. But for the survivors, extreme travel (even into space at many thousands per seat) is part of the initiation value. Archeological digs, deep-sea dives, Antarctic cruises, and safaris offer similar payoffs, and their marketing literature stresses the link between the customized extreme and benchmark or “bucket list” goals for achieving a personal best.

Membership clubs, like alumni groups, carry the experience beyond the moment into the future with the cachet of peers who have achieved the extraordinary and transformative—which can be based on talent, interest, participation, or simply cost. Transformation is the internal dimension of a social initiation—including college, graduate school, or a master class. Dale Carnegie discovered his calling as a transformer when he taught public speaking at a local YMCA.

Of course, education is a life investment in upward mobility, one that is now not only super-expensive but also demands inordinate time and focus, unlike the status of driving a Mercedes or owning a Rothko. Becoming a doctor or lawyer is a question of being, not just doing.

Consider how one kind of travel, the NASA program, has transformed our ideas of the possible by sending exploration technology into deep space and astronauts to the moon, where seeing their home planet from 250,000 miles away made their new perspective into a life-changing experience. Suddenly the earth was not the central or sole concern.  Along with Big Data and AI, the center of human affairs has realigned to encompass far more space and many more possible futures for the earth race.  

Sunday, August 25, 2024

Equal Opportunity – Santa Monica High School cancels honors classes

 

                                                                                Image by Pixabay

“Today, we live in a society structured to promote early bloomers.  Our school system has sorted people by the time they are 18, using grades and SAT scores.  Some of these people zoom to prestigious academic launching pads while others get left behind.  Many of our most prominent models of success made it big while young—Bill Gates, Mark Zuckerberg, Elon Musk, Taylor Swift, Michael Jordan.”

-- David Brooks, The Atlantic

 

Is there a way past horse racing as the model for showing ability and achievement? Santa Monica High School has decided that on one educational platform there is. Since 2022-23 the school has suspended Honors courses for the freshman and sophomore years, breaking the tradition of double-tracking, or ability grouping. The impetus is toward improving equity, cooperation, and participation with blended classes, leading to better class diversity. Reports are that opting out of Honors and de-tracking, blending ability levels in the lower grades, has increased enrollment in Advanced Placement college-level courses in the junior and senior years, which is also the college track. “It’s about saying all students are capable—and we’re going to meet them where they are,” said Sarah Rodriquez, one of the school’s English teachers. The goal is to close the achievement gap between those culturally attuned and those less so, which could be called the color and class gap, in which Honors divides and stigmatizes the non-Honors student.

The school district decided that designating a set of courses as Honors unfairly selected out students of color and of less-preferred class, so the courses are kept “blended” to avoid a bias favoring better-performing, more academically prepared 9th and 10th graders. Any preference for one group, it was reasoned, was in itself a denigration of less-preferred groups. The priority of “honors” conferred a dishonor on those excluded. As the school district’s credo reads, “Extraordinary achievement for all students while simultaneously closing the achievement gap.” This high-tide-raises-all-boats strategy could be called “Survival for all, with the chance of fitness for everyone.”

The same idea is the force behind Prom as an event for singles or groups expanded beyond the couples model.  Anything that signals “There’s a system working against us,” however much it may advance those with elite skills and aspirations, needs to be closely examined for this bias.  But our meritocracy assumes that individuals are responsible for their own success, and that talent plus hard work will eventually sort out the herd into its proper hierarchy by earned merit.

Also in 2022, Culver City, California, elected to level the field by eliminating the Honors label. The Sequoia Union and San Diego districts followed.

Against this move to flatten the field is a wave of opposition from both parents and students, who see the subtraction of Honors as a detriment to the intellectual achiever. Brainy and intellectually advanced students are not at the top of the high-school hierarchy and are at risk of bullying in early high school, so the Honors separation provides some privilege in the form of protection, conferring separate status in a faster-paced classroom covering a wider scope of material.

“Comparing ourselves to others is an elementary human activity and we cannot avoid making comparisons and being compared. There is a tradeoff: favorable comparisons make us happier (at least in the short term), but unfavorable ones drive us to make things harder,” notes Peter Erdi in his book Ranking: The Unwritten Rules of the Social Game We All Play (2020), pp. 40-41. Wikipedia defines competition as “a rivalry where two or more parties strive for a common goal which cannot be shared, where one’s gain is the other’s loss. The rivalry can be over attainment of any exclusive goal, including recognition.”

Sports is the poster example of achieving a clear goal.  Academics, the realm of the rational mind on a lifelong development arc, has a foggier profile.  Equality of opportunity faces a continuing debate about how to define, refine, and implement this difficult concept.

Is this a way around the traditional contest of student against student to showcase performance and achievement? The bias toward using competitive scores to gauge ability and bring out the best is long-standing as an efficient methodology. But of course, this also means that “winners create losers.” Horse racing has a long history of using this measure to set records as well as stud prices. But as Laura Hillenbrand’s Seabiscuit: An American Legend (2016) details, the toll of the racetrack can be one of injury and even death.

Still, handicapping has been a long-standing way of equalizing each competitor’s chances of winning, as in horse racing. Extra weight is assigned to extra-capable mounts like Seabiscuit; better-performing horses carry extra weight to make the race a fairer contest, as well as more challenging for the bettors. Like affirmative action, race handicapping is a method of calculating advantage and the exact disadvantage needed to equalize that advantage. (At one time West Point worried about a pro-bias toward height as an unfair factor in promoting cadets.) Used in chess, golf, basketball, track and field, cycling, sailing, and auto racing, this practice treats time, distance, and points as adjustable variables for equalizing records of ability. Similar logic could be applied (if we could find an equitable formula) in academics to equalize opportunity.

Countries famous for their educational elitism, such as England, Singapore, and Japan, are also known for the winner-take-all mentality, high-stress lifestyles, and even suicide on failing to make the grade that will set the course for a lifetime.   Are these consequences just a natural part of being in the ring, or should they spark concern about the wages of competition itself? 

Many schools are also doing away with the SAT (Scholastic Aptitude Test) as a measure of ability that used to be standard for college applications.  The SAT/ACT (American College Testing) was conceived as a way to level the field for smart students without the class advantages of a literate background.  Are they waking up to the inefficiencies of ranking tests in their ability to predict future performance by present and past measures?  While high SAT scores are useful for college entry, and essential for upper-echelon schools, effort and motivation are better predictors of long-term career success.   Clearly, more ongoing studies of the social psychology of social status in education will be in order.  The national race to the top for elite college entry is just one example of how motivating this status can be. 

A sampling of the competitions open to high schoolers includes the Congressional App Challenge, the National Economics Challenge, the MIT Think Scholars competition, and the Computer Science Olympiad. It doesn’t get more competitive than that.



Tuesday, July 30, 2024

The Moon Is a Cultural Force

 

                             Eclipsing Moon                                 Image: Pixabay



“Since the beginning of time, the moon has controlled life on earth and shepherded the human mind through a spectacular journey of thought, wonder, power, knowledge, and myth.”

--Rebecca Boyle, Our Moon: How Earth’s Celestial Companion Transformed the Planet, Guided Evolution, and Made Us Who We Are (2024)

 

Origin

The creation of the moon is a classic instance of destruction as a creative force. Four and a half billion years ago, the earth and moon were a single planet. Then a Mars-sized body called Theia (the Greek mother of the moon goddess Selene) collided with earth at 20,000 miles per hour, breaking both bodies apart. From the residue of dust and gas, gravity made our moon as well as our earth, meaning that our satellite’s composition and motion can tell us about the earth’s origins, too.

The philosopher Immanuel Kant called such chaos the source of creation. This is the giant-impact hypothesis, based on geochemistry that also explains the moon’s composition. The collision formed a giant spinning ring of vaporized rock and metal heated to four to six thousand degrees Fahrenheit, called a “synestia” (“two homes”), a new type of planetary object named for Hestia, goddess of hearth and home (Stewart and Lock, 2017). Eventually it cooled, and the earth emerged—after the moon formed.

Time cognition

Science writer Rebecca Boyle recently turned her sights on the moon, or the earth-sun-moon system, for its interest not just to science—which is quite considerable—but to culture and the making of civilizations. She begins by explaining how the moon was once part of earth.  From there she points to the sophistication of prehistoric groups, who by “using the celestial bodies, learned how to grasp time, and how to control its use.”   This endeavor was initiated by the moon-mound calendar at Warren Field in northern Scotland in 3,800 BCE, nearly six thousand years ago.  

This Neolithic monument “marks the first time humans learned to orient ourselves in time, a major leap in cognition.”  Humanity would go on to “use the moon to create religion and consolidate power through it, erecting the foundations of modern society.”

In prehistoric human minds, the moon started out as a fertility symbol, a time counter, and a form of notation. It soon progressed to a new role as a time reckoner, enabling people to orient themselves in time, imagining the future as well as recalling the past (p. 120).

Plato even asserted that the succession of days and nights, lit by the sun and moon, taught us how to count—and how to think (p. 17).

This analysis shows how a single artifact or element of the wide world can be mobilized to derive multiple levels of meaning, revealing the history and workings of culture. The moon as cultural artifact is one of many we live with every day and barely ever consider a serious cultural subject. Along with the sun, fire, water, ice, and air, these are elements of life on and off earth with deep implications for the way we think, act, organize, and imagine. They are part of our prehistoric and protohistoric cultural heritage yet to be thoroughly analyzed to explore even our most basic operating assumptions.

Timekeeping

As the Neolithic age began twelve thousand years ago, the moon’s timepiece enabled agriculture, with its seasonal monthly calendar, to replace hunting and gathering. Barley was first domesticated in Jericho. The beginnings of history as a written record, starting in Egypt in 3200 BCE, cultivated the ability to predict as well as recall. Writing had its start in Sumer (now southern Iraq) around 3400 BCE, with cuneiform wedges on clay tablets, as well as the base-60 numeric system behind our 60 minutes, 60 seconds, and 360 degrees. Uruk (now Warka) in Sumer had 80,000 residents at its height, making it among the first major literate civilizations in Mesopotamia in the fourth millennium BCE and the largest urban settlement in the world.

With the launch of writing, timekeeping, land cultivation, trade, and the law emerged as coevolving disciplines. The moon had already become a source of spiritual energy through moon gods and sky worship as the practice of religion; now those religions became the hierarchical order for empires.  Close observation of lunar movements laid the groundwork for observational science grafted from religious ritual.  Moon devotion and watching taught both a “new means of control and a new form of thinking.”

As interest in the moon’s keys to understanding developed, that knowledge had applications to widening horizons down on earth. Big cities with thousands of residents dominated the ancient world. The first coins were minted and exchanged in the 7th century BCE; the first paper money was created in 130 BCE. In sixth-century BCE Greece, pre-Socratic philosophy was born from a curiosity about the natural world and the nature of the cosmos. Meanwhile, the Persians were making advanced calculations, building on the astrological tables of the conquered Babylonians after Cyrus’s victory in 539 BCE. China’s Han Dynasty opened trade with the Roman Empire in 130 BCE (the same year as paper money). The Silk Road was actually a web of trade routes, land and sea, that connected Asia, Africa, and Europe for nearly 1500 years. Down its many expansive routes streamed a global civilized culture through exchange between distant groups. They all viewed the moon and its phases from various positions on earth.

Science

The fifth-century BCE pre-Socratic Greek philosopher Anaxagoras went beyond astrological wisdom to seek globe-spanning universals; he was the first to explain eclipses and to describe the moon as “earthy” rather than light or vapor. His work drew away from the supernatural imagination and toward a colleagueship of rational thinking and observation, with the moon as his object of study. Thales of Miletus is reported to have made the first prediction of a solar eclipse, in 585 BCE; how he accomplished this, though, is unknown. He might have deduced the pattern that solar and lunar eclipses come in pairs about two weeks apart.

The high concept of creating knowledge and making it work as a wealth of opportunity in its own right would power the Enlightenment 20 centuries later. Driven by Copernicus, Kepler, and Galileo, the next great revolution would begin by breaking with the geocentric universe. Enter Aristarchus of Samos (310-230 BCE), who determined that the sun is far more distant, and therefore much larger, than the moon. Given these distances and sizes, earth must revolve around the sun, not the other way around. This revelation, now seemingly so obvious, took many centuries (into modernity) to establish by means of telescopic instruments. How enduring misassumptions can rule until reliable tests are devised to question them, and then how credible alternatives are proposed and proven, is the history of the scientific revolution.

By the 17th century the sun-centered scheme of Nicolaus Copernicus could be proven scientifically, setting the stage for a new investigation of truth based not on faith or conviction but on observable evidence. This was not taken lightly by the Vatican, which famously persecuted Galileo for promoting the Copernican theory of the Earth’s rotation around the Sun. The church eventually saw the light, building its own Vatican Observatory in 1580, and Pope John Paul II apologized for the “Galileo Case” on October 31, 1992. This was a mere (in historical church time) 359 years after the event, but he did say the church was sorry about being a little hasty in its judgment in that case.

The moon was central to proving a solar-centric order, based on mathematics, the telescopic lens, gravity, and motion. The moon’s orbit and gravity are critical to Einstein’s key assumptions for general relativity. The geocentric bias was certainly the greatest barrier to thinking about the universe and our place in it. Once that barrier was broken, its lifting liberated all kinds of parallel thought, for example, in considering the earth and moon not as two distinct systems but as a single dynamic. “Did Copernicus really understand that his certainty about the ‘chief world systems,’ as Galileo called the heliocentric and geocentric models, would upend society as he knew it?” (p. 190). As Copernicus was over sixty when he made his late-blooming discovery, perhaps he had less at stake; his major work was published in the year of his death, 1543.

Inspiration

The Apollo astronauts who went to the moon, the first to transcend earth’s boundaries, have often borne witness to the journey’s transformative impact.  This effect has come to outshine the more famous courage and farsightedness required to undertake such a momentous trip.

“Many report feeling an overwhelming sense of clarity and unity, a heart-swelling state of heightened awareness and togetherness that is common enough to have its own name: the ‘overview effect…the sense of boundaries evaporating.’ The missions even brought about a new awakening, in this case new knowledge and a different way of thinking about humanity’s home and our shared experience” (Boyle, p. 235).

In July 1969 the Apollo team placed a pocket-novel-size reflector on the moon’s surface that allows laser measurement of the earth-moon distance to within a few millimeters, a precision never before possible. This new capacity is part of the moon’s bounteous potential as an information package. Notes Boyle, “The moon still gives us everything it has ever given us. It reflects what we want it to reflect in our particular culture, in our particular time” (p. 245).
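The arithmetic behind that precision is simple to sketch: the measurement is just the round-trip travel time of a laser pulse multiplied by the speed of light. A minimal back-of-the-envelope sketch, assuming an average earth-moon distance of about 384,400 km (a figure not given above):

```python
# Back-of-the-envelope lunar laser ranging arithmetic.
# Assumptions (not from the text): average earth-moon distance ~384,400 km,
# pulses traveling at the vacuum speed of light.
C_KM_PER_S = 299_792.458        # speed of light in km/s
DISTANCE_KM = 384_400           # assumed average earth-moon distance

round_trip_s = 2 * DISTANCE_KM / C_KM_PER_S
print(f"round-trip pulse time: {round_trip_s:.2f} s")          # roughly 2.6 seconds

# Millimeter-level distance precision means timing that pulse to a few picoseconds:
timing_precision_s = 2 * 1e-6 / C_KM_PER_S                      # 1 mm (in km), round trip
print(f"timing precision needed: {timing_precision_s * 1e12:.1f} ps")
```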

This includes information such as the fact that the moon’s core is at least partially fluid. And a further revelation: the earth and moon are slowly but surely drifting apart at about an inch and a half a year, with the eventual outcome of a lengthening earth day. Over the billions of years this will take, there will be decreasing tides and, from a smaller disc in the night sky, less moonlight for night predators to hunt by. Eventually the moon will stop retreating and take up a stationary place in the sky, visible from only one side of the earth, our own version of the dark side of the moon.



Sunday, June 30, 2024

Body, Brain, Behavior, and Bias

 

Photo from Pixabay


“It is not the strongest of the species that survive, nor the most intelligent,
 but the one most responsive to change.”  -- Charles Darwin

 The Anthropocene 

The Earth we live on today is a man-made and human-managed project – the world has finally become a human artwork, shaped to human needs and values. “The Anthropocene” has been proposed to describe the human-created earth environment since 1950.

This latest era is focused on human activity as the dominant influence on climate and the environment, starting with the Great Acceleration, the dramatic increase in the effects of our activity on the planet’s viability for human survival (measured by the sudden spike in radioactive plutonium). The naming is controversial, and only a panel of geologists--the Anthropocene Working Group--can propose its official use. But in discussions of climate change, the Anthropocene has been mobilized and continues to be treated as a concept with a real existence in public debates on human life and its environmental dynamics. (The Holocene is still the “present” era, dating from 11,690 years ago and the end of the last Ice Age.) The word Holocene was formed from two Ancient Greek words: holos, meaning “whole,” and -cene, from kainós, meaning “new.” The meaning is that this epoch is “entirely new.”

Now at eight billion and counting, it would be surprising if humankind’s population alone failed to show environmental impacts. But our own limits draw the lines around our abilities and potential to manage and reinvent our surroundings and powers.

These include our life expectancy, brain capacity, linguistic ability (tied to critical thinking), communication skills (as in sociability), and mobility. These all combine to determine the outer limits of our collective future. For example, our ability to control fertility (reducing the birthrate) in the 20th century was assumed to be the first step to prosperity, whereas traditionally, family size was an economic asset and children a sign of wealth that also predicted any group’s chances of survival.  The new abilities to prevent illness and forestall death were a major factor.

Body and Brain

Cultural analysis begins with human nature – the physical properties and dynamics of the body and the brain, the domain of neuroscience. The life sciences study the growth and structure of the body and its biological capacities. Everything people do starts with the body: its genetic design (DNA), its growth and change over a lifetime, and its corresponding needs, including its tolerances (limits). Every culture on earth starts here.

Whatever humans do begins and ends with the limitations of our body and its controlling brain. Even with augmentations via bioengineering, such as the recently invented robotic third thumb that may be deployed to make human hands even more dexterous, and advances in medicine, cognitive science, competitive sports, and space travel, human factors have limitations that can’t easily be surpassed by average people (including genetic sex). Since the coming of Homo sapiens nearly 200,000 years ago, every technology has aimed at the goal of making human beings more effective, faster, smarter, stronger, and more adaptable to hundreds of different circumstances and scenarios. And power over reproduction, which modified the adult developmental curve, prolonged adolescence indefinitely by making parenthood a matter of choice by timing.

Other technologies affect parenting, sexual mores, information processing, processing speed, memory, mobility, strength, resilience, anesthetics, hunger and thirst control, addiction, sleep needs, and the ability of groups to defend and promote their way of life.  To be outside the group was and is to be defenseless and lost. The ancient Greeks had it that “One man is no man.”

Car design must follow the bulk, flexibility, stamina, and visual acuity of the average human. Computers are tied to the tolerances of sight, logic circuits, and attention limits of the brain and body. From family reunions to business meeting schedules, planners must pencil in restroom breaks, coffee, lunch periods, and time for socialization, while accounting for the press (mental burden) of channeling multiple streams of attention toward the main agenda. We can fiddle with modifications to our built-in abilities, but we cannot work around the entire issue of the brain and body. If we could travel to Mars at the speed of light, we would have already done so. At 186,000 miles per second, the ride would take just 12.5 minutes – but it would require more energy than the earth can generate, or than the body/brain can currently tolerate.
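That 12.5-minute figure is easy to check. A minimal sketch of the arithmetic, assuming an average earth-Mars distance of roughly 140 million miles (a figure not stated above):

```python
# Checking the "12.5 minutes at light speed" figure.
# Assumption (not in the text): average earth-Mars distance ~140 million miles.
SPEED_OF_LIGHT_MPS = 186_000          # miles per second, as given above
AVG_MARS_DISTANCE_MI = 140_000_000    # assumed average distance in miles

travel_minutes = AVG_MARS_DISTANCE_MI / SPEED_OF_LIGHT_MPS / 60
print(f"one-way light travel time: {travel_minutes:.1f} minutes")   # ~12.5 minutes
```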

Then there is the brain, starting with body biology and the 2.5-pound mass of tissue, water, and blood that runs it. But the workings of the thinking brain extend far beyond its physical properties. To the emergent world of thinking and imagination, it brings the long-running rational and emotional balance that makes us human. Some 85 billion brain cells make for astronomical numbers of connections. Language is only one example of the ability to represent and combine ideas as the semantic platform of creativity and innovation. And language is largely responsible for what Pierre Teilhard de Chardin termed the “noosphere”—the total body of knowledge shared by all mankind across the globe. In the global village, we all have access to a powerhouse of learning, memory, and cognitive creativity.

Limits

However, our bodies are not designed to handle chronic stress, the dominant mental state of the 21st century.  Constant exposure to conflict, information overload, and contradictory demands on time and attention promotes heart disease and a compromised immune system.  Our brains are not designed to engage in the constant multitasking of technostress brought about by checking social media sources hundreds of times a day.  The American Psychological Association reports that stress levels generated by the 2020 presidential election were up 68%—with another barrage building steam for the 2024 edition coming up.

Behavior

Behavior is far more complex than brainwork because it includes the social dimension.  Humans are intensely social beings, as are chimps, baboons, and bonobos.  It is our social entwinements, those webs of webs, that give us culture, the shared brain of groups from two to billions, shaping “reality by common consent.”  Behind and beneath all human behavior are the social goals of family, politics, education, security, religion, and the social goods made possible by the combined power of many minds with many talents (the noosphere model).  As Kipling said of the individual and society, “For the strength of the Pack is the Wolf, and the strength of the Wolf is the Pack.”

The corresponding symbiosis is the self as unique and the culture as shared, in a constant dance of reflexive co-evolution. Winston Churchill observed, “We shape our buildings, and afterwards our buildings shape us.” This is the ongoing symmetry of human beings and our creations, culture predominant among them.

Bias

Bias, both positive and negative, is the thinking style that drives each culture: snowbound, desert, waterside, jungle, mountain, city or country based. Culture shapes thinking to adapt to environments, local and regional, national, or company minded.  When the environment changes or the group migrates to a new one, some of its groupthink becomes mismatched to the new stage-set ecology.  Islam grew up under the draconian demands of the seventh-century desert; its ethos now looks and acts medieval under the new rules of urban modernity—55% of the world’s population now lives in urban centers, with more to come.  The survivalist bias against other religions is part of an entrenched history that must now deal with the urban cosmopolitan present.

The mindset of priorities is that of the group ethos: the shared ideals that motivate every level of behavior. It is this cultural ethos that makes people different – not because of customs, language, dress, or foodways, but because of value bias: for the individual over the group, for example, in the US case, where our devotion to the individual and self-direction sets us apart as exceptional from the tradition of primal bonds. Knowing the priorities of a culture is key to understanding people’s intentions as the product of their priorities – their bias toward what they are trying to be, and away from what they are trying not to be or to suppress in favor of their ideals.

This is the logic behind our income tax laws at the highest brackets: if we believe that anyone (regardless of background) can become rich, then we want to set up no obstacle, like heavy taxation (as in Great Britain), that would stand in the way of everyone’s ability to achieve wealth. This can explain the constant back-and-forth motion of the Wealth Tax for the rich, which has never yet been passed.

We think that higher education—using our brains as capital and the ability to improve them as an investment—offers a good chance of building the scaffolding of a successful career, especially a professional path to achievement and riches. This is the reason the whole country went into serious student debt that is now blocking the future for millions of students for whom no price was too high for the best affordable school—until the career market failed to provide the expected return. We still believe that the right partner in marriage can make all this happen, along with romantic success (which doesn’t quite mesh with the rewards of family values), even though the divorce rate still hovers around 50%. (Note: This does not mean that half of all couples will divorce. The numbers are significantly skewed by a smaller number of dedicated serial marriers.) Hard work from the agricultural and industrial ages has expanded in the knowledge economy: the 24/7 demands on managers spurred by constant connectivity must now be regulated away from off-the-clock hours (the agenda of the Fair Labor Standards Act). Humans are constantly navigating between our technological abilities, our social lives, and the body/brain need for sleep, privacy, and deep thinking.

---- 








Tuesday, May 14, 2024

Brain Bias, Male and Female

 

                                                       

Image by The Conversation

“On the basis of the information available it seems unrealistic to deny any longer the existence of male and female brain differences.”  

                                                --Richard Restak, The Brain: The Last Frontier (1979)

 

In the late 1970s, neuroscience investigator Richard Restak was excoriated as a sexist for suggesting—based on evidence, that is, expert opinion—that the brains of human males and females are different. To hold this opinion is to risk accusations of neurosexism. However, for “gender justice” advocates, female (but not male) brain differences are now insisted on. This advocacy is mobilized in order to distinguish uniquely women’s issues as the platform for rights and protections, including diversity, health, social, and psychological concerns.

Ongoing studies show a mosaic of male and female features in virtually every brain; there is no purely male or female exemplar. Yet certain structures and chemistry occur more commonly in each gender. Male and female brains show bias along a continuum across a number of aspects, from drug processing to reasoning style, mental states to mental disorders.

One of the reasons brain-sex dimorphism is controversial, Restak explains, is that sex identity and behavior in our species aren’t as neatly compartmentalized as in other research subjects like mice and monkeys. Many question whether animal studies are analogous to studies of human brains. In humans, genotypic sex, phenotypic sex, sex attraction, and gender identity are not reliably aligned. Sex organs and hormone effects are visible markers of difference between the sexes. They are aligned by genes and their expression throughout the body, with hormones like testosterone (male) and estrogen (female) influencing thinking and behavior in the brain. These influences include connections formed prenatally, before exposure to cultural or environmental experience. The female brain is the default through the X chromosome, meaning that every brain begins as female but only half develop as male.

Stepping outside the research lab, what is the first thing you notice when meeting a new person? The most fundamental biological difference between people isn’t race, class, or age. It is gender. We depend on gender knowledge to adjust our communication style to suit male or female. Gender is embedded in our DNA as chromosomes XY or XX. No matter how gender is expressed or repressed within cultural norms (epigenetics), these genes wire our secondary physical expression: breast size, genitalia, height, weight, muscle size, hip width, sex drive, voice pitch, facial features, and pubic hair. Unsurprisingly, gender also sets up the male and female brain in distinctive ways. Whatever gender persona you might decide to exude, your DNA doesn’t migrate between the two gender codes, XX and XY.

Sex hormones are important to the way people look, feel, and behave. In men, the Y chromosome carries a gene for a protein promoting testes formation, testosterone production, and creation of the male brain. While this makes males more prone to intellectual disability, learning and speech disorders, dyslexia, and autism, males also make up the majority of geniuses at the opposite end of the intelligence scale. Female fetuses are better able than males to recover from prebirth brain damage.

Estrogen in women supports the encoding of language as spoken sounds (phonemes) as well as the visual coding of written language, and also plays a role in long-term memory (so women are slower to forgive and forget than men). Testosterone predisposes males to risk-taking and aggression, which is why most murderers are male. The male hormone sets desire in men, but testosterone also drives sexual desire in women, even though there is far less of it in the female brain mix.

Whatever the differences, Restak concludes, “it helps to keep in mind that such differences do not imply that one sex is superior to the other” (Mysteries of the Mind, 2000, p. 64). It is only that each has a likely inbuilt bias to prefer one type of thinking or acting over others—and therefore, to practice and excel at that behavior. An article on The Conversation site ventures, “On the other extreme, we are dismissed by women’s health advocates, who believe research has overlooked women’s brains – and that neuroscientists should intensify our search for sex differences to better treat female-dominant disorders, such as depression and Alzheimer’s disease” (April 22, 2021).

For example, women perform better than men on verbal tasks (estrogen-promoted), as well as intuitive reasoning, motor skills, and scanning environments for select features (finding all the green chairs in a mostly blue auditorium, or the best fruits on the tree, or a child on a bustling playground).  The hippocampus, the human memory center, is larger in females, with a higher density of neural connections.  It facilitates memory for people and reading their emotions.

Men excel at spatial tasks, including rotation of objects in mental space, and do better at math, logical reasoning, and motor skills directed at distant targets (aiming and tracking).  Men in mazes navigate by dead reckoning, while women rely on landmarks sequenced in memory. This is why men generally ask for directions much later than women. This is the root of the difference between hunters and gatherers, and why hunter-gatherer societies were based on gendered division of labor.

Another sex-brain disposition is the anatomical balance of gray and white matter: women have more gray matter (the core of nerve cell bodies) while men have more white matter (nerve fibers for signaling around the nervous system), involved in connecting brain and body. Female brains show increased coordination between regions, whereas male brains have a more separated left-right structure and connect back to front versus women’s left-to-right hemispheric coordination. Men and women also process neurochemicals differently, using different receptors; serotonin synthesis, for example, elevated in dominant primates, is over 50% higher in males.

Experience and attitude influence brain dynamics and the development of structure and function.  Lived experience, for example, education, can establish new circuits and outputs, facilitating the hardwiring of the brain. Upbringing and culture can activate or repress brain functions and their genetics, creating new nerve cells and connections, the essence of neuroplasticity.  Adaptability through specific action and memory is our species’ main strength: the ability to connect our biology to our cultural learning by brain growth. The human ability is to adapt to new circumstances over generations, as well as from moment to moment to meet needs and build opportunities. 

The question now arises: How do we know these abilities are rooted in sex-linked biology, as biodeterminism holds, rather than simply in culture? For example, Jews make up 2.4% of the US population but 35% of US Nobel prize winners. Is this nature, or nurture? The simple response is that culture tends to follow rather than determine biology, reinforcing rather than forcing brain bias. In the Jewish case, this bias is an environment of reward for literacy, encouraging questioning to find answers. Culture is built on the existing body and brain, but then determines how they work within the process of cultural values and conditioning—for instance, in the way we think about and recognize gender. This bio-cultural feedback loop is our uniquely human heritage.

-----

Thanks to Dr. Herb Adler for consulting on this topic.

Wednesday, April 24, 2024

Phoenicians – Early Trade in Ideas

 

“Trade and travel bring people into relationships with each other with resulting disruption of the local religious and ethical life, and then some political invention—foreign rule, or an imperial system, perhaps—is developed.”

--Robert Redfield, The Primitive World and Its Transformations (1953)

Model of a Phoenician bireme ship, c. 700 BC.  Science Museum Group collection

 

The original merchant seamen used the North Star to navigate under sails of red and blue. The Phoenicians deployed the power of sea travel in the Mediterranean to become a connecting force between cultures. Their navigation skills, legendary in the ancient world, allowed them to travel farther than other traders while keeping on course. Coming from Tyre in Lebanon, between the mountains and the seacoast, they left land hard to farm, sailing as far west as Spain, south to Egypt, east to Asia Minor, and even (as reported by Herodotus) around the African cape: a voyage of three years.

For these voyages they developed seagoing ships with cedar planks, as well as multi-story concrete homes to save space on the coasts where trading centers were established. This was their legacy to the Persians and Greeks who followed them, building on these networks by assimilating seagoing knowledge. Besides the keel and the bow battering ram, the Phoenicians have been credited with inventing caulking between planks as well as concrete construction.

Their territory was the city ports of Tyre, Sidon, and Byblos, profiting from the breakdown of empires at the close of the Bronze Age until the Iron Age boom (around 1200 – 330 BC, predating and following classical antiquity). The coasts were developed into multicultural exchange posts for ideas as well as goods, and a new worldview orientation, based on what writer Adam Nicolson terms “harbor minds” (How to Be: Life Lessons from the Early Greeks, 2023). “The whole of the Mediterranean was beginning to become a single maritime space…. that liberation from the overwhelming fixity of fate is an aspect of what we should think of as the dolphin mind, the mindset of entrepreneurial, adventuring people. It is a form of mercantile courage, of reliance on fluidity” (pp. 12, 289). As the first to chart the Mediterranean in total, the Phoenicians set the model for the study of geography.

The pre-Socratic Greek philosopher Thales would conclude that the foundational principle of reality was in fact water, as in fluidity, transience, and motion, based in part on the role of sea power.   He proposed an earth floating in space like a ship on the water.

These seafaring people also provided a central communication device to the Western world. The written alphabet they carried was a sound-based script adaptable to any language, the prototype of all phonetic writing systems. This was as important between cultures as it was to the international polyglot populations of melting-pot trading ports. By 730 BC Greece had adopted the alphabet as the foundation for the widespread literacy that anchored the rise of an astonishing creativity—starting with a justice system based on written statute law. Latin further evolved this writing into the letters we know today. It is where we get the word alphabet, from the first two letters of the ancient Greek alphabet – alpha and beta.

Founding Carthage as a major colony and ruling by merchant oligarchies, the Phoenicians had devised a regional order, an integrated culture based around the central sea. “By around 800 BC, the Mediterranean was in touch with itself, a spinning, fractalizing, and hybridizing whirlpool of expanding and interacting cultures….in which the seed of early philosophy began to grow” (pp. 12-13). This seed was nurtured by an alphabet mutually shared between city states, from Italy to the Black Sea by the time of the Odyssey, extending to the artists’ signatures on the artifacts they were creating: statues, pottery, glass, jewelry; these items are among the rare archaeological evidence left behind. Surprisingly, such evidence does not include written records of poetry, song, or narrative.

The color purple, made from tens of thousands of snails harvested off the Atlantic coast of Morocco, was the signature of Phoenicia. The famous purple dye, drawn from the murex snail, was named Tyrian purple after Tyre. This red-purple became the standard hue of power and prestige for the imperial rank in ancient Rome.

Above all, as traders, they facilitated the large-scale sharing of cultural resources, the core of a cosmopolitan world. Nicolson cites Ezekiel’s imagining of the city of Tyre as a ship assembled from across the Mediterranean market: “her planks were of pine from what is now southern Turkey, her mast a cedar of Lebanon, her oars of oak from the woods above the Sea of Galilee. Her bulwarks were inlaid with ivory carved in Cyprus, her sails and pennants of Egyptian linen” (p. 81).

These ancient emporia gave rise to a world culture of trade, in ideas as well as goods, that began a reorientation to a wider world and the cities that anchored that world of exchanges. Such a realignment yielded innovations of every kind to build a world civilization.  We can look at the earliest organized traders as the entrepreneurs of poly-cultural skill, an interchange basic to a global civilization and its bias for a global emporium of creative power. 

Thursday, March 14, 2024

In and Out: Group Bias

     

“A prejudice, unlike a simple misconception, is actively resistant to all evidence that would unseat it.” -- Gordon W. Allport, psychologist

 Us v. Them

In a social psychology experiment, subjects formed two groups.  Not based on gender, politics, age, race, education, wealth, or anything so salient to identity.  The two groups were divided solely on one criterion: whether their birthdays fell on even or odd days of the calendar.  Not even on astrological sign, or year, or season--just either/or numbers, 1 through 31.

Laboratory experiments have shown how easy it can be to create group identity as well as group divisions that line up loyalties to one group and hostility toward another. This identification reinforces the differences between us and them: favoritism for our group, bias against theirs. Economists George A. Akerlof and Robert J. Shiller, in their review of such experiments, said:

Even in this division, where the groups are totally pallid and meaningless, subjects who were born on the even days of the month showed a preference toward fellow evens and bias against odds, and odd subjects showed preference toward fellow odds and bias against their rival evens.  Even Dr. Seuss has also gotten into the act.  His Butter Battle Book depicts the Great War that ensues between those who prefer their bread butter side up and those who prefer it butter side down (pp. 158-59, Animal Spirits, 2009).

Blue v. Brown

A well-publicized example of in- and out-group bias is Jane Elliott's “Blue Eyes/Brown Eyes” exercise, run in an Iowa public school in 1968 following the assassination of Martin Luther King, Jr. The teacher wanted her class to understand at first hand what discrimination really feels like. Accordingly, Elliott divided her all-white third-grade classroom into blue-eyed versus brown-eyed students, rotating favored status between them. On the first day, the blue-eyed children were told they were smarter, nicer, and better than the brown-eyed, and were given special privileges. Results were instant: the stigmatized group was marginalized by the favored group, shunned, talked about prejudicially, shamed, and otherwise humiliated, while their tormentors' own grades improved.

The two groups stopped playing together. For their part, the brown-eyed kids isolated themselves during recess to avoid the blue-eyed scourge, acting intimidated and despondent, while their grades suffered. The following day the brown eyes were favored in the same way, with like results, simply reversed. As the odd/even birthday experiment showed, almost any trait, whether eye color, birth date, or bread-buttering, can be employed to direct bad behavior toward others.

One critique: Elliott's use of eye color correlates strongly (though not exclusively) with ethnicity, so the choice was not a neutral factor. It linked her classroom groups to in- and out-groups beyond the school, operating in the wider sphere of racial stigma, except when the brown-eyed children were shifted to the top rank over the blue-eyed. Said Elliott of her experiment, “You are not born racist. You are born into a racist society. And like anything else, if you can learn it, you can unlearn it. But some people choose not to unlearn it, because they're afraid they'll lose power if they share with other people. We are afraid of sharing power. That's what it's all about.”

Language and species

Related to the classroom treatment are studies showing that female teachers favor girls over boys, to the detriment of boys' academic achievement. Language mastery, a trait teachers strongly favor, comes earlier to girls than to boys. Dyslexic students are especially disfavored. Middle-class students, already more successful than lower- or lower-middle-class peers, are also favored, since language skill correlates positively with class. Thus a cultural advantage, that of speech, is routinely reinforced by teachers' nurturing, favoring those already trained as speakers, listeners, readers, and writers by their home environments. Efforts to transmit this class-based literacy to others face an uphill struggle.

The story is told in Hollywood of the Planet of the Apes ("a planet where apes evolved from men") cast members' bias for their own kind: at lunch breaks, the costumed denizens of the post-apocalyptic ape planet gravitated by their own preference to benches occupied by their own species, whether chimps, gorillas, orangutans, or humans.

Anti-stigma traits are favored as signs of ultimate fitness, for example in marriageability. The traits most valued include beauty, fitness and health, vitality (at any age), bilateral symmetry, social graces, competence (a good earner, an organized thinker), caring behavior, a clean record, and the absence of stigmatized mental states (with neurodivergence often so stigmatized). (Plato declared “Beauty is a natural superiority.”)

The salient features of human divisions are age, gender, and race, but primarily age and gender, since definitions of race shift within and between cultures (Germans and Italians, for example, were considered races in earlier American culture). The winners-and-losers effect reflects the ongoing outcomes of class divisions. Even the dominance hierarchies of wild baboons are perpetuated across generations of winners, who most often dominate, and losers, who are most often dominated. Height tracks dominance and prestige: observers in the lab estimate higher status for taller men, and more muscular men are seen as more dominant as well, whatever their actual social standing (Mark van Vugt, Dutch evolutionary psychologist).

Cultural fitness

The aspirational drive to attain higher social status, an expression of fitness, appears to be universal across human populations and cultures (Peter Erdi, Ranking, 2020). It is, naturally, an impulse continually shaped by competition between individuals and their groups, especially when stigma is agreed upon and applied by the mainstream within which they must operate.

Psychologist Philip Zimbardo's infamous Stanford Prison Experiment of 1971 studied the effects of power and powerlessness in a simulated prison cellblock set to run for two weeks. Twenty-four Stanford students were randomly assigned roles as either guards or prisoners. The extent of the abuse the “guards” meted out to the “inmates,” however, forced the experiment to be terminated early, after only six days. The prisoners began to show extreme signs of stress and deindividuation. A graduate student had to point out to Zimbardo (who was himself playing the prison superintendent) the actual psychic damage being done to real human beings. The Stanford study remains a landmark example of abuse arising from a role-playing exercise, and the parallels to Elliott's eye-color exercise are all too apparent.

Friday, February 2, 2024

Division of Labor: Excelling v. Extinction

“When the whole man is involved there is no work. Work begins with the division of labor.” -- Marshall McLuhan

 

[Image: White House kitchen staff division of labor]

Economist Michael Kremer posed a cultural equation in which shared ideas, free as public goods, are exchanged and compounded across space and time. As populations expanded ("more people, more ideas"), higher concentrations of people gave rise to more culture.
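To make the "more people, more ideas" intuition concrete, here is a minimal sketch of a Kremer-style feedback loop, written as toy code rather than as the published model: each person has some chance of contributing an idea, ideas raise the level of technology, and population then grows to whatever that technology can support. The growth rate, the Malthusian closure, and every parameter value below are illustrative assumptions.

# A minimal, illustrative sketch of the "more people, more ideas" loop.
# The idea_rate, carrying ratio, and Malthusian closure are assumptions
# made for illustration, not Kremer's published equations.
def simulate(periods=10, pop=1.0, tech=1.0,
             idea_rate=0.05,   # ideas added per person per period (assumed)
             carrying=2.0):    # population supported per unit of technology (assumed)
    history = []
    for t in range(periods):
        tech *= 1 + idea_rate * pop   # more people -> more ideas -> faster technical growth
        pop = carrying * tech         # population rises to what the technology supports
        history.append((t, round(pop, 2), round(tech, 2)))
    return history

for period, people, technology in simulate():
    print(f"period {period}: population={people}, technology={technology}")

Run over many periods, the loop compounds: a larger population generates ideas faster, which in turn supports a still larger population, the runaway dynamic behind the growth of dense, city-building civilizations.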

Over the centuries, population growth and technological change have expanded together, and with them the division of labor (of time and talent) that is the hallmark of the world's great civilizations, marked by the growth of cities as centers of wealth.

In fact, one of the great contrasts between the now-extinct Neanderthals and our own species, Homo sapiens, is this very ability. Labor division allocates resources among those with differing skill sets and the time to learn and perfect them. It also distinguishes the talented in many arenas from the less talented, leading to status and class divisions. But the Neanderthal home shows no dedicated spaces, nor any evidence of trade goods or trading behavior.

Even their hunter-gatherer behavior seems to have been evenly distributed between men and women (Tim Harford, The Logic of Life, 2008, pp. 208-09). The original division of labor, between male and female, is quite ancient as a shared tradition. “Today's simple hunter-gatherer societies divide tasks between the sexes. Men hunt big game and not much else; women hunt small animals, gather berries and nuts, make clothes, and look after the kids. Early humans, too, seem to have divided jobs between hunters and gatherers, presumably along the same lines. Neanderthals, apparently, did not.”

Nor do we know whether they had language, indispensable for trading information within or between groups and for keeping track of assignments. Language and its concepts are essential to cooperation, role designation, and task assignment. The difference may be genetic. Reported in Science in 2022 (Sept. 9) was the discovery of a gene mutation unique to our species: a variant of the TKTL1 gene that drives the development of extra neurons in the frontal neocortex, greatly enhancing connectivity, and that is lacking in earlier hominids, including Neanderthals. The birth of Homo sapiens might come down to this single trait.

Division of labor is not as simple as dealing out work equally by effort and hours; it means applying the principle of diversity to project and process, simple and complex alike, from building huts to bridges to cities. Using the idea of comparative advantage, a group effort leverages the varied capabilities within the group (the job of management expertise): skills, age, gender, ability, experience, strengths, aptitudes, and weaknesses. The project is taken apart so that subtasks can be assigned by talent as well as time.
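As a toy illustration of that allocation logic, consider two workers and two subtasks assigned by comparative advantage rather than raw speed. The workers, tasks, and hourly outputs below are invented for the example; the point is only that even when one worker is absolutely better at everything, each should take the subtask with the lowest opportunity cost.

# Invented example: assigning two subtasks by comparative advantage.
outputs = {
    "Ada": {"framing": 6, "wiring": 3},   # units per hour; Ada is faster at both tasks
    "Ben": {"framing": 2, "wiring": 2},   # Ben has no absolute advantage anywhere
}

def opportunity_cost(worker, task, other_task):
    # How much of the other task this worker gives up per unit of this task.
    return outputs[worker][other_task] / outputs[worker][task]

for worker in outputs:
    for task, other in (("framing", "wiring"), ("wiring", "framing")):
        cost = opportunity_cost(worker, task, other)
        print(f"{worker}: one unit of {task} costs {cost} units of {other}")

Here Ada frames and Ben wires: each unit of framing costs Ada only half a unit of wiring while it costs Ben a full unit, so the group produces more overall by splitting the work along those lines even though Ada is faster at both tasks.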

Expertise is developed as a cultural tradition: craft, battle, agriculture, hunting, exploration, planning, engineering, building, language, the arts. Childcare and foraging were classically women's work, whereas menfolk specialized in hunting, defense, exploration, and leadership. Human resources is ideally the science of understanding human capabilities and allocating them in the most productive way (beyond just signing people up for insurance plans).

The wealth of cities consists in their ability to draw instantly upon large arrays of these specialized traditions and put them to work in organized groups. Organizing human talent and skill, beyond mere tool-making or invention, is the basis of any civilized order. Even the division of domestic space into areas designated for separate activities is evidence of thinking in terms of labor division and the special needs of a specific job. From Egypt onward, homes show the first-discovered spaces dedicated to leisure alone, an arrangement that later diffused throughout the human indoor landscape. In the Neanderthal case, low populations not only kept cities out of reach but also stymied the technological innovations that generate the cross-fertilized energy and growth of cities.

Failure to think in this way might be the reason behind the fading of the Neanderthals, our close cousins, as they were superseded by Homo sapiens some 40,000 years ago. Harford speculates along these lines, noting that this approach to work is not evident in the Neanderthal record.

French social philosopher Emile Durkheim theorized (1893) that the division of labor correlates with the moral and communal power of groups to be productive and influential. This mentality was how humans were able not only to survive but to thrive and prevail. Another expression of labor division is trade: sharing resources through relocation, and speculating by importing novelty and specialization (a kind of cultural arbitrage). Evidence of this, too, is absent from the Neanderthal record.

Trade also underlies our social nature as a formalized endeavor between unrelated groups, a necessary parallel to the exogamy of marriage and mating between unlike genetic pools. “Computer simulations show that the propensity to truck, barter, and exchange could easily have allowed humans to wipe out Neanderthals in a few thousand years, even if the typical Neanderthal was faster and stronger and perhaps smarter, too” (Harford, p. 208). While modern humans and Neanderthals share 99.7% of their genetic material, more closely related to each other than either is to chimpanzees, we diverged from our last common ancestor over half a million years ago. The DNA record also shows evidence of incestuous mating among these late relatives.
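The simulations Harford cites are not reproduced here, but a toy model of the same logic is easy to sketch: give a trading population even a modest compounding productivity edge, cap both groups at a shared carrying capacity, and the non-trading population is gradually squeezed out. Every number below (growth rates, starting sizes, capacity) is an invented assumption for illustration only.

# Toy sketch, invented here: a small trade-driven growth edge plus a shared
# carrying capacity slowly crowds out the non-trading population.
def compete(traders=1_000, non_traders=1_000,
            trader_growth=1.02,       # 2% growth per generation via exchange (assumed)
            non_trader_growth=1.00,   # stagnation without trade (assumed)
            capacity=2_000,           # shared carrying capacity of the territory (assumed)
            generations=5_000):
    for gen in range(1, generations + 1):
        traders *= trader_growth
        non_traders *= non_trader_growth
        total = traders + non_traders
        if total > capacity:                  # scale both groups back to what the land supports
            traders *= capacity / total
            non_traders *= capacity / total
        if non_traders < 1:
            return gen
    return None

print("Toy model: non-traders fall below one individual after", compete(), "generations")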

And of course, these labor divisions are far from equal in either their demands or their rewards, further dividing the merit landscape that determines which groups can aspire to and occupy roles in the professions, politics, celebrity, athletics, and the arts, crafts, and letters. When Americans meet for the first time, our first question is “What do you do?” We are looking for clues to background, merit, aspiration, and status. On the scale of world cultures, we are closely identified with our careers as an index to class and potential. Our place in the diversity of work roles is as important a social index as ethnicity, education, and earnings.