“As fuel was consumed, the ship got lighter and the acceleration more pronounced. Rising at this exponential rate, the craft quickly reached maximum acceleration, a limit defined not by the ship’s power, but by the delicate human bodies inside.” -- Andy Weir, The Martian
“Engineering: The discipline of applying technical and scientific knowledge and physical resources to design and produce materials, machines, devices, systems, and processes that meet a desired objective and specified criteria.” -- New World Encyclopedia
“Objective and specified criteria” sounds highly rational and technical. But these demands begin with human factors: the controller, driver, or user of whatever is design-engineered. Here is one example:
Case 1: Climate control
Many female workers report office climates as chilly, whereas men feel quite comfortable. Why is this? Medium.com writes that office-building algorithms for temperature regulation date back to the 1960s and were calibrated to a 154-pound male. Women’s smaller bodies and lower muscle mass make them more susceptible to cold. Unless climate control is updated to reflect this difference, and the growing number of women in the office workforce, this male-biased design problem will persist. “Minor” design aspects like this set-point exert a major impact: temperature affects not just comfort but productivity (keyboarding performance, for instance). The gender pay gap could be just one outcome of off-balance climate control.
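To make the set-point problem concrete, here is a minimal sketch in Python of how a fixed comfort model bakes one reference occupant into the thermostat. The function, the constants, and the degrees-per-pound slope are illustrative assumptions, not the actual 1960s standard or any real building-control code.

# A minimal sketch, NOT actual building-control code, of how a fixed
# comfort model hard-codes one reference occupant.
REFERENCE_MASS_LB = 154      # the "standard" 1960s male occupant
REFERENCE_SETPOINT_F = 71.5  # assumed comfortable temperature for him

def comfort_setpoint_f(occupant_mass_lb: float, muscle_factor: float = 1.0) -> float:
    """Estimate a comfortable room temperature for a given occupant.

    Lower body mass and lower muscle mass mean less metabolic heat,
    so a warmer room is needed to feel equally comfortable. The
    0.03 F-per-pound slope is an illustrative assumption, not a
    published coefficient.
    """
    mass_adjustment = (REFERENCE_MASS_LB - occupant_mass_lb) * 0.03
    return REFERENCE_SETPOINT_F + mass_adjustment / muscle_factor

# The classic algorithm ignores the occupant entirely:
print(comfort_setpoint_f(154))        # 71.5, the baked-in male default
print(comfort_setpoint_f(130, 0.85))  # a warmer setpoint for a smaller occupant

The point is structural: once the reference occupant is hard-coded, everyone who differs from that reference inherits the discomfort, no matter how precisely the controller hits its target.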
Engineering
The design, building, and use of engines, machines, and structures begin with the physiology and mentality of human beings (biology and psychology), moving from that base out into cultural values (how people, things, and experiences are defined, weighted, and ranked across groups). This means that people, not devices, are the core of design thinking. These human factors introduce a powerful bias into the “neutral” processes based on math and physics. UX (user experience) experts study product users and their experiences, including thought conventions, emotional feedback, intuitive assumptions, decision-making, task procedures, and options for action. Human Factors Engineering is now a subspecialty, but every engineering project must, ideally from the outset of the design process, define, test, and evaluate the fit between design and user (Goddard).
Medical devices are typical of projects well understood in this way. Less well understood are chronic-care pharmaceutical regimens and their effects, where compliance with use rules (adherence) runs only around 50% (lower for males than females) and decreases over time (US Pharmacist). Countervailing side effects, dosage schedules, and the low effectiveness of a given medication are the main causes of non-adherence. Yet the medical profession does not fully recognize or acknowledge these counterproductive factors as obstacles to patient compliance that interfere with the engineering of desired drug outcomes.
Bias
The first bias going in is the assumption that the designer looks at and uses the device the same way the user would. But the designer is an expert, whereas the typical user is often a first-time user. Don Norman, the human-centered design expert, puts it this way in the opening of his human-factors book: “You are designing for people the way you would like them to be, not the way they really are” (The Design of Everyday Things, p. 7). (A search on bias in engineering turns up plenty of articles on hiring bias against women and minorities. While diversity and inclusion aren’t under discussion here, male dominance in the profession does show up in design outcomes as bias toward male users.)
But bias is also an outcome of the human factors behind the design assumptions of the (usually male) engineer. Men and male bodies dominate medical testing; female subjects have been left out of trials as too complex and variable, and at special risk for adverse after-effects of testing. Differing male and female physiology produces differing responses to drug type and dosage. In parallel, in the design of credit ratings, males are given higher credit and spending limits based on assumptions about long-term earnings and employment. Microsoft vision systems fail to recognize darker-skinned figures, and self-driving cars’ recognition systems are likewise less attuned to dark skin tones (Techcrunch.com).
In AI, male voices are easier for voice programs to recognize and interact with. Critics of this bias have noted that most voice-activated home assistants (like Siri and Alexa) use the female-assistant model: the young, articulate admin with a compliant and faintly flirtatious edge. It may also be that female voices signal trust and reassurance, as advertisers in healthcare, beauty, and hospitality are aware, versus the more authoritative male voice (ESB Advertising).
Bias can become a matter of life and death in automotive design and safety, as Carol Reiley writes in “When bias in product design means life or death” (Techcrunch, Nov. 16, 2016). She points out that crash-test dummies are modeled on the average male body, so that females are almost half again as likely to be injured in a crash. The first female crash dummies entered the design process in 2011, and since then Toyota and Volvo have developed test programs dedicated to the smaller-scale female body as well as pregnant bodies.
Self-centric design
Designers unconsciously use people like themselves (male, white, US-based) as models for the majority of products and programs. This is no surprise but an outcome of everyone’s natural homophily: the tendency to relate best to those who look, think, and act like ourselves. In a study by the Geena Davis Institute on Gender in Media, white men over-perceived the presence of women and minorities in simulations: where just 17% of a group were women, men saw a 50/50 ratio, and at 33% women, men saw women as the majority. This is ironic in view of the fact that women make three-quarters of all consumer buying decisions.
And consider designing for the brain itself, which is complex but runs best on programs and input that are, first of all, intuitive. Few people except the technically inclined even bother to read a manual, and a complex tech manual is an even greater obstacle to operation. Don Norman points to an early digital watch, the Junghans Mega 1000 Digital Radio Controlled. With five buttons along the top, bottom, and side for operation, the following questions arise: “What is each button for? How would you set the time? There is no way to tell—no evident relationship between the operating controls and the functions, no constraints, no apparent mappings. Moreover, the buttons have multiple ways of being used” (TDOET, pp. 27-28). And as much as Norman likes the watch itself, even he (an expert in device design) cannot remember what the functions are or how to invoke them, and so cannot fully enjoy the watch’s features.
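Norman’s complaint about “no apparent mappings” can be made concrete in a few lines of code. Below is a hypothetical sketch of mode-dependent buttons (the modes, labels, and actions are invented for illustration, not taken from the Junghans interface): the same physical button does different things depending on hidden state, which is exactly what defeats a first-time user’s mental model.

from enum import Enum, auto

class Mode(Enum):
    TIME = auto()
    ALARM = auto()
    SET_HOURS = auto()

# Each button's meaning depends on the current (hidden) mode; nothing on
# the watch face reveals this mapping. All names here are invented.
BUTTON_ACTIONS = {
    Mode.TIME:      {"A": "show date", "B": "enter alarm mode", "C": "toggle radio sync"},
    Mode.ALARM:     {"A": "alarm on/off", "B": "enter set-hours mode", "C": "back to time"},
    Mode.SET_HOURS: {"A": "hour +1", "B": "confirm", "C": "cancel"},
}

def press(mode: Mode, button: str) -> str:
    """Return what a button press does; the user must memorize all of this."""
    return BUTTON_ACTIONS[mode][button]

print(press(Mode.TIME, "B"))   # enter alarm mode
print(press(Mode.ALARM, "B"))  # enter set-hours mode: same button, new meaning

With good mapping the table would be unnecessary, because each control would announce its single function; here the interface lives only in the user’s memory, which is Norman’s point.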
Undiscovered bias makes engineering design much harder to define or shape to the right purposes. Bias skews the problem definition away from solving the problem that actually needs solving (not necessarily the one presented by the client) for the right array of users, who should be able to operate the device through the maps, concepts, and symbols already in their heads. Ignoring or failing to identify these factors leads to protracted corrections and changes of direction as the team works on the wrong or misstated problem (Norman). Finding the actual issues at their root is the mandate of cultural analysis, grounded in human biology, the brain, and cultural motives.