“The professional tends to specialize and to merge his being uncritically in the mass. The ground rules provided by the mass response of his colleagues serve as a pervasive environment of which he is uncritical and unaware.”1
—Marshall McLuhan, The Emperor’s New Clothes
I’ve lost track of the number of times I’ve sat fuming at presentations by designers extolling the glories of electronic networking.
I’ve always been a believer in the sloppy old face-to-face method of communication, and the tech geeks, from computer lab support staff to “TEDsters,” frustrate me with their hand-held, podcast technoenthusiasm. I use technology, communicate electronically, even socially network to a modest extent, but I’ve never been one to believe that technology, especially the “paperless” kind, would be the salvation of Western or any other civilization.
Forty years ago Robert Heilbroner opened a can of worms when he asked, “Do Machines Make History?” His essay, later reframed and reissued, was an examination of technological determinism, a term traced to Thorstein Veblen: the idea that technology is self-driven, developing outside of social influences. Heilbroner wondered whether Marx had it right when he wrote, “The hand-mill gives you society with the feudal lord; the steam-mill, society with the industrial capitalist.” Ultimately, for Heilbroner, technology was a player, but not the only one.
Most of my students, and it seems many designers, are quite a bit more sanguine about technology. The inventor and futurist Ray Kurzweil, member of the Singularity Institute and proponent of intelligent machines, impresses them when he suggests the probability of a technological singularity within the next 25 years. This vision, driven by the exponential growth of technology throughout history, proposes that ultra-intelligent artificial life will become self-aware and further accelerate technical progress. In popular culture, a benevolent outcome is imagined in Steven Spielberg’s 2001 movie A.I. Artificial Intelligence, starring Haley Joel Osment.
The nature-versus-nurture debate has raged for over a century. Although psychologists consider it a false dichotomy, in popular parlance the conflict still has legs, especially whenever discussions of the cultural meme arise. In my last post, I went on record on the slow-evolution side of biology, but there may be growing reason to reevaluate such statements, beyond the popular idea of accelerating technology.
A recent issue of Time magazine carried a five-page cover story about the rise of the newly minted science of epigenetics. Starting with the genetic research of Dr. Lars Olov Bygren, a specialist at the Karolinska Institute in Stockholm, the article builds a case for the influence of environmental stressors on gene expression that can be heritable, in some cases, through up to 40 generations. Not long ago this would have been considered genetic heresy. It doesn’t take a degree in science to understand that such an idea vindicates the long-despised notions of Jean-Baptiste Lamarck, who famously argued that traits acquired in one’s lifetime could be passed on. Generations of Darwinists have been happy to point out the illegitimacy of this line of thought. I mean, if giraffes got long necks from stretching to reach the tender leaves, what’s to stop one from inheriting obesity from a couch-potato parent? This, however, is roughly what epigenetics is arguing.
The tremendous complexity of the universe is no surprise to scientists, but to designers, many of whom subscribe to the dictum that “less is more,” complexity can be daunting. Consider, for example, the Venter map of the human genome. Completed in 2001, it is the single most detailed graphic image ever developed; to view it onscreen, it must be magnified by four orders of magnitude. To think that the potential number of epigenetic “markers,” or switches (the molecules that turn a specific gene on or off), could easily be 100 times the number of human genes (about 25,000) is only to see once again that the more we learn, the less we know.
A minute fragment of J. Craig Venter’s Diploid Genome Sequence (2007)
How could this possibly make life easier for designers? One argument designers endlessly advance is that, these days, the influence of culture trumps nature. It shows up in the claims of my tech friends who favor the power of electronic networking, is common among those who suggest that computing technology fundamentally changes human relations, and it even makes an appearance in the AIGA’s latest sustainability project, where culture is juxtaposed with the economy, the environment, and social justice.
Epigenetics suggests many possibilities, some of them very dark. If it is possible to inherit asthma from what one’s mother ate during pregnancy, or obesity from the stress of a famine on one’s grandfather, then why not say that constant early exposure to the Internet could result in increased intelligence (or stupidity)? Evidence for this last claim has yet to be established, though some of the other examples are supported by epidemiological research. There are disturbing warnings in the scientific literature suggesting, for instance, that early television exposure could be implicated in the epidemic of ADHD among adolescents. Could early exposure to electronic technology result in heritable traits? Caution is recommended. But this could be a good thing, too, enabling physicians to better understand the as-yet-unknown origins of certain diseases.
For design theory, epigenetics could be the long-sought Rosetta stone, the linchpin that reconciles lightning-quick culture with slowly evolving nature in a way that would warm the cockles of your friendly neighborhood podcaster’s heart. Will it be a matter of rejoining the organic to the man-made, or of sundering them? Authors like Kurzweil would have it that the separation has already occurred, is irreversible, and does not bode well for the future of organic life. Others will take a more nuanced approach.
As long ago as the mid-1970s, E. O. Wilson proposed a rapprochement, gene/culture co-evolution, in his writings on sociobiology. Epigenetics takes things further than Wilson, further even than Richard Dawkins is willing to venture. Remember, it was Dawkins who introduced the word “meme” into the English language. In the 30th-anniversary edition of The Selfish Gene, Dawkins is still on record as saying: “No matter how much knowledge and wisdom you acquire during your life, not one jot will be passed on to your children. Each new generation starts from scratch.”2 Contrarily, epigenetics suggests that trauma, including human-induced crises, can be a transmittable learning experience, genetically speaking.
McLuhan would have been fascinated by the possibilities inherent in this debate. As children of McLuhan, designers should be enticed. It would seem to be a potentially fitting solution to a heretofore intractable argument.
1. Marshall McLuhan, Essential McLuhan, Basic Books, Toronto, 1995.
2. Richard Dawkins, The Selfish Gene, Oxford University Press, Oxford, 2009, p. 23.
David Stairs is editor of Design-Altruism-Project