Category Archives: Nature and Humanity

The world as we find it

The Origins of Infrastructure

One of the great mysteries of humanity’s history is how we made the transition from an isolated, emergent species to today’s globally dominant civilization. Scientists tell us that the story began as early as 7 million years ago in Eastern Africa. Fossils found in the Awash Valley give evidence of our early precursors. Archaeological findings suggest that some of these precursors began to fabricate and use rudimentary stone tools between 6 million and 2 million years ago. Learning to control fire followed about 1 million years ago. By 70,000 years ago hominins had migrated out of Africa and begun to apply more complex technology, evidenced in hafted spears, in which a sharpened stone point was attached to a wooden shaft.

The fossil record indicates that our own species, Homo sapiens, evolved during this progression and became the sole survivor among several hominin species. The evolution included a remarkable growth in brain size as well as the emergence of social behavior and technological prowess. Some scientists hypothesize an interaction between physical capability and intellectual accomplishment to explain this evolution.

British archaeologist Steven Mithen, for example, surmises that early uses of technology (such as the hafting of spears) encouraged development of “cognitive fluidity,” an ability to abstract and combine aspects of experience from different domains such as finding shelter or observing game.  The large brain of Homo sapiens was an essential adaptation that enabled this cognitive fluidity to develop, but it does not by itself explain how the development came about. Adopting and using a cultural innovation provides the stimulus for users to extract more from their brains than they might have otherwise.

Drawing on observations of ants and other animals that exhibit eusocial behavior and altruism—in which some individuals in a colony or nest limit their own reproductive potential by raising the offspring of other nest-mates or defending the group against competitors and predators—noted Harvard biologist Edward Wilson suggests that certain “preadaptations” favor the behaviors’ evolutionary development. Among the most important of these preadaptations, Wilson conjectures, is a species’ propensity for living in defensible nests.  When early humans, tribal by nature, learned to use fire and establish campsites sufficiently persistent to be guarded as a refuge, they had taken a crucial step toward modern social organization.

Wilson and his colleagues Martin Nowak and Corina Tarnita assert that the advantage of a defensible nest located within reach of reliable food sources, particularly one requiring greater energy in its construction, is a crucial causative agent in the evolutionary development of eusociality, a trait that loosely applies to humans as well as ants. A next step in humans’ social evolution beyond the adoption of movable campsites would logically seem to be long-term commitment to a fixed location. The earliest evidence of such commitment is arguably found on the walls of Chauvet Cave in southern France.  Images painted on the cave walls here and elsewhere (for example, the El Castillo cave in Cantabria, Spain, and others in Romania and Australia) are estimated by various archaeologists and methods to be 28,000 to 40,000 years old.

We have no convincing evidence of the creators’ motivations for any of the cave paintings, but their permanence and often difficult-to-access locations suggest these were not simply decorations of living space, but rather demonstrations of a particular significance of place, perhaps an effort to preserve human memory as recorded history. I propose that in this sense these ancient markings are humanity’s earliest known infrastructure.

University of Cambridge archaeologist Graeme Barker has presented evidence suggesting that the domestication of various forms of plants and animals evolved in separate locations worldwide, starting around 12,000 to 14,000 years ago.  For many researchers, this domestication is synonymous with “agriculture,” a technological innovation and foundation of modern civilization.  An alternative model, proposed by David Rindos in the 1980s, holds that domestication of locally available plants, a co-evolutionary interaction of humans and their food sources, led to intentional agriculture and the consequent selection of preferred species and strains.

This domestication of plants has been characterized as the beginning of the Neolithic or Agricultural Revolution.  Evidence, particularly from the Fertile Crescent region in the Middle East, indicates that cultivation was accompanied by construction of settlements, drainage ditches, and landforms to control plant irrigation.  Archaeological studies by Harvard archaeologist Ofer Bar-Yosef and others currently indicate that the Natufian culture in the region offers the world’s oldest example of sedentary settlements and agriculture, notable particularly because the settlements may have preceded the commencement of crop cultivation.

Whether the development of agriculture preceded or followed the birth of cities has long been debated.  Mithen, for example, reflecting recently on the progress of human civilization, expressed a widely held view that agriculture came first, and that once farming had originated, towns and cities appeared to be an almost inevitable consequence.  On the other hand, Jane Jacobs, an economist and unabashed urbanist, famously argued in the 1970s that labor specialization and trade first gave rise to cities, and that feeding their populations necessitated the development of agriculture. (Archaeologists notably disagree. See Smith, Michael E., Jason Ur, and Gary M. Feinman. 2014. “Jane Jacobs’s ‘Cities-First’ Model and Archaeological Reality.” International Journal of Urban and Regional Research 38 (4): 1525-1535.)

In either case, however, it would seem that infrastructure came first. The investment of effort in clearing fields; moving earth to adjust water flow; building fences, protective walls, and substantial shelters; maintaining paths for transportation; and the like would have contributed substantially to agricultural productivity, settlement economy, and social functioning of the residents.

“Sustainability” may be fundamentally unsustainable, but we have a chance

The idea of “sustainability” has clearly taken root.  The word appears frequently in print as well as Internet media, and national governments around the world have established agencies and programs devoted to it.  There seems to be widespread agreement that the idea has something to do with energy supplies, environmental impact, and economic growth, and perhaps with social engagement and political stability as well, although the scope of what is to be sustained—individual well-being, national prosperity, global status quo, for example—seems to differ from one forum to another.  However, there seems also to be a dawning realization that the idea’s application as a basis for guiding humanity’s actions may not be sustainable.

An important early appearance of the meme, if not its initial source, is often attributed to the World Commission on Environment and Development, commonly known as the Brundtland Commission.  This group of international experts was convened by the United Nations in 1983 to propose long-term environmental strategies for achieving sustainable development; recommend ways that concern for the environment may be translated into greater co-operation among countries; and help define shared perceptions, aspirational goals, and a long-term agenda for action.  The Commission’s 1987 report, Our Common Future, suggested that “Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.”  Among the still-expanding literature on the subject, I have found no more cogent definition of the term.

It might seem like a small step from seeking humanity’s “sustainable development,” a steady advancement of people’s achievement and wellbeing, to the “sustainability” of humanity, our survival as a prosperous species.  For a number of reasons, however, I suggest there is a large gap between the two concepts.  The gap is so broad, in fact, that I doubt the value of sustainability as a meaningful basis for guiding our principles and policies. Let me explain why.

To begin, the time scale for thinking about our sustainability far exceeds our abilities—politically, socially, historically, perhaps psychologically—to plan, take meaningful action, or even pay attention.  Scientific evidence suggests that the biological genus of which humans are a part evolved into being and the first hominid use of stone tools began in Africa perhaps 2.5 to 3.5 million years ago.  Evidence of Homo sapiens sapiens, our particular species, dates back about 250,000 years.  (In all of this, my phrasing is meant not to convey any skepticism, but rather to acknowledge that we rely entirely on inference from the limited data available to us to draw conclusions about past events and conditions.)

Our various experiments in cultural, social, and political organization are rather brief when seen in sharp contrast with these time periods.  Damascus is often claimed to be the oldest continuously inhabited city in the world, but evidence for large-scale settlement seems to date back only about 4,000 years. (The earliest Egyptian, Sumerian, and Chinese written records may have been created about 6,000 years ago.)  The community water-management schemes of Bali and other parts of Indonesia, arguably among the better models for a sustainable relationship between humans and their environment, arose perhaps 1,000 years ago.  England’s Magna Carta was first issued in 1215, and the United States, our ongoing experiment in capitalist democracy, was established less than three centuries ago.  Viewed against the backdrop of human history, “sustainability” has had a very brief span of influence.

In addition, there is the fundamental uncertainty of our existence as a species.  While some people prefer alternative explanations, the fossil evidence suggests that many varieties of creatures have come and gone since the first simple cells appeared.  The famously extinct dinosaurs died out, some scientists suggest, after an asteroid colliding with the Earth caused extreme global climate change.  On a less cosmic scale, scientists theorize that ash from a volcanic eruption approximately 70,000 years ago at Lake Toba on the island of Sumatra, Indonesia, similarly caused such dramatic global cooling that the human population was drastically reduced.  Outbreaks of bubonic plague (the infamous Black Death of 14th-century Europe) and related famines in more recent times have dramatically reduced human populations in Asia and Europe.  Apart from simply not giving in to existential despair, probably the best we can do in light of such evidence is to limit our perspectives to decades at most.  Some government agencies already seem to be unable to maintain funding for the programs they established to enhance their communities’ “sustainability.”  (For example, see the commentary on Sustainable Cities Collective.)

Finally, we really cannot know whether our actions are “sustainable” with respect to either our development or our survival.  Application of the Brundtland definition requires forecasting not only the consequences of our current actions but also what future generations may judge to be their own “needs.”  On the one hand, our society and our global environment form a complex system, susceptible to the well-publicized “butterfly effect”: any small perturbation can cause unforeseen consequences.  On the other hand, our values, technologies, and culture change from one generation to the next, so that what may seem to us an inconsequential change may be seen very differently by our children; consider, for example, the shift in our views about air pollution and pesticides.  In general, then, any assessment of the future consequences of our actions is more likely than not to be inaccurate.  Even more fundamentally, it seems quite likely that we simply cannot do anything to meet our own present needs without in some sense compromising the options available to future generations.

While “sustainability” or even “sustainable development” may be problematic as directly useful concepts, the ideas nevertheless do point the way toward usable principles. Applying these principles will at least increase the chances of our long-term survival:

  • Use only renewable resources: No matter how large the supply reservoir may be, it will eventually be exhausted.
  • Eliminate all waste and pollution: What economists refer to as “residuals” are simply an indicator of inefficiencies in our production processes.
  • Stabilize our population: Increasing humans’ wellbeing and chances of survival as individuals and as a species depends ultimately on enhancing labor productivity as well as on applying strictly the first two principles.

Baltimore’s Sustainability Report

Baltimore’s Mayor Stephanie Rawlings-Blake on April 16, 2011, stood up at the city’s Druid Hill Park Conservatory to announce the release of the 2010 Annual Sustainability Report.  This “yearly accountability tool to track Baltimore’s progress towards improving the economic, social, and environmental sustainability” was the city’s second such report, a product of the Baltimore Office of Sustainability.

Baltimore defines sustainability as “meeting the current environmental, social, and economic needs of our community without compromising the ability of future generations to meet these needs.”  This is deceptively similar to the often quoted formulation of the United Nations’ Brundtland Commission.  Our Common Future, the Commission’s 1987 report, asserted that “Sustainable development is development that meets the needs of the present without compromising the ability of future generations to meet their own needs.”  The concept of needs was explained with particular priority of concern for the world’s poor.  Our ability to meet needs was represented as limited by our social organization and technology as well as environmental constraints.

Read literally, Baltimore’s idea of sustainability differs in two possibly controversial ways from conventional usage. First, development—meaning steady increase of living standards and economic activity—is not mentioned.  Second, the needs that future generations will want to meet seemingly are presumed not to differ from ours today.  But perhaps, for Baltimore’s sustainability assessors, development is a fundamental need.

Baltimore's sustainability goals

In any case, the report is structured around 29 specific goals in seven clusters aimed at enhancing the city’s sustainability. Some of the goals are quite specific (for example, reducing greenhouse gas emissions by 15% by 2015), but most are open-ended.  And with the possible exception of supporting local business, every goal is crucially linked to the region’s public works infrastructure, although infrastructure is cited explicitly as a contributing resource for only about one-third of them.

People, Nature, and Green Infrastructure

I shall list explicitly in a future post all of the elements of what I call “infrastructure.”  For now I want to consider one of the outliers: wilderness.

Most readers will grant, I imagine, that infrastructure includes government-provided roads, water supply, and sewers.  Some may argue that I should not include energy supply and telecommunications, because these systems are typically managed by the private sector; but I shall include them.  Monuments and parks—Philadelphia’s Liberty Bell and the Eiffel Tower in Paris, for example—also qualify easily, as I shall argue, because they structure our views of the world, our understanding of where we are, and even how we think of ourselves.

Wilderness is more difficult. 

Henry David Thoreau famously wrote in Walking, penned in 1861 but published after his death, “…in Wildness is the preservation of the world.”  I take his meaning quite literally: We must have places in the world where humanity’s touch remains light if we are to survive as a species and continue to reside in this world.  It is a perspective that photographers Ansel Adams and Eliot Porter conveyed viscerally in influential coffee-table portfolios published by the Sierra Club. (Ansel Adams and Nancy Newhall, This Is the American Earth, 1960; Eliot Porter, In Wildness Is the Preservation of the World, 1962)  It is an argument forcefully made by Harvard naturalist Edward O. Wilson. (The Future of Life, 2002)  Adams’s images in particular—those of the Yosemite Valley and the Tetons are among the most memorable for me—arguably spurred the rise of environmental consciousness in the United States as Sierra Club calendars graced college dorm rooms across the nation.

Set aside—for now, at least—consideration of whether “wildness” and “wilderness” are interchangeable ideas; Henry David seems to use them as such.  The evidence suggests, however, that the sort of place he had in mind could be experienced without making a trek to Yosemite.  He never traveled farther from home than Lake Ontario, and the Maine woods were probably the closest he ever came to wildness.  Granted, the woods in the mid-19th century were less the habitat of loggers and snowmobilers than they are today, but New England had already seen 200 years of European settlement, and Native Americans had cleared and farmed throughout the region for much longer.  As historian William Cronon convincingly explains, the landscape of nature that Thoreau would have known was far from free of human influence. (Changes in the Land: Indians, Colonists, and the Ecology of New England, 1983)  During his famous two years on the shores of Walden Pond, Thoreau was never beyond the reach of the train’s whistle.

Thoreau’s interest was primarily intellectual.  He wrote that we need the idea of wildness to nurture our spirit, that “forest and wilderness” are the source of “the tonics and barks which brace mankind.” (Walking, Part 2)  This perspective separates humanity from the rest of “nature,” as though we are not a part of it.  A rich literature explores the origins and evolution of that view, but I will not go there now.

Suffice it to say that humanity and the other elements of our world are inseparable, and science is now giving us a functional understanding of the essential services we get from natural systems.  Ecology’s initial stirrings among such Thoreau contemporaries as Humboldt, Darwin, and Wallace are now branching into such sub-disciplines as landscape ecology and systems ecology—even urban ecology—and yielding the knowledge to devise management tools.

For lack of a better term, many people refer to this type of infrastructure as “green.”  (See, for example, the admirable book by Mark A. Benedict and Edward T. McMahon, Green Infrastructure: Linking Landscapes and Communities, 2006)  For me, there is no distinction to be made based on color or on the underlying science used to develop and manage it: all of it is the essential hardware and software that give structure and support to our economy and society.