These days, the vast majority of large global companies produce a sustainability report. If your organization doesn't, it belongs to an ever-shrinking minority. This is a remarkable shift, considering that such reports were virtually nonexistent at the turn of the millennium.
But what precisely does it mean for an organization to become "sustainable," and how did this trend start? In this two-part series, we identify four major stages of the modern sustainability movement, which we argue originated at the end of the 18th century. In this article, we explore how corporate sustainability emerged through the first two stages. In the next part, we will clarify the concepts of corporate sustainability and the sustainable enterprise by looking at the third and fourth stages.
Stage 1: Survival
In 1798, Thomas Malthus wrote a book entitled An Essay on the Principle of Population, which stressed the limits to growth caused by resource scarcity. Moreover, Malthus said that the rise in population and consumption would eventually outpace the available resources on earth. Hence, the primary framing for sustainability was survival—how do we ensure society will survive?
Malthus suggested that there were two forms of checks to limit the size of the population: positive checks, which increase the death rate (such as hunger, disease and war), and preventive checks, which decrease the birth rate (such as birth control and celibacy). While Malthus discussed the important role that technology could play, he discounted its ability to solve the resource-scarcity problem, arguing that efficiency gains are inherently limited and that any gains would be offset by the increased consumption accompanying them.
Other researchers disagreed with Malthus, contending that technology was capable of increasing total production and the resources the planet could provide. Through industrialization, businesses and the technological innovation they fostered were characterized as the saviors of humanity—advancing the quality of life at an exponential rate.
But are business and technology the solution? This debate carries forward to the present day. Many businesses say the answer is "yes." From biotech and pharmaceutical companies to more efficient machines and alternative energy, many technology-driven organizations and their proponents believe that technology will save us. However, many people argue otherwise, pointing to the "dark side" of technology.
While traditionally seen as supporting population growth by making resources more productive and efficient, technology also contributes to pollution, crowding and resource depletion. Jay Forrester used these three variables to build upon Malthus' theory of the relationship between food supply and population. In many ways, Forrester set the context for the sustainability agendas that would ensue in the decades following his seminal 1971 work. Nearly every scenario played out using Forrester's model spelled impending doom.
Stage 2: Environmentalism and Compliance
What precisely are the roles of technology and firms in environmental degradation? This question is the cornerstone of the second stage of sustainability, which focuses on environmentalism and compliance. While firms can improve society through productivity and efficiency, they can also harm it through the negative externalities they produce.
In 1939, Paul Hermann Müller, a Swiss chemist, made a remarkable discovery concerning the insecticidal properties of dichlorodiphenyltrichloroethane, or DDT. He was awarded the Nobel Prize for his research less than a decade later, after DDT had been used extensively during World War II to curb malaria and typhus among troops. Its efficacy soon led to its adoption in the U.S. as an agricultural insecticide, but it had harmful side effects, including toxicity to animals and humans. Capable of causing genetic mutations and disrupting the hormone system, DDT can remain toxic in soil for up to 30 years as it slowly gets absorbed into various ecosystems.
With her 1962 publication, Silent Spring, biologist Rachel Carson brought the DDT debate to the forefront and started the environmental movement in the U.S. Carson's book caused a national storm that led to the banning of DDT in the U.S. and the passing of the National Environmental Policy Act, which was signed into law on the first day of 1970. Often described as the modern-day environmental Magna Carta, the Act led President Nixon to establish the U.S. Environmental Protection Agency (EPA) in the same year. This monumental legislation put an end to largely unregulated industry in the U.S., requiring businesses to meet compliance standards so that they did not permanently damage ecosystems. The Clean Water Act (1972) and the Endangered Species Act (1973) soon followed.
While the environmental movement in the U.S. was spurred primarily by growing awareness, its counterpart in Europe was triggered by major disasters. The UK passed its first Clean Air Act in 1956 in response to the "Great London Smog" of 1952, caused by the burning of low-grade coal combined with unusual weather conditions; the smog killed thousands. The world's first major oil spill, from the supertanker Torrey Canyon in 1967, helped prompt the UN to create the UN Environment Programme (UNEP) in 1972. UNEP later co-founded, with the World Meteorological Organization, the Intergovernmental Panel on Climate Change (IPCC), which shared the Nobel Peace Prize in 2007.
Implications for business
The environmental movement and accompanying legislation forced businesses to begin accounting for their "negative externalities," or harmful impacts on the environment, that had previously been ignored. Barry Commoner arguably developed the first framework to demonstrate these impacts. Commoner argued that capitalism and the technologies it embraced were responsible for environmental degradation that was largely going unchecked. In his 1971 book The Closing Circle, he sets out his Four Laws of Ecology:

1. Everything is connected to everything else.
2. Everything must go somewhere.
3. Nature knows best.
4. There is no such thing as a free lunch.
[This article has been reproduced with permission from IMD, a leading business school based in Switzerland. http://www.imd.org]