August 2, 2012
By Scott Lindemann
Water is an essential component of life on Earth, and comprises a portion of every living thing on it. Water is a solvent, surrounding organic molecules and allowing them to interact with each other in ways that make metabolism, replication, and ultimately life possible. Water has a number of fundamental properties that make it crucial for life, including this solvent ability and its properties of cohesion and adhesion. Interestingly, more than 97% of the world’s water is found in the ocean as salt water, and of the remaining fresh water, nearly 70% is frozen in the polar icecaps or in glaciers. Given the vital importance of water to life on Earth as well as its scarcity, it is surprising that some of the available fresh water seems to be used irresponsibly.
Currently, Los Angeles relies on water diverted from various lakes and rivers, with some water coming from as far away as the Sacramento and Colorado Rivers. In addition to these sources, groundwater is pumped from underground aquifers faster than natural processes can refill them, risking the loss of these sources of naturally filtered drinking water.
Drinking water can be condensed from seawater at desalination plants, but evaporating seawater is an energy-intensive process. Water can also be purified from wastewater at treatment plants, but this too is energy-intensive. Even once this water is reclaimed, it takes a great deal of energy to pump it to consumers, making these two processes very expensive ways of obtaining fresh water.
Despite these sources of water, it seems that Los Angeles is ignoring what may be the easiest source of fresh water: rainwater. When it rains in Los Angeles, water flows off the roofs of buildings and into storm drains, where it travels directly to the Hyperion Treatment Plant along with the wastewater. This is not only a waste of a resource but also a problem for the city’s water treatment plant, as the increased flow must be processed alongside sewage, sometimes doubling the total daily volume of water that must be treated.
One solution could be simply to retrofit buildings in Los Angeles with gutters that divert rainwater into local holding tanks. By doing this, the city would gain a renewable source of water that could be used for irrigation or purified into drinking water more easily than wastewater, helping to meet demand so that less water would have to be diverted from rivers, lakes, and aquifers. This water has also already been “pumped” against gravity and distributed as rain by solar energy, and so would not need to be pumped again. In addition, because this water would be stored instead of diverted into the sewers, it would ease the load on Hyperion on rainy days.
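To get a feel for the scale involved, here is a rough back-of-envelope estimate of what a single rooftop could capture. The roof area, annual rainfall, and runoff coefficient below are illustrative assumptions on our part, not figures from this article:

```python
# Rough rooftop rainwater-harvest estimate (illustrative numbers only).
GALLONS_PER_SQFT_INCH = 0.623  # 1 inch of rain on 1 sq ft is about 0.623 gallons

def annual_harvest_gallons(roof_sqft, rainfall_inches, runoff_coeff=0.9):
    """Gallons captured per year from a roof of the given area."""
    return roof_sqft * rainfall_inches * GALLONS_PER_SQFT_INCH * runoff_coeff

# Assume a 1,500 sq ft roof and roughly 15 inches of rain per year
# (on the order of Los Angeles's average); both are assumptions.
print(round(annual_harvest_gallons(1500, 15)))  # about 12,600 gallons
```

Even under these modest assumptions, one house could bank over ten thousand gallons a year, which suggests why citywide capture could meaningfully offset imported water.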
This small change will not solve all of Los Angeles’s problems, but it could be one step towards a better system.
About the author: Scott Lindemann is a fourth year student at the University of Southern California. He enjoys reading, cooking, and lifting weights.
By Roxi Aslan
The history of drinking water is a fascinating one. Because water is fundamental to human life, obtaining drinkable water has been a goal of every civilization. Originally, humans drew water from nearby rivers and springs, but as populations expanded, this became a greater challenge, one met with great engineering feats such as the first aqueducts and cisterns that the Romans built in regions such as Istanbul and Nimes. Even once water could be delivered, it was not until the 11th century that a physician began to understand the concept of water-borne pathogens, and therefore that water should be treated before drinking in order to prevent illness.
Because technology advanced slowly, thousands of people died from sicknesses caused by filthy drinking water; only after the microscope was created in 1595 and microbes were first discovered about a century later could we begin to improve human health. With the subsequent development of sand filtration methods and chlorine treatments that eliminate water-borne pathogens and the diseases they cause, water quality has become an important part of state and federal policy. Today, Los Angeles has the world’s largest filtration plant, which uses ozone as a disinfectant to treat up to 600 million gallons of water per day.
However, drinking water issues remain. As populations continue to expand and we engage in practices such as obtaining our food from confined animal feeding operations, contamination of our drinking water supply with chemicals from fertilizers and pathogens from sewage has been growing. With the crowded conditions in a city like Los Angeles, failures and spills at treatment facilities are likely. At the Hyperion wastewater treatment plant that we visited this week, it was not until 1980, and $1.4 billion later, that the city was able to stop discharging 25 million pounds of wastewater solids per month into Santa Monica Bay. For less-developed countries, such funding is not possible.
Focusing on source water protection is therefore an effective and necessary way to reduce these high treatment costs and ensure safe drinking water. Only fairly recently, though, has this concept been put into action, through policies such as the 1996 amendments to the United States’ Safe Drinking Water Act. After events like the 1996 contamination of Santa Monica’s drinking water supplies with MTBE, a gasoline additive that causes health problems, measures have been developed to reduce potential sources of contamination, and the use of such chemicals has been greatly reduced. While the measures taken to protect our source water have been successful, pollution and contamination will have to be constantly monitored as our population continues to grow.
About the author: Roxi Aslan is a junior biology major with a minor in environmental studies in the USC Dana and David Dornsife College of Letters, Arts, and Sciences. Roxi plans on pursuing a career in marine biomedical research and hopes to use her science diving skills acquired in a Guam and Palau field course to do so.
April 22, 2012
The United Nations has declared water a basic human right, saying that “the human right to water entitles everyone to sufficient, safe, acceptable, physically accessible and affordable water for personal and domestic uses.” Many hold views similar to the UN’s, while others think water is a privilege, not a right. As the human population grows alongside the demand for clean, safe sources of water, this issue will only escalate in significance and severity. Humanity as a whole must answer the question: is water a privilege or a right?
Currently, access to safe drinking water is not universal. With almost 900 million people lacking access and more than 1.5 million children dying each year as a result, the United Nations has recognized clean water and sanitation as “integral to the realization of all human rights.” Providing access to drinking water has no simple solution; securing the human right to water requires political, social, economic, and industrial changes. At the 2011 University College London (UCL) Annual Conference, the issue of water security was raised, and participants concluded that the global North cannot simply expect the South to generate access to clean drinking water. For the most part, water abuse comes from the North, while the South is the region most in need of clean drinking water. Many believe the global North should treat water as a commodity, since Northern countries tend to overuse water and are not penalized for doing so. Under such a system, water might be better conserved in the North, which could in turn free up economic support to increase clean water access in the South.
Many take the opposite view, arguing that water is a privilege and that treating it as such does not violate basic human rights. Specifically, some take the stance that water is a simple human need, not a right we all hold. One argument supporting this stance involves water privatization. Some believe that governments should not hold the responsibility for providing adequate water to their citizens. Because the privatization of water, in which companies control the supply of and access to water, has been successful before, people cannot assume they are entitled to clean water without paying a price. It is also arguable that the number of people living today without access to clean drinking water is proof enough that water is a privilege, not a right. History has shown that many people lack access to safe drinking water, and the problem will only grow more severe as population and demand increase. In fact, the number of people currently without access to clean drinking water equals the combined population of the US, Canada, Argentina, Chile, Singapore, the United Arab Emirates, France, Germany, England, Italy, Spain, Japan, Australia, and Norway. Thus, critics argue that water is a privilege because it is not sustainable to treat as a right a resource that will only become more difficult to supply.
Those who see water as a privilege generally frame their argument around sustaining the resource: we see water being abused daily through agriculture and private consumption. Yet according to Maslow’s hierarchy of needs, water is a basic physiological need essential for survival. Granting everyone access to clean drinking water would not set up a system of abuse, but would instead create a need for stricter guidelines. Fracking, for example, could become a safer practice under more regulation, because the risk of water source contamination would decrease as water quality monitoring and technology improve. If water were granted to everyone, we would see a growing need for protection, and consequently the enforcement of stricter guidelines and policies would be needed to ensure the well-being of mankind.
Connor Schroeder and Albert Perez are undergraduates in the USC Dana and David Dornsife College of Letters, Arts and Sciences.
The carbon cycle is currently out of balance. Humans have introduced too much carbon dioxide to the atmosphere by burning fossil fuels, causing climate change and shifting weather patterns. We have the technology to work with nature to sequester more carbon. Agricultural land accounts for 455 million acres of the total US land area of 1.9 billion acres. Unfortunately, since the land was settled, its soil organic content has dropped to less than one-fourth of what it once was.
Humans could sequester organic carbon in soil by operating farms and ranches with practices that increase and maintain the organic material in the soil. For example, conventional farm practices that include improper tillage and overuse of chemical fertilizers result in the release of about 20,000 pounds of carbon dioxide. We can help control the CO2 released by adding organic material to the soil. Practices in which soil is mulched and rarely tilled dramatically decrease the loss of carbon dioxide from the soil, because tilling upsets soil life and exposes it to sunlight and oxidation, releasing large amounts of CO2. In the natural environment, the carbon-based roots and other soil life are rarely exposed or destroyed; oxidation still takes place, but slowly enough that plants can capture and reprocess the CO2 instead of letting it escape into the atmosphere.
California holds huge potential for carbon storage, as much of its agriculture is perennial. Perennial crop residue is more readily decomposed than annual residues, and perennials store carbon within the woody biomass of trees and vines. Further, as agricultural yields have increased, the biomass returned to the soils has increased, promoting sequestration. Rice farmers have also contributed to sequestration efforts: instead of burning the fields after harvest, most of the crop residue is now returned to the soil. Through similar small efforts, California agriculture can greatly increase its carbon sequestration.
Although there has not been significant research into vineyards as carbon sequestration resources, they hold high potential. Permanent cover cropping has been shown to increase soil organic matter when used instead of bare fallow rotations, though even one light tillage per year can negatively impact cover crops. Further vineyard-specific research is needed to understand both the ability of different cover crops to increase soil carbon and the effects of vineyard management practices on carbon storage.
California could almost double its carbon sequestration by adopting conservation tillage practices and returning prunings to the soil. This assumes that the area devoted to perennial agriculture continues to expand and that crop biomass continues to grow. As of 2002, California agriculture was not sequestering carbon through conservation tillage, although the practice is commonly cited as sequestering carbon by reducing soil respiration. Due to the low erosion potential of the land and its high-intensity multicropping, California agriculture has not widely adopted conservation tillage. If further research were done on adapting conservation tillage to California agriculture, we could help restore balance to the carbon cycle.
Christopher Miranda is an undergraduate in the USC Dana and David Dornsife College of Letters, Arts and Sciences.
February 27, 2012
Desertification is defined as the deterioration of land in typically arid areas due to changes in climate and human activities. In the United States, desertification is typically caused by poor farming practices and the conversion of grazing areas to cropland. Climate change intensifies desertification in arid areas: not only are global temperatures rising and natural disasters becoming more extreme, but the global water cycle and precipitation patterns are shifting so that rainfall is decreasing in most areas and concentrating in a few others. Furthermore, because California lies in a climatic region that can be described as dry subtropical, the combined effects of climate change and agriculture have led to increased desertification. The short-term and long-term effects of this desertification are numerous and will have many repercussions for both humans and the environment.
The environmental costs of desertification are quite serious and can eventually destroy natural ecosystems. Topsoils lose their fertility, and the growth and support of organic life in the pedosphere becomes much more difficult. As topsoil dries out, it becomes susceptible to movement by wind, creating natural disasters such as the Dust Bowl of the 1930s. Furthermore, this dust can be blown out over the ocean and affect weather patterns. To salvage lands affected by desertification, farmers begin to invest more in irrigation, which in turn diminishes groundwater resources and marks the beginning of long-term impacts such as drought and famine. Additionally, as desertification strips the topsoil of nutrients, plants become less productive and many of the ecosystem services they provide are diminished.
Unfortunately, as California becomes more susceptible to desertification, there is a tendency to focus only on the immediate effects. Important long-term impacts on the environment also need to be addressed, such as the effects on the carbon cycle, biodiversity, and the freshwater supply. Vegetation in arid areas stores a substantial amount of carbon (about 30 tons per hectare), and when desertification causes drought and the vegetation dies, that storage is lost. In addition, desertification dries out soil, whose organic matter is the largest known carbon sink, and the carbon released into the atmosphere intensifies the greenhouse effect. As soils and vegetation are degraded by desertification, ecosystems lose key resources, resulting in a loss of biodiversity. Desertification also poses a threat to freshwater resources: river flow rates decrease, leading to silt buildup in estuaries, which incites saltwater intrusion into the water table. As demand for water increases, there is a tendency to over-pump aquifers, which can result in water depletion and land compaction; the San Joaquin Valley of California, for example, subsided by as much as 28 feet between 1925 and 1970 from overdrawn aquifers. Because California relies so heavily on agriculture, farmers exploit aquifer water for irrigation without considering these long-term issues. If the agricultural industry were to collapse from drought, we would face the threat of famine and a huge economic crash.
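To illustrate what that stored vegetation carbon means in greenhouse-gas terms, here is a simple sketch. It assumes the roughly 30-ton-per-hectare figure refers to tons of elemental carbon (an assumption on our part); carbon converts to carbon dioxide by the molar-mass ratio 44/12:

```python
# Sketch: CO2 released if stored vegetation carbon is fully oxidized.
# Assumes the ~30 tons/hectare figure cited above is tons of carbon.
CO2_TO_C_RATIO = 44.0 / 12.0  # molar masses: CO2 is 44 g/mol, C is 12 g/mol

carbon_tons_per_hectare = 30.0  # figure cited in the text
co2_tons_per_hectare = carbon_tons_per_hectare * CO2_TO_C_RATIO
print(co2_tons_per_hectare)  # 110.0 tons of CO2 per hectare
```

So under this assumption, every hectare of dryland vegetation lost to desertification represents on the order of 110 tons of potential CO2 emissions.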
Clearly, the many negative effects of desertification need to be addressed. Some of the most popular approaches to combating the drying of the land include sustainable farming practices, such as drip irrigation, integrated crops, and no-till farming, along with drought prevention. As stated in the 2010 California Drought Contingency Plan, “California’s water resources have been stressed by periodic drought cycles and unprecedented restrictions in water diversions from the Sacramento-San Joaquin Delta in recent years. Climate change is expected to increase extreme weather. It is not known if the current drought will abate soon or if it will persist for many years. However, it is certain that this is not the last drought that California will face.” The DCP has moved toward enhancing monitoring and early warning capabilities, assessing water shortage impacts, and creating preparedness, response, and recovery programs, which should help California conserve water and slow the desertification process.
Harriet Arnold and Divya Rao are undergraduates in the USC Dornsife College of Letters, Arts and Sciences.