UNIVERSITY OF CALIFORNIA
ENVIRONMENTAL TOXICOLOGY NEWSLETTER
Vol. 11 No. 1 March 1991
"WATER: Better late than......"
Table of Contents
I. Microorganisms: The Drinking Water Contaminants of the '90's?
II. Nitrates and Nitrites
If you thought that we had given up on the newsletter because of the budget cuts, the answer is no. In fact, it is one of the few forms of communication we can afford anymore! My apologies for the lateness of this first newsletter for 1991. I promise the next newsletter will follow within three weeks. Last year I spoke with Dr. Marylynn Yates, Hydrology/Toxicology Specialist from UCR, about authoring an article for the newsletter. Her article about microbiological contamination of water should be of great interest to you, and I hope you will distribute the information far and wide. I hope to be able to convince her to contribute more articles in the future. Also included is another article about a water contaminant of interest (nitrate) written by Scott Wetzlich, Environmental Toxicology Extension Staff Research Associate. In keeping with past years, I encourage you to send in suggestions for topics, and also to send in articles that you think would be of interest statewide.
I. Microorganisms: The Drinking Water Contaminants of the '90's?
Dr. Marylynn Yates
In 1985, the United States Environmental Protection Agency (EPA) proposed recommended maximum contaminant levels (RMCLs) of zero for viruses and Giardia in public drinking water supplies. As a result, in June 1989, the final rule requiring all surface waters used as a source of public drinking water to be filtered and disinfected prior to distribution was published. In April 1990, a preliminary proposal that would require all ground water supplies to be disinfected prior to distribution was published by the EPA.
Most people are familiar with the drinking water standards for inorganic (such as nitrate) and organic (such as trichloroethylene) contaminants. People may not be as familiar with drinking water standards for microbiological contaminants, especially because some of the standards are new. Therefore, I would now like to explain why the standards were set, and what the implications of their establishment may be.
For many years, the microbiological quality of our drinking water has been determined by monitoring for the presence of coliform bacteria. Coliforms are bacteria that live and replicate in the intestinal tract of humans and other warm-blooded animals, and thus they are a normal constituent of fecal material. In general, coliform bacteria do not cause disease in humans. Coliforms are inactivated, or destroyed, to varying degrees by water and wastewater treatment processes. It was believed for many years that the processes that inactivate coliforms also inactivate any pathogenic (disease-causing) microorganisms that were present in the water or wastewater. Thus, coliforms have been used as indicators for the presence of pathogenic microorganisms. In other words, if coliforms are detected in water, it is assumed that pathogenic microorganisms may also be present, and steps must be taken to protect the public health. Conversely, if coliforms are not detected, it is assumed that pathogens are not present.
There are many reasons for using an indicator organism for monitoring purposes. One is that there are literally hundreds of different pathogenic microorganisms that are present in human waste, and thus could be present in water and wastewater. Obviously, it would not be practical or economically feasible to monitor water for the presence of all of these microorganisms. The indicator is used as a signal that all pathogens are absent, or that some may be present. The perfect indicator would always be present when any pathogen is present, and absent when all pathogens are absent. Another reason for using an indicator is that indicators by definition are easier and less expensive to detect than the contaminant(s).
Obviously, there are no perfect indicators, but for many years, coliforms were thought to be adequate for protecting the public from pathogens in drinking water. Over the past 15 to 20 years, however, there has been increasing evidence that coliforms are not adequate indicators for all pathogenic microorganisms. For example, human enteric viruses have been detected in treated drinking water that contained no coliforms and met other drinking water standards for turbidity and free chlorine. In addition, numerous scientific studies have shown that many pathogenic microorganisms survive in the environment (i.e., soil, water, and air) for much longer periods of time than do coliforms. Perhaps the most convincing evidence that coliforms are not the best indicators of the microbiological quality of water comes from the data on waterborne disease outbreaks in the United States. Despite advances in sanitation practices in this country, the number of reported waterborne disease outbreaks has not decreased in recent years; rather, it has actually increased. Some of the increase is undoubtedly due to improvements in reporting, but nonetheless, a significant number of waterborne disease outbreaks continues to occur every year.
As shown in Table 1, the majority of acute waterborne disease outbreaks are caused by microorganisms; chemicals are responsible for only about 10% of the outbreaks. The causative agents of waterborne disease in ground water systems are shown in Table 2. Consumption of contaminated ground water is responsible for approximately 45% of all waterborne disease in the United States. Once again, the majority of the outbreaks are caused by microorganisms. Note that in both tables, a significant number of outbreaks are listed as gastroenteritis of unknown etiology. At this time, evidence indicates that the majority of these outbreaks were caused by viruses or parasites for which detection methods have only recently become available.
In light of all of the evidence collected over the past several years, the EPA concluded that coliforms were not adequate indicators of the presence of viruses and certain protozoan parasites, such as Giardia, in drinking water. That conclusion led to the establishment of the standards mentioned at the beginning of this article.
Traditionally, when a standard is set for a contaminant in drinking water, public water utilities must establish a program to sample and analyze the water for the presence of the contaminant. In the case of viruses and Giardia, monitoring will not be required. Instead, the EPA has imposed treatment requirements that public utilities must use to remove or inactivate these pathogens. To explain the reasons for this, a digression on some of the pertinent attributes of viruses is necessary.
Viruses are obligate intracellular parasites; that is, they can only replicate in appropriate host cells. Thus, unlike bacteria, they are unable to multiply when in soil or water environments. They are also very host specific. For example, viruses that infect humans can only infect humans and a few types of primates. This means that when a human virus is detected in a water or soil sample, there is conclusive proof that the environment has been contaminated by human waste. Viruses that infect and replicate in the human intestinal tract are called enteric viruses. Unlike bacteria, viruses are not part of our normal intestinal flora, and are only present in fecal material when an individual is intentionally (e.g., by vaccination) or unintentionally infected. Viruses that are present in human waste may be introduced into the environment in a number of ways, including septic tanks, land disposal of municipal wastewater, application of municipal sludge to land, irrigation of plant materials with sewage effluent, and intentional recharge of ground water with treated municipal wastewater. Another characteristic of enteric viruses is that they are extremely small, generally only 20-100 nm in diameter. This allows them to be transported considerable distances through soil and aquifer materials. A final important trait of viruses is that they have a very small infective dose; generally only one to ten virus particles may be required to cause disease.
All of the above-mentioned characteristics contribute to problems in detecting viruses in water. The fact that viruses can only replicate in susceptible host cells means that the sample must be inoculated into a cell culture or tissue culture system to detect viruses. Maintenance of cell culture is very expensive, and requires highly trained personnel to process samples.
The concentration of viruses that would be expected in a drinking water sample is extremely low (less than one virus per 1000 liters). The fact that the infective dose is very low (one virus particle) means that any number of viruses in a water sample is too high. Combining these two facts with the low recovery efficiency of viruses using current sampling techniques means that very large volumes of water must be sampled to assess health risks to an exposed population. Generally, 1,000 to 10,000 liters of water must be passed through a membrane filter in order to obtain an adequate sample. The viruses are retained by the filter (due to chemical adsorption, not physical straining), which then is taken to the laboratory for processing. Several steps are required to remove the viruses from the filter and then reduce the sample size to one that can be handled appropriately. During each of these steps, there is the potential for viruses to be destroyed or rendered undetectable (another reason for the very large sample size). Once an appropriate sample has been obtained, it is inoculated into cell culture. (The low numbers of viruses generally present preclude the use of fluorescent antibody or ELISA techniques for detection.) The cells are then observed for signs of infection, a process that can take up to six weeks. Obviously, six weeks after a contamination event has occurred is too late to take any steps to protect the public health.
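The arithmetic behind these very large sample volumes can be sketched as follows. The recovery efficiency and target count used here are illustrative assumptions, not values from the article:

```python
# Illustrative estimate of the water volume that must be filtered to detect
# enteric viruses. The recovery efficiency and target count are assumed
# example values, not measurements.

expected_concentration = 1 / 1000   # viruses per liter (~1 virus per 1000 L)
recovery_efficiency = 0.5           # assume half the viruses survive processing
target_count = 5                    # viruses wanted in hand for reliable detection

# Volume (liters) to filter so that, on average, `target_count` viruses
# survive sampling and processing
required_volume = target_count / (expected_concentration * recovery_efficiency)
print(f"Required sample volume: {required_volume:,.0f} liters")  # → 10,000 liters
```

With these assumed numbers, about 10,000 liters must be filtered, which is consistent with the 1,000 to 10,000 liter range cited above; lower recovery efficiencies push the requirement still higher.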
Another problem encountered when monitoring water for viruses is that there is no one tissue culture or cell culture system that can detect all of the viruses of concern that could be present. For example, cells that can detect poliovirus cannot detect hepatitis A virus or rotavirus (a major cause of childhood diarrhea). Thus, the sample would have to be split and inoculated onto several different cell lines to detect the viruses of concern. At this time, there are fewer than five laboratories in the country that are capable of detecting hepatitis A virus and rotavirus in environmental samples. There are several viruses that are very important causes of waterborne disease (e.g., the Norwalk virus) for which there is currently no detection technology available.
The cost of collecting and analyzing a water sample for enteric viruses is very high. Depending on the specific virus(es) of interest, and the type of laboratory (university or commercial), costs range anywhere from about $500 to $2,000 per sample. There are methods using nucleic acid probes being tested that would reduce the cost as well as time for analysis, but these methods are still in the research phase.
The fact that several weeks are required to obtain results of virus sampling, as well as the high cost of analysis and the requirement for highly trained personnel, led the EPA to impose treatment, rather than monitoring, requirements on public water utilities. Public utilities that use surface water as their source of drinking water must now disinfect and filter the water prior to distribution. The exact treatment procedures have been published by the EPA.
Public water utilities that use ground waters must disinfect the water to inactivate viruses prior to distribution (Giardia is not of much concern in ground water because its relatively large size results in removal by filtration in the soil). Currently, most ground waters are distributed without any type of treatment, because they are generally thought to be of very high quality. For example, the city of Tucson, Arizona (population 500,000) relies totally on ground water for drinking water. Disinfection of the water is not routinely practiced. However, as shown in Table 2, about 45% of all waterborne disease outbreaks in this country are the result of use of contaminated ground waters; thus the requirement for disinfection. The disinfection requirement will undoubtedly impose a hardship on the many communities that use ground water for all or part of their water supply. In addition, there is concern about using disinfectants because of the possibility of forming disinfection by-products, some of which are carcinogenic. For these reasons, the EPA is considering allowing utilities to obtain a variance from the disinfection requirement if it can be shown that there is very little probability that the ground water is impacted by potential sources of human waste. The variance criteria are still in the very early stages of being developed, but it appears that one of the criteria will be a "vulnerability assessment" that will involve the use of some type of model that can predict virus concentrations down-gradient of any potential source (e.g., septic tanks, sludge application sites). It is anticipated that the final rule for ground water treatment and any variance criteria will be finalized in 1993.
In summary, the inadequacy of coliform bacteria as indicators of the microbiological quality of drinking water led the EPA to set drinking water standards for viruses and Giardia. For several technical, practical, and economic reasons, the EPA will not require public water utilities to monitor for viruses and Giardia. Rather, all surface waters must be filtered and disinfected prior to distribution. All ground waters must be disinfected, unless variance criteria are met.
Note from the Author: If you are interested in obtaining more information on this subject, please contact me. For those of you who wonder what I spend my time doing, a considerable amount of it is currently being spent developing a model of virus transport that public water utilities can use in obtaining a variance from the ground water disinfection requirement.
Table 1. Etiology of Waterborne Disease Outbreaks in the United States, 1971-1983
Disease                               Outbreaks   % of total     Cases   % of total
Gastroenteritis, unknown etiology        227         50.9       60,191      56.16
E. coli diarrhea                           1          0.22       1,000       0.93
Table 2. Etiology of Waterborne Disease Outbreaks in Untreated or Inadequately Treated Ground Water Systems, 1971-1982
Disease                               Outbreaks   % of total     Cases   % of total
Gastroenteritis, unknown etiology        132         64.7       25,700      74.85
E. coli diarrhea                           1          0.5        1,000       2.91
II. Nitrates and Nitrites
Scott Wetzlich
Nitrates and nitrites in our diets have been a concern for around a hundred years. An 1895 account of cattle deaths from nitrate-contaminated cornstalks was one of the earliest recorded accounts of nitrate poisoning in food or feed. Recently, high nitrate levels in drinking water have become a concern to many Californians, especially those who get their water from wells. Although the health effects of nitrate poisoning are a concern for infants and a small group of susceptible adults, the amounts normally encountered do not pose a significant risk to the general public.
The average daily intake is around 100 mg of nitrate and about 12 mg of nitrite. Vegetarians have a much higher daily nitrate intake (about 250 mg) because of the high nitrate content of vegetables. Less than 10 percent of the nitrates we ingest come from our drinking water; most come from our food. Approximately 9 percent of the nitrates we ingest come from processed meats, where nitrate is used as a preservative and as a coloring agent. The greatest amount comes from vegetables like lettuce and spinach. A little over one-fifth of the nitrites we ingest come from cured meats. The largest share, however, originates in our own saliva, where bacteria in our mouths convert nitrates to nitrites.
Vegetables and cured meats are the main sources of nitrates in our diet. Vegetables with high levels of nitrates include lettuce, averaging 850 parts per million (ppm); celery at 2340 ppm; spinach at 1860 ppm; beets at 2760 ppm; and broccoli at 780 ppm. Many factors affect the nitrate levels in plants. The species and variety of plant are very important. Nitrate levels also vary with the part of the plant (leaf, stem, root, etc.) and the maturity of the plant. Environmental factors include drought, high temperature, shading, nutrient deficiencies, excessive fertilization, and plant damage from insects and herbicides. The average American ingests 86 mg of nitrate a day from vegetables alone: 18.9 mg from lettuce, 16 mg from celery, 4.2 mg from spinach, 5.5 mg from beets, and 14.2 mg from potatoes. Although it would appear that vegetables would be a high risk factor in nitrate poisoning, very few cases have ever been documented. This could be because of the ascorbic acid content of most vegetables, which has protective effects against nitrate poisoning.
Cured meats account for 15.5 mg of nitrate in the American daily diet. They also account for about 4 mg of nitrite a day. Nitrates and nitrites are used in cured meats to give them their distinctive pink color, to prevent rancidity, and to prevent the growth of Clostridium botulinum spores. Nitrate and nitrite levels in cured meats have been set by the FDA at 120 ppm. The FDA also recommends 550 ppm of ascorbate or sodium erythorbate, which have effects similar to those of the ascorbic acid in vegetables.
Fresh spinach contains high levels of nitrates, but very little nitrite. Storage at room temperature greatly increases nitrite levels and decreases nitrate levels. Since nitrites are more toxic than nitrates, this can be a serious problem. One study found nitrite levels rose from 30 ppm to 3550 ppm after just four days at room temperature. Refrigerating the spinach will delay the process, but not prevent it. Frozen spinach does not undergo this process. The conversion is caused by certain bacteria and by plant enzymes. Factors affecting eventual nitrite levels include the initial nitrate level in the plant, the activity of plant enzymes, the storage conditions (aerobic or anaerobic, temperature, etc.), and the extent of microbial contamination.
There have been a number of nitrite poisonings in infants from spinach. One of the first recorded incidents was in Germany, where two infants died of methemoglobinemia after being fed spinach puree containing 661 ppm nitrite. In another case, 14 infants suffered methemoglobinemia after eating spinach that was high in nitrates and had been cooked at least a day in advance; the nitrates had become nitrites during storage. Infants can tolerate fairly high levels of nitrate in spinach, but not nitrite. Infants fed spinach containing 680 ppm nitrate (16-21 mg nitrate per kg body weight per day) for a week showed no adverse effects.
Although food accounts for the vast majority of the nitrates we are exposed to, nitrates in drinking water are of greater concern to the public. The levels usually found in drinking water should be of little concern, but in cases of severe contamination, care should be taken for a small segment of the population. Infants are at the greatest risk from nitrate poisoning. Drinking water averages about 1.3 ppm nitrate ion; at a typical consumption of about two liters per day, this gives an average daily dose from water of 2.6 mg per person. The Maximum Contaminant Level (MCL) allowed by law is 45 ppm nitrate ion, or 10 ppm when expressed as the nitrogen content of nitrate (nitrate-nitrogen). According to a 1985 survey, six percent of the rural wells in the U.S. exceeded this standard. The World Health Organization has proposed acceptable daily intake levels (ADIs) of 5 mg nitrate ion per kg body weight and 0.4 mg nitrite ion per kg body weight, in addition to the naturally occurring levels we ingest every day.
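The dose figures above follow from simple arithmetic. As a rough sketch (assuming about 2 liters of water consumed per day, and noting that 1 ppm in water is approximately 1 mg per liter):

```python
# Daily nitrate dose from drinking water. The 2 L/day consumption figure is
# an assumed typical adult intake, used here only for illustration.

avg_concentration = 1.3    # ppm (mg/L), average U.S. drinking water
mcl = 45.0                 # ppm, Maximum Contaminant Level for nitrate ion
daily_intake_liters = 2.0  # assumed adult water consumption

avg_daily_dose = avg_concentration * daily_intake_liters  # mg/day
mcl_daily_dose = mcl * daily_intake_liters                # mg/day at the legal limit

print(f"Average daily dose from water: {avg_daily_dose:.1f} mg")  # → 2.6 mg
print(f"Daily dose at the MCL:         {mcl_daily_dose:.0f} mg")  # → 90 mg
```

Even at the 45 ppm limit, water would contribute roughly 90 mg per day for an adult, on the same order as the average dietary nitrate intake of about 100 mg.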
The main sources of nitrate contamination of drinking water are animal feedlots, agricultural fertilizers, manured fields, and septic systems. Other sources include industrial wastewater, sanitary landfills, and garbage dumps. Ammonia fertilizers readily break down in the presence of oxygen to form nitrates. Nitrates are very water soluble and can easily move through the soil into the groundwater. It is estimated that up to 40 percent of the nitrogen applied to fields as fertilizer is converted into nitrates and enters water sources as runoff and leachate.
Nitrates themselves are not toxic in the amounts we normally encounter. The toxicity of nitrates is a result of their conversion into nitrites within the body. Nitrites easily enter the blood stream and cause a condition called methemoglobinemia. Hemoglobin is the compound in red blood cells responsible for transporting oxygen throughout the body. Nitrites oxidize the ferrous (+2) iron in hemoglobin to the ferric (+3) state. Hemoglobin containing ferric iron is unable to carry oxygen and is called methemoglobin. Too much methemoglobin in our system causes methemoglobinemia. The condition is characterized by cyanosis (a bluish color to the skin), stupor, and cerebral anoxia (a lack of oxygen to the brain). About one percent of the hemoglobin in adults is normally methemoglobin, and a little less than two percent in children. Levels between 10 and 20 percent can cause methemoglobinemia, and levels greater than 60 percent can cause death.
Infants under 4 months of age are at greater risk from the toxic effects of nitrites than are older children and adults. The methemoglobinemia caused by nitrites in infants is called blue baby disease. Many factors account for the greater susceptibility of infants. Infants consume a greater amount of fluid relative to body weight than adults do. Thus, if the same water is consumed, an infant will receive a much higher nitrate dose relative to its hemoglobin than an adult. The most important factor is that infants secrete very little gastric acid into their stomachs. The pH in their stomachs ranges from 5 to 7, which is favorable for the growth of bacteria that convert nitrates into nitrites; the pH of an adult's stomach is around 2 or 3. Infants also have a deficiency in methemoglobin reductase, the enzyme that converts the non-oxygen-carrying methemoglobin back into oxygen-carrying hemoglobin. Finally, 60 to 80 percent of the hemoglobin in a newborn is hemoglobin F (fetal hemoglobin), which has a greater affinity for nitrites than hemoglobin A (adult hemoglobin). Hemoglobin F is rapidly replaced by hemoglobin A after birth, and by the third month of an infant's life, only 30 percent of the hemoglobin is hemoglobin F.
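The fluid-intake argument can be made concrete with a rough calculation. The body weights and daily intakes below are illustrative assumptions, not values from the article:

```python
# Rough comparison of nitrate dose per kg body weight for an infant vs. an
# adult drinking water at the legal limit. Body weights and daily intakes
# are assumed for illustration only.

water_nitrate = 45.0  # ppm (mg/L), water at the MCL

infant_weight, infant_intake = 4.0, 0.7  # kg, liters/day (formula made with water)
adult_weight, adult_intake = 70.0, 2.0   # kg, liters/day

infant_dose = water_nitrate * infant_intake / infant_weight  # mg/kg/day
adult_dose = water_nitrate * adult_intake / adult_weight     # mg/kg/day

print(f"Infant: {infant_dose:.1f} mg/kg/day")        # → 7.9 mg/kg/day
print(f"Adult:  {adult_dose:.1f} mg/kg/day")         # → 1.3 mg/kg/day
print(f"Ratio:  {infant_dose / adult_dose:.1f}x")    # → 6.1x
```

Under these assumptions, the infant's dose per kilogram is roughly six times the adult's, before the stomach pH, enzyme, and fetal hemoglobin factors are even considered.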
The first case of an infant poisoning due to nitrates in drinking water was reported in 1944. Since then, the number of cases has climbed steadily. There were about 2000 cases of methemoglobinemia in infants reported between 1945 and 1970, with approximately 160 being fatal. Infants consuming milk formula made with water containing nitrate levels greater than the 45 ppm allowable limit are at greatest risk.
The majority of methemoglobinemia cases have been caused by drinking water with nitrate levels greater than 100 ppm. A small percentage of adults are at greater risk than most; these include pregnant women with a deficiency in the enzyme glucose-6-phosphate dehydrogenase and adults with a hereditary deficiency in methemoglobin reductase. Methemoglobinemia can be detected and treated quite easily. Treatment is by injection of methylene blue or by oral administration of ascorbic acid.
Acute poisoning from nitrates has occurred in adults after accidental ingestion. Eight to fifteen grams of sodium or potassium nitrate is fatal. Symptoms include severe abdominal pain, bloody stools and urine, weakness, and collapse. Nitrites are much more lethal, with only 1 g causing death. Symptoms of nitrite poisoning include flushing of the face and extremities, headache, cyanosis, nausea, vomiting, abdominal pain, and collapse. Nitrates are less toxic because of their rapid elimination in the urine.
Another class of compounds associated with nitrates and nitrites is the nitrosamines. Nitrosamines are formed when nitrites react with secondary and tertiary amines from food in the stomach. The main sources of amines in our diet are fish, vegetables, and fruit juices. Nitrosamines have been detected in cured meats, but at very low levels of 20 parts per billion (ppb) or less. Seventy-five to eighty percent of the nitrosamines studied have been shown to cause cancer in laboratory animals, including cancers of the liver, respiratory tract, kidney, urinary bladder, esophagus, stomach, lower gastrointestinal tract, and pancreas.
Epidemiologic studies have yet to link nitrates and nitrites to cancer in humans. One reason is that nitrosamines occur in the ppb or parts-per-trillion range, while doses in the ppm or parts-per-thousand range were needed to induce cancers in animals. High consumption of cured meats has been associated with certain cancers; however, high consumption often indicates a diet lacking in fresh foods and vegetables, which provide protective effects against nitrosamines. The production of nitrosamines is inhibited by vitamins A and E, sulfamate, butylated hydroxytoluene (BHT), butylated hydroxyanisole (BHA), gallic acid, and various amino acids and proteins.
Finally, a few animal studies have shown very high levels of nitrates to have teratogenic effects. Extrapolation of these data to humans is based on 100 percent conversion of nitrate to nitrite in the human body; however, it is estimated that only about 5 percent is actually converted. Taking this into account, the estimated human daily dose of nitrite is 1,000 times or more below the level needed to cause reproductive effects in animals. With this in mind, there is no conclusive evidence that nitrates or nitrites have a teratogenic effect in humans.
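The effect of the conversion-rate assumption on the estimated nitrite dose can be sketched as follows, using the average adult nitrate intake of about 100 mg per day cited earlier:

```python
# Effect of the assumed nitrate-to-nitrite conversion rate on the estimated
# internal nitrite dose. The 100 mg/day intake is the average adult figure
# cited in the text; the two conversion rates are the ones being compared
# (100% assumed in the animal extrapolation vs. ~5% actually converted).

nitrate_intake = 100.0  # mg/day, average adult nitrate intake

dose_full_conversion = nitrate_intake * 1.00  # worst-case assumption
dose_actual_rate = nitrate_intake * 0.05      # ~5% actually converted

print(f"Nitrite dose assuming 100% conversion: {dose_full_conversion:.0f} mg/day")  # → 100 mg/day
print(f"Nitrite dose at ~5% conversion:        {dose_actual_rate:.0f} mg/day")      # → 5 mg/day
```

The realistic conversion rate alone shrinks the estimated internal nitrite dose twenty-fold, which is part of why the extrapolated human dose falls so far below the levels that caused effects in animals.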
Nitrates and nitrites are also a concern for ranchers and farmers. The effects of nitrates have been extensively studied in ruminants (cattle, sheep, goats, etc.). Ruminants have a very good ability to adapt to nitrates and nitrites and to develop tolerances to them. Sheep have a higher tolerance than cattle because they have higher methemoglobin reductase levels.
Acute symptoms of nitrate poisoning in farm animals include gastrointestinal problems, vomiting, salivation, diarrhea, colicky signs, and frequent urination. Inconsistent signs of chronic poisoning include abortion, decreased milk production, infertility, decreased milk fat, Vitamin A deficiency, and slow weight gain.
Forage and rations containing 2% nitrate or less are considered safe for cattle and sheep. Poisoning is best avoided by gradually acclimating mature animals to any high levels of nitrates present in feed or water; sudden changes in nitrate levels can bring on a toxic reaction. Ruminants adapt better to high nitrate levels when urea is added to their feed.
In summary, nitrates do not pose a great risk to the general public. Nitrates and nitrites can cause some health problems, but only at very high levels. Compared to the amounts we receive in our food, the normal levels of nitrates in drinking water should not be a concern. The exception is infants and the few rare susceptible adults who consume water with levels above the 45 ppm limit. Infants are highly susceptible to nitrate poisoning due to a number of factors not applicable to older children or most adults. At present, methemoglobinemia appears to be the only human health problem linked to nitrates and nitrites.
Infants and susceptible adults should not drink water that has nitrate levels greater than 45 mg/L; instead, they should use bottled water or water from a safe source. Contaminated water can be treated by deionization, reverse osmosis, or distillation; ordinary water filters will not remove nitrates. Private well owners can drill deeper wells to reach uncontaminated water. Spinach should be eaten soon after purchase, and all leftovers should be discarded.