COOPERATIVE EXTENSION UNIVERSITY OF CALIFORNIA DAVIS
ENVIRONMENTAL TOXICOLOGY NEWSLETTER


Vol 13 No 1 July 1993

INTRODUCTION

YES, we are still here, and the newsletter will continue as long as we are. So much for threats. Many readers know that I have an aversion to the word "toxics", an undefined word which has ingrained itself in modern bureaucratese (I can invent words too). For a long time, many practicing toxicologists have also argued that the phrase "toxic chemical" is itself redundant. In the first article in this issue I present some highlights of an article which appeared in the Harvard Center for Risk Analysis newsletter, Risk in Perspective. The authors have done an excellent job of covering this topic, and I will be happy to provide copies of the entire article to Cooperative Extension staff. The address of the center is included for those interested in subscribing to their newsletter.

Other articles in this issue include more information about lead, a description of a ciguatera food poisoning outbreak, some information about testing of milk for antibiotics, and of course, the highly popular "toxicology tidbits", which also includes some information not directly related to toxicology, but of interest. I have also included a short article from Morbidity and Mortality Weekly Report concerning the use of rear-facing baby seats in cars equipped with air-bags since it is likely that this information is not readily available elsewhere.

OPTIMAL USE OF "TOXIC CHEMICALS"

A new concept in environmental policy is "toxics use reduction" (TUR). The basic idea is to protect human health and the environment by reducing the use of chemicals judged to be toxic. Massachusetts and Oregon were the first states to enact TUR laws, and other states are considering similar laws. National TUR legislation is also likely to be debated in the years ahead. This article offers an evaluation of the TUR concept using the principles of risk analysis.

The Case for TUR

Proponents of TUR believe that toxic chemicals are bad and that less of a bad thing is a good thing. According to Dr. Kenneth Geiser of the University of Massachusetts at Lowell, an advocate of the Massachusetts TUR legislation, "these laws bypass debates over acceptable levels of toxicity and the risk of specific levels or releases. They rest on a simple argument: the use of every toxic chemical should be reduced or eliminated."

Some proponents of TUR see it as a first step toward banning toxic chemicals. At a recent conference sponsored by Resources for the Future, Dr. Barry Commoner argued that the best way to keep toxic chemicals out of the environment is to stop producing and using them. He cites as success stories the bans of DDT and PCBs in the 1970's and the phaseout of lead in gasoline in the 1980's. Commoner argues that these examples should serve as models of sustainable industrial development.

Toxic Versus Nontoxic?

In practice, TUR laws define "toxic chemicals" by legislative mandate. The Massachusetts TUR list started with 300 chemicals and now includes over 900 chemicals that are targets of use-reduction planning efforts.

From a scientific perspective, the phrase "toxic chemicals" is a misnomer. No chemical is free of harmful effects at every dose. Drinking 1.5 quarts of water per day is normal and healthy, while drinking 15 quarts of water per day would be lethal. Similar statements can be made about sugar, salt, aspirin, alcohol, and any other chemical compound.

Since all chemicals can be toxic under certain circumstances, it is reasonable to question the rationale for a chemical's inclusion on or exclusion from a list of "toxic chemicals." Indeed, without considering the likelihood and degree of human exposure and ecological risks resulting from specific applications of chemicals, there is no defensible method for determining which chemicals should be included on TUR lists. Scientists at the Harvard Center for Risk Analysis have examined the various lists proposed by TUR advocates and can find no sound and consistent technical basis for the lists that have been generated.

A key problem is that a particular chemical may cause significant risk or no risk depending upon how it is used in commerce. The phaseout of lead in gasoline was a success story because this particular use of lead posed serious and widespread risks to children, adults, and the environment. The use of lead-acid batteries in automobiles is currently being reduced, although EPA estimates that the health and environmental risks of this application are not particularly great. Other applications of lead, such as its use in chimney flashing, pose relatively little danger to the public.

Since the potential for human exposure and risk varies widely from one chemical use to another, the focus of TUR laws should be changed from lists of chemicals to lists of chemical applications that are known or suspected to pose significant risk to human health and the environment. This strategy is already widely used in other regulatory settings such as the registration of drugs for particular clinical indications or the registration of pesticides for use on particular crops.

Competing Risks of Substitutes

Chemical substitution is a primary means of achieving TUR. Just because a substitute chemical has escaped inclusion on a TUR list does not mean that its use is innocuous. If the use of one listed chemical is reduced, it is critical to assess what chemicals, processes, and associated risks will replace it. Unless such competing risks are evaluated, TUR may fail to achieve its risk-reduction goals and may actually exacerbate health and environmental risks in certain settings.

Chlorinated drinking water presents a useful example for reflection because the chemical byproducts of chlorinated drinking water may cause cancer. Nonetheless, the addition of chlorine to drinking water is highly effective in combating microbial contamination and human disease. None of the alternative disinfection processes that have been proposed to date are equally effective or economical. If chlorine is banned in the near future, communities that cannot afford expensive alternatives to chlorine may be forced to expose their citizens to injurious microbial diseases. South America has recently experienced cholera epidemics as a result of inadequate disinfection of drinking water supplies.

All TUR legislation should be written to require that chemical users make risk-risk comparisons before engaging in TUR.

The Benefits of Toxic Chemicals

The success stories of TUR tend to involve the end use of chemicals, chiefly as solvents and cleaners. In some applications, the uses of these chemicals can be reduced considerably without incurring economic penalty or loss of benefits to commercial users and consumers. However, most TUR lists also include six of the eight organic chemical building blocks, from which many other chemicals and synthetic products are made. These are butadiene, benzene, ethylene, propylene, xylene, and toluene.

Nor is it clear that we should always promote less use of "toxic" chemicals. Scientists in industry and universities are discovering new applications of toxic chemicals that promise potential benefits to the public. While dioxin is among the most feared chemicals because of its toxicity and presence in defoliants used in Vietnam, recent scientific evidence suggests that dioxins elicit antiestrogenic responses in rodents and in human breast cancer cell lines. Some of the less toxic congeners are currently being investigated as antitumor agents that may be useful in the treatment of breast cancer.

TUR legislation should be written to require consideration of the benefits of chemicals, along with less risky ways to use them.

Conclusion

Broad-scale application of TUR is inefficient unless most uses of a chemical are associated with significant risks and few benefits. All TUR legislation should include a significant-risk requirement designed to focus scarce public and private sector resources on specific industrial processes and applications that are known or suspected to cause significant risks. Our ultimate goal is the optimal use of toxic chemicals in society.

REF: Risk in Perspective, 1(2), May 1993. (Harvard Center for Risk Analysis, Harvard School of Public Health, 718 Huntington Avenue, Boston, Massachusetts, 02115).


CHINESE PATENT MEDICINES: MERCURY

Chinese patent medicines continue to be prepared according to ancient formulas, some of which include two natural salts of mercury: cinnabar (mercuric sulfide) and calomel (mercurous chloride). Gastrointestinal absorption of mercury from these salts is considered to be very low since both are practically insoluble in water. Recent studies, however, have shown that a tiny amount is absorbed and distributed to the kidneys (the primary site of toxic action for inorganic mercury). Thus, there is concern that long-term use of these preparations may lead to chronic mercury poisoning. The following table is an abridged version of one which appeared in the article referenced below.

Chinese Patent Medicines Which Contain Mercury

English Name              Form      Cinnabar (%)
She Dan Chen Pi San       Powder    12.3
Xi Gua Shuang             Powder     2.0
An Gong Niu Huang Wan     Pill      11.1
Ci Zhu Wan                Pill      14.4
An Shen Bu Nao Pian       Pill       6.9
Bai Zi Yang Xin Wan       Pill       3.8
Zhu Sha An Shen Wan       Pill      17.4
Jian Nao Wan              Pill       4.0
Qi Li San                 Powder     7.0
Zi Jin Ding               Powder     6.25
Bao Ying Dan              Powder     4.98
Hu Po Bao Long Wan        Pill       4.7
Peaceful                             0.1
Tse Koo Choy              (contains 9.6 mg calomel; total weight not presented)

REF: Kang-Yum, E. and Oransky, S.H., Chinese patent medicine as a potential source of mercury poisoning. Vet. Human. Toxicol. 34(3):235-238, 1992.


METHEMOGLOBINEMIA IN AN INFANT -- Wisconsin, 1992

Methemoglobinemia among infants is a rare and potentially fatal condition caused by genetic enzyme deficiencies, metabolic acidosis, and exposure to certain drugs and chemicals. The most widely recognized environmental cause of this problem is ingestion of nitrate-containing water. Ingestion of copper causes abdominal discomfort, nausea, diarrhea, and in cases of high-level exposure, vomiting. This report summarizes an investigation by the Division of Health, Wisconsin Department of Health and Social Services, of methemoglobinemia associated with ingestion of nitrate- and copper-containing water in an infant during 1992.

A 6-week-old girl (birth weight: 7 lbs 9 oz) was hospitalized June 1 for treatment of dehydration. On admission she weighed 6 lbs 10.5 oz and appeared "dusky." She was afebrile and had no signs of infection. A history obtained from her parents indicated that during her first 3 weeks she had appeared well and had consumed approximately 20 ounces per day of soy-based formula (consisting of a liquid concentrate diluted with 1 part water). During her 5th week, she developed loose stools and began to vomit after eating.

Diagnoses on admission included vomiting with failure to thrive and dehydration secondary to vomiting. She was treated and was discharged on June 2. On June 8, because of an acute weight loss (6 oz) and limited consumption of formula (< 3 oz) during the previous 24 hours, she was readmitted to the local hospital. On admission, she weighed 6 lbs 12 oz and appeared cachectic (emaciated). Her hemoglobin level was 13 g/dL, with 21.4% methemoglobin. She continued to vomit yellow- to blue-tinged liquid following ingestion of fluids. Methemoglobinemia was diagnosed, and supportive treatment, including oral fluids and oxygen, was initiated. Within 24 hours, her methemoglobin level declined to 11.1%. Further evaluation at a referral center did not identify any underlying medical problems. Since discharge, her parents have used bottled water for drinking and for preparation of formula and food.

The family's house was situated between a river bank and approximately 100 acres of corn and alfalfa. Water was supplied by a 28-foot-deep vacuum-sandpoint well located in a basement pump room. Water used for drinking and food preparation was filtered by a reverse-osmosis (R/O) unit installed for nitrate removal when the family purchased the house in 1989. Water samples collected from the R/O unit and from the well during the infant's hospitalization contained 9.9 mg/L and 58 mg/L nitrate-N, respectively (the U.S. EPA maximum contaminant level (MCL) for nitrate-N in drinking water is 10 mg/L). During the investigation in late July, the well water contained 39.6 mg/L nitrate-N and was free of coliform bacteria. An early-morning, first-draw sample collected from the kitchen faucet contained 7.8 mg/L copper (the EPA MCL for copper in drinking water is 1.3 mg/L). Results of tests for corrosivity included a pH of 6.3 and an alkalinity of 16 mg/L (as CaCO3). Flushing the kitchen faucet for several minutes reduced the copper level to 0.2 mg/L. A midday water sample from the R/O system contained 0.6 mg/L copper.

Based on these analyses, the Wisconsin Division of Health recommended that the family use bottled water for drinking and for preparation of food.

Editorial Note: In 1991 and 1992, a total of 1,825 exposures to nitrates/nitrites -- including 542 among children <6 years of age -- from environmental and other sources were reported to the American Association of Poison Control Centers. The most common environmental cause of methemoglobinemia in infants in the United States is ingestion of water contaminated with nitrates from agricultural fertilizers, barnyard runoff, or septic-tank effluents. Acute toxicity may result after nitrate is reduced to nitrite in the stomach and saliva. Nitrite reacts with the oxygen-carrying protein hemoglobin, oxidizing it to methemoglobin, which is unable to transport oxygen to the tissues. Methemoglobin levels above 10% may result in clinically apparent cyanosis, and levels above 60% can cause stupor, coma, and death if the condition is not quickly treated.

The symptoms described in this report appear to have been induced by simultaneous exposure to copper and nitrates at levels close to the federal drinking water standards for these substances; this phenomenon has not previously been implicated as contributing to the development of methemoglobinemia in infants. Copper is an effective emetic and gastrointestinal irritant, and ingestion of water containing copper levels of 2.8-7.8 mg/L has been associated with vomiting and diarrhea among adults and school-aged children. Although the dose required to cause acute symptoms in infants is unknown, children aged <1 year may be more sensitive to copper than older persons. Elevated copper levels in water used to prepare the infant's formula may have caused loose stools and vomiting after eating. Repeated vomiting and diarrhea may have resulted in dehydration and weight loss and, in turn, reduced gastric acidity sufficiently to enhance the growth of nitrate-reducing bacteria and facilitate conversion of ingested nitrates to nitrites. In addition, systemic copper poisoning has been reported to increase methemoglobin levels independent of nitrate exposure -- an effect attributed to the ability of copper to inhibit red cell enzymes needed to reduce endogenous methemoglobin.

The major source of dissolved copper in drinking water is copper pipes in household plumbing. Water that stands overnight in copper pipes may contain copper levels that exceed the federal drinking water standard. This problem is most often associated with corrosive water supplies or with new copper pipes and can usually be prevented by flushing the household plumbing before using water for drinking or food preparation.

This report underscores that drinking water may be contaminated with nitrates and/or copper in some areas of the United States. Accordingly, health practitioners should routinely advise pregnant women to have water from private wells tested for nitrate. In addition, copper exposure should be considered in the differential diagnosis of unexplained gastrointestinal symptoms.

REF: Morbidity and Mortality Weekly Report, 42(12), April 2, 1993


BICYCLE HELMET PROMOTION PROGRAMS -- Canada, Australia, and United States

The use of bicycle helmets substantially reduces the risk for serious head injuries during bicycle-related crashes. Despite this benefit, epidemiologic data indicate a worldwide low prevalence of helmet use. Strategies to increase the use of bicycle helmets in the United States and other countries include subsidies, legislation, and education. This report summarizes information regarding three strategies to increase bicycle helmet use and the impact of implementing these approaches in Canada (helmet subsidies), Australia (legislation), and the United States (education).

Canada. To assess whether the provision of bicycle helmets at reduced cost increases the use of helmets, the Division of General Pediatrics, Hospital for Sick Children, in Toronto conducted a randomized, controlled study in Toronto from May through September 1992. Students in three elementary schools in low-income areas were offered bicycle helmets for $10 U.S. These students were compared with students in similar low-income areas who were not offered subsidized helmets. Reported helmet ownership increased from 10% to 47% among students in the schools where subsidized helmets were offered, and reported helmet use increased from 6% to 34%. However, there were no statistically significant differences in rates of observed helmet use between these areas (3% before the study to 18% after) and the areas where no subsidy was offered (1% before to 21% after).

Australia. In July 1990, the state of Victoria enacted laws that made bicycle helmet use compulsory. Specifically, these laws required that all persons cycling on roads, footpaths, or separate bicycle paths, and in public parks wear a securely fitted, approved bicycle helmet. During the 10 years preceding enactment of these laws, the state conducted promotional activities to increase helmet use, including educational campaigns, rebate programs, and publicity campaigns on radio and television. Direct observation surveys indicated the prevalence of helmet use among persons aged 5-11 years in Victoria increased from 26% before enactment of the law to 80% following enactment.

United States. During 1986, the Children's Bicycle Helmet Coalition in Seattle implemented a community-based education program to reduce bicycle-related head injuries among children by promoting the use of helmets. Components of this program included public and physician education, school safety programs, an outreach campaign for low-income populations, extensive media coverage, and informational brochures in monthly insurance and utility bills. An evaluation of the impact of this program indicated that, from 1986 through 1992, helmet use among 5-15-year-old children increased from 5% to 38%. In addition, the number of children in this age group treated for bicycle-related head injuries at the regional trauma center in Seattle decreased 50% from 1990 through 1992.

Editorial Note: Among the 96 million cyclists in the United States, approximately 950 fatalities and 580,000 emergency department visits occur annually as a result of bicycle injuries. Approximately 62% of these deaths and 32% of the injuries involve head trauma. Helmets are effective in reducing head injuries: the estimated risk for head injuries among persons not using helmets is 3.9-6.7 times greater than that among persons using helmets. However, fewer than 2% of U.S. children and fewer than 10% of all U.S. bicyclists wear helmets.

The Injury Prevention Program of the World Health Organization is coordinating a worldwide initiative to increase the use of motorcycle and bicycle helmets. The initiative focuses on three approaches: developing and testing helmets, promoting helmet use, and evaluating helmet-use promotion strategies. During the Second World Conference on Injury Control, to be held in May 1993, scientists and public health professionals will focus on promoting and evaluating helmet use.

REF: Morbidity and Mortality Weekly Report, 42(11), March 26, 1993.


CIGUATERA FISH POISONING -- Florida, 1991

Twenty cases of ciguatera fish poisoning from consumption of amberjack were reported to the Florida Department of Health and Rehabilitative Services (HRS) in August and September 1991.

On August 9, the Florida HRS was notified of eight persons who developed one or more of the following symptoms: cramps, nausea, vomiting, diarrhea, and chills and sweats within 3-9 hours (mean: 5 hours) after eating amberjack at a restaurant on August 7 or August 8; duration of symptoms was 12-24 hours. Three persons were hospitalized. By August 12, patients began to report pruritus (itching) of the hands and feet, paresthesia (prickling), dysesthesia (abnormality of sensation), and muscle weakness.

The Food and Drug Administration analyzed 19 amberjack samples for ciguatera-related toxin; the samples, obtained from restaurants and grocery stores in Florida and Alabama, were believed to have originated from a single lot handled by a Key West dealer. Forty percent of the specimens tested by mouse bioassay were positive for ciguatera-related biotoxins.

Editorial Note: Ciguatoxin is a naturally and sporadically occurring fish toxin that affects a wide variety of popularly consumed reef fish; it becomes more concentrated as it moves up the food chain. Ciguatoxin and related toxins are derived from dinoflagellates, which herbivorous fish consume while foraging through macro-algae. Larger predator reef fish (e.g., barracuda, grouper, amberjack, surgeon fish, sea bass, and Spanish mackerel) have been implicated in previous outbreaks.

Humans ingest the toxin by consuming either herbivorous fish or carnivorous fish that have eaten contaminated herbivorous fish. The toxin is tasteless, and because it is heat-stable, cooking does not render the fish safe for consumption. As in this outbreak, ciguatera fish poisoning is diagnosed by the characteristic combination of gastrointestinal and neurologic symptoms in a person who eats a suspected fish. The diagnosis is supported by detection of ciguatoxin in the implicated fish. No specific, effective treatment for ciguatera fish poisoning has been proven; supportive treatment is based on symptoms.

REF: Morbidity and Mortality Weekly Report, 42(21), June 4, 1993.


LEAD INTOXICATION ASSOCIATED WITH CHEWING PLASTIC WIRE COATING -- Ohio

In December 1991, a venous blood lead level (BLL) of 50 ug/dL was detected in a 46-year-old Ohio man during a routine pre-employment examination. He was referred to a university-based pharmacology and toxicology clinic for further evaluation; clinic physicians investigated the case. Although a repeat BLL obtained 1 month later was 51 ug/dL, he reported no exposure to known sources of lead during the interim. However, he reported numbness of his fingers and palms, tinnitus (ringing in the ear), and a possible decrease in his ability to perform basic arithmetical calculations.

He had been employed for approximately 20 years as a microwave technician, during military service and while employed at a television station; he reported no history of exposure to lead from soldering or welding. He had no activities or hobbies associated with exposure to lead or lead products and no previous bullet or birdshot wounds, and he denied drinking illicitly distilled alcohol or using lead additives in his car.

His residence was built in 1974 (after lead was banned from use in residential paint), and household water was obtained from a well. In January 1992, blood lead testing of family members revealed levels of 5 ug/dL for his wife and <5 ug/dL for his 17-year-old child. His only medication was ranitidine (ranitidine alters gastric acidity, which theoretically can influence gastrointestinal absorption of lead), which he had used for the previous 1 1/2 years for "indigestion." He reported occasional cigarette smoking.

Although results of a neurologic examination were normal, neuropsychiatric testing on March 13 demonstrated mild memory deficits, as evidenced by abnormalities on verbal and figural memory tests. Because of these abnormalities, beginning March 13, he was treated for 19 days with dimercaptosuccinic acid (DMSA), an oral chelating agent, and by April 4 his BLL had decreased to 13 ug/dL. However, BLLs on May 15 and July 23 were 49 ug/dL and 56 ug/dL, respectively. During a July 1992 follow-up clinic visit, he mentioned that for approximately 20 years he had habitually chewed on the plastic insulation that he stripped off the ends of electrical wires. Samples of the copper wire with white, blue, and yellow plastic insulation were obtained and analyzed for lead content. The clear plastic outer coating (present on all colors of wire) and the copper wire contained no lead; however, the colored coatings contained 10,000-39,000 ug of lead per gram of coating. On receipt of these results, he was immediately instructed to discontinue chewing the wire coating.

Editorial Note: Plastic coatings previously have been associated with lead exposure in the burning of lead-containing plastics during repair of a storage tank, the production of plastics, and the manufacture and use of stabilizers and pigments in the plastics industry. Although lead exposure also can occur among workers who burn the plastic coating off copper wire to recycle the copper, lead intoxication by this route has not been reported.

Lead compounds may be employed in the production of colored plastics (in which lead chromates are used as pigment) and in the manufacture of polyvinyl chloride (PVC) plastics (in which 2%-5% lead salts [including lead oxides, phthalate, sulfate, or carbonate, depending on the desired quality of the final product] are used as stabilizers). Although environmental regulation has reduced considerably the amount of lead used in the United States in the manufacture of PVC plastics, manufacturers of electrical wire and cable continue to produce PVC stabilized and/or pigmented with lead compounds.

REF: Morbidity and Mortality Weekly Report, 42(24), June 25, 1993.


LEAD POISONING IN BRIDGE DEMOLITION WORKERS -- Georgia, 1992

Bridge demolition and maintenance are leading causes of lead poisoning among workers in the United States.

In February 1992, a temporary-service company was subcontracted by a steel corporation to cut apart steel beams that had been removed from a local bridge. Four men were hired. All four were immigrants from Mexico; only two spoke English. The work was performed outdoors, without protective equipment or training, using oxy-acetylene flame-cutting torches. In April, all four workers reported light-headedness and shortness of breath from the metal fumes, requiring frequent fresh-air breaks during the day. In early May, all four workers developed a variety of symptoms including headache, dizziness, fatigue, sleep disturbance, confusion, forgetfulness, arthralgia (joint pain), and abdominal pain. The severity of symptoms intensified through June, with nausea, vomiting, constipation, weakness, shortness of breath, loss of balance, and nervousness.

In early June, the steel company suggested BLL examinations of the workers; their BLLs, measured at the local health department, were 93, 90, 59, and 66 ug/dL. The workers' employment was terminated in late June on receipt of the test results by the company.

The health department recommended that the workers promptly seek medical evaluation and care; however, because they had no medical insurance and both the subcontractor and the steel company declined to assume the costs of treatment, the workers initially delayed seeking medical treatment. They subsequently contacted an attorney, who initiated worker's compensation proceedings and arranged for a local hospital to admit them for treatment. Each worker received three 5-day chelation treatments with intravenous calcium disodium ethylenediamine tetraacetic acid approximately 15 days apart. All four reported improvement but continued to experience memory deficits, arthralgias, headaches, dizziness, and/or sleep disturbances.

Editorial Note: An estimated 90,000 bridges in the United States are coated with lead-containing paints. The findings in this report are consistent with other studies that indicate that minority groups are disproportionately exposed to lead and other occupational hazards.

REF: Morbidity and Mortality Weekly Report, 42(20), May 28, 1993.


TOPICS IN MINORITY HEALTH

Childbearing Patterns Among Selected Racial/Ethnic Minority Groups -- United States, 1990

Childbearing patterns in the United States vary markedly among racial/ethnic groups. Groups with high rates of teenage childbearing traditionally have elevated risks for low birthweight (LBW [<2500 g (5 lb 8 oz)]) and other poor birth outcomes associated with serious infant morbidity, permanent disability, and death. To characterize childbearing variations among American Indians/Alaskan Natives, Asians/Pacific Islanders, and Hispanic ethnic groups, CDC's (Centers for Disease Control) National Center for Health Statistics analyzed data from U.S. birth certificates for 1990, the primary source for monitoring childbearing patterns and maternal and infant health.

Overall, the fertility rate (births per 1000 women aged 15-44 years) in 1990 was 70.9. The fertility rate for Hispanics (107.7) was approximately 71% higher than that for white non-Hispanics (62.8). Fertility rates varied even more markedly among subgroups, from 40.8 (Japanese Americans) to 118.9 (Mexican Americans).
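As a quick arithmetic check of that comparison, here is a minimal Python sketch (illustrative only, not part of the original report; the rates are the published 1990 figures quoted above):

    # Percent by which the Hispanic fertility rate exceeds the white
    # non-Hispanic rate; both figures are from the 1990 birth data above.
    hispanic_rate = 107.7            # births per 1,000 women aged 15-44
    white_non_hispanic_rate = 62.8

    percent_higher = (hispanic_rate / white_non_hispanic_rate - 1) * 100
    print(f"{percent_higher:.0f}% higher")   # prints: 71% higher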

For teenagers (aged <20 years), birth rates were highest for Hawaiians, black non-Hispanics, and Hispanics. In particular, rates for teenaged Mexican Americans, Puerto Ricans, black non-Hispanics, and Hawaiians were each two to three times the rates for white non-Hispanics, Cuban Americans, and Filipino Americans and up to 31 times the rates for Chinese Americans, Japanese Americans, and "other" Asians/Pacific Islanders. Rates for American Indian/Alaskan Native teenagers were approximately twice those for white non-Hispanic teenagers.

(Note: Copies of this article are available to Cooperative Extension employees by calling 916-752-2936)

REF: Morbidity and Mortality Weekly Report, 42(20), May 28, 1993.


TOXICOLOGY TIDBITS

Too Many Cooks Spoil the Stew

The ration mix-up of the month occurred on a farm in north-eastern Alberta, where one cook too many became involved in the formulation of a cattle ration. The first person placed two old automobile batteries in the bucket of a front-end loader and went off for lunch. Another individual started and operated the tractor without even looking at the bucket. This driver picked up a bucket full of grain and dumped the load into a feed grinder. The resultant ration was fed to 140 head of cattle. Several animals died and many became sick. The attending veterinarian drew blood samples from 21 surviving animals. Dr. Roy Smith of the Animal Health Laboratory in Edmonton analyzed these samples for lead content. Lead levels ranged from 0.227 parts per million (ppm) to 0.762 ppm. Smith says that serum lead levels higher than 0.050 ppm would suggest significant exposure to lead in cattle.

REF: Herd Health Memo, 10, April 1993.

2,4-D Cancer Link Weakly Suggestive

After wrestling for two days with the "inadequate" available data from human and animal studies, half of the 10 attending members of EPA's Special Joint Committee on the Weight of Evidence of Carcinogenicity of 2,4-D concluded April 2 that the evidence was "weakly suggestive" of cancer causation.

Composed of scientists designated by the agency's Science Advisory Board and Scientific Advisory Panel, the 2,4-D panel easily reached agreement that the evidence of carcinogenicity was not sufficiently persuasive to merit any of the first four descriptors suggested by EPA -- "proven human carcinogen," "highly probable...," "probable..." or "somewhat probable..." -- but then the haggling over descriptive terminology started.

Following a lengthy discussion, the panel's chair, Dr. Genevieve Matanoski, Johns Hopkins University School of Hygiene and Public Health, called for a show of hands on "weakly possible," and three members raised their hands. Next, five members voted for "weakly suggestive" and Dr. Richard Monson of the Harvard University School of Public Health cast the lone vote for "possible."

Although Matanoski did not vote, earlier she said she believed the human epidemiologic data "in and of itself is very weakly suggestive... I think it's very weak evidence even without any animal data."

REF: Kansas Pesticide Newsletter, 16(5), May 13, 1993.

Lyme Disease

In 1992, there were 231 cases of Lyme disease reported in California. The greatest number of cases occurred on the East Coast, with 1,119 in Pennsylvania and 3,370 in New York.

REF: Morbidity and Mortality Weekly Report, 42(18), May 14, 1993.

Economic Impact of Motor-Vehicle Crashes -- United States, 1990

Injuries resulting from motor-vehicle crashes are the leading cause of death for persons of every age from 6 through 33 years. Motor-vehicle crashes during 1990 accounted for 44,531 fatalities, 5.4 million non-fatal injuries, and 28 million damaged vehicles, at an estimated total cost of $137.5 billion. Major sources of cost were property damage ($45.7 billion [33%]), productivity losses in the workplace ($39.8 billion [29%]), medical-care expenses ($13.9 billion [10%]), and losses related to household productivity ($10.8 billion [8%]).

In 1990, crashes that involved any alcohol (i.e., blood alcohol concentration [BAC] level >0.01 g/dL) cost $46.1 billion and represented approximately 33% of all economic costs attributed to motor-vehicle crashes. Of this amount, $37.5 billion (81%) reflected crashes in which a driver or pedestrian was legally intoxicated (i.e., a BAC of at least 0.10 g/dL in most states). Alcohol use was disproportionately involved in crashes associated with death or critical injury, accounting for an estimated 50% of total incidence and 55% of total cost for these crashes. In contrast, alcohol was involved in approximately 15% of noninjury-related crashes.
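To make the cost-share arithmetic in the two preceding paragraphs explicit, the following short Python sketch (illustrative only; all dollar figures, in billions, are those reported above) divides each component by the relevant total:

    # Shares of the estimated $137.5 billion total cost of 1990 crashes.
    total_cost = 137.5
    components = {
        "property damage": 45.7,
        "workplace productivity losses": 39.8,
        "medical care": 13.9,
        "household productivity losses": 10.8,
    }
    for name, cost in components.items():
        print(f"{name}: {cost / total_cost:.0%}")   # 33%, 29%, 10%, 8%

    # Share of alcohol-involved crash costs attributable to crashes in which
    # a driver or pedestrian was legally intoxicated.
    print(f"legally intoxicated share: {37.5 / 46.1:.0%}")   # about 81%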

The economic impact of motor-vehicle crashes during 1990 was approximately 2.5% of the gross domestic product in the United States.

REF: Morbidity and Mortality Weekly Report, 42(23), June 18, 1993.

Warnings on Interaction Between Air Bags and Rear-Facing Child Restraints

Air bags and child safety seats are effective in preventing deaths and serious injuries from motor-vehicle crashes, and child safety seats are required by law in all 50 states. However, laboratory crash test data indicate a potential for injury if a child is placed in a rear-facing restraint in the front seat of any vehicle equipped with a passenger-side air bag. Although no children have been injured in this way, parents should not use a rear-facing restraint in this manner.

In a crash, a rear-facing child restraint with its back close to the instrument panel could be struck by the rapidly inflating air bag, and a child in the restraint could be seriously injured (Figures A and B). Rear-facing child restraints must be used in the rear seat of vehicles with passenger-side air bags. To be properly protected, infants must ride in a rear-facing child restraint until they weigh 20 pounds or are approximately 1 year of age. Those vehicles with passenger-side air bags and without back seats are therefore not suitable for rear-facing child restraints. This consideration should be addressed when a family car is purchased or rented.

Parents should always read and follow the child restraint instructions and the vehicle owner's manual for specific directions on where and how to install a particular child restraint in a particular vehicle. Although all children should travel in the back seat of vehicles, forward-facing child restraints may be used in the front seat of a vehicle equipped with a passenger-side air bag if the child's age and weight meet the restraint manufacturer's requirements; the vehicle seat should be moved as far back as possible so the child is positioned similarly to a restrained adult.

Industry is pursuing technologic solutions to reduce the compatibility problem. Government, industry, and professional organizations are developing public information strategies to advise the public of the necessary precautions.

REF: Morbidity and Mortality Weekly Report, 42(14), April 16, 1993.

Milk Quality Assurance

The National Conference on Interstate Milk Shipments in April 1991 adopted several proposals regarding milk quality which have been incorporated into the Pasteurized Milk Ordinance (PMO). The PMO sets the federal standard governing the quality of the nation's milk supply.

One new requirement was that, starting January 1, 1992, all truckloads of milk arriving at milk processing plants must be tested for beta lactam antibiotics before being unloaded. (Many New York State milk plants were already doing this, but records of results were not available from a central source.) As of the same date, all positive drug tests on truckloads of milk must be reported to the NYS Department of Agriculture and Markets.

For the 1992 calendar year, less than one-tenth of one percent of all milk in tank trucks arriving at commercial milk plants in New York tested positive for beta lactam drug residues. Reports received by the Division of Milk Control showed that 263 truckloads or part-loads tested positive; in some cases, only one compartment of a two-compartment truck was positive. The total milk represented by these positive truckloads was 7.6 million pounds, all of which was dumped (7.6 million pounds divided by 8.93 billion pounds delivered to NY plants equals 0.08%; divided by an estimated 11.9 billion pounds of milk produced in NY, it equals 0.06%).
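The percentages in parentheses are straightforward ratios; a minimal Python sketch (illustrative only, using the pound figures quoted above) reproduces them:

    # Milk in positive (dumped) truckloads as a fraction of New York's 1992 volume.
    dumped_lb = 7.6e6        # pounds of milk in truckloads testing positive
    delivered_lb = 8.93e9    # pounds delivered to NY plants in 1992
    produced_lb = 11.9e9     # estimated pounds produced in NY in 1992

    print(f"{dumped_lb / delivered_lb:.3%} of milk delivered to plants")  # about 0.085%
    print(f"{dumped_lb / produced_lb:.3%} of milk produced in NY")        # about 0.064%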

Furthermore, the Division of Milk Control (NYS Ag & Markets) routinely and randomly tests for beta lactam drug residues in samples of finished dairy products from commercial plants. During 1992 there were no positive drug residue tests in such samples.

In relation to the total dairy industry and to consumer interests, it is evident that even the very small percentage of milk that tests positive for drug residues (usually at very low concentrations) is intercepted and dumped before being unloaded at milk plants. The absence of positive residue tests on finished dairy products from commercial plants indicates that this monitoring is effective.

REF: Veterinary Update, Cornell Veterinary Extension, April 1993.

Salinomycin Toxicosis in Pigs

Salinomycin toxicosis caused death in 25 of 150 affected 16-week-old pigs. Deaths occurred one day after a change to a feed containing floor sweepings from a mill, hay, and other feedstuffs. Affected pigs had dark red to brown urine and ataxia but were alert. Some pigs were listless, with a fever of 104°F preceding death. Pulmonary edema, dark brown urine, and nephrosis were noted at necropsy. The feed contained 720 ppm salinomycin, which is reported to be toxic at 166 ppm. Cattle receiving the same ration had no problems. Removal of the feed ended the problem.

REF: Lab Notes, 6(2), 1993, California Veterinary Diagnostic Laboratory System.

Moldy Sweet Potatoes

Moldy sweet potatoes, fed as a supplement, caused dyspnea, emphysema, interstitial pneumonia and death in 30 of 180 cattle.

REF: Lab Notes, 6(2), 1993 California Veterinary Diagnostic Laboratory System.

Question Link Between Human Lung Cancer and Pet Bird Exposure

Veterinarians should be aware of a recent report in which Kohlmeier et al (1) concluded that pet bird ownership is associated with increased risk of developing primary lung cancer in human beings. This report demonstrates a misunderstanding of causal inference at the population and individual levels. Concluding that an exposure is a risk factor for a population must be based on valid findings.

Determining whether an exposure causes a disease in an individual is difficult, but such determination can be supported by demonstrating biological plausibility. Unfortunately, the mechanisms of carcinogenesis suggested by Kohlmeier et al are not consistent with all available information. Although inhalation of avian antigens may cause hypersensitivity pneumonitis, neither hypersensitivity pneumonitis nor pulmonary fibrosis, which occasionally results, is associated with lung cancer. In addition, avian particulates, owing to their size, are not likely to reach the alveoli, nor have they been proven to be carcinogenic. Finally, a mycologic pathway is unlikely, given that pet birds seldom are a source of Cryptococcus neoformans, even among immunosuppressed individuals, because few birds shed this organism and there is little aerosolization from feces.

In conclusion, there is little justification for the causal inference by Kohlmeier et al that pet bird ownership is a risk factor for lung cancer and that human beings should avoid pet bird exposure.

(1) Kohlmeier L, Arminger G, Bartolomcycik S, et al. Pet birds as an independent risk factor for lung cancer: case-control study. Br Med J 1992;305:986-989.

REF: JAVMA, 202(9), May 1, 1993.


Arthur L. Craigmill, PhD.
Environmental Toxicology
University of California
Davis, CA 95616-8588
(530) 752-2936  FAX: 752-0903
Email: alcraigmill@ucdavis.edu