COOPERATIVE EXTENSION UNIVERSITY OF CALIFORNIA
ENVIRONMENTAL TOXICOLOGY NEWSLETTER


Vol. 15 No. 2 May 1995

In This Issue


Cryptosporidium Infections Associated with Swimming Pools — Dane County, Wisconsin, 1993

In March and April 1993, an outbreak of cryptosporidiosis in Milwaukee resulted in diarrheal illness in an estimated 403,000 persons. Following that outbreak, state and local officials investigated smaller outbreaks of cryptosporidiosis associated with swimming pools in nearby Dane County; the figures that follow describe those case-patients. The median age of ill persons was 4 years (range: 1–40 years). Reported signs and symptoms included watery diarrhea (94%), stomach cramps (93%), and vomiting (53%). Median duration of diarrhea was 14 days (range: 1–30 days). Swimming in a pool or lake during the 2 weeks preceding onset of illness was reported by 82% of case-patients and 50% of controls. Twenty-one percent of case-patients and 2% of controls (matched odds ratio [MOR]=7.3; 95% confidence interval [CI]=0.9–59.3) reported swimming in pool A.

To limit transmission of Cryptosporidium in Dane County pools, state and local public health officials implemented the following recommendations: 1) closing the pools that were epidemiologically linked to infection and hyperchlorinating those pools to achieve a disinfection value (CT, the pool chlorine concentration in parts per million multiplied by contact time in minutes) of 9600; 2) advising all area pool managers of the increased potential for waterborne transmission of Cryptosporidium; 3) posting signs at all area pools stating that persons who have diarrhea or have had diarrhea during the previous 14 days should not enter the pool; 4) notifying area physicians of the increased potential for cryptosporidiosis in the community and requesting that patients with watery diarrhea be tested for Cryptosporidium; and 5) maintaining laboratory-based surveillance in the community to determine whether transmission was occurring at other sites (e.g., child-care centers and other pools).

Cryptosporidium oocysts are small (4–6 µm), are resistant to chlorine, and have a high infectivity. The chlorine CT of 9600 needed to kill Cryptosporidium oocysts is approximately 640 times greater than that required for Giardia cysts. The ability of pool sand-filtration systems to remove oocysts under field conditions has not been well documented, but such systems would not be expected to remove them effectively. Because of the large number of oocysts probably shed by symptomatic persons, even limited fecal contamination could result in sufficient oocyst concentrations in localized areas of a pool to cause additional human infections.
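For scale, the CT arithmetic implies very long contact times at ordinary pool chlorine residuals. A minimal sketch in Python (the 9600 figure is from the article; the chlorine concentrations in the loop are illustrative assumptions, not values from the report):

    def contact_time_minutes(ct_value, chlorine_ppm):
        # CT = concentration (ppm) x time (minutes), so time = CT / concentration
        return ct_value / chlorine_ppm

    for ppm in (1.0, 10.0, 20.0):
        minutes = contact_time_minutes(9600, ppm)
        print(ppm, "ppm free chlorine:", round(minutes), "min (about", round(minutes / 60), "h)")

At the residuals typical of routine pool operation (a few parts per million), the required contact time runs to several days, which is why the linked pools were hyperchlorinated rather than simply held at normal chlorine levels.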

Maintaining the high levels of chlorine necessary to kill Cryptosporidium in swimming pools is not feasible; therefore, such recreational water use should be recognized as a potential increased risk for cryptosporidiosis in immuno-compromised persons, including those with human immunodeficiency virus infection, in whom this infection may cause lifelong, debilitating illness.

REF: Morbidity and Mortality Weekly Report (MMWR), 43(31), August 12, 1994.


Firearm-Related Years of Potential Life Lost Before Age 65 Years — United States, 1980–1991

In 1991, deaths from suicide and homicide combined were the third leading cause of years of potential life lost before age 65 (YPLL-65) in the United States.

In 1991, there were 38,317 firearm-related deaths that accounted for 1,072,565 YPLL-65 and represented 9.0% of the total YPLL-65 for all causes of death. Firearms were the fourth leading cause of YPLL-65, following nonfirearm-related unintentional injuries (2,002,616), malignant neoplasms (1,772,010), and diseases of the heart (1,312,765). From 1980 through 1991, YPLL-65 attributed to nonfirearm-related unintentional injury and heart disease declined 25.2% and 18.1%, respectively, and YPLL-65 attributed to cancer remained virtually unchanged (1.1% increase). In comparison, during the same period, firearm-related YPLL-65 increased 13.6%. Except for infection with human immunodeficiency virus, no other leading cause of death increased substantially in YPLL-65 during this study period.
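YPLL-65 is computed by summing, over deaths occurring before age 65, the years remaining to age 65. A minimal sketch of that arithmetic in Python (the ages are hypothetical, for illustration only; national surveillance calculations typically work from age groups rather than individual ages, but the principle is the same):

    def ypll_65(ages_at_death):
        # Sum of years remaining to age 65 over deaths occurring before age 65
        return sum(65 - age for age in ages_at_death if age < 65)

    print(ypll_65([23, 45, 70, 64]))  # (65-23) + (65-45) + 0 + (65-64) = 63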

REF: MMWR, 43(33), August 26, 1994.


Down Syndrome Prevalence at Birth — United States, 1983–1990

Down syndrome (DS) (trisomy 21) is one of the most serious and frequently reported birth defects among live-born infants and an important cause of mental retardation. The prevalence of DS at birth increases with increasing maternal age. Because national population-based estimates of DS have been limited, CDC [Centers for Disease Control] analyzed data from 17 states with population-based birth defects surveillance programs to determine the birth prevalence of DS and describe trends in DS in the United States during 1983–1990. This report summarizes the findings of the analysis.

During 1983–1990, these 17 states reported a total of 7.8 million live-born infants, representing 25% of all U.S. live-born infants. Overall, the birth prevalence rate of DS during 1983–1990 for these states was 9.2 cases per 10,000 live-born infants (Table 1).

TABLE 1. Maternal-age-adjusted prevalence* of Down syndrome (DS) at birth, by region/state and race/ethnicity of mother — 17 state-based birth defects surveillance programs, United States, 1983–1990

Region/State   Surveillance   No. live-born      White           Black          Hispanic        Total
               period         infants         No.     Rate    No.    Rate    No.     Rate    No.      Rate
California     1983–1988      1,028,266       548     9.4     46     7.0     321     13.4    1,111    10.8

* Per 10,000 live-born infants.
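The rate column is cases per 10,000 live-born infants. A quick check of that definition against the California total row, in Python (note that the published figures are maternal-age-adjusted, so a crude calculation will not in general reproduce them exactly):

    cases, births = 1111, 1028266            # California "Total" row in Table 1
    print(round(cases / births * 10000, 1))  # 10.8 cases per 10,000 live-born infants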

REF: MMWR, 43(33), August 26, 1994.


Blood Lead Levels — United States, 1988–1991

For the U.S. population, the geometric mean (GM) blood lead level (BLL) during 1988–1991 was 2.8 µg/dL (95% confidence interval [CI]=2.7–3.0), a 78% decline in the estimated GM BLL since 1976–1980. The decrease in GM BLL was similar across age groups. The highest GM BLLs were among persons aged 1–2 years (4.1 µg/dL), and the lowest were among persons aged 12–19 years (1.6 µg/dL). The prevalence of BLLs >10 µg/dL among children aged 1–5 years decreased substantially, from 88.2% during NHANES II (Second National Health and Nutrition Examination Survey) to 8.9% during NHANES III, Phase 1. An estimated 35% of non-Hispanic black children who were poor (i.e., household income less than 1.3 times the poverty level) and lived in the central city of a standard metropolitan statistical area had BLLs >10 µg/dL, compared with 5% of nonpoor, non-Hispanic white children living outside of central cities.
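The survey summarizes blood lead with a geometric mean, i.e., the exponential of the mean of the log-transformed values, the usual summary for right-skewed exposure data. A minimal sketch in Python (the blood lead values are made up for illustration, not NHANES data):

    import math

    def geometric_mean(values):
        # exp of the arithmetic mean of the natural logs
        return math.exp(sum(math.log(v) for v in values) / len(values))

    print(round(geometric_mean([1.5, 2.0, 3.0, 6.0]), 2))  # 2.71 for these made-up values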

The findings in this report indicate that the reduction in lead exposure documented during the late 1970s continued during the 1980s. Reduction in at least two exposure sources probably contributed most to this decline. First, the amount of lead used in gasoline declined by 99.8% from 1976 to 1990. Second, the percentage of food and soft-drink cans manufactured in the United States that contained lead solder declined from 47% in 1980 to 0.9% in 1990.

REF: MMWR, 43(30), August 5, 1994.


Surveillance for Emergency Events Involving Hazardous Substances — United States, 1990–1992

During 1990–1992, 3,125 events were reported from participating states to ATSDR’s (Agency for Toxic Substances and Disease Registry) Hazardous Substances Emergency Events Surveillance (HSEES) system. Of these events, 2,391 (77%) were fixed-facility events (i.e., occurred at stationary facilities), and 723 (23%) were transportation related. In 88% of events, a single chemical was released. The most frequently released hazardous substances were volatile organic compounds (18% of the total 4,034 substances released), herbicides (15%), acids (14%), and ammonias (11%). In 467 events (15% of all events), 1,446 persons were injured; 11 persons died as a result of these injuries. Respiratory irritation (37%) and eye irritation (23%) were the most frequently reported health effects. A total of 457 (15%) events resulted in evacuations; of these, 400 (88%) were ordered by an official (e.g., a police officer or firefighter). The median number of persons evacuated was 25 (range: from 12 to >9,999 persons). Evacuations lasted an average of 9.4 hours (median: 3 hours; range: 1-240 hours).

Methods

Five state health departments (Colorado, Iowa, Michigan, New Hampshire, and Wisconsin) began data collection on January 1, 1990. Hazardous substance emergency events were defined as uncontrolled or illegal releases or threatened releases of chemicals or their hazardous by-products. The reportable chemicals included the 200 substances identified by ATSDR as the most hazardous substances found at Superfund sites, all other insecticides and herbicides in addition to those found at Superfund sites, chlorine, hydrochloric acid, sodium hydroxide, nitric acid, phosphoric acid, acrylic acid, and hydrofluoric acid. Events were reported if the amount of substance released needed to be removed, cleaned up, or neutralized according to federal, state, or local law. In addition, events were reported if they resulted in a potential for a release of a designated hazardous substance and if this potential led to an action (e.g., an evacuation) to protect the health of employees, first responders, or the general public.
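As a rough restatement of those reporting criteria, here is a minimal decision sketch in Python (the function and parameter names are hypothetical; the logic only paraphrases the surveillance definition above):

    def is_reportable(listed_substance, cleanup_required_by_law,
                      threatened_release_only, protective_action_taken):
        # Reportable if a listed substance was actually released in an amount that had
        # to be removed, cleaned up, or neutralized under federal, state, or local law,
        # or if a threatened release of a listed substance led to a protective action
        # (e.g., an evacuation) for employees, first responders, or the public.
        if not listed_substance:
            return False
        if threatened_release_only:
            return protective_action_taken
        return cleanup_required_by_law

    print(is_reportable(True, False, True, True))  # threatened release with evacuation -> True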

Results

The hazardous substances released during the events were grouped into 11 categories; as noted above, volatile organic compounds, herbicides, acids, and ammonias were released most frequently. The substances released during the two types of events were similar; however, a greater number of transportation-related incidents involved the release of herbicides.

For the four categories of substances that were released most frequently, 13%–27% of the releases resulted in injury. The substances released most often, however, were not necessarily those most likely to result in injury. For example, although insecticides were released in only 5% of all events, 80 (37%) of the 217 events with releases of insecticides resulted in injuries.

Of the 1,353 injured persons for whom information concerning use of protective equipment was available, 984 (73%) were not using any type of personal protective equipment at the scene of the event.

REF: MMWR, 43(SS-2), July 22, 1994.


Mild Reactions Triggered in IGTC-Backed MSG Challenge Study

A double-blind, placebo-controlled challenge (DBPCC) study administered to 16 individuals who claimed to be sensitive to monosodium glutamate (MSG) elicited mild self-limiting reactions in three subjects who had been given MSG only, according to Daryl Altman, M.D., medical consultant to Allerx Inc. and the Food Allergy Center.

Allerx, which sponsored the study with partial funding from the International Glutamate Technical Committee (IGTC), reportedly made an extensive attempt to recruit MSG-sensitive patients. "We've demonstrated that there may be a small number of people who get mild symptoms of rapid onset and rapid disappearance with high levels of MSG on an empty stomach," Altman told Food Chemical News. Altman noted that the study was designed to maximize reactions to MSG and did not attempt to simulate real-world exposure to the flavor enhancer. The maximum dose administered in the study was 6 grams, or about one teaspoonful of MSG, she said, which is about 16 times the level typically encountered in a meal in which MSG is used.

Altman said she found "upsetting" the fact that even when participants were shown they had consumed large quantities of MSG without having a reaction, "that didn't stop them from believing that MSG was a problem."

"There's a high prevalence of belief, and a low prevalence of reality," Altman said. This premise was borne out in a survey funded by Allerx which was reported on by the group's vice president of operations, Betty Rauch. She noted in a June 1993 survey that 14% of American households believed one or more persons in their household had a food allergy. Based on the average number of people per allergic household, Rauch estimated that 16 million people, or about 7% of the population, believe they have a food allergy. This percentage represents two or three times the number of people allergen experts estimate are truly food allergic, Rauch said.

REF: Food Chemical News, 36(21), July 18, 1994.


CFSAN Survey Reveals High Levels of Risky Eating Behavior

A large percentage of Americans eat foods in a way that makes them vulnerable to foodborne disease, according to the results of a Center for Food Safety and Applied Nutrition (CFSAN) telephone survey.

The telephone survey of 1,620 adults found that 53% consumed raw eggs, 23% consumed undercooked hamburgers, 17% ate raw clams or oysters, and 8% dined on raw sushi or ceviche. In addition, a fourth of the respondents said they didn't wash their cutting boards with soap and water between preparing different kinds of raw foods for a meal.

The survey found that males aged 18–39 years and those with an education beyond high school were the most likely to report eating raw animal-protein foods. In addition, males and individuals 18–39 years old were more likely to use a dirty cutting board, it added. "Race was not related to most of the high-risk behaviors, although white respondents were more likely to report eating raw eggs and serving undercooked hamburgers," the survey said.

The pattern of those with higher educations engaging in risky behaviors goes against the overall pattern of highly educated groups pursuing health-promoting behaviors such as abstaining from smoking, CFSAN noted.

"Some well-educated persons may not be fully informed about the hazards associated with eating raw or undercooked foods of animal origin. Alternatively, they may be aware of the risks, but choose nevertheless to continue such behaviors, perhaps believing that the risks are outweighed by the culinary experience or that they have sufficient knowledge to control the degree of risk. Finally, it is possible that these practices may be cultural; that is, family and/or friends may have always eaten these foods without noticeable adverse effects," it said.

High Prevalence of Eating Raw Eggs May Pose Risks to Children

"The high prevalence of eating raw eggs is significant from a public health perspective in view of the recent increased incidence of Salmonella sertotype enteritidis infections in the United States," the report said, adding, "Since most parents of young children are in the 18-39 year age group, the higher reported rate of raw egg consumption in this group suggests that children, too, may frequently eat foods containing raw eggs (such as homemade cookie batter, frosting, and ice cream)."

The survey noted that patterns of raw molluscan shellfish consumption probably vary according to region, with other surveys, for example, showing that one third of the population of Florida eats raw oysters. The lower national rate of 17% revealed in the survey "most likely reflects average consumption patterns for various regions of the country, with higher rates probably occurring in regions near shellfish harvest areas."

REF: Food Chemical News, 36(39), November 28, 1994.


Processing Removes Vast Majority of Pesticide Residues

Processing generally removes pesticide residues so thoroughly that 99% of processed products do not contain measurable residues, reported Edgar Elkins, chief scientist at the National Food Processors Association (NFPA), in a presentation at a recent Institute of Food Technologists meeting. Using benomyl residues on tomatoes as an example, he said that washing removes 82% of the pesticide. By the time it becomes tomato juice, the pesticide level has fallen by 86%, and by the time it becomes catsup, only 2% of the initial level of pesticide remains, he said. Similarly, Elkins reported that processing removes 99% of malathion and carbaryl residues from tomatoes.

In apples selected from a processing plant and run through real-world processing methods, he said, of the 29 pesticides that various apple growers use on raw apples, the only detected residues in the processed products were EBDCs (ethylene bisdithiocarbamates). These, he said, occurred at 0.17 ppm in juice and 0.1 ppm in applesauce. Elkins questioned even that residue data, however, because he said he'd never seen EBDCs in the finished product before.

A pesticide residue database compiled by NFPA, he said, contains more than 90,981 samples from processed foods, of which fewer than 1% contain residues above the "limit of quantitation of the analytical method." For tomato products specifically, he said, the database has 24,124 data points, of which more than 99% are below that limit.

"Processed foods contain virtually no quantifiable pesticide residues using the best analytical methods available today," he concluded. As integrated pest management and other alternative methods of pest control gain popularity, he predicted, residues will fall even further in both processed and unprocessed product. "An already safe food supply will become even safer," Elkins said.

REF: Food Chemical News, 36(20), July 11, 1994.


Backhauled Raw Eggs Likely Cause of S. Enteritidis Outbreak

Unpasteurized raw eggs apparently backhauled in the same vehicles used to deliver ice cream mixes to a Marshall, Minn., Schwan's Sales Enterprises plant have been tagged as the likely reason for the recent outbreak of illness caused by Salmonella enteritidis in a variety of Schwan's ice cream products, Food and Drug Administration Commissioner David Kessler told a Minneapolis press conference Oct. 20. In addition to the eggs not being pasteurized, the pasteurized ice cream mix was not repasteurized at the receiving plant upon delivery by the allegedly contaminated trucks, Kessler said. Pasteurization often is successful in killing S. enteritidis.

"We have consulted with industry officials about this [lack of pasteurization] practice and they agree with us that steps need to be taken to prevent this from happening again," Kessler said, adding: "Pasteurization, thermal processing or an equivalent treatment should be done at the site of final packaging if at all possible." In addition, food processors should not use transportation vehicles that have previously transported raw eggs if the product cannot undergo a final treatment step to prevent bacterial contamination, Kessler noted, saying: "All companies need to review their practices to prevent cross-contamination and to avoid any transportation vehicles that could be the source of any pathogens."

Kessler added, "I need to stress that our scientific investigation is continuing," noting that Minnesota officials, FDA regional and district personnel and Centers for Disease Control and Prevention officials have all played a big role in tracking down the source of the S. enteritidis problem.

Shortly after Kessler, Minnesota officials and industry representatives appeared at the Oct. 20 press conference, Schwan's announced that it had changed its processing procedures to include repasteurization of "every shipment of ingredients that comes to our plant." The company also plans to dedicate a fleet of tankers, which will be sealed between shipments, to carry ingredients, and to test all mix for Salmonella.

REF: Food Chemical News, 36(35), October 24, 1994.


New Guidance on Lead-Based Paint, Dust, and Soil at Non-CERCLA and Non-RCRA Sites

In July, EPA's Office of Prevention, Pesticides and Toxic Substances (OPPTS) released guidance on screening and cleanup of lead-based paint (LBP), lead-contaminated dust, and lead-contaminated soil. This guidance represents EPA's attempt to develop a comprehensive policy to address lead problems in residential areas. The guidance will serve as an interim measure while EPA develops a national regulatory standard for lead hazards as part of the Toxic Substances Control Act (TSCA).

The guidance provides the following recommended cleanup levels:

Lead-contaminated house dust:
   uncarpeted floors          100 µg/ft² (≈1.1 mg/m²)
   interior window sills      500 µg/ft² (≈5.4 mg/m²)
   window wells               800 µg/ft² (≈8.6 mg/m²)

Bare lead-contaminated soil   400 mg/kg
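The dust levels are loadings per square foot; a minimal conversion sketch to metric units in Python (1 ft² = 0.09290304 m²; the function name is just for illustration):

    def ug_per_ft2_to_mg_per_m2(ug_per_ft2):
        # divide by the area of a square foot in m^2, then convert ug to mg
        return ug_per_ft2 / 0.09290304 / 1000.0

    for level in (100, 500, 800):
        print(level, "ug/ft^2 ->", round(ug_per_ft2_to_mg_per_m2(level), 2), "mg/m^2")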

Rather than setting cleanup levels for LBP, the guidance indicates the locations where LBP, if detected, should be addressed: (1) areas where paint is deteriorated; (2) high-friction or impact areas; and (3) areas where children are likely to chew on painted surfaces (e.g., window sills).

The guidance also provides recommendations for prioritizing soil lead abatement. The following priority list is provided:

Areas expected to be used by children:

400 to 5,000 mg/kg: Interim controls to change use patterns and establish barriers to exposure
>5,000 mg/kg: Abatement of soil

Areas where contact by children is less likely or infrequent:

400 to 2,000 mg/kg: "Less rigorous" exposure reduction activities
2,000 to 5,000 mg/kg: Interim controls to change use patterns and establish barriers to exposure
>5,000 mg/kg: Abatement of soil

The guidance notes that the 5,000 mg/kg abatement level would target soil at approximately 0.5% of American homes.
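Read together, the tiers above amount to a simple lookup by soil lead concentration and expected child use. A minimal sketch in Python (the function name, the handling of values falling exactly on a tier boundary, and the wording of the returned strings are paraphrases and assumptions, not regulatory language):

    def soil_lead_action(lead_mg_per_kg, child_use_expected):
        # Paraphrase of the 1994 OPPTS prioritization tiers described above
        if lead_mg_per_kg < 400:
            return "below the 400 mg/kg level addressed by the guidance"
        if lead_mg_per_kg > 5000:
            return "abatement of soil"
        if child_use_expected:
            return "interim controls to change use patterns and establish barriers to exposure"
        if lead_mg_per_kg <= 2000:
            return "less rigorous exposure reduction activities"
        return "interim controls to change use patterns and establish barriers to exposure"

    print(soil_lead_action(1500, child_use_expected=True))  # interim controls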

It should be noted that EPA explicitly states that the guidance is not intended for use at CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) or RCRA (Resource Conservation and Recovery Act) corrective action sites. Nonetheless, the guidance could serve as a source of information during negotiation of cleanup levels at such sites.

REF: Thanks to Jenifer Heath, editor of Lead Bulletin, November 1994.


Jimson Weed Poisoning — Texas, New York, and California, 1994

Ingestion of Jimson weed (Datura stramonium), which contains the anticholinergics atropine and scopolamine, can cause serious illness or death. Sporadic incidents of intentional misuse have been reported throughout the United States, and clusters of poisonings have occurred among adolescents unaware of its potential adverse effects. This report describes incidents of Jimson weed poisoning that occurred in Texas, New York, and California during June-November 1994.

Texas

On June 19, 1994, the El Paso City-County Health and Environmental District was notified of two male adolescents (aged 16 and 17 years) who had died from D. stramonium intoxication. On June 18, the decedents and two other male adolescents had consumed tea brewed from a mixture of Jimson weed roots and alcoholic beverages, then fell asleep on the ground in the desert. Family and police found the decedents the following afternoon. The other two adolescents reported drinking only small amounts of the tea: one experienced hallucinations; the other had no signs or symptoms. Neither was treated, nor were biologic specimens collected. Postmortem toxicologic screening of a blood sample from one decedent detected atropine (55 ng/mL) and a blood alcohol concentration (BAC) of 0.03 g/dL (in Texas, intoxication is defined as a BAC >0.1 g/dL). Analysis of the tea identified atropine, ethanol, and scopolamine.

New York

On the morning of October 9, 1994, an 18-year-old man from Long Island was brought to an emergency department (ED) by his mother after she found him in his bedroom unclothed and hallucinating. Reports from friends indicated he had ingested 50 Jimson weed seeds and had used controlled substances (i.e., cocaine, "ecstasy," and marijuana) at a party the previous night. On evaluation, the patient was hallucinating and had fully dilated pupils, dry mouth, and decreased bowel sounds. He became progressively agitated and was sedated with intravenous diazepam and alprazolam. Hallucinations continued for 36 hours. On October 11, he was discharged and referred for psychiatric counseling. He had a history of chronic substance abuse.

During October 8-November 15, a regional poison-control center was contacted about this case and for information about 13 other identified cases of Jimson weed intoxication. The mean age of the 14 patients was 16.8 years (range: 14-21 years), and eight were male. In the five incidents for which quantity of Jimson weed exposure was reported, ingestion ranged from 30 to 50 seeds per person. Manifestations included visual hallucinations (12 persons), mydriasis (dilated pupils) (10), tachycardia (rapid heartbeat) (six), dry mouth (five), agitation (four), nausea and vomiting (four), incoherence (three), disorientation (three), auditory hallucinations (two), combativeness (two), decreased bowel sounds (two), slurred speech (two), urinary retention (one), and hypertension (one). Four patients were treated and released from EDs, six were hospitalized, three were admitted to an intensive-care unit (ICU), and one refused medical care. Five of these patients were treated with activated charcoal, one was administered gastric lavage, and none received physostigmine.

California

On October 22, 1994, two male and four female adolescents (aged 15-17 years) with a history of drinking Jimson weed tea were transported to an ED. Two persons were discharged from the ED; four were admitted to the ICU because of symptoms that included headache, fatigue, disorientation, fixed or sluggish dilated pupils, tachycardia (heart rates >120 beats per minute), and hallucinations. These four patients were monitored with electrocardiograms, treated with physostigmine and activated charcoal, and discharged on October 23. The Los Angeles County Forestry Division reported that fires in the Los Angeles area may have promoted regrowth of Jimson weed in defoliated areas.

Editorial Note: D. stramonium grows throughout the United States and, historically, was used by American Indians for medicinal and religious purposes. All parts of the Jimson weed plant are poisonous, containing the alkaloids atropine, hyoscyamine, and scopolamine. Jimson weed — also known as thorn apple, angel’s trumpet, and Jamestown weed (because the first record of physical symptoms following ingestion occurred in Jamestown, Virginia, in 1676) — is a member of the nightshade family. The toxicity of Jimson weed varies by year, between plants, and among different leaves on the same plant. Although all parts of the plant are toxic, the highest concentrations of anticholinergic alkaloids occur in the seeds (equivalent to 0.1 mg of atropine per seed). The estimated lethal doses of atropine and scopolamine in adults are >10 mg and >2–4 mg, respectively.
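Given the figure of 0.1 mg of atropine equivalent per seed, a back-of-the-envelope look at the 30 to 50 seed ingestions reported in the New York cases, in Python (illustrative arithmetic only; as the note above stresses, actual alkaloid content varies widely by plant):

    ATROPINE_EQUIVALENT_MG_PER_SEED = 0.1   # figure quoted in the editorial note above

    for seeds in (30, 50):
        mg = round(seeds * ATROPINE_EQUIVALENT_MG_PER_SEED, 1)
        print(seeds, "seeds ~", mg, "mg atropine equivalent")

Even 50 seeds works out to roughly 5 mg of atropine equivalent, below the >10 mg adult lethal estimate quoted above, which is consistent with the severe but mostly nonfatal courses in the New York cluster; the fatal Texas cases involved tea brewed from roots rather than counted seeds.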

Symptoms of Jimson weed toxicity usually occur within 30-60 minutes after ingestion and may continue for 24-48 hours because the alkaloids delay gastrointestinal motility. Ingestion of Jimson weed manifests as classic atropine poisoning. Initial manifestations include dry mucous membranes, thirst, difficulty swallowing and speaking, blurred vision, and photophobia, and may be followed by hyperthermia, confusion, agitation, combative behavior, hallucinations typically involving insects, urinary retention, seizures, and coma. Treatment consists of supportive care, gastrointestinal decontamination (i.e., emesis and/or activated charcoal), and physostigmine in severe cases.

In 1993, a total of 94,725 poisonings associated with toxic plants was reported in the United States (Table 1). Although most cases of Jimson weed poisoning in the United States occur sporadically, increased incidence or clustering of cases may follow press and broadcast reports that heighten interest in — but do not emphasize the adverse effects of — Jimson weed ingestion. In 1993, the American Association of Poison Control Centers Toxic Exposure Surveillance System received 318 reports of Jimson weed exposure. Although the total number of reported exposures to Jimson weed did not rank among the 20 most frequently reported exposures to poisonous plants (Table 1), telephone calls to poison-control centers about Jimson weed poisoning are more likely than those about other hallucinogens to prompt a need for medical care. Poisoning associated with Jimson weed can be prevented through education of health-care providers and through press and broadcast reports that emphasize the health hazards of Jimson weed ingestion but omit detailed descriptions, drawings, and photographs of the plant.

TABLE 1. Twenty most frequently reported plants associated with human poisonings, by plant and number of reported exposures — United States, 1993

Plant (Botanical name) No. reported exposures
Philodendron (Philodendron sp.) 4726
Pepper (Capsicum annuum) 3912
Dumb cane (Dieffenbachia sp.) 2837
Poinsettia (Euphorbia pulcherrima) 2798
Holly (Ilex sp.) 2651
Pokeweed/Inkberry (Phytolacca americana) 2231
Peace lily (Spathiphyllum sp.) 2086
Jade plant (Crassula sp.) 1658
Pothos/Devil’s ivy (Epipremnum aureum) 1401
Poison ivy (Toxicodendron/Rhus radicans) 1308
Umbrella tree (Brassaia actinophylla) 1141
African violet (Saintpaulia ionantha) 1137
Rhododendron/Azalea (Rhododendron sp.) 1029
Yew (Taxus sp.) 969
Eucalyptus (Eucalyptus globulus) 945
Pyracantha (Pyracantha sp.) 894
Spider plant (Chlorophytum comosum) 787
Christmas cactus (Schlumbergera bridgesii) 781
English ivy (Hedera helix) 765
Climbing nightshade (Solanum dulcamara) 754

REF: MMWR, 44(3), January 27, 1995.


TIDBITS

Wine May Cut Incidence of Heart Disease

Wine may cut the incidence of heart disease but may not guarantee longer life, according to research comparing heart disease and overall death rates in 21 countries, including the U.S. and most of Europe. The British journal Lancet reported that researchers found the French to have the second-lowest heart disease rates but "one of the highest rates of liver cirrhosis," and that they did not live longer.

REF: Food Chemical News, December 26, 1994.

Lower Lead in Salt Not Justified or Achievable, Industry Says

The Salt Institute has strongly objected to a proposal by the Food and Drug Administration to drastically lower the lead limit in food and color additives and ingredients, saying it is not justified on health or safety grounds and is not technically achievable.

Proposing that the permissible lead level in salt be established at 2.0 mg/kg, the institute said this is consistent with the Codex Alimentarius standard for lead in food grade salt.

Actual levels of naturally occurring lead in salt are well below the current Food Chemicals Codex heavy metals limit of 4 ppm, the group said. Citing data it collected in a 1994 survey, the institute said that of 217 samples tested from all four major North American food-grade salt producers, the maximum level, 2.866 mg/kg, occurred in a single sample of rock salt. Mean lead content was 0.118 mg/kg, the group said.

Even a worst-case scenario, using a high salt intake estimate of 9,300 mg/day and the highest lead level found in the Salt Institute survey (2.866 mg/kg), would yield a maximum daily intake of lead from salt of 26.7 micrograms per day, just 5.5% of the World Health Organization lead intake limit, the group said. The average lead level in salt contributes only 0.22%, the institute pointed out. It noted that the Codex Alimentarius Commission, following 12 years of study and evaluation, in 1987 adopted a 2.0 mg/kg lead limit in its Standard for Food Grade Salt.
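A quick check of that worst-case arithmetic, in Python (the only inputs are the figures quoted above; the WHO limit is backed out of the 5.5% claim rather than taken from WHO directly):

    salt_g_per_day = 9.3        # the institute's high-end intake estimate (9,300 mg/day)
    worst_pb_ug_per_g = 2.866   # highest lead level in the survey (2.866 mg/kg = 2.866 ug/g)
    mean_pb_ug_per_g = 0.118    # mean lead content (0.118 mg/kg)

    worst_intake = salt_g_per_day * worst_pb_ug_per_g   # ~26.7 ug/day, as stated
    implied_limit = worst_intake / 0.055                # limit implied by the 5.5% figure
    mean_share = 100 * salt_g_per_day * mean_pb_ug_per_g / implied_limit

    print(round(worst_intake, 1), round(implied_limit), round(mean_share, 2))
    # 26.7 485 0.23 -- the last value is close to the 0.22% quoted for average lead levels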

REF: Food Chemical News, 36(25), August 15, 1994.

FDA Urged to Reevaluate Mercury Action Level in Swordfish

The Center for Marine Conservation has urged the Food and Drug Administration to reevaluate its 1.0 ppm action level for methyl mercury in swordfish, along with its practice of exempting from federal inspection swordfish caught within state waters and not transported in interstate commerce.

REF: Food Chemical News, 36(25), August 15, 1994.

Pesticide Illnesses More Common Outside Agriculture

Although most pesticide use is in agriculture, more than two-thirds of occupational pesticide illness cases in California during 1991 were from nonagricultural uses, according to a report released in May by the Cal/EPA Department of Pesticide Regulation. Of the 1,804 reported illnesses with a confirmed or potential link to pesticide use that year, 1,675 occurred on the job. Illnesses occurring outside the workplace, however, are probably more seriously underreported. Nonagricultural pesticide illnesses typically were caused by exposure to disinfectants in restaurants, janitorial companies, municipal water treatment plants, swimming pools, and hospitals. The two deaths in 1991 related to pesticide exposure were both cases in which the victims entered locked buildings where signs had been posted warning that the structure was being fumigated with methyl bromide.

REF: Kansas Pesticide Newsletter, 18(2), February 16, 1995.

Lessons Learned

What ATSDR (Agency for Toxic Substances and Disease Registry) scientists have learned in 4 years of data collection is that certain aspects of emergency events involving hazardous substances appear to be consistent:

• Most incidents (93%) involve release of only one chemical,
• Most incidents (84%) occur at facilities and not during transportation,
• Industry employees (58% of victims) are more likely than emergency responders or the public to be injured,
• Most of the employees injured (73%) used no personal protective equipment, and
• Respiratory (31%) and eye (16%) irritation are the most common injuries.

REF: Hazardous Substances & Public Health, 5(1), Winter 1995. (See the article "Surveillance for Emergency Events Involving Hazardous Substances" earlier in this issue for more details.)


VETNOTES

Ergot Poisoning from Fescue Screenings

A Kentucky cattleman has reportedly suffered a death loss of 172 beef cows and calves from a herd of 240. Information indicates the cattle had been consuming a mixture of approximately 3 parts corn silage to 1 part fescue screenings (50:50 on a dry matter basis) for a period of 50 days. Signs included cattle struggling to get up, swollen limbs, swollen muzzles, recumbency with paddling, lameness, loss of tail switches, and death.

Preliminary diagnosis indicates the likelihood of ergot poisoning from feeding the fescue screenings. A sample of the screenings examined by the Seed Testing Laboratory contained 5.5% ergot. Apparently, it is common to find ergot in fescue seed used for seeding purposes, and screenings derived from cleaning fescue seed can be expected to contain higher levels of ergot.

The Bottom Line -- Feeding of fescue screenings entails considerable risk. Information indicates that feeding of fescue screenings has occurred previously. We are aware of no previous reports of problems. However, this is not evidence of safety, and extreme caution is advised. Seedsmen and livestock producers must cautiously consider and weigh the risks associated with feeding fescue screenings.

REF: Herd Health Memo, July 1994.


FSIS Residue Monitoring Finds 0.26% Violative Rate in 1993

The residue monitoring program of USDA's Food Safety and Inspection Service (FSIS) found violative levels of illegal residues in 0.26% of 39,128 monitoring samples taken in 1993, according to the 1993 Domestic Residue Data Book published recently. FSIS sampled and tested for eight classes of animal drug and pesticide compounds, and the results were "comparable to 0.29% in the 1992 samples and 0.26% in the 1991 samples."

"The majority of the violations detected in monitoring were from illegal levels of approved animal drugs, particularly sulfonamide and antibiotic compounds used to prevent or treat bacterial infections," the report said, adding, "Most antibiotic and sulfonamide residue violations are confined to a relatively small percentage of livestock that make up the meat supply. These same data show few residues in poultry. The recurring reason for drug residue violations in livestock (and poultry, in past years) is failure to allow adequate time for the drugs to clear the animal's system." Detected illegal residues are usually concentrated in kidney, liver or fat rather than muscle meat, FSIS said, noting that the monitoring program "focuses on kidney and liver tissues, since most FDA limits are set in terms of these tissues."

The report also said the FSIS has begun implementing the FAST test (Fast Antimicrobial Screen Test), a "suitable replacement for CAST (Calf Antibiotic and Sulfonamide Test) and STOP (Swab Test On Premises)" that "quickly detects both antibiotic and sulfonamide drug residues in kidneys and livers," in pilot plants.

REF: Food Chemical News, 36(28), September 12, 1994.


Animal Tissue Violative Residues Continue to Decline

Although the overall trends have remained the same, the number of drug residues found in food animals at slaughter has continued to decrease, with 3,809 animals violative in the last fiscal year (down from 4,325 in '92) and 4,283 total violative residues (down from 4,960 detected in '92).

Residue violations continued to occur predominantly in bob veal (39.8%), culled dairy cows (29%), market hogs (6%) and sows (2%), based on figures from the U.S. Agriculture Department's National Residue Program, said the Center for Veterinary Medicine's (CVM) Tissue Residue Annual Report. "Since 1990, the number of total residues has decreased, and so have those caused by the aminoglycosides gentamicin and neomycin," CVM said, adding: "The absolute numbers and percentages have decreased by one-half."

CVM said that the primary causes of residue violations cited by FDA or state investigations were: failure to adhere to approved withdrawal times; failure to keep proper animal identification and treatment records; extra-label use or exceeding recommended dosages; and feeding of colostrum containing drug residues to bob veal calves.

A total of 56 regulatory actions were initiated for tissue residue violations. Half of these violations occurred in the Chicago and San Francisco districts, the survey said. "As anticipated, the major drugs involved parallel those identified as long-acting and sustained release products, such as penicillin and oxytetracycline," the survey's executive summary read.

REF: Food Chemical News, 36(29), September 19, 1994.


BST-Treated Cows No More Likely to Get Mastitis: Study

According to a study involving several thousand cows worldwide, treatment with bovine somatotropin does not promote mastitis. Results of the new study were reported in the August 1994 issue of the Journal of Dairy Science and summarized in a news release from Cornell University, Ithaca, NY. Cornell researcher Dale Bauman was one of 27 scientists involved in the study.

BST critics have suggested that the growth hormone would make cows "more susceptible to mastitis," the Cornell release said, which critics say "would lead to increased treatment with antibiotics which might contaminate the milk."

According to the journal study summary, researchers analyzed long-term observations of 914 cows in the U.S. and Europe and short-term observations of 2,697 cows on commercial and research farms in eight other countries. All cows were treated for periods of up to a "full lactation" with Sometribove, Monsanto's commercial version of BST, the Cornell summary noted.

Overall, treated cows showed "no increase in mastitis infections when compared with untreated cows in the same herds," according to the Cornell release. The report appearing in the journal noted existence of a "positive" relationship between increased milk yield and mastitis incidence. Researchers added, however, that the mastitis increases resulting from increased milk yields were similar whether the yields increased due to BST use or "genetic selection."

Noting the relationship between increases in mastitis incidence and milk yield increases, researchers determined that the incidence of "mastitis per unit of milk yield" was another important variable. According to the journal report, in treated and non-treated cows, "mastitis incidence per unit of milk yielded declined slightly as milk yield increased." The number of mastitis cases per unit of milk produced did not increase in treated cows, the journal article said.

In Cornell's study summary, Bauman said that increased milk yield is only a "minor" cause of mastitis, with environmental factors, including sanitation and milking management practices, being "far more important." Calling this study "just the latest of many" showing that BST does not impact the incidence of mastitis, Bauman noted that the quantity of research on BST is "unprecedented for a new technology." "It is time to use this extensive knowledge rather than the widespread misinformation as basis for the public discussion of BST," he added.

REF: Food Chemical News, 36(28), September 12, 1994.


CVM Deems Products with Detectable Levels of DON Safe

A survey conducted by FDA's Center for Veterinary Medicine to determine deoxynivalenol (DON) levels in wheat and wheat by-products intended for animal feed found that a "majority of products" contained detectable levels, but that all represented products that can be "safely used" in feed.

Following a season of "unusually heavy rainfall" in the Midwest in 1993, which resulted in some wheat crops becoming "heavily contaminated" with DON, the center updated its advisory levels to 1 ppm in finished wheat products; 10 ppm on grain and grain by-products; 5 ppm on grain and grain by-products for swine, with DON-contaminated products not to exceed 20% of the diet; and 5 ppm on grain and grain by-products for all other animals, with the contaminated grain not to exceed 40% of the diet. DON is a trichothecene mycotoxin known to cause feed refusal and emesis in swine, CVM noted, and outbreaks of DON-associated acute gastrointestinal illness in humans have also been reported.
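A minimal sketch of those advisory levels as a lookup, in Python (the category names, the default diet fraction, and the treatment of values exactly at a limit are assumptions for illustration, not CVM language):

    ADVISORY = {
        "finished_wheat_products": (1.0, None),    # (ppm limit, max fraction of diet)
        "grain_and_byproducts":    (10.0, None),
        "grain_for_swine":         (5.0, 0.20),
        "grain_for_other_animals": (5.0, 0.40),
    }

    def within_advisory(category, don_ppm, diet_fraction=0.0):
        ppm_limit, max_fraction = ADVISORY[category]
        if don_ppm > ppm_limit:
            return False
        return max_fraction is None or diet_fraction <= max_fraction

    print(within_advisory("grain_for_swine", 4.0, diet_fraction=0.15))  # True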

REF: Food Chemical News, 36(50), February 6, 1995.


Art Craigmill
Extension Toxicologist
UC Davis