Table itemizing and offering highlights of studies addressing risk assessment of leafy greens

RISK ASSESSMENT OF LEAFY GREENS
(last updated 1/10/2014)
Reference
Balbus, J., R. Parkin, and M. Embrey.
2000. Susceptibility in microbial risk
assessment: Definitions and
research needs. Environ. Hlth.
Perspectives 108:901-905.
Barker, S.F., J. O'Toole, M.I. Sinclair,
K. Leder, M. Malawaraarachchi, and
A.J. Hamilton. 2013. A probabilistic
model of norovirus disease burden
associated with greywater irrigation
of home-produced lettuce in
Melbourne, Australia. Wat. Res.
47:1421-1432.
Carrasco, E., F. Pérez-Rodríguez, A.
Valero, R.M. García-Gimeno, and G.
Zurera. 2010. Risk assessment and
management of Listeria
monocytogenes in ready-to-eat
lettuce salads. Compr. Rev. Food Sci.
Food Safety 9:498.
Coleman, E., K. Delea, K.
Everstine, D. Reimann, D. Ripley, and
the Environmental Health Specialists
Network Working Group. 2013.
Handling practices of fresh leafy
greens in restaurants: Receiving and
training. J. Food Prot. 76:2126-2131.
da Cruz, A.G., S.A. Cenci, and
M.C.A. Maia. 2006. Quality
assurance requirements in produce
processing. Trends Food Sci.
Technol. 17:406-411.
Danyluk, M.D. and D.W. Schaffner.
2011. Quantitative assessment of the
microbial risk of leafy greens from
farm to consumption: Preliminary
framework, data, and risk estimates.
J. Food Prot. 74:700-708.
Ding, T., J. Iwahori, F. Kasuga, J.
Wang, F. Forghani, M.-S. Park, and
D.-H. Oh. 2013. Risk assessment for
Listeria monocytogenes on lettuce
from farm to table in Korea. Food
Control 30:190-199.
Notes
Participants of a workshop acknowledged that a full consensus on how to define and
incorporate susceptibility into microbial risk assessment was unlikely to emerge. Key
conceptual issues included clarifying the distinction between individual- and
population-scale definitions of susceptibility; identifying which intrinsic and extrinsic
factors are modifiable and which health outcomes should be considered adverse;
determining whether susceptibility exists in the absence of exposure or is conditional
on it; and determining whether agent, exposure, or dose should be included in a
definition of susceptibility.
More than a quarter of greywater users in Australia irrigate their vegetable gardens
with this source despite government advice against this practice. The annual disease
burdens attributed to greywater irrigation ranged from 2 x 10^-8 to 5 x 10^-4, depending
on the source of greywater and whether produce was washed within households. The
norovirus disease burden associated with greywater irrigation of vegetables was
estimated to be negligible relative to that from household contact with an infected
individual.
The food chain was modeled from processing of raw material at the factory through
consumption. Monte Carlo simulations of the model were run to estimate the number
of cases in low-risk and high-risk populations. Of the 4 risk management measures
simulated against the goal of not exceeding 100 cfu/g throughout the shelf life of
lettuce, injection of a gas mixture into packages at manufacture was the most effective
at reducing the number of cases, followed by limiting home storage to 4 d and
preventing high-risk consumers from eating ready-to-eat lettuce.
The survey revealed that appropriate receiving and training procedures help mitigate
other unsafe handling practices for leafy greens.
This review covers the main elements of a quality assurance system for produce-
processing plants: good agricultural practices (GAP) and good manufacturing
practices (GMP), including sanitation standard operating procedures (SSOP) and
hazard analysis and critical control points (HACCP).
This QMRA model represents a preliminary framework that identifies available data
and important data gaps and provides initial risk estimates for pathogenic E. coli in
leafy greens. Critical data gaps remain, which include estimates of initial pathogen
prevalence and level in the field, the number of days that the product remains in the
field after contamination, and the fraction of incoming product that may contain the
pathogen. Research-based estimates of retail storage time ranges and correlations
between time and temperature during storage are needed, and the importance of lag
time in modeling E. coli O157:H7 growth in leafy greens is unknown. The model also
predicts that a majority of simulated cases arise from leafy greens cross-contaminated
during the washing process, a prediction based on extrapolation from a single study
that requires additional validation.
The whole food chain of lettuce from farm to table, including initial contamination on
the farm and growth and cross-contamination during transportation, storage, and
consumption, was simulated with @Risk software. The results showed mean final
contamination levels of -1.50 log CFU/g at restaurants and -0.146 log CFU/g at home
for L. monocytogenes. Based on the model, the incidence of listeriosis ranged from
11.9 to 17.4 cases per million individuals. The quantitative risk assessment model
(QRAM) in this study assumed a maximum dose of 7.5 log CFU/serving.
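
Models like those of Carrasco et al. and Ding et al. share a common @Risk-style
structure: sample uncertain inputs, propagate them through the farm-to-table chain,
and apply a dose-response model. A minimal Python sketch of that structure follows;
every parameter value (initial level, growth, serving size, dose-response parameter) is
an illustrative assumption, not a figure from either study.

```python
import math
import random

# Minimal Monte Carlo sketch of an @Risk-style farm-to-table QMRA.
# All parameter values are illustrative assumptions, not data from
# Carrasco et al. or Ding et al.

random.seed(1)

R_EXP = 1e-9       # assumed exponential dose-response parameter (per cell)
SERVING_G = 50.0   # assumed serving size, g
N_ITER = 100_000

def one_serving_risk() -> float:
    # Initial contamination at the farm, log CFU/g (assumed distribution)
    c0 = random.gauss(-3.0, 1.0)
    # Net growth during transport, storage, and display, log CFU/g (assumed)
    growth = random.gauss(1.0, 0.5)
    dose = 10 ** (c0 + growth) * SERVING_G       # cells ingested per serving
    return 1.0 - math.exp(-R_EXP * dose)         # exponential dose-response

mean_risk = sum(one_serving_risk() for _ in range(N_ITER)) / N_ITER
print(f"Mean risk of infection per serving: {mean_risk:.2e}")
```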
Reference
Franz, E., A.V. Semenov, and A.H.C.
van Bruggen. 2008. Modelling the
contamination of lettuce with
Escherichia coli O157:H7 from
manure-amended soil and the effect
of intervention strategies. J. Appl.
Microbiol. 105:1569-1584.
Franz, E., S.O. Tromp, H. Rijgersberg,
and H.J. van der Fels-Klerx. 2010.
Quantitative microbial risk
assessment for Escherichia coli
O157:H7, Salmonella, and Listeria
monocytogenes in leafy green
vegetables consumed at salad bars.
J. Food Prot. 73:274-285.
Gale, P. 2004. Risks to farm animals
from pathogens in composted
catering waste containing meat.
Vet. Rec. 155:77-82.
Gale, P. 2005. Land application of
treated sewage sludge: quantifying
pathogen risks from consumption of
crops. J. Appl. Microbiol. 98:380-396.
Gale, P. and G. Stanfield. 2001.
Towards a quantitative risk
assessment for BSE in sewage
sludge. J. Appl. Microbiol. 91:563-569.
Notes
The model estimated an average of 0.34 contaminated lettuce heads per hectare.
Sensitivity analysis revealed that the likelihood of contamination was most sensitive to
the prevalence of contaminated manure, the manure storage time, and the initial
density of E. coli O157:H7 in naturally contaminated manure. Increasing the manure
storage time to at least 30 days and incorporating a fertilization-to-planting interval of
at least 60 days were the most successful interventions for reducing the number of
contaminated lettuce heads.
This study integrated modeling of pathogen growth into the supply chain of fresh
leafy vegetables destined for restaurant salad bars. The model did not account for any
lag phase in pathogen growth. Temperature in the cold chain was taken to be 2-5°C,
whereas salad bar temperatures ranged from 0-13°C. As a result, growth of E. coli
O157:H7 and Salmonella was minimal (17 and 15%, respectively), while growth of L.
monocytogenes was considerably greater (194%); because of the low virulence of the
latter pathogen, this was not considered problematic. The estimated numbers of
annual infection cases were considered reasonable in relation to epidemiological data
(42 to 551 cases for O157, 81 to 281 for Salmonella, and 0.1 to 0.9 for L.
monocytogenes).
The factors controlling the level of risk are the separation of the meat at source, the
efficiency of the composting process, and the decay and dilution of the pathogens in
soil. The net pathogen destruction by the composting process is determined largely by
the degree of bypass.
Targeted 7 pathogens (among them Listeria monocytogenes, campylobacters, E. coli
O157, C. parvum, Giardia, and enteroviruses) on root crops. Using laboratory data for
pathogen destruction by mesophilic anaerobic digestion, and not extrapolating
experimental data for pathogen decay in soil to the full 30-month harvest interval
specified by the Safe Sludge Matrix, the model predicts 50 Giardia infections per year
but less than one infection per year for each of the other pathogens. Assuming linear
decay in the soil, a 12-month harvest interval eliminates the risks from all 7
pathogens, the highest remaining prediction being one C. parvum infection in the UK
every 45 years. Lack of knowledge of the exact nature of soil decay processes is a
source of uncertainty.
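
The log-linear decay assumption in Gale (2005) makes the effect of the harvest
interval easy to see: each fixed increment of time removes a fixed number of log
units. A minimal sketch, with a hypothetical initial load and decimal reduction time
rather than values from the paper:

```python
# Log-linear (first-order) decay in soil: N(t) = N0 * 10 ** (-t / D), where D
# is the decimal reduction time. N0 and D are hypothetical values chosen for
# illustration, not figures from Gale (2005).

N0 = 1e4       # assumed pathogens per hectare just after sludge application
D_DAYS = 30.0  # assumed days for a 1-log (90%) reduction

def remaining(t_days: float) -> float:
    return N0 * 10 ** (-t_days / D_DAYS)

for months in (1, 6, 12, 30):
    t = months * 30
    print(f"{months:>2}-month harvest interval: {remaining(t):.2e} remaining")
```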
The main sources of uncertainty in the risk assessment are the degree to which sewage
sludge treatment destroys BSE agent, whether there is a threshold dose for initiation of
BSE infection in cattle, and most importantly, the amount of brain and spinal cord
material which enters the sewer from the abattoir. The model developed in this paper
suggests that recycling of BSE agent through sewage sludge will not sustain endemic
levels of BSE in the UK cattle herd. The risks to humans through consumption of
vegetable crops are acceptably low.
Reference
Hamilton, A.J., F. Stagnitti, R.
Premier, A.-M. Boland, and G. Hale.
2006. Quantitative microbial risk
assessment models for consumption
of raw vegetables irrigated with
reclaimed water. Appl. Environ.
Microbiol. 72:3284-3290.
Harrison, J.A., J.W. Gaskin, M.A.
Harrison, J.L. Cannon, R.R. Boyer,
and G.W. Zehnder. 2013. Survey of
food safety practices on small to
medium-sized farms and in farmers
markets. J. Food Prot. 76:1989-1993.
Hoelzer, K., R. Pouillot, K. Egan, and
S. Dennis. 2012. Produce
consumption in the United States:
An analysis of consumption
frequencies, serving sizes,
processing forms, and high-consuming population subgroups
for microbial risk assessments. J.
Food Prot. 75:328-340.
Holley, R.A. 2011. Food safety
challenges within North American
Free Trade Agreement (NAFTA)
partners. Compr. Rev. Food Sci.
Food Safety 10:131.
Notes
The models presented cover what would generally be considered worst case scenarios:
overhead irrigation and consumption of raw vegetables. Models were run for several
different scenarios of crop type, viral concentration in effluent, and time since last
irrigation. Necessary data on the volume of irrigation water captured by these crops
were estimated in a field study. In determining the likely concentration of enteric
viruses on the product, the authors adopted a previously used conservative approach,
assuming that all pathogens in the irrigation water retained on the plant remained
attached to it. The annual risk of infection ranged from 10^-3 to 10^-1 when
reclaimed-water irrigation ceased 1 day before harvest and from 10^-9 to 10^-3 when
it ceased 2 weeks before harvest. Two previously published decay coefficients were
used to describe the die-off of viruses in the environment. For all combinations of
crop type and effluent quality, application of the more aggressive decay coefficient
led to annual risks of infection that satisfied the commonly propounded benchmark of
<10^-4, i.e., one infection or less per 10,000 people per year, provided that 14 days
had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not
attained for any combination of crop and water quality when the withholding period
was 1 day. The lower decay rate conferred markedly less protection, with broccoli and
cucumber being the only crops satisfying the 10^-4 standard for all water qualities
after a 14-day withholding period.
The mean annual risk of infection was always less for cucumber than for broccoli,
cabbage, or lettuce. Variation in the amount of produce consumed had the most
significant effect on the total uncertainty surrounding the estimate of annual infection
risk.
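
Two standard QMRA steps sit behind results like these: exponential die-off of viruses
between the last irrigation and consumption, and aggregation of independent
per-exposure risks into an annual risk. The sketch below shows only that structure;
the dose-response parameter, dose, decay coefficient, and exposure frequency are
illustrative assumptions, not Hamilton et al.'s values.

```python
import math

# (1) exponential die-off of viruses during the withholding period;
# (2) aggregation of independent per-exposure risks into an annual risk.
# All numbers are illustrative assumptions.

R_DR = 0.02               # assumed exponential dose-response parameter
DOSE_DAY0 = 1.0           # assumed viral dose per exposure on irrigation day
K_DECAY = 0.69            # assumed die-off coefficient per day
EXPOSURES_PER_YEAR = 140  # assumed raw-vegetable eating occasions per year

def annual_risk(withholding_days: float) -> float:
    dose = DOSE_DAY0 * math.exp(-K_DECAY * withholding_days)
    p_event = 1.0 - math.exp(-R_DR * dose)
    return 1.0 - (1.0 - p_event) ** EXPOSURES_PER_YEAR  # independent events

for days in (1, 7, 14):
    print(f"withholding {days:>2} d: annual risk {annual_risk(days):.2e}")
```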
Survey data were collected from 226 farmers and 45 market managers. More than
56% of farmers use manure, with 34% of those using raw or mixtures of raw and
composted manure, and over 26% wait fewer than 90 days between application of raw
manure and harvest. Over 27% of farmers irrigate with water that has not been tested
for safety, while 16% use such water sources for washing produce. Over 43% of
farmers do not sanitize surfaces that touch produce at the farm, and 67% do not clean
transport containers between uses. According to responses by market managers, over
42% of farmers markets lack food safety standards, and only 2 to 11% of these
managers ask farmers specific questions about conditions on the farm that could
affect product safety. Fewer than 25% of managers sanitize market surfaces, and even
fewer (11%) clean market containers between uses. Fewer than 25% of managers
offer sanitation training to workers or vendors.
Data showed that produce consumption differs among fruits and vegetables, fresh and
heat-treated foods, and demographic groups. Such results are valuable for risk
assessments and allow targeting of risk communication or interventions to those
individuals at greatest risk.
Foodborne illness surveillance and reporting are most comprehensive in the U.S., but
surveillance is uniformly more reactive than proactive in all 3 countries. Food safety
policy is based on outbreak data, which may be short-sighted because outbreaks
represent roughly 10% of foodborne illness cases.
Reference
EPA and USDA Interagency
Microbiological Risk Assessment
Guideline Workgroup. 2012.
Microbial risk assessment guideline.
Pathogenic microorganisms with
focus on food and water. Available
at:
http://www.fsis.usda.gov/PDF/Microbial_Risk_Assessment_Guideline_2012001.pdf.
Accessed 8/13/2012.
Jacxsens, L., P.A. Luning, J.G.A.J.
van der Vorst, F. Devlieghere, R.
Leemans, and M. Uyttendaele. 2010.
Simulation modeling and risk
assessment as tools to identify the
impact of climate change on
microbiological food safety – The
case study of fresh produce supply
chain. Food Res. Int. 43:1925-1935.
Kirezieva, K., J. Nanyunja, L.
Jacxsens, J.G.A.J. van der Vorst, M.
Uyttendaele, and P.A. Luning. 2013.
Context factors affecting design and
operation of food safety
management systems in the fresh
produce chain. Trends Food Sci.
Technol. 32:108-127.
Leifert, C., K. Ball, N. Volakakis, and
J.M. Cooper. 2008. Control of enteric
pathogens in ready-to-eat vegetable
crops in organic and ‘low input’
production systems: a HACCP-based approach. J. Appl. Microbiol.
105:931-950.
McKellar, R.C., D.I. LeBlanc, F.P.
Rodríguez, and P. Delaquis. 2013.
Comparative simulation of
Escherichia coli O157:H7 behaviour
in packaged fresh-cut lettuce
distributed in a typical Canadian
supply chain in the summer and
winter. Food Control 35:192-199.
Mena, K.D. and S.D. Pillai. 2008. An
approach for developing
quantitative risk-based microbial
standards for fresh produce. J.
Water Hlth. 6:359-364.
Notes
The goal of the 231-page document is to produce more consistent and transparent
microbial risk assessments across participating federal agencies. It addresses the entire
risk assessment process from an introduction to terminology and roles of the
participants to planning the risk assessment, identifying and characterizing the hazard,
assessing how the size of an outbreak may be affected by the dose (exposure
assessment) or how the severity of the disease may be affected by the pathogen and its
response within the human host (dose-response assessment). It also provides
information about microbial risk management and risk communication.
The proposed knowledge-based modeling system is an appropriate way to identify the
impacts of anticipated climate change and globalization on the microbiological food
safety of fresh produce. In addition, further research will be needed on technological
pre- and post-harvest solutions (e.g., growing and irrigation techniques, water
treatment techniques) to control the food safety of fresh produce.
This study defined the major context factors that create risk to decision-making in
food safety management systems in the fresh produce chain, and developed a tool for
their systematic analysis.
This review describes 6 Risk Reduction Points (RRPs) where risks from enteric
pathogens can be reduced in ready-to-eat vegetables. Changes can be made to animal
husbandry practices (RRP1) to reduce inoculum levels in manure. Outdoor livestock
management (RRP2) can be optimized to eliminate the risk of faecal material entering
irrigation water. Manure storage and processing (RRP3), soil management practices
(RRP4) and timing of manure application (RRP5), can be adjusted to reduce the
survival of pathogens originating from manure. During irrigation (RRP6), pathogen
risks can be reduced by choosing a clean water source and minimizing the chances of
faecal material splashing on to the crop.
Reported on temperature profiles measured in winter and summer months in a retail
supply chain and their predicted impact on the fate of E. coli O157:H7 in fresh-cut
lettuce, using the stochastic simulation package @RISK. Outputs for the
time-temperature profiles collected from the commercial supply chain demonstrated a
range of possible outcomes, from slight growth to die-off.
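
The core of such supply-chain simulations is a secondary growth model evaluated
over a time-temperature profile. The sketch below uses a square-root
(Ratkowsky-type) model for illustration; the slope, minimum growth temperature,
die-off rate, and profile are hypothetical values, not parameters from McKellar et al.

```python
import math

# Net growth/die-off over a time-temperature profile using a hypothetical
# square-root (Ratkowsky-type) secondary growth model.

B = 0.023    # assumed square-root model slope
T_MIN = 5.0  # assumed minimum growth temperature, degrees C

def rate_log10_per_h(temp_c: float) -> float:
    """Growth rate in log10 CFU/h; a small negative rate models die-off when cold."""
    if temp_c <= T_MIN:
        return -0.005                                # assumed slow die-off
    return (B * (temp_c - T_MIN)) ** 2 / math.log(10)

# Hypothetical supply-chain legs as (hours, degrees C)
profile = [(24, 4.0), (8, 10.0), (48, 6.0), (12, 15.0)]

delta_log = sum(hours * rate_log10_per_h(t) for hours, t in profile)
print(f"Net change over the chain: {delta_log:+.2f} log CFU/g")
```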
Risks of infection are estimated using typical monitoring data for Salmonella detected
on carrots and assuming various scenarios for the likelihood of an individual
consuming a contaminated serving of carrots in a given year. Estimated annual risks
of infection range from 2.20 x 10^-5 to 2.16 x 10^-3, assuming 1% and 100% of an
individual's carrot servings are contaminated, respectively.
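
The scenario logic here (a per-serving infection risk scaled by the assumed fraction of
contaminated servings) is easy to reproduce. In the sketch below, the beta-Poisson
parameters, dose, and servings per year are illustrative assumptions rather than values
from the paper.

```python
# Per-serving infection risk from a dose-response model, scaled by the assumed
# fraction of an individual's annual servings that are contaminated.
# All parameter values are illustrative assumptions, not Mena and Pillai's.

ALPHA, BETA = 0.3126, 2884.0  # assumed beta-Poisson parameters
DOSE = 10.0                   # assumed cells per contaminated serving
SERVINGS_PER_YEAR = 100       # assumed annual carrot servings

def p_serving(dose: float) -> float:
    # Approximate beta-Poisson dose-response model
    return 1.0 - (1.0 + dose / BETA) ** (-ALPHA)

for frac in (0.01, 1.0):
    n = frac * SERVINGS_PER_YEAR   # contaminated servings per year
    p_year = 1.0 - (1.0 - p_serving(DOSE)) ** n
    print(f"{frac:>5.0%} of servings contaminated: annual risk {p_year:.2e}")
```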
Reference
Mota, A., K.D. Mena, M. Soto-Beltran, P.M. Tarwater, and C.
Cháidez. 2009. Risk assessment of
Cryptosporidium and Giardia in
water irrigating fresh produce in
Mexico. J. Food Prot. 72:2184-2188.
Mukherjee, A., D. Speh, and F. Diez-Gonzalez. 2007. Association of farm
management practices with risk of
Escherichia coli contamination in
pre-harvest produce grown in
Minnesota and Wisconsin. Int. J.
Food Microbiol. 120:296-302.
Nabulo, G., S.D. Young, and C.R.
Black. 2010. Assessing risk to human
health from tropical leafy vegetables
grown on contaminated urban soils.
Sci. Total Environ. 408:5338-5351.
OMAF Food Inspection Branch. 2001.
Carrot risk assessment introduction
and summary.
http://www.omafra.gov.on.ca/english/food/inspection/fruitveg/risk_assessment_pdf/carrot/30ra.pdf.
Pérez Rodríguez, F., D. Campos, E.T.
Ryser, A.L. Buchholz, G.D. Posada-Izquierdo, B.P. Marks, G. Zurera, and
E. Todd. 2011. A mathematical risk
model for Escherichia coli O157:H7
cross-contamination of lettuce
during processing. Food Microbiol.
28:694-701.
Petterson, S.R., N.J. Ashbolt, and A.
Sharma. 2001. Microbial risks from
wastewater irrigation of salad
crops: A screening-level risk
assessment. Wat. Environ. Res.
73:667-672.
Puerto-Gomez, A.F., J. Kim, R.G.
Moreira, G.-A. Klutke, and M.E.
Castell-Perez. 2013. Quantitative
assessment of the effectiveness of
intervention steps to reduce the risk
of contamination of ready-to-eat
baby spinach with Salmonella. Food
Control 31:410-418.
Notes
There have been limited studies regarding the volume of irrigation water retained on
specific produce. As other risk modeling studies have done, this study used an
estimated average of 0.0036 ml/g of water retained on cucumbers (i.e., smooth
produce) and an estimated average of 0.108 ml/g retained on lettuce (i.e., rough
produce). As a worst-case approach, it also assumed that all of the (oo)cysts detected
in the irrigation water were transferred to the produce and that all detected (oo)cysts
were infectious to humans. Estimates of the amount of produce consumed per person
in the U.S. were 13.0, 4.3, 3.3, and 6.2 g/day for tomatoes, bell peppers, cucumbers,
and lettuce, respectively. Annual risks range from 9 x 10^-6 for Cryptosporidium at
the lowest concentration associated with bell peppers to almost 2 x 10^-1 for exposure
to Giardia on lettuce at the highest detected concentration.
In Minnesota, the authors surveyed 14 organic, 30 semi-organic, and 19 conventional farms.
Approximately 44 to 55% of conventional farms used animal waste as fertilizer, while
70 to 100% of the semi-organic and organic farms had animal manure as fertilizer.
The use of animal wastes for fertilization of produce plants increased the risk of E. coli
contamination in organic and semi-organic produce significantly. Improper ageing of
untreated animal manure significantly increased this risk in organic produce grown
using such manure as a fertilizer.
The focus of this study is on chemical contaminants.
There is some potential for biological hazards to contaminate carrots from
pre-production to the retail level of trade, although the overall food safety risk was
considered to be quite low. The greatest risk for carrots occurs during the handling,
washing, grading, and packing processes and at the retail level of trade, where carrots
are sold as hand-harvested bunched carrots or loose in bulk displays.
A probabilistic model was constructed to account for E. coli O157:H7
cross-contamination when contaminated lettuce enters the processing line. Three
different scenarios were considered for the initial concentration of the contaminated
batch entering the processing line (0.01, 1, and 100 cfu/g). The initial concentration in
the contaminated batch did not significantly influence the pathogen levels in bags
derived from cross-contamination; however, prevalence levels were affected. At the
lowest contamination level, prevalence was predicted to be less than 1%. In contrast,
prevalence levels of 3 and 13% were predicted at the higher initial contamination
levels of 1 and 100 cfu/g, respectively. The model showed that the pathogen was able
to survive and be present in the final bags in all simulated intervention scenarios,
although irradiation (0.5 kGy) was more effective at reducing prevalence than
chlorination or pathogen testing.
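
A toy version of such a cross-contamination model tracks pathogen transfer into and
out of the wash water as bags pass through the line. The transfer coefficients and
batch structure below are hypothetical, not those of Pérez Rodríguez et al.; the point
is only that a single contaminated batch can seed many downstream bags, with
prevalence rising with the incoming concentration.

```python
# Toy transfer-coefficient model of wash-water cross-contamination during
# fresh-cut processing. All parameters are hypothetical.

BATCH_BAGS = 1000
BAG_G = 250.0
CONTAMINATED_BAGS_IN = 10  # assumed contaminated bags entering the line
F_TO_WATER = 0.05          # assumed fraction of cells shed into wash water
F_TO_LETTUCE = 0.01        # assumed fraction of waterborne cells picked up per bag

def prevalence(initial_cfu_per_g: float) -> float:
    water_cells = initial_cfu_per_g * BAG_G * CONTAMINATED_BAGS_IN * F_TO_WATER
    contaminated = 0
    for _ in range(BATCH_BAGS):
        picked_up = water_cells * F_TO_LETTUCE
        water_cells -= picked_up      # water load depletes as bags pass through
        if picked_up >= 1.0:          # bag gains at least one cell on average
            contaminated += 1
    return contaminated / BATCH_BAGS

for c0 in (0.01, 1.0, 100.0):
    print(f"initial {c0:>6} cfu/g -> predicted prevalence {prevalence(c0):.1%}")
```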
Predicted infection rates were much more sensitive to the decay rate of viruses than
to occasional high virus numbers. It was assumed that any microorganism contained in
the residual wastewater remaining on the irrigated crop would cling to the leaf even
after the wastewater itself evaporated.
Intervention strategies (temperature control during harvest, washing, and irradiation)
were integrated into the risk assessment model. Based on a low level of
cross-contamination (1 log CFU/g), the percentage of samples over the safety limit
(1.33 log CFU/g) was estimated at 16.8%, but this increased to 84% if a high level of
cross-contamination occurred (~3 log CFU/g). Even with this high number of tainted
lots, exposure of the leafy greens to irradiation (1 kGy) would reduce the incidence to
0.1%.
Reference
Shuval, H., Y. Lampert, and B. Fattal.
1997. Development of a risk
assessment approach for evaluating
wastewater reuse standards for
agriculture. Wat. Sci. Technol.
35(11-12):15-20.
Signorini, M.L., M.V. Zbrun, A.
Romero-Scharpen, C. Olivero, F.
Bongiovanni, L.P. Soto, L.S. Frizzo,
and M.R. Rosmini. 2013.
Quantitative risk assessment of
human campylobacteriosis by
consumption of salad cross-contaminated with thermophilic
Campylobacter spp. from broiler
meat in Argentina. Prev. Vet. Med.
109:37-46.
Stine, S.W., I. Song, C.Y. Choi, and
C.P. Gerba. 2005. Application of
microbial risk assessment to the
development of standards for
enteric pathogens in water used to
irrigate fresh produce. J. Food Prot.
68:913-918.
Strawn, L.K., Y.T. Gröhn, S.
Warchocki, R.W. Worobo, E.A. Bihn,
and M. Wiedmann. 2013. Risk factors
associated with Salmonella and
Listeria monocytogenes
contamination of produce fields.
Appl. Environ. Microbiol. 79:7618-7627.
Notes
Using the risk assessment model in this paper, it was determined that when irrigating
ready-to-eat crops with wastewater effluent meeting the WHO guidelines (1,000 fecal
coliforms/100 ml), the annual risk of contracting a virus disease was about 10^-6 to
10^-7, and for rotavirus disease it was about 10^-5 to 10^-6. Based on the U.S. EPA
position that guidelines for drinking water standards should be designed to ensure that
human populations are not subjected to risks of enteric infection greater than 10^-4
for a yearly exposure, this study suggests that the WHO guidelines provide a factor of
safety some 1-2 orders of magnitude greater than that called for by the U.S. EPA for
microbial standards for drinking water. It was also estimated that expenditures of
some $3 to $30 million per case of disease prevented would be necessary to meet the
standards for drinking water.
The predicted risk of infection varied according to the dose-response model used. The
risk of human campylobacteriosis was most sensitive to the probability of infection
from a single Campylobacter cell, followed by the number of Campylobacter spp. per
serving, the frequency of washing the cutting board, the preparation of raw poultry
before salad using the same cutting board, and the frequency of hand washing.
The concentration of hepatitis A virus (HAV) and Salmonella in irrigation water
(furrow or drip) necessary to achieve a 1:10,000 annual risk of infection from
cantaloupes, iceberg lettuce, and bell peppers was calculated. These calculations were
based on the transfer of the selected nonpathogenic surrogates to fresh produce via
irrigation water, as well as previously determined preharvest inactivation rates of
pathogenic microorganisms on the surfaces of fresh produce. The risk of infection was
found to be variable depending on the type of crop, irrigation method, and days
between the last irrigation event and harvest. The worst-case scenario, in which
produce is harvested and consumed the day after the last irrigation event and
maximum exposure is assumed, indicated that concentrations of 2.5 CFU/100 ml of
Salmonella and 2.5 x 10^-5 MPN/100 ml of HAV in irrigation water would result in
an annual risk of 1:10,000 when the crop was consumed. If 14 days elapsed before
harvest, allowing for die-off of the pathogens, the allowable concentrations increased
to 5.7 x 10^3 Salmonella/100 ml and 9.9 x 10^-3 HAV/100 ml.
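
This kind of back-calculation can be sketched by inverting the exposure chain: fix the
target annual risk, convert it to a per-event risk, invert a dose-response model, and
divide by the water volume retained on the crop. The retention volume and lettuce
consumption figures below are the ones cited in the Mota et al. entry of this table; the
dose-response and die-off parameters are assumptions, not Stine et al.'s values.

```python
import math

# Back-calculate the allowable pathogen concentration in irrigation water
# for a 1:10,000 annual infection risk. R_DR and K_DECAY are assumptions.

P_ANNUAL = 1e-4           # target 1:10,000 annual risk of infection
DAYS_EXPOSED = 365        # assumed daily consumption of the crop
LETTUCE_G_PER_DAY = 6.2   # consumption figure cited in the Mota et al. entry
RETENTION_ML_PER_G = 0.108  # retention on rough produce, from the same entry
R_DR = 0.01               # assumed exponential dose-response parameter
K_DECAY = 0.5             # assumed die-off per day on the crop surface

def allowable_concentration(days_since_irrigation: float) -> float:
    p_event = 1.0 - (1.0 - P_ANNUAL) ** (1.0 / DAYS_EXPOSED)
    dose = -math.log(1.0 - p_event) / R_DR              # invert dose-response
    dose_at_irrigation = dose * math.exp(K_DECAY * days_since_irrigation)
    volume_ml = LETTUCE_G_PER_DAY * RETENTION_ML_PER_G  # water retained per day
    return dose_at_irrigation / volume_ml               # organisms per ml

for d in (1, 14):
    print(f"{d:>2} d withholding: {allowable_concentration(d):.2e} organisms/ml")
```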
Salmonella and Listeria monocytogenes were detected in 11% and 30% of water
samples (n=74) and in 6.1% and 17.5% of fields (n=263), respectively.
Pathogen-positive water samples were primarily from nonirrigation surface water
sources. Management practices that increased the odds of a Salmonella-positive field
included manure application within the past year, while the presence of a buffer zone
had a protective effect. The likelihood of detecting a L. monocytogenes-positive field
was increased if irrigation occurred within 3 days of sample collection, wildlife were
observed within 3 days of sample collection, or the soil had been cultivated within 7
days of sample collection.
Reference
Szabo, E.A., L. Simons, M.J.
Coventry, and M.B. Cole. 2003.
Assessment of control measures to
achieve a food safety objective of
less than 100 cfu of Listeria
monocytogenes per gram at the
point of consumption for fresh
precut iceberg lettuce. J. Food Prot.
66:256-264.
Topp, E., A. Scott, D.R. Lapen, E.
Lyautey, and P. Duriez. 2009.
Livestock waste treatment systems
for reducing environmental
exposure to hazardous enteric
pathogens: Some considerations.
Bioresource Technol. 100:5395-5398.
Tromp, S.O., H. Rijgersberg, and E.
Franz. 2010. Quantitative microbial
risk assessment for Escherichia coli
O157:H7, Salmonella enterica, and
Listeria monocytogenes in leafy
green vegetables consumed at salad
bars, based on modeling supply
chain logistics. J. Food Prot.
73:1830-1840.
Walls, I. 2007. Framework for
identification and collection of data
useful for risk assessments of
microbial foodborne or waterborne
hazards: A report from the
International Life Sciences Institute
Research Foundation Advisory
Committee on data collection for
microbial risk assessment. J. Food
Prot. 70:1744-1751.
Watanabe, T., D. Sano, and T. Omura.
2002. Risk evaluation for pathogenic
bacteria and viruses in sewage
sludge compost. Wat. Sci. Technol.
46(11-12):325-330.
Notes
The food safety objective (FSO) offers an approach to translate public health risk into
a definable goal wherein there is a specified maximum frequency or concentration of a
hazardous agent in a food at the time of consumption that is deemed acceptable to
provide an appropriate level of health protection. In the case of L. monocytogenes,
there is a proposed FSO of <100 cfu/g in RTE products. A FAO/WHO panel
calculated that 100% compliance for this goal would lead to approximately 5 to 25
cases of listeriosis per year, or a 99% reduction from the current baseline. As an
example of how this concept works: if the increase in concentration due to growth of
viable L. monocytogenes remaining after washing is assumed to be as high as 2.7 log
CFU/g, and the initial level of contamination on whole lettuce is as high as 0.1 log
MPN/g, then a performance criterion of a 0.8-log reduction is required to meet the
FSO.
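
The arithmetic behind this example follows the standard ICMSF food safety objective
inequality; a worked version with the values above (a minimal sketch of the
relationship, not the paper's full model):

```latex
% ICMSF food safety objective inequality: initial level (H_0) plus the sum of
% increases (\sum I) minus the sum of reductions (\sum R) must not exceed the
% FSO, all in log CFU/g.
H_0 + \sum I - \sum R \le \mathrm{FSO}
% With the values above and FSO = 2.0 log CFU/g (i.e., <100 cfu/g):
0.1 + 2.7 - R \le 2.0 \quad\Longrightarrow\quad R \ge 0.8
```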
The beneficial impacts of livestock waste treatment on risk to humans via exposure to
manured land are illustrated using quantitative microbial risk assessment scenarios.
The authors assumed geometric mean pathogen concentrations of 20 Cryptosporidium
oocysts/g and 320 Campylobacter cells/g fresh weight of cattle manure. Human
exposure was assumed to result from ingestion of soil shortly after application of
manure (25 tonnes/hectare), leaving little opportunity for die-off in the soil. In the
absence of livestock waste treatment, the risk of infection (expressed as a probability
of infection per exposure event) from Cryptosporidium was 1.75 x 10^-4 and from
Campylobacter 1.27 x 10^-2. In contrast, for treated livestock waste with a 3-log
reduction in pathogen content, the risk from Cryptosporidium was 1.75 x 10^-7 and
from Campylobacter 1.27 x 10^-5.
The aim of this study was to quantitatively assess the difference between simulating
supply chain logistics (MOD, complete with storage delays) and assuming fixed
storage times (FIX) in microbial risk estimation for the supply chain of fresh-cut leafy
green vegetables destined for salad bars. The public health effects were assessed by
conducting an exposure assessment and risk characterization. The relative growths of
E. coli O157 (17%) and Salmonella enterica (15%) were identical in the MOD and
FIX models. In contrast, the relative growth of Listeria monocytogenes was
considerably higher in the MOD model than in the FIX model and consequently the
risk of infection was higher in the MOD model than in the FIX model for this
pathogen.
The key data needs identified for a microbial risk assessment were as follows: (i)
burden of foodborne or waterborne disease; (ii) microbial contamination of foods; and
(iii) consumption patterns. In addition, dose-response data may be necessary, if
existing dose-response data cannot be used to estimate dose response for the
population of interest.
In this study, several kinds of compost were investigated for the presence of
pathogenic bacteria (Salmonella spp. and E. coli O157) and enteric viruses. None of
these bacteria or viruses could be detected in 1.0 g wet weight of any of the composts.
Criteria satisfying the acceptable risk (less than 10^-4 per year) for these pathogenic
bacteria and viruses in compost were determined from simulations: 1.0 CFU or PFU/g
wet weight was adequate as the criterion for E. coli O157 and poliovirus 1, whereas
the criterion for Salmonella spp. should be set below 0.001 CFU/g wet weight.
Compiled by Marilyn Erickson, Center for Food Safety, University of Georgia
Downloaded from the website: A Systems Approach for Produce Safety: A Research Project Addressing Leafy Greens found at:
http://www.ugacfs.org/producesafety/index.html.
See http://www.ugacfs.org/producesafety/Pages/TermsofUse.html for disclaimers & terms for use of information in this document.