781 results for sanitary risk and foot-and-mouth disease
Abstract:
Rates of cardiovascular and renal disease in Australian Aboriginal communities are high, but we do not know the contribution of inflammation to these diseases in this setting. In the present study, we sought to examine the distribution of C-reactive protein (CRP) and other markers of inflammation and their relationships with cardiovascular risk markers and renal disease in a remote Australian Aboriginal community. The study included 237 adults (58% of the adult population) in a remote Aboriginal community in the Northern Territory of Australia. Main outcome measures were CRP, fibrinogen and IgG concentrations, blood pressure (BP), presence of diabetes, lipids, albuminuria, seropositivity to three common micro-organisms, as well as carotid intima-media thickness (IMT). Serum concentrations of CRP [7 (5-13) mg/l; median (inter-quartile range)] were markedly increased and were significantly correlated with fibrinogen and IgG concentrations and inversely correlated with serum albumin concentration. Higher CRP concentrations were associated with IgG seropositivity to Helicobacter pylori and Chlamydia pneumoniae and higher IgG titre for cytomegalovirus. Higher CRP concentrations were associated with the following: the 45-54-year age group, female subjects, the presence of skin sores, higher body mass index, waist circumference, BP, glycated haemoglobin and greater albuminuria. CRP concentrations increased with the number of cardiovascular risk factors, carotid IMT and albuminuria independently of other risk factors. These CRP concentrations were markedly higher than described in other community settings and are probably related, in large part, to chronic and repeated infections. Their associations with markers of cardiovascular risk and renal disease are compatible with the high rates of cardiovascular and renal disease in this community, and provide more evidence of strong links between these conditions, through a shared background of infection/inflammation. 
This suggests that a strong focus on prevention and management of infections will be important in reducing these conditions, in addition to interventions directed at more traditional risk factors.
Abstract:
A method is presented to calculate economic optimum fungicide doses accounting for the risk-aversion of growers responding to variability in disease severity between crops. Simple dose-response and disease-yield loss functions are used to estimate net disease-related costs (fungicide cost, plus disease-induced yield loss) as a function of dose and untreated severity. With fairly general assumptions about the shapes of the probability distribution of disease severity and the other functions involved, we show that a choice of fungicide dose which minimises net costs on average across seasons results in occasional large net costs caused by inadequate control in high disease seasons. This may be unacceptable to a grower with limited capital. A risk-averse grower can choose to reduce the size and frequency of such losses by applying a higher dose as insurance. For example, a grower may decide to accept ‘high loss’ years one year in ten or one year in twenty (i.e. specifying a proportion of years in which disease severity and net costs will be above a specified level). Our analysis shows that taking into account disease severity variation and risk-aversion will usually increase the dose applied by an economically rational grower. The analysis is illustrated with data on septoria tritici leaf blotch of wheat caused by Mycosphaerella graminicola. Observations from untreated field plots at sites across England over three years were used to estimate the probability distribution of disease severities at mid-grain filling. In the absence of a fully reliable disease forecasting scheme, reducing the frequency of ‘high loss’ years requires substantially higher doses to be applied to all crops. Disease resistant cultivars reduce both the optimal dose at all levels of risk and the disease-related costs at all doses.
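The dose-choice logic described above can be sketched numerically. The functions and parameters below are illustrative assumptions, not the paper's fitted dose-response curves or severity distributions; the point is only that a grower who caps the cost of the worst one-in-ten seasons ends up at a higher dose than one who minimises average cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions -- NOT the paper's fitted parameters.
def net_cost(dose, severity, fungicide_price=10.0, loss_per_unit=5.0, efficacy=3.0):
    """Net disease-related cost: fungicide cost plus disease-induced
    yield loss, with an assumed exponential dose-response curve."""
    controlled_severity = severity * np.exp(-efficacy * dose)
    return fungicide_price * dose + loss_per_unit * controlled_severity

# Assumed between-season distribution of untreated disease severity.
severity = rng.lognormal(mean=1.0, sigma=0.8, size=10_000)

doses = np.linspace(0.0, 2.0, 201)
mean_cost = np.array([net_cost(d, severity).mean() for d in doses])
p90_cost = np.array([np.percentile(net_cost(d, severity), 90) for d in doses])

dose_risk_neutral = doses[mean_cost.argmin()]  # minimise average cost
dose_risk_averse = doses[p90_cost.argmin()]    # cap the one-in-ten bad year

print(round(dose_risk_neutral, 2), round(dose_risk_averse, 2))
```

Under these assumptions the risk-averse ("insurance") dose comes out higher than the risk-neutral one, matching the paper's qualitative conclusion.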
Abstract:
An analysis was made of the risk of disease for premises in the most heavily affected parts of the county of Cumbria during the foot-and-mouth disease epidemic in the UK in 2001. In over half the cases the occurrence of the disease was not directly attributable to a recently infected premises being located within 1.5 km. Premises more than 1.5 km from recently infected premises faced sufficiently high infection risks that culling within a 1.5 km radius of the infected premises alone could not have prevented the progress of the epidemic. A comparison of the final outcome in two areas of the county, south Penrith and north Cumbria, indicated that focusing on controlling the potential spread of the disease over short distances by culling premises contiguous to infected premises, while the disease continued to spread over longer distances, may have resulted in excessive numbers of premises being culled. Even though the contiguous cull in south Penrith appeared to have resulted in a smaller proportion of premises becoming infected, the overall proportion of premises culled was considerably greater than in north Cumbria, where, because of staff and resource limitations, a smaller proportion of premises contiguous to infected premises was culled.
Abstract:
A recent report to the Australian Government identified concerns relating to Australia's capacity to respond to a medium to large outbreak of FMD. To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five different states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product/equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months. This unconstrained response was estimated to involve 2724 personnel. Unlimited personnel was considered unrealistic, and therefore the course of the outbreak was modelled using three levels of staffing and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication in 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication in 3 months to 68%, and to 100% in 6 months. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas coupled with the baseline personnel resources increased the probability of eradication in 3 months to 74% and to 100% in 6 months. This required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected premises operations was equally effective in reducing the outbreak size and duration.
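The resource-constrained eradication probabilities above come from repeated stochastic simulation. The toy model below is a drastically simplified stand-in for AusSpread (every rate in it is an assumption for illustration only), but it shows how P(eradication within a horizon) is estimated from many runs and why extra control capacity raises it.

```python
import random

random.seed(1)

# Toy stand-in for a disease-spread simulation; AusSpread is far more
# detailed, and all rates below are assumed for illustration only.
def simulate_outbreak(daily_control_capacity, r_spread=0.3, horizon=180):
    """Day on which the last infected premises is resolved, or `horizon`
    if the outbreak is still active at the end of the horizon."""
    infected = 62  # infected premises at detection, as in the scenario
    for day in range(horizon):
        if infected > 5000:          # runaway outbreak: treat as not eradicated
            return horizon
        new_cases = sum(1 for _ in range(infected) if random.random() < r_spread)
        infected = infected - min(infected, daily_control_capacity) + new_cases
        if infected <= 0:
            return day + 1
    return horizon

def p_eradication(capacity, within_days, runs=300):
    """Estimate P(eradication within `within_days` days) over repeated runs."""
    return sum(simulate_outbreak(capacity) <= within_days for _ in range(runs)) / runs

p_constrained = p_eradication(20, 90)  # limited control capacity
p_boosted = p_eradication(40, 90)      # extra personnel deployed

print(p_constrained, p_boosted)
```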
Abstract:
Foot-and-mouth disease virus (FMDV) is an economically significant and globally distributed pathogen of Artiodactyla. Current vaccines are chemically inactivated whole virus particles that require large-scale virus growth in strict bio-containment, with the associated risks of accidental release or incomplete inactivation. Non-infectious empty capsids are structural mimics of authentic particles with no associated risk and constitute an alternative vaccine candidate. Capsids self-assemble from the processed virus structural proteins, VP0, VP3 and VP1, which are released from the structural protein precursor P1-2A by the action of the virus-encoded 3C protease. To date, recombinant empty capsid assembly has been limited by poor expression levels, restricting the development of empty capsids as a viable vaccine. Here, expression of the FMDV structural protein precursor P1-2A in insect cells is shown to be efficient, but linkage of the cognate 3C protease to the C-terminus reduces expression significantly. Inactivation of the 3C enzyme in a P1-2A-3C cassette restores expression, and intermediate levels of 3C activity resulted in efficient processing of the P1-2A precursor into the structural proteins, which assembled into empty capsids. Expression was independent of the insect host cell background and led to capsids that are recognised as authentic by a range of anti-FMDV bovine sera, suggesting their feasibility as an alternative vaccine.
Abstract:
Foot-and-mouth disease (FMD), a disease of cloven hooved animals caused by FMD virus (FMDV), is one of the most economically devastating diseases of livestock worldwide. The global burden of disease is borne largely by livestock-keepers in areas of Africa and Asia where the disease is endemic and where many people rely on livestock for their livelihoods and food-security. Yet, there are many gaps in our knowledge of the drivers of FMDV circulation in these settings. In East Africa, FMD epidemiology is complicated by the circulation of multiple FMDV serotypes (distinct antigenic variants) and by the presence of large populations of susceptible wildlife and domestic livestock. The African buffalo (Syncerus caffer) is the only wildlife species with consistent evidence of high levels of FMDV infection, and East Africa contains the largest population of this species globally. To inform FMD control in this region, key questions relate to heterogeneities in FMD prevalence and impacts in different livestock management systems and to the role of wildlife as a potential source of FMDV for livestock. To develop FMD control strategies and make best use of vaccine control options, serotype-specific patterns of circulation need to be characterised. In this study, the impacts and epidemiology of FMD were investigated across a range of traditional livestock-keeping systems in northern Tanzania, including pastoralist, agro-pastoralist and rural smallholder systems. Data were generated through field studies and laboratory analyses between 2010 and 2015. The study involved analysis of existing household survey data and generated serological data from cross-sectional livestock and buffalo samples and longitudinal cattle samples. Serological analyses included non-structural protein ELISAs, serotype-specific solid-phase competitive ELISAs, with optimisation to detect East African FMDV variants, and virus neutralisation testing. 
Risk factors for FMDV infection and outbreaks were investigated through analysis of cross-sectional serological data in conjunction with a case-control outbreak analysis. A novel Bayesian modeling approach was developed to infer serotype-specific infection history from serological data, and combined with virus isolation data from FMD outbreaks to characterise temporal and spatial patterns of serotype-specific infection. A high seroprevalence of FMD was detected in both northern Tanzanian livestock (69%, [66.5 - 71.4%] in cattle and 48.5%, [45.7-51.3%] in small ruminants) and in buffalo (80.9%, [74.7-86.1%]). Four different serotypes of FMDV (A, O, SAT1 and SAT2) were isolated from livestock. Up to three outbreaks per year were reported by households and active surveillance highlighted up to four serial outbreaks in the same herds within three years. Agro-pastoral and pastoral livestock keepers reported more frequent FMD outbreaks compared to smallholders. Households in all three management systems reported that FMD outbreaks caused significant impacts on milk production and sales, and on animals’ draught power, hence on crop production, with implications for food security and livelihoods. Risk factor analyses showed that older livestock were more likely to be seropositive for FMD (Odds Ratio [OR] 1.4 [1.4-1.5] per extra year) and that cattle (OR 3.3 [2.7-4.0]) were more likely than sheep and goats to be seropositive. Livestock managed by agro-pastoralists (OR 8.1 [2.8-23.6]) or pastoralists (OR 7.1 [2.9-17.6]) were more likely to be seropositive compared to those managed by smallholders. Larger herds (OR: 1.02 [1.01-1.03] per extra bovine) and those that recently acquired new livestock (OR: 5.57 [1.01 – 30.91]) had increased odds of suffering an FMD outbreak. 
Measures of potential contact with buffalo or with other FMD susceptible wildlife did not increase the likelihood of FMD in livestock in either the cross-sectional serological analysis or case-control outbreak analysis. The Bayesian model was validated to correctly infer from ELISA data the most recent serotype to infect cattle. Consistent with the lack of risk factors related to wildlife contact, temporal and spatial patterns of exposure to specific FMDV serotypes were not tightly linked in cattle and buffalo. In cattle, four serial waves of different FMDV serotypes that swept through southern Kenyan and northern Tanzanian livestock populations over a four-year period dominated infection patterns. In contrast, only two serotypes (SAT1 and SAT2) dominated in buffalo populations. Key conclusions are that FMD has a substantial impact in traditional livestock systems in East Africa. Wildlife does not currently appear to act as an important source of FMDV for East African livestock, and control efforts in the region should initially focus on livestock management and vaccination strategies. A novel modeling approach greatly facilitated the interpretation of serological data and may be a potent epidemiological tool in the African setting. There was a clear temporal pattern of FMDV antigenic dominance across northern Tanzania and southern Kenya. Longer-term research to investigate whether serotype-specific FMDV sweeps are truly predictable, and to shed light on FMD post-infection immunity in animals exposed to serial FMD infections is warranted.
Abstract:
The objective of this work was to apply fuzzy majority multicriteria group decision-making to determine risk areas for foot-and-mouth disease (FMD) introduction along the border between Brazil and Paraguay. The study was conducted in three municipalities in the state of Mato Grosso do Sul, Brazil, located along the border with Paraguay. Four scenarios were built, applying the following linguistic quantifiers to describe risk factors: few, half, many, and most. The three criteria considered most likely to affect vulnerability to the introduction of FMD, according to experts' opinions, were: the introduction of animals onto the farm, the distance from the border, and the type of property settlements. The resulting maps show strong spatial heterogeneity in the risk of FMD introduction. The methodology offers a new approach that can help policy makers in the control and eradication of FMD.
Abstract:
Introduction: Smoking status in outpatients with chronic obstructive pulmonary disease (COPD) has been associated with a low body mass index (BMI) and reduced mid-arm muscle circumference (Cochrane & Afolabi, 2004). Individuals with COPD identified as malnourished have also been found to be twice as likely to die within 1 year compared to non-malnourished patients (Collins et al., 2010). Although malnutrition is both preventable and treatable, it is not clear what influence current smoking status, another modifiable risk factor, has on malnutrition risk. The current study aimed to establish the influence of smoking status on malnutrition risk and 1-year mortality in outpatients with COPD. Methods: A prospective nutritional screening survey was carried out between July 2008 and May 2009 at a large teaching hospital (Southampton General Hospital) and a smaller community hospital within Hampshire (Lymington New Forest Hospital). In total, 424 outpatients with a diagnosis of COPD were routinely screened using the ‘Malnutrition Universal Screening Tool’, ‘MUST’ (Elia, 2003); 222 males, 202 females; mean (SD) age 73 (9.9) years; mean (SD) BMI 25.9 (6.4) kg m−2. Smoking status on the date of screening was obtained for 401 of the outpatients. Severity of COPD was assessed using the GOLD criteria, and social deprivation determined using the Index of Multiple Deprivation (Noble et al., 2008). Results: The overall prevalence of malnutrition (medium + high risk) was 22%, with 32% of current smokers at risk (who accounted for 19% of the total COPD population). In comparison, 19% of nonsmokers and ex-smokers were likely to be malnourished [odds ratio, 1.965; 95% confidence interval (CI), 1.133–3.394; P = 0.015]. Smoking status remained an independent risk factor for malnutrition even after adjustment for age, social deprivation and disease severity (odds ratio, 2.048; 95% CI, 1.085–3.866; P = 0.027) using binary logistic regression. 
After adjusting for age, disease severity, social deprivation and smoking status, malnutrition remained a significant predictor of 1-year mortality [odds ratio (medium + high risk versus low risk), 2.161; 95% CI, 1.021–4.573; P = 0.044], whereas smoking status did not (odds ratio for smokers versus ex-smokers + nonsmokers was 1.968; 95% CI, 0.788–4.913; P = 0.147). Discussion: This study highlights the potential importance of combined nutritional support and smoking cessation in order to treat malnutrition. The close association between smoking status and malnutrition risk in COPD suggests that smoking is an important consideration in the nutritional management of malnourished COPD outpatients. Conclusions: Smoking status in COPD outpatients is a significant independent risk factor for malnutrition and a weaker (nonsignificant) predictor of 1-year mortality. Malnutrition significantly predicted 1-year mortality. References: Cochrane, W.J. & Afolabi, O.A. (2004) Investigation into the nutritional status, dietary intake and smoking habits of patients with chronic obstructive pulmonary disease. J. Hum. Nutr. Diet. 17, 3–11. Collins, P.F., Stratton, R.J., Kurukulaaratchy, R., Warwick, H., Cawood, A.L. & Elia, M. (2010) ‘MUST’ predicts 1-year survival in outpatients with chronic obstructive pulmonary disease. Clin. Nutr. 5, 17. Elia, M. (Ed) (2003) The ‘MUST’ Report. BAPEN. http://www.bapen.org.uk (accessed on March 30 2011). Noble, M., McLennan, D., Wilkinson, K., Whitworth, A. & Barnes, H. (2008) The English Indices of Deprivation 2007. http://www.communities.gov.uk (accessed on March 30 2011).
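For reference, the unadjusted odds ratios reported above reduce to a standard 2×2-table calculation with a Wald confidence interval on the log scale. The counts below are hypothetical (the study's raw table is not given in the abstract); they merely illustrate the arithmetic.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (NOT the study's data): current smokers vs
# non/ex-smokers, by malnutrition risk (medium + high vs low).
or_, lo, hi = odds_ratio_ci(24, 52, 64, 261)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```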
Abstract:
Deprivation has previously been shown to be an independent risk factor for the high prevalence of malnutrition observed in COPD (Collins et al., 2010). It has been suggested that the socioeconomic gradient observed in COPD is greater than in any other chronic disease (Prescott & Vestbo, 1999). The current study aimed to examine the influence of disease severity and social deprivation on malnutrition risk in outpatients with COPD. 424 COPD outpatients were screened using the ‘Malnutrition Universal Screening Tool’ (‘MUST’). COPD disease severity was recorded in accordance with the GOLD criteria and deprivation was established according to the patient’s geographical location (postcode) at the time of nutritional screening using the UK Government’s Index of Multiple Deprivation (IMD). IMD ranks postcodes from 1 (most deprived) to 32,482 (least deprived). Disease severity was positively associated with an increased prevalence of malnutrition risk (p < 0.001) both within and between groups, whilst rank IMD was negatively associated with malnutrition (p = 0.020), i.e. those residing in less deprived areas were less likely to be malnourished. Within each category of disease severity the prevalence of malnutrition was two-fold greater in those residing in the most deprived areas compared to those residing in the least deprived areas. This study suggests that deprivation and disease severity are independent risk factors for malnutrition in COPD, both contributing to the widely variable prevalence of malnutrition. Consideration of these issues could assist with the targeted nutritional management of these patients.
Abstract:
This study has provided further understanding of the pathogenesis of EV71, one of the major etiological agents associated with significant mortality in hand, foot and mouth disease. Elucidating the host-pathogen interaction and the mechanism that the virus uses to bypass host defence systems to establish infection will aid in the development of potential antiviral therapeutics against EV71.
Hand, foot and mouth disease in China: Evaluating an automated system for the detection of outbreaks
Abstract:
Objective To evaluate the performance of China’s infectious disease automated alert and response system in the detection of outbreaks of hand, foot and mouth (HFM) disease. Methods We estimated size, duration and delay in reporting HFM disease outbreaks from cases notified between 1 May 2008 and 30 April 2010 and between 1 May 2010 and 30 April 2012, i.e. before and after HFM disease was included in the automated alert and response system. Sensitivity, specificity and timeliness of detection of aberrations in the incidence of HFM disease outbreaks were estimated by comparing automated detections to observations of public health staff. Findings The alert and response system recorded 106 005 aberrations in the incidence of HFM disease between 1 May 2010 and 30 April 2012 – a mean of 5.6 aberrations per 100 days in each county that reported HFM disease. The response system had a sensitivity of 92.7% and a specificity of 95.0%. The mean delay between the reporting of the first case of an outbreak and detection of that outbreak by the response system was 2.1 days. Between the first and second study periods, the mean size of an HFM disease outbreak decreased from 19.4 to 15.8 cases and the mean interval between the onset and initial reporting of such an outbreak to the public health emergency reporting system decreased from 10.0 to 9.1 days. Conclusion The automated alert and response system shows good sensitivity in the detection of HFM disease outbreaks and appears to be relatively rapid. Continued use of this system should allow more effective prevention and limitation of such outbreaks in China.
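The sensitivity and specificity figures reduce to a confusion-matrix calculation against staff verification as the reference standard. The counts below are hypothetical, chosen only to show how values of roughly 92.7% and 95.0% arise.

```python
def detection_metrics(tp, fp, fn, tn):
    """Sensitivity and specificity of outbreak-signal detection,
    computed against a reference standard (here, verification by
    public health staff):
    tp = true alerts, fp = false alerts,
    fn = missed outbreaks, tn = correctly quiet periods."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical verified counts (NOT the study's data), picked so the
# arithmetic lands near the reported 92.7% / 95.0%.
sens, spec = detection_metrics(tp=89, fp=5, fn=7, tn=95)
print(round(sens, 3), round(spec, 3))
```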
Abstract:
BACKGROUND Quantification of the disease burden caused by different risks informs prevention by providing an account of health loss different to that provided by a disease-by-disease analysis. No complete revision of global disease burden caused by risk factors has been done since a comparative risk assessment in 2000, and no previous analysis has assessed changes in burden attributable to risk factors over time. METHODS We estimated deaths and disability-adjusted life years (DALYs; sum of years lived with disability [YLD] and years of life lost [YLL]) attributable to the independent effects of 67 risk factors and clusters of risk factors for 21 regions in 1990 and 2010. We estimated exposure distributions for each year, region, sex, and age group, and relative risks per unit of exposure by systematically reviewing and synthesising published and unpublished data. We used these estimates, together with estimates of cause-specific deaths and DALYs from the Global Burden of Disease Study 2010, to calculate the burden attributable to each risk factor exposure compared with the theoretical-minimum-risk exposure. We incorporated uncertainty in disease burden, relative risks, and exposures into our estimates of attributable burden. FINDINGS In 2010, the three leading risk factors for global disease burden were high blood pressure (7·0% [95% uncertainty interval 6·2-7·7] of global DALYs), tobacco smoking including second-hand smoke (6·3% [5·5-7·0]), and alcohol use (5·5% [5·0-5·9]). In 1990, the leading risks were childhood underweight (7·9% [6·8-9·4]), household air pollution from solid fuels (HAP; 7·0% [5·6-8·3]), and tobacco smoking including second-hand smoke (6·1% [5·4-6·8]). Dietary risk factors and physical inactivity collectively accounted for 10·0% (95% UI 9·2-10·8) of global DALYs in 2010, with the most prominent dietary risks being diets low in fruits and those high in sodium. 
Several risks that primarily affect childhood communicable diseases, including unimproved water and sanitation and childhood micronutrient deficiencies, fell in rank between 1990 and 2010, with unimproved water and sanitation accounting for 0·9% (0·4-1·6) of global DALYs in 2010. However, in most of sub-Saharan Africa childhood underweight, HAP, and non-exclusive and discontinued breastfeeding were the leading risks in 2010, while HAP was the leading risk in south Asia. The leading risk factor in Eastern Europe, most of Latin America, and southern sub-Saharan Africa in 2010 was alcohol use; in most of Asia, North Africa and Middle East, and central Europe it was high blood pressure. Despite declines, tobacco smoking including second-hand smoke remained the leading risk in high-income north America and western Europe. High body-mass index has increased globally and it is the leading risk in Australasia and southern Latin America, and also ranks high in other high-income regions, North Africa and Middle East, and Oceania. INTERPRETATION Worldwide, the contribution of different risk factors to disease burden has changed substantially, with a shift away from risks for communicable diseases in children towards those for non-communicable diseases in adults. These changes are related to the ageing population, decreased mortality among children younger than 5 years, changes in cause-of-death composition, and changes in risk factor exposures. New evidence has led to changes in the magnitude of key risks including unimproved water and sanitation, vitamin A and zinc deficiencies, and ambient particulate matter pollution. The extent to which the epidemiological shift has occurred and what the leading risks currently are varies greatly across regions. In much of sub-Saharan Africa, the leading risks are still those associated with poverty and those that affect children.
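The attributable-burden machinery described above rests on the population attributable fraction: the proportional reduction in burden if exposure were shifted to the theoretical-minimum-risk distribution. A minimal sketch with hypothetical exposure levels and relative risks (the GBD study's actual estimation is far more elaborate, with uncertainty propagation across exposures and relative risks):

```python
def paf(exposure_prev, rr, counterfactual_prev=None):
    """Population attributable fraction for categorical exposure levels,
    comparing the observed exposure distribution against a counterfactual
    (theoretical-minimum-risk) distribution.

    exposure_prev, rr, counterfactual_prev: lists aligned by exposure level.
    """
    if counterfactual_prev is None:
        # Default counterfactual: everyone at the reference (first) level.
        counterfactual_prev = [1.0] + [0.0] * (len(rr) - 1)
    observed = sum(p * r for p, r in zip(exposure_prev, rr))
    counterfactual = sum(p * r for p, r in zip(counterfactual_prev, rr))
    return (observed - counterfactual) / observed

# Hypothetical example: three blood-pressure bands with assumed
# prevalences and relative risks.
frac = paf([0.5, 0.3, 0.2], [1.0, 1.5, 2.5])
print(round(frac, 3))
```

Multiplying this fraction by the cause-specific DALYs gives the attributable burden for that risk factor.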
Abstract:
Qualitative aspects of verbal fluency may be more useful in discerning the precise cause of any quantitative deficits in phonetic or category fluency, especially in the case of mild cognitive impairment (MCI), a possible intermediate stage between normal performance and Alzheimer's disease (AD). The aim of this study was to use both quantitative and qualitative (switches and clusters) methods to compare the phonetic and category verbal fluency performance of elderly adults with no cognitive impairment (n = 51), significant memory impairment (n = 16), and AD (n = 16). As expected, the AD group displayed impairments in all quantitative and qualitative measures of the two fluency tasks relative to their age- and education-matched peers. By contrast, the amnestic MCI group produced fewer animal names on the semantic fluency task than controls and showed normal performance on the phonetic fluency task. The MCI group's inferior category fluency performance was associated with a deficit in their category-switching rate rather than word cluster size. Overall, the results indicate that a semantic measure such as category fluency when used in conjunction with a test of episodic memory may increase the sensitivity for detecting preclinical AD. Future research using external cues and other measures of set shifting capacity may assist in clarifying the origin of the amnestic MCI-specific category-switching deficiency.
Abstract:
Background Understanding the relationship between extreme weather events and childhood hand, foot and mouth disease (HFMD) is important in the context of climate change. This study aimed to quantify the relationship between extreme precipitation and childhood HFMD in Hefei, China, and further, to explore whether the association varied across urban and rural areas. Methods Daily data on HFMD counts among children aged 0–14 years from 1 January 2010 to 31 December 2012 were retrieved from Hefei Center for Disease Control and Prevention. Daily data on mean temperature, relative humidity and precipitation during the same period were supplied by Hefei Bureau of Meteorology. We used a Poisson linear regression model combined with a distributed lag non-linear model to assess the association between extreme precipitation (≥ 90th percentile of precipitation) and childhood HFMD, controlling for mean temperature, humidity, day of week, and long-term trend. Results There was a statistically significant association between extreme precipitation and childhood HFMD. The effect of extreme precipitation on childhood HFMD was the greatest at a lag of six days, with a 5.12% (95% confidence interval: 2.7–7.57%) increase in childhood HFMD for an extreme precipitation event versus no precipitation. Notably, urban children and children aged 0–4 years were particularly vulnerable to the effects of extreme precipitation. Conclusions Our findings indicate that extreme precipitation may increase the incidence of childhood HFMD in Hefei, highlighting the importance of protecting children from forthcoming extreme precipitation, particularly for those who are young and from urban areas.
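A distributed-lag analysis of this kind pairs each day's case count with indicators of extreme precipitation on preceding days. The sketch below uses simulated data and a plain Poisson GLM fitted by Newton-Raphson as a crude stand-in for the penalised distributed-lag non-linear model used in the study; the series, parameters, and lag-6 peak are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily series (assumed, for illustration): two years of
# precipitation and childhood HFMD counts.
n_days = 730
precip = rng.gamma(shape=0.6, scale=8.0, size=n_days)
extreme = (precip >= np.percentile(precip, 90)).astype(float)  # >= 90th pct

def lag_matrix(x, max_lag):
    """Design matrix whose column L holds x lagged by L days
    (zero-padded at the start of the series)."""
    cols = [np.concatenate([np.zeros(L), x[:len(x) - L]]) for L in range(max_lag + 1)]
    return np.column_stack(cols)

X = lag_matrix(extreme, max_lag=13)  # lags 0..13 days

# Simulate counts with a true log-rate effect peaking at lag 6, then
# recover it with a Poisson GLM fitted by Newton-Raphson.
true_beta = 0.6 * np.exp(-0.5 * ((np.arange(14) - 6) / 2.0) ** 2)
y = rng.poisson(np.exp(np.log(5.0) + X @ true_beta))

Xd = np.column_stack([np.ones(n_days), X])  # add intercept column
beta = np.zeros(Xd.shape[1])
beta[0] = np.log(y.mean())                  # start at the empirical log-rate
for _ in range(25):
    mu = np.exp(Xd @ beta)
    grad = Xd.T @ (y - mu)                  # score of the Poisson log-likelihood
    hess = Xd.T @ (Xd * mu[:, None])        # Fisher information
    beta += np.linalg.solve(hess, grad)

lag_effects = beta[1:]
print(round(float(lag_effects[6]), 2))  # estimated log-rate effect at lag 6
```

Exponentiating a lag coefficient gives the proportional increase in cases attributable to an extreme-precipitation day at that lag, which is how effect sizes such as the reported 5.12% at lag six are expressed.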