Abstract:
Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption, immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
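The trade-off among culling strategies can be sketched with a toy discrete-time stochastic Lotka-Volterra simulation. All parameter values, the noise model and the two strategies shown below are invented for illustration; they are not the paper's model or estimates.

```python
import random

def min_prey_size(strategy, steps=150, seed=1):
    """Track the minimum prey population under a predator-control strategy.

    Hypothetical discrete-time Lotka-Volterra dynamics with multiplicative
    environmental noise; parameters are illustrative only.
    """
    rng = random.Random(seed)
    prey, pred = 100.0, 20.0
    r, a, b, d = 0.3, 0.01, 0.005, 0.2  # growth, predation, conversion, death
    min_prey = prey
    for _ in range(steps):
        eps = rng.gauss(0.0, 0.05)           # environmental noise
        prey += prey * (r + eps - a * pred)  # prey grows and is eaten
        pred += pred * (b * prey - d)        # predators convert prey, die
        if strategy == "fixed_rate":
            pred *= 0.8                      # remove 20% of predators
        elif strategy == "upper_trigger" and pred > 30.0:
            pred = 30.0                      # cull only when abundant
        prey, pred = max(prey, 0.0), max(pred, 0.0)
        min_prey = min(min_prey, prey)
    return min_prey
```

Comparing `min_prey_size("upper_trigger")` with `min_prey_size("fixed_rate")` across many seeds mirrors the paper's use of expected minimum prey population size as the performance metric.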
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter. © 2006 Blackwell Publishing Ltd/CNRS.
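The stop-when-expected-costs-outweigh-expected-benefits rule can be sketched as a simple Bayesian loop: each absent survey lowers the probability the species persists, and surveying stops once one more survey costs more than the expected damage of declaring eradication now. This is a simplification of the paper's stochastic dynamic programming solution, and every number below is hypothetical.

```python
def years_of_absent_surveys(p_detect, survey_cost, escape_cost, p_present):
    """Years of consecutive absent surveys before declaring eradication.

    Keep surveying while the expected cost of wrongly declaring eradication
    (P(still present) * escape_cost) exceeds the cost of one more survey.
    All parameter values are hypothetical.
    """
    years = 0
    while p_present * escape_cost > survey_cost:
        # Bayes update after a survey that fails to detect the species
        p_absent_survey = p_present * (1 - p_detect) + (1 - p_present)
        p_present = p_present * (1 - p_detect) / p_absent_survey
        years += 1
        if years > 1000:  # safety cap for the sketch
            break
    return years
```

With imperfect detection (`p_detect < 1`), the optimal stopping time grows as the cost of escape rises relative to the survey cost, which is the trade-off the paper formalises.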
Abstract:
1. Strategic searching for invasive pests presents a formidable challenge for conservation managers. Limited funding can necessitate choosing between surveying many sites cursorily, or focussing intensively on fewer sites. While existing knowledge may help to target more likely sites, e.g. with species distribution models (maps), this knowledge is not flawless and improving it also requires management investment. 2. In a rare example of trading-off action against knowledge gain, we combine search coverage and accuracy, and its future improvement, within a single optimisation framework. More specifically we examine under which circumstances managers should adopt one of two search-and-control strategies (cursory or focussed), and when they should divert funding to improving knowledge, making better predictive maps that benefit future searches. 3. We use a family of Receiver Operating Characteristic curves to reflect the quality of maps that direct search efforts. We demonstrate our framework by linking these to a logistic model of invasive spread such as that for the red imported fire ant Solenopsis invicta in south-east Queensland, Australia. 4. Cursory widespread searching is only optimal if the pest is already widespread or knowledge is poor, otherwise focussed searching exploiting the map is preferable. For longer management timeframes, eradication is more likely if funds are initially devoted to improving knowledge, even if this results in a short-term explosion of the pest population. 5. Synthesis and applications. By combining trade-offs between knowledge acquisition and utilization, managers can better focus - and justify - their spending to achieve optimal results in invasive control efforts. This framework can improve the efficiency of any ecological management that relies on predicting occurrence. © 2010 The Authors. Journal of Applied Ecology © 2010 British Ecological Society.
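The quality of a predictive map can be summarised by the area under its Receiver Operating Characteristic curve; a minimal sketch of that calculation by the trapezoid rule follows. The example points are made up, not drawn from the fire ant maps.

```python
def roc_auc(points):
    """Area under an ROC curve by the trapezoid rule.

    `points` are (false-positive-rate, true-positive-rate) pairs obtained by
    thresholding a species-distribution map at several levels; a higher AUC
    means the map better directs search effort. Illustrative only.
    """
    pts = sorted(points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2  # trapezoid between adjacent points
    return area
```

A map no better than chance traces the diagonal (AUC 0.5), while a perfect map hugs the top-left corner (AUC 1.0), which is how a "family" of curves can encode knowledge quality.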
Abstract:
Background Internet-based surveillance systems provide a novel approach to monitoring infectious diseases. Surveillance systems built on internet data are economically, logistically and epidemiologically appealing and have shown significant promise. The potential for these systems has grown with increasing internet availability and shifts in health-related information seeking behaviour. This approach to monitoring infectious diseases has, however, only been applied to single or small groups of select diseases. This study aims to systematically investigate the potential for developing surveillance and early warning systems using internet search data, for a wide range of infectious diseases. Methods Official notifications for 64 infectious diseases in Australia were downloaded and correlated with frequencies for 164 internet search terms for the period 2009–13 using Spearman’s rank correlations. Time series cross correlations were performed to assess the potential for search terms to be used in construction of early warning systems. Results Notifications for 17 infectious diseases (26.6%) were found to be significantly correlated with a selected search term. The use of internet metrics as a means of surveillance has not previously been described for 12 (70.6%) of these diseases. The majority of diseases identified were vaccine-preventable, vector-borne or sexually transmissible; cross correlations, however, indicated that vector-borne and vaccine preventable diseases are best suited for development of early warning systems. Conclusions The findings of this study suggest that internet-based surveillance systems have broader applicability to monitoring infectious diseases than has previously been recognised. Furthermore, internet-based surveillance systems have a potential role in forecasting emerging infectious disease events, especially for vaccine-preventable and vector-borne diseases.
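Spearman's rank correlation, the statistic used to relate search-term frequencies to notifications, is simply the Pearson correlation of the ranks; a minimal self-contained sketch (illustrative of the statistic, not the study's actual pipeline):

```python
def rank(xs):
    """Average ranks (1-based), assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho as the Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because it works on ranks, the statistic captures any monotonic relationship between weekly search volume and case counts, not just a linear one.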
Hand, foot and mouth disease in China: Evaluating an automated system for the detection of outbreaks
Abstract:
Objective To evaluate the performance of China’s infectious disease automated alert and response system in the detection of outbreaks of hand, foot and mouth (HFM) disease. Methods We estimated the size, duration and delay in reporting of HFM disease outbreaks from cases notified between 1 May 2008 and 30 April 2010 and between 1 May 2010 and 30 April 2012 – that is, before and after HFM disease was included in the automated alert and response system. Sensitivity, specificity and timeliness of detection of aberrations in the incidence of HFM disease outbreaks were estimated by comparing automated detections with observations of public health staff. Findings The alert and response system recorded 106 005 aberrations in the incidence of HFM disease between 1 May 2010 and 30 April 2012 – a mean of 5.6 aberrations per 100 days in each county that reported HFM disease. The response system had a sensitivity of 92.7% and a specificity of 95.0%. The mean delay between the reporting of the first case of an outbreak and detection of that outbreak by the response system was 2.1 days. Between the first and second study periods, the mean size of an HFM disease outbreak decreased from 19.4 to 15.8 cases and the mean interval between the onset and initial reporting of such an outbreak to the public health emergency reporting system decreased from 10.0 to 9.1 days. Conclusion The automated alert and response system shows good sensitivity in the detection of HFM disease outbreaks and appears to be relatively rapid. Continued use of this system should allow more effective prevention and limitation of such outbreaks in China.
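Sensitivity and specificity of an alert system against a reference standard reduce to counting the four outcome cells; a minimal sketch, where the unit-level alignment of alerts and observed outbreaks is a simplifying assumption:

```python
def evaluate_alerts(alerts, outbreaks):
    """Sensitivity and specificity of alerts against a reference standard.

    `alerts` and `outbreaks` are equal-length boolean sequences (e.g. one
    entry per county-period); the pairing scheme here is hypothetical.
    """
    tp = sum(a and o for a, o in zip(alerts, outbreaks))        # true alarms
    tn = sum((not a) and (not o) for a, o in zip(alerts, outbreaks))
    fp = sum(a and not o for a, o in zip(alerts, outbreaks))    # false alarms
    fn = sum((not a) and o for a, o in zip(alerts, outbreaks))  # missed
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity
```

Timeliness, the third metric the study reports, would additionally compare alert dates with the first-case report date for each detected outbreak.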
Abstract:
Background: Multipotent mesenchymal stromal cells suppress T-cell function in vitro, a property that has underpinned their use in treating clinical steroid-refractory graft-versus-host disease after allogeneic hematopoietic stem cell transplantation. However, the potential of mesenchymal stromal cells to resolve graft-versus-host disease is confounded by a paucity of pre-clinical data delineating their immunomodulatory effects in vivo. Design and Methods: We examined the influence of timing and dose of donor-derived mesenchymal stromal cells on the kinetics of graft-versus-host disease in two murine models of graft-versus-host disease (major histocompatibility complex-mismatched: UBI-GFP/BL6 [H-2b]→BALB/c [H-2d] and the sibling transplant mimic, UBI-GFP/BL6 [H-2b]→BALB.B [H-2b]) using clinically relevant conditioning regimens. We also examined the effect of mesenchymal stromal cell infusion on bone marrow and spleen cellular composition and cytokine secretion in transplant recipients. Results: Despite T-cell suppression in vitro, mesenchymal stromal cells delayed but did not prevent graft-versus-host disease in the major histocompatibility complex-mismatched model. In the sibling transplant model, however, 30% of mesenchymal stromal cell-treated mice did not develop graft-versus-host disease. The timing of administration and dose of the mesenchymal stromal cells influenced their effectiveness in attenuating graft-versus-host disease, such that a low dose of mesenchymal stromal cells administered early was more effective than a high dose of mesenchymal stromal cells given late. Compared to control-treated mice, mesenchymal stromal cell-treated mice had significant reductions in serum and splenic interferon-γ, an important mediator of graft-versus-host disease. Conclusions: Mesenchymal stromal cells appear to delay death from graft-versus-host disease by transiently altering the inflammatory milieu and reducing levels of interferon-γ. Our data suggest that both the timing of infusion and the dose of mesenchymal stromal cells likely influence these cells’ effectiveness in attenuating graft-versus-host disease.
Abstract:
Background Preparative myeloablative conditioning regimens for allogeneic hematopoietic stem-cell transplantation (HSCT) may control malignancy and facilitate engraftment but also contribute to transplant related mortality, cytokine release, and acute graft-versus-host disease (GVHD). Reduced intensity conditioning (RIC) regimens have decreased transplant related mortality but the incidence of acute GVHD, while delayed, remains unchanged. There are currently no in vivo allogeneic models of RIC HSCT, limiting studies into the mechanism behind RIC-associated GVHD. Methods We developed two RIC HSCT models that result in delayed onset GVHD (major histocompatibility complex mismatched (UBI-GFP/BL6 [H-2b]→BALB/c [H-2d]) and major histocompatibility complex matched, minor histocompatibility mismatched (UBI-GFP/BL6 [H-2b]→BALB.B [H-2b])) enabling the effect of RIC on chimerism, dendritic cell (DC) chimerism, and GVHD to be investigated. Results In contrast with myeloablative conditioning, we observed that RIC-associated delayed-onset GVHD is characterized by low production of tumor necrosis factor-α, maintenance of host DC, phenotypic DC activation, increased T-regulatory cell numbers, and a delayed emergence of activated donor DC. Furthermore, changes to the peritransplant milieu in the recipient after RIC lead to the altered activation of DC and the induction of T-regulatory responses. Reduced intensity conditioning recipients suffer less early damage to GVHD target organs. However, as donor cells engraft, activated donor DC and rising levels of tumor necrosis factor-α are associated with a later onset of severe GVHD. Conclusions Delineating the mechanisms underlying delayed onset GVHD in RIC HSCT recipients is vital to improve the prediction of disease onset and allow more targeted interventions for acute GVHD.
Abstract:
Host and donor dendritic cells (DC) stimulate alloreactive donor T lymphocytes, and initiate GVHD. We have shown that polyclonal antibody to the DC surface activation marker human CD83 (anti hCD83), which depletes activated DC, can prevent human DC and T cell induced lethal xenogeneic GVHD in SCID mice without impairing T cell mediated anti-leukaemic and anti-viral (CMV and influenza) immunity (J Exp Med 2009; 206: 387). Therefore, we made and tested a polyclonal anti mouse CD83 (RAM83) antibody in murine HSCT models and developed a human mAb against hCD83 as a potential new therapeutic immunosuppressive agent.
Abstract:
Background Symptom burden in chronic kidney disease (CKD) is poorly understood. To date, the majority of research focuses on single symptoms and there is a lack of suitable multidimensional symptom measures. The purpose of this study was to modify, translate, cross-culturally adapt and psychometrically analyse the Dialysis Symptom Index (DSI). Methods The study methods involved four phases: modification, translation, pilot-testing with a bilingual non-CKD sample and then psychometric testing with the target population. Content validity was assessed using an expert panel. Inter-rater agreement, test-retest reliability and Cronbach’s alpha coefficient were calculated to demonstrate reliability of the modified DSI. Discriminative and convergent validity were assessed to demonstrate construct validity. Results Content validity index during translation was 0.98. In the pilot study with 25 bilingual students a moderate to perfect agreement (Kappa statistic = 0.60-1.00) was found between English and Arabic versions of the modified DSI. The main study recruited 433 patients with CKD stages 4 and 5. The modified DSI was able to discriminate between non-dialysis and dialysis groups (p < 0.001) and demonstrated convergent validity with domains of the Kidney Disease Quality of Life short form. Excellent test-retest and internal consistency (Cronbach’s α = 0.91) reliability were also demonstrated. Conclusion The Arabic version of the modified DSI demonstrated good psychometric properties, measures the multidimensional nature of symptoms and can be used to assess symptom burden at different stages of CKD. The modified instrument, renamed the CKD Symptom Burden Index (CKD-SBI), should encourage greater clinical and research attention to symptom burden in CKD.
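Cronbach's alpha, the internal-consistency statistic reported above, can be computed from item variances and the variance of the total score; a minimal sketch with made-up scores (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    `items` is a list of item-score lists, one list per questionnaire item,
    all over the same respondents. Illustrative of the reliability statistic
    the study reports, not its dataset.
    """
    k = len(items)

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(var(item) for item in items)
    n_resp = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return (k / (k - 1)) * (1 - sum_item_vars / var(totals))
```

Values near 1 (such as the 0.91 reported) indicate that items covary strongly, i.e. the scale measures a coherent construct.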
Abstract:
Emotional and role functioning difficulties are associated with chronic alcohol use and liver disease. Little is known about prospective changes in psychological and psychosocial functioning following orthotopic liver transplantation (OLT) amongst patients with alcoholic liver disease (ALD). We aimed to assess the functioning of this patient group post liver transplantation. Comprehensive psychosocial assessment of depression (Beck Depression Inventory [BDI]), anxiety (State-Trait Anxiety Inventory-Form X [STAI]) and psychosocial adjustment (Psychosocial Adjustment to Illness Scale-Self-Report version [PAIS-SR]) was conducted with 42 ALD patients available for pre and post OLT testing. Dependence severity was assessed by the Brief Michigan Alcoholism Screening Test (bMAST). Significant reductions in average anxiety and depression symptoms were observed 12-months post-OLT. Significant improvements in psychosocial adjustment to illness were also reported. Patients with higher levels of alcohol dependence severity pre transplant assessment improved comparably to those with lower levels of dependence. In summary, the study found that OLT contributed to reducing overall levels of mood and anxiety symptoms in ALD patients, approximating general (non-clinical) population norms. Psychosocial adjustment also improved significantly post liver transplantation.
Abstract:
Background Globally, over 800 000 children under five die each year from infectious diseases caused by Streptococcus pneumoniae. To understand genetic relatedness between isolates, study transmission routes, assess the impact of human interventions e.g. vaccines, and determine infection sources, genotyping methods are required. The ‘gold standard’ genotyping method, Multi-Locus Sequence Typing (MLST), is useful for long-term and global studies. Another genotyping method, Multi-Locus Variable Number of Tandem Repeat Analysis (MLVA), has emerged as a more discriminatory, inexpensive and faster technique; however, there is no universally accepted method and it is currently suitable for short-term and localised epidemiology studies. Currently Australia has no national MLST database, nor has it adopted any MLVA method for short-term or localised studies. This study aims to improve S. pneumoniae genotyping methods by modifying the existing MLVA techniques to be more discriminatory, faster, cheaper and technically less demanding than previously published MLVA methods and MLST. Methods Four different MLVA protocols, including a modified method, were applied to 317 isolates of serotyped invasive S. pneumoniae isolated from sterile body sites of Queensland children under 15 years from 2007–2012. MLST was applied to 202 isolates for comparison. Results The modified MLVA4 is significantly more discriminatory than the ‘gold standard’ MLST method. MLVA4 has similar discriminatory power to the other MLVA techniques in this study. The failure to amplify particular loci seen in previous MLVA methods was minimised in MLVA4. Failure to amplify BOX-13 and Spneu19 was found to be serotype specific. Conclusion We have modified a highly discriminatory MLVA technique for genotyping Queensland invasive S. pneumoniae. MLVA4 has the ability to enhance our understanding of pneumococcal epidemiology and the changing genetics of the pneumococcus in localised and short-term studies.
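Discriminatory power of typing methods is commonly quantified with the Hunter-Gaston (Simpson's) index of diversity, the probability that two randomly chosen isolates receive different types; that this exact index was used in the study is an assumption. A minimal sketch:

```python
def hunter_gaston(counts):
    """Hunter-Gaston discriminatory index for a typing method.

    `counts` gives the number of isolates falling into each distinct type.
    Returns the probability that two isolates drawn without replacement
    have different types. Illustrative of how 'more discriminatory' is
    usually quantified when comparing typing schemes.
    """
    n = sum(counts)
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

A method that splits every isolate into its own type scores 1.0; one that lumps them all together scores 0.0, so a higher index for MLVA4 than MLST would mean finer resolution of isolates.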
Abstract:
OBJECTIVES To estimate the disease burden attributable to being underweight as an indicator of undernutrition in children under 5 years of age and in pregnant women for the year 2000. DESIGN World Health Organization comparative risk assessment (CRA) methodology was followed. The 1999 National Food Consumption Survey prevalence of underweight classified in three low weight-for-age categories was compared with standard growth charts to estimate population-attributable fractions for mortality and morbidity outcomes, based on increased risk for each category and applied to revised burden of disease estimates for South Africa in 2000. Maternal underweight, leading to an increased risk of intra-uterine growth retardation and further risk of low birth weight (LBW), was also assessed using the approach adopted by the global assessment. Monte Carlo simulation-modelling techniques were used for the uncertainty analysis. SETTING South Africa. SUBJECTS Children under 5 years of age and pregnant women. OUTCOME MEASURES Mortality and disability-adjusted life years (DALYs) from protein-energy malnutrition and a fraction of those from diarrhoeal disease, pneumonia, malaria, other non-HIV/AIDS infectious and parasitic conditions in children aged 0 - 4 years, and LBW. RESULTS Among children under 5 years, 11.8% were underweight. In the same age group, 11,808 deaths (95% uncertainty interval 11,100 - 12,642) or 12.3% (95% uncertainty interval 11.5 - 13.1%) were attributable to being underweight. Protein-energy malnutrition contributed 44.7% and diarrhoeal disease 29.6% of the total attributable burden. Childhood and maternal underweight accounted for 2.7% (95% uncertainty interval 2.6 - 2.9%) of all DALYs in South Africa in 2000 and 10.8% (95% uncertainty interval 10.2 - 11.5%) of DALYs in children under 5.
CONCLUSIONS The study shows that reduction of the occurrence of underweight would have a substantial impact on child mortality, and also highlights the need to monitor this important indicator of child health.
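The population-attributable fractions central to the CRA methodology combine exposure-category prevalences with relative risks; a minimal sketch of the standard formula with illustrative numbers (not the study's inputs):

```python
def attributable_fraction(prevalences, relative_risks):
    """Population-attributable fraction for categorical exposure levels.

    PAF = (sum p_i * RR_i - 1) / (sum p_i * RR_i), where the reference
    (unexposed) category has RR = 1 and the prevalences sum to 1.
    The numbers used in the test are illustrative only.
    """
    total = sum(p * rr for p, rr in zip(prevalences, relative_risks))
    return (total - 1) / total
```

Multiplying the fraction by total deaths or DALYs for each outcome, as the CRA framework does, yields the attributable burden reported above.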
Abstract:
OBJECTIVES To estimate the burden of disease attributable to diabetes by sex and age group in South Africa in 2000. DESIGN The framework adopted for the most recent World Health Organization comparative risk assessment (CRA) methodology was followed. Small community studies used to derive the prevalence of diabetes by population group were weighted proportionately for a national estimate. Population-attributable fractions were calculated and applied to revised burden of disease estimates. Monte Carlo simulation-modelling techniques were used for uncertainty analysis. SETTING South Africa. SUBJECTS Adults 30 years and older. OUTCOME MEASURES Mortality and disability-adjusted life years (DALYs) for ischaemic heart disease (IHD), stroke, hypertensive disease and renal failure. RESULTS Of South Africans aged ≥ 30 years, 5.5% had diabetes, a prevalence that increased with age. Overall, about 14% of IHD, 10% of stroke, 12% of hypertensive disease and 12% of renal disease burden in adult males and females (30+ years) were attributable to diabetes. Diabetes was estimated to have caused 22,412 (95% uncertainty interval 20,755 - 24,872) deaths, or 4.3% (95% uncertainty interval 4.0 - 4.8%) of all deaths in South Africa in 2000. Since most of these occurred in middle or old age, the loss of healthy life years comprises a smaller proportion of the total 258,028 DALYs (95% uncertainty interval 236,856 - 290,849) in South Africa in 2000, accounting for 1.6% (95% uncertainty interval 1.5 - 1.8%) of the total burden. CONCLUSIONS Diabetes is an important direct and indirect cause of burden in South Africa. Primary prevention of the disease through multi-level interventions and improved management at primary health care level are needed.
Abstract:
OBJECTIVES To estimate the extent of iron deficiency anaemia (IDA) among children aged 0 - 4 years and pregnant women aged 15 - 49 years, and the burden of disease attributed to IDA in South Africa in 2000. DESIGN The comparative risk assessment (CRA) methodology of the World Health Organization (WHO) was followed using local prevalence and burden estimates. IDA prevalence came from re-analysis of the South African Vitamin A Consultative Group study in the case of the children, and from a pooled estimate from several studies in the case of the pregnant women (haemoglobin level < 11 g/dl and ferritin level < 12 microg/l). Monte Carlo simulation-modelling was used for the uncertainty analysis. SETTING South Africa. SUBJECTS Children under 5 years and pregnant women 15 - 49 years. OUTCOME MEASURES Direct sequelae of IDA, maternal and perinatal deaths and disability-adjusted life years (DALYs) from mild mental disability related to IDA. RESULTS It is estimated that 5.1% of children and 9 - 12% of pregnant women had IDA and that about 7.3% of perinatal deaths and 4.9% of maternal deaths were attributed to IDA in 2000. Overall, about 174,976 (95% uncertainty interval 150,344 - 203,961) healthy years of life lost (YLLs), or between 0.9% and 1.3% of all DALYs in South Africa in 2000, were attributable to IDA. CONCLUSIONS This first study in South Africa to quantify the burden from IDA suggests that it is a less serious public health problem in South Africa than in many other developing countries. Nevertheless, this burden is preventable, and the study highlights the need to disseminate the food-based dietary guidelines formulated by the National Department of Health to people who need them and to monitor the impact of the food fortification programme.
Abstract:
Several clinical studies suggest the involvement of premature ageing processes in chronic obstructive pulmonary disease (COPD). Using an epidemiological approach, we studied whether accelerated ageing indicated by telomere length, a marker of biological age, is associated with COPD and asthma, and whether intrinsic age-related processes contribute to the interindividual variability of lung function. Our meta-analysis of 14 studies included 934 COPD cases with 15 846 controls defined according to the Global Lungs Initiative (GLI) criteria (or 1189 COPD cases according to the Global Initiative for Chronic Obstructive Lung Disease (GOLD) criteria), 2834 asthma cases with 28 195 controls, and spirometric parameters (forced expiratory volume in 1 s (FEV1), forced vital capacity (FVC) and FEV1/FVC) of 12 595 individuals. Associations with telomere length were tested by linear regression, adjusting for age, sex and smoking status. We observed negative associations between telomere length and asthma (β = −0.0452, p=0.024) as well as COPD (β = −0.0982, p=0.001), with associations being stronger and more significant when using GLI criteria than those of GOLD. In both diseases, effects were stronger in females than in males. The investigation of spirometric indices showed positive associations between telomere length and FEV1 (p=1.07×10⁻⁷), FVC (p=2.07×10⁻⁵) and FEV1/FVC (p=5.27×10⁻³). The effect was somewhat weaker in apparently healthy subjects than in COPD or asthma patients. Our results provide indirect evidence for the hypothesis that cellular senescence may contribute to the pathogenesis of COPD and asthma, and that lung function may reflect biological ageing primarily due to intrinsic processes, which are likely to be aggravated in lung diseases.
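Testing an association "by linear regression, adjusting for age, sex and smoking status" amounts to ordinary least squares with the covariates as extra columns of the design matrix; a minimal self-contained sketch via the normal equations (toy data, not the meta-analysis):

```python
def ols(X, y):
    """Ordinary least squares coefficients via the normal equations.

    Solves (X'X) b = X'y by Gaussian elimination with partial pivoting.
    Rows of X are observations; including an intercept column of 1s and
    covariate columns (age, sex, smoking) yields 'adjusted' coefficients.
    """
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    A = [row[:] + [b] for row, b in zip(XtX, Xty)]  # augmented matrix
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]             # partial pivoting
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):                  # back-substitution
        beta[r] = (A[r][k] - sum(A[r][c] * beta[c]
                                 for c in range(r + 1, k))) / A[r][r]
    return beta
```

The disease coefficient from such a fit is what a β like −0.0982 summarises: the mean telomere-length difference associated with COPD after the covariates are held fixed.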