906 results for disease control


Relevance:

60.00%

Publisher:

Abstract:

Tobacco yellow dwarf virus (TbYDV, family Geminiviridae, genus Mastrevirus) is an economically important pathogen causing summer death and yellow dwarf disease in bean (Phaseolus vulgaris L.) and tobacco (Nicotiana tabacum L.), respectively. Prior to the commencement of this project, little was known about the epidemiology of TbYDV or its vector and host-plant range. As a result, disease control strategies have been restricted to regular, poorly timed insecticide applications which are largely ineffective, environmentally hazardous and expensive. To address this problem, this PhD project was carried out to better understand the epidemiology of TbYDV, to identify its host-plants and vectors, and to characterise the population dynamics and feeding physiology of the main insect vector and other possible vectors. The host-plants and possible leafhopper vectors of TbYDV were assessed over three consecutive growing seasons at seven field sites on commercial tobacco- and bean-growing properties in the Ovens Valley, northeastern Victoria. Leafhoppers and plants were collected and tested for the presence of TbYDV by PCR. Using sweep nets, twenty-three leafhopper species were identified at the seven sites, with Orosius orientalis the predominant species. Of the 23 leafhopper species screened for TbYDV, only Orosius orientalis and Anzygina zealandica tested positive. Forty-two different plant species were also identified at the seven sites and tested. Of these, TbYDV was detected in only four dicotyledonous species: Amaranthus retroflexus, Phaseolus vulgaris, Nicotiana tabacum and Raphanus raphanistrum. Using a quadrat survey, the temporal distribution and diversity of vegetation at four of the field sites was monitored in order to assess the presence of, and changes in, potential host-plants for the leafhopper vector(s) and the virus. 
These surveys showed that plant composition and the climatic conditions at each site were the major influences on vector numbers, virus presence and the subsequent occurrence of tobacco yellow dwarf and bean summer death diseases. Forty-two plant species were identified across all sites, and the sites with the lowest incidence of disease had the highest proportion of monocotyledonous plants, which are non-hosts for both the vector and the virus. In contrast, the sites with the highest disease incidence had more host-plant species for both vector and virus, and experienced higher temperatures and less rainfall. It is likely that these climatic conditions forced the leafhoppers to move into the irrigated commercial tobacco and bean crops, resulting in disease. To understand leafhopper species diversity and abundance in and around the field borders of commercially grown tobacco crops, leafhoppers were collected from four field sites using three different sampling techniques: pan traps, sticky traps and sweep nets. Over 51,000 leafhopper specimens were collected, comprising 57 species from 11 subfamilies and 19 tribes. Twenty-three leafhopper species were recorded for the first time in Victoria, in addition to several economically important pest species of crops other than tobacco and bean. The highest number and greatest diversity of leafhoppers were collected in yellow pan traps, followed by sticky traps and sweep nets. Orosius orientalis was the most abundant leafhopper collected from all sites, with the greatest numbers also caught using the yellow pan trap. Using the three sampling methods mentioned above, the seasonal distribution and population dynamics of O. orientalis were studied at four field sites over three successive growing seasons. The population dynamics of the leafhopper were characterised by trimodal peaks of activity, occurring in the spring and summer months. Although O. 
orientalis was present in large numbers early in the growing season (September-October), TbYDV was only detected in these leafhoppers between late November and the end of January. The peak in the detection of TbYDV in O. orientalis correlated with the observation of disease symptoms in tobacco and bean, and was also associated with warmer temperatures and lower rainfall. To understand the feeding requirements of Orosius orientalis and to enable screening of potential control agents, a chemically defined artificial diet (designated PT-07) and feeding system was developed. This novel diet formulation allowed O. orientalis to survive for up to 46 days, including complete development from first instar through to adulthood. The effect of three selected plant-derived proteins, cowpea trypsin inhibitor (CpTi), Galanthus nivalis agglutinin (GNA) and wheat germ agglutinin (WGA), on leafhopper survival and development was assessed. Both GNA and WGA significantly reduced leafhopper survival and development when incorporated at a 0.1% (w/v) concentration. In contrast, CpTi at the same concentration did not exhibit significant antimetabolic properties. Based on these results, GNA and WGA are potentially useful antimetabolic agents for expression in genetically modified crops to improve the management of O. orientalis, TbYDV and the other pathogens it vectors. Finally, electrical penetration graph (EPG) recording was used to study the feeding behaviour of O. orientalis to provide insights into TbYDV acquisition and transmission. Waveforms representing different feeding activities were acquired by EPG from adult O. orientalis feeding on two plant species, Phaseolus vulgaris and Nicotiana tabacum, and on a simple sucrose-based artificial diet. Five waveforms (designated O1-O5) were observed when O. orientalis fed on P. vulgaris, while only four (O1-O4) and three (O1-O3) waveforms were observed during feeding on N. tabacum and the artificial diet, respectively. 
The mean duration of each waveform and the waveform type differed markedly depending on the food source. This is the first detailed study on the tritrophic interactions between TbYDV, its leafhopper vector, O. orientalis, and host-plants. The results of this research have provided important fundamental information which can be used to develop more effective control strategies not only for O. orientalis, but also for TbYDV and other pathogens vectored by the leafhopper.

Relevance:

60.00%

Publisher:

Abstract:

This study examined the distribution of major mosquito species and their roles in the transmission of Ross River virus (RRV) infection for coastline and inland areas in Brisbane, Australia (27°28′ S, 153°2′ E). We obtained data on the monthly counts of RRV cases in Brisbane between November 1998 and December 2001 by statistical local area from the Queensland Department of Health, and monthly mosquito abundance data from the Brisbane City Council. Correlation analysis was used to assess the pairwise relationships between mosquito density and the incidence of RRV disease. This study showed that the abundances of Aedes vigilax (Skuse), Culex annulirostris (Skuse), and Aedes vittiger (Skuse) were significantly associated with the monthly incidence of RRV in the coastline area, whereas Aedes vigilax, Culex annulirostris, and Aedes notoscriptus (Skuse) were significantly associated with the monthly incidence of RRV in the inland area. The results of the classification and regression tree (CART) analysis show that both occurrence and incidence of RRV were influenced by interactions between species in both coastal and inland regions. We found that there was an 89% chance of an occurrence of RRV if the abundance of Ae. vigilax was between 64 and 90 in the coastline region. There was an 80% chance of an occurrence of RRV if the density of Cx. annulirostris was between 53 and 74 in the inland area. The results of this study may have applications as a decision support tool in planning disease control of RRV and other mosquito-borne diseases.
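The reported CART splits can be read as simple threshold rules on trap counts. A minimal sketch in Python, hard-coding the published thresholds (the fitted tree itself is not reproduced in the abstract, so the function and parameter names below are illustrative only):

```python
def rrv_occurrence_chance(region, ae_vigilax=None, cx_annulirostris=None):
    """Return the reported chance of RRV occurrence for the two terminal
    nodes described in the abstract, or None for any other case.

    Thresholds are taken directly from the reported CART results; this is
    a toy re-encoding of those rules, not the fitted tree.
    """
    if region == "coastline" and ae_vigilax is not None:
        if 64 <= ae_vigilax <= 90:          # reported coastal split
            return 0.89
    if region == "inland" and cx_annulirostris is not None:
        if 53 <= cx_annulirostris <= 74:    # reported inland split
            return 0.80
    return None

print(rrv_occurrence_chance("coastline", ae_vigilax=70))  # 0.89
```

In practice such rules would come out of a fitted tree (e.g. a recursive-partitioning routine) rather than being hand-coded; the sketch only shows how the abstract's two terminal nodes translate into decision-support thresholds.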

Relevance:

60.00%

Publisher:

Abstract:

Background: The transmission of hemorrhagic fever with renal syndrome (HFRS) is influenced by climatic variables. However, few studies have examined the quantitative relationship between climate variation and HFRS transmission.

Objective: We examined the potential impact of climate variability on HFRS transmission and developed climate-based forecasting models for HFRS in northeastern China.

Methods: We obtained data on monthly counts of reported HFRS cases in Elunchun and Molidawahaner counties for 1997–2007 from the Inner Mongolia Center for Disease Control and Prevention, and climate data from the Chinese Bureau of Meteorology. Cross-correlations assessed crude associations between climate variables (rainfall, land surface temperature (LST), relative humidity (RH), and the multivariate El Niño Southern Oscillation (ENSO) index (MEI)) and monthly HFRS cases over a range of lags. We used time-series Poisson regression models to examine the independent contribution of climatic variables to HFRS transmission.

Results: Cross-correlation analyses showed that rainfall, LST, RH, and MEI were significantly associated with monthly HFRS cases at lags of 3–5 months in both study areas. The results of Poisson regression indicated that, after controlling for autocorrelation, seasonality, and long-term trend, rainfall, LST, RH, and MEI at lags of 3–5 months were associated with HFRS in both study areas. The final model had good accuracy in forecasting the occurrence of HFRS.

Conclusions: Climate variability plays a significant role in HFRS transmission in northeastern China. The model developed in this study has implications for HFRS control and prevention.
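The cross-correlation step — pairing a climate value at month t−lag with the case count at month t — can be sketched with NumPy. The series below are synthetic (the study's actual data are not reproduced here); cases are constructed to echo climate exactly 3 months later, so the lag-3 correlation is 1:

```python
import numpy as np

def lagged_corr(climate, cases, lag):
    """Pearson correlation between climate at month t-lag and cases at month t."""
    return float(np.corrcoef(climate[:-lag], cases[lag:])[0, 1])

# Synthetic check: case counts that simply echo climate 3 months later.
rng = np.random.default_rng(0)
climate = rng.normal(size=60)                      # 5 years of monthly values
cases = np.concatenate([np.zeros(3), climate[:-3]])  # shifted copy of climate

print(round(lagged_corr(climate, cases, 3), 3))  # 1.0
```

Real analyses of this kind would scan a range of lags (here 0–12 months, say), pick out lags with significant crude correlation, and then enter those lagged covariates into a time-series Poisson regression alongside seasonality and trend terms.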

Relevance:

60.00%

Publisher:

Abstract:

Since 2000-2001, dengue virus type 1 has circulated in the Pacific region. However, in 2007, type 4 reemerged and has almost completely displaced the strains of type 1. If only 1 serotype circulates at any time and is replaced approximately every 5 years, DENV-3 may reappear in 2012.

Relevance:

60.00%

Publisher:

Abstract:

Mycobacterium lentiflavum, a slow-growing nontuberculous mycobacterium, is a rare cause of human disease. It has been isolated from environmental samples worldwide. To assess the clinical significance of M. lentiflavum isolates reported to the Queensland Tuberculosis Control Centre, Australia, during 2001-2008, we explored the genotypic similarity and geographic relationship between isolates from humans and potable water in the Brisbane metropolitan area. A total of 47 isolates from 36 patients were reported; 4 patients had clinically significant disease. M. lentiflavum was cultured from 13 of 206 drinking water sites. These sites overlapped geographically with home addresses of the patients who had clinically significant disease. Automated repetitive sequence-based PCR genotyping showed a dominant environmental clone closely related to clinical strains. This finding suggests potable water as a possible source of M. lentiflavum infection in humans.

Relevance:

60.00%

Publisher:

Abstract:

Humankind has been dealing with all kinds of disasters since the dawn of time. The risk and impact of disasters producing mass casualties worldwide is increasing, due partly to global warming as well as to population growth, increased density and the aging population. China, as a country with a large population, vast territory, and complex climatic and geographical conditions, has been plagued by all kinds of disasters. Disaster health management has traditionally been a relatively arcane discipline within public health. However, SARS, Avian Influenza, earthquakes and floods, along with the need to be better prepared for the Olympic Games in China, have brought disasters, their management and their potential for large-scale health consequences on populations to the attention of the public, the government and the international community alike. As a result, significant improvements were made to the disaster management policy framework, as well as changes to systems and structures to incorporate an improved disaster management focus. This involved the upgrade of the Centres for Disease Control and Prevention (CDC) throughout China to monitor and better control health consequences, particularly of infectious disease outbreaks. However, as was seen in the Southern China Snow Storm and Wenchuan Earthquake in 2008, there remains a lack of integrated disaster management and efficient medical rescue, which has been costly in terms of economics and health for China. In the context of a very large and complex country, there is a need to better understand whether these changes have resulted in effective management of the health impacts of such incidents. To date, the health consequences of disasters, particularly in China, have not been a major focus of study. The main aim of this study is to analyse and evaluate disaster health management policy in China and, in particular, its ability to effectively manage the health consequences of disasters. 
Flooding has been selected for this study as it is a common and significant disaster type in China and throughout the world. This information will then be used to guide conceptual understanding of the health consequences of floods. A secondary aim of the study is to compare disaster health management in China and Australia, as these countries differ in their length of experience in having a formalised policy response. The final aim of the study is to determine the extent to which Walt and Gilson's (1994) model of policy explains how disaster management policy in China was developed and implemented from after SARS in 2003 to the present day. This study utilised a case study methodology. A document analysis and literature search of Chinese and English sources was undertaken to analyse and produce a chronology of disaster health management policy in China. Additionally, three detailed case studies of flood health management in China were undertaken, along with three case studies in Australia, in order to examine the policy response and any health consequences stemming from the floods. A total of 30 key international disaster health management experts were surveyed to identify fundamental elements and principles of a successful policy framework for disaster health management. Key policy ingredients were identified from the literature, the case studies and the survey of experts. Walt and Gilson's (1994) policy model, which focuses on the actors, content, context and process of policy, was found to be a useful model for analysing disaster health management policy development and implementation in China. This thesis is divided into four parts. Part 1 is a brief overview of the issues and context to set the scene. Part 2 examines the conceptual and operational context, including the international literature, government documents and the operational environment for disaster health management in China. Part 3 examines primary sources of information to inform the analysis. 
This involves two key studies:
• A comparative analysis of the management of floods in China and Australia
• A survey of international experts in the field of disaster management, to inform the evaluation of the policy framework in existence in China and the criteria upon which the expression of that policy could be evaluated

Part 4 describes the key outcomes of this research, which include:
• A conceptual framework for describing the health consequences of floods
• A conceptual framework for disaster health management
• An evaluation of disaster health management policy and its implementation in China

The research outcomes clearly identified that the most significant improvements are to be derived from improvements in the generic management of disasters, rather than the health aspects alone. Thus, the key findings and recommendations tend to focus on generic issues. The key findings of this research include the following:
• The health consequences of floods may be described in terms of time as 'immediate', 'medium term' and 'long term', and in relation to causation as 'direct' and 'indirect' consequences of the flood. These two aspects form a matrix which in turn guides management responses.
• Disaster health management in China requires a more comprehensive response throughout the cycle of prevention, preparedness, response and recovery, but it also requires a more concentrated effort on policy implementation to ensure the translation of the policy framework into effective incident management.
• The policy framework in China is largely of international standard, with a sound legislative base. In addition, the development of the Centres for Disease Control and Prevention has provided the basis for a systematic approach to health consequence management. However, the key weaknesses in the current system include:
  o The lack of a key central structure to provide the infrastructure with vital support for policy development, implementation and evaluation.
  o The lack of well-prepared local response teams similar to the local-government-based volunteer groups in Australia.
• The system lacks structures to coordinate government action at the local level. The result is a poorly coordinated local response and a lack of clarity regarding the point at which escalation of the response to higher levels of government is advisable. These weaknesses result in higher levels of risk and negative health impacts.

The key recommendations arising from this study are:
1. Disaster health management policy in China should be enhanced by incorporating disaster management considerations into policy development, and by requiring a disaster management risk analysis and disaster management impact statement for development proposals.
2. China should transform existing organisations to establish a central organisation similar to the Federal Emergency Management Agency (FEMA) in the USA or Emergency Management Australia (EMA) in Australia. This organisation would be responsible for leading nationwide preparedness through planning, standards development, education and incident evaluation, and for providing operational support to national and local government bodies in the event of a major incident.
3. China should review national and local plans to reflect consistency in planning and to emphasise the advantages of the integrated planning process.
4. China should enhance community resilience through community education and the development of a local volunteer organisation. It should develop a national strategy which sets direction and standards in regard to education and training, and requires system testing through exercises. Other initiatives may include the development of a local volunteer capability with appropriate training to assist professional response agencies, such as police and fire services, in a major incident. An existing organisation such as the Communist Party may be an appropriate structure to provide this response in a cost-effective manner.
5. China should continue the development of professional emergency services, particularly ambulance services, to ensure an effective infrastructure is in place to support the emergency response in disasters.
6. Funding for disaster health management should be enhanced, not only from government but also from other sources such as donations and insurance. A more transparent mechanism is necessary to ensure that funding is disseminated according to the needs of the people affected.
7. Emphasis should be placed on prevention and preparedness, especially effective disaster warnings.
8. China should develop local disaster health management infrastructure utilising existing resources wherever possible. Strategies for enhancing local infrastructure could include the identification of local resources (including military resources) which could be made available to support disaster responses, and the development of operational procedures to access those resources.

Implementation of these recommendations should better position China to reduce the significant health consequences experienced each year from major incidents such as floods, and to provide an increased level of confidence to the community about the country's capacity to manage such events.

Relevance:

60.00%

Publisher:

Abstract:

At the beginning of the pandemic (H1N1) 2009 outbreak, we estimated the potential surge in demand for hospital-based services in 4 Health Service Districts of Queensland, Australia, using the FluSurge model. Modifications to the model were made on the basis of emergent evidence and results provided to local hospitals to inform resource planning for the forthcoming pandemic. To evaluate the fit of the model, a comparison between the model's predictions and actual hospitalizations was made. In early 2010, a Web-based survey was undertaken to evaluate the model's usefulness. Predictions based on modified assumptions arising from the new pandemic gained better fit than results from the default model. The survey identified that the modeling support was helpful and useful to service planning for local hospitals. Our research illustrates an integrated framework involving post hoc comparison and evaluation for implementing epidemiologic modeling in response to a public health emergency.

Relevance:

60.00%

Publisher:

Abstract:

Background: Malaria is a significant threat to population health in the border areas of Yunnan Province, China. How to accurately measure malaria transmission is an important issue. This study aimed to examine the role of slide positivity rates (SPR) in malaria transmission in Mengla County, Yunnan Province, China. Methods: Data on annual malaria cases, SPR and socio-economic factors for the period 1993 to 2008 were obtained from the Center for Disease Control and Prevention (CDC) and the Bureau of Statistics, Mengla, China. Multiple linear regression models were used to evaluate the relationship between socio-ecologic factors and malaria incidence. Results: The results show that SPR was significantly positively associated with malaria incidence rates. SPR alone (beta = 1.244, p < 0.001) explained about 85% of the variation in malaria transmission, and SPR in combination with other predictors (beta = 1.326, p < 0.001) explained about 95%. Every 1% increase in SPR corresponded to an increase of 1.76/100,000 in malaria incidence rates. Conclusion: SPR is a strong predictor of malaria transmission, and can be used to improve the planning and implementation of malaria elimination programmes in Mengla and other similar locations. SPR might also be a useful indicator for malaria early warning systems in China.
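The headline slope — roughly a 1.76/100,000 rise in incidence per 1% rise in SPR — is what a simple linear fit of incidence on SPR recovers. A sketch with NumPy on synthetic, noise-free data generated with that slope (the study's real series are not reproduced here, and the intercept of 0.5 is an arbitrary assumption):

```python
import numpy as np

# Hypothetical SPR values (%) and incidence rates (per 100,000),
# generated noise-free with the slope reported in the abstract.
spr = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
incidence = 0.5 + 1.76 * spr   # assumed intercept of 0.5

# Ordinary least-squares fit of a degree-1 polynomial.
slope, intercept = np.polyfit(spr, incidence, 1)
print(round(slope, 2))  # 1.76
```

With real surveillance data the fit would be a multiple regression with the other socio-ecologic predictors included, but the interpretation of the SPR coefficient is the same: the expected change in incidence per one-point change in SPR.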

Relevance:

60.00%

Publisher:

Abstract:

Maize streak virus (MSV; Genus Mastrevirus, Family Geminiviridae) occurs throughout Africa, where it causes what is probably the most serious viral crop disease on the continent. It is obligately transmitted by as many as six leafhopper species in the Genus Cicadulina, but mainly by C. mbila Naudé and C. storeyi. In addition to maize, it can infect over 80 other species in the Family Poaceae. Whereas 11 strains of MSV are currently known, only the MSV-A strain is known to cause economically significant streak disease in maize. Severe maize streak disease (MSD) manifests as pronounced, continuous parallel chlorotic streaks on leaves, with severe stunting of the affected plant and, usually, a failure to produce complete cobs or seed. Natural resistance to MSV in maize, and/or maize infections caused by non-maize-adapted MSV strains, can result in narrow, interrupted streaks and no obvious yield losses. MSV epidemiology is primarily governed by environmental influences on its vector species, resulting in erratic epidemics every 3-10 years. Even in epidemic years, disease incidences can vary from a few infected plants per field, with little associated yield loss, to 100% infection rates and complete yield loss. Taxonomy: The only virus species known to cause MSD is MSV, the type member of the Genus Mastrevirus in the Family Geminiviridae. In addition to the MSV-A strain, which causes the most severe form of streak disease in maize, 10 other MSV strains (MSV-B to MSV-K) are known to infect barley, wheat, oats, rye, sugarcane, millet and many wild, mostly annual, grass species. Seven other mastrevirus species, many with host and geographical ranges partially overlapping those of MSV, appear to infect primarily perennial grasses. 
Physical properties: MSV and all related grass mastreviruses have single-component, circular, single-stranded DNA genomes of approximately 2700 bases, encapsidated in 22 × 38-nm geminate particles comprising two incomplete T = 1 icosahedra, with 22 pentameric capsomers composed of a single 32-kDa capsid protein. Particles are generally stable in buffers of pH 4-8. Disease symptoms: In infected maize plants, streak disease initially manifests as minute, pale, circular spots on the lowest exposed portion of the youngest leaves. The only leaves that develop symptoms are those formed after infection, with older leaves remaining healthy. As the disease progresses, newer leaves emerge containing streaks up to several millimetres in length along the leaf veins, with primary veins being less affected than secondary or tertiary veins. The streaks are often fused laterally, appearing as narrow, broken, chlorotic stripes, which may extend over the entire length of severely affected leaves. Lesion colour generally varies from white to yellow, with some virus strains causing red pigmentation on maize leaves and abnormal shoot and flower bunching in grasses. Reduced photosynthesis and increased respiration usually lead to a reduction in leaf length and plant height; thus, maize plants infected at an early stage become severely stunted, producing undersized, misshapen cobs or giving no yield at all. Yield loss in susceptible maize is directly related to the time of infection: Infected seedlings produce no yield or are killed, whereas plants infected at later times are proportionately less affected. Disease control: Disease avoidance can be practised by only planting maize during the early season when viral inoculum loads are lowest. Leafhopper vectors can also be controlled with insecticides such as carbofuran. However, the development and use of streak-resistant cultivars is probably the most effective and economically viable means of preventing streak epidemics. 
Naturally occurring tolerance to MSV (meaning that, although plants become systemically infected, they do not suffer serious yield losses) has been found, which has primarily been attributed to a single gene, msv-1. However, other MSV resistance genes also exist, and improved resistance has been achieved by concentrating these within individual maize genotypes. Whereas true MSV immunity (meaning that plants cannot be symptomatically infected by the virus) has been achieved in lines that include multiple small-effect resistance genes together with msv-1, it has proven difficult to transfer this immunity into commercial maize genotypes. An alternative resistance strategy using genetic engineering is currently being investigated in South Africa. Useful websites: 〈http://www.mcb.uct.ac.za/MSV/mastrevirus.htm〉; 〈http://www.danforthcenter.org/iltab/geminiviridae/geminiaccess/mastrevirus/Mastrevirus.htm〉. © 2009 Blackwell Publishing Ltd.

Relevance:

60.00%

Publisher:

Abstract:

Alcohol-related driving is a longstanding, serious problem in China (Li, Xie, Nie, & Zhang, 2012). On 1 May 2011, a national law was introduced that criminalized drunk driving and imposed serious penalties, including jail, for driving with a blood alcohol level above 80 mg/100 ml. This pilot study, undertaken a year after the introduction of the law, sought traffic police officers' perceptions of drink driving and the practice of breath alcohol testing (BAT) in a large city in Guangdong Province, southern China. A questionnaire survey and semi-structured interviews were used to gain an in-depth understanding of issues relevant to alcohol-related driving. Fifty-five traffic police officers were recruited for the survey, and six traffic police officers with a variety of working experience, including roadside alcohol breath testing, traffic crash investigation and police resourcing, were interviewed individually. The officers were recruited by the first author with the assistance of staff from the Guangdong Institute of Public Health, Centre for Disease Control and Prevention (CDC). Interview participants reported three primary reasons why people drink and drive: 1) being prepared to take the chance of not being apprehended by police; 2) the strong traditional Chinese drinking culture; and 3) insufficient public awareness of the harmfulness of drink driving. Problems associated with the process of breath alcohol testing fit broadly into two categories: resourcing and detection avoidance. It was reported that there were insufficient traffic police officers to conduct routine traffic policing, including alcohol testing. Police BAT equipment was considered sufficient for routine traffic situations but not for highway traffic operations. Local media and posters are used by the Public Security Bureau, which is responsible for education about safe driving, but participants thought that the education campaigns are limited in scope. 
Participants also described detection avoidance strategies used by drivers including: changing route; ignoring a police instruction to stop; staying inside the vehicle with windows and doors locked to avoid being tested; intentionally not performing breath tests correctly; and arguing with officers. This pilot study provided important insights from traffic police in one Chinese city which suggest there may be potential unintended effects of introducing more severe penalties including a range of strategies reportedly used by drivers to avoid detection. Recommendations for future research include a larger study to confirm these findings and examine the training and education of drivers; the focus and reach of publicity; and possible resource needs to support police enforcement.

Relevance:

60.00%

Publisher:

Abstract:

Wolbachia pipientis is an endosymbiotic bacterium present in diverse insect species. Although it is well studied for its dramatic effects on host reproductive biology, little is known about its effects on other aspects of host biology, despite its presence in a wide array of host tissues. This study examined the effects of three Wolbachia strains on two different Drosophila species, using a laboratory performance assay of insect locomotion in response to olfactory cues. The results demonstrate that Wolbachia infection can have significant effects on host responsiveness that vary with the Wolbachia strain-host species combination. The wRi strain, native to Drosophila simulans, increases the basal activity level of the host insect as well as its responsiveness to food cues. In contrast, the wMel strain and the virulent wMelPop strain, native to Drosophila melanogaster, cause slight decreases in responsiveness to food cues but do not alter basal activity levels in the host. Surprisingly, the virulent wMelPop strain has very little impact on host responsiveness in D. simulans, a novel strain-host combination created previously by artificial transinfection. These findings have implications for understanding the evolution and spread of Wolbachia infections in wild populations, and for Wolbachia-based vector-borne disease control strategies currently being developed.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). 
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
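As an illustration of the risk-ratio statistic reported above, the following sketch computes a crude RR and log-scale 95% CI from the review's phlebitis counts. Note this single-table calculation is not the review's method: the published figures (RR 1.14, 95% CI 0.93 to 1.39) come from pooling the individual trials in a meta-analysis, so the crude estimate differs slightly.

```python
import math

# Phlebitis counts as reported in the review (events / patients per arm)
events_ci, n_ci = 186, 2365   # clinically-indicated removal
events_rt, n_rt = 166, 2441   # routine 3-day change

# Crude risk ratio and its 95% CI on the log scale
rr = (events_ci / n_ci) / (events_rt / n_rt)
se_log_rr = math.sqrt(1/events_ci - 1/n_ci + 1/events_rt - 1/n_rt)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

Because the CI comfortably spans 1, the crude calculation agrees with the review's conclusion of no detectable difference in phlebitis between the two policies.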

Relevance:

60.00%

Publisher:

Abstract:

Bayesian networks (BNs) provide a statistical modelling framework which is ideally suited for modelling the many factors and components of complex problems such as healthcare-acquired infections. The methicillin-resistant Staphylococcus aureus (MRSA) organism is particularly troublesome since it is resistant to standard treatments for staphylococcal infections. Overcrowding and understaffing are believed to increase infection transmission rates and also to inhibit the effectiveness of disease control measures. The mechanisms behind MRSA transmission and containment are therefore very complicated, and control strategies may only be effective when used in combination. BNs are growing in popularity in general and in the medical sciences in particular: a recent Current Contents search of published BN journal articles showed a five-fold increase overall, and a six-fold increase in medical and veterinary science, from 2000 to 2009. This chapter introduces the reader to Bayesian network (BN) modelling and to an iterative modelling approach used to build and test the BN created to investigate the possible role of high bed occupancy in the transmission of MRSA while simultaneously taking other risk factors into account.
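The kind of query such a BN supports can be sketched with a toy three-node network: bed Occupancy influences the effectiveness of Control measures, and both influence Transmission. This is a minimal, self-contained illustration, not the chapter's actual model, and every probability below is made up for the example.

```python
from itertools import product

# Hypothetical conditional probability tables (all numbers illustrative).
# Variables are binary: O=1 high occupancy, C=1 control measures fail,
# T=1 MRSA transmission occurs.
P_O = {1: 0.3, 0: 0.7}                      # P(O)
P_C_given_O = {1: {1: 0.6, 0: 0.4},         # P(C | O): crowding degrades controls
               0: {1: 0.2, 0: 0.8}}
P_T_given_OC = {(1, 1): 0.40, (1, 0): 0.15, # P(T=1 | O, C)
                (0, 1): 0.20, (0, 0): 0.05}

def joint(o, c, t):
    """P(O=o, C=c, T=t) via the chain rule implied by the network structure."""
    p_t = P_T_given_OC[(o, c)] if t == 1 else 1 - P_T_given_OC[(o, c)]
    return P_O[o] * P_C_given_O[o][c] * p_t

def query(t, o):
    """P(T=t | O=o) by enumeration: sum out the hidden variable C."""
    num = sum(joint(o, c, t) for c in (0, 1))
    den = sum(joint(o, c, tt) for c, tt in product((0, 1), repeat=2))
    return num / den

print(f"P(transmission | high occupancy) = {query(1, 1):.2f}")  # 0.30
print(f"P(transmission | low occupancy)  = {query(1, 0):.2f}")  # 0.08
```

Even in this toy version, inference propagates the indirect effect of occupancy through the control-measures node, which is exactly the kind of combined-pathway reasoning the chapter uses BNs for; real models of this type are usually built with dedicated BN software rather than by hand.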