867 results for catheter-related bloodstream infection, nosocomial infection, healthcare-associated infection, infection control, antimicrobial catheters, healthcare epidemiology, health economics, economic evaluation, cost-effectiveness, health technology assessment


Relevance:

100.00%

Publisher:

Abstract:

Background: There is growing interest in the potential utility of real-time polymerase chain reaction (PCR) in diagnosing bloodstream infection by detecting pathogen deoxyribonucleic acid (DNA) in blood samples within a few hours. SeptiFast (Roche Diagnostics GmbH, Mannheim, Germany) is a multipathogen probe-based system targeting ribosomal DNA sequences of bacteria and fungi. It detects and identifies the commonest pathogens causing bloodstream infection. As background to this study, we report a systematic review of Phase III diagnostic accuracy studies of SeptiFast, which reveals uncertainty about its likely clinical utility, based on widespread deficiencies in study design and reporting and a high risk of bias.

Objective: To determine the accuracy of SeptiFast real-time PCR for the detection of health-care-associated bloodstream infection, against standard microbiological culture.

Design: Prospective multicentre Phase III clinical diagnostic accuracy study conducted according to the Standards for Reporting of Diagnostic Accuracy Studies (STARD) criteria.

Setting: Critical care departments within NHS hospitals in the north-west of England. 

Participants: Adult patients requiring blood culture (BC) when developing new signs of systemic inflammation. 

Main outcome measures: SeptiFast real-time PCR results at species/genus level compared with microbiological culture in association with independent adjudication of infection. Metrics of diagnostic accuracy were derived including sensitivity, specificity, likelihood ratios and predictive values, with their 95% confidence intervals (CIs). Latent class analysis was used to explore the diagnostic performance of culture as a reference standard. 

Results: Of 1006 new patient episodes of systemic inflammation in 853 patients, 922 (92%) met the inclusion criteria and provided sufficient information for analysis. Index test assay failure occurred on 69 (7%) occasions. Adult patients had been exposed to a median of 8 days (interquartile range 4–16 days) of hospital care, had high levels of organ support activities and recent antibiotic exposure. SeptiFast real-time PCR, when compared with culture-proven bloodstream infection at species/genus level, had better specificity (85.8%, 95% CI 83.3% to 88.1%) than sensitivity (50%, 95% CI 39.1% to 60.8%). When compared with pooled diagnostic metrics derived from our systematic review, our clinical study revealed lower test accuracy of SeptiFast real-time PCR, mainly as a result of low diagnostic sensitivity. There was a low prevalence of BC-proven pathogens in these patients (9.2%, 95% CI 7.4% to 11.2%) such that the post-test probabilities of both a positive (26.3%, 95% CI 19.8% to 33.7%) and a negative SeptiFast test (5.6%, 95% CI 4.1% to 7.4%) indicate the potential limitations of this technology in the diagnosis of bloodstream infection. However, latent class analysis indicates that BC has a low sensitivity, questioning its relevance as a reference test in this setting. Using this analysis approach, the sensitivity of the SeptiFast test was low but also appeared significantly better than BC. Blood samples identified as positive by either culture or SeptiFast real-time PCR were associated with a high probability (> 95%) of infection, indicating higher diagnostic rule-in utility than was apparent using conventional analyses of diagnostic accuracy. 
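The post-test probabilities above follow directly from Bayes' rule applied to the reported prevalence, sensitivity and specificity. A minimal sketch reproducing the quoted figures (variable names are illustrative, not from the study):

```python
# Reported values: prevalence 9.2%, sensitivity 50%, specificity 85.8%
prev, sens, spec = 0.092, 0.50, 0.858

# Probability of bloodstream infection given a positive SeptiFast result
p_pos = (prev * sens) / (prev * sens + (1 - prev) * (1 - spec))
# Probability of infection despite a negative result
p_neg = (prev * (1 - sens)) / (prev * (1 - sens) + (1 - prev) * spec)

print(f"post-test probability, positive test: {p_pos:.1%}")  # 26.3%
print(f"post-test probability, negative test: {p_neg:.1%}")  # 5.6%
```

These match the 26.3% and 5.6% reported in the abstract, illustrating why a low pre-test prevalence limits the rule-in and rule-out value of the assay.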

Conclusion: SeptiFast real-time PCR on blood samples may have rapid rule-in utility for the diagnosis of health-care-associated bloodstream infection, but the lack of sensitivity is a significant limiting factor. Innovations aimed at improving the diagnostic sensitivity of real-time PCR in this setting are urgently required. Future work recommendations include technology developments to improve the efficiency of pathogen DNA extraction, the capacity to detect a much broader range of pathogens and drug resistance genes, and the application of new statistical approaches able to more reliably assess test performance in situations where the reference standard (e.g. blood culture in the setting of high antimicrobial use) is prone to error.

Introduction: Central venous catheters (CVCs) are widely used and highly important in the intensive care unit, supporting a range of clinical activities, but they carry substantial potential for complications; understanding every aspect of their use is therefore essential to controlling them. Methods: We conducted a descriptive cross-sectional study to characterise patients who required a CVC at the Hospital Universitario Fundación Santa Fe de Bogotá between June 2011 and May 2013, describing the associated mechanical and infectious complications and determining the bacteraemia rate, causative organisms and their resistance patterns. Results: A total of 2,286 CVCs were placed, 52.9% in men; the mean age was 58.9 years. Overall complications amounted to 4.5% (infectious 4.0%, mechanical 0.6%). All mechanical complications were immediate; none were late. Infectious complications comprised insertion-site infection and bacteraemia. The bacteraemia rate was 3.4 per 1,000 catheter-days in 2013, down from 3.9 in 2012 and 4.4 in 2011. The most frequently isolated microorganism was coagulase-negative Staphylococcus, with the usual resistance pattern. Conclusion: Complications associated with CVC use at the HUFSFB occur less frequently than reported internationally; the CVC-associated bacteraemia rate has fallen year on year, possibly owing to stricter care following the implementation of management protocols.
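The bacteraemia rate quoted above is expressed per 1,000 catheter-days. A small sketch of the arithmetic; the abstract reports only the rates, so the infection and catheter-day counts below are hypothetical:

```python
def bsi_rate_per_1000(infections: int, catheter_days: int) -> float:
    """CR-BSI incidence density expressed per 1,000 catheter-days."""
    return infections / catheter_days * 1000

# Hypothetical example: 7 bacteraemias over 2,059 catheter-days
# gives roughly the 3.4 per 1,000 catheter-days reported for 2013.
print(round(bsi_rate_per_1000(7, 2059), 1))  # 3.4
```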

Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits of $948 per catheter, but there was a 62% probability of error in this conclusion.
Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good quality evidence in this area.
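The net-monetary-benefit framework used in this kind of model can be sketched as follows. The willingness-to-pay threshold is an assumed value, and the per-catheter QALY and cost inputs are scaled down from the per-1,000-catheter figures above purely for illustration:

```python
def incremental_nmb(d_qalys: float, d_cost: float, wtp: float) -> float:
    """Incremental net monetary benefit:
    willingness-to-pay x QALYs gained, minus the extra cost."""
    return wtp * d_qalys - d_cost

# Hypothetical per-catheter inputs: 1.6 QALYs and AUD $130,289 saved per
# 1,000 catheters -> ~0.0016 QALYs gained and ~$130 SAVED (negative cost)
# per catheter, at an assumed $50,000/QALY threshold.
print(round(incremental_nmb(0.0016, -130.0, 50_000)))  # 210, positive favours coating
```

A positive incremental NMB favours the coated catheter at the chosen threshold; probabilistic sensitivity analysis then asks how often that sign flips across plausible parameter draws, which is where the abstract's 62% error probability comes from.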

BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community-dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between-group difference in the CRBSI rate (clinically indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39).
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically indicated 4/1593 (0.25%); routine change 9/1690 (0.53%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
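As a rough illustration of where such estimates come from, the crude risk ratio and a Wald confidence interval for the CRBSI counts can be computed as below. Note that the review's pooled RR of 0.61 comes from meta-analytic weighting across trials, so this single-table calculation does not reproduce it exactly:

```python
from math import exp, log, sqrt

def risk_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Crude risk ratio for a events in n1 vs b events in n2,
    with a Wald 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # SE of log(RR)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# CRBSI counts from the review: 1/2365 clinically indicated vs 2/2441 routine
rr, lo, hi = risk_ratio_ci(1, 2365, 2, 2441)
print(f"crude RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR 0.52
```

With only three events in total, the interval is enormous and spans 1.0, which is exactly the uncertainty the review describes.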

Background Bloodstream infections resulting from intravascular catheters (catheter-BSI) in critical care increase patients' length of stay, morbidity and mortality, and the management of these infections and their complications has been estimated to cost the NHS annually £19.1–36.2M. Catheter-BSI are thought to be largely preventable using educational interventions, but guidance as to which types of intervention might be most clinically effective is lacking. Objective To assess the effectiveness and cost-effectiveness of educational interventions for preventing catheter-BSI in critical care units in England. Data sources Sixteen electronic bibliographic databases – including MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, Cumulative Index to Nursing and Allied Health Literature (CINAHL), NHS Economic Evaluation Database (NHS EED), EMBASE and The Cochrane Library databases – were searched from database inception to February 2011, with searches updated in March 2012. Bibliographies of systematic reviews and related papers were screened and experts contacted to identify any additional references. Review methods References were screened independently by two reviewers using a priori selection criteria. A descriptive map was created to summarise the characteristics of relevant studies. Further selection criteria developed in consultation with the project Advisory Group were used to prioritise a subset of studies relevant to NHS practice and policy for systematic review. A decision-analytic economic model was developed to investigate the cost-effectiveness of educational interventions for preventing catheter-BSI. Results Seventy-four studies were included in the descriptive map, of which 24 were prioritised for systematic review. Studies have predominantly been conducted in the USA, using single-cohort before-and-after study designs. 
Diverse types of educational intervention appear effective at reducing the incidence density of catheter-BSI (risk ratios statistically significantly < 1.0), but single lectures were not effective. The economic model showed that implementing an educational intervention in critical care units in England would be cost-effective and potentially cost-saving, with incremental cost-effectiveness ratios under worst-case sensitivity analyses of < £5000/quality-adjusted life-year. Limitations Low-quality primary studies cannot definitively prove that the planned interventions were responsible for observed changes in catheter-BSI incidence. Poor reporting gave unclear estimates of risk of bias. Some model parameters were sourced from other locations owing to a lack of UK data. Conclusions Our results suggest that it would be cost-effective and may be cost-saving for the NHS to implement educational interventions in critical care units. However, more robust primary studies are needed to exclude the possible influence of secular trends on observed reductions in catheter-BSI.
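The cost-effectiveness threshold quoted above is an incremental cost-effectiveness ratio (ICER): extra cost divided by extra QALYs relative to the comparator. A minimal sketch with hypothetical inputs (the model's actual parameters are not given in the abstract):

```python
def icer(cost_new: float, cost_old: float,
         qalys_new: float, qalys_old: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

# Hypothetical example: an educational intervention costing £40,000 more
# that yields 10 extra QALYs gives £4,000/QALY, under the < £5,000/QALY
# worst-case figure reported above.
print(icer(140_000, 100_000, 260.0, 250.0))  # 4000.0
```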

Venous access is indispensable in the care of critically ill patients. The central venous catheter (CVC) enables therapy for this population, but its use can lead to infections, which prolong hospital stay, raise institutions' total costs and increase patient morbidity and mortality. Dressings covering the CVC exit site are effective in preventing catheter-related infections, in particular antiseptic-impregnated dressings such as the chlorhexidine gel dressing. This study aimed to compare the effectiveness of the chlorhexidine gel dressing with that of transparent polyurethane film in preventing central venous catheter colonisation in critically ill adult patients. This was an experimental, prospective, single-centre randomised clinical trial with parallel treatments, conducted in accordance with the Consolidated Standards of Reporting Trials (CONSORT) recommendations. The study was carried out in the Intensive Care Unit and the Coronary Care Unit of a teaching hospital in the interior of the state of São Paulo. A total of 102 patients hospitalised in these units took part, randomly allocated to two groups: an intervention group, which received the chlorhexidine gel dressing, and a control group, which received the transparent polyurethane film. The primary outcome measured was catheter colonisation; the secondary outcomes were clinical exit-site infection, microbiological exit-site infection and catheter-related bloodstream infection. A data collection instrument was developed and its content and form validated by 13 nurses from the study sites. These professionals were trained to perform the dressings and to collect central catheter tips, exit-site swabs and blood cultures.

Descriptive analyses were used for all study variables. Fisher's exact test was used to compare the proportions of each outcome between the intervention and control groups, and logistic regression to explore whether CVC colonisation was associated with catheter dwell time and with patients' Acute Physiology and Chronic Health Evaluation II (APACHE II) scores. There was no statistically significant difference between the groups in colonisation (p = 1.00), microbiological exit-site infection (p = 0.08), clinical exit-site infection (p = 0.77) or catheter-related bloodstream infection (p = 1.00). In conclusion, this study may help healthcare units choose the type of dressing based on their institutional needs, develop protocols for catheter insertion and maintenance, and sustain ongoing educational measures.

Needleless connectors are being increasingly used for direct access to intravascular catheters. However, the potential for microbial contamination of these devices and subsequent infection risk is still widely debated. In this study the microbial contamination rate associated with three-way stopcock luers with standard caps attached was compared to those with Y-type extension set luers with Clearlink® needleless connectors attached. Fifty patients undergoing cardiothoracic surgery who required a central venous catheter (CVC) as part of their peri- and postoperative management were studied for microbial contamination of CVC luers following 72 hrs in situ. Each patient's CVC was randomly designated to have either the three-way stopcocks with caps (control patients) or Clearlink® Y-type extension sets (test patients). Prior to, and following each manipulation of the three-way stopcock luers or Clearlink® devices, a 70% (v/v) isopropyl alcohol swab was used for disinfection of the connections. The microbial contamination of 393 luers, 200 with standard caps and 193 with Clearlink® attached, was determined. The internal surfaces of 20 of 200 (10%) three-way stopcock luers with standard caps were contaminated with micro-organisms whereas only one of 193 (0.5%) luers with Clearlink® attached was contaminated (P < 0.0001). These results demonstrate that the use of the Clearlink® device with a dedicated disinfection regimen reduces the internal microbial contamination rate of CVC luers compared with standard caps. The use of such needle-free devices may therefore reduce the intraluminal risk of catheter-related bloodstream infection and thereby supplement current preventive guidelines. © 2006 The Hospital Infection Society.
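The contamination comparison above (20/200 standard-cap luers vs 1/193 Clearlink luers, P < 0.0001) is a 2x2 proportion test. Below is a standard-library sketch of a two-sided Fisher exact test on that table; it is an illustrative re-derivation, not the study's actual analysis code:

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher exact P for the table [[a, b], [c, d]]: the sum of
    probabilities of all same-margin tables no more likely than the observed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def prob(k: int) -> float:
        # Hypergeometric probability of k events landing in cell (1, 1)
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = prob(a)
    support = range(max(0, row1 + col1 - n), min(row1, col1) + 1)
    return sum(prob(k) for k in support if prob(k) <= p_obs * (1 + 1e-9))

# 20 of 200 standard-cap luers contaminated vs 1 of 193 Clearlink luers
p = fisher_exact_two_sided(20, 180, 1, 192)
print(f"two-sided P = {p:.1e}")  # well below 0.001, consistent with P < 0.0001
```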

Background: Nursing homes for older people provide an environment likely to promote the acquisition and spread of meticillin-resistant Staphylococcus aureus (MRSA), putting residents at increased risk of colonisation and infection. It is recognised that infection control strategies are important in preventing and controlling MRSA transmission.

Objectives: The objective of this review was to determine the effects of infection control strategies for preventing the transmission of MRSA in nursing homes for older people.

Search strategy: We searched the Cochrane Central Register of Controlled Trials (CENTRAL, The Cochrane Library 2009, Issue 2) and the Cochrane Wounds Group Specialised Register (searched May 29th, 2009). We also searched MEDLINE (from 1950 to May Week 4 2009), Ovid EMBASE (1980 to 2009 Week 21), EBSCO CINAHL (1982 to May Week 4 2009), British Nursing Index (1985 to May 2009), DARE (1992 to May 2009), Web of Science (1981 to May 2009), and the Health Technology Assessment (HTA) website (1988 to May 2009). Research in progress was sought through Current Controlled Trials (www.controlled-trials.com), the Medical Research Council Research portfolio, and HSRProj (current USA projects). SIGLE was also searched in order to identify atypical material which was not accessible through more conventional sources.

Selection criteria: All randomised and controlled clinical trials, controlled before and after studies and interrupted time series studies of infection control interventions in nursing homes for older people were eligible for inclusion.

Data collection and analysis: Two authors independently reviewed the results of the searches.

Main results: Since no studies met the selection criteria, neither a meta-analysis nor a narrative description of studies was possible.

Authors' conclusions: The lack of studies in this field is surprising. Nursing homes for older people provide an environment likely to promote the acquisition and spread of infection, with observational studies repeatedly reporting that being a resident of a nursing home increases the risk of MRSA colonisation. Much of the evidence for recently-issued United Kingdom guidelines for the control and prevention of MRSA in health care facilities was generated in the acute care setting. It may not be possible to transfer such strategies directly to the nursing home environment, which serves as both a healthcare setting and a resident's home. Rigorous studies should be conducted in nursing homes, to test interventions that have been specifically designed for this unique environment.

The aim was to study semi-quantitative and quantitative culture techniques in the diagnosis of catheter-related infections in newborns, and to determine oxacillin resistance in the Staphylococcus isolates. A total of 353 catheter tips from 273 newborns in the Neonatal Unit of Hospital FMB were analysed. To confirm the diagnosis of infection, we assessed the newborns' clinical data, the presence of at least one positive blood culture, and growth of ≥ 1,000 CFU/mL on quantitative culture and/or ≥ 15 CFU on semi-quantitative culture, with the same microorganism (species and drug sensitivity) isolated in blood culture and no focus of infection other than the catheter. The disk diffusion technique was used to check strain similarity and resistance to oxacillin. Of the 353 tips analysed, 39 met the inclusion criteria for this study. The semi-quantitative culture was positive in 26 (66.7%) catheters and the quantitative culture in 24 (61.5%). Of the 273 patients, 19 (6.9%) had a diagnosis of catheter-related bloodstream infection (CR-BSI). Of the 19 CR-BSI episodes, S. epidermidis was the predominant aetiological agent (84.2%). Methicillin resistance was found in 14 (73.7%) of the Staphylococcus strains. The semi-quantitative method was more sensitive (79%) than the quantitative method (63%). Antibiotic use may have influenced the sensitivity of the quantitative method, as microorganisms in the lumen are exposed to higher concentrations of antibiotics administered via the catheter.
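Sensitivity here is the share of confirmed CR-BSI episodes each culture method detected. The per-method true-positive counts are not reported in the abstract, so the values below are illustrative numbers consistent with the quoted 79% and 63%:

```python
def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Fraction of confirmed infections the method actually detected."""
    return true_positives / (true_positives + false_negatives)

# Hypothetical counts out of 19 confirmed CR-BSI episodes:
print(f"semi-quantitative: {sensitivity(15, 4):.0%}")  # 79%
print(f"quantitative:      {sensitivity(12, 7):.0%}")  # 63%
```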

Use of focus groups as a technique of inquiry is gaining attention in the area of health-care research. This paper will report on the technique of focus group interviewing to investigate the role of the infection control practitioner. Infection control is examined as a specialty area of health-care practice that has received little research attention to date. Additionally, it is an area of practice that is expanding in response to social, economic and microbiological forces. The focus group technique in this study helped a group of infection control practitioners from urban, regional and rural areas throughout Queensland identify and categorise their daily work activities. The outcomes of this process were then analysed to identify the growth in breadth and complexity of the role of the infection control practitioner in the contemporary health-care environment. Findings indicate that the role of the infection control practitioner in Australia has undergone changes consistent with and reflecting changing models of health-care delivery.

Background: Nursing homes for older people provide an environment likely to promote the acquisition and spread of meticillin-resistant Staphylococcus aureus (MRSA), putting residents at increased risk of colonisation and infection. It is recognised that infection prevention and control strategies are important in preventing and controlling MRSA transmission.

Objectives: To determine the effects of infection prevention and control strategies for preventing the transmission of MRSA in nursing homes for older people.

Search methods: In August 2013, for this third update, we searched the Cochrane Wounds Group Specialised Register, the Cochrane Central Register of Controlled Trials (CENTRAL, The Cochrane Library), the Database of Abstracts of Reviews of Effects (DARE, The Cochrane Library), Ovid MEDLINE, Ovid MEDLINE (In-process and Other Non-Indexed Citations), Ovid EMBASE, EBSCO CINAHL, Web of Science and the Health Technology Assessment (HTA) website. Research in progress was sought through Current Controlled Trials, Gateway to Research, and HSRProj (Health Services Research Projects in Progress).

Selection criteria: All randomised and controlled clinical trials, controlled before and after studies and interrupted time series studies of infection prevention and control interventions in nursing homes for older people were eligible for inclusion.

Data collection and analysis: Two review authors independently reviewed the results of the searches. Another review author appraised identified papers and undertook data extraction which was checked by a second review author.

Main results: For this third update only one study was identified, therefore it was not possible to undertake a meta-analysis. A cluster randomised controlled trial in 32 nursing homes evaluated the effect of an infection control education and training programme on MRSA prevalence. The primary outcome was MRSA prevalence in residents and staff, and a change in infection control audit scores which measured adherence to infection control standards. At the end of the 12 month study, there was no change in MRSA prevalence between intervention and control sites, while mean infection control audit scores were significantly higher in the intervention homes compared with control homes.

Authors' conclusions: There is a lack of research evaluating the effects on MRSA transmission of infection prevention and control strategies in nursing homes. Rigorous studies should be conducted in nursing homes, involving residents and staff to test interventions that have been specifically designed for this unique environment.

OBJECTIVES: To investigate epidemiological, social, diagnostic and economic aspects of chlamydia screening in non-genitourinary medicine settings.

METHODS: Linked studies around a cross-sectional population-based survey of adult men and women invited to collect urine and (for women) vulvovaginal swab specimens at home and mail them to a laboratory for Chlamydia trachomatis testing. Specimens were used in laboratory evaluations of an amplified enzyme immunoassay (PCE EIA) and two nucleic acid amplification tests [Cobas polymerase chain reaction (PCR) and Becton Dickinson strand displacement amplification (SDA)]. Chlamydia-positive cases and two negative controls completed a risk factor questionnaire. Chlamydia-positive cases were invited into a randomised controlled trial of partner notification strategies. Samples of individuals testing negative completed psychological questionnaires before and after screening. In-depth interviews were conducted at all stages of screening. Chlamydia transmission and the cost-effectiveness of screening were investigated in a transmission dynamic model.

SETTING AND PARTICIPANTS: General population in the Bristol and Birmingham areas of England. In total, 19,773 women and men aged 16-39 years were randomly selected from 27 general practice lists.

RESULTS: Screening invitations reached 73% (14,382/19,773). Uptake (4731 participants), weighted for sampling, was 39.5% (95% CI 37.7% to 40.8%) in women and 29.5% (95% CI 28.0% to 31.0%) in men aged 16-39 years. Chlamydia prevalence (219 positive results) in 16-24 year olds was 6.2% (95% CI 4.9% to 7.8%) in women and 5.3% (95% CI 4.4% to 6.3%) in men. The case-control study did not identify any additional factors that would help target screening. Screening did not adversely affect anxiety, depression or self-esteem. Participants welcomed the convenience and privacy of home sampling. The relative sensitivity of PCR on male urine specimens was 100% (95% CI 89.1% to 100%). The combined relative sensitivities of PCR and SDA using female urine and vulvovaginal swabs were 91.8% (95% CI 86.1% to 95.7%; 134/146) and 97.3% (95% CI 93.1% to 99.2%; 142/146) respectively. A total of 140 people (74% of those eligible) participated in the randomised trial. Compared with referral to a genitourinary medicine clinic, partner notification by practice nurses resulted in 12.4% (95% CI -3.7% to 28.6%) more patients with at least one partner treated and 22.0% (95% CI 6.1% to 37.8%) more patients with all partners treated. The health service and patient costs (2005 prices) of home-based postal chlamydia screening were £21.47 (95% CI £19.91 to £25.99) per screening invitation and £28.56 (95% CI £22.10 to £30.43) per accepted offer. Preliminary modelling found an incremental cost-effectiveness ratio (2003 prices), comparing annual screening of men and women with no screening, of £27,000 per major outcome averted at 8 years in the base case. If estimated screening uptake and pelvic inflammatory disease incidence were increased, the cost-effectiveness ratio fell to £3700 per major outcome averted.

CONCLUSIONS: Proactive screening for chlamydia in women and men using home-collected specimens was feasible and acceptable. Chlamydia prevalence rates in men and women in the general population are similar. Nucleic acid amplification tests can be used on first-catch urine specimens and vulvovaginal swabs. The administrative costs of proactive screening were similar to those of opportunistic screening. Using empirical estimates of screening uptake and incidence of complications, screening was not cost-effective.
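Proportion estimates like those above can be checked against the raw counts. A minimal sketch using a Wilson score interval for the combined relative sensitivity of 134/146; the published intervals may have been computed with an exact (Clopper-Pearson) method, so small discrepancies in the bounds are expected:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - margin, centre + margin

# Combined relative sensitivity of PCR and SDA on female specimens: 134/146
lo, hi = wilson_ci(134, 146)
print(f"{134/146:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

This yields roughly 91.8% (95% CI 86.2% to 95.2%), close to the reported 86.1% to 95.7%, which is consistent with an exact interval having been used in the study.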


BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) as abnormal blood vessels develop and leak fluid and blood at the macula.

OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.

DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).

REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.

INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).

COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using the quality assessment of diagnostic accuracy studies tool, version 2 (QUADAS-2). Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%) with nine strategies for diagnosis and/or monitoring, and a cost-utility analysis was conducted. An NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted at 3.5%. Deterministic and probabilistic sensitivity analyses were performed.
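The 3.5% discounting applied to costs and QALYs in the model works by deflating each future year's value back to present value. A minimal sketch with hypothetical per-year values (not the study's actual model inputs):

```python
def discounted_total(stream, rate=0.035):
    """Sum a per-year stream, discounting year t by 1 / (1 + rate)**t.

    Year 0 is undiscounted, matching the usual health-economic convention.
    """
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

# Hypothetical: a cost of £1000 incurred in each of three years
total = discounted_total([1000, 1000, 1000])
```

The same function applies unchanged to a stream of QALYs, since costs and outcomes are discounted at the same rate here.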

RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%) respectively. For monitoring, pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%) respectively. The strategy of FFA for diagnosis with nurse-technician-led monitoring had the lowest cost (£39,769; 10.473 QALYs) and dominated all others except FFA for diagnosis with ophthalmologist-led monitoring (£44,649; 10.575 QALYs; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at a £30,000 willingness-to-pay threshold.
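The incremental cost-effectiveness ratio quoted above is the cost difference divided by the QALY difference between the two non-dominated strategies. A minimal sketch using the rounded figures from this abstract; the published £47,768 will differ slightly because it was presumably computed from unrounded model outputs:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per QALY gained of the new strategy over the old."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Ophthalmologist-led vs nurse-technician-led monitoring (both with FFA diagnosis)
ratio = icer(44649, 39769, 10.575, 10.473)
```

With the rounded inputs this gives roughly £47,800 per QALY, above the £30,000 threshold mentioned in the results, which is why the more effective strategy was not the cost-effective choice at that threshold.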

LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.

CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.