80 results for Catheter Ablation
Abstract:
A combination of laser plasma ablation and strain control in CdO/ZnO heterostructures is used to produce and stabilize a metastable wurtzite CdO nanophase. According to the Raman selection rules, this nanophase is Raman-active, whereas the thermodynamically preferred rocksalt phase is inactive. The wurtzite-specific and thickness/strain-dependent Raman fingerprints and phonon modes are identified and can be used for reliable and inexpensive nanophase detection. The wurtzite nanophase formation is also confirmed by x-ray diffractometry. The demonstrated ability to control the metastable phase and phonon modes in CdO/ZnO heterostructures is promising for the development of next-generation light-emitting sources and exciton-based laser diodes.
Abstract:
In his letter Cunha suggests that oral antibiotic therapy is safer and less expensive than intravenous therapy via central venous catheters (CVCs) (1). The implication is that costs will fall and increased health benefits will be enjoyed, resulting in a gain in efficiency within the healthcare system. CVCs are often used in critically ill patients to deliver antimicrobial therapy, but expose patients to a risk of catheter-related bloodstream infection (CRBSI). Our current knowledge about the efficiency (i.e. cost-effectiveness) of allocating resources toward interventions that prevent CRBSI in patients requiring a CVC has already been reviewed (2). If, for some patient groups, antimicrobial therapy can be delivered orally instead of through a CVC, then the costs and benefits of this alternative strategy should be evaluated...
Abstract:
Peripheral venous catheters (PVCs) are the simplest and most frequently used method for drug, fluid, and blood product administration in the hospital setting. It is estimated that up to 90% of patients in acute care hospitals require a PVC; however, PVCs are associated with inherent complications, which can be mechanical or infectious. A range of strategies has been used to prevent or reduce PVC-related complications, including optimizing patency through flushing. Little is known about the current status of flushing practice. This observational study quantified preparation and administration time and assessed adherence to the principles of Aseptic Non-Touch Technique and to organizational protocol on PVC flushing, using both manually prepared and prefilled syringes.
Abstract:
Background Bloodstream infections resulting from intravascular catheters (catheter-BSI) in critical care increase patients' length of stay, morbidity and mortality, and the management of these infections and their complications has been estimated to cost the NHS annually £19.1–36.2M. Catheter-BSI are thought to be largely preventable using educational interventions, but guidance as to which types of intervention might be most clinically effective is lacking. Objective To assess the effectiveness and cost-effectiveness of educational interventions for preventing catheter-BSI in critical care units in England. Data sources Sixteen electronic bibliographic databases – including MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, Cumulative Index to Nursing and Allied Health Literature (CINAHL), NHS Economic Evaluation Database (NHS EED), EMBASE and The Cochrane Library databases – were searched from database inception to February 2011, with searches updated in March 2012. Bibliographies of systematic reviews and related papers were screened and experts contacted to identify any additional references. Review methods References were screened independently by two reviewers using a priori selection criteria. A descriptive map was created to summarise the characteristics of relevant studies. Further selection criteria developed in consultation with the project Advisory Group were used to prioritise a subset of studies relevant to NHS practice and policy for systematic review. A decision-analytic economic model was developed to investigate the cost-effectiveness of educational interventions for preventing catheter-BSI. Results Seventy-four studies were included in the descriptive map, of which 24 were prioritised for systematic review. Studies have predominantly been conducted in the USA, using single-cohort before-and-after study designs. 
Diverse types of educational intervention appear effective at reducing the incidence density of catheter-BSI (risk ratios statistically significantly < 1.0), but single lectures were not effective. The economic model showed that implementing an educational intervention in critical care units in England would be cost-effective and potentially cost-saving, with incremental cost-effectiveness ratios under worst-case sensitivity analyses of < £5000/quality-adjusted life-year. Limitations Low-quality primary studies cannot definitively prove that the planned interventions were responsible for observed changes in catheter-BSI incidence. Poor reporting gave unclear estimates of risk of bias. Some model parameters were sourced from other locations owing to a lack of UK data. Conclusions Our results suggest that it would be cost-effective and may be cost-saving for the NHS to implement educational interventions in critical care units. However, more robust primary studies are needed to exclude the possible influence of secular trends on observed reductions in catheter-BSI.
Abstract:
A wet-milling protocol was employed to produce pressed powder tablets with excellent cohesion and homogeneity, suitable for laser ablation (LA) analysis of volatile and refractory elements in sediment. The influence of sample preparation on analytical performance was also investigated, including sample homogeneity, accuracy and limit of detection. Milling in a volatile solvent for 40 min ensured the sample was well mixed and gave reasonable recoveries of both volatile (Hg) and refractory (Zr) elements. With the exception of Cr (−52%) and Nb (+26%), major, minor and trace elements in STSD-1 and MESS-3 could be analysed within ±20% of the certified values. The method was compared with a total digestion method using HF by analysing 10 different sediment samples. The laser method recovers significantly higher amounts of analytes such as Ag, Cd and Sn than the total digestion method, making it a more robust method for elements across the periodic table. LA-ICP-MS also eliminates the interferences from chemical reagents as well as the health and safety risks associated with digestion processes. Therefore, it can be considered an enhanced method for the analysis of heterogeneous matrices such as river sediments.
Abstract:
Background People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in all patients in any healthcare setting. Data collection and analysis We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, or otherwise synthesised data descriptively when heterogeneous.
Main results We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All-cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all-cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long and short interval dressing changes on pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).
Authors' conclusions The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
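The review above reports each outcome as a risk ratio with a 95% confidence interval. As a minimal sketch of how such a ratio and interval are derived from 2×2 trial counts — using the standard log-normal approximation, with wholly hypothetical counts rather than data from any trial listed here:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio (group A vs group B) with a 95% CI via the log-normal approximation."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Delta-method standard error of log(RR)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    half = z * se
    return rr, math.exp(math.log(rr) - half), math.exp(math.log(rr) + half)

# Hypothetical counts: 10/100 infections vs 5/100 infections
rr, lo, hi = risk_ratio_ci(10, 100, 5, 100)
# The CI spans 1.0, so — as in several trials above — the difference is unclear
```

A CI that crosses 1.0 is exactly what drives the "it is unclear whether there is a difference" wording repeated throughout the abstract.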
Abstract:
Background Centers for Disease Control and Prevention (CDC) guidelines recommend replacement of peripheral intravenous (IV) catheters every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bacteraemia. Catheter insertion is an unpleasant experience for patients, and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. Objectives To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely.
Abstract:
Background: Bone loss associated with low oestrogen levels in postmenopausal women, and with androgen deprivation therapy in men with hormone-sensitive prostate cancer, results in an increased incidence of fractures. Denosumab has been shown to increase bone mineral density in these two conditions. Objectives/methods: The objective of this evaluation is to review the clinical trials that have studied clinical endpoints in these conditions. Results: FREEDOM (Fracture Reduction Evaluation of Denosumab in Osteoporosis Every 6 Months) was an international phase III clinical trial that measured clinical endpoints with denosumab in postmenopausal women with osteoporosis. At 36 months, new vertebral fractures had occurred in 7.2% of subjects in the placebo group, compared with 2.3% of subjects treated with denosumab. HALT (Denosumab Hormone Ablation Bone Loss Trial) studied clinical endpoints in men with non-metastatic prostate cancer receiving androgen-deprivation therapy. The incidence of vertebral fractures was significantly lower in the denosumab group (1.5%) than in the placebo group (3.9%). The incidence of adverse effects with denosumab in both clinical trials was low. Conclusions: Denosumab reduces the incidence of fractures in postmenopausal women with osteoporosis and in men with non-metastatic prostate cancer receiving androgen-deprivation therapy. Denosumab is well tolerated.
Abstract:
The growth of the Penaeus monodon prawn aquaculture industry in Australia is hampered by a reliance on wild-caught broodstock. This species has proven difficult to breed from when broodstock are reared in captivity. Studies were therefore carried out to investigate factors controlling reproduction and influencing egg quality. Results of the studies revealed that patterns of nutrient accumulation during early ovary development are altered by captive conditions, possibly contributing to reduced larval quality. The sinus gland hormones were shown, together with the environment, to regulate two stages of ovary development. In a separate study it was further revealed that the hormone methyl farnesoate (MF) can negatively regulate the final stages of ovary development. Lastly, it was shown that broodstock reared in captivity are less likely to mate, and that this is due to inherent problems in both the male and the female prawns.
Abstract:
Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits of $948 per catheter, but there was a 62% probability of error in this conclusion.
Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good quality evidence in this area.
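The evaluation above ranks catheter types by incremental net monetary benefit and cost per QALY gained. As a minimal sketch of those two summary measures — the AUD $40,000-per-QALY threshold appears in the related thesis work in these results, but the per-catheter increments below are hypothetical, not figures from the study:

```python
WTP = 40_000  # willingness-to-pay per QALY (AUD), as used in the related analyses

def incremental_nmb(delta_qalys, delta_cost, wtp=WTP):
    """Incremental net monetary benefit of a coated vs an uncoated catheter.

    Positive values favour the coated catheter at the given threshold."""
    return wtp * delta_qalys - delta_cost

def icer(delta_qalys, delta_cost):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qalys

# Hypothetical per-catheter increments; a negative cost means cost-saving
nmb = incremental_nmb(delta_qalys=0.002, delta_cost=-50)
```

A strategy with positive incremental NMB (equivalently, an ICER below the threshold) is cost-effective in this framework; when costs fall and QALYs rise, as for the MR catheters in the baseline analysis, the strategy is dominant.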
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 million and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs), or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision, and can a research agenda to improve decision-making in this area be identified? Methods: A decision-analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model was developed in conjunction with a panel of clinical experts which described the epidemiology and prognosis of CR-BSI. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts and its sensitivity to modelling assumptions were assessed.
Two separate economic evaluations were conducted. The first evaluation compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated, along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements are for its implementation, and what the value of these would be in an Australian context. As such, we undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The MR catheters dominated the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefits ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high, 62% in the baseline scenario. Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters.
In the second evaluation of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of at least 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature), it is cost-effective relative to MR catheters if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial, and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred: the potential for a more efficient healthcare system is forgone. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia were as effective as those used in the large studies in the United States, it would be preferred over the catheters, provided it could be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources.
There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
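The thesis abstract above prices residual uncertainty using the expected value of perfect information (EVPI): the gap between deciding under current information and deciding after all parameter uncertainty is resolved. A minimal sketch of that calculation from Monte Carlo net-benefit draws — a toy two-option example with illustrative numbers, not the thesis model:

```python
def evpi(nb_draws):
    """Expected value of perfect information from simulated net benefits.

    nb_draws: list of (nb_option_a, nb_option_b) pairs, one pair per Monte
    Carlo draw from the joint distribution of the uncertain model parameters.
    """
    n = len(nb_draws)
    # With current information we must commit to one option for all draws:
    # the one that is best on average.
    mean_a = sum(a for a, _ in nb_draws) / n
    mean_b = sum(b for _, b in nb_draws) / n
    value_current = max(mean_a, mean_b)
    # With perfect information we could pick the best option in each draw.
    value_perfect = sum(max(a, b) for a, b in nb_draws) / n
    return value_perfect - value_current

# Toy draws: each option wins in half the states of the world, so resolving
# the uncertainty before deciding has positive value.
result = evpi([(1, 0), (0, 1)])
```

When EVPI (scaled to the affected population, as in the $7.3 million figure above) exceeds the cost of further research, collecting more evidence is itself cost-effective.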
Abstract:
Prostate cancer is an important male health issue. The strategies used to diagnose and treat prostate cancer underscore the cell and molecular interactions that promote disease progression. Prostate cancer is histologically defined by increasingly undifferentiated tumour cells and therapeutically targeted by androgen ablation. Even as the normal glandular architecture of the adult prostate is lost, prostate cancer cells remain dependent on the androgen receptor (AR) for growth and survival. This project focused on androgen-regulated gene expression, altered cellular differentiation, and the nexus between these two concepts. The AR controls prostate development, homeostasis and cancer progression by regulating the expression of downstream genes. Kallikrein-related serine peptidases are prominent transcriptional targets of the AR in the adult prostate. Kallikrein 3 (KLK3), which is commonly referred to as prostate-specific antigen, is the current serum biomarker for prostate cancer. Other kallikreins are potential adjunct biomarkers. As secreted proteases, kallikreins act through enzyme cascades that may modulate the prostate cancer microenvironment. Both as a panel of biomarkers and as a cascade of proteases, the roles of kallikreins are interconnected. Yet the expression and regulation of different kallikreins in prostate cancer have not been compared. In this study, a spectrum of prostate cell lines was used to evaluate the expression profile of all 15 members of the kallikrein family. A cluster of genes was co-ordinately expressed in androgen-responsive cell lines. This group of kallikreins included KLK2, 3, 4 and 15, which are located adjacent to one another at the centromeric end of the kallikrein locus. KLK14 was also of interest, because it was ubiquitously expressed among the prostate cell lines. Immunohistochemistry showed that these 5 kallikreins are co-expressed in benign and malignant prostate tissue.
The androgen-regulated expression of KLK2 and KLK3 is well characterised, but has not been compared with that of other kallikreins. Therefore, the expression of KLK2, 3, 4, 14 and 15 was measured in time course and dose response experiments with androgens, AR-antagonist treatments, hormone deprivation experiments and cells transfected with AR siRNA. Collectively, these experiments demonstrated that prostatic kallikreins are specifically and directly regulated by the AR. The data also revealed that kallikrein genes are differentially regulated by androgens: KLK2 and KLK3 were strongly up-regulated, KLK4 and KLK15 were modestly up-regulated, and KLK14 was repressed. Notably, KLK14 is located at the telomeric end of the kallikrein locus, far from the centromeric cluster of kallikreins that are stimulated by androgens. These results show that the expression of KLK2, 3, 4, 14 and 15 is maintained in prostate cancer, but that these genes exhibit different responses to androgens. This makes the kallikrein locus an ideal model to investigate AR signalling. The increasingly dedifferentiated phenotype of aggressive prostate cancer cells is accompanied by the re-expression of signalling molecules that are usually expressed during embryogenesis and foetal tissue development. The Wnt pathway is one developmental cascade that is reactivated in prostate cancer. The canonical Wnt cascade regulates the intracellular levels of β-catenin, a potent transcriptional co-activator of T-cell factor (TCF) transcription factors. Notably, β-catenin can also bind to the AR and synergistically stimulate androgen-mediated gene expression. This comes at the expense of typical Wnt/TCF target genes, because the AR:β-catenin and TCF:β-catenin interactions are mutually exclusive. The effect of β-catenin on kallikrein expression was examined to further investigate the role of β-catenin in prostate cancer.
Stable knockdown of β-catenin in LNCaP prostate cancer cells attenuated the androgen-regulated expression of KLK2, 3, 4 and 15, but not KLK14. To test whether KLK14 is instead a TCF:β-catenin target gene, the endogenous levels of β-catenin were increased by inhibiting its degradation. Although KLK14 expression was up-regulated by these treatments, siRNA knockdown of β-catenin demonstrated that this effect was independent of β-catenin. These results show that β-catenin is required for maximal expression of KLK2, 3, 4 and 15, but not KLK14. Developmental cells and tumour cells express a similar repertoire of signalling molecules, which means that these different cell types are responsive to one another. Previous reports have shown that stem cells and foetal tissues can reprogram aggressive cancer cells to less aggressive phenotypes by restoring the balance to developmental signalling pathways that are highly dysregulated in cancer. To investigate this phenomenon in prostate cancer, DU145 and PC-3 prostate cancer cells were cultured on matrices pre-conditioned with human embryonic stem cells (hESCs). Soft agar assays showed that prostate cancer cells exposed to hESC conditioned matrices had reduced clonogenicity compared with cells harvested from control matrices. A recent study demonstrated that this effect was partially due to hESC-derived Lefty, an antagonist of Nodal. A member of the transforming growth factor β (TGFβ) superfamily, Nodal regulates embryogenesis and is re-expressed in cancer. The role of Nodal in prostate cancer has not previously been reported. Therefore, the expression and function of the Nodal signalling pathway in prostate cancer was investigated. Western blots confirmed that Nodal is expressed in DU145 and PC-3 cells. Immunohistochemistry revealed greater expression of Nodal in malignant versus benign glands. Notably, the Nodal inhibitor, Lefty, was not expressed at the mRNA level in any prostate cell lines tested. 
The Nodal signalling pathway is functionally active in prostate cancer cells. Recombinant Nodal treatments triggered downstream phosphorylation of Smad2 in DU145 and LNCaP cells, and stably-transfected Nodal increased the clonogenicity of LNCaP cells. Nodal was also found to modulate AR signalling. Nodal reduced the activity of an androgen-regulated KLK3 promoter construct in luciferase assays and attenuated the endogenous expression of AR target genes, including prostatic kallikreins. These results demonstrate that Nodal is a novel example of a developmental signalling molecule that is re-expressed in prostate cancer and may have a functional role in prostate cancer progression. In summary, this project clarifies the role of androgens and changing cellular differentiation in prostate cancer by characterising the expression and function of the downstream genes encoding kallikrein-related serine proteases and Nodal. Furthermore, this study emphasises the similarities between prostate cancer and early development, and the crosstalk between developmental signalling pathways and the AR axis. The outcomes of this project also affirm the utility of the kallikrein locus as a model system to monitor tumour progression and the phenotype of prostate cancer cells.