940 results for Catheter Ablation
Abstract:
Detection of extraterrestrial life is an ongoing goal in space exploration, and there is a need for advanced instruments and methods for detecting signatures of life based on chemical and isotopic composition. Here, we present the first investigation of the chemical composition of putative microfossils in natural samples using a miniature laser ablation/ionization time-of-flight mass spectrometer (LMS). The studies were conducted with high lateral (~15 μm) and vertical (~20-200 nm) resolution. The primary aim of the study was to investigate the instrument's performance on micrometer-sized samples in terms of both isotope abundance and element composition. The following objectives had to be achieved: (1) detect and calculate single stable isotope ratios in natural rock samples with techniques compatible with their employment in space instrumentation for biomarker detection in future planetary missions; (2) achieve a highly accurate chemical compositional map of rock samples with embedded structures at the micrometer scale, in which the rock matrix is easily distinguished from the micrometer structures. Our results indicate that chemical maps of strongly heterogeneous rock samples can be obtained with high accuracy, whereas isotope ratio measurements still need improvement to reach a sufficiently large signal-to-noise ratio (SNR). Key Words: Biogenicity-Biomarkers-Biosignatures-Filaments-Fossilization. Astrobiology 15, 669-682.
Abstract:
This study sought to determine the barriers and facilitators to nurses' acceptance of the Johnson and Johnson Protectiv®* Plus IV catheter safety needle device, and the implications for needlestick injuries, at St. Luke's Episcopal Hospital, Houston, Texas. A one-time cross-sectional survey of 620 responding nurses was conducted by this researcher in December 2000. The study objectives were to: (1) describe the perceived (a) organizational and individual barriers and facilitators and (b) acceptance of implementation of the IV catheter device; (2) examine the relative importance of these predictors; (3) describe (a) perceived changes in needlestick injuries after implementation of the device, (b) the reported incidence of injuries, and (c) the extent of underreporting by nurses; and (4) examine the relative importance of (a) the preceding predictors and (b) acceptance of the device in predicting perceived changes in needlestick injuries. Safety climate and training were evaluated as organizational factors. Individual factors evaluated were experience with the device (time using it and frequency of use) and background information (nursing unit, and length of time as a nurse in this hospital and over the total nursing career). The conceptual framework was based upon the safety climate model. Descriptive statistics and multiple and logistic regression were used to address the study objectives. The findings showed widespread acceptance of the device and a strong perception that it reduced the number of needlesticks. Acceptance was most notably predicted by adequate training, appropriate time between training and device use, a solid safety climate, and short length of service, in that order. A barrier to acceptance was nurses' long use of previous needle technologies. Over four-fifths of nurses were compliant in always using the device. Compliance had two facilitators: length of time using the device and, to a lesser extent, safety climate.
Rates of compliance tended to be lower among nurses in units in which the device was frequently used. High-quality training and an atmosphere of caring about nurse safety stand out as primary facilitators that other institutions would need to adopt in order to achieve maximum success in implementing safety programs involving new safety devices.
Abstract:
A case-series analysis of approximately 811 cancer patients who developed candidemia between 1989 and 1998 and were seen at M. D. Anderson Cancer Center was conducted to assess the impact and timing of central venous catheter (CVC) removal on the outcome of fungal bloodstream infections in cancer patients with primary catheter-related candidemia as well as secondary infections. The study explored the diagnosis and management of vascular catheter-associated fungemia in patients with cancer. Microbiologic and clinical factors were examined to identify predictors of catheter-related candidemia; these included, in addition to basic demographics, the underlying malignancy, chemotherapy, neutropenia, and other salient data. Statistical analyses included univariate and multivariate logistic regression to determine the outcome of candidemia in relation to the timing of catheter removal and the type of species, and to identify predictors of catheter-related infections. The conclusions of the study aim at enhancing our understanding of issues involving CVC removal and may have an impact on the management of nosocomial bloodstream infections as it relates to the timing of CVC removal and the optimal duration of treatment of catheter-related candidemia.
Abstract:
Background. Health care-associated catheter-related bloodstream infections (CRBSI) represent a significant public health concern in the United States. Several studies have suggested that precautions such as maximum sterile barriers and the use of antimicrobial catheters are efficacious at reducing CRBSI, but there is concern within the medical community that prolonged use of antimicrobial catheters may be associated with increased bacterial resistance. Clinical studies have shown no such association, and a significant decrease in microbial resistance with prolonged minocycline/rifampin (M/R) catheter use. One explanation is the emergence of community-acquired methicillin-resistant Staphylococcus aureus (MRSA), which is more susceptible to antibiotics, as a cause of CRBSI. Methods. Data from 323 MRSA isolates cultured from cancer patients at The University of Texas MD Anderson Cancer Center from 1997 to 2007 were analyzed to determine whether there is a relationship between resistance to minocycline and rifampin and prolonged widespread use of M/R catheters. Analysis was also conducted to determine whether there was a significant change in the prevalence of community-acquired MRSA (CA-MRSA) during this period and whether this emergence acted as a confounder masking the true relationship between microbial resistance and prolonged M/R catheter use. Results. Our study showed that the significant (p=0.008) change in strain type over time is a confounding variable; the adjusted model showed a significant protective effect (OR 0.000281, 95% CI 1.4×10⁻⁴ to 5.5×10⁻⁴) in the relationship between MRSA resistance to minocycline and prolonged M/R catheter use. The relationship between resistance to rifampin and prolonged M/R catheter use was not significant. Conclusion. The emergence of CA-MRSA is a confounder in the relationship between resistance to minocycline and rifampin and prolonged M/R catheter use. However, even after adjustment for the more susceptible CA-MRSA, widespread use of M/R catheters does not promote microbial resistance.
Abstract:
This report describes the development of a Markov model comparing percutaneous radiofrequency ablation (RFA) and stereotactic body radiation therapy (SBRT) in terms of their cost-utility in treating isolated liver metastases from colorectal cancer. The model is based on data from multiple retrospective and prospective studies, available data on the utility states associated with treatment and complications, and publicly available Medicare costs. The purpose of this report is to establish a well-justified model for clinical management decisions. Compared with SBRT, RFA is the more cost-effective treatment for this patient population. From the societal perspective, SBRT may be an acceptable alternative, with an ICER of $28,673/QALY.
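The cost-utility comparison above reduces to an incremental cost-effectiveness ratio (ICER): the extra cost per extra quality-adjusted life year (QALY) of one strategy over another, with each strategy summarized by its discounted lifetime cost and QALY totals from the Markov model. A minimal sketch (all dollar and QALY figures below are hypothetical, not values from the study):

```python
def icer(cost_ref, qaly_ref, cost_alt, qaly_alt):
    """Incremental cost-effectiveness ratio of an alternative strategy
    versus a reference strategy, in dollars per QALY gained."""
    return (cost_alt - cost_ref) / (qaly_alt - qaly_ref)

# Hypothetical discounted lifetime totals for RFA (reference) and SBRT (alternative):
rfa_cost, rfa_qaly = 25_000.0, 2.10
sbrt_cost, sbrt_qaly = 31_500.0, 2.33
print(f"${icer(rfa_cost, rfa_qaly, sbrt_cost, sbrt_qaly):,.0f}/QALY")
```

A strategy is then judged by comparing its ICER against a willingness-to-pay threshold (commonly $50,000-$100,000/QALY in U.S. analyses).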
Abstract:
Catheter-related bloodstream infections are a significant barrier to success in many inpatient healthcare facilities. The goal of this study was to determine whether an evidence-based methodology to reduce the number of catheter-related bloodstream infections in a pediatric inpatient healthcare facility had a significant impact on the infection rate. Catheter-related bloodstream infection rates were compared before and after program implementation. The patient population was selected based upon a recommendation in the 2010 National Healthcare Safety Network report on device-related infections, which indicated a need for more data on pediatric populations requiring admission to a long-term care facility. The study design is a retrospective cohort study. Catheter-related bloodstream infection data were gathered between 2008 and 2011. In October 2008, a program was implemented to reduce the number of catheter-related bloodstream infections. The key components of this initiative were a standardized catheter maintenance checklist, the introduction of a chlorhexidine gluconate-based product for catheter maintenance and skin antisepsis, and a multidisciplinary education plan focused on hand hygiene and aseptic technique. The catheter-related bloodstream infection rate in 2008 was 21.21 infections per 1,000 patient-line days. After program implementation, the 2009 rate dropped to 1.11 per 1,000 patient-line days. The infection rates in 2010 and 2011 were 2.19 and 1.47, respectively. Additionally, this study demonstrated a potential cost savings of $620,000 to $1,240,000 between 2008 and 2009. In conclusion, an evidence-based program based upon CDC guidelines can have a significant impact on catheter-related bloodstream infection rates.
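The rates quoted above follow the standard device-associated infection formula: infections divided by device days, scaled to 1,000. A quick illustrative check (the denominator below is hypothetical; the abstract reports only the rates):

```python
def bsi_rate_per_1000(infections, patient_line_days):
    """Catheter-related BSI rate per 1,000 patient-line days."""
    return infections / patient_line_days * 1000.0

# E.g., 21 infections over a hypothetical 990 patient-line days
# reproduces a rate of the same magnitude as the 2008 figure:
print(round(bsi_rate_per_1000(21, 990), 2))
```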
Abstract:
Background: The distinction between catheter-associated asymptomatic bacteriuria (CAABU) and catheter-associated urinary tract infection (CAUTI) has only recently been widely appreciated. Our aims were to describe the relationship between CAUTI/CAABU and subsequent bacteremia and to investigate whether CAUTI/CAABU and antimicrobial use were associated with either bacteremia or mortality within 30 days. Methods: Our study design was a retrospective cohort. Patients with a urinary catheter and a positive urine culture between October 2010 and June 2011 at a large tertiary care facility were included. A multivariable model was constructed that controlled for age, race, Charlson co-morbidity score, catheter type and duration, category of organism, antimicrobials, and classification of the catheter-associated bacteriuria as CAUTI or CAABU. Results: Data from 444 catheter-associated urine culture episodes in 308 unique patients were included in the analysis. Overall mortality was 21.1% (61 of 308 patients) within 30 days. Among the 444 urine culture episodes, 402 (90.5%) were associated with antibiotic use. Fifty-two episodes (11.7%) were associated with bacteremia, but only 3 episodes of bacteremia (0.7% of 444 CAB episodes) were caused by an organism from the urinary tract. One of these episodes was CAABU and the other 2 were CAUTI. Bacteremia within 30 days was associated with having CAUTI rather than CAABU and with having an indwelling urinary catheter rather than a condom catheter. The variables found to be significant for mortality within 30 days were a higher Charlson co-morbidity score and the presence of Candida in the urine culture. Use of antimicrobial agents to treat the bacteriuria was not associated with an increase or decrease in either bacteremia or mortality. Conclusions: Our findings call into question the practice of giving antimicrobial agents to treat bacteriuria in an inpatient population with nearly universal antimicrobial use. A better practice may be targeted treatment of bacteriuria in patients with risk factors predictive of bacteremia and mortality.
Abstract:
An observational study was conducted in a SICU to determine the frequency of subclavian vein catheter-related infection at 72 hours, to identify the hospital cost of exchange via a guidewire, and to estimate the hospital cost savings of a 72-hour vs. 144-hour exchange policy. An overall catheter-related infection (≥15 colonies by Maki's technique (1977)) occurred in 3% (3/100) of the catheter tips cultured. Specific infection rates were: 9.7% (3/31) for triple-lumen catheters, 0% (0/30) for Swan-Ganz catheters, 0% (0/30) for Cordis catheters, and 0% (0/9) for single-lumen catheters. An estimated annual hospital cost savings of $35,699.00 was identified if the exchange policy were changed from every 72 hours to every 144 hours. It was recommended that a randomized clinical trial be conducted to determine the effect of changing a subclavian vein catheter via a guidewire every 72 hours vs. every 144 hours.
Abstract:
Biological activity introduces variability in element incorporation during calcification and thereby decreases the precision and accuracy when using foraminifera as geochemical proxies in paleoceanography. This so-called 'vital effect' consists of organismal and environmental components. Whereas organismal effects include uptake of ions from seawater and subsequent processing upon calcification, environmental effects include migration- and seasonality-induced differences. Triggering asexual reproduction and culturing juveniles of the benthic foraminifer Ammonia tepida under constant, controlled conditions allow environmental and genetic variability to be removed and the effect of cell-physiological controls on element incorporation to be quantified. Three groups of clones were cultured under constant conditions while determining their growth rates, size-normalized weights and single-chamber Mg/Ca and Sr/Ca using laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). Results show no detectable ontogenetic control on the incorporation of these elements in the species studied here. Despite constant culturing conditions, Mg/Ca varies by a factor of ~4 within an individual foraminifer while intra-individual Sr/Ca varies by only a factor of 1.6. Differences between clone groups were similar to the intra-clone group variability in element composition, suggesting that any genetic differences between the clone groups studied here do not affect trace element partitioning. Instead, variability in Mg/Ca appears to be inherent to the process of bio-calcification itself. The variability in Mg/Ca between chambers shows that measurements of at least 6 different chambers are required to determine the mean Mg/Ca value for a cultured foraminiferal test with a precision of ≤10%.
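The closing estimate (at least 6 chambers for ≤10% precision on the mean) is a standard-error argument: with a relative standard deviation s per chamber, the relative standard error of the mean of n chambers is s/√n. A sketch under an assumed per-chamber scatter (the 24% figure below is illustrative, not a value from the paper):

```python
import math

def chambers_needed(rel_sd, target_rel_sem):
    """Smallest n such that rel_sd / sqrt(n) <= target_rel_sem."""
    return math.ceil((rel_sd / target_rel_sem) ** 2)

# Assumed 24% relative SD per chamber, targeting 10% precision on the mean:
print(chambers_needed(0.24, 0.10))  # -> 6
```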
Abstract:
The aim of inertial confinement fusion is the production of energy by the fusion of thermonuclear fuel (deuterium-tritium) enclosed in a spherical target, driven by its implosion. In the direct-drive approach, the energy needed to spark the fusion reactions is delivered by the irradiation of laser beams, which leads to the ablation of the outer shell of the target (the so-called ablator). As a reaction to this ablation process, the target is accelerated inwards, and, provided that this implosion is sufficiently strong and symmetric, the requirements of temperature and pressure in the center of the target are met, leading to ignition of the target (fusion). One of the obstacles that can prevent an appropriate target implosion arises in the ablation region, where any perturbation can grow, due to the ablative Rayleigh-Taylor instability, even to the point of breaking the ablator shell. The ablative Rayleigh-Taylor instability has been extensively studied over the last 40 years for the case in which the density/temperature profiles in the ablation region present a single front (the ablation front). Single ablation fronts appear when the ablator material has a low atomic number (deuterium/tritium ice, plastic). In this case, the main mechanism of energy transport from the laser energy absorption region (low-density plasma) to the ablation region is electron thermal conduction. However, the recent use of materials with a moderate atomic number (silica, doped plastic) as ablators, with the aim of reducing the target pre-heating caused by suprathermal electrons generated by the laser-plasma interaction, has revealed an ablation region composed of two ablation fronts. This arises from the increasing importance of radiative effects in the energy transport. The linear theory describing the Rayleigh-Taylor instability for single ablation fronts cannot be applied to the stability analysis of double ablation front structures.
Therefore, the aim of this thesis is to develop, for the first time, a linear stability theory for this type of hydrodynamic structure.
Abstract:
Monolithic series interconnection of silicon thin-film solar cell modules, performed by laser scribing, plays a very important role in the production of these devices. In the current laser interconnection process, the last two steps are developed for a module configuration in which the glass is essential as a transparent substrate. In addition, a change of wavelength in the laser sources employed is sometimes required due to the nature of the different materials of the multilayer structure that makes up the device. The aim of this work is to characterize the laser patterning involved in the monolithic interconnection process in configurations different from those usually processed with visible laser sources. To carry out this study, we use nanosecond and picosecond laser sources working at a wavelength of 355 nm in order to achieve selective ablation of the material from the film side. To assess this selective removal of material, EDX (energy-dispersive X-ray) analysis has been used.
Abstract:
The linear stability analysis of accelerated double ablation fronts is carried out numerically with a self-consistent approach. Accurate hydrodynamic profiles are taken into account in the theoretical model by means of a fitting-parameter method using 1D simulation results. The numerical dispersion relation is compared to an analytical sharp-boundary model [Yañez et al., Phys. Plasmas 18, 052701 (2011)], showing excellent agreement for the radiation-dominated regime of very steep ablation fronts, and stabilization due to smooth profiles. 2D simulations are presented to validate the numerical self-consistent theory.
Abstract:
Laser material processing is being extensively used in photovoltaic applications, both for the fabrication of thin-film modules and for the enhancement of crystalline silicon solar cells. In this paper, the two-temperature model for thermal diffusion was solved numerically. Laser pulses of 1064, 532 or 248 nm with durations of 35, 26 or 10 ns were considered as the thermal source leading to material ablation. Considering high irradiance levels (10⁸-10⁹ W cm⁻²), total absorption of the energy during the ablation process was assumed in the model. The materials analysed in the simulation were aluminium (Al) and silver (Ag), which are commonly used as metallic electrodes in photovoltaic devices. Moreover, thermal diffusion was also simulated for crystalline silicon (c-Si). A similar trend of temperature as a function of depth and time was found for both metals and c-Si regardless of the employed wavelength. For each material, the dependence of the ablation depth on the laser pulse parameters was determined by means of an ablation criterion: after the laser pulse, the maximum depth for which the total energy stored in the material equals the vaporisation enthalpy was taken as the ablation depth. For all cases, the ablation depth increased with the laser pulse fluence and did not exhibit a clear correlation with the radiation wavelength. Finally, experimental validation of the simulation results was carried out, and the ability of the model, under the initial hypothesis of total energy absorption, to closely fit experimental results was confirmed.
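The ablation criterion described above (the deepest point at which the stored energy still reaches the vaporisation enthalpy) can be sketched as a simple post-processing step on a simulated energy-density profile. The exponential profile below is made up for illustration, standing in for the two-temperature model output; it is not from the paper:

```python
import numpy as np

def ablation_depth(z, energy_density, h_vap):
    """Deepest depth z (m) at which the stored energy density (J/m^3)
    still reaches the volumetric vaporisation enthalpy h_vap."""
    ablated = energy_density >= h_vap
    return float(z[ablated].max()) if ablated.any() else 0.0

# Illustrative profile: energy decaying exponentially over ~100 nm.
z = np.linspace(0.0, 500e-9, 501)        # depth grid, 0-500 nm in 1 nm steps
e = 3e10 * np.exp(-z / 100e-9)           # hypothetical stored energy, J/m^3
print(ablation_depth(z, e, h_vap=1e10))  # depth where e drops below h_vap
```

Increasing the surface energy (i.e., the fluence) pushes the crossing point deeper, reproducing the reported trend of ablation depth growing with pulse fluence.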