Abstract:

AIM: To estimate the incidence of severe chemical corneal injuries in the UK and describe presenting clinical features and initial management.

METHODS: All patients with severe chemical corneal injury in the UK from December 2005 to November 2006 inclusive were prospectively identified using the British Ophthalmological Surveillance Unit. Reporting ophthalmologists provided information regarding presentation and follow-up.

RESULTS: Twelve cases were identified, giving a minimum estimated incidence of severe chemical corneal injury in the UK of 0.02 per 100,000. 66.7% of injuries occurred in males of working age, 50% occurred at work, and alkali was the causative agent in 66.7%. Only one patient was wearing eye protection at the time of injury; 75% received immediate irrigation. Six patients required one or more surgical procedures, most commonly amniotic membrane graft. At 6 months' follow-up, the best-corrected visual acuity was 6/12 or better in five patients and worse than 6/60 in two.
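As a quick check of the reported rate, here is a minimal sketch of the incidence arithmetic in Python; the UK population denominator is an assumption for illustration, not stated in the abstract.

```python
# Minimum incidence estimate: cases identified over one year divided by the
# population at risk. The UK population figure (~60 million for 2006) is an
# assumption for illustration; the abstract does not give the denominator.
cases_per_year = 12
uk_population = 60_000_000  # assumed

incidence_per_100k = cases_per_year / uk_population * 100_000
print(f"{incidence_per_100k:.2f} per 100,000")  # 0.02 per 100,000
```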

CONCLUSION: The incidence of severe chemical corneal injury in the UK is low. The cases that occur can require extended hospital treatment, with substantial ocular morbidity and visual sequelae. Current enforcement of eye protection in the workplace in the UK has probably contributed to a reduced incidence of severe ocular burns.

Abstract:

Purpose: The authors sought to quantify neighboring and distant interpoint correlations of threshold values within the visual field in patients with glaucoma. Methods: Visual fields of patients with confirmed or suspected glaucoma were analyzed (n = 255). One eye per patient was included. Patients were examined using Program 32 of the Octopus 1-2-3 perimeter. Linear regression analysis between each location and the remaining points of the visual field was performed, and the correlation coefficient was calculated. The degree of correlation was categorized as high (r > 0.66), moderate (0.66 ≥ r > 0.33), or low (r ≤ 0.33). The standard error of threshold estimation was calculated. Results: Most locations of the visual field had high or moderate correlations with neighboring points and with distant locations corresponding to the same nerve fiber bundle. Locations of the visual field had low correlations with those of the opposite hemifield, with the exception of locations temporal to the blind spot. The standard error of threshold estimation increased from 0.6 to 0.9 dB with a 0.1 reduction in r. Conclusion: Locations of the visual field have the highest interpoint correlations with neighboring points and with distant points in areas corresponding to the distribution of the retinal nerve fiber layer. The quantification of interpoint correlations may be useful in the design and interpretation of visual field tests in patients with glaucoma.
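A minimal sketch of the kind of interpoint correlation analysis described above; the data are simulated stand-ins, whereas the original analysis used measured thresholds (dB) from 255 Program 32 fields.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative stand-in data: rows = patients, columns = test locations.
# A real analysis would use measured threshold values from Program 32.
n_patients, n_locations = 255, 76
thresholds = rng.normal(25, 5, size=(n_patients, n_locations))

# Pairwise Pearson correlations between all pairs of locations.
r = np.corrcoef(thresholds, rowvar=False)

# Categorise each correlation using the abstract's cut-offs.
def category(rv: float) -> str:
    if rv > 0.66:
        return "high"
    if rv > 0.33:
        return "moderate"
    return "low"

i, j = 0, 1  # any pair of locations
print(f"r({i},{j}) = {r[i, j]:.2f} -> {category(abs(r[i, j]))}")
```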

Abstract:

Background
Inappropriate polypharmacy is a particular concern in older people and is associated with negative health outcomes. Interest in appropriate polypharmacy, where many medicines may be used to achieve better clinical outcomes for patients, is therefore growing, and choosing the best interventions to improve appropriate polypharmacy is a priority.

Objectives
This review sought to determine which interventions, alone or in combination, are effective in improving the appropriate use of polypharmacy and reducing medication-related problems in older people.

Search methods
In November 2013, for this first update, a range of literature databases including MEDLINE and EMBASE were searched, and handsearching of reference lists was performed. Search terms included 'polypharmacy', 'medication appropriateness' and 'inappropriate prescribing'.

Selection criteria
A range of study designs were eligible. Eligible studies described interventions affecting prescribing aimed at improving appropriate polypharmacy in people 65 years of age and older in which a validated measure of appropriateness was used (e.g. Beers criteria, Medication Appropriateness Index (MAI)).

Data collection and analysis
Two review authors independently reviewed abstracts of eligible studies, extracted data and assessed risk of bias of included studies. Study-specific estimates were pooled, and a random-effects model was used to yield summary estimates of effect and 95% confidence intervals (CIs). The GRADE (Grades of Recommendation, Assessment, Development and Evaluation) approach was used to assess the overall quality of evidence for each pooled outcome.
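The pooling described is a standard inverse-variance random-effects analysis; below is a minimal DerSimonian-Laird sketch using hypothetical per-study mean differences, not the review's data.

```python
import numpy as np

def random_effects_pool(md, se):
    """DerSimonian-Laird random-effects pooling of mean differences.
    md: per-study mean differences; se: their standard errors."""
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se**2                      # fixed-effect (inverse-variance) weights
    md_fixed = np.sum(w * md) / np.sum(w)
    q = np.sum(w * (md - md_fixed)**2)   # Cochran's Q heterogeneity statistic
    df = len(md) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)        # between-study variance estimate
    w_re = 1.0 / (se**2 + tau2)          # random-effects weights
    pooled = np.sum(w_re * md) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study MAI mean differences and standard errors,
# purely for illustration -- not the review's data.
pooled, ci = random_effects_pool([-7.0, -4.5, -9.2, -6.1], [2.0, 1.5, 3.0, 2.2])
print(f"MD {pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```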

Main results
Two studies were added to this review to bring the total number of included studies to 12. One intervention consisted of computerised decision support; the other 11 were complex, multi-faceted pharmaceutical-care-based approaches delivered in a variety of settings. Interventions were delivered by healthcare professionals such as prescribers and pharmacists. Appropriateness of prescribing was measured using validated tools, including the MAI score post intervention (eight studies), Beers criteria (four studies), STOPP criteria (two studies) and START criteria (one study). Interventions included in this review resulted in a reduction in inappropriate medication usage. Based on the GRADE approach, the overall quality of evidence for all pooled outcomes ranged from very low to low. A greater reduction in MAI scores between baseline and follow-up was seen in the intervention group when compared with the control group (four studies; mean difference -6.78, 95% CI -12.34 to -1.22). Post-intervention pooled data showed a lower summated MAI score (five studies; mean difference -3.88, 95% CI -5.40 to -2.35) and fewer Beers drugs per participant (two studies; mean difference -0.1, 95% CI -0.28 to 0.09) in the intervention group compared with the control group. Evidence of the effects of interventions on hospital admissions (five studies) and medication-related problems (six studies) was conflicting.

Authors' conclusions
It is unclear whether interventions to improve appropriate polypharmacy, such as pharmaceutical care, resulted in clinically significant improvement; however, they appear beneficial in terms of reducing inappropriate prescribing.

Abstract:

Background: There are many issues regarding the use of real patients in objective structured clinical examinations (OSCEs). In dermatology OSCE stations, standardised patients (SPs) with clinical photographs are often used. Temporary transfer tattoos can potentially simulate skin lesions when applied to an SP. This study aimed to appraise the use of temporary malignant melanoma tattoos within an OSCE framework. Method: Within an 11-station OSCE, a temporary malignant melanoma tattoo was developed and applied to SPs in a 'skin lesion' OSCE station. A questionnaire captured the opinions of candidates, SPs and examiners, and the degree of perceived realism of each station was determined. Standard post hoc OSCE analysis determined the psychometric reliability of the stations. Results: The response rates were 95.9 per cent for candidates and 100 per cent for examiners and SPs. The 'skin lesion' station achieved the highest realism score of all stations: 89.0 per cent of candidates felt that the skin lesion appeared realistic; only 28 per cent of candidates had ever seen a melanoma before in training. The psychometric performance of the melanoma station was comparable with, and in many instances better than, that of the other OSCE stations. Discussion: Transfer tattoo technology facilitates a realistic dermatology OSCE station encounter. Temporary tattoos, alongside trained SPs, provide an authentic, standardised and reliable experience, allowing the assessment of integrated dermatology clinical skills.
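The abstract does not name the reliability statistic used; assuming the common choice for post hoc OSCE analysis, Cronbach's alpha, a minimal sketch follows (all scores are simulated stand-ins).

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha. scores: candidates x stations matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
# Illustrative stand-in: 120 candidates x 11 stations. With independent
# random scores alpha will be near zero; real station scores correlate,
# giving the higher values expected of a reliable OSCE.
scores = rng.normal(60, 10, size=(120, 11))
print(f"alpha = {cronbach_alpha(scores):.2f}")
```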

Abstract:

Background: More accurate coronary heart disease (CHD) prediction, specifically in middle-aged men, is needed to reduce the burden of disease more effectively. We hypothesised that a multilocus genetic risk score could refine CHD prediction beyond classic risk scores, yielding more precise risk estimates, and tested this in a prospective cohort design.

Methods: Using data from nine prospective European cohorts including 26,221 men, we applied a case-cohort design, selecting 4,818 men who were healthy at baseline, and used Cox proportional hazards models to examine associations between CHD and risk scores based on genetic variants representing 13 genomic regions. Over follow-up (range: 5-18 years), 1,736 incident CHD events occurred. Genetic risk scores were validated in men with at least 10 years of follow-up (632 cases, 1,361 non-cases). Genetic risk score 1 (GRS1) combined 11 SNPs and two haplotypes, with effect estimates taken from previous genome-wide association studies. GRS2 combined the 11 SNPs plus 4 SNPs from the haplotypes, with coefficients estimated from these prospective cohorts using 10-fold cross-validation. Scores were added to a model adjusted for the classic risk factors comprising the Framingham risk score, and 10-year risks were derived.
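A minimal sketch of the modelling step: a weighted multilocus risk score entered into a Cox proportional hazards model. All weights and data below are simulated stand-ins, not the study's; the sketch assumes the lifelines package.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000  # illustrative sample; the study used a case-cohort of 4,818 men

# A weighted multilocus genetic risk score: risk-allele counts (0/1/2)
# weighted by per-SNP log-odds from prior GWAS (weights assumed here).
n_snps = 11
genotypes = rng.integers(0, 3, size=(n, n_snps))
weights = rng.normal(0.1, 0.05, size=n_snps)  # stand-in effect estimates
grs = genotypes @ weights

df = pd.DataFrame({
    "grs": grs,
    "framingham": rng.normal(10, 5, size=n),  # stand-in classic risk score
    "time": rng.exponential(12, size=n),      # follow-up time (years)
    "chd": rng.integers(0, 2, size=n),        # event indicator
})

# Cox proportional hazards model: GRS added on top of classic risk factors.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="chd")
print(cph.summary[["coef", "exp(coef)", "p"]])
```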

Results: Both scores improved net reclassification (NRI) over the Framingham score (7.5%, p = 0.017 for GRS1; 6.5%, p = 0.044 for GRS2), but GRS2 also improved discrimination (c-index improvement 1.11%, p = 0.048). In the subgroup of men aged 50-59 (436 cases, 603 non-cases), net reclassification improved further for GRS1 (13.8%) and GRS2 (12.5%). Net reclassification improvement remained significant for both scores when family history of CHD was added to the baseline model for this subgroup, improving prediction of early-onset CHD events.
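Net reclassification improvement compares risk categories assigned with and without the score; a minimal sketch under assumed category cut-points (the abstract does not give the cut-points used, and the data are simulated).

```python
import numpy as np

def categorical_nri(p_old, p_new, event, cuts=(0.1, 0.2)):
    """Net reclassification improvement over risk categories.
    p_old/p_new: predicted 10-year risks without/with the GRS;
    event: 1 if CHD occurred. Category cut-points are assumptions."""
    old = np.digitize(p_old, cuts)
    new = np.digitize(p_new, cuts)
    ev, nev = event == 1, event == 0
    up_ev, down_ev = np.mean(new[ev] > old[ev]), np.mean(new[ev] < old[ev])
    up_ne, down_ne = np.mean(new[nev] > old[nev]), np.mean(new[nev] < old[nev])
    # Upward moves help among events; downward moves help among non-events.
    return (up_ev - down_ev) + (down_ne - up_ne)

rng = np.random.default_rng(3)
p_old = rng.uniform(0, 0.4, 2000)                      # baseline model risks
p_new = np.clip(p_old + rng.normal(0, 0.03, 2000), 0, 1)  # with GRS added
event = rng.binomial(1, p_old)
print(f"NRI = {categorical_nri(p_old, p_new, event):.3f}")
```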

Conclusions: Genetic risk scores add precision to risk estimates for CHD and improve prediction beyond classic risk factors, particularly for middle-aged men.

Abstract:

The abrasion seen on some retrieved CoCrMo hip joints has been reported to be caused by hard particles entrained in vivo. However, little work has been reported on the abrasion mechanisms of CoCrMo alloy in simulated body environments. Therefore, this study covers the mapping of micro-abrasion wear mechanisms of cast CoCrMo induced by third-body hard particles under a wide range of abrasive test conditions, with a specific focus on the possible in vivo wear modes seen on metal-on-metal (MoM) surfaces. Nano-indentation and nano-scratch tests were also employed to further investigate the secondary wear mechanisms (nano-scale material deformation) involved in micro-abrasion processes. This work addresses the potential detrimental effects of third-body hard particles in vivo, such as increased wear rates (debris generation) and corrosion (metal-ion release). The abrasive wear mechanisms of cast CoCrMo were investigated under various wear-corrosion conditions employing two abrasives, SiC (~4 μm) and Al2O3 (~1 μm), in two test solutions, 0.9% NaCl and 25% bovine serum. The specific wear rates, wear mechanisms and transitions between mechanisms are discussed in terms of the abrasive size, volume fraction and test solution deployed. The work shows that at high abrasive volume fractions, the presence of protein enhanced wear loss through enhanced particle entrainment, whereas at much lower abrasive volume fractions, protein reduced wear loss by acting as a boundary lubricant or by allowing particles to act as rolling elements, which reduced the abrasivity (load per particle) of the abrasive particles. The abrasive wear rate and wear mechanisms of CoCrMo are dependent on the nature of the third-body abrasives, their entrainment into the contact and the presence of proteins.
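Specific wear rates in micro-abrasion testing are conventionally expressed as wear volume per unit load per unit sliding distance; a minimal sketch of that calculation with illustrative values (not the paper's data).

```python
# Specific wear rate k = V / (N * s): wear volume per unit normal load per
# unit sliding distance, the usual way micro-abrasion results are reported.
# All numbers below are illustrative assumptions, not data from the study.
wear_volume_mm3 = 0.012    # measured wear scar volume (mm^3)
load_n = 0.25              # applied normal load (N)
sliding_distance_m = 50.0  # total sliding distance (m)

k = wear_volume_mm3 / (load_n * sliding_distance_m)
print(f"k = {k:.2e} mm^3 N^-1 m^-1")
```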

Abstract:

Aims To determine whether the financial incentives for tight glycaemic control, introduced in the UK as part of a pay-for-performance scheme in 2004, increased the rate at which people with newly diagnosed Type 2 diabetes were started on anti-diabetic medication.

Methods A secondary analysis of data from the General Practice Research Database for the years 1999-2008 was performed using an interrupted time series analysis of treatment patterns for people newly diagnosed with Type 2 diabetes (n = 21,197).

Results Overall, the proportion of people with newly diagnosed diabetes managed without medication was 47% at 12 months after diagnosis and 40% at 24 months. The annual rate of initiation of pharmacological treatment within 12 months of diagnosis was decreasing by 1.2% per year (95% CI -2.0, -0.5%) before the introduction of the pay-for-performance scheme and increased by 1.9% per year (95% CI 1.1, 2.7%) after the introduction of the scheme. The equivalent figures for treatment within 24 months of diagnosis were -1.4% (95% CI -2.1, -0.8%) before the scheme was introduced and 1.6% (95% CI 0.8, 2.3%) after.
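A minimal sketch of the segmented regression behind an interrupted time series analysis, with a simulated annual series shaped like the reported trends (not the GPRD data); the sketch assumes statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Segmented (interrupted time series) regression: a pre-intervention trend,
# plus level and slope changes at the 2004 pay-for-performance start.
# The series is simulated: pre-scheme slope -1.2%/yr, post-scheme +1.9%/yr.
years = np.arange(1999, 2009)
post = (years >= 2004).astype(int)
rng = np.random.default_rng(4)
rate = (55 - 1.2 * (years - 1999) + 3.1 * post * (years - 2004)
        + rng.normal(0, 0.5, len(years)))

df = pd.DataFrame({
    "rate": rate,
    "time": years - 1999,               # overall time trend
    "post": post,                       # level change at the scheme start
    "time_post": post * (years - 2004)  # change in slope after the scheme
})
fit = smf.ols("rate ~ time + post + time_post", data=df).fit()
print(fit.params)  # 'time' ~ pre-trend; 'time' + 'time_post' ~ post-trend
```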

Conclusion The present study suggests that the introduction of financial incentives in 2004 has effected a change in the management of people newly diagnosed with diabetes. We conclude that a greater proportion of people with newly diagnosed diabetes are being initiated on medication within 1 and 2 years of diagnosis as a result of the introduction of financial incentives for tight glycaemic control.

Abstract:

Implications Provision of environmental enrichment in line with that required by welfare-based quality assurance schemes does not always appear to lead to clear improvements in broiler chicken welfare. This research perhaps serves to highlight the deficit in information regarding the 'real world' implications of enrichment with perches, string and straw bales.

Introduction Earlier work showed that provision of natural light and straw bales improved leg health in commercial broiler chickens (Bailie et al., 2013). This research aimed to determine if additional welfare benefits were shown in windowed houses by increasing straw bale provision (Study 1), or by providing perches and string in addition to straw bales (Study 2).

Material and methods Commercial windowed houses in Northern Ireland containing ~23,000 broiler chickens (placed in houses as hatched) were used in this research, which took place in 2011. In Study 1, two houses on a single farm were assigned to one of two treatments: (1) 30 straw bales per house (1 bale/44 m2), or (2) 45 straw bales per house (1 bale/29 m2). Bales of wheat straw, each measuring 80 cm x 40 cm x 40 cm, were provided from day 10 of the rearing cycle, as in Bailie et al. (2013). Treatments were replicated over 6 production cycles (using 276,000 Ross 308 and Cobb birds), and were swapped between houses in each replicate. In Study 2, four houses on a single farm were assigned to 1 of 4 treatments in a 2 x 2 factorial design. Treatments involved 2 levels of access to perches (present (24/house) or absent) and 2 levels of access to string (present (24/house) or absent), and both types of enrichment were provided from the start of the cycle. Each perch consisted of a horizontal wooden beam (300 cm x 5 cm x 5 cm) with a rounded upper edge resting on 2 supports (15 cm high). In the string treatment, 6 pieces of white nylon string (60 cm x 10 mm) were tied at their mid-point to the wire above each of 4 feeder lines. Thirty straw bales were also provided per house from day 10. This study was replicated over 4 production cycles using 368,000 Ross 308 birds. In both studies behaviour was observed between 0900 and 1800 hours in weeks 3-5 of the cycle. In Study 1, 8 focal birds were selected in each house each week, and general activity, exploratory and social behaviours were recorded directly for 10 minutes. In Study 2, 10-minute video recordings were made of 6 different areas (that did not contain enrichment) of each house each week. The percentage of birds engaged in locomotion or standing was determined through scan sampling these recordings at 120-second intervals. Four perches and four pieces of string were filmed for 25 min in each house that contained these enrichments on one day per week. The total number of times the perch or string was used was recorded, along with the duration of each bout. In both studies, gait scores (0 (perfect) to 5 (unable to walk)) and latency to lie (measured in seconds from when a bird had been encouraged to stand) were recorded in 25 birds in each house each week. Farm and abattoir records were also used in both studies to determine the number of birds culled for leg and other problems, mortality levels, slaughter weights, and levels of pododermatitis and hock burn. Data were analysed using SPSS (version 20.0); treatment and age effects on behavioural parameters were determined in normally distributed data using ANOVA ('straw bale density*week' or 'string*perches*week' as appropriate), and in non-normally distributed data using Kruskal-Wallis tests (P<0.05 for significance). Treatment (but not age) effects on performance and health data were determined using the same tests, depending on normality of the data.
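A minimal sketch of the statistical comparisons described, with simulated latency-to-lie values standing in for the per-bird measurements (the paper's analysis was run in SPSS; this sketch uses SciPy).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Simulated latency-to-lie values (s) for the two straw bale densities;
# illustrative stand-ins for the per-bird measurements in Study 1.
latency_30sb = rng.gamma(4, 6, size=150)
latency_45sb = rng.gamma(4, 5, size=150)

# Kruskal-Wallis test for non-normally distributed outcomes, as described.
h, p = stats.kruskal(latency_30sb, latency_45sb)
print(f"H = {h:.2f}, P = {p:.3f}")

# Normally distributed outcomes were compared with ANOVA instead.
f, p_anova = stats.f_oneway(latency_30sb, latency_45sb)
print(f"F = {f:.2f}, P = {p_anova:.3f}")
```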

Results Average slaughter weight and levels of mortality, culling, hock burn and pododermatitis were not affected by treatment in either study (P>0.05). In Study 1, straw bale (SB) density had no significant effect on the frequency or duration of behaviours including standing, walking, ground pecking, dust bathing, pecking at bales or aggression, or on average gait score (P>0.05). However, the average latency to lie was greater when fewer SB were provided (30SB 23.38 s, 45SB 18.62 s, P<0.01). In Study 2 there was an interaction between perches (Pe) and age in lying behaviour, with higher percentages of birds observed lying in the Pe treatment during weeks 4 and 5 (week 3 +Pe 77.0, -Pe 80.9; week 4 +Pe 79.5, -Pe 75.2; week 5 +Pe 78.4, -Pe 76.2; P<0.02). There was also a significant interaction between string (S) and age in locomotory behaviour, with higher percentages of birds observed in locomotion in the string treatment during week 3 but not weeks 4 and 5 (week 3 +S 4.9, -S 3.9; week 4 +S 3.3, -S 3.7; week 5 +S 2.6, -S 2.8; P<0.04). There was also an interaction between S and age in average gait scores, with lower gait scores in the string treatment in weeks 3 and 5 (week 3: +S 0.7, -S 0.9; week 4: +S 1.5, -S 1.4; week 5: +S 1.9, -S 2.0; P<0.05). On average, per 25-min observation there were 15.1 (±13.6) bouts of perching and 19.2 (±14.08) bouts of string pecking, lasting 117.4 (±92.7) and 4.2 (±2.0) s for perches and string, respectively.

Conclusion Increasing straw bale levels from 1 bale/44 m2 to 1 bale/29 m2 of floor space does not appear to lead to significant improvements in the welfare of broilers in windowed houses. The frequent use of perches and string suggests that these stimuli have the potential to improve welfare. Provision of string also appeared to positively influence walking ability. However, this effect was numerically small, was only shown in certain weeks and was not reflected in the latency to lie. Further research on the optimum design and level of provision of enrichment items for broiler chickens is warranted. This should include measures of overall levels of activity (both in the vicinity of, and away from, enrichment items).

Abstract:

BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test is a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.

METHODS AND FINDINGS: Data to evaluate the procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests as an indicator of Meningococcal Disease was determined. Summary receiver operating characteristic (SROC) curve analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source by comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests, and standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity = 0.89, 95% CI = 0.76-0.96; specificity = 0.74, 95% CI = 0.4-0.92) for early Meningococcal Disease than standard testing alone (sensitivity = 0.47, 95% CI = 0.32-0.62; specificity = 0.8, 95% CI = 0.64-0.9). Decision analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was -£8,137.25 (-US$13,371.94) per correctly treated patient.
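The incremental cost-effectiveness ratio compares the added cost of the procalcitonin strategy with its added effectiveness; a minimal sketch with illustrative cost and effectiveness inputs (only the final ICER above comes from the study).

```python
# Incremental cost-effectiveness ratio: extra cost per extra correctly
# treated patient when adding procalcitonin to standard testing.
# The inputs below are illustrative assumptions; only the study's final
# ICER (-£8,137.25 per correctly treated patient) comes from the abstract.
cost_standard, eff_standard = 1200.0, 0.47  # cost (£), P(correctly treated)
cost_pct, eff_pct = 950.0, 0.89             # with procalcitonin added

icer = (cost_pct - cost_standard) / (eff_pct - eff_standard)
print(f"ICER = £{icer:.2f} per additional correctly treated patient")
# A negative ICER with higher effectiveness means the new strategy
# dominates: it is both cheaper and more effective.
```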

CONCLUSIONS: Procalcitonin plus standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended into point-of-care procalcitonin testing and Markov modelling to incorporate cost per QALY with a lifetime model.

Abstract:

Objective: This study aimed to compare two different tooth replacement strategies for partially dentate older patients, namely removable partial dentures (RPDs) and functionally orientated treatment based on the shortened dental arch (SDA) concept. Method: 88 partially dentate older patients (mean age 69.4 years) completed a randomised controlled clinical trial. 43 patients received RPDs and 45 received functionally orientated treatment in which resin-bonded bridgework was used to provide 10 pairs of occluding contacts. Patients were followed for 1 year after the treatment intervention. The impact of treatment on oral health-related quality of life (OHRQoL) and cost-effectiveness were used as outcome measures. Each patient completed the short form of the Oral Health Impact Profile (OHIP-14) at baseline, 6 months and 1 year after the treatment intervention. All costs involved in providing and maintaining each intervention were recorded, including dental laboratory bills, materials and professional time. Result: Both the RPD (p=0.004) and the functionally orientated (p<0.001) treatment groups demonstrated statistically significant improvements in OHRQoL 1 year after the treatment intervention. On average, 9.4 visits were required to complete and maintain the RPDs over the 1-year period, compared with 5.3 visits for the functionally orientated group. The average laboratory cost for the RPDs was $537.45 per patient versus $367.89 for functionally orientated treatment. The cost of achieving the Minimally Important Difference of 5 scale points in OHIP-14 score was $732.17 with RPDs and $356.88 with functionally orientated treatment. Functionally orientated treatment was therefore more than twice as cost-effective (1:2.05). Conclusion: For partially dentate older patients, functionally orientated treatment based on the SDA concept resulted in sustained, significant improvements in OHRQoL and was more than twice as cost-effective as conventional treatment using RPDs.
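The cost-per-MID figures follow from dividing each arm's cost by the number of 5-point OHIP-14 improvements it buys; a minimal sketch (the OHIP-14 changes used here are hypothetical values chosen to be consistent with the reported ratios, not the trial's data).

```python
# Cost per Minimally Important Difference (MID) achieved: total cost divided
# by the number of 5-point OHIP-14 improvements obtained. Laboratory costs
# are from the abstract; the OHIP-14 changes are illustrative assumptions.
MID = 5.0  # scale points on OHIP-14

def cost_per_mid(total_cost: float, ohip_change: float) -> float:
    return total_cost / (ohip_change / MID)

# Hypothetical mean OHIP-14 improvements consistent with the reported ratios:
print(f"RPD: ${cost_per_mid(537.45, 3.67):.2f} per MID")  # ~ $732
print(f"SDA: ${cost_per_mid(367.89, 5.15):.2f} per MID")  # ~ $357
```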

Abstract:

Introduction: In this cohort study, we explored the relationship between fluid balance, intradialytic hypotension and outcomes in critically ill patients with acute kidney injury (AKI) who received renal replacement therapy (RRT).

Methods: We analysed prospectively collected registry data on patients older than 16 years who received RRT for at least two days in an intensive care unit at two university-affiliated hospitals. We used multivariable logistic regression to determine the relationship between mean daily fluid balance and intradialytic hypotension, both over seven days following RRT initiation, and the outcomes of hospital mortality and RRT dependence in survivors.
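A minimal sketch of the multivariable logistic regression described, with simulated data standing in for the registry variables (the sketch assumes statsmodels).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 492  # cohort size from the study; the data below are simulated

df = pd.DataFrame({
    "fluid_balance_l": rng.normal(0.5, 1.0, n),  # mean daily balance (L)
    "idh_pct_days": rng.uniform(0, 100, n),      # % of days with hypotension
    "age": rng.normal(63, 16, n),
})
# Simulated mortality with positive effects of both exposures, for shape only.
logit = 0.3 * df["fluid_balance_l"] + 0.013 * df["idh_pct_days"] - 1.5
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Multivariable logistic regression for hospital mortality.
X = sm.add_constant(df[["fluid_balance_l", "idh_pct_days", "age"]])
fit = sm.Logit(df["died"], X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios per unit of each covariate
```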

Results: In total, 492 patients were included (299 male (60.8%); mean (standard deviation (SD)) age 62.9 (16.3) years); 251 (51.0%) died in hospital. Independent risk factors for mortality were mean daily fluid balance (odds ratio (OR) 1.36 per 1000 mL positive (95% confidence interval (CI) 1.18 to 1.57)), intradialytic hypotension (OR 1.14 per 10% increase in days with intradialytic hypotension (95% CI 1.06 to 1.23)), age (OR 1.15 per five-year increase (95% CI 1.07 to 1.25)), maximum sequential organ failure assessment score on days 1 to 7 (OR 1.21 (95% CI 1.13 to 1.29)), and Charlson comorbidity index (OR 1.28 (95% CI 1.14 to 1.44)); higher baseline creatinine (OR 0.98 per 10 μmol/L (95% CI 0.97 to 0.996)) was associated with a lower risk of death. Of 241 hospital survivors, 61 (25.3%) were RRT dependent at discharge. The only independent risk factor for RRT dependence was pre-existing heart failure (OR 3.13 (95% CI 1.46 to 6.74)). Neither mean daily fluid balance nor intradialytic hypotension was associated with RRT dependence in survivors. Associations between these exposures and mortality were similar in sensitivity analyses accounting for immortal time bias and dichotomising mean daily fluid balance as positive or negative. In the subgroup of patients with data on pre-RRT fluid balance, fluid overload at RRT initiation did not modify the association of mean daily fluid balance with mortality.

Conclusions: In this cohort of patients with AKI requiring RRT, a more positive mean daily fluid balance and intradialytic hypotension were associated with hospital mortality but not with RRT dependence at hospital discharge in survivors.

Abstract:

PURPOSE: To study willingness to pay for cataract surgery, and its associations, in Southern China. DESIGN: Cross-sectional willingness-to-pay interview incorporating elements of the open-ended and bidding formats. PARTICIPANTS: Three hundred thirty-nine persons presenting for cataract screening in Yangjiang, China, with presenting visual acuity (VA) ≤6/60 in either eye due to cataract. METHODS: Subjects underwent measurement of their VA and a willingness-to-pay interview. Age, gender, literacy, education, and annual income were also recorded. MAIN OUTCOME MEASURES: Maximum amount that subjects would be willing to pay for cataract surgery. RESULTS: Among the 325 (95.9%) subjects completing the interview, 169 (52.0%) were 70 years or older, 213 (65.5%) were women, and 217 (66.8%) had an annual income of <5,000 renminbi (5,000 renminbi = US$625). Eighty percent (n = 257) of participants were willing to pay something for surgery (mean, 442 ± 444 renminbi [US$55 ± 55]). In regression models, older subjects were willing to pay less (8 renminbi [US$1] per year of age; P = 0.01). Blind subjects were significantly more likely (odds ratio, 5.7; 95% confidence interval, 1.7-19.3) to pay anything for surgery, but would pay on average 255 renminbi (US$32) less (P = 0.004). Persons at the highest annual income level (>10,000 renminbi [US$1,250]) would pay US$50 more for surgery than those at the lowest level (<5,000 renminbi) (P = 0.0003). The current cost of surgery in this program is 500 renminbi (US$63). CONCLUSIONS: Sustainable programs will need to attract younger, better-off persons with better vision, while still providing access to the neediest patients.
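A minimal sketch of the kind of regression model reported, with simulated interview data standing in for the study's; the simulated coefficients are chosen only to echo, not reproduce, the reported associations.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 325  # completed interviews; the data below are simulated stand-ins

df = pd.DataFrame({
    "age": rng.integers(50, 90, n),
    "blind": rng.binomial(1, 0.2, n),
    "high_income": rng.binomial(1, 0.15, n),
})
# Simulated willingness to pay (renminbi), shaped like the reported effects.
df["wtp"] = np.clip(
    1000 - 8 * df["age"] - 255 * df["blind"] + 400 * df["high_income"]
    + rng.normal(0, 300, n), 0, None)

# Linear regression of maximum willingness to pay on its correlates.
fit = smf.ols("wtp ~ age + blind + high_income", data=df).fit()
print(fit.params)  # e.g. ~ -8 renminbi per year of age, as reported
```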

Abstract:

BACKGROUND AND PURPOSE: Stereotactic ablative radiotherapy (SABR) has become standard for inoperable early-stage non-small cell lung cancer (NSCLC). However, there is no randomized evidence demonstrating benefit over more fractionated radiotherapy. We compared accelerated hypofractionation (AH) and SABR using a propensity score-matched analysis.

MATERIALS AND METHODS: From 1997 to 2007, 119 patients (T1-3N0M0 NSCLC) were treated with AH (48-60 Gy, 12-15 fractions); prior to SABR, this represented our institutional standard. From 2008 to 2012, 192 patients (T1-3N0M0 NSCLC) were treated with SABR (48-52 Gy, 4-5 fractions). A total of 114 patients (57 per cohort) were matched (1:1 ratio, caliper: 0.10) using propensity scores.
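A minimal sketch of 1:1 propensity score matching with a 0.10 caliper, as described; the covariates are simulated, since the abstract does not list the study's matching variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n_ah, n_sabr = 119, 192
df = pd.DataFrame({
    "sabr": np.r_[np.zeros(n_ah), np.ones(n_sabr)].astype(int),
    "age": rng.normal(74, 8, n_ah + n_sabr),       # simulated covariates
    "tumour_cm": rng.gamma(3, 1, n_ah + n_sabr),
})

# 1. Propensity score: probability of receiving SABR given covariates.
X = sm.add_constant(df[["age", "tumour_cm"]])
df["ps"] = sm.Logit(df["sabr"], X).fit(disp=0).predict(X)

# 2. 1:1 greedy nearest-neighbour matching within a caliper of 0.10,
#    without replacement.
caliper = 0.10
treated = df[df["sabr"] == 1].sort_values("ps")
controls = df[df["sabr"] == 0].copy()
pairs = []
for i, row in treated.iterrows():
    if controls.empty:
        break
    d = (controls["ps"] - row["ps"]).abs()
    j = d.idxmin()
    if d[j] <= caliper:
        pairs.append((i, j))
        controls = controls.drop(j)  # each control is used at most once
print(f"{len(pairs)} matched pairs")
```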

RESULTS: Median follow-up (range) was 36.3 (2.5-109.1) months for the AH cohort and 32.5 (0.3-62.6) months for the SABR group. Three-year overall survival (OS) and local control (LC) rates were 49.5% vs. 72.4% [p=0.024; hazard ratio (HR): 2.33 (1.28, 4.23), p=0.006] and 71.9% vs. 89.3% [p=0.077; HR: 5.56 (1.53, 20.2), p=0.009], respectively. On multivariable analysis, tumour diameter and PET staging were predictive for OS, while the only predictive factor for LC was treatment cohort.

CONCLUSIONS: OS and LC were improved with SABR, although OS is more closely related to non-treatment factors. This represents one of the few studies comparing AH to SABR for early-stage lung cancer.

Abstract:

Cationic porphyrins have been widely used as photosensitizers (PSs) in the inactivation of microorganisms, both in biofilms and in planktonic forms. However, the application of curcumin, a natural PS, to the inactivation of biofilms is poorly studied. The objectives of this study were (1) to evaluate and compare the efficiency of a tetracationic porphyrin (Tetra-Py+-Me) and curcumin in the photodynamic inactivation of biofilms of Pseudomonas spp. and the corresponding planktonic forms, and (2) to evaluate the effect of these PSs on cell adhesion and biofilm maturation. In eradication assays, biofilms of Pseudomonas spp. adherent to silicone tubes were subjected to irradiation with white light (180 J cm-2) in the presence of different concentrations (5 and 10 μM) of PS. In colonization experiments, solid supports were immersed in cell suspensions, PS was added and the experimental setup was irradiated (864 J cm-2) during the adhesion phase. After transfer of the solid supports to new PS-containing medium, irradiation (2592 J cm-2) was resumed during biofilm maturation. The inactivation assays of planktonic cells were conducted in cell suspensions with PS added at concentrations equivalent to those used in the experiments with biofilms. The inactivation of planktonic cells and biofilms (eradication and colonization assays) was assessed by quantification of viable cells after plating on solid medium, at the beginning and at the end of the experiments. The results show that the porphyrin Tetra-Py+-Me effectively inactivated planktonic cells (3.7 and 3.0 log) and biofilms of Pseudomonas spp. (3.2 and 3.6 log). In colonization assays, the adhesion of cells was attenuated by 2.2 log, and during the maturation phase a 5.2 log reduction in the concentration of viable cells was observed. Curcumin failed to cause significant inactivation of planktonic cells (0.7 and 0.9 log) and for that reason was not tested in biofilm eradication assays. In colonization assays, curcumin did not affect the adhesion of cells to the solid support and caused a very modest reduction (1.0 log) in the concentration of viable cells during the maturation phase. The results confirm that photodynamic inactivation is a promising strategy for controlling established biofilms and preventing colonization. Curcumin, however, does not represent an advantageous alternative to porphyrins in the case of biofilms of Pseudomonas spp.
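Inactivation here is reported as log10 reductions in viable counts; a minimal sketch of that calculation (the CFU values are illustrative, chosen only to reproduce a reduction of the reported order).

```python
import numpy as np

def log_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction in viable counts, the usual photodynamic
    inactivation endpoint (CFU/mL before vs. after irradiation)."""
    return np.log10(cfu_before) - np.log10(cfu_after)

# Illustrative counts chosen to reproduce a reduction of the order reported
# (e.g. ~3.7 log for planktonic cells with the porphyrin).
print(f"{log_reduction(1e8, 2e4):.1f} log")  # 3.7 log
```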

Abstract:

Background and objective: Under the radiation protection ordinance, the radiation exposure of patients and staff arising from X-ray-based diagnostic procedures in medicine must be kept as low as possible. Achieving this requires professional and considered handling of the X-ray equipment. At present, however, this behaviour can only be taught theoretically, since practising with real radiation is out of the question. This raises the question of how radiation protection training can be supported by conveying this complex subject matter more effectively. Methods: The computer-based training (CBT) system virtX, which supports learning the correct handling of mobile X-ray devices, was extended with radiation protection aspects. A prototype visualisation of the resulting scattered radiation and a display of the useful beam path were integrated. In addition, calculation and display of the virtual entrance dose for the irradiated volume and for the image intensifier area were added. The calculation and visualisation of all these components draw on the C-arm settings that can be parameterised in virtX, e.g. collimator positions, positioning of the X-ray device relative to the irradiated volume, and beam intensity. The extended system was used in a three-day course for operating theatre staff with more than 120 participants and was evaluated using questionnaires. Results: 55 participants returned a completed evaluation questionnaire (response rate 82%). The mean age of the 39 female and 15 male participants (one unspecified) was 33±8 years; their professional experience was 9.37±7 years. Experience with the C-arm was reported as 'none, or introduction only' by one participant (2%), 'operate a C-arm occasionally' by eight participants (14%), and 'operate a C-arm regularly' by 46 (84%). 45 participants (92%) stated that the visualisation of scattered radiation had taught them something new about avoiding unnecessary radiation exposure. Conclusion: Although the visualisation of scattered radiation is still only a prototype, virtX can successfully convey central aspects of, and behaviours for, avoiding unnecessary radiation exposure, thereby closing gaps in traditional radiation protection training.
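virtX derives a virtual entrance dose from the parameterised C-arm settings; the abstract does not give the model, but a minimal sketch of one plausible ingredient, the inverse-square fall-off of dose rate with distance from the focal spot, is shown below (a simplifying assumption for illustration, not virtX's actual formula).

```python
# Inverse-square estimate of the entrance dose rate at the patient surface,
# one plausible building block of a virtual dose display. This is a
# simplification for illustration; virtX's actual model is not given
# in the abstract.
def entrance_dose_rate(dose_rate_ref_mgy_min: float,
                       ref_distance_cm: float,
                       skin_distance_cm: float) -> float:
    """Scale a reference dose rate to the skin distance via 1/r^2."""
    return dose_rate_ref_mgy_min * (ref_distance_cm / skin_distance_cm) ** 2

# e.g. a reference output of 10 mGy/min at 70 cm, patient skin at 40 cm:
print(f"{entrance_dose_rate(10.0, 70.0, 40.0):.1f} mGy/min")  # ~30.6
```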