19 results for setting time
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
The use of polymethylmethacrylate (PMMA) cement to reinforce fragile or broken vertebral bodies (vertebroplasty) leads to extensive bone stiffening. Fractures in the adjacent vertebrae may be a consequence of this procedure, so a PMMA with a reduced Young's modulus may be more suitable. The goal of this study was to produce and characterize stiffness-adapted PMMA bone cements. Porous PMMA bone cements were produced by combining PMMA with various volume fractions of an aqueous sodium hyaluronate solution. Porosity, Young's modulus, yield strength, polymerization temperature, setting time, viscosity, injectability, and monomer release of these porous cements were investigated. Samples presented pores with diameters in the range of 25-260 μm and porosity up to 56%. Between 0 and 56% porosity, Young's modulus decreased from 930 to 50 MPa and yield strength from 39 to 1.3 MPa. The polymerization temperature decreased from 68 °C (0%, regular cement) to 41 °C for cement with a 30% aqueous fraction. Setting time decreased from 1020 s (0%, regular cement) to 720 s for the 30% composition. Viscosity of the 30% composition (145 Pa·s) was higher than that of regular cement and of the 45% composition (100-125 Pa·s). Monomer release was in the range of 4-10 mg/mL for all porosities, with no higher release for the porous materials. Generating pores with an aqueous gel thus appears to be a promising method for making PMMA cement more compliant and lowering its mechanical properties to values close to those of cancellous bone.
Abstract:
The purpose of the study was to quantify and compare the effect of CT dose and of size and density of nodules on the detectability of lung nodules and to quantify the influence of CT dose on the size of the nodules.
Abstract:
We used a PCR method to quantify the loads of Chlamydia trachomatis organisms in self-collected urine and vulvovaginal swab (VVS) samples from 93 women and 30 men participating in the Chlamydia Screening Studies Project, a community-based study of individuals not seeking health care. For women, self-collected VVS had a higher mean chlamydial load (10,405 organisms/ml; 95% confidence interval [95% CI], 5,167 to 21,163 organisms/ml) than did first-void urines (FVU) (503 organisms/ml; 95% CI, 250 to 1,022 organisms/ml; P < 0.001). Chlamydial loads in female and male self-collected FVU specimens were similar (P = 0.634). The mean chlamydial load in FVU specimens decreased with increasing age in females and males. There was no strong statistical evidence of differences in chlamydial load in repeat male and female FVU specimens taken when patients attended for treatment a median of 23.5 (range, 14 to 62) and 28 (range, 13 to 132) days later, respectively, or in VVS taken a median of 35 (range, 14 to 217) days later. In this study, chlamydial load values for infected persons in the community who were not seeking treatment were lower than those published in other studies involving symptomatic patients attending clinical settings. This might have implications for estimates of the infectiousness of chlamydia. The results of this study provide a scientific rationale for preferring VVS to FVU specimens from women.
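The asymmetric confidence intervals around the mean loads (e.g. 10,405 organisms/ml, 95% CI 5,167 to 21,163) suggest the means were computed on a log scale, i.e. as geometric means. A minimal sketch of that kind of calculation, using made-up load values rather than the study's data:

```python
import math
import statistics

def geometric_mean_ci(loads, z=1.96):
    """Geometric mean and approximate 95% CI, computed on the log10 scale."""
    logs = [math.log10(x) for x in loads]
    m = statistics.mean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    # back-transform the log-scale mean and CI bounds
    return 10 ** m, 10 ** (m - z * se), 10 ** (m + z * se)

# Illustrative (invented) chlamydial loads in organisms/ml:
gm, lo, hi = geometric_mean_ci([500, 2_000, 15_000, 40_000, 120_000])
```

Because the interval is symmetric on the log scale, it is asymmetric after back-transformation, matching the pattern of the reported CIs.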
Abstract:
We conducted an explorative, cross-sectional, multi-centre study in order to identify the most common problems of people with any kind of (primary) sleep disorder in a clinical setting using the International Classification of Functioning, Disability and Health (ICF) as a frame of reference. Data were collected from patients using a structured face-to-face interview of 45-60 min duration. A case record form for health professionals containing the extended ICF Checklist, sociodemographic variables and disease-specific variables was used. The study centres collected data on 99 individuals with sleep disorders. Of the identified categories, 48 (32%) concerned body functions, 13 (9%) body structures, 55 (37%) activities and participation, and 32 (22%) environmental factors. 'Sleep functions' (100%) and 'energy and drive functions' (85%) were the most severely impaired second-level categories of body functions, followed by 'attention functions' (78%) and 'temperament and personality functions' (77%). With regard to the component activities and participation, patients felt most restricted in the categories of 'watching' (e.g. TV) (82%), 'recreation and leisure' (75%) and 'carrying out daily routine' (74%). Within the component environmental factors, the categories 'support of immediate family', 'health services, systems and policies' and 'products or substances for personal consumption [medication]' were the most important facilitators; 'time-related changes', 'light' and 'climate' were the most important barriers. The study identified a large variety of functional problems reflecting the complexity of sleep disorders. The ICF has the potential to provide a comprehensive framework for the description of functional health in individuals with sleep disorders in a clinical setting.
Abstract:
Background Urinary tract infections (UTI) are frequent in outpatients. Fast pathogen identification is mandatory for shortening the time of discomfort and preventing serious complications. Urine culture requires up to 48 hours for pathogen identification; consequently, the initial antibiotic regimen is empirical. Aim To evaluate the feasibility of qualitative urine pathogen identification with a commercially available real-time PCR blood pathogen test (SeptiFast®) and to compare the results with dipslide and microbiological culture. Design of study Pilot study with prospectively collected urine samples. Setting University hospital. Methods 82 prospectively collected urine samples from 81 patients with suspected UTI were included. Dipslide urine culture was followed by microbiological pathogen identification in dipslide-positive samples. In parallel, qualitative DNA-based pathogen identification (SeptiFast®) was performed on all samples. Results 61 samples were SeptiFast® positive, whereas 67 samples were dipslide culture positive. The inter-methodological concordance of positive and negative findings in the gram+, gram- and fungi sectors was 371/410 (90%), 477/492 (97%) and 238/246 (97%), respectively. Sensitivity and specificity of the SeptiFast® test for the detection of an infection were 0.82 and 0.60, respectively. SeptiFast® pathogen identifications were available at least 43 hours before culture results. Conclusion The SeptiFast® platform identified bacterial DNA in urine specimens considerably faster than conventional culture. For UTI diagnosis, its sensitivity and specificity are limited by the present qualitative setup, which does not allow pathogen quantification. Future quantitative assays may hold promise for PCR-based UTI pathogen identification as a supplement to conventional culture methods.
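The sector-wise concordance figures are simple agreement fractions; a quick illustrative sketch recomputing the reported percentages from the raw counts given in the abstract:

```python
# Concordance of SeptiFast vs. dipslide culture per pathogen sector,
# as (concordant findings, total findings); counts from the abstract.
concordance = {
    "gram+": (371, 410),
    "gram-": (477, 492),
    "fungi": (238, 246),
}

# Recompute the rounded percentages reported alongside each fraction.
percentages = {
    sector: round(100 * agree / total)
    for sector, (agree, total) in concordance.items()
}
print(percentages)  # {'gram+': 90, 'gram-': 97, 'fungi': 97}
```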
Abstract:
Objective: To compare clinical outcomes after laparoscopic cholecystectomy (LC) for acute cholecystitis performed at various time-points after hospital admission. Background: Symptomatic gallstones represent an important public health problem with LC the treatment of choice. LC is increasingly offered for acute cholecystitis, however, the optimal time-point for LC in this setting remains a matter of debate. Methods: Analysis was based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery and included patients undergoing emergency LC for acute cholecystitis between 1995 and 2006, grouped according to the time-points of LC since hospital admission (admission day (d0), d1, d2, d3, d4/5, d ≥6). Linear and generalized linear regression models assessed the effect of timing of LC on intra- or postoperative complications, conversion and reoperation rates and length of postoperative hospital stay. Results: Of 4113 patients, 52.8% were female, median age was 59.8 years. Delaying LC resulted in significantly higher conversion rates (from 11.9% at d0 to 27.9% at d ≥6 days after admission, P < 0.001), surgical postoperative complications (5.7% to 13%, P < 0.001) and re-operation rates (0.9% to 3%, P = 0.007), with a significantly longer postoperative hospital stay (P < 0.001). Conclusions: Delaying LC for acute cholecystitis has no advantages, resulting in significantly increased conversion/re-operation rate, postoperative complications and longer postoperative hospital stay. This investigation—one of the largest in the literature—provides compelling evidence that acute cholecystitis merits surgery within 48 hours of hospital admission if impact on the patient and health care system is to be minimized.
Abstract:
Visual results in treating neovascular age-related macular degeneration (AMD) with intravitreally injected anti-VEGF (IVT) clearly depend on injection frequency. According to the European approval, ranibizumab is to be used only in cases of recurrent visual loss after the loading phase. In contrast, monthly treatment, as provided in the ANCHOR and MARINA studies, is generally allowed in Switzerland. However, it is common to try to reduce the injection frequency, both because of the cost pressure in all health systems and, of course, because of the necessary strict monitoring and reinjection regimes, which raise management problems as patient numbers increase. In this article, the treatment regime of our University Eye Hospital is presented, in which a reduced injection frequency leads to essentially the same improved and stable visual results as in ANCHOR and MARINA, while still requiring significantly more injections than generally provided in other European countries. The key to achieving this in a large number of patients was restructuring our outpatient flow for IVT patients, with particular emphasis on patient separation and standardisation of treatment steps, which significantly reduced the time required per patient. Measurements of timing and patient satisfaction before and after restructuring underline its importance for treating more patients at high quality in the future. The exceptional importance of spectral-domain OCT measurements as the most important criterion for indicating re-treatment is illustrated.
Abstract:
The sudden independence of Kyrgyzstan from the Soviet Union in 1991 led to a total rupture of industrial and agricultural production. Based on empirical data, this study seeks to identify key land use transformation processes since the late 1980s, their impact on people's livelihoods and the implication for natural resources in the communes of Tosh Bulak and Saz, located in the Sokuluk River Basin on the northern slope of the Kyrgyz Range. Using the concept of the sustainable livelihood approach as an analytical framework, three different livelihood strategies were identified: (1) An accumulation strategy applied by wealthy households where renting and/or buying of land is a key element; they are the only household category capable of venturing into rain fed agriculture. (2) A preserving strategy involving mainly intermediate households who are not able to buy or rent additional agricultural land; very often they are forced to return their land to the commune or sell it to wealthier households. (3) A coping strategy including mainly poor households consisting of elderly pensioners or headed by single mothers; due to their limited labour and economic power, agricultural production is very low and hardly covers subsistence needs; pensions and social allowances form the backbone of these livelihoods. Ecological assessments have shown that the forage productivity of remote high mountain pastures has increased by 5 to 22 per cent since 1978. At the same time, forage productivity on pre-mountain and mountain pastures close to villages has generally decreased by 1 to 34 per cent. It seems that the main avenues for households to increase their wealth lie in the agricultural sector, by controlling more (mainly irrigated) land and by increasing livestock. The losers in this process are thus those households unable to keep or exploit their arable land or to benefit from new agricultural land.
Ensuring access to land for the poor is therefore imperative in order to combat rural poverty and socio-economic disparities in rural Kyrgyzstan.
Abstract:
Early detection of bloodstream infections (BSI) is crucial in the clinical setting. Blood culture remains the gold standard for diagnosing BSI. Molecular diagnostic tools can contribute to a more rapid diagnosis in septic patients. Here, a multiplex real-time PCR-based assay for rapid detection of 25 clinically important pathogens directly from whole blood in <6 h is presented. Minimal analytical sensitivity was determined by hit rate analysis from 20 independent experiments. At a concentration of 3 CFU/ml a hit rate of 50% was obtained for E. aerogenes and 100% for S. marcescens, E. coli, P. mirabilis, P. aeruginosa, and A. fumigatus. The hit rate for C. glabrata was 75% at 30 CFU/ml. Comparing PCR identification results with conventional microbiology for 1,548 clinical isolates yielded an overall specificity of 98.8%. The analytical specificity in 102 healthy blood donors was 100%. Although further evaluation is warranted, our assay holds promise for more rapid pathogen identification in clinical sepsis.
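The hit rates quoted above are simply the fraction of positive PCR calls across the 20 replicate spiking experiments at a fixed concentration; a trivial sketch (the function name and example counts are my own):

```python
def hit_rate(positives, n_runs=20):
    """Percentage of replicate spiking experiments with a positive PCR call."""
    return 100.0 * positives / n_runs

# 10 positives out of 20 runs gives the 50% hit rate reported for
# E. aerogenes at 3 CFU/ml; 15/20 gives the 75% for C. glabrata.
print(hit_rate(10), hit_rate(15))  # 50.0 75.0
```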
Abstract:
BACKGROUND: Low back pain (LBP) is by far the most prevalent and costly musculoskeletal problem in our society today. Following the recommendations of the Multinational Musculoskeletal Inception Cohort Study (MMICS) Statement, our study aims to define outcome assessment tools for patients with acute LBP and the time point at which chronic LBP becomes manifest and to identify patient characteristics which increase the risk of chronicity. METHODS: Patients with acute LBP will be recruited from clinics of general practitioners (GPs) in New Zealand (NZ) and Switzerland (CH). They will be assessed by postal survey at baseline and at 3, 6 and 12 weeks and at 6 months of follow-up. Primary outcome will be disability as measured by the Oswestry Disability Index (ODI); key secondary endpoints will be general health as measured by the acute SF-12 and pain as measured on the Visual Analogue Scale (VAS). A subgroup analysis of different assessment instruments and baseline characteristics will be performed using multiple linear regression models. This study aims to examine: 1. Which biomedical, psychological, social, and occupational outcome assessment tools are identifiers for the transition from acute to chronic LBP and at which time point this transition becomes manifest. 2. Which psychosocial and occupational baseline characteristics like work status and period of work absenteeism influence the course from acute to chronic LBP. 3. Differences in outcome assessment tools and baseline characteristics of patients in NZ compared with CH. DISCUSSION: This study will develop a screening tool for patients with acute LBP to be used in GP clinics to assess the risk of developing chronic LBP. In addition, biomedical, psychological, social, and occupational patient characteristics which influence the course from acute to chronic LBP will be identified. Furthermore, an appropriate time point for follow-ups will be given to detect this transition.
The generalizability of our findings will be enhanced by the international perspective of this study. TRIAL REGISTRATION: [Clinical Trial Registration Number, ACTRN12608000520336].
Abstract:
CONTEXT: It is uncertain whether intensified heart failure therapy guided by N-terminal brain natriuretic peptide (BNP) is superior to symptom-guided therapy. OBJECTIVE: To compare 18-month outcomes of N-terminal BNP-guided vs symptom-guided heart failure therapy. DESIGN, SETTING, AND PATIENTS: Randomized controlled multicenter Trial of Intensified vs Standard Medical Therapy in Elderly Patients With Congestive Heart Failure (TIME-CHF) of 499 patients aged 60 years or older with systolic heart failure (ejection fraction < or = 45%), New York Heart Association (NYHA) class of II or greater, prior hospitalization for heart failure within 1 year, and N-terminal BNP level of 2 or more times the upper limit of normal. The study had an 18-month follow-up and it was conducted at 15 outpatient centers in Switzerland and Germany between January 2003 and June 2008. INTERVENTION: Uptitration of guideline-based treatments to reduce symptoms to NYHA class of II or less (symptom-guided therapy) and BNP level of 2 times or less the upper limit of normal and symptoms to NYHA class of II or less (BNP-guided therapy). MAIN OUTCOME MEASURES: Primary outcomes were 18-month survival free of all-cause hospitalizations and quality of life as assessed by structured validated questionnaires. RESULTS: Heart failure therapy guided by N-terminal BNP and symptom-guided therapy resulted in similar rates of survival free of all-cause hospitalizations (41% vs 40%, respectively; hazard ratio [HR], 0.91 [95% CI, 0.72-1.14]; P = .39). Patients' quality-of-life metrics improved over 18 months of follow-up but these improvements were similar in both the N-terminal BNP-guided and symptom-guided strategies. Compared with the symptom-guided group, survival free of hospitalization for heart failure, a secondary end point, was higher among those in the N-terminal BNP-guided group (72% vs 62%, respectively; HR, 0.68 [95% CI, 0.50-0.92]; P = .01). 
Heart failure therapy guided by N-terminal BNP improved outcomes in patients aged 60 to 75 years but not in those aged 75 years or older (P < .02 for interaction). CONCLUSION: Heart failure therapy guided by N-terminal BNP did not improve overall clinical outcomes or quality of life compared with symptom-guided treatment. TRIAL REGISTRATION: isrctn.org Identifier: ISRCTN43596477.
Abstract:
BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so based on differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we questioned whether differences in time to progression predict survival differences. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 compared immediate ADT (n=492) with orchiectomy or luteinizing hormone-releasing hormone analog with deferred ADT (n=493) initiated upon symptomatic disease progression or life-threatening complications in randomly assigned T0-4 N0-2 M0 PCa patients. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases, ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as well as PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p [noninferiority] = 0.72, p [difference] = 0.0085).
CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.
Abstract:
Objective. To evaluate the diagnostic benefit of real-time elastography (RTE) in clinical routine. Strain indices (SI) for benign and malignant tumors were assessed. Methods. 100 patients with 110 focal breast lesions were included. Patients had mammography (MG), ultrasound (US), and, if necessary, MRI. RTE was conducted after ultrasound. Lesions were assessed with BI-RADS for mammography and ultrasound. Diagnosis was established by histology or follow-up. Results. The SI for BI-RADS 2 lesions was 1.71 ± 0.86. A higher SI (2.21 ± 1.96) was observed for BI-RADS 3 lesions. SI of BI-RADS 4 and 5 lesions were significantly higher (16.92 ± 20.89 and 19.54 ± 10.41, respectively). The 31 malignant tumors exhibited an average SI of 16.13 ± 14.67; the SI of benign lesions was 5.29 ± 11.87 (P < 0.0001). ROC analysis yielded a threshold of >3.8 for malignant disease. Sensitivity of sonography was 90.3% (specificity 78.5%); RTE showed a sensitivity of 87.1% (specificity 79.7%). Accuracy of all modalities combined was 96.8%. In BI-RADS 3 lesions RTE detected all malignant lesions (sensitivity 100%, specificity 92.9%, accuracy 93.9%). Conclusions. RTE increased sensitivity and specificity for breast cancer detection when used in combination with ultrasound.
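A strain-index cut-off like the reported >3.8 turns RTE into a binary classifier whose sensitivity and specificity follow from the usual confusion-matrix definitions. A minimal sketch using the abstract's threshold but entirely invented example lesions:

```python
# ROC-derived cut-off from the abstract; lesions below are invented
# purely to illustrate the sensitivity/specificity calculation.
THRESHOLD = 3.8

def predict_malignant(strain_index):
    """Classify a lesion as malignant if its strain index exceeds the cut-off."""
    return strain_index > THRESHOLD

lesions = [
    # (strain index, truly malignant?)
    (1.7, False), (2.2, False), (16.9, True), (19.5, True), (3.1, False),
]

tp = sum(1 for si, m in lesions if m and predict_malignant(si))
fn = sum(1 for si, m in lesions if m and not predict_malignant(si))
tn = sum(1 for si, m in lesions if not m and not predict_malignant(si))
fp = sum(1 for si, m in lesions if not m and predict_malignant(si))

sensitivity = tp / (tp + fn)   # 2/2 = 1.0 on this toy set
specificity = tn / (tn + fp)   # 3/3 = 1.0 on this toy set
```

The study's 87.1%/79.7% figures arise from the same calculation applied to its 110 real lesions.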
Abstract:
BACKGROUND Staphylococcus aureus has long been recognized as a major pathogen. Methicillin-resistant strains of S. aureus (MRSA) and methicillin-resistant strains of S. epidermidis (MRSE) are among the most prevalent multiresistant pathogens worldwide, frequently causing nosocomial and community-acquired infections. METHODS In the present pilot study, we tested a polymerase chain reaction (PCR) method to quickly differentiate staphylococci and identify the mecA gene in a clinical setting. RESULTS Compared with conventional microbiology testing, the real-time PCR assay had a higher detection rate for both S. aureus and coagulase-negative staphylococci (CoNS; 55 vs. 32 for S. aureus and 63 vs. 24 for CoNS). Hands-on time preparing DNA, carrying out the PCR, and evaluating results was less than 5 h. CONCLUSIONS The assay is largely automated, easy to adapt, and has been shown to be rapid and reliable. Fast detection and differentiation of S. aureus, CoNS, and the mecA gene by means of this real-time PCR protocol may help expedite therapeutic decision-making and enable earlier adequate antibiotic treatment.
Abstract:
OBJECTIVES Gender-specific data on the outcome of combination antiretroviral therapy (cART) are a subject of controversy. We aimed to compare treatment responses between genders in a setting of equal access to cART over a 14-year period. METHODS Analyses included treatment-naïve participants in the Swiss HIV Cohort Study starting cART between 1998 and 2011 and were restricted to patients infected by heterosexual contacts or injecting drug use, excluding men who have sex with men. RESULTS A total of 3925 patients (1984 men and 1941 women) were included in the analysis. Women were younger and had higher CD4 cell counts and lower HIV RNA at baseline than men. Women were less likely to achieve virological suppression < 50 HIV-1 RNA copies/mL at 1 year (75.2% versus 78.1% of men; P = 0.029) and at 2 years (77.5% versus 81.1%, respectively; P = 0.008), whereas no difference between sexes was observed at 5 years (81.3% versus 80.5%, respectively; P = 0.635). The probability of virological suppression increased in both genders over time (test for trend, P < 0.001). The median increase in CD4 cell count at 1, 2 and 5 years was generally higher in women during the whole study period, but it gradually improved over time in both sexes (P < 0.001). Women also were more likely to switch or stop treatment during the first year of cART, and stops were only partly driven by pregnancy. In multivariate analysis, after adjustment for sociodemographic factors, HIV-related factors, cART and calendar period, female gender was no longer associated with lower odds of virological suppression. CONCLUSIONS Gender inequalities in the response to cART are mainly explained by the different prevalence of socioeconomic characteristics in women compared with men.
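The 1-year suppression comparison (75.2% of women vs 78.1% of men, P = 0.029) can be approximated with a two-proportion z-test. A sketch that assumes, purely for illustration, that all enrolled patients (1941 women, 1984 men) contribute to the denominators; the published analysis may well have used a different test or denominators:

```python
import math

def two_proportion_p(p1, n1, p2, n2):
    """Two-sided two-proportion z-test p-value using the pooled estimate."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# Women vs men, virological suppression < 50 copies/mL at 1 year:
p = two_proportion_p(0.752, 1941, 0.781, 1984)
```

Under these assumptions the test gives a p-value of roughly 0.03, in line with the reported P = 0.029.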