83 results for FRS-ESR FACILITY
Abstract:
QUESTION UNDER STUDY: Hospitals transferring patients retain responsibility until admission to the new health care facility. We define safe transfer conditions, based on appropriate risk assessment, and evaluate the impact of this strategy as implemented at our institution. METHODS: An algorithm defining transfer categories according to destination, equipment monitoring, and medication was developed and tested prospectively over 6 months. Conformity with algorithm criteria was assessed for every transfer and transfer category. After introduction of a transfer coordination centre with transfer nurses, the algorithm was implemented and the same survey was carried out over 1 year. RESULTS: Over the whole study period, the number of transfers increased by 40%, chiefly by ambulance from the emergency department to other hospitals and private clinics. Transfers to rehabilitation centres and nursing homes were reassigned to conventional vehicles. The percentage of patients requiring equipment during transfer, such as an intravenous line, decreased from 34% to 15%, while oxygen or i.v. drug requirement remained stable. The percentage of transfers considered below theoretical safety decreased from 6% to 4%, while 20% of transfers were considered safer than necessary. A substantial number of planned transfers could be "downgraded" by mutual agreement to a lower degree of supervision, and the system was stable on a short-term basis. CONCLUSION: A coordinated transfer system based on an algorithm determining transfer categories, developed on the basis of simple but valid medical and nursing criteria, reduced unnecessary ambulance transfers and treatment during transfer, and increased adequate supervision.
Abstract:
OBJECTIVES: To describe disease characteristics and treatment modalities in a multidisciplinary cohort of systemic lupus erythematosus (SLE) patients in Switzerland. METHODS: Cross-sectional analysis of 255 patients included in the Swiss SLE Cohort and coming from centres specialised in Clinical Immunology, Internal Medicine, Nephrology and Rheumatology. Clinical data were collected with a standardised form. Disease activity was assessed using the Safety of Estrogens in Lupus Erythematosus National Assessment-SLE Disease Activity Index (SELENA-SLEDAI), an integer physician's global assessment score (PGA) ranging from 0 (inactive) to 3 (very active disease) and the erythrocyte sedimentation rate (ESR). The relationship between SLE treatment and activity was assessed by propensity score methods using a mixed-effect logistic regression with a random effect on the contributing centre. RESULTS: Of the 255 patients, 82% were women and 82% were of European ancestry. The mean age at enrolment was 44.8 years and the median SLE duration was 5.2 years. Patients from Rheumatology had a significantly later disease onset. Renal disease was reported in 44% of patients. PGA showed active disease in 49% of patients, median SLEDAI was 4 and median ESR was 14 millimetre/first hour. Prescription rates of anti-malarial drugs ranged from 3% by nephrologists to 76% by rheumatologists. Patients regularly using anti-malarial drugs had significantly lower SELENA-SLEDAI scores and ESR values. CONCLUSION: In our cohort, patients in Rheumatology had a significantly later SLE onset than those in Nephrology. Anti-malarial drugs were mostly prescribed by rheumatologists and internists and less frequently by nephrologists, and appeared to be associated with less active SLE.
Abstract:
OBJECTIVE: To determine the psychometric properties of an adapted version of the Falls Efficacy Scale (FES) in older rehabilitation patients. DESIGN: Cross-sectional survey. SETTING: Postacute rehabilitation facility in Switzerland. PARTICIPANTS: Seventy elderly persons aged 65 years and older receiving postacute, inpatient rehabilitation. INTERVENTIONS: Not applicable. MAIN OUTCOME MEASURES: FES questions asked about subject's confidence (range, 0 [none]-10 [full]) in performing 12 activities of daily living (ADLs) without falling. Construct validity was assessed using correlation with measures of physical (basic ADLs [BADLs]), cognitive (Mini-Mental State Examination [MMSE]), affective (15-item Geriatric Depression Scale [GDS]), and mobility (Performance Oriented Mobility Assessment [POMA]) performance. Predictive validity was assessed using the length of rehabilitation stay as the outcome. To determine test-retest reliability, FES administration was repeated in a random subsample (n=20) within 72 hours. RESULTS: FES scores ranged from 10 to 120 (mean, 88.7+/-26.5). Internal consistency was optimal (Cronbach alpha=.90), and item-to-total correlations were all significant, ranging from .56 (toilet use) to .82 (reaching into closets). Test-retest reliability was high (intraclass correlation coefficient, .97; 95% confidence interval, .95-.99; P<.001). Subjects reporting a fall in the previous year had lower FES scores than nonfallers (85.0+/-25.2 vs 94.4+/-27.9, P=.054). The FES correlated with POMA (Spearman rho=.40, P<.001), MMSE (rho=.37, P=.001), BADL (rho=.43, P<.001), and GDS (rho=-.53, P<.001) scores. These relationships remained significant in multivariable analysis for BADLs and GDS, confirming FES construct validity. There was a significant inverse relationship between FES score and the length of rehabilitation stay, independent of sociodemographic, functional, cognitive, and fall status. 
CONCLUSIONS: This adapted FES is reliable and valid in older patients undergoing postacute rehabilitation. The independent association between poor falls efficacy and increased length of stay has not been previously described and needs further investigations.
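The reliability figures reported above (Cronbach alpha = .90, ICC = .97) follow from standard formulas. As a generic illustration of how the internal-consistency statistic is computed (not the authors' code; the toy ratings are invented), a minimal Python sketch:

```python
import numpy as np

def cronbach_alpha(items) -> float:
    """Cronbach's alpha for an (n_subjects, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the total score
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Toy confidence ratings: 4 subjects x 2 items (invented, not study data)
toy = [[1, 1], [2, 3], [3, 2], [4, 4]]
print(round(cronbach_alpha(toy), 3))  # → 0.889
```

With the study's 12 items scored 0-10, the same function would take a 70x12 matrix; values near .90 indicate that the items measure a single underlying construct.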
Abstract:
This article examines the existence of a habituation effect to unemployment: Does the subjective well-being of unemployed people decline less if unemployment is more widespread? The underlying idea is that unemployment hysteresis may operate through a sociological channel: if many people in the community lose their job and remain unemployed over an extended period, the psychological cost of being unemployed diminishes and the pressure to accept a new job declines. We analyze this question with individual-level data from the German Socio-Economic Panel (1984-2010) and the Swiss Household Panel (2000-2010). Our fixed-effects estimates show no evidence for a mitigating effect of high surrounding unemployment on the subjective well-being of the unemployed. Becoming unemployed hurts as much when regional unemployment is high as when it is low. Likewise, the strongly harmful impact of being unemployed on well-being does not wear off over time, nor do repeated episodes of unemployment make it any better. It thus appears doubtful that an unemployment shock becomes persistent because the unemployed become used to, and hence reasonably content with, being without a job.
Abstract:
Purpose: In primary prevention of cardiovascular disease (CVD), it is accepted that the intensity of risk factor treatment should be guided by the magnitude of absolute risk. Risk assessment tools such as the Framingham risk score (FRS) or noninvasive atherosclerosis imaging tests are available to detect high-risk subjects. However, these methods are imperfect and may misclassify a large number of individuals. The purpose of this prospective study was to evaluate whether the prediction of future cardiovascular events (CVE) can be improved when imaging of subclinical atherosclerosis (SCATS) is combined with the FRS in asymptomatic subjects. Methods: Overall, 1038 asymptomatic subjects (413 women, 625 men, mean age 49.1±12.8 years) were assessed for their cardiovascular risk using the FRS. B-mode ultrasonography of the carotid and femoral arteries was performed by two investigators to detect atherosclerotic plaques (focal thickening of the intima-media > 1.2 mm) and to measure carotid intima-media thickness (C-IMT). The severity of SCATS was expressed as an ATS burden score (ABS) reflecting the number of arterial sites with ≥1 plaque (range 0-4). CVE were defined as fatal or nonfatal acute coronary syndrome, stroke, or angioplasty for peripheral artery disease. Results: During a mean follow-up of 4.9±3.1 years, 61 CVE were recorded. The rate of CVE increased significantly from 2.7% to 39.1% across the ABS (p<0.001) and from 4% to 24.6% across the quartiles of C-IMT. Similarly, the FRS predicted CVE (p<0.001). When the ultrasound markers of SCATS were added to the FRS, we observed a net reclassification improvement of 16.6% (p<0.04) for the ABS, compared with 5.5% (p = 0.26) for C-IMT. Conclusion: These results indicate that the detection of subjects requiring closer attention to prevent CVE can be significantly improved by using both the FRS and SCATS imaging.
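The reclassification figures quoted above are instances of the categorical net reclassification improvement (NRI), which rewards risk-category moves in the right direction. A hedged, generic sketch (not the study's code; categories and outcomes below are invented):

```python
import numpy as np

def net_reclassification(old_cat, new_cat, event):
    """Categorical NRI: credit upward risk-category moves for events and
    downward moves for non-events; penalize moves in the wrong direction."""
    old_cat, new_cat, event = map(np.asarray, (old_cat, new_cat, event))
    up, down = new_cat > old_cat, new_cat < old_cat
    ev, ne = event == 1, event == 0
    return (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())

# Invented example: both events moved up a category, one non-event moved down
print(net_reclassification([0, 0, 1, 1], [1, 0, 1, 0], [1, 0, 1, 0]))  # → 1.0
```

Here `old_cat` would be the FRS-based risk category and `new_cat` the category after adding the imaging marker; a positive NRI means the marker moved subjects into more appropriate categories on balance.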
Abstract:
Background: To study the in vivo characteristics of a polymethylmethacrylate (PMMA) implant compared with the standard cylindrical collagen implant for deep sclerectomy (DS). Design: Six-month comparative study. Samples: Twenty eyes of ten rabbits. Methods: Eyes were randomized to undergo DS with a PMMA implant in one eye and a collagen implant in the opposite eye. The growth of new subconjunctival drainage vessels was assessed by combined fluorescein and indocyanine green anterior segment angiography; intrascleral and subconjunctival blebs were imaged by ultrasound biomicroscopy (UBM). At six months, outflow facility (C) was measured by anterior chamber perfusion, and portions of one side of the DS were compared on histology with portions on the 180° opposite side and with native sclera. Results: The mean intraocular pressure (IOP) preoperatively and at one, four, twelve, and twenty-four weeks was comparable in both groups (P > 0.1). UBM showed a quicker, though not statistically significant, regression of the subconjunctival bleb, as well as a durable intrascleral lake, in the PMMA group (P > 0.05). New drainage vessels were first observed one month after surgery; they were more numerous in the PMMA group on angiographic and histological findings at 6 months (P < 0.05). The mean C increased significantly after surgery compared with preoperative values (P < 0.05), with no difference between the implants (0.24 ± 0.06 µl/min/mmHg [PMMA] and 0.23 ± 0.07 µl/min/mmHg [collagen implant]) (P = 0.39). Conclusions: Deep sclerectomy performed with PMMA or collagen implants showed similar IOP-lowering effects, outflow facility increase, and degree of inflammatory reaction.
Abstract:
Biologicals have been used for decades in biopharmaceutical topical preparations. Cellular therapies are routinely used in the clinic and have therefore gained significant attention. Different derivatives are possible from different cell and tissue sources, making the selection of cell types and the establishment of consistent cell banks crucial steps in initial whole-cell bioprocessing. Various cell and tissue types have been used in the treatment of skin wounds, including autologous and allogeneic skin cells, platelets, placenta, and amniotic extracts from either human or animal sources. Experience with progenitor cells shows that they may be an interesting cell choice owing to the ease of scale-up and their known properties for scar-free wound healing. Using defined animal cell lines to develop cell-free derivatives may provide initial starting material for pharmaceutical formulations with improved overall stability. Cell lines derived from ovine tissue (skin, muscle, connective tissue) can be developed in short periods of time, and the consistency of these cell lines was monitored by cellular life-span, protein concentration, stability, and activity. Each cell line had long culture periods of up to 37-41 passages, and protein measures for each cell line at passages 2-15 showed only a 1.4-fold maximal difference. Growth stimulation activity towards two target skin cell lines (GM01717 and CRL-1221; from 40-year-old human males) at concentrations up to 6 μg/ml showed a 2-3-fold (single extracts) and 3-7-fold (co-cultured extracts) increase. Proteins from co-culture remained stable for up to 1 year in pharmaceutical preparations, as shown by separation on SDS-PAGE gels. Pharmaceutical cell-free preparations were used for veterinary and human wounds and burns. Cell lines and cell-free extracts can show remarkable consistency and stability for the preparation of biopharmaceutical creams, especially when cells are co-cultured, and have positive effects on tissue repair.
Abstract:
OBJECTIVE: To evaluate the effect of vouchers for maternity care in public health-care facilities on the utilization of maternal health-care services in Cambodia. METHODS: The study involved data from the 2010 Cambodian Demographic and Health Survey, which covered births between 2005 and 2010. The effect of voucher schemes, first implemented in 2007, on the utilization of maternal health-care services was quantified using a difference-in-differences method that compared changes in utilization in districts with voucher schemes with changes in districts without them. FINDINGS: Overall, voucher schemes were associated with an increase of 10.1 percentage points (pp) in the probability of delivery in a public health-care facility; among women from the poorest 40% of households, the increase was 15.6 pp. Vouchers were responsible for about one fifth of the increase observed in institutional deliveries in districts with schemes. Universal voucher schemes had a larger effect on the probability of delivery in a public facility than schemes targeting the poorest women. Both types of schemes increased the probability of receiving postnatal care, but the increase was significant only for non-poor women. Universal, but not targeted, voucher schemes significantly increased the probability of receiving antenatal care. CONCLUSION: Voucher schemes increased deliveries in health centres and, to a lesser extent, improved antenatal and postnatal care. However, schemes that targeted poorer women did not appear to be efficient since these women were more likely than less poor women to be encouraged to give birth in a public health-care facility, even with universal voucher schemes.
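The difference-in-differences method used above compares the change in utilization in voucher districts with the change in non-voucher districts over the same period. A minimal sketch of the estimator on individual-level outcomes (not the study's code; the micro-data below are invented):

```python
import numpy as np

def did_estimate(y, treated, post):
    """Difference-in-differences: change in the treated group's mean outcome
    minus the change in the comparison group's mean outcome."""
    y, treated, post = map(np.asarray, (y, treated, post))
    cell = lambda t, p: y[(treated == t) & (post == p)].mean()
    return (cell(1, 1) - cell(1, 0)) - (cell(0, 1) - cell(0, 0))

# Invented micro-example: 1 = delivery in a public facility, 0 = elsewhere
y       = [0, 1,  1, 1,  0, 0,  0, 0]
treated = [1, 1,  1, 1,  0, 0,  0, 0]   # district has a voucher scheme
post    = [0, 0,  1, 1,  0, 0,  1, 1]   # birth after the 2007 rollout
print(did_estimate(y, treated, post))   # → 0.5
```

In the study the analogous quantity, estimated with covariate adjustment, is the reported 10.1 percentage-point effect; the design nets out secular trends that affect both groups of districts.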
Abstract:
Background: Several markers of atherosclerosis and of inflammation have individually been shown to predict coronary heart disease (CHD). However, their utility for the prediction of CHD over traditional risk factors has not been well established, especially in the elderly. Methods: We studied 2202 men and women, aged 70-79 and free of cardiovascular disease at baseline, over a 6-year follow-up to assess the risk of incident CHD associated with baseline noninvasive measures of atherosclerosis (ankle-arm index [AAI], aortic pulse wave velocity [aPWV]) and inflammatory markers (interleukin-6 [IL-6], C-reactive protein [CRP], tumor necrosis factor-α [TNF-α]). CHD events were studied as either nonfatal myocardial infarction or coronary death ("hard" events), and as "hard" events plus hospitalization for angina or the need for coronary revascularization procedures (total CHD events). Results: During the 6-year follow-up, 283 participants had CHD events (including 136 "hard" events). IL-6, TNF-α and AAI predicted CHD events independently of the Framingham Risk Score (FRS), with hazard ratios [HR] for the highest versus the lowest quartile of 1.95 for IL-6 (95% CI: 1.38-2.75, p for trend <0.001) and 1.45 for TNF-α (95% CI: 1.04-2.02, p for trend 0.03), and of 1.66 (95% CI: 1.19-2.31) for an AAI ≤0.90 compared with an AAI of 1.01-1.30. CRP and aPWV were not independently associated with CHD events. Results were similar for "hard" CHD events. Adding IL-6 and AAI to traditional cardiovascular risk factors yielded the greatest improvement in the prediction of CHD; the C-index for "hard"/total CHD events increased from 0.62/0.62 for traditional risk factors to 0.64/0.64 with the addition of IL-6, 0.65/0.63 with AAI, and 0.66/0.64 with IL-6 combined with AAI. Being in the highest quartile of IL-6 combined with an AAI ≤0.90 or >1.40 yielded HRs of 2.51 (1.50-4.19) and 4.55 (1.65-12.50) beyond the FRS, respectively.
With the use of CHD risk categories, risk prediction at 5 years was more accurate in models that included IL-6, AAI, or both, with 8.0%, 8.3% and 12.1% correctly reclassified, respectively. Conclusions: Among older adults, markers of atherosclerosis and of inflammation, particularly IL-6 and AAI, are independently associated with CHD. However, these markers only modestly improve cardiovascular risk prediction beyond traditional risk factors.
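The C-index values above measure discrimination: the probability that a randomly chosen case is assigned a higher predicted risk than a randomly chosen non-case. A generic, hedged sketch of the pairwise computation for uncensored binary outcomes (not the study's code; the risks below are invented):

```python
import itertools
import numpy as np

def c_index(risk, event):
    """Concordance: share of (event, non-event) pairs in which the event case
    was assigned the higher predicted risk (ties count one half)."""
    risk, event = np.asarray(risk, dtype=float), np.asarray(event)
    cases, controls = np.flatnonzero(event == 1), np.flatnonzero(event == 0)
    conc = sum(1.0 if risk[i] > risk[j] else 0.5 if risk[i] == risk[j] else 0.0
               for i, j in itertools.product(cases, controls))
    return conc / (len(cases) * len(controls))

# Invented risks for 2 CHD cases and 2 controls
print(c_index([0.9, 0.8, 0.2, 0.1], [1, 0, 1, 0]))  # → 0.75
```

The study's survival-data analogue (Harrell's C with censoring) is more involved, but the interpretation of a move from 0.62 to 0.66 is the same: a modest gain in the model's ability to rank cases above non-cases.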
Abstract:
BACKGROUND: In numerous high-risk medical and surgical conditions, a greater volume of patients undergoing treatment in a given setting or facility is associated with better survival. For patients with pulmonary embolism, the relation between the number of patients treated in a hospital (volume) and patient outcome is unknown. METHODS: We studied discharge records from 186 acute care hospitals in Pennsylvania for a total of 15 531 patients for whom the primary diagnosis was pulmonary embolism. The study outcomes were all-cause mortality in hospital and within 30 days after presentation for pulmonary embolism and the length of hospital stay. We used logistic models to study the association between hospital volume and 30-day mortality and discrete survival models to study the association between in-hospital mortality and time to hospital discharge. RESULTS: The median annual hospital volume for pulmonary embolism was 20 patients (interquartile range 10-42). Overall in-hospital mortality was 6.0%, whereas 30-day mortality was 9.3%. In multivariable analysis, very-high-volume hospitals (> or = 42 cases per year) had a significantly lower odds of in-hospital death (odds ratio [OR] 0.71, 95% confidence interval [CI] 0.51-0.99) and of 30-day death (OR 0.71, 95% CI 0.54-0.92) than very-low-volume hospitals (< 10 cases per year). Although patients in the very-high-volume hospitals had a slightly longer length of stay than those in the very-low-volume hospitals (mean difference 0.7 days), there was no association between volume and length of stay. INTERPRETATION: In hospitals with a high volume of cases, pulmonary embolism was associated with lower short-term mortality. Further research is required to determine the causes of the relation between volume and outcome for patients with pulmonary embolism.
Abstract:
The shape of the energy spectrum produced by an x-ray tube is of great importance in mammography. Many anode-filtration combinations have been proposed to obtain the spectrum shape most effective for the image quality-dose relationship. Third-generation synchrotrons such as the European Synchrotron Radiation Facility in Grenoble, on the other hand, can produce a high flux of monoenergetic radiation, making them a powerful tool for studying the effect of beam energy on image quality and dose in mammography. An objective method was used to evaluate image quality and dose in mammography with synchrotron radiation and to compare them with standard conventional units. The evaluation was performed systematically over the energy range of interest for mammography, through a global image quality index and measurement of the mean glandular dose. Compared with conventional mammography units, synchrotron radiation shows a great improvement in the image quality-dose relationship, owing to the monochromaticity and the high intrinsic collimation of the beam, which allow a slit to be used instead of an anti-scatter grid for scatter rejection.
Abstract:
Although active personal dosemeters (APDs) are not often used in hospital environments, the possibility of assessing dose and/or dose rate in real time is particularly interesting in interventional radiology and cardiology (IR/IC), since operators can receive relatively high doses while standing close to the primary radiation field. A study on optimizing the use of APDs in IR/IC was performed in the framework of the ORAMED project, a Collaborative Project (2008-2011) supported by the European Commission within its 7th Framework Programme. This paper reports on tests performed with APDs on phantoms using an X-ray facility in a hospital environment, and on APDs worn by interventionalists during routine practice in different European hospitals. The behaviour of the APDs is more satisfactory in hospitals than in laboratories with respect to the influence of tube peak high voltage and pulse width, because in hospitals the APDs are tested in scattered fields with dose equivalent rates generally lower than 1 Sv.h(-1).
Abstract:
Study Objectives: The sleep-deprivation-induced changes in delta power, an electroencephalographic correlate of sleep need, and in brain transcriptome profiles have contributed importantly to current hypotheses on sleep function. Because sleep deprivation also induces stress, here we determined the contribution of the corticosterone component of the stress response to the electrophysiological and molecular markers of sleep need in mice. Design: N/A. Settings: Mouse sleep facility. Participants: C57BL/6J, AKR/J, DBA/2J mice. Interventions: Sleep deprivation, adrenalectomy (ADX). Measurements and Results: Sleep deprivation elevated corticosterone levels in all 3 inbred strains, but this increase was larger in DBA/2J mice, i.e., the strain for which the rebound in delta power after sleep deprivation failed to reach significance. Elimination of the sleep-deprivation-associated corticosterone surge through ADX in DBA/2J mice did not, however, rescue the delta power rebound, but did greatly reduce the number of transcripts affected by sleep deprivation. Genes no longer affected by sleep deprivation cover pathways previously implicated in sleep homeostasis, such as lipid and cholesterol metabolism (e.g., Ldlr, Hmgcs1, Dhcr7, -24, Fkbp5), energy and carbohydrate metabolism (e.g., Eno3, G6pc3, Mpdu1, Ugdh, Man1b1), protein biosynthesis (e.g., Sgk1, Alad, Fads3, Eif2c2, -3, Mat2a), and some circadian genes (Per1, -3), whereas others, such as Homer1a, remained unchanged. Moreover, several microRNAs were affected both by sleep deprivation and by ADX. Conclusions: Our findings indicate that corticosterone contributes to the sleep-deprivation-induced changes in the brain transcriptome that have been attributed to wakefulness per se. The study identified 78 transcripts that respond to sleep loss independently of corticosterone and time of day, among which genes involved in neuroprotection feature prominently, pointing to a molecular pathway directly relevant for sleep function.
Abstract:
PRINCIPLES: Advance directives are seen as an important tool for documenting the wishes of patients who are no longer competent to make decisions regarding their medical care. By their nature, advance directives can be a difficult subject to broach for both the medical care provider and the patient. This paper focuses on general practitioners' perspectives on the timing at which this discussion should take place, as well as the advantages and disadvantages of the different moments. METHODS: In 2013, 23 semi-structured face-to-face interviews were conducted with Swiss general practitioners. Interviews were analysed using qualitative content analysis. RESULTS: The 23 general practitioners in our sample identified different moments they felt were appropriate: either (a) when the patient is still healthy, (b) when illness becomes predominant, or (c) when a patient has been transferred to a long-term care facility. Furthermore, general practitioners reported uncertainty and discomfort about initiating the discussion. CONCLUSION: The distinct approaches, perspectives and rationales show that there is no well-defined or "right" moment. However, participants often associated advance directives with death. This link caused discomfort and uncertainty, which led to hesitation and delay on the part of general practitioners. We therefore recommend further training on how to professionally initiate a conversation about advance directives. Furthermore, based on our results and experience, we recommend an early approach with healthy patients, paired with regular updates later on, as this seems to be the most effective way to inform patients about their end-of-life care options.
Abstract:
PURPOSE: Since 1982, the Radiation Oncology Group of the EORTC (EORTC ROG) has pursued an extensive Quality Assurance (QA) program involving all centres actively participating in its clinical research. The first step is the evaluation of the structure and of the human, technical and organisational resources of the centres, to assess their ability to comply with the current requirements for high-tech radiotherapy (RT). MATERIALS AND METHODS: A facility questionnaire (FQ) was developed in 1989 and adapted over the years to match the evolution of RT techniques. We report on the contents of the current FQ that was completed online by 98 active EORTC ROG member institutions from 19 countries, between December 2005 and October 2007. RESULTS: Similar to the data collected previously, large variations in equipment, staffing and workload between centres remain. Currently only 15 centres still use a Cobalt unit. All centres perform 3D Conformal RT, 79% of them can perform IMRT and 54% are able to deliver stereotactic RT. An external reference dosimetry audit (ERDA) was performed in 88% of the centres for photons and in 73% for electrons, but it was recent (<2 years) in only 74% and 60%, respectively. CONCLUSION: The use of the FQ helps maintain the minimum quality requirements within the EORTC ROG network: recommendations are made on the basis of the analysis of its results. The present analysis shows that modern RT techniques are widely implemented in the clinic but also that ERDA should be performed more frequently. Repeated assessment using the FQ is warranted to document the future evolution of the EORTC ROG institutions.