924 results for noise level in hospital
Abstract:
BACKGROUND AND OBJECTIVE: The decision to maintain intensive treatment in cardiac surgical patients with a poor initial outcome is mostly based on individual experience. The risk scoring systems used in cardiac surgery have no prognostic value for individuals. This study aims to assess (a) factors possibly related to poor survival and functional outcomes in cardiac surgery patients requiring prolonged (≥5 days) intensive care unit (ICU) treatment, (b) conditions in which treatment withdrawal might be justified, and (c) the patient's perception of the benefits and drawbacks of long intensive treatments. METHODS: The computerized data prospectively recorded for every patient in the intensive care unit over a 3-year period were reviewed and analyzed (n=1859). Survival and quality of life (QOL) outcomes were determined in all patients having required ≥5 consecutive days of intensive treatment (n=194; 10.4%). Long-term survivors were interviewed at yearly intervals in a standardized manner, and quality of life was assessed using the Karnofsky dependency score. No interventions or treatments were given, withheld, or withdrawn as part of this study. RESULTS: In-hospital, 1-, and 3-year cumulative survival rates reached 91.3%, 85.6%, and 75.1%, respectively. Quality of life assessed 1 year postoperatively by the Karnofsky score was good in 119/165 patients, fair in 32, and poor in 14. Multivariate logistic regression analysis of 19 potential predictors of poor outcome identified dialysis as the sole factor significantly (p=0.027), albeit moderately, reducing long-term survival, and sustained neurological deficit as an inconstant predictor of poor functional outcome (p=0.028). One year postoperatively, 0.63% of patients still remembered severe suffering in the ICU and 20% remembered discomfort. Only 7.7% of patients would definitely refuse redo surgery. 
CONCLUSIONS: This study of cardiac surgical patients requiring ≥5 days of intensive treatment did not identify factors unequivocally justifying early treatment limitation in individuals. It found that 1-year mortality and disability rates can be maintained at a low level in this subset of patients, and that severe suffering in the ICU is infrequent.
Abstract:
Some studies of patients with acute myocardial infarction have reported that hyperglycaemia at admission may be associated with a worse outcome. This study sought to evaluate the association of blood glucose at admission with the outcome of unselected patients with acute coronary syndrome (ACS). Using the Acute Myocardial Infarction and unstable angina in Switzerland (AMIS Plus) registry, ACS patients were stratified according to their blood glucose on admission: group 1: 2.80-6.99 mmol/L, group 2: 7.00-11.09 mmol/L and group 3: > 11.10 mmol/L. Odds ratios for in-hospital mortality were calculated using logistic regression models. Of 2,786 patients, 73% were male and 21% were known to have diabetes. In-hospital mortality increased from 3% in group 1 to 7% in group 2 and to 15% in group 3. Higher glucose levels were associated with larger enzymatic infarct sizes (p<0.001) and had a weak negative correlation with angiographic or echocardiographic left ventricular ejection fraction. High admission glycaemia in ACS patients remains a significant independent predictor of in-hospital mortality (adjusted OR 1.08 per mmol/L; 95% confidence interval [CI] 1.05-1.14, p<0.001). The OR for in-hospital mortality was 1.04 per mmol/L (95% CI 0.99-1.10; p=0.140) for patients with diabetes but 1.21 per mmol/L (95% CI 1.12-1.30; p<0.001) for non-diabetic patients. In conclusion, an elevated glucose level on admission in ACS patients is a significant independent predictor of in-hospital mortality and is even more important for patients without known diabetes.
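The per-mmol/L odds ratio reported in this abstract is the exponentiated coefficient of a logistic regression model. A minimal sketch of that computation on synthetic data (all numbers and variable names are hypothetical, not the AMIS Plus data; the abstract's published OR of 1.08 is only used to simulate plausible data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
glucose = rng.uniform(3.0, 15.0, n)      # simulated admission glucose, mmol/L
true_beta = np.log(1.08)                 # simulate an OR of 1.08 per mmol/L
logit = -4.0 + true_beta * glucose
death = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated in-hospital death

# Fit a two-parameter logistic regression by Newton-Raphson.
X = np.column_stack([np.ones(n), glucose])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (death - p)             # score vector
    hess = (X * W[:, None]).T @ X        # observed information
    beta += np.linalg.solve(hess, grad)

se = np.sqrt(np.linalg.inv(hess)[1, 1])  # standard error of the glucose slope
or_per_mmol = np.exp(beta[1])            # odds ratio per mmol/L
ci = np.exp(beta[1] + np.array([-1.96, 1.96]) * se)  # 95% Wald CI
```

The OR per unit of a continuous predictor is simply exp(slope); the registry analysis additionally adjusted for covariates, which this sketch omits.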
Abstract:
BACKGROUND: The role of endothelin-1 (ET-1) and nitric oxide (NO) as two important mediators in the development of cerebral vasospasm (CVS) after subarachnoid haemorrhage (SAH) is controversial. The objective of this study was to determine whether local levels of ET-1 and NO in cerebral arterial plasma and/or in cerebrospinal fluid (CSF) are associated with the occurrence of CVS after SAH. METHODS: CVS was induced using the one-haemorrhage rabbit model and confirmed by digital subtraction angiography of the rabbits' basilar artery on day 5. Prior to sacrifice, local CSF and basilar arterial plasma samples were obtained by a transclival approach to the basilar artery. Systemic arterial plasma samples were obtained. ET-1 levels were determined by immunometric technique (pg/ml +/- SEM) and total nitrate/nitrite level spectrophotometrically (micromol/l +/- SEM). FINDINGS: Angiographic CVS was documented after SAH induction (n = 12, P < 0.05). The ET-1 level in CSF was significantly elevated by 27.3% to 0.84 +/- 0.08 pg/ml in SAH animals (n = 7) in comparison to controls (0.66 +/- 0.04 pg/ml, n = 7, P < 0.05). There was no significant difference in ET-1 levels in systemic and basilar arterial plasma samples of SAH animals compared to controls. A significant lack of local NO metabolites was documented in basilar arterial plasma after SAH (36.8 +/- 3.1 micromol/l, n = 6) compared to controls (61.8 +/- 6.2 micromol/l, n = 6, P < 0.01). CONCLUSION: This study demonstrates that an elevated ET-1 level in CSF and local lack of NO in the basilar arterial plasma samples are associated with CVS after experimental SAH.
Abstract:
OBJECTIVE: In Switzerland there is a shortage of population-based information on stroke incidence and case fatalities (CF). The aim of this study was to estimate stroke event rates and both in- and out-of-hospital CF rates. METHODS: Data on stroke diagnoses, coded according to I60-I64 (ICD-10), were taken from the Federal Hospital Discharge Statistics database (HOST) and the Cause of Death database (CoD) for the year 2004. The number of total stroke events and age- and gender-specific and age-standardised event rates were estimated; overall, in-hospital, and out-of-hospital CF were determined. RESULTS: Among the overall 13 996 hospital discharges for stroke (HOST), the number was lower in women (n = 6736) than in men (n = 7260). A total of 3568 deaths (2137 women and 1431 men) due to stroke were recorded in the CoD database. The number of estimated stroke events was 15 733, and was higher in women (n = 7933) than in men (n = 7800). Men presented significantly higher age-specific stroke event rates and a higher age-standardised event rate (178.7/100 000 versus 119.7/100 000). Overall CF rates were significantly higher for women (26.9%) than for men (18.4%). The same was true of out-of-hospital but not of in-hospital CF rates. CONCLUSION: The data on estimated stroke events obtained indicate that the stroke discharge rate underestimates the stroke event rate. Out-of-hospital deaths from stroke accounted for the largest proportion of total stroke deaths. Sex differences in both the number of total stroke events and deaths could be explained by the higher proportion of women than men aged 55+ in the Swiss population.
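Age-standardised event rates like those reported above come from direct standardisation: each age-specific rate is weighted by a standard population. A minimal sketch with made-up counts (not the Swiss data; the weights are illustrative, not the actual European Standard Population):

```python
# Hypothetical age-specific stroke events and person-years at risk
age_bands   = ["55-64", "65-74", "75-84", "85+"]
events      = [300, 900, 1500, 1100]
population  = [500_000, 350_000, 200_000, 80_000]
# Illustrative standard-population weights for the same bands
std_weights = [0.12, 0.09, 0.05, 0.015]

# Crude rate per 100 000: total events over total person-years
crude = sum(events) / sum(population) * 100_000

# Age-specific rates per 100 000, then their weighted average
specific = [e / p * 100_000 for e, p in zip(events, population)]
standardised = sum(r * w for r, w in zip(specific, std_weights)) / sum(std_weights)
```

Standardisation is what makes the male and female rates in the abstract comparable despite their different age structures; the crude rate alone would conflate age distribution with risk.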
Abstract:
BACKGROUND Acute cardiogenic shock after myocardial infarction is associated with high in-hospital mortality attributable to persisting low cardiac output. The Impella-EUROSHOCK registry evaluates the safety and efficacy of the Impella 2.5 percutaneous left-ventricular assist device in patients with cardiogenic shock after acute myocardial infarction. METHODS AND RESULTS This multicenter registry retrospectively included 120 patients (63.6±12.2 years; 81.7% male) with cardiogenic shock from acute myocardial infarction receiving temporary circulatory support with the Impella 2.5 percutaneous left-ventricular assist device. The primary end point evaluated mortality at 30 days. The secondary end point analyzed the change in plasma lactate after the institution of hemodynamic support, the rate of early major adverse cardiac and cerebrovascular events, and long-term survival. Thirty-day mortality was 64.2% in the study population. After Impella 2.5 implantation, lactate levels decreased from 5.8±5.0 mmol/L to 4.7±5.4 mmol/L (P=0.28) and 2.5±2.6 mmol/L (P=0.023) at 24 and 48 hours, respectively. Early major adverse cardiac and cerebrovascular events were reported in 18 (15%) patients. Major bleeding at the vascular access site, hemolysis, and pericardial tamponade occurred in 34 (28.6%), 9 (7.5%), and 2 (1.7%) patients, respectively. Age >65 years and a lactate level >3.8 mmol/L at admission were identified as predictors of 30-day mortality. After 317±526 days of follow-up, survival was 28.3%. CONCLUSIONS In patients with acute cardiogenic shock from acute myocardial infarction, Impella 2.5 treatment is feasible and results in a reduction of lactate levels, suggesting improved organ perfusion. However, 30-day mortality remains high in these patients. 
This likely reflects the last-resort character of Impella 2.5 use in selected patients with a poor hemodynamic profile and a greater imminent risk of death. Carefully conducted randomized controlled trials are necessary to evaluate the efficacy of Impella 2.5 support in this high-risk patient group.
Abstract:
Although a radiographic unit is not standard equipment for bovine practitioners in hospital or field situations, ultrasound machines with 7.5-MHz linear transducers have been used in bovine reproduction for many years, and are eminently suitable for evaluation of orthopedic disorders. The goal of this article is to encourage veterinarians to use radiology and ultrasonography for the evaluation of bovine orthopedic disorders. These diagnostic imaging techniques improve the likelihood of a definitive diagnosis in every bovine patient but especially in highly valuable cattle, whose owners demand increasingly more diagnostic and surgical interventions that require high-level specialized techniques.
Abstract:
PURPOSE Computed tomography (CT) accounts for more than half of the total radiation exposure from medical procedures, which makes dose reduction in CT an effective means of reducing radiation exposure. We analysed the dose reduction that can be achieved with a new CT scanner [Somatom Edge (E)] that incorporates new developments in hardware (detector) and software (iterative reconstruction). METHODS We compared weighted volume CT dose index (CTDIvol) and dose length product (DLP) values of 25 consecutive patients studied with non-enhanced standard brain CT on the new scanner and on each of two previous models: a 64-row multi-detector CT (MDCT) scanner (S64) and a 16-row MDCT scanner (S16). We analysed signal-to-noise and contrast-to-noise ratios in images from the three scanners, and three neuroradiologists performed a quality rating to analyse whether the dose reduction techniques still yield sufficient diagnostic quality. RESULTS CTDIvol of scanner E was 41.5 and 36.4 % less than the values of scanners S16 and S64, respectively; the DLP values were 40 and 38.3 % less. All differences were statistically significant (p < 0.0001). Signal-to-noise and contrast-to-noise ratios were best with S64; these differences also reached statistical significance. Image analysis, however, showed "non-inferiority" of scanner E regarding image quality. CONCLUSIONS The first experience with the new scanner shows that new dose reduction techniques allow for up to 40 % dose reduction while still maintaining image quality at a diagnostically usable level.
Abstract:
AIM To compare the computed tomography (CT) dose and image quality of filtered back projection against iterative reconstruction and against CT with a minimal-electronic-noise detector. METHODS A lung phantom (Chest Phantom N1 by Kyoto Kagaku) was scanned with 3 different CT scanners: the Somatom Sensation, the Definition Flash and the Definition Edge (all from Siemens, Erlangen, Germany). The scan parameters were identical to the Siemens presetting for THORAX ROUTINE (scan length 35 cm and FOV 33 cm). Nine different exposure levels were examined (reference mAs/peak voltage): 100/120, 100/100, 100/80, 50/120, 50/100, 50/80, 25/120, 25/100 and 25 mAs/80 kVp. Images from the Somatom Sensation were reconstructed using classic filtered back projection. Iterative reconstruction (SAFIRE, level 3) was performed for the two other scanners. A Stellar detector was used with the Somatom Definition Edge. The CT doses were represented by the dose length products (DLPs) (mGy·cm) provided by the scanners. Signal, contrast, noise and subjective image quality were recorded by two different radiologists with 10 and 3 years of experience in chest CT radiology. To determine the average dose reduction between two scanners, the integral of the dose difference was calculated from the lowest to the highest noise level. RESULTS When using iterative reconstruction (IR) instead of filtered back projection (FBP), the average dose reduction was 30%, 52% and 80% for bone, soft tissue and air, respectively, for the same image quality (P < 0.0001). The recently introduced Stellar detector (Sd) lowered the radiation dose by an additional 27%, 54% and 70% for bone, soft tissue and air, respectively (P < 0.0001). The benefit of dose reduction was larger at lower dose levels. At the same radiation dose, an average of 34% (22%-37%) and 25% (13%-46%) more contrast-to-noise was achieved by changing from FBP to IR and from IR to Sd, respectively. 
For the same contrast-to-noise level, an average dose reduction of 59% (46%-71%) and 51% (38%-68%) was achieved with IR and Sd, respectively. For the same subjective image quality, the dose could be reduced by 25% (2%-42%) and 44% (33%-54%) using IR and Sd, respectively. CONCLUSION This study showed an average dose reduction between 27% and 70% for the new Stellar detector, which is equivalent to the gain from using IR instead of FBP.
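The signal-to-noise and contrast-to-noise ratios compared in these two CT abstracts are conventionally computed from region-of-interest (ROI) statistics; a minimal sketch with synthetic ROI pixel values (hypothetical numbers, not the phantom measurements):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic ROI pixel values in Hounsfield units: a tissue ROI and a
# homogeneous background ROI, both with the same image noise (sigma = 5 HU)
tissue = rng.normal(40.0, 5.0, 1000)
background = rng.normal(0.0, 5.0, 1000)

noise = background.std(ddof=1)   # image noise: SD within a homogeneous ROI
snr = tissue.mean() / noise      # signal-to-noise ratio
cnr = (tissue.mean() - background.mean()) / noise  # contrast-to-noise ratio
```

Because noise falls with dose, comparing scanners "at the same contrast-to-noise level", as above, amounts to finding the dose at which each scanner's CNR curves intersect a common value.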
Abstract:
OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in categories of the UNESCO/International Standard Classification of Education: non-completed basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by education level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15 414 individuals, 52, 45, 37, and 31% with non-completed basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared to patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for non-completed basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational level do not benefit equally from timely cART initiation.
Abstract:
BACKGROUND Acute mesenteric ischemia (AMI) is an emergency with a mortality rate of up to 50 %. Detecting AMI continues to be a major challenge. This study assessed the correlation of repeated preoperative serum lactate measurements with bowel necrosis and aimed to identify risk factors for a lethal outcome in patients with AMI. METHODS A retrospective study of 91 patients with clinically and pathologically confirmed AMI from January 2006 to December 2012 was performed. RESULTS The in-hospital mortality rate was 42.9 %. Two hundred nine preoperative lactate measurements were analyzed (2.3 ± 1.1 values per patient). Six hours or less prior to surgery, the mean serum lactate level was significantly higher (4.97 ± 4.21 vs. 3.24 ± 3.05 mmol/L, p = 0.006) and the mean pH significantly lower (7.28 ± 0.12 vs. 7.37 ± 0.08, p = 0.001) compared to >6 h before surgery. Thirty-four patients had at least two lactate measurements within 24 h prior to surgery. In this subgroup, 17 patients (50 %) exhibited an increase and 17 patients (50 %) a decrease in lactate levels. Forward logistic regression analysis showed that the length of necrotic bowel and the highest lactate value within 24 h prior to surgery were independent risk factors for mortality (r² = 0.329). CONCLUSION The value of serial lactate and pH measurements for predicting the length of necrotic bowel is very limited. Length of necrotic bowel and lactate values are independent risk factors for mortality.
Abstract:
PLACENTAL GLUCOSE TRANSPORTER (GLUT)-1 REGULATION IN PREECLAMPSIA Camilla Marini a,b, Benjamin P. Lüscher a,b, Marianne Jörger-Messerli a,b, Ruth Sager a,b, Xiao Huang c, Jürg Gertsch c, Matthias A. Hediger c, Christiane Albrecht c, Marc U. Baumann a,c, Daniel V. Surbek a,c; a Department of Obstetrics and Gynecology, University Hospital of Bern, Bern, Switzerland; b Department of Clinical Research, University of Bern, Bern, Switzerland; c Institute for Biochemistry and Molecular Medicine, University of Bern, Bern, Switzerland. Objectives: Glucose is a primary energy source for the fetus. The absence of significant gluconeogenesis in the fetus means that fetal uptake of this vital nutrient depends on maternal supply and subsequent transplacental transport. Altered expression and/or function of placental transporters may affect the intrauterine environment and could compromise fetal and maternal well-being. We speculated that pre-eclampsia (PE) impairs the placental glucose transport system. Methods: Placentae were obtained after elective caesarean sections following normal and pre-eclamptic pregnancies. Syncytial basal membrane (BM) and apical microvillus membrane (MVM) fractions were prepared using differential ultracentrifugation and magnesium precipitation. Protein expression was assessed by western blot analysis. mRNA levels in whole villous tissue lysate were quantified by real-time PCR. To assess glucose transport activity, a radiolabeled substrate uptake assay and a transepithelial transport model using primary cytotrophoblasts were established. Results: GLUT1 mRNA expression was not changed in PE compared to control, whereas protein expression was significantly down-regulated. Glucose uptake into syncytial microvesicles was reduced in PE compared to control. 
In the transepithelial transport model, phloretin-mediated inhibition of GLUT1 at the apical side of primary cytotrophoblasts at IC50 reduced transepithelial glucose transport by 44%. Conclusions: GLUT1 is down-regulated at the protein and functional levels in PE compared to control. Altering glucose transport activity by inhibition of apical GLUT1 indicates that transplacental glucose transport may be regulated on the apical side of the syncytiotrophoblast. These results may help to better understand the regulation of the GLUT1 transporter and, in the future, to develop preventive strategies that modulate fetal programming and thereby reduce the incidence of later-life disease for both the mother and her child.
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms for complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing and has a robust variable importance ranking measure. A novel variable selection method is proposed in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhanced the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented established logistic regression. This study suggests that Random Forests is recommended for such high-dimensional data: one can use Random Forests to select the important variables and then use logistic regression or Random Forests itself to estimate the effect sizes of the predictors and to classify new observations. We also found that the mean decrease in accuracy is a more reliable variable ranking measure than the mean decrease in Gini.
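The noise-level cut-off idea described in this thesis abstract can be sketched with scikit-learn (an assumed stand-in for the thesis's own implementation; permutation importance is used here as an approximation of the mean decrease in accuracy that the thesis favors, and all data are synthetic):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 600
informative = rng.normal(size=(n, 2))   # two genuinely predictive variables
noise = rng.normal(size=(n, 20))        # many weak/noise predictors
y = (informative.sum(axis=1) + 0.5 * rng.normal(size=n)) > 0
X = np.hstack([informative, noise])

# Synthetic positive-control columns: pure noise appended to the data.
# The highest importance any of them attains defines the selection cut-off.
controls = rng.normal(size=(n, 5))
X_aug = np.hstack([X, controls])

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_aug, y)
perm = permutation_importance(rf, X_aug, y, n_repeats=10, random_state=0)
imp = perm.importances_mean             # akin to mean decrease in accuracy

cutoff = imp[-controls.shape[1]:].max() # importance of the best pure-noise column
selected = [j for j in range(X.shape[1]) if imp[j] > cutoff]
```

The cut-off is thus driven by the data's own noise level rather than an arbitrary fixed threshold, which matches the automatic selection idea described above.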
Abstract:
Context. Healthcare utilization by elderly cardiovascular patients in the United States will increase in the near future owing to an aging population. This trend could burden urban emergency centers, which have become a source of primary care. Objective. The objective of this study was to determine the association of age, gender, ethnicity, insurance, and other presenting variables with hospital admission in an emergency center for elderly cardiovascular patients. Design, setting and participants. An anonymous retrospective review of patient login records of an urban emergency center for the years 2004 and 2005 was conducted. Elderly patients (age ≥65 years) with cardiovascular disease (ICD-9: 390-459) were included. Multivariate logistic regression analysis was used to identify independent factors for hospital admission. Four major cardiovascular reasons for hospitalisation (ischemic heart disease, heart failure, hypertensive disorders and stroke) were analysed separately. Results. The number of elderly patients in the emergency center is increasing, and the most common reason for their visits was hypertension. The majority (59%) of the 12,306 elderly patients were female. Forty-five percent were uninsured, and 1,973 patients had cardiovascular disease. Older age (OR 1.10; CI 1.02-1.19) was associated with a marginal increase in hospital admission among elderly stroke patients. Elderly females were more likely than elderly males to be hospitalised for ischemic heart disease (OR 2.71; CI 1.22-6.00) and heart failure (OR 1.58; CI 1.001-2.52). Furthermore, insured elderly heart failure patients (OR 0.54; CI 0.31-0.93) and elderly African American heart failure patients (OR 0.32; CI 0.13-0.75) were less likely to be hospitalised. Ambulance use was associated with greater hospital admission among the elderly cardiovascular patients studied, except for stroke. Conclusion. 
Appropriate health care distribution policies are needed for elderly patients, particularly elderly females, the uninsured, and racial/ethnic minorities. These findings could help triage nurses in emergency centers identify patients who are more likely to be hospitalised, so that urgent care can be offered and appointments scheduled in primary care clinics. In addition, health care plans could be formulated to improve elderly primary care, decrease overcrowding in emergency centers, and decrease elderly healthcare costs in the future.
Abstract:
More than a quarter of patients with HIV in the United States are diagnosed in hospital settings, most often with advanced HIV-related conditions (1). There has been little research on the causes of hospitalization when patients are first diagnosed with HIV. The aim of this study was to determine whether patients are hospitalized due to an HIV-related cause or due to some other co-morbidity. Reduced access to care could be one possible reason why patients are diagnosed late in the course of the disease. This study compared the access to care of patients diagnosed with HIV in hospital and outpatient settings. The data used for the study were part of the ongoing study "Attitudes and Beliefs and Steps of HIV Care". The participants were newly diagnosed with HIV and recruited from both inpatient and outpatient settings. The primary and secondary diagnoses were extracted from hospital discharge reports, and a primary reason for hospitalization was ascertained. These reasons were classified as HIV-related, other infectious, non-infectious, other systemic, and miscellaneous causes. Access to care was determined by a score based on responses to a set of questions derived from the HIV Cost and Services Utilization Study (HCSUS) on a 6-point scale. The mean scores of the hospitalized patients and of the patients diagnosed in an outpatient setting were compared. We used multiple linear regression to compare mean differences between the two groups after adjusting for age, sex, race, household income, educational level and health insurance at the time of diagnosis. There were 185 participants in the study, including 78 diagnosed in hospital settings and 107 diagnosed in outpatient settings. We found that HIV-related conditions were the leading cause of hospitalization, accounting for 60% of admissions, followed by non-infectious causes (20%) and other infectious causes (17%). 
The inpatient-diagnosed group did not have greater perceived access to care than the outpatient group. Regression analysis demonstrated a statistically significant improvement in access to care with advancing education level (p=0.04) and with better health insurance (p=0.004). HIV-related causes account for many hospitalizations when patients are first diagnosed with HIV. Many of these HIV-related hospitalizations could have been prevented if patients had been diagnosed early and linked to medical care. Programs to increase HIV awareness need to be an integral part of activities aimed at controlling the spread of HIV in the community. Routine testing for HIV infection to promote early diagnosis can prevent significant morbidity and mortality.
Abstract:
Differential access to health care services has been observed among various groups in the United States. Minorities and low-income groups have been especially notable in their decreased access to regular providers of care. This is believed by many to account for some of the higher rates of morbidity and mortality and shorter life expectancies of these groups. This research delineated the factors associated with health care access for a particular subset of a minority group, the Mexican American elderly in Texas. Hospital admission and evidence of a regular source of medical care and dental care were chosen as the indicators of access to health care. This study analyzed survey interview data from the Texas Study on Aging, 1976. The 597 Mexican American elderly included in this study were representative of the non-institutionalized Mexican American elderly in Texas aged 55 or older. The results indicate that hospital admission is not a question of discretion and that common barriers to access, such as income, health insurance, and distance to the nearest facility, are not important in determining hospital admission. Mexican American elderly who need to be hospitalized, as indicated by self-perception of health and disability days, will be hospitalized. The results also indicate that having a regular source of medical care is influenced by many factors, some mutable and some immutable. The well-established and immutable factors of age, sex, and need were confirmed. However, mutable factors such as area of residence and income were also found to have a significant influence. Mexican American elderly living in urban areas had significantly less access to a regular source of medical care, as did those who were near the poverty level (as opposed to those who were well below it). 
In general, persons claiming a regular source of medical care were more likely to be women, to have many health needs, to be near the poverty level, to live in urban areas, and to have extensive social support systems. Persons claiming a regular source of dental care tended to be more advantaged: they had more education, a more extensive informal social support network, and higher income, and were generally younger and in better health. They were also more likely to have private health insurance. (Author's abstract exceeds stipulated maximum length. Discontinued here with permission of author.)