943 results for multi-trauma patients
Abstract:
PURPOSE We tested the hypothesis that whiplash trauma leads to changes in the signal intensity of cervical discs on T2-weighted images. METHODS AND MATERIALS 50 whiplash patients (18-65 years) were examined within 48 h after a motor vehicle accident, and again after 3 and 6 months, and compared to 50 age- and sex-matched controls. Signal intensities in ROIs of the discs at the levels C2/3 to C7/T1 and of the adjacent vertebral bodies were measured on sagittal T2-weighted MR images and normalized using the average of ROIs in fat tissue. The contrast between discs and both adjacent vertebrae was calculated, and disc degeneration was graded using the Pfirrmann grading system. RESULTS Whiplash trauma did not have a significant effect on the normalized signals from discs and vertebrae, on the contrast between discs and adjacent vertebrae, or on the Pfirrmann grading. However, the contrast between discs and adjacent vertebrae and the Pfirrmann grading showed a strong correlation. In healthy volunteers, the contrast between discs and adjacent vertebrae and the Pfirrmann grading increased with age and depended on the disc level. CONCLUSION We could not find any trauma-related changes of cervical disc signal intensities. Normalized disc signals and Pfirrmann grading changed with age and varied between disc levels with the MR sequence used.
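The abstract does not spell out the normalization and contrast formulas. The sketch below assumes (hypothetically) that each ROI signal is divided by the mean fat-tissue signal and that the disc-vertebra contrast is the relative difference between the disc and the mean of its two adjacent vertebral bodies; the signal values are arbitrary illustrations, not study data.

```python
import numpy as np

def normalized_signal(roi_mean, fat_roi_means):
    """Normalize a ROI mean signal by the average of the fat-tissue ROIs."""
    return roi_mean / np.mean(fat_roi_means)

def disc_vertebra_contrast(disc_signal, upper_vertebra, lower_vertebra):
    """Contrast between a disc and its two adjacent vertebral bodies
    (assumed formula: relative difference to the mean vertebral signal)."""
    vertebra_mean = (upper_vertebra + lower_vertebra) / 2.0
    return (disc_signal - vertebra_mean) / (disc_signal + vertebra_mean)

# Illustrative values for one disc level (e.g. C5/6)
fat = [310.0, 295.0, 305.0]
disc = normalized_signal(120.0, fat)
upper = normalized_signal(95.0, fat)
lower = normalized_signal(90.0, fat)
print(disc_vertebra_contrast(disc, upper, lower))
```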
Abstract:
Objective: IL23 is involved in chronic inflammation, but its role in cancer progression is not fully elucidated. Here we characterize the IL23 subunits p40 and p19 and the IL23 receptor (IL23R) in the normal-adenoma-carcinoma-metastasis cascade of colorectal cancers and their relationship to clinicopathological and outcome data. Method: Immunohistochemistry for IL23R, IL12p40, IL23 and IL23p19 (monoclonal) was performed on a multi-punch tissue microarray (n=213 patients). Expression differences between normal mucosa, adenomas, cancers and lymph nodes were evaluated. Correlation with clinicopathological and outcome data was undertaken. Results were validated on an independent cohort (n=341 patients). Results: An increase in expression from normal mucosa to adenoma to cancer was observed (p<0.0001; all), followed by a marked reduction in lymph nodes (p<0.0001; all). Cytoplasmic and/or membranous staining of all markers was unrelated to outcome. Nuclear IL23p19 staining occurred in 23.1% and was associated with smaller tumor diameter (p=0.0333), early pT (p=0.0213), early TNM stage (p=0.0186), absence of vascular (p=0.0124) and lymphatic invasion (p=0.01493), and favorable survival (univariate (p=0.014) and multivariable (p=0.0321) analysis). All IL23p19-positive patients were free of distant metastasis (p=0.0146). The survival and metastasis results were validated in Cohort 2. Conclusion: The presence of nuclear IL23p19 is related to indolent tumor features and favorable outcome, supporting a more 'protective' role of this protein in colorectal cancer progression.
Abstract:
BACKGROUND Posttraumatic stress disorder (PTSD) may occur in patients after exposure to a life-threatening illness. About one out of six patients develops clinically relevant levels of PTSD symptoms after acute myocardial infarction (MI). Symptoms of PTSD are associated with impaired quality of life and increase the risk of recurrent cardiovascular events. The main hypothesis of the MI-SPRINT study is that trauma-focused psychological counseling is more effective than non-trauma-focused counseling in preventing posttraumatic stress after acute MI. METHODS/DESIGN The study is a single-center, randomized controlled psychological trial with two active intervention arms. The sample consists of 426 patients aged 18 years or older who are at 'high risk' of developing clinically relevant posttraumatic stress symptoms. 'High-risk' patients are identified with three single-item questions with a numeric rating scale (0 to 10) asking about 'pain during MI', 'fear of dying until admission' and/or 'worrying and feeling helpless when being told about having MI'. Exclusion criteria are emergency heart surgery, severe comorbidities, current severe depression, disorientation, cognitive impairment and suicidal ideation. Patients will be randomly allocated to a single 45-minute counseling session targeting either specific MI-triggered traumatic reactions (that is, the verum intervention) or the general role of psychosocial stress in coronary heart disease (that is, the control intervention). The session will take place in the coronary care unit within 48 hours, at the bedside, after patients have reached stable circulatory conditions. Each patient will additionally receive an illustrated information booklet as study material. Sociodemographic factors, psychosocial and medical data, and cardiometabolic risk factors will be assessed during hospitalization. The primary outcome is the interviewer-rated posttraumatic stress level at three-month follow-up, which is hypothesized to be at least 20% lower in the verum group than in the control group (t-test). Secondary outcomes are posttraumatic stress levels at 12-month follow-up, and psychosocial functioning and cardiometabolic risk factors at both follow-up assessments. DISCUSSION If the verum intervention proves to be effective, the study will be the first to show that a brief trauma-focused psychological intervention delivered within a somatic health care setting can reduce the incidence of posttraumatic stress in acute MI patients. TRIAL REGISTRATION ClinicalTrials.gov: NCT01781247.
Abstract:
BACKGROUND Fever and neutropenia (FN) often complicate anticancer treatment and can be caused by potentially fatal infections. Knowledge of pathogen distribution is paramount for optimal patient management. METHODS Microbiologically defined infections (MDI) were analyzed in pediatric cancer patients presenting with FN after nonmyeloablative chemotherapy who were enrolled in a prospective multi-center study. The effectiveness of empiric antibiotic therapy in FN episodes with bacteremia was assessed, taking into consideration recently published treatment guidelines for pediatric patients with FN. RESULTS MDI were identified in a minority (22%) of pediatric cancer patients with FN. In patients with MDI, compared to those without, fever (median, 5 [IQR 3-8] vs. 2 [IQR 1-3] days, p < 0.001) and hospitalization (10 [6-14] vs. 5 [3-8] days, p < 0.001) lasted longer, transfer to the intensive care unit was more likely (13 of 95 [14%] vs. 7 of 346 [2.0%], p < 0.001), and antibiotics were given longer (10 [7-14] vs. 5 [4-7] days, p < 0.001). Empiric antibiotic therapy in FN episodes with bacteremia was highly effective if not only intrinsic and reported antimicrobial susceptibilities were considered but the purposeful omission of coverage for coagulase-negative staphylococci and enterococci was also taken into account (81% [95% CI 68-90] vs. 96.6% [95% CI 87-99.4], p = 0.004). CONCLUSIONS MDI were identified in a minority of FN episodes, but they significantly affected the management and clinical course of pediatric cancer patients. Compliance with published guidelines was associated with the effectiveness of empiric antibiotic therapy in FN episodes with bacteremia.
Abstract:
PURPOSE Computed tomography (CT) accounts for more than half of the total radiation exposure from medical procedures, which makes dose reduction in CT an effective means of reducing radiation exposure. We analysed the dose reduction that can be achieved with a new CT scanner [Somatom Edge (E)] that incorporates new developments in hardware (detector) and software (iterative reconstruction). METHODS We compared volume CT dose index (CTDIvol) and dose-length product (DLP) values from 25 consecutive patients each, studied with non-enhanced standard brain CT on the new scanner and on two previous models, a 64-row multi-detector CT (MDCT) scanner (S64) and a 16-row MDCT scanner (S16). We analysed signal-to-noise and contrast-to-noise ratios in images from the three scanners, and three neuroradiologists performed a quality rating to analyse whether the dose reduction techniques still yield sufficient diagnostic quality. RESULTS The CTDIvol of scanner E was 41.5% and 36.4% lower than the values of scanners S16 and S64, respectively; the DLP values were 40% and 38.3% lower. All differences were statistically significant (p < 0.0001). Signal-to-noise and contrast-to-noise ratios were best in S64; these differences also reached statistical significance. Image analysis, however, showed "non-inferiority" of scanner E regarding image quality. CONCLUSIONS The first experience with the new scanner shows that new dose reduction techniques allow for up to 40% dose reduction while still maintaining image quality at a diagnostically usable level.
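The abstract reports the dose differences as percentage reductions relative to the older scanners. A minimal sketch of how such percentages and the signal-to-noise / contrast-to-noise ratios are commonly computed is shown below; the CTDIvol values in the example are illustrative, not the study's measurements.

```python
import numpy as np

def percent_reduction(new_value, reference_value):
    """Relative dose reduction of the new scanner vs. a reference scanner."""
    return 100.0 * (reference_value - new_value) / reference_value

def snr(signal_roi):
    """Signal-to-noise ratio: mean ROI signal divided by its standard deviation."""
    return np.mean(signal_roi) / np.std(signal_roi)

def cnr(tissue_roi, reference_roi):
    """Contrast-to-noise ratio between two ROIs (e.g. grey vs. white matter)."""
    return abs(np.mean(tissue_roi) - np.mean(reference_roi)) / np.std(reference_roi)

# Illustrative CTDIvol values in mGy (hypothetical numbers)
print(percent_reduction(new_value=35.0, reference_value=59.8))  # roughly 41.5 %
```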
Abstract:
Background Our knowledge of the factors influencing mortality of patients with pelvic ring injuries and the impact of associated injuries is currently based on limited information. Questions/purposes We identified the (1) causes and time of death, (2) demography, and (3) pattern and severity of injuries in patients with pelvic ring fractures who did not survive. Methods We prospectively collected data on 5340 patients listed in the German Pelvic Trauma Registry between April 30, 2004 and July 29, 2011; 3034 of 5340 (57%) patients were female. Demographic data and parameters indicating the type and severity of injury were recorded for patients who died in hospital (nonsurvivors) and compared with data of patients who survived (survivors). The median follow-up was 13 days (range, 0-1117 days). Results A total of 238 (4%) patients died, a median of 2 days after trauma. The main cause of death was massive bleeding (34%), predominantly from the pelvic region (62% of all patients who died because of massive bleeding). Fifty-six percent of nonsurvivors and 43% of survivors were male. Nonsurvivors were characterized by a higher incidence of complex pelvic injuries (32% versus 8%), fewer isolated pelvic ring fractures (13% versus 49%), lower initial blood hemoglobin concentration (6.7 ± 2.9 versus 9.8 ± 3.0 g/dL) and systolic arterial blood pressure (77 ± 27 versus 106 ± 24 mmHg), and a higher injury severity score (ISS) (35 ± 16 versus 15 ± 12). Conclusion Patients with pelvic fractures who did not survive were characterized by male gender, severe multiple trauma, and major hemorrhage.
Abstract:
Background: Percutaneous iliosacral screw placement following pelvic trauma is a very demanding technique with a high rate of screw malpositions, which may be associated with the risk of neurological damage or inadequate stability. In the conventional technique, the screw's correct entry point and the small target corridor for the iliosacral screw may be difficult to visualise using an image intensifier. 2D and 3D navigation techniques may therefore be helpful tools. The aim of this multicentre study was to evaluate the intra- and postoperative complications after percutaneous screw implantation, classifying the fractures using data from a prospective pelvic trauma registry. The a priori hypothesis was that the navigation techniques have lower rates of intraoperative and postoperative complications. Methods: This study is based on data from the prospective pelvic trauma registry introduced by the German Society of Traumatology and the German Section of the AO/ASIF International in 1991. The registry provides data on all patients with pelvic fractures treated between July 2008 and June 2011 at any one of the 23 Level I trauma centres contributing to the registry. Results: A total of 2615 patients were identified. Of these, a further analysis was performed in 597 patients suffering injuries of the SI joint (187 with surgical interventions) and 597 patients with sacral fractures (334 with surgical interventions). The rate of intraoperative complications was not significantly different, with 10/114 patients undergoing navigated techniques (8.8%) and 14/239 patients in the conventional group (5.9%) for percutaneous screw implantation (p = 0.4242). Postoperative complications were observed in 30/114 patients in the navigated group (26.3%) and in 70/239 patients (29.3%) in the conventional group (p = 0.6542). Patients who did not undergo surgery had a relatively high rate of complications during their hospital stay (66/197 cases, 33.5%). The rate of surgically treated fractures was higher in the group with more unstable Type-C fractures, but the fracture classification had no significant influence on the rate of complications. Discussion: In this prospective multicentre study, the 2D/3D navigation techniques showed rates of intraoperative and postoperative complications similar to those of the conventional technique. The rate of neurological complications was significantly higher in the navigated group.
Abstract:
Within the past 15 years, significant advances in the imaging of multiorgan and complex trauma, driven primarily by improvements in cross-sectional imaging, have optimized the expedient diagnosis and management of the polytrauma patient. At the forefront, multidetector computed tomography (MDCT) has become the cornerstone of modern emergency departments and trauma centers. In many institutions, MDCT is the de facto diagnostic tool upon trauma activation. In the setting of pelvic imaging, MDCT (with its high spatial resolution and sensitivity as well as short acquisition times) allows for rapid identification and assessment of pelvic hemorrhage, leading to faster triage and definitive management. In trauma centers throughout the world, angiography and minimally invasive catheter-based embolization techniques performed by interventional radiologists have become the standard of care for patients with acute pelvic trauma and related multiorgan hemorrhage. In an interdisciplinary setting, embolization may be performed either alone or as an adjunct to open or closed reduction and stabilization techniques. A team-based approach involving multiple disciplines (e.g., radiology, traumatology, orthopedic surgery, intensive care medicine) is crucial to monitor and treat the actively bleeding patient appropriately.
Abstract:
In this paper, we propose novel methodologies for the automatic segmentation and recognition of multi-food images. The proposed methods implement the first modules of a carbohydrate counting and insulin advisory system for type 1 diabetic patients. Initially, the plate is segmented using pyramidal mean-shift filtering and a region growing algorithm. Each of the resulting segments is then described by both color and texture features and classified by a support vector machine into one of six major food classes. Finally, a modified version of the Huang and Dom evaluation index is proposed, addressing the particular needs of the food segmentation problem. The experimental results prove the effectiveness of the proposed method, achieving a segmentation accuracy of 88.5% and a recognition rate of 87%.
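The exact features, parameters and class labels are not given in the abstract. The Python sketch below only illustrates the described pipeline (pyramidal mean-shift filtering, per-segment color/texture descriptors, SVM classification) with hypothetical choices; the region-growing stage is left as a placeholder.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

# Hypothetical class labels; the paper's six food classes are not listed in the abstract.
FOOD_CLASSES = ["bread", "meat", "pasta", "potatoes", "rice", "vegetables"]

def preprocess_plate(image_bgr):
    """First stage of the described pipeline: pyramidal mean-shift filtering.
    The subsequent region-growing step is application specific and is only
    indicated here as a placeholder."""
    filtered = cv2.pyrMeanShiftFiltering(image_bgr, 21, 30)  # spatial / color radii are guesses
    # region_masks = region_growing(filtered)                # placeholder, not implemented here
    return filtered

def describe_segment(image_bgr, mask):
    """Hypothetical per-segment descriptor: mean BGR color plus a simple
    texture measure (standard deviation of gray values inside the segment)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    pixels = image_bgr[mask]
    return np.concatenate([pixels.mean(axis=0), [gray[mask].std()]])

# Classification stage: an SVM trained on labelled segment descriptors.
# X_train, y_train = ...   # descriptors and class indices from annotated images
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# predicted = FOOD_CLASSES[clf.predict([describe_segment(img, seg_mask)])[0]]
```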
Abstract:
BACKGROUND & AIMS: Refractory ascites (RA) affects 10% of patients with advanced cirrhosis and ascites. Usual therapy includes large-volume paracentesis and, in selected patients, a transjugular intrahepatic portosystemic shunt (TIPS). These therapies may be associated with increased morbidity: paracentesis may induce circulatory dysfunction and impair quality of life, and TIPS may induce encephalopathy and is associated with increased mortality in patients with severe liver dysfunction. We present the results of a multicenter, non-randomized trial to assess the safety and efficacy of a new automated pump system for the treatment of RA. METHODS: Forty patients at 9 centers (February 2010-June 2011) received an implanted pump for the automated removal of ascites from the peritoneal cavity into the bladder, from where it was eliminated through normal urination. Patients were followed up for 6 months. The primary study outcome was safety. Secondary outcomes included recurrence of tense ascites and pump performance. RESULTS: Surgical complications occurred early in the study and became less frequent over time. The pump system removed 90% of the ascites and significantly reduced the median number of large-volume paracenteses per month [3.4 (range 1-6) vs. 0.2 (range 0-4); p <0.01]. Cirrhosis-related adverse events decreased during follow-up. CONCLUSIONS: The automated pump appears to be an efficacious tool for moving ascites from the peritoneal cavity to the bladder. Its safety is still moderate, but broader use in different countries will improve the surgical technique as well as the medical surveillance. A prospective randomized clinical trial vs. large-volume paracentesis is underway to confirm these preliminary results.
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester triage system (MTS). Yet, these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded and left-over blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, the social workers' perspective. To assess outcomes, data from electronic medical records will be used, and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints are defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of patients to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system for estimating initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared against usual care in a later randomized controlled trial in terms of resource use, length of hospital stay, overall costs and patient outcomes, namely mortality, re-hospitalization, quality of life and satisfaction with care.
Abstract:
BACKGROUND AND PURPOSE Reproducible segmentation of brain tumors on magnetic resonance images is an important clinical need. This study was designed to evaluate the reliability of a novel, fully automated segmentation tool for brain tumor image analysis in comparison to manually defined tumor segmentations. METHODS We prospectively evaluated preoperative MR images from 25 glioblastoma patients. Two independent expert raters performed manual segmentations. Automatic segmentations were performed using the Brain Tumor Image Analysis software (BraTumIA). In order to study the different tumor compartments, the complete tumor volume TV (enhancing part plus non-enhancing part plus necrotic core of the tumor), the TV+ (TV plus edema) and the contrast-enhancing tumor volume CETV were identified. We quantified the overlap between manual and automated segmentation by calculating diameter measurements as well as the Dice coefficients, positive predictive values, sensitivity, relative volume error and absolute volume error. RESULTS Comparison of automated versus manual extraction of 2-dimensional diameter measurements showed no significant difference (p = 0.29). Comparison of automated versus manual volumetric segmentations showed significant differences for TV+ and TV (p<0.05) but no significant differences for CETV (p>0.05) with regard to the Dice overlap coefficients. Spearman's rank correlation coefficients (ρ) of TV+, TV and CETV showed highly significant correlations between automatic and manual segmentations. Tumor localization did not influence the accuracy of segmentation. CONCLUSIONS In summary, we demonstrated that BraTumIA supports radiologists and clinicians by providing accurate measures of cross-sectional diameter-based tumor extensions. The automated volume measurements were comparable to manual tumor delineation for CETV tumor volumes, and outperformed inter-rater variability for overlap and sensitivity.
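The overlap measures named in the abstract follow standard definitions. Below is a minimal sketch, assuming the automated and manual segmentations are available as binary masks of equal shape (the relative volume error uses one common definition; the paper's exact convention is not stated).

```python
import numpy as np

def overlap_metrics(auto_mask, manual_mask):
    """Overlap between an automated and a manual binary segmentation mask.
    Voxel-size scaling is omitted for brevity."""
    auto = auto_mask.astype(bool)
    manual = manual_mask.astype(bool)
    tp = np.logical_and(auto, manual).sum()            # voxels present in both masks
    dice = 2.0 * tp / (auto.sum() + manual.sum())      # Dice overlap coefficient
    ppv = tp / auto.sum()                              # positive predictive value
    sensitivity = tp / manual.sum()                    # fraction of the manual mask recovered
    rel_vol_error = (auto.sum() - manual.sum()) / manual.sum()
    return dice, ppv, sensitivity, rel_vol_error
```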
Abstract:
BACKGROUND The fingertip is the most commonly injured and an aesthetically important part of the hand. METHODS In this retrospective study we analyzed data from 700 patients operated on between 1997 and 2008 for complications after nail splinting with a native nail or a silicone nail. Inclusion criteria were patients living in Bern/Berner Land, complete documentation, the same surgical team, standard antibiotics, acute trauma, no nail bed transplantation, and no systemic diseases. Groups were analyzed for differences in age, gender, cause and extent of trauma, bony injury and its extent, infection, infectious agent, and nail deformities. Statistical analysis was done using the χ² test, Fisher's exact test, and Pearson correlation coefficients. RESULTS A total of 401 patients, with a median age of 39.5 years, were included. There were more men with injured nails. Two hundred forty native nails and 161 silicone splints were used. There were 344 compression injuries, 44 amputations, and 13 avulsion injuries. Forty-three patients had an infection, with gram-positive bacteria (Staphylococcus aureus) causing most infections. A total of 157 nail dystrophies were observed, most often split nails. The native nail splint group showed significantly (p < 0.015) fewer nail deformities than the silicone nail splint group; otherwise, there were no statistical differences. However, there were twice as many infections in the silicone nail group. CONCLUSION It seems advantageous to use the native nail for splinting after trauma when possible. In case of a destroyed and unusable nail plate, a nail substitute has to be used.
Abstract:
BACKGROUND The treatment and outcomes of patients with human immunodeficiency virus (HIV)-associated Hodgkin lymphoma (HL) continue to evolve. The International Prognostic Score (IPS) is used to predict the survival of patients with advanced-stage HL, but it has not been validated in patients with HIV infection. METHODS This was a multi-institutional, retrospective study of 229 patients with HIV-associated, advanced-stage, classical HL who received doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD) plus combination antiretroviral therapy. Their clinical characteristics were presented descriptively, and multivariate analyses were performed to identify the factors that were predictive of response and prognostic of progression-free survival (PFS) and overall survival (OS). RESULTS The overall and complete response rates to ABVD in patients with HIV-associated HL were 91% and 83%, respectively. After a median follow-up of 5 years, the 5-year PFS and OS rates were 69% and 78%, respectively. In multivariate analyses, there was a trend toward an IPS score >3 as an adverse factor for PFS (hazard ratio [HR], 1.49; P=.15) and OS (HR, 1.84; P=.06). A cluster of differentiation 4 (CD4)-positive (T-helper) cell count <200 cells/μL was independently associated with both PFS (HR, 2.60; P=.002) and OS (HR, 2.04; P=.04). The CD4-positive cell count was associated with an increased incidence of death from other causes (HR, 2.64; P=.04) but not with death from HL-related causes (HR, 1.55; P=.32). CONCLUSIONS The current results indicate excellent response and survival rates in patients with HIV-associated, advanced-stage, classical HL who receive ABVD and combination antiretroviral therapy, as well as the prognostic value of the CD4-positive cell count at the time of lymphoma diagnosis for PFS and OS.
Abstract:
INTRODUCTION Rates of both TB/HIV co-infection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, identify factors associated with MDR-TB and assess the activity of initial TB treatment regimens given the results of drug-susceptibility testing (DST). MATERIAL AND METHODS We enrolled 1413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) for use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%, p<0.0001), a definite TB diagnosis (culture- and/or PCR-positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%, p<0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11%, p<0.0001 among those with DST results). A history of injecting drug use (adjusted OR (aOR) 2.03, 95% CI 1.00-4.09), prior TB treatment (aOR 3.42, 95% CI 1.88-6.22) and living in EE (aOR 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For the 569 patients with available DST, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in the other regions (Figure 1a). Had the patients received initial therapy with the standard regimen [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b). CONCLUSIONS In EE, TB/HIV patients were less often exposed to cART, less often had a definitive TB diagnosis and more often had MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with fewer than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns and more widespread use of cART.
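How the number of "active drugs" is counted against the DST results is not detailed in the abstract. Below is a minimal sketch under the simplifying assumption that a drug counts as active only if the DST explicitly reports susceptibility; the drug names and results are hypothetical.

```python
def count_active_drugs(regimen, dst_results):
    """Number of drugs in the initial regimen to which the isolate is susceptible.
    `dst_results` maps drug name -> True (susceptible) / False (resistant);
    drugs without a DST result are treated here as not demonstrably active."""
    return sum(1 for drug in regimen if dst_results.get(drug, False))

# Hypothetical example: standard RHZE regimen against an MDR isolate
rhze = ["rifampicin", "isoniazid", "pyrazinamide", "ethambutol"]
dst = {"rifampicin": False, "isoniazid": False, "pyrazinamide": True, "ethambutol": True}
print(count_active_drugs(rhze, dst))  # 2 -> fewer than 3 active drugs
```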