21 results for Early warning indicators
Abstract:
BACKGROUND: Elevated lactate and interleukin-6 (IL-6) levels have been shown to correlate with mortality and multiple organ dysfunction in severely traumatized patients. The purpose of this study was to test whether an association exists between 24-hour lactate clearance, IL-6 and procalcitonin (PCT) levels, and the development of infectious complications in trauma patients. METHODS: A total of 1757 consecutive trauma patients with an Injury Severity Score (ISS) > 16 admitted over a 10-year period were retrospectively analyzed over the first 21 days after injury. Exclusion criteria were death within 72 h of admission (24.5%), late admission > 12 h after injury (16%), and age < 16 years (0.5%). Data are stated as the median (range). RESULTS: Altogether, 1032 trauma patients (76.2% male) with an average age of 38 years, a median ISS of 29 (16-75), and an Acute Physiology, Age, and Chronic Health Evaluation (APACHE) II score of 14 (0-40) were evaluated. In-hospital mortality (>3 days) was 10%. Patients with insufficient 24-hour lactate clearance had higher rates of overall mortality and infection. Elevated early serum procalcitonin on days 1 to 5 after trauma was strongly associated with the subsequent development of sepsis (p < 0.01) but not with nonseptic infections. The kinetics of IL-6 were similar to those of PCT, but IL-6 did differentiate between infected and noninfected patients after day 5. CONCLUSIONS: This study demonstrates that elevated early procalcitonin and IL-6 levels and inadequate 24-hour lactate clearance help identify trauma patients who develop septic and nonseptic infectious complications. Defining specific cutoff values and monitoring these parameters early may help direct early surgical and antibiotic therapy and reduce infectious mortality.
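A note on computation: the abstract does not spell out how 24-hour lactate clearance is calculated. The sketch below assumes the standard definition (percentage decline from the admission value over the first 24 hours); the example values are illustrative and not taken from the study.

    def lactate_clearance_24h(admission_mmol_l: float, at_24h_mmol_l: float) -> float:
        """Percent lactate clearance over 24 h: (admission - 24 h value) / admission * 100."""
        return (admission_mmol_l - at_24h_mmol_l) / admission_mmol_l * 100.0

    # Illustrative example (not study data): lactate falls from 4.0 to 2.4 mmol/L.
    print(lactate_clearance_24h(4.0, 2.4))  # 40.0, i.e., 40% clearance

A clearance falling below some prespecified cutoff would flag "insufficient" clearance; the study's own cutoff is not given in the abstract.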
Abstract:
BACKGROUND: It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS: Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS: Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS: In a retrospective cohort study, standardized information on the occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort (and these factors combined in the HOARSI), as well as management factors, was collected at intervals of 1.3-5 years. RESULTS: Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%); with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing 25% (P = .006, OR: 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE: Occasional coughing and mucous nasal discharge might indicate an increased risk of developing RAO.
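For readers unfamiliar with the reported measures, the short sketch below shows how an odds ratio, sensitivity, and specificity are derived from a 2x2 table. The counts are hypothetical, chosen only to illustrate the arithmetic; they are not the study's data.

    # Hypothetical 2x2 table (illustrative counts, not study data):
    #                     developed RAO    did not develop RAO
    # sign present             a = 15           b = 20
    # sign absent              c = 3            d = 40
    a, b, c, d = 15, 20, 3, 40
    odds_ratio  = (a * d) / (b * c)   # (15*40)/(20*3) = 10.0
    sensitivity = a / (a + c)         # 15/18, about 0.83
    specificity = d / (b + d)         # 40/60, about 0.67
    print(odds_ratio, sensitivity, specificity)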
Abstract:
Psychosocial factors have been described as affecting cellular immune measures in healthy subjects. In patients with early breast cancer, we explored bidirectional psycho-immune effects to determine whether subjective burden has an impact on immune measures, and vice versa. Patients (n = 239) operated on for early breast cancer and randomized into International Breast Cancer Study Group (IBCSG) adjuvant clinical trials were assessed immediately before the beginning of adjuvant treatment (baseline) and 3 and 6 months thereafter, at the beginning of the corresponding treatment cycle. Cellular immune measures (leukocyte, lymphocyte, and lymphocyte subset counts), markers of activation of the cellular immune system (beta2-microglobulin and soluble interleukin-2 receptor [sIL-2R] serum levels), and self-reported subjective burden (global indicators of physical well-being, mood, and coping effort) were assessed concurrently. The relationship between subjective burden and gradients of immune measures was investigated with regression analyses controlling for adjuvant treatment. There was a pattern of small negative associations between all variables assessing subjective burden before the beginning of adjuvant therapy and the gradients of the markers of activation of the cellular immune system and NK cell counts. In particular, better mood predicted a decline in the course of beta2-microglobulin and sIL-2R at months 3 and 6. The gradient of beta2-microglobulin was associated with mood and coping effort at month 3. However, the effect sizes were very small. In conclusion, in this exploratory investigation, there was an indication that subjective burden affects, and is affected by, markers of activation of the cellular immune system during the first 3 and 6 months of adjuvant therapy. The question of clinical significance remains unanswered. These associations should be investigated with refined assessment tools and schedules.
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land use. The amount and intensity of these changes challenge current monitoring frameworks, which are typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension, since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity across scales and regions, are more directly related to environmental drivers than the diversity components themselves, and enable the detection of directional changes in whole-community dynamics. However, probably because of their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set to monitor the effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratories’ research network). By standardizing traits and diversity measures and using null models and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits: specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering. These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
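The "community mean traits" referred to here are presumably community-weighted means (CWM), the usual abundance-weighted average of a trait value across a plot. A minimal sketch follows; the species names, relative covers, and SLA values are purely illustrative assumptions, not study data.

    # Community-weighted mean (CWM) of specific leaf area (SLA).
    # Species, relative covers, and SLA values are illustrative assumptions.
    abundance = {"Festuca": 0.5, "Trifolium": 0.3, "Plantago": 0.2}           # relative cover, sums to 1
    sla_mm2_per_mg = {"Festuca": 12.0, "Trifolium": 25.0, "Plantago": 20.0}   # e.g., from a trait database

    cwm_sla = sum(abundance[sp] * sla_mm2_per_mg[sp] for sp in abundance)
    print(cwm_sla)  # 0.5*12 + 0.3*25 + 0.2*20 = 17.5

Because the weights come from routine vegetation surveys and the trait values from public databases, the approach avoids new field measurements, which is the labor-saving point the abstract makes.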
Abstract:
HYPOTHESIS: Facial nerve monitoring can be used synchronously with a high-precision robotic tool as a functional warning to prevent a collision of the drill bit with the facial nerve during direct cochlear access (DCA). BACKGROUND: Minimally invasive DCA aims to eliminate the need for a mastoidectomy by drilling a small tunnel through the facial recess to the cochlea with the aid of stereotactic tool guidance. Because the procedure is performed in a blind manner, structures such as the facial nerve are at risk. Neuromonitoring is a commonly used tool that helps surgeons identify the facial nerve (FN) during routine surgical procedures in the mastoid. Recently, neuromonitoring technology was integrated into a commercially available drill system, enabling real-time monitoring of the FN. The objective of this study was to determine whether this drilling system could be used to warn of an impending collision with the FN during robot-assisted DCA. MATERIALS AND METHODS: The sheep was chosen as a suitable model for this study because of its similarity to human ear anatomy. The same surgical workflow applicable to human patients was performed in the animal model. Bone screws, serving as reference fiducials, were placed in the skull near the ear canal. The sheep head was imaged with a computed tomography scanner, and segmentation of the FN, mastoid, and other relevant structures, as well as planning of drilling trajectories, was carried out using a dedicated software tool. During the actual procedure, a surgical drill system was connected to a nerve monitor and guided by a custom-built robot system. As the planned trajectories were drilled, stimulation and EMG response signals were recorded. A postoperative analysis was performed after each surgery to determine the actual drilled positions. RESULTS: Using the calibrated pose synchronized with the EMG signals, the precise relationship between distance to the FN and EMG response at 3 different stimulation intensities could be determined for 11 different tunnels drilled in 3 different subjects. CONCLUSION: From the results, it was determined that the current implementation of the neuromonitoring system lacks the sensitivity and repeatability necessary for use as a warning device in robotic DCA. We hypothesize that this is primarily because of the stimulation pattern achieved using a noninsulated drill as a stimulating probe. Further work is necessary to determine whether specific design changes can improve the sensitivity and specificity.
Abstract:
Pollen and plant-macrofossil data are presented for two lakes near the timberline in the Italian (Lago Basso, 2250 m) and Swiss Central Alps (Gouille Rion, 2343 m). Reforestation at both sites started at 9700-9500 BP with Pinus cembra, Larix decidua, and Betula. The timberline reached its highest elevation between 8700 and 5000 BP and retreated after 5000 BP, owing to a mid-Holocene climatic change and increasing human impact since about 3500 BP (Bronze Age). The expansion of Picea abies at Lago Basso between ca. 7500 and 6200 BP was probably favored by cold phases accompanied by increased oceanicity, whereas in the area of Gouille Rion, where spruce expanded rather late (between 4500 and 3500 BP), human influence might have been equally important. The mass expansion of Alnus viridis between ca. 5000 and 3500 BP can probably be related to both climatic change and human activity at the timberline. During the early and middle Holocene, a series of timberline fluctuations is recorded as declines in pollen and macrofossil concentrations of the major tree species and as increases in nonarboreal pollen in the pollen percentage diagram of Gouille Rion. Most of the periods of low timberline can be correlated by radiocarbon dating with climatic changes in the Alps, as indicated by glacier advances in combination with palynological records, solifluction, and dendroclimatological data. Lago Basso and Gouille Rion are the only sites in the Alps showing complete palaeobotanical records of cold phases between 10,000 and 2000 BP with very good time control. The altitudinal range of the Holocene treeline fluctuations caused by climate was most likely not more than 100 to 150 m. A possible correlation of a cold period at ca. 7500-6500 BP (Misox oscillation) in the Alps is made with paleoecological data from North America and Scandinavia and with a climatic signal in the GRIP ice core from central Greenland 8200 yr ago (ca. 7400 yr uncal. BP).