287 results for average toxicity score
Abstract:
This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers in the following ways: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has room for improvement; (2) by setting target values for inputs and outputs, it calculates by how much inputs must be decreased or outputs increased for the firm to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes a firm should analyse in order to improve its own practices. This contribution presents the essentials of DEA, alongside a case study that gives an intuitive understanding of its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on the DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis, and time-series data.
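A minimal sketch of the idea behind the efficiency score mentioned above: the standard input-oriented, constant-returns-to-scale (CCR) DEA model can be solved as one small linear program per firm. The three-firm data set below is hypothetical and this is not the Win4DEAP implementation, only an illustration of the underlying linear program.

```python
# Input-oriented CCR DEA efficiency scores via linear programming (SciPy).
# Data: 3 hypothetical firms, 2 inputs each, 1 output each.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0],   # inputs of firm 0
              [4.0, 2.0],   # inputs of firm 1
              [5.0, 6.0]])  # inputs of firm 2
Y = np.array([[1.0],        # output of firm 0
              [1.0],
              [1.0]])

def ccr_efficiency(o, X, Y):
    """Efficiency of firm o: minimise theta such that a composite peer
    (weights lambda_j >= 0) uses at most theta * inputs of firm o and
    produces at least firm o's outputs."""
    n, m = X.shape            # number of firms, number of inputs
    s = Y.shape[1]            # number of outputs
    c = np.r_[1.0, np.zeros(n)]                  # objective: minimise theta
    # input rows:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # output rows: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(len(X)):
    print(f"firm {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```

A score of 1 marks an efficient firm; a score below 1 gives the proportional input reduction needed to reach the efficient frontier.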
Abstract:
In recent years many clinical prediction rules (CPRs) have been developed. Before a CPR can be used in clinical practice, several methodological steps are necessary, from the development of the score through internal and external validation to the impact study. Before using a CPR in daily practice, family doctors have to verify how the rule was developed and whether this was done in a population similar to the one in which they would use it. The aim of this paper is to describe the development of a CPR and to discuss the advantages and risks related to the use of CPRs, in order to help family doctors choose scores for use in their daily practice.
Abstract:
The in vivo bilirubin-albumin binding interaction of ceftriaxone (CRO) was investigated in 14 non-jaundiced newborns, 33-42 weeks of gestational age, during the first few days of life after they had reached a stable clinical condition. CRO (50 mg/kg) was infused intravenously over 30 min. The competitive binding effect of CRO on the bilirubin-albumin complex was estimated by determining the reserve albumin concentration (RAC) at baseline, at the end of the CRO infusion, and 15 and 60 min thereafter. Immediately after the end of drug administration, RAC decreased from 91.9 (± 25.1) µmol/l to 38.6 (± 10.1) µmol/l (P = 0.0001). At the same time the plasma bilirubin toxicity index (PBTI) increased from 0.64 (± 0.40) before drug infusion to 0.96 (± 0.44) thereafter (P = 0.0001). The highest displacement factor (DF) was calculated to be 2.8 (± 0.6) at the end of the drug infusion. Average total serum bilirubin concentrations decreased from a baseline value of 59.6 (± 27.0) µmol/l to 55.2 (± 27.1) µmol/l (P = 0.026). Sixty minutes after the end of the CRO infusion, RAC was 58.3 (± 21.7) µmol/l and PBTI had returned to baseline, but DF was still 1.9 (± 0.2). No adverse events were recorded. Our results demonstrate a significant competitive interaction of CRO with bilirubin-albumin binding in vivo. Thus, ceftriaxone should not be given to neonates at risk of developing bilirubin encephalopathy.
Abstract:
The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. The factors were age (above the median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in the survival probability. The similarity of the regression coefficients associated with these two factors means that they carry about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with the chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to distinguish two prognostic groups within the intermediate-risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its usefulness in prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS. We used the combination of the B score and chromosome complexity to define four classes, which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
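For intuition, a logistic model maps a 0-2 factor count to a predicted risk through the logistic function. The short sketch below uses an intercept and slope back-derived from the risks quoted in the abstract (7.1%, 60.1%, 96.8%) purely for illustration; they are not the authors' fitted coefficients, and the reproduced risks are only approximate.

```python
# Illustrative sketch: converting a Lausanne-Bournemouth factor count (0, 1, 2)
# into a predicted 18-month mortality risk with a logistic model.
# INTERCEPT and SLOPE are back-calculated from the published risks, not fitted.
import math

INTERCEPT = -2.57   # approx. logit(0.071)
SLOPE = 3.0         # approx. increase in log-odds per additional factor

def predicted_risk(lb_score: int) -> float:
    """Predicted 18-month mortality risk for an LB score of 0, 1 or 2."""
    logit = INTERCEPT + SLOPE * lb_score
    return 1.0 / (1.0 + math.exp(-logit))

for s in (0, 1, 2):
    print(f"LB score {s}: predicted risk = {predicted_risk(s):.1%}")
```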
Abstract:
BACKGROUND AND PURPOSE: Beyond the Framingham Stroke Risk Score, prediction of future stroke may improve with a genetic risk score (GRS) based on single-nucleotide polymorphisms associated with stroke and its risk factors. METHODS: The study includes 4 population-based cohorts with 2047 first incident strokes among 22,720 initially stroke-free participants of European origin aged ≥55 years, who were followed for up to 20 years. GRSs were constructed with 324 single-nucleotide polymorphisms implicated in stroke and 9 risk factors. The association of the GRS with first incident stroke was tested using Cox regression; the predictive properties of the GRS were assessed with area-under-the-curve statistics, comparing the GRS with age-and-sex and Framingham Stroke Risk Score models, and with reclassification statistics. These analyses were performed per cohort and in a meta-analysis of pooled data. Replication was sought in a case-control study of ischemic stroke. RESULTS: In the meta-analysis, adding the GRS to the Framingham Stroke Risk Score, age, and sex model resulted in a significant improvement in discrimination (all stroke: Δjoint area under the curve=0.016, P=2.3×10⁻⁶; ischemic stroke: Δjoint area under the curve=0.021, P=3.7×10⁻⁷), although the overall area under the curve remained low. In all the studies, there was a highly significant improvement in the net reclassification index (P<10⁻⁴). CONCLUSIONS: The single-nucleotide polymorphisms associated with stroke and its risk factors result in only a small improvement in prediction of future stroke compared with the classical epidemiological risk factors for stroke.
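A hedged sketch of the core idea: build a weighted GRS from SNP dosages and per-allele effect sizes, then check whether adding it to a basic age + sex model improves discrimination (area under the ROC curve). The data are simulated, logistic regression stands in for the study's Cox models, and none of the numbers correspond to the cohorts above.

```python
# Simulated example: weighted genetic risk score and incremental discrimination.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, n_snps = 5000, 324
dosages = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)  # 0/1/2 risk alleles
weights = rng.normal(0.02, 0.01, n_snps)                        # per-allele log-odds (made up)
grs = dosages @ weights                                          # weighted risk score

age = rng.normal(68, 8, n)
sex = rng.integers(0, 2, n).astype(float)
logit = -6.0 + 0.05 * age + 0.3 * sex + 1.0 * (grs - grs.mean())
stroke = rng.binomial(1, 1 / (1 + np.exp(-logit)))               # simulated events

base = np.c_[age, sex]
full = np.c_[age, sex, grs]
auc_base = roc_auc_score(stroke, LogisticRegression(max_iter=1000)
                         .fit(base, stroke).predict_proba(base)[:, 1])
auc_full = roc_auc_score(stroke, LogisticRegression(max_iter=1000)
                         .fit(full, stroke).predict_proba(full)[:, 1])
print(f"AUC base = {auc_base:.3f}, AUC base+GRS = {auc_full:.3f}, "
      f"delta = {auc_full - auc_base:.3f}")
```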
Abstract:
The trabecular bone score (TBS) is an index of bone microarchitectural texture calculated from anteroposterior dual-energy X-ray absorptiometry (DXA) scans of the lumbar spine (LS) that predicts fracture risk independent of bone mineral density (BMD). The aim of this study was to compare the effects of yearly intravenous zoledronate (ZOL) versus placebo (PLB) on LS BMD and TBS in postmenopausal women with osteoporosis. Changes in TBS were assessed in the subset of 107 patients recruited at the Department of Osteoporosis of the University Hospital of Berne, Switzerland, who were included in the HORIZON trial. All subjects received adequate calcium and vitamin D3. In these patients, randomly assigned to either ZOL (n = 54) or PLB (n = 53) for 3 years, BMD was measured by DXA and TBS was assessed by TBS iNsight (v1.9) at baseline and 6, 12, 24, and 36 months after treatment initiation. Baseline characteristics (mean ± SD) were similar between groups in terms of age (76.8 ± 5.0 years), body mass index (24.5 ± 3.6 kg/m²), and TBS (1.178 ± 0.1), but not LS T-score (ZOL −2.9 ± 1.5 versus PLB −2.1 ± 1.5). Changes in LS BMD were significantly greater with ZOL than with PLB at all time points (p < 0.0001 for all), reaching +9.58% versus +1.38% at month 36. The change in TBS was significantly greater with ZOL than with PLB from month 24 onward, reaching +1.41% versus −0.49% at month 36 (p = 0.031). LS BMD and TBS were weakly correlated (r = 0.20), and there were no correlations between changes in BMD and TBS from baseline at any visit. In postmenopausal women with osteoporosis, once-yearly intravenous ZOL therapy significantly increased LS BMD relative to PLB over 3 years, and TBS from 2 years onward. © 2013 American Society for Bone and Mineral Research.
Abstract:
Introduction: Low brain tissue oxygen pressure (PbtO2) is associated with worse outcome in patients with severe traumatic brain injury (TBI). However, it is unclear whether brain tissue hypoxia is merely a marker of injury severity or a predictor of prognosis independent of intracranial pressure (ICP) and injury severity. Hypothesis: We hypothesized that brain tissue hypoxia is an independent predictor of outcome in patients with severe TBI, irrespective of elevated ICP and of the severity of cerebral and systemic injury. Methods: This observational study was conducted at the Neurological ICU, Hospital of the University of Pennsylvania, an academic level I trauma center. Patients admitted with severe TBI who had PbtO2 and ICP monitoring were included in the study. PbtO2, ICP, mean arterial pressure (MAP) and cerebral perfusion pressure (CPP = MAP − ICP) were monitored continuously and recorded prospectively every 30 min. Using linear interpolation, the duration and cumulative dose (area under the curve, AUC) of brain tissue hypoxia (PbtO2 < 15 mm Hg), elevated ICP (>20 mm Hg) and low CPP (<60 mm Hg) were calculated, and the association with outcome at hospital discharge, dichotomized as good (Glasgow Outcome Score [GOS] 4-5) vs. poor (GOS 1-3), was analyzed. Results: A total of 103 consecutive patients, monitored for an average of 5 days, were studied. Brain tissue hypoxia was observed in 66 (64%) patients even when ICP was <20 mm Hg and CPP >60 mm Hg (72 ± 39% and 49 ± 41% of brain hypoxic time, respectively). Compared with patients with good outcome, those with poor outcome had a longer duration of brain hypoxia (1.7 ± 3.7 vs. 8.3 ± 15.9 hrs, P < 0.01), as well as a longer duration (11.5 ± 16.5 vs. 21.6 ± 29.6 hrs, P = 0.03) and a greater cumulative dose (56 ± 93 vs. 143 ± 218 mm Hg·hrs, P < 0.01) of elevated ICP. By multivariable logistic regression, admission Glasgow Coma Scale (OR 0.83, 95% CI: 0.70-0.99, P = 0.04), Marshall CT score (OR 2.42, 95% CI: 1.42-4.11, P < 0.01), APACHE II (OR 1.20, 95% CI: 1.03-1.43, P = 0.03), and the duration of brain tissue hypoxia (OR 1.13, 95% CI: 1.01-1.27, P = 0.04) were all significantly associated with poor outcome. No independent association was found between the AUC of elevated ICP and outcome (OR 1.01, 95% CI 0.97-1.02, P = 0.11) in our prospective cohort. Conclusions: In patients with severe TBI, brain tissue hypoxia is frequent despite normal ICP and CPP and is associated with poor outcome, independent of intracranial hypertension and the severity of cerebral and systemic injury. Our findings indicate that PbtO2 is a strong physiologic prognostic marker after TBI. Further study is warranted to examine whether PbtO2-directed therapy improves outcome in severely head-injured patients.
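A minimal sketch, on made-up numbers, of how a duration and cumulative "dose" (area under the curve below threshold) of brain tissue hypoxia can be derived from PbtO2 values recorded every 30 minutes, using linear interpolation between samples. The 15 mm Hg threshold follows the abstract; the trace and everything else are illustrative, not the study's analysis code.

```python
# Duration and AUC "dose" of hypoxia from 30-min PbtO2 samples (illustrative).
import numpy as np

THRESHOLD = 15.0                       # mm Hg, hypoxia threshold for PbtO2
t_hours = np.arange(0, 6.5, 0.5)       # 30-min recordings over 6 hours
pbto2 = np.array([22, 19, 16, 13, 11, 12, 14, 17, 20, 18, 13, 12, 16],
                 dtype=float)          # fabricated example trace

# Linear interpolation onto a fine grid so threshold crossings between samples
# contribute the correct fraction of time and area.
t_fine = np.linspace(t_hours[0], t_hours[-1], 10_001)
p_fine = np.interp(t_fine, t_hours, pbto2)
dt = t_fine[1] - t_fine[0]

below = p_fine < THRESHOLD
duration_hours = below.sum() * dt                      # time spent below 15 mm Hg
deficit = np.clip(THRESHOLD - p_fine, 0.0, None)       # depth of hypoxia at each point
auc = float(((deficit[:-1] + deficit[1:]) / 2.0 * dt).sum())  # trapezoid rule, mmHg*h

print(f"hypoxia duration = {duration_hours:.2f} h, dose = {auc:.2f} mmHg*h")
```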
Abstract:
It is well known that exposure to low doses of lead causes long-lasting neurobehavioural deficits, but the cellular changes underlying these behavioural changes remain to be elucidated. A protective role of glial cells on neurons, through lead sequestration by astrocytes, has been proposed. The possible modulation of lead neurotoxicity by neuron-glia interactions was examined in three-dimensional cultures of foetal rat telencephalon. Mixed-brain cell cultures or cultures enriched in either neurons or glial cells were treated for 10 days with lead acetate (10⁻⁶ M), a concentration below the limit of cytotoxicity. Intracellular lead content and cell type-specific enzyme activities were determined. It was found that in enriched cultures neurons stored more lead than glial cells, and that each cell type alone stored more lead than in co-culture. Moreover, glial cells, but not neurons, were more affected by lead in enriched culture than in co-culture. These results show that neuron-glia interactions attenuate cellular lead uptake and glial susceptibility to lead, but they do not support the idea of a protective role of astrocytes.
Abstract:
Fish acute toxicity tests play an important role in environmental risk assessment and hazard classification because they allow first estimates of the relative toxicity of various chemicals in various species. However, such tests need to be interpreted carefully. Here we briefly summarize the main issues, which are linked to the genetics and condition of the test animals, the standardized test situations, the uncertainty about whether a given test species can be regarded as representative of a given fish fauna, the often missing knowledge about possible interaction effects, especially with micropathogens, and statistical problems such as small sample sizes and, in some cases, pseudoreplication. We suggest that multi-factorial embryo tests on ecologically relevant species would solve many of these issues, and we briefly explain how such tests could be conducted to avoid the weaker points of fish acute toxicity tests.
Abstract:
OBJECTIVE: The best long-term practice in primary HIV-1 infection (PHI) remains unknown for the individual patient. A risk-based scoring system associated with surrogate markers of HIV-1 disease progression could be helpful for stratifying patients with PHI at highest risk of HIV-1 disease progression. METHODS: We prospectively enrolled 290 individuals with well-documented PHI in the Zurich Primary HIV-1 Infection Study, an open-label, non-randomized, observational, single-center study. Patients could choose to undergo early antiretroviral treatment (eART) and stop it after one year of undetectable viremia, to continue treatment indefinitely, or to defer treatment. For each patient we calculated an a priori defined "Acute Retroviral Syndrome Severity Score" (ARSSS), consisting of clinical and basic laboratory variables and ranging from zero to ten points. We used linear regression models to assess the association between ARSSS and log baseline viral load (VL), baseline CD4+ cell count, and log viral setpoint (sVL) (i.e., VL measured ≥90 days after infection or treatment interruption). RESULTS: Mean ARSSS was 2.89. CD4+ cell count at baseline was negatively correlated with ARSSS (p = 0.03, n = 289), whereas HIV-RNA levels at baseline showed a strong positive correlation with ARSSS (p < 0.001, n = 290). In the regression models, a 1-point increase in the score corresponded to a 0.10 log increase in baseline VL and a CD4+ cell count decline of 12 cells/µl, respectively. In patients with PHI not undergoing eART, a higher ARSSS was significantly associated with a higher sVL (p = 0.029, n = 64). In contrast, in patients undergoing eART with subsequent structured treatment interruption, no correlation was found between sVL and ARSSS (p = 0.28, n = 40). CONCLUSION: The ARSSS is a simple clinical score that correlates with the best-validated surrogate markers of HIV-1 disease progression. In regions where ART is not universally available and eART is not standard, this score may help identify patients who would benefit most from early antiretroviral therapy.
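A small sketch of the kind of linear model reported above: regressing log10 baseline viral load on the ARSSS. The data are simulated with a slope of 0.10 log per point, roughly the effect size quoted in the abstract; this is not the study dataset or its exact model, and the intercept and noise level are arbitrary.

```python
# Simulated example: linear regression of log10 baseline VL on ARSSS.
import numpy as np

rng = np.random.default_rng(1)
n = 290
arsss = rng.integers(0, 11, n).astype(float)          # score from 0 to 10
log_vl = 4.5 + 0.10 * arsss + rng.normal(0, 0.6, n)   # simulated log10 copies/mL

slope, intercept = np.polyfit(arsss, log_vl, 1)        # least-squares fit
print(f"estimated slope: {slope:.3f} log10 VL per ARSSS point "
      f"(intercept {intercept:.2f})")
```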
Abstract:
PURPOSE: To evaluate the causes of recurrent pathologic instability after anterior cruciate ligament (ACL) surgery and the effectiveness of revision reconstruction with a quadriceps tendon autograft using a 2-incision technique. TYPE OF STUDY: Retrospective follow-up study. METHODS: Between 1999 and 2001, 31 patients underwent ACL revision reconstruction because of recurrent pathologic instability during sports or daily activities. Twenty-eight patients were reviewed after a mean follow-up of 4.2 years (range, 3.3 to 5.6 years). The mean age at revision surgery was 27 years (range, 18 to 41 years). The average time from the primary procedure to revision surgery was 26 months (range, 9 to 45 months). Clinical, functional, and radiographic evaluations were performed, and magnetic resonance imaging (MRI) or computed tomography (CT) scanning was also carried out. The International Knee Documentation Committee (IKDC), Lysholm, and Tegner scales were used, and KT-1000 arthrometer measurements (MEDmetric, San Diego, CA) were made by an experienced physician. RESULTS: Of the failures, 79% had radiographic evidence of malposition of the tunnels. In only 6 cases (21%) was the radiologic anatomy of tunnel placement judged to be correct on both the femoral and tibial sides. MRI or CT showed a too centrally placed femoral tunnel in 6 cases. After revision surgery, the position of the tunnels was corrected. A significant improvement in the Lachman and pivot-shift tests was observed. In particular, 17 patients had a negative Lachman test, and 11 patients had a grade I Lachman with a firm end point. Preoperatively, the pivot-shift test was positive in all cases; at the last follow-up, a grade 1+ was found in 7 patients (25%). Postoperatively, KT-1000 testing showed a mean manual maximum translation of 8.6 mm (SD, 2.34) for the affected knee; 97% of patients had a maximum manual side-to-side translation <5 mm. At the final postoperative evaluation, 26 patients (93%) graded their knees as normal or nearly normal according to the IKDC score. The mean Lysholm score was 93.6 (SD, 8.77) and the mean Tegner activity score was 6.1 (SD, 1.37). No patient required further revision. Five patients (18%) complained of hypersensitive scars from the reconstructive surgery that made kneeling difficult. CONCLUSIONS: ACL revision surgery using a quadriceps tendon autograft and a 2-incision technique gave satisfactory results at a minimum 3 years' follow-up; 93% of patients returned to sports activities. LEVEL OF EVIDENCE: Level IV, case series, no control group.
Abstract:
BACKGROUND: The aim of our study was to assess the feasibility of minimally invasive digestive anastomosis using a modular flexible magnetic anastomotic device made up of two flexible chains of magnetic elements. The assembly has a non-deployed linear configuration that allows it to be introduced through a dedicated small-sized applicator into the bowel, where it takes its deployed form. A centering suture allows the mating of the two parts to be controlled so that the viscerotomy is included between the two magnetic rings and the connected viscera. METHODS AND PROCEDURES: Eight pigs were involved in a 2-week experimental survival study. In five colorectal anastomoses, the proximal device was inserted by a percutaneous endoscopic technique, and the colon was divided below the magnet. The distal magnet was delivered transanally to connect with the proximal magnet. In three jejunojejunostomies, the first magnetic chain was injected in its linear configuration through a small enterotomy. Once delivered, the device self-assembled into a ring shape. A second magnet was injected more distally through the same port. The centering sutures were tied together extracorporeally and, using a knot pusher, the magnets were connected. Ex vivo strain testing to determine the compression force delivered by the magnetic device, burst-pressure testing of the anastomosis, and histology were performed. RESULTS: Mean operative time including endoscopy was 69.2 ± 21.9 min, and the average time to full patency was 5 days for colorectal anastomosis. Operative times for the jejunojejunostomies were 125, 80, and 35 min, respectively. The postoperative period was uneventful. The burst pressure of all anastomoses was ≥110 mmHg. The mean strain force needed to detach the devices was 6.1 ± 0.98 N for colorectal and 12.88 ± 1.34 N for jejunojejunal connections. Pathology showed a mild-to-moderate inflammation score. CONCLUSIONS: The modular magnetic system showed considerable potential for creating minimally invasive digestive anastomoses and may represent an alternative to stapled anastomoses, being easy to deliver, effective, and low cost.
Abstract:
The trabecular bone score (TBS) is a new parameter determined from gray-level analysis of dual-energy X-ray absorptiometry (DXA) images. It is related to the mean thickness and volume fraction of the trabecular bone microarchitecture. This was a preliminary case-control study to evaluate the potential diagnostic value of TBS as a complement to bone mineral density (BMD) by comparing postmenopausal women with and without fractures. The sample consisted of 45 women with osteoporotic fractures (5 hip fractures, 20 vertebral fractures, and 20 other types of fracture) and 155 women without a fracture. Stratification was performed taking into account each type of fracture (except hip), and women with and without fractures were matched for age and spine BMD. BMD and TBS were measured at the total spine. TBS measured at the total spine differed significantly between the fracture group and the age- and spine BMD-matched nonfracture group, both when all types of fracture were considered and for vertebral fractures. In these cases, the diagnostic value of the combination of BMD and TBS is likely to be higher than that of BMD alone. TBS, as evaluated directly from standard DXA scans, potentially complements BMD in the detection of osteoporotic fractures. Prospective studies are necessary to fully evaluate the potential role of TBS as a complementary risk factor for fracture.
Abstract:
Developing a novel technique for the efficient, noninvasive clinical evaluation of bone microarchitecture remains both crucial and challenging. The trabecular bone score (TBS) is a new gray-level texture measurement applicable to dual-energy X-ray absorptiometry (DXA) images. Significant correlations between TBS and standard 3-dimensional (3D) parameters of bone microarchitecture have previously been obtained using a numerical simulation approach. The main objective of this study was to evaluate such correlations empirically in anteroposterior spine DXA images. Thirty dried human cadaver vertebrae were evaluated. Micro-computed tomography acquisitions of the bone pieces were obtained at an isotropic resolution of 93 μm. Standard parameters of bone microarchitecture were evaluated in a defined region within the vertebral body, excluding cortical bone. The bone pieces were measured on a Prodigy DXA system (GE Medical-Lunar, Madison, WI) using a custom-made positioning device and experimental setup. Significant correlations were detected between TBS and the 3D parameters of bone microarchitecture, largely independent of any correlation between TBS and bone mineral density (BMD). The strongest correlation was between TBS and connectivity density, with TBS explaining roughly 67.2% of the variance. Based on multivariate linear regression modeling, we established a model for interpreting the relationship between TBS and the 3D bone microarchitecture parameters. This model indicates that TBS adds discriminatory value between samples with similar BMDs but different bone microarchitectures. These results show that it is possible to estimate bone microarchitecture status from DXA imaging using TBS.
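An illustrative sketch of the two analysis steps described above: a univariate correlation between TBS and one micro-CT parameter (connectivity density), and a multivariate linear model of TBS on several 3D microarchitecture parameters. All values are simulated; the ~67% explained variance quoted in the abstract only guided the choice of noise level and is not reproduced from real data.

```python
# Simulated example: univariate r^2 and multivariate R^2 for TBS vs. micro-CT parameters.
import numpy as np

rng = np.random.default_rng(2)
n = 30                                           # 30 cadaver vertebrae (as in the study design)
conn_d = rng.normal(4.0, 1.0, n)                 # connectivity density (simulated)
bvtv = rng.normal(0.12, 0.03, n)                 # bone volume fraction (simulated)
tb_th = rng.normal(0.15, 0.02, n)                # trabecular thickness, mm (simulated)
tbs = 1.0 + 0.05 * conn_d + 0.5 * bvtv + rng.normal(0, 0.035, n)

# Univariate: share of TBS variance explained by connectivity density
r = np.corrcoef(conn_d, tbs)[0, 1]
print(f"r = {r:.2f}, r^2 = {r**2:.2f}")

# Multivariate: ordinary least squares of TBS on the 3D parameters
X = np.c_[np.ones(n), conn_d, bvtv, tb_th]
coef, *_ = np.linalg.lstsq(X, tbs, rcond=None)
pred = X @ coef
r2 = 1 - ((tbs - pred) ** 2).sum() / ((tbs - tbs.mean()) ** 2).sum()
print(f"multivariate R^2 = {r2:.2f}")
```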