102 results for random regression model
Abstract:
OBJECTIVES: We compared androgen and gonadotropin values in HIV-infected men who did and did not develop lipoatrophy on combination antiretroviral therapy (cART). METHODS: From a population of 136 treatment-naïve male Caucasians under successful zidovudine/lamivudine-based cART, the 10 patients developing lipoatrophy (cases) were compared with 87 randomly chosen controls. Plasma levels of free testosterone (fT), dehydroepiandrosterone (DHEA), follicle-stimulating hormone and luteinizing hormone (LH) were measured at baseline and after 2 years of cART. RESULTS: At baseline, 60% of the cases and 71% of the controls showed abnormally low fT values. LH levels were normal or low in 67 and 94% of the patients, respectively, indicating a disturbance of the hypothalamic-pituitary-gonadal axis. fT levels did not significantly change after 2 years of cART. Cases showed a significant increase in LH levels, while controls showed a significant increase in DHEA levels. In a multivariate logistic regression model, lipoatrophy was associated with higher baseline DHEA levels (P=0.04), an increase in LH levels during cART (P=0.001), a lower body mass index and greater age. CONCLUSIONS: Hypogonadism is present in the majority of HIV-infected patients. The development of cART-related lipoatrophy is associated with an increase in LH and a lack of increase in DHEA levels.
Abstract:
High-dose cefepime therapy is recommended for febrile neutropenia. Safety issues have been raised in a recent meta-analysis reporting an increased risk of mortality during cefepime therapy. Cefepime-related neurological toxicity has been associated with overdosing due to severe renal dysfunction. This study aimed to investigate the association between cefepime plasma concentrations and neurological toxicity in febrile neutropenic patients. Cefepime trough concentrations (by high-performance liquid chromatography) were retrospectively analyzed for 30 adult febrile neutropenic patients receiving the recommended high-dose regimen (6 g/day for a glomerular filtration rate [GFR] of >50 ml/min). The dose adjustment to renal function was evaluated by the ratio of the cefepime daily dose per 100 ml/min of glomerular filtration. The association between cefepime plasma concentrations and neurological toxicity was assessed on the basis of consistent neurological symptoms and/or signs (by NCI criteria). The median cefepime concentration was 8.7 mg/liter (range, 2.1 to 38 mg/liter) at a median of 4 days (range, 2 to 15 days) after the start of therapy. Neurological toxicity (altered mental status, hallucinations, or myoclonia) was attributed to cefepime in 6/30 (20%) patients (median GFR, 45 ml/min; range, 41 to 65 ml/min) receiving a median dose of 13.2 g/day per 100 ml/min GFR (range, 9.2 to 14.3 g/day per 100 ml/min GFR). Cefepime discontinuation resulted in complete neurological recovery for five patients and improvement for one patient. A multivariate logistic regression model confirmed high cefepime concentrations as an independent predictor of neurological toxicity, with a 50% probability threshold at ≥22 mg/liter (P = 0.05). High cefepime plasma concentrations are associated with neurological toxicity in febrile neutropenic patients with mild renal dysfunction. Careful adherence to normalized dosing per 100 ml/min GFR is crucial. 
Monitoring of plasma concentrations may contribute to preventing neurological toxicity of high-dose therapy for this life-threatening condition.
Abstract:
INTRODUCTION: Infertility treatments are a major source of the increase in multiple pregnancies (MPs). AIMS: The aims of the present study were (1) to investigate the origin and maternal/neonatal outcomes of MP and (2) to review the different measures that can be adopted to reduce these serious complications. METHODS: The study included all women with multiple births between 1 January 1995 and 31 December 2006 at the University Hospital of Bern, Switzerland. The outcomes associated with the various origins of MP (natural conception, ovarian stimulation [OS], in-vitro fertilisation [IVF-ICSI]) were analysed using a multinomial logistic regression model. An analysis of the Swiss law on reproductive medicine and its current proposed revision, as well as a literature review using Pubmed, was carried out. RESULTS: A total of 592 MP were registered; 91% (n = 537) resulted in live births. There was significantly more neonatal/maternal morbidity in MP after OS compared with natural conception, and even compared with the IVF-ICSI group. With a policy of elective single embryo transfer (eSET), twin rates after IVF-ICSI can be reduced to <5% and triplet rates to <1%. CONCLUSIONS: After OS, more triplets are found and the outcome of MP is worse. MP is known to be associated with morbidity, mortality, and economic and social risks. To counteract these complications, (1) better training for physicians performing OS should be encouraged and (2) the Swiss law on reproductive medicine needs to be changed, with the introduction of eSET policies. This would lead to a dramatic decrease in neonatal and maternal morbidity/mortality, as well as significant cost reductions for the Swiss healthcare system.
Abstract:
PURPOSE: To determine and compare the diagnostic performance of magnetic resonance imaging (MRI) and computed tomography (CT) for the diagnosis of tumor extent in advanced retinoblastoma, using histopathologic analysis as the reference standard. DESIGN: Systematic review and meta-analysis. PARTICIPANTS: Patients with advanced retinoblastoma who underwent MRI, CT, or both for the detection of tumor extent from published diagnostic accuracy studies. METHODS: Medline and Embase were searched for literature published through April 2013 assessing the diagnostic performance of MRI, CT, or both in detecting intraorbital and extraorbital tumor extension of retinoblastoma. Diagnostic accuracy data were extracted from included studies. Summary estimates were based on a random effects model. Intrastudy and interstudy heterogeneity were analyzed. MAIN OUTCOME MEASURES: Sensitivity and specificity of MRI and CT in detecting tumor extent. RESULTS: Data of the following tumor-extent parameters were extracted: anterior eye segment involvement and ciliary body, optic nerve, choroidal, and (extra)scleral invasion. Articles on MRI reported results of 591 eyes from 14 studies, and articles on CT yielded 257 eyes from 4 studies. The summary estimates with their 95% confidence intervals (CIs) of the diagnostic accuracy of conventional MRI at detecting postlaminar optic nerve, choroidal, and scleral invasion showed sensitivities of 59% (95% CI, 37%-78%), 74% (95% CI, 52%-88%), and 88% (95% CI, 20%-100%), respectively, and specificities of 94% (95% CI, 84%-98%), 72% (95% CI, 31%-94%), and 99% (95% CI, 86%-100%), respectively. Magnetic resonance imaging with a high (versus a low) image quality showed higher diagnostic accuracies for detection of prelaminar optic nerve and choroidal invasion, but these differences were not statistically significant. Studies reporting the diagnostic accuracy of CT did not provide enough data to perform any meta-analyses. 
CONCLUSIONS: Magnetic resonance imaging is an important diagnostic tool for the detection of local tumor extent in advanced retinoblastoma, although its diagnostic accuracy shows room for improvement, especially with regard to sensitivity. With only a few, mostly old, studies, there is very little evidence on the diagnostic accuracy of CT, and these studies generally show low diagnostic accuracy. Future studies assessing the role of MRI in clinical decision making, in terms of its prognostic value for advanced retinoblastoma, are needed.
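The summary estimates above come from a random effects model pooling per-study accuracy data. A minimal sketch of such pooling on the logit scale, using the DerSimonian-Laird between-study variance; the per-study sensitivities and sample sizes below are hypothetical, not values from this review:

```python
import numpy as np

def pool_logit_random_effects(p, n):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale."""
    # logit transform with approximate within-study variance 1/(np) + 1/(n(1-p))
    y = np.log(p / (1 - p))
    v = 1.0 / (n * p) + 1.0 / (n * (1 - p))
    w = 1.0 / v
    # fixed-effect estimate and Cochran's Q heterogeneity statistic
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)
    # DL estimator of the between-study variance tau^2 (truncated at 0)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    # back-transform the pooled estimate and its 95% CI to the proportion scale
    inv = lambda t: 1.0 / (1.0 + np.exp(-t))
    return inv(y_re), inv(y_re - 1.96 * se), inv(y_re + 1.96 * se)

# hypothetical per-study sensitivities and sample sizes (not from the paper)
sens = np.array([0.50, 0.60, 0.70, 0.55])
n = np.array([40, 55, 30, 60])
est, lo, hi = pool_logit_random_effects(sens, n)
print(round(est, 2), round(lo, 2), round(hi, 2))
```

Pooling on the logit scale keeps the back-transformed confidence interval inside (0, 1), which matters for proportions near the boundary such as the 88% scleral-invasion sensitivity above.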
Abstract:
OBJECTIVES: The objectives were to identify the social and medical factors associated with emergency department (ED) frequent use and to determine if frequent users were more likely to have a combination of these factors in a universal health insurance system. METHODS: This was a retrospective chart review case-control study comparing randomized samples of frequent users and nonfrequent users at the Lausanne University Hospital, Switzerland. The authors defined frequent users as patients with four or more ED visits within the previous 12 months. Adult patients who visited the ED between April 2008 and March 2009 (study period) were included, and patients leaving the ED without medical discharge were excluded. For each patient, the first ED electronic record within the study period was considered for data extraction. Along with basic demographics, variables of interest included social (employment or housing status) and medical (ED primary diagnosis) characteristics. Significant social and medical factors were used to construct a logistic regression model, to determine factors associated with frequent ED use. In addition, comparison of the combination of social and medical factors was examined. RESULTS: A total of 359 of 1,591 frequent and 360 of 34,263 nonfrequent users were selected. Frequent users accounted for less than a 20th of all ED patients (4.4%), but for 12.1% of all visits (5,813 of 48,117), with a maximum of 73 ED visits. No difference in terms of age or sex occurred, but more frequent users had a nationality other than Swiss or European (n = 117 [32.6%] vs. n = 83 [23.1%], p = 0.003). 
Adjusted multivariate analysis showed that the following social and specific medical vulnerability factors most increased the risk of frequent ED use: being under guardianship (adjusted odds ratio [OR] = 15.8; 95% confidence interval [CI] = 1.7 to 147.3), living closer to the ED (adjusted OR = 4.6; 95% CI = 2.8 to 7.6), being uninsured (adjusted OR = 2.5; 95% CI = 1.1 to 5.8), being unemployed or dependent on government welfare (adjusted OR = 2.1; 95% CI = 1.3 to 3.4), the number of psychiatric hospitalizations (adjusted OR = 4.6; 95% CI = 1.5 to 14.1), and the use of five or more clinical departments over 12 months (adjusted OR = 4.5; 95% CI = 2.5 to 8.1). Having two of four social factors increased the odds of frequent ED use (adjusted OR = 5.4; 95% CI = 2.9 to 9.9), and similar results were found for medical factors (adjusted OR = 7.9; 95% CI = 4.6 to 13.4). A combination of social and medical factors was markedly associated with frequent ED use, as frequent users were 10 times more likely to have three of them (out of a total of eight factors; 95% CI = 5.1 to 19.6). CONCLUSIONS: Frequent users accounted for a moderate proportion of visits at the Lausanne ED. Social and medical vulnerability factors were associated with frequent ED use. In addition, frequent users were more likely to have both social and medical vulnerabilities than were other patients. Case management strategies might address the vulnerability factors of frequent users to prevent inequities in health care and related costs.
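Adjusted odds ratios like those above are obtained by exponentiating logistic regression coefficients. A sketch on simulated data; the predictors, prevalences and effect sizes are invented for illustration, not taken from the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
# hypothetical binary vulnerability factors (not the study's data)
guardianship = rng.binomial(1, 0.05, n)
uninsured = rng.binomial(1, 0.10, n)
# simulate a frequent-use outcome with known log-odds effects
logit = -2.0 + 2.0 * guardianship + 0.9 * uninsured
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([guardianship, uninsured])
# very large C approximates an unpenalised maximum-likelihood fit
model = LogisticRegression(C=1e9).fit(X, y)
# each adjusted OR is exp(beta), holding the other predictor fixed
odds_ratios = np.exp(model.coef_[0])
print(odds_ratios)
```

Because each coefficient is estimated jointly with the others, exp(beta) is an OR adjusted for the remaining covariates, which is exactly the quantity the abstract reports.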
Abstract:
OBJECTIVE: The aim of this study was to determine whether V̇O2 kinetics and, specifically, the time constants of transitions from rest to heavy (τpH) and severe (τpS) exercise intensities are related to middle-distance swimming performance. DESIGN: Fourteen highly trained male swimmers (mean ± SD: 20.5 ± 3.0 yr; 75.4 ± 12.4 kg; 1.80 ± 0.07 m) performed a discontinuous incremental test, as well as square-wave transitions to heavy and severe swimming intensities, to determine V̇O2 kinetics parameters using two exponential functions. METHODS: All tests involved front-crawl swimming with breath-by-breath analysis using the Aquatrainer swimming snorkel. Endurance performance was recorded as the time taken to complete a 400-m freestyle swim in an official competition (T400), one month from the date of the other tests. RESULTS: T400 (mean ± SD: 251.4 ± 12.4 s) was significantly correlated with τpH (15.8 ± 4.8 s; r = 0.62; p = 0.02) and τpS (15.8 ± 4.7 s; r = 0.61; p = 0.02). The best single predictor of 400-m freestyle time, out of the variables assessed, was the velocity at V̇O2max (vV̇O2max), which accounted for 80% of the variation in performance between swimmers. However, τpH and V̇O2max were also found to influence the prediction of T400 when they were included in a regression model that involved respiratory parameters only. CONCLUSIONS: Faster kinetics during the primary phase of the V̇O2 response is associated with better performance during middle-distance swimming. However, vV̇O2max appears to be a better predictor of T400.
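The primary-phase response whose time constant τp is reported above is conventionally modelled as a delayed mono-exponential. A sketch with illustrative parameter values; only the τ ≈ 16 s echoes the group means in the abstract, the other numbers are made up:

```python
import numpy as np

def vo2_primary_phase(t, baseline, amplitude, delay, tau):
    """Mono-exponential primary-phase response:
    VO2(t) = baseline + A * (1 - exp(-(t - delay)/tau)) for t >= delay."""
    t = np.asarray(t, dtype=float)
    # before the time delay the response stays at baseline
    rise = 1.0 - np.exp(-np.clip(t - delay, 0.0, None) / tau)
    return baseline + amplitude * rise

# illustrative parameters (L/min and seconds); tau ~ 16 s matches the abstract
v = vo2_primary_phase([0, 16, 160], baseline=0.5, amplitude=3.0, delay=0.0, tau=16.0)
print(np.round(v, 2))
```

After one time constant (t = τ) the response has covered about 63% of its amplitude, which is why a smaller τp means faster kinetics and, per the abstract, better middle-distance performance.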
Abstract:
In this paper we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. SimpleMKL, an efficient algorithm that wraps a support vector regression model to optimize the MKL weights, is used for the analysis. In this setting, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (autoregressive, Hénon and Lorenz series).
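A minimal sketch of the MKL idea described above: a weighted sum of per-lag basis kernels fed to a support vector regression, where a zero weight discards that lag. The weights here are fixed by hand for illustration; SimpleMKL itself optimises them jointly with the SVR objective:

```python
import numpy as np
from sklearn.svm import SVR

def rbf_kernel(X, Y, gamma):
    """Gaussian basis kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# toy AR(1)-like series; lagged values are the candidate inputs, one kernel per lag
rng = np.random.default_rng(1)
s = np.zeros(220)
for t in range(1, 220):
    s[t] = 0.8 * s[t - 1] + rng.normal(scale=0.1)
X = np.column_stack([s[2:-1], s[1:-2], s[:-3]])  # lags 1, 2, 3
y = s[3:]

# fixed illustrative weights d_m; a zero weight discards that lag's kernel,
# which is how MKL performs input selection
d = np.array([0.7, 0.3, 0.0])
K = sum(dm * rbf_kernel(X[:, [m]], X[:, [m]], gamma=1.0)
        for m, dm in enumerate(d))
model = SVR(kernel="precomputed").fit(K, y)
print(model.predict(K[:5]).shape)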
Resumo:
BACKGROUND: Urinary creatinine excretion is used as a marker of completeness of timed urine collections, which are a keystone of several metabolic evaluations in clinical investigations and epidemiological surveys. METHODS: We used data from two independent Swiss cross-sectional population-based studies with standardised 24-hour urinary collection and measured anthropometric variables. Only data from adults of European descent, with estimated glomerular filtration rate (eGFR) ≥60 ml/min/1.73 m2 and reported completeness of the urinary collection were retained. A linear regression model was developed to predict centiles of the 24-hour urinary creatinine excretion in 1,137 participants from the Swiss Survey on Salt and validated in 994 participants from the Swiss Kidney Project on Genes in Hypertension. RESULTS: The mean urinary creatinine excretion was 193 ± 41 μmol/kg/24 hours in men and 151 ± 38 μmol/kg/24 hours in women in the Swiss Survey on Salt. The values were inversely correlated with age and body mass index (BMI). CONCLUSIONS: We propose a validated prediction equation for 24-hour urinary creatinine excretion in the general European population, based on readily available variables such as age, sex and BMI, and a few derived normograms to ease its clinical application. This should help healthcare providers to interpret the completeness of a 24-hour urine collection in daily clinical practice and in epidemiological population studies.
Resumo:
STUDY AIM:: To develop a score predicting the risk of bacteremia in cancer patients with fever and neutropenia (FN), and to evaluate its performance. METHODS:: Pediatric patients with cancer presenting with FN induced by nonmyeloablative chemotherapy were observed in a prospective multicenter study. A score predicting the risk of bacteremia was developed from a multivariate mixed logistic regression model. Its cross-validated predictive performance was compared with that of published risk prediction rules. RESULTS:: Bacteremia was reported in 67 (16%) of 423 FN episodes. In 34 episodes (8%), bacteremia became known only after reassessment after 8 to 24 hours of inpatient management. Predicting bacteremia at reassessment was better than prediction at presentation with FN. A differential leukocyte count did not increase the predictive performance. The reassessment score predicting future bacteremia in 390 episodes without known bacteremia used the following 4 variables: hemoglobin ≥90 g/L at presentation (weight 3), platelet count <50 G/L (3), shaking chills (5), and other need for inpatient treatment or observation according to the treating physician (3). Applying a threshold ≥3, the score-simplified into a low-risk checklist-predicted bacteremia with 100% sensitivity, with 54 episodes (13%) classified as low-risk, and a specificity of 15%. CONCLUSIONS:: This reassessment score, simplified into a low-risk checklist of 4 routinely accessible characteristics, identifies pediatric patients with FN at risk for bacteremia. It has the potential to contribute to the reduction of use of antimicrobials in, and to shorten the length of hospital stays of pediatric patients with cancer and FN.
Resumo:
OBJECTIVES: To determine clinical and ultrasonographic predictors of joint replacement surgery across Europe in primary osteoarthritis (OA) of the knee. METHODS: This was a 3-year prospective study of a painful OA knee cohort (from a EULAR-sponsored, multicentre study). All subjects had clinical evaluation, radiographs and ultrasonography (US) at study entry. The rate of knee replacement surgery over the 3-year follow-up period was determined using Kaplan-Meier survival data analyses. Predictive factors for joint replacement were identified by univariate log-rank test then multivariate analysis using a Cox proportional-hazards regression model. Potential baseline predictors included demographic, clinical, radiographic and US features. RESULTS: Of the 600 original patients, 531 (88.5%), mean age 67+/-10 years, mean disease duration 6.1+/-6.9 years, had follow-up data and were analysed. During follow-up (median 3 years; range 0-4 years), knee replacement was done or required for 94 patients (estimated event rate of 17.7%). In the multivariate analysis, predictors of joint replacement were as follows: Kellgren and Lawrence radiographic grade (grade > or =III vs <III, hazards ratio (HR) = 4.08 (95% CI 2.34 to 7.12), p<0.0001); ultrasonographic knee effusion (> or =4 mm vs <4 mm) (HR = 2.63 (95% CI 1.70 to 4.06), p<0.0001); knee pain intensity on a 0-100 mm visual analogue scale (> or =60 vs <60) (HR = 1.81 (95% CI 1.15 to 2.83), p=0.01) and disease duration (> or =5 years vs <5 years) (HR=1.63 (95% CI 1.08 to 2.47), p=0.02). Clinically detected effusion and US synovitis were not associated with joint replacement in the univariate analysis. CONCLUSION: Longitudinal evaluation of this OA cohort demonstrated significant progression to joint replacement. In addition to severity of radiographic damage and pain, US-detected effusion was a predictor of subsequent joint replacement.
Resumo:
This paper presents 3-D brain tissue classificationschemes using three recent promising energy minimizationmethods for Markov random fields: graph cuts, loopybelief propagation and tree-reweighted message passing.The classification is performed using the well knownfinite Gaussian mixture Markov Random Field model.Results from the above methods are compared with widelyused iterative conditional modes algorithm. Theevaluation is performed on a dataset containing simulatedT1-weighted MR brain volumes with varying noise andintensity non-uniformities. The comparisons are performedin terms of energies as well as based on ground truthsegmentations, using various quantitative metrics.
Resumo:
PURPOSE To develop a score predicting the risk of adverse events (AEs) in pediatric patients with cancer who experience fever and neutropenia (FN) and to evaluate its performance. PATIENTS AND METHODS Pediatric patients with cancer presenting with FN induced by nonmyeloablative chemotherapy were observed in a prospective multicenter study. A score predicting the risk of future AEs (ie, serious medical complication, microbiologically defined infection, radiologically confirmed pneumonia) was developed from a multivariate mixed logistic regression model. Its cross-validated predictive performance was compared with that of published risk prediction rules. Results An AE was reported in 122 (29%) of 423 FN episodes. In 57 episodes (13%), the first AE was known only after reassessment after 8 to 24 hours of inpatient management. Predicting AE at reassessment was better than prediction at presentation with FN. A differential leukocyte count did not increase the predictive performance. The score predicting future AE in 358 episodes without known AE at reassessment used the following four variables: preceding chemotherapy more intensive than acute lymphoblastic leukemia maintenance (weight = 4), hemoglobin > or = 90 g/L (weight = 5), leukocyte count less than 0.3 G/L (weight = 3), and platelet count less than 50 G/L (weight = 3). A score (sum of weights) > or = 9 predicted future AEs. The cross-validated performance of this score exceeded the performance of published risk prediction rules. At an overall sensitivity of 92%, 35% of the episodes were classified as low risk, with a specificity of 45% and a negative predictive value of 93%. CONCLUSION This score, based on four routinely accessible characteristics, accurately identifies pediatric patients with cancer with FN at risk for AEs after reassessment.
Resumo:
BACKGROUND: Cytomegalovirus (CMV) retinitis is a major cause of visual impairment and blindness among patients with uncontrolled HIV infections. Whereas polymorphisms in interferon-lambda 3 (IFNL3, previously named IL28B) strongly influence the clinical course of hepatitis C, few studies examined the role of such polymorphisms in infections due to viruses other than hepatitis C virus. OBJECTIVES: To analyze the association of newly identified IFNL3/4 variant rs368234815 with susceptibility to CMV-associated retinitis in a cohort of HIV-infected patients. DESIGN AND METHODS: This retrospective longitudinal study included 4884 white patients from the Swiss HIV Cohort Study, among whom 1134 were at risk to develop CMV retinitis (CD4 nadir <100 /μl and positive CMV serology). The association of CMV-associated retinitis with rs368234815 was assessed by cumulative incidence curves and multivariate Cox regression models, using the estimated date of HIV infection as a starting point, with censoring at death and/or lost follow-up. RESULTS: A total of 40 individuals among 1134 patients at risk developed CMV retinitis. The minor allele of rs368234815 was associated with a higher risk of CMV retinitis (log-rank test P = 0.007, recessive mode of inheritance). The association was still significant in a multivariate Cox regression model (hazard ratio 2.31, 95% confidence interval 1.09-4.92, P = 0.03), after adjustment for CD4 nadir and slope, HAART and HIV-risk groups. CONCLUSION: We reported for the first time an association between an IFNL3/4 polymorphism and susceptibility to AIDS-related CMV retinitis. IFNL3/4 may influence immunity against viruses other than HCV.
Resumo:
ABSTRACT: BACKGROUND: Chest pain raises concern for the possibility of coronary heart disease. Scoring methods have been developed to identify coronary heart disease in emergency settings, but not in primary care. METHODS: Data were collected from a multicenter Swiss clinical cohort study including 672 consecutive patients with chest pain, who had visited one of 59 family practitioners' offices. Using delayed diagnosis we derived a prediction rule to rule out coronary heart disease by means of a logistic regression model. Known cardiovascular risk factors, pain characteristics, and physical signs associated with coronary heart disease were explored to develop a clinical score. Patients diagnosed with angina or acute myocardial infarction within the year following their initial visit comprised the coronary heart disease group. RESULTS: The coronary heart disease score was derived from eight variables: age, gender, duration of chest pain from 1 to 60 minutes, substernal chest pain location, pain increases with exertion, absence of tenderness point at palpation, cardiovascular risks factors, and personal history of cardiovascular disease. Area under the receiver operating characteristics curve was of 0.95 with a 95% confidence interval of 0.92; 0.97. From this score, 413 patients were considered as low risk for values of percentile 5 of the coronary heart disease patients. Internal validity was confirmed by bootstrapping. External validation using data from a German cohort (Marburg, n = 774) revealed a receiver operating characteristics curve of 0.75 (95% confidence interval, 0.72; 0.81) with a sensitivity of 85.6% and a specificity of 47.2%. CONCLUSIONS: This score, based only on history and physical examination, is a complementary tool for ruling out coronary heart disease in primary care patients complaining of chest pain.
Resumo:
In pediatric echocardiography, cardiac dimensions are often normalized for weight, height, or body surface area (BSA). The combined influence of height and weight on cardiac size is complex and likely varies with age. We hypothesized that increasing weight for height, as represented by body mass index (BMI) adjusted for age, is poorly accounted for in Z scores normalized for weight, height, or BSA. We aimed to evaluate whether a bias related to BMI was introduced when proximal aorta diameter Z scores are derived from bivariate models (only one normalizing variable), and whether such a bias was reduced when multivariable models are used. We analyzed 1,422 echocardiograms read as normal in children ≤18 years. We computed Z scores of the proximal aorta using allometric, polynomial, and multivariable models with four body size variables. We then assessed the level of residual association of Z scores and BMI adjusted for age and sex. In children ≥6 years, we found a significant residual linear association with BMI-for-age and Z scores for most regression models. Only a multivariable model including weight and height as independent predictors produced a Z score free of linear association with BMI. We concluded that a bias related to BMI was present in Z scores of proximal aorta diameter when normalization was done using bivariate models, regardless of the regression model or the normalizing variable. The use of multivariable models with weight and height as independent predictors should be explored to reduce this potential pitfall when pediatric echocardiography reference values are evaluated.