854 results for sparse Bayesian regression
Abstract:
Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease)
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant
- Increasing the sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable
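A minimal simulation sketch of these two effects (not from the article; all parameters are hypothetical):

```python
# Illustrates how random measurement error in the exposure attenuates the slope,
# while error in the outcome mainly inflates its standard error.
import numpy as np

rng = np.random.default_rng(0)
n, true_slope = 10_000, 1.0
x = rng.normal(size=n)                      # true exposure
y = true_slope * x + rng.normal(size=n)     # true outcome

def ols_slope_se(x_obs, y_obs):
    """Return the OLS slope and its standard error for a simple regression."""
    x_c = x_obs - x_obs.mean()
    slope = (x_c @ (y_obs - y_obs.mean())) / (x_c @ x_c)
    resid = y_obs - y_obs.mean() - slope * x_c
    se = np.sqrt((resid @ resid) / (len(x_obs) - 2) / (x_c @ x_c))
    return slope, se

print(ols_slope_se(x, y))                                   # ~ (1.0, small SE)
print(ols_slope_se(x + rng.normal(size=n), y))              # error in exposure: slope biased toward 0 (~0.5)
print(ols_slope_se(x, y + rng.normal(size=n)))              # error in outcome: slope ~1.0 but larger SE
```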
Abstract:
Introduction: As imatinib pharmacokinetics are highly variable, plasma levels differ largely between patients under the same dosage. Retrospective studies in chronic myeloid leukemia (CML) patients showed significant correlations between low levels and suboptimal response, as well as between high levels and poor tolerability. Monitoring of trough plasma levels, targeting 1000 μg/L and above, is thus increasingly advised. Our study was launched to assess prospectively the clinical usefulness of systematic imatinib TDM in CML patients. This preliminary analysis addresses the appropriateness of the dosage adjustment approach applied in this study, which targets the recommended trough level and allows an interval of 4-24 h after the last drug intake for blood sampling. Methods: Blood samples from the first 15 patients undergoing 1st TDM were obtained 1.5-25 h after the last dose. Imatinib plasma levels were measured by LC-MS/MS, and the concentrations were extrapolated to trough based on a Bayesian approach using a population pharmacokinetic model. Trough levels were predicted to differ significantly from the target in 12 patients (10 <750 μg/L; 2 >1500 μg/L along with poor tolerance), and individual dose adjustments were proposed. Eight patients underwent a 2nd TDM cycle. Trough levels of the 1st and 2nd TDM were compared; the sample drawn 1.5 h after the last dose (during the distribution phase) was excluded from the analysis. Results: Individual dose adjustments were applied in 6 patients. Observed concentrations extrapolated to trough ranged from 360 to 1832 μg/L (median 725; mean 810, CV 52%) on the 1st TDM and from 720 to 1187 μg/L (median 950; mean 940, CV 18%) on the 2nd TDM cycle. Conclusions: These preliminary results suggest that TDM of imatinib using a Bayesian interpretation is able to target the recommended trough level of 1000 μg/L and to reduce the considerable differences in trough level exposure between patients (with CV decreasing from 52% to 18%). While this may simplify blood collection in daily practice, as samples do not have to be drawn exactly at trough, the largest possible interval after the last drug intake remains preferable, to avoid sampling during the distribution phase, which leads to biased extrapolation. This encourages the evaluation of the clinical benefit of a routine TDM intervention in CML patients, which the randomized Swiss I-COME trial aims to assess.
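A sketch of the kind of Bayesian (MAP) trough extrapolation described above, assuming a one-compartment model with first-order absorption at steady state and a log-normal population prior on CL/F and V/F. All numerical values (dose, population means, variances) are placeholders, not the model actually used in the study.

```python
import numpy as np
from scipy.optimize import minimize

DOSE, TAU, KA = 400_000.0, 24.0, 0.6          # dose in µg, dosing interval in h, absorption rate 1/h
POP = {"CL": 14.0, "V": 250.0}                # hypothetical population means (L/h, L)
OMEGA = {"CL": 0.35, "V": 0.30}               # hypothetical between-subject SD (log scale)
SIGMA = 0.25                                  # hypothetical residual SD (log scale)

def conc_ss(t, cl, v):
    """Steady-state concentration (µg/L) of a one-compartment oral model."""
    ke = cl / v
    a = np.exp(-ke * t) / (1 - np.exp(-ke * TAU))
    b = np.exp(-KA * t) / (1 - np.exp(-KA * TAU))
    return DOSE * KA / (v * (KA - ke)) * (a - b)

def map_estimate(t_obs, c_obs):
    """MAP estimate of individual log(CL), log(V) from a single observed level."""
    def neg_log_post(log_theta):
        cl, v = np.exp(log_theta)
        lik = ((np.log(c_obs) - np.log(conc_ss(t_obs, cl, v))) / SIGMA) ** 2
        pri = ((log_theta[0] - np.log(POP["CL"])) / OMEGA["CL"]) ** 2 \
            + ((log_theta[1] - np.log(POP["V"])) / OMEGA["V"]) ** 2
        return lik + pri
    res = minimize(neg_log_post, np.log([POP["CL"], POP["V"]]))
    return np.exp(res.x)

cl_i, v_i = map_estimate(t_obs=6.0, c_obs=1200.0)    # level drawn 6 h post-dose
print("predicted trough:", conc_ss(TAU, cl_i, v_i))  # extrapolation to 24 h
```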
Abstract:
The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. Factors were age (above the median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in the survival probability. The similarity of the regression coefficients associated with these two factors means that they have about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with the chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to recognize two prognostic groups in the intermediate-risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its usefulness in prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS. We used the combination of the B score and chromosome complexity to define four classes which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
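For orientation, the reported risks follow the usual logistic form with the number of adverse factors as the single covariate; the coefficients below are back-calculated from the published risks (roughly -2.6 and 3.0), not taken from the paper.

```python
# p = 1 / (1 + exp(-(b0 + b1*k))), k = number of adverse factors (LB score, 0-2)
import math
for k in (0, 1, 2):
    p = 1 / (1 + math.exp(-(-2.6 + 3.0 * k)))
    print(k, round(100 * p, 1))   # ≈ 6.9, 59.9, 96.8 % vs reported 7.1, 60.1, 96.8 %
```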
Abstract:
We present formulas for computing the resultant of sparse polynomials as a quotient of two determinants, the denominator being a minor of the numerator. These formulas extend the original formulation given by Macaulay for homogeneous polynomials.
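For reference, the classical Macaulay quotient formula in the homogeneous case that these sparse formulas generalize (stated here for orientation only, in notation not taken from the paper):

```latex
% For generic homogeneous f_0,\dots,f_n of degrees d_0,\dots,d_n in n+1 variables:
\[
  \mathrm{Res}(f_0,\dots,f_n) \;=\; \frac{\det M_t}{\det M'_t},
  \qquad t \;=\; \sum_{i=0}^{n} (d_i - 1) + 1,
\]
% where M_t is the Macaulay matrix of the map (g_0,\dots,g_n) \mapsto \sum_i g_i f_i
% in degree t, and M'_t is the minor of M_t indexed by the monomials divisible by
% x_i^{d_i} for at least two indices i.
```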
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
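A minimal sketch of the standard ML-EM iteration, the MLE baseline referred to above (not the FMAPE algorithm itself); A is the system matrix, y the measured counts, and the iteration starts from a uniform image, as recommended in the abstract.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood EM reconstruction for Poisson emission data."""
    x = np.full(A.shape[1], y.sum() / A.shape[1])    # uniform initial image
    sens = A.sum(axis=0)                             # sensitivity (column sums)
    for _ in range(n_iter):
        proj = A @ x                                 # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)    # data / model
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12) # multiplicative update
    return x
```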
Abstract:
In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on using a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
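A schematic form of a space-variant-hyperparameter MAP objective of this kind (notation ours, not taken from the paper): the Poisson log-likelihood plus an entropy prior whose weight depends on the region to which each pixel was assigned by the segmentation step.

```latex
\[
  \hat{x} \;=\; \arg\max_{x}\;
  \sum_i \Bigl[\, y_i \log (Ax)_i - (Ax)_i \,\Bigr]
  \;-\; \sum_j \alpha_{r(j)}\, x_j \log \frac{x_j}{m_j},
\]
% A: imaging (PSF) operator, y: observed data, m: default image,
% r(j): region containing pixel j, \alpha_{r}: region-specific hyperparameter.
```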
Abstract:
When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at reasonable length. We have decided to concentrate on image and noise models and on the algorithms to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. Then the prior models used to restore astronomical images are examined. We describe the algorithms used to find the restoration for the most common combinations of degradation and image models. Then we comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.
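The Bayesian paradigm referred to above, in standard restoration notation (x: true image, y: observed image, H: blur operator); the Richardson–Lucy remark assumes a normalized PSF and is included only as a pointer, not as a summary of the article's algorithms.

```latex
\[
  p(x \mid y) \;\propto\; p(y \mid x)\, p(x),
  \qquad
  \hat{x}_{\mathrm{MAP}} \;=\; \arg\max_x \bigl[\log p(y \mid x) + \log p(x)\bigr].
\]
% With Poisson noise and a flat prior this reduces to maximum likelihood, whose
% classical fixed-point iteration is the Richardson--Lucy algorithm:
% x^{(k+1)} = x^{(k)} \cdot H^{T}\!\bigl(y / (H x^{(k)})\bigr), applied element-wise.
```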
Abstract:
Prediction of species' distributions is central to diverse applications in ecology, evolution and conservation science. There is increasing electronic access to vast sets of occurrence records in museums and herbaria, yet little effective guidance on how best to use this information in the context of numerous approaches for modelling distributions. To meet this need, we compared 16 modelling methods over 226 species from 6 regions of the world, creating the most comprehensive set of model comparisons to date. We used presence-only data to fit models, and independent presence-absence data to evaluate the predictions. Along with well-established modelling methods such as generalised additive models, GARP, and BIOCLIM, we explored methods that either have been developed recently or have rarely been applied to modelling species' distributions. These include machine-learning methods and community models, both of which have features that may make them particularly well suited to noisy or sparse information, as is typical of species' occurrence data. Presence-only data were effective for modelling species' distributions for many species and regions. The novel methods consistently outperformed more established methods. The results of our analysis are promising for the use of data from museums and herbaria, especially as methods suited to the noise inherent in such data improve.
Abstract:
BACKGROUND: Several studies have reported increased levels of inflammatory biomarkers in chronic kidney disease (CKD), but data from the general population are sparse. In this study, we assessed levels of the inflammatory markers high-sensitivity C-reactive protein (hsCRP), tumor necrosis factor α (TNF-α), interleukin (IL)-1β and IL-6 across all ranges of renal function. METHODS: We conducted a cross-sectional study in a random sample of 6,184 Caucasian subjects aged 35-75 years in Lausanne, Switzerland. Serum levels of hsCRP, TNF-α, IL-6, and IL-1β were measured in 6,067 participants (98.1%); serum creatinine-based estimated glomerular filtration rate (eGFR(creat), CKD-EPI formula) was used to assess renal function, and the albumin/creatinine ratio on spot morning urine to assess microalbuminuria (MAU). RESULTS: Higher serum levels of IL-6, TNF-α and hsCRP and lower levels of IL-1β were associated with lower renal function, CKD (eGFR(creat) <60 ml/min/1.73 m²; n = 283), and MAU (n = 583). In multivariate linear regression analysis adjusted for age, sex, hypertension, smoking, diabetes, body mass index, lipids, and antihypertensive and lipid-lowering therapy, only log-transformed TNF-α remained independently associated with lower renal function (β = -0.54 ± 0.19). In multivariate logistic regression analysis, higher TNF-α levels were associated with CKD (OR 1.17; 95% CI 1.01-1.35), whereas higher levels of IL-6 (OR 1.09; 95% CI 1.02-1.16) and hsCRP (OR 1.21; 95% CI 1.10-1.32) were associated with MAU. CONCLUSION: We did not confirm a significant association between renal function and IL-6, IL-1β and hsCRP in the general population. However, our results demonstrate a significant association between TNF-α and renal function, suggesting a potential link between inflammation and the development of CKD. These data also confirm the association between MAU and inflammation.
Abstract:
Experimental and clinical evidence indicates that non-steroidal anti-inflammatory drugs and cyclooxygenase-2 inhibitors may have anti-cancer activities. Here we report on a patient with a metastatic melanoma of the leg who experienced a complete and sustained regression of skin metastases upon continuous single treatment with the cyclooxygenase-2 inhibitor rofecoxib. Our observations indicate that the inhibition of cyclooxygenase-2 can lead to the regression of disseminated skin melanoma metastases, even after failure of chemotherapy.
Abstract:
Understanding adaptive genetic responses to climate change is a main challenge for preserving biological diversity. Successful predictive models for climate-driven range shifts of species depend on the integration of information on adaptation, including that derived from genomic studies. Long-lived forest trees can experience substantial environmental change across generations, which results in a much more prominent adaptation lag than in annual species. Here, we show that candidate-gene SNPs (single nucleotide polymorphisms) can be used as predictors of maladaptation to climate in maritime pine (Pinus pinaster Aiton), an outcrossing long-lived keystone tree. A set of 18 SNPs potentially associated with climate, 5 of them involving amino acid-changing variants, were retained after performing logistic regression, latent factor mixed models, and Bayesian analyses of SNP-climate correlations. These relationships identified temperature as an important adaptive driver in maritime pine and highlighted that selective forces are operating differentially in geographically discrete gene pools. The frequency of the locally advantageous alleles at these selected loci was strongly correlated with survival in a common garden under extreme (hot and dry) climate conditions, which suggests that candidate-gene SNPs can be used to forecast the likely destiny of natural forest ecosystems under climate change scenarios. Differential levels of forest decline are anticipated for distinct maritime pine gene pools. Geographically defined molecular proxies for climate adaptation will thus critically enhance the predictive power of range-shift models and help establish mitigation measures for long-lived keystone forest trees in the face of impending climate change.
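An illustrative sketch of the kind of per-SNP logistic regression of allele state on a climate variable that, together with latent factor mixed models and Bayesian outlier tests, underlies the SNP-climate correlations mentioned above; the data and variable names are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_trees = 300
temperature = rng.normal(15, 4, n_trees)                              # e.g. mean annual temperature
allele = rng.binomial(1, 1 / (1 + np.exp(-(temperature - 15) / 2)))   # presence of the variant allele

X = sm.add_constant(temperature)
fit = sm.Logit(allele, X).fit(disp=0)
print(fit.params, fit.pvalues)   # slope and p-value for the SNP-climate association
```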
Abstract:
Tumor regressions following tumor-associated antigen vaccination in animal models contrast with the limited clinical outcomes in cancer patients. Most animal studies, however, used subcutaneous tumor models, and questions arise as to whether these are relevant for tumors growing in mucosae; whether specific mucosal-homing instructions are required; and how this may be influenced by the tumor.
Abstract:
Distance-based regression is a prediction method consisting of two steps: from the distances between observations we obtain latent variables, which then become the regressors in an ordinary least squares linear model. The distances are computed from the original predictors using a suitable dissimilarity function. Since the regressors are, in general, nonlinearly related to the response, they cannot be selected with the usual F test. In this work we propose a solution to this predictor selection problem by defining generalized test statistics and adapting a nonparametric bootstrap method to estimate the p-values. A numerical example with automobile insurance data is included.
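A minimal sketch of the two-step procedure described above (all names and data choices are illustrative, and the bootstrap-based predictor selection is omitted): compute a dissimilarity matrix from the predictors, extract latent variables by classical multidimensional scaling (principal coordinates), then regress the response on them by ordinary least squares.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def db_regression(X, y, k=5):
    """Fit y on the first k principal coordinates of the distance matrix of X."""
    D2 = squareform(pdist(X, metric="euclidean")) ** 2
    n = len(y)
    J = np.eye(n) - np.ones((n, n)) / n
    G = -0.5 * J @ D2 @ J                                         # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(G)
    order = np.argsort(vals)[::-1][:k]
    Z = vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))   # latent variables
    Z1 = np.column_stack([np.ones(n), Z])
    beta, *_ = np.linalg.lstsq(Z1, y, rcond=None)                 # OLS on latent regressors
    return beta, Z1 @ beta                                        # coefficients and fitted values
```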
Abstract:
Fluvial deposits are a challenge for modelling flow in sub-surface reservoirs. Connectivity and continuity of permeable bodies have a major impact on fluid flow in porous media. Contemporary object-based and multipoint statistics methods face a problem of robust representation of connected structures. An alternative approach to modelling petrophysical properties is based on a machine learning algorithm, Support Vector Regression (SVR). Semi-supervised SVR is able to establish spatial connectivity by taking into account prior knowledge of natural similarities. SVR as a learning algorithm is robust to noise and captures dependencies from all available data. Semi-supervised SVR applied to a synthetic fluvial reservoir demonstrated robust results that matched the flow performance well.
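A simple illustrative sketch of plain SVR applied to a spatial petrophysical property (not the semi-supervised variant described above); the data are synthetic and the property, here log-permeability, and parameter values are placeholders.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(42)
coords = rng.uniform(0, 1000, size=(200, 2))                        # sample locations (x, y)
log_perm = np.sin(coords[:, 0] / 150) + 0.1 * rng.normal(size=200)  # synthetic log-permeability

model = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma="scale")
model.fit(coords, log_perm)

grid = np.column_stack([np.linspace(0, 1000, 50), np.full(50, 500.0)])
print(model.predict(grid)[:5])   # predicted log-permeability along a transect
```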