898 results for Threshold regression


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Whole-pelvis intensity-modulated radiotherapy (IMRT) is increasingly being used to treat cervical cancer with the aim of reducing side effects. Encouraged by this, some groups have proposed a simultaneous integrated boost (SIB) to target the tumor, either to achieve a higher tumoricidal effect or to replace brachytherapy. Nevertheless, physiological organ movement and rapid tumor regression throughout treatment might substantially reduce any benefit of this approach. PURPOSE: To evaluate clinical target volume - simultaneous integrated boost (CTV-SIB) regression and motion during chemo-radiotherapy (CRT) for cervical cancer, and to monitor treatment progress dosimetrically and volumetrically to ensure treatment goals are met. METHODS AND MATERIALS: Ten patients treated with standard doses of CRT and brachytherapy were retrospectively re-planned using a helical Tomotherapy SIB technique for the hypothetical scenario of this feasibility study. Target and organs at risk (OAR) were contoured on deformably fused planning computed tomography and megavoltage computed tomography images. The CTV-SIB volume regression was determined. The center of mass (CM) was used to evaluate the degree of motion. The Dice similarity coefficient (DSC) was used to assess the spatial overlap of CTV-SIBs between scans. A cumulative dose-volume histogram was used to model the estimated delivered doses. RESULTS: The relative reduction of the CTV-SIB was between 31 and 70%. The mean maximum CM change was 12.5, 9, and 3 mm in the superior-inferior, antero-posterior, and right-left dimensions, respectively. The CTV-SIB DSC approached 1 in the first week of treatment, indicating almost perfect overlap. The CTV-SIB DSC decreased linearly during therapy and by the end of treatment was 0.5, indicating 50% discordance. Two patients received less than 95% of the prescribed dose. Substantially higher doses to the OAR were observed. A multiple regression analysis showed a significant interaction between CTV-SIB reduction and OAR dose increase. CONCLUSIONS: The CTV-SIB underwent substantial regression and motion during CRT, receiving lower therapeutic doses than expected. The OAR shifted unpredictably and received higher doses. The use of SIB without frequent adaptation of the treatment plan exposes cervical cancer patients to an unpredictable risk of under-dosing the target and/or over-dosing adjacent critical structures. In that scenario, brachytherapy remains the gold-standard approach.
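The Dice similarity coefficient (DSC) used above to quantify CTV-SIB overlap between scans has a simple set-based definition. A minimal sketch in Python, with invented voxel sets standing in for contoured volumes:

```python
def dice_coefficient(a, b):
    """Dice similarity coefficient: 1.0 = perfect overlap, 0.0 = none."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical voxel index sets for a CTV-SIB on two successive scans:
scan1 = {(x, y, 0) for x in range(10) for y in range(10)}  # 100 voxels
scan2 = {(x, y, 0) for x in range(5) for y in range(10)}   # regressed to 50
print(round(dice_coefficient(scan1, scan2), 3))  # 2*50/(100+50) = 0.667
```

A DSC of 0.5, as observed at the end of treatment, means the overlapping region is only half the average size of the two contours.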

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Recently, it was shown that the relation between admission glucose and functional outcome after ischemic stroke is described by a J-shaped curve, with a glucose range of 3.7-7.3 mmol/l associated with a favorable outcome. We tested the hypothesis that persistence of hyperglycemia above this threshold at 24-48 h after stroke onset impairs 3-month functional outcome. METHODS: We analyzed all patients with glucose >7.3 mmol/l on admission from the Acute STroke Registry and Analysis of Lausanne (ASTRAL). Patients were divided into two groups according to their subacute glucose level at 24-48 h after last well-being time (group 1: ≤7.3 mmol/l; group 2: >7.3 mmol/l). A favorable functional outcome was defined as a modified Rankin Scale (mRS) score ≤2 at 3 months. A multiple logistic regression analysis of demographic, clinical, laboratory and neuroimaging covariates was performed to assess predictors of an unfavorable outcome. RESULTS: A total of 1,984 patients with ischemic stroke were admitted between January 1, 2003 and October 20, 2009, within 24 h after last well-being time. In the 421 patients (21.2%) with admission glucose >7.3 mmol/l, the proportion of patients with a favorable outcome did not differ significantly between the two groups (59.2 vs. 48.7%). In the multiple logistic regression analysis, an unfavorable outcome was significantly associated with age (odds ratio, OR: 1.06, 95% confidence interval, 95% CI: 1.03-1.08 for every 10-year increase), National Institutes of Health Stroke Scale (NIHSS) score on admission (OR: 1.16, 95% CI: 1.11-1.21), prehospital mRS (OR: 12.63, 95% CI: 2.61-61.10 for patients with score >0), antidiabetic drug use (OR: 0.36, 95% CI: 0.15-0.86) and glucose on admission (OR: 1.16, 95% CI: 1.02-1.31 for every 1 mmol/l increase). No association was found between persistent hyperglycemia at 24-48 h and outcome in either diabetics or nondiabetics.
CONCLUSIONS: In ischemic stroke patients with acute hyperglycemia, persistent hyperglycemia (>7.3 mmol/l) at 24-48 h after stroke onset is not associated with a worse functional outcome at 3 months, whether or not the patient was previously diabetic.
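The odds ratios reported above come from exponentiating logistic-regression coefficients. A minimal sketch; the coefficient and standard error below are illustrative values chosen only to roughly match the reported admission-glucose OR, not the actual ASTRAL estimates:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a (default 95%) confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative values only: a coefficient of 0.148 per 1 mmol/l of
# admission glucose corresponds to an OR of about 1.16.
or_, lo, hi = odds_ratio_ci(0.148, 0.065)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.16 1.02 1.32
```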

Relevance:

20.00%

Publisher:

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate them using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some that violate standard regression assumptions. We assess the performance of each method using two measures, together with statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and so did not show the improvements in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; extrapolation of these findings to other situations should therefore be done cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
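The core issue the comparison addresses, regression on spatially autocorrelated data, can be illustrated in a few lines. A hedged sketch (an invented exponential-decay error covariance on a transect, not the paper's simulation design): both ordinary least squares and generalized least squares recover the slope, but GLS whitens with the error covariance when it is known.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Sites along a transect with exponentially decaying error correlation
coords = np.arange(n)
Sigma = np.exp(-np.abs(coords[:, None] - coords[None, :]) / 10.0)
L = np.linalg.cholesky(Sigma)

X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + L @ rng.normal(size=n)  # spatially correlated errors

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]        # ignores correlation
Si = np.linalg.inv(Sigma)                              # GLS whitening
beta_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
print(beta_ols[1], beta_gls[1])  # slope estimates, both near 2
```

The practical difference shows up not in the point estimates but in the standard errors: naive OLS standard errors are too small under positive spatial autocorrelation, which is what inflates Type I error rates.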

Relevance:

20.00%

Publisher:

Abstract:

Random scale-free networks have the peculiar property of being prone to the spreading of infections. Here we provide, for the susceptible-infected-susceptible model, an exact result showing that a scale-free degree distribution with diverging second moment is a sufficient condition for a null epidemic threshold in unstructured networks with either assortative or disassortative mixing. Degree correlations are therefore irrelevant to the epidemic spreading picture in these scale-free networks. The present result is related to the divergence of the average nearest-neighbor degree, enforced by the degree detailed balance condition.
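The diverging second moment is what drives the vanishing threshold: in the mean-field SIS picture the epidemic threshold is λ_c = ⟨k⟩/⟨k²⟩, which goes to zero as the degree cutoff grows whenever 2 < γ ≤ 3. A small numerical sketch for a truncated power-law degree distribution (the exponent and cutoffs are illustrative choices, not from the paper):

```python
def sis_threshold(gamma, kmax, kmin=3):
    """Mean-field SIS threshold <k>/<k^2> for P(k) ~ k^(-gamma), kmin<=k<=kmax."""
    ks = range(kmin, kmax + 1)
    norm = sum(k ** -gamma for k in ks)
    k1 = sum(k ** (1 - gamma) for k in ks) / norm   # <k>
    k2 = sum(k ** (2 - gamma) for k in ks) / norm   # <k^2>
    return k1 / k2

# For gamma = 2.5, <k^2> diverges with the cutoff, so the threshold vanishes:
for kmax in (10**2, 10**4, 10**5):
    print(kmax, sis_threshold(2.5, kmax))
```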

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: Neurophysiological monitoring aims to improve the safety of pedicle screw placement, but few quantitative studies have assessed its specificity and sensitivity. In this study, screw placement within the pedicle was measured (on post-operative CT scans, as the horizontal and vertical distance from the screw edge to the surface of the pedicle) and correlated with intraoperative neurophysiological stimulation thresholds. METHODS: A single surgeon placed 68 thoracic and 136 lumbar screws in 30 consecutive patients during instrumented fusion under EMG control. The female-to-male ratio was 1.6 and the average age was 61.3 years (SD 17.7). Radiological measurements, blinded to the stimulation threshold, were made on reformatted CT reconstructions using OsiriX software. A standard deviation of the screw position of 2.8 mm was determined from pilot measurements, and a screw-pedicle edge distance of 1 mm was considered the difference of interest (standardised difference of 0.35), giving a study power of 75% (significance level 0.05). RESULTS: Correct placement and stimulation thresholds above 10 mA were found in 71% of screws. Twenty-two percent of screws caused a cortical breach; 80% of these had stimulation thresholds above 10 mA (sensitivity 20%, specificity 90%). Correct prediction of screw position was more frequent for lumbar than for thoracic screws. CONCLUSION: A screw stimulation threshold of >10 mA does not indicate correct pedicle screw placement. The hypothesised gradual decrease of screw stimulation thresholds as screw placement approaches the nerve root was not observed. Aside from a robust threshold of 2 mA indicating direct contact with nervous tissue, a secondary threshold appears to depend on the patient's pathology and the surgical conditions.
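The reported sensitivity and specificity follow directly from a 2x2 confusion table. A sketch with approximate counts reconstructed from the percentages above (204 screws, roughly 22% breached, 80% of breached screws above 10 mA); the exact per-cell counts are assumptions for illustration, treating a threshold ≤10 mA as a 'positive' test for breach:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Assumed counts: 45 breached screws, 9 of which stimulated at <= 10 mA;
# 159 correctly placed screws, 143 of which stimulated above 10 mA.
sens, spec = sens_spec(tp=9, fn=36, tn=143, fp=16)
print(round(sens, 2), round(spec, 2))  # 0.2 0.9
```

The low sensitivity is the clinically relevant point: most breached screws still stimulated above the 10 mA cutoff, so a high threshold cannot be read as reassurance of correct placement.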

Relevance:

20.00%

Publisher:

Abstract:

Distance-based regression is a prediction method consisting of two steps: from the distances between observations we obtain latent variables, which become the regressors in an ordinary least squares linear model. The distances are computed from the original predictors using a suitable dissimilarity function. Since the regressors are, in general, non-linearly related to the response, they cannot be selected with the usual F test. In this work we propose a solution to this predictor-selection problem by defining generalized test statistics and adapting a non-parametric bootstrap method to estimate their p-values. A numerical example with automobile insurance data is included.
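The two-step scheme described, latent variables from distances followed by ordinary least squares, can be sketched via classical (metric) multidimensional scaling. A hedged illustration, not the authors' implementation; the toy data are invented:

```python
import numpy as np

def db_regression_fit(D, y, n_components=2):
    """Distance-based regression sketch: classical scaling of the distance
    matrix D yields latent coordinates, which serve as OLS regressors."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    G = -0.5 * J @ (D ** 2) @ J               # Gower's inner-product matrix
    w, V = np.linalg.eigh(G)
    idx = np.argsort(w)[::-1][:n_components]  # leading eigenvalues
    Z = V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
    X = np.column_stack([np.ones(n), Z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta                           # fitted values

# Toy check: with 1-D Euclidean distances the latent coordinate recovers
# the original predictor, so the fit matches an ordinary linear fit.
rng = np.random.default_rng(1)
x = rng.normal(size=40)
y = 3 * x + rng.normal(scale=0.1, size=40)
D = np.abs(x[:, None] - x[None, :])
fit = db_regression_fit(D, y, n_components=1)
print(np.corrcoef(fit, y)[0, 1])  # close to 1
```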

Relevance:

20.00%

Publisher:

Abstract:

In a compound Poisson model, we define a threshold proportional reinsurance strategy: a retention level k1 is applied whenever the reserves are below a given threshold b, and a retention level k2 otherwise. We obtain the integro-differential equation for the Gerber-Shiu function, defined in Gerber-Shiu (1998), in this model, which allows us to derive expressions for the ruin probability and for the Laplace transform of the time of ruin for various distributions of the individual claim amount. Finally, we present some numerical results.
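Where closed forms are cumbersome, the threshold strategy is straightforward to simulate. A Monte Carlo sketch of the finite-horizon ruin probability under the threshold retention rule, with exponential claims; all parameter values are invented, and as a simplification the retention level is only re-evaluated at claim times rather than continuously:

```python
import random

def ruin_prob(u=10.0, b=20.0, k1=0.6, k2=0.9, lam=1.0, mu=1.0,
              loading=0.2, horizon=200.0, n_sim=2000, seed=7):
    """Monte Carlo finite-horizon ruin probability under threshold
    proportional reinsurance: retain proportion k1 of each claim while the
    surplus is below b, k2 otherwise (compound Poisson, exp(mu) claims)."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_sim):
        surplus, t = u, 0.0
        while t < horizon:
            dt = rng.expovariate(lam)              # time to next claim
            k = k1 if surplus < b else k2
            # premium accrues at the retained rate with a safety loading
            surplus += (1 + loading) * lam * mu * k * dt
            t += dt
            if t >= horizon:
                break
            surplus -= k * rng.expovariate(1 / mu)  # retained claim share
            if surplus < 0:
                ruins += 1
                break
    return ruins / n_sim

print(ruin_prob())
```

A lower retention below the threshold b protects a weak surplus at the cost of premium income, which is exactly the trade-off the Gerber-Shiu analysis quantifies exactly.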

Relevance:

20.00%

Publisher:

Abstract:

Background: MLPA is a potentially useful semi-quantitative method to detect copy number alterations in targeted regions. In this paper, we propose a normalization procedure based on a non-linear mixed model, as well as a new approach for determining the statistical significance of altered probes based on a linear mixed model. This method establishes a threshold by using different tolerance intervals that accommodate the specific random error variability observed in each test sample. Results: Through simulation studies we show that the proposed method outperforms two existing methods based on simple threshold rules or iterative regression. We illustrate the method using a controlled MLPA assay in which targeted regions vary in copy number in individuals suffering from disorders such as Prader-Willi, DiGeorge or autism, with the proposed method showing the best performance. Conclusion: Using the proposed mixed model, we are able to determine thresholds for deciding whether a region is altered. These thresholds are specific to each individual and incorporate the experimental variability, resulting in improved sensitivity and specificity, as the examples with real data reveal.
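A crude stand-in conveys the idea of per-sample thresholds: flag probes whose normalized log-ratio falls outside an interval scaled to that sample's own variability. This sketch uses a simple mean ± k·SD rule in place of the paper's mixed-model tolerance intervals; the data and k are invented:

```python
import statistics

def flag_altered(log_ratios, k=2.0):
    """Per-sample threshold sketch: flag probes whose normalized log-ratio
    lies outside mean +/- k*sd of that sample's probes, so the cutoff
    widens for noisier samples (a stand-in for a tolerance interval)."""
    m = statistics.mean(log_ratios)
    s = statistics.stdev(log_ratios)
    return [abs(x - m) > k * s for x in log_ratios]

# One probe with a clear copy-number loss among nine normal probes:
ratios = [0.02, -0.01, 0.03, 0.0, -0.02, 0.01, -0.03, 0.02, -1.0, 0.01]
print(flag_altered(ratios))  # only the ninth probe is flagged
```

A real implementation would use a robust or model-based estimate of the per-sample error, since an outlying probe inflates the plain standard deviation, which is one motivation for the mixed-model approach.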

Relevance:

20.00%

Publisher:

Abstract:

Robust estimators for accelerated failure time models with asymmetric (or symmetric) error distributions and censored observations are proposed. It is assumed that the error model belongs to a log-location-scale family of distributions and that the mean response is the parameter of interest. Since scale is a main component of the mean, scale is not treated as a nuisance parameter. A three-step procedure is proposed. In the first step, an initial high-breakdown-point S estimate is computed. In the second step, observations that are unlikely under the estimated model are rejected or down-weighted. Finally, a weighted maximum likelihood estimate is computed. To define the estimates, functions of censored residuals are replaced by their estimated conditional expectation given that the response is larger than the observed censored value. The rejection rule in the second step is based on an adaptive cut-off that, asymptotically, does not reject any observation when the data are generated according to the model. Therefore, the final estimate attains full efficiency at the model, with respect to the maximum likelihood estimate, while maintaining the breakdown point of the initial estimator. Asymptotic results are provided. The new procedure is evaluated with the help of Monte Carlo simulations. Two examples with real data are discussed.

Relevance:

20.00%

Publisher:

Abstract:

The relationship between hypoxic stress, autophagy, and specific cell-mediated cytotoxicity remains unknown. This study shows that hypoxia-induced resistance of lung tumors to cytolytic T lymphocyte (CTL)-mediated lysis is associated with autophagy induction in the target cells. In turn, this correlates with STAT3 phosphorylation on the tyrosine 705 residue (pSTAT3) and HIF-1α accumulation. Inhibition of autophagy by siRNA targeting of either beclin1 or Atg5 resulted in impairment of pSTAT3 and restoration of hypoxic tumor cell susceptibility to CTL-mediated lysis. Furthermore, inhibition of pSTAT3 in hypoxic Atg5- or beclin1-targeted tumor cells was found to be associated with the inhibition of Src kinase (pSrc). Autophagy-induced pSTAT3 and pSrc regulation seemed to involve the ubiquitin proteasome system and p62/SQSTM1. In vivo experiments using B16-F10 melanoma tumor cells indicated that depletion of beclin1 resulted in an inhibition of B16-F10 tumor growth and increased tumor apoptosis. Moreover, in vivo inhibition of autophagy by hydroxychloroquine in B16-F10 tumor-bearing mice and in mice vaccinated with tyrosinase-related protein-2 peptide dramatically increased tumor growth inhibition. Collectively, this study establishes a novel functional link between hypoxia-induced autophagy and the regulation of antigen-specific T-cell lysis, and points to a major role of autophagy in the control of in vivo tumor growth.

Relevance:

20.00%

Publisher:

Abstract:

Division of labor in social insects is a determinant of their ecological success. Recent models emphasize that division of labor is an emergent property of the interactions among nestmates obeying simple behavioral rules. However, the role of evolution in shaping these rules has been largely neglected. Here, we investigate a model that integrates the perspectives of self-organization and evolution. Our point of departure is the response threshold model, in which we allow the thresholds to evolve. We ask whether the thresholds will evolve to a state where division of labor emerges in a form that fits the needs of the colony. We find that division of labor can indeed evolve through the evolutionary branching of thresholds, leading to workers that differ in their tendency to take on a given task. However, the conditions under which division of labor evolves depend on the strength of selection on the two fitness components considered: the amount of work performed and the distribution of workers over tasks. When selection is strongest on the amount of work performed, division of labor evolves if switching tasks is costly. When selection is strongest on worker distribution, division of labor is less likely to evolve. Furthermore, we show that a biased distribution of workers over tasks (such as 3:1) is not easily achievable by a threshold mechanism, even under strong selection. Contrary to expectation, multiple matings of colony foundresses impede the evolution of specialization. Overall, our model sheds light on the importance of considering the interaction between specific mechanisms and ecological requirements to better understand the evolutionary scenarios that lead to division of labor in complex systems. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00265-012-1343-2) contains supplementary material, which is available to authorized users.
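The fixed-threshold response mechanism that serves as the model's point of departure can be sketched directly: a worker with threshold θ engages a task of stimulus s with probability s²/(s² + θ²), and performing the task reduces the stimulus. A minimal simulation with invented parameters (no evolution of thresholds, just the behavioral rule):

```python
import random

def task_choice_sim(thresholds, steps=500, delta=1.0, alpha=3.0, seed=2):
    """Fixed-threshold model sketch: each step the stimulus s grows by
    delta; a worker with threshold theta acts with probability
    s^2/(s^2 + theta^2), and each act lowers s by alpha."""
    rng = random.Random(seed)
    s = 0.0
    work = [0] * len(thresholds)
    for _ in range(steps):
        s += delta
        for i, theta in enumerate(thresholds):
            if rng.random() < s * s / (s * s + theta * theta):
                work[i] += 1
                s = max(0.0, s - alpha)
    return work

# Two low-threshold 'specialists' and two high-threshold 'generalists':
# the low-threshold workers keep the stimulus down and do most of the work.
print(task_choice_sim([1.0, 1.0, 8.0, 8.0]))
```

Because the low-threshold workers act first at low stimulus levels, task performance self-organizes into unequal workloads even though every worker follows the same rule; the paper's question is whether selection can produce such threshold differences in the first place.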

Relevance:

20.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulating inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but was not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from the regional to the local level, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up method-complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformations and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for reproducing the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient indoor radon decision making.
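The QMI proposed in the thesis builds on the classical Morisita index of dispersion over quadrat counts. A minimal sketch of the classical index (the counts below are invented; values near 1 indicate random scatter, values above 1 clustering, below 1 regularity):

```python
def morisita_index(counts):
    """Morisita index of dispersion over Q quadrats with point counts n_i:
    I = Q * sum(n_i * (n_i - 1)) / (N * (N - 1)), where N = sum(n_i)."""
    Q = len(counts)
    N = sum(counts)
    if N < 2:
        return float("nan")
    return Q * sum(n * (n - 1) for n in counts) / (N * (N - 1))

print(morisita_index([5, 5, 5, 5]))   # perfectly even -> below 1
print(morisita_index([20, 0, 0, 0]))  # fully clustered -> Q = 4
```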