898 results for "Threshold regression"


Relevance: 20.00%

Abstract:

Protective adaptive immune responses rely on TCR-mediated recognition of Ag-derived peptides presented by self-MHC molecules. However, self-Ag (tumor)-specific TCRs are often of too low affinity for optimal function. To precisely assess the relationship between TCR-peptide-MHC binding parameters and T cell function, we tested a panel of sequence-optimized HLA-A*0201/NY-ESO-1(157-165)-specific TCR variants with affinities lying within physiological boundaries, to preserve antigenic specificity and avoid cross-reactivity, as well as two outliers (i.e., a very high- and a low-affinity TCR). Primary human CD8 T cells transduced with these TCRs demonstrated robust correlations between binding measurements of TCR affinity and avidity and the biological responses of the T cells, such as TCR cell-surface clustering, intracellular signaling, proliferation, and target cell lysis. Strikingly, above a defined TCR-peptide-MHC affinity threshold (KD < approximately 5 μM), T cell function could not be further enhanced, revealing a plateau of maximal T cell function, compatible with the notion that multiple TCRs with slightly different affinities participate equally (codominantly) in immune responses. We propose that rational design of improved self-specific TCRs need not be optimized beyond a given affinity threshold to achieve both optimal T cell function and avoidance of the unpredictable risk of cross-reactivity.

Relevance: 20.00%

Abstract:

Secondary accident statistics can be useful for studying the impact of traffic incident management strategies. An easy-to-implement methodology is presented for classifying secondary accidents by fusing a police accident database with intranet incident reports. A current method for classifying secondary accidents uses a static threshold representing the spatial and temporal region of influence of the primary accident, such as two miles and one hour. An accident is considered secondary if it occurs upstream of the primary accident, within the primary accident's duration and queue. However, a static threshold may produce both false positives and false negatives, because accident queues vary constantly. The methodology presented in this report seeks to improve on the existing method by making the threshold dynamic. An incident progression curve is used to mark the end of the queue throughout the entire incident. Four steps in the development of incident progression curves are described: (1) processing the intranet incident reports; (2) filling in incomplete incident reports; (3) nonlinear regression of the incident progression curves; and (4) merging the individual incident progression curves into one master curve. To illustrate the methodology, 5,514 accidents from Missouri freeways were analyzed. The results show that the sets of secondary accidents identified by dynamic versus static thresholds can differ by more than 30%.
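
As a rough illustration of the dynamic-threshold idea, the sketch below contrasts the static two-mile/one-hour region of influence with a hypothetical incident progression curve. The curve shape, durations, and function names are invented for illustration and are not the paper's fitted curves.

```python
# Minimal sketch: static vs. dynamic secondary-accident classification.
# The triangular progression curve below is a hypothetical stand-in for the
# paper's regression-fitted incident progression curves.

def queue_length_miles(minutes_since_primary, duration_min=60, peak_miles=2.0):
    """Hypothetical incident progression curve: queue grows, then clears."""
    if minutes_since_primary < 0 or minutes_since_primary > duration_min:
        return 0.0
    half = duration_min / 2
    if minutes_since_primary <= half:
        return peak_miles * minutes_since_primary / half                  # growth
    return peak_miles * (duration_min - minutes_since_primary) / half    # clearance

def is_secondary_static(dist_upstream_mi, minutes_since_primary):
    """Static region of influence: two miles upstream, one hour."""
    return 0 < dist_upstream_mi <= 2.0 and 0 <= minutes_since_primary <= 60

def is_secondary_dynamic(dist_upstream_mi, minutes_since_primary):
    """Dynamic threshold: inside the currently estimated queue."""
    return 0 < dist_upstream_mi <= queue_length_miles(minutes_since_primary)

# A crash 1.5 mi upstream, 50 min after the primary: the static rule calls it
# secondary, while the shrinking dynamic queue has already cleared that point.
print(is_secondary_static(1.5, 50), is_secondary_dynamic(1.5, 50))  # True False
```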

Relevance: 20.00%

Abstract:

STUDY AIM: To develop a score predicting the risk of bacteremia in cancer patients with fever and neutropenia (FN), and to evaluate its performance. METHODS: Pediatric patients with cancer presenting with FN induced by nonmyeloablative chemotherapy were observed in a prospective multicenter study. A score predicting the risk of bacteremia was developed from a multivariate mixed logistic regression model. Its cross-validated predictive performance was compared with that of published risk prediction rules. RESULTS: Bacteremia was reported in 67 (16%) of 423 FN episodes. In 34 episodes (8%), bacteremia became known only on reassessment after 8 to 24 hours of inpatient management. Predicting bacteremia at reassessment was better than predicting it at presentation with FN. A differential leukocyte count did not increase the predictive performance. The reassessment score predicting future bacteremia in the 390 episodes without known bacteremia used the following four variables: hemoglobin ≥90 g/L at presentation (weight 3), platelet count <50 G/L (weight 3), shaking chills (weight 5), and other need for inpatient treatment or observation according to the treating physician (weight 3). Applying a threshold of ≥3, the score, simplified into a low-risk checklist, predicted bacteremia with 100% sensitivity, with 54 episodes (13%) classified as low risk, and a specificity of 15%. CONCLUSIONS: This reassessment score, simplified into a low-risk checklist of four routinely accessible characteristics, identifies pediatric patients with FN at risk for bacteremia. It has the potential to help reduce antimicrobial use in, and shorten the hospital stays of, pediatric cancer patients with FN.
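
The abstract states the four checklist items, their weights, and the decision threshold explicitly, so the score itself can be written down directly; only the function and argument names below are illustrative.

```python
# Reassessment score as described in the abstract: hemoglobin >= 90 g/L at
# presentation (weight 3), platelets < 50 G/L (3), shaking chills (5), other
# need for inpatient care per the treating physician (3); episodes scoring
# >= 3 are treated as at risk for bacteremia.

def reassessment_score(hb_ge_90, plt_lt_50, shaking_chills, other_inpatient_need):
    return (3 * hb_ge_90 + 3 * plt_lt_50
            + 5 * shaking_chills + 3 * other_inpatient_need)

def low_risk(score, threshold=3):
    # Below the threshold the episode falls on the low-risk side.
    return score < threshold

s = reassessment_score(hb_ge_90=False, plt_lt_50=True,
                       shaking_chills=False, other_inpatient_need=False)
print(s, low_risk(s))  # 3 False -> at risk
```

Note that since every weight is at least 3, a single positive item already crosses the ≥3 threshold, which is why the score collapses into a simple low-risk checklist: low risk means no item is present.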

Relevance: 20.00%

Abstract:

We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables possibly deviate from normality. The various samples to be merged can differ on the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
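
The sketch below does not reproduce the paper's normal-theory parameterization; it only simulates the errors-in-variables problem the paper addresses, showing the attenuation of a naive OLS slope and the textbook moment correction that assumes a known measurement-error variance.

```python
# Illustration (not the paper's method): errors-in-variables regression with
# a deliberately non-normal true regressor. Naive OLS is attenuated toward
# zero; a moment correction with known error variance recovers the slope.
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u2 = 5_000, 2.0, 1.0
xi = rng.chisquare(df=3, size=n)                 # true regressor, non-normal
x = xi + rng.normal(0, np.sqrt(sigma_u2), n)     # observed with error
y = beta * xi + rng.normal(0, 1, n)

b_ols = np.cov(x, y)[0, 1] / np.var(x)                 # ~1.7, attenuated
b_corrected = np.cov(x, y)[0, 1] / (np.var(x) - sigma_u2)  # ~2.0
print(b_ols, b_corrected)
```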

Relevance: 20.00%

Abstract:

Random coefficient regression models have been applied in different fields and they constitute a unifying setup for many statistical problems. The nonparametric study of this model started with Beran and Hall (1992) and it has become a fruitful framework. In this paper we propose and study statistics for testing a basic hypothesis concerning this model: the constancy of coefficients. The asymptotic behavior of the statistics is investigated, and bootstrap approximations are used to determine the critical values of the test statistics. A simulation study illustrates the performance of the proposals.
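
A minimal sketch of this kind of test, assuming the simple model Y = (beta + b)X + e: when the coefficient is not constant (Var(b) > 0), the squared residuals correlate with X², and resampling residuals under the null supplies a bootstrap critical value. The statistic is a simplified stand-in, not the one proposed in the paper.

```python
# Hedged sketch of a bootstrap test for constancy of coefficients in a
# random coefficient model Y = (beta + b) * X + e.
import numpy as np

def constancy_stat(x, y):
    beta_hat = (x @ y) / (x @ x)                 # OLS through the origin
    r2 = (y - beta_hat * x) ** 2
    return abs(np.corrcoef(x ** 2, r2)[0, 1])    # large under non-constancy

def bootstrap_pvalue(x, y, n_boot=999, seed=0):
    rng = np.random.default_rng(seed)
    beta_hat = (x @ y) / (x @ x)
    resid = y - beta_hat * x
    t_obs = constancy_stat(x, y)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        # Resample residuals under the null of a constant coefficient.
        y_star = beta_hat * x + rng.choice(resid, size=len(x), replace=True)
        t_boot[b] = constancy_stat(x, y_star)
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)

x = np.linspace(1, 3, 200)
y = (2 + np.random.default_rng(1).normal(0, 0.5, 200)) * x  # random slope
print(bootstrap_pvalue(x, y))  # small p-value: constancy rejected
```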

Relevance: 20.00%

Abstract:

The concept of antibody-mediated targeting of antigenic MHC/peptide complexes on tumor cells, in order to sensitize them to T-lymphocyte cytotoxicity, represents an attractive new immunotherapy strategy. In vitro experiments have shown that an antibody chemically conjugated or fused to monomeric MHC/peptide can be oligomerized on the surface of tumor cells, rendering them susceptible to efficient lysis by MHC/peptide-restricted specific T-cell clones. However, this strategy had not yet been tested entirely in vivo in immunocompetent animals. To this aim, we took advantage of OT-1 mice, which carry a transgenic T-cell receptor specific for the ovalbumin (ova) immunodominant peptide (257-264) presented in the context of the MHC class I molecule H-2Kb. We prepared and characterized conjugates between the Fab' fragment of a high-affinity monoclonal antibody to carcinoembryonic antigen (CEA) and the H-2Kb/ova peptide complex. First, we showed in OT-1 mice that the grafting and growth of a syngeneic colon carcinoma line transfected with CEA could be specifically inhibited by systemic injections of the conjugate. Next, using CEA-transgenic C57BL/6 mice adoptively transferred with OT-1 spleen cells and immunized with ovalbumin, we demonstrated that systemic injections of the anti-CEA-H-2Kb/ova conjugate could induce specific growth inhibition and regression of well-established, palpable subcutaneous grafts of the syngeneic CEA-transfected colon carcinoma line. These results, obtained in a well-characterized syngeneic carcinoma model, demonstrate that the antibody-MHC/peptide strategy can work in vivo. Further preclinical studies, using an antiviral T-cell response, will be performed before this new form of immunotherapy can be considered for clinical use.

Relevance: 20.00%

Abstract:

Aim: This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models; (2) the effect of weighting absences to ensure a prevalence of 0.5; (3) the effect of limiting absences beyond the environmental envelope defined by presences; (4) four different methods for incorporating spatial autocorrelation; and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model.

Location: State of Vaud, western Switzerland.

Methods: Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp).

Results: Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that performed better than models fitted with the original sample prevalence, apparently mainly due to the impact of very low prevalence values on the evaluation statistics. Removing zeroes beyond the range of presences on the main environmental gradients changed the set of selected predictors, and potentially their response curve shapes; it also slightly improved model performance and stability compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly, and even better models were obtained when including local spatial autocorrelation. A novel approach to including interactions proved to be an efficient way to account for interactions between all predictors at once.

Main conclusions: Models and spatial predictions of 18 forest communities were significantly improved by using any of: (1) cross-validation as a model selection method; (2) weighted absences; (3) limited absences; (4) predictors accounting for spatial autocorrelation; or (5) a factor variable accounting for interactions between all predictors. The final choice of modelling strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice, but one should not neglect the shapes and interpretability of response curves, nor the resulting spatial predictions, in the final assessment.
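
The absence-weighting step can be sketched directly: weight the absences so both classes carry equal total weight, giving an effective prevalence of 0.5. A plain logistic regression stands in here for the GAMs fitted with grasp, and the data and names are synthetic.

```python
# Sketch of weighting absences to an effective prevalence of 0.5.
# sklearn's LogisticRegression is a stand-in for the paper's GAMs.
import numpy as np
from sklearn.linear_model import LogisticRegression

def prevalence_weights(y):
    """Weight absences so presences and absences carry equal total weight."""
    y = np.asarray(y)
    n_pres, n_abs = y.sum(), (1 - y).sum()
    w = np.ones_like(y, dtype=float)
    w[y == 0] = n_pres / n_abs   # e.g. 70 presences / 930 absences ~ 0.075
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 1.5).astype(int)  # low prevalence

model = LogisticRegression().fit(X, y, sample_weight=prevalence_weights(y))
```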

Relevance: 20.00%

Abstract:

We present a new unifying framework for investigating throughput-WIP (work-in-process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; (e) a unified treatment of the time-discounted and time-average cases.
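
As a concrete illustration of the throughput-WIP pairs traced out by threshold policies, the sketch below evaluates admission-limit policies in an M/M/1/K queue with standard closed-form formulas: each limit K yields one (throughput, WIP) point, of the kind the abstract describes as vertices of the threshold polygon. This toy queue is a stand-in for, not an instance of, the Chen-Yao intensity control model.

```python
# Throughput-WIP pairs of admission-threshold policies in an M/M/1/K queue,
# computed from the standard stationary distribution.
import numpy as np

def mm1k_throughput_wip(lam, mu, K):
    rho = lam / mu
    if np.isclose(rho, 1.0):
        p = np.full(K + 1, 1.0 / (K + 1))
    else:
        p = (1 - rho) * rho ** np.arange(K + 1) / (1 - rho ** (K + 1))
    throughput = lam * (1 - p[K])         # admitted (= served) rate
    wip = np.dot(np.arange(K + 1), p)     # mean number in system
    return throughput, wip

for K in range(1, 8):                     # each threshold K is one vertex
    print(K, mm1k_throughput_wip(0.8, 1.0, K))
```

The printed points show the diminishing-returns pattern the abstract mentions: each extra unit of WIP buys less additional throughput.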

Relevance: 20.00%

Abstract:

The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite difference type approximation scheme is used on a coarse grid of low discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
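
A minimal sketch of the two numerical ingredients, assuming a toy value function: sample a coarse Sobol (low-discrepancy) grid, then recover values at intermediate points by regression on a polynomial basis. No control problem is actually solved here.

```python
# Coarse low-discrepancy grid + regression at intermediate points.
# The "value function" is a toy stand-in, not a solved control problem.
import numpy as np
from scipy.stats import qmc

d = 4                                    # dimension of the state space
sobol = qmc.Sobol(d=d, scramble=True, seed=0)
X = sobol.random_base2(m=8)              # 256 coarse grid points in [0,1]^d
v = np.exp(-np.sum(X ** 2, axis=1))      # toy "value function" samples

def features(X):
    """Quadratic regression basis: 1, x_i, x_i * x_j."""
    cols = [np.ones(len(X))]
    for i in range(X.shape[1]):
        cols.append(X[:, i])
        for j in range(i, X.shape[1]):
            cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(features(X), v, rcond=None)
x_new = np.full((1, d), 0.5)             # an intermediate point
print(features(x_new) @ coef, np.exp(-d * 0.25))  # regression vs. truth
```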

Relevance: 20.00%

Abstract:

In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between the new and classical estimators. For a simple family of weights, and considering the IMSE as global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
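
A minimal sketch of a Nadaraya-Watson estimator with additional weights on a fixed design, assuming a Gaussian kernel; the weight family studied in the paper is not reproduced, and setting all weights to one recovers the classical estimator.

```python
# Weighted Nadaraya-Watson kernel regression on a fixed design.
import numpy as np

def weighted_nw(x0, x, y, w, h):
    """Weighted Nadaraya-Watson estimate at x0 with bandwidth h."""
    k = np.exp(-0.5 * ((x - x0) / h) ** 2)     # Gaussian kernel
    return np.sum(w * k * y) / np.sum(w * k)

x = np.linspace(0, 1, 100)                     # fixed design
y = np.sin(2 * np.pi * x) + np.random.default_rng(0).normal(0, 0.2, 100)
w = np.ones_like(x)                            # w_i = 1: classical estimator
print(weighted_nw(0.25, x, y, w, h=0.05))      # ~ sin(pi/2) = 1
```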

Relevance: 20.00%

Abstract:

In this paper we examine the determinants of wages and decompose the observed differences across genders into the "explained by different characteristics" and "explained by different returns" components, using a sample of Spanish workers. Apart from the conditional expectation of wages, we estimate the conditional quantile functions for men and women and find that both the absolute wage gap and the part attributed to different returns at each of the quantiles, far from being well represented by their counterparts at the mean, are greater as we move up in the wage range.
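
A hedged sketch of the exercise: fit quantile regressions separately by gender and split the gap at each quantile into characteristics and returns parts, Oaxaca-style. The decompositions used in this literature (e.g., counterfactual-distribution methods) are more involved than this split, and the data and returns below are synthetic.

```python
# Quantile regressions by gender and a simple Oaxaca-style split of the gap
# at each quantile into characteristics and returns components.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

def sample(ret_edu, base):
    edu = rng.uniform(8, 18, n)
    X = sm.add_constant(edu)
    y = base + ret_edu * edu + rng.normal(0, 1 + 0.05 * edu, n)  # heteroskedastic
    return X, y

Xm, ym = sample(ret_edu=0.12, base=1.0)     # "men"
Xf, yf = sample(ret_edu=0.09, base=0.9)     # "women"

for q in (0.1, 0.5, 0.9):
    bm = sm.QuantReg(ym, Xm).fit(q=q).params
    bf = sm.QuantReg(yf, Xf).fit(q=q).params
    gap = Xm.mean(0) @ bm - Xf.mean(0) @ bf
    returns_part = Xf.mean(0) @ (bm - bf)   # "different returns" component
    print(q, round(gap, 3), round(returns_part, 3))
```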

Relevance: 20.00%

Abstract:

Most facility location decision models ignore the fact that, for a facility to survive, it needs a minimum demand level to cover costs. In this paper we present a decision model for a firm that wishes to enter a spatial market where several competitors are already located. This market is such that each outlet faces a demand threshold level that has to be achieved in order to survive. The firm wishes to know where to locate its outlets so as to maximize its market share, taking the threshold level into account. It may happen that, due to the new entrant, some competitors will not be able to meet the threshold and will therefore disappear. A formulation is presented, together with a heuristic solution method and computational experience.
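
A toy greedy sketch of the entry problem, under a nearest-outlet demand allocation invented for illustration (not the paper's formulation or heuristic): choose locations to maximize captured demand, then check which outlets, the entrant's or the competitors', fall below the survival threshold.

```python
# Toy greedy entry heuristic with a demand threshold for outlet survival.
import numpy as np

rng = np.random.default_rng(0)
demand_pts = rng.uniform(0, 10, size=(60, 2))     # demand points, unit weight
competitors = rng.uniform(0, 10, size=(3, 2))
candidates = rng.uniform(0, 10, size=(15, 2))
THRESHOLD, r = 8.0, 2                             # survival level, outlets to open

def shares(own):
    """Assign each demand point to its nearest outlet (own or competitor)."""
    all_sites = np.vstack([own, competitors])
    d = np.linalg.norm(demand_pts[:, None] - all_sites[None], axis=2)
    counts = np.bincount(d.argmin(axis=1), minlength=len(all_sites)).astype(float)
    return counts[:len(own)], counts[len(own):]

chosen = []
for _ in range(r):                                # greedy location choice
    best = max((c for c in candidates
                if not any(np.array_equal(c, s) for s in chosen)),
               key=lambda c: shares(np.array(chosen + [c]))[0].sum())
    chosen.append(best)

own_counts, comp_counts = shares(np.array(chosen))
print("own demand:", own_counts, "own below threshold:", own_counts < THRESHOLD)
print("competitors below threshold:", comp_counts < THRESHOLD)
```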

Relevance: 20.00%

Abstract:

The objective of this paper is to compare the performance of two predictive radiological models, logistic regression (LR) and a neural network (NN), with five different resampling methods. One hundred sixty-seven patients with proven calvarial lesions as the only known disease were enrolled. Clinical and CT data were used for the LR and NN models. Both models were developed with cross-validation, leave-one-out, and three different bootstrap algorithms. The final results of each model were compared with the error rate and the area under the receiver operating characteristic curve (Az). The neural network obtained a statistically higher Az than LR with cross-validation; the remaining resampling validation methods did not reveal statistically significant differences between the LR and NN rules. The neural network classifier performs better than the one based on logistic regression. This advantage is well detected by three-fold cross-validation, but remains unnoticed when leave-one-out or bootstrap algorithms are used.
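
The comparison design can be sketched with off-the-shelf tools: logistic regression versus a small neural network, scored by AUC (the abstract's Az) under three-fold cross-validation and one bootstrap resample with out-of-bag evaluation. The data below are a synthetic stand-in for the clinical and CT features, and the network architecture is an assumption.

```python
# LR vs. NN compared under cross-validation and a bootstrap resample,
# scored by AUC. Synthetic data with the abstract's sample size (n = 167).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=167, n_features=10, random_state=0)
models = {"LR": LogisticRegression(max_iter=1000),
          "NN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                              random_state=0)}

for name, m in models.items():
    cv_auc = cross_val_score(m, X, y, cv=3, scoring="roc_auc")  # three-fold CV
    # One bootstrap resample: train on the resample, score out-of-bag cases.
    rng = np.random.default_rng(0)
    idx = rng.integers(0, len(y), len(y))
    oob = np.setdiff1d(np.arange(len(y)), idx)
    m.fit(X[idx], y[idx])
    boot_auc = roc_auc_score(y[oob], m.predict_proba(X[oob])[:, 1])
    print(name, cv_auc.mean().round(3), round(boot_auc, 3))
```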