994 results for quasi-linear utility


Relevance:

20.00%

Publisher:

Abstract:

The glioma CpG island methylator phenotype (G-CIMP) has been shown to be highly correlated with prognosis and was noted to be highly concordant with IDH1 mutation in malignant glioma in the limited number of samples analyzed. To better understand the relationship of G-CIMP with IDH1 mutation status and patient outcome, we examined G-CIMP status in detail in a larger retrospective series of glioblastomas as well as tumor samples from the RTOG 0525 clinical trial. Samples were tested for 6 CIMP markers and were correlated with patient outcomes. In the retrospective tumor set (n = 301), we found 3 distinct survival groups based on the number of CIMP markers: 0-1 (CIMP-negative), 2-4 (CIMP-intermediate), and 5 or greater (CIMP-positive), with median survivals of 13.8, 20.1, and 90.6 months, respectively. This finding was validated in the RTOG 0525 samples (median survivals 15.0, 20.3, and 37.0 months). Among 787 cases with both IDH and CIMP data, 617 were CIMP-negative, 136 were CIMP-intermediate, and 34 were CIMP-positive. Seven hundred forty-four were wild type for IDH1 mutation, and 43 were mutant. CIMP and IDH status were positively correlated, but outliers were found. Among the 610 CIMP-negative tumors, there were 7 IDH-mutant tumors, which showed no difference in outcome. Similarly, among the 34 CIMP-positive tumors, there were 21 IDH-mutant cases, which also showed no difference in outcome. However, among the CIMP-intermediate cases, there were 15 IDH-mutant cases with significantly (p = 0.0003) improved outcome (medians not reached vs. 18.5 months; 2-year survival 87% vs. 32%). Multivariate analysis showed that both IDH1 mutation status and CIMP status were independent predictors of outcome. These findings suggest the clinical utility of refining CIMP status into negative, intermediate, and positive groups and indicate that both IDH1 and CIMP status are important molecular markers in GBM.
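
The three-group refinement described above amounts to a simple thresholding rule on the number of positive CIMP markers. The sketch below is a minimal illustration of that rule; the function name and interface are my own, not part of the study.

```python
def classify_cimp(positive_markers: int) -> str:
    """Map the number of positive CIMP markers (out of the 6 tested)
    to the three prognostic groups reported in the abstract."""
    if not 0 <= positive_markers <= 6:
        raise ValueError("expected a marker count between 0 and 6")
    if positive_markers <= 1:
        return "CIMP-negative"       # median survival 13.8 months (retrospective set)
    if positive_markers <= 4:
        return "CIMP-intermediate"   # 20.1 months
    return "CIMP-positive"           # 90.6 months

print(classify_cimp(3))  # -> CIMP-intermediate
```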

Relevance:

20.00%

Publisher:

Abstract:

In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that this highest scoring gene can be stored and updated incrementally. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
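
A minimal sketch of the linear-time scan described above, under simplifying assumptions of my own: each candidate exon is an interval (acceptor, donor) with an additive score, "compatible" simply means nonoverlapping and upstream, and reading frames and the Gene Model are omitted. Scanning simultaneously by increasing acceptor and donor position lets a single running maximum replace the quadratic lookup over all preceding exons.

```python
from dataclasses import dataclass

@dataclass
class Exon:
    acceptor: int   # start position in the query sequence
    donor: int      # end position
    score: float

def best_gene_score(exons):
    """Highest additive score of a chain of nonoverlapping exons,
    computed in one simultaneous scan by acceptor and donor position."""
    order_acc = sorted(range(len(exons)), key=lambda i: exons[i].acceptor)
    order_don = sorted(range(len(exons)), key=lambda i: exons[i].donor)
    best_ending = [0.0] * len(exons)  # best chain ending with each exon
    best_upstream = 0.0               # best chain ending strictly upstream
    j, overall = 0, 0.0
    for i in order_acc:
        # fold in every exon whose donor lies upstream of this acceptor
        while j < len(order_don) and exons[order_don[j]].donor < exons[i].acceptor:
            best_upstream = max(best_upstream, best_ending[order_don[j]])
            j += 1
        best_ending[i] = exons[i].score + best_upstream
        overall = max(overall, best_ending[i])
    return overall

exons = [Exon(1, 100, 3.0), Exon(150, 300, 2.5), Exon(90, 200, 4.0)]
print(best_gene_score(exons))  # -> 5.5 (first and second exons chained)
```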

Relevance:

20.00%

Publisher:

Abstract:

Error-correcting codes and matroids have been widely used in the study of ordinary secret sharing schemes. In this paper, the connections between codes, matroids, and a special class of secret sharing schemes, namely, multiplicative linear secret sharing schemes (LSSSs), are studied. Such schemes are known to enable multiparty computation protocols secure against general (nonthreshold) adversaries. Two open problems related to the complexity of multiplicative LSSSs are considered in this paper. The first one deals with strongly multiplicative LSSSs. As opposed to the case of multiplicative LSSSs, it is not known whether there is an efficient method to transform an LSSS into a strongly multiplicative LSSS for the same access structure with a polynomial increase of the complexity. A property of strongly multiplicative LSSSs that could be useful in solving this problem is proved. Namely, using a suitable generalization of the well-known Berlekamp–Welch decoder, it is shown that all strongly multiplicative LSSSs enable efficient reconstruction of a shared secret in the presence of malicious faults. The second one is to characterize the access structures of ideal multiplicative LSSSs. Specifically, the considered open problem is to determine whether all self-dual vector space access structures are in this situation. By the aforementioned connection, this in fact constitutes an open problem about matroid theory, since it can be restated in terms of representability of identically self-dual matroids by self-dual codes. A new concept, the flat-partition, is introduced, which provides a useful classification of identically self-dual matroids. Uniform identically self-dual matroids, which are known to be representable by self-dual codes, form one of the classes. It is proved that this property also holds for the family of matroids that, in a natural way, is the next class in the above classification: the identically self-dual bipartite matroids.
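
The canonical example behind multiplicative LSSSs is Shamir's scheme with n >= 2t + 1 parties: the coordinate-wise product of two share vectors is itself a valid (degree-2t) sharing of the product of the secrets. The sketch below demonstrates this standard fact; it is illustrative background, not code from the paper.

```python
import random

P = 2**61 - 1  # prime modulus for the field

def share(secret, t, n):
    """Shamir-share `secret` with a random degree-t polynomial f; party x gets f(x)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
            for x in range(1, n + 1)]

def interpolate_at_zero(xs, ys):
    """Lagrange interpolation of the points (xs, ys) evaluated at 0."""
    total = 0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        num = den = 1
        for j, xj in enumerate(xs):
            if i != j:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

t, n = 2, 5                                   # n >= 2t + 1
a, b = 1234, 5678
prod_shares = [sa * sb % P for sa, sb in zip(share(a, t, n), share(b, t, n))]
xs = list(range(1, n + 1))
assert interpolate_at_zero(xs, prod_shares) == a * b % P  # product reconstructed
print("coordinate-wise products form a degree-2t sharing of a*b")
```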

Relevance:

20.00%

Publisher:

Abstract:

A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detections are performed in the same array, which can be hardware-efficient. The all-swap lattice reduction (ASLR) algorithm is considered for the systolic design. ASLR is a variant of the LLL algorithm that processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms has very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
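
The paper's ASLR runs inside a systolic array; as rough background, the NumPy sketch below shows a sequential "all-swap" LLL-style pass in the spirit described above: size-reduce every basis vector, then swap every disjoint adjacent pair that violates the Lovász condition, iterating until no swap occurs. This is my own reconstruction of the general idea for illustration, not the authors' algorithm.

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization B* and coefficients mu for the rows of B."""
    n = len(B)
    Bs = np.array(B, dtype=float)
    mu = np.zeros((n, n))
    for i in range(n):
        for j in range(i):
            mu[i, j] = B[i] @ Bs[j] / (Bs[j] @ Bs[j])
            Bs[i] -= mu[i, j] * Bs[j]
    return Bs, mu

def all_swap_pass(B, delta=0.75):
    """One iteration: size-reduce all vectors, then swap all disjoint
    adjacent pairs violating the Lovasz condition. Returns True if any swap."""
    n = len(B)
    _, mu = gram_schmidt(B)
    for i in range(1, n):                       # size reduction
        for j in range(i - 1, -1, -1):
            q = int(round(mu[i, j]))
            if q:
                B[i] -= q * B[j]
                mu[i, :j] -= q * mu[j, :j]
                mu[i, j] -= q
    Bs, mu = gram_schmidt(B)
    norms = (Bs ** 2).sum(axis=1)
    swapped, k = False, 1
    while k < n:
        if delta * norms[k - 1] > norms[k] + mu[k, k - 1] ** 2 * norms[k - 1]:
            B[[k - 1, k]] = B[[k, k - 1]]       # Lovasz condition violated: swap
            swapped, k = True, k + 2            # disjoint pairs only, per pass
        else:
            k += 1
    return swapped

def reduce_basis(B, delta=0.75, max_passes=200):
    B = np.array(B, dtype=np.int64)
    for _ in range(max_passes):
        if not all_swap_pass(B, delta):
            break
    return B

print(reduce_basis([[201, 37], [1648, 297]]))  # yields a shorter, near-orthogonal basis
```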

Relevance:

20.00%

Publisher:

Abstract:

The transmembrane protein HER2 is over-expressed in approximately 15% of invasive breast cancers as a result of HER2 gene amplification. HER2 proteolytic cleavage (HER2 shedding) generates soluble truncated HER2 molecules (sHER2) that include only the extracellular domain and whose concentration can be measured in the serum fraction of blood. HER2 shedding also generates a constitutively active truncated intracellular receptor of 95 kDa (p95(HER2)). Another soluble truncated HER2 protein (Herstatin), which can also be found in serum, is the product of an alternatively spliced HER2 transcript. Recent preclinical findings may provide crucial insights into the biological and clinical relevance of increased sHER2 concentrations for the outcome of HER2-positive breast cancer and for sensitivity to trastuzumab and lapatinib treatment. We present here the most recent findings about the role and biology of sHER2, based on data obtained using a standardized test for measuring sHER2 that was cleared by the FDA in 2000. This test includes quality control assessments and has already been widely used to evaluate the clinical utility of sHER2 as a biomarker in breast cancer. We describe in detail data concerning the assessment of sHER2 as a surrogate marker to optimize the evaluation of the HER2 status of a primary tumor and as a prognostic and predictive marker of response to therapies, both in early and metastatic breast cancer.

Relevance:

20.00%

Publisher:

Abstract:

An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled "Advances in GLMs/GAMs modeling: from species distribution to environmental management," held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, and provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression (an alternative to stepwise selection of predictors) and methods for identifying interactions through a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of the application of these methods to ecological modeling.
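
To make the setting concrete, here is a minimal sketch of the kind of GLM typically used in species distribution modeling: a binomial GLM (logistic regression) of presence/absence on environmental predictors, fitted with statsmodels. The data and variable names are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Invented data: species presence/absence against two environmental predictors
n = 500
temp = rng.normal(10, 3, n)          # mean annual temperature (deg C)
precip = rng.normal(1200, 300, n)    # annual precipitation (mm)
logit = -8 + 0.6 * temp + 0.002 * precip
presence = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Binomial GLM with the default logit link
X = sm.add_constant(np.column_stack([temp, precip]))
fit = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
print(fit.summary())                 # coefficients, deviance, and diagnostics
```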

Relevance:

20.00%

Publisher:

Abstract:

Pavement settlement occurring in and around utility cuts is a common problem, resulting in uneven pavement surfaces, annoyance to drivers, and, ultimately, further maintenance. A survey of municipal authorities and field and laboratory investigations were conducted to identify the factors contributing to the settlement of utility cut restorations in pavement sections. Survey responses were received from seven cities across Iowa and indicate that utility cut restorations often last less than two years. Observations made during site inspections showed that backfill material varies from one city to another, backfill lift thickness often exceeds 12 inches, and the backfill material is often placed at bulking moisture contents with no quality control/quality assurance. Laboratory investigation of the backfill materials indicates that, at the field moisture contents encountered, the backfill materials have collapse potentials of up to 35%. Falling weight deflectometer (FWD) deflection data and elevation shots indicate that the maximum deflection in the pavement occurs in the area around the utility cut restoration. The FWD data indicate a zone of influence around the perimeter of the restoration extending two to three feet beyond the trench perimeter. The research team proposes controlling moisture, compacting granular backfill to 65% relative density, and removing and recompacting the native material near the ground surface around the trench. Test sections with geogrid reinforcement were also incorporated. The performance of inspected and proposed utility cuts needs to be monitored for at least two more years.
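
For reference, the 65% relative-density compaction target mentioned above follows the standard geotechnical definition of relative density in terms of dry unit weights. The sketch below evaluates that formula; the numeric values are illustrative, not measurements from the study.

```python
def relative_density(gamma_d, gamma_min, gamma_max):
    """Relative density Dr (fraction) from the field dry unit weight and the
    laboratory minimum/maximum dry unit weights (all in the same units)."""
    return (gamma_d - gamma_min) / (gamma_max - gamma_min) * (gamma_max / gamma_d)

# Illustrative values in pcf (hypothetical, not from the study)
dr = relative_density(gamma_d=105.0, gamma_min=95.0, gamma_max=118.0)
print(f"Dr = {dr:.0%}")  # compare against the proposed 65% target
```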

Relevance:

20.00%

Publisher:

Abstract:

This paper revisits previous work by the authors on the relationship between non-quasi-competitiveness (the increase in price caused by an increase in the number of oligopolists) and stability of the equilibrium in the classical Cournot oligopoly model. Although it has been widely accepted in the literature that the loss of quasi-competitiveness is linked, in the long run as new firms enter the market, to instability of the model, the authors in their previous work put forward a model in which a monopoly became a duopoly, losing quasi-competitiveness while maintaining the stability of the equilibrium. That model could not, at the time, be extended to any number of oligopolists. The present paper exhibits such an extension: an oligopoly model in which the loss of quasi-competitiveness persists no matter how many firms are in the market and in which the successive Cournot equilibrium points are unique and asymptotically stable. In this way, the conjecture that non-quasi-competitiveness and instability are equivalent in the long run is, for the first time, proved false.
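
For contrast with the construction above, the sketch below computes the textbook quasi-competitive benchmark: symmetric Cournot competition with linear inverse demand P = a - bQ and constant marginal cost c, where each firm produces q = (a - c)/(b(n + 1)) and the equilibrium price (a + nc)/(n + 1) falls as the number of firms n grows. The paper builds models in which this monotonicity fails while stability is preserved. The parameter values here are illustrative only.

```python
def cournot_price(n, a=100.0, b=1.0, c=10.0):
    """Equilibrium price in symmetric Cournot with inverse demand P = a - b*Q
    and constant marginal cost c."""
    q = (a - c) / (b * (n + 1))   # each firm's equilibrium output
    return a - b * n * q          # equals (a + n*c) / (n + 1)

for n in (1, 2, 5, 10, 100):
    # price decreases toward c as n grows: the quasi-competitive case
    print(f"n = {n:3d}: P = {cournot_price(n):.2f}")
```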

Relevance:

20.00%

Publisher:

Abstract:

Whether providing additional resources to local communities leads to improved public services, and to better outcomes more generally, given existing management capacity and incentive and accountability structures, is an unresolved yet important question for public policy. This paper uses a regression-discontinuity design to evaluate the effect of unrestricted fiscal transfers on local spending (including on education), schooling, and learning in Brazil. Results show that transfers increase local public spending almost one for one, with no evidence of crowding out own revenue or other revenue sources. An extra 1,000 reais in per capita transfers leads to about 0.42 additional years of elementary schooling, and student literacy rates increase by about 5.6 percentage points on average. Part of this effect arises through higher teacher-student ratios in municipal elementary school systems. Results also suggest that additional resources have stronger effects in more rural and less developed parts of Brazil.
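
The identification strategy above is a regression-discontinuity design; as generic background, the sketch below estimates a sharp-RD treatment effect by local linear regression with separate slopes on either side of a cutoff. The data, cutoff, and bandwidth are invented placeholders, not the paper's.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Invented data: outcome jumps by 2.0 at the cutoff of the running variable
n = 2000
running = rng.uniform(-1, 1, n)                  # running variable, cutoff at 0
treated = (running >= 0).astype(float)
y = 1.0 + 0.5 * running + 2.0 * treated + rng.normal(0, 1, n)

# Sharp RD: local linear fit with separate slopes, inside a bandwidth h
h = 0.5
keep = np.abs(running) <= h
X = sm.add_constant(np.column_stack([treated, running, treated * running])[keep])
fit = sm.OLS(y[keep], X).fit()
print(f"estimated jump at the cutoff: {fit.params[1]:.2f}")  # coefficient on `treated`
```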

Relevance:

20.00%

Publisher:

Abstract:

This article introduces a model of rationality that combines procedural utility over actions with consequential utility over payoffs. It applies the model to the Prisoner's Dilemma and shows that empirically observed cooperative behaviors can be rationally explained by a procedural utility for cooperation. The model characterizes the situations in which cooperation emerges as a Nash equilibrium. When rational individuals are not solely concerned with the consequences of their behavior but also care about the process by which these consequences are obtained, there is no single rational solution to a Prisoner's Dilemma. Rational behavior depends on the payoffs at stake and on the procedural utility of the individuals. In this manner, this model of procedural utility reflects how ethical considerations, social norms, or emotions can transform a game of consequences.
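
To make the equilibrium characterization concrete, here is a toy version under assumptions of my own: a fixed procedural bonus b for choosing to cooperate is added to the standard Prisoner's Dilemma payoffs T > R > P > S, and mutual cooperation becomes a Nash equilibrium exactly when b >= T - R.

```python
from itertools import product

# Standard PD consequential payoffs (T > R > P > S), row player's payoff
T, R, P, S = 5, 3, 1, 0
payoff = {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}

def utility(own, other, b):
    """Consequential payoff plus a procedural bonus b for cooperating."""
    return payoff[(own, other)] + (b if own == "C" else 0)

def is_nash(profile, b):
    """True if neither player gains by a unilateral deviation."""
    flip = {"C": "D", "D": "C"}
    me, other = profile
    return (utility(me, other, b) >= utility(flip[me], other, b)
            and utility(other, me, b) >= utility(flip[other], me, b))

for b in (0, 1, 2, 3):
    eqs = [p for p in product("CD", repeat=2) if is_nash(p, b)]
    print(f"b = {b}: Nash equilibria = {eqs}")  # (C, C) appears once b >= T - R = 2
```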