995 results for "Rational complexity"


Relevance: 30.00%

Publisher:

Abstract:

Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to use the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion. Due to computational difficulties, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms relies on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
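The thesis's efficient algorithms are not reproduced here, but the Bernoulli (two-category multinomial) case illustrates what the NML criterion computes. The sketch below evaluates the parametric complexity by direct summation rather than by any fast recurrence, and the function names are illustrative, not from the thesis:

```python
from math import comb, log

def bernoulli_regret(n):
    # Parametric complexity C(n) of the Bernoulli model class:
    # C(n) = sum_k C(n, k) * (k/n)^k * ((n-k)/n)^(n-k)
    # (direct O(n) summation; 0**0 == 1 handles the endpoints)
    total = 0.0
    for k in range(n + 1):
        p = k / n
        total += comb(n, k) * p**k * (1 - p)**(n - k)
    return total

def nml_code_length(k, n):
    # Stochastic complexity (in nats) of a binary sequence of length n
    # containing k ones: -log(maximized likelihood) + log C(n)
    p = k / n
    ml = p**k * (1 - p)**(n - k)
    return -log(ml) + log(bernoulli_regret(n))
```

The NML distribution divides each sequence's maximized likelihood by C(n), so the second term is the (uniform) extra code length paid for the model class's flexibility.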

Relevance: 30.00%

Publisher:

Abstract:

For any q > 1, let MOD_q be a quantum gate that determines if the number of 1's in the input is divisible by q. We show that for any q,t > 1, MOD_q is equivalent to MOD_t (up to constant depth). Based on the case q=2, Moore has shown that quantum analogs of AC^(0), ACC[q], and ACC, denoted QAC^(0)_wf, QACC[2], QACC respectively, define the same class of operators, leaving q > 2 as an open question. Our result resolves this question, implying that QAC^(0)_wf = QACC[q] = QACC for all q. We also prove the first upper bounds for QACC in terms of related language classes. We define classes of languages EQACC, NQACC (both for arbitrary complex amplitudes) and BQACC (for rational number amplitudes) and show that they are all contained in TC^(0). To do this, we show that a TC^(0) circuit can keep track of the amplitudes of the state resulting from the application of a QACC operator using a constant-width, polynomial-size tensor sum. In order to accomplish this, we also show that TC^(0) can perform iterated addition and multiplication in certain field extensions.
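For reference, the Boolean predicate underlying the MOD_q gate is a plain counting check; this few-line sketch shows the classical predicate only, not a quantum simulation:

```python
def mod_q(bits, q):
    # Classical analogue of the MOD_q gate: accept iff the number of
    # 1's in the input is divisible by q (q = 2 is ordinary parity).
    return sum(bits) % q == 0
```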

Relevance: 30.00%

Publisher:

Abstract:

In the late seventies, Megiddo proposed a way to use an algorithm for the problem of minimizing a linear function a(0) + a(1)x(1) + ... + a(n)x(n) subject to certain constraints to solve the problem of minimizing a rational function of the form (a(0) + a(1)x(1) + ... + a(n)x(n))/(b(0) + b(1)x(1) + ... + b(n)x(n)) subject to the same set of constraints, assuming that the denominator is always positive. Using a rather strong assumption, Hashizume et al. extended Megiddo's result to include approximation algorithms. Their assumption essentially asks for the existence of good approximation algorithms for optimization problems with possibly negative coefficients in the (linear) objective function, which is rather unusual for most combinatorial problems. In this paper, we present an alternative extension of Megiddo's result for approximations that avoids this issue and applies to a large class of optimization problems. Specifically, we show that, if there is an alpha-approximation for the problem of minimizing a nonnegative linear function subject to constraints satisfying a certain increasing property, then there is an alpha-approximation (respectively, a 1/alpha-approximation) for the problem of minimizing (respectively, maximizing) a nonnegative rational function subject to the same constraints. Our framework applies to covering problems and network design problems, among others.
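The paper's black-box reduction for approximation algorithms is not reproduced here, but the exact parametric idea behind Megiddo's approach can be sketched with a Dinkelbach-style iteration; the brute-force oracle over an explicitly enumerated feasible set and the function names are illustrative assumptions:

```python
def minimize_ratio(solutions, a, b, tol=1e-9):
    # Minimize a(x)/b(x) over a finite feasible set, assuming b(x) > 0,
    # via the parametric subproblem: minimize a(x) - lam * b(x).
    # When the parametric minimum reaches ~0, lam is the optimal ratio.
    x = solutions[0]
    while True:
        lam = a(x) / b(x)
        # Oracle: exact minimizer of the parametric (linearized) objective.
        y = min(solutions, key=lambda s: a(s) - lam * b(s))
        if a(y) - lam * b(y) >= -tol:
            return x, lam
        x = y
```

For example, with `a = lambda s: 1 + 3*s[0] + s[1]` and `b = lambda s: 1 + s[0] + 2*s[1]` over `[(1, 0), (0, 1), (1, 1)]`, the iteration converges to the solution `(0, 1)` with ratio 2/3.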

Relevance: 30.00%

Publisher:

Abstract:

Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to break down when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical strategy paradigms employed to manage quality are reviewed, and the advantages and limitations of each are highlighted. A major implication of this review is that, when faced with complexity, an ideological commitment to any single strategy paradigm for the management of quality is ineffective. A case study is used to demonstrate the need for an integrative, multi-paradigm approach to the management of quality as complexity increases.

Relevance: 30.00%

Publisher:

Abstract:

Targeted cancer therapy aims to disrupt aberrant cellular signalling pathways. Biomarkers are surrogates of pathway state, but there is limited success in translating candidate biomarkers to clinical practice due to the intrinsic complexity of pathway networks. Systems biology approaches afford a better understanding of the complex, dynamical interactions in signalling pathways targeted by anticancer drugs. However, adoption of dynamical modelling by clinicians and biologists is impeded by model inaccessibility. Drawing on computer games technology, we present a novel visualisation toolkit, SiViT, that converts systems biology models of cancer cell signalling into interactive simulations that can be used without specialist computational expertise. SiViT allows clinicians and biologists to directly introduce, for example, loss-of-function mutations and specific inhibitors. SiViT animates the effects of these introductions on pathway dynamics, suggesting further experiments and assessing candidate biomarker effectiveness. In a systems biology model of Her2 signalling we experimentally validated predictions made using SiViT, revealing the dynamics of biomarkers of drug resistance and highlighting the role of pathway crosstalk. No model is ever complete: the iteration of real data and simulation facilitates the continued evolution of more accurate, useful models. SiViT will make libraries of models accessible to support preclinical research, combinatorial strategy design and biomarker discovery.
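SiViT itself and the Her2 model are not reproduced here, but the kind of dynamical model such a toolkit animates can be illustrated with a toy one-species activation equation in which an inhibitor parameter scales down the activation rate, as a drug would; the model, rates and names are entirely hypothetical:

```python
def simulate_pathway(k_act=1.0, k_deact=0.5, inhibitor=0.0, dt=0.01, steps=1000):
    # Toy signalling model (hypothetical, not SiViT's Her2 network):
    # d(active)/dt = k_act*(1 - inhibitor)*(1 - active) - k_deact*active
    # integrated with forward Euler; returns the active fraction at the end.
    active = 0.0
    for _ in range(steps):
        d = k_act * (1 - inhibitor) * (1 - active) - k_deact * active
        active += dt * d
    return active
```

Sweeping the `inhibitor` parameter and watching the active fraction respond is, in miniature, the kind of interactive what-if exploration the abstract describes.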

Relevance: 20.00%

Publisher:

Abstract:

We generalize the classical notion of Vapnik-Chervonenkis (VC) dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, and a set D of first-order formulas that represent data. The sets of models in W of the members of D generate a natural topology over W. We show that if D is closed under boolean operators, then the notion of ordinal VC-dimension offers a perfect characterization for the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference, when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that D is closed under boolean operators and that W is compact often play a crucial role in establishing connections between these concepts. We then consider a computable setting with effective versions of the complexity measures, and show that the equivalence between ordinal VC-dimension and predictive complexity fails. More precisely, we prove that the effective ordinal VC-dimension of a paradigm can be defined when all other effective notions of complexity are undefined. On a better note, when W is compact, all effective notions of complexity are defined, though they are not related as in the noncomputable version of the framework.
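The ordinal VC-dimension of the abstract generalizes the classical finite notion, which for a small, explicitly given concept class can be computed by brute force; this sketch implements only the classical definition, not the ordinal or logical variant:

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    # Classical VC-dimension of a finite concept class: the largest d
    # such that some d-point subset of the domain is shattered, i.e.
    # the concepts realize all 2^d labelings of those points.
    def shattered(points):
        labelings = {tuple(p in c for p in points) for c in concepts}
        return len(labelings) == 2 ** len(points)

    d = 0
    for size in range(1, len(domain) + 1):
        if any(shattered(s) for s in combinations(domain, size)):
            d = size
    return d
```

For instance, the class of "initial segments" of {1, 2, 3} has VC-dimension 1: any single point is shattered, but no pair can receive all four labelings.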

Relevance: 20.00%

Publisher:

Abstract:

Carrier frequency offset (CFO) and I/Q mismatch can cause significant performance degradation in OFDM systems. Their estimation and compensation are generally difficult, as they are entangled in the received signal. In this paper, we propose low-complexity estimation and compensation schemes in the receiver that are robust to a wide range of CFO and I/Q mismatch values, although performance degrades slightly for very small CFO. These schemes consist of three steps: forming a cosine estimator free of I/Q mismatch interference, estimating the I/Q mismatch using the estimated cosine value, and forming a sine estimator using samples after I/Q mismatch compensation. These estimators are based on the observation that an estimate of the cosine serves much better as the basis for I/Q mismatch estimation than an estimate of the CFO derived from the cosine function. Simulation results show that the proposed schemes improve system performance significantly and are robust to CFO and I/Q mismatch.
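The paper's three-step estimators are not reproduced here, but the baseband I/Q-imbalance model they compensate is standard; the sketch below uses a common parameterization with gain imbalance `g` and phase imbalance `phi` (conventions vary across the literature), and shows that with known imbalance the distortion inverts exactly:

```python
import numpy as np

def apply_iq_mismatch(s, g=1.05, phi=0.05):
    # Common baseband I/Q-imbalance model: r = alpha*s + beta*conj(s),
    # where the image term beta*conj(s) is the mismatch interference.
    alpha = (1 + g * np.exp(-1j * phi)) / 2
    beta = (1 - g * np.exp(1j * phi)) / 2
    return alpha * s + beta * np.conj(s)

def compensate(r, g, phi):
    # Given (estimated) g and phi, the model inverts in closed form:
    # s_hat = (conj(alpha)*r - beta*conj(r)) / (|alpha|^2 - |beta|^2)
    alpha = (1 + g * np.exp(-1j * phi)) / 2
    beta = (1 - g * np.exp(1j * phi)) / 2
    return (np.conj(alpha) * r - beta * np.conj(r)) / (abs(alpha) ** 2 - abs(beta) ** 2)
```

The hard part, which the paper addresses, is estimating the imbalance in the first place when a CFO rotates the signal at the same time.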

Relevance: 20.00%

Publisher:

Abstract:

This document provides the findings of an international review of investment decision-making practices in road asset management. Efforts were concentrated on identifying the strategic objectives of agencies in road asset management, establishing and understanding the criteria different organisations adopted, and ascertaining the exact methodologies used by different countries and international organisations. Road assets are powerful drivers of economic development and social equity. They also have significant impacts on the natural and man-made environment. The traditional definition of asset management is "a systematic process of maintaining, upgrading and operating physical assets cost effectively. It combines engineering principles with sound business practices and economic theory, and it provides tools to facilitate a more organised, logical approach to decision-making" (US Dept. of Transportation, 1999). In recent years, the concept has been broadened to cover the complexity of decision making, based on a wider variety of policy considerations as well as social and environmental issues, rather than only the benefit-cost analysis and purely technical considerations traditionally covered. Current international practices are summarised in Table 2. It was evident that engineering-economic analysis methods are well advanced to support decision-making. A range of available tools supports the prediction of road asset performance and the associated costs and benefits in a technical context. The need to consider the "triple plus one" bottom line of social, environmental and economic as well as political factors in decision-making is well understood by road agencies around the world. The techniques used to incorporate these factors, however, are limited. Most countries adopt a scoring method, a goal achievement matrix or information collected from surveys. The greater uncertainty associated with these non-quantitative factors has generally not been taken into consideration.
There is a gap between the capabilities of decision-making support systems and the requirements of decision-makers to make more rational and transparent decisions. The challenges faced in developing an integrated decision-making framework are both procedural and conceptual. In operational terms, the framework should be easy to understand and employ. In philosophical terms, the framework should be able to deal with challenging issues such as uncertainty, time frame, network effects and model changes, while integrating cost and non-cost values into the evaluation. The choice of evaluation techniques depends on the features of the problem at hand, on the aims of the analysis, and on the underlying information base. At different management levels, the complexity of considering social, environmental, economic and political factors in decision-making differs: at the higher, strategic planning level, more non-cost factors are involved. The complexity also varies with the scope of the investment proposals. Road agencies traditionally place less emphasis on the evaluation of maintenance works. In some cases, social equity, safety and environmental issues have been used in maintenance project selection. However, there is no common basis for these applications.

Relevance: 20.00%

Publisher:

Abstract:

New product development projects are experiencing increasing internal and external project complexity. Complexity leadership theory proposes that external complexity requires adaptive and enabling leadership, which facilitates opportunity recognition (OR). We ask whether internal complexity also requires OR for increased adaptability. We extend a model of EO and OR to conclude that internal complexity may require more careful OR. This means that leaders of technically or structurally complex projects need to evaluate opportunities more carefully than leaders of projects with external or technological complexity.