40 results for Uncertain parameters

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

30.00%

Publisher:

Abstract:

The speed of fault isolation is crucial for the design and reconfiguration of fault-tolerant control (FTC). In this paper, the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques. The proposed method is based on constraint satisfaction techniques and on refining the uncertainty space of interval parameters. Compared with other approaches based on adaptive observers, the major advantage of the presented method is that isolation is fast even when uncertainty in parameters, measurements, and model errors is taken into account, and without requiring the monotonicity assumption. To illustrate the proposed approach, a case study of a nonlinear dynamic system is presented.
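To make the idea concrete, below is a minimal sketch of interval constraint propagation of the kind the abstract describes. The single constraint y = a*u, the measurement intervals, and the helper name contract_gain are hypothetical illustrations, not the paper's CSP, which involves the full system model.

```python
# Minimal sketch of interval-based parameter refinement for fault isolation.
# A constraint y = a * u relates a measured input u and output y to an
# uncertain gain a. Propagating the constraint contracts the interval for a;
# an empty intersection means the fault hypothesis encoded by the constraint
# is inconsistent with the data and can be discarded.

def contract_gain(a, u, y):
    """Contract interval a against y = a * u (all intervals; assumes u > 0, y >= 0)."""
    cand = (y[0] / u[1], y[1] / u[0])          # interval division y / u
    lo, hi = max(a[0], cand[0]), min(a[1], cand[1])
    return (lo, hi) if lo <= hi else None      # None: hypothesis refuted

a = (0.5, 2.0)                                 # prior uncertainty on the parameter
for u, y in [((1.0, 1.2), (0.9, 1.3)), ((2.0, 2.2), (1.8, 2.4))]:
    a = contract_gain(a, u, y)
    if a is None:
        print("inconsistent: this fault hypothesis can be discarded")
        break
else:
    print("refined parameter interval: (%.3f, %.3f)" % a)
```

Each new measurement pair either shrinks the parameter interval or empties it, which is why isolation can be fast without any monotonicity assumption.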

Relevance:

20.00%

Publisher:

Abstract:

We analyze the effects of uncertainty and private information on horizontal mergers. Firms face uncertain demands or costs and receive private signals. They may decide to merge, sharing their private information. If the uncertainty parameters are independent and the signals are perfect, uncertainty generates an informational advantage only for the merging firms, increasing merger incentives and decreasing free-riding effects. Thus, mergers become more profitable and stable. These results generalize to the case of correlated parameters if the correlation is not very severe, and to the case of perfect correlation if the firms receive noisy signals. From the normative point of view, mergers are socially less harmful compared to deterministic markets and may even be welfare enhancing. If the signals are, instead, publicly observed, uncertainty does not necessarily give more incentives to merge, and mergers are not always less socially harmful.

Relevance:

20.00%

Publisher:

Abstract:

Report on the scientific sojourn at the University of Linköping from April to July 2007. Monitoring of the air intake system of an automotive engine is important to meet emission-related legislative diagnosis requirements. During the research, the problem of fault detection in the air intake system was stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem was solved using interval-based consistency techniques, which are shown to be particularly efficient for checking the consistency of the Analytical Redundancy Relations (ARRs), dealing with uncertain measurements and parameters, and using experimental data. All experiments were performed on a four-cylinder turbocharged spark-ignited SAAB engine located in the research laboratory of the Vehicular Systems Group, University of Linköping.
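A minimal sketch of what an interval-based ARR consistency check looks like is given below. The static relation p = c*W^2, the interval values, and the helper names are illustrative assumptions, not the engine's actual ARRs.

```python
# Minimal sketch of checking an Analytical Redundancy Relation (ARR) with
# interval arithmetic. Hypothetical model: a static ARR r = p_meas - c * W^2
# linking a pressure reading to an air mass-flow reading, with an uncertain
# coefficient c. The ARR is consistent if 0 lies in the interval hull of the
# residual; otherwise a fault is detected.

def interval_mul(x, y):
    prods = [x[0]*y[0], x[0]*y[1], x[1]*y[0], x[1]*y[1]]
    return (min(prods), max(prods))

def arr_consistent(p_meas, W_meas, c):
    W2 = interval_mul(W_meas, W_meas)
    model = interval_mul(c, W2)                    # c * W^2 as an interval
    residual = (p_meas[0] - model[1], p_meas[1] - model[0])
    return residual[0] <= 0.0 <= residual[1]       # 0 inside the residual?

c = (0.95, 1.05)            # uncertain model parameter
print(arr_consistent((0.9, 1.1), (0.95, 1.05), c))  # True: no fault signalled
print(arr_consistent((1.6, 1.8), (0.95, 1.05), c))  # False: inconsistent -> fault
```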

Relevance:

20.00%

Publisher:

Abstract:

Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
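The imputation-plus-bootstrap idea can be sketched as follows. The lognormal model, the synthetic data, and the use of scipy are assumptions of this illustration only; the study itself works on ilr-transformed compositional data and combines the bootstrap with stochastic simulation when spatial correlation is present.

```python
# Minimal sketch: fit a distribution to the readings above the detection
# limit, impute nondetects by sampling that distribution truncated below the
# limit, and bootstrap a parameter of interest. Lognormal model and scipy are
# assumptions of this sketch, not the study's exact procedure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true = rng.lognormal(mean=0.0, sigma=1.0, size=500)   # synthetic "assay" data
dl = 0.4                                              # detection limit
observed = true[true >= dl]
n_censored = (true < dl).sum()

# Fit a lognormal to the uncensored readings (floc=0 keeps the standard form).
shape, loc, scale = stats.lognorm.fit(observed, floc=0)

def impute_and_stat(rng):
    # Sample nondetects from the fitted distribution truncated to [0, dl).
    u = rng.uniform(0, stats.lognorm.cdf(dl, shape, loc, scale), n_censored)
    imputed = stats.lognorm.ppf(u, shape, loc, scale)
    sample = np.concatenate([rng.choice(observed, observed.size), imputed])
    return sample.mean()                              # parameter of interest

boot = np.array([impute_and_stat(rng) for _ in range(1000)])
print("mean estimate: %.3f  (95%% CI %.3f-%.3f)"
      % (boot.mean(), *np.percentile(boot, [2.5, 97.5])))
```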

Relevance:

20.00%

Publisher:

Abstract:

The problem of stability analysis for a class of neutral systems with mixed time-varying neutral, discrete and distributed delays and nonlinear parameter perturbations is addressed. By introducing a novel Lyapunov-Krasovskii functional and combining the descriptor model transformation, the Leibniz-Newton formula, some free-weighting matrices, and a suitable change of variables, new sufficient conditions are established for the stability of the considered system, which are neutral-delay-dependent, discrete-delay-range-dependent, and distributed-delay-dependent. The conditions are presented in terms of linear matrix inequalities (LMIs) and can be efficiently solved using convex programming techniques. Two numerical examples are given to illustrate the efficiency of the proposed method.
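For readers unfamiliar with LMI-based analysis, the sketch below solves the simplest stability LMI (the delay-free Lyapunov inequality) with an off-the-shelf convex solver; the paper's delay-dependent conditions are much larger LMIs of the same kind. cvxpy is an assumption of the sketch, not the authors' toolchain.

```python
# Minimal sketch of solving a stability LMI by convex programming: the
# classical Lyapunov inequality A^T P + P A < 0 with P > 0, standing in for
# the paper's delay-dependent LMIs.
import cvxpy as cp
import numpy as np

A = np.array([[-2.0, 1.0],
              [ 0.0, -1.0]])          # a Hurwitz test matrix

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2),                 # P positive definite
               A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov inequality
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status)        # 'optimal' -> LMIs feasible -> stability certified
```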

Relevance:

20.00%

Publisher:

Abstract:

The objective of the present research is to find control design strategies that are effective and close to real operating conditions. As a novel contribution to structural control strategies, the theories of Modal Interval Arithmetic, Backstepping Control, and QFT (Quantitative Feedback Theory) will be studied. The steps to follow are first to develop new controllers based on the above theories and then to apply the proposed control strategies to different kinds of structures. The report is organized as follows. Chapter 2 presents the state of the art on structural control systems. Chapter 3 presents the most important open problems found in the field of structural control. The exploratory work made by the author, the research proposal, and the working plan are given in Chapter 4.

Relevance:

20.00%

Publisher:

Abstract:

In practice, the performance of analytical redundancy for fault detection and diagnosis is often degraded by uncertainties prevailing not only in the system model but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs), dealing with uncertain measurements and parameters. Through the work presented in this paper, it can be observed that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
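The consistency test can be sketched on a toy dynamic model, in the spirit of the paper's hydraulic case study. The single-tank model, the parameter ranges, and the Euler discretization below are illustrative assumptions only, not the paper's model.

```python
# Minimal sketch of interval-based fault detection on a dynamic model: a
# single tank dh/dt = (q_in - k*sqrt(h)) / A, discretized by one Euler step.
# A measurement outside the predicted interval envelope is flagged as
# inconsistent (possible fault).
import math

A = 1.0                      # tank cross-section (known)
k = (0.9, 1.1)               # uncertain outflow coefficient (interval)
dt = 1.0

def predict(h, q_in):
    """One Euler step; h and q_in are intervals; returns the next-level interval."""
    out_lo = k[0] * math.sqrt(max(h[0], 0.0))   # smallest possible outflow
    out_hi = k[1] * math.sqrt(max(h[1], 0.0))   # largest possible outflow
    return (h[0] + dt * (q_in[0] - out_hi) / A,
            h[1] + dt * (q_in[1] - out_lo) / A)

h = (1.0, 1.0)               # known initial level
for q_in, h_meas in [((0.5, 0.6), (0.55, 0.75)), ((0.5, 0.6), (1.30, 1.40))]:
    env = predict(h, q_in)
    ok = h_meas[1] >= env[0] and h_meas[0] <= env[1]   # intervals intersect?
    print("envelope %.2f..%.2f -> %s" % (env[0], env[1],
          "consistent" if ok else "inconsistent (fault)"))
    # Refine the state with the measurement when consistent.
    h = (max(env[0], h_meas[0]), min(env[1], h_meas[1])) if ok else h_meas
```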

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the dynamics of the distribution of wealth in a general equilibrium framework. It considers an overlapping generations model with production and altruistic preferences in which individuals face an uncertain lifetime and annuity markets do not exist. This paper focuses on the role that accidental bequests, voluntary bequests, and non-negativity constraints on bequests play in the dynamics of the distribution of wealth. It is proved that the equilibrium interest rate is lower than the one that satisfies the modified golden rule. In this economy, a social security system not only plays an insurance role, but also prevents capital overaccumulation. In fact, this paper shows that a pay-as-you-go social security system decentralizes the social planner solution as a competitive equilibrium.
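For reference, one common textbook statement of the modified golden rule is sketched below; the notation is illustrative and the paper's exact formulation may differ.

```latex
% A sketch of the modified golden rule; notation and the exact condition in
% the paper may differ. With intensive production function f(k), population
% growth rate n, and social discount factor \beta, the planner's steady-state
% capital stock k^{*} satisfies
\[
  f'(k^{*}) = \frac{1+n}{\beta} .
\]
% An equilibrium interest rate below f'(k^{*}) corresponds to capital
% overaccumulation relative to the planner's choice, which is what the
% pay-as-you-go social security system corrects in the paper.
```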

Relevance:

20.00%

Publisher:

Abstract:

For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates, where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
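For concreteness, the sketch below implements a variable-bandwidth ("balloon") kernel estimate in which the bandwidth at each evaluation point is a k-nearest-neighbor distance; the Gaussian kernel and the k-NN rule are illustrative choices, not the paper's construction.

```python
# Minimal sketch of a variable-bandwidth kernel density estimate of the kind
# the paper studies: the bandwidth h(x) depends on the location x.
import numpy as np

def variable_kde(x_eval, data, k=10):
    dens = np.empty_like(x_eval, dtype=float)
    for i, x in enumerate(x_eval):
        d = np.abs(data - x)
        h = np.partition(d, k)[k]                  # k-NN distance as bandwidth
        u = (x - data) / h
        dens[i] = np.exp(-0.5 * u**2).sum() / (data.size * h * np.sqrt(2*np.pi))
    return dens

rng = np.random.default_rng(1)
sample = rng.standard_normal(400)
xs = np.linspace(-3, 3, 7)
print(np.round(variable_kde(xs, sample), 3))       # rough estimate of the N(0,1) pdf
```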

Relevance:

20.00%

Publisher:

Abstract:

Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: (a) those that use weights that involve area-specific estimates of bias and variance; and (b) those that use weights that involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
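The weighting scheme behind a composite estimator can be sketched as follows; the variance and squared-bias inputs are assumed given, and the numbers are invented for illustration.

```python
# Minimal sketch of a composite small-area estimator: a linear combination of
# a direct and an indirect (synthetic) estimator, weighted by estimated MSE
# components. An illustration of the weighting scheme, not the paper's study.
def composite(direct, indirect, var_direct, sq_bias_indirect):
    # Weight minimizing the approximate MSE of the combination, assuming the
    # direct estimator is unbiased and the two errors are uncorrelated:
    # minimize w^2 * V + (1-w)^2 * B^2  =>  w = B^2 / (B^2 + V).
    w = sq_bias_indirect / (sq_bias_indirect + var_direct)
    return w * direct + (1.0 - w) * indirect

# Area with a small sample: noisy direct estimate, stable synthetic estimate.
print(composite(direct=112.0, indirect=100.0,
                var_direct=64.0, sq_bias_indirect=16.0))  # 102.4, leaning on the indirect
```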

Relevance:

20.00%

Publisher:

Abstract:

This paper studies the effects of uncertain lifetime on capital accumulation and growth, and also the sensitivity of those effects to the existence of a perfect annuities market. The model is an overlapping generations model with uncertain lifetimes. The technology is convex and such that the marginal product of capital is bounded away from zero. A contribution of this paper is to show that the existence of accidental bequests may lead the economy to an equilibrium that exhibits asymptotic growth, which is impossible in an economy with a perfect annuities market or with certain lifetimes. This paper also shows that if individuals face a positive probability of surviving in every period, they may be willing to save at any age. This effect of uncertain lifetime on savings may also lead the economy to an equilibrium exhibiting asymptotic growth even if there exists a perfect annuities market.

Relevance:

20.00%

Publisher:

Abstract:

Let there be a positive (exogenous) probability that, at each date, the human species will disappear. We postulate an Ethical Observer (EO) who maximizes intertemporal welfare under this uncertainty, with expected-utility preferences. Various social welfare criteria entail alternative von Neumann-Morgenstern utility functions for the EO: utilitarian, Rawlsian, and an extension of the latter that corrects for the size of population. Our analysis covers, first, a cake-eating economy (without production), where the utilitarian and the Rawlsian recommend the same allocation. Second, a productive economy with education and capital, where it turns out that the recommendations of the two EOs are in general different. But when the utilitarian program diverges, then we prove it is optimal for the extended Rawlsian to ignore the uncertainty concerning the possible disappearance of the human species in the future. We conclude by discussing the implications for intergenerational welfare maximization in the presence of global warming.
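As a sketch of the setup, the utilitarian EO's objective can be written as follows; the notation is illustrative and may differ from the paper's.

```latex
% A sketch of the utilitarian Ethical Observer's objective (illustrative
% notation). If the species survives each date with probability 1-p and
% generation t enjoys utility u(c_t), expected intertemporal welfare is
\[
  W \;=\; \sum_{t=0}^{\infty} (1-p)^{t}\, u(c_{t}),
\]
% so the exogenous extinction risk p acts exactly like a discount factor
% \beta = 1-p on generational utilities.
```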

Relevance:

20.00%

Publisher:

Abstract:

Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternate finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
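The binomial analogy can be made concrete with the sketch below, which profiles out the success probability and maximizes the likelihood over a grid of population sizes; it illustrates the statistical difficulty (a very flat likelihood in N), not the paper's multinomial-logit heuristic.

```python
# Minimal sketch of the classical problem the paper points to: jointly
# estimating a binomial's population size N and success probability p from
# repeated counts of successes, via profile likelihood over a grid of N.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_N, true_p = 50, 0.3
counts = rng.binomial(true_N, true_p, size=20)     # observed purchase counts

def profile_loglik(N, counts):
    p_hat = counts.mean() / N                      # MLE of p for this N
    return stats.binom.logpmf(counts, N, p_hat).sum()

grid = np.arange(counts.max(), 400)
lls = np.array([profile_loglik(N, counts) for N in grid])
print("N_hat =", grid[lls.argmax()], " (true N = 50; note how flat the likelihood is)")
```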

Relevance:

20.00%

Publisher:

Abstract:

We consider an agent who has to repeatedly make choices in an uncertain and changing environment, who has full information of the past, who discounts future payoffs, but who has no prior. We provide a learning algorithm that performs almost as well as the best of a given finite number of experts or benchmark strategies, and does so at any point in time, provided the agent is sufficiently patient. The key is to find the appropriate degree of forgetting the distant past. Standard learning algorithms that treat the recent and the distant past equally do not have the sequential epsilon-optimality property.
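A standard way to "forget the distant past" in the experts framework is to discount past losses geometrically, as in the sketch below; the discount and learning rates are illustrative, and this is not the paper's algorithm, only the framework its guarantee is stated in.

```python
# Minimal sketch of an exponentially weighted forecaster whose past losses
# are geometrically discounted, so the distant past is gradually forgotten.
import numpy as np

def discounted_weights(loss_history, eta=1.0, gamma=0.95):
    """loss_history: array of shape (T, n_experts) of per-round losses."""
    T = loss_history.shape[0]
    # Older rounds get geometrically smaller weight: gamma^(T-1-t).
    decay = gamma ** np.arange(T - 1, -1, -1)
    cum = (decay[:, None] * loss_history).sum(axis=0)
    w = np.exp(-eta * cum)
    return w / w.sum()

rng = np.random.default_rng(3)
losses = rng.uniform(size=(200, 3))
losses[:100, 0] -= 0.3     # expert 0 was best in the distant past...
losses[100:, 2] -= 0.3     # ...but expert 2 is best recently
print(np.round(discounted_weights(losses), 3))   # mass shifts to expert 2
```

With gamma = 1 (no forgetting) the weights would still favor whichever expert accumulated the lowest total loss, which is exactly the behavior the abstract says lacks the sequential epsilon-optimality property.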

Relevance:

20.00%

Publisher:

Abstract:

uvby H-beta photometry has been obtained for a sample of 93 selected main sequence A stars. The purpose was to determine accurate effective temperatures, surface gravities, and absolute magnitudes for an individual determination of ages and parallaxes, to be included in a more extensive work analyzing the kinematic properties of A V stars. Several calibrations and methods to determine the above-mentioned parameters have been reviewed, allowing the design of a new algorithm for their determination. The results obtained using this procedure were tested in a previous paper using uvby H-beta data from the Hauck and Mermilliod catalogue, by comparing the resulting temperatures, surface gravities, and absolute magnitudes with empirical determinations of these parameters.