14 results for Orion DBMS, Database, Uncertainty, Uncertain values, Benchmark
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The main argument developed here is the proposal of the concept of “Social Multi-Criteria Evaluation” (SMCE) as a possible useful framework for the application of social choice to the difficult policy problems of our Millennium, where, as stated by Funtowicz and Ravetz, “facts are uncertain, values in dispute, stakes high and decisions urgent”. This paper starts from the following main questions: 1. Why “Social” Multi-criteria Evaluation? 2. How should such an approach be developed? The foundations of SMCE are set up by referring to concepts coming from complex system theory and philosophy, such as reflexive complexity, post-normal science and incommensurability. To give some operational guidelines on the application of SMCE, the basic questions to be answered are: 1. How is it possible to deal with technical incommensurability? 2. How can we deal with the issue of social incommensurability? Answering these questions, by means of theoretical considerations and lessons learned from real-world case studies, is the main objective of the present article.
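Technical incommensurability refers to criteria that cannot be reduced to a common unit of measurement; non-compensatory aggregation is one standard multi-criteria answer to it. The following is a minimal, hypothetical sketch of that idea (the alternatives, criteria, scores, and the simple pairwise-counting rule are all invented for illustration, and stand in for the more elaborate procedures used in SMCE):

```python
import numpy as np

# Hypothetical impact matrix: rows are policy alternatives, columns are
# criteria in incommensurable units (cost in M EUR, jobs created, emissions).
alternatives = ["A", "B", "C"]
impact = np.array([[10.0, 200.0, 5.0],
                   [14.0, 350.0, 3.0],
                   [ 8.0, 150.0, 6.0]])
higher_is_better = [False, True, False]   # preference direction per criterion

# Non-compensatory pairwise comparison: no conversion to a common unit,
# just count the criteria on which one alternative beats another.
n = len(alternatives)
wins = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        for c, high in enumerate(higher_is_better):
            if impact[i, c] != impact[j, c] and (impact[i, c] > impact[j, c]) == high:
                wins[i, j] += 1

score = wins.sum(axis=1)   # crude Copeland-style ranking
for name, s in sorted(zip(alternatives, score), key=lambda p: -p[1]):
    print(name, int(s))
```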
Abstract:
Frameworks constitute the new paradigm in software development. Among their main characteristics is the ease of code reuse. Within this specific setting provided by the technology, we will use JAVA and its extensions for data persistence. A persistence framework is responsible for managing the logic of data access to a DBMS (Database Management System), whether input or output, completely and transparently hiding the heavier details of the structure of the underlying database. In conclusion, this project is based on an analysis of the existing frameworks, examining their characteristics and delving into the concrete details of how they work and how they are used with regard to persistence.
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterative deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply the iterated deletion of dominated strategies through "a priori" reasoning. Instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies for the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
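As a toy illustration of the global-game logic behind this design (all functional forms, payoffs, and parameter values here are invented, not the experimental ones): iterating the best-response signal threshold implements the iterated deletion of dominated strategies and converges to the risk-dominant cutoff.

```python
SIGMA = 0.2   # half-width of the uniform signal noise (illustrative)

def tri_cdf(t, a):
    """CDF of a symmetric triangular distribution on [-a, a] (the law of the
    difference between two signals, given a flat prior on the state)."""
    t = max(-a, min(a, t))
    return (t + a) ** 2 / (2 * a * a) if t < 0 else 1 - (a - t) ** 2 / (2 * a * a)

def best_response_threshold(k):
    """Signal cutoff above which 'invest' is optimal when the opponent
    invests iff her signal exceeds k. Investing pays theta when matched and
    theta - 1 when alone, so indifference at signal x (where E[theta|x] = x)
    requires x = P(opponent's signal <= k | x). Solved by bisection."""
    lo, hi = -1.0, 2.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if mid - tri_cdf(k - mid, 2 * SIGMA) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = 0.0   # start at the dominance boundary and iterate the deletion rounds
for _ in range(30):
    k = best_response_threshold(k)
print(f"limit threshold = {k:.3f}  (risk-dominant cutoff = 0.5)")
```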
Abstract:
We report on a series of experiments that test the effects of an uncertain supply on the formation of bids and prices in sequential first-price auctions with private independent values and unit demands. Supply is assumed uncertain when buyers do not know the exact number of units to be sold (i.e., the length of the sequence). Although we observe non-monotone behavior when supply is certain, as well as substantial overbidding, the data qualitatively support our price trend predictions and the risk-neutral Nash equilibrium model of bidding for the last stage of a sequence, whether supply is certain or not. Our study shows that behavior in these markets changes significantly with the presence of an uncertain supply, and that it can be explained by assuming that bidders formulate pessimistic beliefs about the occurrence of another stage.
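For reference, the risk-neutral Nash equilibrium benchmark invoked for the last stage is the standard first-price bid function. The sketch below (values i.i.d. uniform on [0, 1]; the number of bidders and the value are invented) checks numerically that b(v) = (n-1)/n * v is a best reply when the other bidders use it:

```python
import numpy as np

N = 4     # number of bidders (illustrative)
V = 0.8   # this bidder's private value, drawn from U(0, 1)

def win_prob(b):
    """Probability that bid b beats N-1 rivals who bid beta(v) = (N-1)/N * v
    on i.i.d. U(0, 1) values."""
    return min(b * N / (N - 1), 1.0) ** (N - 1)

bids = np.linspace(0.0, V, 2001)
payoff = (V - bids) * np.array([win_prob(b) for b in bids])
print("numerical best reply:", round(float(bids[payoff.argmax()]), 3))  # ~0.6
print("RNNE bid (N-1)/N * V:", (N - 1) / N * V)                         # 0.6
```

One way to read the paper's explanation in these terms: a bidder who assigns a low probability to a further stage bids much as in this last-stage benchmark even earlier in the sequence.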
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
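A schematic of how the two evaluation criteria differ, with invented losses and model probabilities (neither the numbers nor the rule set are the paper's estimates):

```python
import numpy as np

# Invented losses of each policy rule under each candidate model
# (rows: rules; columns: models).
rules = ["Taylor", "Taylor + smoothing", "Taylor + oil response"]
losses = np.array([[2.0, 3.5, 4.0],
                   [1.8, 2.6, 3.1],
                   [2.1, 2.4, 2.8]])
post = np.array([0.5, 0.3, 0.2])    # posterior model probabilities

bayes_loss = losses @ post          # Brock-Durlauf-West style model averaging
minimax_loss = losses.max(axis=1)   # Hansen-Sargent style worst case

for r, b, m in zip(rules, bayes_loss, minimax_loss):
    print(f"{r:22s} expected loss {b:.2f}  worst case {m:.2f}")
print("Bayesian choice:", rules[int(bayes_loss.argmin())])
print("minimax choice :", rules[int(minimax_loss.argmin())])
```

The invented numbers are chosen so that the plain Taylor rule loses under both criteria, mirroring the qualitative finding of the abstract, while the two criteria select different alternatives.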
Abstract:
In practice, the performance of analytical redundancy for fault detection and diagnosis is often decreased by uncertainties prevailing not only in the system model, but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are then shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs), dealing with uncertain measurements and parameters. Through the work presented in this paper, it can be observed that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
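A minimal sketch of interval-based consistency checking of an ARR, with a toy linear static model in place of the paper's nonlinear hydraulic model (all names and numbers are invented, and plain natural interval extension stands in for modal interval analysis):

```python
def imul(x, y):
    """Interval multiplication [x] * [y]."""
    p = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return (min(p), max(p))

def isub(x, y):
    """Interval subtraction [x] - [y]."""
    return (x[0] - y[1], x[1] - y[0])

a = (0.9, 1.1)     # uncertain model parameter
u = (1.95, 2.05)   # input measurement with bounded error
y = (2.6, 2.7)     # output measurement with bounded error

residual = isub(y, imul(a, u))   # ARR residual: r = y - a*u
consistent = residual[0] <= 0.0 <= residual[1]
print("residual interval:", residual)
print("no fault detectable" if consistent else "fault detected")
# Here a*u = [1.755, 2.255] cannot explain y, so a fault is flagged even
# under the stated model and measurement uncertainty.
```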
Abstract:
The speed of fault isolation is crucial for the design and reconfiguration of fault-tolerant control (FTC). In this paper the fault isolation problem is stated as a constraint satisfaction problem (CSP) and solved using constraint propagation techniques combined with the refinement of the uncertainty space of the interval parameters. In comparison with other approaches based on adaptive observers, the major advantage of the presented method is that the isolation speed is fast even when taking into account uncertainty in parameters, measurements and model errors, and without requiring a monotonicity assumption. In order to illustrate the proposed approach, a case study of a nonlinear dynamic system is presented.
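The refinement idea can be sketched as bisect-and-prune on a single interval parameter (a toy stand-in for the paper's constraint-propagation scheme; the model y = a*u, the measurements, and all numbers are invented, and all quantities are assumed nonnegative to keep interval products simple):

```python
def consistent(a_box, samples):
    """A parameter sub-box survives if, for every measured pair of input
    and output intervals (u, y), the product interval a_box*u intersects y."""
    return all(a_box[0] * u[0] <= y[1] and a_box[1] * u[1] >= y[0]
               for u, y in samples)

def refine(a_box, samples, depth=12):
    """Bisect a_box repeatedly, keep the consistent halves, and return the
    hull of the surviving sub-boxes (None if the hypothesis is refuted)."""
    boxes = [a_box]
    for _ in range(depth):
        boxes = [half
                 for lo, hi in boxes
                 for half in ((lo, (lo + hi) / 2), ((lo + hi) / 2, hi))
                 if consistent(half, samples)]
    if not boxes:
        return None
    return (min(lo for lo, _ in boxes), max(hi for _, hi in boxes))

# Interval measurements generated by a faulty gain a ~ 0.7, while the
# nominal (healthy) gain would lie in [0.9, 1.1].
samples = [((1.98, 2.02), (1.37, 1.43)), ((2.98, 3.02), (2.06, 2.14))]
print(refine((0.5, 1.5), samples))   # shrinks to a box around 0.7: fault isolated
```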
Abstract:
To obtain a state-of-the-art benchmark potential energy surface (PES) for the archetypal oxidative addition of the methane C-H bond to the palladium atom, we have explored this PES using a hierarchical series of ab initio methods (Hartree-Fock, second-order Møller-Plesset perturbation theory, fourth-order Møller-Plesset perturbation theory with single, double and quadruple excitations, coupled cluster theory with single and double excitations (CCSD), and with triple excitations treated perturbatively [CCSD(T)]) and hybrid density functional theory using the B3LYP functional, in combination with a hierarchical series of ten Gaussian-type basis sets, up to g polarization. Relativistic effects are taken into account either through a relativistic effective core potential for palladium or through a full four-component all-electron approach. Counterpoise-corrected relative energies of stationary points are converged to within 0.1-0.2 kcal/mol as a function of the basis-set size. Our best estimate of kinetic and thermodynamic parameters is -8.1 (-8.3) kcal/mol for the formation of the reactant complex, 5.8 (3.1) kcal/mol for the activation energy relative to the separate reactants, and 0.8 (-1.2) kcal/mol for the reaction energy (zero-point vibrational energy-corrected values in parentheses). This agrees well with available experimental data. Our work highlights the importance of sufficient higher angular momentum polarization functions, f and g, for correctly describing metal-d-electron correlation and, thus, for obtaining reliable relative energies. We show that standard basis sets, such as LANL2DZ+1f for palladium, are not sufficiently polarized for this purpose and lead to erroneous CCSD(T) results. B3LYP is associated with smaller basis-set superposition errors and shows faster convergence with basis-set size but yields relative energies (in particular, a reaction barrier) that are ca. 3.5 kcal/mol higher than the corresponding CCSD(T) values.
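For reference, the counterpoise correction mentioned above is the standard Boys-Bernardi scheme, in which each fragment energy is recomputed in the full basis of the complex (sketched here in generic notation, not the paper's):

```latex
% Superscripts denote the basis set used (ab = full basis of the complex AB),
% subscripts the system computed; the BSSE is the difference between the
% uncorrected and the counterpoise-corrected relative energy.
\Delta E^{\mathrm{CP}}_{\mathrm{int}}
  = E^{ab}_{AB}(AB) - E^{ab}_{A}(A) - E^{ab}_{B}(B)
```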
Abstract:
A new multimodal biometric database designed and acquired within the framework of the European BioSecure Network of Excellence is presented. It is comprised of more than 600 individuals acquired simultaneously in three scenarios: 1) over the Internet, 2) in an office environment with desktop PC, and 3) in indoor/outdoor environments with mobile portable hardware. The three scenarios include a common part of audio/video data. Also, signature and fingerprint data have been acquired both with desktop PC and mobile portable hardware. Additionally, hand and iris data were acquired in the second scenario using desktop PC. Acquisition has been conducted by 11 European institutions. Additional features of the BioSecure Multimodal Database (BMDB) are: two acquisition sessions, several sensors in certain modalities, balanced gender and age distributions, multimodal realistic scenarios with simple and quick tasks per modality, cross-European diversity, availability of demographic data, and compatibility with other multimodal databases. The novel acquisition conditions of the BMDB allow us to perform new challenging research and evaluation of either monomodal or multimodal biometric systems, as in the recent BioSecure Multimodal Evaluation campaign. A description of this campaign including baseline results of individual modalities from the new database is also given. The database is expected to be available for research purposes through the BioSecure Association during 2008.
Abstract:
Unemployment rates in developed countries have recently reached levels not seen in a generation, and workers of all ages are facing increasing probabilities of losing their jobs and considerable losses in accumulated assets. These events likely increase the reliance that most older workers will have on public social insurance programs, exactly at a time when public finances are suffering from a large drop in contributions. Our paper explicitly accounts for employment uncertainty and unexpected wealth shocks, something that has been relatively overlooked in the literature, but that has grown in importance in recent years. Using administrative and household-level data we empirically characterize a life-cycle model of retirement and claiming decisions in terms of the employment, wage, health, and mortality uncertainty faced by individuals. Our benchmark model explains with great accuracy the strikingly high proportion of individuals who claim benefits exactly at the Early Retirement Age, while still explaining the increased claiming hazard at the Normal Retirement Age. We also discuss some policy experiments and their interplay with employment uncertainty. Additionally, we analyze the effects of negative wealth shocks on the labor supply and claiming decisions of older Americans. Our results can explain why early claiming has remained very high in recent years even as the early retirement penalties have increased substantially compared with previous periods, and why labor force participation has remained quite high for older workers even in the midst of the worst employment crisis in decades.
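To fix ideas on the claiming margin, here is a bare-bones expected-present-value comparison of claiming at the Early versus the Normal Retirement Age (the discount factor, flat survival probability, and 25% early-claiming reduction are illustrative stand-ins, not the paper's calibrated model):

```python
import numpy as np

BETA = 0.97   # discount factor (illustrative)
SURV = 0.98   # flat one-year survival probability (illustrative)
ERA, NRA, T = 62, 66, 100

def epdv(claim_age, benefit):
    """Expected present discounted value, at age 62, of an annual benefit
    claimed at claim_age and received while alive."""
    k = np.arange(claim_age, T) - ERA      # years elapsed since age 62
    return float(np.sum(benefit * (BETA * SURV) ** k))

print("claim at 62 (75% benefit):", round(epdv(ERA, 0.75), 2))
print("claim at 66 (full benefit):", round(epdv(NRA, 1.00), 2))
```

Under these numbers delayed claiming actually has the higher expected value, which is why the observed spike at the Early Retirement Age is informative: rationalizing it requires channels such as the employment uncertainty and wealth shocks the paper models.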
Abstract:
This paper introduces a new solution concept, a minimax regret equilibrium, which allows for the possibility that players are uncertain about the rationality and conjectures of their opponents. We provide several applications of our concept. In particular, we consider price-setting environments and show that the optimal pricing policy follows a non-degenerate distribution. The induced price dispersion is consistent with experimental and empirical observations (Baye and Morgan (2004)).
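A minimal numeric illustration of the minimax-regret rule itself (the payoff matrix is invented, and this one-shot decision sketch does not capture the equilibrium aspect of the concept):

```python
import numpy as np

# Row player's payoffs; columns are opponent behaviors the player is
# uncertain about (invented numbers).
payoff = np.array([[10.0, 0.0],
                   [ 5.0, 4.0],
                   [ 3.0, 3.0]])

# Regret of an action in a state: best payoff achievable in that state
# minus the payoff the action actually yields there.
regret = payoff.max(axis=0) - payoff
worst_regret = regret.max(axis=1)

print("regret matrix:\n", regret)
print("minimax-regret action:", int(worst_regret.argmin()))   # action 0
# Maximin would instead pick action 1 (row minima are 0, 4, 3): guarding
# against worst-case regret is not the same as guarding against worst payoff.
```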
Abstract:
Let there be a positive (exogenous) probability that, at each date, the human species will disappear. We postulate an Ethical Observer (EO) who maximizes intertemporal welfare under this uncertainty, with expected-utility preferences. Various social welfare criteria entail alternative von Neumann-Morgenstern utility functions for the EO: utilitarian, Rawlsian, and an extension of the latter that corrects for the size of population. Our analysis covers, first, a cake-eating economy (without production), where the utilitarian and Rawlsian recommend the same allocation. Second, a productive economy with education and capital, where it turns out that the recommendations of the two EOs are in general different. But when the utilitarian program diverges, then we prove it is optimal for the extended Rawlsian to ignore the uncertainty concerning the possible disappearance of the human species in the future. We conclude by discussing the implications for intergenerational welfare maximization in the presence of global warming.
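The key device can be stated compactly: if the species survives each date with probability 1-p, the utilitarian EO's expected welfare turns extinction risk into a familiar discount factor (a sketch of the standard argument in generic notation, not the paper's):

```latex
% T is the (random) last date at which the species is alive, so that
% Pr(T >= t) = (1-p)^t; u(c_t) is the utility of generation t.
W_{\mathrm{util}} = \mathbb{E}\!\left[\sum_{t=0}^{T} u(c_t)\right]
                  = \sum_{t=0}^{\infty} (1-p)^{t}\, u(c_t)
```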
Abstract:
We use interplanetary transport simulations to compute a database of electron Green's functions, i.e., differential intensities resulting at the spacecraft position from an impulsive injection of energetic (>20 keV) electrons close to the Sun, for a large number of values of two standard interplanetary transport parameters: the scattering mean free path and the solar wind speed. The nominal energy channels of the ACE, STEREO, and Wind spacecraft have been used in the interplanetary transport simulations to conceive a unique tool for the study of near-relativistic electron events observed at 1 AU. In this paper, we quantify the characteristic times of the Green's functions (onset and peak time, rise and decay phase duration) as a function of the interplanetary transport conditions. We use the database to calculate the FWHM of the pitch-angle distributions at different times of the event and under different scattering conditions. This allows us to provide a first quantitative result that can be compared with observations, and to assess the validity of the frequently used term beam-like pitch-angle distribution.
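A toy version of the characteristic-time and FWHM extraction (the time profile and pitch-angle distribution below are generic stand-ins for the simulated Green's functions, with invented parameters and thresholds):

```python
import numpy as np

t = np.linspace(0.01, 10.0, 5000)        # time after injection [h]
I = np.exp(-0.3 / t - t / 1.5) / t       # synthetic intensity profile (arb. units)

peak = int(I.argmax())
onset = int(np.argmax(I > 0.01 * I[peak]))                # first bin above 1% of peak
decay = peak + int(np.argmax(I[peak:] < I[peak] / np.e))  # e-folding of the decay

print(f"onset {t[onset]:.2f} h, peak {t[peak]:.2f} h, "
      f"rise {t[peak] - t[onset]:.2f} h, decay e-fold {t[decay] - t[peak]:.2f} h")

# FWHM of a synthetic beam-like pitch-angle distribution, in mu = cos(pitch angle)
mu = np.linspace(-1.0, 1.0, 2001)
pad = np.exp(-((mu - 1.0) ** 2) / 0.18)   # beam peaked at mu = 1
inside = mu[pad >= 0.5 * pad.max()]
print(f"PAD FWHM: {inside[-1] - inside[0]:.2f} (truncated at mu = 1)")
```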
Abstract:
Marketing research has studied the permanence of a client within a firm because it is a key element in estimating the (economic) customer lifetime value (CLV). The models developed for this purpose are deterministic or stochastic, and they allow the permanence of the client, and hence the CLV, to be estimated. However, when these schemes cannot be applied because the panel data they require are not available, the length of time a client stays with the firm is an uncertain datum. We consider that the value of the present work is to offer an alternative way to estimate this period of time from subjective information, using the tools of the theory of uncertainty.
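A minimal sketch of the alternative: the client's permanence is elicited as a subjective triangular estimate (pessimistic, most plausible, optimistic) and the CLV is evaluated at each level (the margin, discount rate, and the estimate itself are invented for illustration, and the triangular form stands in for whatever uncertain-number representation the authors use):

```python
MARGIN = 120.0   # annual margin per client (illustrative)
RATE = 0.08      # annual discount rate (illustrative)

def clv(years):
    """Present value of MARGIN received at the end of each of `years` years."""
    return sum(MARGIN / (1 + RATE) ** k for k in range(1, years + 1))

permanence = (2, 5, 9)   # triangular subjective estimate of years with the firm
low, mode, high = (clv(y) for y in permanence)
print(f"CLV (pessimistic / most plausible / optimistic): "
      f"{low:.0f} / {mode:.0f} / {high:.0f}")
```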