17 results for Objective assumptions
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper analyzes the growth and employment effects of the 1994-99 Community Support Framework (CSF) for the Objective 1 Spanish regions using a simple supply-side model estimated with a panel of regional data. The results suggest that the impact of the Structural Funds in Spain has been quite sizable, adding around a percentage point to annual output growth in the average Objective 1 region and 0.4 points to employment growth. Over the period 1994-2000, the Framework has resulted in the creation of over 300,000 new jobs and has eliminated 20% of the initial gap in income per capita between the assisted regions and the rest of the country.
Abstract:
Black-box optimization problems (BBOPs) are defined as optimization problems in which the objective function has no algebraic expression but is instead the output of a system (usually a computer program). This paper focuses on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions used to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal reinsurance output. Traditional optimization approaches cannot be applied to BBOPs, so new computational paradigms must be used to solve these problems. In this paper we show the performance of two evolutionary-based techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, in which the evolutionary-based approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
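As an illustration of the second technique named in this abstract, the following Python fragment is a minimal particle swarm optimization sketch for a generic black-box objective. It is not the authors' implementation: the reinsurance simulators are not described here, so black_box is a hypothetical stand-in (a sphere function), and the hyperparameters w, c1 and c2 are conventional textbook defaults rather than the paper's settings.

# Minimal PSO sketch for a black-box objective to be minimized.
# `black_box` is a placeholder for a simulator call (e.g. a reinsurance
# program evaluation); it is assumed to return a scalar cost.
import numpy as np

def black_box(x):
    return float(np.sum(x ** 2))  # hypothetical stand-in objective

def pso(f, dim, n_particles=30, iters=200, bounds=(-10.0, 10.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()        # global best position
    g_val = pbest_val.min()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        if vals.min() < g_val:
            g, g_val = x[vals.argmin()].copy(), vals.min()
    return g, g_val

best_x, best_val = pso(black_box, dim=5)
print(best_val)  # approaches 0 for the sphere stand-in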
Abstract:
The paper explores the consequences that relying on different behavioral assumptions in training managers may have on their future performance. We argue that training with an emphasis on the standard assumptions used in economics (rationality and self-interest) leads future managers to rely excessively on rational and explicit safeguarding, crowding out instinctive contractual heuristics and signaling a 'bad' type to potential partners. In contrast, human assumptions used in management theories, because of their diverse, implicit and even contradictory nature, do not conflict with the innate set of cooperative tools and may provide a good training ground for such tools. We present tentative confirmatory evidence by examining how the weight given to behavioral assumptions in the core courses of the top 100 business schools influences the average salaries of their MBA graduates. Controlling for the average quality of their students and some other schools' characteristics, average salaries are significantly greater for those schools whose core MBA courses contain a higher proportion of management courses as opposed to courses based on economics or technical disciplines.
Abstract:
We present a new method for constructing exact distribution-free tests (and confidence intervals) for variables that can generate more than two possible outcomes. This method separates the search for an exact test from the goal of creating a non-randomized test. Randomization is used to extend any exact test relating to means of variables with finitely many outcomes to variables with outcomes belonging to a given bounded set. Tests in terms of variance and covariance are reduced to tests relating to means. Randomness is then eliminated in a separate step. This method is used to create confidence intervals for the difference between two means (or variances) and tests of stochastic inequality and correlation.
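The randomization step described above can be made concrete. The sketch below is our reconstruction, not the paper's code: an outcome X with known bounds is rescaled to [0, 1] and replaced by a Bernoulli draw B with P(B = 1) = X, so E[B] = E[X] and an exact binomial test for the mean applies; the paper's separate derandomization step is not reproduced.

# Randomized-rounding reduction for an exact test of the mean of a
# bounded variable (a sketch under the assumptions stated above).
import numpy as np
from scipy.stats import binomtest

def randomized_binomial_test(x, lo, hi, mu0, alternative="greater", seed=0):
    """Exact randomized test of H0: E[X] <= mu0 when X takes values in [lo, hi]."""
    rng = np.random.default_rng(seed)
    u = (np.asarray(x, dtype=float) - lo) / (hi - lo)  # rescale to [0, 1]
    b = rng.random(u.shape) < u                        # Bernoulli with P(1) = u
    p0 = (mu0 - lo) / (hi - lo)                        # H0 mean on the [0, 1] scale
    return binomtest(int(b.sum()), n=b.size, p=p0, alternative=alternative)

# Hypothetical usage: outcomes bounded in [0, 10], testing E[X] > 5.
print(randomized_binomial_test([6.2, 7.1, 4.8, 9.0, 5.5], 0, 10, 5).pvalue)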
Abstract:
The problems arising in commercial distribution are complex and involve several players and decision levels. One important decision is the design of the routes used to distribute products in an efficient and inexpensive way. This article deals with a complex vehicle routing problem that can be seen as a new extension of the basic vehicle routing problem. The proposed model is a multi-objective combinatorial optimization problem that considers three objectives and multiple periods, and thus models real distribution problems more closely. The first objective is cost minimization, the second is balancing work levels and the third is a marketing objective. An application of the model to a small example, with 5 clients and 3 days, is presented. The results show the complexity of solving multi-objective combinatorial optimization problems and the conflicts among the several distribution management objectives.
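To make the three objectives concrete, the fragment below evaluates one candidate routing plan for a toy instance of the same size as the example (5 clients, 3 days). The distance matrix, the workload measure and, in particular, the marketing term are hypothetical placeholders: the abstract does not spell out the paper's actual formulations.

# Evaluating a candidate plan under three illustrative objectives:
# routing cost, workload balance across days, and a stand-in marketing term.
import numpy as np

rng = np.random.default_rng(1)
n_clients, n_days = 5, 3
dist = rng.uniform(1, 10, (n_clients + 1, n_clients + 1))  # node 0 = depot
dist = (dist + dist.T) / 2                                 # symmetric distances

def route_cost(route):
    path = [0, *route, 0]                                  # depot -> clients -> depot
    return sum(dist[a, b] for a, b in zip(path, path[1:]))

plan = [[1, 2], [3, 4], [5]]            # clients visited on each of the 3 days

cost = sum(route_cost(r) for r in plan)                    # objective 1: cost
loads = [len(r) for r in plan]
balance = max(loads) - min(loads)                          # objective 2: balance
marketing = sum(day * len(r) for day, r in enumerate(plan, start=1))
# objective 3 (hypothetical): penalize visits late in the planning horizon

print(cost, balance, marketing)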
Abstract:
The Person Trade-Off (PTO) is a methodology aimed at measuring the social value of health states. Other methodologies measure individual utility and are less appropriate for taking resource allocation decisions. However, few studies have been conducted to test the validity of the method. We present a pilot study with this objective, based on the results of interviews with 30 undergraduate students in Economics. We judge the validity of PTO answers by their adequacy to three hypotheses of rationality. First, we show that, given certain rationality assumptions, PTO answers should be predictable from answers to Standard Gamble questions. This first hypothesis is not verified. The second hypothesis is that PTO answers should not vary across different framings of equivalent PTO questions. This second hypothesis is not verified either. Our third hypothesis is that PTO values should predict social preferences for allocating resources between patients. This hypothesis is verified. The evidence on the validity of the method is thus conflicting.
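One common way to formalize the first hypothesis, reconstructed here for illustration (the abstract does not state the formula), is a utilitarian consistency condition between Standard Gamble utilities and PTO answers: if u(h) is the SG utility of health state h, with full health scaled to 1 and death to 0, a respondent should be indifferent between curing N patients in state h and saving n healthy lives when

\[
  N\,\bigl(1 - u(h)\bigr) = n,
  \qquad\text{so the predicted PTO answer is}\qquad
  N = \frac{n}{1 - u(h)}.
\]

The first test in the study can then be read as checking whether observed PTO answers match this prediction.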
Abstract:
In this paper, we discuss pros and cons of different models for financial market regulation and supervision and we present a proposal for the re-organisation of regulatory and supervisory agencies in the Euro Area. Our arguments are consistent with both new theories and the effective behaviour of financial intermediaries in industrialized countries. Our proposed architecture for financial market regulation is based on the assignment of different objectives or "finalities" to different authorities, both at the domestic and the European level. According to this perspective, the three objectives of supervision (microeconomic stability; investor protection and proper behaviour; efficiency and competition) should be assigned to three distinct European authorities, each one at the centre of a European system of financial regulators and supervisors specialized in overseeing the entire financial market with respect to a single regulatory objective and regardless of the subjective nature of the intermediaries. Each system should be structured and organized similarly to the European System of Central Banks and work in connection with the central bank, which would remain the institution responsible for price and macroeconomic stability. We suggest a plausible path to build our 4-peak regulatory architecture in the Euro area.
Abstract:
We estimate four models of female labour supply using a Spanish sample of married women from 1994, taking into account the complete form of the individual's budget set. The models differ in the hypotheses relating to the presence of optimisation errors and/or the way non-workers contribute to the likelihood function. According to the results, the effects of wages and non-labour income on the labour supply of Spanish married women depend on the specification used. The model which has both preference and optimisation errors and allows for both voluntarily and involuntarily unemployed females desiring to participate seems to better fit the evidence for Spanish married women.
Abstract:
We present an exact test for whether two random variables that have known bounds on their support are negatively correlated. The alternative hypothesis is that they are not negatively correlated. No assumptions are made on the underlying distributions. We show by example that the Spearman rank correlation test, the competing exact test of correlation in nonparametric settings, rests on an additional assumption on the data-generating process without which it is not valid as a test for correlation. We then show how to test for the significance of the slope in a linear regression analysis that involves a single independent variable and where outcomes of the dependent variable belong to a known bounded set.
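The abstract's warning about the Spearman test can be illustrated with a small invented data set in which ranks and magnitudes disagree: the Pearson correlation is close to -1 while the Spearman rank statistic is exactly 0, because ranks discard the magnitudes that drive the correlation.

# Invented example: strongly negative Pearson correlation, zero Spearman rho.
import numpy as np
from scipy.stats import pearsonr, spearmanr

x = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
y = np.array([2.0, 3.0, 4.0, 5.0, -100.0])

r, _ = pearsonr(x, y)     # about -1.0: one extreme pair dominates
rho, _ = spearmanr(x, y)  # exactly 0.0: the rank pattern is uninformative
print(r, rho)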
Abstract:
The paper explores the consequences that relying on different behavioral assumptions in training managers may have on their future performance. We argue that training with an emphasis on the standard assumptions used in economics (rationality and self-interest) is good for technical posts but may also lead future managers to rely excessively on rational and explicit safeguarding, crowding out instinctive relational heuristics and signaling a bad human type to potential partners. In contrast, human assumptions used in management theories, because of their diverse, implicit and even contradictory nature, do not conflict with the innate set of cooperative tools and may provide a good training ground for such tools. We present tentative confirmatory evidence by examining how the weight given to behavioral assumptions in the core courses of the top 100 business schools influences the average salaries of their MBA graduates. Controlling for the self-selected average quality of their students and some other schools' characteristics, average salaries are seen to be significantly greater for schools whose core MBA courses contain a higher proportion of management courses as opposed to courses based on economics or technical disciplines.
Abstract:
In this paper we describe the results of a simulation study performed to elucidate the robustness of the Lindstrom and Bates (1990) approximation method under non-normality of the residuals in different situations. Concerning the fixed effects, the observed coverage probabilities and the true bias and mean square error values show that some aspects of this inferential approach are not completely reliable. When the true distribution of the residuals is asymmetrical, the true coverage is markedly lower than the nominal one. The best results are obtained for the skew-normal distribution, not for the normal distribution. The results are partially reversed concerning the random effects. Soybean genotype data are used to illustrate the methods and to motivate the simulation scenarios.
Abstract:
In this paper we analyse, using Monte Carlo simulation, the possible consequences of incorrect assumptions about the true structure of the random-effects covariance matrix and the true correlation pattern of the residuals on the performance of an estimation method for nonlinear mixed models. The procedure under study is the well-known linearization method due to Lindstrom and Bates (1990), implemented in the nlme library of S-Plus and R. Its performance is studied in terms of bias, mean square error (MSE), and true coverage of the associated asymptotic confidence intervals. Setting aside other criteria, such as the convenience of avoiding over-parameterised models, it seems worse to erroneously assume some structure than to assume no structure at all when the latter would be adequate.
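The two simulation studies above share one basic design: estimate the true coverage of a nominally 95% interval by repeated sampling. The skeleton below illustrates only that design, not the papers' models: instead of a nonlinear mixed model fitted with nlme's Lindstrom-Bates linearization, it uses a plain sample mean with its asymptotic interval, and a skewed residual distribution of our choosing (a centred lognormal).

# Monte Carlo estimate of true coverage of a nominal 95% interval,
# under normal vs. skewed residuals (a simplified stand-in experiment).
import numpy as np

def coverage(residual_sampler, n=30, reps=5000, true_mean=0.0, seed=0):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        y = true_mean + residual_sampler(rng, n)
        se = y.std(ddof=1) / np.sqrt(n)
        hits += y.mean() - 1.96 * se <= true_mean <= y.mean() + 1.96 * se
    return hits / reps

normal = lambda rng, n: rng.normal(0.0, 1.0, n)
skewed = lambda rng, n: rng.lognormal(0.0, 1.0, n) - np.exp(0.5)  # mean 0, skewed

print(coverage(normal))  # close to the nominal 0.95
print(coverage(skewed))  # below nominal, echoing the asymmetry result above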
Abstract:
This paper explores analytically the contemporary pottery-making community of Pereruela (north-west Spain), which produces cooking pots from a mixture of red clay and kaolin. Analyses by different techniques (XRF, NAA, XRD, SEM and petrography) showed extremely high variability for cooking-ware pottery produced in a single production centre, with the same technology and using local clays. The main source of chemical variation is related to the use of different red clays and the presence of non-normally distributed inclusions of monazite. These two factors induce high chemical variability, not only in the output of a single production centre, but even in the paste of a single pot, to the extent that chemical compositions from one "workshop", or even one "pot", could be classified as having different provenances. The implications for the chemical characterization and for provenance studies of archaeological ceramics are addressed.
Abstract:
Application of semi-distributed hydrological models to large, heterogeneous watersheds raises several problems. On the one hand, the spatial and temporal variability in catchment features should be adequately represented in the model parameterization, while keeping model complexity at an acceptable level so as to take advantage of state-of-the-art calibration techniques. On the other hand, model complexity increases the uncertainty in adjusted model parameter values, and therefore the uncertainty in the water routing across the watershed. This is critical for water quality applications, where not only streamflow but also a reliable estimate of the surface versus subsurface contributions to runoff is needed. In this study, we show how a regularized inversion procedure combined with a multi-objective calibration strategy successfully solves the parameterization of a complex application of a water-quality-oriented hydrological model. The final values of several optimized parameters showed significant and consistent differences across geological and landscape features. Although the number of optimized parameters was significantly increased by the spatial and temporal discretization of adjustable parameters, the uncertainty in the water routing results remained at reasonable values. In addition, a stepwise numerical analysis showed that the effects on calibration performance of including different data types in the objective function can be inextricably linked. Caution should therefore be taken when adding data to or removing data from an aggregated objective function.
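The aggregated objective function discussed above can be written schematically as a weighted sum of misfit terms, one per data type, plus a regularization penalty that keeps parameters near prior values. The fragment below is only a schematic reconstruction: the weights, data blocks and Tikhonov-style penalty are illustrative placeholders, not the study's actual configuration.

# Schematic aggregated, regularized calibration objective (illustrative).
import numpy as np

def aggregated_objective(params, simulate, data_blocks, weights, lam, p_ref):
    """simulate(params, name) returns the simulated counterpart of one
    observation block (e.g. streamflow or stream chemistry)."""
    misfit = sum(
        w * np.sum((data_blocks[name] - simulate(params, name)) ** 2)
        for name, w in weights.items()
    )
    penalty = lam * np.sum((params - p_ref) ** 2)  # keep params near the prior
    return misfit + penalty

# Hypothetical usage with a toy one-parameter "model".
obs = {"flow": np.array([1.0, 2.0, 3.0]), "nitrate": np.array([0.5, 0.4])}
sim = lambda p, name: p[0] * np.ones_like(obs[name])
print(aggregated_objective(np.array([1.2]), sim, obs,
                           {"flow": 1.0, "nitrate": 2.0},
                           lam=0.1, p_ref=np.array([1.0])))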
Abstract:
Global warming mitigation has recently become a priority worldwide. A large body of literature dealing with energy-related problems has focused on reducing greenhouse gas emissions at an engineering scale. In contrast, the minimization of climate change at a wider macroeconomic level has so far received much less attention. We investigate here how to mitigate global warming by making changes in an economy. To this end, we make use of a systematic tool that combines three methods: linear programming, environmentally extended input-output models, and life cycle assessment principles. The problem of identifying key economic sectors that contribute significantly to global warming is posed in mathematical terms as a bi-criteria linear program that seeks to simultaneously optimize the total economic output and the total life cycle CO2 emissions. We have applied this approach to the European Union economy, finding that significant reductions in global warming potential can be attained by regulating specific economic sectors. Our tool is intended to aid policymakers in the design of more effective public policies for achieving the environmental and economic targets sought.
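The bi-criteria linear program described above can be sketched with a toy three-sector economy. Everything numerical below is invented for illustration: sectoral final demand y is taken as the decision variable, total output x = (I - A)^{-1} y comes from a Leontief input-output model, emissions are f'x for direct intensities f, and the two criteria are scalarized with a weight w to trace part of the Pareto front.

# Weighted-sum scalarization of a toy bi-criteria input-output LP.
import numpy as np
from scipy.optimize import linprog

A = np.array([[0.10, 0.20, 0.05],          # technical coefficients (invented)
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.05]])
f = np.array([0.9, 0.3, 0.1])              # direct CO2 intensities (invented)
L = np.linalg.inv(np.eye(3) - A)           # Leontief inverse: x = L @ y

bounds = [(50.0, 150.0)] * 3               # bounds on sectoral final demand y

for w in (0.2, 0.5, 0.8):                  # sweep the scalarization weight
    # maximize w * total output - (1 - w) * emissions; linear in y:
    c = -(w * np.ones(3) - (1 - w) * f) @ L
    y = linprog(c, bounds=bounds).x
    x = L @ y
    print(w, round(x.sum(), 1), round(float(f @ x), 1))  # output vs. CO2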