989 results for parametric implicit vector equilibrium problems
Abstract:
In this paper, a support vector machine (SVM) approach for characterizing the feasible parameter set (FPS) in nonlinear set-membership estimation problems is presented. It iteratively solves a regression problem from which an approximation of the boundary of the FPS can be determined. To guarantee convergence to the boundary, the procedure includes a no-derivative line search; to obtain an appropriate coverage of points on the FPS boundary, it is suggested to start with a sequential box pavement procedure. The SVM approach is illustrated on a simple two-parameter sine-and-exponential model and on an agro-forestry simulation model.
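As a rough illustration of the general idea (not the paper's algorithm, which additionally relies on the regression formulation, the no-derivative line search, and the sequential box pavement mentioned above), the sketch below labels sampled parameter vectors as feasible or infeasible under a set-membership error bound and fits an SVM whose decision surface approximates the FPS boundary; the model, error bound, and sampling box are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Illustrative two-parameter model y = a*sin(t) + exp(b*t) with bounded-error data
t = np.linspace(0.0, 1.0, 20)
a_true, b_true, eps = 1.0, 0.5, 0.3                       # eps: assumed error bound
y_obs = a_true * np.sin(t) + np.exp(b_true * t) + rng.uniform(-eps / 2, eps / 2, t.size)

def feasible(a, b):
    """(a, b) belongs to the FPS if every residual stays within the bound."""
    return bool(np.all(np.abs(y_obs - (a * np.sin(t) + np.exp(b * t))) <= eps))

# Sample a parameter box and label each point feasible/infeasible
P = rng.uniform([0.5, 0.2], [1.5, 0.8], size=(2000, 2))
labels = np.array([feasible(a, b) for a, b in P], dtype=int)

# The SVM decision surface is an approximation of the FPS boundary
svm = SVC(kernel="rbf", C=100.0, gamma="scale").fit(P, labels)
near_boundary = np.abs(svm.decision_function(P)) < 0.1
print("feasible samples:", labels.sum(), "| points near the boundary:", near_boundary.sum())
```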
Abstract:
An apoptotic process is studied using biological models consisting of systems of differential equations derived from reaction-kinetics information. The mathematical model is reformulated in a state-space robust control theory framework in which parametric and dynamic uncertainty can be modelled to account for variations naturally occurring in biological processes. We propose to handle the nonlinearities using neural networks.
Abstract:
Extensive population structuring is known to occur in Anopheles darlingi, the primary malaria vector of the Neotropics. We analysed the phylogeographic structure of the species using the mitochondrial cytochrome oxidase I marker. Diversity is divided into six main population groups in South America: Colombia, central Amazonia, southern Brazil, south-eastern Brazil, and two groups in north-east Brazil. The ancestral distribution of the taxon is hypothesized to be central Amazonia, and there is evidence of expansion from this region during the late Pleistocene. The expansion was not a homogeneous front, however, with at least four subgroups being formed by geographic barriers. As the species spread, populations became isolated from one another by the Amazon River and by the coastal mountain ranges of south-eastern Brazil and the Andes. Analyses incorporating distances around these barriers suggest that the entire South American range of An. darlingi is at mutation-dispersal-drift equilibrium. Because the species is distributed over such a broad area, limited dispersal across some landscape types promotes differentiation between otherwise proximate populations. Moreover, samples from the An. darlingi holotype locality in Rio de Janeiro State are substantially differentiated from all other populations, implying that there may be additional genetic differences of epidemiological relevance. These results contribute to our understanding of gene flow in this species and support the formulation of public health and mosquito-control protocols in light of potential population differences in vector capacity or tolerance to control strategies. (C) 2009 The Linnean Society of London, Biological Journal of the Linnean Society, 2009, 97, 854-866.
Abstract:
Many real problems involve the classification of data into categories or classes. Given a data set of examples whose classes are known, Machine Learning algorithms can be employed to induce a classifier able to predict the class of new data from the same domain, performing the desired discrimination. Some learning techniques are originally conceived for the solution of problems with only two classes, also known as binary classification problems. However, many problems require the discrimination of examples into more than two categories or classes. This paper presents a survey of the main strategies for generalizing binary classifiers to problems with more than two classes, known as multiclass classification problems. The focus is on strategies that decompose the original multiclass problem into multiple binary subtasks, whose outputs are combined to obtain the final prediction.
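For illustration, a minimal sketch of the two most common decomposition strategies covered by such surveys, one-vs-rest and one-vs-one, using scikit-learn wrappers around a binary SVM; the dataset and base classifier are assumptions for the example, not taken from the paper.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier, OneVsOneClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)                 # a 3-class problem
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One-vs-rest: one binary classifier per class (class vs. all others)
ovr = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale")).fit(X_tr, y_tr)

# One-vs-one: one binary classifier per pair of classes, combined by voting
ovo = OneVsOneClassifier(SVC(kernel="rbf", gamma="scale")).fit(X_tr, y_tr)

print("one-vs-rest accuracy:", ovr.score(X_te, y_te))
print("one-vs-one accuracy:", ovo.score(X_te, y_te))
```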
Abstract:
Support vector machines (SVMs) were originally formulated for the solution of binary classification problems. In multiclass problems, a decomposition approach is often employed, in which the multiclass problem is divided into multiple binary subproblems, whose results are combined. Generally, the performance of SVM classifiers is affected by the selection of values for their parameters. This paper investigates the use of genetic algorithms (GAs) to tune the parameters of the binary SVMs in common multiclass decompositions. The developed GA may search for a set of parameter values common to all binary classifiers or for differentiated values for each binary classifier. (C) 2008 Elsevier B.V. All rights reserved.
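As a toy sketch of the idea (the chromosome encoding, genetic operators, and dataset below are assumptions, not the paper's setup), an evolutionary search can tune a single (C, gamma) pair shared by all binary SVMs of a one-vs-one decomposition, using cross-validated accuracy as the fitness; searching a separate pair per binary classifier would simply enlarge the chromosome.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)

def fitness(ind):
    """Cross-validated accuracy of a one-vs-one SVM with C, gamma = 2**ind."""
    C, gamma = 2.0 ** ind
    return cross_val_score(OneVsOneClassifier(SVC(C=C, gamma=gamma)), X, y, cv=3).mean()

# Each individual encodes (log2 C, log2 gamma), shared by every binary SVM
pop = rng.uniform([-5.0, -15.0], [15.0, 3.0], size=(12, 2))
for _ in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argmax(scores)].copy()
    parents = pop[np.argsort(scores)[-6:]]                        # truncation selection
    pop = parents[rng.integers(0, 6, 12)] + rng.normal(0.0, 0.5, (12, 2))  # Gaussian mutation
    pop[0] = elite                                                # elitism

print("best (log2 C, log2 gamma):", pop[0], "cv accuracy:", round(fitness(pop[0]), 3))
```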
Abstract:
Several popular Machine Learning techniques are originally designed for the solution of two-class problems. However, many classification problems have more than two classes. One approach to dealing with multiclass problems using binary classifiers is to decompose the multiclass problem into multiple binary sub-problems arranged in a binary tree. This approach requires a binary partition of the classes at each node of the tree, which defines the tree structure. This paper presents two algorithms that determine the tree structure using information collected from the dataset in use, so that the structure can be determined automatically for any multiclass dataset.
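The paper's two algorithms are not reproduced here; as an illustration of the general scheme, the sketch below derives a tree structure from the data by clustering class centroids into two superclasses at each node and training a binary SVM per internal node (the centroid/k-means criterion and dataset are assumptions for the example).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.svm import SVC

def build_tree(X, y, classes):
    """Recursively split the set of classes in two and train one binary SVM per node."""
    if len(classes) == 1:
        return classes[0]                                   # leaf: a single class label
    # Data-driven partition: cluster the class centroids into two superclasses
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    side = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(centroids)
    left = [c for c, s in zip(classes, side) if s == 0]
    right = [c for c, s in zip(classes, side) if s == 1]
    in_left = np.isin(y, left)
    node_svm = SVC(gamma="scale").fit(X, np.where(in_left, 0, 1))
    return (node_svm,
            build_tree(X[in_left], y[in_left], left),
            build_tree(X[~in_left], y[~in_left], right))

def predict_one(tree, x):
    """Route a sample down the tree until a leaf (class label) is reached."""
    while isinstance(tree, tuple):
        svm, left, right = tree
        tree = left if svm.predict(x.reshape(1, -1))[0] == 0 else right
    return tree

X, y = load_digits(return_X_y=True)
tree = build_tree(X, y, sorted(set(y)))
print("predicted:", predict_one(tree, X[0]), "| true:", y[0])
```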
Abstract:
We consider random generalizations of a quantum model of infinite range introduced by Emch and Radin. The generalizations allow a neat extension from the class ℓ¹ of absolutely summable lattice potentials to the optimal class ℓ² of square-summable potentials first considered by Khanin and Sinai and generalised by van Enter and van Hemmen. The approach to equilibrium in the case of a Gaussian distribution is proved to be faster than for a Bernoulli distribution for both short-range and long-range lattice potentials. While exponential decay to equilibrium is excluded in the nonrandom ℓ¹ case, it is proved to occur for both short- and long-range potentials for Gaussian distributions, and for potentials of class ℓ² in the Bernoulli case. Open problems are discussed.
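For reference, a minimal statement of the two potential classes named above, written for a generic coupling sequence J_n (the precise normalization in the paper may differ):

```latex
\ell^{1}:\ \sum_{n}\lvert J_{n}\rvert<\infty \quad\text{(absolutely summable)},
\qquad
\ell^{2}:\ \sum_{n} J_{n}^{2}<\infty \quad\text{(square summable)}.
```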
Abstract:
Most studies involving statistical time series analysis rely on assumptions of linearity, whose simplicity facilitates parameter interpretation and estimation. However, the linearity assumption may be too restrictive for many practical applications. The implementation of nonlinear models in time series analysis involves the estimation of a large set of parameters, frequently leading to overfitting problems. In this article, a predictability coefficient is estimated using a combination of nonlinear autoregressive models, and the use of support vector regression within this framework is explored. We illustrate the usefulness and interpretability of the results using electroencephalographic records of an epileptic patient.
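A minimal sketch of a support-vector autoregression of the kind discussed above: embed the series in lag vectors, fit an SVR one step ahead, and compare its out-of-sample error with a naive persistence forecast. The synthetic series, lag order, and the final score are illustrative assumptions and are not the predictability coefficient defined in the article.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic nonlinear series standing in for an EEG channel
n, p = 1000, 5                                    # series length, autoregressive order
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * np.sin(x[t - 1]) + 0.1 * rng.normal()

# Lag-vector embedding: predict x[t] from (x[t-1], ..., x[t-p])
Z = np.column_stack([x[p - 1 - k: n - 1 - k] for k in range(p)])
target = x[p:]
split = 700

svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(Z[:split], target[:split])
mse_svr = np.mean((target[split:] - svr.predict(Z[split:])) ** 2)
mse_naive = np.mean((target[split:] - x[p - 1: n - 1][split:]) ** 2)   # persistence forecast

print("illustrative predictability-style score:", 1.0 - mse_svr / mse_naive)
```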
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, managing risk, and studying monetary policy implications. In turn, dynamic term structure models, equipped with stronger economic structure, have been adopted mainly to price derivatives and to explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage restrictions affect forecasting. We construct cross-sectional (allowing arbitrages) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts. Arbitrage-free versions achieve overall smaller biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to better identification of the bond risk premium.
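As background for that decomposition (in generic notation, not the paper's), an n-period log yield can be written as the average of one-period forward rates, each of which splits into an expected future short rate plus a premium term defined residually:

```latex
y_t^{(n)} \;=\; \frac{1}{n}\sum_{k=0}^{n-1} f_t^{(k,k+1)},
\qquad
f_t^{(k,k+1)} \;=\; \mathbb{E}_t\!\bigl[y_{t+k}^{(1)}\bigr] \;+\; \phi_t^{(k)},
```

so that differences in forecast performance can be traced to the forward-rate and premium components separately.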
Abstract:
We define Nash equilibrium for two-person normal form games in the presence of uncertainty, in the sense of Knight (1921). We use the formalization of uncertainty due to Schmeidler and Gilboa. We show that there exist Nash equilibria for any degree of uncertainty, as measured by the uncertainty aversion (Dow and Werlang (1992a)). We show by example that prudent behaviour (maxmin) can be obtained as an outcome even when it is not rationalizable in the usual sense. Next, we break down backward induction in the twice repeated prisoner's dilemma. We link these results with those on cooperation in the finitely repeated prisoner's dilemma obtained by Kreps-Milgrom-Roberts-Wilson (1982), and with the literature on epistemological conditions underlying Nash equilibrium. The knowledge notion implicit in this model of equilibrium does not display logical omniscience.
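For reference, the uncertainty-aversion measure cited above is usually stated, for a convex capacity (non-additive probability) P and an event A, as

```latex
c(P, A) \;=\; 1 \;-\; P(A) \;-\; P(A^{c}),
```

which vanishes when P is additive and is positive under Knightian uncertainty; this is the standard formulation rather than a quotation from the paper.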
Abstract:
The paper analyzes a two-period general equilibrium model with individual risk and moral hazard. Each household faces two individual states of nature in the second period. These states differ solely in the household's vector of initial endowments, which is strictly larger in the first state (good state) than in the second state (bad state). In the first period households choose a non-observable action. Higher levels of action give a higher probability of the good state of nature occurring, but lower levels of utility. Households have access to an insurance market that allows transfer of income across states of nature. I consider two models of financial markets, the price-taking behavior model and the nonlinear pricing model. In the price-taking behavior model suppliers of insurance have a belief about each household's action and take asset prices as given. A variation of standard arguments shows the existence of a rational expectations equilibrium. For a generic set of economies every equilibrium is constrained sub-optimal: there are commodity prices and a reallocation of financial assets satisfying the first-period budget constraint such that, at each household's optimal choice given those prices and asset reallocation, markets clear and every household's welfare improves. In the nonlinear pricing model suppliers of insurance behave strategically, offering nonlinear pricing contracts to the households. I provide sufficient conditions for the existence of equilibrium and investigate the optimality properties of the model. If there is a single commodity, then every equilibrium is constrained optimal. If there is more than one commodity, then for a generic set of economies every equilibrium is constrained sub-optimal.
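In stylized notation (an illustrative generic form, not the paper's exact model), the household's first-period problem has the shape

```latex
\max_{a,\;x_{g},\;x_{b}}\;\;
\pi(a)\,u(x_{g}) \;+\; \bigl(1-\pi(a)\bigr)\,u(x_{b}) \;-\; v(a),
```

where a is the non-observable action, pi(a) is increasing (higher action makes the good state more likely), v(a) is the utility cost of the action, and x_g, x_b are the state-contingent consumption plans financed with endowments and insurance purchases, subject to the budget constraints.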
Abstract:
We consider private value auctions where bidders’ types are dependent, a case usually treated by assuming affiliation. We show that affiliation is a restrictive assumption in three senses: topological, measure-theoretic and statistical (affiliation is a very restrictive characterization of positive dependence). We also show that affiliation’s main implications do not generalize for alternative definitions of positive dependence. From this, we propose new approaches to the problems of pure strategy equilibrium existence in first-price auctions (PSEE) and the characterization of the revenue ranking of auctions. For equilibrium existence, we slightly restrict the set of distributions considered, without loss of economic generality, and offer a complete characterization of PSEE. For revenue ranking, we obtain a characterization of the expected revenue differences between second and first price auctions with general dependence of types.
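For context, affiliation (in the sense of Milgrom and Weber, 1982) requires the joint density f of the bidders' types to satisfy, for all type profiles x and y,

```latex
f(x \vee y)\, f(x \wedge y) \;\ge\; f(x)\, f(y),
```

where the wedge and vee denote the componentwise minimum and maximum; the abstract's point is that this multiplicative inequality is considerably stronger than most alternative notions of positive dependence.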
Abstract:
Brazil has shown resilience in the face of the recent economic crises and has an auspicious development potential projected for the coming decades, which, together with the globalization process, provides important opportunities for our people. We have gradually established ourselves as one of the leading nations in the world and have become a reference on issues related to economic equilibrium, development, energy, agriculture and the environment. This international recognition favors the exchange of experiences with other cultures, governments and organizations, bringing with it the possibility of stimulating a dynamic process of development and innovation.
Abstract:
The real exchange rate is an important macroeconomic price in the economy and affects economic activity, interest rates, domestic prices, and trade and investment flows, among other variables. Methodologies have been developed in empirical exchange rate misalignment studies to evaluate whether a real effective exchange rate is overvalued or undervalued. There is a vast body of literature on the determinants of long-term real exchange rates and on empirical strategies to implement the equilibrium norms obtained from theoretical models. This study seeks to contribute to this literature by showing that it is possible to calculate the misalignment from a mixed-frequency cointegrated vector error correction framework. An empirical exercise using United States real exchange rate data is performed. The results suggest that the model with mixed-frequency data is preferred to the models with same-frequency variables.
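A minimal sketch of the estimation idea, using simulated data in place of the U.S. series and statsmodels' standard single-frequency VECM rather than the mixed-frequency framework of the paper: estimate the cointegrating relation between the real exchange rate and a fundamental, and read the misalignment off the demeaned error-correction term.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)

# Simulated stand-ins: a "fundamental" random walk and a real exchange rate
# that shares its stochastic trend, so the two series are cointegrated
T = 300
fundamental = np.cumsum(rng.normal(size=T))
rer = 0.8 * fundamental + rng.normal(scale=0.5, size=T)
data = np.column_stack([rer, fundamental])

# One cointegrating relation, one lag in differences
res = VECM(data, k_ar_diff=1, coint_rank=1).fit()

# Misalignment proxy: deviation from the estimated long-run relation beta' y_t
ect = data @ res.beta                      # shape (T, 1)
misalignment = ect - ect.mean()
print("last-period misalignment (illustrative units):", misalignment[-1, 0])
```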
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)