874 results for Objective function values


Relevance: 80.00%

Abstract:

Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices, in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they were able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized, and incorporated into models of decision making in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, this cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players she believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; a greater degree of self-delusion among minority-party voters lowers this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
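
The think-or-default logic of Chapter 2 can be illustrated with a short numerical sketch. It assumes a linear inverse demand P = a - b(q1 + q2) with zero marginal cost and illustrative parameter values (none taken from the thesis), and checks when the asymmetric think/default profile is an equilibrium of the meta-game in which each firm first decides whether to pay the thinking cost:

    # Minimal sketch of the thinking-cost Cournot idea described above (not the
    # thesis's exact model): linear inverse demand P = a - b*(q1 + q2), zero
    # marginal cost, a default quantity d, and a thinking cost c paid by any
    # firm that best-responds instead of producing d.
    a, b, c, d = 12.0, 1.0, 7.0, 2.0

    def profit(qi, qj):
        return max(a - b * (qi + qj), 0.0) * qi

    def best_response(qj):
        return max((a - b * qj) / (2 * b), 0.0)   # argmax of profit(., qj)

    # Payoffs of the 2x2 "think or default" meta-game (Cournot play follows).
    q_star = a / (3 * b)                      # both think: standard Cournot
    both_think   = profit(q_star, q_star) - c
    br_d = best_response(d)
    think_vs_def = profit(br_d, d) - c        # I think, my rival defaults
    def_vs_think = profit(d, br_d)            # I default, my rival thinks
    both_default = profit(d, d)

    # Asymmetric (think, default) play is a Nash equilibrium of the meta-game
    # when thinking beats defaulting against a defaulter, but not against a thinker.
    asymmetric = (think_vs_def >= both_default) and (def_vs_think >= both_think)
    print("asymmetric think/default equilibrium exists:", asymmetric)

With these numbers, best-responding to a defaulter beats defaulting alongside it, while defaulting against a thinker beats paying the cost to think, so the asymmetric profile survives as an equilibrium.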

Relevance: 80.00%

Abstract:

Population-based metaheuristics, such as particle swarm optimization (PSO), have been employed to solve many real-world optimization problems. Although it is often sufficient to find a single solution to these problems, there exist cases where identifying multiple, diverse solutions can be beneficial or even required. Some of these problems are further complicated by a change in their objective function over time. This type of optimization is referred to as dynamic, multi-modal optimization. Algorithms which exploit multiple optima in a search space are identified as niching algorithms. Although numerous dynamic niching algorithms have been developed, their performance is often measured solely on their ability to find a single, global optimum. Furthermore, the comparisons often use synthetic benchmarks whose landscape characteristics are generally limited and unknown. This thesis provides a landscape analysis of the dynamic benchmark functions commonly developed for multi-modal optimization. The benchmark analysis results reveal that the mechanisms responsible for dynamism in the current dynamic benchmarks do not significantly affect landscape features, thus suggesting a lack of representation of problems whose landscape features vary over time. This analysis is used in a comparison of current niching algorithms to identify the effects that specific landscape features have on niching performance. Two performance metrics are proposed to measure both the scalability and accuracy of the niching algorithms. The algorithm comparison results identify the algorithms best suited to a variety of dynamic environments. This comparison also examines each of the algorithms in terms of its niching behaviour and analyzes the range of, and trade-off between, scalability and accuracy when tuning the algorithms' respective parameters. These results contribute to the understanding of current niching techniques as well as the problem features that ultimately dictate their success.
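
As a toy illustration of what a niching algorithm must do (this is not one of the algorithms compared in the thesis), the sketch below runs several independent PSO sub-swarms on a one-dimensional function with five equal peaks; each sub-swarm can settle on a different optimum, and dynamism could be mimicked by changing f between steps:

    import numpy as np

    # Toy niching illustration: independent PSO sub-swarms, each free to
    # converge to a different peak of a multimodal function.
    rng = np.random.default_rng(0)
    f = lambda x: np.sin(5 * np.pi * x) ** 6          # five equal peaks on [0, 1]

    n_swarms, n_particles, steps = 5, 10, 200
    x = rng.random((n_swarms, n_particles))           # positions in [0, 1]
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), f(x)

    for _ in range(steps):
        # per-swarm best attracts only that swarm's particles
        gbest = pbest[np.arange(n_swarms), pbest_val.argmax(axis=1)]
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest[:, None] - x)
        x = np.clip(x + v, 0.0, 1.0)
        better = f(x) > pbest_val
        pbest[better], pbest_val[better] = x[better], f(x)[better]

    best = pbest[np.arange(n_swarms), pbest_val.argmax(axis=1)]
    # several distinct peaks among 0.1, 0.3, ..., 0.9 (duplicates possible)
    print("per-swarm best positions:", np.sort(np.round(best, 2)))

Independent sub-swarms are the crudest possible niching mechanism; the algorithms the thesis compares add devices such as species formation, exclusion radii or repulsion to keep sub-populations from collapsing onto the same peak.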

Relevance: 80.00%

Abstract:

We investigate the conditions under which an inequality-averse and additively separable welfarist constitution maker would always choose to set up a progressive equalization payments scheme in a federation with local public goods. A progressive equalization payments scheme is defined as a list of per capita net (possibly negative) subsidies, one for every jurisdiction, that are decreasing with respect to jurisdictions' per capita wealth. We examine these questions in a setting in which the case for progressivity is a priori the strongest, namely, all citizens have the same utility function for the private and the public goods, inhabitants of a given jurisdiction are all identical, and they are not able to move across jurisdictions. We show that the constitution maker favors a progressive equalization payments scheme for all distributions of wealth and all population sizes if and only if its objective function is additively separable between each jurisdiction's per capita wealth and number of inhabitants. When interpreted as a mean-of-order-r social welfare function, this condition is shown to be equivalent to additive separability of the individual's indirect utility function with respect to wealth and the price of the public good. Some implications of this restriction for the case where the individual's direct utility function is additively separable are also derived.
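
The definition of progressivity used above is easy to state operationally; the snippet below (with made-up wealth and subsidy figures, purely for illustration) checks that a schedule of per capita net subsidies is decreasing in per capita wealth:

    # Illustrative check of the paper's definition of progressivity
    # (hypothetical numbers): a per capita net subsidy schedule is progressive
    # when it is decreasing in jurisdictions' per capita wealth.
    wealth  = [10.0, 25.0, 40.0, 80.0]     # per capita wealth by jurisdiction
    subsidy = [ 6.0,  2.5, -1.0, -7.5]     # per capita net (possibly negative) subsidy

    pairs = sorted(zip(wealth, subsidy))   # order jurisdictions by wealth
    progressive = all(s_rich <= s_poor     # richer jurisdiction gets no more
                      for (_, s_poor), (_, s_rich) in zip(pairs, pairs[1:]))
    print("progressive equalization scheme:", progressive)   # True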

Relevance: 80.00%

Abstract:

When triangulating a belief network we aim to obtain a junction tree of minimum state space. Searching for the optimal triangulation can be cast as a search over all the permutations of the network's variables. Our approach is to embed the discrete set of permutations in a convex continuous domain D. By suitably extending the cost function over D and solving the continuous nonlinear optimization task we hope to obtain a good triangulation with respect to the aforementioned cost. In this paper we introduce an upper bound on the total junction tree weight as the cost function. The appropriateness of this choice is discussed and explored by simulations. Then we present two ways of embedding the new objective function into continuous domains and show that they perform well compared to the best known heuristic.
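
The discrete problem that the paper relaxes can be sketched directly: given an elimination order (a permutation of the variables), the induced triangulation's weight is the total state space of the cliques created during elimination. The toy network and cardinalities below are invented for illustration, and the brute-force minimum over permutations stands in for the continuous embedding:

    import itertools
    from math import prod

    card  = {'A': 2, 'B': 3, 'C': 2, 'D': 4, 'E': 2}            # states per variable
    edges = {frozenset(e) for e in [('A','B'),('B','C'),('C','D'),('D','E'),('B','D')]}

    def triangulation_weight(order, edges):
        # Eliminate variables in the given order; each elimination forms a
        # clique of the variable and its remaining neighbours, whose state
        # space is added to the weight, plus fill-in edges among neighbours.
        edges, weight, active = set(edges), 0, set(card)
        for v in order:
            nbrs = {u for u in active if frozenset((u, v)) in edges} - {v}
            weight += prod(card[u] for u in nbrs | {v})
            edges |= {frozenset(p) for p in itertools.combinations(nbrs, 2)}
            active.remove(v)
        return weight

    best = min(itertools.permutations(card),
               key=lambda o: triangulation_weight(o, edges))
    print(best, triangulation_weight(best, edges))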

Relevance: 80.00%

Abstract:

Transnational processes have marked a change in the relations between the actors of the international system, enabling work on diverse causes across borders. Social movements have taken advantage of this so that their struggle is not confined to a single country but, through similar objectives, problems, values and actions, is reflected in different states, giving rise to common, collective action to generate change. The Pro-choice movement has drawn on this phenomenon to articulate itself transnationally in Colombia to promote Sexual and Reproductive Rights in the period from 2001 to 2011, achieving a series of important objectives that have enabled legal changes within the country and also generated change within Colombian society. The study, analysis and understanding of the articulation of the pro-choice movement through a transnational dynamic for the promotion of sexual and reproductive rights in Colombia emerges as an important topic given its current relevance in the world, since it has remained latent over the last twenty years. Likewise, it identifies the action of transnational social movements, like that of other international actors, in the transformation of both local and international societies, a phenomenon that can be explained within International Relations.

Relevance: 80.00%

Abstract:

Society's growing concern and awareness regarding the environment, and consequently the legislation and regulations this generates, are driving the modification of existing production processes in the chemical industry. Initial configurations must be modified to achieve greater process integration. To this end, different methodologies have been created and developed to ease the task of those responsible for the redesign. The development of a methodology and complementary tools is the main objective of the research presented here, centred in particular on the development and application of a process optimization methodology. This optimization methodology is applied to existing process configurations and aims to find new feasible configurations according to the optimization objectives set. The methodology has two distinct parts: the first is based on a commercial process simulator and the second is the optimization technique itself. The methodology begins with the construction of a suitably validated simulation that reproduces the existing process, in this case a non-integrated paper mill producing coated printing paper. The optimization technique then searches the domain of possible results for the best solutions that fully satisfy the stated objectives. This optimization technique is based on genetic algorithms as the search tool, together with a subprogram based on mathematical programming techniques for computing results. A small number of results are finally selected and used to modify the existing simulation by fixing the redistribution of the process flows. The results of the process simulation ultimately determine the technical feasibility of each proposed reconfiguration. In the optimization process, the objectives are defined in an objective function within the optimization technique, which governs the search. The objective function can be an individual objective or a combination of objectives. In the present case, the function pursues a minimization of water consumption and a minimization of raw material loss. The optimization is performed under constraints to reach this combined objective in the form of a compromise solution. Applying this methodology has yielded interesting results that mean improved circuit closure and savings in raw material, without compromising the operability of the process or the quality of the paper.
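
The search component described above can be sketched generically. The snippet below is a minimal genetic algorithm in the same spirit (the genome encoding, the stand-in "simulation", the weights and the operators are all illustrative assumptions, not the thesis's actual model): a binary genome marks which process streams are recirculated, and the fitness combines water consumption and raw-material loss into one compromise objective:

    import random

    random.seed(1)
    N = 12                                   # number of candidate recirculation points

    def evaluate(genome):
        # Stand-in for the validated process simulation: returns (water, fibre_loss).
        water = 100.0 - 6.0 * sum(genome) + 0.4 * sum(genome) ** 2
        loss  = 20.0 - 1.0 * sum(genome) + 0.08 * sum(genome) ** 2
        return water, loss

    def fitness(genome):
        water, loss = evaluate(genome)
        return -(0.7 * water + 0.3 * loss)   # compromise: minimize both, weighted

    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(30)]
    for _ in range(60):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                   # elitist truncation selection
        children = []
        while len(children) < 20:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N)
            child = a[:cut] + b[cut:]                       # one-point crossover
            i = random.randrange(N)
            child[i] ^= random.random() < 0.1               # bit-flip mutation
            children.append(child)
        pop = parents + children

    best = max(pop, key=fitness)
    print(best, evaluate(best))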

Relevance: 80.00%

Abstract:

The Gauss–Newton algorithm is an iterative method regularly used for solving nonlinear least squares problems. It is particularly well suited to the treatment of very large scale variational data assimilation problems that arise in atmosphere and ocean forecasting. The procedure consists of a sequence of linear least squares approximations to the nonlinear problem, each of which is solved by an “inner” direct or iterative process. In comparison with Newton’s method and its variants, the algorithm is attractive because it does not require the evaluation of second-order derivatives in the Hessian of the objective function. In practice the exact Gauss–Newton method is too expensive to apply operationally in meteorological forecasting, and various approximations are made in order to reduce computational costs and to solve the problems in real time. Here we investigate the effects on the convergence of the Gauss–Newton method of two types of approximation used commonly in data assimilation. First, we examine “truncated” Gauss–Newton methods where the inner linear least squares problem is not solved exactly, and second, we examine “perturbed” Gauss–Newton methods where the true linearized inner problem is approximated by a simplified, or perturbed, linear least squares problem. We give conditions ensuring that the truncated and perturbed Gauss–Newton methods converge and also derive rates of convergence for the iterations. The results are illustrated by a simple numerical example. A practical application to the problem of data assimilation in a typical meteorological system is presented.
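
A compact numerical sketch of a truncated Gauss–Newton iteration (on a small invented curve-fitting problem, nothing like an operational system) shows the structure: each outer step linearizes the residual and solves the inner normal equations J'J dx = -J'r only approximately, here by capping the conjugate-gradient iterations:

    import numpy as np

    def residual(x):        # fit y = exp(t*x0) + x1 to synthetic data
        t = np.linspace(0.0, 1.0, 20)
        y = np.exp(0.8 * t) + 0.3
        return np.exp(t * x[0]) + x[1] - y

    def jacobian(x):
        t = np.linspace(0.0, 1.0, 20)
        return np.column_stack([t * np.exp(t * x[0]), np.ones_like(t)])

    def cg(A, b, iters):    # truncated CG on the (SPD) normal equations
        x, r = np.zeros_like(b), b.copy()
        p = r.copy()
        for _ in range(iters):
            Ap = A @ p
            alpha = (r @ r) / (p @ Ap)
            x, r_new = x + alpha * p, r - alpha * Ap
            p = r_new + ((r_new @ r_new) / (r @ r)) * p
            r = r_new
        return x

    x = np.array([0.0, 0.0])
    for outer in range(8):
        r, J = residual(x), jacobian(x)
        # inner truncation: 1 CG step only (an exact solve would take 2 here)
        dx = cg(J.T @ J, -J.T @ r, iters=1)
        x = x + dx
        print(outer, x, 0.5 * r @ r)

The printed cost still falls at every outer iteration even though each inner problem is solved inexactly, which is the behaviour the paper's convergence conditions formalize.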

Relevance: 80.00%

Abstract:

In rapid scan Fourier transform spectrometry, we show that the noise in the wavelet coefficients resulting from the filter bank decomposition of the complex insertion loss function is linearly related to the noise power in the sample interferogram by a noise amplification factor. By maximizing an objective function composed of the power of the wavelet coefficients divided by the noise amplification factor, optimal feature extraction in the wavelet domain is performed. The performance of a classifier based on the output of a filter bank is shown to be considerably better than that of a Euclidean distance classifier in the original spectral domain. An optimization procedure results in a further improvement of the wavelet classifier. The procedure is suitable for enhancing the contrast or classifying spectra acquired by either continuous wave or THz transient spectrometers, as well as for increasing the dynamic range of THz imaging systems. © 2003 Optical Society of America.
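
A toy version of the power-over-noise-amplification score can be written in a few lines (using a plain Haar filter bank and a synthetic signal rather than the paper's decomposition): white noise of unit variance in the input reappears in a sub-band scaled by the product of the squared filter norms along the path to that band, and each band is scored by its coefficient power divided by that factor:

    import numpy as np

    rng = np.random.default_rng(3)
    h = np.array([1.0, 1.0]) / 2.0          # (unnormalized) Haar low-pass
    g = np.array([1.0, -1.0]) / 2.0         # (unnormalized) Haar high-pass

    def analysis_step(x, filt):
        return np.convolve(x, filt, mode='valid')[::2]   # filter then downsample

    signal = np.sin(2 * np.pi * 4 * np.arange(256) / 256)
    x = signal + 0.2 * rng.standard_normal(256)

    bands, amp, low = [], [], x
    for level in range(4):
        hi = analysis_step(low, g)
        low = analysis_step(low, h)
        # White input noise of unit variance has variance sum(g**2)*sum(h**2)**level
        # in a band reached through `level` low-pass steps plus one high-pass step.
        bands.append(hi)
        amp.append(np.sum(g ** 2) * np.sum(h ** 2) ** level)

    scores = [np.mean(b ** 2) / a for b, a in zip(bands, amp)]
    print("band scores (power / noise amplification):", np.round(scores, 3))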

Relevance: 80.00%

Abstract:

A beamforming algorithm is introduced based on a general objective function that approximates the bit error rate for wireless systems with binary phase shift keying and quadrature phase shift keying modulation schemes. The proposed minimum approximate bit error rate (ABER) beamforming approach does not rely on a Gaussian assumption for the channel noise, and is therefore also applicable when the channel noise is non-Gaussian. The simulation results show that the proposed minimum ABER solution improves on the standard minimum mean square error beamforming solution in terms of a smaller achievable system bit error rate.
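
For BPSK the approximate-BER idea can be sketched as follows (a real-valued toy array model and an off-the-shelf simplex optimizer; the paper's actual estimator and adaptation scheme are not reproduced here): the objective is the Q-function of each training sample's decision margin, averaged over the block:

    import numpy as np
    from scipy.special import erfc
    from scipy.optimize import minimize

    rng = np.random.default_rng(7)
    Q = lambda u: 0.5 * erfc(u / np.sqrt(2.0))

    M, N, sigma = 4, 400, 0.5                      # sensors, training samples, noise std
    steer = lambda theta: np.cos(np.pi * np.arange(M) * np.sin(theta))  # toy real array
    b = rng.choice([-1.0, 1.0], size=N)            # desired user's BPSK symbols
    i = rng.choice([-1.0, 1.0], size=N)            # co-channel interferer
    X = (np.outer(b, steer(0.0)) + np.outer(i, steer(0.5))
         + sigma * rng.standard_normal((N, M)))

    def aber(w):
        # average Q-function of the normalized per-sample decision margin
        margin = b * (X @ w) / (sigma * np.linalg.norm(w))
        return np.mean(Q(margin))

    w0 = X.T @ b / N                               # matched-filter starting point
    res = minimize(aber, w0, method='Nelder-Mead')
    print("approximate BER at optimum:", aber(res.x))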

Relevance: 80.00%

Abstract:

Genetic algorithms (GAs) have been introduced into site layout planning, as reported in a number of studies. In these studies, the objective functions were defined so as to employ the GAs in searching for the optimal site layout. However, few studies have been carried out to investigate the actual closeness of relationships between site facilities; it is these relationships that ultimately govern the site layout. This study has determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings on these factors and the corresponding closeness indices between each pair of facilities, a closeness relationship has been deduced. Two contemporary mathematical approaches, fuzzy logic theory and an entropy measure, were adopted in finding these results in order to minimize the uncertainty and vagueness of the collected data and improve the quality of the information. GAs were then applied to search for the optimal site layout in a medium-size government project using the GeneHunter software. The objective function involved minimizing the total travel distance. An optimal layout was obtained within a short time. This reveals that the application of GAs to site layout planning is highly promising and efficient.
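
The objective being minimized has a simple shape: assign facilities to candidate slots so that closeness-weighted travel distances are as small as possible. The sketch below uses invented facilities, coordinates and closeness indices, and brute-forces a tiny instance where the study used a GA (GeneHunter):

    import itertools

    facilities = ['office', 'storage', 'workshop', 'gate', 'plant']
    slots = [(0, 0), (0, 20), (20, 0), (20, 20), (40, 10)]     # site coordinates (m)
    closeness = {('office', 'gate'): 0.9, ('storage', 'workshop'): 0.8,
                 ('workshop', 'plant'): 0.7, ('office', 'storage'): 0.3}

    def cost(assignment):          # assignment: facility -> slot index
        total = 0.0
        for (f1, f2), w in closeness.items():
            (x1, y1), (x2, y2) = slots[assignment[f1]], slots[assignment[f2]]
            total += w * (abs(x1 - x2) + abs(y1 - y2))          # rectilinear distance
        return total

    best = min((dict(zip(facilities, perm))
                for perm in itertools.permutations(range(len(slots)))),
               key=cost)
    print(best, round(cost(best), 1))

With realistically many facilities the permutation space explodes factorially, which is exactly why a GA over permutation encodings is used instead of enumeration.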

Relevance: 80.00%

Abstract:

An alternative blind deconvolution algorithm for white-noise-driven minimum-phase systems is presented and verified by computer simulation. This algorithm uses a cost function based on a novel idea, variance approximation and series decoupling (VASD), and suggests that not all autocorrelation function values are necessary to implement blind deconvolution.
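
The VASD cost function itself is not given in this abstract, so the sketch below substitutes the classical route to the same end, labeled as such: for a white-noise-driven minimum-phase channel, a linear-prediction (whitening) filter built from only the first few autocorrelation lags approximately inverts the channel, which also illustrates why not all autocorrelation values are needed:

    import numpy as np

    rng = np.random.default_rng(5)
    s = rng.standard_normal(5000)                   # white driving noise
    channel = [1.0, -0.9, 0.2]                      # minimum phase (zeros at 0.4, 0.5)
    x = np.convolve(s, channel)[:5000]              # observed signal

    p = 8                                           # predictor order: lags 0..p only
    r = np.array([x[:5000 - k] @ x[k:] for k in range(p + 1)]) / 5000
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:])                   # normal (Yule-Walker) equations
    inverse = np.concatenate(([1.0], -a))           # prediction-error filter

    recovered = np.convolve(x, inverse)[:5000]
    print("correlation with true input:", np.corrcoef(recovered, s)[0, 1].round(3))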

Relevance: 80.00%

Abstract:

Numerical weather prediction (NWP) centres use numerical models of the atmospheric flow to forecast future weather states from an estimate of the current state. Variational data assimilation (VAR) is commonly used to determine an optimal state estimate that minimizes the errors between observations of the dynamical system and model predictions of the flow. The rate of convergence of the VAR scheme and the sensitivity of the solution to errors in the data are dependent on the condition number of the Hessian of the variational least-squares objective function. The traditional formulation of VAR is ill-conditioned and hence leads to slow convergence and an inaccurate solution. In practice, operational NWP centres precondition the system via a control variable transform to reduce the condition number of the Hessian. In this paper we investigate the conditioning of VAR for a single, periodic, spatially-distributed state variable. We present theoretical bounds on the condition number of the original and preconditioned Hessians and hence demonstrate the improvement produced by the preconditioning. We also investigate theoretically the effect of observation position and error variance on the preconditioned system and show that the problem becomes more ill-conditioned with increasingly dense and accurate observations. Finally, we confirm the theoretical results in an operational setting by giving experimental results from the Met Office variational system.
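
The contrast can be reproduced numerically in a few lines. The toy below (made-up variances and length-scale, on a periodic domain as in the paper, but otherwise unrelated to the Met Office system) compares the condition number of the standard variational Hessian S = B^-1 + H^T R^-1 H with its control-variable-transform preconditioning I + B^1/2 H^T R^-1 H B^1/2:

    import numpy as np

    n, sigma_b, L = 40, 1.0, 5.0
    d = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    d = np.minimum(d, n - d)                              # periodic domain
    B = sigma_b ** 2 * np.exp(-0.5 * (d / L) ** 2)        # Gaussian correlations
    B += 1e-8 * np.eye(n)                                 # numerical regularization

    H = np.eye(n)[::4]                                    # observe every 4th point
    R = 0.01 * np.eye(H.shape[0])                         # accurate observations

    S = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
    w, V = np.linalg.eigh(B)
    w = np.maximum(w, 1e-10)                              # guard tiny roundoff
    B_half = V @ np.diag(np.sqrt(w)) @ V.T
    S_pre = np.eye(n) + B_half @ H.T @ np.linalg.inv(R) @ H @ B_half

    print("cond(original)      :", f"{np.linalg.cond(S):.3e}")
    print("cond(preconditioned):", f"{np.linalg.cond(S_pre):.3e}")

Increasing the observation density (taking more rows of the identity in H) or shrinking the observation error variance in R worsens the conditioning of the preconditioned system, mirroring the paper's theoretical finding.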

Relevance: 80.00%

Abstract:

A diverse body of empirical literature recognizes that investment can influence tenure security, yet this phenomenon has rarely been examined analytically. This paper develops a theoretical model that demonstrates explicitly the conditions under which the probability of eviction is endogenous to investment undertaken on illegally encroached land. By explicitly accommodating the government's objective function and its ability to commit credibly to an eviction policy, the model reveals why both those farmers who under-invest and those who raise their investment levels to improve tenure security may be behaving rationally. Indeed, both types of behaviour are accommodated within a single model.
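
A purely numerical caricature of the mechanism (all functional forms and numbers below are invented, not the paper's model) contrasts a farmer facing a fixed eviction risk with one whose investment lowers that risk:

    import numpy as np

    # A farmer on encroached land picks investment I to maximize expected
    # payoff (1 - p) * f(I) - I, where p is the eviction probability.
    I = np.linspace(0.0, 5.0, 501)
    f = 4.0 * np.sqrt(I)                         # output from investment
    p_exog = 0.4 + 0.0 * I                       # eviction risk ignores investment
    p_endog = 0.4 / (1.0 + I)                    # investment deters eviction

    for label, p in [("exogenous risk ", p_exog), ("endogenous risk", p_endog)]:
        payoff = (1.0 - p) * f - I
        print(label, "optimal I =", I[payoff.argmax()].round(2))

With these numbers the farmer facing endogenous risk rationally invests roughly three times as much, investing partly to buy tenure security, which is one of the two behaviours the model rationalizes.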

Relevance: 80.00%

Abstract:

This paper considers variations of a neuron pool selection method known as the Affordable Neural Network (AfNN). A saliency measure, based on the second derivative of the objective function, is proposed to assess the ability of a trained AfNN to provide neuronal redundancy. The discrepancies between the various affordability variants are explained by correlating unique sub-group selections with relevant saliency variations. Overall, this study shows that the method by which neurons are selected from a pool is more relevant to how salient individual neurons are than is the frequency with which a particular neuron is used during training. The findings herein are relevant not only to providing an analogy to brain function but also to optimizing the way a neural network using the affordability method is trained.
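
Second-derivative saliency itself is easy to demonstrate. In the sketch below (a toy one-hidden-layer network with random stand-in weights; the AfNN paper's exact formula is not reproduced), each hidden neuron's saliency is estimated as the curvature of the loss with respect to its outgoing weight, via central differences:

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((200, 3))
    y = np.tanh(X @ np.array([1.0, -2.0, 0.5]))            # toy regression target

    W1 = rng.standard_normal((3, 8)) * 0.5                 # stand-in "trained" weights
    w2 = rng.standard_normal(8) * 0.5

    def loss(w2):
        return 0.5 * np.mean((np.tanh(X @ W1) @ w2 - y) ** 2)

    eps, base = 1e-4, loss(w2)
    saliency = np.empty(8)
    for j in range(8):                                     # one hidden neuron at a time
        e = np.zeros(8); e[j] = eps
        # central second difference approximates d2E/dw2 for neuron j's weight
        saliency[j] = (loss(w2 + e) - 2 * base + loss(w2 - e)) / eps ** 2

    print("per-neuron saliency (d2E/dw2):", saliency.round(3))

Neurons with near-zero curvature can be dropped with little effect on the loss, which is the redundancy notion the saliency measure is meant to capture.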

Relevance: 80.00%

Abstract:

4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis, the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised 'inner-loop' objective function which, upon convergence, updates the solution of the non-linear 'outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data; it can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the error variance balance, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
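
The window-length sensitivity can be probed with a scalar toy (made-up variances; a sketch in the spirit of the thesis's experiments, not a reproduction). For the state estimation formulation with states x_0, ..., x_N, the Hessian is L^T D^-1 L + H^T R^-1 H, where L couples consecutive states through the model and D = diag(B, Q, ..., Q):

    import numpy as np

    def wc4dvar_cond(N, m=0.98, sig_b=1.0, sig_q=0.1, sig_o=0.3):
        n = N + 1                                     # states x_0 ... x_N
        L = np.eye(n) - m * np.eye(n, k=-1)           # rows: x_0 - x_b, x_i - m*x_{i-1}
        D_inv = np.diag([1 / sig_b**2] + [1 / sig_q**2] * N)
        H = np.eye(n)                                 # observe every state
        R_inv = (1 / sig_o**2) * np.eye(n)
        S = L.T @ D_inv @ L + H.T @ R_inv @ H         # state-formulation Hessian
        return np.linalg.cond(S)

    for N in (10, 50, 100, 200):                      # assimilation window lengths
        print(f"window length {N:4d}: cond = {wc4dvar_cond(N):.3e}")

Varying sig_b, sig_q and sig_o in the same snippet probes the error variance balances whose role the thesis's condition-number bounds make explicit.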