1000 results for "Solução de problema"
Abstract:
In 1985, Mehra and Prescott raised a question that has yet to receive a satisfactory answer: the risk premium of American equities is far larger than can be explained by the "neoclassical financial economics paradigm" represented by the C-CAPM. Since then, this unsolved problem has been known as the "Equity Premium Puzzle" (EPP). The puzzle stimulated a series of articles, dissertations, and theses that tried to fit intertemporal expected-utility models to financial market data. Within this context, this thesis seeks to (i) review the historical evolution of the theory of intertemporal utility maximization models, (ii) analyze the key assumptions and concepts of these models, (iii) propose a new model capable of solving the EPP, (iv) apply the proposed model to annual historical data from 1929 to 2004, and (v) validate the logic of the model through the Mehra-Prescott and Hansen-Jagannathan methodologies. The thesis argues that the studies developed so far have tried to explain the dynamics of a highly sophisticated financial market through a model of a non-monetary, subsistence economy. Its contribution therefore consists in changing the subsistence-economy assumption, allowing the disposable income of the private sector not to be consumed in full but also to be saved. Assuming that people derive satisfaction (utility) both from current consumption and from current saving (which is future consumption), it is deduced that the marginal utility of consuming equals the marginal utility of saving in each and every period. On this basis, the marginal utility of consumption is replaced by the marginal utility of saving within the basic C-CAPM model.
To reinforce the idea that the model of this thesis uses savings data instead of consumption data, it is called the Saving-CAPM, or S-CAPM, throughout the work. The new model proved capable of solving the EPP under both the Mehra-Prescott and the Hansen-Jagannathan approaches.
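To make the C-CAPM logic behind the puzzle concrete, the sketch below prices a consumption claim under CRRA utility for a hypothetical i.i.d. two-state consumption-growth process. All numbers (growth states, probabilities, β, γ) are illustrative assumptions, not the Mehra-Prescott calibration or the thesis' data; the point is only that the model-implied premium comes out small.

```python
import numpy as np

# Hypothetical i.i.d. two-state consumption growth (illustrative values).
g = np.array([1.054, 0.982])   # gross consumption growth in the two states
p = np.array([0.5, 0.5])       # state probabilities
beta, gamma = 0.95, 2.0        # discount factor, relative risk aversion

m = beta * g ** (-gamma)            # pricing kernel (stochastic discount factor)
Rf = 1.0 / (p @ m)                  # gross risk-free rate: 1 / E[m]

x = beta * (p @ g ** (1.0 - gamma)) # beta * E[g^(1 - gamma)]
w = x / (1.0 - x)                   # constant price-dividend ratio of the claim
Re = (1.0 + w) / w * (p @ g)        # expected gross equity return

premium = Re - Rf                   # model-implied equity premium (tiny)
print(f"risk-free {Rf:.4f}, equity {Re:.4f}, premium {premium * 100:.2f} pp")
```

For moderate risk aversion the implied premium is a fraction of a percentage point, far below the historical several-point premium, which is the puzzle in a nutshell.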
Abstract:
This work obtains a hybrid solution for the energy-dependent Fokker-Planck equation, widely used in ion implantation problems. The idea consists in applying the Laplace transform in the energy variable and a finite-difference scheme in the spatial and angular variables of the equation. This procedure yields a symbolic matrix problem in the transformed energy. To solve the system, the Laplace inversion of the matrix (sI + A) is carried out, where s is a complex parameter, I the identity matrix, and A a square matrix generated by the discretization of the spatial and angular variables. Since A is not diagonalizable, this difficulty is circumvented by decomposing the matrix into the sum of two others, one of which is diagonalizable. An iterative inversion method is then generated, similar to the fixed-source method combined with the diagonalization method, which yields the particle flux of the system. From this, the energy deposited into the electronic and nuclear systems of the target can be determined. To validate the results, the implantation of B ions into Si is simulated over an energy range from 1 keV to 50 MeV, and the results are compared with numerical simulations generated by the SRIM2003 software.
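The splitting idea can be sketched on a toy system: solve (sI + A)x = b by writing A = D + N with D diagonalizable and iterating on (sI + D)x_{k+1} = b − N x_k, in the spirit of a fixed-source iteration. The 3×3 matrix below is a generic defective (non-diagonalizable) example and s is taken real for simplicity; none of it is the discretized Fokker-Planck operator of the work.

```python
import numpy as np

# Toy defective matrix (a Jordan-like block), not the actual discretization.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 4.0, 1.0],
              [0.0, 0.0, 3.0]])
b = np.array([1.0, 2.0, 3.0])
s = 2.0                              # real stand-in for the Laplace parameter

D = np.diag(np.diag(A))              # trivially diagonalizable part
N = A - D                            # remainder (here nilpotent)

x = np.zeros_like(b)
for _ in range(200):                 # fixed-source-style iteration
    x = np.linalg.solve(s * np.eye(3) + D, b - N @ x)

direct = np.linalg.solve(s * np.eye(3) + A, b)
print(np.allclose(x, direct))        # iterate agrees with the direct solve
```

The fixed point of the iteration satisfies (sI + D)x + Nx = b, i.e. (sI + A)x = b, so convergence of the loop reproduces the desired inverse action without ever diagonalizing A itself.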
Abstract:
This work presents a solution to an abstract Cauchy problem. Essentially, an abstract formulation is given for certain types of nonlinear evolution partial differential equations in Nikol'skii spaces; these spaces enjoy good regularity properties and compact embedding results, and in a certain sense are intermediate between Hölder spaces and Sobolev spaces. Applying the Galerkin method, global existence results for weak solutions are proved, as well as the existence of weak solutions with the reproduction property. Imposing further hypotheses on the operators involved, uniqueness of weak solutions is demonstrated.
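The projection idea behind the Galerkin method can be sketched on a linear model problem: −u″ = f on (0, π) with u(0) = u(π) = 0, using a sine basis. This is only an illustration of the method's structure; the thesis applies Galerkin to nonlinear evolution equations in Nikol'skii spaces, which this toy elliptic problem does not model.

```python
import numpy as np

x = np.linspace(0.0, np.pi, 2001)
dx = x[1] - x[0]
f = np.sin(x) + np.sin(3 * x)         # chosen so the exact solution is known

def integrate(y):
    """Trapezoidal rule on the uniform grid."""
    return (y[0] + y[-1]) * 0.5 * dx + y[1:-1].sum() * dx

K = 5                                  # basis functions sin((k+1)x)
phi = np.array([np.sin((k + 1) * x) for k in range(K)])
dphi = np.array([(k + 1) * np.cos((k + 1) * x) for k in range(K)])

# Weak form: stiffness A_ij = (phi_i', phi_j'), load b_i = (f, phi_i).
A = np.array([[integrate(dphi[i] * dphi[j]) for j in range(K)] for i in range(K)])
b = np.array([integrate(f * phi[i]) for i in range(K)])

c = np.linalg.solve(A, b)              # Galerkin coefficients
u_h = c @ phi                          # discrete solution
u_exact = np.sin(x) + np.sin(3 * x) / 9.0
print(np.max(np.abs(u_h - u_exact)))   # small discretization error
```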
Abstract:
This dissertation addresses the central subject "Nutrition of the Lower Classes in Brazil," focusing on the social problem and on the market opportunity it represents for the food industry. Having defined the objectives, I now turn to the analysis of the problem, which aims to characterize malnutrition and to identify effective and potential measures for its solution.
Abstract:
The purpose of this work is to present the theoretical basis of the problem of learning from examples, following refs. [14], [15], and [16]. Learning from examples can be viewed as the regression problem of approximating a multivalued function from a sparse data set. This problem is ill-posed, and the classical way of solving it is through regularization theory. Classical regularization theory, as considered here, formulates the regression problem as the variational problem of finding the function f that minimizes the functional Q[f] = (1/n) Σᵢ₌₁ⁿ (yᵢ − f(xᵢ))² + λ‖f‖²_K, where ‖f‖²_K is the norm in a special Hilbert space, called a Reproducing Kernel Hilbert Space (RKHS) H, defined by the positive function K, n is the number of sample points, and λ is the regularization parameter. Under general conditions, the solution of the equation is given by f(x) = Σᵢ₌₁ⁿ cᵢ K(x, xᵢ). The theory presented in this work is in fact the foundation of a more general theory that justifies regularized functionals for learning from an infinite data set, and can be used to extend the classical regularization framework considerably, effectively combining a functional-analytic perspective with modern advances in probability theory and statistics.
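The minimizer above is kernel ridge regression: with a Gram matrix G = [K(xᵢ, xⱼ)], the coefficients solve (G + nλI)c = y. The sketch below implements this with a Gaussian kernel on synthetic data; the kernel choice, width, λ, and data are illustrative assumptions, not taken from the referenced works.

```python
import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(0.0, 1.0, 30)
ys = np.sin(2 * np.pi * xs) + 0.1 * rng.standard_normal(30)  # noisy samples

def K(a, b, width=0.1):
    """Gaussian (RBF) kernel, one positive-definite choice of K."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * width ** 2))

lam = 1e-3
n = len(xs)
G = K(xs, xs)                                    # Gram matrix K(x_i, x_j)
c = np.linalg.solve(G + n * lam * np.eye(n), ys) # (G + n*lambda*I) c = y

def f(x):
    """Regularized estimate f(x) = sum_i c_i K(x, x_i)."""
    return K(np.atleast_1d(x), xs) @ c

grid = np.linspace(0.0, 1.0, 200)
err = np.max(np.abs(f(grid) - np.sin(2 * np.pi * grid)))
print(f"max deviation from the noiseless target: {err:.3f}")
```

Note the n in (G + nλI): it comes from the 1/n factor in front of the empirical error term of Q[f].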
Abstract:
Combinatorial optimization is a fundamental area for companies seeking competitive advantages in diverse productive sectors, and the Asymmetric Traveling Salesman Problem, ranked among the most important problems of this area because it belongs to the NP-hard class and has diverse practical applications, has drawn increasing interest from researchers in the development of ever more efficient metaheuristics, such as memetic algorithms: evolutionary algorithms that combine genetic operators with a local search procedure. This work explores the technique of viral infection in a memetic algorithm, where infection replaces the mutation operator to obtain fast evolution or extinction of species (KANOH et al., 1996), providing a means of accelerating and improving the solution. To this end, four variants of viral infection applied to memetic algorithms were developed for the resolution of the Asymmetric Traveling Salesman Problem, in which agent and virus undergo a symbiosis process that favored the attainment of a hybrid, computationally viable evolutionary algorithm.
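The skeleton of such an algorithm can be sketched as follows. The "viral infection" operator here is a simplified stand-in (it transplants a short segment from a donor tour into the host in place of mutation), not one of the four variants developed in the work, and the random distance matrix is an illustrative asymmetric instance.

```python
import random

random.seed(1)
N = 12
dist = [[0 if i == j else random.randint(1, 99) for j in range(N)] for i in range(N)]

def cost(tour):
    return sum(dist[tour[i]][tour[(i + 1) % N]] for i in range(N))

def local_search(tour):
    """Greedy node-reinsertion descent (a move valid for asymmetric instances)."""
    improved = True
    while improved:
        improved = False
        for i in range(N):
            for j in range(N):
                if i == j:
                    continue
                cand = tour[:i] + tour[i + 1:]
                cand.insert(j, tour[i])
                if cost(cand) < cost(tour):
                    tour, improved = cand, True
    return tour

def infect(host, virus, length=3):
    """Simplified viral infection: splice a donor segment into the host."""
    start = random.randrange(N - length)
    seg = virus[start:start + length]
    rest = [c for c in host if c not in seg]
    cut = random.randrange(len(rest) + 1)
    return rest[:cut] + seg + rest[cut:]

pop = [local_search(random.sample(range(N), N)) for _ in range(8)]
for _ in range(30):
    pop.sort(key=cost)
    virus = pop[0]                      # the best individual acts as the virus
    child = infect(random.choice(pop), virus)
    child = local_search(child)         # memetic step: local search on offspring
    if cost(child) < cost(pop[-1]):
        pop[-1] = child                 # replace the worst individual

best = min(pop, key=cost)
print(cost(best))
```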
Abstract:
We revisit the visibility problem: determining the set of primitives potentially visible in geometry data represented by a data structure such as a mesh of polygons or triangles. We propose a solution for speeding up three-dimensional visualization processing in applications, introducing a lean structure, in the sense of data abstraction and reduction, that can be used in online and interactive applications. The visibility problem is especially important in the 3D visualization of scenes represented by large volumes of data, when it is not worthwhile, or even possible, to keep all the polygons of the scene in memory, which would imply greater rendering time. In these cases, given a position and a viewing direction, the main objective is to determine and load a minimum amount of primitives (polygons) of the scene, to accelerate the rendering step. For this purpose, our algorithm culls primitives using a hybrid paradigm based on three known techniques. The scene is divided into a grid of cells, each cell is associated with the primitives that belong to it, and finally the set of potentially visible primitives is determined. The novelty is the use of the Ja1 triangulation to create the subdivision grid; we chose this structure for its relevant characteristics of adaptivity and algebraic simplicity (ease of calculation). The results show a substantial improvement over the traditional methods applied separately. The method introduced in this work can be used on devices with little or no dedicated CPU processing power, and also for viewing data over the Internet, as in virtual museum applications.
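The cell-bucketing step can be sketched with a uniform grid, a simplified stand-in for the adaptive Ja1-based subdivision of the work. Each cell stores the primitives that fall in it; given an eye position and a view direction, only primitives in cells whose center lies inside a viewing cone are returned. The scene, cell size, and cone angle are all illustrative assumptions.

```python
import math

CELL = 1.0

def cell_of(p):
    return (int(p[0] // CELL), int(p[1] // CELL), int(p[2] // CELL))

# Toy scene: primitives identified by an id and a representative point.
scene = {i: (float(i % 5), float(i // 5 % 5), float(i // 25)) for i in range(100)}
grid = {}
for pid, pt in scene.items():
    grid.setdefault(cell_of(pt), []).append(pid)   # bucket primitives by cell

def potentially_visible(eye, view, half_angle=math.radians(45)):
    """Ids of primitives in cells whose center lies inside the viewing cone."""
    vis = []
    vlen = math.sqrt(sum(c * c for c in view))
    for (cx, cy, cz), pids in grid.items():
        center = (cx + 0.5, cy + 0.5, cz + 0.5)
        d = tuple(center[i] - eye[i] for i in range(3))
        dlen = math.sqrt(sum(c * c for c in d)) or 1e-9
        cosang = sum(d[i] * view[i] for i in range(3)) / (dlen * vlen)
        if cosang >= math.cos(half_angle):
            vis.extend(pids)
    return vis

pvs = potentially_visible(eye=(-1.0, 2.0, 0.0), view=(1.0, 0.0, 0.0))
print(len(pvs), "of", len(scene), "primitives survive culling")
```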
Abstract:
In practically every vertical market and every region of the planet, loyalty marketers have adopted the tactic of recognition and reward to identify, retain, and increase the yield of their customers. Several strategies have been adopted by companies, the most popular being the loyalty program, which runs a loyalty club to manage these rewards. The problem with loyalty programs, however, is that customer identification and the transfer of loyalty points are performed only semi-automatically. With this in mind, this master's work presents an embedded business-automation solution called e-Points. The goal of e-Points is to equip loyalty clubs with fully automated technology to identify customers directly at the point of sale, ensuring greater control over the loyalty of associated members. To this end, we developed a hardware platform with an embedded system and RFID technology to be used in the retailer's PCs, a smart card that accumulates points with every purchase, and a web server that provides services of interest to retailers and to the club's member customers.
Abstract:
This work presents a new and little-explored approach to the simultaneous localization and mapping (SLAM) problem. The purpose is to put a mobile robot to work in an indoor environment: the robot should map the environment and localize itself in the map. The robot used in the tests has an upward-facing camera and encoders on the wheels. The landmarks in the built map are light blobs in the camera images caused by luminaires on the ceiling. The work develops a solution to the SLAM problem based on the Extended Kalman Filter, using an observation model developed for this setting. The tests and software developed to carry out the SLAM experiments are shown in detail.
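The predict/correct cycle underlying such a filter can be sketched in miniature: odometry drives the prediction and a range observation of a fixed landmark drives the correction. This is only a localization sketch against one landmark at a known position with invented noise values; the full EKF SLAM system also augments the state with landmark positions and uses the thesis' own observation model.

```python
import numpy as np

rng = np.random.default_rng(2)
landmark = np.array([4.0, 3.0])     # known landmark position (illustrative)
Q = np.diag([0.02, 0.02])           # motion noise covariance
R = np.array([[0.05]])              # measurement noise covariance

x_true = np.array([0.0, 0.0])
x_est = np.array([0.0, 0.0])
P = np.eye(2) * 0.1                 # state covariance

for _ in range(50):
    u = np.array([0.1, 0.05])                        # commanded displacement
    x_true = x_true + u + rng.multivariate_normal([0, 0], Q)

    # Predict: linear motion model, Jacobian F = I.
    x_est = x_est + u
    P = P + Q

    # Observe the range to the landmark.
    z = np.linalg.norm(landmark - x_true) + rng.normal(0.0, np.sqrt(R[0, 0]))

    # Correct: linearize h(x) = ||landmark - x|| around the estimate.
    d = landmark - x_est
    r = np.linalg.norm(d)
    H = (-d / r).reshape(1, 2)                       # Jacobian of h
    S = H @ P @ H.T + R
    Kg = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x_est = x_est + (Kg @ np.array([z - r])).ravel()
    P = (np.eye(2) - Kg @ H) @ P

print(np.linalg.norm(x_est - x_true))                # residual estimation error
```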
Abstract:
This work, titled "The transcendental arguments: Kant and Hume's problem," has as its main objective to interpret Kant's answer to Hume's problem in light of the conjunction of the themes of causality and induction, which is equivalent to a skeptical-naturalist reading of the latter. In this sense, the initiative complements the treatment given in our master's dissertation, where the same issue was discussed from a merely skeptical reading of what Kant took from Hume's thought, and only causality was examined. Among the specific objectives, we list the following: a) critical philosophy fulfills three basic functions: a founding one, a negative one, and one that defends the practical use of reason, here named defensive; b) the Kantian solution of Hume's problem in the first Critique would fulfill the founding and negative functions of the critique of reason; c) the Kantian treatment of the theme of induction in the other Critiques would fulfill the defensive function of the critique of reason; d) the evidence for Kant's answer to Hume's problem is more consistent when these three functions, or moments, of criticism are satisfied.
The basic structure of the work consists of three parts. In the first, the genesis of Hume's problem, our intention is to reconstruct Hume's problem, analyzing it from the perspective of the two definitions of cause, where the dilution of the first definition into the second corresponds to the psychological reduction of knowledge to probability, following the so-called naturalization of causal relations. In the second, Legality and Causality, it is argued that, considering Hume under the skeptical-naturalist option, Kant is not entitled to respond with the transcendental argument A⊢B of the Second Analogy, a point rooted in the position of contemporary thinkers such as Strawson and Allison. In the third part, Purpose and Induction, it is admitted that Kant responds to Hume on the level of the regulative use of reason, although the development of this test exceeds the limits of the founding function of criticism; this is articulated, in both the Introduction and the Concluding Remarks, by meeting the defensive [and negative] function of criticism. In this context, based on the use of the so-called transcendental arguments that run throughout the critical trilogy, we provide a solution to a recurring issue at several points of our text, concerning the existence and/or necessity of empirical causal laws. In this light, our thesis is that transcendental arguments are an apodictic solution to Hume's skeptical-naturalist problem only when a practical project is at stake in which the interest of reason is ensured, as will, in short, be shown in our final considerations.
Abstract:
The following work interprets and analyzes the problem of induction under a view founded on set theory and probability theory, as a basis for resolving its negative philosophical implications for systems of inductive logic in general. Given the importance of the problem and the relatively recent developments in these fields of knowledge (early 20th century), as well as the visible relations between them and the process of inductive inference, a field of relatively unexplored and promising possibilities has opened up. The key point of the study consists in modeling the information-acquisition process using concepts of set theory, followed by a treatment using probability theory. Throughout the study, two major obstacles to the probabilistic justification were identified: the problem of defining the concept of probability and that of defining rationality, as well as the subtle connection between the two. This finding called for greater care in choosing the criterion of rationality to be considered, in order to facilitate the treatment of the problem through specific situations without losing their original characteristics, so that the conclusions can be extended to classic cases such as the question of whether the sun will continue to rise.
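The sunrise question mentioned above has a classical probabilistic treatment in Laplace's rule of succession: under a uniform prior on the unknown success probability, after n consecutive sunrises the posterior probability of one more is (n + 1)/(n + 2). The sketch below illustrates this one concrete inductive rule, not the dissertation's full set-theoretic framework.

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Posterior predictive P(success) under a uniform (Beta(1,1)) prior."""
    return Fraction(successes + 1, trials + 2)

for n in (1, 10, 1000):
    p = rule_of_succession(n, n)          # n sunrises in n observed days
    print(n, float(p))                    # confidence grows with the evidence
```

After a single observed sunrise the rule gives 2/3; after a thousand, 1001/1002 — the inductive confidence grows with the evidence but never reaches certainty.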
Abstract:
The Car Rental Salesman Problem (CaRS) is a variant of the classical Traveling Salesman Problem, not previously described in the literature, in which the tour of visits can be decomposed into contiguous paths that may be driven in different rental cars. The aim is to determine the Hamiltonian cycle with minimum final cost, considering the cost of the route plus the penalty paid for each exchange of vehicles on the route; this penalty is due to returning the dropped car to its base. This work introduces the general problem and illustrates some examples, also featuring some of its associated variants. An overview of the complexity of this combinatorial problem is outlined to justify its classification in the NP-hard class. A database of instances for the problem is presented, along with the methodology of its constitution. The problem is also the subject of an experimental algorithmic study based on the implementation of six metaheuristic solutions, representing adaptations of the best of the state of the art in heuristic programming. New neighborhoods, construction procedures, search operators, evolutionary agents, and multi-pheromone cooperation are created for this problem. Furthermore, computational experiments and comparative performance tests are conducted on a sample of 60 instances of the created database, aiming to offer an algorithm with an efficient solution for the problem. The results illustrate the best performance, reached by the transgenetic algorithm on all instances of the dataset.
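The objective described above can be made concrete on a tiny instance: the cost of a solution is the tour cost under the car used on each leg, plus a penalty for every car exchange. The 4-city instance, the per-car distance matrices, and the flat return fee below are illustrative simplifications (the real penalty depends on where the car is returned), not data from the benchmark database.

```python
from itertools import permutations, product

dist = {                     # dist[car][i][j]: travel cost with each rental car
    0: [[0, 5, 9, 4], [5, 0, 6, 8], [9, 6, 0, 3], [4, 8, 3, 0]],
    1: [[0, 8, 2, 7], [8, 0, 4, 5], [2, 4, 0, 9], [7, 5, 9, 0]],
}
penalty = 2                  # flat return fee per car exchange (simplified)

def tour_cost(cycle, cars):
    """cycle: city sequence (implicitly returning to start); cars[i]: car on leg i."""
    total = 0
    for i, car in enumerate(cars):
        a, b = cycle[i], cycle[(i + 1) % len(cycle)]
        total += dist[car][a][b]
        if i > 0 and cars[i] != cars[i - 1]:
            total += penalty # car exchanged at city a: pay the return fee
    return total

# Brute force over Hamiltonian cycles and car assignments (tiny instance only).
best = min(
    (tour_cost([0] + list(p), cars), [0] + list(p), cars)
    for p in permutations([1, 2, 3])
    for cars in product((0, 1), repeat=4)
)
print("minimum cost:", best[0])
```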
Abstract:
Owing to the great difficulty of solving Combinatorial Optimization Problems exactly, heuristic methods have been developed, and for many years the performance analysis of these approaches was not carried out in a systematic way. The proposal of this work is to conduct a statistical analysis of heuristic approaches to the Traveling Salesman Problem (TSP). The focus of the analysis is to evaluate the performance of each approach with respect to the computational time required to attain the optimal solution for a given TSP instance. Survival analysis was used, assisted by methods for hypothesis testing of the equality of survival functions. The evaluated approaches were divided into three classes: Lin-Kernighan algorithms, evolutionary algorithms, and particle swarm optimization. In addition to those approaches, the analysis included a memetic algorithm (for symmetric and asymmetric TSP instances) that uses the Lin-Kernighan heuristic as its local search procedure.
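The central survival-analysis quantity can be sketched with a Kaplan-Meier estimate of S(t) = P(time to reach the optimum > t), the function compared across heuristic classes in such a study. The run times below are hypothetical; a censored run (False) is one stopped before the optimum was found.

```python
runs = [(1.0, True), (2.0, True), (2.0, False), (4.0, True),
        (5.0, False), (7.0, True), (9.0, True), (9.0, False)]

def kaplan_meier(data):
    """Return [(t, S(t))] at each observed event (uncensored) time."""
    data = sorted(data)
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        at_risk = n - i                       # runs still going just before t
        events = sum(1 for u, e in data if u == t and e)
        i += sum(1 for u, _ in data if u == t)
        if events:
            s *= 1.0 - events / at_risk       # product-limit step
            curve.append((t, s))
    return curve

for t, s in kaplan_meier(runs):
    print(f"S({t}) = {s:.3f}")
```

Censored runs still count as "at risk" up to the time they were stopped, which is exactly why survival analysis suits time-to-optimum data better than simply averaging the completed runs.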