982 results for Hidden variable theory
Abstract:
The aim of this paper is to propose a performance measure suited to equity mutual funds. The specific characteristics of this type of portfolio lead us to take an approach based on the Capital Market Line (CML), so the total risk of the portfolio (σp) is chosen as the risk measure. Passive and active strategies are introduced into the analysis, which makes it possible to develop a performance measure that, in addition to measuring the return due to effective management, weights it according to the degree of activity assumed by the portfolio under evaluation.
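A minimal numerical sketch of the idea behind a CML-based, total-risk performance measure (the function name, the exact functional form, and all figures here are illustrative assumptions, not the paper's): a fund's performance can be read as its return minus the return the Capital Market Line would demand for the same total risk σp.

```python
# Hypothetical illustration of a CML-based performance measure using total
# risk (sigma_p), as the abstract suggests. All names and numbers are made up.
def cml_performance(r_p, sigma_p, r_f, r_m, sigma_m):
    """Fund return minus the return the CML requires for total risk sigma_p."""
    required = r_f + (r_m - r_f) / sigma_m * sigma_p
    return r_p - required

# Example: a fund earning 9% with 15% volatility; market at 8% with 20%
# volatility; risk-free rate 2%. A positive value means the fund beat the CML.
print(cml_performance(0.09, 0.15, 0.02, 0.08, 0.20))
```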
Abstract:
We study the process by which subordinated regions of a country can obtain a more favourable political status. In our theoretical model a dominant and a dominated region first interact through a voting process that can lead to different degrees of autonomy. If this process fails, both regions engage in a costly political conflict which can only lead to the maintenance of the initial subordination of the region in question or to its complete independence. In the subgame-perfect equilibrium the voting process always leads to an intermediate arrangement acceptable to both parties. Hence, the costly political struggle never occurs. In contrast, in our experiments we observe a large amount of fighting involving high material losses, even in a case in which the possibilities for an arrangement without conflict are very salient. In our experimental environment intermediate solutions are feasible and stable, but purely emotional elements prevent them from being reached.
Abstract:
This paper draws on the work of the philosopher of science Gerald Holton to map some of the IR arguments and debates in an unconventional and more insightful way. From this starting point, it is argued that the formerly all-pervading neorealism-neoinstitutionalism debate has lost its appeal and is attracting less and less interest among scholars. It no longer structures the approach of theoretically oriented authors; at least, not with the habitual intensity. More specifically, we argue that the neo-neo rapprochement, even if it may have demonstrated that international cooperation is possible and relevant in a Realist world, has also impoverished theoretical debate by hiding some of the most significant issues that preoccupied classical transnationalists. Hence, some authors appear to be trying to rescue some of these arguments in an analytical and systematic fashion, opening up a theoretical querelle that may be the next one to deserve attention.
Abstract:
This paper presents an outline of the rationale and theory of the MuSIASEM scheme (Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism). First, three points of the rationale behind the MuSIASEM scheme are discussed: (i) endosomatic and exosomatic metabolism in relation to Georgescu-Roegen's flow-fund scheme; (ii) the bioeconomic analogy of hypercycle and dissipative parts in ecosystems; (iii) the dramatic reallocation of human time and land use patterns across the sectors of a modern economy. Next, a flow-fund representation of the MuSIASEM scheme on three levels (the whole national level, the paid-work-sectors level, and the agricultural-sector level) is illustrated to look at the structure of the human economy in relation to two primary factors: (i) human time - a fund; and (ii) exosomatic energy - a flow. The three-level representation uses extensive and intensive variables simultaneously. Key conceptual tools of the MuSIASEM scheme - mosaic effects and impredicative loop analysis - are explained using the three-level flow-fund representation. Finally, we claim that the MuSIASEM scheme can be seen as a multi-purpose grammar useful for dealing with sustainability issues.
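The core flow-fund accounting can be sketched in a few lines. This is only an illustration of the arithmetic (dividing a flow by a fund at each hierarchical level to obtain an intensive variable); the level names follow the abstract, but every number below is a hypothetical placeholder, not MuSIASEM data.

```python
# Illustrative sketch of the flow-fund ratio used in MuSIASEM: an intensive
# variable (exosomatic metabolic rate, MJ/h) is obtained by dividing a flow
# (exosomatic energy) by a fund (human time) at each level. Numbers are fake.
levels = {
    "whole society": {"energy_MJ": 3.0e12, "human_time_h": 8.76e9},
    "paid work":     {"energy_MJ": 2.4e12, "human_time_h": 1.2e9},
    "agriculture":   {"energy_MJ": 1.5e11, "human_time_h": 1.0e8},
}

def exosomatic_metabolic_rate(level):
    d = levels[level]
    return d["energy_MJ"] / d["human_time_h"]

for name in levels:
    print(f"{name}: {exosomatic_metabolic_rate(name):.1f} MJ/h")
```

Note how the rate rises as one moves from the whole society to a specific compartment: extensive totals shrink, but the intensive variable changes, which is exactly why the scheme tracks both kinds of variable at once.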
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
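The kernel-smoothing step can be illustrated in a few lines. This is a generic Nadaraya-Watson sketch, not the paper's estimator: the data-generating process, bandwidth, and function names are assumptions made for the example.

```python
import numpy as np

# Estimate a conditional moment E[y | x = x0] from a long simulation by
# Nadaraya-Watson kernel smoothing (Gaussian kernel). Illustrative only.
def kernel_conditional_mean(x_sim, y_sim, x0, bandwidth):
    w = np.exp(-0.5 * ((x_sim - x0) / bandwidth) ** 2)  # kernel weights
    return np.sum(w * y_sim) / np.sum(w)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)                 # long simulated path
y = 2.0 * x + rng.normal(size=x.shape)       # true E[y | x] = 2x

# Smoothed estimate at x0 = 1.0; should be close to 2.0.
print(kernel_conditional_mean(x, y, 1.0, 0.1))
```

With such smoothed conditional moments in hand, standard method-of-moments machinery can be applied at each trial parameter value.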
Abstract:
In the literature on risk, one generally assumes that uncertainty is uniformly distributed over the entire working horizon, while the absolute risk-aversion index is negative and constant. From this perspective, risk is totally exogenous, and thus independent of endogenous risks. The classic procedure is "myopic" with regard to potential changes in the future behavior of the agent due to the inherent random fluctuations of the system: the agent's attitude to risk is rigid. Although often criticized, the most widely used hypothesis for the analysis of economic behavior is risk-neutrality. This borderline case must be treated with prudence in a dynamic stochastic context. The traditional measures of risk-aversion are generally too weak for making comparisons between risky situations, given the dynamic complexity of the environment. This can be highlighted in concrete problems in finance and insurance, contexts in which the Arrow-Pratt measures (in the small) give ambiguous results.
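The Arrow-Pratt absolute risk-aversion index mentioned above is A(w) = -u''(w)/u'(w). A quick numerical check (finite differences; the utility function and parameter are chosen only for the example) shows why "constant absolute risk aversion" corresponds to the exponential (CARA) utility u(w) = -exp(-a w), whose index equals a at every wealth level.

```python
import math

# Numerical Arrow-Pratt absolute risk-aversion index via central differences.
def arrow_pratt(u, w, h=1e-4):
    up = (u(w + h) - u(w - h)) / (2 * h)          # u'(w)
    upp = (u(w + h) - 2 * u(w) + u(w - h)) / h**2  # u''(w)
    return -upp / up

a = 0.5
u = lambda w: -math.exp(-a * w)   # CARA utility: A(w) = a for all w

print(arrow_pratt(u, 1.0))   # approximately 0.5
print(arrow_pratt(u, 10.0))  # also approximately 0.5: constant in w
```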
Abstract:
A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete market approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long term debt and invest in short term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find the complete market approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. In these simulations we find no presumption that governments should issue long term debt: policy recommendations can easily be reversed through small perturbations in the specification of shocks or small variations in the maturity of bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management.
Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. allowing for transaction costs, liquidity effects, etc. Until these features are fully incorporated we remain in search of a theory of debt management capable of providing robust policy insights.
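The ill-conditioning argument can be seen in a two-state, two-bond toy example (the returns below are stylized placeholders, not the paper's calibration): when the bonds' returns across states are nearly collinear, the portfolio that replicates even a modest state-contingent payoff involves enormous offsetting long and short positions.

```python
import numpy as np

# Two states (rows), two bonds (columns); the columns are nearly collinear,
# mimicking the highly correlated returns of different maturities.
R = np.array([[1.040, 1.050],
              [1.020, 1.029]])
target = np.ones(2)   # replicate a payoff of 1 in each state

positions = np.linalg.solve(R, target)
print(positions)          # roughly [25, -24]: ~25x the target's size
print(np.linalg.cond(R))  # large condition number drives the blow-up
```

Small perturbations of `R` flip the signs and magnitudes of `positions`, which is the fragility the abstract emphasizes.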
Abstract:
Among the largest resources for biological sequence data is the large amount of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors; when analyzing EST sequences computationally, such errors must therefore be taken into account. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting them while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
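The decoding machinery behind such HMM-based annotation is the Viterbi algorithm. The following is a deliberately tiny two-state toy (coding vs. non-coding, with made-up probabilities), not the paper's mRNA model, just to show how a most-likely state path is recovered from a nucleotide sequence.

```python
import math

# Toy two-state HMM over nucleotides, decoded with Viterbi in log space.
states = ["coding", "noncoding"]
start = {"coding": 0.5, "noncoding": 0.5}
trans = {"coding":    {"coding": 0.9, "noncoding": 0.1},
         "noncoding": {"coding": 0.1, "noncoding": 0.9}}
emit  = {"coding":    {"A": 0.20, "C": 0.30, "G": 0.30, "T": 0.20},
         "noncoding": {"A": 0.30, "C": 0.20, "G": 0.20, "T": 0.30}}

def viterbi(seq):
    V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for x in seq[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((p, V[-1][p] + math.log(trans[p][s])) for p in states),
                key=lambda t: t[1])
            col[s] = score + math.log(emit[s][x])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):      # follow backpointers to the start
        path.append(ptr[path[-1]])
    return list(reversed(path))

# GC-rich prefix is labeled coding, AT-rich suffix non-coding.
print(viterbi("GCGCGCATATAT"))
```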
Abstract:
The first main result of the paper is a criterion for a partially commutative group G to be a domain. It allows us to reduce the study of algebraic sets over G to the study of irreducible algebraic sets, and reduce the elementary theory of G (of a coordinate group over G) to the elementary theories of the direct factors of G (to the elementary theory of coordinate groups of irreducible algebraic sets). Then we establish normal forms for quantifier-free formulas over a non-abelian directly indecomposable partially commutative group H. Analogously to the case of free groups, we introduce the notion of a generalised equation and prove that the positive theory of H has quantifier elimination and that arbitrary first-order formulas lift from H to H * F, where F is a free group of finite rank. As a consequence, the positive theory of an arbitrary partially commutative group is decidable.
Abstract:
We present a solution to the problem of defining a counterpart in Algebraic Set Theory of the construction of internal sheaves in Topos Theory. Our approach is general in that we consider sheaves as determined by Lawvere-Tierney coverages, rather than by Grothendieck coverages, and assume only a weakening of the axioms for small maps originally introduced by Joyal and Moerdijk, thus subsuming the existing topos-theoretic results.
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We extend the theory of Quillen adjunctions by combining ideas of homotopical algebra and of enriched category theory. Our results describe how the formulas for homotopy colimits of Bousfield and Kan arise from general formulas describing the derived functor of the weighted colimit functor.
Abstract:
Report on a scientific sojourn at James Cook University, Australia, from June to December 2007. Free convection in enclosed spaces is found widely in natural and industrial systems. It is a topic of primary interest because in many systems it provides the largest resistance to heat transfer in comparison with other heat transfer modes. In such systems the convection is driven by a density gradient within the fluid, which is usually produced by a temperature difference between the fluid and the surrounding walls. In the oil industry, oil, which has a high Prandtl number, is usually stored and transported in large tanks at temperatures high enough to keep its viscosity, and thus the pumping requirements, at a reasonable level. A temperature difference between the fluid and the walls of the container may give rise to an unsteady buoyancy force and hence to unsteady natural convection. In the initial period of cooling the natural convection regime dominates over the conduction contribution. As the oil cools down it typically becomes more viscous, and this increase in viscosity inhibits the convection. Eventually the oil viscosity becomes very large and unloading the tank becomes very difficult. For this reason it is of primary interest to be able to predict the cooling rate of the oil. The general objective of this work is to develop and validate a simulation tool able to predict the cooling rates of a high-Prandtl-number fluid, taking the variable viscosity effects into account.
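The coupling between cooling and convection described above can be sketched with a temperature-dependent viscosity law and the Rayleigh number Ra = g·β·ΔT·H³/(ν·α), which measures the strength of buoyancy-driven convection. All parameter values and the Arrhenius-type viscosity law below are illustrative assumptions, not the report's model.

```python
import math

# Arrhenius-type viscosity law: viscosity grows as the oil cools. Fake values.
def viscosity(T, mu_ref=0.05, T_ref=330.0, b=2000.0):   # Pa*s, K
    return mu_ref * math.exp(b * (1.0 / T - 1.0 / T_ref))

# Rayleigh number: lower Ra means weaker natural convection. SI units, fake.
def rayleigh(T, dT=10.0, H=1.0, rho=900.0, g=9.81, beta=7e-4, alpha=8e-8):
    nu = viscosity(T) / rho          # kinematic viscosity
    return g * beta * dT * H**3 / (nu * alpha)

for T in (330.0, 310.0, 290.0):
    print(f"T = {T} K: mu = {viscosity(T):.3f} Pa*s, Ra = {rayleigh(T):.2e}")
```

The printed sequence shows the mechanism in the abstract: as T drops, viscosity rises and Ra falls, so convection weakens and conduction progressively takes over.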