36 results for Likelihood principle
Abstract:
According to the Taylor principle, a central bank should adjust the nominal interest rate by more than one-for-one in response to changes in current inflation. Most of the existing literature supports the view that by following this simple recommendation a central bank can avoid being a source of unnecessary fluctuations in economic activity. The present paper shows that this conclusion is not robust with respect to the modelling of capital accumulation. We use our insights to discuss the desirability of alternative interest rate rules. Our results suggest a reinterpretation of monetary policy under Volcker and Greenspan: the empirically plausible characterization of monetary policy can explain the stabilization of macroeconomic outcomes observed in the early eighties for the US economy. The Taylor principle in itself cannot.
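The more-than-one-for-one response described above can be illustrated with a textbook Taylor rule; the parameter values below are the conventional illustrative ones, not taken from this paper's model.

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_target=2.0,
                phi_pi=1.5, phi_y=0.5):
    """Textbook Taylor rule: nominal rate as a function of current
    inflation and the output gap (all in percent). The Taylor
    principle holds when phi_pi > 0, i.e. when the total response
    to inflation, 1 + phi_pi, exceeds one."""
    return (r_star + inflation
            + phi_pi * (inflation - pi_target)
            + phi_y * output_gap)

# A 1-point rise in inflation raises the nominal rate by
# 1 + phi_pi = 2.5 points, so the real rate rises as well.
base = taylor_rule(2.0, 0.0)   # inflation at target: rate = 4.0
high = taylor_rule(3.0, 0.0)   # inflation 1 point above: rate = 6.5
```

The principle is the condition that `high - base > 1.0`: the nominal rate moves by more than the inflation change itself.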
Abstract:
This paper extends the theory of network competition between telecommunications operators by allowing receivers to derive a surplus from receiving calls (call externality) and to affect the volume of communications by hanging up (receiver sovereignty). We investigate the extent to which receiver charges can lead to an internalization of the call externality. When the receiver charge and the termination (access) charge are both regulated, there exists an efficient equilibrium. Efficiency requires a termination discount. When reception charges are market determined, it is optimal for each operator to set the prices for emission and reception at their off-net costs. For an appropriately chosen termination charge, the symmetric equilibrium is again efficient. Lastly, we show that network-based price discrimination creates strong incentives for connectivity breakdowns, even between equal networks.
Abstract:
Precise estimation of propagation parameters in precipitation media is of interest to improve the performance of communications systems and in remote sensing applications. In this paper, we present maximum-likelihood estimators of specific attenuation and specific differential phase in rain. The model used for obtaining the cited estimators assumes coherent propagation, reflection symmetry of the medium, and Gaussian statistics of the scattering matrix measurements. No assumptions about the microphysical properties of the medium are needed. The performance of the estimators is evaluated through simulated data. Results show negligible estimator bias and variances close to the Cramér–Rao bounds.
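The benchmark used above, estimator variance against the Cramér–Rao bound, can be sketched in the simplest setting: for i.i.d. Gaussian samples with known variance, the ML estimator of the mean is the sample mean, and its variance attains the bound σ²/n. This generic illustration is not the paper's radar-specific estimators.

```python
import random

def ml_mean(samples):
    """ML estimator of the mean of i.i.d. Gaussian data: the sample mean."""
    return sum(samples) / len(samples)

def cramer_rao_bound(sigma, n):
    """CRB on the variance of any unbiased estimator of a Gaussian mean."""
    return sigma ** 2 / n

random.seed(0)
mu, sigma, n = 5.0, 2.0, 100
# Empirical variance of the ML estimator over many replications.
estimates = [ml_mean([random.gauss(mu, sigma) for _ in range(n)])
             for _ in range(2000)]
mean_est = sum(estimates) / len(estimates)
emp_var = sum((e - mean_est) ** 2 for e in estimates) / len(estimates)
# emp_var is close to cramer_rao_bound(sigma, n) = 0.04,
# i.e. the ML estimator is (asymptotically) efficient.
```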
Abstract:
The long-term mean properties of the global climate system and those of turbulent fluid systems are reviewed from a thermodynamic viewpoint. Two general expressions are derived for a rate of entropy production due to thermal and viscous dissipation (turbulent dissipation) in a fluid system. It is shown with these expressions that maximum entropy production in the Earth's climate system suggested by Paltridge, as well as maximum transport properties of heat or momentum in a turbulent system suggested by Malkus and Busse, correspond to a state in which the rate of entropy production due to the turbulent dissipation is at a maximum. Entropy production due to absorption of solar radiation in the climate system is found to be irrelevant to the maximized properties associated with turbulence. The hypothesis of maximum entropy production also seems to be applicable to the planetary atmospheres of Mars and Titan, and perhaps to mantle convection. Lorenz's conjecture on maximum generation of available potential energy is shown to be akin to this hypothesis with a few minor approximations. A possible mechanism by which turbulent fluid systems adjust themselves to the states of maximum entropy production is presented as a self-feedback mechanism for the generation of available potential energy. These results tend to support the hypothesis of maximum entropy production as one that underlies a wide variety of nonlinear fluid systems, including our planet as well as other planets and stars.
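The basic quantity being maximized above can be made concrete with the elementary two-reservoir case: a steady heat flow Q from a hot reservoir at T_h to a cold one at T_c produces entropy at the rate Q(1/T_c − 1/T_h) > 0. The numbers below are illustrative only, not the paper's climate model.

```python
def entropy_production_rate(q, t_hot, t_cold):
    """Entropy production rate (W/K) for a steady heat flow q (W)
    from a reservoir at t_hot (K) to one at t_cold (K):
    the cold reservoir gains entropy faster than the hot one loses it."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

# Illustrative planetary-scale heat transport from warm (300 K)
# to cold (260 K) regions; the rate is strictly positive.
rate = entropy_production_rate(1.0e15, 300.0, 260.0)
```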
Abstract:
The work presented evaluates the statistical characteristics of regional bias and expected error in reconstructions of real positron emission tomography (PET) data from human brain fluoro-deoxyglucose (FDG) studies carried out by the maximum likelihood estimator (MLE) method with a robust stopping rule, and compares them with the results of filtered backprojection (FBP) reconstructions and with the method of sieves. The task of evaluating radioisotope uptake in regions of interest (ROIs) is investigated. An assessment of bias and variance in uptake measurements is carried out with simulated data. Then, by using three different transition matrices with different degrees of accuracy and a components-of-variance model for statistical analysis, it is shown that the characteristics obtained from real human FDG brain data are consistent with the results of the simulation studies.
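MLE reconstruction of emission tomography data is commonly computed with the MLEM (ML expectation-maximization) iteration. Below is a minimal sketch on a toy 2-pixel, 2-detector system with hypothetical numbers; real PET uses large system matrices and, as in the study above, a stopping rule rather than running to convergence.

```python
def mlem_step(x, a, y):
    """One MLEM update for emission tomography:
    x_new[j] = x[j] / sum_i a[i][j] * sum_i a[i][j] * y[i] / (A x)[i],
    where a[i][j] is the probability that an emission in pixel j
    is detected in bin i, and y are the measured counts."""
    n_bins, n_pix = len(a), len(a[0])
    ax = [sum(a[i][j] * x[j] for j in range(n_pix)) for i in range(n_bins)]
    sens = [sum(a[i][j] for i in range(n_bins)) for j in range(n_pix)]
    return [x[j] / sens[j]
            * sum(a[i][j] * y[i] / ax[i] for i in range(n_bins))
            for j in range(n_pix)]

# Toy system matrix and noiseless data from true activity [4, 1].
a = [[0.8, 0.2], [0.2, 0.8]]
y = [0.8 * 4 + 0.2 * 1, 0.2 * 4 + 0.8 * 1]   # [3.4, 1.6]
x = [1.0, 1.0]                                # uniform initial image
for _ in range(200):
    x = mlem_step(x, a, y)
# x converges toward the true activity [4, 1]
```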
Abstract:
We have shown that the mobility tensor for a particle moving through an arbitrary homogeneous stationary flow satisfies generalized Onsager symmetry relations in which the time-reversal transformation should also be applied to the external forces that keep the system in the stationary state. It is then found that the lift forces, responsible for the motion of the particle in a direction perpendicular to its velocity, have different parity than the drag forces.
Abstract:
In this paper we consider a general action principle for mechanics written by means of the elements of a Lie algebra. We study the physical reasons why we have to choose precisely a Lie algebra to write the action principle. By means of such an action principle we work out the equations of motion and a technique to evaluate perturbations in a general mechanics that is equivalent to a general interaction picture. Classical or quantum mechanics come out as particular cases when we realize the Lie algebra by derivations on the algebra of products of functions or of operators, respectively. We then develop the applications of the action principle to classical and quantum mechanics, showing that in the latter case it agrees with Schwinger's action principle. The main contribution of this paper is to introduce a perturbation theory and an interaction picture of classical mechanics on the same footing as in quantum mechanics.
Abstract:
This handbook describes the peer review methodology that was applied at the GODIAC project field studies1. The peer review evaluation method, as initiated by Otto Adang in the Netherlands and further developed in a European football context (Adang & Brown, 2008), involves experienced police officers cooperating with researchers to perform observational field studies to identify good practices and learning points for public order management. The handbook builds on the GODIAC seminars and workshops for the field study members, which took place in September 2010, January 2012 and January 2013. The handbook has been discussed in the project group and in the steering committee. It is primarily written for the GODIAC field study members as background material for understanding the field study process and for clarifying the different responsibilities that enable active participation in the field study. The handbook has been developed during the project period and incorporates learning points and developments of the peer review method. The handbook aims at promoting the use of field studies for the evaluation of policing major events.
Abstract:
The volume is divided into two parts: the first deals with issues related to the police, and the second addresses issues related to demonstrators and protesters. We hope that this volume will provide further insight into issues associated with policing at major events and shed light on the complexity of the organisations, motives, and strategies in play whenever protester groups are involved.
Abstract:
This report summarises the field study results of the project 'Good practice for dialogue and communication as strategic principles for policing political manifestations in Europe' (GODIAC).1 The overall idea was to integrate operative police work, research and training within the field and to build international and institutional networks, ensuring and recognising the responsibilities of the organisers. The purpose of the GODIAC project was to contribute to the development of a European approach to policing political manifestations.
Abstract:
Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights of a weighted-sum fusion of scores by means of a likelihood model, where the maximum likelihood estimation is posed as a linear programming problem. Each score is derived from a GMM classifier working on a different feature extractor. Our experimental results assessed the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of fused scores as the simplex method for linear programming does.
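The fusion step itself is simple once the weights are known; here is a minimal sketch with illustrative scores and weights. The paper's actual contribution, estimating the weights from a likelihood model via linear programming, is not reproduced here.

```python
def fuse_scores(scores, weights):
    """Weighted-sum fusion of per-classifier match scores."""
    assert len(scores) == len(weights)
    return sum(w * s for w, s in zip(weights, scores))

# Scores from three classifiers for one input sample; the weights
# (illustrative values summing to 1) favor the more reliable
# classifiers, in contrast to a uniform weighted sum.
scores = [0.9, 0.4, 0.7]
weights = [0.5, 0.1, 0.4]
fused = fuse_scores(scores, weights)        # 0.77
uniform = fuse_scores(scores, [1 / 3] * 3)  # ~0.667
```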
Abstract:
The restricted maximum likelihood is preferred by many to the full maximum likelihood for estimation with variance component and other random-coefficient models, because the variance estimator is unbiased. It is shown that this unbiasedness is accompanied in some balanced designs by an inflation of the mean squared error. An estimator of the cluster-level variance that is uniformly more efficient than the full maximum likelihood is derived. Estimators of the variance ratio are also studied.
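The ML/REML contrast above is visible already in the simplest i.i.d. case, where REML reduces to the familiar n − 1 divisor; this generic illustration does not reproduce the paper's balanced-design results.

```python
def ml_variance(xs):
    """Full-ML variance estimator: divides by n, biased downward
    because the mean is estimated from the same data."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

def reml_variance(xs):
    """REML variance estimator in the i.i.d. case: divides by n - 1,
    which removes the bias (at the cost of a larger variance)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

xs = [2.0, 4.0, 6.0, 8.0]
# mean = 5, squared deviations sum to 9 + 1 + 1 + 9 = 20,
# so ml_variance(xs) = 20/4 = 5.0 and reml_variance(xs) = 20/3.
```

The paper's point is that the unbiased choice is not automatically the better one: in some balanced designs the n-divisor (ML) estimator has the smaller mean squared error.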
Abstract:
Motivation: The comparative analysis of gene gain and loss rates is critical for understanding the role of natural selection and adaptation in shaping gene family sizes. Studying complete genome data from closely related species allows accurate estimation of gene family turnover rates. Current methods and software tools, however, are not well designed for dealing with certain kinds of functional elements, such as microRNAs or transcription factor binding sites. Results: Here, we describe BadiRate, a new software tool to estimate family turnover rates, as well as the number of elements at internal phylogenetic nodes, by likelihood-based methods and parsimony. It implements two stochastic population models, which provide the appropriate statistical framework for testing hypotheses, such as lineage-specific gene family expansions or contractions. We have assessed the accuracy of BadiRate by computer simulations, and have also illustrated its functionality by analyzing a representative empirical dataset.
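The parsimony flavor of the turnover estimate can be sketched in its smallest case, a two-species tree with hypothetical tip counts (this toy is unrelated to BadiRate's actual stochastic models): the minimum number of gains plus losses explaining tip family sizes a and b is |a − b|, attained by any ancestral size between them.

```python
def turnover_for_ancestor(a, b, n):
    """Gains + losses on a two-species tree if the ancestral
    family size is n and the tip family sizes are a and b."""
    return abs(a - n) + abs(b - n)

def min_turnover(a, b):
    """Minimum gains + losses over all ancestral sizes: any n with
    min(a, b) <= n <= max(a, b) gives |a - n| + |b - n| = |a - b|."""
    return abs(a - b)

# Tip family sizes 5 and 2: minimum turnover is 3, attained for
# every ancestral size from 2 through 5.
best = min(turnover_for_ancestor(5, 2, n) for n in range(10))
```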