973 results for Trimmed likelihood


Relevance: 10.00%

Publisher:

Abstract:

Most fisheries agencies conduct biological and economic assessments independently. This separation may lead to situations in which economists reject management plans proposed by biologists. The objective of this study is to show how to find optimal strategies that may satisfy both biologists' and economists' criteria. In particular, we characterize optimal fishing trajectories that maximize the present value of a discounted economic indicator while taking into account the age structure of the population, as in stock assessment methodologies. This approach is applied to the Northern Stock of Hake. Our main empirical findings are: i) the optimal policy may be far from any of the classical scenarios proposed by biologists; ii) the more the future is discounted, the higher the likelihood of contradictions between the scenarios proposed by biologists and the conclusions of the economic analysis; iii) optimal management reduces the risk of the stock falling below precautionary levels, especially if the future is not discounted too much; and iv) the optimal stationary fishing rate may differ substantially depending on the economic indicator used as a reference.
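
Finding (ii) is easy to see in miniature: under discounting, a trajectory that fishes hard early can yield a higher present value than one that rebuilds the stock, even when the rebuilding path yields more in total. The sketch below illustrates this with invented profit streams and discount rates; it is not the study's age-structured model.

```python
# Hypothetical illustration: present value of an annual profit stream under
# discounting. All figures are invented for the sketch.

def present_value(profits, discount_rate):
    """Discounted sum of a stream of annual profits."""
    beta = 1.0 / (1.0 + discount_rate)
    return sum(p * beta**t for t, p in enumerate(profits))

# A trajectory that sacrifices early catch to rebuild the stock...
rebuild = [10, 20, 40, 40, 40]
# ...versus one that fishes hard now and depletes the stock.
deplete = [40, 40, 20, 10, 10]

for r in (0.0, 0.10, 0.30):
    print(r, present_value(rebuild, r), present_value(deplete, r))
```

With no discounting the rebuilding path dominates (150 vs. 120); at a high enough discount rate the ordering flips, which is the kind of contradiction between biological scenarios and economic analysis the abstract points to.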

This paper estimates a standard version of the New Keynesian monetary (NKM) model under alternative specifications of the monetary policy rule, using U.S. and Eurozone data. The estimation procedure is a classical method based on the indirect inference principle, with an unrestricted VAR as the auxiliary model. On the one hand, the proposed estimation method overcomes some of the shortcomings of using a structural VAR as the auxiliary model to identify the impulse response that defines the minimum-distance estimator implemented in the literature. On the other hand, by following a classical approach we can further assess the estimation results found in recent papers that follow a maximum-likelihood Bayesian approach. The results show that some structural parameter estimates are quite sensitive to the specification of monetary policy. Moreover, for the U.S., the fit of the NKM model under an optimal monetary plan is much worse than its fit under a forward-looking Taylor rule. In contrast, for the Eurozone the best fit is obtained under a backward-looking Taylor rule, but the improvement is rather small relative to either a forward-looking Taylor rule or an optimal plan.
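
The indirect inference principle can be sketched in a few lines: simulate the structural model, fit the auxiliary model to both simulated and observed data, and choose the structural parameters that make the two auxiliary estimates agree. The toy below uses an AR(1) as a stand-in for the NKM simulator and an OLS slope as a one-variable "VAR"; everything in it is illustrative, not the paper's estimator.

```python
import numpy as np

def simulate(rho, n=2000, seed=1):
    """Stand-in structural model: an AR(1). In the paper this would be the
    simulated NKM model; here it is the simplest dynamic model that works."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps[t]
    return x

def auxiliary(x):
    """Auxiliary-model statistic: the OLS AR(1) slope (a one-variable 'VAR')."""
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))

# 'Observed' data from a true rho = 0.8, generated with a different seed.
b_obs = auxiliary(simulate(0.8, seed=0))

# Indirect inference: pick the structural rho whose *simulated* auxiliary
# statistic (simulation draws held fixed across candidates) is closest
# to the observed one.
grid = np.linspace(0.5, 0.95, 46)
rho_hat = min(grid, key=lambda r: (auxiliary(simulate(r, seed=1)) - b_obs) ** 2)
```

Holding the simulation draws fixed across candidate parameters is what makes the criterion smooth in the structural parameter; the recovered `rho_hat` lands close to the true 0.8.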

Using the ECHP, we explored the determinants of having a first child in Spain. Our main goal was to study the relation between female wages and the decision to enter motherhood. Since the offered wage of non-working women is not observed, we estimated it and imputed a potential wage to each woman, working or not. This potential wage enables us to investigate the effect of wages (the opportunity cost of time not worked and dedicated to children) on the decision to have a first child, for both workers and non-workers. Contrary to previous results, we found that female wages are positively related to the likelihood of having a first child. This result suggests that the income effect outweighs the substitution effect once non-participants' opportunity cost is also taken into account.

Using data from the Spanish Labor Force Survey (Encuesta de Población Activa) from 1999 through 2004, we explore the role of regional employment opportunities in explaining the increasing immigrant flows of recent years, despite the limited internal mobility of natives. We then investigate the policy question of whether immigration has helped reduce unemployment rate disparities across Spanish regions by attracting immigrant flows to regions offering better employment opportunities. Our results indicate that immigrants choose to reside in regions with higher employment rates, where their probability of finding a job is greater. In particular, and despite some differences depending on their origin, immigrants appear generally more responsive than their native counterparts to a higher likelihood of informal employment, self-employment, or indefinite-term employment. More importantly, insofar as the vast majority of immigrants locate in regions characterized by higher employment rates, immigration contributes to greasing the wheels of the Spanish labor market by narrowing regional unemployment rate disparities.

Contributed to: Fusion of Cultures: XXXVIII Annual Conference on Computer Applications and Quantitative Methods in Archaeology – CAA2010 (Granada, Spain, Apr 6-9, 2010)

We evaluated the use of strip-transect survey methods for manatees through a series of replicate aerial surveys in the Banana River, Brevard County, Florida, during the summers of 1993 and 1994. Transect methods sample a representative portion of the total study area, allowing statistical extrapolation to the whole area. Other advantages of transect methods are less flight time and lower cost than total-coverage surveys, ease of navigation, and a reduced likelihood of double-counting. Our objectives were: (1) to identify visibility biases associated with the transect survey method and adjust the counts accordingly; (2) to derive a population estimate with known variance for the Banana River during summer; and (3) to evaluate the potential value of this survey method for monitoring trends in manatee population size over time. (51-page document)
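
The expansion logic behind a strip-transect estimate with a visibility correction of the kind the report evaluates can be sketched as follows. All numbers (counts, areas, detection probability) below are hypothetical, not the report's data.

```python
# Hypothetical strip-transect expansion with a visibility-bias correction.

def transect_estimate(counts, strip_area, total_area, detection_prob):
    """Expand the mean per-strip count to the whole study area, then correct
    for animals missed by observers (visibility bias)."""
    mean_count = sum(counts) / len(counts)
    density = mean_count / strip_area          # animals per unit area, uncorrected
    return density * total_area / detection_prob

# Five replicate strips of 2 km^2 each in a 40 km^2 study area,
# with an assumed 60% chance of detecting a surfaced manatee.
est = transect_estimate([3, 5, 2, 4, 6], strip_area=2.0,
                        total_area=40.0, detection_prob=0.6)
```

Because the strips are replicates, the variance of the mean count carries straight through the same expansion, which is what gives the population estimate a known variance.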

We report the findings of an experiment designed to study how people learn and make decisions in network games. Network games offer new opportunities to identify learning rules, since on networks (compared to, e.g., random matching) more rules differ in their information requirements. Our experimental design lets us observe both which actions participants choose and which information they consult before making their choices. We use this information to estimate learning types with maximum likelihood methods. There is substantial heterogeneity in learning types. However, the vast majority of our participants' decisions are best characterized by reinforcement learning or (myopic) best-response learning. The distribution of learning types appears fairly stable across contexts: neither network topology nor a player's position in the network substantially affects the estimated distribution.

Without knowledge of basic seafloor characteristics, the ability to address many critical marine and coastal management issues is diminished. For example, management and conservation of essential fish habitat (EFH), a requirement mandated by federally guided fishery management plans (FMPs), requires among other things a description of habitats for federally managed species. Although the attributes important to habitat are numerous, the tools currently available cannot describe many of them efficiently and effectively, especially at the scales required. However, several characteristics of seafloor morphology are readily obtainable at multiple scales and can serve as useful descriptors of habitat. Recent advances in acoustic technology, such as multibeam echosounding (MBES), can remotely indicate surficial sediment properties such as texture, hardness, or roughness, and further permit highly detailed renderings of seafloor morphology. With acoustic-based surveys providing a relatively efficient method of data acquisition, there is a need for efficient and reproducible automated segmentation routines to process the data. Using MBES data collected by the Olympic Coast National Marine Sanctuary (OCNMS), and through a contracted seafloor survey, we expanded on the techniques of Cutter et al. (2003) to describe an objective, repeatable process that uses parameterized local Fourier histogram (LFH) texture features to automate segmentation of surficial sediments from acoustic imagery using a maximum likelihood decision rule. Sonar signatures and classification performance were evaluated against video imagery obtained from a towed camera sled. Segmented raster images were converted to polygon features and attributed using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999) for use in a geographic information system (GIS). (PDF contains 41 pages.)
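
The maximum likelihood decision rule at the core of such a segmentation step can be sketched generically: model each sediment class as a Gaussian over texture features and assign each pixel to the class with the highest class-conditional log-likelihood. The feature vectors and class statistics below are invented for the sketch; the paper's actual features are the parameterized LFH descriptors.

```python
import numpy as np

def ml_classify(x, means, covs):
    """Return the index of the class maximizing the Gaussian log-likelihood
    of feature vector x (constant terms dropped, as they cancel)."""
    scores = []
    for mu, cov in zip(means, covs):
        diff = x - mu
        inv = np.linalg.inv(cov)
        logdet = np.linalg.slogdet(cov)[1]
        scores.append(-0.5 * (diff @ inv @ diff) - 0.5 * logdet)
    return int(np.argmax(scores))

# Invented 2-D texture statistics for two sediment classes, e.g. 'sand' vs 'rock'.
means = [np.array([0.2, 0.1]), np.array([0.8, 0.9])]
covs = [np.eye(2) * 0.05, np.eye(2) * 0.05]

label = ml_classify(np.array([0.25, 0.15]), means, covs)  # nearest to class 0
```

In practice the class means and covariances are estimated from training regions (here, areas ground-truthed by the towed camera sled), and the rule is applied pixel by pixel to produce the segmented raster.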

Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires solving a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, the maximum likelihood and least squares methods, which are the preferred choices in today's experiments. This high efficiency is achieved by greatly reducing the dimensionality of the problem, employing a representation of permutationally invariant states known from spin coupling, combined with convex optimization, which has clear advantages in speed, control, and accuracy over commonly employed numerical routines. First prototype implementations easily allow reconstruction of a 20-qubit state in a few minutes on a standard computer.
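
The scale of the dimensionality reduction is easy to illustrate. A general n-qubit density matrix has 4^n - 1 real parameters, whereas the space of permutationally invariant n-qubit operators has dimension C(n+3, 3): symmetrized Pauli products are counted by how many X, Y, Z, and identity factors they contain. The back-of-envelope comparison below is our own illustration of that standard counting, not a computation taken from the paper.

```python
from math import comb

def full_params(n_qubits):
    """Real parameters of a general n-qubit density matrix: 4^n - 1."""
    return 4 ** n_qubits - 1

def pi_params(n_qubits):
    """Dimension of the permutationally invariant n-qubit operator space:
    multisets of size n over {X, Y, Z, I}, i.e. C(n + 3, 3) ~ n^3 / 6."""
    return comb(n_qubits + 3, 3)

for n in (5, 10, 20):
    print(n, full_params(n), pi_params(n))
```

For 20 qubits this is roughly 10^12 parameters in the general case against fewer than 2,000 in the permutationally invariant representation, which is why a few minutes on a standard computer suffice.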

Phosphorus removal by wetlands and basins in Lake Tahoe may be improved by designing these systems to filter storm water through media with higher phosphorus removal capability than the local parent material. Substrates rich in iron, aluminum, and calcium often have enhanced phosphorus removal; they can be naturally occurring, byproducts of industrial or water treatment processes, or engineered. Phosphorus removal fundamentally occurs through chemical adsorption and/or precipitation, and much of the phosphorus can be irreversibly bound. In addition to these standard media, other engineered substrates are available to enhance P removal. One such substrate, locally available in Reno, uses lanthanum-coated diatomaceous earth for arsenate removal; this material, which has a high positive surface charge, can also irreversibly remove phosphorus. Physical factors also affect P removal: in particular, specific surface area and particle shape affect filtration capacity, the contact area between water and the media, and the likelihood of clogging and blinding.

A number of substrates have been shown to remove P effectively in case studies. Based upon these studies, promising substrates include WTRs, blast furnace slag, steel furnace slag, OPC, calcite, marble, Utelite and other LWAs, zeolite, and shale. However, non-performance factors such as environmental considerations, application logistics, costs, and the potential for cementification narrow the list of media suitable for application at Tahoe. Industrial byproducts such as slags risk leaching heavy metals, and this potential cannot be easily predicted. Fly ash and other fine-particle substrates would be more difficult to apply because they would need to be blended, making them less desirable and more costly than larger-diameter media. High transportation costs rule out non-local products. Finally, amorphous calcium products will eventually cementify, reducing their effectiveness in filtration systems.

Based upon these considerations, bauxite, LWAs and expanded shales/clays, iron-rich sands, activated alumina, marble and dolomite, and natural and lanthanum-activated diatomaceous earth are the products most likely to be tested for application at Tahoe. These materials are typically iron, calcium, or aluminum based; many have a high specific surface area; and all have low transportation costs. (PDF contains 21 pages)

The learning of probability distributions from data is a ubiquitous problem in Statistics and Artificial Intelligence. Over the last decades, several algorithms have been proposed for learning probability distributions based on decomposable models, owing to their advantageous theoretical properties. Some of these algorithms can be used to search for a maximum likelihood decomposable model with a given maximum clique size, k, which controls the complexity of the model. Unfortunately, the problem of learning a maximum likelihood decomposable model with a given maximum clique size is NP-hard for k > 2. In this work, we propose a family of algorithms that approximates this problem with a worst-case computational complexity of O(k · n^2 log n), where n is the number of random variables involved. The structures of the decomposable models that solve the maximum likelihood problem are called maximal k-order decomposable graphs. Our proposals, called fractal trees, construct a sequence of maximal i-order decomposable graphs, for i = 2, ..., k, in k − 1 steps. At each step, the algorithms follow a divide-and-conquer strategy based on the particular features of these structures. Additionally, we propose a prune-and-graft procedure that transforms a maximal k-order decomposable graph into another one with higher likelihood. We have implemented two particular fractal tree algorithms, called parallel fractal tree and sequential fractal tree, which can be considered a natural extension of Chow and Liu's algorithm from k = 2 to arbitrary values of k. Both algorithms have been compared against other efficient approaches on artificial and real domains, and they show competitive behavior on the maximum likelihood problem. Owing to their low computational complexity, they are especially recommended for high-dimensional domains.
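
The k = 2 base case being generalized is Chow and Liu's classical result: the maximum likelihood tree-structured model is the maximum spanning tree under pairwise empirical mutual information. A minimal sketch of that base case (Kruskal's greedy rule with a union-find, on synthetic binary data) follows; it is our own illustration, not the fractal tree algorithm.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Empirical mutual information between two discrete samples."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pxy = np.mean((x == a) & (y == b))
            if pxy > 0:
                mi += pxy * np.log(pxy / (np.mean(x == a) * np.mean(y == b)))
    return mi

def chow_liu_tree(data):
    """Maximum likelihood tree (k = 2): maximum spanning tree under pairwise
    mutual information, built greedily with a union-find."""
    n_vars = data.shape[1]
    edges = sorted(combinations(range(n_vars), 2),
                   key=lambda e: -mutual_information(data[:, e[0]], data[:, e[1]]))
    parent = list(range(n_vars))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    tree = []
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:            # keep the edge only if it joins two components
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Synthetic binary data: X1 copies X0, X2 is independent noise.
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, 500)
data = np.column_stack([x0, x0, rng.integers(0, 2, 500)])
tree = chow_liu_tree(data)
```

The strongly dependent pair (0, 1) is picked first, and the independent variable is attached by whichever weak edge happens to score higher; the fractal tree algorithms extend this greedy maximum likelihood construction to cliques of size up to k.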

The brain is perhaps the most complex system to have ever been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine, to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.

It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new, extremely general, optimization algorithm is proposed - called Relaxation Expectation Maximization (REM) - that may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal, likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques the quality of fits may be further improved, while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.

The second part brings the technology of Part I to bear on two important problems in experimental neuroscience. The first is spike sorting: separating the spikes of different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed; inference and learning in this model lead to new principled algorithms for smoothing and clustering spike data.
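
The latent variable machinery behind probabilistic spike sorting can be illustrated with the simplest relevant example: EM for a two-component Gaussian mixture over synthetic, one-dimensional spike amplitudes. This is a generic sketch of the technique class, not the dissertation's REM algorithm or its sparse hidden Markov mixture.

```python
import numpy as np

# Synthetic amplitudes from two 'neurons' recorded on one electrode.
rng = np.random.default_rng(0)
spikes = np.concatenate([rng.normal(-2.0, 0.5, 200),   # neuron A
                         rng.normal(3.0, 0.5, 200)])   # neuron B

mu = np.array([-1.0, 1.0])       # initial component means
sigma = np.array([1.0, 1.0])     # initial standard deviations
pi = np.array([0.5, 0.5])        # initial mixing weights

for _ in range(50):
    # E-step: posterior responsibility of each component for each spike
    # (the 1/sqrt(2*pi) constant cancels in the normalization).
    dens = pi * np.exp(-0.5 * ((spikes[:, None] - mu) / sigma) ** 2) / sigma
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    mu = (resp * spikes[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (spikes[:, None] - mu) ** 2).sum(axis=0) / nk)
    pi = nk / len(spikes)
```

Each spike's component assignment is the latent variable; the responsibilities after convergence are exactly the probabilistic cluster labels that a spike sorter would report, which is what distinguishes this approach from hard threshold-based sorting.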

The main theme running through these three chapters is that economic agents are often forced to respond to events that are not a direct result of their own actions or of other agents' actions. The optimal response to these shocks necessarily depends on the agents' understanding of how the shocks arise. The economic environment in the first two chapters is analogous to the classic chain-store game; in this setting, the addition of unintended trembles by the agents creates an environment better suited to reputation building. The third chapter considers competitive equilibrium price dynamics in an overlapping generations environment with supply and demand shocks.

The first chapter is a game theoretic investigation of a reputation building game. A sequential equilibrium model, called the "error prone agents" model, is developed. In this model, agents believe that all actions are potentially subjected to an error process. Inclusion of this belief into the equilibrium calculation provides for a richer class of reputation building possibilities than when perfect implementation is assumed.

In the second chapter, maximum likelihood estimation is employed to test the consistency of this new model and other models with data from experiments run by other researchers that served as the basis for prominent papers in this field. The alternate models considered are essentially modifications to the standard sequential equilibrium. While some models perform quite well in that the nature of the modification seems to explain deviations from the sequential equilibrium quite well, the degree to which these modifications must be applied shows no consistency across different experimental designs.

The third chapter is a study of price dynamics in an overlapping generations model. It establishes the existence of a unique perfect-foresight competitive equilibrium price path in a pure exchange economy with a finite time horizon when there are arbitrarily many shocks to supply or demand. One main reason for the interest in this equilibrium is that overlapping generations environments are very fruitful for the study of price dynamics, especially in experimental settings. The perfect foresight assumption is an important place to start when examining these environments because it will produce the ex post socially efficient allocation of goods. This characteristic makes this a natural baseline to which other models of price dynamics could be compared.

As coastal destinations continue to grow, due to tourism and residential expansion, the demand for public beach access and related amenities will also increase. As a result, agencies that provide beach access and related amenities face challenges, because residents and visitors both use beaches yet likely possess different needs, as well as different preferences regarding management decisions. Being a resident of a coastal county provides more opportunity to use local beaches, but coastal tourism is an important and growing economic engine in coastal communities (Kriesel, Landry, & Keeler, 2005; Pogue & Lee, 1999). Therefore, providing agencies with a comprehensive assessment of the differences between these two groups will increase the likelihood of effective management programs and policies for the provision of public beach access and related amenities. The purpose of this paper is to use a stated preference choice method (SPCM) to identify the extent of residents' and visitors' preferences for public beach management options. (PDF contains 4 pages)
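
Stated preference choice data of this kind are commonly analyzed with a conditional logit model: each management alternative receives a utility from its attributes, and choice probabilities follow a softmax over those utilities. The utilities below are invented for illustration; they are not the paper's estimates.

```python
import math

def choice_probabilities(utilities):
    """Conditional logit: P(alternative j) = exp(V_j) / sum_k exp(V_k)."""
    m = max(utilities)                       # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities for three beach-management alternatives, each a bundle
# of attributes (e.g. parking fee, lifeguard presence, dune restoration).
probs = choice_probabilities([1.2, 0.4, -0.3])
```

Fitting such a model separately for residents and visitors, or interacting attributes with a residency indicator, is how the preference differences the paper targets would show up as differences in the utility coefficients.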