967 results for Intractable Likelihood


Relevance: 10.00%

Publisher:

Abstract:

Many problems in control and signal processing can be formulated as sequential decision problems for general state space models. However, except for some simple models, analytical solutions are unavailable and one must resort to approximation. This thesis investigates problems in which Sequential Monte Carlo (SMC) methods can be combined with gradient-based search to solve online optimisation problems. The main contributions are as follows. Chapter 4 addresses the sensor scheduling problem cast as a controlled Hidden Markov Model, in the case where the state, observation and action spaces are all continuous. This general case is important because it is the natural framework for many applications. In sensor scheduling, the aim is to minimise the variance of the estimation error of the hidden state with respect to the action sequence. We present a novel SMC method that uses a stochastic gradient algorithm to find optimal actions, in contrast to existing works in the literature that solve only approximations of the original problem. Chapter 5 shows how SMC can be used to solve a risk-sensitive control problem. We adopt the Feynman-Kac representation of a controlled Markov chain flow and exploit the properties of the logarithmic Lyapunov exponent, leading to a policy gradient solution for the parameterised problem. The resulting SMC algorithm has a structure similar to the Recursive Maximum Likelihood (RML) algorithm for online parameter estimation. In Chapters 6, 7 and 8, dynamic graphical models are combined with state space models for the purpose of online decentralised inference. We concentrate on the distributed parameter estimation problem using two Maximum Likelihood techniques, namely Recursive Maximum Likelihood (RML) and Expectation Maximization (EM). The resulting algorithms can be interpreted as extensions of the Belief Propagation (BP) algorithm that compute likelihood gradients. To obtain an SMC algorithm, Chapter 8 uses nonparametric approximations of Belief Propagation. The algorithms were successfully applied to the sensor localisation problem for sensor networks of small and medium size.
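The SMC-plus-gradient methods summarised above all build on the basic particle filter. As a point of reference, a minimal bootstrap particle filter for a toy linear-Gaussian state space model (an illustrative sketch with invented parameters, not a model from the thesis) can be written as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian state space model (illustrative; not from the thesis):
#   x_t = 0.9 x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + w_t,          w_t ~ N(0, 1)
T, N = 50, 500                                    # time steps, particles

x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)                        # simulated observations

particles = rng.normal(size=N)                    # draws from the initial prior
estimates = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.normal(size=N)   # propagate with the prior
    logw = -0.5 * (y[t] - particles) ** 2              # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    estimates[t] = np.sum(w * particles)               # filtered mean
    particles = rng.choice(particles, size=N, p=w)     # multinomial resampling

rmse = np.sqrt(np.mean((estimates - x) ** 2))
print(rmse)   # should be well below the observation noise std of 1
```

The thesis's contribution is to run a stochastic gradient search on top of such a filter; the sketch above only shows the filtering layer the gradient estimates are built from.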

Relevance: 10.00%

Publisher:

Abstract:

Focusing on Obadiah and Psalm 137, this article provides biblical evidence for an Edomite treaty betrayal of Judah during the Babylonian crisis ca. 588–586 B.C.E. After setting a context that includes the use of treaties in the ancient Near East to establish expectations for political relationships and the likelihood that Edom could operate as a political entity in the Judahite Negev during the Babylonian assault, this article demonstrates that Obadiah’s poetics include a density of inverted form and content (a reversal motif) pointing to treaty betrayal. Obadiah’s modifications of Jeremiah 49, a text with close thematic and terminological parallels, evidence an Edomite treaty betrayal of Judah. Moreover, the study shows that Obadiah is replete with treaty allusions. A study of Psalm 137 in comparison with Aramaic treaty texts from Sefire reveals that this difficult psalm also evidences a treaty betrayal by Edom and includes elements appropriate for treaty curses. The article closes with a discussion of piecemeal data from a few other biblical texts, a criticism of the view that Edom was innocent during the Babylonian crisis, and a suggestion that this treaty betrayal may have contributed to the production of some anti-Edom biblical material.

Relevance: 10.00%

Publisher:

Abstract:

Optimal Bayesian multi-target filtering is in general computationally impractical owing to the high dimensionality of the multi-target state. The Probability Hypothesis Density (PHD) filter propagates the first moment of the multi-target posterior distribution. While this reduces the dimensionality of the problem, the PHD filter still involves intractable integrals in many cases of interest. Several authors have proposed Sequential Monte Carlo (SMC) implementations of the PHD filter. However, these implementations are the equivalent of the Bootstrap Particle Filter, and the latter is well known to be inefficient. Drawing on ideas from the Auxiliary Particle Filter (APF), an SMC implementation of the PHD filter which employs auxiliary variables to enhance its efficiency was proposed by Whiteley et al. Numerical examples were presented for two scenarios, including a challenging nonlinear observation model, to support the claim. This paper studies the theoretical properties of this auxiliary particle implementation. $\mathbb{L}_p$ error bounds are established, from which almost sure convergence follows.

Relevance: 10.00%

Publisher:

Abstract:

Optimal Bayesian multi-target filtering is, in general, computationally impractical owing to the high dimensionality of the multi-target state. The Probability Hypothesis Density (PHD) filter propagates the first moment of the multi-target posterior distribution. While this reduces the dimensionality of the problem, the PHD filter still involves intractable integrals in many cases of interest. Several authors have proposed Sequential Monte Carlo (SMC) implementations of the PHD filter. However, these implementations are the equivalent of the Bootstrap Particle Filter, and the latter is well known to be inefficient. Drawing on ideas from the Auxiliary Particle Filter (APF), we present an SMC implementation of the PHD filter which employs auxiliary variables to enhance its efficiency. Numerical examples are presented for two scenarios, including a challenging nonlinear observation model.
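The auxiliary-variable idea can be illustrated on a toy scalar model (an illustrative sketch only, not the paper's multi-target PHD setting): first-stage weights based on the predictive likelihood select promising particles before propagation, and second-stage weights correct the resulting bias.

```python
import numpy as np

rng = np.random.default_rng(1)

def apf_step(particles, weights, y_t):
    # First stage: auxiliary weights from the predictive likelihood,
    # evaluated at each particle's predicted mean (before propagation).
    mu = 0.9 * particles
    first = weights * np.exp(-0.5 * (y_t - mu) ** 2)
    first /= first.sum()
    idx = rng.choice(len(particles), size=len(particles), p=first)
    # Propagate only the pre-selected particles.
    prop = 0.9 * particles[idx] + rng.normal(size=len(particles))
    # Second stage: correct for the first-stage approximation.
    logw = -0.5 * (y_t - prop) ** 2 + 0.5 * (y_t - mu[idx]) ** 2
    w = np.exp(logw - logw.max())
    return prop, w / w.sum()

# One step from a N(0,1) prior with observation y_1 = 2.0 for the toy model
# x_t = 0.9 x_{t-1} + N(0,1), y_t = x_t + N(0,1) (invented numbers).
particles = rng.normal(size=1000)
weights = np.full(1000, 1.0 / 1000)
particles, weights = apf_step(particles, weights, y_t=2.0)
print(np.sum(weights * particles))   # filtered posterior mean, ≈ 1.3
```

Compared with the bootstrap filter, the pre-selection step concentrates computation on particles likely to explain the new observation, which is precisely the efficiency gain the APF-based PHD implementation targets.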

Relevance: 10.00%

Publisher:

Abstract:

Most fisheries agencies conduct biological and economic assessments independently. This independence may lead to situations in which economists reject management plans proposed by biologists. The objective of this study is to show how to find optimal strategies that satisfy both biologists' and economists' conditions. In particular, we characterize optimal fishing trajectories that maximize the present value of a discounted economic indicator while taking into account the age structure of the population, as in stock assessment methodologies. This approach is applied to the Northern Stock of Hake. Our main empirical findings are: i) the optimal policy may be far from any of the classical scenarios proposed by biologists; ii) the more the future is discounted, the higher the likelihood of contradictions between the scenarios proposed by biologists and the conclusions of the economic analysis; iii) optimal management reduces the risk of the stock falling below precautionary levels, especially if the future is not discounted too much; and iv) the optimal stationary fishing rate may differ greatly depending on the economic indicator used as a reference.

Relevance: 10.00%

Publisher:

Abstract:

This paper estimates a standard version of the New Keynesian monetary (NKM) model under alternative specifications of the monetary policy rule using U.S. and Eurozone data. The estimation procedure implemented is a classical method based on the indirect inference principle. An unrestricted VAR is considered as the auxiliary model. On the one hand, the estimation method proposed overcomes some of the shortcomings of using a structural VAR as the auxiliary model in order to identify the impulse response that defines the minimum distance estimator implemented in the literature. On the other hand, by following a classical approach we can further assess the estimation results found in recent papers that follow a maximum-likelihood Bayesian approach. The estimation results show that some structural parameter estimates are quite sensitive to the specification of monetary policy. Moreover, the estimation results in the U.S. show that the fit of the NKM under an optimal monetary plan is much worse than the fit of the NKM model assuming a forward-looking Taylor rule. In contrast to the U.S. case, in the Eurozone the best fit is obtained assuming a backward-looking Taylor rule, but the improvement is rather small with respect to assuming either a forward-looking Taylor rule or an optimal plan.
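The indirect inference principle used here — choose structural parameters so that an auxiliary model fitted to simulated data matches the auxiliary model fitted to the observed data — can be sketched on a toy AR(1) example (illustrative only; the paper's structural model is the NKM model and its auxiliary model an unrestricted VAR):

```python
import numpy as np

def simulate_ar1(rho, T, rng):
    # Structural model: x_t = rho * x_{t-1} + N(0, 1)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + rng.normal()
    return x

def aux_stat(x):
    # Auxiliary model: OLS slope of x_t on x_{t-1}
    return np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# "Observed" data generated with true rho = 0.7
beta_obs = aux_stat(simulate_ar1(0.7, 2000, np.random.default_rng(10)))

# Indirect inference: pick the structural rho whose simulated auxiliary
# statistic matches the observed one (grid search, common random numbers
# via a fixed simulation seed).
grid = np.linspace(0.1, 0.95, 86)
dist = [(aux_stat(simulate_ar1(r, 5000, np.random.default_rng(3))) - beta_obs) ** 2
        for r in grid]
rho_hat = grid[int(np.argmin(dist))]
print(rho_hat)   # close to the true value 0.7
```

In the paper the same logic operates on VAR coefficients rather than a single slope, and the minimum distance criterion is weighted accordingly.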

Relevance: 10.00%

Publisher:

Abstract:

Using the ECHP, we explore the determinants of having a first child in Spain. Our main goal is to study the relation between female wages and the decision to enter motherhood. Since the offered wage of non-working women is not observed, we estimate it and impute a potential wage to each woman (working and non-working). This potential wage enables us to investigate the effect of wages (the opportunity cost of time not worked and dedicated to children) on the decision to have a first child, for both workers and non-workers. Contrary to previous results, we find that female wages are positively related to the likelihood of having a first child. This result suggests that the income effect outweighs the substitution effect when non-participants' opportunity cost is also taken into account.

Relevance: 10.00%

Publisher:

Abstract:

Using data from the Spanish Labor Force Survey (Encuesta de Población Activa) from 1999 through 2004, we explore the role of regional employment opportunities in explaining the increasing immigrant flows of recent years despite the limited internal mobility of natives. Subsequently, we investigate the policy question of whether immigration has helped reduce unemployment rate disparities across Spanish regions by attracting immigrant flows to regions offering better employment opportunities. Our results indicate that immigrants choose to reside in regions with higher employment rates, where their probability of finding a job is greater. In particular, and despite some differences depending on their origin, immigrants appear generally more responsive than their native counterparts to a higher likelihood of informal, self-, or indefinite employment. More importantly, insofar as the vast majority of immigrants locate in regions characterized by higher employment rates, immigration contributes to greasing the wheels of the Spanish labor market by narrowing regional unemployment rate disparities.

Relevance: 10.00%

Publisher:

Abstract:

Contributed to: Fusion of Cultures: XXXVIII Annual Conference on Computer Applications and Quantitative Methods in Archaeology – CAA2010 (Granada, Spain, Apr 6-9, 2010)

Relevance: 10.00%

Publisher:

Abstract:

We evaluated the use of strip-transect survey methods for manatees through a series of replicate aerial surveys in the Banana River, Brevard County, Florida, during summer 1993 and summer 1994. Transect methods sample a representative portion of the total study area, thus allowing for statistical extrapolation to the total area. Other advantages of transect methods are less flight time and less cost than total coverage, ease of navigation, and reduced likelihood of double-counting. Our objectives were: (1) to identify visibility biases associated with the transect survey method and to adjust the counts accordingly; (2) to derive a population estimate with known variance for the Banana River during summer; and (3) to evaluate the potential value of this survey method for monitoring trends in manatee population size over time. (51 page document)
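The core extrapolation step with a visibility-bias adjustment can be illustrated with hypothetical numbers (all figures below are invented for illustration; the report estimates its own detection probabilities and survey areas):

```python
# Toy strip-transect extrapolation with a visibility-bias correction.
# All numbers are hypothetical, not values from the Banana River surveys.
counted = 120            # animals counted on the surveyed strips
detection_prob = 0.5     # estimated probability a present animal is seen
strip_area = 10.0        # km^2 actually surveyed
total_area = 40.0        # km^2 in the whole study area

adjusted_count = counted / detection_prob    # correct counts for visibility bias
density = adjusted_count / strip_area        # animals per km^2 on the strips
estimate = density * total_area              # extrapolate to the total area
print(estimate)  # 960.0
```

The variance of such an estimate comes from both the sampling of strips and the uncertainty in the detection probability, which is why the study emphasizes deriving a population estimate with known variance.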

Relevance: 10.00%

Publisher:

Abstract:

We report the findings of an experiment designed to study how people learn and make decisions in network games. Network games offer new opportunities to identify learning rules, since on networks (compared with, e.g., random matching) more rules differ in their information requirements. Our experimental design enables us to observe both which actions participants choose and which information they consult before making their choices. We use this information to estimate learning types using maximum likelihood methods. There is substantial heterogeneity in learning types; however, the vast majority of our participants' decisions are best characterized by reinforcement learning or (myopic) best-response learning. The distribution of learning types appears fairly stable across contexts: neither network topology nor a player's position in the network substantially affects the estimated distribution of learning types.

Relevance: 10.00%

Publisher:

Abstract:

Without knowledge of basic seafloor characteristics, the ability to address any number of critical marine and/or coastal management issues is diminished. For example, management and conservation of essential fish habitat (EFH), a requirement mandated by federally guided fishery management plans (FMPs), requires among other things a description of habitats for federally managed species. Although the attributes important to habitat are numerous, the tools currently available cannot efficiently and effectively describe many of them, especially at the scales required. However, several characteristics of seafloor morphology are readily obtainable at multiple scales and can serve as useful descriptors of habitat. Recent advances in acoustic technology, such as multibeam echosounding (MBES), can provide remote indication of surficial sediment properties such as texture, hardness, or roughness, and further permit highly detailed renderings of seafloor morphology. With acoustic-based surveys providing a relatively efficient method for data acquisition, there is a need for efficient and reproducible automated segmentation routines to process the data. Using MBES data collected by the Olympic Coast National Marine Sanctuary (OCNMS), and through a contracted seafloor survey, we expanded on the techniques of Cutter et al. (2003) to describe an objective, repeatable process that uses parameterized local Fourier histogram (LFH) texture features to automate segmentation of surficial sediments from acoustic imagery using a maximum likelihood decision rule. Sonar signatures and classification performance were evaluated using video imagery obtained from a towed camera sled. Segmented raster images were converted to polygon features and attributed using a hierarchical deep-water marine benthic classification scheme (Greene et al. 1999) for use in a geographic information system (GIS). (PDF contains 41 pages.)
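A maximum likelihood decision rule of the kind used for segmentation assigns each feature vector to the class whose class-conditional density is highest. A minimal sketch with two hypothetical sediment classes and Gaussian class models (invented stand-ins for the LFH texture features, whose means and covariances would in practice come from training pixels):

```python
import numpy as np

# Two hypothetical sediment classes with known Gaussian feature models.
means = {"sand": np.array([1.0, 0.0]), "rock": np.array([3.0, 2.0])}
covs = {k: np.eye(2) for k in means}

def ml_classify(x):
    # Maximum likelihood decision rule: assign x to the class whose
    # class-conditional Gaussian density is highest (equal priors).
    def loglik(k):
        d = x - means[k]
        return -0.5 * d @ np.linalg.inv(covs[k]) @ d \
               - 0.5 * np.log(np.linalg.det(covs[k]))
    return max(means, key=loglik)

print(ml_classify(np.array([0.8, 0.2])))  # sand
print(ml_classify(np.array([2.9, 1.7])))  # rock
```

Applied per pixel (or per segment) of the acoustic imagery, this rule produces the labeled raster that is then converted to polygons and attributed in the GIS.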

Relevance: 10.00%

Publisher:

Abstract:

Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a nonlinear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, the maximum likelihood and least squares methods, which are the preferred choices in today's experiments. This high efficiency is achieved by greatly reducing the dimensionality of the problem, employing a representation of permutationally invariant states known from spin coupling, combined with convex optimization, which has clear advantages regarding speed, control and accuracy in comparison with commonly employed numerical routines. First prototype implementations easily allow reconstruction of a state of 20 qubits in a few minutes on a standard computer.
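For intuition, the physicality constraint that turns reconstruction into a constrained optimization problem shows up already for a single qubit: noisy Pauli expectation values can imply an unphysical (negative-eigenvalue) density matrix, and the fit must pull the estimate back to a valid state. A toy sketch with invented measurement numbers (single qubit only; the paper's contribution is scaling such reconstructions to permutationally invariant many-qubit states):

```python
import numpy as np

# Hypothetical noisy estimates of <X>, <Y>, <Z> from measurement data.
pauli_means = np.array([0.8, 0.5, 0.4])

r = pauli_means
if np.linalg.norm(r) > 1.0:      # Bloch vector outside the unit ball: linear
    r = r / np.linalg.norm(r)    # inversion gave an unphysical state; project
                                 # back (a crude stand-in for the constrained
                                 # least squares / maximum likelihood fit)

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)
rho = 0.5 * (np.eye(2) + r[0] * X + r[1] * Y + r[2] * Z)

print(np.trace(rho).real)                        # ≈ 1.0 (unit trace)
print(np.linalg.eigvalsh(rho).min() >= -1e-10)   # True (positive semidefinite)
```

For n qubits the same constrained fit lives in a 2^n-dimensional space, which is why the paper's permutationally invariant parameterization plus convex optimization is what makes 20 qubits tractable.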

Relevance: 10.00%

Publisher:

Abstract:

Phosphorus removal by wetlands and basins in Lake Tahoe may be improved through designing these systems to filter storm water through media with higher phosphorus removal capabilities than the local parent material. Substrates rich in iron, aluminum and calcium often exhibit enhanced phosphorus removal. These substrates can be naturally occurring, byproducts of industrial or water treatment processes, or engineered. Phosphorus removal fundamentally occurs through chemical adsorption and/or precipitation, and much of the phosphorus can be irreversibly bound. In addition to these standard media, other engineered substrates are available to enhance P removal. One such substrate, locally available in Reno, uses lanthanum-coated diatomaceous earth for arsenate removal. This material, which has a high positive surface charge, can also irreversibly remove phosphorus. Physical factors also affect P removal: in particular, specific surface area and particle shape affect filtration capacity, the contact area between water and the media surface, and the likelihood of clogging and blinding. A number of substrates have been shown to remove P effectively in case studies. Based upon these studies, promising substrates include WTRs, blast furnace slag, steel furnace slag, OPC, calcite, marble, Utelite and other LWAs, zeolite and shale. However, other non-performance factors such as environmental considerations, application logistics, costs, and the potential for cementification narrow the list of possible media for application at Tahoe. Industrial byproducts such as slags risk leaching heavy metals, and this potential cannot be easily predicted. Fly ash and other fine-particle substrates would be more difficult to apply because they would need to be blended, making them less desirable and more costly to apply than larger-diameter media. High transportation costs rule out non-local products.
Finally, amorphous calcium products will eventually cementify, reducing their effectiveness in filtration systems. Based upon these considerations, bauxite, LWAs and expanded shales/clays, iron-rich sands, activated alumina, marble and dolomite, and natural and lanthanum-activated diatomaceous earth are the products most likely to be tested for application at Tahoe. These materials are typically iron-, calcium- or aluminum-based; many have a high specific surface area; and all have low transportation costs. (PDF contains 21 pages)