962 results for "Estimation par maximum de vraisemblance" (maximum likelihood estimation)


Relevance: 30.00%

Publisher:

Abstract:

Approximate Bayesian computation (ABC) is a popular technique for analysing data from complex models where the likelihood function is intractable. It involves using simulation from the model to approximate the likelihood, with this approximate likelihood then being used to construct an approximate posterior. In this paper, we consider methods that estimate the parameters by maximizing the approximate likelihood used in ABC. We give a theoretical analysis of the asymptotic properties of the resulting estimator. In particular, we derive results analogous to those of consistency and asymptotic normality for standard maximum likelihood estimation. We also discuss how sequential Monte Carlo methods provide a natural method for implementing our likelihood-based ABC procedures.
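The idea of maximizing an ABC-style approximate likelihood can be illustrated with a minimal sketch: simulate from the model at each candidate parameter, smooth the simulated summary statistics with a kernel around the observed summary, and take the parameter that maximizes this kernel estimate. All model choices here (a Gaussian toy model, a mean summary, a grid search) are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y ~ N(theta, 1) with true theta = 2.0; summary = sample mean
y_obs = rng.normal(2.0, 1.0, size=100)
s_obs = y_obs.mean()

def abc_loglik(theta, n_sims=2000, eps=0.05):
    """Kernel estimate of the ABC (approximate) log-likelihood: simulate
    datasets at theta and smooth the simulated summaries around s_obs."""
    s_sim = rng.normal(theta, 1.0, size=(n_sims, 100)).mean(axis=1)
    k = np.exp(-0.5 * ((s_sim - s_obs) / eps) ** 2)  # Gaussian kernel
    return np.log(k.mean() + 1e-300)

# ABC-MLE: maximise the approximate likelihood over a parameter grid
grid = np.linspace(0.0, 4.0, 81)
theta_hat = grid[np.argmax([abc_loglik(t) for t in grid])]
print(theta_hat)
```

The estimate lands near the observed summary, mirroring the consistency result the paper analyses; in practice the paper's sequential Monte Carlo machinery replaces the naive grid search.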

Relevance: 30.00%

Publisher:

Abstract:

We show that the sensor localization problem can be cast as a static parameter estimation problem for Hidden Markov Models and we develop fully decentralized versions of the Recursive Maximum Likelihood and the Expectation-Maximization algorithms to localize the network. For linear Gaussian models, our algorithms can be implemented exactly using a distributed version of the Kalman filter and a message passing algorithm to propagate the derivatives of the likelihood. In the non-linear case, a solution based on local linearization in the spirit of the Extended Kalman Filter is proposed. In numerical examples we show that the developed algorithms are able to learn the localization parameters well.
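The static-parameter estimation viewpoint can be sketched in a heavily simplified, centralized form: a Kalman filter computes the likelihood of the measurements for a candidate value of the unknown sensor parameter, and maximum likelihood picks the best candidate. The scalar model, the mean-reverting target, and the grid search are all assumptions for illustration; the paper's contribution is the decentralized, recursive version of this computation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: a mean-reverting target state x_t observed by a sensor whose
# unknown position offset b (the static parameter) shifts every measurement:
# y_t = x_t + b + noise.
T, a, q, r, b_true = 400, 0.9, 0.1, 0.5, 1.5
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + b_true + rng.normal(0, np.sqrt(r), T)

def kf_loglik(b):
    """Kalman-filter log-likelihood of y for a candidate offset b."""
    m, P, ll = 0.0, 1.0, 0.0
    for t in range(T):
        m, P = a * m, a * a * P + q      # predict
        S = P + r                        # innovation variance
        e = y[t] - (m + b)               # innovation
        ll += -0.5 * (np.log(2 * np.pi * S) + e * e / S)
        K = P / S                        # Kalman gain
        m, P = m + K * e, (1 - K) * P    # update
    return ll

# Maximum likelihood estimate of the offset over a grid of candidates
grid = np.linspace(0.0, 3.0, 121)
b_hat = grid[np.argmax([kf_loglik(b) for b in grid])]
print(b_hat)
```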

Relevance: 30.00%

Publisher:

Abstract:

Age composition of catch, and growth rate, of yellowfin tuna have been estimated by Hennemuth (1961a) and Davidoff (1963). The relative abundance and instantaneous total mortality rate of yellowfin tuna during 1954-1959 have been estimated by Hennemuth (1961b). It is now possible to extend this work because more data are available; these include data for 1951-1954, which were previously not available, and data for 1960-1962, which were collected subsequent to Hennemuth's (1961b) publication. In that publication, Hennemuth estimated the total instantaneous mortality rate (Z) during the entire time period a year class is present in the fishery following full recruitment. However, this method may lead to biased estimates of abundance, and hence of mortality rates, because of both seasonal migrations into or out of specific fishing areas and possible seasonal differences in availability or vulnerability of the fish to the fishing gear. Schaefer, Chatwin and Broadhead (1961) and Joseph et al. (1964) have indicated that seasonal migrations of yellowfin occur. A method of estimating mortality rates which is not biased by seasonal movements would be of value in computations of population dynamics. The method of analysis outlined and used in the present paper may obviate this bias by comparing the abundance of an individual yellowfin year class, following its period of maximum abundance, in an individual area during a specific quarter of the year with its abundance in the same area one year later. The method was suggested by Gulland (1955) and used by Chapman, Holt and Allen (1963) in assessing Antarctic whale stocks. This method, and the results of its use with data for yellowfin caught in the eastern tropical Pacific from 1951-1962, are described in this paper.
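For a single year class, the comparison described above (same area and quarter, one year apart) reduces to a log-ratio of abundance indices. A minimal sketch with made-up catch-per-unit-effort numbers, not IATTC data:

```python
import math

# Hypothetical CPUE of one year class in the same area and quarter,
# observed one full year apart (illustrative numbers only)
cpue_year1 = 48.0
cpue_year2 = 21.0

# Annual instantaneous total mortality rate: Z = ln(N1 / N2) for abundances
# measured a year apart, so seasonal availability effects cancel out
Z = math.log(cpue_year1 / cpue_year2)
annual_survival = math.exp(-Z)   # fraction of the year class surviving
print(round(Z, 3), round(annual_survival, 3))
```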

Relevance: 30.00%

Publisher:

Abstract:

The search for reliable proxies of past deep ocean temperature and salinity has proved difficult, thereby limiting our ability to understand the coupling of ocean circulation and climate over glacial-interglacial timescales. Previous inferences of deep ocean temperature and salinity from sediment pore fluid oxygen isotopes and chlorinity indicate that the deep ocean density structure at the Last Glacial Maximum (LGM, approximately 20,000 years BP) was set by salinity, and that the density contrast between northern and southern sourced deep waters was markedly greater than in the modern ocean. High density stratification could help explain the marked contrast in carbon isotope distribution recorded in the LGM ocean relative to what we observe today, but what made the ocean's density structure so different at the LGM? How did it evolve from one state to another? Further, given the sparsity of the LGM temperature and salinity data set, what else can we learn by increasing the spatial density of proxy records?

We investigate the cause and feasibility of a highly salinity-stratified deep ocean at the LGM, and we work to increase the amount of information we can glean about the past ocean from pore fluid profiles of oxygen isotopes and chloride. Using a coupled ocean--sea ice--ice shelf cavity model we test whether the deep ocean density structure at the LGM can be explained by ice--ocean interactions over the Antarctic continental shelves, and show that a large part of the LGM salinity stratification can be explained by lower ocean temperatures. In order to extract the maximum information from pore fluid profiles of oxygen isotopes and chloride we evaluate several inverse methods for ill-posed problems and their ability to recover bottom water histories from sediment pore fluid profiles. We demonstrate that Bayesian Markov Chain Monte Carlo parameter estimation techniques enable us to robustly recover the full solution space of bottom water histories, not only at the LGM, but through the most recent deglaciation and the Holocene up to the present. Finally, we evaluate a non-destructive pore fluid sampling technique, Rhizon samplers, in comparison to traditional squeezing methods and show that despite their promise, Rhizons are unlikely to be a good sampling tool for pore fluid measurements of oxygen isotopes and chloride.
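The Bayesian MCMC machinery used above can be illustrated, in heavily simplified form, by a random-walk Metropolis sampler for a single unknown; the one-parameter "bottom water value" model and all numbers below are illustrative assumptions, not pore fluid data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for the inverse problem: infer a single bottom-water value
# theta from noisy observations d = theta + noise
theta_true, sigma = 0.8, 0.2
d = theta_true + rng.normal(0, sigma, size=50)

def log_post(theta):
    # Flat prior on [-5, 5]; Gaussian likelihood
    if abs(theta) > 5:
        return -np.inf
    return -0.5 * np.sum((d - theta) ** 2) / sigma ** 2

# Random-walk Metropolis: propose a local move, accept with MH probability
theta, samples = 0.0, []
lp = log_post(theta)
for _ in range(5000):
    prop = theta + rng.normal(0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.random() + 1e-300) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[1000:])   # discard burn-in
print(post.mean(), post.std())
```

The retained samples approximate the full posterior, which is what lets the authors report whole solution spaces of bottom water histories rather than single best fits.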

Relevance: 30.00%

Publisher:

Abstract:

Statistical data (1959-1977) from the trawl fishery on the continental shelf of Côte d'Ivoire have been fitted to the Fox (PRODFIT) global production model. Owing to the proliferation of triggerfish (Balistes capriscus) since 1971-1972, the data were divided into two periods. The Maximum Sustainable Yield (MSY; PMMC in the text) for all commercial species on the continental shelf decreased from 8800 t to 5900 t between the two periods; the difference represents the potential taken over by triggerfish at the bottom level. The model has also been fitted to data for the coastal Sciaenidae community and the Sparidae community, which are separated by the 50 m isobath. The MSY of the deep layer (50-120 m) is 2350 t over the whole study period. Until 1977 this potential was never reached, because of the low productivity of the Sparidae community.
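The Fox surplus production model assumes catch-per-unit-effort declines exponentially with effort, so ln U is linear in E and the MSY has a closed form. A minimal sketch with synthetic effort/CPUE numbers (not the Ivorian data, and not the PRODFIT fitting procedure):

```python
import numpy as np

# Hypothetical effort (E) and catch-per-unit-effort (U) series; the Fox
# model assumes ln U = ln U_inf - b * E
E = np.array([10, 20, 30, 40, 50, 60], dtype=float)
U = 100 * np.exp(-0.02 * E) * np.exp(
    np.random.default_rng(3).normal(0, 0.03, 6))

# Least-squares fit of ln U against effort
slope, lnU_inf = np.polyfit(E, np.log(U), 1)
U_inf, b = np.exp(lnU_inf), -slope

# Yield Y = E * U_inf * exp(-b E) is maximised at E = 1/b
E_msy = 1.0 / b
MSY = U_inf * E_msy * np.exp(-1.0)
print(E_msy, MSY)
```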

Relevance: 30.00%

Publisher:

Abstract:

We show that the sensor self-localization problem can be cast as a static parameter estimation problem for Hidden Markov Models and we implement fully decentralized versions of the Recursive Maximum Likelihood and on-line Expectation-Maximization algorithms to localize the sensor network simultaneously with target tracking. For linear Gaussian models, our algorithms can be implemented exactly using a distributed version of the Kalman filter and a novel message passing algorithm. The latter allows each node to compute the local derivatives of the likelihood or the sufficient statistics needed for Expectation-Maximization. In the non-linear case, a solution based on local linearization in the spirit of the Extended Kalman Filter is proposed. In numerical examples we demonstrate that the developed algorithms are able to learn the localization parameters. © 2012 IEEE.

Relevance: 30.00%

Publisher:

Abstract:

Conventional Hidden Markov models generally consist of a Markov chain observed through a linear map corrupted by additive noise. This general class of model has enjoyed a huge and diverse range of applications, for example, speech processing, biomedical signal processing and, more recently, quantitative finance. However, a lesser known extension of this general class of model is the so-called Factorial Hidden Markov Model (FHMM). FHMMs also have diverse applications, notably in machine learning, artificial intelligence and speech recognition [13, 17]. FHMMs extend the usual class of HMMs by supposing the partially observed state process is a finite collection of distinct Markov chains, either statistically independent or dependent. There is also considerable current activity in applying collections of partially observed Markov chains to complex action recognition problems, see, for example, [6]. In this article we consider the Maximum Likelihood (ML) parameter estimation problem for FHMMs. Much of the extant literature concerning this problem presents parameter estimation schemes based on full-data log-likelihood EM algorithms. This approach can be slow to converge and often imposes heavy demands on computer memory, a point that is particularly relevant for FHMMs, whose state space dimensions are relatively large. The contribution of this article is to develop new recursive formulae for a filter-based EM algorithm that can be implemented online. Our new formulae yield equivalent ML estimates; however, they are purely recursive and so significantly reduce numerical complexity and memory requirements. A computer simulation is included to demonstrate the performance of our results. © Taylor & Francis Group, LLC.
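The filter-based approach builds on the standard normalized forward recursion for discrete HMMs, which already computes the log-likelihood online in a single pass; online EM schemes extend it with recursions for the sufficient statistics. A minimal sketch of that building block, with toy parameters (two states, two symbols):

```python
import numpy as np

# Toy HMM: transition A[i, j] = P(next = j | current = i),
# emission B[i, k] = P(obs = k | state = i), initial distribution pi
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],
              [0.3, 0.7]])
pi = np.array([0.5, 0.5])

obs = [0, 0, 1, 1, 0]

# Normalised forward filter: each normaliser is P(y_t | y_1:t-1), so the
# running sum of their logs is the log-likelihood, computed online
alpha = pi * B[:, obs[0]]
loglik = np.log(alpha.sum())
alpha /= alpha.sum()
for y in obs[1:]:
    alpha = (alpha @ A) * B[:, y]   # one-step predict, then correct
    loglik += np.log(alpha.sum())
    alpha /= alpha.sum()

print(loglik, alpha)                # log-likelihood and filtered state probs
```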

Relevance: 30.00%

Publisher:

Abstract:

The retention factors (k) of 104 hydrophobic organic chemicals (HOCs) were measured in soil column chromatography (SCC) over columns filled with three naturally occurring reference soils and eluted with Milli-Q water. A novel method for the estimation of the soil organic carbon partition coefficient (Koc) was developed based on correlations with k in soil/water systems. Strong log Koc versus log k correlations (r>0.96) were found. The estimated Koc values were in accordance with the literature values with a maximum deviation of less than 0.4 log units. All estimated Koc values from the three soils were consistent with each other. The SCC approach is promising for fast screening of large numbers of chemicals in environmental applications. (C) 2002 Elsevier Science B.V. All rights reserved.
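The correlation step can be sketched as an ordinary least-squares fit of log Koc against log k on calibration chemicals, then applied to a new chemical's measured retention factor. The numbers below are illustrative, not the paper's data:

```python
import numpy as np

# Hypothetical calibration set: SCC retention factors k and literature
# log Koc values for a handful of reference chemicals (made-up numbers)
log_k = np.array([-0.5, 0.0, 0.4, 0.9, 1.3, 1.8])
log_koc = np.array([1.6, 2.1, 2.5, 3.0, 3.5, 4.0])

# Fit log Koc = a * log k + b
a, b = np.polyfit(log_k, log_koc, 1)

# Estimate Koc for a new chemical from its measured retention factor
log_koc_new = a * 0.6 + b
print(a, b, log_koc_new)
```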

Relevance: 30.00%

Publisher:

Abstract:

Recovering a volumetric model of a person, car, or other object of interest from a single snapshot would be useful for many computer graphics applications. 3D model estimation in general is hard, and currently requires active sensors, multiple views, or integration over time. For a known object class, however, 3D shape can be successfully inferred from a single snapshot. We present a method for generating a "virtual visual hull": an estimate of the 3D shape of an object from a known class, given a single silhouette observed from an unknown viewpoint. For a given class, a large database of multi-view silhouette examples from calibrated, though possibly varied, camera rigs is collected. To infer the virtual visual hull of a novel single-view input silhouette, we search the database for 3D shapes that are most consistent with the observed contour. The input is matched to component single views of the multi-view training examples. A set of viewpoint-aligned virtual views is generated from the visual hulls corresponding to these examples. The 3D shape estimate for the input is then found by interpolating between the contours of these aligned views. When the underlying shape is ambiguous given a single-view silhouette, we produce multiple visual hull hypotheses; if a sequence of input images is available, a dynamic programming approach is applied to find the maximum likelihood path through the feasible hypotheses over time. We show results of our algorithm on real and synthetic images of people.

Relevance: 30.00%

Publisher:

Abstract:

A novel approach for real-time skin segmentation in video sequences is described. The approach enables reliable skin segmentation despite wide variation in illumination during tracking. An explicit second order Markov model is used to predict evolution of the skin color (HSV) histogram over time. Histograms are dynamically updated based on feedback from the current segmentation and based on predictions of the Markov model. The evolution of the skin color distribution at each frame is parameterized by translation, scaling and rotation in color space. Consequent changes in geometric parameterization of the distribution are propagated by warping and re-sampling the histogram. The parameters of the discrete-time dynamic Markov model are estimated using Maximum Likelihood Estimation, and also evolve over time. Quantitative evaluation of the method was conducted on labeled ground-truth video sequences taken from popular movies.
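The core histogram-adaptation idea, blending a predicted colour model with evidence from the current segmentation, can be sketched in one dimension. This is a simplified stand-in (a 1-D histogram, a fixed blend weight, synthetic drifting pixels), not the paper's second-order Markov predictor or its warping/re-sampling step:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 1-D stand-in for the HSV skin histogram
bins = 16
hist = np.full(bins, 1.0 / bins)   # current skin-colour model

def update(hist, pixels, alpha=0.3):
    """Blend the model prediction with the histogram of pixels from the
    current segmentation; alpha sets how fast the model adapts."""
    obs, _ = np.histogram(pixels, bins=bins, range=(0.0, 1.0))
    obs = obs / max(obs.sum(), 1)
    new = (1 - alpha) * hist + alpha * obs
    return new / new.sum()

# Simulated frames in which the skin hue slowly drifts upward, as it
# would under changing illumination
for t in range(20):
    pixels = np.clip(rng.normal(0.4 + 0.01 * t, 0.05, 500), 0, 1)
    hist = update(hist, pixels)

print(np.argmax(hist))             # the model's mode follows the drift
```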

Relevance: 30.00%

Publisher:

Abstract:

Background. Kidney Disease Outcomes Quality Initiative (KDOQI) chronic kidney disease (CKD) guidelines have focused on the utility of the modified four-variable MDRD equation (now traceable by isotope dilution mass spectrometry, IDMS) in calculating estimated glomerular filtration rates (eGFRs). This study assesses the practical implications of eGFR correction equations for the range of creatinine assays currently used in the UK, and further investigates the effect of these equations on the calculated prevalence of CKD in one UK region. Methods. Using simulation, a range of creatinine data (30–300 µmol/l) was generated for male and female patients aged 20–100 years. The maximum differences between the IDMS and MDRD equations were explored for all 14 UK laboratory techniques for serum creatinine measurement, with an average of individual eGFRs calculated according to MDRD and IDMS 30 ml/min/1.73 m2. Observed data for 93,870 patients yielded a first MDRD eGFR 3 months later, of which 47,093 (71%) continued to have an eGFR
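For reference, the four-variable MDRD equation and its IDMS-traceable variant differ only in the leading constant (186 versus 175, as commonly published). The sketch below uses those published constants and is a generic illustration, not the study's implementation; the assay-specific corrections the study examines are not modelled.

```python
def egfr_mdrd(scr_umol_l, age, female=False, black=False, idms=False):
    """Four-variable MDRD eGFR (ml/min/1.73 m2). The IDMS-traceable
    version replaces the 186 constant with 175."""
    scr_mg_dl = scr_umol_l / 88.4          # convert µmol/l to mg/dl
    k = 175.0 if idms else 186.0
    egfr = k * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# The choice of equation matters most near decision thresholds such as
# 60 ml/min/1.73 m2, which is where CKD prevalence estimates shift
print(egfr_mdrd(120, 60), egfr_mdrd(120, 60, idms=True))
```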

Relevance: 30.00%

Publisher:

Abstract:

Based on experimental viscosity data collected from the literature and using density data obtained from a predictive method previously proposed by the authors, a group contribution method is proposed to estimate the viscosity of imidazolium-, pyridinium-, and pyrrolidinium-based ILs containing hexafluorophosphate (PF6), tetrafluoroborate (BF4), bis(trifluoromethanesulfonyl) amide (Tf2N), chloride (Cl), acetate (CH3COO), methyl sulfate (MeSO4), ethyl sulfate (EtSO4), and trifluoromethanesulfonate (CF3SO3) anions, covering wide ranges of temperature (293–393 K) and viscosity (4–21,000 cP). Good agreement with literature data is obtained: for circa 500 data points of the 29 ILs studied, a mean percent deviation (MPD) of 7.7% with a maximum deviation smaller than 28% was observed. 71.1% of the estimated viscosities present deviations smaller than 10% of the experimental values, while only 6.4% have deviations larger than 20%. The group contribution method developed here can thus be used to evaluate the viscosity of new ionic liquids over wide ranges of temperature at atmospheric pressure and, as data for new groups of cations and anions become available, can be extended to a larger range of ionic liquids.
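The shape of a group contribution method can be sketched generically: each structural group contributes additively to the parameters of a temperature-dependent viscosity law, here taken as ln(eta) = A + B/T. All group names and parameter values below are made-up placeholders for illustration, not the fitted parameters of the paper.

```python
import math

# Hypothetical group contributions to the parameters of ln(eta) = A + B/T
GROUPS_A = {"imidazolium": -8.0, "CH2": -0.1, "BF4": -1.0}
GROUPS_B = {"imidazolium": 3200.0, "CH2": 150.0, "BF4": 400.0}

def viscosity_cp(composition, T):
    """Viscosity in cP at temperature T (K) from group counts."""
    A = sum(n * GROUPS_A[g] for g, n in composition.items())
    B = sum(n * GROUPS_B[g] for g, n in composition.items())
    return math.exp(A + B / T)

# An illustrative [C4mim][BF4]-like composition: one imidazolium core,
# four CH2 increments, one BF4 anion
il = {"imidazolium": 1, "CH2": 4, "BF4": 1}
print(viscosity_cp(il, 298.15), viscosity_cp(il, 353.15))
```

The additive structure is what lets such methods extrapolate to new ILs: a new cation or anion only requires fitting its own group parameters.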

Relevance: 30.00%

Publisher:

Abstract:

The limited availability of experimental data, and their quality, have hindered the development of predictive methods and Computer Aided Molecular Design (CAMD) of ionic liquids (ILs). Based on experimental speed of sound data collected from the literature, the inter-relationship of surface tension (σ), density (ρ), and speed of sound (u) has been examined for imidazolium-based ILs containing hexafluorophosphate (PF6), tetrafluoroborate (BF4), bis(trifluoromethanesulphonyl) amide (NTf2), methyl sulphate (MeSO4), ethyl sulphate (EtSO4), and trifluoromethanesulphonate (CF3SO3) anions, covering wide ranges of temperature (278.15–343.15 K) and speed of sound (1129.0–1851.0 m s-1). The speed of sound was correlated with a modified Auerbach's relation, using surface tension and density data obtained from volume-based predictive methods previously proposed by the authors. Good agreement with literature data is obtained: for 133 data points of the 14 ILs studied, a mean percent deviation (MPD) of 1.96% with a maximum deviation below 5% was observed. The correlations developed here can thus be used to evaluate the speeds of sound of new ionic liquids.
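The classical Auerbach relation links the speed of sound to surface tension and density; the paper uses a modified, refitted form, so the constant and exponent below are the classical values, shown only to illustrate the structure of the correlation. The example σ and ρ values are illustrative IL-like numbers.

```python
def auerbach_u(sigma, rho):
    """Classical Auerbach relation: speed of sound u (m/s) from surface
    tension sigma (N/m) and density rho (kg/m3)."""
    return (sigma / (6.33e-10 * rho)) ** (2.0 / 3.0)

# Illustrative values: sigma = 0.045 N/m, rho = 1200 kg/m3
print(auerbach_u(0.045, 1200.0))
```

The predicted value falls inside the 1129–1851 m/s range covered by the paper's data set, which is why a refitted Auerbach-type form is a natural correlating equation.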

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE:
The aim of the study was to compare the pre-operative metabolic tumour length on FDG PET/CT with the length of the resected pathological specimen in patients with oesophageal cancer.

METHODS:
All patients diagnosed with oesophageal carcinoma who had undergone staging PET/CT imaging between June 2002 and May 2008, and who were then suitable for curative surgery, either with or without neo-adjuvant chemotherapy, were included in this study. Metabolic tumour length was assessed using both visual analysis and a maximum standardised uptake value (SUV(max)) cutoff of 2.5.
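The SUV(max)-cutoff measurement reduces to thresholding per-slice uptake and multiplying the number of supra-threshold slices by the slice thickness. A toy sketch with made-up SUV values and an assumed 5 mm slice thickness, not study data:

```python
import numpy as np

# Hypothetical per-slice SUVmax values along the oesophagus
suv_per_slice = np.array([1.1, 1.4, 2.1, 3.8, 6.2, 7.5, 5.9, 3.1, 2.0, 1.2])
slice_mm = 5.0

# Metabolic tumour length = extent of slices above the 2.5 cutoff
above = suv_per_slice > 2.5
length_mm = above.sum() * slice_mm
print(length_mm)
```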

RESULTS:
Thirty-nine patients proceeded directly to curative surgical resection, whereas 48 patients received neo-adjuvant chemotherapy followed by curative surgery. In the surgical arm, agreement within the 95% limits of agreement was closer when the metabolic tumour length was assessed visually, with a mean difference of -0.05 cm (SD 2.16 cm), than when assessed with an SUV(max) cutoff of 2.5, with a mean difference of +2.42 cm (SD 3.46 cm). In the neo-adjuvant group, agreement was again closer with visual assessment, with a mean difference of -0.6 cm (SD 1.84 cm), compared to a mean difference of +1.58 cm (SD 3.1 cm) with an SUV(max) cutoff of 2.5.
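The mean differences and 95% limits of agreement quoted above are Bland-Altman statistics: the mean of the paired differences plus or minus 1.96 times their standard deviation. A minimal sketch with illustrative paired lengths, not the study's measurements:

```python
import numpy as np

# Hypothetical paired tumour lengths (cm): PET-measured vs pathological
pet_cm = np.array([4.0, 5.5, 3.2, 6.1, 4.8, 5.0])
path_cm = np.array([4.2, 5.1, 3.5, 6.4, 4.6, 5.3])

diff = pet_cm - path_cm
mean_diff = diff.mean()               # bias between the two methods
sd = diff.std(ddof=1)                 # SD of the paired differences
loa = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
print(mean_diff, loa)
```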

CONCLUSION:
This study confirms the high accuracy of PET/CT in measuring gross target volume (GTV) length. A visual method for GTV length measurement was demonstrated to be superior and more accurate than when using an SUV(max) cutoff of 2.5. This has the potential of reducing the planning target volume with dose escalation to the tumour with a corresponding reduction in normal tissue complication probability.