934 results for Restricted maximum likelihood
Abstract:
(This Technical Report revises TR-BUCS-2003-011.) The Transmission Control Protocol (TCP) has been the protocol of choice for many Internet applications requiring reliable connections. The design of TCP has been challenged by the extension of connections over wireless links. In this paper, we investigate a Bayesian approach to infer at the source host the cause of a packet loss, whether congestion or wireless transmission error. Our approach is "mostly" end-to-end, since it requires only one long-term average quantity (namely, the long-term average packet loss probability over the wireless segment) that may be best obtained with help from the network (e.g. the wireless access agent). Specifically, we use maximum likelihood ratio tests to evaluate TCP as a classifier of the type of packet loss. We study the effectiveness of short-term classification of packet errors (congestion vs. wireless), given stationary prior error probabilities and distributions of packet delays conditioned on the type of packet loss (measured over a larger time scale). Using our Bayesian approach and extensive simulations, we demonstrate that congestion-induced losses and losses due to wireless transmission errors produce sufficiently different statistics upon which an efficient online error classifier can be built. We introduce a simple queueing model to explain the conditional delay distributions arising from different kinds of packet losses over a heterogeneous wired/wireless path. We show how hidden Markov models (HMMs) can be used by a TCP connection to infer conditional delay distributions efficiently. We demonstrate how estimation accuracy is influenced by different proportions of congestion versus wireless losses and by penalties on incorrect classification.
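The short-term decision described above can be sketched as a Bayesian likelihood-ratio test on the delay observed around a loss. The Gaussian conditional delay densities and all parameter names below are illustrative assumptions only; the paper itself estimates the conditional distributions with HMMs rather than assuming a parametric form:

```python
import math

def loss_type_lr_test(delay, mu_c, sd_c, mu_w, sd_w, p_congestion):
    """Classify a packet loss as 'congestion' or 'wireless' with a
    Bayesian likelihood-ratio test on the delay observed at loss time.

    Gaussian conditional delay densities are an illustrative assumption,
    not the paper's method (which infers the densities via HMMs)."""
    def gauss_pdf(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

    # Likelihood ratio weighted by the prior odds (MAP decision rule).
    lr = gauss_pdf(delay, mu_c, sd_c) / gauss_pdf(delay, mu_w, sd_w)
    prior_odds = p_congestion / (1.0 - p_congestion)
    return "congestion" if lr * prior_odds > 1.0 else "wireless"
```

Since congestion losses follow queue build-up, a delay near the congestion-conditioned mean tips the ratio toward "congestion", while a delay near the wireless-conditioned mean tips it the other way.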
Abstract:
A novel approach for real-time skin segmentation in video sequences is described. The approach enables reliable skin segmentation despite wide variation in illumination during tracking. An explicit second-order Markov model is used to predict the evolution of the skin color (HSV) histogram over time. Histograms are dynamically updated based on feedback from the current segmentation and on predictions of the Markov model. The evolution of the skin color distribution at each frame is parameterized by translation, scaling, and rotation in color space. Consequent changes in the geometric parameterization of the distribution are propagated by warping and re-sampling the histogram. The parameters of the discrete-time dynamic Markov model are estimated using maximum likelihood estimation and also evolve over time. Quantitative evaluation of the method was conducted on labeled ground-truth video sequences taken from popular movies.
Abstract:
A non-linear supervised learning architecture, the Specialized Mapping Architecture (SMA), and its application to articulated body pose reconstruction from single monocular images are described. The architecture is formed by a number of specialized mapping functions, each with the purpose of mapping certain portions (connected or not) of the input space, and a feedback matching process. A probabilistic model for the architecture is described, along with a mechanism for learning its parameters. The learning problem is approached using a maximum likelihood estimation framework; we present Expectation-Maximization (EM) algorithms for two different instances of the likelihood probability. Performance is characterized by estimating human body postures from low-level visual features, showing promising results.
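As a minimal sketch of the EM-based maximum likelihood framework mentioned above, the following fits a two-component one-dimensional Gaussian mixture. The SMA's specialized mapping functions are far more general; this only illustrates the alternating E- and M-steps, and every name below is ours, not the paper's:

```python
import math, random

def em_gaussian_mixture(xs, iters=50):
    """EM for a two-component 1-D Gaussian mixture: a toy instance of
    the EM / maximum likelihood machinery the SMA builds on."""
    # Crude initialisation from the data range.
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, v):
        return math.exp(-0.5 * (x - m) ** 2 / v) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in xs:
            p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances in closed form.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk + 1e-6
    return w, mu, var
```

Each iteration provably does not decrease the data log-likelihood, which is why EM is a natural fit for likelihood models with hidden assignments.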
Abstract:
This study investigates whether higher input use per stay in the hospital (treatment intensity) and longer length of stay improve outcomes of care. We allow for endogeneity of intensity and length of stay by estimating a quasi-maximum-likelihood discrete factor model, where the distribution of the unmeasured variable is modeled using a discrete distribution. Data on elderly persons come from several waves of the National Long-Term Care Survey merged with Medicare claims data for 1984-1995 and the National Death Index. We find that higher intensity improves patient survival and some dimensions of functional status among those who survive.
Abstract:
PREMISE OF THE STUDY: The Sphagnopsida, an early-diverging lineage of mosses (phylum Bryophyta), are morphologically and ecologically unique and have profound impacts on global climate. The Sphagnopsida are currently classified in two genera, Sphagnum (peat mosses) with some 350-500 species and Ambuchanania with one species. An analysis of phylogenetic relationships among species and genera in the Sphagnopsida was conducted to resolve major lineages and relationships among species within the Sphagnopsida. • METHODS: Phylogenetic analyses of nucleotide sequences from the nuclear, plastid, and mitochondrial genomes (11 704 nucleotides total) were conducted and analyzed using maximum likelihood and Bayesian inference employing seven different substitution models of varying complexity. • KEY RESULTS: Phylogenetic analyses resolved three lineages within the Sphagnopsida: (1) Sphagnum sericeum, (2) S. inretortum plus Ambuchanania leucobryoides, and (3) all remaining species of Sphagnum. Sister group relationships among these three clades could not be resolved, but the phylogenetic results indicate that the highly divergent morphology of A. leucobryoides is derived within the Sphagnopsida rather than plesiomorphic. A new classification is proposed for class Sphagnopsida, with one order (Sphagnales), three families, and four genera. • CONCLUSIONS: The Sphagnopsida are an old lineage within the phylum Bryophyta, but the extant species of Sphagnum represent a relatively recent radiation. It is likely that additional species critical to understanding the evolution of peat mosses await discovery, especially in the southern hemisphere.
Abstract:
Of key importance to oil and gas companies is the size distribution of fields in the areas that they are drilling. Recent arguments suggest that there are many more fields yet to be discovered in mature provinces than had previously been thought because the underlying distribution is monotonic not peaked. According to this view the peaked nature of the distribution for discovered fields reflects not the underlying distribution but the effect of economic truncation. This paper contributes to the discussion by analysing up-to-date exploration and discovery data for two mature provinces using the discovery-process model, based on sampling without replacement and implicitly including economic truncation effects. The maximum likelihood estimation involved generates a high-dimensional mixed-integer nonlinear optimization problem. A highly efficient solution strategy is tested, exploiting the separable structure and handling the integer constraints by treating the problem as a masked allocation problem in dynamic programming.
Abstract:
Forest fires can cause extensive damage to natural resources and properties. They can also destroy wildlife habitat, affect the forest ecosystem and threaten human lives. In this paper, extreme wildland fires are analysed using a point process model for extremes. The model, based on a generalised Pareto distribution, is used to model data on acres of wildland burnt by extreme fire in the US since 1825. A semi-parametric smoothing approach is adopted, with the maximum likelihood method used to estimate the model parameters.
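The maximum likelihood step for a generalised Pareto model of exceedances can be sketched as below. The grid-search fit is purely illustrative (a real analysis would use a numerical optimiser, and the paper's semi-parametric smoothing is not reproduced here); all function names are ours:

```python
import math, random

def gpd_neg_loglik(xs, shape, scale):
    """Negative log-likelihood of the generalised Pareto distribution
    for exceedances x >= 0 over a threshold."""
    nll = 0.0
    for x in xs:
        t = 1.0 + shape * x / scale
        if t <= 0:
            return float("inf")   # point outside the support
        if abs(shape) < 1e-9:     # exponential limit as shape -> 0
            nll += math.log(scale) + x / scale
        else:
            nll += math.log(scale) + (1.0 / shape + 1.0) * math.log(t)
    return nll

def gpd_fit(xs):
    """Crude maximum likelihood fit by grid search over (shape, scale).
    Grid ranges are arbitrary choices for this sketch."""
    best = None
    for shape in [i / 50.0 for i in range(-10, 41)]:      # -0.2 .. 0.8
        for scale in [j / 10.0 for j in range(1, 51)]:    # 0.1 .. 5.0
            nll = gpd_neg_loglik(xs, shape, scale)
            if best is None or nll < best[0]:
                best = (nll, shape, scale)
    return best[1], best[2]
```

A positive fitted shape parameter indicates a heavy upper tail, which is what makes the generalised Pareto family suitable for modelling extreme burnt acreage.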
Abstract:
The SB distributional model of Johnson's 1949 paper was introduced by a transformation to normality, that is, z ~ N(0, 1), consisting of a linear scaling to the range (0, 1), a logit transformation, and an affine transformation, z = γ + δu. The model, in its original parameterization, has often been used in forest diameter distribution modelling. In this paper, we define the SB distribution in terms of the inverse transformation from normality, including an initial linear scaling transformation, u = γ′ + δ′z (δ′ = 1/δ and γ′ = −γ/δ). The SB model in terms of the new parameterization is derived, and maximum likelihood estimation schemes are presented for both model parameterizations. The statistical properties of the two alternative parameterizations are compared empirically on 20 data sets of diameter distributions of Changbai larch (Larix olgensis Henry). The new parameterization is shown to be statistically better than Johnson's original parameterization for the data sets considered here.
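The relationship between the two parameterizations can be made concrete in code. The function names and the (ξ, λ) location/scale symbols below are our own assumptions for the sketch; `sb_inverse` applies the inverse transformation of the new parameterization, with γ′ = −γ/δ and δ′ = 1/δ as in the abstract:

```python
import math

def sb_pdf(x, xi, lam, gamma, delta):
    """Johnson SB density in the original parameterization:
    z = gamma + delta * logit((x - xi) / lam) is standard normal."""
    u = (x - xi) / lam
    z = gamma + delta * math.log(u / (1.0 - u))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    # Jacobian dz/dx of the normality transformation.
    return phi * delta * lam / ((x - xi) * (xi + lam - x))

def sb_inverse(z, xi, lam, gamma, delta):
    """Inverse transformation (the new parameterization): maps a standard
    normal deviate z to x via the rescaling u = gamma' + delta' * z."""
    gamma_p, delta_p = -gamma / delta, 1.0 / delta
    u = gamma_p + delta_p * z          # initial linear scaling of z
    return xi + lam / (1.0 + math.exp(-u))
```

Composing `sb_inverse` with the forward logit transformation returns the original normal deviate, which is the round trip the two parameterizations share.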
Abstract:
Forest fires can cause extensive damage to natural resources and properties. They can also destroy wildlife habitat, affect the forest ecosystem and threaten human lives. In this paper, incidences of extreme wildland fires are modelled by a point process model that incorporates a time trend. A model based on a generalised Pareto distribution is used to model data on acres of wildland burnt by extreme fire in the US since 1825. A semi-parametric smoothing approach, which is very useful in exploratory analysis of changes in extremes, is illustrated, with the maximum likelihood method used to estimate model parameters.
Abstract:
The Logit-Logistic (LL), Johnson's SB, and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the (skewness, kurtosis) region covered, and each has been used for modeling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known, and much-used bivariate generalization of SB, and the bivariate distributions with LL, SB, and Beta as marginals, constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness of fit is compared using the minus log-likelihood (equivalent to Akaike's Information Criterion, the AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P, and SB-2P.
Abstract:
Johnson's SB and the logit-logistic are four-parameter distribution models that may be obtained from the standard normal and logistic distributions by a four-parameter transformation. For relatively small data sets, such as diameter at breast height measurements obtained from typical sample plots, distribution models with four or fewer parameters have been found to be empirically adequate. However, in situations in which the distributions are complex, for example in mixed stands, when the stand has been thinned, or when working with aggregated data, distribution models with more shape parameters may prove necessary. By replacing the symmetric standard logistic distribution of the logit-logistic with a one-parameter “standard Richards” distribution and transforming by a five-parameter Richards function, we obtain a new six-parameter distribution model, the “Richit-Richards”. The Richit-Richards includes the “logit-Richards”, the “Richit-logistic”, and the logit-logistic as submodels. Maximum likelihood estimation is used to fit the model, and some problems in the maximum likelihood estimation of bounding parameters are discussed. An empirical case study of the Richit-Richards and its submodels is conducted on pooled diameter at breast height data from 107 sample plots of Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.). It is found that the new models provide significantly better fits than the four-parameter logit-logistic for large data sets.
Abstract:
Purpose – This study aims to analyse the influences of prestige, satisfaction, and communication on brand identification and to show how brand identification influences word-of-mouth and brand repurchase. Design/methodology/approach – A theoretical model is developed and tested with a sample of UK owners of two global car brands. Structural equation modelling was used with LISREL 8.54 and the maximum likelihood (ML) method. Findings – This paper draws mainly on the theory of social identity to provide a comprehensive understanding of the conditions under which brand owners are likely to identify with their brand and of the bases and consequences of such identification. It was shown that prestige, satisfaction, and communication affect brand identification. The study confirms that consumers' development of relationships via brand identification results in word of mouth about the brand and intentions to repurchase the brand. Furthermore, it was found that brand identification fully mediates the influences of prestige, satisfaction, and communication on word of mouth and brand repurchase. Research limitations/implications – The focus was on one country and one industry. Practical implications – Managers are provided with strategies for strengthening their customers' identification with their brand. Areas for future research are suggested; for instance, it could be interesting to test the model in a different industry and/or cultural context. Originality/value – Very few previous studies have looked at brand identification, which is surprising considering it is such an important variable influencing word-of-mouth and brand repurchase. The study tests three antecedents of brand identification and two outcomes that have not been investigated previously. Overall, the study adds knowledge in this somewhat neglected area.
Abstract:
This paper provides a mutual information performance analysis of multiple-symbol differential MPSK (M-ary phase-shift keying) over time-correlated, time-varying flat-fading communication channels. A state-space approach is used to model the time correlation of the time-varying channel phase. This approach captures the dynamics of time-correlated, time-varying channels and enables exploitation of the forward-backward algorithm for mutual information performance analysis. It is shown that differential decoding implicitly uses a sequence of innovations of the channel process time correlation and that this sequence is essentially uncorrelated. This enables utilization of multiple-symbol differential detection, as a form of block-by-block maximum likelihood sequence detection, for capacity-achieving mutual information performance. It is shown that multiple-symbol differential ML detection of BPSK and QPSK practically achieves the channel information capacity with observation times on the order of only a few symbol intervals.
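Block-by-block maximum likelihood sequence detection for differential BPSK can be sketched as follows. The brute-force search over candidate sequences is illustrative only (practical multiple-symbol detectors use far more efficient search), and the function name is ours:

```python
import cmath, itertools

def msdd_bpsk(r):
    """Multiple-symbol differential detection of BPSK over a block of
    N+1 received samples with an unknown common carrier phase.

    For a constant-envelope constellation in AWGN, the ML rule maximises
    |sum_k r_k * conj(s_k)| over candidate symbol sequences s, with
    s_0 = +1 fixed as the phase reference to remove the sign ambiguity."""
    n = len(r) - 1
    best_metric, best_seq = -1.0, None
    for bits in itertools.product([1, -1], repeat=n):
        s = (1,) + bits
        # BPSK symbols are real (+/-1), so conj(s_k) == s_k.
        metric = abs(sum(rk * sk for rk, sk in zip(r, s)))
        if metric > best_metric:
            best_metric, best_seq = metric, s
    # Recover the differentially encoded bits: b_k = s_k * s_{k-1}.
    return [s1 * s0 for s0, s1 in zip(best_seq, best_seq[1:])]
```

Because the metric depends on the received block only through a single coherent sum, the unknown common phase drops out, which is why observation windows of just a few symbols already recover most of the coherent-detection performance.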
Abstract:
Aim Recent studies have suggested that global diatom distributions are not limited by dispersal, in the case of both extant species and fossil species, but rather that environmental filtering explains their spatial patterns. Hubbell's neutral theory of biodiversity provides a framework in which to test these alternatives. Our aim is to test whether the structure of marine phytoplankton (diatom, dinoflagellate and coccolithophore) assemblages across the Atlantic agrees with neutral-theory predictions. We asked: (1) whether intersite variance in phytoplankton diversity is explained predominantly by dispersal limitation or by environmental conditions; and (2) whether species abundance distributions are consistent with those expected under the neutral model. Location Meridional transect of the Atlantic (50° N to 50° S). Methods We estimated the relative contributions of environmental factors and geographic distance to phytoplankton composition using similarity matrices, Mantel tests and variation partitioning of the species composition based upon canonical ordination methods. We compared the species abundance distribution of phytoplankton with the neutral model using Etienne's maximum-likelihood inference method. Results Phytoplankton communities are slightly more determined by niche segregation (24%) than by dispersal limitation and ecological drift (17%). In 60% of communities, the assumption of neutrality in species abundance distributions could not be rejected. In tropical zones, where oceanic gyres enclose large stable water masses, most communities showed low species immigration rates; in contrast, we infer that communities in temperate areas, outside the oligotrophic gyres, have higher rates of species immigration. Conclusions Phytoplankton community structure is consistent with partial niche assembly and partial dispersal and drift assembly (neutral processes). The role of dispersal limitation is almost as important as that of habitat filtering, a fact that has been largely overlooked in previous studies. Furthermore, the polewards increase in species immigration rates that we have discovered is probably caused by water mixing conditions and productivity.
Abstract:
An optimal search theory, the so-called Lévy-flight foraging hypothesis(1), predicts that predators should adopt search strategies known as Lévy flights where prey is sparse and distributed unpredictably, but that Brownian movement is sufficiently efficient for locating abundant prey(2-4). Empirical studies have generated controversy because the accuracy of the statistical methods used to identify Lévy behaviour has recently been questioned(5,6). Consequently, whether foragers exhibit Lévy flights in the wild remains unclear. Crucially, moreover, it has not been tested whether observed movement patterns across natural landscapes with different expected resource distributions conform to the theory's central predictions. Here we use maximum-likelihood methods to test for Lévy patterns in relation to environmental gradients in the largest animal movement data set assembled for this purpose. Strong support was found for Lévy search patterns across 14 species of open-ocean predatory fish (sharks, tuna, billfish and ocean sunfish), with some individuals switching between Lévy and Brownian movement as they traversed different habitat types. We tested the spatial occurrence of these two principal patterns and found Lévy behaviour to be associated with less productive waters (sparser prey) and Brownian movements to be associated with productive shelf or convergence-front habitats (abundant prey). These results are consistent with the Lévy-flight foraging hypothesis(1,7), supporting the contention(8,9) that organism search strategies naturally evolved in such a way that they exploit optimal Lévy patterns.
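The maximum-likelihood test for Lévy patterns can be sketched with the standard closed-form estimate of the power-law exponent for step lengths above a threshold, plus a log-likelihood comparison against an exponential (Brownian-style) alternative. Function names and the shifted-exponential competitor are our illustrative choices, not the study's exact models:

```python
import math, random

def powerlaw_mle(xs, xmin):
    """Closed-form maximum likelihood estimate of the power-law (Levy)
    exponent mu for step lengths x >= xmin:
    mu_hat = 1 + n / sum(ln(x_i / xmin))."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def loglik_powerlaw(xs, xmin, mu):
    """Log-likelihood of the power-law model with density
    (mu - 1)/xmin * (x/xmin)^(-mu) on x >= xmin."""
    n = len(xs)
    return (n * math.log((mu - 1.0) / xmin)
            - mu * sum(math.log(x / xmin) for x in xs))

def loglik_exponential(xs, xmin):
    """Log-likelihood of a Brownian-style competitor: exponential step
    lengths shifted to xmin, with its own MLE rate parameter."""
    lam = 1.0 / (sum(xs) / len(xs) - xmin)
    return len(xs) * math.log(lam) - lam * sum(x - xmin for x in xs)
```

With both models at their respective maximum likelihood parameters, the model with the higher log-likelihood (equivalently, lower AIC, since each fits one free parameter here) is favoured, which is the spirit of classifying a movement track as Lévy or Brownian.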