973 results for Trimmed likelihood
Abstract:
In this paper we formulate the nonnegative matrix factorisation (NMF) problem as a maximum likelihood estimation problem for hidden Markov models and propose online expectation-maximisation (EM) algorithms to estimate the NMF and the other unknown static parameters. We also propose a sequential Monte Carlo approximation of our online EM algorithm. We show the performance of the proposed method with two numerical examples. © 2012 IFAC.
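The online EM and sequential Monte Carlo schemes are the paper's contribution; as a simpler hedged sketch of the underlying likelihood connection, the classic batch multiplicative updates (Lee–Seung) maximise the Poisson likelihood of V under the factorisation WH, equivalently minimising the generalised KL divergence D(V ‖ WH):

```python
import numpy as np

def nmf_kl(V, rank, iters=500, seed=0):
    """Batch NMF via multiplicative updates that monotonically
    decrease D(V || W @ H), i.e. maximise the Poisson likelihood
    of V given the nonnegative factorisation W @ H."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
        W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)[None, :]
    return W, H

# Usage: a small exactly rank-1 nonnegative matrix
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
W, H = nmf_kl(V, rank=1)
print("max reconstruction error:", float(np.abs(W @ H - V).max()))
```

This is a batch stand-in, not the online algorithm of the paper; the paper's point is that the same likelihood can be optimised sequentially as data arrive.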
Abstract:
A systematic study of the parameter space of graphene chemical vapor deposition (CVD) on polycrystalline Cu foils is presented, aiming at a more fundamental process rationale, in particular regarding the choice of carbon precursor and mitigation of Cu sublimation. CH4 as precursor requires H2 dilution and temperatures ≥1000 °C to keep the Cu surface reduced and yield a high-quality, complete monolayer graphene coverage. The H2 atmosphere etches as-grown graphene; hence, maintaining a balanced CH4/H2 ratio is critical. Such balance is more easily achieved at low-pressure conditions, at which, however, Cu sublimation reaches deleterious levels. In contrast, C6H6 as precursor requires no reactive diluent and consistently gives similar graphene quality at 100–150 °C lower temperatures. The lower process temperature and more robust processing conditions allow the problem of Cu sublimation to be effectively addressed. Graphene formation is not inherently self-limited to a monolayer for any of the precursors. Rather, the higher the supplied carbon chemical potential, the higher the likelihood of film inhomogeneity and primary and secondary multilayer graphene nucleation. For the latter, domain boundaries of the inherently polycrystalline CVD graphene offer pathways for a continued carbon supply to the catalyst. Graphene formation is significantly affected by the Cu crystallography; i.e., the evolution of microstructure and texture of the catalyst template form an integral part of the CVD process. © 2012 American Chemical Society.
Abstract:
In this paper, we present an expectation-maximisation (EM) algorithm for maximum likelihood estimation in multiple target tracking (MTT) models with Gaussian linear state-space dynamics. We show that estimation of sufficient statistics for EM in a single Gaussian linear state-space model can be extended to the MTT case along with a Monte Carlo approximation for inference of unknown associations of targets. The stochastic approximation EM algorithm that we present here can be used along with any Monte Carlo method which has been developed for tracking in MTT models, such as Markov chain Monte Carlo and sequential Monte Carlo methods. We demonstrate the performance of the algorithm with a simulation. © 2012 ISIF (International Society of Information Fusion).
Abstract:
A growing number of people in Japan are now entering the elderly age category; this raises the likelihood of more persons with dementia, as the probability of becoming cognitively impaired increases with age. There is an increasing need for caregivers who are well trained and experienced and who can pay special attention to the needs of people with dementia. Technology can play an important role in helping such people and their caregivers. A further problem is a lack of mutual understanding between caregivers and researchers regarding the appropriate uses of assistive technologies. We have described the relationship between information and communication technology (ICT), especially assistive technologies, and social issues as a first step towards developing a technology roadmap. © 2012 IEEE.
Abstract:
Antibodies are known to be essential in controlling Salmonella infection, but their exact role remains elusive. We recently developed an in vitro model to investigate the relative efficiency of four different human immunoglobulin G (IgG) subclasses in modulating the interaction of the bacteria with human phagocytes. Our results indicated that different IgG subclasses affect the efficacy of Salmonella uptake by human phagocytes. In this study, we aim to quantify the effects of IgG on intracellular dynamics of infection by combining distributions of bacterial numbers per phagocyte observed by fluorescence microscopy with a mathematical model that simulates the in vitro dynamics. We then use maximum likelihood to estimate the model parameters and compare them across IgG subclasses. The analysis reveals heterogeneity in the division rates of the bacteria, strongly suggesting that a subpopulation of intracellular Salmonella, while visible under the microscope, is not dividing. Clear differences in the observed distributions among the four IgG subclasses are best explained by variations in phagocytosis and intracellular dynamics. We propose and compare potential factors affecting the replication and death of bacteria within phagocytes, and we discuss these results in the light of recent findings on dormancy of Salmonella.
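The fitting step described above can be illustrated with a deliberately simplified stand-in model (the mixture structure, parameter values and simulated data below are hypothetical, not the authors'): a fraction p of intracellular bacteria never divides, and the remainder accumulate a Poisson number of divisions; both parameters are recovered by numerically maximising the likelihood:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

def neg_log_lik(theta, counts):
    """Two-component model (hypothetical, for illustration): with
    probability p a phagocytosed bacterium never divides (count
    stays at 1); otherwise the count is 1 + Poisson(lam)."""
    p, lam = theta
    grow = poisson.pmf(counts - 1, lam)      # dividing subpopulation
    still = (counts == 1).astype(float)      # non-dividing subpopulation
    return -np.sum(np.log(p * still + (1 - p) * grow + 1e-300))

# Simulated "bacteria per phagocyte" data with a non-dividing fraction
rng = np.random.default_rng(1)
true_p, true_lam, n = 0.4, 3.0, 5000
nondividing = rng.random(n) < true_p
counts = np.where(nondividing, 1, 1 + rng.poisson(true_lam, n))

res = minimize(neg_log_lik, x0=[0.5, 1.0], args=(counts,),
               bounds=[(1e-3, 1 - 1e-3), (1e-3, None)])
p_hat, lam_hat = res.x
print(round(p_hat, 2), round(lam_hat, 2))  # near the true 0.4 and 3.0
```

The excess of counts equal to 1 relative to the Poisson component is what makes the non-dividing fraction identifiable, mirroring the paper's inference of a dormant subpopulation from count distributions.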
Abstract:
Optically fed distributed antenna system (DAS) technology is combined with passive ultra-high-frequency (UHF) radio frequency identification (RFID). It is shown that RFID signals can be carried on directly modulated radio-over-fiber links without impacting their performance. It is also shown that a multi-antenna DAS can greatly reduce the number of nulls experienced by RFID in a complex radio environment, increasing the likelihood of successful tag detection, and that optimization of the DAS reduces nulls further. We demonstrate RFID tag reading using a three-antenna DAS over a 20 m × 6 m area, limited by building constraints, where 100% of the test points can be successfully read. The detected signal strength from the tag is also observed to increase by an average of approximately 10 dB compared with a conventional switched multi-antenna RFID system. This improvement is achieved at +31 dBm equivalent isotropically radiated power (EIRP) from all three antenna units (AUs).
Abstract:
This paper investigates 'future-proofing' as an unexplored yet all-important aspect in the design of low-energy dwellings. It refers particularly to adopting lifecycle thinking and accommodating risks and uncertainties in the selection of fabric energy efficiency measures and low or zero-carbon technologies. Based on a conceptual framework for future-proofed design, the paper first presents results from the analysis of two 'best practice' housing developments in England, i.e. North West Cambridge in Cambridge and West Carclaze and Baal in St. Austell, Cornwall. Second, it examines the 'Energy and CO2 Emissions' part of the Code for Sustainable Homes to reveal which design criteria and assessment methods can be practically integrated into this established building certification scheme so that it can become more dynamic and future-oriented.
Practical application: Future-proofed construction is promoted implicitly within the increasingly stringent building regulations; however, there is no comprehensive method to readily incorporate futures thinking into the energy design of buildings. This study has a three-fold objective of relevance to the building industry:
- Illuminating the two key categories of long-term impacts in buildings, which are often erroneously treated interchangeably: the environmental impact of buildings due to their long lifecycles, and the environment's impacts on buildings due to risks and uncertainties affecting energy consumption by at least 2050, i.e. social, technological, economic, environmental and regulatory trends and drivers of change, predictable or unknown, such as climate uncertainty, home-working, technology readiness etc.
- Encouraging future-proofing from an early planning stage to reduce the likelihood of a prematurely obsolete building design.
- Enhancing established building energy assessment methods (certification, modelling or audit tools) by integrating a set of future-oriented criteria into their methodologies.
© 2012 The Chartered Institution of Building Services Engineers.
Abstract:
Electron multiplication charge-coupled devices (EMCCD) are widely used for photon counting experiments and measurements of low intensity light sources, and are extensively employed in biological fluorescence imaging applications. These devices have a complex statistical behaviour that is often not fully considered in the analysis of EMCCD data. Robust and optimal analysis of EMCCD images requires an understanding of their noise properties, in particular to exploit fully the advantages of Bayesian and maximum-likelihood analysis techniques, whose value is increasingly recognised in biological imaging for obtaining robust quantitative measurements from challenging data. To improve our own EMCCD analysis and as an effort to aid that of the wider bioimaging community, we present, explain and discuss a detailed physical model for EMCCD noise properties, giving a likelihood function for image counts in each pixel for a given incident intensity, and we explain how to measure the parameters for this model from various calibration images. © 2013 Hirsch et al.
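A hedged sketch of such a per-pixel likelihood, using a commonly cited simplified EMCCD model rather than necessarily the exact one in the paper: Poisson photon arrivals, Gamma-distributed electron-multiplication gain, and Gaussian read noise, with the read-noise blur of the Gamma term neglected on the assumption that the gain g is much larger than sigma:

```python
import numpy as np
from scipy.stats import poisson, gamma, norm

def emccd_loglik(x, lam, g=30.0, sigma=5.0, nmax=40):
    """Per-pixel log-likelihood of bias-subtracted output x (in
    electrons) given incident intensity lam photons: Poisson photon
    number n, Gamma(n, g) EM-register output for n >= 1, Gaussian
    read noise. The read-noise blur of the Gamma term is neglected
    here (an assumption, reasonable when g >> sigma)."""
    p = poisson.pmf(0, lam) * norm.pdf(x, 0.0, sigma)   # no photons
    for n in range(1, nmax + 1):
        p = p + poisson.pmf(n, lam) * gamma.pdf(x, a=n, scale=g)
    return np.log(p + 1e-300)

# Simulate pixels from the full model, then recover lam by ML grid search
rng = np.random.default_rng(0)
true_lam, g, sigma = 2.0, 30.0, 5.0
n_ph = rng.poisson(true_lam, 5000)
out = np.where(n_ph > 0, rng.gamma(np.maximum(n_ph, 1), g), 0.0) \
      + rng.normal(0.0, sigma, n_ph.size)
grid = np.linspace(0.5, 4.0, 36)
ll = [emccd_loglik(out, lam).sum() for lam in grid]
lam_hat = grid[int(np.argmax(ll))]
print(round(float(lam_hat), 1))  # close to the true 2.0
```

The demonstration is the point of the abstract in miniature: once the noise model is written as an explicit likelihood, the incident intensity can be estimated by maximum likelihood rather than by treating counts as Gaussian.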
Abstract:
We study unsupervised learning in a probabilistic generative model for occlusion. The model uses two types of latent variables: one indicates which objects are present in the image, and the other how they are ordered in depth. This depth order then determines how the positions and appearances of the objects present, specified in the model parameters, combine to form the image. We show that the object parameters can be learnt from an unlabelled set of images in which objects occlude one another. Exact maximum-likelihood learning is intractable. However, we show that tractable approximations to Expectation Maximization (EM) can be found if the training images each contain only a small number of objects on average. In numerical experiments it is shown that these approximations recover the correct set of object parameters. Experiments on a novel version of the bars test using colored bars, and experiments on more realistic data, show that the algorithm performs well in extracting the generating causes. Experiments based on the standard bars benchmark test for object learning show that the algorithm performs well in comparison to other recent component extraction approaches. The model and the learning algorithm thus connect research on occlusion with the research field of multiple-causes component extraction methods.
Abstract:
Conventional hidden Markov models (HMMs) generally consist of a Markov chain observed through a linear map corrupted by additive noise. This general class of model has enjoyed a huge and diverse range of applications, for example, speech processing, biomedical signal processing and, more recently, quantitative finance. However, a lesser known extension of this general class of model is the so-called Factorial Hidden Markov Model (FHMM). FHMMs also have diverse applications, notably in machine learning, artificial intelligence and speech recognition [13, 17]. FHMMs extend the usual class of HMMs by supposing the partially observed state process is a finite collection of distinct Markov chains, either statistically independent or dependent. There is also considerable current activity in applying collections of partially observed Markov chains to complex action recognition problems, see, for example, [6]. In this article we consider the Maximum Likelihood (ML) parameter estimation problem for FHMMs. Much of the extant literature concerning this problem presents parameter estimation schemes based on full-data log-likelihood EM algorithms. This approach can be slow to converge and often imposes heavy demands on computer memory. The latter point is particularly relevant for the class of FHMMs, where state space dimensions are relatively large. The contribution in this article is to develop new recursive formulae for a filter-based EM algorithm that can be implemented online. Our new formulae yield equivalent ML estimators; however, they are purely recursive and so significantly reduce numerical complexity and memory requirements. A computer simulation is included to demonstrate the performance of our results. © Taylor & Francis Group, LLC.
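The building block of any filter-based (rather than smoother-based) EM scheme is an online, normalised forward filter recursion; a minimal sketch for a single discrete-state HMM (for an FHMM the same recursion runs on the product state space of the component chains) is:

```python
import numpy as np

def hmm_filter_step(f, A, B, obs):
    """One online step of the normalised HMM forward filter:
    predict through the transition matrix A, correct by the
    observation likelihoods B[:, obs], renormalise."""
    p = (A.T @ f) * B[:, obs]
    return p / p.sum()

# Usage: a 2-state chain observed through a noisy discrete channel
A = np.array([[0.9, 0.1],      # A[i, j] = P(next = j | current = i)
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],      # B[i, y] = P(obs = y | state = i)
              [0.3, 0.7]])
f = np.array([0.5, 0.5]) * B[:, 0]
f /= f.sum()                   # filtered distribution after first obs
for obs in [0, 0, 1]:
    f = hmm_filter_step(f, A, B, obs)
print(f.round(3))              # valid distribution over the 2 states
```

The article's recursions extend this idea from the state distribution to the EM sufficient statistics themselves, which is what removes the need to store and re-sweep the full observation history.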
Abstract:
The code provided here originally demonstrated the main algorithms from Rasmussen and Williams: Gaussian Processes for Machine Learning. It has since grown to allow more likelihood functions, further inference methods and a flexible framework for specifying GPs.
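As a hedged sketch of the toolbox's core computation, written here in Python rather than the toolbox's own MATLAB/Octave, exact GP regression with a squared-exponential covariance follows Algorithm 2.1 of Rasmussen and Williams (the hyperparameter names ell, sf, sn below are our own):

```python
import numpy as np

def gp_posterior(X, y, Xs, ell=1.0, sf=1.0, sn=0.1):
    """Exact GP regression with a squared-exponential covariance
    (Rasmussen & Williams, Algorithm 2.1): Cholesky factorisation,
    then predictive mean and noise-free variance at test inputs Xs."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)
    K = k(X, X) + sn ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(X, Xs)
    mu = Ks.T @ alpha                        # predictive mean
    v = np.linalg.solve(L, Ks)
    var = sf ** 2 - np.sum(v ** 2, axis=0)   # predictive variance
    return mu, var

# Usage: noise-free samples of sin(x); predict at pi/2
X = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(X)
mu, var = gp_posterior(X, y, np.array([np.pi / 2]))
print(round(float(mu[0]), 2))  # close to sin(pi/2) = 1.0
```

The toolbox generalises exactly this pattern: swap in other covariance functions, non-Gaussian likelihoods, and approximate inference methods while keeping the same interface.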
Abstract:
The possibility that we will have to invest effort influences our future choice behavior. Indeed, deciding whether an action is actually worth taking is a key element in the expression of human apathy or inertia. There is a well-developed literature on brain activity related to the anticipation of effort, but how effort affects actual choice is less well understood. Furthermore, prior work is largely restricted to mental as opposed to physical effort, or has confounded temporal with effortful costs. Here we investigated choice behavior and brain activity, using functional magnetic resonance imaging, in a study where healthy participants were required to make decisions between effortful gripping, in which the factors of force (high and low) and reward (high and low) were varied, and merely holding a grip device for minimal monetary reward. Behaviorally, we show that force level influences the likelihood of choosing an effortful grip. We observed greater activity in the putamen when participants opt to grip an option with low effort compared with when they opt to grip an option with high effort. The results suggest that, over and above a nonspecific role in movement anticipation and salience, the putamen plays a crucial role in computations for choice that involves effort costs.
Abstract:
The complete cytochrome b gene sequences of 14 species of lower cyprinid fishes distributed mainly in East Asia were obtained by PCR. After the resulting 1,140 bp cytochrome b sequences were aligned with the homologous sequences of 10 related cyprinid species from GenBank, distributed in North America and Europe, a DNA sequence matrix of 24 cyprinid species was obtained. Maximum likelihood analysis of this matrix yielded a phylogenetic cladogram of the lower cyprinids and related species. The cladogram shows that the subfamily Leuciscinae and the … subfamily of the Cyprinidae do not form monophyletic groups. Within the … subfamily, Opsariichthys and … are primitive cyprinids located at the base of the cladogram, while the remaining fishes of the … subfamily are scattered