958 results for Standard models


Relevance:

30.00%

Publisher:

Abstract:

Nonlinear non-Gaussian state-space models arise in numerous applications in control and signal processing. Sequential Monte Carlo (SMC) methods, also known as Particle Filters, provide very good numerical approximations to the associated optimal state estimation problems. However, in many scenarios, the state-space model of interest also depends on unknown static parameters that need to be estimated from the data. In this context, standard SMC methods fail and it is necessary to rely on more sophisticated algorithms. The aim of this paper is to present a comprehensive overview of SMC methods that have been proposed to perform static parameter estimation in general state-space models. We discuss the advantages and limitations of these methods. © 2009 IFAC.
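The state estimation problem that SMC methods solve can be sketched with a standard bootstrap particle filter on a toy one-dimensional linear-Gaussian state-space model. This is the basic known-parameter algorithm, not any of the static-parameter estimation methods surveyed in the paper, and all model parameters below are illustrative:

```python
import math
import random

random.seed(0)

def bootstrap_pf(ys, n_particles=500, phi=0.9, sig_x=1.0, sig_y=0.5):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sig_x^2),
    y_t = x_t + N(0, sig_y^2). Returns the filtered means E[x_t | y_1:t]."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in ys:
        # Propagate each particle through the state transition (proposal = prior).
        particles = [phi * x + random.gauss(0.0, sig_x) for x in particles]
        # Weight by the observation likelihood N(y; x, sig_y^2).
        w = [math.exp(-0.5 * ((y - x) / sig_y) ** 2) for x in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        means.append(sum(wi * xi for wi, xi in zip(w, particles)))
        # Multinomial resampling to fight weight degeneracy.
        particles = random.choices(particles, weights=w, k=n_particles)
    return means

# Simulate data from the same model, then filter it.
xs, ys = [], []
x = 0.0
for _ in range(50):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(x + random.gauss(0.0, 0.5))
est = bootstrap_pf(ys)
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(est, xs)) / len(xs))
```

The static-parameter problem the paper surveys arises when, say, `phi` or `sig_x` above is unknown: naively appending them to the state degenerates, which is why the more sophisticated algorithms reviewed are needed.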

Relevance:

30.00%

Publisher:

Abstract:

We consider the general problem of constructing nonparametric Bayesian models on infinite-dimensional random objects, such as functions, infinite graphs or infinite permutations. The problem has generated much interest in machine learning, where it is treated heuristically, but has not been studied in full generality in non-parametric Bayesian statistics, which tends to focus on models over probability distributions. Our approach applies a standard tool of stochastic process theory, the construction of stochastic processes from their finite-dimensional marginal distributions. The main contribution of the paper is a generalization of the classic Kolmogorov extension theorem to conditional probabilities. This extension allows a rigorous construction of nonparametric Bayesian models from systems of finite-dimensional, parametric Bayes equations. Using this approach, we show (i) how existence of a conjugate posterior for the nonparametric model can be guaranteed by choosing conjugate finite-dimensional models in the construction, (ii) how the mapping to the posterior parameters of the nonparametric model can be explicitly determined, and (iii) that the construction of conjugate models in essence requires the finite-dimensional models to be in the exponential family. As an application of our constructive framework, we derive a model on infinite permutations, the nonparametric Bayesian analogue of a model recently proposed for the analysis of rank data.

Relevance:

30.00%

Publisher:

Abstract:

Most previous work on trainable language generation has focused on two paradigms: (a) using a statistical model to rank a set of generated utterances, or (b) using statistics to inform the generation decision process. Both approaches rely on the existence of a handcrafted generator, which limits their scalability to new domains. This paper presents BAGEL, a statistical language generator which uses dynamic Bayesian networks to learn from semantically-aligned data produced by 42 untrained annotators. A human evaluation shows that BAGEL can generate natural and informative utterances from unseen inputs in the information presentation domain. Additionally, generation performance on sparse datasets is improved significantly by using certainty-based active learning, yielding ratings close to the human gold standard with a fraction of the data. © 2010 Association for Computational Linguistics.

Relevance:

30.00%

Publisher:

Abstract:

In natural languages, multiple word sequences can represent the same underlying meaning. Only modelling the observed surface word sequence can result in poor context coverage, for example, when using n-gram language models (LMs). To handle this issue, this paper presents a novel form of language model, the paraphrastic LM. A phrase-level transduction model that is statistically learned from standard text data is used to generate paraphrase variants. LM probabilities are then estimated by maximizing their marginal probability. Significant error rate reductions of 0.5%-0.6% absolute were obtained on a state-of-the-art conversational telephone speech recognition task using a paraphrastic multi-level LM modelling both word and phrase sequences.
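The marginalization over paraphrase variants can be sketched as follows, with a hand-built toy phrase table and base LM. All phrases and probabilities are invented for illustration; in the paper the phrase-level transduction model is learned from data rather than specified by hand:

```python
import itertools

# Toy phrase-level paraphrase table: P(variant | phrase). Values invented.
paraphrases = {
    "thank you": {"thank you": 0.7, "thanks": 0.3},
    "very much": {"very much": 0.8, "a lot": 0.2},
}

# Toy base LM over paraphrased word strings (an n-gram LM in practice).
base_lm = {
    "thank you very much": 0.04,
    "thanks very much": 0.05,
    "thank you a lot": 0.01,
    "thanks a lot": 0.06,
}

def paraphrastic_prob(phrases):
    """Marginalize over paraphrase sequences: sum over every combination of
    phrase variants of base-LM probability times transduction probability."""
    total = 0.0
    options = [list(paraphrases[p].items()) for p in phrases]
    for combo in itertools.product(*options):
        variant = " ".join(v for v, _ in combo)
        trans = 1.0
        for _, prob in combo:
            trans *= prob
        total += base_lm.get(variant, 0.0) * trans
    return total

p = paraphrastic_prob(["thank you", "very much"])
```

The effect is that probability mass observed on any one surface form is shared across its paraphrases, improving context coverage for forms that are rare in the training text.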

Relevance:

30.00%

Publisher:

Abstract:

Processing networks are a variant of the standard linear programming network model which are especially useful for optimizing industrial energy/environment systems. Modelling advantages include an intuitive diagrammatic representation and the ability to incorporate all forms of energy and pollutants in a single integrated linear network model. Added advantages include increased speed of solution and algorithms supporting formulation. The paper explores their use in modelling the energy and pollution control systems in large industrial plants. The pollution control options in an ethylene production plant are analyzed as an example. PROFLOW, a computer tool for the formulation, analysis, and solution of processing network models, is introduced.
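The distinguishing feature of a processing network, nodes that convert an input flow into outputs in fixed proportions, can be sketched as follows. Node names and proportions are invented for illustration and do not reproduce PROFLOW's actual formulation:

```python
# A processing node converts one input flow into outputs in fixed
# proportions, e.g. a furnace turning fuel into useful heat plus flue-gas
# pollutants. Definitions below are illustrative, not from PROFLOW.
network = {
    "furnace": {"heat": 0.85, "co2": 0.10, "so2": 0.05},
    "scrubber": {"clean_gas": 0.9, "sludge": 0.1},
}

def process(node, flow):
    """Split an input flow through a node's fixed output proportions."""
    return {out: flow * frac for out, frac in network[node].items()}

# 100 units of fuel into the furnace; route the SO2 stream to a scrubber.
out = process("furnace", 100.0)
treated = process("scrubber", out["so2"])
```

In the full linear-programming model these proportionality constraints sit alongside ordinary network-flow conservation constraints, which is what allows energy carriers and pollutants to coexist in one integrated model.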

Relevance:

30.00%

Publisher:

Abstract:

A framework for adaptive and non-adaptive statistical compressive sensing is developed, where a statistical model replaces the standard sparsity model of classical compressive sensing. We propose within this framework optimal task-specific sensing protocols specifically and jointly designed for classification and reconstruction. A two-step adaptive sensing paradigm is developed, where online sensing is applied to detect the signal class in the first step, followed by a reconstruction step adapted to the detected class and the observed samples. The approach is based on information theory, here tailored for Gaussian mixture models (GMMs), where an information-theoretic objective relationship between the sensed signals and a representation of the specific task of interest is maximized. Experimental results using synthetic signals, Landsat satellite attributes, and natural images of different sizes and with different noise levels show the improvements achieved using the proposed framework when compared to more standard sensing protocols. The underlying formulation can be applied beyond GMMs, at the price of higher mathematical and computational complexity. © 1991-2012 IEEE.
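The two-step adaptive paradigm can be illustrated in one dimension with a two-class Gaussian model: a first measurement detects the class by marginal likelihood, and reconstruction then uses the posterior mean under the detected class. This is a minimal sketch with invented parameters, not the paper's information-theoretic sensing design:

```python
import math
import random

random.seed(1)

# Two signal classes, each a 1-D Gaussian prior (mean, std). Illustrative.
classes = {0: (0.0, 1.0), 1: (10.0, 1.0)}
noise = 0.5  # measurement noise std

def loglik(y, mu, s):
    """Log marginal likelihood of a noisy measurement y = x + noise."""
    v = s * s + noise * noise
    return -0.5 * math.log(2 * math.pi * v) - (y - mu) ** 2 / (2 * v)

def sense(x):
    return x + random.gauss(0.0, noise)

def adaptive_estimate(x):
    """Step 1: detect the signal class from a first measurement.
       Step 2: reconstruct via the posterior mean under the detected class."""
    y1 = sense(x)
    k = max(classes, key=lambda c: loglik(y1, *classes[c]))
    mu, s = classes[k]
    y2 = sense(x)  # second measurement, now class-adapted
    prec = 1 / s**2 + 2 / noise**2
    xhat = (mu / s**2 + (y1 + y2) / noise**2) / prec
    return k, xhat

true_class = 1
x = random.gauss(*classes[true_class])
k, xhat = adaptive_estimate(x)
```

The paper's framework generalizes this idea to high-dimensional GMM priors with sensing matrices chosen to maximize an information-theoretic objective for the task at hand.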

Relevance:

30.00%

Publisher:

Abstract:

The powerful general Pacala-Hassell host-parasitoid model for a patchy environment, which allows host density-dependent heterogeneity (HDD) to be distinguished from between-patch, host density-independent heterogeneity (HDI), is reformulated within the generalized linear model (GLM) family. This improves accessibility through the provision of general software within well-known statistical systems, and allows a rich variety of models to be formulated. Covariates such as age class, host density and abiotic factors may be included easily. For the case where there is no HDI, the formulation is a simple GLM. When there is HDI in addition to HDD, the formulation is a hierarchical generalized linear model. Two forms of HDI model are considered, both with between-patch variability: one has binomial variation within patches and one has extra-binomial, overdispersed variation within patches. Examples are given demonstrating parameter estimation with standard errors, and hypothesis testing. For one example given, the extra-binomial component of the HDI heterogeneity in parasitism is itself shown to be strongly density dependent.
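The HDD-only case, parasitism risk varying with host density through a binomial GLM, can be sketched with a plain logit-link fit by Fisher scoring on simulated patches. This is a generic binomial GLM illustration, not the Pacala-Hassell reformulation itself, and all parameter values are invented:

```python
import math
import random

random.seed(2)

# Simulate patches in which parasitism probability rises with log host
# density (host density-dependent heterogeneity), logit-link binomial.
b0_true, b1_true = -1.0, 0.8
patches = []
for _ in range(200):
    n = random.randint(5, 50)  # hosts in the patch
    p = 1 / (1 + math.exp(-(b0_true + b1_true * math.log(n))))
    k = sum(random.random() < p for _ in range(n))  # parasitised hosts
    patches.append((n, k))

def fit_logistic(patches, iters=25):
    """Fisher-scoring fit of logit(p) = b0 + b1*log(host density)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for n, k in patches:
            x = math.log(n)
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            w = n * p * (1 - p)          # binomial information weight
            g0 += k - n * p              # score components
            g1 += (k - n * p) * x
            h00 += w; h01 += w * x; h11 += w * x * x
        det = h00 * h11 - h01 * h01      # invert the 2x2 information matrix
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

b0, b1 = fit_logistic(patches)
```

Adding HDI on top of this corresponds to letting each patch carry its own random intercept, which is where the hierarchical GLM formulation of the paper comes in.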

Relevance:

30.00%

Publisher:

Abstract:

Dimethylsulphide (DMS) is a globally important aerosol precursor. In 1987, Charlson and others proposed that an increase in DMS production by certain phytoplankton species in response to a warming climate could stimulate increased aerosol formation, increasing the lower atmosphere's albedo, and promoting cooling. Despite two decades of research, the global significance of this negative climate feedback remains contentious. It is therefore imperative that schemes are developed and tested which allow for the realistic incorporation of phytoplankton DMS production into Earth System models. Using these models we can investigate the DMS-climate feedback and reduce uncertainty surrounding projections of future climate. Here we examine two empirical DMS parameterisations within the context of an Earth System model and find them to perform marginally better than the standard DMS climatology at predicting observations from an independent global dataset. We then question whether parameterisations based on our present understanding of DMS production by phytoplankton, and simple enough to incorporate into global climate models, can be shown to enhance the future predictive capacity of those models. This is an important question to ask now, as results from increasingly complex Earth System models lead us into the 5th assessment of climate science by the Intergovernmental Panel on Climate Change. Comparing observed and predicted inter-annual variability, we suggest that future climate projections may underestimate the magnitude of surface ocean DMS change. Unfortunately this conclusion relies on a relatively small dataset, in which observed inter-annual variability may be exaggerated by biases in sample collection. We therefore encourage the observational community to make repeat measurements of sea-surface DMS concentrations an important focus, and highlight areas of apparent high inter-annual variability where sampling might be carried out.
Finally, we assess future projections from two similarly valid empirical DMS schemes, and demonstrate contrasting results. We therefore conclude that the use of empirical DMS parameterisations within simulations of future climate should be undertaken only with careful appreciation of the caveats discussed.

Relevance:

30.00%

Publisher:

Abstract:

This paper reports the findings from a discrete-choice experiment designed to estimate the economic benefits associated with rural landscape improvements in Ireland. Using a mixed logit model, the panel nature of the dataset is exploited to retrieve willingness-to-pay values for every individual in the sample. This departs from customary approaches in which the willingness-to-pay estimates are normally expressed as measures of central tendency of an a priori distribution. Random-effects models for panel data are subsequently used to identify the determinants of the individual-specific willingness-to-pay estimates. In comparison with the standard methods used to incorporate individual-specific variables into the analysis of discrete-choice experiments, the analytical approach outlined in this paper is shown to add considerable explanatory power to the welfare estimates.
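The individual-specific willingness-to-pay calculation reduces to a ratio of coefficients for each respondent: the marginal utility of the landscape attribute divided by the (negative of the) marginal utility of cost. The coefficients below are simulated purely for illustration, not estimates from the Irish dataset:

```python
import random
import statistics

random.seed(3)

# Hypothetical individual-specific coefficients from a mixed logit model:
# a landscape-attribute coefficient and a negative cost coefficient per
# respondent. Simulated values, not estimates from the Irish survey.
respondents = [
    {"b_landscape": random.gauss(1.2, 0.4),
     "b_cost": -abs(random.gauss(0.05, 0.01))}
    for _ in range(500)
]

def wtp(r):
    """Marginal willingness to pay = -b_attribute / b_cost."""
    return -r["b_landscape"] / r["b_cost"]

wtps = [wtp(r) for r in respondents]
mean_wtp = statistics.mean(wtps)
```

Having one WTP value per respondent, rather than a single population distribution, is what allows the second-stage random-effects regressions in the paper to explain WTP with individual characteristics.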

Relevance:

30.00%

Publisher:

Abstract:

A comparative study of models used to predict contaminant dispersion in a partially stratified room is presented. The experiments were carried out in a ventilated test room, with an initially evenly dispersed pollutant. Air was extracted from the outlet in the ceiling of the room at 1 and 3 air changes per hour. A small temperature difference between the top and bottom of the room causes very low air velocities, and higher concentrations, in the lower half of the room. Grid-independent CFD calculations were compared with predictions from a zonal model and from CFD using a very coarse grid. All the calculations show broadly similar contaminant concentration decay rates for the three locations monitored in the experiments, with the zonal model performing surprisingly well. For the lower air change rate, the models predict a less well mixed contaminant distribution than the experimental measurements suggest. With run times of less than a few minutes, the zonal model is around two orders of magnitude faster than coarse-grid CFD and could therefore be used more easily in parametric studies and sensitivity analyses. For a more detailed picture of internal dispersion, a CFD study using coarse and standard grids may be more appropriate.
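A zonal model of the kind compared here reduces to coupled mass balances between a small number of well-mixed zones, which is why it runs orders of magnitude faster than CFD. The following two-zone sketch uses illustrative volumes and flows, not the experimental configuration from the paper:

```python
# Two-zone mass balance for contaminant decay in a ventilated room: fresh
# air enters the lower zone, air leaves via the ceiling extract, and
# stratification limits buoyancy-driven exchange between the zones.
# All parameters are illustrative.
V = 25.0                   # volume of each zone, m^3
ach = 1.0                  # air changes per hour for the whole room
Q = 2 * V * ach / 3600.0   # ventilation flow rate, m^3/s
Qx = 0.5 * Q               # additional inter-zone exchange flow, m^3/s

c_low = c_up = 1.0         # initially evenly dispersed pollutant (relative)
dt, hours = 1.0, 2
for _ in range(int(3600 * hours / dt)):
    # Lower zone: fresh air in at zero concentration, transfer upward,
    # plus two-way exchange with the upper zone.
    dlow = (-Q * c_low + Qx * (c_up - c_low)) / V
    # Upper zone: inflow from the lower zone, ceiling extract, exchange.
    dup = (Q * c_low - Q * c_up + Qx * (c_low - c_up)) / V
    c_low += dt * dlow
    c_up += dt * dup
```

Sweeping parameters such as `Qx` in a model like this is cheap, which is the kind of parametric study the paper suggests zonal models are suited to.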

Relevance:

30.00%

Publisher:

Abstract:

We have previously identified differentially expressed genes in cell models of diabetic nephropathy and renal biopsies. Here we have performed quantitative DNA methylation profiling in cell models of diabetic nephropathy. Over 3,000 CpG units in the promoter regions of 192 candidate genes were assessed in unstimulated human mesangial cells (HMCs) and proximal tubular epithelial cells (PTCs) compared to HMCs or PTCs exposed to appropriate stimuli. A total of 301 CpG units across 38 genes (~20%) were identified as differentially methylated in unstimulated HMCs versus PTCs. Analysis of amplicon methylation values in unstimulated versus stimulated cell models failed to demonstrate a >20% difference between amplicons. In conclusion, our results demonstrate that specific DNA methylation signatures are present in HMCs and PTCs, and standard protocols for exposure of renal cells to stimuli that alter gene expression may be insufficient to replicate possible alterations in DNA methylation profiles in diabetic nephropathy.
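The amplicon-level screen described, flagging a difference only when mean methylation differs by more than 20 percentage points between the two cell types, can be sketched as follows. Gene names and methylation fractions are invented for illustration:

```python
# Mean CpG-unit methylation fractions per promoter amplicon in two cell
# types; values and gene names are invented, not data from the study.
hmc = {"geneA_prom": [0.80, 0.75, 0.82], "geneB_prom": [0.30, 0.35, 0.28]}
ptc = {"geneA_prom": [0.40, 0.45, 0.38], "geneB_prom": [0.33, 0.30, 0.31]}

def differential(a, b, threshold=0.20):
    """Flag an amplicon whose mean methylation differs by > threshold."""
    mean = lambda xs: sum(xs) / len(xs)
    return abs(mean(a) - mean(b)) > threshold

hits = [g for g in hmc if differential(hmc[g], ptc[g])]
```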

Relevance:

30.00%

Publisher:

Abstract:

Polypropylene (PP), a semi-crystalline material, is typically solid-phase thermoformed at temperatures associated with crystalline melting, generally in the 150 to 160 °C range. In this very narrow thermoforming window the mechanical properties of the material rapidly decline with increasing temperature, and these large changes in properties make Polypropylene one of the more difficult materials to process by thermoforming. Measurement of the deformation behaviour of a material under processing conditions is particularly important for accurate numerical modelling of thermoforming processes. This paper presents the findings of a study into the physical behaviour of industrial thermoforming grades of Polypropylene. Practical tests were performed using custom-built materials testing machines and thermoforming equipment at Queen's University Belfast. Numerical simulations of these processes were constructed to replicate thermoforming conditions using industry standard Finite Element Analysis software, namely ABAQUS and custom-built user material model subroutines. Several variant constitutive models were used to represent the behaviour of the Polypropylene materials during processing. This included a range of phenomenological, rheological and blended constitutive models. The paper discusses approaches to modelling industrial plug-assisted thermoforming operations using Finite Element Analysis techniques and the range of material models constructed and investigated. It directly compares practical results to numerical predictions. The paper culminates by discussing the learning points from using Finite Element Methods to simulate the plug-assisted thermoforming of Polypropylene, which presents complex contact, thermal, friction and material modelling challenges. The paper makes recommendations as to the relative importance of these inputs in general terms with regard to correlating to experimentally gathered data.
The paper also presents recommendations as to the approaches to be taken to secure simulation predictions of improved accuracy.

Relevance:

30.00%

Publisher:

Abstract:

Background/Aims: Hepatocellular carcinoma is a leading cause of global cancer mortality, with standard chemotherapy being minimally effective in prolonging survival. We investigated if combined targeting of vascular endothelial growth factor protein and expression might affect hepatocellular carcinoma growth and angiogenesis.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the use of the Euler equations for the generation and testing of tabular aerodynamic models for flight dynamics analysis. Maneuvers for the AGARD Standard Dynamics Model sharp leading-edge wind-tunnel geometry are considered as a test case. Wind-tunnel data is first used to validate the prediction of static and dynamic coefficients at both low and high angles, featuring complex vortical flow, with good agreement obtained at low to moderate angles of attack. Then the generation of aerodynamic tables is described based on a data fusion approach. Time-optimal maneuvers are generated based on these tables, including level flight trim, pull-ups at constant and varying incidence, and level and 90-degree turns. The maneuver definition includes the aircraft states and also the control deflections to achieve the motion. The main point of the paper is then to assess the validity of the aerodynamic tables which were used to define the maneuvers. This is done by replaying them, including the control surface motions, through the time-accurate computational fluid dynamics code. The resulting forces and moments are compared with the tabular values to assess the presence of inadequately modeled dynamic or unsteady effects. The agreement between the tables and the replay is demonstrated for slow maneuvers. Increasing rate maneuvers show discrepancies which are ascribed to vortical flow hysteresis at the higher rate motions. The framework is suitable for application to more complex viscous flow models, and is powerful for the assessment of the validity of aerodynamics models of the type currently used for studies of flight dynamics.
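At its core, a tabular aerodynamic model is interpolation of stored coefficients over flight-condition variables. The sketch below uses bilinear interpolation of a lift coefficient over an angle-of-attack/Mach grid; the table values are invented, not AGARD SDM data or the paper's fused dataset:

```python
import bisect

# Lift coefficient tabulated on an (angle of attack, Mach) grid and queried
# during a maneuver replay. All values below are illustrative.
alphas = [0.0, 5.0, 10.0, 15.0]   # angle of attack, deg
machs = [0.3, 0.6, 0.9]
CL = [                             # rows: alpha index, columns: Mach index
    [0.00, 0.00, 0.00],
    [0.35, 0.38, 0.42],
    [0.68, 0.72, 0.80],
    [0.95, 1.00, 1.05],
]

def interp_index(grid, x):
    """Locate the bracketing cell and the fractional position within it."""
    i = max(0, min(bisect.bisect_right(grid, x) - 1, len(grid) - 2))
    t = (x - grid[i]) / (grid[i + 1] - grid[i])
    return i, t

def cl_lookup(alpha, mach):
    """Bilinear interpolation of the tabulated lift coefficient."""
    i, u = interp_index(alphas, alpha)
    j, v = interp_index(machs, mach)
    return ((1 - u) * (1 - v) * CL[i][j] + u * (1 - v) * CL[i + 1][j]
            + (1 - u) * v * CL[i][j + 1] + u * v * CL[i + 1][j + 1])
```

The replay test in the paper amounts to feeding the maneuver's state and control histories through the CFD code and comparing the result against lookups like these; dynamic effects such as vortical hysteresis are exactly what a quasi-steady table of this form cannot capture.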

Relevance:

30.00%

Publisher:

Abstract:

Motivated by recent models involving off-centre ignition of Type Ia supernova explosions, we undertake three-dimensional time-dependent radiation transport simulations to investigate the range of bolometric light-curve properties that could be observed from supernovae in which there is a lop-sided distribution of the products from nuclear burning. We consider both a grid of artificial toy models which illustrate the conceivable range of effects and a recent three-dimensional hydrodynamical explosion model. We find that observationally significant viewing angle effects are likely to arise in such supernovae and that these may have important ramifications for the interpretation of the observed diversity of Type Ia supernovae and the systematic uncertainties which relate to their use as standard candles in contemporary cosmology. © 2007 RAS.