264 results for Euclidean distance model
Abstract:
The IWA Anaerobic Digestion Modelling Task Group was established in 1997 at the 8th World Congress on Anaerobic Digestion (Sendai, Japan) with the goal of developing a generalised anaerobic digestion model. The structured model includes multiple steps describing biochemical as well as physico-chemical processes. The biochemical steps include disintegration from homogeneous particulates to carbohydrates, proteins and lipids; extracellular hydrolysis of these particulate substrates to sugars, amino acids, and long chain fatty acids (LCFA), respectively; acidogenesis from sugars and amino acids to volatile fatty acids (VFAs) and hydrogen; acetogenesis of LCFA and VFAs to acetate; and separate methanogenesis steps from acetate and hydrogen/CO2. The physico-chemical equations describe ion association and dissociation, and gas-liquid transfer. Implemented as a differential and algebraic equation (DAE) set, there are 26 dynamic state concentration variables and 8 implicit algebraic variables per reactor vessel or element. Implemented as differential equations (DE) only, there are 32 dynamic concentration state variables.
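For orientation, the cascade of biochemical steps can be caricatured as a small ODE system. The sketch below is not ADM1: it lumps the 26 states into three (composite particulate, particulate carbohydrate, sugar) and uses hypothetical first-order rate constants k_dis, k_hyd and k_up purely to show how such a model is typically set up and integrated.

```python
# Minimal sketch, not ADM1: a three-state, first-order caricature of the
# disintegration -> hydrolysis -> uptake cascade described above.
# The rate constants (1/d) and initial concentrations are hypothetical.
from scipy.integrate import solve_ivp

k_dis, k_hyd, k_up = 0.5, 10.0, 2.0

def rhs(t, y):
    x_c, x_ch, s_su = y                # composite, particulate carbohydrate, sugar
    return [
        -k_dis * x_c,                  # disintegration of composite particulates
        k_dis * x_c - k_hyd * x_ch,    # extracellular hydrolysis to sugar
        k_hyd * x_ch - k_up * s_su,    # acidogenic uptake of sugar
    ]

sol = solve_ivp(rhs, (0.0, 30.0), [10.0, 0.0, 0.0])
print(sol.y[:, -1])                    # concentrations after 30 days
```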
Abstract:
In many occupational safety interventions, the objective is to reduce the injury incidence as well as the mean claims cost once injury has occurred. The claims cost data within a period typically contain a large proportion of zero observations (no claim). The distribution thus comprises a point mass at 0 mixed with a non-degenerate parametric component. Essentially, the likelihood function can be factorized into two orthogonal components. These two components relate respectively to the effect of covariates on the incidence of claims and the magnitude of claims, given that claims are made. Furthermore, the longitudinal nature of the intervention inherently imposes some correlation among the observations. This paper introduces a zero-augmented gamma random effects model for analysing longitudinal data with many zeros. Adopting the generalized linear mixed model (GLMM) approach reduces the original problem to the fitting of two independent GLMMs. The method is applied to evaluate the effectiveness of a workplace risk assessment teams program, trialled within the cleaning services of a Western Australian public hospital.
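As a hedged illustration of the two-part factorisation, the zero-augmented idea splits into a binary model for whether a claim occurs and a gamma model for the claim size given that one occurs. The sketch below ignores the random effects handled by the GLMM and fits the two parts independently on simulated data with scikit-learn and SciPy; the covariate, coefficients, and sample size are invented for illustration.

```python
# Two-part (zero-augmented gamma) sketch without random effects:
# part 1 models claim incidence, part 2 models claim size given a claim.
# All data and parameter values below are simulated, not the paper's.
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 1))                        # e.g. an intervention covariate
p_claim = 1.0 / (1.0 + np.exp(-(-1.0 + 0.8 * x[:, 0])))
has_claim = rng.random(n) < p_claim
size = rng.gamma(shape=2.0, scale=np.exp(1.0 + 0.5 * x[:, 0]) / 2.0)
cost = np.where(has_claim, size, 0.0)              # many exact zeros, as in the claims data

# Part 1: incidence of claims (logistic component of the factorised likelihood).
incidence = LogisticRegression().fit(x, has_claim)

# Part 2: magnitude of claims, given a claim (gamma component on the positives).
shape, _, scale = stats.gamma.fit(cost[has_claim], floc=0.0)

print("incidence coefficient:", incidence.coef_[0, 0])
print("gamma shape and scale on positive claims:", shape, scale)
```

In the paper's GLMM formulation each part additionally carries a random effect to capture the correlation induced by the longitudinal design, which this simplified sketch omits.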
Abstract:
Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44(2): 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. A naive implementation of the procedure can be computationally inefficient. To reduce the computational cost, a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to the diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
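The core of the binned-data EM is easiest to see in one dimension: the E-step computes component responsibilities per bin from each component's bin probability mass, and the M-step replaces the usual per-observation sufficient statistics with within-bin conditional moments. The sketch below is a univariate, two-component Gaussian illustration with hypothetical bin edges, counts, and starting values; the multivariate case in the paper replaces the one-dimensional bin masses and moments with the multidimensional integrals mentioned above, and the truncation adjustment is omitted here.

```python
# Univariate sketch of EM for a two-component Gaussian mixture fitted to
# binned counts. Bin edges, counts, and starting values are hypothetical.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

edges = np.array([-np.inf, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0, np.inf])
counts = np.array([30.0, 120.0, 260.0, 180.0, 150.0, 200.0, 60.0])
lo, hi = edges[:-1], edges[1:]
N = counts.sum()

pi = np.array([0.5, 0.5])                 # starting values
mu = np.array([-1.0, 1.5])
sigma = np.array([1.0, 1.0])

for _ in range(100):
    # E-step: responsibilities per bin from each component's bin mass.
    mass = np.array([norm.cdf(hi, m, s) - norm.cdf(lo, m, s)
                     for m, s in zip(mu, sigma)])          # shape (2, n_bins)
    weighted = pi[:, None] * mass + 1e-300
    resp = weighted / weighted.sum(axis=0)

    # Within-bin conditional moments E[X | bin] and E[X^2 | bin] per component,
    # evaluated by quadrature (the one-dimensional analogue of the bin integrals).
    m1 = np.empty_like(mass)
    m2 = np.empty_like(mass)
    for k in range(2):
        for j in range(len(counts)):
            pdf = lambda v: norm.pdf(v, mu[k], sigma[k])
            m1[k, j] = quad(lambda v: v * pdf(v), lo[j], hi[j])[0] / mass[k, j]
            m2[k, j] = quad(lambda v: v * v * pdf(v), lo[j], hi[j])[0] / mass[k, j]

    # M-step: weighted by the observed bin counts and responsibilities.
    nk = (resp * counts).sum(axis=1)
    pi = nk / N
    mu = (resp * counts * m1).sum(axis=1) / nk
    sigma = np.sqrt((resp * counts * m2).sum(axis=1) / nk - mu**2)

print("weights:", pi, "means:", mu, "standard deviations:", sigma)
```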
Abstract:
Motivation: This paper introduces the software EMMIX-GENE that has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular, of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is provided by first selecting a subset of the genes relevant for the clustering of the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. The imposition of a threshold on the likelihood ratio statistic, used in conjunction with a threshold on the size of a cluster, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, and so the use of mixtures of factor analyzers is exploited to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues that are either consistent with the external classification of the tissues or with background and biological knowledge of these sets.
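The gene-ranking step can be approximated with standard tools: for each gene, fit a one-component and a two-component mixture to its expression values across the tissues and compute the likelihood ratio statistic; thresholds on this statistic and on cluster size then select the relevant genes. The sketch below substitutes Gaussian mixtures from scikit-learn for the t mixtures used by EMMIX-GENE and runs on a simulated expression matrix, so it illustrates the ranking idea only.

```python
# Sketch of the gene-selection step: rank genes by the likelihood ratio
# statistic for one versus two mixture components. EMMIX-GENE fits mixtures
# of t distributions; Gaussian mixtures are used here as a simplification,
# and the expression matrix is simulated.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n_tissues, n_genes = 60, 200
expr = rng.normal(size=(n_tissues, n_genes))
expr[:30, :20] += 2.0                       # the first 20 genes separate two tissue groups

def lr_statistic(values):
    values = values.reshape(-1, 1)
    ll1 = GaussianMixture(1, random_state=0).fit(values).score(values) * len(values)
    ll2 = GaussianMixture(2, n_init=3, random_state=0).fit(values).score(values) * len(values)
    return 2.0 * (ll2 - ll1)                # likelihood ratio statistic, 1 vs 2 components

lr = np.array([lr_statistic(expr[:, g]) for g in range(n_genes)])
ranked = np.argsort(lr)[::-1]               # genes in decreasing order of the statistic
print("top-ranked genes:", ranked[:10])
```

The abstract's subsequent step, fitting mixtures of factor analyzers to reduce the dimension of the retained gene set before clustering the tissues, is not shown here.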
Abstract:
This paper examines why practitioners and researchers get different estimates of equity value when they use a discounted cash flow (CF) model versus a residual income (RI) model. Both models are derived from the same underlying assumption -- that price is the present value of expected future net dividends discounted at the cost of equity capital -- but in practice and in research they frequently yield different estimates. We argue that the research literature devoted to comparing the accuracy of these two models is misguided; properly implemented, both models yield identical valuations for all firms in all years. We identify how prior research has applied inconsistent assumptions to the two models and show how these seemingly small errors cause surprisingly large differences in the value estimates.
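The equivalence claim can be checked numerically. Under clean surplus accounting (book value grows by earnings less dividends) and a single consistent discount rate, the present value of dividends plus the discounted terminal book value equals opening book value plus the present value of residual income. The sketch below demonstrates this dividend-based identity on invented forecasts; it illustrates why consistent assumptions force the two approaches to agree, and is not a reproduction of the paper's cash flow versus residual income comparison.

```python
# Numerical check: with clean surplus accounting and consistent assumptions,
# a dividend-based valuation and a residual income valuation coincide.
# The forecasts and the cost of equity below are invented for illustration.
import numpy as np

r = 0.10                                    # cost of equity capital
earnings = np.array([12.0, 13.5, 15.0, 14.0, 16.5])
dividends = np.array([5.0, 5.5, 6.0, 6.5, 7.0])
b0 = 100.0                                  # opening book value of equity

# Clean surplus relation: B_t = B_{t-1} + earnings_t - dividends_t
book = b0 + np.cumsum(earnings - dividends)
disc = (1.0 + r) ** np.arange(1, len(earnings) + 1)

# Valuation 1: present value of dividends plus discounted terminal book value.
v_dividend = np.sum(dividends / disc) + book[-1] / disc[-1]

# Valuation 2: opening book value plus present value of residual income,
# where residual income_t = earnings_t - r * B_{t-1}.
residual_income = earnings - r * np.concatenate(([b0], book[:-1]))
v_residual = b0 + np.sum(residual_income / disc)

print(v_dividend, v_residual)               # identical up to floating-point rounding
assert np.isclose(v_dividend, v_residual)
```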
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to offer practitioners only limited theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty in applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory.
Abstract:
Form factors are derived for a model describing the coherent Josephson tunneling between two coupled Bose-Einstein condensates. This is achieved by studying the exact solution of the model within the framework of the algebraic Bethe ansatz. In this approach the form factors are expressed through determinant representations which are functions of the roots of the Bethe ansatz equations.
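For orientation, a widely used two-mode Hamiltonian for Josephson-coupled Bose-Einstein condensates (the exact parametrisation treated in the paper may differ) is

$$H = \frac{K}{8}\left(N_1 - N_2\right)^2 - \frac{\Delta\mu}{2}\left(N_1 - N_2\right) - \frac{\mathcal{E}_J}{2}\left(a_1^{\dagger} a_2 + a_2^{\dagger} a_1\right),$$

where $a_i^{\dagger}, a_i$ are bosonic creation and annihilation operators for the two condensate modes, $N_i = a_i^{\dagger} a_i$ are the mode number operators, $K$ characterises the atom-atom interaction, $\Delta\mu$ is an external potential asymmetry, and $\mathcal{E}_J$ is the Josephson tunneling amplitude.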
Abstract:
A pairing model for nucleons, introduced by Richardson in 1966, which describes proton-neutron pairing as well as proton-proton and neutron-neutron pairing, is re-examined in the context of the quantum inverse scattering method. Specifically, this treatment shows that the model is integrable by enabling the explicit construction of the conserved operators. We determine the eigenvalues of these operators in terms of the Bethe ansatz, which in turn leads to an expression for the energy eigenvalues of the Hamiltonian.
Abstract:
A model is introduced for two reduced BCS systems which are coupled through the transfer of Cooper pairs between the systems. The model may thus be used in the analysis of the Josephson effect arising from pair tunneling between two strongly coupled small metallic grains. At a particular coupling strength the model is integrable and explicit results are derived for the energy spectrum, conserved operators, integrals of motion, and wave function scalar products. It is also shown that form factors can be obtained for the calculation of correlation functions. Furthermore, a connection with perturbed conformal field theory is made.
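For context, the reduced BCS Hamiltonian for a single system (grain), of which the abstract studies a pair-coupled two-system extension, is commonly written as

$$H_{\mathrm{BCS}} = \sum_{j} \epsilon_j \left(n_{j\uparrow} + n_{j\downarrow}\right) - g \sum_{j,k} c_{j\uparrow}^{\dagger} c_{j\downarrow}^{\dagger} c_{k\downarrow} c_{k\uparrow},$$

where $\epsilon_j$ are the single-particle energy levels, $n_{j\sigma} = c_{j\sigma}^{\dagger} c_{j\sigma}$, and $g$ is the pairing coupling. The model of the abstract couples two copies of such a system through a Cooper-pair transfer term; the form of that coupling, and the particular coupling strength at which the combined model becomes integrable, are specified in the paper itself.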