980 results for correlation modelling


Relevance:

70.00%

Publisher:

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE), because an inappropriate choice will lead to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure and find that its performance is degraded by a term that is theoretically independent of the correlation structures but must be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on the QIC. Extensive simulation studies indicate that the CIC offers a remarkable improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
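
For reference, the two criteria compared in this abstract are commonly written as follows; this is the standard form of the QIC and CIC rather than a restatement of the paper's own notation.

```latex
\[
\mathrm{QIC}(R) = -2\,Q\!\bigl(\hat\beta(R);\,I\bigr) + 2\,\operatorname{tr}\!\bigl(\hat\Omega_I \hat V_R\bigr),
\qquad
\mathrm{CIC}(R) = \operatorname{tr}\!\bigl(\hat\Omega_I \hat V_R\bigr),
\]
```

where Q is the quasi-likelihood evaluated under the independence working model, Ω̂_I is the model-based information matrix under independence, and V̂_R is the robust sandwich covariance estimate under working correlation R. The quasi-likelihood term does not depend on R in theory but must be estimated with error, which is the source of the deterioration described above; the CIC keeps only the trace term.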

Relevance:

70.00%

Publisher:

Abstract:

Structured precision modelling is an important approach to improving the intra-frame correlation modelling of the standard HMM, in which Gaussian mixture models with diagonal covariances are used. Previous work has focused on direct structured representation of the precision matrices. In this paper, a new framework is proposed in which the structure of the Cholesky square root of the precision matrix is investigated, referred to as Cholesky Basis Superposition (CBS). The Cholesky matrix associated with each Gaussian distribution is represented as a linear combination of a set of Gaussian-independent basis upper-triangular matrices. Efficient optimization methods are derived for both the combination weights and the basis matrices. Experiments on a Chinese dictation task showed that the proposed approach can significantly outperform direct structured precision modelling with a similar number of parameters, as well as full covariance modelling. © 2011 IEEE.
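
A rough sketch of the representation described above follows; the function name, dimensions and the particular upper-triangular convention are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def cbs_precision(weights, bases):
    """Cholesky Basis Superposition (sketch): build a per-Gaussian precision
    matrix from a weighted sum of shared upper-triangular basis matrices.
    The exact factorisation convention is an assumption here."""
    C = sum(w * B for w, B in zip(weights, bases))  # upper-triangular Cholesky factor
    return C.T @ C  # symmetric positive semi-definite (definite when C is non-singular)

# Toy usage: 3 shared basis matrices for a 4-dimensional feature space.
rng = np.random.default_rng(0)
dim, n_bases = 4, 3
bases = [np.triu(rng.standard_normal((dim, dim))) for _ in range(n_bases)]
weights = rng.standard_normal(n_bases)   # per-Gaussian combination weights
P = cbs_precision(weights, bases)
print(np.allclose(P, P.T))               # True: the precision matrix is symmetric
```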

Relevance:

70.00%

Publisher:

Abstract:

Measurements of global and diffuse solar radiation at the Earth's surface, carried out from May 1994 to June 1999 in São Paulo City, Brazil, were used to develop correlation models to estimate hourly, daily and monthly values of diffuse solar radiation on horizontal surfaces. The polynomials derived by linear regression fitting were able to model the daily and monthly values of diffuse radiation satisfactorily. Comparison with models derived for other locations reveals some differences, related mainly to altitude effects. © 2002 Elsevier B.V. Ltd. All rights reserved.
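
A minimal sketch of this kind of correlation model, assuming the common regression of the hourly diffuse fraction on the clearness index; the polynomial degree, variable layout and synthetic data are illustrative assumptions, and the paper's fitted coefficients are not reproduced here.

```python
import numpy as np

def fit_diffuse_fraction_model(g_global, g_diffuse, g_extraterrestrial, degree=3):
    """Least-squares polynomial correlation between the diffuse fraction (D/G)
    and the clearness index (G/G0). The regressors and the cubic degree are
    assumptions standing in for the paper's own model forms."""
    kt = g_global / g_extraterrestrial        # clearness index
    kd = g_diffuse / g_global                 # diffuse fraction
    coeffs = np.polyfit(kt, kd, deg=degree)   # highest power first
    rmse = np.sqrt(np.mean((np.polyval(coeffs, kt) - kd) ** 2))
    return coeffs, rmse

# Synthetic stand-in for the measured hourly data (illustration only).
rng = np.random.default_rng(0)
kt_true = rng.uniform(0.1, 0.8, 500)
g0 = np.full(500, 1360.0)                     # W/m^2, nominal extraterrestrial irradiance
g = kt_true * g0
d = g * np.clip(1.0 - 1.1 * kt_true + rng.normal(0, 0.05, 500), 0.05, 1.0)
print(fit_diffuse_fraction_model(g, d, g0))
```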

Relevance:

60.00%

Publisher:

Abstract:

Fusion techniques have received considerable attention for achieving lower error rates with biometrics. A fused classifier architecture based on sequential integration of multi-instance and multi-sample fusion schemes allows a controlled trade-off between false alarms and false rejects. Expressions for each type of error for the fused system have previously been derived for the case of statistically independent classifier decisions. It is shown in this paper that the performance of this architecture can be improved by modelling the correlation between classifier decisions. Correlation modelling also enables better tuning of the fusion model parameters ‘N’, the number of classifiers, and ‘M’, the number of attempts/samples, and facilitates the determination of error bounds on false rejects and false accepts for each specific user. The error trade-off performance of the architecture is evaluated using HMM-based speaker verification on utterances of individual digits. Results show that performance is improved in the case of favourably correlated decisions. The architecture investigated here is directly applicable to speaker verification from spoken digit strings, such as credit card numbers, in telephone or voice over internet protocol based applications. It is also applicable to other biometric modalities such as fingerprints and handwriting samples.
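
The independence-case trade-off referred to above can be sketched as follows, assuming a cascade that accepts only when every one of the N instances accepts, and each instance accepts if any of its M attempts accepts; the decision rule and function name are assumptions, not the paper's exact derivation.

```python
def fused_error_rates(far, frr, N, M):
    """Error expressions for a sequential multi-instance (N classifiers, all
    must accept) / multi-sample (M attempts per classifier, any accepting
    attempt suffices) fusion, under statistically independent decisions."""
    far_instance = 1 - (1 - far) ** M   # impostor slips through one instance
    frr_instance = frr ** M             # genuine user fails all M attempts
    system_far = far_instance ** N      # impostor must pass every instance
    system_frr = 1 - (1 - frr_instance) ** N
    return system_far, system_frr

# Increasing N suppresses false accepts; increasing M suppresses false rejects
# (each at the expense of the other error), which is the controlled trade-off.
print(fused_error_rates(far=0.05, frr=0.05, N=3, M=2))
```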

Relevance:

60.00%

Publisher:

Abstract:

Reliability of the performance of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are dependent, so it is difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is to investigate a method based on classifier fusion techniques to better control the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived from the base classifier performances. As this assumption may not always be valid, these expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is empirically evaluated for text-dependent speaker verification using Hidden Markov Model based digit-dependent speaker models in each stage, with multiple attempts for each digit utterance. The trade-off between the verification errors is controlled using two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation/tuning set. The statistical validity of the derived expressions for the error estimates is assessed on test data. The performance of the sequential method is further shown to depend on the order of the combination of digits (instances) and the nature of the repeated attempts (samples). The false rejection and false acceptance rates for the proposed fusion are estimated using the base classifier performances, the variance in correlation between classifier decisions, and the sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor dependent favourable combinations and class-error based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and internet shopping. The tuning of the parameters (the number of instances and samples) serves both the security and user-convenience requirements of speaker-specific verification. The architecture investigated here is applicable to verification using other biometric modalities such as handwriting, fingerprints and keystrokes.
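
The modification of the independence-based expressions for dependent decisions can be illustrated with the standard identity for two correlated accept events; this is a generic sketch, not the dissertation's own derivation. For marginal acceptance probabilities p1 and p2 and decision correlation rho:

```latex
\[
P(A_1 \cap A_2) \;=\; p_1 p_2 \;+\; \rho\,\sqrt{p_1(1-p_1)\,p_2(1-p_2)}.
\]
```

Favourable dependence between client decisions (rho > 0) raises the joint acceptance probability above the independence value p1·p2, which is why independence-based error expressions misestimate the fused error rates unless the correlation term is included.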

Relevance:

60.00%

Publisher:

Abstract:

Background: Intermediate phenotypes are often measured as a proxy for asthma. It is largely unclear to what extent the same set of environmental or genetic factors regulates these traits. Objective: Estimate the environmental and genetic correlations between self-reported and clinical asthma traits. Methods: A total of 3073 subjects from 802 families were ascertained through a twin proband. Traits measured included self-reported asthma, airway histamine responsiveness (AHR), skin prick response to common allergens including house dust mite (Dermatophagoides pteronyssinus [D. pter]), baseline lung function, total serum immunoglobulin E (IgE) and eosinophilia. Bivariate and multivariate analyses of eight traits were performed, with adjustment for ascertainment and significant covariates. Results: Overall, 2716 participants completed an asthma questionnaire and 2087 were clinically tested, including 1289 self-reported asthmatics (92% previously diagnosed by a doctor). Asthma, AHR, markers of allergic sensitization and eosinophilia had significant environmental correlations with each other (range: 0.23-0.89). Baseline forced expiratory volume in 1 s (FEV1) showed low environmental correlations with most traits. Fewer genetic correlations were significantly different from zero. Phenotypes with the greatest genetic similarity were asthma and atopy (0.46), IgE and eosinophilia (0.44), AHR and D. pter (0.43) and AHR and airway obstruction (-0.43). Traits with the greatest genetic dissimilarity were FEV1 and atopy (0.05), airway obstruction and IgE (0.07) and FEV1 and D. pter (0.11). Conclusion: These results suggest that the same set of environmental factors regulates the variation of many asthma traits. In addition, although most traits are regulated to a great extent by specific genetic factors, there is still some degree of genetic overlap that could be exploited by multivariate linkage approaches.
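
For readers unfamiliar with the quantities being reported, the genetic and environmental correlations in a bivariate variance-components (twin) analysis are commonly defined as follows; this is a generic sketch under a simple additive-genetic/unique-environment decomposition with standardized traits, not the paper's full model.

```latex
\[
r_g = \frac{\operatorname{Cov}_A(X,Y)}{\sqrt{\operatorname{Var}_A(X)\,\operatorname{Var}_A(Y)}},\qquad
r_e = \frac{\operatorname{Cov}_E(X,Y)}{\sqrt{\operatorname{Var}_E(X)\,\operatorname{Var}_E(Y)}},\qquad
r_p = \sqrt{h_X^2\,h_Y^2}\;r_g + \sqrt{e_X^2\,e_Y^2}\;r_e,
\]
```

where A and E denote the additive genetic and environmental components, and h² and e² = 1 - h² are each trait's heritable and environmental variance proportions; the phenotypic correlation r_p thus splits into the genetic and environmental contributions reported in the abstract.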

Relevance:

40.00%

Publisher:

Abstract:

The study presented here was carried out to obtain the actual solids flow rate by combining electrical resistance tomography and an electromagnetic flow meter. A new in-situ measurement method, based on measurements from an Electromagnetic Flow Meter (EFM) and Electrical Resistance Tomography (ERT), is proposed to study the flow rates of the individual phases in a vertical flow. The study was based on laboratory experiments carried out with a 50 mm vertical flow rig for a number of sand concentrations and different mixture velocities. A range of sand slurries with median particle sizes from 212 μm to 355 μm was tested. The solids concentrations by volume were 5% and 15%, with corresponding mixture densities of 1078 kg/m³ and 1238 kg/m³, respectively. The flow velocity was between 1.5 m/s and 3.0 m/s. A total of six experimental tests were conducted. The equivalent liquid model was adopted to validate the in-situ volumetric solids fraction and to calculate the slip velocity. The results show that the ERT technique can be used in conjunction with an electromagnetic flow meter to measure the slurry flow rate in a vertical pipe flow. However, it should be emphasized that the EFM results must be treated with reservation when the flow at the EFM mounting position is non-homogeneous. The flow rate obtained by the EFM should be corrected for the slip velocity and the flow pattern.
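
A rough sketch of how the two instruments combine is given below; the simple "solids velocity = mixture velocity minus slip" correction and the parameter names are assumptions standing in for the equivalent liquid model treatment in the paper.

```python
import math

def solids_flow_rate(d_pipe, v_efm, c_v, v_slip=0.0):
    """Combine ERT and EFM readings for a vertical slurry flow (sketch).
    d_pipe : pipe internal diameter [m]
    v_efm  : mixture velocity reported by the electromagnetic flow meter [m/s]
    c_v    : in-situ solids volume fraction from ERT (e.g. 0.05 or 0.15)
    v_slip : assumed slip of the solids relative to the mixture [m/s]"""
    area = math.pi * d_pipe ** 2 / 4
    v_solids = v_efm - v_slip            # assumed slip correction
    return area * c_v * v_solids         # solids volumetric flow rate [m^3/s]

# Example for the 50 mm rig at 2.0 m/s mixture velocity and 5% solids.
print(solids_flow_rate(d_pipe=0.05, v_efm=2.0, c_v=0.05, v_slip=0.1))
```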

Relevance:

40.00%

Publisher:

Abstract:

Computational Fluid Dynamics (CFD) can be used as a powerful tool to support engineers throughout the design process, and the combination of CFD with response surface methodology can play an important role in such cases. During the conceptual engineering design phase, a quick response is always a matter of urgency, and at this stage even a sketch of the geometrical model is rare. Therefore, a typical response surface developed for a congested and confined environment, rather than CFD itself, can be an important tool to aid the decision-making process when the geometrical model is not available, provided that the geometry for which the response surface was developed can be considered similar. The present work investigates how three different types of response surface behave when predicting overpressure in accidental scenarios based on CFD input. First-order, partial second-order and complete second-order polynomial expressions are investigated. The predicted results are compared with CFD findings for a classical offshore experiment conducted by British Gas on behalf of Mobil, and good agreement is observed for the higher-order response surfaces. The higher-order response surface calculations are also compared with CFD calculations for a typical offshore module, and good agreement is again observed. © 2011 Elsevier Ltd.
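
The three polynomial forms can be sketched as below; the reading of "partial second order" as linear plus pure quadratic terms (no cross-terms), the synthetic data and the function names are assumptions, not the paper's definitions or results.

```python
import numpy as np

def design_matrix(X, order):
    """Polynomial response-surface design matrix from CFD samples X
    (n_samples x n_vars). 'partial2' is taken here as linear plus pure
    quadratic terms without cross-terms (an assumed reading of the term)."""
    cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
    if order in ("partial2", "full2"):
        cols += [X[:, j] ** 2 for j in range(X.shape[1])]
    if order == "full2":
        cols += [X[:, i] * X[:, j]
                 for i in range(X.shape[1]) for j in range(i + 1, X.shape[1])]
    return np.column_stack(cols)

# Hypothetical CFD samples: three design variables -> peak overpressure.
X = np.random.default_rng(1).uniform(size=(30, 3))
y = 1.0 + 2 * X[:, 0] + X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 2]
for order in ("first", "partial2", "full2"):
    A = design_matrix(X, order)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(order, round(np.mean((A @ beta - y) ** 2), 4))   # fit error by order
```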

Relevance:

40.00%

Publisher:

Abstract:

An artificial neural network (ANN) model is developed for the analysis and simulation of the correlation between the properties of maraging steels and their composition, processing and working conditions. The input parameters of the model consist of the alloy composition, processing parameters (including cold deformation degree, ageing temperature and ageing time), and working temperature. The outputs of the ANN model are the property parameters: ultimate tensile strength, yield strength, elongation, reduction in area, hardness, notched tensile strength, Charpy impact energy, fracture toughness, and martensitic transformation start temperature. Good performance of the ANN model is achieved, and it can be used to calculate the properties of maraging steels as functions of alloy composition, processing parameters and working conditions. The combined influence of Co and Mo on the properties of maraging steels is simulated using the model, and the results are in agreement with experimental data. An explanation of the calculated results from the metallurgical point of view is attempted. The model can be used as a guide for further alloy development.
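
A minimal sketch of this kind of composition-to-property mapping is given below, using a small multi-output feed-forward network; the column layout, network size and synthetic stand-in data are illustrative assumptions, not the paper's configuration or results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical training table: inputs would be composition (e.g. wt.% Ni, Co,
# Mo, Ti), cold deformation degree, ageing temperature/time and working
# temperature; outputs would be the nine property parameters listed above.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 8))   # stand-in for measured input records
y = rng.uniform(size=(200, 9))   # stand-in for the nine measured properties

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)                   # multi-output regression
print(model.predict(X[:1]))       # predicted property vector for one alloy/condition
```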

Relevance:

30.00%

Publisher:

Abstract:

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons of model results with direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated here as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence indicate the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: 1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; 2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique; 3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral; and 4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
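
The dual-energy decomposition underlying the DEXA and DPA(+) models discussed above can be sketched in its standard textbook form; the notation and the three-component extension written under the stated path-length assumption are illustrative, not the thesis's exact formulation.

```latex
\[
\ln\frac{I_0(E)}{I(E)} = \mu_b(E)\,t_b + \mu_s(E)\,t_s, \qquad E \in \{E_L, E_H\},
\]
\[
\ln\frac{I_0(E)}{I(E)} = \mu_b(E)\,t_b + \mu_f(E)\,t_f + \mu_l(E)\,t_l, \qquad
t = t_b + t_f + t_l .
\]
```

The first pair gives two equations in the two thicknesses t_b (bone mineral) and t_s (soft tissue), solvable when the attenuation coefficients differ sufficiently between the low energy E_L and the high energy E_H; this is the two-component DEXA case. The DPA(+) idea adds the measured path length t as a third equation, allowing the soft tissue to be split into fat (t_f) and lean (t_l) components.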