898 results for Analysis Model


Relevance: 70.00%

Publisher:

Abstract:

This work presents an extended Joint Factor Analysis (JFA) model that includes explicit modelling of unwanted within-session variability. The goals of the proposed extended JFA model are to improve verification performance on short utterances by compensating for the effects of limited or imbalanced phonetic coverage, and to produce a flexible JFA model that is effective over a wide range of utterance lengths without adjusting model parameters or retraining the session subspaces. Experimental results on the 2006 NIST SRE corpus demonstrate the flexibility of the proposed model: it provides competitive results over a wide range of utterance lengths without retraining, and also yields modest improvements over the current state of the art in a number of conditions.
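As background, the standard JFA decomposition that such extensions build on expresses a session's GMM mean supervector as a sum of speaker and session components, M = m + Vy + Ux + Dz. A minimal numerical sketch of that decomposition follows; the dimensions and values are toy illustrations (real systems use supervectors of tens of thousands of dimensions), and this is not the paper's extended model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only): supervector dimension,
# speaker-factor rank and session-factor rank.
F, R_v, R_u = 12, 3, 2

m = rng.normal(size=F)                       # UBM mean supervector
V = rng.normal(size=(F, R_v))                # speaker (eigenvoice) subspace
U = rng.normal(size=(F, R_u))                # session (eigenchannel) subspace
D = np.diag(rng.uniform(0.1, 0.5, size=F))   # diagonal residual matrix

y = rng.normal(size=R_v)   # speaker factors
x = rng.normal(size=R_u)   # session factors
z = rng.normal(size=F)     # speaker-specific residual

# Speaker-dependent part of the supervector, plus the session offset:
s = m + V @ y + D @ z
M = s + U @ x
```

Retraining the session subspace corresponds to re-estimating U; the extended model's aim is to avoid that across utterance lengths.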


This paper presents an extended study on the implementation of support vector machine (SVM) based speaker verification in systems that employ continuous progressive model adaptation using the weight-based factor analysis model. The weight-based factor analysis model compensates for session variations in unsupervised scenarios by incorporating trial confidence measures into the general statistics used in the inter-session variability modelling process. Employing weight-based factor analysis in Gaussian mixture models (GMMs) was recently found to provide significant performance gains for unsupervised classification. Further performance improvements were found through the integration of SVM-based classification into the system by means of GMM supervectors. This study focuses particularly on the way in which a client is represented in the SVM kernel space using single and multiple target supervectors. Experimental results indicate that training client SVMs on a single target supervector maximises performance while exhibiting a certain robustness to the inclusion of impostor training data in the model. Furthermore, the inclusion of low-scoring target trials in the adaptation process is investigated and found to significantly aid performance.


This paper will focus on the development of an interactive test engine using Rasch analysis of item responses for question selection and the reporting of results. The Rasch analysis is used to determine student ability and question difficulty. This model is widely used in the preparation of paper-based tests and has been the subject of particular use and development at the Australian Council for Educational Research (ACER). This paper presents an overview of an interactive implementation of the Rasch analysis model in HyperCard, where student ability estimates are generated 'on the fly' and question difficulty values are updated from time to time. The student ability estimates are used to determine question selection and are the basis of the scoring and reporting schemes.


Information mismatch and overload are two fundamental issues influencing the effectiveness of information filtering systems. Even though both term-based and pattern-based approaches have been proposed to address the issues, neither of these approaches alone can provide a satisfactory decision for determining the relevant information. This paper presents a novel two-stage decision model for solving the issues. The first stage is a novel rough analysis model to address the overload problem. The second stage is a pattern taxonomy mining model to address the mismatch problem. The experimental results on RCV1 and TREC filtering topics show that the proposed model significantly outperforms the state-of-the-art filtering systems.


Background: More than half of all cerebral ischemic events are the result of rupture of extracranial plaques. The clinical determination of carotid plaque vulnerability is currently based solely on luminal stenosis; however, it has been increasingly suggested that plaque morphology and biomechanical stress should also be considered. We used finite element analysis based on in vivo magnetic resonance imaging (MRI) to simulate the stress distributions within plaques of asymptomatic and symptomatic individuals. Methods: Thirty nonconsecutive subjects (15 symptomatic and 15 asymptomatic) underwent high-resolution multisequence in vivo MRI of the carotid bifurcation. Stress analysis was performed based on the geometry derived from in vivo MRI of the carotid artery at the point of maximal stenosis. The finite element analysis model considered plaque components to be hyperelastic. The peak stresses within the plaques of symptomatic and asymptomatic individuals were compared. Results: High stress concentrations were found at the shoulder regions of symptomatic plaques, and the maximal stresses predicted in this group were significantly higher than those in the asymptomatic group (508.2 ± 193.1 vs 269.6 ± 107.9 kPa; P = .004). Conclusions: Maximal predicted plaque stresses in symptomatic patients were higher than those predicted in asymptomatic patients by finite element analysis, suggesting that plaques with higher stresses may be more prone to become symptomatic and rupture. If further validated by large-scale longitudinal studies, biomechanical stress analysis based on high-resolution in vivo MRI could act as a useful tool for risk assessment of carotid atheroma. It may help to identify patients with asymptomatic carotid atheroma at greatest risk of developing symptoms, or those with mild-to-moderate symptomatic stenoses, who currently fall outside clinical guidelines for intervention.


The effects of damping on energy sharing in coupled systems are investigated. The approach taken is to compute the forced response patterns of various idealised systems, and from these to calculate the parameters of a Statistical Energy Analysis (SEA) model for the systems using the matrix inversion approach [1]. It is shown that when SEA models are fitted by this procedure, the values of the coupling loss factors depend significantly on damping except when it is sufficiently high. For very lightly damped coupled systems, varying the damping causes the values of the coupling loss factors to vary in direct proportion to the internal loss factor. In the limit of zero damping, the coupling loss factors tend to zero. This view contrasts strongly with 'classical' SEA, in which coupling loss factors are determined by the nature of the coupling between subsystems, independent of subsystem damping. One implication of the strong damping dependency is that equipartition of modal energy under low damping does not in general occur. This is contrary to the classical SEA prediction that equipartition of modal energy always occurs if the damping can be reduced to a sufficiently small value. It is demonstrated that the use of this classical assumption can lead to gross overestimates of subsystem energy ratios, especially in multi-subsystem structures. © 1996 Academic Press Limited.
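The matrix inversion approach referred to above can be illustrated with a two-subsystem toy model: the SEA power balance is P = ωAE, where A holds the internal and coupling loss factors, so exciting each subsystem in turn gives an energy matrix E from which A is recovered by inversion. Parameter values below are illustrative only:

```python
import numpy as np

omega = 2 * np.pi * 500.0   # band centre frequency (illustrative)

# Assumed "true" loss-factor matrix for a 2-subsystem SEA model:
# diagonal = internal + outgoing coupling loss factors,
# off-diagonal = -eta_ji (incoming coupling).
eta1, eta2 = 0.01, 0.02        # internal loss factors
eta12, eta21 = 0.005, 0.003    # coupling loss factors
A_true = np.array([[eta1 + eta12, -eta21],
                   [-eta12, eta2 + eta21]])

# Forward SEA: unit power into each subsystem in turn
# (columns of P are load cases) yields the energy matrix E.
P = np.eye(2)
E = np.linalg.solve(omega * A_true, P)

# Matrix inversion approach: recover the loss factors from P and E.
A_est = P @ np.linalg.inv(E) / omega
```

In the paper's setting, E comes from computed forced responses rather than an assumed A, and the recovered coupling loss factors are what turn out to depend on damping.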


We present a growth analysis model that combines large amounts of environmental data with limited amounts of biological data and apply it to Corbicula japonica. The model uses the maximum-likelihood method with the Akaike information criterion, which provides an objective criterion for model selection. An adequate distribution for describing a single cohort is selected from available probability density functions, which are expressed by location and scale parameters. Daily relative increase rates of the location parameter are expressed by a multivariate logistic function with environmental factors for each day and categorical variables indicating animal ages as independent variables. Daily relative increase rates of the scale parameter are expressed by an equation describing the relationship with the daily relative increase rate of the location parameter. Corbicula japonica grows to a modal shell length of 0.7 mm during the first year in Lake Abashiri. Compared with the attainable maximum size of about 30 mm, the growth of juveniles is extremely slow because their growth is less susceptible to environmental factors until the second winter. The extremely slow growth in Lake Abashiri could be a geographical genetic variation within C. japonica.
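The AIC-based model-selection step can be sketched as follows. The candidate log-likelihoods and parameter counts below are invented for illustration, standing in for maximum-likelihood fits of location-scale densities to shell-length data:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2 ln L. Lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fitted candidates: (maximised log-likelihood, parameter count).
candidates = {
    "normal":    (-120.3, 2),
    "lognormal": (-115.8, 2),
    "gamma":     (-116.1, 2),
}

# Select the distribution with the smallest AIC.
best = min(candidates, key=lambda name: aic(*candidates[name]))
```

With equal parameter counts, the comparison reduces to the log-likelihoods; AIC matters when candidates differ in complexity.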


The Strengths and Difficulties Questionnaire (SDQ) is a widely used 25-item screening test for emotional and behavioral problems in children and adolescents. This study attempted to critically examine the factor structure of the adolescent self-report version. As part of an ongoing longitudinal cohort study, a total of 3,753 pupils completed the SDQ when aged 12. Both three- and five-factor exploratory factor analysis models were estimated. A number of deviations from the hypothesized SDQ structure were observed, including a lack of unidimensionality within particular subscales, cross-loadings, and items failing to load on any factor. Model fit of the confirmatory factor analysis model was modest, providing limited support for the hypothesized five-component structure. The analyses suggested a number of weaknesses within the component structure of the self-report SDQ, particularly in relation to the reverse-coded items.


Objectives: Methicillin-resistant Staphylococcus aureus (MRSA) is a major nosocomial pathogen worldwide. A wide range of factors have been suggested to influence the spread of MRSA. The objective of this study was to evaluate the effect of antimicrobial drug use and infection control practices on nosocomial MRSA incidence in a 426-bed general teaching hospital in Northern Ireland.

Methods: The present research involved the retrospective collection of monthly data on the usage of antibiotics and on infection control practices within the hospital over a 5 year period (January 2000–December 2004). A multivariate ARIMA (time-series analysis) model was built to relate MRSA incidence with antibiotic use and infection control practices.
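A heavily simplified illustration of relating incidence to lagged antibiotic use is a least-squares fit at a fixed lag: a single transfer-function term, standing in for the full multivariate ARIMA model (which also handles autocorrelation and multiple predictors). The helper name is hypothetical:

```python
import numpy as np

def lagged_regression(y, x, lag):
    """Fit y_t = a + b * x_{t-lag} by ordinary least squares.
    y: monthly MRSA incidence; x: monthly usage of one antibiotic
    class; lag: delay in months. Returns (intercept, slope)."""
    y_t = np.asarray(y[lag:], dtype=float)
    x_lagged = np.asarray(x[:len(x) - lag], dtype=float)
    X = np.column_stack([np.ones_like(x_lagged), x_lagged])
    (a, b), *_ = np.linalg.lstsq(X, y_t, rcond=None)
    return a, b
```

A positive slope at some lag is the kind of temporal relationship the study reports for fluoroquinolones and third-generation cephalosporins; negative slopes correspond to the infection control variables.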

Results: Analysis of the 5 year data set showed that temporal variations in MRSA incidence followed temporal variations in the use of fluoroquinolones, third-generation cephalosporins, macrolides and amoxicillin/clavulanic acid (coefficients = 0.005, 0.03, 0.002 and 0.003, respectively, with various time lags). Temporal relationships were also observed between MRSA incidence and infection control practices, i.e. the number of patients actively screened for MRSA (coefficient = -0.007), the use of alcohol-impregnated wipes (coefficient = -0.0003) and the bulk orders of alcohol-based handrub (coefficients = -0.04 and -0.08), with increased infection control activity being associated with decreased MRSA incidence, and between MRSA incidence and the number of new patients admitted with MRSA (coefficient = 0.22). The model explained 78.4% of the variance in the monthly incidence of MRSA.

Conclusions: The results of this study confirm the value of infection control policies as well as suggest the usefulness of restricting the use of certain antimicrobial classes to control MRSA.


The paper describes the development and application of a multiple linear regression model to identify how the key elements of waste and recycling infrastructure, namely container capacity and frequency of collection, affect the yield from municipal kerbside recycling programmes. The overall aim of the research was to gain an understanding of the factors affecting the yield from municipal kerbside recycling programmes in Scotland. The study isolates the principal kerbside collection service offered by 32 councils across Scotland, eliminating those recycling programmes associated with flatted properties or multiple occupancies. The regression analysis identified three principal factors that explain 80% of the variability in the average yield of the principal dry recyclate services: weekly residual waste capacity, number of materials collected and weekly recycling capacity. The use of the model has been evaluated and recommendations made on ongoing methodological development and on the use of the results in informing the design of kerbside recycling programmes. The authors hope that the research can provide insights for the ongoing development of methods to optimise the design and operation of kerbside recycling programmes.
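A minimal sketch of such a regression model follows, assuming one observation per council of the three identified factors; `fit_yield_model` is a hypothetical helper, not the authors' code:

```python
import numpy as np

def fit_yield_model(X, y):
    """Ordinary least squares for
    yield = b0 + b1*residual_capacity + b2*n_materials + b3*recycling_capacity.
    X: (n_councils, 3) predictor matrix; y: average yields.
    Returns the coefficient vector and R^2 (variability explained)."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    ss_tot = (y - y.mean()) @ (y - y.mean())
    r2 = 1.0 - (resid @ resid) / ss_tot
    return beta, r2
```

An R² of about 0.8 on the council data corresponds to the 80% of yield variability the study attributes to the three factors.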


There is a requirement for better integration between design and analysis tools, which is difficult due to their different objectives, separate data representations and workflows. Currently, substantial effort is required to produce a suitable analysis model from design geometry. Robust links are required between these different representations to enable analysis attributes to be transferred between different design and analysis packages for models at various levels of fidelity.

This paper describes a novel approach for integrating design and analysis models by identifying and managing the relationships between the different representations. Three key technologies, Cellular Modeling, Virtual Topology and Equivalencing, have been employed to achieve effective simulation model management. These technologies and their implementation are discussed in detail. Prototype automated tools are introduced demonstrating how multiple simulation models can be linked and maintained to facilitate seamless integration throughout the design cycle.


This paper outlines the importance of robust interface management for facilitating finite element analysis workflows. Topological equivalences between analysis model representations are identified and maintained in an editable and accessible manner. The model and its interfaces are automatically represented using an analysis-specific cellular decomposition of the design space. Rework of boundary conditions following changes to the design geometry or the analysis idealization can be minimized by tracking interface dependencies. Utilizing this information with the Simulation Intent specified by an analyst, automated decisions can be made to process the interface information required to rebuild analysis models. Through this work automated boundary condition application is realized within multi-component, multi-resolution and multi-fidelity analysis workflows.


Please consult the paper edition of this thesis to read. It is available on the 5th Floor of the Library at Call Number: Z 9999 E38 D56 1992


Factor analysis, a frequent technique for multivariate data inspection, is also widely used for compositional data analysis. The usual way is to use a centred logratio (clr) transformation to obtain the random vector y of dimension D. The factor model is then

y = Λf + e (1)

with the factors f of dimension k < D, the error term e, and the loadings matrix Λ. Under the usual model assumptions (see, e.g., Basilevsky, 1994), the factor analysis model (1) can be written as

Cov(y) = ΛΛ^T + ψ (2)

where ψ = Cov(e) is diagonal. The diagonal elements of ψ, as well as the loadings matrix Λ, are estimated from an estimate of Cov(y). Let the observed clr-transformed data Y be realizations of the random vector y. Outliers or deviations from the idealized model assumptions of factor analysis can severely affect the parameter estimation. As a way out, robust estimation of the covariance matrix of Y leads to robust estimates of Λ and ψ in (2); see Pison et al. (2003). Well-known robust covariance estimators with good statistical properties, like the MCD or the S-estimators (see, e.g., Maronna et al., 2006), rely on a full-rank data matrix Y, which is not the case for clr-transformed data (see, e.g., Aitchison, 1986). The isometric logratio (ilr) transformation (Egozcue et al., 2003) solves this singularity problem. The data matrix Y is transformed to a matrix Z by using an orthonormal basis of lower dimension. Using the ilr-transformed data, a robust covariance matrix C(Z) can be estimated. The result can be back-transformed to the clr space by C(Y) = V C(Z) V^T, where the matrix V with orthonormal columns comes from the relation between the clr and the ilr transformation. Now the parameters in model (2) can be estimated (Basilevsky, 1994) and the results have a direct interpretation, since the links to the original variables are still preserved. The above procedure is applied to data from geochemistry. Our special interest is in comparing the results with those of Reimann et al. (2002) for the Kola project data.
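The clr and ilr transformations and the back-transformation matrix V can be sketched as follows, using one standard (Helmert-type) orthonormal basis of the clr hyperplane. This is a generic illustration of the transformations, not the robust covariance estimation itself:

```python
import numpy as np

def clr(x):
    """Centred logratio transform of a composition x (all parts > 0)."""
    lx = np.log(x)
    return lx - lx.mean()

def helmert_basis(D):
    """Orthonormal D x (D-1) matrix V whose columns span the clr
    hyperplane (components summing to zero), defining one standard
    ilr transform: z = V.T @ clr(x)."""
    V = np.zeros((D, D - 1))
    for j in range(D - 1):
        V[: j + 1, j] = 1.0
        V[j + 1, j] = -(j + 1)
        V[:, j] /= np.sqrt((j + 1) * (j + 2))
    return V

x = np.array([1.0, 3.0, 6.0])   # a 3-part composition (illustrative)
V = helmert_basis(3)
y = clr(x)          # clr coordinates: sum to zero, rank-deficient as data
z = V.T @ y         # ilr coordinates: full rank, dimension D-1
y_back = V @ z      # back to clr space; covariances map as C(Y) = V C(Z) V.T
```

Because clr vectors lie exactly in the column space of V, the back-transformation recovers them without loss, which is what makes the robust estimate C(Z) transferable to the clr scale.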


Our goal in this paper is to assess the reliability and validity of egocentered network data using multilevel analysis (Muthén, 1989; Hox, 1993) under the multitrait-multimethod approach. The confirmatory factor analysis model for multitrait-multimethod data (Werts & Linn, 1970; Andrews, 1984) is used for our analyses. In this study we reanalyse part of the data of another study (Kogovšek et al., 2002) done on a representative sample of the inhabitants of Ljubljana. The traits used in our article are the name interpreters. We consider egocentered network data as hierarchical; therefore a multilevel analysis is required. We use Muthén's partial maximum likelihood approach, called the pseudobalanced solution (Muthén, 1989, 1990, 1994), which produces estimates close to maximum likelihood for large ego sample sizes (Hox & Maas, 2001). Several analyses are carried out to compare this multilevel analysis with classic methods of analysis such as those in Kogovšek et al. (2002), who analysed the data only at the group (ego) level, considering averages over all alters within each ego. We show that some of the results obtained by classic methods are biased, and that multilevel analysis provides more detailed information that greatly enriches the interpretation of the reliability and validity of hierarchical data. Within- and between-ego reliabilities and validities, and other related quality measures, are defined, computed and interpreted.