30 results for Evaluation model


Relevance:

40.00%

Publisher:

Abstract:

Healthcare systems worldwide face a wide range of challenges, including demographic change, rising drug and medical technology costs, and persistent and widening health inequalities both within and between countries. Simultaneously, issues such as professional silos, static medical curricula, and perceptions of "information overload" have made it difficult for medical training and continued professional development (CPD) to adapt to the changing needs of healthcare professionals in increasingly patient-centered, collaborative, and/or remote delivery contexts. In response to these challenges, increasing numbers of medical education and CPD programs have adopted e-learning approaches, which have been shown to provide flexible, low-cost, user-centered, and easily updated learning. The effectiveness of e-learning varies from context to context, however, and has also been shown to make considerable demands on users' motivation and "digital literacy" and on providing institutions. Consequently, there is a need to evaluate the effectiveness of e-learning in healthcare as part of ongoing quality improvement efforts. This article outlines the key issues for developing successful models for analyzing e-health learning.

Relevance:

40.00%

Publisher:

Abstract:

This paper addresses the design of a reliable model-based Harmonic-Aware Matching Pursuit (HAMP) algorithm for reconstructing sparse harmonic signals from their compressed samples. Performance guarantees for HAMP are provided; they show that HAMP requires fewer measurements and has a lower computational cost than other greedy techniques. The complexity of formulating a structured sparse approximation algorithm is highlighted, and the inapplicability of the conventional thresholding operator to the harmonic signal model is demonstrated. The harmonic sequential deletion algorithm is subsequently proposed and other sparse approximation methods are evaluated. The superior performance of HAMP is demonstrated in the presented experiments. © 2013 IEEE.
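The abstract does not reproduce the HAMP procedure itself. As a rough illustration of the greedy matching-pursuit idea it builds on, the sketch below runs plain Orthogonal Matching Pursuit over a harmonic (inverse-DFT) dictionary; the function name `omp_harmonic`, the Gaussian measurement matrix and the fixed sparsity level `k` are illustrative assumptions, not the structured HAMP operator or guarantees described in the paper.

```python
import numpy as np

def omp_harmonic(y, Phi, D, k):
    """Plain OMP on a compressed measurement y = Phi @ D @ x.

    Phi : (m, n) measurement matrix, D : (n, n) harmonic (inverse-DFT) dictionary,
    k   : assumed sparsity. Returns the sparse coefficient estimate x_hat.
    Generic OMP only, not the structured HAMP operator from the paper."""
    A = Phi @ D                      # effective sensing matrix
    r = y.copy()                     # residual
    support = []
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(A.conj().T @ r)))
        if j not in support:
            support.append(j)
        # least-squares refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1], dtype=complex)
    x_hat[support] = coef
    return x_hat

# toy example: a 2-tone signal sampled with a random Gaussian matrix
rng = np.random.default_rng(0)
n, m, k = 256, 64, 2
D = np.fft.ifft(np.eye(n)) * np.sqrt(n)          # unitary inverse-DFT dictionary
x = np.zeros(n, dtype=complex); x[[10, 40]] = [1.0, 0.7]
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ (D @ x)
print(np.flatnonzero(np.abs(omp_harmonic(y, Phi, D, k)) > 1e-6))
```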

Relevance:

30.00%

Publisher:

Abstract:

Using spectroscopic ellipsometry (SE), we have measured the optical properties and optical gaps of a series of amorphous carbon (a-C) films ∼100-300 Å thick, prepared using a filtered beam of C+ ions from a cathodic arc. Such films exhibit a wide range of sp3-bonded carbon contents, from 20 to 76 at.%, as measured by electron energy loss spectroscopy (EELS). The Tauc optical gaps of the a-C films increase monotonically from 0.65 eV for 20 at.% sp3 C to 2.25 eV for 76 at.% sp3 C. Spectra in the ellipsometric angles (1.5-5 eV) have been analyzed using different effective medium theories (EMTs), applying a simplified optical model for the dielectric function of a-C that assumes a composite material with sp2 C and sp3 C components. The most widely used EMT, namely that of Bruggeman (with three-dimensionally isotropic screening), yields atomic fractions of sp3 C that correlate monotonically with those obtained from EELS. The results of the SE analysis, however, range from 10 to 25 at.% higher than those from EELS. In fact, we have found that the volume percent sp3 C from SE using the Bruggeman EMT shows good numerical agreement with the atomic percent sp3 C from EELS. The SE-EELS discrepancy has been reduced by using an optical model in which the dielectric function of the a-C is determined as a volume-fraction-weighted average of the dielectric functions of the sp2 C and sp3 C components. © 1998 Elsevier Science S.A.
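For context on the Bruggeman analysis mentioned in this abstract, the snippet below solves the standard two-phase Bruggeman effective-medium equation for the effective dielectric function at a single photon energy. The dielectric values used are arbitrary placeholders, not the sp2 C and sp3 C reference data of the paper.

```python
import numpy as np

def bruggeman_2phase(eps1, eps2, f1):
    """Two-phase Bruggeman EMA with three-dimensionally isotropic screening.
    Solves  f1*(eps1 - e)/(eps1 + 2e) + (1 - f1)*(eps2 - e)/(eps2 + 2e) = 0,
    which reduces to the quadratic  2e^2 - b*e - eps1*eps2 = 0  with
        b = (2*f1 - f2)*eps1 + (2*f2 - f1)*eps2,   f2 = 1 - f1.
    Returns the physically meaningful root (non-negative imaginary part)."""
    f2 = 1.0 - f1
    b = (2 * f1 - f2) * eps1 + (2 * f2 - f1) * eps2
    disc = np.sqrt(b ** 2 + 8.0 * eps1 * eps2 + 0j)
    roots = ((b + disc) / 4.0, (b - disc) / 4.0)
    return next(e for e in roots if e.imag >= -1e-12)

# illustrative (made-up) dielectric values at one photon energy
eps_sp2 = 5.0 + 6.0j     # placeholder for the sp2 C component
eps_sp3 = 6.5 + 0.5j     # placeholder for the sp3 C component
for f_sp3 in (0.20, 0.50, 0.76):
    print(f_sp3, bruggeman_2phase(eps_sp3, eps_sp2, f_sp3))
```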

Relevance:

30.00%

Publisher:

Abstract:

State-of-the-art large vocabulary continuous speech recognition (LVCSR) systems often combine outputs from multiple subsystems developed at different sites. Cross-system adaptation can be used as an alternative to direct hypothesis-level combination schemes such as ROVER. In normal cross adaptation it is assumed that useful diversity among systems exists only at the acoustic level. However, complementary features among complex LVCSR systems also manifest themselves in other layers of the modelling hierarchy, e.g., the subword and word levels. It is thus interesting to also cross adapt language models (LMs) to capture them. In this paper, cross adaptation of multi-level LMs modelling both syllable and word sequences was investigated to improve LVCSR system combination. Significant error rate gains of up to 6.7% relative were obtained over ROVER and acoustic-model-only cross adaptation when combining 13 Chinese LVCSR subsystems used in the 2010 DARPA GALE evaluation. © 2010 ISCA.
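The abstract does not spell out how LM-level cross adaptation is performed. As a loose, simplified illustration of adapting one system's language model towards another system's output, the toy sketch below linearly interpolates a background unigram LM with a unigram LM estimated from the other subsystem's hypotheses; the multi-level syllable/word modelling of the paper is not reproduced.

```python
from collections import Counter

def unigram_lm(tokens, vocab, alpha=1.0):
    """Add-alpha unigram LM estimated from a token sequence (a toy stand-in
    for the syllable/word LMs discussed in the abstract)."""
    counts = Counter(tokens)
    total = sum(counts.values()) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def interpolate(p_background, p_adapt, lam=0.5):
    """Linear interpolation of a background LM with an LM estimated from
    another subsystem's hypotheses: one generic way to realise LM-level
    cross adaptation (not the multi-level scheme of the paper)."""
    return {w: lam * p_background[w] + (1 - lam) * p_adapt[w] for w in p_background}

# toy example: adapt system A's LM towards system B's 1-best hypotheses
vocab = ["a", "b", "c"]
background = unigram_lm("a a b a c b a".split(), vocab)
cross_hyps = unigram_lm("b b c b a b".split(), vocab)   # hypotheses from system B
print(interpolate(background, cross_hyps, lam=0.7))
```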

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Over the past decade, a variety of user models have been proposed for user-simulation-based reinforcement learning of dialogue strategies. However, the strategies learned with these models are rarely evaluated in actual user trials, and it remains unclear how the choice of user model affects the quality of the learned strategy. In particular, the degree to which strategies learned with a user model generalise to real user populations has not been investigated. This paper presents a series of experiments that qualitatively and quantitatively examine the effect of the user model on the learned strategy. Our results show that the performance and characteristics of the strategy are in fact highly dependent on the user model. Furthermore, a policy trained with a poor user model may appear to perform well when tested with the same model, but fail when tested with a more sophisticated user model. This raises significant doubts about the current practice of learning and evaluating strategies with the same user model. The paper further investigates a new technique for testing and comparing strategies directly on real human-machine dialogues, thereby avoiding any evaluation bias introduced by the user model. © 2005 IEEE.
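As a toy illustration of the issue raised above (training a policy with one user model and testing it with another), the sketch below learns a slot-filling dialogue policy with tabular Q-learning against a simple simulated user and then evaluates it against a less cooperative one. The slot-filling task, reward values and `p_answer` parameter are invented for illustration and are not the setup used in the paper.

```python
import random

random.seed(0)
N_SLOTS = 3
ACTIONS = list(range(N_SLOTS)) + ["close"]

def simulated_user(action, state, p_answer):
    """Toy user model for a slot-filling task: an asked slot gets filled with
    probability p_answer. Returns (next_state, reward, done)."""
    if action == "close":
        return state, (20.0 if all(state) else -20.0), True
    filled = list(state)
    if random.random() < p_answer:
        filled[action] = 1
    return tuple(filled), -1.0, False          # small per-turn cost

def q_learn(p_answer, episodes=5000, eps=0.1, alpha=0.2, gamma=0.95):
    """Tabular Q-learning of a dialogue policy against the simulated user."""
    Q = {}
    for _ in range(episodes):
        s, done = (0,) * N_SLOTS, False
        while not done:
            if random.random() < eps:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda b: Q.get((s, b), 0.0))
            s2, r, done = simulated_user(a, s, p_answer)
            best_next = 0.0 if done else max(Q.get((s2, b), 0.0) for b in ACTIONS)
            q_sa = Q.get((s, a), 0.0)
            Q[(s, a)] = q_sa + alpha * (r + gamma * best_next - q_sa)
            s = s2
    return Q

def evaluate(Q, p_answer, episodes=2000):
    """Average return of the greedy policy when run against a (possibly
    different) user model, capped at 20 turns per dialogue."""
    total = 0.0
    for _ in range(episodes):
        s = (0,) * N_SLOTS
        for _ in range(20):
            a = max(ACTIONS, key=lambda b: Q.get((s, b), 0.0))
            s, r, done = simulated_user(a, s, p_answer)
            total += r
            if done:
                break
    return total / episodes

Q = q_learn(p_answer=0.9)                     # train against an optimistic user model
print(evaluate(Q, 0.9), evaluate(Q, 0.5))     # same model vs. a less cooperative one
```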

Relevance:

30.00%

Publisher:

Abstract:

Product design development has increasingly become a collaborative process. Conflicts often appear in the design process due to interactions among multiple actors, so resolving conflict situations is a critical element of collaborative design. In this paper, a methodology based on a process model is proposed to support conflict management. This methodology deals mainly with identifying the conflict resolution team and evaluating the impact of the selected solution. The proposed process model provides design process traceability and identification of the data dependency network, which makes it possible to identify the conflict resolution actors as well as to evaluate the impact of the selected solution. Copyright © 2006 IFAC.

Relevance:

30.00%

Publisher:

Abstract:

An implementation of the inverse vector Jiles-Atherton model for the solution of non-linear hysteretic finite element problems is presented. The implementation applies the fixed-point method with differential reluctivity values obtained from the Jiles-Atherton model. Differential reluctivities are usually computed using numerical differentiation, which is ill-posed and amplifies small perturbations, causing large sudden increases or decreases in the differential reluctivity values that may lead to numerical problems. A rule-based algorithm for conditioning the differential reluctivity values is presented. Unwanted perturbations in the computed differential reluctivity values are eliminated or reduced with the aim of guaranteeing convergence. Details of the algorithm are presented together with an evaluation of the algorithm on a numerical example. The algorithm is shown to guarantee convergence, although the rate of convergence depends on the choice of algorithm parameters. © 2011 IEEE.
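The paper's actual conditioning rules are not given in the abstract. The sketch below only illustrates the general idea of taking a numerically differentiated reluctivity curve and conditioning it with simple rules (clamping to a physical range and limiting jumps between neighbouring values); the bounds, the jump limit and the fictitious B-H curve are assumptions made for the example.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def conditioned_diff_reluctivity(B, H, mu_r_max=1e5, max_jump=5.0):
    """Numerical differential reluctivity nu_d = dH/dB from sampled (B, H)
    points, followed by a simple rule-based conditioning pass:
      rule 1: clamp nu_d into the physical range [1/(mu0*mu_r_max), 1/mu0];
      rule 2: limit the relative jump between consecutive values.
    Illustration only; the rules, bounds and B-H data of the paper are
    not reproduced here."""
    nu = np.gradient(H, B)                                  # ill-posed numerical dH/dB
    nu = np.clip(nu, 1.0 / (MU0 * mu_r_max), 1.0 / MU0)     # rule 1: physical bounds
    for i in range(1, len(nu)):                             # rule 2: damp sudden spikes
        nu[i] = min(max(nu[i], nu[i - 1] / max_jump), nu[i - 1] * max_jump)
    return nu

# fictitious, slightly noisy B-H samples standing in for Jiles-Atherton output
B = np.linspace(1e-3, 1.8, 200)
H = 50.0 * np.sinh(2.0 * B) + np.random.default_rng(0).normal(0.0, 1.0, B.size)
print(conditioned_diff_reluctivity(B, H)[:5])
```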

Relevance:

30.00%

Publisher:

Abstract:

Computational Design has traditionally required a great deal of geometrical and parametric data. This data can only be supplied at stages later than conceptual design, typically the detail stage, and design quality is given by some absolute fitness function. On the other hand, design evaluation offers a relative measure of design quality that requires only a sparse representation. Quality, in this case, is a measure of how well a design will complete its task.

The research intends to address the question: "Is it possible to evaluate a mechanical design at the conceptual design phase and be able to make some prediction of its quality?" Quality can be interpreted as success in the marketplace, success in performing the required task, or some other user requirement. This work aims to determine a minimum level of representation such that conceptual designs can be usefully evaluated without needing to capture detailed geometry. This representation will form the model for the conceptual designs that are being considered for evaluation. The method to be developed will be a case-based evaluation system that uses a database of previous designs to support design exploration. The method will not be able to support novel design, since case-based design implies that the model topology must be fixed.
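As a minimal sketch of the case-based evaluation idea described above, the snippet below retrieves the nearest previous designs from a small case base using only a sparse numeric representation and predicts a quality score as a distance-weighted average. The feature names and quality scale are hypothetical, not the representation proposed by the research.

```python
import numpy as np

def predict_quality(query, case_features, case_quality, k=3):
    """k-nearest-neighbour case retrieval over a sparse conceptual-design
    representation (here just a short numeric feature vector). Predicted
    quality is the distance-weighted mean of the retrieved cases' scores."""
    d = np.linalg.norm(case_features - query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-9)
    return float(np.dot(w, case_quality[idx]) / w.sum())

# toy case base: [n_parts, n_joints, estimated_mass] -> quality score in [0, 1]
cases = np.array([[4, 3, 1.2], [10, 8, 3.5], [6, 5, 2.0], [12, 10, 4.1]])
quality = np.array([0.8, 0.4, 0.7, 0.3])
print(predict_quality(np.array([5, 4, 1.5]), cases, quality))
```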

Relevance:

30.00%

Publisher:

Abstract:

Current research into the process of engineering design is extending the use of computers towards the acquisition, representation and application of design process knowledge, in addition to the existing storage and manipulation of product-based models of design objects. This is a difficult task because the design of mechanical systems is a complex, often unpredictable process involving ill-structured problem solving skills and large amounts of knowledge, some of which may be incomplete and subjective in nature. Design problems require the integration of a variety of modes of working, such as numerical, graphical, algorithmic or heuristic, and demand that products be developed through synthesis, analysis and evaluation activities.

This report presents the results of a feasibility study into the blackboard approach and discusses the development of an initial prototype system that will enable an alphanumeric design dialogue between a designer and an expert to be analysed in a formal way, thus providing real-life protocol data on which to base the blackboard message structures.
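A minimal sketch of the blackboard architecture referred to in this report is given below: a shared blackboard holds entries at several levels, and independent knowledge sources read from and post to it under a simple fixed control strategy. The knowledge sources and dialogue statements are invented placeholders, not the prototype's actual message structures.

```python
class Blackboard:
    """Minimal blackboard: a shared store of entries that knowledge sources
    read from and post to. Purely illustrative of the architecture."""
    def __init__(self):
        self.entries = []

    def post(self, level, content):
        self.entries.append({"level": level, "content": content})

    def read(self, level):
        return [e["content"] for e in self.entries if e["level"] == level]

def requirement_ks(bb):
    """Knowledge source: turns raw dialogue statements into requirements."""
    for stmt in bb.read("dialogue"):
        if "must" in stmt:
            bb.post("requirement", stmt)

def synthesis_ks(bb):
    """Knowledge source: proposes a placeholder concept for each requirement."""
    for req in bb.read("requirement"):
        bb.post("concept", f"concept addressing: {req}")

bb = Blackboard()
bb.post("dialogue", "the shaft must transmit 5 kW")
bb.post("dialogue", "we could perhaps use aluminium")
for ks in (requirement_ks, synthesis_ks):   # simple fixed control strategy
    ks(bb)
print(bb.read("concept"))
```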

Relevance:

30.00%

Publisher:

Abstract:

A new method for the optimal design of Functionally Graded Materials (FGM) is proposed in this paper. Instead of the widely used explicit functional models, a feature-tree-based procedural model is proposed to represent generic material heterogeneities. A procedural model of this sort allows more than one explicit function to be incorporated to describe versatile material gradations, and the material composition at a given location is no longer computed by simple evaluation of an analytic function but obtained by execution of customizable procedures. This enables generic and diverse types of material variation to be represented and, most importantly, with a reasonably small number of design variables. The descriptive flexibility of the material heterogeneity formulation, together with the low dimensionality of the design vectors, facilitates the optimal design of functionally graded materials. Using the nature-inspired Particle Swarm Optimization (PSO) method, functionally graded materials with generic distributions can be efficiently optimized. We demonstrate, for the first time, that a PSO-based optimizer outperforms classical mathematical programming based methods, such as active set and trust region algorithms, in the optimal design of functionally graded materials. The underlying reason for this performance boost is also elucidated with the help of benchmarked examples. © 2011 Elsevier Ltd. All rights reserved.
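For readers unfamiliar with PSO, the sketch below is a plain global-best particle swarm optimiser applied to a stand-in quadratic objective over a handful of design variables. The FGM procedural model and its real objective function are not reproduced, and the swarm parameters are conventional defaults rather than the settings used in the paper.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, bounds=(-1.0, 1.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimiser (minimisation)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # positions = design variables
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()                                 # personal bests
    pbest_f = np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_f)].copy()             # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())

# stand-in objective over a few design variables (e.g. gradation-profile coefficients)
best_x, best_f = pso(lambda z: float(np.sum((z - 0.3) ** 2)), dim=4)
print(best_x, best_f)
```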

Relevance:

30.00%

Publisher:

Abstract:

In spite of over two decades of intense research, illumination and pose invariance remain prohibitively challenging aspects of face recognition for most practical applications. The objective of this work is to recognize faces using video sequences both for training and recognition input, in a realistic, unconstrained setup in which lighting, pose and user motion pattern have a wide variability and face images are of low resolution. The central contribution is an illumination invariant, which we show to be suitable for recognition from video of loosely constrained head motion. In particular, there are three contributions: (i) we show how a photometric model of image formation can be combined with a statistical model of generic face appearance variation to exploit the proposed invariant and generalize in the presence of extreme illumination changes; (ii) we introduce a video sequence re-illumination algorithm to achieve fine alignment of two video sequences; and (iii) we use the smoothness of the geodesically local appearance manifold structure and a robust same-identity likelihood to achieve robustness to unseen head poses. We describe a fully automatic recognition system based on the proposed method and an extensive evaluation on 323 individuals and 1474 video sequences with extreme illumination, pose and head motion variation. Our system consistently achieved a nearly perfect recognition rate (over 99.7% on all four databases). © 2012 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In this study, TiN/La2O3/HfSiON/SiO2/Si gate stacks with thick high-k (HK) and thick pedestal oxide were used. Samples were annealed at different temperatures and times in order to characterize in detail the interaction mechanisms between La and the gate stack layers. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) measurements performed on these samples show a time diffusion saturation of La in the high-k insulator, indicating an La front immobilization due to LaSiO formation at the high-k/interfacial layer. Based on the SIMS data, a technology computer aided design (TCAD) diffusion model including La time diffusion saturation effect was developed. © 2012 American Institute of Physics.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a hierarchical probabilistic model for ordinal matrix factorization. Unlike previous approaches, we model the ordinal nature of the data and take a principled approach to incorporating priors for the hidden variables. Two algorithms are presented for inference, one based on Gibbs sampling and one based on variational Bayes. Importantly, these algorithms can be applied to the factorization of very large matrices with missing entries. The model is evaluated on a collaborative filtering task, where users have rated a collection of movies and the system is asked to predict their ratings for other movies. The Netflix data set, which consists of around 100 million ratings, is used for evaluation. Using root mean-squared error (RMSE) as the evaluation metric, results show that the suggested model outperforms alternative factorization techniques. Results also show that Gibbs sampling outperforms variational Bayes on this task, despite the large number of ratings and model parameters. Matlab implementations of the proposed algorithms are available from cogsys.imm.dtu.dk/ordinalmatrixfactorization.
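As a simplified stand-in for the evaluation pipeline described above (factorize a sparse ratings matrix, predict held-out ratings, report RMSE), the sketch below uses plain SGD matrix factorization with a Gaussian loss on a toy data set; the ordinal likelihood, the hierarchical priors and the Gibbs/variational inference of the paper are not reproduced.

```python
import numpy as np

def factorize_sgd(ratings, n_users, n_items, rank=5, lr=0.01, reg=0.05,
                  epochs=50, seed=0):
    """Plain SGD matrix factorization with a Gaussian (not ordinal) likelihood,
    a simplified stand-in for the model in the abstract."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_items, rank))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - U[u] @ V[i]
            u_row = U[u].copy()
            U[u] += lr * (err * V[i] - reg * u_row)
            V[i] += lr * (err * u_row - reg * V[i])
    return U, V

def rmse(ratings, U, V):
    """Root mean-squared error over a set of (user, item, rating) triples."""
    se = [(r - U[u] @ V[i]) ** 2 for u, i, r in ratings]
    return float(np.sqrt(np.mean(se)))

# toy data: (user, item, rating) triples on a 1-5 scale
train = [(0, 0, 5), (0, 1, 3), (1, 0, 4), (1, 2, 1), (2, 1, 2), (2, 2, 5)]
test = [(0, 2, 4), (1, 1, 2)]
U, V = factorize_sgd(train, n_users=3, n_items=3)
print("test RMSE:", rmse(test, U, V))
```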