901 results for distinguishability metrics
Abstract:
As users continually request additional functionality, software systems will continue to grow in complexity, as well as in their susceptibility to failures. Particularly for sensitive systems requiring high levels of reliability, faulty system modules may increase development and maintenance cost. Hence, identifying them early would support the development of reliable systems through improved scheduling and quality control. As a consequence, research effort to predict which software modules are likely to contain faults has been substantial. Although a wide range of fault prediction models have been proposed, we remain far from having reliable tools that can be widely applied to real industrial systems. For projects with known fault histories, numerous studies show that statistical models can predict faulty modules with reasonable accuracy using software metrics. However, because context-specific metrics differ from project to project, prediction across projects is difficult to achieve. Prediction models obtained from one project's experience are ineffective at predicting fault-prone modules when applied to other projects. Hence, the ability to take full advantage of existing work in the software development community has been substantially limited. As a step towards solving this problem, in this dissertation we propose a fault prediction approach that exploits existing prediction models, adapting them to improve their ability to predict faulty system modules across different software projects.
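As a rough illustration of the kind of metrics-based, cross-project prediction this abstract discusses, here is a minimal sketch assuming hypothetical per-module metric files and a generic classifier; the metric names, file names, and model choice are placeholders, not the dissertation's actual approach.

```python
# Hypothetical sketch: train a fault-prediction model on one project's
# software metrics and apply it to a different project (cross-project
# prediction). Column names, file names, and the classifier are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

METRIC_COLS = ["loc", "cyclomatic_complexity", "fan_in", "fan_out"]  # assumed metric names

source = pd.read_csv("project_a_metrics.csv")  # assumed per-module metrics plus a 'faulty' label
target = pd.read_csv("project_b_metrics.csv")

# Standardizing each project's metrics separately is one simple way to
# compensate for the project-to-project distribution differences the
# abstract highlights.
X_src = StandardScaler().fit_transform(source[METRIC_COLS])
X_tgt = StandardScaler().fit_transform(target[METRIC_COLS])

model = LogisticRegression(max_iter=1000).fit(X_src, source["faulty"])
pred = model.predict_proba(X_tgt)[:, 1]

print("Cross-project AUC:", roc_auc_score(target["faulty"], pred))
```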
Abstract:
Many restaurant organizations have committed substantial effort to studying the relationship between a firm’s performance and its efforts to develop an effective human resources management reward-and-retention system. These studies have produced various metrics for determining the efficacy of restaurant management and human resources management systems. This paper explores the best metrics to use when calculating the overall unit performance of casual restaurant managers. These metrics were identified through an exploratory qualitative case study method that included interviews with executives and a Delphi study. Experts proposed several diverse metrics for measuring management value and performance. These factors appear to represent the interests of all stakeholders.
Abstract:
Current methods of understanding microbiome composition and structure rely on accurately estimating the number of distinct species and their relative abundance. Most of these methods require an efficient PCR whose forward and reverse primers bind well to the same large number of identifiable species and produce amplicons that are unique. It is therefore not surprising that the universal primers in current use, designed many years ago, are less efficient and fail to bind to recently cataloged species. We propose an automated, general method of designing PCR primer pairs that abides by primer design rules and uses a current sequence database as input. Since the method is automated, primers can be designed for targeted microbial species or updated as species are added to or deleted from the database. In silico and laboratory experiments confirm the efficacy of the newly designed primers for metagenomics applications.
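For orientation, the sketch below checks a few of the standard primer design rules such a pipeline would enforce (length, GC content, and a rough melting-temperature estimate via the Wallace rule); the thresholds and example sequences are common textbook heuristics, not the authors' actual algorithm.

```python
# Sketch of simple primer design-rule checks. Thresholds are common heuristics,
# and the Tm estimate uses the Wallace rule 2(A+T) + 4(G+C), which is only a
# rough approximation for short oligos.
def gc_content(seq: str) -> float:
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq: str) -> float:
    seq = seq.upper()
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

def passes_basic_rules(seq: str) -> bool:
    return (
        18 <= len(seq) <= 25                 # typical primer length
        and 0.40 <= gc_content(seq) <= 0.60  # moderate GC content
        and 50 <= wallace_tm(seq) <= 65      # workable melting temperature
        and "AAAA" not in seq.upper()        # avoid long homopolymer runs (A shown; others analogous)
    )

candidates = ["AGAGTTTGATCCTGGCTCAG", "GGTTACCTTGTTACGACTT"]  # example 16S-style sequences
for c in candidates:
    print(c, passes_basic_rules(c), f"GC={gc_content(c):.2f}", f"Tm={wallace_tm(c)}")
```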
Abstract:
I discuss geometry and normal forms for pseudo-Riemannian metrics with parallel spinor fields in some interesting dimensions. I also discuss the interaction of these conditions for parallel spinor fields with the condition that the Ricci tensor vanish (which, for pseudo-Riemannian manifolds, is not an automatic consequence of the existence of a nontrivial parallel spinor field).
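For orientation, the standard argument behind the closing remark can be summarized in one line; this is textbook material and not specific to this work: a parallel spinor forces the spinor curvature to annihilate it, and Clifford contraction then constrains the Ricci tensor.

```latex
% Standard integrability computation for a parallel spinor \psi:
\nabla \psi = 0
\;\Longrightarrow\;
R^{S}(X,Y)\,\psi = 0
\;\Longrightarrow\;
\operatorname{Ric}(X)\cdot \psi = 0 \quad \text{for all } X.
% In Riemannian signature, Clifford multiplication by a nonzero vector is
% injective, so Ric = 0. In indefinite signature, Ric(X) is only forced to be
% a null vector, so the Ricci tensor need not vanish.
```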
Abstract:
The central idea of this dissertation is to interpret certain invariants constructed from Laplace spectral data on a compact Riemannian manifold as regularized integrals of closed differential forms on the space of Riemannian metrics, or more generally on a space of metrics on a vector bundle. We apply this idea to both the Ray-Singer analytic torsion and the eta invariant, explaining their dependence on the metric used to define them with a Stokes' theorem argument. We also introduce analytic multi-torsion, a generalization of analytic torsion, in the context of certain manifolds with local product structure; we prove that it is metric independent in a suitable sense.
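For reference, the Ray-Singer analytic torsion mentioned above is the standard spectral invariant built from the zeta functions of the form Laplacians; a common formulation, stated here only for orientation, is:

```latex
% Zeta function of the Laplacian \Delta_q on q-forms (nonzero spectrum only):
\zeta_q(s) = \sum_{\lambda \in \operatorname{spec}(\Delta_q)\setminus\{0\}} \lambda^{-s},
\qquad
% Ray-Singer analytic torsion of a closed n-dimensional manifold (M, g):
\log T(M,g) = \frac{1}{2} \sum_{q=0}^{n} (-1)^{q}\, q\, \zeta_q'(0).
```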
Abstract:
Clustering algorithms, pattern mining techniques and associated quality metrics have emerged as reliable methods for modeling learners’ performance, comprehension and interaction in given educational scenarios. The specificity of the available data, such as missing values, extreme values or outliers, makes it challenging to extract user models that are significant from an educational perspective. In this paper we introduce a pattern detection mechanism within our data analytics tool, based on k-means clustering and on the SSE, silhouette, Dunn index and Xie-Beni index quality metrics. Experiments performed on a dataset obtained from our online e-learning platform show that the extracted interaction patterns were representative in classifying learners. Furthermore, the performed monitoring activities created a strong basis for generating automatic feedback to learners regarding their course participation, while relying on their previous performance. In addition, our analysis introduces automatic triggers that highlight learners who are likely to fail the course, enabling tutors to take timely actions.
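A minimal sketch of the kind of clustering and quality-metric computation described above, assuming synthetic per-learner feature data; it uses scikit-learn's k-means together with SSE (inertia) and the silhouette coefficient. The Dunn and Xie-Beni indices would be computed analogously from the same pairwise distances.

```python
# Sketch: cluster learner-interaction features with k-means and report two of
# the quality metrics mentioned above (SSE and silhouette). The data is
# synthetic; real input would be per-learner features such as logins,
# time on task, or quiz scores.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(50, 3)),  # e.g., low-activity learners
    rng.normal(loc=3.0, scale=0.5, size=(50, 3)),  # e.g., high-activity learners
])

for k in range(2, 6):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    sse = km.inertia_                      # sum of squared errors (SSE)
    sil = silhouette_score(X, km.labels_)  # silhouette coefficient
    print(f"k={k}  SSE={sse:.1f}  silhouette={sil:.3f}")
```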
Abstract:
The second theme of this book concerns L&D’s ‘Contributions’, specifically how L&D professionals articulate, communicate and demonstrate the value they bring to the organization. Chapter 3, titled ‘Using information, metrics and developing business cases for L&D’, discusses how L&D professionals can do this using the business case as a vehicle. The business case is a tool that L&D professionals can use to show how new L&D initiatives can benefit the organization and its stakeholders. The value of such benefits can be ‘articulated’ both quantitatively and qualitatively. Chapter 3 adopts a holistic approach to developing a business case. L&D professionals must have a competent working knowledge of accounting and finance without needing to be experts – their expertise lies in L&D. Therefore, to successfully complete a business case, L&D professionals need to form teams comprising the right members (depending on what the business case is about). The political realities associated with the development of a business case are also important considerations. How well L&D is able to ‘sell’ a business case depends on how well it is framed, usually either as a problem or as an opportunity. We then discuss the information, data and metrics required to build a typical business case, specifically in terms of identifying the benefits and costs. The chapter concludes with some suggestions on how the findings from the business case can be presented in infographics-inspired form.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This study was designed to validate a constructivist learning framework, herein referred to as Accessible Immersion Metrics (AIM), for second language acquisition (SLA), and to compare two delivery methods of the same framework. The AIM framework was originally developed in 2009 and is proposed as a “How to” guide for applying constructivist learning principles to the second language classroom. Piloted in 2010 at Champlain College St-Lambert, the AIM model allows language learning to occur free of a fixed schedule, to be socially constructive through the use of task-based assessments, and to remain relevant to the learner’s life experience by focusing on students’ needs rather than on course content.
Abstract:
OBJECTIVE: Intravoxel incoherent motion (IVIM) is an MRI technique with potential applications in measuring brain tumor perfusion, but its clinical impact remains to be determined. We assessed the usefulness of IVIM-metrics in predicting survival in newly diagnosed glioblastoma. METHODS: Fifteen patients with glioblastoma underwent MRI including spin-echo echo-planar DWI using 13 b-values ranging from 0 to 1000 s/mm². Parametric maps for diffusion coefficient (D), pseudodiffusion coefficient (D*), and perfusion fraction (f) were generated for contrast-enhancing regions (CER) and non-enhancing regions (NCER). Regions of interest were manually drawn in regions of maximum f and on the corresponding dynamic susceptibility contrast images. Prognostic factors were evaluated by Kaplan-Meier survival and Cox proportional hazards analyses. RESULTS: We found that fCER and D*CER correlated with rCBFCER. The best cutoffs for 6-month survival were fCER > 9.86% and D*CER > 21.712 × 10⁻³ mm²/s (100% sensitivity, 71.4% specificity, 100% and 80% positive predictive values, and 80% and 100% negative predictive values; AUC: 0.893 and 0.857, respectively). Treatment yielded the highest hazard ratio (5.484; 95% CI: 1.162-25.88; AUC: 0.723; P = 0.031); fCER combined with treatment predicted survival with 100% accuracy. CONCLUSIONS: The IVIM-metrics fCER and D*CER are promising biomarkers of 6-month survival in newly diagnosed glioblastoma.
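For readers unfamiliar with IVIM, the D, D* and f maps come from fitting a biexponential signal decay over the b-values voxel by voxel. The sketch below fits the commonly used simplified IVIM model to synthetic single-voxel data; it is an illustration of the general technique, not the authors' exact pipeline.

```python
# Sketch: fit the standard simplified IVIM biexponential model
#   S(b)/S(0) = f * exp(-b * D*) + (1 - f) * exp(-b * D)
# to one voxel's signal across b-values. Data is synthetic; a real analysis
# would loop over voxels (and typically enforce D* >> D).
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, f, d_star, d):
    return f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d)

b_values = np.array([0, 10, 20, 40, 80, 150, 300, 500, 700, 800, 900, 1000], dtype=float)  # s/mm^2
true = dict(f=0.10, d_star=20e-3, d=0.8e-3)  # plausible brain-tissue values, in mm^2/s
signal = ivim(b_values, **true) + np.random.default_rng(1).normal(0, 0.01, b_values.size)

popt, _ = curve_fit(
    ivim, b_values, signal,
    p0=[0.1, 10e-3, 1e-3],
    bounds=([0, 1e-3, 1e-4], [0.5, 100e-3, 3e-3]),
)
f_fit, d_star_fit, d_fit = popt
print(f"f={f_fit:.3f}  D*={d_star_fit*1e3:.1f}e-3 mm^2/s  D={d_fit*1e3:.2f}e-3 mm^2/s")
```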
Abstract:
This was presented during the 2nd annual Library Research and Innovation Practices at the University of Maryland Libraries, McKeldin Library, on June 8, 2016.
Abstract:
This presentation was one of four during a Mid-Atlantic Regional Archives Conference presentation on April 15, 2016. Digitization of collections can help to improve internal workflows, make materials more accessible, and create new and engaging relationships with users. Laurie Gemmill Arp will discuss the LYRASIS Digitization Collaborative, created to assist institutions with their digitization needs, and how it has worked to help institutions increase connections with users. Robin Pike from the University of Maryland will discuss how they factor requests for access into selection for digitization and how they track the use of digitized materials. Laura Drake Davis of James Madison University will discuss the establishment of a formal digitization program, its impact on users, and the resulting increased use of their collections. Linda Tompkins-Baldwin will discuss Digital Maryland’s partnership with the Digital Public Library of America to provide access to archives held by institutions without a digitization program.