49 results for Quality evaluation and certification
Abstract:
The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison. The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.
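As a generic illustration of the kind of quantity such a statistical tool reports (not the specific model evaluated in this study), the strength of the corresponding features can be expressed as a likelihood ratio

LR = \frac{\Pr(E \mid H_{ss})}{\Pr(E \mid H_{ds})},

where E is the observed agreement of friction ridge features, H_{ss} is the proposition that the mark and the print come from the same source, and H_{ds} the proposition that they come from different sources; values above 1 support the same-source proposition and values below 1 the different-source proposition.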
Abstract:
This article examines the extent and limits of nonstate forms of authority in international relations. It analyzes how the information and communication technology (ICT) infrastructure for the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. By focusing on the challenge that highly volatile and short-lived cycles of demand for this type of knowledge pose for ensuring the right qualification of the labor force, the article explores how companies and associations provide training and certification programs as part of a growing market for educational services that sets its own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasized that the consent of actors, subject to informal rules and some form of state support, remains crucial for the effectiveness of those new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues concerned and fail to fully explore the differentiated space in which nonstate authority is emerging. This article develops a three-dimensional analytical framework that brings together the scope of the issues involved, the range of nonstate actors concerned, and the spatial scope of their authority. The empirical findings highlight the limits of these new forms of nonstate authority and shed light on the role of the state and international governmental organizations in this new context.
Abstract:
In 2001, it became evident that the domiciliary care nurses needed a tool to assist them in treating patients with chronic wounds. A protocol was therefore developed which could be used not only by the nurses but also by doctors and other health care professionals working in home care. As a parallel measure, a network of nurses specialised in wound care and available for advice and consultation was established.
Abstract:
Selectome (http://selectome.unil.ch/) is a database of positive selection, based on a branch-site likelihood test. This model estimates the rates of nonsynonymous substitutions (dN) and synonymous substitutions (dS) to evaluate the variation in selective pressure (dN/dS ratio) over branches and over sites. Since the original release of Selectome, we have benchmarked and implemented a thorough quality control procedure on multiple sequence alignments, aiming to minimize false-positive results. We have also improved the computational efficiency of the branch-site test implementation, allowing larger data sets and more frequent updates. Release 6 of Selectome includes all gene trees from Ensembl for Primates and Glires, as well as a large set of vertebrate gene trees. A total of 6810 gene trees have some evidence of positive selection. Finally, the web interface has been improved to be more responsive and to facilitate searches and browsing.
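For orientation, the quantities underlying such a test (stated generically, not as the exact parameterization used by Selectome) are the ratio \omega = dN/dS, with \omega > 1 indicating positive selection, \omega = 1 neutral evolution, and \omega < 1 purifying selection, together with a likelihood ratio statistic comparing a null model that caps \omega at 1 on the tested branch against an alternative model allowing \omega > 1:

2\Delta\ell = 2(\ell_{alternative} - \ell_{null}),

which is compared to a chi-square-based null distribution to decide whether the branch shows evidence of positive selection at some sites.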
Abstract:
Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales.
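For reference, the depth-averaged shallow water equations referred to above take the standard form (written generically; the specific numerical treatment in the study may differ)

\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0,
\frac{\partial (hu)}{\partial t} + \frac{\partial (hu^2)}{\partial x} + \frac{\partial (huv)}{\partial y} = -gh\,\frac{\partial \eta}{\partial x} - \frac{\tau_{bx}}{\rho} + \text{(turbulent diffusion terms)},

with an analogous momentum equation in y, where h is flow depth, (u, v) are the depth-averaged velocity components, \eta is water surface elevation, \tau_b is bed shear stress, \rho is water density, and g is gravitational acceleration. The RC model avoids solving the momentum equations altogether and instead estimates depth from the fixed-lid planar water surface, which is consistent with the reported four-orders-of-magnitude speed advantage.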
Abstract:
Background: Cardiac magnetic resonance (CMR) is accepted as a method to assess suspected coronary artery disease (CAD). Nonetheless, invasive coronary angiography (CXA), with or without fractional flow reserve (FFR), remains the main diagnostic test to evaluate CAD. Few data exist on the economic impact of the use of these procedures in a population with a low to intermediate pre-test probability. Objective: To compare the costs of 3 decision strategies to revascularize a patient with suspected CAD: 1) a strategy guided by CMR; 2) a hypothetical strategy guided by CXA-FFR; and 3) a hypothetical strategy guided by CXA alone.
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, in everyday life as well as at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
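A compact statement of the coherence requirement at issue, in the odds form of Bayes' theorem routinely used in forensic evaluation (given here as a generic illustration rather than a formula from the article), is

\frac{\Pr(H_1 \mid E)}{\Pr(H_2 \mid E)} = \frac{\Pr(E \mid H_1)}{\Pr(E \mid H_2)} \times \frac{\Pr(H_1)}{\Pr(H_2)},

that is, posterior odds equal the likelihood ratio times the prior odds; a coherent reasoner updates beliefs about the competing propositions H_1 and H_2 in proportion to the likelihood ratio for the evidence E, whatever the prior odds.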
Abstract:
The concept of energy gap(s) is useful for understanding the consequence of a small daily, weekly, or monthly positive energy balance and the inconspicuous shift in weight gain ultimately leading to overweight and obesity. Energy gap is a dynamic concept: an initial positive energy gap incurred via an increase in energy intake (or a decrease in physical activity) is not constant, may fade out with time if the initial conditions are maintained, and depends on the 'efficiency' with which the readjustment of the energy imbalance gap occurs with time. The metabolic response to an energy imbalance gap and the magnitude of the energy gap(s) can be estimated by at least two methods, namely: (i) assessment by longitudinal overfeeding studies, imposing (by design) an initial positive energy imbalance gap; (ii) retrospective assessment based on epidemiological surveys, whereby the accumulated endogenous energy storage per unit of time is calculated from the change in body weight and body composition. In order to illustrate the difficulty of accurately assessing an energy gap we have used, as an example, a recent epidemiological study which tracked changes in total energy intake (estimated by gross food availability) and body weight over 3 decades in the US, combined with total energy expenditure prediction from body weight using doubly labelled water data. At the population level, the study attempted to assess the cause of the energy gap, which was purported to be entirely due to increased food intake. Based on an estimate of change in energy intake judged to be more reliable (i.e. in the same study population) and together with calculations of simple energetic indices, our analysis suggests that conclusions about the fundamental causes of obesity development in a population (excess intake vs. low physical activity, or both) are clouded by a high level of uncertainty.
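The retrospective method in (ii) can be sketched numerically as follows; the tissue energy densities and the storage efficiency used here are illustrative assumptions, not values taken from the study.

# Hedged sketch of a retrospective energy-gap estimate from body-composition change.
# The energy densities below (~9.4 kcal/g for fat, ~1.8 kcal/g for fat-free mass)
# and the storage efficiency are illustrative assumptions, not figures from the study.

FAT_KJ_PER_KG = 39_300   # energy density of fat mass gained
FFM_KJ_PER_KG = 7_500    # energy density of fat-free mass gained (protein + water)

def energy_gap_kj_per_day(delta_fat_kg: float, delta_ffm_kg: float,
                          days: float, efficiency: float = 0.85) -> float:
    """Average daily positive energy gap implied by the observed change in
    fat mass and fat-free mass over `days`, given a storage efficiency."""
    stored_kj = delta_fat_kg * FAT_KJ_PER_KG + delta_ffm_kg * FFM_KJ_PER_KG
    return stored_kj / (days * efficiency)

# Example: +8 kg fat and +2 kg fat-free mass accumulated over 10 years
print(round(energy_gap_kj_per_day(8.0, 2.0, 3650)))  # ~106 kJ/day (about 25 kcal/day)

Even this simple calculation shows why the estimate is sensitive to the assumed composition of the weight gained and to the efficiency of energy storage, which is one source of the uncertainty discussed above.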
Abstract:
The R package EasyStrata facilitates the evaluation and visualization of results of stratified genome-wide association meta-analyses (GWAMAs). It provides (i) statistical methods to test and account for between-strata difference as a means to tackle gene-strata interaction effects and (ii) extended graphical features tailored for stratified GWAMA results. The software provides further features also suitable for general GWAMAs, including functions to annotate, exclude or highlight specific loci in plots or to extract independent subsets of loci from genome-wide datasets. It is freely available and includes a user-friendly scripting interface that simplifies data handling and allows for combining statistical and graphical functions in a flexible fashion. AVAILABILITY: EasyStrata is available for free (under the GNU General Public License v3) from our Web site www.genepi-regensburg.de/easystrata and from the CRAN R package repository cran.r-project.org/web/packages/EasyStrata/. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
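A commonly used between-strata difference test of this kind (given as a generic illustration; the exact statistic implemented in EasyStrata may differ) compares stratum-specific effect estimates \beta_1 and \beta_2 with standard errors se_1 and se_2:

z_{diff} = \frac{\beta_1 - \beta_2}{\sqrt{se_1^2 + se_2^2 - 2\,r\,se_1 se_2}},

where r accounts for correlation between the stratified estimates (for example from overlapping samples or shared controls) and z_{diff} is referred to a standard normal distribution to test for a gene-strata interaction effect.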
Abstract:
Background: The public health burden of coronary artery disease (CAD) is substantial. Perfusion cardiac magnetic resonance (CMR) is a generally accepted method to detect and monitor CAD. Few studies have so far addressed its costs and cost-effectiveness. Objectives: To compare, in a large CMR registry, the costs of a CMR-guided strategy vs two hypothetical invasive strategies for the diagnosis and the treatment of patients with suspected CAD. Methods: In 3,647 patients with suspected CAD included prospectively in the EuroCMR Registry (59 centers; 18 countries), costs were calculated for diagnostic examinations and revascularizations as well as for complication management over a 1-year follow-up. Patients with ischemia-positive CMR underwent an invasive X-ray coronary angiography (CXA) and revascularization at the discretion of the treating physician (=CMR+CXA strategy). Ischemia was found in 20.9% of patients and 17.4% of them were revascularized. In patients with ischemia-negative CMR, cardiac death and non-fatal myocardial infarctions occurred in 0.38%/y. In a hypothetical invasive arm the costs were calculated for an initial CXA followed by FFR testing in vessels with ≥50% diameter stenoses (=CXA+FFR strategy). To model this hypothetical arm, the same proportion of ischemic patients and the same outcome were assumed as for the CMR+CXA strategy. The coronary stenosis-FFR relationship reported in the literature was used to derive the proportion of patients with ≥50% diameter stenoses (Psten) in the study cohort. The costs of a CXA-only strategy were also calculated. Calculations were performed from a third-party payer perspective for the German, UK, Swiss, and US healthcare systems.
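One hedged reading of the modelling step that derives Psten (illustrative notation only; the registry's actual calculation may differ): if the literature-based stenosis-FFR relationship implies that a fraction f of ≥50% diameter stenoses are haemodynamically significant, then the prevalence of ≥50% stenoses consistent with the observed 20.9% ischemia rate is approximately

P_{sten} \approx \frac{P_{ischemia}}{f} = \frac{0.209}{f},

so that in the CXA+FFR arm all patients incur the cost of CXA, the P_{sten} fraction additionally incur the cost of FFR testing, and the same patients as in the CMR+CXA arm are assumed to be revascularized.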
Abstract:
OBJECTIVE: To develop disease-specific recommendations for the diagnosis and management of eosinophilic granulomatosis with polyangiitis (Churg-Strauss syndrome) (EGPA). METHODS: The EGPA Consensus Task Force experts comprised 8 pulmonologists, 6 internists, 4 rheumatologists, 3 nephrologists, 1 pathologist and 1 allergist from 5 European countries and the USA. Using a modified Delphi process, a list of 40 questions was drawn up by 2 members and sent to all participants prior to the meeting. Concurrently, an extensive literature search was undertaken, with publications assigned a level of evidence according to accepted criteria. Drafts of the recommendations were circulated for review to all members until final consensus was reached. RESULTS: Twenty-two recommendations concerning the diagnosis, initial evaluation, treatment and monitoring of EGPA patients were established. The relevant published information on EGPA, antineutrophil-cytoplasm antibody-associated vasculitides, hypereosinophilic syndromes and eosinophilic asthma supporting these recommendations was also reviewed. DISCUSSION: These recommendations aim to give physicians tools for effective and individualized management of EGPA patients, and to provide guidance for further targeted research.
Abstract:
Due to various contexts and processes, forensic science communities may have different approaches, largely influenced by their criminal justice systems. However, forensic science practices share some common characteristics. One is the assurance of a high (scientific) quality within processes and practices. For most crime laboratory directors and forensic science associations, this issue is conditioned by the triangle of quality, which represents the current paradigm of quality assurance in the field. It consists of the implementation of standardization, certification, accreditation, and an evaluation process. It constitutes a clear and sound way to exchange data between laboratories and enables databasing, because standardized methods ensure reliable and valid results; but it is also a means of defining minimum requirements for practitioners' skills for specific forensic science activities. The control of each of these aspects offers non-forensic science partners the assurance that the entire process has been mastered and is trustworthy. Most of the standards focus on the analysis stage and do not consider pre- and post-laboratory stages, namely, the work achieved at the investigation scene and the evaluation and interpretation of the results, intended for intelligence beneficiaries or for court. Such localized consideration prevents forensic practitioners from identifying where the problems really lie with regard to criminal justice systems. According to a performance-management approach, scientific quality should not be restricted to standardized procedures and controls in forensic science practice. Ensuring high quality also strongly depends on the way a forensic science culture is assimilated (in specific education, training, and workplaces) and on the way practitioners understand forensic science as a whole.