930 results for Precision-recall analysis
Abstract:
OBJECTIVE: To investigate whether breast-feeding, or exclusive breast-feeding, is associated with a reduced risk of type 1 diabetes in children, by performing a pooled analysis with adjustment for recognized confounders.
RESEARCH DESIGN AND METHODS: Relevant studies were identified from literature searches using MEDLINE, Web of Science, and EMBASE. Authors of relevant studies were asked to provide individual participant data or conduct prespecified analyses. Meta-analysis techniques were used to combine odds ratios (ORs) and investigate heterogeneity between studies.
RESULTS: Data were available from 43 studies including 9,874 patients with type 1 diabetes. Overall, there was a reduction in the risk of diabetes after exclusive breast-feeding for >2 weeks (20 studies; OR = 0.75, 95% CI 0.64-0.88), the association after exclusive breast-feeding for >3 months was weaker (30 studies; OR = 0.87, 95% CI 0.75-1.00), and no association was observed after (nonexclusive) breast-feeding for >2 weeks (28 studies; OR = 0.93, 95% CI 0.81-1.07) or >3 months (29 studies; OR = 0.88, 95% CI 0.78-1.00). These associations were all subject to marked heterogeneity (I² = 58%, 76%, 54%, and 68%, respectively). In studies with lower risk of bias, the reduced risk after exclusive breast-feeding for >2 weeks remained (12 studies; OR = 0.86, 95% CI 0.75-0.99), and heterogeneity was reduced (I² = 0%). Adjustments for potential confounders altered these estimates very little.
CONCLUSIONS: The pooled analysis suggests weak protective associations between exclusive breast-feeding and type 1 diabetes risk. However, these findings are difficult to interpret because of the marked variation in effect and possible biases (particularly recall bias) inherent in the included studies.
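The pooling step described above combines per-study log odds ratios by inverse-variance weighting and quantifies heterogeneity with Cochran's Q and I². As a rough illustrative sketch only (not the authors' analysis; the fixed-effect case, with standard errors recovered from the reported 95% CIs):

```python
import math

def pool_odds_ratios(ors, cis):
    """Inverse-variance fixed-effect pooling of odds ratios.

    ors: list of study odds ratios; cis: list of (lower, upper) 95% CIs.
    The standard error of each log-OR is recovered from the CI width:
    se = (ln(upper) - ln(lower)) / (2 * 1.96).
    Returns the pooled OR, its 95% CI, and the I^2 heterogeneity statistic (%).
    """
    logs = [math.log(o) for o in ors]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * x for w, x in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    # Cochran's Q; I^2 = max(0, (Q - df) / Q) expressed as a percentage
    q = sum(w * (x - pooled) ** 2 for w, x in zip(weights, logs))
    df = len(ors) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci, i2
```

Given the marked heterogeneity reported, a random-effects model (e.g. DerSimonian-Laird, which inflates each study's variance by an estimate of between-study variance) would be the more defensible choice in practice.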
Abstract:
Decision making is an important element throughout the life-cycle of large-scale projects. Decisions are critical as they have a direct impact upon the success/outcome of a project and are affected by many factors including the certainty and precision of information. In this paper we present an evidential reasoning framework which applies Dempster-Shafer Theory and its variant Dezert-Smarandache Theory to aid decision makers in making decisions where the knowledge available may be imprecise, conflicting and uncertain. This conceptual framework is novel as natural language based information extraction techniques are utilized in the extraction and estimation of beliefs from diverse textual information sources, rather than assuming these estimations as already given. Furthermore we describe an algorithm to define a set of maximal consistent subsets before fusion occurs in the reasoning framework. This is important as inconsistencies between subsets may produce results which are incorrect/adverse in the decision making process. The proposed framework can be applied to problems involving material selection and a Use Case based in the Engineering domain is presented to illustrate the approach. © 2013 Elsevier B.V. All rights reserved.
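The fusion step at the heart of such an evidential reasoning framework is Dempster's rule of combination. A minimal sketch of that rule for two sources (illustrative only; the paper's belief extraction from text and its maximal-consistent-subset algorithm are not reproduced, and the material-selection masses below are hypothetical):

```python
def combine_dempster(m1, m2):
    """Dempster's rule of combination for two basic belief assignments.

    m1, m2: dicts mapping frozenset hypotheses to masses summing to 1.
    Mass assigned to conflicting (disjoint) pairs is discarded and the
    remainder renormalised; total conflict (K = 1) raises an error.
    """
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are fully contradictory")
    norm = 1.0 - conflict
    return {h: w / norm for h, w in combined.items()}

# Hypothetical material-selection example over the frame {steel, alloy}:
m1 = {frozenset({'steel'}): 0.6, frozenset({'steel', 'alloy'}): 0.4}
m2 = {frozenset({'steel'}): 0.5, frozenset({'alloy'}): 0.3,
      frozenset({'steel', 'alloy'}): 0.2}
fused = combine_dempster(m1, m2)
```

The renormalisation step is exactly why the paper's maximal-consistent-subset preprocessing matters: combining highly conflicting sources directly can produce counter-intuitive fused beliefs.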
Abstract:
Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives due to it being a high cost, non-value added operation that impacts on cycle-time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability with additional sites added over time in response to process issues. As a result, it is often the case that in mature processes significant redundancy can exist in wafer measurement plans. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.
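One plausible reading of the forward-selection step (a sketch under stated assumptions, not the paper's implementation: the function name and the greedy least-squares criterion are illustrative) greedily picks the measurement sites whose readings best reconstruct every other site's readings:

```python
import numpy as np

def forward_select_sites(X, k):
    """Greedy forward selection of wafer measurement sites (FSCA-style sketch).

    X: (wafers x sites) metrology matrix. At each step, add the site whose
    inclusion lets a least-squares fit on the selected columns explain the
    most total variance of the full matrix. Returns the k selected indices.
    """
    Xc = X - X.mean(axis=0)  # mean-centre each site's readings
    selected, remaining = [], list(range(Xc.shape[1]))
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in remaining:
            cols = Xc[:, selected + [j]]
            # regress every site onto the candidate subset
            beta, *_ = np.linalg.lstsq(cols, Xc, rcond=None)
            score = -np.sum((Xc - cols @ beta) ** 2)  # higher = less residual
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

The virtual-metrology reconstruction mentioned in the abstract then amounts to predicting the full wafer profile from the selected columns, i.e. `cols @ beta` for a new wafer's measurements at the retained sites.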
Abstract:
OBJECTIVES: Precision Teaching (PT) has been shown to be an effective intervention for assessing teaching-method effectiveness and evaluating learning outcomes. SAFMEDS (Say All Fast Minute Every Day Shuffled) is a practice/assessment procedure within the PT framework that assists learning and fluency. We explored the effects of a brief PT intervention designed to build high-frequency performance in safe intravenous (IV) fluid prescription in a group of final-year undergraduate medical students.
METHODS: A total of 133 final-year undergraduate medical students completed a multiple choice question (MCQ) test on safe IV fluid prescription at the beginning and end of the study. The control group (n = 76) was taught using the current standardized teaching method. Students allocated to the intervention arm were additionally instructed on PT and the use of SAFMEDS. The study group (n = 57) received 50 SAFMEDS cards containing information on the principles of IV fluid prescription scenarios; these students were trained/tested twice per day for 1 minute.
RESULTS: Interim analysis showed that the study group displayed an improvement in fluency and accuracy as the study progressed. There was a statistically significant improvement in MCQ performance for the PT group compared with the control group between the beginning and end of the study (35% vs 15%).
CONCLUSION: These results suggest PT employing SAFMEDS is an effective method for improving fluency, accuracy and patient safety in intravenous fluid prescribing amongst undergraduate medical students.
Abstract:
The precise knowledge of the temperature of an ultracold lattice gas simulating a strongly correlated system is a question of both fundamental and technological importance. Here, we address this question by combining tools from quantum metrology with the study of the quantum correlations embedded in the system at finite temperature. Within this framework we examine the spin-1/2 XY chain, first estimating, by means of the quantum Fisher information, the lowest attainable bound on the temperature precision. We then address the estimation of the temperature of the sample from the analysis of correlations using a quantum non-demolition Faraday spectroscopy method. Remarkably, our results show that collective quantum correlations can become optimal observables for accurately estimating the temperature of our model in a given range of temperatures.
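The quantum Cramér-Rao bound that underlies such precision estimates, δT ≥ 1/√(M F_Q(T)), can be made concrete for the simplest case of a single thermal two-level system (an illustrative sketch only, not the XY-chain calculation of the abstract; units with k_B = ħ = 1 are assumed):

```python
import math

def thermal_qfi(gap, T):
    """Quantum Fisher information of T for a thermal two-level system.

    For a Gibbs state diagonal in the energy basis (level gap `gap`), the
    QFI reduces to the classical Fisher information of the excited-state
    population p = e^{-gap/T} / (1 + e^{-gap/T}):
        F(T) = (dp/dT)^2 / (p (1 - p)).
    """
    p = math.exp(-gap / T) / (1.0 + math.exp(-gap / T))
    dp_dT = (gap / T**2) * p * (1.0 - p)  # derivative of the logistic population
    return dp_dT**2 / (p * (1.0 - p))

def temperature_bound(gap, T, shots):
    """Quantum Cramer-Rao lower bound on dT after `shots` independent measurements."""
    return 1.0 / math.sqrt(shots * thermal_qfi(gap, T))
```

The bound shrinks as 1/√shots; the abstract's point is that for the interacting chain, collective correlation measurements can saturate (or approach) this bound in a useful temperature window.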
Abstract:
We present a detailed study of the use of a non-parallel, inhomogeneous magnetic field spectrometer for the investigation of laser-accelerated ion beams. Employing a wedged yoke design, we demonstrate the feasibility of an in-situ self-calibration technique of the non-uniform magnetic field and show that high-precision measurements of ion energies are possible in a wide-angle configuration. We also discuss the implications of a stacked detector system for unambiguous identification of different ion species present in the ion beam and explore the feasibility of detection of high energy particles beyond 100 MeV/amu in radiation harsh environments.
Abstract:
In order to carry out high-precision machining of aerospace structural components with large size, thin walls and complex surfaces, this paper proposes a novel parallel kinematic machine (PKM) and formulates its semi-analytical theoretical stiffness model considering gravitational effects, verified by stiffness experiments. From the viewpoint of topology, the novel PKM consists of two substructures, a redundant and an overconstrained parallel mechanism, connected by two interlinked revolute joints. The theoretical stiffness model is established from the virtual work principle and the deformation superposition principle, after mapping the stiffness models of the substructures from joint space to operational space via Jacobian matrices and accounting for the deformation contributions of the interlinked revolute joints to both substructures. Meanwhile, the component gravities are treated as external payloads exerted at the end reference point of the PKM by means of the static equivalence principle. The approach is validated by comparing theoretical stiffness values with experimental values in the same configurations, which also indicates that an equivalent gravity wrench can describe the actual distributed gravities with acceptable accuracy. Finally, on the basis of the verified model, the stiffness distributions of the novel PKM are illustrated and the contributions of component gravities to its stiffness are discussed.
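The joint-to-operational-space mapping referenced above is, in its textbook form, the kinetostatic relation K_x = J⁻ᵀ K_q J⁻¹. A minimal sketch of that relation (illustrative only, not the paper's semi-analytical model; the gravity wrench folded in below is the equivalent-payload idea the abstract describes):

```python
import numpy as np

def operational_stiffness(K_joint, J):
    """Map a joint-space stiffness matrix to operational space.

    Standard kinetostatic mapping K_x = J^{-T} K_q J^{-1}, valid for a
    square, non-singular Jacobian J relating joint rates to end-effector
    twist.
    """
    J_inv = np.linalg.inv(J)
    return J_inv.T @ K_joint @ J_inv

def deflection_under_load(K_joint, J, wrench):
    """End-effector deflection d solving K_x d = f for an external wrench f.

    Component gravities can be included by adding their statically
    equivalent wrench at the end reference point to `wrench`.
    """
    return np.linalg.solve(operational_stiffness(K_joint, J), wrench)
```

For the identity Jacobian the operational stiffness equals the joint stiffness, a quick sanity check on the mapping's direction.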
Abstract:
The discovery and clinical application of molecular biomarkers in solid tumors increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires pathological review of haematoxylin & eosin (H&E) stained slides to ensure sample quality and tumor DNA sufficiency, by visually estimating the percentage of tumor nuclei and annotating the tumor for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage estimates between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation using TissueMark, strong concordance with manually drawn boundaries, and identical EGFR mutational status following manual macrodissection from the image-analysis-generated tumor boundaries. Automated cell counts for percentage tumor measurements by TissueMark showed reduced variability and significant correlation (p < 0.001) with benchmark tumor cell counts. This study demonstrates a robust image analysis technology that can facilitate the automated quantitative analysis of tissue samples for molecular profiling in discovery and diagnostics.
Abstract:
Learning or writing regular expressions to identify instances of a specific concept within text documents with high precision and recall is challenging. It is relatively easy to improve the precision of an initial regular expression by identifying the false positives it covers and tweaking the expression to avoid them. However, modifying the expression to improve recall is difficult, since false negatives can only be identified by manually analyzing all documents, in the absence of any tools to identify the missing instances. We focus on partially automating the discovery of missing instances by soliciting minimal user feedback. We present a technique to identify good generalizations of a regular expression that improve recall while retaining high precision. We empirically demonstrate the effectiveness of the proposed technique compared to existing methods and show results for a variety of tasks such as identification of dates, phone numbers, product names, and course numbers on real-world datasets.
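The precision/recall trade-off the abstract describes is easy to see with a small evaluation harness (illustrative only; the generalization-search technique itself is not reproduced, and the date patterns below are hypothetical examples):

```python
import re

def precision_recall(pattern, documents, true_spans):
    """Precision and recall of a regex against labelled documents.

    documents: list of strings; true_spans: list of sets of (start, end)
    character spans marking the true instances in each document.
    """
    tp = fp = fn = 0
    for doc, truth in zip(documents, true_spans):
        found = {m.span() for m in re.finditer(pattern, doc)}
        tp += len(found & truth)
        fp += len(found - truth)
        fn += len(truth - found)
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 1.0
    return precision, recall
```

For a document "due 01/02/2024 or 1/2/2024", the strict pattern `\d{2}/\d{2}/\d{4}` has perfect precision but misses the single-digit date (recall 0.5), while the generalization `\d{1,2}/\d{1,2}/\d{4}` recovers it without introducing false positives: exactly the kind of recall-improving, precision-preserving generalization the technique searches for.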
Abstract:
Background: Search filters are combinations of words and phrases designed to retrieve an optimal set of records on a particular topic (subject filters) or study design (methodological filters). Information specialists are increasingly turning to reusable filters to focus their searches. However, the extent of the academic literature on search filters is unknown. We provide a broad overview of the academic literature on search filters.
Objectives: To map the academic literature on search filters from 2004 to 2015 using a novel form of content analysis.
Methods: We conducted a comprehensive search for literature between 2004 and 2015 across eight databases using a subjectively derived search strategy. We identified key words from titles, grouped them into categories, and examined their frequency and co-occurrences.
Results: The majority of records were housed in Embase (n = 178) and MEDLINE (n = 154). Over the last decade, both databases appeared to exhibit a bimodal distribution with the number of publications on search filters rising until 2006, before dipping in 2007, and steadily increasing until 2012. Few articles appeared in social science databases over the same time frame (e.g. Social Services Abstracts, n = 3).
Unsurprisingly, the term 'search' appeared in most titles, and was often used as a noun adjunct for the words 'filter' and 'strategy'. Across the papers, the purpose of searches as a means of 'identifying' information and gathering 'evidence' from 'databases' emerged quite strongly. Other terms relating to the methodological assessment of search filters, such as precision and validation, also appeared, albeit less frequently.
Conclusions: Our findings show surprising commonality across the literature on search filters. Much of it is focused on developing search filters to identify and retrieve information, as opposed to testing or validating such filters. Furthermore, the literature is mostly housed in health-related databases, namely MEDLINE, CINAHL, and Embase, implying that it is medically driven. Relatively few papers focus on the use of search filters in the social sciences.
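The frequency and co-occurrence tabulation described in the Methods can be sketched as follows (a minimal illustration; the titles and controlled vocabulary below are hypothetical, not the study's data):

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(titles, vocabulary):
    """Frequency and pairwise co-occurrence of keywords across titles.

    titles: list of title strings; vocabulary: set of keywords of interest.
    Returns (keyword frequencies, co-occurrence counts keyed by frozenset
    pairs), counting each keyword at most once per title.
    """
    freq = Counter()
    pairs = Counter()
    for title in titles:
        words = {w for w in title.lower().split() if w in vocabulary}
        freq.update(words)
        pairs.update(frozenset(p) for p in combinations(sorted(words), 2))
    return freq, pairs
```

A real content analysis would additionally stem or lemmatise terms and group them into the categories the authors describe; the counting core is the same.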
Abstract:
Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2013
Abstract:
Master's dissertation, Qualidade em Análises, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
A multiresidue gas chromatographic method for the determination of six fungicides (captan, chlorthalonil, folpet, iprodione, procymidone and vinclozolin) and one acaricide (dicofol) in still and fortified wines was developed. Solid-phase microextraction (SPME) was chosen for the extraction of the compounds from the studied matrices, and tandem mass spectrometry (MS/MS) detection was used. The extraction consists of a solvent-free, automated procedure, and the detection is highly sensitive and selective. Good linearity was obtained, with correlation coefficients of regression (R²) > 0.99 for all the compounds. Satisfactory results for repeatability and intermediate precision were obtained for most of the analytes (RSD < 20%). Recoveries from spiked wine ranged from 80.1% to 112.0%. Limits of quantification (LOQs) were considerably below the proposed maximum residue limits (MRLs) for these compounds in grapes and below the suggested limits for wine (MRLs/10), with the exception of captan.
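The linearity (R² > 0.99) and repeatability (RSD < 20%) criteria reported above are standard method-validation computations; a minimal sketch of both (illustrative only, with hypothetical calibration and replicate data, not the study's results):

```python
import statistics

def calibration_r2(conc, resp):
    """R^2 of an unweighted linear calibration fit (response vs concentration)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, resp))
    ss_tot = sum((y - my) ** 2 for y in resp)
    return 1.0 - ss_res / ss_tot

def rsd_percent(replicates):
    """Relative standard deviation (%) of replicate measurements,
    the repeatability / intermediate-precision metric (acceptance: RSD < 20%)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

In practice the same fit also yields the LOQ, commonly estimated from the calibration residual standard deviation (e.g. 10·s/slope), which is how an LOQ can be compared against the MRL/10 thresholds mentioned above.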