38 results for Precision and recall
Abstract:
Increasing research has highlighted the effects of changing climates on the occurrence and prevalence of toxigenic Aspergillus species producing aflatoxins. There is concern about the toxicological effects on human health and animal productivity following acute and chronic exposure, which may affect the future ability to provide safe and sufficient food globally. Considerable research has focused on the detection of these toxins in agricultural products for human and animal consumption, based on the physicochemical and biochemical properties of the aflatoxin compounds. As improvements in food security continue, more regulations for acceptable levels of aflatoxins have arisen globally, the most stringent in Europe. These regulations are important for developing countries, where aflatoxin occurrence is high and significantly affects international trade and the economy. In developed countries, analytical approaches have become highly sophisticated, capable of attaining results with high precision and accuracy suitable for regulatory laboratories. Regrettably, many countries affected by aflatoxin contamination do not have the resources for high-tech HPLC and MS instrumentation and require more affordable, yet robust and equally accurate, alternatives that may be used by producers, processors and traders in emerging economies. It is especially important that companies wishing to exploit the opportunities offered by lucrative but highly regulated markets in the developed world have access to analytical methods that will ensure that their exports meet their customers' quality and safety requirements.
This work evaluates the ToxiMet system as an alternative to UPLC–MS/MS for the detection and determination of aflatoxins relative to current European regulatory standards. Four commodities were analysed for natural aflatoxin contamination: rice grain, cracked maize and maize flour, peanut paste, and dried distillers grains. For B1 and total aflatoxins, the qualitative correlation (above or below the regulatory limit) was good for all commodities, with the exception of B1 in the dried distillers grain samples, for which no calibration existed. For B1, the quantitative R² correlations were 0.92, 0.92, 0.88 (<250 μg/kg) and 0.70 for rice, maize, peanut and dried distillers grain samples respectively, whereas for total aflatoxins the quantitative correlations were 0.92, 0.94, 0.88 and 0.91. The ToxiMet system could be used as an alternative for aflatoxin analysis under current legislation, but some consideration should be given to aflatoxin M1 regulatory levels for these commodities, considering the high levels detected in this study, especially for maize and peanuts.
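To make the quantitative correlation figures above concrete, here is a minimal sketch of how an R² agreement between a screening system and a reference UPLC–MS/MS assay can be computed; the paired concentrations below are invented for illustration and are not data from the study.

```python
import numpy as np

# Invented paired aflatoxin B1 measurements (ug/kg) on the same samples:
# a reference UPLC-MS/MS assay versus an alternative screening system.
reference = np.array([2.1, 5.4, 12.8, 33.0, 78.5, 150.2])
candidate = np.array([2.5, 5.0, 11.9, 35.6, 74.1, 158.0])

# Least-squares fit of candidate against reference, then R-squared.
slope, intercept = np.polyfit(reference, candidate, 1)
predicted = slope * reference + intercept
ss_res = np.sum((candidate - predicted) ** 2)
ss_tot = np.sum((candidate - candidate.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"R^2 = {r_squared:.2f}")  # values near 1 indicate close agreement
```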
Abstract:
New radiocarbon calibration curves, IntCal04 and Marine04, have been constructed and internationally ratified to replace the terrestrial and marine components of IntCal98. The new calibration data sets extend an additional 2000 yr, from 0–26 cal kyr BP (Before Present, 0 cal BP = AD 1950), and provide much higher resolution, greater precision, and more detailed structure than IntCal98. For the Marine04 curve, dendrochronologically-dated tree-ring samples, converted with a box diffusion model to marine mixed-layer ages, cover the period from 0–10.5 cal kyr BP. Beyond 10.5 cal kyr BP, high-resolution marine data become available from foraminifera in varved sediments and U/Th-dated corals. The marine records are corrected with site-specific 14C reservoir age information to provide a single global marine mixed-layer calibration from 10.5–26.0 cal kyr BP. A substantial enhancement relative to IntCal98 is the introduction of a random walk model, which takes into account the uncertainty in both the calendar age and the 14C age to calculate the underlying calibration curve (Buck and Blackwell, this issue). The marine data sets and calibration curve for marine samples from the surface mixed layer (Marine04) are discussed here. The tree-ring data sets, sources of uncertainty, and regional offsets are presented in detail in a companion paper by Reimer et al. (this issue).
Abstract:
SuperWASP is an ultra-wide field (over 300 sq. degrees) photometric survey project designed to monitor stars between 7–15 mag to high precision and with high cadence over long (≥2 months) timescales. The primary science goal of this project is the detection of exoplanetary transits, as well as NEOs and optical transients. The resulting photometric catalogue will be made public via a web-based interface. The SuperWASP instrument consists of an array of cameras, each with a 7.8° × 7.8° field of view, guided by a robotic fork mount and sited in a fibreglass enclosure at the Observatorio del Roque de los Muchachos (ORM), La Palma, Canary Islands. In this progress report, we describe the specifications of the instrument, its semi-automated operation and pipeline data reduction.
Abstract:
The use of cortisol levels as a measure of stress is often complicated by the use of invasive techniques that may increase hypothalamic-pituitary-adrenal (HPA) axis activity during sample collection. The goal of this study was to collect samples noninvasively and validate an enzyme-immunoassay (EIA) for the measurement of cortisol in urine to quantify HPA axis activity in the bearded emperor tamarin (Saguinus imperator subgrisescens). Urine samples were collected from trained subjects between 0700 and 0730 hr during a 1-month period, and were pooled for immunological validation. We validated the assay immunologically by demonstrating specificity, accuracy, precision, and sensitivity. For biological validation of the assay, we showed that levels of urinary cortisol (in samples collected between 0700 and 1700 hr) varied significantly across the day. Cortisol concentration was lowest at 0700 hr, increased to a mid-morning peak (0900 hr), and declined across the remainder of the day in a typical mammalian circadian pattern. We thus demonstrate that urinary cortisol can be used to quantify HPA activity in S. i. subgrisescens. (C) 2004 Wiley-Liss, Inc.
Abstract:
A distributed optical fiber sensor based on Brillouin scattering (BOTDR or BOTDA) can measure and monitor the strain and temperature generated along an optical fiber. Because it can measure in real time with high precision and stability, it is well suited to health monitoring of large-scale civil infrastructure. However, the main challenge in applying it to structural health monitoring is to ensure that it is robust and repairable, which requires a suitable embedding method. In this paper, a novel method based on air-blowing and vacuum grouting techniques for embedding long-distance optical fiber sensors was developed. This method does not interfere with normal concrete construction during installation, and the long-distance embedded optical fiber sensor (LEOFS) can easily be replaced. Two stages of static loading tests were applied to investigate the performance of the LEOFS. The precision and repeatability of the LEOFS were studied through an overloading test. The durability and stability of the LEOFS were confirmed by a corrosion test. The strains measured by the LEOFS were used to evaluate the reinforcing effect of carbon fiber reinforced polymer and thereby the health state of the beams.
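As a hedged aside on how such a sensor yields strain: the Brillouin frequency shift varies approximately linearly with strain and temperature. The sketch below uses indicative coefficients for standard single-mode fibre at 1550 nm; the coefficient values and function name are assumptions for illustration, not taken from the paper.

```python
# Indicative Brillouin coefficients (assumed, not from the paper).
C_STRAIN = 0.05  # MHz per microstrain
C_TEMP = 1.0     # MHz per degree Celsius

def strain_from_shift(delta_nu_mhz: float, delta_temp_c: float = 0.0) -> float:
    """Strain (microstrain) from a measured Brillouin frequency shift
    (MHz), after removing the temperature contribution."""
    return (delta_nu_mhz - C_TEMP * delta_temp_c) / C_STRAIN

# A 25 MHz shift with 5 degrees C of warming leaves 20 MHz of strain signal.
print(strain_from_shift(25.0, delta_temp_c=5.0))  # -> 400.0 microstrain
```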
Abstract:
Norms constitute a powerful coordination mechanism among heterogeneous agents. In this paper, we propose a rule language to specify and explicitly manage the normative positions of agents (permissions, prohibitions and obligations), with which distinct deontic notions and their relationships can be captured. Our rule-based formalism includes constraints for greater expressiveness and precision, and makes it possible to supplement (and implement) electronic institutions with norms. We also show how some normative aspects are given a computational interpretation. © 2008 Springer Science+Business Media, LLC.
Abstract:
Introduction: Amplicon deep-sequencing using second-generation sequencing technology is an innovative molecular diagnostic technique that enables highly sensitive detection of mutations. As an international consortium, we previously investigated the robustness, precision, and reproducibility of 454 amplicon next-generation sequencing (NGS) across 10 laboratories from 8 countries (Leukemia, 2011;25:1840-8).
Aims: In Phase II of the study, we established distinct working groups for various hematological malignancies, i.e. acute myeloid leukemia (AML), acute lymphoblastic leukemia (ALL), chronic lymphocytic leukemia (CLL), chronic myelogenous leukemia (CML), myelodysplastic syndromes (MDS), myeloproliferative neoplasms (MPN), and multiple myeloma. Currently, 27 laboratories from 13 countries are part of this research consortium. In total, 74 gene targets were selected by the working groups and amplicons were developed for a NGS deep-sequencing assay (454 Life Sciences, Branford, CT). A data analysis pipeline was developed to standardize mutation interpretation both for accessing raw data (Roche Amplicon Variant Analyzer, 454 Life Sciences) and variant interpretation (Sequence Pilot, JSI Medical Systems, Kippenheim, Germany).
Results: We will report on the design, standardization, quality control aspects, landscape of mutations, as well as the prognostic and predictive utility of this assay in a cohort of 8,867 cases. Overall, 1,146 primer sequences were designed and tested. For example, in AML, 924 cases were screened for CEBPA mutations. RUNX1 mutations were analyzed in 1,888 cases, applying the deep-sequencing read counts to study the stability of such mutations at relapse and their utility as a biomarker to detect residual disease. Analyses of DNMT3A (n=1,041) focused on landscape investigations and on addressing prognostic relevance. Additionally, this working group is focusing on TET2, ASXL1, and TP53 analyses. A novel prognostic model is being developed to allow stratification of AML into prognostic subgroups based on molecular markers only. In ALL, 1,124 pediatric and adult cases have been screened, including 763 assays for TP53 mutations both at diagnosis and relapse of ALL. Pediatric and adult leukemia expert labs developed additional content to study the mutation incidence of other B and T lineage markers such as IKZF1, JAK2, IL7R, PAX5, EP300, LEF1, CRLF2, PHF6, WT1, JAK1, PTEN, AKT1, NOTCH1, CREBBP, or FBXW7. Further, the molecular landscape of CLL is changing rapidly. As such, a separate working group focused on analyses including NOTCH1, SF3B1, MYD88, XPO1, FBXW7 and BIRC3. To date, 922 cases have been screened to investigate the range of mutational burden of NOTCH1 mutations and their prognostic relevance. In MDS, RUNX1 mutation analyses were performed in 977 cases. The prognostic relevance of TP53 mutations in MDS was assessed in an additional 327 cases, including cases with isolated deletions of chromosome 5q. Next, content was developed targeting genes of the cellular splicing machinery, e.g. SF3B1, SRSF2, U2AF1, and ZRSR2. In BCR-ABL1-negative MPN, nine genes of interest (JAK2, MPL, TET2, CBL, KRAS, EZH2, IDH1, IDH2, ASXL1) were analyzed in a cohort of 155 primary myelofibrosis cases, searching for novel somatic mutations and addressing their relevance for disease progression and leukemic transformation. Moreover, an assay was developed and applied to CMML cases allowing the simultaneous analysis of 25 leukemia-associated target genes in a single sequencing run using just 20 ng of starting DNA. Finally, nine laboratories are studying CML, applying ultra-deep sequencing of the BCR-ABL1 tyrosine kinase domain. Analyses were performed on 615 cases, investigating the dynamics of expansion of mutated clones under various tyrosine kinase inhibitor therapies.
Conclusion: Molecular characterization of hematological malignancies today requires high diagnostic sensitivity and specificity. As part of the IRON-II study, a network of laboratories analyzed a variety of disease entities applying amplicon-based NGS assays. Importantly, the consortium not only standardized assay design for disease-specific panels, but also achieved consensus on a common data analysis pipeline for mutation interpretation. Distinct working groups have been forged to address scientific tasks, and in total 8,867 cases have been analyzed thus far.
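The basic quantity behind the deep-sequencing read-count analyses mentioned in the Results is the variant allele frequency. A minimal sketch with invented counts:

```python
# Invented amplicon read counts at one target position.
variant_reads = 37
total_reads = 2500  # deep coverage is what enables high sensitivity

vaf = variant_reads / total_reads
print(f"VAF = {vaf:.2%}")  # low-level clones become detectable at depth
```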
Abstract:
This paper describes the deployment on GPUs of PROP, a program of the 2DRMP suite which models electron collisions with H-like atoms and ions. Because performance on GPUs is better in single precision than in double precision, the numerical stability of the PROP program in single precision has been studied. The numerical quality of PROP results computed in single precision and their impact on the next program of the 2DRMP suite have been analyzed. Successive versions of the PROP program on GPUs have been developed in order to improve its performance. Particular attention has been paid to the optimization of data transfers and of linear algebra operations. Performance results obtained on several architectures (including NVIDIA Fermi) are presented.
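The single- versus double-precision concern can be illustrated with a generic toy example (this is not the PROP code): a naive float32 running sum loses accuracy once the accumulator dwarfs the individual terms.

```python
import numpy as np

rng = np.random.default_rng(0)
terms = rng.random(1_000_000).astype(np.float32)

naive32 = np.float32(0.0)
for t in terms:  # sequential float32 accumulation
    naive32 += t

exact = terms.astype(np.float64).sum()  # float64 reference
print(naive32, exact, abs(naive32 - exact))  # the float32 total drifts
```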
Abstract:
A parallel robot (PR) is a mechanical system that utilizes multiple computer-controlled limbs to support one common platform or end effector. Compared to a serial robot, a PR generally has higher precision and dynamic performance and can therefore be applied in many applications. PR research has attracted a great deal of attention in the last three decades, but many challenging issues must still be solved before PRs achieve their full potential. This chapter introduces state-of-the-art PRs in terms of synthesis, design, analysis, and control. Future directions are also discussed at the end.
Abstract:
Background
Diabetic macular oedema (DMO) is a thickening of the central retina, or the macula, and is associated with long-term visual loss in people with diabetic retinopathy (DR). Clinically significant macular oedema (CSMO) is the most severe form of DMO. Almost 30 years ago, the Early Treatment Diabetic Retinopathy Study (ETDRS) found that CSMO, diagnosed by means of stereoscopic fundus photography, leads to moderate visual loss in one of four people within three years. It also showed that grid or focal laser photocoagulation to the macula halves this risk. Recently, intravitreal injection of antiangiogenic drugs has also been used to try to improve vision in people with macular oedema due to DR. Optical coherence tomography (OCT) is based on optical reflectivity and is able to image retinal thickness and structure, producing cross-sectional and three-dimensional images of the central retina. It is widely used because it provides an objective and quantitative assessment of macular oedema, unlike the subjectivity of fundus biomicroscopic assessment, which is routinely used by ophthalmologists instead of photography. Optical coherence tomography is also used for quantitative follow-up of the effects of treatment of CSMO.
Objectives
To determine the diagnostic accuracy of OCT for detecting DMO and CSMO, defined according to ETDRS in 1985, in patients referred to ophthalmologists after DR is detected. In the update of this review we also aimed to assess whether OCT might be considered the new reference standard for detecting DMO.
Search methods
We searched the Cochrane Database of Systematic Reviews (CDSR), the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA) and the NHS Economic Evaluation Database (NHSEED) (The Cochrane Library 2013, Issue 5), Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid MEDLINE Daily, Ovid OLDMEDLINE (January 1946 to June 2013), EMBASE (January 1950 to June 2013), Web of Science Conference Proceedings Citation Index - Science (CPCI-S) (January 1990 to June 2013), BIOSIS Previews (January 1969 to June 2013), MEDION and the Aggressive Research Intelligence Facility database (ARIF). We did not use any date or language restrictions in the electronic searches for trials. We last searched the electronic databases on 25 June 2013. We checked bibliographies of relevant studies for additional references.
Selection criteria
We selected studies that assessed the diagnostic accuracy of any OCT model for detecting DMO or CSMO in patients with DR who were referred to eye clinics. Diabetic macular oedema and CSMO were diagnosed by means of fundus biomicroscopy by ophthalmologists or stereophotography by ophthalmologists or other trained personnel.
Data collection and analysis
Three authors independently extracted data on study characteristics and measures of accuracy. We assessed data using random-effects hierarchical sROC meta-analysis models.
Main results
We included 10 studies (830 participants, 1387 eyes), published between 1998 and 2012. Prevalence of CSMO was 19% to 65% (median 50%) in nine studies with CSMO as the target condition. Study quality was often unclear or at high risk of bias for QUADAS-2 items, specifically regarding study population selection and the exclusion of participants with poor quality images. Applicability was unclear in all studies since professionals referring patients and results of prior testing were not reported. There was a specific 'unit of analysis' issue because both eyes of the majority of participants were included in the analyses as if they were independent. In nine studies providing data on CSMO (759 participants, 1303 eyes), pooled sensitivity was 0.78 (95% confidence interval (CI) 0.72 to 0.83) and specificity was 0.86 (95% CI 0.76 to 0.93). The median central retinal thickness cut-off we selected for data extraction was 250 µm (range 230 µm to 300 µm). Central CSMO was the target condition in all but two studies and thus our results cannot be applied to non-central CSMO. Data from three studies reporting accuracy for detection of DMO (180 participants, 343 eyes) were not pooled. Sensitivities and specificities were about 0.80 in two studies and were both 1.00 in the third study. Since this review was conceived, the role of OCT has changed and it has become a key ingredient of decision-making at all levels of ophthalmic care in this field. Moreover, disagreements between OCT and fundus examination are informative, especially false positives, which are referred to as subclinical DMO and are at higher risk of developing clinical CSMO.
Authors' conclusions
Using retinal thickness thresholds lower than 300 µm and the ophthalmologist's fundus assessment as the reference standard, central retinal thickness measured with OCT was not sufficiently accurate to diagnose the central type of CSMO in patients with DR referred to retina clinics. However, OCT false positives are generally cases of subclinical DMO that cannot be detected clinically but still carry an increased risk of disease progression. Therefore, the increasing availability of OCT devices, together with their precision and their ability to inform on retinal layer structure, now makes OCT widely recognised as the new reference standard for assessment of DMO, even in some screening settings. Thus, this review will not be updated further.
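For readers arriving at this abstract via precision/recall terminology: sensitivity is recall, and both it and specificity come from a 2x2 table of index-test results against the reference standard. A minimal sketch with invented counts (chosen to echo the pooled estimates above):

```python
# Invented 2x2 table: OCT (index test) vs clinical reference for CSMO.
tp, fp = 78, 14  # OCT-positive eyes: truly diseased / truly healthy
fn, tn = 22, 86  # OCT-negative eyes: truly diseased / truly healthy

sensitivity = tp / (tp + fn)  # recall: diseased eyes flagged by OCT
specificity = tn / (tn + fp)  # healthy eyes correctly cleared
precision = tp / (tp + fp)    # positive predictive value
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"precision={precision:.2f}")
```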
Abstract:
Background: Search filters are combinations of words and phrases designed to retrieve an optimal set of records on a particular topic (subject filters) or study design (methodological filters). Information specialists are increasingly turning to reusable filters to focus their searches. However, the extent of the academic literature on search filters is unknown. We provide a broad overview of the academic literature on search filters.
Objectives: To map the academic literature on search filters from 2004 to 2015 using a novel form of content analysis.
Methods: We conducted a comprehensive search for literature between 2004 and 2015 across eight databases using a subjectively derived search strategy. We identified key words from titles, grouped them into categories, and examined their frequency and co-occurrences.
Results: The majority of records were housed in Embase (n = 178) and MEDLINE (n = 154). Over the last decade, both databases appeared to exhibit a bimodal distribution with the number of publications on search filters rising until 2006, before dipping in 2007, and steadily increasing until 2012. Few articles appeared in social science databases over the same time frame (e.g. Social Services Abstracts, n = 3).
Unsurprisingly, the term 'search' appeared in most titles and, quite often, was used as a noun adjunct for the words 'filter' and 'strategy'. Across the papers, the purpose of searches as a means of 'identifying' information and gathering 'evidence' from 'databases' emerged quite strongly. Other terms relating to the methodological assessment of search filters, such as precision and validation, also appeared, albeit less frequently.
Conclusions: Our findings show surprising commonality across the literature on search filters. Much of the literature seems to be focused on developing search filters to identify and retrieve information, as opposed to testing or validating such filters. Furthermore, the literature is mostly housed in health-related databases, namely MEDLINE, CINAHL, and Embase, implying that it is medically driven. Relatively few papers focus on the use of search filters in the social sciences.
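The content analysis described under Methods (extracting title keywords, then counting frequencies and co-occurrences) can be sketched as follows; the titles and stopword list are invented placeholders, not the study's data.

```python
from collections import Counter
from itertools import combinations

# Invented record titles standing in for database search results.
titles = [
    "Developing a search filter for adverse effects",
    "Validation of a methodological search filter in MEDLINE",
    "A precision-maximising search strategy for diagnostic studies",
]

stopwords = {"a", "an", "the", "for", "of", "in"}
keyword_freq = Counter()
cooccurrence = Counter()

for title in titles:
    words = [w.strip(".,").lower() for w in title.split()]
    keywords = sorted({w for w in words if w not in stopwords})
    keyword_freq.update(keywords)
    cooccurrence.update(combinations(keywords, 2))  # unordered pairs

print(keyword_freq.most_common(5))
print(cooccurrence.most_common(5))
```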
Abstract:
In this paper, we describe how the pathfinder algorithm converts relatedness ratings of concept pairs into concept maps; we also present how this algorithm has been used to develop the Concept Maps for Learning website (www.conceptmapsforlearning.com) based on the principles of effective formative assessment. Pathfinder networks, one type of network representation tool, are claimed to help students memorize and recall the relations between concepts better than spatial representation tools (such as Multi-Dimensional Scaling) do. Therefore, pathfinder networks have been used in various studies on knowledge structures, including studies identifying students' misconceptions. To accomplish this, each student's knowledge map and the expert knowledge map are compared via the pathfinder software, and the differences between these maps are highlighted. Once misconceptions are identified, however, the pathfinder software does not provide any feedback on them. To overcome this weakness, we have been developing a mobile-based concept mapping tool providing visual, textual and remedial feedback (e.g. videos, website links and applets) on the concept relations. This information is placed on the expert concept map, but not on the student's concept map. Additionally, students are asked to note what they understand from the given feedback, and are given the opportunity to revise their knowledge maps after receiving the various types of feedback.
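The abstract does not spell out the pruning rule itself, so here is a minimal sketch of the widely used special case PFNET(r = infinity, q = n - 1): a link between two concepts survives only if no indirect path has a smaller maximum edge weight. The matrix below is an invented example.

```python
import numpy as np

def pathfinder_network(dissim):
    """PFNET(r=inf, q=n-1) sketch: prune a symmetric dissimilarity
    matrix (lower = more related) to a concept-map adjacency matrix
    using a Floyd-Warshall style min-max pass."""
    n = dissim.shape[0]
    d = dissim.astype(float).copy()
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i, j] = min(d[i, j], max(d[i, k], d[k, j]))
    # An edge survives only if its direct weight matches the best path.
    return (dissim <= d) & ~np.eye(n, dtype=bool)

# Invented dissimilarities: the direct 0-2 link (weight 4) is pruned
# because the path 0-1-2 has a maximum edge weight of only 2.
ratings = np.array([[0., 1., 4.],
                    [1., 0., 2.],
                    [4., 2., 0.]])
print(pathfinder_network(ratings))
```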
Abstract:
Introduction: Anterior and posterior segment eye diseases are highly challenging to treat, due to the barrier properties and relative inaccessibility of the ocular tissues. Topical eye drops and systemically delivered treatments result in low bioavailability. Alternatively, direct injection of medication into the ocular tissues is clinically employed to overcome the barrier properties, but injections cause significant tissue damage and are associated with a number of untoward side effects and poor patient compliance. Microneedles (MNs) have recently been introduced as a minimally invasive means of localizing drug formulations within the target ocular tissues with greater precision and accuracy than hypodermic needles. Areas covered: This review article seeks to provide an overview of the range of challenges that are often faced in achieving efficient ocular drug levels within targeted tissue(s) of the eye. It also describes the problems encountered using conventional hypodermic needle-based ocular injections for anterior and posterior segment drug delivery, and discusses research carried out in the field of MNs to date.
Expert opinion: MNs can aid in the localization of drug delivery systems within the selected ocular tissue and hold the potential to revolutionize the way drug formulations are administered to the eye. However, the current limitations and challenges of MN application warrant further research in this field to enable widespread clinical application.
Abstract:
The University of Waikato, Hamilton, New Zealand and The Queen's University of Belfast, Northern Ireland radiocarbon dating laboratories have undertaken a series of high-precision measurements on decadal samples of dendrochronologically dated oak (Quercus petraea) from Great Britain and cedar (Libocedrus bidwillii) and silver pine (Lagarostrobos colensoi) from New Zealand. The results show an average hemispheric offset of 40±13 yr over the 900 yr of measurement. This value is not constant but varies with a periodicity of about 130 yr. The Northern Hemisphere measurements confirm the validity of the Pearson et al. (1986) calibration dataset.
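The hemispheric offset quoted above is an average over many decadal measurements. One hedged way such a combined value and its uncertainty could be formed is an inverse-variance weighted mean, sketched here with invented offsets; the paper's actual averaging method is not stated in the abstract.

```python
import numpy as np

# Invented decadal offsets (NH minus SH 14C age, yr) with 1-sigma errors.
offsets = np.array([35.0, 52.0, 28.0, 44.0])
sigmas = np.array([15.0, 18.0, 12.0, 16.0])

weights = 1.0 / sigmas**2
mean = np.sum(weights * offsets) / np.sum(weights)
err = np.sqrt(1.0 / np.sum(weights))
print(f"weighted mean offset = {mean:.0f} +/- {err:.0f} yr")
```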
Abstract:
The chronologies of five northern European ombrotrophic peat bogs subjected to a large AMS 14C dating effort (32-44 dates/site) are presented here. The results of Bayesian calibration (BCal) of dates with a prior assumption of chronological ordering were compared with a Bayesian wiggle-match approach (Bpeat), which assumes constant linear accumulation over sections of the peat profile. Interpolation of BCal age estimates of dense sequences of 14C dates showed variable patterns of peat accumulation with time, with changes in accumulation occurring at intervals ranging from 20 to 50 cm. Within these intervals, peat accumulation appeared to be relatively linear. Close analysis suggests that some of the inferred variations in accumulation rate were related to the plant macrofossil composition of the peat. The wiggle-matched age-depth models had relatively high chronological uncertainty within intervals of closely spaced 14C dates, suggesting that the premise of constant linear accumulation over large sections of the peat profile is unrealistic. Age models based on the assumption of linear accumulation over large parts of a peat core (and therefore only effective over millennial timescales) are not compatible with studies examining environmental change during the Holocene, where variability often occurs at decadal to centennial timescales. Ideally, future wiggle-match age models should be constrained, with boundaries between sections based on the plant macrofossil composition of the peat and on physical-chemical parameters such as the degree of decomposition. Strategies for the selection of material for dating should be designed so that there are enough 14C dates to accurately reconstruct the peat accumulation rate of each homogeneous stratigraphic unit. (c) 2006 Elsevier Ltd. All rights reserved.
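A minimal sketch of the interpolation idea behind the variable accumulation patterns described above: piecewise-linear accumulation rates between dated levels. The depths and calibrated ages are invented for illustration.

```python
import numpy as np

depths = np.array([10.0, 50.0, 90.0, 130.0])      # cm below surface (invented)
ages = np.array([450.0, 1200.0, 2600.0, 3100.0])  # cal yr BP (invented)

# Accumulation rate of each section of a piecewise-linear age-depth model.
rates = np.diff(depths) / np.diff(ages)  # cm per calendar year
print(rates)  # unequal rates indicate non-constant accumulation
```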