838 results for "Precision and recall"


Relevance: 100.00%

Abstract:

Appearance-based localization is increasingly used for loop closure detection in metric SLAM systems. Since it relies only upon the appearance-based similarity between images from two locations, it can perform loop closure regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale linearly not only with the size of the environment but also with the operation time of the platform. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. We present a set of improvements to the appearance-based SLAM algorithm CAT-SLAM to constrain computation scaling and memory usage with minimal degradation in performance over time. The appearance-based comparison stage is accelerated by exploiting properties of the particle observation update, and nodes in the continuous trajectory map are removed according to minimal information loss criteria. We demonstrate constant time and space loop closure detection in a large urban environment with recall performance exceeding FAB-MAP by a factor of 3 at 100% precision, and investigate the minimum computational and memory requirements for maintaining mapping performance.
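Detectors of this kind are conventionally scored by recall at 100% precision: the detection threshold is raised until no false positives remain, and recall is reported at that operating point. A minimal sketch of that computation, using hypothetical similarity scores and ground-truth labels rather than data from the paper:

    # Sketch: recall at 100% precision for loop-closure detection.
    # `scores` and `labels` are hypothetical: one similarity score and one
    # ground-truth flag (True = genuine loop closure) per candidate match.
    def recall_at_full_precision(scores, labels):
        total_true = sum(labels)
        best_recall = 0.0
        for threshold in sorted(set(scores)):
            detected = [s >= threshold for s in scores]
            tp = sum(1 for d, l in zip(detected, labels) if d and l)
            fp = sum(1 for d, l in zip(detected, labels) if d and not l)
            if fp == 0 and total_true > 0:  # operating point with 100% precision
                best_recall = max(best_recall, tp / total_true)
        return best_recall

    print(recall_at_full_precision([0.9, 0.8, 0.6, 0.4], [True, True, False, True]))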

Relevance: 100.00%

Abstract:

iTRAQ (isobaric tags for relative or absolute quantitation) is a mass spectrometry technology that allows quantitative comparison of protein abundance by measuring peak intensities of reporter ions released from iTRAQ-tagged peptides by fragmentation during MS/MS. However, current data analysis techniques for iTRAQ struggle to report reliable relative protein abundance estimates and suffer from problems of precision and accuracy. The precision of the data is affected by variance heterogeneity: low-signal data have higher relative variability; however, low-abundance peptides dominate data sets. Accuracy is compromised as ratios are compressed toward 1, leading to underestimation of the ratio. This study investigated both issues and proposed a methodology that combines the peptide measurements to give a robust protein estimate even when the data for the protein are sparse or at low intensity. Our data indicated that ratio compression arises from contamination during precursor ion selection, which occurs at a consistent proportion within an experiment and thus results in a linear relationship between expected and observed ratios. We proposed that a correction factor can be calculated from spiked proteins at known ratios. Then we demonstrated that variance heterogeneity is present in iTRAQ data sets irrespective of the analytical packages, LC-MS/MS instrumentation, and iTRAQ labeling kit (4-plex or 8-plex) used. We proposed using an additive-multiplicative error model for peak intensities in MS/MS quantitation and demonstrated that a variance-stabilizing normalization is able to address the error structure and stabilize the variance across the entire intensity range. The resulting uniform variance structure simplifies the downstream analysis. Heterogeneity of variance consistent with an additive-multiplicative model has been reported in other MS-based quantitation including fields outside of proteomics; consequently the variance-stabilizing normalization methodology has the potential to increase the capabilities of MS in quantitation across diverse areas of biology and chemistry.
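The additive-multiplicative error structure and the effect of a variance-stabilizing transform are easy to reproduce on simulated intensities. A sketch under assumed parameters (the noise levels and the transform constant c below are illustrative choices, not values fitted in the study):

    import numpy as np

    # Additive-multiplicative error model for peak intensities:
    #   y = mu * exp(eta) + eps, with eta ~ N(0, 0.1^2) (multiplicative)
    #   and eps ~ N(0, 20^2) (additive); both noise levels are assumed.
    rng = np.random.default_rng(0)
    mu = np.logspace(1, 5, 9)  # true intensities from low to high
    y = mu * np.exp(rng.normal(0.0, 0.1, (2000, 9))) + rng.normal(0.0, 20.0, (2000, 9))

    # Generalized-log (arcsinh) transform; c is a hypothetical constant that
    # would normally be estimated from the data.
    c = 200.0
    glog = np.arcsinh(y / c)

    print(np.round(y.std(axis=0), 1))     # raw SD grows with intensity
    print(np.round(glog.std(axis=0), 3))  # roughly constant after the transform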

Relevance: 100.00%

Abstract:

An effective technique is proposed to improve the precision and throughput of energetic ion condensation through dielectric nanoporous templates and to reduce nanopore clogging by using a finely tuned pulsed bias. Multiscale numerical simulations of ion deposition show the possibility of controlling the dynamic charge balance on the upper template surface to minimize ion deposition on nanopore sidewalls and to deposit ions selectively on the substrate surface in contact with the pore opening. In this way, the shapes of nanodots in template-assisted nanoarray fabrication can be effectively controlled. The results are applicable to various processes involving porous dielectric nanomaterials and dense nanoarrays.

Relevance: 100.00%

Abstract:

This paper describes the design of a self-organizing, hierarchical neural network model of unsupervised serial learning. The model learns to recognize, store, and recall sequences of unitized patterns, using either short-term memory (STM) or both STM and long-term memory (LTM) mechanisms. Timing information is learned and recall (both from STM and from LTM) is performed with a learned rhythmical structure. The network, bearing similarities with ART (Carpenter & Grossberg 1987a), learns to map temporal sequences to unitized patterns, which makes it suitable for hierarchical operation. It is therefore capable of self-organizing codes for sequences of sequences. The capacity is limited only by the number of nodes provided. Selected simulation results are reported to illustrate system properties.

Relevance: 100.00%

Abstract:

Background: The COMET (Core Outcome Measures in Effectiveness Trials) Initiative is developing a publicly accessible online resource to collate the knowledge base for core outcome set (COS) development and the applied work from different health conditions. Ensuring that the database is as comprehensive as possible and keeping it up to date are key to its value for users. This requires the development and application of an optimal, multi-faceted search strategy to identify relevant material. This paper describes the challenges of designing and implementing such a search, outlining the development of the search strategy for studies of COS development, and, in turn, the process for establishing a database of COS.

Methods: We investigated the performance characteristics of this strategy, including sensitivity, precision and numbers needed to read. We compared the contribution of each database towards identifying included studies, in order to determine the best combination of methods for retrieving all included studies.

Results: Recall of the search strategies ranged from 4% to 87%, and precision from 0.77% to 1.13%. MEDLINE performed best in terms of recall, retrieving 216 (87%) of the 250 included records, followed by Scopus (44%). The Cochrane Methodology Register found just 4% of the included records. MEDLINE was also the database with the highest precision. The number needed to read varied between 89 (MEDLINE) and 130 (Scopus).
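These figures are internally consistent: recall is the fraction of included records retrieved, precision is the fraction of retrieved records that were included, and the number needed to read (NNR) is the reciprocal of precision. A quick check against the numbers reported above (small differences come from rounding of the reported precision values):

    # recall = included retrieved / all included
    # precision = included retrieved / all retrieved
    # NNR = 1 / precision
    medline_recall = 216 / 250   # 0.864, reported as 87%
    medline_nnr = 1 / 0.0113     # precision 1.13% -> about 89 records per relevant hit
    scopus_nnr = 1 / 0.0077      # precision 0.77% -> about 130 records per relevant hit
    print(round(medline_recall, 2), round(medline_nnr), round(scopus_nnr))  # 0.86 88 130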

Conclusions: We found that two databases and hand searching were required to locate all of the studies in this review. MEDLINE alone retrieved 87% of the included studies, but in fact 97% of the included studies were indexed on MEDLINE. The Cochrane Methodology Register did not contribute any records that were not found in the other databases, and will not be included in our future searches to identify studies developing COS. Scopus had the lowest precision (0.77%) and the highest number needed to read (130). In future COMET searches for COS, a balance needs to be struck between the work involved in screening large numbers of records, the frequency of the searching and the likelihood that eligible studies will be identified by means other than the database searches.

Relevance: 100.00%

Abstract:

Cancer clinical trials have been one of the key foundations for significant advances in oncology. However, there is a clear recognition within the academic, care delivery and pharmaceutical/biotech communities that our current model of clinical trial discovery and development is no longer fit for purpose. Delivering transformative cancer care should increasingly be our mantra, rather than maintaining the status quo of, at best, the often minuscule incremental benefits that are observed with many current clinical trials. As we enter the era of precision medicine for personalised cancer care (precision and personalised medicine), it is important that we capture and utilise our greater understanding of the biology of disease to drive innovative approaches in clinical trial design and implementation that can lead to a step change in cancer care delivery. A number of advances have been practice changing (e.g. imatinib mesylate in chronic myeloid leukaemia, Herceptin in erb-B2-positive breast cancer), and increasingly we are seeing the promise of a number of newer approaches, particularly in diseases like lung cancer and melanoma. Targeting immune checkpoints has recently yielded some highly promising results. New algorithms that maximise the effectiveness of clinical trials, through, for example, a multi-stage, multi-arm design, are increasingly gaining traction. However, our enthusiasm for the undoubted advances that have been achieved is being tempered by a realisation that these new approaches may have significant cost implications. This article will address these competing issues, mainly from a European perspective, highlight the problems and challenges to healthcare systems and suggest potential solutions that will ensure that the cost/value rubicon is addressed in a way that allows stakeholders to work together to deliver optimal cost-effective cancer care, the benefits of which can be transferred directly to our patients.

Relevance: 100.00%

Abstract:

Most metabolic functions are optimized within a narrow range of body temperatures, which is why thermoregulation is of great importance for the survival and overall fitness of an animal. It has been proposed that lizards will thermoregulate less precisely in low thermal quality environments, where the costs associated with thermoregulation are high; in the case of lizards, whose thermoregulation is mainly behavioural, the primary costs of thermoregulation are those derived from locomotion. Decreasing thermoregulatory precision in costly situations is a strategy that enhances fitness by allowing lizards to be more flexible to changing environmental conditions. It allows animals to maximize the benefits of maintaining a relatively high body temperature while minimizing energy expenditure. In situations where oxygen concentration is low, the costs of thermoregulation are relatively high (i.e. in relation to the amount of oxygen available for metabolic functions). As a result, it is likely that exposure to hypoxic conditions induces a decrease in the precision of thermoregulation. This study evaluated the effects of hypoxia and low environmental thermal quality, two energetically costly conditions, on the precision and level of thermoregulation in the bearded dragon, Pogona vitticeps, in an electronic temperature-choice shuttle box. Four levels of hypoxia (10, 7, 5 and 4% O2) were tested. Environmental thermal quality was manipulated by varying the rate of temperature change (δTa) in the shuttle box. Higher δTa's translate into more thermally challenging environments, since under these conditions the animals are forced to move a greater number of times (and hence invest more energy in locomotion) to maintain temperatures similar to those at lower δTa's. In addition, lizards were tested in an "extreme temperatures" treatment during which air temperatures of the hot and cold compartments of the shuttle box were maintained at a constant 50 and 15 °C respectively. This was considered the most thermally challenging environment. The selected ambient (Ta) and internal body temperatures (Tb) of bearded dragons, as well as the thermoregulatory precision (measured by the central 68% of the Ta and Tb distribution), were evaluated. The thermoregulatory response was similar to both conditions. A significant increase in the size of the Tb range, reflecting a decrease in thermoregulatory precision, and a drop in preferred body temperature of ~2 °C were observed at both 4% oxygen and in the environment of lowest thermal quality. The present study suggests that in energetically costly situations, such as the ones tested in this study, the bearded dragon reduces energy expenditure by decreasing preferred body temperature and minimizing locomotion, at the expense of precise behavioural thermoregulation. The close similarity of the behavioural thermoregulatory response to two very different stimuli suggests a possible common mechanism and neuronal pathway for the thermoregulatory response.

Relevance: 100.00%

Abstract:

As part of the European Commission (EC)'s revision of the Sewage Sludge Directive and the development of a Biowaste Directive, there was recognition of the difficulty of comparing data from Member States (MSs) because of differences in sampling and analytical procedures. The 'HORIZONTAL' initiative, funded by the EC and MSs, seeks to address these differences in approach and to produce standardised procedures in the form of CEN standards. This article is a preliminary investigation into aspects of the sampling of biosolids, composts and soils with a history of biosolid application. The article provides information on the measurement uncertainty associated with sampling from heaps, large bags, pipes and soils in the landscape under a limited set of conditions, using sampling approaches in space and time and sample numbers based on procedures widely used in the relevant industries when sampling similar materials. These preliminary results suggest that considerably more information is required before the appropriate sample design, optimum number of samples, number of samples comprising a composite, and temporal and spatial frequency of sampling can be recommended to achieve consistent results with a high level of precision and confidence.
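One way such uncertainty estimates feed back into sampling design is through sample-size calculations: once the relative variability of single increments is known, the number of increments needed for a target precision follows from the standard error of the mean. A minimal sketch, with entirely hypothetical figures:

    import math

    # Number of increments n so that the mean achieves a target relative
    # standard error, assuming independent, identically distributed
    # increments; rel_sd and target_rel_se below are hypothetical.
    def samples_needed(rel_sd, target_rel_se, z=1.96):
        # standard error of the mean = rel_sd / sqrt(n); solve for n,
        # with a z-multiplier for the chosen confidence level
        return math.ceil((z * rel_sd / target_rel_se) ** 2)

    print(samples_needed(rel_sd=0.30, target_rel_se=0.10))  # 35 increments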

Relevance: 100.00%

Abstract:

Compared with younger adults, older adults have a relative preference to attend to and remember positive over negative information. This is known as the “positivity effect,” and researchers have typically evoked socioemotional selectivity theory to explain it. According to socioemotional selectivity theory, as people get older they begin to perceive their time left in life as more limited. These reduced time horizons prompt older adults to prioritize achieving emotional gratification and thus exhibit increased positivity in attention and recall. Although this is the most commonly cited explanation of the positivity effect, there is currently a lack of clear experimental evidence demonstrating a link between time horizons and positivity. The goal of the current research was to address this issue. In two separate experiments, we asked participants to complete a writing activity, which directed them to think of time as being either limited or expansive (Experiments 1 and 2) or did not orient them to think about time in a particular manner (Experiment 2). Participants were then shown a series of emotional pictures, which they subsequently tried to recall. Results from both studies showed that regardless of chronological age, thinking about a limited future enhanced the relative positivity of participants’ recall. Furthermore, the results of Experiment 2 showed that this effect was not driven by changes in mood. Thus, the fact that older adults’ recall is typically more positive than younger adults’ recall may index naturally shifting time horizons and goals with age.

Relevance: 100.00%

Abstract:

Sequential studies of osteopenic bone disease in small animals require the availability of non-invasive, accurate and precise methods to assess bone mineral content (BMC) and bone mineral density (BMD). Dual-energy X-ray absorptiometry (DXA), which is currently used in humans for this purpose, can also be applied to small animals by means of adapted software. The precision and accuracy of DXA were evaluated in 10 rats weighing 50-265 g. The rats were anesthetized with a mixture of ketamine-xylazine administered intraperitoneally. Each rat was scanned six times consecutively in the antero-posterior incidence after repositioning, using the rat whole-body software for determination of whole-body BMC and BMD (Hologic QDR 1000, software version 5.52). Scan duration was 10-20 min depending on rat size. After the last measurement, rats were sacrificed and soft tissues were removed by dermestid beetles. Skeletons were then scanned in vitro (ultra-high-resolution software, version 4.47). Bones were subsequently ashed and dissolved in hydrochloric acid and total body calcium directly assayed by atomic absorption spectrophotometry (TBCa[chem]). Total body calcium was also calculated from the DXA whole-body in vivo measurement (TBCa[DXA]) and from the ultra-high-resolution measurement (TBCa[UH]) under the assumption that calcium accounts for 40.5% of the BMC expressed as hydroxyapatite. The precision error for whole-body BMC and BMD (mean ± S.D.) was 1.3% and 1.5%, respectively. Simple regression analysis between TBCa[DXA] or TBCa[UH] and TBCa[chem] revealed tight correlations (r = 0.991 and 0.996, respectively), with slopes and intercepts significantly different from 1 and 0, respectively. (ABSTRACT TRUNCATED AT 250 WORDS)
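The precision error quoted for repeated scans is conventionally the coefficient of variation (%CV) across the repeat measurements. A minimal sketch with hypothetical BMD values for six consecutive scans of one rat (not the study's data):

    import statistics

    # Precision error as %CV of repeated scans; the six whole-body BMD
    # values below are hypothetical.
    bmd = [0.1482, 0.1495, 0.1478, 0.1502, 0.1469, 0.1490]  # g/cm^2
    cv_percent = 100 * statistics.stdev(bmd) / statistics.mean(bmd)
    print(f"precision error: {cv_percent:.1f}%")  # about 0.8% for these values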

Relevance: 100.00%

Abstract:

Firn and polar ice cores offer the only direct palaeoatmospheric archive. Analyses of past greenhouse gas concentrations and their isotopic compositions in air bubbles in the ice can help to constrain changes in global biogeochemical cycles in the past. For the analysis of the hydrogen isotopic composition of methane (δD(CH4) or δ2H(CH4)), 0.5 to 1.5 kg of ice was hitherto used. Here we present a method to improve precision and reduce the sample amount for δD(CH4) measurements in (ice core) air. Pre-concentrated methane is focused in front of a high-temperature oven (pre-pyrolysis trapping), and the molecular hydrogen formed by pyrolysis is trapped afterwards (post-pyrolysis trapping), both on a carbon-PLOT capillary at −196 °C. Argon, oxygen, nitrogen, carbon monoxide, unpyrolysed methane and krypton are trapped together with H2 and must be separated using a second short, cooled chromatographic column to ensure accurate results. Pre- and post-pyrolysis trapping largely removes the isotopic fractionation induced during chromatographic separation and results in a narrow peak in the mass spectrometer. Air standards can be measured with a precision better than 1‰. For polar ice samples from glacial periods, we estimate a precision of 2.3‰ for 350 g of ice (or roughly 30 mL of air at standard temperature and pressure, STP) with 350 ppb of methane. This corresponds to about 6 mL (STP) of recent tropospheric air (about 1900 ppb CH4), or about 500 pmol of pure CH4.
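The quoted sample amounts can be cross-checked with the molar volume of an ideal gas at STP: the amount of CH4 is the air volume divided by the molar volume, times the mixing ratio. A quick verification of the figures above:

    # Cross-check of the quoted CH4 amounts (molar volume 22414 mL/mol at STP).
    MOLAR_VOLUME_STP = 22414.0  # mL/mol

    def ch4_pmol(air_ml, ch4_ppb):
        # moles of air, times the CH4 mole fraction, expressed in pmol
        return air_ml / MOLAR_VOLUME_STP * ch4_ppb * 1e-9 * 1e12

    print(round(ch4_pmol(30, 350)))   # ~470 pmol in 30 mL of glacial air
    print(round(ch4_pmol(6, 1900)))   # ~510 pmol in 6 mL of modern air ("about 500")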

Relevance: 100.00%

Abstract:

The research in this thesis is related to static cost and termination analysis. Cost analysis aims at estimating the amount of resources that a given program consumes during execution, and termination analysis aims at proving that the execution of a given program will eventually terminate. These analyses are strongly related; indeed, cost analysis techniques rely heavily on techniques developed for termination analysis. Precision, scalability, and applicability are essential in static analysis in general. Precision is related to the quality of the inferred results, scalability to the size of programs that can be analyzed, and applicability to the class of programs that can be handled by the analysis (independently from precision and scalability issues). This thesis addresses these aspects in the context of cost and termination analysis, from both practical and theoretical perspectives. For cost analysis, we concentrate on the problem of solving cost relations (a form of recurrence relations) into closed-form upper and lower bounds, which is the heart of most modern cost analyzers, and also where most of the precision and applicability limitations can be found. We develop tools, and their underlying theoretical foundations, for solving cost relations that overcome the limitations of existing approaches, and demonstrate superiority in both precision and applicability. A unique feature of our techniques is the ability to smoothly handle both lower and upper bounds, by reversing the corresponding notions in the underlying theory. For termination analysis, we study the hardness of the problem of deciding termination for a specific form of simple loops that arise in the context of cost analysis. This study gives a better understanding of the (theoretical) limits of scalability and applicability for both termination and cost analysis.
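As a concrete illustration of what solving a cost relation into a closed form means (a textbook example, not one taken from the thesis): the relation C(n) = C(n-1) + n with C(0) = 0, typical of a loop whose body cost grows with the iteration counter, has the exact closed form n(n+1)/2, which is simultaneously an upper and a lower bound:

    # Illustrative cost relation C(n) = C(n-1) + n, C(0) = 0, and its
    # closed form n*(n+1)/2 (textbook example, not from the thesis).
    def cost_recurrence(n):
        return 0 if n == 0 else cost_recurrence(n - 1) + n

    def cost_closed_form(n):
        return n * (n + 1) // 2

    assert all(cost_recurrence(n) == cost_closed_form(n) for n in range(200))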

Relevance: 90.00%

Abstract:

Background: The accurate measurement of cardiac output (CO) is vital in guiding the treatment of critically ill patients. Invasive or minimally invasive measurement of CO is not without inherent risks to the patient. Skilled Intensive Care Unit (ICU) nursing staff are in an ideal position to assess changes in CO following therapeutic measures. The USCOM (Ultrasonic Cardiac Output Monitor) device is a non-invasive CO monitor whose clinical utility and ease of use require testing.

Objectives: To compare cardiac output measurement using a non-invasive ultrasonic device (USCOM) operated by a non-echocardiographically trained ICU Registered Nurse (RN) with the conventional pulmonary artery catheter (PAC), using both thermodilution and Fick methods.

Design: Prospective observational study.

Setting and participants: Between April 2006 and March 2007, we evaluated 30 spontaneously breathing patients requiring PAC for assessment of heart failure and/or pulmonary hypertension at a tertiary-level cardiothoracic hospital.

Methods: USCOM CO was compared with thermodilution measurements via PAC and with CO estimated using a modified Fick equation. The catheter was inserted by a medical officer, and all USCOM measurements were performed by a senior ICU nurse. Mean values, bias and precision, and mean percentage difference between measures were determined to compare methods. The intra-class correlation statistic was also used to assess agreement. The USCOM time to measure was recorded to assess the learning curve for USCOM use by an ICU RN, and a line of best fit was used to describe the operator learning curve.

Results: In 24 of 30 (80%) patients studied, CO measures were obtained. In 6 of 30 (20%) patients, an adequate USCOM signal was not achieved. The mean differences (± standard deviation) between USCOM and PAC, USCOM and Fick, and Fick and PAC CO were small: −0.34 ± 0.52 L/min, −0.33 ± 0.90 L/min and −0.25 ± 0.63 L/min respectively, across a range of outputs from 2.6 L/min to 7.2 L/min. The percent limits of agreement (LOA) were −34.6% to 17.8% for USCOM and PAC, −49.8% to 34.1% for USCOM and Fick, and −36.4% to 23.7% for PAC and Fick. Signal acquisition time reduced on average by 0.6 min per measure, to less than 10 min at the end of the study.

Conclusions: In 80% of our cohort, USCOM, PAC and Fick measures of CO all showed clinically acceptable agreement, and the learning curve for operation of the non-invasive USCOM device by an ICU RN was found to be satisfactorily short. Further work is required in patients receiving positive pressure ventilation.
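Bias and limits of agreement of the kind reported here follow the Bland-Altman convention: the bias is the mean of the paired differences and the 95% limits of agreement are the bias ± 1.96 standard deviations. A minimal sketch with hypothetical paired measurements (not the study's data):

    import statistics

    # Bland-Altman bias and 95% limits of agreement for paired cardiac
    # output measurements; the values below are hypothetical.
    uscom = [4.1, 5.2, 3.8, 6.0, 4.6]  # L/min
    pac = [4.5, 5.5, 4.2, 6.3, 5.1]    # L/min

    diffs = [u - p for u, p in zip(uscom, pac)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    print(f"bias {bias:.2f} L/min, "
          f"LOA {bias - 1.96 * sd:.2f} to {bias + 1.96 * sd:.2f} L/min")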

Relevance: 90.00%

Abstract:

We examine the impact of continuous disclosure regulatory reform on the likelihood, frequency and qualitative characteristics of management earnings forecasts issued in New Zealand’s low private litigation environment. Using a sample of 720 earnings forecasts issued by 94 firms listed on the New Zealand Exchange before and after the reform (1999–2005), we provide strong evidence of significant changes in forecasting behaviour in the post-reform period. Specifically, firms were more likely to issue earnings forecasts to pre-empt earnings announcements and, in contrast to findings in other legal settings, those earnings forecasts exhibited higher frequency and improved qualitative characteristics (better precision and accuracy). An important implication of our findings is that public regulatory reforms may have a greater benefit in a low private litigation environment and thus add to the global debate about the effectiveness of alternative public regulatory reforms of corporate requirements.

Relevance: 90.00%

Abstract:

A rule-based approach for classifying previously identified medical concepts in clinical free text into assertion categories is presented. There are six categories of assertion for the task: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, a health-based clinical terminology called SNOMED CT and other publicly available dictionaries were used to classify assertions that did not fit the NegEx/ConText model. The data for this task include discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Center, as well as discharge summaries and progress notes from the University of Pittsburgh Medical Center. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files for system development, and 477 reports for evaluation. The system's performance on the evaluation data set was 0.83, 0.83 and 0.83 for recall, precision and F1-measure, respectively. Although the rule-based system shows promise, further improvements can be made by incorporating machine learning approaches.
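The three identical figures are no coincidence: F1 is the harmonic mean of precision (P) and recall (R), so equal precision and recall always yield the same F1. A one-line check:

    # F1 = 2*P*R / (P + R): the harmonic mean of precision and recall.
    precision, recall = 0.83, 0.83
    f1 = 2 * precision * recall / (precision + recall)
    print(round(f1, 2))  # 0.83 -- equal P and R give an identical F1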