163 results for diagnostic-accuracy


Relevance: 20.00%

Abstract:

One of the primary desired capabilities of any future air traffic separation management system is the ability to provide early conflict detection and resolution effectively and efficiently. In this paper, we consider the risk of conflict as the primary measure to be used for early conflict detection. The paper focuses on developing a novel approach to assess the impact of different measurement uncertainty models on the estimated risk of conflict; the measurement uncertainty model can be used to represent different sensor accuracies and sensor choices. Our study demonstrates the value of modelling measurement uncertainty in the conflict risk estimation problem and presents techniques that provide a means of assessing the sensor requirements needed to achieve a desired conflict detection performance.
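The abstract does not spell out the paper's risk-of-conflict computation, so the following is only a minimal Monte Carlo sketch of the general idea: a Gaussian position-measurement model whose standard deviation (`sensor_sigma`) stands in for different sensor choices, propagated over a look-ahead horizon. All geometry and numbers are hypothetical.

```python
import numpy as np

def conflict_risk(rel_pos, rel_vel, sensor_sigma, horizon=120.0,
                  sep_min=9_260.0, n_samples=10_000, seed=0):
    """Monte Carlo estimate of the probability that two aircraft lose
    separation within `horizon` seconds.

    rel_pos, rel_vel : nominal relative position (m) and velocity (m/s), 2-D
    sensor_sigma     : std dev (m) of Gaussian position-measurement noise,
                       standing in for a particular sensor model
    sep_min          : separation minimum (9,260 m = 5 NM)
    """
    rng = np.random.default_rng(seed)
    # Sample plausible true relative positions given the noisy measurement.
    pos = rel_pos + rng.normal(0.0, sensor_sigma, size=(n_samples, 2))
    t = np.linspace(0.0, horizon, 61)
    # Propagate every sample along the nominal relative velocity.
    traj = pos[:, None, :] + t[None, :, None] * rel_vel[None, None, :]
    min_sep = np.linalg.norm(traj, axis=2).min(axis=1)
    return float((min_sep < sep_min).mean())

# Same encounter, two hypothetical sensor accuracies.
for sigma in (100.0, 500.0):
    p = conflict_risk(np.array([-30_000.0, 9_000.0]),
                      np.array([250.0, 0.0]), sensor_sigma=sigma)
    print(f"sigma = {sigma:>5.0f} m  ->  estimated risk of conflict {p:.3f}")
```

Sweeping `sensor_sigma` in this way is one means of asking how accurate a sensor must be before the estimated risk meets a desired conflict-detection threshold.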

Relevance: 20.00%

Abstract:

Highly sensitive infrared (IR) cameras provide high-resolution diagnostic images of the temperature and vascular changes of breasts. These images can be processed to emphasize hot spots that exhibit early and subtle changes owing to pathology. The resulting images show clusters that appear random in shape and spatial distribution but carry class-dependent information in shape and texture. Automated pattern recognition techniques are challenged by changes in the location, size and orientation of these clusters. Higher-order spectral invariant features provide robustness to such transformations and are suited to extracting texture- and shape-dependent information from noisy images. In this work, the effectiveness of bispectral invariant features in the diagnostic classification of breast thermal images into malignant, benign and normal classes is evaluated, and a phase-only variant of these features is proposed. High-resolution IR images of breasts, captured with a measurement accuracy of ±0.4% (full scale) and a temperature resolution of 0.1 °C (black body), depicting malignant, benign and normal pathologies, are used in this study. Breast images are registered using their lower boundaries, automatically extracted using landmark points whose locations are learned during training; boundaries are extracted using Canny edge detection and elimination of inner edges. Breast images are then segmented using fuzzy c-means clustering and the hottest regions are selected for feature extraction. Bispectral invariant features are extracted from Radon projections of these images. An AdaBoost classifier is used to select and fuse the best features during training and then classify unseen test images into malignant, benign and normal classes. A data set comprising 9 malignant, 12 benign and 11 normal cases is used for evaluation of performance. Malignant cases are detected with 95% accuracy. A variant of the features using the normalized bispectrum, which discards all magnitude information, is shown to perform better for classification between benign and normal cases, with 83% accuracy compared to 66% for the original features.
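For readers unfamiliar with bispectral invariants, the sketch below computes a phase-only feature of a 1-D signal in the style of classic radial-line bispectrum integration (as in Chandran and Elgar's formulation): the triple product X(f1)X(f2)X*(f1+f2) is summed along lines f2 = a·f1 and only the phase of each sum is kept. The parameter choices and the synthetic input are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def bispectral_invariant(signal, n_lines=32):
    """Phase of the bispectrum B(f1, f2) = X(f1) X(f2) X*(f1 + f2),
    integrated along radial lines f2 = a * f1 for 0 < a <= 1.
    Keeping only the phase of each integral yields features that are
    robust to translation and amplitude scaling of the input signal."""
    X = np.fft.fft(signal)
    n = len(signal)
    feats = []
    for a in np.linspace(1.0 / n_lines, 1.0, n_lines):
        acc = 0.0 + 0.0j
        for f1 in range(1, n // 4):
            f2 = int(round(a * f1))
            if f2 < 1 or f1 + f2 >= n // 2:
                continue
            acc += X[f1] * X[f2] * np.conj(X[f1 + f2])
        feats.append(np.angle(acc))  # phase only: the invariant feature
    return np.array(feats)

# In the paper's setting the inputs would be Radon projections of the
# segmented hot regions; here a synthetic 1-D signal stands in.
projection = np.convolve(np.random.default_rng(1).normal(size=128),
                         np.hanning(15), mode="same")
print(bispectral_invariant(projection)[:5])
```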

Relevance: 20.00%

Abstract:

This paper looks at the accuracy of using the built-in camera of smartphones and free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images covering a wide variation of luminance within an indoor and an outdoor scene were captured with an Apple iPhone 4S. The HDR images were then processed using Photosphere software (Ward, 2010) to produce luminance maps, in which individual pixel values were compared with calibrated luminance meter readings. This comparison showed an average luminance error of ~8% between the HDR image pixel values and the luminance meter readings when the range of luminances in the image is limited to approximately 1,500 cd/m².
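As a concrete illustration of the comparison described, the snippet below computes the average relative luminance error between HDR pixel values and paired luminance-meter readings. The arrays are invented for illustration, not data from the study.

```python
import numpy as np

# Hypothetical paired samples (cd/m^2): pixel values from the Photosphere
# luminance map versus calibrated luminance-meter readings.
hdr_pixels = np.array([52.0, 310.0, 880.0, 1250.0, 1480.0])
meter_reads = np.array([49.0, 295.0, 940.0, 1180.0, 1390.0])

rel_err = np.abs(hdr_pixels - meter_reads) / meter_reads
print(f"average luminance error: {rel_err.mean():.1%}")
# The study reports an average error of ~8% on real scenes below
# approximately 1,500 cd/m^2.
```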

Relevance: 20.00%

Abstract:

A fundamental proposition is that the accuracy of the designer's tender price forecasts is positively correlated with the amount of information available for that project. The paper describes an empirical study of the effect of the quantity of information available on practicing Quantity Surveyors' forecasting accuracy. The methodology involved the surveyors repeatedly revising tender price forecasts on receipt of chunks of project information. Each of twelve surveyors undertook two projects and selected information chunks from a total of sixteen information types. The analysis indicated marked differences in accuracy between project types and between experts and non-experts. The expert surveyors' forecasts were not found to be significantly improved by information other than basic building type and size, even after eliminating project type effects. The expert surveyors' forecasts based on knowledge of building type and size alone were, however, found to be of similar accuracy to that of average practitioners pricing full bills of quantities.
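The abstract does not state the accuracy measure used. A simple sketch, assuming accuracy is expressed as the signed percentage error of each successive forecast revision against the eventual tender price, might look like this (all figures hypothetical):

```python
import numpy as np

def forecast_errors(forecasts, tender_price):
    """Signed percentage error of each successive forecast revision
    against the eventual tender price."""
    forecasts = np.asarray(forecasts, dtype=float)
    return 100.0 * (forecasts - tender_price) / tender_price

# Hypothetical revisions by one surveyor after each chunk of information.
revisions = [2.10e6, 2.35e6, 2.28e6, 2.31e6]  # forecasts in GBP
print(forecast_errors(revisions, tender_price=2.30e6).round(1))
```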

Relevance: 20.00%

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming.

All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The perennial argument between empiricists and deductionists is that, as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features.

This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself.

The first part of the paper reviews most of the empirical evidence that we have located to date; knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.

Relevance: 20.00%

Abstract:

Several methods of estimating the costs or price of construction projects are now available for use in the construction industry. Owing to the conservative approach of estimators and quantity surveyors, and the fact that the industry is undergoing one of its deepest recessions this century, it is difficult to implement any changes in these processes. Several methods have been tried and tested and probably discarded forever, whereas other methods are still in their infancy. There is also a movement towards greater use of the computer, whichever method is adopted. An important consideration with any method of estimating is the accuracy with which costs can be calculated. Any improvement in this respect will be welcomed by all parties, because existing methods perform poorly when measured against this criterion. Estimating, particularly by contractors, has always carried a certain mystique, and many of the processes discussed both in the classroom and in practice are little more than fallacy when properly investigated. What makes an estimator or quantity surveyor good at forecasting the right price? To what extent does human behaviour influence the outcome, or have a part to play? These and other aspects of effective estimating are now examined in more detail.

Relevance: 20.00%

Abstract:

BACKGROUND AND AIMS: Crohn's disease (CD) is an inflammatory bowel disease (IBD) caused by a combination of genetic, clinical, and environmental factors. Identification of CD patients at high risk of requiring surgery may assist clinicians in deciding on a top-down or step-up treatment approach. METHODS: We conducted a retrospective case-control analysis of a population-based cohort of 503 CD patients. A regression-based data reduction approach was used to systematically analyse 63 genomic, clinical and environmental factors for association with IBD-related surgery as the primary outcome variable. RESULTS: A multi-factor model was identified that yielded the highest predictive accuracy for the need for surgery. The factors included in the model were the NOD2 genotype (OR = 1.607, P = 2.3 × 10⁻⁵), having ever had perianal disease (OR = 2.847, P = 4 × 10⁻⁶), being a post-diagnosis smoker (OR = 6.312, P = 7.4 × 10⁻³), being an ex-smoker at diagnosis (OR = 2.405, P = 1.1 × 10⁻³) and age (OR = 1.012, P = 4.4 × 10⁻³). Diagnostic testing for this multi-factor model produced an area under the curve of 0.681 (P = 1 × 10⁻⁴) and an odds ratio of 3.169 (95% CI, P = 1 × 10⁻⁴), which was higher than that of any factor considered independently. CONCLUSIONS: The results of this study require validation in other populations but represent a step forward in the development of more accurate prognostic tests, allowing clinicians to prescribe the optimal treatment approach for patients with complicated CD.
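To make the multi-factor model concrete: in a logistic regression, each reported odds ratio corresponds to a coefficient β = ln(OR). The sketch below turns the published ORs into an illustrative risk score. The intercept is not reported in the abstract, so the placeholder used here, and hence the printed absolute risk, is purely hypothetical.

```python
import math

# Odds ratios reported in the abstract. The model intercept is not
# reported, so INTERCEPT below is a hypothetical placeholder and the
# printed risk is illustrative only.
ODDS_RATIOS = {
    "nod2_risk_genotype": 1.607,
    "perianal_disease":   2.847,
    "post_dx_smoker":     6.312,
    "ex_smoker_at_dx":    2.405,
    "age_years":          1.012,   # per-year odds ratio
}
INTERCEPT = -3.0  # hypothetical

def surgery_risk(patient):
    """Logistic-model risk: each coefficient is beta = ln(OR)."""
    z = INTERCEPT + sum(math.log(ODDS_RATIOS[k]) * v
                        for k, v in patient.items())
    return 1.0 / (1.0 + math.exp(-z))

patient = {"nod2_risk_genotype": 1, "perianal_disease": 1,
           "post_dx_smoker": 0, "ex_smoker_at_dx": 1, "age_years": 42}
print(f"illustrative predicted risk of surgery: {surgery_risk(patient):.2f}")
```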

Relevance: 20.00%

Abstract:

Familial hemiplegic migraine (FHM) is a rare autosomal dominant subtype of migraine with aura. It is divided into three subtypes, FHM1, FHM2 and FHM3, which are caused by mutations in the CACNA1A, ATP1A2 and SCN1A genes, respectively. As part of a regular diagnostic service, we investigated 168 patients with FHM symptoms. Samples were tested for mutations within the CACNA1A gene. A proportion of the tested samples (4.43%) showed an FHM1 mutation, with five of the mutations found in exon 5, one in exon 16 and one in exon 17. Four polymorphisms were also detected, one of which occurred in a large percentage of samples (14.88%). The exon 16 2094G>A polymorphism, however, has been found to occur in healthy Caucasian control populations at frequencies of up to 16% and is not considered to be significantly associated with FHM. A finding of significance, detected in a single patient, was a novel mutation in exon 5 that results in a P225H change. The affected individual was an 8-year-old female. The exact phenotypic effect of this mutation is unknown, and further studies are needed to understand its pathophysiology in FHM1. New information will allow diagnostic procedures to be constantly updated, thus improving diagnostic accuracy. It is possible that new information will also aid the development of new therapeutic agents for the treatment of FHM.

Relevance: 20.00%

Abstract:

Background: Although rapid diagnostic tests (RDTs) for Plasmodium falciparum infection that target histidine-rich protein 2 (PfHRP2) are generally sensitive, their performance has been reported to be variable. One possible explanation for variable test performance is differences in the expression level of PfHRP in different parasite isolates. Methods: Total RNA and protein were extracted from synchronised cultures of 7 P. falciparum lines over 5 time points of the life cycle, and from synchronised ring stages of 10 P. falciparum lines. Using quantitative real-time polymerase chain reaction, Western blot analysis and ELISA, we investigated variation in the transcription of pfhrp2 and pfhrp3 and in the level of PfHRP protein in the different parasite lines over the parasite intraerythrocytic life cycle. Results: Transcription of pfhrp2 and pfhrp3 in different parasite lines over the parasite life cycle was observed to vary relative to the control parasite K1; in some parasite lines very low transcription of these genes was observed. Peak transcription was observed in ring-stage parasites, and pfhrp2 transcription was consistently higher than pfhrp3 transcription within parasite lines. The intraerythrocytic life-cycle stage at which the peak level of protein was present varied across strains. Total protein levels were more constant than total mRNA transcription; however, a maximum 24-fold difference in ring-stage expression relative to the K1 strain was observed. Conclusions: The levels of transcription of pfhrp2 and pfhrp3, and the expression of PfHRP protein, varied between different P. falciparum strains. This variation may impact the detection sensitivity of PfHRP2-detecting RDTs.
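The abstract does not state how relative transcription was computed. Assuming a standard 2^(-ΔΔCt) analysis against a reference gene and the K1 control line, a minimal sketch looks like this (Ct values invented for illustration):

```python
def relative_transcription(ct_target, ct_ref, ct_target_k1, ct_ref_k1):
    """2^-ddCt relative quantification of a target gene (e.g. pfhrp2)
    against a reference gene, normalised to the K1 control line.
    Assumes ~100% amplification efficiency."""
    ddct = (ct_target - ct_ref) - (ct_target_k1 - ct_ref_k1)
    return 2.0 ** -ddct

# Hypothetical Ct values for ring-stage pfhrp2 in one line versus K1.
fold = relative_transcription(24.1, 20.3, 26.5, 20.1)
print(f"pfhrp2 transcription: {fold:.1f}-fold relative to K1")
```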

Relevance: 20.00%

Abstract:

Results of three tests (intravenous edrophonium chloride, EMG, and acetylcholine receptor antibody testing) were compared in patients with generalised and ocular myasthenia gravis. None of the three tests was positive in any patient with a diagnosis other than myasthenia. However, equivocal results were obtained with edrophonium and EMG testing in some patients with myasthenia gravis and in patients with other diseases. It is concluded from this survey that antibody and edrophonium testing were equally efficient in detecting generalised myasthenia gravis, while edrophonium testing was superior in ocular myasthenia gravis. Although the yield from each test varied, all three tests were needed for the evaluation of some myasthenia gravis patients, as each test may provide additional information.
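For readers comparing such tests, detection efficiency among confirmed cases is essentially sensitivity. A minimal sketch with invented counts (the paper's raw numbers are not given in the abstract):

```python
def sensitivity(true_pos, false_neg):
    """Proportion of confirmed myasthenia gravis cases a test detects."""
    return true_pos / (true_pos + false_neg)

# Invented counts for generalised MG; the abstract reports only that
# antibody and edrophonium testing were roughly equally efficient.
print(f"antibody:    {sensitivity(44, 6):.0%}")
print(f"edrophonium: {sensitivity(45, 5):.0%}")
```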

Relevance: 20.00%

Abstract:

Currently, finite element analyses are usually performed by means of commercial software tools, for which accuracy of analysis and computational time are two important measures of efficiency. This paper studies the parameters affecting the computational time and accuracy of finite element analyses performed with ANSYS and provides guidelines for users of this software when studying the deformation of orthopedic bone plates or similar cases. It is not a fundamental scientific study; it shares the authors' findings on structural analysis with ANSYS Workbench, giving readers an idea of how to improve the performance of the software and avoid its traps. The solutions provided in this paper are not the only possible ones; in similar cases there are other solutions not covered here. The parameters of solution method, material model, geometric model, mesh configuration, number of analysis steps, program-controlled parameters and computer settings are discussed thoroughly.
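One recurring trade-off such guidelines address is mesh configuration versus accuracy and run time. A generic, software-agnostic sketch of a mesh-convergence check, with hypothetical deflection results from successively refined runs, illustrates the kind of check involved:

```python
def refinement_converged(results, tol=0.02):
    """Flag, for each successive mesh refinement, whether the monitored
    result (e.g. peak bone-plate deflection) changed by less than `tol`
    relative to the finer run. `results` is ordered coarse -> fine."""
    return [abs(fine - coarse) / abs(fine) < tol
            for coarse, fine in zip(results, results[1:])]

# Hypothetical peak deflections (mm) from successively refined runs.
deflection = [0.412, 0.447, 0.455, 0.456]
print(refinement_converged(deflection))  # [False, True, True]
```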

Relevance: 20.00%

Abstract:

This study explores the accuracy and valuation implications of the application of a comprehensive list of equity multiples in the takeover context. Motivating the study are the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (1) how accurate are equity multiples (RQ1); (2) which equity multiples are more accurate in valuing the firm (RQ2); and (3) which equity multiples are associated with greater misvaluation of the firm (RQ3). Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited by multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) than by multiples based on bottom-line earnings, and that multiples based on Intrinsic Value will display the least overvaluation.

The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show that, first, the majority of computed multiples examined exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results support the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued valuations for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited by multiples based on Trading Revenue for targets, and on NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation.

Given the widespread use of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should help market participants to better understand the relative accuracy and misvaluation consequences of various equity multiples used in takeover documentation, assisting them in subsequent investment decision making.
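To illustrate the mechanics being evaluated: a multiples-based valuation applies a peer multiple (price divided by a value driver such as forecast earnings) to the target's own driver, and valuation accuracy is then the percentage deviation from the observed market price. The sketch below uses invented peer data, not figures from the study:

```python
import numpy as np

def multiple_valuation(peer_prices, peer_drivers, target_driver):
    """Apply the median peer multiple (price / value driver, e.g.
    forecast EPS) to the target's own value driver."""
    multiple = np.median(np.asarray(peer_prices) / np.asarray(peer_drivers))
    return multiple * target_driver

def pct_valuation_error(estimate, market_price):
    """Signed valuation error relative to the observed market price."""
    return (estimate - market_price) / market_price

# Invented peer data for a forecast-earnings multiple.
est = multiple_valuation([4.20, 6.10, 5.30], [0.30, 0.45, 0.41],
                         target_driver=0.38)
print(f"estimate {est:.2f}, error {pct_valuation_error(est, 5.00):+.1%}")
```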

Relevance: 20.00%

Abstract:

In early April 1998 the Centre for Disease Control (CDC) in Darwin was notified of a case with positive dengue serology. The illness appeared to have been acquired in the Northern Territory (NT). Because dengue is not endemic to the NT, locally acquired infection has significant public health implications, particularly for vector identification and control to limit the spread of infection. Dengue IgM serology was positive on two occasions but the illness was eventually presumptively identified as Kokobera infection. This case illustrates some important points about serology. The interpretation of flavivirus serology is complex and can be misleading, despite recent improvements. The best method of determining the cause of infection is still attempting to reconcile clinical illness details with incubation times and vector presence, as well as laboratory results. This approach ultimately justified the initial period of waiting for confirmatory results in this case, before the institution of public health measures necessary for a true case of dengue.

Relevance: 20.00%

Abstract:

iTRAQ (isobaric tags for relative or absolute quantitation) is a mass spectrometry technology that allows quantitative comparison of protein abundance by measuring the peak intensities of reporter ions released from iTRAQ-tagged peptides by fragmentation during MS/MS. However, current data analysis techniques for iTRAQ struggle to report reliable relative protein abundance estimates and suffer from problems of precision and accuracy. The precision of the data is affected by variance heterogeneity: low-signal data have higher relative variability, yet low-abundance peptides dominate data sets. Accuracy is compromised because ratios are compressed toward 1, leading to underestimation of the true ratio. This study investigated both issues and proposed a methodology that combines the peptide measurements to give a robust protein estimate even when the data for the protein are sparse or at low intensity. Our data indicated that ratio compression arises from contamination during precursor ion selection, which occurs at a consistent proportion within an experiment and thus results in a linear relationship between expected and observed ratios; we proposed that a correction factor can therefore be calculated from proteins spiked at known ratios. We then demonstrated that variance heterogeneity is present in iTRAQ data sets irrespective of the analytical package, LC-MS/MS instrumentation, and iTRAQ labelling kit (4-plex or 8-plex) used. We proposed an additive-multiplicative error model for peak intensities in MS/MS quantitation and demonstrated that a variance-stabilizing normalization is able to address this error structure and stabilize the variance across the entire intensity range. The resulting uniform variance structure simplifies the downstream analysis. Heterogeneity of variance consistent with an additive-multiplicative model has been reported in other MS-based quantitation, including fields outside of proteomics; consequently, the variance-stabilizing normalization methodology has the potential to increase the capabilities of MS quantitation across diverse areas of biology and chemistry.
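Under an additive-multiplicative error model, a variance-stabilizing normalization is typically an arsinh-based generalized-log transform. The sketch below shows that transform plus a linear de-compression of observed log-ratios; the calibration constants are purely illustrative, not values from the study:

```python
import numpy as np

def glog(x, a=0.0, b=1.0):
    """arsinh-based generalised-log transform: approximately log(x) for
    large x but linear near zero, so intensities with an
    additive-multiplicative error model have roughly constant variance
    after transformation. `a` and `b` are calibration constants."""
    return np.arcsinh((x - a) / b)

def decompress_log_ratio(observed, slope, intercept=0.0):
    """Undo linear ratio compression, with slope/intercept estimated
    from proteins spiked at known ratios (values here are illustrative)."""
    return (observed - intercept) / slope

reporter = np.array([40.0, 3_000.0, 250_000.0])   # reporter-ion peaks
print(glog(reporter, a=20.0, b=150.0))
print(decompress_log_ratio(np.log2(1.6), slope=0.7))  # ~ log2(2.0)
```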