33 results for cost estimation accuracy


Relevance: 30.00%

Abstract:

A selective and sensitive liquid chromatography (LC)-atmospheric pressure chemical ionisation (APCI)-mass spectrometric (MS) assay of canrenone has been developed and validated employing dried blood spots (DBS) as the sample collection medium. DBS samples were prepared by applying 30 μl of spiked whole blood onto Guthrie cards. A 6 mm disc was punched from each DBS and extracted with 2 ml of a methanolic solution of 17α-methyltestosterone (internal standard). The methanolic extract was evaporated to dryness and reconstituted in acetonitrile:water (1:9, v/v). The reconstituted solution was further subjected to solid phase extraction using HLB cartridges. Chromatographic separation was achieved on a Waters Sunfire C18 reversed-phase column using isocratic elution, followed by a high organic wash to clear late-eluting/highly retained components. The mobile phase consisted of methanol:water (60:40, v/v) pumped at a flow rate of 0.3 ml/min. LC-APCI-MS detection was performed in the selected-ion monitoring (SIM) mode using target ions at m/z 341.1 and 303.3 for canrenone and the internal standard, respectively. The selectivity of the method was established by analysing DBS samples from 6 different sources (individuals). The calibration curve for canrenone was found to be linear over 25-1000 ng/ml (r > 0.994). Accuracy (% RE) and precision (% CV) values for within and between day were
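
The accuracy and precision figures quoted here follow the usual bioanalytical definitions of relative error and coefficient of variation; a minimal sketch of those two calculations, assuming replicate measurements at a known nominal concentration (the values below are illustrative, not from the study):

import statistics

def percent_re(measured, nominal):
    # Relative error (%): bias of the mean measured value from the nominal concentration
    return (statistics.mean(measured) - nominal) / nominal * 100

def percent_cv(measured):
    # Coefficient of variation (%): precision of the replicate measurements
    return statistics.stdev(measured) / statistics.mean(measured) * 100

replicates = [98.2, 102.5, 101.1, 97.8, 100.4]  # hypothetical within-day replicates, ng/ml
print(percent_re(replicates, 100.0), percent_cv(replicates))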

Relevance: 30.00%

Abstract:

A new domain-specific, reconfigurable system-on-a-chip (SoC) architecture is proposed for video motion estimation. This has been designed to cover most of the common block-based video coding standards, including MPEG-2, MPEG-4, H.264, WMV-9 and AVS. The architecture exhibits simple control, high throughput and relatively low hardware cost when compared with existing circuits. It can also easily handle flexible search ranges without any increase in silicon area and can be configured prior to the start of the motion estimation process for a specific standard. The computational rates achieved make the circuit suitable for high-end video processing applications, such as HDTV. Silicon design studies indicate that circuits based on this approach incur only a relatively small penalty in terms of power dissipation and silicon area when compared with implementations for specific standards. Indeed, the cost/performance achieved exceeds that of existing standard-specific solutions and greatly exceeds that of general-purpose field programmable gate array (FPGA) designs.
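
The block-based motion estimation such an architecture accelerates is, at its core, a search for the candidate block minimising a distortion metric such as the sum of absolute differences (SAD) over a configurable search range. A minimal software sketch of a full search is given below; the hardware described here parallelises this computation, so the Python is purely illustrative:

def full_search_sad(cur, ref, bx, by, block=16, search=16):
    # cur, ref: 2-D lists of luma samples; (bx, by): top-left corner of the current block
    h, w = len(ref), len(ref[0])
    best = (None, float("inf"))
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + block > w or y + block > h:
                continue  # candidate block falls outside the reference frame
            sad = sum(abs(cur[by + j][bx + i] - ref[y + j][x + i])
                      for j in range(block) for i in range(block))
            if sad < best[1]:
                best = ((dx, dy), sad)
    return best  # best motion vector and its SAD cost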

Relevance: 30.00%

Abstract:

PURPOSE:
The aim of the study was to compare the pre-operative metabolic tumour length on FDG PET/CT with the resected pathological specimen in patients with oesophageal cancer.

METHODS:
All patients diagnosed with oesophageal carcinoma who underwent staging PET/CT imaging between June 2002 and May 2008 and who were then suitable for curative surgery, either with or without neo-adjuvant chemotherapy, were included in this study. Metabolic tumour length was assessed using both visual analysis and a maximum standardised uptake value (SUV(max)) cutoff of 2.5.

RESULTS:
Thirty-nine patients proceeded directly to curative surgical resection, whereas 48 patients received neo-adjuvant chemotherapy followed by curative surgery. In the surgical arm, the 95% limits of agreement were tighter when the metabolic tumour length was assessed visually, with a mean difference of -0.05 cm (SD 2.16 cm), compared with a mean difference of +2.42 cm (SD 3.46 cm) when assessed with an SUV(max) cutoff of 2.5. In the neo-adjuvant group, the 95% limits of agreement were again tighter with visual assessment, with a mean difference of -0.6 cm (SD 1.84 cm) compared with +1.58 cm (SD 3.1 cm) for the SUV(max) cutoff of 2.5.
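
The 95% limits of agreement quoted here are the standard Bland-Altman bounds, the mean of the paired differences plus or minus 1.96 times their standard deviation; a minimal sketch of that calculation with hypothetical paired length measurements (not data from the study):

import statistics

def limits_of_agreement(pet_lengths, path_lengths):
    # Bland-Altman analysis: bias and 95% limits of agreement between paired measurements
    diffs = [a - b for a, b in zip(pet_lengths, path_lengths)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical metabolic vs pathological tumour lengths (cm)
print(limits_of_agreement([5.1, 3.8, 6.2, 4.4], [5.0, 4.1, 5.9, 4.6]))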

CONCLUSION:
This study confirms the high accuracy of PET/CT in measuring gross target volume (GTV) length. A visual method for GTV length measurement was demonstrated to be superior to measurement using an SUV(max) cutoff of 2.5. This has the potential to reduce the planning target volume, allowing dose escalation to the tumour with a corresponding reduction in normal tissue complication probability.

Relevance: 30.00%

Abstract:

Background

Biomedical researchers are now often faced with situations where it is necessary to test a large number of hypotheses simultaneously, e.g. in comparative gene expression studies using high-throughput microarray technology. To control false-positive errors properly, the FDR (false discovery rate) approach has become widely used in multiple testing. Accurate estimation of the FDR requires that the proportion of true null hypotheses be estimated accurately. To date, many methods for estimating this quantity have been proposed. Typically, when a new method is introduced, some simulations are carried out to show its improved accuracy. However, these simulations are often very limited, covering only a few points in the parameter space.

Results

Here I have carried out extensive in silico experiments to compare some commonly used methods for estimating the proportion of true null hypotheses. The coverage of these simulations is unprecedentedly thorough over the parameter space compared with typical simulation studies in the literature. This work therefore enables conclusions to be drawn globally about the performance of these different methods. It was found that a very simple method gives the most accurate estimation over a dominantly large area of the parameter space. Given its simplicity and its overall superior accuracy, I recommend its use as the first choice for estimating the proportion of true null hypotheses in multiple testing.
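
The abstract does not name the simple estimator it recommends. As an illustration of the class of methods being compared, one widely used simple estimator of the proportion of true nulls counts the fraction of p-values above a threshold λ, where true nulls dominate and are roughly uniform (Storey's λ-based estimator); a minimal sketch, shown only as an example and not as the recommended method:

def pi0_estimate(p_values, lam=0.5):
    # Under the null, p-values are ~Uniform(0, 1), so p-values above lam are
    # dominated by true nulls; rescale the tail count to estimate pi0.
    m = len(p_values)
    tail = sum(1 for p in p_values if p > lam)
    return min(1.0, tail / ((1.0 - lam) * m))

# hypothetical p-values from a multiple-testing experiment
print(pi0_estimate([0.001, 0.03, 0.2, 0.45, 0.6, 0.71, 0.85, 0.93]))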

Relevance: 30.00%

Abstract:

Diagnostic accuracy and management recommendations of real-time teledermatology consultations using low-cost telemedicine equipment were evaluated. Patients were seen by a dermatologist over a video-link and a diagnosis and treatment plan were recorded. This was followed by a face-to-face consultation on the same day to confirm the earlier diagnosis and management plan. A total of 351 patients with 427 diagnoses participated. Sixty-seven per cent of the diagnoses made over the video-link agreed with the face-to-face diagnosis. Clinical management plans were recorded for 214 patients with 252 diagnoses. For this cohort, 44% of the patients were seen by the same dermatologist at both consultations, while 56% were seen by a different dermatologist. In 64% of cases the same management plan was recommended at both consultations; a sub-optimum treatment plan was recommended in 8% of cases; and in 9% of cases the video-link management plans were judged to be inappropriate. In 20% of cases the dermatologist was unable to recommend a suitable management plan by video-link. There were significant differences in the ability to recommend an optimum management plan by video-link when a different dermatologist made the reference management plan. The results indicate that a high proportion of dermatological conditions can be successfully managed by real-time teledermatology.

Relevance: 30.00%

Abstract:

Increasing use of teledermatology should be based on demonstration of favourable accuracy and cost-benefit analysis for the different methods of use of this technique. Objectives: To evaluate the clinical efficacy and cost-effectiveness of real-time and store-and-forward teledermatology.

Relevance: 30.00%

Abstract:

In this paper, a new reconfigurable multi-standard architecture is introduced for integer-pixel motion estimation, and a standard-cell based chip design study is presented. This has been designed to cover most of the common block-based video compression standards, including MPEG-2, MPEG-4, H.263, H.264, AVS and WMV-9. The architecture exhibits simple control, high throughput and relatively low hardware cost, and is highly competitive when compared with existing designs for specific video standards. It can also, through the use of control signals, be dynamically reconfigured at run-time to accommodate different system constraints, such as the trade-off between power dissipation and video quality. The computational rates achieved make the circuit suitable for high-end video processing applications. Silicon design studies indicate that circuits based on this approach incur only a relatively small penalty in terms of power dissipation and silicon area when compared with implementations for specific standards.

Relevance: 30.00%

Abstract:

PURPOSE: Scanning laser polarimetry (SLP) has been proposed as a useful diagnostic test for glaucoma. This study was conducted to evaluate the quality of reporting of published studies using SLP for diagnosing glaucoma. METHODS: A validated Medline and hand search of English-language articles reporting on measures of diagnostic accuracy of SLP for glaucoma was performed. Two reviewers independently selected and appraised the manuscripts. The Standards for Reporting of Diagnostic Accuracy (STARD) checklist was used to evaluate the quality of each publication. RESULTS: A total of 47 papers were identified, of which the first 10 (from 1997 to 2000) and the last 10 articles (from 2004 to 2005) were appraised. Interobserver rating agreement on STARD items was high (85.5% agreement, κ = 0.796). The number of STARD items properly reported ranged from 3/25 to 19/25. Only a quarter of studies (5/20) explicitly reported more than half of the STARD items. Important aspects of the methodology were often missing, such as participant sampling (reported in 40% of manuscripts), masking of the readers of the index test and reference standard (reported in 20% of manuscripts), and estimation of uncertainty (e.g. 95% confidence intervals, reported in 25% of manuscripts). There was a slight increase in the number of STARD items reported with time. CONCLUSIONS: The quality of reporting of diagnostic accuracy studies of SLP for glaucoma is suboptimal. The STARD initiative may be a useful tool for appraising the strengths and weaknesses of diagnostic accuracy studies. © 2007 Lippincott Williams & Wilkins, Inc.
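
Interobserver agreement here is summarised with Cohen's kappa, which corrects raw percentage agreement for the agreement expected by chance; a minimal sketch for two raters scoring each STARD item as reported/not reported (the ratings below are hypothetical, not the study's data):

def cohens_kappa(rater_a, rater_b):
    # Cohen's kappa for two raters over binary items: (p_o - p_e) / (1 - p_e)
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    p_yes = (sum(rater_a) / n) * (sum(rater_b) / n)
    p_no = (1 - sum(rater_a) / n) * (1 - sum(rater_b) / n)
    p_e = p_yes + p_no                                        # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # hypothetical item-level ratings, rater A
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]  # hypothetical item-level ratings, rater B
print(cohens_kappa(a, b))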

Relevance: 30.00%

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research.

Data sources: Major electronic databases were searched up to December 2005.

Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken.

Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye without treatment was approximately 23 years, compared with 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age therefore appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would cover only 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification, followed by assessment by a specialised optometrist for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates have considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective.

Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk and for quality-assuring the programme, as well as adequate service provision for those who screen positive, would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
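
The cost-effectiveness judgements in this review ultimately rest on comparing an incremental cost-effectiveness ratio (ICER) against a willingness-to-pay threshold (£30,000 per QALY in the abstract); a minimal sketch of that comparison, using hypothetical per-person cost and QALY figures rather than the model's actual outputs:

def icer(cost_screen, cost_no_screen, qaly_screen, qaly_no_screen):
    # Incremental cost-effectiveness ratio: extra cost per extra QALY gained
    return (cost_screen - cost_no_screen) / (qaly_screen - qaly_no_screen)

WTP = 30_000  # willingness to pay per QALY (GBP), as in the abstract
ratio = icer(cost_screen=450.0, cost_no_screen=300.0,
             qaly_screen=14.01, qaly_no_screen=14.00)  # hypothetical per-person values
print(ratio, "cost-effective" if ratio <= WTP else "not cost-effective")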

Relevance: 30.00%

Abstract:

The optimization of full-scale biogas plant operation is of great importance to make biomass a competitive source of renewable energy. The implementation of innovative control and optimization algorithms, such as Nonlinear Model Predictive Control, requires an online estimation of operating states of biogas plants. This state estimation allows for optimal control and operating decisions according to the actual state of a plant. In this paper such a state estimator is developed using a calibrated simulation model of a full-scale biogas plant, which is based on the Anaerobic Digestion Model No.1. The use of advanced pattern recognition methods shows that model states can be predicted from basic online measurements such as biogas production, CH4 and CO2 content in the biogas, pH value and substrate feed volume of known substrates. The machine learning methods used are trained and evaluated using synthetic data created with the biogas plant model simulating over a wide range of possible plant operating regions. Results show that the operating state vector of the modelled anaerobic digestion process can be predicted with an overall accuracy of about 90%. This facilitates the application of state-based optimization and control algorithms on full-scale biogas plants and therefore fosters the production of eco-friendly energy from biomass.
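
As an illustration of the general approach (not the authors' exact pipeline), a state estimator of this kind can be trained as a regression model mapping basic online measurements to simulated ADM1 state variables; a minimal sketch using scikit-learn, with random arrays standing in for the synthetic training data and hypothetical feature/target choices:

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical synthetic training data: rows are operating points from a plant
# simulation; columns are basic online measurements (biogas flow, CH4 %, CO2 %,
# pH, substrate feed volume). Targets are the corresponding model state variables.
rng = np.random.default_rng(0)
X_train = rng.uniform(size=(500, 5))
y_train = rng.uniform(size=(500, 3))   # e.g. three selected ADM1 state variables

estimator = RandomForestRegressor(n_estimators=200, random_state=0)
estimator.fit(X_train, y_train)

x_online = rng.uniform(size=(1, 5))    # one new vector of online measurements
print(estimator.predict(x_online))     # predicted operating-state vector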

Relevance: 30.00%

Abstract:

Restoration of joint centre during total hip arthroplasty is critical. While computer-aided navigation can improve accuracy during total hip arthroplasty, its expense makes it inaccessible to the majority of surgeons. This article evaluates the use, in the laboratory, of a calliper with a simple computer application to measure changes in femoral head centres during total hip arthroplasty. The computer application was designed using Microsoft Excel and used calliper measurements taken pre- and post-femoral head resection to predict the change in head centre in terms of offset and vertical height between the femoral head and newly inserted prosthesis. Its accuracy was assessed using a coordinate measuring machine to compare changes in preoperative and post-operative head centre when simulating stem insertion on 10 sawbone femurs. A femoral stem with a modular neck was used, which meant nine possible head centre configurations were available for each femur, giving 90 results. The results show that using this technique during a simulated total hip arthroplasty, it was possible to restore femoral head centre to within 6 mm for offset (mean 1.67 ± 1.16 mm) and vertical height (mean 2.14 ± 1.51 mm). It is intended that this low-cost technique be extended to inform the surgeon of a best-fit solution in terms of neck length and neck type for a specific prosthesis.

Relevance: 30.00%

Abstract:

Polymer extrusion is regarded as an energy-intensive production process, and the real-time monitoring of both energy consumption and melt quality has become necessary to meet new carbon regulations and survive in the highly competitive plastics market. The use of a power meter is a simple and easy way to monitor energy, but the cost can sometimes be high. On the other hand, viscosity is regarded as one of the key indicators of melt quality in the polymer extrusion process. Unfortunately, viscosity cannot be measured directly using current sensory technology. The employment of on-line, in-line or off-line rheometers is sometimes useful, but these instruments either involve signal delay or cause flow restrictions to the extrusion process, which is obviously not suitable for real-time monitoring and control in practice. In this paper, simple and accurate real-time energy monitoring methods are developed. This is achieved by looking inside the controller, and using control variables to calculate the power consumption. For viscosity monitoring, a ‘soft-sensor’ approach based on an RBF neural network model is developed. The model is obtained through a two-stage selection and differential evolution, enabling compact and accurate solutions for viscosity monitoring. The proposed monitoring methods were tested and validated on a Killion KTS-100 extruder, and the experimental results show high accuracy compared with traditional monitoring approaches.
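
The "soft sensor" idea is to infer viscosity from measurable process signals through a trained model. The paper's model is an RBF neural network selected by a two-stage algorithm and differential evolution, whose details are not reproduced here; the following is only a generic RBF-model sketch with hypothetical inputs and randomly chosen centres, to show the basic structure of such a soft sensor:

import numpy as np

def rbf_features(X, centres, width):
    # Gaussian radial basis functions evaluated at each centre
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))          # hypothetical process measurements (3 signals)
viscosity = rng.uniform(size=200)       # hypothetical viscosity training targets

centres = X[rng.choice(len(X), 10, replace=False)]         # 10 RBF centres from the data
Phi = rbf_features(X, centres, width=0.3)
weights, *_ = np.linalg.lstsq(Phi, viscosity, rcond=None)  # linear output layer

x_new = rng.uniform(size=(1, 3))
print(rbf_features(x_new, centres, 0.3) @ weights)         # soft-sensed viscosity estimate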

Relevance: 30.00%

Abstract:

This paper presents a new programming methodology for introducing and tuning parallelism in Erlang programs, using source-level code refactoring from sequential source programs to parallel programs written using our skeleton library, Skel. High-level cost models allow us to predict with reasonable accuracy the parallel performance of the refactored program, enabling programmers to make informed decisions about which refactorings to apply. Using our approach, we demonstrate easily obtainable, significant and scalable speedups of up to 21 on a 24-core machine over the sequential code.
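
For context on the headline figure: a speedup of 21 on 24 cores corresponds to a parallel efficiency of roughly 87.5%. A minimal sketch of the speedup and efficiency calculation that a cost model like this predicts and a benchmark confirms, with hypothetical timings:

def speedup_and_efficiency(t_seq, t_par, cores):
    # Speedup = sequential time / parallel time; efficiency = speedup / core count
    s = t_seq / t_par
    return s, s / cores

print(speedup_and_efficiency(t_seq=504.0, t_par=24.0, cores=24))  # ~21x, ~0.875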

Relevance: 30.00%

Abstract:

This paper presents a novel real-time power-device temperature estimation method that monitors the power MOSFET's junction temperature shift arising from thermal aging effects and incorporates the updated electrothermal models of power modules into digital controllers. Currently, the real-time estimator is emerging as an important tool for active control of device junction temperature as well as online health monitoring for power electronic systems, but its thermal model fails to address the device's ongoing degradation. Because of a mismatch of coefficients of thermal expansion between layers of power devices, repetitive thermal cycling will cause cracks, voids, and even delamination within the device components, particularly in the solder and thermal grease layers. Consequently, the thermal resistance of power devices will increase, making it possible to use thermal resistance (and junction temperature) as key indicators for condition monitoring and control purposes. In this paper, the device temperatures predicted via threshold voltage measurements are compared with the real-time estimated ones, and the difference is attributed to the aging of the device. The thermal models in digital controllers are frequently updated to correct the shift caused by thermal aging effects. Experimental results on three power MOSFETs confirm that the proposed methodologies are effective in incorporating thermal aging effects into the power-device temperature estimator with good accuracy. The developed adaptive technologies can be applied to other power devices such as IGBTs and SiC MOSFETs, and have significant economic implications.
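
The underlying electrothermal relationship being tracked is simple: junction temperature equals ambient (or case) temperature plus power loss times the junction-to-ambient thermal resistance, and aging shows up as a drift in that resistance. A minimal sketch of the estimate-and-correct idea with hypothetical numbers (not the paper's model, which uses full transient thermal networks):

def estimate_junction_temp(t_ambient, power_loss, r_th):
    # Steady-state electrothermal model: Tj = Ta + P * Rth (junction-to-ambient)
    return t_ambient + power_loss * r_th

def update_rth(r_th, tj_measured, tj_estimated, power_loss, gain=0.1):
    # Attribute a persistent gap between the measured junction temperature (e.g. via
    # threshold voltage) and the estimated one to thermal-resistance drift from aging.
    return r_th + gain * (tj_measured - tj_estimated) / power_loss

r_th = 1.5                                          # K/W, hypothetical initial value
tj_est = estimate_junction_temp(25.0, 20.0, r_th)   # 55 C estimated
tj_meas = 58.0                                      # hypothetical Vth-based measurement
r_th = update_rth(r_th, tj_meas, tj_est, 20.0)      # thermal model corrected for aging
print(tj_est, r_th)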

Relevance: 30.00%

Abstract:

Applications that cannot tolerate the loss of accuracy that results from binary arithmetic demand hardware decimal arithmetic designs. Binary arithmetic in quantum-dot cellular automata (QCA) technology has been extensively investigated in recent years. However, only limited attention has been paid to QCA decimal arithmetic. In this paper, two cost-efficient binary-coded decimal (BCD) adders are presented. One is based on the carry flow adder (CFA) using a conventional correction method. The other uses the carry look-ahead (CLA) algorithm and is the first QCA CLA decimal adder proposed to date. Compared with previous designs, both decimal adders achieve better performance in terms of latency and overall cost. The proposed CFA-based BCD adder has the smallest area with the least number of cells. The proposed CLA-based BCD adder is the fastest, with an increase in speed of over 60% when compared with the previous fastest decimal QCA adder. It also has the lowest overall cost, with a reduction of over 90% when compared with the previous most cost-efficient design.
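
The "conventional correction method" for BCD addition referred to here is the add-6 adjustment: each decimal digit pair is added in binary and, if the 4-bit sum exceeds 9 or produces a carry, 6 (0110) is added to bring the digit back into BCD range and propagate a decimal carry. A minimal behavioural sketch (the paper's designs implement this in QCA hardware, not software):

def bcd_add(a_digits, b_digits):
    # Add two BCD numbers given as lists of decimal digits, least significant first.
    result, carry = [], 0
    for a, b in zip(a_digits, b_digits):
        s = a + b + carry                 # 4-bit binary addition of one digit pair
        if s > 9:                         # invalid BCD code or digit overflow
            s += 6                        # add-6 correction folds the excess into a carry
            carry, s = 1, s & 0xF
        else:
            carry = 0
        result.append(s)
    return result, carry

print(bcd_add([8, 7], [5, 4]))  # 78 + 45 = 123 -> digits [3, 2] with final carry 1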