820 results for Accept–reject algorithm


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Accurate projection of implanted subdural electrode contacts in the presurgical evaluation of pharmacoresistant epilepsy cases by invasive EEG is highly relevant. Linear fusion of CT and MRI images may display the contacts in the wrong position due to brain shift effects. OBJECTIVE: A retrospective study in five patients with pharmacoresistant epilepsy was performed to evaluate whether an elastic image fusion algorithm can provide a more accurate projection of the electrode contacts on the pre-implantation MRI than linear fusion. METHODS: An automated elastic image fusion algorithm (AEF), a guided elastic image fusion algorithm (GEF), and a standard linear fusion algorithm (LF) were applied to preoperative MRI and post-implantation CT scans. Vertical correction of virtual contact positions, total virtual contact shift, and corrections of midline shift and of brain shift due to pneumencephalus were measured. RESULTS: Both AEF and GEF worked well in all 5 cases. An average midline shift of 1.7 mm (SD 1.25) was corrected to 0.4 mm (SD 0.8) after AEF and to 0.0 mm (SD 0) after GEF. Median virtual distances between contacts and the cortical surface were corrected significantly, from 2.3 mm after LF to 0.0 mm after AEF and GEF (p < 0.001). Mean total relative corrections of 3.1 mm (SD 1.85) after AEF and 3.0 mm (SD 1.77) after GEF were achieved. The tested version of GEF did not achieve a satisfying virtual correction of pneumencephalus. CONCLUSION: The technique provided a clear improvement in the fusion of pre- and post-implantation scans, although its accuracy is difficult to evaluate.

Relevance:

20.00%

Publisher:

Abstract:

The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
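
The benchmarking idea above — score homogenisation algorithms on synthetic series whose true signal and inserted inhomogeneities are known — can be illustrated with a minimal sketch. The series generator, the single step-change break, and the crude t-test changepoint detector below are illustrative assumptions, not the ISTI benchmark specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic monthly series: known trend plus noise ("truth"), then one known
# step-change inhomogeneity inserted at a known month (e.g. a station move).
n_months = 600
truth = 0.01 / 12 * np.arange(n_months) + rng.normal(0.0, 0.5, n_months)
true_break, break_size = 350, -0.8
observed = truth + np.where(np.arange(n_months) >= true_break, break_size, 0.0)

def detect_single_break(series, margin=24):
    """Return the index maximising a two-sample t statistic (crude break test)."""
    candidates = range(margin, len(series) - margin)
    return max(candidates, key=lambda k: abs(stats.ttest_ind(series[:k], series[k:])[0]))

detected = detect_single_break(observed)
estimated_size = observed[detected:].mean() - observed[:detected].mean()
adjusted = observed - np.where(np.arange(n_months) >= detected, estimated_size, 0.0)

# Benchmark score: how close the homogenised series is to the known truth.
rmse = np.sqrt(np.mean((adjusted - truth) ** 2))
print(f"true break {true_break}, detected {detected}, RMSE after adjustment {rmse:.3f} °C")
```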

Relevance:

20.00%

Publisher:

Abstract:

Fillers are frequently used in beautifying procedures. Despite major advances in the chemical and biological features of the injected materials, filler-related adverse events may occur and can substantially impact the clinical outcome. Filler granulomas become manifest as visible grains, nodules, or papules around the site of the primary injection. Early recognition and proper treatment of filler-related complications are important because effective treatment options are available. In this report, we provide a comprehensive overview of the differential diagnosis and diagnostics and develop an algorithm for successful therapy regimens.

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE To systematically evaluate the dependence of intravoxel incoherent motion (IVIM) parameters on the b-value threshold separating the perfusion and diffusion compartments, and to implement and test an algorithm for the standardized computation of this threshold. METHODS Diffusion-weighted images of the upper abdomen were acquired at 3 Tesla in eleven healthy male volunteers with 10 different b-values and in two healthy male volunteers with 16 different b-values. Region-of-interest IVIM analysis was applied to the abdominal organs and skeletal muscle with a systematic increase of the b-value threshold for computing the pseudodiffusion D*, the perfusion fraction Fp, the diffusion coefficient D, and the sum of squared residuals of the bi-exponential IVIM fit. RESULTS IVIM parameters strongly depended on the choice of the b-value threshold. The proposed algorithm successfully provided optimal b-value thresholds with the smallest residuals for all evaluated organs [s/mm²]: e.g., right liver lobe 20, spleen 20, right renal cortex 150, skeletal muscle 150. Mean D* [10⁻³ mm²/s], Fp [%], and D [10⁻³ mm²/s] values (± standard deviation) were: right liver lobe, 88.7 ± 42.5, 22.6 ± 7.4, 0.73 ± 0.12; right renal cortex, 11.5 ± 1.8, 18.3 ± 2.9, 1.68 ± 0.05; spleen, 41.9 ± 57.9, 8.2 ± 3.4, 0.69 ± 0.07; skeletal muscle, 21.7 ± 19.0, 7.4 ± 3.0, 1.36 ± 0.04. CONCLUSION IVIM parameters strongly depend upon the choice of the b-value threshold used for computation. The proposed algorithm may be used as a robust approach for IVIM analysis without organ-specific adaptation. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
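
The threshold-sweep procedure described above can be sketched in a few lines: for each candidate b-value threshold, perform a segmented IVIM fit (D and Fp from the high-b regime, D* from the full curve) and retain the threshold giving the smallest sum of squared residuals. The synthetic signal, the candidate thresholds, and the fitting details below are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

b = np.array([0, 10, 20, 40, 60, 80, 100, 200, 400, 800], dtype=float)

def ivim(b, f, d_star, d):
    # Bi-exponential IVIM model, signal normalised to S(b=0) = 1.
    return f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d)

# Synthetic liver-like signal with a little noise (hypothetical values).
rng = np.random.default_rng(1)
signal = ivim(b, f=0.23, d_star=0.09, d=0.0012) * (1.0 + rng.normal(0, 0.01, b.size))

def segmented_fit(b, s, b_thresh):
    high = b >= b_thresh
    # High-b regime: mono-exponential fit gives D and, via the intercept, Fp.
    slope, intercept = np.polyfit(b[high], np.log(s[high]), 1)
    d, f = -slope, 1.0 - np.exp(intercept)
    # Full curve: fit D* with Fp and D held fixed.
    (d_star,), _ = curve_fit(lambda bb, d_star: ivim(bb, f, d_star, d),
                             b, s, p0=[10.0 * d], maxfev=5000)
    ssr = float(np.sum((s - ivim(b, f, d_star, d)) ** 2))
    return (f, d_star, d), ssr

# Sweep candidate thresholds and keep the one with the smallest residuals.
results = {t: segmented_fit(b, signal, t) for t in (20, 40, 60, 80, 100, 150, 200)}
best_t = min(results, key=lambda t: results[t][1])
(f, d_star, d), ssr = results[best_t]
print(f"optimal threshold {best_t} s/mm^2: Fp={f:.2f}, D*={d_star:.3f}, D={d:.4f}, SSR={ssr:.2e}")
```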

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND A precise detection of volume change allows for better estimation of the biological behavior of lung nodules. Postprocessing tools with automated detection, segmentation, and volumetric analysis of lung nodules may expedite the radiological workflow and give additional confidence to radiologists. PURPOSE To compare two different postprocessing software algorithms (LMS Lung, Median Technologies; LungCARE®, Siemens) in CT volumetric measurement and to analyze the effect of a soft (B30) and a hard reconstruction filter (B70) on automated volume measurement. MATERIAL AND METHODS Between January 2010 and April 2010, 45 patients with a total of 113 pulmonary nodules were included. The CT exam was performed on a 64-row multidetector CT scanner (Somatom Sensation, Siemens, Erlangen, Germany) with the following parameters: collimation, 24 × 1.2 mm; pitch, 1.15; voltage, 120 kVp; reference tube current-time product, 100 mAs. Automated volumetric measurement of each lung nodule was performed with the two postprocessing algorithms based on the two reconstruction filters (B30 and B70). The average relative volume measurement difference (VME%) and the limits of agreement between the two methods were used for comparison. RESULTS At soft reconstruction filters the LMS system produced mean nodule volumes that were 34.1% (P < 0.0001) larger than those by the LungCARE® system. The VME% was 42.2%, with limits of agreement between -53.9% and 138.4%. The volume measurement with soft filters (B30) was significantly larger than with hard filters (B70): 11.2% for LMS and 1.6% for LungCARE®, respectively (both with P < 0.05). LMS measured greater volumes with both filters, 13.6% for soft and 3.8% for hard filters, respectively (P < 0.01 and P > 0.05). CONCLUSION There is substantial inter-software (LMS/LungCARE®) as well as intra-software (B30/B70) variability in lung nodule volume measurement; therefore, it is mandatory to use the same equipment with the same reconstruction filter for the follow-up of lung nodule volume.
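
The comparison statistics reported above (relative volume measurement difference and limits of agreement) can be computed as in the sketch below. The volume values are made-up examples, and the exact VME% definition used by the authors is assumed here to be the pairwise percentage difference relative to the pairwise mean.

```python
import numpy as np

vol_a = np.array([120.0, 340.0, 95.0, 410.0, 60.0])   # e.g. software A, mm^3 (hypothetical)
vol_b = np.array([ 88.0, 300.0, 70.0, 390.0, 41.0])   # e.g. software B, mm^3 (hypothetical)

# Relative difference of each nodule pair, in percent of the pairwise mean.
rel_diff = 100.0 * (vol_a - vol_b) / ((vol_a + vol_b) / 2.0)

mean_diff = rel_diff.mean()
sd_diff = rel_diff.std(ddof=1)
# Bland-Altman style 95% limits of agreement.
loa_low, loa_high = mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

print(f"mean relative difference: {mean_diff:.1f}%")
print(f"95% limits of agreement: {loa_low:.1f}% to {loa_high:.1f}%")
```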

Relevance:

20.00%

Publisher:

Abstract:

Cataloging geocentric objects can be put in the framework of Multiple Target Tracking (MTT). Current work tends to focus on the S = 2 MTT problem because of its favorable computational complexity of O(n²). The MTT problem becomes NP-hard for a dimension of S > 3. The challenge is to find an approximation to the solution within a reasonable computation time. To efficiently approximate this solution, a Genetic Algorithm is used. The algorithm is applied to a simulated test case. These results represent the first steps towards a method that can treat the S > 3 problem efficiently and with minimal manual intervention.
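
For context, the tractable S = 2 case mentioned above reduces to a standard linear assignment problem solvable exactly in polynomial time. A minimal sketch follows, with a made-up cost matrix standing in for whatever association cost (e.g. an orbit-fit residual) links an observation in the first fence to one in the second.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical association costs: rows are fence-1 observations, columns fence-2.
cost = np.array([
    [0.2, 1.5, 3.0],
    [1.1, 0.3, 2.2],
    [2.8, 2.0, 0.4],
])

rows, cols = linear_sum_assignment(cost)     # Hungarian-style exact solution
print(list(zip(rows.tolist(), cols.tolist())), "total cost:", cost[rows, cols].sum())
```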

Relevance:

20.00%

Publisher:

Abstract:

Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 3 the MTT problem becomes NP-hard. As of now no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end an Elitist Genetic Algorithm is implemented to approximately solve the S > 3 MTT problem in an efficient manner. Its complexity is studied and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
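
A toy elitist genetic algorithm for a three-fence association problem is sketched below. The cost tensor, the permutation encoding, and the GA settings are illustrative assumptions rather than the authors' implementation; the point is only to show the elitist selection loop that keeps the best candidate associations across generations.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8                                    # observations per fence (hypothetical)
cost = rng.random((n, n, n))             # placeholder association cost tensor

def fitness(perm2, perm3):
    # Total cost of associating fence-1 observation i with perm2[i] and perm3[i].
    return cost[np.arange(n), perm2, perm3].sum()

def mutate(perm):
    child = perm.copy()
    i, j = rng.choice(n, size=2, replace=False)
    child[i], child[j] = child[j], child[i]    # swap two assignments
    return child

# Initial population of random candidate associations (one permutation per extra fence).
pop = [(rng.permutation(n), rng.permutation(n)) for _ in range(60)]

for generation in range(200):
    pop.sort(key=lambda ind: fitness(*ind))
    elite = pop[:10]                           # elitism: the best candidates survive unchanged
    children = []
    while len(children) < len(pop) - len(elite):
        p2, p3 = elite[rng.integers(len(elite))]
        children.append((mutate(p2), mutate(p3)))
    pop = elite + children

best = min(pop, key=lambda ind: fitness(*ind))
print("best association cost:", fitness(*best))
```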

Relevance:

20.00%

Publisher:

Abstract:

Behavior is one of the most important indicators for assessing cattle health and well-being. The objective of this study was to develop and validate a novel algorithm to monitor the locomotor behavior of loose-housed dairy cows based on the output of the RumiWatch pedometer (ITIN+HOCH GmbH, Fütterungstechnik, Liestal, Switzerland). Locomotion data were acquired by simultaneous pedometer measurements at a sampling rate of 10 Hz and by video recordings for later manual observation. The study consisted of 3 independent experiments. Experiment 1 was carried out to develop and validate the algorithm for lying behavior, experiment 2 for walking and standing behavior, and experiment 3 for stride duration and stride length. The final version was validated using raw data collected from cows not included in the development of the algorithm. Spearman correlation coefficients were calculated between accelerometer variables and the respective data derived from the video recordings (gold standard). Dichotomous data were expressed as the proportion of correctly detected events, and the overall difference for continuous data was expressed as the relative measurement error. The proportions of correctly detected events or bouts were 1 for stand-ups, lie-downs, standing bouts, and lying bouts, and 0.99 for walking bouts. The relative measurement error and Spearman correlation coefficient for lying time were 0.09% and 1; for standing time, 4.7% and 0.96; for walking time, 17.12% and 0.96; for number of strides, 6.23% and 0.98; for stride duration, 6.65% and 0.75; and for stride length, 11.92% and 0.81, respectively. The strong to very high correlations of the variables between visual observation and converted pedometer data indicate that the novel RumiWatch algorithm may markedly improve automated livestock management systems for efficient health monitoring of dairy cows.
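
The two validation measures used above — relative measurement error against the video gold standard and the Spearman rank correlation — can be computed as in this minimal sketch. The per-cow lying times are made-up example values, and the exact error definition (summed absolute deviation relative to the gold-standard total) is an assumption.

```python
import numpy as np
from scipy.stats import spearmanr

video_lying_min = np.array([712.0, 650.0, 580.0, 803.0, 640.0])       # per cow, video (hypothetical)
pedometer_lying_min = np.array([715.0, 648.0, 585.0, 800.0, 642.0])   # per cow, pedometer (hypothetical)

# Relative measurement error: total absolute deviation as a share of the gold-standard total.
relative_error = 100.0 * np.abs(pedometer_lying_min - video_lying_min).sum() / video_lying_min.sum()
rho, _ = spearmanr(video_lying_min, pedometer_lying_min)

print(f"relative measurement error: {relative_error:.2f}%")
print(f"Spearman correlation: {rho:.2f}")
```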

Relevance:

20.00%

Publisher:

Abstract:

Any image-processing object detection algorithm somehow tries to integrate the object light (Recognition Step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (Decision Step). There are various possibilities for how these two basic steps can be realized, as can be seen in the different detection methods proposed in the literature. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require a reasonable computation effort. In reality, a gain in sensitivity is usually only possible with a loss in decision accuracy and with a higher computational effort. So, automatic detection of faint streaks is still a challenge. This paper presents a detection algorithm using spatial filters simulating the geometrical form of possible streaks on a CCD image. This is realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution answers are accepted or rejected according to an overall threshold given by the background statistics. This approach yields as a first result a huge number of accepted answers due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show very promising sensitivity, reliability, and running speed for this detection method. Since all method parameters are based on statistics, the true-alarm as well as the false-alarm probability are well controllable. Moreover, the proposed method does not pose any extraordinary demands on the computer hardware or on the image acquisition process.
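
A minimal sketch of the filtering idea: convolve the frame with a small bank of oriented line kernels of varying length and angle, then accept pixels whose convolution response exceeds a threshold derived from the background statistics. Kernel construction, the threshold factor, and the synthetic frame below are illustrative assumptions, not the paper's implementation, which also adds further acceptance criteria.

```python
import numpy as np
from scipy.ndimage import convolve

def line_kernel(length, angle_deg):
    """Unit-sum kernel approximating a straight streak of a given length and angle."""
    size = length if length % 2 else length + 1
    kernel = np.zeros((size, size))
    c = size // 2
    dx, dy = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
    for t in np.linspace(-length / 2, length / 2, 4 * length):
        kernel[int(round(c + t * dy)), int(round(c + t * dx))] = 1.0
    return kernel / kernel.sum()

def detect_streaks(image, lengths=(11, 21), angles=range(0, 180, 15), k=5.0):
    # Overall accept/reject threshold from the background statistics.
    threshold = np.median(image) + k * image.std()
    hits = np.zeros_like(image, dtype=bool)
    for length in lengths:
        for angle in angles:
            response = convolve(image, line_kernel(length, angle), mode="nearest")
            hits |= response > threshold          # accept filter answers above threshold
    return hits

rng = np.random.default_rng(3)
frame = rng.normal(100.0, 5.0, (128, 128))        # synthetic background
frame[60, 30:90] += 40.0                          # synthetic horizontal streak
print("pixels flagged as streak:", int(detect_streaks(frame).sum()))
```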

Relevance:

20.00%

Publisher:

Abstract:

When considering data from many trials, it is likely that some of them present a markedly different intervention effect or exert an undue influence on the summary results. We develop a forward search algorithm for identifying outlying and influential studies in meta-analysis models. The forward search starts by fitting the hypothesized model to a small subset of likely outlier-free studies and proceeds by adding, one by one, the studies determined to be closest to the model fitted to the existing set. As each study is added, plots of estimated parameters and measures of fit are monitored, and outliers are identified by sharp changes in these forward plots. We apply the proposed outlier detection method to two real data sets: a meta-analysis of 26 studies that examines the effect of writing-to-learn interventions on academic achievement adjusting for three possible effect modifiers, and a meta-analysis of 70 studies that compares a fluoride toothpaste treatment to placebo for preventing dental caries in children. A simple simulated example is used to illustrate the steps of the proposed methodology, and a small-scale simulation study is conducted to evaluate the performance of the proposed method. Copyright © 2016 John Wiley & Sons, Ltd.
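
A minimal sketch of the forward search using a simple fixed-effect pooled estimate: start from a small subset of mutually consistent studies, then repeatedly add the remaining study closest to the current fit while monitoring the pooled effect. The effect sizes, the initial-subset rule, and the distance measure below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

effects   = np.array([0.30, 0.25, 0.35, 0.28, 0.32, 1.40, 0.27])   # hypothetical study estimates
variances = np.array([0.02, 0.03, 0.02, 0.04, 0.03, 0.02, 0.05])   # hypothetical within-study variances

def pooled(idx):
    # Inverse-variance weighted (fixed-effect) pooled estimate of the selected studies.
    w = 1.0 / variances[idx]
    return np.sum(w * effects[idx]) / np.sum(w)

# Initial subset: the three studies that agree best with the overall median.
order = np.argsort(np.abs(effects - np.median(effects)))
in_set = list(order[:3])
trajectory = [pooled(in_set)]

while len(in_set) < len(effects):
    remaining = [i for i in range(len(effects)) if i not in in_set]
    # Distance of each candidate from the current fit, standardised by its SE.
    dist = [abs(effects[i] - trajectory[-1]) / np.sqrt(variances[i]) for i in remaining]
    in_set.append(remaining[int(np.argmin(dist))])
    trajectory.append(pooled(in_set))

# A sharp jump in the forward plot flags the study entered at that step.
for step, est in enumerate(trajectory, start=3):
    print(f"{step} studies: pooled effect {est:.3f}")
```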