Abstract:
Statistical shape models (SSMs) have been used widely as a basis for segmenting and interpreting complex anatomical structures. The robustness of these models is sensitive to the registration procedure, i.e., the establishment of a dense correspondence across a training data set. In this work, two SSMs built from the same training data set of scoliotic vertebrae but with different registration procedures were compared. The first model was constructed from the original binary masks without any image pre- or post-processing, and the second was obtained by applying a feature-preserving smoothing method to the original training data set, followed by a standard rasterization algorithm. The accuracy of the correspondences was assessed quantitatively by means of the maximum of the mean minimum distance (MMMD) and the Hausdorff distance (HD). The anatomical validity of the models was quantified by means of three criteria: compactness, specificity, and model generalization ability. The objective of this study was to compare quasi-identical models based on standard metrics. Preliminary results suggest that the MMMD distance and eigenvalues are not sensitive metrics for evaluating the performance and robustness of SSMs.
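For readers unfamiliar with the two correspondence metrics, the following is a minimal sketch in Python of how a Hausdorff distance and a maximum-of-mean-minimum-distance can be computed between two point sets; the exact definitions used in the study may differ.

```python
import numpy as np
from scipy.spatial.distance import cdist

def correspondence_metrics(a, b):
    """Distance metrics between two point sets a, b (N x 3 arrays)."""
    d = cdist(a, b)                       # all pairwise Euclidean distances
    min_ab = d.min(axis=1)                # distance from each point of a to b
    min_ba = d.min(axis=0)                # distance from each point of b to a
    hd = max(min_ab.max(), min_ba.max())  # Hausdorff distance (worst case)
    mmmd = max(min_ab.mean(), min_ba.mean())  # max of the two mean minimum distances
    return hd, mmmd

# Example: a vertebra-like point cloud and a slightly perturbed copy.
rng = np.random.default_rng(0)
a = rng.normal(size=(500, 3))
b = a + rng.normal(scale=0.05, size=(500, 3))
print(correspondence_metrics(a, b))
```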
Abstract:
This paper presents a kernel density correlation based non-rigid point set matching method and shows its application in statistical model based 2D/3D reconstruction of a scaled, patient-specific model from an uncalibrated X-ray radiograph. In this method, both the reference point set and the floating point set are first represented using kernel density estimates. A correlation measure between these two kernel density estimates is then optimized to find a displacement field that moves the floating point set onto the reference point set. Regularizations based on the overall deformation energy and the motion smoothness energy are used to constrain the displacement field for robust point set matching. Incorporating this non-rigid point set matching method into a statistical model based 2D/3D reconstruction framework, we can reconstruct a scaled, patient-specific model from noisy edge points that are extracted directly from the X-ray radiograph by an edge detector. Our experiment, conducted on data sets of two patients and six cadavers, demonstrates a mean reconstruction error of 1.9 mm.
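A minimal sketch of the kernel correlation idea is given below, assuming Gaussian kernels; the paper's actual correlation measure and its deformation and smoothness regularization terms are not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import cdist

def kernel_correlation(ref, flt, sigma=1.0):
    """Gaussian kernel correlation between two point sets.

    Each point set is treated as a kernel density estimate; the correlation
    is the sum of Gaussian affinities over all point pairs, so it grows as
    the floating set moves onto the reference set.
    """
    d2 = cdist(ref, flt, "sqeuclidean")
    return np.exp(-d2 / (2.0 * sigma**2)).sum()

# The correlation increases as the floating set approaches the reference set.
rng = np.random.default_rng(1)
ref = rng.normal(size=(200, 2))
for shift in (2.0, 1.0, 0.0):
    print(shift, kernel_correlation(ref, ref + shift, sigma=0.5))
```

Maximizing such a measure over a parametrized displacement field, subject to regularization energies, yields a point set matching of the kind described above.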
Abstract:
For various reasons, it is important, if not essential, to integrate the computations and code used in data analyses, methodological descriptions, simulations, etc. with the documents that describe and rely on them. This integration allows readers to both verify and adapt the statements in the documents. Authors can easily reproduce them in the future, and they can present the document's contents in a different medium, e.g., with interactive controls. This paper describes a software framework for authoring and distributing these integrated, dynamic documents that contain text, code, data, and any auxiliary content needed to recreate the computations. The documents are dynamic in that the contents, including figures, tables, etc., can be recalculated each time a view of the document is generated. Our model treats a dynamic document as a master or "source" document from which one can generate different views in the form of traditional, derived documents for different audiences. We introduce the concept of a compendium as both a container for the different elements that make up the document and its computations (i.e., text, code, data, ...), and as a means for distributing, managing and updating the collection. The step from disseminating analyses via a compendium to reproducible research is a small one. By reproducible research, we mean research papers with accompanying software tools that allow the reader to directly reproduce the results and employ the methods that are presented in the research paper. Some of the issues involved in paradigms for the production, distribution and use of such reproducible research are discussed.
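To make the "recalculated each time a view is generated" idea concrete, here is a toy sketch in Python; the framework described in the paper is far richer, and everything in this snippet (the template syntax, the render function) is invented for illustration.

```python
import re

# A toy "dynamic document": expressions embedded in the source text are
# re-evaluated every time a view is rendered, so reported numbers stay in
# sync with the data. All names and the {{...}} syntax are invented here.
SOURCE = """\
Analysis report
The data set has {{len(data)}} observations with mean {{sum(data)/len(data):.2f}}.
"""

def render(source, env):
    """Produce one view of the document by evaluating each {{expr}} chunk."""
    return re.sub(r"\{\{(.+?)\}\}", lambda m: evaluate(m.group(1), env), source)

def evaluate(expr, env):
    if ":" in expr:                          # optional {{expr:format-spec}}
        expr, fmt = expr.rsplit(":", 1)
        return format(eval(expr, env), fmt)
    return str(eval(expr, env))

data = [4, 8, 15, 16, 23, 42]
print(render(SOURCE, {"data": data}))        # recomputed on every render
```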
Abstract:
Exposimeters are increasingly applied in bioelectromagnetic research to determine personal radiofrequency electromagnetic field (RF-EMF) exposure. The main advantages of exposimeter measurements are their convenient handling for study participants and the large amount of personal exposure data, which can be obtained for several RF-EMF sources. However, the large proportion of measurements below the detection limit is a challenge for data analysis. With the robust ROS (regression on order statistics) method, summary statistics can be calculated by fitting an assumed distribution to the observed data. We used a preliminary sample of 109 weekly exposimeter measurements from the QUALIFEX study to compare summary statistics computed by robust ROS with a naïve approach, where values below the detection limit were replaced by the value of the detection limit. For the total RF-EMF exposure, differences between the naïve approach and the robust ROS were moderate for the 90th percentile and the arithmetic mean. However, exposure contributions from minor RF-EMF sources were considerably overestimated with the naïve approach. This results in an underestimation of the exposure range in the population, which may bias the evaluation of potential exposure-response associations. We conclude from our analyses that summary statistics of exposimeter data calculated by robust ROS are more reliable and more informative than estimates based on a naïve approach. Nevertheless, estimates of source-specific medians or even lower percentiles depend on the assumed data distribution and should be considered with caution.
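For readers unfamiliar with ROS, the following is a simplified sketch, assuming a lognormal distribution and a single detection limit; real robust ROS implementations handle multiple detection limits and more careful plotting positions.

```python
import numpy as np
from scipy import stats

def simple_ros(values, detected):
    """values: measurements, nondetects stored at the detection limit;
    detected: boolean mask. Returns detects plus imputed nondetects."""
    n = len(values)
    order = np.argsort(values)
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)   # Blom plotting positions
    q = stats.norm.ppf(pp)                            # standard normal quantiles
    det = detected[order]
    # Regress log(value) on the normal quantile, detected observations only.
    slope, intercept, *_ = stats.linregress(q[det], np.log(values[order][det]))
    filled = values[order].astype(float)
    filled[~det] = np.exp(intercept + slope * q[~det])  # impute the nondetects
    return filled

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=1.0, size=200)       # "true" exposures
dl = 0.5                                               # detection limit
detected = x >= dl
censored = np.where(detected, x, dl)                   # naive: nondetects = dl
print("naive mean:", censored.mean())
print("ROS mean:  ", simple_ros(censored, detected).mean())
print("true mean: ", x.mean())
```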
Abstract:
PURPOSE Segmentation of the proximal femur in digital antero-posterior (AP) pelvic radiographs is required to create a three-dimensional model of the hip joint for use in planning and treatment. However, manually extracting the femoral contour is tedious and prone to subjective bias, while automatic segmentation must accommodate poor image quality, anatomical structure overlap, and femur deformity. A new method was developed for femur segmentation in AP pelvic radiographs. METHODS Using manual annotations on 100 AP pelvic radiographs, a statistical shape model (SSM) and a statistical appearance model (SAM) of the femur contour were constructed. The SSM and SAM were used to segment new AP pelvic radiographs with a three-stage approach. At initialization, the mean SSM model is coarsely registered to the femur in the AP radiograph through a scaled rigid registration. The Mahalanobis distance defined on the SAM is employed as the search criterion for suggested locations of each annotated landmark. Dynamic programming is used to eliminate ambiguities. After all landmarks are assigned, a regularized non-rigid registration method deforms the current mean shape of the SSM to produce a new segmentation of the proximal femur. The second and third stages are executed iteratively until convergence. RESULTS A set of 100 clinical AP pelvic radiographs (not used for training) was evaluated. The mean segmentation error was [Formula: see text], requiring [Formula: see text] s per case when implemented in Matlab. The influence of the initialization on segmentation results was tested by six clinicians, demonstrating no significant difference. CONCLUSIONS A fast, robust and accurate method for femur segmentation in digital AP pelvic radiographs was developed by combining SSM and SAM with dynamic programming. This method can be extended to the segmentation of other bony structures such as the pelvis.
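A minimal sketch of the Mahalanobis-distance search criterion is given below: candidate intensity profiles along a search line are scored against a per-landmark appearance model. All names and dimensions are illustrative, not taken from the paper.

```python
import numpy as np

def mahalanobis_score(profile, mean, cov_inv):
    """Lower score = candidate better matches the appearance model."""
    d = profile - mean
    return float(d @ cov_inv @ d)

# "Train" a per-landmark appearance model from intensity profiles sampled
# around the landmark in the training radiographs (synthetic here).
rng = np.random.default_rng(3)
train = rng.normal(size=(100, 9))             # 100 profiles, 9 samples each
mean = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train.T) + 1e-6 * np.eye(9))  # regularized

# Score candidate positions along the search line and keep the best one;
# a dynamic-programming pass over all landmarks would then resolve ties.
candidates = rng.normal(size=(21, 9))
best = min(range(len(candidates)),
           key=lambda i: mahalanobis_score(candidates[i], mean, cov_inv))
print("best candidate index:", best)
```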
Abstract:
OBJECTIVE: The assessment of coronary stents with present-generation 64-detector row high-definition computed tomography (HDCT) scanners is limited by image noise and blooming artefacts. We evaluated the performance of adaptive statistical iterative reconstruction (ASIR) for noise reduction in coronary stent imaging with HDCT. METHODS AND RESULTS: In 50 stents of 28 patients (mean age 64 ± 10 years) undergoing coronary CT angiography (CCTA) on an HDCT scanner, the mean in-stent luminal diameter, stent length, image quality, in-stent contrast attenuation, and image noise were assessed. Studies were reconstructed using filtered back projection (FBP) and ASIR-FBP composites. ASIR resulted in reduced image noise vs. FBP (P < 0.0001). Two readers graded the CCTA stent image quality on a 4-point Likert scale and determined the proportion of interpretable stent segments. The best image quality for all clinical images was obtained with 40 and 60% ASIR, with significantly larger luminal area visualization compared with FBP (+42.1 ± 5.4% with 100% ASIR vs. FBP alone; P < 0.0001), while the measured stent length was decreased (-4.7 ± 0.9%).
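The ASIR-FBP composites mentioned above are percentage blends of the two reconstructions. A minimal sketch of such a blend, together with a simple ROI-based noise measurement, is shown below; the arrays merely stand in for reconstructed slices.

```python
import numpy as np

def blend(fbp, asir, pct_asir):
    """Composite image: pct_asir percent ASIR, remainder FBP."""
    w = pct_asir / 100.0
    return w * asir + (1.0 - w) * fbp

def roi_noise(img):
    """Image noise measured as the standard deviation in a central ROI."""
    return img[16:48, 16:48].std()

rng = np.random.default_rng(4)
truth = np.full((64, 64), 300.0)                         # contrast-filled lumen (HU)
fbp = truth + rng.normal(scale=40.0, size=truth.shape)   # noisier reconstruction
asir = truth + rng.normal(scale=15.0, size=truth.shape)  # iterative, less noise
for pct in (0, 40, 60, 100):
    print(pct, "% ASIR -> noise", round(roi_noise(blend(fbp, asir, pct)), 1))
```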
Abstract:
Calcium levels in spines play a significant role in determining the sign and magnitude of synaptic plasticity. The magnitude of calcium influx into spines is highly dependent on influx through N-methyl-D-aspartate (NMDA) receptors, and therefore depends on the number of postsynaptic NMDA receptors in each spine. We have previously calculated how the number of postsynaptic NMDA receptors determines the mean and variance of calcium transients in the postsynaptic density, and how this alters the shape of plasticity curves. However, the number of postsynaptic NMDA receptors in the postsynaptic density is not well known. Anatomical methods for estimating the number of NMDA receptors produce estimates that are very different from those produced by physiological techniques. The physiological techniques are based on the statistics of synaptic transmission, and it is difficult to experimentally estimate their precision. In this paper we use stochastic simulations to test the validity of a physiological estimation technique based on failure analysis. We find that the method is likely to underestimate the number of postsynaptic NMDA receptors, explain the source of the error, and re-derive a more precise estimation technique. We also show that the original failure analysis as well as our improved formulas are not robust to small estimation errors in key parameters.
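As a rough illustration of testing a failure-analysis estimator by stochastic simulation, consider the toy model below, in which a "failure" means that none of N independent channels opens, so P(failure) = (1 - p)^N. The paper's actual receptor model is more detailed; this sketch only conveys the estimation logic and its sensitivity to the assumed open probability.

```python
import numpy as np

rng = np.random.default_rng(5)
N_true, p, trials = 20, 0.15, 5000
opens = rng.binomial(N_true, p, size=trials)       # channels opening per trial
failure_rate = np.mean(opens == 0)                 # fraction of trials with none open
N_hat = np.log(failure_rate) / np.log(1.0 - p)     # invert P(failure) = (1 - p)**N
print("true N:", N_true, " estimated N:", round(N_hat, 2))

# Sensitivity to a small error in the assumed open probability p, echoing
# the robustness concern raised in the abstract.
for p_assumed in (0.12, 0.15, 0.18):
    print(p_assumed, "->", round(np.log(failure_rate) / np.log(1.0 - p_assumed), 2))
```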
Abstract:
Nuclear morphometry (NM) uses image analysis to measure features of the cell nucleus, which are classified as bulk properties, shape or form, and DNA distribution. Studies have used these measurements as diagnostic and prognostic indicators of disease, with inconclusive results. The distributional properties of these variables have not been systematically investigated, although much medical data exhibits non-normal distributions. Measurements are made on several hundred cells per patient, so summary measures reflecting the underlying distribution are needed. Distributional characteristics of 34 NM variables from prostate cancer cells were investigated using graphical and analytical techniques. Cells per sample ranged from 52 to 458. A small sample of patients with benign prostatic hyperplasia (BPH), representing non-cancer cells, was used for general comparison with the cancer cells. Data transformations such as log, square root and 1/x did not yield normality as measured by the Shapiro-Wilk test. A modulus transformation, used for distributions having abnormal kurtosis values, also did not produce normality. Kernel density histograms of the 34 variables exhibited non-normality, and 18 variables also exhibited bimodality. A bimodality coefficient was calculated, and three variables (DNA concentration, shape, and elongation) showed the strongest evidence of bimodality and were studied further. Two analytical approaches were used to obtain a summary measure for each variable for each patient: cluster analysis to determine significant clusters, and a mixture model analysis using a two-component model having a Gaussian distribution with equal variances. The mixture component parameters were used to bootstrap the log likelihood ratio to determine the significant number of components, 1 or 2. These summary measures were used as predictors of disease severity in several proportional odds logistic regression models. The disease severity scale had 5 levels and was constructed from 3 components: extracapsular penetration (ECP), lymph node involvement (LN+), and seminal vesicle involvement (SV+), which represent surrogate measures of prognosis. The summary measures were not strong predictors of disease severity. There was some indication from the mixture model results that there were changes in mean levels and proportions of the components in the lower severity levels.
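A minimal sketch of the sample bimodality coefficient used to screen such variables is shown below; values above roughly 0.555 (the value for a uniform distribution) are commonly taken to suggest bimodality.

```python
import numpy as np
from scipy import stats

def bimodality_coefficient(x):
    """Sample bimodality coefficient from skewness and excess kurtosis."""
    n = len(x)
    g = stats.skew(x)
    k = stats.kurtosis(x)     # excess kurtosis
    return (g**2 + 1) / (k + 3 * (n - 1)**2 / ((n - 2) * (n - 3)))

rng = np.random.default_rng(6)
unimodal = rng.normal(size=400)
bimodal = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])
print("unimodal:", round(bimodality_coefficient(unimodal), 3))
print("bimodal: ", round(bimodality_coefficient(bimodal), 3))
```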
Abstract:
PURPOSE Confidence intervals (CIs) are integral to the interpretation of the precision and clinical relevance of research findings. The aim of this study was to ascertain the frequency of reporting of CIs in leading prosthodontic and dental implantology journals and to explore possible factors associated with improved reporting. MATERIALS AND METHODS Thirty issues of nine journals in prosthodontics and implant dentistry were accessed, covering the years 2005 to 2012: The Journal of Prosthetic Dentistry, Journal of Oral Rehabilitation, The International Journal of Prosthodontics, The International Journal of Periodontics & Restorative Dentistry, Clinical Oral Implants Research, Clinical Implant Dentistry and Related Research, The International Journal of Oral & Maxillofacial Implants, Implant Dentistry, and Journal of Dentistry. Articles were screened and the reporting of CIs and P values recorded. Other information, including study design, region of authorship, involvement of methodologists, and ethical approval, was also obtained. Univariable and multivariable logistic regression was used to identify characteristics associated with reporting of CIs. RESULTS Interrater agreement for the data extraction was excellent (kappa = 0.88; 95% CI: 0.87 to 0.89). CI reporting was limited, with a mean reporting rate of 14% across journals. CI reporting was associated with journal type, study design, and involvement of a methodologist or statistician. CONCLUSIONS Reporting of CIs in implant dentistry and prosthodontic journals requires improvement. Improved reporting will aid appraisal of the clinical relevance of research findings by providing a range of values within which the effect size lies, thus giving the end user the opportunity to interpret the results in relation to clinical practice.
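For context on what a CI adds over a bare point estimate, here is a minimal sketch computing a 95% Wilson score interval for a reported proportion; the numbers are illustrative, not taken from the study.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

lo, hi = wilson_ci(14, 100)    # e.g. 14 of 100 screened articles report CIs
print(f"proportion 0.14, 95% CI: {lo:.3f} to {hi:.3f}")
```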
Abstract:
Purpose: Proper delineation of ocular anatomy in 3D imaging is a major challenge, particularly when developing treatment plans for ocular diseases. Magnetic Resonance Imaging (MRI) is now used in clinical practice for diagnosis confirmation and treatment planning of retinoblastoma in infants, where it serves as a source of information complementary to fundus or ultrasound imaging. Here we present a framework to fully automatically segment the eye anatomy in MRI based on 3D Active Shape Models (ASM); we validate the results and present a proof of concept for automatically segmenting pathological eyes. Material and Methods: Manual and automatic segmentation were performed on 24 images of healthy children's eyes (age 3.29 ± 2.15 years). Imaging was performed using a 3T MRI scanner. The ASM comprises the lens, the vitreous humor, the sclera and the cornea. The model was fitted by first automatically detecting the position of the eye center, the lens and the optic nerve, then aligning the model and fitting it to the patient. We validated our segmentation method using leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap using the Dice Similarity Coefficient (DSC) and the mean distance error. Results: We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor, and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 s on average per eye. Conclusion: We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor and the lens using MRI. We additionally present a proof of concept for fully automatically segmenting pathological eyes. This tool reduces the time needed for eye shape delineation and can thus help clinicians when planning eye treatment and confirming the extent of the tumor.
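A minimal sketch of the Dice Similarity Coefficient used to evaluate the overlap between automatic and manual segmentations, computed here on toy binary masks:

```python
import numpy as np

def dice(a, b):
    """DSC = 2|A intersect B| / (|A| + |B|) for boolean masks a and b."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto = np.zeros((50, 50), dtype=bool); auto[10:40, 10:40] = True
manual = np.zeros((50, 50), dtype=bool); manual[12:42, 12:42] = True
print(f"DSC = {dice(auto, manual):.3f}")
```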
Abstract:
We describe the recovery of three daily meteorological records for the southern Alps (Domodossola, Riva del Garda, and Rovereto), all starting in the second half of the nineteenth century. We use these new data, along with additional records, to study regional changes in mean temperature and in extreme indices of heat wave and cold spell frequency and duration over the period 1874–2015. The records are homogenized using subdaily cloud cover observations as a constraint for the statistical model, an approach that has not previously been applied in the literature. A case study based on a record of parallel observations between a traditional meteorological window and a modern screen shows that the use of cloud cover can reduce the root-mean-square error of the homogenization by up to 30% in comparison to an unaided statistical correction. We find that mean temperature in the southern Alps has increased by 1.4°C per century over the analyzed period, with larger increases in daily minimum temperatures than in maximum temperatures. The number of hot days in summer has more than tripled, and a similar increase is observed in the duration of heat waves. The number of cold days in winter has dropped at a similar rate. These trends are mainly driven by climate change over the last few decades.
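The following is a hedged, synthetic sketch of the idea that a covariate such as cloud cover can sharpen a statistical homogenization: the bias between two parallel records is modeled as a function of cloud cover rather than as a constant offset. The paper's actual statistical model is not reproduced here.

```python
import numpy as np

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

rng = np.random.default_rng(7)
n = 2000
cloud = rng.uniform(0, 8, n)                   # cloud cover in oktas
bias = 1.2 - 0.12 * cloud                      # radiation error shrinks with clouds
old = 15 + rng.normal(0, 3, n)                 # "window" record
new = old - bias + rng.normal(0, 0.3, n)       # parallel "screen" record

# Constant correction vs. a cloud-aware correction fitted by regression.
const_corr = old - (old - new).mean()
coef = np.polyfit(cloud, old - new, 1)         # linear fit of the bias on cloud cover
cloud_corr = old - np.polyval(coef, cloud)

print("constant correction RMSE:   ", round(rmse(const_corr, new), 3))
print("cloud-aware correction RMSE:", round(rmse(cloud_corr, new), 3))
```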
Abstract:
Land and water management in semi-arid regions requires detailed information on precipitation distribution, including extremes and changes therein. Such information is often lacking. This paper describes statistics of mean and extreme precipitation in a unique data set from the Mount Kenya region, encompassing around 50 stations with at least 30 years of data. We describe the data set, including quality control procedures and statistical break detection. Trends in mean precipitation and in extreme indices calculated from these data for individual rainy seasons are compared with corresponding trends in reanalysis products. From 1979 to 2011, mean precipitation decreased at 75% of the stations during the ‘long rains' (March to May) and increased at 70% of the stations during the ‘short rains' (October to December). Corresponding trends are found in the number of heavy precipitation days and in the maximum consecutive 5-day precipitation. Conversely, an increase in consecutive dry days within both main rainy seasons is found. However, trends are statistically significant in only very few cases. Reanalysis data sets agree with observations with respect to interannual variability, while correlations are considerably lower for monthly deviations (ratios) from the mean annual cycle. While some products reproduce the rainfall climatology well and others the spatial trend pattern, no product reproduces both.
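Two of the extreme indices mentioned above can be computed from a daily precipitation series as sketched below (CDD: longest run of days below the 1 mm wet-day threshold; Rx5day: maximum consecutive 5-day precipitation); the index definitions follow common climate-index conventions, which may differ in detail from those used in the paper.

```python
import numpy as np

def cdd(precip, wet_threshold=1.0):
    """Longest run of consecutive days below the wet-day threshold (mm)."""
    longest = run = 0
    for p in precip:
        run = run + 1 if p < wet_threshold else 0
        longest = max(longest, run)
    return longest

def rx5day(precip):
    """Maximum precipitation accumulated over any 5 consecutive days."""
    p = np.asarray(precip, dtype=float)
    return max(p[i:i + 5].sum() for i in range(len(p) - 4))

rng = np.random.default_rng(8)
season = rng.gamma(0.4, 8.0, size=92) * (rng.random(92) < 0.35)  # spotty rains
print("CDD:", cdd(season), "days; Rx5day:", round(rx5day(season), 1), "mm")
```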
Abstract:
The Weyburn Oil Field, Saskatchewan, is the site of a large (5000 tonnes/day of CO2) CO2-EOR injection project by EnCana Corporation. Pre- and post-injection samples (Baseline and Monitor-1, respectively) of produced fluids from approximately 45 vertical wells were taken and chemically analyzed to determine changes in fluid chemistry and isotope composition between August 2000 and March 2001. After 6 months of CO2 injection, geochemical parameters including pH, [HCO3], [Ca], [Mg], and δ13C of CO2(g) point to areas in which dissolution of injected CO2 and of reservoir carbonate minerals has occurred. Pre-injection fluid compositions suggest that the reservoir brine in the injection area may be capable of storing as much as 100 million tonnes of dissolved CO2. Modeling of water-rock reactions shows that clay minerals and feldspar, although volumetrically insignificant, may be capable of acting as pH buffers, allowing injected CO2 to be stored as bicarbonate in the formation water or as newly precipitated carbonate minerals, given favorable reaction kinetics.