303 results for histogram


Relevance: 10.00%

Abstract:

Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), which arises when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region of Italy is characterized by an abundance of zero values and right-skewness of the distribution of positive amounts. Rain gauge direct measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar on the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two parts are joined via a two-part semicontinuous model. Three model specifications addressing COSP in different ways are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C and linked to the R software. Communication and evaluation of probabilistic, point and interval predictions are investigated. A non-randomized PIT histogram is proposed for correctly assessing calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (Reliability Plot, Sharpness Histogram, PIT Histogram, Brier Score Plot and Quantile Decomposition Plot), proper scoring rules (Brier Score, Continuous Ranked Probability Score) and consistent scoring functions (Root Mean Square Error and Mean Absolute Error, addressing the predictive mean and median, respectively). Calibration is reached, and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
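
As a rough illustration of the non-randomized PIT idea for a two-part model, the sketch below spreads the point mass at zero uniformly over [0, p0] while treating positive amounts through the ordinary PIT. It assumes a zero-inflated Gamma predictive distribution; all names are hypothetical, not the paper's code.

```python
import numpy as np
from scipy.stats import gamma

def nonrandomized_pit_hist(y, p0, shape, scale, n_bins=10):
    """Non-randomized PIT histogram for a zero-inflated Gamma predictive
    distribution: point mass p0 at zero, Gamma(shape, scale) above zero.
    For y > 0 the PIT is a point; for y == 0 the mass p0 is spread
    uniformly over [0, p0] (cf. Czado, Gneiting & Held, 2009)."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    hist = np.zeros(n_bins)
    for yi, p0i, ki, si in zip(y, p0, shape, scale):
        if yi > 0:                                   # continuous part: ordinary PIT
            u = p0i + (1.0 - p0i) * gamma.cdf(yi, a=ki, scale=si)
            hist[min(int(u * n_bins), n_bins - 1)] += 1.0
        else:                                        # point mass: spread over [0, p0]
            lo, hi = edges[:-1], edges[1:]
            overlap = np.clip(np.minimum(hi, p0i) - lo, 0.0, None)
            hist += overlap / max(p0i, 1e-12)
    return hist / hist.sum()                         # flat histogram => calibrated
```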

Relevance: 10.00%

Abstract:

Lattice Quantum Chromodynamics (LQCD) is the preferred tool for obtaining non-perturbative results from QCD in the low-energy regime. It has by now entered the era in which high-precision calculations for a number of phenomenologically relevant observables at the physical point, with dynamical quark degrees of freedom and controlled systematics, become feasible. Despite these successes, there are still quantities where control of systematic effects is insufficient. The subject of this thesis is the exploration of the potential of today's state-of-the-art simulation algorithms for non-perturbatively $\mathcal{O}(a)$-improved Wilson fermions to produce reliable results in the chiral regime and at the physical point, both for zero and non-zero temperature. Important in this context is control over the chiral extrapolation. This thesis is concerned with two particular topics, namely the computation of hadronic form factors at zero temperature, and the properties of the phase transition in the chiral limit of two-flavour QCD.

The electromagnetic iso-vector form factor of the pion provides a platform to study systematic effects and the chiral extrapolation for observables connected to the structure of mesons (and baryons). Mesonic form factors are computationally simpler than their baryonic counterparts but share most of the systematic effects. This thesis contains a comprehensive study of the form factor in the regime of low momentum transfer $q^2$, where the form factor is connected to the charge radius of the pion. A particular emphasis is on the region very close to $q^2=0$, which has not been explored so far, neither in experiment nor in LQCD. The results for the form factor close the gap between the smallest spacelike $q^2$-value available so far and $q^2=0$, and reach an unprecedented accuracy with full control over the main systematic effects. This enables the model-independent extraction of the pion charge radius. The results for the form factor and the charge radius are used to test chiral perturbation theory ($\chi$PT) and are thereby extrapolated to the physical point and the continuum. The final result in units of the hadronic radius $r_0$ is
$$ \left\langle r_\pi^2 \right\rangle^{\rm phys}/r_0^2 = 1.87 \: \left(^{+12}_{-10}\right)\left(^{+\:4}_{-15}\right) \quad \textnormal{or} \quad \left\langle r_\pi^2 \right\rangle^{\rm phys} = 0.473 \: \left(^{+30}_{-26}\right)\left(^{+10}_{-38}\right)(10) \: \textnormal{fm}^2 \;, $$
which agrees well with the results from other measurements in LQCD and experiment. Note that this is the first continuum-extrapolated result for the charge radius from LQCD which has been extracted from measurements of the form factor in the region of small $q^2$.

The order of the phase transition in the chiral limit of two-flavour QCD and the associated transition temperature are the last unknown features of the phase diagram at zero chemical potential. The two possible scenarios are a second-order transition in the $O(4)$ universality class or a first-order transition. Since direct simulations in the chiral limit are not possible, the transition can only be investigated by simulating at non-zero quark mass with a subsequent chiral extrapolation, guided by the universal scaling in the vicinity of the critical point. The thesis presents the setup and first results from a study on this topic. The study provides an ideal platform to test the potential and limits of today's simulation algorithms at finite temperature. The results from a first scan at a constant zero-temperature pion mass of about 290 MeV are promising, and it appears that simulations down to physical quark masses are feasible. Of particular relevance for the order of the chiral transition is the strength of the anomalous breaking of the $U_A(1)$ symmetry at the transition point. It can be studied by looking at the degeneracies of the correlation functions in scalar and pseudoscalar channels. For the temperature scan reported in this thesis the breaking is still pronounced in the transition region, and the symmetry becomes effectively restored only above $1.16\:T_C$. The thesis also provides an extensive outline of research perspectives and includes a generalisation of the standard multi-histogram method to explicitly $\beta$-dependent fermion actions.
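
The multi-histogram method itself is not spelled out in the abstract. As background, a minimal sketch of the standard single-histogram (Ferrenberg-Swendsen) reweighting that it generalizes, assuming a Boltzmann weight exp(-βE), might look like:

```python
import numpy as np

def reweight(obs, energy, beta_sim, beta_new):
    """Single-histogram (Ferrenberg-Swendsen) reweighting of an observable
    measured at coupling beta_sim to a nearby coupling beta_new, assuming a
    Boltzmann weight exp(-beta * E). The multi-histogram method combines
    runs at several couplings with self-consistently determined weights."""
    log_w = -(beta_new - beta_sim) * energy
    log_w -= log_w.max()                  # stabilize the exponentials
    w = np.exp(log_w)
    return np.sum(w * obs) / np.sum(w)
```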

Relevance: 10.00%

Abstract:

A major barrier to widespread clinical implementation of Monte Carlo dose calculation is the difficulty in characterizing the radiation source within a generalized source model. This work aims to develop a generalized three-component source model (target, primary collimator, flattening filter) for 6- and 18-MV photon beams that matches full phase-space data (PSD). Subsource-by-subsource comparison of dose distributions, using either source PSD or the source model as input, allows accurate source characterization and has the potential to ease the commissioning procedure, since it is possible to obtain information about which subsource needs to be tuned. This source model is unique in that, compared to previous source models, it retains additional correlations among phase-space variables, which improves accuracy at nonstandard source-to-surface distances (SSDs). In our study, three-dimensional (3D) dose calculations were performed for SSDs ranging from 50 to 200 cm and for field sizes from 1 x 1 to 30 x 30 cm², as well as a 10 x 10 cm² field 5 cm off axis in each direction. The 3D dose distributions, using either full PSD or the source model as input, were compared in terms of dose difference and distance to agreement. With this model, over 99% of the voxels agreed within +/-1% or 1 mm for the target, within +/-2% or 2 mm for the primary collimator, and within +/-2.5% or 2 mm for the flattening filter in all cases studied. For the full dose distributions, 99% of the dose voxels agreed within 1% or 1 mm when the combined source model (including a charged-particle source) and the full PSD were used as input. The accurate and general characterization of each photon source and knowledge of the subsource dose distributions should facilitate source-model commissioning by allowing the histogram distributions representing the subsources to be scaled and tuned.
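
A simplified, grid-based sketch of the composite dose-difference / distance-to-agreement test used in such comparisons is shown below; the exact evaluation procedure of the paper is not specified here, and the names and tolerances are illustrative.

```python
import numpy as np

def dd_dta_pass_rate(dose_ref, dose_eval, coords, dd_tol=0.01, dta_mm=1.0):
    """Composite dose-difference (DD) / distance-to-agreement (DTA) test on a
    1D dose profile: a point passes if its dose difference is within dd_tol
    of the reference maximum, or if some evaluated point within dta_mm
    carries a matching dose. Grid-based sketch, no interpolation."""
    tol = dd_tol * dose_ref.max()
    passed = np.zeros(dose_ref.size, dtype=bool)
    for i, (d, x) in enumerate(zip(dose_ref, coords)):
        if abs(dose_eval[i] - d) <= tol:              # dose-difference criterion
            passed[i] = True
        else:                                         # distance-to-agreement criterion
            near = np.abs(coords - x) <= dta_mm
            passed[i] = np.min(np.abs(dose_eval[near] - d)) <= tol
    return passed.mean()                              # fraction of points passing
```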

Relevance: 10.00%

Abstract:

Northern hardwood management was assessed throughout the state of Michigan using data collected on recently harvested stands in 2010 and 2011. Methods of forensic estimation of diameter at breast height were compared and an ideal, localized equation form was selected for use in reconstructing pre-harvest stand structures. Comparisons showed differences in predictive ability among available equation forms which led to substantial financial differences when used to estimate the value of removed timber. Management on all stands was then compared among state, private, and corporate landowners. Comparisons of harvest intensities against a liberal interpretation of a well-established management guideline showed that approximately one third of harvests were conducted in a manner which may imply that the guideline was followed. One third showed higher levels of removals than recommended, and one third of harvests were less intensive than recommended. Multiple management guidelines and postulated objectives were then synthesized into a novel system of harvest taxonomy, against which all harvests were compared. This further comparison showed approximately the same proportions of harvests, while distinguishing sanitation cuts and the future productive potential of harvests cut more intensely than suggested by guidelines. Stand structures are commonly represented using diameter distributions. Parametric and nonparametric techniques for describing diameter distributions were employed on pre-harvest and post-harvest data. A common polynomial regression procedure was found to be highly sensitive to the method of histogram construction which provides the data points for the regression. The discriminative ability of kernel density estimation was substantially different from that of the polynomial regression technique.
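
The sensitivity of the polynomial-regression approach to histogram construction, contrasted with a binning-free kernel density estimate, can be illustrated on hypothetical diameter-at-breast-height data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
dbh = rng.weibull(2.0, 500) * 25 + 10          # hypothetical DBH sample (cm)

# The points fed to the polynomial regression change with the chosen
# diameter classes, so the fitted curve depends on the bin width.
for width in (2.0, 5.0):
    edges = np.arange(10, 65 + width, width)
    counts, _ = np.histogram(dbh, bins=edges, density=True)
    mids = 0.5 * (edges[:-1] + edges[1:])
    coefs = np.polyfit(mids, counts, deg=4)    # polynomial fit to histogram points
    print(width, np.round(coefs, 5))

# A kernel density estimate avoids binning altogether.
kde = gaussian_kde(dbh)
print(kde(np.array([20.0, 30.0, 40.0])))       # smooth density at chosen diameters
```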

Relevance: 10.00%

Abstract:

In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm is proposed, based on color information and thresholding. For the proposed algorithm, several color spaces are studied and the detection results are compared. Experimental results show that the YUV color space achieves the best performance. In addition, I develop a distance-histogram-based threshold selection method, which is shown to outperform other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I also investigate GPU speed-up techniques for skin lesion extraction; the results show that GPUs have potential for accelerating skin lesion extraction. Based on the proposed skin lesion detection algorithms, I developed a mobile skin cancer diagnosis application. With this application installed, an iPhone can be used as a diagnostic tool to find potential skin lesions on a person's skin and to compare the detected lesions with those stored in a database on a remote server.
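
A minimal sketch of color-threshold detection in YUV space, assuming an RGB image scaled to [0,1] and illustrative thresholds (the thesis derives its thresholds from the distance-histogram method), might be:

```python
import numpy as np

def yuv_lesion_mask(rgb, u_thr, v_thr):
    """Hypothetical color-threshold lesion detection in YUV space: convert
    an RGB image (H x W x 3, floats in [0,1]) with the standard BT.601
    matrix, then flag pixels whose chrominance exceeds chosen thresholds."""
    m = np.array([[ 0.299,  0.587,  0.114],    # Y (luma)
                  [-0.147, -0.289,  0.436],    # U (blue-difference chroma)
                  [ 0.615, -0.515, -0.100]])   # V (red-difference chroma)
    yuv = rgb @ m.T
    return (np.abs(yuv[..., 1]) > u_thr) & (np.abs(yuv[..., 2]) > v_thr)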

Relevance: 10.00%

Abstract:

PURPOSE Different international target volume delineation guidelines exist and different treatment techniques are available for salvage radiation therapy (RT) for recurrent prostate cancer, but less is known regarding their respective applicability in clinical practice. METHODS AND MATERIALS A randomized phase III trial testing 64 Gy vs 70 Gy salvage RT was accompanied by an intense quality assurance program including a site-specific and study-specific questionnaire and a dummy run (DR). Target volume delineation was performed according to the European Organisation for Research and Treatment of Cancer guidelines, and a DR-based treatment plan was established for 70 Gy. Major and minor protocol deviations were noted, interobserver agreement of delineated target contours was assessed, and dose-volume histogram (DVH) parameters of the different treatment techniques were compared. RESULTS Thirty European centers participated, 43% of which used 3-dimensional conformal RT (3D-CRT), with the remaining centers using intensity modulated RT (IMRT) or the volumetric modulated arc technique (VMAT). The first submitted version of the DR contained major deviations in 21 of 30 (70%) centers, mostly caused by an inappropriately defined or missing prostate bed (PB). All but 5 centers completed the DR successfully with their second submitted version. The interobserver agreement on the PB was moderate and was improved by the DR review, as indicated by an increased κ value (0.59 vs 0.55), mean sensitivity (0.64 vs 0.58), volume of total agreement (3.9 vs 3.3 cm³), and a decrease in the union volume (79.3 vs 84.2 cm³). Rectal and bladder wall DVH parameters of IMRT and VMAT vs 3D-CRT plans were not significantly different. CONCLUSIONS The interobserver agreement on PB delineation was moderate but was improved by the DR. Major deviations could be identified for the majority of centers. The DR improved the participating centers' familiarity with the trial protocol.
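
Rough analogues of the reported agreement measures (total-agreement and union volumes, mean sensitivity) can be computed from binary delineation masks; the sketch below uses simplified definitions that may differ from the trial's QA software.

```python
import numpy as np

def contour_agreement(masks, voxel_cc):
    """Simple interobserver agreement metrics for binary delineation masks
    (array of shape observers x voxels): total-agreement volume
    (intersection), union volume, and the mean fraction of each observer's
    volume covered by the intersection, as a rough sensitivity analogue."""
    inter = np.logical_and.reduce(masks)      # voxels all observers agree on
    union = np.logical_or.reduce(masks)       # voxels any observer includes
    sens = np.mean([inter.sum() / m.sum() for m in masks])
    return inter.sum() * voxel_cc, union.sum() * voxel_cc, sens
```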

Relevance: 10.00%

Abstract:

Degeneration of the intervertebral disc, sometimes associated with low back pain and abnormal spinal motion, represents a major health issue with high costs. Non-invasive degeneration assessment via qualitative or quantitative MRI (magnetic resonance imaging) is possible; yet no relation between mechanical properties and T2 maps of the intervertebral disc (IVD) has been considered, although T2 relaxation time values quantify the degree of degeneration. Therefore, MRI scans and mechanical tests were performed on 14 human lumbar intervertebral segments freed from posterior elements and all soft tissues excluding the IVD. Degeneration was evaluated in each specimen using morphological criteria, qualitative T2-weighted images and quantitative axial T2 map data, and stiffness was calculated from the load-deflection curves of in vitro compression, torsion, lateral bending and flexion/extension tests. In addition to the mean T2, the Otsu threshold of T2 (TOTSU), a robust and automatic histogram-based method that computes the optimal threshold maximizing the separation of two classes of values, was calculated for the anterior, posterior, left and right regions of each annulus fibrosus (AF). While mean T2 and the degeneration grading schemes were not related to the IVDs' mechanical properties, TOTSU computed in the posterior AF correlated significantly with those classifications as well as with all stiffness values. TOTSU should therefore be included in future degeneration grading schemes.
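
Otsu's threshold on a sample of T2 values can be computed from its histogram as below; this is the generic algorithm, not necessarily the exact implementation used in the study.

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Otsu's method on a 1D sample (e.g. T2 relaxation times within an
    annulus region): pick the threshold that maximizes the between-class
    variance of the resulting two-class split."""
    counts, edges = np.histogram(values, bins=n_bins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    p = counts / counts.sum()
    w0 = np.cumsum(p)                       # class-0 probability up to each bin
    m = np.cumsum(p * mids)                 # cumulative first moment
    m_tot = m[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (m_tot * w0[valid] - m[valid])**2 / (w0[valid] * w1[valid])
    return mids[np.argmax(between)]
```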

Relevance: 10.00%

Abstract:

In this paper, we propose a fully automatic, robust approach for segmenting the proximal femur in conventional X-ray images. Our method is based on hierarchical landmark detection by random forest regression, where the detection results of 22 global landmarks are used for spatial normalization, and the detection results of 59 local landmarks serve as the image cue for instantiation of a statistical shape model of the proximal femur. To detect landmarks at both levels, we use multi-resolution HOG (Histogram of Oriented Gradients) features, which achieve better accuracy and robustness. The efficacy of the present method is demonstrated by experiments conducted on 150 clinical X-ray images. It was found that the present method achieved an average point-to-curve error of 2.0 mm and was robust to low image contrast, noise, and occlusions caused by implants.
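
A sketch of a multi-resolution HOG descriptor using scikit-image follows; the cell and block sizes are illustrative, not the paper's settings.

```python
import numpy as np
from skimage.feature import hog

def multires_hog(image, scales=(1, 2, 4)):
    """Concatenate HOG features computed on progressively downsampled copies
    of a grayscale radiograph, giving a multi-resolution descriptor around a
    candidate landmark. Parameter choices here are assumptions."""
    feats = []
    for s in scales:
        img = image[::s, ::s]                        # crude downsampling
        feats.append(hog(img, orientations=9,
                         pixels_per_cell=(8, 8),
                         cells_per_block=(2, 2)))
    return np.concatenate(feats)
```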

Relevance: 10.00%

Abstract:

Cognitive event-related potentials (ERPs) are widely employed in the study of dementive disorders. The morphology of the averaged response is known to be influenced by neurodegenerative processes and is exploited for diagnostic purposes. This work is built on the idea that there is additional information in the dynamics of single-trial responses. We introduce a novel way to detect mild cognitive impairment (MCI) from recordings of auditory ERP responses. Using single-trial responses from a cohort of 25 amnestic MCI patients and a group of age-matched controls, we suggest a descriptor capable of encapsulating single-trial (ST) response dynamics for the benefit of early diagnosis. A customized vector quantization (VQ) scheme is first employed to summarize the overall set of ST responses by means of a small-sized codebook of brain waves that is semantically organized. Each ST response is then treated as a trajectory that can be encoded as a sequence of code vectors. A subject's set of responses is consequently represented as a histogram of activated code vectors. Discriminating MCI patients from healthy controls is based on the deduced response profiles and carried out by means of a standard machine learning procedure. The novel response representation was found to significantly improve MCI detection with respect to the standard alternative representation obtained via ensemble averaging (13% in terms of sensitivity and 6% in terms of specificity). Hence, the role of cognitive ERPs as a biomarker for MCI can be enhanced by adopting the fine-grained description of our VQ scheme.
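
The histogram-of-activated-code-vectors representation can be sketched with plain k-means standing in for the customized, semantically organized VQ scheme of the paper:

```python
import numpy as np
from sklearn.cluster import KMeans

def response_profile(trials, codebook_size=32, seed=0):
    """Sketch of a subject's response profile: learn a codebook over
    single-trial ERP segments (trials x samples) with k-means, encode each
    trial as its nearest code vector, and return the normalized histogram
    of activated code vectors."""
    km = KMeans(n_clusters=codebook_size, n_init=10, random_state=seed).fit(trials)
    codes = km.predict(trials)                       # code vector per trial
    hist = np.bincount(codes, minlength=codebook_size).astype(float)
    return hist / hist.sum()                         # the response profile
```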

Relevance: 10.00%

Abstract:

OBJECTIVE Texture analysis is an alternative method to quantitatively assess MR images. In this study, we introduce dynamic texture parameter analysis (DTPA), a novel technique to investigate the temporal evolution of texture parameters using dynamic susceptibility contrast enhanced (DSCE) imaging. Here, we aim to introduce the method and its application to enhancing lesions (EL), non-enhancing lesions (NEL) and normal-appearing white matter (NAWM) in multiple sclerosis (MS). METHODS We investigated 18 patients with MS and clinically isolated syndrome (CIS), according to the 2010 McDonald criteria, using DSCE imaging at different field strengths (1.5 and 3 Tesla). Tissues of interest (TOIs) were defined within 27 EL, 29 NEL and 37 NAWM areas after normalization, and eight histogram-based texture parameter maps (TPMs) were computed. TPMs quantify the heterogeneity of the TOI. For every TOI, the average, variance, skewness, kurtosis and variance-of-the-variance statistical parameters were calculated. These TOI parameters were further analyzed using one-way ANOVA followed by multiple Wilcoxon rank sum tests corrected for multiple comparisons. RESULTS Tissue- and time-dependent differences were observed in the dynamics of the computed texture parameters. Sixteen parameters discriminated between EL, NEL and NAWM (pAVG = 0.0005). Significant differences in the DTPA texture maps were found during inflow (52 parameters), outflow (40 parameters) and reperfusion (62 parameters). The strongest discriminators among the TPMs were the variance-related parameters, while skewness and kurtosis TPMs were in general less sensitive to differences between the tissues. CONCLUSION DTPA of DSCE image time series revealed characteristic time responses for ELs, NELs and NAWM. This may be further used for a refined quantitative grading of MS lesions during their evolution from the acute to the chronic state. DTPA discriminates lesions beyond features of enhancement or T2 hypersignal, on a numeric scale that allows a more subtle grading of MS lesions.
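
A minimal sketch of the per-time-point statistics underlying DTPA, assuming a TOI extracted from the DSCE series as a (time, voxels) array; the full method computes these from texture parameter maps rather than raw intensities.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def dtpa_curves(toi_timeseries):
    """For a TOI given as a (time, voxels) array from a DSCE series, compute
    histogram-based statistics at every time point, yielding one
    parameter-versus-time curve per statistic."""
    return {
        "average":  toi_timeseries.mean(axis=1),
        "variance": toi_timeseries.var(axis=1),
        "skewness": skew(toi_timeseries, axis=1),
        "kurtosis": kurtosis(toi_timeseries, axis=1),
    }
```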

Relevance: 10.00%

Abstract:

Purpose: To evaluate normal tissue dose reduction in step-and-shoot intensity-modulated radiation therapy (IMRT) on the Varian 2100 platform by tracking the multileaf collimator (MLC) apertures with the accelerator jaws. Methods: Clinical radiation treatment plans for 10 thoracic, 3 pediatric and 3 head-and-neck patients were converted to plans with the jaws tracking each segment's MLC aperture. Each segment was then renormalized to account for the change in collimator scatter, to obtain target coverage within 1% of that in the original plan. The new plans were compared to the original plans in a commercial radiation treatment planning system (TPS). Reduction in normal tissue dose was evaluated in the new plan using the parameters V5, V10, and V20 of the cumulative dose-volume histogram for the following structures: total lung minus GTV (gross target volume), heart, esophagus, spinal cord, liver, parotids, and brainstem. To validate the accuracy of our beam model, MLC transmission measurements were made and compared to those predicted by the TPS. Results: The greatest change between the original and new plans occurred at lower dose levels. The reduction in V20 was never more than 6.3% and was typically less than 1% for all patients. The reduction in V5 was at most 16.7% and was typically less than 3% for all patients. The variation in normal tissue dose reduction was not predictable, and we found no clear parameters indicating which patients would benefit most from jaw tracking. Our TPS model of MLC transmission agreed with measurements to within an absolute transmission difference of 0.1%, so uncertainties in the model did not contribute significantly to the uncertainty in the dose determination. Conclusion: The dose reduction achieved by collimating the jaws around each MLC aperture in step-and-shoot IMRT does not appear to be clinically significant.
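
The V5/V10/V20 metrics used here are read off the cumulative dose-volume histogram; a minimal sketch from per-voxel dose and volume arrays:

```python
import numpy as np

def v_dose(dose, volumes, levels=(5.0, 10.0, 20.0)):
    """Compute VD metrics from per-voxel dose (Gy) and voxel volumes (cc):
    V5/V10/V20 are the percentages of the structure volume receiving at
    least 5, 10 and 20 Gy, i.e. points on the cumulative DVH."""
    total = volumes.sum()
    return {f"V{int(d)}": 100.0 * volumes[dose >= d].sum() / total
            for d in levels}
```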

Relevance: 10.00%

Abstract:

The prognosis for lung cancer patients remains poor: five-year survival rates have been reported to be 15%. Studies have shown that dose escalation to the tumor can lead to better local control and subsequently better overall survival. However, the dose to the lung tumor is limited by normal tissue toxicity, the most prevalent thoracic toxicity being radiation pneumonitis. To determine a safe dose that can be delivered to the healthy lung, researchers have turned to mathematical models predicting the rate of radiation pneumonitis. However, these models rely on simple metrics based on the dose-volume histogram and are not yet accurate enough to be used for dose escalation trials. The purpose of this work was to improve the fit of predictive risk models for radiation pneumonitis and to show the dosimetric benefit of using the models to guide patient treatment planning. The study was divided into three specific aims, the first two focused on improving the fit of the predictive model. In Specific Aim 1 we incorporated information about the spatial location of the lung dose distribution into a predictive model. In Specific Aim 2 we incorporated ventilation-based functional information into a predictive pneumonitis model. In the third specific aim, a proof-of-principle virtual simulation was performed in which a model-determined limit was used to scale the prescription dose. The data showed that for our patient cohort, the fit of the model was not improved by incorporating spatial information. Although we were not able to achieve a significant improvement in model fit using pre-treatment ventilation, we show some promising results indicating that ventilation imaging can provide useful information about lung function in lung cancer patients. The virtual simulation trial demonstrated that using a personalized lung dose limit derived from a predictive model results in a different prescription than that achieved with the clinically used plan, demonstrating the utility of a normal tissue toxicity model in personalizing the prescription dose.
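
As context for such models, a common simple form is a logistic NTCP curve on mean lung dose; the sketch below uses illustrative literature-style parameters, not the models fitted in this work.

```python
def pneumonitis_risk(mean_lung_dose, td50=30.8, gamma50=0.97):
    """Logistic NTCP sketch relating mean lung dose (Gy, > 0) to radiation
    pneumonitis risk: NTCP = 1 / (1 + (TD50/D)^(4*gamma50)). The td50 and
    gamma50 values here are illustrative assumptions."""
    return 1.0 / (1.0 + (td50 / mean_lung_dose) ** (4.0 * gamma50))
```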

Relevance: 10.00%

Abstract:

In a phase I clinical trial, six multiple myeloma patients, who were non-responsive to conventional therapy and were scheduled for bone marrow transplantation, received Holmium-166 ($^{166}$Ho) labeled to a bone-seeking agent, DOTMP (1,4,7,10-tetraazacyclododecane-1,4,7,10-tetramethylene-phosphonic acid), for the purpose of bone marrow ablation. The specific aims of my research within this protocol were to evaluate the toxicity and efficacy of $^{166}$Ho DOTMP by quantifying the in vivo pharmacokinetics and radiation dosimetry, and by correlating these results with the biologic response observed. The reproducibility of the pharmacokinetics of multiple injections of $^{166}$Ho DOTMP administered to these myeloma patients was demonstrated from both blood and whole-body retention. The skeletal concentration of $^{166}$Ho DOTMP was heterogeneous in all six patients: high in the ribs, pelvis, and lumbar vertebrae, and relatively low in the femurs, arms, and head. A novel technique was developed to calculate the radiation dose to the bone marrow in each skeletal region of interest, and was applied to all six $^{166}$Ho DOTMP patients. Radiation dose estimates for the bone marrow calculated using the standard MIRD "S" factors were compared with the average values derived from the heterogeneous distribution of activity in the skeleton (i.e., the regional technique). The results from the two techniques were significantly different; the average dose estimates from the regional technique were typically 30% greater. Furthermore, the regional technique provided a range of radiation doses over the entire marrow volume, while the MIRD "S" factors provided only a single value. Dose-volume histogram analysis of data from the regional technique indicated a range of dose estimates that varied by a factor of 10 between the high-dose and low-dose regions. Finally, the observed clinical response of cells and abnormal proteins measured in bone marrow aspirates and peripheral blood samples was compared with radiation dose estimates for the bone marrow calculated from the standard and regional techniques. The results showed that the regional-technique values correlated more closely with several clinical response parameters. (Abstract shortened by UMI.)

Relevance: 10.00%

Abstract:

BMC Clin Pathol. 2014 May 1;14:19. doi: 10.1186/1472-6890-14-19. A case of EDTA-dependent pseudothrombocytopenia: simple recognition of an underdiagnosed and misleading phenomenon. Nagler M, Keller P, Siegrist D, Alberio L. Department of Hematology and Central Hematology Laboratory, Inselspital University Hospital and University of Berne, CH-3010 Berne, Switzerland. BACKGROUND: EDTA-dependent pseudothrombocytopenia (EDTA-PTCP) is a common laboratory phenomenon, with a prevalence ranging from 0.1-2% in hospitalized patients to 15-17% in outpatients evaluated for isolated thrombocytopenia. Despite its harmlessness, EDTA-PTCP frequently leads to time-consuming, costly and even invasive diagnostic investigations. EDTA-PTCP is often overlooked because blood smears are not evaluated visually in routine practice, and the histograms and warning flags of hematology analyzers are not interpreted correctly. Nonetheless, EDTA-PTCP may be diagnosed easily even by general practitioners without any experience in blood film examination. This is the first report illustrating the typical patterns of the platelet (PLT) and white blood cell (WBC) histograms of hematology analyzers. CASE PRESENTATION: A 37-year-old female patient of Caucasian origin was referred with suspected acute leukemia, and the emergency unit arranged extensive work-up investigations. However, examination of the EDTA blood sample revealed atypical lymphocytes and an isolated thrombocytopenia, together with typical patterns of the WBC and PLT histograms: a serrated curve of the platelet histogram and a peculiar peak on the left side of the WBC histogram. EDTA-PTCP was confirmed by a normal platelet count when examining citrated blood. CONCLUSION: Awareness of typical PLT and WBC patterns may alert to the presence of EDTA-PTCP in routine laboratory practice, helping to avoid unnecessary investigations and over-treatment. PMCID: PMC4012027. PMID: 24808761.

Relevance: 10.00%

Abstract:

We present an image quality assessment and enhancement method for high-resolution Fourier-domain OCT imaging, as used in sub-threshold retina therapy. A maximum-likelihood deconvolution algorithm as well as a histogram-based quality assessment method are evaluated.
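
Richardson-Lucy is the classic maximum-likelihood deconvolution under Poisson noise; assuming it stands in for the evaluated algorithm, a sketch with a histogram-entropy quality proxy might look like:

```python
import numpy as np
from skimage import restoration

def enhance_bscan(bscan, psf, iterations=30):
    """Deconvolve an OCT B-scan (2D array, values in [0,1]) with an assumed
    point-spread function via Richardson-Lucy, and report a histogram-based
    sharpness proxy (intensity-histogram entropy) before and after.
    Whether this matches the evaluated algorithm is an assumption."""
    deconv = restoration.richardson_lucy(bscan, psf, iterations)

    def hist_entropy(img):
        # entropy of the normalized intensity histogram
        p, _ = np.histogram(img, bins=64, range=(0, 1))
        p = p[p > 0] / p[p > 0].sum()
        return -np.sum(p * np.log(p))

    return deconv, hist_entropy(bscan), hist_entropy(deconv)
```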