188 results for Histograms
Abstract:
Statistical computing when input/output is driven by a Graphical User Interface is considered. A proposal is made for automatic control of computational flow to ensure that only strictly required computations are actually carried out. The computational flow is modeled by a directed graph for implementation in any object-oriented programming language with symbolic manipulation capabilities. A complete implementation example is presented to compute and display frequency-based piecewise linear density estimators such as histograms or frequency polygons.
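The directed-graph flow control described above can be pictured with a small sketch (not the paper's implementation): each node caches its value and recomputes only when an upstream input has been invalidated, so only strictly required computations are carried out.

```python
# Minimal sketch of lazy recomputation on a directed dependency graph.

class Node:
    def __init__(self, func, *inputs):
        self.func = func          # computation attached to this node
        self.inputs = inputs      # upstream nodes this one depends on
        self.dependents = []      # downstream nodes to invalidate
        self.value = None
        self.stale = True
        for parent in inputs:
            parent.dependents.append(self)

    def invalidate(self):
        # Mark this node and everything downstream as needing recomputation.
        if not self.stale:
            self.stale = True
            for child in self.dependents:
                child.invalidate()

    def get(self):
        # Recompute only if stale; otherwise reuse the cached value.
        if self.stale:
            self.value = self.func(*(p.get() for p in self.inputs))
            self.stale = False
        return self.value


class Source(Node):
    def __init__(self, value):
        super().__init__(None)
        self.value, self.stale = value, False

    def set(self, value):
        self.value = value
        for child in self.dependents:
            child.invalidate()


data = Source([1.0, 2.5, 2.7, 3.1])
hist = Node(lambda xs: sorted(xs), data)   # stand-in for a histogram computation
print(hist.get())           # computed now
data.set([0.5, 0.6, 4.2])   # marks hist stale
print(hist.get())           # recomputed only because its input changed
```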
Abstract:
PURPOSE: To quantify the relationship between bone marrow (BM) response to radiation and radiation dose by using ¹⁸F-labeled fluorodeoxyglucose positron emission tomography ([¹⁸F]FDG-PET) standard uptake values (SUV) and to correlate these findings with hematological toxicity (HT) in cervical cancer (CC) patients treated with chemoradiation therapy (CRT). METHODS AND MATERIALS: Seventeen women with a diagnosis of CC were treated with standard doses of CRT. All patients underwent pre- and post-therapy [¹⁸F]FDG-PET/computed tomography (CT). Hemograms were obtained before, during, and 3 months after treatment, and at last follow-up. Pelvic bone was autosegmented as total bone marrow (BMTOT). Active bone marrow (BMACT) was contoured based on SUV greater than the mean SUV of BMTOT. The volumes (V) of each region receiving 10, 20, 30, and 40 Gy (V10, V20, V30, and V40, respectively) were calculated. Metabolic volume histograms and voxel SUV map response graphs were created. Relative changes in SUV before and after therapy were calculated by separating SUV voxels into radiation therapy dose ranges of 5 Gy. The relationships among SUV decrease, radiation dose, and HT were investigated using multiple regression models. RESULTS: Mean relative pre-post-therapy SUV reductions in BMTOT and BMACT were 27% and 38%, respectively. BMACT volume was significantly reduced after treatment (from 651.5 to 231.6 cm³; P<.0001). BMACT V30 was significantly correlated with the reduction in BMACT SUV (R² = 0.14; P<.001). The reduction in BMACT SUV significantly correlated with the reduction in white blood cells (WBCs) at 3 months post-treatment (R² = 0.27; P=.04) and at last follow-up (R² = 0.25; P=.04). Different dosimetric parameters of BMTOT and BMACT correlated with long-term hematological outcome. CONCLUSIONS: The volumes of BMTOT and BMACT that are exposed to even relatively low doses of radiation are associated with a decrease in WBC counts following CRT. The loss in proliferative BM SUV uptake translates into low WBC nadirs after treatment. These results suggest the potential of intensity modulated radiation therapy to spare BMTOT to reduce long-term hematological toxicity.
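As an illustration of the dosimetric quantities used above (not the authors' code), here is a short sketch of how V10-V40 and per-5-Gy-bin relative SUV changes could be computed from co-registered voxel arrays:

```python
import numpy as np

def v_dose(dose_gy, thresholds=(10, 20, 30, 40)):
    """Percent of voxels receiving at least each threshold dose (V10...V40)."""
    dose_gy = np.asarray(dose_gy, dtype=float)
    return {f"V{t}": 100.0 * np.mean(dose_gy >= t) for t in thresholds}

def suv_change_by_dose_bin(dose_gy, suv_pre, suv_post, bin_width=5.0):
    """Mean relative SUV change within successive dose ranges of `bin_width` Gy."""
    dose_gy, suv_pre, suv_post = map(np.asarray, (dose_gy, suv_pre, suv_post))
    edges = np.arange(0.0, dose_gy.max() + bin_width, bin_width)
    rel_change = (suv_post - suv_pre) / suv_pre
    out = {}
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (dose_gy >= lo) & (dose_gy < hi)
        if mask.any():
            out[f"{lo:.0f}-{hi:.0f} Gy"] = float(rel_change[mask].mean())
    return out
```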
Abstract:
Aim: When planning SIRT using 90Y microspheres, the partition model is used to refine the activity calculated by the body surface area (BSA) method to potentially improve the safety and efficacy of treatment. For this partition model dosimetry, accurate determination of the mean tumor-to-normal liver ratio (TNR) is critical since it directly impacts absorbed dose estimates. This work aimed at developing and assessing a reliable methodology for the calculation of 99mTc-MAA SPECT/CT-derived TNR based on phantom studies. Materials and methods: IQ NEMA (6 hot spheres) and Kyoto liver phantoms with different hot/background activity concentration ratios were imaged on a SPECT/CT (GE Infinia Hawkeye 4). For each reconstruction with the IQ phantom, TNR quantification was assessed in terms of relative recovery coefficients (RC), and image noise was evaluated in terms of the coefficient of variation (COV) in the filled background. RCs were compared using OSEM with Hann, Butterworth and Gaussian filters, as well as FBP reconstruction algorithms. Regarding OSEM, RCs were assessed by varying different parameters independently, such as the number of iterations (i) and subsets (s) and the cut-off frequency of the filter (fc). The influence of the attenuation and scatter corrections was also investigated. Furthermore, both 2D-ROI and 3D-VOI contouring were compared. For this purpose, dedicated Matlab routines were developed in-house for automatic 2D-ROI/3D-VOI determination to reduce intra-user and intra-slice variability. The best reconstruction parameters and RCs obtained with the IQ phantom were used to recover corrected TNR in the case of the Kyoto phantom for arbitrary hot-lesion size. In addition, we computed TNR volume histograms to better assess uptake heterogeneity. Results: The highest RCs were obtained with OSEM (i=2, s=10) coupled with the Butterworth filter (fc=0.8). Indeed, we observed a global 20% RC improvement over other OSEM settings and a 50% increase as compared to the best FBP reconstruction. In any case, both attenuation and scatter corrections must be applied, thus improving RC while preserving good image noise (COV<10%). Both 2D-ROI and 3D-VOI analysis led to similar results. Nevertheless, we recommend using 3D-VOI since tumor uptake regions are intrinsically 3D. RC-corrected TNR values lie within 17% of the true value, substantially improving the evaluation of small-volume (<15 mL) regions. Conclusions: This study reports the multi-parameter optimization of 99mTc-MAA SPECT/CT image reconstruction in planning 90Y dosimetry for SIRT. In phantoms, accurate quantification of TNR was obtained using OSEM coupled with Butterworth filtering and RC correction.
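A minimal sketch, assuming the usual definitions, of how the phantom-derived recovery coefficients, background COV, and RC-corrected TNR relate (the authors' Matlab routines are not reproduced here):

```python
import numpy as np

def coefficient_of_variation(background_voxels):
    # COV of the filled background: std / mean of the voxel values.
    bg = np.asarray(background_voxels, dtype=float)
    return float(bg.std() / bg.mean())

def recovery_coefficient(measured_conc, true_conc):
    # RC = measured / true activity concentration in a hot sphere of known activity.
    return measured_conc / true_conc

def corrected_tnr(measured_tnr, rc):
    # Dividing the measured TNR by the size-dependent RC compensates partial-volume losses.
    return measured_tnr / rc

# e.g. corrected = corrected_tnr(measured_tnr=4.2, rc=recovery_coefficient(0.7, 1.0))
```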
Abstract:
PURPOSE: Peptide receptor radionuclide therapy (PRRT) delivers high absorbed doses to kidneys and may lead to permanent nephropathy. Reliable dosimetry of kidneys is thus critical for safe and effective PRRT. The aim of this work was to assess the feasibility of planning PRRT based on 3D radiobiological dosimetry (3D-RD) in order to optimize both the amount of activity to administer and the fractionation scheme, while limiting the absorbed dose and the biologically effective dose (BED) to the renal cortex. METHODS: Planar and SPECT data were available for a patient examined with ¹¹¹In-DTPA-octreotide at 0.5 (planar only), 4, 24, and 48 h post-injection. Absorbed dose and BED distributions were calculated for common therapeutic radionuclides, i.e., ¹¹¹In, ⁹⁰Y and ¹⁷⁷Lu, using the 3D-RD methodology. Dose-volume histograms were computed, and mean absorbed doses to kidneys, renal cortices, and medullae were compared with results obtained using the MIRD schema (S-values) with the multiregion kidney dosimetry model. Two different treatment planning approaches, based on (1) a fixed absorbed dose to the cortex and (2) a fixed BED to the cortex, were then considered to optimize the activity to administer by varying the number of fractions. RESULTS: Mean absorbed doses calculated with 3D-RD were in good agreement with those obtained with S-value-based SPECT dosimetry for ⁹⁰Y and ¹⁷⁷Lu. Nevertheless, for ¹¹¹In, differences of 14% and 22% were found for the whole kidneys and the cortex, respectively. Moreover, the authors found that planar-based dosimetry systematically underestimates the absorbed dose in comparison with SPECT-based methods, by up to 32%. Regarding the 3D-RD-based treatment planning using a fixed BED constraint to the renal cortex, the optimal number of fractions was found to be 3 or 4, depending on the radionuclide administered and the value of the fixed BED. Cumulative activities obtained using the proposed simulated treatment planning are compatible with real activities administered to patients in PRRT. CONCLUSIONS: The 3D-RD treatment planning approach based on a fixed BED was found to be the method of choice for clinical implementation in PRRT, providing a realistic activity to administer and number of cycles. While dividing the activity into several cycles is important to reduce renal toxicity, the clinical outcome of fractionated PRRT should be investigated in the future.
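For orientation, here is a hedged sketch of a commonly used BED expression for radionuclide therapy with a mono-exponentially decaying dose rate split into well-separated cycles; the formula and all parameter values below are illustrative assumptions, not values taken from the paper.

```python
# BED = D * (1 + G * (D/N) / (alpha/beta)), with Lea-Catcheside factor
# G = lambda_eff / (mu + lambda_eff) for mono-exponential dose delivery
# split into N well-separated cycles.

def bed_exponential(total_dose_gy, n_cycles, alpha_beta_gy,
                    lam_eff_per_h, mu_repair_per_h):
    g = lam_eff_per_h / (mu_repair_per_h + lam_eff_per_h)
    dose_per_cycle = total_dose_gy / n_cycles
    return total_dose_gy * (1.0 + g * dose_per_cycle / alpha_beta_gy)

# Example with placeholder parameters: 23 Gy to the renal cortex in 4 cycles,
# alpha/beta = 2.6 Gy, effective decay 0.043 h^-1, repair half-time ~2.8 h.
print(bed_exponential(23.0, 4, 2.6, 0.043, 0.693 / 2.8))
```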
Abstract:
Introduction: ICM+ software encapsulates our 20 years' experience in brain monitoring. It collects data from a variety of bedside monitors and produces time trends of parameters defined using configurable mathematical formulae. To date it is being used in nearly 40 clinical research centres worldwide. We present its application for continuous monitoring of cerebral autoregulation using near-infrared spectroscopy (NIRS). Methods: Data from multiple bedside monitors are processed by ICM+ in real time using a large selection of signal processing methods. These include various time and frequency domain analysis functions as well as fully customisable digital filters. The final results are displayed in a variety of ways including simple time trends, as well as time-window-based histograms, cross histograms, correlations, and so forth. All this allows complex information from bedside monitors to be summarized in a concise fashion and presented to medical and nursing staff in a simple way that alerts them to the development of various pathological processes. Results: One hundred and fifty patients monitored continuously with NIRS, arterial blood pressure (ABP) and intracranial pressure (ICP), where available, were included in this study. There were 40 severely head-injured adult patients and 27 SAH patients (NCCU, Cambridge); 60 patients undergoing cardiopulmonary bypass (Johns Hopkins Hospital, Baltimore); and 23 patients with sepsis (University Hospital, Basel). In addition, MCA flow velocity (FV) was monitored intermittently using transcranial Doppler. FV-derived and ICP-derived pressure reactivity indices (PRx, Mx), as well as NIRS-derived reactivity indices (Cox, Tox, Thx), were calculated and showed significant correlation with each other in all cohorts. Error-bar charts showing the reactivity index PRx versus CPP (optimal CPP chart), as well as similar curves for NIRS indices versus CPP and ABP, were also demonstrated. Conclusions: ICM+ software is proving to be a very useful tool for enhancing the battery of available means for monitoring cerebral vasoreactivity and potentially facilitating autoregulation-guided therapy. The complexity of data analysis is also hidden inside loadable profiles, thus allowing investigators to take full advantage of validated protocols including advanced processing formulas.
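Reactivity indices such as PRx, Mx or the NIRS-derived Cox are typically computed as moving correlation coefficients between slowly varying pressure and flow/oxygenation signals; the sketch below is not ICM+ code, and the window sizes are assumptions.

```python
import numpy as np

def moving_correlation_index(x, y, samples_per_window=30, step=6):
    """Pearson correlation of two equally sampled signals over sliding windows."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    out = []
    for start in range(0, len(x) - samples_per_window + 1, step):
        xs = x[start:start + samples_per_window]
        ys = y[start:start + samples_per_window]
        out.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(out)

# e.g. prx = moving_correlation_index(mean_abp_10s, mean_icp_10s)
```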
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide ¹³¹I was allowed to decay at random locations for each model size and for seven different ratios of the number of decays to the number of cells, Nr: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to Nr^(-1/2). From dose-volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and results were compared between the adjusted spherical and cellular models with similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
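The per-cell dose adjustment and the resulting TCP can be sketched as follows (illustrative only; the alpha/beta values are placeholders, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def cell_doses_for_bin(bin_mean_dose_gy, n_cells_in_bin, decays_per_cell):
    # Relative width of the Gaussian adjustment is Nr^(-1/2), as described above.
    sigma = bin_mean_dose_gy / np.sqrt(decays_per_cell)
    doses = rng.normal(bin_mean_dose_gy, sigma, size=n_cells_in_bin)
    return np.clip(doses, 0.0, None)   # absorbed dose cannot be negative

def tcp_poisson(cell_doses_gy, alpha=0.3, beta=0.03):
    # Poisson TCP from per-cell linear-quadratic survival
    # (alpha and beta are placeholder radiosensitivity parameters).
    survival = np.exp(-alpha * cell_doses_gy - beta * cell_doses_gy**2)
    return float(np.exp(-survival.sum()))
```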
Abstract:
Objective: To report a single-center experience treating patients with squamous-cell carcinoma of the anal canal using helical Tomotherapy (HT) and concurrent chemotherapy (CT). Materials/Methods: From October 2007 to February 2011, 55 patients were treated with HT and concurrent CT (5-fluorouracil/capecitabine and mitomycin) for anal squamous-cell carcinoma. All patients underwent computed-tomography-based treatment planning, with pelvic and inguinal nodes receiving 36 Gy in 1.8 Gy/fraction. Following a planned 1-week break, the primary tumor site and involved nodes were boosted to a total dose of 59.4 Gy in 1.8 Gy/fraction. Dose-volume histograms of several organs at risk (OAR; bladder, small intestine, rectum, femoral heads, penile bulb, external genitalia) were assessed in terms of conformal avoidance. All toxicity was scored according to the CTCAE, v.3.0. HT plans and treatment were implemented using the Tomotherapy, Inc. software and hardware. For dosimetric comparisons, 3D RT and/or IMRT plans were also computed for some of the patients using the CMS planning system, for treatment with 6-18 MV photons and/or electrons of suitable energies from a Siemens Primus linear accelerator equipped with a multileaf collimator. Locoregional control and survival curves were compared with the log-rank test, and multivariate analysis was performed with the Cox model. Results: With its 360-degree beam projection, HT has an advantage over other RT techniques (3D or 5-field step-and-shoot IMRT). There is significant improvement over 3D or 5-field IMRT plans in terms of dose conformity around the PTV, and dose gradients are steeper outside the target volume, resulting in reduced doses to OARs. Using HT, acute toxicity was acceptable and seemed to be better than historical standards. Conclusions: Our results suggest that HT combined with concurrent CT for anal cancer is effective and tolerable. Compared to 3D RT or 5-field step-and-shoot IMRT, there is better conformity around the PTV and better OAR sparing.
Abstract:
In this paper, we describe several techniques for detecting the tonic pitch value in Indian classical music. In Indian music, the raga is the basic melodic framework and it is built on the tonic. Tonic detection is therefore fundamental for any melodic analysis of Indian classical music. This work explores detection of the tonic by processing the pitch histograms of Indian classical music. Processing of pitch histograms using group delay functions, and its ability to amplify certain traits of Indian music in the pitch histogram, is discussed. Three different strategies to detect the tonic are proposed, namely the concert method, the template matching method and the segmented histogram method. The concert method exploits the fact that the tonic is constant over a piece/concert. The template matching and segmented histogram methods use the properties that (i) the tonic is always present in the background, and (ii) some notes are less inflected and dominant, to detect the tonic of individual pieces. All three methods yield good results for Carnatic music (90-100% accuracy), while for Hindustani music, the template method works best, provided the vādi and samvādi notes for a given piece are known (85%).
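A minimal sketch of the underlying pitch-histogram idea (not the authors' system, and omitting the group delay processing): fold a pitch track into one octave and take the strongest histogram peaks as tonic candidates. The reference frequency and bin width are assumptions.

```python
import numpy as np

def folded_pitch_histogram(f0_track_hz, ref_hz=110.0, bins_per_octave=120):
    # Convert voiced pitch values to cents and fold them into a single octave.
    f0 = np.asarray([f for f in f0_track_hz if f > 0], dtype=float)
    cents = 1200.0 * np.log2(f0 / ref_hz) % 1200.0
    hist, edges = np.histogram(cents, bins=bins_per_octave, range=(0, 1200))
    return hist, edges

def tonic_candidates(hist, edges, n=3):
    # Strongest histogram bins as candidate tonic positions (in cents).
    order = np.argsort(hist)[::-1][:n]
    centers = (edges[:-1] + edges[1:]) / 2.0
    return centers[order]
```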
Abstract:
In this paper we investigate how note onsets in Turkish Makam music compositions are distributed, and to what extent this distribution supports or contradicts the metrical structure of the pieces, the usul. We use MIDI data to derive the distributions in the form of onset histograms, and compare them with the metrical weights that are applied to describe the usul in theory. We compute correlation and syncopation values to estimate the degrees of support and contradiction, respectively. While the concept of syncopation is rarely mentioned in the context of this music, we can gain interesting insight into the structure of a piece using such a measure. We show that metrical contradiction is systematically applied in some metrical structures. We also compare the differences between Western music and Turkish Makam music regarding metrical support and contradiction. Such a study can help avoid pitfalls in later attempts to perform audio processing tasks such as beat tracking or rhythmic similarity measurements.
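A brief sketch, under assumed MIDI conventions, of how an onset histogram over metrical positions can be correlated with theoretical usul weights to quantify metrical support (not the authors' code):

```python
import numpy as np

def onset_histogram(onset_ticks, ticks_per_cycle, bins_per_cycle):
    # Accumulate onsets by their position within the metrical cycle.
    positions = np.asarray(onset_ticks) % ticks_per_cycle
    bin_idx = (positions * bins_per_cycle // ticks_per_cycle).astype(int)
    return np.bincount(bin_idx, minlength=bins_per_cycle)

def metrical_support(onset_hist, usul_weights):
    # Pearson correlation: high values indicate the onsets follow the usul weights.
    return float(np.corrcoef(onset_hist, usul_weights)[0, 1])
```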
Abstract:
The atomic force microscope is not only a very convenient tool for studying the topography of different samples, but it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other one to the substrate. Approaching the tip to the substrate allows the molecules to bind together; retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curve as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond-strength histograms. The algorithm is based on a fuzzy logic approach that permits an evaluation of "quality" for every event and makes the detection procedure much faster compared to manual selection. The software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
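A much-simplified sketch of the event-detection idea (the published fuzzy-logic software is not reproduced): rupture events appear as abrupt force releases in the retract curve, and each event gets a crude 0-1 quality score; the thresholds below are illustrative assumptions.

```python
import numpy as np

def detect_ruptures(force_pn, min_jump_pn=20.0):
    """Return candidate rupture events with a rough quality score in [0, 1]."""
    force = np.asarray(force_pn, dtype=float)
    jumps = np.diff(force)
    idx = np.where(jumps > min_jump_pn)[0]          # sudden force release
    events = []
    for i in idx:
        magnitude = jumps[i]
        sharpness = magnitude / (abs(jumps[i - 1]) + 1e-9) if i > 0 else 1.0
        quality = min(1.0, magnitude / 100.0) * min(1.0, sharpness)
        events.append({"index": int(i), "force_pn": float(magnitude),
                       "quality": float(quality)})
    return events

# Binding-force histogram from many curves:
# forces = [e["force_pn"] for curve in curves for e in detect_ruptures(curve)]
# hist, edges = np.histogram(forces, bins=30)
```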
Abstract:
The current study was initiated to quantify the stresses induced in critical details of the reinforcing jacket and the tower itself through the use of field instrumentation, load testing, and long-term monitoring. Strain gages were installed on both the tower and the reinforcing jacket. Additional strain gages were installed on two anchor rods. Tests were conducted with and without the reinforcing jacket installed. Data were collected from all strain gages during static load testing and were used to study the stress distribution in the tower caused by known loads, both with and without the reinforcing jacket. The tower was tested dynamically by first applying a static load and then quickly releasing it, causing the tower to vibrate freely. Furthermore, the tower was monitored over a period of more than 1 year to obtain stress-range histograms at the critical details to be used for a fatigue evaluation. During the long-term monitoring, triggered time-history data were also recorded to study the wind loading phenomena that excite the tower.
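To illustrate how such stress-range histograms feed a fatigue evaluation (this is not the study's own processing), here is a sketch that bins measured stress ranges and accumulates damage with Miner's rule; the detail constant and exponent are placeholders, not values from the paper.

```python
import numpy as np

def stress_range_histogram(stress_ranges_ksi, bin_width=0.5):
    """Histogram of counted stress-range cycles (ksi)."""
    ranges = np.asarray(stress_ranges_ksi, dtype=float)
    edges = np.arange(0.0, ranges.max() + bin_width, bin_width)
    counts, _ = np.histogram(ranges, bins=edges)
    return counts, edges

def miner_damage(counts, edges, detail_constant_a=44.0e8, exponent=3.0):
    """Miner's rule damage sum n_i / N_i with N = A / S^m (placeholder A, m)."""
    centers = (edges[:-1] + edges[1:]) / 2.0               # ksi
    cycles_to_failure = detail_constant_a / np.maximum(centers, 1e-9) ** exponent
    return float(np.sum(counts / cycles_to_failure))
```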
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis workshops.
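The MapReduce pattern for hypercube generation can be sketched in a few lines (this is not the Gaia framework itself; the row fields and bin sizes are made up for illustration):

```python
from collections import Counter
from functools import reduce

def map_to_cell(row, bin_sizes):
    # Map a catalogue row to a hypercube cell key by binning each quantity.
    return tuple((name, int(row[name] // size)) for name, size in bin_sizes.items())

def map_partition(rows, bin_sizes):
    # "Map" step: per-partition cell counts.
    return Counter(map_to_cell(r, bin_sizes) for r in rows)

def reduce_counts(a, b):
    # "Reduce" step: merge per-partition counts by summation.
    a.update(b)
    return a

# Example: two "partitions" of catalogue rows combined into one hypercube.
parts = [[{"mag": 14.2, "parallax": 3.1}, {"mag": 14.8, "parallax": 2.9}],
         [{"mag": 15.1, "parallax": 0.4}]]
bins = {"mag": 1.0, "parallax": 1.0}
hypercube = reduce(reduce_counts, (map_partition(p, bins) for p in parts))
print(hypercube)
```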
Abstract:
In this paper we introduce a highly efficient reversible data hiding system. It is based on dividing the image into tiles and shifting the histogram of each image tile between its minimum and maximum frequency. Data are then inserted at the pixel level with the largest frequency to maximize data hiding capacity. The scheme exploits the special properties of medical images, where the histograms of their non-overlapping image tiles mostly peak around a few gray values while the rest of the spectrum is mainly empty. The zeros (or minima) and peaks (maxima) of the histograms of the image tiles are then relocated to embed the data, so the gray values of some pixels are modified. High capacity, high fidelity, reversibility and multiple data insertions are the key requirements of data hiding in medical images. We show how histograms of image tiles of medical images can be exploited to achieve these requirements. Compared with data hiding applied to the whole image, our scheme can achieve a 30%-200% capacity improvement while maintaining better image quality, depending on the medical image content. Additional advantages of the proposed method include hiding data in the regions of non-interest and better exploitation of spatial masking.
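The tile-level embedding follows classic peak/zero histogram shifting; below is a simplified single-tile sketch (assuming the minimum bin lies above the peak, and omitting the paper's multiple-insertion and region handling):

```python
import numpy as np

def embed_tile(tile, bits):
    """Embed a list of 0/1 bits into one 8-bit image tile by histogram shifting."""
    tile = tile.astype(np.int32)
    hist = np.bincount(tile.ravel(), minlength=256)
    peak = int(hist.argmax())                          # most frequent gray value
    zero = int(hist[peak + 1:].argmin()) + peak + 1    # empty/minimal bin above the peak
    # Shift gray values strictly between peak and zero up by one,
    # freeing the bin next to the peak.
    tile[(tile > peak) & (tile < zero)] += 1
    # Embed: a '1' moves a peak pixel to peak+1, a '0' leaves it at peak.
    flat = tile.ravel()
    bit_iter = iter(bits)
    for i in np.where(flat == peak)[0]:
        b = next(bit_iter, None)
        if b is None:
            break
        flat[i] = peak + b
    return flat.reshape(tile.shape), peak, zero        # peak/zero needed for extraction
```

Capacity per tile equals the count of peak-valued pixels, which is why the sparse, strongly peaked histograms of medical image tiles are advantageous.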
Abstract:
This paper presents a novel image classification scheme for benthic coral reef images that can be applied to both single-image and composite mosaic datasets. The proposed method can be configured to the characteristics (e.g., the size of the dataset, number of classes, resolution of the samples, color information availability, class types, etc.) of individual datasets. It uses completed local binary pattern (CLBP), grey level co-occurrence matrix (GLCM), Gabor filter response, and opponent angle and hue channel color histograms as feature descriptors. For classification, either k-nearest neighbor (KNN), neural network (NN), support vector machine (SVM) or probability density weighted mean distance (PDWMD) is used. The combination of features and classifiers that attains the best results is presented together with guidelines for selection. The accuracy and efficiency of the proposed method are compared with other state-of-the-art techniques using three benthic and three texture datasets. The proposed method achieves the highest overall classification accuracy of any of the tested methods and has moderate execution time. Finally, the proposed classification scheme is applied to a large-scale image mosaic of the Red Sea to create a completely classified thematic map of the reef benthos.
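As a greatly simplified stand-in for the pipeline above (only a color histogram feature instead of CLBP/GLCM/Gabor, and only KNN among the listed classifiers), here is a sketch of the train-and-classify step:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def color_histogram(patch_rgb, bins_per_channel=8):
    # Normalized per-channel histogram of an (H, W, 3) image patch.
    feats = []
    for c in range(3):
        hist, _ = np.histogram(patch_rgb[..., c], bins=bins_per_channel,
                               range=(0, 256), density=True)
        feats.append(hist)
    return np.concatenate(feats)

def train_and_classify(train_patches, train_labels, test_patches, k=5):
    x_train = np.array([color_histogram(p) for p in train_patches])
    x_test = np.array([color_histogram(p) for p in test_patches])
    clf = KNeighborsClassifier(n_neighbors=k).fit(x_train, train_labels)
    return clf.predict(x_test)
```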