18 results for Weighted histogram analysis method

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Abstract:

OBJECTIVE: The purpose of our study was to evaluate the efficacy of CT histogram analysis for further characterization of lipid-poor adenomas on unenhanced CT. MATERIALS AND METHODS: One hundred thirty-two adrenal nodules were identified in 104 patients with lung cancer who underwent PET/CT. Sixty-five nodules were classified as lipid-rich adenomas if they had an unenhanced CT attenuation of less than or equal to 10 H. Thirty-one masses were classified as lipid-poor adenomas if they had an unenhanced CT attenuation greater than 10 H and stability for more than 1 year. Thirty-six masses were classified as lung cancer metastases if they showed rapid growth in 1 year (n = 27) or were biopsy-proven (n = 9). Histogram analysis was performed for all lesions to provide the mean attenuation value and percentage of negative pixels. RESULTS: All lipid-rich adenomas had more than 10% negative pixels; 51.6% of lipid-poor adenomas had more than 10% negative pixels and would have been classified as indeterminate nodules on the basis of mean attenuation alone. None of the metastases had more than 10% negative pixels. Using an unenhanced CT mean attenuation threshold of less than 10 H yielded a sensitivity of 68% and specificity of 100% for the diagnosis of an adenoma. Using an unenhanced CT threshold of more than 10% negative pixels yielded a sensitivity of 84% and specificity of 100% for the diagnosis of an adenoma. CONCLUSION: CT histogram analysis is superior to mean CT attenuation analysis for the evaluation of adrenal nodules and may help decrease referrals for additional imaging or biopsy.
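
The decision rule described above is easy to reproduce. Below is a minimal sketch (not the authors' implementation) of the negative-pixel criterion, assuming the ROI attenuation values are available as a NumPy array of Hounsfield units; the function name and the synthetic ROI are illustrative only:

    import numpy as np

    def classify_adrenal_nodule(roi_hu):
        """Apply the histogram criterion reported above.

        roi_hu: attenuation values (Hounsfield units) of all pixels in the ROI.
        """
        pct_negative = 100.0 * np.count_nonzero(roi_hu < 0) / roi_hu.size
        if pct_negative > 10.0:
            return "adenoma (>10% negative pixels)"
        if roi_hu.mean() <= 10.0:
            return "adenoma (mean attenuation <= 10 H)"
        return "indeterminate"

    # Example: a synthetic ROI in which about 15% of pixels are negative.
    rng = np.random.default_rng(0)
    roi = np.where(rng.random(1000) < 0.15, -20.0, 25.0)
    print(classify_adrenal_nodule(roi))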

Relevance:

100.00%

Abstract:

Phosphorus (P) is an essential macronutrient for all living organisms. Phosphorus is often present in nature as the soluble phosphate ion PO₄³⁻ and has biological, terrestrial, and marine emission sources. Thus, PO₄³⁻ detected in ice cores has the potential to be an important tracer for biological activity in the past. In this study, a continuous and highly sensitive absorption method for detection of dissolved reactive phosphorus (DRP) in ice cores has been developed using a molybdate reagent and a 2-m liquid waveguide capillary cell (LWCC). DRP is the soluble form of the nutrient phosphorus, which reacts with molybdate. The method was optimized for the low concentrations of DRP in Greenland ice, with a depth resolution of approximately 2 cm and an analytical uncertainty of 1.1 nM (0.1 ppb) PO₄³⁻. The method has been applied to segments of a shallow firn core from Northeast Greenland, indicating a mean concentration of 2.74 nM (0.26 ppb) PO₄³⁻ for the period 1930–2005, with a standard deviation of 1.37 nM (0.13 ppb) PO₄³⁻ and values reaching as high as 10.52 nM (1 ppb) PO₄³⁻. Similar levels were detected for the period 1771–1823. Based on impurity abundances, dust and biogenic particles were found to be the most likely sources of the DRP deposited in Northeast Greenland.
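
The paired nM/ppb values quoted above follow from the molar mass of phosphate (about 94.97 g/mol for PO₄³⁻). A small sketch of the conversion, as a sanity check on the units:

    MOLAR_MASS_PO4 = 94.97  # g/mol for the phosphate ion

    def nmol_per_litre_to_ppb(c_nm):
        # nmol/L x g/mol = ng/L; in dilute aqueous solution 1 ug/L ~ 1 ppb
        return c_nm * MOLAR_MASS_PO4 / 1000.0

    for c in (1.1, 2.74, 1.37, 10.52):
        print(f"{c:5.2f} nM = {nmol_per_litre_to_ppb(c):.2f} ppb")
    # Prints 0.10, 0.26, 0.13, 1.00 ppb, matching the values quoted above.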

Relevance:

100.00%

Abstract:

The purpose of this clinical trial was to determine the active tactile sensibility of natural teeth and to obtain a statistical analysis method for fitting a psychometric function through the observed data points. In 68 fully dentate test persons (34 males, 34 females, mean age 45.9 ± 16.1 years), one pair of healthy natural teeth each was tested: n = 24 anterior teeth and n = 44 posterior teeth. The computer-assisted, randomized measurement was done by having the subjects bite on thin copper foils of different thickness (5-200 µm) inserted between the teeth. The threshold of active tactile sensibility was defined as the 50% value of correct answers. Additionally, the gradient of the sensibility curve and the support area (90-10% value), which describe the shape of the sensibility curve, were calculated. Symmetric and asymmetric functions were used to model the sensibility curve. The mean sensibility threshold was 14.2 ± 12.1 µm. The older the subject, the higher the tactile threshold (r = 0.42, p = 0.0006). The support area was 41.8 ± 43.3 µm. The higher the 50% threshold, the smaller the gradient of the curve and the larger the support area. The curves of the active tactile sensibility of natural teeth show a tendency towards asymmetry, so that the active tactile sensibility of natural teeth is mathematically best described by the asymmetric Weibull function.
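
As a sketch of how such a psychometric function can be fitted, the snippet below fits a two-parameter Weibull function to hypothetical thickness/percent-correct data with SciPy and reads off the 50% threshold and the 90-10% support area; the data points are invented for illustration:

    import numpy as np
    from scipy.optimize import curve_fit

    # Weibull psychometric function: probability of a correct answer as a
    # function of foil thickness x (um), with scale a and shape b.
    def weibull(x, a, b):
        return 1.0 - np.exp(-(x / a) ** b)

    thickness = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0, 200.0])
    p_correct = np.array([0.10, 0.22, 0.55, 0.81, 0.93, 0.97, 1.00])

    (a, b), _ = curve_fit(weibull, thickness, p_correct,
                          p0=(20.0, 1.0), bounds=(0, np.inf))

    # Invert the fitted curve: x_p = a * (-ln(1 - p))**(1/b).
    x50 = a * (-np.log(0.5)) ** (1.0 / b)
    x10 = a * (-np.log(0.9)) ** (1.0 / b)
    x90 = a * (-np.log(0.1)) ** (1.0 / b)
    print(f"50% threshold = {x50:.1f} um, support area (90-10%) = {x90 - x10:.1f} um")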

Relevance:

100.00%

Abstract:

When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients, but also a marker for treatment initiation, and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to adjust appropriately for this type of confounding. In this paper we study a simple and intuitive approach to estimating similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditioned on covariate values at the start of each mimicked trial. This allows the study of questions that are not easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV Cohort Study.
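
The final analysis step lends itself to a short sketch. Below, the stratified weighted Cox analysis is mimicked with the lifelines library on an invented joint data set in which each constructed 'trial' is one stratum; the column names, weights, and values are hypothetical, not from the Swiss HIV Cohort Study:

    import pandas as pd
    from lifelines import CoxPHFitter

    # One row per subject per mimicked trial; 'trial' is the stratum and
    # 'weight' a per-row analysis weight (all values invented).
    df = pd.DataFrame({
        "time":    [12, 30, 24, 18, 36, 15, 28, 22, 40, 10, 33, 26],
        "event":   [1,  0,  1,  0,  0,  1,  1,  1,  0,  1,  0,  1],
        "treated": [1,  1,  0,  0,  1,  0,  1,  0,  1,  1,  0,  0],
        "trial":   [1,  1,  1,  1,  2,  2,  2,  2,  3,  3,  3,  3],
        "weight":  [1.0, 0.8, 1.2, 1.0, 0.9, 1.1,
                    1.0, 1.0, 0.7, 1.3, 1.0, 1.0],
    })

    # Stratified weighted Cox regression: each mimicked trial is one stratum.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event",
            strata=["trial"], weights_col="weight", robust=True)
    cph.print_summary()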

Relevance:

100.00%

Abstract:

Craniosynostosis is a premature fusion of the sutures of an infant skull that restricts skull and brain growth. During the last decades, there has been a rapid increase in fundamentally diverse surgical treatment methods. At present, surgical outcome has been assessed using global variables such as cephalic index, head circumference, and intracranial volume. However, these variables fail to describe the local deformations and morphological changes that may have a role in the neurologic disorders observed in these patients. This report describes a rigid image registration-based method for evaluating the outcome of craniosynostosis surgery, locally quantifying head growth, and indirectly measuring intracranial volume change. The developed semiautomatic analysis method was applied to computed tomography data sets of a 5-month-old boy with sagittal craniosynostosis who underwent expansion of the posterior skull with cranioplasty. Local changes between pre- and postoperative images were quantified by mapping the minimum distance of individual points from the preoperative to the postoperative surface mesh, and indirect intracranial volume changes were estimated. The proposed methodology can provide the surgeon with a tool for the quantitative evaluation of surgical procedures and the detection of abnormalities of the infant skull and its development.
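
The point-to-surface distance mapping can be sketched in a few lines. The example below approximates the minimum distance from each preoperative surface point to the postoperative mesh by a nearest-vertex query with a k-d tree; the vertex arrays are synthetic stand-ins for registered surface meshes:

    import numpy as np
    from scipy.spatial import cKDTree

    # Synthetic (N, 3) vertex arrays for pre- and postoperative skull
    # surfaces, assumed already rigidly registered to a common frame.
    rng = np.random.default_rng(1)
    pre_vertices = rng.normal(scale=50.0, size=(5000, 3))
    post_vertices = pre_vertices + np.array([0.0, 2.0, 0.0])  # toy 2 mm shift

    # Nearest postoperative vertex for every preoperative point.
    tree = cKDTree(post_vertices)
    min_dist, _ = tree.query(pre_vertices)

    print(f"mean displacement {min_dist.mean():.2f} mm, max {min_dist.max():.2f} mm")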

Relevance:

100.00%

Abstract:

BACKGROUND: Assessment of lung volume (functional residual capacity, FRC) and ventilation inhomogeneities with an ultrasonic flowmeter and multiple-breath washout (MBW) has been used to provide important information about lung disease in infants. Sub-optimal adjustment of the mainstream molar mass (MM) signal for temperature and external deadspace may lead to analysis errors in infants, whose tidal volume changes during breathing are critically small. METHODS: We measured expiratory temperature in human infants at 5 weeks of age and examined the influence of temperature and deadspace changes on FRC results with computer simulation modeling. A new analysis method with optimized temperature and deadspace settings was then derived, tested for robustness to analysis errors, and compared with the previously used analysis methods. RESULTS: Temperature in the facemask was higher, and variations of deadspace volumes larger, than previously assumed. Both had a considerable impact on FRC and lung clearance index (LCI) results, which showed high variability when obtained with the previously used analysis model. Using the measured temperature, we optimized the model parameters and tested the newly derived analysis method, which was found to be more robust to variations in deadspace. Comparison of the two analysis methods showed systematic differences and wide scatter. CONCLUSION: Corrected deadspace and more realistic temperature assumptions improved the stability of the analysis of MM measurements obtained by ultrasonic flowmeter in infants. This new analysis method, using the only currently available commercial ultrasonic flowmeter for infants, may help improve the stability of the analysis and further facilitate the assessment of lung volume and ventilation inhomogeneities in infants.
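
The quantity at stake can be made concrete with the basic washout relation into which the temperature and deadspace corrections enter. The sketch below (with hypothetical, idealized signals) computes FRC as the net expired tracer volume divided by the drop in end-tidal tracer concentration, deliberately ignoring re-inspired gas and the BTPS/deadspace corrections that this paper shows to be critical:

    import numpy as np

    def frc_from_washout(flow, tracer_frac, dt, cet_start, cet_end):
        """Textbook multiple-breath washout estimate of FRC.

        flow: airway flow (L/s, expiration positive); tracer_frac: tracer
        gas fraction at the airway opening; dt: sample interval (s);
        cet_start/cet_end: end-tidal tracer fractions before/after washout.
        """
        expired = np.clip(flow, 0.0, None)          # expiratory flow only
        tracer_volume = np.sum(expired * tracer_frac) * dt
        return tracer_volume / (cet_start - cet_end)

    # Toy signals: 60 s at 100 Hz, ~30 breaths/min, decaying tracer fraction.
    t = np.arange(0, 60.0, 0.01)
    flow = 0.05 * np.sin(2 * np.pi * t / 2.0)
    tracer = 0.04 * np.exp(-t / 20.0)
    print(f"FRC ~ {frc_from_washout(flow, tracer, 0.01, 0.04, 0.002):.3f} L")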

Relevance:

100.00%

Abstract:

Degeneration of the intervertebral disc (IVD), sometimes associated with low back pain and abnormal spinal motion, represents a major health issue with high costs. Non-invasive assessment of degeneration via qualitative or quantitative MRI (magnetic resonance imaging) is possible; yet no relation between mechanical properties and T2 maps of the IVD has been established, although T2 relaxation time values quantify the degree of degeneration. Therefore, MRI scans and mechanical tests were performed on 14 human lumbar intervertebral segments freed from posterior elements and all soft tissues excluding the IVD. Degeneration was evaluated in each specimen using morphological criteria, qualitative T2-weighted images, and quantitative axial T2 map data, and stiffness was calculated from the load-deflection curves of in vitro compression, torsion, lateral bending, and flexion/extension tests. In addition to the mean T2, the Otsu threshold of T2 (T_OTSU), a robust and automatic histogram-based measure that computes the optimal threshold maximizing the distinction of two classes of values, was calculated for the anterior, posterior, left, and right regions of each annulus fibrosus (AF). While mean T2 and the degeneration grading schemes were not related to the IVDs' mechanical properties, T_OTSU computed in the posterior AF correlated significantly with those classifications as well as with all stiffness values. T_OTSU should therefore be included in future degeneration grading schemes.
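
For reference, a compact sketch of the Otsu threshold computation on a set of T2 values; the bimodal test data are synthetic, not from the study:

    import numpy as np

    def otsu_threshold(values, bins=256):
        """Otsu's method: the threshold maximizing the between-class
        variance of a histogram (the T_OTSU statistic used above)."""
        hist, edges = np.histogram(values, bins=bins)
        p = hist.astype(float) / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        w0 = np.cumsum(p)                 # class-0 probability
        mu = np.cumsum(p * centers)       # cumulative mean
        mu_t = mu[-1]
        # Between-class variance for every candidate split point.
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
        return centers[np.nanargmax(sigma_b)]

    # Example: bimodal T2 values (ms) from a hypothetical annulus region.
    rng = np.random.default_rng(0)
    t2 = np.concatenate([rng.normal(45, 5, 500), rng.normal(90, 10, 500)])
    print(f"T_OTSU = {otsu_threshold(t2):.1f} ms")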

Relevance:

100.00%

Abstract:

The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached with detailed measurements of hadron emission from the same target as used by T2K, exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper, details of the experiment, data taking, and data analysis method are presented, together with results from the 2007 pilot run. Furthermore, the application of the NA61/SHINE measurements to the prediction of the T2K initial neutrino flux is described and discussed.

Relevance:

100.00%

Abstract:

We investigate the directional distribution of heavy neutral atoms in the heliosphere using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O&Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full-sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps, to obtain a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence-limit method, and the cluster analysis method. These methods allow us to exclude background and to isolate areas where the heavy neutral signal is statistically significant, enabling the consistent detection of heavy-neutral-atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O&Ne gas flow. We call this emission the extended tail. It may be an imprint of secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
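
Of the three methods, the signal-to-noise filter is the simplest to illustrate. The sketch below flags map pixels whose counts exceed an assumed background level by a chosen number of Poisson standard deviations; the map dimensions, background level, and injected feature are all invented:

    import numpy as np

    def snr_filter(counts, background, n_sigma=3.0):
        """Keep map pixels whose counts exceed the background by n_sigma,
        assuming Poisson statistics. Returns a boolean significance mask.

        counts, background: 2-D arrays of observed and expected background
        counts per pixel.
        """
        snr = (counts - background) / np.sqrt(np.maximum(background, 1e-9))
        return snr >= n_sigma

    # Hypothetical 30 x 60 (lat x lon) count map with a localized excess.
    rng = np.random.default_rng(2)
    bg = np.full((30, 60), 4.0)
    cts = rng.poisson(bg)
    cts[12:16, 40:46] += rng.poisson(12.0, size=(4, 6))  # injected signal
    mask = snr_filter(cts, bg)
    print(f"{mask.sum()} significant pixels")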

Relevance:

100.00%

Abstract:

In many cases, it is not possible to hold motorists to account for considerable speeding offences, because they deny being the driver on the speed-check photograph. An anthropological comparison of facial features using photo-to-photo comparison can be very difficult, depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or lens and from a different angle than the speed-check photo, and taking a comparison photograph with exactly the same camera setup is almost impossible. Therefore, only an imprecise comparison of individual facial features is possible; the geometry and position of each facial feature, for example the distance between the eyes or the positions of the ears, cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. The influence of the focal length and the distortion of the objective lens are thus eliminated, and the precise position and viewing direction of the speed-check camera are calculated. Even in cases of low-quality images or when the face of the driver is partly hidden, this method delivers good results. This new method, geometric comparison, is evaluated and validated in a dedicated study described in this article.
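
The geometric core of the comparison is the pinhole projection: once the speed-check camera's position, orientation, and internal parameters have been recovered photogrammetrically, 3D-scanned facial landmarks can be projected into the photo and compared. A minimal sketch with invented camera parameters and landmark coordinates:

    import numpy as np

    def project(points_3d, K, R, t):
        """Project 3-D points into an image with the pinhole model x = K [R|t] X.

        points_3d: (N, 3) world points (e.g. landmarks from a 3-D face scan);
        K: 3x3 intrinsic matrix (focal length, principal point);
        R, t: camera rotation and translation recovered photogrammetrically.
        """
        cam = R @ points_3d.T + t.reshape(3, 1)   # world -> camera frame
        uv = K @ cam
        return (uv[:2] / uv[2]).T                 # perspective divide

    # Toy example: focal length 1200 px, image center (960, 540), camera 2 m away.
    K = np.array([[1200.0, 0.0, 960.0], [0.0, 1200.0, 540.0], [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.array([0.0, 0.0, 2.0])
    landmarks = np.array([[0.0, 0.0, 0.0], [0.06, 0.0, 0.0], [0.0, 0.07, 0.02]])
    print(project(landmarks, K, R, t))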

Relevance:

100.00%

Abstract:

Even though complete resection is regarded as the only curative treatment for non-small cell lung cancer (NSCLC), >50% of resected patients die from a recurrence or a second primary tumour of the lung within 5 years. It remains unclear whether follow-up in these patients is cost-effective and whether it can improve the outcome through early detection of recurrent tumour. The benefit of regular follow-up was analysed in a consecutive series of 563 patients who had undergone potentially curative resection for NSCLC at the University Hospital. The follow-up consisted of clinical visits and chest radiography according to a standard protocol for up to 10 years. Survival rates were estimated using the Kaplan-Meier method, and the cost-effectiveness of the follow-up programme was assessed. A total of 23 patients (6.4% of the lobectomy group) underwent further operation with curative intent for a second pulmonary malignancy. Regular follow-up over a 10-year period provided the chance of a second curative treatment to 3.8% of all patients. The calculated cost per life-year gained was 90,000 Swiss francs, far above that of comparable large-scale surveillance programmes. Based on these data, the intensity and duration of the follow-up were reduced.
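
For the survival analysis step, a minimal Kaplan-Meier sketch with the lifelines library; the follow-up times and event indicators are invented, not the study data:

    from lifelines import KaplanMeierFitter

    # Hypothetical follow-up times (months) and event indicators
    # (1 = death, 0 = censored) for a small resected-NSCLC cohort.
    times = [6, 14, 22, 30, 36, 45, 60, 72, 84, 96, 110, 120]
    events = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0]

    kmf = KaplanMeierFitter()
    kmf.fit(times, event_observed=events)
    print(kmf.survival_function_)        # stepwise estimate of S(t)
    print(kmf.median_survival_time_)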

Relevance:

100.00%

Abstract:

Frequency-transformed EEG resting data have been widely used to describe normal and abnormal brain functional states as a function of the spectral power in different frequency bands. This has yielded a series of clinically relevant findings. However, by transforming the EEG into the frequency domain, the initially excellent time resolution of time-domain EEG is lost. Topographic time-frequency decomposition is a novel computerized EEG analysis method that combines previously available techniques from time-domain spatial EEG analysis and time-frequency decomposition of single-channel time series. It yields a new, physiologically and statistically plausible topographic time-frequency representation of human multichannel EEG. The original EEG is accounted for by the coefficients of a large set of user-defined, EEG-like time series, which are optimized for maximal spatial smoothness and minimal norm. These coefficients are then reduced to a small number of model scalp field configurations, which vary in intensity as a function of time and frequency. The result is thus a small number of EEG field configurations, each with a corresponding time-frequency (Wigner) plot. The method has several advantages: it does not assume that the data are composed of orthogonal elements, it does not assume stationarity, it produces topographic maps, and it allows the inclusion of user-defined, specific EEG elements such as spike-and-wave patterns. After a formal introduction of the method, several examples are given, including artificial data and multichannel EEG recorded during different physiological and pathological conditions.
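
The published method optimizes spatial smoothness and minimum norm, which is beyond a few lines; but the general flavor (a handful of scalp field configurations, each with a time-frequency intensity plot) can be sketched with a short-time Fourier transform followed by a truncated SVD, used here as a stand-in for the authors' optimization. All dimensions and data below are synthetic:

    import numpy as np
    from scipy.signal import stft

    # Hypothetical multichannel EEG: 19 channels, 10 s at 250 Hz.
    rng = np.random.default_rng(3)
    fs, n_ch, n_s = 250, 19, 2500
    eeg = rng.normal(size=(n_ch, n_s))

    # Short-time Fourier transform per channel -> (channels, freqs, times).
    f, t, Z = stft(eeg, fs=fs, nperseg=256)

    # Flatten the time-frequency axes and take a rank-3 SVD: each left
    # singular vector is a scalp field configuration, each right singular
    # vector its time-frequency intensity (reshaped to a freq x time plot).
    coeff = np.abs(Z).reshape(n_ch, -1)
    U, s, Vt = np.linalg.svd(coeff, full_matrices=False)
    maps = U[:, :3]                                   # 3 scalp maps
    tf_plots = Vt[:3].reshape(3, len(f), len(t)) * s[:3, None, None]
    print(maps.shape, tf_plots.shape)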

Relevance:

100.00%

Abstract:

Objective: Identification of the ventrointermediate thalamic nucleus (Vim) in modern 3T high-field MRI for image-based targeting in deep brain stimulation (DBS) is still challenging. To evaluate the usefulness and reliability of analyzing the connectivity with the cerebellum using Q-ball calculation, we performed a retrospective analysis. Method: Five patients who underwent bilateral implantation of electrodes in the Vim for the treatment of essential tremor between 2011 and 2012 received additional preoperative Q-ball imaging. Targeting was performed according to atlas coordinates and standard MRI. Additionally, we performed a retrospective identification of the Vim by analyzing the connectivity of the thalamus with the dentate nucleus. The exact position of the active stimulation contact in the postoperative CT was correlated with the Vim as identified by Q-ball calculation. Results: Localization of the Vim by analysis of the connectivity between the thalamus and the cerebellum was successful in all 5 patients on both sides. The average position of the active contacts was 14.6 mm (SD 1.24) lateral, 5.37 mm (SD 0.094) posterior, and 2.21 mm (SD 0.69) cranial of MC. The cranial portion of the dentato-rubro-thalamic tract was localized an average of 3.38 mm (SD 1.57) lateral and 1.5 mm (SD 1.22) posterior of the active contact. Conclusions: Connectivity analysis by Q-ball calculation provided direct visualization of the Vim in all cases. Our preliminary results suggest that the target determined by connectivity analysis is valid and could be used in addition to, or even instead of, atlas-based targeting. Larger prospective studies are needed to determine the robustness of this method in providing refined information useful for the neurosurgical treatment of tremor.

Relevance:

100.00%

Abstract:

Natural soil profiles may be interpreted as an arrangement of parts that are characterized by properties such as hydraulic conductivity and water retention function. These parts form a complicated structure. Characterizing the soil structure is fundamental in subsurface hydrology because it has a crucial influence on flow and transport and defines the patterns of many ecological processes. We applied an image analysis method for the recognition and classification of visual soil attributes in order to model flow and transport through a man-made soil profile. Modeled and measured saturation-dependent effective parameters were compared. We found that characterizing and describing conductivity patterns in soils with sharp conductivity contrasts is feasible. In contrast, solving flow and transport on the basis of these conductivity maps is difficult and, in general, requires special care in the representation of small-scale processes.
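
A drastically simplified sketch of the image-classification step: pixels of a grayscale profile image are clustered into a few visual classes, and each class is assigned a hydraulic conductivity to produce a conductivity map for the flow model. The image, class count, and conductivity values are all invented:

    import numpy as np
    from scipy.cluster.vq import kmeans2

    # Synthetic grayscale image (80 x 100) of a soil profile with three
    # visually distinct materials.
    rng = np.random.default_rng(4)
    image = np.concatenate([rng.normal(60, 10, 3000),
                            rng.normal(150, 12, 3000),
                            rng.normal(220, 8, 2000)]).reshape(80, 100)

    # Cluster pixel intensities into 3 visual classes.
    centroids, labels = kmeans2(image.reshape(-1, 1), 3, minit="++", seed=4)

    # Assign an (assumed) hydraulic conductivity to each class, ordered
    # from darkest to brightest material.
    k_per_class = np.array([1e-7, 1e-5, 1e-3])     # m/s, illustrative only
    order = np.argsort(centroids.ravel())          # class ids, dark -> bright
    k_of_class = np.empty(3)
    k_of_class[order] = k_per_class
    k_map = k_of_class[labels].reshape(image.shape)
    print(k_map.shape, np.unique(k_map))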