897 results for Mesh generation from image data
Abstract:
Satellite image classification involves designing and developing efficient image classifiers. With satellite image data and image analysis methods multiplying rapidly, selecting the right mix of data sources and data analysis approaches has become critical to the generation of quality land-use maps. In this study, a new postprocessing information fusion algorithm for the extraction and representation of land-use information based on high-resolution satellite imagery is presented. This approach can produce land-use maps with sharp interregional boundaries and homogeneous regions. The proposed approach is conducted in five steps. First, a GIS layer - ATKIS data - was used to generate two coarse homogeneous regions, i.e. urban and rural areas. Second, a thematic (class) map was generated using a hybrid spectral classifier combining the Gaussian Maximum Likelihood (GML) algorithm and the ISODATA classifier. Third, a probabilistic relaxation algorithm was applied to the thematic map, resulting in a smoothed thematic map. Fourth, edge detection and edge thinning techniques were used to generate a contour map with pixel-width interclass boundaries. Fifth, the contour map was superimposed on the thematic map by means of a region-growing algorithm, with the contour map and the smoothed thematic map as two constraints. To operate the proposed method, a software package was developed in the C programming language. This package comprises the GML algorithm, a probabilistic relaxation algorithm, the TBL edge detector, an edge thresholding algorithm, a fast parallel thinning algorithm, and a region-growing information fusion algorithm. The county of Landau in the state of Rheinland-Pfalz, Germany, was selected as a test site. High-resolution IRS-1C imagery was used as the principal input data.
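The decision rule at the core of the second step admits a compact sketch. Below is a minimal illustration of Gaussian maximum likelihood classification, assuming per-class means, covariances, and priors estimated from training pixels; it sketches the general GML rule, not the study's C package:

```python
import numpy as np

def gml_classify(pixels, means, covs, priors):
    """Assign each pixel's band vector to the class with the highest
    Gaussian log-likelihood (plus log-prior)."""
    scores = np.empty((pixels.shape[0], len(means)))
    for k, (mu, cov, pr) in enumerate(zip(means, covs, priors)):
        diff = pixels - mu                                  # (n_pixels, n_bands)
        maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        logdet = np.linalg.slogdet(cov)[1]
        scores[:, k] = np.log(pr) - 0.5 * (logdet + maha)
    return scores.argmax(axis=1)                            # thematic map labels
```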
Abstract:
In this thesis we address the problem of detecting and localizing a small spherical target with characteristic electrical properties inside a cylindrical volume representing the female breast, using microwave imaging (MWI). One of the main contributions of this project is the extension of the existing linear inversion algorithm from planar-slice to volume reconstruction; results obtained under the same conditions and experimental setup are reported for the two approaches. A preliminary comparison and performance analysis of the reconstruction algorithms is carried out via numerical simulations in a software-created environment: a single dipole antenna illuminates the virtual breast phantom from different positions and, for each position, the corresponding scattered field value is recorded. The collected data are then used to reconstruct the investigation domain, along with the scatterer position, in the form of an image called a pseudospectrum. In this process the tumor is modeled as a dielectric sphere of small radius and, for electromagnetic scattering purposes, is treated as a point-like source. To improve the performance of the reconstruction technique, the acquisition is repeated for a number of frequencies in a given range: the pseudospectra reconstructed from single-frequency data are incoherently combined with the MUltiple SIgnal Classification (MUSIC) method, which returns an overall enhanced image. We exploit this multi-frequency approach to test the performance of the 3D linear inversion reconstruction algorithm while varying the source position inside the phantom and the height of the antenna plane. Analysis results and reconstructed images are reported. Finally, we perform 3D reconstruction from experimental data gathered with the acquisition system in the microwave laboratory at DIFA, University of Bologna, for a recently developed breast-phantom prototype; the resulting pseudospectrum and a performance analysis for the real model are reported.
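For the reconstruction step, a minimal sketch of a MUSIC pseudospectrum and its incoherent multi-frequency combination is given below. It assumes a multistatic response matrix per frequency and a free-space scalar Green's function as steering vector; the thesis setup (a single scanning dipole and a breast phantom) is more involved:

```python
import numpy as np

def music_pseudospectrum(K, ant_pos, grid, k_wave, n_sources=1):
    """Single-frequency MUSIC: project steering vectors onto the noise
    subspace of the multistatic response matrix K (n_ant x n_ant)."""
    U, _, _ = np.linalg.svd(K)
    Un = U[:, n_sources:]                                  # noise subspace
    d = np.linalg.norm(grid[:, None, :] - ant_pos[None, :, :], axis=2)
    G = np.exp(1j * k_wave * d) / (4 * np.pi * d)          # free-space Green's fn
    G /= np.linalg.norm(G, axis=1, keepdims=True)          # unit steering vectors
    proj = np.abs(G.conj() @ Un) ** 2                      # energy in noise subspace
    return 1.0 / proj.sum(axis=1)                          # peaks at the scatterer

def multifreq_pseudospectrum(K_per_freq, ant_pos, grid, k_per_freq):
    """Incoherent multi-frequency combination of normalized pseudospectra."""
    total = np.zeros(grid.shape[0])
    for K, k in zip(K_per_freq, k_per_freq):
        p = music_pseudospectrum(K, ant_pos, grid, k)
        total += p / p.max()
    return total
```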
Abstract:
Background: Current knowledge about risk factors promoting hypertensive crisis originates from retrospective data. Therefore, potential risk factors of hypertensive crisis were assessed in a prospective longitudinal study. Methods: Eighty-nine patients of the medical outpatient unit at the University Hospital of Bern (Bern, Switzerland) with previously diagnosed hypertension participated in this study. At baseline, 33 potential risk factors were assessed. All patients were followed up for the outcome of hypertensive crisis. Cox regression models were used to detect relationships between risk factors and hypertensive crisis (defined as an acute rise of systolic blood pressure (BP) ≥200 mmHg and/or diastolic BP ≥120 mmHg). Results: The mean duration of follow-up was 1.6 ± 0.3 years (range 1.0–2.4 years). Four patients (4.5%) were lost to follow-up. Thirteen patients (15.3%) experienced a hypertensive crisis during follow-up. Several potential risk factors were significantly associated with hypertensive crisis: female sex, higher grades of obesity, the presence of hypertensive or coronary heart disease, the presence of a somatoform disorder, a higher number of antihypertensive drugs, and nonadherence to medication. As measured by the hazard ratio, nonadherence was the most important factor associated with hypertensive crisis (hazard ratio 5.88, 95% confidence interval 1.59–21.77, P < 0.01). Conclusions: This study identified several potential risk factors of hypertensive crisis. The results are consistent with the hypothesis that improving adherence to antihypertensive therapy would help prevent hypertensive crises. However, larger studies are needed to assess potential confounding, other risk factors, and possible interactions between predictors.
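As a hedged illustration of the statistical approach, a Cox proportional hazards fit of this kind can be reproduced with the lifelines package; the simulated data frame below is a hypothetical stand-in, not the study data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "nonadherence": rng.integers(0, 2, n),   # hypothetical binary risk factors
    "female":       rng.integers(0, 2, n),
})
# Hypothetical hazard: nonadherence raises the crisis rate.
rate = 0.05 * np.exp(1.5 * df["nonadherence"])
df["time"] = np.minimum(rng.exponential(1 / rate), 2.4)  # censor at 2.4 years
df["crisis"] = (df["time"] < 2.4).astype(int)            # 1 = crisis observed

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="crisis")
cph.print_summary()   # hazard ratios with 95% confidence intervals
```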
Abstract:
Osteoarticular allograft is one possible treatment in wide surgical resections with large defects. Selecting the best osteoarticular allograft is highly relevant for optimal exploitation of the bone databank, a good surgical outcome, and the patient's recovery. Current approaches are, however, very time-consuming, hindering these goals in practice. We present a validation study of software able to perform automatic bone measurements, used to automatically assess distal femur sizes across a databank. 170 distal femur surfaces were reconstructed from CT data and measured manually following a size measurement protocol based on the transepicondylar distance (A), the anterior-posterior distance of the medial condyle (B), and the anterior-posterior distance of the lateral condyle (C). Intra- and inter-observer studies were conducted and regarded as ground truth measurements. Manual and automatic measures were then compared. For the automatic measurements, the correlation coefficients between observer one and the automatic method were 0.99 for measure A and 0.96 for measures B and C. The average time needed to perform the measurements was 16 h for both manual measurements and 3 min for the automatic method. The results demonstrate the high reliability and, most importantly, high repeatability of the proposed approach, as well as a considerable speed-up in planning.
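A minimal sketch of how the three size measures could be computed once landmarks and condyle point sets are available; the landmark inputs and the anterior-posterior axis are illustrative assumptions, not the validated software's algorithm:

```python
import numpy as np

def femur_size_measures(med_epi, lat_epi, med_condyle, lat_condyle, ap_axis):
    """Size measures from surface points: A is the distance between the two
    epicondyle landmarks; B and C are the extents of each condyle's points
    along a unit anterior-posterior axis."""
    A = np.linalg.norm(lat_epi - med_epi)          # transepicondylar distance
    ap = ap_axis / np.linalg.norm(ap_axis)
    B = np.ptp(med_condyle @ ap)                   # medial condyle AP extent
    C = np.ptp(lat_condyle @ ap)                   # lateral condyle AP extent
    return A, B, C
```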
Abstract:
The occupant impact velocity (OIV) and acceleration severity index (ASI) are competing measures of crash severity used to assess occupant injury risk in full-scale crash tests involving roadside safety hardware, e.g. guardrail. Delta-V, or the maximum change in vehicle velocity, is the traditional metric of crash severity for real world crashes. This study compares the ability of the OIV, ASI, and delta-V to discriminate between serious and non-serious occupant injury in real world frontal collisions. Vehicle kinematics data from event data recorders (EDRs) were matched with detailed occupant injury information for 180 real world crashes. Cumulative probability of injury risk curves were generated using binary logistic regression for belted and unbelted data subsets. By comparing the available fit statistics and performing a separate ROC curve analysis, the more computationally intensive OIV and ASI were found to offer no significant predictive advantage over the simpler delta-V.
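Both halves of the analysis are easy to sketch: delta-V as the peak of the integrated EDR acceleration trace, and a binary logistic injury risk curve. The numbers below are hypothetical placeholders, not the matched EDR/injury data:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from sklearn.linear_model import LogisticRegression

def delta_v(accel, t):
    """Maximum change in vehicle velocity from an EDR acceleration trace."""
    v = cumulative_trapezoid(accel, t, initial=0.0)
    return np.max(np.abs(v))

# Hypothetical stand-ins for matched crash data (delta-V in km/h).
dv = np.array([[15], [25], [40], [55], [30], [60], [10], [45]], dtype=float)
serious = np.array([0, 0, 1, 1, 0, 1, 0, 1])    # 1 = serious occupant injury
risk = LogisticRegression().fit(dv, serious)
print(risk.predict_proba([[35.0]])[:, 1])       # estimated risk at 35 km/h
```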
Abstract:
With improvements in acquisition speed and quality, the amount of medical image data to be screened by clinicians is becoming challenging in daily clinical practice. To quickly visualize and find abnormalities in medical images, we propose a new method combining segmentation algorithms with statistical shape models. A statistical shape model built from a healthy population will fit closely in healthy regions; it will not, however, fit the morphological abnormalities often present in areas of pathology. Using the residual fitting error of the statistical shape model, pathologies can be visualized very quickly. This idea is applied to finding drusen in the retinal pigment epithelium (RPE) of optical coherence tomography (OCT) volumes. A segmentation technique able to accurately segment drusen in patients with age-related macular degeneration (AMD) is applied. The segmentation is then analyzed with a statistical shape model to visualize potentially pathological areas. An extensive evaluation was performed to validate the segmentation algorithm, as well as the quality and sensitivity of the hinting system. Most of the drusen with a height of 85.5 μm were detected, and all drusen at least 93.6 μm high were detected.
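The hinting mechanism reduces to the residual of a PCA-based shape model fit. A minimal sketch, assuming an orthonormal matrix of shape modes learned from healthy cases (the array layout is illustrative):

```python
import numpy as np

def ssm_residual(shape, mean_shape, modes, n_modes):
    """Project a segmented surface (flattened xyz vector) onto the first
    n_modes of a PCA shape model built from healthy cases, and return the
    per-point residual; large residuals hint at pathology such as drusen."""
    P = modes[:, :n_modes]               # orthonormal (3*n_points, n_modes)
    b = P.T @ (shape - mean_shape)       # least-squares model coefficients
    closest = mean_shape + P @ b         # best fit within the model space
    return np.linalg.norm((shape - closest).reshape(-1, 3), axis=1)
```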
Abstract:
Image-guided microsurgery requires accuracies an order of magnitude higher than today's navigation systems provide. A critical step toward the achievement of such low-error requirements is a highly accurate and verified patient-to-image registration. With the aim of reducing target registration error to a level that would facilitate the use of image-guided robotic microsurgery on the rigid anatomy of the head, we have developed a semiautomatic fiducial detection technique. Automatic force-controlled localization of fiducials on the patient is achieved through the implementation of a robotic-controlled tactile search within the head of a standard surgical screw. Precise detection of the corresponding fiducials in the image data is realized using an automated model-based matching algorithm on high-resolution, isometric cone beam CT images. Verification of the registration technique on phantoms demonstrated that through the elimination of user variability, clinically relevant target registration errors of approximately 0.1 mm could be achieved.
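Fiducial-based registration of this kind typically rests on least-squares point matching. Below is a minimal sketch of the classic Kabsch/Horn solution and the resulting target registration error; this is the textbook method, not necessarily the paper's exact pipeline:

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid registration of corresponding fiducials
    (Kabsch/Horn): returns R, t such that fixed ~ R @ moving + t."""
    cf, cm = fixed.mean(axis=0), moving.mean(axis=0)
    H = (moving - cm).T @ (fixed - cf)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cf - R @ cm

def target_registration_error(R, t, target_image, target_patient):
    """TRE: distance between the mapped image-space target and its true
    patient-space position."""
    return np.linalg.norm(R @ target_image + t - target_patient)
```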
Abstract:
A confocal imaging and image processing scheme is introduced to visualize and evaluate the spatial distribution of spectral information in tissue. The image data are recorded using a confocal laser-scanning microscope equipped with a detection unit that provides high spectral resolution. The processing scheme is based on spectral data, is less error-prone than intensity-based visualization and evaluation methods, and provides quantitative information on the composition of the sample. The method is tested and validated in the context of the development of dermal drug delivery systems, and a quantitative uptake indicator is introduced to compare the performance of different delivery systems. A drug penetration study was performed in vitro. The results show that the method is able to detect, visualize, and measure spectral information in tissue. In the penetration study, the uptake efficiencies of different experimental setups could be discriminated and quantitatively described. The developed uptake indicator is a step toward quantitative assessment and, more generally, beyond pharmaceutical research, provides valuable information on tissue composition. It can potentially be used for clinical in vitro and in vivo applications.
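The quantitative use of spectral data can be illustrated with linear spectral unmixing, where each pixel spectrum is decomposed into reference spectra and the drug coefficient serves as an uptake measure. A sketch under that assumption (the paper's exact indicator may differ):

```python
import numpy as np

def unmix(spectra, endmembers):
    """Linear spectral unmixing: least-squares decomposition of each pixel
    spectrum (rows of `spectra`) into reference spectra (rows of
    `endmembers`), e.g. drug vs. tissue autofluorescence."""
    coeffs, *_ = np.linalg.lstsq(endmembers.T, spectra.T, rcond=None)
    return coeffs.T                     # (n_pixels, n_endmembers) abundances

# A per-region mean of the drug abundance then acts as an uptake indicator:
# uptake = unmix(pixels, refs)[:, drug_idx].mean()
```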
Abstract:
Investigators interested in whether a disease aggregates in families often collect case-control family data, which consist of disease status and covariate information for families selected via case or control probands. Here, we focus on the use of case-control family data to investigate the relative contributions to the disease of additive genetic effects (A), shared family environment (C), and unique environment (E). To this end, we describe an ACE model for binary family data and then introduce an approach to fitting the model to case-control family data. The structural equation model, which has been described previously, combines a general-family extension of the classic ACE twin model with a (possibly covariate-specific) liability-threshold model for binary outcomes. Our likelihood-based approach to fitting involves conditioning on the proband's disease status, as well as setting the prevalence equal to a pre-specified value that can be estimated from the data themselves if necessary. Simulation experiments suggest that our approach yields approximately unbiased estimates of the A, C, and E variance components, provided that certain commonly made assumptions hold. These assumptions include the usual assumptions for the classic ACE and liability-threshold models, assumptions about shared family environment for relative pairs, and assumptions about the case-control family sampling, including single ascertainment. When our approach is used to fit the ACE model to Austrian case-control family data on depression, the resulting estimate of heritability is very similar to estimates from previous analyses of twin data.
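For intuition about the A, C, and E components, the classic Falconer decomposition from monozygotic/dizygotic twin correlations is shown below; it is a back-of-the-envelope relative of the likelihood-based structural equation fit described here, not the paper's method:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's decomposition from MZ/DZ twin (tetrachoric) correlations."""
    A = 2.0 * (r_mz - r_dz)    # additive genetic variance
    C = 2.0 * r_dz - r_mz      # shared family environment
    E = 1.0 - r_mz             # unique environment (and measurement error)
    return A, C, E

print(falconer_ace(0.6, 0.4))  # -> (0.4, 0.2, 0.4)
```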
Abstract:
The demand for power generation from non-renewable resources, and the associated costs, are increasing at an alarming rate. Solar energy is one renewable resource with the potential to curb this increase. So far, the utilization of solar energy has concentrated mainly on heating applications. Using solar energy for cooling systems in buildings would contribute greatly to the goal of minimizing non-renewable energy use. The solar heating system research done by institutions such as the University of Wisconsin-Madison, and the building heat flow modeling conducted at Oklahoma State University, can be used to develop and optimize solar cooling building systems. This research combines the two approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source driving the cycle. The software was then put through a series of litmus tests to verify its integrity; the tests were conducted on building cooling system data sets from similar applications around the world. The output of the developed software matched the established experimental results from those data sets. Software developed by other research efforts caters to advanced users; the software developed in this research is not only reliable in its code integrity but, through its integrated approach, is also suited to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
Abstract:
AIMS: To compare the gender distribution of HIV-infected adults receiving highly active antiretroviral treatment (HAART) in resource-constrained settings with estimates of the gender distribution of HIV infection; to describe the clinical characteristics of women and men receiving HAART. METHODS: The Antiretroviral Therapy in Lower-Income Countries (ART-LINC) Collaboration is a network of clinics providing HAART in Africa, Latin America, and Asia. We compared UNAIDS data on the gender distribution of HIV infection with the proportions of women and men receiving HAART in the ART-LINC Collaboration. RESULTS: Twenty-nine centers in 13 countries participated. Among 33,164 individuals, 19,989 (60.3%) were women. The proportions of women receiving HAART in ART-LINC centers were similar to, or higher than, UNAIDS estimates of the proportions of HIV-infected women in all but two centers. There were fewer women receiving HAART than expected from UNAIDS data in one center in Uganda and one center in India. Taking into account heterogeneity across cohorts, women were younger than men, less likely to have advanced HIV infection, and more likely to be anemic at HAART initiation. CONCLUSIONS: Women in resource-constrained settings are not necessarily disadvantaged in their access to HAART. More attention needs to be paid to ensuring that HIV-infected men are seeking care and starting HAART.
Abstract:
Strain rate significantly affects the strength of a material. The Split-Hopkinson Pressure Bar (SHPB) was initially used to study the effects of high strain rate (~10³ 1/s) testing of metals. Later modifications to the original technique allowed for the study of brittle materials such as ceramics, concrete, and rock. While material properties of wood for static and creep strain rates are readily available, data on the dynamic properties of wood are sparse. Previous work using the SHPB technique with wood has been limited in scope to variability of only a few conditions, and tests of the applicability of SHPB theory to wood have not been performed. Tests were conducted using a large-diameter (3.0 inch (75 mm)) SHPB. The strain rate and total strain applied to a specimen depend on the striker bar length and velocity at impact. Pulse shapers are used to further modify the strain rate and change the shape of the strain pulse. A series of tests was used to determine the test conditions necessary to produce a strain rate, total strain, and pulse shape appropriate for testing wood specimens. Hard maple, consisting of sugar maple (Acer saccharum) and black maple (Acer nigrum), and eastern white pine (Pinus strobus) specimens were used to represent a dense hardwood and a low-density softwood. Specimens were machined to diameters of 2.5 and 3.0 inches, and an assortment of lengths was tested to determine the appropriate specimen dimensions. Longitudinal specimens of 1.5 inch length and radial and tangential specimens of 0.5 inch length were found to be most applicable to SHPB testing. Stress/strain curves were generated from the SHPB data and validated with 6061-T6 aluminum and wood specimens. Stress was indirectly corroborated with gaged aluminum specimens. Specimen strain was assessed with strain gages, digital image analysis, and measurement of residual strain to confirm the strain calculated from SHPB data. The SHPB was found to be a useful tool for accurately assessing the material properties of wood under high strain rates (70 to 340 1/s) and short load durations (70 to 150 μs to compressive failure).
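The standard one-dimensional SHPB reduction behind such tests is compact: the reflected pulse gives the specimen strain rate and the transmitted pulse gives the stress. A sketch of these classic relations, with the usual bar/specimen symbols:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def shpb_stress_strain(eps_r, eps_t, t, E_bar, A_bar, A_spec, L_spec, c0):
    """Classic one-dimensional SHPB relations: the reflected pulse eps_r sets
    the specimen strain rate; the transmitted pulse eps_t sets the stress."""
    strain_rate = -2.0 * c0 / L_spec * eps_r           # specimen strain rate
    strain = cumulative_trapezoid(strain_rate, t, initial=0.0)
    stress = E_bar * (A_bar / A_spec) * eps_t          # 1-wave specimen stress
    return strain, stress, strain_rate
```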
Abstract:
Many methodologies dealing with the prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass-spring networks or the finite element method (FEM). On the other hand, methodologies working directly in the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, concepts from continuum mechanics, and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
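As a toy illustration of MRF-based optimization (a far simpler relative of the hierarchical, GPU-based solver described in the chapter), iterated conditional modes on a 2D label grid looks like this; the energy terms are illustrative assumptions:

```python
import numpy as np

def icm(labels, data_cost, beta, n_iter=10):
    """Iterated conditional modes on a 2D grid: greedily minimize
    data_cost[y, x, k] + beta * (number of disagreeing 4-neighbors)."""
    H, W, K = data_cost.shape
    for _ in range(n_iter):
        for y in range(H):
            for x in range(W):
                nbrs = [labels[yy, xx]
                        for yy, xx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= yy < H and 0 <= xx < W]
                cost = data_cost[y, x].copy()
                for k in range(K):
                    cost[k] += beta * sum(n != k for n in nbrs)
                labels[y, x] = int(np.argmin(cost))
    return labels
```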
Abstract:
High-density spatial and temporal sampling of EEG data enhances the quality of results of electrophysiological experiments. Because EEG sources typically produce widespread electric fields (see Chapter 3) and operate at frequencies well below the sampling rate, increasing the number of electrodes and time samples will not necessarily increase the number of observed processes, but will mainly increase the accuracy of the representation of these processes. This is notably the case when inverse solutions are computed. As a consequence, increasing the sampling in space and time increases the redundancy of the data (in space, because electrodes are correlated due to volume conduction; in time, because neighboring time points are correlated), while the degrees of freedom of the data change only little. This has to be taken into account when statistical inferences are to be made from the data. However, in many ERP studies, the intrinsic correlation structure of the data has been disregarded. Often, some electrodes or groups of electrodes are selected a priori as the analysis entity and considered as repeated (within-subject) measures that are analyzed using standard univariate statistics. The increased spatial resolution obtained with more electrodes is thus poorly represented by the resulting statistics. In addition, the assumptions made (e.g. in terms of what constitutes a repeated measure) are not supported by what we know about the properties of EEG data. From the point of view of physics (see Chapter 3), the natural "atomic" analysis entity of EEG and ERP data is the scalp electric field.
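One hedged way to make the "degrees of freedom change only little" point concrete is an entropy-based effective rank of the electrode-by-time data matrix; this is an illustration, not a method proposed in the chapter:

```python
import numpy as np

def effective_rank(eeg):
    """Entropy-based effective rank of a (n_samples, n_electrodes) record:
    it stays nearly flat as electrodes are added to a spatially redundant
    field, even though the nominal dimensionality grows."""
    X = eeg - eeg.mean(axis=0)
    s = np.linalg.svd(X, compute_uv=False)
    p = s**2 / np.sum(s**2)                   # normalized variance spectrum
    p = p[p > 0]
    return float(np.exp(-np.sum(p * np.log(p))))
```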
Abstract:
We present quantitative autumn, summer, and annual precipitation and summer temperature reconstructions from proglacial, annually laminated Lake Silvaplana, eastern Swiss Alps, back to AD 1580. We used X-ray diffraction peak intensity ratios of minerals in the sediment layers (quartz, qz; plagioclase, pl; amphibole, am; mica, mi) that are diagnostic for different source areas and hydro-meteorological transport processes in the catchment. The XRD data were calibrated against meteorological data (AD 1800/1864–1950) and revealed significant correlations: mi/pl with SON precipitation (r = 0.56, p < 0.05) and MJJAS precipitation (r = 0.66, p < 0.01); qz/mi with MJJAS temperature (r = −0.72, p < 0.01); and qz/am with annual precipitation (r = −0.54, p < 0.05). The geological catchment setting and hydro-meteorological processes provide deterministic explanations for these correlations. Our summer temperature reconstruction reproduces the typical features of past climate variability known from independent data sets. The precipitation reconstructions show a Little Ice Age (LIA) climate moister than today's. Exceptionally wet periods in our reconstruction coincide with regional glacier advances.
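The calibration-in-time procedure can be sketched as a correlation test plus a linear transfer function applied to the full varve record; the function below illustrates that generic workflow, not the authors' exact calibration:

```python
from scipy.stats import pearsonr, linregress

def calibrate_and_reconstruct(proxy_overlap, instrumental, proxy_full):
    """Correlate a peak-intensity ratio (e.g. mi/pl) with an instrumental
    series over the calibration window (e.g. AD 1864-1950), then apply the
    fitted linear transfer function to the whole varve record."""
    r, p = pearsonr(proxy_overlap, instrumental)       # calibration statistics
    fit = linregress(proxy_overlap, instrumental)      # transfer function
    return r, p, fit.slope * proxy_full + fit.intercept
```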