29 results for Image processing -- Digital techniques -- Mathematical models


Relevance:

100.00%

Publisher:

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems in systems biology today. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly because the time series are short relative to the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is proposed here. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are drawn at random from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, on the other hand, vary in network size, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The best value obtained for the free parameter of the Tsallis entropy was on average in the range 2.5 <= q <= 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
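
For reference, the Tsallis entropy of a discrete distribution p is H_q(p) = (1 - sum_i p_i^q) / (q - 1), which recovers the Shannon entropy in the limit q -> 1. Below is a minimal sketch of a Tsallis-based conditional-entropy criterion, assuming discretized expression levels; it illustrates the kind of criterion function described above, not the authors' DimReduction implementation, and all names are illustrative.

    import numpy as np

    def tsallis_entropy(p, q):
        """Tsallis entropy H_q = (1 - sum(p^q)) / (q - 1); Shannon as q -> 1."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                       # ignore zero-probability states
        if np.isclose(q, 1.0):
            return -np.sum(p * np.log(p))  # Shannon limit
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def conditional_tsallis_entropy(joint, q):
        """H_q(Y|X): Tsallis entropy of each row p(Y|X=x), weighted by P(X=x).

        joint: 2-D array with joint[x, y] = P(X=x, Y=y).
        """
        joint = np.asarray(joint, dtype=float)
        px = joint.sum(axis=1)             # marginal P(X)
        return sum(px[x] * tsallis_entropy(joint[x] / px[x], q)
                   for x in range(joint.shape[0]) if px[x] > 0)

    # Toy usage: a target gene Y strongly determined by a candidate
    # predictor X yields a low conditional entropy (a good feature).
    joint = np.array([[0.45, 0.05],
                      [0.05, 0.45]])
    for q in (1.0, 2.5, 3.5):              # the reported best q lies in [2.5, 3.5]
        print(q, conditional_tsallis_entropy(joint, q))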

Relevance:

100.00%

Publisher:

Abstract:

Considering the difficulties in finding good-quality images for the development and testing of computer-aided diagnosis (CAD) schemes, this paper presents a public online mammographic image database, freely available to all interested users and intended to support the development and evaluation of CAD schemes. The mammographic images are digitized with contrast and spatial resolution suitable for processing purposes. A broad retrieval system allows the user to search for images by exam or patient characteristics. Comparison with other currently available databases shows that the presented database has a sufficient number of images, is of high quality, and is the only one to include a functional search system.

Relevance:

100.00%

Publisher:

Abstract:

Sossego was the first Vale SAG mill operation to process copper-gold ore. It is located in the state of Para, in the southeastern Amazon region of Brazil. In the first three years of continuous operation, Vale investigated different alternatives for improving the circuit's performance by examining operating conditions, focusing mainly on the SAG mill. It was then decided to assess the performance of the comminution circuit as a function of ore characteristics. A comprehensive ore characterization program was conducted, together with the calibration of mathematical models on the basis of surveys carried out at the industrial circuit. The simulator was then used to predict the throughput associated with each ore type, as well as to establish the optimized circuit configuration and tailored operating conditions. This paper describes in detail the main aspects of optimizing the industrial circuit's performance, as well as the successful method for predicting production as a function of ore characteristics and circuit configuration.

Relevance:

100.00%

Publisher:

Abstract:

The classical approach for acoustic imaging consists of beamforming, and produces the source distribution of interest convolved with the array point spread function. This convolution smears the image of interest, significantly reducing its effective resolution. Deconvolution methods have been proposed to enhance acoustic images and have produced significant improvements. Other proposals involve covariance fitting techniques, which avoid deconvolution altogether. However, in their traditional presentation, these enhanced reconstruction methods have very high computational costs, mostly because they have no means of efficiently transforming back and forth between a hypothetical image and the measured data. In this paper, we propose the Kronecker Array Transform (KAT), a fast separable transform for array imaging applications. Under the assumption of a separable array, it enables the acceleration of imaging techniques by several orders of magnitude with respect to the fastest previously available methods, and enables the use of state-of-the-art regularized least-squares solvers. Using the KAT, one can reconstruct images with higher resolutions than was previously possible and use more accurate reconstruction techniques, opening new and exciting possibilities for acoustic imaging.
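
The gain from separability can be illustrated with the standard Kronecker identity (A kron B) vec(X) = vec(B X A^T), which applies a large structured matrix through two small matrix products instead of ever forming the Kronecker product. The sketch below demonstrates that identity numerically; it illustrates the general separable-transform idea underlying such fast transforms, not the exact KAT construction, and all dimensions are arbitrary.

    import numpy as np

    # Applying (A kron B) to vec(X) via two small matrix products.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 30))    # acts along one array axis
    B = rng.standard_normal((50, 20))    # acts along the other axis
    X = rng.standard_normal((20, 30))    # hypothetical image coefficients

    fast = B @ X @ A.T                   # cheap: two small products
    slow = (np.kron(A, B) @ X.reshape(-1, order="F")).reshape((50, 40), order="F")
    print(np.allclose(fast, slow))       # True: same result, far less work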

Relevance:

100.00%

Publisher:

Abstract:

Background. - Tardive dyskinesia (TD) is a movement disorder observed after chronic neuroleptic treatment. Smoking is presumed to increase the prevalence of TD. The question of a cause-effect relationship between smoking and TD, however, remains to be answered. The purpose of this study was to examine the correlation between the degree of smoking and the severity of TD with respect to differences caused by medication. Method. - We examined 60 patients suffering from schizophrenia and TD, comparing a clozapine-treated group with a group treated with typical neuroleptics. Movement disorders were assessed using the Abnormal Involuntary Movement Scale and digital image processing, which provides rater-independent information on perioral movements. Results. - We found a strong correlation (.80 < r < .90, always p < .0001) between the degree of smoking and the severity of TD. Repeated measurements revealed a positive correlation between changes in cigarette consumption and changes in the severity of TD (p < .0001). Analyses of covariance indicated a significant group effect, with a lower severity of TD in the clozapine group compared to the typical-neuroleptics group (p = .010). Interaction analyses indicated a higher impact of smoking on the severity of TD in the typical-neuroleptics group compared to the clozapine group (p = .033). Conclusion. - Concerning a possible cause-effect relationship between smoking and TD, smoking appears to be more of a general health hazard than neuroleptic exposure in terms of TD. (C) 2008 Elsevier Masson SAS. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND CONTEXT: The vertebral spine angle in the frontal plane is an important parameter in the assessment of scoliosis and may be obtained from panoramic X-ray images. Technological advances have allowed for an increased use of digital X-ray images in clinical practice. PURPOSE: In this context, the objective of this study is to assess the reliability of computer-assisted Cobb angle measurements taken from digital X-ray images. STUDY DESIGN/SETTING: Clinical investigation quantifying scoliotic deformity with the Cobb method to evaluate the intra- and interobserver variability of manual and digital techniques. PATIENT SAMPLE: Forty-nine patients diagnosed with idiopathic scoliosis were chosen based on convenience, without predilection for gender, age, type, location, or magnitude of the curvature. OUTCOME MEASURES: Images were examined to evaluate Cobb angle variability, end plate selection, as well as intra- and interobserver errors. METHODS: Specific software was developed to digitally reproduce the Cobb method and semiautomatically calculate the degree of scoliotic deformity. During the study, three observers estimated the Cobb angle using both the digital and the traditional manual methods. RESULTS: The results showed that Cobb angle measurements may be reproduced on the computer as reliably as with the traditional manual method, in conditions similar to those found in clinical practice. CONCLUSIONS: The computer-assisted (digital) method is clinically advantageous and appropriate for assessing the scoliotic curvature in the frontal plane using the Cobb method. (C) 2010 Elsevier Inc. All rights reserved.
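
As a sketch of the underlying computation (not the authors' software): once an observer marks two points along the upper end plate of the superior end vertebra and two along the lower end plate of the inferior end vertebra, the Cobb angle follows from the angle between the two lines. All names below are illustrative.

    import numpy as np

    def cobb_angle(line1, line2):
        """Cobb angle (degrees) between two end-plate lines.

        Each line is ((x0, y0), (x1, y1)) in image coordinates, e.g. two
        points clicked by an observer on the digital radiograph.
        """
        def unit(line):
            (x0, y0), (x1, y1) = line
            v = np.array([x1 - x0, y1 - y0], dtype=float)
            return v / np.linalg.norm(v)

        u, v = unit(line1), unit(line2)
        # abs() makes the result independent of the click order on each line
        return float(np.degrees(np.arccos(np.clip(abs(u @ v), 0.0, 1.0))))

    # Example: end plates tilted +15 and -20 degrees -> Cobb angle of 35.
    upper = ((0.0, 0.0), (np.cos(np.radians(15)), np.sin(np.radians(15))))
    lower = ((0.0, 0.0), (np.cos(np.radians(-20)), np.sin(np.radians(-20))))
    print(round(cobb_angle(upper, lower), 1))   # 35.0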

Relevance:

100.00%

Publisher:

Abstract:

In this paper, methods are presented for automatic detection of the nipple and the pectoral muscle edge in mammograms via image processing in the Radon domain. Radon-domain information was used for the detection of straight-line candidates with high gradient. The longest straight-line candidate was used to identify the pectoral muscle edge. The nipple was detected as the convergence point of breast tissue components, indicated by the largest response in the Radon domain. Percentages of false-positive (FP) and false-negative (FN) areas were determined by comparing the areas of the pectoral muscle regions delimited manually by a radiologist and by the proposed method applied to 540 mediolateral-oblique (MLO) mammographic images. The average FP and FN were 8.99% and 9.13%, respectively. In the detection of the nipple, an average error of 7.4 mm was obtained with reference to the nipple as identified by a radiologist on 1,080 mammographic images (540 MLO and 540 craniocaudal views).
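
A minimal sketch of the line-detection step, assuming scikit-image's Radon transform and an illustrative gradient preprocessing (the parameters are not those of the paper): a high-gradient straight line integrates to a sharp peak in the Radon domain, whose coordinates give the line's angle and offset.

    import numpy as np
    from scipy import ndimage
    from skimage.transform import radon

    def strongest_line(image):
        """Angle (degrees) and offset index of the dominant straight edge."""
        grad = ndimage.gaussian_gradient_magnitude(image.astype(float), sigma=2)
        theta = np.arange(0.0, 180.0)          # projection angles in degrees
        sinogram = radon(grad, theta=theta, circle=False)
        offset_idx, angle_idx = np.unravel_index(np.argmax(sinogram),
                                                 sinogram.shape)
        return theta[angle_idx], offset_idx

    # Toy usage: a synthetic image with one bright diagonal stripe.
    img = np.zeros((128, 128))
    for i in range(128):
        img[i, max(0, i - 2):i + 2] = 1.0      # roughly 45-degree stripe
    print(strongest_line(img))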

Relevance:

100.00%

Publisher:

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with accuracy similar to that obtained from the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower for a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
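
A minimal sketch of the model image and the performance function being minimized (the cross-entropy optimizer itself is omitted, and the exact parameterization of the six quantities is an assumption for illustration):

    import numpy as np

    def elliptical_gaussian(shape, x0, y0, amp, a, ecc, phi):
        """One elliptical Gaussian: peak (x0, y0), peak intensity amp,
        major-axis scale a, eccentricity ecc (minor b = a*sqrt(1 - ecc^2)),
        major-axis orientation phi (radians)."""
        y, x = np.indices(shape)
        b = a * np.sqrt(1.0 - ecc ** 2)
        xr = (x - x0) * np.cos(phi) + (y - y0) * np.sin(phi)
        yr = -(x - x0) * np.sin(phi) + (y - y0) * np.cos(phi)
        return amp * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))

    def model_image(shape, params):
        """Sum of N_s components; params is a list of 6-tuples."""
        return sum(elliptical_gaussian(shape, *p) for p in params)

    def performance(params, observed):
        """Sum of squared differences between model and observed images."""
        return np.sum((model_image(observed.shape, params) - observed) ** 2)

    # Toy usage: a two-component 'jet' and the cost of a nearby trial model.
    truth = [(20, 32, 1.0, 4.0, 0.5, 0.3), (44, 32, 0.6, 3.0, 0.7, 0.3)]
    observed = model_image((64, 64), truth)
    trial = [(21, 31, 0.9, 4.0, 0.5, 0.3), (45, 33, 0.5, 3.0, 0.7, 0.3)]
    print(performance(trial, observed))     # > 0; zero only at the truth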

Relevance:

100.00%

Publisher:

Abstract:

Emission line ratios have been essential for determining physical parameters such as gas temperature and density in astrophysical gaseous nebulae. With the advent of panoramic spectroscopic devices, images of regions with emission lines related to these physical parameters can, in principle, also be produced. We show that, with observations from modern instruments, it is possible to transform images taken from density-sensitive forbidden lines into images of emission from high- and low-density clouds by applying a transformation matrix. In order to achieve this, images of the pairs of density-sensitive lines as well as the adjacent continuum have to be observed and combined. We have computed the critical densities for a series of pairs of lines in the infrared, optical, ultraviolet and X-ray bands, and calculated the pair line intensity ratios in the high- and low-density limits using a four- and five-level atom approximation. In order to illustrate the method, we applied it to Gemini Multi-Object Spectrograph (GMOS) Integral Field Unit (GMOS-IFU) data of two galactic nuclei. We conclude that this method provides new information of astrophysical interest, especially for mapping low- and high-density clouds; for this reason, we call it 'the ld/hd imaging method'.
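
A minimal sketch of such a transformation under a two-component assumption: if each pixel mixes gas at the low- and high-density limits, where the pair's intensity ratio takes known values R_low and R_high, the two line images can be unmixed by inverting a 2x2 matrix. The [S II] limit values below are approximate and purely illustrative.

    import numpy as np

    def ld_hd_decompose(I1, I2, R_low, R_high):
        """Split a density-sensitive line pair into low/high-density maps.

        I1, I2 : continuum-subtracted images of the two lines.
        R_low, R_high : I1/I2 ratios in the low- and high-density limits.
        Returns line-1 flux maps of the low- and high-density gas.
        """
        M = np.array([[1.0, 1.0],                    # I1 = f_low + f_high
                      [1.0 / R_low, 1.0 / R_high]])  # I2 from each regime
        Minv = np.linalg.inv(M)
        f_low = Minv[0, 0] * I1 + Minv[0, 1] * I2
        f_high = Minv[1, 0] * I1 + Minv[1, 1] * I2
        return f_low, f_high

    # Toy check with [S II] 6716/6731 limits of roughly 1.45 and 0.44.
    rng = np.random.default_rng(1)
    f_lo, f_hi = rng.uniform(0, 1, (2, 8, 8))
    I1 = f_lo + f_hi
    I2 = f_lo / 1.45 + f_hi / 0.44
    out_lo, out_hi = ld_hd_decompose(I1, I2, 1.45, 0.44)
    print(np.allclose(out_lo, f_lo), np.allclose(out_hi, f_hi))  # True True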

Relevance:

100.00%

Publisher:

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method of analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data onto these coordinates produce images we call tomograms. The association of the tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a previously unknown type 1 active nucleus. Furthermore, we show that it is displaced from the centre of its stellar bulge.
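
A minimal sketch of the PCA step, following the generic recipe rather than the authors' code: the cube is unfolded into a (spaxels x wavelengths) matrix, the eigenvectors of its covariance are eigenspectra, and the projections of the data onto them, refolded spatially, are the tomograms.

    import numpy as np

    def pca_tomography(cube, n_components=3):
        """PCA of a data cube with axes (ny, nx, nwave).

        Returns (eigenspectra, tomograms, variances), ordered by
        decreasing variance.
        """
        ny, nx, nwave = cube.shape
        X = cube.reshape(ny * nx, nwave)
        X = X - X.mean(axis=0)              # centre each spectral channel
        # SVD of the centred matrix: rows of Vt are the eigenspectra.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        eigenspectra = Vt[:n_components]
        tomograms = (X @ eigenspectra.T).reshape(ny, nx, n_components)
        variances = s[:n_components] ** 2 / (X.shape[0] - 1)
        return eigenspectra, tomograms, variances

    # Toy usage: a 16x16 field with 100 spectral channels.
    cube = np.random.default_rng(2).normal(size=(16, 16, 100))
    spectra, tomos, var = pca_tomography(cube)
    print(spectra.shape, tomos.shape)       # (3, 100) (16, 16, 3)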

Relevance:

100.00%

Publisher:

Abstract:

Navigation is a broad topic that has been receiving considerable attention from the mobile robotics community over the years. In order to execute autonomous driving in outdoor urban environments it is necessary to identify parts of the terrain that can be traversed and parts that should be avoided. This paper describes an analysis of terrain identification based on different visual information, using an MLP artificial neural network and combining the responses of many classifiers. Experimental tests using a vehicle and a video camera were conducted in real scenarios to evaluate the proposed approach.
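
A minimal sketch of the classify-and-combine idea using scikit-learn (the features, network sizes, and soft-voting rule are illustrative assumptions, not the paper's exact setup):

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Hypothetical per-patch features from two visual cues; one MLP per cue.
    rng = np.random.default_rng(3)
    n = 500
    X_color = rng.normal(size=(n, 6))       # e.g. colour statistics
    X_texture = rng.normal(size=(n, 8))     # e.g. texture statistics
    y = (X_color[:, 0] + X_texture[:, 0] > 0).astype(int)  # 1 = traversable

    clf_color = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                              random_state=0).fit(X_color, y)
    clf_texture = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                random_state=0).fit(X_texture, y)

    # Combine the classifiers' responses by averaging class probabilities.
    proba = (clf_color.predict_proba(X_color) +
             clf_texture.predict_proba(X_texture)) / 2.0
    print("training agreement:", (proba.argmax(axis=1) == y).mean())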

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an automatic method to detect and classify weathered aggregates by assessing changes in colors and textures. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented, and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms commonly considered for classification purposes. The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
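
A minimal sketch of an entropy feature of the kind described (Shannon entropy of a patch's grey-level histogram; the paper's exact feature set and network architectures are not reproduced):

    import numpy as np

    def histogram_entropy(patch, bins=256):
        """Shannon entropy (bits) of a patch's grey-level histogram.

        Richer surface texture spreads the histogram and raises the
        entropy, which is what makes it a useful feature here.
        """
        hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
        p = hist / hist.sum()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    # Toy usage: a flat patch vs. a heavily textured (noisy) patch.
    flat = np.full((32, 32), 128)
    noisy = np.random.default_rng(4).integers(0, 256, size=(32, 32))
    print(histogram_entropy(flat), histogram_entropy(noisy))  # 0.0 vs ~7.8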

Relevance:

100.00%

Publisher:

Abstract:

The count intercept is a robust method for the numerical analysis of fabrics (Launeau and Robin, 1996). It counts the number of intersections between a set of parallel scan lines and a mineral phase, which must be identified on a digital image. However, the method is only sensitive to boundaries and therefore assumes that the user has some knowledge of their significance. The aim of this paper is to show that a proper grey-level detection of boundaries along scan lines is sufficient to calculate the two-dimensional anisotropy of grain or crystal distributions without any particular image processing. Populations of grains and crystals usually display elliptical anisotropies in rocks. When confirmed by the intercept analysis, combining a minimum of 3 mean intercept length roses, taken on 3 roughly perpendicular sections, allows the calculation of 3-dimensional ellipsoids and the determination of their standard deviation in direction and intensity in 3 dimensions as well. The feasibility of this quick method is demonstrated by numerous examples on theoretical objects deformed by active and passive deformation, on BSE images of synthetic magma flow, on drawings or direct analysis of thin-section pictures of sandstones, and on digital images of granites taken and measured directly in the field. (C) 2010 Elsevier B.V. All rights reserved.
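
A minimal sketch of the intercept count for one scan direction, assuming a binary phase mask has already been obtained from the grey levels (SciPy-based; building the full rose repeats this over many angles, and the toy fabric is purely illustrative):

    import numpy as np
    from scipy import ndimage

    def intercept_count(mask, angle_deg):
        """Boundaries crossed by horizontal scan lines after rotating
        the binary phase mask by angle_deg."""
        rot = ndimage.rotate(mask.astype(float), angle_deg,
                             order=0, reshape=True) > 0.5
        # A boundary is a 0->1 or 1->0 transition along each scan line.
        return int(np.abs(np.diff(rot.astype(int), axis=1)).sum())

    # Toy fabric elongated along x: the rose of counts is anisotropic,
    # with fewer intercepts along the elongation direction.
    rng = np.random.default_rng(5)
    mask = rng.random((128, 128)) > 0.5
    mask = ndimage.binary_opening(mask, np.ones((1, 5)))
    print({a: intercept_count(mask, a) for a in range(0, 180, 30)})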

Relevance:

100.00%

Publisher:

Abstract:

The design of binary morphological operators that are translation-invariant and locally defined by a finite neighborhood window corresponds to the problem of designing Boolean functions. As in any supervised classification problem, morphological operators designed from a training sample also suffer from overfitting. Large neighborhoods tend to degrade the performance of the designed operator. This work proposes a multilevel design approach to deal with the issue of designing large neighborhood-based operators. The main idea is inspired by stacked generalization (a multilevel classifier design approach) and consists of, at each training level, combining the outcomes of the previous level's operators. The final operator is a multilevel operator that ultimately depends on a larger neighborhood than that of the individual operators that have been combined. Experimental results show that two-level operators obtained by combining operators designed on subwindows of a large window consistently outperform the single-level operators designed on the full window. They also show that iterating two-level operators is an effective multilevel approach to obtain better results.
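
A minimal sketch of the two-level idea (plug-in training of window operators on a toy image; the subwindows, the level-two combination, and the data are illustrative assumptions, not the paper's experimental setup):

    import numpy as np
    from collections import Counter, defaultdict

    def train_w_operator(obs, ideal, offsets):
        """Map each pattern seen through the window (offsets) to the
        most frequent ideal value at the window origin."""
        votes = defaultdict(Counter)
        H, W = obs.shape
        for i in range(H):
            for j in range(W):
                pat = tuple(obs[(i + di) % H, (j + dj) % W] for di, dj in offsets)
                votes[pat][ideal[i, j]] += 1
        return {pat: c.most_common(1)[0][0] for pat, c in votes.items()}

    def apply_w_operator(table, obs, offsets, default=0):
        H, W = obs.shape
        out = np.zeros_like(obs)
        for i in range(H):
            for j in range(W):
                pat = tuple(obs[(i + di) % H, (j + dj) % W] for di, dj in offsets)
                out[i, j] = table.get(pat, default)
        return out

    # Toy data: vertical stripes corrupted by salt-and-pepper noise.
    rng = np.random.default_rng(6)
    ideal = np.tile((np.arange(32) // 4) % 2, (32, 1))
    obs = np.where(rng.random((32, 32)) < 0.1, 1 - ideal, ideal)

    # Level 1: operators on 1x3 and 3x1 subwindows of a 3x3 window.
    sub_h = [(0, -1), (0, 0), (0, 1)]
    sub_v = [(-1, 0), (0, 0), (1, 0)]
    out_h = apply_w_operator(train_w_operator(obs, ideal, sub_h), obs, sub_h)
    out_v = apply_w_operator(train_w_operator(obs, ideal, sub_v), obs, sub_v)

    # Level 2: a second operator reads the level-1 outputs pixelwise, so
    # the final operator depends on the union of both subwindows.
    votes = defaultdict(Counter)
    for i in range(32):
        for j in range(32):
            votes[(out_h[i, j], out_v[i, j])][ideal[i, j]] += 1
    table2 = {p: c.most_common(1)[0][0] for p, c in votes.items()}
    final = np.array([[table2[(out_h[i, j], out_v[i, j])] for j in range(32)]
                      for i in range(32)])
    print("noisy error:", (obs != ideal).mean(),
          "two-level error:", (final != ideal).mean())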