897 results for Mesh generation from image data
Abstract:
The past years have shown enormous advances in sequencing and array-based technologies, producing supplementary or alternative views of the genome stored in various formats and databases. Their sheer volume and differing data scope pose a challenge to jointly visualizing and integrating diverse data types. We present AmalgamScope, a new interactive software tool that assists scientists with the annotation of the human genome and, in particular, the integration of annotation files from multiple data types using gene identifiers and genomic coordinates. Supported platforms include next-generation sequencing and microarray technologies. The features of AmalgamScope range from the annotation of diverse data types across the human genome to the integration of the data based on the annotation information and the visualization of the merged files within chromosomal regions or across the whole genome. Additionally, users can define custom transcriptome library files for any species and use the tool's remote file-exchange server options.
Abstract:
The increased availability of digital elevation models and satellite image data enables testing of morphometric relationships between sand dune variables (dune height, spacing, and equivalent sand thickness) that were originally established using limited field survey data. These long-established geomorphological hypotheses can now be tested against much larger samples than were possible when the available data were limited to what could be collected by field surveys alone. This project uses ASTER Global Digital Elevation Model (GDEM) data to compare morphometric relationships between sand dune variables in the southwest Kalahari dunefield to those of the Namib Sand Sea, testing whether the relationships found in an active sand sea (Namib) also hold for the fixed dune system of the nearby southwest Kalahari. The data show significant morphometric differences between the simple linear dunes of the Namib Sand Sea and the southwest Kalahari; the latter do not show the expected positive relationship between dune height and spacing. The southwest Kalahari dunes show a similar range of dune spacings, but they are, on average, less tall than the Namib Sand Sea dunes. There is a clear spatial pattern to these morphometric data: the tallest and most closely spaced dunes are towards the southeast of the Kalahari dunefield, and this is where the highest values of equivalent sand thickness occur. We consider the possible reasons for the observed differences and highlight the need for more studies comparing sand seas and dunefields from different environmental settings.
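The height-spacing test described above is, at its core, a bivariate regression on crest samples extracted from a DEM. A minimal sketch with synthetic data (all values hypothetical, not taken from the ASTER GDEM):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dune crest samples: spacing (m) and height (m).
# In an active sand sea a positive height-spacing relationship is expected.
spacing = rng.uniform(200, 1500, 300)
height = 0.02 * spacing + rng.normal(0, 3, 300)  # synthetic "Namib-like" dunes

# Ordinary least-squares fit: height = a * spacing + b
a, b = np.polyfit(spacing, height, 1)
r = np.corrcoef(spacing, height)[0, 1]
print(f"slope={a:.4f}, intercept={b:.2f}, r={r:.2f}")

# One simple proxy for equivalent sand thickness (EST) from height and
# spacing (the constant is illustrative, not a published calibration):
est = 0.5 * height**2 / spacing
```

Repeating the fit for a second region and comparing slopes and correlation coefficients is the kind of comparison the study performs between the Namib and Kalahari samples.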
Abstract:
Definition of the long-term variation of the geomagnetic virtual dipole moment requires more reliable paleointensity results. Here, we applied a multisample protocol to the study of the 130.5 Ma Ponta Grossa basaltic dikes (southern Brazil), which carry a very stable dual-polarity magnetic component. The magnetic stability of the samples was checked using thermomagnetic curves and by monitoring the evolution of magnetic susceptibility through the paleointensity experiments. Twelve sites containing the least alterable samples were chosen for the paleointensity measurements. Although these rocks failed stepwise double-heating experiments, they yielded coherent results in the multisample method for all sites but one. The coherent sites show low to moderate field intensities between 5.7 ± 0.2 and 26.4 ± 0.7 μT (average 13.4 ± 1.9 μT). Virtual dipole moments for these sites range from 1.3 ± 0.04 to 6.0 ± 0.2 × 10²² A m² (average 2.9 ± 0.5 × 10²² A m²). Our results agree with the tendency for low dipole moments during the Early Cretaceous, immediately prior to the Cretaceous Normal Superchron (CNS). The available paleointensity database shows a strong variability of the field between 80 and 160 Ma. There seems to be no firm evidence for a Mesozoic Dipole Low, but a long-term tendency does emerge from the data, with the highest dipole moments occurring at the middle of the CNS.
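A virtual dipole moment converts a site-mean paleointensity into an equivalent geocentric dipole using the site's magnetic colatitude. A minimal sketch of the standard conversion (the inclination value below is assumed for illustration only, not taken from the study):

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability (T m / A)
R_EARTH = 6.371e6          # Earth radius (m)

def virtual_dipole_moment(intensity_t, inclination_deg):
    """Virtual dipole moment (A m^2) from a paleointensity (in tesla)
    and the site-mean inclination, via the geocentric axial dipole
    relation tan(I) = 2 * cot(theta_m), theta_m = magnetic colatitude."""
    inc = math.radians(inclination_deg)
    theta_m = math.atan2(2.0, math.tan(inc))      # tan(theta_m) = 2 / tan(I)
    factor = math.sqrt(1.0 + 3.0 * math.cos(theta_m) ** 2)
    return 4.0 * math.pi * R_EARTH ** 3 * intensity_t / (MU0 * factor)

# Hypothetical example: the study's mean intensity (13.4 uT) at an
# assumed inclination of -50 degrees gives a VDM of order 10^22 A m^2,
# the same order as the site averages quoted above.
vdm = virtual_dipole_moment(13.4e-6, -50.0)
print(f"VDM = {vdm:.2e} A m^2")
```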
Abstract:
Chemometric methods can contribute to soil research by permitting the extraction of more information from the data. The aim of this work was to use Principal Component Analysis to evaluate data obtained through chemical and spectroscopic methods on the changes in the humification process of soil organic matter from two tropical soils after sewage sludge application. Humic acids extracted from Typic Eutrorthox and Typic Haplorthox soils, with and without sewage sludge application for 7 consecutive years, were studied. The results obtained for all of the samples and methods showed two clusters, corresponding to the samples extracted from the two soil types. This expected result indicated that the textural difference between the two soils was more significant than the differences between treatments (control and sewage sludge application) or between depths. An individual chemometric treatment was therefore made for each type of soil. The characterization of the humic acids extracted from soils with and without sewage sludge application after 7 consecutive years, using several methods, supplies important results about changes in the humification degree of soil organic matter. These important results obtained by Principal Component Analysis justify further research using these methods to characterize the changes in the humic acids extracted from sewage sludge-amended soils. (C) 2009 Elsevier B.V. All rights reserved.
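The chemometric treatment described is a standard PCA score projection, in which samples cluster along the leading components. A minimal sketch with synthetic spectra (all data hypothetical), assuming mean-centering before the decomposition:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical matrix: 20 humic-acid samples x 50 spectral variables,
# two simulated soil types with an offset to mimic the textural contrast.
X = rng.normal(0, 1, (20, 50))
X[10:] += 2.0                         # second "soil type" shifted

# PCA by SVD on the mean-centered matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                        # sample scores on the PCs
explained = S**2 / np.sum(S**2)       # variance fraction per component

# The two simulated soil types separate along PC1, the analogue of the
# two clusters reported in the study.
print(f"PC1 explains {explained[0]:.0%} of the variance")
```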
Abstract:
A student from the Data Processing program at the New York Trade School is shown working. Black and white photograph with some edge damage due to writing in black along the top.
Abstract:
A key to maintaining an enterprise's competitiveness is the ability to describe, standardize, and adapt the way it reacts to certain types of business events and how it interacts with suppliers, partners, competitors, and customers. In this context, the field of organization modeling has emerged with the aim of creating models that help to create a state of self-awareness in the organization. This project's context is the use of the Semantic Web in the organizational modeling area: the advantages of Semantic Web technology can be used to improve the way organizations are modeled. This was accomplished by using a semantic wiki to model organizations. Our research and implementation had two main purposes: the formalization of textual content in semantic wiki pages, and the automatic generation of diagrams from organization data stored in the semantic wiki pages.
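The two purposes above, formalized annotations in wiki text and diagrams generated from them, can be sketched together: extract subject-property-object triples from semantic-wiki-style markup and emit a Graphviz DOT chart. The annotation syntax, page contents, and property name below are hypothetical, chosen only to resemble common semantic-wiki conventions:

```python
import re

PAGES = {  # hypothetical wiki pages: name -> wikitext
    "Bob":   "Bob works in sales. [[reportsTo::Alice]]",
    "Carol": "[[reportsTo::Alice]] Carol leads QA.",
    "Alice": "Alice is the director.",
}

# Matches annotations of the form [[property::value]].
TRIPLE = re.compile(r"\[\[(\w+)::([^\]]+)\]\]")

def extract_triples(pages):
    """Return (subject, property, object) triples from all pages."""
    return [(page, prop, obj)
            for page, text in pages.items()
            for prop, obj in TRIPLE.findall(text)]

def to_dot(triples):
    """Render 'reportsTo' triples as a DOT digraph (an org chart)."""
    edges = [f'  "{s}" -> "{o}";' for s, p, o in triples if p == "reportsTo"]
    return "digraph org {\n" + "\n".join(edges) + "\n}"

print(to_dot(extract_triples(PAGES)))
```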
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Models where the dark matter component of the Universe interacts with the dark energy field have been proposed as a solution to the cosmic coincidence problem, since in the attractor regime both dark energy and dark matter scale in the same way. In these models the mass of the cold dark matter particles is a function of the dark energy field responsible for the present acceleration of the Universe, and different scenarios can be parametrized by how the mass of the cold dark matter particles evolves with time. In this article we study the impact of a constant coupling δ between dark energy and dark matter on the determination of a redshift-dependent dark energy equation of state w_DE(z) and on the dark matter density today from SNIa data. We derive an analytical expression for the luminosity distance in this case. In particular, we show that the presence of such a coupling increases the tension between the cosmic microwave background data, from the analysis of the shift parameter in models with constant w_DE, and SNIa data for realistic values of the present dark matter density fraction. Thus, an independent measurement of the present dark matter density can place constraints on models with interacting dark energy.
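The effect of a constant coupling on distance measures can be illustrated numerically. The sketch below assumes, as one common parametrization and not necessarily the paper's analytical expression, that the dark matter density dilutes as (1+z)^(3-δ) in a flat universe with constant w; the luminosity distance then follows from a quadrature of 1/E(z):

```python
import math

def E(z, om=0.3, w=-1.0, delta=0.0):
    """Dimensionless Hubble rate H(z)/H0 for a flat universe where dark
    matter dilutes as (1+z)^(3-delta) (assumed coupling parametrization)
    and dark energy has constant equation of state w."""
    return math.sqrt(om * (1 + z) ** (3 - delta)
                     + (1 - om) * (1 + z) ** (3 * (1 + w)))

def luminosity_distance(z, om=0.3, w=-1.0, delta=0.0, h0=70.0, n=2000):
    """d_L in Mpc: (1+z) * (c/H0) * integral of dz'/E(z'), trapezoidal."""
    c = 299792.458  # km/s
    zs = [i * z / n for i in range(n + 1)]
    f = [1.0 / E(zi, om, w, delta) for zi in zs]
    integral = sum((f[i] + f[i + 1]) * (z / n) / 2 for i in range(n))
    return (1 + z) * c / h0 * integral

# A nonzero coupling shifts d_L at fixed (om, w), which is the degeneracy
# that makes an independent dark matter density measurement valuable.
d0 = luminosity_distance(1.0)
d1 = luminosity_distance(1.0, delta=0.1)
print(f"d_L(z=1): delta=0 -> {d0:.0f} Mpc, delta=0.1 -> {d1:.0f} Mpc")
```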
Abstract:
The capacitated p-median problem (CPMP) seeks the optimal location of p facilities, considering distances and capacities for the service to be given by each median. In this paper we present a column generation approach to the CPMP. The restricted master problem optimizes the covering of 1-median clusters satisfying the capacity constraints, and new columns are generated by solving knapsack subproblems. The Lagrangean/surrogate relaxation has recently been used to accelerate subgradient-like methods. In this work the Lagrangean/surrogate relaxation is identified directly from the master problem dual and provides new bounds and new productive columns through a modified knapsack subproblem. The overall column generation process is accelerated, even when multiple pricing is used. Computational tests are presented using instances taken from real data from the city of São José dos Campos.
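The pricing step in this scheme can be sketched as a 0/1 knapsack over reduced costs: given the duals of the restricted master problem, a new 1-median cluster column selects clients that fit the median's capacity and maximize total dual profit. This is an illustrative sketch with hypothetical data, not the paper's implementation or its Lagrangean/surrogate modification:

```python
def price_column(duals, demands, assign_costs, capacity):
    """Maximize sum of (dual_i - assign_cost_i) over selected clients,
    subject to total demand <= capacity. Returns (value, selection)."""
    n = len(duals)
    # DP over capacity: best[c] = (value, chosen clients) at capacity c.
    best = [(0.0, frozenset())] * (capacity + 1)
    for i in range(n):
        profit = duals[i] - assign_costs[i]
        if profit <= 0:
            continue  # a non-positive profit never improves the column
        new = best[:]
        for c in range(capacity, demands[i] - 1, -1):
            v, s = best[c - demands[i]]
            if v + profit > new[c][0]:
                new[c] = (v + profit, s | {i})
        best = new
    return max(best, key=lambda t: t[0])

# Hypothetical data: 4 clients, duals and assignment costs to one median.
value, chosen = price_column(
    duals=[5.0, 4.0, 3.0, 6.0],
    demands=[2, 3, 2, 4],
    assign_costs=[1.0, 1.0, 2.5, 5.5],
    capacity=5,
)
# The column enters the master problem if its value beats the median's
# dual contribution (that comparison is omitted here).
print(value, sorted(chosen))
```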
Abstract:
A search for pair production of third-generation scalar leptoquarks and supersymmetric top quark partners, top squarks, in final states involving tau leptons and bottom quarks is presented. The search uses events from a data sample of proton-proton collisions corresponding to an integrated luminosity of 19.7 fb⁻¹, collected with the CMS detector at the LHC at √s = 8 TeV. The number of observed events is found to be in agreement with the expected standard model background. Third-generation scalar leptoquarks with masses below 740 GeV are excluded at 95% confidence level, assuming a 100% branching fraction for the leptoquark decay to a tau lepton and a bottom quark. In addition, this mass limit applies directly to top squarks decaying via an R-parity-violating coupling λ′₃₃₃. The search also considers a similar signature from top squarks undergoing a chargino-mediated decay involving the R-parity-violating coupling λ′₃ⱼₖ. Each top squark decays to a tau lepton, a bottom quark, and two light quarks. Top squarks in this model with masses below 580 GeV are excluded at 95% confidence level. The constraint on the leptoquark mass is the most stringent to date, and this is the first search for top squarks decaying via λ′₃ⱼₖ. (C) 2014 The Authors. Published by Elsevier B.V.
Abstract:
Graduate Program in Mechanical Engineering - FEIS
Abstract:
Content-based image retrieval (CBIR) is still a challenging issue due to the inherent complexity of images and the choice of the most discriminant descriptors. Recent developments in the field have introduced multidimensional projections to boost accuracy in the retrieval process, but many issues, such as the introduction of pattern recognition tasks and deeper user intervention to assist the process of choosing the most discriminant features, still remain unaddressed. In this paper, we present a novel framework for CBIR that combines pattern recognition tasks, class-specific metrics, and multidimensional projection to devise an effective and interactive image retrieval system. User interaction plays an essential role in the computation of the final multidimensional projection from which image retrieval will be attained. Results have shown that the proposed approach outperforms existing methods, making it a very attractive alternative for managing image data sets.
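One way to see the role of class-specific metrics in retrieval is as feature weighting: dimensions that discriminate a class get heavier weights in the distance used for ranking. A minimal sketch with synthetic descriptors (data and weights hypothetical, and much simpler than the projection-based framework described):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical descriptor matrix: 30 images x 8 features, 2 classes.
X = rng.normal(0, 1, (30, 8))
labels = np.repeat([0, 1], 15)
X[labels == 1, :2] += 3.0            # the first two features discriminate

# Emulate a class-specific metric with feature weights that emphasize
# the discriminant dimensions (weights assumed for illustration).
w = np.array([2.0, 2.0, 1, 1, 1, 1, 1, 1])

def retrieve(query_idx, k=5):
    """Return indices of the k nearest images under the weighted metric."""
    d = np.sqrt((((X - X[query_idx]) * w) ** 2).sum(axis=1))
    d[query_idx] = np.inf            # exclude the query itself
    return np.argsort(d)[:k]

hits = retrieve(0)
precision = float(np.mean(labels[hits] == labels[0]))
print(f"precision@5 = {precision:.2f}")
```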
Abstract:
Empirical approaches and, more recently, physical approaches have grounded the establishment of logical connections between radiometric variables derived from remote sensing data and biophysical variables derived from vegetation cover. This study aimed to evaluate correlations of dendrometric and density data from canopies of Eucalyptus spp., as collected in the Capão Bonito forest unit, with radiometric data from imagery acquired by the TM/Landsat-5 sensor on two orbital passages over the study site (dates close to field data collection). Results indicate that the stronger correlations were identified between crown dimensions and canopy height with near-infrared spectral band data (ρs4), irrespective of the satellite passage date. Estimates of the spatial distribution of dendrometric data and canopy density (D) using spectral characterization were consistent with the spatial distribution of tree ages during the study period. Statistical tests were applied to evaluate performance disparities of empirical models depending on the date on which the data were acquired. Results indicated a significant difference between models based on distinct data acquisition dates.
Abstract:
Fractal theory has a large number of applications to image and signal analysis. Although the fractal dimension can be used as an image object descriptor, a multiscale approach, such as the multiscale fractal dimension (MFD), increases the amount of information extracted from an object. The MFD provides a curve that describes object complexity along the scale. However, this curve contains much redundant information, which could be discarded without loss in performance. Thus, a descriptor technique is needed to analyze this curve and to reduce the dimensionality of the data by selecting its meaningful descriptors. This paper presents a comparative study among different techniques for MFD descriptor generation. It compares the use of well-known and state-of-the-art descriptors, such as Fourier, Wavelet, Polynomial Approximation (PA), Functional Data Analysis (FDA), Principal Component Analysis (PCA), Symbolic Aggregate Approximation (SAX), kernel PCA, Independent Component Analysis (ICA), and geometrical and statistical features. The descriptors are evaluated in a classification experiment using Linear Discriminant Analysis over the descriptors computed from MFD curves from two data sets: generic shapes and rotated fish contours. Results indicate that PCA, FDA, PA, and Wavelet Approximation provide the best MFD descriptors for recognition and classification tasks. (C) 2012 Elsevier B.V. All rights reserved.
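The underlying quantity in all of these schemes is a fractal dimension estimated across scales. A minimal box-counting sketch on a synthetic shape (the shape and scale set are illustrative; the paper's MFD curve is built from per-scale estimates rather than the single global slope shown here):

```python
import numpy as np

# Binary image of a filled square: its box-counting dimension should be 2.
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True

def box_count(img, size):
    """Number of size x size boxes containing any foreground pixel."""
    h, w = img.shape
    blocks = img[:h - h % size, :w - w % size] \
        .reshape(h // size, size, w // size, size)
    return int(blocks.any(axis=(1, 3)).sum())

sizes = [2, 4, 8, 16, 32]
counts = [box_count(img, s) for s in sizes]

# The slope of log(count) vs log(1/size) estimates the fractal dimension;
# per-scale slopes form the multiscale curve that the compared descriptor
# techniques (Fourier, PA, PCA, ...) then summarize.
log_s = np.log([1.0 / s for s in sizes])
log_n = np.log(counts)
dim = np.polyfit(log_s, log_n, 1)[0]
print(f"estimated dimension = {dim:.2f}")
```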
Abstract:
OBJECTIVE: To evaluate tools for the fusion of images generated by tomography and by structural and functional magnetic resonance imaging. METHODS: Magnetic resonance and functional magnetic resonance imaging were performed in a 3-Tesla scanner while a volunteer, who had previously undergone cranial tomography, carried out motor and somatosensory tasks. Image data were analyzed with different programs, and the results were compared. RESULTS: We constructed a flow chart of computational processes that allowed measurement of the spatial congruence between the methods. No single computational tool contained the entire set of functions necessary to achieve the goal. CONCLUSION: The fusion of the images from the three methods proved to be feasible with the use of four free-access software programs (OsiriX, Register, MRIcro and FSL). Our results may serve as a basis for building software that will be useful as a virtual tool prior to neurosurgery.