127 results for Sistema de processamento de informações georeferenciadas
Abstract:
This study presents the results of a remote-sensing analysis of areas susceptible to degradation in a semi-arid region, a matter of concern that affects the whole population; the process is driven by deforestation of the savanna and improper land-use practices. The objective of this research is to use biophysical parameters from MODIS/Terra and TM/Landsat-5 images to identify areas susceptible to degradation in the semi-arid zone of Paraíba. The study area is located in the central interior of Paraíba, in the sub-basin of the Taperoá River, with average annual rainfall below 400 mm and an average annual temperature of 28 °C. The vegetation map was produced from TM/Landsat-5 images, specifically the 5R4G3B color composite commonly used for land-use mapping, through unsupervised classification with the maximum-likelihood method. The legend corresponds to the following targets: sparse and dense savanna vegetation, riparian vegetation, and exposed soil. The MODIS biophysical parameters used were emissivity, albedo, and the Normalized Difference Vegetation Index (NDVI). The GIS programs used were the MODIS Reprojection Tools and the Georeferenced Information Processing System (SPRING), in which the database of MODIS and TM information was set up and processed, with the ArcGIS software used to produce more customizable maps. Initially, the behavior of vegetation emissivity was evaluated by adapting Bastiaanssen's NDVI-based equation to spatialize emissivity and observe its changes during 2006. The albedo was used to assess its percentage increase between December 2003 and December 2004. The Landsat TM images were used for December 2005, according to image availability and during low-emissivity periods. These applications were implemented in the Spatial Language for Algebraic Geoprocessing (LEGAL), a SPRING programming facility that performs various types of algebra on spatial data and maps. The detection of areas susceptible to environmental degradation took into account the emissivity behavior of the savanna, which showed a seasonality coinciding with the rainy season, reaching maximum emissivity from April to July and low emissivity in the remaining months. From the albedo images of December 2003 and 2004, the percentage increase was computed, yielding two distinct classes: areas with a percentage variation of 1% to 11.6% and areas with a variation below 1%. It was then possible to generate the map of susceptibility to environmental degradation by intersecting the exposed-soil class with the albedo percentage variation, resulting in classes of susceptibility to environmental degradation.
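The two raster operations at the core of this abstract can be illustrated with a minimal NumPy sketch. The abstract does not give the adapted form of Bastiaanssen's equation, so the sketch assumes the widely cited SEBAL relation e0 = 1.009 + 0.047 ln(NDVI); the study's actual LEGAL routines are not reproduced here.

```python
import numpy as np

def emissivity_from_ndvi(ndvi):
    """Broadband emissivity from NDVI in the Bastiaanssen (SEBAL) form,
    e0 = 1.009 + 0.047 * ln(NDVI); defined only for vegetated pixels
    (NDVI > 0), so other pixels are returned as NaN."""
    ndvi = np.asarray(ndvi, dtype=float)
    safe = np.where(ndvi > 0, ndvi, 1.0)  # avoid log of non-positive values
    return np.where(ndvi > 0, 1.009 + 0.047 * np.log(safe), np.nan)

def albedo_change_classes(albedo_a, albedo_b, threshold=1.0):
    """Percentage change of albedo between two dates, split into the two
    classes of the study: variation below 1%, and variation of 1% or more
    (up to the observed 11.6%)."""
    change = 100.0 * (albedo_b - albedo_a) / albedo_a
    return (change >= threshold).astype(int), change
```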
Abstract:
The use of three-dimensional (3D) graphical objects in multimedia applications is gaining ever more space in the media. Networks with high transmission rates and computers with large processing and graphics capacity boost and popularize such three-dimensional applications. The application areas of 3D range from military applications and entertainment to applications geared toward education. Among those related to education, we highlight applications that create virtual copies of cultural spaces such as museums. Through such a copy, one can virtually visit a museum, see other users, communicate, exchange information on the works, and so on, thereby allowing remote users to visit physically distant museums. A major problem of such virtual environments is keeping them up to date. Because they deal with several media (text, images, sounds, and 3D models), handling and updating them in a virtual environment requires staff with specialized knowledge, and museums rarely have people with this profile on their teams. Within the GT-MV (Grupo de Trabalho de Museus Virtuais), funded by RNP (Rede Nacional de Ensino e Pesquisa), we propose a portal for the registration, modification, and collaborative visiting of virtual museums in Brazil. In a system of national scale like this, updating, whether related to the works or to the physical space, would be impossible if done only by the project team. Within this scenario, we propose the modeling and implementation of a tool that allows editing virtual spaces in a way that is easy and intuitive compared with the available tools. Within the context of the GT-MV, we apply the SAMVC (Sistema de Autoria de Museus Virtuais Colaborativos) to museums whose curators build the museum from a 2D floor plan; from this two-dimensional information, the system recreates the equivalent in three dimensions. Thus, with little or no training, team members from each museum can be responsible for keeping the system up to date.
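The floor-plan-to-3D step can be sketched minimally. The Wall2D type, the fixed wall height, and the extrude_wall helper below are hypothetical illustrations of the general extrusion idea, not SAMVC's actual data model.

```python
from dataclasses import dataclass

@dataclass
class Wall2D:
    x1: float
    y1: float
    x2: float
    y2: float  # endpoints of one wall segment on the floor plan

def extrude_wall(wall: Wall2D, height: float = 3.0):
    """Extrude a 2D wall segment into the four 3D corner vertices of a
    vertical quad (y is the up axis): the basic step when recreating a
    museum room in three dimensions from its floor plan."""
    return [
        (wall.x1, 0.0, wall.y1),
        (wall.x2, 0.0, wall.y2),
        (wall.x2, height, wall.y2),
        (wall.x1, height, wall.y1),
    ]
```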
Abstract:
This work discusses the importance of image compression for industry: processing and storing images is a standing challenge at Petrobras, which needs to optimize storage time and store the maximum number of images and data. We present an interactive system for processing and storing images in the wavelet domain, together with an interface for digital image processing. The proposal is based on the Peano function and the 1D wavelet transform. The storage system aims to optimize computational space, both for storage and for transmission of images. The Peano function is applied to linearize the images, and the 1D wavelet transform to decompose them. These steps extract the information relevant for storing an image at a lower computational cost and with a very small margin of error: comparing the original and processed images shows little loss of quality when the presented processing system is applied. The results obtained from the information extracted from the images are displayed in a graphical interface, through which the user opens the files and views and analyzes the results of the programs directly on screen, without having to deal with the source code. The graphical interface and the programs for image processing via the Peano function and the 1D wavelet transform were developed in Java, allowing a direct exchange of information between them and the user.
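A sketch of the pipeline under stated simplifications: a serpentine (boustrophedon) scan stands in for the Peano space-filling curve, and PyWavelets replaces the Java implementation; both scans turn a 2D image into a 1D signal that preserves local pixel neighborhoods.

```python
import numpy as np
import pywt  # PyWavelets

def linearize(img):
    """Serpentine scan: a simplified stand-in for the Peano curve used
    in the original system, mapping a 2D image to a 1D signal."""
    rows = [row if i % 2 == 0 else row[::-1] for i, row in enumerate(img)]
    return np.concatenate(rows)

def compress(signal, wavelet="haar", level=4, keep=0.1):
    """Decompose the 1D signal and zero all but the largest `keep`
    fraction of coefficients, trading a small reconstruction error for
    storage savings. pywt.waverec(coeffs, wavelet) reconstructs."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate(coeffs)
    cutoff = np.quantile(np.abs(flat), 1.0 - keep)
    return [np.where(np.abs(c) >= cutoff, c, 0.0) for c in coeffs]
```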
Abstract:
The number of applications based on embedded systems grows significantly every year, and even though embedded systems have restrictions and simple processing units, their performance improves constantly. However, as the complexity of applications also increases, better performance will always be necessary. So, despite such advances, there are cases in which an embedded system with a single processing unit is not sufficient to process the information in real time. To improve the performance of these systems, parallel processing can be used in more complex applications that require high performance. The idea is to move beyond the applications that already use embedded systems, exploring the use of a set of processing units working together to execute an intelligent algorithm. There is a wide range of existing work in the areas of parallel processing, intelligent systems, and embedded systems; however, works that link these three areas to solve a problem are few. In this context, this work uses the tools available for FPGA architectures to develop a multiprocessor platform for pattern classification with artificial neural networks.
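A sketch of the data-parallel idea, splitting one fully connected neural-network layer across worker processes; the dissertation maps the same idea onto multiple processors instantiated on an FPGA, which is not reproduced here.

```python
import numpy as np
from multiprocessing import Pool

def neuron_block(args):
    """One processing unit computes the outputs of its share of neurons."""
    W_block, b_block, x = args
    return np.tanh(W_block @ x + b_block)

def parallel_layer(W, b, x, n_units=4):
    """Split a layer across n_units workers, mirroring a multiprocessor
    platform in which each core holds a slice of the weight matrix."""
    Ws = np.array_split(W, n_units)
    bs = np.array_split(b, n_units)
    with Pool(n_units) as pool:
        parts = pool.map(neuron_block, [(Wb, bb, x) for Wb, bb in zip(Ws, bs)])
    return np.concatenate(parts)

# On Windows, call parallel_layer() under an `if __name__ == "__main__":` guard.
```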
Abstract:
Large efforts have been made by the scientific community on tasks involving the locomotion of mobile robots. To execute this kind of task, the robot must be given the ability to navigate through the environment safely, that is, without colliding with objects. For this, it is necessary to implement strategies that make it possible to detect obstacles. In this work, we address this problem by proposing a system able to collect sensory information and to estimate the possibility of obstacles occurring in the mobile robot's path. Stereo cameras positioned parallel to each other in a structure coupled to the robot are employed as the main sensory device, making it possible to generate a disparity map. Code optimizations and a strategy for data reduction and abstraction are applied to the images, resulting in a substantial gain in execution time. This allows the high-level decision processes to perform obstacle avoidance in real time. The system can be employed in situations where the robot is remotely operated, as well as in situations where it depends only on itself to generate trajectories (the autonomous case).
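The disparity-map front end can be sketched with OpenCV's standard block matcher; the downscaling step below is a simple stand-in for the work's data-reduction strategy, and the parameter values are illustrative.

```python
import cv2

def disparity_map(left_path, right_path, scale=0.5):
    """Block-matching disparity from a parallel stereo pair. Nearer
    obstacles produce larger disparities; downscaling reduces the data
    volume so high-level deviation logic can run in real time."""
    left = cv2.imread(left_path, cv2.IMREAD_GRAYSCALE)
    right = cv2.imread(right_path, cv2.IMREAD_GRAYSCALE)
    left = cv2.resize(left, None, fx=scale, fy=scale)
    right = cv2.resize(right, None, fx=scale, fy=scale)
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # compute() returns disparities in OpenCV's 16x fixed-point convention
    return matcher.compute(left, right)
```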
Abstract:
Internet applications such as media streaming, collaborative computing, and massively multiplayer games are on the rise. This leads to the need for multicast communication; unfortunately, group communication support based on IP multicast has not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services (they are called peer-to-peer applications), and their participants come and go very dynamically. Thus, server-centric architectures for membership management have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. In this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented over a distributed, shared, bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways: first, through a simulator developed in this work, in which the quality of the distribution tree was evaluated by metrics such as out-degree and path length; second, through real-life scenarios built in the ns-3 network simulator, in which the protocol's network performance was evaluated by metrics such as stress, stretch, time to first packet, and group reconfiguration time.
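The location-aware join step can be sketched generically. The Node type and the use of RTT as the distance measure below are assumptions: the actual IPXY metric and membership heuristic are specific to the thesis.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    out_degree: int = 0  # children currently attached in the overlay tree

def choose_parent(candidates, rtt, max_outdegree=4):
    """Location-aware join for an application-layer multicast tree: among
    nodes with spare out-degree, attach the newcomer to the one closest
    in the underlying network (RTT stands in for the IPXY metric)."""
    eligible = [n for n in candidates if n.out_degree < max_outdegree]
    if not eligible:
        raise RuntimeError("no eligible parent; the tree must be reconfigured")
    return min(eligible, key=lambda n: rtt[n.node_id])

# e.g. choose_parent([Node("a", 4), Node("b", 1)], {"a": 12.0, "b": 30.0})
# returns node "b": "a" is closer, but its out-degree is exhausted.
```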
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the parts that make up a seismic study. Seismic processing, in particular, is focused on imaging the geological structures of the subsurface. It has evolved significantly in recent decades, driven by the demands of the oil industry and by hardware advances that brought greater storage and digital information processing capacity, enabling more sophisticated processing algorithms such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults, salt domes, and other structures of interest, such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy may be very time-consuming, due to the heuristics of the mathematical algorithm and the large amount of input and output data involved: it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods impracticable. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm with the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort this migration technique requires. Furthermore, speedup and efficiency analyses were performed and, finally, the degree of algorithmic scalability was identified with respect to the technological advances expected in future processors.
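A sketch of the kind of kernel RTM iterates (a generic second-order 2D acoustic finite-difference step, not the dissertation's code), together with the speedup and efficiency metrics the work reports.

```python
import numpy as np

def fd_step(u, u_prev, v, dt, dx):
    """One explicit finite-difference step of the 2D acoustic wave
    equation: the wavefield propagation loop that dominates RTM and is
    the natural target for OpenMP parallelization."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] - 4.0 * u[1:-1, 1:-1])
    return 2.0 * u - u_prev + (v * dt / dx) ** 2 * lap

def speedup_efficiency(t_serial, t_parallel, n_threads):
    """Metrics analyzed in the work: speedup S = T1/Tp, efficiency E = S/p."""
    s = t_serial / t_parallel
    return s, s / n_threads
```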
Abstract:
The pregeniculate nucleus (PGN) of the primate thalamus is a cap-shaped neuronal agglomerate located dorsomedially to the main relay of visual information to the cerebral cortex, the dorsal lateral geniculate nucleus (GLD). Several cytoarchitectonic, neurochemical, and retinal-projection studies have pointed to the PGN as a structure homologous to the intergeniculate leaflet (IGL) of rodents. The IGL receives retinal terminals and appears to be involved in the integration of photic and non-photic information, relaying it through the geniculo-hypothalamic tract (TGH) to the main circadian oscillator in mammals, the suprachiasmatic nucleus (SCN) of the hypothalamus. Thus, the IGL participates in the control of biological rhythm by modulating the activity of the SCN. Pharmacological and IGL-lesion studies conclude that it is critical in the processing of the non-photic information transmitted to the SCN. Other studies have found that it is especially the neurons immunoreactive to neuropeptide Y (NPY) that respond to this type of stimulation, as determined by their colocalization with the FOS protein. It had not been determined whether the PGN responds to non-photic stimuli by expressing the FOS protein, nor what the neurochemical nature of these cells is. We therefore applied a dark pulse at specific circadian phases and analyzed the pattern of FOS expression in the PGN of the marmoset (Callithrix jacchus). We found that, in all animals analyzed, FOS expression was higher in the experimental group than in the control group. FOS expression was higher when the dark pulse was applied during the subjective day. Moreover, a subregion of the PGN known to be immunoreactive to NPY had a greater number of FOS-positive cells than its immediately adjacent dorsal region. Our data support the theory that the PGN and the IGL are homologous structures that were anatomically modified during evolution but kept their main neurochemical and functional characteristics. However, lesion and hodological studies are still needed for a more accurate conclusion.
Abstract:
The thalamus plays an important role in sensory information processing, in this particular case visual information. Several neuronal groups have been characterized as conductors and processors of important sensory information on its way to the cerebral cortex. The lateral geniculate complex is one of them and is much studied, since it is responsible for almost the totality of visual information processing. Among the nuclei that constitute the lateral geniculate complex we highlight the dorsal lateral geniculate nucleus of the thalamus (DLG), the main thalamic relay for visual information. In most mammals this nucleus is located rostral and lateral to the medial geniculate nucleus and ventral to the thalamic pulvinar nucleus. In human and non-human primates it appears, in coronal sections, as a laminated structure arranged in layers. The objective of this work was to map the retinal projections and to carry out a cytoarchitectonic and neurochemical characterization of the DLG in the marmoset (Callithrix jacchus), a New World primate. The retinal projections were traced by anterograde transport of the cholera toxin subunit b (CTb), the cytoarchitecture was described with the Nissl method, and for the neurochemical characterization immunohistochemical techniques were used to examine the main neurotransmitters and neuroactive substances present in this neural center. In the DLG of the marmoset thalamus, coronal sections stained by the Nissl method made it possible to visualize the division of this nucleus into four layers distributed across two portions: magnocellular and parvocellular. The retinal projections were present as fibers and terminals immunoreactive to CTb (CTb-IR) in the ipsilateral and contralateral DLG. The immunohistochemical techniques showed that the DLG contains cells, fibers, and/or terminals immunoreactive to neuronal nuclear protein, subunits of AMPA glutamate receptors (GluR1, GluR2/3, GluR4), choline acetyltransferase, serotonin, glutamic acid decarboxylase, calcium-binding proteins (calbindin, parvalbumin, and calretinin), vasopressin, vasoactive intestinal polypeptide, and an astrocyte protein, glial fibrillary acidic protein.
Abstract:
The next generation of computers is expected to feature architectures with multiple processors and/or multicore processors. In this direction, there are challenges related to interconnection, operating frequency, on-chip area, power dissipation, performance, and programmability. The interconnection and communication mechanism considered ideal for this type of architecture is the network-on-chip, owing to its scalability, reusability, and intrinsic parallelism. In networks-on-chip, communication is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packet transmission proceeds as in a pipeline between the routers of the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Based on this fact, this work proposes to transform the entire communication infrastructure of the network-on-chip, with its routing, arbitration, and storage mechanisms, into a high-performance parallel processing system. In this proposal, the packets are formed by the instructions and data that represent the applications, and they are executed by the routers as they are transmitted, exploiting the pipeline and the parallel communication transmissions. Traditional processors are not used; there are only simple cores that control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System); it has its own programming model and a routing algorithm that guarantees the execution of all the instructions in the packets while preventing deadlock, livelock, and starvation. The architecture provides mechanisms for input and output, interrupts, and operating system support. As a proof of concept, a programming environment and a SystemC simulator were developed for this architecture, allowing the configuration of various parameters and producing several results with which to evaluate it.
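A toy model of the packet-driven execution idea: a packet carries operands plus a list of instructions, and each router on the path consumes one instruction as it forwards the packet. The opcode names and packet layout below are illustrative inventions, not the real IPNoSys packet format.

```python
def route_and_execute(packet, path):
    """Each hop executes the next instruction carried by the packet;
    once the instruction list is empty, remaining hops just forward
    the finished result to the destination."""
    ops = {"ADD": lambda a, b: a + b, "MUL": lambda a, b: a * b}
    for router in path:
        if not packet["instructions"]:
            break
        opcode, operand = packet["instructions"].pop(0)
        packet["acc"] = ops[opcode](packet["acc"], operand)
        print(f"router {router}: executed {opcode}, acc={packet['acc']}")
    return packet["acc"]

# e.g. route_and_execute({"acc": 2, "instructions": [("ADD", 3), ("MUL", 4)]},
#                        path=["R0", "R1", "R2"])  # -> 20
```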
Abstract:
The degradation of natural resources is perhaps the main problem of the Brazilian semi-arid region, and this degradation mainly results from soil losses caused by the erosive process. To better understand this problem, environmental modeling has been employed, with the goal of identifying soil degradation and proposing solutions. In this context, this work applies the Universal Soil Loss Equation (EUPS/USLE) model, developed in the United States during the 1950s, combined with geoprocessing tools, remote sensing information, and Geographic Information Systems (GIS). The study area is the Riacho Passagem micro-basin, located in the western region of the State of Rio Grande do Norte; the micro-basin has an area of 221.7 km² and lies in the semi-arid zone of the Brazilian Northeast. The methodology consists of assembling the USLE variables in the GIS environment using satellite images, bibliographic surveys, and field work. To determine slope lengths, the RAMPA model was employed, and to adapt the USLE to the conditions of the study area, adjustments were made through statistical models, improving the work and the results generated by the model. At the end of the process, a pseudo-language was developed in the Spatial Language for Algebraic Geoprocessing (LEGAL), available in SPRING version 5.1.2, to support the processing of the information contained in the database underlying the USLE. The results show that it is first necessary to delimit the dry and rainy seasons precisely, information fundamental to the USLE, since the work seeks to identify soil loss by water erosion. The RAMPA model proved satisfactory, with high application potential for determining slope lengths from radar images. Regarding the behavior of slope lengths in the micro-basin, there was little variation; the longest slopes occur in the eastern portion, near the outlet. After applying the model, the maximum soil loss was 88 t/ha/year, with hot spots located on the NEOSSOLOS LITÓLICOS (Litholic Neosols), and the minimum was 0.01 t/ha/year, in the domain of the LATOSSOLOS (Latosols) and NEOSSOLOS FLÚVICOS (Fluvic Neosols). Erosion reduces the soil profile, especially in the Litholic Neosols, altering the water balance and consequently increasing soil temperature, which may trigger desertification. The results and methodology of this work can be applied in the pursuit of sustainable development in the Brazilian semi-arid region, helping to understand the interplay between land use and the carrying capacity of the natural environment.
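The USLE combines its factors multiplicatively, A = R · K · LS · C · P. A minimal per-pixel sketch follows; the example factor values are arbitrary, and the study's SPRING/LEGAL implementation is not reproduced.

```python
import numpy as np

def usle(R, K, LS, C, P):
    """Universal Soil Loss Equation (EUPS), evaluated per pixel:
    A = R * K * LS * C * P, with A in t/ha/year. In the study the
    factors are raster layers built in the GIS environment; here they
    are plain NumPy arrays (or scalars)."""
    return R * K * LS * C * P

# e.g. usle(R=5000.0, K=0.02, LS=1.8, C=0.25, P=1.0)  # -> 45.0 t/ha/year
```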
Abstract:
Embedded systems are widespread nowadays. An example is the Digital Signal Processor (DSP), a device with high processing power. This work's contribution consists of a DSP implementation of the logic of a system for detecting pipeline leaks in real time. Among the various leak-detection methods available today, this work uses a technique based on the analysis of pipe pressure, employing the wavelet transform and neural networks. In this context, besides performing the digital processing of the pressure signal, the DSP also communicates with a Global Positioning System (GPS), which helps locate the leak, and with a SCADA system, with which it shares information. To ensure robustness and reliability in the communication between the DSP and the SCADA system, the Modbus protocol is used. As this is a real-time application, special attention is given to the response time of each of the tasks performed by the DSP. Tests and leak simulations were performed using the facilities of the Laboratory for the Evaluation of Oil Measurement (LAMP) at the Federal University of Rio Grande do Norte (UFRN).
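A rough sketch of the signal-processing idea, using PyWavelets on a host machine rather than the DSP implementation; the energy threshold below is an assumed stand-in for the trained neural network, and the wavelet choice is illustrative.

```python
import numpy as np
import pywt

def leak_features(pressure, wavelet="db4", level=3):
    """Energy of each wavelet detail band of the pipeline pressure
    signal; a sudden leak shows up as a burst of energy at the fine
    scales. The real system feeds such features to a neural network."""
    coeffs = pywt.wavedec(pressure, wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs[1:]]  # skip approximation band

def leak_detected(pressure, threshold=1.0):
    """Illustrative decision rule standing in for the trained classifier."""
    return max(leak_features(pressure)) > threshold
```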
Abstract:
The objective is to establish a methodology for monitoring oil spills on the sea surface in the submerged exploration area of the Guamaré Pole region, in the State of Rio Grande do Norte, using orbital Synthetic Aperture Radar (SAR) images integrated with meteo-oceanographic products. The methodology was applied in the following stages: (1) creation of a base map of the exploration area; (2) processing of NOAA/AVHRR and ERS-2 images to generate meteo-oceanographic products; (3) processing of RADARSAT-1 images for oil spill monitoring; (4) integration of the RADARSAT-1 images with the NOAA/AVHRR and ERS-2 image products; and (5) structuring of a database. The integration of the RADARSAT-1 image of the Potiguar Basin of 21 May 1999 with the base map of the Guamaré exploration area, used to identify the probable sources of the oil slicks, succeeded in detecting a probable oil slick next to the outlet of the submarine outfall in the exploration area. To support the integration of the RADARSAT-1 images with the NOAA/AVHRR and ERS-2 image products, a methodology was developed for classifying the oil spills identified in RADARSAT-1 images. For this, the following unsupervised classification algorithms were tested: K-means, fuzzy K-means, and Isodata. These algorithms are part of the PCI Geomatics software, which was also used for filtering the RADARSAT-1 images. To validate the results, the oil spills submitted to unsupervised classification were compared with the results of the Semivariogram Textural Classifier (STC), a classifier developed especially for oil spill classification that requires the PCI software for the whole processing of RADARSAT-1 images. Finally, the classification results were analyzed through visual analysis, calculation of magnitude proportionality, and statistical analysis. Among the three classification algorithms tested, no significant differences were observed relative to the spills classified with the STC in any of the analyses considered. Therefore, considering all the procedures, the described methodology can be successfully applied using the unsupervised classifiers tested, reducing the time needed to identify and classify oil spills compared with using the STC classifier.
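The unsupervised step can be sketched as follows, using scikit-learn's KMeans in place of the PCI Geomatics implementation tested in the work; the two-class split and the dark-cluster heuristic are simplifying assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_slicks(sar_image, n_classes=2):
    """Unsupervised K-means split of a (speckle-filtered) SAR amplitude
    image; oil slicks damp capillary waves and appear as the darkest
    cluster, so labels are reordered by mean brightness."""
    pixels = sar_image.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=n_classes, n_init=10).fit_predict(pixels)
    # relabel so that class 0 is always the darkest (candidate oil) cluster
    order = np.argsort([pixels[labels == k].mean() for k in range(n_classes)])
    return np.argsort(order)[labels].reshape(sar_image.shape)
```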
Abstract:
Traditional processes for the treatment of hazardous waste are questionable, for they generate other wastes that adversely affect people's health. As an attempt to minimize these problems, a system was developed for the treatment of hazardous waste by thermal plasma, a more appropriate technology since it produces high temperatures, preventing the formation of pollutants toxic to human beings. The present work presents an automation solution for this plant. The system has local and remote monitoring resources to ensure the security of the operators as well as of the process itself. Special attention was given to the control of the temperature of the plant's main reactor, since it is where the main processing occurs and because it has a complex mathematical model; for this, cascaded controllers based on fuzzy logic were employed. A process computer with a dedicated man-machine interface (MMI) provides the operator with information about and control of the plant, including over the Internet. A compact PLC module is the central element of the automation management and plant control; it receives information from the sensors and sends it to the MMI.
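A minimal sketch of one fuzzy loop of the kind described, with triangular memberships and singleton outputs; the membership ranges and power levels are illustrative assumptions, not the plant's tuned values, and the cascade structure is omitted.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_power(error):
    """Fuzzy rule base for the reactor temperature loop, where error is
    setpoint minus measured temperature (K): negative error -> low torch
    power, near zero -> hold, positive -> high."""
    mu = {"neg": tri(error, -200, -100, 0),
          "zero": tri(error, -100, 0, 100),
          "pos": tri(error, 0, 100, 200)}
    power = {"neg": 20.0, "zero": 50.0, "pos": 90.0}  # % torch power (singletons)
    total = sum(mu.values()) or 1.0
    return sum(mu[k] * power[k] for k in mu) / total  # weighted defuzzification
```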