25 results for vectorization
Abstract:
This paper proposes a simulated-annealing-based method for identifying building roof contours in a LiDAR-derived digital elevation model. Our method is based on the concept of first extracting aboveground objects and then identifying which of those objects are building roof contours. First, to detect aboveground objects (buildings, trees, etc.), the digital elevation model is segmented through a recursive splitting technique followed by a region-merging process. Vectorization and polygonization are used to obtain polyline representations of the detected aboveground objects. Second, building roof contours are identified from among the aboveground objects by optimizing a Markov-random-field-based energy function that embodies roof contour attributes and spatial constraints. The solution of this function is a polygon set corresponding to the building roof contours, and it is found by a minimization technique such as the simulated annealing algorithm. Experiments carried out with a laser scanning digital elevation model showed that the methodology works properly, providing roof contour information with approximately 90% shape accuracy and no verified false positives.
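To make the optimization step concrete, here is a minimal sketch of binary labeling by simulated annealing, using a generic stand-in energy; the paper's actual MRF energy, neighborhood structure, and cooling schedule are not reproduced, so mrf_energy, the flip proposal, and all schedule parameters below are illustrative placeholders.

```cpp
#include <cmath>
#include <random>
#include <vector>

// Toy stand-in for the paper's MRF energy over candidate polygons:
// a unary data term per polygon plus a pairwise smoothness term.
// The real energy embodies roof contour attributes and spatial constraints.
double mrf_energy(const std::vector<int>& labels,
                  const std::vector<double>& unary) {
    double e = 0.0;
    for (size_t i = 0; i < labels.size(); ++i)
        if (labels[i]) e += unary[i];                    // data term
    for (size_t i = 0; i + 1 < labels.size(); ++i)
        if (labels[i] != labels[i + 1]) e += 0.5;        // smoothness term
    return e;
}

// Simulated annealing over binary labels (1 = building roof, 0 = other),
// with single-label flips, Metropolis acceptance, and geometric cooling.
std::vector<int> anneal(std::vector<int> labels,
                        const std::vector<double>& unary,
                        double t = 1.0, double t_min = 1e-4,
                        double alpha = 0.95, int sweeps = 200) {
    std::mt19937 rng{42};
    std::uniform_int_distribution<size_t> pick(0, labels.size() - 1);
    std::uniform_real_distribution<double> unif(0.0, 1.0);
    double e = mrf_energy(labels, unary);
    while (t > t_min) {
        for (int s = 0; s < sweeps; ++s) {
            size_t i = pick(rng);
            labels[i] ^= 1;                              // propose a flip
            double e_new = mrf_energy(labels, unary);
            if (e_new <= e || unif(rng) < std::exp((e - e_new) / t))
                e = e_new;                               // accept the move
            else
                labels[i] ^= 1;                          // reject: undo
        }
        t *= alpha;                                      // geometric cooling
    }
    return labels;
}
```

The Metropolis rule accepts every downhill move and an occasional uphill move with probability exp(-ΔE/T), which is what lets the search escape local minima of the energy before the temperature drops.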
Abstract:
Graduate Program in Agronomy (Plant Production) - FCAV
Abstract:
Graduate Program in Geography - IGCE
Abstract:
“Cartographic heritage” is different from “cartographic history”. The second term refers to the study of the development of surveying and drawing techniques related to maps through time, i.e. through the different types of cultural environment that formed the background for the creation of maps. The first term concerns the whole body of ancient maps, together with those cultural environments, which history has handed down to us and which we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer from preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve the map heritage. Moreover, modern geomatic techniques offer new ways of using historical information that would be unachievable on analog supports. This PhD thesis reports the whole digital workflow for the recovery and elaboration of ancient cartography, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage. The workflow can be divided into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points (a least-squares sketch of this step is given after the list of maps below); in this way it is possible to understand the projection features of the historical map, and to evaluate and represent the degree of deformation induced by the old cartographic transformation (which may be unknown to us), by surveying errors, or by support deformation, all errors usually far larger than current standards allow;
• data elaboration and management in a digital environment by means of modern software tools: vectorization, giving the map a new and more attractive graphic form (for instance, by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it into a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of them relevant to at least one of these digital elaboration steps.
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the XVI century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook explaining a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de’ Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic bird’s-eye view of the city, but it also has icnographic value, as the author himself declares;
• the map of Bologna by the periti Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years later (1711–1712) than the map by de’ Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it consists of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the Po delta maps were analyzed from the georeferencing point of view, and the Cadastre with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling. Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques then opens new research opportunities in a rich and modern multidisciplinary context.
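As a side note on the georeferencing step above: a common baseline for recovering a map-to-ground transformation from ground control points is a least-squares affine fit, sketched below. This is a generic illustration only; the thesis may well use higher-order polynomial or projective transformations, and Pt, fit_affine, and the normal-equation solver are names introduced here for the example.

```cpp
#include <array>
#include <vector>

struct Pt { double x, y; };

// Solve a 3x3 linear system A p = b by Cramer's rule.
static std::array<double, 3> solve3(const std::array<std::array<double, 3>, 3>& A,
                                    const std::array<double, 3>& b) {
    auto det = [](const std::array<std::array<double, 3>, 3>& M) {
        return M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
             - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
             + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]);
    };
    double d = det(A);
    std::array<double, 3> p{};
    for (int j = 0; j < 3; ++j) {
        auto Aj = A;
        for (int i = 0; i < 3; ++i) Aj[i][j] = b[i];
        p[j] = det(Aj) / d;
    }
    return p;
}

// Least-squares affine fit  x' = a x + b y + c,  y' = d x + e y + f
// from paired control points (pixel coordinates -> ground coordinates).
// The two rows of the affine matrix decouple into two 3-parameter fits;
// at least three non-collinear control points are required.
std::array<double, 6> fit_affine(const std::vector<Pt>& src,
                                 const std::vector<Pt>& dst) {
    std::array<std::array<double, 3>, 3> N{};   // normal matrix
    std::array<double, 3> bx{}, by{};
    for (size_t k = 0; k < src.size(); ++k) {
        double r[3] = {src[k].x, src[k].y, 1.0};
        for (int i = 0; i < 3; ++i) {
            for (int j = 0; j < 3; ++j) N[i][j] += r[i] * r[j];
            bx[i] += r[i] * dst[k].x;
            by[i] += r[i] * dst[k].y;
        }
    }
    auto p = solve3(N, bx), q = solve3(N, by);
    return {p[0], p[1], p[2], q[0], q[1], q[2]};
}
```

With many more control points than parameters, the residuals of this fit are exactly what reveals the deformation pattern of the old map relative to modern coordinates.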
Abstract:
In this thesis we present techniques that can be used to speed up the calculation of perturbative matrix elements for observables with many legs ($n = 3, 4, 5, 6, 7, \ldots$). We investigate several ways to achieve this, including the use of Monte Carlo methods, the leading-color approximation, numerically less precise but faster operations, and SSE vectorization. An important idea is the use of “random polarizations”, for which we derive subtraction terms for the real corrections in next-to-leading-order calculations. We demonstrate the effectiveness of all these methods in the context of electron-positron scattering to $n$ jets, with $n$ ranging from two to seven.
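To illustrate the SSE-vectorization idea (this is not the thesis code), the sketch below multiplies four single-precision complex numbers at once in a split real/imaginary layout, the kind of inner operation that dominates amplitude evaluation; using floats instead of doubles is exactly the "numerically less precise but faster" trade-off mentioned above.

```cpp
#include <xmmintrin.h>  // SSE intrinsics

// Multiply four complex numbers at once, stored in split (SoA) layout:
// (ar + i*ai) * (br + i*bi) = (ar*br - ai*bi) + i*(ar*bi + ai*br).
void cmul4(const float* ar, const float* ai,
           const float* br, const float* bi,
           float* cr, float* ci) {
    __m128 xr = _mm_loadu_ps(ar), xi = _mm_loadu_ps(ai);
    __m128 yr = _mm_loadu_ps(br), yi = _mm_loadu_ps(bi);
    _mm_storeu_ps(cr, _mm_sub_ps(_mm_mul_ps(xr, yr), _mm_mul_ps(xi, yi)));
    _mm_storeu_ps(ci, _mm_add_ps(_mm_mul_ps(xr, yi), _mm_mul_ps(xi, yr)));
}
```

In practice such four-wide lanes are typically filled with independent phase-space points or helicity configurations, so the vectorization runs across events rather than within a single amplitude.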
Abstract:
This paper describes the spatial data handling procedures used to create a vector database of the Connecticut shoreline from Coastal Survey Maps. The appendix contains detailed information on how the procedures were implemented using Geographic Transformer Software 5 and ArcGIS 8.3. The work was a joint project of the Connecticut Department of Environmental Protection and the University of Connecticut Center for Geographic Information and Analysis.
Abstract:
Vast portions of Arctic and sub-Arctic Siberia, Alaska and the Yukon Territory are covered by ice-rich silty to sandy deposits that contain large ice wedges, resulting from syngenetic sedimentation and freezing. Accompanied by wedge-ice growth in polygonal landscapes, the sedimentation process was driven by cold continental climatic and environmental conditions in unglaciated regions during the late Pleistocene, inducing the accumulation of the unique Yedoma deposits, up to >50 meters thick. Because of the fast incorporation of organic material into syngenetic permafrost during its formation, Yedoma deposits include well-preserved organic matter. Ice-rich deposits like Yedoma are especially prone to degradation triggered by climate change or human activity. When Yedoma deposits degrade, large amounts of sequestered organic carbon as well as other nutrients are released and become part of active biogeochemical cycling. This could be of global significance for future climate warming, as increased permafrost thaw is likely to lead to a positive feedback through enhanced greenhouse-gas fluxes. Therefore, a detailed assessment of the current Yedoma deposit coverage and volume is important for estimating its potential response to future climate changes. We synthesized a map of Yedoma coverage together with thickness estimates, providing critical data needed for further research. In particular, this preliminary Yedoma map is a great step forward in understanding the spatial heterogeneity of Yedoma deposits and their regional coverage. Further applications lie in reconstructing paleo-environmental dynamics and past ecosystems such as the mammoth-steppe-tundra, and in mapping ground-ice distribution, including future thermokarst vulnerability. Moreover, the map will be a crucial improvement of the data basis needed to refine the present-day Yedoma permafrost organic carbon inventory, which is assumed to be between 83±12 (Strauss et al., 2013, doi:10.1002/2013GL058088) and 129±30 (Walter Anthony et al., 2014, doi:10.1038/nature13560) gigatonnes (Gt) of organic carbon in perennially frozen archives. Hence, here we synthesize data on the circum-Arctic and sub-Arctic distribution and thickness of Yedoma to compile a preliminary circum-polar Yedoma map. For compiling this map, we used (1) maps of previous Yedoma coverage estimates, (2) the digitized areas from Grosse et al. (2013), and (3) areas of potential Yedoma distribution extracted from additional surface geological and Quaternary geological maps (1.: 1:500,000: Q-51-V,G; P-51-A,B; P-52-A,B; Q-52-V,G; P-52-V,G; Q-51-A,B; R-51-V,G; R-52-V,G; R-52-A,B; 2.: 1:1,000,000: P-50-51; P-52-53; P-58-59; Q-42-43; Q-44-45; Q-50-51; Q-52-53; Q-54-55; Q-56-57; Q-58-59; Q-60-1; R-(40)-42; R-43-(45); R-(45)-47; R-48-(50); R-51; R-53-(55); R-(55)-57; R-58-(60); S-44-46; S-47-49; S-50-52; S-53-55; 3.: 1:2,500,000: Quaternary map of the territory of the Russian Federation; 4.: Alaska Permafrost Map). The digitization was done using GIS techniques (ArcGIS) and vectorization of raster images (Adobe Photoshop and Illustrator). Data on Yedoma thickness were obtained from boreholes and exposures reported in the scientific literature. The map and database are still preliminary and will have to undergo a technical and scientific vetting and review process.
For the Yedoma area polygons, we included a range of attributes based on lithological and stratigraphical information from the original source maps, as well as a confidence level for our classification of an area as Yedoma (three levels: confirmed, likely, or uncertain). In its current version, our database includes more than 365 boreholes and exposures and more than 2000 digitized Yedoma areas, and we expect it to continue to grow. At this preliminary stage, we estimate the Northern Hemisphere Yedoma deposit area to cover approximately 625,000 km². We estimate that 53% of the total Yedoma area today is located in the tundra zone and 47% in the taiga zone. From west to east, 29% of the Yedoma area is found in North America and 71% in North Asia; the latter comprises 9% in West Siberia, 11% in Central Siberia, 44% in East Siberia and 7% in Far East Russia. Adding the recent maximum Yedoma region (including all Yedoma uplands, thermokarst lakes and basins, and river valleys) of 1.4 million km² (Strauss et al., 2013, doi:10.1002/2013GL058088) and postulating that Yedoma occupied up to 80% of the adjacent formerly exposed and now flooded Beringia shelves (1.9 million km², down to 125 m below modern sea level, between 105°E - 128°W and >68°N), we assume that the Last Glacial Maximum Yedoma region likely covered more than 3 million km² of Beringia. Acknowledgements: This project is part of the Action Group "The Yedoma Region: A Synthesis of Circum-Arctic Distribution and Thickness" (funded by the International Permafrost Association (IPA) to J. Strauss) and is embedded in the Permafrost Carbon Network (working group Yedoma Carbon Stocks). We acknowledge support from the European Research Council (Starting Grant #338335), the German Federal Ministry of Education and Research (Grant 01DM12011 and "CarboPerm" (03G0836A)), the Initiative and Networking Fund of the Helmholtz Association (#ERC-0013) and the German Federal Environment Agency (UBA, project UFOPLAN FKZ 3712 41 106).
Abstract:
The deployment of nodes in Wireless Sensor Networks (WSNs) is one of the biggest challenges in this field, as it involves distributing a large number of embedded systems to fulfill a specific application. The connectivity of WSNs is difficult to estimate due to the irregularity of the physical environment, and this affects WSN designers' decisions on deploying sensor nodes. Therefore, in this paper, a new method is proposed to enhance the efficiency and accuracy of ZigBee propagation simulation in indoor environments. The method consists of two steps: automatic 3D indoor reconstruction and 3D ray-tracing-based radio simulation. The automatic 3D indoor reconstruction employs an unattended image classification algorithm and an image vectorization algorithm to build the environment database accurately, which also significantly reduces the time and effort spent on non-radio-propagation issues. The 3D ray tracing is built on a kd-tree space-division algorithm and a modified polar sweep algorithm, which accelerate the search for rays over the entire space. A signal propagation model is proposed for the ray tracing engine that considers both the materials of obstacles and their positions along the radio ray path. Three different WSN deployments were realized in an indoor office environment and the results were verified to be accurate. Experimental results also indicate that the proposed method is efficient in its pre-simulation strategy and 3D ray-searching scheme and is suitable for different indoor environments.
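A minimal sketch of the kd-tree space division used to accelerate ray searching, under generic assumptions (axis-aligned split planes, recursive near/far traversal); the paper's modified polar sweep algorithm and its signal propagation model are not reproduced, and KdNode, Ray, and the visit callback are names introduced for this example.

```cpp
#include <memory>
#include <utility>
#include <vector>

struct Ray { double org[3], dir[3]; };   // dir components assumed non-zero

struct KdNode {
    int axis = -1;                        // 0/1/2 split axis, -1 for a leaf
    double split = 0.0;                   // split plane position
    std::unique_ptr<KdNode> below, above; // children on each side of the plane
    std::vector<int> faces;               // obstacle faces stored in a leaf
};

// Recursive kd-tree traversal over the parametric ray interval [tmin, tmax].
// visit() would test the ray against the leaf's obstacle faces; it is left
// as a callback so the sketch stays independent of the geometry format.
template <typename Visit>
void traverse(const KdNode* n, const Ray& r, double tmin, double tmax,
              Visit&& visit) {
    if (!n || tmin > tmax) return;
    if (n->axis < 0) { visit(n->faces, tmin, tmax); return; }
    // Parametric distance at which the ray crosses the split plane.
    double t = (n->split - r.org[n->axis]) / r.dir[n->axis];
    const KdNode* first = n->below.get();   // child containing the origin
    const KdNode* second = n->above.get();
    if (r.org[n->axis] > n->split) std::swap(first, second);
    if (t >= tmax || t < 0)        // ray stays on one side of the plane
        traverse(first, r, tmin, tmax, visit);
    else if (t <= tmin)            // crossing lies before the interval
        traverse(second, r, tmin, tmax, visit);
    else {                         // ray crosses: near side first, then far
        traverse(first, r, tmin, t, visit);
        traverse(second, r, t, tmax, visit);
    }
}
```

Because each recursion narrows the interval [tmin, tmax], a ray only tests the obstacle faces in the leaves it actually passes through, which is what makes tracing many rays through a whole building interior tractable.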
Abstract:
This paper focuses on the parallelization of an ocean model, applying current multicore-processor-based cluster architectures to an irregular computational mesh. The aim is to maximize the efficiency of the computational resources used. To make the best use of the resources offered by these architectures, the parallelization has been addressed at all the hardware levels of modern supercomputers: first, exploiting the internal parallelism of the CPU through vectorization; second, taking advantage of the multiple cores of each node using OpenMP; and finally, using the cluster nodes to distribute the computational mesh, with MPI for communication among the nodes. The speedup obtained with each parallelization technique, as well as the combined overall speedup, has been measured for the western Mediterranean Sea for different cluster configurations, achieving a speedup factor of 73.3 using 256 processors. The results also show the efficiency achieved on the different cluster nodes and the advantages of combining OpenMP and MPI versus using only OpenMP or MPI. Finally, the scalability of the model has been analysed by examining computation and communication times, as well as the communication and synchronization overhead due to parallelization.
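A minimal sketch of this three-level layering on a toy regular 1-D array (the model's irregular mesh and numerics are not reproduced): MPI distributes blocks across cluster nodes and exchanges halo cells, OpenMP splits each block across the cores of a node, and a SIMD pragma vectorizes the innermost loop.

```cpp
#include <mpi.h>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    // Level 3: MPI -- each rank owns one block of the (toy, regular) mesh,
    // with one halo cell on each side exchanged with its neighbors.
    const int n_local = 1 << 20;
    std::vector<double> u(n_local + 2, 1.0), unew(n_local + 2, 0.0);
    int left  = (rank > 0)          ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < nprocs - 1) ? rank + 1 : MPI_PROC_NULL;
    MPI_Sendrecv(&u[1], 1, MPI_DOUBLE, left, 0,
                 &u[n_local + 1], 1, MPI_DOUBLE, right, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
    MPI_Sendrecv(&u[n_local], 1, MPI_DOUBLE, right, 1,
                 &u[0], 1, MPI_DOUBLE, left, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    // Level 2: OpenMP threads across the cores of the node.
    // Level 1: SIMD vectorization of the innermost loop.
    #pragma omp parallel for simd
    for (int i = 1; i <= n_local; ++i)
        unew[i] = 0.5 * (u[i - 1] + u[i + 1]);   // toy stencil update

    MPI_Finalize();
    return 0;
}
```

The combined speedup the abstract reports comes from exactly this layering: inter-node scaling from MPI, intra-node scaling from OpenMP, and per-core throughput from vectorization.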
Abstract:
Gene therapy is based on the vectorization of genes to target cells and their subsequent expression. Cationic amphiphile-mediated delivery of plasmid DNA is the nonviral gene transfer method most often used. We examined the supramolecular structure of lipopolyamine/plasmid DNA complexes under various condensing conditions. Plasmid DNA complexation with lipopolyamine micelles whose mean diameter was 5 nm revealed three domains, depending on the lipopolyamine/plasmid DNA ratio. These domains respectively corresponded to negatively, neutrally, and positively charged complexes. Transmission electron microscopy and x-ray scattering experiments on complexes originating from these three domains showed that although their morphology depends on the lipopolyamine/plasmid DNA ratio, their particle structure consists of ordered domains characterized by even spacing of 80 Å, irrespective of the lipid/DNA ratio. The most active lipopolyamine/DNA complexes for gene transfer were positively charged. They were characterized by fully condensed DNA inside spherical particles (diameter: 50 nm) sandwiched between lipid bilayers. These results show that supercoiled plasmid DNA is able to transform lipopolyamine micelles into a supramolecular organization characterized by ordered lamellar domains.