187 results for POLYGONS
Abstract:
1. The UK Biodiversity Action Plan (UKBAP) identifies invertebrate species in danger of national extinction. For many of these species, targets for recovery specify the number of populations that should exist by a specific future date but offer no procedure to plan strategically to achieve the target for any species. 2. Here we describe techniques based upon geographic information systems (GIS) that produce conservation strategy maps (CSM) to assist with achieving recovery targets based on all available and relevant information. 3. The heath fritillary Mellicta athalia is a UKBAP species used here to illustrate the use of CSM. A phase 1 habitat survey was used to identify habitat polygons across the county of Kent, UK. These were systematically filtered using relevant habitat, botanical and autecological data to identify seven types of polygon, including those with extant colonies or in the vicinity of extant colonies, areas managed for conservation but without colonies, and polygons that had the appropriate habitat structure and may therefore be suitable for reintroduction. 4. Five clusters of polygons of interest were found across the study area. The CSM of two of them are illustrated here: the Blean Wood complex, which contains the existing colonies of heath fritillary in Kent, and the Orlestone Forest complex, which offers opportunities for reintroduction. 5. Synthesis and applications. Although the CSM concept is illustrated here for the UK, we suggest that CSM could be part of species conservation programmes throughout the world. CSM are dynamic and should be stored in electronic format, preferably on the world-wide web, so that they can be easily viewed and updated. CSM can be used to illustrate opportunities and to develop strategies with scientists and non-scientists, enabling the engagement of all communities in a conservation programme. CSM for different years can be presented to illustrate the progress of a plan or to provide continuous feedback on how a field scenario develops.
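As an illustration of the kind of attribute-based filtering that turns habitat polygons into CSM categories, the minimal Python sketch below classifies a few mock polygon records. The attribute names, thresholds, and category labels are hypothetical placeholders, not the criteria applied in the study.

```python
# Hypothetical sketch of filtering phase 1 habitat polygons into simplified
# CSM categories. Attribute names and thresholds are placeholders, not the
# botanical/autecological criteria actually used in the paper.

POLYGONS = [
    {"id": 1, "habitat": "broadleaved woodland", "foodplant_present": True,
     "managed": True, "has_colony": True, "dist_to_colony_m": 0},
    {"id": 2, "habitat": "broadleaved woodland", "foodplant_present": True,
     "managed": True, "has_colony": False, "dist_to_colony_m": 450},
    {"id": 3, "habitat": "improved grassland", "foodplant_present": False,
     "managed": False, "has_colony": False, "dist_to_colony_m": 12000},
]

def classify(p):
    """Assign a habitat polygon to a (simplified) CSM category."""
    if p["has_colony"]:
        return "extant colony"
    if p["dist_to_colony_m"] <= 1000 and p["foodplant_present"]:
        return "vicinity of extant colony"
    if p["managed"] and p["foodplant_present"]:
        return "managed for conservation, candidate for reintroduction"
    if p["foodplant_present"]:
        return "suitable habitat structure"
    return "not of interest"

for p in POLYGONS:
    print(p["id"], classify(p))
```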
Abstract:
We bridge the properties of the regular triangular, square, and hexagonal honeycomb Voronoi tessellations of the plane to the Poisson-Voronoi case, thus analyzing in a common framework symmetry breaking processes and the approach to uniform random distributions of tessellation-generating points. We resort to ensemble simulations of tessellations generated by points whose regular positions are perturbed through a Gaussian noise, whose variance is given by the parameter α² times the square of the inverse of the average density of points. We analyze the number of sides, the area, and the perimeter of the Voronoi cells. For all values α > 0, hexagons constitute the most common class of cells, and 2-parameter gamma distributions provide an efficient description of the statistical properties of the analyzed geometrical characteristics. The introduction of noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous in α = 0. On the contrary, the honeycomb hexagonal tessellation is topologically stable and, experimentally, all Voronoi cells are hexagonal for small but finite noise with α < 0.12. For all tessellations and for small values of α, we observe a linear dependence on α of the ensemble mean of the standard deviation of the area and perimeter of the cells. Already for a moderate amount of Gaussian noise (α > 0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations are indistinguishable. When α > 2, results converge to those of Poisson-Voronoi tessellations. The geometrical properties of n-sided cells change with α until the Poisson-Voronoi limit is reached for α > 2; in this limit the Desch law for perimeters is shown not to be valid and a square-root dependence on n is established. This law allows for an easy link to the Lewis law for areas and agrees with exact asymptotic results. Finally, for α > 1, the ensemble mean of the cell area and perimeter restricted to the hexagonal cells agrees remarkably well with the full ensemble mean; this reinforces the idea that hexagons, beyond their ubiquitous numerical prominence, can be interpreted as typical polygons in 2D Voronoi tessellations.
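A minimal sketch of the perturbed-lattice experiment is given below, assuming that the noise standard deviation scales as α divided by the square root of the point density (one reading of the α normalization above). It perturbs a regular square lattice, builds the Voronoi diagram, and collects the side count and area of the bounded cells.

```python
# Minimal sketch of the perturbed-lattice experiment: perturb a square lattice
# with Gaussian noise of standard deviation sigma = alpha / sqrt(rho) (an
# assumed reading of the alpha normalization), then measure side counts and
# areas of the bounded Voronoi cells.
import numpy as np
from scipy.spatial import Voronoi

def perturbed_lattice_stats(n=40, alpha=0.5, seed=0):
    rng = np.random.default_rng(seed)
    xs, ys = np.meshgrid(np.arange(n), np.arange(n))
    pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    rho = 1.0                                  # one point per unit lattice cell
    pts += rng.normal(scale=alpha / np.sqrt(rho), size=pts.shape)
    vor = Voronoi(pts)
    sides, areas = [], []
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if not region or -1 in region:         # skip unbounded boundary cells
            continue
        poly = vor.vertices[region]
        x, y = poly[:, 0], poly[:, 1]
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        sides.append(len(region))
        areas.append(area)
    return np.array(sides), np.array(areas)

sides, areas = perturbed_lattice_stats(alpha=0.5)
print("mean number of sides:", sides.mean(), "std of cell area:", areas.std())
```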
Abstract:
In this paper we propose and analyse a hybrid numerical-asymptotic boundary element method for the solution of problems of high frequency acoustic scattering by a class of sound-soft nonconvex polygons. The approximation space is enriched with carefully chosen oscillatory basis functions; these are selected via a study of the high frequency asymptotic behaviour of the solution. We demonstrate via a rigorous error analysis, supported by numerical examples, that to achieve any desired accuracy it is sufficient for the number of degrees of freedom to grow only in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods. This appears to be the first such numerical analysis result for any problem of scattering by a nonconvex obstacle. Our analysis is based on new frequency-explicit bounds on the normal derivative of the solution on the boundary and on its analytic continuation into the complex plane.
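For orientation, the generic form of a hybrid numerical-asymptotic ansatz of this kind can be sketched as follows; the concrete leading-order term and phase functions for the sound-soft nonconvex polygon are those derived in the paper's asymptotic analysis and are not reproduced here.

```latex
% Generic sketch of a hybrid numerical-asymptotic ansatz for the boundary
% unknown; the specific \Psi and \psi_m used for the nonconvex polygon come
% from the asymptotic analysis in the paper itself.
\frac{\partial u}{\partial n}(\mathbf{x}) \;\approx\;
  \Psi(\mathbf{x}) \;+\; \sum_{m} V_m(\mathbf{x},k)\,
  e^{\mathrm{i} k \psi_m(\mathbf{x})}, \qquad \mathbf{x}\in\Gamma ,
```

where Ψ is a known leading-order contribution, the phases ψ_m carry the oscillation, and the slowly varying amplitudes V_m are approximated by piecewise polynomials; this separation is what allows the number of degrees of freedom to grow only logarithmically with the frequency.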
Abstract:
A previously proposed model describing the trapping site of interstitial atomic hydrogen in borate glasses is analyzed. In this model the atomic hydrogen is stabilized by van der Waals forces at the centers of oxygen polygons belonging to B-O ring structures in the glass network. The previously reported atomic hydrogen isothermal decay experimental data are discussed in the light of this microscopic model. A coupled system of differential equations for the observed decay kinetics was solved numerically using the Runge-Kutta method. The experimental untrapping activation energy of 0.7 × 10⁻¹⁹ J is in good agreement with the calculated results of the dispersion interaction between the stabilized atomic hydrogen and the neighboring oxygen atoms at the vertices of hexagonal ring structures.
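A hedged sketch of integrating such a coupled decay-kinetics system with a Runge-Kutta scheme is shown below. The two-species rate equations, attempt frequency, recombination constant, and temperature are illustrative placeholders rather than the model fitted in the paper; only the activation energy is taken from the value quoted above.

```python
# Hedged sketch: a placeholder two-species decay-kinetics system integrated
# with a Runge-Kutta scheme. Rate equations and parameters are illustrative,
# not those fitted in the paper; only E_A matches the value quoted above.
import numpy as np
from scipy.integrate import solve_ivp

K_B = 1.380649e-23      # Boltzmann constant, J/K
E_A = 0.7e-19           # untrapping activation energy quoted above, J

def rates(t, y, T=300.0, nu=1e12, k_rec=1e-3):
    """y[0]: trapped atomic H, y[1]: mobile atomic H (illustrative model)."""
    h_trap, h_mob = y
    k_untrap = nu * np.exp(-E_A / (K_B * T))   # Arrhenius untrapping rate
    return [-k_untrap * h_trap,
            k_untrap * h_trap - k_rec * h_mob**2]

sol = solve_ivp(rates, (0.0, 1e4), [1.0, 0.0], method="RK45", rtol=1e-8)
print("trapped fraction at t = 1e4 s:", sol.y[0, -1])
```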
Abstract:
The Patino Formation sandstones, which crop out in the Aregua neighborhood in Eastern Paraguay and show columnar joints near the contact zone with a nephelinite dyke, are characterized mainly by a high proportion of syntaxial quartz overgrowth and by a porosity originated by different processes, initially dissolution and later partial filling and fracturing. Features such as the presence of floating grains in the syntaxial cement, the transitional interpenetrative contact between the silica-rich cement and the grains, and the intense fracture porosity are strong indications that the cement was formed by dissolution and reprecipitation of quartz from the framework under the effect of thermal expansion followed by rapid contraction. The increase of the silica-rich cement towards the dyke, in association with the orthogonal disposition of the columns relative to the dyke walls, indicates that the igneous body may represent the main heat source for the interstitial aqueous solutions previously existing in the sediments. At the macroscopic scale, the increase of internal stresses in the sandstones is responsible for the nucleation of polygons, leading to the individualization of prisms interconnected by a system of joints, formed first on low-temperature isotherm surfaces and later on successive adjacent planes towards the dyke heat source.
Abstract:
The objective of the present work was to develop a study of the writing and algebraic manipulation of symbolic expressions for the perimeter and area of some convex polygons, addressing the properties of the operations and of equality, and extending to the derivation of the formulas for the circumference and area of the circle, the latter starting from the formulas for the perimeter and area of the regular hexagon. To this end, a module of teaching activities was elaborated based on constructivist teaching. The study consisted of a methodological intervention, carried out by the researcher, whose subjects were students of the 8th grade of the state school Desembargador Floriano Cavalcanti, located in the city of Natal, Rio Grande do Norte. The methodological intervention was conducted in three stages: application of an initial diagnostic evaluation, development of the teaching module, and application of a final evaluation based on Mathematics teaching with constructivist references. The data collected in the evaluations were presented as descriptive statistics. The results of the final diagnostic evaluation were analyzed from a qualitative point of view, using the criteria established by Richard Skemp's second theory about the comprehension of mathematical concepts. The overall results from the evaluation data and the application of the teaching module showed a qualitative difference in the learning of the students who participated in the intervention.
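As an illustration of the passage from the regular hexagon to the circle that the module aims at (not necessarily the derivation used in the teaching activities), the perimeter and area of a regular n-gon inscribed in a circle of radius r tend to the corresponding circle formulas:

```latex
% Illustrative derivation (not necessarily the module's own): perimeter and
% area of a regular n-gon inscribed in a circle of radius r, and their limits.
P_n = 2nr\sin\!\left(\frac{\pi}{n}\right), \qquad
A_n = \frac{1}{2}\,n r^2 \sin\!\left(\frac{2\pi}{n}\right),
\qquad\text{so}\qquad
\lim_{n\to\infty} P_n = 2\pi r, \quad \lim_{n\to\infty} A_n = \pi r^2 .
% For the regular hexagon (n = 6): P_6 = 6r and A_6 = \tfrac{3\sqrt{3}}{2} r^2.
```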
Abstract:
We revisit the visibility problem, which is to determine the set of primitives potentially visible in a set of geometry data represented by a data structure, such as a mesh of polygons or triangles, and we propose a solution for speeding up three-dimensional visualization processing in applications. We introduce a lean structure, in the sense of data abstraction and reduction, which can be used for online and interactive applications. The visibility problem is especially important in the 3D visualization of scenes represented by large volumes of data, when it is not worthwhile, or even possible, to keep all polygons of the scene in memory, since doing so implies greater time spent in rendering. In these cases, given a viewing position and direction, the main objective is to determine and load a minimum amount of primitives (polygons) of the scene, in order to accelerate the rendering step. For this purpose, our algorithm culls primitives using a hybrid paradigm based on three known techniques: the scene is divided into a grid of cells, the primitives that belong to each cell are associated with it, and finally the set of potentially visible primitives is determined. The novelty is the use of the Ja1 triangulation to create the subdivision grid. We chose this structure because of its relevant characteristics of adaptivity and ease of calculation. The results show a substantial improvement over the traditional methods when these are applied separately. The method introduced in this work can be used on devices with little or no dedicated processing power, and can also be used to view data via the Internet, in applications such as virtual museums.
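A simplified sketch of this grid-based culling pipeline is given below. A uniform cell grid stands in for the Ja1-triangulation subdivision actually proposed, and the view-cone test, cell size, and sample data are illustrative placeholders.

```python
# Simplified sketch of grid-based culling: bin primitives into a uniform cell
# grid (a stand-in for the Ja1-triangulation subdivision of the paper) and
# keep only the cells whose centres fall inside a view cone.
import math
from collections import defaultdict

def build_grid(triangles, cell_size):
    """Bin each triangle (three (x, y) vertices) by its centroid."""
    grid = defaultdict(list)
    for tri in triangles:
        cx = sum(v[0] for v in tri) / 3.0
        cy = sum(v[1] for v in tri) / 3.0
        grid[(int(cx // cell_size), int(cy // cell_size))].append(tri)
    return grid

def potentially_visible(grid, cell_size, eye, view_dir, half_angle_deg=45.0):
    """Return primitives whose cell centre lies inside the view cone."""
    cos_limit = math.cos(math.radians(half_angle_deg))
    norm = math.hypot(view_dir[0], view_dir[1])
    dx, dy = view_dir[0] / norm, view_dir[1] / norm
    visible = []
    for (i, j), prims in grid.items():
        cx, cy = (i + 0.5) * cell_size, (j + 0.5) * cell_size
        vx, vy = cx - eye[0], cy - eye[1]
        dist = math.hypot(vx, vy) or 1e-9
        if (vx * dx + vy * dy) / dist >= cos_limit:
            visible.extend(prims)
    return visible

tris = [[(0, 0), (1, 0), (0, 1)], [(50, 50), (51, 50), (50, 51)]]
grid = build_grid(tris, cell_size=10.0)
print(len(potentially_visible(grid, 10.0, eye=(0, 0), view_dir=(1, 1))))
```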
Abstract:
The present work was carried out with the objective of studying the collection and disposal of aquatic plants at different sites and infestation levels of the Tietê/Paraná system, in the Jupiá reservoir. The operation was performed with the aid of instrumentation installed on an aquatic plant harvester, with a GPS system equipped with a differential correction signal. The time spent loading and unloading the harvester was determined by timing, and the distance from the final collection point to the disposal point and the travel time were determined by timing and the use of conventional GPS. In some collections, polygons were demarcated and the operator was instructed to work exclusively within the corresponding area. The interpretation of the results made it possible to determine the share of collection time in the total operation time, indicating a significant value from an operational point of view (>70%). Considering disposal in areas infested with cattail (taboa), the average total displacement was only 383 m, with an average time of 200.96 s. The operational capacity of the harvester ranged between 0.23 and 1.60 ha h-1, indicating an average value of 4.48 ha day-1. The main limitation to operational capacity was associated with the average travel speed, aggravated in areas with high infestation levels or deep water. Regarding the movement of the harvester, there was great difficulty of orientation under normal operating conditions, making it impossible to maintain uniform spacing between collection strips and leading to overlapping passes. It is concluded that the operational evaluation indicated that it is not feasible to operate the harvester without the aid of a navigation system to guide its movement within the control areas.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper proposes a methodology for automatic extraction of building roof contours from a Digital Elevation Model (DEM), which is generated through the regularization of an available laser point cloud. The methodology is based on two steps. First, in order to detect high objects (buildings, trees etc.), the DEM is segmented through a recursive splitting technique and a Bayesian merging technique. The recursive splitting technique uses the quadtree structure for subdividing the DEM into homogeneous regions. In order to minimize the fragmentation, which is commonly observed in the results of recursive splitting segmentation, a region merging technique based on the Bayesian framework is applied to the previously segmented data. The high-object polygons are extracted by using vectorization and polygonization techniques. Second, the building roof contours are identified among all high objects extracted previously. Taking into account some roof properties and some feature measurements (e.g., area, rectangularity, and angles between principal axes of the roofs), an energy function was developed based on the Markov Random Field (MRF) model. The solution of this function is a polygon set corresponding to building roof contours and is found by using a minimization technique, such as the Simulated Annealing (SA) algorithm. Experiments carried out with laser scanning DEMs showed that the methodology works properly, as it delivered roof contours with approximately 90% shape accuracy and no false positives were verified.
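The recursive splitting step can be illustrated by the short sketch below, which uses a plain variance test as the homogeneity criterion; the actual criterion, the Bayesian merging stage, and the MRF/simulated-annealing identification of roofs are not reproduced here.

```python
# Sketch of recursive quadtree splitting of a DEM into homogeneous blocks,
# using a simple variance threshold as a stand-in homogeneity criterion.
import numpy as np

def quadtree_split(dem, r0, c0, rows, cols, var_thresh, min_size, leaves):
    """Recursively split a DEM block until it is homogeneous or too small."""
    block = dem[r0:r0 + rows, c0:c0 + cols]
    if rows <= min_size or cols <= min_size or block.var() <= var_thresh:
        leaves.append((r0, c0, rows, cols))
        return
    hr, hc = rows // 2, cols // 2
    quadtree_split(dem, r0,      c0,      hr,        hc,        var_thresh, min_size, leaves)
    quadtree_split(dem, r0,      c0 + hc, hr,        cols - hc, var_thresh, min_size, leaves)
    quadtree_split(dem, r0 + hr, c0,      rows - hr, hc,        var_thresh, min_size, leaves)
    quadtree_split(dem, r0 + hr, c0 + hc, rows - hr, cols - hc, var_thresh, min_size, leaves)

rng = np.random.default_rng(1)
dem = rng.normal(100.0, 0.2, size=(64, 64))   # flat terrain with slight noise
dem[20:40, 20:40] += 8.0                      # a "building" block above ground
leaves = []
quadtree_split(dem, 0, 0, 64, 64, var_thresh=0.5, min_size=4, leaves=leaves)
print(len(leaves), "homogeneous leaf regions")
```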
Abstract:
This research presents a methodology for the prediction of building shadows cast on urban roads in high-resolution aerial imagery. Shadow elements can be used in the modeling of contextual information, whose use has become more and more common in complex image analysis processes. The proposed methodology consists of three sequential steps. First, the building roof contours are manually extracted from an intensity image generated by the transformation of a digital elevation model (DEM) obtained from airborne laser scanning data. Similarly, the roadside contours are extracted, in this case from the radiometric information of the laser scanning data. Second, the roof contour polygons are projected onto the adjacent roads by using parallel projection straight lines, whose directions are computed from the solar ephemeris, which depends on the aerial image acquisition time. Finally, the parts of the shadow polygons that are free from building perspective obstructions are determined, giving rise to new shadow polygons. The results obtained in the experimental evaluation of the methodology showed that the method works properly, since it allowed the prediction of shadows in high-resolution imagery with high accuracy and reliability.
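The core projection step can be sketched as follows: each roof vertex is shifted horizontally by its height divided by the tangent of the solar elevation, in the direction opposite the solar azimuth. The sun angles, coordinate convention, and sample roof below are illustrative; in the methodology they come from the solar ephemeris at the image acquisition time, and the subsequent removal of parts obstructed by the building itself is not reproduced.

```python
# Minimal sketch of projecting a roof polygon onto the ground plane along the
# solar direction. Sun angles and the sample roof are placeholders.
import math

def shadow_polygon(roof_xyh, sun_azimuth_deg, sun_elevation_deg):
    """roof_xyh: list of (x, y, height) roof vertices, heights above the road.

    Each vertex is shifted horizontally by height / tan(elevation) in the
    direction opposite the sun azimuth (azimuth measured clockwise from
    north, i.e. from the +y axis)."""
    az = math.radians(sun_azimuth_deg)
    shift = 1.0 / math.tan(math.radians(sun_elevation_deg))
    dx, dy = -math.sin(az), -math.cos(az)      # unit vector away from the sun
    return [(x + h * shift * dx, y + h * shift * dy) for x, y, h in roof_xyh]

roof = [(0.0, 0.0, 10.0), (8.0, 0.0, 10.0), (8.0, 5.0, 10.0), (0.0, 5.0, 10.0)]
print(shadow_polygon(roof, sun_azimuth_deg=315.0, sun_elevation_deg=50.0))
```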
Abstract:
In this paper, a methodology is proposed for the geometric refinement of laser scanning building roof contours using high-resolution aerial images and Markov Random Field (MRF) models. The proposed methodology assumes that the 3D description of each building roof reconstructed from the laser scanning data (i.e., a polyhedron) is topologically correct and that it is only necessary to improve its accuracy. Since roof ridges are accurately extracted from laser scanning data, our main objective is to use high-resolution aerial images to improve the accuracy of the roof outlines. In order to meet this goal, the available roof contours are first projected onto the image space. After that, the projected polygons and the straight lines extracted from the image are used to establish an MRF description, which is based on relations (relative length, proximity, and orientation) between the two sets of straight lines. The energy function associated with the MRF is minimized by using a modified version of the brute force algorithm, resulting in a grouping of straight lines for each roof object. Finally, each grouping of straight lines is topologically reconstructed based on the topology of the corresponding laser scanning polygon projected onto the image space. The preliminary results showed that the proposed methodology is promising, since most sides of the refined polygons are geometrically better than the corresponding projected laser scanning straight lines.
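The relations entering such an energy function can be illustrated with the sketch below, where each projected roof edge is matched independently to the lowest-cost image line using relative-length, proximity, and orientation terms with placeholder weights; the paper instead minimizes a joint MRF energy with a modified brute-force search, so this is only an illustration of the individual terms.

```python
# Illustrative cost terms (relative length, proximity, orientation) between a
# projected roof edge and an extracted image line; weights and data are
# placeholders, and the greedy matching here is not the paper's joint MRF.
import math

def seg_props(seg):
    (x1, y1), (x2, y2) = seg
    length = math.hypot(x2 - x1, y2 - y1)
    angle = math.atan2(y2 - y1, x2 - x1) % math.pi   # undirected orientation
    midpoint = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    return length, angle, midpoint

def cost(projected_edge, image_line, w_len=1.0, w_prox=0.1, w_ori=2.0):
    lp, ap, mp = seg_props(projected_edge)
    li, ai, mi = seg_props(image_line)
    rel_length = abs(lp - li) / max(lp, li)
    proximity = math.hypot(mp[0] - mi[0], mp[1] - mi[1])
    orientation = min(abs(ap - ai), math.pi - abs(ap - ai))
    return w_len * rel_length + w_prox * proximity + w_ori * orientation

projected = [((0, 0), (10, 0)), ((10, 0), (10, 6))]
image_lines = [((0.4, 0.3), (9.8, 0.5)), ((10.2, -0.1), (10.1, 6.2)), ((3, 3), (7, 8))]
for edge in projected:
    best = min(image_lines, key=lambda line: cost(edge, line))
    print(edge, "->", best)
```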
Abstract:
OBJECTIVE: To evaluate the prevalence of trachoma among schoolchildren in Botucatu/SP, Brazil, and the spatial distribution of cases. METHODS: A cross-sectional study was carried out in November 2005 among children aged 7-14 years attending elementary schools in Botucatu/SP. The sample size was estimated at 2,092 children, considering a historical prevalence of 11.2%, an estimation error of 10%, and a confidence level of 95%. The sample was probabilistic, weighted, and increased by 20% to account for possible losses. A total of 2,692 children were examined. Diagnosis was clinical, based on the World Health Organization (WHO) standards. For the evaluation of the spatial data, the CartaLinx software (v1.2) was used, and the school-demand sectors were digitized according to the planning divisions of the Education Department. The data were analysed statistically, and the spatial structure of the events was calculated using the Geoda software. RESULTS: The prevalence of trachoma among schoolchildren in Botucatu was 2.9%, with cases of follicular trachoma having been detected. The exploratory spatial analysis did not allow rejection of the null hypothesis of randomness (I = -0.45, p > 0.05), with no significant school-demand sectors. The analysis performed for the Thiessen polygons also showed that the global pattern was random (I = -0.07; p = 0.49). However, the local indicators pointed to a low-low cluster for one polygon in the north of the urban area. CONCLUSION: The prevalence of trachoma among schoolchildren in Botucatu was 2.9%. The analysis of the spatial distribution did not reveal areas with greater clustering of cases. Although the global pattern of the disease does not reproduce the socioeconomic conditions of the population, the lowest trachoma prevalence was found in sectors of lower social vulnerability.
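For reference, the global Moran's I statistic underlying this spatial analysis can be computed as in the sketch below; the weight matrix and prevalence values are toy data, not the school-demand sectors or Thiessen polygons of the study, which were analysed with Geoda.

```python
# Sketch of the global Moran's I statistic; the weights and values here are
# toy data, not the study's school-demand sectors or Thiessen polygons.
import numpy as np

def morans_i(values, weights):
    """values: one prevalence per polygon; weights: (n, n) spatial weight matrix."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Toy example: four polygons in a row, rook-contiguity weights.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(morans_i([2.1, 3.0, 2.8, 1.5], w))
```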
Abstract:
The objective of the present study, developed in a mountainous region of Brazil where many landslides occur, is to present a method for detecting landslide scars that couples image processing techniques with spatial analysis tools. An IKONOS image was initially segmented and then classified with a Bhattacharyya classifier, with an acceptance limit of 99%, resulting in 216 polygons identified with a spectral response similar to landslide scars. After making use of some spatial analysis tools that took into account a susceptibility map, a map of local drainage channels and highways, and the maximum expected size of scars in the study area, some features misinterpreted as scars were excluded. The 43 resulting features were then compared with visually interpreted landslide scars and field observations. The proposed method can be reproduced and enhanced by adding filtering criteria, and it was able to find new scars on the image, with a final error rate of 2.3%.
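For reference, the Bhattacharyya distance between two Gaussian class models, the quantity underlying a Bhattacharyya classifier, can be sketched as below; the spectral statistics are toy values, and the segmentation, the 99% acceptance threshold, and the spatial filtering rules of the study are not reproduced.

```python
# Hedged sketch of the Bhattacharyya distance between two Gaussian class
# models; the means and covariances are toy values, not the study's classes.
import numpy as np

def bhattacharyya_distance(mu1, cov1, mu2, cov2):
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = 0.5 * (cov1 + cov2)                       # pooled covariance
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

# Toy spectral statistics for a "scar" class and a "vegetation" class.
print(bhattacharyya_distance([120.0, 90.0], [[25.0, 5.0], [5.0, 30.0]],
                             [80.0, 140.0], [[20.0, 2.0], [2.0, 22.0]]))
```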