938 results for Clouds of points
Abstract:
We analyse in a common framework the properties of the Voronoi tessellations resulting from regular 2D and 3D crystals and those of tessellations generated by Poisson distributions of points, thus bridging symmetry-breaking processes and the approach to uniformly random distributions of seeds. We perturb crystalline structures in 2D and 3D with a spatial Gaussian noise of dimensionless strength α and analyse the statistical properties of the cells of the resulting Voronoi tessellations using an ensemble approach. In 2D we consider triangular, square, and hexagonal regular lattices, resulting in hexagonal, square, and triangular tessellations, respectively. In 3D we consider the simple cubic (SC), body-centred cubic (BCC), and face-centred cubic (FCC) crystals, whose corresponding Voronoi cells are the cube, the truncated octahedron, and the rhombic dodecahedron, respectively. In 2D, for all values α>0, hexagons constitute the most common class of cells. Noise destroys the triangular and square tessellations, which are structurally unstable, as their topological properties are discontinuous at α=0. By contrast, the honeycomb hexagonal tessellation is topologically stable and, experimentally, all Voronoi cells remain hexagonal for small but finite noise with α<0.12. Essentially the same happens in the 3D case, where only the tessellation of the BCC crystal is topologically stable against noise of small but finite intensity. In both the 2D and 3D cases, already for a moderate amount of Gaussian noise (α>0.5), memory of the specific initial unperturbed state is lost, because the statistical properties of the three perturbed regular tessellations become indistinguishable. When α>2, the results converge to those of Poisson-Voronoi tessellations. In 2D, while the isoperimetric ratio increases with noise for the perturbed hexagonal tessellation, for the perturbed triangular and square tessellations it is optimised at a specific value of the noise intensity. The same applies in 3D, where noise degrades the isoperimetric ratio of the perturbed FCC and BCC lattices, whereas the opposite holds for the perturbed SC lattice. This allows a weaker form of the Kelvin conjecture to be formulated. By jointly analysing the statistical properties of the area and of the volume of the cells, we find that the cell shape also fluctuates strongly when noise is introduced into the system. In 2D, the geometrical properties of n-sided cells change with α until the Poisson-Voronoi limit is reached for α>2; in this limit the Desch law for perimeters is shown not to hold, and a square-root dependence on n is established, in agreement with exact asymptotic results. Anomalous scaling relations are observed between the perimeter and the area of cells in 2D and between the area and the volume of cells in 3D; except for the hexagonal (2D) and FCC (3D) structures, this applies even for infinitesimal noise. In the Poisson-Voronoi limit, the anomalous exponent is about 0.17 in both the 2D and 3D cases. A positive anomaly in the scaling indicates that large cells preferentially feature large isoperimetric quotients. As the number of faces is strongly correlated with sphericity (cells with more faces are bulkier), in 3D the anomalous scaling is strongly reduced when power-law fits are performed separately on cells with a given number of faces.
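A minimal sketch of the kind of experiment described above, assuming numpy and scipy (this is not the authors' code): a square lattice of seeds is perturbed by Gaussian noise of dimensionless strength α, and the edge counts of the resulting bounded Voronoi cells are tallied.

```python
# A minimal sketch (not the authors' code): perturb a square lattice of seeds
# with Gaussian noise of dimensionless strength alpha and tally the number of
# edges of each bounded Voronoi cell. Requires numpy and scipy.
import numpy as np
from collections import Counter
from scipy.spatial import Voronoi

def perturbed_lattice_cells(n=30, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    xs, ys = np.meshgrid(np.arange(n), np.arange(n))
    pts = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
    pts += rng.normal(scale=alpha, size=pts.shape)   # Gaussian perturbation
    vor = Voronoi(pts)
    counts = Counter()
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if region and -1 not in region:              # skip unbounded cells
            counts[len(region)] += 1                 # edge count of a 2D cell
    return counts

# Per the abstract, the square tessellation is structurally unstable:
# even tiny noise should make six-sided cells dominate.
print(perturbed_lattice_cells(alpha=0.05))
```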
Abstract:
In this paper, a support vector machine (SVM) approach for characterizing the feasible parameter set (FPS) in non-linear set-membership estimation problems is presented. It iteratively solves a regression problem from which an approximation of the boundary of the FPS can be determined. To guarantee convergence to the boundary, the procedure includes a derivative-free line search, and for adequate coverage of points on the FPS boundary it is suggested to start with a sequential box-pavement procedure. The SVM approach is illustrated on a simple two-parameter sine-and-exponential model and on an agro-forestry simulation model.
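The paper's iterative SVM-regression procedure with a derivative-free line search is more involved than can be shown here; as a loose illustration of delimiting a feasible parameter set with an SVM, the sketch below fits a scikit-learn one-class SVM to hypothetical feasible samples of a two-parameter sine/exponential model.

```python
# A loose illustration, not the paper's algorithm: delimit a feasible
# parameter set (FPS) from sampled feasible parameters with a one-class SVM.
# The paper instead iterates SVM *regression* with a derivative-free line
# search; the model and tolerance below are hypothetical.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
cand = rng.uniform(-2, 2, size=(4000, 2))            # candidate (a, b) pairs
residual = np.abs(np.sin(cand[:, 0]) - np.exp(-cand[:, 1]))
feasible = cand[residual < 0.1]                      # set-membership test

model = OneClassSVM(kernel="rbf", gamma=2.0, nu=0.05).fit(feasible)
# decision_function is ~0 near the learned boundary of the FPS.
print(model.decision_function(np.array([[0.0, 0.0], [1.5, 1.5]])))
```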
Abstract:
Drought characterisation is an intrinsically spatio-temporal problem. A limitation of previous approaches to characterisation is that they discard much of the spatio-temporal information by reducing events to a lower-order subspace. To address this, an explicit 3-dimensional (longitude, latitude, time) structure-based method is described, in which drought events are defined by a spatially and temporally coherent set of points displaying standardised precipitation below a given threshold. Geometric methods can then be used to measure similarity between individual drought structures. Groupings of these similarities provide an alternative to traditional methods for extracting recurrent space-time signals from geophysical data. The explicit consideration of structure encourages the construction of summary statistics that relate to the event geometry; example measures considered are the event volume, centroid, and aspect ratio. The utility of a 3-dimensional approach is demonstrated by application to the analysis of European droughts (15°W to 35°E, 35°N to 70°N) for the period 1901–2006. Large-scale structure is found to be abundant, with 75 identified events lasting more than 3 months and spanning at least 0.5 × 10⁶ km². Near-complete dissimilarity is seen between the individual drought structures, and little or no regularity is found in the time evolution of even the most spatially similar drought events. The spatial distribution of the event centroids and the time evolution of the geographic cross-sectional areas strongly suggest that large-area, sustained droughts result from the combination of multiple small-area (∼10⁶ km²), short-duration (∼3 months) events. The small events are not found to occur independently in space, which leads to the hypothesis that local water feedbacks play an important role in the aggregation process.
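A minimal sketch of the 3-D structure idea, assuming a gridded standardised-precipitation array spi[time, lat, lon] (toy random data here, not the study's observations): points below a threshold are grouped into space-time connected components, and a per-event volume and centroid are computed.

```python
# A minimal sketch with toy data: drought events as 3-D connected components
# (time, latitude, longitude) of grid points whose standardised precipitation
# falls below a threshold; per-event volume and space-time centroid follow.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
spi = rng.normal(size=(120, 36, 50))        # months x lat x lon (toy field)
mask = spi < -1.0                           # points in drought

labels, n_events = ndimage.label(mask)      # couples neighbours in space-time
sizes = np.bincount(labels.ravel())[1:]     # event "volume" in grid cells
centroids = ndimage.center_of_mass(mask, labels, range(1, n_events + 1))

for ev in np.argsort(sizes)[::-1][:5]:      # the five largest events
    print(ev + 1, sizes[ev], centroids[ev])
```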
Abstract:
It is often assumed that humans generate a 3D reconstruction of the environment, either in egocentric or world-based coordinates, but the steps involved are unknown. Here, we propose two reconstruction-based models, evaluated using data from two tasks in immersive virtual reality. We model the observer's prediction of landmark location based on standard photogrammetric methods and then combine location predictions to compute likelihood maps of navigation behaviour. In one model, each scene point is treated independently in the reconstruction; in the other, the pertinent variable is the spatial relationship between pairs of points. Participants viewed a simple environment from one location, were transported (virtually) to another part of the scene, and were asked to navigate back. Error distributions varied substantially with changes in scene layout; we compared these directly with the likelihood maps to quantify the success of the models. We also measured error distributions when participants manipulated the location of a landmark to match the preceding interval, providing a direct test of the landmark-location stage of the navigation models. Models such as these, which start with scenes and end with a probabilistic prediction of behaviour, are likely to be increasingly useful for understanding 3D vision.
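As a toy illustration only (not the paper's photogrammetric model), the sketch below combines independent Gaussian likelihoods of noisy landmark observations into a likelihood map over candidate observer positions; the landmark geometry, measurements, and noise level are all hypothetical.

```python
# A toy illustration (not the paper's model): independent Gaussian likelihoods
# of noisy landmark distances combined into a likelihood map over candidate
# observer positions. Geometry, measurements, and sigma are hypothetical.
import numpy as np

landmarks = np.array([[0.0, 2.0], [2.0, 0.0], [-1.0, -1.0]])
measured = np.array([2.2, 1.9, 1.5])        # noisy distances to each landmark
sigma = 0.3

xs = np.linspace(-3, 3, 201)
ys = np.linspace(-3, 3, 201)
X, Y = np.meshgrid(xs, ys)
log_lik = np.zeros_like(X)
for (lx, ly), d in zip(landmarks, measured):
    pred = np.hypot(X - lx, Y - ly)         # distance if observer stood here
    log_lik += -0.5 * ((pred - d) / sigma) ** 2

iy, ix = np.unravel_index(np.argmax(log_lik), log_lik.shape)
print(xs[ix], ys[iy])                       # most likely observer position
```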
Abstract:
Multibiometrics aims at improving biometric security in the presence of spoofing attempts, but exposes more points of attack. Standard fusion rules have been shown to be highly sensitive to spoofing attempts, even when only a single fake instance is presented. This paper presents a novel spoofing-resistant fusion scheme that detects and eliminates anomalous fusion input in an ensemble of evidence with liveness information. The approach aims at making multibiometric systems more resistant to presentation attacks by modelling the typical behaviour of human surveillance operators detecting anomalies, as employed in many decision support systems. It is shown to improve security while retaining the high accuracy of standard fusion approaches on the latest Fingerprint Liveness Detection Competition (LivDet) 2013 dataset.
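A simplified sketch of the general idea (the paper's fusion rule is more sophisticated): per-modality match scores are fused only after inputs whose liveness score falls below a hypothetical threshold have been eliminated.

```python
# A simplified sketch (not the paper's exact rule): fuse per-modality match
# scores after discarding inputs whose liveness score looks anomalous.
# The threshold and all scores here are hypothetical.
import numpy as np

def spoof_resistant_fusion(match_scores, liveness_scores, liveness_min=0.5):
    match = np.asarray(match_scores, dtype=float)
    live = np.asarray(liveness_scores, dtype=float)
    keep = live >= liveness_min        # eliminate suspected presentation attacks
    if not keep.any():
        return 0.0                     # reject when nothing looks live
    return match[keep].mean()          # standard mean rule on what remains

# Second modality has high match score but suspicious liveness: it is dropped.
print(spoof_resistant_fusion([0.9, 0.95, 0.2], [0.8, 0.1, 0.9]))
```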
Abstract:
Detailed observations of the solar system planets reveal a wide variety of local atmospheric conditions. Astronomical observations have revealed a variety of extrasolar planets, none of which fully resembles any of the solar system planets. Instead, the most massive among the extrasolar planets, the gas giants, appear very similar to the class of (young) Brown Dwarfs, which are amongst the oldest objects in the universe. Despite this diversity, solar system planets, extrasolar planets, and Brown Dwarfs have broadly similar global temperatures, between 300 K and 2500 K. Consequently, clouds of different chemical species form in their atmospheres. While the details of these clouds differ, the fundamental physical processes are the same. Furthermore, all of these objects have been observed to produce radio and X-ray emission. While both kinds of radiation are well studied on Earth, and to a lesser extent on the solar system planets, the occurrence of emission that potentially originates from accelerated electrons on Brown Dwarfs, extrasolar planets, and protoplanetary disks is not yet well understood. This paper offers an interdisciplinary view of electrification processes and their feedback on the hosting environment in meteorology, volcanology, planetology, and research on extrasolar planets and planet formation.
Abstract:
In order to accelerate computing the convex hull of a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, that contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm runs in O(n) time; second, no explicit sorting of the data is required; and third, the reduced set of s points forms a simple polygonal chain and can thus be pipelined directly into an O(n)-time convex hull algorithm. This paper empirically evaluates and quantifies the speedup gained by preconditioning a set of points with a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found in experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n in the dataset, the greater the speedup factor achieved.
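A minimal sketch of the point-reduction idea as we read it (the paper's algorithm additionally guarantees that the survivors form a simple polygonal chain, which this sketch does not): with integer x-coordinates bounded by p, a single O(n) pass keeps only each column's lowest and highest points, leaving at most 2p survivors that still contain every hull vertex.

```python
# A minimal sketch of the column-extremes reduction (our reading of the idea,
# not the paper's exact algorithm): any point strictly between its column's
# lowest and highest point is a convex combination of them, so it can never
# be a hull vertex and may be discarded.
def reduce_points(points, p):
    lo = [None] * p                      # lowest y seen in each x column
    hi = [None] * p                      # highest y seen in each x column
    for x, y in points:                  # single O(n) pass, no sorting
        if lo[x] is None or y < lo[x]:
            lo[x] = y
        if hi[x] is None or y > hi[x]:
            hi[x] = y
    keep = []
    for x in range(p):
        if lo[x] is not None:
            keep.append((x, lo[x]))
            if hi[x] != lo[x]:
                keep.append((x, hi[x]))
    return keep                          # at most 2p points, hull preserved

pts = [(3, 1), (0, 0), (3, 4), (1, 2), (0, 3), (2, 2)]
print(reduce_points(pts, 4))
```

The reduced set can then be handed to any standard convex hull routine; the paper's chain construction is what enables the O(n)-time hull step afterwards.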
Abstract:
The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least-squares approximations, which compute the coordinates of a set of projected points from the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP positions their neighbouring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in mD. To perform the projection, only a small number of distance calculations is necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly in the setting where it was most extensively tested, the mapping of text sets.
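A minimal sketch of the least-squares idea behind LSP as described above: every projected point is asked to sit at the centroid of its neighbours (neighbourhoods coming from similarity in mD), a few control points are pinned at known 2-D positions, and the resulting overdetermined linear system is solved once. The neighbourhoods and control coordinates below are illustrative, not the paper's pipeline.

```python
# A minimal sketch of the least-squares placement (illustrative data, not the
# paper's pipeline): each point satisfies p_i - mean(neighbours of i) = 0,
# while control points are pinned at fixed 2-D coordinates.
import numpy as np

def lsp_like(neighbors, controls, n):
    rows, rhs = [], []
    for i in range(n):                      # neighbourhood-centroid equations
        row = np.zeros(n)
        row[i] = 1.0
        for j in neighbors[i]:
            row[j] -= 1.0 / len(neighbors[i])
        rows.append(row)
        rhs.append(np.zeros(2))
    for i, xy in controls.items():          # pin control points
        row = np.zeros(n)
        row[i] = 1.0
        rows.append(row)
        rhs.append(np.asarray(xy, dtype=float))
    A, b = np.vstack(rows), np.vstack(rhs)
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    return P                                # n x 2 projected coordinates

nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(lsp_like(nbrs, {0: (0.0, 0.0), 3: (1.0, 1.0)}, 4))
```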
Abstract:
DEN KOMMUNALA FÖRVALTNINGEN SOM RATIONALISTISKT IDEAL - en fallstudie om styrning och handlingsutrymme inom skola, barnomsorg och miljö- och hälsoskydd. (The municipal authority as a rationalist ideal - a case study on steering and scope for initiative within child-care, education and environmental departments.) A municipal authority is a considerable producer of services in the local community and is commonly perceived as an important sector of the Swedish welfare system. One aspect of a well-functioning municipal organisation is that its administrative organs function efficiently. This study examines how activities in municipal administration are steered. The focus is on how different methods are used within a vertical hierarchical perspective to influence the actions of the participants, and how the latter try to create space for action. To analyse the problem, an ideal-type steering model is used. The study consists of three sections. In the first, the research problem and the aims of the study are introduced, as well as the methodological and theoretical approach. The results of the study are presented in the second section, and in the third, conclusions are drawn and discussed. The study shows that the perceptions of the participants regarding the possibilities of steering everyday activities with the support of the methods studied differ on a number of points, depending on the sector studied. When control of the various steering methods is distributed across different organisational units in the municipality, a number of steering mechanisms operate side by side, sometimes in harmony and sometimes independently of, or in open conflict with, each other's goals. Steering leads to clear restrictions, but there is clearly room for initiative, a 'free zone' where the individual has room to act independently. Is it possible, on the basis of this study, to state whether the ideal-type model functions as intended? On many accounts it would seem doubtful whether steering leads to beneficial effects for the activity. Rather, the effects of steering sometimes function more or less randomly, because the administration exists in a complex context in which the staff can be expected to have their own expectations and act in accordance with them.
Abstract:
Currently, there is a public bus transportation route in Waterville, Maine; however, the system could be improved. Our goal was to use GIS to find optimal public transportation routes throughout the city based on given points of interest and areas of high population density. Three groups of points of interest were created in the North, West, and South sections of Waterville. Using the Network Analyst tool, which calculates optimal routes over existing street data from inputs of stops, barriers, and impedance, we analysed what we expected to be the routes serving the greatest number of people. Two sets of routes were found: one with length as the impedance (the shortest length between the selected stops was favoured), and one with population density as the impedance (the roads with the highest population density were favoured). Finally, the travel times of the resulting routes (assuming a constant speed limit of 25 mph) were calculated and evaluated.
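A minimal sketch of the two-impedance comparison on a hypothetical street graph (networkx), not the Waterville data: the same stop-to-stop route is computed once with edge length as the weight and once with an inverted population density, so that dense roads are favoured.

```python
# A minimal sketch (hypothetical street graph, not the Waterville data):
# the same stop-to-stop routing under two impedances, edge length versus
# population density, mirroring the analysis described above.
import networkx as nx

G = nx.Graph()
# edges: (u, v, length in km, population density along the segment)
edges = [("A", "B", 1.0, 50), ("B", "C", 1.0, 200),
         ("A", "C", 2.5, 300), ("C", "D", 0.5, 120)]
for u, v, length, dens in edges:
    # To *favour* dense roads with a shortest-path solver, invert the density.
    G.add_edge(u, v, length=length, inv_density=1.0 / dens)

print(nx.shortest_path(G, "A", "D", weight="length"))       # shortest route
print(nx.shortest_path(G, "A", "D", weight="inv_density"))  # densest route
```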
Abstract:
Point pattern matching in Euclidean spaces is one of the fundamental problems in pattern recognition, with applications ranging from computer vision to computational chemistry. Whenever two complex patterns are encoded by two sets of points identifying their key features, their comparison can be seen as a point pattern matching problem. This work proposes a single approach to both exact and inexact point-set matching in Euclidean spaces of arbitrary dimension. In the case of exact matching, the approach is guaranteed to find an optimal solution. For inexact matching (when noise is involved), experimental results confirm the validity of the approach. We start by regarding point pattern matching as a weighted graph matching problem. We then formulate weighted graph matching as a problem of Bayesian inference in a probabilistic graphical model. By exploiting fundamental constraints on patterns embedded in Euclidean spaces, we prove that for exact point-set matching a simple graphical model is equivalent to the full model. It is possible to show that exact probabilistic inference in this simple model has polynomial time complexity with respect to the number of elements in the patterns to be matched. This gives rise to a technique that, for exact matching, provably finds a global optimum in polynomial time for any dimensionality of the underlying Euclidean space. Computational experiments comparing this technique with well-known probabilistic relaxation labelling show significant performance improvement for inexact matching. The proposed approach is significantly more robust under augmentation of the sizes of the involved patterns, and in the absence of noise the results are always perfect.
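A brute-force sketch of the exact problem only (the paper's contribution is a polynomial-time graphical-model inference, not this exponential search): an exact correspondence between two point sets is one that preserves every pairwise Euclidean distance, i.e. a weighted-graph matching where edge weights are the distances.

```python
# A brute-force sketch of exact matching only (the paper's method is a
# polynomial-time graphical-model inference, not this exponential search):
# find a correspondence preserving every pairwise Euclidean distance.
import itertools
import math

def exact_match(P, Q, tol=1e-9):
    n = len(P)
    for perm in itertools.permutations(range(n)):
        if all(abs(math.dist(P[i], P[j]) - math.dist(Q[perm[i]], Q[perm[j]]))
               <= tol for i in range(n) for j in range(i + 1, n)):
            return perm                     # all pairwise distances preserved
    return None

P = [(0, 0), (1, 0), (0, 2)]
Q = [(5, 5), (5, 7), (6, 5)]                # a rigidly moved copy of P
print(exact_match(P, Q))                    # prints (0, 2, 1)
```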
Abstract:
This work attempts a theoretical approach to H. Rorschach's "Psychodiagnostics". It seeks to show that the theoretical bases of the test, at its origin, to some extent predetermine its paths of conceptual development. In doing so, they facilitate its handling by points of view close to those bases, and create difficulties for, or in many respects even prevent, the full and integral acceptance of the technique by conceptions and systems of ideas distant from these theoretical origins. To this end, the theoretical foundations of the test are analysed as they emerged from its author's thought, and its development and use by psychologists, clinicians, and other professionals is traced historically. It is then shown how the main theoretical systems of contemporary psychology approach the test. The test is thus appraised from the standpoints of psychoanalysis, Gestalt psychology, behaviourism and psychometrics, phenomenology and existentialism, and psycholinguistics. The conclusions seek to show how the conceptual bases given to the test by its author, although considered underdeveloped by scholars, set certain limits on its evolution and theoretical treatment, such that objectivist conceptions, like behaviourism and psychometrics, do not fit within those limits, which prevents the full absorption of the test by those systems.
Abstract:
This study aimed to investigate the perceptions of university-educated professionals and religious personnel regarding the causes and control of mental illness. Opinions were collected from 360 professionals (psychologists, physicians and psychiatrists, nurses and psychiatric nurses, social workers, sociologists, management technicians, economists, and engineers) and 60 religious personnel (priests and nuns). Two questionnaire-format scales were used: the Mental Health Locus of Origin (MHLO) and the Mental Health Locus of Control (MHLC). Subjects were distributed according to their scores on each scale; for each MHLO item, the number of subjects who believe more in organic causes of mental illness was distinguished from the number who believe in environmental causes. Likewise, for each MHLC item, the subjects who believe that the success of psychotherapy depends more on the client's behaviour were distinguished from those who believe such success depends more on the therapist's skill. The correlation coefficient between locus of origin and locus of control of mental illness was calculated, together with the reliability coefficients of the two scales. The main conclusions indicate that psychologists, social workers, and sociologists believe more strongly than physicians and nurses, and these in turn more strongly than religious personnel and technologists, that mental illness is caused by environmental factors and that its control depends more on the client's behaviour than on the therapist's skill.
Abstract:
In this article, multicriteria approaches are developed for selecting municipalities for the implementation of public education policies. Given that the state of education in Brazil varies widely, affecting the educational demands placed on public policy differently in different regions, objective criteria are needed for selecting the points at which resources should be applied to combat inequalities in this area. In particular, the use of the Human Development Index as a basis for decisions in education is discussed. The article also aims to establish conditions for comparing these situations according to the different criteria used to ground public education policies.
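As a generic illustration of a multicriteria ranking (the article develops and compares alternatives, including HDI-based criteria; the indicators, weights, and data below are all hypothetical), a weighted-sum score can order municipalities by priority for resources.

```python
# A generic weighted-sum multicriteria ranking (our illustration; indicator
# names, weights, and figures are hypothetical, not the article's data).
municipalities = {
    "A": {"hdi": 0.65, "enrollment": 0.80, "dropout": 0.12},
    "B": {"hdi": 0.72, "enrollment": 0.70, "dropout": 0.20},
}
# Higher score = higher priority: low HDI and enrollment, high dropout.
weights = {"hdi": -0.5, "enrollment": -0.3, "dropout": 0.2}

def priority(indicators):
    return sum(w * indicators[k] for k, w in weights.items())

for name, ind in sorted(municipalities.items(),
                        key=lambda kv: -priority(kv[1])):
    print(name, round(priority(ind), 3))
```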
Abstract:
AIRES, Kelson R. T.; ARAÚJO, Hélder J.; MEDEIROS, Adelardo A. D. Plane Detection Using Affine Homography. In: CONGRESSO BRASILEIRO DE AUTOMÁTICA, 2008, Juiz de Fora, MG. Anais... do CBA 2008.