8 results for Spatial Point Pattern analysis
at Universidad de Alicante
Abstract:
Second-order functions are increasingly used in the analysis of ecological processes. In this work we present two recently developed second-order functions that allow the spatio-temporal interaction between two species, or two functional types of individuals, to be analyzed. These functions were developed for the study of interactions between species in forest stands, starting from the current diameter distribution of the trees. The first is the bivariate function for marked point processes, Krsmm, which analyzes the spatial correlation of a variable between individuals belonging to two species as a function of distance. The second is the replacement function r, which analyzes the association between individuals belonging to two species as a function of the difference between their diameters, or of another variable associated with those individuals. To show the behavior of both functions in the analysis of forest systems in which different ecological processes operate, three case studies are presented: a mixed stand of Pinus pinea L. and Pinus pinaster Ait. in the Northern Plateau, a cloud forest of the Tropical Andean Region, and the ecotone between Quercus pyrenaica Willd. and Pinus sylvestris L. stands in the Sistema Central. In all three, both the Krsmm function and the r function are used to analyze forest dynamics from experimental plots in which every tree is mapped, and from inventory plots.
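The core idea of a bivariate mark-correlation estimate can be sketched as follows. This is a minimal illustration, not the authors' exact Krsmm estimator: over pairs of trees from two species separated by roughly a distance r, it averages the product of their marks (e.g. diameters), normalized by the product of the mean marks, so values above or below 1 suggest positive or negative spatial correlation of the marks at that distance. All names are hypothetical.

```python
import numpy as np

def mark_correlation(xy1, m1, xy2, m2, r, dr=0.5):
    """Naive bivariate mark-correlation estimate at distance r.

    xy1, xy2: (n, 2) coordinates of the trees of each species.
    m1, m2:   marks (e.g. diameters) of each species.
    Averages the product of marks over inter-species pairs whose
    separation lies in (r - dr, r + dr], normalized by the product
    of the mean marks (no edge correction; illustration only)."""
    d = np.linalg.norm(xy1[:, None, :] - xy2[None, :, :], axis=2)
    pairs = (d > r - dr) & (d <= r + dr)
    if not pairs.any():
        return np.nan            # no pairs at this distance band
    mean_pair_product = np.outer(m1, m2)[pairs].mean()
    return mean_pair_product / (m1.mean() * m2.mean())
```

A production estimator would add edge correction and kernel smoothing over distance, as implemented for example in the spatstat package.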
Abstract:
Objective: To assess the usefulness of microperimetry (MP) as an additional objective method for characterizing the fixation pattern in nystagmus. Design: Prospective study. Participants: Fifteen eyes of 8 subjects (age, 12–80 years) with nystagmus from the Lluís Alcanyís Foundation (University of Valencia, Spain) were included. Methods: All patients had a comprehensive ophthalmologic examination including a microperimetric examination (MAIA, CenterVue, Padova, Italy). The following microperimetric parameters were evaluated: average threshold (AT), macular integrity index (MI), fixating points within circles of 1° (P1) and 2° (P2) of radius, bivariate contour ellipse area (BCEA) considering 63% and 95% of fixating points, and the horizontal and vertical axes of that ellipse. Results: In monocular conditions, 6 eyes showed a fixation classified as stable, 6 eyes showed a relatively unstable fixation, and 3 eyes showed an unstable fixation. Statistically significant differences were found between the horizontal and vertical components of movement (p = 0.001), as well as in their ranges (p < 0.001). Intereye comparison showed differences between eyes in some subjects, but statistically significant differences were found only in the fixation coordinates X and Y (p < 0.001). No significant intereye differences were found in the microperimetric parameters. Between monocular and binocular conditions, statistically significant differences in the X and Y coordinates were found in all eyes (p < 0.02) except one. No significant differences were found between MP parameters under monocular versus binocular conditions. Strong correlations of corrected distance visual acuity (CDVA) with AT (r = 0.812, p = 0.014), MI (r = –0.812, p = 0.014), P1 (r = 0.729, p = 0.002), the horizontal diameter of the BCEA (r = –0.700, p = 0.004), and the X range (r = –0.722, p = 0.005) were found.
Conclusions: MP seems to be a useful technology for the characterization of the fixation pattern in nystagmus, which seems to be related to the level of visual acuity achieved by the patient.
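The BCEA values reported above follow a standard formula for a bivariate-normal model of fixation scatter, BCEA = 2kπσxσy√(1 − ρ²), where the enclosed proportion p satisfies p = 1 − e^(−k) (k ≈ 1 for 63%, k ≈ 3 for 95%). A minimal sketch, assuming fixation coordinates are given as arrays in degrees; this is an illustration of the formula, not the MAIA device's internal computation:

```python
import numpy as np

def bcea(x, y, p=0.63):
    """Bivariate contour ellipse area enclosing a proportion p of
    fixation points, assuming bivariate-normal fixation scatter:
    BCEA = 2*k*pi*sx*sy*sqrt(1 - rho^2), with p = 1 - exp(-k)."""
    k = -np.log(1.0 - p)                 # chi-square scaling factor
    sx = np.std(x, ddof=1)               # horizontal SD of fixation
    sy = np.std(y, ddof=1)               # vertical SD of fixation
    rho = np.corrcoef(x, y)[0, 1]        # correlation of x and y
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho ** 2)
```

For the same fixation data, the 95% BCEA is roughly three times the 63% BCEA, since −ln(0.05)/−ln(0.37) ≈ 3.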
Abstract:
A study of the dramatic work of Francesc Renart: an inventory of his plays and the beginnings of the nineteenth-century Catalan sainet costumista (one-act comedy of manners).
Abstract:
Staff detection and removal is one of the most important issues in optical music recognition (OMR) tasks, since common approaches to symbol detection and classification depend on this process. Due to its complexity, staff detection and removal is often inaccurate, leading to a large number of errors in later stages. For this reason, this paper proposes a new approach that avoids this stage and is expected to overcome these drawbacks. Our approach is put into practice in a case study focused on scores written in white mensural notation. Symbol detection is performed using the vertical projection of the staves, and the cross-correlation operator for template matching is used at the classification stage. The merit of our proposal is shown in an experiment in which it attains an extraction rate of 96% and a classification rate of 92%, on average. These results reinforce the idea of pursuing a new research line in OMR systems that does not require the removal of staff lines.
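Template matching by normalized cross-correlation, the operator used at the classification stage above, can be sketched as a brute-force scan. This is an illustration of the operator itself, not the paper's implementation, and the function name is hypothetical:

```python
import numpy as np

def best_match(image, template):
    """Slide the template over the image and return the top-left
    position and score of the highest normalized cross-correlation.
    Both inputs are 2-D float arrays (e.g. grayscale score images)."""
    th, tw = template.shape
    t = template - template.mean()       # zero-mean template
    best, best_score = (0, 0), -np.inf
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            w = image[i:i + th, j:j + tw]
            wz = w - w.mean()            # zero-mean window
            denom = np.sqrt((wz ** 2).sum() * (t ** 2).sum())
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best = score, (i, j)
    return best, best_score
```

Real systems use an FFT-based or library implementation (e.g. OpenCV's `matchTemplate`) rather than this quadratic scan, but the score being maximized is the same.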
Abstract:
Polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/F) have been studied for several decades and are well known as unintentionally generated persistent organic pollutants (POPs), which pose serious health and environmental risks on a global scale [1]. Polybrominated dibenzo-p-dioxins and dibenzofurans (PBDD/F) have similar properties and effects to PCDD/F, as they are structural analogs with all the chlorine atoms substituted by bromine atoms. PBDD/F have been found in various matrices such as air, sediments, marine products, and human adipose samples.
Abstract:
The complete characterization of rock masses implies acquiring information on both the materials that compose the rock mass and the discontinuities that divide the outcrop. Recent advances in remote sensing techniques – such as Light Detection and Ranging (LiDAR) – allow the accurate and dense acquisition of 3D information that can be used for the characterization of discontinuities. This work presents a novel methodology for calculating the normal spacing of persistent and non-persistent discontinuity sets from 3D point cloud datasets, considering the three-dimensional relationships between clusters. The approach requires that the 3D dataset has been previously classified: the discontinuity sets are extracted beforehand, every point is labeled with its corresponding discontinuity set, and every exposed planar surface is analytically calculated. Then, for each discontinuity set, the method calculates the normal spacing between an exposed plane and its nearest one, considering their 3D spatial relationship. This link between planes is obtained by finding, for every point, its nearest point belonging to the same discontinuity set, which identifies its nearest plane and thus allows the normal spacing to be calculated for every plane. Finally, the normal spacing of each discontinuity set is calculated as the mean of all its plane-to-plane normal spacings. The methodology is validated through three case studies using synthetic data and 3D laser scanning datasets. The first case illustrates the fundamentals and performance of the proposed methodology. The second and third case studies correspond to two rock slopes whose datasets were acquired with a 3D laser scanner. The second case study shows that the results obtained with the traditional and the proposed approaches are reasonably similar. Nevertheless, discrepancies between the two approaches appeared when the exposed planes belonging to a discontinuity set were hard to identify and when the pairing of planes was difficult to establish during the fieldwork campaign. The third case study also shows that when the number of identified exposed planes is high, the normal spacing calculated with the proposed approach is smaller than that obtained with the traditional approach.
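The per-set spacing computation described above can be sketched as follows, representing each exposed plane by a centroid and assuming a single shared unit normal per discontinuity set. This is a simplification of the method (which works on labeled point clouds and analytically fitted planes), and all names are hypothetical:

```python
import numpy as np

def normal_spacing(centroids, normal):
    """Mean normal spacing of one discontinuity set.

    centroids: (n, 3) array, one representative point per exposed plane.
    normal:    the set's shared plane normal (need not be unit length).
    For every plane, find its nearest neighbour within the set and
    project the separation vector onto the unit normal; the set's
    spacing is the mean of these projections."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)                 # make the normal unit length
    c = np.asarray(centroids, dtype=float)
    spacings = []
    for i in range(len(c)):
        d = np.linalg.norm(c - c[i], axis=1)
        d[i] = np.inf                      # exclude the plane itself
        j = int(d.argmin())                # nearest plane of same set
        spacings.append(abs((c[j] - c[i]) @ n))
    return float(np.mean(spacings))
```

For three parallel planes one metre apart along their common normal, the sketch returns a mean normal spacing of 1 m, as expected.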
Abstract:
The Iterative Closest Point (ICP) algorithm is commonly used in engineering applications to solve the rigid registration problem for partially overlapping point sets that have been pre-aligned with a coarse estimate of their relative positions. This iterative algorithm is applied in many areas: in medicine for the volumetric reconstruction of tomography data, in robotics to reconstruct surfaces or scenes from range-sensor information, in industrial systems for the quality control of manufactured objects, and even in biology to study the structure and folding of proteins. One of the algorithm's main problems is its high computational complexity (quadratic in the number of points for the non-optimized original variant) in a context where high-density point sets, acquired by high-resolution scanners, are processed. Many variants have been proposed in the literature that aim to improve performance, either by reducing the number of points or the required iterations, or by reducing the complexity of the most expensive phase: the closest-neighbor search. Despite lowering the complexity, some of these variants tend to have a negative impact on the final registration precision or on the convergence domain, thus limiting the possible application scenarios. The goal of this work is to improve the algorithm's computational cost so that a wider range of the computationally demanding problems described above can be addressed. For that purpose, an experimental and mathematical convergence analysis and validation of point-to-point distance metrics has been performed, considering distances with a lower computational cost than the Euclidean distance, which is the de facto standard in implementations of the algorithm. In this analysis, the behavior of the algorithm in diverse topological spaces, each characterized by a different metric, has been studied to check the convergence, efficacy and cost of the method and to determine which metric offers the best results. Given that distance calculation represents a significant part of the computations performed by the algorithm, any reduction in the cost of that operation can be expected to improve the overall performance of the method significantly. As a result, a performance improvement has been achieved by applying these reduced-cost metrics, whose quality in terms of convergence and error has been analyzed and experimentally validated as comparable to the Euclidean distance, using a heterogeneous set of objects, scenarios and initial situations.
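The idea of swapping the Euclidean metric for a cheaper one in the closest-neighbor search can be illustrated with a brute-force lookup under three common metrics. This is a sketch, not the thesis code; note that the chosen metric can change which neighbor is selected, which is why the work above has to validate convergence and error, not just speed:

```python
import numpy as np

METRICS = {
    # Candidate metrics for the ICP closest-neighbor search.
    # Manhattan and Chebyshev avoid the square root (and, per
    # coordinate, the squaring) of the Euclidean norm.
    "euclidean": lambda d: np.sqrt((d ** 2).sum(axis=1)),
    "manhattan": lambda d: np.abs(d).sum(axis=1),
    "chebyshev": lambda d: np.abs(d).max(axis=1),
}

def nearest(point, cloud, metric="euclidean"):
    """Index of the closest point in `cloud` (an (n, k) array)
    to `point` under the chosen metric."""
    dists = METRICS[metric](cloud - point)
    return int(dists.argmin())
```

For a query at the origin with candidates (3, 0) and (2, 2), the Euclidean and Chebyshev metrics pick (2, 2) while the Manhattan metric picks (3, 0): the metrics genuinely induce different correspondences, and hence potentially different registration results.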
Abstract:
.bin files should be opened with CloudCompare.