938 results for "Clouds of points"
Abstract:
Given that the total amount of losses in a distribution system is known, with a reliable methodology for technical loss calculation the non-technical losses can be obtained by subtraction. A common method for calculating technical losses in electric utilities uses two important factors: the load factor and the loss factor. The load factor is usually obtained from energy and demand measurements, whereas computing the loss factor requires knowledge of the demand and energy losses, which are not, in general, amenable to direct measurement. In this work, a statistical analysis of this relationship is presented, using the curves of a sample of consumers from a specific company. These curves are summarized in different bands of the coefficient k, making it possible to determine where each group of consumers has its major concentration of points. ©2008 IEEE.
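The abstract does not reproduce the relation between the two factors; the sketch below assumes the empirical formula commonly used in the loss-factor literature, F_loss = k·F_load + (1 − k)·F_load², with a purely illustrative value of k:

```python
# Minimal sketch (not from the paper): the empirical relation commonly used
# to link the loss factor to the load factor via a coefficient k. The value
# k = 0.3 below is purely illustrative.

def loss_factor(load_factor: float, k: float) -> float:
    """Estimate the loss factor from the load factor:
    F_loss = k * F_load + (1 - k) * F_load**2
    """
    return k * load_factor + (1.0 - k) * load_factor ** 2

if __name__ == "__main__":
    for lf in (0.2, 0.5, 0.8):
        print(f"load factor {lf:.1f} -> loss factor {loss_factor(lf, k=0.3):.3f}")
```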
Abstract:
The aim of this work is to evaluate the influence of point measurements in images with subpixel accuracy, and their contribution to the calibration of digital cameras. The effect of subpixel measurements on the 3D coordinates of check points in object space is also evaluated. For this purpose, an algorithm allowing subpixel accuracy was implemented for the semi-automatic determination of points of interest, based on the Förstner operator. Experiments were carried out with a block of images acquired with the DuncanTech MS3100-CIR multispectral camera. The influence of subpixel measurements on the adjustment by the Least Squares Method (LSM) was evaluated by comparing the estimated standard deviations of the parameters in both situations: manual measurement (pixel accuracy) and subpixel estimation. Additionally, the influence of subpixel measurements on the 3D reconstruction was analyzed. Based on the obtained results, i.e., on the quantification of the standard deviation reduction in the Interior Orientation Parameters (IOP) and in the relative error of the 3D reconstruction, it was shown that measurements with subpixel accuracy are relevant for some tasks in Photogrammetry, mainly those in which metric quality is of great relevance, such as camera calibration.
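As a rough illustration of the detector the paper builds on, the sketch below computes the classical Förstner interest measures (weight w = det N / tr N and roundness q = 4 det N / tr N²) from the local structure tensor; the subpixel refinement step the paper evaluates is not shown, and the window size is an arbitrary choice:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def foerstner_measures(image: np.ndarray, window: int = 5):
    """Classical Foerstner interest measures over an image.

    Returns the weight w = det(N)/trace(N) and the roundness
    q = 4*det(N)/trace(N)**2 of the local structure tensor N,
    built from image gradients averaged over a square window.
    """
    gy, gx = np.gradient(image.astype(float))
    nxx = uniform_filter(gx * gx, size=window)
    nyy = uniform_filter(gy * gy, size=window)
    nxy = uniform_filter(gx * gy, size=window)
    det = nxx * nyy - nxy ** 2
    trace = nxx + nyy
    eps = 1e-12  # avoid division by zero in flat regions
    return det / (trace + eps), 4.0 * det / (trace ** 2 + eps)

if __name__ == "__main__":
    img = np.zeros((64, 64)); img[20:40, 20:40] = 1.0  # synthetic square target
    w, q = foerstner_measures(img)
    print("strongest candidate at", np.unravel_index(np.argmax(w), w.shape))
```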
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Objective: The aim of the present study was to evaluate the effect of pursed-lip breathing (PLB) on cardiac autonomic modulation in individuals with chronic obstructive pulmonary disease (COPD) while at rest. Methods: Thirty-two individuals were allocated to one of two groups: COPD (n = 17; 67.29 ± 6.87 years of age) and control (n = 15; 63.2 ± 7.96 years of age). The groups were submitted to a two-stage experimental protocol. The first stage consisted of the characterization of the sample and spirometry. The second stage comprised the analysis of cardiac autonomic modulation through the recording of R-R intervals. This analysis was performed using both nonlinear and linear heart rate variability (HRV) indices. In the statistical analysis, the level of significance was set to 5% (p = 0.05). Results: PLB promoted significant increases in the SD1, SD2, RMSSD and LF (ms²) indices, as well as an increase in alpha-1 and a reduction in alpha-2, in the COPD group. A greater dispersion of points on the Poincaré plots was also observed. The magnitude of the changes produced by PLB differed between groups. Conclusion: PLB led to a loss of fractal correlation properties of heart rate in the direction of linearity in patients with COPD, as well as an increase in vagal activity and an impact on the spectral analysis. The difference between groups in the magnitude of the changes produced by PLB may be related to the presence of the disease and alterations in the respiration rate.
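For readers unfamiliar with the indices named above, the following sketch computes RMSSD and the Poincaré descriptors SD1/SD2 from an R-R series using their standard definitions (synthetic data; this is not the study's analysis pipeline):

```python
import numpy as np

def hrv_time_and_poincare(rr_ms):
    """RMSSD and Poincare descriptors SD1/SD2 from R-R intervals in ms.

    Standard definitions: RMSSD is the root mean square of successive
    differences; SD1/SD2 are the dispersions perpendicular to and along
    the identity line of the Poincare plot (RR[n] vs RR[n+1]).
    """
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    rmssd = np.sqrt(np.mean(diff ** 2))
    sdnn = np.std(rr, ddof=1)
    sd1 = np.sqrt(0.5 * np.var(diff, ddof=1))
    sd2 = np.sqrt(max(2.0 * sdnn ** 2 - sd1 ** 2, 0.0))
    return rmssd, sd1, sd2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    rr = 800 + 50 * rng.standard_normal(300)  # synthetic R-R series, ms
    print(hrv_time_and_poincare(rr))
```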
Abstract:
The algorithm creates a buffer area around the cartographic features of interest in one of the images and compares it with the other image. During the comparison, the algorithm counts the matching and non-matching points and uses these counts to calculate the statistical measures of the analysis. One calculated measure is the correctness, which shows the user the percentage of points that were correctly extracted. Another is the completeness, which shows the percentage of points that really belong to the feature of interest. The third measure expresses the overall quality achieved by the extraction method, since it is computed from the previously calculated correctness and completeness. In all tests performed with this algorithm, the calculated statistical measures could be used to quantitatively represent the quality achieved by the extraction method. Thus, the developed algorithm can be used to analyze methods for extracting cartographic features of interest, since the results obtained were promising.
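The abstract does not give the formulas; the sketch below assumes the correctness/completeness/quality measures commonly used in feature-extraction evaluation (e.g., Heipke et al., 1997), expressed in terms of matched (TP), falsely extracted (FP), and missed (FN) point counts:

```python
def evaluation_measures(tp: int, fp: int, fn: int):
    """Correctness, completeness and quality from matched point counts.

    Commonly used definitions (whether the paper uses exactly these
    formulas is an assumption):
      correctness  = TP / (TP + FP)       fraction of extracted points that are correct
      completeness = TP / (TP + FN)       fraction of reference points that were found
      quality      = TP / (TP + FP + FN)  single score combining both
    """
    correctness = tp / (tp + fp)
    completeness = tp / (tp + fn)
    quality = tp / (tp + fp + fn)
    return correctness, completeness, quality

print(evaluation_measures(tp=850, fp=70, fn=120))  # illustrative counts
```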
Abstract:
The reverse Monte Carlo (RMC) method generates sets of points in space which yield radial distribution functions (RDFs) that approximate those of the system of interest. Such sets of configurations should, in principle, be sufficient to determine the structural properties of the system. In this work we apply the RMC technique to fluids of hard diatomic molecules. The experimental RDFs of the hard-dimer fluid were generated by the conventional MC method and used as input in the RMC simulations. Our results indicate that, using only a mono-variable RDF, the RMC method is satisfactory only in determining the local structure of the studied fluid. We also suggest that the use of multi-variable RDFs would improve the technique significantly. However, the accuracy of the method turned out to be very sensitive to the variance of the input experimental RDF. © 1995.
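A minimal sketch of one RMC move, assuming the standard chi-squared acceptance rule against a target RDF (the crude histogram estimator and all parameters are illustrative, not the paper's implementation):

```python
import numpy as np

def g_of_r(positions, box, bins):
    """Crude periodic RDF histogram (illustrative only)."""
    n = len(positions)
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)                     # minimum-image convention
    r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, 1)]
    hist, edges = np.histogram(r, bins=bins, range=(0, box / 2))
    shell = (4.0 / 3.0) * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    rho = n / box ** 3
    return hist / (shell * rho * n / 2)              # normalize by ideal-gas pairs

def chi2(g_calc, g_target, sigma=0.05):
    return np.sum((g_calc - g_target) ** 2) / sigma ** 2

def rmc_step(positions, g_target, box, bins, rng, delta=0.1):
    """Displace one particle; keep the move by a Metropolis-like rule on chi2."""
    i = rng.integers(len(positions))
    old = positions[i].copy()
    chi_old = chi2(g_of_r(positions, box, bins), g_target)
    positions[i] = (old + delta * rng.uniform(-1, 1, 3)) % box
    chi_new = chi2(g_of_r(positions, box, bins), g_target)
    if chi_new > chi_old and rng.random() >= np.exp(-(chi_new - chi_old) / 2):
        positions[i] = old                           # reject the move
    return positions

rng = np.random.default_rng(0)
box, n, bins = 10.0, 64, 40
target = g_of_r(rng.uniform(0, box, (n, 3)), box, bins)  # toy target RDF
pos = rng.uniform(0, box, (n, 3))
for _ in range(200):
    pos = rmc_step(pos, target, box, bins, rng)
```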
Abstract:
Vortex-induced motion (VIM) is a highly nonlinear dynamic phenomenon. Usual spectral analysis methods, using the Fourier transform, rely on the hypotheses of linear and stationary dynamics. A method for treating nonstationary signals that emerge from nonlinear systems is the Hilbert-Huang transform (HHT). The development of an analysis methodology to study the VIM of a monocolumn production, storage, and offloading system using the HHT is presented. The purpose of the present methodology is to improve the statistical analysis of VIM. The results proved comparable to those obtained from a traditional analysis (mean of the 10% highest peaks), particularly for the motions in the transverse direction, although the results for the motions in the in-line direction differed from the traditional analysis by around 25%. The results from the HHT analysis are more reliable than the traditional ones, owing to the larger number of points used to calculate the statistical characteristics. These results may be used to design risers and mooring lines, as well as to obtain VIM parameters to calibrate numerical predictions. [DOI: 10.1115/1.4003493]
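The sketch below illustrates only the Hilbert step of the HHT, applied to a single synthetic intrinsic mode function: the analytic signal yields instantaneous amplitude and frequency at every sample, which is why the statistics rest on many more points than peak counting (assumed workflow, not the authors' code):

```python
import numpy as np
from scipy.signal import hilbert

fs = 10.0                                   # sampling rate, Hz (illustrative)
t = np.arange(0, 600, 1 / fs)
# Synthetic amplitude-modulated IMF standing in for one EMD mode.
imf = np.sin(2 * np.pi * 0.05 * t) * (1 + 0.3 * np.sin(2 * np.pi * 0.005 * t))

analytic = hilbert(imf)
amplitude = np.abs(analytic)                # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase, 1 / fs) / (2 * np.pi)  # instantaneous frequency, Hz

# Every sample contributes to the statistics, unlike peak-based estimates.
print(f"mean amplitude: {amplitude.mean():.3f}")
print(f"mean instantaneous frequency: {inst_freq.mean():.4f} Hz")
```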
Abstract:
Effects of roads on wildlife and its habitat have been measured using metrics such as the nearest road distance, road density, and effective mesh size. In this work we introduce two new indices: (1) the Integral Road Effect (IRE), which measures the summed effects of the points of a road at a fixed point in the forest; and (2) the Average Value of the Infinitesimal Road Effect (AVIRE), which measures the average of the effects of roads at this point. IRE is formally defined as the line integral of a special function (the infinitesimal road effect) along the curves that model the roads, whereas AVIRE is the quotient of IRE by the length of the roads. Combining tools of ArcGIS software with a numerical algorithm, we calculated these and other road and habitat cover indices at a sample of points in a human-modified landscape in the Brazilian Atlantic Forest, where data on the abundance of two groups of small mammals (forest specialists and habitat generalists) were collected in the field. We then compared, through the Akaike Information Criterion (AIC), a set of candidate regression models to explain the variation in small mammal abundance, including models with our two new road indices (AVIRE and IRE), models with other road effect indices (nearest road distance, mesh size, and road density), and reference models (containing only habitat indices, or only the intercept without the effect of any variable). Compared to the other road effect indices, AVIRE showed the best performance in explaining the abundance of forest specialist species, whereas the nearest road distance performed best for generalist species. AVIRE and habitat together were included in the best model for both small mammal groups; that is, higher abundance of specialist and generalist small mammals occurred where there was a lower average road effect (less AVIRE) and more habitat. Moreover, unlike the other road effect indices except mesh size, AVIRE was not significantly correlated with the habitat cover of specialists and generalists, which allows the effect of roads to be separated from the effect of habitat on small mammal communities. We suggest that the proposed indices and GIS procedures could also be useful for describing other spatial ecological phenomena, such as edge effects in habitat fragments. (C) 2012 Elsevier B.V. All rights reserved.
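A numerical reading of the two definitions, with a hypothetical exponential-decay kernel standing in for the infinitesimal road effect (the abstract does not specify the kernel; all names and parameters are illustrative):

```python
import numpy as np

def road_effect(p, q, scale=100.0):
    """Hypothetical infinitesimal effect at forest point p of road point q."""
    return np.exp(-np.linalg.norm(p - q) / scale)

def ire_avire(p, road_vertices, step=1.0):
    """IRE as a discretized line integral along a polyline road; AVIRE = IRE / length."""
    p = np.asarray(p, float)
    v = np.asarray(road_vertices, float)
    total, length = 0.0, 0.0
    for a, b in zip(v[:-1], v[1:]):
        seg = np.linalg.norm(b - a)
        n = max(int(seg / step), 1)
        ts = (np.arange(n) + 0.5) / n            # midpoints of sub-segments
        pts = a + ts[:, None] * (b - a)
        total += sum(road_effect(p, q) for q in pts) * (seg / n)
        length += seg
    return total, total / length                 # IRE, AVIRE

road = [(0, 0), (500, 0), (500, 400)]            # toy road polyline, meters
print(ire_avire((250, 200), road))
```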
Abstract:
The concept of metacontingency was taught to undergraduate students of Psychology using a "game" simulation originally proposed by Vichi, Andery and Glenn (2009). Twenty-five students, distributed into three groups, were exposed to six experimental sessions in which they had to make bets and divide the amounts gained. The three groups competed against each other for photocopy quotas. Two contingencies alternated over the sessions. Under Contingency B, the group would win points only if in the previous round each member had received the same amount of points, whereas under Contingency A, winning was contingent on an unequal distribution of the points. We observed that proportional divisions predominated independently of the contingency in effect. The manipulation of cultural consequences (winning or losing points) produced consistent modifications in two response categories: 1) choices of the amount bet in each round, and 2) divisions of the points among group members. Controlling relations between cultural consequences and the behavior of dividing were statistically significant in one of the groups, whereas in the other two groups controlling relations were observed only under Contingency B. A review of the reinforcement criteria used in the original experiment is suggested.
Abstract:
The thesis deals with the modularity conjecture for three-dimensional Calabi-Yau varieties. This is a generalization of the work of A. Wiles and others on the modularity of elliptic curves. Modularity connects the number of points on varieties with the coefficients of certain modular forms. In chapter 1 we collect the basics on arithmetic on Calabi-Yau manifolds, including general modularity results and strategies for modularity proofs. In chapters 2, 3, 4 and 5 we investigate examples of modular Calabi-Yau threefolds, including all examples occurring in the literature and many new ones. Double octics, i.e., double coverings of projective 3-space branched along an octic surface, are studied in detail. In chapter 6 we deal with examples connected with the same modular forms; according to the Tate conjecture there should be correspondences between them, and many correspondences are constructed explicitly. We finish by formulating conjectures on the occurring newforms, especially their levels. In the appendices we compile tables of coefficients of weight 2 and weight 4 newforms and many examples of double octics.
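For orientation, the point-count/coefficient link can be made explicit in the simplest (rigid) case; the following is the standard formulation, which the thesis generalizes and may state differently in detail:

```latex
% Standard statement in the rigid case: for a rigid Calabi-Yau threefold X
% over Q, modularity means there is a weight-4 newform f of some level N
% (supported on the bad primes) such that, for all good primes p,
\[
  \operatorname{tr}\!\bigl(\mathrm{Frob}_p \,\big|\, H^3_{\mathrm{et}}(\bar X,\mathbb{Q}_\ell)\bigr)
  \;=\; a_p(f), \qquad f \in S_4(\Gamma_0(N)),
\]
% and this trace is read off from point counts via the Lefschetz formula
\[
  \#X(\mathbb{F}_p) \;=\; \sum_{i=0}^{6} (-1)^i
  \operatorname{tr}\!\bigl(\mathrm{Frob}_p \,\big|\, H^i_{\mathrm{et}}(\bar X,\mathbb{Q}_\ell)\bigr).
\]
```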
Abstract:
We have developed a method for locating sources of volcanic tremor and applied it to a dataset recorded on Stromboli volcano before and after the onset of the February 27th 2007 effusive eruption. Volcanic tremor has attracted considerable attention from seismologists because of its potential value as a tool for forecasting eruptions and for better understanding the physical processes that occur inside active volcanoes. Commonly used methods to locate volcanic tremor sources are: 1) array techniques, 2) semblance-based methods, and 3) calculation of the wave-field amplitude. We have chosen the third approach, using quantitative modeling of the seismic wavefield. For this purpose, we have calculated the Green's functions (GF) in the frequency domain with the Finite Element Method (FEM). We have used this method because it is well suited to solving elliptic problems, such as elastodynamics in the Fourier domain. The volcanic tremor source is located by determining the source function over a regular grid of points; the best-fit point is chosen as the tremor source location. The source inversion is performed in the frequency domain, using only the wavefield amplitudes. We illustrate the method and its validation on a synthetic dataset, and we show some preliminary results on the Stromboli dataset, evidencing temporal variations of the volcanic tremor sources.
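A sketch of the grid search as described, under the simplifying assumption of a scalar source strength per grid point at a single frequency, for which the least-squares fit is closed-form (my reconstruction; names and sizes are illustrative):

```python
import numpy as np

def locate_tremor(obs_amp, green_amp):
    """Grid-search source location from wavefield amplitudes.

    obs_amp:   (n_stations,) observed amplitudes at one frequency.
    green_amp: (n_grid, n_stations) precomputed Green's-function amplitudes.
    For each grid point k the best scalar strength is the closed-form
    least-squares solution s_k = <G_k, d> / <G_k, G_k>; the point with the
    smallest residual is taken as the tremor source location.
    """
    num = green_amp @ obs_amp
    den = np.sum(green_amp ** 2, axis=1)
    strength = num / den
    resid = np.linalg.norm(green_amp * strength[:, None] - obs_amp, axis=1)
    best = int(np.argmin(resid))
    return best, strength[best], resid[best]

rng = np.random.default_rng(1)
G = rng.random((1000, 12))     # toy Green's amplitudes: 1000 nodes, 12 stations
d = 2.5 * G[417] + 0.01 * rng.standard_normal(12)  # synthetic data from node 417
print(locate_tremor(d, G))     # should recover node 417, strength ~2.5
```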
Abstract:
The Scilla rock avalanche occurred on 6 February 1783 along the coast of the Calabria region (southern Italy), close to the Messina Strait. It was triggered by a mainshock of the Terremoto delle Calabrie seismic sequence, and it induced a tsunami wave responsible for more than 1500 casualties along the neighboring Marina Grande beach. The main goal of this work is the application of semi-analytical and numerical models to simulate this event. The first is a MATLAB code expressly created for this work that solves the equations of motion for sliding particles on a two-dimensional surface through a fourth-order Runge-Kutta method. The second is a code developed by the Tsunami Research Team of the Department of Physics and Astronomy (DIFA) of the University of Bologna that describes a slide as a chain of blocks able to interact while sliding down a slope, adopting a Lagrangian point of view. A broad description of landslide phenomena, and in particular of landslides induced by earthquakes and with tsunamigenic potential, is given in the first part of the work. Subsequently, the physical and mathematical background is presented; in particular, a detailed study of derivative discretization is provided. Later on, the dynamics of a point mass sliding on a surface is described, together with several applications of numerical and analytical models to ideal topographies. In the last part, the dynamics of points sliding on a surface and interacting with each other is presented, and, similarly, different applications to an ideal topography are shown. Finally, the applications to the 1783 Scilla event are shown and discussed.
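A minimal sketch of the fourth-order Runge-Kutta integration for a sliding point mass, reduced here to a 1-D constant slope with Coulomb friction for brevity (the thesis code is in MATLAB and treats a 2-D surface; all physical values below are illustrative):

```python
import numpy as np

G = 9.81                   # gravity, m/s^2
MU = 0.3                   # friction coefficient (illustrative)
THETA = np.deg2rad(25.0)   # constant slope angle (illustrative)

def rhs(y):
    """Time derivative of the state y = (position, velocity) along the slope."""
    x, v = y
    # Down-slope gravity minus Coulomb friction opposing the motion.
    a = G * (np.sin(THETA) - MU * np.cos(THETA) * np.sign(v if v else 1.0))
    return np.array([v, a])

def rk4_step(y, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(y)
    k2 = rhs(y + 0.5 * dt * k1)
    k3 = rhs(y + 0.5 * dt * k2)
    k4 = rhs(y + dt * k3)
    return y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

y, dt = np.array([0.0, 0.0]), 0.01
for _ in range(1000):      # 10 s of motion
    y = rk4_step(y, dt)
print(f"position {y[0]:.1f} m, velocity {y[1]:.1f} m/s after 10 s")
```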
Abstract:
This thesis studies the functorial approach to supergeometry. In particular, Grothendieck topologies are used to study the notion of representability in this context, in analogy with what is done in classical algebraic geometry. The Weil-Berezin functors and the Schwarz embedding are then introduced, motivating the links between these concepts and representability in the classical case.
Abstract:
The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) Is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each of the compartments must also be estimated by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random. If not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of this method.
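The RDI and chi-squared computations described above reduce to a few lines; the sketch below uses invented counts purely to illustrate the mechanics:

```python
import numpy as np
from scipy.stats import chisquare

# Illustrative counts, not data from the study. Expected particle counts
# follow compartment size, estimated from the fraction of test-grid points
# hitting each compartment; chi-squared then tests whether the observed
# distribution is random, and partial terms flag preferential compartments.

observed = np.array([120, 45, 15])       # particles counted per compartment
grid_hits = np.array([300, 150, 150])    # test-grid points per compartment

expected = observed.sum() * grid_hits / grid_hits.sum()
rdi = observed / expected                # RDI = 1 means "as expected by size"

chi2_total, p_value = chisquare(observed, expected)
partial = (observed - expected) ** 2 / expected  # per-compartment contributions

print("RDI:", np.round(rdi, 2))
print(f"total chi-squared = {chi2_total:.1f}, p = {p_value:.3g}")
print("partial chi-squared:", np.round(partial, 1))
```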