48 results for "statistical softwares"
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Anaerobic threshold (AT) is usually estimated, as a change-point problem, by visual analysis of the cardiorespiratory response to incremental dynamic exercise. In this study, two-phase linear (TPL) models of the linear-linear and linear-quadratic type were used to estimate AT. The correlation coefficient between the classical and statistical approaches was 0.88, and 0.89 after outlier exclusion. The TPL models provide a simple method for estimating AT that can be easily implemented on a digital computer for automatic pattern recognition of AT.
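As a rough illustration of how such a model can be fitted automatically, the sketch below performs a linear-linear two-phase fit by scanning candidate breakpoints and keeping the split with the smallest total squared error (Python; the synthetic data and the grid-search strategy are illustrative assumptions, not the authors' implementation):

    import numpy as np

    def fit_two_phase_linear(t, y):
        """Linear-linear two-phase fit: scan candidate breakpoints and keep
        the one whose two straight-line fits minimize the total SSE."""
        best = None
        for i in range(3, len(t) - 3):            # keep >= 3 points per segment
            X1 = np.column_stack([np.ones(i), t[:i]])
            X2 = np.column_stack([np.ones(len(t) - i), t[i:]])
            b1, r1, *_ = np.linalg.lstsq(X1, y[:i], rcond=None)
            b2, r2, *_ = np.linalg.lstsq(X2, y[i:], rcond=None)
            sse = r1.sum() + r2.sum()
            if best is None or sse < best[0]:
                best = (sse, t[i], b1, b2)
        return best   # (SSE, breakpoint, left-line coeffs, right-line coeffs)

    # Synthetic cardiorespiratory-like response with a slope change at t = 6
    t = np.linspace(0.0, 10.0, 60)
    y = np.where(t < 6, 10 + 1.5 * t, 19 + 4.0 * (t - 6))
    y = y + np.random.default_rng(0).normal(0.0, 0.3, t.size)
    print("estimated breakpoint (AT candidate):", round(fit_two_phase_linear(t, y)[1], 2))

The linear-quadratic variant follows the same pattern, with the second design matrix augmented by a quadratic column.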
Abstract:
The distribution of short-circuit current is investigated by means of two methods, one direct and the other analytic; both methods assume a uniform probability distribution of line faults. In the direct method, the procedure consists of calculating fault currents at equidistant points along the line, starting from one of the end points and considering the other end open. The magnitude of the current is classified according to Brazilian standards (regulation NBR-7118). The analytic method assumes that the distribution of short-circuit currents through the busbar and the distribution of the line length connected to it are known, as well as the independence of these values. The method is designed to determine the probability that fault currents through a line will surpass a pre-established magnitude, thus generating frequency distribution curves of short-circuit currents along the lines.
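To make the direct method concrete, here is a minimal sketch under a simple radial-feeder assumption (three-phase bolted fault; the voltage and impedance values below are illustrative, not from the paper): the fault current is computed at equidistant points from the busbar end, with the far end open, and the magnitudes are binned into a frequency distribution.

    import numpy as np

    # Illustrative parameters (assumed, not from the paper)
    V = 13800 / np.sqrt(3)      # phase-to-neutral voltage, V
    Zs = complex(0.5, 2.0)      # Thevenin source impedance at the busbar, ohm
    z = complex(0.2, 0.4)       # line impedance per km, ohm/km
    L = 20.0                    # line length, km

    # Direct method: fault current at equidistant points along the line,
    # starting at the busbar end with the far end open.
    x = np.linspace(0.0, L, 201)
    I_fault = np.abs(V / (Zs + z * x))

    # Frequency distribution of fault-current magnitudes, assuming
    # faults are uniformly likely along the line.
    counts, edges = np.histogram(I_fault, bins=10)
    for c, lo, hi in zip(counts, edges[:-1], edges[1:]):
        print(f"{lo:7.0f}-{hi:7.0f} A: {c / counts.sum():.2%}")

The analytic method would instead start from the busbar fault-current distribution and the line-length distribution; only the direct method is sketched here.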
Abstract:
The sedimentary Curitiba Basin is located in the central-southern part of the first Paranaense plateau and comprises Curitiba (PR) and part of the neighbouring municipalities (fig. 1). It is assumed to be of Plio-Pleistocene age. It has a shallow sedimentary infill, represented by the Guabirotuba Formation (BIGARELLA and SALAMUNI, 1962), which is distributed over a large area of about 3,000 km². Its internal geometry, not yet entirely known, is currently the object of detailed research relating its geological evolution to Cenozoic tectonic movements. For the purpose of this study, the definition of the structural contour of the basement and of its depocenters is fundamental. This paper presents the results of the integration of surface and subsurface data, processed by statistical methods, which allowed a more precise definition of the morphostructural framework of the basement. For the analysis of the geological spatial data, specific software was used for statistical processing and trend-surface analysis. The data used in this study are of the following types: a) drilling logs for groundwater; b) descriptions of surface points from geological maps (CRPM, 1977); c) descriptions of points from geotechnical drillings and geological surveys. The data of 223 drilling logs for groundwater were selected out of 770 wells. The description files of 700 outcrops, as well as planialtimetric field data, were used for the localization of the basement outcrop. Thus, a matrix with five columns was set up: UTM E-W (x) and UTM N-S (y); surface altitude (z); altimetric elevation of the contact between the sedimentary rocks and the basement (k); isopachs (l). For the study of the basement limits, trend-surface analyses with 2nd- and 3rd-degree polynomials were applied to the altimetric data (figs. 2 and 3). For the residuals, the inverse-distance-squared method was used (fig. 4). The adjustments and explanations of the surfaces were made with the aid of multiple linear regressions. The 3rd-degree polynomial trend-surface analysis (fig. 3) confirmed that the basement tends to be more exposed towards NNW-SSE, explaining the data trend better through an ellipse whose NE-SW-striking, SW-dipping axis coincides with the trough of the basin observed in the trend surface of the basement. The analyses performed and the respective images give a good degree of certainty to the geometric model of the Curitiba Basin and to the morphostructure of its basement. The trend surfaces allow sketching, with a greater degree of confidence, the structural contour of the topographic surface (figs. 5 and 6) and of the basement (figs. 7 and 8), as well as the delimitation of intermediate structural highs, which were responsible for isolated and asymmetric depocenters. These details are shown in the maps of figures 9 and 10. Thus, the Curitiba Basin is made up of a structural trough stretching NE-SW, with maximum preserved depths of about 80 m, separated by highs and depocenters striking NW-SE (fig. 11). These structural features seem to have been controlled by tectonic reactivation during the Tertiary (HASUI, 1990), and their younger dissection was conditioned by neotectonic processes (SALAMUNI and EBERT, 1994).
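A minimal sketch of the trend-surface workflow described above (Python, with synthetic well data standing in for the 223 drilling logs; not the authors' software): a 2nd-degree polynomial surface is fitted by least squares, and the residuals are interpolated by the inverse-distance-squared method.

    import numpy as np

    def trend_surface_2nd(x, y, z):
        """Fit a 2nd-degree polynomial trend surface
        z = a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2 by least squares."""
        A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
        coef, *_ = np.linalg.lstsq(A, z, rcond=None)
        return coef, A @ coef              # coefficients and fitted values

    def idw_residuals(xk, yk, rk, xg, yg, power=2.0):
        """Interpolate residuals at grid points by inverse distance weighting."""
        d2 = (xg[:, None] - xk) ** 2 + (yg[:, None] - yk) ** 2
        w = 1.0 / np.maximum(d2, 1e-12) ** (power / 2.0)
        return (w * rk).sum(axis=1) / w.sum(axis=1)

    rng = np.random.default_rng(1)
    x, y = rng.uniform(0, 50, 223), rng.uniform(0, 50, 223)   # synthetic wells
    z = 900 - 0.8 * x + 0.5 * y - 0.01 * x * y + rng.normal(0, 5, 223)
    coef, fitted = trend_surface_2nd(x, y, z)
    residuals = z - fitted
    grid = idw_residuals(x, y, residuals, np.array([25.0]), np.array([25.0]))
    print("trend coefficients:", np.round(coef, 3), "| residual at (25, 25):", grid)

The 3rd-degree surface follows by adding cubic columns (x³, x²y, xy², y³) to the design matrix.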
Abstract:
The strangeness content of the nucleon is determined from a statistical model using confined quark levels, and is shown to be in good agreement with the corresponding values extracted from experimental data. The quark levels are generated by a Dirac equation with a linear confining potential (scalar plus vector). With the requirement that the result for the Gottfried sum rule violation, given by the New Muon Collaboration (NMC), is well reproduced, we also obtain the difference between the structure functions of the proton and neutron, and the corresponding sea-quark contributions.
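In models of this type, the single-quark levels come from a Dirac equation with mixed scalar and vector potentials; one common form (the equal-mixing choice and the coupling symbol λ below are illustrative assumptions, not taken from the abstract) is

    \left[\,\boldsymbol{\alpha}\cdot\mathbf{p} + \beta\bigl(m + S(r)\bigr) + V(r)\right]\psi = E\,\psi,
    \qquad S(r) = V(r) = \tfrac{1}{2}\lambda r,

so that the combined confining term grows linearly with r.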
Abstract:
The nearest-neighbor spacing distributions proposed by four models, namely, the Berry-Robnik, Caurier-Grammaticos-Ramani, Lenz-Haake, and the deformed Gaussian orthogonal ensemble, as well as the ansatz by Brody, are applied to the transition between chaos and order that occurs in the isotropic quartic oscillator. The advantages and disadvantages of these five descriptions are discussed. In addition, the results of a simple extension of the expression for the Dyson-Mehta statistic Δ3 are compared with those of a more popular one, usually associated with the Berry-Robnik formalism. ©1999 The American Physical Society.
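For reference, the Brody ansatz mentioned above is a one-parameter interpolation between the Poisson (regular, β = 0) and Wigner-GOE (chaotic, β = 1) spacing distributions; its standard form is

    P_\beta(s) = (\beta + 1)\, b\, s^{\beta} \exp\!\left(-b\, s^{\beta + 1}\right),
    \qquad b = \left[\Gamma\!\left(\frac{\beta + 2}{\beta + 1}\right)\right]^{\beta + 1}.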
Abstract:
This paper presents the results obtained from the application of artificial neural networks and statistical tools to the automatic identification and classification of faults in electric power distribution systems. The techniques developed to treat the proposed problem combine, in an integrated way, several approaches that contribute to reliable and safe fault detection. The compilation of results from practical experiments carried out on a pilot distribution feeder demonstrates that the developed techniques provide accurate results, efficiently identifying and classifying the several fault occurrences observed in the feeder. © 2006 IEEE.
Abstract:
Structural health monitoring (SHM) is concerned with the ability to monitor the state of aerospace, civil, and mechanical systems and to decide the level of damage or deterioration within them. In this sense, this paper deals with the application of a two-step auto-regressive and auto-regressive with exogenous inputs (AR-ARX) model for linear prediction in damage diagnosis of structural systems. This damage detection algorithm is based on the monitoring of residual errors as damage-sensitive indexes, obtained through vibration response measurements. In complex structures there are many positions under observation and a large amount of data to be handled, making visualization of the signals difficult. This paper also investigates data compression by using principal component analysis. In order to establish a threshold value, fuzzy c-means clustering is used to quantify the damage-sensitive index in an unsupervised learning mode. Tests are made on a benchmark problem proposed by the IASC-ASCE with different damage patterns. The diagnosis obtained showed high correlation with the actual integrity state of the structure. Copyright © 2007 by ABCM.
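A simplified sketch of the residual-based idea (a single AR model rather than the paper's two-step AR-ARX procedure; the signals are synthetic): an AR model is fitted to the healthy response, and the standard deviation of its one-step prediction residuals serves as the damage-sensitive index.

    import numpy as np

    def ar_fit(x, p):
        """Least-squares AR(p) fit: x[t] ~ sum_k a_k * x[t-k]."""
        X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
        a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return a

    def residual_index(x, a):
        """Damage-sensitive index: std of one-step prediction residuals."""
        p = len(a)
        X = np.column_stack([x[p - k : len(x) - k] for k in range(1, p + 1)])
        return np.std(x[p:] - X @ a)

    rng = np.random.default_rng(2)
    t = np.arange(2000)
    healthy = np.sin(0.2 * t) + 0.1 * rng.normal(size=t.size)
    damaged = np.sin(0.25 * t) + 0.1 * rng.normal(size=t.size)  # shifted dynamics
    a = ar_fit(healthy, p=8)
    print("baseline index:", residual_index(healthy, a))
    print("test index:    ", residual_index(damaged, a))

A rise of the index above a threshold (set in the paper via fuzzy c-means clustering) indicates a change in the underlying dynamics.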
Abstract:
Given that the total amount of losses in a distribution system is known, and with a reliable methodology for technical loss calculation, the non-technical losses can be obtained by subtraction. A usual method for calculating technical losses in electric utilities uses two important factors: the load factor and the loss factor. The load factor is usually obtained from energy and demand measurements, whereas computing the loss factor requires knowledge of the demand and energy losses, which are not, in general, amenable to direct measurement. In this work, a statistical analysis of the relationship between these factors is presented, using the load curves of a sample of consumers of a specific company. These curves are summarized in different bands of the coefficient k, making it possible to determine where each group of consumers has its major concentration of points. ©2008 IEEE.
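For context, the coefficient k above is presumably the weighting in the classical empirical relation between the loss factor F_s and the load factor F_c (this standard form is quoted here; the abstract itself does not spell it out):

    F_s = k\,F_c + (1 - k)\,F_c^{2}, \qquad 0 \le k \le 1,

so each band of k corresponds to a different balance between the linear and quadratic terms of the relation.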
Abstract:
A statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d̄/ū, and also the ratio d/u, inside the nucleon. After adjusting a temperature parameter by the Gottfried sum rule violation, and chemical potentials by the valence up and down quark normalizations, the results are compared with the available experimental data. © 2009 American Institute of Physics.
Abstract:
An improved statistical quark model, with quark energy levels given by a central linear confining potential, is used to obtain the light sea-quark asymmetry, d̄/ū, and also for the corresponding difference d̄-ū, inside the nucleon. In the model, a temperature parameter is adjusted by recent results obtained for the Gottfried sum rule violation, with two chemical potentials adjusted by the valence up and down quark normalizations. The results are compared with available recent experimental data. © 2010 American Institute of Physics.
ANN statistical image recognition method for computer vision in agricultural mobile robot navigation
Abstract:
The main application area of this project is the deployment of image processing and segmentation techniques in computer vision, through an omnidirectional vision system, for agricultural mobile robots (AMR) used in trajectory navigation and localization problems. Computational methods based on the JSEG algorithm were used to provide the classification and characterization of such problems, together with Artificial Neural Networks (ANN) for image recognition. Hence, it was possible to run simulations and analyze the performance of the JSEG image segmentation technique on Matlab/Octave computational platforms, along with the application of a customized Back-propagation Multilayer Perceptron (MLP) algorithm and statistical methods as structured heuristic methods in a Simulink environment. Once these procedures were done, it was practicable to classify and characterize the HSV color space segments and to recognize the segmented images, with reasonably accurate results. © 2010 IEEE.
Abstract:
We consider some existing relativistic models for the nucleon structure functions that rely on statistical approaches instead of perturbative ones. These models are based on the Fermi-Dirac distribution for the confined quarks, where a density of energy levels is obtained from an effective confining potential. In this context, we present some results obtained with a recent statistical quark model for the sea-quark asymmetry in the nucleon. It is shown, within this model, that experimentally available observables, such as the ratio and the difference between proton and neutron structure functions, are quite well reproduced with just three parameters: two chemical potentials used to reproduce the valence up and down quark numbers in the nucleon, and a temperature used to reproduce the Gottfried sum rule violation. © 2010 American Institute of Physics.
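In statistical quark models of this kind, the quark and antiquark occupations of a level of energy E are Fermi-Dirac factors (a standard form, quoted here for context; the precise normalizations vary between papers):

    n_q(E) = \frac{1}{e^{(E - \mu_q)/T} + 1}, \qquad
    n_{\bar q}(E) = \frac{1}{e^{(E + \mu_q)/T} + 1},

with μ_u and μ_d fixed by the valence up and down quark numbers and T by the Gottfried sum rule violation.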
Abstract:
Computers and network services have become a guaranteed presence in many places. This has resulted in the growth of illicit events, and computer and network security has therefore become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, many difficulties are found in trying to monitor a large network environment. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a time frame best suited to the application. © 2010 Springer-Verlag Berlin Heidelberg.
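The abstract does not specify the statistical method, so the sketch below shows one common choice as an assumption: a moving-window z-score over per-minute flow counts exported via NetFlow, flagging minutes that deviate strongly from recent history.

    import numpy as np

    def flag_anomalies(flows_per_minute, window=60, threshold=3.0):
        """Flag minutes whose flow count deviates more than `threshold`
        standard deviations from the trailing-window mean."""
        x = np.asarray(flows_per_minute, dtype=float)
        flags = []
        for i in range(window, len(x)):
            mu, sigma = x[i - window:i].mean(), x[i - window:i].std()
            if sigma > 0 and abs(x[i] - mu) / sigma > threshold:
                flags.append(i)
        return flags

    rng = np.random.default_rng(3)
    traffic = rng.poisson(1000, 240).astype(float)   # per-minute flow counts
    traffic[200] = 5000                              # injected burst, e.g. a scan
    print("anomalous minutes:", flag_anomalies(traffic))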
Abstract:
Botryosphaeria rhodina MAMB-05 produced β-1,3-glucanases and botryosphaeran when grown on glucose, while Trichoderma harzianum Rifai produced only the enzyme. A comparison of long-term cultivation (300 h) by B. rhodina demonstrated a correlation between the formation of botryosphaeran (48 h) and its consumption (after 108 h), and de-repression of β-1,3-glucanase synthesis when glucose was depleted from the nutrient medium, whereas for T. harzianum enzyme production commenced during exponential growth. Growth profiles and levels of β-1,3-glucanases produced by the two fungi on botryosphaeran also differed, as did the production of β-1,3-glucanases and β-1,6-glucanases on glucose, lactose, laminarin, botryosphaeran, lasiodiplodan, curdlan, Brewer's yeast powder and lyophilized fungal mycelium, which depended upon the carbon source used. A statistical mixture design used to optimize β-1,3-glucanase production by both fungi evaluated botryosphaeran, glucose and lactose concentrations as variables. For B. rhodina, glucose and lactose promoted enzyme production at the same levels (2.30 U mL⁻¹), whereas botryosphaeran added to these substrates exerted a synergistic effect favourable to β-glucanase production by T. harzianum (4.25 U mL⁻¹). © 2010 Elsevier B.V.
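The abstract names three mixture components but not the design itself; as one standard possibility, the sketch below generates a {3, 2} simplex-lattice mixture design (pure blends plus binary midpoints) for the three substrates.

    from itertools import product

    def simplex_lattice(components, m):
        """Generate a {q, m} simplex-lattice mixture design: all component
        proportions in {0, 1/m, ..., 1} that sum to 1."""
        q = len(components)
        points = []
        for levels in product(range(m + 1), repeat=q):
            if sum(levels) == m:
                points.append({c: l / m for c, l in zip(components, levels)})
        return points

    # Three mixture components from the abstract; m = 2 gives the
    # classic 6-run design (pure blends plus binary midpoints).
    for run in simplex_lattice(["botryosphaeran", "glucose", "lactose"], m=2):
        print(run)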