601 results for Estimators
Abstract:
Most state estimation algorithms based on the classical model are only adequate for transmission networks. Few algorithms have been developed specifically for distribution systems, probably because of the small amount of data available in real time. Most overhead feeders have only current and voltage measurements at the medium-voltage bus-bar of the substation, so classical algorithms are difficult to implement, even when off-line acquired data are used as pseudo-measurements. However, the need to automate the operation of distribution networks, mainly with regard to the selectivity of protection systems and to enable load transfer maneuvers, is changing network planning policy. Accordingly, equipment incorporating telemetry and command modules has been installed to improve operational features, increasing the amount of measurement data available in real time at the System Operation Center (SOC). This encourages the development of a state estimator model combining real-time information with pseudo-measurements of loads, built from typical power factors and utilization factors (demand factors) of distribution transformers. This work reports the development of a new state estimation method designed specifically for radial distribution systems. The main algorithm is based on the power summation load flow. The estimation is carried out piecewise, section by section of the feeder, from the substation to the terminal nodes. For each section, a measurement model is built, resulting in an overdetermined set of nonlinear equations whose solution is obtained through the Gaussian normal equations. The estimated variables of a section are then used as pseudo-measurements for the next section.
In general, the measurement set for a generic section consists of pseudo-measurements of power flows and nodal voltages obtained from the previous section (or real-time measurements, where they exist), together with pseudo-measurements of injected powers for the power summations, whose measurement functions are the load flow equations, assuming the network can be represented by its single-phase equivalent. The great advantage of the algorithm is its simplicity and low computational effort. Moreover, the algorithm yields accurate estimates. Besides the power summation state estimator, this work shows how other algorithms can be adapted to provide state estimation for medium-voltage substations and networks, namely Schweppe's method and an algorithm based on current proportionality that is usually adopted for network planning tasks. Both estimators were implemented not only as alternatives to the proposed method, but also to obtain results supporting its validation. Since in most cases no power measurement is performed at the beginning of the feeder, although one is required by the power summation estimation method, a new algorithm for estimating the network variables at the medium-voltage bus-bar was also developed.
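The core numerical step described above, solving each section's overdetermined set of nonlinear measurement equations through the Gaussian normal equations, can be sketched as a weighted Gauss-Newton iteration. The two-variable measurement model below is a hypothetical toy, not the thesis's actual feeder equations.

```python
import numpy as np

def gauss_newton_wls(h, jac, z, W, x0, tol=1e-8, max_iter=50):
    """Solve the overdetermined system z ~ h(x) in the weighted
    least-squares sense via the Gaussian normal equations
        (H^T W H) dx = H^T W (z - h(x)),
    iterating until the state update dx is negligible."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = z - h(x)            # measurement residuals
        H = jac(x)              # Jacobian of h at the current state
        dx = np.linalg.solve(H.T @ W @ H, H.T @ W @ r)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Hypothetical section model: two unknowns observed through three
# redundant nonlinear measurement functions.
h = lambda x: np.array([x[0] * x[1], x[0] ** 2, x[0] + x[1]])
jac = lambda x: np.array([[x[1], x[0]], [2 * x[0], 0.0], [1.0, 1.0]])
x_true = np.array([1.02, 0.15])         # e.g. voltage (p.u.) and angle (rad)
z = h(x_true)                           # noise-free pseudo-measurements
W = np.diag([1.0, 2.0, 1.0])            # measurement weights
x_hat = gauss_newton_wls(h, jac, z, W, x0=np.array([1.0, 0.0]))
```

With noise-free measurements the iteration recovers the true state; with redundant noisy measurements the weights W trade off measurement quality exactly as in classical weighted least-squares estimation.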
Abstract:
This work describes the study and implementation of speed control for a three-phase, 1.1 kW, 4-pole induction motor using neural rotor flux estimation. The vector speed control operates together with the controller of the stator phase winding currents. The neural flux estimation applied to the vector speed control aims to compensate for the parameter dependence of conventional estimators with respect to machine parameter variations caused by temperature increases or rotor magnetic saturation. The implemented control system allows a direct comparison between the speed control responses obtained with the machine oriented by the neural rotor flux estimator and by the conventional flux estimator. The entire control system is executed by a program developed in ANSI C. The main DSP resources used by the system are the analog/digital converter channels, the PWM outputs, and the parallel and RS-232 serial interfaces, which are responsible, respectively, for DSP programming and for data capture through the supervisory system.
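For reference, the conventional (current-model) rotor flux estimator that the neural network is meant to replace can be sketched as a simple discrete-time observer; the machine parameters used here are illustrative assumptions, not those of the 1.1 kW motor.

```python
def rotor_flux_step(psi, i_s, w_r, Lm, Tr, dt):
    """One forward-Euler step of the conventional current-model rotor
    flux estimator in the stationary alpha-beta frame:
        d psi / dt = (Lm/Tr) i_s + (j*w_r - 1/Tr) psi
    Its accuracy depends on Lm and the rotor time constant Tr, which
    drift with temperature and saturation (the dependence the neural
    estimator compensates for)."""
    psi_a, psi_b = psi
    ia, ib = i_s
    dpa = (Lm / Tr) * ia - psi_a / Tr - w_r * psi_b
    dpb = (Lm / Tr) * ib - psi_b / Tr + w_r * psi_a
    return (psi_a + dt * dpa, psi_b + dt * dpb)

# Illustrative (assumed) parameters: Lm = 0.4 H, Tr = 0.15 s.
Lm, Tr, dt = 0.4, 0.15, 1e-4
psi = (0.0, 0.0)
for _ in range(50000):                  # 5 s >> Tr: reach steady state
    psi = rotor_flux_step(psi, (5.0, 0.0), 0.0, Lm, Tr, dt)
# At standstill with constant i_s the flux settles at Lm * i_s.
```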
Abstract:
In this work, we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, and the approach proposed by Chen et al. (1999), in which covariates are introduced to model the risk amount. We focus on covariate measurement error, considering the corrected score method in order to obtain consistent estimators. A simulation study is carried out to evaluate the behavior of the estimators obtained by this method for finite samples. The simulation aims to identify the impact not only on the regression coefficients of the covariates measured with error (Mizoi et al. 2007) but also on the coefficients of covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, the model is applied to real data sets.
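Under the competing-risks structure above, the population survival function has the closed form S_pop(t) = exp(-theta * F(t)) with theta = exp(x'beta) (Chen et al. 1999), so the cure fraction is exp(-theta). A minimal sketch, with hypothetical covariate values and an assumed exponential latent-time distribution:

```python
import math

def pop_survival(t, beta, x, base_cdf):
    """Population survival of the promotion-time cure rate model:
        S_pop(t) = exp(-theta * F(t)),  theta = exp(x' beta),
    where F is the cdf of the latent event times and theta is the
    mean number of competing latent causes. The cure fraction is the
    limit S_pop(inf) = exp(-theta)."""
    theta = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return math.exp(-theta * base_cdf(t))

# Assumed for illustration: exponential latent times with rate 1.
expo_cdf = lambda t: 1.0 - math.exp(-t)
beta, x = [0.3, -0.2], [1.0, 2.0]            # hypothetical covariates
s0 = pop_survival(0.0, beta, x, expo_cdf)    # survival is 1 at t = 0
cure = pop_survival(1e9, beta, x, expo_cdf)  # approaches exp(-theta)
```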
Abstract:
In this work we study the survival cure rate model proposed by Yakovlev (1993), considered in a competing risks setting. Covariates are introduced to model the cure rate, and we allow some covariates to have missing values. We consider only the cases in which the missing covariates are categorical, and we implement the EM algorithm via the method of weights for maximum likelihood estimation. We present a Monte Carlo simulation experiment to compare the properties of the estimators based on this method with those of estimators under the complete-case scenario. In this experiment we also evaluate the impact on the parameter estimates when the proportion of immune individuals, and of censored individuals among the non-immune ones, increases. We demonstrate the proposed methodology with a real data set involving the time until graduation for undergraduates in Statistics at the Universidade Federal do Rio Grande do Norte.
Abstract:
In this work we study the asymptotic unbiasedness and the strong and uniform strong consistency of a class of kernel estimators fn of a density function f on a k-dimensional sphere.
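As a concrete instance for k = 2 (the unit sphere in R^3), a kernel estimator fn can be built from a von Mises-Fisher kernel, a common choice for directional data in which the concentration parameter kappa plays the role of the bandwidth. This is a sketch under that assumption, not the thesis's exact kernel class:

```python
import numpy as np

def vmf_kde(x, samples, kappa):
    """Kernel density estimate on the unit sphere S^2 in R^3 with a
    von Mises-Fisher kernel of concentration kappa:
        f_n(x) = (1/n) sum_i C(kappa) * exp(kappa * x . X_i),
    where C(kappa) = kappa / (4 pi sinh(kappa)) normalizes each
    kernel to integrate to 1 over the sphere."""
    c = kappa / (4.0 * np.pi * np.sinh(kappa))
    return np.mean(c * np.exp(kappa * samples @ x))

# A single observation at the north pole: the estimate is largest
# there and nearly zero at the antipode.
north = np.array([0.0, 0.0, 1.0])
samples = north.reshape(1, 3)
f_peak = vmf_kde(north, samples, kappa=10.0)
f_far = vmf_kde(-north, samples, kappa=10.0)
```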
Abstract:
In this work, the paper by Campos and Dorea [3] is examined in detail. In that article, a kernel estimator was applied to a sequence of independent and identically distributed random variables with general state space. In chapter 2, the estimator's properties, such as asymptotic unbiasedness, consistency in quadratic mean, strong consistency and asymptotic normality, are verified. In chapter 3, numerical experiments using the R software are presented in order to give a visual idea of the estimation process.
Abstract:
The main objective of this work is to find mathematical models, based on linear parametric estimation techniques, for the problem of calculating the gas flow in oil wells. In particular, we focus on obtaining flow models for wells that produce by the plunger-lift technique, in which case there are high peaks in the flow values that hinder their direct measurement by instruments. To this end, we developed estimators based on recursive least squares and analyze statistical measures such as the autocorrelation, cross-correlation, variogram and cumulative periodogram, which are computed recursively as data are obtained in real time from the plant in operation; the values obtained for these measures indicate how accurate the current model is and how it can be changed to better fit the measured values. The models were tested in a pilot plant that emulates the gas production process in oil wells.
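The recursive least squares estimator mentioned above updates the parameter vector one sample at a time, which is what allows the model and its diagnostic measures to be tracked in real time. A minimal sketch with a hypothetical noise-free data stream (the plant models themselves are not reproduced here):

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive least squares update with forgetting factor lam:
        K     = P phi / (lam + phi' P phi)      (gain vector)
        theta = theta + K (y - phi' theta)      (parameter update)
        P     = (P - K phi' P) / lam            (covariance update)
    """
    phi = phi.reshape(-1, 1)
    K = P @ phi / (lam + (phi.T @ P @ phi).item())
    theta = theta + (K * (y - (phi.T @ theta).item())).ravel()
    P = (P - K @ phi.T @ P) / lam
    return theta, P

# Hypothetical stream: y = 2*u + 1 observed sample by sample.
theta = np.zeros(2)
P = 1e6 * np.eye(2)                     # large P: uninformative start
for u in [0.0, 1.0, 2.0, 3.0, 4.0]:
    theta, P = rls_update(theta, P, np.array([u, 1.0]), 2.0 * u + 1.0)
```

The forgetting factor lam < 1 down-weights old samples, which is useful when the flow dynamics drift over time.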
Abstract:
In this work we study Hidden Markov Models with finite as well as general state space. In the finite case, the forward and backward algorithms are considered and the probability of a given observed sequence is computed. Next, we use the EM algorithm to estimate the model parameters. In the general case, kernel estimators are used to build a sequence of estimators that converges in L1-norm to the density function of the observable process.
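In the finite case, the forward algorithm computes the probability of an observed sequence in O(T n^2) time by propagating the forward variables; a minimal sketch with a hypothetical two-state model:

```python
def forward(pi, A, B, obs):
    """Forward algorithm for a finite-state HMM: returns P(obs) by
    recursively propagating the forward variables
        alpha_t(j) = sum_i alpha_{t-1}(i) * A[i][j] * B[j][obs_t],
    initialized with alpha_1(i) = pi[i] * B[i][obs_1]."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

# Hypothetical 2-state model with 2 observation symbols.
pi = [0.5, 0.5]                          # initial distribution
A = [[0.9, 0.1], [0.2, 0.8]]             # transition matrix
B = [[0.7, 0.3], [0.4, 0.6]]             # emission probabilities
p = forward(pi, A, B, [0, 1, 0])         # probability of the sequence
```

The backward algorithm is the mirror-image recursion; together they supply the posterior quantities needed by the EM (Baum-Welch) parameter updates.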
Abstract:
Nonparametric simple-contrast estimates for one-way layouts based on Hodges-Lehmann estimators for two samples, and confidence intervals for all contrasts involving only two treatments, are found in the literature. Tests for such contrasts are performed from the distribution of the maximum of the rank sum between two treatments. For randomized block designs, simple contrast estimates based on Hodges-Lehmann estimators for one sample are presented. However, discussions concerning the significance levels of more complex contrast tests in nonparametric statistics are not well outlined. This work presents a methodology for obtaining p-values for contrasts of any type, based on the construction of the permutations required by each design model using a C-language program for each design type. For small samples, all possible treatment configurations are enumerated in order to obtain the desired p-value. For large samples, a fixed number of random configurations is used. The program prompts for the contrast coefficients, but does not assume the existence of, or orthogonality among, them. For orthogonal contrasts, the value of the suitable statistic for each case is decomposed, and it is observed that the same procedure used in the parametric analysis of variance can be applied in the nonparametric case, that is, each of the orthogonal contrasts has a chi-square distribution with one degree of freedom. The similarities between the p-values obtained for nonparametric contrasts and those obtained through approximations suggested in the literature are also discussed.
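The small-sample strategy described above, enumerating every possible treatment configuration, can be sketched for the simplest case of a two-treatment contrast in a one-way layout. The original work uses a C program and rank-based statistics; this sketch uses an assumed mean-difference statistic purely for illustration:

```python
from itertools import combinations

def exact_perm_pvalue(a, b):
    """Exact permutation p-value for a two-treatment contrast:
    enumerate every split of the pooled data into groups of the
    original sizes and count splits whose absolute mean difference
    is at least the observed one."""
    pooled = a + b
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    count, total = 0, 0
    for idx in combinations(range(len(pooled)), len(a)):
        ga = [pooled[i] for i in idx]
        gb = [pooled[i] for i in range(len(pooled)) if i not in idx]
        diff = abs(sum(ga) / len(ga) - sum(gb) / len(gb))
        total += 1
        if diff >= observed - 1e-12:     # tolerance for float ties
            count += 1
    return count / total

# Hypothetical well-separated samples: C(6,3) = 20 configurations,
# only the observed split and its mirror are as extreme.
p = exact_perm_pvalue([1.0, 2.0, 3.0], [10.0, 11.0, 12.0])  # 2/20 = 0.1
```

For large samples, the same loop would run over a fixed number of random configurations instead of the full enumeration.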
Abstract:
Phytophthora nicotianae was added to pasteurized soil at the rate of 500 laboratory-produced chlamydospores per gram of soil and exposed to temperatures ranging from 35 to 53°C for 20 days. The time required to reduce soil populations to residual levels (0.2 propagule per gram of soil or less) decreased with increasing temperatures. Addition of cabbage residue to the soil reduced the time required to inactivate chlamydospores. Temperature regimes were established to simulate daily temperature changes observed in the field, with a high temperature of 47°C for 3 h/day, and were good estimators of the efficacy of soil solarization for the control of P. nicotianae in soil. Cabbage amendment reduced the time required to inactivate chlamydospores of P. nicotianae, and its effect was more pronounced at lower temperature regimes.
Abstract:
We review the basic hypotheses which motivate the statistical framework used to analyze the cosmic microwave background, and how that framework can be enlarged as we relax those hypotheses. In particular, we try to separate as much as possible the questions of gaussianity, homogeneity, and isotropy from each other. We focus both on isotropic estimators of nongaussianity and on statistically anisotropic estimators of gaussianity, with particular emphasis on their signatures and on the enhanced cosmic variances that become increasingly important as our putative Universe becomes less symmetric. After reviewing the formalism behind some simple model-independent tests, we discuss how these tests can be applied to CMB data when searching for large-scale anomalies. Copyright © 2010 L. Raul Abramo and Thiago S. Pereira.
Abstract:
In most studies on beef cattle longevity, only the cows reaching a given number of calvings by a specific age are considered in the analyses. With the aim of evaluating all cows with productive life in herds, taking into consideration the different forms of management on each farm, it was proposed to measure cow longevity from age at last calving (ALC), that is, the most recent calving registered in the files. The objective was to characterize this trait in order to study the longevity of Nellore cattle, using the Kaplan-Meier estimators and the Cox model. The covariables and class effects considered in the models were age at first calving (AFC), year and season of birth of the cow and farm. The variable studied (ALC) was classified as presenting complete information (uncensored = 1) or incomplete information (censored = 0), using the criterion of the difference between the date of each cow's last calving and the date of the latest calving at each farm. If this difference was >36 months, the cow was considered to have failed. If not, this cow was censored, thus indicating that future calving remained possible for this cow. The records of 11 791 animals from 22 farms within the Nellore Breed Genetic Improvement Program ('Nellore Brazil') were used. In the estimation process using the Kaplan-Meier model, the variable of AFC was classified into three age groups. In individual analyses, the log-rank test and the Wilcoxon test in the Kaplan-Meier model showed that all covariables and class effects had significant effects (P < 0.05) on ALC. In the analysis considering all covariables and class effects, using the Wald test in the Cox model, only the season of birth of the cow was not significant for ALC (P > 0.05). This analysis indicated that each month added to AFC diminished the risk of the cow's failure in the herd by 2%. Nonetheless, this does not imply that animals with younger AFC had less profitability. 
Cows with greater numbers of calvings were more precocious than those with fewer calvings. Copyright © The Animal Consortium 2012.
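The Kaplan-Meier product-limit estimator used above can be sketched as follows; the four (time, event) pairs are hypothetical, with event = 1 marking a failure under the >36-month criterion and 0 a censored cow that may still calve:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator: at each distinct failure
    time t the survival curve is multiplied by (n_t - d_t) / n_t,
    where n_t is the number of subjects still at risk and d_t the
    number of failures at t; censored subjects simply leave the risk
    set. Returns the (time, survival) steps at each failure time."""
    data = sorted(zip(times, events))
    n = len(data)
    s, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # failures at t
        at_risk = n - i                            # still at risk
        if d > 0:
            s *= (at_risk - d) / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)   # skip past ties
    return curve

# Tiny illustrative sample: hypothetical ages at last calving (months).
curve = kaplan_meier([96, 120, 120, 132], [1, 1, 0, 1])
```

Log-rank and Wilcoxon tests then compare such curves across the covariable classes, as done in the individual analyses above.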