12 results for ESTIMATOR
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The autoregressive (AR) estimator, a non-parametric method, is used to analyze functional magnetic resonance imaging (fMRI) data. The same method has been used successfully in several other kinds of time series analysis. It uses only the available experimental data points to estimate the most plausible power spectrum compatible with the data, with no need for assumptions about non-measured points. The time series obtained from fMRI block-paradigm data are analyzed by the AR method to determine the brain regions active in the processing of a given stimulus. This method is considerably more reliable than the fast Fourier transform or parametric methods. The time series corresponding to each image pixel is analyzed using the AR estimator and the corresponding poles are obtained. The pole distribution gives the shape of the power spectrum, and pixels with poles at the stimulation frequency are taken to be the active regions. The method was applied to simulated and real data; its superiority is demonstrated by receiver operating characteristic curves obtained from the simulated data.
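As a rough illustration of the pole-based test described above, the sketch below fits an AR model to one pixel's time series via the Yule-Walker equations and flags the pixel as active when a strong pole sits near the stimulation frequency. The model order, tolerances, and activation rule are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np

def ar_poles(x, order=8):
    """Fit an AR(order) model by the Yule-Walker equations; return its poles."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
    # Solve R a = r[1:], with R the Toeplitz autocorrelation matrix
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])
    # Poles: roots of z^p - a_1 z^(p-1) - ... - a_p
    return np.roots(np.concatenate(([1.0], -a)))

def is_active(x, f_stim, fs, order=8, f_tol=0.01, mag_tol=0.9):
    """Flag a pixel active if a near-unit-circle pole lies at the stimulation frequency."""
    poles = ar_poles(x, order)
    freqs = np.angle(poles) * fs / (2 * np.pi)  # pole frequencies in Hz
    return bool(np.any((np.abs(freqs - f_stim) < f_tol) & (np.abs(poles) > mag_tol)))
```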
Abstract:
This work proposes the development of an Adaptive Neuro-Fuzzy Inference System (ANFIS) estimator applied to speed control in a three-phase induction motor sensorless drive. ANFIS is usually used to replace the traditional PI controller in induction motor drives; evaluating the estimation capability of ANFIS in a sensorless drive is one of the contributions of this work. The ANFIS speed estimator is validated in a magnetizing-flux-oriented control scheme, which constitutes a further contribution. As an open-loop estimator, it is suited to moderate-performance drives, and solving the low- and zero-speed estimation problems is beyond the scope of this work. Simulations to evaluate the performance of the estimator in the vector drive system were performed in Matlab/Simulink(R). To determine the benefits of the proposed model, a practical system was implemented using a voltage source inverter (VSI) to drive the motor; the vector control, including the ANFIS estimator, was carried out with the Real Time Toolbox for Matlab/Simulink(R) and a data acquisition card from National Instruments.
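For readers unfamiliar with the ANFIS structure, below is a minimal sketch of one forward pass through a two-input, first-order Sugeno ANFIS of the kind typically used for such estimators. The membership parameters and rule consequents are placeholders: in a real drive they would be learned by ANFIS's hybrid training, and the inputs (for instance, stator current and flux components) depend on the control scheme.

```python
import numpy as np

def gaussmf(x, c, s):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def anfis_forward(x1, x2, centers, sigmas, consequents):
    """One inference pass of a 2-input, 2-MF-per-input first-order Sugeno ANFIS.

    centers, sigmas: (2, 2) membership parameters per input;
    consequents: (4, 3) linear rule parameters [p, q, r].
    """
    mf1 = [gaussmf(x1, centers[0][i], sigmas[0][i]) for i in range(2)]
    mf2 = [gaussmf(x2, centers[1][j], sigmas[1][j]) for j in range(2)]
    # Layers 2-3: product T-norm firing strengths, then normalization
    w = np.array([m1 * m2 for m1 in mf1 for m2 in mf2])
    w /= w.sum()
    # Layers 4-5: linear rule outputs, weighted-average defuzzification
    f = np.array([p * x1 + q * x2 + r for p, q, r in consequents])
    return float(w @ f)
```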
Abstract:
OBJECTIVE: Due to their toxicity, diesel emissions have been subjected to progressively more restrictive regulations in developed countries. However, in Brazil, the implementation of the Cleaner Diesel Technologies policy (Euro IV standards for vehicles produced in 2009 and low-sulfur diesel with 50 ppm of sulfur) was postponed until 2012 without a comprehensive analysis of the effect of this delay on public health parameters. We aimed to evaluate the impact of the delay in implementing the Cleaner Diesel Technologies policy on health indicators and monetary health costs in Brazil. METHODS: The primary estimator of exposure to air pollution was the concentration of ambient fine particulate matter (particles with aerodynamic diameter < 2.5 µm [PM2.5]), measured daily in six Brazilian metropolitan areas during 2007-2008. We calculated 1) the projected reduction in PM2.5 that would have been achieved if the Euro IV standards had been implemented in 2009 and 2) the expected reduction after implementation in 2012. The difference between these two time curves was transformed into health outcomes using previously published dose-response curves. The economic valuation was performed using the DALY (disability-adjusted life years) method. RESULTS: The delay in implementing the Cleaner Diesel Technologies policy will result in an estimated excess of 13,984 deaths up to 2040. Health expenditures are projected to increase by nearly US$11.5 billion over the same period. CONCLUSIONS: The present results indicate that a significant health burden will occur because of the postponement in implementing the Cleaner Diesel Technologies policy. These results also reinforce the concept that health effects must be considered when revising fuel and emission policies.
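The outcome step in METHODS can be sketched as follows, assuming a log-linear dose-response relationship, a common choice in this literature. The slope, baseline mortality, and exposure-gap values below are hypothetical placeholders, not the study's inputs.

```python
import numpy as np

BETA = 0.0008  # hypothetical log-linear dose-response slope per ug/m3

def attributable_deaths(delta_pm25, baseline_deaths):
    """Deaths attributable to an increment in PM2.5 exposure."""
    rr = np.exp(BETA * delta_pm25)            # relative risk of the exposure gap
    return (rr - 1.0) / rr * baseline_deaths  # attributable fraction x baseline

# Hypothetical yearly PM2.5 gap between the 2009 and 2012 implementation curves
pm25_gap = np.array([2.0, 1.8, 1.5, 1.2, 0.9])  # ug/m3
excess = attributable_deaths(pm25_gap, baseline_deaths=50_000).sum()
```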
Abstract:
Objective: To identify spatial patterns in rates of admission for pneumonia among children and relate them to the number of fires reported in the state of Mato Grosso, Brazil. Methods: We conducted an ecological, exploratory study of data from the state of Mato Grosso for 2008 and 2009 on hospital admissions of children aged 0 to 4 years due to pneumonia and on fires in the same period. Admission rates were calculated, choropleth maps were plotted for rates and for fire outbreaks, Moran's I was calculated, and the kernel estimator was used to identify "hotspots." Data were analyzed using TerraView 3.3.1. Results: Fifteen thousand six hundred eighty-nine children were hospitalized (range, 0 to 2,315), and there were 161,785 fires (range, 7 to 6,454). The average rate of admissions was 2.89 per 1,000 inhabitants (standard deviation [SD] = 5.18) and the number of fires was 152.81 per 1,000 inhabitants (SD = 199.91). Moran's I was 0.02 (p = 0.26) for the overall number of admissions, 0.02 (p = 0.21) for the rate of admission, and 0.31 (p < 0.01) for the number of fires. Four municipalities with elevated rates of admission for pneumonia were identified, as were two regions with high admission densities. A clustering of fires was evident along what is known as the "arc of deforestation." Conclusions: This study identified municipalities in the state of Mato Grosso that require interventions to reduce rates of admission due to pneumonia and the number of fires.
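For reference, the global spatial statistic used above can be computed directly; a minimal sketch under a user-supplied spatial weights matrix (the study itself used TerraView 3.3.1):

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I of `values` under a spatial weights matrix W
    (zero diagonal, e.g. municipality contiguity)."""
    z = np.asarray(values, float)
    z = z - z.mean()
    return (len(z) / W.sum()) * (z @ W @ z) / (z @ z)
```

Significance (the p-values reported above) would come from a permutation test or the analytic moments of I under randomization.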
Abstract:
We extend the random permutation model to obtain the best linear unbiased estimator of a finite population mean accounting for auxiliary variables under simple random sampling without replacement (SRS) or stratified SRS. The proposed method provides a systematic design-based justification for well-known results involving common estimators derived under minimal assumptions that do not require specification of a functional relationship between the response and the auxiliary variables.
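One of the well-known results alluded to here is the regression estimator of the population mean under SRS with a single auxiliary variable of known population mean. A minimal sketch, assuming that one-auxiliary-variable setting (the paper's random permutation treatment is more general):

```python
import numpy as np

def regression_estimator(y, x, X_bar):
    """Regression estimator of the mean of y under SRS, given a sample
    of the auxiliary variable x and its known population mean X_bar."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)  # sample slope
    return y.mean() + b * (X_bar - x.mean())
```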
Abstract:
Intravascular ultrasound (IVUS) phantoms are important for calibrating and evaluating many IVUS image processing tasks. However, phantom generation is never the primary focus of related works; hence, it is often not well covered and is usually spread across more than one platform, which may not be accessible to investigators. We therefore present a framework for creating representative IVUS phantoms, for different intraluminal pressures, based on the finite element method and Field II. First, a coronary cross-section model is selected. Second, the coronary regions are identified so that material properties can be assigned. Third, the corresponding mesh is generated. Fourth, the intraluminal force is applied and the deformation computed. Finally, speckle noise is incorporated. The framework was tested with respect to IVUS contrast, noise, and strains. The outcomes are in line with related studies and expected values. Moreover, the framework toolbox is freely accessible and fully implemented in a single platform.
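Of the five steps, the speckle stage is the simplest to illustrate in isolation. The paper synthesizes ultrasound data with Field II; the sketch below is only a statistical stand-in, assuming fully developed speckle, where a complex Gaussian scattering sum gives a Rayleigh-distributed envelope scaled by local echogenicity.

```python
import numpy as np

def add_speckle(echogenicity, rng=None):
    """Rayleigh speckle: |complex Gaussian| envelope scaled by echogenicity."""
    rng = np.random.default_rng() if rng is None else rng
    re = rng.normal(size=echogenicity.shape)
    im = rng.normal(size=echogenicity.shape)
    return echogenicity * np.hypot(re, im)
```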
Abstract:
The objective of this study was to compare the BLUP selection method with different selection strategies in F2:4 and to assess the efficiency of this method for the early choice of the best common bean (Phaseolus vulgaris) lines. Fifty-one F2:4 progenies were produced from a cross between the lines CVIII8511 x RP-26. A randomized block design was used with 20 replications and one-plant field plots. Data on plant architecture and grain yield were obtained, and the sum of the standardized variables was then estimated for simultaneous selection of both traits. Analysis was carried out by mixed models (BLUP) and by the least squares method to compare different selection strategies, such as mass selection, stratified mass selection, and between- and within-progeny selection. The progenies selected by BLUP were assessed in advanced generations, always selecting the greatest and smallest sums of the standardized variables. Analyses by the least squares method and the BLUP procedure ranked the progenies in the same way. The coincidence of the individuals identified by BLUP and by between- and within-progeny selection was high, and it was greatest when BLUP was compared with mass selection. Although BLUP is the best estimator of genotypic value, its efficiency in the response to long-term selection does not differ from that of the other methods, because it is likewise unable to predict the future effect of the progenies x environments interaction. It was inferred that selection success will always depend on the most accurate possible progeny assessment and on using alternatives to reduce the progenies x environments interaction effect.
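The combined index used above, the sum of standardized variables, can be sketched as follows; the sign convention admitting traits where smaller values are desirable is an assumption for illustration.

```python
import numpy as np

def standardized_sum(traits, signs):
    """Sum of standardized variables for simultaneous selection.

    traits: (n_progenies, n_traits) array of trait means;
    signs: +1 where larger values are desirable, -1 otherwise.
    """
    z = (traits - traits.mean(axis=0)) / traits.std(axis=0, ddof=1)
    return (z * np.asarray(signs)).sum(axis=1)

# e.g. rank progenies and keep the 10 best by the combined index:
# best = np.argsort(standardized_sum(trait_matrix, [+1, -1]))[::-1][:10]
```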
Abstract:
In many applications of lifetime data analysis, it is important to perform inference about the change-point of the hazard function. The change-point is a maximum for unimodal hazard functions or a minimum for bathtub-shaped hazard functions, and it is usually of great interest in medical or industrial applications. For lifetime distributions where this change-point can be calculated analytically, its maximum likelihood estimator is easily obtained from the invariance property of maximum likelihood estimators, and confidence intervals follow from their asymptotic normality. Under the exponentiated Weibull distribution for the lifetime data, the hazard function can take different forms: constant, increasing, decreasing, unimodal, or bathtub-shaped. This model gives great flexibility of fit, but there is no analytic expression for the change-point of its hazard function. We therefore consider the use of Markov chain Monte Carlo (MCMC) methods to obtain posterior summaries for the change-point of the hazard function under the exponentiated Weibull distribution.
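The invariance route is easy to demonstrate on a distribution that does have a closed-form change-point. The sketch below uses the log-logistic distribution (scipy's fisk), whose unimodal hazard peaks at t* = scale * (shape - 1)**(1/shape) when shape > 1; it is a stand-in example, since the exponentiated Weibull of this paper has no such closed form, which is exactly why the authors turn to MCMC.

```python
import numpy as np
from scipy import stats

# Simulated log-logistic lifetimes (shape c > 1 gives a unimodal hazard)
data = stats.fisk.rvs(c=2.5, scale=10.0, size=500, random_state=0)

# Invariance: plug the MLEs into the closed-form change-point
c_hat, _, scale_hat = stats.fisk.fit(data, floc=0)
t_star_hat = scale_hat * (c_hat - 1.0) ** (1.0 / c_hat)
```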
Abstract:
In this paper, we carry out robust modeling and influence diagnostics in Birnbaum-Saunders (BS) regression models. Specifically, we present some aspects of the BS and log-BS distributions and their generalizations based on the Student-t distribution, and we develop BS-t regression models, including maximum likelihood estimation based on the EM algorithm and diagnostic tools. In addition, we apply the results to real insurance data, which illustrates the usefulness of the proposed model.
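The EM algorithm for Student-t based models hinges on reweighting observations by the expected latent scale, which is what makes the fit robust. A minimal sketch of that idea, for a plain t location-scale model with fixed degrees of freedom rather than the full BS-t regression:

```python
import numpy as np

def t_location_em(y, nu=4.0, iters=50):
    """EM for a Student-t location-scale model with known df `nu`."""
    y = np.asarray(y, float)
    mu, s2 = y.mean(), y.var()
    for _ in range(iters):
        w = (nu + 1.0) / (nu + (y - mu) ** 2 / s2)  # E-step: scale weights
        mu = (w * y).sum() / w.sum()                # M-step: weighted mean
        s2 = (w * (y - mu) ** 2).sum() / len(y)     # M-step: weighted scale
    return mu, s2
```

Outlying observations receive small weights w, which is the source of the robustness carried over to the BS-t setting.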
Abstract:
The study of proportions is a common topic in many fields. The standard beta distribution or the inflated beta distribution may be a reasonable choice for fitting a proportion in most situations. However, they do not provide a good fit for variables that do not assume values in the open interval (0, c), 0 < c < 1. For such variables, the authors introduce the truncated inflated beta distribution (TBEINF), a mixture of the beta distribution bounded on the open interval (c, 1) and the trinomial distribution. The authors present the moments of the distribution, its score vector, and its Fisher information matrix, and they discuss estimation of its parameters. The properties of the suggested estimators are studied using Monte Carlo simulation. In addition, the authors present an application of the TBEINF distribution to unemployment insurance data.
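To make the mixture structure concrete, here is a sampler sketch assuming point masses at 0, c, and 1 for the trinomial part and a beta rescaled to (c, 1) for the continuous part; the paper's exact parameterization may differ.

```python
import numpy as np

def rtbeinf(n, p0, pc, p1, a, b, c, rng=None):
    """Sample a truncated inflated beta: masses p0, pc, p1 at 0, c, 1,
    plus a Beta(a, b) rescaled to (c, 1) with weight 1 - p0 - pc - p1."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.random(n)
    out = c + (1.0 - c) * rng.beta(a, b, n)  # continuous part on (c, 1)
    out[u < p0] = 0.0
    out[(u >= p0) & (u < p0 + pc)] = c
    out[(u >= p0 + pc) & (u < p0 + pc + p1)] = 1.0
    return out
```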
Abstract:
We investigate how the initial geometry of a heavy-ion collision is transformed into final flow observables by solving event-by-event ideal hydrodynamics with realistic fluctuating initial conditions. We study quantitatively to what extent anisotropic flow (v_n) is determined by the initial eccentricity ε_n for a set of realistic simulations, and we discuss which definition of ε_n gives the best estimator of v_n. We find that the common practice of using an r^2 weight in the definition of ε_n in general results in a poorer predictor of v_n than when using an r^n weight, for n > 2. We similarly study the importance of additional properties of the initial state. For example, we show that in order to correctly predict v_4 and v_5 for noncentral collisions, one must take into account nonlinear terms proportional to ε_2^2 and ε_2 ε_3, respectively. We find that it makes no difference whether one calculates the eccentricities over a range of rapidity or in a single slice at z = 0, nor is it important whether one uses an energy or entropy density weight. This knowledge will be important for making a more direct link between experimental observables and hydrodynamic initial conditions, the latter being poorly constrained at present.
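For reference, the eccentricity with an r^k weight is ε_n = |Σ r^k e^{i n φ}| / Σ r^k over the recentered source distribution. A sketch for discrete source points follows; treating the initial condition as unweighted point sources is a simplification of the energy- or entropy-weighted profiles used in the paper.

```python
import numpy as np

def eccentricity(x, y, n, k=None):
    """epsilon_n of source points (x, y) with an r**k weight
    (k = n by default; k = 2 gives the conventional r^2 variant)."""
    k = n if k is None else k
    x = np.asarray(x, float) - np.mean(x)  # recenter on the centroid
    y = np.asarray(y, float) - np.mean(y)
    r, phi = np.hypot(x, y), np.arctan2(y, x)
    return np.abs(np.sum(r**k * np.exp(1j * n * phi))) / np.sum(r**k)
```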
Abstract:
Estimators of home-range size require a large number of observations, and the sparse data typical of tropical studies often prohibit their use. An alternative may be the use of distance metrics as indexes of home range; however, tests of the correlation between distance metrics and home-range estimators exist only for North American rodents. We evaluated the suitability of 3 distance metrics (mean distance between successive captures [SD], observed range length [ORL], and mean distance between all capture points [AD]) as indexes of home range for 2 Brazilian Atlantic forest rodents, Akodon montensis (montane grass mouse) and Delomys sublineatus (pallid Atlantic forest rat). Further, we investigated the robustness of the distance metrics to low numbers of individuals and of captures per individual. We observed a strong correlation between the distance metrics and the home-range estimator. None of the metrics was influenced by the number of individuals. ORL showed a strong dependence on the number of captures per individual. The accuracy of SD and AD did not depend on the number of captures per individual, but the precision of both metrics was low when the number of captures fell below 10. We recommend the use of SD and AD instead of ORL, and caution in interpreting results based on trapping data with few captures per individual.
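The three metrics compared above are simple functions of the capture coordinates. A minimal sketch, assuming planar coordinates such as trap-grid positions in meters:

```python
import numpy as np
from itertools import combinations

def distance_metrics(points):
    """SD, ORL and AD indexes from an ordered array of capture points."""
    pts = np.asarray(points, float)
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    pairs = [np.linalg.norm(p - q) for p, q in combinations(pts, 2)]
    return {
        "SD": steps.mean(),    # mean distance between successive captures
        "ORL": max(pairs),     # observed range length (largest pairwise distance)
        "AD": np.mean(pairs),  # mean distance between all capture points
    }
```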