898 results for Nonparametric Estimators
Abstract:
In this paper, a new algorithm for blind inversion of Wiener systems is presented. The algorithm is based on minimization of mutual information of the output samples. This minimization is done through a Minimization-Projection (MP) approach, using a nonparametric “gradient” of mutual information.
Abstract:
STUDY DESIGN: Case-control study. OBJECTIVES: To assess serum myostatin levels, bone mineral density (BMD), appendicular skeletal muscle mass (ASMM) and serum sclerostin levels in chronic spinal cord injured (SCI) patients and healthy controls. SETTING: SCI centre in Italy. METHODS: Blood samples, whole-body bioelectrical impedance analysis and BMD measurement with the ultrasound technique at the calcaneus level were taken from patients suffering from chronic SCI (both motor complete and incomplete) and healthy control subjects. RESULTS: A total of 28 SCI patients and 15 healthy controls were enrolled. Serum myostatin levels were significantly higher (P<0.01) in SCI patients compared with healthy controls. Similar results were found comparing both the motor complete and the motor incomplete SCI subgroups with healthy controls. Serum sclerostin was significantly higher in patients with SCI compared with healthy controls (P<0.01). BMD, stiffness and mean T-score values in SCI patients were significantly lower than those in healthy controls. Serum myostatin concentrations correlated with serum sclerostin levels (r(2)=0.42; P=0.001) and ASMM (r(2)=0.70; P=0.002) only in the motor complete SCI subgroup, not in healthy controls. DISCUSSION: Serum myostatin and serum sclerostin are significantly higher in chronic SCI patients than in healthy controls. They are potential biomarkers of muscle and bone modifications after SCI. This is the first study reporting an increase in serum myostatin in patients suffering from chronic SCI and a correlation with ASMM.
Abstract:
We investigate the importance of the labour mobility of inventors, as well as the scale, extent and density of their collaborative research networks, for regional innovation outcomes. To do so, we apply a knowledge production function framework at the regional level and include inventors’ networks and their labour mobility as regressors. Our empirical approach takes full account of spatial interactions by estimating a spatial lag model and, where necessary, a spatial error model. In addition, standard errors are calculated using spatial heteroskedasticity and autocorrelation consistent estimators to ensure their robustness in the presence of spatial error autocorrelation and heteroskedasticity of unknown form. Our results point to the existence of a robust positive correlation between intraregional labour mobility and regional innovation, whilst the relationship with networks is less clear. However, networking across regions positively correlates with a region’s innovation intensity.
Abstract:
This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
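The approach described above, using Monte Carlo draws to estimate the expectation and standard deviation of the measurand, can be sketched in a few lines of Python. The measurement model and the input distributions below are illustrative assumptions, not the chapter's actual 103Pd calibration model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measurement model (an assumption for illustration, not the
# chapter's calibration model): activity A = N / (eps * t).
n_draws = 100_000
N = rng.normal(10_000.0, 100.0, n_draws)  # counts, with Gaussian uncertainty
eps = rng.normal(0.95, 0.01, n_draws)     # detection efficiency, Gaussian uncertainty
t = 60.0                                  # counting time in s, assumed exact

A = N / (eps * t)  # propagate the input uncertainties through the model

# Monte Carlo estimators of the expectation and standard deviation of the measurand
print(f"A = {A.mean():.2f} +/- {A.std(ddof=1):.2f} Bq")
```

The same pattern extends to any measurement model: draw each input from its assigned distribution, evaluate the model, and summarize the output sample.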
Abstract:
Corticosterone is an important hormone of the stress response that regulates physiological processes and modifies animal behavior. While it acts positively on locomotor activity, it may negatively affect reproduction and social activity. This suggests that corticosterone may promote behaviors that increase survival at the cost of reproduction. In this study, we experimentally investigate the link between corticosterone levels and survival in adult common lizards (Lacerta vivipara) by comparing corticosterone-treated with placebo-treated lizards. We experimentally show that corticosterone enhances energy expenditure, daily activity and food intake, and that it modifies the behavioral time budget. The enhanced appetite of corticosterone-treated individuals compensated for their increased energy expenditure, and corticosterone-treated males showed increased survival. This suggests that corticosterone may promote behaviors that reduce stress, and it shows that corticosterone per se does not reduce survival but directly or indirectly increases it in the longer term. The production of corticosterone in response to a stressor may thus be an adaptive mechanism that also controls survival.
Abstract:
Parametric tests are a type of statistical significance test that quantifies the association or independence between a quantitative variable and a categorical one. Parametric tests require certain prerequisites for their application: a Normal distribution of the quantitative variable in the groups being compared, homogeneity of variances in the populations from which the groups are drawn, and a sample size of no fewer than 30. When these requirements are not met, nonparametric statistical tests must be used instead. Parametric tests fall into two classes: the t-test (for one sample, or for two related or independent samples) and ANOVA (for more than two independent samples).
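The workflow described above, checking the prerequisites and then choosing between the t-test and ANOVA, can be sketched with SciPy. The groups below are simulated for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, 40)   # three groups, each n >= 30,
group_b = rng.normal(12.0, 2.0, 40)   # Normal with equal variances
group_c = rng.normal(10.5, 2.0, 40)

# Prerequisites: Normality (Shapiro-Wilk) and homogeneity of variances (Levene)
normal_p = [stats.shapiro(g).pvalue for g in (group_a, group_b, group_c)]
levene_p = stats.levene(group_a, group_b, group_c).pvalue

# Two independent samples -> t-test; more than two -> one-way ANOVA
t_p = stats.ttest_ind(group_a, group_b).pvalue
anova_p = stats.f_oneway(group_a, group_b, group_c).pvalue
```

If the Shapiro-Wilk or Levene p-values were small, the nonparametric alternatives (Mann-Whitney, Kruskal-Wallis) would be used instead.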
Abstract:
In this paper we build a two-region model where both innovation and imitation are performed. In particular, imitation takes the form of technological spillovers that lagging regions may exploit given certain human capital conditions. We show how the high-skill content of each region’s workforce (rather than the average human capital stock) is crucial for determining convergence towards the income level of the leader region and for exploiting the technological spillovers coming from the frontier. The same applies to bureaucratic/institutional quality, which is conducive to higher growth in the long run. We successfully test our theoretical results on Spanish regions for the period between 1960 and 1997, exploiting system GMM estimators, which allow us to deal correctly with endogeneity problems and small-sample bias.
Abstract:
The author studies random walk estimators for radiosity with generalized absorption probabilities. That is, a path will either die or survive on a patch according to an arbitrary probability. The estimators studied so far, the infinite path length estimator and the finite path length estimator, can be considered particular cases. Practical applications of the random walks with generalized probabilities are given. A necessary and sufficient condition for the existence of the variance is given, together with heuristics to be used in practical cases. The optimal probabilities are also found for the case when one is interested in the whole scene, and are equal to the reflectivities.
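The mechanism of a path that dies or survives with an arbitrary probability, while the estimator stays unbiased, can be illustrated on a scalar toy problem. This is a hedged sketch of the general Russian-roulette idea, not the paper's radiosity estimator: estimating the geometric series sum_{k>=0} r**k with survival probability q. Choosing q = r, mirroring the paper's result that the optimal probabilities equal the reflectivities, keeps every path weight at 1:

```python
import numpy as np

def roulette_estimate(r, q, rng):
    """One random-walk sample of sum_{k>=0} r**k using Russian roulette.

    The walk survives each step with probability q (the generalized
    absorption/survival probability); each survival multiplies the weight
    by r / q, which keeps the estimator unbiased for any 0 < q < 1."""
    total, weight = 0.0, 1.0
    while True:
        total += weight
        if rng.random() >= q:   # absorbed: the path dies on this step
            return total
        weight *= r / q

rng = np.random.default_rng(7)
r = 0.5                         # plays the role of the patch reflectivity
samples = [roulette_estimate(r, q=r, rng=rng) for _ in range(200_000)]
# With q = r every weight stays 1, and the sample mean converges to 1/(1-r)
estimate = np.mean(samples)
```

Choosing q much smaller than r keeps the estimator unbiased but inflates the weights, which is exactly the variance-existence question the paper analyzes.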
Abstract:
The directional consistency and skew-symmetry statistics have been proposed as global measures of social reciprocity. Although both measures can be useful for quantifying social reciprocity, researchers need to know whether these estimators are biased in order to assess descriptive results properly. That is, if the estimators are biased, researchers should compare actual values with expected values under the specified null hypothesis. Furthermore, standard errors are needed to enable suitable assessment of discrepancies between actual and expected values. This paper derives exact and approximate expressions for the bias and standard error of both estimators under round-robin designs, although the results can also be extended to other reciprocal designs.
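When exact or approximate expressions like those derived above are unavailable, bias and standard error can be approximated by resampling. This is a generic bootstrap sketch, not the paper's round-robin derivation, shown for the deliberately biased ddof=0 standard deviation estimator on invented data:

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(2.0, 100)  # illustrative sample

def bootstrap_bias_se(data, estimator, n_boot=2000):
    """Bootstrap estimates of the bias and standard error of an estimator."""
    theta_hat = estimator(data)
    boots = np.array([estimator(rng.choice(data, len(data), replace=True))
                      for _ in range(n_boot)])
    # Bias: mean over resamples minus the original estimate;
    # standard error: spread of the resampled estimates.
    return boots.mean() - theta_hat, boots.std(ddof=1)

bias, se = bootstrap_bias_se(data, lambda d: d.std())  # ddof=0, biased
```

The actual-minus-expected comparison the abstract recommends then amounts to checking whether the observed discrepancy exceeds a few standard errors.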
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting visual analysis to add consistency to decisions on the existence of a functional relation, without losing sight of the need for a methodological evaluation of what stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications involve comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns, differing in the presence and type of effect, for ABAB and multiple baseline designs. Although none of the techniques is completely flawless in detecting a functional relation only when it is present and never when it is absent, an option based on projecting a split-middle trend and considering data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that the information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to use as a weight the inverse of the data variability measure used in the quantification for assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
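A minimal sketch of the split-middle projection mentioned above: the trend line through the median points of the two baseline halves is extended into the treatment phase, where actual measurements can be compared against it. The data and function name are illustrative, not the authors' code:

```python
import numpy as np

def split_middle_projection(baseline, n_treatment):
    """Project a split-middle trend line from baseline into the treatment phase.

    The line passes through (median x, median y) of each half of the baseline."""
    y = np.asarray(baseline, dtype=float)
    x = np.arange(len(y))
    half = len(y) // 2
    x1, y1 = np.median(x[:half]), np.median(y[:half])
    x2, y2 = np.median(x[half:]), np.median(y[half:])
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    # Time points of the treatment phase continue the baseline numbering
    x_treat = np.arange(len(y), len(y) + n_treatment)
    return slope * x_treat + intercept

baseline = [3, 4, 3, 5, 4, 5]                      # invented baseline data
projected = split_middle_projection(baseline, 4)   # projected baseline values
```

Treatment measurements falling consistently outside a variability band around `projected` would then support a functional relation.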
Abstract:
Standard Indirect Inference (II) estimators take a given finite-dimensional statistic, Z_n, and then estimate the parameters by matching the sample statistic with the model-implied population moment. Here we propose a novel estimation method that utilizes all available information contained in the distribution of Z_n, not just its first moment. This is done by computing the likelihood of Z_n and then estimating the parameters either by maximizing the likelihood or by computing the posterior mean for a given prior on the parameters. These are referred to as the maximum indirect likelihood (MIL) and Bayesian indirect likelihood (BIL) estimators, respectively. We show that the IL estimators are first-order equivalent to the corresponding moment-based II estimator that employs the optimal weighting matrix. However, due to higher-order features of Z_n, the IL estimators are higher-order efficient relative to the standard II estimator. The likelihood of Z_n is in general unknown, so simulated versions of the IL estimators are developed. Monte Carlo results for a structural auction model and a DSGE model show that the proposed estimators indeed have attractive finite-sample properties.
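A toy version of the simulated MIL idea, under heavy simplifying assumptions not taken from the paper (scalar parameter, Z_n equal to the sample mean, a Gaussian kernel estimate of the simulated statistic's density, grid-search maximization), can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(11)
theta_true = 1.5
data = rng.normal(theta_true, 1.0, 200)
z_obs = data.mean()                  # the auxiliary statistic Z_n

def sim_indirect_loglik(theta, n=200, n_sim=500, bw=0.05):
    """Simulated indirect log-likelihood of Z_n at theta.

    The density of Z_n is approximated by a Gaussian kernel estimate
    over n_sim simulated replicates of the statistic."""
    z_sim = rng.normal(theta, 1.0, (n_sim, n)).mean(axis=1)
    kern = np.exp(-0.5 * ((z_obs - z_sim) / bw) ** 2) / (bw * np.sqrt(2 * np.pi))
    return np.log(kern.mean() + 1e-300)

# MIL estimator: maximize the simulated indirect likelihood over a grid
grid = np.linspace(0.5, 2.5, 81)
theta_mil = grid[np.argmax([sim_indirect_loglik(th) for th in grid])]
```

In this degenerate example MIL essentially recovers the sample mean; the gains the paper documents come from richer statistics whose full distribution carries more information than its first moment.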
Abstract:
Ten common doubts of chemistry students and professionals about their statistical applications are discussed. The use of the N-1 denominator instead of N is described for the standard deviation. The statistical meanings of the denominators of the root mean square error of calibration (RMSEC) and the root mean square error of validation (RMSEV) are given for researchers using multivariate calibration methods. The reason why scientists and engineers use the average instead of the median is explained. Several problematic aspects of regression and correlation are treated. The popular use of triplicate experiments in teaching and research laboratories is seen to have its origin in statistical confidence intervals. Nonparametric statistics and bootstrapping methods round out the discussion.
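The N-1 point can be made concrete in a few lines of NumPy (the replicate measurements are invented for illustration):

```python
import numpy as np

x = np.array([4.1, 3.9, 4.0, 4.2, 3.8])  # illustrative replicate measurements

# Deviations are taken from the sample mean, which was itself fitted to the
# data; dividing by N - 1 compensates for the spread this removes.
ss = ((x - x.mean()) ** 2).sum()
biased = np.sqrt(ss / len(x))            # population formula, denominator N
unbiased = np.sqrt(ss / (len(x) - 1))    # sample formula, denominator N - 1

assert np.isclose(biased, np.std(x, ddof=0))
assert np.isclose(unbiased, np.std(x, ddof=1))
```

The N-1 version is always the larger of the two, and the difference matters most for the small N (such as triplicates) common in laboratory work.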
Abstract:
The topic of this thesis is the simulation of a combination of several control and data assimilation methods, meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process and the information obtained from the web is sparse. A paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web. Quality control is based on these measurements. There is an increasing need for intelligent methods to assist in data assimilation. The target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project. The simulator is a valuable tool for comparing different assimilation methods. The thesis contains a comparison of four data assimilation methods: a first-order Bayesian model estimator, a higher-order Bayesian estimator based on an ARMA model, a Fourier-transform-based Kalman filter estimator and a simple block estimator. The last one can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, with the Kalman and ARMA estimators best in overall performance.
Abstract:
Scientific evidence on climate change at the global level has gained increasing interest in the scientific community in general. The impacts of climate change, as well as anthropogenic actions, may cause errors in hydro-agricultural projects existing in the watershed under study. This study aimed to identify the presence or absence of trend in the total annual precipitation series of the watershed of the Mirim Lagoon, state of Rio Grande do Sul-RS / Brazil / Uruguay (Brazilian side), as well as to detect the period in which changes occurred. For that, precipitation data from 14 weather stations were analyzed. To detect the existence of monotonic trend and change points, the nonparametric Mann-Kendall and Mann-Whitney tests, Student's t-test for two samples of unpaired data (parametric), and the progressive mean technique were used. Weather Station 3152014 (Pelotas) showed a change in trend in the series of annual precipitation in the period from 1953 to 2007. The methodologies that use subdivided series were more efficient in detecting change in trend than the Mann-Kendall test, which uses the complete series (from 1921 to 2007).
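A hedged sketch of the Mann-Kendall test applied above (without the tie correction, so tied values are assumed absent; the simulated series is illustrative, not the Mirim Lagoon data):

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test: S statistic and two-sided p-value.

    Minimal sketch without the tie correction (assumes no tied values)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs (i < j)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation for the p-value
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p_value = 2 * stats.norm.sf(abs(z))
    return s, p_value

rng = np.random.default_rng(1)
trend_series = np.arange(50) * 0.5 + rng.normal(0, 1, 50)  # clear upward trend
s_trend, p_trend = mann_kendall(trend_series)
```

Applying the same test to the two halves of a subdivided series, as the study does, helps localize when a change in trend occurred.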
Abstract:
PURPOSE: To investigate the association between polymorphisms in genes that encode enzymes involved in folate- and vitamin B12-dependent homocysteine metabolism and recurrent spontaneous abortion (RSA). METHODS: We investigated the C677T and A1298C polymorphisms of the methylenetetrahydrofolate reductase gene (MTHFR), the A2756G polymorphism of the methionine synthase gene (MS) and the 844ins68 insertion of the cystathionine beta-synthase gene (CBS). The PCR technique followed by RFLP was used to assess the polymorphisms; serum levels of homocysteine, vitamin B12 and folate were investigated by chemiluminescence. The EPI Info Software version 6.04 was used for statistical analysis. Parametric variables were compared by Student's t-test and nonparametric variables by the Wilcoxon rank sum test. RESULTS: The frequencies of the gene polymorphisms in 89 women with a history of idiopathic recurrent miscarriage and 150 controls were 19.1 and 19.6% for the C677T polymorphism, 20.8 and 26% for the A1298C polymorphism, 14.2 and 21.9% for the A2756G polymorphism, and 16.4 and 18% for the 844ins68 insertion, respectively. There were no significant differences between the case and control groups in any of the gene polymorphisms investigated. However, the frequency of the 844ins68 insertion in the CBS gene was higher among women with a history of loss during the third trimester of pregnancy (p=0.003). Serum homocysteine, vitamin B12 and folate levels did not differ between the polymorphisms studied in the case and control groups. However, linear regression analysis showed a dependence of serum folate levels on the maintenance of total homocysteine (tHcy) levels. CONCLUSION: The investigated gene polymorphisms and serum homocysteine, vitamin B12 and folate levels were not associated with idiopathic recurrent miscarriage in the present study. Further investigations are needed in order to confirm the role of the CBS 844ins68 insertion in recurrent miscarriage.
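The two comparisons named in the methods above, Student's t-test for parametric variables and the Wilcoxon rank sum test for nonparametric ones, can be sketched with SciPy. The group sizes mirror the study's 89 cases and 150 controls, but the skewed serum-level data are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Invented, skewed serum-level data; group sizes mirror the study's
# 89 cases and 150 controls.
cases = rng.lognormal(1.2, 0.5, 89)
controls = rng.lognormal(1.0, 0.5, 150)

t_p = stats.ttest_ind(cases, controls).pvalue   # parametric comparison
w_p = stats.ranksums(cases, controls).pvalue    # Wilcoxon rank sum (nonparametric)
```

For skewed variables such as these, the rank-based test is the safer choice, which is presumably why the study reserved the t-test for its parametric variables.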