124 results for Step Length Estimation
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This work investigates the coherence and reliability of subjective probability density function (PDF) estimates of crop yields elicited from a large group of farmers. Three different elicitation techniques were used: the two-step PDF estimation method, the Triangular distribution and the Beta distribution. The interviewed subjects provided estimates of point values of crop yields (mean, maximum possible, most frequent and minimum possible) and of PDFs based on interval estimation. To evaluate persistence, the concepts of temporal persistence and methodological persistence were used. The results are of interest for judging the suitability of subjective probability elicitation techniques for decision-support systems in agriculture.
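The three point estimates mentioned above (minimum possible, most frequent, maximum possible) are exactly the inputs needed to fit a Triangular distribution. A minimal Python sketch with SciPy, using hypothetical yield figures rather than the paper's data:

```python
from scipy.stats import triang

# Hypothetical triangular yield PDF elicited from a farmer's three
# point estimates (t/ha): minimum possible, most frequent, maximum possible
low, mode, high = 2.0, 5.0, 8.0
c = (mode - low) / (high - low)        # SciPy's shape parameter, in [0, 1]
yield_pdf = triang(c, loc=low, scale=high - low)

print(round(yield_pdf.mean(), 2))      # (low + mode + high) / 3 = 5.0
print(round(yield_pdf.cdf(5.0), 2))    # probability of a yield below the mode
```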
Abstract:
This work proposes novel network analysis techniques for multivariate time series. We define the network of a multivariate time series as a graph where vertices denote the components of the process and edges denote non-zero long-run partial correlations. We then introduce a two-step LASSO procedure, called NETS, to estimate high-dimensional sparse long-run partial correlation networks. This approach is based on a VAR approximation of the process and allows the long-run linkages to be decomposed into the contributions of the dynamic and contemporaneous dependence relations of the system. The large-sample properties of the estimator are analysed and we establish conditions for consistent selection and estimation of the non-zero long-run partial correlations. The methodology is illustrated with an application to a panel of U.S. blue chips.
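The node-wise LASSO idea underlying such procedures can be illustrated in a few lines. This is a deliberately simplified sketch on synthetic i.i.d. data (not the NETS estimator itself, which operates on a VAR approximation and long-run quantities): regress each component on the others and read non-zero coefficients as edges.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
# Simulate a 4-variable system in which only components 0 and 1 are linked
n, p = 2000, 4
X = rng.standard_normal((n, p))
X[:, 1] += 0.8 * X[:, 0]              # induce a partial correlation 0 <-> 1

# Node-wise LASSO: regress each component on all the others; a non-zero
# coefficient signals an edge of the (partial-correlation) network
edges = set()
for j in range(p):
    others = [k for k in range(p) if k != j]
    beta = Lasso(alpha=0.1).fit(X[:, others], X[:, j]).coef_
    for k, b in zip(others, beta):
        if abs(b) > 1e-6:
            edges.add(frozenset((j, k)))

print(sorted(tuple(sorted(e)) for e in edges))  # expect [(0, 1)]
```

The L1 penalty (here an arbitrary `alpha=0.1`) is what produces sparsity: spurious sample correlations of order 1/√n are shrunk exactly to zero.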
Abstract:
The growth of five tibial variables (diaphyseal length, diaphyseal length plus distal epiphysis, condylo-malleolar length, sagittal diameter of the proximal epiphysis, maximum breadth of the distal epiphysis) was analysed using polynomial regression in order to evaluate their significance and capacity for age and sex determination during and after growth. Data were collected from 181 individuals (90♂ and 91♀) ranging from birth to 25 years of age and belonging to three documented collections from Western Europe. Results indicate that all five variables exhibit linear behaviour during growth, which can be expressed by a first-degree polynomial function. Significant sexual differences were observed from age 15 onward in the two epiphysis measurements and the condylo-malleolar length, suggesting that these three variables could be useful for sex determination in individuals older than 15 years. Strong correlation coefficients were identified between the five tibial variables and age. These results indicate that any of the studied tibial measurements is likely to serve as a useful source for estimating sub-adult age in both archaeological and forensic samples.
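A first-degree polynomial fit of the kind the abstract describes, and its inversion to estimate age from a measured bone length, can be sketched as follows. The numbers are synthetic and chosen to be exactly linear; they are not the paper's data.

```python
import numpy as np

# Hypothetical (synthetic) data: age in years vs. diaphyseal length in mm,
# constructed as length = 60 + 17.5 * age to mimic linear growth
age = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0])
length = np.array([60.0, 95.0, 130.0, 165.0, 200.0, 235.0, 270.0, 305.0])

# First-degree polynomial: length = slope * age + intercept
slope, intercept = np.polyfit(age, length, deg=1)

# Inverting the fitted line gives an age estimate from a measured length
est_age = (240.0 - intercept) / slope
print(round(slope, 1))     # 17.5 mm per year
print(round(est_age, 2))   # 10.29 years for a 240 mm bone
```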
Abstract:
Electroencephalographic (EEG) recordings are, most of the time, corrupted by spurious artifacts, which should be rejected or cleaned by the practitioner. As human screening of scalp EEG is error-prone, automatic artifact detection is critically important to ensure objective and reliable results. In this paper we propose a new approach for discriminating muscular activity in human scalp quantitative EEG (QEEG), based on time-frequency shape analysis. The impact of muscular activity on the EEG can be evaluated with this methodology. We present an application of this scoring as a preprocessing step for EEG signal analysis, in order to evaluate the amount of muscular activity in two sets of EEG recordings: dementia patients at an early stage of Alzheimer's disease and age-matched control subjects.
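The paper's method is a time-frequency shape analysis; as a much cruder stand-in, the general idea of scoring muscular (EMG) contamination can be illustrated by the share of spectral power above the EEG band, here on a synthetic signal:

```python
import numpy as np
from scipy.signal import welch

fs = 250                               # Hz, a common EEG sampling rate
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)

# Clean-ish EEG surrogate: 10 Hz alpha rhythm plus a little noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)
# Same signal with a broadband muscle artifact in its second half
emg = eeg + 1.5 * rng.standard_normal(t.size) * (t > 2)

def high_freq_ratio(x):
    """Share of spectral power above 30 Hz: a crude muscle-activity score."""
    f, p = welch(x, fs=fs, nperseg=256)
    return p[f > 30].sum() / p.sum()

# The contaminated epoch scores markedly higher
print(high_freq_ratio(eeg) < high_freq_ratio(emg))  # True
```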
Abstract:
In the world of transport management, the term 'anticipation' is gradually replacing 'reaction'. Indeed, the ability to forecast traffic evolution in a network should ideally form the basis for many traffic management strategies and multiple ITS applications. Real-time prediction capabilities are therefore becoming a concrete need for the management of networks, in both urban and interurban environments, and today's road operator has increasingly complex and exacting requirements. Recognising temporal patterns in traffic, or the manner in which sequential traffic events evolve over time, has been an important consideration in short-term traffic forecasting. However, little work has been conducted on identifying or associating traffic pattern occurrence with prevailing traffic conditions. This paper presents a framework for traffic pattern identification based on finite mixture models, using the EM algorithm for parameter estimation. The computations were carried out on traffic data available from an urban network.
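A finite mixture model fitted by EM, of the kind the abstract describes, is readily sketched with scikit-learn. The data here are synthetic speeds standing in for two traffic regimes; the regimes, sample sizes and parameters are invented for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic traffic speeds (km/h): a congested and a free-flow regime
speeds = np.concatenate([rng.normal(25, 4, 300),    # congestion
                         rng.normal(80, 8, 700)])   # free flow

# EM estimates the mixture parameters; each observation is then assigned
# to the component with the highest posterior probability
gm = GaussianMixture(n_components=2, random_state=0).fit(speeds.reshape(-1, 1))
means = sorted(gm.means_.ravel())
print([round(m) for m in means])   # component means near 25 and 80 km/h
```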
Abstract:
The filling length of an edge-circuit η in the Cayley 2-complex of a finite presentation of a group is the minimal integer length L such that there is a combinatorial null-homotopy of η down to a base point through loops of length at most L. We introduce similar notions in which the null-homotopy is not required to fix a base point, and in which the contracting loop is allowed to bifurcate. We exhibit a group in which the resulting filling invariants display dramatically different behaviour from the standard notion of filling length. We also define the corresponding filling invariants for Riemannian manifolds and translate our results to this setting.
Abstract:
This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function. In the global maximum more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may be outside the interval [0,1]. We have solved the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
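The key step, estimating a conditional moment by kernel smoothing over a long simulation, is a Nadaraya-Watson regression. A minimal sketch (Gaussian kernel, a toy model with E[y|x] = x², bandwidth chosen by hand rather than by any data-driven rule):

```python
import numpy as np

def kernel_conditional_mean(x_sim, y_sim, x0, h):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_sim - x0) / h) ** 2)
    return np.sum(w * y_sim) / np.sum(w)

rng = np.random.default_rng(0)
# A long simulation of a model whose true conditional mean is E[y | x] = x**2
x = rng.uniform(-2, 2, 50_000)
y = x ** 2 + rng.normal(0, 0.1, x.size)

m_hat = kernel_conditional_mean(x, y, x0=1.0, h=0.05)
print(round(m_hat, 2))   # close to the true value 1.0
```

With such conditional moments in hand at any trial parameter value, standard GMM-style moment matching can proceed even when simulation conditional on the conditioning information is infeasible.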
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
Lean meat percentage (LMP) is an important carcass quality parameter. The aim of this work is to obtain a calibration equation for Computed Tomography (CT) scans with the Partial Least Squares (PLS) regression technique in order to predict the LMP of the carcass and of the different cuts, and to study and compare two methodologies for selecting the variables (Variable Importance for Projection, VIP, and stepwise) to be included in the prediction equation. The error of prediction with cross-validation (RMSEPCV) of the LMP obtained with PLS and VIP-based selection was 0.82%, and with stepwise selection it was 0.83%. Predicting the LMP by scanning only the ham gave an RMSEPCV of 0.97%; if both the ham and the loin were scanned, the RMSEPCV was 0.90%. Results indicate that for CT data both VIP and stepwise selection are good methods. Moreover, scanning only the ham allowed us to obtain a good prediction of the LMP of the whole carcass.
Abstract:
Properties of GMM estimators for panel data, which have become very popular in the empirical economic growth literature, are not well known when the number of individuals is small. This paper analyses, through Monte Carlo simulations, the properties of various GMM and other estimators when the number of individuals is that typically available in country growth studies. It is found that, provided some persistence is present in the series, the system GMM estimator has a lower bias and higher efficiency than all the other estimators analysed, including the standard first-differences GMM estimator.
Abstract:
This project carries out an environmental audit of the building of the Àrea de Territori, Medi Ambient, Paisatge i Espai Urbà of the Ajuntament de Sitges, as a first step towards implementing an environmental management system (EMS) in accordance with Regulation (EC) No 761/2001, and towards the subsequent attainment of an eco-management and audit (EMAS) certificate. The audit begins with the identification of the building's environmental aspects, through the compilation of data on energy and water consumption, an estimate of waste generation, and mobility surveys of the employees. These data are used to determine the significant environmental aspects and then to put forward a series of improvement proposals to correct or minimise them, such as water-saving, lighting and electricity-generation systems and the environmental education of the employees. To implement an EMS in the building's current situation it is necessary, among other things, to set up a record-keeping system that logs water and energy consumption and waste generation, in order to keep track of each of these vectors. In addition, a programme of good environmental office practice will need to be implemented in order to reduce electricity and water consumption and waste generation, and to ensure waste sorting by the employees.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
Abstract:
This paper presents an analysis of motor vehicle insurance claims relating to vehicle damage and to associated medical expenses. We use univariate severity distributions estimated with parametric and non-parametric methods. The methods are implemented using the statistical package R. Parametric analysis is limited to estimation of normal and lognormal distributions for each of the two claim types. The non-parametric analysis presented involves kernel density estimation. We illustrate the benefits of applying transformations to data prior to employing kernel-based methods. We use a log-transformation and an optimal transformation amongst a class of transformations that produces symmetry in the data. The central aim of this paper is to provide educators with material that can be used in the classroom to teach statistical estimation methods, goodness-of-fit analysis and, importantly, statistical computing in the context of insurance and risk management. To this end, we have included in the Appendix of this paper all the R code used in the analysis so that readers, both students and educators, can fully explore the techniques described.
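The paper works in R; the same transform-then-smooth idea can be sketched in Python with SciPy. Kernel density estimation behaves better on roughly symmetric data, so the density of log-claims is estimated and mapped back by the change-of-variables formula f_X(x) = f_log(log x)/x. The claim data below are simulated lognormal severities, not the paper's data.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic claim severities: lognormal, heavily right-skewed
claims = rng.lognormal(mean=7.0, sigma=1.0, size=5_000)

# Estimate the density of log(claims), which is approximately normal
log_kde = gaussian_kde(np.log(claims))

# Density of the original claims via change of variables:
# f_X(x) = f_log(log x) / x
x = float(np.exp(7.0))              # the median of this lognormal
f_x = float(log_kde(np.log(x))[0] / x)
print(f_x)                          # close to 1 / (x * sqrt(2 * pi))
```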
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
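The copula-based Monte Carlo step can be sketched as follows: draw correlated normals, push them through the normal CDF to obtain uniforms carrying the Gaussian copula's dependence, then map each uniform through an arbitrary marginal. The correlation, marginals and loss scales below are invented for illustration, and only a single copula family is shown; they are not calibrated to the Spanish market data used in the paper.

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(0)

# Gaussian copula linking two lines of business with correlation 0.5
rho = 0.5
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]],
                            size=100_000)
u = norm.cdf(z)                 # uniforms carrying the copula's dependence

# Illustrative lognormal marginals for each line's underwriting loss
loss_1 = lognorm.ppf(u[:, 0], s=0.3, scale=np.exp(10.0))
loss_2 = lognorm.ppf(u[:, 1], s=0.5, scale=np.exp(9.5))
total = loss_1 + loss_2

# SCR-style figure: 99.5% VaR of the total loss minus its expectation
scr = float(np.quantile(total, 0.995) - total.mean())
print(scr > 0)                  # True
```

Repeating the exercise with a different correlation matrix or copula family (e.g. a t or Clayton copula in place of the Gaussian) and comparing the resulting SCR figures is precisely the sensitivity analysis the abstract describes.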