995 results for Bootstrap weights approach
Abstract:
The main subject of this thesis is the study of variance estimation for a statistic based on imputed survey data via the bootstrap (or the "Cyrano method"). Applying a bootstrap method designed for complete survey data (in the absence of nonresponse) to data containing imputed values, and treating those values as if they were true observations, can lead to underestimating the variance. In this context, Shao and Sitter (1996) introduced a bootstrap procedure in which the variable of interest and the response indicator are resampled together, and the bootstrap nonrespondents are imputed in the same way as the original sample was treated. The resulting bootstrap variance estimator is valid when the sampling fraction is small. In Chapter 1, we begin with a review of existing bootstrap methods for survey data (complete and imputed) and present them in a unified framework for the first time in the literature. In Chapter 2, we introduce a new bootstrap procedure for variance estimation under the nonresponse model approach when a uniform nonresponse mechanism is assumed. Using only information on the response rate, unlike Shao and Sitter (1996), which requires the individual response indicators, a bootstrap response indicator is generated for each bootstrap sample, leading to a bootstrap variance estimator that is valid even for non-negligible sampling fractions. In Chapter 3, we study pseudo-population bootstrap approaches and consider a more general class of nonresponse mechanisms. We develop two pseudo-population bootstrap procedures for estimating the variance of an imputed estimator under the nonresponse model approach and under the imputation model approach. These procedures are also valid even for non-negligible sampling fractions.
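A minimal sketch (not the thesis's new procedure) of the Shao and Sitter (1996) idea described above, with synthetic data and assuming uniform response and mean imputation: the pairs (y, response indicator) are resampled together and each bootstrap sample is re-imputed in the same way as the original sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def imputed_mean(y, r):
    """Mean-impute nonrespondents (r == 0) and return the imputed sample mean."""
    y_imp = np.where(r == 1, y, y[r == 1].mean())
    return y_imp.mean()

def shao_sitter_variance(y, r, B=1000):
    """Bootstrap variance of the imputed mean: resample (y, r) pairs together
    and re-impute each bootstrap sample as the original sample was imputed."""
    n = len(y)
    est = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)            # with-replacement resample
        est[b] = imputed_mean(y[idx], r[idx])  # re-impute bootstrap nonrespondents
    return est.var(ddof=1)

# hypothetical sample: n = 200, uniform response probability 0.7
n = 200
y = rng.normal(50, 10, n)
r = rng.binomial(1, 0.7, n)                    # response indicator
print(imputed_mean(y, r), shao_sitter_variance(y, r))
```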
Abstract:
We consider model selection uncertainty in linear regression. We study theoretically and by simulation the approach of Buckland and co-workers, who proposed estimating a parameter common to all models under study by taking a weighted average over the models, using weights obtained from information criteria or the bootstrap. This approach is compared with the usual approach in which the 'best' model is used, and with Bayesian model averaging. The weighted predictor behaves similarly to model averaging, with generally more realistic mean-squared errors than the usual model-selection-based estimator.
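An illustrative sketch of the weighted-average estimator described above, using Akaike weights proportional to exp(-AIC/2) over two candidate linear models; the data and the candidate models are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=(2, n))
y = 1.0 + 0.5 * x1 + rng.normal(size=n)          # x2 is an irrelevant regressor

def ols_fit(X, y):
    """OLS coefficients and Gaussian-likelihood AIC."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n_obs, k = X.shape
    aic = n_obs * np.log(resid @ resid / n_obs) + 2 * k
    return beta, aic

intercept = np.ones(n)
models = [np.column_stack([intercept, x1]),       # model 1: intercept + x1
          np.column_stack([intercept, x1, x2])]   # model 2: intercept + x1 + x2

fits = [ols_fit(X, y) for X in models]
aics = np.array([aic for _, aic in fits])
w = np.exp(-0.5 * (aics - aics.min()))
w /= w.sum()                                      # model weights

# weighted estimate of the coefficient on x1, common to both candidate models
theta_hat = sum(wk * beta[1] for wk, (beta, _) in zip(w, fits))
print(w, theta_hat)
```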
Abstract:
Background: Dietary supplement use is widespread in the general American and Canadian populations, but little is known about dietary supplement consumption in the Canadian Indigenous population. Objective: The overall objective of this study is to account for dietary supplement use in the nutritional assessment of the dietary intakes of First Nations adults living on reserve in British Columbia and Manitoba. Design: Data were collected by the First Nations Food, Nutrition, and Environment Study from 1103 (British Columbia) and 706 (Manitoba) First Nations adults aged 19 to 70 years. The study used a 24-hour dietary recall (with a second recall for a subsample) to assess dietary intake. Use of dietary supplements and antacids was collected through a frequency questionnaire. Using the SIDE software to account for intra-individual variation in food intake and the bootstrap technique to obtain estimates representative of the different regions, supplemental intakes of vitamins A, D and C and of calcium were integrated into the estimates of dietary intake. Results: About 30% of First Nations adults in British Columbia and only 13.2% of First Nations adults in Manitoba aged 19-70 years living on reserve reported using at least one dietary supplement during the previous 30 days. When examining the nutrients of interest, a smaller percentage of the population used them, from 14.8 to 18.5% in British Columbia and from 4.9 to 8% of the Manitoba population. The prevalence of use of any dietary supplement was higher among women than men in all age groups and increased with age in both sexes. The highest prevalence of inadequate intake from food was observed for vitamin D and calcium in British Columbia and Manitoba, ranging from 75 to 100%, and for vitamin A in Manitoba (73-96%). After accounting for dietary supplement use, more than three quarters of participants still failed to meet the Estimated Average Requirement for these nutrients. Vitamin C was the nutrient with the lowest percentage below the Estimated Average Requirement (with or without supplements) for British Columbia and Manitoba. Conclusion: The majority of First Nations adults in British Columbia and Manitoba, even after accounting for dietary supplement use, had intakes of vitamins A and D and of calcium below recommended levels. Dietary supplement use did not contribute significantly to total intake of the selected nutrients, except for vitamin C in some age groups.
Abstract:
This study computed trends in extreme precipitation events in Florida for 1950-2010. Hourly aggregated rainfall data from 24 stations of the National Climatic Data Centre were analyzed to derive time series of extreme rainfalls for 12 durations, ranging from 1 hour to 7 days. The non-parametric Mann-Kendall test and the Theil-Sen approach were applied to detect the significance of trends in annual maximum rainfalls, the number of above-threshold events, and the average magnitude of above-threshold events for four common analysis periods. The trend-free pre-whitening (TFPW) approach was applied to remove serial correlations, and a bootstrap resampling approach was used to detect the field significance of trends. The results for annual maximum rainfall revealed dominant increasing trends at the 0.10 significance level, especially for hourly events in the longer period and daily events in the recent period. The number of above-threshold events exhibited strong decreasing trends for hourly durations in all time periods.
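As a concrete illustration of the trend tests named above, here is a minimal sketch of the Mann-Kendall S statistic (normal approximation, no tie correction) and the Theil-Sen slope, applied to a synthetic annual-maximum series; the TFPW and field-significance steps are omitted.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall S statistic and two-sided normal-approximation p-value."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - norm.cdf(abs(z)))

def theil_sen_slope(x):
    """Median of all pairwise slopes (trend magnitude per time step)."""
    n = len(x)
    return np.median([(x[j] - x[i]) / (j - i)
                      for i in range(n - 1) for j in range(i + 1, n)])

# synthetic annual-maximum rainfall series, 1950-2010
rng = np.random.default_rng(2)
years = np.arange(1950, 2011)
amax = 40 + 0.1 * (years - 1950) + rng.normal(0, 5, len(years))
print(mann_kendall(amax), theil_sen_slope(amax))
```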
Abstract:
The purpose of this study is to provide a comparative analysis of the efficiency of Islamic and conventional banks in Gulf Cooperation Council (GCC) countries. In this study, we explain the inefficiencies obtained by introducing firm-specific as well as macroeconomic variables. Our findings indicate that during the eight years of study, conventional banks largely outperform Islamic banks with an average technical efficiency score of 81% compared to 95.57%. However, it is clear that since 2008 the efficiency of conventional banks has been on a downward trend, while the efficiency of their Islamic counterparts has been on an upward trend since 2009. This indicates that Islamic banks succeeded in maintaining their level of efficiency during the subprime crisis period. Finally, for the whole sample, the analysis demonstrates the strong link between macroeconomic indicators and efficiency for GCC banks. Surprisingly, we have not found any significant relationship in the case of Islamic banks.
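The technical efficiency scores discussed above are typically obtained from data envelopment analysis (DEA); a hedged sketch of an input-oriented constant-returns (CCR) DEA, with made-up bank inputs and outputs and scipy's linprog, is shown below. It is not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])   # sum_j lam_j x_ij <= theta x_io
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # sum_j lam_j y_rj >= y_ro
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# hypothetical bank data: 2 inputs (deposits, staff), 1 output (loans)
X = np.array([[20., 300.], [30., 200.], [40., 100.], [20., 200.], [10., 400.]])
Y = np.array([[100.], [120.], [80.], [100.], [90.]])
print(dea_ccr_input(X, Y).round(3))
```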
Abstract:
This work project was conducted under a Direct Research Internship (DRI), which consists of an individual dissertation carried out for a given organization. The DRI follows a problem-solving format around an empirical question to be addressed: "Which country has the highest potential for the next step of XY's internationalization process?". To achieve the project's purpose, a scanning process using a top-down approach was conducted over an initial list of nine countries provided by XY. To do so, an international scanning framework based on different domains and weights was developed, which made it possible to identify the two countries with the highest potential. After an in-depth analysis of this final set, Switzerland was recommended as the best country for the next step of XY's internationalization in Europe.
Abstract:
Background. Obesity is considered a major public health issue in most developed countries nowadays. This paper provides an overview of current population data available in Spain and the approach taken to develop preventive strategies in the country. Methods. The review of available population data is based on individually measured weight and height as well as determinants. On this basis, the approach used in the country to develop preventive strategies is discussed. Results. According to the DORICA study, the prevalence of obesity (BMI ≥30 kg m−2) is 15.5% in Spanish adults aged 25–60 years (13.2% in men and 17.5% in women). Obesity rates are higher among women aged 45 years and older, of low social class, and living in semi-urban places. Population estimates for the prevalence of obesity in Spanish children and young people based on the enKid study are 13.9% for the whole group. In this study, overweight and obesity are related to absence of breastfeeding, low consumption of fruit and vegetables, high consumption of cakes, buns, soft drinks and butchery products, low physical activity levels, and a positive association with time spent watching TV. In 2005, the Spanish Ministry of Health, jointly with the Spanish Agency for Food Safety and Nutrition, launched the multifaceted NAOS strategy for nutrition, physical activity and the prevention of obesity. The important role of the family and the school setting, as well as the responsibility of the Health Administration and Pediatric Care in the prevention of obesity, is highlighted in the document. The need for environmental actions is recognised. The PERSEO programme, a multicomponent school-based intervention project, is part of the strategy currently in place. Conclusion. Obesity is a public health issue in Spain. A national multifaceted strategy was launched to counteract the problem. Environmental and policy actions are a priority. Young children and their families are among the main target groups.
Abstract:
This paper considers the estimation of the geographical scope of industrial location determinants. While previous studies impose strong assumptions on the weighting scheme of the spatial neighbour matrix, we propose a flexible parametrisation that allows for different (distance-based) definitions of neighbourhood and different weights for the neighbours. In particular, we estimate how far indirect marginal effects can reach and discuss how to report them. We also show that the use of smooth transition functions provides tools for policy analysis that are not available in traditional threshold modelling. Keywords: count data models, industrial location, smooth transition functions, threshold models. JEL-Codes: C25, C52, R11, R30.
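To make the contrast concrete, here is a small sketch (with arbitrary coordinates, cut-off c and smoothness gamma, none of them the paper's estimates) of a hard-threshold neighbour matrix versus a logistic smooth transition weighting of distances.

```python
import numpy as np

def _row_standardize(W):
    row = W.sum(axis=1, keepdims=True)
    return W / np.where(row > 0, row, 1.0)

def threshold_weights(D, c):
    """Hard threshold: j is a neighbour of i iff the distance is at most c."""
    W = (D <= c).astype(float)
    np.fill_diagonal(W, 0.0)
    return _row_standardize(W)

def smooth_transition_weights(D, c, gamma):
    """Logistic smooth transition: weights decay smoothly around distance c."""
    W = 1.0 / (1.0 + np.exp(gamma * (D - c)))
    np.fill_diagonal(W, 0.0)
    return _row_standardize(W)

# hypothetical coordinates of 6 locations
rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, size=(6, 2))
D = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances

print(threshold_weights(D, c=50.0).round(2))
print(smooth_transition_weights(D, c=50.0, gamma=0.1).round(2))
```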
Abstract:
The objective of this report is to provide Iowa county engineers and highway maintenance personnel with procedures that will allow them to efficiently and effectively interpret and repair or avoid landslides. The research provides an overview of basic slope stability analyses that can be used to diagnose the cause and effect associated with a slope failure. Field evidence for identifying active or potential slope stability problems is outlined. A survey of county engineers provided data for presenting a slope stability risk map for the state of Iowa. Areas of high risk are along the western border and the southeastern portion of the state. These regions contain deep to moderately deep loess. The central portion of the state is a low-risk area where the surficial soils are glacial till or thin loess over till. In this region, the landslides appear to occur predominantly in backslopes along deeply incised major rivers, such as the Des Moines River, or in foreslopes. The south-central portion of the state is an area of medium risk where failures are associated with steep backslopes and improperly compacted foreslopes. Soil shear strength data compiled from the Iowa DOT and consulting engineers' files are correlated with geologic parent materials, and mean values of shear strength parameters and unit weights were computed for glacial till, friable loess, plastic loess and local alluvium. Statistical tests demonstrate that friction angles and unit weights differ significantly, but in some cases effective stress cohesion intercept and undrained shear strength data do not. Moreover, effective stress cohesion intercept and undrained shear strength data show a high degree of variability. The shear strength and unit weight data are used in slope stability analyses for both drained and undrained conditions to generate curves that can be used for a preliminary evaluation of the relative stability of slopes within the four materials. Reconnaissance trips to over fifty active and repaired landslides in Iowa suggest that, in general, landslides in Iowa are relatively shallow [i.e., failure surfaces less than 6 ft (2 m) deep] and are either translational or shallow rotational. Two foreslope and two backslope failure case histories provide additional insights into slope stability problems and repair in Iowa. These include the observation that embankment soils compacted to less than 95% relative density show a marked strength decrease from soils at or above that density. Foreslopes constructed of soils derived from shale exhibit loss of strength as a result of weathering. In some situations, multiple causes of instability can be discerned from back analyses with the slope stability program XSTABL. In areas where the stratigraphy consists of loess over till or till over bedrock, the geologic contacts act as surfaces of groundwater accumulation that contribute to slope instability.
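For a rough sense of the kind of preliminary stability evaluation mentioned above, a sketch of the standard infinite-slope factor of safety is given below; the strength parameters, unit weight, and geometry are hypothetical, not the report's tabulated values or its XSTABL analyses.

```python
import numpy as np

def infinite_slope_fs(c_eff, phi_eff_deg, gamma, z, beta_deg, u=0.0):
    """Infinite-slope factor of safety (effective-stress form).
    c_eff: effective cohesion (psf), phi_eff_deg: effective friction angle (deg),
    gamma: unit weight (pcf), z: depth to failure surface (ft),
    beta_deg: slope angle (deg), u: pore pressure on the failure surface (psf)."""
    beta, phi = np.radians(beta_deg), np.radians(phi_eff_deg)
    resisting = c_eff + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

# hypothetical shallow failure surface 6 ft deep on a roughly 2.5H:1V (22 deg) slope
print(infinite_slope_fs(c_eff=100.0, phi_eff_deg=25.0, gamma=120.0,
                        z=6.0, beta_deg=22.0, u=0.0))
```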
Abstract:
This paper proposes a spatial filtering technique for the reception of pilot-aided multirate multicode direct-sequence code division multiple access (DS/CDMA) systems such as wideband CDMA (WCDMA). These systems introduce a code-multiplexed pilot sequence that can be used for the estimation of the filter weights, but the presence of the traffic signal (transmitted at the same time as the pilot sequence) corrupts that estimation and degrades the performance of the filter significantly. This is caused by the fact that although the traffic and pilot signals are usually designed to be orthogonal, the frequency selectivity of the channel degrades this orthogonality at the receiving end. Here, we propose a semi-blind technique that eliminates the self-noise caused by the code-multiplexing of the pilot. We derive analytically the asymptotic performance of both the training-only and the semi-blind techniques and compare them with the actual simulated performance. It is shown, both analytically and via simulation, that high gains can be achieved with respect to training-only based techniques.
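As a toy illustration of the training-only weight estimate against which such semi-blind techniques are compared, the sketch below computes MMSE weights w = R^{-1} p from sample statistics of a known pilot; the signal model is deliberately simplified (a single channel vector, Gaussian traffic) and is not the WCDMA setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# toy model: M-antenna receiver, known pilot, traffic acting as self-noise
M, N = 4, 500                                         # antennas, snapshots
a = rng.normal(size=M) + 1j * rng.normal(size=M)      # channel vector
pilot = rng.choice([-1.0, 1.0], size=N)               # known pilot symbols
traffic = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
noise = 0.3 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))
X = np.outer(a, pilot + traffic) + noise              # received snapshots

# training-only MMSE weights: w = R^{-1} p, using sample estimates
R = X @ X.conj().T / N                                # sample covariance
p = X @ pilot.conj() / N                              # correlation with the pilot
w = np.linalg.solve(R, p)

print(np.abs(w.conj() @ X - pilot).mean())            # residual at the filter output
```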
Abstract:
In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which include normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken's mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to provide further evidence on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods, (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption, and (iii) mean-variance efficiency of the market portfolio is rejected less frequently once the possibility of non-normal errors is allowed for.
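For the Gaussian case mentioned above, the Gibbons, Ross and Shanken statistic can be sketched as follows, with simulated excess returns; this is only the standard single-factor GRS F-test, not the paper's exact non-Gaussian procedures.

```python
import numpy as np
from scipy.stats import f

def grs_test(excess_returns, market_excess):
    """Single-factor GRS test: F statistic and p-value.
    excess_returns: (T, N) asset excess returns; market_excess: (T,)."""
    T, N = excess_returns.shape
    X = np.column_stack([np.ones(T), market_excess])
    B, *_ = np.linalg.lstsq(X, excess_returns, rcond=None)   # rows: alphas, betas
    alpha = B[0]
    resid = excess_returns - X @ B
    Sigma = resid.T @ resid / T                              # MLE residual covariance
    mu_m, var_m = market_excess.mean(), market_excess.var()  # MLE market moments
    stat = (T - N - 1) / N * (alpha @ np.linalg.solve(Sigma, alpha)) / (1 + mu_m ** 2 / var_m)
    return stat, 1 - f.cdf(stat, N, T - N - 1)

# simulated data: T = 120 months, N = 5 assets priced by the market factor
rng = np.random.default_rng(5)
T, N = 120, 5
mkt = rng.normal(0.005, 0.04, T)
betas = rng.uniform(0.5, 1.5, N)
R = np.outer(mkt, betas) + rng.normal(0, 0.02, (T, N))
print(grs_test(R, mkt))
```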
Abstract:
The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified asymptotically justified versions of the MMC method are also proposed and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
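A minimal sketch of the basic MC test p-value (Dwass/Barnard form, no nuisance parameters), with a toy null of N(0, 1) and a hypothetical statistic; the maximization over nuisance parameters of the MMC method is not shown.

```python
import numpy as np

rng = np.random.default_rng(6)

def mc_test_pvalue(stat_obs, simulate_stat, B=999):
    """Monte Carlo p-value: rank the observed statistic among B statistics
    simulated under the null hypothesis."""
    sims = np.array([simulate_stat(rng) for _ in range(B)])
    return (1 + np.sum(sims >= stat_obs)) / (B + 1)

# toy example: test H0: data ~ N(0, 1) using |sample mean| as the statistic
x = rng.normal(0.3, 1.0, 30)                          # hypothetical sample
stat_obs = abs(x.mean())
print(mc_test_pvalue(stat_obs, lambda g: abs(g.normal(0.0, 1.0, 30).mean())))
```

With B chosen so that (B + 1) times the nominal level is an integer, this p-value yields an exact test whatever the (nuisance-parameter-free) null distribution of the statistic.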