15 results for Fixed-effect estimator

in CentAUR: Central Archive University of Reading - UK


Relevance:

80.00%

Publisher:

Abstract:

Theoretical models suggest that decisions about diet, weight and health status are endogenous within a utility maximization framework. In this article, we model these behavioural relationships in a fixed-effect panel setting using a simultaneous equation system, with a view to determining whether economic variables can explain the trends in calorie consumption, obesity and health in Organization for Economic Cooperation and Development (OECD) countries and the large differences among the countries. The empirical model shows that progress in medical treatment and health expenditure mitigates mortality from diet-related diseases, despite rising obesity rates. While the model accounts for endogeneity and serial correlation, results are affected by data limitations.
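
A minimal sketch of the fixed-effect (within) estimator that this kind of panel analysis rests on, assuming a hypothetical country-year panel; the column names and toy numbers are illustrative, not the article's data, and the simultaneous equation system is not reproduced:

```python
# Within (fixed-effect) estimator: demean each variable by entity, then run OLS.
import numpy as np
import pandas as pd

def within_estimator(df, y, xs, entity):
    """Demean y and xs within each entity and run pooled OLS on the demeaned data."""
    cols = [y] + xs
    demeaned = df[cols] - df.groupby(entity)[cols].transform("mean")
    beta, *_ = np.linalg.lstsq(demeaned[xs].to_numpy(),
                               demeaned[y].to_numpy(), rcond=None)
    return dict(zip(xs, beta))  # the intercept is absorbed by the country fixed effects

# Toy panel: 3 countries observed for 4 years each (hypothetical values)
df = pd.DataFrame({
    "country":    ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "calories":   [2900, 2950, 3000, 3050, 3300, 3350, 3380, 3420, 2600, 2650, 2700, 2720],
    "income":     [30, 31, 32, 33, 45, 46, 47, 48, 20, 21, 22, 23],
    "food_price": [1.0, 1.0, 0.9, 0.9, 1.2, 1.1, 1.1, 1.0, 0.8, 0.8, 0.9, 0.9],
})
print(within_estimator(df, "calories", ["income", "food_price"], "country"))
```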

Relevance:

80.00%

Publisher:

Abstract:

This paper evaluates the extent to which the performance of English Premier League football club managers can be attributed to skill or luck when measured separately from the characteristics of the team. We first use a specification that models managerial skill as a fixed effect, examining the relationship between the number of points earned in league matches and the club's wage bill, transfer spending, and the extent to which the club was affected by players absent through injury, suspension or unavailability. We next implement a bootstrapping approach to generate a simulated distribution of the average points that could have been earned once the impact of the manager has been removed. The findings suggest that there are a considerable number of highly skilled managers but also several who perform below expectations. The paper proceeds to illustrate how the approach adopted could be used to determine the optimal time for a club to part company with its manager. We are able to identify in advance several managers whom the analysis suggests could have been fired earlier, and others whose sackings were hard to justify on the basis of their performances.
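
A hedged sketch of the bootstrapping step described above, under simplifying assumptions (a single linear points equation and hypothetical covariates and data, not the paper's specification): regress points per match on club resources, then resample residuals to simulate seasons with the manager's contribution removed.

```python
# Bootstrap idea: fit points per match on club covariates, then resample residuals
# to simulate "manager-neutral" season averages and compare them with the actual average.
import numpy as np

rng = np.random.default_rng(0)

def manager_luck_pvalue(points, X, manager_points, n_boot=10_000):
    """Share of simulated season averages that match or exceed the manager's actual average."""
    X1 = np.column_stack([np.ones(len(X)), X])          # add intercept
    beta, *_ = np.linalg.lstsq(X1, points, rcond=None)
    fitted = X1 @ beta
    resid = points - fitted
    sims = np.empty(n_boot)
    for b in range(n_boot):
        sims[b] = np.mean(fitted + rng.choice(resid, size=len(resid), replace=True))
    return np.mean(sims >= manager_points.mean())

# Toy example: 38 matches, covariates = wage bill and injury burden (standardised, hypothetical)
X = rng.normal(size=(38, 2))
points = 1.4 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(scale=0.8, size=38)
print(manager_luck_pvalue(points, X, manager_points=points))
```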

Relevance:

80.00%

Publisher:

Abstract:

There are now many reports of imaging experiments with small cohorts of typical participants that precede large-scale, often multicentre studies of psychiatric and neurological disorders. Data from these calibration experiments are sufficient to make estimates of statistical power and predictions of sample size and minimum observable effect sizes. In this technical note, we suggest how previously reported voxel-based power calculations can support decision making in the design, execution and analysis of cross-sectional multicentre imaging studies. The choice of MRI acquisition sequence, distribution of recruitment across acquisition centres, and changes to the registration method applied during data analysis are considered as examples. The consequences of modification are explored in quantitative terms by assessing the impact on sample size for a fixed effect size and detectable effect size for a fixed sample size. The calibration experiment dataset used for illustration was a precursor to the now complete Medical Research Council Autism Imaging Multicentre Study (MRC-AIMS). Validation of the voxel-based power calculations is made by comparing the predicted values from the calibration experiment with those observed in MRC-AIMS. The effect of non-linear mappings during image registration to a standard stereotactic space on the prediction is explored with reference to the amount of local deformation. In summary, power calculations offer a validated, quantitative means of making informed choices on important factors that influence the outcome of studies that consume significant resources.
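
For orientation only, a generic two-group power calculation relating sample size and detectable effect size; this is the textbook normal-approximation formula, not the voxel-based procedure the note describes:

```python
# Generic two-sample power calculation used as a stand-in for the voxel-based approach.
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Sample size per group for a two-sided two-sample comparison of means."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return 2 * ((z_a + z_b) / effect_size) ** 2

def detectable_effect(n, alpha=0.05, power=0.8):
    """Smallest standardised effect detectable with n participants per group."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return (z_a + z_b) * (2 / n) ** 0.5

print(round(n_per_group(0.5)), round(detectable_effect(64), 2))   # ~63 per group; d ~ 0.50
```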

Relevance:

80.00%

Publisher:

Abstract:

Real estate securities have a number of distinct characteristics that differentiate them from stocks generally, key among them that the underlying firms hold both real and investment assets. The connection between the underlying macro-economy and listed real estate firms is therefore clear and of heightened importance. To consider the linkages with the underlying macro-economic fundamentals, we extract the ‘low-frequency’ volatility component from aggregate volatility shocks in 11 international markets over the 1990-2014 period, using Engle and Rangel’s (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effect pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong and positive association with most of the macroeconomic risk proxies examined, including interest rates, inflation, GDP and foreign exchange rates.
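
The Spline-GARCH decomposition itself is not reproduced here; as a crude stand-in, the sketch below extracts a slow-moving volatility proxy from daily returns with a one-year rolling window (hypothetical data), which is the kind of low-frequency series that would then enter the fixed-effect pooled regression.

```python
# Crude stand-in for a low-frequency volatility component: an annualised rolling
# standard deviation of daily returns (not the paper's Spline-GARCH estimator).
import numpy as np
import pandas as pd

def low_frequency_vol(returns: pd.Series, window: int = 252) -> pd.Series:
    """Slow-moving volatility proxy: rolling one-year std of daily returns, annualised."""
    return returns.rolling(window).std() * np.sqrt(252)

# Toy usage with simulated daily returns for one listed real estate market
rng = np.random.default_rng(1)
returns = pd.Series(rng.normal(scale=0.01, size=2_000))
print(low_frequency_vol(returns).dropna().tail())
```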

Relevance:

80.00%

Publisher:

Abstract:

This paper uses a panel-data fixed-effect approach and data collected from Chinese public manufacturing firms between 1999 and 2011 to investigate the impact of business life cycle stages on capital structure. We find that cash flow patterns capture more information on business life cycle stages than firm age does and have a stronger impact on capital structure decision-making. We also find that the adjustment speed of capital structure varies significantly across life cycle stages and that non-sequential transitions between life cycle stages play an important role in the determination of capital structure. Our study indicates that it is important for policy-makers to ensure that product and financial markets are well balanced.
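
Adjustment speed in this literature is commonly estimated from a partial-adjustment regression of leverage on its own lag; the sketch below shows that standard approach with simulated data (it is an assumption, not confirmed by the abstract, that the paper uses this exact specification).

```python
# Partial-adjustment sketch: lev_t = (1 - lam) * lev_{t-1} + lam * target_t + e_t,
# so the adjustment speed lam is one minus the coefficient on lagged leverage.
import numpy as np

def adjustment_speed(lev, lev_lag, controls):
    """Estimate lam = 1 - coefficient on lagged leverage from pooled OLS."""
    X = np.column_stack([np.ones(len(lev)), lev_lag, controls])
    beta, *_ = np.linalg.lstsq(X, lev, rcond=None)
    return 1.0 - beta[1]

# Toy data with a true speed of 0.35; controls stand in for hypothetical target drivers
rng = np.random.default_rng(2)
n = 500
controls = rng.normal(size=(n, 2))
target = 0.4 + controls @ np.array([0.05, 0.02])
lev_lag = rng.uniform(0.1, 0.7, size=n)
lev = 0.65 * lev_lag + 0.35 * target + rng.normal(scale=0.02, size=n)
print(round(adjustment_speed(lev, lev_lag, controls), 2))
```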

Relevance:

30.00%

Publisher:

Abstract:

Understanding links between the El Niño-Southern Oscillation (ENSO) and snow would be useful for seasonal forecasting, but also for understanding natural variability and interpreting climate change predictions. Here, a 545-year run of the general circulation model HadCM3, with prescribed external forcings and fixed greenhouse gas concentrations, is used to explore the impact of ENSO on snow water equivalent (SWE) anomalies. In North America, positive ENSO events reduce the mean SWE and skew the distribution towards lower values, and vice versa during negative ENSO events. This is associated with a dipole SWE anomaly structure, with anomalies of opposite sign centered in western Canada and the central United States. In Eurasia, warm episodes lead to a more positively skewed distribution and a higher mean SWE; again, the opposite effect is seen during cold episodes. In Eurasia the largest anomalies are concentrated in the Himalayas. These correlations with the February SWE distribution are present from the previous June-July-August (JJA) ENSO index onwards, and are weakly detected in 50-year subsections of the control run, but only a shifted North American response can be detected in the analysis of 40 years of ERA40 reanalysis data. The ENSO signal in SWE from the long run could still contribute to regional predictions, although it would be a weak indicator only.

Relevance:

30.00%

Publisher:

Abstract:

Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers belonging to another population that contaminate the primary process. This paper, the first of two, examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; the second paper examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry, but this advice is based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters were investigated. Simulated annealing was used to create normally distributed random fields of different sizes from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry, and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger data sets. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than those from the raw asymmetric data. The results of this study have implications for 'standard best practice' in dealing with asymmetry in data for geostatistical analyses.
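
For reference, Matheron's classical estimator mentioned above, gamma(h) = sum over pairs separated by lag h of the squared differences, divided by 2 N(h), sketched for a one-dimensional transect with hypothetical data:

```python
# Matheron's classical (method-of-moments) variogram estimator for 1-D data.
import numpy as np

def matheron_variogram(x, z, lags, tol):
    """Experimental semivariance at each lag, pairing points whose separation is within tol."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    d = np.abs(x[:, None] - x[None, :])            # pairwise separations
    sq = (z[:, None] - z[None, :]) ** 2            # pairwise squared differences
    gam = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)  # each pair counted once
        gam.append(sq[mask].mean() / 2 if mask.any() else np.nan)
    return np.array(gam)

# Toy transect of 100 regularly spaced observations
rng = np.random.default_rng(3)
x = np.arange(100.0)
z = np.cumsum(rng.normal(size=100))                # spatially correlated toy process
print(matheron_variogram(x, z, lags=[1, 2, 5, 10], tol=0.5))
```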

Relevance:

30.00%

Publisher:

Abstract:

Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers belonging to another population that contaminate the primary process. The first paper of this series examined the effects of the former on the variogram; this paper examines the effects of asymmetry arising from outliers. Simulated annealing was used to create normally distributed random fields of different sizes that are realizations of known processes described by variograms with different nugget:sill ratios. These primary data sets were then contaminated with randomly located and spatially aggregated outliers from a secondary process to produce different degrees of asymmetry. Experimental variograms were computed from these data by Matheron's estimator and by three robust estimators. The effects of standard data transformations on the coefficient of skewness and on the variogram were also investigated. Cross-validation was used to assess the performance, for kriging, of models fitted to experimental variograms computed from a range of data contaminated by outliers. The results showed that where skewness was caused by outliers the variograms retained their general shape but showed an increase in the nugget and sill variances and in the nugget:sill ratios. This effect was only slightly greater for the smallest data set than for the two larger data sets, and there was little difference between the results for the latter; overall, the effect of the size of the data set was small for all analyses. The nugget:sill ratio showed a consistent decrease after transformation to both square roots and logarithms, with a generally larger decrease for the latter. Aggregated outliers had different effects on the variogram shape from those that were randomly located, and this also depended on whether they were aggregated near to the edge or the centre of the field. The results of cross-validation showed that the robust estimators and the removal of outliers were the most effective ways of dealing with outliers for variogram estimation and kriging.
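
The abstract does not name the three robust estimators used; one widely used choice, the Cressie and Hawkins (1980) estimator, is sketched below for illustration (the 0.457 + 0.494/N(h) denominator is the commonly quoted bias correction).

```python
# Cressie-Hawkins robust variogram estimator:
# 2*gamma(h) = (mean |dz|^0.5 over pairs at lag h)^4 / (0.457 + 0.494 / N(h)).
import numpy as np

def cressie_hawkins(x, z, lags, tol):
    """Robust experimental semivariance at each lag for 1-D data."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    d = np.abs(x[:, None] - x[None, :])
    dz = np.abs(z[:, None] - z[None, :]) ** 0.5
    gam = []
    for h in lags:
        mask = np.triu(np.abs(d - h) <= tol, k=1)
        n = mask.sum()
        if n == 0:
            gam.append(np.nan)
            continue
        gam.append(0.5 * dz[mask].mean() ** 4 / (0.457 + 0.494 / n))
    return np.array(gam)

# Usage mirrors the Matheron sketch above, with a few injected outliers
rng = np.random.default_rng(4)
x = np.arange(100.0)
z = np.cumsum(rng.normal(size=100))
z[::25] += 8.0
print(cressie_hawkins(x, z, lags=[1, 2, 5, 10], tol=0.5))
```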

Relevance:

30.00%

Publisher:

Abstract:

We consider the case of a multicenter trial in which the center-specific sample sizes are potentially small. Under homogeneity, the conventional procedure is to pool information using a weighted estimator whose weights are the inverses of the estimated center-specific variances. Whereas this procedure is efficient under conventional asymptotics (e.g. center-specific sample sizes become large, number of centers fixed), it is commonly believed that its efficiency also holds under meta-analytic asymptotics (e.g. center-specific sample sizes bounded, potentially small, and number of centers large). In this contribution we demonstrate that this estimator fails to be efficient. In fact, it shows a persistent bias as the number of centers increases, showing that it is not meta-consistent. In addition, we show that the Cochran and Mantel-Haenszel weighted estimators are meta-consistent and, more generally, provide conditions on the weights under which the associated weighted estimator is meta-consistent.
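
A small simulation along these lines, contrasting the inverse-variance weighted estimator with the Mantel-Haenszel estimator of a common odds ratio when there are many small centers; illustrative only, as the exact data-generating setup is not taken from the paper.

```python
# Many small centers ("meta-analytic asymptotics"): compare inverse-variance and
# Mantel-Haenszel pooling of a common odds ratio. 0.5 continuity corrections keep
# the inverse-variance estimator defined when cells are zero.
import numpy as np

rng = np.random.default_rng(5)
n_centers, n_arm, true_or = 400, 10, 2.0

p0 = rng.uniform(0.2, 0.5, n_centers)                 # control event probabilities
p1 = true_or * p0 / (1 - p0 + true_or * p0)           # treatment probabilities for OR = 2
A = rng.binomial(n_arm, p1).astype(float); B = n_arm - A   # treatment events / non-events
C = rng.binomial(n_arm, p0).astype(float); D = n_arm - C   # control events / non-events

# Inverse-variance weighted log odds ratio (with 0.5 continuity corrections)
Ac, Bc, Cc, Dc = A + 0.5, B + 0.5, C + 0.5, D + 0.5
log_or = np.log(Ac * Dc / (Bc * Cc))
w = 1.0 / (1 / Ac + 1 / Bc + 1 / Cc + 1 / Dc)
iv = np.sum(w * log_or) / np.sum(w)

# Mantel-Haenszel common odds ratio
n = A + B + C + D
mh = np.log(np.sum(A * D / n) / np.sum(B * C / n))

print(f"true={np.log(true_or):.3f}  inverse-variance={iv:.3f}  Mantel-Haenszel={mh:.3f}")
```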

Relevance:

30.00%

Publisher:

Abstract:

An experiment was undertaken to investigate the effect of milk fat level (0%, 2.5% and 5.0% w/w) and gel firmness at cutting (5, 35 and 65 Pa) on indices of syneresis while the curd was undergoing stirring. The curd moisture content, yield of whey, fat in whey and casein fines in whey were measured at fixed intervals between 5 and 75 min after cutting the gel. The casein level in milk and the clotting conditions were kept constant in all trials. The trials were carried out using recombined whole milk in an 11 L cheese vat. The fat level in milk had a large negative effect on the yield of whey. A clear effect of gel firmness on casein fines was observed. The best overall prediction, in terms of the coefficient of determination, was for curd moisture content using milk fat concentration, time after gel cutting and set-to-cut time (R² = 0.95).

Relevance:

30.00%

Publisher:

Abstract:

This article reports on a detailed empirical study of the way narrative task design influences the oral performance of second-language (L2) learners. Building on previous research findings, two dimensions of narrative design were chosen for investigation: narrative complexity and inherent narrative structure. Narrative complexity refers to the presence of simultaneous storylines; in this case, we compared single-story narratives with dual-story narratives. Inherent narrative structure refers to the order of events in a narrative; we compared narratives where this was fixed to others where the events could be reordered without loss of coherence. Additionally, we explored the influence of learning context on performance by gathering data from two comparable groups of participants: 60 learners in a foreign language context in Teheran and 40 in an L2 context in London. All participants recounted two of four narratives from cartoon picture prompts, giving a between-subjects design for narrative complexity and a within-subjects design for inherent narrative structure. The results show clearly that for both groups, L2 performance was affected by the design of the task: syntactic complexity was supported by narrative storyline complexity, and grammatical accuracy was supported by an inherently fixed narrative structure. We reason that the task of recounting simultaneous events leads learners into attempting more hypotactic language, such as subordinate clauses introduced by, for example, while, although, or at the same time as. We reason also that a tight narrative structure allows learners to achieve greater accuracy in the L2 (within minutes of performing less accurately on a loosely structured narrative) because the tight ordering of events releases attentional resources that would otherwise be spent on finding connections between the pictures. The learning context was shown to have no effect on either accuracy or fluency but an unexpectedly clear effect on syntactic complexity and lexical diversity. The learners in London seem to have benefited from being in the target language environment by developing not more accurate grammar but a more diverse resource of English words and syntactic choices. In a companion article (Foster & Tavakoli, 2009) we compared their performance with native-speaker baseline data and showed that, in terms of nativelike selection of vocabulary and phrasing, the learners in London are closing in on native-speaker norms. The study provides empirical evidence that L2 performance is affected by task design in predictable ways. It also shows that living within the target language environment, and presumably using the L2 in a host of everyday tasks outside the classroom, confers a distinct lexical advantage, not a grammatical one.

Relevance:

30.00%

Publisher:

Abstract:

An analysis of the attribution of past and future changes in stratospheric ozone and temperature to anthropogenic forcings is presented. The analysis is an extension of the study of Shepherd and Jonsson (2008), who analyzed chemistry-climate simulations from the Canadian Middle Atmosphere Model (CMAM) and attributed both past and future changes to changes in the external forcings, i.e. the abundances of ozone-depleting substances (ODS) and well-mixed greenhouse gases. The current study is based on a new CMAM dataset and includes two important changes. First, we account for the nonlinear radiative response to changes in CO2. It is shown that over centennial time scales the radiative response in the upper stratosphere to CO2 changes is significantly nonlinear and that failure to account for this effect leads to a significant error in the attribution. To our knowledge this nonlinearity has not been considered before in attribution analysis, including multiple linear regression studies. For the regression analysis presented here, the nonlinearity was taken into account by using the CO2 heating rate, rather than CO2 abundance, as the explanatory variable. This approach yields considerable corrections to the results of the previous study and can be recommended to other researchers. Second, an error in the way the CO2 forcing changes are implemented in the CMAM was corrected, which significantly affects the results for the recent past. As the radiation scheme, based on Fomichev et al. (1998), is used in several other models, we provide some description of the problem and how it was fixed.
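
Schematically, and using illustrative notation rather than the paper's, the change amounts to replacing CO2 abundance by the CO2 heating rate as the explanatory variable in the regression:

$$\Delta T(t) = \alpha + \beta_{\mathrm{ODS}}\,\mathrm{ODS}(t) + \beta_{Q}\,Q_{\mathrm{CO_2}}(t) + \varepsilon(t),$$

where $Q_{\mathrm{CO_2}}$ is the CO2 heating rate, so the nonlinear dependence of the radiative response on CO2 abundance is absorbed into the regressor itself rather than forced onto a linear coefficient.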

Relevance:

30.00%

Publisher:

Abstract:

In this paper we introduce a new testing procedure for evaluating the rationality of fixed-event forecasts based on a pseudo-maximum likelihood estimator. The procedure is designed to be robust to departures from the normality assumption. A model is introduced to show that such departures are likely when forecasters experience a credibility loss if they make large changes to their forecasts. The test is illustrated using monthly fixed-event forecasts produced by four UK institutions. Use of the robust test leads to the conclusion that certain forecasts are rational, while use of the Gaussian-based test implies that certain forecasts are irrational. The difference in the results is due to the nature of the underlying data.
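
The paper's pseudo-maximum likelihood test is not reproduced here; as background, a common rationality check for fixed-event forecasts (a Nordhaus-type weak-efficiency test, in which forecast revisions should be unpredictable) can be sketched as follows with hypothetical data.

```python
# Weak-efficiency check for fixed-event forecasts: regress the current revision on
# the previous revision with heteroskedasticity-robust standard errors. This is a
# standard benchmark, not the paper's pseudo-maximum-likelihood procedure.
import numpy as np
import statsmodels.api as sm

def revision_efficiency_test(forecasts):
    """Return the slope on the lagged revision and its robust p-value."""
    rev = np.diff(np.asarray(forecasts, float))       # monthly revisions of the fixed-event forecast
    y, x = rev[1:], rev[:-1]
    res = sm.OLS(y, sm.add_constant(x)).fit(cov_type="HC1")
    return res.params[1], res.pvalues[1]

# Toy sequence of monthly forecasts for a single fixed event
rng = np.random.default_rng(6)
forecasts = 2.0 + np.cumsum(rng.normal(scale=0.1, size=24))
print(revision_efficiency_test(forecasts))
```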

Relevance:

30.00%

Publisher:

Abstract:

We apply a new parameterisation of the Greenland ice sheet (GrIS) feedback between surface mass balance (SMB: the sum of surface accumulation and surface ablation) and surface elevation in the MAR regional climate model (Edwards et al., 2014) to projections of future climate change using five ice sheet models (ISMs). The MAR (Modèle Atmosphérique Régional: Fettweis, 2007) climate projections are for 2000–2199, forced by the ECHAM5 and HadCM3 global climate models (GCMs) under the SRES A1B emissions scenario. The additional sea level contribution due to the SMB–elevation feedback averaged over five ISM projections for ECHAM5 and three for HadCM3 is 4.3% (best estimate; 95% credibility interval 1.8–6.9%) at 2100, and 9.6% (best estimate; 95% credibility interval 3.6–16.0%) at 2200. In all results the elevation feedback is significantly positive, amplifying the GrIS sea level contribution relative to the MAR projections in which the ice sheet topography is fixed: the lower bounds of our 95% credibility intervals (CIs) for sea level contributions are larger than the “no feedback” case for all ISMs and GCMs. Our method is novel in sea level projections because we propagate three types of modelling uncertainty – GCM and ISM structural uncertainties, and elevation feedback parameterisation uncertainty – along the causal chain, from SRES scenario to sea level, within a coherent experimental design and statistical framework. The relative contributions to uncertainty depend on the timescale of interest. At 2100, the GCM uncertainty is largest, but by 2200 both the ISM and parameterisation uncertainties are larger. We also perform a perturbed parameter ensemble with one ISM to estimate the shape of the projected sea level probability distribution; our results indicate that the probability density is slightly skewed towards higher sea level contributions.

Relevance:

30.00%

Publisher:

Abstract:

Initializing the ocean for decadal predictability studies is a challenge, as it requires reconstructing the little-observed subsurface trajectory of ocean variability. In this study we explore to what extent surface nudging using well-observed sea surface temperature (SST) can reconstruct the deeper ocean variations for the 1949–2005 period. An ensemble is made with a nudged version of the IPSLCM5A model and compared to ocean reanalyses and reconstructed datasets. The SST is restored to observations using a physically based relaxation coefficient, in contrast to earlier studies, which use a much larger value. The assessment is restricted to the regions where the ocean reanalyses agree, i.e. in the upper 500 m of the ocean, although this can be latitude and basin dependent. Significant reconstruction of the subsurface is achieved in specific regions, namely regions of subduction in the subtropical Atlantic, below the thermocline in the equatorial Pacific and, in some cases, in the North Atlantic deep convection regions. Beyond the mean correlations, ocean integrals are used to explore the time evolution of the correlation over 20-year windows. Classical fixed-depth heat content diagnostics do not exhibit any significant reconstruction between the different existing observation-based references and can therefore not be used to assess global average time-varying correlations in the nudged simulations. Using the physically based average temperature above an isotherm (14 °C) alleviates this issue in the tropics and subtropics and shows significant reconstruction of these quantities in the nudged simulations for several decades. This skill is attributed to the wind stress reconstruction in the tropics, as already demonstrated in a perfect model study using the same model. Thus, we also show here the robustness of this result in a historical and observational context.
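
The "average temperature above an isotherm" diagnostic mentioned above can be computed from a temperature-depth profile as sketched below; the profile, depth grid and handling of the 14 °C threshold are illustrative, and real usage would loop over model grid columns.

```python
# Depth-weighted mean temperature of the layer warmer than a chosen isotherm.
import numpy as np

def mean_temp_above_isotherm(temp, depth, iso=14.0):
    """Mean temperature over the layer where temp > iso (degC), weighted by layer thickness."""
    temp, depth = np.asarray(temp, float), np.asarray(depth, float)
    thickness = np.gradient(depth)                 # layer thickness from the depth grid
    mask = temp > iso
    if not mask.any():
        return np.nan
    return np.sum(temp[mask] * thickness[mask]) / np.sum(thickness[mask])

# Toy profile: warm surface layer cooling linearly with depth
depth = np.arange(0.0, 500.0, 10.0)
temp = 26.0 - 0.04 * depth
print(round(mean_temp_above_isotherm(temp, depth), 2))
```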