146 results for empirical likelihood

in CentAUR: Central Archive, University of Reading - UK


Relevance: 30.00%

Abstract:

We examine whether and under what circumstances World Bank and International Monetary Fund (IMF) programs affect the likelihood of major government crises. We find that crises are, on average, more likely as a consequence of World Bank programs. We also find that governments face an increasing risk of entering a crisis when they remain under an IMF or World Bank arrangement once the economy's performance improves. The international financial institution's (IFI) scapegoat function thus seems to lose its value when the need for financial support is less urgent. While the probability of a crisis increases when a government turns to the IFIs, programs inherited from preceding governments do not affect the probability of a crisis. This is in line with two interpretations. First, the conclusion of IFI programs can signal the government's incompetence, and second, governments that inherit programs might be less likely to implement program conditions agreed to by their predecessors.

Relevance: 20.00%

Abstract:

We describe the use of bivariate 3d empirical orthogonal functions (EOFs) in characterising low frequency variability of the Atlantic thermohaline circulation (THC) in the Hadley Centre global climate model, HadCM3. We find that the leading two modes are well correlated with an index of the meridional overturning circulation (MOC) on decadal timescales, with the leading mode alone accounting for 54% of the decadal variance. Episodes of coherent oscillations in the sub-space of the leading EOFs are identified; these episodes are of great interest for the predictability of the THC, and could indicate the existence of different regimes of natural variability. The mechanism identified for the multi-decadal variability is an internal ocean mode, dominated by changes in convection in the Nordic Seas, which lead the changes in the MOC by a few years. Variations in salinity transports from the Arctic and from the North Atlantic are the main feedbacks which control the oscillation. This mode has a weak feedback onto the atmosphere and hence a surface climatic influence. Interestingly, some of these climate impacts lead the changes in the overturning. There are also similarities to observed multi-decadal climate variability.
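
Computationally, the EOF analysis described above amounts to a principal component analysis of the space-time anomaly matrix, conveniently obtained from a singular value decomposition. A minimal sketch on synthetic data (the oscillating field below is illustrative, not HadCM3 output):

```python
import numpy as np

def eof_analysis(field):
    """EOFs of a (time, space) data matrix via SVD.

    Returns spatial patterns (EOFs), principal-component time series,
    and the fraction of variance explained by each mode.
    """
    anom = field - field.mean(axis=0)            # remove the time mean
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    variance_fraction = s**2 / np.sum(s**2)
    pcs = u * s          # PC time series, one column per mode
    eofs = vt            # spatial patterns, one row per mode
    return eofs, pcs, variance_fraction

# Synthetic field: one oscillating spatial pattern plus weak noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)
pattern = np.sin(np.linspace(0.0, np.pi, 50))
field = (np.outer(np.sin(2.0 * np.pi * t), pattern)
         + 0.1 * rng.standard_normal((200, 50)))
eofs, pcs, vf = eof_analysis(field)
print(f"leading mode explains {vf[0]:.0%} of the variance")
```

As with the MOC index in the abstract, a single leading mode can account for much of the low-frequency variance when the variability is dominated by one pattern.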

Relevance: 20.00%

Abstract:

Current research agendas are increasingly encouraging the construction industry to operate on the basis of 'added value'. Such debates echo the established concept of 'high value manufacturing' and associated trends towards servitization. Within construction, the so-called 'value agenda' draws heavily from the notion of integrated solutions. This is held to be especially appropriate in the context of PFI projects. Also relevant is the concept of service-led projects whereby the project rationale is driven by the client's objectives for delivering an enhanced service to its own customers. Such ideas are contextualized by a consideration of broader trends of privatization and outsourcing within and across the construction industry's client base. The current emphasis on integrated solutions reflects long-term trends within privatized client organizations towards the outsourcing of asset management capabilities. However, such trends are by no means uniform or consistent. An in-depth case study of three operating divisions within a major construction company illustrates that firms are unlikely to reorientate their business in response to the 'value agenda'. In the case of PFI, the tendency has been to establish specialist units for the purposes of winning work. Meanwhile, institutionally embedded operating routines within the rest of the business remain broadly unaffected.

Relevance: 20.00%

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage, sampling with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and the analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
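
The accumulation of variance components into a rough variogram, as described above, is a cumulative sum over stages ordered from the shortest lag upwards. A minimal sketch with hypothetical components (the distances and values below are illustrative, not the survey data from the paper):

```python
import numpy as np

# Hypothetical variance components from a hierarchical analysis of
# variance of a four-stage nested survey; separating distances
# increase in geometric progression from stage to stage.
lags = np.array([10.0, 30.0, 90.0, 270.0])    # separating distances (m)
components = np.array([0.8, 1.5, 2.1, 0.6])   # variance component per stage

# Accumulating the components from the shortest lag upwards gives the
# rough variogram: gamma(h_k) = sum of components for lags <= h_k.
semivariance = np.cumsum(components)

for h, g in zip(lags, semivariance):
    print(f"lag {h:6.1f} m -> semivariance {g:.2f}")
```

With non-negative components the accumulated variogram is monotonically non-decreasing, rising towards the total variance at the longest lag.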

Relevance: 20.00%

Abstract:

It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites at an appropriate interval apart to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data because it is based on generalized increments that filter trend out and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. There remain issues to examine: how many fewer data can be used, how should the sampling sites be distributed over the site of interest, and how do different degrees of spatial variation affect the data requirements? The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites. A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites. (C) 2007 Elsevier B.V. All rights reserved.
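
The method-of-moments estimator referred to above is commonly Matheron's estimator: half the average squared difference between pairs of observations falling in each lag bin. A one-dimensional sketch on synthetic data (the transect and "clay content" values are illustrative, not the four field sites):

```python
import numpy as np

def mom_variogram(x, z, lag_width, n_lags):
    """Matheron's method-of-moments variogram estimator in 1-D:
    gamma(h) = sum over pairs at lag h of (z_i - z_j)^2 / (2 N(h)).
    """
    centers = (np.arange(n_lags) + 0.5) * lag_width
    gamma = np.zeros(n_lags)
    counts = np.zeros(n_lags, dtype=int)
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            k = int(abs(x[i] - x[j]) // lag_width)   # lag bin index
            if k < n_lags:
                gamma[k] += (z[i] - z[j]) ** 2
                counts[k] += 1
    nonzero = counts > 0
    gamma[nonzero] /= 2.0 * counts[nonzero]
    return centers, gamma, counts

# Synthetic autocorrelated "clay content" along a 100 m transect
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 100.0, 120))
z = np.sin(x / 15.0) + 0.2 * rng.standard_normal(x.size)
centers, gamma, counts = mom_variogram(x, z, lag_width=5.0, n_lags=10)
```

The estimator needs many pairs per bin to be stable, which is one reason for the rule of thumb of around 100 sites; with sparser data the bin averages become erratic.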

Relevance: 20.00%

Abstract:

Annual total phosphorus (TP) export data from 108 European micro-catchments were analyzed against descriptive catchment data on climate (runoff), soil types, catchment size, and land use. The best empirical model developed included runoff, proportion of agricultural land and catchment size as explanatory variables, but it explained little of the variance in the dataset (R² = 0.37). Improved country-specific empirical models could be developed in some cases. The best example was from Norway, where an analysis of TP export data from 12 predominantly agricultural micro-catchments revealed a relationship explaining 96% of the variance in TP export. The explanatory variables were in this case soil P status (P-AL), proportion of organic soil, and the export of suspended sediment. Another example is from Denmark, where an empirical model was established for the annual average TP export from 24 catchments with percentage sandy soils, percentage organic soils, runoff, and application of phosphorus in fertilizer and animal manure as explanatory variables (R² = 0.97).
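
An empirical export model of this kind is an ordinary least-squares regression of TP export on catchment descriptors. A sketch on synthetic data (the coefficients, units and variable ranges below are invented for illustration, not fitted to the European, Norwegian or Danish datasets):

```python
import numpy as np

# Synthetic catchment descriptors for 108 micro-catchments
rng = np.random.default_rng(2)
n = 108
runoff = rng.uniform(200.0, 1200.0, n)   # annual runoff (mm)
agri = rng.uniform(0.0, 100.0, n)        # agricultural land (%)
area = rng.uniform(0.1, 10.0, n)         # catchment size (km^2)

# Generate TP export from an assumed linear model plus noise
tp = (0.05 + 0.0004 * runoff + 0.004 * agri - 0.01 * area
      + rng.normal(0.0, 0.1, n))

# Fit by ordinary least squares and compute R^2
X = np.column_stack([np.ones(n), runoff, agri, area])
beta, *_ = np.linalg.lstsq(X, tp, rcond=None)
resid = tp - X @ beta
r2 = 1.0 - resid.var() / tp.var()
print(f"R^2 = {r2:.2f}")
```

A low R², such as the 0.37 reported for the pooled European data, indicates that most of the between-catchment variance is driven by factors outside the model, which is consistent with the country-specific models with additional soil variables performing much better.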

Relevance: 20.00%

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
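
A nested design of this kind can be sketched in one dimension: each stage doubles the number of sampling points by pairing every existing point with a companion at that stage's separating distance. The four distances below, in geometric progression, are illustrative assumptions, not the layout of the 9 ha field:

```python
import numpy as np

def nested_design_1d(origin, stage_distances, rng):
    """1-D locations for a balanced nested design: at each stage every
    existing point spawns a companion offset by the stage's separating
    distance, in a randomly chosen direction."""
    points = np.array([origin], dtype=float)
    for d in stage_distances:
        offsets = rng.choice([-d, d], size=points.size)
        points = np.concatenate([points, points + offsets])
    return np.sort(points)

rng = np.random.default_rng(3)
# Four stages with separating distances in geometric progression (m)
pts = nested_design_1d(0.0, [81.0, 27.0, 9.0, 3.0], rng)
print(f"{pts.size} locations spanning {pts.max() - pts.min():.0f} m")
```

An unbalanced design, as used in the study, would prune some of the finest-stage pairs to economize; the REML analysis then handles the resulting unequal replication.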

Relevance: 20.00%

Abstract:

The aim of the study was to establish and verify a predictive vegetation model for plant community distribution in the alti-Mediterranean zone of the Lefka Ori massif, western Crete. Based on previous work, three variables were identified as significant determinants of plant community distribution, namely altitude, slope angle and geomorphic landform. The response of four community types to these variables was tested using classification tree analysis in order to model community type occurrence. V-fold cross-validation plots were used to determine the length of the best-fitting tree. The final nine-node tree selected classified 92.5% of the samples correctly. The results were used to provide decision rules for the construction of a spatial model for each community type. The model was implemented within a Geographical Information System (GIS) to predict the distribution of each community type in the study site. The evaluation of the model in the field using an error matrix gave an overall accuracy of 71%. The user's accuracy was higher for the Crepis-Cirsium (100%) and Telephium-Herniaria (66.7%) community types and relatively lower for the Peucedanum-Alyssum and Dianthus-Lomelosia community types (63.2% and 62.5%, respectively). Misclassification and field validation point to the need for improved geomorphological mapping and suggest the presence of transitional communities between existing community types.
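
The error-matrix (confusion-matrix) evaluation described above reduces to simple ratios. The cell counts below are hypothetical, chosen only to be consistent with the accuracies reported in the abstract (the actual matrix is not given):

```python
import numpy as np

# Rows: predicted community type (map); columns: observed type (field).
# Order: Crepis-Cirsium, Telephium-Herniaria,
#        Peucedanum-Alyssum, Dianthus-Lomelosia
error_matrix = np.array([
    [12,  0,  0,  0],
    [ 2, 10,  2,  1],
    [ 0,  3, 12,  4],
    [ 1,  2,  3, 10],
])

# Overall accuracy: correctly classified samples over all samples
overall = np.trace(error_matrix) / error_matrix.sum()

# User's accuracy per class: correct predictions over all predictions
# of that class (row totals, with rows as the mapped classes)
users = np.diag(error_matrix) / error_matrix.sum(axis=1)

print(f"overall accuracy: {overall:.1%}")
for name, ua in zip(["Crepis-Cirsium", "Telephium-Herniaria",
                     "Peucedanum-Alyssum", "Dianthus-Lomelosia"], users):
    print(f"user's accuracy, {name}: {ua:.1%}")
```

Producer's accuracy, the column-wise analogue, would quantify omission errors instead; the abstract reports only the overall and user's accuracies.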

Relevance: 20.00%

Abstract:

Empirical orthogonal function (EOF) analysis is a powerful tool for data compression and dimensionality reduction used broadly in meteorology and oceanography. Often in the literature, EOF modes are interpreted individually, independent of other modes. In fact, it can be shown that no such attribution can generally be made. This review demonstrates that in general individual EOF modes (i) will not correspond to individual dynamical modes, (ii) will not correspond to individual kinematic degrees of freedom, (iii) will not be statistically independent of other EOF modes, and (iv) will be strongly influenced by the nonlocal requirement that modes maximize variance over the entire domain. The goal of this review is not to argue against the use of EOF analysis in meteorology and oceanography; rather, it is to demonstrate the care that must be taken in the interpretation of individual modes in order to distinguish the medium from the message.

Relevance: 20.00%

Abstract:

Objectives - To assess the general public's interpretation of the verbal descriptors for side effect frequency recommended for use in medicine information leaflets by a European Union (EU) guideline, and to examine the extent to which differences in interpretation affect people's perception of risk and their judgments of intention to comply with the prescribed treatment. Method - Two studies used a controlled empirical methodology in which people were presented with a hypothetical, but realistic, scenario about visiting their general practitioner and being prescribed medication. They were given an explanation that focused on the side effects of the medicine, together with information about the probability of occurrence using either numerical percentages or the corresponding EU verbal descriptors. Interpretation of the descriptors was assessed. In study 2, participants were also required to make various judgments, including risk to health and intention to comply. Key findings - In both studies, use of the EU recommended descriptors led to significant overestimations of the likelihood of particular side effects occurring. Study 2 further showed that the "overestimation" resulted in significantly increased ratings of perceived severity of side effects and risk to health, as well as significantly reduced ratings of intention to comply, compared with those for people who received the probability information in numerical form. Conclusion - While it is recognised that the current findings require replication in a clinical setting, the European and national authorities should suspend the use of the EU recommended terms until further research is available to allow the use of an evidence-based approach.

Relevance: 20.00%

Abstract:

This article describes an empirical, user-centred approach to explanation design. It reports three studies that investigate what patients want to know when they have been prescribed medication. The question is asked in the context of the development of a drug prescription system called OPADE. The system is aimed primarily at improving the prescribing behaviour of physicians, but will also produce written explanations for indirect users such as patients. In the first study, a large number of people were presented with a scenario about a visit to the doctor, and were asked to list the questions that they would like to ask the doctor about the prescription. On the basis of the results of the study, a categorization of question types was developed in terms of how frequently particular questions were asked. In the second and third studies a number of different explanations were generated in accordance with this categorization, and a new sample of people were presented with another scenario and were asked to rate the explanations on a number of dimensions. The results showed significant differences between the different explanations. People preferred explanations that included items corresponding to frequently asked questions in study 1. For an explanation to be considered useful, it had to include information about side effects, what the medication does, and any lifestyle changes involved. The implications of the results of the three studies are discussed in terms of the development of OPADE's explanation facility.