990 results for Bayesian chronological modelling
Abstract:
The potential impact of the abrupt 8.2 ka cold event on human demography, settlement patterns and culture in Europe and the Near East has emerged as a key theme in current discussion and debate. We test whether this event had an impact on the Mesolithic population of western Scotland, a case study located within the North Atlantic region where the environmental impact of the 8.2 ka event is likely to have been the most severe. By undertaking a Bayesian analysis of the radiocarbon record and using the number of activity events as a proxy for the size of the human population, we find evidence for a dramatic reduction in the Mesolithic population synchronous with the 8.2 ka event. We interpret this as reflecting the demographic collapse of a low density population that lacked the capability to adapt to the rapid onset of new environmental conditions. This impact of the 8.2 ka event in the North Atlantic region lends credence to the possibility of a similar impact on populations in Continental Europe and the Near East.
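The demographic proxy described here (the number of dated activity events through time) is commonly implemented as a summed probability distribution of calibrated radiocarbon dates. The sketch below is a minimal illustration of that general idea, not the paper's own Bayesian model; the calendar grid, the number of samples and the stand-in calibrated densities are invented for demonstration.

```python
import numpy as np

# Hypothetical inputs: a common calendar-year grid (cal BP) and, for each
# dated sample, its calibrated probability density evaluated on that grid.
cal_bp_grid = np.arange(9500, 7500, -5)          # 9500-7500 cal BP, 5-yr steps
rng = np.random.default_rng(42)
n_samples = 120
# Stand-in calibrated densities: Gaussians of varying width, purely illustrative.
centres = rng.uniform(7600, 9400, n_samples)
widths = rng.uniform(30, 80, n_samples)
calibrated_densities = np.exp(
    -0.5 * ((cal_bp_grid[None, :] - centres[:, None]) / widths[:, None]) ** 2
)
calibrated_densities /= calibrated_densities.sum(axis=1, keepdims=True)

# Summed probability distribution: each dated event contributes one unit of
# probability mass, so the sum tracks the density of dated activity events.
spd = calibrated_densities.sum(axis=0)

# A sharp, sustained drop in the curve around ~8200 cal BP would be the kind
# of signal read as a reduction in dated activity (and, by proxy, population),
# subject to taphonomic and sampling caveats.
drop_window = (cal_bp_grid <= 8300) & (cal_bp_grid >= 8000)
print("mean SPD 8300-8000 cal BP:", spd[drop_window].mean())
```

In practice the calibrated densities would typically come from a calibration program or library, and the observed curve would be compared against a null model of population growth rather than inspected by eye.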
Abstract:
Radiocarbon dating and Bayesian chronological modelling, undertaken as part of the investigation by the Times of Their Lives project into the development of Late Neolithic settlement and pottery in Orkney, have provided precise new dating for the Grooved Ware settlement of Barnhouse, excavated in 1985–91. Previous understandings of the site and its pottery are presented. A Bayesian model based on 70 measurements on 62 samples (of which 50 samples are thought to date accurately the deposits from which they were recovered) suggests that the settlement probably began in the later 32nd century cal bc (with Houses 2, 9, 3 and perhaps 5a), possibly as a planned foundation. Structure 8 – a large, monumental structure that differs in character from the houses – was probably built just after the turn of the millennium. Varied house durations and replacements are estimated. House 2 went out of use before the end of the settlement, and Structure 8 was probably the last element to be abandoned, probably during the earlier 29th century cal bc. The Grooved Ware pottery from the site is characterised by small, medium-sized, and large vessels with incised and impressed decoration, including a distinctive, false-relief, wavy-line cordon motif. A considerable degree of consistency is apparent in many aspects of ceramic design and manufacture over the use-life of the settlement, the principal change being the appearance, from c. 3025–2975 cal bc, of large coarse ware vessels with uneven surfaces and thick applied cordons, and of the use of applied dimpled circular pellets. The circumstances of the new foundation of settlement in the western part of Mainland are discussed, as well as the maintenance and character of the site. The pottery from the site is among the earliest Grooved Ware so far dated. Its wider connections are noted, as well as the significant implications for our understanding of the timing and circumstances of the emergence of Grooved Ware, and the role of material culture in social strategies.
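Chronological models of this kind are usually built in OxCal; as a rough illustration of the underlying structure only (dated events constrained between unknown start and end boundaries of a phase), a simplified sketch in Python/PyMC might look as follows. The dates, errors and priors are invented, and treating calibrated dates as Normal is a deliberate simplification of proper calibration.

```python
import numpy as np
import arviz as az
import pymc as pm

# Invented example data: calibrated dates (calendar years, BC as negative)
# approximated as Normals -- a simplification of full calibration.
obs_dates = np.array([-3140, -3120, -3095, -3080, -3050, -3010])
obs_errors = np.array([40, 35, 45, 40, 35, 40])

with pm.Model() as phase_model:
    # Unknown boundaries of the occupation phase.
    start = pm.Normal("start", mu=-3200, sigma=150)
    duration = pm.HalfNormal("duration", sigma=300)
    end = pm.Deterministic("end", start + duration)

    # True calendar dates of the dated events, uniform within the phase.
    theta = pm.Uniform("theta", lower=start, upper=end, shape=len(obs_dates))

    # Measurement model for the (approximately calibrated) dates.
    pm.Normal("obs", mu=theta, sigma=obs_errors, observed=obs_dates)

    trace = pm.sample(1000, tune=1000, chains=2, random_seed=1)

# Posteriors for the boundaries summarise when the settlement was probably
# founded and abandoned under this toy model.
print(az.summary(trace, var_names=["start", "end"]))
```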
Abstract:
The Conservative Party emerged from the 2010 United Kingdom General Election as the largest single party, but their support was not geographically uniform. In this paper, we estimate a hierarchical Bayesian spatial probit model that tests for the presence of regional voting effects. This model allows for the estimation of individual region-specific effects on the probability of Conservative Party success, incorporating information on the spatial relationships between the regions of the mainland United Kingdom. After controlling for a range of important covariates, we find that these spatial relationships are significant and that our individual region-specific effects estimates provide additional evidence of North-South variations in Conservative Party support.
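As a hedged sketch of what a hierarchical Bayesian probit with region-specific effects can look like in code, the following PyMC fragment fits a probit regression with varying regional intercepts to synthetic data. One simplification to note: the published model incorporates spatial relationships between regions, whereas this sketch uses an exchangeable (non-spatial) prior on the regional effects; a CAR-type spatial prior would replace that line in a fuller implementation. All data and variable names are invented.

```python
import numpy as np
import arviz as az
import pymc as pm

# Synthetic stand-in data: y[i] = 1 if constituency i was won by the party,
# X holds covariates, region[i] indexes the region of constituency i.
rng = np.random.default_rng(0)
n, k, n_regions = 600, 3, 11
X = rng.normal(size=(n, k))
region = rng.integers(0, n_regions, size=n)
true_effects = rng.normal(0, 0.5, n_regions)
eta_true = X @ np.array([0.8, -0.4, 0.2]) + true_effects[region]
y = (eta_true + rng.normal(size=n) > 0).astype(int)

with pm.Model() as probit_model:
    beta = pm.Normal("beta", 0, 1, shape=k)
    # Region-specific effects; the published model places a spatially
    # correlated prior here, which this sketch replaces with an
    # exchangeable hierarchical prior for simplicity.
    sigma_r = pm.HalfNormal("sigma_r", 1)
    u = pm.Normal("u", 0, sigma_r, shape=n_regions)
    eta = pm.math.dot(X, beta) + u[region]
    # Probit link: probability of success is the standard Normal CDF of eta.
    p = 0.5 * (1 + pm.math.erf(eta / np.sqrt(2.0)))
    pm.Bernoulli("y", p=p, observed=y)
    trace = pm.sample(1000, tune=1000, chains=2, random_seed=1)

print(az.summary(trace, var_names=["u"]))   # regional deviations
```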
Abstract:
The character of settlement patterns within the late Mesolithic communities of north-west Europe is a topic of substantial debate. An important case study concerns the five shell middens on the island of Oronsay, Inner Hebrides, western Scotland. Two conflicting interpretations have been proposed: the evidence from seasonality indicators and stable isotope analysis of human bones has been used to support a model of year-round settlement on this small island; alternatively, the middens have been interpreted as resulting from short-term intermittent visits to Oronsay within a regionally mobile settlement pattern. We contribute to this debate by describing Storakaig, a newly discovered site on the nearby island of Islay, undertaking a Bayesian chronological analysis and providing evidence for technological continuity between Oronsay and sites elsewhere in the region. While this new evidence remains open to alternative interpretation, we suggest that it makes regional mobility rather than year-round settlement on Oronsay a more viable interpretation for the Oronsay middens. Our analysis also confirms the likely overlap of the late Mesolithic with the earliest Neolithic within western Scotland.
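The claim about overlap between the late Mesolithic and the earliest Neolithic is, in Bayesian chronological terms, a probability statement computed from posterior samples of the relevant boundaries. A minimal sketch of that calculation, using invented posterior draws rather than the paper's model output, is:

```python
import numpy as np

# Hypothetical posterior draws (calendar years, BC as negative) for the end
# of the late Mesolithic and the start of the earliest Neolithic -- purely
# illustrative numbers, not the paper's posteriors.
rng = np.random.default_rng(7)
end_mesolithic = rng.normal(-3950, 60, 10_000)
start_neolithic = rng.normal(-4000, 50, 10_000)

# Probability that Mesolithic activity continued after the Neolithic began,
# i.e. that the two phases overlapped.
p_overlap = np.mean(end_mesolithic > start_neolithic)
print(f"P(overlap) = {p_overlap:.2f}")
```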
Abstract:
In the context of Bayesian statistical analysis, elicitation is the process of formulating a prior density f(.) for one or more uncertain quantities to represent a person's knowledge and beliefs. Several different methods of eliciting prior distributions for one unknown parameter have been proposed. However, there are relatively few methods for specifying a multivariate prior distribution, and most are applicable only to specific classes of problems and/or based on restrictive conditions, such as independence of variables. Moreover, many of these procedures require the elicitation of variances and correlations, and sometimes of hyperparameters, which are difficult for experts to specify in practice. Garthwaite et al. (2005) discuss the different methods proposed in the literature and the difficulties of eliciting multivariate prior distributions. We describe a flexible method of eliciting multivariate prior distributions applicable to a wide class of practical problems. Our approach does not assume a parametric form for the unknown prior density f(.); instead we use nonparametric Bayesian inference, modelling f(.) by a Gaussian process prior distribution. The expert is then asked to specify certain summaries of his/her distribution, such as the mean, mode, marginal quantiles and a small number of joint probabilities. The analyst receives that information, treating it as a data set D with which to update his/her prior beliefs to obtain the posterior distribution for f(.). Theoretical properties of joint and marginal priors are derived and numerical illustrations to demonstrate our approach are given.
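As a loose, one-dimensional stand-in for the method described (which is multivariate and fully nonparametric), the following sketch treats a few elicited density values as noisy observations and fits a Gaussian process to them with scikit-learn; the elicited points, kernel and noise level are all assumptions for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical elicited summaries converted into pointwise statements about
# the prior density f at a few values of the unknown quantity (1-D here for
# simplicity; the published method targets multivariate densities).
x_elicited = np.array([[-2.0], [-1.0], [0.0], [1.0], [2.0]])
f_elicited = np.array([0.05, 0.25, 0.40, 0.25, 0.05])   # expert's rough density values

# GP prior over f, updated with the elicited "data" -- a crude stand-in for
# the full nonparametric Bayesian treatment described in the abstract.
gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=1.0),
    alpha=1e-3,             # allows for imprecision in the expert's statements
    normalize_y=True,
)
gp.fit(x_elicited, f_elicited)

# Posterior mean and uncertainty for f on a grid; renormalising the mean
# gives a usable working density, with the GP variance flagging where
# further elicitation questions would be most informative.
x_grid = np.linspace(-4, 4, 201).reshape(-1, 1)
f_mean, f_sd = gp.predict(x_grid, return_std=True)
f_mean = np.clip(f_mean, 0, None)
f_mean /= np.trapz(f_mean, x_grid.ravel())
```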
Abstract:
Ecological regions are increasingly used as a spatial unit for planning and environmental management. It is important to define these regions in a scientifically defensible way, to justify decisions made on the basis that they are representative of broad environmental assets. The paper describes a methodology and tool to identify cohesive bioregions. The methodology applies an elicitation process to obtain geographical descriptions of bioregions, each of which is transformed into a Normal density estimate on the environmental variables within that region. This prior information is balanced against a classification of environmental datasets using a Bayesian statistical modelling approach to objectively map ecological regions. The method is called model-based clustering because it fits a Normal mixture model to the clusters associated with regions, and it addresses uncertainty in environmental datasets due to overlapping clusters.
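A minimal sketch of the model-based clustering step, without the elicited priors that the full methodology balances against the data, might look like the following; the environmental variables and cluster structure are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Stand-in environmental data: rows are grid cells, columns are environmental
# variables (e.g. temperature, rainfall, elevation), already standardised.
rng = np.random.default_rng(3)
env = np.vstack([
    rng.normal([0.0, 0.5, -0.5], 0.3, size=(300, 3)),
    rng.normal([1.5, -0.5, 0.5], 0.4, size=(300, 3)),
    rng.normal([-1.0, 1.0, 1.0], 0.3, size=(300, 3)),
])

# Normal mixture model: each component plays the role of a candidate
# bioregion. The published method additionally folds in elicited prior
# densities for each region; this sketch fits the mixture to the data alone.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
gmm.fit(env)

labels = gmm.predict(env)              # hard assignment of cells to regions
membership = gmm.predict_proba(env)    # soft membership, expressing overlap
# Cells whose maximum membership probability is low sit in overlapping
# clusters -- the boundary uncertainty the abstract highlights.
uncertain = membership.max(axis=1) < 0.8
print("cells with uncertain region assignment:", uncertain.sum())
```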
Abstract:
While the system-stabilizing function of reciprocity is widely acknowledged, much less attention has been paid to the argument that reciprocity might initiate social cooperation in the first place. This paper tests Gouldner's early assumption that reciprocity may act as a 'starting mechanism' of social cooperation in consolidating societies. The empirical test scenario builds on unequal civic engagement between immigrants and nationals, as this engagement gap can be read as a lack of social cooperation in consolidating immigration societies. Empirical analyses using survey data on reciprocal norms, based on Bayesian hierarchical modelling, lend support to Gouldner's thesis, thereby underlining the relevance of reciprocity in today's increasingly diverse societies: individual norms of altruistic reciprocity elevate immigrants' propensity to volunteer, thereby reducing the engagement gap between immigrants and natives in the area of informal volunteering. In other words, compliance with altruistic reciprocity may trigger cooperation in social strata where it is otherwise less likely to occur. The positive moderation of the informal engagement gap by altruistic reciprocity turns out to be most pronounced for the immigrants who are least likely to engage in informal volunteering, that is, immigrants with low, but also those with high, levels of education.
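The statistical test described amounts to a hierarchical (varying-intercept) logistic regression with a cross-level interaction between immigrant status and reciprocity norms. The PyMC sketch below illustrates that structure on synthetic data; the variable names, priors and data-generating values are assumptions, not the study's specification.

```python
import numpy as np
import pymc as pm

# Synthetic stand-in for the survey data: volunteering (0/1), immigrant
# status (0/1), an individual altruistic-reciprocity score, and a country
# index for the hierarchical (varying-intercept) part of the model.
rng = np.random.default_rng(11)
n, n_countries = 2000, 20
country = rng.integers(0, n_countries, n)
immigrant = rng.integers(0, 2, n)
reciprocity = rng.normal(0, 1, n)
logit_p = -0.5 - 0.8 * immigrant + 0.3 * reciprocity + 0.5 * immigrant * reciprocity
volunteer = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

with pm.Model() as engagement_model:
    # Country-level varying intercepts.
    sigma_c = pm.HalfNormal("sigma_c", 1)
    a_country = pm.Normal("a_country", 0, sigma_c, shape=n_countries)

    b = pm.Normal("b", 0, 1, shape=3)  # immigrant, reciprocity, interaction
    eta = (a_country[country]
           + b[0] * immigrant
           + b[1] * reciprocity
           + b[2] * immigrant * reciprocity)
    pm.Bernoulli("volunteer", p=pm.math.sigmoid(eta), observed=volunteer)
    trace = pm.sample(1000, tune=1000, chains=2, random_seed=1)

# A positive posterior for b[2] would correspond to the abstract's finding:
# altruistic reciprocity narrows the immigrant-native engagement gap.
```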
Abstract:
The spatial context is critical when assessing present-day climate anomalies, attributing them to potential forcings and making statements regarding their frequency and severity in a long-term perspective. Recent international initiatives have expanded the number of high-quality proxy-records and developed new statistical reconstruction methods. These advances allow more rigorous regional past temperature reconstructions and, in turn, the possibility of evaluating climate models on policy-relevant, spatio-temporal scales. Here we provide a new proxy-based, annually-resolved, spatial reconstruction of the European summer (June–August) temperature fields back to 755 CE based on Bayesian hierarchical modelling (BHM), together with estimates of the European mean temperature variation since 138 BCE based on BHM and composite-plus-scaling (CPS). Our reconstructions compare well with independent instrumental and proxy-based temperature estimates, but suggest a larger amplitude in summer temperature variability than previously reported. Both CPS and BHM reconstructions indicate that the mean 20th century European summer temperature was not significantly different from some earlier centuries, including the 1st, 2nd, 8th and 10th centuries CE. The 1st century (in BHM also the 10th century) may even have been slightly warmer than the 20th century, but the difference is not statistically significant. Comparing each 50 yr period with the 1951–2000 period reveals a similar pattern. Recent summers, however, have been unusually warm in the context of the last two millennia and there are no 30 yr periods in either reconstruction that exceed the mean European summer temperature of the last 3 decades (1986–2015 CE). A comparison with an ensemble of climate model simulations suggests that the reconstructed European summer temperature variability over the period 850–2000 CE reflects changes in both internal variability and external forcing on multi-decadal time-scales. For pan-European temperatures we find slightly better agreement between the reconstruction and the model simulations with high-end estimates for total solar irradiance. Temperature differences between the medieval period, the recent period and the Little Ice Age are larger in the reconstructions than in the simulations. This may indicate inflated variability in the reconstructions, a lack of sensitivity of the simulated European climate (and its represented processes) to changes in external forcing, and/or an underestimation of internal variability on centennial and longer time scales.
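Of the two reconstruction methods mentioned, composite-plus-scaling (CPS) is simple enough to sketch directly: standardise the proxies over the instrumental calibration interval, average them into a composite, and rescale that composite to the mean and variance of the instrumental target. The toy implementation below uses invented proxy and instrumental series, not the paper's network.

```python
import numpy as np

def composite_plus_scale(proxies, instrumental, calib_mask):
    """Toy composite-plus-scaling (CPS) reconstruction.

    proxies:      array (n_proxies, n_years) of proxy series on a common grid
    instrumental: array (n_years,) of instrumental summer temperatures,
                  used only where calib_mask is True
    calib_mask:   boolean array marking the instrumental calibration period
    """
    # 1. Standardise each proxy over the calibration interval and average.
    mu = proxies[:, calib_mask].mean(axis=1, keepdims=True)
    sd = proxies[:, calib_mask].std(axis=1, keepdims=True)
    composite = ((proxies - mu) / sd).mean(axis=0)

    # 2. Scale the composite to the mean and variance of the instrumental
    #    target over the calibration period.
    target = instrumental[calib_mask]
    scaled = (composite - composite[calib_mask].mean()) / composite[calib_mask].std()
    return scaled * target.std() + target.mean()

# Invented demonstration data (not the paper's proxy network).
rng = np.random.default_rng(5)
years = np.arange(-137, 2001)                    # 138 BCE to 2000 CE
signal = 0.3 * np.sin(years / 300.0)
proxies = signal + rng.normal(0, 0.5, size=(15, years.size))
calib = years >= 1850
instrumental = signal + rng.normal(0, 0.1, years.size)
reconstruction = composite_plus_scale(proxies, instrumental, calib)
```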
Abstract:
The use of a fully parametric Bayesian method for analysing single patient trials based on the notion of treatment 'preference' is described. This Bayesian hierarchical modelling approach allows for full parameter uncertainty, use of prior information and the modelling of individual and patient sub-group structures. It provides updated probabilistic results for individual patients, and groups of patients with the same medical condition, as they are sequentially enrolled into individualized trials using the same medication alternatives. Two clinically interpretable criteria for determining a patient's response are detailed and illustrated using data from a previously published paper under two different prior information scenarios.
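As a much-reduced illustration of the 'preference' idea (not the fully parametric hierarchical model of the paper), a single patient's trial can be summarised with a conjugate Beta-Binomial update on the probability that the patient prefers one treatment; the cycle counts below are invented.

```python
from scipy import stats

# Toy single-patient example: in an n-of-1 trial with 8 treatment crossover
# cycles, suppose the patient "preferred" the test drug in 6 of them.
cycles, preferred = 8, 6

# Beta(1, 1) prior on the probability that the patient prefers the test drug;
# the published approach embeds such patient-level parameters in a hierarchy
# so that each new patient also borrows strength from earlier trials.
posterior = stats.beta(1 + preferred, 1 + cycles - preferred)

# Updated probabilistic statement for this individual patient: how likely is
# a genuine preference (probability above 0.5)?
print("P(preference > 0.5) =", 1 - posterior.cdf(0.5))
print("95% credible interval:", posterior.interval(0.95))
```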
Abstract:
In Australia, along with many other parts of the world, fumigation with phosphine is a vital component in controlling stored grain insect pests. However, resistance is a factor that may limit the continued efficacy of this fumigant. While strong resistance to phosphine has been identified and characterised, very little information is available on the causes of its development and spread. Data obtained from a unique national resistance monitoring and management program were analysed, using Bayesian hurdle modelling, to determine which factors may be responsible. Fumigation in unsealed storages, combined with a high frequency of weak resistance, was found to be the main factor leading to the development of strong resistance in Sitophilus oryzae. Independent development, rather than gene flow via migration, appears to be primarily responsible for the geographic incidence of strong resistance to phosphine in S. oryzae. This information can now be utilised to direct resources and education into those areas at high risk and to refine phosphine resistance management strategies.
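A hurdle model splits the data into a presence/absence part and a positive-count part. The sketch below writes out a hurdle Poisson likelihood and fits it by maximum likelihood with SciPy; this illustrates the likelihood structure only, whereas the analysis described is Bayesian and includes covariates such as storage sealing and weak-resistance frequency. The counts are invented.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def hurdle_poisson_negloglik(params, y):
    """Negative log-likelihood of a hurdle Poisson model.

    The hurdle part governs whether any resistant insects are detected at all;
    the zero-truncated Poisson part governs how many, given at least one.
    """
    logit_pi, log_mu = params
    pi = 1 / (1 + np.exp(-logit_pi))      # P(crossing the hurdle, y > 0)
    mu = np.exp(log_mu)                   # mean of the untruncated Poisson
    zeros = y == 0
    ll_zero = np.log(1 - pi) * zeros.sum()
    y_pos = y[~zeros]
    # Zero-truncated Poisson log-likelihood for the positive counts.
    ll_pos = (np.log(pi) * y_pos.size
              + stats.poisson.logpmf(y_pos, mu).sum()
              - np.log1p(-np.exp(-mu)) * y_pos.size)
    return -(ll_zero + ll_pos)

# Invented counts of strongly resistant detections per storage facility.
y = np.array([0, 0, 0, 1, 0, 3, 0, 0, 2, 0, 0, 5, 0, 1, 0])
fit = minimize(hurdle_poisson_negloglik, x0=[0.0, 0.0], args=(y,))
print("P(any detection):", 1 / (1 + np.exp(-fit.x[0])),
      "mean count if present:", np.exp(fit.x[1]))
```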
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented with symptoms associated with osteoarthritis occurring in joints of the hand.
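Both packages are ultimately estimating the same ACE variance decomposition, whose expected twin covariance structure is simple to write down. The helper below computes the implied monozygotic and dizygotic twin correlations for given variance components; it is only the algebra behind the models, not the Gibbs-sampling or structural-equation machinery itself.

```python
def twin_covariances(a2, c2, e2):
    """Expected phenotypic variance and twin covariances under an ACE model.

    a2, c2, e2: additive genetic, common environment and unique environment
    variance components (here scaled so that they sum to the total variance).
    """
    var = a2 + c2 + e2
    cov_mz = a2 + c2          # monozygotic twins share all A and all C
    cov_dz = 0.5 * a2 + c2    # dizygotic twins share half of A on average
    return var, cov_mz, cov_dz

# One of the scenarios the abstract flags as hard to detect with binary data:
# a modest common environmental effect (20%) with large additive genetics (50%).
var, cov_mz, cov_dz = twin_covariances(a2=0.5, c2=0.2, e2=0.3)
print("MZ correlation:", cov_mz / var, "DZ correlation:", cov_dz / var)
```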
Abstract:
In recent decades, interest in research on multi-scale hierarchical modelling has increased in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and to inference that accounts for prior information obtained at different scales of importance. These methods allow distinct reference properties of timber, such as density, bending stiffness and strength, to be analysed, and information obtained through different non-destructive, semi-destructive or destructive tests to be considered hierarchically. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
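A minimal example of the kind of hierarchical updating described, assuming a conjugate Normal-Normal setting with known test scatter, is sketched below; the prior values and test results are invented, and the real methods handle several properties and their correlations jointly.

```python
import numpy as np

def normal_update(prior_mean, prior_sd, data, data_sd):
    """Conjugate Normal-Normal update of a material property mean.

    prior_mean, prior_sd: prior belief about the mean (e.g. from lower-scale
    composition/structure information or non-destructive indicators).
    data: destructive test results; data_sd: assumed known test scatter.
    """
    n = len(data)
    prior_prec = 1.0 / prior_sd**2
    data_prec = n / data_sd**2
    post_prec = prior_prec + data_prec
    post_mean = (prior_prec * prior_mean + data_prec * np.mean(data)) / post_prec
    return post_mean, np.sqrt(1.0 / post_prec)

# Invented numbers: prior on mean bending strength (MPa) from grading
# indicators, updated with a handful of destructive bending tests.
post_mean, post_sd = normal_update(prior_mean=40.0, prior_sd=5.0,
                                   data=[36.5, 42.0, 38.8, 41.2], data_sd=6.0)
print(f"posterior mean = {post_mean:.1f} MPa, sd = {post_sd:.1f} MPa")
```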
Abstract:
Evaluating agents in decision-making applications requires assessing their skill and predicting their behaviour. Both are well developed in Poker-like situations, but less so in more complex game and model domains. This paper addresses both tasks by using Bayesian inference in a benchmark space of reference agents. The concepts are explained and demonstrated using the game of chess but the model applies generically to any domain with quantifiable options and fallible choice. Demonstration applications address questions frequently asked by the chess community regarding the stability of the rating scale, the comparison of players of different eras and/or leagues, and controversial incidents possibly involving fraud. The last include alleged under-performance, fabrication of tournament results, and clandestine use of computer advice during competition. Beyond the model world of games, the aim is to improve fallible human performance in complex, high-value tasks.
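A toy version of inference in a benchmark space of reference agents: given each reference agent's probability of choosing the moves a player actually played, Bayes' rule yields a posterior over which agent best explains the observed play. The agents and probabilities below are invented; a real benchmark would derive them from engine evaluations of every legal option and would typically use a continuous skill parameterisation rather than three discrete agents.

```python
import numpy as np

# Benchmark space: a few reference agents, each defined by the probability it
# assigns to the move the player actually chose at each observed decision.
# move_probs[a, t] = P(agent a plays the chosen move at decision t).
move_probs = np.array([
    [0.20, 0.30, 0.25, 0.15],   # weak reference agent
    [0.45, 0.50, 0.40, 0.35],   # intermediate reference agent
    [0.70, 0.80, 0.65, 0.60],   # strong reference agent
])

prior = np.full(move_probs.shape[0], 1.0 / move_probs.shape[0])

# Posterior over which reference agent best explains the observed choices:
# multiply the prior by the likelihood of the chosen moves and renormalise.
likelihood = move_probs.prod(axis=1)
posterior = prior * likelihood
posterior /= posterior.sum()
print("posterior over reference agents:", np.round(posterior, 3))
```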