920 results for Bayesian statistical decision theory


Relevance: 30.00%

Abstract:

Many studies on birds focus on the collection of data through an experimental design, suitable for investigation in a classical analysis of variance (ANOVA) framework. Although many findings are confirmed by one or more experts, expert information is rarely used in conjunction with the survey data to enhance the explanatory and predictive power of the model. We explore this neglected aspect of ecological modelling through a study on Australian woodland birds, focusing on the potential impact of different intensities of commercial cattle grazing on bird density in woodland habitat. We examine a number of Bayesian hierarchical random effects models, which cater for overdispersion and a high frequency of zeros in the data, using WinBUGS, and explore the variation between and within different grazing regimes and species. The impact and value of expert information is investigated through the inclusion of priors that reflect the experience of 20 experts in the field of bird responses to disturbance. Results indicate that expert information moderates the survey data, especially in situations where there are little or no data. When experts agreed, credible intervals for predictions were tightened considerably. When experts failed to agree, results were similar to those evaluated in the absence of expert information. Overall, we found that without expert opinion our knowledge was quite weak. The fact that the survey data are quite consistent, in general, with expert opinion shows that we do know something about birds and grazing, and that we could learn a lot faster if we used this approach more in ecology, where data are scarce. Copyright (c) 2005 John Wiley & Sons, Ltd.
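The shrinkage behaviour this abstract reports, with informative expert priors tightening credible intervals where data are sparse, can be illustrated with a minimal conjugate Normal-Normal sketch. The numbers are hypothetical; the paper's actual models are hierarchical zero-inflated models fitted in WinBUGS, not this toy update.

```python
import math

def posterior_normal(prior_mean, prior_sd, data, sigma=1.0):
    """Conjugate Normal-Normal update: posterior mean and sd for the data mean."""
    n = len(data)
    prior_prec = 1.0 / prior_sd ** 2          # precision of the prior
    data_prec = n / sigma ** 2                # precision contributed by the data
    post_prec = prior_prec + data_prec
    xbar = sum(data) / n if n else 0.0
    post_mean = (prior_prec * prior_mean + data_prec * xbar) / post_prec
    return post_mean, math.sqrt(1.0 / post_prec)

one_obs = [1.2]                                            # sparse survey data (illustrative)
m_vague, s_vague = posterior_normal(0.0, 10.0, one_obs)    # no expert input
m_expert, s_expert = posterior_normal(2.0, 0.5, one_obs)   # agreeing experts
```

With a single observation the informative prior roughly halves the posterior standard deviation, mirroring the considerably tightened credible intervals the study reports when experts agreed.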

Relevance: 30.00%

Abstract:

An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
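The mixture-model decision rule described above can be sketched with a toy two-component normal mixture. The values of pi0 and the alternative-component mean below are illustrative assumptions, not quantities estimated from the breast cancer data set.

```python
import math

def normal_pdf(x, mu=0.0, sd=1.0):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def local_fdr(z, pi0=0.9, mu1=2.0):
    """Posterior probability that the gene behind statistic z is NOT differentially
    expressed, under the mixture f(z) = pi0*f0(z) + (1 - pi0)*f1(z)."""
    f0, f1 = normal_pdf(z), normal_pdf(z, mu=mu1)
    return pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)

def call_differential(z, threshold=0.2):
    """Decision rule: flag the gene when its local FDR falls below the threshold."""
    return local_fdr(z) < threshold
```

Note how pi0, the prior probability that a gene is not differentially expressed, enters the rule directly: a larger pi0 makes every gene harder to flag, which is the feature of the mixture approach the abstract highlights.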

Relevance: 30.00%

Abstract:

We tested a social-cognitive intervention to influence contraceptive practices among men living in rural communes in Vietnam. It was predicted that participants who received a stage-targeted program based on the Transtheoretical Model (TTM) would report positive movement in their stage of motivational readiness for their wife to use an intrauterine device (IUD) compared to those in a control condition. A quasi-experimental design was used, where the primary unit for allocation was villages. Villages were allocated randomly to a control condition or to two rounds of intervention with stage-targeted letters and interpersonal counseling. There were 651 eligible married men in the 12 villages chosen. A significant positive movement in men's stage of readiness for IUD use by their wife occurred in the intervention group, with a decrease in the proportions in the precontemplation stage from 28.6 to 20.2% and an increase in action/maintenance from 59.8 to 74.4% (P < 0.05). There were no significant changes in the control group. Compared to the control group, the intervention group showed higher pros, lower cons and higher self-efficacy for IUD use by their wife as a contraceptive method (P < 0.05). Interventions based on social-cognitive theory can increase men's involvement in IUD use in rural Vietnam and should assist in reducing future rates of unwanted pregnancy.

Relevance: 30.00%

Abstract:

In this work, we propose an improvement of the classical Derjaguin-Broekhoff-de Boer (DBdB) theory for capillary condensation/evaporation in mesoporous systems. The primary idea of this improvement is to employ the Gibbs-Tolman-Koenig-Buff equation to predict the surface tension changes in mesopores. In addition, the statistical film thickness (so-called t-curve) evaluated accurately on the basis of the adsorption isotherms measured for the MCM-41 materials is used instead of the originally proposed t-curve (to take into account the excess of the chemical potential due to the surface forces). It is shown that the aforementioned modifications of the original DBdB theory have significant implications for the pore size analysis of mesoporous solids. To verify our improvement of the DBdB pore size analysis method (IDBdB), a series of the calcined MCM-41 samples, which are well-defined materials with hexagonally ordered cylindrical mesopores, were used for the evaluation of the pore size distributions. The correlation of the IDBdB method with the empirically calibrated Kruk-Jaroniec-Sayari (KJS) relationship is very good in the range of small mesopores. So, a major advantage of the IDBdB method is its applicability for small mesopores as well as for the mesopore range beyond that established by the KJS calibration, i.e., for mesopore radii greater than ~4.5 nm. The comparison of the IDBdB results with experimental data reported by Kruk and Jaroniec for capillary condensation/evaporation as well as with the results from nonlocal density functional theory developed by Neimark et al. clearly justifies our approach. Note that the proposed improvement of the classical DBdB method preserves its original simplicity and simultaneously ensures a significant improvement of the pore size analysis, which is confirmed by the independent estimation of the mean pore size by the powder X-ray diffraction method.
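The curvature correction at the heart of the improvement can be sketched with a common closed form of the Tolman correction, gamma(r) = gamma_inf / (1 + 2*delta/r), one first-order reading of the Gibbs-Tolman-Koenig-Buff relation. The numerical values below (nitrogen at 77 K) are illustrative assumptions, not parameters taken from the paper.

```python
def tolman_gamma(gamma_inf, delta, r):
    """Curvature-dependent surface tension, gamma(r) = gamma_inf / (1 + 2*delta/r).
    gamma_inf: flat-interface tension; delta: Tolman length; r: meniscus radius."""
    return gamma_inf / (1.0 + 2.0 * delta / r)

GAMMA_INF = 8.88   # mN/m, roughly liquid nitrogen at 77 K (illustrative)
DELTA = 0.2        # nm, assumed Tolman length (illustrative)

small = tolman_gamma(GAMMA_INF, DELTA, 2.0)    # narrow mesopore
large = tolman_gamma(GAMMA_INF, DELTA, 50.0)   # wide mesopore, near-flat limit
```

Under these assumptions the tension in a 2 nm-radius meniscus drops by about 17% relative to the flat value while the 50 nm case is nearly unchanged, which is why neglecting the curvature dependence biases pore size analysis precisely in the small-mesopore range the IDBdB method targets.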

Relevance: 30.00%

Abstract:

The Great Barrier Reef Marine Park, an area almost the size of Japan, has a new network of no-take areas that significantly improves the protection of biodiversity. The new marine park zoning implements, in a quantitative manner, many of the theoretical design principles discussed in the literature. For example, the new network of no-take areas has at least 20% protection per bioregion, minimum levels of protection for all known habitats and special or unique features, and minimum sizes for no-take areas of at least 10 or 20 km across at the smallest diameter. Overall, more than 33% of the Great Barrier Reef Marine Park is now in no-take areas (previously 4.5%). The steps taken leading to this outcome were to clarify to the interested public why the existing level of protection was inadequate; detail the conservation objectives of establishing new no-take areas; work with relevant and independent experts to define, and contribute to, the best scientific process to deliver on the objectives; describe the biodiversity (e.g., map bioregions); define operational principles needed to achieve the objectives; invite community input on all of the above; gather and layer the data gathered in round-table discussions; report the degree of achievement of principles for various options of no-take areas; and determine how to address negative impacts. Some of the key success factors in this case have global relevance and include focusing initial communication on the problem to be addressed; applying the precautionary principle; using independent experts; facilitating input to decision making; conducting extensive and participatory consultation; having an existing marine park that encompassed much of the ecosystem; having legislative power under federal law; developing high-level support; ensuring agency priority and ownership; and being able to address the issue of displaced fishers.

Relevance: 30.00%

Abstract:

A framework for developing marketing category management decision support systems (DSS) based upon the Bayesian Vector Autoregressive (BVAR) model is extended. Because the BVAR model is vulnerable to permanent and temporary shifts in purchasing patterns over time, a form is needed that corrects for such shifts while retaining the BVAR's other advantages: the Bayesian Vector Error-Correction Model (BVECM). We present the mechanics of extending the DSS to move from a BVAR model to the BVECM model for the category management problem. Several additional iterative steps are required in the DSS to allow the decision maker to arrive at the best forecast possible. The revised marketing DSS framework and model-fitting procedures are described. Validation is conducted on a sample problem.
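The error-correction term that distinguishes a VECM from a VAR can be illustrated with a classical (non-Bayesian) Engle-Granger two-step sketch on simulated cointegrated series. The series, coefficients, and seed are hypothetical; the paper's DSS estimates the analogous system with Bayesian methods.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
x = np.cumsum(rng.normal(size=T))        # nonstationary driver (e.g., a category index)
y = 2.0 * x + rng.normal(size=T)         # cointegrated series, long-run slope = 2

# Step 1: estimate the long-run (cointegrating) relation by OLS
beta_hat = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
ect = y - beta_hat * x                   # error-correction term (disequilibrium)

# Step 2: regress the differenced series on the lagged disequilibrium
dy = np.diff(y)
Z = np.column_stack([np.ones(T - 1), ect[:-1]])
alpha = np.linalg.lstsq(Z, dy, rcond=None)[0][1]   # adjustment speed
```

A clearly negative alpha is the signature of error correction: deviations from the long-run purchasing relation are pulled back rather than persisting, which is the behaviour a plain BVAR cannot represent after a shift.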

Relevance: 30.00%

Abstract:

Two studies in the context of English-French relations in Québec suggest that individuals who strongly identify with a group derive the individual-level costs and benefits that drive expectancy-value processes (rational decision-making) from group-level costs and benefits. In Study 1, high identifiers linked group- and individual-level outcomes of conflict choices whereas low identifiers did not. Group-level expectancy-value processes, in Study 2, mediated the relationship between social identity and perceptions that collective action benefits the individual actor and between social identity and intentions to act. These findings suggest the rational underpinnings of identity-driven political behavior, a relationship sometimes obscured in intergroup theory that focuses on cognitive processes of self-stereotyping. But the results also challenge the view that individuals' cost-benefit analyses are independent of identity processes. The findings suggest the importance of modeling the relationship of group and individual levels of expectancy-value processes as both hierarchical and contingent on social identity processes.

Relevance: 30.00%

Abstract:

The recent deregulation in electricity markets worldwide has heightened the importance of risk management in energy markets. Assessing Value-at-Risk (VaR) in electricity markets is arguably more difficult than in traditional financial markets because the distinctive features of the former result in a highly unusual distribution of returns: electricity returns are highly volatile, display seasonalities in both their mean and volatility, exhibit leverage effects and clustering in volatility, and feature extreme levels of skewness and kurtosis. With electricity applications in mind, this paper proposes a model that accommodates autoregression and weekly seasonals in both the conditional mean and conditional volatility of returns, as well as leverage effects via an EGARCH specification. In addition, extreme value theory (EVT) is adopted to explicitly model the tails of the return distribution. Compared to a number of other parametric models and simple historical simulation based approaches, the proposed EVT-based model performs well in forecasting out-of-sample VaR. In addition, statistical tests show that the proposed model provides appropriate interval coverage in both unconditional and, more importantly, conditional contexts. Overall, the results are encouraging in suggesting that the proposed EVT-based model is a useful technique in forecasting VaR in electricity markets. (c) 2005 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.
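The EVT ingredient can be sketched with a peaks-over-threshold estimator: fit a generalized Pareto distribution (GPD) to the excesses over a high threshold, then invert it for a tail quantile. The Student-t losses and the method-of-moments fit below are illustrative stand-ins for the paper's EGARCH-filtered returns and its estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
losses = rng.standard_t(df=3, size=10_000)   # heavy-tailed stand-in for return losses

u = np.quantile(losses, 0.95)                # POT threshold
exc = losses[losses > u] - u                 # excesses over the threshold
m, v = exc.mean(), exc.var()

# Method-of-moments fit of the GPD (valid when the tail index is below 1/2)
xi = 0.5 * (1.0 - m ** 2 / v)                # tail index
beta = 0.5 * m * (1.0 + m ** 2 / v)          # scale

def var_evt(q, n=len(losses), n_u=len(exc)):
    """POT estimator of the q-quantile (VaR) of the loss distribution."""
    return u + (beta / xi) * (((n / n_u) * (1.0 - q)) ** (-xi) - 1.0)

var99 = var_evt(0.99)
```

A positive fitted xi (expected here, since Student-t(3) has tail index 1/3) is what lets the estimator extrapolate VaR beyond the largest observed loss, the key advantage of EVT over historical simulation for the extreme kurtosis seen in electricity returns.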

Relevance: 30.00%

Abstract:

Can a work setting with its organizational, cultural, and practical considerations influence the way occupational therapists make decisions regarding client interventions? There is currently a paucity of evidence available to answer this question. This study aimed to investigate the influence of work setting on therapists’ clinical reasoning in the management of clients with cerebral palsy and upper limb hypertonicity. Specifically, the study aimed to examine therapists’ objective and stated policies, and their intervention decisions, using Social Judgement Theory methodology. Participants were eighteen occupational therapists with more than five years’ experience with clients with cerebral palsy, who were asked to make intervention decisions for clients represented by 90 case vignettes. They worked in three settings: hospitals (5), schools (6), and community (6). The results indicated that therapy settings did influence therapists’ decisions about intervention choices but not their objective and subjective policy decisions.

Relevance: 30.00%

Abstract:

Standard factorial designs sometimes may be inadequate for experiments that aim to estimate a generalized linear model, for example, for describing a binary response in terms of several variables. A method is proposed for finding exact designs for such experiments that uses a criterion allowing for uncertainty in the link function, the linear predictor, or the model parameters, together with a design search. Designs are assessed and compared by simulation of the distribution of efficiencies relative to locally optimal designs over a space of possible models. Exact designs are investigated for two applications, and their advantages over factorial and central composite designs are demonstrated.
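The comparison the paper makes, candidate designs scored against an optimality criterion under an assumed generalized linear model, can be sketched for a one-variable logistic model with a local D-criterion. The support points at ±1.543 are the well-known locally D-optimal two-point design for this assumed parameter value; everything below is illustrative and much simpler than the paper's criterion, which also averages over uncertainty in the link, predictor, and parameters.

```python
import numpy as np

def d_criterion(points, theta):
    """log-determinant of the Fisher information for a logistic GLM with
    linear predictor eta = theta[0] + theta[1] * x, at design `points`."""
    X = np.column_stack([np.ones(len(points)), points])
    p = 1.0 / (1.0 + np.exp(-(X @ theta)))
    W = np.diag(p * (1.0 - p))               # GLM weights p(1 - p)
    return np.linalg.slogdet(X.T @ W @ X)[1]

theta = np.array([0.0, 1.0])                 # assumed (local) parameter value
factorial = np.array([-1.0, 1.0])            # standard two-level factorial points
optimal = np.array([-1.543, 1.543])          # locally D-optimal support for this theta
wide = np.array([-3.0, 3.0])                 # points pushed into the flat tails
```

Scoring the three designs shows the factorial points are not where the binary response is most informative, and that spreading points too far loses information as the response saturates; this is the kind of inadequacy of standard factorials the abstract refers to.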

Relevance: 30.00%

Abstract:

Markov chain Monte Carlo (MCMC) is a methodology that is gaining widespread use in the phylogenetics community and is central to phylogenetic software packages such as MrBayes. An important issue for users of MCMC methods is how to select appropriate values for adjustable parameters such as the length of the Markov chain or chains, the sampling density, the proposal mechanism, and, if Metropolis-coupled MCMC is being used, the number of heated chains and their temperatures. Although some parameter settings have been examined in detail in the literature, others are frequently chosen with more regard to computational time or personal experience with other data sets. Such choices may lead to inadequate sampling of tree space or an inefficient use of computational resources. We performed a detailed study of convergence and mixing for 70 randomly selected, putatively orthologous protein sets with different sizes and taxonomic compositions. Replicated runs from multiple random starting points permit a more rigorous assessment of convergence, and we developed two novel statistics, delta and epsilon, for this purpose. Although likelihood values invariably stabilized quickly, adequate sampling of the posterior distribution of tree topologies took considerably longer. Our results suggest that multimodality is common for data sets with 30 or more taxa and that this results in slow convergence and mixing. However, we also found that the pragmatic approach of combining data from several short, replicated runs into a metachain to estimate bipartition posterior probabilities provided good approximations, and that such estimates were no worse in approximating a reference posterior distribution than those obtained using a single long run of the same length as the metachain. Precision appears to be best when heated Markov chains have low temperatures, whereas chains with high temperatures appear to sample trees with high posterior probabilities only rarely. 
[Bayesian phylogenetic inference; heating parameter; Markov chain Monte Carlo; replicated chains.]
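Convergence assessment across replicated runs is typically based on comparing bipartition (split) frequencies between independent chains. The delta and epsilon statistics are defined in the paper itself; the sketch below shows the related standard idea of an average split-frequency difference, with toy tree samples represented as sets of splits rather than real MrBayes output.

```python
from collections import Counter

def split_freqs(trees):
    """Frequency of each bipartition across a run's sampled trees.
    Each tree is represented as a set of splits (frozensets of taxon names)."""
    counts = Counter()
    for splits in trees:
        counts.update(splits)
    n = len(trees)
    return {s: c / n for s, c in counts.items()}

def avg_split_freq_diff(run1, run2):
    """Mean absolute difference in split frequencies between two replicate runs;
    values near zero suggest both runs sample the same posterior."""
    f1, f2 = split_freqs(run1), split_freqs(run2)
    splits = set(f1) | set(f2)
    return sum(abs(f1.get(s, 0.0) - f2.get(s, 0.0)) for s in splits) / len(splits)

AB, AC = frozenset({"A", "B"}), frozenset({"A", "C"})
run1 = [{AB}, {AB}, {AC}]     # toy sample: split AB in 2/3 of trees
run2 = [{AB}, {AB}, {AB}]     # second run concentrated on AB
```

Pooling several short replicated runs into a metachain, as the paper recommends for estimating bipartition posterior probabilities, amounts to concatenating such samples before computing the frequencies.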

Relevance: 30.00%

Abstract:

In the absence of an external frame of reference-i.e., in background independent theories such as general relativity-physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate, using elementary quantum theory, how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of spin networks introduced by Penrose as a combinatorial description of geometry, and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semiclassical limit to derive the fully relational quantum theory) may offer interesting insights on the low energy limit of quantum gravity.

Relevance: 30.00%

Abstract:

We consider the problems of computing the power and exponential moments E[X^s] and E[e^{tX}] of square Gaussian random matrices X = A + BWC for positive integer s and real t, where W is a standard normal random vector and A, B, C are appropriately dimensioned constant matrices. We solve the problems by a matrix product scalarization technique and interpret the solutions in system-theoretic terms. The results of the paper are applicable to Bayesian prediction in multivariate autoregressive time series and mean-reverting diffusion processes.
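The s = 2 case can be checked by Monte Carlo. For simplicity the sketch takes W as a scalar standard normal (the paper treats a general random vector), so the cross terms vanish and E[X^2] = A^2 + (BC)^2; the matrices below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(42)
A = np.array([[1.0, 0.2], [0.0, 0.5]])
B = np.array([[0.3], [0.1]])
C = np.array([[1.0, 0.0]])
BC = B @ C

# Monte Carlo estimate of E[X^2] with X = A + w*BC, w ~ N(0, 1)
mc = np.mean([(A + w * BC) @ (A + w * BC) for w in rng.normal(size=100_000)],
             axis=0)

# Closed form: X^2 = A^2 + w*(A@BC + BC@A) + w^2*(BC)^2, with E[w]=0, E[w^2]=1
exact = A @ A + BC @ BC
```

The scalarization technique in the paper delivers such closed forms for general s and for the exponential moment without any simulation; the Monte Carlo run is only a sanity check of the simplest instance.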

Relevance: 30.00%

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation, and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another. Arbitrary, though often unrealistic, sharp boundaries can be imposed on the model by the application of statistical methods. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessments, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, independent data set model validation, and scale assessments of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r^2 = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including the provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for the production of adequate vegetation maps for conservation and forestry planning of poorly-studied areas. (c) 2006 Elsevier B.V. All rights reserved.