913 results for Averaging Principle
Abstract:
This study considered the problem of predicting survival, based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single “best” model, where “best” is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as “best”, suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise on goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival. Keywords: Bayesian modelling; Bayesian model averaging; Cure model; Markov chain Monte Carlo; Mixture model; Survival analysis; Weibull distribution
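The BMA weighting described in this abstract can be sketched numerically. Under the usual Schwarz approximation, each model's posterior probability is proportional to exp(−BIC/2); the BIC values below are hypothetical, purely for illustration:

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values
    (Schwarz approximation): w_i is proportional to exp(-BIC_i / 2)."""
    # Subtract the minimum BIC before exponentiating, for numerical stability.
    min_bic = min(bics)
    raw = [math.exp(-(b - min_bic) / 2) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical BIC values for the single Weibull, the Weibull mixture,
# and the cure model (illustrative numbers only).
weights = bma_weights([1012.4, 1008.1, 1009.7])
print([round(w, 3) for w in weights])
```

The model with the smallest BIC receives the largest weight, but the other models still contribute to averaged predictions rather than being discarded outright.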
Abstract:
Nitrous oxide (N2O) is one of the greenhouse gases that can contribute to global warming. Spatial variability of N2O can lead to large uncertainties in prediction. However, previous studies have often ignored spatial dependency when quantifying the relationships between N2O and environmental factors. Few studies have examined the impacts of various spatial correlation structures (e.g. independence, distance-based and neighbourhood-based) on spatial prediction of N2O emissions. This study aimed to assess the impact of three spatial correlation structures on spatial predictions and to calibrate the spatial prediction using Bayesian model averaging (BMA) based on replicated, irregular point-referenced data. The data were measured in 17 chambers randomly placed across a 271 m² field between October 2007 and September 2008 in the southeast of Australia. We used a Bayesian geostatistical model and a Bayesian spatial conditional autoregressive (CAR) model to investigate and accommodate spatial dependency, and to estimate the effects of environmental variables on N2O emissions across the study site. We compared these with a Bayesian regression model with independent errors. The three approaches resulted in different derived maps of spatial prediction of N2O emissions. We found that incorporating spatial dependency in the model not only substantially improved predictions of N2O emission from soil, but also better quantified uncertainties of soil parameters in the study. The hybrid model structure obtained by BMA improved the accuracy of spatial prediction of N2O emissions across this study region.
Abstract:
Carbon nanotubes with specific nitrogen doping are proposed for controllable, highly selective, and reversible CO2 capture. Using density functional theory incorporating long-range dispersion corrections, we investigated the adsorption behavior of CO2 on (7,7) single-walled carbon nanotubes (CNTs) with several nitrogen doping configurations and varying charge states. Pyridinic-nitrogen incorporation in CNTs is found to induce an increasing CO2 adsorption strength with electron injection, leading to highly selective CO2 adsorption in comparison with N2. This functionality could enable intrinsically reversible CO2 adsorption, as capture/release can be controlled by switching the charge-carrying state of the system on/off. This phenomenon is verified for a number of different models and theoretical methods, with clear ramifications for the possibility of implementation with a broader class of graphene-based materials. A scheme for the implementation of this remarkable reversible electrocatalytic CO2-capture phenomenon is considered.
Abstract:
To enhance the performance of the k-nearest neighbors approach in forecasting short-term traffic volume, this paper proposed and tested a two-step approach with the ability to forecast multiple steps. In selecting the k nearest neighbors, a time constraint window is introduced, and local minima of the distances between state vectors are then ranked to avoid overlaps among candidates. Moreover, to control the undesirable impact of extreme values, a novel algorithm with attractive analytical features is developed based on the principal component. The enhanced KNN method has been evaluated using field data, and our comparison analysis shows that it outperformed the competing algorithms in most cases.
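The neighbour-selection idea in this abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the time constraint window and the principal-component treatment of extreme values are omitted, and candidates are drawn from the whole history; only the ranking of local minima of the distance sequence is shown.

```python
import math

def knn_forecast(series, k=3, lag=4):
    """Sketch of one-step k-nearest-neighbour traffic-volume forecasting.

    A state vector is the last `lag` observations; historical state
    vectors closest to the current one are used as neighbours, and only
    local minima of the distance sequence are kept so that overlapping
    (adjacent) candidates are not all selected.
    """
    target = series[-lag:]                      # current state vector
    n = len(series)
    dists = [math.dist(series[i:i + lag], target) for i in range(n - lag)]
    # Keep only local minima of the distance sequence.
    minima = [i for i in range(1, len(dists) - 1)
              if dists[i] <= dists[i - 1] and dists[i] <= dists[i + 1]]
    minima.sort(key=lambda i: dists[i])         # rank local minima by distance
    neighbours = minima[:k]
    # Forecast: average of the values that followed each neighbour state.
    return sum(series[i + lag] for i in neighbours) / len(neighbours)
```

On a periodic series (e.g. a daily traffic cycle), the selected neighbours sit at matching phases of earlier cycles, so the forecast tracks the cycle's next value.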
Abstract:
A known limitation of the Probability Ranking Principle (PRP) is that it does not cater for dependence between documents. Recently, the Quantum Probability Ranking Principle (QPRP) has been proposed, which implicitly captures dependencies between documents through “quantum interference”. This paper explores whether this new ranking principle leads to improved performance for subtopic retrieval, where novelty and diversity are required. In a thorough empirical investigation, models based on the PRP, as well as other recently proposed ranking strategies for subtopic retrieval (i.e. Maximal Marginal Relevance (MMR) and Portfolio Theory (PT)), are compared against the QPRP. On the given task, it is shown that the QPRP outperforms these other ranking strategies. Unlike MMR and PT, the QPRP requires no parameter estimation or tuning, making it both simple and effective. This research demonstrates that the application of quantum theory to problems within information retrieval can lead to significant improvements.
Abstract:
In this work, we summarise the development of a ranking principle based on quantum probability theory, called the Quantum Probability Ranking Principle (QPRP), and we also provide an overview of the initial experiments performed employing the QPRP. The main difference between the QPRP and the classic Probability Ranking Principle is that the QPRP implicitly captures the dependencies between documents by means of “quantum interference”. Subsequently, the optimal ranking of documents is not based solely on documents' probability of relevance but also on the interference with the previously ranked documents. Our research shows that the application of quantum theory to problems within information retrieval can lead to consistently better retrieval effectiveness, while still being simple, elegant and tractable.
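The interference-aware ranking described here can be sketched greedily: each next document is chosen to maximise its relevance probability plus interference terms with the documents already ranked. The similarity matrix below is a hypothetical proxy for −cos(θ) in the interference term (the QPRP papers estimate the interference phase differently), so similar documents interfere destructively and diversity is promoted:

```python
import math

def qprp_rank(probs, sim):
    """Greedy QPRP-style ranking sketch.

    probs: relevance probability of each document.
    sim:   pairwise document-similarity matrix, used as a hypothetical
           proxy for -cos(theta) in the interference term.
    Each next document d maximises
        p_d + sum over ranked j of 2*sqrt(p_d * p_j)*cos(theta_{d,j}),
    i.e. here p_d minus 2*sqrt(p_d * p_j)*sim[d][j] summed over j.
    """
    remaining = set(range(len(probs)))
    ranking = []
    while remaining:
        def score(d):
            interference = sum(-2 * math.sqrt(probs[d] * probs[j]) * sim[d][j]
                               for j in ranking)
            return probs[d] + interference
        best = max(remaining, key=score)
        ranking.append(best)
        remaining.remove(best)
    return ranking
```

For example, with probabilities (0.9, 0.85, 0.5) and documents 0 and 1 nearly duplicates, the greedy order becomes [0, 2, 1], whereas the classic PRP would return [0, 1, 2].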
Abstract:
While the Probability Ranking Principle for Information Retrieval provides the basis for formal models, it makes a very strong assumption regarding the dependence between documents. However, it has been observed that in real situations this assumption does not always hold. In this paper we propose a reformulation of the Probability Ranking Principle based on quantum theory. Quantum probability theory naturally includes interference effects between events. We posit that this interference captures the dependencies between judgements of document relevance. The outcome is a more sophisticated principle, the Quantum Probability Ranking Principle, that provides a more sensitive ranking which caters for interference/dependence between documents' relevance.
Contrast transfer function correction applied to cryo-electron tomography and sub-tomogram averaging
Abstract:
Cryo-electron tomography together with averaging of sub-tomograms containing identical particles can reveal the structure of proteins or protein complexes in their native environment. The resolution of this technique is limited by the contrast transfer function (CTF) of the microscope. The CTF is not routinely corrected in cryo-electron tomography because of difficulties including CTF detection, due to the low signal-to-noise ratio, and CTF correction, since images are characterised by a spatially variant CTF. Here we simulate the effects of the CTF on the resolution of the final reconstruction, before and after CTF correction, and consider the effect of errors and approximations in defocus determination. We show that errors in defocus determination are well tolerated when correcting a series of tomograms collected at a range of defocus values. We apply methods for determining the CTF parameters in low signal-to-noise images of tilted specimens, for monitoring defocus changes using observed magnification changes, and for correcting the CTF prior to reconstruction. Using bacteriophage PRD1 as a test sample, we demonstrate that this approach gives an improvement in the structure obtained by sub-tomogram averaging from cryo-electron tomograms.
Abstract:
Symmetry is a fundamental property found in both the physical and natural worlds. Bilateral symmetry is also present in the organization of the brain; however, the degree to which symmetry is also an organizing principle between and within the key constituent elements of the nervous system, neurons, is not known. We compared and contrasted the structural organization of principal neurons (PN) in the three subnuclei of the lateral amygdala (LA) of the rat and, for comparison, also from the infralimbic cortex (IL)...
Abstract:
The maximum principle for space and time–space fractional partial differential equations is still an open problem. In this paper, we consider a multi-term time–space Riesz–Caputo fractional differential equation over an open bounded domain. A maximum principle for the equation is proved. The uniqueness and continuous dependence of the solution are derived. Using a fractional predictor–corrector method combining the L1 and L2 discrete schemes, we present a numerical method for the specified equation. Two examples are given to illustrate the obtained results.
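For reference, the L1 scheme mentioned in this abstract is commonly written in the following standard textbook form for the Caputo derivative of order α ∈ (0, 1) on a uniform grid t_n = nΔt (the paper's exact discretisation may differ):

```latex
{}^{C}\!D_t^{\alpha} u(t_n) \approx
  \frac{\Delta t^{-\alpha}}{\Gamma(2-\alpha)}
  \sum_{k=0}^{n-1} b_k \bigl[\, u(t_{n-k}) - u(t_{n-k-1}) \,\bigr],
\qquad
b_k = (k+1)^{1-\alpha} - k^{1-\alpha}.
```

The L2 scheme plays the analogous role for orders α ∈ (1, 2), using second differences of u on the same grid.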
Abstract:
Two studies documented the “David and Goliath” rule—the tendency for people to perceive criticism of “David” groups (groups with low power and status) as less normatively permissible than criticism of “Goliath” groups (groups with high power and status). The authors confirmed the existence of the David and Goliath rule across Western and Chinese cultures (Study 1). However, the rule was endorsed more strongly in Western than in Chinese cultures, an effect mediated by cultural differences in power distance. Study 2 identified the psychological underpinnings of this rule in an Australian sample. Lower social dominance orientation (SDO) was associated with greater endorsement of the rule, an effect mediated through the differential attribution of stereotypes. Specifically, those low in SDO were more likely to attribute traits of warmth and incompetence to David versus Goliath groups, a pattern of stereotypes that was related to the protection of David groups from criticism.
Abstract:
Nanotubes and nanosheets are low-dimensional nanomaterials with unique properties that can be exploited for numerous applications. This book offers a complete overview of their structure, properties, development, modeling approaches, and practical use. It focuses attention on boron nitride (BN) nanotubes, which have attracted major interest given their special high-temperature properties, as well as graphene nanosheets, BN nanosheets, and metal oxide nanosheets. Key topics include surface functionalization of nanotubes for composite applications, wetting property changes for biocompatible environments, and graphene for energy storage applications.
Abstract:
This thesis considers whether the Australian Privacy Commissioner's use of its powers supports compliance with the requirement to 'take reasonable steps' to protect personal information in National Privacy Principle 4 of the Privacy Act 1988 (Cth). Two unique lenses were used. First, the Commissioner's use of powers was assessed against the principles of transparency, balance and vigorousness, and secondly against alignment with an industry practice approach to securing information. Following a comprehensive review of publicly available materials, interviews and investigation file records, this thesis found that the Commissioner's use of its powers has not been transparent, balanced or vigorous, nor has it been supportive of an industry practice approach to securing data. Accordingly, it concludes that the Privacy Commissioner's use of its regulatory powers is unlikely to result in any significant improvement to the security of personal information held by organisations in Australia.
Abstract:
This discussion paper is intended to provide background material for the workshop organised by Queensland University of Technology (QUT) on 17 October 2014. The overall purpose of the workshop is to better understand the relationship between the precautionary principle and endangered species management in Australia. In particular, we are looking for real-life examples (or hypotheticals) of where the principle is (or is not) being applied in relation to Australia's endangered species. A wide variety of participants have been invited to the workshop, including scientists, representatives of NGOs, lawyers and academics. Whilst some very general information is outlined below, we encourage all participants to bring their own thoughts on how the precautionary principle should operate and to reflect on examples of where you have seen it work (or not work) in Australia. The sharing of your own case studies is thus encouraged.
Abstract:
A teaching laboratory experiment is described that uses Archimedes' principle to precisely investigate the effect of global warming on the oceans. A large component of sea level rise is due to the increase in the volume of water caused by the decrease in water density with increasing temperature. Water close to 0 °C is placed in a beaker and a glass marble hung from an electronic balance is immersed in the water. As the water warms, the apparent weight of the marble increases because the buoyant force falls with the decreasing water density. In the experiment performed in this paper, a balance with a precision of 0.1 mg was used with a marble of volume 40.0 cm³ and mass 99.3 g, yielding water density measurements with an average error of -0.008 ± 0.011%.
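The size of the effect being measured can be sketched from the figures in the abstract. The apparent mass read by the balance is the marble's true mass minus the mass of displaced water; the density values below are standard handbook figures for pure water, not taken from the paper:

```python
# Marble parameters from the abstract.
V = 40.0   # volume, cm^3
m = 99.3   # mass, g

def apparent_mass(rho_water):
    """Apparent mass (g) of the fully submerged marble for a given
    water density (g/cm^3): true mass minus mass of displaced water."""
    return m - rho_water * V

# Handbook densities of water at 0 C and 20 C (g/cm^3).
rho_0C, rho_20C = 0.99984, 0.99821

# Apparent-mass increase (in mg) as the water warms from 0 C to 20 C.
shift_mg = (apparent_mass(rho_20C) - apparent_mass(rho_0C)) * 1000
print(round(shift_mg, 1))
```

The shift is on the order of 65 mg over this temperature range, comfortably resolvable with the 0.1 mg balance described in the paper. (Note that pure water is densest near 4 °C, so the apparent mass first dips slightly before rising.)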