234 results for classical over barrier model (COBM)


Relevance:

30.00%

Publisher:

Abstract:

'Approximate Bayesian Computation' (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based, Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (M_gal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The 'Sequential Monte Carlo' implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. Through our chosen case study we also highlight the value of careful summary statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
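
The core mechanism being exploited here can be conveyed with a toy example. The sketch below is a minimal ABC rejection sampler, not the paper's adaptive Sequential Monte Carlo implementation: the one-parameter "merger rate", the binomial simulator and the bulge-fraction summary statistic are hypothetical stand-ins for the stochastic galaxy-evolution model and CANDELS summary statistics described in the abstract.

```python
# A minimal ABC rejection sketch (illustrative only, not the paper's ABC-SMC):
# a hypothetical one-parameter "merger rate" governs a toy binomial simulator
# for the number of merger-built bulges in a galaxy sample.
import numpy as np

rng = np.random.default_rng(0)
N_GAL = 500                                     # assumed sample size

def simulate(merger_rate):
    """Toy stochastic simulator: count of bulge-hosting galaxies."""
    p_bulge = 1.0 - np.exp(-merger_rate)        # hypothetical link from rate to fraction
    return rng.binomial(N_GAL, p_bulge)

def summary(count):
    """Summary statistic: observed bulge fraction."""
    return count / N_GAL

observed = summary(simulate(0.4))               # pretend these are the observed data

epsilon, accepted = 0.02, []                    # fixed tolerance for this sketch
for _ in range(20_000):
    theta = rng.uniform(0.0, 2.0)               # draw from a uniform prior
    if abs(summary(simulate(theta)) - observed) < epsilon:
        accepted.append(theta)                  # keep parameters that reproduce the data

print(f"accepted {len(accepted)} draws, posterior mean ~ {np.mean(accepted):.3f}")
```

An ABC-SMC implementation of the kind described in the abstract replaces the single fixed tolerance with a self-generating sequence of shrinking tolerances and moves the surviving particles with an MCMC kernel between iterations.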

Relevance:

30.00%

Publisher:

Abstract:

Schottky barrier solar cells based on a graphene/n-silicon heterojunction have been fabricated and characterized, and the effect of molecular doping of the graphene with HNO3 on solar cell performance has been analyzed. Different doping conditions and thermal annealing processes have been tested to assess and optimize the stability of the devices. The power conversion efficiency (PCE) of the cells increases after the HNO3 treatment and reaches 5% in devices treated at 200 °C immediately before exposure to the oxidant. To date, our devices retain about 80% of their efficiency over a period of two weeks, which represents good stability for devices of this kind.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to review, critique and develop a research agenda for the Elaboration Likelihood Model (ELM). The model was introduced by Petty and Cacioppo over three decades ago and has since been modified, revised and extended. Given modern communication contexts, it is appropriate to question the model's validity and relevance. Design/methodology/approach: The authors develop a conceptual approach, based on a comprehensive and extensive review and critique of the ELM and its development since its inception. Findings: This paper focuses on major issues concerning the ELM, including its assumptions and descriptive nature, questions about the elaboration continuum, multi-channel processing and mediating variables, before turning to the need to replicate the ELM and offering recommendations for its future development. Research limitations/implications: This paper raises a series of questions as research implications: whether the ELM could or should be replicated, how it might be extended, a fuller conceptualization of argument quality, an explanation of movement along the continuum and between the central and peripheral routes to persuasion, and whether new methodologies and technologies could be used to better understand consumer thinking and behaviour. All of these relate to the current need to explore the relevance of the ELM in a more modern context. Practical implications: It is time to question the validity and relevance of the ELM. The diversity of online and offline media options and the variety of consumer choice raise significant issues. Originality/value: While the ELM continues to be widely cited and taught as one of the major cornerstones of persuasion, questions are raised concerning its relevance and validity in 21st-century communication contexts.

Relevance:

30.00%

Publisher:

Abstract:

Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
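
As a rough illustration of how uncertainty can be partitioned across levels of such a hierarchy, the sketch below fits a simple two-level normal hierarchical model (transects nested within reefs) by Gibbs sampling. It is only a sketch of the general idea under stated assumptions: the reef structure, sample sizes, priors and all numerical values are invented, and the paper's actual model is semi-parametric and four-tiered.

```python
# A minimal sketch of hierarchical variance partitioning (not the paper's
# semi-parametric, four-tier model): a two-level normal model, transects
# nested within reefs, fitted by Gibbs sampling on simulated data.
import numpy as np

rng = np.random.default_rng(1)

n_reefs, n_obs = 8, 20                                   # assumed design
true_means = rng.normal(40.0, 5.0, n_reefs)              # true reef-level cover (%)
y = np.array([rng.normal(m, 10.0, n_obs) for m in true_means])

mu0, sigma2, tau2 = y.mean(), y.var(), y.mean(axis=1).var()
draws = []
for it in range(2000):
    # reef means | everything else
    prec = n_obs / sigma2 + 1.0 / tau2
    mean = (y.sum(axis=1) / sigma2 + mu0 / tau2) / prec
    reef_means = rng.normal(mean, np.sqrt(1.0 / prec))
    # grand mean | reef means (flat prior)
    mu0 = rng.normal(reef_means.mean(), np.sqrt(tau2 / n_reefs))
    # variance components | the rest (vague inverse-gamma priors)
    ss_within = ((y - reef_means[:, None]) ** 2).sum()
    sigma2 = 1.0 / rng.gamma(0.01 + y.size / 2, 1.0 / (0.01 + 0.5 * ss_within))
    ss_between = ((reef_means - mu0) ** 2).sum()
    tau2 = 1.0 / rng.gamma(0.01 + n_reefs / 2, 1.0 / (0.01 + 0.5 * ss_between))
    draws.append((mu0, np.sqrt(sigma2), np.sqrt(tau2)))

mu_hat, sd_within, sd_between = np.mean(draws[500:], axis=0)
print(f"grand mean ~ {mu_hat:.1f}, within-reef sd ~ {sd_within:.1f}, "
      f"between-reef sd ~ {sd_between:.1f}")
```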

Relevance:

30.00%

Publisher:

Abstract:

Background Extracorporeal membrane oxygenation (ECMO) circuits have been shown to sequester circulating blood compounds such as drugs based on their physicochemical properties. This study aimed to describe the disposition of macro- and micronutrients in simulated ECMO circuits. Methods Following baseline sampling, known quantities of macro- and micronutrients were injected post oxygenator into ex vivo ECMO circuits primed with fresh human whole blood and maintained under standard physiologic conditions. Serial blood samples were then obtained at 1, 30 and 60 min and at 6, 12 and 24 h after the addition of nutrients, to measure the concentrations of study compounds using validated assays. Results Twenty-one samples were tested for thirty-one nutrient compounds. There were significant reductions (p < 0.05) in circuit concentrations of some amino acids [alanine (10%), arginine (95%), cysteine (14%), glutamine (25%) and isoleucine (7%)], vitamins [A (42%) and E (6%)] and glucose (42%) over 24 h. Significant increases in circuit concentrations (p < 0.05) were observed over time for many amino acids, zinc and vitamin C. There were no significant reductions in total proteins, triglycerides, total cholesterol, selenium, copper, manganese and vitamin D concentrations within the ECMO circuit over a 24-h period. No clear correlation could be established between physicochemical properties and circuit behaviour of the tested nutrients. Conclusions Significant alterations in macro- and micronutrient concentrations were observed in this single-dose ex vivo circuit study. Most significantly, there is potential for circuit loss of the essential amino acid isoleucine and the lipid-soluble vitamins (A and E) in the ECMO circuit, and the mechanisms for this need further exploration. While the reductions in glucose concentrations and the increases in other macro- and micronutrient concentrations probably reflect cellular metabolism and breakdown, the decrements in arginine and glutamine concentrations may be attributed to their enzymatic conversion to ornithine and glutamate, respectively. While the results are generally reassuring from a macronutrient perspective, prospective studies in clinical subjects are indicated to further evaluate the influence of the ECMO circuit on micronutrient concentrations and clinical outcomes.

Relevance:

30.00%

Publisher:

Abstract:

One of the Department of Defense's most pressing environmental problems is the efficient detection and identification of unexploded ordnance (UXO). In regions of highly magnetic soils, magnetic and electromagnetic sensors often detect anomalies that are of geologic origin, adding significantly to remediation costs. In order to develop predictive models for magnetic susceptibility, it is crucial to understand the modes of formation and the spatial distribution of different iron oxides. Most rock types contain iron, and their magnetic susceptibility is determined by the amount and form of the iron oxides present. When rocks weather, the amount and form of the oxides change, producing concomitant changes in magnetic susceptibility. The type of iron oxide found in the weathered rock or regolith is a function of the duration and intensity of weathering, as well as the original iron content of the parent material. The rate of weathering is controlled by rainfall and temperature; thus, knowing the climate zone, the amount of iron in the lithology and the age of the surface will help predict the amount and forms of iron oxide. We have compiled analyses of the types, amounts and magnetic properties of iron oxides from soils over a wide climate range, from semi-arid grasslands to temperate regions and tropical forests. We find there is a predictable range of iron oxide type and magnetic susceptibility according to the climate zone, the age of the soil and the amount of iron in the unweathered regolith.

Relevance:

30.00%

Publisher:

Abstract:

This study began with the aim of developing an approach that helps designers create interfaces that are more intuitive for older adults to use. Two objectives were set: 1) to investigate one of the possible strategies for developing intuitive interfaces for older people; and 2) to investigate factors that could interfere with intuitive use. This paper briefly presents the outcomes of the two experiments and how they led to the development of an adaptable interface design model that will help designers develop interfaces that are intuitive to learn and, over time, intuitive to use for users with diverse prior technology experience and cognitive abilities.

Relevance:

30.00%

Publisher:

Abstract:

Approaches to problem solving in project management, particularly in construction, are continually reshaped by the adoption of dynamic solution paths. This paper defines what we argue to be a better relational model for the project management constraints of time, cost and scope, intended to increase the success factors of any complex programme or project. The research is qualitative and follows a new avenue of investigation: project activities are mapped to social phenomena and supported by field observations rather than mathematical methods, drawing on the successful practices of both human and ant colonies. The results frame the relationship between the triple constraints as a multi-agent system with communication channels specified by agent location. Information is transferred between agents, and actions are taken according to the positions of the constraint agents in the project structure, allowing immediate changes to be made in order to counter cost overruns, schedule slippage and scope creep. The result is a complex adaptive system with self-organization and cybernetic control. The resulting model can be used to improve existing project management methodologies.

Relevance:

30.00%

Publisher:

Abstract:

Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (as in GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to sealing the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we produce a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
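
The hiding step borrowed from Lyubashevsky-style signatures can be sketched in isolation. The code below shows only the generic masking-plus-rejection-sampling idea on a toy vector: the dimension, the noise width sigma, the rejection bound M and the stand-in "leaky signature" vector are assumptions for illustration and are unrelated to the concrete NTRUSign parameters proposed in the paper.

```python
# A sketch of the generic "hide with Gaussian noise + rejection sampling" idea
# only (not the paper's scheme or parameters): a secret-dependent vector s is
# masked as z = s + y so that, up to a small statistical distance, z follows a
# fixed centred Gaussian and leaks nothing about s. N, SIGMA and M are toy values.
import numpy as np

rng = np.random.default_rng(42)
N, SIGMA, M = 64, 100.0, 3.0                    # toy dimension, noise width, rejection bound

def gauss(x, center):
    """Unnormalised Gaussian density with width SIGMA (constants cancel in ratios)."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * SIGMA ** 2))

def hide(s):
    """Mask s with Gaussian noise, rejection-sampling so the output is ~ N(0, SIGMA^2 I)."""
    while True:
        z = s + rng.normal(0.0, SIGMA, size=s.shape)
        accept = min(1.0, gauss(z, 0.0) / (M * gauss(z, s)))
        if rng.random() < accept:
            return z

leaky = rng.integers(-5, 6, size=N).astype(float)   # stand-in for a leaky signature vector
masked = hide(leaky)
print("masked vector mean / std:", round(masked.mean(), 2), round(masked.std(), 2))
```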

Relevance:

30.00%

Publisher:

Abstract:

In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
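
A rough sketch of the external-field idea is given below: the segmentation of an earlier image contributes a Gaussian bump over the pixel grid that biases each pixel's label probabilities, here combined with a toy intensity likelihood and with the Potts neighbour interaction omitted. The grid size, object position, field strength and intensity model are illustrative assumptions, not the paper's implementation.

```python
# A sketch of the external-field prior (illustrative, not the paper's code): a
# Gaussian bump centred on an object's position in a previous image acts as a
# spatial prior on pixel labels, combined here with a toy intensity likelihood;
# the Potts neighbour term is omitted. Grid size, centre and strength are assumed.
import numpy as np

H, W = 64, 64
yy, xx = np.mgrid[0:H, 0:W]
rng = np.random.default_rng(0)

def bump(center, scale):
    """Unnormalised Gaussian field over the pixel grid."""
    cy, cx = center
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2.0 * scale ** 2))

# External field: prior probability of the "object" label from an earlier segmentation.
prior_obj = bump(center=(30, 22), scale=6.0)
log_prior = np.log(np.stack([1.0 - prior_obj, prior_obj]) + 1e-9)   # labels 0 / 1

# Toy image: unit-variance noise, object region ~0.8 brighter.
image = rng.normal(0.0, 1.0, (H, W)) + 0.8 * (prior_obj > 0.5)
log_lik = np.stack([-0.5 * (image - 0.0) ** 2, -0.5 * (image - 0.8) ** 2])

beta_field = 2.0                                                    # external-field strength
labels = (log_lik + beta_field * log_prior).argmax(axis=0)          # best label per pixel
print("pixels labelled as object:", int(labels.sum()))
```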

Relevance:

30.00%

Publisher:

Abstract:

We describe the development and parameterization of a grid-based model of African savanna vegetation processes. The model was developed with the objective of exploring elephant effects on the diversity of savanna species and structure, and in this formulation concentrates on the relative cover of grass and woody plants, the vertical structure of the woody plant community, and the distribution of these over space. Grid cells are linked by seed dispersal and fire, and environmental variability is included in the form of stochastic rainfall and fire events. The model was parameterized from an extensive review of the African savanna literature; where parameter values were available, they often varied widely. The most plausible set of parameters produced long-term coexistence between woody plants and grass, with the tree-grass balance being more sensitive to changes in parameters influencing demographic processes and drought incidence and response, and less sensitive to the fire regime. There was considerable diversity in the woody structure of savanna systems within the range of uncertainty in tree growth rate parameters. Thus, given the paucity of height growth data for woody plant species in southern African savannas, managers of natural areas should be cognizant of the growth and damage-response attributes of different tree species when considering whether to act on perceived elephant threats to vegetation. © 2007 Springer Science+Business Media B.V.
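
The flavour of such a grid-based formulation can be conveyed with a small simulation. The sketch below evolves woody cover on a grid under stochastic annual rainfall and occasional grass-fuelled fires; every rate, threshold and rule is an invented placeholder rather than the published parameterization.

```python
# A sketch of a grid-based tree-grass model with stochastic rainfall and fire
# (all rates, thresholds and rules are invented placeholders, not the published
# parameterization).
import numpy as np

rng = np.random.default_rng(3)
N, YEARS = 50, 200
tree = np.full((N, N), 0.2)                           # woody cover fraction per cell

for year in range(YEARS):
    grass = 1.0 - tree                                # grass fills whatever trees do not
    rain = rng.gamma(shape=4.0, scale=150.0)          # stochastic annual rainfall (mm)
    growth = 0.02 * min(rain / 600.0, 1.5)            # rainfall-limited woody growth rate
    tree = np.clip(tree + growth * tree * (1.0 - tree), 0.0, 1.0)
    if rng.random() < 0.2:                            # a fire year
        burnt = rng.random((N, N)) < 0.5 * grass      # grassy cells carry fire more readily
        tree[burnt] *= 0.6                            # fire knocks back woody cover

print(f"mean woody cover after {YEARS} years: {tree.mean():.2f}")
```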

Relevance:

30.00%

Publisher:

Abstract:

There is a concern that high densities of elephants in southern Africa could lead to the overall reduction of other forms of biodiversity. We present a grid-based model of elephant-savanna dynamics, which differs from previous elephant-vegetation models by accounting for woody plant demographics, tree-grass interactions, stochastic environmental variables (fire and rainfall), and spatial contagion of fire and tree recruitment. The model projects changes in height structure and spatial pattern of trees over periods of centuries. The vegetation component of the model produces long-term tree-grass coexistence, and the emergent fire frequencies match those reported for southern African savannas. Including elephants in the savanna model had the expected effect of reducing woody plant cover, mainly via increased adult tree mortality, although at an elephant density of 1.0 elephant/km², woody plants still persisted for over a century. We tested three different scenarios in addition to our default assumptions. (1) Reducing mortality of adult trees after elephant use, mimicking a more browsing-tolerant tree species, mitigated the detrimental effect of elephants on the woody population. (2) Coupling germination success (increased seedling recruitment) to elephant browsing further increased tree persistence, and (3) a faster growing woody component allowed some woody plant persistence for at least a century at a density of 3 elephants/km². Quantitative models of the kind presented here provide a valuable tool for exploring the consequences of management decisions involving the manipulation of elephant population densities. © 2005 by the Ecological Society of America.
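
The elephant effect itself reduces, in its simplest form, to extra mortality applied to taller trees. The sketch below runs a toy height-structured tree population with recruitment, growth between classes and an additional adult mortality term scaled by elephant density; the three classes, all rates and the density value are illustrative assumptions, not the spatial model described in the abstract.

```python
# A sketch of the elephant effect in its simplest form (not the spatial model
# described above): a height-structured tree population with recruitment,
# growth between classes, and extra adult mortality scaled by elephant density.
import numpy as np

n = np.array([500.0, 200.0, 100.0])             # seedlings, juveniles, adults per km^2
RECRUIT, GROW, BASE_MORT = 0.30, 0.10, 0.05     # per-year rates (assumed)
ELEPHANT_DENSITY = 1.0                          # elephants per km^2 (illustrative)
elephant_mort = 0.04 * ELEPHANT_DENSITY         # extra adult mortality per elephant/km^2

for year in range(100):
    new = np.empty_like(n)
    new[0] = n[0] * (1 - GROW - BASE_MORT) + RECRUIT * n[2]          # recruitment from adults
    new[1] = n[1] * (1 - GROW - BASE_MORT) + GROW * n[0]             # growth into juveniles
    new[2] = n[2] * (1 - BASE_MORT - elephant_mort) + GROW * n[1]    # growth into adults
    n = new

print("trees per km^2 after a century (seedling, juvenile, adult):", n.round(1))
```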

Relevance:

30.00%

Publisher:

Abstract:

Australia's governance of land and natural resources involves multiple polycentric domains of decision-making from global through to local levels. Although certainly complex, these arrangements have not necessarily translated into better decision-making or better environmental outcomes, as evidenced by the growing concerns over the health and future of the Great Barrier Reef (GBR). Within this system, however, arrangements for natural resource management (NRM) and reef water quality, both of which use Australia's integrated regional NRM model, have shown signs of improving decision-making and environmental outcomes in the GBR. In this paper we describe the latest evolution in the governance and planning of natural resource use and management in Australia. We begin by reviewing the experience with first-generation NRM as published in major audits and evaluations. As our primary interest is the health and future of the GBR, we then consider the impact of changes in second-generation planning and governance outcomes in Queensland. We find that first-generation plans, although developed under a relatively cohesive governance context, faced substantial problems in target setting, implementation, monitoring and review. Despite this, they were able to progress improvements in water quality in the Great Barrier Reef regions. Second-generation plans, currently being developed, face an even greater risk of failure due to the lack of bilateralism and cross-sectoral cooperation across the NRM governance system. The findings highlight the critical need to rebuild and enhance the regional NRM model for NRM planning to have a positive impact on environmental outcomes in the GBR.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC². The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series and the cumulative number of prion disease cases in mule deer.
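
The alive particle filter at the heart of alive SMC² can be sketched for a single parameter value. In the toy example below, a hypothetical immigration-death process stands in for the latent model and the observed counts are made up: at each time step particles are propagated until N + 1 of them exactly match the observation, the last one is discarded, and N/(T - 1) gives an unbiased estimate of that observation's likelihood contribution.

```python
# A sketch of the alive particle filter for one parameter value (the
# immigration-death transition and the observed counts are made up, not the
# paper's models or data).
import numpy as np

rng = np.random.default_rng(7)

def propagate(x, theta):
    """Toy latent transition: Poisson immigration and binomial deaths."""
    return x + rng.poisson(theta) - rng.binomial(x, 0.3)

def alive_filter_loglik(y_obs, theta, n_alive=100, max_draws=100_000):
    particles = np.zeros(n_alive, dtype=int)
    loglik = 0.0
    for y in y_obs:
        matched, draws = [], 0
        while len(matched) < n_alive + 1:            # sample until N + 1 particles are "alive"
            x = propagate(particles[rng.integers(n_alive)], theta)
            draws += 1
            if x == y:                               # indicator weight: exact match
                matched.append(x)
            if draws > max_draws:
                return -np.inf                       # parameter effectively ruled out
        loglik += np.log(n_alive / (draws - 1))      # unbiased estimate of p(y_t | y_1:t-1)
        particles = np.array(matched[:n_alive])      # keep N, discard the last alive particle
    return loglik

y_obs = [1, 2, 2, 3, 1, 0, 1]                        # hypothetical low-count series
print("estimated log-likelihood at theta = 1.0:",
      round(alive_filter_loglik(y_obs, theta=1.0), 2))
```

Embedding this likelihood estimator inside an SMC sampler over the parameters is what makes the overall algorithm pseudo-marginal, since the exact likelihood is replaced by an unbiased estimate.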

Relevance:

30.00%

Publisher:

Abstract:

Pilot- and industrial-scale dilute acid pretreatment data can be difficult to obtain due to the significant infrastructure investment required. Consequently, models of dilute acid pretreatment by necessity use laboratory-scale data to determine kinetic parameters and make predictions about optimal pretreatment conditions at larger scales. In order for these recommendations to be meaningful, the ability of laboratory-scale models to predict pilot- and industrial-scale yields must be investigated. A mathematical model of the dilute acid pretreatment of sugarcane bagasse has previously been developed by the authors. This model was able to successfully reproduce the experimental yields of xylose and short-chain xylooligomers obtained at the laboratory scale. In this paper, the ability of the model to reproduce pilot-scale yield and composition data is examined. It was found that in general the model over-predicted the pilot-scale reactor yields by a significant margin. Models that appear very promising at the laboratory scale may have limitations when predicting yields at pilot or industrial scale. It is difficult to comment on whether there are any consistent trends in optimal operating conditions between reactor-scale and laboratory-scale hydrolysis due to the limited reactor datasets available. Further investigation is needed to determine whether the model has some efficacy when the kinetic parameters are re-evaluated by parameter fitting to reactor-scale data; however, this requires the compilation of larger datasets. Alternatively, laboratory-scale mathematical models may have enhanced utility for predicting larger-scale reactor performance if bulk mass transport and fluid flow considerations are incorporated into the fibre-scale equations. This work reinforces the need for appropriate attention to be paid to pilot-scale experimental development when moving from laboratory to pilot and industrial scales for new technologies.
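
For readers unfamiliar with the underlying kinetics, the sketch below shows the kind of first-order (Saeman-type) scheme that dilute acid pretreatment models typically build on: xylan hydrolysing to xylose, which in turn degrades, with Arrhenius temperature dependence. The pre-exponential factors, activation energies and conditions are placeholder values, not the fitted parameters of the authors' model.

```python
# A sketch of the first-order (Saeman-type) kinetics such pretreatment models
# typically build on (not the authors' model): xylan -> xylose -> degradation
# products, with Arrhenius temperature dependence and placeholder parameters.
import numpy as np

R = 8.314                                          # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Rate constant (1/min) at absolute temperature T."""
    return A * np.exp(-Ea / (R * T))

def xylose_yield(T_kelvin, minutes, dt=0.01):
    k1 = arrhenius(1e13, 1.1e5, T_kelvin)          # xylan hydrolysis (assumed values)
    k2 = arrhenius(1e12, 1.2e5, T_kelvin)          # xylose degradation (assumed values)
    xylan, xylose = 1.0, 0.0                       # fractions of initial xylan
    for _ in range(int(minutes / dt)):             # explicit Euler integration
        d_xylan = -k1 * xylan
        d_xylose = k1 * xylan - k2 * xylose
        xylan += d_xylan * dt
        xylose += d_xylose * dt
    return xylose

for temp_c in (150, 160, 170):
    y = xylose_yield(temp_c + 273.15, minutes=20)
    print(f"{temp_c} degC, 20 min: predicted xylose yield ~ {y:.2f}")
```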