41 results for stochastic actor-based models


Relevance: 100.00%

Abstract:

A conventional local model (LM) network consists of a set of affine local models blended together using appropriate weighting functions. Such networks have poor interpretability, since the dynamics of the blended network are only weakly related to the underlying local models. In contrast, velocity-based LM networks employ strictly linear local models to provide a transparent framework for nonlinear modelling in which the global dynamics are a simple linear combination of the local model dynamics. A novel approach for constructing continuous-time velocity-based networks from plant data is presented. Key issues, including continuous-time parameter estimation, correct realisation of the velocity-based local models and avoidance of the input derivative, are all addressed. Application results are reported for the highly nonlinear simulated continuous stirred tank reactor process.
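The blending idea above can be illustrated with a toy one-dimensional sketch. This is not the paper's algorithm: the two decay rates, the Gaussian validity functions and their centres are illustrative assumptions, chosen only to show that the global derivative is a normalised weighted sum of strictly linear local dynamics.

```python
import math

def weight(x, centre, width=1.0):
    """Unnormalised Gaussian validity function for one operating region (illustrative)."""
    return math.exp(-((x - centre) / width) ** 2)

def blended_derivative(x, local_rates, centres):
    """Global dynamics as a weighted combination of linear local models dx/dt = a_i * x."""
    ws = [weight(x, c) for c in centres]
    total = sum(ws)
    return sum((w / total) * a * x for w, a in zip(ws, local_rates))

# Two hypothetical local models: decay rate -1 near x=0 and -3 near x=2.
rate_at_origin = blended_derivative(0.0, [-1.0, -3.0], [0.0, 2.0])
rate_between = blended_derivative(1.0, [-1.0, -3.0], [0.0, 2.0])
```

Midway between the two operating points the weights are equal, so the blended rate is simply the average of the two local rates.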

Relevance: 100.00%

Abstract:

Homology modeling was used to build 3D models of the N-methyl-D-aspartate (NMDA) receptor glycine binding site on the basis of an X-ray structure of the water-soluble AMPA-sensitive receptor. The docking of agonists and antagonists to these models was used to reveal binding modes of ligands and to explain known structure-activity relationships. Two types of quantitative models, 3D-QSAR/CoMFA and a regression model based on docking energies, were built for antagonists (derivatives of 4-hydroxy-2-quinolone, quinoxaline-2,3-dione, and related compounds). The CoMFA steric and electrostatic maps were superimposed on the homology-based model, and a close correspondence was observed. The derived computational models have permitted the evaluation of the structural features crucial for high glycine binding site affinity and are important for the design of new ligands.

Relevance: 100.00%

Abstract:

In establishing the reliability of performance-related design methods for concrete (relevant for resistance against chloride-induced corrosion), long-term experience of local materials and practices and detailed knowledge of the ambient and local micro-climate are critical. Furthermore, in the development of analytical models for performance-based design, calibration against test data representative of actual conditions in practice is required. To this end, the current study presents results from full-scale concrete pier-stems under long-term exposure to a marine environment, with work focussing on XS2 (below mid-tide level), in which the concrete is regarded as fully saturated, and XS3 (tidal, splash and spray), in which the concrete is in an unsaturated condition. These exposures represent the zones where concrete structures are most susceptible to ionic ingress and deterioration. Chloride profiles and chloride transport behaviour are studied using both an empirical model (erfc function) and a physical model (ClinConc). The time dependency of the surface chloride concentration (Cs) and the apparent diffusivity (Da) was established for the empirical model, whereas in the ClinConc model (originally based on saturated concrete) two new environmental factors were introduced for the XS3 exposure zone. Although XS3 is considered a single environmental exposure zone according to BS EN 206-1:2013, the work has highlighted that, even within this zone, significant changes in chloride ingress are evident. This study aims to update the parameters of both models for predicting the long-term transport behaviour of concrete subjected to environmental exposure classes XS2 and XS3.
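The empirical erfc model mentioned above has the standard closed form C(x, t) = Cs · erfc(x / (2·√(Da·t))). A minimal sketch follows; the Cs and Da values are illustrative assumptions, not results from the study, which establishes their time dependency from measured profiles.

```python
import math

def chloride_profile(x_mm, t_years, cs_pct, da_mm2_per_year):
    """Empirical erfc model: chloride content at depth x_mm after t_years.

    cs_pct: surface chloride concentration Cs (illustrative units, % binder).
    da_mm2_per_year: apparent diffusivity Da (illustrative value).
    """
    return cs_pct * math.erfc(x_mm / (2.0 * math.sqrt(da_mm2_per_year * t_years)))

# Hypothetical parameters: Cs = 0.6%, Da = 30 mm^2/year, 10 years' exposure.
surface = chloride_profile(0.0, 10.0, 0.6, 30.0)    # erfc(0) = 1, so this is Cs
at_depth = chloride_profile(25.0, 10.0, 0.6, 30.0)  # decays with cover depth
```

Fitting Cs and Da to measured profiles, and letting them vary with exposure time, is what the study's empirical model update amounts to.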

Relevance: 100.00%

Abstract:

A companion paper described the partial-interaction localised properties that require the development of pseudo properties. If the quantification through experimental testing of these pseudo properties could be removed by the use of mechanics-based models, which is the subject of this paper, then this would: (a) substantially reduce the cost of developing new reinforced concrete products by reducing the amount of testing; (b) increase the accuracy of designing existing and novel reinforced concrete members and structures, bearing in mind that experimentally derived pseudo properties are only applicable within the range of the testing from which they were derived; and (c) reduce the cost and increase the accuracy of developing reinforced concrete design rules. This paper deals with the development of pseudo properties and behaviours directly through mechanics, as opposed to experimental testing, and their incorporation into member global simulations. It also addresses the need for a fundamental shift to displacement-based analyses as opposed to strain-based analyses.

Relevance: 100.00%

Abstract:

This paper tests empirically whether pension information derived from corporate pension accounting disclosures is priced in corporate bond spreads. The model is a hybrid of more traditional accounting-ratio-based models of credit risk and the structural models of bond spreads initiated by Merton (1974). The model is fitted to five years of data, from 2002 to 2006, for companies from the US and Europe. The paper finds that while unfunded pension liabilities are priced in the overall sample, they are not priced as aggressively as traditional leverage. Furthermore, an extended model shows that the pension–credit risk relation is most evident in the US and Germany, where unfunded pension liabilities are priced more aggressively than traditional forms of leverage. No pension–credit risk relation is found in the other countries sampled, notably the UK, the Netherlands and France.
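The structural component referenced above is the textbook Merton (1974) model, in which debt is priced as firm value minus a call option on the assets. The sketch below computes the implied credit spread of a zero-coupon bond; all parameter values are illustrative, and the comment about pension liabilities reflects the paper's idea (pensions as additional leverage), not its fitted specification.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_spread(V, F, r, sigma, T):
    """Credit spread of zero-coupon debt in the Merton (1974) model.

    V: firm asset value, F: face value of debt, r: risk-free rate,
    sigma: asset volatility, T: maturity in years.
    """
    d1 = (math.log(V / F) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    debt_value = V * norm_cdf(-d1) + F * math.exp(-r * T) * norm_cdf(d2)
    bond_yield = math.log(F / debt_value) / T
    return bond_yield - r

# Illustrative parameters; unfunded pension liabilities would act like an
# increase in the effective face value F, widening the spread.
spread = merton_spread(V=100.0, F=80.0, r=0.03, sigma=0.25, T=5.0)
```

Raising F (more leverage, including pension deficits treated as debt) widens the spread, which is the monotonicity the empirical tests rely on.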

Relevance: 100.00%

Abstract:

This paper introduces the discrete choice modelling paradigm of Random Regret Minimization (RRM) to the field of environmental and resource economics. The RRM approach has been developed very recently in the context of travel demand modelling and presents a tractable, regret-based alternative to the dominant choice-modelling paradigm based on Random Utility Maximization (RUM) theory. We highlight how RRM-based models provide closed-form, logit-type formulations for choice probabilities that allow for capturing semi-compensatory behaviour and choice-set composition effects while being equally parsimonious as their utilitarian counterparts. Using data from a stated choice experiment aimed at identifying valuations of characteristics of nature parks, we compare RRM-based and RUM-based models in terms of parameter estimates, goodness of fit, elasticities and the consequent policy implications.
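The closed-form, logit-type structure mentioned above can be sketched using the Chorus-style RRM regret function, where the regret of alternative i sums ln(1 + exp(β_m·(x_jm − x_im))) over competing alternatives j and attributes m, and choice probabilities are logit in negative regret. The attribute values and β coefficients below are illustrative assumptions, not estimates from the nature-park data.

```python
import math

def regret(i, alts, betas):
    """Total anticipated regret of alternative i versus all competitors."""
    r = 0.0
    for j, alt in enumerate(alts):
        if j == i:
            continue
        for b, x_j, x_i in zip(betas, alt, alts[i]):
            r += math.log(1.0 + math.exp(b * (x_j - x_i)))
    return r

def rrm_probabilities(alts, betas):
    """Logit-type choice probabilities in negative regret."""
    regrets = [regret(i, alts, betas) for i in range(len(alts))]
    weights = [math.exp(-r) for r in regrets]
    total = sum(weights)
    return [w / total for w in weights]

# Two hypothetical nature parks described by (scenery score, travel cost);
# positive beta for scenery, negative for cost.
probs = rrm_probabilities([(7.0, 20.0), (5.0, 10.0)], betas=[0.4, -0.1])
```

Because the regret term is convex in attribute differences, a loss on one attribute is not fully offset by an equal gain on another, which is the semi-compensatory behaviour the paper highlights.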

Relevance: 100.00%

Abstract:

We present new homology-based models of the glutamate binding site (in closed and open forms) of the NMDA receptor NR2B subunit, derived from X-ray structures of the water-soluble AMPA-sensitive glutamate receptor. The models were used for revealing binding modes of agonists and competitive antagonists, as well as for rationalizing known experimental facts concerning structure-activity relationships: (i) the switching between the agonist and antagonist modes of action upon lengthening of the chain between the distal acidic group and the amino acid moiety, (ii) the preference for a methyl group attached to the α-amino group of ligands, (iii) the preference for the D-configuration of agonists and antagonists, and (iv) the existence of "superacidic" agonists.

Relevance: 100.00%

Abstract:

Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and assessing the significance of model covariates.

Relevance: 100.00%

Abstract:

Quality Management and Managerialism in Healthcare provides a comprehensive and systematic international survey of various perspectives on healthcare quality management, together with some of their most pertinent critiques. Chapter one starts with a general discussion of the factors that drove the introduction of management paradigms into public sector and health management contexts in the mid to late 1980s. Chapter two explores the rise of risk awareness in medicine, which, prior to the 1980s, stood largely in isolation from the implementation of managerial performance targets. Chapter three investigates the widespread adoption of performance management and clinical governance frameworks during the 1980s and 1990s. Chapters four and five then examine systems-based models of patient safety and the evidence-based medicine movement as exemplars of managerial perspectives on healthcare quality. Chapter six discusses potential future avenues for the development of alternative perspectives on quality of care which emphasise workforce involvement. The book concludes by reviewing the factors which have underpinned the managerialist trajectory of healthcare management over the past decades and explores the potential impact of nascent technologies such as 'connected health' and 'telehealth' on future developments.

Relevance: 100.00%

Abstract:

Chili powder is a globally traded commodity which has been found to be adulterated with Sudan dyes from 2003 onwards. In this study, chili powders were adulterated with varying quantities of Sudan I dye (0.1-5%), and spectra were generated using near-infrared reflectance spectroscopy (NIRS) and Raman spectroscopy (on a spectrometer with a sample compartment modified as part of the study). Chemometrics were applied to the spectral data to produce quantitative and qualitative calibration models and prediction statistics. For the quantitative models, coefficients of determination (R²) were found to be 0.891-0.994, depending on which spectral data (NIRS/Raman) were processed, the mathematical algorithm used and the data pre-processing applied. The corresponding values for the root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP) were found to be 0.208-0.851% and 0.141-0.831% respectively, once again depending on the spectral data and the chemometric treatment applied. Indications are that the NIR spectroscopy based models are superior to those produced from Raman spectral data, based on a comparison of the chemometric parameters. The limit of detection (LOD), based on analysis of 20 blank chili powders against each calibration model, was 0.25% and 0.88% for the NIR and Raman data, respectively. In addition, adopting a qualitative approach with the spectral data and applying PCA or PLS-DA, it was possible to discriminate adulterated from non-adulterated chili powders.
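The figures of merit quoted above (R², RMSEC, RMSEP) are standard regression statistics computed between reference adulteration levels and model predictions. A minimal sketch follows; the reference levels and predicted values are hypothetical, chosen only to show how the statistics are defined.

```python
import math

def rmse(reference, predicted):
    """Root mean square error; called RMSEC on the calibration set and
    RMSEP on an independent prediction set."""
    n = len(reference)
    return math.sqrt(sum((p - r) ** 2 for r, p in zip(reference, predicted)) / n)

def r_squared(reference, predicted):
    """Coefficient of determination between reference and predicted values."""
    mean_ref = sum(reference) / len(reference)
    ss_res = sum((p - r) ** 2 for r, p in zip(reference, predicted))
    ss_tot = sum((r - mean_ref) ** 2 for r in reference)
    return 1.0 - ss_res / ss_tot

levels = [0.1, 0.5, 1.0, 2.5, 5.0]   # hypothetical reference Sudan I levels (%)
preds = [0.15, 0.45, 1.1, 2.4, 5.2]  # hypothetical calibration-model output (%)
error = rmse(levels, preds)
fit = r_squared(levels, preds)
```

Comparing these values across the NIRS and Raman calibrations is the basis on which the study judges one set of models superior.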

Relevance: 100.00%

Abstract:

This paper highlights the crucial role played by party-specific responsibility attributions in performance-based voting. Three models of electoral accountability, which make distinct assumptions regarding citizens' ability to attribute responsibility to distinct governing parties, are tested in the challenging Northern Ireland context, an exemplar case of multi-level, multi-party government in which expectations of performance-based voting are low. The paper demonstrates the operation of party-attribution-based electoral accountability, using data from the 2011 Northern Ireland Assembly Election Study. However, the findings are asymmetric: accountability operates in the Protestant/unionist bloc but not in the Catholic/nationalist bloc. This asymmetry may be explained by the absence of clear ethno-national ideological distinctions between the unionist parties (hence providing political space for performance-based accountability to operate) but the continued relevance in the nationalist bloc of ethno-national difference (which limits the scope for performance politics). The implications of the findings for our understanding of the role of party-specific responsibility attribution in performance-based models of voting, and for our evaluation of the quality of democracy in post-conflict consociational polities, are discussed.

Relevance: 100.00%

Abstract:

This Integration Insight provides a brief overview of the most popular modelling techniques used to analyse complex real-world problems, as well as some less popular but highly relevant techniques. The modelling methods are divided into three categories, each encompassing a number of methods, as follows: 1) Qualitative Aggregate Models (Soft Systems Methodology, Concept Maps and Mind Mapping, Scenario Planning, Causal (Loop) Diagrams); 2) Quantitative Aggregate Models (Function Fitting and Regression, Bayesian Nets, Systems of Differential Equations / Dynamical Systems, System Dynamics, Evolutionary Algorithms); and 3) Individual-Oriented Models (Cellular Automata, Microsimulation, Agent-Based Models, Discrete Event Simulation, Social Network Analysis). Each technique is broadly described with example uses, key attributes and reference material.

Relevance: 100.00%

Abstract:

The stability of consumer-resource systems can depend on the form of feeding interactions (i.e. functional responses). Size-based models predict interactions, and thus stability, based on consumer-resource size ratios. However, little is known about how interaction contexts (e.g. simple or complex habitats) might alter scaling relationships. Addressing this, we experimentally measured interactions between a large size range of aquatic predators (4-6400 mg, over 1347 feeding trials) and an invasive prey that transitions among habitats: from the water column (3D interactions) to simple and complex benthic substrates (2D interactions). Simple and complex substrates mediated successive reductions in capture rates, particularly around the unimodal optimum, and promoted prey population stability in model simulations. Many real consumer-resource systems transition between 2D and 3D interactions, and along complexity gradients. Thus, context-dependent scaling (CDS) of feeding interactions could represent an unrecognised aspect of food webs, and quantifying the extent of CDS might enhance predictive ecology.
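The ingredients described above, a capture rate that is unimodal in the predator:prey size ratio and is reduced by habitat complexity, can be sketched with a toy functional-response model. This is illustrative only: the log-Gaussian form, the optimum ratio, the habitat factor and the Holling type II response are assumptions standing in for the study's fitted relationships.

```python
import math

def attack_rate(mass_ratio, habitat_factor=1.0, peak_ratio=100.0, width=1.0, a_max=2.0):
    """Unimodal (log-Gaussian) scaling of attack rate with predator:prey
    mass ratio, scaled down by a habitat-complexity factor in (0, 1]."""
    log_dev = (math.log10(mass_ratio) - math.log10(peak_ratio)) / width
    return habitat_factor * a_max * math.exp(-log_dev ** 2)

def feeding_rate(prey_density, mass_ratio, habitat_factor=1.0, handling_time=0.1):
    """Holling type II functional response with the size- and
    habitat-dependent attack rate above."""
    a = attack_rate(mass_ratio, habitat_factor)
    return a * prey_density / (1.0 + a * handling_time * prey_density)

# Complex substrate (habitat_factor < 1) lowers feeding rates relative to
# open water at the same (optimal) size ratio.
open_water = feeding_rate(10.0, 100.0, habitat_factor=1.0)
complex_bed = feeding_rate(10.0, 100.0, habitat_factor=0.4)
```

Lowering the attack rate around the optimum flattens the functional response, which is the mechanism by which complex substrates promote prey population stability in simulations of this kind.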