81 results for Framework Model


Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was, within a sensitivity analysis framework, to determine whether additional model complexity gives a better capability to model the hydrology and nitrogen dynamics of a small Mediterranean forested catchment, or whether the additional parameters cause over-fitting. Three nitrogen models of varying hydrological complexity were considered. For each model, general sensitivity analysis (GSA) and Generalized Likelihood Uncertainty Estimation (GLUE) were applied, each based on 100,000 Monte Carlo simulations. The results highlighted the most complex structure as the most appropriate, providing the best representation of the non-linear patterns observed in the flow and streamwater nitrate concentrations between 1999 and 2002. Its 5% and 95% GLUE bounds, obtained using a multi-objective approach, provide the narrowest band for streamwater nitrogen, which suggests increased model robustness, though all models exhibit periods of both good and poor fit between simulated outcomes and observed data. The results confirm the importance of the riparian zone in controlling the short-term (daily) streamwater nitrogen dynamics in this catchment, but not the overall flux of nitrogen from the catchment. It was also shown that as the complexity of a hydrological model increases, over-parameterisation occurs; the converse is true for a water quality model, where additional process representation leads to additional acceptable model simulations. Water quality data help constrain the hydrological representation in process-based models, and increased complexity was justifiable for modelling river-system hydrochemistry.
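
The GSA/GLUE procedure described above can be sketched on a toy model. Everything below is invented for illustration: the exponential stand-in for the catchment model, the parameter ranges, the behavioural threshold, and the (much smaller) Monte Carlo sample size.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "catchment model": y = a * exp(-b * t), with synthetic observations.
t = np.linspace(0.0, 10.0, 50)
obs = 2.0 * np.exp(-0.3 * t) + rng.normal(0.0, 0.05, t.size)

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling of the parameter space (100,000 in the study; fewer here).
n = 5000
a = rng.uniform(0.5, 4.0, n)
b = rng.uniform(0.05, 1.0, n)
sims = a[:, None] * np.exp(-b[:, None] * t[None, :])
eff = np.array([nash_sutcliffe(s, obs) for s in sims])

# Behavioural parameter sets: simulations above a likelihood threshold.
behavioural = eff > 0.7
weights = eff[behavioural] / eff[behavioural].sum()

def weighted_quantile(values, w, q):
    """Quantile of `values` under weights `w` (likelihood-weighted GLUE bound)."""
    order = np.argsort(values)
    cum = np.cumsum(w[order])
    return values[order][np.searchsorted(cum, q)]

# 5% / 95% GLUE uncertainty bounds at each time step.
lower = np.array([weighted_quantile(sims[behavioural, i], weights, 0.05)
                  for i in range(t.size)])
upper = np.array([weighted_quantile(sims[behavioural, i], weights, 0.95)
                  for i in range(t.size)])
```

The width of `upper - lower` is the band the study compares across model structures: a narrower band from a more complex model is the signature of justified complexity rather than over-fitting.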

Relevance:

30.00%

Publisher:

Abstract:

As in any technology system, analysis and design issues are among the fundamental challenges in persuasive technology. Currently, the Persuasive Systems Development (PSD) framework is considered the most comprehensive framework for the design and evaluation of persuasive systems. However, the framework provides limited detail to guide the selection of appropriate techniques as users, or their use of a system, vary over time. In light of this, we propose the 3D-RAB model, intended for analysing and implementing behavioural change in persuasive technology. The 3D-RAB model represents the three-dimensional relationships between attitude towards behaviour, attitude towards change (or maintaining a change), and current behaviour, and distinguishes variable levels in a user’s cognitive state. As such, it provides a framework which could be used to select appropriate techniques for persuasive technology.
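
One way to picture the three dimensions is to treat each as binary, giving eight cognitive states that a designer could map to candidate technique classes. The sketch below is a hypothetical illustration of that idea only; the state-to-technique mapping is invented, not taken from the 3D-RAB work.

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class CognitiveState:
    attitude_to_behaviour: bool   # positive attitude towards the target behaviour?
    attitude_to_change: bool      # positive attitude towards changing / maintaining?
    current_behaviour: bool       # already performing the target behaviour?

def suggest_technique(state: CognitiveState) -> str:
    """Invented mapping from a cognitive state to a persuasion-technique class."""
    if state.current_behaviour and state.attitude_to_change:
        return "reinforcement"          # maintain the change
    if state.current_behaviour:
        return "self-monitoring"        # behaviour present, commitment fragile
    if state.attitude_to_behaviour:
        return "reduction/tunneling"    # willing but not acting: lower barriers
    return "consciousness-raising"      # attitudes must shift first

# Enumerate all eight states and look up a technique for each.
states = [CognitiveState(*bits) for bits in product([True, False], repeat=3)]
plan = {s: suggest_technique(s) for s in states}
```

The point of the sketch is structural: because the model distinguishes states rather than treating users uniformly, technique selection becomes a lookup over the user's current position in the three-dimensional space.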

Relevance:

30.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill, nor is there any agreed protocol for estimating it. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to uninitialized climate change projections? and (2) Is the prediction model’s ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts, comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on the skill of CMIP5 decadal hindcasts is not the aim of this paper; the results presented are only illustrative of the framework, which would enable such studies.
However, broad conclusions that are beginning to emerge from the CMIP5 results include: (1) most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems; it can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how best to approach forecast uncertainty.
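
The first, deterministic comparison can be illustrated with a mean squared skill score (MSSS) on synthetic hindcasts. The toy forced trend, the variability amplitudes, and the 0.6 "partial knowledge of the initial state" factor below are illustrative assumptions, not CMIP5 numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy annual-mean series: observations = forced trend + internal variability.
years = np.arange(1960, 2010)
forced = 0.02 * (years - years[0])           # forced signal, common to both forecasts
internal = rng.normal(0.0, 0.1, years.size)  # internal variability
obs = forced + internal

# Uninitialized projection: forcing only, its own (unrelated) variability.
uninit = forced + rng.normal(0.0, 0.1, years.size)
# Initialized hindcast: forcing plus partial knowledge of the internal state.
init = forced + 0.6 * internal + rng.normal(0.0, 0.08, years.size)

def msss(pred, ref, obs):
    """Mean squared skill score of `pred` relative to reference forecast `ref`."""
    mse_pred = np.mean((pred - obs) ** 2)
    mse_ref = np.mean((ref - obs) ** 2)
    return 1.0 - mse_pred / mse_ref

score = msss(init, uninit, obs)  # positive when initialization adds skill
```

A positive `score` corresponds to question (1) being answered in the affirmative for this grid point or region; applied per region, the same metric can also expose the areas where initialization degrades the forecast.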

Relevance:

30.00%

Publisher:

Abstract:

Sampling strategies for monitoring the status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, and so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information into the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys, whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance, where abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends, and their variances, are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than with a strategy in which all of the sample is retained, or all is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
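
The sample-selection step of the mixed strategy can be sketched as follows. The abundance model, its error, the unit counts, and the 50/50 split between retained and probability-proportional units are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

n_units, n_sample = 200, 40
abundance = rng.gamma(2.0, 5.0, n_units)                   # true abundance per unit
predicted = abundance * rng.lognormal(0.0, 0.3, n_units)   # model prediction with error

# Previous survey sample (here itself a simple random sample of units).
prev = rng.choice(n_units, n_sample, replace=False)

# Mixed strategy, phase 1: retain part of the old sample by simple random sampling.
retained = rng.choice(prev, n_sample // 2, replace=False)

# Phase 2: select the rest with probability proportional to predicted abundance,
# from the units not already retained.
rest_pool = np.setdiff1d(np.arange(n_units), retained)
p = predicted[rest_pool] / predicted[rest_pool].sum()
new = rng.choice(rest_pool, n_sample - retained.size, replace=False, p=p)

sample = np.concatenate([retained, new])
```

The retained subsample preserves comparability with the previous survey (supporting design-based trend estimation), while the probability-proportional subsample concentrates effort where the model expects animals.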

Relevance:

30.00%

Publisher:

Abstract:

Undeniably, anticipation plays a crucial role in cognition. By what means, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011), and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but with no reference to some of the historical debates that shaped cognitive science, related to the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or accounting for the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).

Relevance:

30.00%

Publisher:

Abstract:

Cross-bred cow adoption is an important and potent policy variable precipitating subsistence household entry into emerging milk markets. This paper focuses on the problem of designing policies that encourage and sustain milk-market expansion among a sample of subsistence households in the Ethiopian highlands. In this context it is desirable to measure households’ ‘proximity’ to market in terms of their level of deficiency in essential inputs. This problem is compounded by four factors: first, cross-bred cow numbers (count data) enter as an important, endogenous decision by the household; second, there is no multivariate generalization of the Poisson regression model; third, the milk sales data are censored (sales from non-participating households are, essentially, censored at zero); and fourth, an important simultaneity exists between the decision to adopt a cross-bred cow, the decision about how much milk to produce, the decision about how much milk to consume, and the decision to market the milk that is produced but not consumed internally by the household. Routine application of Gibbs sampling and data augmentation overcomes these problems in a relatively straightforward manner. We model the count data from two sites close to Addis Ababa in a latent, categorical-variable setting with known bin boundaries. The single-equation model is then extended to a multivariate system that accommodates the covariance between the crossbred-cow adoption, milk-output, and milk-sales equations. The latent-variable procedure proves tractable in extension to the multivariate setting and provides important information for policy formation in emerging-market settings.
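
The data-augmentation idea can be sketched for the simplest ingredient above: the zero-censored sales equation. Below is a toy Tobit-style Gibbs sampler with an invented covariate and the error variance fixed at one; the paper's multivariate count-data machinery is not attempted here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy censored sales data: latent net sales observed only when positive.
n = 300
x = rng.normal(2.0, 1.0, n)                       # stand-in covariate (e.g. herd size)
latent = 1.5 * x - 2.0 + rng.normal(0.0, 1.0, n)
sales = np.maximum(latent, 0.0)                   # non-participants censored at zero
censored = sales == 0.0

def draw_truncated(mu, rng):
    """Rejection draw from N(mu, 1) truncated to (-inf, 0]."""
    while True:
        d = rng.normal(mu, 1.0)
        if d <= 0.0:
            return d

X = np.column_stack([np.ones(n), x])
V = np.linalg.inv(X.T @ X)
beta = np.zeros(2)
z = sales.copy()
draws = []
for _ in range(100):
    # 1. Data augmentation: impute latent sales for the censored households.
    mu_c = X[censored] @ beta
    z[censored] = [draw_truncated(m, rng) for m in mu_c]
    # 2. Draw coefficients given the completed data (flat prior, variance fixed at 1).
    beta = rng.multivariate_normal(V @ (X.T @ z), V)
    draws.append(beta)

beta_mean = np.mean(draws[50:], axis=0)  # posterior mean after burn-in
```

Once the censored observations are replaced by latent draws, the conditional posterior for the coefficients is an ordinary Gaussian regression update, which is what makes the Gibbs approach "relatively straightforward".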

Relevance:

30.00%

Publisher:

Abstract:

A Lagrangian model of photochemistry and mixing is described (CiTTyCAT, stemming from the Cambridge Tropospheric Trajectory model of Chemistry And Transport), which is suitable for transport and chemistry studies throughout the troposphere. Over the last five years, the model has been developed in parallel at several different institutions and here those developments have been incorporated into one "community" model and documented for the first time. The key photochemical developments include a new scheme for biogenic volatile organic compounds and updated emissions schemes. The key physical development is to evolve composition following an ensemble of trajectories within neighbouring air-masses, including a simple scheme for mixing between them via an evolving "background profile", both within the boundary layer and free troposphere. The model runs along trajectories pre-calculated using winds and temperature from meteorological analyses. In addition, boundary layer height and precipitation rates, output from the analysis model, are interpolated to trajectory points and used as inputs to the mixing and wet deposition schemes. The model is most suitable in regimes where the effects of small-scale turbulent mixing are slow relative to advection by the resolved winds, so that coherent air-masses form with distinct composition and strong gradients between them. Such air-masses can persist for many days while stretching, folding and thinning. Lagrangian models offer a useful framework for picking apart the processes of air-mass evolution over inter-continental distances, without being hindered by the numerical diffusion inherent to global Eulerian models. The model, including different box and trajectory modes, is described and some output for each of the modes is presented for evaluation. The model is available for download from a Subversion-controlled repository by contacting the corresponding authors.
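
The "background profile" mixing idea amounts to relaxing the air-mass composition towards a background value on a mixing timescale, alongside the chemical tendencies. The sketch below integrates that tendency with forward Euler; all rates, mixing ratios, and timescales are invented for illustration and are not taken from CiTTyCAT.

```python
# Sketch of a Lagrangian box with chemistry plus relaxation to a background
# profile (the background is held fixed here; in the model it evolves too).
dt = 360.0                       # time step [s]
n_steps = 240                    # 24 hours
k_mix = 1.0 / (6 * 3600.0)       # mixing rate: 6-hour timescale (assumed)
production = 1.0e-12             # tracer source [mixing ratio / s] (assumed)
loss_rate = 1.0 / (48 * 3600.0)  # first-order chemical loss (assumed)

c = 40.0e-9                      # tracer mixing ratio in the air mass
c_bg = 30.0e-9                   # background-profile value at this level
history = []
for _ in range(n_steps):
    # forward-Euler step: production - chemical loss - relaxation to background
    dc = production - loss_rate * c - k_mix * (c - c_bg)
    c += dc * dt
    history.append(c)
```

When `k_mix` is small relative to the chemical tendencies, the air mass keeps its distinct composition; as `k_mix` grows, the tracer is pulled towards the background, which is the regime distinction drawn in the abstract.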

Relevance:

30.00%

Publisher:

Abstract:

It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where the blood oxygen-level-dependent signals are recorded, the understanding and accurate modeling of the hemodynamic relationship between CBF and CBV becomes increasingly important. This study presents an empirical and data-based modeling framework for model identification from CBF and CBV experimental data. It is shown that the relationship between the changes in CBF and CBV can be described using a parsimonious autoregressive with exogenous input (ARX) model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is thus introduced and extended to solve such an errors-in-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, a combination of RTLS with a filtering method can lead to a parsimonious but very effective model that can characterize the relationship between the changes in CBF and CBV.
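
The contrast between LS and TLS on errors-in-variables data can be sketched with a toy one-parameter version of the problem. The 0.38 slope, the noise levels, and the simple bias-corrected "regularised" estimator below are illustrative assumptions only, not the RTLS algorithm of the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy errors-in-variables data: both the regressor ("CBF change") and the
# response ("CBV change") are observed with noise. True slope is 0.38.
n = 200
f_true = rng.normal(0.0, 1.0, n)
f = f_true + rng.normal(0.0, 0.2, n)           # noisy regressor
v = 0.38 * f_true + rng.normal(0.0, 0.2, n)    # noisy response

# Ordinary least squares: attenuated (biased towards zero) by regressor noise.
a_ls = (f @ v) / (f @ f)

# Classical total least squares via the SVD of the stacked data matrix:
# the fitted direction is the right singular vector of the smallest singular value.
_, _, Vt = np.linalg.svd(np.column_stack([f, v]), full_matrices=False)
a_tls = -Vt[-1, 0] / Vt[-1, 1]

# A crude regularised variant (illustrative stand-in for RTLS): correct the
# denominator for the assumed-known regressor noise variance, plus a small
# ridge term for numerical stability.
lam = 0.1
a_rtls = (f @ v) / (f @ f - n * 0.2**2 + lam)
```

TLS removes the attenuation bias but is itself sensitive to noise; regularisation trades a little bias back for stability, which is the motivation the abstract gives for RTLS.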

Relevance:

30.00%

Publisher:

Abstract:

The pig is a single-stomached omnivorous mammal and is an important model of human disease and nutrition. As such, it is necessary to establish a metabolic framework from which pathology-based variation can be compared. Here, a combination of one- and two-dimensional 1H and 13C nuclear magnetic resonance spectroscopy (NMR) and high-resolution magic angle spinning (HR-MAS) NMR was used to provide a systems overview of porcine metabolism via characterisation of the urine, serum, liver and kidney metabolomes. The metabolites observed in each of these biological compartments were found to be qualitatively comparable to the metabolic signature of the same biological matrices in humans and rodents. The data were modelled using a combination of principal components analysis and Venn diagram mapping. Urine represented the most metabolically distinct biological compartment studied, with a relatively greater number of NMR-detectable metabolites present, many of which are implicated in gut-microbial co-metabolic processes. The major interspecies differences observed were in the phase II conjugation of extra-genomic metabolites; the pig was observed to conjugate p-cresol, a gut microbial metabolite of tyrosine, with glucuronide rather than sulfate as seen in man. These observations are important to note when considering the translatability of experimental data derived from porcine models.
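
The principal components analysis step can be sketched on invented "metabolite intensity" matrices for two compartments; the NMR-specific processing is omitted, and the distinct mean profile given to one compartment is an assumption made purely so the separation is visible.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy data: samples x metabolite intensities for two biological compartments.
n_per, n_feat = 20, 15
urine = rng.normal(0.0, 1.0, (n_per, n_feat)) + np.linspace(2.0, 0.0, n_feat)
serum = rng.normal(0.0, 1.0, (n_per, n_feat))

X = np.vstack([urine, serum])
Xc = X - X.mean(axis=0)                 # mean-centre before PCA

# PCA via SVD of the centred matrix; scores are the sample coordinates.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                          # principal-component scores
explained = s**2 / np.sum(s**2)         # variance explained per component

# The two compartments should separate along the leading component.
sep = abs(scores[:n_per, 0].mean() - scores[n_per:, 0].mean())
```

A scores plot of the first two components is the usual way such compartment-level distinctness (urine standing apart from serum, liver and kidney) is displayed.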

Relevance:

30.00%

Publisher:

Abstract:

In June 2009 the Sarychev volcano located in the Kuril Islands to the northeast of Japan erupted explosively, injecting ash and an estimated 1.2 ± 0.2 Tg of sulfur dioxide into the upper troposphere and lower stratosphere, making it arguably one of the 10 largest stratospheric injections in the last 50 years. During the period immediately after the eruption, we show that the sulfur dioxide (SO2) cloud was clearly detected by retrievals developed for the Infrared Atmospheric Sounding Interferometer (IASI) satellite instrument and that the resultant stratospheric sulfate aerosol was detected by the Optical Spectrograph and Infrared Imaging System (OSIRIS) limb sounder and CALIPSO lidar. Additional surface‐based instrumentation allows assessment of the impact of the eruption on the stratospheric aerosol optical depth. We use a nudged version of the HadGEM2 climate model to investigate how well this state‐of‐the‐science climate model can replicate the distributions of SO2 and sulfate aerosol. The model simulations and OSIRIS measurements suggest that in the Northern Hemisphere the stratospheric aerosol optical depth was enhanced by around a factor of 3 (0.01 at 550 nm), with resultant impacts upon the radiation budget. The simulations indicate that, in the Northern Hemisphere for July 2009, the magnitude of the mean radiative impact from the volcanic aerosols is more than 60% of the direct radiative forcing of all anthropogenic aerosols put together. While the cooling induced by the eruption will likely not be detectable in the observational record, the combination of modeling and measurements would provide an ideal framework for simulating future larger volcanic eruptions.

Relevance:

30.00%

Publisher:

Abstract:

The role of air–sea coupling in the simulation of the Madden–Julian oscillation (MJO) is explored using two configurations of the Hadley Centre atmospheric model (AGCM), GA3.0, which differ only in F, a parameter controlling convective entrainment and detrainment. Increasing F considerably improves deficient MJO-like variability in the Indian and Pacific Oceans, but variability in and propagation through the Maritime Continent remains weak. By coupling GA3.0 in the tropical Indo-Pacific to a boundary-layer ocean model, KPP, and employing climatological temperature corrections, well resolved air–sea interactions are simulated with limited alterations to the mean state. At default F, when GA3.0 has a poor MJO, coupling produces a stronger MJO with some eastward propagation, although both aspects remain deficient. These results agree with previous sensitivity studies using AGCMs with poor variability. At higher F, coupling does not affect MJO amplitude but enhances propagation through the Maritime Continent, resulting in an MJO that resembles observations. A sensitivity experiment with coupling in only the Indian Ocean reverses these improvements, suggesting coupling in the Maritime Continent and West Pacific is critical for propagation. We hypothesise that for AGCMs with a poor MJO, coupling provides a “crutch” to artificially augment MJO-like activity through high-frequency SST anomalies. In related experiments, we employ the KPP framework to analyse the impact of air–sea interactions in the fully coupled GA3.0, which at default F shows a similar MJO to uncoupled GA3.0. This is due to compensating effects: an improvement from coupling and a degradation from mean-state errors. Future studies on the role of coupling should carefully separate these effects.

Relevance:

30.00%

Publisher:

Abstract:

This paper considers supply dynamics in the context of the Irish residential market. The analysis, in a multiple error-correction framework, reveals that although developers did respond to disequilibrium in supply, the rate of adjustment was relatively slow. In contrast, however, disequilibrium in demand did not impact upon supply, suggesting that inelastic supply conditions could explain the prolonged nature of the boom in the Irish market. Increased elasticity in the later stages of the boom may have been a contributory factor in the extent of the house price falls observed in recent years.
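
An error-correction regression of the kind used above can be sketched on simulated data. The long-run coefficient (0.8), the adjustment speed (-0.2), and the short-run response (0.4) are invented for illustration, not estimates from the Irish market.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated market: supply adjusts slowly towards a long-run relation
# supply* = 0.8 * price, correcting 20% of the gap each period.
T = 300
price = np.cumsum(rng.normal(0.0, 1.0, T))   # random-walk price level
supply = np.empty(T)
supply[0] = 0.8 * price[0]
for t in range(1, T):
    disequilibrium = supply[t - 1] - 0.8 * price[t - 1]
    supply[t] = (supply[t - 1]
                 - 0.2 * disequilibrium                 # error correction
                 + 0.4 * (price[t] - price[t - 1])      # short-run response
                 + rng.normal(0.0, 0.2))

# Error-correction regression by OLS:
#   d(supply)_t = alpha * disequilibrium_{t-1} + beta * d(price)_t + e_t
d_supply = np.diff(supply)
d_price = np.diff(price)
ecm_lag = supply[:-1] - 0.8 * price[:-1]
X = np.column_stack([ecm_lag, d_price])
(alpha, beta), *_ = np.linalg.lstsq(X, d_supply, rcond=None)
```

A small (in magnitude) estimate of `alpha` is the signature of the slow supply adjustment the paper reports: developers respond to disequilibrium, but only a fraction of the gap closes each period.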

Relevance:

30.00%

Publisher:

Abstract:

The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand-flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile.
Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
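
The demand-flattening finding can be illustrated with a minimal toy: households with a shiftable load, compared under a single shared price signal (everyone moves load to the same cheap hour, creating a spike) versus an aggregator that fills the current demand valley. The profiles, load sizes, and scheduling rule below are all invented, not the CASCADE implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

hours, n_households = 24, 50
flexible = 2.0  # kWh of shiftable load per household per day (illustrative)

# Fixed base profiles with a diurnal shape plus household-level noise.
shape = 1.0 + 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))
base = shape + rng.normal(0.0, 0.05, (n_households, hours))

# Naive shared price signal: every household moves its flexible load to the
# same cheapest hour, producing a characteristic demand spike (instability).
agg_naive = base.sum(axis=0)
cheapest = int(np.argmin(agg_naive))
agg_naive[cheapest] += flexible * n_households

# Mediating aggregator: schedules each household's flexible load into whichever
# hour currently has the lowest aggregate demand, filling valleys one by one.
agg_flat = base.sum(axis=0)
for _ in range(n_households):
    agg_flat[int(np.argmin(agg_flat))] += flexible

spread_naive = agg_naive.max() - agg_naive.min()
spread_flat = agg_flat.max() - agg_flat.min()
```

The aggregator serves the same total energy but with a much smaller peak-to-trough spread, which is the stabilising role the abstract describes for the mediating agent.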

Relevance:

30.00%

Publisher:

Abstract:

The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme–phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading, relative to word reading, resulting in a near perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.

Relevance:

30.00%

Publisher:

Abstract:

This article builds on advances in social ontology to develop a new understanding of how mainstream economic modelling affects reality. We propose a new framework for analysing and describing how models intervene in the social sphere. This framework allows us to identify and articulate three key epistemic features of models as interventions: specificity, portability and formal precision. The second part of the article uses our framework to demonstrate how specificity, portability and formal precision explain the use of moral hazard models in a variety of different policy contexts, including worker compensation schemes, bank regulation and the euro-sovereign debt crisis.