853 results for Probabilistic decision process model
Abstract:
This paper explores the role of local government in urban regeneration in England. The first part describes local-central government relations during recent decades. It concludes that 'actually occurring' regeneration fuses top-down and bottom-up priorities and preferences, as well as path dependencies created by past decisions and local relations. The second part illustrates this contention by examining the regeneration of inner-city Salford over a 25-year period. It describes Salford City Council's approach in achieving the redevelopment of the former Salford Docks and how this created the confidence for the council to embark on further regeneration projects. Yet the top-down decision-making model has failed to satisfy local expectations, creating apathy which threatens the Labour government's desire for active citizens in regeneration projects.
Abstract:
The skill of numerical Lagrangian drifter trajectories in three numerical models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov–Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can be performed either on crossing positions of one-dimensional sections, to test model performance at specific locations, or on the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision of model skill, a confidence level that can be used as a measure of goodness-of-fit of the model, a test statistic that can be used to determine the sensitivity of the confidence level, and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
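As a rough illustration of the kind of comparison described above, the sketch below applies the two-sample Kolmogorov–Smirnov test to crossing positions of a single one-dimensional section; the arrays, sample sizes and significance level are placeholders, not values from the study.

```python
# Illustrative two-sample Kolmogorov-Smirnov skill check on the latitudes at
# which observed drifters and simulated trajectories cross one section.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
obs_crossings = rng.normal(loc=-34.0, scale=1.0, size=200)    # placeholder observed crossings
model_crossings = rng.normal(loc=-34.3, scale=1.2, size=500)  # placeholder modelled crossings

statistic, p_value = ks_2samp(obs_crossings, model_crossings)

alpha = 0.05                 # significance level for the binary skill decision
has_skill = p_value > alpha  # distributions not distinguishable at this level
print(f"D = {statistic:.3f}, p = {p_value:.3f}, skill: {has_skill}")
```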
Abstract:
This paper assesses the impact of the 'decoupling' reform of the Common Agricultural Policy on the labour allocation decisions of Irish farmers. The agricultural household decision-making model provides the conceptual and theoretical framework to examine the interaction between government subsidies and farmers' time allocation decisions. The relationship postulated is that 'decoupling' of agricultural support from production would probably result in a decline in the return to farm labour but it would also lead to an increase in household wealth. The effect of these factors on how farmers allocate their time is tested empirically using labour participation and labour supply models. The models developed are sufficiently general for application elsewhere. The main findings for the Irish situation are that the decoupling of direct payments is likely to increase the probability of farmers participating in the off-farm employment market and that the amount of time allocated to off-farm work will increase.
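A minimal sketch of a labour participation model of the kind referred to above: a probit for off-farm participation fitted with statsmodels on synthetic data. The variable names, coefficients and data are hypothetical, not estimates from the Irish study.

```python
# Illustrative probit participation model on synthetic data (variables and
# effect sizes are hypothetical, not estimates from the paper).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
payment = rng.gamma(shape=2.0, scale=5.0, size=n)   # decoupled payment (synthetic)
wage = rng.normal(loc=12.0, scale=3.0, size=n)      # off-farm wage offer (synthetic)
latent = -2.0 + 0.03 * payment + 0.10 * wage + rng.normal(size=n)
participates = (latent > 0).astype(int)             # 1 = works off-farm

X = sm.add_constant(np.column_stack([payment, wage]))
probit = sm.Probit(participates, X).fit(disp=0)
print(probit.params)          # constant, payment effect, wage effect
```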
Abstract:
The next couple of years will see the need for replacement of a large amount of life-expired switchgear on the UK 11 kV distribution system. Latest technology and alternative equipment have made the choice of replacement a complex task. The authors present an expert system as an aid to the decision process for the design of the 11 kV power distribution network.
Abstract:
In most commercially available predictive control packages there is a separation between economic optimisation and predictive control, although both algorithms may be part of the same software system. In this article, this separated approach is compared with two alternative approaches in which the economic objectives are included directly in the predictive control algorithm. Simulations are carried out using the Tennessee Eastman process model.
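The sketch below illustrates, in generic form, the difference between the two set-ups: a quadratic tracking objective on its own versus the same objective with an economic term added directly to the predictive control cost. The scalar plant, price and weights are invented for illustration and are unrelated to the Tennessee Eastman model.

```python
# Generic illustration: adding an economic term directly to an MPC objective
# for a scalar plant x[k+1] = a*x[k] + b*u[k].  All numbers are illustrative.
import numpy as np
from scipy.optimize import minimize

a, b = 0.9, 0.5           # plant parameters (assumed)
x0, x_ref = 1.0, 0.0      # initial state and setpoint
N = 20                    # prediction horizon
price = 0.2               # economic cost per unit of input (assumed)

def predict(u):
    x, traj = x0, []
    for uk in u:
        x = a * x + b * uk
        traj.append(x)
    return np.array(traj)

def cost(u, economic):
    x = predict(u)
    tracking = np.sum((x - x_ref) ** 2) + 0.1 * np.sum(u ** 2)
    return tracking + (price * np.sum(u) if economic else 0.0)

u_plain = minimize(cost, np.zeros(N), args=(False,)).x
u_econ = minimize(cost, np.zeros(N), args=(True,)).x
print(u_plain[0], u_econ[0])  # first control move without / with the economic term
```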
Abstract:
Current methods for estimating vegetation parameters are generally sub-optimal in the way they exploit information and do not generally consider uncertainties. We look forward to a future where operational data assimilation schemes improve estimates by tracking land surface processes and exploiting multiple types of observations. Data assimilation schemes seek to combine observations and models in a statistically optimal way, taking into account uncertainty in both, but have not yet been much exploited in this area. The EO-LDAS scheme and prototype, developed under ESA funding, is designed to exploit the anticipated wealth of data that will be available under GMES missions, such as the Sentinel family of satellites, to provide improved mapping of land surface biophysical parameters. This paper describes the EO-LDAS implementation and explores some of its core functionality. EO-LDAS is a weak constraint variational data assimilation system. The prototype provides a mechanism for constraint based on a prior estimate of the state vector, a linear dynamic model, and Earth Observation data (top-of-canopy reflectance here). The observation operator is a non-linear optical radiative transfer model for a vegetation canopy with a soil lower boundary, operating over the range 400 to 2500 nm. Adjoint codes for all model and operator components are provided in the prototype by automatic differentiation of the computer codes. In this paper, EO-LDAS is applied to the problem of daily estimation of six of the parameters controlling the radiative transfer operator over the course of a year (> 2000 state vector elements). Zero- and first-order process model constraints are implemented and explored as the dynamic model. The assimilation estimates all state vector elements simultaneously. This is performed in the context of a typical Sentinel-2 MSI operating scenario, using synthetic MSI observations simulated with the observation operator, with uncertainties typical of those achieved by optical sensors assumed for the data. The experiments consider a baseline state vector estimation case where dynamic constraints are applied, and assess the impact of dynamic constraints on the a posteriori uncertainties. The results demonstrate that reductions in uncertainty by a factor of up to two might be obtained by applying the sorts of dynamic constraints used here. The hyperparameters (dynamic model uncertainty) required to control the assimilation are estimated by a cross-validation exercise. The result of the assimilation is seen to be robust to missing observations with quite large data gaps.
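To make the structure of such a weak-constraint variational system concrete, the sketch below minimises a cost with a prior term, a first-order (random-walk) dynamic-model term and an observation term. A scalar linear operator stands in for the radiative transfer model, and all values are synthetic placeholders rather than EO-LDAS quantities.

```python
# Minimal weak-constraint variational cost: prior term + dynamic-model term
# + observation term, minimised over the full state vector at once.
import numpy as np
from scipy.optimize import minimize

T = 10                             # number of time steps (illustrative)
x_prior = np.zeros(T)              # prior estimate of the state vector
y_obs = np.linspace(0.2, 0.6, T)   # synthetic "observations"
H = 1.0                            # placeholder linear observation operator
sig_b, sig_q, sig_r = 1.0, 0.05, 0.1   # prior, model and observation std devs

def cost(x):
    prior = 0.5 * np.sum(((x - x_prior) / sig_b) ** 2)
    # first-order (random-walk) dynamic model applied as a weak constraint
    model = 0.5 * np.sum(((x[1:] - x[:-1]) / sig_q) ** 2)
    obs = 0.5 * np.sum(((y_obs - H * x) / sig_r) ** 2)
    return prior + model + obs

x_hat = minimize(cost, x_prior).x
print(x_hat)
```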
Abstract:
An analysis of diabatic heating and moistening processes from 12-36 hour lead time forecasts from 12 global circulation models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12-36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding being too close to the initial spin-up of the models as they adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and the YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, the moistening and heating profiles have large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behaviour shows strong intermittency in rainfall in some models and differences between models in the relationship between precipitation and dynamics. The wealth of model output archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process model experiments and the priorities for field experiments and future observing systems.
Abstract:
Understanding farmer behaviour is needed for local agricultural systems to produce food sustainably while facing multiple pressures. We synthesize existing literature to identify three fundamental questions that correspond to three distinct areas of knowledge necessary to understand farmer behaviour: 1) decision-making model; 2) cross-scale and cross-level pressures; and 3) temporal dynamics. We use this framework to compare five interdisciplinary case studies of agricultural systems in distinct geographical contexts across the globe. We find that these three areas of knowledge are important to understanding farmer behaviour, and can be used to guide the interdisciplinary design and interpretation of studies in the future. Most importantly, we find that these three areas need to be addressed simultaneously in order to understand farmer behaviour. We also identify three methodological challenges hindering this understanding: the suitability of theoretical frameworks, the trade-offs among methods and the limited timeframe of typical research projects. We propose that a triangulation research strategy that makes use of mixed methods, or collaborations between researchers across mixed disciplines, can be used to successfully address all three areas simultaneously and show how this has been achieved in the case studies. The framework facilitates interdisciplinary research on farmer behaviour by opening up spaces of structured dialogue on assumptions, research questions and methods employed in investigation.
Abstract:
Background: Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Results: Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. Conclusions: We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group.
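As a sketch of what "closely follow a Poisson process model" can mean operationally: for a homogeneous Poisson process, the waiting times between events are exponentially distributed, which can be checked as below. The event dates here are placeholders, not the inferred sound-change timings.

```python
# Check whether inter-event waiting times look exponential, as expected for a
# homogeneous Poisson process.  Event dates are illustrative placeholders.
import numpy as np
from scipy.stats import expon, kstest

event_times = np.sort(np.array([120.0, 340.0, 410.0, 780.0, 905.0, 1130.0, 1500.0]))
waits = np.diff(event_times)
rate = len(waits) / waits.sum()          # rate estimate from the inter-event times

stat, p = kstest(waits, expon(scale=1.0 / rate).cdf)
print(f"KS statistic {stat:.3f}, p-value {p:.3f}")
```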
Abstract:
Research on invention has focused on business invention, and little work has been conducted on the process and capabilities required of the individual inventor or the capabilities required for a device to be considered an invention. This paper synthesises the results of an empirical survey of ten inventor case studies with current research on invention and recent capability affordance research to develop an integrated capability process model of human capabilities for invention and specific capabilities of an invented device. We identify eight necessary human effectivities required for individual invention capability and six key functional activities using these effectivities to deliver the functional capability of invention. We also identified key differences between invention and general problem-solving processes. Results suggest that inventive step capability relies on a unique application of principles relating to a new combination of an affordance chain with a new mechanism and/or space-time (affordance) path, representing the novel way the device works, in conjunction with defined critical affordance operating factors that are the subject of the patent claims.
Abstract:
We utilized an ecosystem process model (SIPNET, simplified photosynthesis and evapotranspiration model) to estimate carbon fluxes of gross primary productivity and total ecosystem respiration of a high-elevation coniferous forest. The data assimilation routine incorporated aggregated twice-daily measurements of the net ecosystem exchange of CO2 (NEE) and satellite-based reflectance measurements of the fraction of absorbed photosynthetically active radiation (fAPAR) on an eight-day timescale. From these data we conducted a data assimilation experiment with fifteen different combinations of available data using twice-daily NEE, aggregated annual NEE, eight-day fAPAR, and average annual fAPAR. Model parameters were conditioned on three years of NEE and fAPAR data and results were evaluated to determine the information content from the different combinations of data streams. Across the data assimilation experiments conducted, model selection metrics such as the Bayesian Information Criterion and the Deviance Information Criterion reached their minimum values when assimilating average annual fAPAR and twice-daily NEE data. Application of wavelet coherence analyses showed higher correlations between measured and modeled fAPAR on longer timescales, ranging from 9 to 12 months. There were strong correlations between measured and modeled NEE (coefficient of determination, R2 = 0.86), but correlations between measured and modeled eight-day fAPAR were quite poor (R2 = −0.94). We conclude that the determination of fAPAR on an eight-day timescale would improve with consideration of radiative transfer through the plant canopy. Modeled fluxes when assimilating average annual fAPAR and annual NEE were comparable to the corresponding results when assimilating twice-daily NEE, albeit with greater uncertainty. Our results support the conclusion that, for this coniferous forest, twice-daily NEE data are a critical measurement stream for the data assimilation. The results from this modeling exercise indicate that, for this coniferous forest, annual averages of satellite-based fAPAR measurements paired with annual NEE estimates may provide spatial detail on components of ecosystem carbon fluxes in the proximity of eddy covariance towers. Inclusion of other independent data streams in the assimilation will also reduce uncertainty on modeled values.
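For reference, a minimal sketch of the Bayesian Information Criterion used to rank such assimilation experiments, assuming independent Gaussian residuals between modelled and measured fluxes; the residuals and parameter count below are placeholders.

```python
# BIC = k*ln(n) - 2*ln(L) for Gaussian residuals with MLE error variance.
import numpy as np

def bic(residuals, n_params):
    n = residuals.size
    sigma2 = np.mean(residuals ** 2)                      # MLE of error variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return n_params * np.log(n) - 2.0 * log_lik

rng = np.random.default_rng(1)
resid_nee = rng.normal(scale=0.5, size=1000)   # placeholder NEE residuals
print(bic(resid_nee, n_params=30))             # lower BIC = preferred experiment
```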
Abstract:
Observed and predicted changes in the strength of the westerly winds blowing over the Southern Ocean have motivated a number of studies of the response of the Antarctic Circumpolar Current and Southern Ocean Meridional Overturning Circulation (MOC) to wind perturbations, and led to the discovery of the "eddy-compensation" regime, wherein the MOC becomes insensitive to wind changes. In addition to the MOC, tracer transport also depends on mixing processes. Here we show, in a high-resolution process model, that isopycnal mixing by mesoscale eddies is strongly dependent on the wind strength. This dependence can be explained by mixing-length theory and is driven by increases in eddy kinetic energy; the mixing length does not change strongly in our simulation. Simulation of a passive ventilation tracer (analogous to CFCs or anthropogenic CO2) demonstrates that variations in tracer uptake across experiments are dominated by changes in isopycnal mixing, rather than changes in the MOC. We argue that, to properly understand tracer uptake under different wind-forcing scenarios, the sensitivity of isopycnal mixing to winds must be accounted for.
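A back-of-the-envelope illustration of the mixing-length argument invoked above: with an isopycnal diffusivity of the form K ≈ Γ u_rms L_mix, K grows with eddy kinetic energy even when the mixing length is held fixed. The efficiency Γ, length scale and EKE values below are assumed for illustration only.

```python
# Mixing-length scaling: diffusivity K ~ Gamma * u_rms * L_mix, with u_rms
# taken from eddy kinetic energy and the mixing length held fixed.
import numpy as np

gamma = 0.35                         # mixing efficiency (assumed)
L_mix = 50e3                         # mixing length in metres, held fixed
eke = np.array([0.01, 0.02, 0.04])   # eddy kinetic energy (m^2/s^2) under stronger winds

u_rms = np.sqrt(2.0 * eke)
K = gamma * u_rms * L_mix
print(K)   # diffusivity grows with the square root of EKE at fixed mixing length
```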
Abstract:
This paper presents an integrative and spatially explicit modeling approach for analyzing human and environmental exposure from pesticide application by smallholders in the potato-producing Andean region in Colombia. The modeling approach fulfills the following criteria: (i) it includes environmental and human compartments; (ii) it contains a behavioral decision-making model for estimating the effect of policies on pesticide flows to humans and the environment; (iii) it is spatially explicit; and (iv) it is modular and easily expandable to include additional modules, crops or technologies. The model was calibrated and validated for the Vereda La Hoya and was used to explore the effect of different policy measures in the region. The model has moderate data requirements and can be adapted relatively easily to other regions in developing countries with similar conditions.
Abstract:
In this work we construct the stationary measure of the N species totally asymmetric simple exclusion process in a matrix product formulation. We make the connection between the matrix product formulation and the queueing theory picture of Ferrari and Martin. In particular, in the standard representation, the matrices act on the space of queue lengths. For N > 2 the matrices in fact become tensor products of elements of quadratic algebras. This enables us to give a purely algebraic proof of the stationary measure which we present for N=3.
Abstract:
The aim of the project is to develop a tool for a management system at the advertising agency Confetti. The company has for a long period of time felt a need to introduce a quality and environmental system, but lacks the resources in terms of knowledge, time and money. Its goal is to have concrete documentation that describes its quality policy and how the business functions. The work has been performed by mapping and analyzing the processes of the company according to the process model, with focus on one pilot process. New routines have been developed and documented in a management system. Confetti would like its management system to fulfil the ISO standard so that the company can eventually be certified. A literature study has therefore been carried out on management systems and standards, and the project has been directed towards fulfilling the ISO standard's requirements for a quality management system. It has resulted in the creation of a comprehensible and user-friendly management system, which includes a routine model and new routines for the pilot process of PDF handling. A one-year plan has been created to facilitate the continuous development of the quality management system at Confetti. The new quality implementations require investment, which is why a budget has been calculated showing profitability. Confetti has to continue the development of the management system very soon. The company ought to analyze its current situation and choose the processes it considers most important or easiest to improve. A management system will not operate by itself; it needs maintenance and development.