185 results for Markov processes.
Abstract:
In the ten years since the first edition of this book appeared there have been significant developments in food process engineering, notably in biotechnology and membrane applications. Advances have been made in the use of sensors for process control, and the growth of information technology and on-line computer applications continues apace. In addition, plant investment decisions are increasingly determined by quality assurance considerations and have to incorporate a greater emphasis on health and safety issues. The content of this edition has been rearranged to include descriptions of recent developments and to reflect the influence of new technology on the control and operations of automated plant. Original examples have been retained where relevant and these, together with many new illustrations, provide a comprehensive guide to good practice.
Abstract:
The primary role of land surface models embedded in climate models is to partition surface available energy into upward radiative, sensible and latent heat fluxes. Partitioning of evapotranspiration, ET, is of fundamental importance: as a major component of the total surface latent heat flux, ET affects the simulated surface water balance and the related energy balance, and consequently the feedbacks with the atmosphere. In this context it is also crucial to credibly represent the CO2 exchange between ecosystems and their environment. In this study, JULES, the land surface model used in UK weather and climate models, has been evaluated for temperate Europe. Compared to eddy covariance flux measurements, the CO2 uptake by the ecosystem is underestimated and the ET overestimated. In addition, the contribution to ET from soil and intercepted water evaporation far outweighs the contribution of plant transpiration. To alleviate these biases, adaptations have been implemented in JULES, based on key literature references. These adaptations have improved the simulation of the spatio-temporal variability of the fluxes and the accuracy of the simulated GPP and ET, including its partitioning, and have shifted the seasonal soil moisture cycle. These adaptations are expected to increase the fidelity of climate simulations over Europe. Finally, the extreme summer of 2003 was used as an evaluation benchmark for the use of the model in climate change studies. The improved model captures the impact of the 2003 drought on the carbon assimilation and the water use efficiency of the plants; it does, however, underestimate the 2003 GPP anomalies. The simulations showed that a reduction of evaporation from the interception and soil reservoirs, but not of transpiration, largely explained the good correlation between the carbon and water flux anomalies observed during 2003. This demonstrates the importance of being able to discriminate the responses of the individual components of the ET flux to environmental forcing.
Abstract:
Tests for business cycle asymmetries are developed for Markov-switching autoregressive models. The tests of deepness, steepness, and sharpness are Wald statistics, which have standard asymptotics. For the standard two-regime model of expansions and contractions, deepness is shown to imply sharpness (and vice versa), whereas the process is always nonsteep. Two- and three-state models of U.S. GNP growth are used to illustrate the approach, along with models of U.S. investment and consumption growth. The robustness of the tests to model misspecification, and the effects of regime-dependent heteroscedasticity, are investigated.
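As an informal companion to this abstract, the sketch below (with illustrative regime means, variance and transition probabilities, not estimates from U.S. GNP data) simulates a two-regime Markov-switching mean model for growth rates and computes the sample skewness of the series and of its first differences as rough proxies for the deepness and steepness notions discussed above; it does not implement the paper's Wald tests.

```python
# Minimal sketch: two-regime Markov-switching mean model for growth rates.
# Parameters are illustrative, not estimated from U.S. GNP data.
import numpy as np

rng = np.random.default_rng(0)

# Transition matrix: state 0 = expansion, state 1 = contraction.
P = np.array([[0.95, 0.05],
              [0.20, 0.80]])
mu = np.array([0.8, -0.5])   # regime-dependent mean growth (illustrative)
sigma = 0.6                  # common innovation standard deviation

T = 5000
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])

growth = mu[states] + sigma * rng.standard_normal(T)

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean()**1.5

# Negative skewness of growth is an informal deepness indicator; skewness of
# the first differences is an informal steepness check.
print("skewness of growth:             %+.3f" % skewness(growth))
print("skewness of growth differences: %+.3f" % skewness(np.diff(growth)))
```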
Abstract:
Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both provide better descriptions of the data than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
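To make the contrast between the two model classes concrete, the sketch below (with purely illustrative parameters, not estimates from any commercial real estate index) simulates a simple threshold autoregressive (SETAR) process, in which the regime is switched by an observed lagged value crossing a threshold; in a Markov switching model, by contrast, the regime follows a latent Markov chain, as in the previous sketch.

```python
# Minimal sketch: two-regime self-exciting threshold AR(1) process.
# Parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
T, r = 2000, 0.0                          # r = threshold on the lagged value
phi = {0: (0.3, 0.5), 1: (-0.2, 0.2)}     # (intercept, AR coefficient) per regime

y = np.zeros(T)
for t in range(1, T):
    regime = 0 if y[t - 1] <= r else 1    # regime set by an observable, not a latent state
    c, a = phi[regime]
    y[t] = c + a * y[t - 1] + rng.standard_normal()

print("mean %.3f, std %.3f, skewness %.3f" %
      (y.mean(), y.std(), ((y - y.mean())**3).mean() / y.std()**3))
```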
Abstract:
Experimental buildings at Butser Ancient Farm and St. Fagans (UK) and Lejre (Denmark) were sampled to investigate micromorphology of known activity areas, to contribute to our understanding of the internal use of space in excavated buildings and formation processes of house floor deposits. The experimental buildings provided important information relating to activity residues and sediments over the 16 years that the buildings were in use. Specifically, these results contribute to our understanding of the routes and cycles for transportation of materials in occupation contexts, which can be used to inform archaeological studies. It has been possible to identify internal ‘hot spots’ within the buildings for the deposition of activity residues and for the formation of specific deposit types. Analysis also highlighted postdepositional alterations occurring in internal occupation deposits, which has provided a means of identifying roofed and unroofed spaces in the archaeological record.
Abstract:
Galactic Cosmic Rays are one of the major sources of ion production in the troposphere and stratosphere. Recent studies have shown that ions form electrically charged clusters which may grow to become cloud droplets. Aerosol particles charge by the attachment of ions and electrons. The collision efficiency between a particle and a water droplet increases if the particle is electrically charged, and thus aerosol-cloud interactions can be enhanced. Because these microphysical processes may change the radiative properties of clouds and impact Earth's climate, it is important to evaluate their quantitative effects. Five independently developed models have been coupled to investigate this. The first model estimates cloud height from the dew point temperature and the temperature profile. The second model simulates cloud droplet growth from aerosol particles using the cloud parcel concept. In the third model, the scavenging rate of the aerosol particles is calculated using the collision efficiency between charged particles and droplets. The fourth model calculates the electric field and the charge distribution on water droplets and aerosols within the cloud. The fifth model simulates the global electric circuit (GEC), computing the conductivity and ionic concentration in the atmosphere over the altitude range 0–45 km. The first four models are coupled to calculate the cloud height and cloud boundary conditions, followed by droplet growth, the charge distribution on aerosols and cloud droplets, and finally scavenging; these are then incorporated with the GEC model. The simulations are verified against experimental data on charged aerosols at various altitudes. Our calculations showed an effect of aerosol charging on the CCN concentration within the cloud, because charging increases the scavenging of particles in the size range 0.1 µm to 1 µm.
Abstract:
This paper explores the mapping of the environmental assessment process onto design and construction processes. A comparative case study method is used to identify and account for variations in the ‘fit’ between these two processes. The analysis compares eight BREEAM projects (although the findings are also relevant to LEED, GreenStar, etc.) and distinguishes project-level characteristics and dynamics. Drawing on insights from the literature on sustainable construction and assessment methods, an analytic framework is developed to examine the effect of clusters of project- and assessment-level elements on different types of fit (tight, punctual and bolt-on). Key elements distinguishing between types include: prior working experience with project team members, individual commitment to sustainable construction, experience with sustainable construction, project continuity, project-level ownership of the assessment process, and the nature and continuity of assessor involvement. Professionals with ‘sustainable’ experience used BREEAM judiciously to support their designs (along with other frameworks), but less committed professionals tended to treat it purely as an assessment method. More attention needs to be paid to individual levels of engagement with, and understanding of, sustainability in general (rather than knowledge of technical solutions to individual credits), to ownership of the assessment process and to the potential effect of discontinuities at the project level on sustainable design.
Abstract:
Decades of research attest that memory processes suffer under conditions of auditory distraction. What is less well understood, however, is whether people are able to modify how their memory processes are deployed in order to compensate for the disruptive effects of distraction. The metacognitive approach to memory describes a variety of ways people can exert control over their cognitive processes to optimize performance. Here we describe our recent investigations into how these control processes change under conditions of auditory distraction. We specifically looked at control of encoding, in the form of decisions about how long to study a word when it is presented, and control of memory reporting, in the form of decisions about whether to volunteer or withhold retrieved details. Regarding control of encoding, we expected that people would compensate for the disruptive effects of distraction by extending study time under noise. Our results revealed, however, that when exposed to irrelevant speech, people curtail rather than extend study. Regarding control of memory reporting, we expected that people would compensate for the loss of access to memory records by volunteering responses held with lower confidence. Our results revealed, however, that people’s reporting strategies do not differ when the memory task is performed in silence or under auditory distraction, although distraction seriously undermines people’s confidence in their own responses. Together, our studies reveal novel avenues for investigating the psychological effects of auditory distraction within a metacognitive framework.
Abstract:
How is semantic memory influenced by individual differences under conditions of distraction? This question was addressed by observing how visual target words—drawn from a single category—were recalled whilst ignoring spoken distracter words that were members of either the same or a different (single) category. Working memory capacity (WMC) was related to disruption only with synchronous, not asynchronous, presentation, and distraction was greater when the words were presented synchronously. Subsequent experiments found greater negative priming of distracters amongst individuals with higher WMC, but this may depend on targets and distracters being comparable category exemplars. With less dominant category members as distracters, target recall was impaired – relative to control – only amongst individuals with low WMC. The results highlight the role of cognitive control resources in target-distracter selection and the individual-specific cost implications of such cognitive control.
Abstract:
This paper identifies characteristics of knowledge-intensive processes and a method to improve their performance, based on an analysis of investment banking front office processes. The inability to improve these processes using standard process improvement techniques confirmed that much of the process was not codified and depended on tacit knowledge and skills. This led to a semi-structured analysis of the characteristics of the processes via a questionnaire, identifying characteristics of knowledge-intensive processes and adding to existing theory. Further work identified innovative process analysis and change techniques that could generate improvements, based on an analysis of their properties and the issue drivers. An improvement methodology was developed to harness a number of techniques that were found to be effective in resolving the issue drivers and improving these knowledge-intensive processes.
Abstract:
Monte Carlo algorithms often aim to draw from a distribution π by simulating a Markov chain with transition kernel P such that π is invariant under P. However, there are many situations for which it is impractical or impossible to draw from the transition kernel P. For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace P by an approximation P̂. Using theory from the stability of Markov chains, we explore a variety of situations where it is possible to quantify how ‘close’ the chain given by the transition kernel P̂ is to the chain given by P. We apply these results to several examples from spatial statistics and network analysis.
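One common way to construct such an approximation P̂, sketched below for illustration only (this is not the paper's method and does not implement its perturbation bounds), is to replace the full-data log-likelihood inside a random-walk Metropolis acceptance step with a rescaled estimate computed on a random subsample of the data; the synthetic Gaussian dataset and all tuning values are assumptions made for the example.

```python
# Minimal sketch: exact random-walk Metropolis (kernel P) versus a version whose
# acceptance step uses a rescaled subsampled log-likelihood (kernel P-hat).
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=1.5, scale=1.0, size=100_000)   # synthetic "massive" dataset

def log_like(theta, x):
    # Gaussian log-likelihood with known unit variance (constants dropped).
    return -0.5 * np.sum((x - theta) ** 2)

def metropolis(n_iter, subsample=None, step=0.05):
    n = len(data)
    theta = 0.0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        x = data if subsample is None else rng.choice(data, size=subsample)
        scale = 1.0 if subsample is None else n / subsample   # rescale to full-data size
        prop = theta + step * rng.standard_normal()
        log_alpha = scale * (log_like(prop, x) - log_like(theta, x))
        if np.log(rng.random()) < log_alpha:
            theta = prop
        chain[i] = theta
    return chain

exact = metropolis(2000)                   # kernel P: full-data likelihood
approx = metropolis(2000, subsample=1000)  # kernel P-hat: subsampled likelihood
print("posterior mean, exact vs approximate: %.4f vs %.4f"
      % (exact[1000:].mean(), approx[1000:].mean()))
```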
Abstract:
Seven catchments of diverse size in Mediterranean Europe were investigated in order to understand the main aspects of their hydrological functioning. The methods included the analysis of daily and monthly precipitation, monthly potential evapotranspiration rates, flow duration curves, rainfall-runoff relationships and catchment internal data for the smaller and more instrumented catchments. The results showed that the catchments were less dry than initially considered. Only one of them was truly semi-arid throughout the year. All the remaining catchments showed wet seasons when precipitation exceeded potential evapotranspiration, allowing aquifer recharge, wet runoff generation mechanisms and a relevant baseflow contribution. Nevertheless, local infiltration excess (Hortonian) overland flow was inferred during summer storms in some catchments, and urban overland flow in some others. Karstic groundwater, human disturbance and low winter temperatures were identified as having an important impact on the hydrological regime in some of the catchments.
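Two of the diagnostics mentioned above, a flow duration curve and an annual rainfall-runoff coefficient, can be computed in a few lines; the sketch below uses synthetic daily series as placeholders rather than data from the seven catchments.

```python
# Minimal sketch: flow duration curve and runoff coefficient from daily series.
# The daily rainfall and flow series are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
rain = rng.gamma(shape=0.3, scale=8.0, size=365)                 # daily rainfall, mm
flow = 0.3 * rain + rng.gamma(shape=0.5, scale=0.4, size=365)    # daily runoff, mm

# Flow duration curve: flows ranked in decreasing order against exceedance probability.
sorted_flow = np.sort(flow)[::-1]
exceedance = np.arange(1, len(flow) + 1) / (len(flow) + 1)       # Weibull plotting positions

q95 = np.interp(0.95, exceedance, sorted_flow)   # low-flow index exceeded 95% of the time
runoff_coeff = flow.sum() / rain.sum()           # annual rainfall-runoff coefficient

print("Q95 = %.2f mm/day, annual runoff coefficient = %.2f" % (q95, runoff_coeff))
```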
Abstract:
When studying hydrological processes with a numerical model, global sensitivity analysis (GSA) is essential if one is to understand the impact of model parameters and model formulation on results. However, different definitions of sensitivity can lead to differences in the ranking of importance of the different model factors. Here we combine a fuzzy performance function with different methods of calculating global sensitivity to perform a multi-method global sensitivity analysis (MMGSA). We use an application of a finite element subsurface flow model (ESTEL-2D) to a flood inundation event on a floodplain of the River Severn to illustrate this new methodology. We demonstrate the utility of the method for model understanding and show how the prediction of state variables, such as Darcian velocity vectors, can be affected by such an MMGSA. This paper is a first attempt to use GSA with a numerically intensive hydrological model.
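The core idea, applying several sensitivity measures to a fuzzy performance function and comparing the resulting parameter rankings, can be illustrated on a toy model; the sketch below is an assumed stand-in for the ESTEL-2D application (a triangular membership function, three uniform parameters and two simple sensitivity measures) and is not the paper's MMGSA implementation.

```python
# Minimal sketch: a multi-method sensitivity comparison on a fuzzy performance score.
import numpy as np

rng = np.random.default_rng(4)
N = 20_000
x = rng.uniform(0.0, 1.0, size=(N, 3))   # three uncertain parameters (placeholders)

def toy_model(p):
    # Toy response standing in for a model prediction (e.g. a water-table depth).
    return 2.0 * p[:, 0] + p[:, 1] ** 2 + 0.1 * np.sin(6 * np.pi * p[:, 2])

def fuzzy_performance(y, lo=0.5, best=1.5, hi=2.5):
    # Triangular membership: 1 at 'best', falling linearly to 0 at 'lo' and 'hi'.
    return np.clip(np.minimum((y - lo) / (best - lo), (hi - y) / (hi - best)), 0.0, 1.0)

score = fuzzy_performance(toy_model(x))

# Method 1: absolute rank correlation between each parameter and the score.
def rank(v):
    return np.argsort(np.argsort(v))

m1 = [abs(np.corrcoef(rank(x[:, i]), rank(score))[0, 1]) for i in range(3)]

# Method 2: crude first-order variance-based index via conditional means over bins.
def first_order(xi, y, bins=20):
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

m2 = [first_order(x[:, i], score) for i in range(3)]

print("rank-correlation measure:", np.round(m1, 3))
print("variance-based measure:  ", np.round(m2, 3))
print("rankings agree:", np.argsort(m1)[::-1].tolist() == np.argsort(m2)[::-1].tolist())
```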