910 results for Process models
Abstract:
A model structure comprising a wavelet network and a linear term is proposed for nonlinear system identification. It is shown that under certain conditions wavelets are orthogonal to linear functions and, as a result, the two parts of the model can be identified separately. The linear-wavelet model is compared to a standard wavelet network using data from a simulated fermentation process. The results show that the linear-wavelet model yields a smaller modelling error when compared to a wavelet network using the same number of regressors.
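The separate identification described in this abstract can be sketched as a toy two-stage fit. The following is an illustration under assumed ingredients (a Mexican-hat mother wavelet, fabricated data, hypothetical centres and scale), not the authors' network: the linear term is estimated first by least squares, and a small wavelet bank is then fitted to the residual, exploiting the (near-)orthogonality of wavelets to linear functions.

```python
import numpy as np

# Toy sketch of a "linear + wavelet" model: y ~ a*x + b + sum_k w_k*psi((x-c_k)/s).
# All data and basis choices here are fabricated for illustration.
def mexican_hat(u):
    return (1.0 - u**2) * np.exp(-(u**2) / 2.0)

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
y = 0.5 * x + 1.5 * mexican_hat(x / 0.8) + 0.05 * rng.standard_normal(x.size)

# Stage 1: identify the linear term alone by least squares.
A = np.column_stack([x, np.ones_like(x)])
lin, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stage 2: fit wavelet weights to the residual.
centers = np.linspace(-3.0, 3.0, 12)
Psi = mexican_hat((x[:, None] - centers[None, :]) / 0.5)
w, *_ = np.linalg.lstsq(Psi, y - A @ lin, rcond=None)

rmse = float(np.sqrt(np.mean((y - (A @ lin + Psi @ w)) ** 2)))
```

Because the wavelet bump is even and (approximately) integrates to zero over the window, stage 1 recovers the true slope almost exactly, which is the property the abstract exploits.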
Abstract:
The present study investigates the initiation of precipitating deep convection in an ensemble of convection-resolving mesoscale models. Results of eight different model runs from five non-hydrostatic models are compared for a case of the Convective and Orographically-induced Precipitation Study (COPS). An isolated convective cell initiated east of the Black Forest crest in southwest Germany, although convective available potential energy was only moderate and convective inhibition was high. Measurements revealed that, due to the absence of synoptic forcing, convection was initiated by local processes related to the orography. In particular, the lifting by low-level convergence in the planetary boundary layer is assumed to be the dominant process on that day. The models used different configurations as well as different initial and boundary conditions. By comparing the different model performance with each other and with measurements, the processes which need to be well represented to initiate convection at the right place and time are discussed. Besides an accurate specification of the thermodynamic and kinematic fields, the results highlight the role of boundary-layer convergence features for quantitative precipitation forecasts in mountainous terrain.
Abstract:
This study analyzes American option valuation when the underlying exhibits a GARCH-type volatility process. We propose the use of Rubinstein's Edgeworth binomial tree (EBT), in contrast to the simulation-based methods considered in previous studies. The EBT-based valuation approach makes an implied calibration of the pricing model feasible. By empirically analyzing the pricing performance for American index and equity options, we illustrate the superiority of the proposed approach.
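For background, the sketch below prices an American put on a plain Cox-Ross-Rubinstein binomial tree. Rubinstein's Edgeworth tree additionally reshapes the terminal node probabilities to match specified skewness and kurtosis, which is not implemented here; the function name and parameter values are illustrative only.

```python
import numpy as np

def american_put_crr(S0, K, r, sigma, T, n):
    """Price an American put on a standard CRR binomial tree (illustration)."""
    dt = T / n
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = np.exp(-r * dt)
    # Terminal stock prices, highest first.
    S = S0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
    V = np.maximum(K - S, 0.0)
    for _ in range(n):
        S = S[:-1] * d                     # prices one level earlier (u*d = 1)
        V = disc * (p * V[:-1] + (1 - p) * V[1:])
        V = np.maximum(V, K - S)           # early-exercise check at each node
    return float(V[0])

price = american_put_crr(100.0, 100.0, 0.05, 0.2, 1.0, 200)
```

The backward induction with an early-exercise check at every node is shared by the EBT approach; only the construction of the terminal distribution differs.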
Abstract:
Dielectric properties of 16 process cheeses were determined over the frequency range 0.3-3 GHz. The effect of temperature on the dielectric properties of process cheeses was investigated at intervals of 10 °C between 5 and 85 °C. Results showed that the dielectric constant (ε′) decreased gradually as frequency increased, for all cheeses. The dielectric loss factor (ε″) decreased from above 125 to below 12 as frequency increased. ε′ was highest at 5 °C and generally decreased up to a temperature between 55 and 75 °C. ε″ generally increased with increasing temperature for high and medium moisture/fat ratio cheeses. For low moisture/fat ratio cheese, ε″ decreased with temperature between 5 and 55 °C and then increased. Partial least squares regression models indicated that ε′ and ε″ could be used in a quality control screening application to measure the moisture content and inorganic salt content of process cheese, respectively.
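The regression idea in this abstract can be illustrated with a minimal one-component PLS1 fit on synthetic data. Everything below is fabricated (sample count, frequency grid, moisture-to-spectrum relationship); it only shows the mechanics of relating a dielectric spectrum to a quality variable such as moisture content.

```python
import numpy as np

# Synthetic stand-in for the study's data: n "cheese samples", p frequency points.
rng = np.random.default_rng(1)
n, p = 40, 10
moisture = rng.uniform(30.0, 60.0, n)          # % moisture (fabricated)
freq_trend = np.linspace(1.0, 0.6, p)          # dielectric constant falls with frequency
X = moisture[:, None] * freq_trend[None, :] + rng.standard_normal((n, p))
y = moisture

# One-component PLS1 (NIPALS-style): weight vector along X'y.
Xc, yc = X - X.mean(axis=0), y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                                     # latent scores
b = (t @ yc) / (t @ t)                         # regress y on the scores
y_hat = y.mean() + t * b

r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum(yc ** 2)
```

With more components, the deflation step of NIPALS would be repeated; one component suffices here because the fabricated signal is one-dimensional.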
Abstract:
The development of architecture and the settlement is central to discussions concerning the Neolithic transformation, as the very visible evidence for the changes in society that run parallel to the domestication of plants and animals. Architecture has been used as an important aspect of models of how the transformation occurred, and as evidence for the sharp difference between hunter-gatherer and farming societies. We suggest that the emerging evidence for considerable architectural complexity from the early Neolithic indicates that some of our interpretations depend too much on a very basic understanding of structures, which are normally seen as being primarily for residential purposes and containing households, which become the organising principle for the new communities, often seen as fully sedentary and described as villages. Recent work in southern Jordan suggests that in this region at least there is little evidence for a standard house, and that structures were constructed for a range of diverse primary purposes other than as simple domestic shelters.
Abstract:
Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2, to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence to structure alignment methods tested. In addition, using accurate global model quality for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
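The clustering step mentioned in this abstract can be sketched in miniature. The following is a hedged illustration of consensus model selection, not ModFOLDclust2 or IntFOLD-TS: each candidate model is scored by its mean pairwise similarity to the rest of the population, so a model in the densest cluster is preferred. The "structure" vectors and similarity measure are fabricated.

```python
import numpy as np

# Fabricated population: five mutually similar models and one outlier.
rng = np.random.default_rng(2)
cluster = rng.normal(0.0, 0.1, (5, 30))
outlier = rng.normal(2.0, 0.1, (1, 30))
models = np.vstack([cluster, outlier])

# Similarity = 1 / (1 + distance), a stand-in for a TM-score-like measure.
d = np.linalg.norm(models[:, None, :] - models[None, :, :], axis=-1)
S = 1.0 / (1.0 + d)

# Mean similarity to all *other* models; highest value wins.
consensus = (S.sum(axis=1) - 1.0) / (len(models) - 1)
best = int(np.argmax(consensus))
```

The outlier model receives the lowest consensus score and is never selected, which is the behaviour the clustering step is designed to exploit.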
Abstract:
We consider the relation between so-called continuous localization models (i.e. non-linear stochastic Schrödinger evolutions) and the discrete GRW model of wave-function collapse. The former can be understood as a scaling limit of the GRW process. The proof relies on a stochastic Trotter formula, which is of interest in its own right. Our Trotter formula also allows us to complement results on the existence theory of stochastic Schrödinger evolutions by Holevo and Mora/Rebolledo.
Abstract:
Previous studies using coupled general circulation models (GCMs) suggest that the atmosphere model plays a dominant role in the modeled El Niño-Southern Oscillation (ENSO), and that intermodel differences in the thermodynamical damping of sea surface temperatures (SSTs) are a dominant contributor to the ENSO amplitude diversity. This study presents a detailed analysis of the shortwave flux feedback (aSW) in 12 Coupled Model Intercomparison Project phase 3 (CMIP3) simulations, motivated by findings that aSW is the primary contributor to model thermodynamical damping errors. A "feedback decomposition method," developed to elucidate the aSW biases, shows that all models underestimate the dynamical atmospheric response to SSTs in the eastern equatorial Pacific, leading to underestimated aSW values. Biases in the cloud response to dynamics and the shortwave interception by clouds also contribute to errors in aSW. Changes in the aSW feedback between the coupled and corresponding atmosphere-only simulations are related to changes in the mean dynamics. A large nonlinearity is found in the observed and modeled SW flux feedback, hidden when aSW is calculated linearly. In the observations, two physical mechanisms are proposed to explain this nonlinearity: 1) a weaker subsidence response to cold SST anomalies than the ascent response to warm SST anomalies, and 2) a nonlinear high-level cloud cover response to SST. The shortwave flux feedback nonlinearity tends to be underestimated by the models, linked to an underestimated nonlinearity in the dynamical response to SST. The process-based methodology presented in this study may help to correct model ENSO atmospheric biases, ultimately leading to an improved simulation of ENSO in GCMs.
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of increasingly complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid the digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km−2 yr−1) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage areas of 475 to 70,000 km2), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to overpredict export in less agricultural and more forested watersheds and underpredict in more agricultural basins. The magnitude of these biases differed appreciably among the models. Those models having more detailed descriptions of nitrogen sources, land and water attenuation of nitrogen, and water flow paths were found to have considerably lower bias and higher precision in their predictions of nitrogen export.
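The error-modeling step described here can be sketched with a simple regression of prediction error on land use. The numbers below are fabricated and only reproduce the qualitative bias pattern the abstract reports (overprediction in forested basins, underprediction in agricultural ones).

```python
import numpy as np

# Fabricated data for 16 hypothetical watersheds.
rng = np.random.default_rng(3)
cultivated = rng.uniform(0.0, 1.0, 16)   # fraction of watershed cultivated
# Signed prediction error: positive = overprediction; declines with cultivation.
error = 0.4 - 0.8 * cultivated + 0.1 * rng.standard_normal(16)

# Fit error = slope * cultivated + intercept; a negative slope flags the bias.
slope, intercept = np.polyfit(cultivated, error, 1)
```

A significantly non-zero slope in such a regression is evidence of a structural model deficiency rather than random noise, which is the purpose of the error-modeling technique.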
Abstract:
Human-made transformations to the environment, and in particular the land surface, are having a large impact on the distribution (in both time and space) of rainfall, upon which all life is reliant. Focusing on precipitation, soil moisture and near-surface temperature, we compare data from Phase 5 of the Climate Modelling Intercomparison Project (CMIP5), as well as blended observational–satellite data, to see how the interaction between rainfall and the land surface differs (or agrees) between the models and reality, at daily timescales. As expected, the results suggest a strong positive relationship between precipitation and soil moisture when precipitation leads and is concurrent with soil moisture estimates, for the tropics as a whole. Conversely a negative relationship is shown when soil moisture leads rainfall by a day or more. A weak positive relationship between precipitation and temperature is shown when either leads by one day, whereas a weak negative relationship is shown over the same time period between soil moisture and temperature. Temporally, in terms of lag and lead relationships, the models appear to be in agreement on the overall patterns of correlation between rainfall and soil moisture. However, in terms of spatial patterns, a comparison of these relationships across all available models reveals considerable variability in the ability of the models to reproduce the correlations between precipitation and soil moisture. There is also a difference in the timings of the correlations, with some models showing the highest positive correlations when precipitation leads soil moisture by one day. Finally, the results suggest that there are 'hotspots' of high linear gradients between precipitation and soil moisture, corresponding to regions experiencing heavy rainfall. These results point to an inability of the CMIP5 models to simulate a positive feedback between soil moisture and precipitation at daily timescales. 
Longer timescale comparisons and experiments at higher spatial resolutions, where the impact of the spatial heterogeneity of rainfall on the initiation of convection and supply of moisture is included, would be expected to improve process understanding further.
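The lag/lead correlation analysis described above can be sketched on toy daily series. The data below are fabricated (soil moisture simply integrates rainfall with a decay), so only the asymmetry matters: correlation is strong when rain leads soil moisture and near zero when soil moisture leads rain.

```python
import numpy as np

def lagged_corr(a, b, lag):
    """Correlation of a(t) with b(t + lag); positive lag means a leads b."""
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    elif lag < 0:
        a, b = a[-lag:], b[:lag]
    return float(np.corrcoef(a, b)[0, 1])

# Toy daily series: rainfall is intermittent, soil moisture integrates it.
rng = np.random.default_rng(4)
rain = np.maximum(rng.standard_normal(365), 0.0)
sm = np.zeros(365)
for t in range(1, 365):
    sm[t] = 0.8 * sm[t - 1] + rain[t]     # rain wets the soil the same day

lead = lagged_corr(rain, sm, 1)    # rain leads soil moisture by one day
lag = lagged_corr(rain, sm, -1)    # soil moisture leads rain by one day
```

In the toy setup the rain-leads direction dominates by construction; comparing the same two correlations in CMIP5 output versus observations is the essence of the analysis in the abstract.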
Abstract:
Purpose – This paper aims to provide a brief résumé of previous research which has analysed the impact of e-commerce on retail real estate in the UK, and to examine the important marketing role of the internet for shopping centre managers and retail landlords. Design/methodology/approach – Based on the results from a wider study carried out in 2003, the paper uses case studies from two different shopping centres in the UK, and documents the innovative uses of both web-based marketing and online retailing by organisations that historically have not directly been involved in the retailing process. Findings – The paper highlights the importance of considering online sales within a multi-channel approach to retailing. The two types of emerging shopping centre model which are identified are characterised by their ultimate relationship with the physical shopping centre on whose web site they reside. These can be summarised as: the "centre-led" approach, and the "brand-led" or "marketing-led" approach. Research limitations/implications – The research is based on a limited number of in-depth case studies and secondary data. Further research is needed to monitor the continuing impact of e-commerce on retail property and the marketing strategies of shopping centre managers and owners. Practical implications – Internet-based sales provide an important adjunct to conventional retail sales and an important source of potential risk for landlords and tenants in the real estate investment market. Regardless of whether retailers use the internet as a sales channel, as a product-sourcing tool, or merely to provide information to the consumer, the internet has become a keystone within the greater retail marketing mix. The findings have ramifications for understanding the way in which landlords are structuring their retail property to defray potential risks.
Originality/value – The paper examines shopping centre online marketing models for the first time in detail, and will be of value to retail occupiers, owners and other stakeholders of shopping centres.
Abstract:
The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate forecasts produced by the models participating in this activity. Here we introduce and review the models used for the second round (CCMVal-2) of this intercomparison, regarding the implementation of chemical, transport, radiative, and dynamical processes in these models. In particular, we review the advantages and problems associated with approaches used to model processes of relevance to stratospheric dynamics and chemistry. Furthermore, we state the definitions of the reference simulations performed, and describe the forcing data used in these simulations. We identify some developments in chemistry-climate modeling that make models more physically based or more comprehensive, including the introduction of an interactive ocean, online photolysis, troposphere-stratosphere chemistry, and non-orographic gravity-wave deposition as linked to tropospheric convection. The relatively new developments indicate that stratospheric CCM modeling is becoming more consistent with our physically based understanding of the atmosphere.
Abstract:
A process-oriented modeling approach is applied in order to simulate glacier mass balance for individual glaciers using statistically downscaled general circulation models (GCMs). Glacier-specific seasonal sensitivity characteristics based on a mass balance model of intermediate complexity are used to simulate mass balances of Nigardsbreen (Norway) and Rhonegletscher (Switzerland). Simulations using reanalyses (ECMWF) for the period 1979–93 are in good agreement with in situ mass balance measurements for Nigardsbreen. The method is applied to multicentury integrations of coupled (ECHAM4/OPYC) and mixed-layer (ECHAM4/MLO) GCMs excluding external forcing. A high correlation between decadal variations in the North Atlantic oscillation (NAO) and mass balance of the glaciers is found. The dominant factor for this relationship is the strong impact of winter precipitation associated with the NAO. A high NAO phase means enhanced (reduced) winter precipitation for Nigardsbreen (Rhonegletscher), typically leading to a higher (lower) than normal annual mass balance. This mechanism, entirely due to internal variations in the climate system, can explain observed strong positive mass balances for Nigardsbreen and other maritime Norwegian glaciers within the period 1980–95. It can also partly be responsible for recent strong negative mass balances of Alpine glaciers.
Abstract:
We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover, composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), and the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). SDBM reproduces observed CO2 seasonal cycles, but its simulation of independent measurements of net primary production (NPP) is too high. The two DGVMs show little difference for most benchmarks (including the interannual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.
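The "random model" baseline described in this abstract can be sketched as follows. The data and the error metric (a normalized mean squared error) are fabricated stand-ins; the point is only the comparison of a model's score against the score of bootstrap resamples of the observations.

```python
import numpy as np

rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 1.0, 500)                 # fabricated observed field
model = obs + 0.3 * rng.standard_normal(500)   # a model with modest error

def nmse(pred, obs):
    """Mean squared error normalized by the observed variance."""
    return float(np.mean((pred - obs) ** 2) / np.var(obs))

model_score = nmse(model, obs)

# "Random" benchmark: mean NMSE of bootstrap resamples of the observations,
# i.e. a model with the right distribution but no spatial/temporal skill.
random_score = float(np.mean([
    nmse(rng.choice(obs, obs.size, replace=True), obs)
    for _ in range(200)
]))
```

A model is only credited with skill to the extent that its score beats the bootstrap baseline, which separates genuine process representation from merely matching the observed distribution.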