55 results for Internal Model Principle (IMP)
Abstract:
The performance of the atmospheric component of the new Hadley Centre Global Environmental Model (HadGEM1) is assessed in terms of its ability to represent a selection of key aspects of variability in the Tropics and extratropics. These include midlatitude storm tracks and blocking activity, synoptic variability over Europe, and the North Atlantic Oscillation together with tropical convection, the Madden-Julian oscillation, and the Asian summer monsoon. Comparisons with the previous model, the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3), demonstrate that there has been a considerable increase in the transient eddy kinetic energy (EKE), bringing HadGEM1 into closer agreement with current reanalyses. This increase in EKE results from the increased horizontal resolution and, in combination with the improved physical parameterizations, leads to improvements in the representation of Northern Hemisphere storm tracks and blocking. The simulation of synoptic weather regimes over Europe is also greatly improved compared to HadCM3, again due to both increased resolution and other model developments. The variability of convection in the equatorial region is generally stronger and closer to observations than in HadCM3. There is, however, still limited convective variance coincident with several of the observed equatorial wave modes. Simulation of the Madden-Julian oscillation is improved in HadGEM1: both the activity and interannual variability are increased and the eastward propagation, although slower than observed, is much better simulated. While some aspects of the climatology of the Asian summer monsoon are improved in HadGEM1, the upper-level winds are too weak and the simulation of precipitation deteriorates. The dominant modes of monsoon interannual variability are similar in the two models, although in HadCM3 this is linked to SST forcing, while in HadGEM1 internal variability dominates. 
Overall, analysis of the phenomena considered here indicates that HadGEM1 performs well and, in many important respects, improves upon HadCM3. Together with the improved representation of the mean climate, this improvement in the simulation of atmospheric variability suggests that HadGEM1 provides a sound basis for future studies of climate and climate change.
Abstract:
In this paper we focus on the one-year-ahead prediction of the electricity peak-demand daily trajectory during the winter season in Central England and Wales. We define a Bayesian hierarchical model for predicting the winter trajectories and present results based on the past observed weather. Thanks to the flexibility of the Bayesian approach, we are able to produce the marginal posterior distributions of all the predictands of interest. This is fundamental progress with respect to the classical methods. The results are encouraging in terms of both skill and representation of uncertainty. Further extensions are straightforward, at least in principle. The two main ones consist in conditioning the weather generator model on additional information, such as knowledge of the first part of the winter and/or the seasonal weather forecast. Copyright (C) 2006 John Wiley & Sons, Ltd.
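To illustrate, in a deliberately minimal way, how a Bayesian treatment yields a full posterior distribution rather than a point forecast, the sketch below performs a conjugate normal-normal update for a single day's peak demand. This is not the paper's hierarchical model, and every number is a hypothetical illustration.

```python
import math

def posterior_peak_demand(prior_mean, prior_var, obs, obs_var):
    """Posterior mean/variance of the latent peak demand (conjugate update)."""
    n = len(obs)
    obs_mean = sum(obs) / n
    # Precision-weighted combination of prior belief and observed winters.
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

# Hypothetical peak demands (GW) from three comparable past winters.
mean, var = posterior_peak_demand(prior_mean=50.0, prior_var=9.0,
                                  obs=[52.1, 53.4, 51.7], obs_var=4.0)
sd = math.sqrt(var)  # full marginal posterior: Normal(mean, sd**2)
```

The posterior mean lands between the prior mean and the sample mean, and the posterior variance is smaller than either source alone, which is what makes the representation of uncertainty explicit.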
Abstract:
Lava domes comprise core, carapace, and clastic talus components. They can grow endogenously, by inflation of a core, and/or exogenously, with the extrusion of shear-bounded lobes and whaleback lobes at the surface. Internal structure is paramount in determining the extent to which lava dome growth evolves stably or, conversely, the propensity for collapse. The more core lava that exists within a dome, in both relative and absolute terms, the more explosive energy is available, both for large pyroclastic flows following collapse and in particular for lateral blast events following very rapid removal of lateral support to the dome. Knowledge of the location of the core lava within the dome is also relevant for hazard assessment purposes. A spreading toe, or lobe of core lava, over a talus substrate may be both relatively unstable and likely to accelerate to more violent activity during the early phases of a retrogressive collapse. Soufrière Hills Volcano, Montserrat has been erupting since 1995 and has produced numerous lava domes that have undergone repeated collapse events. We consider one continuous dome growth period, from August 2005 to May 2006, that resulted in a dome collapse on 20th May 2006. The collapse lasted 3 h, removing the whole dome plus dome remnants from a previous growth period in an unusually violent and rapid event. We use an axisymmetric computational finite element method model for the growth and evolution of a lava dome. Our model comprises evolving core, carapace and talus components based on axisymmetric endogenous dome growth, which permits us to model the interface between talus and core. Despite explicitly modelling only axisymmetric endogenous dome growth, our core–talus model simulates many of the observed growth characteristics of the 2005–2006 SHV lava dome well.
Further, our simulations can replicate large-scale exogenous characteristics once a considerable volume of talus has accumulated around the lower flanks of the dome. Model results suggest that dome core can override talus within a growing dome, potentially generating a region of significant weakness and a potential locus for collapse initiation.
Abstract:
In designing modern office buildings, building spaces are frequently zoned by introducing internal partitioning, which may have a significant influence on the room air environment. This internal partitioning was studied by means of model tests, numerical simulation and, as the final stage, statistical analysis. In this paper, the results produced by the statistical analysis are summarized and presented.
Effect of internal partitioning on indoor air quality of rooms with mixing ventilation - basic study
Abstract:
The internal partitioning, which is frequently introduced in open-space planning due to its flexibility, was tested to study its effects on room air quality as well as on ventilation performance. For the study, physical tests using a small model room and numerical modelling using CFD computation were employed to evaluate different test conditions with mixing ventilation from the ceiling. The partition parameters, such as its location, height, and the gap underneath, as well as the contaminant source location, were tested under isothermal conditions. This paper summarizes the results of the study.
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross-validation is often used to estimate generalization errors by choosing amongst different network architectures (M. Stone, "Cross-validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 111-147, 1974). Based upon the minimization of LOO criteria, namely the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures. The proposed backward elimination procedures exploit an orthogonalization procedure to enforce orthogonality between the subspace spanned by the pruned model and the deleted regressor. Subsequently, it is shown that the LOO criteria used in both algorithms can be calculated via an analytic recursive formula, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without an additional stopping criterion; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods that prune a model to gain extra sparsity and improved generalization.
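The paper's recursive LOO formulas are not reproduced here, but the simpler classical identity they build on can be sketched: for a linear-in-the-parameters model, each leave-one-out residual equals the ordinary residual divided by one minus the corresponding hat-matrix diagonal, so no data splitting or refitting is required. The data below are synthetic.

```python
import numpy as np

def loo_residuals(X, y):
    """Exact LOO residuals via the hat matrix H = X (X^T X)^{-1} X^T."""
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    e = y - H @ y                    # ordinary least-squares residuals
    return e / (1.0 - np.diag(H))    # PRESS identity: e_i / (1 - h_ii)

def loo_residuals_naive(X, y):
    """Brute-force check: refit the model with each point deleted."""
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        out[i] = y[i] - X[i] @ beta
    return out

# Synthetic regression problem with 20 samples and 3 regressors.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + 0.1 * rng.normal(size=20)
fast = loo_residuals(X, y)
slow = loo_residuals_naive(X, y)
```

The two routes agree to machine precision, which is why analytic LOO criteria make backward elimination computationally cheap.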
Abstract:
The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameters models with universal approximation capabilities has been intensively studied and widely used due to the availability of many linear-learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameters models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data only. The important concepts for achieving good model generalisation used in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
Abstract:
A numerical mesoscale model is used to make a high-resolution simulation of the marine boundary layer in the Persian Gulf, during conditions of offshore flow from Saudi Arabia. A marine internal boundary layer (MIBL) and a sea-breeze circulation (SBC) are found to co-exist. The sea breeze develops in the mid-afternoon, at which time its front is displaced several tens of kilometres offshore. Between the coast and the sea-breeze system, the MIBL that occurs is consistent with a picture described in the existing literature. However, the MIBL is perturbed by the SBC, the boundary layer deepening significantly seaward of the sea-breeze front. Our analysis suggests that this strong, localized deepening is not a direct consequence of frontal uplift, but rather that the immediate cause is the retardation of the prevailing, low-level offshore wind by the SBC. The simulated boundary-layer development can be accounted for by using a simple 1D Lagrangian model of growth driven by the surface heat flux. This model is obtained as a straightforward modification of an established MIBL analytic growth model.
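A minimal sketch of the kind of 1D Lagrangian slab growth model described above, assuming the classical encroachment form dh/dt = H / (rho * cp * gamma * h), where gamma is the potential-temperature lapse rate above the layer. The paper's model is a modification of an established analytic growth model; the constants and fluxes below are illustrative assumptions only.

```python
import math

def mibl_depth(h0, fluxes, dt, rho=1.2, cp=1004.0, gamma=0.005):
    """Integrate boundary-layer depth h (m) over a series of surface
    sensible heat fluxes (W m^-2), each applied for dt seconds."""
    h = h0
    for H in fluxes:
        # Exact update of dh/dt = H/(rho*cp*gamma*h) over one step:
        # h_new^2 = h^2 + 2*H*dt/(rho*cp*gamma)
        h = math.sqrt(h * h + 2.0 * H * dt / (rho * cp * gamma))
    return h

# One hour of steady 200 W m^-2 heating in 60 s steps, from 100 m.
depth = mibl_depth(100.0, [200.0] * 60, 60.0)
```

With a time-varying flux series along an air-parcel trajectory, the same loop gives the Lagrangian depth evolution; with zero flux the layer depth is unchanged, as expected.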
Abstract:
In financial decision-making, a number of mathematical models have been developed for financial management in construction. However, the need to optimize both qualitative and quantitative factors, together with the semi-structured nature of construction finance optimization problems, poses key challenges in making construction finance decisions. The selection of funding schemes by a modified construction loan acquisition model is solved with an adaptive genetic algorithm (AGA) approach. The basic objectives of the model are to optimize the loan and to minimize the interest payments across all projects. Multiple projects undertaken by a medium-size construction firm in Hong Kong were used as a real case study to demonstrate the application of the model to borrowing decision problems, and a compromise monthly borrowing schedule was finally achieved. The results indicate that the Small and Medium Enterprise (SME) Loan Guarantee Scheme (SGS) was first identified as the source of external financing. Sources of funding can then be selected so as to avoid the possibility of financial problems in the firm, by classifying qualitative factors into external, interactive and internal types and taking additional qualitative factors, including sovereignty, credit ability and networking, into consideration. Thus a more accurate, objective and reliable borrowing decision can be provided for the decision-maker to analyse the financial options.
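As a hedged illustration of the genetic-algorithm component only (not the paper's AGA, whose adaptive operator rates are omitted), the sketch below evolves binary funding-scheme selections to cover a required borrowing amount at minimum interest. The scheme names, amounts and rates are hypothetical, not figures from the case study.

```python
import random

# Hypothetical schemes: (name, amount in HK$ million, annual interest rate).
SCHEMES = [("SGS", 2.0, 0.045), ("overdraft", 1.0, 0.080),
           ("term_loan", 3.0, 0.055), ("trade_credit", 0.5, 0.065)]
REQUIRED = 3.5  # hypothetical total borrowing requirement (HK$ million)

def cost(bits):
    """Annual interest of a selection, with a penalty if it is infeasible."""
    amount = sum(a for b, (_, a, _) in zip(bits, SCHEMES) if b)
    interest = sum(a * r for b, (_, a, r) in zip(bits, SCHEMES) if b)
    return interest + (1e3 if amount < REQUIRED else 0.0)

def genetic_search(pop_size=30, generations=100, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in SCHEMES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                      # elitism: keep best half
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)       # one-point crossover
            cut = rng.randrange(1, len(SCHEMES))
            child = a[:cut] + b[cut:]
            if rng.random() < p_mut:            # single-bit mutation
                i = rng.randrange(len(SCHEMES))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = genetic_search()
```

Because the elite half of each generation survives unchanged, the best selection found never regresses; the returned bit vector is a feasible, low-interest funding mix under the toy data.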
Abstract:
Pollen-mediated gene flow is one of the main concerns associated with the introduction of genetically modified (GM) crops. Should a premium for non-GM varieties emerge on the market, ‘contamination’ by GM pollen would generate a revenue loss for growers of non-GM varieties. This paper analyses the problem of pollen-mediated gene flow as a particular type of production externality. The model, although simple, provides useful insights into coexistence policies. Following on from this and taking GM herbicide-tolerant oilseed rape (Brassica napus) as a model crop, a Monte Carlo simulation is used to generate data and then estimate the effect of several important policy variables (including width of buffer zones and spatial aggregation) on the magnitude of the externality associated with pollen-mediated gene flow.
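The Monte Carlo idea above can be sketched in a few lines: simulate cross-pollination rates that decay with buffer-zone width, and estimate the expected revenue loss (the externality) for a non-GM grower who forfeits the premium whenever adventitious GM presence exceeds a labelling threshold. The decay scale (25 m), base rate (2%), premium and 0.9% threshold are illustrative assumptions, not the paper's estimates.

```python
import math
import random

def expected_loss(buffer_m, trials=20000, premium=50.0, threshold=0.009,
                  seed=42):
    """Expected revenue loss per hectare for a non-GM oilseed rape grower."""
    rng = random.Random(seed)
    losses = 0.0
    for _ in range(trials):
        # Exponential decay of gene flow with buffer width, with
        # lognormal noise standing in for year-to-year variation.
        rate = 0.02 * math.exp(-buffer_m / 25.0) * rng.lognormvariate(0.0, 0.5)
        if rate > threshold:   # above the labelling threshold: premium lost
            losses += premium
    return losses / trials
```

Evaluating the function over a grid of buffer widths gives the simulated externality-versus-policy curve: wider buffers should shrink the expected loss, mirroring the role of the buffer-width policy variable in the paper's regressions.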
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (lagged correlations, linear inverse modelling and constructed analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which statistical method is most successful depends on the region considered, the GCM data used and the prediction lead time. However, the constructed analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different from the regions identified as potentially predictable from variance-explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skilful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far North Atlantic, suggesting that the more northern latitudes are optimal locations for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions and find that, again, this depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
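The constructed-analogue method mentioned above can be sketched as follows: write the current SST anomaly pattern as a least-squares combination of states from a "library" (e.g. a control run), then apply the same weights to the library states observed a lead time later. The data below are synthetic toy linear dynamics, not GCM output.

```python
import numpy as np

def constructed_analogue(library_now, library_later, state):
    """library_now, library_later: (n_samples, n_grid) arrays of library
    states and their evolutions one lead time later; state: (n_grid,).
    Returns the constructed-analogue prediction of state at that lead."""
    # Least-squares (minimum-norm) weights reproducing the current state.
    w, *_ = np.linalg.lstsq(library_now.T, state, rcond=None)
    return library_later.T @ w       # same weights applied to the future

# Toy test: library states evolved by a known linear operator A.
rng = np.random.default_rng(3)
A = rng.normal(size=(5, 5)) * 0.3
lib_now = rng.normal(size=(40, 5))
lib_later = lib_now @ A.T            # each library state advanced by A
state = rng.normal(size=5)
pred = constructed_analogue(lib_now, lib_later, state)
```

When the underlying evolution is linear and the library spans the state space, the method recovers the dynamics exactly; for GCM SST fields, the least-squares fit is typically regularised (e.g. in a truncated EOF space), a refinement omitted here.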
Abstract:
The IntFOLD-TS method was developed according to the guiding principle that model quality assessment would be the most critical stage of our template-based modelling pipeline. Thus, the IntFOLD-TS method firstly generates numerous alternative models, using in-house versions of several different sequence-structure alignment methods, which are then ranked in terms of global quality using our top-performing quality assessment method, ModFOLDclust2. In addition to the predicted global quality scores, predictions of local errors are also provided in the resulting coordinate files, using scores that represent the predicted deviation of each residue in the model from the equivalent residue in the native structure. The IntFOLD-TS method was found to generate high-quality 3D models for many of the CASP9 targets, whilst also providing highly accurate predictions of their per-residue errors. This important information may help to make the 3D models produced by the IntFOLD-TS method more useful for guiding future experimental work.
Abstract:
An efficient method of combining neutron diffraction data over an extended Q range with detailed atomistic models is presented. A quantitative and qualitative mapping of the organization of the chain conformation in both the glass and the liquid phase has been performed. The proposed structural refinement method is based on the exploitation of the intrachain features of the diffraction pattern through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Models are built stochastically by assignment of these internal coordinates from probability distributions with a limited number of variable parameters. Variation of these parameters is used in the construction of models that minimize the differences between the observed and calculated structure factors. A series of neutron scattering data sets for 1,4-polybutadiene over the region 20–320 K is presented. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 and 1.35 Å, respectively. Valence angles of the backbone were found to be 112° and 122.8° for CCC and CC=C, respectively. The three torsion angles corresponding to the double bond and the adjacent α and β bonds were found to occupy cis and trans; s± and trans; and g± and trans states, respectively. We compare our results with theoretical predictions, computer simulations, RIS models, and previously reported experimental results.