862 results for Prediction of Heterogeneous Variables System


Relevance: 100.00%

Abstract:

In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes to only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. 
When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
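The horizontal part of this entropy budget can be illustrated with a minimal two-box sketch (round, illustrative numbers, not the paper's CMIP3 diagnostics): a meridional heat flux Φ from a warm box at temperature T_w to a cold box at T_c produces entropy at the rate Φ(1/T_c − 1/T_w), and the associated Carnot-like efficiency is (T_w − T_c)/T_w.

```python
# Illustrative two-box sketch of horizontal entropy production, with
# made-up but climate-plausible numbers (not the paper's data).

def horizontal_entropy_production(phi, t_warm, t_cold):
    """Entropy production rate (W m^-2 K^-1) of a heat flux phi (W m^-2)
    flowing from a reservoir at t_warm (K) to one at t_cold (K)."""
    return phi * (1.0 / t_cold - 1.0 / t_warm)

def baroclinic_efficiency(t_warm, t_cold):
    """Carnot-like efficiency of the horizontal 'heat engine'."""
    return (t_warm - t_cold) / t_warm

phi = 30.0                      # poleward heat flux per unit area, W m^-2 (assumed)
t_warm, t_cold = 285.0, 270.0   # effective box temperatures, K (assumed)

s_horiz = horizontal_entropy_production(phi, t_warm, t_cold)
print(f"horizontal entropy production ~ {s_horiz * 1000:.1f} mW m-2 K-1")
print(f"baroclinic efficiency ~ {baroclinic_efficiency(t_warm, t_cold):.3f}")
```

With these assumed values the horizontal term comes out at a few mW m−2 K−1, i.e. roughly the ~10% share of the ~55 mW m−2 K−1 total quoted above, and the efficiency lands near the 0.055 clustering value.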

Relevance: 100.00%

Abstract:

This paper provides a solution for predicting moving/moving and moving/static collisions of objects within a virtual environment. Feasible prediction in real-time virtual worlds can be obtained by encompassing moving objects within a sphere and static objects within a convex polygon. Fast solutions are then attainable by describing the movement of objects parametrically in time as a polynomial.
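For the common special case of two sphere-bounded objects moving with constant velocities (degree-one polynomials in time), the earliest contact time reduces to the smaller root of a quadratic in t. A minimal sketch of that special case, not the paper's implementation:

```python
import math

def first_sphere_collision(p1, v1, r1, p2, v2, r2):
    """Earliest t >= 0 at which two spheres with constant velocities touch,
    or None if they never do. Positions/velocities are 3-tuples; the
    condition |relative position| = r1 + r2 is a quadratic in t."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    r = r1 + r2
    a = sum(c * c for c in dv)
    b = 2.0 * sum(p * v for p, v in zip(dp, dv))
    c = sum(p * p for p in dp) - r * r
    if c <= 0:
        return 0.0                          # already overlapping
    if a == 0:
        return None                         # no relative motion
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                         # paths never come within r
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None

# Head-on example: radius-1 spheres start 10 units apart, closing at 2 units/s.
t = first_sphere_collision((0, 0, 0), (1, 0, 0), 1.0, (10, 0, 0), (-1, 0, 0), 1.0)
print(t)  # 4.0 (surfaces touch when the centres are 2 units apart)
```

Higher-degree parametric motion leads to higher-degree polynomials in t, solved numerically rather than in closed form.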

Relevance: 100.00%

Abstract:

This paper explores the relationship between political ideology and planning in Britain and Sweden, with particular emphasis on the by-passing of the planning system. The prevailing ideology in each country over the last ten years is outlined and the impact on planning identified. The argument is then given in greater depth through case studies of two major projects. For Britain, this involves setting out the main features of Thatcherism and the way that this has changed the purposes underlying planning and created a diversified planning system. This is followed by a case study of Canary Wharf. For Sweden, the consensus culture and the emphasis on participation and decentralisation are discussed. The new planning legislation of 1987 is outlined. These aspects are then contrasted with the fiscal crisis and the development of 'negotiation planning'. These themes are illustrated by a case study of the Globe in Stockholm.

Relevance: 100.00%

Abstract:

A number of new and newly improved methods for predicting protein structure developed by the Jones–University College London group were used to make predictions for the CASP6 experiment. Structures were predicted with a combination of fold recognition methods (mGenTHREADER, nFOLD, and THREADER) and a substantially enhanced version of FRAGFOLD, our fragment assembly method. Attempts at automatic domain parsing were made using DomPred and DomSSEA, which are based on a secondary structure parsing algorithm and additionally for DomPred, a simple local sequence alignment scoring function. Disorder prediction was carried out using a new SVM-based version of DISOPRED. Attempts were also made at domain docking and “microdomain” folding in order to build complete chain models for some targets.

Relevance: 100.00%

Abstract:

Dynamically disordered regions appear to be relatively abundant in eukaryotic proteomes. The DISOPRED server allows users to submit a protein sequence, and returns a probability estimate of each residue in the sequence being disordered. The results are sent in both plain text and graphical formats, and the server can also supply predictions of secondary structure to provide further structural information.

Relevance: 100.00%

Abstract:

World-wide structural genomics initiatives are rapidly accumulating structures for which limited functional information is available. Additionally, state-of-the-art structural prediction programs are now capable of generating at least low-resolution structural models of target proteins. Accurate detection and classification of functional sites within both solved and modelled protein structures therefore represents an important challenge. We present a fully automatic site detection method, FuncSite, that uses neural network classifiers to predict the location and type of functionally important sites in protein structures. The method is designed primarily to require only backbone residue positions, without the need for specific side-chain atoms to be present. In order to highlight effective site detection in low-resolution structural models, FuncSite was used to screen model proteins generated using mGenTHREADER on a set of newly released structures. We found effective metal site detection even for moderate-quality protein models, illustrating the robustness of the method.

Relevance: 100.00%

Abstract:

A standard CDMA system is considered, and an extension of Pearson's results is used to determine the density function of the interference. The method is shown to work well in some cases but not in others. However, this approach can be useful for further determining the probability of error of the system with minimal computational requirements.
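Pearson's system selects a density family from the first four moments via the κ criterion. A sketch of that selection step, drawn from standard Pearson theory rather than from the paper's extension:

```python
def pearson_criterion(beta1, beta2):
    """Pearson's kappa criterion computed from beta1 (squared skewness)
    and beta2 (kurtosis); its value selects the Pearson curve family."""
    return beta1 * (beta2 + 3.0) ** 2 / (
        4.0 * (4.0 * beta2 - 3.0 * beta1) * (2.0 * beta2 - 3.0 * beta1 - 6.0))

def pearson_type(beta1, beta2, tol=1e-9):
    """Classify a distribution into a (coarse) Pearson type."""
    if beta1 < tol:                       # symmetric cases
        return "normal" if abs(beta2 - 3.0) < tol else "Type II/VII"
    k = pearson_criterion(beta1, beta2)
    if k < 0:
        return "Type I"
    if abs(k - 1.0) < tol:
        return "Type V"
    if k < 1:
        return "Type IV"
    return "Type VI"

print(pearson_type(0.0, 3.0))   # normal
print(pearson_type(0.5, 4.2))   # Type IV: a skewed, heavy-tailed density
```

Given sample moments of the multiple-access interference, the selected family can then be integrated numerically to approximate error probabilities, which is what keeps the computational cost low.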

Relevance: 100.00%

Abstract:

The paper reviews the leading diagramming methods employed in system dynamics to communicate the contents of models. The main ideas and historical development of the field are first outlined. Two diagramming methods—causal loop diagrams (CLDs) and stock/flow diagrams (SFDs)—are then described and their advantages and limitations discussed. A set of broad research directions is then outlined. These concern: the abilities of different diagrams to communicate different ideas, the role that diagrams have in group model building, and the question of whether diagrams can be an adequate substitute for simulation modelling. The paper closes by suggesting that although diagrams alone are insufficient, they have many benefits. However, since these benefits have emerged only as ‘craft wisdom’, a more rigorous programme of research into the diagrams' respective attributes is called for.
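The gap between diagramming and simulation that the paper raises can be made concrete: even the simplest SFD, a single stock governed by a goal-seeking balancing loop, exhibits quantitative behaviour (exponential approach to the goal) that the diagram alone only hints at. A minimal Euler-integration sketch, with parameter names and values chosen for illustration:

```python
def simulate_stock(goal=100.0, adjustment_time=5.0, stock=0.0, dt=0.25, steps=80):
    """Euler-integrate a single stock whose inflow closes the gap to a goal:
    the classic balancing ('goal-seeking') loop behind many stock/flow
    diagrams. Returns the stock trajectory."""
    history = [stock]
    for _ in range(steps):
        inflow = (goal - stock) / adjustment_time   # the balancing loop
        stock += inflow * dt
        history.append(stock)
    return history

trajectory = simulate_stock()
print(round(trajectory[-1], 1))  # approaches, but never quite reaches, 100
```

A CLD of this system shows one balancing loop; only the simulation reveals the time constant and the shape of the approach, which is the kind of added value the research directions above ask to be pinned down rigorously.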

Relevance: 100.00%

Abstract:

In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups, counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity, a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model give insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.

Relevance: 100.00%

Abstract:

Matei et al. (Reports, 6 January 2012, p. 76) claim to show skillful multiyear predictions of the Atlantic Meridional Overturning Circulation (AMOC). However, these claims are not justified, primarily because the predictions of AMOC transport do not outperform simple reference forecasts based on climatological annual cycles. Accordingly, there is no justification for the “confident” prediction of a stable AMOC through 2014.
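The comment's central criterion, whether a forecast outperforms a simple climatological reference, is typically quantified with a mean-square skill score. A sketch with invented numbers, not Matei et al.'s data:

```python
def msss(forecast, observed, climatology):
    """Mean-square skill score of a forecast against a reference
    climatological forecast: 1 - MSE_fc / MSE_ref. A score of 1 is a
    perfect forecast; <= 0 means no better than the reference."""
    mse = lambda f: sum((a - b) ** 2 for a, b in zip(f, observed)) / len(observed)
    return 1.0 - mse(forecast) / mse(climatology)

obs  = [17.0, 18.5, 16.0, 17.5]   # verifying AMOC transports in Sv (invented)
clim = [17.2, 17.2, 17.2, 17.2]   # climatological-annual-cycle reference
fc   = [17.1, 18.0, 16.4, 17.4]   # the forecast being evaluated

print(round(msss(fc, obs, clim), 2))  # 0.87
```

The comment's point is that when the reference is the climatological annual cycle rather than a constant, the published AMOC predictions yield scores at or below zero, i.e. no demonstrated skill.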

Relevance: 100.00%

Abstract:

We diagnose forcing and climate feedbacks in benchmark sensitivity experiments with the new Met Office Hadley Centre Earth system climate model HadGEM2-ES. To identify the impact of newly included biogeophysical and chemical processes, results are compared to a parallel set of experiments performed with these processes switched off, and with different couplings to the biogeochemistry. In abrupt carbon dioxide quadrupling experiments we find that the inclusion of these processes does not alter the global climate sensitivity of the model. However, when the change in carbon dioxide is uncoupled from the vegetation, or when the model is forced with a non-carbon dioxide forcing – an increase in solar constant – new feedbacks emerge that make the climate system less sensitive to external perturbations. We identify a strong negative dust-vegetation feedback on climate change that is small in standard carbon dioxide sensitivity experiments due to the physiological/fertilization effects of carbon dioxide on plants in this model.
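The abstract does not spell out its diagnosis method, but forcing and feedback in abrupt-CO2 experiments are commonly estimated by regressing the top-of-atmosphere net flux imbalance N against surface warming ΔT (the Gregory regression method). A generic sketch with synthetic, exactly linear data:

```python
def gregory_regression(delta_t, net_toa):
    """Ordinary least squares of N = F + lambda * dT. Returns the effective
    forcing F (intercept), feedback parameter lambda (slope), and the
    implied equilibrium warming -F/lambda."""
    n = len(delta_t)
    mt = sum(delta_t) / n
    mn = sum(net_toa) / n
    cov = sum((t - mt) * (x - mn) for t, x in zip(delta_t, net_toa))
    var = sum((t - mt) ** 2 for t in delta_t)
    lam = cov / var                 # W m^-2 K^-1; negative for a stable climate
    f = mn - lam * mt               # W m^-2; imbalance extrapolated to dT = 0
    return f, lam, -f / lam

# Synthetic abrupt-CO2-like data with F = 7 W m^-2 and lambda = -1 W m^-2 K^-1.
dT = [1.0, 2.5, 3.5, 4.5, 5.5, 6.0]
N = [7.0 - 1.0 * t for t in dT]
f, lam, eq_warming = gregory_regression(dT, N)
print(round(f, 2), round(lam, 2), round(eq_warming, 2))  # 7.0 -1.0 7.0
```

An unchanged slope between the coupled and switched-off experiments is what "does not alter the global climate sensitivity" would look like in such a diagnosis.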

Relevance: 100.00%

Abstract:

The ECMWF operational grid point model (with a resolution of 1.875° of latitude and longitude) and its limited area version (with a resolution of 0.47° of latitude and longitude) with boundary values from the global model have been used to study the simulation of Typhoon Tip. The fine-mesh model was capable of simulating the main structural features of the typhoon and predicting a fall in central pressure of 60 mb in 3 days. The structure of the forecast typhoon, with a warm core (maximum potential temperature anomaly 17 K), intense swirling wind (maximum 55 m s−1 at 850 mb) and spiralling precipitation patterns, is characteristic of a tropical cyclone. Comparison with the lower resolution forecast shows that the horizontal resolution is a determining factor in predicting not only the structure and intensity but even the movement of these vortices. However, an accurate and refined initial analysis is considered to be a prerequisite for a correct forecast of this phenomenon.

Relevance: 100.00%

Abstract:

A series of numerical models has been used to investigate the predictability of atmospheric blocking for an episode selected from FGGE Special Observing Period I. Level II-b FGGE data have been used in the experiment. The blocking took place over the North Atlantic region and is a very characteristic example of high winter blocking. It is found that the very high resolution models developed at ECMWF manage, in a remarkable way, to predict the blocking event in great detail, even beyond 1 week. Although models with much less resolution manage to predict the blocking phenomenon as such, the actual evolution differs very much from the observed one, and consequently the practical value is substantially reduced. Wind observations from the geostationary satellites, as well as an extension of the integration domain to the whole globe, are shown to have a substantial impact on the forecast beyond 5 days. Quasi-geostrophic baroclinic models and, even more so, barotropic models are totally inadequate for predicting blocking except in its initial phase. The prediction experiment illustrates clearly that the efforts which have gone into the improvement of numerical prediction models in the last decades have been worthwhile.