940 results for global nonhydrostatic model
Abstract:
The IntFOLD-TS method was developed according to the guiding principle that model quality assessment would be the most critical stage of our template-based modelling pipeline. Thus, the IntFOLD-TS method first generates numerous alternative models, using in-house versions of several different sequence-structure alignment methods, which are then ranked in terms of global quality using our top-performing quality assessment method – ModFOLDclust2. In addition to the predicted global quality scores, predictions of local errors are also provided in the resulting coordinate files, using scores that represent the predicted deviation of each residue in the model from the equivalent residue in the native structure. The IntFOLD-TS method was found to generate high quality 3D models for many of the CASP9 targets, whilst also providing highly accurate predictions of their per-residue errors. This important information may help to make the 3D models produced by the IntFOLD-TS method more useful for guiding future experimental work.
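Per-residue error estimates of this kind are conventionally carried in the B-factor column of each ATOM record of the model coordinate file. A minimal sketch of that convention, using the fixed-column PDB record layout with invented coordinates:

```python
# Sketch: write a predicted per-residue error into the B-factor column
# (columns 61-66) of a PDB ATOM record. Coordinates and values are invented.

def atom_record(serial, name, res_name, chain, res_seq, x, y, z, pred_err):
    """Format one fixed-column PDB ATOM line; pred_err fills the B-factor field."""
    return (f"ATOM  {serial:5d} {name:^4s}{res_name:>4s} {chain}"
            f"{res_seq:4d}    {x:8.3f}{y:8.3f}{z:8.3f}{1.00:6.2f}{pred_err:6.2f}")

# A C-alpha atom whose predicted deviation from the native structure is 2.50 A
line = atom_record(1, "CA", "ALA", "A", 1, 11.104, 6.134, 1.711, 2.50)
```

A downstream viewer colouring by B-factor then shows the predicted error directly on the model.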
Abstract:
Six land surface models and five global hydrological models participate in a model intercomparison project (WaterMIP), which for the first time compares simulation results of these different classes of models in a consistent way. In this paper the simulation setup is described and aspects of the multi-model global terrestrial water balance are presented. All models were run at 0.5 degree spatial resolution for the global land areas for a 15-year period (1985-1999) using a newly-developed global meteorological dataset. Simulated global terrestrial evapotranspiration, excluding Greenland and Antarctica, ranges from 415 to 586 mm year⁻¹ (60,000 to 85,000 km³ year⁻¹) and simulated runoff ranges from 290 to 457 mm year⁻¹ (42,000 to 66,000 km³ year⁻¹). Both the mean and median runoff fractions for the land surface models are lower than those of the global hydrological models, although the range is wider. Significant simulation differences between land surface and global hydrological models are found to be caused by the snow scheme employed. The physically-based energy balance approach used by land surface models generally results in lower snow water equivalent values than the conceptual degree-day approach used by global hydrological models. Some differences in simulated runoff and evapotranspiration are explained by model parameterizations, although the processes included and parameterizations used are not distinct to either land surface models or global hydrological models. The results show that differences between models are major sources of uncertainty. Climate change impact studies thus need to use not only multiple climate models, but also some other measure of uncertainty (e.g. multiple impact models).
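The conceptual degree-day snow scheme contrasted above fits in a few lines: melt is proportional to the temperature excess over a melt threshold, capped by the available snow. The melt factor and threshold below are illustrative assumptions, not values from any participating model.

```python
# Minimal degree-day snow melt sketch. DDF (degree-day factor) and the melt
# threshold temperature are assumed illustrative values.

def degree_day_melt(swe_mm, temp_c, ddf_mm_per_degc_day=3.0, t_melt_c=0.0):
    """Return (melt, remaining SWE) in mm for one day.

    Melt is proportional to the temperature excess above the threshold,
    capped by the available snow water equivalent (SWE).
    """
    potential_melt = max(0.0, ddf_mm_per_degc_day * (temp_c - t_melt_c))
    melt = min(swe_mm, potential_melt)
    return melt, swe_mm - melt

# A +5 degC day melts 15 mm from a 50 mm snowpack
melt, swe = degree_day_melt(50.0, 5.0)
```

The energy-balance alternative used by land surface models would instead close a full surface energy budget (radiation, turbulent and ground heat fluxes) to obtain the energy available for melt.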
Abstract:
A description is given of the global atmospheric electric circuit operating between the Earth’s surface and the ionosphere. Attention is drawn to the huge range of horizontal and vertical spatial scales, from 10⁻⁹ m to 10¹² m, involved in the many important processes at work. A similarly enormous range of time scales, from 10⁻⁶ s to 10⁹ s, is involved in the physical effects and different phenomena that need to be considered. The current flowing in the global circuit is generated by disturbed weather such as thunderstorms and electrified rain/shower clouds, mostly occurring over the Earth’s land surface. The profile of electrical conductivity up through the atmosphere, determined mainly by galactic cosmic ray ionization, is a crucial parameter of the circuit. Model simulation results on the variation of the ionospheric potential, ∼250 kV positive with respect to the Earth’s potential, following lightning discharges and sprites are summarized. Experimental results comparing global circuit variations with the neutron rate recorded at Climax, Colorado, are then discussed. Within the return (load) part of the circuit in the fair weather regions remote from the generators, charge layers exist on the upper and lower edges of extensive layer clouds; new experimental evidence for these charge layers is also reviewed. Finally, some directions for future research in the subject are suggested.
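In its simplest DC form the circuit obeys Ohm's law: the ionospheric potential is the product of the total generator current and the total fair-weather (load) resistance. The round numbers below are assumptions chosen only to reproduce the quoted ∼250 kV, not measured values:

```python
# Ohm's-law sketch of the DC global circuit: V = I * R.
# Both figures below are assumed order-of-magnitude values.

TOTAL_CURRENT_A = 1250.0      # assumed total thunderstorm generator current
TOTAL_RESISTANCE_OHM = 200.0  # assumed total fair-weather load resistance

ionospheric_potential_kv = TOTAL_CURRENT_A * TOTAL_RESISTANCE_OHM / 1000.0
```

The load resistance itself is the parallel combination of columnar resistances over the fair-weather regions, which is why the conductivity profile set by cosmic-ray ionization is such a crucial parameter.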
Abstract:
The objective of this paper is to reconsider the Maximum Entropy Production conjecture (MEP) in the context of a very simple two-dimensional zonal-vertical climate model able to represent the total material entropy production due at the same time to both horizontal and vertical heat fluxes. MEP is applied first to a simple four-box model of climate which accounts for both horizontal and vertical material heat fluxes. It is shown that, under condition of fixed insolation, a MEP solution is found with reasonably realistic temperature and heat fluxes, thus generalising results from independent two-box horizontal or vertical models. It is also shown that the meridional and the vertical entropy production terms are independently involved in the maximisation and thus MEP can be applied to each subsystem with fixed boundary conditions. We then extend the four-box model by increasing its resolution, and compare it with GCM output. A MEP solution is found which is fairly realistic as far as the horizontal large scale organisation of the climate is concerned, whereas the vertical structure appears unrealistic and presents seriously unstable features. This study suggests that the thermal meridional structure of the atmosphere is predicted fairly well by MEP once the insolation is given, but the vertical structure of the atmosphere cannot be predicted satisfactorily by MEP unless constraints are imposed to represent the determination of longwave absorption by water vapour and clouds as a function of the state of the climate. Furthermore, an order-of-magnitude estimate of contributions to the material entropy production due to horizontal and vertical processes within the climate system is provided by using two different methods. In both cases we found that approximately 40 mW m⁻² K⁻¹ of material entropy production is due to vertical heat transport and 5–7 mW m⁻² K⁻¹ to horizontal heat transport.
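The two-box flavour of the MEP argument that the four-box model generalises can be sketched directly: prescribe the absorbed solar flux in each box, close each box's energy balance with a linearised outgoing longwave law, and search for the meridional heat flux that maximises entropy production. All parameter values below are illustrative assumptions, not the paper's.

```python
# Two-box horizontal MEP sketch. Absorbed solar fluxes s1, s2 and the
# linearised OLR coefficients a, b are assumed illustrative values.

def entropy_production(f, s1=330.0, s2=150.0, a=200.0, b=2.0):
    """Material entropy production (W m-2 K-1) for meridional flux f (W m-2).

    Box temperatures follow from each box's energy balance with a
    linearised outgoing longwave law OLR = a + b*(T - 273.15).
    """
    t1 = 273.15 + (s1 - a - f) / b   # warm (low-latitude) box, K
    t2 = 273.15 + (s2 - a + f) / b   # cold (high-latitude) box, K
    return f * (1.0 / t2 - 1.0 / t1)

# Grid search for the entropy-maximising flux
fluxes = [0.1 * i for i in range(900)]
f_mep = max(fluxes, key=entropy_production)
```

The maximum lies between zero flux (no dissipation) and the flux that equalises the two temperatures (no thermodynamic gradient left), which is the essence of the MEP selection principle.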
Abstract:
We present an intercomparison and verification analysis of 20 GCMs (Global Circulation Models) included in the 4th IPCC assessment report regarding their representation of the hydrological cycle on the Danube river basin for 1961–2000 and for the 2161–2200 SRESA1B scenario runs. The basin-scale properties of the hydrological cycle are computed by spatially integrating the precipitation, evaporation, and runoff fields using the Voronoi-Thiessen tessellation formalism. The span of the model-simulated mean annual water balances is of the same order of magnitude as the observed Danube discharge at the Delta; the true value is within the range simulated by the models. Some land components seem to have deficiencies, since there are cases of violation of water conservation when annual means are considered. The overall performance and the degree of agreement of the GCMs are comparable to those of the RCMs (Regional Climate Models) analyzed in a previous work, in spite of the much higher resolution and common nesting of the RCMs. The reanalyses are shown to feature several inconsistencies and cannot be used as a verification benchmark for the hydrological cycle in the Danubian region. In the scenario runs, for basically all models the water balance decreases, whereas its interannual variability increases. Changes in the strength of the hydrological cycle are not consistent among models: it is confirmed that capturing the impact of climate change on the hydrological cycle is not an easy task over land areas. Moreover, in several cases we find that qualitatively different behaviors emerge among the models: the ensemble mean does not represent any sort of average model, and often it falls between the models’ clusters.
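The basin-scale integration step amounts to an area-weighted mean: each grid point or station value is weighted by the area of its Voronoi-Thiessen polygon. A toy sketch with invented areas and precipitation values:

```python
# Area-weighted basin aggregation in the spirit of the Voronoi-Thiessen
# formalism. Polygon areas and precipitation values below are invented.

def basin_mean(values_mm, areas_km2):
    """Area-weighted mean of point values over a river basin."""
    total_area = sum(areas_km2)
    return sum(v * a for v, a in zip(values_mm, areas_km2)) / total_area

# Three Thiessen polygons: annual precipitation (mm) and polygon areas (km2)
precip = [600.0, 800.0, 1000.0]
areas = [1000.0, 2000.0, 1000.0]
mean_p = basin_mean(precip, areas)
```

Applying the same weighting to precipitation, evaporation, and runoff gives mutually consistent basin-scale water-balance terms.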
Abstract:
International competitiveness ultimately depends upon the linkages between a firm’s unique, idiosyncratic capabilities (firm-specific advantages, FSAs) and its home country assets (country-specific advantages, CSAs). In this paper, we present a modified FSA/CSA matrix building upon the FSA/CSA matrix (Rugman 1981). We relate this to the diamond framework for national competitiveness (Porter 1990), and the double diamond model (Rugman and D’Cruz 1993). We provide empirical evidence to demonstrate the merits and usefulness of the modified FSA/CSA matrix using the Fortune Global 500 firms. We examine the FSAs based on the geographic scope of sales and CSAs that can lead to national, home region, and global competitiveness. Our empirical analysis suggests that the world’s largest 500 firms have increased their firm-level international competitiveness. However, much of this is still being achieved within their home region. In other words, international competitiveness is a regional not a global phenomenon. Our findings have significant implications for research and practice. Future research in international marketing should take into account the multi-faceted nature of FSAs and CSAs across different levels. For MNE managers, our study provides useful insights for strategic marketing planning and implementation.
Abstract:
Leisure is in the vanguard of a social and cultural revolution which is replacing the former East/West political bipolarity with a globalised economic system in which the new Europe has a central rôle. Within this revolution, leisure, including recreation, culture and tourism, is constructed as the epitome of successful capitalist development; the very legitimisation of the global transmogrification from a production to a consumption orientation. While acting as a direct encouragement to the political transformation in many eastern European states, it is uncertain how the issue of leisure policy is being handled, given its centrality to the new economic order. This paper therefore examines the experience of western Europe, considering in particular the degree to which the newly-created Department of National Heritage in the UK provides a potential model for leisure development and policy integration in the new Europe. Despite an official rhetoric of support and promotion of leisure activities, reflecting the growing economic significance of tourism and the positive relationship between leisure provision and regional economic development, the paper establishes that in place of the traditional rôle of the state in promoting leisure interests, the introduction of the Department has signified a shift to the use of leisure to promote the Government's interests, particularly in regenerating citizen rights claims towards the market. While an institution such as the Department of National Heritage may have relevance to emerging states as an element in the maintenance of political hegemony, therefore, it is questionable how far it can be viewed as a promoter or protector of leisure as a signifier of a newly-won political, economic and cultural freedom throughout Europe.
Abstract:
This paper reviews the impact of the global financial crisis on financial system reform in China. Scholars and practitioners have critically questioned the efficiencies of the Anglo-American principal-agent model of corporate governance, which promotes shareholder-value maximisation. Should China continue to follow the U.K.-U.S. path in relation to financial reform? This conceptual paper provides an insightful review of the corporate governance literature, regulatory reports and news articles from the financial press. After examining the fundamental limitations of the laissez-faire philosophy that underpins the neo-liberal model of capitalism, the paper considers the risks in opening up China’s financial markets and relaxing monetary and fiscal policies. The paper outlines a critique of shareholder-capitalism in relation to the German team-production model of corporate governance, promoting a “social market economy” styled capitalism. Through such analysis, the paper explores numerous implications for China to consider in terms of developing a new and sustainable corporate governance model. China needs to follow its own financial reform through understanding its particular economy. The global financial crisis might help China rethink the nature of corporate governance, identify its weaknesses and assess the current reform agenda.
Abstract:
For data assimilation in numerical weather prediction, the initial forecast-error covariance matrix Pf is required. For variational assimilation it is particularly important to prescribe an accurate initial matrix Pf, since Pf is either static (in the 3D-Var case) or constant at the beginning of each assimilation window (in the 4D-Var case). At large scales the atmospheric flow is well approximated by hydrostatic balance and this balance is strongly enforced in the initial matrix Pf used in operational variational assimilation systems such as that of the Met Office. However, at convective scales this balance does not necessarily hold any more. Here we examine the extent to which hydrostatic balance is valid in the vertical forecast-error covariances for high-resolution models in order to determine whether there is a need to relax this balance constraint in convective-scale data assimilation. We use the Met Office Global and Regional Ensemble Prediction System (MOGREPS) and a 1.5 km resolution version of the Unified Model for a case study characterized by the presence of convective activity. An ensemble of high-resolution forecasts valid up to three hours after the onset of convection is produced. We show that at 1.5 km resolution hydrostatic balance does not hold for forecast errors in regions of convection. This indicates that in the presence of convection hydrostatic balance should not be enforced in the covariance matrix used for variational data assimilation at this scale. The results show the need to investigate covariance models that may be better suited for convective-scale data assimilation. Finally, we give a measure of the balance present in the forecast perturbations as a function of the horizontal scale (from 3–90 km) using a set of diagnostics. Copyright © 2012 Royal Meteorological Society and British Crown Copyright, the Met Office
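A hydrostatic-balance diagnostic of the kind discussed above boils down to checking how far dp/dz departs from −ρg. The sketch below applies the check to an idealised isothermal profile (assumed values, not Met Office or MOGREPS data), where the residual should vanish to discretisation error:

```python
import math

# Hydrostatic-balance residual check on an assumed isothermal atmosphere.
G = 9.81          # gravitational acceleration, m s-2
RD = 287.0        # gas constant for dry air, J kg-1 K-1
T0 = 250.0        # assumed isothermal temperature, K
P0 = 100000.0     # surface pressure, Pa

def pressure(z):
    """Isothermal hydrostatic pressure profile (Pa) at height z (m)."""
    return P0 * math.exp(-G * z / (RD * T0))

def hydrostatic_residual(z, dz=1.0):
    """Finite-difference dp/dz plus rho*g; ~0 when the profile is hydrostatic."""
    dpdz = (pressure(z + dz) - pressure(z - dz)) / (2.0 * dz)
    rho = pressure(z) / (RD * T0)   # ideal gas law
    return dpdz + rho * G

res = hydrostatic_residual(5000.0)
```

Applied instead to ensemble forecast perturbations in a convective region, a residual of this kind remaining large is precisely the signature that hydrostatic balance should not be enforced in the covariance model.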
Abstract:
Motivation: Modelling the 3D structures of proteins can often be enhanced if more than one fold template is used during the modelling process. However, in many cases, this may also result in poorer model quality for a given target or alignment method. There is a need for modelling protocols that can both consistently and significantly improve 3D models and provide an indication of when models might not benefit from the use of multiple target-template alignments. Here, we investigate the use of both global and local model quality prediction scores produced by ModFOLDclust2 to improve the selection of target-template alignments for the construction of multiple-template models. Additionally, we evaluate clustering the resulting population of multi- and single-template models for the improvement of our IntFOLD-TS tertiary structure prediction method. Results: We find that using accurate local model quality scores to guide alignment selection is the most consistent way to significantly improve models for each of the sequence-to-structure alignment methods tested. In addition, using accurate global model quality scores for re-ranking alignments, prior to selection, further improves the majority of multi-template modelling methods tested. Furthermore, subsequent clustering of the resulting population of multiple-template models significantly improves the quality of selected models compared with the previous version of our tertiary structure prediction method, IntFOLD-TS.
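The clustering idea behind consensus quality assessment can be illustrated in miniature: score each model in a population by its mean pairwise similarity to all the others, then select the top scorer. The similarity matrix below is invented; a real method such as ModFOLDclust2 derives such scores from structural superpositions of the models.

```python
# Toy consensus scoring over a population of candidate models.
# similarity[i][j] in [0, 1] is an invented pairwise structural similarity.

def consensus_scores(similarity):
    """Score each model by its mean similarity to every other model."""
    n = len(similarity)
    return [sum(similarity[i][j] for j in range(n) if j != i) / (n - 1)
            for i in range(n)]

sim = [
    [1.0, 0.8, 0.7],
    [0.8, 1.0, 0.6],
    [0.7, 0.6, 1.0],
]
scores = consensus_scores(sim)
best = max(range(len(scores)), key=scores.__getitem__)
```

The intuition is that models resembling many independently built alternatives are more likely to be near the native structure than outliers.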
Abstract:
During the 20th century, solar activity increased in magnitude to a so-called grand maximum. It is probable that this high level of solar activity is at or near its end. It is of great interest whether any future reduction in solar activity could have a significant impact on climate that could partially offset the projected anthropogenic warming. Observations and reconstructions of solar activity over the last 9000 years are used as a constraint on possible future variations to produce probability distributions of total solar irradiance over the next 100 years. Using this information, with a simple climate model, we present results of the potential implications for future projections of climate on decadal to multidecadal timescales. Using one of the most recent reconstructions of historic total solar irradiance, the likely reduction in the warming by 2100 is found to be between 0.06 and 0.1 K, a very small fraction of the projected anthropogenic warming. However, if past total solar irradiance variations are larger and climate models substantially underestimate the response to solar variations, then there is a potential for a reduction in solar activity to mitigate a small proportion of the future warming, a scenario we cannot totally rule out. While the Sun is not expected to provide substantial delays in the time to reach critical temperature thresholds, any small delays it might provide are likely to be greater for lower anthropogenic emissions scenarios than for higher-emissions scenarios.
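The scale of such an offset can be checked with a back-of-envelope equilibrium calculation: convert a TSI drop into a radiative forcing via the planetary albedo and the geometric factor of 4, then multiply by a climate sensitivity. The sensitivity and the 1 W m⁻² TSI drop below are assumed textbook-style values, not the paper's reconstruction:

```python
# Equilibrium cooling from a reduction in total solar irradiance (TSI).
# Sensitivity and albedo are assumed conventional values.

CLIMATE_SENSITIVITY_K_PER_WM2 = 0.8  # assumed equilibrium sensitivity, K/(W m-2)
PLANETARY_ALBEDO = 0.3

def warming_offset_k(delta_tsi_wm2):
    """Equilibrium cooling (K) from a TSI reduction (W m-2)."""
    forcing = delta_tsi_wm2 * (1.0 - PLANETARY_ALBEDO) / 4.0  # global-mean forcing
    return CLIMATE_SENSITIVITY_K_PER_WM2 * forcing

offset = warming_offset_k(1.0)
```

With these assumptions a 1 W m⁻² TSI drop offsets only about 0.14 K of warming, the same order as the 0.06–0.1 K range quoted above.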
Abstract:
Many studies warn that climate change may undermine global food security. Much work on this topic focuses on modelling crop-weather interactions, but these models do not generally account for the ways in which socio-economic factors influence how harvests are affected by weather. To address this gap, this paper uses a quantitative harvest vulnerability index based on annual soil moisture and grain production data as the dependent variable in a Linear Mixed Effects model with national-scale socio-economic data as independent variables for the period 1990-2005. Results show that rice, wheat and maize production in middle income countries were especially vulnerable to droughts. By contrast, harvests in countries with higher investments in agriculture (e.g. higher fertilizer use) were less vulnerable to drought. In terms of differences between the world's major grain crops, the factors that made rice and wheat crops vulnerable to drought were quite consistent, whilst those for maize crops varied considerably depending on the type of region, likely because maize is produced under very different conditions worldwide. One recommendation for reducing drought vulnerability risks is coordinated development and adaptation policies, including institutional support that enables farmers to take proactive action.
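A harvest vulnerability index of the general kind described can be illustrated as the regression slope of production anomalies on soil-moisture anomalies: a steeper positive slope means harvests fall further per unit of soil-moisture deficit. The data below are invented, and this single-slope sketch stands in for the paper's full Linear Mixed Effects specification.

```python
# Toy vulnerability index: OLS slope of production anomalies on
# soil-moisture anomalies. All numbers below are invented.

def anomaly(xs):
    """Deviations from the series mean."""
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

def vulnerability_index(production, soil_moisture):
    """OLS slope of production anomaly on soil-moisture anomaly."""
    dp, ds = anomaly(production), anomaly(soil_moisture)
    return sum(p * s for p, s in zip(dp, ds)) / sum(s * s for s in ds)

prod = [100.0, 90.0, 110.0, 80.0]   # annual grain production (arbitrary units)
sm = [1.0, 0.8, 1.2, 0.6]           # annual soil moisture (arbitrary units)
vi = vulnerability_index(prod, sm)
```

In the full model, slopes like this would vary with national-scale socio-economic covariates (e.g. fertilizer use), which is how the paper attributes differences in drought vulnerability.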
Abstract:
A detailed analysis is undertaken of the Atlantic-European climate using data from 500-year-long proxy-based climate reconstructions, a long climate simulation with perpetual 1990 forcing, as well as two global and one regional climate change scenarios. The observed and simulated interannual variability and teleconnectivity are compared and interpreted in order to improve the understanding of natural climate variability on interannual to decadal time scales for the late Holocene. The focus is set on the Atlantic-European and Alpine regions during the winter and summer seasons, using temperature, precipitation, and 500 hPa geopotential height fields. The climate reconstruction shows pronounced interdecadal variations that appear to “lock” the atmospheric circulation in quasi-steady long-term patterns over multi-decadal periods controlling at least part of the temperature and precipitation variability. Different circulation patterns are persistent over several decades for the period 1500 to 1900. The 500-year-long simulation with perpetual 1990 forcing shows some substantial differences, with a more unsteady teleconnectivity behaviour. Two global scenario simulations indicate a transition towards more stable teleconnectivity for the next 100 years. Time series of reconstructed and simulated temperature and precipitation over the Alpine region show comparatively small changes in interannual variability within the time frame considered, with the exception of the summer season, where a substantial increase in interannual variability is simulated by regional climate models.