38 results for MPI
in CentAUR: Central Archive University of Reading - UK
Abstract:
Recent research in multi-agent systems incorporates fault-tolerance concepts, but does not explore the extension and implementation of such ideas for large-scale parallel computing systems. The work reported in this paper investigates a swarm array computing approach, namely 'Intelligent Agents'. A task to be executed on a parallel computing system is decomposed into sub-tasks and mapped onto agents that traverse an abstracted hardware layer. The agents intercommunicate across processors to share information in the event of a predicted core/processor failure and to complete the task successfully. The feasibility of the approach is validated by implementing a parallel reduction algorithm on a computer cluster using the Message Passing Interface.
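For orientation, below is a minimal sketch of the parallel reduction pattern that the paper uses as its validation case, written against the standard C MPI API. It shows only the plain MPI_Reduce idiom, not the agent-based fault-tolerant scheme the abstract describes; the data and sub-task sizes are illustrative assumptions.

    /* Minimal MPI parallel reduction sketch: each rank computes a
     * partial sum over its own sub-task, and MPI_Reduce combines the
     * partial results on rank 0. Build with: mpicc reduce.c -o reduce */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Toy sub-task: each rank sums 1000 values from its own range. */
        long local = 0, global = 0;
        for (long i = rank * 1000L; i < (rank + 1) * 1000L; i++)
            local += i;

        /* Combine all partial sums onto rank 0. */
        MPI_Reduce(&local, &global, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("global sum over %d ranks: %ld\n", size, global);

        MPI_Finalize();
        return 0;
    }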
Abstract:
Synoptic activity over the Northern Hemisphere is evaluated in ensembles of ECHAM5/MPI-OM1 simulations for recent climate conditions (20C) and for three climate scenarios (following SRES A1B, A2, B1). A close agreement is found between the simulations for present-day climate and the respective results from reanalysis. Significant changes in the winter mid-tropospheric storm tracks are detected in all three scenario simulations. Ensemble mean climate signals are rather similar, with particularly large activity increases downstream of the Atlantic storm track over Western Europe. The magnitude of this signal largely depends on the imposed change in forcing. However, differences between individual ensemble members may be large. With respect to the surface cyclones, the scenario runs produce a reduction in cyclonic track density over the mid-latitudes, even in the areas with increasing mid-tropospheric activity. The largest decrease in track densities occurs at subtropical latitudes, e.g. over the Mediterranean Basin. An increase of cyclone intensities is detected for limited areas (e.g. near Great Britain and the Aleutian Islands) for the A1B and A2 experiments. The changes in synoptic activity are associated with alterations of the Northern Hemisphere circulation and background conditions (blocking frequencies, jet stream). The North Atlantic Oscillation index also shows increased values with enhanced forcing. With respect to the effects of changing synoptic activity, the regional change in cyclone intensities is accompanied by alterations of the extreme surface winds, with increasing values over Great Britain, the North and Baltic Seas, and the areas with vanishing sea ice, and decreases over much of the subtropics.
Abstract:
A simple storm loss model is applied to an ensemble of ECHAM5/MPI-OM1 GCM simulations in order to estimate changes of insured loss potentials over Europe in the 21st century. Losses are computed based on the daily maximum wind speed for each grid point. The calibration of the loss model is performed using wind data from the ERA40 reanalysis and German loss data. The obtained annual losses for the present climate conditions (20C, three realisations) reproduce the statistical features of the historical insurance loss data for Germany. The climate change experiments correspond to the SRES scenarios A1B and A2, and for each of them three realisations are considered. On average, insured loss potentials increase for all analysed European regions at the end of the 21st century. Changes are largest for Germany and France, and lowest for Portugal/Spain. Additionally, the spread between the single realisations is large, ranging, e.g. for Germany, from −4% to +43% in terms of mean annual loss. Moreover, almost all simulations show an increasing interannual variability of storm damage. This effect is even more pronounced if no adaptation of building structure to climate change is considered. The increased loss potentials are linked with enhanced values for the high percentiles of surface wind maxima over Western and Central Europe, which in turn are associated with an enhanced number and increased intensity of extreme cyclones over the British Isles and the North Sea.
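The abstract does not spell the loss model out, but grid-point storm loss models of this kind are commonly built on the excess of the daily maximum wind speed over a high local percentile, raised to the third power. The C sketch below illustrates only that generic form; the 98th-percentile threshold, the cubic exponent, the exposure weights, and the calibration constant are assumptions made here for illustration, not details taken from the paper.

    /* Illustrative grid-point storm loss index (assumed form):
     * loss ~ c * pop_i * max(v_i / v98_i - 1, 0)^3, where v_i is the
     * daily maximum wind speed at grid point i and v98_i its local
     * 98th percentile from a calibration period. */
    #include <stdio.h>
    #include <stddef.h>

    static double daily_loss_index(const double *vmax, const double *v98,
                                   const double *pop, size_t n, double c)
    {
        double loss = 0.0;
        for (size_t i = 0; i < n; i++) {
            double excess = vmax[i] / v98[i] - 1.0;   /* relative exceedance */
            if (excess > 0.0)
                loss += c * pop[i] * excess * excess * excess;
        }
        return loss;
    }

    int main(void)
    {
        double vmax[3] = {30.0, 22.0, 35.0};   /* daily max wind, m/s */
        double v98[3]  = {25.0, 24.0, 26.0};   /* local 98th percentiles */
        double pop[3]  = {1e5, 5e4, 2e5};      /* exposure weights */
        printf("loss index: %f\n", daily_loss_index(vmax, v98, pop, 3, 1.0));
        return 0;
    }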
Abstract:
This paper details a strategy for modifying the source code of a complex model so that the model may be used in a data assimilation context, and gives the standards for implementing a data assimilation code to use such a model. The strategy relies on keeping the model separate from any data assimilation code, and coupling the two through the use of Message Passing Interface (MPI) functionality. This strategy limits the changes necessary to the model and as such is rapid to program, at the expense of ultimate performance. The implementation technique is applied in different models with state dimension up to $2.7 \times 10^8$. The overheads added by using this implementation strategy in a coupled ocean-atmosphere climate model are shown to be an order of magnitude smaller than the addition of correlated stochastic random errors necessary for some nonlinear data assimilation techniques.
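A minimal sketch of this coupling pattern is given below, assuming the model and the data assimilation code are launched as one MPMD job and separated by MPI_Comm_split. The colour convention, tags, rank pairing, and state-vector exchange shown are illustrative assumptions, not the paper's actual interface.

    /* Sketch: model and DA code run as separate executables under one
     * mpiexec (e.g. mpiexec -n 1 ./model : -n 1 ./da); a communicator
     * split keeps each side's internal communication apart, while the
     * state exchange goes over MPI_COMM_WORLD. Model side only. */
    #include <mpi.h>

    #define STATE_DIM 1000   /* illustrative state-vector size */
    #define TAG_STATE 1
    #define TAG_ANALYSIS 2

    int main(int argc, char **argv)
    {
        int world_rank;
        double state[STATE_DIM];
        MPI_Comm model_comm;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

        /* Model ranks use colour 0; the DA executable would use colour 1,
         * so each side keeps its own internal communicator. */
        MPI_Comm_split(MPI_COMM_WORLD, 0, world_rank, &model_comm);

        for (int i = 0; i < STATE_DIM; i++)
            state[i] = 0.0;               /* placeholder forecast state */

        int da_rank = 1;                  /* assumed world rank of DA process */

        /* Hand the forecast to the DA code, then block for the analysis. */
        MPI_Send(state, STATE_DIM, MPI_DOUBLE, da_rank, TAG_STATE,
                 MPI_COMM_WORLD);
        MPI_Recv(state, STATE_DIM, MPI_DOUBLE, da_rank, TAG_ANALYSIS,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* ... continue time-stepping from the analysed state ... */

        MPI_Comm_free(&model_comm);
        MPI_Finalize();
        return 0;
    }

Keeping the exchange to a pair of point-to-point calls is what limits the intrusion into the model code, at the performance cost the abstract notes.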
Abstract:
The new Max-Planck-Institute Earth System Model (MPI-ESM) is used in the Coupled Model Intercomparison Project phase 5 (CMIP5) in a series of climate change experiments for either idealized CO2-only forcing or forcings based on observations and the Representative Concentration Pathway (RCP) scenarios. The paper gives an overview of the model configurations, experiments, related forcings, and initialization procedures, and presents results for the simulated changes in climate and carbon cycle. It is found that the climate feedback depends on the global warming and possibly the forcing history. The global warming from climatological 1850 conditions to 2080-2100 ranges from 1.5°C under the RCP2.6 scenario to 4.4°C under the RCP8.5 scenario. Over this range, the patterns of temperature and precipitation change are nearly independent of the global warming. The model shows a tendency to reduce the ocean heat uptake efficiency toward a warmer climate, and hence an acceleration of the warming in the later years. The precipitation sensitivity can be as high as 2.5% K−1 if the CO2 concentration is kept constant, or as small as 1.6% K−1 if the CO2 concentration is increasing. The oceanic uptake of anthropogenic carbon increases over time in all scenarios, being smallest in the experiment forced by RCP2.6 and largest in that for RCP8.5. The land also serves as a net carbon sink in all scenarios, predominantly in boreal regions. The strong tropical carbon sources found in the RCP2.6 and RCP8.5 experiments are almost absent in the RCP4.5 experiment, which can be explained by reforestation in the RCP4.5 scenario.
Abstract:
Tropical cyclones have been investigated in a T159 version of the MPI ECHAM5 climate model using a novel technique to diagnose the evolution of the three-dimensional vorticity structure of tropical cyclones, including their full life cycle from weak initial vortex to possible extra-tropical transition. Results have been compared with reanalyses (ERA40 and JRA25) and observed tropical storms for the Northern Hemisphere during the period 1978-1999. There is no indication of any trend in the number or intensity of tropical storms during this period in ECHAM5 or in the reanalyses, but there are distinct inter-annual variations. The storms simulated by ECHAM5 are realistic both in space and time, but the model, and even more so the reanalyses, underestimate the intensities of the most intense storms (in terms of their maximum wind speeds). There is an indication of a response to ENSO, with a smaller number of Atlantic storms during El Niño, in agreement with previous studies. The global divergence circulation responds to El Niño by setting up a large-scale convergence flow centered over the central Pacific, with enhanced subsidence over the tropical Atlantic. At the same time there is an increase in the vertical wind shear in the region of the tropical Atlantic where tropical storms normally develop. There is a good correspondence between the model and ERA40, except that the divergence circulation is somewhat stronger in the model. The model underestimates storms in the Atlantic but tends to overestimate them in the Western Pacific and in the North Indian Ocean. It is suggested that the overestimation of storms in the Pacific by the model is related to an overly strong response to the tropical Pacific SST anomalies. The overestimation in the North Indian Ocean is likely due to an overprediction of the intensity of monsoon depressions, which are then classified as intense tropical storms. Nevertheless, the overall results are encouraging and will further contribute to increased confidence in simulating intense tropical storms with high-resolution climate models.
Abstract:
Tropical cyclones (TC) under different climate conditions in the Northern Hemisphere have been investigated with the Max Planck Institute (MPI) coupled (ECHAM5/MPI-OM) and atmosphere-only (ECHAM5) climate models. The intensity and size of the TC depend crucially on resolution, with higher wind speeds and smaller scales at the higher resolutions. The typical size of the TC is reduced by a factor of 2.3 from T63 to T319, using the distance of the maximum wind speed from the centre of the storm as a measure. The full three-dimensional structure of the storms becomes increasingly more realistic as the resolution is increased. For the T63 resolution, three ensemble runs are explored for the period 1860 to 2100 using the IPCC SRES scenario A1B and evaluated for three 30-year periods at the end of the 19th, 20th and 21st century, respectively. While there is no significant change between the 19th and the 20th century, there is a considerable reduction in the number of TC, by some 20%, in the 21st century, but no change in the number of the more intense storms. The reduction in the number of storms occurs in all regions. A single additional experiment at T213 resolution was run for the latter two 30-year periods. The T213 experiment is atmosphere-only, using the transient Sea Surface Temperatures (SST) of the T63 experiment. Also in this case, there is a reduction of some 10% in the number of simulated TC in the 21st century compared to the 20th century, but a marked increase in the number of intense storms. The number of storms with maximum wind speeds greater than 50 m s−1 increases by a third. Most of the intensification takes place in the Eastern Pacific and in the Atlantic, where the number of storms also more or less stays the same. We identify two competing processes affecting TC in a warmer climate. First, the increase in static stability and the reduced vertical circulation are suggested to contribute to the reduction in the number of storms. Second, the increase in temperature and water vapor provides more energy for the storms, so that when favorable conditions occur, the higher SST and higher specific humidity will contribute to more intense storms. As the maximum intensity depends crucially on resolution, higher resolution is required for this to have its full effect. The distribution of storms between different regions does not, to a first approximation, depend on the temperature itself but on the distribution of the SST anomalies and their influence on the atmospheric circulation. Two additional transient experiments at T319 resolution were run for 20 years at the end of the 20th and 21st century, respectively, using the same conditions as in the T213 experiments. The results are consistent with the T213 study. The total number of tropical cyclones was similar to the T213 experiment, but the cyclones were generally more intense. The change from the 20th to the 21st century was also similar, with fewer TC in total but with more intense cyclones.
Abstract:
The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) make it a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system is able to detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001-2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the "observed" climatology and the "true" climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001-2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (each contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data, and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
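Taking the two quoted contributions of about 0.2 K each and assuming they are independent, so that they add in quadrature (an assumption made here for illustration, not stated in the abstract), the combined error is roughly

    \sigma_{total} \approx \sqrt{\sigma_{obs}^2 + \sigma_{samp}^2} \approx \sqrt{0.2^2 + 0.2^2}\,\mathrm{K} \approx 0.28\,\mathrm{K},

which is consistent with the statement that these errors should be small enough to monitor the expected UTLS temperature trends.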
Abstract:
The modelled El Niño-mean state-seasonal cycle interactions in 23 coupled ocean-atmosphere GCMs, including the recent IPCC AR4 models, are assessed and compared to observations and theory. The models show a clear improvement over previous generations in simulating the tropical Pacific climatology. Systematic biases still include too strong a mean and seasonal cycle of the trade winds. El Niño amplitude is shown to be an inverse function of the mean trade winds, in agreement with the observed shift of 1976 and with theoretical studies. El Niño amplitude is further shown to be an inverse function of the relative strength of the seasonal cycle: when most of the energy is within the seasonal cycle, little is left for inter-annual signals, and vice versa. An interannual coupling strength (ICS) is defined, and its relation with the modelled El Niño frequency is compared to that predicted by theoretical models. An assessment of the modelled El Niño in terms of SST mode (S-mode) or thermocline mode (T-mode) shows that most models are locked into an S-mode and that only a few models exhibit a hybrid mode, as in observations. It is concluded that several basic El Niño-mean state-seasonal cycle relationships proposed by either theory or analysis of observations seem to be reproduced by CGCMs. This is especially true for the amplitude of El Niño and is less clear for its frequency. Most of these relationships, first established for the pre-industrial control simulations, hold for the double and quadruple CO2 stabilized scenarios. The models that exhibit the largest El Niño amplitude change in these greenhouse gas (GHG) increase scenarios are those that exhibit a mode change towards a T-mode (either from S-mode to hybrid or from hybrid to T-mode). This follows the observed 1976 climate shift in the tropical Pacific, and supports the (still debated) finding of studies that attributed this shift to increased GHGs. In many respects, these models are also among those that best simulate the tropical Pacific climatology (ECHAM5/MPI-OM, GFDL-CM2.0, GFDL-CM2.1, MRI-CGCM2.3.2, UKMO-HadCM3). Results from this large subset of models suggest the likelihood of increased El Niño amplitude in a warmer climate, though there is considerable spread of El Niño behaviour among the models, and the changes in the subsurface thermocline properties that may be important for El Niño change could not be assessed. There are no clear indications of an El Niño frequency change with increased GHG.
Abstract:
In the Essence project a 17-member ensemble simulation of climate change in response to the SRES A1B scenario has been carried out using the ECHAM5/MPI-OM climate model. The relatively large size of the ensemble makes it possible to investigate changes in extreme values of climate variables accurately. Here we focus on the annual maximum 2-m temperature, fit a Generalized Extreme Value (GEV) distribution to the simulated values, and investigate the development of the parameters of this distribution. Over most land areas both the location and the scale parameter increase. Consequently, the 100-year return values increase faster than the average temperatures. A comparison of simulated 100-year return values for the present climate with observations (station data and reanalysis) shows that the ECHAM5/MPI-OM model, like other models, overestimates extreme temperature values. After correcting for this bias, it still shows values in excess of 50°C in Australia, India, the Middle East, North Africa, the Sahel, and equatorial and subtropical South America at the end of the century.
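For reference, the 100-year return value follows from the fitted GEV parameters in the standard textbook way (this is general GEV theory, not a formula taken from the paper). With location \mu, scale \sigma and shape \xi, the GEV distribution function and the T-year return level z_T are

    F(z) = \exp\{ -[ 1 + \xi (z - \mu)/\sigma ]^{-1/\xi} \}, \qquad
    z_T = \mu + \frac{\sigma}{\xi}\left[ \left( -\ln(1 - 1/T) \right)^{-\xi} - 1 \right],

so an increase in either the location or the scale parameter raises z_100, which is why the 100-year return values can rise faster than the mean temperature.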
Abstract:
Inferring population admixture from genetic data and quantifying it is a difficult but crucial task in evolutionary and conservation biology. Unfortunately, state-of-the-art probabilistic approaches are computationally demanding. Effectively exploiting the computational power of modern multiprocessor systems can thus have a positive impact on Monte Carlo-based simulation of admixture modeling. A novel parallel approach is briefly described, and promising results on its Message Passing Interface (MPI)-based C implementation are reported.
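The abstract gives no algorithmic detail, but a common way to parallelise Monte Carlo simulation with MPI is to run independent replicates on each rank with rank-distinct random seeds and combine the results at the end. The C sketch below shows only that generic pattern; the replicate function, seeds, and counts are illustrative assumptions, not the authors' admixture sampler.

    /* Generic embarrassingly parallel Monte Carlo pattern: each rank
     * runs independent replicates with its own seed, and rank 0
     * averages the per-rank estimates with MPI_Reduce. */
    #include <mpi.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Placeholder for one Monte Carlo replicate (assumed interface). */
    static double run_replicate(unsigned int *seed)
    {
        return (double)rand_r(seed) / RAND_MAX;
    }

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        unsigned int seed = 12345u + (unsigned int)rank;  /* distinct per rank */
        const int reps = 10000;
        double local = 0.0, global = 0.0;

        for (int r = 0; r < reps; r++)
            local += run_replicate(&seed);
        local /= reps;

        /* Average the per-rank means on rank 0. */
        MPI_Reduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("mean estimate over %d ranks: %f\n", size, global / size);

        MPI_Finalize();
        return 0;
    }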
Abstract:
MPJ Express is a thread-safe Java messaging library that provides a full implementation of the mpiJava 1.2 API specification. This specification defines MPI-like bindings for the Java language. We have implemented two communication devices as part of our library: the first, called niodev, is based on the Java New I/O package, and the second, called mxdev, is based on the Myrinet eXpress library. MPJ Express comes with an experimental runtime, which allows portable bootstrapping of Java Virtual Machines across a cluster or network of computers. In this paper we describe the implementation of MPJ Express. We also present a performance comparison against various other C and Java messaging systems. A beta version of MPJ Express was released in September 2005.
Abstract:
The Java language first came to public attention in 1995. Within a year, it was being speculated that Java might be a good language for parallel and distributed computing. Its core features, including being object oriented and platform independent, as well as having built-in network support and threads, have encouraged this view. Today, Java is being used in almost every type of computer-based system, ranging from sensor networks to high performance computing platforms, and from enterprise applications through to complex research-based simulations. In this paper the key features that make Java a good language for parallel and distributed computing are first discussed. Two Java-based middleware systems, namely MPJ Express, an MPI-like Java messaging system, and Tycho, a wide-area asynchronous messaging framework with an integrated virtual registry, are then discussed. The paper concludes by highlighting the advantages of using Java as middleware to support distributed applications.
Abstract:
In any data mining application, automated retrieval of text and image information is needed. This becomes essential with the growth of the Internet and digital libraries. Our approach is based on latent semantic indexing (LSI) and the corresponding term-by-document matrix suggested by Berry and his co-authors. Instead of using deterministic methods to find the required number of first "k" singular triplets, we propose a stochastic approach. First, we use a Monte Carlo method to sample and build a much smaller term-by-document matrix (e.g. a k x k matrix), from which we then find the first "k" triplets using standard deterministic methods. Second, we investigate how the problem can be reduced to finding the "k" largest eigenvalues using parallel Monte Carlo methods. We apply these methods to the initial matrix and also to the reduced one. The algorithms run on a cluster of workstations under MPI; results of experiments on textual retrieval of Web documents, together with a comparison of the proposed stochastic methods, are presented.
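The paper randomizes this kind of eigenvalue computation with Monte Carlo sampling; the C sketch below shows only the deterministic MPI skeleton of the underlying task, a distributed power iteration for the dominant eigenvalue with an assumed block-row data layout and toy matrix entries. It is not the authors' stochastic algorithm.

    /* Distributed power iteration for the dominant eigenvalue of an
     * N x N matrix whose rows are block-distributed over the ranks.
     * Build with: mpicc eig.c -o eig -lm  (assumes size divides N). */
    #include <mpi.h>
    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define N 512          /* global matrix dimension (illustrative) */
    #define ITERS 100

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int rows = N / size;                      /* local row-block height */
        double *A = malloc((size_t)rows * N * sizeof *A);
        double *y = malloc((size_t)rows * sizeof *y);
        double x[N], lambda = 0.0;

        /* Toy local row block and start vector. */
        for (int i = 0; i < rows * N; i++) A[i] = 1.0 / (1.0 + (i % N));
        for (int j = 0; j < N; j++) x[j] = 1.0;

        for (int it = 0; it < ITERS; it++) {
            for (int i = 0; i < rows; i++) {      /* local part of y = A x */
                y[i] = 0.0;
                for (int j = 0; j < N; j++)
                    y[i] += A[i * N + j] * x[j];
            }
            /* Re-assemble the full iterate on every rank. */
            MPI_Allgather(y, rows, MPI_DOUBLE, x, rows, MPI_DOUBLE,
                          MPI_COMM_WORLD);
            double norm = 0.0;                    /* ||A x|| estimates lambda */
            for (int j = 0; j < N; j++) norm += x[j] * x[j];
            lambda = sqrt(norm);
            for (int j = 0; j < N; j++) x[j] /= lambda;   /* normalize */
        }

        if (rank == 0)
            printf("dominant eigenvalue estimate: %f\n", lambda);
        free(A); free(y);
        MPI_Finalize();
        return 0;
    }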