137 results for Proxy servers


Relevance: 10.00%

Abstract:

In this paper it is argued that rotational wind is not the best choice of leading control variable for variational data assimilation, and an alternative is suggested and tested. A rotational wind parameter is used in most global variational assimilation systems as a pragmatic way of approximately representing the balanced component of the assimilation increments. In effect, rotational wind is treated as a proxy for potential vorticity, but one that is potentially not a good choice in flow regimes characterised by small Burger number. This paper reports on an alternative set of control variables based around potential vorticity. This gives rise to a new formulation of the background error covariances for the Met Office's variational assimilation system, which leads to flow dependency. It uses balance relationships similar to those of traditional schemes, but recognises the existence of unbalanced rotational wind, which is used with a new anti-balance relationship. The new scheme is described and its performance is evaluated and compared to a traditional scheme using a sample of diagnostics.
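The balance operation at the heart of a rotational-wind control variable, recovering a streamfunction from vorticity, can be illustrated with a toy spectral inversion. This is a minimal sketch on a doubly periodic grid with a synthetic field; it is not the Met Office transform, and the grid size and test function are invented for illustration.

```python
import numpy as np

# Recover a streamfunction psi from relative vorticity zeta = Laplacian(psi)
# by inverting the Laplacian with FFTs on a doubly periodic grid.
n = 64
L = 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x)

psi_true = np.sin(2 * X) * np.cos(3 * Y)       # prescribed streamfunction
zeta = -(2**2 + 3**2) * psi_true               # its Laplacian, analytically

k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi     # integer angular wavenumbers
KX, KY = np.meshgrid(k, k)
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                                 # avoid dividing the mean mode by zero

zeta_hat = np.fft.fft2(zeta)
psi_hat = -zeta_hat / K2                       # zeta_hat = -|k|^2 psi_hat
psi_hat[0, 0] = 0.0                            # streamfunction defined up to a constant
psi = np.real(np.fft.ifft2(psi_hat))

err = np.max(np.abs(psi - psi_true))           # should be at machine precision
```

The same spectral machinery, applied in the other direction, is how rotational wind stands in for the balanced (vortical) part of an increment.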

Relevance: 10.00%

Abstract:

The OECD 14-d earthworm acute toxicity test was used to determine the toxicity of copper added as copper nitrate (Cu(NO3)2), copper sulphate (CuSO4) and malachite (Cu2(OH)2CO3) to Eisenia fetida Savigny. Cu(NO3)2 and CuSO4 were applied in both aqueous (aq) and solid (s) form; Cu2(OH)2CO3 was added as a solid. Soil solution was extracted by centrifugation and analysed for copper. Two extractants [0.01 M CaCl2 and 0.005 M diethylenetriaminepentaacetic acid (DTPA)] were used as a proxy for the bioavailable copper fraction in the soil. For bulk soil copper content, the calculated copper toxicity decreased in the order nitrate > sulphate > carbonate, the same order as the decreasing solubility of the metal compounds. For Cu(NO3)2 and CuSO4, the LC50s obtained were not significantly different whether the compound was added in solution or solid form. There was a significant correlation between the soil solution copper concentration and the percentage earthworm mortality for all three copper compounds (P ≤ 0.05), indicating that the soil pore water copper concentration is important for determining copper availability and toxicity to E. fetida. In soil avoidance tests the earthworms avoided the soils treated with Cu(NO3)2 (aq and s) and CuSO4 (aq and s) at all concentrations used (110–8750 µg Cu g-1 and 600–8750 µg Cu g-1, respectively). In soils treated with Cu2(OH)2CO3, avoidance behaviour was exhibited at all concentrations ≥ 3500 µg Cu g-1. There was no significant correlation between the copper extracted by either CaCl2 or DTPA and percentage mortality. These two extractants are therefore not useful indicators of copper availability and toxicity to E. fetida.
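An LC50 of the kind reported above is typically estimated by fitting a dose-response curve to mortality against log concentration. The sketch below fits a straight line to the logit of mortality; the concentrations and mortality fractions are invented for illustration and are not the paper's E. fetida data.

```python
import numpy as np

# Fit logit(mortality) = slope * log10(conc) + intercept; the LC50 is the
# concentration at which the log-odds of death cross zero (mortality = 0.5).
conc = np.array([110.0, 350.0, 1100.0, 3500.0, 8750.0])  # ug Cu g^-1 (invented)
mortality = np.array([0.05, 0.15, 0.50, 0.85, 0.98])     # fraction dead (invented)

log_c = np.log10(conc)
logit = np.log(mortality / (1.0 - mortality))            # log-odds of death

slope, intercept = np.polyfit(log_c, logit, 1)
lc50 = 10 ** (-intercept / slope)                        # ug Cu g^-1
```

With these invented data the fitted LC50 lands near the concentration at which half the worms died, as it should.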

Relevance: 10.00%

Abstract:

A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high-quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
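A generalized sensitivity analysis of the Hornberger-Spear type works by Monte Carlo sampling the parameters, splitting runs into behavioural and non-behavioural according to whether simulated DO matches observations, and then comparing the parameter distributions of the two groups. The sketch below does this for a deliberately crude stand-in model, not Q2; both parameters and the "observed" DO target are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
k_rea = rng.uniform(0.1, 2.0, n)   # reaeration rate (made sensitive by design)
k_sed = rng.uniform(0.0, 1.0, n)   # sediment demand (made nearly inert by design)

do_sat, bod = 10.0, 6.0
do_sim = do_sat - bod / (1.0 + k_rea) + 0.01 * k_sed   # toy DO balance

behavioural = np.abs(do_sim - 7.5) < 0.5               # close to "observed" DO

def ks_distance(a, b):
    """Maximum gap between the empirical CDFs of two samples."""
    grid = np.sort(np.concatenate([a, b]))
    cdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / len(s)
    return np.max(np.abs(cdf(a) - cdf(b)))

# A large distance means the behavioural split discriminates in that
# parameter, i.e. the parameter controls model behaviour.
d_rea = ks_distance(k_rea[behavioural], k_rea[~behavioural])
d_sed = ks_distance(k_sed[behavioural], k_sed[~behavioural])
```

By construction the reaeration rate separates sharply while the sediment parameter does not, which is exactly the kind of diagnosis the GSA provides for Q2's parameters.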

Relevance: 10.00%

Abstract:

Development geography has long sought to understand why inequalities exist and the best ways to address them. Dependency theory sets out a historical rationale for underdevelopment based on colonialism and a legacy of a developed core and an under-developed periphery. Race is relevant in this theory only insofar as Europeans are white and the places they colonised were occupied by people with darker skin colour; there are no innate biological reasons why it happened in that order. However, a new theory of national inequalities proposed by Lynn and Vanhanen in a series of publications makes the case that poorer countries have that status because of a poorer genetic stock rather than an accident of history. They argue that IQ has a genetic basis and that IQ is linked to ability; races with a poorer IQ therefore have less ability, so national IQ can be positively correlated with performance as measured by an indicator like GDP per capita. Their thesis is one of despair, as little can be done to improve genetic stock significantly other than a programme of eugenics. This paper summarises and critiques the Lynn and Vanhanen hypothesis and the assumptions upon which it is based, and uses this analysis to show how the human desire to simplify in order to manage can be dangerous in development geography. While attention may naturally focus on the 'national IQ' variables as a proxy measure of 'innate ability', the assumption of GDP per capita as an indicator of 'success' and 'achievement' is far more readily accepted without criticism. The paper makes the case that the current vogue for indicators, indices and cause-effect can be tyrannical.

Relevance: 10.00%

Abstract:

The southern Levant has a long history of human habitation and it has been previously suggested that climatic changes during the Late Pleistocene-Holocene stimulated changes in human behaviour and society. In order to evaluate such linkages, it is necessary to have a detailed understanding of the climate record. We have conducted an extensive and up-to-date review of terrestrial and marine climatic conditions in the Levant and Eastern Mediterranean during the last 25,000 years. We firstly present data from general circulation models (GCMs) simulating the climate for the last glacial maximum (LGM), and evaluate the output of the model by reference to geological climate proxy data. We consider the types of climate data available from different environments and proxies and then present the spatial climatic "picture" for key climatic events. This exercise suggests that the major Northern Hemisphere climatic fluctuations of the last 25,000 years are recorded in the Eastern Mediterranean and Levantine region. However, this review also highlights problems and inadequacies with the existing data.

Relevance: 10.00%

Abstract:

The tropospheric response to a forced shutdown of the North Atlantic Ocean’s meridional overturning circulation (MOC) is investigated in a coupled ocean–atmosphere GCM [the third climate configuration of the Met Office Unified Model (HadCM3)]. The strength of the boreal winter North Atlantic storm track is significantly increased and penetrates much farther into western Europe. The changes in the storm track are shown to be consistent with the changes in near-surface baroclinicity, which can be linked to changes in surface temperature gradients near regions of sea ice formation and in the open ocean. Changes in the SST of the tropical Atlantic are linked to a strengthening of the subtropical jet to the north, which, combined with the enhanced storm track, leads to a pronounced split in the jet structure over Europe. EOF analysis and stationary box indices methods are used to analyze changes to the North Atlantic Oscillation (NAO). There is no consistent signal of a change in the variability of the NAO, and while the changes in the mean flow project onto the positive NAO phase, they are significantly different from it. However, there is a clear eastward shift of the NAO pattern in the shutdown run, and this potentially has implications for ocean circulation and for the interpretation of proxy paleoclimate records.
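The EOF analysis mentioned above amounts to a singular value decomposition of the anomaly field: the right singular vectors are the spatial patterns (such as the NAO) and the squared singular values give the variance each pattern explains. The sketch below plants a known pattern in synthetic data and recovers it; the field dimensions and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
ntime, nspace = 200, 50

pattern = np.sin(np.linspace(0, np.pi, nspace))     # planted spatial mode
amplitude = rng.standard_normal(ntime)              # its time series
field = np.outer(amplitude, pattern) * 3.0 + 0.1 * rng.standard_normal((ntime, nspace))

anom = field - field.mean(axis=0)                   # anomalies about the time mean
u, s, vt = np.linalg.svd(anom, full_matrices=False)

eof1 = vt[0]                                        # leading spatial pattern (EOF1)
explained = s**2 / np.sum(s**2)                     # fraction of variance per mode

# EOF1 should match the planted pattern up to an arbitrary sign.
match = abs(np.dot(eof1, pattern)) / (np.linalg.norm(eof1) * np.linalg.norm(pattern))
```

Shifts of a pattern, like the eastward NAO shift in the shutdown run, show up when EOFs computed from two experiments are compared in this way.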

Relevance: 10.00%

Abstract:

We present an analysis of trace gas correlations in the lowermost stratosphere. In-situ aircraft measurements of CO, N2O, NOy and O3, obtained during the STREAM 1997 winter campaign, have been used to investigate the role of cross-tropopause mass exchange on tracer-tracer relations. At altitudes several kilometers above the local tropopause, undisturbed stratospheric air was found with NOy/NOy* ratios close to unity, NOy/O3 about 0.003–0.006 and CO mixing ratios as low as 20 ppbv (NOy* is a proxy for total reactive nitrogen derived from NOy–N2O relations measured in the stratosphere). Mixing of tropospheric air into the lowermost stratosphere has been identified by enhanced ratios of NOy/NOy* and NOy/O3, and from scatter plots of CO versus O3. The enhanced NOy/O3 ratio in the lowermost stratospheric mixing zone points to a reduced efficiency of O3 formation from aircraft NOx emissions.

Relevance: 10.00%

Abstract:

Anthropogenic changes in precipitation pose a serious threat to society—particularly in regions such as the Middle East that already face serious water shortages. However, climate model projections of regional precipitation remain highly uncertain. Moreover, standard resolution climate models have particular difficulty representing precipitation in the Middle East, which is modulated by complex topography, inland water bodies and proximity to the Mediterranean Sea. Here we compare precipitation changes over the twenty-first century against both millennial variability during the Holocene and interannual variability in the present day. In order to assess the climate model and to make consistent comparisons, this study uses new regional climate model simulations of the past, present and future in conjunction with proxy and historical observations. We show that the pattern of precipitation change within Europe and the Middle East projected by the end of the twenty-first century has some similarities to that which occurred during the Holocene. In both cases, a poleward shift of the North Atlantic storm track and a weakening of the Mediterranean storm track appear to cause decreased winter rainfall in southern Europe and the Middle East and increased rainfall further north. In contrast, on an interannual time scale, anomalously dry seasons in the Middle East are associated with a strengthening and focusing of the storm track in the north Mediterranean and hence wet conditions throughout southern Europe.

Relevance: 10.00%

Abstract:

The commonly held view of the conditions in the North Atlantic at the last glacial maximum, based on the interpretation of proxy records, is of large-scale cooling compared to today, limited deep convection, and extensive sea ice, all associated with a southward displaced and weakened overturning thermohaline circulation (THC) in the North Atlantic. Not all studies support that view; in particular, the "strength of the overturning circulation" is contentious and is a quantity that is difficult to determine even for the present day. Quasi-equilibrium simulations with coupled climate models forced by glacial boundary conditions have produced differing results, as have inferences made from proxy records. Most studies suggest the weaker circulation, some suggest little or no change, and a few suggest a stronger circulation. Here results are presented from a three-dimensional climate model, the Hadley Centre Coupled Model version 3 (HadCM3), of the coupled atmosphere - ocean - sea ice system suggesting, in a qualitative sense, that these diverging views could all have occurred at different times during the last glacial period, with different modes existing at different times. One mode might have been characterized by an active THC associated with moderate temperatures in the North Atlantic and a modest expanse of sea ice. The other mode, perhaps forced by large inputs of meltwater from the continental ice sheets into the northern North Atlantic, might have been characterized by a sluggish THC associated with very cold conditions around the North Atlantic and a large areal cover of sea ice. The authors' model simulation of such a mode, forced by a large input of freshwater, bears several of the characteristics of the Climate: Long-range Investigation, Mapping, and Prediction (CLIMAP) Project's reconstruction of glacial sea surface temperature and sea ice extent.

Relevance: 10.00%

Abstract:

A physically motivated statistical model is used to diagnose variability and trends in wintertime (October–March) Global Precipitation Climatology Project (GPCP) pentad (5-day mean) precipitation. Quasi-geostrophic theory suggests that extratropical precipitation amounts should depend multiplicatively on the pressure gradient, saturation specific humidity, and the meridional temperature gradient. This physical insight has been used to guide the development of a suitable statistical model for precipitation using a mixture of generalized linear models: a logistic model for the binary occurrence of precipitation and a Gamma distribution model for the wet day precipitation amount. The statistical model allows for the investigation of the role of each factor in determining variations and long-term trends. Saturation specific humidity qs has a generally negative effect on global precipitation occurrence and on the tropical wet pentad precipitation amount, but has a positive relationship with the pentad precipitation amount at mid- and high latitudes. The North Atlantic Oscillation, a proxy for the meridional temperature gradient, is also found to have a statistically significant positive effect on precipitation over much of the Atlantic region. Residual time trends in wet pentad precipitation are extremely sensitive to the choice of the wet pentad threshold because of increasing trends in low-amplitude precipitation pentads; too low a choice of threshold can lead to a spurious decreasing trend in wet pentad precipitation amounts. However, for thresholds that are not too small, it is found that the meridional temperature gradient is an important factor in explaining part of the long-term trend in Atlantic precipitation.
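The two-part structure described above, a logistic model for whether a pentad is wet plus a Gamma distribution for the wet amount, can be sketched on synthetic data. The covariate, coefficients and Gamma parameters below are all invented, and the Gamma fit uses simple moment matching rather than a full generalized linear model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

index = rng.standard_normal(n)                       # stand-in climate covariate
p_wet = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * index)))   # logistic occurrence model
wet = rng.random(n) < p_wet                          # Bernoulli wet/dry draw

shape_true, scale_true = 2.0, 3.0                    # planted Gamma parameters
amount = np.where(wet, rng.gamma(shape_true, scale_true, n), 0.0)

# Amount model: recover the Gamma parameters from the wet pentads only,
# using mean = k*theta and var = k*theta^2.
wet_amounts = amount[wet]
mean, var = wet_amounts.mean(), wet_amounts.var()
shape_hat = mean**2 / var
scale_hat = var / mean

wet_fraction = wet.mean()                            # should track mean of p_wet
```

Separating occurrence from amount in this way is what lets each physical factor enter the two components with different signs, as the abstract reports for saturation specific humidity.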

Relevance: 10.00%

Abstract:

Historical smoke concentrations at monthly resolution for the early twentieth century are found for Kew Observatory, London, using the atmospheric electricity proxy technique. Smoke particles modify the electrical properties of urban air: an increase in smoke concentration reduces air's electrical conductivity and increases the Potential Gradient (PG). Calibrated PG data are available from Kew since 1898, and air conductivity was measured routinely between 1909 and 1979 using the technique developed by C.T.R. Wilson. Automated smoke observations at the same site overlap with the atmospheric electrical measurements from 1921, providing an absolute calibration to smoke concentration. This shows that the late nineteenth century winter smoke concentrations at Kew were approximately 100 times greater than contemporary winter smoke concentrations. Following smoke emission regulations reducing the smoke concentration, the electrical parameters of the urban air did not change dramatically. This is suggested to be due to a composition change, with an increase in the abundance of small aerosol compensating for the decrease in smoke.
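The physical basis of the proxy is that, for a roughly constant air-earth current density Jz, the potential gradient is PG = Jz / conductivity, and smoke lowers conductivity by scavenging ions, so PG rises with smoke. The sketch below encodes that chain; the constants are typical orders of magnitude and an invented scavenging coefficient, not the Kew calibration.

```python
import numpy as np

jz = 2e-12            # air-earth current density, A m^-2 (typical order)
sigma_clean = 2e-14   # clean-air conductivity, S m^-1 (typical order)
k_smoke = 5e-16       # conductivity lost per unit smoke (invented coefficient)

smoke = np.linspace(0.0, 200.0, 50)                  # arbitrary smoke units
sigma = sigma_clean / (1.0 + (k_smoke / sigma_clean) * smoke)
pg = jz / sigma                                      # potential gradient, V m^-1

pg_rise = pg[-1] / pg[0]    # factor by which PG rises over this smoke range
```

Inverting the same relation, measured PG and conductivity give smoke, which is how the 1921 overlap with direct smoke observations supplies an absolute calibration.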

Relevance: 10.00%

Abstract:

We have applied time series analytical techniques to the flux of lava from an extrusive eruption. Tilt data acting as a proxy for flux are used in a case study of the May–August 1997 period of the eruption at Soufrière Hills Volcano, Montserrat. We justify the use of such a proxy by simple calibration arguments. Three techniques of time series analysis are employed: spectral, spectrogram and wavelet methods. In addition to the well-known ~9-hour periodicity shown by these data, a previously unknown periodic flux variability is revealed by the wavelet analysis as a 3-day cycle of frequency modulation during June–July 1997, though the physical mechanism responsible is not clear. Such time series analysis has potential for other lava flux proxies at other types of volcanoes.
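The first of the three techniques, spectral analysis, amounts to finding dominant peaks in a periodogram of the tilt series. The sketch below plants a 9-hour cycle in synthetic noisy data and recovers it; the sampling interval, record length and noise level are invented, standing in for the Montserrat tilt record.

```python
import numpy as np

rng = np.random.default_rng(7)
dt_hours = 0.25                                    # 15-minute sampling (assumed)
t = np.arange(0, 30 * 24, dt_hours)                # 30 days of record (assumed)
signal = np.sin(2 * np.pi * t / 9.0) + 0.5 * rng.standard_normal(t.size)

# Periodogram: power of the mean-removed series at each frequency.
spec = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt_hours)        # cycles per hour

peak_period_hours = 1.0 / freqs[np.argmax(spec)]   # dominant periodicity
```

Spectrogram and wavelet methods extend this by localising such peaks in time, which is what reveals the 3-day modulation of the cycle.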

Relevance: 10.00%

Abstract:

During many lava dome-forming eruptions, persistent rockfalls and the concurrent development of a substantial talus apron around the foot of the dome are important aspects of the observed activity. An improved understanding of internal dome structure, including the shape and internal boundaries of the talus apron, is critical for determining when a lava dome is poised for a major collapse and how this collapse might ensue. We consider a period of lava dome growth at the Soufrière Hills Volcano, Montserrat, from August 2005 to May 2006, during which a 100 × 10^6 m^3 lava dome developed that culminated in a major dome-collapse event on 20 May 2006. We use an axi-symmetrical Finite Element Method model to simulate the growth and evolution of the lava dome, including the development of the talus apron. We first test the generic behaviour of this continuum model, which has core lava and carapace/talus components. Our model describes the generation rate of talus, including its spatial and temporal variation, as well as its post-generation deformation, which is important for an improved understanding of the internal configuration and structure of the dome. We then use our model to simulate the 2005 to 2006 Soufrière Hills dome growth using measured dome volumes and extrusion rates to drive the model and generate the evolving configuration of the dome core and carapace/talus domains. The evolution of the model is compared with the observed rockfall seismicity using event counts and seismic energy parameters, which are used here as a measure of rockfall intensity and hence a first-order proxy for volumes. The range of model-derived volume increments of talus aggraded to the talus slope per recorded rockfall event, approximately 3 × 10^3–13 × 10^3 m^3 per rockfall, is high with respect to estimates based on observed events.
From this, it is inferred that some of the volumetric growth of the talus apron (perhaps up to 60–70%) might have occurred in the form of aseismic deformation of the talus, forced by an internal, laterally spreading core. Talus apron growth by this mechanism has not previously been identified, and this suggests that the core, hosting hot gas-rich lava, could have a greater lateral extent than previously considered.
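The aseismic-fraction inference is a simple volume budget: whatever talus growth the recorded rockfalls cannot account for must have been emplaced aseismically. The sketch below uses round illustrative numbers, not the monitored catalogue, chosen so the residual falls in the 60-70% range quoted above.

```python
# Volume budget for talus apron growth (all values illustrative).
talus_growth = 30e6      # m^3 of talus aggraded over the period (assumed)
n_rockfalls = 20000      # recorded rockfall events (assumed)
vol_per_event = 500.0    # m^3 per event from direct observation (assumed)

seismic_volume = n_rockfalls * vol_per_event           # volume the events explain
aseismic_fraction = 1.0 - seismic_volume / talus_growth
```

With these numbers the recorded events explain only a third of the apron, leaving about two-thirds to aseismic deformation of the talus by the spreading core.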

Relevance: 10.00%

Abstract:

An annually laminated, uranium-series dated, Holocene stalagmite from southeast Ethiopia has been analysed for growth rate and δ13C and δ18O variations at annual to biennial resolution, in order to provide the first long-duration proxy record of decadal-scale rainfall variability in this climatically sensitive region. Our study site (10°N) is climatically influenced by both summer (June–August) and spring (March–May) rainfall caused by the annual movement of the Inter-Tropical Convergence Zone (ITCZ) and modulated by large-scale anomalies in the atmospheric circulation and in ocean temperatures. Here we show that stalagmite growth, episodic throughout the last 7800 years, demonstrates decadal-scale (8–25 yr) variability in both growth rate and δ18O. A hydrological model was employed and indicates that this decadal variability is due to variations in the relative amounts of rainfall in the two rain seasons. Our record, unique in its combination of length (a total of ~1000 years), annual chronology and high-resolution δ18O, shows for the first time that such decadal-scale variability in rainfall in this region has occurred through the Holocene, which implies persistent decadal-scale variability in the large-scale atmospheric and oceanic driving factors.
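The hydrological reasoning can be sketched as amount-weighted mixing: if spring and summer rains carry distinct δ18O signatures, shifting the seasonal rainfall balance shifts the δ18O of the recharge the stalagmite records. All numbers below are invented for illustration, not the model's calibrated values.

```python
# Drip-water d18O as an amount-weighted mixture of two rain seasons.
d18o_spring = -1.0    # per mil, spring (March-May) rain (invented)
d18o_summer = -4.0    # per mil, summer (June-August) rain (invented)

def drip_d18o(spring_mm, summer_mm):
    """Amount-weighted mean d18O of recharge from the two seasons."""
    total = spring_mm + summer_mm
    return (spring_mm * d18o_spring + summer_mm * d18o_summer) / total

wet_summer_year = drip_d18o(100.0, 300.0)   # summer-dominated rainfall
wet_spring_year = drip_d18o(300.0, 100.0)   # spring-dominated rainfall
```

Decadal swings between the two regimes therefore appear as decadal δ18O cycles even when total annual rainfall is unchanged, which is the mechanism the model attributes the 8-25 yr variability to.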

Relevance: 10.00%

Abstract:

A full assessment of para-virtualization is important because, without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization. The idea is to see what the overheads of para-virtualization are, and also to look at the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). To assess these virtualization systems, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) offered by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different systems: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading. A functional virtualization system is multi-layered and is driven by privileged components. Virtualization systems can host multiple guest operating systems, each running in its own domain, and the system schedules virtual CPUs and memory within each virtual machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly.
Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application: the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines: these guest operating systems are aware that they are running on a virtual machine, and in return they achieve near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made to the guest operating system to enable better performance. In this kind of virtualization the guest operating system is aware that it is running on virtualized hardware rather than on bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system, reducing the performance overheads. Para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed. It has been shown [0] that para-virtualization does not impose a significant performance overhead in high-performance computing, and this in turn has implications for the use of cloud computing to host HPC applications. The apparent improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation.
In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
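The overhead bookkeeping described above, timing the same workload in two configurations and reporting the relative slowdown, can be sketched as a small harness. Everything here is a stand-in: `workload` is an arbitrary CPU loop, and the "virtualized" run simply adds an artificial delay, since no hypervisor is involved in this example.

```python
import time

def workload():
    """Arbitrary CPU-bound stand-in for a Netlib-style benchmark kernel."""
    return sum(i * i for i in range(200_000))

def timed(fn, repeats=3):
    """Best-of-n wall-clock time, reducing scheduling noise."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - t0)
    return best

bare = timed(workload)                                  # "bare metal" run
virt = timed(lambda: (time.sleep(0.01), workload()))    # run plus stand-in overhead

overhead_pct = 100.0 * (virt - bare) / bare             # relative overhead
```

In the real study the second configuration is the benchmark inside a Xen, VMware, KVM or Citrix guest (with and without monitoring and logging enabled), and the harness is whatever timing the benchmark suite itself reports.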