Abstract:
High spatial resolution environmental data give us a better understanding of the environmental factors affecting plant distributions at fine spatial scales. However, large environmental datasets dramatically increase compute times and output species model sizes, stimulating the need for an alternative computing solution. Cluster computing offers such a solution by allowing both multiple plant species Environmental Niche Models (ENMs) and individual tiles of high spatial resolution models to be computed concurrently on the same compute cluster. We apply our methodology to a case study of 4,209 species of Mediterranean flora (around 17% of species believed present in the biome). We demonstrate a 16-times speed-up of ENM computation time when 16 CPUs were used on the compute cluster. Our custom Java ‘Merge’ and ‘Downsize’ programs reduce ENM output file sizes by 94%. The median test AUC score of 0.98 across species ENMs is aided by various species occurrence data filtering techniques. Finally, by calculating the percentage change of individual grid cell values, we map the projected percentages of plant species vulnerable to climate change in the Mediterranean region between 1950–2000 and 2020.
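The final mapping step described in this abstract, the per-cell percentage change between a baseline grid (1950–2000) and a projected grid (2020), can be sketched as below. This is a minimal illustration under assumed conventions (grids as nested lists of suitability values, the function name `percent_change`); it is not the authors' Java implementation.

```python
def percent_change(baseline, projected):
    """Per-cell percentage change between two equally sized grids.

    Cells with a zero baseline value are returned as None rather than
    dividing by zero; how such cells are handled in the original study
    is an assumption here.
    """
    result = []
    for base_row, proj_row in zip(baseline, projected):
        result.append([
            None if b == 0 else 100.0 * (p - b) / b
            for b, p in zip(base_row, proj_row)
        ])
    return result

# A suitability drop from 2.0 to 1.0 is a -50% change:
# percent_change([[2.0, 0.0]], [[1.0, 3.0]]) -> [[-50.0, None]]
```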
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. 12 km explicit convection simulations also perform much better than the 12 km parameterized convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit convection simulation using the conventional boundary layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500–400 hPa.
Abstract:
In the European Union, first-tier assessment of the long-term risk to birds and mammals from pesticides is based on calculation of a deterministic long-term toxicity/exposure ratio (TERlt). The ratio is developed from generic herbivores and insectivores and applied to all species. This paper describes two case studies that implement proposed improvements to the way long-term risk is assessed. These refined methods require calculation of a TER for each of five identified phases of reproduction (phase-specific TERs) and use of adjusted No Observed Effect Levels (NOELs) to incorporate variation in species sensitivity to pesticides. They also involve progressive refinement of the exposure estimate so that it applies to particular species, rather than generic indicators, and relates spraying date to onset of reproduction. The effect of using these new methods on the assessment of risk is described. Individual refinements did not necessarily alter the calculated TER value in a way that was predictable or consistent across the two case studies. However, use of adjusted NOELs always reduced TERs, and relating spraying date to onset of reproduction increased most phase-specific TERs. The case studies suggested that the current first-tier TERlt assessment may underestimate risk in some circumstances and that phase-specific assessments can help identify appropriate risk-reduction measures. The way in which deterministic phase-specific assessments can currently be implemented to enhance first-tier assessment is outlined.
Abstract:
With many operational centers moving toward order 1-km-gridlength models for routine weather forecasting, this paper presents a systematic investigation of the properties of high-resolution versions of the Met Office Unified Model for short-range forecasting of convective rainfall events. The authors describe a suite of configurations of the Met Office Unified Model running with grid lengths of 12, 4, and 1 km and analyze results from these models for a number of convective cases from the summers of 2003, 2004, and 2005. The analysis includes subjective evaluation of the rainfall fields and comparisons of rainfall amounts, initiation, cell statistics, and a scale-selective verification technique. It is shown that the 4- and 1-km-gridlength models often give more realistic-looking precipitation fields because convection is represented explicitly rather than parameterized. However, the 4-km model representation suffers from large convective cells and delayed initiation because the grid length is too long to correctly reproduce the convection explicitly. These problems are not as evident in the 1-km model, although it does produce too many small cells in some situations. Both the 4- and 1-km models suffer from poor representation at the start of the forecast, in the period when the high-resolution detail is spinning up from the lower-resolution (12 km) starting data used. A scale-selective precipitation verification technique implies that for later times in the forecasts (after the spinup period) the 1-km model performs better than the 12- and 4-km models for lower rainfall thresholds. For higher thresholds the 4-km model scores almost as well as the 1-km model, and both do better than the 12-km model.
Abstract:
The development of NWP models with grid spacing down to 1 km should produce more realistic forecasts of convective storms. However, greater realism does not necessarily mean more accurate precipitation forecasts. The rapid growth of errors on small scales, in conjunction with preexisting errors on larger scales, may limit the usefulness of such models. The purpose of this paper is to examine whether improved model resolution alone is able to produce more skillful precipitation forecasts on useful scales, and how the skill varies with spatial scale. A verification method will be described in which skill is determined from a comparison of rainfall forecasts with radar using fractional coverage over different sized areas. The Met Office Unified Model was run with grid spacings of 12, 4, and 1 km for 10 days in which convection occurred during the summers of 2003 and 2004. All forecasts were run from 12-km initial states for a clean comparison. The results show that the 1-km model was the most skillful over all but the smallest scales (approximately <10–15 km). A measure of acceptable skill was defined; this was attained by the 1-km model at scales around 40–70 km, some 10–20 km less than that of the 12-km model. The biggest improvement occurred for heavier, more localized rain, despite it being more difficult to predict. The 4-km model did not improve much on the 12-km model because of the difficulties of representing convection at that resolution, which were accentuated by the spinup from 12-km fields.
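The fractional-coverage comparison described in this abstract amounts to comparing, over neighbourhoods of different sizes, the fraction of forecast and radar grid points exceeding a rain threshold, and deriving a skill score from the mismatch. A minimal Python sketch is shown below; the function names and the exact score normalisation are assumptions based on the description, not the paper's implementation, and the neighbourhood size n is assumed odd.

```python
def neighbourhood_fractions(binary, n):
    """Fraction of points exceeding the threshold in the n x n box
    centred on each grid point (boxes are clipped at the domain edge)."""
    rows, cols = len(binary), len(binary[0])
    half = n // 2
    frac = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            total = count = 0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        total += 1
                        count += binary[ii][jj]
            frac[i][j] = count / total
    return frac

def fractions_skill(fcst, obs, threshold, n):
    """Skill from the mean-square difference of fractional coverages,
    normalised so 1 is a perfect match and 0 is no overlap in coverage."""
    bf = [[1 if v >= threshold else 0 for v in row] for row in fcst]
    bo = [[1 if v >= threshold else 0 for v in row] for row in obs]
    pf = neighbourhood_fractions(bf, n)
    po = neighbourhood_fractions(bo, n)
    pairs = [(a, b) for ra, rb in zip(pf, po) for a, b in zip(ra, rb)]
    mse = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
    mse_ref = sum(a * a + b * b for a, b in pairs) / len(pairs)
    return 1.0 - mse / mse_ref if mse_ref > 0 else float("nan")
```

Evaluating the score at increasing n mimics the paper's question of the smallest scale at which a model attains acceptable skill: a displaced but otherwise correct rain feature scores poorly at small n and increasingly well as the neighbourhood grows to cover the displacement.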
Abstract:
The realistic representation of rainfall on the local scale in climate models remains a key challenge. Realism encompasses the full spatial and temporal structure of rainfall, and is a key indicator of model skill in representing the underlying processes. In particular, if rainfall is more realistic in a climate model, there is greater confidence in its projections of future change. In this study, the realism of rainfall in a very high-resolution (1.5 km) regional climate model (RCM) is compared to a coarser-resolution 12-km RCM. This is the first time a convection-permitting model has been run for an extended period (1989–2008) over a region of the United Kingdom, allowing the characteristics of rainfall to be evaluated in a climatological sense. In particular, the duration and spatial extent of hourly rainfall across the southern United Kingdom is examined, with a key focus on heavy rainfall. Rainfall in the 1.5-km RCM is found to be much more realistic than in the 12-km RCM. In the 12-km RCM, heavy rain events are not heavy enough, and tend to be too persistent and widespread. While the 1.5-km model does have a tendency for heavy rain to be too intense, it still gives a much better representation of its duration and spatial extent. Long-standing problems in climate models, such as the tendency for too much persistent light rain and errors in the diurnal cycle, are also considerably reduced in the 1.5-km RCM. Biases in the 12-km RCM appear to be linked to deficiencies in the representation of convection.
Abstract:
On 8 January 2005, the city of Carlisle in north-west England was severely flooded following 2 days of almost continuous rain over the nearby hills. Orographic enhancement of the rain through the seeder–feeder mechanism led to the very high rainfall totals. This paper shows the impact of running the Met Office Unified Model (UM) with a grid spacing of 4 and 1 km compared to the 12 km available at the time of the event. These forecasts, and forecasts from the Nimrod nowcasting system, were fed into the Probability Distributed Model (PDM) to predict river flow at the outlets of two catchments important for flood warning. The results show the benefit of increased resolution in the UM, the benefit of coupling the high-resolution rainfall forecasts to the PDM, and the improvement in timeliness of flood warning that might have been possible. Copyright © 2008 Royal Meteorological Society
Abstract:
Typeface design: a series of collaborative projects commissioned by Adobe, Inc. and Brill to develop extensive polytonic Greek typefaces. The two Adobe typefaces can be seen as an extension of previous research for the Garamond Premier Pro family (2005), and conclude a research theme started in 1998 with work for Adobe’s Minion Pro Greek. These typefaces together define the state of the art for text-intensive Greek typesetting for wide character set texts (from classical texts, to poetry, to essays, to prose). They serve both as exemplars for other developers, and as vehicles for developing the potential of Greek text typography, for example with the parallel inclusion of monotonic and polytonic characters, detailed localised punctuation options, fluid handling of case-conversion issues, and innovative options such as accented small caps (originally requested by bibliographers, and subsequently rolled out to a general user base). The Brill typeface (for the established academic publisher) has an exceptionally wide character set to cover several academic disciplines, and is intended to differentiate sufficiently from its partner Latin typeface, while maintaining a clear texture in both offset and low-resolution print-on-demand reproduction. This work involved substantial amounts of testing and modifying the design, especially of diacritics, to maintain the clarity and readability of unfamiliar words. Altogether, these typefaces form a study in how Greek typesetting meets contemporary typographic requirements, while resonating with historically accurate styles, where these are present. Significant research in printing archives helped to identify appropriate styles, as well as originate variants that are coherent stylistically, even when historical equivalents were absent.
Abstract:
The quasi-biennial oscillation (QBO) in the equatorial zonal wind is an outstanding phenomenon of the atmosphere. The QBO is driven by a broad spectrum of waves excited in the tropical troposphere and modulates transport and mixing of chemical compounds in the whole middle atmosphere. Therefore, the simulation of the QBO in general circulation models and chemistry climate models is an important issue. Here, aspects of the climatology and forcing of a spontaneously occurring QBO in a middle-atmosphere model are evaluated, and its influence on the climate and variability of the tropical middle atmosphere is investigated. Westerly and easterly phases are considered separately, and 40-yr ECMWF Re-Analysis (ERA-40) data are used as a reference where appropriate. It is found that the simulated QBO is realistic in many details. Resolved large-scale waves are particularly important for the westerly phase, while parameterized gravity wave drag is more important for the easterly phase. Advective zonal wind tendencies are important for asymmetries between westerly and easterly phases, as found for the suppression of the easterly phase downward propagation. The simulation of the QBO improves the tropical upwelling and the atmospheric tape recorder compared to a model without a QBO. The semiannual oscillation is simulated realistically only if the QBO is represented. In sensitivity tests, it is found that the simulated QBO is strongly sensitive to changes in the gravity wave sources. The sensitivity to the tested range of horizontal resolutions is small. The stratospheric vertical resolution must be better than 1 km to simulate a realistic QBO.
Abstract:
The Hamburg atmospheric general circulation model ECHAM3 at T106 resolution (1.125° lat./lon.) has considerable skill in reproducing the observed seasonal reversal of mean sea level pressure, the location of the summer heat low, and the position of the monsoon trough over the Indian subcontinent. The present-day climate and its seasonal cycle are realistically simulated by the model over this region. The model simulates the structure, intensity, frequency, movement and lifetime of monsoon depressions remarkably well. The number of monsoon depressions/storms simulated by the model in a year ranged from 5 to 12 with an average frequency of 8.4 yr−1, not significantly different from the observed climatology. The model also simulates the interannual variability in the formation of depressions over the north Bay of Bengal during the summer monsoon season. In the warmer atmosphere under doubled CO2 conditions, the number of monsoon depressions/cyclonic storms forming in Indian seas in a year ranged from 5 to 11 with an average frequency of 7.6 yr−1, not significantly different from that inferred in the control run of the model. However, under doubled CO2 conditions, fewer depressions formed in the month of June. Neither the lowest central pressure nor the maximum wind speed changes appreciably in monsoon depressions identified under simulated enhanced greenhouse conditions. The analysis suggests there will be no significant changes in the number and intensity of monsoon depressions in a warmer atmosphere.
Abstract:
As laid out in its convention, there are eight different objectives for ECMWF. One of the major objectives will consist of the preparation, on a regular basis, of the data necessary for the preparation of medium-range weather forecasts. The interpretation of this item is that the Centre will make forecasts once a day for a prediction period of up to 10 days. It is also evident that the Centre should not carry out any real weather forecasting but merely disseminate to the Member Countries the basic forecasting parameters with an appropriate resolution in space and time. It follows from this that the forecasting system at the Centre must, from the operational point of view, be functionally integrated with the Weather Services of the Member Countries. The operational interface between ECMWF and the Member Countries must be properly specified in order to provide reasonable flexibility for both systems. The problem of making numerical atmospheric predictions for periods beyond 4–5 days differs substantially from 2–3-day forecasting. From the physical point of view, we can define a medium-range forecast as one in which the initial disturbances have lost their individual structure. However, we are still interested in predicting the atmosphere in a similar way as in short-range forecasting, which means that the model must be able to predict the dissipation and decay of the initial phenomena and the creation of new ones. With this definition, medium-range forecasting is indeed very difficult and generally regarded as more difficult than extended forecasting, where we usually only predict time and space mean values. The predictability of atmospheric flow has been extensively studied in recent years in theoretical investigations and by numerical experiments. As has been discussed elsewhere in this publication (see pp 338 and 431), a 10-day forecast is apparently on the fringe of predictability.
Abstract:
Although commonplace in human disease genetics, genome-wide association (GWA) studies have only relatively recently been applied to plants. Using 32 phenotypes in the inbreeding crop barley, we report GWA mapping of 15 morphological traits across ∼500 cultivars genotyped with 1,536 SNPs. In contrast to the majority of human GWA studies, we observe high levels of linkage disequilibrium within and between chromosomes. Despite this, GWA analysis readily detected common alleles of high penetrance. To investigate the potential of combining GWA mapping with comparative analysis to resolve traits to candidate polymorphism level in unsequenced genomes, we fine-mapped a selected phenotype (anthocyanin pigmentation) within a 140-kb interval containing three genes. Of these, resequencing the putative anthocyanin pathway gene HvbHLH1 identified a deletion resulting in a premature stop codon upstream of the basic helix-loop-helix domain, which was diagnostic for lack of anthocyanin in our association and biparental mapping populations. The methodology described here is transferable to species with limited genomic resources, providing a paradigm for reducing the threshold of map-based cloning in unsequenced crops.
Abstract:
A record of dust deposition events between 2009 and 2012 on Mt. Elbrus, Caucasus Mountains, derived from a snow pit and a shallow ice core, is presented for the first time for this region. A combination of isotopic analysis, SEVIRI red-green-blue composite imagery, MODIS atmospheric optical depth fields derived using the Deep Blue algorithm, air mass trajectories derived using the HYSPLIT model and analysis of meteorological data enabled identification of dust source regions with high temporal (hours) and spatial (cf. 20–100 km) resolution. Seventeen dust deposition events were detected; fourteen occurred in March–June, one in February and two in October. Four events originated in the Sahara, predominantly in north-eastern Libya and eastern Algeria. Thirteen events originated in the Middle East, in the Syrian Desert and northern Mesopotamia, from a mixture of natural and anthropogenic sources. Dust transportation from the Sahara was associated with vigorous Saharan depressions, strong surface winds in the source region and mid-tropospheric south-westerly flow with daily wind speeds of 20–30 m s−1 at the 700 hPa level and, although these events were less frequent, they resulted in higher dust concentrations in snow. Dust transportation from the Middle East was associated with weaker depressions forming over the source region, high pressure centered over or extending towards the Caspian Sea and a weaker southerly or south-easterly flow towards the Caucasus Mountains with daily wind speeds of 12–18 m s−1 at the 700 hPa level. Higher concentrations of nitrates and ammonium characterise dust from the Middle East deposited on Mt. Elbrus in 2009, indicating a contribution from anthropogenic sources. The modal values of particle size distributions ranged between 1.98 μm and 4.16 μm. Most samples were characterised by modal values of 2.0–2.8 μm with an average of 2.6 μm, and there was no significant difference between dust from the Sahara and the Middle East.
Abstract:
A multi-proxy study of a Holocene sediment core (RF 93-30) from the western flank of the central Adriatic, in 77 m of water, reveals a sequence of changes in terrestrial vegetation, terrigenous sediment input and benthic fauna, as well as evidence for variations in sea surface temperature spanning most of the last 7000 yr. The chronology of sedimentation is based on several lines of evidence, including AMS 14C dates of foraminifera extracted from the core, palaeomagnetic secular variation, pollen indicators and dated tephra. The temporal resolution increases towards the surface and, for some of the properties measured, is sub-decadal for the last few centuries. The main changes recorded in vegetation, sedimentation and benthic foraminiferal assemblages appear to be directly related to human activity in the sediment source area, which includes the Po valley and the eastern flanks of the central and northern Apennines. The most striking episodes of deforestation and expanding human impact begin around 3600 BP (Late Bronze Age) and 700 BP (Medieval), and each leads to an acceleration in mass sedimentation and an increase in the proportion of terrigenous material, reflecting the response of surface processes to widespread forest clearance and cultivation. Although human impact appears to be the proximal cause of these changes, climatic effects may also have been important. During these periods, signs of stress are detectable in the benthic foram morphotype assemblages. Between these two periods of increased terrigenous sedimentation there is a smaller peak in sedimentation rate around 2400 BP which is not associated with evidence for deforestation, shifts in the balance between terrigenous and authigenic sedimentation, or changes in benthic foraminifera.
The mineral magnetic record provides a sensitive indicator of changing sediment sources: during forested periods of reduced terrigenous input it is dominated by authigenic bacterial magnetite, whereas during periods of increased erosion, antiferromagnetic minerals (haematite and/or goethite) become more important, as well as both paramagnetic minerals and super-paramagnetic magnetite. Analysis of the alkenone (U37k′) record provides an indication of possible changes in sea surface temperature during the period, but it is premature to place too much reliance on these inferred changes until the indirect effects of past changes in the depth of the halocline and in circulation have been more fully evaluated. The combination of methods used and the results obtained illustrate the potential value of such high resolution near-shore marine sedimentary sequences for recording wide-scale human impact, documenting the effects of this on marine sedimentation and fauna and, potentially, disentangling evidence for human activities from that for past changes in climate.