43 results for Source analysis


Relevance:

30.00%

Publisher:

Abstract:

Ozone and its precursors were measured on board the Facility for Airborne Atmospheric Measurements (FAAM) BAe 146 Atmospheric Research Aircraft during the monsoon season 2006 as part of the African Monsoon Multidisciplinary Analysis (AMMA) campaign. One of the main features observed in the West African boundary layer is the increase of the ozone mixing ratios from 25 ppbv over the forested area (south of 12 degrees N) up to 40 ppbv over the Sahelian area. We employ a two-dimensional (latitudinal versus vertical) meteorological model coupled with an O3-NOx-VOC chemistry scheme to simulate the distribution of trace gases over West Africa during the monsoon season and to analyse the processes involved in the establishment of such a gradient. Including an additional source of NO over the Sahelian region to account for NO emitted by soils, we simulate a mean NOx concentration of 0.7 ppbv at 16 degrees N versus 0.3 ppbv over the vegetated region further south, in reasonable agreement with the observations. As a consequence, ozone is photochemically produced at a rate of 0.25 ppbv h^-1 over the vegetated region, whilst the rate reaches up to 0.75 ppbv h^-1 at 16 degrees N. We find that the modelled gradient is due to a combination of enhanced deposition to vegetation, which decreases the ozone levels by up to 11 ppbv, and the aforementioned enhanced photochemical production north of 12 degrees N. The peroxy radicals required for this enhanced production in the north come from the oxidation of background CO and CH4 as well as from VOCs. Sensitivity studies reveal that both the background CH4 and partially oxidised VOCs, produced from the oxidation of isoprene emitted from the vegetation in the south, contribute around 5-6 ppbv to the ozone gradient. These results suggest that the northward transport of trace gases by the monsoon flux, especially during nighttime, can have a significant, though secondary, role in determining the ozone gradient in the boundary layer. Convection, anthropogenic emissions and NO produced from lightning do not contribute to the establishment of the discussed ozone gradient.
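
To make the magnitudes above concrete, here is a back-of-envelope Python sketch combining the production and deposition terms quoted in this abstract; the 10-hour daylight window is an illustrative assumption, not a value from the paper, which resolves the diurnal cycle with its 2-D model.

```python
# Back-of-envelope comparison of the boundary-layer O3 budget terms quoted
# in the abstract. The 10-hour photochemically active window is an assumed,
# illustrative value only.

prod_sahel = 0.75     # ppbv h^-1, photochemical production near 16 deg N
prod_forest = 0.25    # ppbv h^-1, photochemical production over vegetation
dep_forest = 11.0     # ppbv, reduction from enhanced deposition to vegetation

daylight_h = 10       # assumed hours of active photochemistry per day

extra_daily_production = (prod_sahel - prod_forest) * daylight_h
print(f"Extra net production in the north: {extra_daily_production:.1f} ppbv per day")
print(f"Deposition term lowering southern O3: {dep_forest:.1f} ppbv")
```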

Relevance:

30.00%

Publisher:

Abstract:

We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant.
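
As a sketch of the "proportion of model runs" risk metric described above, the following Python snippet (with entirely hypothetical ensemble data, not the study's DGVM output) groups runs into the same three warming bands and maps the fraction of runs exceeding a threshold per grid cell.

```python
import numpy as np

# Hypothetical ensemble: runs x grid cells of, e.g., change in wildfire
# frequency expressed in multiples of natural variability (illustrative data).
rng = np.random.default_rng(0)
n_runs, n_cells = 20, 1000
change = rng.normal(loc=0.5, scale=1.0, size=(n_runs, n_cells))

# Global warming simulated by each run, used to group runs into bands.
warming = rng.uniform(1.0, 4.5, size=n_runs)
bands = {"<2C": warming < 2, "2-3C": (warming >= 2) & (warming <= 3), ">3C": warming > 3}

threshold = 1.0  # exceedance of natural variability (illustrative)
for name, mask in bands.items():
    if mask.sum() == 0:
        continue
    # Fraction of runs in this warming band exceeding the threshold, per cell.
    risk = (change[mask] > threshold).mean(axis=0)
    print(f"{name}: mean risk across cells = {risk.mean():.2f}")
```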

Relevance:

30.00%

Publisher:

Abstract:

A complete treatment of the characteristic admittance-matrix representation of multilayer thin-film systems with absorbing media is presented. The algorithm from the system analysis is implemented on an IBM microcomputer, and some examples of filter-design calculations are presented. The relevant source code, written for the IBM Advanced BASIC interpreter, is also included.
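
The paper's listing is in IBM Advanced BASIC and is not reproduced here; as a rough modern illustration, the following Python sketch implements the standard characteristic-matrix (admittance) calculation for a stack of absorbing layers at normal incidence, with a hypothetical single quarter-wave layer as an example.

```python
import numpy as np

def multilayer_reflectance(n_layers, d_layers, n_inc, n_sub, wavelength):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic (admittance) matrix method. Complex refractive
    indices n - ik are allowed, so absorbing media are handled.
    Layers are listed from the incident-medium side towards the substrate."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength      # phase thickness
        eta = n                                     # optical admittance (free-space units)
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / eta],
                          [1j * eta * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_sub], dtype=complex)
    Y = C / B                                       # input admittance of the assembly
    r = (n_inc - Y) / (n_inc + Y)
    return abs(r) ** 2

# Illustrative example: quarter-wave layer (n = 1.38) on glass (n = 1.52) at 550 nm.
wl = 550e-9
print(multilayer_reflectance([1.38 + 0j], [wl / (4 * 1.38)], 1.0, 1.52, wl))
```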

Relevance:

30.00%

Publisher:

Abstract:

A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified, reflecting differing performance in different situations. Our aim was to quantify uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After covarying out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m2 unit area) but also highly uncertain. Peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, the Brecon Beacons and the western Lake District.
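
As an illustration of propagating niche-model parameter uncertainty into predictions, the following Python sketch uses synthetic data and a plain GLM (statsmodels) rather than the paper's GLMM/GAMM: coefficients are drawn from the fitted parameter covariance and the spread of the resulting cover predictions is examined.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic example: binary cover response to a single climate driver.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))
y = rng.binomial(1, p)

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()

# Propagate parameter uncertainty: sample coefficient vectors from their
# estimated covariance and predict cover at a new (scenario) climate value.
draws = rng.multivariate_normal(fit.params, fit.cov_params(), size=2000)
x_new = np.array([1.0, 0.8])          # intercept term + scenario climate value
pred = 1 / (1 + np.exp(-(draws @ x_new)))
print(f"mean predicted cover {pred.mean():.2f}, parameter-uncertainty sd {pred.std():.2f}")
```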

Relevance:

30.00%

Publisher:

Abstract:

It took the solar polar passage of Ulysses in the early 1990s to establish the global structure of the solar wind speed during solar minimum. However, it remains unclear whether the solar wind is composed of two distinct populations of solar wind from different sources (e.g., closed loops which open up to produce the slow solar wind) or whether the fast and slow solar wind rely on the superradial expansion of the magnetic field to account for the observed solar wind speed variation. We investigate the solar wind in the inner corona using the Wang-Sheeley-Arge (WSA) coronal model incorporating a new empirical magnetic topology–velocity relationship calibrated for use at 0.1 AU. In this study the empirical solar wind speed relationship was determined using Helios perihelion observations, along with results from Riley et al. (2003) and Schwadron et al. (2005) as constraints. The new relationship was tested by using it to drive the ENLIL 3-D MHD solar wind model and obtain solar wind parameters at Earth (1.0 AU) and Ulysses (1.4 AU). The improvements in speed, its variability, and the occurrence of high-speed enhancements provide confidence that the new velocity relationship better determines the solar wind speed in the outer corona (0.1 AU). An analysis of this improved velocity field within the WSA model suggests the existence of two distinct mechanisms of solar wind generation, one for the fast and one for the slow solar wind, implying that a combination of present theories may be necessary to explain solar wind observations.
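
For orientation, the following Python sketch shows the general shape of a WSA-type empirical speed relationship, in which speed falls with the flux-tube expansion factor and rises with angular distance from the coronal-hole boundary; the coefficients here are purely illustrative placeholders, not the calibrated values derived in this study.

```python
import numpy as np

def wsa_like_speed(fs, theta_b, v_slow=285.0, v_fast=625.0,
                   alpha=2.0 / 9.0, w=2.0, beta=1.25, c=0.8):
    """Illustrative WSA-type relation: solar wind speed (km/s) near 0.1 AU as a
    function of the flux-tube expansion factor fs and the angular distance
    theta_b (degrees) of the field-line footpoint from the nearest
    coronal-hole boundary. All coefficients are placeholders, not the
    calibrated values from the study."""
    return v_slow + (v_fast / (1.0 + fs) ** alpha) * \
        (1.0 - c * np.exp(-(theta_b / w) ** beta)) ** 3

# Open field line deep inside a coronal hole vs. one near the boundary.
print(wsa_like_speed(fs=3.0, theta_b=10.0))   # fast-wind-like speed
print(wsa_like_speed(fs=50.0, theta_b=1.0))   # slow-wind-like speed
```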

Relevance:

30.00%

Publisher:

Abstract:

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes the paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
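
A minimal Python sketch of the primary method's core idea, assuming NLTK's WordNet as the thesaurus (an assumption for illustration, not necessarily the resource used in the paper): take n-grams from the document, look up synonyms of their words, and rank candidate phrases by how often their synonym sets recur.

```python
from collections import Counter

import nltk
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)

def candidate_keyphrases(text, n=2, top_k=5):
    """Score word n-grams by how often the synonyms of their words recur
    elsewhere in the document - a rough proxy for 'underlying themes'."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    # Map each word to its synonym set (including the word itself).
    syns = {w: {l.name().lower() for s in wordnet.synsets(w) for l in s.lemmas()} | {w}
            for w in set(words)}
    theme_counts = Counter(t for word in words for t in syns[word])
    scores = Counter()
    for gram in zip(*(words[i:] for i in range(n))):
        scores[" ".join(gram)] += sum(theme_counts[t] for w in gram for t in syns[w])
    return [phrase for phrase, _ in scores.most_common(top_k)]

doc = ("knowledge discovery in databases concerns extracting patterns from data "
       "data mining finds patterns and knowledge hidden in large databases")
print(candidate_keyphrases(doc))
```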

Relevance:

30.00%

Publisher:

Abstract:

We have carried out a thorough mineralogical analysis of 16 pottery samples from the Lapita site of Bourwera in Fiji, using micromorphological techniques with optical and polarising microscopes. While the overall mineralogy of all of the samples is similar, they clearly divide into two groups: those with and those without the mineral calcite. Our findings are backed up by chemical analysis using SEM–EDX and FTIR. SEM–EDX shows the clear presence of inclusions of calcite in some of the samples; FTIR shows bands arising from calcite in these samples. The study suggests that it is likely that more than one clay source was used for production of this pottery, but that most of the pottery comes from a single source. This finding is in line with previous studies which suggest some trading of pottery between the Fijian islands but a single source of clay for most of the pottery found at Bouwera. We found no evidence for the destruction of CaCO3 by heating during production of the pottery, in line with the known technology of the Lapita people, who produced earthenware pottery but not high-temperature ceramics.

Relevance:

30.00%

Publisher:

Abstract:

In basic network transactions, a datagram travelling from source to destination is routed through numerous routers and paths, depending on which paths are free and uncongested; this can make the transmission route excessively long, incurring greater delay, jitter and congestion and reducing throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter, which arises from queuing delay and depends on the applied loading conditions. The effects of delay, of jitter accumulation over the nodes along a transmission route and of dropped packets add further complexity for multimedia traffic, because there is no guarantee that each traffic stream will be delivered within its own jitter constraints; there is therefore a need to analyze the effects of jitter. IP routers use a single path for the transmission of all packets. Multi-Protocol Label Switching (MPLS), on the other hand, separates packet forwarding from routing characteristics, enabling packets to use appropriate routes and allowing the behavior of transmission paths to be optimized and controlled, thus correcting some of the shortfalls associated with IP routing. MPLS is therefore used in this analysis of effective transmission through the various networks. This paper analyzes the effects of delay, congestion, interference, jitter and packet loss in the transmission of signals from source to destination. The impact of link failures and repair paths in various physical topologies, namely bus, star, mesh and hybrid topologies, is also analyzed under standard network conditions.
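
As an illustration of quantifying jitter from packet timestamps, the following Python sketch applies the RFC 3550 running interarrival-jitter estimator to hypothetical send/receive times; it is a generic example, not the simulation tool used in this paper.

```python
def rfc3550_jitter(send_times, recv_times):
    """Running interarrival jitter estimate J from RFC 3550:
    D(i-1, i) = (R_i - R_{i-1}) - (S_i - S_{i-1}),
    J = J + (|D| - J) / 16, applied packet by packet."""
    jitter = 0.0
    for i in range(1, len(send_times)):
        transit_delta = (recv_times[i] - recv_times[i - 1]) - (send_times[i] - send_times[i - 1])
        jitter += (abs(transit_delta) - jitter) / 16.0
    return jitter

# Illustrative packets sent every 20 ms with variable queuing delay (seconds).
send = [i * 0.020 for i in range(8)]
queue = [0.005, 0.007, 0.004, 0.012, 0.006, 0.009, 0.005, 0.015]
recv = [s + q for s, q in zip(send, queue)]
print(f"Estimated interarrival jitter: {rfc3550_jitter(send, recv) * 1000:.2f} ms")
```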

Relevance:

30.00%

Publisher:

Abstract:

The technique of relaxation of the tropical atmosphere towards an analysis in a month-season forecast model has previously been successfully exploited in a number of contexts. Here it is shown that when tropical relaxation is used to investigate the possible origin of the observed anomalies in June–July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on the time-scales of relaxation and the variables and regions relaxed, based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days. For temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However, not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate the Rossby wave propagation from there. It is sufficient to relax vorticity or temperature on a time-scale comparable to or shorter than their critical time-scales to obtain this. However, if the divergent advection of vorticity is important in the Rossby wave source, then strong relaxation of divergence is required to accurately represent the tropical forcing of Rossby waves.
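
The relaxation (nudging) idea behind these experiments can be sketched as follows in Python: a variable is pulled towards an analysis value on a time-scale tau, and how closely it follows the analysis depends on how tau compares with the variable's own damping time-scale. The damping time-scale and the other numbers below are illustrative only.

```python
def nudged_variable(tau_relax, tau_dyn=1.2 * 24.0, x_analysis=1.0,
                    dt=0.25, n_steps=480):
    """Integrate dx/dt = -x/tau_dyn - (x - x_analysis)/tau_relax (time in hours).
    tau_dyn stands in for the variable's own adjustment time-scale
    (e.g. roughly 1.2 days for vorticity in the paper's scale analysis)."""
    x = 0.0
    for _ in range(n_steps):
        x += dt * (-x / tau_dyn - (x - x_analysis) / tau_relax)
    return x

# Relaxation time-scales (hours): shorter than, comparable to, and much longer
# than the assumed dynamical time-scale.
for tau in (3.0, 10.0, 1.2 * 24.0, 5.0 * 24.0):
    print(f"tau_relax = {tau:5.1f} h -> settles near {nudged_variable(tau):.2f} of the analysis value")
```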

Relevance:

30.00%

Publisher:

Abstract:

Mannitol is a polymorphic excipient which is usually used in pharmaceutical products as the beta form, although other polymorphs (alpha and delta) are common contaminants. Binary mixtures containing beta and delta mannitol were prepared to quantify the concentration of the beta form using FT-Raman spectroscopy. Spectral regions characteristic of each form were selected and peak intensity ratios of beta peaks to delta peaks were calculated. Using these ratios, a correlation curve was established, which was then validated by analysing further samples of known composition. The results indicate that levels down to 2% beta could be quantified using this novel, non-destructive approach. Potential errors associated with quantitative studies using FT-Raman spectroscopy were also investigated. The principal source of variability arose from inhomogeneities in the mixing of the samples; a significant reduction of these errors was achieved by reducing and controlling the particle size range. The results show that FT-Raman spectroscopy can be used to rapidly and accurately quantify polymorphic mixtures.
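
A minimal Python sketch of the calibration step described above, with made-up peak intensity ratios rather than the paper's data: fit a correlation curve of beta/delta peak ratio against known beta content, then invert it for an unknown mixture.

```python
import numpy as np

# Known binary mixtures: % beta mannitol and the measured beta/delta
# peak intensity ratio (illustrative numbers, not the paper's data).
beta_percent = np.array([2, 10, 25, 50, 75, 90, 98], dtype=float)
ratio = np.array([0.05, 0.22, 0.55, 1.10, 1.70, 2.05, 2.28])

# Linear correlation curve: ratio = a * (% beta) + b.
a, b = np.polyfit(beta_percent, ratio, 1)

def beta_content(measured_ratio):
    """Invert the calibration to estimate % beta in an unknown sample."""
    return (measured_ratio - b) / a

print(f"slope {a:.4f}, intercept {b:.3f}")
print(f"sample with ratio 0.80 -> {beta_content(0.80):.1f}% beta")
```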

Relevance:

30.00%

Publisher:

Abstract:

Mannitol is a polymorphic pharmaceutical excipient which commonly exists in three forms: alpha, beta and delta. Each polymorph has a needle-like morphology, which can give preferred orientation effects when analysed by X-ray powder diffractometry (XRPD), thus presenting difficulties for quantitative XRPD assessments. The occurrence of preferred orientation may be demonstrated by sample rotation, and the consequent effects on X-ray data can be minimised by reducing the particle size. Using two particle size ranges (less than 125 microns and 125–500 microns), binary mixtures of beta and delta mannitol were prepared and the delta component was quantified. Samples were assayed in either a static or a rotating sampling accessory. Rotation and reducing the particle size range to less than 125 microns halved the limits of detection and quantitation to 1% and 3.6%, respectively. Numerous potential sources of assay error were investigated; sample packing and mixing errors contributed the greatest variation. However, rotation of samples for both particle size ranges reduced the majority of the assay errors examined. This study shows that coupling sample rotation with a particle size reduction minimises preferred orientation effects on assay accuracy, allowing discrimination of two very similar polymorphs at around the 1% level.
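
For illustration, the following Python sketch shows one common way to derive limits of detection and quantitation from a calibration line (LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope); the calibration data are hypothetical, not the study's XRPD measurements.

```python
import numpy as np

# Illustrative calibration: % delta mannitol vs. XRPD peak response.
delta_percent = np.array([1, 2, 5, 10, 20, 40], dtype=float)
response = np.array([0.9, 2.1, 5.2, 9.8, 20.4, 39.5])

slope, intercept = np.polyfit(delta_percent, response, 1)
residual_sd = np.std(response - (slope * delta_percent + intercept), ddof=2)

lod = 3.3 * residual_sd / slope   # limit of detection
loq = 10.0 * residual_sd / slope  # limit of quantitation
print(f"LOD ~ {lod:.1f}% delta, LOQ ~ {loq:.1f}% delta")
```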

Relevance:

30.00%

Publisher:

Abstract:

Following a malicious or accidental atmospheric release in an outdoor environment, it is essential for first responders to ensure safety by identifying areas where human life may be in danger. For this to happen quickly, reliable information is needed on the source strength and location, and the type of chemical agent released. We present here an inverse modelling technique that estimates the source strength and location of such a release, together with the uncertainty in those estimates, using a limited number of measurements of concentration from a network of chemical sensors, considering a single, steady, ground-level source. The technique is evaluated using data from a set of dispersion experiments conducted in a meteorological wind tunnel, where simultaneous measurements of concentration time series were obtained in the plume from a ground-level point-source emission of a passive tracer. In particular, we analyze the response to the number of sensors deployed and their arrangement, and to sampling and model errors. We find that the inverse algorithm can generate acceptable estimates of the source characteristics with as few as four sensors, provided these are well placed and the sampling error is controlled. Configurations with at least three sensors in a profile across the plume were found to be superior to other arrangements examined. Analysis of the influence of sampling error due to the use of short averaging times showed that the uncertainty in the source estimates grew as the sampling time decreased, demonstrating that averaging times greater than about 5 min (full-scale time) lead to acceptable accuracy.
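
The inversion idea can be sketched as follows: a forward dispersion model predicts sensor concentrations for a candidate source, and the source location and strength are chosen to best match the measurements. The Python snippet below uses a simple ground-level Gaussian plume and a grid search as stand-ins for the paper's forward model and uncertainty analysis; all values are illustrative.

```python
import numpy as np

def gaussian_plume(x, y, q, xs, ys, u=2.0, sigma_coeff=0.1):
    """Very simple ground-level Gaussian plume (arbitrary units):
    concentration at (x, y) from a source of strength q at (xs, ys),
    with the wind blowing in +x at speed u."""
    dx, dy = x - xs, y - ys
    dx = np.where(dx <= 0, np.nan, dx)           # no concentration upwind
    sigma = sigma_coeff * dx
    c = q / (2 * np.pi * u * sigma**2) * np.exp(-dy**2 / (2 * sigma**2))
    return np.nan_to_num(c)

# "True" source and four sensor positions (illustrative).
q_true, xs_true, ys_true = 5.0, 0.0, 0.0
sx = np.array([50.0, 50.0, 100.0, 100.0])
sy = np.array([-5.0, 5.0, -10.0, 10.0])
obs = gaussian_plume(sx, sy, q_true, xs_true, ys_true)

# Grid search over candidate source locations; strength by least squares.
best = None
for xs in np.linspace(-20, 20, 41):
    for ys in np.linspace(-20, 20, 41):
        unit = gaussian_plume(sx, sy, 1.0, xs, ys)
        if not np.any(unit):
            continue
        q = float(unit @ obs / (unit @ unit))    # least-squares source strength
        cost = float(np.sum((obs - q * unit) ** 2))
        if best is None or cost < best[0]:
            best = (cost, xs, ys, q)
print("estimated (xs, ys, q):", best[1:], " true:", (xs_true, ys_true, q_true))
```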

Relevance:

30.00%

Publisher:

Abstract:

Rocket is a leafy brassicaceous salad crop that encompasses two major genera (Diplotaxis and Eruca) and many different cultivars. Rocket is a rich source of antioxidants and glucosinolates, many of which are produced as secondary products by the plant in response to stress. In this paper we examined the impact of temperature and light stress on several different cultivars of wild and salad rocket. Growth habit of the plants varied in response to stress and with different genotypes, reflecting the wide geographical distribution of the plant and the different environments to which the genera have naturally adapted. Preharvest environmental stress and genotype also had an impact on how well the cultivar was able to resist postharvest senescence, indicating that breeding or selection of senescence-resistant genotypes will be possible in the future. The abundance of key phytonutrients such as carotenoids and glucosinolates are also under genetic control. As genetic resources improve for rocket it will therefore be possible to develop a molecular breeding programme specifically targeted at improving stress resistance and nutritional levels of plant secondary products. Concomitantly, it has been shown in this paper that controlled levels of abiotic stress can potentially improve the levels of chlorophyll, carotenoids and antioxidant activity in this leafy vegetable.

Relevance:

30.00%

Publisher:

Abstract:

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes the paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple strands in the timeline. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform; neither the age of the thesaurus nor the size of each entry correlates with performance.
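
Complementing the sketch given earlier for the primary method, the following Python snippet illustrates the secondary idea of grouping outputs by their synonyms, again assuming NLTK's WordNet as a stand-in for the thesauri compared in the paper: candidate keyphrases whose synonym sets overlap are merged into one theme.

```python
import nltk
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)

def synonym_signature(phrase):
    """A 'theme' signature for a phrase: its words plus their synonym lemmas."""
    words = phrase.lower().split()
    sig = set(words)
    for w in words:
        sig |= {l.name().lower() for s in wordnet.synsets(w) for l in s.lemmas()}
    return frozenset(sig)

def group_by_synonyms(candidates):
    """Merge candidate keyphrases whose synonym signatures overlap."""
    groups = []   # list of (signature, [phrases])
    for phrase in candidates:
        sig = synonym_signature(phrase)
        for i, (gsig, members) in enumerate(groups):
            if sig & gsig:
                groups[i] = (gsig | sig, members + [phrase])
                break
        else:
            groups.append((sig, [phrase]))
    return [members for _, members in groups]

print(group_by_synonyms(["data mining", "information mining", "leafy vegetable"]))
```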

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the development of an export coefficient model to characterise the rates and sources of P export from land to water in four reservoir systems located in a semi-arid rural region of southern Portugal. The model was developed to enable effective management of these important water resource systems under the EU Water Framework Directive. This is the first time such an approach has been fully adapted for the semi-arid systems typical of Mediterranean Europe. The sources of P loading delivered to each reservoir from its catchment were determined, and scenario analysis was undertaken to predict the likely impact of catchment management strategies on the scale and rate of P loading delivered to each water body from its catchment. The results indicate the importance of farming and of sewage treatment works/collective septic tank discharges as the main contributors to the total diffuse and point-source P loading delivered to the reservoirs, respectively. A reduction in the total P loading for all study areas would require control of farming practices and more efficient removal of P from human wastes prior to discharge to surface waters. The scenario analysis indicates that a strategy based solely on reducing the agricultural P surplus may result in only a slow improvement in water quality, which would be unlikely to support the generation of good ecological status in the reservoirs. The model application indicates that a reduction of P inputs to the reservoirs should first focus on reducing P loading from sewage effluent discharges through the introduction of tertiary treatment (P-stripping) in all major residential areas. The fully calibrated export coefficient modelling approach transferred well to semi-arid regions, with the only significant limitation being the availability of suitable input data to drive the model. Further studies using this approach in semi-arid catchments are now needed to increase knowledge of nutrient export behaviour in semi-arid regions.
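
The export coefficient approach can be sketched as a simple sum: total P load = sum over land uses of (export coefficient x area) plus point-source inputs, which makes scenario analysis a matter of changing coefficients or inputs. The Python example below uses illustrative coefficients and areas, not the calibrated values for these Portuguese catchments.

```python
# Export coefficient model sketch: L = sum_i(E_i * A_i) + point sources.
# All numbers are illustrative placeholders.

export_coeff = {          # kg P per ha per year for each land use
    "arable": 0.65,
    "pasture": 0.30,
    "forest": 0.02,
}
area_ha = {               # catchment area under each land use (ha)
    "arable": 12_000,
    "pasture": 8_000,
    "forest": 20_000,
}
population = 15_000
per_capita_p = 0.9        # kg P per person per year reaching surface water

def total_p_load(removal_fraction):
    """Total annual P load for a given fraction of P removed at sewage works."""
    diffuse = sum(export_coeff[lu] * area_ha[lu] for lu in export_coeff)
    point = population * per_capita_p * (1.0 - removal_fraction)
    return diffuse + point

baseline = total_p_load(0.0)
tertiary = total_p_load(0.9)          # scenario: P-stripping at sewage works
print(f"baseline load: {baseline:,.0f} kg P/yr; with tertiary treatment: {tertiary:,.0f} kg P/yr")
```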