62 results for source code analysis


Relevance: 30.00%

Abstract:

A poor representation of cloud structure in a general circulation model (GCM) is widely recognised as a potential source of error in the radiation budget. Here, we develop a new way of representing both horizontal and vertical cloud structure in a radiation scheme. This combines the ‘Tripleclouds’ parametrization, which introduces inhomogeneity by using two cloudy regions in each layer as opposed to one, each with different water content values, with ‘exponential-random’ overlap, in which clouds in adjacent layers are not overlapped maximally, but according to a vertical decorrelation scale. This paper, Part I of two, aims to parametrize the two effects such that they can be used in a GCM. To achieve this, we first review a number of studies for a globally applicable value of fractional standard deviation of water content for use in Tripleclouds. We obtain a value of 0.75 ± 0.18 from a variety of different types of observations, with no apparent dependence on cloud type or gridbox size. Then, through a second short review, we create a parametrization of decorrelation scale for use in exponential-random overlap, which varies the scale linearly with latitude from 2.9 km at the Equator to 0.4 km at the poles. When applied to radar data, both components are found to have radiative impacts capable of offsetting biases caused by cloud misrepresentation. Part II of this paper implements Tripleclouds and exponential-random overlap into a radiation code and examines both their individual and combined impacts on the global radiation budget using re-analysis data.
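
A minimal sketch of the decorrelation-scale parametrization quoted above: the scale varies linearly with latitude between the two endpoints given (2.9 km at the Equator, 0.4 km at the poles). The function name and the use of absolute latitude for hemispheric symmetry are illustrative assumptions, not taken from the paper.

import numpy as np

def decorrelation_scale_km(latitude_deg):
    # Varies linearly from 2.9 km at the Equator to 0.4 km at the poles
    # (endpoint values from the abstract; |latitude| symmetry is assumed).
    frac = np.abs(latitude_deg) / 90.0   # 0 at the Equator, 1 at the poles
    return 2.9 + (0.4 - 2.9) * frac

print(decorrelation_scale_km(0.0))    # 2.9
print(decorrelation_scale_km(45.0))   # 1.65
print(decorrelation_scale_km(90.0))   # 0.4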

Relevance: 30.00%

Abstract:

We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from 16 climate models and mapping the proportions of model runs showing forest/nonforest shifts or exceedance of natural variability in wildfire frequency and freshwater supply. Our analysis does not assign probabilities to scenarios or weights to models. Instead, we consider distribution of outcomes within three sets of model runs grouped by the amount of global warming they simulate: <2°C (including simulations in which atmospheric composition is held constant, i.e., in which the only climate change is due to greenhouse gases already emitted), 2–3°C, and >3°C. High risk of forest loss is shown for Eurasia, eastern China, Canada, Central America, and Amazonia, with forest extensions into the Arctic and semiarid savannas; more frequent wildfire in Amazonia, the far north, and many semiarid regions; more runoff north of 50°N and in tropical Africa and northwestern South America; and less runoff in West Africa, Central America, southern Europe, and the eastern U.S. Substantially larger areas are affected for global warming >3°C than for <2°C; some features appear only at higher warming levels. A land carbon sink of ≈1 Pg of C per yr is simulated for the late 20th century, but for >3°C this sink converts to a carbon source during the 21st century (implying a positive climate feedback) in 44% of cases. The risks continue increasing over the following 200 years, even with atmospheric composition held constant.
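
The risk-mapping step described above can be sketched as a small computation: group the model runs by their simulated global warming, then, for each grid cell, take the proportion of runs in a band that show the shift or exceedance. The run counts and the synthetic exceedance flags below are placeholders for illustration, not the study's data.

import numpy as np

rng = np.random.default_rng(0)
n_runs, n_cells = 48, 10_000
warming = rng.uniform(1.0, 4.5, n_runs)        # simulated global-mean warming per run (placeholder)
shift = rng.random((n_runs, n_cells)) < 0.2    # placeholder forest/nonforest shift flags

bands = {"<2C": warming < 2.0,
         "2-3C": (warming >= 2.0) & (warming < 3.0),
         ">3C": warming >= 3.0}

for name, mask in bands.items():
    # Risk per cell: fraction of the band's runs showing the shift there.
    risk = shift[mask].mean(axis=0)
    print(name, "runs:", int(mask.sum()), "mean risk: %.2f" % risk.mean())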

Relevance: 30.00%

Abstract:

This paper analyzes the convergence behavior of the least mean square (LMS) filter when used in an adaptive code division multiple access (CDMA) detector consisting of a tapped delay line with adjustable tap weights. The sampling rate may be equal to or higher than the chip rate; these cases correspond to chip-spaced (CS) and fractionally spaced (FS) detection, respectively. It is shown that CS and FS detectors with the same time-span exhibit identical convergence behavior if the baseband received signal is strictly bandlimited to half the chip rate. Even in the practical case when this condition is not met, deviations from this observation are imperceptible unless the initial tap-weight vector gives an extremely large mean squared error (MSE). This phenomenon is carefully explained with reference to the eigenvalues of the correlation matrix when the input signal is not perfectly bandlimited. The inadequacy of the eigenvalue spread of the tap-input correlation matrix as an indicator of the transient behavior, and the influence of the initial tap-weight vector on convergence speed, are highlighted. Specifically, initialization within the signal subspace or at the origin leads to much faster convergence than initialization in the noise subspace.
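
The LMS recursion under study is the standard stochastic-gradient tap-weight update; the toy loop below shows the kind of initialization effect the abstract describes, with a generic system-identification setup standing in for the CDMA detector (the signal model, step size and tap count are all illustrative placeholders).

import numpy as np

rng = np.random.default_rng(1)
N, taps, mu = 5000, 8, 0.01
x = rng.standard_normal(N)                       # tap input (placeholder, not CDMA chips)
h_true = rng.standard_normal(taps)               # unknown response to identify
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)

def lms(w0):
    w, mse = w0.copy(), []
    for n in range(taps - 1, N):
        u = x[n - taps + 1:n + 1][::-1]          # tap-input vector x[n], ..., x[n-taps+1]
        e = d[n] - w @ u                         # a priori error
        w += mu * e * u                          # LMS tap-weight update
        mse.append(e * e)
    return np.array(mse)

mse_origin = lms(np.zeros(taps))                 # initialization at the origin
mse_large = lms(100.0 * rng.standard_normal(taps))  # very large initial MSE
print(mse_origin[:200].mean(), mse_large[:200].mean())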

Relevance: 30.00%

Abstract:

Adaptive filters used in code division multiple access (CDMA) receivers to counter interference have been formulated both with and without the assumption of training symbols being transmitted. They are known as training-based and blind detectors respectively. We show that the convergence behaviour of the blind minimum-output-energy (MOE) detector can be quite easily derived, unlike what was implied by the procedure outlined in a previous paper. The simplification results from the observation that the correlation matrix determining convergence performance can be made symmetric, after which many standard results from the literature on least mean square (LMS) filters apply immediately.

Relevance: 30.00%

Abstract:

A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified, reflecting differing performance in different situations. Our aim was to quantify uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After covarying out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m² unit area) but also highly uncertain. The peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, the Brecon Beacons and the western Lake District.
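
A toy Monte Carlo illustration of the kind of variance partitioning reported above: draw niche-model parameters from their estimated covariance, predict cover at each site, and compare the variance attributable to parameter uncertainty with the variation across sites. The coefficients, covariance and covariates below are invented placeholders; this is only the propagation step, not a GLMM or GAMM fit.

import numpy as np

rng = np.random.default_rng(2)
n_sites, n_draws = 200, 1000
X = np.column_stack([np.ones(n_sites), rng.standard_normal((n_sites, 2))])  # placeholder covariates

beta_hat = np.array([0.5, 0.3, -0.2])        # fitted coefficients (placeholder)
cov_beta = 0.05 * np.eye(3)                  # parameter covariance (placeholder)

betas = rng.multivariate_normal(beta_hat, cov_beta, n_draws)
pred = betas @ X.T                           # predictions: (n_draws, n_sites)

var_param = pred.var(axis=0).mean()          # spread over parameter draws, averaged over sites
var_sites = pred.mean(axis=0).var()          # spread of mean predictions across sites
total = var_param + var_sites
print("parameter share: %.0f%%, site share: %.0f%%"
      % (100 * var_param / total, 100 * var_sites / total))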

Relevance: 30.00%

Abstract:

It took the solar polar passage of Ulysses in the early 1990s to establish the global structure of the solar wind speed during solar minimum. However, it remains unclear if the solar wind is composed of two distinct populations of solar wind from different sources (e.g., closed loops which open up to produce the slow solar wind) or if the fast and slow solar wind rely on the superradial expansion of the magnetic field to account for the observed solar wind speed variation. We investigate the solar wind in the inner corona using the Wang-Sheeley-Arge (WSA) coronal model incorporating a new empirical magnetic topology–velocity relationship calibrated for use at 0.1 AU. In this study the empirical solar wind speed relationship was determined by using Helios perihelion observations, along with results from Riley et al. (2003) and Schwadron et al. (2005) as constraints. The new relationship was tested by using it to drive the ENLIL 3-D MHD solar wind model and obtain solar wind parameters at Earth (1.0 AU) and Ulysses (1.4 AU). The improvements in speed, its variability, and the occurrence of high-speed enhancements provide confidence that the new velocity relationship better determines the solar wind speed in the outer corona (0.1 AU). An analysis of this improved velocity field within the WSA model suggests the existence of two distinct mechanisms of the solar wind generation, one for fast and one for slow solar wind, implying that a combination of present theories may be necessary to explain solar wind observations.
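
Empirical relationships of this family typically map two magnetic-topology quantities, the flux-tube expansion factor and the footpoint distance from the nearest coronal-hole boundary, to a wind speed. The sketch below shows a generic WSA-type functional form only; the abstract does not give the calibrated coefficients, so every number here is an illustrative placeholder rather than the paper's relationship.

import numpy as np

def wsa_type_speed(fs, theta_b,
                   v0=250.0, v1=650.0, a=2.0 / 9.0,
                   b=0.8, w=2.0, beta=1.25, gamma=3.0):
    # fs:      superradial (flux-tube) expansion factor
    # theta_b: footpoint distance from the nearest coronal-hole boundary (deg)
    # All coefficients are placeholders, not the paper's calibration.
    return v0 + v1 / (1.0 + fs) ** a * (1.0 - b * np.exp(-(theta_b / w) ** beta)) ** gamma

print(wsa_type_speed(fs=2.0, theta_b=10.0))   # fast-wind-like conditions (~750 km/s here)
print(wsa_type_speed(fs=50.0, theta_b=1.0))   # slow-wind-like conditions (~280 km/s here)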

Relevance: 30.00%

Abstract:

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes the paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
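
A minimal sketch of the primary method as described: take n-grams of the document, look up synonyms, and surface the most frequently recurring underlying themes. WordNet (via NLTK) stands in for whichever thesaurus is used; the tokenization, the choice of the first lemma as a theme label, and frequency-based ranking are all assumptions for illustration.

from collections import Counter
from nltk import word_tokenize, ngrams        # pip install nltk; also download 'punkt' and 'wordnet'
from nltk.corpus import wordnet

def candidate_keyphrases(text, n=2, top=10):
    # Take n-grams of the document, map each word to its synsets, and count
    # how often each synonym "theme" recurs across the n-grams.
    tokens = [t.lower() for t in word_tokenize(text) if t.isalpha()]
    themes = Counter()
    for gram in ngrams(tokens, n):
        for word in gram:
            for syn in wordnet.synsets(word):
                themes[syn.lemmas()[0].name()] += 1
    return themes.most_common(top)

print(candidate_keyphrases(
    "Knowledge discovery in databases extracts useful patterns from large data collections."))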

Relevance: 30.00%

Abstract:

We have carried out a thorough mineralogical analysis of 16 pottery samples from the Lapita site of Bourewa in Fiji, using micromorphological techniques with optical and polarising microscopes. While the overall mineralogy of all of the samples is similar, the samples clearly divide into two groups, namely those with and those without the mineral calcite. Our findings are backed up by chemical analysis using SEM–EDX and FTIR: SEM–EDX shows the clear presence of calcite inclusions in some of the samples, and FTIR shows bands arising from calcite in these samples. The study suggests that it is likely that more than one clay source was used for the production of this pottery, but that most of the pottery comes from a single source. This finding is in line with previous studies which suggest some trading of pottery between the Fijian islands but a single source of clay for most of the pottery found at Bourewa. We found no evidence for the destruction of CaCO3 by heating during production of the pottery, in line with the known technology of the Lapita people, who produced earthenware pottery but not high-temperature ceramics.

Relevance: 30.00%

Abstract:

In basic network transactions, a datagram travelling from source to destination is routed through numerous routers and paths, depending on which paths are free and uncongested; this can make the transmission route excessively long, incurring greater delay, jitter and congestion and reducing throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter, which arises from queuing delay and depends on the applied loading conditions. Delay, jitter accumulation over the nodes along a transmission route, and dropped packets add further complexity for multimedia traffic, because there is no guarantee that each traffic stream will be delivered within its own jitter constraints; there is therefore a need to analyze the effects of jitter. IP routing uses a single path for the transmission of all packets. Multi-Protocol Label Switching (MPLS), by contrast, separates packet forwarding from routing, enabling packets to use appropriate routes and allowing the behavior of transmission paths to be optimized and controlled, thus correcting some of the shortfalls associated with IP routing. MPLS is therefore used in this analysis for effective transmission through the various networks. This paper analyzes the effects of delay, congestion, interference, jitter and packet loss in the transmission of signals from source to destination. The impact of link failures and repair paths in various physical topologies, namely bus, star, mesh and hybrid, is also analyzed under standard network conditions.
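
Jitter here means the variation in packet delay; one standard way to quantify it is the RFC 3550-style smoothed estimator over successive delay differences, sketched below on made-up delay values (the abstract does not specify the paper's own jitter metric).

# One-way delays for successive packets, in ms (placeholder values).
delays_ms = [20.0, 22.5, 21.0, 35.0, 24.0, 23.5, 50.0, 25.0]

jitter = 0.0
for prev, cur in zip(delays_ms, delays_ms[1:]):
    d = abs(cur - prev)              # delay variation between consecutive packets
    jitter += (d - jitter) / 16.0    # exponential smoothing with gain 1/16 (RFC 3550 style)
print("estimated jitter: %.2f ms" % jitter)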

Relevance: 30.00%

Abstract:

The technique of relaxation of the tropical atmosphere towards an analysis in a month-season forecast model has previously been successfully exploited in a number of contexts. Here it is shown that when tropical relaxation is used to investigate the possible origin of the observed anomalies in June–July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on the time-scales of relaxation and on the variables and regions relaxed, based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days. For temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However, not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate the Rossby wave propagation from there. It is sufficient to relax vorticity or temperature on a time-scale comparable to or less than their critical time-scales to obtain this. However, if the divergent advection of vorticity is important in the Rossby wave source, then strong relaxation of divergence is required to accurately represent the tropical forcing of Rossby waves.
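
The relaxation (nudging) technique adds a term -(X - X_analysis)/tau to each prognostic equation, so the relaxation time-scale tau sets how hard the model is pulled toward the analysis. A minimal sketch, with the model forcing F(X) set to zero so only the nudging term acts; the tau values echo the critical time-scales quoted above (about 29 h for vorticity, 10 h for temperature, 3 h for divergence).

def relax(x0, x_ana, tau_hours, dt_hours=1.0, n_steps=48):
    # Newtonian relaxation toward an analysis: dX/dt = F(X) - (X - X_ana)/tau,
    # with F(X) = 0 here so only the nudging term acts.
    x = x0
    for _ in range(n_steps):
        x += -dt_hours * (x - x_ana) / tau_hours
    return x

for tau in (28.8, 10.0, 3.0):   # hours: vorticity, temperature, divergence
    print("tau = %5.1f h -> X after 48 h: %.4f" % (tau, relax(1.0, 0.0, tau)))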

Relevance: 30.00%

Abstract:

Mannitol is a polymorphic excipient, usually used in pharmaceutical products in the beta form, although other polymorphs (alpha and delta) are common contaminants. Binary mixtures containing beta and delta mannitol were prepared to quantify the concentration of the beta form using FT-Raman spectroscopy. Spectral regions characteristic of each form were selected, and peak intensity ratios of beta peaks to delta peaks were calculated. Using these ratios, a correlation curve was established, which was then validated by analysing further samples of known composition. The results indicate that levels down to 2% beta could be quantified using this novel, non-destructive approach. Potential errors associated with quantitative studies using FT-Raman spectroscopy were also investigated. The principal source of variability arose from inhomogeneities in the mixing of the samples; a significant reduction of these errors was observed on reducing and controlling the particle size range. The results show that FT-Raman spectroscopy can be used to rapidly and accurately quantify polymorphic mixtures.
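
The correlation-curve step can be sketched as a simple calibration: measure the beta-to-delta peak intensity ratio for mixtures of known composition, fit the ratio against composition, and invert the fit for unknown samples. The numbers below are invented, and a straight-line fit is assumed purely for illustration (real Raman intensity-ratio calibrations are often nonlinear).

import numpy as np

# Known %beta in calibration mixtures and the measured beta/delta peak
# intensity ratios (all values invented for illustration).
pct_beta = np.array([2.0, 10.0, 25.0, 50.0, 75.0, 90.0, 98.0])
ratio = np.array([0.04, 0.18, 0.45, 0.95, 1.50, 1.85, 2.05])

slope, intercept = np.polyfit(pct_beta, ratio, 1)   # correlation curve (assumed linear)
unknown_ratio = 0.70
print("estimated beta content: %.1f%%" % ((unknown_ratio - intercept) / slope))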

Relevance: 30.00%

Abstract:

Mannitol is a polymorphic pharmaceutical excipient, which commonly exists in three forms: alpha, beta and delta. Each polymorph has a needle-like morphology, which can give preferred orientation effects when analysed by X-ray powder diffractometry (XRPD), thus causing difficulties for quantitative XRPD assessments. The occurrence of preferred orientation may be demonstrated by sample rotation, and the consequent effects on X-ray data can be minimised by reducing the particle size. Using two particle size ranges (less than 125 microns and 125–500 microns), binary mixtures of beta and delta mannitol were prepared and the delta component was quantified. Samples were assayed in either a static or a rotating sampling accessory. Rotation and reducing the particle size range to less than 125 microns halved the limits of detection and quantitation to 1% and 3.6%, respectively. Numerous potential sources of assay error were investigated; sample packing and mixing errors contributed the greatest source of variation. However, the rotation of samples for both particle size ranges reduced the majority of the assay errors examined. This study shows that coupling sample rotation with particle size reduction minimises preferred orientation effects on assay accuracy, allowing discrimination of two very similar polymorphs at around the 1% level.

Relevance: 30.00%

Abstract:

Following a malicious or accidental atmospheric release in an outdoor environment, it is essential for first responders to ensure safety by identifying areas where human life may be in danger. For this to happen quickly, reliable information is needed on the source strength and location, and on the type of chemical agent released. We present here an inverse modelling technique that estimates the strength and location of such a release, together with the uncertainty in those estimates, using a limited number of concentration measurements from a network of chemical sensors, assuming a single, steady, ground-level source. The technique is evaluated using data from a set of dispersion experiments conducted in a meteorological wind tunnel, in which simultaneous measurements of concentration time series were obtained in the plume from a ground-level point-source emission of a passive tracer. In particular, we analyze the sensitivity to the number of sensors deployed and their arrangement, and to sampling and model errors. We find that the inverse algorithm can generate acceptable estimates of the source characteristics with as few as four sensors, provided these are well placed and the sampling error is controlled. Configurations with at least three sensors in a profile across the plume were found to be superior to the other arrangements examined. Analysis of the influence of sampling error due to the use of short averaging times showed that the uncertainty in the source estimates grew as the sampling time decreased. This demonstrated that averaging times greater than about 5 min (full-scale time) lead to acceptable accuracy.
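
A toy version of the inverse step: with a forward dispersion model that is linear in the source strength q, the best-fit strength at each candidate location follows from linear least squares, and a search over locations picks the candidate that best matches the sensor readings. The Gaussian-plume forward model, sensor layout and all parameter values below are stand-ins, not the paper's model.

import numpy as np

def forward(q, x_src, y_src, sensors, u=2.0, k=0.1):
    # Toy steady ground-level plume: concentration at each sensor, linear in q.
    c = []
    for x, y in sensors:
        dx = max(x - x_src, 1e-3)     # downwind distance from the source
        sigma = k * dx                # plume spread growing linearly downwind
        c.append(q / (u * sigma) * np.exp(-(y - y_src) ** 2 / (2.0 * sigma ** 2)))
    return np.array(c)

sensors = [(50.0, 0.0), (100.0, 5.0), (150.0, -5.0), (200.0, 0.0)]  # four sensors
obs = forward(1.0, 0.0, 2.0, sensors)       # synthetic "observations"

best = None
for xs in np.linspace(-20.0, 20.0, 41):
    for ys in np.linspace(-10.0, 10.0, 41):
        g = forward(1.0, xs, ys, sensors)   # unit-strength prediction
        q = (g @ obs) / (g @ g)             # least-squares strength at this location
        cost = np.sum((obs - q * g) ** 2)
        if best is None or cost < best[0]:
            best = (cost, q, xs, ys)
print("q = %.3f, x = %.1f, y = %.1f" % (best[1], best[2], best[3]))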

Relevance: 30.00%

Abstract:

Rocket is a leafy brassicaceous salad crop that encompasses two major genera (Diplotaxis and Eruca) and many different cultivars. Rocket is a rich source of antioxidants and glucosinolates, many of which are produced as secondary products by the plant in response to stress. In this paper we examined the impact of temperature and light stress on several different cultivars of wild and salad rocket. Growth habit of the plants varied in response to stress and with different genotypes, reflecting the wide geographical distribution of the plant and the different environments to which the genera have naturally adapted. Preharvest environmental stress and genotype also had an impact on how well the cultivar was able to resist postharvest senescence, indicating that breeding or selection of senescence-resistant genotypes will be possible in the future. The abundance of key phytonutrients such as carotenoids and glucosinolates are also under genetic control. As genetic resources improve for rocket it will therefore be possible to develop a molecular breeding programme specifically targeted at improving stress resistance and nutritional levels of plant secondary products. Concomitantly, it has been shown in this paper that controlled levels of abiotic stress can potentially improve the levels of chlorophyll, carotenoids and antioxidant activity in this leafy vegetable.

Relevance: 30.00%

Abstract:

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes the paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple strands in the timeline. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined; it is concluded that the more entries a thesaurus has, the better it is likely to perform, while the age of the thesaurus and the size of each entry do not correlate with performance.