887 results for path sampling
Abstract:
We consider a standard CDMA system and determine the density function of the multiple-access interference, with and without Gaussian noise, using sampling theory concepts. The formula derived provides fast and accurate results and is a simple, useful alternative to other methods.
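The density-by-sampling idea can be illustrated with a generic Fourier-inversion sketch. Everything below is an illustrative assumption, not the paper's derivation: a toy interference model (K equiprobable ±1 interferers plus Gaussian noise) whose characteristic function is known in closed form, inverted numerically from samples.

```python
import numpy as np

def pdf_from_cf(phi, x, omega_max=40.0, n_omega=2001):
    """Recover a density from samples of its characteristic function by
    discretising the Fourier inversion integral
        f(x) = (1/2pi) * integral phi(w) * exp(-i*w*x) dw
    with a plain Riemann sum (the integrand decays to ~0 at +/- omega_max)."""
    w = np.linspace(-omega_max, omega_max, n_omega)
    dw = w[1] - w[0]
    vals = phi(w)[None, :] * np.exp(-1j * np.outer(x, w))
    return (vals.sum(axis=1) * dw).real / (2 * np.pi)

# Toy CDMA-like interference: K equiprobable +/-1 interferers plus N(0, sigma^2)
# noise, whose characteristic function is cos(w)^K * exp(-sigma^2 w^2 / 2).
K, sigma = 8, 0.3
phi = lambda w: np.cos(w) ** K * np.exp(-0.5 * (sigma * w) ** 2)

x = np.linspace(-10, 10, 401)
f = pdf_from_cf(phi, x)     # density of the total interference plus noise
```

The recovered density is a binomially weighted mixture of narrow Gaussians at the even integers, which integrates to one and peaks at zero, as expected for this toy model.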
Abstract:
The goal of this paper is to study and further develop the orthogonality sampling or stationary waves algorithm for the detection of the location and shape of objects from the far field pattern of scattered waves in electromagnetics or acoustics. Orthogonality sampling can be seen as a special beam forming algorithm with some links to the point source method and to the linear sampling method. The basic idea of orthogonality sampling is to sample the space under consideration by calculating scalar products of the measured far field pattern with a test function for all y in a subset Q of the space ℝ^m, m = 2, 3. The way in which this is carried out is important to extract the information which the scattered fields contain. The theoretical foundation of orthogonality sampling is only partly resolved, and the goal of this work is to initiate further research by numerical demonstration of the high potential of the approach. We implement the method for a two-dimensional setting for the Helmholtz equation, which represents electromagnetic scattering when the setup is independent of the third coordinate. We show reconstructions of the location and shape of objects from measurements of the scattered field for one or several directions of incidence and one or many frequencies or wave numbers, respectively. In particular, we visualize the indicator function both with the Dirichlet and Neumann boundary condition and for complicated inhomogeneous media.
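A minimal numerical sketch of the indicator follows the standard orthogonality sampling construction: the modulus of the scalar product of the far field pattern with plane-wave test functions, evaluated on a grid of sampling points. The point-scatterer far field and all parameters here are illustrative assumptions, not data from the paper.

```python
import numpy as np

def orthogonality_sampling_indicator(u_far, angles, kappa, grid_y):
    """Evaluate the orthogonality sampling indicator on a grid of points.

    u_far  : complex far field pattern sampled at the given angles
    angles : observation angles (radians) on the unit circle
    kappa  : wave number
    grid_y : (N, 2) array of sampling points y
    """
    # Unit observation directions x_hat on the circle
    x_hat = np.column_stack([np.cos(angles), np.sin(angles)])
    # Scalar product of the far field with the test function e^{i*kappa*x_hat.y}
    phase = np.exp(1j * kappa * grid_y @ x_hat.T)   # shape (N, M)
    return np.abs(phase @ u_far) / len(angles)      # shape (N,)

# Toy data: far field phase factor of a point scatterer at z = (0.5, 0.0)
kappa = 10.0
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
z = np.array([0.5, 0.0])
x_hat = np.column_stack([np.cos(angles), np.sin(angles)])
u_far = np.exp(-1j * kappa * x_hat @ z)

xs = np.linspace(-1, 1, 41)
grid = np.array([[x, y] for x in xs for y in xs])
mu = orthogonality_sampling_indicator(u_far, angles, kappa, grid)
best = grid[np.argmax(mu)]      # the indicator peaks at the scatterer location
```

For this toy far field the indicator behaves like a Bessel-type focusing kernel and attains its maximum exactly at the scatterer position.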
Abstract:
This paper provides a comparative study of the performance of cross-flow and counter-flow M-cycle heat exchangers for dew point cooling. It is recognised that evaporative cooling systems offer a low energy alternative to conventional air conditioning units. Recently emerged dew point cooling, a renovated evaporative cooling configuration, is claimed to deliver much higher cooling output than conventional evaporative modes owing to its use of M-cycle heat exchangers. Cross-flow and counter-flow heat exchangers, the available structures for M-cycle dew point cooling, were theoretically and experimentally investigated to identify the difference in cooling effectiveness between the two under parallel structural/operational conditions, to optimise the geometrical sizes of the exchangers, and to suggest their favourable operational conditions. Through development of a dedicated computer model and case-by-case experimental testing and validation, a parametric study of the cooling performance of the counter-flow and cross-flow heat exchangers was carried out. The results showed that the counter-flow exchanger offered around 20% greater cooling capacity, as well as 15%–23% higher dew-point and wet-bulb effectiveness, when equal in physical size and under the same operating conditions. The cross-flow system, however, had a roughly 10% higher coefficient of performance (COP). As increased cooling effectiveness leads to reduced air volume flow rate, smaller system size and lower cost, and since size and cost are the inherent barriers to adopting dew point cooling in place of conventional cooling systems, the counter-flow system is considered to offer practical advantages over the cross-flow system that would aid the uptake of this low energy cooling alternative.
Given the growing global demand for energy for cooling buildings, driven largely by the economic boom in emerging developing nations and by recognised global warming, the research results are of significant importance for promoting deployment of low energy dew point cooling systems, helping to reduce the energy used in cooling buildings and to cut the associated carbon emissions.
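The two effectiveness measures compared above have standard definitions that are easy to state in code: the supply-air temperature drop divided by the maximum possible drop down to the inlet dew-point or wet-bulb temperature. The temperature values below are illustrative, not measurements from the study.

```python
def dew_point_effectiveness(t_in, t_out, t_dp_in):
    """Dew-point effectiveness: supply-air temperature drop relative to the
    maximum possible drop down to the inlet dew-point temperature."""
    return (t_in - t_out) / (t_in - t_dp_in)

def wet_bulb_effectiveness(t_in, t_out, t_wb_in):
    """Wet-bulb effectiveness: the same ratio, referenced to the inlet
    wet-bulb temperature instead."""
    return (t_in - t_out) / (t_in - t_wb_in)

# Illustrative inlet state: 30 C dry bulb, 19 C wet bulb, 14 C dew point,
# with 18 C supply air (hypothetical numbers)
eps_dp = dew_point_effectiveness(30.0, 18.0, 14.0)   # 0.75
eps_wb = wet_bulb_effectiveness(30.0, 18.0, 19.0)    # > 1, possible for M-cycle
```

A wet-bulb effectiveness above 1 is the signature of dew point (M-cycle) coolers: unlike direct evaporative coolers, they can cool the supply air below the inlet wet-bulb temperature.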
Abstract:
The ground-based Atmospheric Radiation Measurement Program (ARM) and NASA Aerosol Robotic Network (AERONET) routinely monitor clouds using zenith radiances at visible and near-infrared wavelengths. Using the transmittance calculated from such measurements, we have developed a new retrieval method for cloud effective droplet size and conducted extensive tests for non-precipitating liquid water clouds. The underlying principle is to combine a liquid-water-absorbing wavelength (i.e., 1640 nm) with a non-water-absorbing wavelength for acquiring information on cloud droplet size and optical depth. For simulated stratocumulus clouds with liquid water path less than 300 g m−2 and horizontal resolution of 201 m, the retrieval method underestimates the mean effective radius by 0.8 μm, with a root-mean-squared error of 1.7 μm and a relative deviation of 13%. For actual observations with a liquid water path less than 450 g m−2 at the ARM Oklahoma site during 2007–2008, our 1.5-min-averaged retrievals are generally larger by around 1 μm than those from combined ground-based cloud radar and microwave radiometer at a 5-min temporal resolution. We also compared our retrievals to those from combined shortwave flux and microwave observations for relatively homogeneous clouds, showing that the bias between these two retrieval sets is negligible, but the error of 2.6 μm and the relative deviation of 22% are larger than those found in our simulation case. Finally, the transmittance-based cloud effective droplet radii agree to better than 11% with satellite observations and have a negative bias of 1 μm. Overall, the retrieval method provides reasonable cloud effective radius estimates, which can enhance the cloud products of both ARM and AERONET.
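The two-wavelength principle can be sketched as a lookup-table inversion: the non-absorbing channel mainly constrains optical depth, while the water-absorbing 1640 nm channel adds the droplet-size information. The forward model below is a deliberately crude toy, not a radiative transfer code, and every coefficient in it is an assumption made purely for illustration.

```python
import numpy as np

def toy_forward(tau, r_e):
    """Toy two-wavelength forward model (illustrative only): a non-absorbing
    visible transmittance driven by optical depth, and a 1640 nm transmittance
    further reduced by droplet absorption growing with effective radius."""
    t_vis = 1.0 / (1.0 + 0.5 * tau)                  # conservative scattering
    t_nir = t_vis * np.exp(-0.001 * r_e * tau)       # crude water-absorption term
    return t_vis, t_nir

# Lookup table over plausible cloud states
taus = np.linspace(1, 60, 120)
radii = np.linspace(2, 25, 100)          # effective radius in micrometres
TT, RR = np.meshgrid(taus, radii)
T_VIS, T_NIR = toy_forward(TT, RR)

def retrieve(t_vis_obs, t_nir_obs):
    """Nearest-neighbour inversion of an observed transmittance pair."""
    cost = (T_VIS - t_vis_obs) ** 2 + (T_NIR - t_nir_obs) ** 2
    i, j = np.unravel_index(np.argmin(cost), cost.shape)
    return TT[i, j], RR[i, j]

t_vis, t_nir = toy_forward(20.0, 10.0)   # synthetic "measurement"
tau_hat, r_hat = retrieve(t_vis, t_nir)  # recovers roughly (20, 10)
```

In practice the table would be built with a real radiative transfer model and the inversion regularised, but the structure (two channels, one jointly indexed table) is the same.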
Abstract:
In the context of the Ghanaian government’s objective of structural transformation with an emphasis on manufacturing, this paper provides a case study of economic transformation in Ghana, exploring patterns of growth, sectoral transformation, and agglomeration. We document and examine why, despite impressive growth and poverty reduction figures, Ghana’s economy has exhibited less transformation than might be expected for a country that has recently achieved middle-income status. Unlike in many successfully transformed countries in Asia and Latin America, the gap left by agriculture’s declining share of Ghana’s economy has been filled by services, while manufacturing has stagnated and even declined. Likely causes include weak transformation of the agricultural sector and therefore little development of agroprocessing, the emergence of consumption cities and consumption-driven growth, upward pressure on the exchange rate, weak production linkages, and a poor environment for private-sector-led manufacturing.
Abstract:
Serial sampling and stable isotope analysis performed along the growth axis of vertebrate tooth enamel records differences attributed to seasonal variation in diet, climate or animal movement. Because several months are required to obtain mature enamel in large mammals, modifications in the isotopic composition of environmental parameters are not instantaneously recorded, and stable isotope analysis of tooth enamel returns a time-averaged signal attenuated in its amplitude relative to the input signal. For convenience, stable isotope profiles are usually determined on the side of the tooth where enamel is thickest. Here we investigate the possibility of improving the time resolution by targeting the side of the tooth where enamel is thinnest. Observation of developing third molars (M3) in sheep shows that the tooth growth rate is not constant but decreases exponentially, while the angle between the first layer of enamel deposited and the enamel–dentine junction increases as a tooth approaches its maximal length. We also noted differences in thickness and geometry of enamel growth between the mesial side (i.e., the side facing the M2) and the buccal side (i.e., the side facing the cheek) of the M3. Carbon and oxygen isotope variations were measured along the M3 teeth from eight sheep raised under controlled conditions. Intra-tooth variability was systematically larger along the mesial side and the difference in amplitude between the two sides was proportional to the time of exposure to the input signal. Although attenuated, the mesial side records variations in the environmental signal more faithfully than the buccal side. This approach can be adapted to other mammals whose teeth show lateral variation in enamel thickness and could potentially be used as an internal check for diagenesis.
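The attenuation mechanism described above (a time-averaged record damping the amplitude of a seasonal input) can be illustrated with a toy model in which boxcar averaging stands in for enamel maturation; the window lengths are hypothetical, chosen only to contrast a "thin" (fast-recording) and a "thick" (slow-recording) enamel side.

```python
import numpy as np

def attenuated_amplitude(window_days, period_days=365.0, n=365 * 3):
    """Peak-to-peak amplitude of a sinusoidal seasonal input after boxcar
    time-averaging over `window_days` (a crude stand-in for the time needed
    to mature a given enamel thickness)."""
    t = np.arange(n, dtype=float)
    signal = np.sin(2 * np.pi * t / period_days)       # seasonal input, p2p = 2
    kernel = np.ones(window_days) / window_days
    smoothed = np.convolve(signal, kernel, mode="valid")
    return smoothed.max() - smoothed.min()

full = 2.0                          # peak-to-peak of the unattenuated input
thin = attenuated_amplitude(60)     # shorter averaging: thin-enamel side
thick = attenuated_amplitude(180)   # longer averaging: thick-enamel side
```

The shorter window preserves more of the input amplitude, which is the intuition behind sampling the thin, mesial side of the tooth.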
Abstract:
This contribution proposes a novel probability density function (PDF) estimation based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Then according to the estimated PDF, synthetic instances are generated as the additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic data samples follow the same statistical properties as the original positive-class data. Based on the over-sampled training data, the radial basis function (RBF) classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier’s structure and the parameters of RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by the empirical study on several imbalanced data sets.
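The over-sampling step can be sketched with SciPy's Gaussian kernel density estimator: fit a Parzen-window density to the minority class and draw synthetic instances from it. Scott's-rule bandwidth (SciPy's default) stands in for the paper's own smoothing-parameter selection, and the RBF/PSO classifier stage is not shown; the data set here is synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_oversample(X_minority, n_new, seed=0):
    """Fit a Parzen-window (Gaussian KDE) density to the minority class and
    draw synthetic instances from it."""
    kde = gaussian_kde(X_minority.T)           # gaussian_kde expects (d, n)
    return kde.resample(n_new, seed=seed).T    # back to (n_new, d)

rng = np.random.default_rng(42)
X_pos = rng.normal(loc=[2.0, -1.0], scale=0.5, size=(30, 2))  # minority class
X_syn = kde_oversample(X_pos, n_new=70)        # re-balance to 100 positives
X_balanced = np.vstack([X_pos, X_syn])
```

Because the synthetic points are drawn from the estimated density rather than interpolated between neighbours (as in SMOTE), they follow the same statistical properties as the fitted minority distribution.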
Abstract:
Monthly zonal mean climatologies of atmospheric measurements from satellite instruments can have biases due to the nonuniform sampling of the atmosphere by the instruments. We characterize potential sampling biases in stratospheric trace gas climatologies of the Stratospheric Processes and Their Role in Climate (SPARC) Data Initiative using chemical fields from a chemistry climate model simulation and sampling patterns from 16 satellite-borne instruments. The exercise is performed for the long-lived stratospheric trace gases O3 and H2O. Monthly sampling biases for O3 exceed 10% for many instruments in the high-latitude stratosphere and in the upper troposphere/lower stratosphere, while annual mean sampling biases reach values of up to 20% in the same regions for some instruments. Sampling biases for H2O are generally smaller than for O3, although still notable in the upper troposphere/lower stratosphere and Southern Hemisphere high latitudes. The most important mechanism leading to monthly sampling bias is nonuniform temporal sampling, i.e., the fact that for many instruments, monthly means are produced from measurements which span less than the full month in question. Similarly, annual mean sampling biases are well explained by nonuniformity in the month-to-month sampling by different instruments. Nonuniform sampling in latitude and longitude are shown to also lead to nonnegligible sampling biases, which are most relevant for climatologies which are otherwise free of biases due to nonuniform temporal sampling.
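The dominant mechanism identified above, nonuniform temporal sampling, reduces to a simple arithmetic effect: if a field drifts through the month and an instrument only samples part of the month, its "monthly mean" is biased. A toy one-month field (all numbers illustrative) makes this concrete.

```python
import numpy as np

# "True" daily zonal-mean field for one 30-day month: a steady drift (toy data)
days = np.arange(30)
true_field = 100.0 + 0.5 * days     # e.g. a trace gas mixing ratio drifting upward

true_monthly_mean = true_field.mean()     # 107.25

# An instrument that only measures during the first 10 days of the month
# undersamples the late-month values in its "monthly mean".
sampled_mean = true_field[:10].mean()     # 102.25

sampling_bias_pct = 100.0 * (sampled_mean - true_monthly_mean) / true_monthly_mean
```

Here the partial-month mean is biased low by nearly 5%, of the same order as the monthly sampling biases the abstract reports for O3 in the high-latitude stratosphere.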
Abstract:
From Milsom's equations, which describe the geometry of ray-path hops reflected from the ionospheric F-layer, algorithms for the simplified estimation of mirror-reflection height are developed. These allow for hop length and the effects of variations in underlying ionisation (via the ratio of the F2- and E-layer critical frequencies) and F2-layer peak height (via the M(3000)F2-factor). Separate algorithms are presented which are applicable to a range of signal frequencies about the FOT and to propagation at the MUF. The accuracies and complexities of the algorithms are compared with those inherent in the use of a procedure based on an equation developed by Shimazaki.
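For reference, the Shimazaki relation used as the benchmark above is usually quoted in the closed form hmF2 = 1490 / M(3000)F2 − 176 km; treat the constants here as an assumption (the commonly cited 1955 version), and the input M-factor as an illustrative value.

```python
def shimazaki_peak_height_km(m3000f2):
    """F2-layer peak height (km) from the M(3000)F2 transmission factor,
    using the commonly quoted Shimazaki (1955) empirical relation."""
    return 1490.0 / m3000f2 - 176.0

h = shimazaki_peak_height_km(3.0)   # a typical mid-latitude M-factor, ~321 km
```

The algorithms developed in the paper refine this kind of estimate by also accounting for hop length and the FoF2/FoE ratio.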
Abstract:
The high computational cost of calculating the radiative heating rates in numerical weather prediction (NWP) and climate models requires that calculations are made infrequently, leading to poor sampling of the fast-changing cloud field and a poor representation of the feedback that would occur. This paper presents two related schemes for improving the temporal sampling of the cloud field. Firstly, the ‘split time-stepping’ scheme takes advantage of the independent nature of the monochromatic calculations of the ‘correlated-k’ method to split the calculation into gaseous absorption terms that are highly dependent on changes in cloud (the optically thin terms) and those that are not (optically thick). The small number of optically thin terms can then be calculated more often to capture changes in the grey absorption and scattering associated with cloud droplets and ice crystals. Secondly, the ‘incremental time-stepping’ scheme uses a simple radiative transfer calculation using only one or two monochromatic calculations representing the optically thin part of the atmospheric spectrum. These are found to be sufficient to represent the heating rate increments caused by changes in the cloud field, which can then be added to the last full calculation of the radiation code. We test these schemes in an operational forecast model configuration and find a significant improvement is achieved, for a small computational cost, over the current scheme employed at the Met Office. The ‘incremental time-stepping’ scheme is recommended for operational use, along with a new scheme to correct the surface fluxes for the change in solar zenith angle between radiation calculations.
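The control flow of the incremental time-stepping scheme can be sketched as follows: a full radiation call every few steps, and on the steps in between a cheap optically-thin increment added to the last full heating rate. The linear "radiation" stand-ins below are purely illustrative (with them the increment is exact; in the real scheme it is an approximation).

```python
def full_radiation(state):
    """Stand-in for the expensive full correlated-k calculation (toy model)."""
    return -0.2 * state["cloud"]     # heating rate, K/day (illustrative)

def thin_increment(state, state_at_full):
    """Stand-in for the cheap one- or two-term optically thin calculation,
    estimating the heating-rate change due to cloud changes since the
    last full call."""
    return -0.2 * (state["cloud"] - state_at_full["cloud"])

def heating_rate_series(cloud_series, full_every=6):
    """Incremental time-stepping: a full call every `full_every` steps,
    a thin-term increment on the steps in between."""
    rates = []
    for i, c in enumerate(cloud_series):
        state = {"cloud": c}
        if i % full_every == 0:
            last_full_state, last_full_rate = state, full_radiation(state)
            rates.append(last_full_rate)
        else:
            rates.append(last_full_rate + thin_increment(state, last_full_state))
    return rates

clouds = [0.1, 0.3, 0.5, 0.5, 0.2, 0.0, 0.4]
rates = heating_rate_series(clouds, full_every=3)
```

The fast-changing cloud field is thus sampled at every model step, while the expensive full spectral integration runs only occasionally.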
Abstract:
The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling: the width of the weekly sampled confidence intervals was about 33% of the monthly width for phosphorus species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the estimated percentile downwards by about 1 °C compared to the true value, due to the difficulty of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles.
Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
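The subsampling experiment and the percentile bias are easy to reproduce on synthetic data: build an hourly series with seasonal and diel cycles (all amplitudes below are assumptions, not the rivers' data), then compare the 98th percentile from the full hourly record with that from monthly spot samples.

```python
import numpy as np

rng = np.random.default_rng(1)
hours = np.arange(24 * 365)

# Synthetic hourly river water temperature: seasonal + diel cycles + noise
temp = (12.0
        + 6.0 * np.sin(2 * np.pi * hours / (24 * 365))   # seasonal cycle
        + 1.5 * np.sin(2 * np.pi * hours / 24)           # diel cycle
        + rng.normal(0.0, 0.3, hours.size))              # short-term variability

p98_hourly = np.percentile(temp, 98)       # "true" high-percentile standard

# Monthly sampling: one spot sample every 30 days, always at the same hour
monthly = temp[:: 24 * 30][:12]
p98_monthly = np.percentile(monthly, 98)

bias = p98_monthly - p98_hourly   # negative: spot samples miss the warm extremes
```

Twelve spot samples rarely land on the joint seasonal and diel maximum, so the estimated 98th percentile is biased low, mirroring the roughly 1 °C underestimate reported above.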