36 results for High frequency measurements

in CentAUR: Central Archive, University of Reading - UK


Relevance:

100.00%

Abstract:

The EU Water Framework Directive (WFD) requires that the ecological and chemical status of water bodies in Europe should be assessed, and action taken where possible to ensure that at least "good" quality is attained in each case by 2015. This paper is concerned with the accuracy and precision with which chemical status in rivers can be measured given certain sampling strategies, and how this can be improved. High-frequency (hourly) chemical data from four rivers in southern England were subsampled to simulate different sampling strategies for four parameters used for WFD classification: dissolved phosphorus, dissolved oxygen, pH and water temperature. These data sub-sets were then used to calculate the WFD classification for each site. Monthly sampling was less precise than weekly sampling, but the effect on WFD classification depended on the closeness of the range of concentrations to the class boundaries. In some cases, monthly sampling for a year could result in the same water body being assigned to three or four of the WFD classes with 95% confidence, due to random sampling effects, whereas with weekly sampling this was one or two classes for the same cases. In the most extreme case, the same water body could have been assigned to any of the five WFD quality classes. Weekly sampling considerably reduces the uncertainties compared to monthly sampling. The width of the weekly sampled confidence intervals was about 33% that of the monthly for P species and pH, about 50% for dissolved oxygen, and about 67% for water temperature. For water temperature, which is assessed as the 98th percentile in the UK, monthly sampling biases the mean downwards by about 1 °C compared to the true value, due to problems of assessing high percentiles with limited data. Low-frequency measurements will generally be unsuitable for assessing standards expressed as high percentiles. Confining sampling to the working week compared to all 7 days made little difference, but a modest improvement in precision could be obtained by sampling at the same time of day within a 3 h time window, and this is recommended. For parameters with a strong diel variation, such as dissolved oxygen, the value obtained, and thus possibly the WFD classification, can depend markedly on when in the cycle the sample was taken. Specifying this in the sampling regime would be a straightforward way to improve precision, but there needs to be agreement about how best to characterise risk in different types of river. These results suggest that in some cases it will be difficult to assign accurate WFD chemical classes or to detect likely trends using current sampling regimes, even for these largely groundwater-fed rivers. A more critical approach to sampling is needed to ensure that management actions are appropriate and supported by data.
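
As a rough illustration of the subsampling approach described in this abstract, the sketch below draws weekly and monthly subsamples from a synthetic hourly record at many random phases and reports the spread of the resulting 98th-percentile estimates. It is not the authors' code; the synthetic temperature series, function names and step lengths are assumptions made purely for illustration.

```python
# Illustrative sketch (not the authors' code): how sampling frequency affects
# the spread of a WFD-style statistic estimated from an hourly record.
import numpy as np

def subsampled_estimates(hourly, step_hours, n_phases=500, q=0.98, rng=None):
    """Subsample an hourly record every step_hours (168 = weekly, 720 ~ monthly)
    at n_phases random starting offsets and return the distribution of
    estimated q-quantiles (the UK temperature standard uses the 98th percentile)."""
    rng = rng or np.random.default_rng(0)
    hourly = np.asarray(hourly, dtype=float)
    phases = rng.integers(0, step_hours, size=n_phases)
    return np.array([np.quantile(hourly[p::step_hours], q) for p in phases])

# Example: compare the spread of weekly vs monthly sampling for a synthetic
# temperature-like record with seasonal and diel cycles plus noise.
t = np.arange(365 * 24)
temp = (12 + 6 * np.sin(2 * np.pi * t / (365 * 24))
        + 2 * np.sin(2 * np.pi * t / 24)
        + np.random.default_rng(1).normal(0, 0.5, t.size))
for label, step in [("weekly", 168), ("monthly", 720)]:
    est = subsampled_estimates(temp, step)
    lo, hi = np.percentile(est, [2.5, 97.5])
    print(f"{label}: 98th-percentile estimate, 95% range {lo:.2f} to {hi:.2f} degC")
```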

Relevance:

100.00%

Abstract:

The co-polar correlation coefficient (ρhv) has many applications, including hydrometeor classification, ground clutter and melting layer identification, interpretation of ice microphysics and the retrieval of rain drop size distributions (DSDs). However, we currently lack the quantitative error estimates that are necessary if these applications are to be fully exploited. Previous error estimates of ρhv rely on knowledge of the unknown "true" ρhv and implicitly assume a Gaussian probability distribution function of ρhv samples. We show that frequency distributions of ρhv estimates are in fact highly negatively skewed. A new variable: L = -log10(1 - ρhv) is defined, which does have Gaussian error statistics, and a standard deviation depending only on the number of independent radar pulses. This is verified using observations of spherical drizzle drops, allowing, for the first time, the construction of rigorous confidence intervals in estimates of ρhv. In addition, we demonstrate how the imperfect co-location of the horizontal and vertical polarisation sample volumes may be accounted for. The possibility of using L to estimate the dispersion parameter (µ) in the gamma drop size distribution is investigated. We find that including drop oscillations is essential for this application, otherwise there could be biases in retrieved µ of up to ~8. Preliminary results in rainfall are presented. In a convective rain case study, our estimates show µ to be substantially larger than 0 (an exponential DSD). In this particular rain event, rain rate would be overestimated by up to 50% if a simple exponential DSD is assumed.
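
Since the transform is stated explicitly in the abstract, a minimal sketch (with hypothetical function names and an example value for the standard deviation of L, which in the paper depends on the number of independent pulses) shows how a confidence interval for ρhv could be built in L-space and mapped back:

```python
# Minimal sketch of the transform described above (names are illustrative):
# L = -log10(1 - rho_hv) has approximately Gaussian errors, so intervals can be
# constructed in L-space and mapped back to rho_hv.
import numpy as np

def rho_to_L(rho_hv):
    """Transform co-polar correlation estimates to L = -log10(1 - rho_hv)."""
    return -np.log10(1.0 - np.asarray(rho_hv))

def L_to_rho(L):
    """Inverse transform back to rho_hv."""
    return 1.0 - 10.0 ** (-np.asarray(L))

def rho_confidence_interval(rho_est, sigma_L, n_sigma=1.96):
    """Symmetric interval in L-space mapped back to (asymmetric) bounds on rho_hv.
    sigma_L, the standard deviation of L, is simply an input here; the paper
    relates it to the number of independent radar pulses."""
    L = rho_to_L(rho_est)
    return L_to_rho(L - n_sigma * sigma_L), L_to_rho(L + n_sigma * sigma_L)

# Example: a measured rho_hv of 0.99 with an assumed sigma_L of 0.05
low, high = rho_confidence_interval(0.99, 0.05)
print(f"rho_hv = 0.99, approx 95% interval [{low:.4f}, {high:.4f}]")
```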

Relevance:

100.00%

Abstract:

We consider the problem of scattering of a time-harmonic acoustic incident plane wave by a sound soft convex polygon. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the computational cost required to achieve a prescribed level of accuracy grows linearly with respect to the frequency of the incident wave. Recently Chandler-Wilde and Langdon proposed a novel Galerkin boundary element method for this problem for which, by incorporating the products of plane wave basis functions with piecewise polynomials supported on a graded mesh into the approximation space, they were able to demonstrate that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency. Here we propose a related collocation method, using the same approximation space, for which we demonstrate via numerical experiments a convergence rate identical to that achieved with the Galerkin scheme, but with a substantially reduced computational cost.

Relevance:

100.00%

Abstract:

In this paper we consider the problem of time-harmonic acoustic scattering in two dimensions by convex polygons. Standard boundary or finite element methods for acoustic scattering problems have a computational cost that grows at least linearly as a function of the frequency of the incident wave. Here we present a novel Galerkin boundary element method, which uses an approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh, with smaller elements closer to the corners of the polygon. We prove that the best approximation from the approximation space requires a number of degrees of freedom to achieve a prescribed level of accuracy that grows only logarithmically as a function of the frequency. Numerical results demonstrate the same logarithmic dependence on the frequency for the Galerkin method solution. Our boundary element method is a discretization of a well-known second kind combined-layer-potential integral equation. We provide a proof that this equation and its adjoint are well-posed and equivalent to the boundary value problem in a Sobolev space setting for general Lipschitz domains.
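
To make the graded-mesh idea concrete, here is a hypothetical sketch of the standard polynomial grading used in meshes of this kind, with elements shrinking towards a corner of the polygon; the grading exponent and names are illustrative and not taken from the paper.

```python
# Hypothetical sketch of a graded mesh of the kind used in these hybrid
# approximation spaces: element sizes shrink towards a corner at x = 0.
import numpy as np

def graded_mesh(side_length, n_elements, grading=3.0):
    """Return mesh points on [0, side_length] refined towards x = 0 (the corner),
    using the polynomial grading x_j = (j/n)**grading * side_length."""
    j = np.arange(n_elements + 1)
    return (j / n_elements) ** grading * side_length

# Example: 8 elements on a side of length 1, strongly refined near the corner.
print(np.round(graded_mesh(1.0, 8), 4))
# Elements near the corner are much smaller than those far from it, which lets
# the piecewise-polynomial factor resolve the singular corner behaviour while
# the plane-wave factor carries the oscillation.
```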

Relevance:

100.00%

Abstract:

In this article we review recent progress on the design, analysis and implementation of numerical-asymptotic boundary integral methods for the computation of frequency-domain acoustic scattering in a homogeneous unbounded medium by a bounded obstacle. The main aim of the methods is to allow computation of scattering at arbitrarily high frequency with finite computational resources.

Relevance:

100.00%

Abstract:

An isentropic potential vorticity (PV) budget analysis is employed to examine the role of synoptic transients, advection, and nonconservative processes as forcings for the evolution of the low-frequency PV anomalies locally and those associated with the North Atlantic Oscillation (NAO) and the Pacific–North American (PNA) pattern. Specifically, the rate of change of the low-frequency PV is expressed as a sum of tendencies due to divergence of eddy transport, advection by the low-frequency flow (hereafter referred to as advection), and the residual nonconservative processes. The balance between the variances and covariances of these terms is illustrated using a novel vector representation. It is shown that for most locations, as well as for the PNA pattern, the PV variability is dominantly driven by advection. The eddy forcing explains a small amount of the tendency variance. For the NAO, the role of synoptic eddy fluxes is found to be stronger, explaining on average 15% of the NAO tendency variance. Previous studies have not assessed quantitatively how the various forcings balance the tendency. Thus, such studies may have overestimated the role of eddy fluxes for the evolution of teleconnections by examining, for example, composites and regressions that indicate maintenance, rather than evolution driven by the eddies. The authors confirm this contrasting view by showing that during persistent blocking (negative NAO) episodes the eddy driving is relatively stronger.
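
Schematically, the budget described above can be written as (notation is mine; the paper works with isentropic PV and defines the low-pass filtering and eddy terms precisely):

\[
\frac{\partial \overline{q}}{\partial t} \;=\; -\,\overline{\mathbf{v}}\cdot\nabla\overline{q} \;-\; \nabla\cdot\overline{\mathbf{v}'q'} \;+\; \overline{N},
\]

where the overbar denotes the low-frequency part of the flow, primes denote the synoptic transients, and \(\overline{N}\) collects the nonconservative (residual) processes; the analysis compares how the variance of each right-hand-side term balances the tendency on the left.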

Relevance:

100.00%

Abstract:

It is now well established that subthalamic nucleus high-frequency stimulation (STN HFS) alleviates motor problems in Parkinson's disease. However, its efficacy for cognitive function remains a matter of debate. The aim of this study was to assess the effects of STN HFS in rats performing a visual attentional task. Bilateral STN HFS was applied in intact and in bilaterally dopamine (DA)-depleted rats. In all animals, STN HFS had a transient debilitating effect on all the variables measured in the task. In DA-depleted rats, STN HFS did not alleviate the deficits induced by the DA lesion such as omissions and latency to make correct responses, but induced perseverative approaches to the food magazine, an indicator of enhanced motivation. In sham-operated controls, STN HFS significantly reduced accuracy and induced perseverative behaviour, mimicking partially the effects of bilateral STN lesions in the same task. These results are in line with the hypothesis that STN HFS only partially mimics inactivation of STN produced by lesioning and confirm the motivational exacerbation induced by STN inactivation.

Relevance:

100.00%

Abstract:

We compare the variability of the Atlantic meridional overturning circulation (AMOC) as simulated by the coupled climate models of the RAPID project, which cover a wide range of resolution and complexity, and observed by the RAPID/MOCHA array at about 26N. We analyse variability on a range of timescales. In models of all resolutions there is substantial variability on timescales of a few days; in most AOGCMs the amplitude of the variability is of somewhat larger magnitude than that observed by the RAPID array, while the amplitude of the simulated annual cycle is similar to observations. A dynamical decomposition shows that in the models, as in observations, the AMOC is predominantly geostrophic (driven by pressure and sea-level gradients), with both geostrophic and Ekman contributions to variability, the latter being exaggerated and the former underrepresented in models. Other ageostrophic terms, neglected in the observational estimate, are small but not negligible. In many RAPID models and in models of the Coupled Model Intercomparison Project Phase 3 (CMIP3), interannual variability of the maximum of the AMOC wherever it lies, which is a commonly used model index, is similar to interannual variability in the AMOC at 26N. Annual volume and heat transport timeseries at the same latitude are well-correlated within 15-45N, indicating the climatic importance of the AMOC. In the RAPID and CMIP3 models, we show that the AMOC is correlated over considerable distances in latitude, but not the whole extent of the north Atlantic; consequently interannual variability of the AMOC at 50N is not well-correlated with the AMOC at 26N.
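
For context on the dynamical decomposition mentioned here, the Ekman contribution to the overturning at a given latitude is commonly diagnosed from the zonal wind stress as

\[
T_{\mathrm{Ek}}(t) \;=\; -\frac{1}{\rho_0 f}\int_{x_W}^{x_E} \tau^{x}(x,t)\,\mathrm{d}x ,
\]

with \(\tau^{x}\) the zonal wind stress along the section, \(\rho_0\) a reference density and \(f\) the Coriolis parameter, while the geostrophic part follows from the density field via thermal wind. This is the standard textbook form, not necessarily the exact decomposition applied to each model in the paper.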

Relevance:

100.00%

Abstract:

We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.

Relevance:

100.00%

Abstract:

We consider the scattering of a time-harmonic acoustic incident plane wave by a sound soft convex curvilinear polygon with Lipschitz boundary. For standard boundary or finite element methods, with a piecewise polynomial approximation space, the number of degrees of freedom required to achieve a prescribed level of accuracy grows at least linearly with respect to the frequency of the incident wave. Here we propose a novel Galerkin boundary element method with a hybrid approximation space, consisting of the products of plane wave basis functions with piecewise polynomials supported on several overlapping meshes; a uniform mesh on illuminated sides, and graded meshes refined towards the corners of the polygon on illuminated and shadow sides. Numerical experiments suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy need only grow logarithmically as the frequency of the incident wave increases.

Relevance:

100.00%

Abstract:

Current force feedback haptic interface devices are generally limited to the display of low frequency, high amplitude spatial data. A typical device consists of a low impedance framework of one or more degrees-of-freedom (dof), allowing a user to explore a pre-defined workspace via an end effector such as a handle, thimble, probe or stylus. The movement of the device is then constrained using high gain positional feedback, thus reducing the apparent dof of the device and conveying the illusion of hard contact to the user. Such devices are, however, limited to a narrow bandwidth of frequencies, typically below 30 Hz, and are not well suited to the display of surface properties, such as object texture. This paper details a device to augment an existing force feedback haptic display with a vibrotactile display, thus providing a means of conveying low amplitude, high frequency spatial information of object surface properties.

1. Haptics and Haptic Interfaces

Haptics is the study of human touch and interaction with the external environment via touch. Information from the human sense of touch can be classified into two categories, cutaneous and kinesthetic. Cutaneous information is provided via the mechanoreceptive nerve endings in the glabrous skin of the human hand. It is primarily a means of relaying information regarding small-scale details in the form of skin stretch, compression and vibration.

Relevance:

100.00%

Abstract:

Many recent papers have documented periodicities in returns, return volatility, bid–ask spreads and trading volume, in both equity and foreign exchange markets. We propose and employ a new test for detecting subtle periodicities in time series data based on a signal coherence function. The technique is applied to a set of seven half-hourly exchange rate series. Overall, we find the signal coherence to be maximal at the 8-h and 12-h frequencies. Retaining only the most coherent frequencies for each series, we implement a trading rule that is based on these observed periodicities. Our results demonstrate in all cases except one that, in gross terms, the rules can generate returns that are considerably greater than those of a buy-and-hold strategy, although they cannot retain their profitability net of transactions costs. We conjecture that this methodology could constitute an important tool for financial market researchers, enabling them to better detect, quantify and rank the various periodic components in financial data.
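
The paper's test is based on a signal coherence function, which is not reproduced here; as a much simpler, loosely related illustration, the sketch below merely ranks the periodic components of a half-hourly series by periodogram power, for example to check for peaks at the 8-hour and 12-hour periods. All names and the synthetic series are assumptions.

```python
# Simplified illustration only (not the paper's signal coherence test):
# rank periodic components of a half-hourly series by their spectral power.
import numpy as np

def dominant_periods(x, dt_hours=0.5, top_k=5):
    """Return the top_k (period in hours, power) pairs by periodogram power,
    ignoring the zero-frequency (mean) component."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt_hours)      # cycles per hour
    order = np.argsort(power[1:])[::-1] + 1          # skip index 0 (the mean)
    return [(1.0 / freqs[i], power[i]) for i in order[:top_k]]

# Example with a synthetic series containing 12-hour and 8-hour cycles plus noise.
t = np.arange(0, 24 * 200, 0.5)                      # 200 days, half-hourly
x = (np.sin(2 * np.pi * t / 12) + 0.5 * np.sin(2 * np.pi * t / 8)
     + np.random.default_rng(0).normal(0, 1, t.size))
for period, p in dominant_periods(x):
    print(f"period ~ {period:6.2f} h, power {p:.1f}")
```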

Relevance:

100.00%

Abstract:

We compare the variability of the Atlantic meridional overturning circulation (AMOC) as simulated by the coupled climate models of the RAPID project, which cover a wide range of resolution and complexity, and observed by the RAPID/MOCHA array at about 26N. We analyse variability on a range of timescales, from five-daily to interannual. In models of all resolutions there is substantial variability on timescales of a few days; in most AOGCMs the amplitude of the variability is of somewhat larger magnitude than that observed by the RAPID array, while the time-mean is within about 10% of the observational estimate. The amplitude of the simulated annual cycle is similar to observations, but the shape of the annual cycle shows a spread among the models. A dynamical decomposition shows that in the models, as in observations, the AMOC is predominantly geostrophic (driven by pressure and sea-level gradients), with both geostrophic and Ekman contributions to variability, the latter being exaggerated and the former underrepresented in models. Other ageostrophic terms, neglected in the observational estimate, are small but not negligible. The time-mean of the western boundary current near the latitude of the RAPID/MOCHA array has a much wider model spread than the AMOC does, indicating large differences among models in the simulation of the wind-driven gyre circulation, and its variability is unrealistically small in the models. In many RAPID models and in models of the Coupled Model Intercomparison Project Phase 3 (CMIP3), interannual variability of the maximum of the AMOC wherever it lies, which is a commonly used model index, is similar to interannual variability in the AMOC at 26N. Annual volume and heat transport timeseries at the same latitude are well-correlated within 15-45N, indicating the climatic importance of the AMOC. In the RAPID and CMIP3 models, we show that the AMOC is correlated over considerable distances in latitude, but not the whole extent of the north Atlantic; consequently interannual variability of the AMOC at 50N, where it is particularly relevant to European climate, is not well-correlated with that of the AMOC at 26N, where it is monitored by the RAPID/MOCHA array.