238 results for Minimum processing
in CentAUR: Central Archive at the University of Reading - UK
OFDM joint data detection and phase noise cancellation based on minimum mean square prediction error
Abstract:
This paper proposes a new iterative algorithm for orthogonal frequency division multiplexing (OFDM) joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the relatively less studied problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Specifically, we apply a hard-decision procedure at every iterative step to overcome the overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the PHN, and a more robust and compact fast process based on Givens rotations is proposed to reduce the complexity to a practical level. Numerical simulations are given to verify the proposed algorithm.
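The fast process itself is not detailed in the abstract; as background, the sketch below (an illustrative NumPy fragment, not the authors' implementation) shows the elementary Givens rotation that such QR-style fast algorithms apply repeatedly to zero selected entries.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    if b == 0.0:
        return 1.0, 0.0
    r = np.hypot(a, b)
    return a / r, b / r

# Zero the second component of a vector, the basic step a Givens-based
# fast update repeats to maintain a triangular factor.
a, b = 3.0, 4.0
c, s = givens(a, b)
G = np.array([[c, s], [-s, c]])
print(G @ np.array([a, b]))  # -> [5., 0.]
```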
Abstract:
Generalized cubes are a subclass of hypercube-like networks, which include some hypercube variants as special cases. Let $\theta_G(k)$ denote the minimum number of nodes adjacent to a set of $k$ vertices of a graph $G$. In this paper, we prove $\theta_G(k) \ge -\frac{1}{2}k^2 + \left(2n - \frac{3}{2}\right)k - (n^2 - 2)$ for each $n$-dimensional generalized cube and each integer $k$ satisfying $n + 2 \le k \le 2n$. Our result is an extension of a result presented by Fan and Lin [J. Fan, X. Lin, The t/k-diagnosability of the BC graphs, IEEE Trans. Comput. 54 (2) (2005) 176-184].
Abstract:
A bit-level processing (BLP) based linear CDMA detector is derived following the principle of minimum variance distortionless response (MVDR). The combining taps for the MVDR detector are determined from (1) the covariance matrix of the matched filter output, and (2) the corresponding row (or column) of the user correlation matrix. Owing to the interference suppression capability of MVDR and the fact that no inversion of the user correlation matrix is involved, the influence of synchronisation errors is greatly reduced. Detector performance is demonstrated via computer simulations in which both synchronisation errors and intercell interference are considered.
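The abstract does not give the tap formula; the sketch below is the standard closed form for MVDR combining (our illustrative NumPy fragment with toy numbers, not the paper's detector), minimising output variance subject to a distortionless constraint on the desired user.

```python
import numpy as np

def mvdr_taps(R, r):
    """MVDR combining taps: minimise w^H R w subject to w^H r = 1.
    R: covariance matrix of the matched-filter output.
    r: desired user's column of the user correlation matrix."""
    Rinv_r = np.linalg.solve(R, r)
    return Rinv_r / (r.conj() @ Rinv_r)

# Toy 2-user example (illustrative numbers only).
R = np.array([[2.0, 0.5], [0.5, 1.0]])
r = np.array([1.0, 0.3])
w = mvdr_taps(R, r)
print(w.conj() @ r)  # -> 1.0, the distortionless constraint holds
```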
Abstract:
The Gram-Schmidt (GS) orthogonalisation procedure has been used to improve the convergence speed of least mean square (LMS) adaptive code-division multiple-access (CDMA) detectors. However, this algorithm updates two sets of parameters, namely the GS transform coefficients and the tap weights, simultaneously. Because of the additional adaptation noise introduced by the former, it is impossible to achieve the same performance as the ideal orthogonalised LMS filter, unlike the result implied in an earlier paper. The authors provide a lower bound on the minimum achievable mean squared error (MSE) as a function of the forgetting factor λ used in finding the GS transform coefficients, and propose a variable-λ algorithm to balance the conflicting requirements of good tracking and low misadjustment.
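The abstract only names the trade-off; as an illustrative sketch (not the authors' variable-λ algorithm), exponential forgetting with factor λ behaves as below: λ near 1 averages over a long window, giving low estimation noise (low misadjustment) but slow tracking, while smaller λ tracks changes quickly at the cost of noisier coefficient estimates.

```python
import numpy as np

def ew_estimate(samples, lam):
    """Exponentially weighted estimate with forgetting factor lam:
    est <- lam * est + (1 - lam) * sample. Larger lam = longer memory."""
    est = samples[0]
    for s in samples[1:]:
        est = lam * est + (1 - lam) * s
    return est

rng = np.random.default_rng(0)
noisy = 1.0 + 0.3 * rng.standard_normal(2000)  # stationary toy coefficient
for lam in (0.9, 0.99, 0.999):
    print(lam, ew_estimate(noisy, lam))  # estimates tighten around 1.0 as lam -> 1
```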
Abstract:
Adaptive filters used in code division multiple access (CDMA) receivers to counter interference have been formulated both with and without the assumption of training symbols being transmitted. They are known as training-based and blind detectors respectively. We show that the convergence behaviour of the blind minimum-output-energy (MOE) detector can be quite easily derived, unlike what was implied by the procedure outlined in a previous paper. The simplification results from the observation that the correlation matrix determining convergence performance can be made symmetric, after which many standard results from the literature on least mean square (LMS) filters apply immediately.
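For context (an illustrative sketch of the canonical blind MOE detector, not the derivation in the paper), the filter is constrained to c = s + x with x orthogonal to the desired signature s, and x is adapted by stochastic gradient descent on the output energy:

```python
import numpy as np

def moe_update(x, s, r, mu):
    """One stochastic-gradient step of the blind MOE detector.
    Filter is c = s + x with x kept orthogonal to the unit-norm signature s."""
    c = s + x
    y = c @ r                       # detector output
    grad = y * (r - (s @ r) * s)    # gradient projected orthogonal to s
    x_new = x - mu * grad
    return x_new - (s @ x_new) * s  # re-project for numerical safety

rng = np.random.default_rng(1)
N = 8
s = np.ones(N) / np.sqrt(N)  # illustrative unit-norm signature
x = np.zeros(N)
for _ in range(2000):
    r = s * (2 * rng.integers(0, 2) - 1) + 0.1 * rng.standard_normal(N)
    x = moe_update(x, s, r, mu=0.01)
print(abs(s @ x) < 1e-10)  # orthogonality constraint maintained -> True
```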
Abstract:
The total phenols, apigenin 7-glucoside, turbidity and colour of extracts from dried chamomile flowers were studied with a view to developing chamomile extracts with potential anti-inflammatory properties for incorporation into beverages. The extraction of all constituents followed pseudo first-order kinetics. In general, the rate constant (k) increased as the temperature increased from 57 to 100 °C. The turbidity only increased significantly between 90 and 100 °C. Therefore, aqueous chamomile extracts had maximum total phenol concentration and minimum turbidity when extracted at 90 °C for 20 min. The effect of drying conditions on chamomile extracted under these conditions was then determined. A significant reduction in phenol concentration, from 19.7 ± 0.5 mg/g GAE in fresh chamomile to 13 ± 1 mg/g GAE, was found only in the plant material oven-dried at 80 °C (p ⩽ 0.05). The biggest colour change was between fresh chamomile and that oven-dried at 80 °C, followed by the air-dried samples. There was no significant difference in colour between freeze-dried material and material oven-dried at 40 °C.
Abstract:
We study the empirical performance of the classical minimum-variance hedging strategy, comparing several econometric models for estimating hedge ratios of crude oil, gasoline and heating oil crack spreads. Given the great variability and large jumps in both spot and futures prices, considerable care is required when processing the relevant data and accounting for the costs of maintaining and re-balancing the hedge position. We find that the variance reduction produced by all models is statistically and economically indistinguishable from the one-for-one “naïve” hedge. However, minimum-variance hedging models, especially those based on GARCH, generate much greater margin and transaction costs than the naïve hedge. Therefore we encourage hedgers to use a naïve hedging strategy on the crack spread bundles now offered by the exchange; this strategy is the cheapest and easiest to implement. Our conclusion contradicts the majority of the existing literature, which favours the implementation of GARCH-based hedging strategies.
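For reference (a standard textbook formulation, not a result from the paper), the classical minimum-variance hedge ratio is the regression slope of spot-price changes on futures-price changes, which the one-for-one naïve hedge simply fixes at 1.0:

```python
import numpy as np

def min_variance_hedge_ratio(spot_changes, futures_changes):
    """h* = Cov(dS, dF) / Var(dF): futures held per unit of spot exposure."""
    cov = np.cov(spot_changes, futures_changes)
    return cov[0, 1] / cov[1, 1]

rng = np.random.default_rng(2)
dF = rng.standard_normal(500)
dS = 0.95 * dF + 0.3 * rng.standard_normal(500)  # illustrative data only
print(min_variance_hedge_ratio(dS, dF))  # close to 0.95; the naive hedge uses 1.0
```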
Abstract:
The photochemical evolution of an anthropogenic plume from the New York/Boston region during its transport at low altitudes over the North Atlantic to the European west coast has been studied using a Lagrangian framework. This plume, originally strongly polluted, was sampled by research aircraft just off the North American east coast on 3 successive days, and 3 days downwind off the west coast of Ireland, where another aircraft re-sampled a weakly polluted plume. Changes in trace gas concentrations during transport were reproduced using a photochemical trajectory model including deposition and mixing effects. Chemical and wet deposition processing dominated the evolution of all pollutants in the plume. The mean net O3 production was evaluated to be -5 ppbv/day, leading to low values of O3 by the time the plume reached Europe. Wet deposition of nitric acid was responsible for an 80% reduction in this O3 production. If the plume had not encountered precipitation, it would have reached Europe with O3 levels up to 80-90 ppbv and CO levels between 120 and 140 ppbv. Photochemical destruction also played a more important role than mixing in the evolution of plume CO, owing to high levels of both O3 and water vapour, showing that CO cannot always be used as a tracer for polluted air masses, especially for plumes transported at low altitudes. The results also show that, in this case, an important increase in the O3/CO slope can be attributed to chemical destruction of CO and not to photochemical O3 production, as is often assumed.
Abstract:
We construct a mapping from complex recursive linguistic data structures to spherical wave functions using Smolensky's filler/role bindings and tensor product representations. Syntactic language processing is then described by the transient evolution of these spherical patterns whose amplitudes are governed by nonlinear order parameter equations. Implications of the model in terms of brain wave dynamics are indicated.
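Smolensky's tensor product representation itself (not the spherical-wave mapping, which is specific to this paper) can be sketched in a few lines: each filler is bound to its role by an outer product, the bindings are superposed to represent the whole structure, and a filler is recovered by unbinding with the dual role vector. The filler and role vectors below are illustrative choices.

```python
import numpy as np

# Fillers (constituents) and roles (structural positions), illustrative vectors.
fillers = {"cat": np.array([1.0, 0.0]), "sat": np.array([0.0, 1.0])}
roles = {"subject": np.array([1.0, 0.0, 0.0]), "verb": np.array([0.0, 1.0, 0.0])}

# Bind each filler to its role with an outer product and superpose.
structure = (np.outer(fillers["cat"], roles["subject"])
             + np.outer(fillers["sat"], roles["verb"]))

# Unbind: with orthonormal roles, the dual role vector is the role itself.
print(structure @ roles["subject"])  # -> [1., 0.], recovering the "cat" filler
```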
Abstract:
In recent years there has been an increasing awareness of the radiological impact of non-nuclear industries that extract and/or process ores and minerals containing naturally occurring radioactive material (NORM). These industrial activities may result in significant radioactive contamination of (by-)products, wastes and plant installations. In this study, scale samples were collected from a decommissioned phosphoric acid processing plant. To determine the nature and concentration of NORM retained in pipe-work and associated process plant, four main areas of the site were investigated: (1) the 'Green Acid Plant', where crude acid was concentrated; (2) the green acid storage tanks; (3) the Purified White Acid (PWA) plant, where inorganic impurities were removed; and (4) the solid waste, disposed of on-site as landfill. The scale samples predominantly comprise fluorides (e.g. ralstonite), calcium sulphate (e.g. gypsum), and an assemblage of mixed fluorides and phosphates (e.g. iron fluoride hydrate, calcium phosphate), respectively. The radioactive inventory is dominated by U-238 and its decay chain products, and significant fractionation along the series occurs. Compared to the feedstock ore, elevated concentrations (⩽ 8.8 Bq/g) of U-238 were found to be retained in installations where the process stream was rich in fluorides and phosphates. In addition, enriched levels (⩽ 11 Bq/g) of Ra-226 were found in association with precipitates of calcium sulphate. Water extraction tests indicate that many of the scales and wastes contain significantly soluble material and readily release radioactivity into solution.
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments, subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that, rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to separate effects due to similarity from those due to implicit rule abstraction. The results were more consistent with a similarity-based model than with implicit rule abstraction per se.