995 results for Frequency Adaptive
Abstract:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy, and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as “hyperthymesia” [1] and “The Seven Sins of Memory” defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application enabling adaptive, intelligent QoS controls to augment or replace the policy-based mechanisms currently in use.
Abstract:
As a vital factor affecting system cost and lifetime, energy consumption in wireless sensor networks (WSNs) has received much attention. This article presents a new approach to harvesting electromagnetic energy from otherwise wasted radio frequency (RF) signals transmitted in WSNs, with a quantitative analysis showing its feasibility. A mechanism to harvest the energy either passively or actively is proposed.
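A first-order feasibility estimate of this kind usually starts from the standard Friis free-space link budget. The sketch below is illustrative only (the transmit power, antenna gains, frequency and distance are assumed values, not figures from the article) and shows how little RF power typically reaches a harvesting node:

```python
import math

def friis_received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, dist_m):
    """Friis free-space equation: power available at the receive antenna, in dBm."""
    c = 3.0e8                      # speed of light, m/s
    wavelength = c / freq_hz
    # Free-space path loss in dB: 20*log10(4*pi*d / lambda)
    fspl_db = 20.0 * math.log10(4.0 * math.pi * dist_m / wavelength)
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - fspl_db

# Assumed example: 0 dBm (1 mW) sensor-node transmitter at 2.4 GHz,
# 2 dBi antennas on both ends, 10 m separation
p_rx = friis_received_power_dbm(0.0, 2.0, 2.0, 2.4e9, 10.0)
```

At these assumed parameters the harvestable power is tens of dB below the transmit power, which is why the quantitative analysis matters: harvesting is only worthwhile when the signals would otherwise be discarded anyway.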
A low clock frequency FFT core implementation for multiband full-rate ultra-wideband (UWB) receivers
Abstract:
This paper discusses the design, implementation and synthesis of an FFT module that has been specifically optimized for use in the OFDM-based Multiband UWB system, although the work is generally applicable to many other OFDM-based receiver systems. Previous work has detailed the requirements for the receiver FFT module within the Multiband UWB OFDM-based system, and this paper draws on those requirements, coupled with modern digital architecture principles and low-power design criteria, to converge on an optimized solution aimed particularly at a low clock-rate implementation. The FFT design obtained in this paper is also applicable to the implementation of the transmitter IFFT module, so that only one FFT module is needed in the device for half-duplex operation. The results from this paper enable the baseband designers of the 200 Mbit/s variant of Multiband UWB systems (and indeed other OFDM-based receivers) using System-on-Chip (SoC), FPGA and ASIC technology to create cost-effective and low-power consumer electronics products for a very competitive market.
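The FFT/IFFT sharing for half-duplex operation rests on a standard identity: an inverse transform can be computed with a forward FFT core by conjugating the input and output and scaling by 1/N. A minimal numpy sketch of the identity (not the paper's hardware design; the 128-point size matches the Multiband OFDM UWB symbol transform):

```python
import numpy as np

def ifft_via_fft(x):
    """Compute the inverse FFT using only a forward FFT:
    ifft(x) = conj(fft(conj(x))) / N."""
    n = len(x)
    return np.conj(np.fft.fft(np.conj(x))) / n

# 128-point complex test vector, the transform size used per Multiband UWB symbol
rng = np.random.default_rng(0)
x = rng.standard_normal(128) + 1j * rng.standard_normal(128)
assert np.allclose(ifft_via_fft(x), np.fft.ifft(x))
```

In hardware this means the transmitter path only needs conjugation (sign flips on the imaginary parts) and a shift for the 1/N scaling around the existing FFT datapath, rather than a second butterfly pipeline.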
Abstract:
Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with use of bandpass filtering following the same preprocessing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of statistical significance of phase locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data, during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer range phase locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement.
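The core quantity in phase-locking analyses of this kind is the phase-locking value (PLV): the magnitude of the mean phase-difference phasor between two signals. A minimal numpy sketch (the analytic signal is built directly with the FFT; in EMDPL this would be applied to intrinsic mode functions rather than raw signals):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT: zero negative frequencies, double positives."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def phase_locking_value(x, y):
    """PLV = |mean(exp(i * phase difference))|; 1 = perfect locking, ~0 = none."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
# Same 10 Hz rhythm with a constant phase offset: strongly locked
locked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                             np.sin(2 * np.pi * 10 * t + 0.5))
# Different frequencies (10 Hz vs 13 Hz): phase difference drifts, PLV near zero
unlocked = phase_locking_value(np.sin(2 * np.pi * 10 * t),
                               np.sin(2 * np.pi * 13 * t))
```

The bandpass-filter sensitivity discussed in the abstract enters upstream of this computation: the phase is only meaningful for a narrowband (or intrinsically oscillatory) component, which is exactly what EMD supplies adaptively.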
Abstract:
Every winter, the high-latitude oceans are struck by severe storms that are considerably smaller than the weather-dominating synoptic depressions [1]. Accompanied by strong winds and heavy precipitation, these often explosively developing mesoscale cyclones—termed polar lows [1]—constitute a threat to offshore activities such as shipping or oil and gas exploitation. Yet owing to their small scale, polar lows are poorly represented in the observational and global reanalysis data [2] often used for climatological investigations of atmospheric features and cannot be assessed in coarse-resolution global simulations of possible future climates. Here we show that in a future anthropogenically warmed climate, the frequency of polar lows is projected to decline. We used a series of regional climate model simulations to downscale a set of global climate change scenarios [3] from the Intergovernmental Panel on Climate Change. In this process, we first simulated the formation of polar low systems in the North Atlantic and then counted the individual cases. A previous study [4] using NCEP/NCAR re-analysis data [5] revealed that polar low frequency from 1948 to 2005 did not systematically change. Now, in projections for the end of the twenty-first century, we found a significantly lower number of polar lows and a northward shift of their mean genesis region in response to elevated atmospheric greenhouse gas concentration. This change can be related to changes in the North Atlantic sea surface temperature and mid-troposphere temperature; the latter is found to rise faster than the former, so that the resulting stability is increased, hindering the formation or intensification of polar lows. Our results provide a rare example of a climate change effect in which a type of extreme weather is likely to decrease, rather than increase.
Abstract:
In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. a variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities, are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, volume attenuation, and surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include efficient modeling of the bottom influence via impedance boundary conditions, and they cover wide-angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and against experimental data. Results thus obtained were compared with analogous results from standard codes in the literature.
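The normal-mode approach can be illustrated with the textbook ideal waveguide (pressure-release surface and bottom, constant sound speed); this is a standard simplification, not one of the paper's range-dependent models. Each mode m has vertical wavenumber k_zm = m·pi/D and propagates only while k_zm < k0:

```python
import numpy as np

def propagating_modes(freq_hz, depth_m, c=1500.0):
    """Number of propagating modes in an ideal waveguide with
    pressure-release surface and bottom: modes with m*pi/D < k0."""
    k0 = 2.0 * np.pi * freq_hz / c
    return int(np.floor(k0 * depth_m / np.pi))

def mode_sum_field(r, z, z_src, freq_hz, depth_m, c=1500.0):
    """Pressure field as a sum of normal modes, using the far-field
    asymptotic form of the Hankel function H0^(1)(kr*r)."""
    k0 = 2.0 * np.pi * freq_hz / c
    p = 0.0 + 0.0j
    for m in range(1, propagating_modes(freq_hz, depth_m, c) + 1):
        kz = m * np.pi / depth_m
        kr = np.sqrt(k0**2 - kz**2)
        h0 = np.sqrt(2.0 / (np.pi * kr * r)) * np.exp(1j * (kr * r - np.pi / 4.0))
        p += np.sin(kz * z_src) * np.sin(kz * z) * h0
    return p

# Assumed example: 100 Hz source in a 100 m deep channel -> 13 propagating modes
n_modes = propagating_modes(100.0, 100.0)
field = mode_sum_field(1000.0, 50.0, 50.0, 100.0, 100.0)
```

The coupled-mode and parabolic-equation machinery in the paper exists precisely because real environments break the assumptions above: range-dependent bathymetry couples the modes, and elastic bottoms replace the simple pressure-release condition with impedance boundary conditions.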
Abstract:
Alterations to the genetic code – codon reassignments – have occurred many times in life’s history, despite the fact that genomes are coadapted to their genetic codes and therefore alterations are likely to be maladaptive. A potential mechanism for adaptive codon reassignment, which could trigger either a temporary period of codon ambiguity or a permanent genetic code change, is the reactivation of a pseudogene by a nonsense suppressor mutant transfer RNA. I examine the population genetics of each stage of this process and find that pseudogene rescue is plausible and also readily explains some features of extant variability in genetic codes.
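The population-genetic plausibility of such a mutant spreading can be made concrete with the classical diffusion approximation for fixation probability (a standard Kimura-style sketch, not the specific multi-stage model analysed in the paper):

```python
import math

def fixation_probability(N, s, p=None):
    """Diffusion approximation for a haploid Wright-Fisher population:
    u(p) = (1 - exp(-2*N*s*p)) / (1 - exp(-2*N*s)).
    Defaults to a single new mutant, initial frequency p = 1/N."""
    if p is None:
        p = 1.0 / N
    if abs(s) < 1e-12:
        return p  # neutral limit: fixation probability equals initial frequency
    return (1.0 - math.exp(-2.0 * N * s * p)) / (1.0 - math.exp(-2.0 * N * s))

u_neutral = fixation_probability(1000, 0.0)    # = 1/N for a neutral mutant
u_benefit = fixation_probability(1000, 0.01)   # ~ 2s for small s and large N*s
```

The qualitative point carried by this formula is that even a modest selective advantage (here from rescuing a pseudogene's function) raises a new mutant's fixation probability from 1/N to roughly 2s, which is what makes suppressor-driven reassignment plausible rather than vanishingly rare.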
Abstract:
A year-long field study of the thermal environment in university classrooms was conducted from March 2005 to May 2006 in Chongqing, China. This paper presents the occupants’ thermal sensation votes and discusses the occupants’ adaptive response and perception of the thermal environment in a naturally conditioned space. Comparisons between the Actual Mean Vote (AMV) and Predicted Mean Vote (PMV) have been made as well as between the Actual Percentage of Dissatisfied (APD) and Predicted Percentage of Dissatisfied (PPD). The adaptive thermal comfort zone for the naturally conditioned space for Chongqing, which has hot summer and cold winter climatic characteristics, has been proposed based on the field study results. The Chongqing adaptive comfort range is broader than that of the ASHRAE Standard 55-2004 in general, but in the extreme cold and hot months, it is narrower. The thermal conditions in classrooms in Chongqing in summer and winter are severe. Behavioural adaptation such as changing clothing, adjusting indoor air velocity, taking hot/cold drinks, etc., as well as psychological adaptation, has played a role in adapting to the thermal environment.
Abstract:
We compare the variability of the Atlantic meridional overturning circulation (AMOC) as simulated by the coupled climate models of the RAPID project, which cover a wide range of resolution and complexity, and observed by the RAPID/MOCHA array at about 26N. We analyse variability on a range of timescales. In models of all resolutions there is substantial variability on timescales of a few days; in most AOGCMs the amplitude of the variability is of somewhat larger magnitude than that observed by the RAPID array, while the amplitude of the simulated annual cycle is similar to observations. A dynamical decomposition shows that in the models, as in observations, the AMOC is predominantly geostrophic (driven by pressure and sea-level gradients), with both geostrophic and Ekman contributions to variability, the latter being exaggerated and the former underrepresented in models. Other ageostrophic terms, neglected in the observational estimate, are small but not negligible. In many RAPID models and in models of the Coupled Model Intercomparison Project Phase 3 (CMIP3), interannual variability of the maximum of the AMOC wherever it lies, which is a commonly used model index, is similar to interannual variability in the AMOC at 26N. Annual volume and heat transport time series at the same latitude are well correlated within 15-45N, indicating the climatic importance of the AMOC. In the RAPID and CMIP3 models, we show that the AMOC is correlated over considerable distances in latitude, but not over the whole extent of the north Atlantic; consequently interannual variability of the AMOC at 50N is not well correlated with the AMOC at 26N.
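The Ekman contribution in such a decomposition follows directly from the zonal wind stress integrated across the section. A minimal sketch of the standard formula (the stress value and basin width below are illustrative assumptions, not RAPID data):

```python
import math

OMEGA = 7.292e-5   # Earth's rotation rate, rad/s
RHO = 1025.0       # reference seawater density, kg/m^3

def ekman_transport_sv(tau_x, lat_deg, width_m):
    """Meridional Ekman volume transport across a zonal section, in Sverdrups:
    V = -tau_x * width / (rho * f), with Coriolis parameter f = 2*Omega*sin(lat)."""
    f = 2.0 * OMEGA * math.sin(math.radians(lat_deg))
    return -tau_x * width_m / (RHO * f) / 1.0e6   # m^3/s -> Sv (10^6 m^3/s)

# Easterly trade winds (tau_x < 0) at 26.5N over an assumed ~6000 km wide Atlantic
# give a northward Ekman transport of a few Sverdrups
v_ek = ekman_transport_sv(-0.05, 26.5, 6.0e6)
```

Because this term responds to wind stress on timescales of days, it is a natural source of the high-frequency variability noted above, while the geostrophic term carries most of the mean overturning.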
Abstract:
Improving methodology for Phase I dose-finding studies is currently of great interest in pharmaceutical and medical research. This article discusses the current atmosphere and attitude towards adaptive designs and focuses on the influence of Bayesian approaches.