22 results for Classification Automatic Modulation. Correntropy. Radio Cognitive

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Photonic signal processing is used to implement common-mode signal cancellation across a very wide bandwidth by phase-modulating radio frequency (RF) signals onto a narrow-linewidth laser carrier. RF spectra were observed using narrow-band, tunable optical filtering with a scanning Fabry-Perot etalon. Functions conventionally performed using digital signal processing techniques in the electronic domain have thus been replaced by analog techniques in the photonic domain. The technique achieved simultaneous cancellation of signals across a bandwidth of 1400 MHz, limited only by the free spectral range of the etalon. © 2013 David M. Benton.

Relevance:

40.00%

Publisher:

Abstract:

Cognitive radio has been proposed as a key technology to significantly improve spectrum usage in wireless networks by enabling unlicensed users to access unused resources. We present new algorithms needed to implement opportunistic scheduling policies that maximize the throughput utilization of resources by secondary users, under maximum interference constraints imposed by existing primary users. Our approach is based on the Belief Propagation (BP) algorithm, which is advantageous due to its simplicity and potential for distributed implementation. We examine convergence properties and evaluate the performance of the proposed BP algorithms via simulations, and demonstrate that the results compare favorably with a benchmark greedy strategy. © 2013 IEEE.
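
As a rough illustration of the benchmark strategy mentioned in this abstract, the sketch below implements a greedy scheduler that admits secondary users in order of decreasing throughput while respecting per-primary-user interference budgets. The data structures, names and numbers are illustrative assumptions; the paper's belief-propagation formulation itself is not reproduced here.

```python
# A greedy benchmark scheduler sketch: admit secondary users (SUs) in order of
# decreasing throughput, subject to each primary user's (PU's) interference budget.
# Data structures, names and numbers are illustrative assumptions.

def greedy_schedule(su_throughput, su_interference, pu_budget):
    """su_throughput: {su: achievable throughput}
       su_interference: {su: {pu: interference caused if the SU transmits}}
       pu_budget: {pu: maximum tolerable aggregate interference}"""
    remaining = dict(pu_budget)                       # interference budget left at each PU
    admitted = []
    for su in sorted(su_throughput, key=su_throughput.get, reverse=True):
        load = su_interference.get(su, {})
        # admit the SU only if it fits within every PU's remaining budget
        if all(load.get(pu, 0.0) <= remaining[pu] for pu in remaining):
            admitted.append(su)
            for pu in remaining:
                remaining[pu] -= load.get(pu, 0.0)
    return admitted

# illustrative usage
print(greedy_schedule(
    su_throughput={"su1": 3.0, "su2": 2.0, "su3": 1.5},
    su_interference={"su1": {"pu1": 0.6}, "su2": {"pu1": 0.5}, "su3": {"pu1": 0.3}},
    pu_budget={"pu1": 1.0}))                          # -> ['su1', 'su3']
```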

Relevance:

40.00%

Publisher:

Abstract:

The joint effect of fiber chromatic dispersion and fiber nonlinearity on single-sideband and double-sideband modulated radio-over-fiber links is investigated. Experimental and simulated results show that the modulation suppression caused by chromatic dispersion in radio-over-fiber links can be successfully eliminated in both schemes only when the system operates in the linear regime. Under nonlinear transmission, the received microwave carrier power depends on the incident optical power.
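
For context, the dispersion-induced modulation suppression referred to above is commonly described by the textbook fading expression below (not taken from this paper): for a double-sideband intensity-modulated link the detected RF power varies periodically with fibre length, whereas single-sideband modulation avoids the cosine-squared dependence. Here D is the fibre dispersion parameter, L the fibre length, λ the optical wavelength, f_RF the microwave carrier frequency and c the speed of light.

```latex
% Textbook dispersion-induced RF power fading for a double-sideband modulated link
% (general background, not a result quoted from this paper).
P_{\mathrm{RF}}(L) \;\propto\; \cos^{2}\!\left(\frac{\pi D L \lambda^{2} f_{\mathrm{RF}}^{2}}{c}\right)
```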

Relevance:

40.00%

Publisher:

Abstract:

The IEEE 802.11 standard is the dominant technology for wireless local area networks (WLANs). Over the last two decades, the distributed coordination function (DCF) of the IEEE 802.11 standard has become one of the most important medium access control (MAC) protocols for mobile ad hoc networks (MANETs). The DCF protocol can also be combined with cognitive radio, giving rise to IEEE 802.11 cognitive radio ad hoc networks (CRAHNs). Several studies have focused on modeling IEEE 802.11 CRAHNs; however, there is still no thorough and scalable analytical model for IEEE 802.11 CRAHNs whose cognitive nodes (i.e., secondary users, SUs) perform spectrum sensing and a possible channel-silence phase before the MAC contention process. This paper develops a unified analytical model for IEEE 802.11 CRAHNs for comprehensive MAC-layer queueing analysis. In the proposed model, the SUs are modeled by a hyper-generalized 2D Markov chain combined with an M/G/1/K queue, while the primary users (PUs) are modeled by a generalized 2D Markov chain and an M/G/1/K queue. The performance evaluation results show that the quality of service (QoS) of both the PUs and SUs can be statistically guaranteed with suitable settings of the channel sensing and silence phase durations under light loading.
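
The 2D Markov chain analyses mentioned above build on Bianchi's classic saturation model of the DCF. As background only (not the paper's hyper-generalized model), the sketch below solves Bianchi's fixed point between the per-slot transmission probability tau and the conditional collision probability p by bisection, assuming n saturated stations, minimum contention window W and m backoff stages; the parameter values are assumptions.

```python
# Background sketch: Bianchi's saturation model for the 802.11 DCF, solved by
# bisection on the fixed point between tau and p. Parameter values are assumed.

def bianchi_fixed_point(n, W=32, m=5, iters=100):
    def tau_of(p):
        # Bianchi's expression for the transmission probability given p
        return 2.0 * (1.0 - 2.0 * p) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m))
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        p = 1.0 - (1.0 - tau) ** (n - 1)     # collision probability seen by a tagged station
        if tau < tau_of(p):
            lo = tau                          # tau too small, move up
        else:
            hi = tau
    return tau, p

print(bianchi_fixed_point(n=10))              # prints the solved (tau, p) pair
```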

Relevance:

40.00%

Publisher:

Abstract:

By 2040, general practitioners, geriatricians, neurologists and health care professionals all over the world will be facing the diagnostic, therapeutic and socioeconomic challenges of over 80 million people with dementia. Dementia is one of the most common diseases in the elderly; it drastically affects daily life and everyday personal activities, is often associated with behavioural symptoms, personality change and numerous clinical complications, and increases the risk of urinary incontinence, hip fracture and, most markedly, dependence on nursing care. The costs of care for patients with dementia are therefore immense. Serum cholesterol levels above 6.5 mmol/L are known to be associated with increased relative risks of 1.5 and 2.1 of developing Alzheimer's disease, the most common form of dementia, and a reduction of serum cholesterol in midlife is associated with a lowered dementia risk. The aim of this work is to critically discuss some of the main results reported recently in the literature in this respect and to provide the pathophysiological rationale for the control of dyslipidemia in the prevention of dementia onset and progression.

Relevance:

30.00%

Publisher:

Abstract:

While the retrieval of existing designs to prevent unnecessary duplication of parts is a recognised strategy for controlling design costs, the available techniques to achieve this, even in product data management systems, are limited in performance or require large resources. A novel system has been developed, based on a new version of an existing coding system (CAMAC), that allows automatic coding of engineering drawings and their subsequent retrieval using a drawing of the desired component as the input. The ability to find designs using a detail drawing rather than a textual description is a significant achievement in itself. Previous testing of the system has demonstrated this capability, but if parts could also be found from a simple sketch, its practical application would be much more effective. This paper describes the development and testing of such a search capability using a database of over 3000 engineering components.

Relevance:

30.00%

Publisher:

Abstract:

The development of sensing devices is one of the instrumentation fields that has grown rapidly in the last decade. Alongside the swift advance of microelectronic sensors, optical fibre sensors are widely investigated because of their advantageous properties over electronic sensors, such as their wavelength-multiplexing capability and high sensitivity to temperature, pressure, strain, vibration and acoustic emission. Moreover, optical fibre sensors are more attractive than electronic sensors because they can perform distributed sensing, covering a reasonably large area with a single piece of fibre. Apart from being a responsive element in the sensing field, optical fibre has good assets for generating, distributing, processing and transmitting signals in the future broadband information network. These assets include wide bandwidth, high capacity and low loss, which grant mobility and flexibility for wireless access systems. Among these core technologies, fibre-optic signal processing and the transmission of optical and radio frequency signals are the subjects of study in this thesis. Based on the intrinsic properties of single-mode optical fibre, this thesis aims to exploit fibre characteristics such as thermal sensitivity, birefringence, dispersion and nonlinearity in the applications of temperature sensing and radio-over-fibre systems. By exploiting the fibre's thermal sensitivity, a fully distributed temperature sensing system consisting of an apodised chirped fibre Bragg grating has been implemented. The proposed system has proven efficient in characterising the grating and providing information on the temperature variation, location and width of the heat source applied to the area under test. To exploit the fibre birefringence, a fibre delay line filter using a single high-birefringence optical fibre structure has been presented. The proposed filter can be reconfigured and programmed by adjusting the input azimuth of the launched light, as well as the strength and direction of the applied coupling, to meet the signal processing requirements of different microwave photonic and optical filtering applications. To exploit the fibre dispersion and nonlinearity, experimental investigations have been carried out to study their joint effect in high-power double-sideband and single-sideband modulated links in the presence of fibre loss. The experimental results have been theoretically verified using an in-house implementation of the split-step Fourier method applied to the generalised nonlinear Schrödinger equation. A further simulation study of the inter-modulation distortion in two-tone signal transmission has also been presented to show the effect of the nonlinearity of one channel on the other. In addition to the experimental work, numerical simulations have been carried out for all the proposed systems, to ensure that all the aspects concerned are comprehensively investigated.
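
As a rough indication of the numerical approach mentioned above (the thesis' in-house implementation is not available here), the sketch below applies a basic, non-symmetrised split-step Fourier method to the scalar nonlinear Schrödinger equation with dispersion, Kerr nonlinearity and loss. The sign convention, parameter values and function names are assumptions for illustration.

```python
# A minimal, non-symmetrised split-step Fourier sketch for the scalar nonlinear
# Schroedinger equation (one common sign convention):
#   dA/dz = -i*(beta2/2)*d^2A/dT^2 + i*gamma*|A|^2*A - (alpha/2)*A
# Illustrative only; not the thesis' in-house implementation.

import numpy as np

def split_step_nlse(A0, dt, length, dz, beta2, gamma, alpha=0.0):
    """A0: complex input envelope; dt: time step [s]; length, dz: fibre length and
       step size [m]; beta2 [s^2/m]; gamma [1/(W*m)]; alpha: loss [1/m]."""
    A = A0.astype(complex)
    w = 2 * np.pi * np.fft.fftfreq(A.size, d=dt)                   # angular frequency grid
    linear = np.exp((1j * (beta2 / 2) * w ** 2 - alpha / 2) * dz)  # dispersion + loss per step
    for _ in range(int(round(length / dz))):
        A = np.fft.ifft(np.fft.fft(A) * linear)                    # linear part in frequency domain
        A = A * np.exp(1j * gamma * np.abs(A) ** 2 * dz)           # Kerr nonlinearity in time domain
    return A

# illustrative usage: a 10 ps Gaussian pulse over 10 km of standard single-mode fibre
t = np.linspace(-100e-12, 100e-12, 2048)
pulse = np.sqrt(0.01) * np.exp(-t ** 2 / (2 * (10e-12) ** 2))      # 10 mW peak power
out = split_step_nlse(pulse, dt=t[1] - t[0], length=10e3, dz=100.0,
                      beta2=-21e-27, gamma=1.3e-3, alpha=0.046e-3)
print(abs(out).max() ** 2)                                         # peak power after transmission
```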

Relevance:

30.00%

Publisher:

Abstract:

The need for low bit-rate speech coding is the result of growing demand on the available radio bandwidth for mobile communications, both for military purposes and for the public sector. To meet this growing demand, the available bandwidth must be utilized in the most economical way to accommodate more services. Two low bit-rate speech coders have been built and tested in this project. The two coders combine predictive coding with delta modulation, a property which enables them to meet the low bit-rate and good speech-quality requirements simultaneously. To enhance their efficiency, the predictor coefficients and the quantizer step size are updated periodically in each coder. This enables the coders to keep up with changes in the characteristics of the speech signal over time and with changes in the dynamic range of the speech waveform. However, the two coders differ in how they update their predictor coefficients. One updates the coefficients once every one hundred sampling periods and extracts the coefficients from the input speech samples; it is known in this project as the Forward Adaptive Coder. Since the coefficients are extracted from the input speech samples, they must be transmitted to the receiver to reconstruct the transmitted speech, thus adding to the transmission bit rate. The other updates its coefficients every sampling period, based on the output data, and is known as the Backward Adaptive Coder. Results of subjective tests showed both coders to be reasonably robust to quantization noise. Both were graded quite good, with the Forward Adaptive coder performing slightly better, but at a slightly higher transmission bit rate for the same speech quality than its Backward counterpart. The coders yielded acceptable speech quality at 9.6 kbps for the Forward Adaptive coder and 8 kbps for the Backward Adaptive coder.
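
For orientation only, the sketch below shows plain adaptive delta modulation with a syllabic-style step-size rule (the step grows when successive bits agree and shrinks when they alternate). The project's coders additionally use forward- or backward-adaptive linear prediction, which is not reproduced here, and all constants are assumed values.

```python
# Plain adaptive delta modulation sketch: a 1-bit quantiser whose step size grows
# during slope overload (successive bits agree) and shrinks in the granular region
# (bits alternate). Constants are assumed; the prediction stage is omitted.

def adm_encode(samples, step0=0.05, grow=1.5, shrink=0.66, step_min=1e-3, step_max=1.0):
    bits, recon, step, prev = [], 0.0, step0, 0
    for x in samples:
        bit = 1 if x >= recon else -1                 # 1-bit quantiser decision
        step = min(step * grow, step_max) if bit == prev else max(step * shrink, step_min)
        recon += bit * step                           # local decoder tracks the signal
        bits.append(bit)
        prev = bit
    return bits

def adm_decode(bits, step0=0.05, grow=1.5, shrink=0.66, step_min=1e-3, step_max=1.0):
    out, recon, step, prev = [], 0.0, step0, 0
    for bit in bits:                                  # same adaptation as the encoder,
        step = min(step * grow, step_max) if bit == prev else max(step * shrink, step_min)
        recon += bit * step                           # so both stay in sync from the bits alone
        out.append(recon)
        prev = bit
    return out

# illustrative usage: encode and reconstruct a slow sinusoid
import math
signal = [0.5 * math.sin(2 * math.pi * k / 200) for k in range(400)]
rebuilt = adm_decode(adm_encode(signal))
print(max(abs(a - b) for a, b in zip(signal, rebuilt)))   # maximum reconstruction error
```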

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents an investigation into the application of methods of uncertain reasoning to the biological classification of river water quality. Existing biological methods for reporting river water quality are critically evaluated, and the adoption of a discrete biological classification scheme advocated. Reasoning methods for managing uncertainty are explained, in which the Bayesian and Dempster-Shafer calculi are cited as primary numerical schemes. Elicitation of qualitative knowledge on benthic invertebrates is described. The specificity of benthic response to changes in water quality leads to the adoption of a sensor model of data interpretation, in which a reference set of taxa provides probabilistic support for the biological classes. The significance of sensor states, including that of absence, is shown. Novel techniques of directly eliciting the required uncertainty measures are presented. Bayesian and Dempster-Shafer calculi were used to combine the evidence provided by the sensors. The performance of these automatic classifiers was compared with the expert's own discrete classification of sampled sites. Variations of sensor data weighting, combination order and belief representation were examined for their effect on classification performance. The behaviour of the calculi under evidential conflict and alternative combination rules was investigated. Small variations in evidential weight and the inclusion of evidence from sensors absent from a sample improved the classification performance of Bayesian belief and of support for singleton hypotheses. For simple support, inclusion of absent evidence decreased the classification rate. The performance of Dempster-Shafer classification using consonant belief functions was comparable to Bayesian and singleton belief. Recommendations are made for further work in biological classification using uncertain reasoning methods, including the combination of multiple-expert opinion, the use of Bayesian networks, and the integration of classification software within a decision support system for water quality assessment.
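
To make the evidence-combination step concrete, the sketch below implements Dempster's rule of combination over a toy frame of three water-quality classes. The classes, mass values and the normalised handling of conflict are illustrative assumptions, not the thesis' elicited figures.

```python
# Dempster's rule of combination over a toy frame of discernment. Mass functions
# map frozensets of classes to belief mass; all values here are illustrative.

from itertools import product

def dempster_combine(m1, m2):
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                       # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}   # normalise out conflict

# illustrative usage: two 'sensors' (taxa) supporting classes good / fair / poor
theta = frozenset({"good", "fair", "poor"})           # frame of discernment
m1 = {frozenset({"good"}): 0.6, frozenset({"good", "fair"}): 0.3, theta: 0.1}
m2 = {frozenset({"fair"}): 0.5, theta: 0.5}
print(dempster_combine(m1, m2))
```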

Relevance:

30.00%

Publisher:

Abstract:

The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text, and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and text linguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of an expository type, to interpret their content in relation to the key abstract elements, and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule-based system is a suitable computational model for representing experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition. It showed how experts tackle the task of abstracting by integrating formal knowledge as well as experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting point for automatic abstracting.
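
As a toy illustration of rule-based sentence extraction of the kind described above (INFORMEX itself was written in SPITBOL and PROLOG and its rules are not reproduced here), the sketch below scores sentences with assumed cue phrases and keeps the highest-scoring ones in document order.

```python
# Toy heuristic, cue-phrase based sentence extraction for abstracting.
# The cue phrases, weights and sample text are invented for illustration.

import re

CUE_WEIGHTS = {"in conclusion": 3, "results indicate": 2, "the aim of": 2,
               "this paper": 1, "significant": 1}

def extract_sentences(text, top_n=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    scored = [(sum(w for cue, w in CUE_WEIGHTS.items() if cue in s.lower()), i, s)
              for i, s in enumerate(sentences)]
    best = sorted(scored, reverse=True)[:top_n]                  # highest-scoring sentences
    return [s for _, i, s in sorted(best, key=lambda x: x[1])]   # restore document order

sample = ("This paper describes a new sensor. The aim of the work is to reduce noise. "
          "The housing is made of aluminium. Results indicate a significant improvement.")
print(extract_sentences(sample))
```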

Relevance:

30.00%

Publisher:

Abstract:

Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful, but a new test is presented here based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency, whereas evoked ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent, with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data, and the model was able to account for (i) the shape and timing of ERPs at different scalp sites, (ii) the event-related desynchronization in alpha and synchronization in theta, and (iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly model therefore provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing.
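
The contrast between the two hypotheses above can be made tangible with a toy simulation (not the thesis' Firefly model; frequency, trial count and sampling rate are assumptions): trial-averaging ongoing 10 Hz activity with random phase yields almost no ERP, whereas re-aligning the phase of the same activity at stimulus onset produces an ERP-like average without adding any signal power.

```python
# Toy simulation contrasting the evoked and phase-alignment accounts of the ERP.
# All parameters are assumed values for illustration.

import numpy as np

fs, n_trials = 500, 200                                  # sampling rate [Hz], number of trials
t = np.arange(0, 1.0, 1.0 / fs)                          # 1 s epoch, stimulus at t = 0
rng = np.random.default_rng(0)

phases = rng.uniform(0, 2 * np.pi, n_trials)
ongoing = np.array([np.sin(2 * np.pi * 10 * t + p) for p in phases])        # random phase
aligned = np.array([np.sin(2 * np.pi * 10 * t) for _ in range(n_trials)])   # phase reset at onset

print("ERP amplitude, random phase :", np.abs(ongoing.mean(axis=0)).max())  # near zero
print("ERP amplitude, phase aligned:", np.abs(aligned.mean(axis=0)).max())  # ERP-like deflection
```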

Relevance:

30.00%

Publisher:

Abstract:

Web APIs have gained increasing popularity in recent Web service technology development owing to the simplicity of their technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and their documentation on the Web is still a challenging task, even with the best resources available. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API associated or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for the automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines, including naive Bayes, support vector machines and the maximum entropy model, by over 3% in classification accuracy. In addition, feaLDA also gives superior performance when compared against other existing supervised topic models.
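
To illustrate the kind of supervised baseline the paper compares against (naive Bayes over bag-of-words features), the sketch below trains a small scikit-learn pipeline. The pages and labels are placeholders, not the authors' Web API documentation dataset, and feaLDA itself is not reproduced.

```python
# A naive Bayes bag-of-words baseline sketch for Web API documentation detection.
# The training pages and labels are invented placeholders.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

pages = [
    "GET /v1/users returns a JSON list of user objects; authenticate with an API key",
    "POST /orders endpoint documentation: parameters, response codes and rate limits",
    "Welcome to our blog about travel photography and street food",
    "Company history, board of directors and investor relations",
]
labels = [1, 1, 0, 0]        # 1 = Web API documentation page, 0 = other Web page

classifier = make_pipeline(CountVectorizer(), MultinomialNB())
classifier.fit(pages, labels)
print(classifier.predict(["REST API reference: endpoints, query parameters and JSON responses"]))
```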

Relevance:

30.00%

Publisher:

Abstract:

With the reform of spectrum policy and the development of cognitive radio, secondary users will be allowed to access spectrum licensed to primary users. Spectrum auctions can facilitate this secondary spectrum access in a market-driven way. To design an efficient auction framework, we first study the supply and demand pressures and the competitive equilibrium of the secondary spectrum market, taking spectrum reusability into account. According to the traditional economic view, competition among participants in well-designed auctions should lead to the competitive equilibrium. A discriminatory-price spectrum double auction framework is then proposed for this market. In this framework, rational participants compete with each other through their bidding prices, and their profits are guaranteed to be non-negative. A near-optimal heuristic algorithm is also proposed to solve the auction clearing problem of the proposed framework efficiently. Experimental results verify the efficiency of the proposed auction clearing algorithm and demonstrate that competition among secondary users and primary users can lead to the competitive equilibrium during auction iterations using the proposed auction framework. Copyright © 2011 John Wiley & Sons, Ltd.
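
As a minimal sketch of discriminatory-price double auction clearing (illustrative only: it ignores the spectrum-reuse and interference constraints handled by the paper's heuristic clearing algorithm), the example below matches bid/ask pairs while the bid still covers the ask, with each buyer paying its own bid and each seller receiving its own ask so that no matched participant makes a loss. Names and prices are assumptions.

```python
# Discriminatory-price double auction clearing sketch: buyers pay their own bid,
# sellers receive their own ask. Spectrum reuse constraints are not modelled.

def clear_double_auction(bids, asks):
    """bids: list of (buyer, bid price); asks: list of (seller, ask price)."""
    buyers = sorted(bids, key=lambda x: x[1], reverse=True)   # highest bids first
    sellers = sorted(asks, key=lambda x: x[1])                # lowest asks first
    trades = []
    for (buyer, bid), (seller, ask) in zip(buyers, sellers):
        if bid < ask:
            break                                             # no further profitable matches
        trades.append({"buyer": buyer, "pays": bid, "seller": seller, "receives": ask})
    return trades

print(clear_double_auction(
    bids=[("su1", 9.0), ("su2", 5.0), ("su3", 3.0)],
    asks=[("pu1", 4.0), ("pu2", 6.0)]))                       # only su1-pu1 clears (9 >= 4)
```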

Relevance:

30.00%

Publisher:

Abstract:

This research describes a computerized model of human classification which has been constructed to represent the process by which assessments are made for psychodynamic psychotherapy. The model assigns membership grades (MGs) to clients so that the most suitable ones have high values in the therapy category. Categories consist of a hierarchy of components, one of which, ego strength, is analysed in detail to demonstrate the way it has captured the psychotherapist's knowledge. The bottom of the hierarchy represents the measurable factors being assessed during an interview. A questionnaire was created to gather the identified information and was completed by the psychotherapist after each assessment. The results were fed into the computerized model, demonstrating a high correlation between the model MGs and the suitability ratings of the psychotherapist (r = .825 for 24 clients). The model has successfully identified the relevant data involved in assessment and simulated the decision-making process of the expert. Its cognitive validity enables decisions to be explained, which means that it has potential for therapist training and also for enhancing the referral process, with benefits in cost effectiveness as well as in the reduction of trauma to clients. An adapted version measuring client improvement would give quantitative evidence for the benefit of therapy, thereby supporting auditing and accountability. © 1997 The British Psychological Society.
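
As a purely illustrative sketch of how measurable interview factors might be aggregated into a membership grade for one component of the hierarchy (the study's actual factor names, weights and aggregation scheme are not reproduced here), a weighted-average aggregation for an "ego strength" component could look like this:

```python
# Illustrative aggregation of interview-level factor scores into a membership grade (MG).
# Factor names and weights are invented, not the study's elicited knowledge.

def membership_grade(scores, weights):
    """scores, weights: dicts keyed by factor; scores lie in [0, 1]."""
    total = sum(weights.values())
    return sum(weights[f] * scores[f] for f in weights) / total

ego_strength = membership_grade(
    scores={"impulse_control": 0.8, "frustration_tolerance": 0.6, "reality_testing": 0.9},
    weights={"impulse_control": 2.0, "frustration_tolerance": 1.0, "reality_testing": 2.0})
print(round(ego_strength, 2))                                  # -> 0.8
```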