967 results for Extended spectrum
Abstract:
One of the most important factors that affects the performance of energy detection (ED) is the fading channel between the wireless nodes. This article investigates the performance of ED-based spectrum sensing, for cognitive radio (CR), over two-wave with diffuse power (TWDP) fading channels. The TWDP fading model characterizes a variety of fading channels, including well-known canonical fading distributions, such as Rayleigh and Rician, as well as worse than Rayleigh fading conditions modeled by the two-ray fading model. Novel analytic expressions for the average probability of detection over TWDP fading that account for single-user and cooperative spectrum sensing as well as square law selection diversity reception are derived. These expressions are used to analyze the behavior of ED-based spectrum sensing over moderate, severe and extreme fading conditions, and to investigate the use of cooperation and diversity as a means of mitigating the fading effects. Our results indicate that TWDP fading conditions can significantly degrade the sensing performance; however, it is shown that detection performance can be improved when cooperation and diversity are employed. The presented outcomes enable us to identify the limits of ED-based spectrum sensing and quantify the trade-offs between detection performance and energy efficiency for cognitive radio systems deployed within confined environments such as in-vehicular wireless networks.
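The energy-detection scheme the abstract analyzes can be illustrated with a minimal Monte Carlo sketch. This is not the paper's TWDP analysis: it uses plain Rayleigh fading (one special case the TWDP model subsumes), an empirically calibrated threshold, and illustrative parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64        # complex samples per sensing window
pfa = 0.1     # target false-alarm probability
snr = 1.0     # average SNR, linear scale (0 dB) -- illustrative
trials = 20000

def cn(shape):
    # unit-variance circularly symmetric complex Gaussian samples
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def energy(y):
    # energy-detector test statistic: sum of |y|^2 over the window
    return np.sum(np.abs(y) ** 2, axis=-1)

# Calibrate the detection threshold empirically from noise-only (H0) windows.
noise_stat = energy(cn((trials, N)))
thresh = np.quantile(noise_stat, 1 - pfa)

# H1: primary-user signal through Rayleigh fading, one fading gain per window.
h = cn((trials, 1))                      # Rayleigh fading gain, E|h|^2 = 1
s = np.sqrt(snr) * cn((trials, N))       # Gaussian primary-user signal
stat = energy(h * s + cn((trials, N)))
pd = np.mean(stat > thresh)

print(f"target Pfa = {pfa}, average Pd over Rayleigh fading = {pd:.2f}")
```

Averaging the conditional detection probability over the fading distribution, as done empirically here, is exactly what the paper's analytic expressions do in closed form for the TWDP case.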
Abstract:
Spectrum sensing is the cornerstone of cognitive radio technology and refers to the process of obtaining awareness of radio spectrum usage in order to detect the presence of other users. Spectrum sensing algorithms consume considerable energy and time. Prediction methods for inferring the channel occupancy of future time instants have been proposed as a means of improving performance in terms of energy and time consumption. This paper studies the performance of a hidden Markov model (HMM) spectrum occupancy predictor, as well as the improvement in sensing energy and time consumption, based on real occupancy data obtained in the 2.4 GHz ISM band. Experimental results show that the HMM-based occupancy predictor outperforms both a kth-order Markov predictor and a 1-nearest-neighbour (1NN) predictor. Our study also suggests that by employing such a predictive scheme in spectrum sensing, an improvement of up to 66% can be achieved in the required sensing energy and time.
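The kth-order Markov baseline named in the abstract is simple enough to sketch. The snippet below runs it on a synthetic two-state occupancy trace (the paper's HMM predictor and its real ISM-band traces are beyond a few lines, so everything here is illustrative).

```python
from collections import Counter, defaultdict
import random

random.seed(1)

def synthetic_trace(n, p_stay=0.9):
    """Synthetic binary occupancy trace (1 = channel busy) with memory:
    the channel keeps its current state with probability p_stay."""
    s, out = 0, []
    for _ in range(n):
        if random.random() > p_stay:
            s ^= 1
        out.append(s)
    return out

def markov_predict(trace, k=2):
    """kth-order Markov predictor: predict each slot as the most frequent
    successor of the k previous slots seen so far (online counts)."""
    counts = defaultdict(Counter)
    correct = total = 0
    for t in range(k, len(trace)):
        ctx = tuple(trace[t - k:t])
        if counts[ctx]:
            pred = counts[ctx].most_common(1)[0][0]
            correct += (pred == trace[t])
            total += 1
        counts[ctx][trace[t]] += 1   # update counts after predicting
    return correct / total

trace = synthetic_trace(20000)
acc = markov_predict(trace, k=2)
print(f"2nd-order Markov predictor accuracy: {acc:.2f}")
```

On this chain the best achievable accuracy is the stay probability (0.9), which the predictor approaches; an HMM can do better than this baseline when the true occupancy process is only partially observed, which is the regime the paper studies.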
Abstract:
With rising numbers of school-aged children with autism educated in mainstream classrooms, and applied behaviour analysis (ABA) considered the basis of best practice, teachers' knowledge in this field has become a key concern for inclusion. Self-reported knowledge of ABA among special needs teachers (n=165) was measured and compared to their actual knowledge of ABA, demonstrated by accurate responses to a multiple-choice test. Findings reported here show that teachers' self-perceived knowledge exceeded their actual knowledge, and that actual knowledge of ABA was not related to the training received from the government agency. Implications for teacher training are discussed.
Abstract:
BACKGROUND: The needs of children with autism spectrum disorder (ASD) are complex and this is reflected in the number and diversity of outcomes assessed and measurement tools used to collect evidence about children's progress. Relevant outcomes include improvement in core ASD impairments, such as communication, social awareness, sensory sensitivities and repetitiveness; skills such as social functioning and play; participation outcomes such as social inclusion; and parent and family impact.
OBJECTIVES: To examine the measurement properties of tools used to measure progress and outcomes in children with ASD up to the age of 6 years. To identify outcome areas regarded as important by people with ASD and parents.
METHODS: The MeASURe (Measurement in Autism Spectrum disorder Under Review) research collaboration included ASD experts and review methodologists. We undertook systematic review of tools used in ASD early intervention and observational studies from 1992 to 2013; systematic review, using the COSMIN checklist (Consensus-based Standards for the selection of health Measurement Instruments) of papers addressing the measurement properties of identified tools in children with ASD; and synthesis of evidence and gaps. The review design and process was informed throughout by consultation with stakeholders including parents, young people with ASD, clinicians and researchers.
RESULTS: The conceptual framework developed for the review was drawn from the International Classification of Functioning, Disability and Health, including the domains 'Impairments', 'Activity Level Indicators', 'Participation', and 'Family Measures'. In review 1, 10,154 papers were sifted (3091 by full text) and data were extracted from 184; in total, 131 tools were identified, excluding observational coding schemes, study-specific measures and those not available in English. In review 2, 2665 papers were sifted and data concerning the measurement properties of 57 (43%) of the tools were extracted from 128 papers. Evidence for the measurement properties of the reviewed tools was combined with information about their accessibility and presentation. Twelve tools were identified as having the strongest supporting evidence, the majority measuring autism characteristics and problem behaviour. The patchy evidence and limited scope of the outcomes measured mean that these tools do not constitute a 'recommended battery' for use. In particular, there is little evidence that the identified tools would be good at detecting change in intervention studies. The obvious gaps in available outcome measurement include well-being and participation outcomes for children, and family quality-of-life outcomes, domains particularly valued by our informants (young people with ASD and parents).
CONCLUSIONS: This is the first systematic review of the quality and appropriateness of tools designed to monitor progress and outcomes of young children with ASD. Although it was not possible to recommend fully robust tools at this stage, the review consolidates what is known about the field and will act as a benchmark for future developments. With input from parents and other stakeholders, recommendations are made about priority targets for research.
FUTURE WORK: Priorities include development of a tool to measure child quality of life in ASD, and validation of a potential primary outcome tool for trials of early social communication intervention.
STUDY REGISTRATION: This study is registered as PROSPERO CRD42012002223.
FUNDING: The National Institute for Health Research Health Technology Assessment programme.
Abstract:
This paper presents the application of the on-load exciting current Extended Park's Vector Approach for diagnosing incipient turn-to-turn winding faults in operating power transformers. Experimental and simulated test results demonstrate the effectiveness of the proposed technique, which is based on the spectral analysis of the AC component of the on-load exciting current Park's Vector modulus.
Abstract:
This paper describes in detail the design of a custom CMOS fast Fourier transform (FFT) processor for computing a 256-point complex FFT. The FFT is well suited for real-time spectrum analysis in instrumentation and measurement applications. The FFT butterfly processor reported here consists of one parallel-parallel multiplier and two adders. It is capable of performing one butterfly computation every 100 ns; thus it can compute a 256-point complex FFT in 102.4 μs, excluding data input and output processes.
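The quoted 102.4 μs follows directly from the radix-2 butterfly count, assuming a standard radix-2 decomposition (which the abstract does not spell out) and fully sequential butterfly execution:

```python
import math

N = 256                 # FFT length in complex points
t_butterfly_ns = 100    # one butterfly every 100 ns (from the abstract)

stages = int(math.log2(N))           # log2(256) = 8 stages
butterflies = (N // 2) * stages      # 128 butterflies per stage -> 1024 total
total_us = butterflies * t_butterfly_ns / 1000

print(stages, butterflies, total_us)   # 8 1024 102.4
```

So a single butterfly unit at 100 ns per butterfly gives exactly 1024 x 100 ns = 102.4 μs of compute time.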
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
Currently available rabies post-exposure prophylaxis (PEP) for use in humans includes equine or human rabies immunoglobulins (RIG). The replacement of RIG with an equally or more potent and safer product is strongly encouraged due to the high costs and limited availability of existing RIG. In this study, we identified two broadly neutralizing human monoclonal antibodies that represent a valid and affordable alternative to RIG in rabies PEP. Memory B cells from four selected vaccinated donors were immortalized and monoclonal antibodies were tested for neutralizing activity and epitope specificity. Two antibodies, identified as RVC20 and RVC58 (binding to antigenic sites I and III, respectively), were selected for their potency and broad-spectrum reactivity. In vitro, RVC20 and RVC58 were able to neutralize all 35 rabies virus (RABV) isolates and 25 non-RABV lyssaviruses tested. They showed higher potency and breadth compared to antibodies under clinical development (namely CR57, CR4098, and RAB1) and commercially available human RIG. In vivo, the RVC20–RVC58 cocktail protected Syrian hamsters from a lethal RABV challenge and did not affect the endogenous hamster post-vaccination antibody response.
Abstract:
Component joining is typically performed by welding, fastening, or adhesive bonding. For bonded aerospace applications, adhesives must withstand high temperatures (200°C or above, depending on the application), which implies their mechanical characterization under identical conditions. The extended finite element method (XFEM) is an enhancement of the finite element method (FEM) that can be used for the strength prediction of bonded structures. This work proposes and validates damage laws for a thin layer of an epoxy adhesive at room temperature (RT), 100, 150, and 200°C using the XFEM. The fracture toughness (GIc) and maximum load in pure tensile loading were determined by testing double-cantilever beam (DCB) and bulk tensile specimens, respectively, which permitted building the damage laws for each temperature. The bulk test results revealed that the maximum load decreased gradually with the temperature. On the other hand, the value of GIc of the adhesive, extracted from the DCB data, was shown to be relatively insensitive to temperature up to the glass transition temperature (Tg), while above Tg (at 200°C) a large reduction took place. The output of the DCB numerical simulations for the various temperatures showed good agreement with the experimental results, which validated the obtained data for the strength prediction of bonded joints in tension. From these results, the XFEM proved to be a viable alternative for the accurate strength prediction of bonded structures.
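A damage law built from a measured toughness and peak load, as described above, is commonly parameterized as a triangular traction–separation curve. The sketch below assumes that triangular shape and an initial stiffness taken as adhesive modulus over layer thickness; the shape, the stiffness assumption, and all numeric values are illustrative, not taken from the paper.

```python
def triangular_law(G_Ic, t0, E, thickness):
    """Return (delta0, deltaf): separations at damage onset and at failure
    for a triangular traction-separation law.
    Assumes initial stiffness K = E / thickness (a common approximation
    for a thin adhesive layer; an assumption, not from the paper)."""
    K = E / thickness
    delta0 = t0 / K            # separation at damage initiation (peak traction)
    deltaf = 2.0 * G_Ic / t0   # failure separation: triangle area = G_Ic
    return delta0, deltaf

# Illustrative room-temperature values for a stiff epoxy adhesive (SI units):
# G_Ic = 400 J/m^2, peak traction 40 MPa, E = 2 GPa, 0.2 mm bondline.
d0, df = triangular_law(G_Ic=0.4e3, t0=40e6, E=2.0e9, thickness=0.2e-3)
print(f"damage onset at {d0*1e6:.1f} um, failure at {df*1e6:.1f} um")
```

The temperature dependence reported in the abstract would enter through these two inputs: the peak traction falls gradually with temperature, while GIc holds roughly constant up to Tg and then drops.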
Abstract:
After a historical introduction, the bulk of the thesis concerns the study of a declarative semantics for logic programs. The main original contributions are:
• WFSX (Well-Founded Semantics with eXplicit negation), a new semantics for logic programs with explicit negation (i.e. extended logic programs), which compares favourably in its properties with other extant semantics.
• A generic characterization schema that facilitates comparisons among a diversity of semantics of extended logic programs, including WFSX.
• An autoepistemic and a default logic corresponding to WFSX, which solve existing problems of the classical approaches to autoepistemic and default logics, and clarify the meaning of explicit negation in logic programs.
• A framework for defining a spectrum of semantics of extended logic programs based on the abduction of negative hypotheses. This framework allows for the characterization of different levels of scepticism/credulity, consensuality, and argumentation. One of the semantics of abduction coincides with WFSX.
• O-semantics, a semantics that uniquely adds more CWA hypotheses to WFSX. The techniques used for doing so are applicable as well to the well-founded semantics of normal logic programs.
• Two approaches for dealing with the contradiction that may appear when explicit negation is introduced into logic programs, together with a proof of their equivalence. One approach avoids contradiction, and is based on restrictions in the adoption of abductive hypotheses; the other removes contradiction, and is based on a transformation of contradictory programs into noncontradictory ones, guided by the reasons for the contradiction.
Abstract:
The structural integrity of multi-component structures is usually determined by the strength and durability of their unions. Adhesive bonding is often chosen over welding, riveting and bolting, due to the reduction of stress concentrations, reduced weight penalty and easy manufacturing, amongst other advantages. In the past decades, the Finite Element Method (FEM) has been used for the simulation and strength prediction of bonded structures, by strength-of-materials or fracture-mechanics-based criteria. Cohesive-zone models (CZMs) have already proved to be an effective tool in modelling damage growth, surpassing a few limitations of the aforementioned techniques. Despite this fact, they still restrict damage growth to predefined paths. The eXtended Finite Element Method (XFEM) is a recent improvement of the FEM, developed to allow the growth of discontinuities within bulk solids along an arbitrary path, by enriching degrees of freedom with special displacement functions, thus overcoming the main restriction of CZMs. These two techniques were tested in the simulation of adhesively bonded single- and double-lap joints. The comparative evaluation of the two methods showed their capabilities and limitations for this specific purpose.
Abstract:
Prototype validation is a major concern in modern electronic product design and development. Simulation, structural test, and functional and timing debug all form part of the validation process, although they are very often addressed as dissociated tasks. In this paper we describe an integrated approach to board-level prototype validation, based on a set of mandatory/optional BST (boundary-scan test) instructions and a built-in controller for debug and test, which addresses the aforementioned tasks as inherent parts of a whole process.