22 results for ERROR rates

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 70.00%

Abstract:

In the IEEE 802.11 MAC layer protocol, there are different trade-off points between the number of nodes competing for the medium and the network capacity provided to them. There is also a trade-off between the wireless channel condition during the transmission period and the energy consumption of the nodes. Current approaches to modeling energy consumption in 802.11-based networks do not consider the influence of the channel condition on all types of frames (control and data) in the WLAN, nor do they consider the different MAC and PHY schemes that can occur in 802.11 networks. In this paper, we investigate energy consumption as a function of the number of competing nodes in IEEE 802.11's MAC and PHY layers under error-prone wireless channel conditions, and present a new energy consumption model. Analysis of the power consumed by each type of MAC and PHY over different bit error rates shows that the parameters in these layers play a critical role in determining the overall energy consumption of the ad-hoc network. The goal of this research is not only to compare the energy consumption using exact formulae in saturated IEEE 802.11-based DCF networks under varying numbers of competing nodes, but also, as the results show, to demonstrate that channel errors have a significant impact on energy consumption.
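The link between channel errors and energy can be illustrated with a minimal sketch (an illustrative toy model, not the paper's exact formulae): assuming independent bit errors, a frame of L bits succeeds with probability (1 - BER)^L, so the expected number of transmissions, and hence the energy per delivered frame, grows as the channel degrades.

```python
def frame_success_prob(ber, frame_bits):
    """Probability that a frame of `frame_bits` bits arrives error-free,
    assuming independent bit errors at the given bit error rate."""
    return (1.0 - ber) ** frame_bits

def expected_energy_per_frame(ber, frame_bits, tx_energy_joules):
    """Mean energy to deliver one frame when it is retransmitted until it
    succeeds: the expected attempt count is 1/p for success probability p."""
    return tx_energy_joules / frame_success_prob(ber, frame_bits)

# A 1500-byte data frame: raising the BER from 1e-6 to 1e-4 multiplies the
# expected energy per delivered frame roughly threefold.
e_clean = expected_energy_per_frame(1e-6, 8 * 1500, 0.002)
e_noisy = expected_energy_per_frame(1e-4, 8 * 1500, 0.002)
```

The frame length, per-attempt energy and BER values here are placeholders for illustration; the paper's model additionally distinguishes control from data frames and the different MAC/PHY schemes.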

Relevance: 70.00%

Abstract:

Multiuser diversity gain has been well investigated in the literature in terms of system capacity. In practice, however, the design of multiuser systems with non-zero error rates requires a relationship between the error rates and the number of users within a cell. We focus on the uplink under best-user scheduling, where the user with the best channel condition is scheduled to transmit in each scheduling interval. We assume that each user communicates with the base station through a single-input multiple-output channel. We derive a closed-form expression for the average BER, and analyze how the average BER goes to zero asymptotically as the number of users increases for a given SNR. Note that the analysis of average BER even in SISO multiuser diversity systems has not been done with respect to the number of users for a given SNR. Our analysis can be applied to multiuser diversity systems with any number of antennas.
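The effect of best-user scheduling on error rates can be sketched with a small Monte Carlo simulation (an assumed BPSK-over-Rayleigh setup for illustration, not the paper's SIMO closed form): scheduling the best of K users raises the post-scheduling SNR, so the average BER falls as K grows at fixed SNR.

```python
import math
import random

def avg_ber_best_user(num_users, snr_linear, trials=20000, seed=1):
    """Average BPSK BER when, in each slot, the user with the largest
    Rayleigh channel power gain is scheduled (gains i.i.d. Exponential(1)).
    The instantaneous BER at SNR g*snr_linear is 0.5*erfc(sqrt(g*snr))."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        g = max(rng.expovariate(1.0) for _ in range(num_users))
        total += 0.5 * math.erfc(math.sqrt(snr_linear * g))
    return total / trials

# Multiuser diversity: the scheduled gain grows with the user pool, so the
# average BER falls as the number of users increases at fixed SNR.
ber_one_user = avg_ber_best_user(1, snr_linear=2.0)
ber_eight_users = avg_ber_best_user(8, snr_linear=2.0)
```

The SNR value and trial count are arbitrary illustration parameters; the abstract's asymptotic analysis characterises exactly how fast this decay proceeds in the number of users.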

Relevance: 60.00%

Abstract:

In view of the evidence that cognitive deficits in schizophrenia are critically important for long-term outcome, it is essential to establish the effects that the various antipsychotic compounds have on cognition, particularly second-generation drugs. This parallel group, placebo-controlled study aimed to compare the effects in healthy volunteers (n = 128) of acute doses of the atypical antipsychotics amisulpride (300 mg) and risperidone (3 mg) to those of chlorpromazine (100 mg) on tests thought relevant to the schizophrenic process: auditory and visual latent inhibition, prepulse inhibition of the acoustic startle response, executive function and eye movements. The drugs tested were not found to affect auditory latent inhibition, prepulse inhibition or executive functioning as measured by the Cambridge Neuropsychological Test Battery and the FAS test of verbal fluency. However, risperidone disrupted and amisulpride showed a trend to disrupt visual latent inhibition. Although amisulpride did not affect eye movements, both risperidone and chlorpromazine decreased peak saccadic velocity and increased antisaccade error rates, which, in the risperidone group, correlated with drug-induced akathisia. It was concluded that single doses of these drugs appear to have little effect on cognition, but may affect eye movement parameters in accordance with the amount of sedation and akathisia they produce. The effect risperidone had on latent inhibition is likely to relate to its serotonergic properties. Furthermore, as the trend for disrupted visual latent inhibition following amisulpride was similar in nature to that which would be expected with amphetamine, it was concluded that its behaviour in this model is consistent with its preferential presynaptic dopamine antagonistic activity in low dose and its efficacy in the negative symptoms of schizophrenia.

Relevance: 60.00%

Abstract:

This study finds evidence that attempts to reduce costs and error rates in the Inland Revenue through the use of e-commerce technology are flawed. While it is technically possible to write software that will record tax data and then transmit it to the Inland Revenue, there is little demand for this service. The key finding is that the tax system is so complex that many people are unable to complete their own tax returns, and this complexity cannot be overcome by well-designed software. The recommendation is to encourage the use of agents to assist taxpayers, or to simplify the tax system. The Inland Revenue is interested in reducing administrative costs and errors by encouraging electronic submission of tax returns. To achieve these objectives, the raw data make clear that the focus should be on facilitating the work of agents.

Relevance: 60.00%

Abstract:

This paper investigates the problem of speaker identification and verification in noisy conditions, assuming that speech signals are corrupted by environmental noise, but knowledge about the noise characteristics is not available. This research is motivated in part by the potential application of speaker recognition technologies on handheld devices or the Internet. While the technologies promise an additional biometric layer of security to protect the user, the practical implementation of such systems faces many challenges. One of these is environmental noise. Due to the mobile nature of such systems, the noise sources can be highly time-varying and potentially unknown. This raises the requirement for noise robustness in the absence of information about the noise. This paper describes a method that combines multicondition model training and missing-feature theory to model noise with unknown temporal-spectral characteristics. Multicondition training is conducted using simulated noisy data with limited noise variation, providing a "coarse" compensation for the noise, and missing-feature theory is applied to refine the compensation by ignoring noise variation outside the given training conditions, thereby reducing the training and testing mismatch. This paper is focused on several issues relating to the implementation of the new model for real-world applications. These include the generation of multicondition training data to model noisy speech, the combination of different training data to optimize the recognition performance, and the reduction of the model's complexity. The new algorithm was tested using two databases with simulated and realistic noisy speech data. The first database is a redevelopment of the TIMIT database by rerecording the data in the presence of various noise types, used to test the model for speaker identification with a focus on the varieties of noise. The second database is a handheld-device database collected in realistic noisy conditions, used to further validate the model for real-world speaker verification. The new model is compared to baseline systems and is found to achieve lower error rates.

Relevance: 60.00%

Abstract:

Response time (RT) variability is a common finding in ADHD research. RT variability may reflect frontal cortex function and may be related to deficits in sustained attention. The existence of a sustained attention deficit in ADHD has been debated, largely because of inconsistent evidence of time-on-task effects. A fixed-sequence Sustained Attention to Response Task (SART) was given to 29 control, 39 unimpaired and 24 impaired-ADHD children (impairment defined by the number of commission errors). The response time data were analysed using the Fast Fourier Transform, to define the fast-frequency and slow-frequency contributions to overall response variability. The impaired-ADHD group progressively slowed in RT over the course of the 5.5 min task, as reflected in this group's greater slow-frequency variability. The fast-frequency trial-to-trial variability was also significantly greater, but did not differentially worsen over the course of the task. The higher error rates of the impaired-ADHD group did not become differentially greater over the length of the task. The progressive slowing in mean RT over the course of the task may relate to a deficit in arousal in the impaired-ADHD group. The consistently poor performance in fast-frequency variability and error rates may be due to difficulties in sustained attention that fluctuate on a trial-to-trial basis.
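The Fourier-based decomposition of RT variability described above can be sketched as follows (the 10% low-frequency cutoff is an assumption chosen for illustration, not the study's exact bands):

```python
import numpy as np

def rt_variability_split(rts, slow_fraction=0.1):
    """Split the variance of a reaction-time series into the fraction carried
    by the lowest `slow_fraction` of frequency bins ('slow') and the rest
    ('fast'), via the power spectrum of the mean-centred series."""
    x = np.asarray(rts, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2   # spectral power per frequency bin
    power[0] = 0.0                        # DC bin is zero after mean removal
    cut = max(1, int(slow_fraction * len(power)))
    slow = power[:cut].sum() / power.sum()
    return slow, 1.0 - slow

# A slow drift (e.g. progressive RT slowing over the task) loads the
# low-frequency bins, whereas trial-to-trial jitter loads the high ones.
trials = np.arange(256)
drift = 600 + 50 * np.sin(2 * np.pi * 2 * trials / 256)     # slow component
jitter = 600 + 50 * np.sin(2 * np.pi * 60 * trials / 256)   # fast component
slow_d, fast_d = rt_variability_split(drift)
slow_j, fast_j = rt_variability_split(jitter)
```

On these synthetic series, virtually all variance of the drift signal lands in the slow band and virtually all variance of the jitter signal in the fast band, mirroring the slow-frequency versus trial-to-trial distinction the study draws.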

Relevance: 60.00%

Abstract:

Difficulties in phonological processing have been proposed to be the core symptom of developmental dyslexia. Phoneme awareness tasks have been shown to both index and predict individual reading ability. In a previous experiment, we observed that dyslexic adults fail to display a P3a modulation for phonological deviants within an alliterated word stream when concentrating primarily on a lexical decision task [Fosker and Thierry, 2004, Neurosci. Lett. 357, 171-174]. Here we recorded the P3b oddball response elicited by initial phonemes within streams of alliterated words and pseudo-words when participants focussed directly on detecting the oddball phonemes. Despite significant verbal screening test differences between dyslexic adults and controls, the error rates, reaction times, and main components (P2, N2, P3a, and P3b) were indistinguishable across groups. The only difference between groups was found in the N1 range, where dyslexic participants failed to show the modulations induced by phonological pairings (/b/-/p/ versus /r/-/g/) in controls. In light of previous P3a differences, these results suggest an important role for attention allocation in the manifestation of phonological deficits in developmental dyslexia.

Relevance: 60.00%

Abstract:

Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis involves using methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some which violate standard regression assumptions. We assess the performance of each method using two measures and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and so did not show the improvements in performance under model selection that the above methods achieved. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be performed cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
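The poorly controlled Type I error rates of ordinary least squares under spatial autocorrelation can be reproduced in miniature (a simplified 1-D AR(1) stand-in for the paper's 2-D spatial fields, with an arbitrary rho of 0.8):

```python
import numpy as np

def _ar1(rng, n, rho):
    """AR(1) series with unit marginal variance."""
    e = np.empty(n)
    e[0] = rng.normal()
    for t in range(1, n):
        e[t] = rho * e[t - 1] + np.sqrt(1.0 - rho ** 2) * rng.normal()
    return e

def ols_rejection_rate(n=200, rho=0.8, reps=400, seed=0):
    """Share of simulations in which OLS rejects slope = 0 at the 5% level,
    with BOTH predictor and errors autocorrelated and no true effect."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = _ar1(rng, n, rho)
        y = _ar1(rng, n, rho)          # independent of x: true slope is zero
        xc = x - x.mean()
        beta = xc @ y / (xc @ xc)
        resid = (y - y.mean()) - beta * xc
        se = np.sqrt(resid @ resid / (n - 2) / (xc @ xc))   # naive OLS s.e.
        hits += abs(beta / se) > 1.96
    return hits / reps

rate_spatial = ols_rejection_rate(rho=0.8)   # autocorrelated: well above 5%
rate_iid = ols_rejection_rate(rho=0.0)       # independent: near nominal 5%
```

The naive standard error ignores the autocorrelation, so false positives occur far more often than the nominal 5%; methods such as generalized least squares model the correlation and restore the nominal rate.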

Relevance: 60.00%

Abstract:

Errors involving drug prescriptions are a key target for patient safety initiatives. Recent studies have focused on error rates across different grades of doctors in order to target interventions. However, many prescriptions are not instigated by the doctor who writes them. It is important to clarify how often this occurs in order to interpret these studies and create interventions. This study aimed to provisionally quantify and describe prescriptions where the identity of the decision maker and prescription writer differed.

Relevance: 60.00%

Abstract:

Objectives: Study objectives were to investigate the prevalence and causes of prescribing errors amongst foundation doctors (i.e. junior doctors in their first (F1) or second (F2) year of post-graduate training), describe their knowledge and experience of prescribing errors, and explore their self-efficacy (i.e. confidence) in prescribing.

Method: A three-part mixed-methods design was used, comprising: prospective observational study; semi-structured interviews and cross-sectional survey. All doctors prescribing in eight purposively selected hospitals in Scotland participated. All foundation doctors throughout Scotland participated in the survey. The number of prescribing errors per patient, doctor, ward and hospital, perceived causes of errors and a measure of doctors’ self-efficacy were established.

Results: 4710 patient charts and 44,726 prescribed medicines were reviewed. There were 3364 errors, affecting 1700 (36.1%) charts (overall error rate: 7.5%; F1: 7.4%; F2: 8.6%; consultants: 6.3%). Higher error rates were associated with: teaching hospitals (p < 0.001), surgical (p < 0.001) or mixed (p = 0.008) rather than medical wards, higher patient-turnover wards (p < 0.001), a greater number of prescribed medicines (p < 0.001) and the months December and June (p < 0.001). One hundred errors were discussed in 40 interviews. Error causation was multi-factorial; work environment and team factors were particularly noted. Of 548 completed questionnaires (national response rate of 35.4%), 508 (92.7% of respondents) reported errors, most of which (328; 64.6%) did not reach the patient. Pressure from other staff, workload and interruptions were cited as the main causes of errors. Foundation year 2 doctors reported greater confidence than year 1 doctors in deciding the most appropriate medication regimen.
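As a quick consistency check, the headline percentages in the results follow directly from the raw counts stated there:

```python
# Figures as stated in the abstract's results section.
errors, medicines_reviewed = 3364, 44726
charts_affected, charts_reviewed = 1700, 4710
errors_not_reaching_patient, doctors_reporting_errors = 328, 508

overall_error_rate = 100.0 * errors / medicines_reviewed          # 7.5%
charts_affected_pct = 100.0 * charts_affected / charts_reviewed   # 36.1%
not_reaching_pct = (100.0 * errors_not_reaching_patient
                    / doctors_reporting_errors)                   # 64.6%
```

Each derived value matches the percentage reported in the abstract to one decimal place.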

Conclusions: Prescribing errors are frequent and of complex causation. Foundation doctors made more errors than other doctors, but undertook the majority of prescribing, making them a key target for intervention. Contributing causes included work environment, team, task, individual and patient factors. Further work is needed to develop and assess interventions that address these.

Relevance: 60.00%

Abstract:

Hardware impairments in physical transceivers are known to have a deleterious effect on communication systems; however, very few contributions have investigated their impact on relaying. This paper quantifies the impact of transceiver impairments in a two-way amplify-and-forward configuration. More specifically, the effective signal-to-noise-and-distortion ratios at both transmitter nodes are obtained. These are used to deduce exact and asymptotic closed-form expressions for the outage probabilities (OPs), as well as tractable formulations for the symbol error rates (SERs). It is explicitly shown that non-zero lower bounds on the OP and SER exist in the high-power regime; this stands in contrast to the special case of ideal hardware, where the OP and SER go asymptotically to zero.
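The non-zero error floors can be seen from the standard distortion model (a simplified single-link sketch, reducing the paper's two-way relay setting to one hop; kappa is an aggregate impairment level assumed for illustration):

```python
def effective_sndr(snr, kappa):
    """Effective signal-to-noise-and-distortion ratio when transceiver
    distortion power scales with signal power by a factor kappa**2
    (kappa = 0 recovers ideal hardware)."""
    return snr / (snr * kappa ** 2 + 1.0)

# With ideal hardware the SNDR grows without bound in the transmit SNR,
# but any kappa > 0 caps it at 1 / kappa**2, so the outage probability and
# symbol error rate saturate at non-zero floors in the high-power regime.
ceiling = 1.0 / 0.1 ** 2                  # = 100 for kappa = 0.1
```

Because the distortion term grows with the signal power itself, raising the transmit power cannot push the SNDR past 1/kappa^2, which is exactly why the OP and SER bounds in the abstract do not vanish.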

Relevance: 60.00%

Abstract:

In this paper, we propose a design paradigm for energy efficient and variation-aware operation of next-generation multicore heterogeneous platforms. The main idea behind the proposed approach lies in the observation that not all operations are equally important in shaping the output quality of various applications and of the overall system. Based on such an observation, we suggest that all levels of the software design stack, including the programming model, compiler, operating system (OS) and run-time system, should identify the critical tasks and ensure correct operation of such tasks by assigning them to dynamically adjusted reliable cores/units. Specifically, based on error rates and operating conditions identified by a sense-and-adapt (SeA) unit, the OS selects and sets the right mode of operation of the overall system. The run-time system identifies the critical/less-critical tasks based on special directives and schedules them to the appropriate units that are dynamically adjusted for highly-accurate/approximate operation by tuning their voltage/frequency. Units that execute less significant operations can operate at voltages less than what is required for correct operation and consume less power, if required, since such tasks do not need to be always exact as opposed to the critical ones. Such a scheme can lead to energy efficient and reliable operation, while reducing the design cost and overheads of conventional circuit/micro-architecture level techniques.

Relevance: 60.00%

Abstract:

The end of Dennard scaling has pushed power consumption into a first-order concern for current systems, on par with performance. As a result, near-threshold voltage computing (NTVC) has been proposed as a potential means to tackle the limited cooling capacity of CMOS technology. Hardware operating in the NTV regime consumes significantly less power, at the cost of lower frequency, and thus reduced performance, as well as increased error rates. In this paper, we investigate whether a low-power system-on-chip based on ARM's asymmetric big.LITTLE technology can be an alternative to conventional high-performance multicore processors in terms of power/energy in an unreliable scenario. For our study, we use the Conjugate Gradient solver, an algorithm representative of the computations performed by a large range of scientific and engineering codes.

Relevance: 60.00%

Abstract:

An investigation into exchange-traded fund (ETF) outperformance during the period 2008-2012 is undertaken utilizing a data set of 288 U.S. traded securities. ETFs are tested for net asset value (NAV) premium, underlying index and market benchmark outperformance, with Sharpe, Treynor, and Sortino ratios employed as risk-adjusted performance measures. A key contribution is the application of an innovative generalized stepdown procedure in controlling for data snooping bias. We find that a large proportion of optimized replication and debt asset class ETFs display risk-adjusted premiums with energy and precious metals focused funds outperforming the S&P 500 market benchmark.
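The three risk-adjusted measures named above have standard textbook definitions, sketched here (periodic returns; taking the risk-free rate as zero by default is an assumption of this illustration, not of the study):

```python
import statistics

def sharpe(returns, rf=0.0):
    """Mean excess return per unit of total volatility (sample st. dev.)."""
    ex = [r - rf for r in returns]
    return statistics.mean(ex) / statistics.stdev(ex)

def sortino(returns, rf=0.0):
    """Mean excess return per unit of downside deviation only."""
    ex = [r - rf for r in returns]
    downside_dev = (sum(min(e, 0.0) ** 2 for e in ex) / len(ex)) ** 0.5
    return statistics.mean(ex) / downside_dev

def treynor(returns, bench, rf=0.0):
    """Mean excess return per unit of beta against a benchmark."""
    mr, mb = statistics.mean(returns), statistics.mean(bench)
    cov = sum((r - mr) * (b - mb)
              for r, b in zip(returns, bench)) / (len(bench) - 1)
    beta = cov / statistics.variance(bench)
    return (mr - rf) / beta

rets = [0.02, -0.01, 0.03, 0.01]            # hypothetical periodic returns
bench = [0.01, -0.005, 0.015, 0.005]        # hypothetical benchmark returns
```

The three ratios differ only in the risk denominator: total volatility (Sharpe), downside deviation (Sortino), or systematic risk via beta (Treynor), which is why the study reports all three when judging risk-adjusted premiums.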

Relevance: 60.00%

Abstract:

A digital directional modulation (DM) transmitter structure is proposed from a practical implementation point of view in this paper. This digital DM architecture is built with the help of several off-the-shelf physical layer wireless experiment platform hardware boards. Compared with previous analogue DM transmitter architectures, the digital approach offers more precise and faster control over updates of the array excitations. More importantly, it is an ideal physical arrangement in which to implement the most universal DM synthesis algorithm, i.e., the orthogonal vector approach. The practical issues in digital DM system calibration are described and solved. Bit error rates (BERs) are measured via real-time data transmissions to illustrate the DM advantages, in terms of secrecy performance, over conventional non-DM beam-steering transmitters.