930 results for error performance
Abstract:
In this paper, dual-hop amplify-and-forward (AF) cooperative systems in the presence of high-power amplifier (HPA) nonlinearity at semi-blind relays are investigated. Based on a modified AF cooperative system model that takes the HPA nonlinearity into account, the expression for the output signal-to-noise ratio (SNR) at the destination node is derived, where the interference due to both the AF relaying mechanism and the HPA nonlinearity is characterized. The performance of the AF cooperative system under study is evaluated in terms of the average symbol error probability (SEP), which is derived using the moment-generating function (MGF) approach, considering transmissions over Nakagami-m fading channels. Numerical results are provided and show the effects of system parameters, such as the HPA parameters, the number of relays, the quadrature amplitude modulation (QAM) order, and the Nakagami parameters, on performance.
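As background on the MGF approach mentioned above (a standard result, not the paper's own end-to-end derivation): for square M-QAM over a fading channel whose SNR has moment-generating function $M_\gamma(\cdot)$, the Simon-Alouini formulation gives

\[
P_s = \frac{4}{\pi}\Big(1-\tfrac{1}{\sqrt{M}}\Big)\int_0^{\pi/2} M_\gamma\!\Big(-\frac{g}{\sin^2\theta}\Big)\,d\theta
- \frac{4}{\pi}\Big(1-\tfrac{1}{\sqrt{M}}\Big)^{2}\int_0^{\pi/4} M_\gamma\!\Big(-\frac{g}{\sin^2\theta}\Big)\,d\theta,
\qquad g = \frac{3}{2(M-1)},
\]

and for Nakagami-m fading with average SNR $\bar{\gamma}$ the MGF is $M_\gamma(s) = \big(1 - s\bar{\gamma}/m\big)^{-m}$.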
Abstract:
In this paper, we consider multiple-input multiple-output (MIMO) maximal ratio combining (MRC) systems and assess the system performance in terms of average symbol error probability (SEP), outage probability, and ergodic capacity in double-correlated Rayleigh-and-Lognormal fading channels. In order to derive the receive and transmit correlation functions needed for the performance analysis, a three-dimensional (3D) MIMO mobile-to-mobile (M-to-M) channel model, which takes into account the effects of fast fading and shadowing, is used. Numerical results are provided to show the effects of system parameters, such as the maximum elevation angle of scatterers, the orientation angle of the antenna array in the x-y plane, the angle between the x-y plane and the antenna array orientation, and the degree of scattering in the x-y plane, on the system performance.
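For reference, the three measures named above are defined in terms of the post-combining SNR $\gamma$; in one common MIMO MRC formulation (maximal ratio transmission at the transmitter, MRC at the receiver), $\gamma = \bar{\gamma}\,\lambda_{\max}(\mathbf{H}^{H}\mathbf{H})$, the largest eigenvalue of the channel Gram matrix scaled by the average SNR, and

\[
P_{\mathrm{out}} = \Pr\{\gamma < \gamma_{\mathrm{th}}\},
\qquad
C_{\mathrm{erg}} = \mathbb{E}\big[\log_2(1+\gamma)\big],
\]

with the average SEP obtained by averaging the conditional error probability over the distribution of $\gamma$.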
Abstract:
Point and click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users, though, older adults experience greater difficulties performing cursor positioning tasks, and this can present limitations to using a computer easily and effectively. Target expansion is a technique for improving pointing performance, where the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet can still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve times by up to 14% and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e., after the cursor has reached the expanded target area or even after it has reached the original target area. Participants' subjective feedback on target expansion is generally favorable, lending further support to the technique.
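A brief note on why expansion helps (background, not a claim of this paper): pointing time is commonly modeled by Fitts' law in its Shannon formulation,

\[
MT = a + b \log_2\!\Big(\frac{D}{W} + 1\Big),
\]

where $D$ is the distance to the target and $W$ its width. Expanding the target effectively increases $W$ and lowers the index of difficulty, and earlier work on expanding targets found that movement time tends to track the expanded width even when expansion occurs late in the movement, consistent with the results reported here.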
Abstract:
In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, where metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that the by-participant analysis, regardless of the accuracy measurements used, would produce a substantial inflation of Type-1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to effectively address the issue, and our simulation studies examining Type-1 error rates indeed showed superior performance of the mixed-effects model analysis as compared to the conventional by-participant analysis. We also present real data applications to illustrate further strengths of the mixed-effects model analysis. Our findings imply that caution is needed when using the by-participant analysis, and we recommend the mixed-effects model analysis.
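A minimal sketch of the kind of crossed random-effects analysis advocated here, on synthetic stand-in data with hypothetical columns judgment, recalled, participant, and item. It uses the statsmodels pattern of a single group with one variance component per factor to obtain crossed participant and item effects; the authors' exact specification (likely a logistic mixed model for a binary memory outcome) may differ, and a Gaussian response is used only for simplicity.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in: 40 participants x 30 items, fully crossed, with
# random participant and item intercepts plus a judgment-memory slope.
rng = np.random.default_rng(1)
n_p, n_i = 40, 30
p_eff = rng.normal(0, 0.5, n_p)
i_eff = rng.normal(0, 0.5, n_i)
rows = []
for p in range(n_p):
    for i in range(n_i):
        judgment = rng.normal(0, 1)
        recalled = 0.3 * judgment + p_eff[p] + i_eff[i] + rng.normal(0, 1)
        rows.append((p, i, judgment, recalled))
df = pd.DataFrame(rows, columns=["participant", "item", "judgment", "recalled"])

# statsmodels fits crossed random effects by treating the whole data set
# as one group and declaring each factor as a variance component.
df["one_group"] = 1
model = smf.mixedlm(
    "recalled ~ judgment",                 # fixed effect: judgment-memory link
    data=df,
    groups="one_group",
    vc_formula={"participant": "0 + C(participant)",  # random participant intercepts
                "item": "0 + C(item)"},               # random item intercepts
)
print(model.fit().summary())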
Abstract:
Low-power medium access control (MAC) protocols used for communication of energy-constrained wireless embedded devices do not cope well with situations where transmission channels are highly erroneous. Existing MAC protocols discard corrupted messages, which leads to costly retransmissions. To improve transmission performance, it is possible to include an error correction scheme and transmit/receive diversity: redundant information can be added to transmitted packets in order to recover data from corrupted packets, and transmit/receive diversity via multiple antennas can improve the error resiliency of transmissions. Both schemes may be used in conjunction to further improve the performance. In this study, the authors show how an error correction scheme and transmit/receive diversity can be integrated in low-power MAC protocols. Furthermore, the authors investigate the achievable performance gains of both methods. This is important as both methods have associated costs (processing requirements; additional antennas and power), and for a given communication situation it must be decided which methods should be employed. The authors' results show that, in many practical situations, error control coding outperforms transmission diversity; however, if very high reliability is required, it is useful to employ both schemes together.
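To make the redundancy idea concrete, here is a minimal sketch of forward error correction with a Hamming(7,4) code, which lets a receiver correct any single flipped bit per codeword. It only illustrates the principle; the coding scheme actually integrated into a low-power MAC would be chosen for the radio and packet format at hand.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into 7 bits with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4     # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4     # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4     # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct a single flipped bit (if any) and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the erroneous bit
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[3] ^= 1                      # simulate one channel bit error
assert hamming74_decode(codeword) == [1, 0, 1, 1]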
Abstract:
The use of virtualization in high-performance computing (HPC) has been suggested as a means to provide tailored services and added functionality that many users expect from full-featured Linux cluster environments. The use of virtual machines in HPC can offer several benefits, but maintaining performance is a crucial factor. In some instances the performance criteria are placed above the isolation properties. This selective relaxation of isolation for performance is an important characteristic when considering resilience for HPC environments that employ virtualization. In this paper, we consider some of the factors associated with balancing performance and isolation in configurations that employ virtual machines. In this context, we propose a classification of errors based on the concept of “error zones”, as well as a detailed analysis of the trade-offs between resilience and performance based on the level of isolation provided by virtualization solutions. Finally, a set of experiments is performed using different virtualization solutions to elucidate the discussion.
Abstract:
For certain observation types, such as those that are remotely sensed, the observation errors are correlated, and these correlations are state- and time-dependent. In this work, we develop a method for diagnosing and incorporating spatially correlated and time-dependent observation error in an ensemble data assimilation system. The method combines an ensemble transform Kalman filter with a method that uses statistical averages of background and analysis innovations to provide an estimate of the observation error covariance matrix. To evaluate the performance of the method, we perform identical twin experiments using the Lorenz ’96 and Kuramoto-Sivashinsky models. Using our approach, a good approximation to the true observation error covariance can be recovered in cases where the initial estimate of the error covariance is incorrect. Spatial observation error covariances where the length scale of the true covariance changes slowly in time can also be captured. We find that using the estimated correlated observation error in the assimilation improves the analysis.
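The innovation-statistics estimate described here is, in its usual formulation (the Desroziers et al. 2005 diagnostic), the expectation of the product of analysis and background residuals in observation space:

\[
\mathbf{R} \approx \mathbb{E}\big[\mathbf{d}^{o}_{a}\,(\mathbf{d}^{o}_{b})^{T}\big],
\qquad
\mathbf{d}^{o}_{b} = \mathbf{y} - H(\mathbf{x}^{b}),
\quad
\mathbf{d}^{o}_{a} = \mathbf{y} - H(\mathbf{x}^{a}),
\]

where $\mathbf{y}$ are the observations, $H$ the observation operator, and $\mathbf{x}^{b}$, $\mathbf{x}^{a}$ the background and analysis states.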
Abstract:
This paper presents a quantitative evaluation of a tracking system on PETS 2015 Challenge datasets using well-established performance measures. Using existing tools, the tracking system implements an end-to-end pipeline that includes object detection, tracking, and post-processing stages. The evaluation results are presented on the provided sequences of both the ARENA and P5 datasets of the PETS 2015 Challenge. The results show an encouraging performance of the tracker in terms of accuracy, but a greater tendency toward cardinality errors and ID changes on both datasets. Moreover, the analysis shows better performance of the tracker on visible imagery than on thermal imagery.
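For context, the well-established measures in multi-object tracking evaluation typically include the CLEAR MOT metrics, in which the cardinality errors (false positives and misses) and identity switches mentioned above are aggregated into the multiple object tracking accuracy:

\[
\mathrm{MOTA} = 1 - \frac{\sum_t \big(\mathrm{FN}_t + \mathrm{FP}_t + \mathrm{IDSW}_t\big)}{\sum_t \mathrm{GT}_t},
\]

where $\mathrm{GT}_t$ is the number of ground-truth objects in frame $t$.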
Abstract:
Estimating trajectories and parameters of dynamical systems from observations is a problem frequently encountered in various branches of science; geophysicists, for example, refer to this problem as data assimilation. Unlike in estimation problems with exchangeable observations, in data assimilation the observations cannot easily be divided into separate sets for estimation and validation; this creates serious problems, since simply using the same observations for estimation and validation might result in overly optimistic performance assessments. To circumvent this problem, a result is presented which allows us to estimate this optimism, thus allowing for a more realistic performance assessment in data assimilation. The presented approach becomes particularly simple for data assimilation methods employing a linear error feedback (such as synchronization schemes, nudging, incremental 3D-Var and 4D-Var, and various Kalman filter approaches). Numerical examples considering a high-gain observer confirm the theory.
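By linear error feedback is meant, in the standard discrete observer form, an update in which the innovation enters linearly through a gain:

\[
\hat{\mathbf{x}}_{k+1} = f(\hat{\mathbf{x}}_{k}) + \mathbf{K}\big(\mathbf{y}_{k} - h(\hat{\mathbf{x}}_{k})\big),
\]

where $f$ is the model dynamics, $h$ the observation map, and $\mathbf{K}$ the feedback gain (the nudging coefficient, observer gain, or Kalman gain, depending on the scheme).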
Abstract:
The Natural History of Human Papillomavirus (HPV) Infection in Men: The HIM Study is a prospective multi-center cohort study that, among other factors, analyzes participants' diet. A parallel cross-sectional study was designed to evaluate the validity and reproducibility of the quantitative food frequency questionnaire (QFFQ) used in the Brazilian center of the HIM Study. For this, a convenience subsample of 98 men aged 18 to 70 years from the HIM Study in Brazil answered three 54-item QFFQs and three 24-hour recall interviews, with 6-month intervals between them (data collection January to September 2007). A Bland-Altman analysis indicated that the difference between instruments was dependent on the magnitude of the intake for energy and most nutrients included in the validity analysis, with the exception of carbohydrates, fiber, polyunsaturated fat, vitamin C, and vitamin E. The correlation between the QFFQ and the 24-hour recall for the deattenuated and energy-adjusted data ranged from 0.05 (total fat) to 0.57 (calcium). For the energy and nutrient consumption included in the validity analysis, 33.5% of participants on average were correctly classified into quartiles, and the average value of 0.26 for the weighted kappa shows reasonable agreement. The intraclass correlation coefficients for all nutrients were greater than 0.40 in the reproducibility analysis. The QFFQ demonstrated good reproducibility and acceptable validity. The results support the use of this instrument in the HIM Study. J Am Diet Assoc. 2011;111:1045-1051.
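A minimal sketch of the two Bland-Altman quantities involved here, the bias with its limits of agreement and the check for magnitude-dependent differences, on synthetic stand-in data; the study's actual pipeline, with deattenuation and energy adjustment, is more involved.

import numpy as np

rng = np.random.default_rng(0)
recall = rng.normal(2200.0, 450.0, size=98)         # stand-in: mean 24-hour recall energy, kcal
qffq = recall + rng.normal(120.0, 300.0, size=98)   # stand-in: QFFQ energy with bias and noise

diff = qffq - recall
mean_pair = (qffq + recall) / 2.0
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.0f} kcal, limits of agreement = "
      f"[{bias - half_width:.0f}, {bias + half_width:.0f}] kcal")

# Magnitude dependence, as tested in the abstract: a non-zero slope of the
# difference on the pair mean indicates proportional bias.
slope, _ = np.polyfit(mean_pair, diff, 1)
print(f"slope of difference vs. mean = {slope:.3f}")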
Abstract:
Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. As predictive performance evaluation is a multidimensional problem, single scalar summaries such as the error rate, although quite convenient due to their simplicity, can seldom evaluate all the aspects that a complete and reliable evaluation must consider. Due to this, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these types of methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to appropriately select a suitable graphical method for a given task, it is crucial to identify its strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods in the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
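A minimal sketch of one graphical method such surveys typically cover, the ROC curve, which exposes the trade-off between true and false positive rates across decision thresholds instead of collapsing evaluation into a single error rate (synthetic data, scikit-learn):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, auc
from sklearn.model_selection import train_test_split

# Synthetic binary classification task; a real evaluation would use a
# held-out test set from the application domain.
X, y = make_classification(n_samples=1000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

scores = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
fpr, tpr, _ = roc_curve(y_te, scores)   # the curve itself shows the trade-offs
print(f"AUC = {auc(fpr, tpr):.3f}")     # the area is one scalar summary of it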
Abstract:
Predictors of random effects are usually based on the popular mixed effects (ME) model, developed under the assumption that the sample is obtained from a conceptually infinite population; such predictors are employed even when the actual population is finite. Two alternatives that incorporate the finite nature of the population are obtained from the superpopulation model proposed by Scott and Smith (1969. Estimation in multi-stage surveys. J. Amer. Statist. Assoc. 64, 830-840) or from the finite population mixed model recently proposed by Stanek and Singer (2004. Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 1119-1130). Predictors derived under the latter model, with the additional assumptions that all variance components are known and that within-cluster variances are equal, have smaller mean squared error (MSE) than the competitors based on either the ME or Scott and Smith's models. As population variances are rarely known, we propose method of moments estimators to obtain empirical predictors and conduct a simulation study to evaluate their performance. The results suggest that the finite population mixed model empirical predictor is more stable than its competitors since, in terms of MSE, it is either the best or the second best, and when second best, its performance lies within acceptable limits. When both cluster and unit intra-class correlation coefficients are very high (e.g., 0.95 or more), the performance of the empirical predictors derived under the three models is similar.
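For reference, the familiar infinite-population ME predictor that serves as the baseline competitor here shrinks each sampled cluster mean toward the overall mean; for a one-way random effects model with known variance components,

\[
\hat{\theta}_i = \hat{\mu} + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_e^2/n_i}\,\big(\bar{y}_i - \hat{\mu}\big),
\]

where $\sigma_b^2$ and $\sigma_e^2$ are the between- and within-cluster variance components and $n_i$ the cluster sample size. Broadly speaking, the finite population alternatives modify this shrinkage to reflect the sampled fraction of the population.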
Abstract:
This study evaluates the influence of different cartographic representations in in-car navigation systems on visual demand, subjective preference, and navigational error. It takes into account the type and complexity of the representation, maneuvering complexity, road layout, and driver gender. A group of 28 drivers (14 male and 14 female) participated in this experiment, which was performed in a low-cost driving simulator. The tests were performed on a limited number of instances for each type of representation, and their purpose was to carry out a preliminary assessment and provide avenues for further studies. Data collected for the visual demand study were analyzed using non-parametric statistical analyses. Results confirmed previous research showing that different levels of design complexity significantly influence visual demand. Non-grid-like road networks, for example, significantly influence visual demand and navigational error. An analysis of simple maneuvers on a grid-like road network showed that static and blinking arrows did not present significant differences; of the set of representations analyzed to assess visual demand, both arrows were equally efficient. From a gender perspective, women seemed to look at the display more than men, but this factor was not significant. With respect to subjective preferences, drivers prefer representations with mimetic landmarks when they perform straight-ahead tasks. For maneuvering tasks, landmarks in a perspective model created higher visual demands.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Bee males (drones) of stingless bees tend to congregate near entrances of conspecific nests, where they wait for virgin queens that initiate their nuptial flight. We observed that the Neotropical solitary wasp Trachypus boharti (Hymenoptera, Crabronidae) specifically preys on males of the stingless bee Scaptotrigona postica (Hymenoptera, Apidae); these wasps captured up to 50 males per day near the entrance of a single hive. Over 90% of the wasp attacks were unsuccessful; such erroneous attacks often involved conspecific wasps and worker bees. After the capture of non-male prey, wasps almost immediately released these individuals unharmed and continued hunting. A simple behavioral experiment showed that at short distances wasps were not specifically attracted to S. postica males, nor were they repelled by workers of the same species. Likely, short-range prey detection near the bees' nest is achieved mainly by vision, whereas close-range prey recognition is based principally on chemical and/or mechanical cues. We argue that the dependence on the wasp's visual perception during attack and the crowded and dynamic hunting conditions caused wasps to make many failed capture attempts. Two wasp-density-related factors, wasp-prey distance and wasp-wasp encounters, may account for the fact that the highest rates of male capture and unsuccessful wasp-bee encounters occurred at intermediate wasp numbers.