3 results for Task Performance and Analysis

in DRUM (Digital Repository at the University of Maryland)


Relevance:

100.00%

Publisher:

Abstract:

Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate and yet allow perfect reconstruction of the spectra of wide sense stationary (WSS) signals. However, the theoretical guarantees for these samplers assume ideal conditions, such as synchronous sampling and the ability to compute statistical expectations exactly. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher Information matrix of perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variables, reducing the bi-affine problem to a linear underdetermined (sparse) problem in the source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and to sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of the WSS signals, sharp bounds on the estimation error are established, indicating that the error decays exponentially with the number of samples. The theoretical claims are supported by extensive numerical experiments.
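The key structural property behind these samplers is the difference co-array: a sparse physical array of $O(N)$ sensors whose pairwise position differences cover $O(N^2)$ distinct lags, which is what allows so many sources to be resolved. A minimal sketch of this idea for a two-level nested array (with hypothetical subarray sizes chosen for illustration, not taken from the thesis) is:

```python
# Sketch: build a 2-level nested array and check that its difference
# co-array is hole-free. Parameters N1, N2 are an assumed example choice.
N1, N2 = 3, 3

# Inner unit-spaced ULA at positions 1..N1; outer sparse ULA at
# multiples of (N1 + 1). Together: N1 + N2 physical sensors.
inner = list(range(1, N1 + 1))
outer = [(N1 + 1) * n for n in range(1, N2 + 1)]
sensors = inner + outer

# Difference co-array: the set of all pairwise sensor-position differences.
lags = sorted({a - b for a in sensors for b in sensors})

# For a nested array the co-array is the full contiguous range
# [-(N2*(N1+1) - 1), +(N2*(N1+1) - 1)] -- no holes.
max_lag = N2 * (N1 + 1) - 1
assert lags == list(range(-max_lag, max_lag + 1))

print(len(sensors), len(lags))  # 6 physical sensors, 23 distinct lags
```

Here 6 sensors yield 23 consecutive lags, the $O(N^2)$ gain the abstract refers to; the thesis's perturbation analysis concerns what happens when the positions in `sensors` are not known exactly.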

Relevance:

100.00%

Publisher:

Abstract:

Despite the organizational benefits of treating employees fairly, both anecdotal and empirical evidence suggest that managers do not behave fairly towards their employees in a consistent manner. Because treating employees fairly consumes personal resources such as time, effort, and attention, I argue that when managers face high workloads (i.e., large amounts of work under time pressure), they are unable to devote enough of these resources to both meet core technical task requirements and treat employees fairly. I propose that, in general, managers tend to view their core technical task performance as more important than fairness in their dealings with employees; as a result, when faced with high workloads, they tend to prioritize the former at the expense of the latter. I also propose that managerial fairness will suffer more from heightened workloads than core technical task performance will, unless managers perceive their organization to explicitly reward fair treatment of employees. I find support for my hypotheses across three studies: two experimental studies (with online participants and students, respectively) and one field study of managers from a variety of organizations. I discuss the implications of studying fairness in the wider context of managers' complex role in organizations for the fairness and managerial work demands literatures.