8 results for Interval signals and systems
in Aston University Research Archive
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
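The abstract's analytical expressions are not reproduced here, but the core idea — autocorrelation inflates the variance of a sample cross-correlation, so more data is needed to detect a weak coupling — can be sketched with a textbook approximation. The sketch below is an assumption-laden illustration, not the authors' derivation: it models both signals as AR(1) processes with lag-1 autocorrelation `phi`, uses Bartlett's variance-inflation factor for that case, and sizes the sample with the Fisher z transform.

```python
import numpy as np
from scipy.stats import norm

def required_samples(r_true, phi, alpha=0.05, power=0.8):
    """Approximate number of samples needed to detect a true
    cross-correlation r_true between two AR(1) processes that each
    have lag-1 autocorrelation phi (Fisher z + Bartlett correction)."""
    # Bartlett: var(r) is inflated by sum_k rho_x(k)*rho_y(k),
    # which for matched AR(1) processes is (1 + phi^2)/(1 - phi^2).
    inflation = (1 + phi**2) / (1 - phi**2)
    z_needed = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_white = (z_needed / np.arctanh(r_true)) ** 2   # white-noise case
    return int(np.ceil(n_white * inflation))

# A weak coupling (r = 0.05) in moderately autocorrelated signals
# already demands thousands of samples:
n = required_samples(0.05, phi=0.5)
```

Dividing `n` by an assumed effective sampling rate gives a recording length in seconds, which is how figures like "50-100 s for broadband signals" arise: narrower bands mean stronger autocorrelation, a larger inflation factor, and hence longer recordings.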
Abstract:
The thesis begins with a conceptual model of the way that language diversity affects the strategies, organisation and subsidiary control policies of multinational companies. The model is based solely on the researcher's personal experience of working in a variety of international management roles, but in Chapter 2 a wide-ranging review of related academic literature finds evidence to support the key ideas. The model is developed as a series of propositions which are tested in a comparative case study, refined and then re-tested in a global survey of multinational subsidiaries. The principal findings of the empirical phases of the thesis endorse the main tenets of the model:
- That language difference between parent and subsidiary will impair communication, create mistrust and impede relationship development.
- That the resulting feelings of uncertainty, suspicion and mistrust will influence the decisions taken by the parent company.
- That the parent company will have heightened sensitivity to language issues and will implement policies to manage language differences.
- That it will adopt low-risk strategies in host countries where it is concerned about language difference.
- That it will use organisational and manpower strategies to minimise the consequences and risks of communication problems with the subsidiary.
- That it will adopt styles of control that depend least on its ability to communicate with the subsidiary.
- That, as a consequence, the level of integration and knowledge flow between parent and subsidiary will be curtailed.
Although there is adequate support for all of the above conclusions, on some key points the evidence of the case studies and survey is contradictory. The thesis therefore closes with an agenda for further research that would address these inconsistencies.
Abstract:
Visual detection performance (d') is usually an accelerating function of stimulus contrast, which could imply a smooth, threshold-like nonlinearity in the sensory response. Alternatively, Pelli (1985 Journal of the Optical Society of America A 2 1508 - 1532) developed the 'uncertainty model' in which responses were linear with contrast, but the observer was uncertain about which of many noisy channels contained the signal. Such internal uncertainty effectively adds noise to weak signals, and predicts the nonlinear psychometric function. We re-examined these ideas by plotting psychometric functions (as z-scores) for two observers (SAW, PRM) with high precision. The task was to detect a single, vertical, blurred line at the fixation point, or identify its polarity (light vs dark). Detection of a known polarity was nearly linear for SAW but very nonlinear for PRM. Randomly interleaving light and dark trials reduced performance and rendered it nonlinear for SAW, but had little effect for PRM. This occurred for both single-interval and 2AFC procedures. The whole pattern of results was well predicted by our Monte Carlo simulation of Pelli's model, with only two free parameters. SAW (highly practised) had very low uncertainty. PRM (with little prior practice) had much greater uncertainty, resulting in lower contrast sensitivity, nonlinear performance, and no effect of external (polarity) uncertainty. For SAW, identification was about √2 better than detection, implying statistically independent channels for stimuli of opposite polarity, rather than an opponent (light - dark) channel. These findings strongly suggest that noise and uncertainty, rather than sensory nonlinearity, limit visual detection.
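The essential mechanism of the uncertainty model — linear responses, nonlinear psychometric functions — is easy to demonstrate with a generic max-rule simulation. This is a sketch of the standard textbook formulation, not the authors' two-parameter fit: it assumes equal-variance Gaussian channel noise, a known signal channel, and an observer who takes the maximum over `m_channels` candidate channels per interval.

```python
import numpy as np

rng = np.random.default_rng(1)

def pc_2afc(contrast, m_channels, n_trials=40000):
    """2AFC proportion correct for a max-rule uncertainty observer:
    each interval yields m_channels independent N(0,1) samples, the
    signal adds `contrast` to one channel of the signal interval, and
    the observer chooses the interval with the larger maximum."""
    a = rng.standard_normal((n_trials, m_channels))
    b = rng.standard_normal((n_trials, m_channels))
    a[:, 0] += contrast          # channel response is linear in contrast
    return np.mean(a.max(axis=1) > b.max(axis=1))
```

With `m_channels=1` the psychometric function is near-linear in z-score terms; with many channels, weak signals are swamped by the maxima of irrelevant noisy channels, which depresses low-contrast performance and produces the accelerating, nonlinear shape without any sensory nonlinearity.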
Abstract:
Limitations in the performance of coherent transmission systems employing digital back-propagation due to four-wave mixing impairments are reported for the first time. A significant performance constraint is identified, originating from four-wave mixing between signals and amplified spontaneous emission noise, which induces a linear increase in the standard deviation of the received field with signal power and a linear dependence on transmission distance.
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. All the data in the conventional DEA with input and/or output ratios assume the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios to demonstrate the applicability and efficacy of the proposed models where the traditional indicators are mostly financial ratios. © 2011 Elsevier Inc.
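The paper's interval-ratio models are not reproduced here; for orientation, the crisp baseline they generalize can be sketched as a linear program. The sketch below is the conventional input-oriented CCR envelopment model (not the proposed interval models), solved with `scipy.optimize.linprog`: minimize the radial input contraction θ for DMU k subject to the composite unit dominating it.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k with crisp data.
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Decision variables: [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta
    A_in = np.hstack([-X[:, [k]], X])            # sum_j lam_j x_ij <= theta x_ik
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum_j lam_j y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                               # theta* in (0, 1]
```

An efficient DMU scores θ = 1; an inefficient one scores the fraction of its inputs a best-practice composite would need. The interval-ratio models of the paper replace the crisp `X` and `Y` with lower/upper bounds on each ratio.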
Abstract:
Identification of humans via ECG is being increasingly studied because it can have several advantages over traditional biometric identification techniques. However, difficulties arise because of heart rate variability. In this study we analysed the influence of QT interval correction on the performance of an identification system based on temporal and amplitude features of ECG. In particular we tested MLP, Naive Bayes and 3-NN classifiers on the Fantasia database. Results indicate that QT correction can significantly improve the overall system performance. © 2013 IEEE.
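The abstract does not state which correction formula was used; Bazett's is the most common choice, and a minimal sketch of it (an assumption, not necessarily the authors' method) shows why correction matters for biometrics: it removes the dependence of the QT feature on the subject's momentary heart rate, leaving a more stable per-person quantity.

```python
import numpy as np

def qtc_bazett(qt_s, rr_s):
    """Bazett's heart-rate correction of the QT interval:
    QTc = QT / sqrt(RR), with QT and RR both in seconds."""
    return np.asarray(qt_s) / np.sqrt(np.asarray(rr_s))

# A QT of 380 ms measured at 75 bpm (RR = 0.8 s) corrects upward:
qtc = qtc_bazett(0.38, 0.8)   # ~0.425 s
```

At RR = 1 s (60 bpm) the correction is the identity, so QTc values from beats at different heart rates become directly comparable as identification features.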