30 results for System Theory

in Aston University Research Archive


Relevance: 60.00%

Abstract:

We investigate return-to-zero (RZ) to non-return-to-zero (NRZ) format conversion by means of linear time-invariant system theory. It is shown that the problem of converting a random RZ stream to an NRZ stream can be reduced to constructing an appropriate transfer function for the linear filter. This approach is then used to propose a novel, optimally designed single fiber Bragg grating (FBG) filter scheme for RZ-OOK/DPSK/DQPSK to NRZ-OOK/DPSK/DQPSK format conversion. The spectral response of the FBG is designed according to the optical spectra of the algebraic difference between isolated NRZ and RZ pulses, and the filter order is optimized for the maximum Q-factor of the output NRZ signals. Experimental results as well as simulations show that such an optimally designed FBG can successfully perform RZ-OOK/DPSK/DQPSK to NRZ-OOK/DPSK/DQPSK format conversion.
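To make the design idea concrete, the following minimal Python/numpy sketch computes a target filter response from the spectrum of the difference between isolated pulses, as described above. The bit rate, pulse shapes and duty cycle are illustrative assumptions, not values from the paper.

    import numpy as np

    # Illustrative parameters (assumed): 10 Gb/s bit slot, rectangular isolated
    # NRZ pulse, Gaussian RZ pulse with 25% duty cycle.
    T = 100e-12                                    # bit period (s)
    n = 1024
    t = np.linspace(-T / 2, T / 2, n, endpoint=False)

    nrz = np.ones_like(t)                          # isolated NRZ pulse
    fwhm = 0.25 * T                                # RZ duty cycle: 25%
    rz = np.exp(-0.5 * (t / (fwhm / 2.355)) ** 2)  # isolated Gaussian RZ pulse

    # Spectrum of the algebraic difference between the isolated pulses; the
    # FBG spectral response is shaped according to this difference spectrum.
    dt = t[1] - t[0]
    f = np.fft.fftshift(np.fft.fftfreq(n, d=dt))
    diff_spec = np.fft.fftshift(np.fft.fft(nrz - rz)) * dt
    target_response = np.abs(diff_spec) / np.abs(diff_spec).max()

The paper then goes further and optimizes the filter order against the Q-factor of the converted NRZ signal; this sketch stops at the target spectral shape.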

Relevance: 40.00%

Abstract:

In recent years the topic of risk management has moved up the agenda of both government and industry, and private sector initiatives to improve risk and internal control systems have been mirrored by similar promptings for change in the public sector. Both regulators and practitioners now view risk management as an integral part of the process of corporate governance, and an aid to the achievement of strategic objectives. The paper uses case study material on the risk management control system at Birmingham City Council to extend existing theory by developing a contingency theory for the public sector. The case demonstrates that whilst the structure of the control system fits a generic model, the operational details indicate that controls are contingent upon three core variables: central government policies, information and communication technology, and organisational size. All three contingent variables are suitable for testing the theory across the broader public sector arena.

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to consider the current status of strategic group theory in the light of developments over the last three decades, and then to discuss the continuing value of the concept, both to strategic management research and to practising managers. Design/methodology/approach – A critical review of the idea of strategic groups, together with a practical strategic mapping illustration. Findings – Strategic group theory still provides a useful approach for management research, one which allows a detailed appraisal and comparison of company strategies within an industry. Research limitations/implications – Strategic group research would undoubtedly benefit from more directly comparable, industry-specific studies, with a more careful focus on variable selection and on the statistical methods used for validation. Future studies should aim to build sets of industry-specific variables that describe strategic choice within that industry. The statistical methods used to identify strategic groupings need to be robust to ensure that strategic groups are not solely an artefact of method. Practical implications – The paper looks specifically at an application of strategic group theory in the UK pharmaceutical industry. The practical benefits of strategic groups as a classification system, and of strategic mapping as a strategy development and analysis tool, are discussed. Originality/value – The paper reviews strategic group theory alongside alternative taxonomies and applies the concept to the UK pharmaceutical industry.

Relevance: 30.00%

Abstract:

Purpose – The purpose of the paper is to use a case study setting involving the implementation of an enterprise resource planning (ERP) system to expose and analyze the conflicts in the characterizations of the post-bureaucratic organisation (PBO) in the literature. ERP implementations are often accompanied by increasing levels of stress in organizations, placing pressures on organizational relationships and structures. Additionally, ERPs are regarded as introducing their own techno-logic of centralization, standardization and formalization, which provides an apparent contrast to the exhortations about employee empowerment. Design/methodology/approach – A case study of ERP implementation in a medium-sized entity is presented. The paper explores aspects of ERP and PBO from the context of postmodern organization theory. Findings – Some concerns about the PBO identified in the literature are reflected in the case situation. For example, some employees show a commitment to give up private time and to work flexibly. The paper also provides evidence of the way the management team replace their reliance on a key individual knowledge worker with reliance on an ERP system and external vendor support. Paradoxically, trust in that same knowledge worker, and between core users of the system, is essential to enable the implementation of the system. Originality/value – This paper adds empirical insight to a predominantly theoretical literature. The case evidence indicates some conflicting implications in the concurrent adoption of PBO and ERP.

Relevance: 30.00%

Abstract:

The introduction situates the ‘hard problem’ in its historical context and argues that the problem has two sides: the output side (the Kant-Eccles problem of the freedom of the Will) and the input side (the problem of qualia). The output side ultimately reduces to whether quantum mechanics can affect the operation of synapses. A discussion of the detailed molecular biology of synaptic transmission as presently understood suggests that such effects are unlikely. Instead an evolutionary argument is presented which suggests that our conviction of free agency is an evolutionarily induced illusion, and hence that the Kant-Eccles problem is itself illusory. This conclusion is supported by well-known neurophysiology. The input side, the problem of qualia, of subjectivity, is not so easily outflanked. After a brief review of the neurophysiological correlates of consciousness (NCC) and of the Penrose-Hameroff microtubular neuroquantology, it is again concluded that the molecular neurobiology makes quantum wave mechanics an unlikely explanation. Instead recourse is made to an evolutionarily and neurobiologically informed panpsychism. The notion of an ‘emergent’ property is carefully distinguished from that of the more usual ‘system’ property used by most dual-aspect theorists (and the majority of neuroscientists), and is used to support Llinas’ concept of an ‘oneiric’ consciousness continuously modified by sensory input. I conclude that a panpsychist theory such as this, coupled with the non-classical understanding of matter flowing from quantum physics (both epistemological and scientific), may be the default and only solution to the problem posed by the presence of mind in a world of things.

Relevance: 30.00%

Abstract:

We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
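The decoupling result can be stated compactly (the notation here is illustrative, not taken from the paper): each code symbol effectively sees its own scalar Gaussian channel,

\[
y_j = x_j + z_j, \qquad z_j \sim \mathcal{N}(0, \sigma_j^2), \qquad j = 1, \dots, N,
\]

where the noise variances \(\sigma_j^2\) may differ across code-symbol positions, and the mutual information, and hence the spectral efficiency, is recovered from the average free energy \(\mathcal{F} = -\tfrac{1}{N}\,\mathbb{E}[\ln Z]\) of the associated statistical-mechanical model.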

Relevance: 30.00%

Abstract:

It is generally assumed when using Bayesian inference methods for neural networks that the input data contain no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise, provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar, which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable and sampling it jointly with the network’s weights using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both a synthetic noisy sine wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
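A minimal sketch of the resulting error bar in the small, symmetric input-noise limit described above; the function names and shapes are hypothetical, not the paper's code.

    import numpy as np

    # Laplace-approximation error bar with the extra input-noise term:
    # output-noise variance + weight-uncertainty term + input-noise term.
    def predictive_variance(x, grad_w, grad_x, H_inv, noise_var, input_cov):
        g_w = grad_w(x)                  # sensitivity of the output to weights
        g_x = grad_x(x)                  # sensitivity of the output to inputs
        var_w = g_w @ H_inv @ g_w        # usual Bayesian error-bar term
        var_x = g_x @ input_cov @ g_x    # extra term from input-noise variance
        return noise_var + var_w + var_x

    # Toy check with a linear model y = w.x, where the gradients are exact.
    w = np.array([1.0, -2.0])
    var = predictive_variance(
        np.array([0.5, 1.5]),
        grad_w=lambda x: x,              # dy/dw = x for a linear model
        grad_x=lambda x: w,              # dy/dx = w
        H_inv=0.01 * np.eye(2),
        noise_var=0.1,
        input_cov=0.05 * np.eye(2),
    )

The last term vanishes as the input-noise covariance goes to zero, recovering the standard Bayesian error bar.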

Relevance: 30.00%

Abstract:

Collaborative working with the aid of computers is increasing rapidly due to the widespread use of computer networks, the geographic mobility of people, and small powerful personal computers. For the past ten years research has been conducted into this use of computing technology from a wide variety of perspectives and for a wide range of uses. This thesis adds to that previous work by examining the area of collaborative writing amongst groups of people. The research brings together a number of disciplines: sociology for examining group dynamics, psychology for understanding individual writing and learning processes, and computer science for database, networking, and programming theory. The project initially looks at groups and how they form, communicate, and work together, progressing to writing and the cognitive processes it entails for both composition and retrieval. The thesis then details a set of issues which need to be addressed in a collaborative writing system. These issues are followed by the development of a model for collaborative writing, detailing an iterative process of co-ordination, writing and annotation, consolidation, and negotiation, based on a structured but extensible document model. Implementation issues for a collaborative application are then described, along with various methods of overcoming them. Finally, the design and implementation of a collaborative writing system, named Collaborwriter, is described in detail, concluding with some preliminary results from initial user trials and testing.
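As a purely hypothetical illustration of what a structured but extensible document model of this kind might look like (none of these names come from Collaborwriter):

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Annotation:
        author: str
        text: str                         # comment attached during review

    @dataclass
    class Section:
        title: str
        body: str = ""
        owner: Optional[str] = None       # ownership supports co-ordination
        annotations: List[Annotation] = field(default_factory=list)
        children: List["Section"] = field(default_factory=list)  # extensible tree

    doc = Section("Report", children=[Section("Introduction", owner="alice")])
    doc.children[0].annotations.append(Annotation("bob", "Needs a stronger opening."))

Sections carry ownership for the co-ordination phase and annotations for the writing-and-annotation phase, while the recursive structure leaves room for new node types during consolidation and negotiation.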

Relevance: 30.00%

Abstract:

The work described in this thesis is the development of an ultrasonic tomogram to provide outlines of cross-sections of the ulna in vivo. This instrument, used in conjunction with the X-ray densitometry previously developed in this department, would provide actual bone mineral density at high resolution. It was hoped that the accuracy of the plot obtained from the tomogram would exceed that of existing ultrasonic techniques by about five times. Repeat measurements with these instruments to follow bone mineral changes would involve very low X-ray doses. A theoretical study has been made of acoustic diffraction for axisymmetric systems, using a geometrical transform applicable to the integration of three different Green's functions. This has involved the derivation of one of these functions in a form amenable to computation. It is considered that this function fits the boundary conditions occurring in medical ultrasonography more closely than those used previously. A three-dimensional plot of the pressure field using this function has been made for a ring transducer, in addition to plots for disc transducers using all three functions. It has been shown how the theory may be extended to investigate the nature and magnitude of the particle velocity, at any point in the field, for the three functions mentioned. From this study, a concept of diffraction fronts has been developed, which has made it possible to determine energy flow in a diffracting system as well. Intensity has been displayed in a manner similar to that used for pressure. Plots have been made of diffraction fronts and of energy flow direction lines.
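For orientation, one standard time-harmonic Green's-function formulation of the pressure field of a baffled plane transducer (the Rayleigh integral, given here as an illustration; the thesis compares three such functions and argues for a different choice of boundary conditions) is

\[
p(\mathbf{r}) = \frac{j \rho c k}{2\pi}\, v_0 \int_{S} \frac{e^{-jkR}}{R}\, \mathrm{d}S', \qquad R = |\mathbf{r} - \mathbf{r}'|,
\]

where \(S\) is the disc (or ring) surface and \(v_0\) the normal surface velocity.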

Relevance: 30.00%

Abstract:

This thesis deals with the problem of Information Systems design for Corporate Management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on Cybernetic principles and theories. It looks at Management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed and a design to enhance survival capacity is developed. Taking Ashby's theory of adaptation and developments on ultra-stability as a theoretical framework, and considering the conditions for learning and foresight, it deduces that a design should include three basic components: a dynamic model of the organization-environment relationships; a method to spot significant changes in the value of the essential variables and in a certain set of parameters; and a Controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, and of the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. The use of the ultra-stable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change: it starts by distinguishing between change within behaviour and change of behaviour, and completes the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of Information System design with that of organizational learning.
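As a one-line anchor for this argument, Ashby's law of Requisite Variety can be written in its common information-theoretic form (a standard statement, not a formula from the thesis):

\[
H(E) \;\ge\; H(D) - H(R),
\]

where \(H(D)\) is the variety of the disturbances, \(H(R)\) the variety of the regulator's responses, and \(H(E)\) the residual variety in the essential variables: only variety in the regulator can absorb variety in the disturbances.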

Relevance: 30.00%

Abstract:

This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of the methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear. This is despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions.

A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to prove that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high costs and low portability of state-of-the-art multichannel machines. The result of this is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas, from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
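As one concrete example of the dynamical-systems viewpoint applied to a single channel (an illustrative sketch, not a method claimed by the thesis), a Takens delay embedding reconstructs state vectors from a scalar time series:

    import numpy as np

    # Delay-embed a scalar series x into dim-dimensional state vectors with
    # lag tau; the rows approximate points on the underlying attractor.
    def delay_embed(x, dim, tau):
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

    rng = np.random.default_rng(0)
    t = np.linspace(0, 50, 5000)
    x = np.sin(t) + 0.1 * rng.standard_normal(t.size)  # stand-in for one channel
    states = delay_embed(x, dim=3, tau=8)

Observational and dynamic noise then enter naturally as uncertainty around these reconstructed states, which is where the latent-variable and neural-network models mentioned above come in.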

Relevance: 30.00%

Abstract:

This research examines the evolution of interorganizational relationships in a franchising context. Using U-curve theory, we develop three hypotheses and contrast them with traditional lifecycle theory. Three groups of constructs are affected by the lifecycle: cooperation variables, dependence variables, and relationship variables. Four distinct stages emerge, with the highest levels of these variables in the honeymoon stage, lower levels in the routine and crossroad stages, and increasing levels in the stabilization stage. Franchisors should strive for “stability on high levels” before operational realities influence the franchisees. Franchisees’ intermediate lifecycle phases are the most critical for the system, since opportunistic behavior and switching are most likely then.

Relevance: 30.00%

Abstract:

An applied psychological framework for coping with performance uncertainty in sport and work systems is presented. The theme of personal control serves to integrate ideas prevalent in industrial and organisational psychology, the stress literature and labour process theory. These commonly focus on the promotion of tacit knowledge and learned resourcefulness in individual performers. Finally, data from an empirical evaluation of a development training programme to facilitate self-regulation skills in professional athletes are briefly highlighted.