50 results for Deterministic walkers


Relevance:

10.00%

Publisher:

Abstract:

In this thesis, I review the historical background of Zimbabwe to show the patterns of traditional life that existed prior to settlerism. The form, nature, pace and impact of settlerism and colonialism up to the time of independence are also discussed to show how they affected the health of the population and the pace of development of the country. The political, social and economic underdevelopment of the African people that occurred in Zimbabwe prior to independence was a result of deliberate, politically motivated and controlled policy initiatives. These led to inequitable, inadequate, inappropriate and inaccessible health care provision. It is submitted that, since it was politics that determined the pace of underdevelopment, it must be politics that stands at the forefront of the development strategy adopted. In the face of the armed conflict that existed in Zimbabwe, existing frameworks of analysis are shown to be inadequate for planning purposes because of their inability to provide indications about the stability of future outcomes. The Metagame technique of analysis of options is proposed as a methodology that can be applied in such situations. It rejects deterministic predictive models as misleading and advocates an interactive model based on objective and subjective valuation of human behaviour. In conclusion, the search for stable outcomes, rather than optimal or 'best' solution strategies, is advocated in decision making in organisations of all sizes.
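The core idea of metagame-style analysis of options can be illustrated in a small script. The sketch below is a simplified stand-in, not the thesis's method: the actors, options and numeric payoff function are entirely hypothetical (Howard's analysis of options works from ordinal preferences elicited from the parties), but it shows the central move of searching for stable scenarios, those from which no actor can unilaterally improve, rather than for a single optimum.

```python
from itertools import product

# Toy "analysis of options": each actor controls binary options, a scenario is
# one setting of every option, and we search for scenarios that are stable,
# i.e. no actor prefers any unilateral change of its own options.
# Actors, options and payoffs are hypothetical, for illustration only.
actors = {"government": ["expand_rural_care"], "donors": ["fund_programme"]}

def payoff(actor, scenario):
    # Stand-in numeric valuation; real analysis of options uses ordinal
    # preference rankings elicited from the parties.
    rural, funded = scenario["expand_rural_care"], scenario["fund_programme"]
    if actor == "government":
        return 2 * rural + funded
    return 1 if (funded and rural) else (-1 if funded else 0)

options = [o for opts in actors.values() for o in opts]
scenarios = [dict(zip(options, bits)) for bits in product((0, 1), repeat=len(options))]

def is_stable(scenario):
    for actor, own in actors.items():
        for bits in product((0, 1), repeat=len(own)):
            alternative = {**scenario, **dict(zip(own, bits))}
            if payoff(actor, alternative) > payoff(actor, scenario):
                return False   # the actor has a unilateral improvement
    return True

for s in scenarios:
    print(s, "-> stable" if is_stable(s) else "-> unstable")
```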

Relevance:

10.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. They cope with the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to British motor vehicle manufacturing companies are presented and discussed.
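As an illustration of the risk-simulation idea described above (degree-four polynomial forecasts of the component variables, which are then treated as independent normal random variables), here is a minimal Monte Carlo sketch. The yearly figures, standard deviations and variable names are invented for the example; the thesis itself works with real British Leyland data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical yearly added value and employment costs (the thesis itself
# uses real British Leyland data); units are arbitrary.
years = np.arange(8)
added_value = np.array([100, 108, 113, 121, 126, 135, 141, 150], dtype=float)
emp_costs = np.array([60, 63, 67, 70, 74, 77, 81, 85], dtype=float)

# Forecast each component variable with a degree-four polynomial.
av_fit = np.polynomial.Polynomial.fit(years, added_value, 4)
ec_fit = np.polynomial.Polynomial.fit(years, emp_costs, 4)
t_next = years[-1] + 1

# Risk simulation: the variables are treated as statistically independent
# normals centred on the forecasts (the model's stated assumptions); the
# standard deviations here are assumed values.
n = 100_000
av = rng.normal(av_fit(t_next), 5.0, n)
ec = rng.normal(ec_fit(t_next), 3.0, n)
productivity = av / ec   # added-value productivity ratio

lo, med, hi = np.percentile(productivity, [5, 50, 95])
print(f"Forecast productivity: median {med:.2f}, 90% interval [{lo:.2f}, {hi:.2f}]")
```

The output is a distribution of next-period productivity rather than a single value, which is the point of the stochastic model: planning decisions can be made against a class interval instead of a point forecast.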

Relevance:

10.00%

Publisher:

Abstract:

For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic, and prices therefore follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to data characterised in part by both paradigms. This thesis investigates the two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches can be combined to achieve a more realistic model for financial time series.
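Both viewpoints can be given a minimal computational form. The sketch below, using invented heavy-tailed noise in place of real market returns, estimates a correlation dimension from a delay embedding (the deterministic toolset) and a kernel density of returns (the stochastic toolset); the embedding dimension, delay and radii are arbitrary choices for the example.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Stand-in return series (i.i.d. heavy-tailed noise); a genuine study would
# use observed market returns. For pure noise the deterministic test should
# find no low-dimensional structure: the slope grows with the embedding dim.
returns = rng.standard_t(df=4, size=1000) * 0.01

def correlation_sum(x, dim, delay, r):
    """Grassberger-Procaccia correlation sum C(r) of the delay embedding."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    pairs = d[np.triu_indices(n, k=1)]
    return np.mean(pairs < r)

# Deterministic view: the log-log slope of C(r) estimates the correlation
# dimension of the (putative) attractor underlying the data.
radii = np.logspace(-2.5, -1.5, 8)
C = np.array([correlation_sum(returns, dim=3, delay=1, r=r) for r in radii])
slope = np.polyfit(np.log(radii), np.log(C + 1e-12), 1)[0]
print(f"Correlation-dimension estimate (embedding dim 3): {slope:.2f}")

# Stochastic view: a non-parametric (kernel) estimate of the return density.
kde = gaussian_kde(returns)
print(f"Estimated return density at zero: {kde(0.0)[0]:.1f}")
```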

Relevance:

10.00%

Publisher:

Abstract:

Random number generation is a central component of modern information technology, with crucial applications in ensuring communications and information security. The development of new physical mechanisms suitable for directly generating random bit sequences is thus a subject of intense current research, with particular interest in all-optical techniques suitable for generating data sequences at high bit rates. One promising technique that has received much recent attention is the chaotic semiconductor laser system, which produces high-quality random output as a result of the intrinsic nonlinear dynamics of its architecture [1]. Here we propose a novel, complementary all-optical technique that might dramatically increase the generation rate of random bits by simultaneously using multiple spectral channels with uncorrelated signals, somewhat similar to the use of wavelength-division multiplexing in communications. We propose to exploit the intrinsic nonlinear dynamics of extreme spectral broadening and supercontinuum (SC) generation in optical fibre, a process known to be often associated with non-deterministic fluctuations [2]. In this paper, we report proof-of-concept results indicating that the fluctuations in highly nonlinear fibre SC generation can potentially be used for random number generation.
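A rough numerical sketch of the multi-channel idea: per-pulse intensities in several spectral channels are thresholded to bits and then debiased. The gamma-distributed fluctuations below merely stand in for measured SC shot-to-shot fluctuations, and the comparator/debiasing chain is one plausible post-processing choice, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for shot-to-shot supercontinuum spectra: per-pulse intensity in
# several spectral channels with (here, simulated) uncorrelated fluctuations.
n_pulses, n_channels = 20_000, 8
intensity = rng.gamma(shape=2.0, scale=1.0, size=(n_pulses, n_channels))

# One comparator per channel: a pulse encodes 1 if its intensity exceeds the
# channel median, 0 otherwise; channels act like parallel WDM sub-streams.
raw_bits = (intensity > np.median(intensity, axis=0)).astype(np.uint8)

def von_neumann(bits):
    """Classic debiasing: map pairs 01 -> 0, 10 -> 1, discard 00 and 11."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

stream = von_neumann(raw_bits.T.reshape(-1))   # concatenate channel streams
print(f"{stream.size} debiased bits, mean = {stream.mean():.3f}")
```

The bit rate scales with the number of usable spectral channels, which is the multiplexing advantage the abstract argues for.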

Relevance:

10.00%

Publisher:

Abstract:

This thesis addresses data assimilation, which typically refers to the estimation of the state of a physical system given a model and observations, and its application to short-term precipitation forecasting. A general introduction to data assimilation is given, from both a deterministic and a stochastic point of view. Data assimilation algorithms are reviewed, first in the static case (when no dynamics are involved), then in the dynamic case. A double experiment on two non-linear models, the Lorenz 63 and Lorenz 96 models, is run, and the comparative performance of the methods is discussed in terms of quality of the assimilation, robustness in the non-linear regime and computational time. Following the general review and analysis, data assimilation is discussed in the particular context of very short-term rainfall forecasting (nowcasting) using radar images. An extended Bayesian precipitation nowcasting model is introduced. The model is stochastic in nature and relies on the spatial decomposition of the rainfall field into rain "cells". Radar observations are assimilated using a variational Bayesian method in which the true posterior distribution of the parameters is approximated by a more tractable distribution. The motion of the cells is captured by a 2D Gaussian process. The model is tested on two precipitation events, the first dominated by convective showers, the second by precipitation fronts. Several deterministic and probabilistic validation methods are applied, and the model is shown to retain reasonable prediction skill at up to 3 hours lead time. Extensions to the model are discussed.
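For concreteness, here is a minimal twin-experiment sketch of one widely used dynamic data assimilation algorithm, a stochastic ensemble Kalman filter, on the Lorenz 63 model mentioned above. The time step, ensemble size, observation operator and noise levels are arbitrary illustrative choices, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorenz63(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz 63 model (fine for a short demo)."""
    dx = np.array([sigma * (x[1] - x[0]),
                   x[0] * (rho - x[2]) - x[1],
                   x[0] * x[1] - beta * x[2]])
    return x + dt * dx

# Twin experiment: a "truth" run generates noisy observations of x only.
H = np.array([[1.0, 0.0, 0.0]])      # observe the first component
R = np.array([[1.0]])                # observation-error variance
truth = np.array([1.0, 1.0, 1.0])
ens = truth + rng.normal(0, 2.0, size=(20, 3))   # 20-member ensemble

for step in range(500):
    truth = lorenz63(truth)
    ens = np.array([lorenz63(m) for m in ens])
    if step % 25 == 0:                           # assimilate every 25 steps
        y = H @ truth + rng.normal(0, np.sqrt(R[0, 0]))
        A = ens - ens.mean(axis=0)
        P = A.T @ A / (len(ens) - 1)             # sample forecast covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        # Stochastic EnKF: perturb the observation for each member.
        for i in range(len(ens)):
            innov = y + rng.normal(0, np.sqrt(R[0, 0])) - H @ ens[i]
            ens[i] = ens[i] + K @ innov

print("Final analysis error:", np.linalg.norm(ens.mean(axis=0) - truth))
```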

Relevance:

10.00%

Publisher:

Abstract:

Liberalisation has become an increasingly important policy trend, both in the private and public sectors of advanced industrial economies. This article eschews deterministic accounts of liberalisation by considering why government attempts to institute competition may be successful in some cases and not others. It considers the relative strength of explanations focusing on the institutional context, and on the volume and power of sectoral actors supporting liberalisation. These approaches are applied to two attempts to liberalise, one successful and one unsuccessful, within one sector in one nation – higher education in Britain. Each explanation is seen to have some explanatory power, but none is sufficient to explain why competition was generalised in the one case and not the other. The article counsels the need for scholars of liberalisation to be open to multiple explanations which may require the marshalling of multiple sources and types of evidence.

Relevance:

10.00%

Publisher:

Abstract:

Discrete event simulation of manufacturing systems has become widely accepted as an important tool to aid the design of such systems. Often, however, it is applied by practitioners in a manner which largely ignores an important element of industry: the workforce. Workers are usually represented as simple resources, often with deterministic performance values. This approach ignores the potentially large effect that human performance variation can have on a system. A long-term data collection exercise is described with the aim of quantifying the performance variation of workers in a typical automotive assembly plant. The data are presented in histogram form, which is immediately usable in simulations to improve the accuracy of design assessment. The results show levels of skewness and range which are far larger than anticipated by current researchers and practitioners in the field.
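The practical point, that replacing a deterministic task time with an empirically observed histogram changes simulation results, can be shown in a few lines. The histogram below is hypothetical (the paper's data came from an automotive assembly plant), and the two-station line is a deliberately minimal system in which variation, not just the mean, drives throughput.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical histogram of observed task times at one station (seconds),
# standing in for the long-term shop-floor data described above; note the
# long right tail, i.e. far more skew and range than a single deterministic
# value implies.
bin_edges = np.array([30, 35, 40, 45, 50, 60, 80, 120], dtype=float)
bin_counts = np.array([5, 240, 310, 160, 60, 20, 5], dtype=float)
probs = bin_counts / bin_counts.sum()

def sample_task_time(n):
    """Sample task times from the histogram (uniform within each bin)."""
    b = rng.choice(len(probs), size=n, p=probs)
    return rng.uniform(bin_edges[b], bin_edges[b + 1])

# Two-station unbuffered line: each cycle advances at the pace of the slower
# station, so worker variation (not just the mean) drives throughput.
t1, t2 = sample_task_time(1000), sample_task_time(1000)
mean_time = np.average((bin_edges[:-1] + bin_edges[1:]) / 2, weights=probs)
print(f"Deterministic model (means only): {1000 * mean_time:,.0f} s per 1000 units")
print(f"Histogram-sampled model:          {np.maximum(t1, t2).sum():,.0f} s per 1000 units")
```

Because E[max(T1, T2)] exceeds max(E[T1], E[T2]), the deterministic model systematically overstates throughput, which is the design-assessment error the paper warns about.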

Relevance:

10.00%

Publisher:

Abstract:

We propose a novel recursive-algorithm-based maximum a posteriori probability (MAP) detector for spectrally-efficient coherent wavelength division multiplexing (CoWDM) systems, and investigate its performance in a 1 bit/s/Hz on-off keyed (OOK) system limited by optical signal-to-noise ratio. The proposed method decodes each sub-channel using the signal levels not only of that sub-channel but also of its adjacent sub-channels, and can therefore effectively compensate deterministic inter-sub-channel crosstalk as well as inter-symbol interference arising from narrow-band filtering and chromatic dispersion (CD). Numerical simulation of a five-channel OOK-based CoWDM system at 10 Gbit/s per channel, using either direct or coherent detection, shows that the MAP decoder can eliminate the need for phase control of each optical carrier (which is otherwise required in a conventional CoWDM system) and greatly relaxes the spectral design of the demultiplexing filter at the receiver. It also significantly improves the back-to-back sensitivity and CD tolerance of the system.
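A toy version of the idea, decoding a sub-channel from its own level and those of its neighbours, is sketched below. The linear crosstalk model, coupling coefficient and Gaussian noise are illustrative assumptions, not the paper's CoWDM channel model; the sketch only shows why marginalising over neighbour patterns beats a per-channel threshold when the inter-sub-channel crosstalk is deterministic.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(5)

# Toy model of deterministic inter-sub-channel crosstalk in OOK detection:
# the received level of a channel mixes in a fraction c of its neighbours.
c, sigma, n = 0.35, 0.15, 10_000
bits = rng.integers(0, 2, size=(n, 3))        # three adjacent sub-channels
levels = bits.astype(float)
rx = levels.copy()
rx[:, 1] += c * (levels[:, 0] + levels[:, 2])
rx[:, 0] += c * levels[:, 1]
rx[:, 2] += c * levels[:, 1]
rx += rng.normal(0, sigma, rx.shape)

# Noise-free received levels for each of the 8 possible 3-bit patterns.
patterns = np.array(list(product((0, 1), repeat=3)), dtype=float)
means = patterns.copy()
means[:, 1] += c * (patterns[:, 0] + patterns[:, 2])
means[:, 0] += c * patterns[:, 1]
means[:, 2] += c * patterns[:, 1]

def map_middle_bit(r):
    # MAP decision for the middle bit: marginalise the Gaussian likelihood
    # over the neighbour patterns (equal priors assumed).
    ll = -np.sum((r - means) ** 2, axis=1) / (2 * sigma**2)
    post = np.exp(ll - ll.max())
    return int(post[patterns[:, 1] == 1].sum() > post[patterns[:, 1] == 0].sum())

decided = np.array([map_middle_bit(r) for r in rx])
naive = (rx[:, 1] > 0.5 + c).astype(int)      # threshold ignoring neighbours
print("MAP error rate:  ", np.mean(decided != bits[:, 1]))
print("Naive error rate:", np.mean(naive != bits[:, 1]))
```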

Relevance:

10.00%

Publisher:

Abstract:

A "black-box" phase sensitive amplifier is presented achieving simultaneous suppression of deterministic phase distortion on two independent 42.66 Gbit/s DPSK modulated signal wavelengths.

Relevance:

10.00%

Publisher:

Abstract:

We report the impact of the longitudinal signal power profile on the transmission performance of a coherently-detected 112 Gb/s m-ary polarization-multiplexed quadrature amplitude modulation system after compensation of deterministic nonlinear fibre impairments. Performance improvements of up to 0.6 dB (Q_eff) are reported for a non-uniform transmission link power profile. Further investigation reveals that the evolution of the transmission performance with power-profile management is fully consistent with parametric amplification of the amplified spontaneous emission by the signal through four-wave mixing. In particular, for a non-dispersion-managed system, a single-step increment of 4 dB in the amplifier gain, with respect to a uniform gain profile, at approximately two-thirds of the total reach considerably improves the transmission performance for all the formats studied. Conversely, a negative-step profile, emulating a failure (gain decrease or loss increase), significantly degrades the bit-error rate.
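"Compensation of deterministic nonlinear fibre impairments" refers to inverting the deterministic part of propagation, commonly done by split-step digital backpropagation. The sketch below uses dimensionless illustrative parameters rather than the paper's 112 Gb/s system values: deterministic dispersion and Kerr nonlinearity invert (up to step-size error) by re-propagating with negated coefficients, whereas noise such as ASE does not, which is where the power-profile effects studied above enter.

```python
import numpy as np

def split_step(field, dt, dist, steps, beta2, gamma):
    """Split-step Fourier propagation of a lossless, noise-free NLSE:
    dispersion applied in the frequency domain, Kerr phase in the time domain."""
    w = 2 * np.pi * np.fft.fftfreq(field.size, dt)
    dz = dist / steps
    lin = np.exp(1j * (beta2 / 2) * w**2 * dz)
    for _ in range(steps):
        field = np.fft.ifft(np.fft.fft(field) * lin)
        field = field * np.exp(1j * gamma * np.abs(field) ** 2 * dz)
    return field

rng = np.random.default_rng(9)
sym = (rng.choice([-1, 1], 1024) + 1j * rng.choice([-1, 1], 1024)) / np.sqrt(2)  # QPSK

# "Transmission" with deterministic dispersion + Kerr nonlinearity ...
rx = split_step(sym, dt=1.0, dist=10.0, steps=200, beta2=-1.0, gamma=0.02)
# ... is undone by re-propagating with negated coefficients (digital
# backpropagation). Inversion is exact only in the small-step limit, and
# added ASE noise, being non-deterministic, would not invert.
eq = split_step(rx, dt=1.0, dist=10.0, steps=200, beta2=1.0, gamma=-0.02)
print("max residual after backpropagation:", np.max(np.abs(eq - sym)))
```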

Relevance:

10.00%

Publisher:

Abstract:

A scheme for teleportation of an arbitrary two-particle state is presented in which two pairs of entangled particles are used as quantum channels. After the sender performs Bell-state measurements, the receiver can reconstruct the original state with certainty (deterministically) by applying a corresponding unitary transformation.
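A direct numerical check of the scheme's deterministic character: the sketch below teleports an arbitrary (generally entangled) two-qubit state through two Bell pairs, enumerates all sixteen Bell-measurement outcomes, applies the corresponding Pauli correction, and verifies unit fidelity in every case. The qubit labels and Pauli-correction convention are the standard textbook ones, assumed rather than taken from the paper.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Arbitrary (generally entangled) two-qubit state held by the sender.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# One Bell pair |Phi+> per teleported particle serves as the quantum channel.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def bell_state(a, b):
    """|beta_ab> = (|0,b> + (-1)^a |1,1-b>)/sqrt(2)."""
    v = np.zeros(4, dtype=complex)
    v[b] = 1
    v[2 + (1 - b)] = (-1) ** a
    return (v / np.sqrt(2)).reshape(2, 2)

# Qubit order: (A1, A2, B1, C1, B2, C2); the sender holds A1, A2, B1, B2.
state = np.kron(psi, np.kron(bell, bell)).reshape((2,) * 6)

for a1, b1, a2, b2 in product((0, 1), repeat=4):
    # Bell measurements on (A1,B1) and (A2,B2): project and keep (C1,C2).
    m1, m2 = bell_state(a1, b1).conj(), bell_state(a2, b2).conj()
    out = np.einsum('ac,be,abcdef->df', m1, m2, state).reshape(4)
    out /= np.linalg.norm(out)            # each outcome occurs with prob. 1/16
    # Classical feed-forward: Pauli correction Z^a X^b on each receiver qubit.
    corr = np.kron(
        np.linalg.matrix_power(Z, a1) @ np.linalg.matrix_power(X, b1),
        np.linalg.matrix_power(Z, a2) @ np.linalg.matrix_power(X, b2))
    out = corr @ out
    assert np.isclose(abs(np.vdot(psi, out)) ** 2, 1.0)   # unit fidelity

print("Reconstruction succeeds deterministically for all 16 outcomes.")
```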

Relevance:

10.00%

Publisher:

Abstract:

This study presents some quantitative evidence, from a number of simulation experiments, on the accuracy of the productivity-growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
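Of the approaches compared above, growth accounting is the simplest to state concretely. The following sketch computes one period of TFP growth with a Törnqvist index from made-up quantities and cost shares; the frontier-based Malmquist estimators require solving optimisation problems and are not reproduced here.

```python
import numpy as np

# Minimal growth-accounting (Tornqvist) TFP-growth calculation for one
# period, with made-up data: two inputs (labour, capital), one output.
y = np.array([100.0, 104.0])      # output in periods t and t+1
x = np.array([[50.0, 40.0],       # input quantities, rows = periods
              [51.0, 41.0]])      # columns = (labour, capital)
s = np.array([[0.60, 0.40],       # cost shares per period
              [0.58, 0.42]])

dln_y = np.log(y[1] / y[0])
# Tornqvist input index: average-share-weighted log growth of each input.
dln_x = np.sum(0.5 * (s[0] + s[1]) * np.log(x[1] / x[0]))
tfp_growth = dln_y - dln_x        # residual output growth = TFP growth
print(f"TFP growth (GA/Tornqvist): {tfp_growth:.4f}")
```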

Relevance:

10.00%

Publisher:

Abstract:

The existing assignment problems for assigning n jobs to n individuals are limited to considerations of cost or profit measured as crisp values. However, in many real applications, costs are not deterministic numbers. This paper develops a procedure based on the Data Envelopment Analysis (DEA) method to solve assignment problems with fuzzy costs or fuzzy profits for each possible assignment. It aims to obtain the points with maximum membership values for the fuzzy parameters while maximizing the profit or minimizing the assignment cost. In this method, a discrete approach is first presented to rank the fuzzy numbers. Then, corresponding to each fuzzy number, a crisp number is introduced using the efficiency concept. A numerical example is used to illustrate the usefulness of this new method.
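For orientation, the sketch below solves a small assignment problem with triangular fuzzy costs. It is not the paper's DEA-based procedure: as a simple stand-in it defuzzifies each cost to its centroid before applying a standard crisp assignment solver, and all cost values are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Fuzzy assignment costs as triangular numbers (low, mode, high); values are
# hypothetical. The paper ranks fuzzy numbers via a DEA-based procedure; here
# we simply defuzzify each cost to its centroid before assigning.
tri = np.array([  # cost of assigning job i to person j
    [[3, 4, 6], [6, 7, 9], [2, 3, 5]],
    [[5, 6, 8], [2, 4, 5], [7, 8, 9]],
    [[4, 5, 7], [3, 5, 6], [5, 6, 8]],
], dtype=float)

centroid = tri.mean(axis=-1)                   # (l + m + u) / 3 for triangles
rows, cols = linear_sum_assignment(centroid)   # minimise total crisp cost
for i, j in zip(rows, cols):
    print(f"job {i} -> person {j}, fuzzy cost {tuple(tri[i, j])}")
print("total defuzzified cost:", centroid[rows, cols].sum())
```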

Relevance:

10.00%

Publisher:

Abstract:

Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on their relative accuracy. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis provides a discussion of the issues specific to macro-level productivity measurement and of the strengths and weaknesses of the three main types of approaches available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which produced some interesting findings. Probably the most important were that deterministic approaches are quite accurate even when the data are moderately noisy; that no approach was accurate when noise was more extensive; that functional-form misspecification severely harms the accuracy of the parametric approaches; and that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact result in different productivity change estimates, at least for some of the countries assessed. To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics/indicators. An application of this framework is also provided, based on the EU KLEMS dataset.

Relevance:

10.00%

Publisher:

Abstract:

Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations given in the form of intervals that include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that separately addressed the two problems of interval data and negative data, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may contain upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches.
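A minimal sketch of the interval part of the idea, restricted to non-negative data (the paper's general formulation also handles bounds with different signs via decomposition): each DMU's optimistic efficiency bound is computed with that DMU at its most favourable interval endpoints and its rivals at their least favourable, and vice versa for the pessimistic bound, using the input-oriented CCR multiplier LP. All data below are invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(x_o, y_o, X, Y):
    """Input-oriented CCR multiplier LP:
    max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    m, s = len(x_o), len(y_o)
    c = np.concatenate([-y_o, np.zeros(m)])             # minimise -u.y_o
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), x_o])[None, :]  # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

# Interval data for 3 DMUs (e.g. bank branches), one input and one output.
X_lo = np.array([[2.0], [3.0], [4.0]]); X_hi = np.array([[3.0], [4.0], [5.0]])
Y_lo = np.array([[4.0], [5.0], [3.0]]); Y_hi = np.array([[5.0], [6.0], [4.0]])

for o in range(len(X_lo)):
    # Optimistic bound: DMU o at its best (low input, high output),
    # rivals at their worst; the pessimistic bound reverses the roles.
    Xb, Yb = X_hi.copy(), Y_lo.copy(); Xb[o], Yb[o] = X_lo[o], Y_hi[o]
    e_hi = ccr_efficiency(X_lo[o], Y_hi[o], Xb, Yb)
    Xw, Yw = X_lo.copy(), Y_hi.copy(); Xw[o], Yw[o] = X_hi[o], Y_lo[o]
    e_lo = ccr_efficiency(X_hi[o], Y_lo[o], Xw, Yw)
    print(f"DMU {o}: efficiency in [{e_lo:.3f}, {e_hi:.3f}]")
```

Plausibly, a DMU would be classed as strictly efficient when even its pessimistic bound reaches 1, weakly efficient when only its optimistic bound does, and inefficient otherwise; the paper's exact classification and ranking rules should be consulted for the details.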