36 results for Hamming Distance
in Aston University Research Archive
Abstract:
A formalism for modelling the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics, originally due to Prugel-Bennett and Shapiro, is reviewed, generalized and improved upon. This formalism can be used to predict the averaged trajectory of macroscopic statistics describing the GA's population. These macroscopics are chosen to average well between runs, so that fluctuations from mean behaviour can often be neglected. Where necessary, non-trivial terms are determined by assuming maximum entropy with constraints on known macroscopics. Problems of realistic size are described in compact form and finite population effects are included, often proving to be of fundamental importance. The macroscopics used here are cumulants of an appropriate quantity within the population and the mean correlation (Hamming distance) within the population. Including the correlation as an explicit macroscopic provides a significant improvement over the original formulation. The formalism is applied to a number of simple optimization problems in order to determine its predictive power and to gain insight into GA dynamics. Problems which are most amenable to analysis come from the class where alleles within the genotype contribute additively to the phenotype. This class can be treated with some generality, including problems with inhomogeneous contributions from each site, non-linear or noisy fitness measures, simple diploid representations and temporally varying fitness. The results can also be applied to a simple learning problem, generalization in a binary perceptron, and a limit is identified for which the optimal training batch size can be determined for this problem. The theory is compared to averaged results from a real GA in each case, showing excellent agreement if the maximum entropy principle holds. Some situations where this approximation breaks down are identified. In order to fully test the formalism, an attempt is made on the strongly NP-hard problem of storing random patterns in a binary perceptron. Here, the relationship between the genotype and phenotype (training error) is strongly non-linear. Mutation is modelled under the assumption that perceptron configurations are typical of perceptrons with a given training error. Unfortunately, this assumption does not provide a good approximation in general. It is conjectured that perceptron configurations would have to be constrained by other statistics in order to accurately model mutation for this problem. Issues arising from this study are discussed in conclusion and some possible areas of further research are outlined.
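To make the macroscopics concrete, the following minimal Python sketch computes the first cumulants of a population fitness distribution together with the mean pairwise correlation (equivalently, the mean Hamming distance) of a binary population. The population shape, the additive fitness function and the {0,1} encoding are illustrative assumptions, not the thesis's exact definitions.

```python
import numpy as np

def population_macroscopics(pop, fitness):
    """Cumulants of fitness and mean pairwise correlation for a binary
    population `pop` of shape (P, L) with entries in {0, 1}.

    `fitness` maps the population to a length-P vector; both the shape
    and the fitness are illustrative assumptions.
    """
    P, L = pop.shape
    f = fitness(pop)

    # First cumulants of the fitness distribution over the population.
    k1 = f.mean()                    # mean
    k2 = f.var()                     # variance (2nd cumulant)
    k3 = ((f - k1) ** 3).mean()      # 3rd cumulant

    # Map {0,1} -> {-1,+1} so the dot product measures site-wise agreement.
    s = 2 * pop - 1
    overlap = s @ s.T / L            # pairwise overlaps in [-1, 1]
    mask = ~np.eye(P, dtype=bool)    # exclude self-overlaps
    mean_corr = overlap[mask].mean()

    # Mean Hamming distance follows directly from the mean correlation.
    mean_hamming = L * (1 - mean_corr) / 2
    return k1, k2, k3, mean_corr, mean_hamming

# Example: additive (one-max style) fitness on a random population.
rng = np.random.default_rng(0)
pop = rng.integers(0, 2, size=(50, 100))
print(population_macroscopics(pop, fitness=lambda p: p.sum(axis=1)))
```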
Abstract:
If, as is widely believed, schizophrenia is characterized by abnormalities of brain functional connectivity, then it seems reasonable to expect that different subtypes of schizophrenia could be discriminated in the same way. However, evidence for differences in functional connectivity between the subtypes of schizophrenia is largely lacking and, where it exists, it could be accounted for by clinical differences between the patients (e.g. medication) or by the limitations of the measures used. In this study, we measured EEG functional connectivity in unmedicated male patients diagnosed with either positive or negative syndrome schizophrenia and compared them with age- and sex-matched healthy controls. Using new methodology (Medkour et al., 2009) based on partial coherence, brain connectivity plots were constructed for positive- and negative-syndrome patients and controls. Reliable differences in the pattern of functional connectivity were found, with both syndromes showing not only the absence of some of the connections seen in controls but also the presence of connections that the controls did not show. Comparing connectivity graphs using the Hamming distance, the negative-syndrome patients were found to be more distant from the controls than were the positive-syndrome patients. Bootstrap distributions of these distances were created, and these showed a significant difference in the mean distances, consistent with the observation that a negative-syndrome diagnosis is associated with a more severe form of schizophrenia. We conclude that schizophrenia is characterized by widespread changes in functional connectivity, with negative-syndrome patients showing a more extreme pattern of abnormality than positive-syndrome patients.
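As an illustration of the graph-comparison step, here is a minimal sketch that computes the Hamming distance between binary connectivity graphs and bootstraps the mean distance of a patient group from a control graph. The adjacency-matrix representation, the channel count and the resampling scheme are assumptions made for illustration; the study's own bootstrap procedure may differ.

```python
import numpy as np

def graph_hamming(a, b):
    """Hamming distance between two binary adjacency matrices: the number
    of edges present in one graph but not the other. Each undirected edge
    is counted once (upper triangle only)."""
    iu = np.triu_indices_from(a, k=1)
    return int(np.sum(a[iu] != b[iu]))

def bootstrap_distances(graphs, reference, n_boot=1000, seed=0):
    """Bootstrap distribution of the mean Hamming distance from a group of
    subject-level graphs to a reference (e.g. control) graph."""
    rng = np.random.default_rng(seed)
    d = np.array([graph_hamming(g, reference) for g in graphs])
    idx = rng.integers(0, len(d), size=(n_boot, len(d)))
    return d[idx].mean(axis=1)   # one mean distance per bootstrap sample

# Illustrative use with random 19-channel graphs (a common EEG montage size).
rng = np.random.default_rng(1)
control = (rng.random((19, 19)) < 0.2).astype(int)
patients = [(rng.random((19, 19)) < 0.2).astype(int) for _ in range(12)]
print(bootstrap_distances(patients, control).mean())
```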
Abstract:
In this paper, we present experimental results for monitoring long-distance WDM communication links using a line monitoring system suitable for legacy optically amplified long-haul undersea systems. This monitoring system is based on setting up a simple, passive, low-cost, high-loss optical loopback circuit at each repeater that provides a connection between the existing anti-directional undersea fibres and can be used to determine fault location. Fault location is achieved by transmitting a short-pulse supervisory signal along with the WDM data signals; a portion of the overall signal is attenuated and returned to the transmit terminal by the loopback circuit. A special receiver at the terminal extracts the weakly returned supervisory signal, with each supervisory signal received at a different time corresponding to a different optical repeater. Degradation in any repeater therefore appears on its corresponding supervisory signal level. We use a recirculating loop to simulate a 4600 km fibre link, on which a high-loss loopback supervisory system is implemented. Successful monitoring is accomplished through the production of an appropriate supervisory signal at the terminal that is detected and identified in a satisfactory time period after passing through up to 45 dB of attenuation in the loopback circuit.
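The mapping from supervisory-signal arrival time to repeater position is simple time-of-flight arithmetic, sketched below. The fibre group index and the repeater spacing are assumed values; neither is specified in the abstract.

```python
C = 299_792.458   # speed of light in km/s
N_GROUP = 1.468   # assumed group index of standard silica fibre

def repeater_from_delay(delay_s, span_km=50.0):
    """Infer which repeater returned a supervisory pulse from its
    round-trip delay. `span_km` (repeater spacing) is an assumption."""
    one_way_km = (C / N_GROUP) * delay_s / 2.0
    return round(one_way_km / span_km)

# A pulse returning after ~4.41 ms has travelled ~900 km of fibre in total,
# i.e. it was looped back at the 9th repeater under 50 km spans.
print(repeater_from_delay(4.41e-3))  # -> 9
```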
Abstract:
Modern digital communication systems achieve reliable transmission by employing error-correction techniques that add redundancy. Low-density parity-check (LDPC) codes work along the principles of the Hamming code, but their parity-check matrix is very sparse and multiple errors can be corrected. The sparseness of the matrix allows the decoding process to be carried out by probability-propagation methods similar to those employed in turbo codes. The relation between spin systems in statistical physics and digital error-correcting codes is based on the existence of a simple isomorphism between the additive Boolean group and the multiplicative binary group. Shannon proved general results on the natural limits of compression and error correction by setting up the framework known as information theory. Error-correction codes are based on mapping the original space of words onto a higher-dimensional space in such a way that the typical distance between encoded words increases.
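The isomorphism referred to here is the map x → (−1)^x, which turns XOR of bits into multiplication of ±1 "spins", so that parity checks become products of spins. The sketch below verifies this and evaluates a toy sparse parity-check matrix in both pictures; the matrix and codeword are invented for illustration.

```python
import numpy as np

# The isomorphism: x in {0,1} under XOR maps to (-1)**x in {+1,-1} under
# multiplication -- the bridge between parity checks and spin systems.
for a in (0, 1):
    for b in (0, 1):
        assert (-1) ** (a ^ b) == (-1) ** a * (-1) ** b

# A toy sparse parity-check matrix H and a syndrome check, in both pictures.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
word = np.array([1, 0, 1, 1, 1, 0])   # a valid codeword of this toy code

syndrome_boolean = H @ word % 2       # additive (Boolean) picture
spins = (-1) ** word
syndrome_spin = np.array([np.prod(spins[row == 1]) for row in H])

print(syndrome_boolean)   # all zeros <=> every parity check satisfied
print(syndrome_spin)      # all +1s   <=> every parity check satisfied
```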
Abstract:
In an exploding and fluctuating construction market, managers face the challenge of managing business on a wider scale while utilizing modern developments in information technology to promote productivity. The extraordinary development of telecommunications and computer technology makes it possible to plan, lead, control, organize and manage projects from a distance, without the need to be on site on a daily basis. A modern style of management known as distance management (DM), or remote management, is emerging. Physical distance no longer determines the boundary of management, since managers can now operate projects through virtual teams that organize manpower, material and production without face-to-face communication. What organizational prototype could overcome psychological and physical barriers to reengineer a successful project through information technology? What criteria distinguish the appropriate mode of communication for individual activities in teamwork, and how can efficient and effective communication be integrated across face-to-face and physically distant settings? The methodology is explained through a case application on refuse incineration plant projects in Taiwan.
Abstract:
This paper reports on an aspect of the implementation of a sophisticated system of Casemix Budgeting within a large public hospital in New Zealand. The paper examines the role of accounting inscription in supporting a system of “remote” management control effected through the Finance function at the hospital. The paper provides a detailed description and analysis of part of the casemix technology in use at the research site. The implementation of clinical budgeting through the Transition casemix system is examined by describing one aspect of that system in detail. The design and use of management reporting is described. Reporting to different levels of management, and for differing parts of the organisation, is discussed, with particular emphasis on the adoption of traditional analysis of costs using standard costing and variance analysis techniques.
Abstract:
Traditional approaches to calculate total factor productivity change through Malmquist indexes rely on distance functions. In this paper we show that the use of distance functions as a means to calculate total factor productivity change may introduce some bias in the analysis, and therefore we propose a procedure that calculates total factor productivity change through observed values only. Our total factor productivity change is then decomposed into efficiency change, technological change, and a residual effect. This decomposition makes use of a non-oriented measure in order to avoid problems associated with the traditional use of radial oriented measures, especially when variable returns to scale technologies are to be compared.
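For reference, the conventional distance-function construction that the paper moves away from is the Caves-Christensen-Diewert (geometric-mean) Malmquist index, written here in standard notation with period-t distance functions D^t:

```latex
M\left(x^{t}, y^{t}, x^{t+1}, y^{t+1}\right) =
\left[
  \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
  \cdot
  \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2}
```

Each distance function must be estimated (for example by DEA), and it is this estimation step, rather than the observed input-output values themselves, that the authors argue can bias the measured productivity change.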
Abstract:
Traditional approaches to calculate total factor productivity (TFP) change through Malmquist indexes rely on distance functions. In this paper we show that the use of distance functions as a means to calculate TFP change may introduce some bias in the analysis, and therefore we propose a procedure that calculates TFP change through observed values only. Our TFP change is then decomposed into efficiency change, technological change, and a residual effect. This decomposition makes use of a non-oriented measure in order to avoid problems associated with the traditional use of radial oriented measures, especially when variable returns to scale technologies are to be compared. The proposed approach is applied in this paper to a sample of Portuguese bank branches.
Abstract:
This paper is drawn from the use of data envelopment analysis (DEA) in helping a Portuguese bank to manage the performance of its branches. The bank wanted to set targets for the branches on such variables as growth in number of clients, growth in funds deposited and so on. Such variables can take positive and negative values but apart from some exceptions, traditional DEA models have hitherto been restricted to non-negative data. We report on the development of a model to handle unrestricted data in a DEA framework and illustrate the use of this model on data from the bank concerned.
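The abstract does not give the model itself, so the sketch below implements one directional-distance formulation that is well defined for negative data, the range directional measure, as a plausible stand-in. The toy data, the variable returns-to-scale assumption and the scipy solver choice are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def rdm_efficiency(X, Y, o):
    """Range directional measure for unit `o`: a directional-distance DEA
    model that stays well defined when data take negative values.

    X: (m, n) inputs, Y: (s, n) outputs, columns are units. Returns the
    inefficiency beta (0 = efficient). A sketch, not the paper's exact model.
    """
    m, n = X.shape
    s, _ = Y.shape
    Rx = X[:, o] - X.min(axis=1)    # scope for input reduction
    Ry = Y.max(axis=1) - Y[:, o]    # scope for output expansion

    c = np.zeros(n + 1)
    c[0] = -1.0                     # maximise beta (linprog minimises)
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = Rx                # sum(lam * x) + beta*Rx <= x_o
    A_ub[:m, 1:] = X
    b_ub[:m] = X[:, o]
    A_ub[m:, 0] = Ry                # -sum(lam * y) + beta*Ry <= -y_o
    A_ub[m:, 1:] = -Y
    b_ub[m:] = -Y[:, o]
    A_eq = np.ones((1, n + 1))      # sum(lam) = 1 (variable returns to scale)
    A_eq[0, 0] = 0.0

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return -res.fun

# Toy data: 4 branches, 1 input, 2 outputs -- note the negative "growth" output.
X = np.array([[10.0, 12.0, 8.0, 11.0]])
Y = np.array([[100.0, 90.0, 80.0, 95.0],
              [5.0, -3.0, 2.0, -1.0]])
print([round(rdm_efficiency(X, Y, o), 3) for o in range(4)])
```

Because only differences from the best observed values enter the constraints, a negative observation (such as negative growth in funds deposited) causes no difficulty, which is exactly the property the paper's setting requires.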
Abstract:
This paper identifies the important limiting processes in transmission capacity for amplified soliton systems. Some novel control techniques are described for optimizing this capacity. In particular, dispersion compensation and phase conjugation are identified as offering good control of jitter without the need for many new components in the system. An advanced average soliton model is described and demonstrated to permit large amplifier spacing. The potential for solitons in high-dispersion land-based systems is discussed and results are presented showing 10 Gbit/s transmission over 1000 km with significant amplifier spacing.
Abstract:
The authors study experimentally ~10 ps return-to-zero pulse propagation near the net dispersion zero of an optical fibre transmission line. Stable, near-jitter-free propagation was observed over 70 Mm. Pulse stabilisation and ASE suppression were achieved through the saturable absorber mechanism of nonlinear polarisation rotation.
Abstract:
This thesis presents results of transmission experiments using optical solitons in a dispersion managed optical fibre recirculating loop. The basic concepts of pulse propagation in optical fibre are introduced before optical solitons and their use in optically amplified fibre systems are discussed. The role of dispersion management in such systems is then considered. The design, operation and limitations of the recirculating loop and the soliton sources which were used, together with the experimental techniques, are described before the experimental work is presented. The experimental work covers a number of areas, all of which used dispersion management of the transmission line. A novel ultra-long-distance propagation scheme which achieved low timing jitter by suppressing amplifier noise and by working close to the zero-dispersion wavelength was demonstrated. The use of fibre Bragg gratings as wavelength filters to suppress noise and reduce timing jitter has been investigated. The performance of the fibre grating compared favourably with that of a bulk device and was in good agreement with theoretical predictions. The upgrade of existing standard fibre systems to higher bit rates is currently an important issue. The possibility of using solitons with dispersion compensation to increase the data rate of existing standard fibre systems to 10 Gbit/s over 5000 km has been demonstrated. The applicability of this technique to longer distances, higher bit rates or longer amplifier spans is also investigated by optimisation of the dispersion management scheme. The use of fibre Bragg gratings as the dispersion-compensating elements in such standard fibre transmission experiments has been examined, and the main problem that these devices currently have, high polarisation mode dispersion, is discussed. The likely future direction of optical communications, and what part solitons and dispersion management will play in this development, is discussed in the thesis conclusions.
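As background to the propagation physics running through this thesis, a fundamental soliton forms when anomalous group-velocity dispersion is balanced by self-phase modulation; in the standard notation of fibre optics the required peak power is

```latex
P_{0} = \frac{|\beta_{2}|}{\gamma T_{0}^{2}}
```

where \beta_2 is the group-velocity dispersion parameter, \gamma the fibre nonlinear coefficient and T_0 the pulse width. This standard relation is quoted here for orientation only; the symbols are not taken from the thesis itself.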