930 results for topological complexity
Abstract:
This paper reviews some of the recent developments in complexity theory as applied to telephone switching. Some of these techniques are suitable for practical implementation in India.
Abstract:
We computed Higuchi's fractal dimension (FD) of resting, eyes-closed EEG recorded from 30 scalp locations in 18 male neuroleptic-naive, recent-onset schizophrenia (NRS) subjects and 15 male healthy control (HC) subjects, who were group-matched for age. Schizophrenia patients showed a diffuse reduction of FD except in the bilateral temporal and occipital regions, with the reduction being most prominent bifrontally. The positive symptom (PS) schizophrenia subjects showed FD values similar to or even higher than HC in the bilateral temporo-occipital regions, along with a co-existent bifrontal FD reduction as noted in the overall sample of NRS. In contrast, this increase in FD values in the bilateral temporo-occipital region was absent in the negative symptom (NS) subgroup. The regional differences in complexity suggested by these findings may reflect the aberrant brain dynamics underlying the pathophysiology of schizophrenia and its symptom dimensions. Higuchi's method of measuring FD directly in the time domain provides an alternative to the more computationally intensive nonlinear methods of estimating EEG complexity.
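Higuchi's algorithm is simple enough to state concretely. Below is a minimal NumPy sketch of the standard time-domain procedure; the choice k_max = 8 and the least-squares fit are generic implementation details, not parameters reported by this paper.

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Estimate Higuchi's fractal dimension of a 1-D time series.

    Curve lengths L(k) are computed at delays k = 1..k_max, and the FD
    is the slope of log L(k) versus log(1/k).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    lengths = []
    for k in range(1, k_max + 1):
        lk = []
        for m in range(k):  # m = initial time offset of the subsequence
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # normalized curve length of the subsequence x[m], x[m+k], ...
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((idx.size - 1) * k)
            lk.append(dist * norm / k)
        lengths.append(np.mean(lk))
    k_vals = np.arange(1, k_max + 1)
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lengths), 1)
    return slope
```

For a straight-line signal the estimate is 1, and it approaches 2 as the waveform becomes plane-filling, which is what makes FD a compact index of signal complexity.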
Abstract:
The concept of focus on opportunities describes how many new goals, options, and possibilities employees believe they have in their personal future at work. This study investigated the specific and shared effects of age, job complexity, and the use of successful aging strategies called selection, optimization, and compensation (SOC) in predicting focus on opportunities. Results of data collected from 133 employees of one company (mean age = 38 years, SD = 13, range 16–65 years) showed that age was negatively, and job complexity and use of SOC strategies were positively, related to focus on opportunities. In addition, older employees in high-complexity jobs, and older employees in low-complexity jobs with high use of SOC strategies, were better able to maintain a focus on opportunities than older employees in low-complexity jobs with low use of SOC strategies.
Abstract:
Focus on opportunities is a cognitive-motivational facet of occupational future time perspective that describes how many new goals, options, and possibilities individuals expect to have in their personal work-related futures. This study examined focus on opportunities as a mediator of the relationships between age and work performance and between job complexity and work performance. In addition, it was expected that job complexity would buffer the negative relationship between age and focus on opportunities and weaken the negative indirect effect of age on work performance. Results of mediation, moderation, and moderated mediation analyses with data collected from 168 employees in 41 organizations (mean age = 40.22 years, SD = 10.43, range = 19–64 years), as well as from 168 peers providing work performance ratings, supported these assumptions. The findings suggest that future studies on the role of age in work design and performance should take employees' focus on opportunities into account.
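As a sketch of the analysis strategy only (the simulated data, variable names, and effect sizes below are illustrative assumptions, not the study's dataset), moderation enters as an age × complexity interaction, and the indirect effect is the product of the two regression paths:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; coefficients and noise are made up.
rng = np.random.default_rng(0)
n = 168
age = rng.uniform(19, 64, n)
complexity = rng.normal(0.0, 1.0, n)
focus = (-0.03 * age + 0.4 * complexity
         + 0.02 * age * complexity + rng.normal(0.0, 1.0, n))
performance = 0.5 * focus + rng.normal(0.0, 1.0, n)
df = pd.DataFrame({"age": age, "complexity": complexity,
                   "focus": focus, "performance": performance})

# Moderation: complexity buffers the age -> focus relation if the
# age:complexity interaction is positive.
m_focus = smf.ols("focus ~ age * complexity", data=df).fit()

# Mediation: indirect effect of age on performance via focus is the
# product of the age -> focus and focus -> performance paths.
m_perf = smf.ols("performance ~ focus + age", data=df).fit()
indirect = m_focus.params["age"] * m_perf.params["focus"]
print(m_focus.params["age:complexity"], indirect)
```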
Abstract:
There exist various proposals for building a functional, fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic proposal, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique: the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, the other, more complicated fusion subalgebras were not considered. Studying their applicability for quantum computation could be a topic of further research.
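The bookkeeping behind a non-trivial fusion algebra is easy to make concrete. The sketch below uses the well-known Fibonacci anyon model (labels 1 and t, with t × t = 1 + t) rather than the D(S_3) quantum double worked out in the thesis, since its rules fit in four lines; the mechanics of tracking fusion multiplicities are the same.

```python
from collections import Counter

# Fusion rules of the Fibonacci anyon model (an illustrative stand-in,
# not the S_3 quantum double from the thesis).
FUSION = {
    ("1", "1"): ["1"],
    ("1", "t"): ["t"],
    ("t", "1"): ["t"],
    ("t", "t"): ["1", "t"],   # non-unique outcome: the non-trivial rule
}

def fuse(state: Counter, anyon: str) -> Counter:
    """Fuse one more anyon into a multiset of possible total charges."""
    out = Counter()
    for charge, mult in state.items():
        for result in FUSION[(charge, anyon)]:
            out[result] += mult
    return out

# Fusing n anyons of type t: the number of fusion paths to each total
# charge grows as a Fibonacci number. This degeneracy is the protected
# space in which qubits can be encoded.
state = Counter({"1": 1})
for n in range(1, 9):
    state = fuse(state, "t")
    print(n, dict(state))
```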
Abstract:
For the consumer, flavor is arguably the most important aspect of a good coffee. Coffee flavor is extremely complex and arises from numerous chemical, biological and physical influences of cultivar, coffee cherry maturity, geographical growing location, production, processing, roasting and cup preparation. Not surprisingly, there is a large volume of published research detailing the volatile and non-volatile compounds in coffee that are likely to play a role in coffee flavor. Further, much has been published on the sensory properties of coffee. Nevertheless, the link between flavor components and the sensory properties expressed in the complex matrix of coffee is yet to be fully understood. This paper provides an overview of the chemical components that are thought to be involved in the flavor and sensory quality of Arabica coffee.
Abstract:
A smooth map is said to be stable if small perturbations of the map differ from the original one only by a smooth change of coordinates. Smoothly stable maps are generic among the proper maps between given source and target manifolds when the source and target dimensions belong to the so-called nice dimensions, but outside this range of dimensions, smooth maps cannot generally be approximated by stable ones. This leads to the definition of topologically stable maps, where the smooth coordinate changes are replaced with homeomorphisms. The topologically stable maps are generic among proper maps for any dimensions of source and target. The purpose of this thesis is to investigate methods for proving topological stability by constructing extremely tame (E-tame) retractions onto the map in question from one of its smoothly stable unfoldings. In particular, we investigate how to use E-tame retractions from stable unfoldings to find topologically ministable unfoldings for certain weighted homogeneous maps or germs.

Our first results concern the construction of E-tame retractions and their relation to topological stability. We study how to construct E-tame retractions from partial or local information, and these results form our toolbox for the main constructions.

In the next chapter we study the group of right-left equivalences leaving a given multigerm f invariant, and show that when the multigerm is finitely determined, the group has a maximal compact subgroup and the corresponding quotient is contractible. This means, essentially, that the group can be replaced with a compact Lie group of symmetries without much loss of information. We also show how to split the group into a product whose components depend only on the monogerm components of f.

In the final chapter we investigate representatives of the E- and Z-series of singularities, discuss their instability, and use our tools to construct E-tame retractions for some of them. The construction is based on describing the geometry of the set of points where the map is not smoothly stable, and on the observation that, by induction and our constructional tools, we already know how to construct local E-tame retractions along this set. The local solutions can then be glued together using our knowledge of the symmetry groups of the local germs. We also discuss how to generalize our method to the whole E- and Z-series.
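For reference, the two stability notions contrasted above have standard formal statements; this is the textbook formulation, our paraphrase rather than a quotation from the thesis:

```latex
% f : N -> M proper and smooth is *smoothly stable* if every g
% sufficiently close to f in the Whitney C^\infty topology satisfies
g = \psi \circ f \circ \varphi^{-1},
\qquad \varphi \in \mathrm{Diff}(N),\ \psi \in \mathrm{Diff}(M),
% and *topologically stable* if the same holds with \varphi and \psi
% only required to be homeomorphisms.
```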
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to apply the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion; due to computational difficulties, however, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
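As a sketch of what such a computation looks like for the multinomial case, the snippet below evaluates the parametric complexity C(K, n) with the linear-time recurrence C(K+2, n) = C(K+1, n) + (n/K) C(K, n) (Kontkanen & Myllymäki, 2007, Information Processing Letters). This is an illustrative floating-point version written by us, not the thesis code, and it omits the exact rational-arithmetic variant the abstract mentions.

```python
import math

def multinomial_nml_complexity(K, n):
    """Parametric complexity C(K, n) of the K-category multinomial
    model at sample size n.  C(1, n) = 1; C(2, n) is the exact binomial
    sum; larger K via C(K+2, n) = C(K+1, n) + (n/K) * C(K, n)."""
    c1 = 1.0
    # C(2, n) = sum_k binom(n, k) (k/n)^k ((n-k)/n)^(n-k); Python's
    # 0.0 ** 0 == 1.0 handles the k = 0 and k = n terms.
    c2 = sum(math.comb(n, k)
             * (k / n) ** k
             * ((n - k) / n) ** (n - k)
             for k in range(n + 1))
    if K == 1:
        return c1
    for j in range(1, K - 1):        # advance (C(j), C(j+1)) -> C(K)
        c1, c2 = c2, c2 + (n / j) * c1
    return c2

# NML code length = -log-likelihood at the MLE + log C(K, n)
counts = [5, 3, 2]
n, K = sum(counts), len(counts)
loglik = sum(c * math.log(c / n) for c in counts if c > 0)
nml_bits = (-loglik + math.log(multinomial_nml_complexity(K, n))) / math.log(2)
print(round(nml_bits, 3))
```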
Abstract:
Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra. In data mining one needs results that are interpretable, and what is considered interpretable in data mining can be very different from what is considered interpretable in linear algebra.

The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability, since the factor matrices are of the same type as the original matrix, and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Several other decomposition methods are also described, and the computational complexity of computing them is studied together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
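The Boolean product mentioned here is easy to make concrete. The sketch below (our own toy example, not one of the thesis algorithms) shows Boolean matrix multiplication and the entry-disagreement count that such decomposition heuristics typically minimize:

```python
import numpy as np

def boolean_matmul(B, C):
    """Boolean product: (B o C)[i, j] = 1 iff some k has
    B[i, k] = C[k, j] = 1.  Overlapping factors do not 'add up',
    which is what makes binary factors easy to read."""
    return ((np.asarray(B, dtype=int) @ np.asarray(C, dtype=int)) > 0).astype(int)

# A 3x4 binary matrix reconstructed exactly from two rank-2 binary
# factors (toy data):
B = np.array([[1, 0],
              [1, 1],
              [0, 1]])
C = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1]])
A = boolean_matmul(B, C)
print(A)

# The usual objective: number of entries where the target matrix and
# the Boolean product disagree (0 here, since the factorization is exact).
print(np.sum(A != boolean_matmul(B, C)))
```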
Abstract:
We develop a two-stage split vector quantization method with optimum bit allocation for achieving minimum computational complexity. This also results in a much lower memory requirement than the recently proposed switched split vector quantization method. To further improve the rate-distortion performance, a region-specific normalization is introduced, which yields a 1 bit/vector improvement over the typical two-stage split vector quantizer for wideband LSF quantization.
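To make the two-stage split structure concrete, here is a hedged Python sketch: the codebooks are random and the splits and sizes are illustrative assumptions, and it does not reproduce the paper's optimum bit allocation or region-specific normalization.

```python
import numpy as np

def split_vq_quantize(x, codebooks):
    """Quantize x by splitting it into parts, each searched in its own
    small codebook (nearest neighbour, Euclidean distance).  Searching
    small sub-codebooks instead of one huge full-vector codebook is the
    source of the complexity and memory savings."""
    out, start = [], 0
    for cb in codebooks:              # cb: (codebook_size, part_dim)
        part = x[start:start + cb.shape[1]]
        idx = np.argmin(((cb - part) ** 2).sum(axis=1))
        out.append(cb[idx])
        start += cb.shape[1]
    return np.concatenate(out)

rng = np.random.default_rng(1)
x = rng.normal(size=10)                        # e.g. a 10-dim LSF vector
stage1 = [rng.normal(size=(16, 5)), rng.normal(size=(16, 5))]
stage2 = [rng.normal(size=(8, 5)), rng.normal(size=(8, 5))]
y1 = split_vq_quantize(x, stage1)
y = y1 + split_vq_quantize(x - y1, stage2)     # stage 2 refines the residual
print(np.mean((x - y) ** 2))
```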
Abstract:
We present two discriminative language modelling techniques for a Lempel-Ziv-Welch (LZW) based language identification (LID) system. The previous approach to LID using the LZW algorithm was to use the LZW pattern tables directly for language modelling. But since the patterns in one language's pattern table are shared by other languages' pattern tables, confusability prevailed in the LID task. To overcome this, we present two pruning techniques: (i) Language Specific (LS-LZW), in which patterns common to more than one pattern table are removed; and (ii) Length-Frequency product based (LF-LZW), in which patterns whose length-frequency product falls below a threshold are removed. These approaches reduce the classification score (compression ratio [LZW-CR] or the weighted discriminant score [LZW-WDS]) for non-native languages and increase LID performance considerably. The memory and computational requirements of these techniques are also much lower than those of the basic LZW techniques.
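A minimal sketch of the underlying idea, with toy training strings of our own and code counts standing in for the paper's LZW-CR/LZW-WDS scores:

```python
def lzw_train(text):
    """Build an LZW pattern table from one language's training text."""
    table, w = set(text), ""          # start with the single characters
    for c in text:
        if w + c in table:
            w += c
        else:
            table.add(w + c)          # learn a new pattern
            w = c
    return table

def lzw_code_count(text, table):
    """Codes needed to encode text with a FROZEN table; fewer codes
    means the table's patterns match the text better."""
    count, w = 0, ""
    for c in text:
        if w + c in table:
            w += c
        else:
            if w:
                count += 1            # emit the longest matched pattern
            if c in table:
                w = c
            else:
                count += 1            # escape code for an unseen symbol
                w = ""
    return count + (1 if w else 0)

train = {"en": "the quick brown fox jumps over the lazy dog " * 20,
         "fi": "nopea ruskea kettu hyppaa laiskan koiran yli " * 20}
tables = {lang: lzw_train(txt) for lang, txt in train.items()}
scores = {lang: lzw_code_count("the lazy dog jumps", t)
          for lang, t in tables.items()}
print(min(scores, key=scores.get))    # fewest codes ~ best-matching language

# LS-LZW-style pruning sketch: drop multi-character patterns shared by
# both tables (single characters are kept so encoding still works).
shared = tables["en"] & tables["fi"]
pruned = {lang: {p for p in t if len(p) == 1 or p not in shared}
          for lang, t in tables.items()}
```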
Abstract:
Information exchange (IE) is a critical component of the complex collaborative medication process in residential aged care facilities (RACFs). Designing information and communication technology (ICT) to support complex processes requires a profound understanding of the IE that underpins their execution. Little existing research investigates the complexity of IE in RACFs and its impact on ICT design. The aim of this study was thus to undertake an in-depth exploration of the IE process involved in medication management and to identify its implications for the design of ICT. The study was undertaken at a large metropolitan facility in NSW, Australia. A total of three focus groups, eleven interviews and two observation sessions were conducted between July and August 2010. Process modelling was undertaken by translating the qualitative data via in-depth iterative inductive analysis. The findings highlight the complexity and collaborative nature of IE in RACF medication management. The resulting models emphasize the need to: a) deal with temporal complexity; b) rely on an interdependent set of coordinative artefacts; and c) use synchronous communication channels for coordination. Taken together, these are crucial aspects of the IE process in RACF medication management that need to be catered for when designing ICT in this critical area. This study provides important new evidence of the advantages of viewing a process as part of a system, rather than as segregated tasks, as a means of identifying the latent requirements for ICT design that can support complex collaborative processes like medication management in RACFs. © 2012 IEEE.
Abstract:
The study of the formation and growth of topologically close-packed (TCP) compounds is important for understanding the performance of turbine blades in jet engine applications. These deleterious phases grow mainly by a diffusion process in the superalloy substrate. A significant volume change was found because of the growth of the μ phase in the Co-Mo system. The growth kinetics of this phase and different diffusion parameters, such as the interdiffusion, intrinsic and tracer diffusion coefficients, are calculated. Further, the activation energy, which provides an idea of the mechanism, is determined. Moreover, the interdiffusion coefficient in the Co(Mo) solid solution and the impurity diffusion coefficient of Mo in Co are determined.
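The kinetic quantities mentioned here are conventionally extracted from two textbook relations, stated below for reference (the paper's fitted values are not reproduced):

```latex
% Diffusion-controlled layer growth is parabolic in time:
x^2 = 2\,k_p\,t,
% and the activation energy Q follows from an Arrhenius fit of the
% diffusion coefficient:
D = D_0 \exp\!\left(-\frac{Q}{RT}\right)
\quad\Longrightarrow\quad
\ln D = \ln D_0 - \frac{Q}{R}\cdot\frac{1}{T},
% so Q is read off the slope of \ln D against 1/T.
```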