883 results for 080401 Coding and Information Theory
Abstract:
The problem of distributed compression for correlated quantum sources is considered. The classical version of this problem was solved by Slepian and Wolf, who showed that distributed compression could take full advantage of redundancy in the local sources created by the presence of correlations. Here it is shown that, in general, this is not the case for quantum sources, by proving a lower bound on the rate sum for irreducible sources of product states which is stronger than the one given by a naive application of Slepian-Wolf. Nonetheless, strategies taking advantage of correlation do exist for some special classes of quantum sources. For example, Devetak and Winter demonstrated the existence of such a strategy when one of the sources is classical. Optimal nontrivial strategies for a different extreme, sources of Bell states, are presented here. In addition, it is explained how distributed compression is connected to other problems in quantum information theory, including information-disturbance questions, entanglement distillation and quantum error correction.
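The classical baseline that the abstract contrasts with can be illustrated numerically. For a toy correlated binary source (a hypothetical joint pmf, not taken from the paper), the Slepian-Wolf achievable region is Rx >= H(X|Y), Ry >= H(Y|X), Rx + Ry >= H(X,Y). A minimal sketch:

```python
import math

def H(probs):
    """Shannon entropy (bits) of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy correlated binary source: hypothetical joint pmf p(x, y), not from the paper.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

H_XY = H(list(joint.values()))
p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]
H_X, H_Y = H(p_x), H(p_y)
H_X_given_Y = H_XY - H_Y  # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_Y_given_X = H_XY - H_X

def in_slepian_wolf_region(rx, ry):
    """Classical achievability: Rx >= H(X|Y), Ry >= H(Y|X), Rx + Ry >= H(X,Y)."""
    return rx >= H_X_given_Y and ry >= H_Y_given_X and rx + ry >= H_XY
```

For this source the correlation lets the pair be compressed to H(X,Y) ≈ 1.47 bits in total rather than H(X) + H(Y) = 2 bits; the paper's result is that quantum sources do not in general admit the analogous saving.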
Abstract:
The practice of career counseling has been derived from principles of career theory and counseling theory. In recent times, the fields of both career and counseling theory have undergone considerable change. This article details the move toward convergence in career theory, and the subsequent development of the Systems Theory Framework in this domain. The importance of this development to connecting theory and practice in the field of career counseling is discussed.
Abstract:
In the social sciences, the relationship between religion and politics is mainly analysed in the sociology of religion and in the theory of international relations. While each of these fields promotes different approaches to studying their interdependence, the individual's perception of religion and politics is neglected by current research. The faithful person who participates in religious ceremonies, listens and behaves according to specific religious teachings, and actively engages in the liturgical life of the institutional form of his religion has a specific way of understanding the relationship between religion and politics. I argue that this aspect is under-researched and misrepresented in the sociology and international relations literatures. However, a more complex analysis is offered by the study of nationalism, and especially by its ethnosymbolic approach, which includes, at the micro and macro societal levels, the presence of myths and symbols as part of the individual's and the nation's life. An integrative theory analysing the connection between religion and politics takes into account the role of myths and symbols from the perspectives of both individuals and ethnic communities.
Abstract:
We propose a method to determine the critical noise level for decoding Gallager-type low-density parity-check error-correcting codes. The method is based on the magnetization enumerator (M) rather than on the weight enumerator (W) presented recently in the information theory literature. The interpretation of our method is appealingly simple, and the relation between the different decoding schemes, such as typical-pairs decoding, MAP, and finite-temperature decoding (MPM), becomes clear. Our results are more optimistic than those derived via the methods of information theory and are in excellent agreement with recent results from another statistical physics approach.
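For reference, the information-theoretic benchmark against which such critical noise levels are compared is the Shannon limit of the binary symmetric channel: a rate-R code cannot operate beyond the flip probability p solving 1 - h2(p) = R. A small sketch (the bisection and the rate-1/2 example are illustrative, not the paper's enumerator calculation):

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_shannon_threshold(rate, tol=1e-12):
    """Largest flip probability p with 1 - h2(p) >= rate,
    found by bisection on [0, 0.5] (capacity is decreasing there)."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if 1 - h2(mid) >= rate:
            lo = mid
        else:
            hi = mid
    return lo

# For a rate-1/2 code the Shannon limit is a flip probability of about 0.11
p_star = bsc_shannon_threshold(0.5)
```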
Abstract:
A novel approach, based on statistical mechanics, to analysing the typical performance of optimum code-division multiple-access (CDMA) multiuser detectors is reviewed. A 'black-box' view of the basic CDMA channel is introduced, on the basis of which the CDMA multiuser detection problem is regarded as a 'learning-from-examples' problem for the 'binary linear perceptron' of the neural network literature. Adopting the Bayes framework, analysis of the performance of the optimum CDMA multiuser detectors is reduced to evaluating the average of the cumulant generating function of a relevant posterior distribution. This average is evaluated, by formal analogy with a similar calculation in the spin glass theory of statistical mechanics, using the replica method developed in that field.
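The object whose typical performance such replica analyses characterize is the jointly optimum detector. A toy sketch of it for a synchronous CDMA channel (the user count, signature sequences, and noise level are all invented for illustration; real analyses take these to the large-system limit):

```python
import itertools
import random

random.seed(1)
K, N = 4, 8  # users and spreading-sequence length (toy sizes)
# Random +/-1 signature sequences, one column per user, normalized by sqrt(N)
S = [[random.choice((-1.0, 1.0)) / N ** 0.5 for _ in range(K)] for _ in range(N)]
b_true = [random.choice((-1.0, 1.0)) for _ in range(K)]
sigma = 0.3
# Received chip vector: superposition of all users' signals plus Gaussian noise
y = [sum(S[n][k] * b_true[k] for k in range(K)) + random.gauss(0.0, sigma)
     for n in range(N)]

def cost(b, y, S):
    """Squared Euclidean distance; minimizing it is ML detection in Gaussian noise."""
    return sum((y[n] - sum(S[n][k] * b[k] for k in range(len(b)))) ** 2
               for n in range(len(y)))

def ml_detect(y, S):
    """Jointly optimum (ML) multiuser detector: exhaustive search over 2^K bit vectors."""
    K = len(S[0])
    return min(itertools.product((-1.0, 1.0), repeat=K),
               key=lambda b: cost(b, y, S))

b_hat = ml_detect(y, S)
```

The exhaustive search makes the exponential complexity of optimum detection explicit, which is why typical-case (rather than worst-case) performance analysis is of interest.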
Abstract:
We analyze, using the replica method of statistical mechanics, the theoretical performance of coded code-division multiple-access (CDMA) systems in which regular low-density parity-check (LDPC) codes are used for channel coding.
Abstract:
Using panel data for 41 developed and developing countries over the period 1998-2004, this paper examines the links between ICT diffusion and human development. We conducted a panel regression analysis of the investments in healthcare, education and information and communication technology (ICT) against human development index (HDI). The results show that these variables can be used to predict HDI scores. In agreement with findings of previous research, it is clear from our analysis that the central focus on ICT as a solution for development will not bring the results that the promoters of ICT as an ‘engine of growth’ are expecting. It is unwise to disaggregate the issues of education and healthcare infrastructure from ICT infrastructure development. ICT policies should be integrated with other national policies in order to find a holistic and structural solution to development.
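The kind of regression involved can be sketched as follows; the observations and coefficients below are synthetic and purely illustrative (the paper's panel specification and its 41-country data set are not reproduced here):

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) beta = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Hypothetical country-year observations: (ICT, education, healthcare) investment
obs = [(2.0, 4.0, 3.0), (1.0, 5.0, 2.0), (3.0, 2.0, 4.0),
       (0.5, 3.5, 1.0), (2.5, 4.5, 3.5), (1.5, 2.5, 2.0)]
X = [[1.0, ict, edu, health] for ict, edu, health in obs]  # leading 1s = intercept
true_beta = [0.5, 0.3, 0.2, 0.1]  # made-up coefficients for the illustration
hdi = [sum(t * x for t, x in zip(true_beta, row)) for row in X]
beta_hat = ols(X, hdi)
```

Because the synthetic HDI scores are generated exactly from `true_beta`, OLS recovers those coefficients; with real panel data the fit would of course be inexact and country/time effects would be added.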
Abstract:
This paper considers the role of opportunism in three contractual theories of the firm: rent-seeking theory, property rights theory, and agency theory. In each case I examine whether it is possible to have a functioning contractual theory of the firm without recourse to opportunism. Without opportunism firms may still exist as a result of issues arising from (incomplete) contracting. Far from posing a problem for the theory of the firm, questioning the role of opportunism and the ubiquity of the hold-up problem helps us understand more about the purpose and functions of contracts which go beyond mere incentive alignment.
Abstract:
Many see the absence of conflict between groups as indicative of effective intergroup relations. Others consider its management a suitable effectiveness criterion. In this article we demarcate a different approach and propose that these views are deficient in describing effective intergroup relations. The article theorizes alternative criteria of intergroup effectiveness rooted in team representatives' subjective value judgements and assesses the psychometric characteristics of a short measure based on these criteria. Results on empirical validity suggest the measure to be a potential alternative outcome of organizational conflict. Implications for both the study of intergroup relations and conflict theory are discussed. © 2005 Psychology Press Ltd.
Abstract:
This thesis presents a new approach to designing large organizational databases. The approach emphasizes the need for a holistic approach to the design process. The development of the proposed approach was based on a comprehensive examination of the issues of relevance to the design and utilization of databases. Such issues include conceptual modelling, organization theory, and semantic theory. The conceptual modelling approach presented in this thesis is developed over three design stages, or model perspectives. In the semantic perspective, concept definitions were developed based on established semantic principles. Such definitions rely on meaning - provided by intension and extension - to determine intrinsic conceptual definitions. A tool, called meaning-based classification (MBC), is devised to classify concepts based on meaning. Concept classes are then integrated using concept definitions and a set of semantic relations which rely on concept content and form. In the application perspective, relationships are semantically defined according to the application environment. Relationship definitions include explicit relationship properties and constraints. The organization perspective introduces a new set of relations specifically developed to maintain conformity of conceptual abstractions with the nature of information abstractions implied by user requirements throughout the organization. Such relations are based on the stratification of work hierarchies, defined elsewhere in the thesis. Finally, an example of an application of the proposed approach is presented to illustrate the applicability and practicality of the modelling approach.
Abstract:
This thesis explores the processes of team innovation. It utilises two studies, an organisationally based pilot and an experimental study, to examine and identify aspects of teams' behaviours that are important for successful innovative outcomes. The pilot study, based in two automotive manufacturers, involved the collection of team members' experiences through semi-structured interviews, and identified a number of factors that affected teams' innovative performance. These included: the application of ideative and dissemination processes; the importance of good team relationships, especially those of a more informal nature, in facilitating information and ideative processes; the role of external linkages in enhancing the quality and radicality of innovations; and the potential attenuation of innovative ideas by time deadlines. This study revealed a number of key team behaviours that may be important in successful innovation outcomes. These included: goal setting, idea generation and development, external contact, task and personal information exchange, leadership, positive feedback and resource deployment. These behaviours formed the basis of a coding system used in the second part of the research. Building on the results from the field-based research, an experimental study was undertaken to examine the behavioural differences between three groups of sixteen teams undertaking an innovative task: producing an anti-drugs poster. They were randomly assigned to one of three innovation category conditions suggested by King and Anderson (1990): emergent, imported and imposed. These conditions determined the teams' level of access to additional information on previously successful campaigns and the degree of freedom they had with regard to the design of the poster. In addition, a further experimental condition was imposed on half of the teams in each category, involving a formal time deadline for task completion.
The teams were videotaped for the duration of their innovation and their behaviours analysed and coded on five main aspects: ideation, external focus, goal setting, interpersonal, directive and resource-related activities. A panel of experts, utilising five scales developed from West and Anderson's (1996) innovation outcome measures, assessed the teams' outputs. ANOVAs and repeated-measures ANOVAs were deployed to identify whether there were significant differences between the different conditions. The results indicated that there were some behavioural differences between the categories and that behavioural changes occurred over the duration of the task. The results, however, revealed a complex picture and suggested only limited support for three distinctive innovation categories. There were many differences in behaviours, but rarely between more than two of the categories. A main finding was the impact that different levels of constraint had in changing teams' focus of attention. For example, emergent teams were found to use both their own team and external resources, whilst those who could import information about other successful campaigns were likely to concentrate outside the team and pay limited attention to the internal resources available within the team. In contrast, those operating under task constraints, with aspects of the task imposed onto them, were more likely to attend to internal team resources and pay limited attention to the external world. As indicated by the earlier field study, time deadlines did significantly change teams' behaviour, reducing ideative and information-exchange behaviours. The resulting model shows an important behavioural progression related to innovative teams: an initial openness to external sources, followed by openness to the intra-team environment. Premature closure on the final idea before the task mid-point was found to have a detrimental impact on teams' innovation.
Ideative behaviour per se was not significant for innovation outcome; instead, the development of intra-team support and trust emerged as crucial. Analysis of variance revealed some limited differentiation between the behaviours of teams operating under the aforementioned three innovation categories. There were also distinct detrimental differences in the behaviour of those operating under a time deadline. Overall, the study identified the complex interrelationships between team behaviours and outcomes, and between teams and their context.
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positive paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required, and this can be managed through the use of consistent documentation techniques and clarity in the definition of process responsibilities; management attention to global metrics and centralisation of the management of the process model are critical to its success. 2. The role of the information management process within the context of a process-oriented enterprise is to provide flexible and cost-effective application, technological and process support to the business. This is best achieved by centralising the management of information management and of the process model. A business-led approach, combined with the consolidation of application, information, process and data architectures, is central to providing effective business- and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused on supporting and creating the process model. The two models are mutually creating: one cannot exist without the other. There is thus a duality of process and information management.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS, ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT.

This thesis is a cross-disciplinary study of the empirical impact of real options theory in the fields of decision sciences and performance management. Borrowing from the economics, strategy and operations research literature, the research examines the risk and performance implications of real options in firms' strategic investments and multinational operations. An emphasis is placed on the flexibility potential and competitive advantage of multinational corporations, to explore the extent to which real options analysis can be classified as best practice in management research. Using a combination of qualitative and quantitative techniques, the evidence suggests that, if real options are explored and exploited appropriately, real options management can result in superior performance for multinational companies. The qualitative findings give an overview of the practical advantages and disadvantages of real options, and the statistical results reveal that firms which have developed a high awareness of their real options are, as predicted by the theory, able to reduce their downside risk and increase profits through flexibility, organisational slack and multinationality. Although real options awareness does not systematically guarantee higher returns from operations, supplementary findings indicate that firms with evidence of significant investment in the acquisition of real options knowledge tend to outperform competitors which are unaware of their real options. This research makes three contributions. First, it extends the real options and capacity planning literature to path-dependent contingent-claims analysis, underlining the benefits of average-type options in capacity allocation.
Second, it is thought to be the first study to explicitly examine the performance effects of real options on a sample of firms which have developed partial capabilities in real options analysis, suggesting that real options diffusion can be key to value creation. Third, it builds a new decision-aiding framework to facilitate the use of real options in project appraisal and strategic planning.
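The "average-type" options mentioned in the first contribution are Asian-style contingent claims, whose payoff depends on an average price over the holding period. A minimal Monte Carlo sketch under geometric Brownian motion (all parameters are hypothetical, not drawn from the thesis):

```python
import math
import random

def asian_call_mc(s0, strike, r, sigma, t, steps, paths, seed=0):
    """Monte Carlo value of an arithmetic-average (Asian) call under
    geometric Brownian motion -- an 'average-type' contingent claim."""
    random.seed(seed)
    dt = t / steps
    drift = (r - 0.5 * sigma ** 2) * dt   # risk-neutral log-price drift per step
    vol = sigma * math.sqrt(dt)
    total = 0.0
    for _ in range(paths):
        s, avg = s0, 0.0
        for _ in range(steps):
            s *= math.exp(drift + vol * random.gauss(0.0, 1.0))
            avg += s / steps              # running arithmetic average of prices
        total += max(avg - strike, 0.0)   # path-dependent payoff
    return math.exp(-r * t) * total / paths

# Illustrative parameters: at-the-money, 20% volatility, monthly averaging
price = asian_call_mc(100.0, 100.0, 0.05, 0.2, 1.0, 12, 20000)
```

Averaging makes the payoff path-dependent, which is why the thesis turns to path-dependent contingent-claims analysis rather than closed-form European option formulas.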
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
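The separation of perceptual sensitivity from response criterion that this work relies on is the standard signal detection decomposition, computable from hit and false-alarm rates. A minimal sketch (the rates below are invented for illustration, not taken from the experiments):

```python
from statistics import NormalDist

_z = NormalDist().inv_cdf  # probit: inverse of the standard normal CDF

def sdt_indices(hit_rate, fa_rate):
    """Sensitivity d' and criterion c under the equal-variance Gaussian
    signal detection model."""
    d_prime = _z(hit_rate) - _z(fa_rate)
    criterion = -0.5 * (_z(hit_rate) + _z(fa_rate))
    return d_prime, criterion

# Early vs late in the watch: both hits and false alarms fall together,
# the signature of a stricter criterion with roughly unchanged sensitivity.
early = sdt_indices(0.84, 0.16)   # d' near 2, criterion near 0
late = sdt_indices(0.69, 0.08)
```

On these invented rates, d' barely moves between the two periods while c increases, which is exactly the pattern the thesis reports for most monitoring tasks: no efficiency decrement, but a stricter response criterion.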
Abstract:
This thesis presents a thorough and principled investigation into the application of artificial neural networks to the biological monitoring of freshwater. It contains original ideas on the classification and interpretation of benthic macroinvertebrates, and aims to demonstrate their superiority over the biotic systems currently used in the UK to report river water quality. The conceptual basis of a new biological classification system is described, and a full review and analysis of a number of river data sets is presented. The biological classification is compared to the common biotic systems using data from the Upper Trent catchment. This data contained 292 expertly classified invertebrate samples identified to mixed taxonomic levels. The neural network experimental work concentrates on the classification of the invertebrate samples into biological class, where only a subset of the sample is used to form the classification. Other experimentation is conducted into the identification of novel input samples, the classification of samples from different biotopes and the use of prior information in the neural network models. The biological classification is shown to provide an intuitive interpretation of a graphical representation, generated without reference to the class labels, of the Upper Trent data. The selection of key indicator taxa is considered using three different approaches: one novel, one from information theory and one from classical statistical methods. Good indicators of quality class based on these analyses are found to be in good agreement with those chosen by a domain expert. The change in information associated with different levels of identification and enumeration of taxa is quantified. The feasibility of using neural network classifiers and predictors to develop numeric criteria for the biological assessment of sediment contamination in the Great Lakes is also investigated.
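The information-theoretic approach to indicator-taxon selection can be sketched as estimating the mutual information between a taxon's presence and the quality class, and ranking taxa by it. A minimal sketch (the sample counts below are invented, not the Upper Trent data):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts of x
    py = Counter(y for _, y in pairs)     # marginal counts of y
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Hypothetical samples: (taxon present?, quality class).  A taxon mostly seen
# in 'good' sites carries information about the class and scores highly.
samples = ([(1, "good")] * 8 + [(0, "poor")] * 7
           + [(1, "poor")] * 1 + [(0, "good")] * 2)
mi = mutual_information(samples)
```

A taxon whose presence is independent of the class scores zero, while a perfectly diagnostic taxon scores the full entropy of the class variable, giving a natural ranking of candidate indicators.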