44 results for: Database search; Evidential value; Bayesian decision theory; Influence diagrams
in Aston University Research Archive
Abstract:
Due to incomplete paperwork, only available for consultation at Aston University Library and Information Services with prior arrangement.
Abstract:
The problem of evaluating different learning rules and other statistical estimators is analysed. A new general theory of statistical inference is developed by combining Bayesian decision theory with information geometry. It is coherent and invariant. For each sample a unique ideal estimate exists and is given by an average over the posterior. An optimal estimate within a model is given by a projection of the ideal estimate. The ideal estimate is a sufficient statistic of the posterior, so practical learning rules are functions of the ideal estimator. If the sole purpose of learning is to extract information from the data, the learning rule must also approximate the ideal estimator. This framework is applicable to both Bayesian and non-Bayesian methods, with arbitrary statistical models, and to supervised, unsupervised and reinforcement learning schemes.
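A minimal formalisation consistent with this summary (the notation is assumed here, not quoted from the paper): given a posterior π(θ | 𝒟), the ideal estimate is the posterior average of the model distributions, and the optimal estimate within a model family M is its divergence projection:

```latex
\hat{p}(x) = \int p(x \mid \theta)\,\pi(\theta \mid \mathcal{D})\,\mathrm{d}\theta,
\qquad
p^{*} = \operatorname*{arg\,min}_{q \in M} D\!\left(\hat{p} \,\middle\|\, q\right).
```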
Abstract:
Bayesian decision theory is increasingly applied to support decision-making processes under environmental variability and uncertainty. Researchers from application areas like psychology and biomedicine have applied these techniques successfully. However, in the area of software engineering and specifically in the area of self-adaptive systems (SASs), little progress has been made in the application of Bayesian decision theory. We believe that techniques based on Bayesian Networks (BNs) are useful for systems that dynamically adapt themselves at runtime to a changing environment, which is usually uncertain. In this paper, we discuss the case for the use of BNs, specifically Dynamic Decision Networks (DDNs), to support the decision-making of self-adaptive systems. We present how such a probabilistic model can be used to support the decision-making in SASs and justify its applicability. We have applied our DDN-based approach to the case of an adaptive remote data mirroring system. We discuss results, implications and potential benefits of the DDN to enhance the development and operation of self-adaptive systems, by providing mechanisms to cope with uncertainty and automatically make the best decision.
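As a rough illustration of the decision rule a DDN encodes at each time slice (all names and numbers below are hypothetical, not taken from the paper): given a belief over environment states, choose the adaptation action with maximum expected utility.

```python
# One-step expected-utility decision, the core computation a Dynamic
# Decision Network performs at each time slice (illustrative sketch).

# Hypothetical belief over environment states for a data mirroring system.
belief = {"link_stable": 0.4, "link_degraded": 0.6}

# Hypothetical utilities: utility[action][state].
utility = {
    "synchronous_mirroring":  {"link_stable": 8.0, "link_degraded": 2.0},
    "asynchronous_mirroring": {"link_stable": 6.0, "link_degraded": 5.0},
}

def best_action(belief, utility):
    # Expected utility of each action under the current belief state.
    eu = {
        action: sum(belief[s] * u[s] for s in belief)
        for action, u in utility.items()
    }
    return max(eu, key=eu.get), eu

action, eu = best_action(belief, utility)
print(action, eu)  # asynchronous mirroring wins when the link is likely degraded
```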
Abstract:
Neural networks are statistical models and learning rules are estimators. In this paper a theory for measuring generalisation is developed by combining Bayesian decision theory with information geometry. The performance of an estimator is measured by the information divergence between the true distribution and the estimate, averaged over the Bayesian posterior. This unifies the majority of error measures currently in use. The optimal estimators also reveal some intricate interrelationships among information geometry, Banach spaces and sufficient statistics.
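A toy Monte Carlo sketch of such a performance measure (the setup is assumed here, not drawn from the paper: a Gaussian model with known variance and a conjugate prior): draw parameters from the posterior and average the divergence between the true distribution and the estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: data from N(mu, 1) with a N(0, 10) prior on mu,
# so the posterior over mu is also Gaussian.
data = rng.normal(1.5, 1.0, size=20)
prior_var, noise_var = 10.0, 1.0
post_var = 1.0 / (1.0 / prior_var + len(data) / noise_var)
post_mean = post_var * data.sum() / noise_var

def kl_gauss(mu1, mu2, var=1.0):
    # KL(N(mu1, var) || N(mu2, var)) for equal variances.
    return (mu1 - mu2) ** 2 / (2 * var)

def generalisation_error(mu_hat, n_samples=100_000):
    # Divergence between the "true" distribution (parameter drawn from
    # the posterior) and the estimate, averaged over the posterior.
    mus = rng.normal(post_mean, np.sqrt(post_var), size=n_samples)
    return kl_gauss(mus, mu_hat).mean()

# The posterior mean minimises this measure among point estimates.
print(generalisation_error(post_mean), generalisation_error(data.mean()))
```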
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature. It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory and data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement.
Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
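The sensitivity/criterion distinction this thesis relies on comes from signal detection theory; a worked example (the hit and false-alarm rates are hypothetical) shows how both are computed:

```python
from scipy.stats import norm

def sdt_measures(hit_rate, fa_rate):
    # Sensitivity d' and criterion c from standard signal detection theory.
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_f           # perceptual efficiency
    criterion = -(z_h + z_f) / 2  # response strictness (c > 0 = stricter)
    return d_prime, criterion

# Hypothetical vigilance data: similar sensitivity early and late in the
# watch, but a stricter criterion later (fewer hits AND fewer false alarms).
print(sdt_measures(0.80, 0.20))  # early in the work period
print(sdt_measures(0.69, 0.12))  # later: d' similar, c higher
```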
Abstract:
A family of measurements of generalisation is proposed for estimators of continuous distributions. In particular, they apply to neural network learning rules associated with continuous neural networks. The optimal estimators (learning rules) in this sense are Bayesian decision methods with information divergence as loss function. The Bayesian framework guarantees internal coherence of such measurements, while the information geometric loss function guarantees invariance. The theoretical solution for the optimal estimator is derived by a variational method. It is applied to the family of Gaussian distributions and the implications are discussed. This is one in a series of technical reports on this topic; it generalises the results of the discrete-distribution report (Zhu95:prob.discrete) to continuous distributions and serves as a concrete example of a larger picture (Zhu95:generalisation).
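In outline (a paraphrase of standard results consistent with the abstract, not a quotation of the report): the optimal estimator solves a variational problem, and for KL divergence with an exponential-family model the solution matches posterior-expected sufficient statistics:

```latex
q^{*} = \operatorname*{arg\,min}_{q \in M}
\int \pi(\theta \mid \mathcal{D})\,
D\!\left(p_{\theta} \,\middle\|\, q\right)\mathrm{d}\theta,
\qquad
\mathbb{E}_{q^{*}}\!\left[T(x)\right]
= \mathbb{E}_{\pi(\theta \mid \mathcal{D})}\,
\mathbb{E}_{p_{\theta}}\!\left[T(x)\right],
```

where T is the sufficient statistic of M. For the Gaussian family this reduces to matching the posterior-expected mean and second moment.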
Abstract:
Flow control in Computer Communication systems is generally a multi-layered structure, consisting of several mechanisms operating independently at different levels. Evaluation of the performance of networks in which different flow control mechanisms act simultaneously is an important area of research, and is examined in depth in this thesis. This thesis presents the modelling of a finite resource computer communication network equipped with three levels of flow control, based on closed queueing network theory. The flow control mechanisms considered are: end-to-end control of virtual circuits, network access control of external messages at the entry nodes, and hop-level control between nodes. The model is solved by a heuristic technique, based on an equivalent reduced network and the heuristic extensions to the mean value analysis algorithm. The method has significant computational advantages, and overcomes the limitations of the exact methods. It can be used to solve large network models with finite buffers and many virtual circuits. The model and its heuristic solution are validated by simulation. The interaction between the three levels of flow control is investigated. A queueing model is developed for the admission delay on virtual circuits with end-to-end control, in which messages arrive from independent Poisson sources. The selection of optimum window limit is considered. Several advanced network access schemes are postulated to improve the network performance as well as that of selected traffic streams, and numerical results are presented. A model for the dynamic control of input traffic is developed. Based on Markov decision theory, an optimal control policy is formulated. Numerical results are given and throughput-delay performance is shown to be better with dynamic control than with static control.
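The mean value analysis (MVA) algorithm that the heuristic extends is compact enough to sketch; this is the textbook exact version for a single-class closed product-form network (the thesis's heuristic extensions for finite buffers and multiple virtual circuits are not reproduced here):

```python
def exact_mva(service_demands, n_customers):
    """Textbook exact MVA for a single-class closed queueing network.

    service_demands: per-station demand D_k = visit ratio * service time.
    Returns throughput, per-station residence times, and queue lengths.
    """
    queue = [0.0] * len(service_demands)
    for n in range(1, n_customers + 1):
        # Residence time at each station with n customers circulating.
        residence = [d * (1.0 + q) for d, q in zip(service_demands, queue)]
        throughput = n / sum(residence)
        queue = [throughput * r for r in residence]
    return throughput, residence, queue

# Hypothetical 3-station network with 10 circulating messages.
print(exact_mva([0.05, 0.03, 0.08], 10))
```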
Abstract:
The research was carried out within a major public company. It sought to implement an approach to strategic planning which accounted for organisational values as well as employing a holistic value-free analysis of the firm and its environment. To this end, an 'ecological' model of the firm was formulated. A series of value-free strategic policies for its development were generated. These policies were validated by the company's top management. They compared favourably with their own planning outcomes. The approach appeared to be diagnostically strong but lacked sufficient depth in the context of finding realistic corrective measures. However, feedback from the company showed it to be a useful complementary process to conventional procedures, in providing an explicitly different perspective. The research empirically evaluated the company's value-systems and their influence on strategy. It introduced the idea of an organisational 'self-concept' pre-determining the acceptability of various strategies. The values and the 'self-concept' of the company were identified and validated. They appeared to have considerable influence on strategy. In addition, the company's planning process within the decentralised structure was shown to be sub-optimal. This resulted from the variety of value systems maintained by different parts of the organisation. Proposals attempting to redress this situation were offered and several accepted. The study was postured as process-action research and the chosen perspective could be succinctly described as a 'worm's-eye view', akin to that of many real planners operating at some distance from the decision-making body. In this way, the normal strategic functionings of the firm and any changes resulting from the researcher's intervention were observed and recorded. Recurrent difficulties of the planning process resulting from the decentralised structure were identified. The overall procedure suggested as a result of the research aimed to increase the viability of planning and the efficiency of the process. It is considered to be flexible enough to be applicable in a broader context.
Abstract:
Since 2005, European-listed companies have been required to prepare their consolidated financial statements in accordance with the International Financial Reporting Standards (IFRS). We examine whether value relevance increased following the introduction of IFRS, using a sample of 3,721 companies listed on five European stock exchanges: Frankfurt, Madrid, Paris, London, and Milan. We find mixed evidence of an increase in value relevance. However, the influence of earnings on share price increased following the introduction of IFRS in Germany, France, and the United Kingdom, while the influence of book value of equity decreased (except for the United Kingdom). © 2010 Blackwell Publishing Ltd.
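Value relevance in this literature is typically assessed with a price-level regression of the Ohlson type; a sketch of that standard specification follows (the data are synthetic and the column names hypothetical; the paper's exact model is not reproduced here):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic stand-in for a firm-year panel (entirely hypothetical).
n = 500
df = pd.DataFrame({
    "eps": rng.normal(2.0, 1.0, n),          # earnings per share
    "bvps": rng.normal(10.0, 3.0, n),        # book value per share
    "ifrs_period": rng.choice(["pre", "post"], n),
})
df["price"] = 3.0 + 4.0 * df["eps"] + 0.8 * df["bvps"] + rng.normal(0, 2, n)

for period, sub in df.groupby("ifrs_period"):
    X = sm.add_constant(sub[["eps", "bvps"]])
    fit = sm.OLS(sub["price"], X).fit()
    # Value relevance is read off R-squared; the eps/bvps coefficients
    # show each component's influence on share price.
    print(period, round(fit.rsquared, 3), fit.params.round(2).to_dict())
```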
Abstract:
Background: Carotenoids are not considered to be essential nutrients, but their antioxidant and photoprotective properties have prompted interest in their potential role in disease prevention. Our aim is to review the evidence in relation to ocular disease. Method: Web of Science and Medline via PubMed database search. Results: Lutein and zeaxanthin intake has been associated with a 22% reduced risk of cataract extraction in women (RR 0.78, p = 0.04), and a 19% lower risk of cataract in men (RR 0.80, p = 0.03). A randomised controlled trial (RCT) found a significant improvement in visual acuity in cataract patients supplemented with lutein. Two RCTs investigating the effect of β-carotene, in combination with other nutrients, on cataract report conflicting results. Several studies show no inverse association between cataract and β-carotene. Lutein and zeaxanthin are the only carotenoids found in the human macula. RCTs have found beneficial effects of both lutein and β-carotene supplementation, in combination with other antioxidants, on visual function in subjects affected by age-related macular disease. Evidence for a role of lutein in preventing deterioration of visual function in retinitis pigmentosa patients is conflicting. Conclusions: Further research into the role of lutein and zeaxanthin in prevention of onset and progression of ocular disease is warranted.
Abstract:
Numerous scientific disciplines suffer from a common epistemological ailment. They tend to generate impressive bodies of empirical knowledge that are otherwise disjointed. The key force that shapes this reality is the lack of organizing meta-frameworks that are capable of otherwise creating a consilient body of core knowledge. In the current paper, we seek to demonstrate the synthetic value of evolutionary theory across a wide range of neuro-business disciplines including neuroeconomics, neuromarketing, neuroentrepreneurship, and organizational neuroscience. Neuroscientists operating at the junction of the brain sciences and a wide range of business disciplines stand to benefit in recognizing that the minds of Homo consumericus, Homo corporaticus, and Homo economicus have been forged by Darwinian forces that have shaped all living organisms. A complete and accurate understanding of most neuro-business phenomena requires that these be tackled at both the proximate (i.e., how something operates) and ultimate (its adaptive function) levels.
Abstract:
Purpose - To provide an example of the use of system dynamics within the context of a discrete-event simulation study. Design/methodology/approach - A discrete-event simulation study of a production-planning facility in a gas cylinder-manufacturing plant is presented. The case study evidence incorporates questionnaire responses from sales managers involved in the order-scheduling process. Findings - As the project progressed it became clear that, although the discrete-event simulation would meet the objectives of the study in a technical sense, the organizational problem of "delivery performance" would not be solved by the discrete-event simulation study alone. The case shows how the qualitative outcomes of the discrete-event simulation study led to an analysis using the system dynamics technique. The system dynamics technique was able to model the decision-makers in the sales and production process and provide a deeper understanding of the performance of the system. Research limitations/implications - The case study describes a traditional discrete-event simulation study which incorporated an unplanned investigation using system dynamics. Further case studies that take a planned approach to considering organizational issues in discrete-event simulation studies are required. The role of qualitative data in a discrete-event simulation study, together with supplementary tools that incorporate organizational aspects, may then help generate a methodology for discrete-event simulation that incorporates human aspects and so improves its relevance for decision making. Practical implications - It is argued that system dynamics can provide a useful addition to the toolkit of the discrete-event simulation practitioner, helping them incorporate a human aspect in their analysis. Originality/value - Helps decision makers gain a broader perspective on the tools available to them by showing the use of system dynamics to supplement the use of discrete-event simulation. © Emerald Group Publishing Limited.
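A minimal stock-and-flow sketch of the kind of feedback a system dynamics model captures in such a case (the structure and numbers are hypothetical, loosely modelled on an order backlog driving delivery performance):

```python
# Euler-integrated stock-and-flow loop: the order backlog drives quoted
# lead time, which feeds back into the rate at which new orders arrive.
dt, weeks = 0.25, 52
backlog = 100.0   # stock: unfulfilled orders
capacity = 40.0   # cylinders shippable per week

for step in range(int(weeks / dt)):
    lead_time = backlog / capacity                 # quoted delivery lead time
    order_rate = max(0.0, 60.0 - 4.0 * lead_time)  # demand falls as quotes slip
    ship_rate = min(capacity, backlog / dt)        # cannot ship more than the backlog
    backlog += (order_rate - ship_rate) * dt       # integrate the stock

print(round(backlog, 1), round(lead_time, 2))
```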
Abstract:
This Dialog responds to a growing debate about the relevance of business schools generally and the value of strategy theory and research for strategic management practice. The authors propose that academic theory and management practice can be better connected through management education. The academy researches practice, derives theory, and returns it to practice through the development of teaching materials and the teaching of current and future practitioners. The three articles in this Dialog examine how different approaches to strategy research inform strategy teaching and its application to practice. Joseph Bower explains the rise of business policy and the process research approach that informed that teaching tradition at Harvard Business School. Robert Grant responds by emphasizing the economic theory underpinnings of strategic management research and its impact on teaching. Paula Jarzabkowski and Richard Whittington conclude by proposing a strategy-as-practice perspective and suggesting ways to better incorporate strategy-as-practice research into strategy teaching.
Abstract:
Phospholipid oxidation can generate reactive and electrophilic products that are capable of modifying proteins, especially at cysteine, lysine and histidine residues. Such lipoxidation reactions are known to alter protein structure and function, both with gain of function and loss of activity effects. As well as potential importance in the redox regulation of cell behaviour, lipoxidation products in plasma could also be useful biomarkers for stress conditions. Although studies with antibodies suggested the occurrence of lipoxidation adducts on ApoB-100, these products had not previously been characterized at a molecular level. We have developed new mass spectrometry-based approaches to detect and locate adducts of oxidized phospholipids in plasma proteins, as well as direct oxidation modifications of proteins, which avoid some of the problems typically encountered with database search engines that lead to erroneous identifications of oxidative PTMs. This approach uses accurate mass extracted ion chromatograms (XICs) of fragment ions from peptides containing oxPTMs, and allows multiple modifications to be examined regardless of the protein that contains them. For example, a reporter ion at m/z 184.074 corresponding to phosphocholine indicated the presence of oxidized phosphatidylcholine adducts, while two reporter ions at m/z 100.078 and 82.025 were selective for allysine. ApoB-100-oxidized phospholipid adducts were detected even in healthy human samples, as well as LDL from patients with inflammatory disease. Lipidomic studies showed that more than 350 different species of lipid were present in LDL, and were altered in disease conditions. LDL clearly represents a very complex carrier system and one that offers a rich source of information about systemic conditions, with potential as indicators of oxidative damage in ageing or inflammatory diseases.
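The reporter-ion screen described here reduces, in essence, to scanning fragment spectra for diagnostic m/z values within a tight accurate-mass tolerance; a schematic sketch (the data structures and tolerance are assumptions, not the authors' pipeline):

```python
# Diagnostic reporter ions named in the abstract: phosphocholine
# (oxidized PC adducts) and the two allysine-selective fragments.
REPORTERS = {"phosphocholine": [184.074], "allysine": [100.078, 82.025]}
PPM_TOL = 10.0  # assumed accurate-mass tolerance

def find_reporters(spectrum_mz, reporters=REPORTERS, ppm=PPM_TOL):
    """Return the modification classes a fragment spectrum supports."""
    hits = set()
    for name, targets in reporters.items():
        for target in targets:
            tol = target * ppm * 1e-6
            if any(abs(mz - target) <= tol for mz in spectrum_mz):
                hits.add(name)
    return hits

# Hypothetical MS2 peak list from an ApoB-100 tryptic peptide.
print(find_reporters([86.096, 184.0741, 300.161]))  # {'phosphocholine'}
```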