22 results for Adaptive resonance theory
in CentAUR: Central Archive University of Reading - UK
Abstract:
This study examines when “incremental” change is likely to trigger “discontinuous” change, using the lens of complex adaptive systems theory. Going beyond the simulations and case studies through which complex adaptive systems have been approached so far, we study the relationship between incremental organizational reconfigurations and discontinuous organizational restructurings using a large-scale database of U.S. Fortune 50 industrial corporations. We identify two types of escalation process in organizations: accumulation and perturbation. Under ordinary conditions, it is perturbation rather than accumulation that is more likely to trigger subsequent discontinuous change. Consistent with complex adaptive systems theory, organizations are more sensitive to both accumulation and perturbation in conditions of heightened disequilibrium. Contrary to expectations, highly interconnected organizations are not more liable to discontinuous change. We conclude with implications for further research, especially the need to attend to the potential role of managerial design and coping when transferring complex adaptive systems theory from natural systems to organizational systems.
Abstract:
Nowadays, the changing environment has become the main challenge for most organizations, since they must devise appropriate policies to adapt to it. In this paper, we propose a multi-agent simulation method for evaluating such policies, based on complex adaptive systems theory. Furthermore, we propose a semiotic EDA (Epistemic, Deontic, Axiological) agent model that simulates agents' behavior in the system by incorporating the social norms that reflect the policy. A case study is provided to validate our approach. Our method shows better adaptability and validity than qualitative analysis and experimental approaches, and the semiotic agent model provides high credibility in simulating agents' behavior.
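As a concrete illustration of the EDA idea, the sketch below (Python; the class name, norm/value encoding, and example actions are hypothetical, not taken from the paper) shows an agent that filters candidate actions through deontic norms derived from a policy and then ranks the survivors by axiological value, with its epistemic state held as a simple belief dictionary:

    class EDAAgent:
        """Minimal sketch of a semiotic EDA agent (hypothetical encoding)."""
        def __init__(self, beliefs, norms, values):
            self.beliefs = beliefs  # epistemic: what the agent currently holds true
            self.norms = norms      # deontic: action -> permitted? (policy-derived social norms)
            self.values = values    # axiological: action -> subjective worth

        def act(self, candidate_actions):
            # Forbidden actions are filtered out first; the agent then
            # chooses the permitted action it values most.
            permitted = [a for a in candidate_actions if self.norms.get(a, True)]
            return max(permitted, key=lambda a: self.values.get(a, 0.0), default=None)

    agent = EDAAgent(beliefs={"demand": "rising"},
                     norms={"dump_waste": False},
                     values={"invest": 0.8, "dump_waste": 0.9, "wait": 0.1})
    print(agent.act(["invest", "dump_waste", "wait"]))  # -> "invest"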
Abstract:
The Iowa gambling task (IGT) is one of the most influential behavioral paradigms in reward-related decision making and has been, most notably, associated with ventromedial prefrontal cortex function. However, performance in the IGT relies on a complex set of cognitive subprocesses, in particular integrating information about the outcome of choices into a continuously updated decision strategy under ambiguous conditions. The complexity of the task has made it difficult for neuroimaging studies to disentangle the underlying neurocognitive processes. In this study, we used functional magnetic resonance imaging in combination with a novel adaptation of the task, which allowed us to separately examine activation associated with the moment of decision and with the evaluation of decision outcomes. Importantly, using whole-brain regression analyses with individual performance, in combination with the choice/outcome history of individual subjects, we aimed to identify the neural overlap between areas that are involved in the evaluation of outcomes and in the progressive discrimination of the relative value of available choice options, thus mapping the two fundamental cognitive processes that lead to adaptive decision making. We show that activation in right ventromedial and dorsolateral prefrontal cortex was predictive of adaptive performance, in both discriminating disadvantageous from advantageous decisions and confirming negative decision outcomes. We propose that these two prefrontal areas mediate shifting away from disadvantageous choices through their sensitivity to accumulating negative outcomes. These findings provide functional evidence of the underlying processes by which these prefrontal subregions drive adaptive choice in the task, namely through contingency-sensitive outcome evaluation.
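For readers unfamiliar with the task's mechanics, the sketch below simulates the classic four-deck IGT payoff structure (the deck parameters are the commonly cited Bechara-style values, stated here as assumptions, not taken from this study) with a simple delta-rule learner whose softmax choices gradually shift away from the disadvantageous decks A and B:

    import math, random

    # Assumed Bechara-style payoffs: A/B win 100 but lose 25 per trial on
    # average (disadvantageous); C/D win 50 and gain 25 on average.
    DECKS = {
        "A": lambda: 100 - (250 if random.random() < 0.5 else 0),
        "B": lambda: 100 - (1250 if random.random() < 0.1 else 0),
        "C": lambda: 50 - (50 if random.random() < 0.5 else 0),
        "D": lambda: 50 - (250 if random.random() < 0.1 else 0),
    }

    def softmax_choice(q, tau=20.0):
        # Sample a deck with probability proportional to exp(value / tau).
        weights = {d: math.exp(v / tau) for d, v in q.items()}
        r = random.uniform(0, sum(weights.values()))
        for d, w in weights.items():
            r -= w
            if r <= 0:
                return d
        return d

    def simulate(trials=100, alpha=0.1):
        q = {d: 0.0 for d in DECKS}              # running value estimate per deck
        choices = []
        for _ in range(trials):
            d = softmax_choice(q)
            q[d] += alpha * (DECKS[d]() - q[d])  # delta-rule update from the outcome
            choices.append(d)
        return choices

    picks = simulate()
    print("good-deck rate, last 20 trials:",
          sum(c in "CD" for c in picks[-20:]) / 20)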
Abstract:
The calculation of accurate and reliable vibrational potential functions and normal co-ordinates is discussed for those simple polyatomic molecules for which it may be possible. Such calculations should be corrected for the effects of anharmonicity and of resonance interactions between the vibrational states, and should be fitted to all the available information on all isotopic species: particularly the vibrational frequencies, Coriolis zeta constants and centrifugal distortion constants. The difficulties of making these corrections, and of making use of the observed data, are reviewed. A programme for the Ferranti Mercury Computer is described by means of which harmonic vibration frequencies and normal co-ordinate vectors, zeta factors and centrifugal distortion constants can be calculated from a given force field and from given G-matrix elements, etc. The programme has been used on up to 5 × 5 secular equations, for which a single calculation and output of results takes approximately 1 min; it can readily be extended to larger determinants. The best methods of using such a programme, and the possibility of reversing the direction of calculation, are discussed. The methods are applied to calculating the best possible vibrational potential function for the methane molecule, making use of all the observed data.
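The core numerical step described here, solving the vibrational secular equation of the Wilson GF method, survives essentially unchanged today; the sketch below (Python/NumPy, with illustrative 2 × 2 matrices and assumed spectroscopic units, not the paper's data) diagonalizes the GF product to obtain harmonic frequencies and normal-coordinate vectors:

    import numpy as np

    # Illustrative symmetry block; the numbers are made up for demonstration.
    F = np.array([[5.0, 0.2],       # force constants, assumed mdyn/Angstrom
                  [0.2, 0.8]])
    G = np.array([[1.05, -0.10],    # inverse kinetic-energy matrix, assumed 1/amu
                  [-0.10, 2.10]])

    # Secular equation |GF - lambda*I| = 0: eigenvalues give the harmonic
    # frequencies, eigenvectors the normal-coordinate (L-matrix) vectors.
    lam, L = np.linalg.eig(G @ F)
    # With these units, nu(cm^-1) is approximately 1302.8 * sqrt(lambda).
    print(np.sort(1302.8 * np.sqrt(lam.real))[::-1])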
Abstract:
In clinical trials, situations often arise where more than one response from each patient is of interest; and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between the test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of correlation from data collected as part of the trial. An adaptive approach that makes use of these formulas is proposed and evaluated, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
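As a minimal illustration of the adaptive idea (the function below is a plug-in sketch, not the paper's exact formulas): under the simplifying assumption that both Z statistics are computed from the same patients' paired normal responses with known variances, the correlation between the efficacy and safety statistics equals the response correlation, so ρ can be re-estimated at each interim look from the accumulated pairs rather than fixed pessimistically in advance:

    import numpy as np

    def estimate_rho(efficacy, safety):
        """Sample correlation of the paired responses seen so far; under the
        simplifying assumptions above this is a plug-in estimate of the
        correlation between the two monitored test statistics."""
        return np.corrcoef(np.asarray(efficacy, float),
                           np.asarray(safety, float))[0, 1]

    # At each interim analysis, the estimate would replace the pessimistic
    # fixed rho when recomputing the joint stopping boundaries.
    rng = np.random.default_rng(1)
    x = rng.normal(size=40)
    y = 0.6 * x + rng.normal(scale=0.8, size=40)
    print(round(estimate_rho(x, y), 2))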
Abstract:
An adaptive tuned vibration absorber (ATVA) with a smart variable stiffness element is capable of retuning itself in response to a time-varying excitation frequency, enabling effective vibration control over a range of frequencies. This paper discusses novel methods of achieving variable stiffness in an ATVA by changing shape, as inspired by biological paradigms. It is shown that considerable variation in the tuned frequency can be achieved by actuating a shape change, provided that this is within the limits of the actuator. A feasible design for such an ATVA is one in which the device offers low resistance to the required shape-change actuation while not being restricted to low values of the effective stiffness of the vibration absorber. Three such original designs are identified: (i) a pinned-pinned arch beam with a fixed profile of slight curvature and variable preload through an adjustable natural curvature; (ii) a vibration absorber with a stiffness element formed from parallel curved beams of adjustable curvature vibrating longitudinally; (iii) a vibration absorber with a variable-geometry linkage as the stiffness element. The experimental results from demonstrators based on two of these designs show good correlation with the theory.
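The retuning principle at work is the undamped absorber relation f = sqrt(k/m)/(2π): changing shape changes the effective stiffness k, which moves the tuned frequency. A minimal sketch with invented numbers (not the paper's designs):

    import math

    def tuned_frequency(k, m):
        """Tuned frequency of an undamped mass-spring absorber, f = sqrt(k/m)/(2*pi)."""
        return math.sqrt(k / m) / (2 * math.pi)

    # Illustrative values (assumptions): quadrupling the effective stiffness
    # of a 0.5 kg absorber doubles its tuned frequency.
    for k in (10e3, 40e3):
        print(f"k = {k:8.0f} N/m -> f = {tuned_frequency(k, 0.5):5.1f} Hz")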
Abstract:
It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. ‘Emergent’ properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.
Abstract:
This paper presents in detail a theoretical adaptive model of thermal comfort based on the “Black Box” theory, taking into account factors such as culture, climate, and social, psychological and behavioural adaptations, which have an impact on the senses used to detect thermal comfort. The model is called the Adaptive Predicted Mean Vote (aPMV) model. The aPMV model explains, by applying the cybernetics concept, the phenomenon that the Predicted Mean Vote (PMV) is greater than the Actual Mean Vote (AMV) in free-running buildings, which has been revealed by many researchers in field studies. An adaptive coefficient (λ) representing the adaptive factors that affect the sense of thermal comfort is proposed. The empirical coefficients in warm and cool conditions for the Chongqing area in China have been derived by applying the least-squares method to the monitored on-site environmental data and the thermal comfort survey results.
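The closed form usually quoted for this model in the aPMV literature (stated here from memory, so treat the exact expression as an assumption to check against the paper) relates the adaptive and conventional votes through the coefficient λ as aPMV = PMV / (1 + λ·PMV):

    def apmv(pmv, lam):
        """Adaptive Predicted Mean Vote; lam is the least-squares-fitted
        adaptive coefficient (assumed here to be positive in warm and
        negative in cool conditions, as in the published fits)."""
        return pmv / (1.0 + lam * pmv)

    # Example: a conventional PMV of 1.5 with lam = 0.3 shrinks toward the
    # cooler actual vote reported in free-running buildings.
    print(round(apmv(1.5, 0.3), 2))  # ~1.03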
Abstract:
Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills for successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. Consistent with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potential adaptive role of vigilance for threat as a way of allocating a limited capacity to interpret others’ mental states so as to obtain as much information as possible about potential danger in the social environment.
Abstract:
A nonlinear general predictive controller (NLGPC) is described which is based on the use of a Hammerstein model within a recursive control algorithm. A key contribution of the paper is the use of a novel, simple one-step root-solving procedure for the Hammerstein model, this being a fundamental part of the overall tuning algorithm. A comparison is made between NLGPC and nonlinear deadbeat control (NLDBC) using the same one-step nonlinear components, in order to investigate the advantages and disadvantages of NLGPC.
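To make the Hammerstein structure concrete: the plant is modeled as a static nonlinearity x = f(u) feeding a linear dynamic block, so a linear predictive law can demand an intermediate signal x* and the controller recovers the physical input by a one-step root solve of f(u) = x*. The sketch below (Python/NumPy, with a made-up quadratic nonlinearity and an assumed root-selection rule, not the paper's procedure) shows only that inversion step:

    import numpy as np

    # Hypothetical static input nonlinearity x = c1*u + c2*u**2.
    c1, c2 = 1.0, 0.3

    def invert_nonlinearity(x_target):
        """Solve c2*u**2 + c1*u - x_target = 0 and keep the real root of
        smallest magnitude (an assumed selection rule for illustration)."""
        roots = np.roots([c2, c1, -x_target])
        real = roots[np.isreal(roots)].real
        return real[np.argmin(np.abs(real))] if real.size else None

    # If the linear predictive stage demands x* = 0.8, the applied input is ~0.67.
    print(round(float(invert_nonlinearity(0.8)), 3))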
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting these data to transform them into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.