37 results for Evolutionary Information Behaviour
in Aston University Research Archive
Abstract:
DNA-binding proteins are crucial for various cellular processes and hence have become an important target for both basic research and drug development. With the avalanche of protein sequences generated in the postgenomic age, it is highly desirable to establish an automated method for rapidly and accurately identifying DNA-binding proteins based on their sequence information alone. Because all biological species have evolved from a very limited number of ancestral species, it is important to take evolutionary information into account in developing such a high-throughput tool. In view of this, a new predictor was proposed by incorporating evolutionary information into the general form of pseudo amino acid composition via the top-n-gram approach. Comparison of the new predictor with existing methods via both the jackknife test and an independent data-set test showed that the new predictor outperformed its counterparts. It is anticipated that the new predictor may become a useful vehicle for identifying DNA-binding proteins. It has not escaped our notice that this novel approach to extracting evolutionary information into the formulation of statistical samples can be used to identify many other protein attributes as well.
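As a hedged illustration of the feature-vector idea behind such predictors, the sketch below computes Chou's general pseudo amino acid composition from a raw sequence using a single hydrophobicity scale; the predictor described in the abstract additionally folds in PSSM-derived top-n-gram frequencies, which require an external profile and are not reproduced here.

```python
# Minimal sketch of a pseudo amino acid composition (PseAAC) style feature
# vector built from the sequence alone. A single, unnormalised property scale
# is used for brevity; the actual predictor also uses evolutionary (PSSM)
# top-n-gram information, which is not shown.

# Kyte-Doolittle hydrophobicity (one illustrative property scale)
HYDRO = {
    'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
    'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
    'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
    'Y': -1.3, 'V': 4.2,
}
AA = sorted(HYDRO)


def pse_aac(seq, lam=5, weight=0.05):
    """Return a (20 + lam)-dimensional PseAAC-style vector for `seq`."""
    seq = [a for a in seq.upper() if a in HYDRO]
    # 1) amino acid composition
    comp = [seq.count(a) / len(seq) for a in AA]
    # 2) sequence-order correlation factors theta_1..theta_lam
    thetas = []
    for j in range(1, lam + 1):
        pairs = zip(seq[:-j], seq[j:])
        thetas.append(sum((HYDRO[a] - HYDRO[b]) ** 2 for a, b in pairs)
                      / (len(seq) - j))
    denom = 1.0 + weight * sum(thetas)
    return [c / denom for c in comp] + [weight * t / denom for t in thetas]


print(len(pse_aac("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")))  # -> 25
```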
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisation structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exclusive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even if customer demand and lead time are deterministic. This increase in the range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system the measures were intended to assess. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared to established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. This project makes its primary contribution to knowledge by measuring the impact, on supply chain dynamics, of applying a representative performance measurement system.
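The dynamics measure named in the abstract, the ratio of the standard deviation of upstream demand to that of downstream demand, is straightforward to compute; the sketch below uses invented weekly series purely for illustration.

```python
import statistics

def bullwhip_ratio(upstream_orders, downstream_demand):
    """Ratio of upstream order variability to downstream demand variability
    (a value > 1 indicates amplification up the chain)."""
    return (statistics.stdev(upstream_orders)
            / statistics.stdev(downstream_demand))

# Illustrative (made-up) weekly series for one supply chain echelon
retail_demand    = [100, 104, 98, 110, 95, 102, 107, 99]
wholesale_orders = [100, 112, 90, 125, 82, 108, 118, 94]
print(round(bullwhip_ratio(wholesale_orders, retail_demand), 2))
```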
Abstract:
This multi-modal investigation aimed to refine analytic tools including proton magnetic resonance spectroscopy (1H-MRS) and fatty acid gas chromatography-mass spectrometry (GC-MS) analysis, for use with adult and paediatric populations, to investigate potential biochemical underpinnings of cognition (Chapter 1). Essential fatty acids (EFAs) are vital for the normal development and function of neural cells. There is increasing evidence of behavioural impairments arising from dietary deprivation of EFAs and their long-chain fatty acid metabolites (Chapter 2). Paediatric liver disease was used as a deficiency model to examine the relationships between EFA status and cognitive outcomes. Age-appropriate Wechsler assessments measured Full-scale IQ (FSIQ) and Information Processing Speed (IPS) in clinical and healthy cohorts; GC-MS quantified surrogate markers of EFA status in erythrocyte membranes; and 1H-MRS quantified neurometabolite markers of neuronal viability and function in cortical tissue (Chapter 3). Post-transplant children with early-onset liver disease demonstrated specific deficits in IPS compared to age-matched acute liver failure transplant patients and sibling controls, suggesting that the time-course of the illness is a key factor (Chapter 4). No signs of EFA deficiency were observed in the clinical cohort, suggesting that EFA metabolism was not significantly impacted by liver disease. A strong, negative correlation was observed between omega-6 fatty acids and FSIQ, independent of disease diagnosis (Chapter 5). In a study of healthy adults, effect sizes for the relationship between 1H-MRS-detectable neurometabolites and cognition fell within the range of previous work, but were not statistically significant. Based on these findings, recommendations are made emphasising the need for hypothesis-driven enquiry and greater subtlety of data analysis (Chapter 6). Consistency of metabolite values between paediatric clinical cohorts and controls indicates normal neurodevelopment, but the lack of normative, age-matched data makes it difficult to assess the true strength of liver disease-associated metabolite changes (Chapter 7). Converging methods offer a challenging but promising and novel approach to exploring brain-behaviour relationships from micro- to macroscopic levels of analysis (Chapter 8).
Abstract:
This paper complements earlier work by the author that shows that the pattern of information arrivals into the UK stock market may explain the behaviour of returns. It is argued that delays or other systematic behaviour in the processing of this information could compound the impact of information arrival patterns. It is found, however, that this does not happen, and so it is the arrival and not the processing of news that is most important. © 2004 Taylor & Francis Ltd.
Abstract:
The relationship between parent-child interaction and child pedestrian behaviour was investigated by comparing parent-child communication to road-crossing behaviour. Forty-four children and their parents were observed carrying out a communication task (the Map Task), and were covertly filmed crossing roads around a university campus. The Map Task provided measures of task focus and sensitivity to another's current knowledge, which we predicted would be reflected in road-crossing behaviour. We modelled indices of road behaviour with factor scores derived from a principal-component analysis of communication features, and background variables including the age, sex and traffic experience of the child, and parental education. A number of variables were significantly related to road crossing, including the age and sex of the child, the length of the conversation, and specific conversational features such as the checking and clarification of uncertain information by both parent and child. The theoretical and practical implications of the findings are discussed.
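A hedged sketch of the analysis pipeline described above (principal-component scores from communication features entered into a regression on a road-crossing index) is given below; the feature names, data and library choices are illustrative stand-ins, not those of the study.

```python
# Hedged sketch: derive component scores from communication features and
# regress a road-crossing index on them plus background variables.
# All data are randomly generated for illustration only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 44                                   # sample size reported in the abstract
comm_features = rng.normal(size=(n, 6))  # e.g. checks, clarifications, length
background = rng.normal(size=(n, 3))     # e.g. age, sex, traffic experience
road_index = rng.normal(size=n)          # e.g. proportion of safe crossings

scores = PCA(n_components=2).fit_transform(comm_features)  # component scores
X = np.hstack([scores, background])
model = LinearRegression().fit(X, road_index)
print(model.coef_, model.score(X, road_index))
```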
Abstract:
The global and local synchronisation of a square lattice composed of alternating Duffing resonators and van der Pol oscillators coupled through displacement is studied. The lattice acts as a sensing device in which the input signal is characterised by an external driving force that is injected into the system through a subset of the Duffing resonators. The parameters of the system are taken from MEMS devices. The effects of the system parameters, the lattice architecture and size are discussed.
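For reference, the standard single-cell equations are shown below with an illustrative linear displacement coupling to nearest neighbours; the exact coupling scheme and the MEMS-derived parameter values used in the study are not reproduced here.

```latex
% Standard Duffing and van der Pol unit cells with an assumed linear
% displacement coupling (kappa) to the set of nearest neighbours N(.)
\begin{align}
  \ddot{x}_i + \gamma \dot{x}_i + \alpha x_i + \beta x_i^3
    &= F_i \cos(\omega t) + \kappa \sum_{j \in \mathcal{N}(i)} (y_j - x_i)
    && \text{(Duffing resonator)} \\
  \ddot{y}_j - \mu \left(1 - y_j^2\right) \dot{y}_j + y_j
    &= \kappa \sum_{i \in \mathcal{N}(j)} (x_i - y_j)
    && \text{(van der Pol oscillator)}
\end{align}
```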
Abstract:
This research compared decision-making processes in six Chinese state-owned enterprises during the period 1985 to 1988. The research objectives were: a) To examine changes in managerial behaviour over the period 1985 to 1988, with a focus on decision-making; b) Through this examination, to throw light on the means by which government policies on economic reform were implemented at the enterprise level; c) To illustrate problems encountered in the decentralization programme which was a major part of China's economic reform. The research was conducted by means of intensive interviews with more than eighty managers and a survey of documents relating to specific decisions. A total of sixty cases of decision-making were selected from five decision topics: purchasing of inputs, pricing of outputs, recruitment of labour, organizational change and innovation, which occurred in 1985 (or before) and in 1988/89. Data from the interviews were used to investigate environmental conditions, relations between the enterprise and its higher authority, interactions between management and the party system, the role of information, and the effectiveness of regulations and government policies on enterprise management. The analysis of the data indicates that the decision processes in the different enterprises have some similarities in regard to actor involvement, the flow of decision activities, interactions with the authorities, information usage and the effect of regulations. Comparison of the same or similar decision contents over time indicates that the achievement of decentralization varied according to the topic of decision. Managerial authority was delegated to enterprises when the authorities relaxed their control over resource allocation. When acquisition of necessary resources was dependent upon the planning system, or the decision matter was sensitive because it involved change to the institutional framework (e.g. the Party), a high degree of centralization was retained, resulting in only marginal change in managerial behaviour. The economic reform failed to increase the efficiency and effectiveness of decision-making. The prevailing institutional frameworks were regarded as obstacles to change. The research argues that the decision process is likely to be more contingent on the decision content than on the organization. Three types of decision process have been conceptualized, each of them related to a certain type of decision content. This argument gives attention to the perspectives of institution and power in a way which facilitates an elaboration of organizational analysis. The problems encountered in the reform of China's industrial enterprises are identified and discussed. General recommendations for policies of further reform are offered, based on the analysis of decision processes and managerial behaviour.
Abstract:
The work described in the following pages was carried out at various sites in the Rod Division of the Delta Metal Company. Extensive variation in the level of activity in the industry during the years 1974 to 1975 had led to certain inadequacies being observed in the traditional cost control procedure. In an attempt to remedy this situation it was suggested that a method be found of constructing a system to improve the flexibility of cost control procedures. The work involved an assimilation of the industrial and financial environment via pilot studies, which later proved invaluable in homing in on the really interesting and important areas. Weaknesses in the current systems which came to light made the methodology of data collection and the improvement of cost control and profit planning procedures easier to adopt. Because the project required an investigation of the implications of cost behaviour for profit planning and control, the next stage of the research work was to utilise the on-site experience to examine the nature of cost behaviour at a detailed level. The analysis of factory costs then showed that certain costs, which were the most significant, exhibited a stable relationship with respect to some known variable, usually a specific measure of output. These costs were then formulated in a cost model, to establish accurate standards in a complex industrial setting and so provide a meaningful comparison against which to judge actual performance. The necessity of a cost model was reinforced by the fact that the cost behaviour found to exist was, in the main, a step function, a complexity that the traditional cost and profit planning procedures could not possibly incorporate. Already implemented from this work is the establishment of the post of information officer to co-ordinate data collection and information provision.
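A minimal sketch of the step-function cost behaviour referred to above, contrasted with the linear cost a traditional flexible budget assumes, is given below; all figures are invented for illustration.

```python
# Hedged illustration: a stepped cost (e.g. crews added in blocks of capacity)
# versus the straight-line cost a conventional flexible budget assumes.
# Block sizes, rates and outputs are invented for illustration only.
import math

def stepped_cost(output, block_size=500, cost_per_block=8000):
    """Cost rises in discrete steps as each extra block of capacity is needed."""
    blocks = math.ceil(output / block_size) if output > 0 else 0
    return blocks * cost_per_block

def linear_cost(output, fixed=8000, variable_rate=16.0):
    """Traditional fixed-plus-variable approximation of the same cost."""
    return fixed + variable_rate * output

for q in (100, 499, 501, 1200):
    print(q, stepped_cost(q), round(linear_cost(q)))
```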
Abstract:
This thesis deals with the problem of Information Systems design for Corporate Management. It shows that the results of applying current approaches to Management Information Systems and Corporate Modelling fully justify a fresh look at the problem. The thesis develops an approach to design based on Cybernetic principles and theories. It looks at Management as an informational process and discusses the relevance of regulation theory to its practice. The work proceeds around the concept of change and its effects on the organization's stability and survival. The idea of looking at organizations as viable systems is discussed and a design to enhance survival capacity is developed. Taking Ashby's theory of adaptation and developments on ultra-stability as a theoretical framework, and considering conditions for learning and foresight, it deduces that a design should include three basic components: a dynamic model of the organization-environment relationships; a method to spot significant changes in the value of the essential variables and in a certain set of parameters; and a Controller able to conceive and change the other two elements and to make choices among alternative policies. Further consideration of the conditions for rapid adaptation in organisms composed of many parts, and of the law of Requisite Variety, determines that successful adaptive behaviour requires a certain functional organization. Beer's model of viable organizations is put in relation to Ashby's theory of adaptation and regulation. The use of the ultra-stable system as an abstract unit of analysis permits the development of a rigorous taxonomy of change; it starts by distinguishing between change within behaviour and change of behaviour, completing the classification with organizational change. It relates these changes to the logical categories of learning, connecting the topic of Information System design with that of organizational learning.
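A minimal sketch of the ultra-stability idea underpinning the design (monitor essential variables and, when they leave their viable limits, let a step-mechanism re-parameterise the regulator) is given below; the names, dynamics and values are illustrative, not taken from the thesis.

```python
# Hedged sketch of Ashby-style ultrastability: a first-order regulator keeps an
# essential variable near zero; when the variable leaves its viable limits, the
# step-mechanism re-parameterises the regulator at random and tries again.
import random

def ultrastable_run(disturbance, limits=(-5.0, 5.0), steps=200, seed=1):
    rng = random.Random(seed)
    gain = rng.uniform(-2.0, 2.0)      # parameter the step-mechanism may reset
    v = 0.0                            # essential variable
    resets = 0
    for t in range(steps):
        v += disturbance(t) - gain * v     # environment plus current regulation
        if not (limits[0] <= v <= limits[1]):
            gain = rng.uniform(-2.0, 2.0)  # step-change: try a new behaviour
            v = max(min(v, limits[1]), limits[0])
            resets += 1
    return gain, resets

gain, resets = ultrastable_run(lambda t: random.Random(t).uniform(-1, 1))
print(f"settled gain={gain:.2f} after {resets} parameter changes")
```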
Abstract:
Existing theories of semantic cognition propose models of cognitive processing occurring in a conceptual space, where ‘meaning’ is derived from the spatial relationships between concepts’ mapped locations within the space. Information visualisation is a growing area of research within the field of information retrieval, and methods for presenting database contents visually in the form of spatial data management systems (SDMSs) are being developed. This thesis combined these two areas of research to investigate the benefits of employing spatial-semantic mapping (documents represented as objects in two- and three-dimensional virtual environments are mapped proximally according to the semantic similarity of their content) as a tool for improving retrieval performance and navigational efficiency when browsing for information within such systems. Positive effects associated with the quality of document mapping were observed; improved retrieval performance and browsing behaviour were witnessed when mapping was optimal. It was also shown that using a third dimension for virtual environment (VE) presentation provides sufficient additional information regarding the semantic structure of the environment that performance is increased in comparison to using two dimensions for mapping. A model that describes the relationship between retrieval performance and browsing behaviour was proposed on the basis of these findings. Individual differences were not found to have any observable influence on retrieval performance or browsing behaviour when mapping quality was good. The findings from this work have implications both for cognitive modelling of semantic information and for designing and testing information visualisation systems. These implications are discussed in the conclusions of this work.
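A hedged sketch of spatial-semantic mapping is given below: TF-IDF vectors, cosine distances and multidimensional scaling stand in for whatever mapping procedure the thesis actually used, and the documents are invented.

```python
# Hedged sketch of spatial-semantic mapping: documents with similar content
# are placed close together in a low-dimensional layout (here 3-D).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_distances
from sklearn.manifold import MDS

docs = [
    "retrieval of documents in a virtual environment",
    "browsing behaviour in information visualisation systems",
    "semantic similarity between document contents",
    "three dimensional virtual environments for navigation",
]

dist = cosine_distances(TfidfVectorizer().fit_transform(docs))
coords_3d = MDS(n_components=3, dissimilarity="precomputed",
                random_state=0).fit_transform(dist)
print(coords_3d.round(2))
```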
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature. It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement.
Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
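The sensitivity/criterion distinction drawn above is the standard signal detection one; the worked example below computes d' and the criterion c from hit and false-alarm rates (the rates themselves are invented) and shows how the criterion can become stricter over a watch while sensitivity stays essentially constant.

```python
# Standard signal detection measures: d' = z(H) - z(F), c = -(z(H) + z(F)) / 2.
# The early/late hit and false-alarm rates are invented for illustration.
from statistics import NormalDist

z = NormalDist().inv_cdf

def d_prime(hit_rate, fa_rate):
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    return -0.5 * (z(hit_rate) + z(fa_rate))

# early vs late in the work period: hits and false alarms both fall
early = (0.80, 0.10)
late  = (0.60, 0.03)
for label, (h, f) in (("early", early), ("late", late)):
    print(label, "d' =", round(d_prime(h, f), 2),
          "c =", round(criterion(h, f), 2))
```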
Abstract:
In biaxial compression tests, the stress calculations based on boundary information underestimate the principal stresses, leading to a significant overestimation of the shear strength. In direct shear tests, the shear strain becomes highly concentrated in the mid-plane of the sample during the test. Although the stress distribution within the specimen is heterogeneous, the evolution of the stress ratio inside the shear band is similar to that inferred from the boundary force calculations. It is also demonstrated that the dilatancy in the shear band significantly exceeds that implied from the boundary displacements. In simple shear tests, the stresses acting on the wall boundaries do not reflect the internal state of stress but merely provide information about the average mobilised wall friction. It is demonstrated that the results are sensitive to the initial stress state defined by K0 = σh/σv. For all cases, non-coaxiality of the principal stress and strain-rate directions is examined and the corresponding flow rule is identified. Periodic cell simulations have been used to examine biaxial compression for a wide range of initial packing densities. Both constant volume and constant mean stress tests have been simulated. The characteristic behaviour at both the macroscopic and microscopic scales is determined by whether or not the system percolates (enduring connectivity is established in all directions). The transition from non-percolating to percolating systems is characterised by transitional behaviour of internal variables and corresponds to an elastic percolation threshold, which correlates well with the establishment of a mechanical coordination number of ca. 3.0. Strong correlations are found between macroscopic and internal variables at the critical state.
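One common definition of the mechanical coordination number, which discounts particles with fewer than two contacts because they carry no stress, is sketched below; the contact data are invented and the exact definition used in these simulations may differ.

```python
# Hedged sketch: mechanical coordination number Zm = (2C - N1) / (N - N0 - N1),
# where C is the number of contacts and N0, N1 count particles with zero or
# one contact (rattlers). Contact data are invented for illustration.
from collections import Counter

def mechanical_coordination_number(contacts, n_particles):
    """contacts: list of (i, j) particle-id pairs, each contact listed once."""
    per_particle = Counter()
    for i, j in contacts:
        per_particle[i] += 1
        per_particle[j] += 1
    n1 = sum(1 for c in per_particle.values() if c == 1)
    n0 = n_particles - len(per_particle)
    active = n_particles - n0 - n1
    return (2 * len(contacts) - n1) / active if active else 0.0

contacts = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)]
print(round(mechanical_coordination_number(contacts, n_particles=6), 2))
```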
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
This thesis examines children's consumer choice behaviour using an information processing perspective, with the fundamental goal of applying academic research to practical marketing and commercial problems. Following a preface, which describes the academic and commercial terms of reference within which this interdisciplinary study is couched, the thesis comprises four discernible parts. Initially, the rationale inherent in adopting an information processing perspective is justified and the diverse array of topics which have bearing on children's consumer processing and behaviour is aggregated. The second part uses this perspective as a springboard to appraise the little-explored role of memory, and especially memory structure, as a central cognitive component in children's consumer choice processing. The main research theme explores the ease with which 10- and 11-year-olds retrieve contemporary consumer information from subjectively defined memory organisations. Adopting a sort-recall paradigm, hierarchical retrieval processing is stimulated and it is contended that when two items, known to be stored proximally in the memory organisation, are not recalled adjacently, this discrepancy is indicative of retrieval processing ease. Results illustrate the marked influence of task conditions and orientation of memory structure on retrieval; these conclusions are accounted for in terms of input and integration failure. The third section develops the foregoing interpretations in the marketing context. A straightforward methodology for structuring marketing situations is postulated, a basis for segmenting children's markets using processing characteristics is adopted, and criteria for communicating brand support information to children are discussed. A taxonomy of market-induced processing conditions is developed. Finally, a case study with topical commercial significance is described. The development, launch and marketing of a new product in the confectionery market are outlined, the aetiology of its subsequent demise is identified and expounded, and prescriptive guidelines are put forward to help avert future repetition of marketing misjudgements.
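A hedged sketch of an adjacency measure for the sort-recall paradigm is given below: of the item pairs a child grouped together at sorting (taken as proximally stored), it scores the proportion recalled next to each other, the complement of the discrepancy described above. Item names are invented and the thesis's actual scoring may differ.

```python
# Hedged sketch: proportion of proximally stored item pairs (pairs sorted into
# the same group) that were recalled adjacently. Items are invented.
def adjacency_score(sorted_groups, recall_order):
    proximal_pairs = {frozenset((a, b))
                      for group in sorted_groups
                      for a in group for b in group if a != b}
    recalled_adjacent = {frozenset(pair)
                         for pair in zip(recall_order, recall_order[1:])}
    return len(proximal_pairs & recalled_adjacent) / len(proximal_pairs)

groups = [["crisps", "chocolate", "sweets"], ["comic", "stickers"]]
recall = ["chocolate", "sweets", "comic", "crisps", "stickers"]
print(round(adjacency_score(groups, recall), 2))
```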
Abstract:
This thesis proposes a novel graphical model for inference called the Affinity Network, which displays the closeness between pairs of variables and is an alternative to Bayesian Networks and Dependency Networks. The Affinity Network shares some similarities with Bayesian Networks and Dependency Networks but avoids their heuristic and stochastic graph construction algorithms by using a message passing scheme. A comparison with the above two instances of graphical models is given for sparse discrete and continuous medical data and data taken from the UCI machine learning repository. The experimental study reveals that the Affinity Network graphs tend to be more accurate, on the basis of an exhaustive search, with the small datasets. Moreover, the graph construction algorithm is faster than the other two methods with large datasets. The Affinity Network is also applied to data produced by a synchronised system. A detailed analysis and numerical investigation into this dynamical system is provided and it is shown that the Affinity Network can be used to characterise its emergent behaviour even in the presence of noise.
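Purely as an illustration of building a graph from pairwise closeness between variables, and not as the Affinity Network's message-passing construction, the sketch below scores discrete variable pairs by mutual information and ranks the candidate edges.

```python
# Illustration only: score pairwise "closeness" between discrete variables with
# mutual information and rank the strongest links as candidate graph edges.
# This is NOT the thesis's message-passing algorithm, just a simple stand-in
# for the idea of a closeness-based graph.
import math
from collections import Counter
from itertools import combinations

def mutual_information(x, y):
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

data = {  # three toy binary variables
    "A": [0, 0, 1, 1, 0, 1, 1, 0],
    "B": [0, 0, 1, 1, 0, 1, 0, 0],   # closely tracks A
    "C": [1, 0, 0, 1, 1, 0, 1, 0],   # roughly unrelated
}
edges = [(u, v, mutual_information(data[u], data[v]))
         for u, v in combinations(data, 2)]
for u, v, w in sorted(edges, key=lambda e: -e[2]):
    print(f"{u}-{v}: {w:.3f}")
```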