80 results for Auditing of computer systems
Abstract:
A framework that connects computational mechanics and molecular dynamics has been developed and described. As the key parts of the framework, the problem of symbolising the molecular trajectory and the associated interrelation between microscopic phase space variables and macroscopic observables of the molecular system are considered. Following Shalizi and Moore, it is shown that causal states, the constituent parts of the main construct of computational mechanics, the ε-machine, define areas of the phase space that are optimal in the sense of transferring information from the micro-variables to the macro-observables. We have demonstrated that, based on the decay of their Poincaré return times, these areas can be divided into two classes that characterise the separation of the phase space into resonant and chaotic areas. The first class is characterised by predominantly short return times, typical of quasi-periodic or periodic trajectories. This class includes a countable number of areas corresponding to resonances. The second class includes trajectories with chaotic behaviour characterised by the exponential decay of return times in accordance with the Poincaré theorem.
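The two return-time classes can be illustrated with a toy sketch (not the paper's actual MD pipeline): collecting Poincaré return times to one coarse-grained cell along a chaotic trajectory. The logistic map and the cell bounds are illustrative assumptions standing in for a molecular phase-space trajectory.

```python
def return_times(trajectory, in_cell):
    """Collect Poincare return times: gaps between successive visits to a cell."""
    times, last = [], None
    for t, x in enumerate(trajectory):
        if in_cell(x):
            if last is not None:
                times.append(t - last)
            last = t
    return times

# Chaotic stand-in for a molecular trajectory: the logistic map at r = 4
# (illustrative; the paper works with MD phase-space trajectories).
x, traj = 0.3, []
for _ in range(200000):
    x = 4.0 * x * (1.0 - x)
    traj.append(x)

# Returns to one coarse-grained cell; in a mixing (chaotic) region the
# return-time distribution decays exponentially, and the mean return time
# is roughly the inverse of the cell's invariant measure.
times = return_times(traj, lambda v: 0.40 <= v < 0.45)
mean_rt = sum(times) / len(times)
```

A resonant (quasi-periodic) region would instead show a distribution dominated by a few short return times rather than an exponential tail.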
Abstract:
Since publication of the first edition, huge developments have taken place in sensory biology research, and new insights have been provided in particular by molecular biology. These show the similarities in the molecular architecture and in the physiology of sensory cells across species and across sensory modalities, and often indicate a common ancestry dating back over half a billion years. Biology of Sensory Systems has thus been completely revised and takes a molecular, evolutionary and comparative approach, providing an overview of sensory systems in vertebrates, invertebrates and prokaryotes, with a strong focus on human senses. Written by a renowned author with extensive teaching experience, the book covers, in six parts, the general features of sensory systems, the mechanosenses, the chemosenses, the senses which detect electromagnetic radiation, other sensory systems including pain, thermosensitivity and some of the minority senses, and, finally, provides an outline and discussion of philosophical implications. New in this edition:
- Greater emphasis on molecular biology and intracellular mechanisms
- New chapter on genomics and sensory systems
- Sections on TRP channels, synaptic transmission, evolution of nervous systems, arachnid mechanosensitive sensilla and photoreceptors, electroreception in the Monotremata, language and the FOXP2 gene, mirror neurons and the molecular biology of pain
- Updated passages on human olfaction and gustation.
Over four hundred illustrations, boxes containing supplementary material and self-assessment questions, and a full bibliography at the end of each part make Biology of Sensory Systems essential reading for undergraduate students of biology, zoology, animal physiology, neuroscience, anatomy and physiological psychology. The book is also suitable for postgraduate students in more specialised courses such as vision sciences, optometry, neurophysiology, neuropathology and developmental biology.
Abstract:
We introduce models of heterogeneous systems with finite connectivity defined on random graphs to capture finite-coordination effects on the low-temperature behaviour of finite-dimensional systems. Our models use a description in terms of small deviations of particle coordinates from a set of reference positions, particularly appropriate for the description of low-temperature phenomena. A Born-von Kármán-type expansion with random coefficients is used to model effects of frozen heterogeneities. The key quantity appearing in the theoretical description is a full distribution of effective single-site potentials which needs to be determined self-consistently. If microscopic interactions are harmonic, the effective single-site potentials turn out to be harmonic as well, and the distribution of these single-site potentials is equivalent to a distribution of localization lengths used earlier in the description of chemical gels. For structural glasses characterized by frustration and anharmonicities in the microscopic interactions, the distribution of single-site potentials involves anharmonicities of all orders, and both single-well and double-well potentials are observed, the latter with a broad spectrum of barrier heights. The appearance of glassy phases at low temperatures is marked by the appearance of asymmetries in the distribution of single-site potentials, as previously observed for fully connected systems. Double-well potentials with a broad spectrum of barrier heights and asymmetries would give rise to the well-known universal glassy low-temperature anomalies when quantum effects are taken into account. © 2007 IOP Publishing Ltd.
Abstract:
A comprehensive and highly illustrated text providing a broad and invaluable overview of sensory systems at the molecular, cellular and neurophysiological level in vertebrates, invertebrates and prokaryotes. It retains a strong focus on human systems, and takes an evolutionary and comparative approach to review the mechanosenses, chemosenses, photosenses and other sensory systems, including those for detecting pain, temperature, and electric and magnetic fields. It incorporates exciting and significant new insights provided by molecular biology which demonstrate how similar the molecular architecture and physiology of sensory cells are across species and across sensory modalities, often indicating a common ancestry dating back over half a billion years. Written by a renowned author with extensive teaching experience in the biology of sensory systems, this book includes:
- Over 400 illustrations
- Self-assessment questions
- Full bibliography preceded by short bibliographical essays
- Boxes containing useful supplementary material.
It will be invaluable for undergraduates and postgraduates studying biology, zoology, animal physiology, neuroscience, anatomy, molecular biology, physiological psychology and related courses.
Abstract:
There is an increasing emphasis on the use of software to control safety critical plants for a wide area of applications. The importance of ensuring the correct operation of such potentially hazardous systems points to an emphasis on the verification of the system relative to a suitably secure specification. However, the process of verification is often made more complex by the concurrency and real-time considerations which are inherent in many applications. A response to this is the use of formal methods for the specification and verification of safety critical control systems. These provide a mathematical representation of a system which permits reasoning about its properties. This thesis investigates the use of the formal method Communicating Sequential Processes (CSP) for the verification of a safety critical control application. CSP is a discrete event based process algebra which has a compositional axiomatic semantics that supports verification by formal proof. The application is an industrial case study which concerns the concurrent control of a real-time high speed mechanism. It is seen from the case study that the axiomatic verification method employed is complex. It requires the user to have a relatively comprehensive understanding of the nature of the proof system and the application. By making a series of observations the thesis notes that CSP possesses the scope to support a more procedural approach to verification in the form of testing. This thesis investigates the technique of testing and proposes the method of Ideal Test Sets. By exploiting the underlying structure of the CSP semantic model it is shown that for certain processes and specifications the obligation of verification can be reduced to that of testing the specification over a finite subset of the behaviours of the process.
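The idea of reducing a verification obligation to testing over a finite subset of behaviours can be sketched informally. This is an illustrative toy, not the thesis's Ideal Test Sets method or CSP's actual semantics: a process is unfolded into a finite, prefix-closed set of traces, and a safety specification is checked over that set. All names here are invented for the example.

```python
def traces(process, start, depth):
    """Enumerate the prefix-closed set of traces of a process given as a
    dict mapping state -> list of (event, next_state), up to a given depth."""
    out = {()}
    frontier = [((), start)]
    for _ in range(depth):
        nxt = []
        for trace, state in frontier:
            for event, succ in process.get(state, []):
                t = trace + (event,)
                out.add(t)
                nxt.append((t, succ))
        frontier = nxt
    return out

# Toy controller for a hypothetical high-speed mechanism: it must never
# emit 'fire' before 'arm' (the safety specification).
controller = {
    'S0': [('arm', 'S1')],
    'S1': [('fire', 'S0'), ('disarm', 'S0')],
}

def spec(trace):
    """Safety predicate: every 'fire' must occur while armed."""
    armed = False
    for e in trace:
        if e == 'fire' and not armed:
            return False
        armed = (e == 'arm') or (armed and e not in ('disarm', 'fire'))
    return True

test_set = traces(controller, 'S0', 4)   # finite subset of behaviours
verified = all(spec(t) for t in test_set)
```

For suitably structured processes and specifications, checking such a finite test set can stand in for a full proof, which is the reduction the thesis formalises within CSP's semantic model.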
Abstract:
This thesis is concerned with the measurement of the characteristics of nonlinear systems by crosscorrelation, using pseudorandom input signals based on m sequences. The systems are characterised by Volterra series, and analytical expressions relating the rth order Volterra kernel to r-dimensional crosscorrelation measurements are derived. It is shown that the two-dimensional crosscorrelation measurements are related to the corresponding second order kernel values by a set of equations which may be structured into a number of independent subsets. The m sequence properties determine how the maximum order of the subsets for off-diagonal values is related to the upper bound of the arguments for nonzero kernel values. The upper bound of the arguments is used as a performance index, and the performance of antisymmetric pseudorandom binary, ternary and quinary signals is investigated. The performance indices obtained above are small in relation to the periods of the corresponding signals. To achieve higher performance with ternary signals, a method is proposed for combining the estimates of the second order kernel values so that the effects of some of the undesirable nonzero values in the fourth order autocorrelation function of the input signal are removed. The identification of the dynamics of two-input, single-output systems with multiplicative nonlinearity is investigated. It is shown that the characteristics of such a system may be determined by crosscorrelation experiments using phase-shifted versions of a common signal as inputs. The effects of nonlinearities on the estimates of system weighting functions obtained by crosscorrelation are also investigated. Results obtained by correlation testing of an industrial process are presented, and the differences between theoretical and experimental results are discussed for this case.
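The basic two-dimensional crosscorrelation estimate of an off-diagonal second-order kernel can be sketched as follows. For simplicity this uses white binary noise as a stand-in for an m-sequence (so it avoids precisely the autocorrelation complications the thesis addresses); the kernel values and memory length are illustrative.

```python
import random

random.seed(1)
N, M = 200000, 3                      # samples, system memory length

# Ground-truth kernels of a toy second-order Volterra system (h2 symmetric,
# off-diagonal entries only).
h1 = [1.0, 0.5, 0.25]
h2 = {(0, 1): 0.4, (0, 2): -0.2, (1, 2): 0.1}

# White +/-1 input as a simplified stand-in for a pseudorandom binary signal.
x = [random.choice((-1, 1)) for _ in range(N)]

def y(t):
    """Output of the Volterra system at time t."""
    out = sum(h1[a] * x[t - a] for a in range(M))
    for (a, b), k in h2.items():
        out += 2 * k * x[t - a] * x[t - b]   # symmetric kernel: (a,b) and (b,a)
    return out

def h2_est(t1, t2):
    """Two-dimensional crosscorrelation estimate of h2(t1, t2), t1 != t2.
    For white +/-1 input, E[y(t) x(t-t1) x(t-t2)] = 2 h2(t1, t2)."""
    s = sum(y(t) * x[t - t1] * x[t - t2] for t in range(M, N))
    return s / (2 * (N - M))
```

With a true m-sequence input, the nonzero higher-order autocorrelations couple these equations into the subsets described above, which is where the structure exploited in the thesis comes in.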
Abstract:
The current study aimed to exploit the electrostatic associative interaction between carrageenan and gelatin to optimise a formulation of lyophilised orally disintegrating tablets (ODTs) suitable for multiparticulate delivery. A central composite face-centred (CCF) design was applied to study the influence of formulation variables (gelatin, carrageenan and alanine concentrations) on the crucial responses of the formulation (disintegration time, hardness, viscosity and pH). The disintegration time and viscosity were controlled by the associative interaction between gelatin and carrageenan upon hydration, which forms a strong complex that increases the viscosity of the stock solution and forms tablets with higher resistance to disintegration in aqueous medium. Therefore, the levels of carrageenan, gelatin and their interaction in the formulation were the significant factors. In terms of hardness, increasing the gelatin and alanine concentrations was the most effective way to improve tablet hardness. Accordingly, optimum concentrations of these excipients were needed to find the best balance that fulfilled all formulation requirements. The revised model showed a high degree of predictability and optimisation reliability, and was therefore successful in developing an ODT formulation with optimised properties that was able to deliver enteric-coated multiparticulates of omeprazole without compromising their functionality.
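The run layout of a CCF design of this kind is easy to generate; a minimal sketch follows, in coded units, with the number of centre points chosen arbitrarily for illustration (the study's actual run count and factor ranges are not given here).

```python
from itertools import product

def ccf_design(k, n_center=3):
    """Central composite face-centred (CCF) design in coded units for k
    factors: 2^k factorial corners, 2k axial points on the cube faces
    (alpha = 1, hence 'face-centred'), plus replicated centre points."""
    factorial = [list(p) for p in product((-1, 1), repeat=k)]
    axial = []
    for i in range(k):
        for level in (-1, 1):
            pt = [0] * k
            pt[i] = level
            axial.append(pt)
    center = [[0] * k for _ in range(n_center)]
    return factorial + axial + center

# Three formulation factors, e.g. gelatin, carrageenan and alanine
# concentrations in coded (-1, 0, +1) units.
design = ccf_design(3)   # 8 + 6 + 3 = 17 runs
```

Because alpha = 1, all points stay within the original factor ranges, which is why face-centred designs are popular when extreme excipient levels are infeasible.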
Computational mechanics reveals nanosecond time correlations in molecular dynamics of liquid systems
Abstract:
Statistical complexity, a measure introduced in computational mechanics, has been applied to MD-simulated liquid water and other molecular systems. It has been found that statistical complexity does not converge in these systems but grows logarithmically without limit. The coefficient of this growth has been introduced as a new molecular parameter which is invariant for a given liquid system. Using this new parameter, extremely long time correlations in the system, undetectable by traditional methods, are elucidated. The existence of hundreds-of-picoseconds and even nanosecond-long correlations in bulk water has been demonstrated. © 2008 Elsevier B.V. All rights reserved.
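A crude illustration of the underlying idea, block statistics of a symbolised trajectory (not the paper's actual statistical-complexity estimator): the entropy of length-L words of a symbol stream grows linearly in L for a memoryless source, and departures from that growth signal time correlations. The biased two-symbol source here is an illustrative stand-in for a coarse-grained MD trajectory.

```python
import math
import random
from collections import Counter

def block_entropy(symbols, L):
    """Shannon entropy (bits) of the empirical distribution of length-L words."""
    words = Counter(tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1))
    n = sum(words.values())
    return -sum(c / n * math.log2(c / n) for c in words.values())

# Toy symbolised trajectory: a memoryless biased two-symbol source.
random.seed(2)
s = [1 if random.random() < 0.3 else 0 for _ in range(100000)]

# For this memoryless source H(L) is close to L * H(1); a correlated stream
# would fall below that line, and the way such quantities scale with the
# observation window is what the complexity-growth coefficient captures.
h = [block_entropy(s, L) for L in (1, 2, 4, 8)]
```

Estimating the statistical complexity proper requires reconstructing causal states (the ε-machine) from such word statistics, which is the machinery the paper applies.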
Abstract:
Information systems are corporate resources; therefore information systems development must be aligned with corporate strategy. This thesis proposes that effective strategic alignment of information systems requires information systems development, information systems planning and strategic management to be united. Literature in these areas is examined, breaching the academic boundaries which separate them, to contribute a synthesised approach to the strategic alignment of information systems development. Previous work in information systems planning has extended information systems development techniques, such as data modelling, into strategic planning activities, neglecting the techniques of strategic management. Examination of strategic management in this thesis identifies parallel trends in strategic management and information systems development; the premises of the learning school of strategic management are similar to those of soft systems approaches to information systems development. It is therefore proposed that strategic management can be supported by a soft systems approach. Strategic management tools and techniques frame individual views of a strategic situation; soft systems approaches can integrate these diverse views to explore the internal and external environments of an organisation. The information derived from strategic analysis justifies the need for an information system and provides a starting point for information systems development. This is demonstrated by a composite framework which enables each information system to be justified according to its direct contribution to corporate strategy. The proposed framework was developed through action research conducted in a number of organisations of varying types. This suggests that the framework can be widely used to support the strategic alignment of information systems development, thereby contributing to organisational success.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY WITH PRIOR ARRANGEMENT
Abstract:
This thesis describes a project which has investigated the evaluation of information systems. The work took place in, and is related to, a specific organisational context, that of the National Health Service (NHS). It aims to increase understanding of the evaluation which takes place in the service and the way in which this is affected by the NHS environment. It also investigates the issues which surround some important types of evaluation and their use in this context. The first stage of the project was a postal survey in which respondents were asked to describe the evaluation which took place in their authorities and to give their opinions about it. This was used to give an overview of the practice of IS evaluation in the NHS and to identify its uses and the problems experienced. Three important types of evaluation were then examined in more detail by means of action research studies. One of these dealt with the selection and purchase of a large hospital information system. The study took the form of an evaluation of the procurement process, and examined the methods used and the influence of organisational factors. The other studies are concerned with post-implementation evaluation, and examine the choice of an evaluation approach as well as its application. One was an evaluation of a community health system which had been operational for some time but was of doubtful value, and suffered from a number of problems. The situation was explored by means of a study of the costs and benefits of the system. The remaining study was the initial review of a system which was used in the administration of a Breast Screening Service. The service itself was also newly operational and the relationship between the service and the system was of interest.
Abstract:
This dissertation studies the process of operations systems design within the context of the manufacturing organization. Using the DRAMA (Design Routine for Adopting Modular Assembly) model as developed by a team from the IDOM Research Unit at Aston University as a starting point, the research employed empirically based fieldwork and a survey to investigate the process of production systems design and implementation within four UK manufacturing industries: electronics assembly, electrical engineering, mechanical engineering and carpet manufacturing. The intention was to validate the basic DRAMA model as a framework for research enquiry within the electronics industry, where the initial IDOM work was conducted, and then to test its generic applicability, further developing the model where appropriate, within the other industries selected. The thesis contains a review of production systems design theory and practice prior to presenting thirteen industrial case studies of production systems design from the four industry sectors. The results and analysis of the postal survey into production systems design are then presented. The strategic decisions of manufacturing and their relationship to production systems design, and the detailed process of production systems design and operation are then discussed. These analyses are used to develop the generic model of production systems design entitled DRAMA II (Decision Rules for Analysing Manufacturing Activities). The model contains three main constituent parts: the basic DRAMA model, the extended DRAMA II model showing the imperatives and relationships within the design process, and a benchmark generic approach for the design and analysis of each component in the design process. DRAMA II is primarily intended for use by researchers as an analytical framework of enquiry, but is also seen as having application for manufacturing practitioners.
Abstract:
The research described in this thesis investigates three issues related to the use of expert systems for decision making in organizations: the effectiveness of ESs when used in different roles, to replace a human decision maker or to advise a human decision maker; the users' behaviour and opinions towards using an expert advisory system; and the possibility of organization-wide deployment of expert systems and the role of an ES at different organizational levels. The research was based on the development of expert systems within a business game environment, a simulation of a manufacturing company. This was chosen to give more control over the `experiments' than would be possible in a real organization. An expert system (EXGAME) was developed, based on a structure derived from Anthony's three levels of decision making, to manage the simulated company in the business game itself with little user intervention. On the basis of EXGAME, an expert advisory system (ADGAME) was built to help game players to make better decisions in managing the game company. EXGAME and ADGAME are thus two expert systems in the same domain performing different roles; it was found that ADGAME had, in places, to be different from EXGAME, not simply an extension of it. EXGAME was tested several times against human rivals and was evaluated by measuring its performance. ADGAME was also tested by different users and was assessed by measuring the users' performance and analysing their opinions towards it as a helpful decision-making aid. The results showed that an expert system was able to replace a human at the operational level, but had difficulty at the strategic level. They also showed the success of the organization-wide deployment of expert systems in this simulated company.
Abstract:
This thesis investigates how people select items from a computer display using the mouse input device. The term computer mouse refers to a class of input devices which share certain features but may have different characteristics which influence the ways in which people use the device. Although task completion time is one of the most commonly used performance measures for input device evaluation, there is no consensus as to its definition. Furthermore, most mouse studies fail to provide adequate assurances regarding its correct measurement. Therefore, precise and accurate timing software was developed which permitted the recording of movement data that, by means of automated analysis, yielded the device movements made. Input system gain, an important task parameter, has been poorly defined and misconceptualized in most previous studies. The issue of gain has been clarified and investigated within this thesis. Movement characteristics varied between users and within users, even for the same task conditions. The variables of target size, movement amplitude and experience exerted significant effects on performance. Subjects consistently undershot the target area. This may be a consequence of the particular task demands. Although task completion times indicated that mouse performance had stabilized after 132 trials, the movement traces, even of very experienced users, indicated that there was still considerable room for improvement in performance, as indicated by the proportion of poorly made movements. The mouse input device was suitable for older novice device users, but they took longer to complete the experimental trials. Given the diversity and inconsistency of device movements, even for the same task conditions, caution is urged when interpreting averaged grouped data.
Performance was found to be sensitive to task conditions, device implementations, and experience in ways which are problematic for theoretical descriptions of device movement, and this limits the generalizability of such findings within this thesis.
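Two of the quantities at issue, input gain and pointing-task difficulty, have compact textbook definitions that can be stated directly. These are common conventions, not necessarily the definitions ultimately adopted in the thesis:

```python
import math

def pointer_gain(display_disp_mm, device_disp_mm):
    """Input system gain: ratio of cursor displacement on the display to
    the physical displacement of the device (one common, if contested,
    definition of gain)."""
    return display_disp_mm / device_disp_mm

def index_of_difficulty(amplitude, width):
    """Fitts' index of difficulty, log2(2A/W), for a movement of
    amplitude A to a target of width W (same units)."""
    return math.log2(2 * amplitude / width)

# A 120 mm cursor movement produced by a 40 mm mouse movement: gain of 3.
g = pointer_gain(120, 40)
# Selecting a 10 mm target 160 mm away: ID = log2(32) = 5 bits.
idx = index_of_difficulty(160, 10)
```

Part of the confusion the thesis notes is that gain defined this way depends on how displacement is measured (and on any pointer acceleration), which is why nominal gain settings are hard to compare across devices.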