41 results for Complex biological systems

in Aston University Research Archive


Relevance:

100.00%

Abstract:

The development of strategy remains a debate for academics and a concern for practitioners. Published research has focused on producing models for strategy development and on studying how strategy is developed in organisations. The Operational Research literature has highlighted the importance of considering complexity within strategic decision making, but little has been done to link strategy development with complexity theories, despite organisations and their environments becoming increasingly complex. We review the dominant streams of strategy development and complexity theories. Our theoretical investigation results in the first conceptual framework to link an established Strategic Operational Research model, the Strategy Development Process model, with complexity via Complex Adaptive Systems theory. We present preliminary findings from applying this conceptual framework to a longitudinal, in-depth case study, to demonstrate the advantages of the integrated model. Our research shows that the proposed conceptual model provides rich data and allows for a more holistic examination of the strategy development process. © 2012 Operational Research Society Ltd. All rights reserved.

Relevance:

100.00%

Abstract:

The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), the latter as part of complexity theory focused on human social organisations. The literature reviewed showed a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic, and the researcher's circumstances, required a research design that was both in-depth and long term. The result was a single, exploratory case study, which drew on data from 44 in-depth, semi-structured interviews with 36 people, including all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005–2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has the potential to extend strategy development process theory. The research has shown how and why the strategy process developed in the case study organisation, by providing deeper insights into the behaviour of the people, their backgrounds, and their interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the conceptual framework's application and to improve its efficacy with more organisations across a variety of sectors.

Relevance:

100.00%

Abstract:

In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur) over time. A relational model of classrooms is proposed which focuses on the relations between different elements (physical, environmental, cognitive, social) in the classroom and on how their interaction is crucial to understanding and describing classroom action.

Relevance:

100.00%

Abstract:

Society depends on complex IT systems created by integrating and orchestrating independently managed systems. The enormous increase in their scale and complexity over the past decade means new software-engineering techniques are needed to help us cope with their inherent complexity. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness of related issues in the software engineering community, the most relevant background work comes from systems engineering. The interacting algorithms that led to the Flash Crash are an example of a coalition of systems, serving the purposes of their owners and cooperating only because they have to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.

Relevance:

100.00%

Abstract:

This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. It addresses both theoretical and practical signal processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. This topic is driven by a series of problems in neuroscience, which represent the principal motivation behind this work. The underlying system is the human brain, and the generative process of the data is based on modern electromagnetic neuroimaging methods. In this thesis, the brain's underlying functional mechanisms are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the brain's complex hierarchical and multiscale nature, and it offers new methods of analysis with which to model its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerning the way the neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; and 3) functional connectivity, an epistemic concept referring to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel functional connectivity algorithms, designed to extract different specific aspects of interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series.
This approach is useful for inferring coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and the inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced to detect the direction of coupling between different regions of the brain. The method developed in this thesis is based on a machine-learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in important neurological disorders such as epilepsy and Parkinson's disease. The functional connectivity methods are applied to intracranial EEG recordings to extract features that characterise the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects, with the aim of distinguishing pathological from physiological connectivity patterns.
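As a concrete illustration of the phase-synchronisation methods mentioned above, the sketch below computes the phase-locking value (PLV), a widely used Hilbert-transform-based index of phase coupling — not necessarily the thesis's exact formulation, and with all signal parameters invented for illustration:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV: mean resultant length of the instantaneous phase difference."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

t = np.linspace(0, 1, 1000, endpoint=False)
a = np.sin(2 * np.pi * 10 * t)           # 10 Hz oscillation
b = np.sin(2 * np.pi * 10 * t + 0.5)     # same rhythm with a fixed phase lag
noise = np.random.default_rng(0).standard_normal(t.size)

plv_coupled = phase_locking_value(a, b)        # close to 1: strong locking
plv_uncoupled = phase_locking_value(a, noise)  # much lower
```

A PLV near 1 indicates a stable phase relation; note that, as the abstract warns, volume conduction can also produce high values, which is why the choice among phase measures matters.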

Relevance:

90.00%

Abstract:

The Systems Engineering Group (SEG) at De Montfort University are developing the Boardman Soft Systems Methodology (BSSM), which allows complex human systems to be modelled; this work builds upon Checkland's Soft Systems Methodology (1981). The BSSM has been applied to the modelling of the systems engineering process as used in design and manufacturing companies. The BSSM is used to solicit information from a company, and this data is then transformed into systemic diagrams (systemigrams). These systemigrams are posited to be accurate and concise representations of the system which has been modelled. This paper describes the collaboration between SEG and a manufacturing company (MC) in Leicester, England. The purpose of this collaboration was twofold. First, it was to create an objective view of the MC's processes, in the form of systemigrams. It was important to have this modelled by a source outside the company, as it is difficult for people within a system being modelled to be unbiased. Secondly, it allowed a series of systemigrams to be produced which can then be subjected to simulation, for the purpose of aiding risk management decisions and reducing the project cycle time.

Relevance:

90.00%

Abstract:

The CONNECT European project, which started in February 2009, aims to remove the interoperability barriers faced by today's distributed systems. It does so by adopting a revolutionary approach to the seamless networking of digital systems: synthesising on the fly the connectors via which networked systems communicate.

Relevance:

90.00%

Abstract:

A key objective of autonomic computing is to reduce the cost and expertise required for the management of complex IT systems. As a growing number of these systems are implemented as hierarchies or federations of lower-level systems, techniques that support the development of autonomic systems of systems are required. This article introduces one such technique, which involves the run-time synthesis of autonomic system connectors. These connectors are specified by means of a new type of autonomic computing policy termed a resource-definition policy, and enable the dynamic realisation of collections of collaborating autonomic systems, as envisaged by the original vision of autonomic computing. We propose a framework for the formal specification of autonomic computing policies, and use it to define the new policy type and to describe its application to the development of autonomic systems of systems. To validate the approach, we present a sample data-centre application that was built using connectors synthesised from resource-definition policies.
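As a purely hypothetical illustration of the idea (the article's actual policy notation is not reproduced here, and every name below is invented), a resource-definition policy can be thought of as naming a resource together with the lower-level systems that supply it, from which a connector is synthesised at run time rather than hand-coded:

```python
# A policy names a federated resource and the systems that supply it;
# the connector exposing that resource is synthesised from the policy.
class ResourceDefinitionPolicy:
    def __init__(self, resource, suppliers):
        self.resource = resource
        self.suppliers = suppliers   # system name -> query callable

def synthesise_connector(policy):
    """Build a function that queries every supplier named in the policy."""
    def connector():
        return {name: query() for name, query in policy.suppliers.items()}
    return connector

# Invented example: two lower-level systems contribute a "cpu_load" reading.
policy = ResourceDefinitionPolicy(
    "cpu_load",
    {"web_tier": lambda: 0.42, "db_tier": lambda: 0.87},
)
read_cpu_load = synthesise_connector(policy)
reading = read_cpu_load()
```

Because the connector is derived from the policy rather than coded by hand, changing the federation means changing the policy, not the management software.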

Relevance:

90.00%

Abstract:

The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the computer study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed in the sense that there is no access to the source code and, even if it were available, adaptation to interaction with other systems would require extensive code re-writing. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes, which can be extended, by subclassing, at any level, i.e., at the problem domain, presentation or interaction levels. A strategy, which involves the use of an object-oriented framework, concurrent execution of several simulation modules, use of a networked windowing system and the re-use of existing software written in procedural languages, is proposed. A prototype tool which combines these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application, by creation of new classes, of the prototype to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies the interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by application of computer science concepts to the engineering domain, by helping domain experts to tailor interactive tools to suit their needs.
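The subclassing strategy described above can be sketched as follows; the `Tank` model and its presentation-level subclass are invented examples, not the prototype's actual classes:

```python
import abc

class SimulationModule(abc.ABC):
    """Problem-domain level: numerical model of one physical subsystem."""
    @abc.abstractmethod
    def step(self, dt):
        ...

class Tank(SimulationModule):
    """Invented domain-level class: a tank filling at a constant rate."""
    def __init__(self, level=0.0, inflow=1.0):
        self.level = level
        self.inflow = inflow

    def step(self, dt):
        self.level += self.inflow * dt

class LoggingTank(Tank):
    """Presentation-level extension by subclassing: records the trajectory
    so a GUI could plot it, without touching the domain code."""
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.history = []

    def step(self, dt):
        super().step(dt)
        self.history.append(self.level)

tank = LoggingTank(inflow=2.0)
for _ in range(5):
    tank.step(0.1)
```

The point is that each level (domain, presentation, interaction) is extended by subclassing rather than by rewriting closed simulator code.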

Relevance:

90.00%

Abstract:

A series of simple copper N2S2 macrocycles were examined for their potential as biological redox sensors, following previous characterization of their redox potentials and crystal structures. The divalent species were reduced by glutathione or ascorbate at a biologically relevant pH in aqueous buffer. A less efficient reduction was also achieved by vitamin E in DMSO. Oxidation of the corresponding univalent copper species by sodium hypochlorite resulted in only partial (~65%) recovery of the divalent form. This was concluded to be due to competition between metal oxidation and ligand oxidation, which is believed to contribute to macrocycle demetallation. Electrospray mass spectrometry confirmed that ligand oxidation had occurred. Moreover, the macrocyclic complexes could be demetallated by incubation with EDTA and bovine serum albumin, demonstrating that they would be inappropriate for use in biological systems. The susceptibility to oxidation and demetallation was hypothesized to be due to oxidation of the secondary amines. Consequently these were modified to incorporate additional oxygen donor atoms. This modification led to greater resistance to demetallation and ligand oxidation, providing a better platform for further development of copper macrocycles as redox sensors for use in biological systems.

Relevance:

90.00%

Abstract:

The behaviour of self-adaptive systems can be emergent. The difficulty in predicting the system's behaviour means that there is scope for the system to surprise its customers and its developers. Because its behaviour is emergent, a self-adaptive system needs to earn the confidence of its customers, and it needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, a means needs to be found to explain both the current behaviour of the system and the reasons that brought that behaviour about. We propose the use of goal-based models at runtime to offer self-explanation of how a system is meeting its requirements, and why the means of meeting these were chosen. We discuss the results of early experiments in self-explanation, and set out future work. © 2012 C.E.S.A.M.E.S.
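A minimal sketch of the idea, assuming a toy goal model (the goal, alternatives and utility scores below are invented, not taken from the article): each goal records which means of satisfaction was chosen, so the running system can later explain its behaviour in terms of its requirements:

```python
# Each goal keeps its alternatives and their utilities so that, at run
# time, it can both decide and explain the decision.
class Goal:
    def __init__(self, name, alternatives):
        self.name = name
        self.alternatives = alternatives   # means -> utility score
        self.chosen = None

    def decide(self):
        """Pick the highest-utility means of satisfying the goal."""
        self.chosen = max(self.alternatives, key=self.alternatives.get)
        return self.chosen

    def explain(self):
        """Explain current behaviour in terms of requirement satisfaction."""
        rejected = sorted(set(self.alternatives) - {self.chosen})
        return (f"Goal '{self.name}' is met by '{self.chosen}' "
                f"(utility {self.alternatives[self.chosen]}); "
                f"rejected alternatives: {rejected}")

goal = Goal("keep response time under 200 ms",
            {"add_server": 0.9, "shed_load": 0.6})
goal.decide()
explanation = goal.explain()
```

The explanation is generated from the same model that drove the decision, which is what lets it answer both "how is the requirement being met?" and "why this means rather than another?".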

Relevance:

90.00%

Abstract:

Bayesian algorithms pose a limit to the performance learning algorithms can achieve. Natural selection should guide the evolution of information processing systems towards those limits. What can we learn from this evolution, and what properties do the intermediate stages have? While this question is too general to permit any answer, progress can be made by restricting the class of information processing systems under study. We present analytical and numerical results for the evolution of on-line algorithms for learning from examples for neural network classifiers, which may or may not include a hidden layer. The analytical results are obtained by solving a variational problem to determine the learning algorithm that leads to maximum generalization ability. Simulations using evolutionary programming, for programs that implement learning algorithms, confirm and extend the results. The principal result is not just that the evolution is towards a Bayesian limit; indeed, that limit is essentially reached. In addition, we find that evolution is driven by the discovery of useful structures, or combinations of variables and operators, and that the temporal order of the discovery of such combinations is consistent across different runs. The main result is that combinations that signal the surprise brought by an example always arise before combinations that serve to gauge the performance of the learning algorithm; these latter structures can be used to implement annealing schedules. The temporal ordering can also be understood analytically by carrying out the functional optimization in restricted functional spaces. We also show that there is data suggesting that the appearance of these traits follows the same temporal ordering in biological systems. © 2006 American Institute of Physics.
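A minimal sketch of a surprise-driven on-line learning rule of the kind discussed above — a plain perceptron that updates its weights only when an example disagrees with its prediction. This is not the paper's variational or evolutionary algorithm; the teacher setup and all parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50

# Unknown "teacher" rule the learner must discover (invented setup).
teacher = rng.standard_normal(n)
teacher /= np.linalg.norm(teacher)

w = np.zeros(n)          # student weights
mistakes = 0
for _ in range(10_000):
    x = rng.standard_normal(n)
    label = 1.0 if teacher @ x >= 0 else -1.0
    pred = 1.0 if w @ x >= 0 else -1.0
    if pred != label:    # the example brings "surprise"
        mistakes += 1
        w += label * x   # update weights only on surprising examples

# Generalisation: agreement with the teacher on fresh examples.
test_x = rng.standard_normal((1000, n))
agree = float(np.mean((test_x @ teacher >= 0) == (test_x @ w >= 0)))
```

The "surprise" signal (disagreement between prediction and label) is exactly the kind of structure the abstract reports arising first in the evolved algorithms; performance-gauging structures, such as annealing schedules for the update size, build on top of it.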

Relevance:

90.00%

Abstract:

Protein oxidation is increasingly recognised as an important modulator of biochemical pathways controlling both physiological and pathological processes. While much attention has focused on cysteine modifications in reversible redox signalling, there is increasing evidence that other protein residues are oxidised in vivo with impact on cellular homeostasis and redox signalling pathways. A notable example is tyrosine, which can undergo a number of oxidative post-translational modifications to form 3-hydroxy-tyrosine, tyrosine crosslinks, 3-nitrotyrosine and halogenated tyrosine, with different effects on cellular functions. Tyrosine oxidation has been studied extensively in vitro, and this has generated detailed information about the molecular mechanisms that may occur in vivo. An important aspect of studying tyrosine oxidation, both in vitro and in biological systems, is the ability to monitor the formation of oxidised derivatives, which depends on a variety of analytical techniques. While antibody-dependent techniques such as ELISAs are commonly used, these have limitations, and more specific assays based on spectroscopic or spectrometric techniques are required to provide information on the exact residues modified and the nature of the modification. These approaches have advanced understanding of the consequences of tyrosine oxidation in biological systems, especially its effects on cell signalling and cell dysfunction, linking it to roles in disease. There is mounting evidence that tyrosine oxidation processes are important in vivo and can contribute to cellular pathology.

Relevance:

90.00%

Abstract:

Biotribology is essentially the study of friction, lubrication and wear in biological systems. The area has been widely studied in relation to the behaviour of synovial joints and the design and behaviour of hip joint prostheses, but only in the last decade have serious studies been extended to the eye. In the ocular environment - as distinct from articular joints - wear is not a major factor. Both lubrication and friction are extremely important, however; this is particularly the case in the presence of the contact lens, which is a medical device important not only in vision correction but also as a therapeutic bandage for the compromised cornea. This chapter describes the difficulty of replicating experimental conditions that accurately reflect the complex nature of the ocular environment, together with factors such as the load and rate of travel of the eyelid, which is the principal moving surface in the eye. Results obtained across a range of laboratories are compared.

Relevance:

80.00%

Abstract:

Proteins are susceptible to oxidation by reactive oxygen species, where the type of damage induced is characteristic of the denaturing species. The induction of protein carbonyls is a widely applied biomarker, arising from primary oxidative insult. However, when applied to complex biological and pathological conditions it can be subject to interference from lipid, carbohydrate and DNA oxidation products. More recently, interest has focused on the analysis of specific protein-bound oxidised amino acids. Of the 22 amino acids, aromatic and sulphydryl-containing residues have been regarded as particularly susceptible to oxidative modification, with L-DOPA from tyrosine and ortho-tyrosine from phenylalanine; sulphoxides and disulphides from methionine and cysteine, respectively; and kynurenines from tryptophan. Latterly, the identification of valine and leucine hydroxides, reduced from hydroperoxide intermediates, has been described and applied. In order to examine the nature of oxidative damage and the protective efficacy of antioxidants, the markers must be thoroughly evaluated for dosimetry in vitro following damage by specific radical species. Antioxidant protection against formation of the biomarker should be demonstrated in vitro. Quantification of biomarkers in proteins from normal subjects should be within the limits of detection of any analytical procedure. Further, the techniques for isolation and hydrolysis of specific proteins should demonstrate that in vitro oxidation is minimised. There is a need for the development of quality assurance standards to standardise procedures between laboratories. At present, evidence for antioxidant effects on protein oxidation in vivo is limited to animal studies, where dietary antioxidants have been reported to reduce dityrosine formation during rat exercise training. Two studies on humans were reported last year.
The further application of these methods to human studies is indicated, where the quality of the determinations will be enhanced through inter-laboratory validation.