53 results for Complex adaptive systems
Abstract:
This paper introduces a fast algorithm for moving window principal component analysis (MWPCA) that adapts a principal component model over time. It incorporates recursive adaptation within a moving window to (i) adapt the mean and variance of the process variables, (ii) adapt the correlation matrix, and (iii) adjust the PCA model by recomputing the decomposition. The paper shows that the new algorithm is computationally faster than conventional moving window techniques whenever the window size exceeds three times the number of variables, and that its computational cost is insensitive to the window size. A further contribution is the introduction of an N-step-ahead horizon into process monitoring: the PCA model identified N steps earlier is used to analyze the current observation. For monitoring complex chemical systems, this work shows that the use of the horizon improves the ability to detect slowly developing drifts.
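A minimal sketch of the monitoring scheme described above (hypothetical Python/NumPy, not the paper's algorithm: the fast recursive updates that are its actual contribution are replaced here by a full recomputation at every step):

```python
import numpy as np

def mwpca_spe(X, window, n_components, horizon):
    """Moving-window PCA monitor with an N-step-ahead horizon.

    X       : (T, m) array of process observations
    window  : moving-window length (the paper's speed-up applies
              when window > 3 * m)
    horizon : the model identified `horizon` steps earlier scores
              the current observation
    Returns the squared prediction error (SPE) per monitored step.
    """
    T, m = X.shape
    spe = []
    for t in range(window + horizon, T):
        W = X[t - horizon - window : t - horizon]    # window ending N steps ago
        mu, sd = W.mean(axis=0), W.std(axis=0, ddof=1)
        Z = (W - mu) / sd                            # adapt mean and variance
        R = (Z.T @ Z) / (window - 1)                 # adapt correlation matrix
        _, vecs = np.linalg.eigh(R)                  # recompute decomposition
        P = vecs[:, -n_components:]                  # retained loadings
        z = (X[t] - mu) / sd
        resid = z - P @ (P.T @ z)                    # residual off the PCA subspace
        spe.append(resid @ resid)
    return np.array(spe)
```

A drift that develops more slowly than the window adapts is absorbed into the model when `horizon = 0`; scoring against the model fitted N steps earlier is what keeps such drifts visible in the SPE.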
Abstract:
A novel methodology is proposed for the development of neural network models for complex engineering systems exhibiting nonlinearity. The method first establishes some fundamental nonlinear functions from a priori engineering knowledge, which are then encoded into appropriate chromosome representations. Given a suitable fitness function, evolutionary approaches such as genetic algorithms evolve a population of chromosomes over a number of generations to produce the neural network model that best fits the system data. The objective is to improve the transparency of the neural networks, i.e. to produce physically meaningful models.
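A minimal sketch of the evolutionary loop described above (hypothetical Python/NumPy; the basis functions, encoding, and fitness are illustrative stand-ins, not the authors' method): each chromosome is a bit string switching candidate nonlinear functions on or off, and a genetic algorithm searches for the combination that best fits the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate nonlinear basis functions from a priori knowledge (assumed set).
BASIS = [np.sin, np.tanh, np.square, np.abs]

def fitness(chrom, x, y):
    """Negative mean-squared error of a least-squares fit that uses only
    the basis functions switched on in the binary chromosome."""
    cols = [f(x) for f, on in zip(BASIS, chrom) if on]
    if not cols:
        return -np.inf
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return -np.mean((A @ coef - y) ** 2)

def evolve(x, y, pop_size=20, generations=50, p_mut=0.1):
    """Evolve a population of chromosomes (crossover omitted for brevity)."""
    pop = rng.integers(0, 2, size=(pop_size, len(BASIS)))
    for _ in range(generations):
        scores = np.array([fitness(c, x, y) for c in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # selection
        children = parents[rng.integers(0, len(parents),
                                        pop_size - len(parents))].copy()
        flip = rng.random(children.shape) < p_mut                 # mutation
        children[flip] ^= 1
        pop = np.vstack([parents, children])
    return max(pop, key=lambda c: fitness(c, x, y))
```

The transparency claim follows from the encoding: the surviving chromosome names which physically meaningful functions the model is built from, rather than leaving the structure hidden in opaque weights.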
Abstract:
In the identification of complex dynamic systems using fuzzy neural networks, one of the main issues is the curse of dimensionality, which makes it difficult to retain a large number of system inputs or to consider a large number of fuzzy sets. Moreover, because of correlations, not all possible network inputs or regression vectors are necessary; adding them simply increases the model complexity and degrades the network's generalisation performance. In this paper, the problem is addressed by first proposing a fast algorithm for the selection of network terms, and then introducing a refinement procedure to tackle the correlation issue. Simulation results show the efficacy of the method.
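The abstract does not give details of the term-selection algorithm; a generic greedy stand-in, which at each step adds the candidate term that most reduces the residual error, might look as follows (hypothetical Python/NumPy):

```python
import numpy as np

def forward_select(candidates, y, n_terms):
    """Greedy forward selection of regression terms.

    candidates : (n_samples, n_candidates) matrix of candidate terms
    y          : (n_samples,) target output
    Returns the indices of the selected terms, chosen one at a time
    by largest reduction in residual sum of squares."""
    selected = []
    remaining = list(range(candidates.shape[1]))
    for _ in range(n_terms):
        best, best_err = None, np.inf
        for j in remaining:
            A = candidates[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            err = np.sum((y - A @ coef) ** 2)
            if err < best_err:
                best, best_err = j, err
        selected.append(best)
        remaining.remove(best)
    return selected
```

Such greedy selection can admit several mutually correlated terms, which is the issue the paper's refinement procedure is introduced to tackle.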
Abstract:
Gamma-ray positron annihilation spectra of the noble gases are simulated using computational chemistry tools for the bound electron wavefunctions and a plane-wave approximation for the low-energy positron. The present annihilation line shapes, i.e. the full widths at half maximum, Delta epsilon, of the gamma-ray annihilation spectra, agree well for He and Ar (valence) with available independent atomic calculations using a different algorithm; for the other noble gases they achieve moderate agreement with the experimental measurements. It is found that the contributions of the various atomic electron shells to the spectra depend significantly on their principal quantum number n and orbital angular momentum quantum number l. The present study further reveals that the outermost ns electrons of the noble gases exhibit spectral line shapes in close agreement with those measured, indicating (as expected) that the measurements are not due to a simple sum over the momentum densities of all atomic electrons. The robust nature of the present approach makes it possible to proceed to more complex molecular systems using the tools of modern computational chemistry.
Abstract:
A cartographer constructs a map of an individual creative history, that of the American artist kara lynch, as it emerges in connection with a collective history of African American cultural expression. Positioning history as a complex, dynamic system of interwoven memory networks, the map follows lynch's traversals through various "zones of cultural haunting": places where collective memories made invisible through systematic processes of cultural erasure may be recovered and revived. Through these traversals, which are inspired by lynch's "forever project" Invisible, the map covers such terrains as haunted narratives, mechanisms of abstraction and coding within African American media production, water as an informational technology, the distribution of memory in blood, the dialectics of materiality and immateriality that frame considerations of black subjectivity, and the possibility that the place of music might not be the site of sound but instead the social production of memory.
Abstract:
The relationships among organisms and their surroundings can be of immense complexity. To describe and understand an ecosystem as a tangled bank, multiple modes of interaction and their effects have to be considered, such as predation, competition, mutualism and facilitation. Understanding the resulting interaction networks is a challenge in changing environments, e.g. for predicting the knock-on effects of invasive species and understanding how climate change impacts biodiversity. The elucidation of complex ecological systems and their interactions will benefit enormously from the development of new machine learning tools that aim to infer the structure of interaction networks from field data. In the present study, we propose a novel Bayesian regression and multiple changepoint model (BRAM) for reconstructing species interaction networks from observed species distributions. The model has been devised to allow robust inference in the presence of spatial autocorrelation and distributional heterogeneity. We have evaluated the model on simulated data that combines a trophic niche model with a stochastic population model on a 2-dimensional lattice, and we have compared its performance with L1-penalized sparse regression (LASSO) and non-linear Bayesian networks with the BDe scoring scheme. In addition, we have applied our method to plant ground coverage data from the western shore of the Outer Hebrides with the objective of inferring the ecological interactions.
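As a point of reference for the LASSO baseline the abstract compares against, a minimal sparse-regression approach to network inference could be sketched as follows (hypothetical Python/scikit-learn; this is the comparison method, not the authors' BRAM model):

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_network(abundance, alpha=0.05):
    """abundance: (n_sites, n_species) matrix of observed abundances.
    Regress each species on all the others; nonzero L1-penalised
    coefficients are read as putative interactions."""
    n = abundance.shape[1]
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)
        model = Lasso(alpha=alpha).fit(abundance[:, others], abundance[:, i])
        W[i, others] = model.coef_   # row i: inferred effects on species i
    return W
```

A plain LASSO of this kind assumes one homogeneous regime across all sites; the changepoint component of BRAM is what relaxes that assumption in the presence of spatial autocorrelation and distributional heterogeneity.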
Abstract:
The initial part of this paper reviews the early challenges (c. 1980) in achieving real-time silicon implementations of DSP computations. In particular, it discusses research on application-specific architectures, including bit-level systolic circuits, that led to important advances in achieving the DSP performance levels then required. These were many orders of magnitude greater than those achievable using programmable (including early DSP) processors, and were demonstrated through the design of commercial digital correlator and digital filter chips. As is discussed, an important challenge was the application of these concepts to recursive computations, as occur, for example, in Infinite Impulse Response (IIR) filters. An important breakthrough was to show how fine-grained pipelining can be used if arithmetic is performed most significant bit (msb) first; this can be achieved using redundant number systems, including carry-save arithmetic. This research and its practical benefits were again demonstrated through a number of novel IIR filter chip designs which, at the time, exhibited performance much greater than previous solutions. The architectural insights gained, coupled with the regular nature of many DSP and video processing computations, also provided the foundation for new methods for the rapid design and synthesis of complex DSP System-on-Chip (SoC) Intellectual Property (IP) cores. This included the creation of a wide portfolio of commercial SoC video compression cores (MPEG2, MPEG4, H.264) for very high performance applications ranging from cell phones to High Definition TV (HDTV). The work provided the foundation for systematic methodologies, tools and design flows, including high-level design optimizations based on "algorithmic engineering", and also led to the creation of the Abhainn tool environment for the design of complex heterogeneous DSP platforms comprising processors and multiple FPGAs. The paper concludes with a discussion of the problems faced by designers in developing complex DSP systems using current SoC technology.
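To illustrate the redundant-number idea referred to above, the following toy sketch (hypothetical Python, far removed from the actual bit-level systolic hardware) shows a single carry-save step: three addends are reduced to a sum word and a carry word without any carry propagation, which is what breaks the carry chain and permits fine-grained pipelining.

```python
def carry_save_add(a, b, c):
    """Reduce three addends to a (sum, carry) pair with no carry
    propagation: each output bit depends only on the three input bits
    in the same position, so pipeline stages can be made very fine."""
    s = a ^ b ^ c                               # bitwise sum, carries ignored
    carry = ((a & b) | (a & c) | (b & c)) << 1  # deferred carries
    return s, carry

s, carry = carry_save_add(0b1011, 0b0110, 0b0011)
assert s + carry == 0b1011 + 0b0110 + 0b0011    # carries resolved once, at the end
```

Because the result stays in this redundant (sum, carry) form between stages, the most significant bits are available early, which is what makes msb-first arithmetic and deep pipelining of recursive filters feasible.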
Abstract:
Analysis of molecular interaction and conformational dynamics of biomolecules is of paramount importance in understanding their vital functions in complex biological systems, disease detection, and new drug development. Plasmonic biosensors based upon surface plasmon resonance and localized surface plasmon resonance have become the predominant workhorse for detecting the accumulated biomass caused by molecular binding events. However, unlike surface-enhanced Raman spectroscopy (SERS), plasmonic biosensors are not suitable tools for interrogating the vibrational signatures of the conformational transitions required for biomolecules to interact. Here, we show that plasmonic metamaterials can offer two transducing channels for parallel acquisition of optical transmission and sensitive SERS spectra at the biointerface, simultaneously probing the conformational states and binding affinity of biomolecules, e.g. G-quadruplexes, in different environments. We further demonstrate the use of the metamaterials for fingerprinting and detection of the arginine-glycine-glycine domain of nucleolin, a cancer biomarker which specifically binds to a G-quadruplex, with picomolar sensitivity. The dual-mode nanosensor will significantly contribute to unraveling the complexities of the conformational dynamics of biomolecules as well as to improving the specificity of biodetection assays.
Abstract:
Analysis of binding recognition and conformation of biomolecules is of paramount importance in understanding their vital functions in complex biological systems. By enabling sub-wavelength light localization and strong local field enhancement, plasmonic biosensors have become dominant tools for such analysis owing to their label-free and real-time attributes [1,2]. However, plasmonic biosensors are not well suited to providing information on the conformation or chemical fingerprint of biomolecules. Here, we show that plasmonic metamaterials, consisting of periodic arrays of artificial split-ring resonators (SRRs) [3], can enable both sensing and fingerprinting of biomolecules. We demonstrate that, by engineering the geometry of individual SRRs, the localized surface plasmon resonance (LSPR) frequency of the metamaterials can be tuned to the visible-near-infrared (Vis-NIR) regime such that they possess high local field enhancement for surface-enhanced Raman scattering spectroscopy (SERS). This provides the basis for the development of a dual-mode, label-free, conformation-resolving and quantitative detection platform. We present the ability of each sensing mode to independently detect binding adsorption and to identify different conformational states of guanine (G)-rich DNA monolayers in different environmental milieus. Also shown is the use of the nanosensor for fingerprinting and detection of arginine-glycine-glycine (RGG) peptide binding to the G-quadruplex aptamer. The dual-mode nanosensor will significantly contribute to unraveling the complexities of the conformational dynamics of biomolecules as well as to improving the specificity of biodetection assays, which conventional, population-averaged plasmonic biosensors cannot achieve.
Abstract:
Boron-modified Pd catalysts have shown excellent experimental performance for the selective hydrogenation of alkynes. In the current work, we investigated the hydrogenation of acetylene on boron-modified Pd(111) and Pd(211) surfaces using density functional theory calculations. The activity of acetylene hydrogenation is studied by estimating the effective barrier of the whole process. The selectivity of ethylene formation is investigated by comparing ethylene desorption with its further hydrogenation, as well as ethylene formation with 1,3-butadiene formation. The formation of subsurface carbon and hydrogen on both boron-modified Pd(111) and Pd(211) surfaces is also evaluated, since these species have been reported to affect both the activity and the selectivity of acetylene hydrogenation to ethylene on Pd surfaces. Our results provide important insights into Pd-B catalysts for the selective hydrogenation of acetylene, and also for more complex hydrogenation systems, such as the stereoselective hydrogenation of longer-chain alkynes and the selective hydrogenation of vegetable oil.
Abstract:
We present a new formulation of the correlated electron-ion dynamics (CEID) scheme, which systematically improves on Ehrenfest dynamics by including quantum fluctuations around the mean-field atomic trajectories. We show that the method can simulate models of nonadiabatic electronic transitions and test it against exact integration of the time-dependent Schrödinger equation. Unlike previous formulations of CEID, the accuracy of this scheme depends on a single tunable parameter which sets the level of atomic fluctuations included. Convergence to the exact dynamics as the tunable parameter is increased is demonstrated for a model two-level system. The algorithm provides a smooth description of nonadiabatic electronic transitions which satisfies the kinematic constraints (energy and momentum conservation) and preserves quantum coherence. The applicability of the algorithm to more complex atomic systems is discussed.
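The exact two-level benchmark mentioned above can be reproduced generically as follows (hypothetical Python/SciPy; the Hamiltonian and parameters are illustrative, not the paper's model):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-level Hamiltonian: bias eps, coupling delta (hbar = 1).
eps, delta = 1.0, 0.2
H = np.array([[eps / 2, delta],
              [delta, -eps / 2]], dtype=complex)

def schrodinger(t, psi):
    """Schrödinger equation: d|psi>/dt = -i H |psi> (hbar = 1)."""
    return -1j * (H @ psi)

psi0 = np.array([1.0, 0.0], dtype=complex)   # start in the upper state
sol = solve_ivp(schrodinger, (0.0, 50.0), psi0,
                rtol=1e-10, atol=1e-10, dense_output=True)

t = np.linspace(0.0, 50.0, 500)
p_transition = np.abs(sol.sol(t)[1]) ** 2    # population transferred to the lower state
```

An approximate scheme such as CEID can then be checked against `p_transition`, with agreement expected to improve as the tunable fluctuation parameter is increased.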
Abstract:
In response to Terrence Casey's argument that the emergence of macroprudential regulation since the financial crash can and should save neoliberalism, we raise five objections. (1) The Debt-Driven Growth Hypothesis (DDG) and the Financial Instability Hypothesis (FIH), as Casey terms them, are just as likely to be complementary as oppositional, and they are by no means incompatible. (2) Casey's empirics are too thin and static, drawn from the 1980s and 1990s, while Anglo Liberal Financialised Capitalism (ALFC) is a complex adaptive system that has continued to evolve throughout the 2000s. (3) Casey overlooks the dynamic relationship between potentially excessive financialisation and the performance of the wider economy, which is becoming a growing concern for many policy makers using the macroprudential frame. (4) Macroprudential ideas about the economy are often incompatible with neoliberal premises and their ontological foundations. (5) Many of the policy makers who have acted as the biggest champions of macroprudential regulation have also been highly critical of ALFC and view the macroprudential turn as contributing to a much-needed deeper financial reformation that would, over time, transform some of the constituent economic and social relations of the existing political economy. We conclude that what we call the social purpose of macroprudential regulation (whether it is intended to patch up or to transform the existing system) is contested, and that macroprudential regulation has much potential beyond saving 'neoliberalism'.