836 results for GIBBS FORMALISM
Abstract:
Background: Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results: Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcome some of the numerical difficulties that arise during the global optimization task.
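A minimal sketch (not the authors' code) of the recasting idea: a saturable Hill-type rate law can be rewritten as a product of power laws by introducing an auxiliary variable, which is what makes the GMA representation applicable. Symbols V, K, n and X are illustrative.

```python
# Recasting a Saturable and Cooperative (Hill-type) rate law
# v = V * X**n / (K**n + X**n) into GMA power-law form: with the
# auxiliary variable W = K**n + X**n, the flux becomes V * X**n * W**(-1),
# i.e. a product of power laws as required by the GMA formalism.

import numpy as np

def sc_rate(X, V=1.0, K=0.5, n=2.0):
    """Original saturable/cooperative rate law."""
    return V * X**n / (K**n + X**n)

def gma_rate(X, V=1.0, K=0.5, n=2.0):
    """Recast GMA form using the auxiliary variable W."""
    W = K**n + X**n
    return V * X**n * W**(-1.0)

X = np.linspace(0.01, 2.0, 50)
assert np.allclose(sc_rate(X), gma_rate(X))   # both forms give identical fluxes
```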
Abstract:
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes ethanol production in the fermentation of Saccharomyces cerevisiae.
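A hedged sketch of the epsilon-constraint idea on a toy two-objective problem (a stand-in for the GMA metabolic model, not the paper's implementation): one criterion is optimized while the other is turned into a constraint whose bound is swept to trace a set of Pareto-optimal alternatives.

```python
# Epsilon-constraint scalarization on an illustrative two-objective problem:
# maximize f1 subject to f2 >= eps, sweeping eps over a grid.

import numpy as np
from scipy.optimize import minimize

f1 = lambda x: x[0] * x[1]                              # e.g. "synthesis rate" (illustrative)
f2 = lambda x: -(x[0] - 1.0)**2 - (x[1] - 2.0)**2       # e.g. "metabolic burden" criterion

pareto = []
for eps in np.linspace(-2.0, 0.0, 11):
    res = minimize(lambda x: -f1(x), x0=[1.0, 1.0],
                   bounds=[(0.1, 3.0), (0.1, 3.0)],
                   constraints=[{"type": "ineq", "fun": lambda x, e=eps: f2(x) - e}])
    if res.success:
        pareto.append((f1(res.x), f2(res.x)))
# 'pareto' holds one alternative per epsilon level, ready for Pareto filtering.
```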
Abstract:
This paper presents an approach based on the saddle-point approximation to study the equilibrium interactions between small molecules and macromolecules with a large number of sites. For this case, the application of the Darwin–Fowler method results in very simple expressions for the stoichiometric equilibrium constants and their corresponding free energies in terms of integrals of the binding curve plus a correction term which depends on the first derivatives of the binding curve at the points corresponding to integer values of the mean occupation number. These expressions are simplified when the number of sites tends to infinity, providing an interpretation of the binding curve in terms of the stoichiometric stability constants. The formalism presented is applied to some simple complexation models, obtaining good values for the free energies involved. When heterogeneous complexation is assumed, simple expressions are obtained that relate the macroscopic description of the binding, given by the stoichiometric constants, with the microscopic description in terms of the intrinsic stability constants or the affinity spectrum. © 1999 American Institute of Physics.
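As an illustration of the macroscopic/microscopic link mentioned above (not the paper's Darwin–Fowler expressions), the textbook case of n identical, independent sites relates stoichiometric constants to a single intrinsic constant through statistical factors; the assumed values of n, k, and T are placeholders.

```python
# Stoichiometric (macroscopic) constants from an intrinsic constant k for
# n identical, independent sites: K_i = k*(n-i+1)/i, with stepwise free
# energies dG_i = -RT*ln(K_i).

import math

R, T = 8.314, 298.15     # J/(mol K), K
n, k = 4, 1.0e4          # number of sites and intrinsic constant (assumed values)

for i in range(1, n + 1):
    K_i = k * (n - i + 1) / i
    dG_i = -R * T * math.log(K_i) / 1000.0   # kJ/mol
    print(f"K_{i} = {K_i:.3e}   dG_{i} = {dG_i:.2f} kJ/mol")
```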
Abstract:
We discuss the evolution of purity in mixed quantum/classical approaches to electronic nonadiabatic dynamics in the context of the Ehrenfest model. As it is impossible to exactly determine initial conditions for a realistic system, we choose to work in the statistical Ehrenfest formalism that we introduced in Alonso et al. [J. Phys. A: Math. Theor. 44, 395004 (2011); doi:10.1088/1751-8113/44/39/395004]. From it, we develop a new framework to determine exactly the change in the purity of the quantum subsystem along the evolution of a statistical Ehrenfest system. In a simple case, we verify how and to what extent Ehrenfest statistical dynamics makes a system with more than one classical trajectory and an initially pure quantum state evolve into a mixed one. We demonstrate this numerically, showing how the evolution of purity depends on time, on the dimension of the quantum state space D, and on the number of classical trajectories N of the initial distribution. The results in this work open new perspectives for studying decoherence with Ehrenfest dynamics.
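A minimal sketch of the quantity being tracked: the purity Tr(ρ²) of the quantum subsystem when its density matrix is assembled from an ensemble of trajectories, each carrying its own pure electronic state. The dimensions D and N and the random states are illustrative stand-ins, not the paper's model.

```python
# Purity of the quantum subsystem from N trajectory-wise pure states.

import numpy as np

rng = np.random.default_rng(0)
D, N = 4, 100                                        # state-space dimension, trajectories

# random normalized pure states, one per classical trajectory (stand-in data)
psi = rng.normal(size=(N, D)) + 1j * rng.normal(size=(N, D))
psi /= np.linalg.norm(psi, axis=1, keepdims=True)

rho = sum(np.outer(p, p.conj()) for p in psi) / N    # ensemble density matrix
purity = np.real(np.trace(rho @ rho))                # 1 for a pure state, 1/D at minimum
print(purity)
```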
Abstract:
The saturation properties of neutron-rich matter are investigated in a relativistic mean-field formalism using two accurately calibrated models: NL3 and FSUGold. The saturation properties (density, binding energy per nucleon, and incompressibility coefficient) are calculated as a function of the neutron-proton asymmetry α ≡ (N−Z)/A to all orders in α. Good agreement (at the 10% level or better) is found between these numerical calculations and analytic expansions that are given in terms of a handful of bulk parameters determined at saturation density. Using insights developed from the analytic approach and a general expression for the incompressibility coefficient of infinite neutron-rich matter, i.e., K0(α) = K0 + Kτα² + ⋯, we construct a hybrid model with values for K0 and Kτ as suggested by recent experimental findings. Whereas the hybrid model provides a better description of the measured distribution of isoscalar monopole strength in the Sn isotopes relative to both NL3 and FSUGold, it significantly underestimates the distribution of strength in 208Pb. Thus, we conclude that the incompressibility coefficient of neutron-rich matter remains an important open problem.
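A sketch of the quoted expansion truncated at second order in the asymmetry; the numerical values of K0 and Kτ below are placeholders of the right order of magnitude, not the paper's fitted parameters.

```python
# Incompressibility of neutron-rich matter, K0(alpha) = K0 + Ktau*alpha**2 + ...

def incompressibility(alpha, K0=230.0, Ktau=-400.0):   # MeV (assumed values)
    return K0 + Ktau * alpha**2

for N, Z in [(70, 50), (126, 82)]:                      # Sn-like and 208Pb-like asymmetries
    alpha = (N - Z) / (N + Z)
    print(f"alpha = {alpha:.3f}  K(alpha) = {incompressibility(alpha):.1f} MeV")
```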
Abstract:
Many models proposed to study the evolution of collective action rely on a formalism that represents social interactions as n-player games between individuals adopting discrete actions such as cooperate and defect. Despite the importance of spatial structure in biological collective action, the analysis of n-player games in spatially structured populations has so far proved elusive. We address this problem by considering mixed strategies and by integrating discrete-action n-player games into the direct fitness approach of social evolution theory. This allows us to conveniently identify convergence stable strategies and to capture the effect of population structure by a single structure coefficient, namely, the pairwise (scaled) relatedness among interacting individuals. As an application, we use our mathematical framework to investigate collective action problems associated with the provision of three different kinds of collective goods, paradigmatic of a vast array of helping traits in nature: "public goods" (both providers and shirkers can use the good, e.g., alarm calls), "club goods" (only providers can use the good, e.g., participation in collective hunting), and "charity goods" (only shirkers can use the good, e.g., altruistic sacrifice). We show that relatedness promotes the evolution of collective action in different ways depending on the kind of collective good and its economies of scale. Our findings highlight the importance of explicitly accounting for relatedness, the kind of collective good, and the economies of scale in theoretical and empirical studies of the evolution of collective action.
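A minimal sketch of the mixed-strategy bookkeeping in a discrete-action n-player game (illustrative only; the payoff structure below is a simple linear public good, not one of the paper's three goods): the focal player cooperates with probability x, each co-player with probability y, and the expected payoff averages over a binomial distribution of cooperating co-players.

```python
# Expected payoff of a focal player in an n-player game under mixed strategies.

from math import comb

def expected_payoff(x, y, n=5, b=2.0, c=1.0):
    total = 0.0
    for k in range(n):                        # k cooperating co-players out of n-1
        p_k = comb(n - 1, k) * y**k * (1 - y)**(n - 1 - k)
        pay_C = b * (k + 1) / n - c           # focal cooperates (pays cost c)
        pay_D = b * k / n                     # focal defects
        total += p_k * (x * pay_C + (1 - x) * pay_D)
    return total

print(expected_payoff(x=0.5, y=0.5))
```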
Abstract:
We study the interaction of vector mesons with the octet of stable baryons in the framework of the local hidden gauge formalism using a coupled channels unitary approach. We examine the scattering amplitudes and their poles, which in some cases can be associated with known JP = 1/2−, 3/2− baryon resonances, and in others provide predictions. The formalism employed produces doublets of degenerate JP = 1/2−, 3/2− states, a pattern which is observed experimentally in several cases. The findings of this work should also be useful to guide present experimental programs searching for new resonances, in particular in the strange sector, where the current information is very poor.
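A schematic sketch of the coupled-channels unitarization step used in such studies, where the scattering matrix is resummed as T = [1 − VG]⁻¹V from a tree-level potential V and a diagonal loop function G; the 2×2 numbers below are placeholders, not the paper's amplitudes.

```python
# Coupled-channels unitarization: poles of T (where det(1 - V G) = 0) are
# what get associated with dynamically generated baryon resonances.

import numpy as np

V = np.array([[-2.0, 1.0],
              [ 1.0, -1.5]])                  # illustrative potential in channel space
G = np.diag([-0.05 + 0.01j, -0.03 + 0.02j])   # illustrative loop functions

T = np.linalg.inv(np.eye(2) - V @ G) @ V      # unitarized amplitude
print(T)
```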
Abstract:
Biomedical research is currently facing a new type of challenge: an excess of information, both in terms of raw data from experiments and in the number of scientific publications describing their results. Mirroring the focus on data mining techniques to address the issues of structured data, there has recently been great interest in the development and application of text mining techniques to make more effective use of the knowledge contained in biomedical scientific publications, accessible only in the form of natural human language. This thesis describes research done in the broader scope of projects aiming to develop methods, tools and techniques for text mining tasks in general and for the biomedical domain in particular. The work described here involves more specifically the goal of extracting information from statements concerning relations of biomedical entities, such as protein-protein interactions. The approach taken is one using full parsing—syntactic analysis of the entire structure of sentences—and machine learning, aiming to develop reliable methods that can further be generalized to apply also to other domains. The five papers at the core of this thesis describe research on a number of distinct but related topics in text mining. In the first of these studies, we assessed the applicability of two popular general English parsers to biomedical text mining and, finding their performance limited, identified several specific challenges to accurate parsing of domain text. In a follow-up study focusing on parsing issues related to specialized domain terminology, we evaluated three lexical adaptation methods. We found that the accurate resolution of unknown words can considerably improve parsing performance and introduced a domain-adapted parser that reduced the error rate of the original by 10% while also roughly halving parsing time. To establish the relative merits of parsers that differ in the applied formalisms and the representation given to their syntactic analyses, we have also developed evaluation methodology, considering different approaches to establishing comparable dependency-based evaluation results. We introduced a methodology for creating highly accurate conversions between different parse representations, demonstrating the feasibility of unification of diverse syntactic schemes under a shared, application-oriented representation. In addition to allowing formalism-neutral evaluation, we argue that such unification can also increase the value of parsers for domain text mining. As a further step in this direction, we analysed the characteristics of publicly available biomedical corpora annotated for protein-protein interactions and created tools for converting them into a shared form, thus contributing also to the unification of text mining resources. The introduced unified corpora allowed us to perform a task-oriented comparative evaluation of biomedical text mining corpora. This evaluation established clear limits on the comparability of results for text mining methods evaluated on different resources, prompting further efforts toward standardization. To support this and other research, we have also designed and annotated BioInfer, the first domain corpus of its size combining annotation of syntax and biomedical entities with a detailed annotation of their relationships.
The corpus represents a major design and development effort of the research group, with manual annotation that identifies over 6,000 entities, 2,500 relationships and 28,000 syntactic dependencies in 1,100 sentences. In addition to combining these key annotations for a single set of sentences, BioInfer was also the first domain resource to introduce a representation of entity relations that is supported by ontologies and able to capture complex, structured relationships. Part I of this thesis presents a summary of this research in the broader context of a text mining system, and Part II contains reprints of the five included publications.
Abstract:
The 4πβ-γ coincidence counting method and its close relatives are widely used for the primary standardization of radioactivity. Both the general formalism and specific implementation of these methods have been well documented. In particular, previous papers contain the extrapolation equations used for various decay schemes, methods for determining model parameters and, in some cases, tabulated uncertainty budgets. Two things often lacking from experimental reports are the rationale for estimating uncertainties in a specific way and the details of exactly how a specific component of uncertainty was estimated. Furthermore, correlations among the components of uncertainty are rarely mentioned. To fill in these gaps, the present article shares best practices from a few practitioners of this craft. We explain and demonstrate with examples how these approaches can be used to estimate the uncertainty of the reported massic activity. We describe uncertainties due to measurement variability, extrapolation functions, dead-time and resolving-time effects, gravimetric links, and nuclear and atomic data. Most importantly, a thorough understanding of the measurement system and its response to the decay under study can be used to derive a robust estimate of the measurement uncertainty.
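A hedged sketch of the efficiency-extrapolation step common to 4πβ-γ coincidence counting and of one uncertainty component mentioned above: the apparent activity NβNγ/Nc is fitted against the inefficiency parameter (1−ε)/ε and extrapolated to 100% efficiency. The counting data below are invented for illustration only.

```python
# Linear efficiency extrapolation with a fit-uncertainty estimate.

import numpy as np

eff      = np.array([0.80, 0.85, 0.90, 0.95])        # efficiency, e.g. Nc/Ngamma
apparent = np.array([1052., 1040., 1029., 1016.])    # Nbeta*Ngamma/Nc in s^-1 (invented)

x = (1.0 - eff) / eff
coeffs, cov = np.polyfit(x, apparent, deg=1, cov=True)
activity = coeffs[1]                    # intercept at x = 0, i.e. 100 % efficiency
u_fit    = np.sqrt(cov[1, 1])           # one component of the uncertainty budget
print(f"N0 = {activity:.1f} s^-1  (fit uncertainty {u_fit:.1f} s^-1)")
```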
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The developed clustering methods were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Normally, environmental phenomena can be modelled as stochastic point processes where each event, e.g. the forest fire ignition point, is characterised by its spatial location and occurrence in time. Additionally, information such as burned area, ignition causes, land use, topographic, climatic and meteorological features, etc., can also be used to characterise the studied phenomenon. Thereby, the space-time pattern characterisation represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance, socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita Index, the Box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. Scan Statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are of global character and do not consider complex spatial constraints, high variability and the multivariate nature of the events. Therefore, we propose a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with different constrained geographical spaces, hence assessing the relative degree of clustering of the real distribution. Moreover, exclusively for the forest fire case, this research proposes two new methodologies for defining and mapping both the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and the prediction of fire ignition susceptibility. In this regard, the main objective of this Thesis was to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena as well as to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular of forest fire occurrences. Thus, this Thesis provides a response to the increasing demand for both environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
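A minimal sketch of one of the global clustering measures named above, Ripley's K-function, for a 2-D point pattern in a unit square (no edge correction, illustrative data only; in the thesis a Validity Domain would replace the simple square).

```python
# Naive Ripley's K estimate for a random point pattern, compared with the
# complete-spatial-randomness expectation pi*r**2.

import numpy as np

rng = np.random.default_rng(1)
pts = rng.uniform(size=(200, 2))                      # stand-in for fire ignition points
n, area = len(pts), 1.0
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

def ripley_K(r):
    pairs = (d < r).sum() - n                         # exclude self-pairs on the diagonal
    return area * pairs / (n * (n - 1))

for r in (0.05, 0.10, 0.20):
    print(r, ripley_K(r), np.pi * r**2)
```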
Abstract:
This review has tried to collect and correlate all the various equations for the g matrix of strong-field d5 systems obtained from different basis sets using full electron and hole formalism calculations. It has corrected mistakes found in the literature and shown how the failure to properly take into account symmetry boundary conditions has produced a variety of apparently inconsistent equations in the literature. The review has reexamined the problem of spin-orbit interaction with excited t4e states and finds that earlier reports that it is zero in octahedral symmetry are not correct. It has shown how redefining what x, y, and z are in the principal coordinate system simplifies, compared to previous methods, the analysis of experimental g values with the equations.
Abstract:
This paper deals with Carathéodory's formulation of the second law of thermodynamics. The material is presented in a didactic way, which allows a second-year undergraduate student to follow the formalism. An application is made to an ideal gas with two independent variables. A criticism of Carnot's formulation of the second law and an investigation of the historical origins of the Carathéodory formalism are also presented.
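A hedged illustration of the kind of two-variable ideal-gas calculation referred to above (a standard textbook step, not necessarily the paper's exact derivation): the heat one-form is not exact, but 1/T is an integrating factor, which is the content of Carathéodory's statement in this simple case.

```latex
% Heat one-form for n moles of ideal gas in the variables (T, V) and its
% integrating factor 1/T, yielding the entropy differential dS.
\delta Q = C_V\,dT + \frac{nRT}{V}\,dV
\qquad\Longrightarrow\qquad
dS = \frac{\delta Q}{T} = C_V\,\frac{dT}{T} + nR\,\frac{dV}{V},
\qquad
S = C_V \ln T + nR \ln V + \text{const.}
```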
Abstract:
Monte Carlo simulation in statistical mechanics is reviewed as a formalism to study the thermodynamic properties of liquids. Considering the importance of free energy changes in chemical processes, the thermodynamic perturbation theory implemented in the Monte Carlo method is discussed. The representation of molecular interactions by the Lennard-Jones and Coulomb potential functions is also discussed. Charges derived from the quantum molecular electrostatic potential are also discussed as a useful methodology to generate an adequate set of partial charges for use in liquid simulations.
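A minimal sketch of the two ingredients reviewed above, assuming illustrative parameters (roughly water-like, not taken from the review): a Lennard-Jones plus Coulomb pair potential and the Metropolis acceptance test used when sampling liquid configurations.

```python
# LJ + Coulomb pair energy and Metropolis acceptance criterion.

import numpy as np

def pair_energy(r, eps=0.65, sigma=3.15, qi=0.42, qj=-0.84, ke=1389.35):
    """Pair energy in kJ/mol; r in Angstrom, charges in units of e."""
    lj   = 4.0 * eps * ((sigma / r)**12 - (sigma / r)**6)
    coul = ke * qi * qj / r
    return lj + coul

def metropolis_accept(dE, T=298.15, R=8.314e-3, rng=np.random.default_rng()):
    """Accept a trial move with probability min(1, exp(-dE/RT))."""
    return dE <= 0.0 or rng.random() < np.exp(-dE / (R * T))

print(pair_energy(3.0), metropolis_accept(+2.5))
```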
Abstract:
In mathematical modeling, the estimation of model parameters is one of the most common problems. The goal is to seek parameters that fit the measurements as well as possible. There is always error in the measurements, which implies uncertainty in the model estimates. In Bayesian statistics all the unknown quantities are represented as probability distributions. If there is knowledge about the parameters beforehand, it can be formulated as a prior distribution. Bayes' rule combines the prior and the measurements into the posterior distribution. Mathematical models are typically nonlinear, so producing statistics for them requires efficient sampling algorithms. In this thesis the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms, as well as Gibbs sampling, are introduced. Different ways to specify prior distributions are also presented. The main issue is the estimation of measurement error and how to obtain prior knowledge for the variance or covariance. Variance and covariance sampling is combined with the algorithms above. Examples of hyperprior models are applied to the estimation of model parameters and error in an outlier case.
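A hedged sketch of the kind of sampler discussed above: a random-walk Metropolis-Hastings update of a model parameter θ combined with a Gibbs update of the measurement variance σ² from its conjugate inverse-gamma conditional. The model (y = θ + noise), the data, and the hyperparameters a0, b0 are illustrative choices, not the thesis' case study.

```python
# MH update for theta combined with a Gibbs step for the noise variance.

import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(2.0, 0.5, size=50)                    # synthetic measurements
a0, b0 = 2.0, 1.0                                    # inverse-gamma hyperprior (assumed)
theta, sigma2, chain = 0.0, 1.0, []

for _ in range(5000):
    # Metropolis-Hastings step for theta (flat prior assumed)
    prop = theta + rng.normal(scale=0.2)
    logr = (np.sum((y - theta)**2) - np.sum((y - prop)**2)) / (2 * sigma2)
    if np.log(rng.random()) < logr:
        theta = prop
    # Gibbs step: sigma^2 | theta, y  ~  InvGamma(a0 + n/2, b0 + SS/2)
    a = a0 + len(y) / 2
    b = b0 + 0.5 * np.sum((y - theta)**2)
    sigma2 = 1.0 / rng.gamma(a, 1.0 / b)
    chain.append((theta, sigma2))

print(np.mean(chain, axis=0))                        # posterior means of (theta, sigma^2)
```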
Abstract:
Consensus is growing that antimicrobial peptides that exert their antibacterial action at the membrane level must reach a local concentration threshold to become active. Studies of peptide interaction with model membranes do identify such disruptive thresholds, but demonstrations of a possible correlation of these with the in vivo onset of activity have only recently been proposed. In addition, such thresholds observed in model membranes occur at local peptide concentrations close to full membrane coverage. In this work we fully develop an interaction model of antimicrobial peptides with biological membranes; by exploring the consequences of the underlying partition formalism we arrive at a relationship that predicts antibacterial activity from two biophysical parameters: the affinity of the peptide for the membrane and the critical bound peptide-to-lipid ratio. A straightforward and robust method to implement this relationship, with potential application to high-throughput screening approaches, is presented and tested. In addition, disruptive thresholds in model membranes and the onset of antibacterial peptide activity are shown to occur over the same range of locally bound peptide concentrations (10 to 100 mM), which reconciles the two types of observations.
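A heavily hedged sketch of how a partition formalism can connect the two biophysical parameters named above to an activity threshold. Under a simple dilute-solution partition model (not necessarily the authors' exact expression), the total peptide concentration needed to reach the critical bound peptide-to-lipid ratio (P/L)* at a lipid concentration [L] is (P/L)* × ([L] + 1/(Kp·γL)), with Kp the partition constant and γL the lipid molar volume; all numbers below are illustrative.

```python
# Threshold (activity-onset) peptide concentration from Kp and (P/L)*.

def threshold_concentration(PL_crit, Kp, L, gammaL=0.75):
    """All concentrations in mol/L; gammaL in L/mol (assumed lipid molar volume)."""
    return PL_crit * (L + 1.0 / (Kp * gammaL))

# Illustrative inputs only: (P/L)* = 0.05, Kp = 1e4, [L] = 10 uM
print(threshold_concentration(0.05, 1.0e4, 1.0e-5))
```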