934 results for dynamic systems theory
Abstract:
The paper reflects the work of COST Action TU1403 Workgroup 3/Task group 1. The aim is to identify research needs from a review of the state of the art of three aspects related to adaptive façade systems: (1) dynamic performance requirements; (2) façade design under stochastic boundary conditions and (3) experiences with adaptive façade systems and market needs.
Abstract:
Degree of Doctor of Philosophy in Structural/Civil Engineering
Abstract:
Document submitted for peer review. To be published in the Journal of Parallel and Distributed Computing. ISSN 0743-7315.
Abstract:
"Series: Solid mechanics and its applications, vol. 226"
Abstract:
We propose to analyze the effect of productive land use in the Arid Chaco of Córdoba province by applying sustainability indicators related to the quality of soil organic matter and nutrient release, with the aim of contributing to the formulation of management criteria and guidelines for the implementation of the Ley de Bosques (N° 26331), an issue of major interest for the province of Córdoba. The study will be conducted in the village of San Miguel, Pocho department, at one undisturbed forest site and in three productive systems: selective clearing with sown pasture; total clearing with irrigated agriculture; and total clearing with overgrazing. At each site, CO2 emission will be measured in situ and soil samples will be taken to determine: a) total organic matter content (MO); b) humic substances content (SH), differentiating humic acids (AH) and fulvic acids (AF); c) abundance and activity of nitrifying microorganisms; and d) chemical properties of the AH and AF. The following sustainability indices will be calculated: a) bioavailable organic matter (MOB = MO − SH); b) humification index (IH = SH/MO); c) humus type (TH = AF/AH); d) C mineralization index (IMC = CO2/MO); e) nitrification index (IN = activity/abundance); and f) stability index of the humified fractions (aromatic/aliphatic compounds). The data will be analyzed statistically by ANOVA, comparison of means by LSD (P < 0.05), and multivariate tests.
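As a minimal illustration, the indices above can be computed directly from field measurements; the function below just restates the formulas, and the sample values are hypothetical.

```python
# Illustrative computation of the indices defined above; the function restates
# the formulas from the abstract, and the sample values are made up.

def sustainability_indices(MO, SH, AH, AF, CO2, activity, abundance):
    """Return the study's indices: MOB, IH, TH, IMC and IN."""
    return {
        "MOB": MO - SH,               # bioavailable organic matter
        "IH":  SH / MO,               # humification index
        "TH":  AF / AH,               # humus type (fulvic/humic ratio)
        "IMC": CO2 / MO,              # C mineralization index
        "IN":  activity / abundance,  # nitrification index
    }

# Hypothetical single-sample values (consistent units assumed for MO, SH, AH, AF).
print(sustainability_indices(MO=42.0, SH=18.5, AH=11.0, AF=7.5,
                             CO2=3.2, activity=0.8, abundance=2.0))
```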
Abstract:
This project falls within the use of formal methods, more precisely type theory, to guarantee the absence of errors in programs. It follows three directions. 1) The design of new type-checking algorithms based on the idea of normalization by evaluation that extend to other type systems. In the near future we will extend results we have recently obtained [16,17] in order to: simplify the existing work for systems without the eta rule (two systems will be studied: à la Martin-Löf and PTS-style); formulate these checkers for systems with variables (in addition to the systems in [16,17], which are à la de Bruijn); generalize the notion of category with families used to give semantics to type theory; obtain a categorical formulation of the notion of normalization by evaluation; and, finally, apply these algorithms to systems with rewrite rules. For the first expected results, our method will be to adapt the proofs of [16,17] to the new systems. The importance of this direction is that it will make proof assistants based on type theory more automatable and therefore easier to use. 2) The use of type theory to certify compilers, pursuing the never-explored proposal of [22] of using an abstract approach based on functor categories for Algol-like languages. The method will consist in certifying the language Peal [29] in type theory and then gradually adding functionality until a certified compiler for the language Forsythe [23] is obtained; over this period we expect to add several extensions. The importance of this direction is that only a certified compiler guarantees that a correct source program is compiled into a correct object program, which is crucial for any verification process based on verifying source code. 3) The formalization of systems with session types. Several proposed formulations have been shown to be faulty [30], so formalizing them seems worthwhile. Over the course of this project we expect to obtain a formalization that yields a type-checking algorithm and proofs of the usual properties of such systems. The contribution is to shed some light on formulations whose errors reveal that the topic has not yet reached sufficient maturity or understanding within the community.
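As a rough illustration of the idea behind normalization by evaluation (direction 1), the sketch below normalizes untyped lambda terms by evaluating them into a semantic domain of closures and neutral spines and then reading the values back as beta-normal terms; this toy representation is assumed for illustration only and is far simpler than the dependently typed systems the project targets.

```python
# Untyped normalization-by-evaluation sketch: evaluate terms into a semantic
# domain, then "reify" values back into beta-normal terms. Toy representation
# with de Bruijn indices/levels, not the systems studied in the project.
from dataclasses import dataclass

# --- Syntax, with de Bruijn indices ---
@dataclass
class Var:
    idx: int            # bound variable (de Bruijn index)

@dataclass
class Lam:
    body: object        # lambda abstraction

@dataclass
class App:
    fun: object
    arg: object         # application

# --- Semantic values: closures and neutral spines ---
@dataclass
class VLam:
    env: list           # captured environment
    body: object

@dataclass
class VNe:
    lvl: int            # de Bruijn level of the head variable
    args: list          # arguments the neutral head is applied to

def evaluate(term, env):
    """Evaluate a term to a semantic value in an environment of values."""
    if isinstance(term, Var):
        return env[term.idx]
    if isinstance(term, Lam):
        return VLam(env, term.body)
    return apply_value(evaluate(term.fun, env), evaluate(term.arg, env))

def apply_value(fun, arg):
    if isinstance(fun, VLam):
        return evaluate(fun.body, [arg] + fun.env)
    return VNe(fun.lvl, fun.args + [arg])     # stuck: extend the neutral spine

def reify(value, depth):
    """Read a semantic value back into a beta-normal term."""
    if isinstance(value, VLam):
        fresh = VNe(depth, [])                # fresh variable to go under the binder
        return Lam(reify(apply_value(value, fresh), depth + 1))
    term = Var(depth - value.lvl - 1)         # convert level back to index
    for arg in value.args:
        term = App(term, reify(arg, depth))
    return term

def normalize(term):
    return reify(evaluate(term, []), 0)

# (\x. x) (\y. y)  normalizes to  \y. y
print(normalize(App(Lam(Var(0)), Lam(Var(0)))))
```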
Abstract:
Convex cone, toric variety, graph theory, electrochemical catalysis, oxidation of formic acid, feedback loops, bifurcations, enzymatic catalysis, peroxidase reaction, Shil'nikov chaos
Abstract:
The objective of this paper is to measure the impact of different kinds of knowledge and external economies on urban growth in an intraregional context. The main hypothesis is that knowledge leads to growth, and that this knowledge is related to the existence of agglomeration and network externalities in cities. We develop a three-stage methodology: first, we measure the amount and growth of knowledge in cities using the OCDE (2003) classification and employment data; second, we identify the spatial structure of the area of analysis (networks of cities); third, we combine the Glaeser-Henderson-De Lucio models with spatial econometric specifications in order to test for the existence of spatially static (agglomeration) and spatially dynamic (network) external economies in an urban growth model. Results suggest that higher growth rates are associated with higher levels of technology and knowledge. The growth of the different kinds of knowledge is related to local and spatial factors (agglomeration and network externalities), and each knowledge intensity shows a particular response to these factors. These results have implications for policy design, since we can forecast and intervene in local knowledge development paths.
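A toy sketch of the kind of specification involved (not the authors' model): simulated city growth is regressed on the city's own knowledge level, proxying a spatially static agglomeration effect, and on the knowledge of network-linked cities through a row-standardized weight matrix W, proxying a spatially dynamic network effect. All data below are simulated.

```python
# Toy urban-growth regression with an agglomeration-type term (own knowledge)
# and a network-type term (spatial lag of knowledge). Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
n = 50                                       # hypothetical number of cities
knowledge = rng.normal(size=n)               # standardized knowledge-intensive employment
W = rng.random((n, n))
np.fill_diagonal(W, 0.0)
W = W / W.sum(axis=1, keepdims=True)         # row-standardized city-network matrix

# Simulated growth: agglomeration effect + network effect + noise (toy data only).
growth = 0.4 * knowledge + 0.3 * (W @ knowledge) + rng.normal(scale=0.1, size=n)

# OLS with an intercept, the own-knowledge term, and the spatial lag W @ knowledge.
X = np.column_stack([np.ones(n), knowledge, W @ knowledge])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(dict(zip(["intercept", "agglomeration", "network"], beta.round(3))))
```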
Abstract:
In Part I, we formulate and examine some systems that have arisen in the study of the constructible hierarchy; we find numerous transitive models for them, among which are supertransitive models containing all ordinals that show that Devlin's system BS lies strictly between Gandy's systems PZ and BST'; and we use our models to show that BS fails to handle even the simplest rudimentary functions, and is thus inadequate for the use intended for it in Devlin's treatise. In Part II we propose and study an enhancement of the underlying logic of these systems, build further models to show where the previous hierarchy of systems is preserved by our enhancement; and consider three systems that might serve for Devlin's purposes: one the enhancement of a version of BS, one a formulation of Gandy-Jensen set theory, and the third a subsystem common to those two. In Part III we give new proofs of results of Boffa by constructing three models in which, respectively, TCo, AxPair and AxSing fail; we give some sufficient conditions for a set not to belong to the rudimentary closure of another set, and thus answer a question of McAloon; and we comment on Gandy's numerals and correct and sharpen other of his observations.
Abstract:
Consider a Riemannian manifold equipped with an infinitesimal isometry. For this setup, a unified treatment is provided, solely in the language of Riemannian geometry, of techniques in reduction, linearization, and stability of relative equilibria. In particular, for mechanical control systems, an explicit characterization is given for the manner in which reduction by an infinitesimal isometry, and linearization along a controlled trajectory "commute." As part of the development, relationships are derived between the Jacobi equation of geodesic variation and concepts from reduction theory, such as the curvature of the mechanical connection and the effective potential. As an application of our techniques, fiber and base stability of relative equilibria are studied. The paper also serves as a tutorial of Riemannian geometric methods applicable in the intersection of mechanics and control theory.
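For reference, a sketch of the Jacobi equation of geodesic variation referred to above, in standard Riemannian notation, with J a Jacobi field along a geodesic γ, D/dt the covariant derivative along γ, and R the curvature tensor (the paper relates this equation, along controlled trajectories, to reduction-theoretic quantities):

```latex
\frac{D^2 J}{dt^2} + R(J, \dot{\gamma})\,\dot{\gamma} = 0
```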
Abstract:
We extend Floquet theory for reducing nonlinear periodic difference systems to autonomous ones (actually linear) by using normal form theory.
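For context, a sketch of the classical linear statement being extended: under the usual assumption that A(k) is invertible, a linear difference system with N-periodic coefficients can be transformed, by an N-periodic change of variables, into an autonomous one.

```latex
x_{k+1} = A(k)\,x_k, \quad A(k+N) = A(k)
\qquad\Longrightarrow\qquad
y_{k+1} = B\,y_k \ \ (B \text{ constant}),
\quad\text{where } x_k = P(k)\,y_k,\ \ P(k+N) = P(k) \text{ invertible.}
```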
Abstract:
To a finite graph there corresponds a free partially commutative group, with the given graph as commutation graph. In this paper we construct an orthogonality theory for graphs and their corresponding free partially commutative groups. The theory developed here provides tools for the study of the structure of partially commutative groups, their universal theory and automorphism groups. In particular the theory is applied in this paper to the centraliser lattice of such groups.
Abstract:
The demand for computational power has been driving improvements in the High Performance Computing (HPC) area, generally represented by the use of distributed systems such as clusters of computers running parallel applications. In this area, fault tolerance plays an important role in providing high availability by isolating the application from the effects of faults. For some kinds of applications, performance and availability are inseparable requirements, so fault tolerant solutions must take both constraints into consideration when they are designed. In this dissertation, we present some side effects that certain fault tolerant solutions may introduce when recovering a failed process. These effects may degrade the system, affecting mainly the overall performance and availability. We introduce RADIC-II, a fault tolerant architecture for message passing based on the RADIC (Redundant Array of Distributed Independent Fault Tolerance Controllers) architecture. RADIC-II preserves, as far as possible, the RADIC features of transparency, decentralization, flexibility and scalability, while incorporating a flexible dynamic redundancy feature that allows some recovery side effects to be mitigated or avoided.
Abstract:
We present existence, uniqueness and continuous dependence results for some kinetic equations motivated by models for the collective behavior of large groups of individuals. Models of this kind have been recently proposed to study the behavior of large groups of animals, such as flocks of birds, swarms, or schools of fish. Our aim is to give a well-posedness theory for general models which possibly include a variety of effects: an interaction through a potential, such as a short-range repulsion and long-range attraction; a velocity-averaging effect where individuals try to adapt their own velocity to that of other individuals in their surroundings; and self-propulsion effects, which take into account effects on one individual that are independent of the others. We develop our theory in a space of measures, using mass transportation distances. As consequences of our theory we show also the convergence of particle systems to their corresponding kinetic equations, and the local-in-time convergence to the hydrodynamic limit for one of the models.
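As an illustration only (a representative member of the class of models described, not the paper's exact equations), such kinetic equations combine an interaction potential U, an alignment kernel H and self-propulsion parameters α, β in the form:

```latex
\partial_t f + v \cdot \nabla_x f
+ \nabla_v \cdot \Big[ \Big( (\alpha - \beta |v|^2)\,v \;-\; \nabla_x U * \rho
\;+\; \iint H(x-y)\,(w - v)\, f(y,w,t)\, \mathrm{d}y\,\mathrm{d}w \Big) f \Big] = 0,
\qquad \rho(x,t) = \int f(x,v,t)\, \mathrm{d}v .
```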
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, however, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept that is capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
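Since backward induction recurs throughout the thesis, a minimal sketch of the procedure on a finite perfect-information game tree may help; the encoding and the example below are hypothetical and unrelated to the thesis's epistemic models.

```python
# Minimal backward-induction sketch on a finite perfect-information game tree.
# Terminal node: a tuple of payoffs (one entry per player).
# Internal node: a list [player_to_move, [child_nodes]].

def backward_induction(node):
    """Return (payoff profile, list of choice indices) selected by backward induction."""
    if isinstance(node, tuple):                    # terminal node
        return node, []
    player, children = node
    solved = [backward_induction(child) for child in children]
    best = max(range(len(solved)), key=lambda i: solved[i][0][player])
    payoffs, path = solved[best]
    return payoffs, [best] + path

# Two-player example: player 0 moves first, then player 1 moves.
game = [0, [
    [1, [(2, 1), (0, 0)]],     # left subgame
    [1, [(3, 0), (1, 2)]],     # right subgame
]]
print(backward_induction(game))    # -> ((2, 1), [0, 0])
```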