882 results for Dynamic Modelling And Simulation
Abstract:
We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling as implemented by BUGS to fit subject-specific Bayesian generalized linear models, where the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%) even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years, who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
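For reference, the ACE decomposition that both fitting approaches target can be written in standard twin-model notation (a generic sketch, not taken from the paper itself):

    % phenotypic variance split into additive genetic (A), common (C) and unique (E) environment
    \sigma^2_P = \sigma^2_A + \sigma^2_C + \sigma^2_E, \qquad h^2 = \frac{\sigma^2_A}{\sigma^2_P}
    % implied twin correlations used to identify the components:
    r_{MZ} = a^2 + c^2, \qquad r_{DZ} = \tfrac{1}{2}a^2 + c^2

where a^2 and c^2 are the standardised additive genetic and common environmental proportions of variance; for binary and ordinal phenotypes these act on an underlying liability scale.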
Abstract:
Earthquakes and tsunamis along Morocco's coasts have been reported since historical times, so the threat posed by tsunamis must be included in coastal risk studies. This study focuses on the tsunami impact and vulnerability assessment of the Casablanca harbour and surrounding area using a combination of tsunami inundation numerical modelling, field survey data and a geographic information system. The tsunami scenario used here is compatible with the 1755 Lisbon event, which we consider to be the worst-case tsunami scenario. Hydrodynamic modelling was performed with an adapted version of the Cornell Multigrid Coupled Tsunami Model from Cornell University. The simulation covers the eastern domain of the Azores-Gibraltar fracture zone, corresponding to the largest tsunamigenic area in the North Atlantic. The proposed vulnerability model attempts to provide an insight into the tsunami vulnerability of the building stock. The results, in the form of a vulnerability map, will be useful for decision makers and local authorities in strengthening community resilience to tsunami hazards.
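As background (a generic form, not the paper's own setup), tsunami propagation codes of the COMCOT family are built on the shallow water equations; in their linear, Cartesian form these read

    \frac{\partial \eta}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0, \qquad
    \frac{\partial u}{\partial t} = -g \frac{\partial \eta}{\partial x}, \qquad
    \frac{\partial v}{\partial t} = -g \frac{\partial \eta}{\partial y}

where \eta is the free-surface elevation, h the still-water depth, (u, v) the depth-averaged velocities and g the gravitational acceleration; nonlinear and friction terms are added in shallow coastal waters for inundation modelling.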
Abstract:
The current study focuses on the analysis of pressure surge damping in single pipeline systems generated by a fast change of flow conditions. A dimensionless form of the pressurised transient flow equations was developed, presenting the main advantage of being independent of the system characteristics. In the absence of flow velocity profiles, the unsteady friction in turbulent regimes is analysed based on two new empirical corrective coefficients associated with the local and convective acceleration terms. A new surge damping approach is also presented, taking into account the pressure peak time variation. The observed attenuation effect in the pressure wave for highly deformable pipe materials can be described by a combination of the non-elastic behaviour of the pipe wall with steady and unsteady friction effects. Several simulations and experimental tests have been carried out in order to analyse the dynamic response of single pipelines with different characteristics, such as pipe material, diameter, thickness, length and transient conditions.
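For context, a sketch of the classical one-dimensional pressurised transient (water-hammer) equations with an unsteady friction head loss carrying separate coefficients on the local and convective acceleration terms, in the spirit of the two corrective coefficients described above (standard notation, not the paper's dimensionless form; k_t and k_x are illustrative symbols):

    \frac{\partial H}{\partial t} + \frac{a^2}{gA}\frac{\partial Q}{\partial x} = 0
    \frac{\partial H}{\partial x} + \frac{1}{gA}\frac{\partial Q}{\partial t} + h_{fs} + h_{fu} = 0
    h_{fs} = \frac{f\,Q|Q|}{2gDA^2}, \qquad
    h_{fu} = \frac{1}{g}\left(k_t\,\frac{\partial V}{\partial t} + k_x\, a\,\mathrm{sign}(V)\,\Big|\frac{\partial V}{\partial x}\Big|\right)

where H is the piezometric head, Q the discharge, V the mean velocity, a the pressure wave speed, A the cross-sectional area, D the pipe diameter and f the Darcy-Weisbach friction factor.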
Abstract:
Thesis submitted to the Faculty of Sciences and Technology, New University of Lisbon, for the degree of Doctor of Philosophy in Environmental Sciences
Abstract:
This paper analyses forest fires from the perspective of dynamical systems. Forest fires exhibit complex correlations in size, space and time, revealing features often present in complex systems, such as the absence of a characteristic length-scale or the emergence of long-range correlations and persistent memory. This study addresses a public-domain forest fire catalogue containing information on events in Portugal during the period from 1980 up to 2012. The data are analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses with amplitude proportional to the burnt area. First, we use mutual information to correlate annual patterns, and visualization trees, generated by hierarchical clustering algorithms, to compare the data and extract relationships among them. Second, we adopt the Multidimensional Scaling (MDS) visualization tool. MDS generates maps where each object corresponds to a point, and objects perceived to be similar to each other are placed on the map forming clusters. The results are analysed in order to extract relationships among the data and to identify forest fire patterns.
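A minimal sketch, in Python, of the kind of pipeline described above: annual burnt-area sequences are compared pairwise with a histogram-based mutual information estimate, the resulting dissimilarity matrix feeds a hierarchical clustering tree, and the same matrix is embedded with MDS. Variable names, binning and linkage choices are illustrative assumptions, not the paper's code.

    import numpy as np
    from scipy.cluster.hierarchy import linkage
    from sklearn.manifold import MDS

    def mutual_information(x, y, bins=16):
        """Histogram-based mutual information estimate between two annual series."""
        counts, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = counts / counts.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    def analyse(years):
        """years: dict mapping year -> sampled burnt-area impulse sequence for that year."""
        labels = sorted(years)
        n = len(labels)
        mi = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                mi[i, j] = mutual_information(years[labels[i]], years[labels[j]])
        d = mi.max() - mi                        # turn similarity into dissimilarity
        np.fill_diagonal(d, 0.0)
        tree = linkage(d[np.triu_indices(n, k=1)], method='average')   # hierarchical tree
        coords = MDS(n_components=2, dissimilarity='precomputed',
                     random_state=0).fit_transform(d)                  # 2-D MDS map
        return tree, coords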
Abstract:
This paper presents the most recent developments of the Simulator of Intelligent Transportation Systems (SITS). SITS is based on a microscopic simulation approach to reproduce real traffic conditions in an urban or non-urban network. In order to assess the quality of the microscopic traffic simulator SITS, a benchmark test was performed. A dynamical analysis of several traffic phenomena, applying a new modelling formalism based on the embedding of statistics and the Laplace transform, is then addressed. The paper also presents a new traffic control concept applied to a freeway traffic system.
Abstract:
This work presents a newly developed test setup for dynamic out-of-plane loading using underwater Blast Wave Generators (WBWG) as the loading source. Underwater blasting has been, during the last decades, the subject of research and development covering maritime blasting operations (including torpedo studies), aquarium tests for the measurement of the blasting energy of industrial explosives, and confined underwater blast wave generators. WBWG allow a wide range of produced blast impulses and surface area distributions; they also avoid the generation of high-velocity fragments and reduce the atmospheric sound wave. A first objective of this work is to study the behavior of masonry infill walls subjected to blast loading. Three different masonry walls are to be studied, namely unreinforced masonry infill walls and two different reinforcement solutions; these solutions have been studied previously for seismic action mitigation. Subsequently, the walls will be simulated using an explicit finite element code for validation and parametric studies. Finally, a tool to help designers make informed decisions on the use of infills under blast loading will be presented.
Abstract:
Nessie is an Autonomous Underwater Vehicle (AUV) created by a team of students at Heriot-Watt University to compete in the Student Autonomous Underwater Competition, Europe (SAUC-E) in August 2006. The main objective of the project is to derive the dynamic equations of the robot, i.e. its dynamic model. With it, the behaviour of the robot is easier to understand, and movement tests can be run on a computer without needing the physical robot, which saves time, batteries and money, and protects the robot from water ingress. The objective of the second part of this project is to design a control system for Nessie using the model.
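For reference, underwater vehicle dynamic models of this kind are usually written in the standard rigid-body (Fossen-type) form below; whether Nessie's model follows exactly this structure is an assumption, since the abstract does not give the equations:

    M\,\dot{\nu} + C(\nu)\,\nu + D(\nu)\,\nu + g(\eta) = \tau, \qquad \dot{\eta} = J(\eta)\,\nu

where \nu is the body-fixed velocity vector, \eta the position and orientation, M the inertia matrix (including added mass), C(\nu) the Coriolis and centripetal matrix, D(\nu) the hydrodynamic damping matrix, g(\eta) the restoring forces and moments, \tau the thruster forces, and J(\eta) the kinematic transformation. Once identified, such a model can be integrated numerically to run the movement tests mentioned above without the physical vehicle.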
Abstract:
Background: To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
Abstract:
The proportion of the population living in or around cities is higher than ever. Urban sprawl and car dependence have taken over the pedestrian-friendly compact city. Environmental problems like air pollution, land waste or noise, and health problems are the result of this still ongoing process. Urban planners have to find solutions to these complex problems and, at the same time, ensure the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is acquired. In order to get a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues on the importance of scale in urban analysis and on the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scale laws are used for characterising urban clusters. In a final section, population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
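For context, the well-established gravity model referred to above is conventionally written as (standard form; the variant used in the thesis may differ):

    T_{ij} = k \, \frac{P_i\, P_j}{d_{ij}^{\beta}}

where T_{ij} is the interaction or flow between places i and j, P_i and P_j their populations (or masses), d_{ij} the distance between them, \beta a distance-decay exponent and k a scaling constant.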
Abstract:
The subject of this research is forecasting the capacity requirements of the Fenix information system developed by TietoEnator Oy. The goal of the work is to become familiar with the different subsystems of the Fenix system, to find a way to separate and model the load each subsystem places on the system, and to identify, in a preliminary way, which parameters affect the load generated by those subsystems. Part of this work is to examine different options for simulation and to assess the suitability of each option for modelling complex systems. Based on the collected information, a simulation model describing the load on the system's data warehouse is built. Using information obtained from the model and measurements from the production system, the model is refined to correspond ever more closely to the behaviour of the real system. The model is examined, for example, in terms of the simulated system load and queue behaviour. From the production system, changes in the behaviour of different load sources are measured, for example in relation to the number of users and the time of day. The results of this work are intended to serve as a basis for later follow-up research, in which the parametrisation of the subsystems will be further refined, the model's ability to describe the real system will be improved, and the scope of the model will be expanded.
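A minimal sketch, in Python, of the kind of discrete-event queueing simulation described above for examining simulated system load and queue behaviour; the single-server structure and the arrival and service rates are illustrative placeholders, not measured Fenix parameters.

    import random

    def simulate_queue(arrival_rate, service_rate, horizon, seed=0):
        """Single-server queue; returns time-averaged queue length and server utilisation."""
        rng = random.Random(seed)
        t = 0.0
        next_arrival = rng.expovariate(arrival_rate)
        next_departure = float('inf')
        queue = 0            # jobs waiting or in service
        area_q = 0.0         # time integral of the queue length
        busy_time = 0.0
        while t < horizon:
            t_next = min(next_arrival, next_departure, horizon)
            area_q += queue * (t_next - t)
            if queue > 0:
                busy_time += t_next - t
            t = t_next
            if t >= horizon:
                break
            if next_arrival <= next_departure:          # arrival event
                queue += 1
                if queue == 1:                          # server was idle: start service
                    next_departure = t + rng.expovariate(service_rate)
                next_arrival = t + rng.expovariate(arrival_rate)
            else:                                       # departure event
                queue -= 1
                next_departure = (t + rng.expovariate(service_rate)
                                  if queue > 0 else float('inf'))
        return area_q / horizon, busy_time / horizon

    # Example run at roughly 80% utilisation; real rates would come from production measurements.
    avg_queue_len, utilisation = simulate_queue(arrival_rate=0.8, service_rate=1.0, horizon=100_000.0)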
Abstract:
Effective control and limiting of carbon dioxide (CO₂) emissions in energy production are major challenges of science today. Current research activities include the development of new low-cost carbon capture technologies, and among the proposed concepts, chemical looping combustion (CLC) and chemical looping with oxygen uncoupling (CLOU) have attracted significant attention, allowing intrinsic separation of pure CO₂ from a hydrocarbon fuel combustion process with a comparatively small energy penalty. Both CLC and CLOU utilize the well-established fluidized bed technology, but several technical challenges need to be overcome in order to commercialize the processes. Therefore, the development of proper modelling and simulation tools is essential for the design, optimization, and scale-up of chemical looping-based combustion systems. The main objective of this work was to analyze the technological feasibility of CLC and CLOU processes at different scales using a computational modelling approach. A one-dimensional fluidized bed model frame was constructed and applied to simulations of CLC and CLOU systems consisting of interconnected fluidized bed reactors. The model is based on the conservation of mass and energy, and semi-empirical correlations are used to describe the hydrodynamics, chemical reactions, and transfer of heat in the reactors. Another objective was to evaluate the viability of chemical looping-based energy production, and a flow sheet model representing a CLC-integrated steam power plant was developed. The 1D model frame was successfully validated based on the operation of a 150 kWth laboratory-sized CLC unit fed by methane. By following certain scale-up criteria, a conceptual design for a CLC reactor system at a pre-commercial scale of 100 MWth was created, after which the validated model was used to predict the performance of the system. As a result, further understanding of the parameters affecting the operation of a large-scale CLC process was acquired, which will be useful for practical design work in the future. The integration of the reactor system and the steam turbine cycle for power production was studied, resulting in a suggested plant layout including a CLC boiler system, a simple heat recovery setup, and an integrated steam cycle with a three-pressure-level steam turbine. Possible operational regions of a CLOU reactor system fed by bituminous coal were determined via mass, energy, and exergy balance analysis. Finally, the 1D fluidized bed model was adapted for CLOU, and the performance of a hypothetical 500 MWth CLOU fuel reactor was evaluated through extensive case simulations.
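As background (a generic form, not the thesis' exact model equations), semi-empirical one-dimensional fluidized bed models of this type are typically built from vertically discretised balance equations such as

    \frac{\partial(\rho\, w_i)}{\partial t} + \frac{\partial(\rho\, u\, w_i)}{\partial z} = S_i, \qquad
    \frac{\partial(\rho\, c_p T)}{\partial t} + \frac{\partial(\rho\, u\, c_p T)}{\partial z} = q_{\mathrm{chem}} + q_{\mathrm{ht}}

where w_i is the mass fraction of species i, u the axial velocity, S_i the net chemical source term, and q_chem and q_ht the heat released by the oxygen-carrier reactions and the heat exchanged with surfaces; the hydrodynamics (solids distribution, gas-solid mixing) enter through the semi-empirical correlations mentioned above.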
Abstract:
This thesis, entitled "Modelling and Analysis of Recurrent Event Data with Multiple Causes", deals with survival data, a term used for data that measure the time to the occurrence of an event. In survival studies, the time to occurrence of an event is generally referred to as the lifetime. Recurrent event data are commonly encountered in longitudinal studies when individuals are followed to observe the repeated occurrences of certain events. In many practical situations, individuals under study are exposed to failure due to more than one cause, and the eventual failure can be attributed to exactly one of these causes. The proposed model is useful in real-life situations for studying the effect of covariates on the recurrence of certain events due to different causes. In Chapter 3, an additive hazards model for the gap time distributions of recurrent event data with multiple causes is introduced, and the parameter estimation and asymptotic properties are discussed. In Chapter 4, a shared frailty model for the analysis of bivariate competing risks data is presented, and estimation procedures for the shared gamma frailty model, without covariates and with covariates, using the EM algorithm are discussed. In Chapter 6, two nonparametric estimators for the bivariate survivor function of paired recurrent event data are developed and their asymptotic properties are studied. The proposed estimators are applied to a real-life data set, and simulation studies are carried out to assess their efficiency.
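For reference, an additive hazards structure of the kind introduced in Chapter 3 is conventionally written, in the Aalen/Lin-Ying style, as

    \lambda_{k}(t \mid Z) = \lambda_{0k}(t) + \beta_k^{\top} Z(t)

where \lambda_k(t \mid Z) is the hazard of a recurrence due to cause k at gap time t, \lambda_{0k}(t) an unspecified baseline hazard, Z(t) the covariate vector and \beta_k the cause-specific regression effects; the exact gap-time, multiple-cause formulation used in the thesis may differ in detail.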
Abstract:
Applications such as neuroscience, telecommunication, online social networking, transport and retail trading give rise to connectivity patterns that change over time. In this work, we address the resulting need for network models and computational algorithms that deal with dynamic links. We introduce a new class of evolving range-dependent random graphs that gives a tractable framework for modelling and simulation. We develop a spectral algorithm for calibrating a set of edge ranges from a sequence of network snapshots and give a proof of principle illustration on some neuroscience data. We also show how the model can be used computationally and analytically to investigate the scenario where an evolutionary process, such as an epidemic, takes place on an evolving network. This allows us to study the cumulative effect of two distinct types of dynamics.
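A minimal sketch, in Python, of a single range-dependent random graph snapshot in which the probability of an edge decays geometrically with the index distance |i - j|; the evolving, calibrated model in the paper adds appearance and disappearance dynamics for edges on top of such snapshots, and the parameter names here are illustrative assumptions.

    import numpy as np

    def range_dependent_graph(n, alpha=0.9, lam=0.8, seed=None):
        """Sample an adjacency matrix with P(edge i~j) = alpha * lam**(|i - j| - 1)."""
        rng = np.random.default_rng(seed)
        A = np.zeros((n, n), dtype=int)
        for i in range(n):
            for j in range(i + 1, n):
                p = alpha * lam ** (abs(i - j) - 1)   # long-range edges are rare
                if rng.random() < p:
                    A[i, j] = A[j, i] = 1
        return A

    # One snapshot; an evolving network would update edges over discrete time steps,
    # with birth/death rates that themselves depend on the range |i - j|.
    A = range_dependent_graph(100, seed=0)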