12 results for agent based model
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The hierarchical organisation of biological systems plays a crucial role in the patterns of gene expression that result from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the peculiar emergent structures of the individual phenotype. Being able to reproduce the system's dynamics at the different levels of such a hierarchy can be very useful for studying this complex phenomenon of self-organisation. The idea is to model the phenomenon as a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis reviews the different approaches already developed for modelling problems in developmental biology, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capability to tackle multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. It is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions that addresses molecule transfer, (ii) a logic-oriented language for flexibly specifying complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning, the simulators are supplied with a module for parameter optimisation.
The task is defined as an optimisation problem over the parameter space, in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The model's goal is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data, with spatial and temporal resolution, acquired from free online sources.
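The simulation engine mentioned in this abstract builds on Gillespie's direct method. A minimal sketch of that algorithm in Python follows (a generic illustration, not the MS-BioNET engine; the degradation reaction and its rate constant are invented for the example):

```python
import math
import random

def gillespie_direct(x, reactions, t_end, seed=0):
    """One trajectory of Gillespie's direct method.

    x         : dict species -> integer count
    reactions : list of (propensity_fn, update) pairs; update maps
                species to the count change applied when the reaction fires
    """
    rng = random.Random(seed)
    t, history = 0.0, [(0.0, dict(x))]
    while t < t_end:
        props = [fn(x) for fn, _ in reactions]
        a0 = sum(props)
        if a0 == 0.0:                             # nothing can fire any more
            break
        t += -math.log(1.0 - rng.random()) / a0   # exponential waiting time
        r, acc = rng.random() * a0, 0.0
        for (fn, update), a in zip(reactions, props):
            acc += a
            if r < acc:                           # chosen with probability a/a0
                for species, delta in update.items():
                    x[species] += delta
                break
        history.append((t, dict(x)))
    return history

# Toy first-order degradation A -> (nothing) with propensity k * #A
k = 0.5                                           # invented rate constant
traj = gillespie_direct({"A": 100}, [(lambda s: k * s["A"], {"A": -1})], 50.0)
```

At each step the method draws an exponentially distributed waiting time from the total propensity and then selects the next reaction with probability proportional to its individual propensity.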
Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, from the language for specifying them to the various aspects of their verification. Computational Logic provides models, languages and tools that can be effectively adopted to address these issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that makes it possible to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions addressing (1) the verification of protocol properties (the g-SCIFF framework) and (2) the a-priori conformance verification of peers w.r.t. a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation.
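The idea of testing the conformance of a given interaction w.r.t. a protocol can be illustrated with a toy expectation-based checker (a loose sketch inspired by the notion of expectations, not the SCIFF proof procedure; the query/answer protocol and the event encoding are invented):

```python
def check_conformance(trace, protocol):
    """Expectation-style conformance check of an interaction trace.

    trace    : ordered list of (event_type, content) tuples
    protocol : dict mapping a triggering event type to the event type
               expected to occur later with the same content
    """
    pending = []                               # unfulfilled expectations
    for etype, content in trace:
        for j, (exp_type, exp_content) in enumerate(pending):
            if exp_type == etype and exp_content == content:
                del pending[j]                 # expectation fulfilled
                break
        if etype in protocol:                  # event raises a new expectation
            pending.append((protocol[etype], content))
    return not pending                         # conformant iff none pending

# Invented query/answer protocol: every request expects a matching answer
proto = {"request": "answer"}
ok = check_conformance([("request", "q1"), ("answer", "q1")], proto)
bad = check_conformance(
    [("request", "q1"), ("request", "q2"), ("answer", "q1")], proto)
```

Here `ok` holds because the single request is answered, while `bad` fails because the expectation raised by the second request is never fulfilled.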
Abstract:
Reasoning under uncertainty is a human capacity that is necessary, and often hidden, in software systems. Argumentation theory and logic make non-monotonic information explicit in order to enable automatic forms of reasoning under uncertainty. In human organisations, Distributed Cognition and Activity Theory explain how artifacts are fundamental in all cognitive processes. In this thesis we therefore seek to understand the use of cognitive artifacts in a new argumentation framework for an agent-based artificial society.
Abstract:
The aim of the thesis is to formulate a suitable Item Response Theory (IRT) based model to measure HRQoL (as a latent variable) using a mixed-responses questionnaire and relaxing the hypothesis of a normally distributed latent variable. The new model is a combination of two models already presented in the literature: a latent trait model for mixed responses and an IRT model for a Skew Normal latent variable. It is developed in a Bayesian framework; a Markov chain Monte Carlo procedure is used to generate samples from the posterior distribution of the parameters of interest. The proposed model is tested on a questionnaire composed of five discrete items and one continuous item for measuring HRQoL in children, the EQ-5D-Y questionnaire, using a large sample of children collected in schools. In comparison with a model for only discrete responses and a model for mixed responses with a normal latent variable, the new model performs better in terms of deviance information criterion (DIC), chain convergence times and precision of the estimates.
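The Markov chain Monte Carlo machinery referred to here can be sketched with a random-walk Metropolis sampler on a toy one-parameter IRT-style model (a generic illustration, not the thesis' mixed-responses Skew Normal model; the item difficulties, responses and prior below are invented):

```python
import math
import random

def metropolis(log_post, start, n_iter=5000, step=0.3, seed=1):
    """Random-walk Metropolis sampler over a scalar parameter."""
    rng = random.Random(seed)
    theta, samples = start, []
    lp = log_post(theta)
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)      # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random() + 1e-300) < lp_prop - lp:
            theta, lp = prop, lp_prop            # accept the move
        samples.append(theta)
    return samples

# Toy Rasch-style model: one latent ability, known item difficulties
difficulties = [-1.0, 0.0, 1.0]
responses = [1, 1, 0]                            # invented answers to 3 items

def log_posterior(theta):
    lp = -0.5 * theta ** 2                       # standard normal prior
    for b, y in zip(difficulties, responses):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        lp += math.log(p if y else 1.0 - p)      # Bernoulli likelihood
    return lp

draws = metropolis(log_posterior, start=0.0)
estimate = sum(draws[1000:]) / len(draws[1000:])  # posterior mean, post burn-in
```

The same scheme extends to vector-valued parameters and to skewed priors; only the `log_posterior` function changes.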
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed. This trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of software implementation artefacts. In the system we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent-based execution platform specialised for biomedical laboratories.
Abstract:
Mainstream hardware is becoming parallel, heterogeneous, and distributed: on every desk, in every home, and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution, and interaction, pushed by the evolution of hardware architectures and the growth of network availability. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers face in everyday programming. A convergence is recognisable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, the dissertation first constructs the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focuses on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. Then, we shift the perspective from the development of intelligent software systems toward general-purpose software development.
Using the expertise matured during the background phase, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development while providing an agent-oriented level of abstraction for the engineering of general-purpose software systems.
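The actor paradigm discussed above, the starting point for agent-oriented programming, reduces at its core to a mailbox plus a message-processing loop, so that an actor's state is only ever touched by its own thread of control. A minimal sketch in Python (illustrative only; simpAL and real actor frameworks provide far richer abstractions):

```python
import queue
import threading

class Actor:
    """Minimal actor: a mailbox plus a thread that processes one message
    at a time, so the actor's state is never accessed concurrently."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):                 # asynchronous, non-blocking send
        self._mailbox.put(msg)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:              # poison pill terminates the actor
                break
            self.receive(msg)

    def receive(self, msg):              # override with the actor's behaviour
        pass

    def stop(self):                      # drain the mailbox, then join
        self._mailbox.put(None)
        self._thread.join()

class Counter(Actor):
    def __init__(self):
        self.count = 0
        super().__init__()

    def receive(self, msg):
        self.count += msg

c = Counter()
for _ in range(100):
    c.send(1)
c.stop()
```

Because the mailbox is FIFO and messages are handled sequentially, no lock is needed around `count` even though `send` is called from another thread.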
Abstract:
In Italy, the process of de-institutionalisation and the implementation of mental health care models have been characterised by a lack of evaluation. In particular, no initiatives have been undertaken to monitor the activities related to the care of patients with psychiatric disorders. The objective of this thesis is therefore to carry out a comparative evaluation of mental health care pathways in the Departments of Mental Health and Pathological Addictions of the Emilia-Romagna region, using indicators derived from current administrative data flows. The data needed to construct the indicators were obtained through a data linkage of the regional administrative flows of hospital discharge records, of the territorial activities of the Mental Health Centres, and of pharmaceutical prescriptions, with reference to the year 2010. The indicators were computed for all patients with a primary psychiatric diagnosis and then broken down by diagnostic category according to the ICD9-CM. The set of indicators examined includes treated prevalence and incidence rates of mental disorders, hospitalisation rates, re-hospitalisation at 7 and 30 days after discharge from psychiatric wards, hospital-community continuity of care, adherence to treatments, and drug consumption and prescriptive appropriateness. Some problems emerged in reconstructing hospital-community continuity of care, along with some limitations of the indicators related to drug prescriptions. The calculation of indicators based on current administrative flows proves feasible, albeit with limitations related to the quality, completeness and accuracy of the available data.
The large-scale (regional and national) and regular implementation of these indicators may be an opportunity to set up a system of surveillance, monitoring and evaluation of psychiatric care in the DSMs.
Abstract:
Over the past decades, several countries have acquired large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has increased dramatically, proving how appropriate those systems are for large-scale, efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that, in the presence of multiple datasets, both AEM and ground data, proper processing, inversion, post-processing, data integration and data calibration constitute the right approach for providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often own large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of the integration of several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can further be used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modelling, comparing it with having only a ground-based TEM dataset and/or only borehole data.
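The voxel-based volume estimation mentioned above amounts to counting cells whose inverted resistivity falls within the range associated with aquifer material, then multiplying by the cell volume. A minimal sketch (the resistivity window, cell size and grid values are invented, not taken from the Spiritwood study):

```python
def aquifer_volume(voxels, cell_volume_m3, rho_min=30.0, rho_max=150.0):
    """Estimate aquifer volume by counting voxels whose resistivity
    (ohm-m) falls inside the window assumed to indicate water-bearing
    sediments; thresholds are illustrative placeholders."""
    n = sum(1 for rho in voxels if rho_min <= rho <= rho_max)
    return n * cell_volume_m3

# Toy flattened resistivity grid (ohm-m) and a 25 m x 25 m x 5 m cell
grid = [12.0, 45.0, 80.0, 200.0, 95.0, 20.0]
vol = aquifer_volume(grid, cell_volume_m3=25.0 * 25.0 * 5.0)
```

In a real workflow the resistivity window would be calibrated against borehole lithology logs before being applied to the inverted model.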
Abstract:
This work focuses on a legal analysis of the model of networked cooperation between the national authorities of the Member States within the framework of the LSG Area (Freedom, Security and Justice), with the aim of assessing its contribution, prospects and potential. The discussion is divided into two parts, preceded by a brief theoretical premise centred on the analysis of the notion of a network and its legal significance. The first part reconstructs the path along which networked cooperation has matured, highlighting both the circumstantial factors and the legal and structural factors underlying the networking process in the justice and security sectors. In particular, some critical remarks are developed concerning the operation of the legal instruments implementing the principle of mutual recognition and of those applying the principle of availability of information. The aim is to highlight the obstacles that frequently prevent the successful outcome of cooperation procedures, and to understand the potential and the critical aspects deriving from the use of networks in the concrete application of such procedures. The second part focuses on the analysis of the main networks active in the field of justice and security, with particular attention to their respective operating mechanisms. It is divided into two distinct sections, concentrating on (a) networks operating in support of the application of judicial assistance procedures and mutual recognition instruments, and (b) networks operating in the field of information cooperation, which facilitate the exchange of operational and technical information in crime prevention and law-enforcement actions, especially in the protection of the licit economy.
The discussion concludes by reconstructing the characteristics of a European network model and the role it plays with respect to the exercise of the European Union's competences in the field of justice and security.
Abstract:
Persons affected by Down Syndrome show a heterogeneous phenotype that includes developmental defects and cognitive and haematological disorders. Premature, accelerated aging and the consequent development of age-associated diseases such as Alzheimer's Disease (AD) seem to be the cause of the higher mortality of DS persons late in life. Down Syndrome is caused by the complete or partial trisomy of chromosome 21, but it is not clear whether the molecular alterations of the disease are triggered by the specific functions of a limited number of genes on chromosome 21 or by the disruption of genetic homeostasis due to the presence of a trisomic chromosome. As epigenomic studies can help shed light on this issue, here we used the Infinium HumanMethylation450 BeadChip to analyse the blood DNA methylation patterns of 29 persons affected by Down syndrome (DSP), using their healthy siblings (DSS) and mothers (DSM) as controls. In this way we obtained a family-based model that allowed us to monitor possible confounding effects on DNA methylation patterns deriving from genetic and environmental factors. We showed that defects in DNA methylation map to genes involved in developmental, neurological and haematological pathways. These genes are enriched on chromosome 21 but also localise in the rest of the genome, suggesting that the trisomy of specific genes on chromosome 21 induces a cascade of events that engages many genes on other chromosomes and results in a global alteration of genomic function. We also analysed the methylation status of three target regions localised at the promoter (Ribo) and at the 5' sequences of the 18S and 28S regions of the rDNA, identifying differentially methylated CpG sites. In conclusion, we identified an epigenetic signature of Down Syndrome in blood cells that sustains a link between developmental defects and the disease phenotype, including segmental premature aging.
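The family-based design described here supports a paired, per-CpG comparison of methylation levels between each proband and the sibling of the same family, which removes much of the shared genetic and environmental variation. A minimal sketch of such a per-site statistic (illustrative only; the abstract does not detail the actual analysis pipeline, and the beta values below are invented):

```python
import math

def paired_delta_beta(dsp, dss):
    """Mean methylation difference at one CpG between Down syndrome
    probands (DSP) and their paired siblings (DSS), with a paired
    t statistic; values are beta values (fraction methylated)."""
    n = len(dsp)
    diffs = [p - s for p, s in zip(dsp, dss)]
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    t = mean / math.sqrt(var / n)                         # paired t statistic
    return mean, t

# Invented beta values for one CpG across 5 proband/sibling pairs
probands = [0.62, 0.58, 0.65, 0.60, 0.63]
siblings = [0.50, 0.49, 0.55, 0.48, 0.52]
delta, t_stat = paired_delta_beta(probands, siblings)
```

Genome-wide, such a statistic would be computed for each of the ~450,000 probed CpGs and the resulting p-values corrected for multiple testing.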
Abstract:
Forest models are tools for explaining and predicting the dynamics of forest ecosystems. They simulate forest behaviour by integrating information on the underlying processes in trees, soil and atmosphere. Bayesian calibration is the application of probability theory to parameter estimation. It is a method, applicable to all models, that quantifies output uncertainty and identifies key parameters and variables. This study aims at testing the Bayesian calibration procedure on different types of forest models, to evaluate their performance and the uncertainties associated with them. In particular, we aimed at 1) applying a Bayesian framework to calibrate forest models and testing their performance in different biomes and different environmental conditions, 2) identifying and solving structure-related issues in simple models, and 3) identifying the advantages of the additional information made available when calibrating forest models with a Bayesian approach. In Chapter 2 we applied the Bayesian framework to calibrate the Prelued model on eight Italian eddy-covariance sites. The ability of Prelued to reproduce the estimated Gross Primary Productivity (GPP) was tested over contrasting natural vegetation types representing a wide range of climatic and environmental conditions. The issues related to Prelued's multiplicative structure were the main topic of Chapter 3: several different MCMC-based procedures were applied within a Bayesian framework to calibrate the model, and their performances were compared. A more complex model was applied in Chapter 4, which focuses on the application of the physiology-based model HYDRALL to the forest ecosystem of Lavarone (IT), to evaluate the importance of additional information in the calibration procedure and its impact on model performance, model uncertainties, and parameter estimation.
Overall, the Bayesian technique proved to be an excellent and versatile tool for successfully calibrating forest models of different structure and complexity, on different kinds and numbers of variables, and with different numbers of parameters involved.
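Bayesian calibration, as described above, treats the model parameters as random variables and samples their posterior distribution given the observations. A minimal sketch using a toy light-use-efficiency GPP model calibrated by random-walk Metropolis (the model, data, prior and error term are all invented; this is not Prelued or HYDRALL):

```python
import math
import random

# Toy light-use-efficiency model: GPP = eps * PAR, with eps the parameter
# to calibrate against (synthetic) flux observations
par = [5.0, 10.0, 15.0, 20.0]                  # light driver values
gpp_obs = [2.6, 5.1, 7.4, 10.2]                # synthetic observations
sigma = 0.3                                    # assumed observation error

def log_posterior(eps):
    if not 0.0 < eps < 2.0:                    # uniform prior on (0, 2)
        return float("-inf")
    sse = sum((eps * p - y) ** 2 for p, y in zip(par, gpp_obs))
    return -sse / (2.0 * sigma ** 2)           # Gaussian log-likelihood

rng = random.Random(42)
eps, lp, draws = 1.0, log_posterior(1.0), []
for _ in range(4000):                          # random-walk Metropolis
    prop = eps + rng.gauss(0.0, 0.05)
    lp_prop = log_posterior(prop)
    if math.log(rng.random() + 1e-300) < lp_prop - lp:
        eps, lp = prop, lp_prop                # accept the proposal
    draws.append(eps)
post_mean = sum(draws[1000:]) / len(draws[1000:])   # mean after burn-in
```

The spread of `draws` after burn-in directly quantifies the parameter uncertainty that propagates into the model's output uncertainty.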
Abstract:
This comprehensive study explores the intricate world of 3D printing, with a focus on Fused Deposition Modelling (FDM). It sheds light on the critical factors that influence the quality and mechanical properties of 3D-printed objects. Using an optical microscope at 40X magnification, the shapes of the printed beads are correlated with specific slicing parameters, resulting in a 2D parametric model. This mathematical model, derived from real samples, serves as a tool to predict general mechanical behaviour, bridging the gap between theory and practice in FDM printing. The study begins by emphasising the influence of geometric parameters such as layer height, line width and filament tolerance on the final printed bead geometry, and the resulting theoretical effect on mechanical properties. The introduction of the VPratio parameter (the ratio between the area of the voids and the area occupied by printed material) allows quantifying how variations in the geometric slicing parameters improve or degrade the mechanical properties. The study also addresses the effect of overhangs and the role of filament diameter tolerances. The research continues with the introduction of 3D FEM (Finite Element Method) models based on the RVE (Representative Volume Element), to verify the results obtained from the 2D model and to analyse other aspects that affect the mechanical properties and are not directly observable with the 2D model. The study also proposes a model for the examination of 3D-printed infill structures, introducing an innovative methodology called "double RVE", which speeds up the calculation of mechanical properties and is more computationally efficient. Finally, the limitations of the RVE model are shown, and a so-called hybrid RVE-based model is created to overcome the limitations and inaccuracy of the conventional RVE model and homogenisation procedure on some printed geometries.
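The VPratio defined in this abstract (void area over printed-material area) can be computed for an idealised bead geometry. A sketch under the simplifying assumption that the bead cross-section is an ellipse inscribed in a line_width × layer_height unit cell (the thesis derives the actual bead shape from microscope measurements, so real values will differ):

```python
import math

def vp_ratio(line_width, layer_height):
    """VPratio sketch: void-to-material area ratio for one unit cell,
    approximating the bead cross-section as an ellipse inscribed in a
    line_width x layer_height rectangle (a simplified geometry)."""
    cell = line_width * layer_height              # unit cell area
    bead = math.pi * line_width * layer_height / 4.0   # ellipse area
    return (cell - bead) / bead

r = vp_ratio(line_width=0.4, layer_height=0.2)
# Under this idealisation the ratio is scale-free: (1 - pi/4)/(pi/4) ≈ 0.273
```

Real beads flatten and fuse with their neighbours, so the measured VPratio depends on the actual bead profile; that dependence is exactly what the thesis' 2D parametric model captures.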