17 results for model driven system, semantic representation, semantic modeling, enterprise system development
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergent structures peculiar to the individual phenotype. Being able to reproduce the system dynamics at different levels of such a hierarchy can be very useful for studying this complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large, dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis reviews the approaches already developed for modelling problems in developmental biology, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. The framework is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning, the simulators are supplied with a module for parameter optimisation. The task is defined as an optimisation problem over the parameter space in which the objective function to be minimised is the distance between the output of the simulator and a target output; the problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised, with the goal of generating the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data with spatial and temporal resolution, acquired from freely available on-line sources.
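The simulation engine mentioned above rests on Gillespie's direct method applied to a network of compartments. The following minimal Python sketch illustrates that stochastic loop for two compartments with intra-compartment degradation and a membrane transfer event; the species, rate constants and reactions are illustrative assumptions, not the MS-BioNET model itself.

    # Minimal sketch of Gillespie's direct method over two compartments.
    # Species, rate constants and reactions are illustrative assumptions.
    import math
    import random

    # state: molecule counts of species A in compartments c1 and c2
    state = {"A_c1": 100, "A_c2": 0}

    # reactions: (rate constant, propensity function, state update)
    reactions = [
        # degradation of A inside compartment c1
        (0.01, lambda s: s["A_c1"], {"A_c1": -1}),
        # transfer of A across the membrane from c1 to c2
        (0.05, lambda s: s["A_c1"], {"A_c1": -1, "A_c2": +1}),
        # degradation of A inside compartment c2
        (0.02, lambda s: s["A_c2"], {"A_c2": -1}),
    ]

    t, t_end = 0.0, 100.0
    while t < t_end:
        propensities = [k * f(state) for k, f, _ in reactions]
        a0 = sum(propensities)
        if a0 == 0:
            break  # no reaction can fire any more
        # time to the next event is exponentially distributed with rate a0
        t += -math.log(1.0 - random.random()) / a0
        # choose the reaction proportionally to its propensity
        r, acc = random.random() * a0, 0.0
        for (k, f, update), a in zip(reactions, propensities):
            acc += a
            if r < acc:
                for species, delta in update.items():
                    state[species] += delta
                break

    print(t, state)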
Abstract:
Many industries and academic institutions share the vision that an appropriate use of information originating from the environment may add value to services in multiple domains and may help humans deal with the growing information overload that often seems to jeopardize our lives. It is also clear that information sharing and mutual understanding between software agents may impact complex processes in which many actors (humans and machines) are involved, leading to relevant socio-economic benefits. Starting from these two inputs, architectural and technological solutions to enable “environment-related cooperative digital services” are explored here. The proposed analysis starts from the consideration that our environment is a physical space in which diversity is a major value. On the other hand, diversity is detrimental to common technological solutions and is an obstacle to mutual understanding: an appropriate environment abstraction and a shared information model are needed to provide the required levels of interoperability in our heterogeneous habitat. This thesis reviews several approaches to supporting environment-related applications and intends to demonstrate that smart-space-based, ontology-driven, information-sharing platforms may become a flexible and powerful solution to support interoperable services in virtually any domain, and even in cross-domain scenarios. It also shows that semantic technologies can be fruitfully applied beyond the representation of application domain knowledge. For example, semantic modeling of Human-Computer Interaction may support interaction interoperability and the transformation of interaction primitives into actions, and the thesis shows how smart-space-based platforms driven by an interaction ontology may enable natural and flexible ways of accessing resources and services, e.g., with gestures. An ontology for computational flow execution has also been built to represent abstract computation, with the goal of exploring new ways of scheduling computation flows with smart-space-based semantic platforms.
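As a rough illustration of how an interaction ontology could let agents map interaction primitives (e.g. gestures) to service actions in a smart space, here is a minimal sketch using Python's rdflib; the namespace, classes and properties are invented for the example and are not the ontology developed in the thesis.

    # Minimal sketch: a gesture-to-action mapping expressed as RDF triples and
    # resolved with a SPARQL query. All names are illustrative assumptions.
    from rdflib import Graph, Namespace, Literal, RDF

    EX = Namespace("http://example.org/interaction#")
    g = Graph()

    # a "swipe right" gesture primitive triggers the "next slide" service action
    g.add((EX.swipeRight, RDF.type, EX.InteractionPrimitive))
    g.add((EX.nextSlide, RDF.type, EX.ServiceAction))
    g.add((EX.swipeRight, EX.triggers, EX.nextSlide))
    g.add((EX.nextSlide, EX.label, Literal("advance presentation")))

    # any agent sharing the ontology can resolve the primitive into an action
    q = """
    SELECT ?action ?label WHERE {
        ?primitive <http://example.org/interaction#triggers> ?action .
        ?action <http://example.org/interaction#label> ?label .
    }
    """
    for action, label in g.query(q):
        print(action, label)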
Abstract:
Personal archives are the archives created by individuals for their own purposes. Among these are the library and documentary collections of writers and scholars. It is only recently that archival literature has begun to focus on this category of archives, emphasising how their heterogeneous nature necessitates the conciliation of different approaches to archival description, and calling for a broader understanding of the principle of provenance, recognising that multiple creators, including subsequent researchers, can contribute to shaping personal archives over time by adding new layers of context. Despite these advances in the theoretical debate, current architectures for archival representation lag behind: finding aids privilege a single point of view and do not allow subsequent users to embed their own, potentially conflicting, readings. Using semantic web technologies, this study aims to define a conceptual model for writers' archives based on existing and widely adopted models in the cultural heritage and humanities domains. The model developed can be used to represent different types of documents at various levels of analysis, as well as record content and components. It also enables the representation of complex relationships and the incorporation of additional layers of interpretation into the finding aid, transforming it from a static search tool into a dynamic research platform. The personal archive and library of Giuseppe Raimondi serves as a case study for the creation of an archival knowledge base using the proposed conceptual model. By querying the knowledge graph through SPARQL, the effectiveness of the model is evaluated. The results demonstrate that the model addresses the primary representation challenges identified in archival literature, from both a technological and a methodological standpoint. The ultimate goal is to bring the output par excellence of archival science, i.e. the finding aid, more in line with the latest developments in archival thinking.
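To give a concrete flavour of the kind of query-based evaluation described, the sketch below models a finding aid as a small knowledge graph in which two agents attach distinct, possibly conflicting, interpretations to the same letter, and retrieves them with SPARQL via rdflib; all names and properties are illustrative and do not reproduce the conceptual model or the Raimondi data.

    # Minimal sketch: two agents record different readings of the same letter,
    # and a SPARQL query returns every reading together with who asserted it.
    # Names and properties are illustrative assumptions.
    from rdflib import Graph, Namespace, Literal, RDF

    EX = Namespace("http://example.org/archive#")
    g = Graph()

    g.add((EX.letter_42, RDF.type, EX.Letter))
    g.add((EX.letter_42, EX.heldIn, EX.example_fonds))

    # two layers of interpretation by different agents
    for ann, agent, note in [
        (EX.ann_1, EX.archivist_A, "draft addressed to a fellow writer"),
        (EX.ann_2, EX.researcher_B, "more plausibly addressed to the publisher"),
    ]:
        g.add((ann, RDF.type, EX.Interpretation))
        g.add((ann, EX.about, EX.letter_42))
        g.add((ann, EX.assertedBy, agent))
        g.add((ann, EX.note, Literal(note)))

    q = """
    SELECT ?agent ?note WHERE {
        ?ann <http://example.org/archive#about>      <http://example.org/archive#letter_42> .
        ?ann <http://example.org/archive#assertedBy> ?agent .
        ?ann <http://example.org/archive#note>       ?note .
    }
    """
    for agent, note in g.query(q):
        print(agent, note)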
Abstract:
In the last few years the resolution of numerical weather prediction (NWP) models has become higher and higher with the progress of technology and knowledge. As a consequence, a great amount of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and their operational application is becoming more frequent. The fact that many NWP centres have recently put into operation convection-permitting forecast models, many of which assimilate radar data, emphasizes the need for an approach to providing quality information, so that radar errors do not degrade the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have a horizontal dimension of 1-20 km and belong to the meso-gamma scale; this scale can be modelled only with NWP models of the highest resolution, such as the COSMO-2 model. One of the problems of modelling extreme convective events is related to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated into a high-resolution model is about 10 km, a value too coarse for a correct representation of the initial conditions of convection. Assimilation of radar data, with its resolution of about a kilometre every 5 or 10 minutes, can be a solution to this problem. In this contribution a pragmatic and empirical approach to deriving a radar data quality description is proposed, to be used in radar data assimilation and more specifically in the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work shows some preliminary experiments on coupling a high-resolution meteorological model with a hydrological one.
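To make the latent heat nudging idea concrete, a heavily simplified sketch is given below of how a per-pixel radar quality index could weight the LHN scaling factor; the functional form, clipping bounds and blending rule are illustrative assumptions, not the operational scheme developed in this work.

    # Heavily simplified sketch of quality-weighted latent heat nudging: the
    # model latent heating is scaled by the ratio of observed (radar) to
    # modelled precipitation rate, and a radar quality index in [0, 1] relaxes
    # the scaling towards 1 where the observation is unreliable.
    # Clipping bounds and blending rule are illustrative assumptions.
    def lhn_scaling(rr_radar, rr_model, quality, lo=0.5, hi=2.0, eps=1e-6):
        raw = (rr_radar + eps) / (rr_model + eps)   # basic LHN scaling factor
        raw = min(max(raw, lo), hi)                 # avoid extreme adjustments
        return 1.0 + quality * (raw - 1.0)          # no radar trust -> no nudging

    # example: radar sees twice the model rain, but the pixel quality is 0.5
    print(lhn_scaling(rr_radar=2.0, rr_model=1.0, quality=0.5))  # -> 1.5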
Abstract:
This thesis investigates interactive scene reconstruction and understanding using RGB-D data only. Indeed, we believe that depth cameras will remain, in the near future, a cheap and low-power 3D sensing alternative suitable for mobile devices too. Therefore, our contributions build on top of state-of-the-art approaches to achieve advances in three main challenging scenarios, namely mobile mapping, large scale surface reconstruction and semantic modeling. First, we will describe an effective approach dealing with Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We will show quantitative results comparing our technique to the state of the art, as well as detailed reconstructions of various environments ranging from rooms to small apartments. Then, we will address large scale surface modeling from depth maps exploiting parallel GPU computing. We will develop a real-time camera tracking method based on the popular KinectFusion system and an online surface alignment technique capable of counteracting drift errors and closing small loops. We will show very high quality meshes outperforming existing methods on publicly available datasets as well as on data recorded with our RGB-D camera, even in complete darkness. Finally, we will move to our Semantic Bundle Adjustment framework to effectively combine object detection and SLAM in a unified system. Though the mathematical framework we will describe is not restricted to a particular sensing technology, in the experimental section we will refer, again, only to RGB-D sensing. We will discuss successful implementations of our algorithm, showing the benefit of joint object detection, camera tracking and environment mapping.
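The dense reconstruction mentioned above relies on reprojecting RGB-D frames; as a reference point, the sketch below shows the standard back-projection of a depth map into a camera-frame point cloud under a pinhole model, with illustrative intrinsics rather than those of any specific sensor.

    # Minimal sketch: back-projecting a depth map into a camera-frame point
    # cloud with a pinhole model, the basic operation behind dense RGB-D
    # reconstruction. Intrinsics and depth values are illustrative assumptions.
    import numpy as np

    def backproject(depth, fx, fy, cx, cy):
        """depth: (H, W) array in metres; returns (H*W, 3) points, invalid ones NaN."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth.astype(np.float64)
        z = np.where(z > 0, z, np.nan)          # zero depth marks missing data
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # example: a flat 480x640 depth image at 1.5 m with VGA-like intrinsics
    cloud = backproject(np.full((480, 640), 1.5), fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    print(cloud.shape)  # (307200, 3)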
Abstract:
Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential for shattering the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
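As a rough illustration of the layout optimisation idea, i.e. placing code so that procedures executed together do not compete for the same cache sets, here is a greedy placement sketch; the cache geometry, procedure sizes and co-execution sets are invented for the example, and this is not the method developed in the thesis.

    # Rough sketch of conflict-aware code layout: greedily place procedures at
    # line-aligned addresses so that procedures known to run in the same loop
    # do not map to overlapping cache sets (direct-mapped cache for simplicity).
    # Cache geometry, sizes and co-execution sets are illustrative assumptions.
    LINE = 32          # cache line size in bytes
    SETS = 256         # number of cache sets

    def sets_used(addr, size):
        first = (addr // LINE) % SETS
        lines = (size + LINE - 1) // LINE
        return {(first + i) % SETS for i in range(lines)}

    def place(procs, co_executed):
        """procs: {name: size in bytes}; co_executed: names that run in the same loop."""
        layout, occupied, addr = {}, set(), 0
        for name, size in procs.items():
            tries = 0
            # slide the procedure forward until it no longer collides, in cache,
            # with the procedures it is co-executed with
            while name in co_executed and sets_used(addr, size) & occupied and tries < SETS:
                addr += LINE
                tries += 1
            layout[name] = addr
            if name in co_executed:
                occupied |= sets_used(addr, size)
            addr += ((size + LINE - 1) // LINE) * LINE
        return layout

    print(place({"sensor_read": 4096, "logging": 4096, "filter": 4096},
                co_executed={"sensor_read", "filter"}))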
Abstract:
Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed. This trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables the domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
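To illustrate the model-driven idea of turning a domain-level protocol description into executable artefacts, the tiny sketch below transforms a declarative protocol, as a domain expert might state it, into task objects for an agent-style runner; the protocol steps, agents and classes are invented for the example and are not the ontology or platform of the thesis.

    # Tiny sketch of a model-to-code style transformation: a laboratory protocol
    # expressed as domain-level steps is turned into executable task objects for
    # an agent-style runner. Step names, parameters and classes are illustrative.
    from dataclasses import dataclass

    # "user model": the protocol as the domain expert would state it
    protocol_model = [
        {"step": "centrifuge", "sample": "S1", "rpm": 3000, "minutes": 10},
        {"step": "incubate",   "sample": "S1", "celsius": 37, "minutes": 30},
    ]

    @dataclass
    class Task:
        agent: str
        action: str
        parameters: dict

        def run(self):
            print(f"[{self.agent}] {self.action} {self.parameters}")

    # transformation rules: which agent is responsible for which kind of step
    AGENT_FOR = {"centrifuge": "centrifuge-agent", "incubate": "incubator-agent"}

    def transform(model):
        return [Task(AGENT_FOR[s["step"]], s["step"],
                     {k: v for k, v in s.items() if k != "step"}) for s in model]

    for task in transform(protocol_model):
        task.run()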
Abstract:
MYCN amplification is a genetic hallmark of the childhood tumour neuroblastoma. MYCN-MAX dimers activate the expression of genes promoting cell proliferation. Moreover, MYCN seems to transcriptionally repress cell differentiation even in the absence of MAX. We adopted the Drosophila eye as a model to investigate the effect of a high MYC-to-MAX expression ratio on cells. We found that dMyc overexpression in eye cell precursors inhibits cell differentiation and induces the ectopic expression of Antennapedia (the wing Hox gene). A further increase of the MYC/MAX ratio results in an eye-to-wing homeotic transformation. Notably, the dMyc overexpression phenotype is suppressed by low levels of transcriptional co-repressors, and MYCN associates with the promoter of Deformed (the eye Hox gene) in proximity to repressive sites. Hence, we envisage that, in the presence of a high MYC/MAX ratio, the “free MYC” might inhibit Deformed expression, leading in turn to the ectopic expression of Antennapedia. This suggests that MYCN might reinforce its oncogenic role by affecting the physiological homeotic program. Furthermore, poor neuroblastoma outcome is associated with a high level of the MRP1 protein, encoded by the ABCC1 gene and known to promote drug efflux in cancer cells. Intriguingly, this correlation persists regardless of chemotherapy, and ABCC1 overexpression enhances neuroblastoma cell motility. We found that Drosophila dMRP contributes to the adhesion between the dorsal and ventral epithelia of the wing by inhibiting the function of integrin receptors, well known regulators of cell adhesion and migration. Besides, integrins play a crucial role during synaptogenesis, and the ABCC1 locus is included in a copy number variable region of the human genome (16p13.11) involved in neuropsychiatric diseases. Interestingly, we found that an altered dMRP/MRP1 level affects nervous system development in Drosophila embryos. These preliminary findings point out novel ABCC1 functions, possibly defining the contribution of ABCC1 to neuroblastoma and to the pathogenicity of the 16p13.11 deletion/duplication.
Abstract:
The control of a proton exchange membrane fuel cell system (PEM FC) for domestic heat and power supply requires extensive control measures to handle the complicated process. Highly dynamic and non-linear behaviour drastically increases the difficulty of finding the optimal design and control strategies. The objective is to design, implement and commission a controller for the entire fuel cell system. The fuel cell process and the control system are engineered simultaneously, so there is no access to the process hardware during the control system development. The method of choice was therefore a model-based design approach, following the rapid control prototyping (RCP) methodology. The fuel cell system is simulated using a fuel cell library which allows thermodynamic calculations. In the course of the development, the process model is continuously adapted to the real system. The controller application is designed and developed in parallel and thereby tested and verified against the process model. Furthermore, after the commissioning of the real system, the process model can also be better identified and parameterized using measurement data to perform optimization procedures. The process model and the controller application are implemented in Simulink using MathWorks' Real-Time Workshop (RTW) and the xPC development suite for MiL (model-in-the-loop) and HiL (hardware-in-the-loop) testing. It is thus possible to completely develop, verify and validate the controller application without depending on the real fuel cell system, which is not available for testing during the development process. The fuel cell system can be taken into operation immediately after connecting the controller to the process.
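A minimal illustration of model-in-the-loop testing, in which the controller is exercised against a simulated plant before the real hardware exists, is sketched below; the first-order thermal plant, setpoint and PI gains are placeholders, not the fuel cell library or controller of the thesis.

    # Minimal model-in-the-loop sketch: a PI controller is verified against a
    # simulated plant (a first-order stack-temperature model) long before the
    # real fuel cell system is available. Plant dynamics, setpoint and gains
    # are placeholder assumptions, not the actual system model.
    def plant_step(temp, heater_power, dt, ambient=20.0, tau=120.0, gain=0.1):
        """First-order thermal response of the stack temperature (placeholder)."""
        return temp + dt * ((ambient - temp) / tau + gain * heater_power)

    def pi_step(error, integral, dt, kp=0.5, ki=0.01, u_max=10.0):
        u = kp * error + ki * integral
        if 0.0 <= u <= u_max:
            integral += error * dt          # integrate only when not saturated
        return min(max(u, 0.0), u_max), integral

    setpoint, temp, integral, dt = 65.0, 20.0, 0.0, 1.0
    for _ in range(600):                    # 10 simulated minutes, 1 s steps
        u, integral = pi_step(setpoint - temp, integral, dt)
        temp = plant_step(temp, u, dt)

    print(round(temp, 1))                   # approaches the 65 degC setpoint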
Abstract:
Rett Syndrome (RTT) is a severe neurodevelopmental disorder, characterized by cognitive disability that appears in the first months/years of life. Recently, mutations in the X-linked cyclin-dependent kinase-like 5 (CDKL5) gene have been detected in RTT patients characterized by early-onset seizures. CDKL5 is highly expressed in the brain from early postnatal stages to adulthood, suggesting the importance of this kinase for proper brain maturation and function. However, the roles of CDKL5 in brain development and the molecular mechanisms whereby CDKL5 exerts its effects are still largely unknown. In order to characterize the role of CDKL5 in brain development, we created mice carrying a targeted conditional knockout allele of Cdkl5. A first behavioral characterization shows that Cdkl5 knockout mice recapitulate several of the clinical features described in CDKL5 patients and are a useful tool to investigate phenotypic and functional aspects of Cdkl5 loss. We used the Cdkl5 knockout mouse model to dissect the role of CDKL5 in hippocampal development and to establish the mechanisms underlying its actions. We found that Cdkl5 knockout mice showed increased precursor cell proliferation in the hippocampal dentate gyrus. Interestingly, this region was also characterized by an increased rate of apoptotic cell death that caused a reduction in the final neuron number in spite of the proliferation increase. Moreover, loss of Cdkl5 led to decreased dendritic development of newly generated granule cells. Finally, we identified Akt/GSK3-beta signaling as a target of Cdkl5 in the regulation of neuronal precursor proliferation, survival and maturation. Overall, our findings highlight a critical role of CDKL5/AKT/GSK3-beta signaling in the control of neuron proliferation, survival and differentiation, and suggest that CDKL5-related alterations of these processes during brain development underlie the neurological symptoms of the CDKL5 variant of RTT.
Abstract:
Antigen design is generally driven by the need to obtain enhanced stability, efficiency and safety in vaccines. Unfortunately, antigen modification rarely proceeds in parallel with the development of analytical tools for its characterization. The set-up of analytical tools is required during the steps of the vaccine manufacturing pipeline, for vaccine production modifications, improvements or regulatory requirements. Despite the relevance of bioconjugate vaccines, robust and consistent analytical tools to evaluate the extent of carrier glycosylation are missing. Bioconjugation is a glycoengineering technology aimed at producing N-glycoproteins in vivo in E. coli cells, based on the PglB-dependent system from C. jejuni, and applied to the production of several glycoconjugate vaccines. This applicability is due to the ability of glycocompetent E. coli to produce site-selectively glycosylated proteins used, after a few purification steps, as vaccines able to elicit both humoral and cell-mediated immune responses. Here, S. aureus Hla bioconjugated with CP5 was used to perform a rational, analytically driven design of the glycosylation sites for the quantification of the glycosylation extent by Mass Spectrometry. The aim of the study was to develop an MS-based approach to quantify the glycosylation extent for in-process monitoring of bioconjugate production and for final product characterization. The three designed consensus sequences differ by a single amino-acid residue and fulfill the prerequisites for an engineered bioconjugate that is more appropriate from an analytical perspective. We aimed to achieve optimal MS detectability of the peptide carrying the consensus sequences, complying with the well-characterized requirements for N-glycosylation by PglB. Hla carrier isoforms bearing these consensus sequences allowed a recovery of about 20 ng/μg of periplasmic protein, glycosylated at 40%. The SRM-MS method developed here was successfully applied to evaluate the differential site occupancy when the carrier protein presents two glycosites. The glycosylation extent at each glycosite was determined, and the differences between the isoforms were influenced both by the overall source of the protein produced and by the position of glycosite insertion. The analytically driven design of the bioconjugated antigen and the development of an accurate, precise and robust analytical method allowed fine characterization of the vaccine.
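For orientation, site occupancy from SRM peak areas is typically computed as the ratio of the glycosylated signal to the total (glycosylated plus non-glycosylated) signal for each glycosite, as in the sketch below; the peak areas are made-up numbers, not data from the study.

    # Illustrative calculation of glycosylation extent per glycosite from SRM
    # peak areas: occupancy = glycosylated / (glycosylated + non-glycosylated).
    # The peak areas below are made-up numbers, not data from the study.
    def site_occupancy(area_glyco, area_nonglyco):
        return area_glyco / (area_glyco + area_nonglyco)

    sites = {
        "glycosite_1": (4.0e5, 6.0e5),   # (glycosylated, non-glycosylated) areas
        "glycosite_2": (1.5e5, 8.5e5),
    }
    for name, (g, n) in sites.items():
        print(name, f"{100 * site_occupancy(g, n):.0f}% occupied")
    # glycosite_1 -> 40% occupied, glycosite_2 -> 15% occupied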
Abstract:
Nowadays the development of new Internal Combustion Engines is mainly driven by the need to reduce tailpipe emissions of pollutants and Green-House Gases and to avoid wasting fossil fuels. The design of the dimensions and shape of the combustion chamber, together with the implementation of different injection strategies (e.g., injection timing, spray targeting, higher injection pressures), plays a key role in the accomplishment of the aforementioned targets. As far as the match between fuel injection and evaporation and the combustion chamber shape is concerned, the assessment of the interaction between the liquid fuel spray and the engine walls in gasoline direct injection engines is crucial. The use of numerical simulations is an acknowledged technique to support the study of new technological solutions, such as the design of new gasoline blends and of tailored injection strategies to pursue the target mixture formation. The current simulation framework lacks a well-defined best practice for the simulation of liquid fuel spray interaction, which is a complex multi-physics problem. This thesis deals with the development of robust methodologies to approach the numerical simulation of the interaction of the liquid fuel spray with walls and lubricants. The work was divided into three tasks: i) set-up and validation of three-dimensional CFD simulations of spray-wall impingement; ii) development of a one-dimensional model describing the liquid fuel - lubricant oil interaction; iii) development of a machine learning based algorithm aimed at defining which mixture of known pure components mimics the physical behaviour of the real gasoline for the simulation of the liquid fuel spray interaction.
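As a sketch of the third task, finding a blend of known pure components that mimics target gasoline properties can be posed as a constrained least-squares problem over the component fractions; the components, property values and linear-mixing assumption below are illustrative and do not reflect the algorithm actually developed.

    # Sketch of the surrogate-blend idea: choose mass fractions of known pure
    # components so that the blend's properties (assumed here to mix linearly)
    # match those of a target gasoline in a least-squares sense. Components,
    # property values and the linear-mixing assumption are illustrative.
    import numpy as np
    from scipy.optimize import minimize

    # rows: properties (density kg/m3, RON, heat of vaporisation kJ/kg)
    # columns: candidate pure components (iso-octane, toluene, ethanol)
    P = np.array([
        [692.0, 867.0, 789.0],
        [100.0, 121.0, 109.0],
        [308.0, 412.0, 919.0],
    ])
    target = np.array([740.0, 95.0, 420.0])   # assumed target gasoline properties

    scale = np.abs(target)                    # put properties on a common footing

    def loss(x):
        return np.sum(((P @ x - target) / scale) ** 2)

    cons = [{"type": "eq", "fun": lambda x: np.sum(x) - 1.0}]   # fractions sum to 1
    res = minimize(loss, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3, constraints=cons)
    print(np.round(res.x, 3), round(loss(res.x), 4))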
Abstract:
Inverse problems are at the core of many challenging applications. Variational and learning models provide estimated solutions of inverse problems as the outcome of specific reconstruction maps. In the variational approach, the result of the reconstruction map is the solution of a regularized minimization problem encoding information on the acquisition process and prior knowledge on the solution. In the learning approach, the reconstruction map is a parametric function whose parameters are identified by solving a minimization problem depending on a large set of data. In this thesis, we go beyond this apparent dichotomy between variational and learning models and show that they can be harmoniously merged in unified hybrid frameworks preserving their main advantages. We develop several highly efficient methods based on both these model-driven and data-driven strategies, for which we provide a detailed convergence analysis. The resulting algorithms are applied to solve inverse problems involving images and time series. For each task, we show that the proposed schemes improve on many existing methods in terms of both computational burden and quality of the solution. In the first part, we focus on gradient-based regularized variational models, which are shown to be effective for segmentation purposes and for thermal and medical image enhancement. We consider gradient sparsity-promoting regularized models for which we develop different strategies to estimate the regularization strength. Furthermore, we introduce a novel gradient-based Plug-and-Play convergent scheme using a deep learning based denoiser trained on the gradient domain. In the second part, we address the tasks of natural image deblurring, image and video super-resolution microscopy and positioning time series prediction through deep learning based methods. We boost the performance of supervised strategies, such as trained convolutional and recurrent networks, and of unsupervised deep learning strategies, such as Deep Image Prior, by penalizing the losses with handcrafted regularization terms.
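For reference, the variational and learning reconstruction maps described above can be written generically as follows; the notation is generic, and the specific fidelity terms, regularisers and losses used in the thesis may differ.

    % Generic forms (notation is illustrative; the specific fidelity terms,
    % regularisers and losses used in the thesis may differ).
    % Variational reconstruction map:
    \[
      x^\ast \in \arg\min_{x} \tfrac{1}{2}\,\| A x - y \|_2^2 + \lambda\, R(\nabla x)
    \]
    % Learning reconstruction map, with parameters fitted on training pairs (y_i, x_i):
    \[
      \theta^\ast \in \arg\min_{\theta} \sum_{i=1}^{N} \ell\bigl(f_{\theta}(y_i),\, x_i\bigr)
    \]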
Abstract:
The globalization process of the last twenty years has changed the world through international flows of people, policies and practices. International cooperation for development is part of that process and brought International Organizations (IOs) and Non-Governmental Organizations (NGOs) from the West to the rest of the world. In my thesis I analyze the Italian NGOs that worked in Bosnia Herzegovina (BH) to understand which development projects they carried out and how they faced the ethnic issue that characterized BH. I consider the relationship formed between Italian NGOs and Bosnian civil society as an object of ethnic interests. In BH, once part of former Yugoslavia, the transition from the communist regime to a democratic country has not been completed. BH's social conditions are characterized by strong ethnic divisions. The legacy of the early 1990s crisis was a phenomenon of ethnic identities, created before the war, that still endures today. The Dayton Peace Agreement signed in 1995 granted peace but reinforced inter-ethnic hatred among the three newly recognized principal ethnicities: Serbs, Croats and Bosniaks. Through the new constitution, the institutions were characterized by division at every level, from the top to the bottom of society. Moreover, it was the first constitution ever written and signed outside the country itself; that was the root of the state of exception that characterized BH. Thus the culture of ethnic identities survived through international political involvement. At the same time, the ethnic groups that dominated the political debate clashed with the international organizations' democratic purpose of building a multicultural and democratic state. Ethnic and religious differences were instruments for national assertion that could cause the failure of the transition and of development projects. Fifteen years later, social fragmentation was still present and established an atmosphere of daily cultural violence. Civil society suffered this condition and tended to recreate the ethnic fragmentation in everyday life. Some cities became physically divided, and other cities did not tolerate the presence of minorities. In rural areas, the division was more explicit, from village to village, without integration. In my discussion, anthropology for development (the field derived from applied anthropology) constitutes the point of view that I used to understand how ethnic identities still influenced the development process in BH. I conducted ethnographic research on the Italian cooperation-for-development projects that were under way there in 2007. The targets of the research were the Italian NGOs that had created a relationship with Bosnian civil society; they were almost twenty, divided into four main fields of competence: institution building, education, agriculture and democratization. I assumed that the NGOs' work needed deep study, because the bottom of society is the place where people could really change their representations and behaviour. Italian NGOs operated in BH with the aim of creating sustainable development. They found cultural barricades that both institutions and civil society erected when development projects were applied. Ethnic and religious differences were stressed to maintain boundaries and fragmented power. Thus NGOs tried to negotiate development projects through social integration. I found that NGOs worked among ethnic groups by pursuing a new integration. They often gained success among people; civil society was ready to accept development projects and overcome differences.
On the other hand, NGOs were limited by the political level, which sustained the ethnic discourse, and by their own representation of the Bosnian issue. Thus development policies were impeded by the ethnic issue and by cooperation practices established from a top-down perspective. Paradoxically, since the international community had approved the political ethnic division within the DPA, the pursuit of development through the funding of NGO cooperation projects was not completely successful.
Abstract:
In recent years, customers' needs for improved machinery have driven tractor manufacturers to reduce product development times and costs. The most significant efforts have concentrated on the attempt to decrease the costs of the experimental testing sector. The validation of tractor prototypes is presently performed by replicating a particularly unfavourable condition a defined number of times. These laboratory tests do not always faithfully reproduce the real use of the tractor. Therefore, field tests are also carried out to evaluate the prototype during real use, but it is difficult to perform such tests for a period of time long enough to reproduce the usage occurring over the tractor's life. In this context, accelerated tests have been introduced in the automotive sector, producing a given amount of damage to the structure in a reduced amount of time. The goal of this work is to define a methodology for the realization of accelerated structural tests on a tractor, through the reproduction of real customer tractor usage. A market analysis was performed on an 80 kW tractor, and a series of measurements was then acquired to simulate the real use of the tractor. Subsequently, the rainflow matrices of the signals were extrapolated and used to estimate the tractor loadings for 10 years of tractor life. Finally, these loadings were reproduced on proving grounds with special road pavements. The results obtained highlight the possibility of reproducing field loadings during road driving on proving grounds (PGs), but the use of two field operations is also necessary. The global acceleration factor obtained in this first step of the methodology is equal to three.
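As a sketch of the damage-equivalence reasoning behind such an acceleration factor, Palmgren-Miner damage can be summed from rainflow cycle counts through an S-N curve, and the acceleration factor taken as the ratio between the damage accumulated per hour on the proving ground and per hour of field usage; the S-N parameters, amplitudes and cycle counts below are illustrative assumptions, not the thesis data.

    # Sketch of the damage-equivalence reasoning behind an acceleration factor:
    # Palmgren-Miner damage is summed from rainflow cycle counts using a Basquin
    # S-N curve, and the acceleration factor is the ratio of damage per hour on
    # the proving ground to damage per hour in field usage.
    # S-N parameters, amplitudes and cycle counts are illustrative assumptions.
    SN_C, SN_M = 1.0e15, 5.0            # Basquin curve: N_allow = C / amplitude**m

    def miner_damage(cycles):
        """cycles: list of (stress_amplitude_MPa, cycle_count) from rainflow counting."""
        return sum(n / (SN_C / amp ** SN_M) for amp, n in cycles)

    field_per_hour = [(40.0, 2000), (80.0, 150), (120.0, 5)]     # typical field hour
    pg_per_hour    = [(80.0, 1200), (120.0, 60), (160.0, 4)]     # proving-ground hour

    d_field = miner_damage(field_per_hour)
    d_pg = miner_damage(pg_per_hour)
    print(round(d_pg / d_field, 1), "x acceleration factor")   # ~7.1 with these numbers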