20 results for Agent-based methodologies

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and system of interacting agents as fundamental abstractions for designing, developing and managing at runtime software systems that are typically distributed. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a cornerstone of scientific activity in this field. Currently, most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still belong to the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating methodologies: a meta-model specifies the concepts, rules and relationships used to define a methodology. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to model all the aspects of multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some function) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems.
The research in these fields has led to the formulation of a new version of the SODA methodology, where environment abstractions and layering principles are exploited for engineering multi-agent systems.
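
To make the two ingredients concrete, the following minimal Python sketch (hypothetical names, not SODA notation) shows environment abstractions encapsulating a function, topology abstractions giving them a spatial structure, and a layer providing one level of abstraction over the system.

```python
# Illustrative sketch only: hypothetical names, not SODA syntax.
from dataclasses import dataclass, field

@dataclass
class EnvironmentAbstraction:
    name: str
    function: str            # the capability this environment entity encapsulates

@dataclass
class TopologyAbstraction:
    name: str                # a logical or physical place in the environment
    contents: list = field(default_factory=list)

@dataclass
class Layer:
    level: int               # higher levels give a more abstract system view
    places: list = field(default_factory=list)

room = TopologyAbstraction("room-1", [EnvironmentAbstraction("whiteboard", "shared memory")])
system_view = Layer(level=0, places=[room])
print(system_view)
```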

Relevance: 90.00%

Abstract:

Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented to achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity, and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised some issues, from the language for specifying them to the several aspects of verification. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming, and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that makes it possible to reason upon the specifications and to test the conformance of given interactions with respect to a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework), and (2) the a-priori conformance verification of peers with respect to a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation.
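
The expectation-based idea behind conformance testing can be illustrated with a minimal Python sketch: happened events raise expectations, and a trace is conformant only if every expectation is fulfilled. This is an illustration of the concept only; the actual SCIFF proof procedure is based on abduction, and the single rule shown here is hypothetical.

```python
# Minimal sketch of expectation-based conformance checking in the spirit of
# SCIFF: happened events raise expectations; a trace is conformant only if
# every expectation is fulfilled. Not the actual SCIFF proof procedure.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    actor: str
    action: str
    time: int

def protocol_rule(event):
    """Hypothetical rule: if a request happens, expect a later answer."""
    if event.action == "request":
        return lambda e: e.action == "answer" and e.time > event.time
    return None

def conformant(trace):
    # Collect the expectations raised by each happened event.
    pending = [protocol_rule(ev) for ev in trace if protocol_rule(ev)]
    # Every expectation must be fulfilled by some event in the trace.
    return all(any(test(e) for e in trace) for test in pending)

trace = [Event("alice", "request", 1), Event("bob", "answer", 3)]
print(conformant(trace))  # True: the request's expectation is fulfilled
```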

Relevance: 90.00%

Abstract:

Reasoning under uncertainty is a human capacity that software systems also need, though it is often hidden. Argumentation theory and logic make non-monotonic information explicit in order to enable automatic forms of reasoning under uncertainty. In human organizations, Distributed Cognition and Activity Theory explain how artifacts are fundamental in all cognitive processes. In this thesis we therefore seek to understand the use of cognitive artifacts in a new argumentation framework for an agent-based artificial society.

Relevance: 90.00%

Abstract:

Mainstream hardware is becoming parallel, heterogeneous, and distributed on every desk, in every home and in every pocket. As a consequence, in recent years software has been taking an epochal turn toward concurrency, distribution and interaction, pushed by the evolution of hardware architectures and the growing availability of networks. This calls for introducing further abstraction layers on top of those provided by classical mainstream programming paradigms, to tackle more effectively the new complexities that developers have to face in everyday programming. A convergence is recognizable in the mainstream toward the adoption of the actor paradigm as a means to unite object-oriented programming and concurrency. Nevertheless, we argue that the actor paradigm can only be considered a good starting point for a more comprehensive response to such a fundamental and radical change in software development. Accordingly, the main objective of this thesis is to propose Agent-Oriented Programming (AOP) as a high-level general-purpose programming paradigm, a natural evolution of actors and objects, introducing a further level of human-inspired concepts for programming software systems, meant to simplify the design and programming of concurrent, distributed, reactive/interactive programs. To this end, in the dissertation we first construct the required background by studying the state of the art of both actor-oriented and agent-oriented programming, and then focus on the engineering of integrated programming technologies for developing agent-based systems in their classical application domains: artificial intelligence and distributed artificial intelligence. We then shift the perspective, moving from the development of intelligent software systems toward general-purpose software development. Using the expertise gained while constructing that background, we introduce a general-purpose programming language named simpAL, which is rooted in general principles and practices of software development and at the same time provides an agent-oriented level of abstraction for the engineering of general-purpose software systems.
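
The difference in abstraction level can be suggested by a toy Python contrast (purely illustrative; simpAL has its own constructs and these class names are hypothetical): an actor reacts to each incoming message, while an agent autonomously pursues an explicit goal.

```python
# Toy contrast between the actor level of abstraction (react to messages)
# and the agent level (pursue a goal through a plan). Hypothetical names;
# this is not simpAL code.
import queue

class CounterActor:
    """Actor style: encapsulated state plus a reaction to each message."""
    def __init__(self):
        self.count = 0
        self.mailbox = queue.Queue()

    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            self.count += 1

class CountToTenAgent:
    """Agent style: an explicit goal drives an autonomous control loop."""
    def __init__(self):
        self.count = 0
        self.goal = lambda: self.count >= 10

    def run(self):
        while not self.goal():   # the agent acts until its goal is achieved
            self.count += 1      # a trivial 'plan step'

actor = CounterActor()
for msg in ("inc", "inc", "stop"):
    actor.mailbox.put(msg)
actor.run()
print(actor.count)  # 2: the actor only reacted to messages

agent = CountToTenAgent()
agent.run()
print(agent.count)  # 10: the agent acted until the goal held
```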

Relevance: 80.00%

Abstract:

The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergence of the peculiar structures of the individual phenotype. Being able to reproduce the system dynamics at different levels of such a hierarchy can be very useful for studying such a complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. According to these premises, the thesis proposes a review of the different approaches already developed for modelling problems in developmental biology, as well as of the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment / multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation. The task is defined as an optimisation problem over the parameter space in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The goal of the model is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data, with spatial and temporal resolution, acquired from free online sources.
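
Gillespie's direct method, the kernel that the MS-BioNET engine optimises in its many-species/many-channels form, can be sketched compactly in Python; the single-compartment birth-death reaction network used below is a hypothetical example.

```python
# Compact sketch of Gillespie's direct method (SSA) for one compartment.
# The reaction network (a simple birth-death process) is hypothetical.
import random

def gillespie(x, rates, stoich, t_end):
    """x: molecule count; rates: propensity functions of x;
    stoich: change in x when each reaction fires."""
    t, trajectory = 0.0, [(0.0, x)]
    while t < t_end:
        props = [r(x) for r in rates]
        total = sum(props)
        if total == 0:
            break
        # Time to the next reaction is exponentially distributed.
        t += random.expovariate(total)
        # Pick which reaction fires, with probability proportional to propensity.
        u, acc = random.uniform(0, total), 0.0
        for prop, change in zip(props, stoich):
            acc += prop
            if u <= acc:
                x += change
                break
        trajectory.append((t, x))
    return trajectory

# Birth-death: production at constant rate 5.0, degradation at rate 0.1 * x.
traj = gillespie(x=0, rates=[lambda x: 5.0, lambda x: 0.1 * x],
                 stoich=[+1, -1], t_end=100.0)
print(traj[-1])  # the count fluctuates around the steady state x ≈ 50
```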

Relevance: 80.00%

Abstract:

Biomedical analyses are becoming increasingly complex, with respect to both the type of data to be produced and the procedures to be executed. This trend is expected to continue in the future. The development of information and protocol management systems that can sustain this challenge is therefore becoming an essential enabling factor for all actors in the field. The use of custom-built solutions that require the biology domain expert to acquire or procure software engineering expertise in the development of the laboratory infrastructure is not fully satisfactory, because it incurs undesirable mutual knowledge dependencies between the two camps. We propose instead an infrastructure concept that enables the domain experts to express laboratory protocols using proper domain knowledge, free from the incidence and mediation of software implementation artefacts. In the system that we propose, this is made possible by basing the modelling language on an authoritative domain-specific ontology and then using modern model-driven architecture technology to transform the user models into software artefacts ready for execution on a multi-agent execution platform specialized for biomedical laboratories.
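
A toy model-to-text transformation conveys the spirit of the model-driven step: a declarative protocol model, as a domain expert might author it, is mechanically turned into an executable artefact. The model schema and output format below are hypothetical.

```python
# Toy model-to-text transformation in the spirit of the MDA step described
# above. The protocol model schema and the generated plan format are
# hypothetical illustrations, not the thesis's actual language.
protocol = {
    "name": "dna_extraction",
    "steps": [
        {"op": "centrifuge", "minutes": 5},
        {"op": "add_reagent", "reagent": "buffer_A"},
    ],
}

def to_agent_plan(model):
    """Generate an executable plan body, one line per model step."""
    lines = [f"plan {model['name']}:"]
    for step in model["steps"]:
        args = ", ".join(f"{k}={v!r}" for k, v in step.items() if k != "op")
        lines.append(f"    {step['op']}({args})")
    return "\n".join(lines)

print(to_agent_plan(protocol))
# plan dna_extraction:
#     centrifuge(minutes=5)
#     add_reagent(reagent='buffer_A')
```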

Relevance: 80.00%

Abstract:

Historical evidence shows that chemical, process, and Oil&Gas facilities where dangerous substances are stored or handled are targets of deliberate malicious attacks (security attacks) aimed at interfering with normal operations. Physical attacks and cyber-attacks may generate events with consequences for people, property, and the surrounding environment that are comparable to those of major accidents with safety-related causes. The security aspects of these facilities are commonly addressed using Security Vulnerability/Risk Assessment (SVA/SRA) methodologies. Most of these methodologies are semi-quantitative, non-systematic approaches that rely strongly on expert judgment, leading to security assessments that are not reproducible. Moreover, they do not consider the synergies with the safety domain. The present three-year research is aimed at filling this gap by providing knowledge on security attacks, as well as rigorous and systematic methods supporting existing SVA/SRA studies, suitable for the chemical, process, and Oil&Gas industry. The different nature of cyber and physical attacks resulted in the development of different methods for the two domains. The first part of the research was devoted to the development and statistical analysis of security databases, which made it possible to derive new knowledge and lessons learnt about security threats. Building on this background, a Bow-Tie based procedure and two reverse-HazOp based methodologies were developed as hazard identification approaches for physical and cyber threats respectively. To support the quantitative estimation of security risk, a quantitative procedure based on Bayesian Networks was developed, allowing the probability of success of physical security attacks to be calculated. All the developed methods have been applied to case studies addressing chemical, process and Oil&Gas facilities (offshore and onshore), proving the quality of the results that can be achieved in improving site security. Furthermore, the outcomes achieved represent a step forward in developing synergies and promoting integration between safety and security management.
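
The role of the Bayesian Network can be suggested with a minimal worked example that combines the performance of two hypothetical protection layers into a probability of attack success; the structure and numbers are for illustration only, not taken from the thesis.

```python
# Minimal sketch of combining security-barrier performance into a
# probability of attack success, Bayesian-network style. The two-layer
# structure and all numbers are hypothetical illustrations.

p_detect = 0.8       # P(intrusion detected in time)
p_neutralise = 0.9   # P(attacker neutralised | detected)

# The attack succeeds if it is not detected, or detected but not neutralised.
p_success = (1 - p_detect) + p_detect * (1 - p_neutralise)
print(f"P(attack success) = {p_success:.2f}")  # 0.28
```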

Relevance: 80.00%

Abstract:

In recent decades, two prominent trends have influenced the data modeling field, namely network analysis and machine learning. This thesis explores the practical applications of these techniques within the domain of drug research, unveiling their multifaceted potential for advancing our comprehension of complex biological systems. The research undertaken during this PhD program is situated at the intersection of network theory, computational methods, and drug research. Across the six projects presented herein, there is a gradual increase in model complexity. These projects traverse a diverse range of topics, with a specific emphasis on drug repurposing and safety in the context of neurological diseases. The aim of these projects is to leverage existing biomedical knowledge to develop innovative approaches that bolster drug research. The investigations have produced practical solutions, not only providing insights into the intricacies of biological systems, but also allowing the creation of valuable tools for their analysis. In short, the achievements are:

• A novel computational algorithm to identify adverse events specific to fixed-dose drug combinations.
• A web application that tracks the clinical drug research response to SARS-CoV-2.
• A Python package for differential gene expression analysis and the identification of key regulatory "switch genes".
• The identification of pivotal events causing drug-induced impulse control disorders linked to specific medications.
• An automated pipeline for discovering potential drug repurposing opportunities.
• The creation of a comprehensive knowledge graph and development of a graph machine learning model for predictions.

Collectively, these projects illustrate diverse applications of data science and network-based methodologies, highlighting the profound impact they can have in supporting drug research activities.

Relevance: 40.00%

Abstract:

Nowadays the rise of non-recurring engineering (NRE) costs associated with complexity is becoming a major factor in SoC design, limiting both scaling opportunities and the flexibility advantages offered by the integration of complex computational units. The introduction of embedded programmable elements can represent an appealing solution, able both to guarantee the desired flexibility and upgradability and to widen the SoC market. In particular, embedded FPGA (eFPGA) cores can provide bit-level optimization for those applications which benefit from synthesis, at the price of performance penalties and area overhead with respect to standard-cell ASIC implementations. In this scenario, this thesis proposes a design methodology for a synthesizable programmable device designed to be embedded in a SoC. A soft-core eFPGA is presented and analyzed in terms of the opportunities offered by a fully synthesizable approach, following an implementation flow based on standard-cell methodology. A key point of the proposed eFPGA template is that it adopts a Multi-Stage Switching Network (MSSN) as the foundation of the programmable interconnects, since it can be efficiently synthesized and optimized through a standard-cell based implementation flow while ensuring an intrinsically congestion-free network topology. The flexibility potential of the eFPGA has been evaluated using different technology libraries (STMicroelectronics CMOS 65nm and BCD9s 0.11μm) through a design space exploration in terms of area-speed-leakage tradeoffs, enabled by the full synthesizability of the template. Since the most relevant disadvantage of the adopted soft approach, compared to a hard core, is the performance overhead, the eFPGA analysis targeted small area budgets. The generation of the configuration bitstream was achieved by implementing a custom CAD flow environment, which allowed functional verification and performance evaluation through an application-aware analysis.
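
The congestion-free property of multi-stage networks can be recalled with the classic three-stage Clos condition (the exact MSSN topology used in the thesis is not detailed here); the sketch below, with hypothetical port counts, checks the strictly non-blocking condition and compares crosspoint counts with a full crossbar.

```python
# Refresher on why multi-stage switching networks can be congestion-free:
# the classic Clos strictly non-blocking condition m >= 2n - 1. The port
# counts are hypothetical; the thesis's actual MSSN may differ.

def clos_crosspoints(N, n, m):
    """Three-stage Clos network for N ports, n ports per edge switch,
    m middle-stage switches. Returns crosspoint count, or None if blocking."""
    if m < 2 * n - 1:
        return None  # strictly non-blocking requires m >= 2n - 1
    r = N // n       # number of ingress (and egress) switches
    return 2 * r * n * m + m * r * r

N, n = 1024, 16
m = 2 * n - 1        # smallest strictly non-blocking middle stage
print(clos_crosspoints(N, n, m))  # 190464, far fewer than N*N = 1048576
```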

Relevance: 40.00%

Abstract:

Advanced analytical methodologies were developed to characterize new potential active MTDLs on isolated targets involved in the first stages of Alzheimer's disease (AD). In addition, the methods investigated drug-protein binding and evaluated protein-protein interactions involved in neurodegeneration. A high-throughput luminescence assay allowed the study of first-in-class GSK-3β/HDAC dual inhibitors against the enzyme GSK-3β. The method was able to identify an innovative disease-modifying agent with activity in the micromolar range on GSK-3β, HDAC1 and HDAC6. The same assay then reliably and quickly selected true positive hit compounds among natural Amaryllidaceae alkaloids tested against GSK-3β. Hence, given the central role of the amyloid pathway in the multifactorial nature of AD, a multi-methodological approach based on mass spectrometry (MS), circular dichroism (CD) spectroscopy and the ThT assay was applied to characterize the potential interaction of CO-releasing molecules (CORMs) with the Aβ1-42 peptide. The comprehensive method provided reliable information on the different steps of the fibrillation process and on the CORMs' mechanism of action. The optimal CORM-3/Aβ1-42 ratio in terms of inhibitory effect was identified by mass spectrometry. CD analysis confirmed the stabilizing effect of CORM-3 on the soluble form of the Aβ1-42 peptide, and ThT fluorescence analysis confirmed that the entire fibrillation process was delayed. The amyloid aggregation process was then studied in view of a possible correlation with the lipid brain alterations of AD. To this end, SH-SY5Y cells were treated with increasing concentrations of Aβ1-42 for different times, and the samples were analysed by an RP-UHPLC system coupled with a high-resolution quadrupole TOF mass spectrometer in comprehensive data-independent SWATH acquisition mode. The profile of each lipid class in SH-SY5Y cells treated with Aβ1-42 was compared to the one obtained from untreated cells. The approach highlighted some peculiar lipid alterations, suitable as biomarkers, that might be correlated with the different Aβ1-42 aggregation species.

Relevance: 40.00%

Abstract:

Transition-metal-catalyzed cross-coupling reactions are among the most versatile and useful tools in organic synthesis for carbon-carbon (C-C) bond formation and have a prominent role in both the academic and pharmaceutical sectors. Among them, palladium-catalyzed cross-coupling reactions are currently the most versatile. In this thesis, the applications, impact and development of green palladium cross-coupling reactions are discussed. Specifically, we discuss the translation of the Twelve Principles of Green Chemistry and their application to pharmaceutical organometallic chemistry, to stimulate the development of cost-effective and sustainable catalytic processes for the synthesis of active pharmaceutical ingredients (APIs). The Heck-Cassar-Sonogashira (HCS) and Suzuki-Miyaura (SM) protocols, using HEP/H2O as a green solvent mixture and sulfonated phosphine ligands, allowed the catalyst to be recovered and recycled, while guaranteeing high yields and fast conversion under mild conditions with aryl iodides, bromides, triflates and chlorides. No catalyst leakage or metal contamination of the final product was observed during the HCS and SM reactions, respecting the very low limits for metal impurities in medicines established by the International Conference on Harmonisation guideline Q3D (ICH Q3D). In addition, a deep understanding of the reaction mechanism is very important if the final target is to develop efficient protocols that can be applied at the industrial level. Experimental and theoretical studies pointed out the presence of two catalytic cycles depending on the counterion, shedding light on the role of the base in catalyst reduction and of acetylene coordination in the HCS coupling. Finally, the development of a copper-mediated cross-coupling reaction to form aryldifluoronitriles is discussed, highlighting the importance of inserting fluorine atoms into biological structures and the use of readily available metals such as copper as an alternative to palladium.

Relevance: 30.00%

Abstract:

Biohybrid derivatives of π-conjugated materials are emerging as powerful tools to study biological events through the (opto)electronic variations of the π-conjugated moieties, as well as to direct and govern the self-assembly properties of the organic materials through the organization principles of the bio-component. So far, very few examples of thiophene-based biohybrids have been reported. The aim of this Ph.D. thesis has been the development of oligothiophene-oligonucleotide hybrid derivatives as tools, on one side, to detect DNA hybridisation events and, on the other, as model compounds to investigate thiophene-nucleobase interactions in the solid state. To obtain oligothiophene bioconjugates with the required high level of purity, we first developed new eco-friendly synthetic protocols for the synthesis of thiophene oligomers. Our innovative heterogeneous Suzuki coupling methodology, carried out in EtOH/water or isopropanol under microwave irradiation, allowed us to obtain alkyl-substituted oligothiophenes and thiophene-based co-oligomers in high yields and very short reaction times, free from residual metals and with improved film-forming properties. These methodologies were subsequently applied to the synthesis of oligothiophene-oligonucleotide conjugates. Oligothiophene-5-labeled deoxyuridines were synthesized and incorporated into 19-mer oligonucleotide sequences. We showed that the oligothiophene-labeled oligonucleotide sequences obtained can be used as probes to detect a single-nucleotide polymorphism (SNP) in complementary DNA target sequences. In fact, all the probes showed marked variations in emission intensity upon hybridization with a complementary target sequence. The observed variations in emitted light were comparable or even superior to those reported in similar studies, showing that the biohybrids can potentially be used to develop biosensors for the detection of DNA mismatches. Finally, water-soluble, photoluminescent and electroactive dinucleotide hybrid derivatives of quaterthiophene and quinquethiophene were synthesized. By means of a combination of spectroscopy and microscopy techniques, electrical characterization, microfluidic measurements and theoretical calculations, we were able to demonstrate that the self-assembly modalities of the biohybrids in thin films are driven by the interplay of intra- and intermolecular interactions, in which the π-stacking between the oligothiophenes and the nucleotide bases plays a major role.

Relevance: 30.00%

Abstract:

The topics I came across during the period I spent as a Ph.D. student are mainly two. The first concerns new organocatalytic protocols for Mannich-type reactions mediated by Cinchona alkaloid derivatives (Scheme I, left); the second regards the study of a new approach towards the enantioselective total synthesis of Aspirochlorine, a potent gliotoxin that recent studies indicate as a highly selective and active agent against fungi (Scheme I, right). At the beginning of 2005 I had the chance to join the group of Prof. Alfredo Ricci at the Department of Organic Chemistry of the University of Bologna, starting my Ph.D. studies. During the first period I studied a new homogeneous organocatalytic aza-Henry reaction using Cinchona alkaloid derivatives as chiral base catalysts, with good results. Soon after, we introduced a new protocol which allowed the in situ synthesis of N-carbamoyl imines, which are scarcely stable, moisture-sensitive compounds. For this purpose we used α-amido sulfones, bench-stable white crystalline solids, as imine precursors (Scheme II). In particular, using chiral phase-transfer catalysis, we were able to obtain the aza-Henry adducts with a broad range of substituents as the R-group and excellent results, unprecedented for Mannich-type transformations (Scheme II). With the optimised protocol in hand, we extended the methodology to the other Mannich-type reactions. We applied the new method to the Mannich, Strecker and Pudovik (hydrophosphonylation of imines) reactions with very good results in terms of enantioselection and yield, broadening the usefulness of this novel protocol. The Mannich reaction was certainly the most extensively studied reaction in this thesis (Scheme III). Initially we developed the reaction with α-amido sulfones as imine precursors and non-commercially available malonates, with excellent results in terms of yields and enantioselections. In this particular case we achieved a catalyst loading of 1 mol%, very low for organocatalytic processes. We then decided to develop a new Mannich reaction using simpler malonates, such as dimethyl malonate. Under the new optimised conditions the reaction provided slightly lower enantioselections than the previous protocol, but the Mannich adducts were very versatile intermediates for obtaining β3-amino acids. Furthermore, we performed the first addition of cyclic β-ketoesters to α-amido sulfones, obtaining the corresponding products in good yield with high levels of diastereomeric and enantiomeric excess (Scheme III). Further studies concerned the Strecker reaction mediated by Cinchona alkaloid phase-transfer quaternary ammonium salt derivatives, using acetone cyanohydrin, a relatively harmless cyanide source (Scheme IV). The reaction proceeded very well, providing the corresponding α-amino nitriles in good yields and enantiomeric excesses. Finally, we developed two new complementary methodologies for the hydrophosphonylation of imines (Scheme V). Because of the low stability of the products derived from aromatic imines, we performed the reactions under mild homogeneous basic conditions, using quinine as a chiral base catalyst, giving α-aryl-α-amido phosphonic acid esters as products (Scheme V, top). On the other hand, we performed the addition of dialkyl phosphites to aliphatic imines using chiral Cinchona alkaloid phase-transfer quaternary ammonium salt derivatives and our methodology based on α-amido sulfones (Scheme V, bottom). The results were good for both procedures, covering a broad range of α-amino phosphonic acid esters. During the second year of my Ph.D. studies, I spent six months in the group of Prof. Steven V. Ley at the Department of Chemistry of the University of Cambridge, in the United Kingdom. During this fruitful period I was involved in a project concerning the enantioselective synthesis of Aspirochlorine. We provided a new route for the synthesis of a key intermediate, reducing the number of steps and increasing the overall yield. We then introduced a new enantioselective spirocyclisation for the synthesis of a chiral building block for the completion of the synthesis (Scheme VI).

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The ongoing innovation in the microwave transistor technologies used to implement microwave circuits must be supported by the study and development of proper design methodologies which, depending on the application, fully exploit the potential of the technology. After the choice of the technology to be used in a particular application, the circuit designer has few degrees of freedom when carrying out the design; in most cases, due to technological constraints, foundries develop and provide customized processes optimized for a specific performance figure such as power, low noise, linearity or bandwidth. For these reasons circuit design is always a "compromise", a search for the best trade-off among the desired performances. This approach becomes crucial in the design of microwave systems to be used in satellite applications: the tight space constraints require the best performance to be reached under properly de-rated electrical and thermal conditions, with respect to the maximum ratings of the adopted technology, in order to ensure adequate reliability. In particular, this work is about one of the most critical components in the front-end of a satellite antenna, the High Power Amplifier (HPA). The HPA is the main source of power dissipation and therefore the element which weighs most heavily on the space, weight and cost of the telecommunication apparatus; it is clear from the above that design strategies addressing the optimization of power density, efficiency and reliability are of major concern. Many papers and publications demonstrate different methods for the design of power amplifiers, showing that very good levels of output power, efficiency and gain can be obtained. Starting from existing knowledge, the target of the research activities summarized in this dissertation was to develop a design methodology capable of optimizing power amplifier performance while complying with all the constraints imposed by space applications, taking the thermal behaviour into account in the same manner as power and efficiency. After a reminder of the existing theories of power amplifier design, the first section of this work describes the effectiveness of a methodology based on the accurate control of the dynamic Load Line and its shaping, explaining all the steps in the design of two different kinds of high power amplifiers. Considering the trade-off between the main performances and reliability issues as the target of the design activity, we demonstrate that the expected results can be obtained by working on the characteristics of the Load Line at the intrinsic terminals of the selected active device. The methodology proposed in this first part assumes that the designer has an accurate electrical model of the device available; the variety of publications on this topic shows how difficult it is to build a CAD model capable of taking into account all the non-ideal phenomena which occur when the amplifier operates at such high frequency and power levels. For this reason, especially for the emerging Gallium Nitride (GaN) technology, the second section describes a new approach to power amplifier design based on the experimental characterization of the intrinsic Load Line by means of a low-frequency, high-power measurement bench. Thanks to the possibility of carrying out my Ph.D. in an academic spin-off, MEC – Microwave Electronics for Communications, the results of this activity have been applied to important research programs commissioned by space agencies, with the aim of supporting technology transfer from universities to industry and promoting science-based entrepreneurship. For these reasons the proposed design methodology is explained on the basis of many experimental results.
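
A first-order, Cripps-style class-A load-line estimate gives the flavour of the quantities such a methodology controls; the device numbers below are hypothetical, and in the thesis the intrinsic load line is obtained from accurate models or low-frequency high-power measurements rather than from this idealised formula.

```python
# First-order Cripps-style class-A load-line estimate. Device numbers are
# hypothetical; this idealised formula only illustrates the quantities the
# load-line methodology controls at the intrinsic device terminals.

V_DC, V_knee, I_max = 28.0, 3.0, 2.0   # supply (V), knee voltage (V), max current (A)

# Optimum load at the intrinsic terminals for maximum voltage/current swing.
R_opt = 2 * (V_DC - V_knee) / I_max    # ohms
P_out = (V_DC - V_knee) * I_max / 4    # W, ideal class-A linear output power
eta = P_out / (V_DC * I_max / 2)       # drain efficiency (<= 50 % in class A)

print(f"R_opt = {R_opt:.1f} ohm, P_out = {P_out:.1f} W, eta = {eta:.0%}")
# R_opt = 25.0 ohm, P_out = 12.5 W, eta = 45%
```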

Relevance: 30.00%

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is widely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
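
The run-time compliance checking enabled by this mapping can be illustrated for a single ConDec constraint, response(A, B): every occurrence of A must eventually be followed by B. The plain Python sketch below captures only the semantics; CLIMB expresses such constraints as logic rules with expectations.

```python
# Sketch of compliance checking for one ConDec constraint, response(A, B):
# every occurrence of A must eventually be followed by an occurrence of B.
# Plain-Python illustration of the semantics only, not CLIMB code.

def response_satisfied(trace, a, b):
    """True iff every 'a' in the trace is followed by a later 'b'."""
    expecting = False
    for event in trace:
        if event == a:
            expecting = True      # an expectation for b is raised
        elif event == b:
            expecting = False     # the pending expectation is fulfilled
    return not expecting          # no expectation may be left pending

print(response_satisfied(["order", "pay", "ship"], "order", "ship"))   # True
print(response_satisfied(["order", "pay", "order"], "order", "ship"))  # False
```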