859 results for multi-system


Relevance: 30.00%

Publisher:

Abstract:

The evolution of embedded electronics applications forces electronic system designers to meet ever increasing requirements. This evolution pushes the computational power of digital signal processing systems, as well as the energy required to accomplish the computations, due to the increasing mobility of such applications. Current approaches to meeting these requirements rely on the adoption of application-specific signal processors. Such devices exploit powerful accelerators, which are able to meet both performance and energy requirements. On the other hand, the high specificity of such accelerators often results in a lack of flexibility which affects non-recurring engineering costs, time to market, and market volumes. The state of the art mainly proposes two solutions to overcome these issues with the ambition of delivering reasonable performance and energy efficiency: reconfigurable computing and multi-processor computing. Both of these solutions benefit from post-fabrication programmability, which results in increased flexibility. Nevertheless, the gap between these approaches and dedicated hardware is still too wide for many application domains, especially when targeting the mobile world. In this scenario, flexible and energy-efficient acceleration can be achieved by merging these two computational paradigms in order to address all the constraints introduced above. This thesis focuses on the exploration of the design and application spectrum of reconfigurable computing, exploited as application-specific accelerators for multi-processor systems on chip. More specifically, it introduces a reconfigurable digital signal processor featuring a heterogeneous set of reconfigurable engines, and a homogeneous multi-core system exploiting three different flavours of reconfigurable and mask-programmable technologies as implementation platforms for application-specific accelerators. In this work, the various trade-offs concerning the utilization of multi-core platforms and the different configuration technologies are explored, characterizing the design space of the proposed approach in terms of programmability, performance, energy efficiency and manufacturing costs.

Relevance: 30.00%

Publisher:

Abstract:

The PhD activity described in this document was carried out at the Microsatellite and Microsystem Laboratory of the II Faculty of Engineering, University of Bologna. The main objective is the design and development of a GNSS receiver for the orbit determination of microsatellites in low Earth orbit. The development starts from the electronic design and goes up to the implementation of the navigation algorithms, covering all the aspects involved in this type of application. The use of GPS receivers for orbit determination is a consolidated application used in many space missions, but the deployment of new GNSS systems within a few years, such as the European Galileo, the Chinese COMPASS and the Russian modernized GLONASS, poses new challenges and offers new opportunities to improve orbit determination performance. The evaluation of the improvements coming from the new systems, together with the implementation of a receiver compatible with at least one of the new systems, are the main activities of the PhD. The activities can be divided into three sections: receiver requirements definition and prototype implementation, design and analysis of the GNSS signal tracking algorithms, and design and analysis of the navigation algorithms. The receiver prototype is based on a Virtex FPGA by Xilinx and includes a PowerPC processor. The architecture follows the software-defined radio paradigm, so most of the signal processing is performed in software while only what is strictly necessary is done in hardware. The tracking algorithms are implemented as a combination of a Phase Locked Loop and a Frequency Locked Loop for the carrier, and a Delay Locked Loop with variable bandwidth for the code. The navigation algorithm is based on the extended Kalman filter and includes an accurate LEO orbit model.
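
As an illustration of the navigation side, the sketch below shows one predict/update cycle of a generic extended Kalman filter of the kind used for GNSS-based orbit determination. It is a hedged sketch, not the thesis code: the state transition f, the measurement model h and their Jacobians F and H are placeholders that a real implementation would replace with the LEO orbit dynamics and the pseudorange model.

```python
# A minimal sketch of one extended Kalman filter predict/update cycle.
# f()/h() and their Jacobians F()/H() are placeholders; a real LEO filter
# would propagate position/velocity with a gravity model and use
# pseudorange measurements to the tracked satellites.
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """x, P: state and covariance; z: measurement vector;
    f/h: nonlinear models; F/H: their Jacobians at the current estimate;
    Q/R: process and measurement noise covariances."""
    # --- predict ---
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q
    # --- update ---
    H_k = H(x_pred)
    y = z - h(x_pred)                       # innovation
    S = H_k @ P_pred @ H_k.T + R            # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new
```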

Relevance: 30.00%

Publisher:

Abstract:

Membrane proteins play a major role in every living cell. They are the key factors in the cell's metabolism and in other functions, for example in cell-cell interaction, signal transduction, and transport of ions and nutrients. Cytochrome c oxidase (CcO), as one of the membrane proteins of the respiratory chain, plays a significant role in the energy transformation of higher organisms. CcO is a multi-centered heme protein, utilizing redox energy to actively transport protons across the mitochondrial membrane. One aim of this dissertation is to investigate single steps in the mechanism of the ion transfer process coupled to electron transfer, which are not fully understood. The protein-tethered bilayer lipid membrane is a general approach to immobilize membrane proteins in an oriented fashion on a planar electrode embedded in a biomimetic membrane. This system enables the combination of electrochemical techniques with surface enhanced resonance Raman spectroscopy (SERRS), surface enhanced infrared absorption spectroscopy (SEIRAS), and surface plasmon spectroscopy to study protein-mediated electron and ion transport processes. The orientation of the enzymes within the surface-confined architecture can be controlled by specific site-mutations, i.e. the insertion of a poly-histidine tag at different subunits of the enzyme. CcO can thus be oriented uniformly with its natural electron pathway entry pointing either towards or away from the electrode surface. The first orientation allows an ultra-fast direct electron transfer (ET) into the protein, not provided by conventional systems, which can be leveraged to study intrinsic charge transfer processes. The second orientation permits the study of the interaction with its natural electron donor, cytochrome c. Electrochemical and SERR measurements show conclusively that the redox site structure and the activity of the surface-confined enzyme are preserved. Therefore, this biomimetic system offers a unique platform to study the kinetics of the ET processes in order to clarify mechanistic properties of the enzyme. Highly sensitive and ultra-fast electrochemical techniques allow the separation of ET steps between all four redox centres, including the determination of ET rates. Furthermore, proton transfer coupled to ET could be directly measured and discriminated from other ion transfer processes, revealing novel mechanistic information about the proton transfer mechanism of cytochrome c oxidase. In order to study the kinetics of the ET inside the protein, including the catalytic center, time-resolved SEIRAS and SERRS measurements were performed to gain more insight into the structural and coordination changes of the heme environment. The electrical behaviour of tethered membrane systems and membrane-intrinsic proteins, as well as the related charge transfer processes, was simulated by solving the respective sets of differential equations, utilizing a software package called SPICE. This helps to understand charge transfer processes across membranes and to develop models that can help to elucidate mechanisms of complex enzymatic processes.
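
To give a flavour of the equivalent-circuit simulations mentioned above, the following hedged sketch integrates a minimal tethered-membrane circuit (an electrolyte resistance in series with a membrane capacitance in parallel with a charge-transfer resistance) after a potential step. The thesis work used SPICE; here the same kind of differential equation is solved with scipy, and all parameter values are illustrative placeholders.

```python
# Equivalent-circuit sketch: R_s in series with (C_m || R_ct), driven by a
# potential step. All values are placeholders, not fitted membrane parameters.
import numpy as np
from scipy.integrate import solve_ivp

R_s = 50.0        # electrolyte resistance (ohm) - placeholder
R_ct = 1.0e5      # charge-transfer resistance (ohm) - placeholder
C_m = 1.0e-6      # membrane capacitance (F) - placeholder
E_step = 0.05     # applied potential step (V)

def dv_dt(t, v):
    """v[0] = potential across the membrane element (C_m || R_ct)."""
    i_total = (E_step - v[0]) / R_s          # current through R_s
    i_faradaic = v[0] / R_ct                 # current through R_ct
    return [(i_total - i_faradaic) / C_m]    # charging of C_m

sol = solve_ivp(dv_dt, (0.0, 0.01), [0.0], max_step=1e-5)
current = (E_step - sol.y[0]) / R_s          # transient current response
print(f"steady-state current ≈ {current[-1]*1e9:.1f} nA")
```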

Relevance: 30.00%

Publisher:

Abstract:

In this dissertation the influence of a precast concrete cladding system on the structural robustness of a multi-storey steel-composite building is studied. The analysis follows the well-established framework developed at Imperial College London for the appraisal of robustness of multi-storey buildings. For this purpose a simplified nonlinear model of a typical precast concrete façade system is developed. Particular attention is given to the connection system between the structural frame and the panel, recognised as the driving component of the nonlinear behaviour of the façade system. Only connections involved in the gravity load path are evaluated (bearing connections). Together with the standard connection, a newly proposed system (the Slotted Bearing Connection) is designed to achieve a more ductile behaviour of the panel-connection system. A parametric study involving the dimensions of the panel-connection components is carried out to search for an optimal configuration of the bearing connection. From the appraisal of the structural robustness of the panelised frame it is found that the standard connection systems may reduce the robustness of a multi-storey frame due to their poorly ductile behaviour, while the newly proposed connection is able to guarantee an enhanced response of the panelised multi-storey frame thanks to its higher ductility.
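
As a purely illustrative sketch of the kind of simplified nonlinear law a bearing connection can be given in such a model, the function below returns a bilinear force-displacement response with limited ductility, beyond which the connection is assumed to have failed. The stiffness, yield force and ultimate displacement values are placeholders, not those of the thesis.

```python
# Bilinear force-displacement sketch for a panel bearing connection.
# All numerical values are illustrative placeholders.
def bearing_connection_force(delta, k_el=50.0, f_y=25.0, k_h=2.0, delta_u=20.0):
    """Force (kN) for an imposed displacement delta (mm).
    k_el: elastic stiffness (kN/mm), f_y: yield force (kN),
    k_h: hardening stiffness (kN/mm), delta_u: ultimate displacement (mm)."""
    delta_y = f_y / k_el
    if delta <= delta_y:                 # elastic branch
        return k_el * delta
    if delta <= delta_u:                 # hardening branch (ductile reserve)
        return f_y + k_h * (delta - delta_y)
    return 0.0                           # connection failed: no residual capacity

# A more ductile connection (larger delta_u) keeps carrying load at larger
# frame displacements, which is what a slotted bearing connection aims to offer.
print(bearing_connection_force(5.0), bearing_connection_force(25.0))
```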

Relevance: 30.00%

Publisher:

Abstract:

Stylolites are rough paired surfaces, indicative of localized stress-induced dissolution under a non-hydrostatic state of stress, separated by a clay parting which is believed to be the residuum of the dissolved rock. These structures are the most frequent deformation pattern in monomineralic rocks and thus provide important information about low-temperature deformation and mass transfer. The intriguing roughness of stylolites can be used to assess the amount of volume loss and paleo-stress directions, and to infer the destabilizing processes during pressure solution. But there is little agreement on how stylolites form and why these localized pressure solution patterns develop their characteristic roughness.

Natural bedding-parallel and vertical stylolites were studied in this work to obtain a quantitative description of the stylolite roughness and to understand the governing processes during their formation. Adapting scaling approaches based on fractal principles, it is demonstrated that stylolites show two self-affine scaling regimes with roughness exponents of 1.1 and 0.5 for small and large length scales, separated by a crossover length at the millimeter scale. Analysis of stylolites from various depths proved that this crossover length is a function of the stress field during formation, as analytically predicted. For bedding-parallel stylolites the crossover length is a function of the normal stress on the interface, but vertical stylolites show a clear in-plane anisotropy of the crossover length owing to the fact that the in-plane stresses (σ2 and σ3) are dissimilar. Therefore stylolite roughness contains a signature of the stress field during formation.

To address the origin of stylolite roughness, a combined microstructural (SEM/EBSD) and numerical approach is employed. Microstructural investigations of natural stylolites in limestones reveal that heterogeneities initially present in the host rock (clay particles, quartz grains) are responsible for the formation of the distinctive stylolite roughness. A two-dimensional numerical model, i.e. a discrete linear elastic lattice spring model, is used to investigate the roughness evolving from an initially flat fluid-filled interface, induced by heterogeneities in the matrix. This model generates rough interfaces with the same scaling properties as natural stylolites. Furthermore, two coinciding crossover phenomena in space and in time exist that separate length and time scales for which the roughening is balanced either by surface or by elastic energies. The roughness and growth exponents are independent of the size, amount and dissolution rate of the heterogeneities. This allows the conclusion that the location of asperities is determined by a polymict multi-scale quenched noise, while the roughening process is governed by inherent processes, i.e. the transition from a surface- to an elastic-energy dominated regime.
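
A hedged sketch of the scaling analysis described above: for a self-affine profile h(x), the height-height correlation C(Δx) = ⟨|h(x+Δx) − h(x)|²⟩^(1/2) scales as Δx^H, so the roughness exponent H is the slope in log-log space and a crossover length appears as a change of slope. The code below estimates H for a synthetic profile; it is not the analysis pipeline of the thesis.

```python
# Estimate the roughness (Hurst) exponent of a 1-D profile from the
# height-height correlation function. The test profile is synthetic.
import numpy as np

def roughness_exponent(h, max_lag=None):
    """Estimate H for a 1-D profile h sampled on a regular grid."""
    n = len(h)
    max_lag = max_lag or n // 4
    lags = np.arange(1, max_lag)
    corr = np.array([np.sqrt(np.mean((h[lag:] - h[:-lag]) ** 2)) for lag in lags])
    # slope of log(corr) vs log(lag) gives the roughness exponent
    H, _ = np.polyfit(np.log(lags), np.log(corr), 1)
    return H

# synthetic self-affine test profile built by spectral synthesis with H = 0.5
rng = np.random.default_rng(0)
n = 4096
freqs = np.fft.rfftfreq(n)
freqs[0] = freqs[1]                      # avoid division by zero at k = 0
amp = freqs ** (-(0.5 + 0.5))            # power-law spectrum for H = 0.5
phases = np.exp(2j * np.pi * rng.random(len(freqs)))
profile = np.fft.irfft(amp * phases, n)
print(f"estimated H ≈ {roughness_exponent(profile):.2f}")
```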

Relevance: 30.00%

Publisher:

Abstract:

DI Diesel engines are widely used both for industrial and automotive applications due to their durability and fuel economy. Nonetheless, increasing environmental concerns force this type of engine to comply with increasingly demanding emission limits, so it has become mandatory to develop a robust design methodology of the DI Diesel combustion system focused on reducing soot and NOx simultaneously while maintaining a reasonable fuel economy. In recent years, genetic algorithms and three-dimensional CFD combustion simulations have been successfully applied to this kind of problem. However, combining GA optimization with actual three-dimensional CFD combustion simulations can be too onerous, since a large number of calculations is usually needed for the genetic algorithm to converge, resulting in a high computational cost and thus limiting the suitability of this method for industrial processes. In order to make the optimization process less time-consuming, CFD simulations can be more conveniently used to generate a training set for the learning process of an artificial neural network which, once correctly trained, can be used to forecast the engine outputs as a function of the design parameters during a GA optimization, performing a so-called virtual optimization. In the current work, a numerical methodology for the multi-objective virtual optimization of the combustion of an automotive DI Diesel engine, relying on artificial neural networks and genetic algorithms, was developed.
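
The sketch below illustrates the virtual-optimization idea on synthetic data: a neural network is trained on a (here faked) CFD training set and then used as a cheap surrogate inside a very simple evolutionary loop. The design variables, outputs, the fake_cfd function and the scalarized fitness are placeholders; the thesis used a genuine multi-objective GA rather than the weighted sum shown here.

```python
# Surrogate-assisted "virtual optimization" sketch on synthetic placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# pretend CFD training set: design parameters x -> [soot, NOx, fuel consumption]
X_train = rng.uniform(0.0, 1.0, size=(60, 3))
def fake_cfd(x):                       # placeholder for the real CFD run
    return np.array([x[0]**2 + 0.1*x[1], (1 - x[0])**2 + 0.1*x[2], 0.5 + 0.2*x[1]])
Y_train = np.array([fake_cfd(x) for x in X_train])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, Y_train)

# very small (1+1)-style search using a scalarized fitness on the surrogate
def fitness(x):                        # weighted sum of predicted soot, NOx, fuel
    soot, nox, fuel = surrogate.predict(x.reshape(1, -1))[0]
    return soot + nox + 0.5 * fuel

best = rng.uniform(0.0, 1.0, size=3)
for _ in range(200):
    cand = np.clip(best + rng.normal(0, 0.1, size=3), 0.0, 1.0)
    if fitness(cand) < fitness(best):
        best = cand
print("surrogate-optimal design parameters:", np.round(best, 2))
```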

Relevance: 30.00%

Publisher:

Abstract:

The optical properties and the surface-enhancement effect of rough metal surfaces and of nanoparticles have been discussed intensively in the literature for the infrared region of the spectrum. There are in principle two different strategies for preparing such surfaces: in the first, the nanoparticles are synthesized ex situ; the second approach relies on producing and growing the nanoparticles in situ. Both approaches were tested here, and it turned out that only the in-situ synthesis of the gold nanoparticles yields nanostructured surfaces that are electronically conductive, not too rough to allow membrane formation, and at the same time show an optimal surface-enhancement effect. Although the in-situ synthesis cannot produce nanoparticles of ideal shape, they nevertheless behave according to the theory of the surface-enhancement effect. Optimizing the shape and size of the nanoparticles in this work led to an optimization of the enhancement effect. Such optimized surfaces could easily be reproduced and are characterized by high stability. The surface enhancement obtained in this way amounts to a factor of 128 in absolute terms, compared with the coated ATR crystal without nanoparticles, or about 6 compared with the surface used in our group until now. Spectra can therefore now be obtained with a considerably better signal-to-noise ratio (SNR), which greatly simplifies and shortens the evaluation and processing of the spectra.

After optimizing the metal surface and the measurement parameters using cytochrome c as an example, work turned to the surface immobilization of the considerably larger cytochrome c oxidase. For this purpose the DTNTA linker was synthesized ex situ. Mixed self-assembled monolayers of DTNTA and DTP were then prepared. The NTA functionality is responsible for binding the CcO via the his-tag technology. The criteria for an optimal linker concentration were the electrical parameters of the layer before and after reconstitution into a lipid membrane, as well as the electron transfer rates determined by electrochemical measurements. Only with this optimized system, which works reliably and reproducibly, could further measurements on the CcO be started. From electrochemical measurements it was known that the CcO can be converted into an activated state by direct electron transfer under oxygen saturation. This activated state is characterized by a shift of the redox potentials of about 400 mV relative to the redox potential known from equilibrium titrations. SEIRAS showed that the reduction and oxidation of all redox centers indeed take place at the potentials measured in cyclic voltammetry. Moreover, the SEIRA spectra revealed that direct electron transfer induces substantial conformational changes within the protein.

Until now, based on electron transfer via mediators, it had been assumed that only minimal conformational changes are involved. Above all, the activated and non-activated states of cytochrome c oxidase could be demonstrated spectroscopically for the first time.

Relevance: 30.00%

Publisher:

Abstract:

Modern embedded systems embrace many-core shared-memory designs. Due to constrained power and area budgets, most of them feature software-managed scratchpad memories instead of data caches to increase data locality. It is therefore the programmers' responsibility to explicitly manage the memory transfers, and this makes programming these platforms cumbersome. Moreover, complex modern applications must be adequately parallelized before they can turn the parallel potential of the platform into actual performance. To support this, programming languages were proposed which work at a high level of abstraction and rely on a runtime whose cost hinders performance, especially in embedded systems, where resources and power budget are constrained. This dissertation explores the applicability of the shared-memory paradigm on modern many-core systems, focusing on ease of programming. It focuses on OpenMP, the de-facto standard for shared-memory programming. In a first part, the cost of algorithms for synchronization and data partitioning is analyzed, and these algorithms are adapted to modern embedded many-cores. Then, the original design of an OpenMP runtime library is presented, which supports complex forms of parallelism such as multi-level and irregular parallelism. In the second part of the thesis, the focus is on heterogeneous systems, where hardware accelerators are coupled to (many-)cores to implement key functional kernels with orders of magnitude of speedup and energy efficiency compared to the "pure software" version. However, three main issues arise, namely i) platform design complexity, ii) architectural scalability and iii) programmability. To tackle them, a template for a generic hardware processing unit (HWPU) is proposed, which shares the memory banks with the cores, and a template for a scalable architecture is shown, which integrates the HWPUs through the shared-memory system. Then, a full software stack and toolchain are developed to support platform design and to let programmers exploit the accelerators of the platform. The OpenMP frontend is extended to interact with it.
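
Purely as an illustration of the data-partitioning problem such a runtime has to solve, the sketch below (written in Python for readability, although an embedded OpenMP runtime would be written in C on the target) computes the static chunks of a parallel loop, i.e. the contiguous iteration ranges assigned to each core.

```python
# Static loop partitioning sketch: split n_iters iterations among n_cores
# as evenly as possible, the way an OpenMP-like runtime does for static
# scheduling. Illustrative only.
def static_chunks(n_iters, n_cores):
    """Return one (start, end) iteration range per core."""
    base, extra = divmod(n_iters, n_cores)
    ranges, start = [], 0
    for core in range(n_cores):
        size = base + (1 if core < extra else 0)   # first `extra` cores get one more
        ranges.append((start, start + size))
        start += size
    return ranges

# e.g. 100 iterations over a 16-core cluster
for core, (lo, hi) in enumerate(static_chunks(100, 16)):
    print(f"core {core:2d}: iterations [{lo}, {hi})")
```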

Relevance: 30.00%

Publisher:

Abstract:

Due to its practical importance and inherent complexity, the optimisation of distribution networks for supplying drinking water has been the subject of extensive study for the past 30 years. The optimisation may involve sizing the pipes in the water distribution network (WDN), optimising specific parts of the network such as pumps and tanks, or analysing and optimising the reliability of a WDN. In this thesis, the author has analysed two different WDNs (the Anytown and Cabrera city networks), trying to solve and optimise a multi-objective optimisation problem (MOOP). The two main objectives in both cases were the minimisation of energy cost (€) or energy consumption (kWh), along with the total number of pump switches (TNps) during a day. For this purpose, a decision support system generator for multi-objective optimisation was used: GANetXL, developed by the Centre for Water Systems at the University of Exeter. GANetXL works by calling the EPANET hydraulic solver each time a hydraulic analysis is required. The main algorithm used was NSGA-II, a second-generation algorithm for multi-objective optimisation, which produced the Pareto fronts of each configuration. The first experiment carried out concerned the Anytown network. It is a large network with a pump station of four fixed-speed parallel pumps boosting the flow. The main intervention was to replace these pumps with variable speed driven pumps (VSDPs), by installing inverters capable of varying their speed during the day. In this way, large energy and cost savings were achieved, along with a reduction in the number of pump switches. The results of this part of the research are thoroughly illustrated in chapter 7, with comments and a variety of graphs and configurations. The second experiment concerned the Cabrera city network. This smaller WDN has a single fixed-speed (FS) pump. The optimisation problem was the same: minimising the energy consumption and, in parallel, the TNps. The same optimisation tool was used (GANetXL). The main aim was to carry out several different experiments over a wide variety of configurations, using different pumps (this time keeping the FS mode), different tank levels, different pipe diameters and different emitter coefficients. All these different configurations produced a large number of results, which are compared in chapter 8. In conclusion, the optimisation of WDNs is a very interesting field with a vast space of options: a large number of algorithms to choose from, different techniques and configurations, and different decision support system generators. The researcher has to be ready to "roam" among these choices until a satisfactory result indicates that a good optimisation point has been reached.
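
A minimal sketch (not GANetXL or NSGA-II) of the multi-objective bookkeeping behind this kind of study: given candidate pump schedules already evaluated for the two objectives, energy cost and number of pump switches, keep only the non-dominated ones, i.e. the Pareto front that the optimisation algorithm converges towards. The candidate values are synthetic placeholders.

```python
# Pareto-front filter for two minimised objectives: energy cost and TNps.
def pareto_front(solutions):
    """solutions: list of (name, energy_cost, n_pump_switches); both minimised."""
    front = []
    for name, cost, switches in solutions:
        dominated = any(
            (c <= cost and s <= switches) and (c < cost or s < switches)
            for _, c, s in solutions
        )
        if not dominated:
            front.append((name, cost, switches))
    return front

# synthetic candidate pump schedules: (name, energy cost in €, pump switches)
candidates = [("A", 420.0, 12), ("B", 380.0, 18), ("C", 450.0, 10),
              ("D", 400.0, 12), ("E", 390.0, 11)]
print(pareto_front(candidates))   # only the non-dominated schedules remain
```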

Relevance: 30.00%

Publisher:

Abstract:

Chlorinated solvents have been the most ubiquitous organic contaminants found in groundwater over the last five decades. They generally reach groundwater as Dense Non-Aqueous Phase Liquid (DNAPL). This phase can migrate through aquifers, and also through aquitards, in ways that aqueous contaminants cannot. The complex phase partitioning that chlorinated solvent DNAPLs can undergo (i.e. to the dissolved, vapor or sorbed phase), as well as their transformations (e.g. degradation), depend on the physico-chemical properties of the contaminants themselves and on features of the hydrogeological system. The main goal of the thesis is to provide new knowledge for future investigations of sites contaminated by DNAPLs in alluvial settings, proposing innovative investigative approaches and emphasizing some of the key issues and main criticalities of this kind of contaminant in such a setting. To achieve this goal, the hydrogeologic setting below the city of Ferrara (Po plain, northern Italy), which is affected by scattered contamination by chlorinated solvents, has been investigated at different scales (regional and site-specific), both from an intrinsic (i.e. groundwater flow systems) and a specific (i.e. chlorinated solvent DNAPL behavior) point of view. Detailed investigations were carried out in particular in one selected test site, known as the "Caretti site", where high-resolution vertical profiles of different kinds of data were collected by means of multilevel monitoring systems and other innovative sampling and analytical techniques. This allowed a deep geological and hydrogeological knowledge of the system to be achieved and the architecture of the contaminants to be reconstructed in detail in relation to the features of the hosting porous medium. The results achieved in this thesis are useful not only at the local scale, e.g. to interpret the origin of contamination in other sites of the Ferrara area, but also at the global scale, in order to address future remediation and protection actions in similar hydrogeologic settings.

Relevance: 30.00%

Publisher:

Abstract:

This thesis describes the design and synthesis of potential agents for the treatment of the multifactorial Alzheimer's disease (AD). Our multi-target approach was to consider the cannabinoid system involved in AD together with classic targets. In the first project, designed modifications were performed on a lead molecule in order to increase potency and obtain balanced activities on fatty acid amide hydrolase (FAAH) and cholinesterases. A small library of compounds was synthesized and the biological results showed increased inhibitory activity (nanomolar range) towards the selected targets. The second project was focused on the benzofuran framework, a privileged structure and a common moiety found in many biologically active natural products and therapeutics. Hybrid molecules were designed and synthesized, focusing on the inhibition of cholinesterases, Aβ aggregation and FAAH, and on the interaction with CB receptors. Preliminary results showed that several compounds are potent CB ligands; in particular, the high affinity for CB2 receptors could open new opportunities to modulate neuroinflammation. The third and fourth projects were carried out at the IMS, Aberdeen, under the supervision of Prof. Matteo Zanda. The role of the cannabinoid system in the brain is still largely unexplored, and the relationship between the functional modification, density and distribution of CB1 receptors and the onset of a pathological state is not well understood. For this reason, Rimonabant analogues suitable as radioligands were synthesized. The latter, through PET, could provide reliable measurements of the density and distribution of CB1 receptors in the brain. In the fifth project, in collaboration with the CHyM of York, the goal was to develop arginine analogues that are target-specific, due to their exclusive localization in NOS enzymes, and could work as MRI contrast agents. The synthesized analogues could be suitable substrates for the transfer of polarization from para-hydrogen molecules through the SABRE technique, making MRI a more sensitive and faster technique.

Relevance: 30.00%

Publisher:

Abstract:

Coastal flooding poses serious threats to coastal areas around the world: it causes billions of dollars in damage to property and infrastructure and threatens the lives of millions of people. Therefore, disaster management and risk assessment aim at detecting vulnerability and capacities in order to reduce coastal flood disaster risk. In particular, non-specialized researchers, emergency management personnel, and land use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea level rise associated with climate change. This study contributes to the spatial evaluation and mapping of social-economic-environmental vulnerability and risk at sub-national scale through the development of appropriate tools and methods, successfully embedded in a Web-GIS Decision Support System. A new set of raster-based models was studied and developed in order to be easily implemented in the Web-GIS framework, with the purpose of quickly assessing and mapping flood hazard characteristics, damage and vulnerability in a multi-criteria approach. The Web-GIS DSS is developed using open-source software and programming languages, and its main strength is that it is available and usable by coastal managers and land use planners without requiring a strong scientific background in hydraulic engineering. The effectiveness of the system in coastal risk assessment is evaluated through its application to a real case study.
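
As a hedged illustration of a raster-based multi-criteria model of the kind embedded in such a Web-GIS DSS, the sketch below normalises synthetic hazard, exposure and vulnerability grids and combines them with user-defined weights into a risk map; the layers, weights and threshold are placeholders, not those of the study.

```python
# Raster multi-criteria overlay sketch: weighted combination of normalised
# hazard, exposure and vulnerability layers into a risk map (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
shape = (100, 100)                              # raster grid (rows, cols)
flood_depth = rng.uniform(0.0, 3.0, shape)      # hazard layer (m)
population = rng.uniform(0.0, 500.0, shape)     # exposure layer (people/cell)
vulnerability = rng.uniform(0.0, 1.0, shape)    # social-economic vulnerability index

def normalise(layer):
    """Rescale a raster layer to [0, 1]."""
    return (layer - layer.min()) / (layer.max() - layer.min())

weights = {"hazard": 0.4, "exposure": 0.3, "vulnerability": 0.3}
risk = (weights["hazard"] * normalise(flood_depth)
        + weights["exposure"] * normalise(population)
        + weights["vulnerability"] * normalise(vulnerability))

print("cells in the highest risk class (risk > 0.75):", int((risk > 0.75).sum()))
```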

Relevance: 30.00%

Publisher:

Abstract:

Systems Biology is an innovative way of doing biology that has recently emerged in bioinformatics contexts, characterised by the study of biological systems as complex systems, with a strong focus on the system level and on the interaction dimension. In other words, the objective is to understand biological systems as a whole, putting in the foreground not only the study of the individual parts as standalone parts, but also their interaction and the global properties that emerge at the system level by means of the interaction among the parts. This thesis focuses on the adoption of multi-agent systems (MAS) as a suitable paradigm for Systems Biology, for developing models and simulations of complex biological systems. Multi-agent systems have recently been introduced in computer science as a suitable paradigm for modelling and engineering complex systems. Roughly speaking, a MAS can be conceived as a set of autonomous and interacting entities, called agents, situated in some kind of environment, where they fruitfully interact and coordinate so as to obtain a coherent global system behaviour. The claim of this work is that the general properties of MAS make them an effective approach for modelling and building simulations of complex biological systems, following the methodological principles identified by Systems Biology. In particular, the thesis focuses on cell populations as biological systems. In order to support the claim, the thesis introduces and describes (i) a MAS-based model conceived for modelling the dynamics of systems of cells interacting inside cell environments called niches, and (ii) a computational tool developed for implementing the models and executing the simulations. The tool is meant to work as a kind of virtual laboratory, on top of which various kinds of virtual experiments can be performed, characterised by the definition and execution of specific models implemented as MASs, so as to support the validation, falsification and improvement of the models through the observation and analysis of the simulations. A hematopoietic stem cell system is taken as the reference case study for formulating a specific model and executing virtual experiments.
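
A minimal sketch (not the virtual laboratory developed in the thesis) of an agent-based cell population in the spirit described above: each cell agent stochastically divides, differentiates (leaving the niche) or dies, and the niche imposes a carrying capacity. All rates and the capacity are illustrative placeholders.

```python
# Agent-based sketch of stem cells living in a niche with a carrying capacity.
import random

random.seed(0)
P_DIVIDE, P_DIFFERENTIATE, P_DIE = 0.30, 0.10, 0.05   # placeholder rates
NICHE_CAPACITY = 200                                    # placeholder capacity

class CellAgent:
    def step(self, niche):
        r = random.random()
        if r < P_DIE:
            niche.remove(self)                          # cell death
        elif r < P_DIE + P_DIFFERENTIATE:
            niche.remove(self)                          # differentiated cells exit the niche
        elif r < P_DIE + P_DIFFERENTIATE + P_DIVIDE and len(niche) < NICHE_CAPACITY:
            niche.append(CellAgent())                   # self-renewal: one daughter stays

niche = [CellAgent() for _ in range(20)]
for t in range(50):
    for cell in list(niche):                            # copy: the list changes while iterating
        cell.step(niche)
print("stem cells in the niche after 50 steps:", len(niche))
```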

Relevance: 30.00%

Publisher:

Abstract:

Classic group recommender systems focus on providing suggestions for a fixed group of people. Our work tries to give an inside look at designing a new recommender system that is capable of making suggestions for a sequence of activities, dividing people into subgroups, in order to boost overall group satisfaction. However, this idea increases the problem complexity in more dimensions and poses a great challenge to the algorithm's performance. To understand the effectiveness of the approach, given the enhanced complexity and the need for precise problem solving, we implemented an experimental system using data collected from a variety of web services concerning the city of Paris. The system recommends activities to a group of users following two different approaches: Local Search and Constraint Programming. The general results show that the number of subgroups can significantly influence the Constraint Programming approach's computational time and efficacy. Generally, Local Search can find results much more quickly than Constraint Programming. Over a lengthy period of time, Local Search performs better than Constraint Programming, with similar final results.
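
The sketch below illustrates the Local Search side of the problem under strong simplifications: each user is assigned to one of a few subgroups, each subgroup tied to one activity, and a hill-climbing loop moves single users between subgroups as long as total satisfaction improves. The satisfaction matrix and activities are synthetic placeholders, and the real system additionally handles sequences of activities and group constraints.

```python
# Hill-climbing sketch: split users into activity subgroups to maximise
# total satisfaction. satisfaction[u][a] = how much user u likes activity a.
import random

random.seed(1)
n_users, activities = 8, ["museum", "bistro", "park"]
satisfaction = [[random.random() for _ in activities] for _ in range(n_users)]

def total_satisfaction(assignment):
    """assignment[u] = index of the activity (subgroup) user u is placed in."""
    return sum(satisfaction[u][assignment[u]] for u in range(n_users))

assignment = [random.randrange(len(activities)) for _ in range(n_users)]
improved = True
while improved:                                   # simple local search over single moves
    improved = False
    for u in range(n_users):
        for a in range(len(activities)):
            candidate = assignment.copy()
            candidate[u] = a
            if total_satisfaction(candidate) > total_satisfaction(assignment):
                assignment, improved = candidate, True

groups = {act: [u for u in range(n_users) if assignment[u] == i]
          for i, act in enumerate(activities)}
print(groups)
```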

Relevance: 30.00%

Publisher:

Abstract:

In recent years, information technologies have undergone exponential development. Among the countless innovations introduced, the agent-oriented programming paradigm has gained more and more ground; it enables the realization of complex software systems, which play a fundamentally important role in modern computing. These systems, called autonomous systems, show interesting characteristics for dynamic scenarios: they must be robust and resilient, able to adapt to the surrounding context and thus to react to changes occurring in the environment, behaving accordingly. They therefore exhibit the pro-activity of the entity under consideration. This thesis explains these kinds of systems, introduces their characteristics and shows their potential. These characteristics make it possible to give responsibility to the individual components, making the system self-organized, with better scalability and modularity, and thus reducing the heavy computational requirements. The document is organized as follows: the first chapters introduce the world of autonomous systems, starting from the definitions of autonomy and of software agents and concluding with multi-agent systems, so as to give the reader an adequate and exhaustive understanding. The following chapters deal with the design phases of the entities under examination, their forms of standardization and the models they can adopt, among which the best known is the BDI model. Two different methodologies for agent-oriented software engineering follow. The thesis then presents the state of the art of the known development environments, with a thorough introduction to each of them and a view of their contribution to commercial applications in industry. Finally, the thesis ends with a chapter of conclusions and reflections on possible future developments.