6 results for Software Re-use

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

40.00%

Abstract:

The aim of this thesis was to assess the seismic vulnerability of ordinary masonry buildings typical of the building stock of the Municipality of Maranello (MO) and to estimate their fragility curves. The masonry structural typologies characteristic of the municipality's buildings are identified, and the municipality is subdivided into sectors according to the CARTIS method. The goal was to determine which masonry typologies are the most vulnerable, and therefore which sectors of the municipality contain the most seismically fragile masonry buildings. The seismic vulnerability of several buildings representative of the masonry typologies present in the study area was assessed using two methodologies: the first is a rapid procedure called RE.SIS.TO., a simplified assessment developed by the Università degli Studi di Bologna, intended to determine the criticality of buildings and to define intervention priorities in a short time; the second is a more accurate assessment carried out through nonlinear static analysis with the 3Muri software, a program for the seismic analysis of masonry structures.
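
As a hedged illustration of what a fragility-curve estimate can look like (not the procedure actually used in the thesis), the following Python sketch evaluates a lognormal fragility model, i.e. the probability of exceeding a damage state as a function of peak ground acceleration; the median and dispersion values are hypothetical placeholders.

```python
# Minimal sketch of a lognormal fragility curve: probability of exceeding a
# damage state as a function of peak ground acceleration (PGA).
# The median (theta) and dispersion (beta) values are hypothetical,
# not results from the thesis.
import numpy as np
from scipy.stats import norm

def fragility(pga, theta, beta):
    """P(damage state exceeded | PGA) under a lognormal model."""
    return norm.cdf(np.log(pga / theta) / beta)

pga = np.linspace(0.05, 1.0, 20)                  # intensity measure grid [g]
p_exceed = fragility(pga, theta=0.30, beta=0.5)   # hypothetical parameters
for a, p in zip(pga, p_exceed):
    print(f"PGA = {a:.2f} g -> P(exceedance) = {p:.2f}")
```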

Relevance:

30.00%

Abstract:

Although Recovery is often described as the least studied and documented phase of the Emergency Management Cycle, a wide literature is available describing the characteristics and sub-phases of this process. Previous works do not provide an overall perspective, because of the lack of systematic, consistent monitoring of recovery using advanced technologies such as remote sensing and GIS. Given the key role of remote sensing in Response and Damage Assessment, this thesis aims to verify the appropriateness of such advanced monitoring techniques for detecting recovery advancements over time, with close attention to the main characteristics of the study event: the Hurricane Katrina storm surge. Based on multi-source, multi-sensor and multi-temporal data, the post-Katrina recovery was analysed using both a qualitative and a quantitative approach. The first phase was dedicated to investigating the relation between urban types, damage and recovery state, with reference to geographical and technological parameters. Damage and recovery scales were proposed to review critical observations on remarkable surge-induced effects on various typologies of structures, analysed at a per-building level. This wide-ranging investigation allowed a new understanding of the distinctive features of the recovery process. A quantitative analysis was then employed to develop methodological procedures suited to recognizing and monitoring the distribution, timing and characteristics of recovery activities in the study area. Promising results, obtained by applying supervised classification algorithms to detect the localization and distribution of blue tarps, proved that this methodology may help the analyst in detecting and monitoring recovery activities in areas affected by medium damage. The study found that Mahalanobis Distance was the classifier providing the most accurate results in localizing blue roofs, with 93.7% of blue roofs classified correctly and a producer accuracy of 70%, and it proved to be the classifier least sensitive to spectral signature alteration. The application of dissimilarity textural classification to satellite imagery demonstrated the suitability of this technique for detecting debris distribution and for monitoring demolition and reconstruction activities in the study area. Linking these geographically extensive techniques with expert per-building interpretation of advanced-technology ground surveys provides a multi-faceted view of the physical recovery process. Remote sensing and GIS technologies, combined with an advanced ground survey approach, provide extremely valuable capabilities for monitoring Recovery activities and may constitute a technical basis to guide aid organizations and local governments in Recovery management.
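
The abstract reports that a Mahalanobis Distance classifier gave the best results for blue-tarp detection. As a rough sketch of how such a classifier works in general (per-class means and covariances estimated from training pixels, each pixel assigned to the class at the smallest Mahalanobis distance), the Python example below uses synthetic spectra; it is not the thesis's actual processing chain or imagery.

```python
# Rough sketch of a Mahalanobis-distance classifier on pixel spectra.
# Training data and band values are synthetic placeholders, not the
# imagery used in the thesis.
import numpy as np

def fit(X, y):
    """Estimate per-class mean and inverse covariance from training pixels."""
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (Xc.mean(axis=0), np.linalg.inv(np.cov(Xc, rowvar=False)))
    return model

def predict(X, model):
    """Assign each pixel to the class with the smallest Mahalanobis distance."""
    classes = sorted(model)
    dists = []
    for c in classes:
        mean, inv_cov = model[c]
        d = X - mean
        dists.append(np.einsum("ij,jk,ik->i", d, inv_cov, d))
    return np.array(classes)[np.argmin(np.vstack(dists), axis=0)]

rng = np.random.default_rng(0)
blue_tarp = rng.normal([0.1, 0.2, 0.8], 0.05, size=(50, 3))  # synthetic spectra
other     = rng.normal([0.4, 0.4, 0.3], 0.05, size=(50, 3))
X = np.vstack([blue_tarp, other])
y = np.array([1] * 50 + [0] * 50)
model = fit(X, y)
print(predict(X[:5], model))   # expected: mostly class 1 (blue tarp)
```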

Relevance:

30.00%

Abstract:

Cloud computing is probably the most debated topic in today's Information and Communication Technology (ICT) world. The spread of this new way of conceiving the delivery of IT services is the evolution of a series of technologies that are revolutionizing the way organizations build their IT infrastructures. The advantages of using cloud computing infrastructures include greater control over services, over the cost structure and over the assets employed. Costs are proportional to the actual use of services (pay-per-use), thus avoiding waste and making the sourcing system more efficient. Several companies have already begun to try some cloud services and many others are considering starting down a similar path. The first organization to provide a cloud computing platform was Amazon, with its Elastic Compute Cloud (EC2). In July 2010 OpenStack was born, an open-source project created from the merger of code developed by the NASA government agency [10] and by the US hosting company Rackspace. The resulting software performs the same functions as Amazon's but, unlike the latter, it was released under the Apache license, hence with no restrictions on use or implementation. Today the OpenStack project counts numerous partner companies such as Dell, HP, IBM, Cisco and Microsoft. The objective of this work is to understand and analyse how the OpenStack software works. The main goal is to become familiar with the different components it is made of and to understand how they interact with each other, in order to build cloud infrastructures of the Infrastructure as a Service (IaaS) type. The topics are organized in the following chapters. The first chapter introduces the definition of cloud computing, discusses its main characteristics, and then describes the different service and deployment models, outlining their advantages and disadvantages. The second chapter deals with one of the technologies used to build cloud computing infrastructures, virtualization, covering its various forms and types. The third chapter analyses and describes in detail how the OpenStack project works: for each software component, the architecture is illustrated, accompanied by diagrams, together with the corresponding mechanism. The fourth chapter covers the installation and configuration of the software, and presents some tests carried out on the machine on which the software was installed. Finally, the fifth chapter draws the conclusions, with considerations on the objectives achieved and on the characteristics of the software under examination.
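
Since the thesis is about building IaaS infrastructures with OpenStack, a minimal sketch of how a client can launch an instance through the openstacksdk Python library may help illustrate the idea; the cloud name, image, flavor and network below are hypothetical placeholders, not values from the environment described in the thesis.

```python
# Minimal sketch of launching an instance on an OpenStack IaaS cloud using
# the openstacksdk library. Cloud, image, flavor and network names are
# hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="my-devstack")   # reads credentials from clouds.yaml

image = conn.compute.find_image("cirros-0.6.2")    # hypothetical image name
flavor = conn.compute.find_flavor("m1.small")      # hypothetical flavor name
network = conn.network.find_network("private")     # hypothetical network name

server = conn.compute.create_server(
    name="demo-instance",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)      # block until the VM is ACTIVE
print(server.name, server.status)
```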

Relevance:

30.00%

Abstract:

In recent years, locating people and objects and communicating with them in real time has become commonplace in everyday life. Nowadays, the state of the art of indoor location systems has no dominant technology, unlike outdoor location systems, where GPS is dominant. In fact, each indoor location technology presents a set of features that prevents its use across all application scenarios but, thanks to its characteristics, it can coexist well with other similar technologies, without being dominant or more widely adopted than the other indoor location systems. In this context, the European project SELECT studies the opportunity of collecting all these different features into an innovative system that can be used in a large number of application scenarios. The goal of this project is to realize a wireless system in which a network of fixed readers is able to query one or more tags attached to the objects to be located. The SELECT consortium is composed of European institutions and companies, including Datalogic S.p.A. and CNIT, which are in charge of the software and firmware development of the baseband receiving section of the readers, whose function is to acquire and process the information received from generic tagged objects. Since the SELECT project has highly innovative content, one of the key stages of the system design is the debug phase. This work aims to study and develop tools and techniques for debugging the firmware of the baseband receiving section of the readers.

Relevance:

30.00%

Abstract:

Resource management is of paramount importance in network scenarios and is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been maintained in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, centralized control makes it possible to define more specific and complex tasks that may involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees leads network programmers to design network protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with a high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between this architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network, with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
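
As a hedged illustration of the kind of mechanism OpenFlow gives the control plane (not the thesis's QoS Management and Orchestration architecture itself), the sketch below uses the Ryu OpenFlow 1.3 controller to install a flow rule that steers UDP traffic from one host into a pre-configured switch queue; the queue ID, IP address and output port are assumptions made for the example.

```python
# Sketch of a Ryu OpenFlow 1.3 application that maps UDP traffic from one
# host into a pre-configured egress queue, as a simple QoS differentiation
# mechanism. Queue IDs, IP addresses and port numbers are hypothetical.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3


class QosSketch(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        parser = dp.ofproto_parser

        # Match UDP traffic coming from a (hypothetical) real-time source host.
        match = parser.OFPMatch(eth_type=0x0800, ip_proto=17,
                                ipv4_src="10.0.0.1")

        # Send matching packets to queue 1 (assumed to be configured on the
        # switch with a guaranteed rate) and forward them out of port 2.
        actions = [parser.OFPActionSetQueue(1),
                   parser.OFPActionOutput(2)]
        inst = [parser.OFPInstructionActions(
            dp.ofproto.OFPIT_APPLY_ACTIONS, actions)]

        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=10,
                                      match=match, instructions=inst))
```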

Relevance:

30.00%

Abstract:

One of the most serious problems of modern medicine is the growing emergence of antibiotic resistance among pathogenic bacteria. In these circumstances, different and innovative approaches for treating infections caused by multidrug-resistant bacteria are urgently required. Bacteriophage therapy is one of the fascinating approaches to be taken into account. It consists in the use of bacteriophages, viruses that infect bacteria, in order to defeat specific bacterial pathogens. Phage therapy is not a new idea: it was widely used around the world in the 1930s and 1940s to treat various infectious diseases, and it is still used in Eastern Europe and the former Soviet Union. Nevertheless, Western scientists mostly lost interest in the further use and study of phage therapy and abandoned it after the discovery and spread of antibiotics. The advancement of scientific knowledge in recent years, together with the encouraging results of recent animal studies using phages to treat bacterial infections, and above all the urgent need for novel and effective antimicrobials, have prompted additional rigorous research in this field. In particular, in the synthetic biology laboratory of the Department of Life Sciences at the University of Warwick, a novel approach was adopted, starting from the original concept of phage therapy, in order to study a concrete alternative to antibiotics. The innovative idea of the project consists in developing experimental methodologies that allow the engineering of a programmable synthetic phage system using a combination of directed evolution, automation and microfluidics. The main aim is to make "the therapeutics of tomorrow individualized, specific, and self-regulated" (Jaramillo, 2015). In this context, one of the most important key points is bacteriophage quantification. Therefore, in this research work, a mathematical model describing the complex dynamics occurring in biological systems involving the continuous growth of bacteriophages, modulated by the performance of the host organisms, was implemented algorithmically as working software in MATLAB. The developed program is able to predict unknown phage concentrations much faster than the classical overnight plaque assay. Moreover, it gives meaning and an explanation to the obtained data, making inferences about the parameter set of the model, which is representative of the bacteriophage-host interaction.
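
As a hedged illustration of what a phage-host growth model can look like (not the thesis's actual equations, parameter values or MATLAB implementation), here is a small Python sketch of a classic susceptible/infected/phage ODE system solved with SciPy; all parameter values are illustrative placeholders.

```python
# Sketch of a simple bacteriophage-host ODE model: susceptible bacteria S,
# infected bacteria I and free phage P. Parameter values are illustrative
# placeholders, not those inferred in the thesis.
import numpy as np
from scipy.integrate import solve_ivp

r = 0.7        # bacterial growth rate [1/h]        (hypothetical)
K = 1e9        # carrying capacity [cells/mL]       (hypothetical)
k = 1e-9       # adsorption rate [mL/(phage*h)]     (hypothetical)
delta = 1.0    # lysis rate of infected cells [1/h] (hypothetical)
burst = 100    # phage released per lysed cell      (hypothetical)
m = 0.1        # phage decay rate [1/h]             (hypothetical)

def model(t, y):
    S, I, P = y
    dS = r * S * (1 - (S + I) / K) - k * S * P   # logistic growth minus infection
    dI = k * S * P - delta * I                   # new infections minus lysis
    dP = burst * delta * I - k * S * P - m * P   # burst release minus adsorption/decay
    return [dS, dI, dP]

sol = solve_ivp(model, (0, 24), [1e6, 0, 1e4], dense_output=True)
t = np.linspace(0, 24, 7)
for ti, (S, I, P) in zip(t, sol.sol(t).T):
    print(f"t = {ti:4.1f} h  S = {S:.3e}  I = {I:.3e}  P = {P:.3e}")
```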