11 results for Models performance

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

40.00%

Publisher:

Abstract:

As part of their digital transformation, many organizations are adopting new technologies to support the development, deployment, and management of their microservice-based architectures in cloud environments and across cloud providers. In this scenario, service and event meshes are emerging as dynamic, configurable infrastructure layers that facilitate complex interactions and the management of microservice-based applications and cloud services. The goal of this work is to analyze open-source mesh solutions (Istio, Linkerd, Apache EventMesh) from a performance standpoint, when used to manage the communication between workflow-based microservice applications within the cloud environment. To this end, a system was built to deploy each of the components both within a single cluster and in a multi-cluster environment. Metric collection and aggregation were carried out with a custom system compatible with the Prometheus data format. The tests allowed us to evaluate the performance of each component together with its effectiveness. Overall, while the maturity of the tested service mesh implementations could be confirmed, the event mesh solution we used appeared to be a technology that is not yet mature, owing to numerous operational problems.
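
As a concrete illustration of the metric-collection layer described above, the following minimal Python sketch scrapes an endpoint that exposes data in the Prometheus exposition format and extracts the samples of a single metric. The endpoint URL and the metric name are hypothetical placeholders, not those of the custom system built for this thesis.

    # Minimal sketch: scrape a Prometheus-format metrics endpoint and
    # collect the samples of one metric. URL and metric name are
    # illustrative placeholders.
    import urllib.request

    METRICS_URL = "http://localhost:9090/metrics"  # hypothetical endpoint
    METRIC_NAME = "request_duration_seconds_sum"   # hypothetical metric

    def scrape_metric(url, name):
        with urllib.request.urlopen(url) as resp:
            text = resp.read().decode("utf-8")
        samples = []
        for line in text.splitlines():
            # Skip comments (# HELP / # TYPE) and unrelated metrics.
            if line.startswith("#") or not line.startswith(name):
                continue
            # The value is the first field after the optional label block.
            value = line.split("}", 1)[1].split()[0] if "}" in line else line.split()[1]
            samples.append(float(value))
        return samples

    if __name__ == "__main__":
        print(scrape_metric(METRICS_URL, METRIC_NAME))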

Relevance:

30.00%

Publisher:

Abstract:

Over the last decade, the demand for structural health monitoring expertise has increased exponentially in the United States. The aging issues that most transportation structures are experiencing can seriously jeopardize the economy of a region as well as that of a country. At the same time, the monitoring of structures is a central topic of discussion in Europe, where the preservation of historical buildings has been addressed over the last four centuries. More recently, various concerns have arisen about the security performance of civil structures after tragic events such as the 9/11 attacks or the 2011 Japan earthquake: engineers look for designs able to resist exceptional loadings due to earthquakes, hurricanes and terrorist attacks. After events of this kind, the assessment of the remaining life of the structure is at least as important as the initial performance design. Consequently, it is clear that the introduction of reliable and accessible damage assessment techniques is crucial for localizing issues and for correct and immediate rehabilitation. System Identification is a branch of the more general Control Theory. In Civil Engineering, this field addresses the techniques needed to find mechanical characteristics such as stiffness or mass starting from the signals captured by sensors. The objective of Dynamic Structural Identification (DSI) is to determine, starting from experimental measurements, the fundamental modal parameters of a generic structure in order to characterize its dynamic behavior via a mathematical model. Knowledge of these parameters is helpful in the Model Updating procedure, which makes it possible to define corrected theoretical models through experimental validation. The main aim of this technique is to minimize the differences between the theoretical model results and in situ measurements of dynamic data. The updated model therefore becomes a very effective control tool for the rehabilitation of structures or for damage assessment. Instrumenting a whole structure is sometimes unfeasible, either because of the high cost involved or because it is not physically possible to reach every point of the structure. Numerous scholars have therefore tried to address this problem, and two main methods are generally involved. In the first case, given the limited number of sensors, it is possible to gather time histories only for some locations, then move the instruments to another location and repeat the procedure. Otherwise, if the number of sensors is sufficient and the structure does not present a complicated geometry, it is usually enough to detect only the principal first modes. These two problems are well presented in the works of Balsamo [1], for the application to a simple system, and Jun [2], for the analysis of a system with a limited number of sensors. Once the system identification has been carried out, it is possible to obtain the actual system characteristics. A frequent practice is to create an updated FEM model and assess whether or not the structure fulfills the required functions. In summary, the objective of this work is to present a general methodology to analyze large structures using a limited amount of instrumentation while, at the same time, obtaining the most information possible about the identified structure, without recalling methodologies of difficult interpretation. A general framework for the state-space identification procedure via the OKID/ERA algorithm is developed and implemented in Matlab. Then, some simple examples are proposed to highlight the principal characteristics and advantages of this methodology. A new algebraic manipulation for a prolific use of substructuring results is developed and implemented.
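
As an illustration of the identification step, here is a compact Python sketch of the ERA realization, assuming the Markov parameters (impulse-response samples) have already been recovered, e.g. by OKID. The thesis implementation is in Matlab; the names and block dimensions below are illustrative only.

    # Sketch of the Eigensystem Realization Algorithm (ERA) for a SISO
    # system: build block Hankel matrices from Markov parameters, truncate
    # the SVD at the chosen model order, and realize (A, B, C).
    import numpy as np

    def era(Y, n, p=20, q=20):
        """Y: Markov parameters, Y[0] = C B; n: model order;
        p, q: Hankel dimensions (requires len(Y) >= p + q)."""
        H0 = np.array([[Y[i + j] for j in range(q)] for i in range(p)])
        H1 = np.array([[Y[i + j + 1] for j in range(q)] for i in range(p)])
        U, s, Vt = np.linalg.svd(H0)
        Un, Vn = U[:, :n], Vt[:n, :].T          # dominant subspaces
        S_sqrt = np.diag(np.sqrt(s[:n]))
        S_isqrt = np.diag(1.0 / np.sqrt(s[:n]))
        A = S_isqrt @ Un.T @ H1 @ Vn @ S_isqrt  # state matrix
        B = (S_sqrt @ Vn.T)[:, :1]              # first input column
        C = (Un @ S_sqrt)[:1, :]                # first output row
        return A, B, C

    # Modal parameters then follow from the eigenvalues z of A:
    # s = log(z) / dt, natural frequencies f = |s| / (2 * pi).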

Relevance:

30.00%

Publisher:

Abstract:

Sub-grid scale (SGS) models are required in large-eddy simulations (LES) in order to model the influence of the unresolved small scales, i.e. the flow at the smallest scales of turbulence, on the resolved scales. In the following work two SGS models are presented and analyzed in depth in terms of accuracy through several LESs with different spatial resolutions, i.e. grid spacings. The first part of this thesis focuses on the basic theory of turbulence, the governing equations of fluid dynamics and their adaptation to LES. Furthermore, two important SGS models are presented: one is the Dynamic eddy-viscosity model (DEVM), developed by Germano et al. (1991), while the other is the Explicit Algebraic SGS model (EASSM), by Marstorp et al. (2009). In addition, some details about the implementation of the EASSM in a pseudo-spectral Navier-Stokes code (SIMSON, Chevalier et al. 2007) are presented. The performance of the two aforementioned models will be investigated in the following chapters by means of LES of a channel flow, with friction Reynolds numbers from $Re_\tau=590$ up to $Re_\tau=5200$, at relatively coarse resolutions. Data from each simulation will be compared to baseline DNS data. Results have shown that, in contrast to the DEVM, the EASSM has promising potential for flow prediction at high friction Reynolds numbers: the higher the friction Reynolds number, the better the EASSM behaves and the worse the DEVM performs. The better performance of the EASSM is attributed to its ability to capture flow anisotropy at the small scales through a correct formulation of the SGS stresses. Moreover, a considerable reduction in the required computational resources can be achieved using the EASSM compared to the DEVM. The EASSM therefore combines accuracy and computational efficiency, implying a clear potential for industrial CFD usage.
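
For reference, the eddy-viscosity closure underlying the DEVM relates the deviatoric part of the SGS stress tensor to the resolved strain rate; in the dynamic procedure of Germano et al. the coefficient $C$ is computed from the resolved field rather than fixed a priori. A standard textbook form of this closure (not quoted verbatim from the thesis) is

$$\tau_{ij} - \frac{\delta_{ij}}{3}\,\tau_{kk} = -2\,\nu_t\,\bar{S}_{ij}, \qquad \nu_t = (C\Delta)^2\,|\bar{S}|, \qquad |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},$$

where $\Delta$ is the filter width and $\bar{S}_{ij}$ is the resolved strain-rate tensor.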

Relevance:

30.00%

Publisher:

Abstract:

The large-scale deployment of Intelligent Transportation Systems is very close. The main component of such a smart environment is the network that provides connectivity for all vehicles. Public safety is the most demanding application, because it requires fast, reliable and secure communication. Although IEEE 802.11p is presently the only complete wireless standard for vehicular communications, recent advancements in 3GPP LTE provide support for direct communications, and ongoing activities are also addressing the vehicle-to-vehicle case. This thesis focuses on the resource allocation procedures and performance of LTE-V2V. To this aim, a MATLAB simulator has been implemented, and results have been obtained adopting different mobility models for both in-coverage and out-of-coverage scenarios.
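
As a toy illustration of why the resource allocation procedure matters, especially out of coverage where vehicles select resources autonomously, the following Python sketch estimates the collision probability when vehicles pick sidelink resources independently at random. This is a deliberately crude stand-in for the standardized selection procedures; all parameters are illustrative and unrelated to the thesis simulator.

    # Toy Monte Carlo estimate of the probability that at least one
    # resource collision occurs when vehicles choose at random from a
    # shared pool. Parameters are illustrative.
    import random

    def collision_rate(n_vehicles, n_resources, trials=10_000):
        collided = 0
        for _ in range(trials):
            picks = [random.randrange(n_resources) for _ in range(n_vehicles)]
            if len(set(picks)) < n_vehicles:  # some resource chosen twice
                collided += 1
        return collided / trials

    print(collision_rate(n_vehicles=20, n_resources=100))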

Relevance:

30.00%

Publisher:

Abstract:

The study analyses the calibration process of a newly developed high-performance plug-in hybrid electric passenger car powertrain. The complexity of modern powertrains and the increasingly restrictive regulations on pollutant emissions are the primary challenges for the calibration of a vehicle's powertrain. In addition, OEM managers need to know as early as possible whether the vehicle under development will meet its target technical features (emissions included). This leads to the need for advanced calibration methodologies that keep the development of the powertrain robust, time-effective and cost-effective. The suggested solution is virtual calibration, which allows the control functions of a powertrain to be tuned before the powertrain is built. The aim of this study is to calibrate the hybrid control unit functions virtually, in order to optimize pollutant emissions and fuel consumption. Starting from the model of the conventional vehicle, the powertrain is hybridized and integrated with emission and aftertreatment models. After its validation, the hybrid control unit strategies are optimized using the Model-in-the-Loop testing methodology. The calibration activities will then proceed with the implementation of a Hardware-in-the-Loop environment, which will make it possible to test and calibrate the Engine and Transmission control units effectively and in a time- and cost-saving manner.
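
To give a feel for Model-in-the-Loop testing, the following Python sketch steps a trivial longitudinal vehicle model together with a control law in simulation, which is the essence of tuning control parameters before any hardware exists. The plant, the proportional controller and every numeric value are illustrative assumptions, not the thesis models.

    # Minimal Model-in-the-Loop sketch: a plant model and a control
    # function run together in simulation, so control parameters can be
    # tuned virtually. All models and gains are illustrative.
    def plant_step(speed, force, dt=0.01, mass=1500.0, drag=0.5):
        accel = (force - drag * speed) / mass  # crude longitudinal dynamics
        return speed + accel * dt

    def controller(target, speed, kp=800.0):
        return kp * (target - speed)  # simple proportional law

    speed, target = 0.0, 20.0  # m/s
    for _ in range(5000):      # 50 s of simulated time
        force = controller(target, speed)
        speed = plant_step(speed, force)
    print(f"final speed: {speed:.2f} m/s")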

Relevance:

30.00%

Publisher:

Abstract:

Collecting and analysing data is an important element in any field of human activity and research. In sports too, collecting and analyzing statistical data is attracting growing interest. Some exemplary use cases are: improvement of technical/tactical aspects for team coaches, definition of game strategies based on the opposing team's play, and evaluation of player performance. Other advantages are related to making referee decisions more precise and impartial: a wrong decision can change the outcome of important matches. Finally, such data can be used to provide better representations and graphic effects that make the game more engaging for the audience during the match. Nowadays it is possible to delegate this type of task to automatic software systems that use cameras or even hardware sensors to collect images or data and process them. One of the most efficient methods of collecting data is to process the video images of the sporting event through mixed techniques of machine learning applied to computer vision. As in other domains in which computer vision can be applied, the main tasks in sports are related to object detection, player tracking, and the pose estimation of athletes. The goal of the present thesis is to apply different CNN models to analyze volleyball matches. Starting from the video frames of a volleyball match, we reproduce a bird's-eye view of the playing court onto which all the players are projected, also reporting, for each player, the type of action she/he is performing.
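
The bird's-eye-view step described above is typically obtained with a planar homography: four known court points seen in the image are matched to their court coordinates, and detected player positions are then projected through the resulting transform. A minimal sketch with OpenCV follows; all point values are illustrative, not taken from the thesis.

    # Sketch: estimate a homography from four court corners and project
    # detected player positions onto court coordinates (meters).
    import numpy as np
    import cv2

    # Pixel positions of the four court corners in the image (example).
    img_pts = np.float32([[320, 200], [960, 210], [1150, 640], [130, 630]])
    # Corresponding court coordinates: a 9 m x 18 m volleyball court.
    court_pts = np.float32([[0, 0], [9, 0], [9, 18], [0, 18]])

    H = cv2.getPerspectiveTransform(img_pts, court_pts)

    # Project detected player foot positions (image pixels) onto the court.
    players_img = np.float32([[[500, 400]], [[700, 550]]])
    players_court = cv2.perspectiveTransform(players_img, H)
    print(players_court.reshape(-1, 2))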

Relevance:

30.00%

Publisher:

Abstract:

The following thesis work focuses on the use and implementation of advanced models for measuring the resilience of water distribution networks. In particular, the functions implemented in GRA Tool, a software package developed by the University of Exeter (UK), and the functions of the EPANET 2.2 Toolkit were investigated. The study of resilience and failure, carried out with GRA Tool and with a methodology based on the combined use of EPANET 2.2 and MATLAB, was tested in a first phase on a small water distribution network from the literature, so that the variability of the results could be perceived more clearly and immediately, and then on a more complex network, that of Modena. Specifically, it was decided to recreate a time-deferred failure mode proposed by GRA Tool, namely pipe failure, in order to compare the two methodologies. The analysis of hydraulic efficiency was conducted using a synthetic and global network performance index, i.e., the Resilience Index introduced by Todini and developed over the years 2000-2016. This index, being one of the parameters with which to evaluate the overall state of "hydraulic well-being" of a network, has the advantage of being able to act as a criterion for selecting possible improvements to the network itself. Furthermore, these analyses also traced the analytical development that the Resilience Index formula has undergone over time. The final intent of this thesis work was to understand how to improve the resilience of the system in question: the pipe-rupture scenario was introduced precisely to identify the most problematic branches, i.e., those whose failure would cause the greatest damage to the network, including the greatest lowering of the Resilience Index.
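
For reference, the classic formulation of Todini's Resilience Index, as commonly reported in the literature (the thesis traces its later analytical developments), reads

$$I_r = \frac{\sum_{i=1}^{n_n} q_i\,(h_i - h_i^{*})}{\sum_{k=1}^{n_r} Q_k H_k + \sum_{j=1}^{n_p} \frac{P_j}{\gamma} - \sum_{i=1}^{n_n} q_i\,h_i^{*}},$$

where $q_i$ and $h_i$ are the demand and the available head at node $i$, $h_i^{*}$ is the minimum head required at that node, $Q_k$ and $H_k$ are the discharge and head of reservoir $k$, $P_j$ is the power of pump $j$, and $\gamma$ is the specific weight of water.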

Relevance:

30.00%

Publisher:

Abstract:

The High Energy Rapid Modular Ensemble of Satellites (HERMES) is a new mission concept involving the development of a constellation of six CubeSats in low Earth orbit with new miniaturized instruments that host a hybrid Silicon Drift Detector/GAGG:Ce based system for X-ray and γ-ray detection, aiming to monitor high-energy cosmic transients, such as Gamma Ray Bursts and the electromagnetic counterparts of gravitational wave events. The HERMES constellation will also operate together with the Australian-Italian SpIRIT mission, which will house a HERMES-like detector. The HERMES pathfinder mini-constellation, consisting of six satellites plus SpIRIT, is likely to be launched in 2023. The HERMES detectors are based on the heritage of the Italian ReDSoX collaboration, with joint design and production by INFN-Trieste and Fondazione Bruno Kessler, and the involvement of several Italian research institutes and universities. An application-specific, low-noise, low-power integrated circuit (ASIC) called LYRA was conceived and designed for the HERMES readout electronics. My thesis project focuses on the ground calibrations of the first HERMES and SpIRIT flight detectors, with a performance assessment and characterization of the detectors. The first part of this work addresses measurements and experimental tests on laboratory prototypes of the HERMES detectors and their front-end electronics, while the second part is based on the design of the experimental setup for flight detector calibrations and related functional tests for data acquisition, as well as the development of the calibration software. In more detail, the calibration parameters (such as the gain of each detector channel) are determined using measurements with radioactive sources, performed at different operating temperatures between -20°C and +20°C by placing the detector in a suitable climate chamber. The final part of the thesis involves the analysis of the calibration data and a discussion of the results.
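
As an illustration of the gain-calibration step described above, the sketch below fits measured peak centroids against known line energies to obtain a linear gain and offset for one detector channel. The quoted energies correspond to real 55Fe and 241Am lines, but the ADC centroids and the linear response model are illustrative assumptions, not the thesis data.

    # Sketch: per-channel energy calibration by linear fit of known line
    # energies (keV) versus measured peak centroids (ADC channels).
    import numpy as np

    energies = np.array([5.9, 13.9, 17.8, 26.3, 59.5])         # 55Fe / 241Am lines
    channels = np.array([212.0, 498.0, 637.0, 941.0, 2128.0])  # example centroids

    gain, offset = np.polyfit(channels, energies, deg=1)
    print(f"gain = {gain:.4f} keV/channel, offset = {offset:.2f} keV")

    # Convert raw channels to energies and inspect the fit residuals.
    residuals = energies - (gain * channels + offset)
    print("residuals (keV):", np.round(residuals, 3))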

Relevance:

30.00%

Publisher:

Abstract:

In the last few years there has been great development of technologies like quantum computers and quantum communication systems, due to their huge potential and the growing number of applications. However, physical qubits suffer from many nonidealities, like measurement errors and decoherence, that generate failures in the quantum computation. This work shows how it is possible to exploit concepts from classical information theory in order to realize quantum error-correcting codes, by adding redundant qubits. In particular, the threshold theorem states that it is possible to lower the percentage of decoding failures at will, provided the physical error rate is below a given accuracy threshold. The focus will be on codes belonging to the family of topological codes, like toric, planar and XZZX surface codes. Firstly, they will be compared from a theoretical point of view, in order to show their advantages and disadvantages. The algorithms behind the minimum-weight perfect matching (MWPM) decoder, the most popular decoder for such codes, will be presented. The last section will be dedicated to the analysis of the performance of these topological codes under different error channel models, showing interesting results. In particular, while the error correction capability of surface codes decreases in the presence of biased errors, XZZX codes possess intrinsic symmetries that allow them to improve their performance when one kind of error occurs more frequently than the others.
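
The idea behind the minimum-weight perfect matching decoder can be shown at toy scale: syndrome defects are paired so that the total length of the correction chains is minimized. Production decoders use Blossom-type algorithms; the brute-force Python sketch below, with Manhattan distance as the weight, is for illustration only.

    # Toy MWPM: enumerate all pairings of an even number of syndrome
    # defects and keep the one with minimum total Manhattan distance.
    from itertools import permutations

    def manhattan(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    def mwpm_bruteforce(defects):
        assert len(defects) % 2 == 0, "defects come in pairs"
        first, rest = defects[0], defects[1:]
        best, best_cost = None, float("inf")
        for perm in permutations(rest):
            pairs = [(first, perm[0])] + [
                (perm[i], perm[i + 1]) for i in range(1, len(perm) - 1, 2)
            ]
            cost = sum(manhattan(a, b) for a, b in pairs)
            if cost < best_cost:
                best, best_cost = pairs, cost
        return best, best_cost

    print(mwpm_bruteforce([(0, 0), (0, 3), (2, 1), (5, 5)]))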

Relevance:

30.00%

Publisher:

Abstract:

Historic vaulted masonry structures often need strengthening interventions that can effectively improve their structural performance, especially during seismic events, while at the same time respecting the existing setting and modern conservation requirements. In this context, the use of innovative materials such as fiber-reinforced composites has been shown to be an effective solution that can satisfy both aspects. This work aims to provide insight into the computational modeling of a full-scale masonry vault strengthened with fiber-reinforced composite materials and to analyze the influence of the arrangement of the reinforcement on the efficiency of the intervention. At first, a parametric model of a cross vault focusing on a realistic representation of its micro-geometry is proposed. Then, numerical pushover analyses of several barrel vaults reinforced with different reinforcement configurations are performed. Finally, the results are collected and discussed in terms of the force-displacement curves obtained for each proposed configuration.
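
Since the outcome of each pushover analysis is a force-displacement curve, a natural final step is to overlay the capacity curves of the different reinforcement configurations. The Python sketch below shows one way to do this; the file names and the two-column data layout are hypothetical.

    # Sketch: overlay pushover capacity curves (base shear vs control
    # displacement) for several reinforcement configurations.
    import numpy as np
    import matplotlib.pyplot as plt

    configs = ["unreinforced.csv", "full_extrados.csv", "diagonal_strips.csv"]
    for path in configs:
        d, f = np.loadtxt(path, delimiter=",", unpack=True)  # displacement, force
        plt.plot(d, f, label=path.removesuffix(".csv"))

    plt.xlabel("control displacement [mm]")
    plt.ylabel("base shear [kN]")
    plt.legend()
    plt.savefig("capacity_curves.png")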

Relevance:

30.00%

Publisher:

Abstract:

The following thesis aims to investigate the issues involved in maintaining a Machine Learning model over time, both regarding the versioning of the model itself and of the data on which it is trained, and regarding tools for monitoring the data and its distribution. The themes of Data Drift and Concept Drift are then explored, and the performance of some of the most popular techniques in the field of Anomaly Detection, such as VAE, PCA, and Monte Carlo Dropout, is evaluated.
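
As a small example of the kind of data-drift check discussed above, the Python sketch below compares the training distribution of a feature with a window of live data using a two-sample Kolmogorov-Smirnov test, one common and simple technique; the data and the significance threshold are illustrative.

    # Sketch: flag data drift on one feature with a two-sample
    # Kolmogorov-Smirnov test. Data and threshold are illustrative.
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    train = rng.normal(loc=0.0, scale=1.0, size=5000)  # reference feature
    live = rng.normal(loc=0.3, scale=1.0, size=1000)   # shifted live window

    stat, p_value = ks_2samp(train, live)
    drifted = p_value < 0.01  # flag drift at the 1% significance level
    print(f"KS statistic = {stat:.3f}, p-value = {p_value:.2e}, drift = {drifted}")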