942 results for Machine-tools - numerical control
Abstract:
This thesis addresses the attitude control problem of nanosatellites, which has been a challenging issue for the scientific community over the years and still constitutes an active area of research. Interest is increasing, as more than 70% of future satellite launches are expected to be nanosatellites. New challenges therefore arise with the miniaturisation of the subsystems, and improvements must be achieved. In this framework, the aim of this thesis is to develop novel control approaches for three-axis stabilisation of nanosatellites equipped with magnetorquers and reaction wheels, to improve the performance of existing control strategies and to demonstrate the stability of the system. In particular, this thesis focuses on the development of non-linear control techniques to stabilise fully actuated nanosatellites, as well as underactuated ones, in which the number of control variables is smaller than the number of degrees of freedom of the system. The main contribution of the first proposed control strategy is a demonstration of global asymptotic stability derived from control laws that stabilise the system in a target frame, a fixed direction of the orbit frame. Simulation results show good performance, also in the presence of disturbances, and a theoretical selection of the magnetic control gain is given. The second control approach instead presents a novel stable control methodology for three-axis stabilisation in underactuated conditions. The control scheme consists of the dynamical implementation of an attitude manoeuvre plan by means of a switching control logic. A detailed numerical analysis of the control law gains and of their effect on the convergence time, the total integrated torque and the maximum torque is presented, demonstrating good performance and robustness, also in the presence of disturbances.
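The abstract does not spell out the control laws; as a purely illustrative sketch of the geometric constraint underlying magnetic actuation (not the thesis's actual laws, gains, or stability proofs), the snippet below projects a desired torque onto the plane orthogonal to the local geomagnetic field, which is all a magnetorquer can realise. All numerical values are hypothetical.

```python
import numpy as np

def magnetorquer_dipole(tau_des, B):
    """Map a desired control torque to a magnetic dipole command.

    Magnetorquers can only produce torque perpendicular to the local
    geomagnetic field B, so the realisable torque is the projection of
    tau_des onto the plane orthogonal to B.
    """
    B_norm2 = np.dot(B, B)
    m = np.cross(B, tau_des) / B_norm2      # dipole command [A m^2]
    tau_real = np.cross(m, B)               # torque actually produced [N m]
    return m, tau_real

# Example: a desired torque with a geomagnetic field mostly along +z
tau_des = np.array([1e-5, -2e-5, 5e-6])     # [N m] (illustrative)
B = np.array([5e-6, 1e-6, 30e-6])           # [T], magnitude ~30 uT as in LEO
m, tau_real = magnetorquer_dipole(tau_des, B)
print("dipole:", m, "realised torque:", tau_real)
```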
Abstract:
The main objective of this PhD thesis is to optimize a specific multifunctional maritime structure for harbour protection and energy production, named Overtopping Breakwater for Energy Conversion (OBREC), developed by the team of the University of Campania. This device consists of a sloping plate followed by a single reservoir, which is linked to the machine room (where the energy conversion occurs) by a pipe passing through the crown wall, topped by a parapet. The potential energy of the overtopping waves, collected inside the reservoir located above the still water level, is then converted by means of low-head turbines. In order to improve the understanding of the wave-structure interaction at OBREC, several methodologies have been used and combined: i. analysis of recent experimental campaigns on wave overtopping discharges and pressures at the crown wall on small-scale OBREC cross sections, carried out in other laboratories by the team of the University of Campania; ii. new experiments on cross sections similar to the OBREC device, planned and carried out in the hydraulic laboratory of the University of Bologna within this PhD work; iii. numerical modelling with a one-phase incompressible fluid model, IH-2VOF, developed by the University of Cantabria, and with a two-phase incompressible fluid model, OpenFOAM, both available from the literature; iv. numerical modelling with a new two-phase compressible fluid model developed in the OpenFOAM environment within this PhD work; v. analysis of the data gained from the monitoring of the OBREC prototype installation.
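The abstract refers to overtopping discharges measured and simulated for OBREC; as a minimal sketch of the kind of dimensionless relation commonly used to characterise mean overtopping at coastal structures, the snippet below uses generic textbook-style coefficients, not the OBREC-specific ones derived in the thesis.

```python
import math

def mean_overtopping_discharge(Hm0, Rc, a=0.2, b=2.6, g=9.81):
    """Generic EurOtop-style exponential overtopping relation (sketch).

    q / sqrt(g * Hm0^3) = a * exp(-b * Rc / Hm0)

    Hm0 : spectral significant wave height at the toe [m]
    Rc  : crest freeboard [m]
    a,b : empirical coefficients (indicative values only; structure-specific
          coefficients come from laboratory campaigns)
    Returns the mean discharge q [m^3/s per metre of structure].
    """
    return a * math.exp(-b * Rc / Hm0) * math.sqrt(g * Hm0**3)

print(mean_overtopping_discharge(Hm0=2.0, Rc=1.5))
```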
Abstract:
Choosing natural enemies to suppress pest populations has long been the core of biological control. Over time, the term biological control has also been applied to the use of suppressive soils, bio-disinfection and biopesticides. Biological control agents (BCA) and natural compounds, extracted or fermented from various sources, are the resources for containing phytopathogens. BCA can act through direct antagonism mechanisms or by inducing hypovirulence in the pathogen. The first part of the thesis focused on mycoviruses infecting phytopathogenic fungi belonging to the genus Fusarium. New approaches capable of dissecting the virome of filamentous fungal samples more rapidly were developed. The semiconductor-based sequencer Ion Torrent™ and the nanopore-based sequencer MinION were exploited to analyze DNA and RNA referable to viral genomes. Comparison with GenBank accessions and sequence analysis allowed the identification of more than 40 putative viral species; some of these mycovirus genera have been studied as inducers of hypovirulence in several phytopathogenic fungi, so future work will focus on comparing the morphology and physiology of fungal strains infected with and cured of the identified viruses, and on their possible use as biocontrol agents. In the second part of the thesis, the potential of botanical pesticides was evaluated for the biocontrol of phloem-limited phytopathogens such as phytoplasmas. The only active compounds able to control phytoplasmas are oxytetracycline antibiotics, and direct, fast in vitro screening of new antimicrobial compounds on media is almost impossible because phytoplasmas are difficult to culture. For this reason, a simple and reliable screening method was developed to evaluate the effects of antimicrobials directly on phytoplasmas through an “ex-vivo” approach. Using scanning electron microscopy (SEM) in parallel with molecular tools (ddRT-PCR), the direct activity of tetracyclines on phytoplasma cells was verified, and a promising compound showing similar activity was also identified.
Abstract:
Since their emergence, locally resonant metamaterials have found several applications for the control of surface waves, from micrometer-sized electronic devices to meter-sized seismic barriers. The interaction between Rayleigh-type surface waves and resonant metamaterials has been investigated through the realization of locally resonant metasurfaces, thin elastic interfaces consisting of a cluster of resonant inclusions or oscillators embedded near the surface of an elastic waveguide. When such resonant metasurfaces are embedded in an elastic homogeneous half-space, they can filter out the propagation of Rayleigh waves, creating low-frequency bandgaps at selected frequencies. In the civil engineering context, heavy resonating masses are needed to extend the bandgap frequency width of locally resonant devices, a requirement that limits their practical implementation. In this dissertation, the wave attenuation capabilities of locally resonant metasurfaces are enriched by (i) proposing tunable metasurfaces that open large frequency bandgaps with small effective inertia, and (ii) developing an analytical framework aimed at studying the propagation of Rayleigh waves in deep resonant waveguides. In more detail, inertially amplified resonators are exploited to design advanced metasurfaces with a prescribed static and a tunable dynamic response. The modular design of the tunable metasurfaces allows low-frequency spectral bandgaps to be shifted and enlarged without modifying the total inertia of the metasurface. Besides, an original dispersion law is derived to study the dispersive properties of Rayleigh waves propagating in thick resonant layers made of sub-wavelength resonators. Accordingly, a deep resonant wave barrier of mechanical resonators embedded inside the soil is designed to impede the propagation of seismic surface waves. Numerical models are developed to confirm the analytical dispersion predictions for the tunable metasurface and the resonant layer. Finally, a medium-size resonant wave barrier is designed according to the soil stratigraphy of a real geophysical scenario to attenuate ground-borne vibrations.
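As a back-of-the-envelope illustration of the inertial-amplification idea mentioned above (a small physical mass providing a large effective inertia, so the resonance frequency at which the bandgap opens can be tuned without adding weight), the sketch below assumes a generic kinematic amplification factor and illustrative stiffness and mass values; it is not the dissertation's metasurface model.

```python
import math

def resonance_frequency(k, m_eff):
    """Natural frequency [Hz] of a single surface resonator, f0 = sqrt(k/m)/(2*pi)."""
    return math.sqrt(k / m_eff) / (2.0 * math.pi)

def amplified_inertia(m_added, amplification):
    """Effective inertia of an inertial-amplification mechanism (sketch).

    'amplification' is a geometry-dependent kinematic factor; the physical
    added mass is multiplied by its square, so a small mass can provide a
    large effective inertia along the direction of motion.
    """
    return m_added * amplification**2

k = 2.0e7        # resonator stiffness [N/m] (illustrative)
m_phys = 50.0    # physical added mass [kg] (illustrative)
for amp in (1.0, 3.0, 5.0):
    m_eff = amplified_inertia(m_phys, amp)
    print(f"amplification {amp:.0f}: effective mass {m_eff:7.0f} kg, "
          f"f0 = {resonance_frequency(k, m_eff):5.1f} Hz")
```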
Abstract:
In the last decades, global food supply chains have had to deal with the increasing awareness of stakeholders and consumers about safety, quality, and sustainability. In order to address these new challenges for food supply chain systems, an integrated approach to design, control, and optimize the product life cycle is required. It is therefore essential to introduce new models, methods, and decision-support platforms tailored to perishable products. This thesis aims to provide novel practice-ready decision-support models and methods to optimize the logistics of food items with an integrated and interdisciplinary approach. It proposes a comprehensive review of the main peculiarities of perishable products and the environmental stresses accelerating their quality decay. Then, it focuses on top-down strategies to optimize the supply chain system from the strategic to the operational decision level. Based on the criticality of the environmental conditions, the dissertation evaluates the main long-term logistics investment strategies to preserve product quality. Several models and methods are proposed to optimize logistics decisions so as to enhance the sustainability of the supply chain system while guaranteeing adequate food preservation. The models and methods proposed in this dissertation promote a climate-driven approach, integrating climate conditions and their consequences on the quality decay of products into innovative models supporting logistics decisions. Given the uncertain nature of the environmental stresses affecting the product life cycle, an original stochastic model and solution method are proposed to support practitioners in controlling and optimizing supply chain systems when facing uncertain scenarios. The application of the proposed decision-support methods to real case studies proved their effectiveness in increasing the sustainability of the perishable product life cycle. The dissertation also presents an industry application in a global food supply chain system, further demonstrating how the proposed models and tools can be integrated to provide significant savings and sustainability improvements.
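As a minimal sketch of the climate-driven idea described above (environmental stresses accelerating quality decay), the snippet below uses a first-order, Arrhenius-type decay model with purely illustrative parameters; it is not the stochastic model developed in the thesis.

```python
import math

R = 8.314  # universal gas constant [J/(mol K)]

def decay_rate(T_kelvin, k_ref, Ea, T_ref=273.15 + 4.0):
    """Arrhenius-type temperature dependence of the quality decay rate."""
    return k_ref * math.exp(-(Ea / R) * (1.0 / T_kelvin - 1.0 / T_ref))

def remaining_quality(q0, temperature_profile, k_ref=0.01, Ea=60e3):
    """First-order quality decay q(t) = q0 * exp(-sum_i k(T_i) * dt_i).

    temperature_profile : list of (temperature [degC], duration [h]) segments.
    All parameters are illustrative placeholders.
    """
    exponent = sum(decay_rate(T + 273.15, k_ref, Ea) * dt
                   for T, dt in temperature_profile)
    return q0 * math.exp(-exponent)

# Example: 24 h at 4 degC, then 6 h at 20 degC during a transport disruption
profile = [(4.0, 24.0), (20.0, 6.0)]
print(remaining_quality(q0=1.0, temperature_profile=profile))
```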
Abstract:
Nowadays the development of new Internal Combustion Engines is mainly driven by the need to reduce tailpipe emissions of pollutants and Green-House Gases and to avoid wasting fossil fuels. The design of the combustion chamber dimensions and shape, together with the implementation of different injection strategies (e.g., injection timing, spray targeting, higher injection pressures), plays a key role in the accomplishment of the aforementioned targets. As far as the match between fuel injection and evaporation and the combustion chamber shape is concerned, the assessment of the interaction between the liquid fuel spray and the engine walls in gasoline direct injection engines is crucial. The use of numerical simulations is an acknowledged technique to support the study of new technological solutions, such as the design of new gasoline blends and of tailored injection strategies to pursue the target mixture formation. The current simulation framework, however, lacks a well-defined best practice for simulating the liquid fuel spray interaction, which is a complex multi-physics problem. This thesis deals with the development of robust methodologies to approach the numerical simulation of the interaction of the liquid fuel spray with walls and lubricants. The work was divided into three tasks: i) setup and validation of three-dimensional CFD spray-wall impingement simulations; ii) development of a one-dimensional model describing the liquid fuel-lubricant oil interaction; iii) development of a machine learning-based algorithm aimed at defining which mixture of known pure components mimics the physical behaviour of the real gasoline for the simulation of the liquid fuel spray interaction.
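For task iii), a hedged sketch of one possible formulation of the surrogate-blending problem: a non-negative least-squares fit of component fractions to a set of target fuel properties. The component list, property set, and all numerical values are illustrative assumptions, not the thesis's algorithm or data.

```python
import numpy as np
from scipy.optimize import nnls

# Target properties of the real gasoline (illustrative numbers only):
# density [kg/m3], RON [-], lower heating value [MJ/kg]
target = np.array([745.0, 95.0, 43.0])

# Candidate pure components (rows: properties, columns: components),
# e.g. iso-octane, n-heptane, toluene, ethanol -- values are indicative
components = np.array([
    [692.0, 684.0, 867.0, 789.0],   # density
    [100.0,   0.0, 120.0, 109.0],   # RON
    [ 44.3,  44.6,  40.6,  26.8],   # LHV
])

# Scale each property by its target, enforce fractions summing to one with a
# heavily weighted extra row, and non-negativity through NNLS.
w = 1e3
A = np.vstack([components / target[:, None],
               w * np.ones((1, components.shape[1]))])
b = np.concatenate([np.ones(3), [w]])
fractions, residual = nnls(A, b)
print("surrogate fractions:", np.round(fractions, 3), " residual:", residual)
```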
Abstract:
Integrated Coastal Zone Management (ICZM) should be considered one of the main components needed to implement sustainable development. The Friuli Venezia Giulia region, with its 93 km of coastline, is committed to investing its resources in projects aimed at studying the evolution of the coast. In this report, reference is made to the area in front of the municipality of Grado, where the Banco della Mula di Muggia is located. Starting from previous studies and surveys, the morphology of the coastal stretch between the municipality of Grado and the mouth of the Isonzo river will be reproduced through numerical modeling tools, to simulate its hydrodynamic behavior on an annual basis and also as a function of significant events such as storms, calm periods or floods of the Isonzo river. The software employed will be MIKE by DHI, in particular the "Littoral Drift" and "MIKE 21/3 Coupled" models. The former calculates net and gross longshore transport on an annual basis along a transverse profile; the latter is a modelling system for coastal applications that will be used to analyse the effects of significant events. Although not the primary focus of this work, an initial review of finger bars is also included. These particular sand formations are present at the south-western border of the Banco della Mula di Muggia and may have an impact on it. This work could form the starting point for future investigations building on the findings of this report.
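For context on the longshore-transport computation mentioned above, the sketch below implements the classic CERC bulk formula with indicative coefficients; the work itself relies on the MIKE "Littoral Drift" module rather than this simplified relation.

```python
import math

def cerc_longshore_transport(Hb, alpha_b_deg, K=0.39, rho=1025.0, rho_s=2650.0,
                             porosity=0.4, gamma_b=0.78, g=9.81):
    """Longshore sediment transport via the CERC formula (illustrative sketch).

    Hb          : breaking significant wave height [m]
    alpha_b_deg : wave angle at breaking relative to the shoreline [deg]
    K           : empirical coefficient (~0.39 when using significant height)
    Returns the volumetric transport rate [m^3/s].
    """
    alpha = math.radians(alpha_b_deg)
    hb = Hb / gamma_b                         # breaking depth from breaker index
    cg_b = math.sqrt(g * hb)                  # shallow-water group celerity
    P_ls = (rho * g / 16.0) * Hb**2 * cg_b * math.sin(2.0 * alpha)  # longshore wave power
    I_l = K * P_ls                            # immersed-weight transport rate
    return I_l / ((rho_s - rho) * g * (1.0 - porosity))

q = cerc_longshore_transport(Hb=1.2, alpha_b_deg=10.0)
print(f"{q:.4f} m^3/s  (~{q * 3600 * 24 * 365:,.0f} m^3/yr)")
```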
Abstract:
In this thesis, a tube-based Distributed Economic Predictive Control (DEPC) scheme is presented for a group of dynamically coupled linear subsystems. These subsystems are components of a large-scale system, and their control inputs are computed by optimizing a local economic objective. Each subsystem interacts with its neighbors by sending its future reference trajectory at each sampling time, and solves a local optimization problem in parallel based on the future reference trajectories received from the other subsystems. To ensure recursive feasibility and a performance bound, each subsystem is constrained not to deviate too much from its communicated reference trajectory. This difference between the planned trajectory and the communicated one is interpreted as a disturbance at the local level. Then, to ensure the satisfaction of both state and input constraints, the constraints are tightened by explicitly considering the effect of these local disturbances. The proposed approach averages over all possible disturbances and handles tightened state and input constraints, while satisfying the compatibility constraints that guarantee that the actual trajectory lies within a certain bound in the neighborhood of the reference one. Each subsystem optimizes an arbitrary local economic objective function in parallel while considering a local terminal constraint to guarantee recursive feasibility. In this framework, economic performance guarantees for a tube-based distributed predictive control (DPC) scheme are developed rigorously. It is shown that the nominal closed-loop subsystem has a local robust average performance bound which is no worse than that of a local robust steady state. Since the robust algorithm is applied to the states of the real (disturbed) subsystems, this bound can be interpreted as an average performance result for the real closed-loop system. To this end, we present our results on local and global performance, illustrated by a numerical example.
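A minimal sketch of a local tube-based optimization with tightened constraints, in the spirit of the scheme described above: one subsystem plans over a finite horizon with an economic stage cost, while state and input sets are shrunk by fixed margins standing in for the effect of neighbours' deviations. System matrices, tightening margins, and the terminal constraint are illustrative assumptions, not the thesis's formulation.

```python
import numpy as np
import cvxpy as cp

# Illustrative local subsystem x+ = A x + B u + w, with w bounded by the
# deviation of neighbours from their communicated reference trajectories.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
N = 20                                   # prediction horizon
x_max, u_max = 5.0, 1.0                  # nominal constraint levels
x_tight, u_tight = 0.5, 0.2              # tightening margins (assumed disturbance set)

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
x0 = np.array([0.5, 0.0])
price = 0.3                              # weight of an economic (not tracking) stage cost

cost = 0
constraints = [x[:, 0] == x0]
for k in range(N):
    # economic stage cost: actuation cost plus a penalty on the first state
    cost += price * cp.abs(u[0, k]) + cp.square(x[0, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.abs(x[:, k + 1]) <= x_max - x_tight,   # tightened state set
                    cp.abs(u[:, k]) <= u_max - u_tight]       # tightened input set
constraints += [x[:, N] == 0]            # simple terminal constraint for feasibility

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("first input of the tube centre:", u.value[:, 0])
```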
Abstract:
In the last decade, manufacturing companies have been facing two significant challenges. First, digitalization requires the adoption of Industry 4.0 technologies and enables the creation of smart, connected, self-aware, and self-predictive factories. Second, the attention to sustainability requires evaluating and reducing the impact of the implemented solutions from the economic and social points of view. In manufacturing companies, the maintenance of physical assets assumes a critical role. Increasing the reliability and availability of production systems minimizes system downtimes; in addition, proper system functioning avoids production waste and potentially catastrophic accidents. Digitalization and new ICT technologies have assumed a relevant role in maintenance strategies. They allow the health condition of machinery to be assessed at any point in time. Moreover, they allow the future behavior of machinery to be predicted, so that maintenance interventions can be planned and the useful life of components can be exploited until just before their failure. This dissertation provides insights on Predictive Maintenance goals and tools in Industry 4.0 and proposes a novel data acquisition, processing, sharing, and storage framework that addresses typical issues machine producers and users encounter. The research elaborates on two research questions that narrow down the potential approaches to data acquisition, processing, and analysis for fault diagnostics in evolving environments. The research activity is developed according to a research framework, where the research questions are addressed by research levers that are explored according to research topics. Each topic requires a specific set of methods and approaches; however, the overarching methodological approach presented in this dissertation includes three fundamental aspects: the maximization of the quality level of input data, the use of Machine Learning methods for data analysis, and the use of case studies deriving from both controlled environments (laboratory) and real-world instances.
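As a toy illustration of the Machine Learning aspect mentioned above (fault diagnostics from condition-monitoring data), the sketch below trains a random-forest classifier on synthetic vibration-style features; the data and features are invented stand-ins, not the dissertation's case studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic stand-in for features extracted from vibration signals
# (e.g. RMS, kurtosis, band energy); label 0 = healthy, 1 = faulty.
n = 600
healthy = rng.normal(loc=[1.0, 3.0, 0.2], scale=0.15, size=(n // 2, 3))
faulty = rng.normal(loc=[1.6, 4.5, 0.6], scale=0.25, size=(n // 2, 3))
X = np.vstack([healthy, faulty])
y = np.array([0] * (n // 2) + [1] * (n // 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```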
Abstract:
Besides increasing the share of electric and hybrid vehicles, in order to comply with more stringent environmental protection regulations the auto industry must, in the mid term, improve the efficiency of the internal combustion engine and the well-to-wheel efficiency of the fuels employed. To achieve this target, a deeper knowledge of the phenomena that influence mixture formation and the chemical reactions involving new synthetic fuel components is mandatory, but acquiring it purely by experimentation is complex and time-intensive. Numerical simulations therefore play an important role in this development process, but their use can be effective only if they are accurate enough to capture these variations. The most relevant models necessary for simulating reacting mixture formation and the subsequent chemical reactions have been investigated in the present work with a critical approach, in order to provide instruments to define the most suitable approaches also in the industrial context, which is constrained by time and budget. To overcome these limitations, new methodologies have been developed to combine detailed and simplified modelling techniques for the phenomena involving chemical reactions and mixture formation in non-traditional conditions (e.g. water injection, biofuels, etc.). Through the extensive use of machine learning and deep learning algorithms, several applications have been revised or implemented with the goal of reducing the computing time of some traditional tasks by orders of magnitude. Finally, a complete workflow leveraging these new models has been defined and used to evaluate the effects of different surrogate formulations of the same experimental fuel on a proof-of-concept GDI engine model.
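As a small, hedged example of the surrogate-modelling idea (replacing an expensive computation with a learned regressor to cut computing time), the sketch below fits a neural-network regressor to a mock nonlinear function standing in for a detailed-chemistry evaluation; the function and hyperparameters are assumptions, not the models developed in the thesis.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

def expensive_model(phi, T, p):
    """Stand-in for a costly computation (e.g. a detailed-chemistry look-up):
    a smooth, nonlinear function of equivalence ratio, temperature, pressure."""
    return 0.4 * np.exp(-4.0 * (phi - 1.05) ** 2) * (T / 300.0) ** 1.8 * (p / 1.0) ** -0.25

# Sample the input space once (the slow part), then train a cheap surrogate.
X = np.column_stack([rng.uniform(0.6, 1.4, 5000),    # equivalence ratio
                     rng.uniform(300, 900, 5000),    # unburned temperature [K]
                     rng.uniform(1, 30, 5000)])      # pressure [bar]
y = expensive_model(X[:, 0], X[:, 1], X[:, 2])

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(64, 64),
                                       max_iter=2000, random_state=0))
surrogate.fit(X, y)
print("surrogate vs model:",
      surrogate.predict([[1.0, 600.0, 10.0]])[0],
      expensive_model(1.0, 600.0, 10.0))
```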
Abstract:
Air pollution is one of the greatest health risks in the world. At the same time, its strong correlation with climate change, as well as with Urban Heat Islands and Heat Waves, intensifies the effects of all these phenomena. Good air quality and high levels of thermal comfort are the main goals to be reached in urban areas in the coming years. Air quality forecasts help decision makers improve air quality and public health strategies, mitigating the occurrence of acute air pollution episodes. Air quality forecasting approaches combine an ensemble of models to provide forecasts of air pollution from the global to the regional scale, with downscaling for selected countries and regions. The development of models dedicated to urban air quality issues requires a good set of data on urban morphology and building material characteristics. Only a few examples of urban-scale air quality forecast systems exist in the literature, and they are often limited to selected cities. This thesis sets up a methodology for the development of a forecasting tool. The forecasting tool can be adapted to any city and uses a new parametrization for vegetated areas. The parametrization method, based on aerodynamic parameters, produces the spatially varying urban roughness. At the core of the forecasting tool are an urban-scale dispersion model used in forecasting mode and the meteorological and background concentration forecasts provided by two regional numerical weather forecasting models. The tool produces 1-day spatial forecasts of NO2, PM10 and O3 concentrations, air temperature, air humidity and BLQ-Air index values. The tool is automated to run every day, and the maps produced are displayed on the e-Globus platform, updated daily. The results obtained indicate that the forecast output was in good agreement with the observed measurements.
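The abstract does not detail the urban-scale dispersion model; as a purely illustrative sketch of how a local dispersion contribution can be added to a regional background forecast, the snippet below uses a classic Gaussian plume with ground reflection. The source, meteorology, and dispersion parameters are hypothetical.

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Classic Gaussian plume concentration (mass units of Q per m^3).

    Q : emission rate, u : wind speed, (y, z) : crosswind and vertical receptor
    position, H : effective source height, sigma_y/sigma_z : dispersion
    parameters evaluated at the receptor's downwind distance.
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                np.exp(-(z + H)**2 / (2 * sigma_z**2)))      # ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

background_no2 = 18.0                       # regional background forecast [ug/m3]
local = gaussian_plume(Q=2.0e3, u=3.0, y=20.0, z=1.5,
                       H=10.0, sigma_y=35.0, sigma_z=18.0)   # [ug/s] -> [ug/m3]
print("total NO2 forecast at receptor:", background_no2 + local)
```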
Abstract:
Whole Exome Sequencing (WES) is rapidly becoming the first-tier test in clinics, thanks both to its declining costs and to the development of new platforms that help clinicians in the analysis and interpretation of SNVs and InDels. However, we still know very little about how CNV detection could increase the WES diagnostic yield. A plethora of exome CNV callers have been published over the years, all showing good performance on specific CNV classes and sizes, suggesting that a combination of multiple tools is needed to obtain good overall detection performance. Here we present TrainX, an ML-based method for calling heterozygous CNVs in WES data using EXCAVATOR2 Normalized Read Counts. We select male and female non-pseudoautosomal chromosome X alignments to construct our dataset and train our model, make predictions on autosomal target regions, and use an HMM to call CNVs. We compared TrainX against a set of CNV tools differing in detection method (GATK4 gCNV, ExomeDepth, DECoN, CNVkit and EXCAVATOR2) and found that our algorithm outperformed them in terms of stability, as we identified both deletions and duplications with good scores (F1-scores of 0.87 and 0.82, respectively) and for sizes down to the minimum resolution of 2 target regions. We also evaluated the robustness of the method using a set of WES and SNP array data (n=251), part of the Italian cohort of the Epi25 collaborative, and were able to retrieve all clinical CNVs previously identified by the SNP array. TrainX showed good accuracy in detecting heterozygous CNVs of different sizes, making it a promising tool for use in a diagnostic setting.
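A minimal sketch of the final calling step described above: an HMM over normalized read counts of consecutive target regions, decoded with Viterbi into deletion / neutral / duplication states. The state means, transition probabilities, and example coverage profile are illustrative assumptions, not TrainX's trained parameters.

```python
import numpy as np
from scipy.stats import norm

# Minimal 3-state HMM (deletion / neutral / duplication) over normalized
# read counts of consecutive target regions; parameters are illustrative.
states = ["DEL", "NEUT", "DUP"]
means = np.array([0.5, 1.0, 1.5])      # expected normalized coverage per state
sigma = 0.15
log_start = np.log([0.05, 0.90, 0.05])
log_trans = np.log([[0.900, 0.099, 0.001],
                    [0.010, 0.980, 0.010],
                    [0.001, 0.099, 0.900]])

def viterbi(obs):
    """Most likely state path for a sequence of normalized read counts."""
    T, S = len(obs), len(states)
    log_emit = norm.logpdf(obs[:, None], loc=means, scale=sigma)   # (T, S)
    dp = np.full((T, S), -np.inf)
    ptr = np.zeros((T, S), dtype=int)
    dp[0] = log_start + log_emit[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans       # scores[i, j]: from i to j
        ptr[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_emit[t]
    path = [int(dp[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(ptr[t, path[-1]])
    return [states[s] for s in reversed(path)]

rc = np.array([1.00, 0.98, 1.05, 0.52, 0.48, 0.55, 1.02, 1.49, 1.55, 1.00])
print(viterbi(rc))
```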
Abstract:
The microstructure of 6XXX aluminum alloys deeply affects the mechanical, crash, corrosion and aesthetic properties of extruded profiles. Unfortunately, grain structure evolution during manufacturing is a complex phenomenon, because several process and material parameters, such as alloy chemical composition, temperature, extrusion speed, tool geometries, quenching and thermal treatment parameters, affect grain evolution during the manufacturing process. The aim of the present PhD thesis was the analysis of the recrystallization kinetics during the hot extrusion of 6XXX aluminum alloys and the development of reliable recrystallization models to be used in FEM codes for microstructure prediction at the die design stage. Experimental activities were carried out in order to acquire data for the development and validation of the recrystallization models, and also to investigate the effect of process parameters and die design on the microstructure of the final component. The experimental campaign reported in this thesis involved the extrusion of AA6063, AA6060 and AA6082 profiles with different process parameters, in order to provide a reliable amount of data for model validation. A particular focus was placed on the evolution of the PCG defect during the extrusion of medium-strength alloys such as AA6082. Several die designs and process conditions were analysed in order to understand the influence of each of them on the recrystallization behaviour of the investigated alloy. From the numerical point of view, innovative models for microstructure prediction were developed and validated on the extrusion of industrial-scale profiles with complex geometries, showing good agreement in terms of grain size and surface recrystallization prediction. The results achieved suggest that the developed models are reliable and can be applied in the industrial field for process and material property optimization at the die design stage.
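As a minimal illustration of the kind of recrystallization kinetics involved, the sketch below evaluates a JMAK/Avrami-type recrystallized fraction; the exponent and the 50%-recrystallization time are placeholders, whereas in practice such quantities depend on the local strain, temperature and die design predicted by the FEM code.

```python
import math

def jmak_recrystallized_fraction(t, t50, n=2.0):
    """JMAK/Avrami-type recrystallized fraction (illustrative form).

    X(t) = 1 - exp(-ln(2) * (t / t50)^n)
    t50 : time for 50% recrystallization (process- and material-dependent)
    n   : Avrami exponent
    """
    return 1.0 - math.exp(-math.log(2.0) * (t / t50) ** n)

for t in (1.0, 5.0, 10.0, 30.0):
    print(f"t = {t:5.1f} s  ->  X = {jmak_recrystallized_fraction(t, t50=8.0):.2f}")
```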
Abstract:
The fourth industrial revolution is paving the way for Industrial Internet of Things applications in which industrial assets (e.g., robotic arms, valves, pistons) are equipped with a large number of wireless devices (i.e., microcontroller boards that embed sensors and actuators) to enable a plethora of new applications, such as analytics, diagnostics and monitoring, as well as supervisory and safety control use cases. Nevertheless, current wireless technologies, such as Wi-Fi, Bluetooth, and even private 5G networks, cannot fulfill all the requirements set by the Industry 4.0 paradigm, thus opening up new 6G-oriented research trends, such as the use of THz frequencies. In light of the above, this thesis (i) provides a broad overview of the main use cases, requirements, and key enabling wireless technologies foreseen by the fourth industrial revolution, and (ii) proposes innovative contributions, both theoretical and empirical, to enhance the performance of current and future wireless technologies at different levels of the protocol stack. In particular, at the physical layer, signal processing techniques are exploited to analyze two multiplexing schemes, namely Affine Frequency Division Multiplexing and Orthogonal Chirp Division Multiplexing, which seem promising for high-frequency wireless communications. At the medium access layer, three protocols for intra-machine communications are proposed, one based on LoRa at 2.4 GHz and the others working in the THz band. Different scheduling algorithms for private industrial 5G networks are compared, and two main proposals are described: a decentralized scheme that leverages machine learning techniques to better address aperiodic traffic patterns, and a centralized contention-based design that serves a federated learning industrial application. Results are provided in terms of numerical evaluations, simulation results, and real-world experiments. Several improvements over the state of the art were obtained, and the description of up-and-running testbeds demonstrates the feasibility of some of the theoretical concepts in a real industrial plant.
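As a generic, back-of-the-envelope illustration of why medium-access design matters when many devices contend for the channel (not a model taken from the thesis), the sketch below evaluates the classic slotted-ALOHA single-slot success probability, which saturates around 1/e even at the optimal transmit probability.

```python
def slotted_aloha_success(n_devices, p_tx):
    """Probability that exactly one of n devices transmits in a slot
    (classic slotted-ALOHA success probability for contention-based access)."""
    return n_devices * p_tx * (1.0 - p_tx) ** (n_devices - 1)

for n in (10, 50, 200):
    p_opt = 1.0 / n                        # transmit probability maximizing throughput
    print(f"{n:4d} devices: best-case per-slot success = "
          f"{slotted_aloha_success(n, p_opt):.3f}")
```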
Abstract:
In this doctoral dissertation, a comprehensive methodological approach for the assessment of river embankment safety conditions, based on the integrated use of laboratory testing, physical modelling and finite element (FE) numerical simulations, is proposed, with the aim of contributing to a better understanding of the effect of time-dependent hydraulic boundary conditions on the hydro-mechanical response of river embankments. The case study and materials selected for the present research project are representative of the riverbank systems of Alpine and Apennine tributaries of the river Po (Northern Italy), which have recently experienced several sudden overall collapses. The outcomes of a centrifuge test, carried out under an enhanced gravity field of 50 g on a riverbank model made of a compacted silty sand mixture, overlying a homogeneous clayey silt foundation layer and subjected to a simulated flood event, have been used to define a robust and realistic experimental benchmark. In order to reproduce the observed experimental behaviour, a first set of numerical simulations has been carried out by assuming, for both the embankment and the foundation unit, rigid porous soil media under partially saturated conditions. The mechanical and hydraulic soil properties adopted in the numerical analyses have been carefully estimated based on standard saturated triaxial, oedometer and constant-head permeability tests. Afterwards, advanced suction-controlled laboratory tests have been carried out to investigate the effect of suction and confining stresses on the shear strength and compressibility characteristics of the filling material, and a second set of numerical simulations has been run, taking into account the soil parameters updated on the basis of the most recent tests. The final aim of the study is the quantitative estimation of the predictive capabilities of the calibrated numerical tools, obtained by systematically comparing the results of the FE simulations with the experimental benchmark.
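As a small illustration of the partially saturated soil behaviour mentioned above, the sketch below evaluates a van Genuchten water-retention curve linking matric suction to volumetric water content; all parameters are illustrative and do not correspond to the tested silty sand or clayey silt.

```python
import math

def van_genuchten_water_content(suction_kpa, alpha=0.05, n=1.8,
                                theta_r=0.05, theta_s=0.40):
    """Van Genuchten soil-water retention curve (parameters purely illustrative).

    Se = [1 + (alpha * s)^n]^(-m),  m = 1 - 1/n
    alpha is in 1/kPa here; returns the volumetric water content for a
    given matric suction s [kPa].
    """
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * suction_kpa) ** n) ** (-m)
    return theta_r + Se * (theta_s - theta_r)

for s in (1.0, 10.0, 50.0, 200.0):
    print(f"suction {s:6.1f} kPa -> theta = {van_genuchten_water_content(s):.3f}")
```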