15 results for Application of Data-driven Modelling in Water Sciences

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

Nanotechnologies are rapidly expanding because of the opportunities that the new materials offer in many areas, such as the manufacturing industry, food production, processing and preservation, and the pharmaceutical and cosmetic industries. The size distribution of nanoparticles determines their properties and is a fundamental parameter that needs to be monitored from small-scale synthesis up to bulk production and quality control of nanotech products on the market. A consequence of the increasing number of applications of nanomaterials is that EU regulatory authorities are introducing an obligation for companies that make use of nanomaterials to acquire analytical platforms for the assessment of the size parameters of those nanomaterials. In this work, Asymmetrical Flow Field-Flow Fractionation (AF4) and Hollow-Fiber Flow Field-Flow Fractionation (HF5), hyphenated with Multiangle Light Scattering (MALS), are presented as tools for a deep functional characterization of nanoparticles. In particular, the applicability of AF4-MALS to the characterization of liposomes in a wide range of media is demonstrated. The technique is then used to explore the functional features of a liposomal drug vector in terms of its biological and physical interaction with blood serum components: a comprehensive approach to understanding the behavior of lipid vesicles in terms of drug release and fusion/interaction with other biological species is described, together with the strengths and weaknesses of the method. Next, the size characterization, size stability, and conjugation of azidothymidine drug molecules with a new generation of metastable drug vectors, the Metal Organic Frameworks, are discussed.
Lastly, the applicability of HF5-ICP-MS to the rapid screening of samples posing a potential nanorisk is shown: rather than a deep and comprehensive characterization, this time a quick and practical methodology is presented that, within a few steps, provides qualitative information on the content of metallic nanoparticles in tattoo ink samples.

Relevance:

100.00%

Publisher:

Abstract:

Minor components are of particular interest due to their antioxidant and biological properties. Various classes of lipophilic minor components (plant sterols (PS) and α-tocopherol) were selected, as they are widely used in the food industry. A fast GC-MS method for PS analysis in functional dairy products was set up. The analytical performance, together with the significant reduction in analysis time and consumables, demonstrated that fast GC-MS is suitable for PS analysis in functional dairy products. Due to their chemical structure, PS can undergo oxidation, which can be greatly affected by the nature/composition of the matrix and by thermal treatments. The oxidative stability of PS during microwave heating was evaluated. Two different model systems (PS alone and in combination with triacylglycerols, TAG) were heated for up to 30 min at 1000 W. PS degraded faster when alone than in the presence of TAG. The extent of PS degradation depends on both the heating time and the surrounding medium, which can affect the quality and safety of food products destined for microwave heating/cooking. Many minor lipid components are included in emulsion systems and can affect the rate of lipid oxidation. The oxidative stability of oil-in-water (O/W) emulsions containing PS esters, ω-3 fatty acids, and phenolic compounds was evaluated after 14 days of storage at room temperature. Due to their surface-active character, PS can be particularly prone to oxidation when incorporated in emulsions, as they are more exposed to water-soluble prooxidants. Finally, some minor lipophilic components may increase the oxidative stability of food systems due to their antioxidant activity. α-tocopherol partitioning and antioxidant activity were determined in the presence of excess SDS in stripped soybean O/W emulsions. The results showed that surfactant micelles could play a key role as antioxidant carriers, by potentially increasing the accessibility of hydrophobic antioxidants to the interface.

Relevance:

100.00%

Publisher:

Abstract:

The Standard Model (SM) of particle physics predicts the existence of a Higgs field responsible for the generation of particle masses. However, some aspects of this theory remain unresolved, suggesting the presence of new physics Beyond the Standard Model (BSM), with the production of new particles at an energy scale higher than the current experimental limits. The search for additional Higgs bosons is, in fact, predicted by theoretical extensions of the SM, including the Minimal Supersymmetric Standard Model (MSSM). In the MSSM, the Higgs sector consists of two Higgs doublets, resulting in five physical Higgs particles: two charged bosons $H^{\pm}$, two neutral scalars $h$ and $H$, and one pseudoscalar $A$. The work presented in this thesis is dedicated to the search for neutral non-Standard-Model Higgs bosons decaying to two muons in the model-independent MSSM scenario. Proton-proton collision data recorded by the CMS experiment at the CERN LHC at a center-of-mass energy of 13 TeV are used, corresponding to an integrated luminosity of $35.9\ \text{fb}^{-1}$. The search is sensitive to neutral Higgs bosons produced either via the gluon fusion process or in association with a $\text{b}\bar{\text{b}}$ quark pair. The extensive use of machine and deep learning techniques is a fundamental element in the discrimination between simulated signal and background events. A new network structure, called a parameterised Neural Network (pNN), has been implemented, replacing a whole set of single neural networks, each trained at a specific mass hypothesis, with a single neural network able to generalise well and interpolate across the entire mass range considered. The results of the pNN signal/background discrimination are used to set a model-independent 95\% confidence level expected upper limit on the production cross section times branching ratio for a generic $\phi$ boson decaying into a muon pair in the 130 to 1000 GeV range.
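The parameterisation idea behind the pNN can be sketched in a few lines: the mass hypothesis is appended to the event features, so a single network serves every mass point and can interpolate between them. The toy below is a minimal, self-contained NumPy illustration; the single dimuon-mass feature, the mass grid, and the network size are all invented for the sketch and are not the analysis's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: "signal" events cluster around the hypothesised resonance mass,
# "background" events are spread uniformly. The mass hypothesis is appended
# to the feature vector, so ONE network covers every mass point.
def make_events(n, mass):
    sig = rng.normal(mass, 5.0, size=(n, 1))      # dimuon mass near the resonance
    bkg = rng.uniform(100.0, 300.0, size=(n, 1))  # smooth background spectrum
    x = np.vstack([sig, bkg])
    theta = np.full((2 * n, 1), mass)             # parameter input (mass hypothesis)
    y = np.concatenate([np.ones(n), np.zeros(n)])
    return np.hstack([x, theta]) / 300.0, y       # crude normalisation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-hidden-layer parameterised network, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros(1)

X, y = zip(*(make_events(200, m) for m in (150.0, 200.0, 250.0)))
X = np.vstack(X); y = np.concatenate(y)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2).ravel()

losses, lr = [], 0.5
for _ in range(300):
    h, p = forward(X)
    losses.append(-np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)))
    g = (p - y)[:, None] / len(y)                 # dL/dlogit
    gh = (g @ W2.T) * (1 - h ** 2)                # backprop through tanh layer
    W2 -= lr * h.T @ g;  b2 -= lr * g.sum(0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(0)

print(round(losses[0], 3), round(losses[-1], 3))
```

The same interface extends to many features and deeper networks; the essential point is only that the mass hypothesis enters as an ordinary input, so the trained model can be evaluated at mass values never seen in training.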

Relevance:

100.00%

Publisher:

Abstract:

Legionella is a Gram-negative bacterium that represents a public health issue with a heavy social and economic impact. It is therefore mandatory to provide a proper environmental surveillance and risk assessment plan to control Legionella in the water distribution systems of hospital and community buildings. This thesis combines several methodologies in a single workflow applied to the identification of non-pneumophila Legionella species (n-pL), starting from standard methods such as culture and gene sequencing (mip and rpoB), and passing through innovative approaches such as the MALDI-TOF MS technique and whole genome sequencing (WGS). The results obtained were compared to identify the Legionella isolates and led to the identification of four presumptive novel Legionella species. One of these four new isolates was characterized and taxonomically recognized under the name Legionella bononiensis (the 64th Legionella species). The workflow applied in this thesis helps increase the knowledge of environmental Legionella species, improving the description of the environment itself and of the events that promote the growth of Legionella in its ecological niche. The correct identification and characterization of the isolates make it possible to prevent their spread in man-made environments and to contain the occurrence of cases, clusters, or outbreaks. The experimental work undertaken could therefore support preventive measures during environmental and clinical surveillance, improving the study of species that are often underestimated or still unknown.

Relevance:

100.00%

Publisher:

Abstract:

The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics, and electronics are all key assets that depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. This becomes even more complex when dealing with advanced functional materials. Their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously developed to enable new possibilities, in both the experimental and computational realms, and scientists strive to exploit cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and by the proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret, and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above.
Specifically, it covers: developing features for specific classes of advanced materials and using them to train machine learning models that accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the process of using device simulations to train machine learning models; and dealing with scattered experimental data, using them to discover new patterns.
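The first strand above — turning compounds into fixed-length feature vectors and training a model to stand in for expensive computations — follows a common pattern that can be sketched very simply. The snippet below invents a toy dataset (formulas with made-up target values) and uses raw element counts as a crude stand-in for engineered descriptors; it illustrates the pattern, not the thesis's actual features or data.

```python
import re
import numpy as np

# Hypothetical dataset: chemical formulas with a made-up target property.
data = [("H2O", 1.0), ("CO2", 2.1), ("CH4", 1.4), ("NH3", 1.1),
        ("C2H6", 2.0), ("C6H6", 4.9), ("C2H4O", 2.8), ("CO", 1.6)]
elements = ["H", "C", "N", "O"]

def featurize(formula):
    """Element counts as a fixed-length feature vector (toy descriptor)."""
    counts = dict.fromkeys(elements, 0)
    for sym, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[sym] += int(num) if num else 1
    return [counts[e] for e in elements]

X = np.array([featurize(f) for f, _ in data], dtype=float)
y = np.array([t for _, t in data])

# Ridge regression in closed form: w = (X^T X + a I)^-1 X^T y.
a = 1e-3
w = np.linalg.solve(X.T @ X + a * np.eye(X.shape[1]), X.T @ y)
pred = X @ w
print(np.round(pred - y, 2))   # training residuals
```

Once trained on a set of computed references, such a surrogate returns predictions in microseconds, which is the sense in which featurization plus learning "accelerates" computational predictions.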

Relevance:

100.00%

Publisher:

Abstract:

Sheet pile walls are one of the oldest earth retention systems utilized in civil engineering projects. They are used for various purposes, such as excavation support systems, cofferdams, cut-off walls under dams, slope stabilization, waterfront structures, and flood walls. Sheet pile walls are also one of the most common types of quay walls used in port construction. The worldwide increase in the use of large ships for transportation has created an urgent need to deepen the seabed within port areas and, consequently, to rehabilitate their wharfs. Several methods can be used to increase the load-carrying capacity of sheet-pile walls. The use of additional anchored tie rods grouted into the backfill soil and arranged along the exposed wall height is one of the most practical and appropriate solutions for the stabilization and rehabilitation of existing quay walls. The Ravenna Port Authority initiated a project to deepen the harbor bottom at selected wharves. An extensive parametric study with the finite element program PLAXIS 2D (version 2012) was carried out to investigate the enhancement that the submerged grouted anchor technique brings to the load response of a sheet-pile quay wall. The influence of the grout-tie area, the length of the grouted body, the anchor inclination, and the anchor location was evaluated for different system parameters. A comparative study was also conducted with the PLAXIS 2D and 3D programs to investigate the behavior of these sheet pile quay walls in terms of the horizontal displacements induced along the sheet pile wall and the ground surface settlements, as well as the anchor force and the calculated factor of safety. Finally, a comprehensive study was carried out using two different constitutive models for the mechanical behavior of the soil (Mohr-Coulomb and Hardening Soil) to investigate their effect on the behavior of these sheet pile quay walls.

Relevance:

100.00%

Publisher:

Abstract:

This work provides several policy proposals capable of strengthening the private enforcement of EU competition law in arbitration. It focuses on the procedural law aspects that are permeated by legal uncertainty and that have not yet fallen under the scrutiny of the law and economics debate. The policy proposals described herein are based on the functional approach to law and economics and aim to promote a more qualified decision-making process by adjudicators, private parties, and lawmakers. The resulting framework of procedural rules would be a cost-effective policy tool that could sustain the European Commission's effort to guarantee a workable level of competition in the EU internal market. This project aims to answer the following broad research question: which procedural rules can improve the efficiency of antitrust arbitration by decreasing litigation costs for private parties on the one hand, and by increasing private parties' compliance with competition law on the other? Throughout this research project, this broad question has been developed into research sub-questions revolving around several key legal issues. The chosen sub-questions result from a vacuum in the European enforcement system that leaves several key legal issues in antitrust arbitration unresolved. The legal framework proposed in this research project could prevent such an uncertain scenario from impairing the EU private enforcement of competition law in arbitration. Therefore, our attention was drawn to those legal issues whose proposed solutions lead to significant uncertainties and that are most suitable for a law and economics analysis.

Relevance:

100.00%

Publisher:

Abstract:

Background: Piezoelectric instrumentation seems to offer three important advantages for cutting bone structures. First, it is more precise, because the cut is produced by micro-vibrations of the cutting insert. Second, it is safer, because the ultrasonic frequency used does not affect soft tissue. Third, the less invasive cutting action produces less tissue damage and, consequently, probably better healing. Aim of the Study: The aim of this study is to evaluate the effectiveness of piezoelectric devices in maxillo-facial surgery, in order to take advantage of these favourable capabilities. Material and Methods: Considering the several potential applications of piezoelectric technology in orthognathic, oncologic, and extractive surgery, we designed protocols to verify how this new device can modify the surgical technique, the surgical time, and the patients' healing and quality of life. Results: Thanks to the precision of the Piezosurgery cut, we were able to apply the CAD-CAM custom-made plates protocol in oncologic and orthognathic surgery, increasing the agreement between the 3D preoperative plan and the surgical execution. We also found a better quality-of-life outcome in patients who underwent extractive surgery. Conclusion: The Piezosurgery device seems to be a strong surgical aid where safe and precise cuts are needed; its capability to reduce patient discomfort needs to be studied in depth also in major surgery, such as orthognathic and oncologic surgery.

Relevance:

100.00%

Publisher:

Abstract:

This thesis studies how commercial practice is developing with artificial intelligence (AI) technologies and discusses some normative concepts in EU consumer law. The author analyses the phenomenon of 'algorithmic business', which denotes the increasing use of data-driven AI in marketing organisations for the optimisation of a range of consumer-related tasks. The phenomenon is orienting business-consumer relations towards general trends that influence the power and behaviour of consumers. These developments are not taking place in a legal vacuum, but against the background of a normative system aimed at maintaining fairness and balance in market transactions. The author assesses current developments in commercial practices in the context of EU consumer law, which is specifically aimed at regulating commercial practices. The analysis is critical by design and, without neglecting concrete practices, tries to look at the big picture. The thesis consists of nine chapters divided into three thematic parts. The first part discusses the deployment of AI in marketing organisations: a brief history, the technical foundations, and the modes of integration into business organisations. In the second part, a selection of socio-technical developments in commercial practice is analysed: the monitoring and analysis of consumers' behaviour based on data; the personalisation of commercial offers and the customer experience; the use of information on consumers' psychology and emotions; and the mediation of commercial relations through conversational marketing applications. The third part assesses these developments in the context of EU consumer law and of the broader policy debate concerning consumer protection in the algorithmic society.
In particular, two normative concepts underlying the EU fairness standard are analysed: manipulation, as a substantive regulatory standard that limits commercial behaviours in order to protect consumers' informed and free choices; and vulnerability, as a concept of social policy that identifies people who are more exposed to marketing practices.

Relevance:

100.00%

Publisher:

Abstract:

Imaging technologies are widely used in application fields such as the natural sciences, engineering, medicine, and the life sciences. A broad class of imaging problems reduces to solving ill-posed inverse problems (IPs). Traditional strategies for solving these ill-posed IPs rely on variational regularization methods, which are based on the minimization of suitable energies and make use of knowledge about the image formation model (the forward operator) and prior knowledge of the solution, but fall short in incorporating knowledge directly from data. On the other hand, more recent learned approaches can easily learn the intricate statistics of images from a large set of data, but lack a systematic method for incorporating prior knowledge about the image formation model. The main purpose of this thesis is to discuss data-driven image reconstruction methods that combine the benefits of these two reconstruction strategies for the solution of highly nonlinear ill-posed inverse problems. Mathematical formulations and numerical approaches for imaging IPs, including linear as well as strongly nonlinear problems, are described. More specifically, we address the Electrical Impedance Tomography (EIT) reconstruction problem by unrolling the regularized Gauss-Newton method and integrating the regularization learned by a data-adaptive neural network. Furthermore, we investigate the solution of nonlinear ill-posed IPs by introducing a deep-PnP framework that integrates a graph convolutional denoiser into the proximal Gauss-Newton method, with a practical application to EIT, a recently introduced and promising imaging technique. Efficient algorithms are then applied to the solution of the limited-electrodes problem in EIT, combining compressive sensing techniques and deep learning strategies. Finally, a transformer-based neural network architecture is adapted to restore the noisy solution of the Computed Tomography problem recovered using the filtered back-projection method.
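The unrolling idea mentioned above can be illustrated on a toy problem: a fixed, small number of regularized Gauss-Newton iterations, where in the learned variant the regularizer would be given by a trained network. In the sketch below a plain Tikhonov term stands in for the learned regularizer, and the forward operator is an invented two-dimensional surrogate, not an EIT model.

```python
import numpy as np

# Toy nonlinear forward operator F(x) and its Jacobian. In EIT, F maps a
# conductivity image to boundary voltages; here a small smooth surrogate.
def F(x):
    return np.array([2 * x[0] + np.sin(x[1]), x[1] + 0.5 * x[0] ** 2])

def J(x):
    return np.array([[2.0, np.cos(x[1])],
                     [x[0], 1.0]])

x_true = np.array([0.8, -0.3])
y = F(x_true)                        # noiseless data, for clarity

# Unrolled regularized Gauss-Newton: a FIXED number of iterations, each
#   x <- x + (J^T J + lam R)^{-1} (J^T (y - F(x)) - lam R x).
# In the learned variant, R (or the whole update) comes from a trained
# network; here R = I (Tikhonov) stands in for the learned regularizer.
def unrolled_gn(x0, n_iter=8, lam=1e-2):
    x = x0.copy()
    R = np.eye(2)
    for _ in range(n_iter):
        Jx = J(x)
        lhs = Jx.T @ Jx + lam * R
        rhs = Jx.T @ (y - F(x)) - lam * R @ x
        x = x + np.linalg.solve(lhs, rhs)
    return x

x_rec = unrolled_gn(np.zeros(2))
print(np.round(x_rec, 3))
```

Because the number of iterations is fixed, the whole loop is a differentiable computational graph, which is exactly what makes it possible to train a regularizer (or step correction) end to end through it.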

Relevance:

100.00%

Publisher:

Abstract:

Because of the potentially irreversible impact of groundwater quality deterioration in the Ferrara coastal aquifer, answers concerning the assessment of the extent of the salinization problem, the understanding of the mechanisms governing salinization processes, and the sustainability of the current water resources management are urgently needed. In this light, the present thesis aims to achieve the following objectives:
- Characterization of the lowland coastal aquifer of Ferrara: hydrology, hydrochemistry, and evolution of the system
- The importance of data acquisition techniques in saltwater intrusion monitoring
- Predicting salinization trends in the lowland coastal aquifer
- Ammonium occurrence in a salinized lowland coastal aquifer
- Trace element mobility in a saline coastal aquifer

Relevance:

100.00%

Publisher:

Abstract:

This work focuses on the study of saltwater intrusion in coastal aquifers and, in particular, on the construction of conceptual schemes to evaluate the associated risk. Saltwater intrusion depends on different natural and anthropic factors, both exhibiting strongly random behaviour, that should be considered for an optimal management of the territory and of water resources. Given the uncertainty in the problem parameters, the risk associated with salinization needs to be cast in a probabilistic framework. On the basis of a widely adopted sharp-interface formulation, the key hydrogeological parameters are modeled as random variables, and global sensitivity analysis is used to determine their influence on the position of the saltwater interface. The analyses presented in this work rely on an efficient model reduction technique based on Polynomial Chaos Expansion, which provides an accurate description of the model without a great computational burden. When the assumptions of classical analytical models are not satisfied, as often happens in real case studies, including the area analyzed in the present work, one can adopt data-driven techniques based on the analysis of the data characterizing the system under study. A model can then be defined on the basis of connections between the system state variables, with only a limited number of assumptions about the "physical" behaviour of the system.
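A minimal sketch of the PCE-based sensitivity analysis described above: fit a low-degree orthonormal Legendre expansion to model evaluations, then read the Sobol indices directly off the squared coefficients. The response function and the two inputs below are invented stand-ins for the sharp-interface model and its hydrogeological parameters.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Hypothetical smooth response standing in for the sharp-interface model:
# e.g. inland penetration of the saltwater wedge as a function of two
# standardized random inputs (say, recharge and hydraulic conductivity).
def model(xi):
    return np.exp(0.3 * xi[:, 0]) * (1.0 + 0.5 * xi[:, 1])

# Orthonormal Legendre polynomials on [-1, 1] (uniform weight).
def legendre(n, x):
    p = [np.ones_like(x), x, 0.5 * (3 * x ** 2 - 1)][n]
    return p * np.sqrt(2 * n + 1)

# Tensor-product basis up to total degree 2 for the two inputs.
multi_indices = [(i, j) for i, j in product(range(3), repeat=2) if i + j <= 2]

xi = rng.uniform(-1, 1, size=(500, 2))
Psi = np.column_stack([legendre(i, xi[:, 0]) * legendre(j, xi[:, 1])
                       for i, j in multi_indices])
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

# Sobol indices straight from the PCE coefficients: each squared
# coefficient contributes to the variance share of the inputs it involves.
var_total = np.sum(coef[1:] ** 2)       # index (0, 0) is the mean term
S = {}
for (i, j), c in zip(multi_indices, coef):
    if (i, j) == (0, 0):
        continue
    key = 'x1' if i and not j else ('x2' if j and not i else 'x1x2')
    S[key] = S.get(key, 0.0) + c ** 2 / var_total

print({k: round(v, 3) for k, v in S.items()})
```

This is the practical appeal of PCE surrogates: after a few hundred model runs, the full set of Sobol indices comes essentially for free from the expansion coefficients, with no further sampling of the expensive model.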

Relevance:

100.00%

Publisher:

Abstract:

The internet and digital technologies have revolutionized the economy. Regulating the digital market has become a priority for the European Union. While promoting innovation and development, EU institutions must ensure that the digital market maintains a competitive structure. Among the numerous elements characterizing the digital sector, users' data are particularly important. Digital services are centered around personal data, the accumulation of which has contributed to the centralization of market power in the hands of a few large providers. As a result, data-driven mergers and data-related abuses have gained a central role for the purposes of EU antitrust enforcement. In light of these considerations, this work aims to assess whether EU competition law is well suited to address data-driven mergers and data-related abuses of dominance. These conducts are of crucial importance to the maintenance of competition in the digital sector, insofar as the accumulation of users' data constitutes a fundamental competitive advantage. To begin with, Part 1 addresses the specific features of the digital market and their impact on the definition of the relevant market and the assessment of dominance by antitrust authorities. Secondly, Part 2 analyzes the EU's case law on data-driven mergers to verify whether merger control is well suited to address these concentrations. Thirdly, Part 3 discusses abuses of dominance in the phase of data collection and the legal frameworks applicable to these conducts. Fourthly, Part 4 focuses on access to "essential" datasets and the indirect effects of anticompetitive conducts on rivals' ability to access users' information. Finally, Part 5 discusses differential pricing practices implemented online on the basis of personal data. As will be argued, the combination of efficient competition law enforcement and the hoped-for adoption of specific regulation seems to be the best way to face the challenges raised by "data-related dominance".

Relevance:

100.00%

Publisher:

Abstract:

The present work aims to provide a deeper understanding of thermally driven turbulence and to address some modelling aspects related to the physics of the flow. For this purpose, two idealized systems are investigated by Direct Numerical Simulation: rotating and non-rotating Rayleigh-Bénard convection. A preliminary study of the flow topologies shows how the coherent structures organise into different patterns depending on the rotation rate. From a statistical perspective, the analysis of the turbulent kinetic energy and temperature variance budgets makes it possible to identify the flow regions where the production, transport, and dissipation of turbulent fluctuations occur. To provide a multi-scale description of the flows, a theoretical framework based on the Kolmogorov and Yaglom equations is applied for the first time to Rayleigh-Bénard convection. The analysis shows how spatial inhomogeneity modulates the dynamics at different scales and wall distances. Inside the core of the flow, the space of scales can be divided into an inhomogeneity-dominated range at large scales, an inertial-like range at intermediate scales, and a dissipative range at small scales. This classic scenario breaks down close to the walls, where inhomogeneous mechanisms and viscous/diffusive processes are important at every scale and entail more complex dynamics. The same theoretical framework is extended to the filtered velocity and temperature fields of non-rotating Rayleigh-Bénard convection. The analysis of the filtered Kolmogorov and Yaglom equations reveals the influence of the residual scales on the filtered dynamics in both physical and scale space, highlighting the effect of the relative position between the filter length and the crossover that separates the inhomogeneity-dominated range from the quasi-homogeneous range.
The assessment of the filtered and residual physics proves instrumental for the correct use of existing Large-Eddy Simulation models and for the development of new ones.
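For orientation, in the homogeneous and isotropic limit the scale-by-scale budgets mentioned above reduce to the classical Kolmogorov and Yaglom relations. These are standard textbook results, quoted here with the caveat that the prefactor of the second depends on the convention adopted for the scalar dissipation rate $\varepsilon_\theta$; the thesis works with their generalized, inhomogeneous counterparts, which carry additional production and transport terms.

```latex
% Kolmogorov's four-fifths law for the longitudinal velocity increment
\langle (\delta u_{\parallel})^{3} \rangle = -\tfrac{4}{5}\,\varepsilon\, r

% Yaglom's four-thirds law for the temperature increment
\langle \delta u_{\parallel}\,(\delta \theta)^{2} \rangle
  = -\tfrac{4}{3}\,\varepsilon_{\theta}\, r
```

Here $\delta(\cdot)$ denotes an increment over separation $r$, $\varepsilon$ the turbulent kinetic energy dissipation rate, and $\varepsilon_\theta$ the temperature variance dissipation rate.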

Relevance:

100.00%

Publisher:

Abstract:

The present Dissertation shows how recent statistical analysis tools and open datasets can be exploited to improve modelling accuracy in two distinct yet interconnected domains of flood hazard (FH) assessment. In the first Part, unsupervised artificial neural networks are employed as regional models for sub-daily rainfall extremes. The models aim to learn a robust relation to estimate locally the parameters of Gumbel distributions of extreme rainfall depths for any sub-daily duration (1-24h). The predictions depend on twenty morphoclimatic descriptors. A large study area in north-central Italy is adopted, where 2238 annual maximum series are available. Validation is performed over an independent set of 100 gauges. Our results show that multivariate ANNs may remarkably improve the estimation of percentiles relative to the benchmark approach from the literature, where Gumbel parameters depend on mean annual precipitation. Finally, we show that the very nature of the proposed ANN models makes them suitable for interpolating predicted sub-daily rainfall quantiles across space and time-aggregation intervals. In the second Part, decision trees are used to combine a selected blend of input geomorphic descriptors for predicting FH. Relative to existing DEM-based approaches, this method is innovative, as it relies on the combination of three characteristics: (1) simple multivariate models, (2) a set of exclusively DEM-based descriptors as input, and (3) an existing FH map as reference information. First, the methods are applied to northern Italy, represented with the MERIT DEM (∼90m resolution), and second, to the whole of Italy, represented with the EU-DEM (25m resolution). 
The results show that multivariate approaches may (a) significantly enhance the delineation of flood-prone areas relative to a selected univariate approach, (b) provide accurate predictions of expected inundation depths, (c) produce encouraging results in extrapolation, (d) complete the information of imperfect reference maps, and (e) conveniently convert binary maps into continuous representations of FH.
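The DEM-based classification in the second Part can be sketched with a tiny decision tree: a few terrain descriptors per cell, an existing flood-hazard map as the training label, and a shallow tree that combines the descriptors. Everything below — the two descriptors, the labeling rule, the tree depth — is invented for illustration; the thesis's actual descriptors and reference maps are of course richer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy DEM-derived descriptors for 1000 cells (hypothetical stand-ins):
# elevation above the nearest drainage (m) and local slope (%).
n = 1000
hand = rng.uniform(0, 10, n)
slope = rng.uniform(0, 15, n)
X = np.column_stack([hand, slope])
# Toy reference flood-hazard map: low-lying AND flat cells are flood-prone.
y = ((hand < 3.0) & (slope < 6.0)).astype(int)

def gini(y):
    p = np.bincount(y, minlength=2) / max(len(y), 1)
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustive search over features and quantile thresholds."""
    best = (None, None, np.inf)
    for f in range(X.shape[1]):
        for t in np.quantile(X[:, f], np.linspace(0.05, 0.95, 19)):
            left = X[:, f] <= t
            score = (left.sum() * gini(y[left])
                     + (~left).sum() * gini(y[~left])) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best

def build(X, y, depth):
    if depth == 0 or len(set(y)) <= 1:
        return int(np.bincount(y, minlength=2).argmax())  # majority leaf
    f, t, _ = best_split(X, y)
    left = X[:, f] <= t
    return (f, t, build(X[left], y[left], depth - 1),
            build(X[~left], y[~left], depth - 1))

def predict(tree, x):
    while isinstance(tree, tuple):
        f, t, l, r = tree
        tree = l if x[f] <= t else r
    return tree

tree = build(X, y, depth=2)
acc = np.mean([predict(tree, x) == t for x, t in zip(X, y)])
print(round(acc, 3))
```

A depth-2 tree already captures the AND of the two descriptors, which is the core appeal of the approach: a simple, interpretable multivariate model that can then be applied wall-to-wall across a DEM.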