819 results for Peculiarities


Relevance: 10.00%

Abstract:

The simulation of ultrafast photoinduced processes is a fundamental step towards understanding the underlying molecular mechanism and interpreting or predicting experimental data. Simulating a complex photoinduced process on a computer is only possible by introducing approximations but, to obtain reliable results, the need to reduce complexity must be balanced against the accuracy of the model, which should include all the relevant degrees of freedom and a quantitatively correct description of the electronic states involved in the process. This work presents new computational protocols and strategies for the parameterisation of accurate models of photochemical/photophysical processes based on state-of-the-art multiconfigurational wavefunction-based methods. The required ingredients for a dynamics simulation include potential energy surfaces (PESs) as well as electronic state couplings, which must be mapped across the wide range of geometries visited during the wavepacket/trajectory propagation. The developed procedures make it possible to obtain solid and extensive databases while reducing the computational cost as much as possible, thanks to, e.g., specific tuning of the level of theory for different PES regions and/or direct calculation of only the needed components of vectorial quantities (such as gradients or nonadiabatic couplings). The presented approaches were applied to three case studies (azobenzene, pyrene, visual rhodopsin), all requiring an accurate parameterisation but for different reasons. The resulting models and simulations made it possible to elucidate the mechanism and time scale of the internal conversion, reproducing or even predicting new transient experiments.
The general applicability of the developed protocols to systems with different peculiarities, and the possibility of parameterising different types of dynamics on an equal footing (classical vs. purely quantum), prove that the developed procedures are flexible enough to be tailored to each specific system and pave the way for exact quantum dynamics with multiple degrees of freedom.

Relevance: 10.00%

Abstract:

The steadily growing immigration phenomenon in today's Japan is producing a tangible and expanding presence of immigrant-origin youths residing in the country. International research in migration studies has underlined the importance of focusing on immigrant-origin youths to shed light on how immigrants incorporate into their countries of destination. Indeed, immigrants' offspring, the adults of tomorrow, act as the interlocutor between first-generation immigrants and the receiving society. The extent of the presence of immigrants' children in countries of destination is also a reliable yardstick for assessing the maturation of the migration process, transforming it from a temporary phenomenon into a long-term settlement. Within this framework, the school is a privileged site for observing and analyzing immigrant-origin youths' integration. Alongside family and peers, school constitutes one of the main agents of socialization. Here, children learn norms and rules and acquire the tools they need to eventually compete in the pursuit of an occupation, which determines their future socioeconomic standing. This doctoral research aims to identify which theoretical model articulated in migration studies best describes the adaptation process of immigrant-origin youths in Japan. In particular, it examines whether (and to what extent) any of the pre-existing frameworks can help explain the circumstances occurring in Japan, or whether further elaboration and adjustment are needed. Alternatively, it considers whether it is necessary to produce a new model based on the peculiarities of the Japanese social context. This study provides a theoretically oriented contribution to the (mainly descriptive but maturing) literature on immigrant-origin youths' integration in Japan.
Considering the past growth trends of Japanese immigration and its projected expansion (Korekawa 2018c), this study may be considered pioneering with respect to the future development of the phenomenon.

Relevance: 10.00%

Abstract:

Investment in transport infrastructure was long considered a general policy measure falling within the exclusive competence of the Member States, notwithstanding the general prohibition of public support measures in favour of undertakings under Art. 107 TFEU. The judgments delivered by the EU courts concerning the Paris and Leipzig-Halle airports initiated a genuine jurisprudential reversal, in light of international economic transformations, calling into question the concept of undertaking as well as the settled interpretation according to which the financing of infrastructure (public goods intended to satisfy citizens' mobility needs) would escape the application of the State aid rules. Despite the constant need to modernise and develop infrastructure, the new regulatory framework subsequently adopted by the European Union inevitably led Member States to submit every new infrastructure investment to the Commission's prior scrutiny. Starting from an analysis of the State aid rules under Arts. 107 et seq. TFEU, this work analyses the principles developed by case law and legal scholarship through the interpretation of the primary sources, highlighting the main underlying legal issues, also in light of the peculiarities of the infrastructure concerned, of the ownership and governance models, and of the competences and decision-making powers over new investment projects. Finally, the work focuses on major European and international infrastructure projects involving transport networks, analysing the new challenges while considering the need to ensure, with respect to them as well, the safeguarding of the so-called level playing field and substantive compliance with the State aid rules.

Relevance: 10.00%

Abstract:

Laser-based Powder Bed Fusion (L-PBF) is one of the most commonly used metal Additive Manufacturing (AM) techniques for producing highly customized, value-added parts. The AlSi10Mg alloy has received particular attention in the L-PBF process due to its good printability, high strength-to-weight ratio, corrosion resistance, and relatively low cost. However, a deep understanding of the effect of heat treatments on this alloy's metastable microstructure is still required in order to develop tailored heat treatments for the L-PBF AlSi10Mg alloy that overcome the limits of the as-built condition. Several authors have already investigated the effects of conventional heat treatment on the microstructure and mechanical behavior of the L-PBF AlSi10Mg alloy, but have often overlooked the peculiarities of the starting supersaturated and ultrafine microstructure induced by rapid solidification. For this reason, the effects of an innovative T6 heat treatment (T6R) on the microstructure and mechanical behavior of the L-PBF AlSi10Mg alloy were assessed. The short solution soaking time (10 min) and the relatively low temperature (510 °C) reduced the porosity growth typical of high temperatures and led to a homogeneous distribution of fine globular Si particles in the Al matrix. In addition, they increased the amount of Mg and Si in solid solution available for precipitation hardening during the aging step. The mechanical (at room temperature and 200 °C) and tribological properties of the T6R alloy were evaluated and compared with other solutions, especially an optimized direct-aged alloy (T5). Results showed that the innovative T6R alloy exhibits the best trade-off between strength and ductility, the highest fatigue strength among the analyzed conditions, and interesting tribological behavior.
Furthermore, the high-temperature mechanical performance of the heat-treated L-PBF AlSi10Mg alloy makes it suitable for structural components operating in mild service conditions at 200 °C.

Relevance: 10.00%

Abstract:

The severe accidents deriving from the impact of natural events on industrial installations have become a matter of growing concern in recent decades. In the literature, these events are typically referred to as Natech accidents. Several peculiarities distinguish them from conventional industrial accidents caused by internal factors, such as the possible occurrence of multiple simultaneous failures and the enhanced probability of cascading events. The research project provides a comprehensive overview of Natech accidents that occurred in the chemical and process industry, allowing the identification of relevant aspects of Natech events. Quantified event trees and ignition probabilities are derived from the collected dataset, providing a step forward in the quantitative risk assessment of Natech accidents. The investigation of past Natech accidents also demonstrated that wildfires may cause technological accidents. Climate change and global warming are promoting the conditions for wildfire development and rapid spread; hence, ensuring the safety of industrial facilities exposed to wildfires is paramount. This was achieved by defining safety distances between wildland vegetation and industrial equipment items. In addition, an innovative methodology for the vulnerability assessment of Natech and domino scenarios triggered by wildfires was developed. The approach accounts for the dynamic behaviour of wildfire events and the related technological scenarios. Moreover, the performance of the emergency response and the related intervention time in the case of cascading events caused by natural events were evaluated. Overall, the tools presented in this thesis represent a step forward in the quantitative risk assessment of Natech accidents. The methodologies developed also provide a solid basis for the definition of effective strategies for risk mitigation and reduction.
These aspects are crucial to improving the resilience of industrial plants to natural hazards, especially considering the effects that climate change may have on the severity of such events.
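As a minimal illustration of the event-tree quantification mentioned above, the sketch below propagates branch probabilities down a small tree; all probabilities and outcome labels are invented placeholders, not the values derived from the collected dataset:

```python
# Minimal event-tree quantification sketch: the probability of each final
# outcome is the product of the branch probabilities along its path.
# All numbers here are illustrative placeholders, not values from the study.

def quantify_event_tree(tree, p_initiating=1.0):
    """Return {outcome: probability} for a nested event tree.

    `tree` is either an outcome label (str) or a list of
    (branch_probability, subtree) pairs whose probabilities sum to 1.
    """
    if isinstance(tree, str):                 # leaf: a final outcome
        return {tree: p_initiating}
    outcomes = {}
    for p_branch, subtree in tree:
        sub = quantify_event_tree(subtree, p_initiating * p_branch)
        for label, p in sub.items():
            outcomes[label] = outcomes.get(label, 0.0) + p
    return outcomes

# Hypothetical Natech tree: release -> immediate ignition? -> delayed ignition?
tree = [
    (0.2, "pool fire"),                       # immediate ignition
    (0.8, [                                   # no immediate ignition
        (0.3, "flash fire / VCE"),            # delayed ignition
        (0.7, "toxic dispersion, no ignition"),
    ]),
]

probs = quantify_event_tree(tree)
```

Each outcome probability is simply the product of the branch probabilities along its path, so the probabilities of all outcomes sum to the probability of the initiating event.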

Relevance: 10.00%

Abstract:

The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors, and devices. A notable example is the emerging Smart Tourism field, which derives from the application of ICT to tourism to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of these sources exacerbate the complexity of developing integration solutions, with consequently high effort and costs for the partners seeking them. As a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims to facilitate the blending of data and services. At its core, APERTO relies on APERTO FaaS, a Serverless platform allowing fast prototyping of the business logic, lowering the barrier to entry and the development costs for newcomers, fine-grained (down-to-zero) scaling of the resources serving end users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communication between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments.
In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; iv) a decentralized approach to the verification of access rights to resources.
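As a loose illustration of point i), function composition in a FaaS setting can be reduced to chaining small ready-to-use functions into one workload. The sketch below is a generic Python analogy; none of the function names correspond to the actual APERTO FaaS API:

```python
# Illustrative sketch of FaaS-style function composition: a workload is a
# pipeline of small, independent functions, each consuming the previous
# function's output. Function names are invented for illustration and do
# not correspond to the APERTO FaaS API.

from functools import reduce

def compose(*stages):
    """Build a single callable that applies `stages` left to right."""
    return lambda payload: reduce(lambda acc, f: f(acc), stages, payload)

# Three "ready-to-use" functions a Smart Tourism workload might chain:
def fetch_events(city):
    return {"city": city, "events": ["museum night", "food fair", "concert"]}

def filter_free(data):
    data["events"] = [e for e in data["events"] if e != "concert"]
    return data

def render(data):
    return f"{data['city']}: {', '.join(data['events'])}"

workload = compose(fetch_events, filter_free, render)
result = workload("Bologna")   # "Bologna: museum night, food fair"
```

In a real FaaS platform each stage would be an independently deployed and scaled function, with the platform handling the asynchronous hand-off between stages rather than a direct in-process call.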

Relevance: 10.00%

Abstract:

The study of ancient, undeciphered scripts presents unique challenges that depend both on the nature of the problem and on the peculiarities of each writing system. In this thesis, I present two computational approaches, each tailored to a different task and writing system. The first method is aimed at the decipherment of the Linear A fraction signs, in order to discover their numerical values. This is achieved with a combination of constraint programming, ad hoc metrics, and paleographic considerations. The second main contribution of this thesis is the creation of an unsupervised deep learning model that uses drawings of signs from ancient writing systems to learn to distinguish different graphemes in the vector space. This system, which is based on techniques from the field of computer vision, is adapted to the study of ancient writing systems by incorporating information about sequences into the model, mirroring what is often done in natural language processing. To develop this model, the Cypriot Greek Syllabary is used as a target, since it is a deciphered writing system. Finally, the unsupervised model is adapted to the undeciphered Cypro-Minoan script and used to answer open questions about it. In particular, by reconstructing multiple allographs that paleographers do not agree upon, it supports the idea that Cypro-Minoan is a single script and not a collection of three scripts, as has been proposed in the literature. These results on two different tasks show that computational methods can be applied to undeciphered scripts, despite the relatively small amount of available data, paving the way for further advances in paleography using these methods.
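A toy version of the constraint-based search for the numerical values of fraction signs can be sketched as follows; the sign labels, candidate values, and constraints are invented for illustration and are not the actual Linear A data or the metrics used in the thesis:

```python
# Toy constraint-search sketch in the spirit of the decipherment approach:
# assign candidate numerical values (simple fractions) to sign symbols so
# that attested sign combinations satisfy arithmetic constraints. The signs
# and constraints below are invented for illustration only.

from fractions import Fraction
from itertools import permutations

CANDIDATES = [Fraction(1, d) for d in (2, 3, 4, 6, 8)]

def solve(signs, constraints):
    """Return all injective sign->value assignments satisfying every constraint."""
    solutions = []
    for values in permutations(CANDIDATES, len(signs)):
        assignment = dict(zip(signs, values))
        if all(c(assignment) for c in constraints):
            solutions.append(assignment)
    return solutions

signs = ["J", "E", "F"]                        # hypothetical sign labels
constraints = [
    lambda a: a["J"] + a["E"] + a["F"] == 1,   # the three signs sum to a unit
    lambda a: a["J"] > a["E"] > a["F"],        # attested ordering by magnitude
]

solutions = solve(signs, constraints)
```

For these toy constraints the search space collapses to a single consistent assignment; on real data, the number of surviving assignments (and the ad hoc metrics ranking them) is what carries the deciphering evidence.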

Relevance: 10.00%

Abstract:

This thesis illustrates the construction of a mathematical model of a hydraulic system, oriented to the design of a model predictive control (MPC) algorithm. The modeling procedure starts from the basic formulation of a piston-servovalve system. The latter is a complex nonlinear system with some unknown and unmeasurable effects that make the modeling procedure challenging. A first approximation of the system parameters is obtained from datasheet information, workbench tests, and other data provided by the company. Then, to validate and refine the model, open-loop simulations were run to match the data against characteristics obtained from real acquisitions. The final set of ODEs captures all the main peculiarities of the system, except for some characteristics due to highly varying and unknown hydraulic effects, such as the unmodeled resistive elements of the pipes. After a careful analysis, since the model presents many internal complexities, a simplified version is presented. The latter is used to correctly linearize and discretize the nonlinear model. On this basis, an MPC algorithm for reference tracking with linear constraints is implemented. The results show the potential of MPC in this kind of industrial application, namely high-quality tracking performance while satisfying state and input constraints. The increased robustness and flexibility with respect to the standard control techniques adopted for these systems, such as PID controllers, are evident. The simulations for model validation and for the controlled system were carried out in a Python environment.
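As a rough illustration of reference tracking with a receding horizon, the sketch below controls a scalar discretized linear system by grid search over constrained input sequences. The dynamics and parameters are illustrative and unrelated to the hydraulic model identified in the thesis; a real MPC implementation would solve a quadratic program rather than enumerate inputs:

```python
# Toy receding-horizon (MPC-style) tracking sketch on a scalar discretized
# linear system x[k+1] = A*x[k] + B*u[k], with a box constraint on u.
# At each step the controller searches a grid of input sequences over a
# short horizon and applies only the first input (receding horizon).

from itertools import product

A, B = 0.9, 0.5                             # illustrative scalar dynamics
U_GRID = [u / 10 for u in range(-10, 11)]   # admissible inputs in [-1, 1]
HORIZON = 3

def mpc_step(x, reference):
    """Return the first input of the best constrained input sequence."""
    best_u, best_cost = 0.0, float("inf")
    for seq in product(U_GRID, repeat=HORIZON):
        xk, cost = x, 0.0
        for u in seq:
            xk = A * xk + B * u
            cost += (xk - reference) ** 2 + 0.01 * u ** 2
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Closed-loop simulation: drive the state from 0 toward the reference 1.0
x, reference = 0.0, 1.0
for _ in range(20):
    u = mpc_step(x, reference)
    x = A * x + B * u                        # x ends close to the reference
```

The key MPC idea survives even in this toy form: at every sampling instant an optimization over a finite horizon is solved subject to the input constraint, but only the first optimal input is applied before re-optimizing from the new state.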

Relevance: 10.00%

Abstract:

Digital Breast Tomosynthesis (DBT) is an advanced mammography technique based on the reconstruction of a pseudo-volumetric image. To date, image quality represents the most deficient section of DBT quality control protocols: the related tests are not yet characterized by either action levels or typical values. This thesis work focuses on the evaluation of one aspect of image quality: the z-resolution. The latter is studied in terms of the Artifact Spread Function (ASF), a function that describes the spread of the signal of a detail along the reconstructed focal planes. To quantify the ASF numerically, its Full Width at Half Maximum (FWHM) is calculated and used as a representative index of z-resolution. Experimental measurements were acquired on 24 DBT systems of 7 different models, currently in use in 20 hospital facilities in Italy. The analysis of 5 different commercial phantoms, performed on the clinically reconstructed images, led to the identification of characteristic FWHM values for each type of DBT system. The ASF clearly showed a dependence on the size of the detail, with higher FWHM values for larger objects. The z-resolution was found to be positively influenced by the acquisition angle: Fujifilm systems systematically showed wider ASF profiles in ST mode (15°) than in HR mode (40°). However, no clear relationship between angular range and ASF was found across the different DBT systems, due to the influence of the peculiarities of each reconstruction algorithm. The experimental approach presented in this thesis can be proposed as a z-resolution quality control test procedure, and the values found could be used as a starting point for identifying typical values to be included in such a test in a DBT protocol. Clearly, a statistically significant number of images is needed for this purpose: since the equipment involved in this work is located in hospitals and is not available for research purposes, only a limited amount of data could be acquired and processed.
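The FWHM index used to quantify the ASF can be illustrated with a minimal sketch that linearly interpolates the half-maximum crossings of a sampled profile; the profile below is synthetic, not measured DBT data:

```python
# Minimal sketch of the FWHM index for z-resolution: find the half-maximum
# level of a single-peaked ASF profile and measure the width of the profile
# above it, interpolating linearly between sampled planes.
# The profile used below is synthetic, not measured DBT data.

def fwhm(z, asf):
    """Full Width at Half Maximum of a single-peaked profile asf(z)."""
    peak = max(asf)
    half = peak / 2.0
    i_peak = asf.index(peak)

    def crossing(i0, i1):
        # Linear interpolation of the z position where asf crosses `half`
        z0, z1, a0, a1 = z[i0], z[i1], asf[i0], asf[i1]
        return z0 + (half - a0) * (z1 - z0) / (a1 - a0)

    # Walk left from the peak to the rising half-maximum crossing
    i = i_peak
    while asf[i] > half:
        i -= 1
    left = crossing(i, i + 1)

    # Walk right from the peak to the falling crossing
    i = i_peak
    while asf[i] > half:
        i += 1
    right = crossing(i - 1, i)

    return right - left

# Synthetic triangular ASF sampled at 1 mm plane spacing
z = list(range(11))                      # reconstructed plane positions, mm
asf = [0, 0.2, 0.4, 0.6, 0.8, 1.0, 0.8, 0.6, 0.4, 0.2, 0]

width = fwhm(z, asf)                     # 5.0 mm for this triangular profile
```

A narrower FWHM means the detail's signal is confined to fewer reconstructed planes, i.e. better z-resolution, which is why the index grows with detail size and shrinks with wider acquisition angles in the measurements above.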