11 results for Optimal fusion performance

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

30.00%

Publisher:

Abstract:

The fundamental goal of this thesis is the determination of the isospin dependence of the Ar+Ni fusion-evaporation cross section. Three Ar isotope beams, with energies of about 13 AMeV, were accelerated and directed onto isotopically enriched Ni targets in order to produce Pd nuclei with mass numbers ranging from 92 to 104. The measurements were performed with the high-performance 4π detector INDRA, coupled with the magnetic spectrometer VAMOS. Although the results are still very preliminary, the behaviour of the obtained fusion-evaporation cross sections hints at a possible isospin dependence.
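
As a minimal illustration of how such a measurement translates into a number, the sketch below computes a fusion-evaporation cross section from a detected residue yield, the integrated beam count, the target areal density, and the overall detection efficiency. It is a generic textbook relation with placeholder numbers, not the INDRA-VAMOS analysis, and the function name cross_section_mb is ours.

```python
# Illustrative sketch (not the thesis analysis): how a fusion-evaporation
# cross section is typically extracted from measured yields. All numbers
# below are placeholders, not values from the INDRA-VAMOS measurement.

AVOGADRO = 6.022e23  # atoms/mol

def cross_section_mb(n_detected, n_beam, target_mg_cm2, molar_mass, efficiency):
    """Cross section in millibarn from detected residues, beam counts,
    target areal density (mg/cm^2), target molar mass (g/mol) and the
    overall detection efficiency."""
    n_target = target_mg_cm2 * 1e-3 * AVOGADRO / molar_mass  # atoms/cm^2
    sigma_cm2 = n_detected / (n_beam * n_target * efficiency)
    return sigma_cm2 * 1e27  # 1 mb = 1e-27 cm^2

# Example with placeholder numbers
print(cross_section_mb(n_detected=1.2e4, n_beam=5e12,
                       target_mg_cm2=1.0, molar_mass=58.0, efficiency=0.05))
```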

Relevance:

30.00%

Publisher:

Abstract:

The Plasma Focus is a device designed to generate a plasma sheet between two coaxial electrodes by means of a high voltage difference. The plasma is then driven to collapse into a "pinch", where thermonuclear conditions prevail. During the "pinch phase" charged particles are emitted, with two main components: an ion beam peaked forward and an electron beam directed backward. The electron beam emitted backward by Plasma Focus devices is being investigated as a radiation source for medical applications: it is used to produce x-rays by interaction with appropriate targets (through bremsstrahlung and characteristic emission). A dedicated Plasma Focus device, named PFMA-3 (Plasma Focus for Medical Applications number 3), has been designed, put into operation, and tested by the research groups of the Universities of Bologna and Ferrara. The very high dose rate (several gray per discharge, in less than 1 µs) is a peculiarity of this device that has to be investigated, as it might modify the relative biological effectiveness (RBE). The aim of this Ph.D. project was to investigate the main physical properties of the low-energy x-ray beams produced by a Plasma Focus device and their potential medical application to IORT treatments. It was necessary to develop the optimal geometrical configuration; to evaluate the x-rays produced and the dose they deposit; to estimate the electron energy spectrum produced in the "pinch phase"; to identify an optimal target for x-ray conversion; to conduct simulations of the physics involved; and, in order to evaluate the radiobiological features of the beam, to develop cell holders for both irradiation and cell growth.

Relevance:

30.00%

Publisher:

Abstract:

Waste management is an important issue in our society, and waste-to-energy incineration plants have played a significant role in recent decades, with growing importance in Europe. One of the main issues posed by waste combustion is the generation of air contaminants. Acid gases, mainly hydrogen chloride and sulfur oxides, are of particular concern due to their potential impact on the environment and on human health. In the present study, the main available technological options for flue gas treatment were therefore analyzed, focusing on dry treatment systems, which are increasingly applied in Municipal Solid Waste (MSW) incinerators. An operational model was proposed to describe and optimize the acid gas removal process. It was applied to an existing MSW incineration plant, where acid gases are neutralized in a two-stage dry treatment system based on the injection of powdered calcium hydroxide and sodium bicarbonate into reactors followed by fabric filters. HCl and SO2 conversions were expressed as functions of the reactant flow rates, with model parameters calculated from literature and plant data. Implementation in process-simulation software allowed the identification of optimal operating conditions, taking into account the reactant feed rates, the amount of solid products, and the recycling of the sorbent. Alternative configurations of the reference plant were also assessed. The applicability of the operational model was then extended by developing a fundamental approach to the problem: a predictive model describing the mass transfer and kinetic phenomena governing acid gas neutralization with solid sorbents. The rate-controlling steps were identified by reproducing literature data, allowing the description of acid gas removal in the case study analyzed. A laboratory device was also designed and commissioned to determine the required model parameters.
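
A minimal sketch of the operational-model idea, under assumptions of our own: removal efficiency is written as an assumed asymptotic function of the stoichiometric ratio, and the sorbent feed meeting an emission limit is found by a simple scan. The functional form, the parameter k, and all numbers are illustrative, not the model calibrated in the thesis.

```python
# Minimal sketch in the spirit of the operational model described above:
# acid-gas conversion expressed as an assumed function of the stoichiometric
# ratio (sorbent fed / sorbent stoichiometrically needed). The functional
# form and parameters are illustrative, not the thesis model.
import math

def conversion(stoich_ratio, k):
    """Assumed asymptotic dependence of removal efficiency on excess sorbent."""
    return 1.0 - math.exp(-k * stoich_ratio)

def required_feed(inlet_mg_nm3, limit_mg_nm3, stoich_feed_kg_h, k):
    """Smallest sorbent feed (kg/h) meeting the emission limit, by a coarse scan."""
    target = 1.0 - limit_mg_nm3 / inlet_mg_nm3     # required conversion
    sr = 1.0
    while conversion(sr, k) < target:
        sr += 0.01
    return sr * stoich_feed_kg_h

# Example: HCl from 1000 to below 10 mg/Nm3, k assumed, stoichiometric feed 50 kg/h
print(required_feed(1000.0, 10.0, 50.0, k=1.5))
```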

Relevance:

30.00%

Publisher:

Abstract:

This dissertation consists of three papers. The first paper, "Managing the Workload: an Experiment on Individual Decision Making and Performance", experimentally investigates how decision-making in workload management affects individual performance. I designed a laboratory experiment to exogenously manipulate the schedule of work faced by each subject and to identify its impact on final performance. Through a mouse click-tracking technique, I also collected behavioral measures of organizational skills. I found that a non-negligible share of individuals performs better under externally imposed schedules than in the unconstrained case. However, such constraints are detrimental to those who are good at self-organizing. The second chapter, "On the allocation of effort with multiple tasks and piecewise monotonic hazard function", tests the optimality of a scheduling model, proposed in a different literature, for the decision problem faced in the experiment. Under specific assumptions, I find that this model identifies the optimal scheduling of the tasks in the Admission Test. The third paper, "The Effects of Scholarships and Tuition Fees Discounts on Students' Performances: Which Monetary Incentives work Better?", explores how different levels of monetary incentives affect the achievement of students in tertiary education. I used a Regression Discontinuity Design to exploit the assignment of different monetary incentives and to study the effect of this liquidity provision on performance outcomes, ceteris paribus. The results show that a monetary increase in the scholarship generates no effect on performance, since the achievements of the recipients cluster near the requirements for not having to return the benefit. Moreover, students who actually pay some share of the total cost of college attendance surprisingly perform better than those whose cost is completely subsidized. A lower benefit, relative to a higher aid, motivates students to finish early and avoid the extra cost of a delayed graduation.
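
For readers unfamiliar with the method, here is a hedged sketch of a sharp Regression Discontinuity estimate on simulated data: the effect is the jump of the outcome at the assignment cutoff, measured as the difference of local linear fits on either side. Variable names, bandwidth, and data are placeholders, not those of the third paper.

```python
# Sharp RDD sketch on simulated data: treatment (e.g., a scholarship) is
# assigned above a cutoff of the running variable, and the effect is the
# discontinuity of the outcome at that cutoff.
import numpy as np

rng = np.random.default_rng(0)
score = rng.uniform(-1, 1, 2000)          # running variable (e.g., merit score)
treated = score >= 0.0                    # benefit assigned above the cutoff
outcome = 0.5 * score + 0.3 * treated + rng.normal(0, 0.2, 2000)

def rdd_effect(x, y, cutoff=0.0, bandwidth=0.3):
    """Difference of local-linear intercepts at the cutoff."""
    def intercept_at_cutoff(mask):
        xs, ys = x[mask] - cutoff, y[mask]
        slope, intercept = np.polyfit(xs, ys, 1)
        return intercept
    left = (x < cutoff) & (x > cutoff - bandwidth)
    right = (x >= cutoff) & (x < cutoff + bandwidth)
    return intercept_at_cutoff(right) - intercept_at_cutoff(left)

print(rdd_effect(score, outcome))  # close to the simulated effect of 0.3
```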

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, we deal with the design of experiments in the drug development process, focusing on the design of clinical trials for treatment comparisons (Part I) and the design of preclinical laboratory experiments for protein development and manufacturing (Part II). In Part I we propose a multi-purpose design methodology for sequential clinical trials. We derive optimal allocations of patients to treatments for testing the efficacy of several experimental groups, while also taking ethical considerations into account. We first consider exponential responses for survival trials and then present a unified framework for heteroscedastic experimental groups that encompasses the general ANOVA set-up. The very good performance of the suggested optimal allocations, in terms of both inferential and ethical characteristics, is illustrated analytically and through several numerical examples, including comparisons with other designs proposed in the literature. Part II concerns the planning of experiments for processes composed of multiple steps in the context of preclinical drug development and manufacturing. Following the Quality by Design paradigm, the objective of the multi-step design strategy is to define the manufacturing design space of the whole process; by accounting for the interactions among the subsequent steps, our proposal ensures the quality and safety of the final product while enabling more flexibility and process robustness in manufacturing.
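
As a purely illustrative baseline (not the allocations derived in the thesis), the snippet below shows how heteroscedasticity enters an optimal design: for a fixed total sample size, allocating patients proportionally to the group standard deviations minimizes the sum of the variances of the estimated group means.

```python
# Classical Neyman-type allocation sketch for heteroscedastic groups with
# assumed (known) standard deviations; placeholder numbers only.
import numpy as np

def neyman_allocation(sigmas, n_total):
    """Group sample sizes proportional to the response standard deviations."""
    sigmas = np.asarray(sigmas, dtype=float)
    weights = sigmas / sigmas.sum()
    return np.round(weights * n_total).astype(int)

# Example: control plus three experimental groups with different variability
print(neyman_allocation([1.0, 1.5, 2.0, 2.5], n_total=280))  # e.g. [40 60 80 100]
```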

Relevance:

30.00%

Publisher:

Abstract:

This PhD project aimed to (i) investigate the effects of three nutritional strategies (supplementation with a synbiotic, a muramidase, or arginine) on the growth performance, gut health, and metabolism of broilers fed without antibiotics under thermoneutral and heat stress conditions, and (ii) explore the impacts of heat stress on hypothalamic regulation of feed intake in three broiler lines from diverse stages of genetic selection and in the red jungle fowl, the ancestor of domestic chickens. The synbiotic improved feed efficiency and footpad health and increased Firmicutes and reduced Bacteroidetes in the ceca of birds kept in thermoneutral conditions, but did not mitigate the impacts of heat stress on growth performance. Under optimal thermal conditions, the muramidase increased final body weight and reduced cumulative feed intake and feed conversion ratio in a dose-dependent way. The highest dose reduced the risk of footpad lesions, cecal alpha diversity, the Firmicutes to Bacteroidetes ratio, and butyrate producers, increased Bacteroidaceae and Lactobacillaceae as well as plasma levels of bioenergetic metabolites, and reduced the levels of pro-oxidant metabolites. The same dose, however, failed to reduce the effects of heat stress on growth performance. Arginine supplementation improved growth rate, final body weight, and feed efficiency, increased plasma levels of arginine and creatine and hepatic levels of creatine and essential amino acids, reduced alpha diversity, Firmicutes, and Proteobacteria (especially Escherichia coli), and increased Bacteroidetes and Lactobacillus salivarius in the ceca of thermoneutral birds. No arginine-mediated attenuation of heat stress was found. Heat stress altered protein metabolism and caused the accumulation of antioxidant and protective molecules in tissues sensitive to oxidative stress. Arginine supplementation, however, may have partially counterbalanced the effects of heat stress on energy homeostasis. Stable gene expression of (an)orexigenic neuropeptides was found in the four chicken populations studied, but responses to hypoxia and heat stress appeared to be related to feed intake regulation.

Relevance:

30.00%

Publisher:

Abstract:

CO2 emissions have been growing constantly since the beginning of the industrial era. Halting production in the major emitting sectors (energy and agriculture) is not a viable option, and reducing all emissions through carbon capture and storage (CCS) is neither economically viable nor widely accepted by the public. It therefore becomes fundamental to take actions such as retrofitting existing infrastructure to employ cleaner resources, modifying current processes to limit emissions, and reducing the emissions already present through direct air capture. The present thesis discusses these aspects in depth with regard to syngas and hydrogen production, since they play a central role in the market for energy and chemicals. Among the strategies discussed, greater emphasis is given to the application of looping technologies and to direct air capture processes, as they have been the main focus of this work. In particular, chemical looping methane reforming to syngas was studied with Aspen Plus thermodynamic simulations, thermogravimetric analysis (TGA) characterization, and testing in a fixed bed reactor. The process was studied cyclically, exploiting the redox properties of a Ce-based oxide oxygen carrier synthesized with a simple forming procedure. The two steps of the looping cycle were studied isothermally at 900 °C and 950 °C with mixtures of 10% CH4 in N2 and 3% O2 in N2 for carrier reduction and oxidation, respectively. During the stay abroad, in collaboration with ETH Zurich, a CO2 capture process using solid amine sorbents was investigated, studying the difference in performance achievable with contactors of different geometry. The process was studied at two concentrations (382 ppm CO2 in N2 and 5.62% CO2 in N2) and at different flow rates, in order to understand the dynamics of the adsorption process and to identify the rate-limiting mass transfer step.
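
For the looping experiments, a worked example of the usual TGA data reduction may help: the degree of reduction of the oxygen carrier is the fraction of removable lattice oxygen released, X = (m_ox − m)/(m_ox − m_red). The sketch below uses placeholder masses and is a generic definition, not the specific data treatment adopted in the thesis.

```python
# Illustrative sketch (placeholder numbers): carrier conversion from the
# TGA mass signal, defined between the fully oxidized and fully reduced masses.
def carrier_conversion(m, m_oxidized, m_reduced):
    """Fractional reduction of the oxygen carrier from the TGA mass signal (mg)."""
    return (m_oxidized - m) / (m_oxidized - m_reduced)

# Example: 100.0 mg fully oxidized, 97.0 mg fully reduced, 98.5 mg measured
print(carrier_conversion(98.5, 100.0, 97.0))  # 0.5 -> carrier half reduced
```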

Relevance:

30.00%

Publisher:

Abstract:

Decarbonization of maritime transport requires immediate action. In the short term, ship weather routing can provide greenhouse gas emission reductions, even for existing ships and without retrofitting them. Weather routing is based on making optimal use of both environmental information and knowledge of vessel seakeeping and performance. Combining them at a state-of-the-art level and carrying out path planning in realistic conditions can be challenging. To address these topics in an open-source framework, this thesis led to the development of a new module called bateau and to its combination with the ship routing model VISIR. bateau includes both hull geometry and propulsion modelling for various vessel types. It has two objectives: to predict the sustained speed in a seaway and to estimate the CO2 emission rate during the voyage. Various semi-empirical approaches were used in bateau to predict the ship's hydrodynamic and aerodynamic resistance in both head and oblique seas. Assuming that the ship sails at a constant engine load, the involuntary speed loss due to waves was estimated. This thesis also attempted to clarify the role played by the actual representation of the sea state; in particular, the influence of the wave steepness parameter was assessed. For ships with a larger superstructure, the wind-added resistance was also estimated. Numerical experiments with bateau were conducted for a medium-size and a large-size containership, a bulk carrier, and a tanker. Simulations of optimal routes were carried out for a feeder containership during voyages in the North Indian Ocean and the South China Sea. Least-CO2 routes were compared to least-distance ones, assessing the relative CO2 savings. Analysis fields from the Copernicus Marine Service were used in the numerical experiments.
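
As a rough, generic sketch of the emission-rate estimate (not bateau's actual implementation): once the sustained speed and the corresponding brake power are known, the CO2 emission rate follows from the specific fuel consumption and a fuel carbon factor. The function name and all numbers below are placeholders.

```python
# Generic CO2 emission-rate estimate from brake power, specific fuel
# consumption (SFOC) and a fuel carbon factor (about 3.1 kg CO2 per kg of
# heavy fuel oil); values are placeholders, not DemoP1/bateau outputs.
def co2_rate_kg_per_h(brake_power_kw, sfoc_g_per_kwh=190.0, carbon_factor=3.114):
    """CO2 emission rate in kg/h from brake power, SFOC and fuel carbon factor."""
    fuel_kg_per_h = brake_power_kw * sfoc_g_per_kwh / 1000.0
    return fuel_kg_per_h * carbon_factor

# Example: a feeder containership running at 15 MW brake power
print(co2_rate_kg_per_h(15_000))  # roughly 8.9 t CO2 per hour
```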

Relevance:

30.00%

Publisher:

Abstract:

Continuum parallel robots (CPRs) are manipulators employing multiple flexible beams arranged in parallel and connected to a rigid end-effector. CPRs promise higher payload and accuracy than serial continuum robots while retaining great flexibility. Since the risk of injury during accidental contact between a human and a CPR is reduced, CPRs may be used in large-scale collaborative tasks or in assisted robotic surgery. Various CPR designs exist, but prototype conception is rarely based on performance considerations, and the realization of CPRs is mainly based on intuition or on rigid-link parallel manipulator architectures. This thesis focuses on the performance analysis of CPRs and on the tools needed for such evaluation, such as workspace computation algorithms. In particular, workspace computation strategies for CPRs are essential for performance assessment, since the CPR workspace may be used as a performance index or may serve in optimal-design tools. Two new workspace computation algorithms are proposed in this manuscript: the former focuses on the computation of the workspace volume and the certification of its numerical results, while the latter aims at computing the workspace boundary only. Due to the elastic nature of CPRs, a key performance indicator for these robots is the stability of their equilibrium configurations. This thesis proposes an experimental validation of the equilibrium stability assessment on a real prototype, demonstrating the limitations of some commonly used assumptions. Additionally, a performance index measuring the distance to instability is originally proposed in this manuscript. Unlike the majority of existing approaches, the clear advantage of the proposed index is its sound physical meaning; accordingly, the index can be used for a more straightforward performance quantification and to derive robot specifications.
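
To make the notion of workspace volume computation concrete, the sketch below shows the simplest possible (uncertified) alternative to the algorithms proposed in the thesis: Monte Carlo sampling of candidate poses with a feasibility test. The predicate is_reachable_and_stable is a placeholder standing in for solving the CPR elastostatic model and checking equilibrium stability.

```python
# Naive Monte Carlo workspace-volume baseline; the thesis proposes certified
# volume and boundary algorithms instead. Everything here is a placeholder.
import numpy as np

def is_reachable_and_stable(p):
    """Placeholder predicate: a real implementation would solve the robot's
    elastostatic equilibrium at pose p and check its stability."""
    return np.linalg.norm(p) <= 0.4  # dummy spherical workspace of radius 0.4 m

def mc_workspace_volume(bounds, n_samples=50_000, rng=np.random.default_rng(1)):
    """Volume of the feasible set inside an axis-aligned box, by Monte Carlo."""
    lo, hi = np.array(bounds[0]), np.array(bounds[1])
    pts = rng.uniform(lo, hi, size=(n_samples, 3))
    frac = np.mean([is_reachable_and_stable(p) for p in pts])
    box_volume = np.prod(hi - lo)
    return frac * box_volume

print(mc_workspace_volume(([-0.5, -0.5, -0.5], [0.5, 0.5, 0.5])))  # ~0.27 m^3
```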

Relevance:

30.00%

Publisher:

Abstract:

Embedded systems are increasingly integral to daily life, improving and facilitating the efficiency of modern Cyber-Physical Systems, which provide access to sensor data and actuators. As modern architectures become increasingly complex and heterogeneous, their optimization becomes a challenging task. Additionally, ensuring platform security is important to avoid harm to individuals and assets. This study primarily addresses challenges in contemporary embedded systems, focusing on platform optimization and security enforcement. The initial section delves into the application of machine learning methods to efficiently determine the optimal number of cores for a parallel RISC-V cluster in order to minimize energy consumption, using static source code analysis. Results demonstrate that automated platform configuration is viable, with only a moderate performance trade-off when relying solely on static features. The second part addresses the problem of heterogeneous device mapping, which involves assigning tasks to the most suitable computational device in a heterogeneous platform for optimal runtime. The contribution of this section lies in the introduction of novel pre-processing techniques, along with a training framework based on Siamese Networks, which enhances the classification performance of DeepLLVM, an advanced approach for task mapping. Importantly, the proposed approaches are independent of the specific deep-learning model used. Finally, this research work addresses issues concerning the binary exploitation of software running on modern embedded systems. It proposes an architecture to implement Control-Flow Integrity in embedded platforms with a Root-of-Trust, aiming to enhance security guarantees with limited hardware modifications. The approach extends the architecture of a modern RISC-V platform for autonomous vehicles with a side-channel communication mechanism that relays control-flow changes executed by the process running on the host core to the Root-of-Trust. This approach has a limited impact on performance and is effective in enhancing the security of embedded platforms.
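
A hedged sketch of the first idea only (not the thesis pipeline): choosing the energy-optimal core count is cast as classification over static source-code features. The features, labels, and the random-forest model below are placeholders.

```python
# Toy classifier mapping static code features to an energy-optimal core count
# among {1, 4, 8}; synthetic data stands in for real profiling results.
from sklearn.ensemble import RandomForestClassifier
import numpy as np

rng = np.random.default_rng(0)
# Placeholder static features: e.g. loop count, arithmetic intensity, memory ops
X = rng.random((300, 3))
# Placeholder labels: energy-optimal core count among {1, 4, 8}
y = np.where(X[:, 1] > 0.5, 8, np.where(X[:, 0] > 0.5, 4, 1))

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:250], y[:250])
print("held-out accuracy:", model.score(X[250:], y[250:]))
new_kernel = [[0.2, 0.8, 0.4]]          # static features of an unseen kernel
print("suggested cores:", model.predict(new_kernel)[0])
```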

Relevance:

30.00%

Publisher:

Abstract:

The main focus of this work is to define a numerical methodology to simulate an aerospike engine and then to analyse the performance of DemoP1, a small aerospike demonstrator built by Pangea Aerospace. The aerospike is a promising solution for building engines more efficient than current ones. Its main advantage is expansion adaptation, which allows it to reach the optimal expansion over a wide range of ambient pressures, delivering more thrust than an equivalent bell-shaped nozzle. The main drawbacks are the design of the cooling system and the manufacturing of the spike, but nowadays these issues seem to be overcome by additive manufacturing. The simulations are performed with dbnsTurbFoam, an OpenFOAM solver designed to simulate supersonic compressible turbulent flow. This work is divided into five chapters. The first is a short introduction. The second gives a brief summary of the theoretical performance of the aerospike. The third introduces the numerical methodology to simulate a compressible supersonic flow. In the fourth chapter, the solver is verified against an experiment found in the literature. In the fifth chapter, the simulations of the DemoP1 engine are illustrated.
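
To illustrate why expansion adaptation pays off, the sketch below compares, with ideal 1D textbook relations and placeholder propellant data, the thrust of a fixed bell nozzle (design exit pressure kept constant) with that of an ideally adapted expansion (exit pressure equal to ambient) at several ambient pressures. It is not derived from the DemoP1 simulations, and all constants are assumptions.

```python
# Ideal-gas, frozen-flow, 1D comparison of a fixed bell nozzle versus an
# ideally adapted (aerospike-like) expansion; placeholder engine data.
import math

GAMMA, R, TC, PC, MDOT = 1.2, 350.0, 3200.0, 4.0e6, 5.0   # assumed propellant/engine

def exit_velocity(pe):
    """Ideal 1D exhaust velocity for expansion from chamber pressure PC to pe."""
    return math.sqrt(2 * GAMMA / (GAMMA - 1) * R * TC *
                     (1 - (pe / PC) ** ((GAMMA - 1) / GAMMA)))

def bell_thrust(pe_design, pa):
    """Fixed bell nozzle: momentum term plus pressure-mismatch term."""
    ve = exit_velocity(pe_design)
    te = TC * (pe_design / PC) ** ((GAMMA - 1) / GAMMA)
    ae = MDOT / (pe_design / (R * te) * ve)          # exit area from continuity
    return MDOT * ve + (pe_design - pa) * ae

def adapted_thrust(pa):
    """Ideally adapted expansion: exit pressure equals ambient pressure."""
    return MDOT * exit_velocity(pa)

for pa in (101_325.0, 50_000.0, 20_000.0):           # sea level to high altitude
    print(pa, round(bell_thrust(30_000.0, pa)), round(adapted_thrust(pa)))
```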