3 results for Data uncertainty
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
The aim of this work is to present a general overview of the state of the art in design for uncertainty, with a focus on aerospace structures. In particular, simulations of an FCCZ lattice cell and of the profile shape of a nozzle will be performed. Optimization under uncertainty is characterized by the need to make decisions without complete knowledge of the problem data. When dealing with a complex, non-linear optimization problem, two main issues arise: the uncertainty in the feasibility of the solution and the uncertainty in the objective function value. In the first part, Design Of Experiments (DOE) methodologies, Uncertainty Quantification (UQ), and optimization under uncertainty will be examined in depth. The second part will show an application of these theories through commercial software. Nowadays, multi-objective optimization of highly non-linear problems can be a powerful tool to approach new concept solutions or to develop cutting-edge designs. In this thesis, an effective improvement has been achieved on a rocket nozzle. Future work could include the introduction of multi-scale modelling, multiphysics approaches, and any strategy useful to simulate the real operating conditions of the studied design as closely as possible.
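A minimal illustration of the two issues above, not taken from the thesis (the symbols $x$ for the design variables, $\xi$ for the uncertain data, $f$ for the objective, and $g_i$ for the constraints are assumed here), is the generic chance-constrained formulation of optimization under uncertainty:
$$
\min_{x}\ \mathbb{E}_{\xi}\!\left[f(x,\xi)\right]
\quad \text{s.t.} \quad
\mathbb{P}_{\xi}\!\left(g_i(x,\xi)\le 0\right)\ \ge\ 1-\varepsilon_i,
\qquad i=1,\dots,m,
$$
where the expectation captures the uncertainty in the objective value, the probabilistic constraints capture the uncertainty in the feasibility of the solution, and $\varepsilon_i$ are the accepted violation probabilities.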
Abstract:
Hadrontherapy is a medical treatment based on the use of charged particle beams accelerated towards deep-seated tumors in clinical patients. The reason why it is increasingly used is the favorable depth-dose profile following the Bragg peak distribution, where the dose release is sharply focused near the end of the beam path. However, nuclear interactions between the beam and the constituents of the human body occur, generating nuclear fragments which modify the dose profile. To overcome the lack of experimental data on nuclear fragmentation reactions in the energy range of hadrontherapy interest, the FOOT (FragmentatiOn Of Target) experiment has been conceived with the main aim of measuring differential nuclear fragmentation cross sections with an uncertainty lower than 5\%. The same results are also of great interest in the radioprotection field, which studies similar processes. Long-term human missions beyond Earth's orbit are being planned for the coming years, among which NASA's foreseen travel to Mars, and it is fundamental to protect astronauts' health and electronics from radiation exposure. In this thesis, a first analysis of the data taken at GSI with a beam of $^{16}O$ at 400 $MeV/u$ impinging on a graphite ($C$) target will be presented, showing the first preliminary results for the elemental cross sections and the angular differential cross sections. A Monte Carlo dataset was first studied to test the performance of the tracking reconstruction algorithm and to check the reliability of the full analysis chain, from hit reconstruction to cross section measurement. A high level of agreement was found between generated and reconstructed fragments, thus validating the adopted procedure. A preliminary experimental cross section was measured and compared with the MC results, highlighting good consistency for all the fragments.
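As a hedged sketch of how such a measurement is typically expressed (the notation below is assumed for illustration and is not taken from the thesis), the elemental fragmentation cross section for fragments of charge $Z$ can be written as
$$
\sigma(Z) \simeq \frac{Y(Z)}{N_{\mathrm{beam}}\, N_{\mathrm{TG}}\, \epsilon(Z)},
\qquad
N_{\mathrm{TG}} = \frac{\rho\, d\, N_A}{A},
$$
where $Y(Z)$ is the number of reconstructed fragments, $N_{\mathrm{beam}}$ the number of primary ions, $\epsilon(Z)$ the detection and reconstruction efficiency (which can be estimated from the Monte Carlo), and $N_{\mathrm{TG}}$ the number of target nuclei per unit surface, with $\rho$, $d$ and $A$ the density, thickness and mass number of the graphite target and $N_A$ Avogadro's number.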
Abstract:
The increasing number of extreme rainfall events, combined with the high population density and the imperviousness of the land surface, makes urban areas particularly vulnerable to pluvial flooding. In order to design and manage cities so that they can deal with this issue, the reconstruction of weather phenomena is essential. Among the most promising data sources are the observational networks of private sensors managed by citizens (crowdsourcing). The number of these personal weather stations is steadily increasing, and their spatial distribution roughly follows population density. Precisely for this reason, they are particularly well suited to this detailed study on the modelling of pluvial flooding in urban environments. The uncertainty associated with these precipitation measurements is still a matter of research. In order to characterise the accuracy and precision of the crowdsourced data, we carried out exploratory data analyses. A comparison between Netatmo hourly precipitation amounts and observations of the same quantity from weather stations managed by national weather services is presented. The crowdsourced stations show very good skill in rain detection but tend to underestimate the reference value. In detail, the accuracy and precision of the crowdsourced data change as precipitation increases, with the spread improving towards the extreme values. Then, the ability of this kind of observation to improve the prediction of pluvial flooding is tested. To this aim, the simplified raster-based inundation model incorporated in the Saferplaces web platform is used for simulating pluvial flooding. Different precipitation fields have been produced and tested as input to the model. Two different case studies are analysed over the most densely populated Norwegian city, Oslo. The crowdsourced weather station observations, bias-corrected (i.e. increased by 25%), showed very good skill in detecting flooded areas.
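As a minimal sketch of the bias correction mentioned above (the symbol $P$ is assumed here for the hourly precipitation amounts and is not taken from the thesis), the 25% adjustment applied to the crowdsourced observations before they are fed to the inundation model amounts to
$$
P_{\mathrm{corrected}} = (1 + 0.25)\, P_{\mathrm{Netatmo}} = 1.25\, P_{\mathrm{Netatmo}},
$$
compensating for the systematic underestimation of the crowdsourced stations with respect to the reference observations.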