24 results for ASSESSMENT MODELS
Abstract:
Climate change is one of the main problems recognized in the Sustainable Development Goals and in sustainable agriculture objectives. Farming contributes significantly to overall greenhouse gas (GHG) emissions, accounting for approximately 10-12 percent of the total; when land-use change is also taken into consideration, including deforestation driven by agricultural expansion for food, fiber and fuel, the figure rises to approximately 30 percent (Smith et al., 2007). There are two distinct methodological approaches to environmental impact assessment: Life Cycle Assessment (a bottom-up approach) and Input-Output Analysis (a top-down approach). The two methodologies differ significantly, and neither is the obvious choice when the scope of the study is sectoral; as an alternative, hybrid approaches that combine the two have emerged. The aim of this study is to analyze in greater detail the agricultural sector's contribution to climate change caused by the consumption of food products: to identify the food products with the greatest impact over their life cycle, to locate their hotspots, and to evaluate the corresponding mitigation possibilities, while at the same time evaluating the methodological possibilities and models that can be applied for this purpose both at the EU level and at the country level (Italy).
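As a point of reference for the top-down side, the environmentally extended input-output relationship below shows how GHG emissions embodied in final food demand are typically computed; the notation is the standard Leontief one and is given as an illustration of the model class, not as the exact hybrid formulation adopted in the study.

$$ g = F\,(I - A)^{-1}\, y $$

Here $A$ is the matrix of inter-industry technical coefficients, $y$ the final demand vector (food consumption), $(I - A)^{-1}$ the Leontief inverse giving the total sectoral output required to satisfy $y$, and $F$ the row vector of direct GHG emission intensities per unit of output. Hybrid approaches replace selected coefficients with process-level LCA data.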
Abstract:
Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), which arises when the spatial supports of different data sources do not coincide, is tackled in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region of Italy is characterized by an abundance of zero values and by the right-skewness of the distribution of positive amounts. Direct rain gauge measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar on the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two steps are joined in a two-part semicontinuous model. Three model specifications that address COSP differently are presented; in particular, one employs a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid. Estimation is performed via MCMC procedures implemented in C and linked to the R software. The communication and evaluation of probabilistic, point and interval predictions are investigated. A non-randomized PIT histogram is proposed for correctly assessing the calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (Reliability Plot, Sharpness Histogram, PIT Histogram, Brier Score Plot and Quantile Decomposition Plot), proper scoring rules (Brier Score, Continuous Ranked Probability Score) and consistent scoring functions (Root Mean Square Error and Mean Absolute Error, addressing the predictive mean and median, respectively). Calibration is reached, and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
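The two-part semicontinuous structure can be summarized as follows; the covariate form and the latent effects are illustrative of the model class described above, not the thesis's exact specification. Writing $Y(s)$ for hourly rainfall at location $s$ and $R(s)$ for the associated radar value,

$$ \Pr\{Y(s) > 0\} = \Phi\big(\alpha_0 + \alpha_1 \log R(s) + w_1(s)\big), $$
$$ Y(s) \mid Y(s) > 0 \sim \mathrm{Gamma}\big(\kappa,\, \kappa/\mu(s)\big), \qquad \log \mu(s) = \beta_0 + \beta_1 \log R(s) + w_2(s), $$

where $\Phi$ is the standard normal CDF (the probit link) and $w_1(s)$, $w_2(s)$ are spatially correlated Gaussian processes capturing the residual information.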
Abstract:
Uncertainty in the determination of the stratigraphic profile of natural soils is one of the main problems in geotechnics, in particular for landslide characterization and modeling. This study deals with a new approach to geotechnical modeling that relies on the stochastic generation of different soil layer distributions following a Boolean logic; the method has thus been called BoSG (Boolean Stochastic Generation). In this way, it is possible to randomize the presence of a specific material interdigitated in a uniform matrix. When building a geotechnical model it is common to discard some stratigraphic data in order to simplify the model itself, assuming that the significance of the modeling results would not be affected. With the proposed technique it is possible to quantify the error associated with this simplification. Moreover, it can be used to determine the most significant zones, where possible further investigations and surveys would be most effective in constraining the geotechnical model of the slope. The commercial software FLAC was used for the 2D and 3D geotechnical models. The distribution of the materials was randomized through a specifically coded MATLAB program that automatically generates text files, each representing a specific soil configuration; a further routine was designed to automate the FLAC computations over the different data files in order to maximize the sample size. The methodology is applied to a simplified slope in 2D, a simplified slope in 3D and an actual landslide, namely the Mortisa mudslide (Cortina d'Ampezzo, BL, Italy). It could, however, be extended to numerous other cases, especially for hydrogeological analyses and landslide stability assessments in different geological and geomorphological contexts.
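A minimal sketch of the Boolean generation step is given below in Python (the thesis used a MATLAB program writing FLAC input files); grid size, inclusion geometry and file naming are hypothetical placeholders for the actual implementation.

```python
# Minimal BoSG-style sketch: randomly place elliptical inclusions of a
# secondary material inside a uniform soil matrix and write one text file
# per realization. All parameters are illustrative.
import numpy as np

def generate_realization(nx=100, nz=40, n_lenses=5, rng=None):
    rng = rng or np.random.default_rng()
    grid = np.zeros((nz, nx), dtype=int)            # 0 = matrix material
    z, x = np.mgrid[0:nz, 0:nx]
    for _ in range(n_lenses):
        cx, cz = rng.uniform(0, nx), rng.uniform(0, nz)   # lens center
        ax, az = rng.uniform(5, 20), rng.uniform(1, 4)    # elongated lens axes
        grid[((x - cx) / ax) ** 2 + ((z - cz) / az) ** 2 <= 1.0] = 1
    return grid

rng = np.random.default_rng(42)
for i in range(20):                                 # sample of realizations
    np.savetxt(f"bosg_config_{i:03d}.txt",
               generate_realization(rng=rng), fmt="%d")
```

Each output file encodes one soil configuration; batching many such files through the solver is what allows the error introduced by stratigraphic simplification to be quantified statistically.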
Abstract:
Coastal flooding poses serious threats to coastal areas around the world: it causes billions of dollars in damage to property and infrastructure and threatens the lives of millions of people. Disaster management and risk assessment therefore aim at detecting vulnerabilities and capacities in order to reduce coastal flood disaster risk. In particular, non-specialized researchers, emergency management personnel, and land use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea level rise associated with climate change. This study contributes to the spatial evaluation and mapping of social, economic and environmental vulnerability and risk at the sub-national scale through the development of appropriate tools and methods, successfully embedded in a Web-GIS Decision Support System (DSS). A new set of raster-based models was studied and developed so that it could be easily implemented in the Web-GIS framework, with the purpose of quickly assessing and mapping flood hazard characteristics, damage and vulnerability in a multi-criteria approach. The Web-GIS DSS is developed using open-source software and programming languages, and its main strength is that it is available and usable by coastal managers and land use planners without requiring an advanced scientific background in hydraulic engineering. The effectiveness of the system for coastal risk assessment is evaluated through its application to a real case study.
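A minimal sketch of the raster-based multi-criteria logic is given below in Python with NumPy; the layer names, normalization and weights are hypothetical, since the actual models are embedded in the Web-GIS DSS.

```python
# Minimal sketch: weighted multi-criteria overlay of normalized raster layers
# to obtain a coastal flood risk map. Weights and layers are illustrative.
import numpy as np

def normalize(layer):
    """Rescale a raster layer to [0, 1]."""
    lo, hi = np.nanmin(layer), np.nanmax(layer)
    return (layer - lo) / (hi - lo)

def risk_map(flood_depth, exposure, vulnerability, weights=(0.5, 0.25, 0.25)):
    """Risk as a weighted combination of hazard, exposure and vulnerability."""
    layers = [normalize(l) for l in (flood_depth, exposure, vulnerability)]
    return sum(w * l for w, l in zip(weights, layers))

# Toy 100x100 rasters standing in for real grids served by the Web-GIS.
rng = np.random.default_rng(0)
depth, expo, vuln = (rng.random((100, 100)) for _ in range(3))
print(risk_map(depth, expo, vuln).round(2))
```

Keeping the models as simple per-pixel raster algebra is what makes them cheap to serve interactively from a web platform to non-specialist users.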
Abstract:
A possible future scenario for the application of water injection (WI) as an advanced strategy for modern GDI engines has been explored. The aim is to verify whether PWI (Port Water Injection) and DWI (Direct Water Injection) architectures can replace current fuel enrichment strategies to limit turbine inlet temperatures (TiT) and the engine's knock tendency. In this way, it might be possible to extend the stoichiometric mixture condition over the entire engine map, meeting possible future restrictions on the use of AES (Auxiliary Emission Strategies) and future emission limits. The research was first addressed through a comprehensive assessment of the state of the art of the technology and of the main effects of the chemical-physical properties of water. Then, detailed chemical kinetics simulations were performed to compute the effects of WI on combustion development and auto-ignition; the latter represents an important methodological step towards accurate numerical combustion simulations. Water injection was then analysed in detail for a PWI system through an experimental campaign of macroscopic and microscopic injector characterization inside a test chamber. The collected data were used for a numerical validation of the spray models, obtaining excellent agreement in terms of particle size and droplet velocity distributions. Finally, a wide range of three-dimensional CFD simulations of a virtual high-bmep engine were carried out and compared, also exploring different engine designs and water/fuel injection strategies under non-reacting and reacting flow conditions. These showed that, thanks to the introduction of water, both PWI and DWI systems could increase the target performance and optimize the bsfc (brake specific fuel consumption) while lowering the engine knock risk; the TiT target, however, was achieved only with difficulty and only for one DWI configuration.
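To make the kinetics step concrete, the sketch below computes constant-volume ignition delays at increasing water dilution with Cantera; it uses methane via GRI-Mech 3.0 as a stand-in fuel and illustrative conditions, whereas the study employed gasoline-relevant mechanisms and engine-representative conditions.

```python
# Minimal sketch: effect of water dilution on constant-volume ignition delay.
# GRI-Mech 3.0 (methane) is a stand-in mechanism; conditions are illustrative.
import cantera as ct

def ignition_delay(water_mole_frac, T0=1000.0, p0=40 * ct.one_atm):
    gas = ct.Solution("gri30.yaml")
    # Stoichiometric CH4/air, then diluted with a given mole fraction of water.
    gas.set_equivalence_ratio(1.0, "CH4", "O2:1.0, N2:3.76")
    X = {k: v * (1.0 - water_mole_frac)
         for k, v in gas.mole_fraction_dict().items()}
    X["H2O"] = X.get("H2O", 0.0) + water_mole_frac
    gas.TPX = T0, p0, X
    reactor = ct.IdealGasReactor(gas)
    sim = ct.ReactorNet([reactor])
    t, states = 0.0, ct.SolutionArray(gas, extra=["t"])
    while t < 0.2:                                  # integrate up to 200 ms
        t = sim.step()
        states.append(reactor.thermo.state, t=t)
    # Ignition delay taken as the time of maximum temperature rise rate.
    dTdt = (states.T[1:] - states.T[:-1]) / (states.t[1:] - states.t[:-1])
    return states.t[:-1][dTdt.argmax()]

for x_h2o in (0.0, 0.1, 0.2):
    print(f"X_H2O = {x_h2o:.1f}: tau_ign = {ignition_delay(x_h2o):.4e} s")
```

Sweeps of this kind quantify how water's dilution and heat-capacity effects lengthen auto-ignition delays, which is the input needed by the knock models used in the subsequent 3D CFD simulations.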
Abstract:
The growing interest in constellations of small, less expensive satellites is bringing space debris and traffic management to the attention of the space community. At the same time, the continuous quest for more efficient propulsion systems puts the spotlight on electric (low-thrust) propulsion as an appealing solution for collision avoidance. Starting with an overview of the current techniques for conjunction assessment and avoidance, we highlight the problems that can arise when low-thrust propulsion is used. The conducted simulations show the need for an accurate propagation model. Thus, aiming at propagation models with a low computational burden, we study the models available in the literature and propose an analytical alternative to improve propagation accuracy. The model is then tested in the particular case of a tangential maneuver. Results show that the proposed solution significantly improves on state-of-the-art methods and is a good candidate for use in collision avoidance operations, for instance to propagate satellite uncertainty or to optimize avoidance maneuvers when a conjunction occurs within a few (3-4) orbits of the measurement time.
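As a first-order illustration of why the propagation model matters, consider a small continuous tangential acceleration $f_t$ applied to a near-circular orbit with mean motion $n$ and semi-major axis $a$; this is the standard secular argument from the Gauss variational equations, not the analytical model proposed in the thesis.

$$ \dot a = \frac{2 f_t}{n}, \qquad \delta n(t) = -\frac{3n}{2a}\,\delta a(t) = -\frac{3 f_t}{a}\, t, $$

so the along-track displacement accumulated with respect to the unperturbed trajectory is, to leading secular order,

$$ \delta s(t) \approx a \int_0^t \delta n(\tau)\,\mathrm{d}\tau = -\frac{3}{2}\, f_t\, t^2 . $$

The quadratic growth in time explains why even small errors in modeling the thrust quickly degrade conjunction predictions over the 3-4 orbits of interest.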
Abstract:
The rate of diagnosis and treatment of degenerative spine disorders is increasing, and with it the need for surgical intervention. Posterior spine fusion is one surgical intervention used to treat various spine degeneration pathologies. To minimize the risk of complications and provide patients with positive outcomes, preoperative planning and postsurgical assessment are necessary. This PhD project investigated techniques for the surgical planning and assessment of spine surgeries. Three main techniques were assessed: stereophotogrammetric motion analysis, 3D printing of complex spine deformities, and finite element analysis of the thoracolumbar spine. After a review of the literature on currently available spine kinematics protocols, a comprehensive motion analysis protocol to measure multi-segmental spine motion was developed. Using this protocol, the patterns of spine motion in patients before and after posterior spine fixation were mapped. The second part investigated the use of virtual and 3D-printed spine models for the surgical planning of complex spine deformity correction. Compared to the usual radiographic images, the printed model allowed optimal surgical intervention, reduced surgical time, and improved surgeon-patient communication. The third part assessed the use of polyetheretherketone rods, auxiliary to titanium rods, to reduce the stiffness of posterior spine fusion constructs. In a finite element model of the thoracolumbar spine, this rod system showed a decrease in the overall stress of the uppermost instrumented vertebra when compared to regular fixation approaches. Finally, a retrospective biomechanical assessment of a lumbopelvic reconstruction technique was carried out to assess the patients' gait following surgery, the deformation of the implant over the years, and the extent of bony fusion between spine and implant. In conclusion, this thesis highlighted the need to provide surgeons with new planning and assessment techniques to better understand postsurgical complications. The methodologies investigated in this project can be used in the future to establish a patient-specific planning protocol.
Abstract:
The great challenges of today put great pressure on the food chain to provide safe and nutritious food that meets regulations and consumer health standards. In this context, Risk Analysis is used to produce an estimate of the risks to human health and to identify and implement effective risk-control measures. The aims of this work were 1) to describe how quantitative risk assessment (QRA) is used to evaluate risks to consumer health, 2) to address the methodology for obtaining models to apply in quantitative microbial risk assessment (QMRA), and 3) to evaluate solutions to mitigate risk. The application of a quantitative chemical risk assessment (QCRA) to the Italian milk industry enabled the assessment of aflatoxin M1 exposure and its impact on different population categories, and the comparison of risk-mitigation strategies. The results highlighted the most sensitive population categories and showed how more stringent sampling plans reduce risk. The application of a QMRA to Spanish fresh cheeses showed how contamination of this product with Listeria monocytogenes may generate a risk for consumers. Two risk-mitigation actions were evaluated, i.e. reducing the shelf life and the domestic refrigerator temperature, both proving effective in reducing the risk of listeriosis. A description of the protocols most commonly applied for data generation in predictive model development was provided, to increase transparency and reproducibility and to provide the means for better QMRA. The development of a linear regression model describing the fate of Salmonella spp. in Italian salami during the production process and high pressure processing (HPP) was described. Alkaline electrolyzed water was evaluated for its potential to reduce microbial loads on working surfaces, with results showing its effectiveness. This work showed the relevance of QRA, predictive microbiology, and new technologies for ensuring food safety in a more integrated way. Filling data gaps, developing better models, and including new risk-mitigation strategies may lead to improvements in the presented QRAs.
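A minimal Monte Carlo sketch of the exposure and dose-response core of a QMRA is given below in Python; the distributions, the exponential dose-response parameter r, and the size of the mitigation effect are hypothetical placeholders, not the values estimated in this work.

```python
# Minimal QMRA sketch: Monte Carlo exposure assessment coupled with an
# exponential dose-response model. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo iterations

def risk_per_serving(log_reduction=0.0):
    # Concentration at consumption (log10 CFU/g), lowered by a mitigation
    # such as a shorter shelf life or colder domestic storage.
    log_conc = rng.normal(loc=-1.0, scale=1.5, size=N) - log_reduction
    serving_g = rng.triangular(20, 50, 100, size=N)   # serving size (g)
    dose = (10.0 ** log_conc) * serving_g             # ingested CFU
    r = 1e-9                                          # exponential D-R parameter
    return np.mean(1.0 - np.exp(-r * dose))           # mean P(illness)/serving

print(f"baseline risk:   {risk_per_serving():.3e}")
print(f"with mitigation: {risk_per_serving(log_reduction=1.0):.3e}")
```

Running the same simulation under baseline and mitigation scenarios is what allows risk-mitigation strategies, such as stricter sampling plans or reduced shelf life, to be compared on the common scale of risk per serving.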
Abstract:
Pain is a highly complex phenomenon involving intricate neural systems whose interactions with other physiological mechanisms are not fully understood. Standard pain assessment methods, which rely on verbal communication, often fail to provide reliable and accurate information, which poses a critical challenge in the clinical context. In the era of ubiquitous and inexpensive physiological monitoring, coupled with the advancement of artificial intelligence, these new tools appear as natural candidates to address such a challenge. This thesis aims to conduct experimental research to develop digital biomarkers for pain assessment. After an overview of the state of the art in pain neurophysiology and assessment tools, methods for appropriately conditioning physiological signals and controlling confounding factors are presented. The thesis focuses on three different pain conditions: cancer pain, chronic low back pain, and pain experienced by patients undergoing neurorehabilitation. The approach presented in this thesis has shown promise, but further studies are needed to confirm and strengthen the results. Prior to developing any models, a preliminary signal quality check is essential, along with the inclusion of personal and health information in the models to limit their confounding effects. A multimodal approach is preferred for better performance, although unimodal analyses have revealed interesting aspects of the pain experience. This approach can enrich the routine clinical pain assessment procedure by enabling pain to be monitored when and where it is actually experienced, and without requiring explicit communication. This would improve the characterization of the pain experience, aid in the personalization of antalgic therapy, and bring timely relief, with the ultimate goal of improving the quality of life of patients suffering from pain.