9 results for Strategy and tactics
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The world currently faces a paradox in terms of accessibility for people with disabilities. While digital technologies hold immense potential to improve their quality of life, the majority of web content still exhibits critical accessibility issues. This PhD thesis addresses this challenge through two interconnected research branches. The first introduces a groundbreaking approach to improving web accessibility by rethinking how it is approached, making it more accessible itself. It involves the development of: 1. AX, a declarative framework of web components that enforces the generation of accessible markup by means of static analysis. 2. An innovative accessibility testing and evaluation methodology, which communicates test results by exploiting concepts developers are already familiar with (visual rendering and mouse operability) to convey the accessibility of a page. This methodology is implemented through the SAHARIAN browser extension. 3. A11A, a categorized and structured collection of curated accessibility resources aimed at helping their intended audiences discover and use them. The second branch focuses on unleashing the full potential of digital technologies to improve accessibility in the physical world. The thesis proposes the SCAMP methodology to make scientific artifacts accessible to blind and visually impaired individuals as well as the general public. It enhances the natural characteristics of objects, making them more accessible through interactive, multimodal, and multisensory experiences. Additionally, the prototype of A11YVT, a system supporting accessible virtual tours, is presented. It provides blind and visually impaired individuals with the features necessary to explore unfamiliar indoor environments, while maintaining universal design principles that make it suitable for use by the general public.
The thesis extensively discusses the theoretical foundations, design, development, and unique characteristics of these innovative tools. Usability tests with the intended target audiences demonstrate the effectiveness of the proposed artifacts, suggesting their potential to significantly improve the current state of accessibility.
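As an illustration of the kind of static check such accessibility tooling performs, the sketch below flags <img> elements lacking an alt attribute. This is a minimal, hypothetical example built on Python's standard html.parser, not the actual AX framework, which works on declarative web components rather than raw HTML.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects positions of <img> tags with no alt attribute.
    An explicitly empty alt="" is allowed: it marks decorative images."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and dict(attrs).get("alt") is None:
            # getpos() -> (line, column) of the offending tag
            self.violations.append(self.getpos())

def check_markup(html: str) -> list:
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations
```

A real checker would cover many more WCAG criteria (labels, roles, contrast), but the principle is the same: violations are detected from the markup alone, before the page ever reaches a user.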
Abstract:
Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, due to the severe human and economic losses that floods, and waters in general, may cause. Floods are natural, often catastrophic phenomena and cannot be avoided, but their damages can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty on what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of forecasting systems is not to reproduce the river behavior: that is only a means for reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since the literature is often confused on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on the evaluation of the model prediction, i.e. its ability to represent reality, or on the evaluation of what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making it possible to carry out objective and realistic risk evaluations. In particular, such a tool should provide an uncertainty assessment that is as accurate as possible. This primarily means three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability distribution of the flooding time.
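The last requirement can be sketched as follows, assuming an ensemble of forecast water levels is already available (the thesis combines deterministic forecasts probabilistically; here the ensemble is simply given). The sketch estimates the flooding probability within the horizon and the conditional distribution of the flooding time.

```python
import numpy as np

def flood_probability(ensemble, threshold):
    """
    ensemble: array (n_members, n_steps) of forecast water levels,
              one row per ensemble member, one column per lead time.
    threshold: flooding level.
    Returns (p_flood, p_time):
      p_flood -- probability that the level exceeds the threshold
                 at some time within the forecast horizon;
      p_time  -- per-step distribution of the first exceedance time,
                 conditional on flooding occurring.
    """
    ensemble = np.asarray(ensemble, dtype=float)
    exceeds = ensemble >= threshold            # boolean (members, steps)
    any_flood = exceeds.any(axis=1)
    p_flood = any_flood.mean()
    # First exceedance step for each flooding member (-1 = no flood)
    first = np.full(ensemble.shape[0], -1)
    flooded = np.where(any_flood)[0]
    first[flooded] = exceeds[flooded].argmax(axis=1)
    p_time = np.array([(first == t).mean()
                       for t in range(ensemble.shape[1])])
    if p_flood > 0:
        p_time = p_time / p_flood              # condition on flooding
    return p_flood, p_time
```

Comparing the mass of p_time falling inside the intervention lead time against a tolerable risk level is then what supports the go/no-go decision described above.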
Abstract:
The general objective of this research is to explore theories and methodologies from the sustainability indicators, environmental management and decision making disciplines with the operational purpose of producing scientific, robust and relevant information to support system understanding and decision making in real case studies. Several tools have been applied in order to increase the understanding of socio-ecological systems as well as to provide relevant information on the choice between alternatives. These tools have always been applied keeping in mind the complexity of the issues and the uncertainty tied to the partial knowledge of the systems under study. Two case studies with specific application to performance measurement (environmental performance in the case of the K8 approach and sustainable development performance in the case of the EU Sustainable Development Strategy), and a case study on the selection of sustainable development indicators among Municipalities in Scotland, are discussed in the first part of the work. In the second part, the common denominator among the subjects is the application of spatial indices and indicators to address operational problems in land use management within the territory of the Ravenna province (Italy). The main conclusion of the thesis is that a ‘perfect’ methodological approach which always produces the best results in assessing sustainability performance does not exist. Rather, there is a pool of correct approaches answering different evaluation questions, to be used when the methodology fits the purpose of the analysis. For this reason, methodological limits and conceptual assumptions, as well as the consistency and transparency of the assessment, become the key factors for assessing the quality of the analysis.
Abstract:
The objective of this thesis work is the refined estimation of source parameters. To this purpose we used two different approaches, one in the frequency domain and the other in the time domain. In the frequency domain, we analyzed the P- and S-wave displacement spectra to estimate spectral parameters, that is, corner frequencies and low-frequency spectral amplitudes. We used a parametric modeling approach combined with a multi-step, non-linear inversion strategy that includes the correction for attenuation and site effects. The iterative multi-step procedure was applied to about 700 microearthquakes in the moment range 10^11-10^14 N·m recorded at the dense, wide-dynamic-range seismic networks operating in the Southern Apennines (Italy). The analysis of source parameters is often complicated when we are not able to model the propagation accurately. In this case the empirical Green function approach is a very useful tool to study seismic source properties. In fact, Empirical Green Functions (EGFs) allow representing the contribution of propagation and site effects to the signal without using approximate velocity models. An EGF is a recorded three-component set of time histories of a small earthquake whose source mechanism and propagation path are similar to those of the master event. Thus, in the time domain, the deconvolution method of Vallée (2004) was applied to calculate the relative source time functions (RSTFs) and to accurately estimate source size and rupture velocity. This technique was applied to 1) a large event, the Mw 6.3 2009 L’Aquila mainshock (Central Italy); 2) moderate events, a cluster of earthquakes of the 2009 L’Aquila sequence with moment magnitudes between 3 and 5.6; and 3) a small event, the Mw 2.9 Laviano mainshock (Southern Italy).
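The corner-frequency/flat-level fit can be sketched with the standard omega-square (Brune-type) spectral model. This toy least-squares fit omits the attenuation and site-effect corrections and the multi-step inversion strategy described above; it only illustrates the core spectral parametrization.

```python
import numpy as np
from scipy.optimize import curve_fit

def brune_spectrum(f, omega0, fc):
    """Omega-square source model: flat level omega0 below the
    corner frequency fc, f^-2 decay above it."""
    return omega0 / (1.0 + (f / fc) ** 2)

def fit_source_spectrum(freqs, amps):
    """Fit (omega0, fc) to an observed displacement spectrum.
    Fitting log-amplitudes with log-parameters keeps both values
    positive and balances low- and high-frequency samples."""
    freqs = np.asarray(freqs, dtype=float)
    amps = np.asarray(amps, dtype=float)

    def log_model(f, ln_omega0, ln_fc):
        return ln_omega0 - np.log1p((f / np.exp(ln_fc)) ** 2)

    p0 = (np.log(amps[0]), np.log(np.median(freqs)))  # crude start
    popt, _ = curve_fit(log_model, freqs, np.log(amps), p0=p0)
    return np.exp(popt[0]), np.exp(popt[1])           # (omega0, fc)
```

From omega0 one derives the seismic moment, and from fc the source radius and stress drop, which is why these two parameters are the targets of the inversion.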
Abstract:
Nanotechnology entails the manufacturing and manipulation of matter at length scales ranging from single atoms to micron-sized objects. The ability to address properties on the biologically relevant nanometer scale has made nanotechnology attractive for nanomedicine. This is perceived as a great opportunity in healthcare, especially in diagnostics, therapeutics and, more generally, in the development of personalized medicine. Nanomedicine has the potential to enable early detection and prevention, and to improve diagnosis, mass screening, treatment and follow-up of many diseases. From the biological standpoint, nanomaterials match the typical size of naturally occurring functional units or components of living organisms and, for this reason, enable more effective interaction with biological systems. Nanomaterials have the potential to influence functionality and cell fate in the regeneration of organs and tissues. To this aim, nanotechnology provides an arsenal of techniques for intervening in, fabricating, and modulating the environment where cells live and function. Unconventional micro- and nano-fabrication techniques allow patterning biomolecules and biocompatible materials down to feature sizes of a few nanometers. Patterning is not simply the deterministic placement of a material; in a broader sense, it allows the controlled fabrication of structures and gradients of different nature. Gradients are emerging as one of the key factors guiding cell adhesion, proliferation, migration and, in the case of stem cells, even differentiation. The main goal of this thesis has been to devise a nanotechnology-based strategy and tools to spatially and temporally control biologically relevant phenomena in vitro which are important in some fields of medical research.
Abstract:
In this Thesis a series of numerical models for the evaluation of the seasonal performance of reversible air-to-water heat pump systems coupled to residential and non-residential buildings is presented. Exploiting the energy saving potential linked to the adoption of heat pumps is a hard task for designers, because their energy performance is influenced by several factors, such as external climate variability, the heat pump modulation capacity, the system control strategy and the hydronic loop configuration. The aim of this work is to study all these aspects in detail. In the first part of this Thesis a series of models which use a temperature class approach for the prediction of the seasonal performance of reversible air source heat pumps is shown. An innovative methodology for the calculation of the seasonal performance of an air-to-water heat pump has been proposed as an extension of the procedure reported in the European standard EN 14825. This methodology can be applied not only to air-to-water single-stage heat pumps (On-off HPs) but also to multi-stage (MSHPs) and inverter-driven units (IDHPs). In the second part, dynamic simulation has been used with the aim of optimizing the control systems of the heat pump and of the HVAC plant. A series of dynamic models, developed by means of TRNSYS, is presented to study the behavior of On-off HPs, MSHPs and IDHPs. The main goal of these dynamic simulations is to show how the heat pump control strategies and the layout of the hydronic loop coupling the heat pump to the emitters influence the seasonal performance of the system. A particular focus is given to the modeling of the energy losses linked to on-off cycling.
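The temperature class logic behind a seasonal performance figure can be sketched as a bin-method calculation in the spirit of EN 14825's SCOP. This is an illustrative reduction, not the extended methodology proposed in the thesis, and it neglects the on-off cycling losses that the dynamic models capture.

```python
def seasonal_cop(bins):
    """
    bins: list of (hours, heat_demand_kW, cop) tuples, one per
    outdoor-temperature class; cop is the unit's coefficient of
    performance at that class's conditions.
    Seasonal COP = total heat delivered / total electricity absorbed.
    """
    heat = sum(h * q for h, q, _ in bins)            # kWh delivered
    elec = sum(h * q / cop for h, q, cop in bins)    # kWh absorbed
    return heat / elec

# Illustrative three-class season (hours, demand kW, COP at class)
example = [(300, 4.0, 4.2), (500, 7.0, 3.4), (150, 10.0, 2.6)]
scop = seasonal_cop(example)
```

Because each class is weighted by its hours and load, mild classes with high COP dominate the result, which is why the matching between modulation capacity and part-load demand matters so much for On-off HPs versus IDHPs.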
Abstract:
In the last decades the automotive sector has seen a technological revolution, due mainly to more restrictive regulations, newly introduced technologies and, lastly, to the dwindling fossil fuel resources remaining on Earth. Promising solutions for vehicle propulsion are represented by alternative architectures and energy sources, for example fuel cells and pure electric vehicles. The automotive transition to new, green vehicles is passing through the development of hybrid vehicles, which usually combine the positive aspects of each technology. To fully exploit the potential of hybrid vehicles, however, it is important to manage the powertrain's degrees of freedom in the smartest way possible, otherwise hybridization would be worthless. To this aim, this dissertation focuses on the development of energy management strategies and predictive control functions. Such algorithms have the goal of increasing the powertrain's overall efficiency while also increasing driver safety. These control algorithms have been applied to an axle-split Plug-in Hybrid Electric Vehicle with a complex architecture that allows more than one driving mode, including a pure electric one. The energy management strategies investigated are mainly three: the vehicle's baseline heuristic controller, referred to in the following as the rule-based controller; a sub-optimal controller that can also include predictive functionalities, referred to as the Equivalent Consumption Minimization Strategy; and a global optimum control technique, called Dynamic Programming, which also includes the high-voltage battery thermal management. During this project, different modelling approaches have been applied to the powertrain, including Hardware-in-the-loop, and diverse powertrain high-level controllers have been developed and implemented, increasing their complexity at each step.
The potential of sophisticated powertrain control techniques has been proven, and the achievable benefits in terms of fuel economy are largely influenced by the chosen energy management strategy, even for the powerful vehicle investigated.
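The core of an Equivalent Consumption Minimization Strategy is an instantaneous minimization of fuel power plus battery power weighted by an equivalence factor s. A minimal sketch, with assumed constant-efficiency maps in place of the real powertrain maps and constraints:

```python
def ecms_split(p_req, s, p_em_grid, engine_fuel_power, motor_elec_power):
    """
    Choose the electric-motor power minimizing equivalent fuel power:
        J = P_fuel(engine) + s * P_batt(motor)
    p_req: requested traction power [kW]
    s: equivalence factor converting battery power into fuel power
    p_em_grid: candidate motor powers [kW] (the engine covers the rest)
    """
    costs = {p_em: engine_fuel_power(p_req - p_em) + s * motor_elec_power(p_em)
             for p_em in p_em_grid}
    best = min(costs, key=costs.get)
    return best, costs[best]

# Assumed, purely illustrative efficiency maps
fuel_power = lambda p: max(p, 0.0) / 0.35               # engine: 35% efficient
batt_power = lambda p: p / 0.9 if p >= 0 else p * 0.9   # e-drive: 90% efficient
```

With these maps, a low s makes battery energy "cheap" and the controller favors electric driving, while a higher s pushes the load back onto the engine; tuning (or predicting) s over the trip is what turns this instantaneous rule into a charge-sustaining or charge-depleting strategy.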
Abstract:
Despite its great potential as a low-to-medium temperature waste heat recovery (WHR) solution, the ORC technology presents open challenges that still prevent its diffusion in the market, and these differ depending on the application and the size at stake. Focusing on the micro power range and low-temperature heat sources, the ORC technology is still not mature, due to the lack of appropriate machines and working fluids. Considering instead the medium to large size, the technology is already available but the investment is still risky. The intention of this thesis is to address some of the topical themes in the ORC field, paying special attention to the development of reliable models based on realistic data and accounting for the off-design performance of the ORC system and of each of its components. Concerning the “Micro-generation” application, this work: i) explores the modelling methodology, the performance and the optimal parameters of reciprocating piston expanders; ii) investigates the performance of such an expander and of the whole micro-ORC system when using hydrofluorocarbons as working fluids or their new low-GWP alternatives and mixtures; iii) analyzes the innovative reversible ORC architecture (conceived for energy storage), its optimal regulation strategy and its potential when inserted in typical small industrial frameworks. Regarding the “Industrial WHR” sector, this thesis examines the WHR opportunity of ORCs, with a focus on the natural gas compressor station application. This work provides information about all the parameters that can influence the optimal sizing, the performance and thus the feasibility of installing an ORC system. New WHR configurations are explored: i) a first one, relying on the replacement of a compressor prime mover with an ORC; ii) a second one, which consists in the use of a supercritical CO2 cycle as the heat recovery system.
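A first-law balance of a basic ORC can be sketched as follows. The enthalpy values in the example are purely illustrative placeholders for fluid-property data, and the model ignores the off-design behavior that the thesis focuses on.

```python
def orc_performance(h, eta_exp=0.7, eta_pump=0.6, m_dot=0.5):
    """
    First-law balance of a basic ORC.
    h: dict of isentropic specific enthalpies [kJ/kg] at four states:
       "1" pump inlet, "2s" pump outlet, "3" expander inlet,
       "4s" expander outlet (values come from fluid property tables).
    eta_exp, eta_pump: isentropic efficiencies; m_dot: mass flow [kg/s].
    Returns (net power [kW], cycle thermal efficiency).
    """
    w_pump = (h["2s"] - h["1"]) / eta_pump    # real pump work [kJ/kg]
    w_exp = (h["3"] - h["4s"]) * eta_exp      # real expander work [kJ/kg]
    q_in = h["3"] - (h["1"] + w_pump)         # evaporator heat input
    w_net = w_exp - w_pump
    return m_dot * w_net, w_net / q_in

# Illustrative (assumed) state enthalpies for a low-temperature source
states = {"1": 250.0, "2s": 255.0, "3": 480.0, "4s": 430.0}
p_net, eta = orc_performance(states)
```

The low thermal efficiency typical of such cycles is exactly why realistic component models and off-design maps, rather than nominal-point balances like this one, are needed to judge the feasibility of an installation.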
Abstract:
The aim of the Ph.D. research project was to explore Dual Fuel combustion and hybridization. Natural gas-diesel Dual Fuel combustion was experimentally investigated on a 4-stroke, 2.8 L, turbocharged, light-duty Diesel engine, considering four operating points ranging from low to medium-high loads at 3000 rpm. Then, a numerical analysis was carried out using a customized version of the KIVA-3V code, in order to optimize the diesel injection strategy at the highest investigated load. A second KIVA-3V model was used to analyse the interchangeability between natural gas and biogas at an intermediate operating point. Since natural gas-diesel Dual Fuel combustion suffers from poor combustion efficiency at low loads, the effects of hydrogen-enriched natural gas on Dual Fuel combustion were investigated using a validated Ansys Forte model, followed by an optimization of the diesel injection strategy and a sensitivity analysis on the swirl ratio, at the lowest investigated load. Since one of the main issues of Low Temperature Combustion engines is their low power density, 2-stroke engines, thanks to their doubled firing frequency compared to 4-stroke engines, may be more suitable for Dual Fuel operation. Therefore, the application of gasoline-diesel Dual Fuel combustion to a modern 2-stroke Diesel engine was analysed, starting from the investigation of gasoline injection and mixture formation. As far as hybridization is concerned, a MATLAB-Simulink model was built to compare a conventional (combustion) powertrain and a parallel-hybrid powertrain applied to a Formula SAE race car.
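Dual Fuel operating points are commonly characterized by the energy share of the gaseous fuel. A minimal sketch of that bookkeeping, with typical textbook lower heating values used only for illustration:

```python
def dual_fuel_gas_share(m_diesel, m_gas, lhv_diesel=42.6, lhv_gas=48.0):
    """
    Energy-based gaseous-fuel share of a Dual Fuel operating point.
    m_diesel, m_gas: fuel mass per cycle (or per unit time), same units;
    lhv_*: lower heating values [MJ/kg] (illustrative typical values
    for diesel and natural gas).
    Returns the fraction of total fuel energy supplied by the gas.
    """
    e_diesel = m_diesel * lhv_diesel    # pilot fuel energy
    e_gas = m_gas * lhv_gas             # premixed gaseous fuel energy
    return e_gas / (e_diesel + e_gas)
```

High gas shares are where the low-load combustion-efficiency penalty mentioned above appears, since the lean premixed charge burns incompletely when the pilot cannot ignite it all.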