13 results for Closing the Gap
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The work focuses on the influence of G. I. Gurdjieff's teaching on twentieth-century theatre, in particular on the work of Peter Brook, Declan Donnellan and Robert Lepage.
Abstract:
This exploratory research project developed a cognitive situated approach to studying aspects of simultaneous interpreting with quantitative, confirmatory methods. To do so, it explored how to determine the potential benefits of using a computer-assisted interpreting tool, InterpretBank, among 22 Chinese interpreting trainees with Chinese L1 and English L2. The informants were mostly second-year female students, with an average age of 24.7, enrolled in Chinese MA interpreting programs. The study adopted a pretest and posttest design with three cycles. The independent variable was the use of Excel or InterpretBank. After Cycle I (pre-test), the sample split into control (Excel) and experimental (InterpretBank) groups. Tool choice was compulsory in Cycle II but not in Cycle III. The source materials for each cycle were pairs of matching transcripts from popular science podcasts. Informants compiled glossaries out of one transcript, while the other was edited for simultaneous interpreting, with 39 terms as potential problem triggers. Quantitative profiling showed that InterpretBank informants spent less time on glossary compilation and generated more terms faster than Excel informants, but their glossaries were longer and less diverse and personal. The booth tasks yielded no significant differences in fluency indicators, except for more bumps (200-600 ms silent time gaps) for InterpretBank in Cycle II. InterpretBank informants had more correct renditions in Cycles II and III, but there was no statistically significant difference among accuracy indicators per cycle. Holistic quality assessments by PhD raters showed InterpretBank consistently outperforming Excel, suggesting a positive InterpretBank impact on SI quality. However, some InterpretBank implementations raised cognitive ergonomic concerns for Chinese, potentially undermining its utility.
Overall, results were mixed regarding InterpretBank benefits for Chinese trainees, but the project was successful in developing cognitive situated interpreting study methods, constructs and indicators.
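The "bumps" fluency indicator mentioned above is concrete enough to sketch: counting silent gaps whose duration falls within the 200-600 ms band. A minimal Python illustration, assuming pause durations have already been extracted from the booth recordings; the sample values, the function name, and the half-open treatment of the band endpoints are ours, not the study's:

```python
def count_bumps(pause_durations_ms, lo=200, hi=600):
    """Count silent gaps whose duration falls in [lo, hi) milliseconds."""
    return sum(1 for p in pause_durations_ms if lo <= p < hi)

# Illustrative pause durations (ms) from one hypothetical rendition:
pauses = [120, 250, 430, 610, 580, 90, 300]
print(count_bumps(pauses))  # 4 gaps fall in the 200-600 ms band
```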
Abstract:
Riding the wave of recent groundbreaking achievements, artificial intelligence (AI) is currently the buzzword on everybody’s lips, and, by allowing algorithms to learn from historical data, Machine Learning (ML) has emerged as its pinnacle. The multitude of algorithms, each with unique strengths and weaknesses, highlights the absence of a universal solution and poses a challenging optimization problem. In response, automated machine learning (AutoML) navigates vast search spaces within minimal time constraints. By lowering entry barriers, AutoML promises the democratization of AI, though it still faces several challenges. In data-centric AI, the discipline of systematically engineering the data used to build an AI system, the challenge of configuring data pipelines is far from trivial. We devise a methodology for building effective data pre-processing pipelines in supervised learning, as well as a data-centric AutoML solution for unsupervised learning. In human-centric AI, many current AutoML tools were built not around the user but around algorithmic ideas, raising ethical and social bias concerns. We contribute by deploying AutoML tools that aim to complement, rather than replace, human intelligence. In particular, we provide solutions for single-objective and multi-objective optimization, and showcase the challenges and potential of novel interfaces featuring large language models. Finally, there are application areas that rely on numerical simulators, often related to Earth observation; these tend to be particularly high-impact and address important challenges such as climate change and crop life cycles. We commit to coupling these physical simulators with (Auto)ML solutions towards a physics-aware AI. Specifically, in precision farming, we design a smart irrigation platform that allows real-time monitoring of soil moisture, predicts future moisture values, and estimates water demand to schedule irrigation.
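The pipeline-configuration problem described above can be sketched with a toy, dependency-free search: enumerate a small space of pre-processing choices and keep the best-scoring configuration. The configuration space and the scoring stub below are illustrative stand-ins, not the thesis's actual search space or evaluation:

```python
# A minimal sketch of AutoML-style pipeline search: exhaustively
# enumerate pre-processing configurations and keep the best one.
import itertools

SPACE = {"imputer": ["mean", "median"],
         "scaler": ["standard", "minmax", "none"],
         "encoder": ["onehot", "ordinal"]}

def evaluate(config):
    # Stand-in for cross-validated accuracy of the full pipeline;
    # a real AutoML system would train and validate a model here.
    score = 0.70
    score += 0.05 if config["imputer"] == "median" else 0.0
    score += 0.10 if config["scaler"] == "standard" else 0.0
    score += 0.02 if config["encoder"] == "onehot" else 0.0
    return score

configs = [dict(zip(SPACE, vals))
           for vals in itertools.product(*SPACE.values())]
best = max(configs, key=evaluate)
print(best, round(evaluate(best), 2))  # median + standard + onehot, 0.87
```

A real system would replace the exhaustive loop with budget-aware search (e.g. Bayesian optimization) and the stub with actual model training.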
Abstract:
The present PhD dissertation is dedicated to the general topic of knowledge transfer from academia to industry and the role of various measures, at both the institutional and university levels, in support of the commercialization of university research. The overall contribution of the dissertation is an in-depth and comprehensive analysis of the main critical issues that currently affect the commercial exploitation of academic research, together with evidence on the role of previously underexplored areas (e.g. the strategic use of academic patents; female academic patenting) in the general debate on the ways to successful knowledge transfer from academia to industry. The first paper aims to address this gap by developing a taxonomy of the literature, based on a comprehensive review of the existing body of research on government measures in support of knowledge transfer from academia to industry. The review reveals a considerable gap in the analysis of the impact and relative effectiveness of public policy measures, especially those aimed at building knowledge and expertise among academic faculty and technology transfer agents. The second paper focuses on the role of interorganizational collaborations and their effect on the likelihood that an academic patent remains unused, and points to the strategic management of patents by universities. In the third paper I turn to the issue of female participation in patenting and commercialization; in particular, I find evidence of the positive role of the university and its internal support structures in closing the gender gap in female academic patenting. The results of the research provide important implications for policy makers in crafting measures to increase the efficient use of the university knowledge stock.
Abstract:
The European External Action Service (EEAS or Service) is one of the most significant and most debated innovations introduced by the Lisbon Treaty. This analysis intends to explain the anomalous design of the EEAS in light of its function, which consists in the promotion of external action coherence. Coherence is a principle of the EU legal system, which requires synergy in the external actions of the Union and its Members. It can be enforced only through the coordination of European policy-makers' initiatives, by bridging the gap between the 'Communitarian' and intergovernmental approaches. This is the 'Union method' envisaged by A. Merkel: "coordinated action in a spirit of solidarity - each of us in the area for which we are responsible but all working towards the same goal". The EEAS embodies the 'Union method', since it is institutionally linked to both Union organs and Member States. It is also capable of enhancing synergy in policy management and promoting unity in international representation, since its field of action is delimited not by an abstract concern for institutional balance but by a pragmatic assessment of the need for coordination in each sector. The challenge is now to make sure that this pragmatic approach is applied with respect to all the activities of the Service, in order to reinforce its effectiveness. The coordination brought by the EEAS is in fact the only means through which a European foreign policy can come into being: the choice is not between the Community method and the intergovernmental method, but between a coordinated position and nothing at all.
Abstract:
This thesis aims to fill the gap in the literature by examining the relationship between technological trajectories and environmental policy in the automotive industry, focusing on the role of environmental policies in unlocking the industry from fossil fuel path-dependence. It first explores the inducement mechanism that underpins the interaction between environmental policy and green technological advances, investigating under what conditions the European environmental transport policy portfolio and the intrinsic characteristics of assignees' knowledge boost worldwide green patent production. Subsequently, the thesis empirically analyses the dynamics of technological knowledge involved in technological trajectories assessing evolution patterns such as variation, selection and retention, in order to study the impact of policy implementation on technological knowledge related to electric and hybrid vehicle technologies. Finally, the thesis sheds light on the drivers that encourage a shift from incumbent internal combustion engine technologies towards low-emission vehicle technologies. This analysis tests whether tax-inclusive fuel prices and technological proximity between technological fields induce a shift from non-environmental inventions to environmentally friendly inventive activities and if they impact the competition between alternative vehicle technologies. The findings provide insights into the effectiveness of environmental policy in triggering inventive activities related to the development of alternative vehicle technologies. In addition, there is evidence that environmental policy redirects technological efforts towards a sustainable path and impacts the competition between low-emission vehicles.
Abstract:
Heavy Liquid Metal Cooled Reactors are among the concepts fostered by the GIF as potentially able to comply with stringent safety, economic, sustainability, proliferation resistance and physical protection requirements. The increasing interest in these innovative systems has highlighted the lack of tools specifically dedicated to their core design stage. The present PhD thesis summarizes a three-year effort to partially close this gap by rationally defining the role of codes in core design, creating a development methodology for design-oriented codes (DOCs), and applying it to the design areas most in need. The fields covered are, in particular, fuel assembly thermal-hydraulics and fuel pin thermo-mechanics. Regarding the former, following the established methodology, the sub-channel code ANTEO+ has been conceived. Initially restricted to the forced convection regime and subsequently extended to the mixed one, ANTEO+ has been demonstrated, via a thorough validation campaign, to be a reliable tool for design applications. Concerning fuel pin thermo-mechanics, the will to include safety-related considerations at the outset of the pin dimensioning process gave rise to the safety-informed DOC TEMIDE. The proposed DOC development methodology has also been applied to TEMIDE; given the complex interdependence patterns among the numerous phenomena involved in an irradiated fuel pin, a sensitivity analysis over the anticipated application domain has been performed to optimize the code's final structure. The development methodology has also been tested in the verification and validation phases; the latter, due to the scarcity of experiments truly representative of TEMIDE's application domain, has been only a preliminary attempt to test TEMIDE's capability to fulfil the DOC requirements upon which it was built.
In general, the capability of the proposed development methodology for DOCs to deliver tools that help the core designer in the preliminary setting of the system configuration has been proven.
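For a flavour of the kind of relation a design-oriented sub-channel code such as ANTEO+ resolves, a steady-state energy balance gives the coolant temperature rise along a heated sub-channel. The numbers below (a lead-bismuth-like specific heat, linear power and flow rate) are purely illustrative, not taken from the thesis:

```python
def coolant_outlet_temp(t_in, linear_power, heated_length, mass_flow, cp):
    """Steady-state energy balance on one sub-channel:
    T_out = T_in + q' * L / (m_dot * cp)."""
    return t_in + linear_power * heated_length / (mass_flow * cp)

# e.g. 20 kW/m over 1 m, 1 kg/s of an LBE-like coolant (cp ~ 146 J/(kg K))
print(round(coolant_outlet_temp(400.0, 2.0e4, 1.0, 1.0, 146.0), 1))
```

A sub-channel code refines this balance by resolving inter-channel mixing and local heat transfer, which is where the validation effort lies.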
Abstract:
In the digital age, e-health technologies play a pivotal role in the processing of medical information. As personal health data represent sensitive information concerning a data subject, enhancing data protection and the security of systems and practices has become a primary concern. In recent years, there has been increasing interest in the concept of Privacy by Design (PbD), which aims at developing a product or a service in a way that supports privacy principles and rules. In the EU, Article 25 of the General Data Protection Regulation provides a binding obligation to implement Data Protection by Design (DPbD) technical and organisational measures. This thesis explores how an e-health system could be developed, and how data processing activities could be carried out, so as to apply data protection principles and requirements from the design stage. The research attempts to bridge the gap between the legal and technical disciplines on DPbD by providing a set of guidelines for the implementation of the principle. The work is based on a literature review, legal and comparative analysis, and an investigation of existing technical solutions and engineering methodologies. It can be divided into theoretical and applied perspectives. First, it conducts a critical legal analysis of the principle of PbD and studies the DPbD legal obligation and the related provisions. It then contextualises the rule in the health care field by investigating the legal framework applicable to personal health data processing. Moreover, the research turns to the US legal system through a comparative analysis. Adopting an applied perspective, it investigates the existing technical methodologies and tools for designing data protection and proposes a set of comprehensive DPbD organisational and technical guidelines for a crucial case study: an Electronic Health Record system.
Abstract:
Biology is now a “Big Data Science” thanks to technological advancements allowing the characterization of the whole macromolecular content of a cell or a collection of cells. This opens interesting perspectives, but only a small portion of these data can be experimentally characterized. From this derives the demand for accurate and efficient computational tools for the automatic annotation of biological molecules. This is even more true for membrane proteins, on which my research project focused, leading to the development of two machine learning-based methods: BetAware-Deep and SVMyr. BetAware-Deep is a tool for the detection and topology prediction of transmembrane beta-barrel proteins found in Gram-negative bacteria. These proteins are involved in many biological processes and are primary candidates as drug targets. BetAware-Deep exploits the combination of a deep learning framework (a bidirectional long short-term memory) and a probabilistic graphical model (a grammatical-restrained hidden conditional random field). Moreover, it introduced a modified formulation of the hydrophobic moment, designed to include evolutionary information. BetAware-Deep outperformed all available methods in topology prediction and reported high scores in the detection task. Glycine myristoylation in Eukaryotes is the binding of a myristic acid to an N-terminal glycine. SVMyr is a fast method based on support vector machines, designed to predict this modification in datasets of proteomic scale. It takes octapeptides as input and exploits computational scores derived from experimental examples together with mean physicochemical features. SVMyr outperformed all available methods for co-translational myristoylation prediction. In addition, as a unique feature, it allows the prediction of post-translational myristoylation. Both tools were designed with the best practices for developing machine learning-based tools outlined by the bioinformatics community in mind.
Moreover, they are made available via user-friendly web servers. All this makes them valuable tools for filling the gap between sequential and annotated data.
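The SVMyr input scheme described above (octapeptides mapped to mean physicochemical features) can be sketched as follows. Only one descriptor, Kyte-Doolittle hydrophobicity, is used here, and the example peptide is invented; SVMyr's actual feature set is richer:

```python
# Kyte-Doolittle hydrophobicity scale for the 20 standard amino acids.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
      "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
      "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
      "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def featurize(octapeptide):
    """Mean hydrophobicity of an 8-residue peptide starting with Gly,
    the residue that receives the myristic acid."""
    assert len(octapeptide) == 8 and octapeptide[0] == "G"
    return sum(KD[aa] for aa in octapeptide) / 8

print(round(featurize("GNAASDKL"), 4))  # hypothetical N-terminal peptide
```

An SVM would consume a vector of such per-peptide means (one per physicochemical property) alongside scores derived from known myristoylation examples.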
Abstract:
Historical evidence shows that chemical, process, and Oil&Gas facilities where dangerous substances are stored or handled are targets of deliberate malicious attacks (security attacks) aimed at interfering with normal operations. Physical attacks and cyber-attacks may generate events with consequences for people, property, and the surrounding environment that are comparable to those of major accidents with safety-related causes. The security aspects of these facilities are commonly addressed using Security Vulnerability/Risk Assessment (SVA/SRA) methodologies. Most of these methodologies are semi-quantitative, non-systematic approaches that rely strongly on expert judgment, leading to security assessments that are not reproducible. Moreover, they do not consider synergies with the safety domain. The present three-year research aimed at filling this gap by providing knowledge on security attacks, as well as rigorous and systematic methods supporting existing SVA/SRA studies suitable for the chemical, process, and Oil&Gas industry. The different nature of cyber and physical attacks led to the development of different methods for the two domains. The first part of the research was devoted to the development and statistical analysis of security databases, which yielded new knowledge and lessons learned on security threats. On this background, a Bow-Tie based procedure and two reverse-HazOp based methodologies were developed as hazard identification approaches for physical and cyber threats respectively. To support the quantitative estimation of security risk, a quantitative procedure based on a Bayesian Network was developed to calculate the probability of success of physical security attacks. All the developed methods have been applied to case studies addressing chemical, process, and Oil&Gas facilities (offshore and onshore), demonstrating the quality of the results that can be achieved in improving site security.
Furthermore, the outcomes achieved are a step forward in developing synergies and promoting integration between safety and security management.
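As a toy illustration of the Bayesian-Network idea above, the probability that a physical attack succeeds can be propagated through the conditional probabilities of defeating successive protection layers. The node structure and the numbers are ours for illustration, not the thesis's model:

```python
def attack_success(p_attempt, p_defeat_perimeter, p_defeat_delay,
                   p_response_fails):
    """Chain-rule propagation along a simple serial network of
    protection layers: success requires the attempt and the defeat
    of each layer, given that all earlier layers were defeated."""
    return (p_attempt * p_defeat_perimeter
            * p_defeat_delay * p_response_fails)

# Illustrative conditional probabilities for one attack scenario:
p = attack_success(1.0, 0.6, 0.5, 0.4)
print(round(p, 2))  # 0.12
```

A full Bayesian Network would additionally encode dependencies between layers (e.g. detection influencing response) and marginalize over them rather than assuming a simple serial chain.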
Abstract:
This comprehensive study explores the intricate world of 3D printing, with a focus on Fused Deposition Modelling (FDM). It sheds light on the critical factors that influence the quality and mechanical properties of 3D printed objects. Using an optical microscope at 40X magnification, the shapes of the printed beads are correlated with specific slicing parameters, resulting in a 2D parametric model. This mathematical model, derived from real samples, serves as a tool to predict general mechanical behaviour, bridging the gap between theory and practice in FDM printing. The study begins by emphasising the influence of geometric parameters such as layer height, line width and filament tolerance on the final printed bead geometry, and the resulting theoretical effect on mechanical properties. The introduction of the VPratio parameter (the ratio between the area of the voids and the area occupied by printed material) allows quantification of how varying the geometric slicing parameters improves or degrades mechanical properties. The study also addresses the effect of overhangs and the role of filament diameter tolerances. The research continues with the introduction of 3D FEM (Finite Element Method) models based on the RVE (Representative Volume Element), used to verify the results obtained from the 2D model and to analyse other aspects that affect mechanical properties but are not directly observable with the 2D model. The study also proposes a model for the examination of 3D printed infill structures, introducing an innovative methodology called “double RVE” which speeds up the calculation of mechanical properties and is more computationally efficient. Finally, the limitations of the RVE model are shown, and a so-called Hybrid RVE-based model is created to overcome the limitations and inaccuracy of the conventional RVE model and homogenization procedure on some printed geometries.
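The VPratio defined above can be made concrete under a simple geometric idealization: beads modelled as obrounds (a rectangle with two semicircular caps of diameter equal to the layer height) tiled on a line-width by layer-height grid, so the voids are the corner regions left between adjacent beads. This idealization is ours for illustration; the thesis derives its parametric model from measured bead shapes:

```python
import math

def vpratio(line_width, layer_height):
    """VPratio = void area / material area for one w-by-h unit cell
    containing an obround bead: rectangle (w - h) x h plus two
    semicircular caps of diameter h."""
    w, h = line_width, layer_height
    material = (w - h) * h + math.pi * (h / 2) ** 2
    void = w * h - material
    return void / material

# e.g. a 0.4 mm line width printed at 0.2 mm layer height:
print(round(vpratio(0.4, 0.2), 4))
```

Under this idealization, shrinking the layer height relative to the line width makes the bead more rectangular and drives VPratio (and hence the void content) down, matching the qualitative trend the parameter is meant to capture.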
Abstract:
Honey bees are considered a keystone species in ecosystems. The effect of pesticides harmful to honey bees and the action of extreme climatic waves, with their consequences for honey bee health, can cause the loss of many colonies; this can reduce the effective population size and incentivize the use of non-autochthonous queens to replace dead colonies. Over the last decades, the use of non-ligustica bee subspecies in Italy has increased and, together with the phenomena mentioned above, has exposed native honey bees to hybridization, leading to dramatic genetic erosion and admixture. Healthy genetic diversity within honey bee populations is critical to provide tolerance and resistance to current and future threats. It is now urgent to design strategies for the conservation of local subspecies and their valorisation on a productive scale. In this Thesis, genomic tools are applied to the analysis of the genetic diversity and genomic integrity of honey bee populations in Italy. mtDNA-based methods are presented that use honey bee DNA or honey eDNA as sources of information on the genetic diversity of A. mellifera at different levels. Taken together, the results derived from these studies should enlarge the knowledge of the genetic diversity and integrity of the honey bee populations in Italy, filling the gap of information necessary to design efficient conservation programmes. Furthermore, the methods presented in these works provide a tool for honey authentication, to sustain and valorise beekeeping products and the sector against fraud.
Abstract:
In pursuit of the European Union's ambitious target of achieving a carbon-neutral economy by 2050, researchers, vehicle manufacturers, and original equipment manufacturers have been at the forefront of exploring cutting-edge technologies for internal combustion engines. The introduction of these technologies has significantly increased the effort required to calibrate the models implemented in engine control units. Consequently, the development of tools that reduce the costs and time required during the experimental phases has become imperative. Additionally, to comply with ever-stricter limits on CO2 emissions, it is crucial to develop advanced control systems that enhance traditional engine management systems in order to reduce fuel consumption. Furthermore, the introduction of new homologation cycles, such as the real driving emissions cycle, compels manufacturers to bridge the gap between engine operation in laboratory tests and real-world conditions. Within this context, this thesis showcases the performance and cost benefits achievable through the implementation of an auto-adaptive closed-loop control system, leveraging in-cylinder pressure sensors in a heavy-duty diesel engine designed for mining applications. Additionally, the thesis explores the promising prospect of real-time self-adaptive machine learning models, particularly neural networks, to develop an automatic system that uses in-cylinder pressure sensors for the precise calibration of the target combustion phase and optimal spark advance in spark-ignition engines. To facilitate the application of these combustion-feedback-based algorithms in production, the thesis discusses the results obtained from the development of a cost-effective sensor for indirect cylinder pressure measurement.
Finally, to ensure the quality control of the proposed affordable sensor, the thesis provides a comprehensive account of the design and validation process for a piezoelectric washer test system.
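The closed-loop combustion-phase idea described above can be sketched as a simple feedback loop: the spark advance is nudged each cycle so that the measured 50% mass-fraction-burned angle (CA50, computed from in-cylinder pressure) tracks a target. The plant stub, gain and targets below are illustrative only, not the thesis's controller:

```python
def plant_ca50(spark_advance_deg):
    # Stand-in for the engine: more spark advance moves combustion
    # earlier (smaller CA50, in degrees after top dead centre).
    return 20.0 - 0.8 * spark_advance_deg

target_ca50, gain, sa = 8.0, 0.5, 10.0
for _ in range(20):                      # successive engine cycles
    error = plant_ca50(sa) - target_ca50
    sa += gain * error                   # advance more if CA50 is too late
print(round(sa, 2), round(plant_ca50(sa), 2))  # converges to 15.0, 8.0
```

A self-adaptive implementation would additionally re-estimate the plant response online (e.g. with a neural network) instead of relying on a fixed gain.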