15 results for The ISTRION platform
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Bioinformatics is a recent and emerging discipline that aims to study biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources to run algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, the specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three years of Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides a reliable abstraction over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multistage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations.
Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on their content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier GridBlast project, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager offers novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage cost of keeping such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing the invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also orally presented my work at numerous international conferences related to Grid computing and bioinformatics.
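The content-based detection of identical sandbox files that Vnas performs can be sketched as a content-addressed store (a minimal sketch with hypothetical names; the abstract does not specify the hashing scheme actually used):

```python
import hashlib

class SandboxStore:
    """Toy content-addressed store: files whose bytes are identical are
    uploaded once, regardless of filename or submitting user."""

    def __init__(self):
        self._blobs = {}   # content digest -> file bytes
        self.uploads = 0   # how many real transfers were needed

    def add(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self._blobs:   # only transfer unseen content
            self._blobs[digest] = data
            self.uploads += 1
        return digest

store = SandboxStore()
ref_a = store.add(b"BLAST input sequences")
ref_b = store.add(b"BLAST input sequences")  # same content, different job
assert ref_a == ref_b and store.uploads == 1
```

Because the key is derived from the bytes alone, two submissions by different users that reference the same input are stored and transferred only once, which is the bandwidth and storage saving described above.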
Abstract:
The main purpose of this work is to develop a numerical platform for the turbulence modeling and optimal control of liquid metal flows. Thanks to their interesting thermal properties, liquid metals are widely studied as coolants for heat transfer applications in the nuclear context. However, due to their low Prandtl numbers, the standard turbulence models commonly used for coolants such as air or water are inadequate. Advanced turbulence models able to capture the anisotropy in the flow and heat transfer are therefore necessary. In this thesis, a new anisotropic four-parameter turbulence model is presented and validated. The proposed model is based on explicit algebraic models and solves four additional transport equations for dynamical and thermal turbulent variables. For the validation of the model, several flow configurations are considered for different Reynolds and Prandtl numbers, namely fully developed flows in a plane channel and cylindrical pipe, and forced and mixed convection in a backward-facing step geometry. Since buoyancy effects cannot be neglected in liquid-metal-cooled fast reactors, the second aim of this work is to provide mathematical and numerical tools for the simulation and optimization of liquid metals in mixed and natural convection. Optimal control problems for turbulent buoyant flows are studied and analyzed with the Lagrange multipliers method. Numerical algorithms for optimal control problems are integrated into the numerical platform and several simulations are performed to show the robustness, consistency, and feasibility of the method.
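The Lagrange-multiplier treatment mentioned above follows the standard adjoint pattern of PDE-constrained optimization. Schematically (a generic sketch, not the thesis's specific functional; $T_d$, $\alpha$ and the control $q$ are illustrative placeholders):

```latex
% Minimize a tracking-type cost subject to the state equations
% S(u, T; q) = 0 (momentum, energy and turbulence transport):
\min_{q}\; J(\mathbf{u}, T, q)
  = \frac{1}{2}\int_{\Omega} \lvert T - T_d \rvert^{2}\, d\Omega
  + \frac{\alpha}{2}\int_{\Omega} \lvert q \rvert^{2}\, d\Omega
\quad \text{subject to} \quad S(\mathbf{u}, T; q) = 0 .

% Introduce adjoint variables and form the Lagrangian:
\mathcal{L}(\mathbf{u}, T, q, \boldsymbol{\lambda})
  = J(\mathbf{u}, T, q)
  + \langle \boldsymbol{\lambda},\, S(\mathbf{u}, T; q) \rangle .
```

Requiring stationarity of $\mathcal{L}$ with respect to all arguments yields the state equations, the adjoint equations, and an optimality condition of the form $\alpha q + \lambda = 0$, which coupled or iterative numerical algorithms of the kind described above can solve.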
Abstract:
Human movement analysis (HMA) aims to measure the ability of a subject to stand or to walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics, aiming to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers, baropodometric insoles, etc. This thesis focuses on the force platform (FP) and in particular on the quality assessment of FP data. The principal objective of our work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows. Chapter 1: description of the physical principles underlying the functioning of an FP and of how these principles are used to create force transducers, such as strain gauges and piezoelectric transducers; then, a description of the two categories of FPs, three- and six-component, of signal acquisition (hardware structure), and of signal calibration; finally, a brief description of the use of FPs in HMA, for balance or gait analysis. Chapter 2: description of inverse dynamics, the most common method used in the field of HMA. This method uses the signals measured by an FP to estimate kinetic quantities, such as joint forces and moments. These variables cannot be measured directly without very invasive techniques; consequently they can only be estimated using indirect techniques, such as inverse dynamics. The chapter closes with a brief description of the sources of error present in gait analysis. Chapter 3: state of the art in FP calibration.
The selected literature is divided into sections, each describing: systems for the periodic control of FP accuracy; systems for error reduction in FP signals; and systems and procedures for the construction of an FP. In particular, a calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail. This system was the “starting point” for the new system presented in this thesis. Chapter 4: description of the new system, divided into its parts: 1) the algorithm; 2) the device; and 3) the calibration procedure, for correctly performing the calibration process. The characteristics of the algorithm were optimized by a simulation approach, whose results are presented here. In addition, the different versions of the device are described. Chapter 5: experimental validation of the new system, achieved by testing it on 4 commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the center of pressure of an applied force. The new system can estimate local and global calibration matrices; using these matrices, the non-linearity of the FPs was quantified and locally compensated. Furthermore, a non-linear calibration is proposed, which compensates the non-linear effect in the FP functioning due to the bending of its upper plate. The experimental results are presented. Chapter 6: influence of FP calibration on the estimation of kinetic quantities with the inverse dynamics approach. Chapter 7: the conclusions of this thesis are presented: the need for a calibration of FPs and the consequent enhancement in the quality of kinetic data. Appendix: calibration of the LC used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities. The optimal set-up is verified by experimental results.
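The centre-of-pressure accuracy check used in the experimental validation rests on the standard relation between the six calibrated FP outputs and the point of application of a vertical load. A minimal sketch (pure Python; the calibration matrices in the thesis are estimated experimentally, whereas the identity matrix and the numbers below are placeholders):

```python
def apply_calibration(C, raw):
    """Multiply a 6x6 calibration matrix by the raw 6-component output
    [Fx, Fy, Fz, Mx, My, Mz] to obtain calibrated loads."""
    return [sum(C[i][j] * raw[j] for j in range(6)) for i in range(6)]

def centre_of_pressure(F):
    """COP of a vertical load on the plate surface (moments taken about
    the plate origin at surface level): x = -My / Fz,  y = Mx / Fz."""
    Fx, Fy, Fz, Mx, My, Mz = F
    return (-My / Fz, Mx / Fz)

# An identity calibration matrix leaves the raw signals unchanged.
I6 = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]
raw = [0.0, 0.0, 500.0, 25.0, -50.0, 0.0]   # N and N·m, illustrative
cop = centre_of_pressure(apply_calibration(I6, raw))
assert cop == (0.1, 0.05)   # metres
```

Comparing the COP computed this way against the known location of the applied force, before and after replacing the matrix, is exactly the kind of accuracy measurement the validation chapter describes.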
Abstract:
The industrial context is changing rapidly due to advancements in technology fueled by the Internet and Information Technology. The fourth industrial revolution counts integration, flexibility, and optimization as its fundamental pillars, and, in this context, Human-Robot Collaboration has become a crucial factor for manufacturing sustainability in Europe. Collaborative robots are appealing to many companies due to their low installation and running costs and high degree of flexibility, making them ideal for reshoring production facilities with a short return on investment. The ROSSINI European project aims to implement a true Human-Robot Collaboration by designing, developing, and demonstrating a modular and scalable platform for integrating human-centred robotic technologies in industrial production environments. The project focuses on the safety concerns related to introducing a cobot in a shared working area and aims to lay the groundwork for a new working paradigm at the industrial level. The main trigger of this thesis was the need for a software architecture suited to the robotic platform employed in one of the three use cases selected to deploy and test the new technology. The chosen application consists of the automatic loading and unloading of raw-material reels to an automatic packaging machine through an Autonomous Mobile Robot composed of an Autonomous Guided Vehicle, two collaborative manipulators, and an eye-on-hand vision system for performing tasks in a partially unstructured environment. The results obtained during the development of the ROSSINI use case were later used in the SENECA project, which addresses the need for robot-driven automatic cleaning of pharmaceutical bins in a very specific industrial context. The inherent versatility of mobile collaborative robots is evident from their deployment in the two projects with few hardware and software adjustments.
The positive impact of Human-Robot Collaboration on diverse production lines is a motivation for future industrial investment in research on this increasingly popular field.
Abstract:
Metabolomics has established itself as a discipline that can offer a unique point of view on how a technological treatment can impact the characteristics of a food. Moreover, the same analytical platforms necessary for that purpose can also effectively unravel the intricate interactions between such food and human health upon consumption. This PhD thesis investigates the application of metabolomics to understanding the impact of technological treatments on food and their subsequent effects on human health, utilizing 1H-NMR as the analytical platform. The study involves the development of standard operating procedures (SOPs) to ensure a fast and stable preparation of seafood samples, incorporating novel algorithms to enhance the accuracy of metabolome profiles. To gain insight into how metabolomics allows the effects of a technological treatment on a food to be explored, we performed three sets of experiments investigating the application of metabolomics to the study of the impact of high hydrostatic pressure (HHP) treatment on the seafood metabolome during storage. The first experiment employs untargeted metabolomic analysis on chill-stored rose shrimp, revealing significant post-HHP-treatment metabolic alterations and mechanisms. The investigation is extended to grey mullet in the second experiment, utilizing both untargeted and targeted metabolomic analyses to account for matrix-related effects. The third experiment assesses the targeted metabolome of striped prawns, showing that HHP significantly influences metabolic pathways, positively impacting freshness and taste through alterations in the related metabolites. Shifting focus to the effects of food on humans, the study explores the impact of multistrain probiotics on cirrhosis patients using 1H-NMR. The platform reveals notable alterations in glutamine/glutamate metabolism, enhancing the patients' ammonia detoxification capacity.
This research underscores the potential of metabolomics in uncovering intricate interactions between technological treatments, food, and human health, providing valuable insights for both the food industry and healthcare interventions.
Abstract:
This thesis explores the advancement of cancer treatment through targeted photodynamic therapy (PDT) using bioengineered phages. It aims to harness the specificity of phages for targeting cancer-related receptors such as EGFR and HER2, which are pivotal in numerous malignancies and associated with poor outcomes. The study commenced with the M13EGFR phage, modified to target EGFR through pIII-displayed EGFR-binding peptides, demonstrating enhanced killing efficiency when conjugated with the Rose Bengal photosensitizer. This phase underscored phages' potential in targeted PDT. A breakthrough was achieved with the development of the M137D12 phage, engineered to display the 7D12 nanobody for precise EGFR targeting, marking a shift from peptide-based to nanobody-based targeting and yielding better specificity and therapeutic results. The translational potential was highlighted through in vitro and in vivo assays employing therapeutic lasers, showing effective, specific cancer cell killing through a necrotic mechanism. Additionally, the research delved into the interaction between the M13CC phage and colon cancer models, demonstrating its ability to penetrate and disrupt cancer spheroids only upon irradiation, indicating a significant advancement in targeting cells within challenging tumor microenvironments. In summary, the thesis provides a thorough examination of the phage platform's efficacy and versatility for targeted PDT. The promising outcomes, especially with the M137D12 phage, and initial findings on a HER2-targeting phage (M13HER2), forecast a promising future for phage-mediated, targeted anticancer strategies employing photosensitizers in PDT.
Abstract:
A machining centre is nowadays a complex mechanical, electronic and electrical system that needs integrated design capabilities, which very often require a highly time-consuming effort. Numerical techniques for designing and dimensioning the machine structure and components usually require different knowledge depending on the system to be designed. This Ph.D. thesis describes the author's efforts to develop a system that makes it possible to carry out the complete project of a new machine optimized in its dynamic behaviour. An integration of the different systems developed, each of which responds to specific needs of the designer, is presented here. In particular, a dynamic analysis system, based on a lumped-mass approach, that allows the machine drives to be rapidly set up, and an Integrated Dynamic Simulation System, based on an FEM approach, that permits dynamic optimization, are shown. A multilevel database and an operator interface module complete the design platform. The proposed approach represents a significant step toward virtual machining for the prediction of the quality of the worked surface.
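The lumped-mass approach used for setting up the drives can be illustrated with the classic two-inertia model of a servo drive: a motor inertia coupled to a load inertia through a flexible shaft. This is a generic textbook illustration, not the thesis's actual machine model, and all parameter values are arbitrary:

```python
def simulate_two_inertia(Jm, Jl, k, c, torque, dt, steps):
    """Semi-implicit Euler simulation of a two-inertia drive:
    motor (Jm) and load (Jl) coupled by shaft stiffness k and
    damping c, driven by a constant motor torque.
    Returns the load-speed history."""
    th_m = th_l = w_m = w_l = 0.0
    history = []
    for _ in range(steps):
        twist = th_m - th_l
        shaft = k * twist + c * (w_m - w_l)   # torque through the shaft
        w_m += dt * (torque - shaft) / Jm     # motor-side dynamics
        w_l += dt * shaft / Jl                # load-side dynamics
        th_m += dt * w_m
        th_l += dt * w_l
        history.append(w_l)
    return history

speeds = simulate_two_inertia(Jm=0.01, Jl=0.05, k=100.0, c=0.5,
                              torque=1.0, dt=1e-4, steps=5000)
assert speeds[-1] > 0.0   # the load spins up under constant torque
```

Sweeping the drive gains against such a model is the kind of rapid drive set-up a lumped-mass analysis system enables before a full FEM optimization.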
Abstract:
The work of my thesis is focused on the impact of tsunami waves in limited basins. By limited basins I mean those basins capable of modifying the tsunami signal significantly with respect to the surrounding open sea. Based on this definition, we consider as limited basins not only harbours but also straits, channels, seamounts and oceanic shelves. I considered two different examples, one dealing with the Seychelles Island platform in the Indian Ocean, the second focusing on the Messina Strait and the harbour of the city of Messina itself (Italy). The Seychelles platform is bathymetrically distinct from the surrounding ocean, with rapid changes from 2 km to 70 m depth over short horizontal distances. The study of the platform's response to tsunami propagation is based on the simulation of the mega-event that occurred on 26 December 2004. Based on a hypothesis for the earthquake's causative fault, the ensuing tsunami was numerically simulated. I analysed synthetic tide-gauge records at several virtual tide gauges aligned along the direction going from the source to the platform. A substantial uniformity of tsunami signals in all calculated open-ocean tide-gauge records is observed, while the signals calculated at two points on the Seychelles platform show different features both in terms of amplitude and period of the perturbation. To better understand the frequency content of the different calculated marigrams, a spectral analysis was carried out. In particular, the ratio between the spectrum of the tide-gauge records calculated on the platform and the average of the tide-gauge records in the open ocean was considered. The main result is that, while in the average open-ocean spectrum the fundamental peak is related to the source, the platform introduces further peaks linked both to the bathymetric configuration and to the coastal geometry.
The Messina Strait represents an interesting case because it consists of a sort of channel, open both to the north and to the south, and furthermore contains the limited basin of the Messina harbour. In this case the study was carried out in a different way with respect to the Seychelles case. The basin was forced along a boundary of the computational domain with sinusoidal functions having different periods within the typical tsunami frequencies. The tsunami was simulated numerically and, in particular, the tide-gauge records were calculated for every forcing function at different points both outside and inside the channel and the Messina harbour. Apart from the tide-gauge records in the source region, which almost immediately reach stationarity, all the computed signals in the channel and in the Messina harbour present a transient of variable amplitude followed by a stationary part. Based exclusively on this last part, I calculated the amplification curves for each site. I found that the maximum amplification is obtained for forcing periods of approximately 10 minutes.
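The spectral-ratio idea used in both case studies, the spectrum at a site of interest divided by an open-sea reference spectrum, can be sketched with a naive discrete Fourier transform (illustrative only; the thesis does not state which spectral estimator was actually used):

```python
import cmath

def amplitude_spectrum(signal):
    """Naive DFT magnitude of a real, evenly sampled record
    (first half of the spectrum only)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def amplification(site_record, open_sea_record):
    """Spectral ratio site / open sea; peaks absent from the open-sea
    spectrum reveal periods introduced by local bathymetry and
    coastal geometry."""
    s = amplitude_spectrum(site_record)
    r = amplitude_spectrum(open_sea_record)
    return [si / ri if ri > 1e-12 else 0.0 for si, ri in zip(s, r)]
```

Applied to the stationary part of a computed marigram and an open-sea record, the resulting curve is an amplification spectrum of the kind from which the ~10-minute resonance above is read off.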
Abstract:
Generic programming is likely to become a new challenge for a critical mass of developers. Therefore, it is crucial to refine the support for generic programming in mainstream Object-Oriented languages — both at the design and at the implementation level — as well as to suggest novel ways to exploit the additional degree of expressiveness made available by genericity. This study is meant to provide a contribution towards bringing Java genericity to a more mature stage with respect to mainstream programming practice, by increasing the effectiveness of its implementation and by revealing its full expressive power in real-world scenarios. With respect to the current research setting, the main contribution of the thesis is twofold. First, we propose a revised implementation of Java generics that greatly increases the expressiveness of the Java platform by adding reification support for generic types. Secondly, we show how Java genericity can be leveraged in a real-world case study in the context of multi-paradigm language integration. Several approaches have been proposed to overcome the lack of reification of generic types in the Java programming language. Existing approaches tackle the problem by defining new translation techniques which would allow for a runtime representation of generics and wildcards. Unfortunately most approaches suffer from several problems: heterogeneous translations are known to be problematic when considering the reification of generic methods and wildcards. On the other hand, more sophisticated techniques requiring changes in the Java runtime support reified generics through a true language extension (where clauses), so that backward compatibility is compromised.
In this thesis we develop a sophisticated type-passing technique for addressing the problem of reification of generic types in the Java programming language; this approach, first pioneered by the so-called EGO translator, is here turned into a full-blown solution which reifies generic types inside the Java Virtual Machine (JVM) itself, thus overcoming both the performance penalties and the compatibility issues of the original EGO translator. Java-Prolog integration. Integrating Object-Oriented and declarative programming has been the subject of several research efforts and corresponding technologies. Such proposals come in two flavours: either attempting to join the two paradigms, or simply providing an interface library for accessing Prolog declarative features from a mainstream Object-Oriented language such as Java. Both solutions, however, have drawbacks: in the case of hybrid languages featuring both Object-Oriented and logic traits, the resulting language is typically too complex, thus making mainstream application development a harder task; in the case of library-based integration approaches there is no true language integration, and some “boilerplate code” has to be implemented to fix the paradigm mismatch. In this thesis we develop a framework called PatJ which promotes the seamless exploitation of Prolog programming in Java. A sophisticated usage of generics and wildcards makes it possible to define a precise mapping between Object-Oriented and declarative features. PatJ defines a hierarchy of classes where the bidirectional semantics of Prolog terms is modelled directly at the level of the Java generic type system.
Abstract:
A pursuer UAV tracking and loitering around a target is the problem analyzed in this thesis. The UAV is assumed to be a fixed-wing vehicle, and constant airspeed together with bounded lateral accelerations are the main constraints of the problem. Three different guidance laws are designed to ensure a continuous overfly of the target. Different proofs are presented to demonstrate the stability properties of the laws. All the algorithms are tested on a 6DoF Pioneer software simulator. Classic control design methods have been adopted to develop the autopilots used to implement the simulation platform on which the guidance laws were tested.
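The structure of a loitering guidance law under these constraints (constant airspeed, bounded lateral acceleration) can be illustrated with a common vector-field formulation, which is not necessarily one of the three laws of the thesis; gains and bounds below are arbitrary:

```python
import math

def loiter_accel_cmd(px, py, vx, vy, radius, k=1.0, a_max=5.0):
    """Lateral-acceleration command steering a constant-speed vehicle
    onto a counter-clockwise circle of the given radius around the
    origin (the target). Desired course = circle tangent plus a radial
    correction proportional to the distance error."""
    d = math.hypot(px, py)
    speed = math.hypot(vx, vy)
    tangent = math.atan2(px, -py)                 # CCW tangent course
    correction = math.atan(k * (d - radius) / radius)
    chi_d = tangent + correction                  # desired course
    chi = math.atan2(vy, vx)                      # current course
    # wrap the course error to (-pi, pi]
    err = math.atan2(math.sin(chi_d - chi), math.cos(chi_d - chi))
    a_cmd = 2.0 * speed * err                     # proportional steering
    return max(-a_max, min(a_max, a_cmd))         # fixed-wing bound
```

Saturating the command at `a_max` models the bounded-lateral-acceleration constraint; on the circle and flying tangent to it, the command vanishes.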
Abstract:
This thesis analyzes a subject of renewed interest in bioengineering: the research and analysis of the exercise parameters that maximize neuromuscular and cardiovascular involvement in vibration treatment. The research activity was inspired by the increasing use of devices able to provide localized or whole body vibration (WBV). In particular, the focus was placed on the vibrating platform and the effect that vibrations have on the neuromuscular and cardiovascular systems. The aim of the thesis is to evaluate the effectiveness and efficiency of vibration applied to the entire body; in particular, the effect of WBV was investigated on: 1) oxygen consumption during static and dynamic squat; 2) the resonant frequency of the muscle groups of the lower limbs; 3) oxygen consumption and electromyographic signals during static and dynamic squat. The first three chapters explain the state of the art concerning vibration treatments and the effects of vibration applied to the entire body, with an explanation of the basic mechanisms (Tonic Vibration Reflex, TVR) and of the neuromuscular system, with particular attention to the skeletal muscles and the stretch reflex. The fourth chapter illustrates the set-up used for the experiments and the software, implemented in LabWindows, used to control the platform and acquire the electromyographic signal. The fifth chapter presents the experiments undertaken during the Ph.D. years. In particular, the analysis of the whole body vibration effect on the neuromuscular and cardiovascular systems showed interesting results. The results indicate that the static squat with WBV produced higher neuromuscular and cardiorespiratory system activation for exercise durations shorter than 60 s. Otherwise, if the duration of a single bout was longer than 60 s, the greater cardiorespiratory system activation was achieved during the dynamic squat with WBV, while higher neuromuscular activation was still obtained with the static exercise.
Abstract:
Modern internal combustion engines are becoming increasingly complex. The introduction of the EURO VI emissions regulation will require a significant reduction of exhaust pollutants. The most critical issue is the reduction of NOx for Diesel engines, on top of the limits already in force under previous regulations. The tuning of a new engine typically involves a series of specific tests on the test bench. The ever-growing number of combustion control parameters, a consequence of the greater mechanical complexity of the engine itself, causes an exponential increase in the number of tests required to characterize the whole system. The goal of this doctoral project is to build a real-time combustion analysis system implementing several algorithms not yet present in modern engine control units, paying particular attention to the choice of the hardware on which to implement the analysis algorithms. The result is a Rapid Control Prototyping (RCP) platform that exploits most of the sensors already fitted to a production vehicle and that can shorten the time and cost of powertrain testing, reducing the need for a-posteriori analysis of previously acquired data in favour of a larger amount of computation performed in real time. The proposed solution guarantees upgradability, i.e. the possibility of keeping the computing platform at the highest technological level, postponing its obsolescence and replacement costs. This property translates into the need to maintain compatibility between hardware and software of different generations, making it possible to replace the components that limit performance without redesigning the software.
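One representative real-time combustion metric of the kind such a platform computes from in-cylinder pressure is the indicated mean effective pressure (IMEP), the cyclic integral of p dV divided by the displaced volume. This is an illustrative sketch; the abstract does not list the project's specific algorithms:

```python
def imep(pressures, volumes):
    """Indicated mean effective pressure from sampled in-cylinder
    pressure and cylinder volume over one cycle: the cyclic integral
    of p dV (trapezoidal rule) divided by the displaced volume."""
    work = sum((pressures[i] + pressures[i + 1]) / 2.0
               * (volumes[i + 1] - volumes[i])
               for i in range(len(volumes) - 1))
    v_disp = max(volumes) - min(volumes)
    return work / v_disp
```

Computing such indices cycle-by-cycle, rather than logging raw pressure traces for later offline analysis, is precisely the shift from a-posteriori analysis toward real-time computation described above.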
Abstract:
The primary aim of the research activity presented in this Ph.D. thesis was the development of an innovative hardware and software solution for creating a unique tool for the kinematic and electromyographic analysis of the human body in an ecological setting. For this purpose, innovative algorithms have been proposed for different aspects of inertial and magnetic data processing: magnetometer calibration and magnetic field mapping (Chapter 2), data calibration (Chapter 3), and the sensor-fusion algorithm. Topics that may conflict with the confidentiality agreement between the University of Bologna and NCS Lab are not covered in this thesis. After developing and testing the wireless platform, research activities were focused on its clinical validation. The first clinical study aimed to evaluate intra- and inter-observer reproducibility in the assessment of three-dimensional humero-scapulo-thoracic kinematics in an outpatient setting (Chapter 4). A second study aimed to evaluate the effect of Latissimus Dorsi tendon transfer on shoulder kinematics and on Latissimus Dorsi activation during humeral intra- and extra-rotations (Chapter 5). The results of both clinical studies have demonstrated the ability of the developed platform to enter daily clinical practice, providing useful information for patients' rehabilitation.
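The simplest ingredient of the magnetometer calibration step, hard-iron offset removal, can be sketched with the min/max midpoint method (a deliberately simple illustration; a production calibration of the kind the thesis describes would typically use a full ellipsoid fit):

```python
def hard_iron_offset(samples):
    """Estimate the constant (hard-iron) bias of a three-axis
    magnetometer from readings taken while rotating the sensor through
    many orientations: the midpoint of the min/max excursion per axis."""
    return tuple((min(s[i] for s in samples) +
                  max(s[i] for s in samples)) / 2.0
                 for i in range(3))

def calibrated(sample, offset):
    """Subtract the estimated bias from a raw reading."""
    return tuple(s - o for s, o in zip(sample, offset))

# Synthetic readings on a sphere centred at the (unknown) bias (10, -5, 2):
samples = [(11.0, -5.0, 2.0), (9.0, -5.0, 2.0),
           (10.0, -4.0, 2.0), (10.0, -6.0, 2.0),
           (10.0, -5.0, 3.0), (10.0, -5.0, 1.0)]
bias = hard_iron_offset(samples)
assert bias == (10.0, -5.0, 2.0)
```

After bias removal, the readings lie on a sphere centred at the origin, which is the precondition for using the magnetometer as a heading reference inside a sensor-fusion algorithm.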
Abstract:
The topic of this thesis is the design and implementation of mathematical models and control system algorithms for rotary-wing unmanned aerial vehicles to be used in cooperative scenarios. The use of rotorcraft has many attractive advantages, since these vehicles have the capability to take off and land vertically, to hover, and to move backward and laterally. Rotary-wing aircraft missions require precise control characteristics due to their unstable and heavily coupled dynamics. As a matter of fact, flight testing is the most accurate way to evaluate flying qualities and to test control systems. However, it may be very expensive and/or not feasible at an early stage of design and prototyping. A good compromise is a preliminary assessment performed by means of simulations and a reduced flight-testing campaign. Consequently, having an analytical framework represents an important stage for simulations and control algorithm design. In this work mathematical models for various helicopter configurations are implemented. Different flight control techniques for helicopters are presented with their theoretical background and tested via simulations and experimental flight tests on a small-scale unmanned helicopter. The same platform is also used in a cooperative scenario with a rover. Control strategies, algorithms and their implementation to perform missions are presented for two main scenarios. One of the main contributions of this thesis is to propose a suitable control system made of a classical PID baseline controller augmented with an L1 adaptive contribution. In addition, a complete analytical framework and a study of the dynamics and stability of a synch-rotor are provided. Finally, cooperative control strategies are implemented for two main scenarios that include a small-scale unmanned helicopter and a rover.
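The baseline part of the proposed controller, a classical PID loop, can be sketched as follows (gains and the toy first-order plant are arbitrary; the L1 adaptive augmentation of the thesis is omitted):

```python
class PID:
    """Textbook PID controller with a simple rectangular integrator."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measurement, dt):
        err = setpoint - measurement
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Driving a first-order attitude model (angle' = command) to 1.0 rad.
pid = PID(kp=4.0, ki=0.5, kd=0.1)
angle, dt = 0.0, 0.01
for _ in range(3000):
    angle += dt * pid.update(1.0, angle, dt)
```

In an architecture like the one described, the adaptive contribution would be added to this baseline command to compensate model uncertainty, leaving the PID to shape the nominal response.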
Abstract:
Knowledge graphs and ontologies are closely related concepts in the field of knowledge representation. In recent years, knowledge graphs have gained increasing popularity and serve as essential components in many knowledge engineering projects that view them as crucial to their success. The conceptual foundation of a knowledge graph is provided by ontologies. Ontology modeling is an iterative engineering process that consists of steps such as the elicitation and formalization of requirements and the development, testing, refactoring, and release of the ontology. Testing the ontology is a crucial and occasionally overlooked step of the process due to the lack of integrated tools to support it. As a result of this gap in the state of the art, ontology testing is carried out manually, which requires a considerable amount of time and effort from the ontology engineers. The lack of tool support is noticeable in the requirement elicitation process as well. In this respect, the rise in the adoption and accessibility of knowledge graphs allows for the development and use of automated tools that assist with the elicitation of requirements from such a complementary source of data. Therefore, this doctoral research is focused on developing methods and tools that support the requirement elicitation and testing steps of an ontology engineering process. To support the testing of the ontology, we have developed XDTesting, a web application integrated with the GitHub platform that serves as an ontology testing manager. Concurrently, to support the elicitation and documentation of competency questions, we have defined and implemented RevOnt, a method to extract competency questions from knowledge graphs. Both methods are evaluated through their implementation, and the results are promising.