60 results for New applications
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In the last decades mesenchymal stromal cells (MSC), intriguing for their multilineage plasticity and their proliferative activity in vitro, have been intensively studied for innovative therapeutic applications. In the first project, a new method to expand adipose-derived MSC (ASC) in vitro while maintaining their progenitor properties has been investigated. ASC are cultured in the same flask for 28 days in order to allow cell-extracellular matrix and cell-cell interactions and to mimic the in vivo niche. ASC cultured with this method (Unpass cells) were compared with ASC cultured under classic conditions (Pass cells). Unpass and Pass cells were characterized in terms of clonogenicity, proliferation, stemness gene expression, and differentiation in vitro and in vivo; the results showed that Unpass cells preserve their stemness and phenotypic properties, suggesting a fundamental role of the niche in the maintenance of ASC progenitor features. Our data suggest alternative culture conditions for the expansion of ASC ex vivo which could increase the performance of ASC in regenerative applications. In vivo MSC tracking is essential in order to assess their homing and migration. Superparamagnetic iron oxide nanoparticles (SPION) have been used to track MSC in vivo thanks to their biocompatibility and traceability by MRI. In the second project a new generation of magnetic nanoparticles (MNP) used to label MSC was tested. These MNP have been functionalized with hyperbranched poly(epsilon-lysine) dendrons (G3CB) in order to interact with the membrane glycocalyx of the cells, avoiding their internalization and preventing any cytotoxic effects. The literature reports that labeling of MSC with SPION requires long incubation times; in our experiments, after 15 min of incubation with G3CB-MNP more than 80% of MSC were labeled.
The data obtained from cytotoxicity, proliferation and differentiation assays showed that labeling does not affect MSC properties, suggesting a potential application of G3CB nanoparticles in regenerative medicine.
Abstract:
Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources for running algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, the specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three years of Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I hence created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides a reliable abstraction over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multistage computational pipelines, and provides an abstracted virtual sandbox which bypasses Grid limitations.
Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting equality of virtual sandbox files based on content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier GridBlast project, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager offers novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage costs required to keep such older versions available in the Grid environment by two orders of magnitude. The SETest framework enables the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for their maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also orally presented my work at numerous international conferences related to Grid computing and bioinformatics.
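The abstract does not detail how SETest builds its datasets of "invocations and results of filtered calls"; the following is a minimal, hypothetical sketch of that record/replay idea in Python. All names here (`CallRecorder`, `submit_job`) are illustrative, not the framework's actual API.

```python
class CallRecorder:
    """Record the results of 'filtered' side-effecting calls, then replay them.

    In 'record' mode the wrapped function really runs and each
    (function, args, result) triple is appended to a dataset; in
    'replay' mode the stored results are served back, so a pipeline
    full of side effects can be regression-tested offline.
    """

    def __init__(self, mode="record"):
        self.mode = mode
        self.dataset = []   # the recorded invocations and their results
        self._cursor = 0

    def filtered(self, fn):
        def wrapper(*args):
            if self.mode == "record":
                result = fn(*args)            # real call, real side effect
                self.dataset.append((fn.__name__, args, result))
                return result
            # replay: check the call sequence and skip the side effect
            name, rec_args, result = self.dataset[self._cursor]
            self._cursor += 1
            assert (name, rec_args) == (fn.__name__, args), "call sequence changed"
            return result
        return wrapper


recorder = CallRecorder(mode="record")

@recorder.filtered
def submit_job(executable):
    # stand-in for a Grid submission with side effects
    return {"executable": executable, "status": "Done"}

recorded = submit_job("blastall")      # executed and recorded

recorder.mode = "replay"
recorder._cursor = 0
replayed = submit_job("blastall")      # served from the dataset, no side effect
```

Once a dataset has been recorded against the live environment, regression tests can replay it deterministically, which is the property that makes side-effect-heavy pipelines testable at all.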
Abstract:
Graphene and graphene derivatives have rapidly emerged as extremely promising systems for electronic, optical, thermal, and electromechanical applications. Several approaches have been developed to produce these materials (i.e. scotch tape, CVD, chemical and solvent exfoliation). In this work we report a chemical approach to produce graphene by reducing graphene oxide (GO) via thermal or electrical methods. A morphological and electrical characterization of these systems has been performed using different techniques such as SPM, SEM, TEM, Raman and XPS. Moreover, we studied the interaction between graphene derivatives and organic molecules, focusing on the following aspects:
- improvement of the optical contrast of graphene on different substrates for rapid monolayer identification [1]
- supramolecular interaction with organic molecules (i.e. thiophene, pyrene, etc.) [4]
- covalent functionalization with optically active molecules [2]
- preparation and characterization of organic/graphene Field Effect Transistors [3-5]
Graphene chemistry can potentially allow seamless integration of graphene technology in organic electronic devices, improving device performance and developing new applications for graphene-based materials.
[1] E. Treossi, M. Melucci, A. Liscio, M. Gazzano, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2009, 131, 15576.
[2] M. Melucci, E. Treossi, L. Ortolani, G. Giambastiani, V. Morandi, P. Klar, C. Casiraghi, P. Samorì, and V. Palermo, J. Mater. Chem., 2010, 20, 9052.
[3] J.M. Mativetsky, E. Treossi, E. Orgiu, M. Melucci, G.P. Veronese, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2010, 132, 14130.
[4] A. Liscio, G.P. Veronese, E. Treossi, F. Suriano, F. Rossella, V. Bellani, R. Rizzoli, P. Samorì, and V. Palermo, J. Mater. Chem., 2011, 21, 2924.
[5] J.M. Mativetsky, A. Liscio, E. Treossi, E. Orgiu, A. Zanelli, P. Samorì, and V. Palermo, J. Am. Chem. Soc., 2011, 133, 14320.
Abstract:
Today, third generation networks are consolidated realities, and user expectations for new applications and services are becoming higher and higher. Therefore, new systems and technologies are necessary to meet market needs and user requirements. This has driven the development of fourth generation networks. "Wireless networks for the fourth generation" is the expression used to describe the next step in wireless communications. There is no formal definition of what these fourth generation networks are; however, we can say that the next generation networks will be based on the coexistence of heterogeneous networks, on the integration with the existing radio access networks (e.g. GPRS, UMTS, WiFi, ...) and, in particular, on new emerging architectures that are gaining more and more relevance, such as Wireless Ad Hoc and Sensor Networks (WASN). Thanks to their characteristics, fourth generation wireless systems will be able to offer custom-made solutions and applications personalized according to the user requirements; they will offer all types of services at an affordable cost, and solutions characterized by flexibility, scalability and reconfigurability. This PhD work has been focused on WASNs, self-configuring networks which are not based on a fixed infrastructure but are infrastructure-less: devices have to automatically form the network in the initial phase and maintain it through reconfiguration procedures (if node mobility, energy drain, etc. cause disconnections). The main part of the PhD activity has been focused on an analytical study of connectivity models for wireless ad hoc and sensor networks; nevertheless, a small part of my work was experimental. Both the theoretical and the experimental activities have had a common aim, related to the performance evaluation of WASNs.
Concerning the theoretical analysis, the objective of the connectivity studies has been the evaluation of models for interference estimation. This is due to the fact that interference is the most important cause of performance degradation in WASNs. As a consequence, it is very important to find an accurate model that allows its investigation, and I have tried to obtain a model as realistic and general as possible, in particular for the evaluation of the interference coming from bounded interfering areas (i.e. a WiFi hot spot, a wireless-covered research laboratory, ...). On the other hand, the experimental activity has led to Throughput and Packet Error Rate measurements on a real IEEE 802.15.4 Wireless Sensor Network.
Abstract:
The innovative approach of this thesis to bicycle planning consists in integrating the guidelines for drafting a biciplan with new aspects, methodologies and tools, in order to make the programming of interventions more effective. The limits of the biciplan lie in the planning and monitoring phases; therefore, in Chapter 1 the differences between the American standards (AASHTO) and the Italian regulation (D.P.R. 557/99) are examined. Chapter 2 analyses the indicators used in the monitoring phase and their evolution up to the definition of the current indices for determining the LOS of cycling infrastructure: BLOS and BCI. The analysis is integrated with the new applications of these indices and with the study of the LOS of the HCM 2010. BCI and BISI were applied to the Bologna network to solve planning problems and to understand whether transferability problems existed. The indices analysed consider only the supply side of the cycling transport system; an assessment of flows is missing, which would be needed to verify the effectiveness of policies. Chapter 3 is therefore dedicated to the methodology for flow monitoring, using common traffic counters for vehicular flow measurements. Monitoring provides information on the number of passages, peak periods, the existence of preferred routes, and the influence of weather conditions, all useful to designers; historical data series can be built to track the evolution of cycling mobility and to identify critical points of the infrastructure. The effectiveness of bicycle planning is linked to the degree of user satisfaction and to the attractiveness of the infrastructure, so the designer must know the elements that influence cyclists' choices.
Chapter 4 analyses the techniques and studies on cyclists' route choice, and the pilot study carried out in Bologna to define the variables that influence cyclists' choices and their weights.
Abstract:
In the last few years, the vision of our connected and intelligent information society has evolved to embrace novel technological and research trends. The diffusion of ubiquitous mobile connectivity and advanced handheld portable devices amplified the importance of the Internet as the communication backbone for the delivery of services and data. The diffusion of mobile and pervasive computing devices, featuring advanced sensing technologies and processing capabilities, triggered the adoption of innovative interaction paradigms: touch-responsive surfaces, tangible interfaces and gesture or voice recognition are finally entering our homes and workplaces. We are experiencing the proliferation of smart objects and sensor networks, embedded in our daily living and interconnected through the Internet. This ubiquitous network of always-available interconnected devices is enabling new applications and services, ranging from enhancements to home and office environments to remote healthcare assistance and the birth of the smart environment. This work will present some evolutions in the hardware and software development of embedded systems and sensor networks. Different hardware solutions will be introduced, ranging from smart objects for interaction to advanced inertial sensor nodes for motion tracking, focusing on system-level design. They will be accompanied by the study of innovative data processing algorithms developed and optimized to run on board the embedded devices. Gesture recognition, orientation estimation and data reconstruction techniques for sensor networks will be introduced and implemented, with the goal of maximizing the tradeoff between performance and energy efficiency. Experimental results will provide an evaluation of the accuracy of the presented methods and validate the efficiency of the proposed embedded systems.
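As an illustration of the kind of lightweight on-board processing mentioned above, here is a minimal sketch (not the thesis's actual algorithm) of a complementary filter, a classic low-cost orientation estimator for inertial sensor nodes: it blends the integrated gyroscope rate (smooth but drifting) with the accelerometer-derived tilt angle (noisy but drift-free).

```python
def complementary_filter(acc_angles, gyro_rates, dt, alpha=0.98):
    """Fuse accelerometer tilt angles (rad) with gyroscope rates (rad/s).

    alpha close to 1 trusts the short-term gyroscope integration;
    (1 - alpha) pulls the estimate toward the drift-free accelerometer.
    """
    theta = acc_angles[0]            # initialize from the accelerometer
    estimates = []
    for acc, gyro in zip(acc_angles, gyro_rates):
        # integrate the rate, then correct with the absolute measurement
        theta = alpha * (theta + gyro * dt) + (1 - alpha) * acc
        estimates.append(theta)
    return estimates

# steady tilt of 0.5 rad, no rotation: the estimate should hold at 0.5
est = complementary_filter([0.5] * 100, [0.0] * 100, dt=0.01)
```

The single multiply-accumulate per sample is what makes this family of filters attractive for energy-constrained microcontrollers, compared with a full Kalman filter.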
Abstract:
Coastal sand dunes represent a resource, first of all in terms of defence against storm waves and saltwater intrusion; moreover, these morphological elements constitute a unique ecosystem of transition between the sea and the land environment. Research on dune systems has been a strong part of coastal science since the last century. Nowadays this branch has assumed even more importance, for two reasons: on one side, the advent of brand new technologies, especially related to Remote Sensing, has expanded researchers' possibilities; on the other side, the intense urbanization of recent decades has strongly limited the possibilities of dune development and fragmented what remained from the last century. This is particularly true in the Ravenna area, where industrialization, combined with the tourist economy and intense subsidence, has left only a few residual dune ridges still active. In this work three different foredune ridges along the Ravenna coast have been studied with Laser Scanner technology. This research was not limited to analysing volumetric or spatial differences, but also sought new ways and new features to monitor this environment. Moreover, the author planned a series of tests to validate data from the Terrestrial Laser Scanner (TLS), with the additional aim of finalizing a methodology to test 3D survey accuracy. Data acquired by TLS were then applied, on one hand, to test some brand new applications, such as the Digital Shoreline Analysis System (DSAS) and Computational Fluid Dynamics (CFD), to prove their efficacy in this field; on the other hand, the author used TLS data to look for correlations with meteorological indices (Forcing Factors) linked to sea and wind (Fryberger's method), applying statistical tools such as Principal Component Analysis (PCA).
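The abstract mentions applying Principal Component Analysis to relate TLS-derived measurements to forcing factors. As a toy illustration (synthetic numbers, not the thesis's data), a two-variable PCA can be computed in closed form from the 2x2 covariance matrix:

```python
import math

def pca_2d(xs, ys):
    """Closed-form PCA for two variables: eigenvalues of the 2x2
    covariance matrix and the variance fraction of the first component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]]
    mean = (sxx + syy) / 2
    half = math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    lam1, lam2 = mean + half, mean - half          # principal variances
    return lam1, lam2, lam1 / (lam1 + lam2)

# synthetic example: a wind-energy index vs. dune volume change,
# strongly correlated, so PC1 should capture almost all the variance
wind = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
dvol = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9, 14.2, 15.8]
lam1, lam2, frac = pca_2d(wind, dvol)
```

A first-component variance fraction close to 1 is the PCA signature of a single dominant forcing mode, which is the kind of structure such a correlation study looks for.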
Abstract:
Organic molecular semiconductors are the subject of intense research for their crucial role as key components of new-generation low-cost, flexible, and large-area electronic devices such as displays, thin-film transistors, solar cells, sensors and logic circuits. In particular, small-molecule thienoimide (TI) based materials are emerging as novel multifunctional materials combining good processability with ambipolar or n-type charge transport and electroluminescence in the solid state, thus enabling the fabrication of integrated devices like organic field-effect transistors (OFETs) and organic light-emitting transistors (OLETs). Given this peculiar combination of characteristics, they also constitute ideal substrates for fundamental studies on structure-property relationships in multifunctional molecular systems. In this scenario, this thesis work is focused on the synthesis of new thienoimide-based materials with tunable optical, packing, morphology, charge transport and electroluminescence properties obtained by fine molecular tailoring, thus optimizing their performance in devices as well as investigating and enabling new applications. An investigation of their structure-property relationships has been carried out; in particular, the effects of changing the π-conjugated core (heterocycles, length) and the alkyl end chains (shape, length) have been studied, obtaining materials with enhanced electron transport capability and electroluminescence suitable for the realization of OFETs and single-layer OLETs. Moreover, control of the polymorphic behaviour characterizing thienoimide materials has been achieved through synthetic and post-synthetic methodologies, developing multifunctional materials from a single polymorphic compound.
Finally, with the aim of synthesizing highly pure materials, simplifying the purification steps and avoiding organometallic residues, procedures based on direct arylation reactions replacing conventional cross-couplings have been investigated and applied to different classes of molecules, bearing thienoimidic cores or ends, as well as to thiophene and anthracene derivatives, validating this approach as a clean alternative for the synthesis of several molecular materials.
Abstract:
This thesis is focused on Smart Grid applications in medium-voltage distribution networks. For the development of new applications, the availability of simulation tools able to model the dynamic behaviour of both the power system and the communication network is useful. Such a co-simulation environment would allow assessing the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and determining the range of control schemes that different communication technologies can support. For this reason, a co-simulation platform is presented that has been built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyze a coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in distribution networks. This thesis focuses on a control structure based on the use of phasor measurement units (PMUs). In order to limit the required reinforcements of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study focuses on leaderless multi-agent system (MAS) schemes that do not assign special coordinating roles to specific agents. Leaderless MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent. Moreover, leaderless MAS are expected to be less affected by the limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks.
Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on the medium-voltage distribution network caused by the concurrent connection of electric vehicles.
Abstract:
Big data are reshaping the way we interact with technology, thus fostering new applications that increase the safety assessment of foods. An extraordinary amount of information is analysed using machine learning approaches aimed at detecting the existence, or predicting the likelihood, of future risks. Food business operators have to share the results of these analyses when applying to place regulated products on the market, whereas agri-food safety agencies (including the European Food Safety Authority) are exploring new avenues to increase the accuracy of their evaluations by processing Big data. Such an informational endowment brings with it opportunities and risks correlated to the extraction of meaningful inferences from data. However, conflicting interests and tensions among the involved entities - the industry, food safety agencies, and consumers - hinder the finding of shared methods to steer the processing of Big data in a sound, transparent and trustworthy way. A recent reform of the EU sectoral legislation, the lack of trust and the presence of a considerable number of stakeholders highlight the need for ethical contributions aimed at steering the development and deployment of Big data applications. Moreover, the Artificial Intelligence guidelines and charters published by European Union institutions and Member States have to be discussed in light of applied contexts, including the one at stake. This thesis aims to contribute to these goals by discussing which principles should be put forward when processing Big data in the context of agri-food safety risk assessment. The research focuses on two intertwined topics - data ownership and data governance - by evaluating how the regulatory framework addresses the challenges raised by Big data analysis in these domains. The outcome of the project is a tentative Roadmap aimed at identifying the principles to be observed when processing Big data in this domain and their possible implementations.
Abstract:
Non-linear effects are responsible for peculiar phenomena in the dynamics of charged particles in circular accelerators. Recently, they have been used to propose novel beam manipulations in which the transverse beam distribution is modified in a controlled way, to fulfil the constraints posed by new applications. One example is the resonant beam splitting used at CERN for Multi-Turn Extraction (MTE), to transfer proton beams from the PS to the SPS. The theoretical description of these effects relies on the formulation of the particle dynamics in terms of Hamiltonian systems and symplectic maps, and on the theory of adiabatic invariance and resonant separatrix crossing. Close to a resonance, new stable regions and new separatrices appear in the phase space. As non-linear effects do not preserve the Courant-Snyder invariant, it is possible for a particle to cross a separatrix, changing the value of its adiabatic invariant. This process opens the path to new beam manipulations. This thesis deals with various possible effects that can be used to shape the transverse beam dynamics, using 2D and 4D models of particle motion. We show the possibility of splitting a beam using a resonant external exciter, or combining its action with an MTE-like tune modulation close to a resonance. Non-linear effects can also be used to cool a beam by acting on its transverse distribution. We discuss the case of an annular beam distribution, showing that the emittance can be reduced by modulating the amplitude and frequency of a resonant oscillating dipole. We then consider 4D models where, close to a resonance, motion in the two transverse planes is coupled. This is exploited to operate on the transverse emittances with a 2D resonance crossing. Depending on the resonance, the result is an emittance exchange between the two planes, or an emittance sharing. These phenomena are described and understood in terms of adiabatic invariance theory.
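The simplest non-trivial instance of the symplectic one-turn maps mentioned above is the Hénon map: a linear rotation (the lattice) composed with a single sextupole-like kick. The sketch below, with purely illustrative parameters (not a model of the actual PS lattice), iterates it and checks that a small-amplitude orbit stays bounded while the nonlinearity distorts the phase space.

```python
import math

def henon_map(x, p, nu):
    """One turn of the 2D Henon map: rotation by 2*pi*nu composed with a
    sextupole-like kick p -> p + x**2. The Jacobian determinant is 1 for
    any x, so the map is area-preserving (symplectic), as required for
    Hamiltonian single-particle dynamics."""
    c, s = math.cos(2 * math.pi * nu), math.sin(2 * math.pi * nu)
    x_new = c * x + s * (p + x * x)
    p_new = -s * x + c * (p + x * x)
    return x_new, p_new

def track(x0, p0, nu, turns):
    """Track one particle; return the orbit as a list of (x, p) pairs."""
    orbit = [(x0, p0)]
    x, p = x0, p0
    for _ in range(turns):
        x, p = henon_map(x, p, nu)
        orbit.append((x, p))
    return orbit

# a small-amplitude particle at a tune away from low-order resonances
orbit = track(0.1, 0.0, nu=0.21, turns=1000)
max_amp = max(abs(x) for x, _ in orbit)
```

Moving the tune nu onto a resonance (e.g. close to 1/4) makes stable islands appear around the origin; slowly modulating nu across the resonance is the mechanism behind MTE-like beam splitting.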
Abstract:
The fourth industrial revolution is paving the way for Industrial Internet of Things applications where industrial assets (e.g., robotic arms, valves, pistons) are equipped with a large number of wireless devices (i.e., microcontroller boards that embed sensors and actuators) to enable a plethora of new applications, such as analytics, diagnostics, monitoring, as well as supervisory and safety control use cases. Nevertheless, current wireless technologies, such as Wi-Fi, Bluetooth, and even private 5G networks, cannot fulfill all the requirements set by the Industry 4.0 paradigm, thus opening up new 6G-oriented research trends, such as the use of THz frequencies. In light of the above, this thesis (i) provides a broad overview of the main use cases, requirements, and key enabling wireless technologies foreseen by the fourth industrial revolution, and (ii) proposes innovative contributions, both theoretical and empirical, to enhance the performance of current and future wireless technologies at different levels of the protocol stack. In particular, at the physical layer, signal processing techniques are exploited to analyze two multiplexing schemes, namely Affine Frequency Division Multiplexing and Orthogonal Chirp Division Multiplexing, which seem promising for high-frequency wireless communications. At the medium access layer, three protocols for intra-machine communications are proposed, where one is based on LoRa at 2.4 GHz and the others work in the THz band. Different scheduling algorithms for private industrial 5G networks are compared, and two main proposals are described, i.e., a decentralized scheme that leverages machine learning techniques to better address aperiodic traffic patterns, and a centralized contention-based design that serves a federated learning industrial application. Results are provided in terms of numerical evaluations, simulation results, and real-world experiments.
Several improvements over the state of the art were obtained, and the description of up-and-running testbeds demonstrates the feasibility of some of the theoretical concepts in a real industrial plant.
Abstract:
This work reports on two different projects that were carried out during the three-year Doctor of Philosophy course. In the first year, a project regarding a Capacitive Pressure Sensor Array for Aerodynamic Applications was developed in the Applied Aerodynamics research team of the Second Faculty of Engineering, University of Bologna, Forlì, Italy, in collaboration with the ARCES laboratories of the same university. Capacitive pressure sensors were designed and fabricated, investigating theoretically and experimentally the sensor's mechanical and electrical behaviour by means of finite element method simulations and wind tunnel tests. During the design phase, the sensor figures of merit were considered and evaluated for specific aerodynamic applications. The aim of this work is the production of low-cost MEMS-alternative devices suitable for a sensor network to be implemented in an air data system. The last two years were dedicated to a project regarding a Wireless Pressure Sensor Network for Nautical Applications. The aim of the developed sensor network is to sense the weak pressure field acting on the sail plan of a full-batten sail by means of instrumented battens, providing a real-time differential pressure map over the entire sail surface. The wireless sensor network and the sensing unit were designed, fabricated and tested in the faculty laboratories. A static non-linear coupled mechanical-electrostatic simulation has been developed to predict the pressure-versus-capacitance static characteristic suitable for the transduction process and to tune the geometry of the transducer to reach the required resolution, sensitivity and time response over the appropriate full-scale pressure input. A time-dependent viscoelastic error model has been inferred and developed by means of experimental data in order to model, predict and reduce the inaccuracy bound due to the viscoelastic phenomena affecting the Mylar® polyester film used for the sensor diaphragm.
The developments of the two above-mentioned subjects are strictly related but presented separately in this work.
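The pressure-to-capacitance characteristic above was obtained from a coupled non-linear simulation; as a crude back-of-the-envelope sketch of the same trend (textbook small-deflection plate theory with a parallel-plate approximation at the centre deflection, and illustrative Mylar-like numbers rather than the thesis's actual transducer geometry):

```python
import math

EPS0 = 8.854e-12          # vacuum permittivity, F/m

def capacitance_vs_pressure(p, a=5e-3, t=50e-6, gap=100e-6,
                            E=4.0e9, nu=0.38):
    """Approximate C(p) for a clamped circular diaphragm pressure sensor.

    Small-deflection plate theory gives the centre deflection
        w0 = p * a**4 / (64 * D),  with  D = E * t**3 / (12 * (1 - nu**2)),
    and the capacitance is taken as a parallel plate at the reduced gap.
    Radius a, thickness t, gap, and the Mylar-like E and nu are all
    illustrative assumptions.
    """
    D = E * t ** 3 / (12 * (1 - nu ** 2))     # flexural rigidity, N*m
    w0 = p * a ** 4 / (64 * D)                 # centre deflection, m
    area = math.pi * a ** 2
    return EPS0 * area / (gap - w0)

c_rest = capacitance_vs_pressure(0.0)          # zero differential pressure
c_load = capacitance_vs_pressure(200.0)        # 200 Pa, a weak sail-like load
```

Capacitance grows monotonically with pressure as the diaphragm closes the gap; a real design must additionally account for the viscoelastic creep of the Mylar film, which is exactly the error source the time-dependent model described above addresses.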
Abstract:
The subject of this thesis is multicolour bioluminescence analysis and how it can provide new tools for drug discovery and development. The mechanism of colour tuning in bioluminescent reactions is not yet fully understood, but it is the object of intense research and several hypotheses have been generated. In the past decade, key residues in the active site of the enzyme, or on the surface surrounding the active site, have been identified as responsible for the different colour emissions. Moreover, since the bioluminescence reaction strictly depends on the interaction between the enzyme and its substrate D-luciferin, modification of the substrate can also lead to a different emission spectrum. In recent years, firefly luciferase and other luciferases have undergone mutagenesis in order to obtain mutants with different emission characteristics. Thanks to these new discoveries in the bioluminescence field, multicolour luciferases can nowadays be employed in bioanalysis for assay development and imaging purposes. The use of multicolour bioluminescent enzymes has expanded the potential of a range of applications in vitro and in vivo: multiple analyses and more information can be obtained from the same analytical session, saving cost and time. This thesis focuses on several applications of multicolour bioluminescence for high-throughput screening and in vivo imaging. Multicolour luciferases can be employed as new tools for drug discovery and development, and some examples are provided in the different chapters. New red-emitting, codon-optimized luciferases have been demonstrated to be improved tools for bioluminescence imaging in small animals, and the possibility of combining red and green luciferases for BLI has been achieved, even if some aspects of the methodology remain challenging and need further improvement. In vivo bioluminescence imaging has progressed rapidly since its first application no more than 15 years ago, and it is becoming an indispensable tool in pharmacological research.
At the same time, the development of more sensitive microscopes and low-light imagers for better visualization and quantification of multicolour signals would boost research and discoveries in the life sciences in general, and in drug discovery and development in particular.