55 results for next generation sequencing
Abstract:
Over the last decade, graphene and related materials (GRM) have drawn significant interest and resources for their development into the next generation of composite materials. This is because these nanoparticles can act as reinforcing additives, imparting considerable mechanical property increases while also embedding multi-functional advantages in the host matrix. Because graphene and 2D materials are still in their early stages, the relative maturity of different types of composite systems varies. As a result, certain nanocomposite systems are already commercially accessible, while others are not yet sufficiently developed to enter the market. A substantial emphasis has been placed on developing thermoplastic and thermosetting materials that combine a variety of mechanical and functional qualities. These include higher strength and stiffness, increased thermal and electrical conductivity, improved barrier properties, fire retardancy, and others, with the ultimate goal of providing multifunctionality to already employed composites. The work presented in this thesis investigates the use and benefits that GRM could bring to composites for a variety of applications, with the goal of realizing multifunctional components with improved properties that lead to lighter structures and, as a result, energy and cost savings and reduced environmental pollution. In particular, we worked on the following topics: • Benchmarking of commercial GRM-based masterbatches; • GRM coatings for water-uptake reduction; • GRM as a thermo-electrical anti-icing/de-icing system; • GRM for out-of-oven curing of composites.
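The reinforcement effect summarized above is often estimated, at first order, with the Halpin-Tsai micromechanical model for platelet fillers. The sketch below uses illustrative modulus values (epoxy-like matrix, graphene-like filler) chosen only for demonstration; they are not data from the thesis.

```python
def halpin_tsai(E_m, E_f, V_f, xi):
    """Halpin-Tsai estimate of the composite Young's modulus E_c.

    E_m, E_f : matrix and filler moduli (same units)
    V_f      : filler volume fraction (0..1)
    xi       : shape factor (larger for high-aspect-ratio platelets)
    """
    eta = (E_f / E_m - 1.0) / (E_f / E_m + xi)
    return E_m * (1.0 + xi * eta * V_f) / (1.0 - eta * V_f)

# Illustrative values: ~3 GPa matrix, ~1000 GPa filler, 1 vol% loading.
E_c = halpin_tsai(E_m=3.0, E_f=1000.0, V_f=0.01, xi=100.0)
```

Even a 1 vol% loading of a stiff, high-aspect-ratio filler predicts a large stiffness gain in this toy case, which is why GRM are attractive as reinforcing additives.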
Abstract:
The ambitious goals of increasing the efficiency, performance and power densities of transportation drives cannot be met with compromises in motor reliability. For insulation specialists the challenge will be critical, as the use of wide-bandgap converters (WBG, based on SiC and GaN switches) and the higher operating voltages expected for next-generation drives will raise the electrical stresses to unprecedented levels. The DC bus in aircraft is expected to reach 800 V (split +/-400 V) and beyond, driven by the urban air mobility sector and the need for electrification of electro-mechanical/electro-hydraulic actuators (an essential part of the "More Electric Aircraft" concept). Simultaneously, the DC bus in electric vehicle (EV) traction motors is anticipated to increase up to 1200 V very soon. The electrical insulation system is one of the most delicate parts of the machine in terms of failure probability. In particular, the appearance of partial discharges (PD) is disruptive to the reliability of the drive, especially under fast repetitive transients. Extensive experimental activity has been performed to extend the body of knowledge on PD inception and endurance under PD activity, and to explore and identify new phenomena undermining reliability. The focus has been concentrated on the impact of the WBG-converter-produced waveforms and the environmental conditions typical of the aeronautical sector on insulation models. Particular effort was put into the analysis at the reduced pressures typical of aircraft cruise-altitude operation. The results obtained, after a critical discussion, have been used to suggest a coordination between the insulation PD inception voltage and the converter stresses, and to propose an improved qualification procedure based on the existing IEC 60034-18-41 standard.
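The pressure dependence of discharge inception that motivates the reduced-pressure tests is classically captured by Paschen's law. The constants below are common textbook approximations for air, used purely as an illustration; the thesis works with measured PD inception data, not this formula.

```python
import math

def paschen_breakdown_voltage(pd, A=15.0, B=365.0, gamma=0.01):
    """Paschen breakdown voltage (V) for a uniform gap.

    pd    : pressure * gap distance in Torr*cm
    A, B  : gas constants (textbook values for air, illustrative)
    gamma : secondary-electron-emission coefficient (illustrative)
    Valid only where A*pd exceeds ln(1 + 1/gamma).
    """
    k = math.log(math.log(1.0 + 1.0 / gamma))
    return B * pd / (math.log(A * pd) - k)
```

Lowering the pressure (as at cruise altitude) moves a fixed gap toward the Paschen minimum, so the same insulation geometry can discharge at a markedly lower voltage, which is why altitude conditions stress the insulation system.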
Abstract:
Leishmaniasis is one of the major parasitic diseases among neglected tropical diseases, with a high rate of morbidity and mortality. Human migration and climate change have spread the disease from limited endemic areas all over the world, also reaching regions in Southern Europe, and causing significant health and economic burden. The currently available treatments are far from ideal due to host toxicity, elevated cost, and increasing rates of drug resistance. Safer and more effective drugs are thus urgently required. Nevertheless, the identification of new chemical entities for leishmaniasis has proven to be incredibly hard, a difficulty exacerbated by the scarcity of well-validated targets. Trypanothione reductase (TR) represents one robustly validated target in Leishmania that fulfils most of the requirements for a good drug target. However, due to its large and featureless active site, TR is considered extremely challenging and almost undruggable by small molecules. This scenario advocates the development of new chemical entities by unlocking new modalities for leishmaniasis drug discovery. The classical toolbox for drug discovery has enormously expanded in the last decade, and medicinal chemists can now strategize across a variety of new chemical modalities and a vast chemical space to efficiently modulate challenging targets and provide effective treatments. Among others, Targeted Protein Degradation (TPD) is an emerging strategy that uses small molecules to hijack endogenous proteolysis systems to degrade disease-relevant proteins and thus reduce their abundance in the cell. Based on these considerations, this thesis aimed to develop new strategies for leishmaniasis drug discovery while embracing novel chemical modalities and navigating the chemical space by chasing unprecedented chemotypes. This has been achieved by four complementary projects.
We believe that these next-generation chemical modalities for leishmaniasis will play an important role in what was previously thought to be a drug discovery landscape dominated by small molecules.
Abstract:
The Cherenkov Telescope Array (CTA) will be the next-generation ground-based observatory to study the universe in the very-high-energy domain. The observatory will rely on a Science Alert Generation (SAG) system to analyze the real-time data from the telescopes and generate science alerts. The SAG system will play a crucial role in the search and follow-up of transients from external alerts, enabling multi-wavelength and multi-messenger collaborations. It will maximize the potential for the detection of the rarest phenomena, such as gamma-ray bursts (GRBs), which are the science case for this study. This study presents an anomaly detection method based on deep learning for detecting gamma-ray burst events in real-time. The performance of the proposed method is evaluated and compared against the standard Li & Ma technique in two use cases, serendipitous discoveries and follow-up observations, using short exposure times. The method shows promising results in detecting GRBs and is flexible enough to allow real-time search for transient events on multiple time scales. The method assumes neither background nor source models and does not require a minimum number of photon counts to perform the analysis, making it well-suited for real-time analysis. Future improvements involve further tests, relaxing some of the assumptions made in this study, as well as post-trials correction of the detection significance. Moreover, the ability to detect other transient classes in different scenarios must be investigated for completeness. The system can be integrated within the SAG system of CTA and deployed on the onsite computing clusters. This would provide valuable insights into the method's performance in a real-world setting and be another valuable tool for discovering new transient events in real-time.
Overall, this study makes a significant contribution to the field of astrophysics by demonstrating the effectiveness of deep learning-based anomaly detection techniques for real-time source detection in gamma-ray astronomy.
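For reference, the Li & Ma benchmark that the deep-learning method is compared against is the standard significance formula of Li & Ma (1983, Eq. 17); a direct transcription with illustrative (non-CTA) counts:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983, Eq. 17) detection significance in Gaussian sigmas.

    n_on  : counts in the on-source region
    n_off : counts in the off-source (background) region
    alpha : ratio of on-source to off-source exposure
    """
    total = n_on + n_off
    term_on = n_on * math.log((1 + alpha) / alpha * (n_on / total))
    term_off = n_off * math.log((1 + alpha) * (n_off / total))
    # The likelihood-ratio sum is non-negative; the sign of the excess
    # (n_on - alpha * n_off) tells whether it is a detection or a deficit.
    return math.sqrt(2.0 * (term_on + term_off))

# Illustrative counts only:
sigma = li_ma_significance(n_on=130, n_off=100, alpha=1.0)
```

Note that the formula needs actual photon counts in both regions, which is exactly the constraint the proposed model-free method relaxes for short exposures.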
Abstract:
The recent trend of moving Cloud Computing capabilities to the Edge of the network is reshaping how applications and their middleware support are designed, deployed, and operated. This new model envisions a continuum of virtual resources between the traditional cloud and the network edge, which is potentially more suitable to meet the heterogeneous Quality of Service (QoS) requirements of diverse application domains and next-generation applications. Several classes of advanced Internet of Things (IoT) systems, e.g., in the industrial manufacturing domain, are expected to serve a wide range of applications with heterogeneous QoS requirements, and call for QoS management systems to guarantee/control performance indicators even in the presence of real-world factors such as limited bandwidth and concurrent virtual resource utilization. The present dissertation proposes a comprehensive QoS-aware architecture that addresses the challenges of integrating cloud infrastructure with edge nodes in IoT applications. The architecture provides end-to-end QoS support by incorporating several components for managing physical and virtual resources. The proposed architecture features: i) a multilevel middleware for resolving the convergence between Operational Technology (OT) and Information Technology (IT), ii) an end-to-end QoS management approach compliant with the Time-Sensitive Networking (TSN) standard, iii) new approaches for virtualized network environments, such as running TSN-based applications under Ultra-low Latency (ULL) constraints in virtual and 5G environments, and iv) an accelerated and deterministic container overlay network architecture.
Additionally, the QoS-aware architecture includes two novel middlewares: i) a middleware that transparently integrates multiple acceleration technologies in heterogeneous Edge contexts and ii) a QoS-aware middleware for Serverless platforms that leverages coordination of various QoS mechanisms and virtualized Function-as-a-Service (FaaS) invocation stack to manage end-to-end QoS metrics. Finally, all architecture components were tested and evaluated by leveraging realistic testbeds, demonstrating the efficacy of the proposed solutions.
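The kind of bandwidth guarantee TSN provides can be conveyed with a toy credit-based shaper in the spirit of IEEE 802.1Qav: a shaped traffic class may transmit only while its credit is non-negative, which spaces its frames and reserves link time for other classes. This is a didactic sketch, not a component of the dissertation's middleware; all parameters are illustrative.

```python
def credit_based_shaper(n_frames, idle_slope=0.5, send_slope=-0.5,
                        frame_time=1.0):
    """Toy credit-based shaper: returns the departure time of each frame.

    Credit accrues at idle_slope while the class waits and is consumed
    at send_slope (negative) while it transmits; a frame may start only
    when credit >= 0. Units are arbitrary and illustrative.
    """
    time, credit, departures = 0.0, 0.0, []
    for _ in range(n_frames):
        if credit < 0:                      # wait until credit recovers to 0
            time += -credit / idle_slope
            credit = 0.0
        time += frame_time                  # transmit one frame
        credit += send_slope * frame_time
        departures.append(time)
    return departures
```

With equal idle and send slopes, three back-to-back frames depart at times 1, 3, and 5: the shaped class uses at most half the link, leaving deterministic gaps for competing traffic.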
Abstract:
DUNE is a next-generation long-baseline neutrino oscillation experiment. It aims to measure the still unknown $ \delta_{CP} $ violation phase and the sign of $ \Delta m_{13}^2 $, which defines the neutrino mass ordering. DUNE will exploit a Far Detector composed of four multi-kiloton LArTPCs, and a Near Detector (ND) complex located close to the neutrino source at Fermilab. The SAND detector at the ND complex is designed to perform on-axis beam monitoring, constrain uncertainties in the oscillation analysis, and perform precision neutrino physics measurements. SAND includes a 0.6 T superconducting magnet, an electromagnetic calorimeter, a 1-ton liquid-argon detector - GRAIN - and a modular, low-density straw tube target tracker system. GRAIN is an innovative LAr detector in which neutrino interactions can be reconstructed using only the LAr scintillation light, imaged by an optical system based on Coded Aperture masks and lenses - a novel approach never used before in particle physics applications. In this thesis, a first evaluation of GRAIN track reconstruction and calorimetric capabilities was obtained with an optical system based on Coded Aperture cameras. A simulation of $\nu_\mu + Ar$ interactions with the energy spectrum expected at the future Fermilab Long Baseline Neutrino Facility (LBNF) was performed. The performance of SAND was evaluated, combining the information provided by all its sub-detectors, on the selection of the $ \nu_\mu + Ar \to \mu^- + p + X $ sample and on the neutrino energy reconstruction.
Abstract:
In this thesis, the focus is on utilizing metasurfaces to improve radiation characteristics of planar structures. The study encompasses various aspects of metasurface applications, including enhancing antenna radiation characteristics and manipulating electromagnetic (EM) waves, such as polarization conversion and anomalous reflection. The thesis introduces the design of a single-port antenna with dual-mode operation, integrating metasurfaces. This antenna serves as the front-end for a next-generation tag, functioning as a position sensor with identification and energy harvesting capabilities. It operates in the lower European Ultra-Wideband (UWB) frequency range for communication/localization and the UHF band for wireless energy reception. The design aims for a low-profile stack-up that remains unaffected by background materials. Researchers worldwide are drawn to metasurfaces due to their EM wave manipulation capabilities. The thesis also demonstrates how a High-Impedance Surface (HIS) can enhance the antenna's versatility through metasurface application, including conformal design using 3D-printing technology, ensuring adaptability for various deformation and tracking/powering scenarios. Additionally, the thesis explores two distinct metasurface applications. One involves designing an angularly stable super-wideband Circular Polarization Converter (CPC) operating from 11 to 35 GHz with an impressive relative impedance bandwidth of 104.3%. The CPC shows a stable response even at oblique incidences up to 40 degrees, with a Peak Cross-Polarization Ratio (PCR) exceeding 62% across the entire band. The second application focuses on an Intelligent Reflective Surface (IRS) capable of redirecting incoming waves in unconventional directions. Tunability is achieved through an artificially developed ferroelectric material (HfZrO) and distributed capacitive elements (IDC) to fine-tune impedance and phase responses at the meta-atom level.
The IRS demonstrates anomalous reflection for normally incident waves. These innovative applications of metasurfaces offer promising advancements in antenna design, EM wave manipulation, and versatile wireless communication systems.
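The quoted 104.3% relative bandwidth of the CPC follows directly from the standard fractional-bandwidth definition applied to the 11-35 GHz band:

```python
def fractional_bandwidth(f_low, f_high):
    """Fractional (relative) bandwidth in percent: 2*(fh - fl)/(fh + fl) * 100."""
    return 2.0 * (f_high - f_low) / (f_high + f_low) * 100.0

bw = fractional_bandwidth(11.0, 35.0)  # CPC operating band, GHz
```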
Abstract:
The pervasive availability of connected devices in any industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By totally or partially executing closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, XDP). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. That risks undermining the principle of generality that underlies the economy of scale of cloud computing, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. This thesis also explores the design space of the possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario.
A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable to enable the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
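The zero-copy idea behind the agnostic I/O layer can be conveyed, at a much simpler level, with memoryview-based socket I/O from the Python standard library: the send path slices the original buffer without duplicating it, and the receive path writes directly into a preallocated buffer. This toy sketch only illustrates the copy-avoidance concept, not the NAaaS architecture or its APIs.

```python
import socket

def zero_copy_echo(payload: bytes) -> bytes:
    """Send `payload` over a local socket pair without intermediate copies.

    memoryview slices reference the original buffer (no per-send copy),
    and recv_into fills a preallocated bytearray in place.
    Suitable for small payloads that fit in the socket buffer.
    """
    left, right = socket.socketpair()
    try:
        view = memoryview(payload)          # no copy of payload
        while view:
            sent = left.send(view)
            view = view[sent:]              # slicing a view copies nothing
        buf = bytearray(len(payload))
        recv_view, received = memoryview(buf), 0
        while received < len(payload):
            received += right.recv_into(recv_view[received:])
        return bytes(buf)
    finally:
        left.close()
        right.close()
```

Kernel-bypass options such as RDMA or DPDK push the same principle much further, which is why an agnostic API layer is needed to hide their heterogeneous interfaces.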
Abstract:
The rapid progression of biomedical research, coupled with the explosion of scientific literature, has generated an exigent need for efficient and reliable systems of knowledge extraction. This dissertation contends with this challenge through a concentrated investigation of digital health and Artificial Intelligence, and specifically of the potential of Machine Learning and Natural Language Processing (NLP) to expedite systematic literature reviews and refine the knowledge extraction process. The surge of COVID-19 complicated the efforts of scientists, policymakers, and medical professionals in identifying pertinent articles and assessing their scientific validity. This thesis presents a substantial solution in the form of the COKE Project, an initiative that interlaces machine reading with the rigorous protocols of Evidence-Based Medicine to streamline knowledge extraction. In the framework of the COKE ("COVID-19 Knowledge Extraction framework for next-generation discovery science") Project, this thesis aims to underscore the capacity of machine reading to create knowledge graphs from scientific texts. The project is notable for its innovative use of NLP techniques, such as a BERT + bi-LSTM language model. This combination is employed to detect and categorize elements within medical abstracts, thereby enhancing the systematic literature review process. The COKE project's outcomes show that NLP, when used in a judiciously structured manner, can significantly reduce the time and effort required to produce medical guidelines. These findings are particularly salient during times of medical emergency, like the COVID-19 pandemic, when quick and accurate research results are critical.
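As a purely illustrative stand-in for the BERT + bi-LSTM tagger (which requires trained weights), a keyword-based classifier shows the shape of the task: assigning Evidence-Based-Medicine-style categories to sentences of a medical abstract. The categories and trigger keywords below are hypothetical and chosen only for demonstration.

```python
# Hypothetical EBM-style categories and trigger keywords, illustrative only.
CATEGORY_KEYWORDS = {
    "population":   ["patients", "participants", "adults", "cohort"],
    "intervention": ["treated", "administered", "dose", "vaccine"],
    "outcome":      ["mortality", "efficacy", "improved", "reduction"],
}

def tag_sentence(sentence: str) -> str:
    """Assign the category whose keywords occur most often; 'other' if none."""
    words = sentence.lower().split()
    scores = {cat: sum(words.count(k) for k in kws)
              for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "other"
```

A learned model replaces the keyword lists with contextual embeddings (BERT) and sequence modeling (bi-LSTM), but the output - a category per text element, later assembled into a knowledge graph - has the same shape.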
Abstract:
Recent technological advancements have played a key role in seamlessly integrating cloud, edge, and Internet of Things (IoT) technologies, giving rise to the Cloud-to-Thing Continuum paradigm. This cloud model connects many heterogeneous resources that generate large amounts of data and collaborate to deliver next-generation services. While it has the potential to reshape several application domains, the number of connected entities considerably broadens the security attack surface. One of the main problems is the lack of security measures able to adapt to the dynamic and evolving conditions of the Cloud-to-Thing Continuum. To address this challenge, this dissertation proposes novel adaptable security mechanisms. Adaptable security is the capability of security controls, systems, and protocols to dynamically adjust to changing conditions and scenarios. However, since the design and development of novel security mechanisms can be explored from different perspectives and levels, we place our attention on threat modeling and access control. The contributions of the thesis can be summarized as follows. First, we introduce a model-based methodology that secures the design of edge and cyber-physical systems. This solution identifies threats, security controls, and moving target defense techniques based on system features. Then, we focus on access control management. Since access control policies are subject to modifications, we evaluate how they can be efficiently shared among distributed areas, highlighting the effectiveness of distributed ledger technologies. Furthermore, we propose a risk-based authorization middleware, adjusting permissions based on real-time data, and a federated learning framework that enhances trustworthiness by weighting each client's contributions according to the quality of their partial models.
Finally, since authorization revocation is another critical concern, we present an efficient revocation scheme for verifiable credentials in IoT networks, featuring decentralization and requiring minimal storage and computing capabilities. All the mechanisms have been evaluated under different conditions, proving their adaptability to the Cloud-to-Thing Continuum landscape.
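The storage efficiency sought by the revocation scheme can be illustrated with a bitstring status list, in the spirit of W3C status-list credentials, where each issued credential occupies a single bit at a fixed index. This sketch is illustrative and is not the construction proposed in the thesis.

```python
class BitmapRevocationList:
    """One bit per credential: 1 = revoked.

    A million credentials fit in ~122 KiB, which suits
    storage-constrained IoT verifiers.
    """

    def __init__(self, capacity: int):
        self.bits = bytearray((capacity + 7) // 8)

    def revoke(self, index: int) -> None:
        self.bits[index // 8] |= 1 << (index % 8)

    def is_revoked(self, index: int) -> bool:
        return bool(self.bits[index // 8] & (1 << (index % 8)))
```

A verifier only needs the compact bitmap (or a proof about one bit of it) to check a credential's status, rather than a per-credential record.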