10 results for "R: a free software environment for statistical computing and graphics"

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

One of the most serious problems of modern medicine is the growing emergence of antibiotic resistance among pathogenic bacteria. In this circumstance, different and innovative approaches for treating infections caused by multidrug-resistant bacteria are urgently required. Bacteriophage therapy is one of the most fascinating approaches to be taken into account. It consists of the use of bacteriophages, viruses that infect bacteria, to defeat specific bacterial pathogens. Phage therapy is not a new idea: it was widely used around the world in the 1930s and 1940s to treat various infectious diseases, and it is still used in Eastern Europe and the former Soviet Union. Nevertheless, Western scientists largely lost interest in the further use and study of phage therapy and abandoned it after the discovery and spread of antibiotics. The advancement of scientific knowledge in recent years, together with encouraging results from recent animal studies using phages to treat bacterial infections, and above all the urgent need for novel and effective antimicrobials, has prompted additional rigorous research in this field. In particular, in the synthetic biology laboratory of the Department of Life Sciences at the University of Warwick, a novel approach was adopted, starting from the original concept of phage therapy, to study a concrete alternative to antibiotics. The innovative idea of the project is the development of experimental methodologies that allow a programmable synthetic phage system to be engineered using a combination of directed evolution, automation and microfluidics. The main aim is to make “the therapeutics of tomorrow individualized, specific, and self-regulated” (Jaramillo, 2015). In this context, one of the most important key points is bacteriophage quantification.
Therefore, in this research work, a mathematical model describing the complex dynamics of biological systems involving the continuous growth of bacteriophages, modulated by the performance of the host organisms, was implemented as algorithms in a working MATLAB program. The developed program can predict unknown phage concentrations much faster than the classical overnight plaque assay. Moreover, it gives meaning and explanation to the obtained data by making inferences about the parameter set of the model, which is representative of the bacteriophage-host interaction.
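The kind of phage-host dynamics described above can be sketched with a few coupled differential equations. The system below is a generic susceptible/infected/phage model integrated with forward Euler; all equations, names and parameter values are illustrative assumptions for the sketch, not the thesis's actual MATLAB implementation.

```python
# Toy phage-host model (illustrative parameters, not the thesis model).
# S: susceptible bacteria, I: infected bacteria, P: free phages.
r, K = 0.7, 1e7        # host growth rate [1/h], carrying capacity [cells/ml]
k = 1e-7               # phage adsorption rate [ml/h]
d = 1.0                # lysis rate of infected cells [1/h]
burst = 50             # phages released per lysed cell
m = 0.05               # free-phage decay rate [1/h]

S, I, P = 1e6, 0.0, 1e3      # initial concentrations
dt, t_end = 0.001, 6.0       # forward-Euler step [h] and horizon [h]

t = 0.0
while t < t_end:
    infections = k * S * P                       # new infections per hour
    dS = r * S * (1 - S / K) - infections        # logistic growth minus losses
    dI = infections - d * I                      # infected pool turns over by lysis
    dP = burst * d * I - infections - m * P      # bursts minus adsorption and decay
    S, I, P = S + dt * dS, I + dt * dI, P + dt * dP
    t += dt

print(f"final phage titre: {P:.3e} PFU/ml")
```

Fitting the parameter set of such a model to measured titres is what lets a program infer unknown phage concentrations without waiting for an overnight plaque assay.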

Relevance:

100.00%

Publisher:

Abstract:

One of the main practical implications of quantum mechanics is quantum computing, and therefore the quantum computer. Quantum computing (for example, with Shor’s algorithm) challenges the computational hardness assumptions, such as the factoring problem and the discrete logarithm problem, that anchor the security of cryptosystems. The scientific community is therefore studying how to defend cryptography; there are two defense strategies: quantum cryptography (which involves the use of quantum cryptographic algorithms on quantum computers) and post-quantum cryptography (based on classical cryptographic algorithms that are resistant to quantum computers). For example, the National Institute of Standards and Technology (NIST) is collecting and standardizing post-quantum ciphers, as it established DES and AES as symmetric cipher standards in the past. In this thesis an introduction to quantum mechanics is given, in order to discuss quantum computing and analyze Shor’s algorithm. The differences between quantum and post-quantum cryptography are then analyzed. Subsequently, the focus moves to the mathematical problems assumed to be resistant to quantum computers. To conclude, the post-quantum digital signature algorithms selected by NIST are studied and compared with a view to their application in everyday life.
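The number-theoretic reduction that Shor's algorithm exploits can be shown classically on a toy modulus: once the multiplicative order r of a (mod N) is known, gcd(a^(r/2) ± 1, N) often yields a factor. The brute-force order search below stands in for the quantum period-finding step, which is the only part a quantum computer accelerates.

```python
# Classical illustration of the core of Shor's algorithm (toy moduli only;
# the quantum speed-up lies entirely in finding the order r efficiently).
from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r congruent to 1 (mod N); assumes gcd(a, N) == 1."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)            # the random guess already shares a factor
    r = order(a, N)
    if r % 2 == 1:
        return None                 # odd order: retry with another a
    f = gcd(pow(a, r // 2) - 1, N)
    return f if 1 < f < N else None # trivial factor: retry with another a

print(shor_classical(15, 7))  # order of 7 mod 15 is 4; gcd(7**2 - 1, 15) = 3
```

The hardness assumption behind RSA is precisely that no classical algorithm finds r (or the factors) in polynomial time, which is why post-quantum schemes move to different mathematical problems.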

Relevance:

100.00%

Publisher:

Abstract:

In recent years, Deep Learning techniques have been shown to perform well on a large variety of problems in both Computer Vision and Natural Language Processing, reaching and often surpassing the state of the art on many tasks. The rise of deep learning is also revolutionizing the entire field of Machine Learning and Pattern Recognition, pushing forward the concepts of automatic feature extraction and unsupervised learning in general. However, despite its strong success in both science and business, deep learning has its own limitations. It is often questioned whether such techniques are merely a kind of brute-force statistical approach and whether they can only work in the context of High Performance Computing with enormous amounts of data. Another important question is whether they are really biologically inspired, as claimed in certain cases, and whether they can scale well in terms of "intelligence". The dissertation focuses on trying to answer these key questions in the context of Computer Vision and, in particular, Object Recognition, a task that has been heavily revolutionized by recent advances in the field. Practically speaking, these answers are based on an exhaustive comparison between two very different deep learning techniques on the aforementioned task: the Convolutional Neural Network (CNN) and Hierarchical Temporal Memory (HTM). They stand for two different approaches and points of view within the big tent of deep learning and are the best choices for understanding and pointing out the strengths and weaknesses of each. The CNN is considered one of the most classic and powerful supervised methods used today in machine learning and pattern recognition, especially in object recognition. CNNs are well received and accepted by the scientific community and are already deployed in large corporations like Google and Facebook to solve face recognition and image auto-tagging problems.
HTM, on the other hand, is an emerging paradigm and a new, mainly unsupervised method that is more biologically inspired. It tries to gain insights from the computational neuroscience community in order to incorporate concepts like time, context and attention during the learning process, which are typical of the human brain. In the end, the thesis aims to show that in certain cases, with a smaller quantity of data, HTM can outperform a CNN.
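The operation that gives the CNN its name can be written down in a few lines. The sketch below is purely didactic (toy image and a hand-picked filter, not either thesis model): it shows what each learned filter in a convolutional layer computes, namely a sliding dot product over the input.

```python
# A single 2D convolution (no padding, stride 1), the core operation of a CNN.
# Image and kernel are toy values chosen only to illustrate the mechanics.
import numpy as np

def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # dot product of the kernel with the window under it
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25, dtype=float).reshape(5, 5)   # 5x5 grayscale ramp
edge = np.array([[-1., 0., 1.]] * 3)               # crude vertical-edge detector
print(conv2d(image, edge))
```

In a trained CNN the kernels are not hand-picked as here but learned by backpropagation, which is exactly the supervised, data-hungry aspect the thesis contrasts with HTM.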

Relevance:

100.00%

Publisher:

Abstract:

Industry 4.0 refers to the fourth industrial revolution, and at its base we find the digitalization and automation of the assembly line. The whole production process has improved and evolved thanks to advances in networking and AI, which of course include machine learning, cloud computing, IoT, and other technologies that are finally being implemented in the industrial scenario. All these technologies have in common a need for faster, more secure, robust, and reliable communication. One of the many solutions to these demands is the use of mobile communication technologies in the industrial environment, but which technology is better suited to them? Of course, the answer isn’t as simple as it seems. The fourth industrial revolution has unprecedented potential with respect to the previous ones; every factory, enterprise, or company has different network demands, and even within each of these infrastructures the demands may vary by sector or by application. For example, in the health care industry there may be a need for increased bandwidth for the analysis of high-definition videos, or for faster speeds so that analytics can occur in real time; yet another application might require higher security and reliability to protect patients’ data. As seen above, choosing the right technology for the right environment and application involves many considerations, and the ones just stated are but a speck of dust in the overall picture. In this thesis, we investigate a comparison between two of the technologies available for the industrial environment, Wi-Fi 6 and 5G private networks, in the specific case of a steel factory.

Relevance:

100.00%

Publisher:

Abstract:

Information technology (IT) is on the verge of another revolution. Driven by the increasing capabilities and ever declining costs of computing and communications devices, IT is being embedded into a growing range of physical devices linked together through networks and will become ever more pervasive as the component technologies become smaller, faster, and cheaper. [...] These networked systems of embedded computers, referred to as EmNets throughout this report, have the potential to change radically the way people interact with their environment by linking together a range of devices and sensors that will allow information to be collected, shared, and processed in unprecedented ways. [...] The use of EmNets throughout society could well dwarf previous milestones in the information revolution. [...] IT will eventually become an invisible component of almost everything in everyone's surroundings.

As costs fall and the computational capacity of electronic components grows, platforms have proliferated that allow children and engineers alike to develop an idea that cuts across the real world and the virtual one. A collision between two worlds that, until recently, was reserved exclusively for professionals. Objects that can acquire or extend functionality, that allow us to extend our perception of the world and to reassess its limits. Objects connected to the 'network of networks' that share and process data for a new use of information. This thesis sets out to explore the application of software agents to the new platforms of embedded systems and the Internet of Things, technologies that are fairly mature and yet not thoroughly explored. Does it make sense to model an embedded system with agents?
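As a minimal sketch of what "modelling an embedded system with agents" can mean, the toy agent below runs the classic perceive-deliberate-act cycle on a hypothetical temperature sensor. All names, thresholds and readings are invented for illustration; real embedded-agent platforms add messaging, scheduling and persistence on top of this skeleton.

```python
# Minimal perceive-deliberate-act agent for a hypothetical embedded thermostat.
class ThermostatAgent:
    def __init__(self, setpoint):
        self.setpoint = setpoint      # target temperature in Celsius
        self.heater_on = False

    def perceive(self, temperature):
        # in a real device this would read a sensor; here it wraps the input
        return {"temp": temperature}

    def deliberate(self, belief):
        # trivial rule-based deliberation: heat while below the setpoint
        return "heat" if belief["temp"] < self.setpoint else "idle"

    def act(self, action):
        # in a real device this would drive an actuator (e.g. a relay)
        self.heater_on = (action == "heat")
        return self.heater_on

agent = ThermostatAgent(setpoint=20.0)
for reading in [18.5, 19.2, 20.4, 21.0]:   # simulated sensor stream
    agent.act(agent.deliberate(agent.perceive(reading)))
print(agent.heater_on)  # last reading 21.0 is above the setpoint, so False
```

The appeal of the agent abstraction for embedded systems is that each device keeps its own beliefs and decision loop, so coordination between devices becomes message exchange between agents rather than shared state.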

Relevance:

100.00%

Publisher:

Abstract:

The thesis, developed in collaboration between the Systems and Equipment for Energy and Environment team of the University of Bologna and Chalmers University of Technology in Gothenburg, aims to study the benefits resulting from the adoption of a thermal storage system for marine applications. To that purpose, a cruise ship has been considered. The EGO (Energy Greed Optimization) software, developed by the University of Bologna, was used to reach this goal.

Relevance:

100.00%

Publisher:

Abstract:

The main goal of this thesis is to report patterns of perceived safety in the context of airport infrastructure, taking the airport of Bologna as a reference. Many personal and environmental attributes are investigated to paint the profile of the sensitive passenger and to understand why certain factors of the transit environment are so impactful on the individual. The main analyses are based on a 2014-2015 passenger survey involving almost six thousand incoming and outgoing passengers. Other reports are used to complement and support this resource. The analysis is carried out using a combination of Chi-square tests and binary logistic regressions. Findings show that passengers are particularly affected by the perception of the airport’s environment (e.g., the state and maintenance of facilities, the clarity and efficacy of the information system, the functionality of elevators and escalators), but also by the way the passenger reaches the airport and by the quality of security checks. In relation to these results, several suggestions are provided for improving passenger satisfaction with safety. The attention is then focused on security checkpoints and related operations, described on theoretical and technical grounds. We present an example of how to build a proper model of the security check area of Bologna’s airport, with the aim of assessing the present performance of the system and the consequences of potential variations. After a brief introduction to Arena, a widespread simulation software, the existing model is described, pointing out its flaws and limitations. This model is finally updated and changed in order to make it more reliable and more representative of reality. Different scenarios are tested and the results compared using graphs and tables.
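A Chi-square test of independence, one of the two statistical tools the analysis combines, can be computed by hand. The 2x2 table below uses invented counts (not the Bologna survey data) to show how one would test whether perceived safety depends on, say, the access mode to the airport.

```python
# Pearson's chi-square test of independence on a toy contingency table.
# The counts are invented for illustration; they are not the survey data.
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(table):
        for j, observed in enumerate(r):
            expected = rows[i] * cols[j] / total   # under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# passengers feeling safe vs not, split by access mode (hypothetical counts)
table = [[320, 80],    # arrived by car:  safe / not safe
         [210, 140]]   # arrived by bus:  safe / not safe
stat = chi_square(table)
print(f"chi2 = {stat:.2f}")  # compare with 3.84, the 5% critical value at 1 dof
```

A statistic above the critical value rejects independence; the binary logistic regressions then quantify how much each such attribute shifts the odds of a passenger feeling safe.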

Relevance:

100.00%

Publisher:

Abstract:

Jupiter and its moons form a complex dynamical system that includes several phenomena such as tidal interactions, the moons’ librations, and resonances. One of the most interesting characteristics of the Jovian system is the presence of the Laplace resonance, in which the orbital periods of Ganymede, Europa, and Io maintain a 4:2:1 ratio, respectively. It is interesting to study the role of the Laplace resonance in the dynamics of the system, especially regarding the dissipative nature of the tidal interaction between Jupiter and its closest moon, Io. Numerous theories have been proposed regarding the orbital evolution of the Galilean satellites, but they disagree about the amount of dissipation in the system, and therefore about the magnitude and direction of its evolution, mainly because of the lack of experimental data. The future JUICE space mission is a great opportunity to settle this dispute. JUICE is an ESA (European Space Agency) L-class mission (the largest category of missions in the ESA Cosmic Vision) that, at the beginning of the 2030s, will be inserted into the Jovian system and will perform several flybys of the Galilean satellites, with the exception of Io. Subsequently, during the last part of the mission, it will orbit Ganymede for nine months, with a possible extension of the mission. The data that JUICE will collect will have exceptional accuracy, allowing several aspects of the dynamics of the system to be investigated, especially the evolution of the Laplace resonance of the Galilean moons and its stability. This thesis focuses on the JUICE mission, in particular on the gravity estimation and orbit reconstruction of the Galilean satellites during the Jovian orbital phase using radiometric data. This is accomplished through an orbit determination technique called the multi-arc approach, using JPL’s orbit determination software MONTE (Mission-analysis, Operations and Navigation Tool-kit Environment).
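The multi-arc idea, one set of global parameters shared by every arc plus local parameters estimated per arc, all solved together by least squares, can be illustrated on a deliberately simple linear model. The model and every number below are invented for the sketch; real orbit determination in MONTE linearizes the spacecraft dynamics around a reference trajectory and iterates.

```python
# Toy multi-arc least squares: a global parameter "gm" common to all arcs,
# plus one local bias per arc, estimated jointly. Purely illustrative.
import numpy as np

t = np.linspace(0.0, 1.0, 5)     # observation times within each arc
biases = [0.3, -0.1, 0.7]        # true per-arc local parameters
gm = 2.5                         # true global parameter

# stack noiseless observations from all arcs: y = gm * t + bias_of_arc
y = np.concatenate([gm * t + b for b in biases])

# design matrix: first column holds partials w.r.t. gm, the rest select the arc
n, n_arcs = len(t), len(biases)
A = np.zeros((n * n_arcs, 1 + n_arcs))
for i in range(n_arcs):
    A[i * n:(i + 1) * n, 0] = t              # global-parameter partials
    A[i * n:(i + 1) * n, 1 + i] = 1.0        # local (arc-i) partials

est, *_ = np.linalg.lstsq(A, y, rcond=None)
print(est)   # recovers [gm, bias_0, bias_1, bias_2]
```

Solving globals and locals in one normal-equation system is what lets information from every flyby arc accumulate on the shared gravity parameters while each arc keeps its own initial-state unknowns.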

Relevance:

100.00%

Publisher:

Abstract:

The focus of this thesis is the maintenance of road elements to improve road safety. The goal of the research is to prioritise maintenance for safety barriers based on factors such as the terrain of the site, deformations, degradation of the components, and adherence to the original installation. Using these factors, a coefficient is calculated to determine the maintenance priority of each barrier. To ease understanding and visualisation, the data were uploaded and processed in a GIS environment to generate analyses and maps. This was done using QGIS, a free and open-source GIS application. Information about the features of the barriers was collected through both on-site and online examination. During on-site inspections, a database of geotagged photos was created to aid the survey. GIS capabilities were fully utilised by using geoprocessing tools for more in-depth analysis.
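One simple way to combine such inspection factors into a single priority coefficient is a weighted normalized score. The factor names, scales and weights below are hypothetical, chosen only to show the mechanics; the thesis defines its own coefficient.

```python
# Weighted-average maintenance-priority coefficient (hypothetical factors).
def priority(factors, weights):
    """factors: scores in [0, 1], higher = worse condition; returns [0, 1]."""
    return sum(weights[k] * factors[k] for k in weights) / sum(weights.values())

barrier = {
    "terrain_hazard": 0.8,    # severity of the roadside terrain
    "deformation": 0.6,       # observed deformation of the rail
    "degradation": 0.4,       # corrosion / wear of components
    "non_conformity": 0.9,    # deviation from the original installation
}
weights = {"terrain_hazard": 2, "deformation": 3,
           "degradation": 2, "non_conformity": 3}

p = priority(barrier, weights)
print(f"maintenance priority = {p:.2f}")   # higher means service sooner
```

Computed per barrier and joined to the geotagged survey layer, such a score is exactly the kind of attribute a GIS can symbolize on a map to rank maintenance interventions.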