19 results for Exploit

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

10.00%

Publisher:

Abstract:

This report presents an innovative satellite-based monitoring approach applied to the Iraqi Marshlands to survey the extent and distribution of marshland re-flooding and to assess the development of wetland vegetation cover. The study, conducted in collaboration with MEEO Srl, uses images collected by the (A)ATSR sensor onboard the ESA ENVISAT satellite at multi-temporal scales, and a dedicated analysis was adopted to observe the evolution of marshland re-flooding. The methodology follows a multi-temporal pixel-based approach built on classification maps produced by the classification tool SOIL MAPPER®. The catalogue of classification maps is available as a web service through the Service Support Environment Portal (SSE, supported by ESA). The inundation of the Iraqi marshlands, continuous since April 2003, is characterized by a high degree of variability, ad-hoc interventions and uncertainty. Given the security constraints and vastness of the Iraqi marshlands, as well as cost-effectiveness considerations, satellite remote sensing was the only viable tool to observe the changes taking place on a continuous basis. The proposed system (ALCS, AATSR Land Classification System) avoids the direct use of the (A)ATSR images and applies LULCC evolution models directly to the "stock" of classified maps. This approach is made possible by the availability of a 13-year classified image database, conceived and implemented in the CARD project (http://earth.esa.int/rtd/Projects/#CARD). The approach presented here evolves toward an innovative, efficient and fast method to exploit the potential of multi-temporal LULCC analysis of (A)ATSR images. The two main objectives of this work are both assessments: the first is to assess the modelling ability of the web application ALCS using AATSR images classified with SOIL MAPPER®; the second is to evaluate the magnitude, character and extent of wetland rehabilitation.
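To illustrate the multi-temporal pixel-based idea, the following sketch counts per-pixel class transitions between two classified maps; the class codes and toy arrays are hypothetical stand-ins, not the actual SOIL MAPPER® legend.

    import numpy as np

    # Hypothetical class codes (the real SOIL MAPPER legend differs)
    BARE_SOIL, WATER, VEGETATION = 1, 2, 3

    def transition_mask(map_t0, map_t1, from_cls, to_cls):
        """Boolean mask of pixels that changed from one class to another."""
        return (map_t0 == from_cls) & (map_t1 == to_cls)

    # Two toy classification maps for successive acquisition dates
    rng = np.random.default_rng(0)
    t0 = rng.integers(1, 4, size=(100, 100))
    t1 = rng.integers(1, 4, size=(100, 100))

    print("re-flooded pixels:  ", transition_mask(t0, t1, BARE_SOIL, WATER).sum())
    print("re-vegetated pixels:", transition_mask(t0, t1, BARE_SOIL, VEGETATION).sum())

Aggregating such masks over the 13-year stock of classified maps is what turns a pile of classifications into an LULCC time series.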

Relevance:

10.00%

Publisher:

Abstract:

We need a large amount of energy to keep our homes pleasantly warm in winter and cool in summer. If we also consider the energy losses that occur through roofs, perimeter walls and windows, it would be more appropriate to speak of waste than of consumption. One solution is to build passive houses, i.e. buildings that are more efficient and environmentally friendly, able to ensure a drastic reduction of electricity and heating bills. Recently, growing public awareness of global warming and environmental pollution has finally opened wide possibilities in the field of sustainable construction, encouraging new renewable methods for space heating and cooling. Shallow geothermal energy makes it possible to exploit the renewable heat reservoir present in the soil at depths between 15 and 20 m for the air-conditioning of buildings, using a ground source heat pump. This thesis focuses on the design of an air-conditioning system with a geothermal heat pump coupled to energy piles, i.e. piles with internal heat exchangers, for a typical Italian single-family building, on the basis of a geological-technical report about a plot in Bologna's plain provided by Geo-Net s.r.l. The study involved a preliminary static sizing of the piles to calculate their length and number; the project was then completed with the energy sizing, verifying whether the building's energy needs were met by the static solution obtained. Finally, attention was focused on the technical and economic validity of the system compared to a traditional one (cost-benefit analysis) and on the problem of uncertainty in the design data and its effects on the operating and initial costs of the system (sensitivity analysis). The PILESIM2 software, designed by Dr. Pahud of the SUPSI school, was also used to evaluate the performance of the thermal system and the potential use of the piles.
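A minimal sketch of the first-cut sizing logic, assuming a rule-of-thumb specific heat extraction rate per metre of pile; all numbers are illustrative, not the design data of the Geo-Net report.

    import math

    # Illustrative inputs (not the thesis' actual design data)
    peak_heating_load_w = 8000.0    # building peak heating demand [W]
    cop = 4.0                       # heat pump coefficient of performance
    extraction_rate_w_per_m = 50.0  # rule-of-thumb ground extraction rate [W/m]
    pile_length_m = 15.0            # length of a single energy pile [m]

    # In heating mode the ground supplies the load minus the compressor input
    ground_load_w = peak_heating_load_w * (1 - 1 / cop)
    total_length_m = ground_load_w / extraction_rate_w_per_m
    n_piles = math.ceil(total_length_m / pile_length_m)
    print(f"{ground_load_w:.0f} W from the ground -> "
          f"{total_length_m:.0f} m of pile -> {n_piles} piles")

The real energy sizing (as done with PILESIM2) then checks this static answer against seasonal load profiles and long-term ground temperature drift.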

Relevance:

10.00%

Publisher:

Abstract:

Cryptography has always played a primary role in the history of humankind, from its dawn to the present day, and the period we live in is certainly no exception. Today, many of the actions we perform even out of mere habit (banking operations, remote car unlocking, logging into Facebook, etc.) conceal the constant presence of sophisticated cryptographic systems. Precisely for this reason, it is important that the algorithms in use be certified, in some way, as reasonably secure, and that research in this field proceed constantly, both in terms of possible new exploits to break the algorithms in use and by introducing new and ever more complex security systems. This thesis proposes a possible implementation of a particular kind of cryptanalytic attack, introduced in 2000 by two researchers of the University of Rome "La Sapienza" and known as "Logical Cryptanalysis". The algorithm at the centre of this work is the Data Encryption Standard (DES), a tough cryptographic standard that fell into disuse in 1999 because of its reduced key size, although it remains algebraically unbroken. The text is structured as follows: the first chapter gives a brief description of DES and its history, introducing the fundamental concepts used throughout the dissertation. The second chapter introduces Logical Cryptanalysis and provides a definition of it, touching on the mathematical concepts needed to understand the following chapters. Chapter 3 presents the first of the two pieces of software developed to make this cryptanalytic attack possible: a library for representing and manipulating logical formulas, written in Java. The fourth and final chapter describes the program that, using the library of Chapter 3, automatically builds a set of logical propositions semantically equivalent to DES, whose satisfiability check, performed with dedicated tools (SAT solvers), amounts to a known-plaintext attack on the algorithm.
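To make the idea concrete, here is a minimal sketch of a known-plaintext SAT attack on a toy 4-bit XOR "cipher" rather than DES: the cipher equations are encoded as CNF clauses and a SAT solver recovers the key. It assumes the python-sat package; a CNF encoding of DES is vastly larger but follows the same principle.

    from pysat.solvers import Glucose3

    # Toy cipher: c_i = p_i XOR k_i for 4 bits.
    # SAT variables 1..4 stand for the unknown key bits k_1..k_4.
    plaintext  = [1, 0, 1, 1]
    ciphertext = [0, 0, 1, 0]  # produced with the secret key 1, 0, 0, 1

    solver = Glucose3()
    for var, (p, c) in enumerate(zip(plaintext, ciphertext), start=1):
        # p XOR k = c forces k = p XOR c: a unit clause on the key variable
        solver.add_clause([var] if p ^ c else [-var])

    assert solver.solve()
    key = [1 if lit > 0 else 0 for lit in solver.get_model()[:4]]
    print("recovered key bits:", key)  # -> [1, 0, 0, 1]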

Relevance:

10.00%

Publisher:

Abstract:

Communication and coordination are two key aspects of open distributed agent systems, both being responsible for the integrity of the system's behaviour. An infrastructure capable of handling these issues, like TuCSoN, should be able to exploit the modern technologies and tools provided by fast-moving software engineering contexts. This thesis aims to demonstrate the ability of the TuCSoN infrastructure to cope with the new possibilities, in hardware and software, offered by mobile technology. The scenarios we are going to configure relate to the distributed nature of multi-agent systems, where an agent may have to be located and run directly on a mobile device. We address the new frontiers of mobile technology represented by smartphones running Google's Android operating system. The analysis and deployment of such a distributed agent-based system must first face qualitative and quantitative considerations about the available resources. The engineering issue at the base of our research is to run TuCSoN within the reduced memory and computing capability of a smartphone, without loss of functionality, efficiency or integrity for the infrastructure. The thesis work proceeds on two fronts simultaneously: the former is the rationalization of the available hardware and software resources; the latter, totally orthogonal, is the adaptation and optimization of the TuCSoN architecture for an ad-hoc client-side release.

Relevance:

10.00%

Publisher:

Abstract:

The aim of the present thesis, carried out at the Analytical Group of the Faculty of Industrial Chemistry in Bologna, is to develop a new electrochemical method for the determination of the Antioxidant Capacity (AOC). The approach is based on the deposition of a non-conducting polymeric film on the working electrode surface and its subsequent exposure to OH· radicals produced by H2O2 photolysis. The strongly oxidizing action of the hydroxyl radicals degrades the film, causing an increase of the Faradaic current of the redox couple [Ru(NH3)6]2+/3+ monitored by cyclic voltammetry (CV); the presence of an antioxidant compound in solution slows down the radical action, thus protecting the polymeric film and blocking the charge transfer. The parameter adopted for the quantification of the AOC was the induction time, also called lag phase, i.e. the time at which the degradation of the film starts. Five pure compounds, among the most common antioxidants, were investigated: Trolox® (a water-soluble analogue of vitamin E), (L)-ascorbic acid, gallic acid, pyrogallol and (-)-epicatechin. The AOC of each antioxidant was expressed by the TEAC index (Trolox® Equivalent Antioxidant Capacity), calculated as the ratio between the slope of the calibration curve of the target compound and the slope of the calibration curve of Trolox®. The results of the electrochemical method were compared with those obtained from other widely employed standardized methods. The assays used for the comparison were: ORAC, a spectrofluorimetric method based on the decrease of fluorescein emission after the attack of alkylperoxide radicals; ABTS and DPPH, which exploit the decoloration of stable nitrogen radicals when they are reduced in the presence of an antioxidant compound; and, finally, a potentiometric method based on the response of the redox couple [Fe(CN)6]3-/[Fe(CN)6]4-. From the results obtained on pure compounds, ORAC is the methodology showing the best correlation with the developed electrochemical method, possibly because similar radical species are involved. The comparison between the considered assays was also extended to the analysis of a real sample of fruit juice; in that case the TEAC value resulting from the electrochemical method was higher than those from the standardized assays.
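Since the TEAC index is defined as a ratio of calibration slopes, the computation reduces to two linear fits; a minimal sketch with made-up calibration data follows (the concentrations and lag-phase values are illustrative, not the thesis' measurements).

    import numpy as np

    def calibration_slope(conc, response):
        """Slope of the linear calibration curve: response vs. concentration."""
        slope, _intercept = np.polyfit(conc, response, deg=1)
        return slope

    # Illustrative lag-phase calibration data (arbitrary units)
    conc = np.array([10, 20, 40, 80])           # antioxidant concentration [uM]
    lag_trolox = np.array([55, 102, 210, 405])  # induction time for Trolox [s]
    lag_sample = np.array([70, 150, 290, 600])  # induction time for target compound

    teac = calibration_slope(conc, lag_sample) / calibration_slope(conc, lag_trolox)
    print(f"TEAC index: {teac:.2f}")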

Relevance:

10.00%

Publisher:

Abstract:

Hydrogen peroxide (H2O2) is a powerful oxidant commonly used in a wide range of industrial applications. Several methods for the quantification of H2O2 have been developed. Among them, electrochemical methods exploit the ability of some hexacyanoferrates (such as Prussian Blue) to detect H2O2 at potentials close to 0.0 V (vs. SCE), avoiding the secondary reactions that are likely to occur at large overpotentials. This electrocatalytic behaviour makes hexacyanoferrates excellent redox mediators. When deposited as thin films on electrode surfaces, they can be employed in the fabrication of sensors and biosensors, normally operated in solutions at pH values close to physiological ones. Since hexacyanoferrates show limited stability in solutions that are not strongly acidic, the configuration of the modified electrodes must be improved to increase the stability of the films. In this thesis work, organic conducting polymers were used to fabricate composite films with Prussian Blue (PB) to be electro-deposited on Pt surfaces, in order to increase their pH stability. Different electrode configurations and different methods of synthesis of both components were tested, and for each one the achievement of a possible increase in the operational stability of Prussian Blue was verified. Good results were obtained for the polymer 3,3''-didodecyl-2,2':5',2''-terthiophene (poly(3,3''-DDTT)), whose presence created a favourable microenvironment for the electrodeposition of Prussian Blue. The electrochemical behaviour of the modified electrodes was studied in both aqueous and organic solutions. Poly(3,3''-DDTT) showed no response in aqueous solution in the potential range where PB is electroactive; thus, in buffered aqueous solution it was possible to characterize the composite material, focusing only on the redox behaviour of PB. A combined effect of the anion and cation of the supporting electrolyte was noticed. The response of Pt electrodes modified with films of the PB/poly(3,3''-DDTT) composite was evaluated for the determination of H2O2, and the performance of such films was found to be better than that of PB alone. It can be concluded that poly(3,3''-DDTT) plays a key role in the stabilization of Prussian Blue, also yielding a wider linearity range for the electrocatalytic response to H2O2.

Relevance:

10.00%

Publisher:

Abstract:

Today there are many techniques that allow exploiting vulnerabilities of an application, and just as many techniques designed to stop such exploit attacks. This thesis highlights how a specific type of attack, based on a technique called Return Oriented Programming (ROP), can be easily applied to binaries with particular characteristics. A new method is presented that allows the injection of "useful" code into an Open Source project without arousing suspicion, which is possible because of the harmless appearance of the injected code. This useful code facilitates a ROP attack against an executable that contains vulnerable bugs. The injection process can be envisaged in any environment where a user can contribute code to a particular Open Source project. The thesis also highlights how current software protections are not correctly applied to Open Source projects, thus enabling the proposed approach.
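A minimal sketch of the first step of any ROP attack, locating candidate gadgets: it scans a byte buffer for x86 ret opcodes (0xC3) and prints the bytes preceding each one. This is a toy illustration on a made-up buffer, not the injection method proposed in the thesis.

    # Toy x86 machine code: byte runs ending in 0xC3 (ret) are gadget candidates
    code = bytes.fromhex("5dc3909058c3b801000000c3")
    GADGET_DEPTH = 4  # how many bytes before each ret to show

    for i, b in enumerate(code):
        if b == 0xC3:  # 'ret' opcode
            start = max(0, i - GADGET_DEPTH)
            print(f"offset {i:#04x}: {code[start:i + 1].hex(' ')}")

A real gadget finder would disassemble backwards from each ret (unaligned decodings included), which is exactly what makes binaries with rich, predictable code sections attractive ROP targets.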

Relevance:

10.00%

Publisher:

Abstract:

Synthetic biology has recently undergone great development: many papers have been published and many applications presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, like the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to address this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that allows the implementation of complex functionalities that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as the organizer and coordinator of a series of tasks assigned to the whole population. The election of the leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful in evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and the experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after receiving the signal that a leader is present in the colony. The most important element, in this case, is the hybrid promoter: it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction, into the cellular environment, of particular molecules, inducers, which can be considered the inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is high only in the presence of both inducers. The robustness and stability of this behaviour were tested by changing the concentration of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even if many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so a binary representation cannot capture the full complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission; in particular, the interaction between this element and the one responsible for emitting the chemical signal was tested.
The desired behaviour is still similar to a logic AND since, even in this case, the output signal is determined by the activity of the hybrid promoter. The experimental results demonstrated that the systems behave correctly, even if there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed, trying to identify the regulatory elements that matter most for the determination of gene expression; given the available data, it was not possible to draw definitive conclusions. Finally, a few considerations on promoter engineering and the realization of complex circuits are presented. This section briefly recalls some of the problems outlined in the introduction and proposes a few possible solutions.
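A common way to model such an AND-like dose response is the product of two Hill functions, one per inducer; the sketch below evaluates a hypothetical response surface (the Hill parameters are illustrative, not fitted to the thesis' constructs).

    import numpy as np

    def hill(x, k, n):
        """Activating Hill function: 0 at x = 0, approaching 1 for x >> k."""
        return x**n / (k**n + x**n)

    def and_gate_output(ind1, ind2, k1=10.0, k2=5.0, n1=2.0, n2=2.0, basal=0.02):
        """Fluorescence model: product of two Hill terms plus basal leakage."""
        return basal + (1 - basal) * hill(ind1, k1, n1) * hill(ind2, k2, n2)

    # Evaluate the dose-response surface on a grid of inducer concentrations
    c1 = np.logspace(-1, 3, 5)  # inducer 1 [uM]
    c2 = np.logspace(-1, 3, 5)  # inducer 2 [uM]
    surface = and_gate_output(c1[:, None], c2[None, :])
    print(np.round(surface, 2))  # high output only when both inducers are high

Because the output is continuous rather than binary, the separation between the "0" and "1" levels depends on where the inducer concentrations sit relative to the Hill thresholds, which matches the constraint discussed above.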

Relevance:

10.00%

Publisher:

Abstract:

DNA is a fascinating biomolecule, well known for its genetic role in living systems. The emerging area of DNA nanotechnology provides an alternative view that exploits the unparalleled self-assembly ability of DNA molecules for the use of DNA as a material. Although many reports exist on DNA self-assembling systems, few of them focus on in vitro studies of the function of such DNA nanostructures in live cells, so research on the in vitro functionality of these designs is still limited. To address an aspect of this issue, we have designed, synthesized and characterized two multifunctional fluorescent nanobiosensors by DNA self-assembly. Each structure was designed and implemented to be introduced into live cells in order to give information on its functioning in real time. Computational tools were used to design a graphic model of two new DNA motifs and to obtain the specific sequences of all the ssDNA molecules. By thermal self-assembly techniques we successfully synthesized the structures and corroborated their formation by PAGE. In addition, we established the conditions to characterize the conformational change the structures undergo when they produce their sensing response. The sensing behaviour was also assessed by fluorescence spectroscopy techniques, FRET evaluation and fluorescence microscopy imaging, providing evidence of adequate sensing performance outside and inside cells, detected in real time. In a preliminary evaluation we investigated the in vitro functionality of our structures in different cancer cell lines, showing their ability to perform local sensing responses. Our findings suggest that DNA sensor nanostructures could serve as a platform for further therapeutic achievements in live cells.
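For reference, FRET evaluation typically rests on the efficiency-distance relation E = 1 / (1 + (r/R0)^6); a minimal sketch follows, with an illustrative Förster radius (R0 depends on the actual dye pair, which the abstract does not specify).

    def fret_efficiency(r_nm, r0_nm=5.0):
        """FRET efficiency for donor-acceptor distance r and Forster radius R0."""
        return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

    # Efficiency drops steeply around R0, which is what makes FRET a
    # sensitive reporter of conformational change in a DNA nanostructure.
    for r in (3.0, 5.0, 7.0):
        print(f"r = {r:.1f} nm -> E = {fret_efficiency(r):.2f}")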

Relevance:

10.00%

Publisher:

Abstract:

Resource management is of paramount importance in network scenarios and is a long-standing and still open issue. Unfortunately, while technology and innovation continue to evolve, our network infrastructure has been maintained in almost the same shape for decades, a phenomenon known as "Internet ossification". Software-Defined Networking (SDN) is an emerging paradigm in computer networking that allows a logically centralized software program to control the behavior of an entire network. This is done by decoupling the network control logic from the underlying physical routers and switches that forward traffic to the selected destination. One mechanism that allows the control plane to communicate with the data plane is OpenFlow. Network operators can write high-level control programs that specify the behavior of an entire network. Moreover, the centralized control makes it possible to define more specific and complex tasks that may involve many network functionalities, e.g., security, resource management and control, within a single framework. Nowadays, the explosive growth of real-time applications that require stringent Quality of Service (QoS) guarantees pushes network programmers to design protocols that deliver certain performance guarantees. This thesis exploits the use of SDN in conjunction with OpenFlow to manage differentiated network services with high QoS. Initially, we define a QoS Management and Orchestration architecture that allows us to manage the network in a modular way. Then, we provide a seamless integration between the architecture and the standard SDN paradigm, following the separation between the control and data planes. This work is a first step towards the deployment of our proposal in the University of California, Los Angeles (UCLA) campus network, with differentiated services and stringent QoS requirements. We also plan to exploit our solution to manage the handoff between different network technologies, e.g., Wi-Fi and WiMAX. Indeed, the model can be run with different parameters, depending on the communication protocol, and can provide optimal results to be implemented on the campus network.
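As a concrete illustration of steering traffic into a QoS queue via OpenFlow, here is a minimal controller sketch using the Ryu framework; the thesis does not name its controller, and the matched UDP port and queue id are hypothetical. It installs a flow that sends real-time traffic to a queue assumed to be pre-configured on the switch.

    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class QoSQueueApp(app_manager.RyuApp):
        """Steer real-time UDP traffic into switch queue 1 (set up out-of-band)."""
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def switch_features_handler(self, ev):
            dp = ev.msg.datapath
            ofp, parser = dp.ofproto, dp.ofproto_parser
            # Hypothetical classifier: IPv4/UDP traffic to port 5004 (e.g. RTP)
            match = parser.OFPMatch(eth_type=0x0800, ip_proto=17, udp_dst=5004)
            actions = [parser.OFPActionSetQueue(1),
                       parser.OFPActionOutput(ofp.OFPP_NORMAL)]
            inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS, actions)]
            dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=100,
                                          match=match, instructions=inst))

The separation of concerns is visible here: the queue's rate guarantees live in the data plane, while the centralized controller only decides which flows map to which queue.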

Relevance:

10.00%

Publisher:

Abstract:

Radial velocities measured from near-infrared (NIR) spectra are a potential tool to search for extrasolar planets around cool stars. The high-resolution infrared spectrographs now available reach the high precision of visible instruments, with constant improvement over time. GIANO is an infrared echelle spectrograph and a powerful tool for providing high-resolution spectra for accurate radial velocity measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other IR instrument has GIANO's capability to cover the entire NIR wavelength range. In this work we develop an ensemble of IDL procedures to measure high-precision radial velocities on a few GIANO spectra acquired during the commissioning run, using the telluric lines as wavelength reference. In Section 1.1, various exoplanet search methods are described; they exploit different properties of the planetary system. In Section 1.2 we describe the exoplanet population discovered through the different methods. In Section 1.3 we explain the motivations for NIR radial velocities and the challenges related to the main issue that has limited the pursuit of high-precision NIR radial velocity, namely the lack of a suitable calibration method. We briefly describe calibration methods in the visible and the solutions for IR calibration, for instance the use of telluric lines; the latter has advantages and problems, described in detail. In this work we use the telluric lines as wavelength reference. In Section 1.4 the Cross Correlation Function (CCF) method, widely used to measure radial velocities, is described. In Section 1.5 we describe GIANO and its main science targets. In Chapter 2 the observational data obtained with the GIANO spectrograph are presented and the selection criteria are reported. In Chapter 3 we describe the details of the analysis and examine in depth the flow chart reported in Section 3.1. In Chapter 4 we give the radial velocities measured with our IDL procedure for all the available targets. We obtain an rms scatter in radial velocity of about 7 m/s. Finally, we conclude that GIANO can be used to measure radial velocities of late-type stars with an accuracy close to or better than 10 m/s, using the telluric lines as wavelength reference. In September 2014 GIANO is operating at the TNG for Science Verification, and more observational data will allow this analysis to be further refined.
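The CCF approach can be illustrated in a few lines (in Python rather than the thesis' IDL): cross-correlate an observed spectrum against a template over a grid of trial Doppler shifts and take the velocity that maximizes the correlation. The synthetic Gaussian line below is a stand-in for real telluric and stellar features.

    import numpy as np

    C_KM_S = 299_792.458  # speed of light [km/s]

    # Synthetic template: one Gaussian absorption line
    wave = np.linspace(15990.0, 16010.0, 4000)  # wavelength [Angstrom]
    template = 1.0 - 0.5 * np.exp(-((wave - 16000.0) / 0.3) ** 2)

    # 'Observed' spectrum: the same line Doppler-shifted by +7 km/s, plus noise
    v_true = 7.0
    rng = np.random.default_rng(1)
    obs = np.interp(wave, wave * (1 + v_true / C_KM_S), template)
    obs += rng.normal(0.0, 0.01, wave.size)

    # CCF over a grid of trial velocities; the peak gives the measured RV
    trial_v = np.linspace(-30.0, 30.0, 601)
    ccf = [np.sum(obs * np.interp(wave, wave * (1 + v / C_KM_S), template))
           for v in trial_v]
    print(f"measured RV: {trial_v[np.argmax(ccf)]:+.1f} km/s")  # about +7.0

In the NIR case the template role is played by the telluric lines themselves, which sit at rest in the observatory frame and therefore act as a free wavelength reference.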

Relevance:

10.00%

Publisher:

Abstract:

In my work I derive closed-form pricing formulas for volatility based options by suitably approximating the volatility process risk-neutral density function. I exploit and adapt the idea, which stands behind popular techniques already employed in the context of equity options such as Edgeworth and Gram-Charlier expansions, of approximating the underlying process as a sum of some particular polynomials weighted by a kernel, which is typically a Gaussian distribution. I propose instead a Gamma kernel to adapt the methodology to the context of volatility options. VIX vanilla options closed-form pricing formulas are derived and their accuracy is tested for the Heston model (1993) as well as for the jump-diffusion SVJJ model proposed by Duffie et al. (2000).
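To illustrate the flavour of the approach (not the thesis' closed-form result), the sketch below prices a volatility call by numerically integrating the payoff against a Gamma density whose shape and scale are moment-matched to an assumed risk-neutral mean and variance; all parameter values are illustrative.

    import numpy as np
    from scipy.stats import gamma

    # Assumed risk-neutral moments of the volatility index at expiry
    mean, var = 20.0, 30.0
    r, T = 0.01, 30 / 365  # risk-free rate, time to expiry [years]
    strike = 22.0

    # Moment-match the Gamma kernel: mean = a * scale, var = a * scale^2
    scale = var / mean
    a = mean / scale

    # Discounted expected payoff of the call under the Gamma density
    v = np.linspace(0.0, 100.0, 20001)
    payoff = np.maximum(v - strike, 0.0)
    price = np.exp(-r * T) * np.trapz(payoff * gamma.pdf(v, a, scale=scale), v)
    print(f"approximate call price: {price:.3f}")

The polynomial correction terms of the expansion refine this zeroth-order kernel so that higher moments of the risk-neutral density are matched as well.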

Relevance:

10.00%

Publisher:

Abstract:

Computing the weighted geometric mean of large sparse matrices is an operation that tends to become rapidly intractable as the size of the matrices involved grows. However, if we are not interested in the computation of the matrix function itself, but just in that of its product times a vector, the problem becomes simpler and there is a chance to solve it even when the matrix mean itself would be impossible to compute. Our interest is motivated by the fact that this calculation has practical applications, related to the preconditioning of some operators arising in the domain decomposition of elliptic problems. In this thesis, we explore how such a computation can be performed efficiently. First, we exploit the properties of the weighted geometric mean and find several equivalent ways to express it through real powers of a matrix. Hence, we focus our attention on matrix powers and examine how well-known techniques can be adapted to the solution of the problem at hand. In particular, we consider two broad families of approaches for the computation of f(A) v, namely quadrature formulae and Krylov subspace methods, and generalize them to the pencil case f(A\B) v. Finally, we provide an extensive experimental evaluation of the proposed algorithms and try to assess how convergence speed and execution time are influenced by some characteristics of the input matrices. Our results suggest that a few elements have some bearing on the performance and that, although there is no best choice in general, knowing the conditioning and the sparsity of the arguments beforehand can considerably help in choosing the best strategy to tackle the problem.
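For small dense matrices the quantity can be computed directly from the standard identity A #_t B = A (A^{-1} B)^t, which gives a brute-force reference point for the quadrature and Krylov methods discussed above; a sketch follows (dense only, so it defeats the purpose at scale).

    import numpy as np
    from scipy.linalg import fractional_matrix_power, solve

    def weighted_geomean_times_vector(A, B, t, v):
        """Dense reference for (A #_t B) v via A #_t B = A (A^{-1} B)^t."""
        X = solve(A, B)  # X = A^{-1} B; similar to an SPD matrix, so X^t is real
        Xt = np.real_if_close(fractional_matrix_power(X, t))
        return A @ (Xt @ v)

    # Small SPD test pencil (illustrative; real applications are large and sparse)
    rng = np.random.default_rng(0)
    M, N = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
    A = M @ M.T + 5 * np.eye(5)
    B = N @ N.T + 5 * np.eye(5)
    v = rng.standard_normal(5)

    # t = 0.5 gives the standard matrix geometric mean applied to v
    print(weighted_geomean_times_vector(A, B, 0.5, v))

The Krylov and quadrature approaches approximate exactly this product without ever forming the dense fractional power.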

Relevance:

10.00%

Publisher:

Abstract:

Passive acoustic data collected with HARPs (High-frequency Acoustic Recording Packages) were used to assess (1) the seasonality of blue whale D calls in the Southern California Bight, (2) their interannual abundance during 2007-2012 and (3) their diel variation. This was achieved by running the GPL (Generalized Power-Law) automated detector. (1) Blue whale D calls were detected in the Southern California Bight from May through November with a peak in July, although a few detections also occurred from December to April. A key predictor for blue whale distribution and movement in the California Current region is zooplankton aggregation, with particular attention to euphausiid species such as E. pacifica and T. spinifera, the blue whale's preferred krill. The Southern California Bight experiences seasonal upwelling, resulting in increased productivity and prey availability, with the summer and early fall being the most favorable periods. This supports the presence of blue whales in the area at that time, suggesting these marine mammals exploit the region as a feeding ground. (2) As to the interannual abundance during 2007-2012, I found large variability: a great increase of vocalizations in 2007 and 2010, and a decrease in the other years, most marked in 2009. I believe these fluctuations in the abundance of D-call detections across the deployment period are due to the alternation of El Niño and La Niña events occurring in those years. (3) The assessment of the daily timing of D-call production shows that D calls are more abundant during the day than during the night, with a peak at 12:00 and 13:00. Assuming that D calling is associated with feeding, the daily pattern of D calls may be linked to prey availability. E. pacifica and T. spinifera are among the krill species that undertake daily vertical migrations, remaining at depth during the day and slowly coming up towards the surface at night; because of some anatomical features, these euphausiids are very sensitive to light. Given that D calls are believed to have a social function, I hypothesize that blue whales may recognize the hours of highest solar incidence as the best moment of the day in terms of prey availability, exploiting this time window to alert their conspecifics.
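The diel analysis in (3) amounts to binning detection timestamps by local hour; a minimal sketch with made-up timestamps follows (the real pipeline works on GPL detector output, which is not reproduced here).

    from collections import Counter
    from datetime import datetime, timedelta
    import random

    # Made-up detection timestamps standing in for GPL detector output
    random.seed(0)
    start = datetime(2010, 7, 1)
    detections = [start + timedelta(minutes=random.randint(0, 60 * 24 * 30))
                  for _ in range(500)]

    # Bin detections by local hour of day and draw a crude text histogram
    per_hour = Counter(d.hour for d in detections)
    for hour in range(24):
        print(f"{hour:02d}:00  {'#' * (per_hour[hour] // 2)}")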