8 results for Query-by-example
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Salt deposits characterize the subsurface of Tuzla (BiH) and have made it famous since ancient times. Archaeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to the existence of saltwater springs, which turned most of the area into swampy ground. Since Roman times the town has been described as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, the name the Ottomans gave the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were located everywhere, and salt has been evaporated by means of hot charcoals since pre-Roman times. This ancient use of salt was a small-scale exploitation compared with the massive salt production carried out during the 20th century by means of classical mining methods and, especially, wild brine pumping. In the past salt extraction was practised by tapping natural brine springs, while the modern technique consists of about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operation changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. This process induced severe ground subsidence during the last 60 years, reaching up to 10 meters of sinking in the most affected area. Stress and strain of the overlying rocks induced the formation of numerous fractures over a conspicuous area (3 km²). Consequently, serious damage occurred to buildings and infrastructure such as the water supply system, sewage networks and power lines. Downtown urban life was compromised by the destruction of more than 2000 buildings that collapsed or had to be demolished, causing the resettlement of about 15000 inhabitants (Tatić, 1979).
Recently, salt extraction activities have been strongly reduced, but the underground water system is returning to its natural conditions, threatening to flood the most collapsed areas. During the last 60 years the local government developed a monitoring system for the phenomenon, collecting data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extension of the salt body and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation has been financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA) and the Province of Ravenna. The University of Tuzla (RGGF) gave important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are well understood in only a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the several factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard, which represents the consequence of an adverse combination of geological processes and ground conditions precipitated by human activity with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard induces a risk to a vulnerable element, a risk management process is required. The single factors involved in the subsidence of Tuzla can be considered as hazards. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed in order to quantify the vulnerability of buildings in relation to the overall geohazard that affects the town.
The available historical database, never fully processed, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). In order to monitor and quantify the actual subsidence rates, geodetic GPS technologies have been implemented and four survey campaigns have been carried out, one per year. The subsidence-related fracture system has been identified by means of field surveys and a mathematical interpretation of the sinking surface called curvature analysis. The comparison of mapped and predicted fractures led to a better comprehension of the problem. Results confirmed the reliability of fracture identification using curvature analysis applied to sinking data instead of topographic or seismic data. The evolution of urban changes has been reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was very important for the quantification of building vulnerability.
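The curvature-analysis idea can be sketched as follows: interpolate the sinking measurements onto a grid, compute a curvature proxy of the resulting surface, and flag the zones of strongest bending, where subsidence-related fractures tend to concentrate. The helper name, the Laplacian proxy and the threshold below are illustrative assumptions; the thesis's actual formulation may differ.

```python
import numpy as np

def curvature_map(z, spacing=1.0):
    """Curvature proxy (Laplacian) of a gridded sinking surface z.
    Hypothetical helper: stands in for the thesis's curvature analysis."""
    zy, zx = np.gradient(z, spacing)       # first derivatives along rows/cols
    _, zxx = np.gradient(zx, spacing)      # second derivative along x
    zyy, _ = np.gradient(zy, spacing)      # second derivative along y
    return zxx + zyy                       # positive in bowls, negative on ridges

# Synthetic bowl-shaped subsidence trough (meters of sinking, made-up data).
x = np.linspace(-1, 1, 41)
X, Y = np.meshgrid(x, x)
sinking = -10.0 * np.exp(-(X**2 + Y**2) / 0.2)

curv = curvature_map(sinking, spacing=x[1] - x[0])
# Flag fracture-prone zones where the bending is strongest (arbitrary cut-off).
fracture_mask = np.abs(curv) > 0.5 * np.abs(curv).max()
```

Mapped fractures from field surveys can then be compared against `fracture_mask` cell by cell, which is the kind of comparison the abstract describes.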
Abstract:
Large-scale wireless ad-hoc networks of computers, sensors, PDAs etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad-hoc networks are sensor networks, which are usually composed of small units able to sense and transmit to a sink elementary data that are subsequently processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad-hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad-hoc network does not suffer from the dead-end problem, which happens in geographic-based routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid allows multidimensional data management, since nodes' virtual coordinates can act as a distributed database without needing special implementation or reorganization. Any kind of data (both single- and multidimensional) can be distributed, stored and managed. We will show how a location service can be easily implemented so that any search is reduced to a simple query, as for any other data type. WGrid has then been extended by adopting a replication methodology; we called the resulting algorithm WRGrid.
Just like WGrid, WRGrid acts as a distributed database without needing special implementation or reorganization, and any kind of data can be distributed, stored and managed. We have evaluated the benefits of replication on data management, finding from experimental results that it can halve the average number of hops in the network. The direct consequences are a significant improvement in energy consumption and a workload balancing among sensors (in terms of the number of messages routed by each node). Finally, thanks to the replicas, whose number can be chosen arbitrarily, the resulting sensor network can tolerate sensor disconnections and connections, due to sensor failures, without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery from link and/or device failures that may happen due to crashes, battery exhaustion or temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive number of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. Performance has been compared with existing algorithms in order to validate the results.
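To make the virtual-coordinate idea concrete, here is a generic sketch of greedy routing over tree-shaped binary-string coordinates, where each message is forwarded to the neighbor whose coordinate is closest to the destination's. This is an illustration of the general technique, not the actual WGrid algorithm: the coordinate scheme, the node names and the tree metric in `coord_distance` are all assumptions.

```python
def coord_distance(a, b):
    # Tree metric on binary-string coordinates: hops up to the longest
    # common prefix, then down to the other coordinate.
    p = 0
    while p < min(len(a), len(b)) and a[p] == b[p]:
        p += 1
    return (len(a) - p) + (len(b) - p)

def greedy_route(graph, coords, src, dst):
    """Forward greedily toward the neighbor closest to the destination.
    On a coherent tree overlay this strictly decreases the distance,
    so routing cannot dead-end."""
    path = [src]
    while path[-1] != dst:
        here = path[-1]
        nxt = min(graph[here], key=lambda n: coord_distance(coords[n], coords[dst]))
        if coord_distance(coords[nxt], coords[dst]) >= coord_distance(coords[here], coords[dst]):
            raise RuntimeError("routing stuck")  # cannot happen on a valid tree overlay
        path.append(nxt)
    return path

# Toy overlay: each node's coordinate extends its parent's by one bit.
coords = {"A": "0", "B": "00", "C": "01", "D": "010"}
graph = {"A": ["B", "C"], "B": ["A"], "C": ["A", "D"], "D": ["C"]}
path = greedy_route(graph, coords, "A", "D")
```

Because every forwarding step strictly reduces the coordinate distance, no flooding and no GPS information is needed, which is the property the abstract emphasizes.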
Abstract:
The obligate intracellular pathogen Chlamydia trachomatis is a gram-negative bacterium which infects epithelial cells of the reproductive tract. C. trachomatis is the leading cause of bacterial sexually transmitted disease worldwide, and a vaccine against this pathogen is highly needed. Much evidence suggests that both antigen-specific Th1 cells and antibodies may be important to provide protection against Chlamydia infection. In a previous study we identified eight new Chlamydia antigens inducing CD4-Th1 and/or antibody responses that, when combined properly, can protect mice from Chlamydia infection. However, all the selected recombinant antigens, upon immunization in mice, elicited antibodies unable to neutralize Chlamydia infectivity in vitro. With the aim of improving the quality of the immune response by inducing effective neutralizing antibodies, we used a novel delivery system based on the unique capacity of E. coli Outer Membrane Vesicles (OMV) to present membrane proteins in their natural composition and conformation. We expressed Chlamydia antigens, previously identified as vaccine candidates, in the OMV system. Among all OMV preparations, the one expressing the HtrA Chlamydia antigen (OMV-HtrA), which proved to be the best in terms of yield and quantity of expressed protein, was used to produce mouse immune sera to be tested in an in vitro neutralization assay. We observed that OMV-HtrA elicited specific antibodies able to neutralize Chlamydia infection efficiently in vitro, indicating that the presentation of the antigens in their natural conformation is crucial to induce an effective immune response. This is one of the first examples in which antibodies directed against a new Chlamydia antigen, other than MOMP (the only antigen so far known to induce neutralizing antibodies), are able to block Chlamydia infectivity in vitro.
Finally, by performing an epitope mapping study, we investigated the specificity of the antibody response induced by the recombinant HtrA and by OMV-HtrA. In particular, we identified some linear epitopes exclusively recognized by antibodies raised with the OMV-HtrA system, thereby detecting the antigen regions likely responsible for the neutralizing effect.
Abstract:
The diameters of traditional dish concentrators can reach several tens of meters, and the construction of monolithic mirrors is difficult at these scales: cheap flat reflecting facets mounted on a common frame generally reproduce a paraboloidal surface. When a standard imaging mirror is coupled with a PV dense array, problems arise since the focused solar image is intrinsically circular. Moreover, the corresponding irradiance distribution is bell-shaped, in contrast with the requirement of having all the cells under the same illumination. Mismatch losses occur when interconnected cells experience different conditions, particularly in series connections. In this PhD thesis we aim at solving these issues with a multidisciplinary approach, exploiting optical concepts and applications developed specifically for astronomical use, where the improvement of image quality is a very important issue. The strategy we propose is to boost the spot uniformity by acting uniquely on the primary reflector, avoiding the segmentation of big mirrors into numerous smaller elements that need to be accurately mounted and aligned. In the proposed method, the shape of the mirrors is analytically described by Zernike polynomials, and its optimization is obtained numerically so as to yield a non-imaging optic able to produce a quasi-square spot, spatially uniform and with a prescribed concentration level. The freeform primary optics leads to a substantial gain in efficiency without secondary optics. Simple electrical schemes for the receiver are also required. The concept has been investigated theoretically by modeling an example of a CPV dense-array application, including the development of non-optical aspects such as the design of the detector and of the supporting mechanics. For the proposed method and the specific CPV system described, a patent application has been filed in Italy with the number TO2014A000016.
The patent has been developed thanks to the collaboration between the University of Bologna and INAF (National Institute for Astrophysics).
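The analytic description mentioned above can be sketched by evaluating the freeform mirror's deviation from its base shape as a weighted sum of a few low-order Zernike polynomials on the unit disk. The term selection and the weights below are illustrative assumptions, not the optimized (patented) shape.

```python
import numpy as np

def zernike_terms(rho, theta):
    """A few unnormalized low-order Zernike polynomials on the unit disk."""
    return {
        "defocus":   2 * rho**2 - 1,
        "astig_0":   rho**2 * np.cos(2 * theta),
        "spherical": 6 * rho**4 - 6 * rho**2 + 1,
    }

def freeform_sag(rho, theta, weights):
    """Mirror deviation from the base surface as a weighted Zernike sum.
    Weights are hypothetical, chosen only to illustrate the expansion."""
    terms = zernike_terms(rho, theta)
    return sum(weights[k] * terms[k] for k in weights)

# Sample the sag over the unit disk in polar coordinates.
r = np.linspace(0, 1, 50)
t = np.linspace(0, 2 * np.pi, 50)
R, T = np.meshgrid(r, t)
sag = freeform_sag(R, T, {"defocus": 0.3, "astig_0": 0.1, "spherical": -0.05})
```

In an optimization loop, the weights would be the free parameters adjusted until the ray-traced spot meets the uniformity and concentration targets.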
Abstract:
The first part of this work deals with the solution of the inverse problem in the field of X-ray spectroscopy. An original strategy to solve the inverse problem using the maximum entropy principle is illustrated, and the code UMESTRAT is built to apply the described strategy in a semi-automatic way. The application of UMESTRAT is shown with a computational example. The second part of this work deals with the improvement of the X-ray Boltzmann model, by studying two radiative interactions neglected in current photon models. First, the characteristic line emission due to Compton ionization is studied. A strategy is developed that allows the evaluation of this contribution for the K, L and M shells of all elements with Z from 11 to 92. The single-shell Compton/photoelectric ratio is evaluated as a function of the primary photon energy, and the energy values at which the Compton interaction becomes the prevailing ionization process for the considered shells are derived. Finally, a new kernel for XRF from Compton ionization is introduced. Second, the bremsstrahlung radiative contribution due to the secondary electrons is characterized in terms of space, angle and energy, for all elements with Z = 1-92 in the energy range 1-150 keV, using the Monte Carlo code PENELOPE. It is demonstrated that the bremsstrahlung radiative contribution can be well approximated by an isotropic point photon source. A data library comprising the energy distributions of bremsstrahlung is created, and a new bremsstrahlung kernel is developed which allows the introduction of this contribution into the modified Boltzmann equation. An example of application to the simulation of a synchrotron experiment is shown.
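The flavor of a maximum-entropy inverse solution can be shown on a toy underdetermined system A f = d. The multiplicative ART scheme below is a standard iterative method that, for consistent non-negative systems, converges to an entropy-maximizing solution; it is a stand-in for illustration only, since the abstract does not give UMESTRAT's actual algorithm.

```python
import numpy as np

def mart(A, d, iters=2000, gamma=1.0):
    """Multiplicative ART: for a consistent system A f = d with f > 0,
    iterates converge to an entropy-maximizing solution. Hypothetical
    stand-in for the maximum-entropy strategy described in the thesis."""
    f = np.ones(A.shape[1])  # flat (maximally non-committal) start
    for _ in range(iters):
        for i in range(A.shape[0]):
            ratio = d[i] / (A[i] @ f)
            f *= ratio ** (gamma * A[i])  # multiplicative row-wise correction
    return f

# Toy underdetermined problem: two measurements, three spectrum bins.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
d = np.array([1.0, 1.0])
f = mart(A, d)
```

Among the infinitely many non-negative spectra consistent with the data, the iteration picks the least-committal one, which is exactly the role of the maximum entropy principle in spectral unfolding.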
Abstract:
The first mechanical automaton concept is found in a Chinese text written in the 3rd century BC, while Computer Vision was born in the late 1960s. Visual perception applied to machines (i.e. Machine Vision) is therefore a young and exciting alliance. When robots came in, the new field of Robotic Vision was born, and these terms began to be erroneously interchanged. In short, we can say that Machine Vision is an engineering domain concerned with the industrial use of Vision. Robotic Vision, instead, is a research field that tries to incorporate robotics aspects into computer vision algorithms. Visual Servoing, for example, is one of the problems that cannot be solved by computer vision alone. Accordingly, a large part of this work deals with boosting popular Computer Vision techniques by exploiting robotics, e.g. the use of kinematics to localize a vision sensor mounted as the robot end-effector. The remainder of this work is dedicated to the counterpart, i.e. the use of computer vision to solve real robotic problems like grasping objects or navigating while avoiding obstacles. A brief survey of the mapping data structures most widely used in robotics will be presented, along with SkiMap, a novel sparse data structure created both for robotic mapping and as a general-purpose 3D spatial index. Several approaches to Object Detection and Manipulation that exploit the aforementioned mapping strategies will then be proposed, along with a completely new Machine Teaching facility intended to simplify the training procedure of modern Deep Learning networks.
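To give a flavor of a sparse 3D spatial index for robotic mapping, here is a minimal hash-based voxel map. This is a deliberately simplified stand-in, not SkiMap itself (which is organized as a hierarchy of skip lists); the class name, resolution and hit-count policy are illustrative assumptions.

```python
class SparseVoxelMap:
    """Minimal sparse 3D occupancy map keyed by integer voxel indices.
    Only voxels that actually receive measurements consume memory."""

    def __init__(self, resolution=0.05):
        self.resolution = resolution
        self.voxels = {}  # (i, j, k) -> number of sensor hits

    def key(self, x, y, z):
        # Discretize a metric point into its voxel index.
        r = self.resolution
        return (int(x // r), int(y // r), int(z // r))

    def integrate(self, x, y, z):
        # Accumulate one sensor hit (e.g. a depth-camera point) in its voxel.
        k = self.key(x, y, z)
        self.voxels[k] = self.voxels.get(k, 0) + 1

    def occupied(self, x, y, z, min_hits=2):
        # Require repeated hits before declaring a voxel occupied,
        # a crude way to filter out sensor noise.
        return self.voxels.get(self.key(x, y, z), 0) >= min_hits

m = SparseVoxelMap(resolution=0.1)
for _ in range(3):
    m.integrate(1.02, 0.51, 0.33)  # repeated hits on the same surface point
```

A grasping or navigation module would then query `occupied(...)` for collision checks, which is the general-purpose spatial-index role the abstract attributes to SkiMap.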
Abstract:
Non-linear effects are responsible for peculiar phenomena in the dynamics of charged particles in circular accelerators. Recently, they have been used to propose novel beam manipulations in which the transverse beam distribution is modified in a controlled way, to fulfil the constraints posed by new applications. One example is the resonant beam splitting used at CERN for the Multi-Turn Extraction (MTE), to transfer proton beams from the PS to the SPS. The theoretical description of these effects relies on the formulation of particle dynamics in terms of Hamiltonian systems and symplectic maps, and on the theory of adiabatic invariance and resonant separatrix crossing. Close to resonance, new stable regions and new separatrices appear in the phase space. As non-linear effects do not preserve the Courant-Snyder invariant, it is possible for a particle to cross a separatrix, changing the value of its adiabatic invariant. This process opens the path to new beam manipulations. This thesis deals with various possible effects that can be used to shape the transverse beam dynamics, using 2D and 4D models of particle motion. We show the possibility of splitting a beam using a resonant external exciter, or combining its action with an MTE-like tune modulation close to resonance. Non-linear effects can also be used to cool a beam by acting on its transverse distribution. We discuss the case of an annular beam distribution, showing that emittance can be reduced by modulating the amplitude and frequency of a resonant oscillating dipole. We then consider 4D models where, close to resonance, motion in the two transverse planes is coupled. This is exploited to operate on the transverse emittances with a 2D resonance crossing. Depending on the resonance, the result is an emittance exchange between the two planes, or an emittance sharing. These phenomena are described and understood in terms of adiabatic invariance theory.
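The symplectic-map formulation can be illustrated with the classic 2D Hénon map, a standard toy model of non-linear transverse dynamics: one turn is a sextupole-like kick followed by a linear rotation by the phase advance. The tune and initial conditions below are illustrative, not those studied in the thesis, which uses richer 2D and 4D models.

```python
import math

def henon_map(x, p, omega):
    """One turn of the 2D Henon map: sextupole-like kick on the momentum,
    then a linear rotation by omega. The map is exactly symplectic
    (area-preserving), as required for Hamiltonian dynamics."""
    kx = p + x * x
    c, s = math.cos(omega), math.sin(omega)
    return c * x + s * kx, -s * x + c * kx

# Track a small-amplitude particle at a tune away from low-order resonances;
# the orbit stays bounded, tracing a slightly distorted ellipse in phase space.
omega = 2 * math.pi * 0.21
x, p = 0.05, 0.0
orbit = []
for _ in range(1000):
    x, p = henon_map(x, p, omega)
    orbit.append((x, p))
```

Moving the tune close to a resonance (e.g. near 1/3 or 1/4) makes islands and separatrices appear in the tracked phase space, which is the structure exploited by the beam manipulations described above.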
Abstract:
The PhD research project is a striking example of the valorization of milling by-products and of alternative protein sources from the house cricket (Acheta domesticus), conceived as sustainable and renewable sources to produce innovative food products. During the milling of wheat and rye, several by-products with high technological and functional potential are produced. The use of selected microbial consortia allowed a pre-fermented ingredient to be obtained for use in bakery. The pre-ferments obtained were characterized by high technological, functional and nutritional value, and are also interesting from a nutraceutical point of view. Bakery products obtained by the addition of pre-fermented ingredients were characterized by a greater quantity of aromatic molecules and an increase in SCFA, antioxidant activity, total amino acids and total phenols, resulting in a positive effect on functionality. Moreover, the industrial scaling-up of pre-ferment and innovative bakery goods production developed in this research underlined the technological applicability of pre-fermented ingredients on a large scale. Furthermore, the identification of innovative protein sources can address the demand for new sustainable ingredients with a lower impact on the environment and able to satisfy global food demand. To upscale insect production and ensure the food safety of insect-based products, biotechnological formulations based on Acheta domesticus powder were optimized. The use of Yarrowia lipolytica in the biotechnological transformation of cricket powder led to a cricket-based food ingredient characterized by a reduced content of chitin and an increase in antimicrobial and health-promoting molecules. The innovative bakery products containing cricket-based hydrolysates from Y. lipolytica possessed specific sensory, qualitative and functional characteristics. Moreover, the combination of Y. lipolytica hydrolysis and baking showed promising results regarding reduced allergenicity in cricket-based baked products. Thus, the hydrolysate of cricket powder may represent a versatile and promising ingredient in the production of innovative foods.