8 results for Life-time distribution

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 90.00%

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, aimed at reducing the area, power and cost of integrated circuits while increasing their computational performance. This trend, known as technology scaling, is approaching the nanometre scale. The uncertainty of the lithographic process in the manufacturing stage grows as transistor sizes shrink, resulting in larger parameter variations in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions involving several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the thermal behaviour of a system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters such as supply voltage or body bias; ii) error-detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a NoC cycle-accurate simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing the system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design.
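As an illustration of the kind of thermal analysis advocated above, the following Python sketch integrates a minimal lumped-RC thermal model of a few NoC tiles and compares a hot-spot power distribution with a balanced layout. It is only a sketch of the general technique: the tile count, thermal resistances, capacitances and power traces are hypothetical assumptions, not values from the thesis or its library.

# Minimal lumped-RC thermal sketch for a row of NoC tiles (illustrative only;
# resistances, capacitances and power traces are hypothetical, not the thesis' values).
import numpy as np

N_TILES = 4          # tiles in a 1-D strip
R_AMB   = 2.0        # K/W, tile-to-ambient thermal resistance (assumed)
R_LAT   = 1.0        # K/W, tile-to-tile lateral resistance (assumed)
C_TILE  = 0.05       # J/K, tile thermal capacitance (assumed)
T_AMB   = 45.0       # deg C, ambient/package temperature
DT      = 1e-3       # s, integration step
STEPS   = 5000

def simulate(power):
    """Forward-Euler integration of the tile temperatures for a given
    power trace `power` of shape (STEPS, N_TILES)."""
    T = np.full(N_TILES, T_AMB)
    history = np.empty((STEPS, N_TILES))
    for k in range(STEPS):
        # heat flow to ambient and to lateral neighbours
        q_amb = (T - T_AMB) / R_AMB
        q_lat = np.zeros(N_TILES)
        q_lat[:-1] += (T[:-1] - T[1:]) / R_LAT
        q_lat[1:]  += (T[1:] - T[:-1]) / R_LAT
        T = T + DT * (power[k] - q_amb - q_lat) / C_TILE
        history[k] = T
    return history

# Hot-spot scenario: all traffic (hence power) concentrated on tile 0
hot = np.zeros((STEPS, N_TILES)); hot[:, 0] = 1.5          # W
# Balanced layout: the same total power spread over all tiles
balanced = np.full((STEPS, N_TILES), 1.5 / N_TILES)

print("peak T, hot-spot layout :", simulate(hot).max().round(1), "degC")
print("peak T, balanced layout :", simulate(balanced).max().round(1), "degC")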
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyse the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation, together with their good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
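The role of such a statistical analysis can likewise be illustrated with a small Monte Carlo sketch: each die receives a systematic (shared) and a random (per-gate) threshold-voltage shift, and the timing yield is estimated with and without an ABB/ASV-style compensation. The delay model, variation magnitudes and compensation factor are assumptions for illustration only, not the tool or data described in the thesis.

# Monte Carlo sketch of statistical delay analysis under process variation
# (illustrative only; the delay model, variation magnitudes and path depth
# are hypothetical assumptions, not the thesis' tool).
import numpy as np

rng = np.random.default_rng(0)

N_DIES   = 10_000     # Monte Carlo samples (one die each)
N_STAGES = 20         # gates along the critical path
D_NOM    = 1.0        # ns, nominal critical-path delay
T_CLK    = 1.10       # ns, timing target

# Threshold-voltage variation split into a systematic (per-die) component,
# shared by all gates on the die, and a random (per-gate) component.
sigma_sys, sigma_rnd = 0.03, 0.05          # relative sigmas (assumed)
k_delay = 1.5                              # delay sensitivity to Vth shift (assumed)

dvth_sys = rng.normal(0.0, sigma_sys, size=(N_DIES, 1))
dvth_rnd = rng.normal(0.0, sigma_rnd, size=(N_DIES, N_STAGES))

# Per-gate delay scales (to first order) with the local Vth shift; the
# random part averages out along the path, the systematic part does not.
stage_delay = (D_NOM / N_STAGES) * (1.0 + k_delay * (dvth_sys + dvth_rnd))
path_delay = stage_delay.sum(axis=1)

yield_nominal = np.mean(path_delay <= T_CLK)

# Crude ABB/ASV-style compensation: slow dies (systematically high Vth) get a
# body-bias/supply adjustment that removes part of the systematic shift.
compensated = stage_delay - (D_NOM / N_STAGES) * k_delay * 0.8 * dvth_sys
yield_comp = np.mean(compensated.sum(axis=1) <= T_CLK)

print(f"timing yield without compensation: {yield_nominal:.1%}")
print(f"timing yield with ABB/ASV-style compensation: {yield_comp:.1%}")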

Relevance: 80.00%

Abstract:

Understanding the complex relationships between the quantities measured by volcanic monitoring networks and shallow magma processes is a crucial step towards the comprehension of volcanic processes and a more realistic evaluation of the associated hazard. This question is very relevant at Campi Flegrei, a quiescent volcanic caldera immediately north-west of Napoli (Italy). The system shows intense fumarole release and periodic slow ground movement (bradyseism) accompanied by high seismicity. This activity, together with the high population density and the presence of military and industrial buildings, makes Campi Flegrei one of the areas with the highest volcanic hazard in the world. In such a context my thesis has focused on the magma dynamics due to the refilling of shallow magma chambers, and on the geophysical signals, detectable by seismic, deformation and gravimetric monitoring networks, that are associated with these phenomena. Indeed, the refilling of magma chambers is a process that frequently occurs just before a volcanic eruption; therefore, the ability to identify these dynamics through the analysis of recorded signals is important for evaluating the short-term volcanic hazard. The space-time evolution of the dynamics due to the injection of new magma into the magma chamber has been studied by performing numerical simulations with, and implementing additional features in, the code GALES (Longo et al., 2006), recently developed and still being upgraded at the Istituto Nazionale di Geofisica e Vulcanologia in Pisa (Italy). GALES is a finite element code based on a two-dimensional, transient physico-mathematical model able to treat fluids as multiphase homogeneous mixtures, from compressible to incompressible. The fundamental equations of mass, momentum and energy balance are discretised in both time and space using the Galerkin Least-Squares and discontinuity-capturing stabilisation techniques. The physical properties of the mixture are computed as functions of the local conditions of magma composition, pressure and temperature. The model makes it possible to study a broad range of phenomena characterising pre- and syn-eruptive magma dynamics, in a wide domain extending from the volcanic crater to the deep magma feeding zones. The study of the displacement field associated with the simulated fluid dynamics has been carried out with a numerical code developed by the geophysics group at University College Dublin (O'Brien and Bean, 2004b), with whom we started a very profitable collaboration. In this code, seismic wave propagation in heterogeneous media with a free surface (e.g. the Earth's surface) is simulated using a discrete elastic lattice in which particle interactions are controlled by Hooke's law. This method makes it possible to account for medium heterogeneities and complex topography. The initial and boundary conditions for the simulations have been defined within a coordinated project (INGV-DPC 2004-06 V3_2 "Research on active volcanoes, precursors, scenarios, hazard and risk - Campi Flegrei"), to which this thesis contributes and in which many researchers with expertise on Campi Flegrei in the volcanological, seismic, petrological and geochemical fields collaborate. The numerical simulations of the magma and rock dynamics have been coupled as described in the thesis. The first part of the thesis consists of a parametric study aimed at understanding the effect of the presence of carbon dioxide in magma on the convection dynamics.
Indeed, this volatile was relevant in many Campi Flegrei eruptions, including some commonly considered as references for future activity of this volcano. A set of simulations has been performed considering an elliptical, compositionally uniform magma chamber refilled from below by a magma with a volatile content equal to or different from that of the resident magma. To do this, a multicomponent non-ideal magma saturation model (Papale et al., 2006), which considers the simultaneous presence of CO2 and H2O, has been implemented in GALES. Results show that the presence of CO2 in the incoming magma increases its buoyancy, promoting convection and mixing. The simulated dynamics produce pressure transients with frequencies and amplitudes within the sensitivity range of modern geophysical monitoring networks such as the one installed at Campi Flegrei. In the second part, simulations more closely related to the Campi Flegrei volcanic system have been performed. The simulated system has been defined on the basis of conditions consistent with the bulk of knowledge on Campi Flegrei, and in particular on the Agnano-Monte Spina eruption (4100 B.P.), commonly considered as a reference for a future high-intensity eruption in this area. The magmatic system has been modelled as a long dyke refilling a small shallow magma chamber; magmas with trachytic and phonolitic compositions and variable H2O and CO2 contents have been considered. The simulations have been carried out changing the conditions of magma injection, the system configuration (magma chamber geometry, dyke size) and the composition and volatile content of the resident and refilling magmas, in order to study the influence of these factors on the simulated dynamics. The simulation results allow each step of the ascent of the gas-rich magma through the denser magma to be followed, highlighting the details of magma convection and mixing. In particular, the presence of more CO2 in the deep magma results in more efficient and faster dynamics. Through these simulations the variation of the gravimetric field has been determined. Afterwards, the space-time distribution of stress resulting from the numerical simulations has been used as the boundary condition for the simulations of the displacement field imposed by the magmatic dynamics on the rocks. The properties of the simulated domain (rock density, P and S wave velocities) have been based on literature data from active and passive tomographic experiments, obtained through a collaboration with A. Zollo at the Dept. of Physics of the Federico II University in Napoli. The elasto-dynamic simulations allow the variations of the space-time distribution of deformation and the seismic signal associated with the studied magmatic dynamics to be determined. In particular, the results show that these dynamics induce deformations similar to those measured at Campi Flegrei, and seismic signals with energies concentrated in the frequency bands typically observed in volcanic areas. The present work shows that an approach based on the solution of the equations describing the physics of the processes within a magmatic fluid and the surrounding rock system is able to recognise and describe the relationships between the geophysical signals detectable at the surface and the deep magma dynamics. Therefore, the results suggest that the combined study of geophysical data and information from numerical simulations can, in the near future, allow a more efficient evaluation of the short-term volcanic hazard.
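To illustrate the buoyancy mechanism underlying these results, the sketch below estimates how a small exsolved CO2 fraction lowers the density of a homogeneous melt-gas mixture relative to a degassed resident magma. It assumes ideal-gas behaviour for the exsolved phase and hypothetical pressure, temperature and melt-density values; it is not the multicomponent non-ideal saturation model (Papale et al., 2006) implemented in GALES.

# Minimal sketch of the buoyancy effect of exsolved volatiles on magma density
# (illustrative only: ideal-gas behaviour and the listed values are assumptions).
import numpy as np

R = 8.314            # J/(mol K), gas constant
T = 1100 + 273.15    # K, magma temperature (assumed)
P = 150e6            # Pa, pressure at chamber depth (assumed)
RHO_MELT = 2450.0    # kg/m3, volatile-free trachytic melt density (assumed)
M_CO2 = 0.044        # kg/mol

def mixture_density(w_gas, M_gas):
    """Density of a homogeneous melt + exsolved-gas mixture, where `w_gas`
    is the exsolved gas mass fraction and `M_gas` its molar mass."""
    rho_gas = P * M_gas / (R * T)          # ideal-gas density at chamber P, T
    # mass-weighted specific volumes of the two phases
    v_mix = w_gas / rho_gas + (1.0 - w_gas) / RHO_MELT
    return 1.0 / v_mix

resident = RHO_MELT                          # degassed resident magma
for w in (0.002, 0.005, 0.01):               # exsolved CO2 mass fractions (assumed)
    rho = mixture_density(w, M_CO2)
    buoyancy = (resident - rho) * 9.81       # N/m3, driving force per unit volume
    print(f"w_CO2 = {w:.3f}: rho_mix = {rho:7.1f} kg/m3, "
          f"buoyancy = {buoyancy:7.1f} N/m3")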

Relevance: 80.00%

Abstract:

The Gaia space mission is a major project for the European astronomical community. The processing and analysis of the huge data flow coming from Gaia, challenging as it is, is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the life-time of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained from our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition and treatment, and particular care has to be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. However, the field I was personally responsible for was photometry, and in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline which allows the pre-reduction of imaging SPSS data and the production of aperture photometry catalogues ready to be used for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light curve production and analysis.
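For illustration, the sketch below shows the core of a relative (differential) photometry step of the kind used to build short-term light curves: sky-subtracted aperture fluxes are measured for the target and for a set of comparison stars, and the target flux is normalised by the summed comparison flux frame by frame. The aperture routine, radii and field layout are simplified assumptions, not the actual DPAC/Bologna pipeline.

# Sketch of a relative (differential) photometry step for short-term light curves
# (illustrative only; not the actual SPSS pipeline).
import numpy as np

def aperture_flux(image, x0, y0, r_ap, r_in, r_out):
    """Sum counts in a circular aperture of radius r_ap around (x0, y0),
    subtracting the median sky estimated in the annulus r_in..r_out."""
    yy, xx = np.indices(image.shape)
    dist = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(dist >= r_in) & (dist < r_out)])
    ap = dist <= r_ap
    return (image[ap] - sky).sum()

def relative_light_curve(frames, target_xy, comparison_xy, r_ap=5, r_in=8, r_out=12):
    """Target flux divided by the summed flux of the comparison stars,
    frame by frame; removes transparency/airmass variations to first order."""
    curve = []
    for img in frames:
        f_target = aperture_flux(img, *target_xy, r_ap, r_in, r_out)
        f_comp = sum(aperture_flux(img, x, y, r_ap, r_in, r_out)
                     for x, y in comparison_xy)
        curve.append(f_target / f_comp)
    curve = np.asarray(curve)
    return curve / np.median(curve)          # normalised relative light curve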

Relevance: 80.00%

Abstract:

As land is developed, the impervious surfaces that are created increase the amount of runoff during rainfall events, disrupting the natural hydrologic cycle and increasing both the runoff volume and the pollutant loadings. Pollutants deposited on, or derived from activities on, the land surface, such as nutrients, sediment, heavy metals, hydrocarbons, gasoline additives, pathogens, deicers, herbicides and pesticides, will likely end up in stormwater runoff in some concentration. Several of these pollutants are particulate-bound, so sediment removal can provide significant water-quality improvements, and knowledge of the ability of stormwater treatment devices to retain particulate matter is important. For this reason three different units that remove sediments have been tested in the laboratory. In particular, a roadside gully pot has been tested under steady hydraulic conditions, varying the characteristics of the influent solids (diameter, particle size distribution and specific gravity). The efficiency in terms of particles retained has been evaluated as a function of influent flow rate and particle characteristics; the results have been compared to the efficiency obtained by applying an overflow rate model. Furthermore, the role of particle settling velocity in determining efficiency has been investigated. After the experimental runs on the gully pot, a standard full-scale model of a hydrodynamic separator (HS) has been tested under unsteady influent flow rate conditions and constant influent solids concentration. The results presented in this study illustrate that the particle separation efficiency of the unit is predominantly influenced by the operating flow rate, which strongly affects the particle and hydraulic residence times of the system. The efficiency data have been compared to results obtained from a modified overflow rate model; moreover, the residence time distribution has been experimentally determined through tracer analyses for several steady flow rates. Finally, three tests have been performed for two different configurations of a full-scale model of a clarifier (linear and crenulated) under unsteady influent flow rate conditions and constant influent solids concentration. The results illustrate that the particle separation efficiency of this unit is predominantly influenced by the configuration of the unit itself. Turbidity measurements have been compared with the suspended sediment concentration in order to find a correlation between the two quantities, which would allow the sediment concentration to be estimated simply by installing a turbidity probe.
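For reference, the overflow rate model mentioned above can be sketched as follows: a particle is considered fully removed when its settling velocity exceeds the surface loading Q/A, with partial removal otherwise and the settling velocity estimated from Stokes' law. The particle properties, settling area and flow rates in the example are hypothetical, not those of the tested devices.

# Sketch of the ideal overflow rate model used as a benchmark for removal efficiency
# (illustrative only; particle properties, unit area and flow rates are assumed).
import numpy as np

RHO_W, MU, G = 998.0, 1.0e-3, 9.81      # water density (kg/m3), viscosity (Pa s), gravity

def stokes_settling_velocity(d, rho_p):
    """Stokes settling velocity (m/s) for a particle of diameter d (m) and
    density rho_p (kg/m3); valid for small particle Reynolds numbers."""
    return G * (rho_p - RHO_W) * d**2 / (18.0 * MU)

def overflow_rate_efficiency(d, rho_p, Q, A):
    """Ideal overflow rate model: a particle is captured if its settling
    velocity exceeds the surface loading Q/A; otherwise removal is partial."""
    vs = stokes_settling_velocity(d, rho_p)
    return np.clip(vs / (Q / A), 0.0, 1.0)

A = 0.8                                          # m2, effective settling area (assumed)
diameters = np.array([50e-6, 100e-6, 250e-6])    # m
for Q in (1.0e-3, 5.0e-3, 10.0e-3):              # m3/s, influent flow rates (assumed)
    eff = overflow_rate_efficiency(diameters, 2650.0, Q, A)   # silica-like grains
    print(f"Q = {Q*1000:4.0f} L/s -> efficiencies {np.round(eff, 2)}")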

Relevance: 80.00%

Abstract:

Although magnesium is essential for most biological processes, little is still known about its intracellular distribution and compartmentalisation, mainly because of the inadequacy of the techniques currently available. For this reason, particular interest has recently been raised by a family of fluorescent molecules, diaza-18-crown-6 8-hydroxyquinolines (DCHQ1 and its derivatives), which show high specificity and affinity for magnesium (higher than that of commercial probes), allowing total intracellular magnesium to be mapped. The synthetic approach to the DCHQ molecules has been optimised through microwave heating: with this new method it was possible to synthesise a family of derivatives with enhanced fluorescence, uptake, retention and intracellular localisation compared with the parent compound DCHQ1. The acetomethoxy ester derivative (DCHQ3), hydrolysed by cellular esterases, showed improved uptake and intracellular retention; the long alkyl side chains of the DCHQ4 probe, instead, conferred greater lipophilicity on this derivative and, consequently, greater affinity for membranes; finally, with the insertion of aromatic side groups, two probes (DCHQ5 and DCHQ6) were obtained that are highly fluorescent and strongly retained inside cells even after washing. The phenyl derivative DCHQ5 also proved usable for quantitative fluorimetric assays of total magnesium in very small cell samples; moreover, thanks to its high cellular retention, it was used to monitor and quantify magnesium efflux across the plasma membrane in response to cAMP stimulation. The results presented in this thesis show that the DCHQ derivatives may in the future represent a versatile tool for the study of cellular magnesium distribution and homeostasis. In particular, the DCHQ5 probe showed the additional peculiarity of being excitable both in the UV and in the visible range, and could therefore be used successfully in a wide variety of fluorescence measurements, providing an important contribution to the understanding of the role of this important element.

Relevance: 80.00%

Abstract:

The medieval town of Leopoli-Cencelle (founded by Pope Leo IV in 854 AD, not far from Civitavecchia) has been the subject of study and periodic excavation campaigns since 1994. The stratigraphies, investigated with traditional methods, have brought to light the numerous transformations that the town underwent during its lifetime. Houses, towers, workshops and occupation layers have, from the beginning of the excavation, been interpreted on the basis of traditional two-dimensional documentation, tied to paper records and drawings. The present work aims to re-interpret the excavation data with the help of digital technologies. The project used a laser scanner, Computer Vision techniques and 3D modelling. The three methods have been combined so as to visualise the excavated dwellings in three dimensions, with the possibility of overlaying simple 3D models that allow different hypotheses on the shape and use of the spaces to be formulated. Modelling space and time while offering various possible choices makes it possible to combine real three-dimensional data, acquired with a laser scanner, with simple philological 3D models, and offers the opportunity to evaluate different possible interpretations of a building's characteristics on the basis of its spaces, materials and construction techniques. The aim of the project is to go beyond Virtual Reality, with the possibility of analysing the remains and re-interpreting the function of a building, both during excavation and once the excavation is concluded. From the research point of view, the possibility of visualising hypotheses in the field fosters a deeper understanding of the archaeological context. A second objective is communication to an audience of "non-archaeologists". The intention is to offer ordinary visitors the possibility of understanding and experiencing the interpretative process, providing them with something more than a single definitive hypothesis.

Relevance: 40.00%

Abstract:

Maintaining the postharvest quality of whole and fresh-cut fruit during storage and distribution is the major challenge facing the fruit industry. For this purpose, the industry adopts a wide range of technologies to enable extended shelf-life. Many factors can lead to loss of quality in fresh produce, hence the common description of these products as 'perishable'; normal processes such as transpiration and respiration ultimately lead to water loss and senescence of the product. Fruits and vegetables are living commodities and their rate of respiration is of key importance to the maintenance of quality: it has been commonly observed that the greater the respiration rate of a product, the shorter the shelf-life. The principal problem for fresh-cut fruit industries is the shorter shelf-life of minimally processed fruit (MPF) compared to the intact product. This is closely connected with the higher ethylene production of fruit tissue stimulated during fresh-cut processing (peeling, cutting, dipping). 1-Methylcyclopropene (1-MCP) is an inhibitor of ethylene action, and several studies have shown its effectiveness in inhibiting ripening and senescence in intact fruit and consequently in extending their shelf-life. More recently 1-MCP treatment has also been tested for the shelf-life extension of MPF, but discordant results have been obtained. Considering that in some countries 1-MCP is already a commercial product registered for use on a number of horticultural products, the main aim of the present study was to enhance our understanding of the effects of 1-MCP treatment on the quality maintenance of whole and fresh-cut climacteric and non-climacteric fruit (apple, kiwifruit and pineapple). Concerning whole fruit, the effects of a semi-commercial postharvest 1-MCP treatment on the quality of Pink Lady apples were investigated as functions of fruit ripening stage, 1-MCP dose and storage time, also in combination with controlled atmosphere (CA) storage, in order to better understand the relationship among these parameters and whether the 1-MCP treatment can be optimised to meet market/consumer needs and bring excellent fruit to market. To achieve this purpose an incomplete three-level three-factor design was adopted. During storage several quality parameters were monitored: firmness, ripening index, and ethylene and carbon dioxide production; a sensory evaluation was also performed after 6 months of storage. In this study the highest retention of firmness at the end of storage was achieved by applying the greatest 1-MCP concentration to fruit at the lowest maturity stage, meaning that under these semi-commercial conditions fruit softening can be considered completely blocked. 1-MCP was also able to delay ethylene and CO2 production and the evolution of the maturity parameters (soluble solids content and total acidity). Only in some cases did 1-MCP generate a synergistic effect with CA storage. The results of the sensory analyses indicated that the 1-MCP treatment did not affect the sweetness or whole-fruit flavour, while it slightly decreased the cut-fruit flavour; on the other hand, the treated apples were more sour, crisp, firm and juicy.
The effects of some treatments (dipping and MAP) on nutrient stability were also investigated, showing that in this case study the adopted treatments did not have drastic effects on the antioxidant compounds; on the contrary, dipping may enhance the total antioxidant activity through the accumulation of ascorbic acid on the cut apple surface. Results concerning the effects of 1-MCP in combination with MAP on the quality parameters of kiwifruit were not always consistent and clear: in terms of colour maintenance, 1-MCP seemed to have a synergistic effect with N2O MAP; as far as the ripening index is concerned, 1-MCP had a preservative effect, but only for samples packed in air.

Relevance: 40.00%

Abstract:

Nanotechnologies are rapidly expanding because of the opportunities that the new materials offer in many areas, such as the manufacturing industry, food production, processing and preservation, and the pharmaceutical and cosmetic industries. The size distribution of nanoparticles determines their properties and is a fundamental parameter that needs to be monitored from small-scale synthesis up to bulk production and the quality control of nanotech products on the market. A consequence of the increasing number of applications of nanomaterials is that the EU regulatory authorities are introducing an obligation for companies that make use of nanomaterials to acquire analytical platforms for the assessment of their size parameters. In this work, Asymmetrical Flow Field-Flow Fractionation (AF4) and Hollow Fiber Flow Field-Flow Fractionation (HF5), hyphenated with Multiangle Light Scattering (MALS), are presented as tools for a deep functional characterization of nanoparticles. In particular, the applicability of AF4-MALS to the characterization of liposomes in a wide series of media is demonstrated. The technique is then used to explore the functional features of a liposomal drug vector in terms of its biological and physical interaction with blood serum components: a comprehensive approach to understanding the behavior of lipid vesicles in terms of drug release and fusion/interaction with other biological species is described, together with the weaknesses and strengths of the method. Afterwards, the size characterization, size stability, and conjugation of azidothymidine drug molecules with a new generation of metastable drug vectors, the Metal Organic Frameworks, are discussed. Lastly, the applicability of HF5-ICP-MS to the rapid screening of samples of relevant nanorisk is shown: rather than a deep and comprehensive characterization, a quick and smart methodology is presented that within a few steps provides qualitative information on the content of metallic nanoparticles in tattoo ink samples.