12 results for HIGH-HARMONIC-GENERATION

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

80.00%

Abstract:

Most current ultra-miniaturized devices are obtained by the top-down approach, in which nanoscale components are fabricated by cutting down larger precursors. Since this physical-engineering method is reaching its limits, especially for components below 30 nm in size, alternative strategies are necessary. Of particular appeal to chemists is the supramolecular bottom-up approach to nanotechnology, a methodology that utilizes the principles of molecular recognition to build materials and devices from molecular components. The subject of this thesis is the photophysical and electrochemical investigation of nanodevices obtained by harnessing the principles of supramolecular chemistry. These systems operate in solution-based environments and are investigated at the ensemble level. The majority of the chemical systems discussed here are based on pseudorotaxanes and catenanes. Such supramolecular systems represent prototypes of molecular machines, since they are capable of performing simple controlled mechanical movements. Their properties and operation are closely related to the supramolecular interactions between molecular components (generally photoactive or electroactive molecules) and to the possibility of modulating such interactions by means of external stimuli. The main issues addressed throughout the thesis are: (i) the analysis of the factors that can affect the architecture and perturb the stability of supramolecular systems; (ii) the possibility of controlling the direction of supramolecular motions by exploiting the molecular information content; (iii) the development of switchable supramolecular polymers starting from simple host-guest complexes; (iv) the capability of some molecular machines to process information at the molecular level, thus behaving as logic devices; (v) the behaviour of molecular machine components in a biological-type environment; (vi) the study of chemically functionalized metal nanoparticles by second harmonic generation spectroscopy.

Relevance:

30.00%

Abstract:

The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-VAR) algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small scale MOdelling) in the ARPA-SIM operational configuration is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D-VAR set-up comprising the two water vapour and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias-correction procedures and correct radiative transfer simulations. The 1D-VAR retrieval quality is first quantified in relative terms, employing statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the retrieval profiles generated by the 1D-VAR are well correlated with the radiosonde measurements. Subsequently, the 1D-VAR technique is applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli-Venezia Giulia on 8 July 2004 and a heavy-precipitation case that occurred in the Emilia-Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the column-integrated water vapour and saturation water vapour, in the 2-metre temperature and specific humidity, and in the surface temperature.
To improve the 1D-VAR technique, a method to calculate flow-dependent model error covariance matrices is also assessed. The approach employs members from an ensemble forecast system generated by perturbing physical parameterisation schemes inside the model. The improved set-up applied to the case of 8 July 2004 shows an essentially neutral impact.
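The flow-dependent background-error covariances mentioned above are, in essence, sample covariances of ensemble perturbations. A minimal sketch of that estimation step (the member count and state size are illustrative, not the actual COSMO ensemble):

```python
import numpy as np

def ensemble_b_matrix(members: np.ndarray) -> np.ndarray:
    """Estimate a background-error covariance matrix B as the sample
    covariance of ensemble deviations from the ensemble mean.

    members: array of shape (n_members, n_state) -- one model state per row.
    """
    deviations = members - members.mean(axis=0)
    # Unbiased sample covariance over the ensemble dimension.
    return deviations.T @ deviations / (members.shape[0] - 1)

# Illustrative ensemble: 10 members, 5 vertical levels of temperature (K).
rng = np.random.default_rng(0)
members = 280.0 + rng.normal(scale=1.5, size=(10, 5))
B = ensemble_b_matrix(members)
```

In an operational setting the ensemble would come from the perturbed-physics forecasts, and B would feed the 1D-VAR cost function in place of a static covariance.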

Relevance:

30.00%

Abstract:

The hard X-ray band (10 - 100 keV) has so far been observed only by collimated and coded-aperture-mask instruments, whose sensitivity and angular resolution are about two orders of magnitude worse than those of the current X-ray focusing telescopes operating below 10 - 15 keV. Technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 μCrab in the 10 - 40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed to be applied to any high-energy mission for which the shielding and instrument performances are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the results obtained can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO.
All the results are original contributions to the assessment studies of the cited missions, carried out as part of the background groups' activities.
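The count-filtering idea behind a simulator of this kind can be illustrated with a toy Monte Carlo: generate primaries, keep only those that deposit energy in the detector, and bin the deposits into a spectrum. This is a deliberately simplified sketch with invented probabilities and a flat deposit distribution, not the BoGEMMS physics:

```python
import random

def simulated_background(n_primaries, hit_prob=1e-3, e_min=10.0, e_max=40.0,
                         n_bins=6, seed=0):
    """Toy count filtering: track primaries, keep those depositing energy
    in the detector, and histogram the deposits into a background spectrum
    (raw counts per energy bin over the 10-40 keV band)."""
    rng = random.Random(seed)
    bins = [0] * n_bins
    width = (e_max - e_min) / n_bins
    for _ in range(n_primaries):
        if rng.random() < hit_prob:        # particle reaches the detector
            e = rng.uniform(e_min, e_max)  # illustrative flat energy deposit
            bins[int((e - e_min) / width) % n_bins] += 1
    return bins

counts = simulated_background(100_000)
```

A real simulator would replace `hit_prob` and the flat deposit with full particle transport, and normalise the counts by exposure time, detector area and bin width.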

Relevance:

30.00%

Abstract:

Next-generation electronic devices have to guarantee high performance while being less power-consuming and highly reliable, for application domains ranging from entertainment to business. In this context, multicore platforms have proven to be the most efficient design choice, but new challenges have to be faced. The ever-increasing miniaturization of components produces unexpected variations in technological parameters as well as wear-out, characterized by soft and hard errors. Even though hardware techniques, which lend themselves to being applied at design time, have been studied with the objective of mitigating these effects, they are not sufficient; thus, software adaptive techniques are necessary. In this thesis we focus on multicore task-allocation strategies that minimize energy consumption while meeting performance constraints. We first devise a technique based on an Integer Linear Programming (ILP) formulation which provides the optimal solution but cannot be applied on-line, since the algorithm it requires is too time-consuming; we then propose a sub-optimal two-step technique which can be applied on-line. We demonstrate the effectiveness of the latter solution through an exhaustive comparison against the optimal solution, state-of-the-art policies, and variability-agnostic task allocations, by running multimedia applications on the virtual prototype of a next-generation industrial multicore platform. We also face the problem of performance and lifetime degradation. We first focus on embedded multicore platforms and propose an idleness-distribution policy that increases the expected lifetime of the cores by duty-cycling their activity; then, we investigate the use of micro thermoelectric coolers in general-purpose multicore processors to control the temperature of the cores at runtime, with the objective of meeting lifetime constraints without performance loss.
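The optimum that the ILP formulation computes can be illustrated by exhaustive enumeration on a toy instance: pick, for each task, the core that minimises total energy subject to per-core cycle budgets. All numbers below (energy costs, cycle counts, capacities) are invented for illustration:

```python
from itertools import product

# Hypothetical per-task energy cost on each of two cores (variability may
# make the same task cheaper on one core than another) and cycle demands.
energy = {"t0": [3.0, 2.0], "t1": [1.5, 2.5], "t2": [2.0, 2.0]}
cycles = {"t0": 4, "t1": 3, "t2": 5}
core_capacity = [8, 8]  # cycles available per core within the deadline

def optimal_allocation(energy, cycles, capacity):
    """Enumerate every task-to-core mapping (the optimum the ILP would
    find) and keep the cheapest one that meets all capacity constraints."""
    tasks = list(energy)
    best = None
    for assign in product(range(len(capacity)), repeat=len(tasks)):
        load = [0] * len(capacity)
        cost = 0.0
        for t, c in zip(tasks, assign):
            load[c] += cycles[t]
            cost += energy[t][c]
        if all(l <= cap for l, cap in zip(load, capacity)):
            if best is None or cost < best[0]:
                best = (cost, dict(zip(tasks, assign)))
    return best

cost, assignment = optimal_allocation(energy, cycles, core_capacity)
```

Enumeration is exponential in the number of tasks, which is precisely why the thesis resorts to a sub-optimal two-step heuristic for on-line use.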

Relevance:

30.00%

Abstract:

Cost, performance and availability considerations are forcing even the most conservative sectors of the high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Companies that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, which has the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout-optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry.
The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
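A layout optimiser of the kind described in point (iii) must first know which code placements contend for the same cache sets. A minimal sketch of that mapping for a direct-mapped cache; the symbol addresses and cache geometry are invented for illustration:

```python
def cache_set(addr: int, line_size: int = 64, n_sets: int = 128) -> int:
    """Set index for a direct-mapped cache: the set is selected by the
    address bits immediately above the line-offset bits."""
    return (addr // line_size) % n_sets

def conflicting(symbols, line_size=64, n_sets=128):
    """Group symbols placed in the same cache set -- candidates for the
    eviction jitter a layout optimiser would move apart."""
    by_set = {}
    for name, addr in symbols.items():
        by_set.setdefault(cache_set(addr, line_size, n_sets), []).append(name)
    return {s: names for s, names in by_set.items() if len(names) > 1}

# Hypothetical layout: two hot routines exactly one cache-size apart
# (64 B lines * 128 sets = 8 KiB) collide; the third one does not.
layout = {"task_step": 0x1000,
          "irq_handler": 0x1000 + 64 * 128,
          "idle": 0x1040}
conflicts = conflicting(layout)
```

A layout optimiser would then relocate one of the colliding symbols so that repeated executions no longer evict each other, removing that source of jitter by construction.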

Relevance:

30.00%

Abstract:

My PhD project focused on the Atlantic bluefin tuna, Thunnus thynnus, a fishery resource overexploited in recent decades. For better stock management, it was necessary to improve scientific knowledge of this species and to develop novel tools to avoid the collapse of this important commercial resource. To do this, we used new high-throughput sequencing technologies, namely Next Generation Sequencing (NGS), and markers linked to expressed genes, namely SNPs (Single Nucleotide Polymorphisms). In this work we applied a combined approach: transcriptomic resources were used to build cDNA libraries from mRNA isolated from muscle, and genomic resources allowed us to create a reference backbone for this species, which lacks a reference genome. All cDNA reads obtained from mRNA were mapped against this backbone and, employing several bioinformatics tools and stringent parameters, we obtained a set of contigs in which to detect SNPs. Once a final panel of 384 SNPs had been developed following the selection criteria, it was genotyped in 960 individuals of Atlantic bluefin tuna, including all size/age classes from larvae to adults, collected from the entire range of the species. The analysis of the data obtained was aimed at evaluating the genetic diversity and population structure of Thunnus thynnus. We detected a low but significant signal of genetic differentiation among spawning samples, suggesting the presence of three genetically separate reproduction areas. The adult samples were instead genetically undifferentiated from one another and from the spawning populations, indicating the presence of a panmictic population of adult bluefin tuna in the Mediterranean Sea, with no distinct metapopulations.
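The "low but significant" differentiation among spawning samples is the kind of signal summarised by F_ST-type statistics over SNP panels. A minimal sketch for a single biallelic SNP between two equal-sized populations (the frequencies are illustrative, and this textbook form is not necessarily the estimator used in the thesis):

```python
def fst_biallelic(p1: float, p2: float) -> float:
    """Wright's F_ST for one biallelic SNP between two equal-sized
    populations with alternate-allele frequencies p1 and p2:
    F_ST = (H_T - H_S) / H_T, with expected heterozygosity H = 2p(1-p)."""
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                       # total heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2   # mean within-pop
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t
```

Identical allele frequencies give F_ST = 0 (panmixia, as seen in the adults), while diverged frequencies push F_ST towards 1 (separated reproduction areas); genome-wide estimates average such per-locus values over the SNP panel.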

Relevance:

30.00%

Abstract:

The quest for universal memory is driving the rapid development of memories with superior all-round capabilities in non-volatility, high speed, high endurance and low power. The memory subsystem accounts for a significant share of the cost and power budget of a computer system. Current DRAM-based main memory systems are starting to hit power and cost limits. To resolve this issue, the industry is improving existing technologies such as Flash and exploring new ones. Among those new technologies is Phase Change Memory (PCM), which overcomes some of the shortcomings of Flash, such as durability and scalability. This alternative non-volatile memory technology, which uses the resistance contrast in phase-change materials, offers more density relative to DRAM and can help increase the main memory capacity of future systems while remaining within cost and power constraints. Chalcogenide materials can suitably be exploited for manufacturing phase-change memory devices. Charge transport in the amorphous chalcogenide GST used for memory devices is modeled using two contributions: hopping of trapped electrons and motion of band electrons in extended states. Crystalline GST exhibits an almost Ohmic I(V) curve. In contrast, amorphous GST shows a high resistance at low biases while, above a threshold voltage, a transition takes place from a highly resistive to a conductive state, characterized by negative differential-resistance behavior. A clear and complete understanding of the threshold behavior of the amorphous phase is fundamental for exploiting such materials in the fabrication of innovative non-volatile memories. The type of feedback that produces the snapback phenomenon is described as a filamentation in energy that is controlled by electron-electron interactions between trapped electrons and band electrons. The model thus derived is implemented within a state-of-the-art simulator.
An analytical version of the model is also derived and is useful for discussing the snapback behavior and the scaling properties of the device.
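The qualitative contrast between the two phases, Ohmic conduction in crystalline GST versus strongly superlinear subthreshold conduction in amorphous GST, can be sketched as follows. The functional form and all parameter values are illustrative and do not come from the thesis model; the snapback itself is outside this sketch:

```python
import math

def crystalline_current(v: float, r: float = 1e3) -> float:
    """Crystalline GST: almost Ohmic, I = V / R."""
    return v / r

def amorphous_current(v: float, i0: float = 1e-9, v0: float = 0.05) -> float:
    """Amorphous GST below threshold: trap-limited conduction sketched
    as I = I0 * sinh(V / V0), exponentially activated by the bias."""
    return i0 * math.sinh(v / v0)
```

Doubling the bias roughly squares the exponential factor on the amorphous branch (hundreds of times more current) while the crystalline branch merely doubles, reproducing the high low-bias resistance contrast the text describes.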

Relevance:

30.00%

Abstract:

Background. Human small cell lung cancer (SCLC), accounting for approximately 15-20% of all lung cancers, is an aggressive tumor with a high propensity for early regional and distant metastases. Although the initial tumor response rate to chemotherapy is very high, SCLC relapses after approximately 4 months in extensive disease (ED) and 12 months in limited disease (LD). Basal cell carcinoma (BCC) is the most prevalent cancer in the Western world, and its incidence is increasing worldwide. This type of cancer rarely metastasizes and its death rate is extraordinarily low. Surgery is curative for most patients, but for those who develop locally advanced or metastatic BCC there is currently no effective treatment. Both types of cancer have been deeply investigated, and genetic alterations, MYCN amplification (MA) among the most interesting, have been found. These could become targets of new pharmacological therapies. Procedures. We created and characterized novel BLI xenograft orthotopic mouse models of SCLC to evaluate tumor onset and progression and the efficacy of new pharmacological strategies. We compared an in vitro model with a transgenic mouse model of BCC to investigate and delineate the canonical HH signalling pathway and its connections with other molecular pathways. Results and conclusions. The orthotopic models showed latency and progression patterns similar to the human disease. Chemotherapy treatments improved survival rates and validated the in vivo model. The presence of MA and MYCN overexpression was confirmed in each model, and we tested the efficacy of a new MYCN inhibitor in vitro. Preliminary data from the BCC models highlighted the role of the Hedgehog pathway and underlined the importance of both in vitro and in vivo strategies to achieve a better understanding of the pathology and to evaluate the applicability of new therapeutic compounds.

Relevance:

30.00%

Abstract:

The clonal distribution of BRAFV600E in papillary thyroid carcinoma (PTC) has recently been debated. No information is currently available about precursor lesions of PTCs. My first aim was to establish whether the BRAFV600E mutation occurs as a subclonal event in PTCs. My second aim was to screen for BRAF mutations in the histologically benign tissue of cases with BRAFV600E or BRAFwt PTCs, in order to identify putative precursor lesions of PTCs. Highly sensitive semi-quantitative methods were used: Allele-Specific LNA quantitative PCR (ASLNAqPCR) and 454 Next-Generation Sequencing (NGS). For the first aim, 155 consecutive formalin-fixed and paraffin-embedded (FFPE) specimens of PTCs were analyzed. The percentage of mutated cells obtained was normalized to the estimated number of neoplastic cells. Three groups of tumors were identified: a first group had a percentage of BRAF-mutated neoplastic cells > 80%; a second group showed a number of BRAF-mutated neoplastic cells < 30%; a third group had a distribution of BRAFV600E between 30% and 80%. The large presence of BRAFV600E-mutated neoplastic cell sub-populations suggests that BRAFV600E may be acquired early during tumorigenesis: therefore, BRAFV600E can be heterogeneously distributed in PTC. For the second aim, two groups were studied: one consisted of 20 cases with BRAFV600E-mutated PTC, the other of 9 BRAFwt PTCs. Seventy-five and 23 histologically benign FFPE thyroid specimens were analyzed from the BRAFV600E-mutated and BRAFwt PTC groups, respectively. The screening for BRAF mutations identified BRAFV600E in “atypical” cell foci from both groups of patients. “Unusual” BRAF substitutions were observed in histologically benign thyroid tissue associated with BRAFV600E PTCs. These mutations were very uncommon in the group with BRAFwt PTCs and in BRAFV600E PTCs. Therefore, lesions carrying BRAF mutations may represent “abortive” attempts at cancer development: only BRAFV600E boosts neoplastic transformation to PTC.
BRAFV600E mutated “atypical foci” may represent precursor lesions of BRAFV600E mutated PTCs.
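The normalisation step described above, converting a raw mutant-allele fraction into a fraction of mutated neoplastic cells, can be sketched as follows, assuming diploid cells and a heterozygous BRAFV600E mutation (the helper name and the example numbers are illustrative, not the thesis pipeline):

```python
def mutated_cell_fraction(maf: float, neoplastic_fraction: float) -> float:
    """Fraction of neoplastic cells carrying the mutation, from the raw
    mutant-allele fraction (MAF) measured on the whole specimen.
    Assumes diploid cells with a heterozygous mutation, so each mutated
    cell contributes one mutant allele out of its two alleles."""
    if not 0 < neoplastic_fraction <= 1:
        raise ValueError("neoplastic fraction must be in (0, 1]")
    return min(1.0, 2 * maf / neoplastic_fraction)

# A MAF of 20% in a specimen that is 60% neoplastic cells maps to
# 2 * 0.20 / 0.60, i.e. about 67% mutated neoplastic cells -- inside the
# heterogeneous 30-80% band described in the text.
```

Copy-number changes or amplification of the mutant allele would break the factor-of-two assumption, which is why such estimates are treated as semi-quantitative.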

Relevance:

30.00%

Abstract:

The present study focuses on the development of new group VIII metal catalysts supported on CeO2-ZrO2 (CZO) to be used in reforming reactions for syngas production. The catalysts are tested in the oxy-reforming process, extensively studied by Barbera [44] in a new multistep process configuration with intermediate H2 membrane separation, which can be carried out at lower temperature (750°C) with respect to conventional reforming processes (900 - 1000°C). In spite of the milder temperatures, the oxy-reforming conditions (S/C = 0.7; O2/C = 0.21) remain critical with regard to deactivation, mainly deriving from thermal sintering and carbon-formation phenomena. The combination of the high thermal stability of ZrO2 with the redox properties of CeO2 allows the formation of stable mixed-oxide systems with high oxygen mobility. This feature can be exploited to counteract carbon deposition on the active metal surface through oxidation of the carbon by the mobile oxygen atoms available at the surface of the CZO support. Ce0.5Zr0.5O2 is the phase claimed to have the highest oxygen mobility, but its formation is difficult through classical synthesis (co-precipitation); hence a water-in-oil microemulsion method is studied and characterized in depth. Two methods (IWI and bulk) for the insertion of the active metal (Rh, Ru, Ni) are followed and their effects, mainly related to metal stability and dispersion on the support, are discussed, correlating the characterization with the catalytic activity. Different parameters (calcination and reduction temperatures) are tuned to obtain the best catalytic system in terms of both activity and stability. Interesting results are obtained with both impregnated and bulk catalysts, the latter representing a new class of catalysts. The best catalysts are also tested in a low-temperature (350 - 500°C) steam reforming process, and preliminary tests with H2 membrane separation have also been carried out.

Relevance:

30.00%

Abstract:

This work investigates the feasibility of a new process aimed at the production of hydrogen with inherent separation of carbon oxides. The process consists of a cycle in which, in the first step, a mixed metal oxide is reduced by ethanol (obtained from biomass). The reduced metal is then contacted with steam in order to split the water, sequestering the oxygen into the looping material's structure. The oxides used to run this thermochemical cycle, also called the "steam-iron process", are mixed ferrites with the spinel structure MeFe2O4 (Me = Fe, Co, Ni or Cu). To understand the reactions involved in the anaerobic reforming of ethanol, diffuse reflectance infrared Fourier-transform spectroscopy (DRIFTS) was used, coupled with mass analysis of the effluent, to study the surface composition of the ferrites during the adsorption of ethanol and its transformations during the temperature program. This study was paired with tests on a laboratory-scale plant and with the characterization, through various techniques such as XRD, Mössbauer spectroscopy and elemental analysis, of the materials as synthesized and at different reduction degrees. In the first step it was found that, besides the expected CO, CO2 and H2O, the products of ethanol anaerobic oxidation, a large amount of H2 and coke was also produced. The latter is highly undesired, since it affects the second step, during which water is fed over the pre-reduced spinel at high temperature. The behavior of the different spinels was affected by the nature of the divalent metal cation; magnetite was the oxide showing the slowest rate of reduction by ethanol, but on the other hand it was the one that could perform the entire cycle of the process most efficiently. Still, the problem of coke formation remains the greatest challenge to solve.

Relevance:

30.00%

Abstract:

The H1N1, H1N2 and H3N2 subtypes of influenza A virus are widespread in the swine population worldwide. In the present work, a next-generation sequencing protocol on the Ion Torrent PGM platform was developed, suitable for the analysis of all swine influenza viruses (SIVs). To evaluate the molecular evolution of Italian SIVs, a total of sixty-two SIV strains belonging to the H1N1, H1N2 and H3N2 subtypes, isolated in Italy from 1998 to 2014, were sequenced and subjected to genomic and phylogenetic analysis. Two reassortment events were detected in six samples: all the H1N2 SIVs examined carried a neuraminidase of human origin, different from that of the H1N2 SIVs circulating in Europe; moreover, the haemagglutinin (HA) of two H1N2 isolates originated from reassortment with an avian-like H1N1 SIV. Molecular analysis of the HA revealed an insertion of two amino acids in four pandemic H1N1 SIVs and a deletion of two amino acids in four H1N2 SIVs, both at the receptor-binding site. A high homology of one H1N1 SIV with European strains isolated in the 1980s was also found, suggesting the possible vaccine origin of this virus. In addition, it was possible to apply the newly developed protocol to sequence a highly pathogenic avian influenza virus transmitted to humans, directly from a biological sample. The genetic diversity of the SIVs examined in this study confirms the importance of continuous monitoring of the genomic constellation of influenza viruses in the swine population.