941 results for Technological physics


Relevance:

20.00%

Publisher:

Abstract:

Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the characterization of neutrino events. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has been integrated on the board as well, and specific firmware has been written to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature will probably be integrated into a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the FPGA's configuration memory required the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
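The multi-gain acquisition idea can be sketched as follows (the gain factors and ADC range are illustrative assumptions, not the LIRA06 specifications):

```python
# Hypothetical sketch of multi-gain acquisition: the same PMT pulse is sampled
# on channels with different gain factors, and the highest-gain sample that is
# not saturated is kept, extending the linear dynamic range.

ADC_FULL_SCALE = 4095          # assumed 12-bit ADC
GAINS = [16.0, 4.0, 1.0]       # assumed gain factors, highest first

def reconstruct_amplitude(samples):
    """samples[i] is the ADC reading on the channel with gain GAINS[i]."""
    for gain, adc in zip(GAINS, samples):
        if adc < ADC_FULL_SCALE:           # channel is not saturated
            return adc / gain              # refer back to unity gain
    return ADC_FULL_SCALE / GAINS[-1]      # every channel saturated

print(reconstruct_amplitude([4095, 4095, 800]))  # large pulse: only the x1 channel usable
print(reconstruct_amplitude([1600, 400, 100]))   # small pulse: the x16 channel is best
```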
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to equipment on loan from the University of Rome and INFN, a full readout chain equivalent to that of NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which was able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic behaved well too, demonstrating, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. This chip is a matrix of 4096 active pixel sensors with deep N-well implantations meant for charge collection and for shielding the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line facility at CERN, Geneva (CH).
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed the storage of about 90 million events in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface to and from the MAPS chip in order to integrate it into the general DAQ system. Thereafter I worked on the DAQ software to implement a proper Slow Control interface for the APSEL-4D. Several APSEL-4D chips thinned to different thicknesses were tested during the test beam. Those of 100 and 300 um presented an overall efficiency of about 90% at a threshold of 450 electrons. The test beam also allowed an estimate of the pixel sensor resolution, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residual distribution, taking into account the multiple scattering effect.
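The pitch/sqrt(12) estimate and the quadrature subtraction of the track extrapolation error from the residual width can be sketched as follows (the pitch and widths below are illustrative numbers, not the thesis results):

```python
import math

pitch_um = 50.0                                  # illustrative pixel pitch
binary_resolution = pitch_um / math.sqrt(12)     # ~14.4 um for a 50 um pitch

def intrinsic_resolution(sigma_residual, sigma_track):
    """Subtract in quadrature the track extrapolation error (telescope
    resolution plus multiple scattering) from the measured residual width."""
    return math.sqrt(sigma_residual**2 - sigma_track**2)

print(round(binary_resolution, 1))               # binary readout limit, um
print(round(intrinsic_resolution(16.0, 7.0), 1)) # illustrative widths, um
```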

Relevance:

20.00%

Publisher:

Abstract:

Communication and coordination are two key aspects of open distributed agent systems, both being responsible for the integrity of the system's behaviour. An infrastructure capable of handling these issues, like TuCSoN, should be able to exploit the modern technologies and tools provided by fast-moving software engineering contexts. This thesis aims to demonstrate the ability of the TuCSoN infrastructure to cope with the new possibilities, in hardware and software, offered by mobile technology. The scenarios we are going to configure relate to the distributed nature of multi-agent systems, where an agent may be located and run directly on a mobile device. We deal with the new frontiers of mobile technology represented by smartphones running Google's Android operating system. The analysis and deployment of such a distributed agent-based system must first face qualitative and quantitative considerations about the available resources. The engineering issue at the base of our research is to run TuCSoN within the reduced memory and computing capability of a smartphone, without loss of functionality, efficiency or integrity for the infrastructure. The thesis work proceeds on two fronts simultaneously: the former is the rationalization of the available hardware and software resources; the latter, totally orthogonal, is the adaptation and optimization of the TuCSoN architecture for an ad-hoc client-side release.
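TuCSoN coordination builds on Linda-like tuple centres; purely as an illustration of the underlying coordination model (this is not TuCSoN's actual API), the basic primitives can be sketched as:

```python
class TupleSpace:
    """Minimal Linda-style tuple space: out() inserts a tuple, rd() reads one
    matching a template, in_() reads and removes it. None is a wildcard."""
    def __init__(self):
        self.tuples = []

    def out(self, t):
        self.tuples.append(t)

    def _match(self, template, t):
        return len(template) == len(t) and all(
            f is None or f == v for f, v in zip(template, t))

    def rd(self, template):
        return next((t for t in self.tuples if self._match(template, t)), None)

    def in_(self, template):
        t = self.rd(template)
        if t is not None:
            self.tuples.remove(t)
        return t

ts = TupleSpace()
ts.out(("temperature", "room1", 21))
print(ts.rd(("temperature", "room1", None)))   # ('temperature', 'room1', 21)
print(ts.in_(("temperature", None, None)))     # reads and removes the tuple
print(ts.rd(("temperature", None, None)))      # None
```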

Relevance:

20.00%

Publisher:

Abstract:

Lipids are important components that contribute very significantly to the nutritional and technological quality of foods; they are also the least stable macro-components in foods, owing to their high susceptibility to oxidation. When rancidity takes place, it makes food unhealthy and unacceptable to consumers. Thus, the presence of antioxidants, naturally present in or added to foods, is required to extend the shelf life of foods. Moreover, antioxidants such as phenolic compounds play an important role in human health, enhancing the functionality of foods. The aim of this PhD project was the study of lipid quality and lipid oxidation in different vegetable foods, focusing on analytical and technological aspects in order to determine the effects of lipid composition and of the addition of bioactive compounds (phenolic compounds, omega-3 fatty acids and dietary fibre) on their shelf life. In addition, the bioavailability and antioxidant effects of phenolic compounds were evaluated in humans and animals, respectively, after consumption of vegetable foods. The first section of the work focused on the impact of lipid quality on the technological behaviour of vegetable foods. To this end, cocoa butters with different melting points were evaluated by chromatographic techniques (GC, TLC): the sample with the highest melting point showed fatty acid, triglyceride and 2-monoglyceride compositions, as well as an FT-IR profile, different from genuine cocoa butter, indicating the addition of a foreign (lauric) fat not allowed by law. Regarding the lipid quality of other vegetable foods, an accelerated shelf-life test (OXITEST®) was used to evaluate lipid stability to oxidation in tarallini snacks made with different lipid matrices (sunflower oil, extra virgin olive oil, and a blend of extra virgin olive oil and lard). The results showed a good ability of OXITEST® to discriminate between degrees of lipid unsaturation and different cooking times, without any fat extraction from the samples.
In the second section, the role of bioactive compounds in the shelf life of cereal-based foods was studied in different baked goods by GC, spectrophotometric methods and capillary electrophoresis. The relationships between phenolic compounds, added with the flour, and the lipid oxidation of tarallini and frollini were examined. Both products showed an increase in lipid oxidation during storage, and the antioxidant effects on lipid oxidation were not as strong as expected. Furthermore, the influence of enrichment in polyunsaturated fatty acids on the lipid oxidation of pasta was evaluated. The results proved that LC n-3 PUFA were not significantly implicated in the onset of oxidation in spaghetti stored under daylight or under accelerated oxidation in a laboratory heater. The importance of phenolic compounds as antioxidants in humans and rats was also studied by HPLC/MS in the last section. For this purpose, the excretion of apigenin and apigenin glycosides was investigated in the urine of six women in a 24-hour study. After a single dose of steamed artichokes, both the aglycone and the glucuronide metabolites were recovered in 24-h urine. Moreover, the effects of whole grain durum wheat bread and whole grain Kamut® khorasan bread were evaluated in rats. Both cereals were good sources of antioxidants, but the Kamut® bread-fed animals had a better response to stress than the durum wheat-fed ones, especially when sourdough bread was supplied.
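Accelerated tests such as OXITEST® infer ambient-temperature stability from measurements at elevated temperature; an illustrative Arrhenius-type extrapolation (the activation energy and times below are assumptions, not data from this work):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def shelf_life_at(T_target_C, t_accel_h, T_accel_C, Ea_J_mol=90e3):
    """Extrapolate an oxidation induction time measured at an accelerated
    temperature to a target storage temperature, assuming Arrhenius kinetics.
    Ea is an assumed activation energy typical of lipid oxidation."""
    Ta = T_accel_C + 273.15
    Tt = T_target_C + 273.15
    return t_accel_h * math.exp(Ea_J_mol / R * (1 / Tt - 1 / Ta))

# e.g. a 10 h induction period measured at 90 degC, extrapolated to 25 degC
print(round(shelf_life_at(25, 10, 90)), "h")
```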

Relevance:

20.00%

Publisher:

Abstract:

Due to its high Curie temperature of 420 K and band-structure calculations predicting 100% spin polarisation, Sr2FeMoO6 is a potential candidate for spintronic devices. However, the preparation of good-quality thin films has proven to be a non-trivial task. Epitaxial Sr2FeMoO6 thin films were prepared by pulsed laser deposition on different substrates. Differing from previous reports, a post-deposition annealing step at low oxygen partial pressure (10^-5 mbar) was introduced, which enabled the fabrication of reproducible, high-quality samples. Depending on the structural properties of the substrates, the crystal structure and morphology of the thin films are modified. The close interrelation between the structural, magnetic and electronic properties of Sr2FeMoO6 was studied. A detailed evaluation of the results allowed valuable information to be extracted on the microscopic nature of magnetism and charge transport. Smooth films with a mean roughness of about 2 nm have been achieved, which is a prerequisite for a possible inclusion of this material in future devices. In order to establish device-oriented sub-micron patterning as a standard technique, electron beam lithography and focussed ion beam etching facilities have been put into operation, and a detailed characterisation of these systems has been performed. To determine the technological prospects of new spintronics materials, the verification of a high spin polarisation is of vital interest. A popular technique for this task is point contact Andreev reflection (PCAR). Commonly, the charge transport in a transparent metal-superconductor contact of nanometre dimensions is attributed solely to coherent transport; if this condition is not fulfilled, inelastic processes in the constriction have to be considered. PCAR has been applied to Sr2FeMoO6 and the Heusler compound Co2Cr0.6Fe0.4Al, and systematic deviations between the measured spectra and the standard models of PCAR have been observed.
Existing approaches have therefore been generalised to include the influence of heating. With the extended model the measured data were successfully reproduced, but the analysis revealed grave implications for the determination of the spin polarisation, which was found to break down completely in certain cases.
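In the simplest ballistic, fully transparent limit of the spin-resolved BTK description, the normalized zero-bias conductance interpolates between doubled (unpolarized) and suppressed (fully polarized) transport; a minimal sketch of how P is read off, ignoring barrier strength and the heating effects discussed above:

```python
def zero_bias_conductance(P):
    """Normalized zero-bias conductance G(0)/G_N of a transparent (Z = 0)
    point contact at T = 0: the unpolarized fraction Andreev-reflects
    (contributing a factor 2), the fully polarized fraction is blocked."""
    return 2.0 * (1.0 - P)

def polarization_from_g0(g0):
    """Invert the relation above to extract the transport spin polarization."""
    return 1.0 - g0 / 2.0

print(polarization_from_g0(2.0))   # unpolarized metal: P = 0.0
print(polarization_from_g0(1.2))   # partially polarized: P = 0.4
```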

Relevance:

20.00%

Publisher:

Abstract:

A polar stratospheric cloud submodel has been developed and incorporated in a general circulation model including atmospheric chemistry (ECHAM5/MESSy). The formation and sedimentation of polar stratospheric cloud (PSC) particles can thus be simulated, as well as the heterogeneous chemical reactions that take place on the PSC particles. For solid PSC particle sedimentation, the need for a tailor-made algorithm has been elucidated. A sedimentation scheme based on first-order approximations of vertical mixing-ratio profiles has been developed. It produces relatively little numerical diffusion and deals well with divergent or convergent sedimentation velocity fields. For the determination of solid PSC particle sizes, an efficient algorithm has been adapted. It assumes a monodisperse radius distribution and thermodynamic equilibrium between the gas phase and the solid particle phase. This scheme, though relatively simple, is shown to produce particle number densities and radii within the observed range. The combined effects of the representations of sedimentation and of solid PSC particles on the vertical redistribution of H2O and HNO3 are investigated in a series of tests. The formation of solid PSC particles, especially of those consisting of nitric acid trihydrate, has been discussed extensively in recent years. Three particle formation schemes in accordance with the most widely used approaches have been identified and implemented. For the evaluation of PSC occurrence, a new data set with unprecedented spatial and temporal coverage was available. A quantitative method for the comparison of simulation results and observations has been developed and applied. It reveals that the relative PSC sighting frequency can be reproduced well with the PSC submodel, whereas the detailed modelling of individual PSC events is beyond the scope of coarse global-scale models. In addition to the development and evaluation of new PSC submodel components, parts of existing simulation programs have been improved, e.g.
a method for the assimilation of meteorological analysis data in the general circulation model, the liquid PSC particle composition scheme, and the calculation of heterogeneous reaction rate coefficients. The interplay of these model components is demonstrated in a simulation of stratospheric chemistry with the coupled general circulation model. Tests against recent satellite data show that the model successfully reproduces the Antarctic ozone hole.
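A first-order upwind treatment of particle sedimentation, of the kind described above, can be sketched on a toy one-dimensional column (illustrative only, not the ECHAM5/MESSy implementation):

```python
def sediment(mixing_ratio, w, dz, dt):
    """One explicit first-order upwind step for downward sedimentation.
    mixing_ratio[k] and fall speed w[k] (>= 0, downward) are given on a
    column with layer thickness dz[k]; k = 0 is the top layer. The flux
    leaving layer k enters layer k+1; divergent or convergent w fields are
    handled naturally because each flux uses the local fall speed."""
    n = len(mixing_ratio)
    # CFL-type check: a layer must not lose more than it holds in one step
    assert all(w[k] * dt <= dz[k] for k in range(n)), "time step too large"
    flux = [mixing_ratio[k] * w[k] for k in range(n)]  # downward flux out of k
    new = list(mixing_ratio)
    for k in range(n):
        new[k] -= flux[k] * dt / dz[k]                 # loss to the layer below
        if k > 0:
            new[k] += flux[k - 1] * dt / dz[k]         # gain from the layer above
    return new  # mass leaving the bottom layer is deposited at the surface

q = sediment([1.0, 0.0, 0.0], w=[0.5, 0.5, 0.5], dz=[1.0, 1.0, 1.0], dt=1.0)
print(q)   # half of the top-layer content has moved one layer down
```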

Relevance:

20.00%

Publisher:

Abstract:

The present study aims at assessing the innovation strategies adopted within a regional economic system, the Italian region of Emilia-Romagna, as it faced the challenges of a changing international scenario. As the strengthening of regional innovative capabilities is regarded as a keystone for fostering a new phase of economic growth, it is also important to understand how the local industrial, institutional and academic actors have tackled the problem of innovation in the recent past. In this study we explore the approaches to innovation and the strategies adopted by the main regional actors through three different case studies. Chapter 1 provides a general survey of the innovative performance of the regional industries over the past two decades, as it emerges from statistical data and systematic comparisons at the national and European levels. The chapter also discusses the innovation policies that the regional government has set up since 2001 in order to strengthen the collaboration among local economic actors, including universities and research centres. As mechanics is the most important regional industry, chapter 2 analyses the combination of knowledge and practices utilized from the 1960s to the 1990s in the design of a particular kind of machinery produced by G.D S.p.A., a world leader in the market for tobacco packaging machines. G.D is based in Bologna, the region's capital, and is at the centre of the most important Italian packaging district. In chapter 3 the attention turns to the institutional level, focusing on how the local public administrations and the local, publicly owned utility companies have dealt with the creation of new telematic networks on the regional territory during the 1990s and 2000s. Finally, chapter 4 assesses the technology transfer carried out by the main university of the region, the University of Bologna, by focusing on the patenting activities involving its research personnel in the period 1960-2010.

Relevance:

20.00%

Publisher:

Abstract:

In this work we develop and analyze an adaptive numerical scheme for simulating a class of macroscopic semiconductor models. First, the numerical modelling of semiconductors is reviewed in order to classify the energy-transport models for semiconductors that are later simulated in 2D. In this class of models, the flow of charged particles, namely negatively charged electrons and so-called holes (quasi-particles of positive charge), as well as their energy distributions, is described by a coupled system of nonlinear partial differential equations. A considerable difficulty in simulating these convection-dominated equations is posed by the nonlinear coupling, as well as by the fact that local phenomena such as "hot electron effects" are only partially assessable from the given data. The primary variables used in the simulations are the particle density and the particle energy density. The user of these simulations is mostly interested in the current flow through parts of the domain boundary, the contacts. The numerical method considered here utilizes mixed finite elements as trial functions for the discrete solution. The continuous discretization of the normal fluxes is the most important property of this discretization from the user's perspective. It is proven that, under certain assumptions on the triangulation, the particle density remains positive in the iterative solution algorithm. Connected to this result, an a priori error estimate for the discrete solution of linear convection-diffusion equations is derived. The local charge transport phenomena are resolved by an adaptive algorithm based on a posteriori error estimators; at this stage, a comparison of different estimators is performed. Additionally, a method to effectively estimate the error in local quantities derived from the solution, so-called "functional outputs", is developed by transferring the dual weighted residual method to mixed finite elements.
For a model problem we show how this method can deliver promising results even when standard error estimators fail completely to reduce the error in an iterative mesh-refinement process.
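The adaptive algorithm follows the usual solve-estimate-mark-refine cycle; a schematic one-dimensional sketch with a stand-in solution and error indicator (none of this is specific to the mixed-element estimators of the thesis):

```python
import math

def solve(cells):
    # stand-in "solution": a boundary-layer-like profile at cell midpoints
    return [math.tanh(50 * ((a + b) / 2 - 0.5)) for a, b in cells]

def estimate(cells, u):
    # stand-in indicator: cell size times a crude local derivative estimate
    return [(b - a) * abs(50 * (1 - ui * ui)) for (a, b), ui in zip(cells, u)]

def adapt(cells, fraction=0.3, n_iter=5):
    """Solve -> estimate -> mark the worst `fraction` of cells -> bisect."""
    for _ in range(n_iter):
        eta = estimate(cells, solve(cells))
        n_mark = max(1, int(fraction * len(cells)))
        cutoff = sorted(eta, reverse=True)[n_mark - 1]
        new = []
        for (a, b), e in zip(cells, eta):
            m = (a + b) / 2
            new += [(a, m), (m, b)] if e >= cutoff else [(a, b)]
        cells = new
    return cells

mesh = [(i / 8, (i + 1) / 8) for i in range(8)]
refined = adapt(mesh)
# the mesh ends up concentrated around the steep layer at x = 0.5
print(len(refined), min(b - a for a, b in refined))
```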

Relevance:

20.00%

Publisher:

Abstract:

The subject of this thesis is the area of Applied Mathematics known as Inverse Problems. Inverse problems are those where a set of measured data is analysed in order to get as much information as possible on a model which is assumed to represent a system in the real world. We study two inverse problems, in the fields of classical and quantum physics: QCD condensates from tau-decay data, and the inverse conductivity problem. Despite a concentrated effort by physicists extending over many years, an understanding of QCD from first principles continues to be elusive. Fortunately, data continue to appear which provide a rather direct probe of the inner workings of the strong interactions. We use a functional method which allows us to extract, within rather general assumptions, phenomenological parameters of QCD (the condensates) from a comparison of the time-like experimental data with asymptotic space-like results from theory. The price to be paid for the generality of the assumptions is relatively large errors in the values of the extracted parameters. Although we do not claim that our method is superior to other approaches, we hope that our results lend additional confidence to the numerical results obtained with methods based on QCD sum rules. Electrical impedance tomography (EIT) is a technology developed to image the electrical conductivity distribution of a conductive medium. The technique works by performing simultaneous measurements of direct or alternating electric currents and voltages on the boundary of an object. These are the data used by an image reconstruction algorithm to determine the electrical conductivity distribution within the object. In this thesis, two approaches to EIT image reconstruction are proposed. The first is based on reformulating the inverse problem in terms of integral equations; this method uses only a single set of measurements for the reconstruction. The second approach is an algorithm based on linearisation, which uses more than one set of measurements.
A promising result is that one can qualitatively reconstruct the conductivity inside the cross-section of a human chest. Even though the human volunteer is neither two-dimensional nor circular, such reconstructions can be useful in medical applications: monitoring for lung problems such as accumulating fluid or a collapsed lung and noninvasive monitoring of heart function and blood flow.
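The linearisation-based approach leads to an ill-conditioned linear system that must be regularised; an illustrative Tikhonov-regularised least-squares update with a toy sensitivity matrix (generic, not the algorithm of the thesis):

```python
def solve_linear(A, b):
    """Plain Gaussian elimination with partial pivoting (small systems)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def tikhonov_step(J, dV, lam):
    """One linearised update: minimise ||J d - dV||^2 + lam ||d||^2 by
    solving the normal equations (J^T J + lam I) d = J^T dV, where J maps
    conductivity perturbations to boundary-voltage perturbations."""
    m, n = len(J), len(J[0])
    JtJ = [[sum(J[k][i] * J[k][j] for k in range(m)) + (lam if i == j else 0.0)
            for j in range(n)] for i in range(n)]
    Jtv = [sum(J[k][i] * dV[k] for k in range(m)) for i in range(n)]
    return solve_linear(JtJ, Jtv)

# toy sensitivity matrix and data: two conductivity "pixels", three measurements
J = [[1.0, 0.2], [0.3, 1.0], [0.5, 0.5]]
dV = [1.2, 0.5, 0.85]
print([round(d, 2) for d in tikhonov_step(J, dV, lam=1e-6)])
```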

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, we extend some ideas of statistical physics to describe the properties of human mobility. Using a database containing GPS measures of individual paths (position, velocity and covered space at a spatial scale of 2 km or a time scale of 30 s), which covers 2% of the private vehicles in Italy, we succeed in determining some empirical statistical laws pointing out "universal" characteristics of human mobility. Developing simple stochastic models suggesting possible explanations of the empirical observations, we are able to indicate which key quantities and cognitive features rule individuals' mobility. To understand the features of individual dynamics, we have studied different aspects of urban mobility from a physical point of view. We discuss the implications of Benford's law emerging from the distribution of times elapsed between successive trips. We observe how the daily travel-time budget is related to many aspects of the urban environment, and describe how the daily mobility budget is then spent. We link the scaling properties of individual mobility networks to the inhomogeneous average durations of the activities that are performed, and those of the networks describing people's common use of space to the fractional dimension of the urban territory. We study entropy measures of individual mobility patterns, showing that they carry almost the same information as the related mobility networks, but are also influenced by a hierarchy among the activities performed. We discover that Wardrop's principles are violated, as drivers have only incomplete information on the traffic state and therefore rely on knowledge of average travel times. We propose an assimilation model to resolve the intrinsic scattering of GPS data on the street network, permitting the real-time reconstruction of the traffic state at an urban scale.
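The Benford's-law analysis amounts to comparing the first-digit frequencies of the inter-trip times with log10(1 + 1/d); a sketch on synthetic, log-uniformly distributed times (illustrative data, not the GPS database):

```python
import math, random

def first_digit(v):
    """Leading decimal digit of a positive number."""
    while v < 1:
        v *= 10
    while v >= 10:
        v /= 10
    return int(v)

def first_digit_freqs(values):
    counts = [0] * 10
    for v in values:
        counts[first_digit(v)] += 1
    return [c / len(values) for c in counts]

random.seed(0)
# log-uniformly spread "inter-trip times": broad enough to exhibit Benford's law
times = [10 ** random.uniform(0, 3) for _ in range(10000)]
freqs = first_digit_freqs(times)
for d in (1, 2, 3):
    print(d, round(freqs[d], 3), "expected", round(math.log10(1 + 1 / d), 3))
```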

Relevance:

20.00%

Publisher:

Abstract:

This work concerned the synthesis and characterization of innovative crystals for biomedical and technological applications. Different types of syntheses were developed in order to obtain crystals with high photocatalytic activity. A hydrothermal synthesis was developed to correlate the chemical-physical characteristics with the synthesis parameters, yielding titanium dioxide nanoparticles whose morphology, size and crystalline phase depend on the variation of those parameters. A synthesis in water at 80 °C and low pressure was also developed, from which anatase nanoparticles containing a small percentage of brookite were obtained, presenting high photocatalytic activity. These nanoparticles were used to obtain microcrystals formed by an inorganic hydroxyapatite core covered on its surface by TiO2 nanoparticles, producing a micrometric material with higher photocatalytic activity. The same nanoparticles were functionalized with oxidized resorcinol in order to increase the photocatalytic efficiency, and photodegradation tests confirmed this increase. Finally, nanoparticles were synthesized by a water-free route using formic acid and octanol, through in-situ esterification, obtaining nanoparticles superficially covered by carboxylic residues able to bind a wide range of molecules and thus provide further photocatalytic properties.

Relevance:

20.00%

Publisher:

Abstract:

The ability of block copolymers to spontaneously self-assemble into a variety of ordered nano-structures not only makes them a scientifically interesting system for the investigation of order-disorder phase transitions, but also offers a wide range of nano-technological applications. The architecture of a diblock is the simplest among the block copolymer systems; hence it is often used as a model system in both experiment and theory. We introduce a new soft-tetramer model for efficient computer simulations of diblock copolymer melts. The instantaneous non-spherical shape of polymer chains in the molten state is incorporated by modeling each of the two blocks as two soft spheres. The interactions between the spheres are modeled in such a way that the diblock melt tends to microphase-separate with decreasing temperature. Using Monte Carlo simulations, we determine the equilibrium structures at variable values of the two relevant control parameters, the diblock composition and the incompatibility of unlike components. The simplicity of the model allows us to scan the control parameter space with a completeness that has not been reached in previous molecular simulations. The resulting phase diagram shows clear similarities with the phase diagram found in experiments. Moreover, we show that structural details of block copolymer chains can be reproduced by our simple model. We develop a novel method for the identification of the observed diblock copolymer mesophases that formalizes the usual approach of direct visual observation, using the characteristic geometry of the structures.
A cluster analysis algorithm is used to determine the clusters of each component of the diblock, and the number and shape of the clusters can be used to determine the mesophase. We also employ methods from integral geometry for the identification of mesophases and compare their usefulness to the cluster analysis approach. To probe the properties of our model in confinement, we perform molecular dynamics simulations of atomistic polyethylene melts confined between graphite surfaces. The results from these simulations are used as input for an iterative coarse-graining procedure that yields a surface interaction potential for the soft-tetramer model. Using the interaction potential derived in that way, we perform an initial study of the behavior of the soft-tetramer model in confinement. Comparing with experimental studies, we find that our model can reflect basic features of confined diblock copolymer melts.
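The cluster-analysis step can be illustrated by connected-component labelling of one component's occupancy on a lattice (a generic sketch, not the actual algorithm of the thesis):

```python
def count_clusters(grid):
    """Number of 4-connected clusters of occupied sites (value 1) in a 2-D
    occupancy grid; in a mesophase analysis one might, for instance, see one
    cluster per cylinder crossing the simulation box cross-section."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    n_clusters = 0
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] == 1 and (i, j) not in seen:
                n_clusters += 1
                stack = [(i, j)]               # flood fill from this seed
                while stack:
                    a, b = stack.pop()
                    if (a, b) in seen:
                        continue
                    seen.add((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if (0 <= na < rows and 0 <= nb < cols
                                and grid[na][nb] == 1 and (na, nb) not in seen):
                            stack.append((na, nb))
    return n_clusters

# two separate "cylinders" seen in cross-section
grid = [[1, 1, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 0, 1]]
print(count_clusters(grid))   # 2
```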

Relevance:

20.00%

Publisher:

Abstract:

This work addresses clinical problems related to severe skin trauma, in which surgical intervention is required to restore a normal condition: it examines the physiology of the tissue, the classification of skin burn degrees, wound healing, and the mechanics of the skin. Autologous tissue grafting is the most effective solution and the one with the fewest complications. However, the patient may not have a sufficiently large area of available skin, in which case other methods are used. First, allografts of tissue from cadaveric donors are performed; the tissue is harvested according to current regulations and preserved with various techniques, whose development has allowed longer storage times: while glycerolization completely eliminates the risk of disease transmission and microorganism growth, cryopreservation preserves the viability of the tissue. The surgery used for these operations employs innovative technologies such as Negative Pressure Wound Therapy. A necessary alternative for meeting the huge demand for donor tissue is represented by skin substitutes, which hold great potential for the future. To eliminate the risk of rejection entirely, it would be necessary to personalize the construct using autologous cells, but research has been slowed by lower investment from the biomedical industry, which has focused mainly on products usable by a wider range of patients. For these reasons, skin tissue engineering has found a wider field of application in in-vitro testing systems.
For this purpose, certified protocols such as EpiDerm, EpiSkin and SkinEthic have been created to test corrosivity, irritancy and the viability of skin tissue; they rely on the MTT assay and on spectrophotometry, which has become a fundamental tool for the biological sciences.

Relevance:

20.00%

Publisher:

Abstract:

This thesis reports on the creation and analysis of many-body states of interacting fermionic atoms in optical lattices. The realized system can be described by the Fermi-Hubbard Hamiltonian, which is an important model for correlated electrons in modern condensed matter physics. In this way, ultra-cold atoms can be utilized as a quantum simulator to study solid state phenomena. The use of a Feshbach resonance in combination with a blue-detuned optical lattice and a red-detuned dipole trap enables independent control over all relevant parameters of the many-body Hamiltonian. By measuring the in-situ density distribution and doublon fraction, it has been possible to identify both metallic and insulating phases in the repulsive Hubbard model, including the experimental observation of the fermionic Mott insulator. In the attractive case, the appearance of strong correlations has been detected via an anomalous expansion of the cloud that is caused by the formation of non-condensed pairs. By monitoring the in-situ density distribution of initially localized atoms during free expansion in a homogeneous optical lattice, a strong influence of interactions on the out-of-equilibrium dynamics within the Hubbard model has been found. The reported experiments pave the way for future studies of magnetic order and fermionic superfluidity in a clean and well-controlled experimental system.
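As a minimal illustration of the Hubbard physics being simulated, the two-site Hubbard model at half filling can be solved exactly (a textbook warm-up, unrelated to the actual experimental parameters):

```python
import math

def two_site_ground_energy(t, U):
    """Ground-state energy of the two-site Hubbard model with two electrons
    (singlet sector). The symmetric doubly occupied combination couples to
    the covalent singlet with a matrix element of magnitude 2t, giving an
    effective 2x2 problem [[U, 2t], [2t, 0]] whose lower eigenvalue is
    (U - sqrt(U^2 + 16 t^2)) / 2."""
    return (U - math.sqrt(U * U + 16 * t * t)) / 2

print(two_site_ground_energy(1.0, 0.0))          # non-interacting limit: -2t
print(round(two_site_ground_energy(1.0, 8.0), 3))  # strong repulsion suppresses it
```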

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, the phenomenology of the Randall-Sundrum setup is investigated. In this context, models with and without an enlarged SU(2)_L x SU(2)_R x U(1)_X x P_{LR} gauge symmetry, which removes corrections to the T parameter and to the Z b_L \bar b_L coupling, are compared with each other. The Kaluza-Klein decomposition is formulated within the mass basis, which allows for a clear understanding of various model-specific features. A complete discussion of tree-level flavor-changing effects is presented. Exact expressions for the five-dimensional propagators are derived, including Yukawa interactions that mediate flavor-off-diagonal transitions. The symmetry that reduces the corrections to the left-handed Z b \bar b coupling is analyzed in detail. In the literature, Randall-Sundrum models have been used to address the measured anomaly in the t \bar t forward-backward asymmetry. However, it is shown that this is not possible within a natural approach to flavor. The rare decays t \to cZ and t \to ch are investigated; in particular, the latter could be observed at the LHC. A calculation of \Gamma_{12}^{B_s} in the presence of new physics is presented. It is shown that the Randall-Sundrum setup allows for an improved agreement with measurements of A_{SL}^s, S_{\psi\phi}, and \Delta\Gamma_s. For the first time, a complete one-loop calculation of all relevant Higgs-boson production and decay channels in the custodial Randall-Sundrum setup is performed, revealing sensitivity to large new-physics scales at the LHC.
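The Kaluza-Klein graviton masses in the Randall-Sundrum setup are commonly quoted as m_n = x_n k e^{-k r_c pi}, with x_n the zeros of the Bessel function J1; a self-contained sketch (the TeV value of the warped-down scale is an illustrative assumption):

```python
import math

def J1(x, terms=40):
    """Bessel function of the first kind, order 1, via its power series."""
    return sum((-1) ** m / (math.factorial(m) * math.factorial(m + 1))
               * (x / 2) ** (2 * m + 1) for m in range(terms))

def j1_zero(lo, hi, tol=1e-10):
    """Bisection for a zero of J1 bracketed in [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if J1(lo) * J1(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# first three positive zeros of J1 (brackets from their known rough locations)
xs = [j1_zero(3, 5), j1_zero(6, 8), j1_zero(9, 11)]
print([round(x, 4) for x in xs])        # [3.8317, 7.0156, 10.1735]

# illustrative warped scale: k e^{-k r_c pi} = 1 TeV gives KK masses in TeV
Lambda_TeV = 1.0
print([round(x * Lambda_TeV, 2) for x in xs])  # masses of the first KK gravitons
```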