Abstract:
The inherent stochastic character of most of the physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed; however, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability distribution. Its computational efficiency and robustness are also very good, even when dealing with strongly non-Gaussian distributions. Moreover, the resulting samples possess all the relevant, well-defined and desired properties of "translation fields", including crossing rates and distributions of extremes. The second part of the thesis lies in the field of non-destructive parametric structural identification; its objective is to evaluate the mechanical characteristics of the constituent bars of existing truss structures, using static loads and strain measurements. In the cases of missing data and of damage affecting only a small portion of a bar, Genetic Algorithms have proved to be an effective tool for solving the problem.
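A minimal sketch of the translation-field idea described above (not the thesis's specific iterative algorithm): draw a Gaussian sample with a prescribed spectrum via the spectral-representation method, then push it through the Gaussian CDF and the inverse of the target marginal CDF. The toy spectrum and lognormal marginal are illustrative assumptions; the nonlinear mapping distorts the Gaussian spectrum, which is precisely why methods of this family iterate to match spectrum and marginal simultaneously.

```python
# Minimal 1-D sketch of the "translation field" idea (not the thesis's
# specific iterative algorithm): sample a Gaussian process with a given
# power spectrum, then map it through Phi and the inverse target CDF.
import numpy as np
from scipy import stats

def gaussian_samples(n, dx, psd, rng):
    """Spectral-representation sampling of a zero-mean Gaussian process."""
    freqs = np.fft.rfftfreq(n, d=dx)
    amp = np.sqrt(psd(freqs) * (freqs[1] - freqs[0]))
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    g = np.fft.irfft(amp * np.exp(1j * phases), n=n)
    return (g - g.mean()) / g.std()      # normalization absorbed here

def translate(g, target_dist):
    """Map standard-Gaussian samples onto the target marginal distribution."""
    u = stats.norm.cdf(g)                # uniform marginals
    return target_dist.ppf(u)            # inverse target CDF

rng = np.random.default_rng(0)
psd = lambda f: np.exp(-(2.0 * np.pi * f) ** 2)     # toy target spectrum
g = gaussian_samples(4096, 0.1, psd, rng)
x = translate(g, stats.lognorm(s=0.8))              # strongly non-Gaussian marginal
```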
Abstract:
The meccano method is a novel and promising mesh generation method for simultaneously creating adaptive tetrahedral meshes and volume parametrizations of a complex solid. Notably, the method requires minimal user intervention and has a low computational cost. The method builds a 3-D triangulation of the solid as a deformation of an appropriate tetrahedral mesh of the meccano. The new mesh generator combines an automatic parametrization of surface triangulations, a local refinement algorithm for 3-D nested triangulations and a simultaneous untangling and smoothing procedure. At present, the procedure is fully automatic for a genus-zero solid; in this case, the meccano can be a single cube. The efficiency of the proposed technique is shown with several applications...
Abstract:
The meccano method is a novel and promising mesh generation technique for simultaneously creating adaptive tetrahedral meshes and volume parameterizations of a complex solid. The method combines several earlier procedures: a mapping from the meccano boundary to the solid surface, a 3-D local refinement algorithm and a simultaneous mesh untangling and smoothing. In this paper we present the main advantages of our method over other standard mesh generation techniques. We show that our method constructs meshes that can be locally refined by using the Kossaczky bisection rule while maintaining a high mesh quality. Finally, we generate a volume T-mesh for isogeometric analysis, based on the volume parameterization obtained by the method…
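As a rough illustration of bisection-based local refinement: the paper applies the Kossaczky rule to nested 3-D triangulations, whereas the 2-D longest-edge analogue sketched below only conveys the principle (refining every element, so no hanging nodes arise).

```python
# Toy 2-D analogue of bisection-based local refinement (the paper uses the
# Kossaczky rule on nested 3-D triangulations; this sketch only illustrates
# the idea on triangles via longest-edge bisection).
import numpy as np

def bisect(tri, verts):
    """Split a triangle across the midpoint of its longest edge."""
    i, j, k = tri
    edges = [(i, j, k), (j, k, i), (k, i, j)]   # (edge start, edge end, apex)
    a, b, c = max(edges, key=lambda e: np.linalg.norm(verts[e[0]] - verts[e[1]]))
    verts.append((verts[a] + verts[b]) / 2.0)   # new midpoint vertex
    m = len(verts) - 1
    return [(a, m, c), (m, b, c)]               # two children, same orientation

verts = [np.array(p, float) for p in [(0, 0), (1, 0), (0, 1)]]
tris = [(0, 1, 2)]
for _ in range(3):                              # three uniform refinement sweeps
    tris = [child for t in tris for child in bisect(t, verts)]
```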
Abstract:
The application of Isogeometric Analysis (IA) with T-splines [1] demands a partition of the parametric space, C, into a tiling containing T-junctions, called a T-mesh. The T-splines are used both for the geometric modeling of the physical domain, D, and as the basis of the numerical approximation. They have the advantage over NURBS of allowing local refinement. In this work we propose a procedure to construct T-spline representations of complex domains to be applied to the solution of elliptic PDEs with IA. In previous works [2, 3] we accomplished this task by using a tetrahedral parametrization…
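Since T-spline blending functions are built from univariate B-spline basis functions, a minimal Cox-de Boor evaluation conveys the basic building block; this is standard spline machinery, not the paper's T-mesh construction.

```python
# Hedged sketch: T-splines generalize tensor-product B-splines by allowing
# T-junctions; each blending function is still built from one-dimensional
# B-spline basis functions, evaluated here via the Cox-de Boor recursion.
def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        # Half-open support; the right endpoint u = knots[-1] needs special care.
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (u - knots[i]) / d1 * bspline_basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - u) / d2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

knots = [0, 0, 0, 0.5, 1, 1, 1]                   # open knot vector, degree 2
vals = [bspline_basis(i, 2, 0.3, knots) for i in range(4)]
assert abs(sum(vals) - 1.0) < 1e-12               # partition of unity
```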
Abstract:
We present advances of the meccano method [1,2] for tetrahedral mesh generation and volumetric parameterization of solids. The method combines several earlier procedures: a mapping from the meccano boundary to the solid surface, a 3-D local refinement algorithm and a simultaneous mesh untangling and smoothing. The key to the method lies in defining a one-to-one volumetric transformation between the parametric and physical domains. Results with adaptive finite elements will be shown for several engineering problems. In addition, the application of the method to T-spline modelling and isogeometric analysis [3,4] of complex geometries will be introduced…
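The untangling/smoothing step can be pictured with a toy stand-in: plain Laplacian smoothing of interior nodes. The actual method optimizes an algebraic quality measure so that untangling and smoothing happen simultaneously; the sketch below, with a made-up five-node mesh, only conveys the "relocate nodes to improve the mesh" idea.

```python
# Toy illustration of the mesh-smoothing step: plain Laplacian smoothing of
# interior nodes of a 2-D mesh. The meccano method actually optimizes an
# algebraic quality measure that untangles and smooths simultaneously.
import numpy as np

def laplacian_smooth(verts, neighbors, fixed, iters=20, relax=0.5):
    """Relax each free vertex toward the centroid of its neighbors."""
    v = verts.copy()
    for _ in range(iters):
        for i, nbrs in neighbors.items():
            if i in fixed:
                continue
            centroid = v[list(nbrs)].mean(axis=0)
            v[i] += relax * (centroid - v[i])
    return v

# Five-node toy mesh: four fixed corners and one badly placed interior node.
verts = np.array([(0, 0), (1, 0), (1, 1), (0, 1), (0.9, 0.95)], float)
neighbors = {4: [0, 1, 2, 3]}
smoothed = laplacian_smooth(verts, neighbors, fixed={0, 1, 2, 3})
# Node 4 converges toward (0.5, 0.5), the centroid of the corners.
```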
Abstract:
The objective of the first four chapters is to reach a complete understanding of the supramolecular organization of several complementary modules able to form 2-D networks, first in solution, using optical spectroscopy measurements as a function of solvent polarity, concentration and temperature, and then on solid surfaces, using microscopy techniques such as STM, AFM and TEM. The last chapter presents another type of supramolecular material for application in solar-cell technology, involving fullerenes and OPV systems. We describe the photoinduced energy- and electron-transfer processes using transient absorption experiments. All these systems provide an exceptional example of the potential of the supramolecular approach as an alternative to the restrictive lithographic methods for the fabrication of addressable molecular devices.
Abstract:
Oligomers with conjugated pi-electron systems are of great importance to materials science. The diverse and extensive research in this field is grounded in the potential of these substance classes, which lies in the areas of laser dyes, light-emitting diodes, photoconductors, optical switches and molecular electronics. The phenyleneethynylenes synthesized and investigated in this work belong to this group. The oligomers are prepared by the method of Sonogashira and Hagihara, in which a haloarene is reacted with an alkyne component. A mixture of bis(triphenylphosphine)palladium dichloride, copper(I) iodide and triphenylphosphine serves as the catalyst. Two types of protecting groups were used in the synthesis. The first comprises the trimethylsilyl and triisopropylsilyl functions, which can be introduced into a system independently of one another and removed again selectively. The second type comprises the halogens bromine and iodine, which, owing to their behavior, are better described as 'dormant groups': ethynylation first substitutes the iodine atom and subsequently the bromine atom. The oligomers obtained in this way are investigated by various spectroscopic methods. Particular interest lies in the determination of the effective conjugation length (ECL), which makes it possible to determine the length of the conjugated system that governs the relevant properties of the corresponding polymer. The nonlinear optical behavior of the oligomers is measured by the third-harmonic-generation (THG) method. The resulting quantity, the third-order susceptibility, provides information on possible industrial applications.
Abstract:
Using a shear test, the shear bond strength (SBS) to enamel/dentin of 4 commercially available adhesives of the 4th, 5th and 6th generation (Optibond FL, Excite, Gluma Comfort Bond, Prompt L-Pop) and 7 experimental one-bottle (Capo E1, Capo E2, Capo Water, ENA) or self-conditioning adhesives (AC, AC+Desensitizer, Resulcin Aqua Prime N) was investigated in vitro. All adhesives were applied to flat enamel/dentin surfaces according to the manufacturers' instructions. Tetric Ceram specimens measuring 3.5 x 2 mm were placed on all adhesive surfaces. Before thermocycling, the samples were stored in saline solution for 24 h; the SBS values were then measured. The multi-bottle system Optibond (33.2/34.4) showed significantly (p<0.05) higher values on enamel and dentin than all experimental one-bottle and all self-conditioning adhesives. The commercially available 5th-generation bondings tend toward higher values than the self-conditioning adhesives, but lie below those of the 4th generation. AC (28.1/27.0)…
Towards model-driven software development for Arduino platforms: a DSL and automatic code generation
Abstract:
The thesis aims to explore the production of software systems for embedded systems using techniques from the world of Model Driven Software Development. The most important phase of the development is the definition of a meta-model that captures the fundamental concepts of embedded systems. This model tries to abstract away from the particular platform in use and to identify which abstractions characterize the embedded-systems world in general; the meta-model is therefore platform-independent. For automatic code generation a reference platform was adopted, namely Arduino. Arduino is an embedded system that is gaining ever more traction because it combines a good level of performance with a relatively low price. The platform allows the development of special-purpose systems that use sensors and actuators of various kinds, easily connected to the pins it exposes. The meta-model defined is an instance of the MOF meta-metamodel, formally defined by the OMG. This lets the developer think of a system in the form of a model, an instance of the defined meta-model. A meta-model can also be regarded as the abstract syntax of a language, so it can be defined by a set of EBNF rules. The technology used to define the meta-model was Xtext: a framework that allows EBNF rules to be written and automatically generates the Ecore model associated with the defined meta-model. Ecore is the implementation of EMOF in the Eclipse environment. Xtext also generates plugins that provide an editor driven by the syntax defined in the meta-model. Automatic code generation was implemented with the Xtend2 language, which makes it possible to traverse the Abstract Syntax Tree produced by translating the model into Ecore and to generate all the necessary code files. The generated code provides practically the entire schematic part of the application, while leaving the development of the business logic to the application designer. After the definition of the meta-model of an embedded system, the level of abstraction was raised, moving toward the definition of the part of the meta-model concerning the interaction of an embedded system with other systems. The perspective thus shifted to that of a System, understood as a set of interacting self-contained systems; this definition is made from the point of view of the individual system whose model is being defined. The thesis also introduces a case study which, although fairly simple, provides an example and a tutorial for developing applications with the meta-model. It also shows how the task of the application designer becomes rather simple and immediate, provided it is based on a good analysis of the problem. The results obtained were of good quality, and the meta-model is translated into code that works correctly.
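The model-to-code pattern the thesis implements with Xtext/Xtend2 can be conveyed with a language-agnostic sketch. The Python stand-in below generates the Arduino "schematic part" (pin constants and setup()) from a tiny model, leaving loop() as the business-logic hook. The model fields and names are hypothetical illustrations, not the thesis's actual meta-model.

```python
# Hedged sketch of template-based model-to-code generation (the thesis uses
# Xtext/Xtend2; this Python stand-in only illustrates the pattern). The tiny
# "model" below is hypothetical, not the thesis's meta-model.
model = {
    "name": "Thermostat",
    "sensors":   [{"name": "temp", "pin": 0, "kind": "analog"}],
    "actuators": [{"name": "fan",  "pin": 9, "kind": "digital"}],
}

def generate_sketch(m):
    """Emit the Arduino 'schematic part'; business logic stays a user hook."""
    lines = [f"// generated from model '{m['name']}'"]
    for dev in m["sensors"] + m["actuators"]:
        lines.append(f"const int {dev['name'].upper()}_PIN = {dev['pin']};")
    lines += ["", "void setup() {"]
    for a in m["actuators"]:
        lines.append(f"  pinMode({a['name'].upper()}_PIN, OUTPUT);")
    lines += ["}", "", "void loop() {", "  // business logic goes here", "}"]
    return "\n".join(lines)

print(generate_sketch(model))
```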
Abstract:
This thesis deals with the design of advanced OFDM systems; both waveform and receiver design have been treated. The main scope of the thesis is to study, create and propose ideas and novel design solutions able to cope with the weaknesses and crucial aspects of modern OFDM systems. Starting from the transmitter side, the problem represented by low resilience to non-linear distortion has been assessed, and a novel technique has been proposed that considerably reduces the Peak-to-Average Power Ratio (PAPR), yielding a quasi-constant signal envelope in the time domain (PAPR close to 1 dB). The proposed technique, named Rotation Invariant Subcarrier Mapping (RISM), is a novel scheme for subcarrier data mapping in which the symbols belonging to the modulation alphabet are not anchored but retain some degrees of freedom. In other words, a bit tuple is not mapped onto a single point; rather, it is mapped onto a geometrical locus which is totally or partially rotation invariant. The final positions of the transmitted complex symbols are chosen by an iterative optimization process that minimizes the PAPR of the resulting OFDM symbol. Numerical results confirm that RISM makes OFDM usable even in severely non-linear channels. Another well-known problem which has been tackled is the vulnerability to synchronization errors: in OFDM systems, accurate recovery of carrier frequency and symbol timing is crucial for proper demodulation of the received packets. In general, timing and frequency synchronization is performed in two separate phases, called PRE-FFT and POST-FFT synchronization. For the PRE-FFT phase, a novel joint symbol-timing and carrier-frequency synchronization algorithm has been presented; it is characterized by very low hardware complexity and, at the same time, guarantees very good performance in both AWGN and multipath channels. For the POST-FFT phase, a novel approach to both pilot structure and receiver design has been presented. In particular, a novel pilot pattern has been introduced in order to minimize the occurrence of overlaps between two pattern-shifted replicas. This makes it possible to replace conventional pilots with nulls in the frequency domain, introducing the so-called Silent Pilots. As a result, the optimal receiver turns out to be very robust against severe Rayleigh multipath fading while retaining low complexity. The performance of this approach has been evaluated analytically and numerically; compared with state-of-the-art alternatives, in both AWGN and multipath fading channels, considerable performance improvements have been obtained. The crucial problem of channel estimation has been thoroughly investigated, with particular emphasis on the decimation of the Channel Impulse Response (CIR) through the selection of the Most Significant Samples (MSSs). In this context our contribution is twofold: on the theoretical side, we derived lower bounds on the estimation mean-square error (MSE) for any MSS selection strategy; on the receiver-design side, we proposed novel MSS selection strategies which have been shown to approach these MSE lower bounds and to outperform the state-of-the-art alternatives. Finally, the possibility of using Single Carrier Frequency Division Multiple Access (SC-FDMA) in the broadband satellite return channel has been assessed.
Notably, SC-FDMA is able to improve the physical-layer spectral efficiency with respect to the single-carrier systems used so far in the Return Channel Satellite (RCS) standards. However, it requires strict synchronization and is also sensitive to the phase noise of local radio-frequency oscillators. For this reason, an effective pilot-tone arrangement within the SC-FDMA frame and a novel Joint Multi-User (JMU) estimation method for SC-FDMA have been proposed. As shown by numerical results, the proposed scheme manages to satisfy the strict synchronization requirements and to guarantee proper demodulation of the received signal.
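A minimal sketch of the figure of merit that RISM minimizes: the PAPR of an OFDM symbol, computed as the peak-to-average power of the (oversampled) IFFT of the mapped subcarrier symbols. The iterative symbol repositioning itself is not reproduced here; the QPSK mapping below is an illustrative assumption.

```python
# Minimal sketch of the quantity RISM minimizes: the PAPR of one OFDM symbol,
# i.e. the peak-to-average power of the IFFT of the mapped subcarrier symbols.
import numpy as np

def papr_db(subcarrier_symbols, oversample=4):
    """PAPR (dB) of one OFDM symbol, oversampled for a faithful peak estimate."""
    n = len(subcarrier_symbols)
    spectrum = np.zeros(n * oversample, dtype=complex)
    spectrum[:n // 2] = subcarrier_symbols[:n // 2]   # zero-pad in the middle
    spectrum[-n // 2:] = subcarrier_symbols[n // 2:]
    x = np.fft.ifft(spectrum)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(1)
qpsk = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
print(f"plain QPSK-OFDM PAPR: {papr_db(qpsk):.1f} dB")  # typically ~10-11 dB
```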
Abstract:
The hard X-ray band (10 - 100 keV) has so far been observed only by collimated and coded-aperture-mask instruments, with a sensitivity and an angular resolution roughly two orders of magnitude worse than those of the current X-ray focusing telescopes operating below 10 - 15 keV. Technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new, deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 μCrab in the 10 - 40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed, applicable to any high-energy mission for which the shielding and instrument performances are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their paths and filtering the simulated background counts as in a real observation in space. Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft-proton scattering off the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the results obtained can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to identify the major sources of background in LEO. All the results are original contributions to the assessment studies of the cited missions, carried out as part of the background groups' activities.
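As a hedged sketch of the last step described above, turning simulated energy deposits into a detector background spectrum amounts to a histogram normalized by exposure, detector area and bin width. The deposit list, exposure and area below are made-up illustrative values, not BoGEMMS output.

```python
# Hedged sketch: converting simulated energy deposits into a background
# spectrum in cts/s/cm^2/keV. All numbers are illustrative, not BoGEMMS data.
import numpy as np

deposits_kev = np.random.default_rng(2).uniform(10, 40, size=5000)  # fake hits
exposure_s, det_area_cm2 = 1.0e5, 8.0                               # assumed values

bins = np.linspace(10, 40, 31)              # 1 keV bins over the 10-40 keV band
counts, edges = np.histogram(deposits_kev, bins=bins)
width_kev = np.diff(edges)
rate = counts / (exposure_s * det_area_cm2 * width_kev)  # cts/s/cm^2/keV
```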