999 results for ricostruzione 3D triangolazione laser computervision
Abstract:
The photosensitivity of GeSx binary glasses under irradiation with femtosecond pulses at 800 nm is investigated. Samples of three different molecular compositions were irradiated under different exposure conditions. The material response to laser exposure was characterized by both refractometry and micro-Raman spectroscopy. It is shown that the relative sulfur content in the glass matrix influences the photo-induced refractive-index modification: at low sulfur content both positive and negative index changes can be obtained, while at high sulfur content only a positive index change can be reached. These changes were correlated with variations in the Raman response of the exposed glass, which were interpreted in terms of structural modifications of the glass network. Under optimized exposure conditions, waveguides with positive index changes of up to 7.8×10⁻³ and a controllable diameter from 14 to 25 μm can be obtained. Direct inscription of low-insertion-loss (IL = 3.1–3.9 dB) waveguides is demonstrated in a sample characterized by a S/Ge ratio of 4. These results open a pathway towards the use of Ge-S binary glasses for the fabrication of integrated mid-infrared photonic components.
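As a rough back-of-the-envelope check of what such an index contrast implies for guiding, the sketch below estimates the numerical aperture and V-number of a written waveguide under the weak-guidance approximation; the background index n0 ≈ 2.1 and the 3 μm operating wavelength are assumptions for illustration only, not values from the paper.

```python
import math

# Rough estimate of the guiding properties of a femtosecond-written waveguide
# from the photo-induced index contrast reported in the abstract.
n0 = 2.1             # assumed background refractive index of the Ge-S glass (not from the paper)
delta_n = 7.8e-3     # maximum positive index change reported in the abstract
radius_um = 14 / 2   # core radius for the smallest reported diameter (14 um)
wavelength_um = 3.0  # assumed mid-IR operating wavelength (not from the paper)

na = math.sqrt(2 * n0 * delta_n)                   # weak-guidance numerical aperture
v = 2 * math.pi * radius_um * na / wavelength_um   # normalized frequency (V-number)

print(f"NA ~ {na:.3f}, V ~ {v:.2f}")  # V < 2.405 would indicate single-mode guiding
```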
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
In this paper, a method is proposed to geometrically refine 3D roof outlines derived from LASER scanning data by using a high-resolution aerial image and Markov Random Field (MRF) models. To this end, an MRF description for grouping straight lines is developed, assuming that each projected side contour and ridge is topologically correct and only needs to be improved in accuracy. Although the combination of laser data with image data is most justified for refining the roof contour, the structure of the ridges can give greater robustness to the topological description of the roof structure. The MRF model is formulated from relationships (length, proximity, and orientation) between the straight lines extracted from the image and the projected polygon, and also from rectangularity and corner constraints. The energy function associated with the MRF is minimized by a genetic algorithm, resulting in a grouping of straight lines for each roof object. Finally, each grouping of straight lines is topologically reconstructed based on the topology of the corresponding LASER scanning polygon projected onto image space. The results obtained were satisfactory: the method was able to provide refined building roof polygons in which most of the contour sides and ridges were geometrically improved.
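The line-grouping step can be pictured with the following minimal sketch (hypothetical data and weights, not the authors' implementation): an MRF-style energy, reduced here to unary terms plus a rejection penalty for brevity, rewards image lines whose orientation and midpoint agree with a projected roof side, and a toy genetic algorithm searches over the label vector that assigns each extracted line to a side or discards it.

```python
import numpy as np

rng = np.random.default_rng(0)

def line_attrs(p, q):
    """Length, orientation (0..pi) and midpoint of a segment p-q."""
    d = q - p
    return np.hypot(*d), np.arctan2(d[1], d[0]) % np.pi, (p + q) / 2.0

def unary_cost(image_line, side):
    """Cost of assigning an extracted image line to a projected roof side,
    based on orientation difference and midpoint proximity (illustrative weights)."""
    _, a1, m1 = line_attrs(*image_line)
    _, a2, m2 = line_attrs(*side)
    dang = min(abs(a1 - a2), np.pi - abs(a1 - a2))
    return 2.0 * dang + 0.1 * np.linalg.norm(m1 - m2)

def energy(labels, image_lines, sides, penalty=1.5):
    """MRF-style energy: assigned lines pay their unary cost, the 'no side'
    label (-1) pays a fixed penalty so only well-supported assignments survive."""
    e = 0.0
    for line, lab in zip(image_lines, labels):
        e += penalty if lab < 0 else unary_cost(line, sides[lab])
    return e

def genetic_minimize(image_lines, sides, pop=40, gens=200, pmut=0.1):
    """Toy genetic algorithm over label vectors (one label per image line)."""
    n, k = len(image_lines), len(sides)
    popu = rng.integers(-1, k, size=(pop, n))
    for _ in range(gens):
        fit = np.array([energy(ind, image_lines, sides) for ind in popu])
        elite = popu[np.argsort(fit)[: pop // 2]]             # selection
        cut = rng.integers(1, n, size=len(elite))
        children = np.array([np.concatenate((a[:c], b[c:]))   # one-point crossover
                             for a, b, c in zip(elite, elite[::-1], cut)])
        mask = rng.random(children.shape) < pmut              # mutation
        children[mask] = rng.integers(-1, k, size=mask.sum())
        popu = np.vstack((elite, children))
    return popu[np.argmin([energy(i, image_lines, sides) for i in popu])]

# Hypothetical data: two projected roof sides and three extracted image lines.
sides = [(np.array([0., 0.]), np.array([10., 0.])),
         (np.array([10., 0.]), np.array([10., 8.]))]
image_lines = [(np.array([0.2, 0.3]), np.array([9.7, 0.1])),    # supports side 0
               (np.array([10.2, 0.5]), np.array([9.9, 7.8])),   # supports side 1
               (np.array([3., 15.]), np.array([6., 16.]))]      # clutter, should get -1
print(genetic_minimize(image_lines, sides))
```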
Abstract:
Tailoring the properties of materials by femtosecond laser processing has been proposed over the last decade as a powerful approach for technological applications ranging from optics to biology. Although most of the research output in this field concerns femtosecond laser processing of single organic or inorganic materials, more recently a similar approach has been proposed to develop advanced hybrid nanomaterials. Here, we report results on the use of femtosecond lasers to process hybrid nanomaterials composed of polymeric and glassy matrices containing metal or semiconductor nanostructures. We present results on the use of femtosecond pulses to induce the formation of Cu and Ag nanoparticles in the bulk of borate and borosilicate glasses, which can be applied to a new generation of waveguides. We also report on 3D polymeric structures, fabricated by two-photon polymerization, containing Au and ZnO nanostructures with intense two-photon fluorescence. This femtosecond-laser-processing approach to fabricating hybrid materials containing metal or semiconductor nanostructures is promising for optical sensors and photonic devices.
Abstract:
Laser-driven ion acceleration is a burgeoning field of research that has attracted a growing number of scientists since the first results, reported in 2000, obtained by irradiating thin solid foils with high-power laser pulses. The growing interest is driven by the peculiar characteristics of the produced bunches, the compactness of the whole accelerating system and the very short accelerating length of these all-optical accelerators. Intense theoretical and experimental work has been done since then. An important part of the theoretical study is carried out by means of numerical simulations, and the most widely used technique exploits PIC ("Particle In Cell") codes. In this thesis the PIC code AlaDyn, developed by our research group around innovative algorithms, is described. My work has been devoted to the development of the code and to the investigation of laser-driven ion acceleration for different target configurations. Two target configurations for proton acceleration are presented together with the results of the 2D and 3D numerical investigation. The first configuration consists of a solid foil with a low-density layer attached on the irradiated side. The near-critical plasma of the foam layer allows a very high energy absorption by the target and an increase of the proton energy by up to a factor of 3 with respect to the "pure" TNSA configuration; the differences of this regime with respect to standard TNSA are described. The case of near-critical-density targets has been investigated with 3D simulations. In this case the laser travels through the plasma and exits on the rear side. During the propagation, the laser drills a channel and induces a magnetic vortex that, expanding on the rear side of the target, is the source of a very intense electric field. The protons of the plasma are strongly accelerated up to energies of 100 MeV using a 200 PW laser.
Abstract:
In recent years, Reverse Engineering systems have attracted considerable interest for a wide range of applications. Consequently, many research activities focus on the accuracy and precision of the acquired data and on improvements to the post-processing phase. In this context, this PhD thesis defines two novel methods for data post-processing and for data fusion between physical and geometrical information. In particular, a technique has been defined to characterize the error in the 3D point coordinates acquired by an optical triangulation laser scanner, with the aim of identifying adequate correction arrays to apply under different acquisition parameters and operating conditions. The systematic error in the acquired data is thus compensated, increasing the accuracy of the measurement. Moreover, the definition of a 3D thermogram is examined: the geometrical information of an object and its thermal properties, coming from a thermographic inspection, are combined so as to have a temperature value for each recognizable point. Data acquired by the optical triangulation laser scanner are also used to normalize the temperature values and make the thermal data independent of the thermal camera's point of view.
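A minimal sketch of the two ideas follows, assuming a hypothetical correction array indexed by stand-off distance and incidence angle and a hypothetical point-to-pixel projection; neither is taken from the thesis.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Illustrative sketch (not the thesis implementation): correct the depth (z)
# coordinate of triangulation-scanner points with a pre-computed correction
# array indexed by stand-off distance and surface incidence angle, then attach
# a temperature value to each point from a thermal image.

# Hypothetical correction array: rows = stand-off distance [mm], cols = incidence angle [deg]
distances = np.array([100., 150., 200., 250.])
angles = np.array([0., 20., 40., 60.])
correction_mm = np.array([[0.02, 0.03, 0.06, 0.12],
                          [0.03, 0.05, 0.09, 0.18],
                          [0.05, 0.08, 0.14, 0.27],
                          [0.08, 0.12, 0.21, 0.40]])
lut = RegularGridInterpolator((distances, angles), correction_mm)

def compensate(points_xyz, incidence_deg):
    """Subtract the interpolated systematic error from the z coordinate."""
    corr = lut(np.column_stack((points_xyz[:, 2], incidence_deg)))
    out = points_xyz.copy()
    out[:, 2] -= corr
    return out

def thermogram(points_xyz, thermal_img, project):
    """Attach a temperature to each 3D point; `project` is a hypothetical
    mapping from xyz coordinates to thermal-image pixels (row, col)."""
    rows, cols = project(points_xyz)
    return np.column_stack((points_xyz, thermal_img[rows, cols]))

pts = np.array([[10., 5., 180.], [12., 6., 210.]])
print(compensate(pts, np.array([15., 35.])))
```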
Abstract:
In the race to obtain protons with higher energies while using ever more compact systems, laser-driven plasma accelerators are becoming an interesting possibility, but so far only beams with extremely broad energy spectra and high divergence have been produced. The guiding line of this PhD thesis was the study and design of a compact system to extract a high-quality beam out of the initial bunch of protons produced by the interaction of a laser pulse with a thin solid target, using experimentally reliable technologies so that such a system can be tested as soon as possible. In this thesis, different transport lines are analyzed. The first is based on a high-field pulsed solenoid, some collimators and, for perfect filtering and post-acceleration, a high-field high-frequency compact linear accelerator originally designed to accelerate a 30 MeV beam extracted from a cyclotron. The second is based on a quadruplet of permanent magnetic quadrupoles: thanks to its greater simplicity and reliability it is of great interest for experiments, but its effectiveness is lower than that of the solenoid-based line; in fact, the final beam intensity drops by an order of magnitude. A further appreciable decrease in intensity occurs in the third case, where the energy selection is achieved using a chicane, because of its very low efficiency for off-axis protons. The proposed schemes have all been analyzed with 3D simulations and all the significant results are presented. Future experimental work based on the outcome of this thesis can be planned and is currently being discussed.
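As an illustration of how such a magnetic transport line can be evaluated at first order, the sketch below composes thick-lens transfer matrices for a hypothetical permanent-magnet quadrupole (PMQ) quadruplet; the gradients, lengths and drift spaces are placeholders, not the thesis design.

```python
import numpy as np

# First-order (transfer-matrix) sketch of a PMQ quadruplet line of the kind
# considered in the abstract; all element parameters are hypothetical.

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def quad(k, L):
    """Thick-quadrupole matrix in one transverse plane; k > 0 focusing,
    k < 0 defocusing, k in 1/m^2, L in m."""
    if k > 0:
        phi = np.sqrt(k) * L
        return np.array([[np.cos(phi), np.sin(phi) / np.sqrt(k)],
                         [-np.sqrt(k) * np.sin(phi), np.cos(phi)]])
    phi = np.sqrt(-k) * L
    return np.array([[np.cosh(phi), np.sinh(phi) / np.sqrt(-k)],
                     [np.sqrt(-k) * np.sinh(phi), np.cosh(phi)]])

# Hypothetical F, D, D, F quadruplet in the horizontal plane.
k, lq, ld = 300.0, 0.05, 0.10          # quad strength [1/m^2], quad and drift lengths [m]
line = [drift(0.05), quad(+k, lq), drift(ld), quad(-k, lq),
        drift(ld), quad(-k, lq), drift(ld), quad(+k, lq), drift(0.5)]

M = np.eye(2)
for element in line:
    M = element @ M                    # compose matrices in beamline order

x0 = np.array([1.0e-3, 10.0e-3])       # initial offset [m] and divergence [rad]
print("transported (x, x'):", M @ x0)
```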
Abstract:
This thesis evaluates the effectiveness of the titanium mesh plus particulate bone technique in the reconstruction of three-dimensional alveolar defects for implant-supported prosthetic dental rehabilitation. The first study assessed the method in terms of post-operative complications and implant-prosthetic results. Twenty-four patients with three-dimensional defects were treated with 34 titanium meshes and particulate bone and rehabilitated prosthetically after approximately 8-9 months. 4 out of 34 meshes were removed before implant placement (11.76% total failure); 20 out of 34 meshes became exposed through soft-tissue dehiscence (58.82% complication rate): 4 (11.77%) before and 16 (47.05%) after the first 4-6 weeks from surgery; in no case was the implant-prosthetic treatment plan modified. After a mean follow-up of 20 (3-48) months from prosthetic loading, none of the 88 implants lost osseointegration (100% implant survival), with an overall implant success rate of 82.9%. The second study quantified, in volumetric terms, the bone reconstruction obtained with the meshes and its correlation with the extent of the exposure and the time at which it occurred. Twelve patients with 15 alveolar defects were evaluated. For each site, CT images were studied with dedicated software to measure volumes in three dimensions: the volume of bone that failed to form with respect to what had been planned, the lacking bone volume (LBV), was calculated by subtracting the reconstructed bone volume (RBV), measured at surgical re-entry, from the pre-operatively planned bone volume (PBV). LBV was found to be directly proportional to the exposed mesh area, with a value of 16.3% of LBV for each cm² of exposed mesh. Positive correlations also emerged between LBV, early exposure timing and the PBV value.
Abstract:
Theory and numerical modeling are fundamental tools for understanding, optimizing and designing present and future laser-plasma accelerators (LPAs). Laser evolution and plasma-wave excitation in an LPA driven by a weakly relativistically intense, short-pulse laser propagating in a preformed parabolic plasma channel are studied analytically in 3D, including the effects of pulse steepening and energy depletion. At higher laser intensities, the process of electron self-injection in the nonlinear bubble wake regime is studied by means of fully self-consistent Particle-in-Cell simulations. Considering a non-evolving laser driver propagating with a prescribed velocity, the geometrical properties of the non-evolving bubble wake are studied, and the dependence of the self-injection threshold on laser intensity and wake velocity is characterized for a range of parameters of interest for laser-plasma acceleration. Due to the nonlinear and complex nature of the physics involved, computationally challenging numerical simulations are required to model laser-plasma accelerators operating at relativistic laser intensities. The numerical and computational optimizations that, combined in the codes INF&RNO and INF&RNO/quasi-static, make it possible to accurately model multi-GeV laser wakefield acceleration stages on present supercomputing architectures are discussed. The PIC code jasmine, capable of efficiently running laser-plasma simulations on Graphics Processing Unit (GPU) clusters, is presented. GPUs deliver exceptional performance to PIC codes, but the core algorithms had to be redesigned to satisfy the constraints imposed by the intrinsic parallelism of the architecture. The simulation campaigns run with jasmine to model the recent LPA experiments with the INFN-FLAME and CNR-ILIL laser systems are also presented.
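For readers unfamiliar with the particle-push step at the heart of such codes, the sketch below implements the textbook relativistic Boris scheme for a single electron in uniform fields; it is generic illustrative material, not code from INF&RNO or jasmine.

```python
import numpy as np

# Textbook relativistic Boris pusher: the standard particle-advance step of a
# PIC code, shown here for one electron in uniform external fields.

C = 299_792_458.0          # speed of light [m/s]
QM = -1.758820e11          # electron charge-to-mass ratio q/m [C/kg]

def boris_push(x, u, E, B, dt):
    """Advance position x [m] and normalized momentum u = gamma*v [m/s]
    of one particle by dt, given the fields E [V/m] and B [T] at the particle."""
    u_minus = u + QM * E * dt / 2.0                      # half acceleration by E
    gamma = np.sqrt(1.0 + np.dot(u_minus, u_minus) / C**2)
    t = QM * B * dt / (2.0 * gamma)                      # rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    u_plus = u_minus + np.cross(u_minus + np.cross(u_minus, t), s)
    u_new = u_plus + QM * E * dt / 2.0                   # second half acceleration
    gamma_new = np.sqrt(1.0 + np.dot(u_new, u_new) / C**2)
    return x + u_new / gamma_new * dt, u_new

x = np.zeros(3)
u = np.array([0.0, 0.0, 1.0e8])          # initial gamma*v [m/s]
E = np.array([1.0e9, 0.0, 0.0])          # uniform E field [V/m] (hypothetical)
B = np.array([0.0, 0.0, 0.1])            # uniform B field [T] (hypothetical)
for _ in range(100):
    x, u = boris_push(x, u, E, B, 1.0e-12)
print(x, u)
```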
Abstract:
Coastal sand dunes are a valuable resource, first of all as a defence against storm waves and saltwater intrusion; moreover, these morphological elements constitute a unique transitional ecosystem between the marine and terrestrial environments. Research on dune systems has been an important branch of coastal science since the last century. Today this branch has become even more important for two reasons: on the one hand, new technologies, especially those related to Remote Sensing, have expanded the possibilities available to researchers; on the other hand, intense urbanization has strongly limited the dunes' capacity to develop and has fragmented what remained from the last century. This is particularly true in the Ravenna area, where industrialization, combined with the tourist economy and intense subsidence, has left only a few residual dune ridges still active. In this work, three different foredune ridges along the Ravenna coast were studied with Laser Scanner technology. The research was not limited to analyzing volumetric or spatial differences, but also sought new methods and new features for monitoring this environment. Moreover, a series of tests was planned to validate data from the Terrestrial Laser Scanner (TLS), with the additional aim of finalizing a methodology for assessing 3D survey accuracy. The data acquired by TLS were then used, on the one hand, to test some new applications, such as the Digital Shoreline Analysis System (DSAS) and Computational Fluid Dynamics (CFD), to prove their efficacy in this field; on the other hand, the TLS data were used to look for correlations with meteorological indexes (forcing factors) linked to sea and wind (Fryberger's method), applying statistical tools such as Principal Component Analysis (PCA).
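The statistical step can be pictured with the short sketch below, which runs a PCA (via the SVD of the standardized data matrix) on a table of forcing indexes and TLS-derived volume changes; all numbers are invented for illustration only.

```python
import numpy as np

# Illustrative sketch (hypothetical numbers): relate TLS-derived foredune volume
# changes to sea/wind forcing indexes with a Principal Component Analysis.
# Columns: wind drift potential (Fryberger-style), storm wave energy index,
# measured volume change between surveys [m^3/m].
data = np.array([
    [120., 15., +2.1],
    [ 80., 40., -1.4],
    [200., 10., +3.8],
    [ 60., 55., -2.9],
    [150., 20., +1.0],
])

# Standardize each variable, then take the PCA from the SVD of the data matrix.
z = (data - data.mean(axis=0)) / data.std(axis=0)
_, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("explained variance ratio:", np.round(explained, 3))
print("PC1 loadings (wind, waves, dV):", np.round(vt[0], 3))
```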
Abstract:
The main purpose of this thesis was to investigate in detail the burning-rate anomaly known as the "Hump Effect" in solid rocket motors cast around a mandrel, together with the mechanisms at its base, and to develop a numerical code, in the Matlab environment, providing a forecasting tool that generates concentration and orientation maps of the particles within the grain. These analyses matter because the prediction of ballistic curves for new motors has to be improved in order to reduce the number of experimental tests needed to characterize their ballistic behavior. The work is divided into two parts. The first concerns two-dimensional and three-dimensional simulations of the z9 motor casting process, carried out with Fluent and Flow 3D respectively. The second concerns the analysis of the fluid-dynamic data and the development of numerical codes which provide information on the concentration and orientation of the particles within the grain, based on the fluid strain-rate information extracted from the CFD software.
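One simple way to turn CFD strain-rate information into an orientation proxy, sketched below with hypothetical numbers (and in Python rather than Matlab, purely for illustration), is to take the principal stretching direction of the local strain-rate tensor.

```python
import numpy as np

# Illustrative sketch (not the thesis code): given the velocity-gradient tensor
# extracted from the CFD solution at a point in the casting flow, compute the
# strain-rate tensor and use its principal stretching direction as a proxy for
# the local preferred orientation of elongated particles.

def particle_orientation(grad_u):
    """grad_u[i, j] = d u_i / d x_j at one point; returns the largest
    (most extensional) eigenvalue of the strain-rate tensor and its eigenvector."""
    strain_rate = 0.5 * (grad_u + grad_u.T)          # symmetric part of grad(u)
    eigvals, eigvecs = np.linalg.eigh(strain_rate)   # ascending eigenvalues
    return eigvals[-1], eigvecs[:, -1]

# Hypothetical velocity gradient [1/s] at one point of the casting simulation.
grad_u = np.array([[ 0.8, 0.3, 0.0],
                   [ 0.1, -0.5, 0.2],
                   [ 0.0, 0.1, -0.3]])
rate, direction = particle_orientation(grad_u)
print(f"max stretching rate {rate:.3f} 1/s along {np.round(direction, 3)}")
```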
Abstract:
This thesis presents the design, construction and application of a miniaturized experimental setup for image reconstruction with the Electrical Impedance Tomography (EIT) technique. The work described here constitutes a preliminary feasibility study aimed at reconstructing the position of small portions of tissue (on the order of a few millimetres) or cell aggregates inside a scaffold in 3D tissue or cell cultures. The designed setup incorporates 8 vertical electrodes arranged around the periphery of a circular measurement chamber 10 mm in diameter. The EIT analysis was carried out using i) electrodes that are conductive over the full height of the chamber (used in the two-dimensional and quasi-two-dimensional EIT models) and ii) deep brain stimulation electrodes (conductive only over a small volume at the tip and placed at three different heights: top, centre and bottom), used in the three-dimensional EIT model. The finite element method (FEM) was used to solve both the forward and the inverse problem, reconstructing the conductivity distribution map within the measurement chamber. The experiments carried out made it possible to reconstruct the conductivity distribution map for samples on the order of one millimetre in diameter. These dimensions are compatible with those of the samples studied in tissue engineering and also with those typical of organ-on-a-chip systems. The EIT method developed, the prototype setup built and the statistical treatment of the data are currently being further developed in collaboration with the group of Professor David Holder, Dept. Medical Physics and Bioengineering, University College London (UCL), United Kingdom.
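The inverse step of such an EIT reconstruction can be illustrated by the minimal linearized sketch below; the FEM forward model is not reproduced, and the sensitivity matrix, mesh size and regularization parameter are placeholders, not values from this work.

```python
import numpy as np

# Minimal sketch of the linearized inverse step used in many EIT reconstructions
# (one-step Tikhonov-regularized Gauss-Newton). The FEM forward model that would
# produce the sensitivity (Jacobian) matrix J is replaced by a random placeholder.

rng = np.random.default_rng(1)
n_meas, n_elems = 40, 256                      # 8-electrode protocol, coarse mesh (hypothetical)
J = rng.standard_normal((n_meas, n_elems))     # placeholder sensitivity matrix
true_dsigma = np.zeros(n_elems)
true_dsigma[100:110] = 0.5                     # small conductive inclusion
dv = J @ true_dsigma + 0.01 * rng.standard_normal(n_meas)   # simulated boundary voltages

lam = 0.5                                      # regularization parameter
dsigma = np.linalg.solve(J.T @ J + lam**2 * np.eye(n_elems), J.T @ dv)
print("reconstructed mean in inclusion:", round(float(dsigma[100:110].mean()), 3))
```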
Abstract:
The aim of this thesis is to present a monitoring technique applicable to coastal dunes, used in this study in the province of Ravenna and in particular on a transverse coastal dune ridge located in the natural area adjacent to the mouth of the Bevano stream, in the Lido di Classe area. The technique relies on laser technology to provide extremely detailed 3D documentation, which allows us to assess how the dune system behaves when facing an extreme weather event and/or under the action of sea storms, comparing both the morphological and the morphometric aspects by means of software that enabled the comparison of the data acquired before and after the weather event.
Abstract:
The framework in question is an environment designed to apply Machine Learning techniques (in particular Random Forests) to the Semi-Global Matching (SGM) stereo-matching algorithm, in order to increase the accuracy of its standard version. The aim of this thesis is to modify some settings of the framework, turning it into an environment better suited to the directionality of the scanlines (introducing rectangular and orthogonal support windows and the training of separate forests for each scanline), and to extend its functionality by adding some new features, namely the distance from the nearest directional edge and the distinctiveness, computed on the left image of the stereo pair, and the directional edges computed on the disparity maps. The final goal is to run extensive tests on the Middlebury 2014 and KITTI datasets and to collect data describing the positive or negative impact of the modifications.
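One of the added features, the distance to the nearest edge along a scanline direction, can be pictured with the sketch below (hypothetical gradient threshold and toy data, not the framework's code).

```python
import numpy as np

# Illustrative sketch of a scanline-oriented feature: for each pixel of the left
# image, the distance to the nearest edge encountered along one SGM scanline
# direction (here: left-to-right along the rows).

def distance_to_previous_edge(gray, threshold=20.0):
    """Distance, along each row, from a pixel to the closest edge pixel on its left.
    Edges are pixels whose horizontal intensity gradient exceeds `threshold`."""
    grad_x = np.abs(np.diff(gray.astype(np.float64), axis=1, prepend=gray[:, :1]))
    edge = grad_x > threshold
    h, w = gray.shape
    dist = np.empty((h, w), dtype=np.int32)
    for r in range(h):
        last = None                        # no edge seen yet on this row
        for c in range(w):
            if edge[r, c]:
                last = c
            dist[r, c] = c - last if last is not None else w
    return dist

# Toy image: a vertical step edge at column 8.
img = np.tile(np.concatenate((np.full(8, 10.0), np.full(8, 200.0))), (4, 1))
print(distance_to_previous_edge(img)[0])   # distance is w before any edge, resets to 0 at column 8
```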
Abstract:
For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including the ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain patterns and the traces, were created. To determine the areas of origin of the bloodstain patterns, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software; the ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains and the high accuracy of the analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of the bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, the ballistic bloodstain pattern analysis enabled relevant forensic conclusions regarding the course of events.
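The classical straight-line approximation that the ballistic analysis refines can be illustrated with the short sketch below (hypothetical stain measurements): the impact angle of an elliptical stain follows sin(α) = width/length, and back-projecting along that angle from each stain towards the area of convergence yields a straight-line estimate of the source height, which the ballistic treatment of gravity and air drag typically lowers.

```python
import math

# Straight-line "stringing" approximation of the blood source height; the
# ballistic approach described in the abstract replaces these straight lines
# with drag- and gravity-corrected trajectories. Numbers below are hypothetical.

def impact_angle_deg(width_mm, length_mm):
    """Impact angle of an elliptical stain: sin(alpha) = width / length."""
    return math.degrees(math.asin(width_mm / length_mm))

def straight_line_height(dist_to_convergence_m, width_mm, length_mm):
    """Height of the blood source above the stain plane, straight-line model."""
    alpha = math.radians(impact_angle_deg(width_mm, length_mm))
    return dist_to_convergence_m * math.tan(alpha)

# (horizontal distance to area of convergence [m], stain width [mm], stain length [mm])
stains = [(1.20, 4.1, 9.6), (0.95, 5.0, 8.8), (1.45, 3.6, 10.2)]
heights = [straight_line_height(d, w, l) for d, w, l in stains]
print([round(h, 2) for h in heights])   # per-stain height estimates [m]
```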