992 results for "Compound enhancement techniques"
Abstract:
To continuously improve the performance of metal-oxide-semiconductor field-effect transistors (MOSFETs), innovative device architectures, gate stack engineering and mobility enhancement techniques are under investigation. In this framework, new physics-based models for Technology Computer-Aided Design (TCAD) simulation tools are needed to accurately predict the performance of upcoming nanoscale devices and to provide guidelines for their optimization. This thesis presents advanced physically based mobility models for ultrathin body (UTB) devices with either planar or vertical architectures, such as single-gate silicon-on-insulator (SOI) field-effect transistors (FETs), double-gate FETs, FinFETs and silicon nanowire FETs, integrating strain technology and high-κ gate stacks. The effective mobility of the two-dimensional electron/hole gas in a UTB FET channel is calculated taking into account its tensorial nature and quantization effects. All the scattering mechanisms relevant for thin silicon films and for high-κ dielectrics and metal gates have been addressed and modeled for UTB FETs on differently oriented substrates. The effects of mechanical stress on (100) and (110) silicon band structures have been modeled for a generic stress configuration. Performance will also derive from heterogeneity, as the diversity of functions integrated on complementary metal-oxide-semiconductor (CMOS) platforms keeps increasing. For example, new architectural concepts are of interest not only to extend FET scaling, but also to enable innovative sensor applications. Benefiting from properties such as a large surface-to-volume ratio and extreme sensitivity to surface modifications, silicon-nanowire-based sensors are gaining special attention in research. In this thesis, a comprehensive analysis of the physical effects at play in the detection of gas molecules is carried out by TCAD simulations combined with interface characterization techniques. The complex interaction of charge transport in silicon nanowires of different dimensions with interface trap states and remote charges is addressed to correctly reproduce experimental results of recently fabricated gas nanosensors.
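The abstract does not give the model equations; as a minimal illustration of how such mobility models typically combine scattering mechanisms (phonon, surface roughness, remote Coulomb scattering from the high-κ/metal gate stack, and so on), the individual contributions are often merged via Matthiessen's rule:

```latex
\frac{1}{\mu_{\mathrm{eff}}} \;\approx\; \sum_i \frac{1}{\mu_i}
\;=\; \frac{1}{\mu_{\mathrm{ph}}} + \frac{1}{\mu_{\mathrm{SR}}} + \frac{1}{\mu_{\mathrm{RC}}} + \cdots
```

In the tensorial treatment described above, each contribution becomes a 2×2 mobility tensor of the confined carrier gas, which is what makes a generic-stress, multi-orientation model nontrivial.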
Abstract:
The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) has been used to quantify SO2 emissions from passively degassing volcanoes. This dissertation explores ASTER's capability to detect SO2 through satellite validation, enhancement techniques and extensive processing of images at a variety of volcanoes. ASTER is compared to the Mini UV Spectrometer (MUSe), a ground-based instrument, to determine whether reasonable SO2 fluxes can be quantified from a plume emitted at Lascar, Chile. The two sensors were in good agreement, with ASTER proving to be a reliable detector of SO2. ASTER illustrated the advantages of imaging a plume in 2D, with better temporal resolution than the MUSe. SO2 plumes in ASTER imagery are not always discernible in the raw TIR data. Principal Component Analysis (PCA) and Decorrelation Stretch (DCS) enhancement techniques were compared to determine how well they highlight a variety of volcanic plumes. DCS produced a consistent output, and the composition of the plumes from explosive eruptions was easy to identify. As the plumes became smaller and lower in altitude, they became harder to distinguish using DCS; PCA proved better at identifying smaller, low-altitude plumes. ASTER was used to investigate SO2 emissions at Lascar, Chile, where activity has been characterized by cyclic behavior and persistent degassing (Matthews et al. 1997). Previous studies at Lascar have primarily focused on changes in thermal infrared anomalies, neglecting gas emissions. Using the SO2 data along with changes in thermal anomalies and visual observations, it is evident that Lascar is at the end of an eruptive cycle that began in 1993. Declining gas emissions and crater temperatures suggest that the conduit is sealing. ASTER and the Ozone Monitoring Instrument (OMI) were used to determine the annual contribution of SO2 to the troposphere from the Central and South American volcanic arcs between 2000 and 2011: fluxes of 3.4 Tg/a for Central America and 3.7 Tg/a for South America were calculated. The detection limits of ASTER were also explored, with a notable result: plumes from many of the high-emitting volcanoes, such as Villarrica, Chile, were not detected by ASTER.
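For reference on the DCS technique compared above, a minimal decorrelation stretch of a three-band thermal-infrared image can be sketched as follows; the band arrangement and target variance are illustrative assumptions, not the dissertation's actual processing chain.

```python
# Minimal decorrelation stretch (DCS) sketch for a 3-band TIR image.
# `cube` is a (rows, cols, 3) array, e.g. three ASTER TIR bands on a
# common grid (illustrative assumption).
import numpy as np

def decorrelation_stretch(cube, target_sigma=50.0):
    rows, cols, nbands = cube.shape
    X = cube.reshape(-1, nbands).astype(float)
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    # Whiten along the principal axes, rescale every axis to a common
    # variance, and rotate back: the bands are decorrelated and subtle
    # color differences (e.g. an SO2 plume) are exaggerated.
    whiten = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    out = (X - mean) @ whiten * target_sigma + mean
    return out.reshape(rows, cols, nbands)
```

Displaying the stretched bands as an RGB composite is what makes plume composition visually separable, which matches the behavior described for explosive-eruption plumes.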
Abstract:
Context. During the most recent perihelion passage of comet 67P/Churyumov-Gerasimenko (67P) in 2009, ground-based observations showed an anisotropic dust coma in which jet-like features were detected at ~1.3 AU from the Sun. The current perihelion passage is exceptional, as the Rosetta spacecraft has been monitoring the nucleus activity since March 2014, when a clear dust coma already surrounded the nucleus at 4.3 AU from the Sun. Subsequently, the OSIRIS camera witnessed an outburst in activity between April 27 and 30, and since mid-July the dust coma at r_h ≈ 3.7-3.6 AU pre-perihelion has been clearly non-isotropic, pointing to the existence of dust jet-like features. Aims. We aim to locate on the nucleus surface the origin of the dust jet-like features detected as early as mid-July 2014. This will help to establish how the localized comet nucleus activity compares with that seen in previous apparitions and will also help in following its evolution as the comet approaches perihelion, the phase at which most of the jets were detected from ground-based observations. Determining these areas also allows locating them in regions of the nucleus with distinct spectroscopic or geomorphological characteristics. Methods. Three series of dust images of comet 67P obtained with the Wide Angle Camera (WAC) of the OSIRIS instrument onboard the Rosetta spacecraft were processed with different enhancement techniques. This was done to clearly show the existence of jet-like features in the dust coma, whose appearance toward the observer changed as a result of the rotation of the comet nucleus and of the changing observing geometry from the spacecraft. The position angles of these features in the coma, together with information on the observing geometry, nucleus shape, and rotation, allowed us to determine the most likely locations on the nucleus surface where the jets originate. Results. Geometrical tracing of jet sources indicates that during July and August 2014 the activity of the nucleus of 67P gave rise to large-scale jet-like features from the Hapi, Hathor, Anuket, and Aten regions, confirming that active regions may be present on the nucleus at 60° northern latitude, as deduced from previous comet apparitions. There are also hints that large-scale jets observed from the ground may be composed, at their place of origin on the nucleus surface, of numerous small-scale features.
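The enhancement techniques themselves are not named in the abstract; one standard coma-enhancement method consistent with the description is division of the image by its azimuthal median profile, sketched below under the assumption that the nucleus position (xc, yc) in pixels is known.

```python
# Common coma-enhancement sketch (not necessarily the paper's method):
# divide each pixel by the azimuthal median brightness at its
# nucleocentric distance, flattening the smooth coma so discrete
# jet-like features stand out as ratios > 1.
import numpy as np

def azimuthal_median_division(img, xc, yc, nbins=200):
    y, x = np.indices(img.shape)
    r = np.hypot(x - xc, y - yc)
    edges = np.linspace(0.0, r.max(), nbins + 1)
    idx = np.clip(np.digitize(r, edges) - 1, 0, nbins - 1)
    # Median coma brightness in each radial annulus (NaN if annulus empty)
    profile = np.array([np.median(img[idx == i]) if np.any(idx == i) else np.nan
                        for i in range(nbins)])
    return img / profile[idx]
```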
Abstract:
The study of materials, especially biological ones, by non-destructive means is becoming increasingly important in both scientific and industrial applications. The economic advantages of non-destructive methods are numerous. There are many physical procedures capable of extracting detailed information from the wood surface with little or no prior treatment and minimal intrusion into the material. Among the various methods, optical and acoustic techniques stand out for their great versatility, relative simplicity and low cost. Starting from simple physical principles and direct surface measurement, and through the development of the most suitable statistics-based decision algorithms, this thesis aims to establish simple and essentially minimum-cost technological solutions for determining the species and surface defects of each wood sample while, as far as possible, leaving its working geometry unaltered. Three analyses were developed. The first optical method uses the properties of the light scattered by the wood surface when it is illuminated by a diffuse laser. This scattering produces a speckle pattern whose statistical properties allow very precise properties of both the microscopic and macroscopic structure of the wood to be extracted. Analysis of the spectral properties of the scattered laser light yields more or less regular patterns related to the anatomical structure, composition, processing and surface texture of the wood under study, revealing characteristics of the material and of the quality of the processes it has undergone. The use of this type of laser also makes it possible to monitor industrial processes remotely and in real time without interfering with other sensors. The second optical technique makes use of the statistical and mathematical study of the properties of digital images of the wood surface obtained with a high-resolution scanner system. After isolating the most relevant details of the images, several automatic classification algorithms generate databases of the wood species to which the images belong, together with the error margins of those classifications. A fundamental part of the classification tools is based on the precise study of the color bands of the various woods. Finally, various acoustic techniques, such as the analysis of pulses generated by acoustic impact, complement and refine the results obtained with the optical methods described, identifying surface and deep structures in the wood as well as pathologies or deformations, aspects of special utility when wood is used in structures. The usefulness of these techniques is well established in industry, even though their adoption remains limited owing to high costs and a lack of process standardization, which means that individual analyses are not comparable with their theoretical market equivalents. At present, much research effort takes for granted that differentiating between species is a recognition mechanism specific to human beings, and concentrates the technology on the definition of physical parameters (moduli of elasticity, electrical or acoustic conductivity, etc.), using very expensive instruments that are often complex to deploy in the field.
Abstract: The study of materials, especially biological ones, by non-destructive techniques is becoming increasingly important in both scientific and industrial applications. The economic advantages of non-destructive methods are multiple and clear, given the costs and resources involved. There are many physical processes capable of extracting detailed information from the wood surface with little or no previous treatment and minimal intrusion into the material. Among the various methods, acoustic and optical techniques stand out for their great versatility, relative simplicity and low cost. This thesis aims to establish, from the application of simple principles of physics and direct surface measurement, and through the development of appropriate statistics-based decision algorithms, simple technological solutions of minimum cost for possible application in determining the species and the surface defects of each wood sample. The main objective is reasonable accuracy without altering the samples' working location or properties. There are three different lines of work.
Empirical characterization of wood surfaces by means of iterative autocorrelation of laser speckle patterns: a simple and inexpensive method for the qualitative characterization of wood surfaces is presented. It is based on the iterative autocorrelation of laser speckle patterns produced by diffuse laser illumination of the wood surfaces. The method exploits the high spatial frequency content of speckle images; a similar approach with raw conventional photographs taken under ordinary light would be very difficult. A few iterations of the algorithm, typically three or four, are necessary to visualize the most important periodic features of the surface. The processed patterns help in the study of surface parameters, in the design of new scattering models and in the classification of wood species.
Fractal-based image enhancement techniques inspired by differential interference contrast microscopy: differential interference contrast microscopy is a very powerful optical technique for microscopic imaging. Inspired by the physics of this type of microscope, we have developed a series of image processing algorithms aimed at magnification, noise reduction, contrast enhancement and tissue analysis of biological samples. These algorithms use fractal convolution schemes which provide fast and accurate results, with performance comparable to the best current image enhancement algorithms. These techniques can be used as post-processing tools for advanced microscopy or as a means to improve the performance of less expensive visualization instruments. Several examples are provided of the use of these algorithms to visualize microscopic images of raw pine wood samples acquired with a simple desktop scanner.
Wood species identification using stress-wave analysis in the audible range: stress-wave analysis is a powerful and flexible technique for studying the mechanical properties of many materials. We present a simple technique to obtain information about the species of wood samples using stress-wave sounds in the audible range, generated by collision with a small pendulum. Stress-wave analysis has been used for flaw detection and quality control for decades, but its use for material identification and classification is less cited in the literature. Accurate wood species identification is a time-consuming task even for highly trained human experts. For this reason, the development of cost-effective techniques for automatic wood classification is a desirable goal. Our proposed approach is fully non-invasive and non-destructive, significantly reducing the cost and complexity of the identification and classification process.
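As a minimal sketch of the iterative autocorrelation method summarized above, an FFT-based implementation (an assumption; the thesis does not specify one) could look like this, with the stated three or four passes:

```python
# Iterative autocorrelation of a speckle image: repeated autocorrelation
# suppresses random speckle while reinforcing periodic surface features.
# Normalization choices are illustrative, not taken from the thesis.
import numpy as np

def iterative_autocorrelation(img, iterations=3):
    x = img.astype(float)
    for _ in range(iterations):
        x = x - x.mean()
        # Wiener-Khinchin: autocorrelation = inverse FFT of the power spectrum
        power = np.abs(np.fft.fft2(x)) ** 2
        x = np.fft.fftshift(np.fft.ifft2(power).real)
        x = x / np.abs(x).max()          # renormalize between passes
    return x
```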
Abstract:
Sharpening is a powerful image transformation because sharp edges bring out image details. Sharpness is achieved by increasing local contrast and reducing edge widths. We present a method that enhances the sharpness of images and thereby their perceptual quality. Most existing enhancement techniques require user input to improve the perception of the scene in the manner most pleasing to the particular user. Our goal is to improve the perception of sharpness in digital images for human viewers automatically. We consider two parameters in order to exaggerate the differences between local intensities: local contrast and edge width. We start from the assumption that color, texture, and objects of focus such as faces affect the human perception of photographs. When human raters are presented with a collection of images of different sharpness and asked to rank them according to perceived sharpness, the results show a statistical consensus among the raters. We introduce a ramp enhancement technique that modifies the optimal overshoot in the ramp for different region contrasts, as well as the new ramp width. Optimal parameter values are sought for each region under the criteria mentioned above. In this way, we aim to enhance digital images automatically to create pleasing output for common users.
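The abstract does not spell out the ramp model; as a point of reference, classic unsharp masking sharpens in the same spirit, narrowing edge ramps and adding an overshoot whose size is controlled by a contrast-like parameter. A minimal sketch:

```python
# Unsharp masking: add back a scaled high-frequency residual so edges
# acquire controlled overshoot. `amount` plays the role of the overshoot
# parameter discussed above; sigma controls which edge widths are affected.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, sigma=2.0, amount=0.8):
    blurred = gaussian_filter(img.astype(float), sigma)
    detail = img - blurred            # high-frequency residual at edges
    return np.clip(img + amount * detail, 0, 255)
```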
Abstract:
U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure the safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on extensive railroad networks is a very complex problem with major cost implications. Currently, the decision-making process for track maintenance planning is largely manual and relies primarily on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems in track maintenance. In this dissertation, we propose a range of mathematical models and solution algorithms for three network-level scheduling problems in track maintenance: the track inspection scheduling problem (TISP), the production team scheduling problem (PTSP) and the job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams that travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem in which thousands of tasks are to be scheduled subject to many difficult side constraints, such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision-making process. Real-world case studies show that the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights. PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition-and-restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, an augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show that the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach brings significant cost savings and operational efficiency improvements. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle-routing-based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints, such as mutual exclusion constraints and rounding constraints, were considered.
The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
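As an illustration of the TISP solution scheme described above (a constructive heuristic plus local search inside an incremental scheduling horizon), here is a hedged skeleton; all names and task attributes are hypothetical, since the dissertation's data structures are not given:

```python
# Skeleton of an incremental-scheduling-horizon heuristic: extend the
# horizon step by step, greedily insert newly eligible tasks, then repair
# and refine with local search. `construct` and `improve` are callables
# standing in for the dissertation's (unspecified) heuristics.
def incremental_horizon_schedule(tasks, horizon_end, step, construct, improve):
    schedule, t = [], 0
    while t < horizon_end:
        t = min(t + step, horizon_end)
        due_now = [task for task in tasks
                   if task.window_start < t and task not in schedule]
        schedule = construct(schedule, due_now)   # constructive insertion
        schedule = improve(schedule)              # local search refinement
    return schedule
```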
Abstract:
This study utilised recent developments in forensic aromatic hydrocarbon fingerprint analysis to characterise and identify specific biogenic, pyrogenic and petrogenic contamination. The fingerprinting and data interpretation techniques discussed include the recognition of:
• the distribution patterns of hydrocarbons (alkylated naphthalene, phenanthrene, dibenzothiophene, fluorene, chrysene and phenol isomers),
• "source-specific marker" compounds (individual saturated hydrocarbons, including n-alkanes from n-C5 through n-C40),
• selected benzene, toluene, ethylbenzene and xylene isomers (BTEX),
• the recalcitrant isoprenoids pristane and phytane, and
• diagnostic ratios of specific petroleum/non-petroleum constituents, together with the application of various statistical and numerical analysis tools.
An unknown sample from the Irish Environmental Protection Agency (EPA), submitted for origin characterisation, was analysed by gas chromatography utilising both flame ionisation and mass spectral detection, in comparison with known reference materials. The percentages of the individual polycyclic aromatic hydrocarbons (PAHs) and biomarker concentrations in the unknown sample were normalised to the sum of the analytes, and the results were compared with the corresponding results for a range of reference materials. In addition to the determination of conventional diagnostic PAH and biomarker ratios, a number of source-specific isomeric PAHs within the same alkylation levels were determined, and their relative abundance ratios were computed in order to definitively identify and differentiate the various sources. Statistical logarithmic star plots were generated from both sets of data to give a pictorial representation of the comparison between the unknown sample and the reference products. The study successfully characterised the unknown sample as being contaminated with a coal tar and clearly demonstrates the future role of compound ratio analysis (CORAT) in the identification of possible source contaminants.
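A hedged sketch of the ratio computations described above: analyte areas are normalised to their sum, and diagnostic ratios of isomeric PAHs within the same alkylation level are formed. The compound pairs and input numbers are illustrative placeholders only:

```python
# Normalize analyte areas to their sum (mirroring the described workflow),
# then form diagnostic ratios; note that ratios are unchanged by the
# normalization, which matters only for the star-plot comparisons.
def diagnostic_ratios(areas):
    total = sum(areas.values())
    norm = {k: v / total for k, v in areas.items()}
    return {
        "pristane/phytane": norm["pristane"] / norm["phytane"],
        "C2-dibenzothiophene/C2-phenanthrene":
            norm["C2-dibenzothiophene"] / norm["C2-phenanthrene"],
    }

# Arbitrary demo values, not data from the study:
ratios = diagnostic_ratios({"pristane": 1.2, "phytane": 0.9,
                            "C2-dibenzothiophene": 3.1,
                            "C2-phenanthrene": 5.6})
```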
Abstract:
The research project underlying this thesis focuses on the development of an advanced analytical system based on the combination of an improved thin layer chromatography (TLC) plate coupled with infrared (FTIR) and Raman microscopies for the detection of synthetic dyes. Indeed, the characterization of organic colorants, which are commonly present in mixtures with other components and in very limited amounts, still represents a challenging task in scientific analyses of cultural heritage materials. The approach provides selective spectral fingerprints for each compound, exploiting the complementary information obtained by micro ATR-RAIRS-FTIR and SERS-Raman analyses, which can be performed on the same separated spot. In particular, silver iodide (AgI) applied on a gold-coated slide is proposed as an efficient stationary phase for the discrimination of complex analyte mixtures, such as dyes present in samples of art-historical interest. The gold-AgI-TLC plate performs well both in the chromatographic separation of analytes and in the spectroscopic detection of components. The use of a mid-IR-transparent inorganic salt as the stationary phase avoids interference from background absorption in FTIR investigations. Moreover, in ATR microscopy measurements performed on the gold-AgI surface, a considerable enhancement in the intensity of the spectra is observed. Complementary information can be obtained by Raman analyses, given the SERS activity of the AgI substrate. The method has been tested for the characterization of a mixture of three synthetic organic colorants widely used in dyeing processes: Brilliant Green (BG1), Rhodamine B (BV10) and Methylene Blue (BB9).
Abstract:
The entrapment of hematoporphyrin IX (Hp IX) in silica by means of a microemulsion resulted in silica spheres of 33 ± 6 nm. Their small size, narrow size distribution and lack of aggregation keep the Hp IX silica nanospheres stable in aqueous solution for long periods and permit a detailed study of the entrapped drug by different techniques. Hp IX entrapped in the silica matrix is accessible to oxygen and, upon irradiation, generates singlet oxygen, which diffuses very efficiently into the outside solution. The entrapped Hp IX is also reached by iron(II) ions, which quench the porphyrin fluorescence emission. The silica matrix also provides extra protection to the photosensitizer against interaction with BSA and ascorbic acid, which are known to suppress singlet oxygen generation by free Hp IX in solution. Therefore, the incorporation of Hp IX molecules into silica nanospheres increases the potential of the photosensitizer for photodynamic therapy.
Abstract:
In this work, we present the simulation, fabrication and characterization of a tunable Bragg filter employing amorphous dielectric films deposited by the plasma-enhanced chemical vapor deposition technique on a crystalline silicon substrate. The optical device was built using conventional microelectronic processes and consisted of fifteen periodic intervals of Si3N4 layers separated by air, with appropriate thicknesses and lengths to produce transmittance attenuation peaks in the visible region. To this end, simulations were first carried out based on the optical parameters of the dielectric film, which were extracted by ellipsometry and profilometry. For the characterization of the optical interferential filter, monochromatic 633 nm light was injected into the filter, and the transmitted output light was collected and conducted to a detector through an optical waveguide also made of amorphous dielectric layers. Afterwards, the optical filter was mounted on a Peltier thermoelectric device in order to control the temperature of the optical device. When the temperature of the filter changes, a refractive index variation arises in the dielectric film due to the thermo-optic effect, producing a shift of the attenuation peak that is well predicted by numerical simulations. This characteristic allows the device to be used as a thermo-optic sensor.
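The abstract does not name the simulation method; the transfer matrix method below is the standard way to compute the transmittance of a Si3N4/air Bragg stack at normal incidence, and a thermo-optic peak shift can be reproduced by perturbing the layer index. The refractive indices and quarter-wave design are placeholder assumptions:

```python
# Transfer matrix method (TMM) sketch for a periodic Si3N4/air Bragg stack
# between air (n_in) and a crystalline silicon substrate (n_sub ~ 3.88 at
# 633 nm, a placeholder value).
import numpy as np

def layer_matrix(n, d, wavelength):
    # Characteristic matrix of one homogeneous layer at normal incidence
    delta = 2 * np.pi * n * d / wavelength
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(wavelength, n_layers, d_layers, n_in=1.0, n_sub=3.88):
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        M = M @ layer_matrix(n, d, wavelength)
    B, C = M @ np.array([1.0, n_sub])
    return 4 * n_in * n_sub / abs(n_in * B + C) ** 2

# Fifteen Si3N4/air periods, quarter-wave at 633 nm (n = 2.0 is a placeholder)
n_stack = [2.0, 1.0] * 15
d_stack = [633.0 / (4 * 2.0), 633.0 / (4 * 1.0)] * 15
T_633 = transmittance(633.0, n_stack, d_stack)
```

Sweeping the wavelength reproduces the stopband (attenuation peak); adding a small thermo-optic increment dn/dT * ΔT to the Si3N4 index shifts that peak, which is the tuning mechanism described above.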
Abstract:
In this work, we take advantage of association rule mining to support two types of medical systems: Content-based Image Retrieval (CBIR) systems and Computer-Aided Diagnosis (CAD) systems. For content-based retrieval, association rules are employed to reduce the dimensionality of the feature vectors that represent the images and to improve the precision of similarity queries. We refer to the association-rule-based method proposed here to improve CBIR systems as Feature selection through Association Rules (FAR). To improve CAD systems, we propose the Image Diagnosis Enhancement through Association rules (IDEA) method, in which association rules are employed to suggest a second opinion to the radiologist or a preliminary diagnosis of a new image. An automatically obtained second opinion can either accelerate the diagnostic process or strengthen a hypothesis, increasing the probability that a prescribed treatment will be successful. Two new algorithms are proposed to support the IDEA method: one to pre-process low-level features and one to propose a preliminary diagnosis based on association rules. We performed several experiments to validate the proposed methods. The results indicate that association rules can be successfully applied to improve CBIR and CAD systems, extending the arsenal of techniques available to support medical image analysis.
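As an illustration of the general mechanism behind FAR/IDEA (whose exact feature discretization and rule selection are not detailed in the abstract), class-association rules of the form {feature items} → diagnosis can be mined with support/confidence thresholds:

```python
# Mine rules {feature items} -> class label with minimum support and
# confidence. Transactions pair a set of discretized feature items with a
# diagnosis label; antecedent sizes 1-2 keep the sketch simple.
from itertools import combinations

def class_association_rules(transactions, min_support=0.1, min_confidence=0.7):
    n, counts = len(transactions), {}
    for items, label in transactions:
        for size in (1, 2):
            for lhs in combinations(sorted(items), size):
                counts[(lhs, None)] = counts.get((lhs, None), 0) + 1
                counts[(lhs, label)] = counts.get((lhs, label), 0) + 1
    rules = []
    for (lhs, label), c in counts.items():
        if label is None:
            continue
        support, confidence = c / n, c / counts[(lhs, None)]
        if support >= min_support and confidence >= min_confidence:
            rules.append((set(lhs), label, support, confidence))
    return rules
```

Rules whose antecedents never reach the thresholds point to uninformative features, which is the intuition behind using association rules for feature selection as well as for suggesting a preliminary diagnosis.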
Abstract:
Phenol is a toxic compound present in a wide variety of foundry resins. Its quantification is important for the characterization of the resins as well as for the evaluation of free contaminants present in foundry wastes. Two chromatographic methods, liquid chromatography with ultraviolet detection (LC-UV) and gas chromatography with flame ionization detection (GC-FID), were developed for the analysis of free phenol in several foundry resins after a simple extraction procedure (30 min). Both chromatographic methods were suitable for the determination of phenol in the studied furanic and phenolic resins, showing good selectivity, accuracy (recovery 99–100%; relative deviations <5%) and precision (coefficients of variation <6%). The ASTM reference method used was found to be useful only for the analysis of phenolic resins, whereas the LC and GC methods were applicable to all the studied resins. The developed methods reduce the time of analysis from 3.5 hours to about 30 min and can readily be used in routine quality control laboratories.
Abstract:
Recent developments in Hidden Markov Model (HMM) based speech synthesis have shown that it is a promising technology, fully capable of competing with other established techniques. However, some issues still lack a solution. Several authors report an over-smoothing phenomenon in both time and frequency which decreases naturalness and sometimes intelligibility. In this work we present a new vowel intelligibility enhancement algorithm that uses a discrete Kalman filter (DKF) to track frame-based parameters. The inter-frame correlations are modelled by an autoregressive structure which provides an underlying time-frame dependency and can improve time-frequency resolution. The system's performance has been evaluated using objective and subjective tests, and the proposed methodology has led to improved results.
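A minimal sketch of the core idea, assuming an AR(1) inter-frame model for brevity (the paper's AR order and noise settings are not given in the abstract):

```python
# Discrete Kalman filter tracking one frame-based parameter whose
# inter-frame correlation follows an AR(1) model x_t = a*x_{t-1} + w_t.
# a, q (process noise) and r (measurement noise) are illustrative values.
import numpy as np

def dkf_track(observations, a=0.95, q=0.01, r=0.1):
    x, p = observations[0], 1.0               # initial estimate and variance
    smoothed = [x]
    for z in observations[1:]:
        x_pred, p_pred = a * x, a * a * p + q  # AR(1) prediction step
        k = p_pred / (p_pred + r)              # Kalman gain
        x = x_pred + k * (z - x_pred)          # measurement update
        p = (1 - k) * p_pred
        smoothed.append(x)
    return np.array(smoothed)
```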
Abstract:
PhD thesis in Civil Engineering
Abstract:
A generic optical biosensing strategy was developed that relies on the absorbance enhancement phenomenon occurring in a multiple-scattering matrix. Experimentally, inserts made of glass fiber membrane were placed into microplate wells in order to significantly lengthen the trajectory of the incident light through the sample and therefore increase the corresponding absorbance. The enhancement factor was calculated by comparing the absorbance values measured for a given amount of dye with and without the absorbance-enhancing inserts in the wells. Moreover, the dilution of dye in solutions with different refractive indices (RI) clearly revealed that the enhancement factor increased with the ΔRI between the membrane and the surrounding medium, reaching a maximum value (EF > 25) when the membranes were dried. On this basis, two H2O2-biosensing systems were developed, based on the biofunctionalization of the glass fiber inserts with either cytochrome c or horseradish peroxidase (HRP), and their analytical performance was systematically compared with the corresponding bioassays in solution. The efficiency of the absorbance-enhancement approach was particularly clear in the case of the cytochrome c-based biosensor, with a 40-fold sensitivity gain and a wider dynamic range. The developed strategy therefore represents a promising way to convert standard colorimetric bioassays into optical biosensors with improved sensitivity.
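In Beer-Lambert terms, the reported enhancement factor amounts to a path-length gain; the effective-path notation below is ours, not the paper's:

```latex
A = \varepsilon\,\ell_{\mathrm{eff}}\,c,
\qquad
EF = \frac{A_{\text{with insert}}}{A_{\text{without insert}}}
   = \frac{\ell_{\mathrm{eff}}}{\ell_{\text{geom}}} > 25
\quad \text{(dried membrane)}
```

Multiple scattering in the glass fiber membrane lengthens the effective optical path well beyond the geometric depth of the well, which is why the same amount of chromophore yields a proportionally larger absorbance and a more sensitive colorimetric readout.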