7 results for document and text processing

in Digital Commons - Michigan Tech


Relevance:

100.00%

Publisher:

Abstract:

A series of aluminum alloys containing additions of scandium, zirconium, and ytterbium were cast to evaluate the effect of partial ytterbium substitution for scandium on tensile behavior. Because of the high price of scandium, a crucible-melt interaction study was performed to verify that no scandium was lost to graphite, alumina, magnesia, or zirconia crucibles after holding a liquid Al-Sc master alloy for 8 hours at 900 °C in an argon atmosphere. The alloys were subjected to an isochronal aging treatment and tested for conductivity and Vickers microhardness after each increment. For scandium-containing alloys, peak hardnesses of 520-790 MPa and peak tensile stresses of 138-234 MPa were observed after aging from 150-350 °C for 3 hours in increments of 50 °C; for alloys without scandium, peak hardnesses of 217-335 MPa and peak tensile stresses of 45-63 MPa were observed after a 3-hour, 150 °C aging treatment. The hardness and tensile strength of the ytterbium-containing alloy were found to be lower than those of the alloy with no ytterbium substitution.
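
For readers used to seeing Vickers microhardness quoted as an HV number rather than in MPa, the sketch below shows the standard unit conversion (a Vickers number in kgf/mm² multiplied by standard gravity gives MPa). The example values are only illustrative; the thesis reports hardness directly in MPa.

```python
# Standard conversion between a Vickers hardness number (kgf/mm^2) and MPa.
# Illustrative only; not code from the thesis.
G = 9.80665  # standard gravity, converts kgf/mm^2 to MPa

def hv_to_mpa(hv):
    """Vickers hardness number -> hardness in MPa."""
    return hv * G

def mpa_to_hv(mpa):
    """Hardness in MPa -> Vickers hardness number."""
    return mpa / G

print(hv_to_mpa(60))    # ~588 MPa, within the 520-790 MPa range reported above
print(mpa_to_hv(790))   # ~80.6 HV
```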

Relevance:

100.00%

Publisher:

Abstract:

The research reported in this dissertation investigates the processes required to mechanically alloy Pb1-xSnxTe and AgSbTe2 and a method of combining these two end compounds to produce (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) thermoelectric materials for power generation applications. In general, traditional melt processing of these alloys has employed high-purity materials subjected to time- and energy-intensive processes that result in highly functional material that is not easily reproducible. This research reports the development of mechanical alloying processes using commercially available 99.9% pure elemental powders in order to provide a basis for the economical production of highly functional thermoelectric materials. Though there have been reports of high- and low-ZT materials fabricated by both melt alloying and mechanical alloying, the processing-structure-properties-performance relationship connecting how the material is made to its resulting functionality is poorly understood. This is particularly true for mechanically alloyed material, motivating an effort to investigate bulk material within the (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) system using the mechanical alloying method. This research adds to the body of knowledge concerning the way in which mechanical alloying can be used to efficiently produce high-ZT thermoelectric materials. The processes required to mechanically alloy elemental powders to form Pb1-xSnxTe and AgSbTe2 and to subsequently consolidate the alloyed powder are described. The composition, the phases present in the alloy, and the volume percent, size, and spacing of those phases are reported. The room-temperature electronic transport properties of electrical conductivity, carrier concentration, and carrier mobility are reported for each alloy, and the effect of any secondary phase on the electronic transport properties is described. A mechanical mixing approach for combining the end compounds into (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) is described; when 5 vol.% AgSbTe2 was incorporated, it was found to form a solid solution with the Pb1-xSnxTe phase. An initial attempt to change the carrier concentration of the Pb1-xSnxTe phase was made by adding excess Te, and the carrier density of the alloys in this work was found not to be sensitive to excess Te. Using the processing techniques reported in this research, it has been demonstrated that this material system, when appropriately doped, has the potential to perform as a highly functional thermoelectric material.
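
As a reference for the transport quantities named above, here is a minimal sketch, not taken from the dissertation, of the standard relations used to evaluate thermoelectric performance: drift mobility from conductivity and carrier concentration (σ = n·e·μ) and the dimensionless figure of merit ZT = S²σT/κ. The numerical values are placeholders, not reported data.

```python
# Standard thermoelectric transport relations; illustrative values only.
E_CHARGE = 1.602176634e-19  # elementary charge, C

def carrier_mobility(sigma, n):
    """Drift mobility (cm^2/V.s) from conductivity sigma (S/cm) and
    carrier concentration n (cm^-3), via sigma = n * e * mu."""
    return sigma / (n * E_CHARGE)

def figure_of_merit(seebeck, sigma, kappa, T):
    """Dimensionless ZT = S^2 * sigma * T / kappa, with S in V/K,
    sigma in S/m, kappa in W/(m.K), and T in K."""
    return seebeck**2 * sigma * T / kappa

# Placeholder values loosely typical of PbTe-based alloys
print(carrier_mobility(sigma=500.0, n=1e19))    # ~312 cm^2/V.s
print(figure_of_merit(200e-6, 5e4, 1.5, 700))   # ~0.93
```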

Relevance:

100.00%

Publisher:

Abstract:

The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) has been used to quantify SO2 emissions from passively degassing volcanoes. This dissertation explores ASTER's capability to detect SO2 through satellite validation, enhancement techniques, and extensive processing of images at a variety of volcanoes. ASTER is compared to the Mini UV Spectrometer (MUSe), a ground-based instrument, to determine whether reasonable SO2 fluxes can be quantified from a plume emitted at Lascar, Chile. The two sensors were in good agreement, with ASTER proving to be a reliable detector of SO2. ASTER illustrated the advantages of imaging a plume in 2D, with better temporal resolution than the MUSe. SO2 plumes in ASTER imagery are not always discernible in the raw TIR data. Principal Component Analysis (PCA) and Decorrelation Stretch (DCS) enhancement techniques were compared to determine how well they highlight a variety of volcanic plumes. DCS produced a consistent output, and the composition of plumes from explosive eruptions was easy to identify. As the plumes became smaller and lower in altitude, they became harder to distinguish using DCS; PCA proved to be better at identifying smaller, low-altitude plumes. ASTER was used to investigate SO2 emissions at Lascar, Chile. Activity at Lascar has been characterized by cyclic behavior and persistent degassing (Matthews et al. 1997). Previous studies at Lascar have primarily focused on changes in thermal infrared anomalies, neglecting gas emissions. Using the SO2 data together with changes in thermal anomalies and visual observations, it is evident that Lascar is at the end of an eruptive cycle that began in 1993. Declining gas emissions and crater temperatures suggest that the conduit is sealing. ASTER and the Ozone Monitoring Instrument (OMI) were used to determine the annual contribution of SO2 to the troposphere from the Central and South American volcanic arcs between 2000 and 2011. Fluxes of 3.4 Tg/a for Central America and 3.7 Tg/a for South America were calculated. The detection limits of ASTER were also explored: plumes from many of the high-emitting volcanoes, such as Villarrica, Chile, were not detected by ASTER.
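
For readers unfamiliar with the PCA enhancement step mentioned above, the sketch below applies a principal-component transform to a generic multiband image array. It is not ASTER-specific code; the band count, image size, and data are illustrative assumptions.

```python
import numpy as np

def principal_components(image):
    """Project a (rows, cols, bands) image onto its principal components.
    Generic PCA enhancement sketch; illustrative, not ASTER processing code."""
    rows, cols, bands = image.shape
    X = image.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                      # center each band
    cov = np.cov(X, rowvar=False)            # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]        # strongest component first
    pcs = X @ eigvecs[:, order]
    return pcs.reshape(rows, cols, bands)

# Example on synthetic 3-band thermal-infrared data
tir = np.random.rand(100, 100, 3)
pc_image = principal_components(tir)   # lower-order PCs often highlight plume features
```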

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents an overview of hydrographic surveying and of modern and traditional surveying equipment, and describes data acquisition using a traditional single beam sonar system and a modern fully autonomous underwater vehicle, the IVER3. For this thesis, the data sets were collected using vehicles of the Great Lakes Research Center at Michigan Technological University. The thesis also describes how the bathymetric data were processed and edited in SonarWiz5. Three-dimensional models were then created after importing the data sets into a common coordinate system; details and excavations can be seen readily in these interpolated surface models. Profiles are plotted on the surface models to compare the sensors and the detail each resolves on the seabed. It is shown that single beam sonar can miss features such as pipelines and abrupt elevation changes on the seabed compared to the side scan sonar of the IVER3, because the side scan sonar acquires higher-resolution data. However, single beam sonar can save a project time and money, because it is cheaper than side scan sonar and its data can be easier to process.
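
As an illustration of the gridding step described above (turning scattered soundings into an interpolated surface), here is a minimal sketch using SciPy. The thesis performed this processing in SonarWiz5 rather than in code; the easting, northing, and depth values below are made up.

```python
import numpy as np
from scipy.interpolate import griddata

# Made-up scattered soundings in a common projected coordinate system
east  = np.random.uniform(0, 500, 2000)                      # m
north = np.random.uniform(0, 500, 2000)                      # m
depth = 10 + 0.01 * east + np.random.normal(0, 0.2, 2000)    # m

# Regular grid covering the survey area
xi, yi = np.meshgrid(np.linspace(0, 500, 256), np.linspace(0, 500, 256))

# Interpolate soundings onto the grid ('cubic' gives a smoother surface)
surface = griddata((east, north), depth, (xi, yi), method="linear")
```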

Relevance:

100.00%

Publisher:

Abstract:

Approximately 90% of fine aerosol in the Midwestern United States has a regional component, with a sizable fraction attributed to secondary organic aerosol (SOA) production. The Ozark Forest is an important source of biogenic SOA precursors such as isoprene (> 150 mg m-2 d-1), monoterpenes (10-40 mg m-2 d-1), and sesquiterpenes (10-40 mg m-2 d-1). Anthropogenic sources include secondary sulfate, nitrate, and biomass burning (51-60%), vehicle emissions (17-26%), and industrial emissions (16-18%). Vehicle emissions are an important source of volatile and vapor-phase, semivolatile aliphatic and aromatic hydrocarbons, key anthropogenic SOA precursors. The short lifetime of SOA precursors and the complex mixture of functionalized oxidation products make rapid sampling, quantitative processing methods, and comprehensive organic molecular analysis essential elements of any strategy to advance understanding of SOA formation pathways. Uncertainties in forecasting SOA production on regional scales are large and related to uncertainties in biogenic emission inventories and in measurement of SOA yields under ambient conditions. This work presents a bottom-up approach to developing a conifer emission inventory based on foliar and cortical oleoresin composition; development of a model to estimate terpene and terpenoid signatures of foliar and bole emissions from conifers; development of processing and analytical techniques for comprehensive organic molecular characterization of SOA precursors and oxidation products; implementation of the high-volume sampling technique to measure OA and vapor-phase organic matter; and results from a 5-day field experiment conducted to evaluate temporal and diurnal trends in SOA precursors and oxidation products. A total of 98, 115, and 87 terpene and terpenoid species were identified and quantified in commercially available essential oils of Pinus sylvestris, Picea mariana, and Thuja occidentalis, respectively, by comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometric detection (GC × GC-ToF-MS). Analysis of the literature showed that cortical oleoresin composition was similar to the foliar composition of the oldest branches. The proposed conceptual model for estimating terpene and terpenoid signatures of foliar and cortical oleoresin emissions showed that the emission potentials of the foliar and bole release pathways are dissimilar and should both be considered for conifer species that develop resin blisters or are infested with herbivores or pathogens. Average derivatization efficiencies for Methods 1 and 2 were 87.9 and 114%, respectively. Despite the lower average derivatization efficiency of Method 1, its distinct advantages included greater certainty of derivatization yield for the entire suite of multi- and poly-functional species and fewer processing steps for sequential derivatization. Detection limits for Method 1 using GC × GC-ToF-MS were 0.09-1.89 ng μL-1. A theoretical retention index diagram was developed for a hypothetical GC × 2GC analysis of the complex mixture of SOA precursors and derivatized oxidation products. In general, species eluted (relative to the alkyl diester reference compounds) from the primary column (DB-210) in bands according to carbon number n and from the secondary columns (BPX90, SolGel-WAX) according to functionality, essentially making the GC × 2GC retention diagram a carbon number-functionality grid. The species clustered into 35 groups by functionality, and species within each group exhibited good separation by n. Average recoveries of n-alkanes and polyaromatic hydrocarbons (PAHs) by Soxhlet extraction of XAD-2 resin with dichloromethane were 80.1 ± 16.1 and 76.1 ± 17.5%, respectively. Vehicle emissions were the common source for HSVOCs [i.e., resolved alkanes, the unresolved complex mixture (UCM), alkylbenzenes, and 2- and 3-ring PAHs]. An absence of monoterpenes at 0600-1000 and high concentrations of monoterpenoids during the same period were indicative of substantial losses of monoterpenes overnight and in the early morning hours. Post-collection comprehensive organic molecular characterization by GC × GC-ToF-MS of SOA precursors and products in ambient air collected with ~2 hr resolution is a promising method for determining biogenic and anthropogenic SOA yields that can be used to evaluate SOA formation models.
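
As background for the retention index diagram discussed above, the sketch below shows the standard linear (temperature-programmed) retention index formula, in which an analyte's retention time is placed between two reference compounds of known carbon number. The reference carbon numbers and retention times are illustrative assumptions, not values from the dissertation.

```python
def retention_index(rt, rt_ref_lower, rt_ref_upper, n_lower, n_upper):
    """Linear retention index of an analyte eluting at rt between two
    reference compounds with carbon numbers n_lower and n_upper.
    Generic formula sketch; values are illustrative only."""
    frac = (rt - rt_ref_lower) / (rt_ref_upper - rt_ref_lower)
    return 100 * (n_lower + (n_upper - n_lower) * frac)

# Example: analyte at 12.4 min between C12 (11.8 min) and C14 (13.6 min) references
print(retention_index(12.4, 11.8, 13.6, 12, 14))   # ~1266.7
```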

Relevance:

100.00%

Publisher:

Abstract:

The main objectives of this thesis are: to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition, and image detection using a principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition, and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. The MATLAB® software was used to simulate the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight their similarities and differences. Identifying patterns in data of high dimension (more than three dimensions) is difficult because graphical representation of the data is impossible; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The joint transform correlator (JTC) is an optical correlator used to synthesize a frequency-plane filter for coherent optical systems. In general, the IPCA algorithm behaves better than the PCA algorithm in most of the applications studied. It is better than the PCA algorithm in image compression because it achieves higher compression, more accurate reconstruction, and faster processing with acceptable error; it is also better than the PCA algorithm in real-time image detection because it achieves the smallest error rate as well as remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model performs better than the optical model.
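
A minimal sketch of PCA-based image compression and reconstruction, the kind of digital model compared in this thesis, is given below. It is implemented via the singular value decomposition in Python rather than the MATLAB code used in the thesis; the image size and number of retained components are illustrative assumptions.

```python
import numpy as np

def pca_compress(image, k):
    """Keep the top-k principal components of a 2-D grayscale image (PCA via SVD)."""
    col_mean = image.mean(axis=0)
    X = image - col_mean                                  # center each column
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k], col_mean

def pca_reconstruct(U, s, Vt, col_mean):
    """Rebuild an approximation of the image from the retained components."""
    return (U * s) @ Vt + col_mean

img = np.random.rand(128, 128)            # placeholder image
parts = pca_compress(img, k=20)           # store ~20 components instead of 128 columns
approx = pca_reconstruct(*parts)
print(np.linalg.norm(img - approx) / np.linalg.norm(img))   # relative reconstruction error
```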

Relevance:

100.00%

Publisher:

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aerodynamic and hydrodynamic systems that underlie many physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping the user observe and understand the flow field clearly. My research focuses mainly on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, selecting good streamlines to capture flow patterns and picking good viewpoints to observe flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using the dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another of my works [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates when the view changes gradually. When projecting 3D streamlines to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we design FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration, enabling observation and exploration of the relationships among field line clusters, spatiotemporal regions, and their interconnections in the transformed space. Most viewpoint selection methods consider only external viewpoints outside the flow field, which do not convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-spline curve path traversing these viewpoints to provide users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work is also extended to deal with unsteady flow fields. Besides flow field visualization, some other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we develop a set of visualization tools to give users an intuitive way to learn and understand these algorithms.
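
A minimal sketch of the internal-viewpoint camera path idea described above: fit a cubic B-spline through a handful of viewpoints and sample it to obtain a smooth fly-through. The viewpoint coordinates and sampling density are illustrative assumptions, not taken from the cited work.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Made-up internal viewpoints placed near flow features of interest
viewpoints = np.array([
    [0.2, 0.3, 0.5],
    [0.4, 0.6, 0.4],
    [0.6, 0.5, 0.7],
    [0.8, 0.2, 0.6],
])

# Fit a cubic B-spline through the viewpoints (s=0 forces interpolation)
tck, _ = splprep(viewpoints.T, s=0, k=3)

# Sample 100 camera positions along the path for a close-up fly-through
u = np.linspace(0.0, 1.0, 100)
path = np.stack(splev(u, tck), axis=1)   # shape (100, 3)
```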