930 results for post-processing
Abstract:
Brewers' spent grain (BSG) is a widely available feedstock representing approximately 85% of the total by-products generated in the brewing industry. It is currently either disposed of to landfill or used as cattle feed due to its high protein content. BSG has received little or no attention as a potential energy resource, but increasing disposal costs and environmental constraints are now prompting its consideration. One possibility for utilising BSG for energy is via intermediate pyrolysis to produce gases, vapours and chars. Intermediate pyrolysis is characterised by indirect heating in the absence of oxygen for short solids residence times of a few minutes, at temperatures of 350-450 °C. In the present work BSG has been characterised by chemical, proximate, ultimate and thermo-gravimetric analysis. Intermediate pyrolysis of BSG at 450 °C was carried out using a twin coaxial screw reactor known as the Pyroformer, giving yields of 29% char, 51% bio-oil and 19% permanent gases. The bio-oil was found to separate into an aqueous phase and an organic phase. The organic phase contained viscous compounds that could age over time, leading to solid tars that can present problems in CHP applications. The quality of the pyrolysis vapour products before quenching can be upgraded to achieve much improved suitability as a fuel by downstream catalytic reforming. A bench-scale batch pyrolysis reactor was then used to pyrolyse small samples of BSG under a range of heating rates and temperatures simulating the Pyroformer. A small catalytic reformer was added downstream of the reactor, in which the pyrolysis vapours can be further cracked and reformed. A commercial nickel reforming catalyst was used at 500, 750 and 850 °C at a space velocity of about 10,000 L/h, with and without the addition of steam. Results are presented for the properties of BSG and for the products of the pyrolysis process both with and without catalytic post-processing. The results indicate that catalytic reforming produced a significant increase in permanent gases (mainly H2 and CO), with the H2 content exceeding 50 vol% at higher reforming temperatures. Bio-oil yield decreased significantly as reforming temperature increased, while char yield remained the same since the pyrolysis conditions were unchanged. The process shows an increase in the heating value of the product gas, ranging between 10.8 and 25.2 MJ/m³ as reforming temperature increased. © 2012 Elsevier B.V. All rights reserved.
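The reported change in product-gas heating value tracks the shift in gas composition; below is a minimal sketch of the underlying volumetric mixing arithmetic, using standard lower heating values for the individual gases. The example composition is purely illustrative and is not a measured value from the study.

```python
# Volumetric lower heating value (LHV) of a gas mixture as a volume-weighted sum.
# Component LHVs are standard literature values in MJ/Nm^3; compositions are illustrative only.

LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8, "CO2": 0.0, "N2": 0.0}  # MJ/Nm^3

def mixture_lhv(vol_fractions):
    """Return the LHV of a gas mixture (MJ/Nm^3) given volume fractions summing to 1."""
    assert abs(sum(vol_fractions.values()) - 1.0) < 1e-6
    return sum(vol_fractions[g] * LHV[g] for g in vol_fractions)

# Hypothetical reformed-gas composition rich in H2 and CO:
reformed = {"H2": 0.52, "CO": 0.28, "CH4": 0.05, "CO2": 0.15}
print(f"Mixture LHV ~ {mixture_lhv(reformed):.1f} MJ/Nm^3")
```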
Abstract:
The principal theme of this thesis is the advancement and expansion of ophthalmic research via collaboration between professional engineers and professional optometrists. The aim has been to develop new and novel approaches and solutions to contemporary problems in the field. The work is subdivided into three areas of investigation: 1) high-technology systems, 2) modification of current systems to increase functionality, and 3) development of smaller, more portable and cost-effective systems. High-technology systems: A novel high-speed Optical Coherence Tomography (OCT) system with integrated simultaneous high-speed photography was developed, achieving better operational speed than is currently available commercially. The mechanical design of the system featured a novel 8-axis alignment system. A full set of capture, analysis and post-processing software was developed, providing custom analysis systems for ophthalmic OCT imaging and expanding the current capabilities of the technology. A large clinical trial was undertaken to test the dynamics of contact lens edge interaction with the cornea in vivo. The interaction between lens edge design, lens base curvature, post-insertion times and edge positions was investigated. A novel method for correcting optical distortion when assessing lens indentation was also demonstrated. Modification of current systems: A commercial autorefractor, the WAM-5500, was modified with the addition of extra hardware and a custom software and firmware solution to produce a system capable of measuring dynamic accommodative response to various stimuli in real time. A novel software package to control the data capture process was developed, allowing real-time monitoring of data by the practitioner and adding considerable functionality to the instrument beyond the standard system. The device was used to assess the differences in accommodative response between subjects who had worn UV-blocking contact lenses for 5 years and a control group that had not worn UV-blocking lenses. While the standard static measurement of accommodation showed no differences between the two groups, the UV-blocking group did show faster accommodative rise and fall times, thus demonstrating the benefits of modifying this commercially available instrumentation. Portable and cost-effective systems: A new instrument was developed to expand the capability of the now defunct Keeler Tearscope. The device provides a similar capability in allowing observation of the mires reflected from the tear film surface, but with the added advantage of being able to record the observations. The device was tested comparatively against the Tearscope and other tear film break-up techniques, demonstrating its potential. In conclusion: This work has successfully demonstrated the advantages of interdisciplinary research between engineering and ophthalmology, providing new and novel instrumented solutions as well as adding to the sum of scientific understanding in the ophthalmic field.
Abstract:
In this second talk on dissipative structures in fiber applications, we overview theoretical aspects of the generation, evolution and characterization of self-similar parabolic-shaped pulses in fiber amplifier media. In particular, we present a perturbation analysis that describes the structural changes induced by third-order fiber dispersion on the parabolic pulse solution of the nonlinear Schrödinger equation with gain. Promising applications of parabolic pulses in optical signal post-processing and regeneration in communication systems are also discussed.
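For reference, a hedged statement of the model usually discussed in this context: the nonlinear Schrödinger equation with constant gain and its asymptotic self-similar parabolic solution. The notation below is the conventional one and is not necessarily identical to that used in the talk.

```latex
% NLSE with gain g for the slowly varying envelope A(z,T) in a fiber amplifier
% (beta_2: group-velocity dispersion, gamma: Kerr nonlinearity):
\frac{\partial A}{\partial z}
  = -\,\frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial T^2}
    + i\gamma\,|A|^2 A + \frac{g}{2}\,A .

% Asymptotic self-similar (parabolic) solution in the normal-dispersion regime,
% reached independently of the input pulse shape and carrying a linear chirp:
|A(z,T)| = A_0(z)\,\sqrt{1 - \frac{T^2}{T_0^2(z)}}\quad\text{for } |T|\le T_0(z),
\qquad A_0(z)\propto e^{g z/3},\quad T_0(z)\propto e^{g z/3}.
```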
Abstract:
The fabrication precision is one of the most critical challenges to the creation of practical photonic circuits composed of coupled high Q-factor microresonators. While very accurate transient tuning of microresonators based on local heating has been reported, the record precision of permanent resonance positioning achieved by post-processing is still within 1 and 5 GHz. Here we demonstrate two coupled bottle microresonators fabricated at the fiber surface with resonances that are matched with a better than 0.16 GHz precision. This corresponds to a better than 0.17 Å precision in the effective fiber radius variation. The achieved fabrication precision is only limited by the resolution of our optical spectrum analyzer and can be potentially improved by an order of magnitude.
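The quoted conversion from frequency precision to effective-radius precision follows from the scaling of whispering-gallery resonances with the local fiber radius. The numbers below are only a consistency check under assumed values (optical frequency near 193 THz, i.e. 1.55 µm, and an effective fiber radius of roughly 20 µm); neither value is taken from the abstract.

```latex
% Whispering-gallery resonance shift vs. effective radius (magnitudes):
\left|\frac{\Delta\nu}{\nu}\right| \simeq \frac{\Delta r_{\mathrm{eff}}}{r_{\mathrm{eff}}}
\quad\Longrightarrow\quad
\Delta r_{\mathrm{eff}} \simeq r_{\mathrm{eff}}\,\left|\frac{\Delta\nu}{\nu}\right|
\approx 20~\mu\mathrm{m}\times\frac{0.16~\mathrm{GHz}}{193~\mathrm{THz}}
\approx 1.7\times10^{-11}~\mathrm{m}\approx 0.17~\text{\AA}.
```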
Abstract:
This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves a risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe, and the overall safety and economic concerns of pipeline leakage in the subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging or chemical tracers, but in a remote location such as the subsea or the Arctic, a small chronic leak may remain undetected for a period of time. In the case of storage, an accidental release of hydrocarbon from a storage tank could lead to a pool fire, which could further escalate into domino effects; such a chain of accidents may lead to extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, with other contributing factors including wind speed and direction, fuel type and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice at steady-state conditions. A transient simulation was performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase; high-pressure fluid flow generates more noise than low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant effect on the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing, which can be applied to risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in the complex geometrical settings of process industries. Approaches to reduce and prevent these risks are discussed based on the results obtained from the numerical simulations.
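As an illustration of the kind of acoustic post-processing described (not the thesis's actual pipeline), here is a minimal sketch of estimating the power spectral density of a sampled pressure signal with Welch's method; the signal, sampling rate and window length are all assumed for the example.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical monitoring signal: 2 s of pressure samples at 10 kHz containing
# a tonal "leak" component buried in noise (purely synthetic).
fs = 10_000                       # sampling rate, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)
signal = 0.5 * np.sin(2 * np.pi * 1200 * t) + rng.normal(scale=1.0, size=t.size)

# Welch PSD estimate: average periodograms over 4096-sample segments.
f, pxx = welch(signal, fs=fs, nperseg=4096)

print(f"Peak PSD at ~{f[np.argmax(pxx)]:.0f} Hz")
```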
Abstract:
The 1 : 1,500,000 AWI Bathymetric Chart of the Gakkel Ridge (AWI BCGR) has been developed from multibeam data measured during the Arctic Mid-Ocean Ridge Expedition in 2001 (AMORE 2001, ARK-XVII/2). This expedition was conducted to investigate the Gakkel Ridge in the Arctic Ocean and was carried out by the icebreaking research vessels RV Polarstern and USCGC Healy. Polarstern is equipped with the multibeam sonar system Hydrosweep DS-2, whereas Healy carries a Seabeam 2112. During the expedition an area 8890 km in length and 18-46 km in width, situated between 82°N/8°W and 87°N/75°E, was surveyed simultaneously by both vessels. Water depths ranged from 566 to 5673 meters. Dense sea-ice cover degraded the sonar measurements and decreased data quality. Data errors were corrected in extensive post-processing, and the data from the two different sonar systems had to be consolidated in order to derive a high-resolution bathymetry of the Gakkel Ridge. The final result was a digital terrain model (DTM) with a grid spacing of 100 meters, which was used to generate the map series AWI Bathymetric Chart of the Gakkel Ridge, consisting of ten map sheets.
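A minimal, hedged sketch of the final gridding step only (building a regular 100 m DTM from cleaned soundings); the input arrays, file names and the projected coordinate system are assumed, and this is not the workflow actually used to produce the AWI BCGR.

```python
import numpy as np
from scipy.interpolate import griddata

# Assumed inputs: cleaned soundings as projected easting/northing (m) and depth (m).
x = np.load("easting.npy")    # hypothetical file names
y = np.load("northing.npy")
z = np.load("depth.npy")

# Regular 100 m grid covering the survey extent.
xi = np.arange(x.min(), x.max(), 100.0)
yi = np.arange(y.min(), y.max(), 100.0)
XI, YI = np.meshgrid(xi, yi)

# Linear interpolation onto the grid; cells far from any sounding remain NaN.
dtm = griddata((x, y), z, (XI, YI), method="linear")
```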
Abstract:
Much of the bridge stock on major transport links in North America and Europe was constructed in the 1950s and 1960s and has since deteriorated or is carrying loads far in excess of the original design loads. Structural Health Monitoring (SHM) systems can provide valuable information on bridge capacity, but the application of such systems is currently limited by access and bridge type. This paper investigates the use of computer vision systems for SHM. A series of field tests was carried out to test the accuracy of displacement measurements using contactless methods. A video image of each test was processed using a modified version of the optical flow tracking method to track displacement. These results were validated against an established measurement method using linear variable differential transformers (LVDTs). The displacements calculated by the algorithm compared well with the validation measurements, agreeing to within 2% of the LVDT measurements; a number of post-processing methods were then applied in an attempt to reduce this error further.
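The paper's modified optical-flow tracker is not reproduced here; below is only a generic sketch of sparse optical-flow displacement tracking between two video frames using OpenCV's pyramidal Lucas-Kanade implementation, with file names and parameters assumed for illustration.

```python
import cv2
import numpy as np

# Hypothetical frames extracted from the test video (grayscale).
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect strong corner features on the bridge target in the first frame.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Track the features into the next frame (pyramidal Lucas-Kanade).
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None,
                                         winSize=(21, 21), maxLevel=3)

# Mean pixel displacement of successfully tracked points; a scale factor
# (mm per pixel, from a target of known size) would convert this to millimetres.
good = status.ravel() == 1
displacement_px = np.mean(np.linalg.norm((p1 - p0)[good].reshape(-1, 2), axis=1))
print(f"Mean displacement: {displacement_px:.2f} px")
```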
Abstract:
There have been over 3000 bridge weigh-in-motion (B-WIM) installations in 25 countries worldwide, and this has led to vast improvements in the post-processing of B-WIM systems since their introduction in the 1970s. This paper introduces a new low-power B-WIM system using fibre optic sensors (FOS). The system consisted of a series of FOS attached to the soffit of an existing integral bridge with a single span of 19 m. The site selection criteria and the full installation process are detailed in the paper. A method of calibration using live traffic at the bridge site was adopted, and based on this calibration the accuracy of the system was determined.
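The paper's calibration procedure is not reproduced here; as a hedged illustration of the general idea (fitting a scale factor that maps a measured strain response to known vehicle weights), here is a minimal least-squares sketch with entirely assumed numbers.

```python
import numpy as np

# Hypothetical calibration runs: integrated strain response (sensor units) for
# pre-weighed vehicles of known gross vehicle weight (kN).
strain_integral = np.array([4.1, 8.3, 12.2, 16.5])   # assumed measurements
known_gvw = np.array([100.0, 200.0, 300.0, 400.0])   # kN, assumed static weights

# Single calibration factor C (kN per unit response), least-squares fit through the origin.
C = np.sum(known_gvw * strain_integral) / np.sum(strain_integral**2)

# Apply to a new, unweighed vehicle crossing the bridge.
estimated_gvw = C * 10.4
print(f"C = {C:.2f} kN/unit, estimated GVW = {estimated_gvw:.0f} kN")
```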
Abstract:
We consider time-dependent convection-diffusion-reaction equations in time-dependent domains, where the motion of the domain boundary is known. The temporal evolution of the domain is handled by the ALE formulation, which remedies the drawbacks of the classical Eulerian and Lagrangian descriptions. The position of the boundary and its velocity are extended into the interior of the domain in such a way that strong mesh deformations are prevented. As higher-order time discretizations, continuous Galerkin-Petrov (cGP) methods and discontinuous Galerkin (dG) methods are applied to problems in time-dependent domains. Furthermore, the C1-continuous Galerkin-Petrov method and the C0-continuous Galerkin method are presented. Their solutions can be obtained, also in time-dependent domains, from the solution of the cGP problem or the dG problem, respectively, by a simple, uniform post-processing. For problems on fixed domains with time-constant convection and reaction terms, stability results and optimal error estimates are given for the post-processed solutions of the cGP and dG methods. For time-dependent convection-diffusion-reaction equations in time-dependent domains, we present conservative and non-conservative formulations, paying particular attention to the treatment of the time derivative and the mesh velocity. Stability and optimal error estimates for the conservative and non-conservative formulations semi-discretized in time are presented. Finally, the fully discretized problem is considered, where a finite element method is used for the spatial discretization of the convection-diffusion-reaction equations in time-dependent domains within the ALE framework. In addition, a local projection stabilization (LPS) is employed to account for the dominance of convection. Furthermore, it is investigated numerically how the approximation of the domain velocity affects the accuracy of the time discretization methods.
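For orientation, a hedged sketch of the non-conservative ALE form of the model problem described above, in conventional notation (which may differ from the thesis's own): u is the unknown, w the domain (mesh) velocity, b the convection field, epsilon the diffusion coefficient and c the reaction coefficient.

```latex
% Non-conservative ALE formulation on the moving domain \Omega(t):
\left.\frac{\partial u}{\partial t}\right|_{\hat{x}}
  + \bigl(\boldsymbol{b} - \boldsymbol{w}\bigr)\cdot\nabla u
  - \varepsilon\,\Delta u + c\,u = f
  \quad\text{in } \Omega(t),\ t\in(0,T],
% where \partial_t u|_{\hat{x}} denotes the time derivative along the ALE map
% (i.e. at fixed reference coordinate \hat{x}), supplemented by boundary
% conditions on \partial\Omega(t) and an initial condition on \Omega(0).
```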
Abstract:
This thesis addresses the problem of planning low-altitude photogrammetric flights using RPAS; in particular, it presents a review of the main applications that allow transverse and longitudinal photogrammetric coverage of a given polygon to be planned with a commercial drone. The main topic developed is the management of a UAV photogrammetric flight using software applications that allow the user to enter the flight parameters according to the type of survey to be carried out. The final objective is to obtain a correct photogrammetric acquisition to be used for the creation of a digital model of the terrain or of an object through data processing in post-processing. A proper flight configuration cannot disregard basic knowledge of photogrammetry and of the mechanics of a UAV. The introductory chapters therefore cover the principles of analogue and digital photogrammetry, focusing on topics useful for understanding the issues involved in designing an aerial photogrammetric survey. Particular attention is paid to the notions of digital photogrammetry which, together with the Image Matching algorithms derived from Computer Vision, define the branch of Modern Photogrammetry. The central chapters examine and compare a series of commercial applications for smartphones and tablets, available for Apple and Android systems, and draw a brief concluding summary comparing them in terms of accessibility, capabilities and intended use. For greater clarity, the acronyms by which drones are referred to in different contexts are defined unambiguously: UAV (Unmanned Aerial Vehicle), SAPR (Sistemi Aeromobili a Pilotaggio Remoto), RPAS (Remotely Piloted Aircraft System), ARP (Aeromobili a Pilotaggio Remoto).
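As a hedged sketch of the arithmetic such planning apps perform (not any specific app's implementation), the ground sampling distance, image footprint and flight-line spacing can be derived from the camera geometry and the requested overlaps; all numbers below are illustrative assumptions.

```python
# Basic photogrammetric flight-planning quantities (all inputs are assumed examples).
focal_length_mm = 8.8          # camera focal length
pixel_size_um = 2.4            # physical pixel pitch
image_width_px = 5472
image_height_px = 3648
altitude_m = 60.0              # flight height above ground
forward_overlap = 0.80         # along-track overlap
side_overlap = 0.70            # across-track overlap

# Ground sampling distance (m/pixel) and image footprint on the ground (m);
# the camera is assumed to fly with its short image side along track.
gsd = (pixel_size_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3)
footprint_w = gsd * image_width_px
footprint_h = gsd * image_height_px

# Shot spacing along track and spacing between adjacent flight lines.
base_along_track = footprint_h * (1.0 - forward_overlap)
line_spacing = footprint_w * (1.0 - side_overlap)

print(f"GSD = {gsd*100:.2f} cm/px, footprint = {footprint_w:.1f} x {footprint_h:.1f} m")
print(f"shot spacing = {base_along_track:.1f} m, line spacing = {line_spacing:.1f} m")
```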
Abstract:
The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all the information that would be relevant to their research. As a response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature and to extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among the different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature and their representation as a set of recursive event structures. The 2009–2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine learning and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are then spotted by end-users. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes/gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e. predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
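The thesis's actual feature set and model combination are not described in the abstract; as a hedged, minimal illustration of the supervised part of such an event-filtering step, a classifier can score extracted events on a few assumed confidence features and discard those likely to be false positives.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-event features, e.g. extractor confidence, trigger-word frequency,
# sentence length (all assumed), with labels 1 = correct event, 0 = false positive.
X_train = np.array([[0.9, 0.8, 12], [0.2, 0.1, 45], [0.7, 0.6, 20], [0.1, 0.3, 60]])
y_train = np.array([1, 0, 1, 0])

clf = LogisticRegression().fit(X_train, y_train)

# Filter a batch of newly extracted events: keep only those predicted to be correct.
X_new = np.array([[0.85, 0.7, 15], [0.15, 0.2, 50]])
keep_mask = clf.predict(X_new) == 1
print(f"Kept {keep_mask.sum()} of {len(X_new)} events")
```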
Abstract:
The purpose of this Bachelor's thesis was to investigate the criticality of the fillet weld root. The topic arose from observations made during structural hollow section tests. The thesis reviews the design methods for fillet welds and the background research on how they are applied in practice to high-strength steels. The research methods used, and how methodological triangulation was achieved, are presented. The research question was the adequacy of the design rules for weld strength. The investigations were carried out on statically loaded fillet welds. A laboratory test specimen and an FEM model were produced from the fillet-welded pieces, and their results were compared. In the laboratory test, DIC measurement was used as the measurement method; its results could be post-processed and the desired data points extracted from them. In the calculations, the largest stress concentrations occurred at the weld, but in the tensile test the specimen failed at the fusion line and at the weld toe of the attachment weld of the loading lug. At this point the material model was found to be insufficient, because no parameters for the heat-affected zone had been defined in it.
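The fillet-weld design rules referred to are presumably of the kind given in EN 1993-1-8; as a hedged reference (not necessarily the exact formulation used in the thesis), the directional method checks the stress components on the weld throat against the following conditions.

```latex
% Directional method for fillet welds (EN 1993-1-8 style):
\sqrt{\sigma_{\perp}^{2} + 3\left(\tau_{\perp}^{2} + \tau_{\parallel}^{2}\right)}
  \;\le\; \frac{f_u}{\beta_w\,\gamma_{M2}},
\qquad
\sigma_{\perp} \;\le\; \frac{0.9\,f_u}{\gamma_{M2}},
% where sigma_perp and tau_perp are the normal and shear stresses perpendicular
% to the weld throat, tau_par is the shear stress along the weld axis, f_u the
% ultimate tensile strength of the weaker part joined, beta_w the correlation
% factor and gamma_M2 the partial safety factor.
```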
Abstract:
Optical full-field measurement methods such as Digital Image Correlation (DIC) provide a new opportunity for measuring deformations and vibrations with high spatial and temporal resolution. However, application to full-scale wind turbines is not trivial: elaborate preparation of the experiment is vital and sophisticated post-processing of the DIC results is essential. In the present study, a rotor blade of a 3.2 MW wind turbine is equipped with a random black-and-white dot pattern at four different radial positions. Two cameras are located in front of the wind turbine and the response of the rotor blade is monitored using DIC for different turbine operations. In addition, a Light Detection and Ranging (LiDAR) system is used to measure the wind conditions. Wind fields are created based on the LiDAR measurements and used to perform aeroelastic simulations of the wind turbine by means of advanced multibody codes. The results from the optical DIC system appear plausible when checked against common and expected results. In addition, the comparison of relative out-of-plane blade deflections shows good agreement between the DIC results and the aeroelastic simulations.
Abstract:
One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not know the planets orbiting the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions, through two instrumentation projects.
The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
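For context on the Doppler technique discussed in this chapter, the standard radial-velocity semi-amplitude relation (a textbook result, not a formula quoted from the thesis) is:

```latex
% Radial-velocity semi-amplitude induced by a planet of mass M_p on a star of mass M_*:
K = \left(\frac{2\pi G}{P}\right)^{1/3}
    \frac{M_p \sin i}{\left(M_* + M_p\right)^{2/3}}
    \frac{1}{\sqrt{1 - e^{2}}},
% where P is the orbital period, i the orbital inclination and e the eccentricity;
% lower-mass (K and M) host stars and shorter periods give larger, more easily
% detectable signals for a given planet mass.
```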
We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system etendue, as well as keeping costs and time to deployment down. We present calculations of the expected planet yield, and data showing the system performance from our testing and development of the system at Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational at Mt. Hopkins observatory in Arizona.
The second part of this thesis concerns a very different method of planet detection, direct imaging, which involves discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study their atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high performance adaptive optics system to unblur the point-spread function of the parent star through the atmosphere, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common path "speckle" aberrations that can overwhelm any planetary companions.
To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.
A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.
Abstract:
This thesis focuses on advanced reconstruction methods and Dual Energy (DE) Computed Tomography (CT) applications for proton therapy, aiming at improving patient positioning and investigating approaches to deal with metal artifacts. To tackle the first goal, an algorithm for post-processing input DE images has been developed. The outputs are tumor- and bone-canceled images, which help in recognising structures in the patient's body. We proved that positioning error is substantially reduced using contrast-enhanced images, thus suggesting the potential of such an application. If positioning plays a key role in the delivery, even more important is the quality of the planning CT. For that, modern CT scanners offer the possibility to tackle challenging cases, such as the treatment of tumors close to metal implants. Possible approaches for dealing with the artifacts introduced by such implants have been investigated experimentally at the Paul Scherrer Institut (Switzerland) by simulating several treatment plans on an anthropomorphic phantom. In particular, we examined the cases in which no correction, manual correction, or the Iterative Metal Artifact Reduction (iMAR) algorithm was used to correct the artifacts, using both Filtered Back Projection and Sinogram Affirmed Iterative Reconstruction as image reconstruction techniques. Moreover, direct stopping power calculation from DE images with iMAR has also been considered as an alternative approach. The delivered dose, measured with Gafchromic EBT3 films, was compared with that calculated in the Treatment Planning System. Residual positioning errors, daily machine-dependent uncertainties and film quenching have been taken into account in the analyses. Although plans with multiple fields seemed more robust than single-field plans, the results showed in general better agreement between prescribed and delivered dose when using iMAR, especially when combined with the DE approach. Thus, we proved the potential of these advanced algorithms in improving dosimetry for plans in the presence of metal implants.
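The "direct stopping power calculation from DE images" mentioned above typically rests on the Bethe formula; as a hedged reference (the thesis's exact formulation may differ), the proton stopping-power ratio relative to water is commonly written as:

```latex
% Stopping-power ratio (SPR) of a medium relative to water from the Bethe formula,
% with rho_e the electron density relative to water, I_m and I_w the mean
% excitation energies of the medium and of water, and beta the proton speed in
% units of c:
\mathrm{SPR} \;=\; \rho_e\,
\frac{\ln\!\left(\dfrac{2 m_e c^{2}\beta^{2}}{I_m\,(1-\beta^{2})}\right)-\beta^{2}}
     {\ln\!\left(\dfrac{2 m_e c^{2}\beta^{2}}{I_w\,(1-\beta^{2})}\right)-\beta^{2}},
% where DECT provides rho_e and an effective atomic number from which I_m is estimated.
```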