953 results for Post processing


Relevance:

60.00%

Publisher:

Abstract:

There have been over 3000 bridge weigh-in-motion (B-WIM) installations in 25 countries worldwide, and this has led to vast improvements in the post-processing of B-WIM systems since their introduction in the 1970s. This paper introduces a new low-power B-WIM system using fibre optic sensors (FOS). The system consists of a series of FOS attached to the soffit of an existing integral bridge with a single span of 19 m. The site selection criteria and the full installation process are detailed in the paper. A calibration method using live traffic at the bridge site was adopted, and the accuracy of the system was determined on the basis of this calibration.
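Calibration against live traffic typically reduces to fitting a scale factor between the system's raw estimates and the statically weighed gross weights of pre-weighed calibration vehicles. A minimal sketch of that idea (the one-parameter model and the values are illustrative assumptions, not the paper's procedure):

```python
# Least-squares scale factor between raw B-WIM estimates and the statically
# weighed gross vehicle weights of calibration trucks. Values illustrative.
import numpy as np

raw_estimates = np.array([38.2, 41.5, 29.8, 44.1])   # system output, tonnes
static_weights = np.array([39.0, 42.3, 30.5, 45.0])  # weighbridge, tonnes

# One-parameter fit: w_static ~ c * w_raw (zero-intercept least squares)
c = np.dot(raw_estimates, static_weights) / np.dot(raw_estimates, raw_estimates)
residuals = static_weights - c * raw_estimates
print(f"calibration factor c = {c:.4f}, "
      f"RMS error = {np.sqrt(np.mean(residuals**2)):.2f} t")
```

Real B-WIM calibration schemes are more elaborate (per-axle, per-lane, temperature-dependent), but the fitted-factor idea is the core.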

Relevance:

60.00%

Publisher:

Abstract:

We consider time-dependent convection-diffusion-reaction equations in time-dependent domains, where the motion of the domain boundary is known. The temporal evolution of the domain is handled by the ALE formulation, which remedies the drawbacks of the classical Eulerian and Lagrangian descriptions. The position of the boundary and its velocity are extended into the interior of the domain in such a way that strong mesh deformations are prevented. As higher-order time discretizations, continuous Galerkin-Petrov (cGP) methods and discontinuous Galerkin (dG) methods are applied to problems in time-dependent domains. Furthermore, the C^1-continuous Galerkin-Petrov method and the C^0-continuous Galerkin method are presented; their solutions can be obtained, even in time-dependent domains, from the solution of the cGP or dG problem, respectively, by a simple unified post-processing step. For problems posed on fixed domains with convection and reaction terms that are constant in time, stability results and optimal error estimates for the post-processed solutions of the cGP and dG methods are given. For time-dependent convection-diffusion-reaction equations in time-dependent domains, we present conservative and non-conservative formulations, paying particular attention to the treatment of the time derivative and the mesh velocity. Stability and optimal error estimates for the conservative and non-conservative formulations semi-discretized in time are presented. Finally, the fully discretized problem is considered, in which a finite element method is used for the spatial discretization of the convection-diffusion-reaction equations in time-dependent domains within the ALE framework. Moreover, a local projection stabilization (LPS) is employed to account for dominant convection. It is also investigated numerically how the approximation of the domain velocity affects the accuracy of the time discretization methods.
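For orientation, the non-conservative ALE form of the model problem can be stated compactly (standard notation, assumed here rather than quoted from the thesis): with convection field b, mesh velocity w, diffusion coefficient ε, and reaction coefficient σ,

\[
\partial_t u \big|_{\mathcal{A}} + \big( (\mathbf{b} - \mathbf{w}) \cdot \nabla \big) u - \varepsilon \Delta u + \sigma u = f \quad \text{in } \Omega(t),
\]

where the time derivative is taken along the ALE trajectories. The mesh velocity enters only as a shift of the convective field, which is why the treatment of w is decisive for stability.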

Relevance:

60.00%

Publisher:

Abstract:

This thesis deals with the planning of low-altitude photogrammetric flights using remotely piloted aircraft systems (SAPR); in particular, it presents a survey of the main applications that allow programming transversal and longitudinal photogrammetric coverage of a given polygon with a commercial drone. The central topic is the management of a UAV photogrammetric flight through software applications that let the user enter the flight parameters according to the type of survey to be carried out. The final objective is to obtain correct photogrammetric image acquisition to be used for creating a digital model of the terrain or of an object through data post-processing. A proper flight configuration cannot disregard basic knowledge of photogrammetry and of the mechanics of a UAV. The introductory chapters therefore cover the principles of analogue and digital photogrammetry, dwelling on topics useful for understanding the issues of aerial photogrammetric survey design. Particular attention is given to the notions of digital photogrammetry which, together with the Image Matching algorithms derived from Computer Vision, define the branch of Modern Photogrammetry. The central chapters examine and compare a series of commercial applications for smartphones and tablets, available for Apple and Android systems, and draw a brief concluding comparison in terms of accessibility, capability and intended use. For clarity, the acronyms by which drones are referred to in different contexts are defined unambiguously: UAV (Unmanned Aerial Vehicle), SAPR (Sistemi Aeromobili a Pilotaggio Remoto), RPAS (Remotely Piloted Aircraft System), ARP (Aeromobili a Pilotaggio Remoto).
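The flight parameters these applications ask for follow from standard photogrammetric relations between flight height, focal length, pixel size and overlap. A minimal sketch of that arithmetic (function names and example values are illustrative, not taken from any specific app):

```python
# Minimal flight-planning arithmetic for nadir photogrammetry.
# All names and the example values are illustrative, not from the thesis.

def ground_sample_distance(pixel_size_m: float, flight_height_m: float,
                           focal_length_m: float) -> float:
    """GSD: ground footprint of one pixel, from the image-scale ratio H/f."""
    return pixel_size_m * flight_height_m / focal_length_m

def shot_spacing(footprint_m: float, overlap: float) -> float:
    """Distance between consecutive exposures (or adjacent strips) for a given overlap."""
    return footprint_m * (1.0 - overlap)

# Example: 2.4 um pixels, 8.8 mm lens, 60 m AGL, 4000 px across track
gsd = ground_sample_distance(2.4e-6, 60.0, 8.8e-3)   # ~1.6 cm per pixel
footprint = gsd * 4000                                # ~65 m swath width
print(f"GSD = {gsd*100:.1f} cm, strip spacing at 70% sidelap = "
      f"{shot_spacing(footprint, 0.70):.1f} m")
```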

Relevance:

60.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale, to scan the whole publicly available biomedical literature and to extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among the different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature, and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since the top-ranking event extraction systems are based on machine-learning approaches and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems produce incorrect biomolecular events that end users then encounter. This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e., predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
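A minimal sketch of the kind of supervised event-filtering step described above (the features, classifier choice, and threshold are assumptions for illustration, not the thesis's actual design):

```python
# Sketch: filter likely-incorrect events from a large event database with a
# supervised classifier. Features and threshold are illustrative assumptions.
from sklearn.linear_model import LogisticRegression
import numpy as np

# Each row: features of one extracted event, e.g. trigger-word confidence,
# argument-path length, sentence length, extractor score.
X_train = np.array([[0.9, 2, 18, 0.8],
                    [0.2, 7, 45, 0.3],
                    [0.7, 3, 22, 0.9],
                    [0.1, 9, 60, 0.2]])
y_train = np.array([1, 0, 1, 0])   # 1 = correct event, 0 = incorrect

clf = LogisticRegression().fit(X_train, y_train)

def keep_event(features, threshold=0.5):
    """Retain an event only if the classifier deems it likely correct."""
    return clf.predict_proba([features])[0, 1] >= threshold
```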

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this bachelor's thesis was to investigate the criticality of the root of a fillet weld. The topic arose from observations made during tests on structural hollow sections. The thesis reviews the design methods for fillet welds and the background research on how they are applied in practice to high-strength steels. The research methods used, and how method triangulation was achieved, are presented. The research question was whether the strength design of the welds is adequate. The investigations were carried out on statically loaded fillet welds. From the fillet-welded pieces, a laboratory test specimen and an FEM model were produced, and their results were compared. In the laboratory test, DIC measurement was used; its output could be post-processed and the desired data points extracted from it. In the computations, the largest stress concentrations arose at the weld, but in the tensile test the specimen failed at the fusion line and at the weld toe of the loading lug's attachment weld. At this point the material model was found to be insufficient, because no parameters had been defined in it for the heat-affected zone.
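For context, fillet welds are conventionally sized with the directional method of EN 1993-1-8 (cited here as the standard design check, not quoted from the thesis): the normal and shear stresses on the weld throat must satisfy

\[
\sqrt{\sigma_{\perp}^{2} + 3\left(\tau_{\perp}^{2} + \tau_{\parallel}^{2}\right)} \le \frac{f_u}{\beta_w \, \gamma_{M2}},
\qquad
\sigma_{\perp} \le \frac{0.9\, f_u}{\gamma_{M2}},
\]

where f_u is the ultimate strength of the weaker joined part, β_w the correlation factor and γ_M2 the partial safety factor. The root-criticality question above concerns exactly whether such throat-based checks remain adequate for high-strength steels.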

Relevance:

60.00%

Publisher:

Abstract:

Optical full-field measurement methods such as Digital Image Correlation (DIC) provide a new opportunity for measuring deformations and vibrations with high spatial and temporal resolution. However, application to full-scale wind turbines is not trivial: elaborate preparation of the experiment is vital, and sophisticated post-processing of the DIC results is essential. In the present study, a rotor blade of a 3.2 MW wind turbine is equipped with a random black-and-white dot pattern at four different radial positions. Two cameras are located in front of the wind turbine, and the response of the rotor blade is monitored using DIC for different turbine operating conditions. In addition, a Light Detection and Ranging (LiDAR) system is used to measure the wind conditions. Wind fields are created based on the LiDAR measurements and used to perform aeroelastic simulations of the wind turbine by means of advanced multibody codes. The results from the optical DIC system appear plausible when checked against common and expected results. In addition, the comparison of relative out-of-plane blade deflections shows good agreement between the DIC results and the aeroelastic simulations.

Relevance:

60.00%

Publisher:

Abstract:

One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not know the planets orbiting the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions, through two instrumentation projects.

The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
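For reference, the underlying observable is the classical Doppler relation (standard physics, not specific to this thesis):

\[
\frac{\Delta\lambda}{\lambda_0} = \frac{v_r}{c},
\]

so a reflex velocity of 10 m/s, typical of a giant planet, shifts a 500 nm line by only about 2 × 10⁻⁵ nm; this is why spectrograph resolution, wavelength range, and instrumental noise floors dominate the trade study described above.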

We present just such an observatory in the second chapter: the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system etendue, while keeping costs and time to deployment down. We present calculations of the expected planet yield, along with data showing system performance from our testing and development on Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, now operational at Mt. Hopkins Observatory in Arizona.

The second part of this thesis concerns a very different method of planet detection: direct imaging, the discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study planetary atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high-performance adaptive optics system to correct the atmospheric blurring of the parent star's point-spread function, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common-path "speckle" aberrations that can overwhelm any planetary companions.

To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and two pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromedae, which had been previously hypothesized but never seen.

A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.
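A conceptual sketch of one speckle-nulling iteration (the hardware interfaces and the four-phase probing scheme are generic assumptions; this is not the instrument's actual code):

```python
# Sketch of one speckle-nulling iteration: probe a speckle with DM sine
# waves at several phases, fit the response, then apply the nulling phase.
# Hardware interfaces (get_image, apply_dm_sine) are placeholders.
import numpy as np

PROBE_PHASES = np.array([0, np.pi / 2, np.pi, 3 * np.pi / 2])

def null_speckle(kx, ky, amplitude, get_image, apply_dm_sine):
    """Find the DM sine-wave phase that destructively interferes with the
    speckle at focal-plane spatial frequency (kx, ky)."""
    intensities = []
    for phi in PROBE_PHASES:
        apply_dm_sine(kx, ky, amplitude, phi)    # add probe ripple to the DM
        intensities.append(get_image()[ky, kx])  # measure the speckle pixel
    I = np.asarray(intensities)
    # Interference term varies as cos(phi - phi_speckle): recover its phase
    # from the two quadrature pairs.
    phi_speckle = np.arctan2(I[1] - I[3], I[0] - I[2])
    # An anti-phase ripple cancels the speckle's electric field.
    apply_dm_sine(kx, ky, amplitude, phi_speckle + np.pi)
```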

One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.

Relevance:

60.00%

Publisher:

Abstract:

A method is presented for determining the residual stress distribution in sheet material from data collected in a free bending test. It may be used where the residual stress distribution is symmetrical about the mid-surface, as is usually the case for frequently used sheet metal post-processing techniques such as skin-pass or temper rolling and tension and roller leveling. An existing inverse technique is used to obtain the residual stress profile and material constants that provide the best fit of a finite element analysis of bending to the experimentally derived moment-curvature relation. The method is verified for bending of a low-carbon stainless steel using residual stress measurements by X-ray diffraction; the residual stresses were induced in the sheet by cold rolling. The technique described here can be used industrially as a rapid method of investigating residual stresses in incoming sheet. In processes where the deformation is principally one of bending, such as cold roll forming, residual stresses are known to influence shape defects and springback, and the method presented here can be used to determine whether incoming sheet is suitable for further processing and also as a means of obtaining improved material data input for numerical simulation.
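The quantity matched in the inverse analysis is the bending moment implied by the through-thickness stress distribution; in generic notation (ours, for illustration), for a sheet of width b and thickness t,

\[
M(\kappa) = b \int_{-t/2}^{t/2} \sigma(z, \kappa)\, z\, \mathrm{d}z,
\]

where σ(z, κ) combines the residual stress profile with the stress induced by bending to curvature κ. The residual profile and material constants are iterated in the finite element model until M(κ) reproduces the measured moment-curvature curve.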

Relevance:

60.00%

Publisher:

Abstract:

Traditional information extraction methods rely mainly on visual-feature-assisted techniques; without considering the hierarchical dependencies within the paragraph structure, however, some important information is missed. This paper proposes an integrated approach for extracting academic information from conference Web pages. First, Web pages are segmented into text blocks by a new hybrid page segmentation algorithm that combines visual features and DOM structure. These text blocks are then labeled by a Tree-structured Random Fields model, and the block functions are differentiated using various features such as visual features, semantic features, and hierarchical dependencies. Finally, an additional post-processing step is introduced to tune the initial annotation results. Our experimental results on real-world data sets demonstrate that the proposed method effectively and accurately extracts the needed academic information from conference Web pages. © 2013 Springer-Verlag.
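The tuning step could resemble the following rule-based pass (purely illustrative labels and rules; the abstract does not specify the paper's actual post-processing):

```python
# Illustrative post-processing pass over labeled text blocks: enforce simple
# structural constraints on the initial annotations. Labels/rules are assumed.
def tune_labels(blocks):
    """blocks: list of (text, label) pairs in page order."""
    tuned = []
    seen_title = False
    for text, label in blocks:
        # A conference page has one title; demote any later "title" blocks.
        if label == "title" and seen_title:
            label = "section-heading"
        seen_title |= (label == "title")
        # Deadline-like text labeled as plain content is likely a key date.
        if label == "content" and any(m in text.lower()
                                      for m in ("deadline", "submission")):
            label = "important-date"
        tuned.append((text, label))
    return tuned
```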

Relevance:

60.00%

Publisher:

Abstract:

Ternary Mg-Y-Zn alloys have attracted considerable attention from researchers due to their excellent mechanical properties and unique microstructures, particularly the presence of long-period stacking ordered (LPSO) phases. Microstructural variations and the resulting mechanical properties can be affected by various processing routes, particularly those involving severe plastic deformation of a cast billet. The approach used in this work was based on subjecting cast Mg92Y4Zn4 (composition in wt%) billet to severe plastic deformation by three different routes, namely equal channel angular pressing (ECAP), high pressure torsion (HPT), and ECAP followed by HPT, with the aim of refining the microstructure and improving mechanical properties. Samples processed by ECAP were given a post-processing annealing treatment and tested in compression and tension. The effect of the processing route and the process parameters on the microstructure and hardness of the Mg-Y-Zn alloy is reported. An overall positive effect of the annealing treatment on the mechanical properties of the ECAP-processed alloy is demonstrated. © 2014 Elsevier B.V.

Relevance:

60.00%

Publisher:

Abstract:

This thesis focuses on advanced reconstruction methods and Dual Energy (DE) Computed Tomography (CT) applications for proton therapy, aiming to improve patient positioning and to investigate approaches for dealing with metal artifacts. To tackle the first goal, an algorithm for post-processing input DE images has been developed. The outputs are tumor-canceled and bone-canceled images, which help in recognizing structures in the patient's body. We show that the positioning error is substantially reduced when contrast-enhanced images are used, suggesting the potential of such an application. If positioning plays a key role in treatment delivery, even more important is the quality of the planning CT. Here, modern CT scanners offer the possibility of tackling challenging cases, such as the treatment of tumors close to metal implants. Possible approaches for dealing with the artifacts introduced by such implants were investigated experimentally at the Paul Scherrer Institut (Switzerland) by simulating several treatment plans on an anthropomorphic phantom. In particular, we examined cases in which no correction, manual correction, or the Iterative Metal Artifact Reduction (iMAR) algorithm was used to correct the artifacts, using both Filtered Back Projection and Sinogram Affirmed Iterative Reconstruction as image reconstruction techniques. Moreover, direct stopping power calculation from DE images with iMAR was also considered as an alternative approach. The delivered dose, measured with Gafchromic EBT3 films, was compared with the dose calculated in the Treatment Planning System. Residual positioning errors, daily machine-dependent uncertainties, and film quenching were taken into account in the analyses. Although plans with multiple fields appeared more robust than single-field plans, the results generally showed better agreement between prescribed and delivered dose when iMAR was used, especially in combination with the DE approach. We thus demonstrated the potential of these advanced algorithms to improve dosimetry for plans in the presence of metal implants.

Relevance:

60.00%

Publisher:

Abstract:

Recent developments have led researchers to reconsider Lagrangian measurement techniques as an alternative to their Eulerian counterparts when investigating non-stationary flows. This thesis advances the state of the art of Lagrangian measurement techniques by pursuing three objectives: (i) developing new Lagrangian measurement techniques for difficult-to-measure, in situ flow environments; (ii) developing new post-processing strategies designed for unstructured Lagrangian data, as well as providing guidelines for their use; and (iii) presenting the advantages that the Lagrangian framework has over its Eulerian counterpart in various non-stationary flow problems. Towards the first objective, a large-scale particle tracking velocimetry apparatus is designed for atmospheric surface layer measurements. Towards the second objective, two techniques are developed, one for identifying Lagrangian Coherent Structures (LCS) and the other for characterizing entrainment directly from unstructured Lagrangian data. Finally, towards the third objective, the advantages of Lagrangian-based measurements are showcased in two unsteady flow problems: the atmospheric surface layer, and entrainment in a non-stationary turbulent flow. By developing new experimental and post-processing strategies for Lagrangian data, and by showcasing the advantages of such data in various non-stationary flows, the thesis helps investigators more easily adopt Lagrangian-based measurement techniques.
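One widely used LCS diagnostic computable directly from tracked trajectories is the finite-time Lyapunov exponent (FTLE); its standard definition is recalled here for reference (not necessarily the thesis's exact formulation):

\[
\sigma_{t_0}^{t_0+T}(\mathbf{x}) = \frac{1}{|T|} \ln \sqrt{\lambda_{\max}\!\left(\left(\nabla \Phi_{t_0}^{t_0+T}(\mathbf{x})\right)^{\top} \nabla \Phi_{t_0}^{t_0+T}(\mathbf{x})\right)},
\]

where Φ is the flow map carrying particle positions from time t_0 to t_0 + T; ridges of the FTLE field are commonly taken as candidate LCS.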

Relevance:

60.00%

Publisher:

Abstract:

We propose a framework for eliciting and aggregating pairwise preference relations based on the assumption of an underlying fuzzy partial order. We also propose some linear programming optimization methods for ensuring consistency either as part of the aggregation phase or as a pre- or post-processing task. We contend that this framework of pairwise-preference relations, based on the Kemeny distance, can be less sensitive to extreme or biased opinions and is also less complex to elicit from experts. We provide some examples and outline their relevant properties and associated concepts.
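For reference, one common form of the distance underlying the framework, for two pairwise preference matrices P = (p_ij) and Q = (q_ij) (notation ours, not quoted from the paper):

\[
d_K(P, Q) = \sum_{i \neq j} \left| p_{ij} - q_{ij} \right|,
\]

so an aggregate chosen to minimize the total Kemeny distance to the experts' relations behaves like a median, which underlies the claimed robustness to extreme or biased opinions.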

Relevance:

60.00%

Publisher:

Abstract:

Bangla OCR (Optical Character Recognition) is long-awaited software for the Bengali community all over the world. Numerous efforts suggest that, due to the inherently complex nature of the Bangla alphabet and its word-formation process, developing a high-fidelity OCR that produces reasonably acceptable output remains a challenge. One possible avenue of improvement is post-processing the OCR's output; algorithms such as edit distance and n-gram statistical information have been used to rectify misspelled words in language processing. This work presents the first known approach using these algorithms to replace misrecognized words produced by a Bangla OCR. The assessment is made on a set of fifty documents written in Bangla script, using a dictionary of 541,167 words. The proposed correction model can correct several words, lowering the recognition error rate by 2.87% and 3.18% for the character-based n-gram and edit distance algorithms, respectively. The developed system suggests a list of five alternatives for each misspelled word. In 33.82% of cases the correct word is the topmost of the five suggestions for the n-gram algorithm, while with the edit distance algorithm the first suggestion is the correct match in 36.31% of cases. This work opens avenues of thought for possible improvements in character recognition.
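The edit-distance component referred to above is the classical Levenshtein dynamic program; a compact sketch (generic implementation, not the system's code):

```python
# Classical Levenshtein edit distance via dynamic programming: the minimum
# number of insertions, deletions, and substitutions turning a into b.
def edit_distance(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # delete ca
                            curr[j - 1] + 1,             # insert cb
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

# A correction candidate is a dictionary word minimizing the distance to the
# misrecognized token, e.g.:
# best = min(dictionary, key=lambda w: edit_distance(token, w))
```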

Relevance:

60.00%

Publisher:

Abstract:

This thesis arises from the need to optimize, from an acoustic and performance standpoint, a pre-existing centrifugal fan in the company. The first three chapters analyze the problem from a theoretical point of view, while the fourth and fifth chapters address it computationally, exploiting CFD techniques. The first chapter gives a general treatment of centrifugal fans, focusing on the kinds of problems they encounter. The second chapter presents the theory underlying experimental measurement and acoustic analysis; together with this, several articles showing acoustic optimization techniques for centrifugal fans are reported. The third chapter summarizes the theory underlying fluid dynamics and fluid dynamic studies. The fourth chapter explains how the fluid dynamic model was created. A steady-state analysis of the problem was chosen, exploiting the Moving Reference Frame approach and treating the air as incompressible, given the low Mach number. The acoustic analysis was carried out in post-processing using the Proudman model. Finally, the correlation between the three points of the fan's actual operating resistance curve was demonstrated, making it possible to extend the results obtained from the analysis of one of these points to the other two. In the fifth chapter, the results obtained from the CFD simulations are analyzed and several modifications of the geometry are proposed. The chosen modification yielded improved performance and lower noise. Finally, the conclusions propose further possible avenues for a more accurate investigation and optimization of the fan.
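For reference, the Proudman model used in the acoustic post-processing estimates the broadband acoustic power radiated per unit volume by isotropic turbulence from local turbulence quantities, in the form commonly implemented in CFD codes (notation ours):

\[
P_A = \alpha_{\varepsilon}\, \rho_0\, \varepsilon\, M_t^5, \qquad M_t = \frac{\sqrt{2k}}{a_0},
\]

where k is the turbulent kinetic energy, ε its dissipation rate, a_0 the far-field speed of sound, and α_ε a model constant (about 0.1). This is what makes a steady RANS solution sufficient for mapping the fan's noise sources.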