960 results for IIR filter
Abstract:
An increasing number of studies in recent years have sought to identify individual inventors from patent data. A variety of heuristics have been proposed for using the names and other information disclosed in patent documents to establish who is who in patents. This paper contributes to this literature by describing a methodology for identifying inventors using patents filed with the European Patent Office (EPO hereafter). As in much of this literature, we basically follow a three-step procedure: 1- the parsing stage, aimed at reducing the noise in the inventor's name and other fields of the patent; 2- the matching stage, where name-matching algorithms are used to group similar names; and 3- the filtering stage, where additional information and various scoring schemes are used to filter out these similarly-named inventors. The paper presents the results obtained by applying the algorithms to the set of European inventors filing with the EPO over a long period of time.
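As an illustration of the three-step procedure, here is a minimal Python sketch; the cleaning rules, the string-similarity threshold, and the metadata fields scored in the filtering stage are assumptions for demonstration, not the paper's exact choices.

```python
# Minimal sketch of the three-step pipeline (parse -> match -> filter).
import re
from difflib import SequenceMatcher

def parse(name):
    """Parsing stage: lower-case, strip titles and punctuation (assumed rules)."""
    name = re.sub(r"\b(dr|prof|ing)\b\.?", "", name.lower())
    return " ".join(re.sub(r"[^a-z ]", " ", name).split())

def match(names, threshold=0.9):
    """Matching stage: group names whose string similarity exceeds a threshold."""
    groups = []
    for n in names:
        for g in groups:
            if SequenceMatcher(None, n, g[0]).ratio() >= threshold:
                g.append(n)
                break
        else:
            groups.append([n])
    return groups

def same_inventor(rec_a, rec_b, min_score=2):
    """Filtering stage: score shared patent metadata between two similarly-named
    records; treat them as the same person above a threshold (fields assumed)."""
    score = sum(1 for field in ("applicant", "ipc", "city")
                if rec_a.get(field) and rec_a.get(field) == rec_b.get(field))
    return score >= min_score

raw = ["Dr. John A. Smith", "John Smith", "J. Smyth"]
print(match([parse(n) for n in raw]))
meta = {"john a smith": {"applicant": "ACME", "ipc": "H03H", "city": "Munich"},
        "john smith":   {"applicant": "ACME", "ipc": "H03H", "city": "Berlin"}}
print(same_inventor(meta["john a smith"], meta["john smith"]))  # True: 2 shared fields
```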
Abstract:
In this thesis the factors affecting lime mud filterability were studied. The literature part covers the recausticizing process, lime mud properties and filterability, and general filtration theory. In the experimental part, the properties of lime mud particles and of the lime mud filter cake were investigated, and the behavior of a lime mud precoat was studied. Experiments were also carried out with R-PCC (rhombohedral precipitated calcium carbonate). The filtration and precoat studies were performed with a laboratory-scale pressure filter, at pressure differences between 0.25 and 1.50 bar. Six lime mud samples with varying amounts of impurities and two R-PCC samples were used in the experiments. The average specific cake resistance of the different lime mud samples varied quite extensively. The most influential factor affecting lime mud filterability and dry solids content was found to be silica content: when the lime mud contained silica, the average specific cake resistance was high and the lime mud dryness was low. If the silica-containing samples were excluded, the smaller the mean particle size, the higher the average specific cake resistance of the lime mud. The R-PCC also had a high average specific cake resistance, owing to its small particle size. In addition, the pressure difference affected the average specific cake resistance of some lime mud samples; in those cases the lime mud was compressible. During the precoat experiments, the lime mud samples with the largest mean particle sizes compressed. However, the average specific cake resistances in the filtration and precoat parts were of approximately the same magnitude. A brief displacement by air did not affect the behavior of the precoat; vibration and a brief but relatively large change in pressure difference had a slight influence.
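For readers unfamiliar with how the average specific cake resistance is obtained, the sketch below fits constant-pressure filtration data to the classical cake-filtration equation t/V = (mu*alpha*c)/(2*A^2*dp) * V + mu*Rm/(A*dp); all numbers are illustrative, not the thesis data.

```python
# Extract the average specific cake resistance alpha from the slope of
# a t/V vs V plot (constant-pressure cake filtration). Illustrative data.
import numpy as np

mu = 1.0e-3        # filtrate viscosity, Pa.s
c = 150.0          # deposited solids per filtrate volume, kg/m^3
A = 0.01           # filter area, m^2 (laboratory scale)
dp = 1.0e5         # pressure difference, Pa (1.0 bar)

# measured cumulative filtrate volume V (m^3) at times t (s)
V = np.array([0.5, 1.0, 1.5, 2.0, 2.5]) * 1e-4
t = np.array([12.0, 40.0, 85.0, 147.0, 226.0])

slope, intercept = np.polyfit(V, t / V, 1)   # fit t/V = slope*V + intercept
alpha = slope * 2 * A**2 * dp / (mu * c)     # average specific cake resistance
print(f"alpha = {alpha:.3e} m/kg")
```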
Abstract:
Online paper web analysis relies on traversing scanners that criss-cross on top of a rapidly moving paper web. The sensors embedded in the scanners measure many important quality variables of the paper, such as basis weight, caliper and porosity. Most of these quantities vary considerably, and the measurements are noisy at many different scales. The zigzagging nature of scanning makes it difficult to separate machine-direction (MD) and cross-direction (CD) variability from one another. To improve the 2D resolution of the quality variables above, the paper quality control team at the Department of Mathematics and Physics at LUT has implemented efficient Kalman filtering based methods that currently use 2D Fourier series. Fourier series are global and therefore resolve local spatial detail on the paper web rather poorly. The target of the current thesis is to study alternative wavelet-based representations as candidates to replace the Fourier basis for a higher-resolution spatial reconstruction of these quality variables. The accuracy of wavelet-compressed 2D web fields will be compared with corresponding truncated Fourier series based fields.
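The comparison can be sketched as follows: the same fraction of coefficients is kept in a truncated 2D Fourier representation and in a thresholded wavelet decomposition (here via PyWavelets); the synthetic field, the db4 wavelet, and the 5% retention are assumptions for illustration.

```python
# Compare truncated 2D Fourier vs thresholded wavelet compression of a
# noisy web-like field containing a local feature.
import numpy as np
import pywt

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
field = np.sin(8 * np.pi * x)                                  # MD/CD variation
field += np.exp(-((x - 0.5)**2 + (y - 0.5)**2) / 0.002)        # local streak
field += 0.1 * rng.standard_normal(field.shape)                # measurement noise

def keep_largest(a, frac):
    """Zero all but the largest `frac` of coefficients (by magnitude)."""
    flat = np.abs(a).ravel()
    thr = np.sort(flat)[int((1 - frac) * flat.size)]
    return np.where(np.abs(a) >= thr, a, 0)

frac = 0.05  # keep 5% of coefficients in both bases

# Fourier: truncate the 2D spectrum
fourier_rec = np.real(np.fft.ifft2(keep_largest(np.fft.fft2(field), frac)))

# Wavelet: threshold the coefficients of a db4 decomposition
coeffs = pywt.wavedec2(field, "db4", level=4)
arr, slices = pywt.coeffs_to_array(coeffs)
wavelet_rec = pywt.waverec2(
    pywt.array_to_coeffs(keep_largest(arr, frac), slices, output_format="wavedec2"),
    "db4")

for name, rec in [("Fourier", fourier_rec), ("wavelet", wavelet_rec)]:
    print(name, "RMSE:", np.sqrt(np.mean((rec - field) ** 2)))
```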
Abstract:
Cartel detection is one of the most basic and most complicated tasks of competition authorities. In recent years, however, variance filters have provided a fairly simple tool for rejecting the existence of price-fixing, with the added advantage that the methodology requires only a low volume of data. In this paper we analyze two aspects of variance filters: 1- the relationship established between market structure and price rigidity, and 2- the use of different benchmarks for implementing the filters. This paper addresses these two issues by applying a variance filter to a gasoline retail market characterized by a set of unique features. Our results confirm the positive relationship between monopolies and price rigidity, and the variance filter's ability to detect non-competitive behavior when an appropriate benchmark is used. Our findings should serve to promote the implementation of this methodology among competition authorities, albeit in the awareness that a more exhaustive complementary analysis is required.
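A variance filter in its common form reduces to comparing the coefficient of variation of prices against a competitive benchmark; the sketch below illustrates that idea with made-up price data, not the paper's dataset.

```python
# Variance screen: flag a market whose price coefficient of variation (CV)
# falls below that of a competitive benchmark. Illustrative data only.
import numpy as np

def coef_variation(prices):
    prices = np.asarray(prices, dtype=float)
    return prices.std(ddof=1) / prices.mean()

benchmark = [1.40, 1.38, 1.45, 1.33, 1.47, 1.36, 1.42]   # competitive market
suspect   = [1.49, 1.50, 1.49, 1.50, 1.50, 1.49, 1.50]   # rigid prices

cv_bench, cv_susp = coef_variation(benchmark), coef_variation(suspect)
print(f"benchmark CV = {cv_bench:.4f}, suspect CV = {cv_susp:.4f}")
if cv_susp < cv_bench:
    print("flag: price rigidity consistent with non-competitive behaviour")
```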
Abstract:
PURPOSE: Pencil beam scanning and filter-free techniques may involve dose rates considerably higher than those used in conventional external-beam radiotherapy. Our purpose was to investigate normal tissue and tumour responses in vivo to short pulses of radiation. MATERIAL AND METHODS: C57BL/6J mice were exposed to bilateral thorax irradiation in a single dose, using pulsed (at least 40 Gy/s, "flash") or conventional dose-rate (0.03 Gy/s or less) irradiation. Immunohistochemical and histological methods were used to compare early radiation-induced apoptosis and the development of lung fibrosis in the two situations. The response of two human tumour xenografts (HBCx-12A, HEp-2) in nude mice and one syngeneic, orthotopic lung carcinoma in C57BL/6J mice (TC-1 Luc+) was monitored in both radiation modes. RESULTS: A 17 Gy conventional irradiation induced pulmonary fibrosis and activation of the TGF-beta cascade in 100% of the animals 24-36 weeks post-treatment, as expected, whereas no animal developed complications after flash irradiation below 23 Gy, and a 30 Gy flash irradiation was required to induce the same extent of fibrosis as 17 Gy conventional irradiation. Cutaneous lesions were also reduced in severity. Flash irradiation protected vascular and bronchial smooth muscle cells, as well as the epithelial cells of bronchi, against acute apoptosis, as shown by analysis of caspase-3 activation and TUNEL staining. In contrast, the antitumour effectiveness of flash irradiation was maintained and not different from that of conventional irradiation. CONCLUSION: Flash irradiation shifted the threshold dose required to initiate lung fibrosis by a large factor without loss of antitumour efficiency, suggesting that the method might be used to advantage to minimize the complications of radiotherapy.
Abstract:
All the experimental work of this final project was done at the Laboratoire de Biotechnologie Environnementale (LBE) of the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, over six months (November 2013 to May 2014). A fungal biofilter composed of woodchips was designed to remove micropollutants from the effluents of wastewater treatment plants. Two fungi, Pleurotus ostreatus and Trametes versicolor, were tested to evaluate their efficiency in removing two micropollutants: the anti-inflammatory drug naproxen and the antibiotic sulfamethoxazole. Although Trametes versicolor was able to degrade naproxen quickly, this fungus was no longer active after one week of operation in the filter. Pleurotus ostreatus, on the contrary, was able to survive for more than 3 months in the filter, showing good removal efficiencies for naproxen and sulfamethoxazole throughout this period, in tap water as well as in real treated municipal wastewater. Several other experiments provided insight into the removal mechanisms of these micropollutants in the fungal biofilter (degradation and adsorption) and also allowed the removal trend to be modelled. Fungal treatment with Pleurotus ostreatus grown on wood substrates appeared to be a promising solution to improve micropollutant removal in wastewater.
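The abstract does not give the model's form; as a loose illustration of a removal-trend model combining the two mechanisms it names, the sketch below couples first-order degradation with saturating adsorption. All constants are invented.

```python
# Speculative removal-trend model: first-order fungal degradation plus
# adsorption onto the woodchips with a finite capacity. A 1 L water volume
# is assumed so that ug/L and ug are interchangeable. Constants invented.
k_deg = 0.04   # first-order degradation rate, 1/h (assumption)
k_ads = 0.10   # adsorption rate constant, 1/h (assumption)
q_max = 50.0   # adsorption capacity of the bed, ug (assumption)

c = 10.0       # micropollutant concentration in the filter, ug/L
q = 0.0        # mass adsorbed on the woodchips, ug
dt = 0.1       # time step, h

for step in range(int(24 / dt)):           # simulate one day
    ads = k_ads * c * (1 - q / q_max)      # adsorption slows as bed saturates
    c += dt * (-k_deg * c - ads)
    q += dt * ads

print(f"after 24 h: c = {c:.2f} ug/L, adsorbed = {q:.2f} ug")
```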
Abstract:
BACKGROUND: Autologous blood transfusion (ABT) efficiently increases sport performance and is the most challenging doping method to detect. Current methods for detecting this practice center on the plasticizer di(2-ethylhexyl) phthalate (DEHP), which enters the stored blood from blood bags. Quantification of this plasticizer and its metabolites in urine can detect the transfusion of autologous blood stored in these bags. However, DEHP-free blood bags are available on the market, including n-butyryl-tri-(n-hexyl)-citrate (BTHC) blood bags. Athletes may shift to using such bags to avoid the detection of urinary DEHP metabolites. STUDY DESIGN AND METHODS: A randomized, double-blinded, two-phase clinical study was conducted in healthy male volunteers who underwent ABT using DEHP-containing or BTHC blood bags. All subjects received a saline injection for the control phase and a blood donation followed by ABT 36 days later. The excretion kinetics of five urinary DEHP metabolites were quantified by liquid chromatography coupled with tandem mass spectrometry. RESULTS: Surprisingly, considerable levels of urinary DEHP metabolites were observed up to 1 day after blood transfusion with BTHC blood bags. The long-term metabolites mono-(2-ethyl-5-carboxypentyl) phthalate and mono-(2-carboxymethylhexyl) phthalate were the most sensitive biomarkers for detecting ABT with BTHC blood bags. Levels of DEHP were high in the BTHC bags (6.6%), the tubing of the transfusion kit (25.2%), and the white blood cell filter (22.3%). CONCLUSIONS: The BTHC bag contained DEHP, despite being labeled DEHP-free. Urinary DEHP metabolite measurement is a cost-effective way to detect ABT in the antidoping field, even when BTHC bags are used for blood storage.
Abstract:
Objective: To derive the filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Materials and Methods: Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVLs) of the simulated filtered spectra were compared with those obtained experimentally with a solid-state detector (Unfors model 8202031-H Xi R/F & MAM Detector Platinum with 8201023-C Xi Base Unit Platinum Plus w mAs) on a Hologic Selenia Dimensions system in direct radiography mode. Results: The calculated HVL values showed good agreement with those obtained experimentally; the greatest relative difference between the Monte Carlo and experimental HVL values was 4%. Conclusion: The results show that the filtered tungsten-anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography.
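As background, the HVL of a filtered spectrum is the aluminium thickness that halves the air kerma; the sketch below shows the computation on a three-bin placeholder spectrum with placeholder attenuation coefficients, not real physics data.

```python
# Compute the HVL of a spectrum: find the Al thickness that halves the
# air kerma. Spectrum and coefficients are placeholders for illustration.
import numpy as np
from scipy.optimize import brentq

E = np.array([15.0, 20.0, 25.0])        # photon energy bins, keV
phi = np.array([1.0, 2.0, 0.8])         # relative fluence per bin
mu_al = np.array([2.0, 0.8, 0.45])      # Al linear attenuation, 1/mm (placeholder)
muen_air = np.array([1.5, 0.55, 0.27])  # air energy-absorption coeff. (placeholder)

def kerma(t_mm):
    """Air kerma (arbitrary units) behind t_mm of aluminium."""
    return np.sum(phi * E * muen_air * np.exp(-mu_al * t_mm))

hvl = brentq(lambda t: kerma(t) - 0.5 * kerma(0.0), 0.0, 5.0)
print(f"HVL = {hvl:.3f} mm Al")
```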
Abstract:
The Extended Kalman Filter (EKF) and the four-dimensional variational method (4D-Var) are both advanced data assimilation methods. The EKF is impractical in large-scale problems, and 4D-Var requires considerable effort to build the adjoint model. In this work we have formulated a data assimilation method that tackles these difficulties, called the Variational Ensemble Kalman Filter (VEnKF). The method has been tested with the Lorenz95 model; data were simulated from the solution of the Lorenz95 equations with normally distributed noise. Two experiments were conducted, the first with full observations and the second with partial observations. In each experiment we assimilate data with three-hour and six-hour time windows, and different ensemble sizes were tested. There is no strong difference between the results for the two time windows in either experiment. Experiment I gave similar results for all ensemble sizes tested, whereas in Experiment II larger ensembles produced better results: in Experiment I a small ensemble was enough, while in Experiment II the ensemble had to be larger. Computational speed is not as good as we would want; using the limited-memory BFGS method instead of the current BFGS method might improve this. The method has proven successful: even though it is unable to match the quality of the EKF analyses, it attains significant skill in the forecasts ensuing from the analyses it produces. It has two advantages over the EKF: VEnKF does not require an adjoint model, and it can be easily parallelized.
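The Lorenz95 test bed can be reproduced in a few lines; the sketch below integrates the model with a fourth-order Runge-Kutta scheme and generates noisy full observations. The forcing F = 8 and the mapping of one model time unit to days are conventional choices, assumed here rather than taken from the thesis.

```python
# Lorenz95 test bed: integrate the model with RK4 and generate noisy
# observations of the kind assimilated in the experiments.
import numpy as np

def lorenz95(x, F=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F (cyclic indices)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    k1 = lorenz95(x)
    k2 = lorenz95(x + 0.5 * dt * k1)
    k3 = lorenz95(x + 0.5 * dt * k2)
    k4 = lorenz95(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(1)
n, dt = 40, 0.025                         # 40 variables; dt ~ 3 h by convention
x = 8.0 + 0.01 * rng.standard_normal(n)   # perturbed rest state
truth = [x := rk4_step(x, dt) for _ in range(400)]
obs = [t + rng.standard_normal(n) for t in truth]  # full, noisy observations
print("final state mean:", np.mean(truth[-1]))
```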
Abstract:
This paper presents a novel technique for aligning partial 3D reconstructions of the seabed acquired by a stereo camera mounted on an autonomous underwater vehicle. Vehicle localization and seabed mapping are performed simultaneously by means of an Extended Kalman Filter. Passive landmarks are detected in the images and characterized using 2D and 3D features. Landmarks are re-observed while the robot navigates, making data association easier yet robust. Once the survey is completed, the vehicle trajectory is smoothed by a Rauch-Tung-Striebel filter, yielding an even better alignment of the 3D views and thus a large-scale acquisition of the seabed.
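The RTS smoothing step works as follows: a forward Kalman filter pass stores the filtered and predicted moments, and a backward pass corrects them. The sketch below applies it to an illustrative 1D constant-velocity track, not the vehicle's actual motion model.

```python
# Forward Kalman filter followed by backward Rauch-Tung-Striebel smoothing
# on a 1D constant-velocity track (illustrative model).
import numpy as np

dt = 1.0
F = np.array([[1, dt], [0, 1]])      # constant-velocity transition
H = np.array([[1.0, 0.0]])           # observe position only
Q, R = 0.01 * np.eye(2), np.array([[0.5]])

rng = np.random.default_rng(2)
zs = [np.array([k + rng.normal(0, 0.7)]) for k in range(30)]  # noisy positions

x, P = np.zeros(2), np.eye(2)
filtered, predicted = [], []
for z in zs:                          # forward pass
    xp, Pp = F @ x, F @ P @ F.T + Q
    K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)
    x, P = xp + K @ (z - H @ xp), (np.eye(2) - K @ H) @ Pp
    predicted.append((xp, Pp))
    filtered.append((x, P))

xs, Ps = filtered[-1]
smoothed = [xs]
for k in range(len(zs) - 2, -1, -1):  # backward RTS pass
    xf, Pf = filtered[k]
    xp, Pp = predicted[k + 1]
    C = Pf @ F.T @ np.linalg.inv(Pp)  # smoother gain
    xs = xf + C @ (xs - xp)
    Ps = Pf + C @ (Ps - Pp) @ C.T
    smoothed.insert(0, xs)
print("smoothed final position:", smoothed[-1][0])
```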
Abstract:
A visual SLAM system has been implemented and optimised for real-time deployment on an AUV equipped with calibrated stereo cameras. The system incorporates a novel approach to landmark description in which landmarks are local sub-maps consisting of a cloud of 3D points and their associated SIFT/SURF descriptors. Landmarks are also sparsely distributed, which simplifies and accelerates data association and map updates. In addition to landmark-based localisation, the system uses visual odometry to estimate the pose of the vehicle in 6 degrees of freedom by identifying temporal matches between consecutive local sub-maps and computing the motion. Both the extended Kalman filter and the unscented Kalman filter have been considered for filtering the observations. The output of the filter is also smoothed using the Rauch-Tung-Striebel (RTS) method to obtain a better alignment of the sequence of local sub-maps and to deliver a large-scale 3D acquisition of the surveyed area. Synthetic experiments have been performed in a simulation environment in which ray tracing is used to generate synthetic images for the stereo system.
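The visual-odometry step can be illustrated compactly: match the 3D points of two consecutive sub-maps by their descriptors, then recover the rigid motion with an SVD-based (Kabsch) alignment. In this sketch random vectors stand in for SIFT/SURF descriptors and the motion is a known ground truth.

```python
# Match two local sub-maps by descriptor and estimate the rigid motion.
import numpy as np

def rigid_motion(P, Q):
    """Least-squares rotation R and translation t with Q ~ R @ P + t."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - cq) @ (P - cp).T)
    D = np.diag([1, 1, np.sign(np.linalg.det(U @ Vt))])  # avoid reflections
    R = U @ D @ Vt
    return R, (cq - R @ cp).ravel()

rng = np.random.default_rng(3)
pts_a = rng.uniform(-1, 1, (3, 50))           # sub-map A: 3D point cloud
desc_a = rng.standard_normal((50, 128))       # associated descriptors

# sub-map B: the same scene seen after a known motion (ground truth)
true_R = np.array([[0.995, -0.0998, 0], [0.0998, 0.995, 0], [0, 0, 1]])
pts_b = true_R @ pts_a + np.array([[0.2], [0.1], [0.0]])
desc_b = desc_a + 0.01 * rng.standard_normal(desc_a.shape)

# data association: nearest descriptor (brute force)
idx = np.argmin(np.linalg.norm(desc_a[:, None] - desc_b[None], axis=2), axis=1)
R, t = rigid_motion(pts_a, pts_b[:, idx])
print("recovered translation:", np.round(t, 3))
```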
Abstract:
Breast cancer is one of the leading causes of death among women in developed countries. It is treated most effectively when detected early, which makes imaging techniques very important. One of the most widely used imaging techniques after X-rays is ultrasound. When processing ultrasound images, experts in this field face a series of limitations when applying filters to the images with certain tools. One of these limitations is the lack of interactivity these tools offer. To overcome these limitations, an interactive tool has been developed that allows the parameter map to be explored while visualizing the result of the filtering in real time, in a dynamic and intuitive way. This tool has been developed within the medical image visualization environment MeVisLab. MeVisLab is a very powerful and modular environment for the development of image processing algorithms, visualization, and interaction methods, especially focused on medical imaging. In addition to basic image processing and visualization modules, it includes advanced algorithms for segmentation, registration, and many morphological and functional image analyses. An experiment was carried out with four experts who, using the developed tool, chose the parameters they considered suitable for filtering a series of ultrasound images. This experiment used filters already implemented in the MeVisLab environment: the Bilateral Filter, Anisotropic Diffusion, and a combination of a Median and a Mean filter. From this experiment, the captured parameters were studied, and a series of estimators was proposed that should be favourable in most cases for the preprocessing of ultrasound images.
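Of the filters the experts tuned, the bilateral filter is the most parameter-sensitive; the sketch below implements it directly in NumPy so that the two sigmas, the kind of parameters explored with the tool, can be varied on a synthetic step-edge image. It is a generic illustration, not the MeVisLab module.

```python
# Edge-preserving bilateral filter: weights combine a spatial Gaussian
# and a range (intensity-difference) Gaussian.
import numpy as np

def bilateral(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    pad = np.pad(img, radius, mode="reflect")
    out, norm = np.zeros_like(img), np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    h, w = img.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy:radius + dy + h,
                          radius + dx:radius + dx + w]
            wgt = spatial[dy + radius, dx + radius] * \
                  np.exp(-(shifted - img) ** 2 / (2 * sigma_r**2))
            out += wgt * shifted
            norm += wgt
    return out / norm

rng = np.random.default_rng(4)
img = np.zeros((64, 64)); img[:, 32:] = 1.0           # step edge
noisy = img + 0.15 * rng.standard_normal(img.shape)   # speckle-like noise
print("noise std before/after:",
      noisy[:, :30].std().round(3), bilateral(noisy)[:, :30].std().round(3))
```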
Abstract:
Simultaneous localization and mapping (SLAM) is a very important problem in mobile robotics. Many solutions have been proposed by different scientists during the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. The solution is to combine several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first is to run the ordinary EKF SLAM algorithm for each data source separately in parallel and then, at the end of each step, fuse the results into one solution. The second is to use multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
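The paper does not specify the fusion rule applied at the end of each step; inverse-covariance (information-weighted) fusion of the per-sensor estimates is a common choice and is assumed in the sketch below, with illustrative numbers.

```python
# Fuse two independent Gaussian estimates of the same state by
# inverse-covariance weighting (the information form).
import numpy as np

def fuse(x1, P1, x2, P2):
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)
    return P @ (I1 @ x1 + I2 @ x2), P

# robot position (x, y) estimated by two sensor-specific EKFs
x_lidar, P_lidar = np.array([2.0, 1.0]), np.diag([0.04, 0.09])
x_cam,   P_cam   = np.array([2.1, 0.9]), np.diag([0.16, 0.04])

x, P = fuse(x_lidar, P_lidar, x_cam, P_cam)
print("fused position:", x.round(3))
print("fused covariance diag:", np.diag(P).round(3))
```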
Abstract:
The steep rise in the raw-material cost of steel is forcing the process industry to look for alternative structural materials whose properties are at least equivalent to those of steel. The process industry requires equipment that withstands heavy chemical exposure, for which acid-proof steel is a suitable structural material. As equipment sizes grow, so do their total weights, posing challenges for the foundations that carry them. Composite is a serious candidate for replacing acid-proof steel, since the property-to-weight ratio of composites is better than that of steel. The aim of this Master's thesis has been to examine the possibility of replacing with composite the parts found in the structures of two devices belonging to the Washers & Filters product family: the DD washer (Drum Displacer) and the GF filter (GasFree Filter). As background, the properties, types, and manufacturing methods of composites are reviewed, and the operating process of pulp washers is described. On this basis, selection criteria were used to identify structures that could be replaced with composite.
Abstract:
Indoor localization based on modern mass-market devices, such as mobile devices, is a current area of research and development that has begun to generate products for different applications. However, the estimation of the error in these measurements, its sources, and the way it propagates, which prevents greater localization accuracy, is still a pending area of study. This research takes one of the common techniques for localizing devices over WiFi wireless networks and studies the different errors it entails, their sources, and their propagation. This work also proposes a new approach and a new methodology that reduce the error in the localization of WiFi devices, based on the classification of the wireless networks present in a scan, the identification of spurious samples and data, the stabilization of WiFi signal-strength samplings, and the implementation of a Kalman filter for the prediction of stable measurements of these signal strengths.
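The final step, stabilizing RSSI samples with a Kalman filter, reduces in the scalar case to a few lines; the process and measurement variances in the sketch below are assumed values, not the thesis's calibrated ones.

```python
# Scalar Kalman filter smoothing noisy WiFi RSSI readings.
import numpy as np

def kalman_rssi(samples, q=0.05, r=4.0):
    """Random-walk state model: x_k = x_{k-1} + w,  z_k = x_k + v."""
    x, p = samples[0], 1.0
    out = [x]
    for z in samples[1:]:
        p += q                      # predict
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with measurement z
        p *= (1 - k)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(5)
true_rssi = -62.0
raw = true_rssi + 2.0 * rng.standard_normal(100)   # noisy dBm readings
smooth = kalman_rssi(raw)
print(f"raw std: {raw.std():.2f} dB, filtered std: {smooth[20:].std():.2f} dB")
```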