959 results for Minres Filter Diagonalization
Abstract:
Anthropomorphic model observers are mathematical algorithms which are applied to images with the ultimate goal of predicting human signal detection and classification accuracy across a variety of backgrounds, image acquisitions and display conditions. A limitation of current channelized model observers is their inability to handle irregularly-shaped signals, which are common in clinical images, without a high number of directional channels. Here, we derive a new linear model observer based on convolution channels, which we refer to as the "Filtered Channel observer" (FCO), as an extension of the channelized Hotelling observer (CHO) and the non-prewhitening observer with an eye filter (NPWE). In analogy to the CHO, this linear model observer can take the form of a single template with an external noise term. To compare with human observers, we tested signals with irregular and asymmetrical shapes, spanning sizes from lesions down to microcalcifications, in 4-AFC breast tomosynthesis detection tasks, with three different contrasts for each case. Whereas humans uniformly outperformed conventional CHOs, the FCO outperformed humans for every signal with only one exception. Additive internal noise in the models allowed us to degrade model performance and match human performance. We could not match all the human performances with a single internal noise component for all signal shape, size and contrast conditions. This suggests either that the internal noise varies across signals or that the model cannot entirely capture the human detection strategy. Nevertheless, the FCO model offers an efficient way to approximate human observer performance for non-symmetric signals.
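A minimal numerical sketch of the channelized Hotelling observer that the FCO extends, using synthetic Gaussian images and a generic random channel matrix (both hypothetical placeholders, not the filtered channels or tomosynthesis data of the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: 64x64 images, flattened, with a faint disc signal.
n_pix, n_img = 64 * 64, 500
x, y = np.meshgrid(np.arange(64), np.arange(64))
signal = 0.5 * (((x - 32) ** 2 + (y - 32) ** 2) < 25).astype(float).ravel()
absent = rng.normal(0.0, 1.0, size=(n_img, n_pix))   # signal-absent images
present = absent + signal                             # signal-present images

# Generic channels: random orthonormal columns stand in for Gabor/Laguerre channels.
n_ch = 10
U, _ = np.linalg.qr(rng.normal(size=(n_pix, n_ch)))

# Channelize the images: t = U^T g
t_abs = absent @ U
t_pres = present @ U

# Hotelling template in channel space: w = S^-1 (mean_present - mean_absent)
S = 0.5 * (np.cov(t_abs, rowvar=False) + np.cov(t_pres, rowvar=False))
dmean = t_pres.mean(axis=0) - t_abs.mean(axis=0)
w = np.linalg.solve(S, dmean)

# Detectability index d' of the channelized observer.
d_prime = np.sqrt(dmean @ w)
print(f"CHO detectability d' = {d_prime:.2f}")
```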
Abstract:
Podocytes are essential for the function of the kidney glomerular filter, and a highly differentiated cytoskeleton is requisite for their integrity. Although much knowledge has been gained on the organization of cortical actin networks in podocyte foot processes, less is known about the molecular organization of the microtubular cytoskeleton in primary processes and the cell body. To gain insight into the organization of the podocyte microtubular cytoskeleton, we systematically analyzed the expression of microtubule-associated proteins (MAPs), a family of microtubule-interacting proteins with known functions as regulator, scaffold and guidance proteins. We identified microtubule-associated protein 1B (MAP1B) as specifically enriched in podocytes in human and rodent kidney. Using immunogold labeling in electron microscopy, we demonstrated an enrichment of MAP1B in primary processes. A similar association of MAP1B with the microtubule cytoskeleton was detected in cultured podocytes. The subcellular distribution of MAP1B HC and LC1 was analyzed using a double fluorescent reporter MAP1B fusion protein. Subsequently, we analyzed mice constitutively depleted of MAP1B. Interestingly, MAP1B knockout was not associated with any functional or structural alterations, pointing towards a redundancy of MAP proteins in podocytes. In summary, we established MAP1B as a specific marker protein of the podocyte microtubular cytoskeleton.
Abstract:
The purpose of this work is to support the company in the design of a welding clamp and in switching partly from manual to automatic welding. The product is a stainless steel frame of an industrial filter. The work comprises a theory section and a case section. The theory section covers half of the work and studies the significant aspects of welding clamp design. The case section presents a few models of a welding clamp for the company's needs. A study of the initial conditions and of possible changes to the existing product line is also included.
Abstract:
This master's thesis examined the operation and control of frequency converters. In addition, the work studied the motor overvoltage caused by the fast transient states of the inverter. Reflections in the motor cable were treated by comparing the motor cable to a transmission line, and the causes of the overvoltage were verified. Several filtering methods have been developed to reduce the overvoltage; the work compared these methods and surveyed commercial alternatives. To date, the control of a frequency converter has usually been implemented with a microprocessor together with a logic circuit. In the future, the control will likely be implemented with reprogrammable FPGA circuits (Field Programmable Gate Array). The advantages of an FPGA include reprogrammability and the concentration of the control on a single chip.
Abstract:
Brown packaging linerboard, made entirely from recovered pulp, was subjected to deinking flotation to evaluate possible improvements in its chemical, optical and mechanical properties. The increase in the rate of recovered paper utilisation, along with the tendency towards lower basis weights in packaging paper production, has created a growing need for secondary fibers of improved quality. To attain better quality fibers, flotation deinking of brown grades is being considered, along with the addition of primary fibers to the recovered paper furnish. Numerous studies in which flotation technology was used to treat brown grades support this idea. Most of them show that fiber quality is improved after flotation deinking, resulting in better mechanical properties of the deinked handsheets and lower amounts of chemical contaminants. With respect to food and human health safety, packaging paper has to meet specific requirements to be classified as suitable for direct contact with food. Recycled paper and board may contain many potential contaminants which, especially in the case of direct food contact, may migrate from the packaging material into foodstuffs. In this work, the linerboard sample selected for deinking was made from recycled fibers not previously submitted to chemical deinking flotation; the original sample therefore contained many non-cellulosic components as well as residues of printing inks. The studied linerboard sample was a type of packaging paper used in contact with food products that are usually peeled before use, e.g. fruits and vegetables. The decrease in the amount of chemical contaminants after deinking flotation was evaluated, along with the changes in the mechanical and optical properties of the deinked handsheets. Food contact analysis was carried out on the original paper samples and on the filter pads and handsheets made before and after deinking flotation; it consisted of migration tests of brightening agents, colorants, PCPs, formaldehyde and metals. Microbiological tests were also performed to determine the possible transfer of antimicrobial constituents.
Abstract:
Clogging, measured through head loss across filters, and the filtration quality of different filters using different effluents were studied. The filters used were 115, 130 and 200 µm disc filters; 98, 115, 130 and 178 µm screen filters; and a sand filter filled with a single layer of sand with an effective diameter of 0.65 mm. The filters were used with a meat industry effluent and with secondary and tertiary effluents of two wastewater treatment plants. It was observed that clogging depended on the type of effluent. With the meat industry effluent, the poorest quality effluent, disc filters clogged more than the other filter types, whereas with the wastewater treatment plant effluents the disc filters showed less frequent clogging. Several physical and chemical parameters, such as total suspended solids, chemical oxygen demand, turbidity, electrical conductivity, pH and number of particles, were analyzed in the effluents at the entry and exit points of the filters. In general, the filters did not greatly reduce the values of the main clogging parameters. The parameter that explained the clogging, expressed as Boucher's filterability index, was found to differ depending on the type of effluent and filter. The best filtration quality was achieved with the sand filter when the meat industry effluent was used. No significant differences were observed between the filtration quality of disc and screen filters when operating with the secondary and tertiary effluents.
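Boucher's filterability index mentioned above is commonly obtained by assuming that head loss grows exponentially with the filtered volume, so the index is the slope of the logarithm of head loss versus volume; a minimal sketch under that assumption, with hypothetical data:

```python
import numpy as np

# Hypothetical filtration run: cumulative filtered volume (m^3) and head loss (kPa).
volume = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
head_loss = np.array([5.0, 6.1, 7.6, 9.3, 11.5, 14.2])

# Assuming the exponential clogging law H = H0 * exp(FI * V),
# the filterability index FI is the slope of ln(H) against V.
slope, intercept = np.polyfit(volume, np.log(head_loss), 1)
print(f"Boucher filterability index FI = {slope:.3f} per m^3")
print(f"Fitted initial head loss H0 = {np.exp(intercept):.2f} kPa")
```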
Abstract:
An increasing number of studies in recent years have sought to identify individual inventors from patent data. A variety of heuristics have been proposed for using the names and other information disclosed in patent documents to establish who is who in patents. This paper contributes to this literature by describing a methodology for identifying inventors in patents filed with the European Patent Office (EPO hereafter). As in much of this literature, we basically follow a three-step procedure: (1) the parsing stage, aimed at reducing the noise in the inventor's name and other fields of the patent; (2) the matching stage, where name matching algorithms are used to group similar names; and (3) the filtering stage, where additional information and various scoring schemes are used to filter these similarly named inventors. The paper presents the results obtained by applying the algorithms to the set of European inventors filing at the EPO over a long period of time.
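A toy sketch of the three-step procedure (parsing, matching, filtering) on hypothetical inventor records; the names, the similarity threshold and the filtering rule are illustrative only, not the algorithms actually applied to the EPO data:

```python
from difflib import SequenceMatcher
import itertools
import unicodedata

# Hypothetical patent records: (record id, inventor name, applicant).
records = [
    (1, "Müller, Hans", "Acme GmbH"),
    (2, "Muller, Hans ", "Acme GmbH"),
    (3, "Mullen, Hanna", "Other SA"),
]

def parse(name):
    """Step 1 (parsing): strip accents, punctuation and case to reduce name noise."""
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    return " ".join(ascii_name.lower().replace(",", " ").split())

def similar(a, b, threshold=0.85):
    """Step 2 (matching): group names whose string similarity exceeds a threshold."""
    return SequenceMatcher(None, a, b).ratio() >= threshold

def same_inventor(rec_a, rec_b):
    """Step 3 (filtering): keep a candidate pair only if extra fields (here the
    applicant) also agree; a real scheme would combine several scored criteria."""
    return rec_a[2] == rec_b[2]

for ra, rb in itertools.combinations(records, 2):
    if similar(parse(ra[1]), parse(rb[1])) and same_inventor(ra, rb):
        print(f"records {ra[0]} and {rb[0]} likely refer to the same inventor")
```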
Abstract:
In this thesis, the factors affecting lime mud filterability were studied. The literature part covers the recausticizing process, lime mud properties and filterability, and general filtration theory. In the experimental part, the properties of lime mud particles and of the lime mud filter cake were investigated and the behavior of a lime mud precoat was studied. Experiments were also carried out with R-PCC (rhombohedral precipitated calcium carbonate). The filtration and precoat studies were performed with a laboratory-scale pressure filter, at pressure differences between 0.25 and 1.50 bar. Six lime mud samples with varying amounts of impurities and two R-PCC samples were used in the experiments. The average specific cake resistance of the lime mud samples varied quite extensively. The most influential factor affecting lime mud filterability and dry solids content was found to be the silica content: when the lime mud contained silica, the average specific cake resistance was high and the lime mud dryness was low. Leaving the silica-containing lime mud samples out of consideration, the smaller the mean particle size, the higher the average specific cake resistance of the lime mud. The R-PCC also had a high average specific cake resistance because of its small particle size. In addition, the pressure difference affected the average specific cake resistance of some lime mud samples; in those cases the lime mud was compressible. During the precoat experiments, the lime mud samples with the largest mean particle sizes compressed. However, the average specific cake resistances in the filtration and precoat parts were of approximately the same magnitude. A brief displacement by air did not affect the behavior of the precoat, whereas vibration and a brief but relatively large change in pressure difference had a slight influence.
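The average specific cake resistance discussed above is typically extracted from constant-pressure filtration data through the classical filtration equation t/V = (mu*alpha*c)/(2*dP*A^2)*V + mu*Rm/(dP*A), i.e. from the slope of t/V plotted against V. A minimal sketch with hypothetical laboratory values:

```python
import numpy as np

# Hypothetical constant-pressure filtration run.
t = np.array([30.0, 75.0, 135.0, 210.0, 300.0])         # time, s
V = np.array([0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3, 2.5e-3])  # filtrate volume, m^3

mu = 1.0e-3   # filtrate viscosity, Pa*s
dP = 1.0e5    # pressure difference, Pa (1 bar)
A = 45e-4     # filter area, m^2
c = 10.0      # dry cake mass per filtrate volume, kg/m^3

# Classical constant-pressure filtration equation:
#   t/V = (mu*alpha*c)/(2*dP*A^2) * V + mu*Rm/(dP*A)
slope, intercept = np.polyfit(V, t / V, 1)
alpha = slope * 2.0 * dP * A**2 / (mu * c)   # average specific cake resistance, m/kg
Rm = intercept * dP * A / mu                 # filter medium resistance, 1/m
print(f"average specific cake resistance = {alpha:.3e} m/kg")
print(f"medium resistance = {Rm:.3e} 1/m")
```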
Abstract:
Online paper web analysis relies on traversing scanners that criss-cross on top of a rapidly moving paper web. The sensors embedded in the scanners measure many important quality variables of the paper, such as basis weight, caliper and porosity. Most of these quantities vary considerably, and the measurements are noisy at many different scales. The zigzagging nature of scanning makes it difficult to separate machine direction (MD) and cross direction (CD) variability from one another. To improve the 2D resolution of these quality variables, the paper quality control team at the Department of Mathematics and Physics at LUT has implemented efficient Kalman filtering based methods that currently use 2D Fourier series. Fourier series are global and therefore resolve local spatial detail on the paper web rather poorly. The aim of the current thesis is to study alternative wavelet-based representations as candidates to replace the Fourier basis for a higher-resolution spatial reconstruction of these quality variables. The accuracy of wavelet-compressed 2D web fields will be compared with that of corresponding truncated Fourier series based fields.
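A small sketch of the comparison the thesis targets: compressing a 2D field by keeping the largest wavelet coefficients versus keeping an equal number of Fourier coefficients, then comparing the reconstruction errors. The synthetic field and the use of PyWavelets are illustrative assumptions, not the thesis implementation:

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)

# Synthetic 2D "web" field: smooth MD/CD trends plus a sharp local defect and noise.
n = 128
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
field = np.sin(6 * np.pi * x) + 0.5 * np.cos(4 * np.pi * y)
field[60:68, 60:68] += 2.0                      # local detail, hard for global bases
field += 0.1 * rng.normal(size=(n, n))

keep = 500  # number of coefficients retained in both representations

# Wavelet compression: keep the 'keep' largest coefficients.
coeffs = pywt.wavedec2(field, "db4", level=4)
arr, slices = pywt.coeffs_to_array(coeffs)
thresh = np.sort(np.abs(arr).ravel())[-keep]
arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
wave_rec = pywt.waverec2(pywt.array_to_coeffs(arr_c, slices, output_format="wavedec2"), "db4")

# Fourier compression: keep the 'keep' largest Fourier coefficients.
F = np.fft.fft2(field)
thresh_f = np.sort(np.abs(F).ravel())[-keep]
fourier_rec = np.real(np.fft.ifft2(np.where(np.abs(F) >= thresh_f, F, 0.0)))

for name, rec in [("wavelet", wave_rec), ("fourier", fourier_rec)]:
    err = np.sqrt(np.mean((rec - field) ** 2))
    print(f"{name} reconstruction RMSE with {keep} coefficients: {err:.3f}")
```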
Abstract:
Cartel detection is one of the most basic yet most complicated tasks of competition authorities. In recent years, however, variance filters have provided a fairly simple tool for rejecting the existence of price-fixing, with the added advantage that the methodology requires only a small volume of data. In this paper we analyze two aspects of variance filters: (1) the relationship between market structure and price rigidity, and (2) the use of different benchmarks for implementing the filters. We address these two issues by applying a variance filter to a gasoline retail market characterized by a set of unique features. Our results confirm the positive relationship between monopolies and price rigidity, and the variance filter's ability to detect non-competitive behavior when an appropriate benchmark is used. Our findings should serve to promote the implementation of this methodology among competition authorities, albeit in the awareness that a more exhaustive complementary analysis is required.
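A minimal sketch of a variance filter of the kind described: stations whose price variation falls well below a competitive benchmark are flagged for closer inspection. The data and the screening threshold are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical weekly retail prices for a few stations (rows) over one year (columns).
competitive = 1.50 + 0.05 * rng.normal(size=(4, 52))   # benchmark: competitive stations
suspect = 1.55 + 0.005 * rng.normal(size=(2, 52))      # rigid prices, little variation

def coef_of_variation(prices):
    """Price rigidity measure used by variance filters: std / mean per station."""
    return prices.std(axis=1) / prices.mean(axis=1)

benchmark_cv = coef_of_variation(competitive).mean()
for i, cv in enumerate(coef_of_variation(suspect)):
    # Flag a station when its variation is far below the competitive benchmark;
    # the factor 0.5 is an arbitrary screening threshold, not a legal standard.
    flag = "flag for further analysis" if cv < 0.5 * benchmark_cv else "no flag"
    print(f"station {i}: CV = {cv:.4f} (benchmark {benchmark_cv:.4f}) -> {flag}")
```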
Abstract:
PURPOSE: Pencil beam scanning and filter-free techniques may involve dose rates considerably higher than those used in conventional external-beam radiotherapy. Our purpose was to investigate normal tissue and tumour responses in vivo to short pulses of radiation. MATERIAL AND METHODS: C57BL/6J mice were exposed to bilateral thorax irradiation using pulsed (at least 40 Gy/s, "flash") or conventional dose-rate irradiation (0.03 Gy/s or less) in a single dose. Immunohistochemical and histological methods were used to compare early radiation-induced apoptosis and the development of lung fibrosis in the two situations. The response of two human tumour xenografts (HBCx-12A, HEp-2) in nude mice and of one syngeneic, orthotopic lung carcinoma in C57BL/6J mice (TC-1 Luc+) was monitored in both radiation modes. RESULTS: A 17 Gy conventional irradiation induced pulmonary fibrosis and activation of the TGF-beta cascade in 100% of the animals 24-36 weeks post-treatment, as expected, whereas no animal developed complications below 23 Gy of flash irradiation, and a 30 Gy flash irradiation was required to induce the same extent of fibrosis as 17 Gy of conventional irradiation. Cutaneous lesions were also reduced in severity. Flash irradiation protected vascular and bronchial smooth muscle cells, as well as the epithelial cells of bronchi, against acute apoptosis, as shown by analysis of caspase-3 activation and TUNEL staining. In contrast, the antitumour effectiveness of flash irradiation was maintained and did not differ from that of conventional irradiation. CONCLUSION: Flash irradiation shifted by a large factor the threshold dose required to initiate lung fibrosis, without loss of antitumour efficiency, suggesting that the method might be used to advantage to minimize the complications of radiotherapy.
Abstract:
All the experimental part of this final project was carried out at the Laboratoire de Biotechnologie Environnementale (LBE) of the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, over 6 months (November 2013 - May 2014). A fungal biofilter composed of woodchips was designed in order to remove micropollutants from the effluents of wastewater treatment plants. Two fungi, Pleurotus ostreatus and Trametes versicolor, were tested to evaluate their efficiency in removing two micropollutants: the anti-inflammatory drug naproxen and the antibiotic sulfamethoxazole. Although Trametes versicolor was able to degrade naproxen quickly, this fungus was no longer active after one week of operation in the filter. Pleurotus ostreatus, on the contrary, was able to survive for more than 3 months in the filter, showing good removal efficiencies for naproxen and sulfamethoxazole during this whole period, in tap water as well as in real treated municipal wastewater. Several other experiments provided insight into the removal mechanisms of these micropollutants in the fungal biofilter (degradation and adsorption) and also allowed the removal trend to be modelled. Fungal treatment with Pleurotus ostreatus grown on wood substrates appeared to be a promising solution for improving micropollutant removal in wastewater.
Abstract:
BACKGROUND: Autologous blood transfusion (ABT) efficiently increases sport performance and is the most challenging doping method to detect. Current methods for detecting this practice center on the plasticizer di(2-ethylhexyl) phthalate (DEHP), which enters the stored blood from blood bags. Quantification of this plasticizer and its metabolites in urine can detect the transfusion of autologous blood stored in these bags. However, DEHP-free blood bags are available on the market, including n-butyryl-tri-(n-hexyl)-citrate (BTHC) blood bags. Athletes may shift to using such bags to avoid the detection of urinary DEHP metabolites. STUDY DESIGN AND METHODS: A randomized, double-blinded, two-phase clinical study was conducted in healthy male volunteers who underwent ABT using DEHP-containing or BTHC blood bags. All subjects received a saline injection for the control phase and a blood donation followed by ABT 36 days later. The excretion kinetics of five urinary DEHP metabolites were quantified with liquid chromatography coupled with tandem mass spectrometry. RESULTS: Surprisingly, considerable levels of urinary DEHP metabolites were observed up to 1 day after blood transfusion with BTHC blood bags. The long-term metabolites mono-(2-ethyl-5-carboxypentyl) phthalate and mono-(2-carboxymethylhexyl) phthalate were the most sensitive biomarkers for detecting ABT with BTHC blood bags. Levels of DEHP were high in the BTHC bags (6.6%), the tubing of the transfusion kit (25.2%), and the white blood cell filter (22.3%). CONCLUSIONS: The BTHC bag contained DEHP, despite being labeled DEHP-free. Urinary DEHP metabolite measurement is a cost-effective way to detect ABT in the antidoping field, even when BTHC bags are used for blood storage.
Abstract:
Objective: To derive the filtered tungsten X-ray spectra used in digital mammography systems by means of Monte Carlo simulations. Materials and Methods: Filtered spectra for a rhodium filter were obtained for tube potentials between 26 and 32 kV. The half-value layers (HVL) of the simulated filtered spectra were compared with those obtained experimentally with a solid-state detector (Unfors model 8202031-H Xi R/F & MAM Detector Platinum and 8201023-C Xi Base unit Platinum Plus w mAs) in a Hologic Selenia Dimensions system using direct radiography mode. Results: The calculated HVL values showed good agreement with those obtained experimentally; the greatest relative difference between the Monte Carlo and experimental HVL values was 4%. Conclusion: The results show that the filtered tungsten anode X-ray spectra and the EGSnrc Monte Carlo code can be used for mean glandular dose determination in mammography.
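A brief sketch of how an HVL can be read off a transmission curve and compared between simulation and measurement; the aluminium thicknesses, kerma values and Monte Carlo HVL below are placeholders, not the study's data:

```python
import numpy as np

# Hypothetical transmission curve: added Al thickness (mm) vs normalized air kerma.
thickness = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
kerma = np.array([1.00, 0.82, 0.68, 0.57, 0.48, 0.41])

# HVL: thickness at which the kerma drops to half; linear interpolation on the curve
# (arrays reversed so the interpolation abscissa is increasing).
hvl_measured = np.interp(0.5, kerma[::-1], thickness[::-1])

hvl_monte_carlo = 0.74  # placeholder value for a simulated spectrum (mm Al)
rel_diff = abs(hvl_monte_carlo - hvl_measured) / hvl_measured
print(f"measured HVL = {hvl_measured:.3f} mm Al, relative difference = {100 * rel_diff:.1f}%")
```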
Abstract:
The Extended Kalman Filter (EKF) and the four-dimensional variational assimilation method (4D-VAR) are both advanced data assimilation methods. The EKF is impractical in large-scale problems, and 4D-VAR requires considerable effort in building the adjoint model. In this work we have formulated a data assimilation method that tackles these difficulties, referred to here as the Variational Ensemble Kalman Filter (VEnKF). The method has been tested with the Lorenz95 model: data were simulated from the solution of the Lorenz95 equations with normally distributed noise. Two experiments were conducted, the first with full observations and the second with partial observations, and in each experiment data were assimilated with three-hour and six-hour time windows. Different ensemble sizes were tested to examine the method. There was no strong difference between the results for the two time windows in either experiment. Experiment I gave similar results for all ensemble sizes tested, and a small ensemble was enough to produce good results, whereas in experiment II larger ensembles produced better results. Computational speed is not as good as we would like; using the limited-memory BFGS method instead of the current BFGS method might improve this. The method has proven successful: even if it is unable to match the quality of the EKF analyses, it attains significant skill in the forecasts ensuing from the analyses it produces. It has two advantages over the EKF: VEnKF does not require an adjoint model and it can be easily parallelized.
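A compact sketch of the kind of setup described: the Lorenz95 model generates a truth run and noisy synthetic observations, followed by one generic stochastic ensemble Kalman filter analysis step (not the VEnKF formulation itself):

```python
import numpy as np

rng = np.random.default_rng(3)

def lorenz95(x, forcing=8.0):
    """Lorenz95 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def step(x, dt=0.05):
    """One fourth-order Runge-Kutta step of the Lorenz95 model."""
    k1 = lorenz95(x)
    k2 = lorenz95(x + 0.5 * dt * k1)
    k3 = lorenz95(x + 0.5 * dt * k2)
    k4 = lorenz95(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

n, n_ens, obs_std = 40, 20, 1.0

# Truth run and noisy observations of the full state.
truth = rng.normal(size=n)
for _ in range(200):            # spin up onto the attractor
    truth = step(truth)
obs = truth + obs_std * rng.normal(size=n)

# Deliberately biased forecast ensemble so the analysis step visibly improves it.
ens = truth + 2.0 + rng.normal(size=(n_ens, n))

# One stochastic EnKF analysis step (full observations, H = I).
P = np.cov(ens, rowvar=False)                               # ensemble covariance
K = P @ np.linalg.inv(P + obs_std**2 * np.eye(n))           # Kalman gain
perturbed_obs = obs + obs_std * rng.normal(size=(n_ens, n))
analysis = ens + (perturbed_obs - ens) @ K.T

print("forecast RMSE:", np.sqrt(np.mean((ens.mean(axis=0) - truth) ** 2)))
print("analysis RMSE:", np.sqrt(np.mean((analysis.mean(axis=0) - truth) ** 2)))
```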