994 results for Filter methods
Abstract:
BACKGROUND: Protein-energy malnutrition is highly prevalent in aged populations, and the associated clinical, economic, and social burden is substantial. A valid screening method that is robust and precise, but also easy, simple, and rapid to apply, is essential for adequate therapeutic management. OBJECTIVES: To compare the interobserver variability of 2 methods of measuring food intake: semiquantitative visual estimations made by nurses versus calorie measurements performed by dieticians on the basis of standardized color digital photographs of servings before and after consumption. DESIGN: Observational monocentric pilot study. SETTING/PARTICIPANTS: A geriatric ward. The meals were randomly chosen from the meal tray; the choice was anonymous with respect to the patients who consumed them. MEASUREMENTS: The test method consisted of the estimation of calorie consumption by dieticians on the basis of standardized color digital photographs of servings before and after consumption. The reference method was based on direct visual estimations of the meals by nurses. Food intake was expressed as a percentage of the serving consumed, and calorie intake was then calculated by a dietician based on these percentages. The methods were applied with no previous training of the observers. Analysis of variance was performed to compare their interobserver variability. RESULTS: Of 15 meals consumed and initially examined, 6 were assessed with each method. Servings not consumed at all (0% consumption) or entirely consumed by the patient (100% consumption) were excluded from the analysis to avoid systematic error. The digital photography method showed higher interobserver variability in calorie intake estimations; the difference between the compared methods was statistically significant (P < .03). CONCLUSIONS: Calorie intake measures for geriatric patients are more concordant when estimated in a semiquantitative way. 
Digital photography for food intake estimation without previous specific training of dieticians should not be considered as a reference method in geriatric settings, as it shows no advantages in terms of interobserver variability.
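The reference method converts each observer's visual estimate (the fraction of each serving consumed) into a calorie total. A minimal sketch of that conversion, using hypothetical serving data rather than figures from the study:

```python
# Convert semiquantitative visual estimates (fraction of each serving
# consumed) into a calorie intake, as in the nurses' reference method.
# The serving calorie values below are illustrative, not study data.

def calorie_intake(servings):
    """servings: list of (kcal_of_full_serving, fraction_consumed)."""
    return sum(kcal * frac for kcal, frac in servings)

meal = [(250, 0.75), (180, 0.5), (120, 1.0)]  # hypothetical meal tray
print(calorie_intake(meal))  # 397.5
```

The interobserver comparison in the study then asks how much this total varies when different observers supply the fractions for the same tray.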
Abstract:
BACKGROUND: Finding genes that are differentially expressed between conditions is an integral part of understanding the molecular basis of phenotypic variation. In the past decades, DNA microarrays have been used extensively to quantify the abundance of mRNA corresponding to different genes, and more recently high-throughput sequencing of cDNA (RNA-seq) has emerged as a powerful competitor. As the cost of sequencing decreases, it is conceivable that the use of RNA-seq for differential expression analysis will increase rapidly. To exploit the possibilities and address the challenges posed by this relatively new type of data, a number of software packages have been developed especially for differential expression analysis of RNA-seq data. RESULTS: We conducted an extensive comparison of eleven methods for differential expression analysis of RNA-seq data. All methods are freely available within the R framework and take as input a matrix of counts, i.e. the number of reads mapping to each genomic feature of interest in each of a number of samples. We evaluate the methods based on both simulated data and real RNA-seq data. CONCLUSIONS: Very small sample sizes, which are still common in RNA-seq experiments, pose problems for all evaluated methods, and any results obtained under such conditions should be interpreted with caution. For larger sample sizes, the methods combining a variance-stabilizing transformation with the 'limma' method for differential expression analysis perform well under many different conditions, as does the nonparametric SAMseq method.
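The input format the compared packages share (a genes x samples count matrix) and the general shape of a transformation-then-test pipeline can be sketched as follows. This is a toy illustration of the idea only: a log-CPM transform as a simple variance-stabilizing step followed by a per-gene Welch t statistic, not the limma or SAMseq algorithm.

```python
import numpy as np

# Toy differential-expression sketch on a count matrix: log-CPM
# (counts per million, log2) as a simple variance-stabilizing step,
# then a per-gene two-sample Welch t statistic. Illustrative only;
# this is not the limma or SAMseq method from the comparison.

def log_cpm(counts):
    """counts: genes x samples matrix of read counts."""
    lib = counts.sum(axis=0)                      # library sizes
    return np.log2((counts + 0.5) / (lib + 1.0) * 1e6)

def t_statistics(counts, group):
    """Welch t statistic per gene between group 0 and group 1."""
    x = log_cpm(np.asarray(counts, dtype=float))
    g = np.asarray(group)
    a, b = x[:, g == 0], x[:, g == 1]
    diff = a.mean(axis=1) - b.mean(axis=1)
    se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1]
                 + b.var(axis=1, ddof=1) / b.shape[1])
    return diff / se

# Deterministic toy data: 100 genes, 6 samples, one spiked gene.
counts = 50 + (np.arange(600).reshape(100, 6) % 7)
counts[0, 3:] += 300                              # truly differential gene
t = t_statistics(counts, [0, 0, 0, 1, 1, 1])
print(int(np.argmax(np.abs(t))))                  # the spiked gene stands out
```

With only 3 samples per group, the per-gene variance estimates in `se` are very noisy, which is exactly why such small sample sizes are problematic for every method in the comparison.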
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D-transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
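The central-slice theorem the method relies on states that the 1D Fourier transform of a parallel projection equals a slice through the origin of the object's 2D Fourier transform. A minimal numerical check, using a simple disk phantom (not the Shepp-Logan phantom) and NumPy FFTs rather than ITK:

```python
import numpy as np

# Central-slice theorem check: the 1D FFT of the 0-degree parallel
# projection (column sums) equals the ky = 0 row of the 2D FFT.
# Toy disk phantom; the ITK implementation handles arbitrary angles
# and resamples the resulting polar spectrum onto a Cartesian grid.

n = 64
y, x = np.mgrid[:n, :n] - n // 2
phantom = (x**2 + y**2 < (n // 4)**2).astype(float)  # disk phantom

projection = phantom.sum(axis=0)           # 0-degree parallel projection
slice_1d = np.fft.fft(projection)          # 1D FFT of the projection
central_row = np.fft.fft2(phantom)[0, :]   # ky = 0 slice of the 2D FFT

print(np.allclose(slice_1d, central_row))  # True
```

A full direct Fourier reconstruction collects such slices for many angles, interpolates them from polar to Cartesian coordinates (the step where the paper's complex-valued b-spline interpolation matters), and applies an inverse 2D FFT.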
Abstract:
This is the report of the first workshop on Incorporating In Vitro Alternative Methods for Developmental Neurotoxicity (DNT) Testing into International Hazard and Risk Assessment Strategies, held in Ispra, Italy, on 19-21 April 2005. The workshop was hosted by the European Centre for the Validation of Alternative Methods (ECVAM) and jointly organized by ECVAM, the European Chemical Industry Council, and the Johns Hopkins University Center for Alternatives to Animal Testing. The primary aim of the workshop was to identify and catalog potential methods that could be used to assess how data from in vitro alternative methods could help to predict and identify DNT hazards. Working groups focused on two different aspects: a) details on the science available in the field of DNT, including discussions on the models available to capture the critical DNT mechanisms and processes, and b) policy and strategy aspects to assess the integration of alternative methods in a regulatory framework. This report summarizes these discussions and details the recommendations and priorities for future work.
Abstract:
PURPOSE: We describe the results of a preliminary prospective study using different recently developed temporary and retrievable inferior vena cava (IVC) filters. METHODS: Fifty temporary IVC filters (Gunther, Gunther Tulip, Antheor) were inserted in 47 patients when the required period of protection against pulmonary embolism (PE) was estimated to be less than 2 weeks. The indications were documented deep vein thrombosis (DVT) and temporary contraindications for anticoagulation, a high risk for PE, and PE despite DVT prophylaxis. RESULTS: Filters were removed 1-12 days after placement and nine (18%) had captured thrombi. Complications were one PE during and after removal of a filter, two minor filter migrations, and one IVC thrombosis. CONCLUSION: Temporary filters are effective in trapping clots and protecting against PE, and the complication rate does not exceed that of permanent filters. They are an alternative when protection from PE is required temporarily, and should be considered in patients with a normal life expectancy.
Abstract:
Abstract: Comparison of thinning methods in a Scots pine stand on drained peatland. A simulation study.
Abstract:
Introduction. Agricultural workers are among the professional groups most at risk of developing acute or chronic respiratory problems. Despite this fact, the etiology of these occupational diseases is poorly known, even in important sectors of agriculture such as the crops sector. Cereals can be colonized by a large number of fungal species throughout the plant's growth, but also during grain storage. Some of these fungi produce toxins that can have a serious impact on human health when ingested via wheat products. Although international and European legislation on contaminants in food, including mycotoxins, includes measures to protect public health by setting maximum levels for certain contaminants, the risks associated with inhalation of such molecules during grain handling remain poorly documented. Goal of study. This project's objective was to characterize worker exposure to pathogenic, irritative or allergenic microorganisms and to identify the abiotic or biotic factors that reduce the growth of these microorganisms in crops. Indeed, the proliferation of microorganisms on wheat depends on temperature, rainfall and human disturbance (e.g. tillage, addition of fungicides). A change in the concentration of these microorganisms in the substrate will directly result in a change in the concentration of aerosolized particles of the same microorganisms, and therefore in worker exposure to bioaerosols. The Vaud region of Switzerland is well suited to such a project, as weather conditions vary and agricultural land management practices are diverse at a small geographic scale. Methods. Bioaerosols and wheat dust were sampled during the wheat harvest of summer 2010 at 100 sites uniformly distributed across the Vaud region and representative of the different agricultural practices. 
Personal exposure was evaluated for different wheat-related activities: harvesting, grain unloading, straw baling, and the cleaning of harvesters and silos. Aerosols were sampled at a rate of 2 L/min for 15 min to 4 hours on a 5 µm PVC filter for estimating total inhaled dust, on a gelatine filter for the identification and quantification of molds, and on a 0.45 µm polycarbonate filter for endotoxin quantification. Altitude, temperature and average annual rainfall were recorded for each site. The physical and chemical characteristics of soils were determined using the methods in effect at Sol Council (Nyon). Total dust was quantified following the NIOSH 0500 method. Endotoxin activity was determined with the Limulus Amebocyte Lysate assay. Molds were identified by pyrosequencing of ITS2 amplicons generated from bioaerosol or wheat dust genomic DNA. Results & Conclusions. Our results confirm previous quantitative data on worker exposure to wheat dust. In addition, they show that crop workers are systematically exposed to complex mixtures of allergens, irritants and cytotoxic components. The novelty of our study is the systematic detection in the bioaerosols of molds such as Fusarium, a mycotoxin producer. The results are interpreted taking into account the agricultural practice, the phosphorus:carbon:nitrogen ratio of the soil, the altitude and the average number of rainy days per year.
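Gravimetric methods such as NIOSH 0500 yield an airborne concentration as the collected mass divided by the volume of air drawn through the filter (flow rate times sampling time). A minimal sketch of that arithmetic, using the study's sampling rate but an illustrative filter mass:

```python
# Airborne dust concentration from personal filter sampling:
# concentration = collected mass / sampled air volume.
# The 2 L/min flow rate and 4 h duration match the study's protocol;
# the 0.96 mg collected mass is an illustrative value, not study data.

def dust_concentration_mg_m3(mass_mg, flow_l_min, minutes):
    volume_m3 = flow_l_min * minutes / 1000.0  # litres -> cubic metres
    return mass_mg / volume_m3

print(dust_concentration_mg_m3(0.96, 2.0, 240))  # 2.0 mg/m^3
```

The same volume normalization applies to the endotoxin and mold counts collected on the gelatine and polycarbonate filters, which is what makes exposures from sampling runs of different durations comparable.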
Abstract:
In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison is performed for intense events that caused extensive damage to people and property. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event known as the "Montserrat-2000" event. The study is performed using forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the Catalonia regions affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift in the forecast error and to identify the error sources affecting each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work, including verification against a wider observational data set, is needed to support this statement.
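The non-parametric skill scores mentioned are derived from a 2x2 contingency table of forecast versus observed rain occurrence (hits, false alarms, misses, correct negatives). A minimal sketch of three common ones, with an illustrative table rather than the Montserrat-2000 counts:

```python
# Skill scores from a 2x2 forecast contingency table:
# a = hits, b = false alarms, c = misses, d = correct negatives.
# The table values below are illustrative, not from the study.

def skill_scores(a, b, c, d):
    n = a + b + c + d
    pod = a / (a + c)                   # probability of detection
    far = b / (a + b)                   # false alarm ratio
    a_random = (a + b) * (a + c) / n    # hits expected by chance
    ets = (a - a_random) / (a + b + c - a_random)  # equitable threat score
    return pod, far, ets

pod, far, ets = skill_scores(a=30, b=10, c=20, d=140)
print(round(pod, 2), round(far, 2), round(ets, 3))  # 0.6 0.25 0.4
```

Because each score weighs misses and false alarms differently, reporting several of them together, as done for the QPFs here, gives a fuller picture than any single number.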
Abstract:
Usual image fusion methods inject features from a high spatial resolution panchromatic sensor into every low spatial resolution multispectral band, trying to preserve spectral signatures while improving spatial resolution to that of the panchromatic sensor. The objective is to obtain the image that would be observed by a sensor with the same spectral response (i.e., spectral sensitivity and quantum efficiency) as the multispectral sensors and the spatial resolution of the panchromatic sensor. But in these methods, features from electromagnetic spectrum regions not covered by the multispectral sensors are injected into the fused bands, and the physical spectral responses of the sensors are not considered during this process. This produces undesirable effects, such as over-injection of spatial detail and slightly modified spectral signatures in some features. The authors present a technique that takes into account the physical electromagnetic spectrum responses of the sensors during the fusion process, producing images closer to the image obtained by the ideal sensor than those obtained by usual wavelet-based image fusion methods. This technique is used to define a new wavelet-based image fusion method.
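The generic wavelet injection scheme the authors improve upon can be sketched with a one-level Haar transform: keep the multispectral band's approximation coefficients and substitute the panchromatic band's detail coefficients. This illustrates the plain injection step only, not the authors' sensor-response-aware method:

```python
import numpy as np

# Generic wavelet-based pan-sharpening sketch: a one-level Haar
# transform of each band; the multispectral approximation is kept
# and the panchromatic detail coefficients are injected. This is
# the baseline scheme, not the sensor-aware technique of the paper.

def haar2(im):
    """One-level 2D Haar transform -> (approximation, detail bands)."""
    lo = (im[:, 0::2] + im[:, 1::2]) / 2      # column lowpass
    hi = (im[:, 0::2] - im[:, 1::2]) / 2      # column highpass
    ll = (lo[0::2] + lo[1::2]) / 2            # approximation
    lh = (lo[0::2] - lo[1::2]) / 2
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, (lh, hl, hh)

def ihaar2(ll, details):
    """Exact inverse of haar2."""
    lh, hl, hh = details
    lo = np.empty((2 * ll.shape[0], ll.shape[1]))
    hi = np.empty_like(lo)
    lo[0::2], lo[1::2] = ll + lh, ll - lh
    hi[0::2], hi[1::2] = hl + hh, hl - hh
    out = np.empty((lo.shape[0], 2 * lo.shape[1]))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi
    return out

def fuse(pan, ms_band):
    """Keep the multispectral approximation, inject panchromatic detail."""
    ll_ms, _ = haar2(ms_band)
    _, pan_detail = haar2(pan)
    return ihaar2(ll_ms, pan_detail)

pan = np.arange(64, dtype=float).reshape(8, 8)   # toy panchromatic band
ms = pan.round(-1)                               # toy coarser MS band
fused = fuse(pan, ms)
print(np.allclose(ihaar2(*haar2(pan)), pan))     # transform is invertible
```

The paper's criticism applies to exactly this substitution step: the injected `pan_detail` carries energy from spectral regions the multispectral band never sensed, which the proposed method weights by the sensors' physical spectral responses.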
Abstract:
The purposes of this study were to characterize the performance of a 3-dimensional (3D) ordered-subset expectation maximization (OSEM) algorithm in the quantification of left ventricular (LV) function with (99m)Tc-labeled agent gated SPECT (G-SPECT), the QGS program, and a beating-heart phantom and to optimize the reconstruction parameters for clinical applications. METHODS: A G-SPECT image of a dynamic heart phantom simulating the beating left ventricle was acquired. The exact volumes of the phantom were known and were as follows: end-diastolic volume (EDV) of 112 mL, end-systolic volume (ESV) of 37 mL, and stroke volume (SV) of 75 mL; these volumes produced an LV ejection fraction (LVEF) of 67%. Tomographic reconstructions were obtained after 10-20 iterations (I) with 4, 8, and 16 subsets (S) at full width at half maximum (FWHM) gaussian postprocessing filter cutoff values of 8-15 mm. The QGS program was used for quantitative measurements. RESULTS: Measured values ranged from 72 to 92 mL for EDV, from 18 to 32 mL for ESV, and from 54 to 63 mL for SV, and the calculated LVEF ranged from 65% to 76%. Overall, the combination of 10 I, 8 S, and a cutoff filter value of 10 mm produced the most accurate results. The plot of the measures with respect to the expectation maximization-equivalent iterations (I x S product) revealed a bell-shaped curve for the LV volumes and a reverse distribution for the LVEF, with the best results in the intermediate range. In particular, FWHM cutoff values exceeding 10 mm affected the estimation of the LV volumes. CONCLUSION: The QGS program is able to correctly calculate the LVEF when used in association with an optimized 3D OSEM algorithm (8 S, 10 I, and FWHM of 10 mm) but underestimates the LV volumes. 
However, various combinations of technical parameters, including a limited range of I and S (80-160 expectation maximization-equivalent iterations) and low cutoff values (< or =10 mm) for the gaussian postprocessing filter, produced results with similar accuracies and without clinically relevant differences in the LV volumes and the estimated LVEF.
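The OSEM update evaluated here multiplies the current estimate by the back-projected ratio of measured to predicted counts, cycling through ordered subsets of the projections. A minimal sketch on a toy linear system (not SPECT geometry, and not the QGS program):

```python
import numpy as np

# Ordered-subset expectation maximization (OSEM) sketch on a tiny
# noise-free system: x <- x * A_s^T(y_s / A_s x) / A_s^T 1, cycling
# over subsets s of the projection rows. Toy system matrix and data,
# illustrating the update rule only, not a 3D SPECT reconstruction.

A = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 1.]])
x_true = np.array([2., 1., 3.])
y = A @ x_true                                   # noise-free "projections"

x = np.ones(3)                                   # uniform initial estimate
subsets = [np.array([0, 1]), np.array([2, 3])]   # 2 subsets of 2 rows
for _ in range(200):
    for s in subsets:                            # one pass = 2 sub-iterations
        As, ys = A[s], y[s]
        x *= (As.T @ (ys / (As @ x))) / As.sum(axis=0)

print(np.allclose(A @ x, y, rtol=0.05))          # consistent data are matched
```

The iterations-times-subsets product (the "expectation maximization-equivalent iterations" of the abstract) controls how far this multiplicative update is pushed; the post-reconstruction Gaussian filter studied above has no counterpart in this toy sketch.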
Abstract:
An enormous burst of interest in the public health burden of chronic disease in Africa has emerged as a consequence of efforts to estimate global population health. Detailed estimates are now published for Africa as a whole and for each country on the continent. These data have formed the basis for warnings about sharp increases in cardiovascular disease (CVD) in the coming decades. In this essay we briefly examine the trajectory of social development on the continent and its consequences for the epidemiology of CVD and potential control strategies. Since full vital registration has only been implemented in segments of South Africa and the island nations of Seychelles and Mauritius - formally part of WHO-AFRO - mortality data are extremely limited. Numerous sample surveys have been conducted, but they often lack standardization or objective measures of health status. Trend data are even less informative. However, using the best quality data available, age-standardized trends in CVD are downward, and in the case of stroke, sharply so. While acknowledging that the extremely limited available data cannot be used as the basis for inference to the continent, we raise the concern that general estimates based on imputation to fill in the missing mortality tables may be even more misleading. No immediate remedies to this problem can be identified; however, bilateral collaborative efforts to strengthen local educational institutions and governmental agencies rank as the highest priority for near-term development.
Abstract:
Purpose: To investigate the differences between a fundus camera (Topcon TRC-50X) and a confocal scanning laser ophthalmoscope (Heidelberg Retina Angiograph (HRA)) in fundus autofluorescence (FAF) imaging (resolution and FAF characteristics). Methods: 105 eyes of 56 patients with various retinal diseases underwent FAF imaging with the HRA (488 nm exciter / 500 nm barrier filter) before fluorescein angiography (FFA) and with the Topcon fundus camera (580 nm exciter / 695 nm barrier filter) before and after FFA. The FAF images were compared for resolution and analysed for the influence of fixation stability and cataract. Hypo- and hyper-FAF behaviour was analysed for the healthy disc, the healthy fovea, and a variety of pathological features. Results: HRA images were of superior resolution in 18 eyes, while Topcon images were judged superior in 29 eyes; no difference was found in 58 eyes. Both poor fixation (p=0.009) and more advanced cataract (p=0.013) were associated with better image quality on Topcon. Images acquired by Topcon before and after FFA were identical (100%). The healthy disc was usually dark on HRA (72%) but showed mild autofluorescence on Topcon (85%). The healthy fovea showed hypo-FAF on HRA in 100% of eyes, while on Topcon it showed iso-FAF in 53%, mild hypo-FAF in 43%, and hypo-FAF as on HRA in 4%. No difference in FAF was found for geographic atrophy, pigment changes, and drusen, although Topcon images were often more detailed. Hyper-FAF due to serous exudation showed better on HRA. Cystic edema was visible only on HRA, as a petaloid hyper-FAF pattern, in 83% of cases, while only two eyes (17%) showed similar behavior on both HRA and Topcon images. Hard exudates caused hypo-FAF only on HRA and were hardly visible on Topcon; blockage by blood, however, was identical. Conclusions: The filter set of Topcon and its single image acquisition appear to be an advantage for patients with cataract and poor fixation, respectively. 
Preceding FFA does not alter the Topcon FAF image. Regarding FAF behavior, there are differences between the two systems which need to be taken into account when interpreting the images.
Abstract:
Prediction of species' distributions is central to diverse applications in ecology, evolution and conservation science. There is increasing electronic access to vast sets of occurrence records in museums and herbaria, yet little effective guidance on how best to use this information in the context of numerous approaches for modelling distributions. To meet this need, we compared 16 modelling methods over 226 species from 6 regions of the world, creating the most comprehensive set of model comparisons to date. We used presence-only data to fit models, and independent presence-absence data to evaluate the predictions. Along with well-established modelling methods such as generalised additive models, GARP, and BIOCLIM, we explored methods that either have been developed recently or have rarely been applied to modelling species' distributions. These include machine-learning methods and community models, both of which have features that may make them particularly well suited to noisy or sparse information, as is typical of species' occurrence data. Presence-only data were effective for modelling species' distributions for many species and regions. The novel methods consistently outperformed more established methods. The results of our analysis are promising for the use of data from museums and herbaria, especially as methods suited to the noise inherent in such data improve.
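Evaluating models fitted to presence-only data against independent presence-absence data, as in this comparison, commonly uses the area under the ROC curve (AUC): the probability that a randomly chosen presence site receives a higher predicted suitability than a randomly chosen absence site. A minimal rank-based sketch with illustrative scores:

```python
# AUC evaluation of a distribution model against presence-absence
# data: the probability that a presence site outscores an absence
# site, with ties counted as half. Suitability scores below are
# illustrative, not outputs of any of the 16 compared methods.

def auc(scores_presence, scores_absence):
    """Pairwise-comparison (Mann-Whitney) form of the ROC AUC."""
    wins = sum((p > a) + 0.5 * (p == a)
               for p in scores_presence for a in scores_absence)
    return wins / (len(scores_presence) * len(scores_absence))

presence = [0.9, 0.8, 0.6, 0.4]   # predicted suitability at presence sites
absence = [0.7, 0.3, 0.2, 0.1]    # predicted suitability at absence sites
print(auc(presence, absence))     # 0.875
```

An AUC of 0.5 means the model ranks sites no better than chance, which makes the measure usable even though the models here were fitted without any absence information.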