840 results for filtering


Relevance: 10.00%

Abstract:

The Liquid Argon Time Projection Chamber (LAr TPC) technique is a promising technology for future neutrino detectors. At the LHEP of the University of Bern (Switzerland), an R&D program towards large detectors is ongoing. The main goal is to show the feasibility of drift paths of many meters; to this end, a liquid argon TPC with a 5 m drift distance was constructed. Many other aspects of liquid argon TPC technology are also investigated, such as a new device to generate high voltage in liquid argon (a Greinacher circuit), a recirculation filtering system, and multi-photon ionization of liquid argon with a UV laser. Two detectors were built: a medium-size prototype for specific detector technology studies, and ARGONTUBE, a 5 m long device.
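The Greinacher circuit mentioned above is a capacitor-diode voltage multiplier. As a rough illustration (not from the paper, and ignoring ripple and load effects), the ideal no-load output simply scales with the number of stages:

```python
def greinacher_output(v_peak, n_stages):
    """Ideal no-load output of an n-stage Greinacher (Cockcroft-Walton)
    voltage multiplier: each stage contributes 2 * V_peak."""
    return 2 * n_stages * v_peak

# e.g. a 50-stage multiplier driven at 1 kV peak ideally yields 100 kV
print(greinacher_output(1e3, 50))  # 100000.0
```

In practice the usable voltage is lower, since ripple and voltage drop grow rapidly with the stage count and the load current.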

Relevance: 10.00%

Abstract:

In this paper, we propose novel methodologies for the automatic segmentation and recognition of multi-food images. The proposed methods implement the first modules of a carbohydrate counting and insulin advisory system for type 1 diabetic patients. Initially, the plate is segmented using pyramidal mean-shift filtering and a region growing algorithm. Each of the resulting segments is then described by both color and texture features and classified by a support vector machine into one of six major food classes. Finally, a modified version of the Huang and Dom evaluation index is proposed, addressing the particular needs of the food segmentation problem. The experimental results prove the effectiveness of the proposed method, achieving a segmentation accuracy of 88.5% and a recognition rate of 87%.
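To illustrate the region-growing step, here is a minimal sketch (not the authors' implementation): a segment grows from a seed pixel by absorbing 4-connected neighbors whose intensity lies within a tolerance of the seed value.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from `seed`, adding 4-connected pixels whose
    intensity is within `tol` of the seed value (simplified sketch)."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = img[seed]
    mask[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx] \
                    and abs(img[ny, nx] - seed_val) <= tol:
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask

# Toy "plate": a dark food item in the top-left corner of a bright region.
img = np.array([[10, 11, 50],
                [12, 10, 52],
                [50, 51, 53]], dtype=float)
mask = region_grow(img, (0, 0), tol=5)
print(mask.sum())  # 4 pixels belong to the grown segment
```

In the paper this step follows pyramidal mean-shift filtering, which flattens intensity variations so that such a simple homogeneity criterion becomes reliable.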

Relevance: 10.00%

Abstract:

This paper presents the application of a variety of techniques to study jet substructure. The performance of various modified jet algorithms, or jet grooming techniques, for several jet types and event topologies is investigated for jets with transverse momentum larger than 300 GeV. Properties of jets subjected to the mass-drop filtering, trimming, and pruning algorithms are found to have reduced sensitivity to multiple proton-proton interactions, to be more stable at high luminosity, and to improve the physics potential of searches for heavy boosted objects. Studies of the expected discrimination power of jet mass and jet substructure observables in searches for new physics are also presented. Event samples enriched in boosted W and Z bosons and top-quark pairs are used to study both the individual jet invariant mass scales and the efficacy of algorithms to tag boosted hadronic objects. The analyses presented use the full 2011 ATLAS dataset, corresponding to an integrated luminosity of 4.7 +/- 0.1 /fb from proton-proton collisions produced by the Large Hadron Collider at a center-of-mass energy of sqrt(s) = 7 TeV.
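The core idea of trimming can be sketched very schematically (this assumes subjets have already been found; real trimming reclusters the jet's constituents with a smaller radius parameter and is not this simple):

```python
def trim_jet(subjet_pts, fcut=0.05):
    """Schematic jet trimming: discard subjets carrying less than a
    fraction `fcut` of the total jet transverse momentum, so soft
    contamination (e.g. from pile-up) is removed."""
    total = sum(subjet_pts)
    return [pt for pt in subjet_pts if pt >= fcut * total]

# A soft 5 GeV subjet is dropped from a 400 GeV jet (threshold 20 GeV).
trimmed = trim_jet([250.0, 145.0, 5.0])
print(trimmed)  # [250.0, 145.0]
```

The jet mass recomputed from the surviving subjets is then much less sensitive to multiple proton-proton interactions, which is the stability property the paper quantifies.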

Relevance: 10.00%

Abstract:

Manual counting of bacterial colony forming units (CFUs) on agar plates is laborious and error-prone. We therefore implemented a colony counting system with a novel segmentation algorithm to discriminate bacterial colonies from blood and other agar plates. Colony counter hardware was designed, and a novel segmentation algorithm was written in MATLAB. In brief, pre-processing with top-hat filtering to obtain a uniform background was followed by the segmentation step, during which the colony images were extracted from the blood agar and individual colonies were separated. A Bayes classifier was then applied to count the final number of bacterial colonies, as some of the colonies could still be concatenated into larger groups. To assess the accuracy and performance of the colony counter, we tested automated colony counting on different agar plates with known CFU numbers of S. pneumoniae, P. aeruginosa and M. catarrhalis and showed excellent performance.
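The top-hat pre-processing step can be illustrated in one dimension with SciPy's grayscale morphology (an illustrative analogue, not the authors' MATLAB code): narrow bright "colonies" survive the white top-hat, while the slowly varying illumination background is removed.

```python
import numpy as np
from scipy import ndimage

x = np.arange(100, dtype=float)
background = 0.05 * x            # slow illumination gradient across the plate
signal = np.zeros_like(x)
signal[[20, 60]] = 10.0          # two narrow "colonies"
image = background + signal

# White top-hat = image minus its morphological opening. A structuring
# element wider than any colony removes the colonies during the opening,
# so the difference keeps the colonies on a flat background.
flat = ndimage.white_tophat(image, size=11)
print(flat[20], flat[60], flat[40])  # colony peaks kept, background ~0
```

With the background flattened this way, a single global threshold suffices to extract colony candidates for the subsequent separation and Bayes-classification steps.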

Relevance: 10.00%

Abstract:

PURPOSE: The aim of this work is to derive a theoretical framework for quantitative noise and temporal fidelity analysis of time-resolved k-space-based parallel imaging methods. THEORY: An analytical formalism of the noise distribution is derived, extending the existing g-factor formulation for non-time-resolved generalized autocalibrating partially parallel acquisition (GRAPPA) to time-resolved k-space-based methods. The noise analysis considers temporal noise correlations and is further accompanied by a temporal filtering analysis. METHODS: All methods are derived and presented for k-t-GRAPPA and PEAK-GRAPPA. A sliding window reconstruction and non-time-resolved GRAPPA are taken as references. Statistical validation is based on series of pseudo-replica images. The analysis is demonstrated on a short-axis cardiac CINE dataset. RESULTS: The superior signal-to-noise performance of time-resolved over non-time-resolved parallel imaging methods, at the expense of temporal frequency filtering, is analytically confirmed. Further, the different temporal frequency filter characteristics of k-t-GRAPPA, PEAK-GRAPPA, and sliding window reconstruction are revealed. CONCLUSION: The proposed analysis of noise behavior and temporal fidelity establishes a theoretical basis for quantitative evaluation of time-resolved reconstruction methods. The presented theory therefore allows for comparison among time-resolved parallel imaging methods as well as with non-time-resolved methods. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
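The pseudo-replica validation mentioned above can be sketched generically (this is an illustration of the statistical method, not the authors' reconstruction code): noise of known strength is repeatedly added to the input data, each noisy copy is passed through the reconstruction, and the pixelwise standard deviation over the outputs estimates the reconstructed noise map.

```python
import numpy as np

def pseudo_replica_noise(recon, data, sigma, n_rep=200, seed=0):
    """Pseudo-replica noise estimation: push many noisy copies of `data`
    through the reconstruction `recon` and take the pixelwise standard
    deviation of the outputs."""
    rng = np.random.default_rng(seed)
    reps = np.stack([recon(data + rng.normal(0.0, sigma, data.shape))
                     for _ in range(n_rep)])
    return reps.std(axis=0)

# Toy linear "reconstruction": a 3-point temporal average, which (like
# the temporal filtering in k-t methods) trades temporal fidelity for a
# noise reduction of about sqrt(3).
def recon(d):
    return (np.roll(d, -1) + d + np.roll(d, 1)) / 3.0

noise_map = pseudo_replica_noise(recon, np.zeros(64), sigma=1.0)
print(noise_map.mean())  # close to 1/sqrt(3)
```

The same machinery, applied to a real k-t-GRAPPA or PEAK-GRAPPA reconstruction, yields the empirical noise maps against which an analytical g-factor formalism can be validated.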

Relevance: 10.00%

Abstract:

Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework for unified detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
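The time-series Bayesian estimation underlying this framework can be illustrated with a minimal grid-based Bayes filter (illustrative only; the paper combines Active Testing with Bayesian filtering rather than this plain recursion): each frame, the belief over target positions is predicted forward with a motion model and then updated with the measurement likelihood.

```python
import numpy as np

def bayes_filter_step(belief, motion_kernel, likelihood):
    """One step of a discrete Bayes filter over grid positions:
    predict by convolving the belief with the motion model, then
    update by multiplying with the measurement likelihood."""
    predicted = np.convolve(belief, motion_kernel, mode="same")
    posterior = predicted * likelihood
    return posterior / posterior.sum()

n = 20
belief = np.full(n, 1.0 / n)          # uninformative prior ("target lost")
motion = np.array([0.25, 0.5, 0.25])  # small random walk per frame
for _ in range(5):                    # repeated noisy measurements near cell 7
    likelihood = np.exp(-0.5 * ((np.arange(n) - 7) / 2.0) ** 2)
    belief = bayes_filter_step(belief, motion, likelihood)
print(belief.argmax())  # 7
```

Starting from a uniform prior mimics the detection phase (target location unknown), while the sharpening posterior over subsequent frames mimics tracking; entropy of the belief is a natural measure of how far along that detection-to-tracking transition the filter is.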

Relevance: 10.00%

Abstract:

Plant functional traits reflect different evolutionary responses to environmental variation, and among extant species determine the outcomes of interactions between plants and their environment, including other plant species. Thus, combining phylogenetic and trait-based information can be a powerful approach for understanding community assembly processes across a range of spatial scales. We used this approach to investigate tree community composition at Phou Khao Khouay National Park (18°14'-18°32'N; 102°38'-102°59'E), Laos, where several distinct forest types occur in close proximity. The aim of our study was to examine patterns of plant community assembly across the strong environmental gradients evident at our site. We hypothesized that differences in tree community composition were being driven by an underlying gradient in soil conditions. Thus, we predicted that environmental filtering would predominate at the site and that the filtering would be strongest on sandier soil with low pH, as these are the conditions least favorable to plant growth. We surveyed eleven 0.25 ha (50x50 m) plots for all trees above 10 cm dbh (1221 individual trees from 47 families, 70 genera and 123 species) and sampled soils in each plot. For each species in the community, we measured 11 commonly studied plant functional traits covering both the leaf and wood economic spectrum traits, and we reconstructed a phylogenetic tree for 115 of the species in the community using rbcL and matK sequences downloaded from GenBank (the other species were not available). Finally, we compared the distribution of trait values and species at two scales (among plots and among 10x10 m subplots) to examine trait and phylogenetic community structures. Although there was strong evidence that an underlying soil gradient was determining patterns of species composition at the site, our results did not support the hypothesis that environmental filtering dominated community assembly processes.
For the measured plant functional traits, there was no consistent pattern of trait dispersion across the site, either when traits were considered individually or when combined in a multivariate analysis. However, there was a significant correlation between the degree of phylogenetic dispersion and the first principal component axis (PCA1) of the soil parameters. Moreover, the more phylogenetically clustered plots were on sandier soils with lower pH. Hence, we suggest that the community assembly processes across our site may reflect the influence of more conserved traits that we did not measure. Nevertheless, our results are equivocal and other interpretations are possible. Our study illustrates some difficulties in combining trait and phylogenetic approaches that may result from the complexities of integrating spatial and evolutionary processes that vary at different scales.

Relevance: 10.00%

Abstract:

We explore the feasibility of obtaining a spatially resolved picture of Ca2+ inward currents (ICa) in multicellular cardiac tissue by differentiating optically recorded Ca2+ transients that accompany propagating action potentials. Patterned growth strands of neonatal rat ventricular cardiomyocytes were stained with the Ca2+ indicators Fluo-4 or Fluo-4FF. Preparations were stimulated at 1 Hz, and Ca2+ transients were recorded with high spatiotemporal resolution (50 μm, 2 kHz analog bandwidth) with a photodiode array. Signals were differentiated after appropriate digital filtering. Differentiation of Ca2+ transients resulted in optically recorded calcium currents (ORCCs) that carried the temporal and pharmacological signatures of L-type Ca2+ inward currents: the time to peak amounted to ∼2.1 ms (Fluo-4FF) and ∼2.4 ms (Fluo-4), full-width at half-maximum was ∼8 ms, and ORCCs were completely suppressed by 50 μmol/L CdCl2. Also, and as reported before from patch-clamp studies, caffeine reversibly depressed the amplitude of ORCCs. The results demonstrate that the differentiation of Ca2+ transients can be used to obtain a spatially resolved picture of the initial phase of ICa in cardiac tissue and to assess relative changes of activation/fast inactivation of ICa following pharmacological interventions.
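The "filter, then differentiate" signal chain can be sketched on a synthetic transient (hypothetical waveform parameters; only the 2 kHz sampling rate is taken from the recordings). Smoothing before differentiation matters because differentiation amplifies high-frequency noise.

```python
import numpy as np
from scipy.signal import savgol_filter

fs = 2000.0                                   # 2 kHz sampling
t = np.arange(0, 0.1, 1 / fs)
# Synthetic Ca2+ transient: fast sigmoidal upstroke at t = 50 ms.
transient = 1.0 / (1.0 + np.exp(-(t - 0.05) / 0.002))
noisy = transient + np.random.default_rng(0).normal(0, 0.02, t.size)

# Digital low-pass (Savitzky-Golay) before differentiation, so the
# derivative is not dominated by amplified measurement noise.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)
orcc = np.gradient(smoothed, 1 / fs)          # "optically recorded current"
print(t[orcc.argmax()])                       # derivative peaks at the upstroke
```

The derivative's peak time and width play the role of the time-to-peak and full-width-at-half-maximum figures quoted for the ORCCs.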

Relevance: 10.00%

Abstract:

Over 250 Mendelian traits and disorders caused by rare alleles have been mapped in the canine genome. Although each disease is rare in the dog as a species, they are collectively common and have a major impact on canine health. With SNP-based genotyping arrays, genome-wide association studies (GWAS) have proven to be a powerful method to map the genomic region of interest when 10-20 cases and 10-20 controls are available. However, to identify the genetic variant in associated regions, fine-mapping and targeted re-sequencing are required. Here we present a new approach using whole-genome sequencing (WGS) of a family trio without a prior GWAS. As a proof of concept, we chose an autosomal recessive disease known as hereditary footpad hyperkeratosis (HFH) in Kromfohrländer dogs. To our knowledge, this is the first time this family-trio WGS approach has successfully been used to identify a genetic variant that perfectly segregates with a canine disorder. The sequencing of three Kromfohrländer dogs from a family trio (an affected offspring and both its healthy parents) resulted in an average genome coverage of 9.2X per individual. After applying stringent filtering criteria for candidate causative coding variants, 527 single nucleotide variants (SNVs) and 15 indels were found to be homozygous in the affected offspring and heterozygous in the parents. Using the software packages ANNOVAR and SIFT to functionally annotate coding sequence differences and to predict their functional effect resulted in seven candidate variants located in six different genes. Of these, only FAM83G:c.155G>C (p.R52P) was found to be concordant in eight additional cases and 16 healthy Kromfohrländer dogs.
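The trio filtering criterion (homozygous for the alternate allele in the affected offspring, heterozygous in both healthy parents) can be sketched with hypothetical variant records — the IDs and genotypes below are made up for illustration:

```python
def recessive_trio_filter(variants):
    """Keep variants consistent with an autosomal recessive model:
    homozygous alternate in the affected offspring and heterozygous
    in both healthy parents. Genotypes are allele pairs, e.g. ("G", "C")."""
    def hom_alt(gt, ref):
        return gt[0] == gt[1] and gt[0] != ref
    def het(gt):
        return gt[0] != gt[1]
    return [v for v in variants
            if hom_alt(v["child"], v["ref"])
            and het(v["mother"]) and het(v["father"])]

# Hypothetical records; only the first fits the recessive trio model.
variants = [
    {"id": "chr5:123", "ref": "G",
     "child": ("C", "C"), "mother": ("G", "C"), "father": ("G", "C")},
    {"id": "chr5:456", "ref": "A",
     "child": ("A", "T"), "mother": ("A", "T"), "father": ("A", "A")},
]
hits = recessive_trio_filter(variants)
print([v["id"] for v in hits])  # ['chr5:123']
```

Applied genome-wide, this inheritance filter is what reduces millions of variants to the 527 SNVs and 15 indels reported above, before functional annotation narrows the list further.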

Relevance: 10.00%

Abstract:

Aim: Our aims were to compare the composition of testate amoeba (TA) communities from Santa Cruz Island, Galápagos Archipelago, which likely exist only as a result of anthropogenic habitat transformation, with similar naturally occurring communities from northern and southern continental peatlands. Additionally, we aimed to assess the importance of niche-based and dispersal-based processes in determining community composition and taxonomic and functional diversity. Location: The humid highlands of the central island of Santa Cruz, Galápagos Archipelago. Methods: We survey the alpha, beta and gamma taxonomic and functional diversities of TA, and the changes in functional traits along a gradient of wet to dry habitats. We compare the TA community composition, abundance and frequency recorded in the insular peatlands with those recorded in continental peatlands of the Northern and Southern Hemispheres. We use generalized linear models to determine how environmental conditions influence taxonomic and functional diversity as well as the mean values of functional traits within communities. We finally apply variance partitioning to assess the relative importance of niche- and dispersal-based processes in determining community composition. Results: TA communities on Santa Cruz Island differed from their Northern Hemisphere and South American counterparts, with most genera considered characteristic of Northern Hemisphere and South American Sphagnum peatlands missing or very rare in the Galápagos. Functional traits were most strongly correlated with elevation and site topography, and alpha functional diversity with the type of material sampled and site topography. Community composition was more strongly correlated with spatial variables than with environmental ones. Main conclusions: TA communities of the Sphagnum peatlands of Santa Cruz Island, and the mechanisms shaping these communities, contrast with Northern Hemisphere and South American peatlands.
Soil moisture was not a strong predictor of community composition, most likely because rainfall and clouds provide sufficient moisture. Dispersal limitation was more important than environmental filtering because of the isolation of the insular peatlands from continental ones and the young ecological history of these ecosystems.

Relevance: 10.00%

Abstract:

With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches leverage advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.
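A toy, one-dimensional sketch of the variance-guided idea behind such adaptive reconstruction (a generic stand-in; the surveyed methods use far more sophisticated filters and error estimators): per pixel, apply a wide filter where the Monte Carlo sample variance is high and a narrow one where it is low.

```python
import numpy as np

def adaptive_filter(samples, wide=9, narrow=3):
    """Variance-guided smoothing of a Monte Carlo estimate.
    `samples` has shape (n_samples, n_pixels); pixels whose estimated
    error is above the median get the wider box filter."""
    mean = samples.mean(axis=0)
    var = samples.var(axis=0) / samples.shape[0]   # variance of the mean
    def box(sig, w):
        return np.convolve(sig, np.ones(w) / w, mode="same")
    high_noise = var > np.median(var)
    return np.where(high_noise, box(mean, wide), box(mean, narrow))

# Flat scene: left half rendered with low noise, right half with high noise.
rng = np.random.default_rng(1)
sigma = np.where(np.arange(200) < 100, 0.1, 1.0)
samples = rng.normal(0.0, sigma, size=(16, 200))
out = adaptive_filter(samples)
print(out.shape)
```

The per-pixel error estimate drives the filter selection, which is the essential mechanism shared by the image-space adaptive sampling and reconstruction algorithms the course surveys.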

Relevance: 10.00%

Abstract:

While there are reports of developing sexual relationships on the Internet (I) among MSM, few reports have examined the process of developing sexual relationships on the Internet and compared it to that in real life (IRL). This study examines the process to provide insight into how MSM make decisions about courtship, engage in negotiations for sex, and choose sexual partners, and compares the sexual risks taken in Internet versus IRL negotiation. Using a self-selected, national-level convenience sample (n=1001) of MSM recruited through the Internet, we systematically explored the different steps of Internet and IRL dating in a flow chart portraying the process of filtering, courtship and/or negotiation for sex. Risk behaviors in both environments are presented, along with interactions that create predictable sequences or "scripts". These sequences constitute 'filtering' and 'sexual positioning'. Comparisons between the Internet and IRL showed consistent differences in discussion of HIV/STD status for all variables except 'unprotected sex' (i.e., no condom use). There was more communication on the Internet with regard to self-revealing information and variables related to reducing risks, which enables 'filtering' (including serosorting). The data indicate more steps in the Internet process, providing more complex, multiple opportunities to filter and position with regard not only to HIV/STD risk but also to negotiate complementary sexual interests. The study established a pattern of MSM's courtship or negotiation for sex, a pattern of acquisition, and more negotiation on the Internet. The data suggest negotiation opportunities that could inform interventions advising people how to negotiate safely.
Previous studies have reviewed MSM and drug use. This study reviews the process of drug use associated with sexual behavior on the Internet and in real life, using the same self-selected convenience sample of MSM (n=1001) recruited nationwide through the Internet. Data on MSM and drugs illustrate the Internet being used as a tool to filter for drug use among MSM. MSM's drug use in both environments highlights the use of sexual performance drugs, with an IRL pursuit of intimacy or negotiation for sex. IRL encounters were more likely to involve drug use (both recreational and sexual performance-enhancing) than Internet encounters. This may be because more IRL meetings occur at bars, clubs or parties where drug use is a norm. Compared with IRL, the Internet may provide a venue for persons who do not want to use drugs to select partners with similar attitudes, suggesting that filtering may occur as part of Internet negotiation. The data indicated that IRL participants got drunk or high before having sex in the past 60 days significantly more often than Internet participants. Age did not alter this pattern of results. Thus, drug filtering is not really recreational drug filtering or selecting for PNP, but appears to be situationally based: it should perhaps be seen as another form of filtering to select drug-free partners, rather than using the Internet to specifically recruit and interact with other recreational drug users.

Relevance: 10.00%

Abstract:

In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences in genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously raises a multiple-testing problem and yields false-positive results. Although this problem can be effectively dealt with through several approaches, such as Bonferroni correction, permutation testing and false discovery rates, patterns of the joint effects of several genes, each with a weak effect, might not be determined. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for the millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in big data sets where the number of feature SNPs far exceeds the number of observations. In this study, we take two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method, and then we performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed chi-square tests to look at the relationship between each SNP and disease from another point of view.
In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of small subsets (one SNP, two SNPs, or three-SNP subsets based on the best 100 composite 2-SNPs) can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, due to overfitting from observing more complex subset states. Our results also indicate that HMSS, as a criterion to evaluate the classification ability of a function, can be used on imbalanced data without modifying the original dataset, in contrast to classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and that its ability to detect the target status is superior to that of traditional LDA in this study. From our results we can see that the best test probability-HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be not less than 0.4. On the other hand, the highest test accuracy of sIB for diagnosing a disease among cases can reach 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.
A further genome-wide association study through chi-square tests shows that no significant SNPs are detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Study results in WTCCC detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07. Although our classification methods can achieve high accuracy in the study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or a more efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability, and that SNPs with good discriminant power are not necessarily causal markers for the disease.
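The HMSS criterion used for the feature filtering above follows directly from a confusion matrix; a small sketch with illustrative numbers shows why it is robust to class imbalance, where plain accuracy is not:

```python
def hmss(tp, fn, tn, fp):
    """Harmonic mean of sensitivity and specificity: a criterion for
    imbalanced data that penalizes ignoring the minority class."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    if sens + spec == 0:
        return 0.0
    return 2 * sens * spec / (sens + spec)

# A classifier that labels everything "control" scores 90% accuracy on
# a 10-case / 90-control dataset, yet its HMSS is 0.
print(hmss(tp=0, fn=10, tn=90, fp=0))   # 0.0
print(hmss(tp=8, fn=2, tn=72, fp=18))   # 0.8 (sens = spec = 0.8)
```

Because the harmonic mean collapses to zero when either sensitivity or specificity is zero, HMSS can rank candidate SNP subsets on imbalanced case/control data without resampling or reweighting the original dataset.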

Relevance: 10.00%

Abstract:

Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a noninvasive technique for quantitative assessment of the integrity of the blood-brain barrier and blood-spinal cord barrier (BSCB) in the presence of central nervous system pathologies. However, the results of DCE-MRI show substantial variability. The high variability can be caused by a number of factors, including inaccurate T1 estimation, insufficient temporal resolution and poor contrast-to-noise ratio. My thesis work is to develop improved methods to reduce the variability of DCE-MRI results. To obtain a fast and accurate T1 map, the Look-Locker acquisition technique was implemented with a novel, truly centric k-space segmentation scheme. In addition, an original multi-step curve fitting procedure was developed to increase the accuracy of T1 estimation. A view-sharing acquisition method was implemented to increase temporal resolution, and a novel normalization method was introduced to reduce image artifacts. Finally, a new clustering algorithm was developed to reduce apparent noise in the DCE-MRI data. The performance of these proposed methods was verified by simulations and phantom studies. As part of this work, the proposed techniques were applied to an in vivo DCE-MRI study of experimental spinal cord injury (SCI). These methods have shown robust results and allow quantitative assessment of regions with very low vascular permeability. In conclusion, application of the improved DCE-MRI acquisition and analysis methods developed in this thesis can improve the accuracy of DCE-MRI results.

Relevance: 10.00%

Abstract:

Accurate calculation of the absorbed dose to target tumors and normal tissues in the body is an important requirement for establishing fundamental dose-response relationships for radioimmunotherapy. Two major obstacles have been the difficulty of obtaining an accurate patient-specific 3-D activity map in vivo and of calculating the resulting absorbed dose. This study investigated a methodology for 3-D internal dosimetry which integrates the 3-D biodistribution of the radionuclide acquired from SPECT with a dose-point kernel convolution technique to provide the 3-D distribution of absorbed dose. Accurate SPECT images were reconstructed with appropriate methods for noise filtering, attenuation correction, and Compton scatter correction. The SPECT images were converted into activity maps using a calibration phantom. The activity map was convolved with a 131-I dose-point kernel using a 3-D fast Fourier transform to yield a 3-D distribution of absorbed dose. The 3-D absorbed dose map was then processed to provide the absorbed dose distribution in regions of interest. This methodology can provide heterogeneous distributions of absorbed dose in volumes of any size and shape with nonuniform distributions of activity. Comparison of the activities quantitated by our SPECT methodology with true activities in an Alderson abdominal phantom (with spleen, liver, and a spherical tumor) yielded errors of -16.3% to 4.4%. Volume quantitation errors ranged from -4.0% to 5.9% for volumes greater than 88 ml. The percentage differences between the average absorbed dose rates calculated by this methodology and the MIRD S-values were 9.1% for the liver, 13.7% for the spleen, and 0.9% for the tumor. Good agreement (percent differences less than 8%) was found between the absorbed dose due to penetrating radiation calculated by this methodology and TLD measurements.
More accurate estimates of the 3-D distribution of absorbed dose can be used as a guide in specifying the minimum activity to be administered to patients to deliver a prescribed absorbed dose to the tumor without exceeding the toxicity limits of normal tissues.
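The dose-point kernel convolution via 3-D FFT can be sketched with a toy kernel (the real 131-I kernel comes from published radiation-transport tables; the 1/r² fall-off below is purely illustrative):

```python
import numpy as np
from scipy.signal import fftconvolve

# Toy 3-D activity map: a single "hot" voxel (point source).
activity = np.zeros((16, 16, 16))
activity[8, 8, 8] = 1.0

# Toy isotropic dose-point kernel on a 9x9x9 grid, falling off as 1/r^2;
# the self-dose voxel at r = 0 is given the largest value.
z, y, x = np.mgrid[-4:5, -4:5, -4:5]
r2 = x**2 + y**2 + z**2
kernel = 1.0 / np.maximum(r2, 1)
kernel[4, 4, 4] = 2.0

# 3-D FFT-based convolution of the activity map with the kernel yields
# the absorbed-dose map in O(N log N) rather than O(N^2).
dose = fftconvolve(activity, kernel, mode="same")
print(np.unravel_index(dose.argmax(), dose.shape))  # (8, 8, 8)
```

With a realistic kernel and a SPECT-derived activity map in place of the point source, the same convolution produces the heterogeneous 3-D absorbed-dose distributions described above.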