822 results for Multicast Filtering
Abstract:
Background: The aim of this study was to evaluate the midterm biocompatibility of a new X-shaped zirconium implant in an animal model of glaucoma surgery. Methods: Preoperatively, ultrasound biomicroscopy (UBM), intraocular pressure (IOP) and outflow facility (OF) data were acquired. At surgery, one eye was chosen randomly to receive an implant, while the other received none. Ten rabbits went through a 1-, 2-, 3-, 4- and 6-month follow-up. IOP was measured regularly, and UBM was performed at 1, 3 and 6 months after surgery. At the end of the follow-up, OF was measured again and histology sections were analyzed. Results: IOP control was satisfactory in both groups, while OF initially increased at month 1 before returning to preoperative values. Eyes with implants had larger filtration blebs, which decreased faster than in eyes without the implant. Drainage vessel density, inflammatory cell number and fibrosis were higher in tissues near the implant. Conclusions: The zirconium implant initially promoted the positive effects of the surgery (IOP control, OF increase). Nevertheless, after several months, foreign body reactions and fibrosis had occurred around some implants, restraining the early benefit of the procedure. Modifications of the implant geometry could enhance the overall success rate.
Abstract:
Anti-doping authorities have high expectations of the athlete steroidal passport (ASP) for detecting anabolic-androgenic steroid misuse. However, it is still limited to the monitoring of known, well-established compounds and might greatly benefit from the discovery of new relevant biomarker candidates. In this context, steroidomics opens the way to the untargeted simultaneous evaluation of a large number of compounds. Analytical platforms combining the performance of ultra-high pressure liquid chromatography (UHPLC) and the high mass-resolving power of quadrupole time-of-flight (QTOF) mass spectrometers are particularly well suited for this purpose. An untargeted steroidomic approach was proposed to analyse urine samples from a clinical trial for the discovery of relevant biomarkers of testosterone undecanoate oral intake. Automatic peak detection was performed, and a filter of reference steroid metabolite mass-to-charge ratio (m/z) values was applied to the raw data to ensure the selection of a subset of steroid-related features. Chemometric tools were applied for the filtering and analysis of UHPLC-QTOF-MS(E) data. Time kinetics could be assessed with N-way projections to latent structures discriminant analysis (N-PLS-DA), and a detection window was confirmed. Orthogonal projections to latent structures discriminant analysis (O-PLS-DA) classification models were evaluated in a second step to assess the predictive power of both known metabolites and unknown compounds. A shared and unique structure plot (SUS-plot) analysis was performed to select the most promising unknown candidates, and receiver operating characteristic (ROC) curves were computed to assess the specificity criteria applied in routine doping control. This approach underlined the pertinence of monitoring both glucuronide and sulphate steroid conjugates and including them in the athlete's passport, while promising biomarkers were also highlighted.
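The area under a ROC curve such as those used above can be computed directly as a Mann-Whitney rank statistic: the probability that a randomly chosen positive sample scores higher than a randomly chosen negative one. A minimal sketch (the biomarker intensities below are hypothetical, not from the study):

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive sample scores
    higher than a randomly chosen negative one (ties count 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical biomarker intensities: after intake vs. baseline samples.
auc = roc_auc([5.1, 4.8, 6.0], [2.0, 3.1, 4.9])
```

An AUC of 1.0 means perfect separation of the two groups; 0.5 means the candidate biomarker carries no discriminating information.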
Abstract:
Amplified Fragment Length Polymorphism (AFLP) is a cheap and efficient protocol for generating large sets of genetic markers. This technique has become increasingly used during the last decade in various fields of biology, including population genomics, phylogeography, and genome mapping. Here, we present RawGeno, an R library dedicated to the automated scoring of AFLPs (i.e., the coding of electropherogram signals into ready-to-use datasets). Our program includes a complete suite of tools for binning, editing, visualizing, and exporting results obtained from AFLP experiments. RawGeno can be used either from the command line, for scripted analysis routines, or through a user-friendly graphical user interface. We describe the whole RawGeno pipeline along with recommendations for (a) setting up the analysis of electropherograms in combination with PeakScanner, a program freely distributed by Applied Biosystems; (b) performing quality checks; (c) defining bins and proceeding to scoring; (d) filtering nonoptimal bins; and (e) exporting results in different formats.
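The core scoring step such a pipeline automates — grouping fragment sizes into bins and coding each sample's presence/absence — can be sketched as follows. This is a simplified illustration in Python, not RawGeno's actual R code; the `tolerance` parameter and the sample data are hypothetical:

```python
def bin_and_score(samples, tolerance=1.0):
    """Group AFLP fragment sizes into bins and score presence/absence.

    samples: dict mapping sample name -> list of fragment sizes (bp).
    tolerance: maximum size gap for neighbouring peaks to share a bin.
    Returns (bin_centers, score_matrix) where score_matrix[sample] is a
    list of 0/1 calls, one per bin.
    """
    # Pool all peaks, scan in size order, and open a new bin whenever
    # the gap to the previous peak exceeds the tolerance.
    all_peaks = sorted(size for sizes in samples.values() for size in sizes)
    bins = []
    for size in all_peaks:
        if bins and size - bins[-1][-1] <= tolerance:
            bins[-1].append(size)
        else:
            bins.append([size])
    centers = [sum(b) / len(b) for b in bins]
    # Score each sample: 1 if it has a peak within tolerance of the bin center.
    matrix = {}
    for name, sizes in samples.items():
        matrix[name] = [int(any(abs(s - c) <= tolerance for s in sizes))
                        for c in centers]
    return centers, matrix

# Hypothetical electropherogram peak sizes for three individuals.
samples = {"ind1": [100.1, 150.4, 200.2],
           "ind2": [100.3, 200.0],
           "ind3": [150.2, 199.8]}
centers, matrix = bin_and_score(samples)
```

Real scoring must additionally handle the quality checks and nonoptimal-bin filtering the abstract mentions (e.g. bins wider than the technical size error).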
Abstract:
Aim Conservation strategies are in need of predictions that capture spatial community composition and structure. Currently, the methods used to generate these predictions generally focus on deterministic processes and omit important stochastic processes and other unexplained variation in model outputs. Here we test a novel approach to community models that accounts for this variation and determine how well it reproduces observed properties of alpine butterfly communities. Location The western Swiss Alps. Methods We propose a new approach to processing probabilistic predictions derived from stacked species distribution models (S-SDMs) in order to predict and assess the uncertainty in predictions of community properties, and we test its utility against a traditional threshold-based approach. We used mountain butterfly communities spanning a large elevation gradient as a case study and evaluated the ability of our approach to model the species richness and phylogenetic diversity of communities. Results S-SDMs reproduced the observed decrease in phylogenetic diversity and species richness with elevation, syndromes of environmental filtering. The prediction accuracy of community properties varied along the environmental gradient: variability in predictions of species richness was higher at low elevation, while it was lower for phylogenetic diversity. Our approach allowed mapping the variability in species richness and phylogenetic diversity projections. Main conclusion Using our probabilistic approach to process species distribution model outputs to reconstruct communities furnishes an improved picture of the range of possible assemblage realisations under similar environmental conditions given stochastic processes, and helps inform managers of the uncertainty in the modelling results.
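The contrast between threshold-based and probabilistic stacking can be made concrete for a single site: instead of counting species whose predicted probability passes a cut-off, the probabilistic view takes the expected richness (the sum of per-species probabilities) together with an uncertainty estimate. A minimal sketch, assuming hypothetical SDM outputs and treating species as independent Bernoulli trials (a simplification):

```python
import math

def stacked_richness(probs, threshold=0.5):
    """Compare threshold-based and probabilistic S-SDM richness at one site.

    probs: per-species occurrence probabilities from individual SDMs.
    The threshold-based estimate counts species above the cut-off; the
    probabilistic estimate is the expected richness (sum of the
    probabilities), with a standard deviation obtained by treating
    species as independent Bernoulli trials.
    """
    thresholded = sum(1 for p in probs if p >= threshold)
    expected = sum(probs)
    sd = math.sqrt(sum(p * (1 - p) for p in probs))
    return thresholded, expected, sd

# Hypothetical occurrence probabilities for six butterfly species at a site.
t, e, sd = stacked_richness([0.9, 0.7, 0.6, 0.3, 0.2, 0.1])
```

Note how the two estimates disagree (3 species vs. an expected 2.8 ± 1.0): the probabilistic version retains the low-probability species as partial contributions instead of discarding them, which is the behaviour the abstract exploits to map prediction variability.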
Abstract:
MicroRNAs (miRs) are involved in the pathogenesis of several neoplasms; however, there are no data on their expression patterns and possible roles in adrenocortical tumors. Our objective was to study adrenocortical tumors by an integrative bioinformatics analysis involving miR and transcriptomics profiling, pathway analysis, and a novel, tissue-specific miR target prediction approach. Thirty-six tissue samples, including normal adrenocortical tissues, benign adenomas, and adrenocortical carcinomas (ACC), were studied by simultaneous miR and mRNA profiling. Novel data-processing software was used to identify all predicted miR-mRNA interactions retrieved from PicTar, TargetScan, and miRBase. Tissue-specific target prediction was achieved by filtering out mRNAs with undetectable expression and searching for mRNA targets whose expression alterations were inverse to those of their regulatory miRs. Target sets and significant microarray data were subjected to Ingenuity Pathway Analysis. Six miRs with significantly different expression were found. miR-184 and miR-503 showed significantly higher, whereas miR-511 and miR-214 showed significantly lower, expression in ACCs than in other groups. Expression of miR-210 was significantly lower in cortisol-secreting adenomas than in ACCs. By calculating the difference between dCT(miR-511) and dCT(miR-503) (delta cycle threshold), ACCs could be distinguished from benign adenomas with high sensitivity and specificity. Pathway analysis revealed the possible involvement of G2/M checkpoint damage in ACC pathogenesis. To our knowledge, this is the first report describing miR expression patterns and pathway analysis in sporadic adrenocortical tumors. miR biomarkers may be helpful for the diagnosis of adrenocortical malignancy, and this tissue-specific target prediction approach may be used in other tumors as well.
Abstract:
Aim: Climatic niche modelling of species and community distributions implicitly assumes strong and constant climatic determinism across geographic space. However, this assumption had never been tested. We tested it by assessing how stacked species distribution models (S-SDMs) perform in predicting plant species assemblages along elevation. Location: Western Swiss Alps. Methods: Using robust presence-absence data, we first assessed the ability of topo-climatic S-SDMs to predict plant assemblages in a study area encompassing a 2800 m elevation gradient. We then assessed the relationships among several evaluation metrics and trait-based tests of community assembly rules. Results: The standard errors of individual SDMs decreased significantly towards higher elevations. Overall, the S-SDMs overpredicted richness far more than they underpredicted it and could not reproduce the humpback curve along elevation. Overprediction was greater at low and mid-range elevations in absolute values, but greater at high elevations when standardised by the actual richness. Looking at species composition, the evaluation metrics accounting for both the presence and absence of species (overall prediction success and kappa) or focusing on correctly predicted absences (specificity) increased with increasing elevation, while the metrics focusing on correctly predicted presences (Jaccard index and sensitivity) decreased. The best overall evaluation - as driven by specificity - occurred at high elevation, where species assemblages were shown to be under significant environmental filtering of small plants. In contrast, the decreased overall accuracy in the lowlands was associated with functional patterns representing any type of assembly rule (environmental filtering, limiting similarity or null assembly). Main Conclusions: Our study reveals interesting patterns of change in S-SDM errors with changes in assembly rules along elevation.
Yet, significant levels of assemblage prediction errors occurred throughout the gradient, calling for further improvement of SDMs, e.g., by adding key environmental filters that act at fine scales and developing approaches to account for variations in the influence of predictors along environmental gradients.
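The evaluation metrics named in this abstract all derive from the four cells of a presence-absence confusion matrix at a site. A minimal sketch, with hypothetical observed/predicted vectors (one entry per species):

```python
def assemblage_metrics(observed, predicted):
    """Evaluation metrics for a predicted species assemblage at one site.

    observed, predicted: equal-length lists of 0/1 presences.
    Returns overall prediction success, Cohen's kappa, sensitivity,
    specificity and the Jaccard index, the metrics used above to
    evaluate S-SDM predictions.
    """
    a = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 1)  # true presences
    b = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 1)  # overpredictions
    c = sum(1 for o, p in zip(observed, predicted) if o == 1 and p == 0)  # underpredictions
    d = sum(1 for o, p in zip(observed, predicted) if o == 0 and p == 0)  # true absences
    n = a + b + c + d
    success = (a + d) / n
    chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    kappa = (success - chance) / (1 - chance)
    sensitivity = a / (a + c)      # correctly predicted presences
    specificity = d / (b + d)      # correctly predicted absences
    jaccard = a / (a + b + c)      # presences, ignoring shared absences
    return success, kappa, sensitivity, specificity, jaccard

# Hypothetical assemblage of six species at one plot.
success, kappa, sens, spec, jacc = assemblage_metrics(
    [1, 1, 1, 0, 0, 0], [1, 1, 0, 1, 0, 0])
```

The split explains the elevational pattern reported: specificity rewards correctly predicted absences (plentiful in species-poor high-elevation plots), while sensitivity and Jaccard depend only on presences.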
Abstract:
A study of the qualitative and quantitative composition of macroinvertebrate drift in the Llobregat river (NE Spain) is put forward. Samples were taken hourly during a 24-hour period in August 1982, with a net of 625 cm² of filtering surface and a mesh opening of 500 microns, which filtered 72.1 m³/h and collected 75,719 individuals during the sampling period. This corresponds to a drift rate of 1,224 indiv./h and a drift density of 17 indiv./m³. Of the organisms collected, 81% were exuviae, mainly pupal chironomid skins (54%) and nymphal ephemeropteran moults (27%). The remaining living organisms represent a drift rate of 227 indiv./h and a drift density of 3.14 indiv./m³, intermediate values compared with published data. Sixty per cent of the living drift were chironomids, 17.5% ephemeropterans and 10% trichopterans, these being the most important groups. Of the 87 species identified in those groups, 23 were common to all the samples (Table 1).
Abstract:
The Powell Basin is a small oceanic basin located at the NE end of the Antarctic Peninsula developed during the Early Miocene and mostly surrounded by the continental crusts of the South Orkney Microcontinent, South Scotia Ridge and Antarctic Peninsula margins. Gravity data from the SCAN 97 cruise obtained with the R/V Hespérides and data from the Global Gravity Grid and Sea Floor Topography (GGSFT) database (Sandwell and Smith, 1997) are used to determine the 3D geometry of the crustal-mantle interface (CMI) by numerical inversion methods. Water layer contribution and sedimentary effects were eliminated from the Free Air anomaly to obtain the total anomaly. Sedimentary effects were obtained from the analysis of existing and new SCAN 97 multichannel seismic profiles (MCS). The regional anomaly was obtained after spectral and filtering processes. The smooth 3D geometry of the crustal mantle interface obtained after inversion of the regional anomaly shows an increase in the thickness of the crust towards the continental margins and a NW-SE oriented axis of symmetry coinciding with the position of an older oceanic spreading axis. This interface shows a moderate uplift towards the western part and depicts two main uplifts to the northern and eastern sectors.
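The regional/residual separation underlying this workflow can be illustrated with a deliberately simplified filter: a moving-average low-pass over the anomaly grid stands in for the spectral filtering used in practice, the smoothed field being the regional anomaly and the remainder the residual. The window size and grid values below are hypothetical:

```python
def regional_residual(grid, window=1):
    """Split a gravity anomaly grid into regional and residual parts.

    A moving-average low-pass over a (2*window+1)^2 neighbourhood
    (truncated at the grid edges) approximates the long-wavelength
    regional anomaly; subtracting it leaves the residual anomaly.
    Real workflows do this with spectral (wavenumber-domain) filters.
    """
    rows, cols = len(grid), len(grid[0])
    regional = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            vals = [grid[x][y]
                    for x in range(max(0, i - window), min(rows, i + window + 1))
                    for y in range(max(0, j - window), min(cols, j + window + 1))]
            regional[i][j] = sum(vals) / len(vals)
    residual = [[grid[i][j] - regional[i][j] for j in range(cols)]
                for i in range(rows)]
    return regional, residual

# Hypothetical 3x3 anomaly (mGal) with a single short-wavelength spike.
anomaly = [[0.0, 0.0, 0.0],
           [0.0, 9.0, 0.0],
           [0.0, 0.0, 0.0]]
regional, residual = regional_residual(anomaly)
```

The smoothed regional field is what would then be inverted for the crustal-mantle interface geometry; the short-wavelength residual is attributed to shallower sources.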
Abstract:
Introduction: The field of connectomic research is growing rapidly, as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008) into so-called connectomes (Hagmann 2005, Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool supporting both clinical researchers and neuroscientists in investigating such connectome data. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to access a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented: * ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib); more brain connectivity measures will be implemented in a future release (Rubinov et al, 2009); * 3D view of networks with node positioning based on the corresponding ROI surface patch (other layouts are possible); * picking functionality to select nodes and edges, retrieve additional node information (ConnectomeWiki) and toggle surface representations; * interactive thresholding and modality selection of edge properties using filters; * storage of arbitrary metadata for networks, thereby allowing e.g. group-based analysis or meta-analysis; * a Python shell for scripting.
Application data is exposed and can be modified or used for further post-processing. * Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008). * An interface to TrackVis visualizes track data; selected nodes are converted to ROIs for fiber filtering. The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
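The "interactive thresholding of edge properties using filters" mentioned above boils down to discarding connections whose attribute falls below a chosen value before display. A minimal sketch using a plain dict-based network (ConnectomeViewer itself works on NetworkX graphs; the region names and fiber counts here are hypothetical):

```python
def threshold_edges(edges, attribute, minimum):
    """Keep only network edges whose attribute reaches a minimum value.

    edges: dict mapping (node_a, node_b) -> dict of edge properties
    (e.g. fiber count, average fiber length).  This mirrors the kind
    of interactive filter a connectome viewer applies before rendering.
    """
    return {pair: props for pair, props in edges.items()
            if props.get(attribute, 0) >= minimum}

# Hypothetical connections between cortical regions with fiber counts.
edges = {("lh.precuneus", "rh.precuneus"): {"fibers": 120},
         ("lh.precuneus", "lh.cuneus"): {"fibers": 35},
         ("rh.insula", "rh.putamen"): {"fibers": 8}}
strong = threshold_edges(edges, "fibers", 30)
```

In an interactive tool the `minimum` value is bound to a slider, so the displayed graph updates as the threshold moves.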
Abstract:
Cocktail parties, busy streets, and other noisy environments pose a difficult challenge to the auditory system: how to focus attention on selected sounds while ignoring others? Neurons of primary auditory cortex, many of which are sharply tuned to sound frequency, could help solve this problem by filtering selected sound information based on frequency-content. To investigate whether this occurs, we used high-resolution fMRI at 7 tesla to map the fine-scale frequency-tuning (1.5 mm isotropic resolution) of primary auditory areas A1 and R in six human participants. Then, in a selective attention experiment, participants heard low (250 Hz)- and high (4000 Hz)-frequency streams of tones presented at the same time (dual-stream) and were instructed to focus attention onto one stream versus the other, switching back and forth every 30 s. Attention to low-frequency tones enhanced neural responses within low-frequency-tuned voxels relative to high, and when attention switched the pattern quickly reversed. Thus, like a radio, human primary auditory cortex is able to tune into attended frequency channels and can switch channels on demand.
Abstract:
This Bachelor's thesis presents the results of the analysis of genetic data from the EurGast2 project "Genetic susceptibility, environmental exposure and gastric cancer risk in an European population", a case-control study nested in the European EPIC cohort "European Prospective Investigation into Cancer and Nutrition", whose objective is the study of genetic and environmental factors associated with the risk of developing gastric cancer (GC). Starting from the data of the EurGast2 study, in which 1,294 SNPs were analysed in 365 gastric cancer cases and 1,284 controls in a previous single-SNP analysis, the working hypothesis of this thesis is that some variants with a very weak marginal effect, but which jointly with other variants would be associated with GC risk, may have gone undetected. The main objective of the project is therefore the identification of second-order interactions between genetic variants of candidate genes involved in gastric carcinogenesis. The interaction analysis was carried out using the statistical method Model-based Multifactor Dimensionality Reduction (MB-MDR), developed by Calle et al. in 2008, and two filtering strategies were applied to select the interactions to explore: 1) filtering of interactions involving a SNP significant in the single-SNP analysis, and 2) filtering of interactions according to the Synergy measure. The project identified 5 second-order SNP-SNP interactions significantly associated with an increased risk of developing gastric cancer, with p-values below 10⁻⁴. The interactions identified involve the gene pairs MPO and CDH1, XRCC1 and GAS6, ADH1B and NR5A2, and IL4R and IL1RN (the last validated under both filtering strategies).
Except for CDH1, none of these genes had previously been significantly associated with GC or prioritised in earlier analyses, which confirms the interest of analysing second-order genetic interactions. These may serve as a starting point for further analyses aimed at confirming putative genes, at studying the mechanisms of carcinogenesis at the biological and molecular level, and at the search for new therapeutic targets and more efficient diagnostic and prognostic methods.
Abstract:
Soils on gypsum are well known in dry climates, but have rarely been described in temperate climates, and never in Switzerland. This study aims to describe soils affected by gypsum in a temperate climate and to understand their pedogenesis, using standard laboratory analyses performed on ten Swiss soils located on gypsum outcrops. In parallel, phytosociological relevés described the vegetation encountered on gypsiferous ground. A gypsification process (secondary gypsum enrichment by precipitation) was observed in all soils. It was particularly important in regions where potential evapotranspiration strongly exceeds precipitation in summer (central Valais, Chablais under the influence of warm wind). Gypsum contents above 20% were regularly measured in deep horizons, and locally exceeded 70%, building a white, indurated horizon. However, the absence of such a gypsic horizon in the topsoil prevented the use of gypsosol (according to the Référentiel pédologique, BAIZE & GIRARD 2009), the typical name for soils affected by gypsum but restricted to dry regions. As all soils had a high content of magnesium carbonates, they were logically classified in the group of DOLOMITOSOLS. However, according to the World Reference Base for Soil Resources (IUSS 2014), five soils can be classified among the Gypsisols, the criteria being less restrictive there. These soils are characterized by a coarse texture and a brittle particulate structure making a filtering substrate: water flows through easily, carrying away nutrients, which are not retained by clay, the latter generally not exceeding 1% of the fine material. Saturation with calcium hinders the breakdown of organic matter. Moreover, these soils are often rejuvenated by erosion caused by the rough relief due to gypsum (landslides, sinkholes, cliffs and slopes). Hence, the vegetation is mainly characterized by calcicolous and drought-tolerant species, with mostly xerothermophilic beech (Cephalanthero-Fagenion) and pine forests (Erico-Pinion sylvestris) in the lowlands, or subalpine heathlands (Ericion) and dry calcareous grasslands (Caricion firmae) at higher elevations.
Abstract:
Defining the limits of an urban agglomeration is essential for both fundamental and applied studies in quantitative and theoretical geography. A simple and consistent way of defining such urban clusters is important for performing statistical analyses and comparisons. Traditionally, agglomerations are defined using a rather qualitative approach based on various statistical measures; this definition generally varies from one country to another, and the data taken into account differ. In this paper, we explore the use of the City Clustering Algorithm (CCA) for defining agglomerations in Switzerland. This algorithm provides a systematic and easy way to define an urban area based only on population data, and allows the spatial resolution used to define the urban clusters to be specified. The results from different resolutions are compared and analysed, and the effect of filtering the data is investigated. Different scales and parameters highlight different phenomena. The study of Zipf's law using the visual rank-size rule shows that it is valid only for some specific urban clusters, inside a narrow range of the spatial resolution of the CCA. The scale at which one main cluster emerges can also be found in the analysis using Zipf's law. The study of the urban clusters at different scales using the lacunarity measure - a measure complementary to the fractal dimension - highlights the change of scale over a given range.
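In its grid form, the CCA joins neighbouring populated cells into clusters. A minimal sketch on a small population grid (the threshold and grid values are hypothetical; the cell size plays the role of the spatial resolution discussed above):

```python
from collections import deque

def city_clusters(pop, threshold):
    """City Clustering Algorithm on a population grid (simplified).

    Cells with population >= threshold are urban; a cluster is a
    maximal set of urban cells connected through 4-neighbourhood
    adjacency.  Returns the list of cluster populations, largest first.
    """
    rows, cols = len(pop), len(pop[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for i in range(rows):
        for j in range(cols):
            if pop[i][j] >= threshold and not seen[i][j]:
                total, queue = 0, deque([(i, j)])
                seen[i][j] = True
                while queue:                      # breadth-first flood fill
                    x, y = queue.popleft()
                    total += pop[x][y]
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if (0 <= nx < rows and 0 <= ny < cols
                                and pop[nx][ny] >= threshold
                                and not seen[nx][ny]):
                            seen[nx][ny] = True
                            queue.append((nx, ny))
                clusters.append(total)
    return sorted(clusters, reverse=True)

# Hypothetical 4x4 grid of inhabitants per cell.
grid = [[120, 80, 0, 0],
        [90, 10, 0, 40],
        [0, 0, 0, 55],
        [5, 0, 0, 60]]
sizes = city_clusters(grid, threshold=30)
```

Coarsening the grid (larger cells) merges nearby clusters into one agglomeration, which is exactly the resolution effect the paper analyses with Zipf's law and lacunarity.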
Abstract:
BACKGROUND: Selective laser trabeculoplasty (SLT) is a relatively new treatment strategy for glaucoma. Its principle is similar to that of argon laser trabeculoplasty (ALT), but it may cause less damage to the trabecular meshwork. METHODS: We assessed the 2-year efficacy of SLT in a noncomparative consecutive case series. Any adult patient either suspected of having glaucoma or with open-angle glaucoma, whose treatment was judged insufficient to reach target intraocular pressure (IOP), could be recruited. IOP and the number of glaucoma treatments were recorded over 2 years after the procedure. RESULTS: Our sample consisted of 44 consecutive eyes of 26 patients, aged 69+/-8 years. Eyes were treated initially on the lower 180°; three of them were retreated after 15 days on the upper 180°. Fourteen eyes had ocular hypertension, 17 primary open-angle/normal-tension glaucoma, 11 pseudoexfoliation (PEX) glaucoma, and two pigmentary glaucoma. Thirty-six eyes had previously been treated and continued to be treated with topical anti-glaucoma medication; ten had had prior ALT, nine iridotomy, and 12 filtering surgery. The 2-year follow-up could not be completed for eight eyes because they needed filtering surgery. In the remaining 36 eyes, IOP decreased by a mean of 3.3 mmHg, or 17.2% (19.2+/-4.7 to 15+/-3.6 mmHg), after 2 years (p<0.001). As a secondary outcome, the number of glaucoma treatments decreased from 1.44 to 1.36 drops/patient. Results by patient subgroup were also analysed: the greatest IOP decrease occurred in eyes that had never been treated with anti-glaucoma medication or with PEX glaucoma. SLT was probably valuable in a few eyes after filtering surgery; however, the statistical power of the study was not strong enough to draw a firm conclusion. When expressed in survival curves after 2 years, however, only 48% and 41% of eyes experienced a decrease of more than 3 mmHg or more than 20% of preoperative intraocular pressure, respectively. CONCLUSION: SLT decreases IOP somewhat for at least 2 years without an increase in topical glaucoma treatment. However, it cannot totally replace topical glaucoma treatment. In the future, patient selection should be improved to lower the cost-effectiveness ratio.
Abstract:
Learning object repositories are a basic piece of virtual learning environments used for content management. Nevertheless, learning objects have special characteristics that make traditional solutions for content management ineffective. In particular, browsing and searching for learning objects cannot be based on the typical authoritative meta-data used for describing content, such as author, title or publication date, among others. We propose to build a social layer on top of a learning object repository, providing final users with additional services for describing, rating and curating learning objects from a teaching perspective. All these interactions among users, services and resources can be captured and further analyzed, so both browsing and searching can be personalized according to user profile and the educational context, helping users to find the most valuable resources for their learning process. In this paper we propose to use reputation schemes and collaborative filtering techniques for improving the user interface of a DSpace-based learning object repository.
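One common form of the collaborative filtering proposed above is item-based: score each learning object a user has not yet rated by its similarity to the objects they did rate. A minimal sketch — the similarity measure (cosine), the object names and the ratings are all illustrative assumptions, not the paper's implementation:

```python
import math

def cosine(a, b):
    """Cosine similarity between two rating dicts (user -> rating)."""
    common = set(a) & set(b)
    num = sum(a[u] * b[u] for u in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def recommend(ratings, user):
    """Item-based collaborative filtering over learning-object ratings.

    ratings: dict item -> {user: rating}.  Each unrated object is scored
    by its similarity to the objects the user rated, weighted by those
    ratings, and candidates are returned best-first.
    """
    rated = {it: r[user] for it, r in ratings.items() if user in r}
    scores = {}
    for item, item_ratings in ratings.items():
        if item in rated:
            continue
        scores[item] = sum(cosine(item_ratings, ratings[it]) * val
                           for it, val in rated.items())
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical teacher ratings of learning objects in a repository.
ratings = {"intro-java": {"ana": 5, "ben": 4},
           "java-quiz": {"ana": 4, "ben": 5, "eva": 2},
           "poetry-101": {"eva": 5}}
picks = recommend(ratings, "ana")
```

In the repository setting described above, the same machinery would also ingest the social-layer signals (curation, reputation) as implicit ratings.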