61 results for multiresolution filtering

at Université de Lausanne, Switzerland


Relevance: 20.00%

Abstract:

This study introduces a novel approach for automatic temporal phase detection and inter-arm coordination estimation in front-crawl swimming using inertial measurement units (IMUs). We examined the validity of our method by comparison against a video-based system. Three waterproofed IMUs (each comprising a 3D accelerometer and a 3D gyroscope) were placed on both forearms and the sacrum of the swimmer. We used two underwater video cameras in side and frontal views as our reference system. Two independent operators performed the video analysis. To test our methodology, seven well-trained swimmers performed three 300 m trials in a 50 m indoor pool. Each trial was performed in a different coordination mode, quantified by the index of coordination. We detected the different phases of the arm stroke by applying orientation estimation techniques and a new adaptive change detection algorithm to the inertial signals. The difference of 0.2 +/- 3.9% between our estimation and the video-based system in the assessment of the index of coordination was comparable to the difference between experienced operators (1.1 +/- 3.6%). The 95% limits of agreement of the difference between the two systems in the estimation of the temporal phases were always less than 7.9% of the cycle duration. The inertial approach offers an automatic, easy-to-use system with timely feedback for the study of swimming.
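
For readers unfamiliar with the index of coordination, the following is a minimal sketch of how it can be computed once the propulsive phase boundaries of each arm have been detected. The per-cycle timings, the phase definitions and the sign convention are illustrative assumptions, not the exact output of the authors' detection algorithm.

```python
import numpy as np

def index_of_coordination(prop_end_first_arm, prop_start_second_arm, cycle_duration):
    """Illustrative index of coordination (IdC): overlap (positive) or lag
    (negative) between the end of one arm's propulsive phase and the start of
    the other arm's, expressed as a percentage of the stroke-cycle duration."""
    return 100.0 * (prop_end_first_arm - prop_start_second_arm) / cycle_duration

# Hypothetical per-cycle timings (in seconds) for one swimmer
left_prop_end = np.array([0.90, 2.86, 4.75])      # end of left-arm propulsion
right_prop_start = np.array([0.95, 2.88, 4.81])   # start of right-arm propulsion
cycle_duration = np.array([1.90, 1.93, 1.92])     # stroke-cycle durations

idc = index_of_coordination(left_prop_end, right_prop_start, cycle_duration)
print(f"mean IdC: {idc.mean():.1f}%")  # > 0: superposition, ~0: opposition, < 0: catch-up
```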

Relevance: 20.00%

Abstract:

Errors in the inferred multiple sequence alignment may lead to false prediction of positive selection. Recently, methods for detecting unreliable alignment regions were developed and were shown to accurately identify incorrectly aligned regions. While removing unreliable alignment regions is expected to increase the accuracy of positive selection inference, such filtering may also significantly decrease the power of the test, as positively selected regions are fast evolving, and those same regions are often those that are difficult to align. Here, we used realistic simulations that mimic sequence evolution of HIV-1 genes to test the hypothesis that the performance of positive selection inference using codon models can be improved by removing unreliable alignment regions. Our study shows that the benefit of removing unreliable regions exceeds the loss of power due to the removal of some of the true positively selected sites.
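
As a concrete illustration of the filtering step, the sketch below masks codon columns whose alignment-reliability scores fall below a threshold before the alignment would be passed to a codon-model test of positive selection. The per-column scores (e.g., as produced by a tool such as GUIDANCE), the 0.93 cut-off and the toy alignment are illustrative assumptions, not the authors' exact pipeline.

```python
def filter_unreliable_codons(alignment, column_scores, threshold=0.93):
    """Keep only codons (column triplets) in which every alignment column
    scores at least `threshold`; alignment is a list of equal-length strings."""
    n_codons = len(column_scores) // 3
    keep = [c for c in range(n_codons)
            if min(column_scores[3 * c:3 * c + 3]) >= threshold]
    return ["".join(seq[3 * c:3 * c + 3] for c in keep) for seq in alignment]

# Toy codon alignment with one unreliably aligned codon (columns 7-9)
aln = ["ATGGCC---AAA",
       "ATGGCCTTTAAA",
       "ATGCCCTATAAA"]
scores = [1.0, 1.0, 1.0, 0.95, 0.95, 0.95, 0.4, 0.4, 0.4, 1.0, 1.0, 1.0]
print(filter_unreliable_codons(aln, scores))  # the low-scoring codon is removed
```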

Relevance: 20.00%

Abstract:

Oscillations have been increasingly recognized as a core property of neural responses that contribute to spontaneous, induced, and evoked activities within and between individual neurons and neural ensembles. They are considered a prominent mechanism for information processing within, and communication between, brain areas. More recently, it has been proposed that interactions between periodic components at different frequencies, known as cross-frequency couplings, may support the integration of neuronal oscillations at different temporal and spatial scales. The present study details methods based on an adaptive frequency tracking approach that improve the quantification and statistical analysis of oscillatory components and cross-frequency couplings. This approach allows for time-varying instantaneous frequency, which is particularly important when measuring phase interactions between components. We compared this adaptive approach to traditional band-pass filters in their measurement of phase-amplitude and phase-phase cross-frequency couplings. Evaluations were performed with synthetic signals and EEG data recorded from healthy humans performing an illusory contour discrimination task. First, the synthetic signals in conjunction with Monte Carlo simulations highlighted two desirable features of the proposed algorithm vs. classical filter-bank approaches: resilience to broad-band noise and to oscillatory interference. Second, the analyses with real EEG signals revealed statistically more robust effects (i.e., improved sensitivity) when using the adaptive frequency tracking framework, particularly when identifying phase-amplitude couplings. This was further confirmed after generating surrogate signals from the real EEG data. Adaptive frequency tracking appears to improve the measurement of cross-frequency couplings through precise extraction of neuronal oscillations.
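
To make the comparison concrete, here is a minimal sketch of the classical band-pass/Hilbert baseline for phase-amplitude coupling (a normalised mean-vector-length modulation index), i.e., the filter-bank approach the adaptive tracker is contrasted with. The frequency bands, filter order and synthetic signal are illustrative assumptions, and the sketch does not implement the authors' adaptive frequency tracking algorithm.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 500.0
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)                        # 6 Hz "phase" rhythm
gamma = (1 + 0.8 * theta) * np.sin(2 * np.pi * 60 * t)   # 60 Hz amplitude-modulated rhythm
eeg = theta + 0.2 * gamma + 0.1 * np.random.randn(t.size)

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

phase = np.angle(hilbert(bandpass(eeg, 4, 8, fs)))    # low-frequency phase
amp = np.abs(hilbert(bandpass(eeg, 50, 70, fs)))      # high-frequency amplitude envelope
mvl = np.abs(np.mean(amp * np.exp(1j * phase))) / amp.mean()  # normalised mean vector length
print(f"phase-amplitude coupling (normalised MVL): {mvl:.3f}")
```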

Relevance: 20.00%

Abstract:

PURPOSE: To make surgeons performing nonpenetrating filtering surgery aware of an unusual complication, namely Descemet membrane detachment. METHODS: We retrospectively reviewed nine eyes of nine patients seen in our hospital with Descemet membrane detachment occurring after nonpenetrating filtering surgery from January 1994 to December 2000. RESULTS: Both planar and nonplanar detachments were reported. Neither scrolls nor tears in the Descemet membrane were observed in any patient. After viscocanalostomy (four patients), the detachment was generally noticed shortly after the procedure and the cornea maintained its clarity. After deep sclerectomy with a collagen implant (five patients), it developed weeks to months postoperatively with adjacent corneal edema. Four patients had descemetopexy. None required more than one procedure. However, at the last visit, two detachments persisted although they had diminished in size: one after viscocanalostomy and conservative treatment, and one after descemetopexy following deep sclerectomy with a collagen implant. Otherwise, to date, no signs of significant corneal damage have been observed either clinically or by specular microscopy and pachymetry. CONCLUSIONS: The diagnosis of Descemet membrane detachment can easily be overlooked or misdiagnosed. The clinical presentation, clinical course, and pathogenesis depend on the type of nonpenetrating filtering surgery performed. Ophthalmologists should be aware of this unusual complication, which is likely to be more common after nonpenetrating filtering surgery than after trabeculectomy. A period of observation before attempting descemetopexy is recommended.

Relevance: 20.00%

Abstract:

Detecting local differences between groups of connectomes is a great challenge in neuroimaging, because of the large number of tests that have to be performed and the resulting burden of multiplicity correction. Any available information should be exploited to increase the power of detecting true between-group effects. We present an adaptive strategy that exploits the data structure and prior information concerning positive dependence between nodes and connections, without relying on strong assumptions. As a first step, we decompose the brain network, i.e., the connectome, into subnetworks and apply a screening at the subnetwork level. The subnetworks are defined either according to prior knowledge or by applying a data-driven algorithm. Given the results of the screening step, a filtering is performed to seek real differences at the node/connection level. The proposed strategy can be used to strongly control either the family-wise error rate or the false discovery rate. We show by means of different simulations the benefit of the proposed strategy, and we present a real application comparing the connectomes of preschool children and adolescents.
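
The sketch below illustrates the general screening-then-filtering idea on synthetic data: subnetworks are screened first, and individual connections are tested only within subnetworks that survive screening, with false-discovery-rate control at each step. The simulated data, the subnetwork partition, the t-tests and the Benjamini-Hochberg procedure are illustrative assumptions rather than the authors' exact method.

```python
import numpy as np
from scipy.stats import ttest_ind

def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
    adj = np.minimum.accumulate(scaled[::-1])[::-1]
    out = np.empty_like(adj)
    out[order] = np.clip(adj, 0.0, 1.0)
    return out

rng = np.random.default_rng(0)
n_conn = 60
group_a = rng.normal(0.0, 1.0, (25, n_conn))   # connection strengths, group A
group_b = rng.normal(0.0, 1.0, (25, n_conn))   # connection strengths, group B
group_b[:, :10] += 0.9                         # true effect confined to "sub1"
subnetworks = {"sub1": list(range(0, 20)),
               "sub2": list(range(20, 40)),
               "sub3": list(range(40, 60))}

# Step 1: screening at the subnetwork level (mean connection strength per subject)
screen_p = [ttest_ind(group_a[:, idx].mean(axis=1), group_b[:, idx].mean(axis=1)).pvalue
            for idx in subnetworks.values()]
selected = [name for name, q in zip(subnetworks, bh_adjust(screen_p)) if q < 0.05]

# Step 2: filtering at the connection level, only inside selected subnetworks
for name in selected:
    idx = subnetworks[name]
    conn_q = bh_adjust(ttest_ind(group_a[:, idx], group_b[:, idx], axis=0).pvalue)
    hits = [idx[i] for i in np.where(conn_q < 0.05)[0]]
    print(f"{name}: significant connections {hits}")
```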

Relevance: 20.00%

Abstract:

We have developed a processing methodology that allows crosshole ERT (electrical resistivity tomography) monitoring data to be used to derive temporal fluctuations of groundwater electrical resistivity and thereby characterize the dynamics of groundwater in a gravel aquifer as it is infiltrated by river water. Temporal variations of the raw ERT apparent-resistivity data were mainly sensitive to the resistivity (salinity), temperature and height of the groundwater, with the relative contributions of these effects depending on the time and the electrode configuration. To resolve the changes in groundwater resistivity, we first expressed fluctuations of temperature-detrended apparent-resistivity data as linear superpositions of (i) time series of river-water-resistivity variations convolved with suitable filter functions and (ii) linear and quadratic representations of river-water-height variations multiplied by appropriate sensitivity factors; river-water height was determined to be a reliable proxy for groundwater height. Individual filter functions and sensitivity factors were obtained for each electrode configuration via deconvolution using a one-month calibration period, and the predicted contributions related to changes in water height were then removed prior to inversion of the temperature-detrended apparent-resistivity data. Applying the filter functions and sensitivity factors accurately predicted the apparent-resistivity variations (the correlation coefficient was 0.98). Furthermore, the filtered ERT monitoring data and the resultant time-lapse resistivity models correlated closely with independently measured groundwater electrical resistivity monitoring data and only weakly with the groundwater-height fluctuations. The inversion results based on the filtered ERT data also showed significantly fewer inversion artefacts than the raw-data inversions. We observed resistivity increases of up to 10%, and the arrival-time peaks in the time-lapse resistivity models matched those in the groundwater resistivity monitoring data.
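
A minimal sketch of the calibration step described above follows: for a single electrode configuration, the apparent-resistivity fluctuations are modelled as a short filter convolved with the river-water-resistivity variations plus linear and quadratic river-height terms, and the filter and sensitivity factors are estimated by least squares. The synthetic data, the filter length and the noise level are illustrative assumptions, not the survey's actual records.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 720, 24                                        # hourly samples (~1 month), 24-h filter
river_res = rng.normal(0, 1, n).cumsum() * 0.01       # river-water-resistivity variations
river_h = rng.normal(0, 1, n).cumsum() * 0.01         # river-water-height variations
true_filt = np.exp(-np.arange(m) / 6.0) * 0.05        # "true" filter used to build the data
d_obs = (np.convolve(river_res, true_filt)[:n]        # apparent-resistivity fluctuations
         + 0.3 * river_h + 0.05 * river_h ** 2
         + 0.002 * rng.normal(0, 1, n))               # plus measurement noise

# Design matrix: lagged river-resistivity columns, then h and h^2
lags = np.column_stack([np.concatenate([np.zeros(k), river_res[:n - k]]) for k in range(m)])
A = np.column_stack([lags, river_h, river_h ** 2])
coef, *_ = np.linalg.lstsq(A, d_obs, rcond=None)
filt, a_lin, a_quad = coef[:m], coef[m], coef[m + 1]  # filter function, sensitivity factors

pred = A @ coef
print(f"calibration fit correlation: {np.corrcoef(pred, d_obs)[0, 1]:.3f}")
# The height-related contributions (a_lin*h + a_quad*h^2) can then be removed
# before inverting the temperature-detrended apparent-resistivity data.
```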

Relevance: 20.00%

Abstract:

PURPOSE: To describe methods and outcomes of excisional revision of a filtering bleb (bleb revision) using a free conjunctival autologous graft, either for bleb repair or for bleb reduction, after trabeculectomy and deep sclerectomy with an implant. METHODS: Medical records were retrospectively reviewed for a consecutive, non-comparative case series comprising patients who underwent excisional revision of a filtering bleb between May 1998 and January 2001. Excisional revision using a free conjunctival autologous graft (bleb revision) was performed either for bleb repair, to treat early and late leaks and hypotony with maculopathy, or for bleb reduction, to improve ocular pain, discomfort, burning, foreign body sensation, tearing, and fluctuations of visual acuity. The revision consisted of bleb excision and a free conjunctival autologous graft. The bleb histopathology was analyzed in patients who underwent bleb repair. RESULTS: Sixteen patients were included in the study, comprising nine patients who had undergone trabeculectomy and seven who had undergone deep sclerectomy with an implant. Bleb revision was necessary in 14 patients due to a leaking filtering bleb (bleb repair), and in 2 patients due to bleb dysesthesia (bleb reduction). After a follow-up of 15.1 +/- 8.4 months, the mean intraocular pressure (IOP) rose from 7.8 +/- 6.3 mm Hg to 14.3 +/- 6.5 mm Hg, and the visual acuity from 0.4 +/- 0.3 to 0.7 +/- 0.3, with P values of 0.008 and 0.03, respectively. The complete success rate at 32 months, according to the Kaplan-Meier survival curve, was 38.3%, and the qualified success rate was 83.3%. Four patients (25%) required additional suturing for persistent bleb leak. To control IOP, antiglaucoma medical therapy was needed in six patients (37.5%) and repeated glaucoma surgery in one patient. CONCLUSION: Free conjunctival autologous grafting is a safe and successful procedure for bleb repair and bleb reduction. However, patients should be aware that medical or surgical intervention may still be required for IOP control after revision.

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the antimitotic and toxic effects of 5-chlorouracil (5-CU) and 5-fluorouracil (5-FU) and to study their potential to delay filtering bleb closure in the rabbit eye when released from poly(ortho esters) (POE). METHODS: Rabbit Tenon fibroblasts and human conjunctival cells were incubated with various 5-CU and 5-FU concentrations. Antiproliferative effects and toxicity were evaluated at 24 and 72 hours by monotetrazolium, neutral red, and Hoechst tests and by cell counting. Mechanisms of cell death were evaluated using the TUNEL assay, annexin V binding, and immunohistochemistry for apoptosis-inducing factor (AIF) and LEI/L-DNase II. Trabeculectomy was performed in pigmented rabbits. Two hundred microliters of POE loaded with 1% wt/wt 5-FU or 5-CU was injected into the subconjunctival space after surgery. Intraocular pressure (IOP) and bleb persistence were monitored for 150 days. RESULTS: In vitro, 5-FU showed a stronger antiproliferative effect and greater toxicity than 5-CU. 5-FU induced cell necrosis, whereas 5-CU induced mostly apoptosis. The apoptosis induced by 5-CU was driven through a non-caspase-dependent pathway involving AIF and LEI/L-DNase II. In vivo, at 34 days after surgery, the mean IOP in the POE/5-CU-treated group was 83% of the baseline level and only 40% in the POE/5-FU-treated group. At 100 days after surgery, IOP was still decreased in the POE/5-CU group compared with the controls and still below the preoperative value. The mean long-term IOP, with all time points considered, was significantly (P < 0.0001) lower in the POE/5-CU-treated group (6.0 +/- 2.4 mm Hg) than in both control groups, the trabeculectomy-alone group (7.6 +/- 2.9 mm Hg) and the POE-alone group (7.5 +/- 2.6 mm Hg). Histologic analysis showed evidence of functioning blebs in the POE/5-CU-treated eyes along with a preserved structure of the conjunctival epithelium. CONCLUSIONS: The slow release of 5-CU from POE has a long-lasting effect on the decrease of IOP after glaucoma filtering surgery in the rabbit eye. Thus, the slow release of 5-CU from POE may be beneficial for the prevention of bleb closure in patients who undergo complicated trabeculectomy.

Relevance: 20.00%

Abstract:

PURPOSE: Pharmacologic modulation of wound healing after glaucoma filtering surgery remains a major clinical challenge in ophthalmology. Poly(ortho ester) (POE) is a bioerodible and biocompatible viscous polymer potentially useful as a sustained drug delivery system that allows the frequency of intraocular injections to be reduced. The purpose of this study was to determine the efficacy of POE containing a precise amount of 5-fluorouracil (5-FU) in an experimental model of filtering surgery in the rabbit. METHODS: Trabeculectomy was performed in pigmented rabbit eyes. An ointment-like formulation of POE containing 1% wt/wt 5-FU was injected subconjunctivally at the site of surgery, during the procedure. Intraocular pressure (IOP), bleb persistence, and the ocular inflammatory reaction were monitored until postoperative day 30. Quantitative analysis of 5-FU was performed in the anterior chamber. Histologic analysis was used to assess the appearance of the filtering fistula and the polymer's biocompatibility. RESULTS: The decrease in IOP from baseline and the persistence of the filtering bleb were significantly more marked in the 5-FU-treated eyes during postoperative days 9 through 28. Corneal toxicity triggered by 5-FU was significantly lower in the group that received 5-FU in POE than with a 5-FU tamponade. Histopathologic evaluation showed that POE was well tolerated, and no fibrosis occurred in eyes treated with POE containing 5-FU. CONCLUSIONS: In this rabbit model of trabeculectomy, the formulation based on POE and containing a precise amount of 5-FU reduced IOP and prolonged bleb persistence in a way similar to the conventional method of a 5-FU tamponade, while significantly reducing 5-FU toxicity.

Relevance: 10.00%

Abstract:

BACKGROUND: Cone-beam computed tomography (CBCT) image-guided radiotherapy (IGRT) systems are widely used tools to verify and correct the target position before each fraction, allowing treatment accuracy and precision to be maximized. In this study, we evaluate automatic three-dimensional intensity-based rigid registration (RR) methods for prostate setup correction using CBCT scans and study the impact of rectal distension on registration quality. METHODS: We retrospectively analyzed 115 CBCT scans of 10 prostate patients. CT-to-CBCT registration was performed using (a) global RR, (b) bony RR, or (c) bony RR refined by a local prostate RR using the CT clinical target volume (CTV) expanded with margins varying from 1 to 20 mm. After propagation of the manual CT contours, automatic CBCT contours were generated. For evaluation, a radiation oncologist manually delineated the CTV on the CBCT scans. The propagated and manual CBCT contours were compared using the Dice similarity coefficient and a measure based on the bidirectional local distance (BLD). We also conducted a blind visual assessment of the quality of the propagated segmentations. Moreover, we automatically quantified rectal distension between the CT and CBCT scans without using the manual CBCT contours, and we investigated its correlation with the registration failures. To improve registration quality, the air in the rectum was replaced with soft tissue using a filter. The results with and without filtering were compared. RESULTS: The statistical analysis of the Dice coefficients and the BLD values showed highly significant differences (p < 10^-6) for the 5-mm and 8-mm local RRs vs the global, bony and 1-mm local RRs. The 8-mm local RR provided the best compromise between accuracy and robustness (a median Dice of 0.814 and a 97% success rate when the air in the rectum was filtered). We observed that all failures were due to high rectal distension. Moreover, the visual assessment confirmed the superiority of the 8-mm local RR over the bony RR. CONCLUSION: The most successful CT-to-CBCT RR method proved to be the 8-mm local RR. We have shown the correlation between its registration failures and rectal distension. Furthermore, we have provided a simple (easily applicable in clinical routine) and automatic method to quantify rectal distension and to predict registration failure using only the manual CT contours.
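
For reference, the Dice similarity coefficient used to score the propagated contours can be computed as in the short sketch below; the synthetic spherical masks stand in for the propagated and manually delineated CTVs and are illustrative assumptions, not patient data.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Two overlapping spheres as stand-ins for the propagated and manual CTV masks
z, y, x = np.mgrid[:64, :64, :64]
ctv_propagated = (x - 30) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 12 ** 2
ctv_manual = (x - 33) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 12 ** 2
print(f"Dice: {dice(ctv_propagated, ctv_manual):.3f}")
```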

Relevance: 10.00%

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for the mapping of protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. The sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence-tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
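
As an illustration of the kind of filtering this refers to, the sketch below flags genomic bins whose tag counts far exceed a simple background model and discards the tags falling into them. The bin size, the Poisson background and the threshold are illustrative assumptions rather than the thesis' exact filtering methods.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
n_bins, mean_tags = 10_000, 5.0
counts = rng.poisson(mean_tags, n_bins)            # tag counts per genomic bin
counts[[123, 4567, 8910]] = (900, 1500, 700)       # artificial hot-spot pile-ups

lam = np.median(counts)                            # robust estimate of the background rate
cutoff = poisson.ppf(1 - 1e-6 / n_bins, lam)       # Bonferroni-style Poisson threshold
hotspots = np.where(counts > cutoff)[0]

filtered = counts.copy()
filtered[hotspots] = 0                             # discard tags in hot-spot bins before peak calling
print("hot-spot bins:", hotspots.tolist())
```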

Relevance: 10.00%

Abstract:

The ability to model biodiversity patterns is of prime importance in this era of severe environmental crisis. Species assemblages along environmental gradients are shaped by the interplay of biotic interactions in addition to abiotic environmental filtering. Accounting for complex biotic interactions across a wide array of species has so far remained challenging. Here, we propose to use food web models that can infer the potential interaction links between species as a constraint in species distribution models. Using a plant-herbivore (butterfly) interaction dataset, we demonstrate that this combined approach is able to improve both species distribution and community forecasts. Most importantly, this combined approach is particularly useful for modelling more generalist species that have multiple potential interaction links, for which gaps in the literature may be recurrent. Our combined approach points to a promising way forward for modelling the spatial variation of entire species interaction networks. Our work has implications for studies of range-shifting species and invasive species biology, where it may be unknown how a given biota might interact with a potential invader or under future climates.
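
A minimal sketch of how inferred interaction links could constrain stacked SDM output is given below: a butterfly is retained as predicted present at a site only if at least one of its potential host plants is also predicted present there. The species names, link matrix and binary predictions are hypothetical and purely illustrative of the general idea, not the study's data or exact procedure.

```python
import numpy as np

plants = ["plant_A", "plant_B"]
butterflies = ["bfly_1", "bfly_2"]
links = np.array([[1, 0],        # bfly_1 can use plant_A
                  [0, 1]])       # bfly_2 can use plant_B

# Binary SDM presence predictions (rows: sites, columns: species)
plant_pred = np.array([[1, 0],
                       [0, 0],
                       [1, 1]])
bfly_pred = np.array([[1, 1],
                      [1, 0],
                      [1, 1]])

host_available = (plant_pred @ links.T) > 0     # any potential host predicted at the site?
bfly_constrained = bfly_pred & host_available   # keep butterflies only where a host is present
print(bfly_constrained)
```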

Relevance: 10.00%

Abstract:

A better understanding of the factors that mould ecological community structure is required to accurately predict community composition and to anticipate threats to ecosystems due to global changes. We tested how well stacked climate-based species distribution models (S-SDMs) could predict butterfly communities in a mountain region. It has been suggested that climate is the main force driving butterfly distribution and community structure in mountain environments, and that, as a consequence, climate-based S-SDMs should yield unbiased predictions. In contrast to this expectation, at lower altitudes, climate-based S-SDMs overpredicted butterfly species richness at sites with low plant species richness and underpredicted species richness at sites with high plant species richness. According to two indices of composition accuracy, the Sorensen index and a matching coefficient considering both absences and presences, S-SDMs were more accurate in plant-rich grasslands. Butterflies display strong and often specialised trophic interactions with plants. At lower altitudes, where land use is more intense, considering climate alone without accounting for land use influences on grassland plant richness leads to erroneous predictions of butterfly presences and absences. In contrast, at higher altitudes, where climate is the main force filtering communities, there were fewer differences between observed and predicted butterfly richness. At high altitudes, even if stochastic processes decrease the accuracy of predictions of presence, climate-based S-SDMs are able to better filter out butterfly species that are unable to cope with severe climatic conditions, providing more accurate predictions of absences. Our results suggest that predictions should account for plants in disturbed habitats at lower altitudes but that stochastic processes and heterogeneity at high altitudes may limit prediction success of climate-based S-SDMs.
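
The two composition-accuracy measures mentioned above can be computed as in the following minimal sketch, where the observed and predicted presence/absence vectors for one site are illustrative; the exact matching coefficient used in the study may differ in detail.

```python
import numpy as np

def sorensen(obs, pred):
    """Sorensen similarity between two binary presence/absence vectors."""
    obs, pred = np.asarray(obs, bool), np.asarray(pred, bool)
    shared = np.logical_and(obs, pred).sum()
    return 2.0 * shared / (obs.sum() + pred.sum())

def matching_coefficient(obs, pred):
    """Proportion of species whose presence or absence is correctly predicted."""
    obs, pred = np.asarray(obs, bool), np.asarray(pred, bool)
    return np.mean(obs == pred)

observed  = [1, 1, 0, 0, 1, 0, 1, 0]   # butterfly community observed at one site
predicted = [1, 0, 0, 0, 1, 1, 1, 0]   # climate-based S-SDM prediction for that site
print(f"Sorensen index: {sorensen(observed, predicted):.2f}")
print(f"matching coefficient: {matching_coefficient(observed, predicted):.2f}")
```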