175 results for Computer algorithms
Abstract:
Neuronal oscillations are an important aspect of EEG recordings. These oscillations are thought to be involved in several cognitive mechanisms; for instance, oscillatory activity is considered a key component of the top-down control of perception. However, measuring this activity and its influence requires precise extraction of frequency components, which is not straightforward. In particular, the time-varying characteristics of the oscillations make them difficult to extract, and when phase information is needed it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. The scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter; it can therefore track an oscillation and describe its temporal evolution even during low-amplitude time segments. Moreover, the method can be extended to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant for EEG data processing, where oscillations in different frequency bands are active at the same time and signals are recorded with multiple sensors. The tracking scheme is first tested with synthetic signals to highlight its capabilities, and then applied to data recorded during a visual shape discrimination experiment to assess its usefulness in EEG processing and in detecting functionally relevant changes. This method is an interesting additional processing step for providing information complementary to classical time-frequency analyses and for improving the detection and analysis of cross-frequency couplings.
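The authors' exact maximization scheme is specific to the paper, but the general idea of tracking a narrow-band oscillation with an adaptive filter can be illustrated with a standard LMS-based adaptive line enhancer; the sketch below is a generic stand-in, with all parameters illustrative.

```python
# Minimal sketch: adaptive line enhancer (normalized LMS) that keeps the
# narrow-band (oscillatory) component of a noisy signal. Not the authors'
# exact scheme; taps, delay, and step size are illustrative.
import numpy as np

def adaptive_line_enhancer(x, n_taps=32, delay=8, mu=0.05):
    """Predict x[n] from a delayed copy of itself; broadband noise
    decorrelates over the delay, so the output retains the oscillation."""
    w = np.zeros(n_taps)
    y = np.zeros_like(x)
    for n in range(delay + n_taps, len(x)):
        u = x[n - delay - n_taps:n - delay][::-1]  # delayed reference window
        y[n] = w @ u
        e = x[n] - y[n]                            # prediction error
        w += mu * e * u / (u @ u + 1e-12)          # normalized LMS update
    return y

# Synthetic test: 10 Hz oscillation with drifting amplitude buried in noise.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
clean = (1 + 0.5 * np.sin(2 * np.pi * 0.1 * t)) * np.sin(2 * np.pi * 10 * t)
tracked = adaptive_line_enhancer(clean + np.random.randn(t.size))
```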
Abstract:
BACKGROUND/AIM: Raloxifene is the first selective estrogen receptor modulator that has been approved for the treatment and prevention of osteoporosis in postmenopausal women in Europe and in the US. Although raloxifene reduces the risk of invasive breast cancer in postmenopausal women with osteoporosis and in postmenopausal women at high risk for invasive breast cancer, it is approved for that indication in the US but not in the EU. The aim was to characterize the clinical profiles of postmenopausal women expected to benefit most from therapy with raloxifene, based on published scientific evidence to date. METHODS: Key individual patient characteristics relevant to the prescription of raloxifene in daily practice were defined by a board of Swiss experts in the fields of menopause and metabolic bone diseases and linked to published scientific evidence. Consensus was reached on how to translate these insights into daily practice. RESULTS: Through estrogen agonistic effects on bone, raloxifene reduces biochemical markers of bone turnover to premenopausal levels, increases bone mineral density (BMD) at the lumbar spine, proximal femur, and total body, and reduces vertebral fracture risk in women with osteopenia or osteoporosis with and without prevalent vertebral fracture. Through estrogen antagonistic effects on breast tissue, raloxifene reduces the risk of invasive estrogen-receptor-positive breast cancer in postmenopausal women with osteoporosis and in postmenopausal women at high risk for invasive breast cancer. Finally, raloxifene increases the incidence of hot flushes, the risk of venous thromboembolic events, and the risk of fatal stroke in postmenopausal women at increased risk for coronary heart disease. Postmenopausal women in whom the use of raloxifene is considered can be categorized into a 2 × 2 matrix reflecting their bone status (osteopenic or osteoporotic based on their BMD T-score by dual-energy X-ray absorptiometry) and their breast cancer risk (low or high based on the modified Gail model). Women at high risk of breast cancer should be considered for treatment with raloxifene. CONCLUSION: Postmenopausal women between 50 and 70 years of age without climacteric symptoms with either osteopenia or osteoporosis should be evaluated with regard to their breast cancer risk and considered for treatment with raloxifene within the framework of its contraindications and precautions.
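As a minimal sketch, the 2 × 2 categorization could look like the following, assuming the conventional WHO T-score cut-offs (osteoporosis at T ≤ -2.5, osteopenia between -2.5 and -1.0) and the commonly used Gail-model 5-year risk threshold of 1.67%; the abstract itself gives no numeric cut-offs.

```python
# Sketch of the 2 x 2 matrix from the abstract. Thresholds are the
# conventional WHO / Gail-model values, assumed here for illustration.
def categorize(t_score: float, gail_5yr_risk_pct: float) -> tuple[str, str]:
    if t_score <= -2.5:
        bone = "osteoporosis"
    elif t_score <= -1.0:
        bone = "osteopenia"
    else:
        bone = "normal BMD (outside the matrix)"
    breast = ("high breast cancer risk" if gail_5yr_risk_pct >= 1.67
              else "low breast cancer risk")
    return bone, breast

print(categorize(-2.7, 2.1))  # ('osteoporosis', 'high breast cancer risk')
```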
Abstract:
PURPOSE: To objectively characterize different heart tissues from functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI. Also, Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 +/- 4% of the pixels defined as infarct in the DE images. In addition, 89 +/- 5% of the pixels defined as infarct in the clustered images were also defined as infarct in DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows for objective identification of different heart tissues, which is potentially important for clinical decision-making in patients with MI.
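The staged FCM/ISODATA pipeline is specific to the paper, but the fuzzy c-means building block it names is standard; a minimal sketch, with illustrative parameters:

```python
# Textbook fuzzy c-means (FCM): alternate between center estimation and
# membership updates. The paper's multi-stage FCM/ISODATA sequencing is
# not reproduced here.
import numpy as np

def fcm(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """X: (n_samples, n_features). Returns memberships U (n, c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)               # memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))               # standard FCM update
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return U_new, centers
        U = U_new
    return U, centers

# Toy usage: three well-separated 2-D blobs.
X = np.vstack([np.random.default_rng(1).normal(mu, 0.2, (50, 2))
               for mu in (0.0, 3.0, 6.0)])
U, centers = fcm(X)
```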
Abstract:
Three-dimensional imaging and quantification of myocardial function are essential steps in the evaluation of cardiac disease. We propose a tagged magnetic resonance imaging methodology called zHARP that encodes and automatically tracks myocardial displacement in three dimensions. Unlike other motion encoding techniques, zHARP encodes both in-plane and through-plane motion in a single image plane without affecting the acquisition speed. Postprocessing unravels this encoding in order to directly track the 3-D displacement of every point within the image plane throughout an entire image sequence. Experimental results include a phantom validation experiment, which compares zHARP to phase contrast imaging, and an in vivo study of a normal human volunteer. Results demonstrate that the simultaneous extraction of in-plane and through-plane displacements from tagged images is feasible.
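The zHARP encoding itself is not reproduced here, but the underlying HARP-style principle, reading displacement out of the phase of the tag modulation, can be illustrated in one dimension; a minimal sketch with synthetic data:

```python
# 1-D illustration of phase -> displacement in tagged MRI. The actual
# zHARP combination of in-plane and through-plane encodings is not
# reproduced; tag frequency and displacement are illustrative.
import numpy as np

k_tag = 2 * np.pi / 8.0          # tag spatial frequency (rad/mm), assumed
x = np.linspace(0, 64, 512)      # positions in mm
d_true = 1.3                     # true displacement in mm

img0 = np.cos(k_tag * x)         # tagged "image" at time 0
img1 = np.cos(k_tag * (x - d_true))  # tags move with the tissue

def harmonic_phase(img):
    """Harmonic phase via a crude analytic signal (zero negative freqs)."""
    spec = np.fft.fft(img)
    spec[len(spec) // 2:] = 0
    return np.angle(np.fft.ifft(2 * spec))

dphi = np.angle(np.exp(1j * (harmonic_phase(img1) - harmonic_phase(img0))))
d_est = -np.median(dphi) / k_tag  # phase shift -> displacement
print(f"estimated displacement: {d_est:.2f} mm")  # ~1.30 mm
```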
Abstract:
At high magnetic field strengths (≥ 3T), the radiofrequency wavelength used in MRI is of the same order of magnitude as (or smaller than) the typical sample size, making transmit magnetic field (B1+) inhomogeneities more prominent. Methods such as radiofrequency shimming and transmit SENSE have been proposed to mitigate these undesirable effects. A prerequisite for such approaches is an accurate and rapid characterization of the B1+ field in the organ of interest. In this work, a new phase-sensitive three-dimensional B1+-mapping technique is introduced that allows the acquisition of a 64 × 64 × 8 B1+-map in ≈ 20 s, yielding an accurate mapping of the relative B1+ with a 10-fold dynamic range (0.2-2 times the nominal B1+). Moreover, the predominant use of low flip angle excitations in the presented sequence minimizes the specific absorption rate, which is an important asset for in vivo B1+-shimming procedures at high magnetic fields. The proposed methodology was validated in phantom experiments and demonstrated good results in phantom and human B1+-shimming using an 8-channel transmit-receive array.
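The paper's phase-sensitive sequence is its own contribution; as a generic illustration of what a B1+ map is, the sketch below implements the classical double-angle method instead (a different, magnitude-based technique that assumes full relaxation between excitations), with illustrative values.

```python
# Double-angle method (DAM) for B1+ mapping, shown as a generic stand-in:
# two images at nominal flips alpha and 2*alpha give the actual local
# flip angle, hence the B1+ relative to nominal.
import numpy as np

def relative_b1_dam(S1, S2, alpha_nominal_deg):
    """S1, S2: magnitude images at nominal flip angles alpha and 2*alpha."""
    ratio = np.clip(S2 / (2.0 * S1 + 1e-12), -1.0, 1.0)  # = cos(alpha_actual)
    alpha_actual = np.degrees(np.arccos(ratio))
    return alpha_actual / alpha_nominal_deg               # relative B1+

# Toy check: a pixel receiving 80% of the nominal B1+ at alpha = 60 deg.
a = np.radians(0.8 * 60)
print(relative_b1_dam(np.sin(a), np.sin(2 * a), 60.0))    # ~0.8
```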
Abstract:
The worldwide prevalence of smoking has been estimated at about 50% in men and 10% in women, with large variations among the populations studied. Smoking has been shown to affect many organ systems, resulting in severe morbidity and increased mortality. In addition, smoking has been identified as a predictor of ten-year fracture risk in men and women, largely independent of an individual's bone mineral density. This finding eventually led to the incorporation of this risk factor into FRAX®, an algorithm that has been developed to calculate an individual's ten-year fracture risk. However, only limited or conflicting data are available on a possible association between smoking dose, duration, length of time after cessation, type of tobacco, and fracture risk, limiting this risk factor's applicability in the context of FRAX®.
Abstract:
The high molecular weight and low concentration of brain glycogen render its noninvasive quantification challenging. Therefore, the gain in quantification precision of localized 13C MR from 9.4 to 14.1 T was investigated. The signal-to-noise ratio increased by 66%, slightly offset by an increase in T1 from 332 ± 15 to 521 ± 34 ms. Isotopic enrichment after long-term 13C administration was comparable (≈ 40%), as was the nominal linewidth of glycogen C1 (≈ 50 Hz). Among the factors contributing to the observed 66% increase in signal-to-noise ratio, the longer T1 relaxation time reduced the effective signal-to-noise ratio by only 10% at a repetition time of 1 s. The signal-to-noise ratio increase, together with the larger spectral dispersion at 14.1 T, resulted in a better-defined baseline, which allowed more accurate fitting. Quantified glycogen concentrations were 5.8 ± 0.9 mM at 9.4 T and 6.0 ± 0.4 mM at 14.1 T; the decreased standard deviation demonstrates the compounded effect of increased magnetization and improved baseline on the precision of glycogen quantification.
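The stated ~10% T1 penalty on effective signal-to-noise ratio at a repetition time of 1 s can be checked numerically, assuming simple saturation-recovery weighting 1 - exp(-TR/T1) for repeated excitations (an assumption; the acquisition details are in the paper):

```python
# Worked check of the ~10% effective-SNR penalty from the longer T1,
# under an assumed 1 - exp(-TR/T1) saturation factor.
import math

TR = 1.0                                 # repetition time, s
sat = {T1: 1 - math.exp(-TR / T1)
       for T1 in (0.332, 0.521)}         # T1 at 9.4 T and 14.1 T, s
print(sat)                               # {0.332: ~0.95, 0.521: ~0.85}
print(sat[0.521] / sat[0.332])           # ~0.90, i.e. a ~10% reduction
```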
Abstract:
The ability to determine the location and relative strength of all transcription-factor binding sites in a genome is important both for a comprehensive understanding of gene regulation and for effective promoter engineering in biotechnological applications. Here we present a bioinformatically driven experimental method to accurately define the DNA-binding sequence specificity of transcription factors. A generalized profile was used as a predictive quantitative model for binding sites, and its parameters were estimated from in vitro-selected ligands using standard hidden Markov model training algorithms. Computer simulations showed that several thousand low- to medium-affinity sequences are required to generate a profile of the desired accuracy. To produce data on this scale, we applied high-throughput genomics methods to the biochemical problem addressed here. A method combining systematic evolution of ligands by exponential enrichment (SELEX) and serial analysis of gene expression (SAGE) protocols was coupled to an automated quality-controlled sequence extraction procedure based on Phred quality scores. This allowed the sequencing of a database of more than 10,000 potential DNA ligands for the CTF/NFI transcription factor. The resulting binding-site model defines the sequence specificity of this protein with a degree of accuracy not achieved previously, thereby making it possible to identify previously unknown regulatory sequences in genomic DNA. A covariance analysis of the selected sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism.
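The paper trains a generalized profile with HMM machinery; the simplest special case of such a quantitative binding-site model is a position weight matrix, sketched below on invented toy sites purely to illustrate the idea:

```python
# Minimal position weight matrix (PWM) estimated from aligned sites,
# scored as log-odds against a uniform background. A PWM is only the
# simplest relative of the generalized profile used in the paper.
import math
from collections import Counter

def pwm_from_sites(sites, pseudocount=1.0, background=0.25):
    L = len(sites[0])
    assert all(len(s) == L for s in sites)
    pwm = []
    for i in range(L):
        counts = Counter(s[i] for s in sites)
        total = len(sites) + 4 * pseudocount
        pwm.append({b: math.log(((counts[b] + pseudocount) / total) / background)
                    for b in "ACGT"})
    return pwm

def score(pwm, seq):
    """Higher score = closer to the learned specificity."""
    return sum(col[b] for col, b in zip(pwm, seq))

sites = ["TTGGC", "TTGGA", "TAGGC", "TTGCC", "CTGGC"]  # toy example sites
model = pwm_from_sites(sites)
print(score(model, "TTGGC"), score(model, "AAAAA"))
```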
Abstract:
For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially relative to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods performed more poorly or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data. Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
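For readers unfamiliar with MRP, the Baum/Ragan coding it relies on is easy to sketch: each clade of each input tree becomes a binary character, with taxa absent from that tree coded as missing. A minimal illustration on toy trees (the parsimony search itself is left to standard software):

```python
# Baum/Ragan matrix representation with parsimony (MRP) coding:
# 1 = taxon inside the clade, 0 = outside, ? = taxon absent from the tree.
def mrp_matrix(trees, all_taxa):
    """trees: list of (taxa_in_tree, clades); each clade is a frozenset."""
    matrix = {t: [] for t in all_taxa}
    for taxa_in_tree, clades in trees:
        for clade in clades:
            for t in all_taxa:
                if t not in taxa_in_tree:
                    matrix[t].append("?")
                else:
                    matrix[t].append("1" if t in clade else "0")
    return {t: "".join(chars) for t, chars in matrix.items()}

# Two partially overlapping toy input trees on taxa A-D.
trees = [
    ({"A", "B", "C"}, [frozenset({"A", "B"})]),
    ({"B", "C", "D"}, [frozenset({"C", "D"})]),
]
for taxon, row in sorted(mrp_matrix(trees, ["A", "B", "C", "D"]).items()):
    print(taxon, row)   # A 1?, B 10, C 01, D ?1
```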
Abstract:
We have developed a digital holographic microscope (DHM), in a transmission mode, especially dedicated to the quantitative visualization of phase objects such as living cells. The method is based on an original numerical algorithm presented in detail elsewhere [Cuche et al., Appl. Opt. 38, 6994 (1999)]. DHM images of living cells in culture are shown for what is, to our knowledge, the first time. They represent the distribution of the optical path length over the cell, which has been measured with subwavelength accuracy. These DHM images are compared with images obtained with the widely used phase-contrast and Nomarski differential interference contrast techniques.
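The hologram reconstruction follows Cuche et al. and is not reproduced here, but the quantity the DHM images represent is easy to state: the optical path length recovered from the reconstructed phase. A minimal sketch, with the wavelength and refractive indices assumed for illustration:

```python
# Phase -> optical path length (OPL) -> cell thickness. Wavelength and
# refractive indices are illustrative assumptions, not the paper's values.
import numpy as np

wavelength_nm = 633.0                      # e.g. He-Ne illumination (assumed)

def phase_to_opl(phase_rad):
    """OPL in nm from the (unwrapped) reconstructed phase."""
    return phase_rad * wavelength_nm / (2 * np.pi)

def opl_to_thickness(opl_nm, n_cell=1.38, n_medium=1.33):
    """Cell thickness in nm from OPL, given assumed refractive indices."""
    return opl_nm / (n_cell - n_medium)

phase = np.array([0.5, 1.2, 2.9])          # toy unwrapped phase values (rad)
print(phase_to_opl(phase))                 # OPL per pixel, nm
print(opl_to_thickness(phase_to_opl(phase)))
```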
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents the gold standard in TDM but requires computing assistance, and over the last decades computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a standard personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the characteristics of the available software. The number of drugs handled varies widely, and 8 programs allow users to add their own drug models. 10 computer programs can compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Although 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic drug monitoring is still growing. Although developers have put effort into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
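The core computation these tools automate, a posteriori Bayesian (MAP) adjustment, can be sketched for a one-compartment IV model at steady state; all priors and numbers below are invented placeholders, not taken from any benchmarked program:

```python
# MAP estimation of an individual clearance from one measured trough,
# then dose rescaling to a target. One-compartment IV bolus at steady
# state is assumed; every number here is illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

CL_pop, omega_CL = 5.0, 0.3        # population clearance (L/h), log-scale SD
sigma = 0.15                       # proportional residual error
V, tau, dose = 50.0, 12.0, 500.0   # volume (L), interval (h), dose (mg)

def conc(CL, t_after_dose):
    """Steady-state concentration for repeated IV bolus dosing."""
    k = CL / V
    return (dose / V) * np.exp(-k * t_after_dose) / (1 - np.exp(-k * tau))

def neg_log_posterior(log_CL, c_obs, t_obs):
    pred = conc(np.exp(log_CL), t_obs)
    loglik = -0.5 * ((c_obs - pred) / (sigma * pred)) ** 2
    logprior = -0.5 * ((log_CL - np.log(CL_pop)) / omega_CL) ** 2
    return -(loglik + logprior)

c_obs, t_obs = 6.0, 11.0           # measured trough: 6 mg/L at 11 h
res = minimize_scalar(neg_log_posterior, bounds=(-2, 4), method="bounded",
                      args=(c_obs, t_obs))
CL_map = np.exp(res.x)
new_dose = dose * 8.0 / conc(CL_map, t_obs)   # rescale to an 8 mg/L target
print(f"MAP clearance: {CL_map:.2f} L/h, suggested dose: {new_dose:.0f} mg")
```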
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents the gold standard in TDM but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the characteristics of the available software. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. In addition, 8 programs allow new drug models to be added based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them can compute a posteriori Bayesian dosage adaptation based on a blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Although 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capability of data storage, and automated report generation.
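The ranking mechanics described in the Methods are a plain weighted sum over the grid criteria; a minimal sketch with invented weights and scores (the study's actual grid values are not reproduced here):

```python
# Weighted scoring of one program against the standardized grid.
# Weights and scores are placeholders, not the study's values.
criteria_weights = {
    "pharmacokinetic relevance": 0.35,
    "user-friendliness": 0.25,
    "computing aspects": 0.15,
    "interfacing": 0.15,
    "storage": 0.10,
}

def weighted_score(scores: dict[str, float]) -> float:
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

print(weighted_score({"pharmacokinetic relevance": 8, "user-friendliness": 7,
                      "computing aspects": 6, "interfacing": 5, "storage": 9}))
```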
Abstract:
Changes in intracellular Na(+) concentration underlie essential neurobiological processes, but few reliable tools exist for their measurement. Here we characterize a new synthetic Na(+)-sensitive fluorescent dye, Asante Natrium Green (ANG), with unique properties. This indicator was excitable in the visible spectrum and by two-photon illumination, suffered little photobleaching, and localized to the cytosol, where it remained for long durations without noticeable unwanted effects on basic cell properties. When used in brain tissue, ANG yielded a bright fluorescent signal during physiological Na(+) responses in both neurons and astrocytes. Synchronous electrophysiological and fluorometric recordings showed that ANG produced accurate Na(+) measurements in situ. This new Na(+) indicator opens innovative ways of probing neuronal circuits.
Abstract:
PURPOSE: To improve tag persistence throughout the whole cardiac cycle by providing constant tag contrast across all cardiac phases when using balanced steady-state free precession (bSSFP) imaging. MATERIALS AND METHODS: The flip angles of the imaging radiofrequency pulses were optimized to compensate for the fading of the tagging contrast-to-noise ratio (Tag-CNR) at later cardiac phases in bSSFP imaging. Complementary spatial modulation of magnetization (CSPAMM) tagging was implemented to improve the Tag-CNR. Numerical simulations were performed to examine the behavior of the Tag-CNR with the proposed method and to compare the resulting Tag-CNR with that obtained from the more commonly used spoiled gradient echo (SPGR) imaging. A gel phantom and five healthy human volunteers were scanned on a 1.5T scanner using bSSFP imaging with and without the proposed technique. The phantom was also scanned with SPGR imaging. RESULTS: With the proposed technique, the Tag-CNR remained almost constant during the whole cardiac cycle. Using bSSFP imaging, the Tag-CNR was about double that obtained with SPGR. CONCLUSION: Tag persistence was significantly improved when the proposed method was applied, with better Tag-CNR during the diastolic cardiac phase. The improved Tag-CNR will support automated tagging analysis and quantification methods.
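The paper's bSSFP flip-angle optimization is its own contribution; the classical ramped-flip-angle principle it builds on, the Fischer-style recursion for spoiled CSPAMM imaging tan(θn) = E1 · sin(θn+1), can be sketched as follows, with illustrative timing and T1:

```python
# Ramped flip angles for constant tag signal in spoiled imaging: step
# backward from the final frame's flip angle using tan(a_n) = E1*sin(a_n+1).
# Shown only as the classical principle; the paper's bSSFP optimization
# differs. Frame interval, T1, and final flip are illustrative.
import math

def ramped_flips(n_frames=20, last_flip_deg=25.0,
                 frame_interval_ms=35.0, t1_ms=850.0):
    e1 = math.exp(-frame_interval_ms / t1_ms)          # T1 decay per frame
    flips = [math.radians(last_flip_deg)]
    for _ in range(n_frames - 1):
        flips.append(math.atan(e1 * math.sin(flips[-1])))  # earlier frame
    flips.reverse()                                    # chronological order
    return [math.degrees(a) for a in flips]

print([round(a, 1) for a in ramped_flips()])           # increasing ramp
```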