58 results for Free Software
at Université de Lausanne, Switzerland
Abstract:
Accurate determination of subpopulation sizes in bimodal populations remains problematic, yet it represents a powerful way to compare cellular heterogeneity under different environmental conditions. So far, most studies have relied on qualitative descriptions of population distribution patterns, on population-independent descriptors, or on arbitrary placement of thresholds distinguishing biological ON from OFF states. We found that all these methods fall short of accurately describing small subpopulation sizes in bimodal populations. Here we propose a simple, statistics-based method for the analysis of small subpopulation sizes for use in the free software environment R and test this method on real as well as simulated data. Four so-called population splitting methods were designed, with different algorithms for estimating subpopulation sizes in bimodal populations. All four methods proved more precise than previously used methods when analyzing subpopulation sizes of transfer-competent cells arising in populations of the bacterium Pseudomonas knackmussii B13. The methods' resolving powers were further explored by bootstrapping and simulations. Two of the methods were not severely limited by the proportions of subpopulations they could estimate correctly, whereas the other two only allowed accurate subpopulation quantification when it amounted to less than 25% of the total population. In contrast, only one method remained sufficiently accurate with subpopulations smaller than 1% of the total population. This study proposes a number of rational approximations for quantifying small subpopulations and offers an easy-to-use protocol for their implementation in the open-source statistical software environment R.
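The population-splitting idea can be illustrated with a generic two-component Gaussian mixture fitted by expectation-maximization, where the smaller component weight estimates the minority (e.g. ON) subpopulation fraction. This is a minimal Python sketch, not the authors' R implementation; the data, function name, and initialisation are all illustrative.

```python
import math
import random

def fit_two_gaussians(x, iters=200):
    """Minimal EM fit of a two-component 1D Gaussian mixture; the
    component weights estimate the two subpopulation fractions."""
    x = sorted(x)
    n = len(x)
    # initialise the means at the extremes so well-separated modes split
    m1, m2 = x[0], x[-1]
    s1 = s2 = (x[-1] - x[0]) / 4 or 1.0
    w1 = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for xi in x:
            p1 = w1 * math.exp(-((xi - m1) ** 2) / (2 * s1 ** 2)) / s1
            p2 = (1 - w1) * math.exp(-((xi - m2) ** 2) / (2 * s2 ** 2)) / s2
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weights, means and standard deviations
        n1 = sum(r)
        w1 = n1 / n
        m1 = sum(ri * xi for ri, xi in zip(r, x)) / n1
        m2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / (n - n1)
        s1 = math.sqrt(sum(ri * (xi - m1) ** 2 for ri, xi in zip(r, x)) / n1) or 1e-3
        s2 = math.sqrt(sum((1 - ri) * (xi - m2) ** 2 for ri, xi in zip(r, x)) / (n - n1)) or 1e-3
    return (w1, m1, s1), (1 - w1, m2, s2)

# simulated bimodal population: 5% "ON" cells around 10, 95% "OFF" around 0
random.seed(1)
data = [random.gauss(0, 1) for _ in range(950)] + [random.gauss(10, 1) for _ in range(50)]
c1, c2 = fit_two_gaussians(data)
minority_fraction = min(c1[0], c2[0])
print(round(minority_fraction, 2))
```

Unlike a fixed ON/OFF threshold, the mixture weight adapts to the shape of both modes, which is the kind of behaviour the abstract's statistics-based splitting methods aim for.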
Abstract:
Due to the existence of free software and pedagogical guides, the use of data envelopment analysis (DEA) has been further democratized in recent years. Nowadays, it is quite usual for practitioners and decision makers with little or no knowledge of operational research to run their own efficiency analyses themselves. Within DEA, several alternative models allow for an environmental adjustment. Five such models, each easily accessible to and usable by practitioners and decision makers, are applied to the empirical case of the 90 primary schools of the State of Geneva, Switzerland. As the State of Geneva practices an upstream positive-discrimination policy towards disadvantaged schools, this empirical case is particularly appropriate for an environmental adjustment. The majority of the DEA models deliver divergent results. This is a matter of concern for applied researchers and a matter of confusion for practitioners and decision makers. From a political standpoint, these diverging results could lead to different, potentially opposite, decisions depending on the model on which they are based.
Abstract:
Due to the existence of free software and pedagogical guides, the use of Data Envelopment Analysis (DEA) has been further democratized in recent years. Nowadays, it is quite usual for practitioners and decision makers with little or no knowledge of operational research to run their own efficiency analysis. Within DEA, several alternative models allow for an environmental adjustment. Four alternative models, each user-friendly and easily accessible to practitioners and decision makers, are applied to empirical data from 90 primary schools in the State of Geneva, Switzerland. Results show that the majority of alternative models deliver divergent results. From a political and a managerial standpoint, these diverging results could lead to potentially ineffective decisions. As no consensus emerges on the best model to use, practitioners and decision makers may be tempted to select the model that is right for them, in other words, the model that best reflects their own preferences. Further studies should investigate how an appropriate multi-criteria decision analysis method could help decision makers to select the right model.
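DEA efficiency scores are in general obtained by solving one linear program per decision-making unit, but in the degenerate case of a single input and a single output under constant returns to scale, the CCR score reduces to each unit's productivity ratio divided by the best observed ratio. The sketch below shows only that degenerate case, with invented school data; it is not one of the four models the study compares.

```python
def ccr_efficiency(inputs, outputs):
    """DEA efficiency for the degenerate one-input, one-output CCR
    model under constant returns to scale: each unit's output/input
    ratio scaled by the best ratio, so the frontier unit scores 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# hypothetical schools: teaching hours (input) vs. test-score points (output)
hours = [100.0, 120.0, 90.0]
scores = [80.0, 90.0, 81.0]
effs = ccr_efficiency(hours, scores)
print([round(e, 2) for e in effs])  # → [0.89, 0.83, 1.0]
```

With multiple inputs and outputs, or an environmental adjustment as in the study, a linear-programming solver is required and the models diverge in exactly the ways the abstract describes.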
Abstract:
The analysis of rockfall characteristics and spatial distribution is fundamental to understanding and modelling the main factors that predispose to failure. In our study we analysed LiDAR point clouds aiming to: (1) detect and characterise single rockfalls; and (2) investigate their spatial distribution. To this end, different cluster algorithms were applied: (1a) Nearest Neighbour Clutter Removal (NNCR) in combination with Expectation-Maximization (EM) to separate feature points from clutter; (1b) a density-based algorithm (DBSCAN) to isolate the single clusters (i.e. the rockfall events); and (2) Ripley's K-function to investigate the global spatial pattern of the extracted rockfalls. The method allowed proper identification and characterization of more than 600 rockfalls that occurred on a cliff located in Puigcercos (Catalonia, Spain) over a time span of six months. The spatial distribution of these events showed that the rockfalls were clustered within a well-defined distance range. Computations were carried out using the free software environment R for statistical computing and graphics. Understanding the spatial distribution of precursory rockfalls may shed light on the forecasting of future failures.
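Of the spatial statistics mentioned, Ripley's K-function is the simplest to sketch: K(r) is the area times the number of ordered point pairs closer than r, divided by the squared point count, and values well above the pi * r^2 expected under complete spatial randomness indicate clustering. The paper's computations were done in R; this generic Python version ignores edge corrections and uses invented coordinates.

```python
import math

def ripley_k(points, r, area):
    """Naive (edge-effect-ignoring) estimate of Ripley's K at distance r:
    K(r) = area / n^2 * number of ordered pairs closer than r."""
    n = len(points)
    close = 0
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            if i != j and math.hypot(xi - xj, yi - yj) <= r:
                close += 1
    return area * close / n ** 2

# two tight clumps in a unit square: K(0.1) far exceeds the value
# expected under complete spatial randomness (pi * 0.1**2 ~ 0.031)
clustered = [(0.2 + 0.01 * k, 0.2) for k in range(5)] + \
            [(0.8, 0.8 + 0.01 * k) for k in range(5)]
k_hat = ripley_k(clustered, 0.1, area=1.0)
print(k_hat)  # → 0.4
```

Comparing the empirical K-curve against the CSR expectation over a range of r values is what reveals the "well-defined distance range" of clustering reported in the abstract.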
Abstract:
OBJECTIVES: The goal of the present study was to develop a strategy for three-dimensional (3D) volume acquisition along the major axes of the coronary arteries. BACKGROUND: For high-resolution 3D free-breathing coronary magnetic resonance angiography (MRA), coverage of the coronary artery tree may be limited due to excessive measurement times associated with large volume acquisitions. Planning the 3D volume along the major axis of the coronary vessels may help to overcome such limitations. METHODS: Fifteen healthy adult volunteers and seven patients with X-ray angiographically confirmed coronary artery disease underwent free-breathing navigator-gated and corrected 3D coronary MRA. For an accurate volume targeting of the high resolution scans, a three-point planscan software tool was applied. RESULTS: The average length of contiguously visualized left main and left anterior descending coronary artery was 81.8 +/- 13.9 mm in the healthy volunteers and 76.2 +/- 16.5 mm in the patients (p = NS). For the right coronary artery, a total length of 111.7 +/- 27.7 mm was found in the healthy volunteers and 79.3 +/- 4.6 mm in the patients (p = NS). Comparing coronary MRA and X-ray angiography, a good agreement of anatomy and pathology was found in the patients. CONCLUSIONS: Double-oblique submillimeter free-breathing coronary MRA allows depiction of extensive parts of the native coronary arteries. The results obtained in patients suggest that the method has the potential to be applied in broader prospective multicenter studies where coronary MRA is compared with X-ray angiography.
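Geometrically, the three-point planscan idea reduces to fitting a plane through three user-selected landmarks along the vessel course: the double-oblique imaging volume is oriented along that plane, whose orientation is the normalized cross product of two in-plane vectors. A minimal sketch with arbitrary coordinates (not the actual planscan tool):

```python
def plane_from_points(p1, p2, p3):
    """Unit normal of the plane through three points; an imaging slab
    aligned with this plane covers the vessel's major axis."""
    ax, ay, az = (p2[i] - p1[i] for i in range(3))
    bx, by, bz = (p3[i] - p1[i] for i in range(3))
    # cross product of the two in-plane edge vectors
    nx = ay * bz - az * by
    ny = az * bx - ax * bz
    nz = ax * by - ay * bx
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)

# three hypothetical landmarks along a coronary course (mm)
normal = plane_from_points((0, 0, 0), (10, 0, 0), (0, 10, 0))
print(normal)  # → (0.0, 0.0, 1.0)
```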
Abstract:
The authors developed a free-breathing black-blood coronary magnetic resonance (MR) angiographic technique with a potential for exclusive visualization of the coronary blood pool. Results with the MR angiographic technique were evaluated in eight healthy subjects and four patients with coronary disease identified at conventional angiography. This MR angiographic technique accurately depicted luminal disease in the patients and permitted visualization of extensive continuous segments of the native coronary tree in both the healthy subjects and the patients. Black-blood coronary MR angiography provides an alternative source of contrast enhancement.
Abstract:
BACKGROUND: Transient balanced steady-state free-precession (bSSFP) has shown substantial promise for noninvasive assessment of coronary arteries but its utilization at 3.0 T and above has been hampered by susceptibility to field inhomogeneities that degrade image quality. The purpose of this work was to refine, implement, and test a robust, practical single-breathhold bSSFP coronary MRA sequence at 3.0 T and to test the reproducibility of the technique. METHODS: A 3D, volume-targeted, high-resolution bSSFP sequence was implemented. Localized image-based shimming was performed to minimize inhomogeneities of both the static magnetic field and the radio frequency excitation field. Fifteen healthy volunteers and three patients with coronary artery disease underwent examination with the bSSFP sequence (scan time = 20.5 ± 2.0 seconds), and acquisitions were repeated in nine subjects. The images were quantitatively analyzed using a semi-automated software tool, and the repeatability and reproducibility of measurements were determined using regression analysis and intra-class correlation coefficient (ICC), in a blinded manner. RESULTS: The 3D bSSFP sequence provided uniform, high-quality depiction of coronary arteries (n = 20). The average visible vessel length of 100.5 ± 6.3 mm and sharpness of 55 ± 2% compared favorably with earlier reported navigator-gated bSSFP and gradient echo sequences at 3.0 T. Length measurements demonstrated a highly statistically significant degree of inter-observer (r = 0.994, ICC = 0.993), intra-observer (r = 0.894, ICC = 0.896), and inter-scan concordance (r = 0.980, ICC = 0.974). Furthermore, ICC values demonstrated excellent intra-observer, inter-observer, and inter-scan agreement for vessel diameter measurements (ICC = 0.987, 0.976, and 0.961, respectively), and vessel sharpness values (ICC = 0.989, 0.938, and 0.904, respectively). 
CONCLUSIONS: The 3D bSSFP acquisition, using a state-of-the-art MR scanner equipped with recently available technologies such as multi-transmit, 32-channel cardiac coil, and localized B0 and B1+ shimming, allows accelerated and reproducible multi-segment assessment of the major coronary arteries at 3.0 T in a single breathhold. This rapid sequence may be especially useful for functional imaging of the coronaries where the acquisition time is limited by the stress duration and in cases where low navigator-gating efficiency prohibits acquisition of a free breathing scan in a reasonable time period.
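The intra-class correlation coefficients reported above can be computed in several variants; one common choice for observer-agreement studies is ICC(2,1), two-way random effects, absolute agreement, single measurement. Whether this exact variant matches the paper's analysis is an assumption, and the measurements below are invented.

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `data` is a list of subjects, each a list of k ratings."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # mean squares for rows (subjects), columns (observers) and error
    ms_r = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_c = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    ss_e = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
               for i in range(n) for j in range(k))
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# two observers measuring vessel length (mm) in five hypothetical scans
lengths = [[100.1, 100.4], [95.2, 95.0], [102.3, 102.8],
           [98.7, 98.5], [105.0, 105.2]]
icc = icc_2_1(lengths)
print(round(icc, 3))
```

Values close to 1, as in the study's inter-observer and inter-scan results, mean that between-subject variability dominates observer and error variability.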
Abstract:
AIMS: Although the coronary artery vessel wall can be imaged non-invasively using magnetic resonance imaging (MRI), the in vivo reproducibility of wall thickness measures has not been previously investigated. Using a refined magnetization preparation scheme, we sought to assess the reproducibility of three-dimensional (3D) free-breathing black-blood coronary MRI in vivo. METHODS AND RESULTS: MRI vessel wall scans parallel to the right coronary artery (RCA) were obtained in 18 healthy individuals (age range 25-43, six women), with no known history of coronary artery disease, using a 3D dual-inversion navigator-gated black-blood spiral imaging sequence. Vessel wall scans were repeated 1 month later in eight subjects. The visible vessel wall segment and the wall thickness were quantitatively assessed using a semi-automatic tool and the intra-observer, inter-observer, and inter-scan reproducibilities were determined. The average imaged length of the RCA vessel wall was 44.5+/-7 mm. The average wall thickness was 1.6+/-0.2 mm. There was a highly significant intra-observer (r=0.97), inter-observer (r=0.94), and inter-scan (r=0.90) correlation for wall thickness (all P<0.001). There was also a significant agreement for intra-observer, inter-observer, and inter-scan measurements on Bland-Altman analysis. The intra-class correlation coefficients for intra-observer (r=0.97), inter-observer (r=0.92), and inter-scan (r=0.86) analyses were also excellent. CONCLUSION: The use of black-blood free-breathing 3D MRI in conjunction with semi-automated analysis software allows for reproducible measurements of right coronary arterial vessel-wall thickness. This technique may be well-suited for non-invasive longitudinal studies of coronary atherosclerosis.
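The Bland-Altman analysis used above compares paired measurements via the mean difference (bias) and 95% limits of agreement. A generic sketch, with invented paired wall-thickness values rather than the study's data:

```python
import math

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two sets
    of paired measurements (e.g. scan vs. repeat scan)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired wall-thickness measurements (mm), scan vs. rescan
scan1 = [1.5, 1.7, 1.6, 1.8, 1.4, 1.6]
scan2 = [1.6, 1.6, 1.6, 1.7, 1.5, 1.6]
bias, (lo, hi) = bland_altman(scan1, scan2)
print(round(bias, 3))  # → 0.0
```

Agreement is judged by whether the bias is near zero and the limits of agreement are narrow relative to the measured quantity (here, a ~1.6 mm wall).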
Abstract:
Amplified fragment length polymorphism (AFLP) is a cheap and efficient protocol for generating large sets of genetic markers. This technique has become increasingly used during the last decade in various fields of biology, including population genomics, phylogeography, and genome mapping. Here, we present RawGeno, an R library dedicated to the automated scoring of AFLPs (i.e., the coding of electropherogram signals into ready-to-use datasets). Our program includes a complete suite of tools for binning, editing, visualizing, and exporting results obtained from AFLP experiments. RawGeno can either be used with command lines and program analysis routines or through a user-friendly graphical user interface. We describe the whole RawGeno pipeline along with recommendations for (a) setting up the analysis of electropherograms in combination with PeakScanner, a program freely distributed by Applied Biosystems; (b) performing quality checks; (c) defining bins and proceeding to scoring; (d) filtering nonoptimal bins; and (e) exporting results in different formats.
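The binning step (step c) can be illustrated with a deliberately simplified single-linkage scheme: sorted fragment sizes within a tolerance of the previous peak join the current bin. RawGeno's actual algorithm is more elaborate, and the peak sizes below are invented.

```python
def bin_peaks(positions, tol=0.5):
    """Greedy one-dimensional binning of fragment sizes: a sorted peak
    within `tol` of the current bin's most recent member joins that bin
    (single-linkage chaining); otherwise it starts a new bin."""
    bins = []
    for p in sorted(positions):
        if bins and p - bins[-1][-1] <= tol:
            bins[-1].append(p)
        else:
            bins.append([p])
    return bins

# fragment sizes (bp) pooled from several hypothetical samples
peaks = [100.1, 100.3, 150.0, 150.2, 150.4, 210.7]
bins = bin_peaks(peaks)
print(len(bins))  # → 3
```

Each resulting bin would then be scored as a presence/absence marker across samples, which is the "ready-to-use dataset" the abstract refers to.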
Abstract:
Serum-free aggregating cell cultures of fetal rat telencephalon were examined in a combined biochemical and double-labeling immunocytochemical study for the developmental expression of glial fibrillary acidic protein (GFAP) and glutamine synthetase (GS). It was found that these two astroglial markers are co-expressed at different developmental stages in vitro. During the phase of cellular maturation (i.e. between days 14 and 34), GFAP levels and GS activity increase rapidly and in parallel. At the same time, the number of immunoreactive cells increases, while the long and thick processes stained in early cultures gradually disappear. The present results demonstrate that in this particular cell culture system only one type of astrocyte develops, which expresses both GFAP and GS and attains a relatively high degree of maturation.
Abstract:
OBJECTIVE: Contemporary free-breathing non-contrast-enhanced cardiovascular magnetic resonance angiography (CMRA) was qualitatively and quantitatively evaluated to ascertain the reproducibility of the method for coronary artery luminal dimension measurements. SUBJECTS AND METHODS: Twenty-two healthy volunteers (mean age 32 +/- 7 years, 12 males) without coronary artery disease were imaged at 2 centers (1 each in Europe and North America) using navigator-gated and corrected SSFP CMRA on a commercial whole-body 1.5 T system. Repeat images of right (RCA, n = 21), left anterior descending (LAD, n = 14) and left circumflex (LCX, n = 14) coronary arteries were obtained in separate sessions using identical scan protocols and imaging parameters. True visible vessel length, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios, and the average luminal diameter over the first 4 cm of the vessel were measured. Intra-observer, inter-observer and inter-scan reproducibility of coronary artery luminal diameter were determined using Pearson's correlation, Bland-Altman analysis and intraclass correlation coefficients (ICC). RESULTS: CNR, SNR and the mean length of the RCA, LAD and LCX imaged for original and repeat scans were not significantly different (all p > 0.30). There was a high degree of intra-observer, inter-observer and inter-scan agreement for RCA, LAD and LCX luminal diameter respectively on Bland-Altman and ICC analysis (ICCs for RCA: 0.98, 0.98 and 0.86; LAD: 0.89, 0.89 and 0.63; LCX: 0.95, 0.94 and 0.79). CONCLUSION: In a 2-center study, we demonstrate that free-breathing 3D SSFP CMRA can visualize long continuous segments of coronary vessels with highly reproducible measurements of luminal diameter.
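The SNR and CNR figures of merit used above are conventionally defined from region-of-interest intensities: SNR as mean vessel signal over the standard deviation of background noise, CNR as the vessel-myocardium signal difference over the same noise estimate. A generic sketch with invented ROI values (this is the standard definition, not necessarily the exact measurement procedure of the study):

```python
import statistics

def snr_cnr(vessel, myocardium, noise):
    """SNR = mean(vessel) / sd(noise);
    CNR = (mean(vessel) - mean(myocardium)) / sd(noise)."""
    sd_noise = statistics.stdev(noise)
    snr = statistics.mean(vessel) / sd_noise
    cnr = (statistics.mean(vessel) - statistics.mean(myocardium)) / sd_noise
    return snr, cnr

# hypothetical ROI intensities from a coronary MRA slice
vessel = [220.0, 230.0, 225.0, 228.0]
myo = [120.0, 118.0, 122.0, 121.0]
background = [4.0, 6.0, 5.0, 5.0]
snr, cnr = snr_cnr(vessel, myo, background)
print(snr > cnr > 0)  # → True
```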
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each program. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
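The a posteriori (Bayesian) adjustment step that these programs automate can be sketched for the simplest case: a one-compartment IV bolus model, a log-normal population prior on clearance, and a single measured concentration, with the maximum a posteriori (MAP) estimate found by grid search. All parameter values, the model, and the function name are illustrative and not taken from any of the evaluated programs.

```python
import math

def map_clearance(dose, v, t, c_obs, cl_pop, omega=0.3, sigma=0.5):
    """Grid-search MAP estimate of clearance for a one-compartment IV
    bolus model C(t) = dose/V * exp(-CL/V * t), combining a log-normal
    population prior (spread omega) with one observed level (error sigma)."""
    best_cl, best_lp = cl_pop, -float("inf")
    for i in range(1, 400):
        cl = cl_pop * i / 100.0  # grid from 1% to 399% of the typical value
        pred = dose / v * math.exp(-cl / v * t)
        # log-posterior = Gaussian log-likelihood + log-normal log-prior
        log_post = (-(c_obs - pred) ** 2 / (2 * sigma ** 2)
                    - math.log(cl / cl_pop) ** 2 / (2 * omega ** 2))
        if log_post > best_lp:
            best_cl, best_lp = cl, log_post
    return best_cl

# hypothetical patient: 1000 mg bolus, V = 50 L, level 12 mg/L at 8 h;
# population clearance 5 L/h
cl_hat = map_clearance(dose=1000, v=50, t=8, c_obs=12, cl_pop=5.0)
print(round(cl_hat, 2))
```

The MAP estimate falls between the value implied by the measurement alone (about 3.2 L/h here) and the population prior (5 L/h), shrinking toward the prior as the assay error grows; the individualized clearance can then be used to propose the next dose for a target concentration.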