42 results for Optics in computing
at Université de Lausanne, Switzerland
Abstract:
For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on the one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria; on the other, the average consensus, split fit, and most similar supertree methods performed more poorly or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting that the heuristic searches behaved correctly and that the algorithms are relatively insensitive to data set size and missing data. Results also showed that MRP analyses can reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
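The matrix coding step at the heart of MRP can be shown in a short sketch. The following is a minimal, illustrative implementation of Baum-Ragan coding (the standard MRP encoding; the abstract does not detail which variant is used): each clade of each input tree becomes one binary character, and taxa absent from a tree are scored as missing.

```python
# Minimal sketch of Baum-Ragan matrix representation coding for MRP:
# each non-trivial clade of each input tree becomes one binary character.
# 1 = taxon inside the clade, 0 = in the tree but outside the clade,
# ? = taxon absent from that input tree.

def mrp_matrix(input_trees):
    """input_trees: list of (taxa, clades) pairs; taxa is a set of taxon
    names, clades a list of sets of taxon names (non-trivial clades)."""
    all_taxa = sorted(set().union(*(taxa for taxa, _ in input_trees)))
    matrix = {taxon: [] for taxon in all_taxa}
    for taxa, clades in input_trees:
        for clade in clades:
            for taxon in all_taxa:
                if taxon not in taxa:
                    matrix[taxon].append("?")  # missing from this source tree
                elif taxon in clade:
                    matrix[taxon].append("1")
                else:
                    matrix[taxon].append("0")
    return matrix

# Two small overlapping input trees: ((A,B),C) and ((C,D),B).
trees = [
    ({"A", "B", "C"}, [{"A", "B"}]),
    ({"B", "C", "D"}, [{"C", "D"}]),
]
for taxon, row in sorted(mrp_matrix(trees).items()):
    print(taxon, "".join(row))
# The resulting matrix is then analysed under parsimony to yield the supertree.
```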
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation represents the gold standard in TDM but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.
Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.
Results: Twelve software tools were identified, tested, and ranked, representing a comprehensive review of the characteristics of the available software. The number of drugs handled varies widely, and eight programs allow the user to add their own drug models. Ten programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while nine are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also show good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly.
Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
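As an illustration of the a posteriori (Bayesian) adjustment these tools perform, the sketch below computes a maximum a posteriori estimate of individual pharmacokinetic parameters for a one-compartment intravenous model and then derives a dose for a target concentration. All priors, error terms, and observed values are illustrative assumptions of ours, not values from any of the benchmarked programs.

```python
# MAP-Bayesian individualisation for a one-compartment IV bolus model:
# C(t) = dose / V * exp(-CL / V * t). Population priors are log-normal;
# all numbers below are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

pop_mean = np.array([np.log(5.0), np.log(50.0)])  # ln CL (L/h), ln V (L)
pop_sd   = np.array([0.3, 0.2])                   # between-subject SDs
sigma    = 0.5                                    # residual SD (mg/L)

dose, t_obs, c_obs = 500.0, 6.0, 4.2              # mg, h, measured mg/L

def neg_log_posterior(theta):
    cl, v = np.exp(theta)
    c_pred = dose / v * np.exp(-cl / v * t_obs)
    log_lik = -0.5 * ((c_obs - c_pred) / sigma) ** 2      # Gaussian error
    log_prior = -0.5 * np.sum(((theta - pop_mean) / pop_sd) ** 2)
    return -(log_lik + log_prior)

fit = minimize(neg_log_posterior, pop_mean)       # start at population mean
cl_i, v_i = np.exp(fit.x)

# Dose needed to reach a (hypothetical) target concentration at t_obs:
target = 6.0
print(f"Individual CL={cl_i:.2f} L/h, V={v_i:.1f} L, "
      f"suggested dose={target * v_i * np.exp(cl_i / v_i * t_obs):.0f} mg")
```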
Abstract:
Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms, so solutions are proposed to facilitate this programming at both the platform and node levels. The first contribution presented in this document proposes a combination of agent-oriented programming with the principles of bio-inspiration (phylogenesis, ontogenesis, and epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method to efficiently program parallelizable applications on each computing node of this platform.
Abstract:
Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. UV irradiance is monitored via different techniques, including ground measurements and satellite observations. However, it is difficult to translate such observations into human UV exposure or dose because of confounding factors (shape of the exposed surface, shading, behavior, etc.). A collaboration between public health institutions, a meteorological office, and an institute specialized in computing techniques developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop this tool, which estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed separately for direct, diffuse, and reflected radiation. The radiation components are deduced from corresponding measurements of UV irradiance, and the related UV dose received by each triangle of the virtual manikin is computed, accounting for shading by other body parts and any protective measures. The model was verified with dosimetric measurements (n=54) in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately: the symmetric mean absolute percentage error was 13%, and half of the predictions were within 17% of the measurements. This model allows the assessment of outdoor occupational and recreational UV exposure, without requiring time-consuming individual dosimetry, and has numerous potential uses in skin cancer prevention and research. Using this tool, we investigated solar UV exposure patterns with respect to the relative contributions of direct, diffuse, and reflected radiation. We assessed exposure doses for various body parts and exposure scenarios of a standing individual (static and dynamic postures). As input, the model used erythemally weighted ground irradiance data measured in 2009 at Payerne, Switzerland. Year-round daily exposure (8 am to 5 pm) without protection was assumed. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g., midday during summer) but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose. Acute diffuse exposures were also observed on cloudy summer days. The importance of diffuse UV radiation should not be underestimated when advocating preventive measures. Messages focused on avoiding acute direct exposures may be of limited efficiency in preventing skin cancers associated with chronic exposure (e.g., squamous cell carcinomas).
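A minimal sketch of the per-triangle dose idea described above, under simplifying assumptions of our own (a single time step, a binary shading flag, an isotropic sky weighted by a view factor, and no reflected component):

```python
# Per-triangle UV dose for one time step: direct irradiance scaled by the
# cosine between the triangle normal and the sun direction (zero if shaded),
# plus an isotropic diffuse term weighted by the fraction of sky each
# triangle sees. All input values are illustrative.
import numpy as np

def triangle_dose(normals, areas, sun_dir, shaded, sky_view,
                  e_direct, e_diffuse, dt):
    """Return the erythemally weighted energy (J) received per triangle."""
    cos_incidence = np.clip(normals @ sun_dir, 0.0, None)
    direct = e_direct * cos_incidence * (~shaded)  # W/m^2, blocked if shaded
    diffuse = e_diffuse * sky_view                 # W/m^2
    return (direct + diffuse) * areas * dt         # J per triangle

normals = np.array([[0.0, 0.0, 1.0],   # horizontal patch (e.g. shoulder)
                    [1.0, 0.0, 0.0]])  # vertical patch (e.g. torso side)
areas = np.array([1e-4, 1e-4])         # m^2
sun_overhead = np.array([0.0, 0.0, 1.0])
dose = triangle_dose(normals, areas, sun_overhead,
                     shaded=np.array([False, True]),
                     sky_view=np.array([1.0, 0.5]),
                     e_direct=0.1, e_diffuse=0.05, dt=60.0)
print(dose)  # the vertical, shaded patch receives only diffuse radiation
```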
Abstract:
The goal of this study was to investigate the impact of computing parameters and of the location of volumes of interest (VOI) on the calculation of the 3D noise power spectrum (NPS), in order to determine an optimal set of computing parameters and propose a robust method for evaluating the noise properties of imaging systems. Noise stationarity in noise volumes acquired with a water phantom on a 128-MDCT and a 320-MDCT scanner was analyzed in the spatial domain in order to define locally stationary VOIs. The influence of the computing parameters on the 3D NPS measurement (the sampling distances bx,y,z, the VOI lengths Lx,y,z, the number of VOIs NVOI, and the structured noise) was investigated to minimize measurement errors. The effect of the VOI locations on the NPS was also investigated. Results showed that the noise (standard deviation) varies more in the r-direction (phantom radius) than in the z-direction. A 25 × 25 × 40 mm³ VOI associated with DFOV = 200 mm (Lx,y,z = 64, bx,y = 0.391 mm with a 512 × 512 matrix) and a first-order detrending method to reduce structured noise led to an accurate NPS estimation. The NPS estimated from off-centered small VOIs had a directional dependency, in contrast to the NPS obtained from large VOIs located in the center of the volume or from small VOIs located on a concentric circle. This shows that VOI size and location play a major role in the determination of the NPS when images are not stationary. This study emphasizes the need for consistent measurement methods to assess and compare image quality in CT.
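For reference, a minimal sketch of the kind of 3D NPS computation discussed here (ensemble average of the squared 3D DFT of detrended VOIs, normalized by voxel size and VOI length), with an illustrative first-order (plane) detrending step; parameter names follow the abstract, and the data are synthetic:

```python
# 3D noise power spectrum from N_VOI detrended volumes of interest:
# NPS = (bx*by*bz)/(Lx*Ly*Lz) * <|DFT3{VOI - trend}|^2>.
import numpy as np

def detrend_first_order(voi):
    """Subtract a least-squares first-order polynomial (3D plane)."""
    lz, ly, lx = voi.shape
    z, y, x = np.mgrid[:lz, :ly, :lx]
    A = np.column_stack([np.ones(voi.size), z.ravel(), y.ravel(), x.ravel()])
    coef, *_ = np.linalg.lstsq(A, voi.ravel(), rcond=None)
    return voi - (A @ coef).reshape(voi.shape)

def nps_3d(vois, voxel_size):
    """vois: (N_VOI, Lz, Ly, Lx) array; voxel_size: (bz, by, bx) in mm."""
    n_voi, lz, ly, lx = vois.shape
    accum = np.zeros((lz, ly, lx))
    for voi in vois:
        accum += np.abs(np.fft.fftn(detrend_first_order(voi))) ** 2
    bz, by, bx = voxel_size
    return (bx * by * bz) / (lx * ly * lz) * accum / n_voi  # HU^2 * mm^3

vois = np.random.normal(0.0, 10.0, size=(20, 64, 64, 64))  # synthetic noise
nps = nps_3d(vois, (0.625, 0.391, 0.391))
```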
Abstract:
Fluorescence imaging for the detection of non-muscle-invasive bladder cancer is based on the selective production and accumulation of fluorescing porphyrins (mainly protoporphyrin IX) in cancerous tissues after the instillation of Hexvix®. Although the sensitivity of this procedure is very good, its specificity is somewhat limited due to false-positive fluorescence sites. Consequently, magnification cystoscopy has been investigated in order to discriminate false from true positive fluorescence findings. Both white-light and fluorescence modes are possible with the magnification cystoscope, allowing observation of the bladder wall at magnifications ranging between 30× for standard observation and 650×. The optical zooming setup allows the magnification to be adjusted continuously in situ. In the high-magnification (HM) regime, the smallest diameter of the field of view is 600 microns and the resolution is 2.5 microns when in contact with the bladder wall. With this cystoscope, we characterized the superficial vascularization of the fluorescing sites in order to discriminate cancerous from noncancerous tissues. This procedure allowed us to establish a classification based on the observed vascular patterns. Seventy-two patients undergoing Hexvix® fluorescence cystoscopy were included in the study. Comparison of the HM cystoscopy classification with histopathology results confirmed 32/33 (97%) cancerous biopsies and rejected 17/20 (85%) noncancerous lesions.
Abstract:
Measuring tissue oxygenation in vivo is of interest in fundamental biological as well as medical applications. One minimally invasive approach to assess the oxygen partial pressure in tissue (pO2) is to measure the oxygen-dependent luminescence lifetime of molecular probes. The relation between tissue pO2 and the probes' luminescence lifetime is governed by the Stern-Volmer equation. Unfortunately, virtually all oxygen-sensitive probes based on this principle induce some degree of phototoxicity. For that reason, we studied the oxygen sensitivity and phototoxicity of dichlorotris(1,10-phenanthroline)ruthenium(II) hydrate [Ru(Phen)] using a dedicated optical fiber-based, time-resolved spectrometer in the chicken embryo chorioallantoic membrane. We demonstrated that, after intravenous injection, Ru(Phen)'s luminescence lifetime presents an easily detectable pO2 dependence at a low drug dose (1 mg/kg) and low fluence (120 mJ/cm² at 470 nm). The phototoxic threshold was found to be 10 J/cm² at the same wavelength and drug dose, i.e., about two orders of magnitude larger than the fluence necessary to perform a pO2 measurement. Finally, an illustrative application of this pO2 measurement approach in a hypoxic tumor environment is presented.
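The Stern-Volmer relation mentioned above maps a measured lifetime to pO2; a minimal sketch follows (the probe constants are illustrative placeholders, not Ru(Phen) calibration values):

```python
# Stern-Volmer quenching: tau0 / tau = 1 + K_SV * pO2,
# hence pO2 = (tau0 / tau - 1) / K_SV, where tau0 is the lifetime
# in the absence of oxygen. Constants below are illustrative.
def po2_from_lifetime(tau_us, tau0_us=5.0, k_sv=0.02):
    """Lifetimes in microseconds; K_SV in 1/mmHg; returns pO2 in mmHg."""
    return (tau0_us / tau_us - 1.0) / k_sv

print(po2_from_lifetime(3.0))  # shorter lifetime -> stronger quenching -> higher pO2
```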
Abstract:
Cloud computing has recently become very popular, and several bioinformatics applications already exist in that domain. The aim of this article is to analyse a current cloud system with respect to usability, to benchmark its performance, and to compare its user-friendliness with that of a conventional cluster job submission system. Given the current hype around the topic, user expectations are rather high, but current results show that neither the price/performance ratio nor the usage model is very satisfactory for large-scale embarrassingly parallel applications. However, for small- to medium-scale applications that require CPU time at certain peak times, the cloud is a suitable alternative.
Abstract:
Digital Holographic Microscopy (DHM) is a new imaging technique that provides quantitative phase images with high accuracy and stability, making it possible to explore a large variety of relevant processes, occurring on timescales from picoseconds to days, in fields including materials research as well as cell biology. As a noninvasive, real-time imaging technique, DHM is particularly well suited for high-throughput screening.
Abstract:
We present first results on a method enabling mechanical-scanning-free tomography with submicrometer axial resolution by multiple-wavelength digital holographic microscopy. By sequentially acquiring reflection holograms and summing 20 wavefronts equally spaced in spatial frequency in the 485-670 nm range, we are able to achieve a slice-by-slice tomographic reconstruction with a 0.6-1 μm axial resolution in a biological medium. The method is applied to the investigation of erythrocytes to retrieve the cellular membrane profile in three dimensions.
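The axial sectioning obtained by summing wavefronts equally spaced in spatial frequency can be reproduced numerically: for a reflector at depth z, each wavelength contributes a phase exp(i·2kz), and the coherent sum of 20 such components spanning 485-670 nm forms a narrow synthetic gate. The free-space sketch below is illustrative (unit amplitudes, no dispersion):

```python
# Synthetic axial gate from 20 wavelengths equally spaced in spatial
# frequency (1/lambda) over 485-670 nm, as in the abstract; everything
# else is an illustrative simplification.
import numpy as np

nu = np.linspace(1 / 670e-9, 1 / 485e-9, 20)  # spatial frequencies (1/m)
k = 2 * np.pi * nu                            # wavenumbers (rad/m)
z = np.linspace(-3e-6, 3e-6, 2001)            # axial position (m)

# Reflection geometry doubles the path: each component adds phase 2*k*z.
envelope = np.abs(np.exp(2j * np.outer(k, z)).sum(axis=0)) / len(k)

above_half = z[envelope > 0.5]
print(f"gate FWHM ~ {(above_half.max() - above_half.min()) * 1e6:.2f} um")
# Prints a width of about 1 um, consistent with the reported 0.6-1 um
# axial resolution (which also depends on the medium and objective).
```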
Abstract:
A fully automated 3D image analysis method is proposed to segment lung nodules in HRCT. A specific gray-level mathematical morphology operator, the SMDC connection cost, acting in the 3D space of the thorax volume, is defined in order to discriminate lung nodules from other dense (vascular) structures. Applied to clinical data from patients with pulmonary carcinoma, the proposed method detects isolated, juxtavascular, and peripheral nodules ranging from 2 to 20 mm in diameter. The segmentation accuracy was objectively evaluated on real and simulated nodules. The method showed a sensitivity ranging from 85% to 97% and a specificity ranging from 90% to 98%.
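The SMDC connection cost operator itself is not available in standard libraries; as a loose illustration of the underlying idea only (gray-level morphology that responds to compact nodules but not to elongated vascular structures), the sketch below contrasts an opening by a compact window with openings by long line segments. This is our own simplification, not the paper's operator:

```python
# Loose illustration: keep bright structures that survive an opening with a
# compact 3D window (blob-like nodules) but not an opening with a long 1D
# window along any axis (vessel-like structures).
import numpy as np
from scipy import ndimage

def nodule_score(volume):
    compact = ndimage.grey_opening(volume, size=(5, 5, 5))
    elongated = np.maximum.reduce([
        ndimage.grey_opening(volume, size=(15, 1, 1)),
        ndimage.grey_opening(volume, size=(1, 15, 1)),
        ndimage.grey_opening(volume, size=(1, 1, 15)),
    ])
    return compact - elongated  # high where blob-like, low along vessels

vol = np.zeros((40, 40, 40))
vol[18:24, 18:24, 18:24] = 1.0   # synthetic 6-voxel-wide "nodule"
vol[5, 5, :] = 1.0               # synthetic 1-voxel-wide "vessel"
candidates = nodule_score(vol) > 0.5
print(candidates.sum())          # flags the blob, not the vessel
```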
Abstract:
BACKGROUND: Histologic grade in breast cancer provides clinically important prognostic information. However, 30%-60% of tumors are classified as histologic grade 2. This grade is associated with an intermediate risk of recurrence and is thus not informative for clinical decision making. We examined whether histologic grade was associated with gene expression profiles of breast cancers and whether such profiles could be used to improve histologic grading. METHODS: We analyzed microarray data from 189 invasive breast carcinomas and from three published gene expression datasets from breast carcinomas. We identified differentially expressed genes in a training set of 64 estrogen receptor (ER)-positive tumor samples by comparing expression profiles between histologic grade 3 tumors and histologic grade 1 tumors and used the expression of these genes to define the gene expression grade index. Data from 597 independent tumors were used to evaluate the association between relapse-free survival and the gene expression grade index in a Kaplan-Meier analysis. All statistical tests were two-sided. RESULTS: We identified 97 genes in our training set that were associated with histologic grade; most of these genes were involved in cell cycle regulation and proliferation. In validation datasets, the gene expression grade index was strongly associated with histologic grade 1 and 3 status; however, among histologic grade 2 tumors, the index spanned the values for histologic grade 1-3 tumors. Among patients with histologic grade 2 tumors, a high gene expression grade index was associated with a higher risk of recurrence than a low gene expression grade index (hazard ratio = 3.61, 95% confidence interval = 2.25 to 5.78; P < .001, log-rank test). CONCLUSIONS: Gene expression grade index appeared to reclassify patients with histologic grade 2 tumors into two groups with high versus low risks of recurrence. This approach may improve the accuracy of tumor grading and thus its prognostic value.
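A minimal sketch of how such a gene expression grade index can be formed (a signed sum over genes up- and down-regulated in grade 3 versus grade 1 tumors); the gene lists, scaling, and cutoff below are illustrative, not the published signature:

```python
# Grade index per tumor: sum of expression over grade-3-associated genes
# minus sum over grade-1-associated genes; a cutoff then splits histologic
# grade 2 tumors into high- and low-risk groups. Data here are synthetic.
import numpy as np

def grade_index(expr, genes, up_genes, down_genes):
    """expr: (n_tumors, n_genes) log2 expression matrix."""
    col = {g: i for i, g in enumerate(genes)}
    up = expr[:, [col[g] for g in up_genes]].sum(axis=1)
    down = expr[:, [col[g] for g in down_genes]].sum(axis=1)
    return up - down  # high ~ grade-3-like, low ~ grade-1-like

rng = np.random.default_rng(1)
genes = [f"g{i}" for i in range(100)]
expr = rng.normal(size=(30, 100))                 # 30 synthetic tumors
ggi = grade_index(expr, genes,
                  up_genes=genes[:10], down_genes=genes[10:20])
high_risk = ggi > np.median(ggi)                  # illustrative cutoff
```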
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that automatically detects epileptiform events and discriminates them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity was reduced to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
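A sketch of the ICA-based feature extraction step, in the spirit of what is described (the channel count, window length, and the two features below are illustrative choices of ours, not the paper's exact pipeline):

```python
# Decompose multichannel EEG into independent components, then derive
# per-component features for a downstream epileptiform-vs-eye-blink
# classifier. Synthetic data stand in for a real recording.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
eeg = rng.standard_normal((21, 2560))      # 21 channels, 10 s at 256 Hz

ica = FastICA(n_components=10, random_state=0)
sources = ica.fit_transform(eeg.T).T       # (n_components, n_samples)

features = np.column_stack([
    np.abs(sources).max(axis=1),                   # peak amplitude
    np.abs(np.diff(sources, axis=1)).sum(axis=1),  # line length
])
# `features` (one row per component) would feed a classifier trained to
# separate epileptiform events from eye-blink components.
```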
Abstract:
Using a numerical approach, we explore wave-induced fluid flow effects in partially saturated porous rocks in which the gas-water saturation patterns are governed by mesoscopic heterogeneities associated with the dry frame properties. The link between the dry frame properties and the gas saturation is defined by the assumption of capillary pressure equilibrium, which in the presence of heterogeneity implies that neighbouring regions can exhibit different levels of saturation. To determine the equivalent attenuation and phase velocity of the synthetic rock samples considered in this study, we apply a numerical upscaling procedure that takes into account mesoscopic heterogeneities associated with the dry frame properties as well as spatially continuous variations of the pore fluid properties. The multiscale nature of the fluid saturation is taken into account by locally computing the physical properties of an effective fluid, which are then used for the larger-scale simulations. We consider two sets of numerical experiments to analyse such effects in heterogeneous partially saturated porous media, where the saturation field is determined by variations in porosity and clay content, respectively. In both cases we also evaluate the seismic responses of the corresponding binary, patchy-type saturation patterns. Our results indicate that significant attenuation and modest velocity dispersion effects take place in such media for both binary patchy-type and spatially continuous gas saturation patterns, in particular in the presence of relatively small amounts of gas. The numerical experiments also show that the nature of the gas distribution patterns is a critical parameter controlling the seismic responses of these environments, since attenuation and velocity dispersion effects are much more significant and occur over a broader saturation range for binary patchy-type gas-water distributions. This analysis therefore suggests that the physical mechanisms governing partial saturation should be accounted for when analysing seismic data in a poroelastic framework. In this context, heterogeneities associated with the dry frame properties, which do not play important roles in wave-induced fluid flow processes per se, should be taken into account, since they may determine the kind of gas distribution pattern taking place in the porous rock.
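The "effective fluid" step can be illustrated with Wood's (Reuss) average, a common choice for a fine-scale, uniformly mixed gas-water pore fill. The abstract does not state which average is used, so this is an assumption, and the fluid properties below are illustrative:

```python
# Wood's (Reuss) average for a gas-water mixture: the effective bulk
# modulus is the harmonic (saturation-weighted) mean of the phase moduli,
# and the density is the arithmetic mean. Property values are illustrative.
def effective_fluid(s_gas, k_water=2.25e9, k_gas=0.01e9,
                    rho_water=1000.0, rho_gas=100.0):
    """Return bulk modulus (Pa) and density (kg/m^3) of the pore fluid."""
    k_fl = 1.0 / (s_gas / k_gas + (1.0 - s_gas) / k_water)
    rho_fl = s_gas * rho_gas + (1.0 - s_gas) * rho_water
    return k_fl, rho_fl

# Even 10% gas collapses the mixture modulus toward that of the gas,
# which is why small amounts of gas produce strong attenuation effects.
print(effective_fluid(0.1))
```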