80 results for "Optimization software"

at Université de Lausanne, Switzerland


Relevance: 30.00%

Abstract:

This paper presents the Juste-Neige system for predicting snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow-cover management in order to: i) reduce the production cost of artificial snow and improve the profit margin of the companies managing ski resorts; and ii) reduce water and energy consumption, and thus the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days ahead. On these maps, the areas most exposed to snow erosion are highlighted. The software proceeds in three steps: i) interpolation of snow-height measurements with a neural network; ii) local meteorological forecasting for each ski resort; iii) simulation of the impact of skiers with a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
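The interpolation step can be illustrated with a much simpler stand-in for the paper's neural network: inverse-distance weighting, which likewise turns scattered snow-height measurements into gridded predictions. All coordinates and heights below are invented.

```python
import numpy as np

def idw_interpolate(stations, values, targets, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of snow heights.

    stations: (n, 2) array of measurement coordinates (m)
    values:   (n,) snow heights at the stations (cm)
    targets:  (m, 2) grid points where heights are predicted
    """
    d = np.linalg.norm(targets[:, None, :] - stations[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # nearby stations dominate the estimate
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical snow-height measurements on a ski run
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
heights  = np.array([80.0, 60.0, 90.0, 70.0])
grid     = np.array([[50.0, 50.0], [10.0, 10.0]])
pred = idw_interpolate(stations, heights, grid)
```

At the central point, equidistant from all stations, the estimate is simply their mean; near a station, the estimate approaches that station's value.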

Relevance: 30.00%

Abstract:

RATIONALE AND OBJECTIVES: To determine the optimum spatial resolution for imaging peripheral arteries with magnetic resonance angiography (MRA). MATERIALS AND METHODS: Eight vessel diameters ranging from 1.0 to 8.0 mm were simulated in a vascular phantom. A total of 40 three-dimensional FLASH MRA sequences were acquired with incremental variations of field of view, matrix size, and slice thickness. The accurately known eight diameters were combined pairwise to generate 22 "exact" degrees of stenosis ranging from 42% to 87%. The diameters were then measured in the MRA images by three independent observers and with quantitative angiography (QA) software, and used to compute the degrees of stenosis corresponding to the 22 "exact" ones. The accuracy and reproducibility of vessel diameter measurements and stenosis calculations were assessed for vessel sizes ranging from 6 to 8 mm (iliac artery), 4 to 5 mm (femoro-popliteal arteries), and 1 to 3 mm (infrapopliteal arteries). The maximum pixel dimension and slice thickness yielding a mean error in stenosis evaluation of less than 10% were determined by linear regression analysis. RESULTS: Mean errors in stenosis quantification were 8.8% ± 6.3% for 6- to 8-mm vessels, 15.5% ± 8.2% for 4- to 5-mm vessels, and 18.9% ± 7.5% for 1- to 3-mm vessels. Mean errors in stenosis calculation were 12.3% ± 8.2% for observers and 11.4% ± 15.1% for QA software (P = .0342). To evaluate stenosis with a mean error of less than 10%, the maximum pixel area, pixel size in the phase direction, and slice thickness should be less than 1.56 mm², 1.34 mm, and 1.70 mm, respectively (voxel size 2.65 mm³) for 6- to 8-mm vessels; 1.31 mm², 1.10 mm, and 1.34 mm (voxel size 1.76 mm³) for 4- to 5-mm vessels; and 1.17 mm², 0.90 mm, and 0.90 mm (voxel size 1.05 mm³) for 1- to 3-mm vessels. CONCLUSION: Higher spatial resolution than currently used should be selected for imaging peripheral vessels.
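The "exact" degrees of stenosis are derived from pairs of simulated diameters. A minimal sketch of that computation follows; the integer diameters below are illustrative, not the study's actual set.

```python
from itertools import combinations

def stenosis_percent(d_lesion, d_reference):
    """Degree of diameter stenosis: (1 - d_lesion / d_reference) * 100."""
    if d_lesion > d_reference:
        raise ValueError("lesion diameter cannot exceed the reference diameter")
    return (1.0 - d_lesion / d_reference) * 100.0

# Hypothetical vessel diameters between 1.0 and 8.0 mm, combined pairwise
diameters = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
stenoses = [stenosis_percent(small, large)
            for small, large in combinations(diameters, 2)]
```

For example, a 1.0-mm lesion in an 8.0-mm reference vessel corresponds to 87.5% stenosis, near the upper end of the study's range.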

Relevance: 30.00%

Abstract:

Ski resorts are deploying more and more artificial snow-making systems. These tools are necessary to sustain an important economic activity in the high alpine valleys. However, artificial snow raises significant environmental issues, which can be reduced by optimizing its production. This paper presents a software prototype based on artificial intelligence to help ski resorts better manage their snowpack. It combines, on the one hand, a General Neural Network for analysing the snow cover and making spatial predictions with, on the other hand, a multi-agent simulation of skiers for analysing the spatial impact of ski practice. The prototype has been tested on the ski resort of Verbier (Switzerland).
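The skier-impact half of the prototype can be sketched as a toy multi-agent pass over a snow-height grid, where each simulated skier erodes a little snow along its track. Every number below is invented for illustration; the actual simulation is far more detailed.

```python
import random

def simulate_skier_erosion(snow, n_skiers=200, erosion_cm=0.05, seed=1):
    """Toy multi-agent pass: each skier descends one column of the grid and
    erodes a little snow from every cell it crosses (all numbers invented)."""
    rng = random.Random(seed)
    rows, cols = len(snow), len(snow[0])
    for _ in range(n_skiers):
        col = rng.randrange(cols)          # each skier picks a line at random
        for row in range(rows):
            snow[row][col] = max(0.0, snow[row][col] - erosion_cm)
    return snow

piste = [[50.0] * 4 for _ in range(10)]    # 10 x 4 grid of snow heights (cm)
before = sum(sum(r) for r in piste)
after_grid = simulate_skier_erosion(piste)
after = sum(sum(r) for r in after_grid)
```

Columns visited by more skiers lose more snow, which is the kind of spatial erosion pattern the prototype highlights on its maps.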

Relevance: 30.00%

Abstract:

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is required for structure-based lead optimization. We present here a new docking software package, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy: correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the αVβ3 integrin, starting far away from the binding pocket.
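The 2 Å success criterion rests on the RMSD between a docked pose and the crystal structure. A minimal sketch of the plain (unaligned) RMSD between two conformations in the same reference frame, with invented coordinates:

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two conformations (no alignment),
    given as (n_atoms, 3) coordinate arrays in the same reference frame."""
    a = np.asarray(coords_a, dtype=float)
    b = np.asarray(coords_b, dtype=float)
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

# Three hypothetical atom positions (Å) and the same pose translated by 2 Å
pose = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]])
shifted = pose + np.array([2.0, 0.0, 0.0])
```

A rigid translation by 2 Å yields an RMSD of exactly 2 Å, i.e. a pose right at the boundary of the "correct binding mode" definition used in the validation.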

Relevance: 30.00%

Abstract:

MOTIVATION: The detection of positive selection is widely used to study gene and genome evolution, but its application remains limited by the high computational cost of existing implementations. We present a series of computational optimizations for more efficient estimation of the likelihood function on large-scale phylogenetic problems. We illustrate our approach using the branch-site model of codon evolution. RESULTS: We introduce novel optimization techniques that substantially outperform both CodeML from the PAML package and our previously optimized sequential version, SlimCodeML. These techniques can also be applied to other likelihood-based phylogeny software. Our implementation scales well for large numbers of codons and/or species and can therefore analyse substantially larger datasets than CodeML. We evaluated FastCodeML on different platforms and measured average speedups versus CodeML of up to 5.8 for the single-threaded version, up to 36.9 for the multi-threaded version on a single shared-memory node with 12 CPU cores, and up to 170.9 for the distributed version on eight nodes (96 CPU cores in total). AVAILABILITY AND IMPLEMENTATION: ftp://ftp.vital-it.ch/tools/FastCodeML/. CONTACT: selectome@unil.ch or nicolas.salamin@unil.ch.
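The quantity being optimized here is the phylogenetic likelihood itself. As a minimal sketch of that computation, here is Felsenstein's pruning algorithm for one site on a two-state model and the three-taxon tree ((A,B),C). All rates, branch lengths and states are invented, and FastCodeML's branch-site codon model is far richer than this toy.

```python
import numpy as np

def p_matrix(t, mu=1.0):
    """Transition probabilities of a symmetric two-state model after time t."""
    same = 0.5 * (1.0 + np.exp(-2.0 * mu * t))
    diff = 0.5 * (1.0 - np.exp(-2.0 * mu * t))
    return np.array([[same, diff], [diff, same]])

def site_likelihood(obs, branch_lengths):
    """Felsenstein pruning for the rooted tree ((A,B),C) at a single site.
    obs: observed states (0 or 1) for tips A, B, C
    branch_lengths: (tA, tB, tAB, tC)"""
    tA, tB, tAB, tC = branch_lengths
    tip = lambda s: np.eye(2)[s]          # conditional likelihoods at a tip
    # Internal node joining A and B: product of the two propagated tip vectors.
    L_ab = (p_matrix(tA) @ tip(obs[0])) * (p_matrix(tB) @ tip(obs[1]))
    # Root joins the AB clade and C; stationary distribution is uniform (0.5, 0.5).
    L_root = (p_matrix(tAB) @ L_ab) * (p_matrix(tC) @ tip(obs[2]))
    return float(0.5 * L_root.sum())

lik = site_likelihood(obs=(0, 0, 1), branch_lengths=(0.1, 0.1, 0.2, 0.3))
```

Speeding up exactly this kind of recursion, vectorized over many sites and parallelized over branches and nodes, is where FastCodeML's optimizations apply.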

Relevance: 20.00%

Abstract:

Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
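The replicate-based error estimate reduces to comparing genotype calls between the two sequenced copies of each sample. A minimal sketch with hypothetical calls (not the Stacks pipeline itself):

```python
def replicate_error_rate(genotypes_a, genotypes_b, missing="./."):
    """Locus-level genotyping error from a pair of sample replicates: the
    fraction of loci, genotyped in both copies, at which the calls disagree."""
    compared = mismatches = 0
    for ga, gb in zip(genotypes_a, genotypes_b):
        if ga == missing or gb == missing:
            continue                      # skip loci missing in either replicate
        compared += 1
        mismatches += (ga != gb)
    return mismatches / compared if compared else float("nan")

# Hypothetical genotype calls at six loci for one sample sequenced twice
rep1 = ["A/A", "A/G", "G/G", "./.", "A/A", "C/T"]
rep2 = ["A/A", "A/A", "G/G", "A/G", "A/A", "C/T"]
err = replicate_error_rate(rep1, rep2)
```

Minimizing this rate while maximizing the number of retained loci is the trade-off used above to tune the de novo assembly parameters.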

Relevance: 20.00%

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical expertise. Bayesian calculation currently represents the gold-standard TDM approach, but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the tools varies widely (from 2 to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be evaluated with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
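An a posteriori Bayesian adjustment of the kind these tools perform can be sketched as a MAP estimate for a one-compartment model at steady state, followed by retargeting the dose. Every pharmacokinetic number below is invented for illustration and does not correspond to any particular drug or program.

```python
import math

def map_clearance(c_obs, dose, tau, cl_pop=5.0, omega=0.3, sigma=0.2):
    """MAP estimate of clearance (L/h) for an IV infusion at steady state,
    where C_ss = dose / (tau * CL); lognormal prior on CL, lognormal residual
    error. All population values (cl_pop, omega, sigma) are invented."""
    best_cl, best_obj = None, float("inf")
    for i in range(1, 2000):
        cl = i * 0.01                                # grid over 0.01..19.99 L/h
        c_pred = dose / (tau * cl)
        # Negative log-posterior (up to a constant): prior term + data term.
        obj = (math.log(cl / cl_pop) / omega) ** 2 \
            + (math.log(c_obs / c_pred) / sigma) ** 2
        if obj < best_obj:
            best_cl, best_obj = cl, obj
    return best_cl

# A posteriori adjustment: the measured level is above target, so estimate the
# patient's clearance and pick the dose predicting the target concentration.
c_obs, dose, tau = 15.0, 500.0, 12.0     # measured level (mg/L), current regimen
cl_hat = map_clearance(c_obs, dose, tau)
target = 10.0                            # hypothetical target concentration (mg/L)
new_dose = target * tau * cl_hat
```

The MAP estimate lands between the clearance implied by the measurement alone and the population value, which is exactly the shrinkage behaviour that distinguishes Bayesian adjustment from naive proportional dose correction.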

Relevance: 20.00%

Abstract:

The goal of the present work was to assess the feasibility of using a pseudo-inverse and null-space optimization approach in modeling shoulder biomechanics. The method was applied to a simplified musculoskeletal shoulder model. The mechanical system consisted of the arm, and the external forces were the arm weight, the forces of six scapulo-humeral muscles and the reaction at the glenohumeral joint, which was considered a spherical joint. Muscle wrapping around the humeral head, assumed spherical, was taken into account. The dynamical equations were solved using a Lagrangian approach. The mathematical redundancy of the mechanical system was resolved in two steps: a pseudo-inverse optimization to minimize the squared muscle stress, followed by a null-space optimization to restrict the muscle forces to physiological limits. Several movements were simulated. The mathematical and numerical aspects of the constrained redundancy problem were efficiently solved by the proposed method. The predicted muscle moment arms were consistent with cadaveric measurements, and the joint reaction force was consistent with in vivo measurements. This preliminary work demonstrates that the developed algorithm has great potential for more complex musculoskeletal modeling of the shoulder joint. In particular, it could be further applied to a non-spherical joint model, allowing for the natural translation of the humeral head in the glenoid fossa.
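The two-step redundancy resolution can be sketched on a toy one-joint, three-muscle system: the pseudo-inverse supplies the minimum-norm forces satisfying the moment equation, and repeated null-space corrections push them inside the bounds without altering the joint moment. This is a sketch under invented numbers, not the authors' exact algorithm.

```python
import numpy as np

# Toy redundant system (all numbers invented): one joint moment equation,
# three muscle forces; moment_arms @ f = tau, with f kept inside [f_min, f_max].
moment_arms = np.array([[0.03, 0.02, 0.01]])   # moment arms in m
tau = np.array([5.0])                          # required joint moment in N*m
f_min, f_max = 0.0, 100.0                      # "physiological" force limits in N

# Step 1: pseudo-inverse gives the minimum-norm solution of the moment equation.
f = np.linalg.pinv(moment_arms) @ tau

# Step 2: null-space corrections move the forces toward the limits without
# changing the joint moment (alternating projections between the moment
# constraint and the force bounds).
_, _, vt = np.linalg.svd(moment_arms)
N = vt[1:].T                                   # orthonormal basis of the null space
for _ in range(200):
    violation = np.clip(f, f_min, f_max) - f   # distance outside the bounds
    if np.allclose(violation, 0.0):
        break
    f = f + N @ (N.T @ violation)              # project the correction into the null space
```

Because every correction lies in the null space of the moment-arm matrix, the joint moment equation remains satisfied exactly at each iteration while the bound violations shrink.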

Relevance: 20.00%

Abstract:

Purpose: To evaluate the feasibility, determine the optimal b-value, and assess the utility of 3-T diffusion-weighted MR imaging (DWI) of the spine in differentiating benign from pathologic vertebral compression fractures. Methods and Materials: Twenty patients with 38 vertebral compression fractures (24 benign, 14 pathologic) and 20 controls (in total: 23 men, 17 women; mean age 56.2 years) were included from December 2010 to May 2011 in this IRB-approved prospective study. MR imaging of the spine was performed on a 3-T unit with T1-weighted, fat-suppressed T2-weighted, gadolinium-enhanced fat-suppressed T1-weighted and zoomed-EPI (2D RF excitation pulse combined with a reduced field-of-view single-shot echo-planar readout) diffusion-weighted (b-values: 0, 300, 500 and 700 s/mm²) sequences. Two radiologists independently assessed zoomed-EPI image quality in random order using a 4-point scale (1 = excellent to 4 = poor). They subsequently measured apparent diffusion coefficients (ADCs) in normal vertebral bodies and compression fractures, in consensus. Results: Lower b-values correlated with better image quality scores, with significant differences between b = 300 (mean ± SD = 2.6 ± 0.8), b = 500 (3.0 ± 0.7) and b = 700 (3.6 ± 0.6) (all p < 0.001). Mean ADCs of normal vertebral bodies (n = 162) were 0.23, 0.17 and 0.11 × 10⁻³ mm²/s with b = 300, 500 and 700 s/mm², respectively. In contrast, mean ADCs were 0.89, 0.70 and 0.59 × 10⁻³ mm²/s for benign vertebral compression fractures and 0.79, 0.66 and 0.51 × 10⁻³ mm²/s for pathologic fractures with b = 300, 500 and 700 s/mm², respectively. No significant difference was found between the ADCs of benign and pathologic fractures. Conclusion: 3-T DWI of the spine is feasible, and lower b-values (300 s/mm²) are recommended. However, our preliminary results show no advantage of DWI in differentiating benign from pathologic vertebral compression fractures.
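The ADCs reported above come from the monoexponential signal-decay model S(b) = S0 · exp(−b · ADC); with a single nonzero b-value the fit reduces to one logarithm. A sketch with synthetic numbers:

```python
import math

def adc(s0, sb, b):
    """Apparent diffusion coefficient from a monoexponential decay
    S(b) = S0 * exp(-b * ADC); b in s/mm^2, ADC in mm^2/s."""
    return math.log(s0 / sb) / b

# Synthetic check: signals generated with a known ADC are recovered exactly.
true_adc = 0.6e-3                 # mm^2/s, in the range reported for fractures
s0 = 1000.0                       # hypothetical b = 0 signal
sb = s0 * math.exp(-300.0 * true_adc)
est = adc(s0, sb, 300.0)
```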

Relevance: 20.00%

Abstract:

Images obtained from high-throughput mass spectrometry (MS) contain information that remains hidden when looking at a single spectrum at a time. Image processing of liquid chromatography-MS datasets can be extremely useful for quality control, experimental monitoring and knowledge extraction. The importance of imaging in the differential analysis of proteomic experiments has already been established through two-dimensional gels and can now be foreseen with MS images. We present MSight, new software designed to construct and manipulate MS images, as well as to facilitate their analysis and comparison.
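Constructing an MS image essentially means binning (retention time, m/z, intensity) triples from successive spectra into a two-dimensional array. A minimal sketch with hypothetical peaks (not MSight's actual pipeline):

```python
import numpy as np

def ms_image(peaks, rt_range, mz_range, shape=(100, 100)):
    """Accumulate (retention time, m/z, intensity) peaks into a 2D image."""
    img = np.zeros(shape)
    (rt_lo, rt_hi), (mz_lo, mz_hi) = rt_range, mz_range
    for rt, mz, inten in peaks:
        i = min(int((rt - rt_lo) / (rt_hi - rt_lo) * shape[0]), shape[0] - 1)
        j = min(int((mz - mz_lo) / (mz_hi - mz_lo) * shape[1]), shape[1] - 1)
        img[i, j] += inten              # co-eluting peaks accumulate in one pixel
    return img

# Hypothetical LC-MS peaks: (retention time in min, m/z, intensity)
peaks = [(12.3, 502.7, 1800.0), (12.4, 502.7, 2400.0), (47.9, 1210.2, 950.0)]
img = ms_image(peaks, rt_range=(0.0, 60.0), mz_range=(400.0, 1600.0))
```

Once the data are in image form, standard image-processing operations (alignment, differencing, spot detection) become available for comparing runs.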

Relevance: 20.00%

Abstract:

The action of various DNA topoisomerases frequently results in characteristic changes in DNA topology. Important information for understanding the mechanistic details of these topoisomerases can be provided by investigating the knot types resulting from topoisomerase action on circular DNA forming a particular knot type. Depending on the topological bias of a given topoisomerase reaction, one observes different subsets of knotted products. To establish the character of this topological bias, one needs to be aware of all possible topological outcomes of intersegmental passages occurring within a given knot type. However, it is not trivial to systematically enumerate the topological outcomes of strand passage from a given knot type. We present here 3D visualization software (TopoICE-X in KnotPlot) that incorporates topological analysis methods in order to visualize, for example, the knots that can be obtained from a given knot by one intersegmental passage. The software offers several other options for the topological analysis of the mechanisms of action of various topoisomerases.

Relevance: 20.00%

Abstract:

Coltop3D is a software package that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle is used to obtain a unique color code for each orientation of a local slope; a continuous planar structure therefore appears in a single color. Several tools are included to create stereonets, draw traces of discontinuities, or automatically compute density stereonets. Examples are shown to demonstrate the efficiency of the method.
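The two quantities behind the color code, slope angle and slope aspect, can be computed from a DEM with finite differences. A minimal sketch follows; the hue/saturation mapping at the end illustrates the idea of one color per orientation and is not Coltop3D's exact scheme.

```python
import numpy as np

def slope_aspect(dem, cell_size=1.0):
    """Slope angle (deg) and aspect (deg, clockwise from north) from a DEM grid.
    Rows are assumed to run north -> south and columns west -> east."""
    dz_drow, dz_dcol = np.gradient(dem, cell_size)
    dz_dx = dz_dcol            # west -> east
    dz_dy = -dz_drow           # flip: row index grows southwards
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Aspect: direction of steepest descent, measured clockwise from north.
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
    return slope, aspect

dem = 10.0 - np.tile(np.arange(5.0), (4, 1))   # synthetic plane dipping due east
slope, aspect = slope_aspect(dem)

# One unique color per orientation: hue encodes aspect, saturation slope angle.
hue = aspect / 360.0
sat = slope / 90.0
```

On this synthetic eastward-dipping plane every cell gets the same hue and saturation, which is exactly the property that makes continuous planar structures stand out as uniform patches of color.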
