20 results for atk-ohjelmat - LSP - Library software package

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve the analytical and search processes for ink samples. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library.
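
As an illustration of how a digital ink-library search can work, the R sketch below ranks library entries by the Pearson correlation between normalized HPTLC densitometric profiles; this is a generic similarity search, not the specific algorithms developed in the project, and the profile values are hypothetical.

    # Hypothetical normalized densitometric profiles (intensity vs. migration distance)
    query <- c(0.1, 0.4, 0.9, 0.5, 0.2, 0.1)
    library_profiles <- list(
      ink_A = c(0.1, 0.5, 0.8, 0.5, 0.2, 0.1),
      ink_B = c(0.6, 0.3, 0.1, 0.2, 0.7, 0.9)
    )
    # Rank library entries by similarity to the query profile
    scores <- sapply(library_profiles, cor, y = query)
    sort(scores, decreasing = TRUE)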

Relevance:

100.00%

Publisher:

Abstract:

Background: The transient ischemic dilation (TID) ratio indirectly reflects myocardial ischemia and is correlated with cardiac prognosis. We aimed to compare the influence of three different software packages on the assessment of TID using Rb-82 cardiac PET/CT. Methods: In total, data from 30 patients were used, based on normal myocardial perfusion (SSS<3 and SRS<3) and normal stress myocardial blood flow (>2 mL/min/g) assessed by Rb-82 cardiac PET/CT. After reconstruction using 2D OSEM (2 iterations, 28 subsets) and 3D filtering (Butterworth, order=10, ωc=0.5), data were processed automatically and then manually, defining identical basal and apical limits on both stress and rest images. TID ratios were determined with the Myometrix®, ECToolbox® and QGS® software packages. Comparisons used ANOVA, Student t-tests and the Lin concordance test (ρc). Results: All 90 processing runs were performed successfully. TID ratios were not statistically different between software packages when data were processed automatically (P=0.2) or manually (P=0.17). There was a slight but significant relative overestimation of TID with automatic processing compared to manual processing using ECToolbox® (1.07 ± 0.13 vs 1.00 ± 0.13, P=0.001) and Myometrix® (1.07 ± 0.15 vs 1.01 ± 0.11, P=0.003), but not using QGS® (1.02 ± 0.12 vs 1.05 ± 0.11, P=0.16). The best concordance was achieved between ECToolbox® and Myometrix® manual processing (ρc=0.67). Conclusion: Whether in automatic or manual mode, TID estimation was not significantly influenced by software type. Using Myometrix® or ECToolbox®, TID was significantly different between automatic and manual processing, but not using QGS®. The software package should be accounted for when defining TID normal reference limits, as well as in multicenter studies. QGS® appeared to be the most operator-independent software package, while ECToolbox® and Myometrix® produced the closest results.
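
The Lin concordance test used above has a simple closed form; the R sketch below implements Lin's concordance correlation coefficient directly from its definition and applies it to hypothetical TID ratios (illustrative values, not study data).

    # Lin's concordance correlation coefficient (CCC):
    # rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
    # (sample n-1 estimators used here; the original definition uses n,
    # a negligible difference for moderate sample sizes)
    ccc <- function(x, y) {
      2 * cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
    }

    # Hypothetical TID ratios from two software packages (illustrative only)
    tid_a <- c(1.05, 0.98, 1.10, 1.02, 0.95)
    tid_b <- c(1.07, 1.00, 1.08, 1.05, 0.97)
    ccc(tid_a, tid_b)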

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results from the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI (above 9) in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end of test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977. © 2015 Wiley Periodicals, Inc.
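
The quantity at stake here, the LCI, is conventionally the cumulative expired volume divided by the functional residual capacity at the point where the end-tidal tracer concentration falls below 1/40th of its starting value, so the end-of-test criterion directly determines the result. A minimal R sketch under that standard convention, assuming breath-by-breath end-tidal concentrations and expired volumes (names and data are illustrative, not the vendor algorithm discussed above):

    # cet: end-tidal tracer concentration per breath; vexp: expired volume per
    # breath (L); frc: functional residual capacity (L)
    lci <- function(cet, vexp, frc, c_start = cet[1]) {
      end_idx <- which(cet <= c_start / 40)[1]   # first breath below 1/40th of start
      if (is.na(end_idx)) stop("washout did not reach 1/40th of start")
      cev <- sum(vexp[1:end_idx])                # cumulative expired volume
      cev / frc                                  # LCI = lung turnovers to wash out
    }

    # Hypothetical washout: concentration decays by 10% per breath
    lci(cet = 5 * 0.9^(0:60), vexp = rep(0.05, 61), frc = 0.25)

An end-of-test criterion that triggers too early truncates the cumulative expired volume, which is exactly how the automatic modus produced falsely normal LCI values.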

Relevance:

100.00%

Publisher:

Abstract:

Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
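
The study-file-level QC step lends itself to a small illustration: the R sketch below drops records with missing key statistics and flags allele frequencies that deviate strongly from a reference panel. File names, column names and the 0.2 threshold are hypothetical, and this is a generic sketch rather than EasyQC's actual configuration syntax.

    # Hypothetical GWAS summary-statistics file and reference-frequency file
    gwas <- read.delim("study1_results.txt", stringsAsFactors = FALSE)
    ref  <- read.delim("reference_freqs.txt", stringsAsFactors = FALSE)

    # Drop records missing essential fields (effect size, SE, allele frequency)
    gwas <- gwas[!is.na(gwas$BETA) & !is.na(gwas$SE) & !is.na(gwas$EAF), ]

    # Flag SNPs whose effect-allele frequency deviates strongly from the reference
    m <- merge(gwas, ref, by = "SNP")
    outliers <- abs(m$EAF - m$REF_EAF) > 0.2     # hypothetical threshold
    cat(sum(outliers), "SNPs deviate from reference allele frequencies\n")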

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: The yeast Schizosaccharomyces pombe is frequently used as a model for studying the cell cycle. The cells are rod-shaped and divide by medial fission. The process of cell division, or cytokinesis, is controlled by a network of signaling proteins called the Septation Initiation Network (SIN); SIN proteins associate with the spindle pole bodies (SPBs) during nuclear division (mitosis). Some SIN proteins associate with both SPBs early in mitosis and then display strongly asymmetric signal intensity at the SPBs in late mitosis, just before cytokinesis. This asymmetry is thought to be important for the correct regulation of SIN signaling and the coordination of cytokinesis and mitosis. In order to study the dynamics of organelles or large protein complexes such as the SPB, which have been labeled with a fluorescent protein tag in living cells, a number of image analysis problems must be solved: the cell outline must be detected automatically, and the position and signal intensity associated with the structures of interest within the cell must be determined. RESULTS: We present a new 2D and 3D image analysis system that permits versatile and robust analysis of motile, fluorescently labeled structures in rod-shaped cells. We have designed an image analysis system, implemented as a user-friendly software package, that allows fast and robust image analysis of large numbers of rod-shaped cells. We have developed new robust algorithms, which we combined with existing methodologies to facilitate fast and accurate analysis. Our software permits the detection and segmentation of rod-shaped cells in either static or dynamic (i.e., time-lapse) multi-channel images. It enables tracking of two structures (for example, SPBs) in two different image channels. For 2D or 3D static images, the locations of the structures are identified, and intensity values are then extracted together with several quantitative parameters, such as length, width, cell orientation, background fluorescence and the distance between the structures of interest. Furthermore, two kinds of kymographs of the tracked structures can be established, one representing the migration with respect to their relative position, the other representing their individual trajectories inside the cell. This software package, called "RodCellJ", allowed us to analyze a large number of S. pombe cells to understand the rules that govern SIN protein asymmetry. CONCLUSIONS: "RodCellJ" is freely available to the community as a package of several ImageJ plugins for analyzing the behavior of large numbers of rod-shaped cells simultaneously and in depth. The integration of different image-processing techniques in a single package, together with the development of novel algorithms, not only speeds up analysis compared with existing tools but also provides higher accuracy. Its utility was demonstrated on both 2D and 3D static and dynamic images to study the septation initiation network of the yeast Schizosaccharomyces pombe. More generally, it can be used in any biological context where fluorescently labeled structures need to be analyzed in rod-shaped cells. AVAILABILITY: RodCellJ is freely available at http://bigwww.epfl.ch/algorithms.html (after acceptance of the publication).
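
To illustrate the kind of downstream quantification such tracking enables, the R sketch below computes a per-frame asymmetry index from the intensities of two tracked SPBs; the table layout and values are hypothetical stand-ins for what a tracking tool such as RodCellJ might export.

    # Hypothetical tracked-intensity table: one row per frame, background-
    # subtracted fluorescence intensity of each spindle pole body
    tracks <- data.frame(
      frame = 1:5,
      spb1  = c(120, 130, 180, 240, 300),
      spb2  = c(118, 125, 110,  90,  60)
    )
    # Asymmetry index in [0, 1]: 0 = symmetric signal, 1 = fully asymmetric
    tracks$asym <- with(tracks, abs(spb1 - spb2) / (spb1 + spb2))
    tracks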

Relevance:

100.00%

Publisher:

Abstract:

This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers for the following reasons: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement; (2) by setting target values for inputs and outputs, it calculates how much input must be decreased or output increased for the firm to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes should be analysed in order to improve the firm's own practices. This contribution presents the essentials of DEA, alongside a case study for an intuitive understanding of its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
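
To make the efficiency score concrete, here is a minimal input-oriented, constant-returns-to-scale DEA sketch in R that solves the standard CCR envelopment linear program for each firm using the lpSolve package; the one-input, one-output data are toy values, and this is not Win4DEAP's implementation.

    library(lpSolve)

    # Toy data: 4 firms, 1 input (x), 1 output (y); one row per input/output
    x <- matrix(c(2, 4, 6, 8), nrow = 1)
    y <- matrix(c(2, 5, 6, 7), nrow = 1)
    n <- ncol(x)

    dea_ccr <- function(o) {
      # Decision variables: (theta, lambda_1, ..., lambda_n), all >= 0
      obj <- c(1, rep(0, n))
      # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
      A_in  <- cbind(-x[, o, drop = FALSE], x)
      # Outputs: sum_j lambda_j * y_rj >= y_ro
      A_out <- cbind(matrix(0, nrow(y), 1), y)
      lp("min",
         objective.in = obj,
         const.mat    = rbind(A_in, A_out),
         const.dir    = c(rep("<=", nrow(x)), rep(">=", nrow(y))),
         const.rhs    = c(rep(0, nrow(x)), y[, o]))$objval
    }
    sapply(1:n, dea_ccr)   # efficiency score per firm (1 = efficient)

A score below 1 is exactly the "capacity for improvement" in point (1): it is the factor by which the firm could radially shrink its inputs while still producing its outputs.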

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: There is growing evidence that informal payments for health care are fairly common in many low- and middle-income countries. Informal payments are reported to have negative consequences for equity and quality of care; it has been suggested, however, that they may contribute to health worker motivation and retention. Given the significance of motivation and retention issues in human resources for health, a better understanding of the relationship between the two phenomena is needed. This study attempts to assess whether and in what ways informal payments occur in Kibaha, Tanzania. Moreover, it aims to assess how informal earnings might help boost health worker motivation and retention. METHODS: Nine focus groups were conducted in three health facilities of different levels in the health system. In total, 64 health workers participated in the focus group discussions (81% female, 19% male), and where possible, focus groups were divided by cadre. All data were processed and analysed by means of the NVivo software package. RESULTS: The use of informal payments in the study area was confirmed by this study. Furthermore, the findings suggest a negative relationship between informal payments and job satisfaction and motivation. Participants mentioned that they felt enslaved by patients as a result of being bribed, and this resulted in a loss of self-esteem. Furthermore, fear of detection was a major demotivating factor. These factors seem to counterbalance the positive effect of financial incentives. Moreover, informal payments were not found to be related to the retention of health workers in the public health system. Other factors, such as job security, seemed to be more relevant for retention. CONCLUSION: This study suggests that the practice of informal payments contributes to the general demotivation of health workers and negatively affects access to health care services and the quality of the health system. Policy action is needed that not only provides better financial incentives for individuals but also tackles an environment in which corruption is endemic.

Relevance:

100.00%

Publisher:

Abstract:

This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on the quantitative assessment of changes in field topography across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference-independent and thus yield statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarity, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in a freely available academic software package called CARTOOL. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using 3-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians for interpreting multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
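
Two of the global measures mentioned above have simple closed forms, sketched here in R: global field power (GFP) is the spatial standard deviation of the potentials across electrodes, and global map dissimilarity is the root mean square difference between two average-referenced, GFP-normalized maps. The electrode values are hypothetical, and this is not CARTOOL code.

    # Global field power: spatial standard deviation across electrodes
    gfp <- function(v) sqrt(mean((v - mean(v))^2))

    # Global map dissimilarity (0 = identical topography, 2 = inverted)
    gmd <- function(u, v) {
      u0 <- (u - mean(u)) / gfp(u)   # average-referenced, GFP-normalized
      v0 <- (v - mean(v)) / gfp(v)
      sqrt(mean((u0 - v0)^2))
    }

    map1 <- c(1.2, -0.4, 0.8, -1.1, -0.5)   # hypothetical 5-electrode maps
    map2 <- c(1.0, -0.2, 0.9, -1.3, -0.4)
    gmd(map1, map2)

Because both maps are normalized before comparison, the dissimilarity reflects only the configuration of the field, independent of its strength, which is why topographic differences point directly to changes in the underlying source configuration.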

Relevance:

100.00%

Publisher:

Abstract:

Tumors in non-Hodgkin lymphoma (NHL) patients are often proximal to the major blood vessels in the abdomen or neck. In external-beam radiotherapy, these tumors present a challenge because imaging resolution prevents the beam from being targeted to the tumor lesion without also irradiating the artery wall. This problem has led to potentially life-threatening delayed toxicity. Because radioimmunotherapy has resulted in long-term survival of NHL patients, we investigated whether the absorbed dose (AD) to the artery wall in radioimmunotherapy of NHL is of potential concern for delayed toxicity. SPECT resolution is not sufficient to enable dosimetric analysis of anatomic features as thin as the aortic wall. Therefore, we present a model of aortic wall toxicity based on data from 4 patients treated with (131)I-tositumomab. METHODS: Four NHL patients with periaortic tumors were administered pretherapeutic (131)I-tositumomab. Abdominal SPECT and whole-body planar images were obtained at 48, 72, and 144 h after tracer administration. Blood-pool activity concentrations were obtained from regions of interest drawn on the heart on the planar images. Tumor and blood activity concentrations, scaled to therapeutic administered activities (both standard and myeloablative), were input into a geometry and tracking model (GEANT, version 4) of the aorta. The simulated energy deposited in the arterial walls was collected and fitted, and the AD and biologic effective dose values to the aortic wall and tumors were obtained for standard therapeutic and hypothetical myeloablative administered activities. RESULTS: Arterial wall ADs from standard therapy were lower (0.6-3.7 Gy) than those typical of external-beam therapy, as were the tumor ADs (1.4-10.5 Gy). The ratios of tumor AD to arterial wall AD were greater for radioimmunotherapy by a factor of 1.9-4.0. For myeloablative therapy, artery wall ADs were in general less than those typical of external-beam therapy (9.4-11.4 Gy for 3 of 4 patients) but comparable for 1 patient (32.6 Gy). CONCLUSION: Blood vessel radiation dose can be estimated using the software package 3D-RD combined with GEANT modeling. The dosimetry analysis suggested that arterial wall toxicity is highly unlikely in standard-dose radioimmunotherapy but should be considered a potential concern and limiting factor in myeloablative therapy.
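
For a fixed biodistribution, absorbed dose in radionuclide dosimetry scales linearly with administered activity, which is the scaling step described above; the R sketch below illustrates it with hypothetical dose coefficients (Gy/GBq) and administered activities, not the patients' actual values.

    # Hypothetical dose coefficients derived from pretherapy imaging (Gy per GBq)
    coef_tumor <- 2.1
    coef_wall  <- 0.7

    # Hypothetical administered activities (GBq)
    a_standard      <- 3.0
    a_myeloablative <- 12.0

    # Absorbed dose scales linearly with administered activity
    ad <- function(coef, activity) coef * activity
    ad(coef_wall, a_standard)                               # wall AD, standard therapy
    ad(coef_wall, a_myeloablative)                          # wall AD, myeloablative
    ad(coef_tumor, a_standard) / ad(coef_wall, a_standard)  # tumor-to-wall ratio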

Relevance:

100.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as a paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and colonisation of new areas. Habitat suitability and obstacle modelling therefore had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density-dependence and stochasticity. A software package named HexaSpace was developed, which achieves two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); 2° running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
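
The demographic core described above, a Leslie matrix with added density dependence, can be sketched compactly in R; the vital rates, carrying capacity and the crude density feedback below are illustrative placeholders, not SIM-Ibex's calibrated parameters.

    # Three age classes; fertilities in the top row, survival on the subdiagonal
    L <- matrix(c(0.0, 1.2, 0.8,
                  0.7, 0.0, 0.0,
                  0.0, 0.9, 0.0), nrow = 3, byrow = TRUE)

    K <- 500                         # hypothetical carrying capacity
    n <- c(100, 50, 30)              # initial age-class abundances

    for (t in 1:20) {
      # Crude feedback: density scales all vital rates equally toward zero at K
      dd <- max(0, 1 - sum(n) / K)
      n  <- (L * dd) %*% n           # project one year ahead
    }
    sum(n)                           # total population after 20 years

Environmental stochasticity and culling, both part of the SIM-Ibex model, would enter this loop as random perturbations of the vital rates and as explicit removals from n.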

Relevance:

40.00%

Publisher:

Abstract:

The MIGCLIM R package is a function library for the open source R software that enables the implementation of species-specific dispersal constraints into projections of species distribution models under environmental change and/or landscape fragmentation scenarios. The model is based on a cellular automaton and the basic modeling unit is a cell that is inhabited or not. Model parameters include dispersal distance and kernel, long distance dispersal, barriers to dispersal, propagule production potential and habitat invasibility. The MIGCLIM R package has been designed to be highly flexible in the parameter values it accepts, and to offer good compatibility with existing species distribution modeling software. Possible applications include the projection of future species distributions under environmental change conditions and modeling the spread of invasive species.
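
The cellular-automaton mechanism such models build on can be illustrated with a toy R sketch: occupied cells attempt to colonise suitable neighbours at each dispersal step. The grid, suitability mask and settling probability are illustrative and far simpler than MIGCLIM's actual parameterisation.

    set.seed(1)
    side <- 20
    suitable <- matrix(runif(side^2) > 0.3, side, side)   # random habitat mask
    occupied <- matrix(FALSE, side, side)
    occupied[10, 10] <- TRUE                              # initial occurrence
    nbrs <- list(c(1, 0), c(-1, 0), c(0, 1), c(0, -1))    # 4-neighbour kernel

    for (step in 1:10) {
      idx <- which(occupied, arr.ind = TRUE)
      for (k in seq_len(nrow(idx))) {
        for (d in nbrs) {                                 # colonisation attempts
          rr <- idx[k, 1] + d[1]
          cc <- idx[k, 2] + d[2]
          if (rr >= 1 && rr <= side && cc >= 1 && cc <= side &&
              suitable[rr, cc] && runif(1) < 0.5)         # 0.5: toy settling prob.
            occupied[rr, cc] <- TRUE
        }
      }
    }
    sum(occupied)   # number of colonised cells after 10 dispersal steps

MIGCLIM layers the features listed above on this mechanism: parameterised dispersal kernels, long-distance dispersal events, barriers, propagule production and time-varying habitat suitability.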

Relevance:

30.00%

Publisher:

Abstract:

Purpose: IOL centration and stability after cataract surgery are of high interest to cataract surgeons and IOL-producing companies. We present new imaging software to evaluate the centration of the rhexis and of the IOL after cataract surgery. Methods: We developed, in collaboration with the Biomedical Imaging Group (BIG), EPFL, Lausanne, a new working tool to precisely assess outcomes after IOL implantation, such as ideal capsulorhexis and IOL centration. The software is a plug-in for ImageJ, a general-purpose image-processing and image-analysis package. The specifications of this software are: evaluation of rhexis centration and evaluation of the position of the IOL in the posterior chamber. The end points are to analyze the quality of the centration of a rhexis after cataract surgery, the deformation of the rhexis with capsular bag retraction, and the centration of the IOL after implantation. Results: This software delivers tools to interactively measure the distances between limbus, IOL and capsulorhexis, and their changes over time. The user is invited to adjust the nodes of three radial curves for the limbus, the rhexis and the optic of the IOL. The radial distances of the curves are computed to evaluate the IOL implantation. The user is also able to define patterns for ideal capsulorhexis and optimal IOL centration. We present examples of calculations after cataract surgery. Conclusions: Evaluation of the centration of the rhexis and of the IOL after cataract surgery is an important end point for optimal IOL implantation. Especially multifocal or accommodative lenses need a precise position in the bag with good stability over time. This software is able to evaluate these parameters just after surgery but also their changes over time. The results of these evaluations can lead to the optimization of surgical procedures and materials.
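
Once the three curves are fitted, centration reduces to simple geometry; the R sketch below computes the centroid offset between a limbus contour and an IOL-optic contour from node coordinates. The node lists are hypothetical stand-ins for the points a user would place in the plug-in.

    # Hypothetical contour nodes (x, y in pixels) for limbus and IOL optic
    limbus <- data.frame(x = c(100, 200, 200, 100), y = c(100, 100, 200, 200))
    iol    <- data.frame(x = c(115, 185, 185, 115), y = c(110, 110, 180, 180))

    centroid <- function(p) c(mean(p$x), mean(p$y))
    offset <- centroid(iol) - centroid(limbus)
    sqrt(sum(offset^2))   # decentration in pixels (convert via the image scale)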

Relevance:

30.00%

Publisher:

Abstract:

Background: The variety of DNA microarray formats and datasets presently available offers an unprecedented opportunity to perform insightful comparisons of heterogeneous data. Cross-species studies, in particular, have the power of identifying conserved, functionally important molecular processes. Validation of discoveries can now often be performed in readily available public data which frequently requires cross-platform studies. Cross-platform and cross-species analyses require matching probes on different microarray formats. This can be achieved using the information in microarray annotations and additional molecular biology databases, such as orthology databases. Although annotations and other biological information are stored using modern database models (e.g. relational), they are very often distributed and shared as tables in text files, i.e. flat file databases. This common flat database format thus provides a simple and robust solution to flexibly integrate various sources of information and a basis for the combined analysis of heterogeneous gene expression profiles. Results: We provide annotationTools, a Bioconductor-compliant R package to annotate microarray experiments and integrate heterogeneous gene expression profiles using annotation and other molecular biology information available as flat file databases. First, annotationTools contains a specialized set of functions for mining this widely used database format in a systematic manner. It thus offers a straightforward solution for annotating microarray experiments. Second, building on these basic functions and relying on the combination of information from several databases, it provides tools to easily perform cross-species analyses of gene expression data. Here, we present two example applications of annotationTools that are of direct relevance for the analysis of heterogeneous gene expression profiles, namely a cross-platform mapping of probes and a cross-species mapping of orthologous probes using different orthology databases. We also show how to perform an explorative comparison of disease-related transcriptional changes in human patients and in a genetic mouse model. Conclusion: The R package annotationTools provides a simple solution to handle microarray annotation and orthology tables, as well as other flat molecular biology databases. Thereby, it allows easy integration and analysis of heterogeneous microarray experiments across different technological platforms or species.
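
The core operation, looking up probe annotation in a tab-delimited flat file, can be sketched generically in R; the file and column names below are hypothetical, and these are base-R calls rather than annotationTools' actual functions.

    # Hypothetical flat annotation table with columns probe_id, gene_symbol, entrez_id
    ann <- read.delim("array_annotation.txt", stringsAsFactors = FALSE)

    # Map a vector of probe IDs to gene symbols via the flat file
    probes <- c("100_at", "200_at", "999_at")
    ann$gene_symbol[match(probes, ann$probe_id)]   # NA where no match is found

    # Cross-species step: join to a hypothetical orthology table
    ortho <- read.delim("human_mouse_orthologs.txt", stringsAsFactors = FALSE)
    merge(ann, ortho, by.x = "entrez_id", by.y = "human_entrez_id")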

Relevance:

30.00%

Publisher:

Abstract:

The package HIERFSTAT for the statistical software R, created by the R Development Core Team, allows the estimation of hierarchical F-statistics from a hierarchy with any number of levels. In addition, it allows testing the statistical significance of population differentiation at these different levels, using a generalized likelihood-ratio test. The package HIERFSTAT is available at http://www.unil.ch/popgen/softwares/hierfstat.htm.
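
A minimal usage sketch in R, based on the varcomp.glob function described in the package's documentation; the genotype data are hypothetical, and the exact arguments should be checked against the package manual.

    library(hierfstat)
    set.seed(7)

    # Hypothetical diploid genotypes in hierfstat's integer coding
    # (e.g. 12 = alleles 1/2), individuals nested in populations within regions
    dat <- data.frame(
      region = rep(1:2, each = 10),
      pop    = rep(1:4, each = 5),
      loc1   = sample(c(11, 12, 22), 20, replace = TRUE),
      loc2   = sample(c(11, 12, 22), 20, replace = TRUE)
    )

    # Variance components and F-statistics for the region/population hierarchy
    varcomp.glob(levels = dat[, c("region", "pop")], loci = dat[, c("loc1", "loc2")])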