16 results for Cloud Computing, Software-as-a-Service (SaaS), SaaS Multi-Tenant, Windows Azure


Relevance:

100.00%

Publisher:

Abstract:

Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are struggling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners who develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models that establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis addresses this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to determine what drives the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, grounded in an extensive action design research study with a large European software vendor. Understanding PaaS business models from both a market and a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes research gaps related to PaaS business model design and, more generally, to platform business models.

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing has recently become very popular, and several bioinformatics applications already exist in that domain. The aim of this article is to analyse a current cloud system with respect to usability, benchmark its performance, and compare its user friendliness with that of a conventional cluster job submission system. Given the current hype around the topic, user expectations are rather high, but the current results show that neither the price/performance ratio nor the usage model is yet satisfactory for large-scale embarrassingly parallel applications. For small to medium scale applications that require CPU time at certain peak times, however, the cloud is a suitable alternative.
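As a rough illustration of the price/performance trade-off discussed above, the following sketch compares the cost and wall-clock time of an embarrassingly parallel job set run on a cloud service versus a local cluster. All prices, node counts and runtimes are invented placeholders, not figures from the study.

```python
# Toy back-of-the-envelope comparison for an embarrassingly parallel job set.
# Every number below is a made-up placeholder used only to illustrate the
# price/performance comparison described in the abstract.
N_TASKS = 10_000
TASK_CPU_HOURS = 0.5

def cost_and_walltime(n_nodes: int, price_per_cpu_hour: float, startup_overhead_h: float):
    cpu_hours = N_TASKS * TASK_CPU_HOURS
    walltime = cpu_hours / n_nodes + startup_overhead_h
    return cpu_hours * price_per_cpu_hour, walltime

cloud = cost_and_walltime(n_nodes=200, price_per_cpu_hour=0.10, startup_overhead_h=0.2)
cluster = cost_and_walltime(n_nodes=64, price_per_cpu_hour=0.03, startup_overhead_h=8.0)  # queue wait
print(f"cloud:   cost ${cloud[0]:,.0f}, walltime {cloud[1]:.1f} h")
print(f"cluster: cost ${cluster[0]:,.0f}, walltime {cluster[1]:.1f} h")
```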

Relevance:

100.00%

Publisher:

Abstract:

The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and to exploit the value of the data to which they have access, whether "Big Data" obtained from external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also frequently demonstrate that they comply with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, to identify the most pertinent constraints and risks, and to propose a framework for the management of data from discovery to external hosting that allows the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents the segmentation of the bilateral parotid glands in Head and Neck (H&N) CT images using active-contour-based atlas registration. We compare segmentation results from three atlas selection strategies: (i) selection of the "single most similar" atlas for each image to be segmented, (ii) fusion of segmentation results from multiple atlases using STAPLE, and (iii) fusion of segmentation results using majority voting. Among these three approaches, fusion using majority voting provided the best results. Finally, we present a detailed evaluation of the majority voting strategy on a dataset of eight images provided as part of the H&N auto-segmentation challenge held in conjunction with the MICCAI 2010 conference.
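The majority voting fusion referred to above can be sketched in a few lines: each registered atlas contributes a binary label map, and a voxel is kept in the final parotid mask when more than half of the atlases label it. This is a minimal numpy illustration, not the paper's implementation; the array shapes and the strict-majority threshold are assumptions.

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse binary segmentations from several registered atlases by per-voxel majority voting."""
    stack = np.stack(label_maps, axis=0)            # shape: (n_atlases, *volume_shape)
    votes = stack.sum(axis=0)                       # how many atlases call each voxel "parotid"
    return (votes > stack.shape[0] / 2).astype(np.uint8)

# toy example: three 2x2 "volumes" from three atlases
a = np.array([[1, 0], [1, 1]])
b = np.array([[1, 0], [0, 1]])
c = np.array([[0, 0], [1, 1]])
print(majority_vote([a, b, c]))   # -> [[1 0]
                                  #     [1 1]]
```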

Relevance:

100.00%

Publisher:

Abstract:

There is a need for more efficient methods giving insight into the complex mechanisms of neurotoxicity. Testing strategies including in vitro methods have been proposed to meet this requirement. With the present study we aimed to develop a novel in vitro approach that mimics in vivo complexity, detects neurotoxicity comprehensively, and provides mechanistic insight. For this purpose we combined rat primary re-aggregating brain cell cultures with a mass spectrometry (MS)-based metabolomics approach. As a proof of principle, we treated developing re-aggregating brain cell cultures for 48 h with the neurotoxicant methyl mercury chloride (0.1-100 µM) and the brain stimulant caffeine (1-100 µM) and acquired cellular metabolic profiles. To detect toxicant-induced metabolic alterations, the profiles were analysed using commercial software, which revealed patterns in the multi-parametric dataset by principal component analysis (PCA) and identified the most significantly altered metabolites. PCA revealed concentration-dependent cluster formation for methyl mercury chloride (0.1-1 µM) and treatment-dependent cluster formation for caffeine (1-100 µM) at sub-cytotoxic concentrations. Five relevant metabolites responsible for the concentration-dependent alterations following methyl mercury chloride treatment could be identified using MS-MS fragmentation analysis: gamma-aminobutyric acid, choline, glutamine, creatine and spermine. Their respective mass ion intensities showed metabolic alterations in line with the literature and suggest that these metabolites could serve as biomarkers for mechanisms of neurotoxicity or neuroprotection. In addition, we evaluated whether the approach could identify neurotoxic potential by testing, at sub-cytotoxic concentrations, eight compounds with target organ toxicity in the liver, kidney or brain. PCA revealed cluster formation largely dependent on target organ toxicity, indicating potential for the development of a neurotoxicity prediction model. On the basis of these results, a validation study would be useful to determine the reliability, relevance and applicability of this approach to neurotoxicity screening. Thus, for the first time we show the benefits and utility of in vitro metabolomics to comprehensively detect neurotoxicity and to discover new biomarkers.
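The pattern-recognition step described above amounts to projecting the metabolite intensity matrix onto its first principal components and looking for treatment-related clusters. The sketch below shows that generic workflow with scikit-learn on randomly generated placeholder intensities; it is not the commercial software used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows = culture samples (controls and treatments), columns = mass-ion intensities
rng = np.random.default_rng(0)
profiles = rng.lognormal(mean=0.0, sigma=0.5, size=(12, 200))   # placeholder intensities

scaled = StandardScaler().fit_transform(np.log(profiles))       # log-transform and autoscale
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)                              # sample coordinates on PC1/PC2

print(scores.shape)                       # (12, 2): one point per sample
print(pca.explained_variance_ratio_)      # variance captured by the two components
```

In the study's setting, plotting the score coordinates and colouring them by treatment and concentration is what reveals the concentration- or treatment-dependent clusters.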

Relevance:

40.00%

Publisher:

Abstract:

The Zermatt-Saas Fee Zone (ZSZ) in the Western Alps consists of multiple slices of ultramafic, mafic and metasedimentary rocks. They represent the remnants of the Mesozoic Piemonte-Ligurian oceanic basin, which was subducted to eclogite-facies conditions with peak pressures and temperatures of up to 20-28 kbar and 550-630 °C, followed by a greenschist-facies overprint during exhumation. Previous studies, based on isotopic geochronology and modelling of REE behaviour in garnets from mafic eclogites, suggest that the ZSZ is built up of tectonic slices which underwent protracted, diachronous subduction followed by rapid, synchronous exhumation. In this study, Rb/Sr geochronology is applied to phengite inclusions in garnets from metasediments of two different slices of the ZSZ in order to date garnet growth. Inclusion ages for two metapelitic samples from the same locality in the first slice are 44.25 ± 0.48 Ma and 43.19 ± 0.32 Ma. These are about 4 Myr older than the corresponding matrix mica ages of 40.02 ± 0.13 Ma and 39.55 ± 0.25 Ma, respectively. The inclusion age for a third, calcschist sample, collected from a second slice, is 40.58 ± 0.24 Ma and the matrix age is 39.8 ± 1.5 Ma. The results show that garnet effectively acted as a shield, preventing a reset of the Rb/Sr isotopic clock in the included phengites up to temperatures well above the closure temperature of Sr in mica. The results are consistent with those of earlier studies on the ZSZ using both Lu/Hf and Sm/Nd geochronology on mafic eclogites. They confirm that at least parts of the ZSZ reached near-peak high-pressure metamorphic conditions less than 43 Ma ago before being rapidly exhumed at about 40 Ma. Fluid infiltration in rocks of the second slice likely occurred close to peak metamorphic conditions, resulting in rapid garnet growth. Similar calcschists from the same slice contain two distinct types of garnet porphyroblasts showing multiple growth pulses and resorption, indicated by truncated chemical zoning patterns. In-situ oxygen isotope analyses by Sensitive High Resolution Ion Microprobe (SHRIMP) along profiles across central garnet sections reveal variations of up to 5 ‰ within individual garnets. The complex compositional zoning and graphite inclusion patterns, as well as the oxygen isotope variations, correspond to growth under changing fluid compositions caused by externally infiltrated fluids. The ultramafic and mafic rocks, which were subducted along with the sediments and form the volumetrically most important part of the ZSZ, are the likely source of these mainly aqueous fluids. - The Zermatt-Saas Fee Zone (ZSZ), exposed in the Western Alps, consists of multiple slices of ultramafic, mafic and metasedimentary rocks. It represents the remnants of the Mesozoic Piemonte-Ligurian oceanic basin. During the Eocene subduction of this basin, the rocks making up the ocean floor reached eclogite-facies conditions, with peak pressures and temperatures estimated at 20-28 kbar and 550-630 °C respectively, before undergoing greenschist-facies retrogression during exhumation.
Earlier studies combining isotopic geochronology with modelling of the mechanisms governing the incorporation of rare earth elements in garnets from mafic eclogites suggest that the ZSZ is not a single unit but consists of different tectonic slices that underwent protracted, diachronous subduction followed by rapid, synchronous exhumation. To test this hypothesis, this study dates phengites included in garnets from the metasediments of two different tectonic slices of the ZSZ, in order to date the relative growth of these garnets, using the geochronological method based on the decay of ⁸⁷Rb to ⁸⁷Sr. Three samples from two different slices were dated. The first two samples come from Triftji, north of the Breithorn, from a first slice whose metasediments are characterised by garnet-bearing metapelitic bands and calcschists. The third sample was collected at Riffelberg, in a slice whose metasediments are essentially calcschists mixed with mafic rocks and serpentinites. This mélange lies above the large serpentinite body forming the Riffelhorn, the Trockenersteg and the Breithorn, and is known as the Riffelberg mélange zone (Bearth, 1953). The inclusions in garnets from the two metapelitic samples of the first slice are dated at 44.25 ± 0.48 Ma and 43.19 ± 0.32 Ma. These ages are about 4 Myr older than those obtained on phengites from the matrix of the same samples, which give 40.02 ± 0.13 Ma and 39.55 ± 0.25 Ma respectively. Phengite inclusions in garnets from a calcschist of the second slice yield an age of 40.58 ± 0.24 Ma, whereas the matrix phengites give 39.8 ± 1.5 Ma. To explain these age differences between phengites included in garnet and phengites from the matrix, we suggest that garnet crystallisation isolated the included phengites and preserved them from any re-equilibration along the subsequent prograde and retrograde metamorphic path. This is particularly important for explaining the absence of re-equilibration of the phengites at temperatures above the closure temperature of the Rb/Sr system in phengite. Since the included phengites could not be dated individually, we interpret the 44 Ma age of the phengite inclusions as an average age for their incorporation into garnet. These results are consistent with earlier studies of the ZSZ using the Sm/Nd and Lu/Hf isotopic systems on mafic eclogites. They confirm that at least part of the ZSZ experienced peak pressure and temperature conditions less than 44-42 Ma ago before being rapidly exhumed to upper-greenschist-facies conditions around 40 Ma. This detailed study of the garnets also highlights the role of fluids during prograde metamorphism. Although all the garnets show pulses of growth and resorption, two distinct types of garnet porphyroblast can be distinguished in different calcschists from the second slice, depending on the presence or absence of graphite inclusions.
We relate these growth and resorption pulses, as well as the presence or absence of graphite inclusions in the garnets, to fluid infiltration into the system throughout the prograde path, and particularly close to, and possibly shortly after, the metamorphic peak, as suggested by the 40 Ma age measured in the phengite inclusions of the Riffelberg sample. In-situ oxygen isotope analyses performed with the SHRIMP (Sensitive High Resolution Ion Microprobe) across central garnet sections indicate variations of up to 5 ‰ within single garnets. The complex chemical zoning and graphite inclusion patterns, as well as the δ¹⁸O variations, correspond to garnet growth under changing fluid conditions due to the infiltration of external fluids. We relate the origin of these aqueous fluids to the ultramafic and mafic units that were subducted together with the metasediments and that form the volumetrically most important part of the ZSZ.
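For readers unfamiliar with the dating method used above, the Rb/Sr ages follow from the standard isochron relation ⁸⁷Sr/⁸⁶Sr(measured) = ⁸⁷Sr/⁸⁶Sr(initial) + ⁸⁷Rb/⁸⁶Sr * (e^(λt) - 1). The sketch below evaluates a two-point version of this relation; the isotope ratios are invented placeholders chosen only so that the result falls near 44 Ma, and the decay constant is a commonly used value, not necessarily the one adopted in the thesis.

```python
import numpy as np

LAMBDA_RB87 = 1.42e-11   # decay constant of 87Rb in 1/yr (commonly used value)

def rb_sr_age(rb_sr, sr_sr, rb_sr_ref, sr_sr_ref):
    """Two-point Rb-Sr isochron age (yr) from a high-Rb/Sr phase (e.g. phengite)
    and a low-Rb/Sr reference phase constraining the initial Sr composition."""
    slope = (sr_sr - sr_sr_ref) / (rb_sr - rb_sr_ref)   # equals e^(lambda*t) - 1
    return np.log(1.0 + slope) / LAMBDA_RB87

# placeholder ratios, chosen only so the result lands near 44 Ma
age_ma = rb_sr_age(rb_sr=150.0, sr_sr=0.8036, rb_sr_ref=0.5, sr_sr_ref=0.710) / 1e6
print(f"{age_ma:.1f} Ma")
```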

Relevance:

40.00%

Publisher:

Abstract:

The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into a single consensus alignment. This allows the user to simultaneously run all the methods of choice without having to arbitrarily pick one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa, T-Coffee: a novel method for fast and accurate multiple sequence alignment, J. Mol. Biol. 2000; 302: 205-217; Wallace, O'Sullivan, Higgins and Notredame, M-Coffee: combining multiple sequence alignment methods with T-Coffee, Nucleic Acids Res. 2006; 34: 1692-1699]. Given a set of sequences (DNA or proteins) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open-source package distributed under a GPL license and is available either as a standalone package or as a web service from www.tcoffee.org.
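When the standalone package is used rather than the web server, the meta-alignment can be produced from a script. The minimal Python sketch below shells out to the t_coffee executable; the -mode mcoffee and -output flags follow the T-Coffee documentation as remembered here, and the output file naming is an assumption, so check the manual of the installed version.

```python
import subprocess

# Minimal sketch: run the locally installed T-Coffee package in its M-Coffee
# (meta-aligner) mode on a FASTA file and read back the consensus alignment.
# Flag names and the default output file name are assumptions to verify against
# the installed T-Coffee version.
def run_mcoffee(fasta_path: str) -> str:
    subprocess.run(
        ["t_coffee", fasta_path, "-mode", "mcoffee", "-output", "fasta_aln"],
        check=True,
    )
    out_path = fasta_path.rsplit(".", 1)[0] + ".fasta_aln"   # default naming assumed
    with open(out_path) as fh:
        return fh.read()

# print(run_mcoffee("my_sequences.fasta"))
```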

Relevance:

40.00%

Publisher:

Abstract:

THESIS ABSTRACT Garnet is one of the key metamorphic minerals used to study peak metamorphic conditions and crystallization ages. Equilibrium is typically assumed between the garnet and the matrix. This thesis attempts to understand garnet growth in the Zermatt-Saas Fee (ZSF) eclogites and discusses the consequences for Sm/Nd and Lu/Hf dating and for the equilibrium assumption. All studied garnets from the ZSF eclogites are strongly zoned in Mn, Fe, Mg, and Ca. Methods based on chemical zoning patterns and on 3D spatial statistics indicate different growth mechanisms depending on the sample studied. Garnets from the Pfulwe area grew in a system where surface kinetics likely dominated over intergranular diffusion kinetics. Garnets from two other localities, Nuarsax and Lago di Cignana, appear to have grown in a system where intergranular diffusion kinetics dominated over surface kinetics, at least during initial growth. The garnets reveal strong prograde REE+Y zoning. They contain narrow central peaks for Lu + Yb + Tm ± Er and at least one additional small peak towards the rim. The REE Sm + Eu + Gd + Tb ± Dy are depleted in the cores but show one prominent peak close to the rim. It is shown that these patterns can be explained using a transient matrix diffusion model in which REE uptake is limited by diffusion in the matrix surrounding the porphyroblast. The secondary peaks in the garnet profiles are interpreted to reflect thermally activated diffusion due to the temperature increase during prograde metamorphism. The model predicts anomalously low ¹⁷⁶Lu/¹⁷⁷Hf and ¹⁴⁷Sm/¹⁴⁴Nd ratios in garnets where growth rates are fast compared to diffusion of the REE, which decreases garnet isochron precision. The sharp Lu zoning was further used to constrain maximum Lu volume diffusion rates in garnet. The modelled minimum pre-exponential diffusion coefficient which fits the measured central peak is on the order of D0 = 5.7 × 10⁻⁶ m²/s, taking an activation energy of 270 kJ/mol chosen in agreement with experimentally determined values. This yields an estimated minimum closure temperature of around 630 °C for the ZSF zone. REE zoning was combined with published Lu/Hf and Sm/Nd age information to redefine the prograde crystallization interval for the Lago di Cignana UHP eclogites. Modelling revealed that a prograde growth interval on the order of 25 m.y. is needed to produce the measured spread in ages. RÉSUMÉ Garnet is a key metamorphic mineral for determining peak metamorphic conditions as well as crystallisation ages. Equilibrium between garnet and the matrix is required. This study aims to understand garnet growth in the eclogites of the Zermatt-Saas Fee zone (ZSF) and to examine some consequences for Sm/Nd and Lu/Hf dating. All studied garnets from the ZSF eclogites are strongly zoned in Mn, Fe and Mg, and partly in Ca. Methods based on chemical zoning patterns and on 3D spatial statistics indicate different growth mechanisms depending on the sampling locality. Garnets from the Pfulwe area probably grew in a system dominated by surface kinetics rather than by intergranular diffusion kinetics. Garnets from two other localities, Nuarsax and Lago di Cignana, appear to have crystallised in a system dominated by intergranular diffusion, at least during the early stages of growth.
The garnets show strong prograde zoning in the rare earth elements (REE) and Y. The profiles have a narrow central peak in Lu + Yb + Tm ± Er and at least one additional small peak towards the rim. The garnet cores are depleted in Sm + Eu + Gd + Tb ± Dy, whereas the rims are marked by a prominent peak in these REE. These profiles are explained by a matrix diffusion model in which REE supply is limited by diffusion in the matrix surrounding the porphyroblasts. The secondary peaks near the grain rims reflect diffusion activated by the temperature increase during prograde metamorphism. The model predicts anomalously low ¹⁷⁶Lu/¹⁷⁷Hf and ¹⁴⁷Sm/¹⁴⁴Nd ratios when growth rates are fast relative to REE diffusion, which reduces the precision of garnet isochrons. The sharp Lu zoning made it possible to constrain the maximum volume diffusion numerically. The minimum modelled diffusion coefficient consistent with the measured peaks is on the order of D0 = 5.7 × 10⁻⁶ m²/s, taking an experimentally determined activation energy of ~270 kJ/mol. The minimum closure temperature for the ZSF zone is thus estimated at around 630 °C. New REE zoning data are combined with ages obtained from the Lu/Hf and Sm/Nd systems to redefine the prograde crystallisation interval for the Lago di Cignana UHP eclogites. The modelling requires a prograde growth interval of at least 25 Myr to account for the previously measured ages. RÉSUMÉ GRAND PUBLIC One of the main goals of the metamorphic petrologist is to extract from rocks information about the temporal, thermal and barometric evolution they experienced during the formation of a mountain belt. Garnet is one of the key minerals in a wide variety of metamorphic rocks. It has been the subject of numerous studies in terrains of various origins, and of experimental studies, aimed at understanding its stability fields, its reactions and its coexistence with other minerals. This makes garnet one of the most attractive minerals for dating rocks. However, when it is used for dating and/or geothermobarometry, garnet is always assumed to grow in equilibrium with the coexisting matrix phases. Yet the growth of a mineral is generally tied to disequilibrium processes. This study aims to understand how garnet grows in the Zermatt-Saas Fee eclogites and thus to assess the degree of disequilibrium. It also seeks to explain the age differences obtained from garnets at the different localities of the Zermatt-Saas Fee unit. The main question when studying garnet growth mechanisms is: among the processes at play during garnet growth (dissolution of pre-existing minerals, transport of elements towards the new garnet, precipitation of a new layer at the mineral surface), which is the slowest and thus determines the degree of disequilibrium? The garnets from one locality (Pfulwe) indicate that surface attachment is the slowest process, whereas in the garnets from the other localities (Lago di Cignana, Nuarsax) the transport processes are the slowest. This shows that the dominant processes vary even among similar rocks from the same tectonic unit.
This implies that the processes must be determined individually for each rock in order to assess the degree of disequilibrium of garnet in that rock. All the analysed garnets show a high core concentration of the rare earths Lu + Yb + Tm ± Er that decreases towards the grain rim. Conversely, the rare earths Sm + Eu + Gd + Tb ± Dy are depleted in the core and concentrated at the grain rim. The modelling reveals that these profiles are due to slow transport kinetics of the rare earth elements. In addition, the models predict low concentrations of the radiogenic parent elements in some rocks, which strongly affects the precision of the ages obtained by the isochron method. This means that the rocks best suited for dating should contain neither abundant garnet nor very large crystals, because in that case competition between the crystals limits the amount of parent elements available to each crystal to low concentrations.
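The Lu diffusion constraint quoted above is an Arrhenius law, D = D0 * exp(-Ea / (R*T)). The short sketch below simply evaluates that relation with the D0 and Ea values given in the abstract; it illustrates the temperature dependence only and does not reproduce the thesis's numerical diffusion model or the Dodson-type closure-temperature calculation.

```python
import numpy as np

R = 8.314          # gas constant, J/(mol K)
D0 = 5.7e-6        # pre-exponential factor quoted in the abstract, m^2/s
EA = 270e3         # activation energy quoted in the abstract, J/mol

def d_volume(t_celsius: float) -> float:
    """Arrhenius volume diffusion coefficient D = D0 * exp(-Ea / (R*T))."""
    return D0 * np.exp(-EA / (R * (t_celsius + 273.15)))

# Compare Lu diffusivity at the estimated minimum closure temperature (~630 °C)
# with a cooler prograde stage; the temperatures are chosen for illustration only.
for t in (500.0, 630.0):
    print(f"{t:5.0f} °C -> D = {d_volume(t):.2e} m^2/s")
```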

Relevance:

30.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold-standard TDM approach but require computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
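The a posteriori (Bayesian) dosage adaptation mentioned above boils down to a maximum a posteriori fit of the individual pharmacokinetic parameters, balancing the misfit to the measured concentration against the deviation from population values. The sketch below does this for a hypothetical one-compartment IV bolus drug with a single concentration; all doses, population parameters and variabilities are illustrative placeholders, and real TDM software uses richer models and validated priors.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal MAP sketch for Bayesian a posteriori adjustment (one-compartment IV bolus):
# estimate individual CL and V from one measured concentration, shrinking towards
# population values. All numbers below are illustrative placeholders.
DOSE, T_OBS, C_OBS = 500.0, 12.0, 4.2        # mg, h, mg/L
CL_POP, V_POP = 5.0, 50.0                    # population means: L/h, L
OM_CL, OM_V, SIG = 0.3, 0.2, 0.15            # log-scale inter-individual and residual SDs

def neg_log_post(theta):
    cl, v = np.exp(theta)                                    # work in log space so CL, V > 0
    c_pred = DOSE / v * np.exp(-cl / v * T_OBS)              # one-compartment prediction
    resid = (np.log(C_OBS) - np.log(c_pred)) ** 2 / SIG**2   # fit to the observation
    prior = ((theta[0] - np.log(CL_POP)) ** 2 / OM_CL**2
             + (theta[1] - np.log(V_POP)) ** 2 / OM_V**2)    # pull towards the population
    return 0.5 * (resid + prior)

fit = minimize(neg_log_post, x0=np.log([CL_POP, V_POP]))
cl_i, v_i = np.exp(fit.x)
print(f"individual CL = {cl_i:.2f} L/h, V = {v_i:.1f} L")
# a new dose can then be chosen so that the predicted concentrations stay in the target range
```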

Relevance:

30.00%

Publisher:

Abstract:

Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were performed essentially by hand. This paper describes the results of a project designed to improve the analytical and search processes for ink samples. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and a software package to complement the existing ink library.
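The abstract does not specify the algorithms, so purely as an illustration of what an automated library search over standardized HPTLC data can look like, the sketch below normalises digitised intensity profiles and ranks library entries by correlation with a questioned ink. Every function and data value here is a hypothetical placeholder, not the Secret Service software.

```python
import numpy as np

# Illustrative sketch: compare a questioned ink's digitised HPTLC intensity profile
# against library references after a simple min-max normalisation, so runs from
# different plates become comparable, then rank matches by Pearson correlation.
def normalise(profile: np.ndarray) -> np.ndarray:
    p = profile - profile.min()
    return p / p.max()

def rank_matches(query: np.ndarray, library: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    q = normalise(query)
    scores = {name: float(np.corrcoef(q, normalise(ref))[0, 1]) for name, ref in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# toy library of three reference profiles sampled at the same 100 positions
rng = np.random.default_rng(1)
lib = {f"ink_{i}": rng.random(100) for i in range(3)}
print(rank_matches(lib["ink_1"] * 1.7 + 0.2, lib))   # ink_1 should rank first
```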

Relevance:

30.00%

Publisher:

Abstract:

Most life science processes involve, at the atomic scale, recognition between two molecules. The prediction of such interactions at the molecular level, by so-called docking software, is a non-trivial task. Docking programs have a wide range of applications, from protein engineering to drug design. This article presents SwissDock, a web server dedicated to the docking of small molecules on target proteins. It is based on the EADock DSS engine, combined with setup scripts that handle common problems and prepare both the target protein and the ligand input files. An efficient Ajax/HTML interface was designed and implemented so that scientists can easily submit dockings and retrieve the predicted complexes. For automated docking tasks, a programmatic SOAP interface has been set up, and template programs can be downloaded in Perl, Python and PHP. The web site also provides access to a database of manually curated complexes based on the Ligand Protein Database. A wiki and a forum are available to the community to promote interactions between users. The SwissDock web site is available online at http://www.swissdock.ch. We believe it constitutes a step toward generalizing the use of docking tools beyond the traditional molecular modeling community.
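For scripted submissions, the abstract points to downloadable Perl, Python and PHP templates for the SOAP interface. The fragment below only sketches what such a client could look like using the zeep SOAP library; the WSDL location and the operation names are hypothetical placeholders, so the real calls should be taken from the templates distributed on the site.

```python
from zeep import Client

# Illustrative only: the WSDL location and the operation names below are hypothetical
# placeholders, NOT the documented SwissDock SOAP API. Use the Perl/Python/PHP
# templates distributed on www.swissdock.ch for the real calls.
WSDL_URL = "http://www.swissdock.ch/soap?wsdl"   # placeholder address

def submit_docking(target_pdb: str, ligand_mol2: str) -> str:
    client = Client(WSDL_URL)
    with open(target_pdb) as target, open(ligand_mol2) as ligand:
        # hypothetical operation returning a job identifier to poll later
        return client.service.startDocking(target=target.read(), ligand=ligand.read())

# job_id = submit_docking("receptor.pdb", "ligand.mol2")
# ...later, retrieve results via a (hypothetical) client.service.getPredictedComplexes(job_id)
```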

Relevance:

30.00%

Publisher:

Abstract:

Extensible Markup Language (XML) is a generic computing language that provides an outstanding case study in the commodification of service standards. The development of this language in the late 1990s marked a shift in computer science, as its extensibility made it possible to store and share any kind of data. Many office software suites rely on it. The chapter highlights how the largest multinational firms pay special attention to gaining a recognised international standard for such a major technological innovation. It argues that standardisation processes affect market structures and can lead to market capture. By examining how a strategic use of standardisation arenas can generate profits, it shows that Microsoft succeeded in making its own technical solution a recognised ISO standard in 2008, while the same arena had already adopted, two years earlier, the open-source standard backed by IBM and Sun Microsystems. Yet XML standardisation also helped to establish a distinct model of information technology services at the expense of Microsoft's monopoly on proprietary software.

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking exercise was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: Twelve software tools were identified, tested and ranked, representing a comprehensive review of the characteristics of the available software. The number of drugs handled varies widely, and eight programs offer the user the ability to add their own drug models. Ten computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while nine are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly. Conclusion: Although two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them over recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents the gold-standard TDM approach but requires computing assistance. The aim of this benchmarking exercise was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: Twelve software tools were identified, tested and ranked, representing a comprehensive review of the characteristics of the available software. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. Moreover, eight programs offer the ability to add new drug models based on population PK data. Ten computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated from population PK models). All of them are able to compute an a posteriori Bayesian dosage adaptation based on a blood concentration, while nine are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Although two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.