41 results for automation roadmap
Abstract:
The goal of this study was to compare the quantity and purity of DNA extracted from biological traces using the QIAsymphony robot with that of the manual QIAamp DNA mini kit currently in use in our laboratory. We found that the DNA yield of the robot was 1.6-3.5 times lower than that of the manual protocol. This resulted in a loss of 8% and 29% of the alleles correctly scored when analyzing 1/400 and 1/800 diluted saliva samples, respectively. Specific tests showed that the QIAsymphony was at least 2-16 times more efficient at removing PCR inhibitors. The higher purity of the DNA may therefore partly compensate for the lower DNA yield obtained. No case of cross-contamination was observed among samples. After purification with the robot, DNA extracts can be automatically transferred into 96-well plates, an ideal format for subsequent RT-qPCR quantification and DNA amplification. Less hands-on time and a reduced risk of operational errors represent additional advantages of the robotic platform.
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data, but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-to-top approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
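The clustering measure mentioned above can be illustrated with the classic quadrat-count Morisita index of dispersion; the exact QMI variant used in the study (evaluated as a function of the radon level) is not specified here, so the function name and the plain quadrat formulation below are assumptions for illustration only.

```python
def morisita_index(counts):
    """Morisita's index of dispersion over quadrat counts (illustrative sketch).

    I_M = n * sum(x_i * (x_i - 1)) / (N * (N - 1)),
    where n is the number of quadrats, x_i the point count in quadrat i,
    and N the total number of points. I_M close to 1 indicates a random
    pattern, I_M > 1 clustering, and I_M < 1 a regular (uniform) pattern.
    """
    n = len(counts)
    N = sum(counts)
    if N < 2:
        raise ValueError("need at least two points in total")
    return n * sum(x * (x - 1) for x in counts) / (N * (N - 1))
```

For example, a strongly clustered pattern such as [8, 0, 0, 0] yields I_M = 4.0, while an even spread [2, 2, 2, 2] yields about 0.57.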
Abstract:
In a multicenter study, a new fully automated Roche Diagnostics Elecsys HBsAg II screening assay with improved sensitivity for HBsAg mutant detection was compared to well-established HBsAg tests: AxSYM HBsAg V2 (Abbott), Architect HBsAg (Abbott), Advia Centaur HBsAg (Bayer), Enzygnost HBsAg 5.0 (Dade-Behring), and Vitros Eci HBsAg (Ortho). A total of 16 seroconversion panels, samples of 60 HBsAg native mutants and 31 HBsAg recombinant mutants, dilution series of NIBSC and PEI standards, 156 HBV-positive samples comprising genotypes A to G, 686 preselected HBsAg-positive samples from different stages of infection, 3,593 samples from daily routine, and 6,360 unselected blood donations were tested to evaluate the analytical and clinical sensitivity, the detection of mutants, and the specificity of the new assay. Elecsys HBsAg II showed a statistically significantly better sensitivity in seroconversion panels than the compared tests. Fifty-seven out of 60 native mutants and all recombinant mutants were found positive. Among the 156 HBV samples with different genotypes and 696 preselected HBsAg-positive samples, Elecsys HBsAg II achieved a sensitivity of 100%. The lower detection limit for the NIBSC standard was calculated to be 0.025 IU/ml, and for the PEI standards ad and ay it was <0.001 and <0.005 U/ml, respectively. Within 2,724 daily routine specimens and 6,360 unselected blood donations, Elecsys HBsAg II showed a specificity of 99.97 and 99.88%, respectively. In conclusion, the new Elecsys HBsAg II shows a high sensitivity for the detection of all stages of HBV infection and of HBsAg mutants, paired with a high specificity in blood donors, daily routine samples, and potentially interfering sera.
Abstract:
Despite intensive research efforts, the aetiology of the majority of chronic lung diseases (CLD), in both children and adults, remains elusive. Current therapeutic options are limited, providing only symptomatic relief rather than treating the underlying condition or preventing its development in the first place. Thus, there is a strong and unmet clinical need for the development of both novel effective therapies and preventative strategies for CLD. Many studies suggest that modifications of prenatal and/or early postnatal lung development have important implications for future lung function and the risk of CLD throughout life. This view represents a fundamental change from current pathophysiological concepts and treatment paradigms, and holds the potential for the development of novel preventative and/or therapeutic strategies. However, for such approaches to be developed successfully, several key questions need to be solved: a clear understanding of the underlying mechanisms of impaired lung development, the identification and validation of relevant preclinical models to facilitate translational research, and the development of concepts for the correction of aberrant development. Accordingly, a European Science Foundation Exploratory Workshop was held, at which clinical, translational and basic research scientists from different disciplines met to discuss potential mechanisms of the developmental origins of CLD and to identify major knowledge gaps, in order to delineate a roadmap for future integrative research.
Abstract:
"MotionMaker (TM)" is a stationary programmable test and training system for the lower limbs developed at the 'Ecole Polytechnique Federale de Lausanne' with the 'Fondation Suisse pour les Cybertheses'.. The system is composed of two robotic orthoses comprising motors and sensors, and a control unit managing the trans-cutaneous electrical muscle stimulation with real-time regulation. The control of the Functional Electrical Stimulation (FES) induced muscle force necessary to mimic natural exercise is ensured by the control unit which receives a continuous input from the position and force sensors mounted on the robot. First results with control subjects showed the feasibility of creating movements by such closed-loop controlled FES induced muscle contractions. To make exercising with the MotionMaker (TM) safe for clinical trials with Spinal Cord Injured (SCI) volunteers, several original safety features have been introduced. The MotionMaker (TM) is able to identify and manage the occurrence of spasms. Fatigue can also be detected and overfatigue during exercise prevented.
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher the past. Since they have known that continents move, geologists have been trying to retrace the distribution of the continents through the ages.
If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger set associating, all at once, oceanic and continental crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea has still not received a sufficient echo among the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view to, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue. Using assemblies of continents (referred to as "key assemblies") as anchors distributed all along the scope of our study (ranging from the Eocene to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding/removing oceanic material (symbolized by synthetic isochrons) to/from the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time, and form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model.
This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a scarcely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern informatics tools specifically devoted to storing, analyzing and managing spatially referenced data and their associated attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions thanks to plate velocity models. Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we are now able to tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a linear distribution along a band going from the Northern Pacific through northern South America, the Central Atlantic, northern Africa and Central Asia up to Japan. Basically, this signifies that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted this as the potential secular influence of the Moon on plate motions. The oceanic realms are the cornerstone of our model, and we attached particular interest to reconstructing them in detail. In this model, the oceanic crust is preserved from one reconstruction to the next.
The crustal material is symbolised by synthetic isochrons of known age. We also reconstruct the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
Abstract:
BACKGROUND: Chronic kidney disease (CKD) accelerates age-related vascular stiffening. Arterial stiffness may be evaluated by measuring the carotid-femoral pulse wave velocity (PWV) or, more simply, as recommended by KDOQI, by monitoring pulse pressure (PP). Both correlate with survival and with the incidence of cardiovascular disease. PWV can also be estimated at the brachial artery using a Mobil-O-Graph, a non-operator-dependent automatic device. The aim was to analyse whether, in a dialysis population, PWV obtained by the Mobil-O-Graph (MogPWV) is more sensitive to vascular aging than PP. METHODS: A cohort of 143 patients from 4 dialysis units was followed, measuring MogPWV and PP every 3 to 6 months, and compared to a control group with the same risk factors but an eGFR > 30 ml/min. RESULTS: MogPWV, contrary to PP, discriminated the dialysis population from the control group. The mean difference between the two populations, translated into age, was 8.4 years. The increase in MogPWV as a function of age was more rapid in the dialysis group. 13.3% of the dialysis patients, but only 3.0% of the control group, were outliers for MogPWV. The mortality rate (16 out of 143) was similar in outliers and inliers (7.4 and 8.0%/year). Stratifying patients according to MogPWV, a significant difference in survival was seen. A high parathormone (PTH) level and being dialysed for hypertensive nephropathy were associated with a higher baseline MogPWV. CONCLUSIONS: Assessing PWV at the brachial artery using a Mobil-O-Graph is a valid and simple alternative which, in the dialysis population, is more sensitive to vascular aging than PP. As demonstrated in previous studies, PWV correlates with mortality. Among specific CKD risk factors, only PTH is associated with a higher baseline PWV. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02327962.
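For orientation, pulse wave velocity is by definition the distance the pressure pulse travels along the arterial path divided by its transit time; the Mobil-O-Graph estimates it oscillometrically via a proprietary algorithm, so the sketch below only illustrates the underlying definition, and the function name and example values are assumptions, not data from the study.

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """PWV (m/s) = arterial path length (m) / pulse transit time (s)."""
    if transit_time_s <= 0:
        raise ValueError("transit time must be positive")
    return path_length_m / transit_time_s
```

For instance, a 0.5 m carotid-femoral path travelled in 50 ms corresponds to a PWV of 10 m/s.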
Abstract:
The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples, on which identification and antibiotic susceptibility testing depend. Consequently, we compared the performance of the InoqulA (BD Kiestra) and WASP (Copan) automated systems with manual inoculation. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10(6) bacteria/ml. Inoculation with the InoqulA system yielded significantly more discrete colonies than the WASP system at concentrations of >10(7) bacteria/ml, although the magnitude of the difference was bacterial species dependent. Discrete colonies of bacteria present at 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than WASP and manual inoculation. Consequently, automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus resulted in a significant reduction in the time to results, laboratory workload, and laboratory costs.
Abstract:
High-grade serous ovarian cancer (HGSOC) accounts for 70-80% of ovarian cancer deaths, and overall survival has not changed significantly for several decades. In this Opinion article, we outline a set of research priorities that we believe will reduce incidence and improve outcomes for women with this disease. This 'roadmap' for HGSOC was determined after extensive discussions at an Ovarian Cancer Action meeting in January 2015.
Abstract:
Cysteine cathepsin protease activity is frequently dysregulated in the context of neoplastic transformation. Increased activity and aberrant localization of proteases within the tumour microenvironment have a potent role in driving cancer progression, proliferation, invasion and metastasis. Recent studies have also uncovered functions for cathepsins in the suppression of the response to therapeutic intervention in various malignancies. However, cathepsins can be either tumour promoting or tumour suppressive depending on the context, which emphasizes the importance of rigorous in vivo analyses to ascertain function. Here, we review the basic research and clinical findings that underlie the roles of cathepsins in cancer, and provide a roadmap for the rational integration of cathepsin-targeting agents into clinical treatment.