45 results for Robotics, Automation


Relevance:

10.00%

Publisher:

Abstract:

Multidisciplinary management of colorectal liver metastases allows an increase of about 20% in the resection rate of liver metastases. It combines chemotherapy, interventional radiology and surgery. In 2013, the preliminary results of in-situ splitting of the liver associated with portal vein ligation (ALPPS) are promising, with unprecedented mean hypertrophy of up to 70% by day 9. However, the related morbidity of this procedure is about 40%, and it should therefore be performed only within a study protocol. For pancreatic cancer, the future belongs to adjuvant and neoadjuvant therapies aimed at increasing the resection rate. Laparoscopic and robot-assisted surgery is still evolving, with significant benefits in reduced cost, hospital stay and postoperative morbidity. Finally, enhanced recovery pathways (ERAS) have been validated for colorectal surgery and are currently being assessed in other surgical fields such as HPB and upper GI surgery.

Relevance:

10.00%

Publisher:

Abstract:

The goal of this study was to compare the quantity and purity of DNA extracted from biological traces using the QIAsymphony robot with those of the manual QIAamp DNA mini kit currently in use in our laboratory. We found that the DNA yield of the robot was 1.6-3.5 times lower than that of the manual protocol. This resulted in a loss of 8% and 29% of the correctly scored alleles when analyzing 1/400 and 1/800 diluted saliva samples, respectively. Specific tests showed that the QIAsymphony was at least 2-16 times more efficient at removing PCR inhibitors. The higher purity of the DNA may therefore partly compensate for the lower DNA yield obtained. No case of cross-contamination was observed among samples. After purification with the robot, DNA extracts can be automatically transferred into 96-well plates, an ideal format for subsequent RT-qPCR quantification and DNA amplification. Less hands-on time and a reduced risk of operational errors are additional advantages of the robotic platform.

Relevance:

10.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it did not prove to be the sole determinant for spatial modeling. The statistical analysis of the data, at both univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of radon level. Existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for reproducing the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening data classification. Among the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision making.
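The QMI clustering measure mentioned above can be illustrated with a minimal sketch. The function below is a generic Morisita index over quadrat counts, written as an assumed reading of the thesis's clustering procedure rather than its actual code; the quadrat counts are invented.

```python
# Hedged sketch of a Morisita-type clustering index: the study area is
# divided into Q quadrats and the index compares observed point
# co-occurrence to that expected under a random pattern.
def morisita_index(counts):
    """counts: number of points falling in each of Q quadrats.
    Returns ~1 for random patterns, >1 for clustered, <1 for uniform."""
    Q = len(counts)
    N = sum(counts)
    if N < 2:
        raise ValueError("need at least two points")
    return Q * sum(n * (n - 1) for n in counts) / (N * (N - 1))

# All points piled into one quadrat -> strongly clustered
print(morisita_index([10, 0, 0, 0, 0]))  # 5.0
# Points spread evenly -> below 1, i.e. more uniform than random
print(morisita_index([2, 2, 2, 2, 2]))
```

Declustering methods such as those revised in the text aim at down-weighting exactly the over-sampled quadrats that drive this index above 1.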

Relevance:

10.00%

Publisher:

Abstract:

Laparoscopic surgery has become a standard approach for many interventions, including oncologic surgery. Laparoscopic instruments have been developed to allow advanced surgical procedures. Imaging and computer assistance, in virtual reality or robotic procedures, will certainly improve access to this surgery.

Relevance:

10.00%

Publisher:

Abstract:

Chronic atrial fibrillation affects millions of people worldwide. Its surgical treatment often fails to restore the transport function of the atrium. This study first introduces the concept of an atrial assist device (AAD) to restore the pump function of the atrium. The AAD is developed to be totally implantable in the human body, with a transcutaneous energy transfer system to recharge the implanted battery. The AAD consists of a motorless pump based on artificial-muscle technology, positioned on the external surface of the atrium to compress it and restore its muscular activity. A bench model reproduces the function of a fibrillating atrium to assess the circulatory support that this pump can provide. Atripump (Nanopowers SA, Switzerland) is a dome-shaped, silicone-coated nitinol actuator, 5 mm high, sutured onto the external surface of the atrium. A pacemaker-like control unit drives the actuator, which compresses the atrium, providing mechanical support to the blood circulation. Electrical characteristics: the system is composed of one actuator that needs a minimum voltage of 15 V and draws a maximum current of 1.5 A at a 50% duty cycle. The implantable rechargeable battery is made of a cell with the following specifications: nominal cell voltage 4.1 V, voltage after 90% discharge 3.5 V, nominal cell capacity 163 mA h. The bench model consists of an open circuit made of a latex bladder 60 mm in diameter filled with water. The bladder is connected to a vertically positioned tube that is filled to different levels, reproducing changes in cardiac preload. The Atripump is placed on the outer surface of the bladder. Pressure, volume and temperature changes were recorded. The contraction rate was 1 Hz with a power supply of 12 V, 400 mA for 200 ms. Preload ranged from 15 to 21 cmH2O. The maximal silicone-membrane temperature was 55 °C and the maximal temperature of the liquid environment was 35 °C.
The pump produced a maximal work of 16 × 10^-3 J. The maximal volume pumped was 492 ml/min. This artificial-muscle pump is compact, follows the Starling law and reproduces the hemodynamic performance of a normal atrium. It could represent a new tool to restore the atrial kick in persistent atrial fibrillation.
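The electrical figures quoted above (15 V minimum, 1.5 A peak at a 50% duty cycle, a 163 mA h cell discharged to 90%) imply an average power draw and a single-cell runtime that can be checked with simple arithmetic. The sketch below is a back-of-envelope estimate under those stated values, not a figure from the study.

```python
# Back-of-envelope check of the actuator's electrical figures quoted above.
voltage_v = 15.0       # minimum actuator voltage (V)
peak_current_a = 1.5   # maximum current (A)
duty_cycle = 0.5       # 50 % duty cycle

mean_current_a = peak_current_a * duty_cycle   # 0.75 A average draw
mean_power_w = voltage_v * mean_current_a      # 11.25 W average power

cell_capacity_mah = 163.0
usable_fraction = 0.9                          # 90 % discharge, per the text
runtime_min = cell_capacity_mah * usable_fraction / (mean_current_a * 1000) * 60

print(f"average power: {mean_power_w:.2f} W")
print(f"runtime on one cell: {runtime_min:.1f} min")
```

The short runtime of a single cell is consistent with the text's emphasis on a transcutaneous energy transfer system for recharging.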

Relevance:

10.00%

Publisher:

Abstract:

In a multicenter study, the new, fully automated Roche Diagnostics Elecsys HBsAg II screening assay, with improved sensitivity for HBsAg mutant detection, was compared to well-established HBsAg tests: AxSYM HBsAg V2 (Abbott), Architect HBsAg (Abbott), Advia Centaur HBsAg (Bayer), Enzygnost HBsAg 5.0 (Dade-Behring), and Vitros Eci HBsAg (Ortho). A total of 16 seroconversion panels, samples of 60 native HBsAg mutants and 31 recombinant HBsAg mutants, dilution series of the NIBSC and PEI standards, 156 HBV-positive samples comprising genotypes A to G, 686 preselected HBsAg-positive samples from different stages of infection, 3,593 samples from the daily routine, and 6,360 unselected blood donations were tested to evaluate the analytical and clinical sensitivity, the detection of mutants, and the specificity of the new assay. Elecsys HBsAg II showed significantly better sensitivity in the seroconversion panels than the comparison tests. Fifty-seven of the 60 native mutants and all recombinant mutants were found positive. Among the 156 HBV samples with different genotypes and the preselected HBsAg-positive samples, Elecsys HBsAg II achieved a sensitivity of 100%. The lower detection limit for the NIBSC standard was calculated to be 0.025 IU/ml, and for the PEI standards ad and ay it was <0.001 and <0.005 U/ml, respectively. Within 2,724 daily routine specimens and the 6,360 unselected blood donations, Elecsys HBsAg II showed a specificity of 99.97% and 99.88%, respectively. In conclusion, the new Elecsys HBsAg II shows high sensitivity for the detection of all stages of HBV infection and of HBsAg mutants, together with high specificity in blood donors, daily routine samples, and potentially interfering sera.
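As a hedged illustration of how specificity figures like those above are computed: specificity is the fraction of truly negative samples that test negative. The false-positive count below is back-computed from the reported 99.88% on the blood-donation panel purely for illustration; it is not a count taken from the study.

```python
# Illustrative specificity computation: blood donors are assumed truly
# negative, so specificity = true negatives / all negatives tested.
def specificity(true_negatives, false_positives):
    return true_negatives / (true_negatives + false_positives)

donations = 6360
false_pos = round(donations * (1 - 0.9988))  # ~8 initially reactive samples
print(specificity(donations - false_pos, false_pos))  # ≈ 0.9987
```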

Relevance:

10.00%

Publisher:

Abstract:

Since 2000 and the commercialisation of the Da Vinci robotic system, indications for robotic surgery have been rapidly increasing. Recent publications have demonstrated superior functional outcomes with equivalent oncologic safety compared to conventional open surgery. Its field of application may extend to nasopharyngeal and skull-base surgery. The preliminary results are encouraging. This article reviews the current literature on the role of transoral robotic surgery in head and neck cancer.

Relevance:

10.00%

Publisher:

Abstract:

The effect of motor training using closed-loop controlled Functional Electrical Stimulation (FES) on motor performance was studied in 5 spinal cord injured (SCI) volunteers. The subjects trained 2 to 3 times a week for 2 months on a newly developed rehabilitation robot (MotionMaker™). The FES-induced muscle force could be adequately adjusted throughout the programmed exercises by means of closed-loop control of the stimulation currents. The software of the MotionMaker™ allowed spasms to be detected accurately and managed so as to prevent any harm to the SCI persons. Subjects with incomplete SCI reported an increased proprioceptive awareness of motion and were able to achieve better voluntary activation of their leg muscles during controlled FES. At the end of the training, the voluntary force of the 4 incomplete SCI patients had increased by 388% on their most affected leg and by 193% on the other leg. Active mobilisation with controlled FES seems to be effective in improving motor function in SCI persons by increasing the sensory input to the neuronal circuits involved in motor control, as well as by increasing muscle strength.
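The closed-loop adjustment of stimulation currents described above can be sketched as a simple feedback loop: the current is corrected each cycle so that the FES-induced force tracks the force programmed for the exercise. The PI gains, the linear muscle response, and the spasm guard below are illustrative assumptions, not MotionMaker parameters.

```python
# Minimal sketch of closed-loop FES: a PI controller adjusts the
# stimulation current so the induced force tracks a programmed target.
def run_controller(target_force_n, steps=200, kp=0.4, ki=0.08):
    current_ma, integral, force = 0.0, 0.0, 0.0
    for _ in range(steps):
        error = target_force_n - force
        if error > 3 * target_force_n:   # crude stand-in for a spasm guard
            break
        integral += error
        current_ma = max(0.0, kp * error + ki * integral)
        force = 0.8 * current_ma         # toy linear muscle response (N per mA)
    return current_ma, force

current, force = run_controller(40.0)
print(f"steady state: {current:.1f} mA -> {force:.1f} N")
```

With these gains the loop settles at the current that produces exactly the target force, which is the behaviour the text describes as "adequately adjusted throughout the programmed exercises".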

Relevance:

10.00%

Publisher:

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climate studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrace continental distributions through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but belong to a larger set combining oceanic and continental crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea still does not receive sufficient echo among the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view to at last reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue. Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of our study (ranging from the Eocene to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model.
This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a scarcely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, analyzing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we can now tackle major issues of modern geology, such as global sea-level variations and climate change.
We began by studying another major, still unresolved issue of modern tectonics: the mechanisms driving plate motions. We observed that, throughout Earth's history, plate rotation poles (which describe plate motions across the Earth's surface) tend to be distributed along a band running from the Northern Pacific through northern South America, the Central Atlantic, North Africa and Central Asia to Japan. Basically, this distribution means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted this as reflecting a secular influence of the Moon on plate motions.
The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), mid-ocean ridges and intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
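A plate velocity model of the kind mentioned above ultimately turns a rotation (Euler) pole into surface velocities. The sketch below shows the standard relation |v| = ω R sin(Δ), where Δ is the angular distance between pole and site; the pole location and rotation rate are invented for the example, not values from the model.

```python
# Surface speed of a point on a rigid plate rotating about an Euler pole.
import math

EARTH_RADIUS_KM = 6371.0

def to_xyz(lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def surface_speed_cm_per_yr(pole_lat, pole_lon, rate_deg_per_myr,
                            site_lat, site_lon):
    """|v| = omega * R * sin(angular distance pole-site)."""
    p, s = to_xyz(pole_lat, pole_lon), to_xyz(site_lat, site_lon)
    cos_d = sum(a * b for a, b in zip(p, s))
    omega_rad_per_yr = math.radians(rate_deg_per_myr) / 1e6
    speed_km_per_yr = omega_rad_per_yr * EARTH_RADIUS_KM * math.sqrt(max(0.0, 1 - cos_d ** 2))
    return speed_km_per_yr * 1e5  # km -> cm

# Site on the equator, pole at the north pole, 0.5 deg/Myr -> maximum speed
print(surface_speed_cm_per_yr(90, 0, 0.5, 0, 30))  # ≈ 5.56 cm/yr
```

Speeds vanish at the pole itself and peak 90° away from it, which is why pole positions constrain plate kinematics so strongly.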

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Chronic kidney disease (CKD) accelerates age-related vascular stiffening. Arterial stiffness may be evaluated by measuring the carotid-femoral pulse wave velocity (PWV) or, more simply, as recommended by KDOQI, by monitoring pulse pressure (PP). Both correlate with survival and the incidence of cardiovascular disease. PWV can also be estimated on the brachial artery using a Mobil-O-Graph, a non-operator-dependent automatic device. The aim was to analyse whether, in a dialysis population, PWV obtained by Mobil-O-Graph (MogPWV) is more sensitive for vascular aging than PP. METHODS: A cohort of 143 patients from 4 dialysis units was followed, measuring MogPWV and PP every 3 to 6 months, and compared to a control group with the same risk factors but an eGFR > 30 ml/min. RESULTS: MogPWV, unlike PP, discriminated the dialysis population from the control group. The mean difference between the two populations, translated into age, was 8.4 years. The increase in MogPWV as a function of age was more rapid in the dialysis group. 13.3% of the dialysis patients but only 3.0% of the control group were outliers for MogPWV. The mortality rate (16 out of 143) was similar in outliers and inliers (7.4 and 8.0%/year). Stratifying patients according to MogPWV, a significant difference in survival was seen. A high parathormone (PTH) level and dialysis for hypertensive nephropathy were associated with a higher baseline MogPWV. CONCLUSIONS: Assessing PWV on the brachial artery using a Mobil-O-Graph is a valid and simple alternative which, in the dialysis population, is more sensitive for vascular aging than PP. As demonstrated in previous studies, PWV correlates with mortality. Among specific CKD risk factors, only PTH was associated with a higher baseline PWV. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02327962.
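As a hedged sketch of the "outlier" notion used in the results above: a subject can be flagged when the measured PWV exceeds the upper bound of an age-based reference model fitted on controls. The regression coefficients and margin below are invented for illustration and are not the study's values.

```python
# Toy outlier flag: compare a measured PWV against an assumed linear
# PWV-vs-age reference fitted on the control group.
def is_pwv_outlier(age_yr, pwv_m_s, intercept=4.5, slope=0.08, margin=1.5):
    expected = intercept + slope * age_yr   # toy control-group regression
    return pwv_m_s > expected + margin

print(is_pwv_outlier(60, 11.2))  # expected 9.3, bound 10.8 -> True
print(is_pwv_outlier(60, 10.0))  # within bound -> False
```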

Relevance:

10.00%

Publisher:

Abstract:

Gamma Knife surgery (GKS) is widely used as a noninvasive alternative to open microsurgical procedures for many intracranial conditions. It consists of delivering a single dose of high energy under stereotactic conditions, with the help of multimodal imaging (e.g., magnetic resonance imaging [MRI], computed tomography, and possibly angiography). The Gamma Knife (GK) was invented by the Swedish neurosurgeon Lars Leksell, who was the first to treat a trigeminal neuralgia sufferer in 1951 using an orthogonal X-ray tube. Since then, progress in both informatics and robotics has made it possible to improve the radiosurgical technique, which is currently performed either with a linear particle accelerator mounted on a robotized arm (Novalis®, Cyberknife®) or by collimation of 192 fixed Co-60 sources (GK). The main indication of GKS in the treatment of pain is trigeminal neuralgia. Other, less frequent indications are glossopharyngeal neuralgia, cluster headache, and hypophysiolysis for cancer pain.

Relevance:

10.00%

Publisher:

Abstract:

The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples to perform identification and antibiotic susceptibility testing. Consequently, we compared the performance between the InoqulA (BD Kiestra), the WASP (Copan), and manual inoculation methods. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than that with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10(6) bacteria/ml. Inoculation with the InoqulA system allowed us to obtain significantly more discrete colonies than the WASP system at concentrations of >10(7) bacteria/ml. However, the level of difference observed was bacterial species dependent. Discrete colonies of bacteria present in 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than that with WASP and manual inoculation. Consequently, the automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus resulted in a significant reduction in the time to results, laboratory workload, and laboratory costs.
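The "discrete colonies" counted by the image analysis above come down to connected components in a thresholded plate image. The pure-Python flood fill below is a toy stand-in for the commercial VisionLab software, whose actual algorithm is not described in the text; the pixel grid is invented.

```python
# Toy colony counter: threshold the plate image to 0/1 pixels, then count
# 4-connected components with an iterative flood fill.
def count_colonies(grid):
    """grid: 2-D list of 0/1 pixels (1 = growth). Counts 4-connected blobs."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    colonies = 0
    for r0 in range(rows):
        for c0 in range(cols):
            if grid[r0][c0] and not seen[r0][c0]:
                colonies += 1
                stack = [(r0, c0)]           # flood-fill one colony
                while stack:
                    r, c = stack.pop()
                    if 0 <= r < rows and 0 <= c < cols and grid[r][c] and not seen[r][c]:
                        seen[r][c] = True
                        stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return colonies

plate = [[1, 1, 0, 0],
         [0, 0, 0, 1],
         [0, 1, 0, 1]]
print(count_colonies(plate))  # 3
```

On this view, a better inoculation pattern simply yields more such non-touching components at a given bacterial load, which is what the study quantifies.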

Relevance:

10.00%

Publisher:

Abstract:

Modelling the shoulder's musculature is challenging given its mechanical and geometric complexity. The ideal fibre model used to represent a muscle's line of action cannot always faithfully represent the mechanical effect of each muscle, leading to considerable differences between model-estimated and in vivo measured muscle activity. While the musculo-tendon force coordination problem has been extensively analysed in terms of the cost function, only a few works have investigated the existence and sensitivity of solutions with respect to fibre topology. The goal of this paper is to present an analysis of the solution set using the concepts of torque-feasible space (TFS) and wrench-feasible space (WFS) from cable-driven robotics. A shoulder model is presented and a simple musculo-tendon force coordination problem is defined. The ideal fibre model for representing muscles is reviewed, and the TFS and WFS are defined, leading to necessary and sufficient conditions for the existence of a solution. The shoulder model's TFS is analysed to explain the lack of anterior deltoid (DLTa) activity. Based on this analysis, a modification of the model's muscle-fibre geometry is proposed. The performance with and without the modification is assessed by solving the musculo-tendon force coordination problem for quasi-static abduction in the scapular plane. After the proposed modification, the DLTa reaches 20% activation.
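The TFS concept above can be illustrated in one dimension: for a single joint crossed by muscles with moment arms r_i and tensile (pull-only) forces in [0, f_max_i], the set of achievable joint torques is an interval. The moment arms and force limits below are invented for the example and are not taken from the shoulder model.

```python
# 1-DOF illustration of a torque-feasible space (TFS): each muscle can only
# pull, so it contributes torque between 0 and r_i * f_max_i (sign given by
# its moment arm), and the TFS is the sum of those per-muscle intervals.
def torque_feasible_interval(moment_arms_m, max_forces_n):
    lo = sum(min(0.0, r * f) for r, f in zip(moment_arms_m, max_forces_n))
    hi = sum(max(0.0, r * f) for r, f in zip(moment_arms_m, max_forces_n))
    return lo, hi

def is_feasible(torque_nm, moment_arms_m, max_forces_n):
    lo, hi = torque_feasible_interval(moment_arms_m, max_forces_n)
    return lo <= torque_nm <= hi

# Two abductors (positive moment arms) and one adductor (negative)
arms = [0.02, 0.03, -0.025]      # m
fmax = [500.0, 400.0, 600.0]     # N
print(torque_feasible_interval(arms, fmax))  # roughly (-15.0, 22.0)
print(is_feasible(25.0, arms, fmax))         # False: torque outside the TFS
```

A fibre whose moment arm has the wrong sign or magnitude shrinks this interval in one direction, which is the 1-D analogue of the geometry change the paper proposes to recover DLTa activity.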