35 results for Robotic benchmarks


Relevance:

10.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital, and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:

• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the associated risks of nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.

These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue. Specific questions were put to different stakeholder groups to encourage discussion and open communication.

1. What information do stakeholders need from researchers, and why? The discussions about this question confirmed the needs identified in the targeted phone calls.

2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics would be needed for a well-informed treatment of this communication issue.

3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced; thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on 'nanotechnology' as a whole.

4. Do we need more, or other, regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.

NIN will continue an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance:

10.00%

Abstract:

Assays that measure a patient's immune response play an increasingly important role in the development of immunotherapies. The inherent complexity of these assays and independent protocol development between laboratories result in high data variability and poor reproducibility. Quality control through harmonization--based on integration of laboratory-specific protocols with standard operating procedures and assay performance benchmarks--is one way to overcome these limitations. Harmonization guidelines can be widely implemented to address assay performance variables. This process enables objective interpretation and comparison of data across clinical trial sites and also facilitates the identification of relevant immune biomarkers, guiding the development of new therapies.

Relevance:

10.00%

Abstract:

Introduction: Neuroimaging of the self has focused on high-level mechanisms such as language, memory or imagery of the self. Recent evidence suggests that low-level mechanisms of multisensory and sensorimotor integration may play a fundamental role in encoding self-location and the first-person perspective (Blanke and Metzinger, 2009). Neurological patients with out-of-body experiences (OBE) suffer from abnormal self-location and first-person perspective due to damage to the temporo-parietal junction (Blanke et al., 2004). Although self-location and the first-person perspective can be studied experimentally (Lenggenhager et al., 2009), the neural underpinnings of self-location have yet to be investigated. To investigate the brain network involved in self-location and the first-person perspective, we used visuo-tactile multisensory conflict, magnetic resonance (MR)-compatible robotics, and fMRI in study 1, and lesion analysis in a sample of 9 patients with OBE due to focal brain damage in study 2. Methods: Twenty-two participants saw a video showing either a person's back or an empty room being stroked (visual stimuli) while the MR-compatible robotic device stroked their back (tactile stimulation). Direction and speed of the seen stroking could either correspond (synchronous) or not (asynchronous) to those of the felt stroking. Each run comprised the four conditions of a 2x2 factorial design with Object (Body, No-Body) and Synchrony (Synchronous, Asynchronous) as main factors. Self-location was estimated using the mental ball dropping task (MBD; Lenggenhager et al., 2009). After the fMRI session participants completed a 6-item questionnaire adapted from the original created by Botvinick and Cohen (1998) and based on questions and data obtained by Lenggenhager et al. (2007, 2009). They were also asked to complete a questionnaire to disclose the perspective they adopted during the illusion. Response times (RTs) for the MBD and fMRI data were analyzed with a 3-way mixed model ANOVA with the between-subjects factor Perspective (up, down) and the two within-subjects factors Object (body, no-body) and Stroking (synchronous, asynchronous). Quantitative lesion analysis was performed using MRIcron (Rorden et al., 2007). We compared the distributions of brain lesions confirmed by multimodality imaging (Knowlton, 2004) in patients with OBE with those of patients showing complex visual hallucinations involving people or faces, but without any disturbance of self-location and first-person perspective. Nine patients with OBE were investigated. The control group comprised 8 patients. Structural imaging data were available for normalization and co-registration in all the patients. Normalization of each patient's lesion into the common MNI (Montreal Neurological Institute) reference space permitted simple, voxel-wise, algebraic comparisons to be made. Results: Although all participants were lying on their backs facing upwards in the scanner, analysis of perspective showed that half of the participants had the impression of looking down at the virtual human body below them, regardless of the available cues about their actual body position (Down-group). The other participants had the impression of looking up at the virtual body above them (Up-group). Analysis of Q3 ("How strong was the feeling that the body you saw was you?") indicated stronger self-identification with the virtual body during the synchronous stroking. RTs in the MBD task confirmed these subjective data (significant 3-way interaction between perspective, object and stroking).
fMRI results showed eight cortical regions where the BOLD signal was significantly different during at least one of the conditions resulting from the combination of Object and Stroking, relative to baseline: right and left temporo-parietal junction, right extrastriate body area (EBA), left middle occipito-temporal gyrus, left postcentral gyrus, right medial parietal lobe, and bilateral medial occipital lobe (Fig 1). The activation patterns in the right and left temporo-parietal junction and right EBA reflected changes in self-location and perspective, as revealed by statistical analysis performed on the percentage of BOLD change with respect to baseline. Statistical lesion overlap comparison (using nonparametric voxel-based lesion symptom mapping) with respect to the control group revealed the right temporo-parietal junction, centered at the angular gyrus (Talairach coordinates x = 54, y = -52, z = 26; p < 0.05, FDR corrected). Conclusions: The present questionnaire and behavioural results show that, despite the noisy and constraining MR environment, our participants had predictable changes in self-location, self-identification, and first-person perspective when the robotic tactile stroking was applied synchronously with the seen stroking. fMRI data in healthy participants and lesion data in patients with abnormal self-location and first-person perspective jointly revealed that the temporo-parietal cortex, especially in the right hemisphere, encodes these conscious experiences. We argue that temporo-parietal activity reflects the experience of the conscious "I" as embodied and localized within bodily space.
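For readers who want the analysis made concrete: the design above is a 3-way mixed design (between-subjects factor Perspective; within-subjects factors Object and Stroking). Below is a minimal sketch in Python of one common way to fit such a design, using a linear mixed model with a per-subject random intercept as a stand-in for the mixed-model ANOVA; the file and column names are hypothetical, not those of the study.

```python
# Sketch only: a linear mixed model approximating the 3-way mixed ANOVA
# described above. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# One row per trial: subject id, perspective (up/down, between-subjects),
# obj (body/no-body) and stroking (sync/async, both within-subjects),
# and the mental-ball-dropping response time.
df = pd.read_csv("mbd_rts.csv")

# Fixed effects for all main effects and interactions; a random intercept
# per subject captures the repeated-measures structure.
model = smf.mixedlm("rt ~ perspective * obj * stroking",
                    data=df, groups=df["subject"])
result = model.fit()
print(result.summary())  # the three-way interaction term is the test of interest
```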

Relevance:

10.00%

Abstract:

Neuroimaging of the self has focused on high-level mechanisms such as language, memory or imagery of the self and implicated widely distributed brain networks. Yet recent evidence suggests that low-level mechanisms such as multisensory and sensorimotor integration may play a fundamental role in self-related processing. In the present study we used visuotactile multisensory conflict, robotics, virtual reality, and fMRI to study such low-level mechanisms by experimentally inducing changes in self-location. Participants saw a video of a person's back (body) or an empty room (no-body) being stroked while an MR-compatible robotic device stroked their back. The latter tactile input was synchronous or asynchronous with respect to the seen stroking. Self-location was estimated behaviorally, confirming previous data that self-location differed only between the two body conditions. fMRI results showed bilateral activation of the temporo-parietal cortex, with a significantly higher BOLD signal increase in the synchronous/body condition relative to the other conditions. The sensorimotor cortex and the extrastriate body area were also activated. We argue that temporo-parietal activity reflects the experience of the conscious 'I' as embodied and localized within bodily space, compatible with clinical data in neurological patients with out-of-body experiences.

Relevance:

10.00%

Abstract:

The infinite slope method is widely used as the geotechnical component of geomorphic and landscape evolution models. Its assumption that shallow landslides are infinitely long (in a downslope direction) is usually considered valid for natural landslides on the basis that they are generally long relative to their depth. However, this is rarely justified, because the critical length/depth (L/H) ratio below which edge effects become important is unknown. We establish this critical L/H ratio by benchmarking infinite slope stability predictions against finite element predictions for a set of synthetic two-dimensional slopes, assuming that the difference between the predictions is due to error in the infinite slope method. We test the infinite slope method for six different L/H ratios to find the critical ratio at which its predictions fall within 5% of those from the finite element method. We repeat these tests for 5000 synthetic slopes with a range of failure plane depths, pore water pressures, friction angles, soil cohesions, soil unit weights and slope angles characteristic of natural slopes. We find that: (1) infinite slope stability predictions are consistently too conservative for small L/H ratios; (2) the predictions always converge to within 5% of the finite element benchmarks by an L/H ratio of 25 (i.e. the infinite slope assumption is reasonable for landslides 25 times longer than they are deep); but (3) they can converge at much lower ratios depending on slope properties, particularly for low-cohesion soils. The implication for catchment-scale stability models is that the infinite length assumption is reasonable if their grid resolution is coarse (e.g. >25 m). However, it may also be valid even at much finer grid resolutions (e.g. 1 m), because spatial organization in the predicted pore water pressure field reduces the probability of short landslides and minimizes the risk that predicted landslides will have L/H ratios less than 25.
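For reference, the infinite slope method benchmarked here reduces stability to a closed-form factor of safety. Below is a minimal sketch of the standard textbook formulation in Python, with illustrative parameter values rather than the study's actual test cases.

```python
import math

def infinite_slope_fs(c, gamma, H, beta_deg, phi_deg, u=0.0):
    """Factor of safety for an infinite slope (standard textbook form).

    c        -- effective cohesion (kPa)
    gamma    -- soil unit weight (kN/m^3)
    H        -- failure plane depth (m)
    beta_deg -- slope angle (degrees)
    phi_deg  -- effective friction angle (degrees)
    u        -- pore water pressure on the failure plane (kPa)
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    normal = gamma * H * math.cos(beta) ** 2 - u          # effective normal stress
    shear = gamma * H * math.sin(beta) * math.cos(beta)   # driving shear stress
    return (c + normal * math.tan(phi)) / shear

# Illustrative values only:
print(infinite_slope_fs(c=5.0, gamma=19.0, H=2.0, beta_deg=30.0, phi_deg=32.0, u=9.0))
```

Failure is predicted when the factor of safety drops below 1; the study's point is that this one-dimensional estimate is conservative for landslides shorter than about 25 times their depth.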

Relevance:

10.00%

Abstract:

This contribution introduces Data Envelopment Analysis (DEA), a performance measurement technique. DEA helps decision makers in the following ways: (1) by calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement; (2) by setting target values for inputs and outputs, it calculates how much input must be decreased or output increased in order to become efficient; (3) by identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimise average total cost; (4) by identifying a set of benchmarks, it specifies which other firms' processes a firm needs to analyse in order to improve its own practices. This contribution presents the essentials of DEA, alongside a case study that builds an intuitive understanding of its application. It also introduces Win4DEAP, a software package that conducts efficiency analysis based on the DEA methodology. The methodological background of DEA is presented for more demanding readers. Finally, four advanced topics of DEA are treated: adjustment to the environment, preferences, sensitivity analysis and time series data.
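To make point (1) concrete, the efficiency score can be obtained by solving a small linear program per firm. Below is a minimal sketch of the input-oriented, constant-returns-to-scale (CCR) envelopment model in Python with scipy; the data are invented for illustration, and Win4DEAP is not involved.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of firm (DMU) o.

    X: (m inputs  x n firms) input matrix
    Y: (s outputs x n firms) output matrix
    Returns theta in (0, 1]; theta == 1 means efficient.
    """
    m, n = X.shape
    s, _ = Y.shape
    # Decision variables: z = [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro  (i.e. at least firm o's output)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]  # res.x[1:] holds the benchmark weights (lambdas)

# Made-up data: 2 inputs, 1 output, 4 firms.
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
for o in range(X.shape[1]):
    print(f"firm {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```

A score of 1 marks an efficient firm; a score of, say, 0.56 means the firm should be able to produce its current outputs with 56% of its inputs, and the firms with positive lambda weights are its benchmark peers (point 4 above).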

Relevance:

10.00%

Abstract:

BACKGROUND: Living in a multisensory world entails the continuous sensory processing of environmental information in order to enact appropriate motor routines. The interaction between our body and our brain is the crucial factor for achieving such sensorimotor integration ability. Several clinical conditions dramatically affect this constant body-brain exchange, but the latest developments in biomedical engineering provide promising solutions for overcoming this communication breakdown. NEW METHOD: Recent technological developments have succeeded in transforming neuronal electrical activity into computational input for robotic devices, giving birth to the era of so-called brain-machine interfaces. By combining rehabilitation robotics and experimental neuroscience, the introduction of brain-machine interfaces into clinical protocols has provided the technological means to bypass the neural disconnection and restore sensorimotor function. RESULTS: Based on these advances, the recovery of sensorimotor functionality is progressively becoming a concrete reality. However, despite the success of several recent techniques, some open issues still need to be addressed. COMPARISON WITH EXISTING METHOD(S): Typical interventions for sensorimotor deficits include pharmaceutical treatments and manual/robotic assistance in passive movements. These procedures achieve symptom relief, but their applicability to more severe disconnection pathologies (e.g. spinal cord injury or amputation) is limited. CONCLUSIONS: Here we review how state-of-the-art solutions in biomedical engineering are continuously raising expectations in sensorimotor rehabilitation, as well as the challenges that remain, especially with regard to translating signals from brain-machine interfaces into sensory feedback and incorporating brain-machine interfaces into daily activities.

Relevance:

10.00%

Abstract:

Object: Recent years have been marked by efforts to improve the quality and safety of pedicle screw placement in spinal instrumentation. The aim of the present study is to compare the accuracy of the SpineAssist robot system with conventional fluoroscopy-guided pedicle screw placement. Methods: Ninety-five patients suffering from degenerative disease and requiring elective lumbar instrumentation were included in the study. The robot cohort (Group I; 55 patients, 244 screws) consisted of an initial open robot-assisted subgroup (Subgroup IA; 17 patients, 83 screws) and a percutaneous cohort (Subgroup IB; 38 patients, 161 screws). In these groups, pedicle screws were placed under robotic guidance and lateral fluoroscopic control. In the fluoroscopy-guided cohort (Group II; 40 patients, 163 screws), screws were inserted using anatomical landmarks and lateral fluoroscopic guidance. The primary outcome measure was accuracy of screw placement on the Gertzbein-Robbins scale (Grades A to E and R [revised]). Secondary parameters were duration of surgery, blood loss, cumulative morphine, and length of stay. Results: In the robot group (Group I), a perfect trajectory (A) was observed in 204 screws (83.6%). The remaining screws were graded B (n = 19 [7.8%]), C (n = 9 [3.7%]), D (n = 4 [1.6%]), E (n = 2 [0.8%]), and R (n = 6 [2.5%]). In the fluoroscopy-guided group (Group II), a completely intrapedicular course graded A was found in 79.8% (n = 130). The remaining screws were graded B (n = 12 [7.4%]), C (n = 10 [6.1%]), D (n = 6 [3.7%]), and E (n = 5 [3.1%]). The proportion of "clinically acceptable" screws (that is, grades A and B) did not differ between groups (I vs II, p = 0.19) or subgroups (Subgroup IA vs IB, p = 0.81; Subgroup IA vs Group II, p = 0.53; Subgroup IB vs Group II, p = 0.20). Blood loss was lower in the robot-assisted group than in the fluoroscopy-guided group, while duration of surgery, length of stay, and cumulative morphine dose were not statistically different. Conclusions: Robot-guided pedicle screw placement is a safe and useful tool for assisting spine surgeons in degenerative spine cases. Nonetheless, technical difficulties remain and fluoroscopy backup is advocated.
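As a side note, the "clinically acceptable" comparison above is a 2x2 proportion test. A quick sketch of how such a comparison can be checked from the reported counts (the study's exact statistical test is not stated in this abstract, so Fisher's exact test is an assumption for illustration):

```python
from scipy.stats import fisher_exact

# Grades A+B ("clinically acceptable") vs the rest, from the counts above.
robot  = [204 + 19, 244 - (204 + 19)]   # Group I:  223 acceptable, 21 other
fluoro = [130 + 12, 163 - (130 + 12)]   # Group II: 142 acceptable, 21 other
odds_ratio, p = fisher_exact([robot, fluoro])
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")
```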

Relevance:

10.00%

Abstract:

To meet the challenges posed by emerging health problems while taking into account the development of knowledge, several innovations in care are being implemented. Among these, advanced nursing roles and increased interprofessional collaboration are considered important features in Switzerland. Although the international literature provides benchmarks for advanced roles, it was considered essential to contextualize them in order to promote their practical value in Switzerland. Thus, from 79 statements drawn from the literature, the 172 participants in a two-phase sequential study retained only 29 statements, those they considered relevant, important and applicable in daily practice. However, it is important to point out that statements not selected at this stage to describe advanced practice cannot be considered permanently irrelevant. Indeed, given the emergence of advanced practice in western Switzerland, it is possible that a statement judged less relevant at this point in the development of advanced practice will be considered relevant later on. The master's program in nursing based at the University of Lausanne and the University of Applied Sciences Western Switzerland was also examined in the light of these statements. It was concluded that all the objectives of the program are aligned with the competency statements that were retained.

Relevance:

10.00%

Abstract:

Division of labor is a complex phenomenon observed throughout nature. Theoretical studies have focused either on its emergence through self-organization mechanisms or on its adaptive consequences. We suggest that the interplay between self-organization, which undoubtedly characterizes division of labor in social insects, and evolution should be explored further. We review the factors empirically shown to influence task choice. In light of these factors, we review the most important self-organization and evolutionary models of division of labor and outline their advantages and limitations. We describe ways to unify evolution and self-organization in the theoretical study of division of labor and summarize recent results in this area. Finally, we discuss some benchmarks and primary challenges of this approach.

Relevance:

10.00%

Abstract:

Volumes of data used in science and industry are growing rapidly. When researchers face the challenge of analyzing them, the data format is often the first obstacle: the lack of standardized ways of exploring different data layouts requires solving the problem from scratch each time. The possibility of accessing data in a rich, uniform manner, e.g. using the Structured Query Language (SQL), would offer expressiveness and user-friendliness. Comma-separated values (CSV) files are one of the most common data storage formats. Despite the format's simplicity, handling it becomes non-trivial as files grow. Importing CSVs into existing databases is time-consuming and troublesome, or even impossible if the horizontal dimension reaches thousands of columns. Most databases are optimized for handling large numbers of rows rather than columns, so performance on datasets with non-typical layouts is often unacceptable. Other challenges include schema creation, updates and repeated data imports. To address the above-mentioned problems, I present a system for accessing very large CSV-based datasets by means of SQL. It is characterized by: a "no copy" approach (data stay mostly in the CSV files); "zero configuration" (no need to specify a database schema); a C++ implementation using boost [1], SQLite [2] and Qt [3] that requires no installation and has a very small size; query rewriting, dynamic creation of indices for the appropriate columns and static data retrieval directly from the CSV files, which together ensure efficient plan execution; effortless support for millions of columns; per-value typing, which makes mixed text/number data easy to use; and a very simple network protocol that provides an efficient interface for MATLAB and reduces implementation time for other languages. The software is available as freeware, along with educational videos, on its website [4]. It needs no prerequisites to run, as all of the libraries are included in the distribution package. I test it against existing database solutions using a battery of benchmarks and discuss the results.
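The system itself is a self-contained C++ binary, but the general idea (SQL over a CSV file, with indices created on demand for the columns a query touches) can be sketched with Python's standard-library sqlite3 module. This is an illustration of the concept only, not the actual implementation; in particular, the real system avoids the import step and reads values directly from the CSV files.

```python
import csv
import sqlite3

def query_csv(path, sql, table="csv", index_columns=()):
    """Run SQL over a CSV file via an in-memory SQLite table.

    A toy stand-in for the system described above: the real tool skips
    the import step entirely and retrieves values straight from the CSV.
    SQLite's dynamic typing loosely mirrors the per-value typing idea.
    """
    con = sqlite3.connect(":memory:")
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                       # "zero configuration": schema from the header row
        cols = ", ".join(f'"{c}"' for c in header)
        con.execute(f'CREATE TABLE "{table}" ({cols})')
        placeholders = ", ".join("?" * len(header))
        con.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', reader)
    # Dynamic index creation for the columns the query actually filters on.
    for c in index_columns:
        con.execute(f'CREATE INDEX "idx_{c}" ON "{table}" ("{c}")')
    return con.execute(sql).fetchall()

# Hypothetical usage:
# rows = query_csv("measurements.csv",
#                  'SELECT sample, value FROM csv WHERE value > 10',
#                  index_columns=("value",))
```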

Relevance:

10.00%

Abstract:

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
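The 2 Å success criterion mentioned above is a plain root-mean-square deviation over matched atom coordinates. A minimal sketch, assuming two equally ordered coordinate arrays and no symmetry correction or re-alignment (refinements that real docking evaluations may add):

```python
import numpy as np

def ligand_rmsd(pose, reference):
    """Heavy-atom RMSD between a docked pose and the crystal ligand.

    pose, reference: (N, 3) arrays of matching atom coordinates (angstroms).
    No atom-symmetry correction is applied in this sketch.
    """
    pose = np.asarray(pose, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return np.sqrt(np.mean(np.sum((pose - reference) ** 2, axis=1)))

# A docking is counted as a success when the best-ranked pose lies
# within 2 angstroms of the crystal structure:
# success = ligand_rmsd(best_pose, crystal_pose) < 2.0
```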

Relevance:

10.00%

Abstract:

The goal of this study was to compare the quantity and purity of DNA extracted from biological traces using the QIAsymphony robot with those of the manual QIAamp DNA mini kit currently in use in our laboratory. We found that the DNA yield of the robot was 1.6-3.5 times lower than that of the manual protocol. This resulted in a loss of 8% and 29% of the alleles correctly scored when analyzing 1/400 and 1/800 diluted saliva samples, respectively. Specific tests showed that the QIAsymphony was at least 2-16 times more efficient at removing PCR inhibitors. The higher purity of the DNA may therefore partly compensate for the lower DNA yield obtained. No case of cross-contamination was observed among samples. After purification with the robot, DNA extracts can be automatically transferred to 96-well plates, which is an ideal format for subsequent RT-qPCR quantification and DNA amplification. Less hands-on time and a reduced risk of operational errors represent additional advantages of the robotic platform.

Relevance:

10.00%

Abstract:

Laparoscopic surgery has become a standard approach for many interventions, including oncologic surgery. Laparoscopic instruments have been developed to allow advanced surgical procedures. Imaging and computer assistance, in virtual reality or robotic procedures, will certainly improve access to this surgery.

Relevance:

10.00%

Abstract:

The main objective of this thesis is to highlight the persistence of family capitalism in Switzerland during the 20th century and its resistance to the managerial and financial capitalisms that are supposed to have succeeded it. For this purpose, we focus on twenty-two big companies of the machine, electrotechnical and metallurgy sector - the main branch of Swiss industry for the period considered - whose boards of directors and executive managers have been identified for five benchmark years across the century (1910, 1937, 1957, 1980 and 2000). This thesis takes a pluridisciplinary approach combining business history and the sociology of elites, and uses methods such as network analysis and prosopography. It is articulated around three main parts: the first aims to identify the evolution of corporate governance in our twenty-two enterprises, the second concentrates on interfirm coordination, and the objective of the last is to draw a collective portrait of the corporate elite leading our firms. Our results show that during the main part of the century, most of the companies were controlled by families and relied on non-market mechanisms of coordination, notably a dense network of interlocking directorates; moreover, the profile of the corporate elite remained very stable. Although several changes that took place by the end of the century confirmed a transition towards financial (or shareholder) capitalism and more competitive interactions among firms and the corporate elite, the persistence of family control in several companies and the maintenance of some former mechanisms of coordination lead us to qualify this conclusion.