988 results for Robot applications
Abstract:
This research project, entitled "Teleoperation of a collaborative robot via a haptic device", addresses one of the contemporary problems of robotics, namely cooperation between humans and machines. Robotics has been expanding rapidly for two decades now: robots are increasingly present in industry, services, and personal assistance, and are diversifying considerably. These new trends are taking robots out of the cages in which they used to be confined and are opening the door wide to new applications. Among them, cooperation and interaction with humans represent a real opportunity to relieve people of complex, tedious, and repetitive tasks. In parallel, modern robotics is moving toward massive development of the humanoid field. Indeed, several social experiments have shown that human beings, who constantly interact with the systems around them, find it easier to contribute to a task alongside a human-looking robot than alongside a machine. The work presented in this research project fits into a context of human-robot interaction (HRI) based on humanoid robotics. The resulting system must allow a user to interact with the machine effectively and intuitively while meeting certain criteria, notably safety. By pooling the respective skills of the human and the humanoid robot, the interactions are improved. Indeed, the robot can perform a large number of actions precisely and without tiring, but it is not necessarily endowed with decision-making suited to the situation, unlike the human, who is able to adjust his behavior naturally or according to his experience.
In other words, this system seeks to combine human know-how and reasoning ability with the robustness, efficiency, and precision of the robot. In robotics, the term "interaction" also encompasses the notion of control. The vast majority of robots receive machine commands, generally trajectory setpoints, which they are able to interpret. However, several control interfaces are possible, notably those using haptic devices, which give the user tactile feedback and perception. These tools, like all those that increase the user's degree of control by adding a sensory dimension, are perfectly suited to this kind of application. In this project, two haptic devices are assembled and then integrated into a haptic control interface in order to command the arm of a humanoid robot. The human is thus able to steer the robot while adjusting his commands according to information coming from the robot's various sensors, relayed to him visually or through force feedback.
Abstract:
Over the last decade, soft robotics has gained considerable popularity. It is inherently safe for humans and the surrounding environment. Thanks to its low stiffness, soft robotics is ideal for manipulating fragile objects, and it is able to adapt to its environment. These unique characteristics make the technology a stepping stone toward the design of innovative medical devices, in particular tools for positioning needles for percutaneous interventions, notably in the liver. However, the compliance of this technology also brings some drawbacks. It provides safe behavior, but it also entails a lack of stiffness that limits the applications of soft robotics. Without a minimum stiffness, it is impossible to perform repeatable and precise operations. Soft robotics in fact faces a major trade-off between load capacity and range of motion. To use this technology in the medical field, it is essential to add a system that modulates the stiffness of the device in order to overcome this trade-off. Coupled with a granular jamming brake, soft robotics appears to have all the characteristics needed to perform liver interventions. This study aims to demonstrate that, coupled with a stiffness-modulating system, soft robotics can be used to perform operations precisely and repeatably while remaining safe. The needle positioner developed is fully compatible with Magnetic Resonance Imaging (MRI). The insertion range of the system reaches the entire liver (1500 cm³) while maintaining sufficient stiffness (3 N/mm) and matching the precision of the imaging tool used (1 mm).
The hybrid approach of developing a softly actuated system coupled with a module regulating its stiffness provides the advantages of both compliant (soft) and conventional (rigid) robotics.
Abstract:
The most widespread work-related diseases are musculoskeletal disorders (MSDs), caused by awkward postures and excessive strain on the upper-limb muscles during work operations. Wearable IMU sensors could monitor workers constantly to prevent hazardous actions, thus diminishing work injuries. In this thesis, procedures are developed and tested for ergonomic analyses in a working environment, based on a commercial motion capture (MoCap) system made of 17 Inertial Measurement Units (IMUs). An IMU usually comprises a tri-axial gyroscope, a tri-axial accelerometer, and a tri-axial magnetometer and, through sensor fusion algorithms, estimates its attitude. Effective strategies for preventing MSDs rely on several aspects: firstly, the accuracy of the IMU, which depends on the chosen sensor and its calibration; secondly, the correct identification of the pose of each sensor on the worker's body; thirdly, the chosen multibody model, which must balance accuracy against computational burden to provide results in real time; finally, the model scaling law, which determines how quickly and accurately the multibody model geometry can be personalized. Moreover, MSDs can be reduced by using collaborative robots (cobots) as assistive devices for complex or heavy operations, relieving the worker's effort during repetitive tasks. All these aspects are considered to test and show the efficiency and usability of inertial MoCap systems for real-time ergonomic evaluation and for implementing safety control strategies in collaborative robotics. Validation is performed with several experimental tests, both to verify the proposed procedures and to compare the results of the real-time multibody models developed in this thesis with those from commercial software.
As an additional result, the positive effects of using cobots as assistive devices for reducing human effort in repetitive industrial tasks are also shown, demonstrating the potential of wearable electronics for on-field ergonomic analyses in industrial applications.
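The attitude estimation performed inside each IMU is handled by the vendor's sensor fusion algorithms, which the thesis does not detail. As an illustration only, a minimal complementary filter for pitch and roll, a common and much simpler alternative to proprietary fusion schemes, might look like this (function name and axis conventions are assumptions, not the thesis implementation):

```python
import math

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """One fusion step: blend integrated gyro rates (accurate short-term)
    with the accelerometer's gravity-based tilt estimate (stable long-term).
    Angles in radians; gyro = (gx, gy, gz) in rad/s; accel = (ax, ay, az) in m/s^2.
    """
    gx, gy, _ = gyro
    ax, ay, az = accel
    # Tilt implied by the gravity vector (valid when linear acceleration is small)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)
    # High-pass the gyro path, low-pass the accelerometer path
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * pitch_acc
    roll = alpha * (roll + gx * dt) + (1 - alpha) * roll_acc
    return pitch, roll
```

Commercial units more commonly use an extended Kalman filter and add the magnetometer for heading; the complementary filter is shown here only because it makes the blending of the two sensor paths explicit.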
Abstract:
Agricultural techniques have been improved over the centuries to meet the demand of a growing global population. Farming applications face new challenges to satisfy global needs, and recent technological advancements in robotic platforms can be exploited to address them. Since orchard management is one of the most challenging applications, because of the tree structure and the required interaction with the environment, it was targeted by the University of Bologna research group to provide a customized solution embodying a new concept for agricultural vehicles. This research has blossomed into a new lightweight tracked vehicle capable of autonomous navigation both in the open-field scenario and while travelling inside orchards, in what has been called in-row navigation. The mechanical design concept, together with the customized software implementation, is detailed to highlight the strengths of the platform, along with some further improvements envisioned to enhance overall performance. Static stability testing has proved that the vehicle can withstand steep-slope scenarios. Improvements have also been investigated to refine the estimation of the slippage that occurs during turning maneuvers and that is typical of skid-steering tracked vehicles. The software architecture is implemented using the Robot Operating System (ROS) framework, so as to exploit community-available packages for common basic functions, such as sensor interfaces, while allowing a dedicated custom implementation of the navigation algorithm developed. Real-world testing inside the university's experimental orchards has proven the robustness and stability of the solution over more than 800 hours of fieldwork. The vehicle has also enabled a wide range of autonomous tasks such as spraying, mowing, and in-field data collection.
The latter can be exploited to automatically estimate relevant orchard properties, such as fruit count and size and canopy characteristics, and to enable autonomous fruit harvesting with post-harvest estimations.
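The slippage mentioned above can be illustrated with the standard skid-steering kinematic model. The sketch below (names and the first-order slip correction are assumptions, not the platform's actual code) shows how per-track slip ratios enter a dead-reckoning pose update:

```python
import math

def skid_steer_odometry(x, y, theta, v_left, v_right, track_width, dt,
                        slip_left=0.0, slip_right=0.0):
    """Dead-reckoning update for a skid-steer tracked vehicle.

    Effective track speeds are reduced by per-track slip ratios
    (0 = no slip, 1 = full slip), a common first-order correction
    for the track-soil interaction during turning maneuvers.
    """
    vl = v_left * (1.0 - slip_left)
    vr = v_right * (1.0 - slip_right)
    v = (vl + vr) / 2.0               # forward speed of the body frame
    omega = (vr - vl) / track_width   # yaw rate from the speed differential
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Because the slip ratios vary with terrain and curvature, in practice they have to be estimated online (e.g., by comparing this prediction against GNSS or visual odometry), which is precisely the refinement the abstract alludes to.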
Abstract:
In recent decades, robotics has become firmly embedded in areas such as education, teaching, medicine, psychology, and many others. We focus here on social robotics; social robots are designed to interact with people in a natural and interpersonal way, often to achieve positive results in different applications. To interact and cooperate with humans in their daily-life activities, robots should exhibit human-like intelligence. The rapid expansion of social robotics and the availability of various kinds of robots on the market have allowed research groups to carry out multiple experiments. These experiments have led to the collection of various kinds of data, which can be used or processed for psychological studies and studies in other fields. However, no tools are available in which such data can be stored, processed, and shared with other research groups. This thesis proposes the design and implementation of a visual tool for organizing dataflows in Human-Robot Interaction (HRI).
Abstract:
Protocols for the generation of dendritic cells (DCs) using serum as a supplement of culture media lead to reactions due to animal proteins and to risks of disease transmission. Several types of serum-free media (SFM), based on good manufacturing practice (GMP), have recently been used and seem to be a viable option. The aim of this study was to evaluate the differentiation, maturation, and function of DCs from Acute Myeloid Leukemia (AML) patients, generated in SFM and in medium supplemented with autologous serum (AS). DCs were analyzed for phenotype characteristics, viability, and functionality. The results showed the possibility of generating viable DCs under all the conditions tested. In patients, the X-VIVO 15 medium was more efficient than the other media tested in generating DCs producing IL-12p70 (p = 0.05). Moreover, the presence of AS led to a significant increase of IL-10 production by DCs as compared with the CellGro (p = 0.05) and X-VIVO 15 (p = 0.05) media, both in patients and donors. We conclude that SFM was efficient in the production of DCs for immunotherapy in AML patients. However, the use of AS appears to interfere with the functional capacity of the generated DCs.
Abstract:
The goal of this cross-sectional observational study was to quantify pattern-shift visual evoked potentials (VEP) and the thickness and volume of retinal layers using optical coherence tomography (OCT) in a cohort of Parkinson's disease (PD) patients and age-matched controls. Forty-three PD patients and 38 controls were enrolled. All participants underwent a detailed neurological and ophthalmologic evaluation. Idiopathic PD cases were included; cases with glaucoma or increased intra-ocular pressure were excluded. Patients were assessed by VEP and high-resolution Fourier-domain OCT, which quantified the inner and outer thicknesses of the retinal layers. VEP latencies and the thicknesses of the retinal layers were the main outcome measures. The mean age, with standard deviation (SD), of the PD patients and controls was 63.1 (7.5) and 62.4 (7.2) years, respectively. The patients were predominantly in the initial Hoehn-Yahr (HY) disease stages (34.8% in stage 1 or 1.5, and 55.8% in stage 2). The VEP latencies and the thicknesses and volumes of the inner and outer retinal layers were similar between the groups. A negative correlation between retinal thickness and age was noted in both groups. The thickness of the retinal nerve fibre layer (RNFL) was 102.7 μm in PD patients vs. 104.2 μm in controls. The thicknesses of the retinal layers, the VEP, and the RNFL of PD patients were similar to those of the controls. Despite the use of a representative cohort of PD patients and high-resolution OCT in this study, further studies are required to establish the validity of OCT and VEP measurements as anatomic and functional biomarkers for the evaluation of the retina and visual pathways in PD patients.
Abstract:
Paper has become increasingly recognized as a very interesting substrate for the construction of microfluidic devices, with potential application in a variety of areas, including health diagnosis, environmental monitoring, immunoassays, and food safety. The aim of this review is to present a short history of analytical systems constructed from paper, summarize the main advantages and disadvantages of the fabrication techniques, explore alternative methods of detection such as colorimetric, electrochemical, photoelectrochemical, chemiluminescence, and electrochemiluminescence detection, and take a closer look at the novel achievements in the field of bioanalysis published during the last two years. Finally, future trends for the production of such devices are discussed.
Abstract:
Technical evaluation of analytical data is extremely relevant, considering it can be used for comparison with environmental quality standards and for decision-making related to the management of the disposal of dredged sediments and the evaluation of salt and brackish water quality in accordance with CONAMA Resolution 357/05. It is, therefore, essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both for the follow-up of the ongoing analysis and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) a chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers evaluate analytical reports, this work discusses the limitations of applying SW-846 US EPA methods to marine samples and the consequences of reporting data based on method detection limits (MDLs) rather than sample quantitation limits (SQLs), and presents possible modifications of the principal methods applied by laboratories in order to comply with environmental quality standards.
Abstract:
Colloidal particles have been used to template the electrosynthesis of several materials, such as semiconductors, metals, and alloys. The method allows good control over the thickness of the resulting material by choosing the appropriate charge applied to the system, and it is able to produce high-density deposited materials without shrinkage. These materials are a true replica of the template structure and, due to the high surface areas obtained, are very promising for electrochemical applications. In the present work, the assembly of monodisperse polystyrene templates was conducted over gold, platinum, and glassy carbon substrates in order to demonstrate the electrodeposition of an oxide, a conducting polymer, and a hybrid inorganic-organic material with applications in the supercapacitor and sensor fields. The performances of the resulting nanostructured films have been compared with those of the analogous bulk materials, and the results achieved are depicted in this paper.
Abstract:
We describe the concept, the fabrication, and the most relevant properties of a piezoelectric-polymer system: two fluoroethylenepropylene (FEP) films with good electret properties are laminated around a specifically designed and prepared polytetrafluoroethylene (PTFE) template at 300 °C. After removing the PTFE template, a two-layer FEP film with open tubular channels is obtained. For electric charging, the two-layer FEP system is subjected to a high electric field. The resulting dielectric barrier discharges inside the tubular channels yield a ferroelectret with high piezoelectricity. d33 coefficients of up to 160 pC/N have already been achieved on the ferroelectret films. After charging at suitably elevated temperatures, the piezoelectricity is stable at temperatures of at least 130 °C. Advantages of the transducer films include ease of fabrication at laboratory or industrial scales, a wide range of possible geometrical and processing parameters, straightforward control of the uniformity of the polymer system, the flexibility and versatility of the soft ferroelectrets, and a large potential for device applications, e.g., in the areas of biomedicine, communications, production engineering, sensor systems, and environmental monitoring.
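For a sense of scale, the quoted d33 coefficient relates generated charge to applied force through Q = d33 · F. A trivial illustrative helper (hypothetical, not from the paper):

```python
def piezo_charge(d33_pC_per_N, force_N):
    """Charge generated by a piezoelectric film under a normal force:
    Q = d33 * F, returned in picocoulombs."""
    return d33_pC_per_N * force_N

# A ferroelectret film with d33 = 160 pC/N under a 5 N load:
piezo_charge(160, 5)  # 800 pC
```

This is roughly an order of magnitude above the d33 of common piezoelectric polymers such as PVDF, which is what makes ferroelectrets attractive for soft sensors.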
Abstract:
The effects of chromium or nickel oxide additions on the composition of Portland clinker were investigated by X-ray powder diffraction associated with pattern analysis by the Rietveld method. The co-processing of industrial waste in Portland cement plants is an alternative solution to the problem of final disposal of hazardous waste. Industrial waste containing chromium or nickel is hazardous and is difficult to dispose of. It was observed that in concentrations up to 1% in mass, the chromium or nickel oxide additions do not cause significant alterations in Portland clinker composition. © 2008 International Centre for Diffraction Data.
Abstract:
Background Data and Objective: There is anecdotal evidence that low-level laser therapy (LLLT) may affect the development of muscular fatigue, minor muscle damage, and recovery after heavy exercises. Although manufacturers claim that cluster probes (LEDT) may be more effective than single-diode lasers in clinical settings, there is a lack of head-to-head comparisons in controlled trials. This study was designed to compare the effect of single-diode LLLT and cluster LEDT before heavy exercise. Materials and Methods: This was a randomized, placebo-controlled, double-blind crossover study. Young male volleyball players (n = 8) were enrolled and asked to perform three Wingate cycle tests after 4 × 30 s LLLT or LEDT pretreatment of the rectus femoris muscle with either (1) an active LEDT cluster probe (660/850 nm, 10/30 mW), (2) a placebo cluster probe with no output, or (3) a single-diode 810-nm, 200-mW laser. Results: The active LEDT group had significantly decreased post-exercise creatine kinase (CK) levels (-18.88 ± 41.48 U/L), compared to the placebo cluster group (26.88 ± 15.18 U/L) (p < 0.05) and the active single-diode laser group (43.38 ± 32.90 U/L) (p < 0.01). None of the pre-exercise LLLT or LEDT protocols enhanced performance on the Wingate tests or reduced post-exercise blood lactate levels. However, a non-significant tendency toward lower post-exercise blood lactate levels in the treated groups should be explored further. Conclusion: In this experimental set-up, only the active LEDT probe decreased post-exercise CK levels after the Wingate cycle test. Neither performance nor blood lactate levels were significantly affected by this protocol of pre-exercise LEDT or LLLT.
Abstract:
Background: Feature selection is a pattern recognition approach for choosing important variables according to some criterion in order to distinguish or explain certain phenomena (i.e., for dimensionality reduction). Many genomic and proteomic applications rely on feature selection to answer questions such as selecting signature genes that are informative about some biological state, e.g., normal tissues and several types of cancer, or inferring a prediction network among elements such as genes, proteins, and external stimuli. In these applications, a recurrent problem is the lack of samples needed to perform an adequate estimate of the joint probabilities between element states. A myriad of feature selection algorithms and criterion functions have been proposed, although it is difficult to point to the best solution for each application. Results: The intent of this work is to provide an open-source, multiplatform graphical environment for bioinformatics problems, which supports many feature selection algorithms and criterion functions, and graphic visualization tools such as scatterplots, parallel coordinates, and graphs. A feature selection approach for growing genetic networks from seed genes (targets or predictors) is also implemented in the system. Conclusion: The proposed feature selection environment allows data analysis using several algorithms, criterion functions, and graphic visualization tools. Our experiments have shown the software's effectiveness in two distinct types of biological problems. Moreover, the environment can be used in different pattern recognition applications, although its main focus is bioinformatics tasks.
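One of the classic algorithms an environment like this typically supports is greedy sequential forward selection. A minimal sketch (names are illustrative, and the criterion function is supplied by the caller rather than fixed by the software described above):

```python
def forward_selection(features, criterion, k):
    """Greedy sequential forward selection: grow the selected set one
    feature at a time, always adding the candidate that maximizes the
    caller-supplied criterion evaluated on the tentative set."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        # Score each candidate together with the already-selected features
        best = max(remaining, key=lambda f: criterion(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In the genomic setting described above, `criterion` would be an estimate of how informative the candidate gene set is about the target state, which is exactly where the small-sample joint-probability estimation problem bites.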
Abstract:
An (n, d)-expander is a graph G = (V, E) such that for every X ⊆ V with |X| ≤ 2n − 2 we have |Γ_G(X)| ≥ (d + 1)|X|. A tree T is small if it has at most n vertices and maximum degree at most d. Friedman and Pippenger (1987) proved that any (n, d)-expander contains every small tree. However, their elegant proof does not seem to yield an efficient algorithm for obtaining the tree. In this paper, we give an alternative result that does admit a polynomial-time algorithm for finding the immersion of any small tree in subgraphs G of (N, D, λ)-graphs Λ, as long as G contains a positive fraction of the edges of Λ and λ/D is small enough. In several applications of the Friedman-Pippenger theorem, including the ones in the original paper of those authors, the (n, d)-expander G is a subgraph of an (N, D, λ)-graph as above. Therefore, our result suffices to provide efficient algorithms for such previously non-constructive applications. As an example, we discuss a recent result of Alon, Krivelevich, and Sudakov (2007) concerning the embedding of nearly spanning bounded-degree trees, the proof of which makes use of the Friedman-Pippenger theorem. We also show a construction, inspired by Wigderson-Zuckerman expander graphs, for which any sufficiently dense subgraph contains all trees of sizes and maximum degrees achieving essentially optimal parameters. Our algorithmic approach is based on a reduction of the tree-embedding problem to a certain on-line matching problem for bipartite graphs, solved by Aggarwal et al. (1996).
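In display notation, the expansion condition defining an (n, d)-expander reads:

```latex
\forall X \subseteq V,\quad |X| \le 2n - 2 \;\Longrightarrow\; |\Gamma_G(X)| \ge (d+1)\,|X|,
```

where Γ_G(X) denotes the neighbourhood of X in G, i.e., the set of vertices adjacent to at least one vertex of X.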