670 results for Redundant Manipulator
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Tool center point calibration is a known problem in industrial robotics. The major focus of academic research is to enhance the accuracy and repeatability of next-generation robots. However, operators of currently available robots are working within the limits of the robot's repeatability and require calibration methods suitable for these basic applications. This study was conducted in association with Stresstech Oy, which provides solutions for manufacturing quality control. Their sensor, based on the Barkhausen noise effect, requires accurate positioning. The accuracy requirement poses a tool center point calibration problem when measurements are executed with an industrial robot. Multiple options for automatic tool center point calibration are available on the market: manufacturers provide customized calibrators for most robot types and tools. With the handmade sensors and multiple robot types that Stresstech uses, this would require a great deal of labor. This thesis introduces a calibration method that is suitable for all robots that have two free digital input ports. It follows the traditional approach of using a light barrier to detect the tool in the robot coordinate system. However, this method uses two parallel light barriers to simultaneously measure and detect the center axis of the tool. Rotations about two axes are defined from the center axis. The last rotation, about the Z-axis, is calculated for tools whose widths along the X- and Y-axes differ. The results indicate that this method is suitable for calibrating the geometric tool center point of a Barkhausen noise sensor. In the repeatability tests, a standard deviation within the robot's repeatability was achieved. The Barkhausen noise signal was also evaluated after recalibration, and the results indicate correct calibration. However, future studies should be conducted using a more accurate manipulator, since the method employs the robot itself as a measuring device.
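As a rough illustration of the geometry behind the two-barrier approach (a sketch only, not the thesis implementation; the crossing-point inputs, frame convention and function names below are assumptions), the tool centre axis can be recovered from the two points where it crosses the parallel barrier planes, and the two tilt rotations follow from that axis:

```python
import numpy as np

def tool_axis_from_barriers(p_lower, p_upper):
    """Unit vector of the tool centre axis from the two points (robot frame)
    where the axis crosses the lower and upper light-barrier planes."""
    axis = np.asarray(p_upper, float) - np.asarray(p_lower, float)
    return axis / np.linalg.norm(axis)

def axis_to_rx_ry(axis):
    """Rotations about X and Y (radians) that tilt the flange Z-axis onto the
    measured tool axis, assuming the composition Rx(rx) * Ry(ry); the remaining
    rotation about Z is determined separately from the asymmetric tool width."""
    ax, ay, az = axis
    rx = np.arctan2(-ay, az)
    ry = np.arctan2(ax, np.hypot(ay, az))
    return rx, ry

# Example: axis crossing two barrier planes spaced 30 mm apart
axis = tool_axis_from_barriers([250.0, 120.0, 400.0], [251.2, 119.1, 430.0])
print(axis_to_rx_ry(axis))
```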
Abstract:
Chemokines are members of a family of more than 30 human cytokines whose best-described activities are as chemotactic factors for leukocytes and that are presumed to be important in leukocyte recruitment and trafficking. While many chemokines can act on lymphocytes, the roles of chemokines and their receptors in lymphocyte biology are poorly understood. The recent discoveries that chemokines can suppress infection by HIV-1 and that chemokine receptors serve, along with CD4, as obligate co-receptors for HIV-1 entry have lent urgency to studies on the relationships between chemokines and lymphocytes. My laboratory has characterized Mig and Crg-2/IP-10, chemokines that are induced by IFN-γ and that specifically target lymphocytes, particularly activated T cells. We have demonstrated that the genes for these chemokines are widely expressed during experimental infections in mice with protozoan and viral pathogens, but that the patterns of mig and crg-2 expression differed, suggesting non-redundant roles in vivo. Our related studies to identify new chemokine receptors from activated lymphocytes resulted in the cloning of STRL22 and STRL33. We and others have shown that STRL22 is a receptor for the CC chemokine MIP-3a, and STRL22 has been renamed CCR6. Although STRL33 remains an orphan receptor, we have shown that it can function as a co-receptor for HIV-1 envelope glycoproteins, and that it is active with a broader range of HIV-1 envelope glycoproteins than the major co-receptors described to date. The ability of STRL33 to function with a wide variety of envelope glycoproteins may become particularly important if therapies are instituted to block other specific co-receptors. We presume that investigations into the roles of chemokines and their receptors in lymphocyte biology will provide information important for understanding the pathogenesis of AIDS and for manipulating immune and inflammatory responses for clinical benefit.
Abstract:
Background: Controversy exists concerning the indications and outcomes of major bariatric surgery procedures. Massive weight loss after bariatric surgery leads to excess skin with functional and aesthetic impairments. The aim of this study was to investigate the major bariatric surgery procedures and their outcomes in two specific subgroups of morbidly obese patients, those aged ≥55 years and the superobese. Further aims were to evaluate whether preoperative weight loss correlates with laparoscopic gastric bypass complications. The prevalence and impact of excess skin and the desire for body contouring after bariatric surgery were also studied. Patients and Methods: Data from patients who underwent Laparoscopic Adjustable Gastric Banding (LAGB) and Laparoscopic Roux-en-Y Gastric Bypass (LRYGB) at Vaasa Central Hospital were collected, and postoperative outcomes were evaluated according to BMI, age and preoperative weight loss. Patients who had undergone bariatric surgery procedures were asked to complete a questionnaire to estimate any impairment due to redundant skin and to analyse each patient's desire for body contouring by area. Results: No significant difference was found in operative time, hospital stay, or overall early postoperative morbidity between LAGB and LRYGB. Mean excess weight loss percentages (EWL%) at 6 and 12 months after LRYGB were significantly higher. A significant difference was found in operative time favouring patients <55 years. Intraoperative complications were significantly more frequent in the group aged >55 years. No significant difference was detected in overall postoperative morbidity rates. A significant difference was found in operative time and hospital stay favouring all patients who lost weight preoperatively. Most patients reported problems with redundant skin, especially on the abdomen, upper arms and rear/buttocks, which impaired daily physical activity in half of them. Excess skin was significantly associated with female gender, weight loss and ΔBMI. Patients with a WL >20 kg, ΔBMI ≥10 kg/m2 and an EWL% >50 reported significantly greater discomfort from surplus skin (p < 0.001). Most patients desired body contouring surgery, with high or very high desire for waist/abdomen (62.2%), upper arm (37.6%), chest/breast (28.3%), and rear/buttock (35.6%) contouring. Conclusions: LRYGB is effective and safe in superobese (BMI >50) and elderly (>55 years) patients. A preoperative weight loss >5% is recommended to improve outcomes and reduce complications. A WL >20 kg, ΔBMI ≥10 kg/m2 and an EWL% >50 are associated with greater functional discomfort due to redundant skin and with a stronger desire for body contouring plastic surgery.
Abstract:
A brain-computer interface (BCI) is a kind of human-machine interface that provides a new method of interaction between humans and computers or other equipment. The most significant characteristic of a BCI system is that its control input consists of electrical activity acquired from the brain instead of traditional inputs such as the hands or eyes. BCI techniques have developed rapidly during the last two decades, mainly as an assistive technology to help disabled people improve their quality of life. With the appearance of low-cost novel electrical devices such as EMOTIV, BCI techniques have been brought to the general public through many useful applications, including video gaming, virtual reality and virtual keyboards. The purpose of this research is to become familiar with the EMOTIV EPOC system and use it to build an EEG-based BCI system for controlling an industrial manipulator by means of human thought. To build the BCI system, an acquisition program based on the EMOTIV EPOC system is designed, and an MFC-based dialog that works as an operation panel is presented. Furthermore, the inverse kinematics of the RV-3SB industrial robot is solved. In the last part of this research, the designed BCI system with human-thought input is tested, and the results indicate that the system runs smoothly and clearly displays the motion type and the incremental displacement of the motion.
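A minimal sketch of the kind of thought-to-jog mapping whose output the operation panel displays (the command labels, step sizes and function names are hypothetical, not the thesis code; the EMOTIV classification and the RV-3SB inverse kinematics are assumed to happen elsewhere):

```python
# Hypothetical mapping of EMOTIV mental-command labels to Cartesian jog
# increments of the manipulator's tool centre point (mm per recognised command).
JOG_TABLE = {
    "push":  (+5.0, 0.0, 0.0),
    "pull":  (-5.0, 0.0, 0.0),
    "left":  (0.0, +5.0, 0.0),
    "right": (0.0, -5.0, 0.0),
    "lift":  (0.0, 0.0, +5.0),
    "drop":  (0.0, 0.0, -5.0),
}

def jog_from_command(command, position):
    """Return the new TCP position and print the motion type together with the
    incremental displacement, mirroring the operation-panel display."""
    dx, dy, dz = JOG_TABLE.get(command, (0.0, 0.0, 0.0))
    print(f"motion: {command:6s}  increment: ({dx:+.1f}, {dy:+.1f}, {dz:+.1f}) mm")
    return (position[0] + dx, position[1] + dy, position[2] + dz)

pos = (300.0, 0.0, 500.0)
pos = jog_from_command("push", pos)
pos = jog_from_command("lift", pos)
```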
Abstract:
The glycosylation of glycoconjugates and the biosynthesis of polysaccharides depend on nucleotide sugars, which are the substrates for glycosyltransferases. A large proportion of these enzymes are located within the lumen of the Golgi apparatus as well as the endoplasmic reticulum, while many of the nucleotide sugars are synthesized in the cytosol. Thus, nucleotide sugars are translocated from the cytosol to the lumen of the Golgi apparatus and endoplasmic reticulum by multiple spanning domain proteins known as nucleotide-sugar transporters (NSTs). These proteins were first identified biochemically, and some of them were cloned by complementation of mutants. Genome and expressed sequence tag sequencing allowed the identification of a number of sequences that may encode NSTs in different organisms. The functional characterization of some of these genes has shown that some of them can be highly specific in their substrate preference, while others can utilize up to three different nucleotide sugars containing the same nucleotide. Mutations in genes encoding NSTs can lead to changes in development in Drosophila melanogaster or Caenorhabditis elegans, as well as alterations in the infectivity of Leishmania donovani. In humans, mutation of a GDP-fucose transporter is responsible for an impaired immune response as well as retarded growth. These results suggest that, even though there appear to be a fair number of genes encoding NSTs, they are not functionally redundant and seem to play specific roles in glycosylation.
Abstract:
Chronic hepatitis B (HBV) and C (HCV) virus infections are the most important factors associated with hepatocellular carcinoma (HCC), but tumor prognosis remains poor due to the lack of diagnostic biomarkers. In order to identify novel diagnostic markers and therapeutic targets, the gene expression profiles associated with viral and non-viral HCC were assessed in 9 tumor samples by oligo-microarrays. The differentially expressed genes were examined using z-scores and KEGG pathways in a search for ontological biological processes. We selected a non-redundant set of 15 genes with the lowest P values for clustering the samples into three groups using the non-supervised k-means algorithm. Fisher's linear discriminant analysis was then applied in an exhaustive search for trios of genes that could be used to build classifiers for class distinction. Different transcriptional levels of genes were identified in HCC of different etiologies and from different HCC samples. When comparing HBV-HCC vs HCV-HCC, HBV-HCC/HCV-HCC vs non-viral (NV)-HCC, HBV-HCC vs NV-HCC, and HCV-HCC vs NV-HCC, only 6 of the 58 non-redundant differentially expressed genes (IKBKβ, CREBBP, WNT10B, PRDX6, ITGAV, and IFNAR1) were found to be associated with hepatic carcinogenesis. By combining trios, classifiers could be generated that correctly classified 100% of the samples. This expression profiling may provide a useful tool for research into the pathophysiology of HCC. A detailed understanding of how these distinct genes are involved in molecular pathways is of fundamental importance to the development of effective HCC chemoprevention and treatment.
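A hedged sketch of the described two-step analysis using scikit-learn on placeholder data (the random expression matrix, the class labels and the restriction of the trio search to the first 20 genes are illustrative assumptions, not the study's data):

```python
import itertools
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(9, 58))                      # 9 tumour samples x 58 genes (placeholder)
labels = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])    # e.g. HBV / HCV / non-viral classes

# Unsupervised step: k-means on a reduced gene panel (here the first 15 columns)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X[:, :15])

# Supervised step: exhaustive search of gene trios with Fisher's linear discriminant,
# scored by leave-one-out accuracy (search limited to 20 genes to keep the demo fast)
best = (-1.0, None)
for trio in itertools.combinations(range(20), 3):
    acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, trio], labels,
                          cv=LeaveOneOut()).mean()
    best = max(best, (acc, trio))

print("clusters:", clusters, " best trio:", best[1], " accuracy:", best[0])
```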
Abstract:
The aim of this thesis is to propose a novel control method for teleoperated electrohydraulic servo systems that implements a reliable haptic sense for the interaction between the human and the manipulator, and ideal position control for the interaction between the manipulator and the task environment. The proposed method has the characteristics of a universal technique that is independent of the actual control algorithm and can be combined with other suitable control methods as a real-time control strategy. The motivation for developing this control method is the need for a reliable real-time controller for teleoperated electrohydraulic servo systems that provides highly accurate position control based on joystick inputs with haptic capabilities. The contribution of the research is that the proposed control method combines a directed random search method and a real-time simulation to develop an intelligent controller in which each generation of parameters is tested online by the real-time simulator before being applied to the real process. The controller was evaluated on a hydraulic position servo system. The simulator of the hydraulic system was built based on the Markov chain Monte Carlo (MCMC) method. A Particle Swarm Optimization algorithm combined with the foraging behavior of E. coli bacteria was used as the directed random search engine. The control strategy allows the operator to be plugged into the work environment dynamically and kinetically. This helps to ensure that the system has a haptic sense with high stability, without abstracting away the dynamics of the hydraulic system. The new control algorithm provides asymptotically exact tracking of both the position and the contact force. In addition, this research proposes a novel method for the re-calibration of multi-axis force/torque sensors. The method offers several improvements over traditional methods: it can be used without dismantling the sensor from its application, it requires a smaller number of standard loads for calibration, and it is more cost-efficient and faster than traditional calibration methods. The proposed method was developed in response to re-calibration issues with the force sensors used in teleoperated systems. The new approach aims to avoid dismantling the sensors from their applications for calibration. A major complication with many manipulators is the difficulty of accessing them when they operate inside an inaccessible environment, especially if that environment is harsh, such as a radioactive area. The proposed technique is based on design-of-experiments methodology. It has been successfully applied to different force/torque sensors, and this research presents an experimental validation of the calibration method with one of the force sensors to which it has been applied.
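A toy sketch of the central idea of vetting each generation of controller parameters in a simulator before applying them to the real process, here with a plain Particle Swarm Optimization of PI gains on a first-order stand-in plant (the plant model, gain bounds and cost function are assumptions; the bacterial-foraging component and the MCMC-built simulator of the thesis are omitted):

```python
import numpy as np

def simulate_step_response(gains, steps=200, dt=0.01):
    """Crude stand-in for the real-time simulator: a first-order plant under a
    PI controller; returns an integrated-absolute-error cost for one candidate."""
    kp, ki = gains
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = 1.0 - y                  # unit position reference
        integ += err * dt
        u = kp * err + ki * integ
        y += dt * (-y + u)             # plant: dy/dt = -y + u
        cost += abs(err) * dt
    return cost

rng = np.random.default_rng(1)
n, dims = 20, 2
pos = rng.uniform(0.1, 10.0, (n, dims))            # particles = candidate (kp, ki) gains
vel = np.zeros_like(pos)
pbest = pos.copy()
pcost = np.array([simulate_step_response(p) for p in pos])
gbest = pbest[pcost.argmin()].copy()

for _ in range(30):                                # directed random search loop
    r1, r2 = rng.random((2, n, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.1, 10.0)
    cost = np.array([simulate_step_response(p) for p in pos])   # vetted in simulation
    improved = cost < pcost
    pbest[improved], pcost[improved] = pos[improved], cost[improved]
    gbest = pbest[pcost.argmin()].copy()

print("gains cleared by the simulator:", gbest)    # only these would go to the real process
```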
Abstract:
ICT contributes about 0.83 GtCO2 of emissions, of which 37% comes from telecoms infrastructure. At the same time, the increasing cost of energy has been hindering the industry in providing more affordable services for users. One source of these problems is said to be the rigidity of current network infrastructures, which limits innovation in the network. SDN (Software Defined Networking) has emerged as one of the prominent solutions with its ideas of abstraction, visibility, and programmability in the network. Nevertheless, significant effort is still needed to actually use it to create a more energy- and environmentally friendly network. In this paper, we propose and develop a platform for developing ecology-related SDN applications. Our main approach to realizing this goal is to maximize the abstractions provided by OpenFlow and to expose RESTful interfaces to modules that enable energy saving in the network. While OpenFlow is intended to be the standard SDN protocol, some mechanisms are still not defined in its specification, such as settings related to Quality of Service (QoS). To address this, we created REST interfaces for setting QoS in the switches, which can maximize network utilization. We also created a module for minimizing the network resources required to deliver packets across the network. This is achieved by utilizing redundant links when they are needed, but disabling them when the load in the network decreases. The use of multiple paths in a network is also evaluated for its benefits in terms of transfer rate improvement and energy savings. We hope that the developed framework can benefit developers in creating applications that support environmentally friendly network infrastructures.
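A hypothetical sketch of the redundant-link power-down decision (the link names, capacities and utilisation threshold are assumptions and not part of the described platform's API):

```python
# Keep redundant links up only while the aggregate load needs them;
# otherwise mark them as candidates for power-down.
LINK_CAPACITY_MBPS = 1000.0

def links_to_disable(primary_links, redundant_links, flow_loads_mbps, threshold=0.5):
    """Return the redundant links that can be switched off because the primary
    links alone stay below `threshold` utilisation after absorbing all flows."""
    total_load = sum(flow_loads_mbps.values())
    primary_capacity = len(primary_links) * LINK_CAPACITY_MBPS
    if total_load <= threshold * primary_capacity:
        return list(redundant_links)     # low load: redundant links can sleep
    return []                            # high load: keep multipath forwarding

primary = ["s1-s2", "s2-s3"]
redundant = ["s1-s3"]
loads = {"flow1": 120.0, "flow2": 80.0}
print(links_to_disable(primary, redundant, loads))
```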
Abstract:
Avidins (Avds) are homotetrameric or homodimeric glycoproteins with typically fewer than 130 amino acid residues per monomer. They form a highly stable, non-covalent complex with biotin (vitamin H), with Kd = 10^-15 M for chicken Avd. The best-studied Avds are the chicken Avd from Gallus gallus and streptavidin from Streptomyces avidinii, although other Avd studies have also included Avds from various origins, e.g., from frogs, fishes, mushrooms and many different bacteria. Several engineered Avds have been reported as well, e.g., dual-chain Avds (dcAvds) and single-chain Avds (scAvds), circular permutants with up to four simultaneously modifiable ligand-binding sites. These engineered Avds, along with the many native Avds, have potential to be used in various nanobiotechnological applications. In this study, we made a structure-based alignment representing all currently available Avd sequences and studied the evolutionary relationships of Avds using phylogenetic analysis. First, we created an initial multiple sequence alignment of Avds using 42 closely related sequences, guided by the known Avd crystal structures. Next, we searched for non-redundant Avd sequences in various online databases, including the National Center for Biotechnology Information and the Universal Protein Resource; the identified sequences were added to the initial alignment to expand it to a final alignment of 242 Avd sequences. The MEGA software package was used to create distance matrices and a phylogenetic tree. Bootstrap reproducibility of the tree was poor at multiple nodes, which may reflect several possible issues with the data: the sequence length compared is relatively short and, whereas some positions are highly conserved and functional, others can vary without impinging on the structure or the function, so there are few informative sites; it may be that periods of rapid duplication have led to paralogs whose differences are within the error limit of the data; and there may be other, as yet unknown, reasons. Principal component analysis applied to alternative distance data did segregate the major groups, a success likely due to the multivariate consideration of all the information. Furthermore, based on our extensive alignment and phylogenetic analysis, we expressed two novel Avds, lacavidin from Latrodectus hesperus, the western black widow spider, and hoefavidin from Hoeflea phototrophica, an aerobic marine bacterium, the ultimate aim being to determine their X-ray structures. These Avds were selected because of their unique sequences: lacavidin has an N-terminal Avd-like domain but a long C-terminal overhang, whereas hoefavidin was thought to be a dimeric Avd. Both of these Avds could be used as novel scaffolds in biotechnological applications.
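For illustration, applying principal component analysis to distance data is commonly done as principal coordinate analysis (classical multidimensional scaling); a minimal sketch on a toy matrix (the matrix is an assumption, not the avidin distance data) is:

```python
import numpy as np

def principal_coordinates(dist, n_axes=2):
    """Classical MDS (PCoA) on a pairwise distance matrix: double-centre the
    squared distances and keep the leading eigenvectors as coordinates."""
    d2 = np.asarray(dist, float) ** 2
    n = d2.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ d2 @ j
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:n_axes]
    return vecs[:, order] * np.sqrt(np.clip(vals[order], 0.0, None))

# Toy 4x4 distance matrix standing in for a sequence-distance matrix
d = np.array([[0, 1, 4, 4],
              [1, 0, 4, 4],
              [4, 4, 0, 1],
              [4, 4, 1, 0]], float)
print(principal_coordinates(d))   # two well-separated pairs appear as two groups
```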
Abstract:
Given a heterogeneous relation algebra R, it is well known that the algebra of matrices with coefficients from R is a relation algebra with relational sums that is not necessarily finite. When a relational product exists or the point axiom is given, we can represent the relation algebra by concrete binary relations between sets, which means the algebra may be seen as an algebra of Boolean matrices. However, it is not possible to represent every relation algebra. It is well known that the smallest relation algebra that is not representable has only 16 elements; such an algebra cannot be put into Boolean matrix form [15]. In [15, 16] it was shown that every relation algebra R with relational sums and sub-objects is equivalent to an algebra of matrices over a suitable basis. This basis is given by the integral objects of R and is, compared to R, much smaller. The aim of my thesis is to develop a system called ReAlM - Relation Algebra Manipulator - that is capable of visualizing computations in arbitrary relation algebras using the matrix approach.
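A minimal sketch of the Boolean-matrix view of finite concrete relations that such a manipulator visualizes (the helper names are illustrative, not ReAlM's interface):

```python
import numpy as np

def compose(r, s):
    """Relational composition: (R;S)[i][k] holds iff some j has R[i][j] and S[j][k]."""
    return (np.asarray(r, int) @ np.asarray(s, int)) > 0

def converse(r):
    """Converse of a relation, i.e. the transpose of its Boolean matrix."""
    return np.asarray(r, bool).T

R = np.array([[1, 0], [1, 1]], bool)
S = np.array([[0, 1], [1, 0]], bool)
print(compose(R, S).astype(int))
print(converse(R).astype(int))
```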
Abstract:
The capacity of all living cells to sense and interact with their environment is a necessity for life. In highly evolved eukaryotic species, like humans, signalling mechanisms are necessary to regulate the function and survival of all cells in the organism. Synchronizing systemic signalling systems at the cellular, organ and whole-organism level is a formidable task, and for most species it requires a large number of signalling molecules and their receptors. One of the major types of signalling molecules used throughout the animal kingdom is modulatory substances (e.g., hormones and peptides). Modulators can act as chemical transmitters, facilitating communication at chemical synapses. There are hundreds of circulating modulators within the mammalian system, but the reason for so many remains a mystery. Recent work with the fruit fly Drosophila melanogaster demonstrated the capacity of peptides to modulate synaptic transmission in a neuron-specific manner, suggesting that peptides are not simply redundant but rather may have highly specific roles. Thus, the diversity of peptides may reflect cell-specific functions. The main objective of my doctoral thesis was to examine the extent to which neuromodulatory substances and their receptors modulate synaptic transmission at a cell-specific level, using D. melanogaster. Using three different modulatory substances, i) octopamine, a biogenic amine released from motor neuron terminals, ii) DPKQDFMRFa, a neuropeptide secreted into circulation, and iii) proctolin, a pentapeptide released both from motor neuron terminals and into circulation, I was able to investigate not only the capacity of these various substances to work in a cell-selective manner, but also the different mechanisms of action and how modulatory substances work in concert to execute systemic functionality. The results support the idea that modulatory substances act in a circuit-selective manner in the central nervous system and in the periphery in order to coordinate and synchronize physiologically and behaviourally relevant outputs. The findings contribute to understanding why the nervous system encodes so many modulatory substances.
Abstract:
Pitx transcription factors have been implicated in hindlimb growth and in determining hindlimb identity. First, inactivation of Pitx1 in the mouse results in a partial transformation of the hindlimbs into forelimbs. Second, the double mutation of Pitx1 and Pitx2 revealed the redundant activity of these factors in hindlimb growth: Pitx1-/-;Pitx2neo/neo mutant mice show a loss of the proximal and anterior skeletal elements. Recent work has implicated genes of the Iroquois family in limb development. In particular, Irx3-/-;Irx5-/- mice show a loss of the proximal and anterior skeletal elements, exclusively in the hindlimbs. This phenocopy between the Pitx1/2 and Irx3/5 mutant mice led us to formulate three hypotheses: (1) the Pitx factors drive Irx expression in the hindlimb buds; (2) conversely, the Irx factors drive Pitx expression; (3) the Pitx and Irx factors participate together in the genetic program for hindlimb bud growth. We concluded that the Pitx and Irx factors belong to regulatory cascades that are independent of one another and that they are capable of transcriptional interaction, both on a generic promoter and on conserved regions of the Tbx4 locus. Finally, inactivation of either the Pitx or the Irx genes leads to a delay in Pax9 expression exclusively in the hindlimb buds. Thus, the Pitx and Irx factors appear to act on parallel genetic programs involved in hindlimb growth and patterning.
Abstract:
Network survivability is a very interesting area of technical study as well as a critical concern in network design. Given that more and more data are carried over communication networks, a single failure can interrupt millions of users and cause millions of dollars in lost revenue. Network protection techniques consist of providing spare capacity in a network and automatically rerouting flows around a failure using that available capacity. This thesis deals with the design of survivable optical networks that use protection schemes based on p-cycles. More precisely, path-protecting p-cycles are exploited in the context of link failures. Our study focuses on establishing p-cycle protection structures, assuming that the working paths for the whole set of requests are defined a priori. Most existing work uses heuristics or solution methods that have difficulty solving large instances. The objective of this thesis is twofold. On the one hand, we propose models and solution methods capable of tackling larger problems than those already presented in the literature. On the other hand, thanks to new algorithms, we are able to produce optimal or near-optimal solutions. To do so, we rely on the column generation technique, which is well suited to solving large-scale linear programming problems. In this project, column generation is used as an intelligent way of implicitly enumerating promising cycles. We first propose formulations for the master problem and the pricing problem, together with a first column generation algorithm for the design of networks protected by path-protecting p-cycles. The algorithm obtains better solutions, in a reasonable time, than those obtained by existing methods. Next, a more compact formulation is proposed for the pricing problem. In addition, we present a new hierarchical decomposition method that greatly improves the overall efficiency of the algorithm. Regarding integer solutions, we propose two heuristic methods that manage to find good solutions. We also carry out a systematic comparison between p-cycles and classical shared-protection schemes, performing a precise comparison with unified, column-generation-based formulations in order to obtain high-quality results. Subsequently, we empirically evaluate the directed and undirected versions of p-cycles for link protection as well as for path protection, under asymmetric traffic scenarios, and we show the additional protection cost incurred when bidirectional systems are used in such scenarios. Finally, we study a column generation formulation for the design of networks with p-cycles in the presence of availability requirements and obtain the first lower bounds for this problem.
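As an illustration of how column generation is typically set up for p-cycle design (a simplified, generic link-protecting variant; the thesis itself treats path-protecting p-cycles and richer constraints), the master problem and the pricing criterion can be sketched as:

\begin{align*}
\min_{z \ge 0} \quad & \sum_{c \in \mathcal{C}} \Big( \sum_{\ell \in c} \kappa_\ell \Big)\, z_c \\
\text{s.t.} \quad & \sum_{c \in \mathcal{C}} a_{\ell c}\, z_c \;\ge\; w_\ell \qquad \forall \ell \in L,
\end{align*}

where z_c is the number of copies of candidate cycle c, \kappa_\ell the cost of spare capacity on link \ell, w_\ell the working capacity to protect on \ell, and a_{\ell c} the number of protection paths cycle c offers link \ell (1 if \ell lies on c, 2 if \ell straddles c, 0 otherwise). At each iteration the pricing problem looks for a cycle with negative reduced cost \sum_{\ell \in c} \kappa_\ell - \sum_{\ell} \pi_\ell\, a_{\ell c}, where \pi_\ell are the dual values of the protection constraints; if none exists, the current LP solution is optimal.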
Abstract:
Response times in a visual object recognition task decrease significantly when the targets can be distinguished on the basis of two redundant attributes. The redundancy gain for two attributes is a common finding in the literature, but a gain produced by three redundant attributes has only been observed when those three attributes came from three different modalities (tactile, auditory and visual). The present study demonstrates that a redundancy gain for three attributes of the same modality is indeed possible. It also includes a more detailed investigation of the characteristics of the redundancy gain. Besides the decrease in response times, these include a decrease in the minimal response times in particular and an increase in the symmetry of the response time distribution. This study presents evidence that neither race models nor coactivation models can account for the full set of characteristics of the redundancy gain. In this context, we introduce a new method for assessing the triple redundancy gain based on the performance for doubly redundant targets. The cascade model is presented to explain the results of this study. This model comprises several processing channels that are triggered by a cascade of activations before a single decision criterion is satisfied. It offers a unified approach to previous research on the redundancy gain. The analysis of the characteristics of response time distributions, namely their mean, symmetry, shift and spread, is an essential tool for this study. It was important to find a statistical test able to reflect differences in all of these characteristics. We address the problem of analysing response times without loss of information, as well as the inadequacy of common analysis methods in this context, such as pooling the response times of several participants (e.g., Vincentizing). Distribution tests, the best known being the Kolmogorov-Smirnov test, are a better alternative for comparing distributions, in particular those of response times. A test still unknown in psychology is introduced: the two-sample Anderson-Darling test. The two tests are compared, and we then present conclusive evidence of the power of the Anderson-Darling test: when comparing distributions that differ only in (1) their shift, (2) their spread, (3) their symmetry, or (4) their tails, the Anderson-Darling test detects the differences better. Moreover, the Anderson-Darling test has a Type I error rate that corresponds exactly to alpha, whereas the Kolmogorov-Smirnov test is too conservative. Consequently, the Anderson-Darling test requires fewer data to reach sufficient statistical power.
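A brief sketch of how the two distribution tests compare on synthetic response-time-like samples using SciPy (the gamma-distributed samples are an assumption; scipy.stats.anderson_ksamp implements the k-sample Anderson-Darling test and caps the significance level it reports):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Two reaction-time-like samples (ms) differing mainly in spread and skew
rt_single = rng.gamma(shape=4.0, scale=80.0, size=200) + 200.0
rt_redundant = rng.gamma(shape=6.0, scale=50.0, size=200) + 200.0

ks = stats.ks_2samp(rt_single, rt_redundant)
ad = stats.anderson_ksamp([rt_single, rt_redundant])

print(f"Kolmogorov-Smirnov: D = {ks.statistic:.3f}, p = {ks.pvalue:.4f}")
print(f"Anderson-Darling:   A2 = {ad.statistic:.3f}, approx. p = {ad.significance_level:.4f}")
```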