908 results for 090602 Control Systems Robotics and Automation


Relevance: 100.00%

Publisher:

Abstract:

Introduction: Neuroimaging of the self has focused on high-level mechanisms such as language, memory or imagery of the self. Recent evidence suggests that low-level mechanisms of multisensory and sensorimotor integration may play a fundamental role in encoding self-location and the first-person perspective (Blanke and Metzinger, 2009). Neurological patients with out-of-body experiences (OBE) suffer from abnormal self-location and first-person perspective due to damage to the temporo-parietal junction (Blanke et al., 2004). Although self-location and the first-person perspective can be studied experimentally (Lenggenhager et al., 2009), the neural underpinnings of self-location have yet to be investigated. To investigate the brain network involved in self-location and the first-person perspective we used visuo-tactile multisensory conflict, magnetic resonance (MR)-compatible robotics, and fMRI in study 1, and lesion analysis in a sample of 9 patients with OBE due to focal brain damage in study 2. Methods: Twenty-two participants saw a video showing either a person's back or an empty room being stroked (visual stimuli) while the MR-compatible robotic device stroked their own back (tactile stimulation). Direction and speed of the seen stroking could either correspond (synchronous) or not (asynchronous) to those of the felt stroking. Each run comprised the four conditions of a 2x2 factorial design with Object (Body, No-Body) and Synchrony (Synchronous, Asynchronous) as main factors. Self-location was estimated using the mental ball dropping task (MBD; Lenggenhager et al., 2009). After the fMRI session participants completed a 6-item questionnaire adapted from the original questionnaire created by Botvinick and Cohen (1998) and based on questions and data obtained by Lenggenhager et al. (2007, 2009). They were also asked to complete a questionnaire to disclose the perspective they adopted during the illusion. Response times (RTs) for the MBD and fMRI data were analyzed with a 3-way mixed-model ANOVA with the between-subjects factor Perspective (up, down) and the two within-subjects factors Object (body, no-body) and Stroking (synchronous, asynchronous). Quantitative lesion analysis was performed using MRIcron (Rorden et al., 2007). We compared the distribution of brain lesions confirmed by multimodality imaging (Knowlton, 2004) in patients with OBE with that of patients showing complex visual hallucinations involving people or faces, but without any disturbance of self-location and first-person perspective. Nine patients with OBE were investigated; the control group comprised 8 patients. Structural imaging data were available for normalization and co-registration in all patients. Normalization of each patient's lesion into the common MNI (Montreal Neurological Institute) reference space permitted simple, voxel-wise, algebraic comparisons to be made. Results: Although in the scanner all participants were lying on their backs and facing upwards, analysis of perspective showed that half of the participants had the impression of looking down at the virtual body below them, irrespective of cues about their actual body position (Down-group). The other participants had the impression of looking up at the virtual body above them (Up-group). Analysis of Q3 ("How strong was the feeling that the body you saw was you?") indicated stronger self-identification with the virtual body during synchronous stroking. RTs in the MBD task confirmed these subjective data (significant 3-way interaction between Perspective, Object and Stroking).
fMRI results showed eight cortical regions in which the BOLD signal differed significantly during at least one of the conditions resulting from the combination of Object and Stroking, relative to baseline: right and left temporo-parietal junction, right extrastriate body area (EBA), left middle occipito-temporal gyrus, left postcentral gyrus, right medial parietal lobe, and bilateral medial occipital lobe (Fig 1). The activation patterns in the right and left temporo-parietal junction and right EBA reflected changes in self-location and perspective, as revealed by statistical analysis performed on the percentage of BOLD change with respect to baseline. Statistical lesion overlap comparison (using nonparametric voxel-based lesion-symptom mapping) with respect to the control group revealed the right temporo-parietal junction, centered at the angular gyrus (Talairach coordinates x = 54, y = -52, z = 26; p < 0.05, FDR corrected). Conclusions: The present questionnaire and behavioural results show that - despite the noisy and constraining MR environment - our participants had predictable changes in self-location, self-identification, and first-person perspective when the robotic tactile stroking was applied synchronously with the seen stroking. fMRI data in healthy participants and lesion data in patients with abnormal self-location and first-person perspective jointly revealed that the temporo-parietal cortex, especially in the right hemisphere, encodes these conscious experiences. We argue that temporo-parietal activity reflects the experience of the conscious "I" as embodied and localized within bodily space.

Relevance: 100.00%

Publisher:

Abstract:

This research work deals with the modeling and design of a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational and research tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating several subjects closely related to automatic control theory in an educational context, embracing communications, signal processing, sensor fusion and hardware design, among others. On the other hand, the idea is to implement useful navigation strategies so that the robot can serve as a mobile multimedia information point. It is in this context, when navigation strategies are oriented to goal achievement, that a local model predictive control strategy is developed; such studies present a very interesting control approach for developing the future capabilities of the system. The research also includes visual information as a meaningful source that allows detecting obstacle position coordinates as well as planning the obstacle-free trajectory that the robot should follow.
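
The abstract does not give the controller equations, so the fragment below is only a minimal sketch of the receding-horizon idea behind a local model predictive speed controller: at every sample a finite-horizon quadratic tracking cost is minimized over a simple velocity model and only the first input is applied. The first-order model, horizon and weights are assumptions made for illustration, not parameters of the PRIM platform.

```python
# Minimal sketch of a receding-horizon (model predictive) speed controller for a
# mobile platform. The first-order velocity model (time constant tau, gain K),
# the horizon N and the weights Q, R are assumptions made for this example and
# are not parameters of the PRIM robot.
import numpy as np

dt, tau, K = 0.1, 0.4, 1.0                  # sample time [s], motor time constant, DC gain (assumed)
A = np.array([[1.0 - dt / tau]])            # discrete-time speed model v[k+1] = A v[k] + B u[k]
B = np.array([[K * dt / tau]])
N = 10                                      # prediction horizon
Q, R = 10.0, 0.1                            # tracking vs. control-effort weights

def mpc_step(v, v_ref):
    """Minimise the finite-horizon quadratic cost and return the first input."""
    # Batch prediction: stacked speeds over the horizon are V = F v + G U
    F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = (np.linalg.matrix_power(A, i - j) @ B)[0, 0]
    # Unconstrained quadratic cost -> solve the normal equations
    H = G.T @ (Q * G) + R * np.eye(N)
    f = (Q * G.T) @ (F * v - v_ref * np.ones((N, 1)))
    U = np.linalg.solve(H, -f)
    return float(U[0, 0])                   # receding horizon: apply only the first input

# Example: track a 0.5 m/s speed reference from standstill
v = 0.0
for _ in range(50):
    u = mpc_step(v, 0.5)
    v = A[0, 0] * v + B[0, 0] * u           # plant simulated with the same nominal model
print(f"speed after 5 s: {v:.3f} m/s")
```

Speed and acceleration constraints, and the obstacle-avoidance terms mentioned in the abstract, would turn the least-squares step into a constrained quadratic program, but the receding-horizon structure stays the same.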

Relevance: 100.00%

Publisher:

Abstract:

The motivation for this research originated from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries of information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC business, RISC processor architecture design is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers and is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Publisher:

Abstract:

The use of belts in high-precision applications has become feasible because of rapid developments in motor and drive technology as well as the adoption of timing belts in servo systems. Belt drive systems provide high speed and acceleration, accurate and repeatable motion with high efficiency, long stroke lengths and low cost. This work examines the modeling of a linear belt-drive system and the design of its position control. Friction phenomena and the position-dependent elasticity of the belt are analyzed. Computer simulation results show that the developed model is adequate. A PID controller for accurate tracking and position control is designed and applied to the real test setup. Both the simulation and the experimental results demonstrate that the designed controller meets the performance specifications.
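
To make the control structure concrete, the sketch below closes a discrete-time PID position loop around a crude belt-drive model: a position-commanded motor pulley coupled to the load carriage through the belt stiffness, with viscous friction on the load. The plant parameters and gains are illustrative assumptions, not the identified values of the test setup.

```python
# Minimal sketch of a discrete-time PID position loop around a crude belt-drive
# model: a position-commanded motor pulley coupled to the load carriage through
# the belt stiffness, with viscous friction on the load. Parameters and gains
# are illustrative assumptions, not the identified values of the test setup.
dt = 1e-4                        # simulation / control step [s]
m, k, c = 2.0, 4000.0, 5.0       # load mass [kg], belt stiffness [N/m], friction [N s/m]
Kp, Ki, Kd = 2.0, 20.0, 0.05     # PID gains (assumed)

ref = 0.1                        # 100 mm step in the position reference
x = v = 0.0                      # load position [m] and velocity [m/s]
integ = x_prev = 0.0

for _ in range(int(2.0 / dt)):   # simulate 2 s
    e = ref - x
    integ += e * dt
    d_meas = (x - x_prev) / dt   # derivative on the measurement avoids derivative kick
    x_prev = x
    x_motor = Kp * e + Ki * integ - Kd * d_meas   # commanded motor-side position
    a = (k * (x_motor - x) - c * v) / m           # belt force accelerates the load
    v += a * dt
    x += v * dt

print(f"load position after 2 s: {x * 1000:.1f} mm (reference {ref * 1000:.0f} mm)")
```

Taking the derivative term on the measurement rather than on the error avoids the derivative kick that a step reference would otherwise inject through the elastic belt.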

Relevance: 100.00%

Publisher:

Abstract:

Energy consumption and energy efficiency have become pressing issues. Energy consumption is rising all over the world, and because of this and of climate change, energy is becoming more and more expensive. Buildings are major consumers of energy, and within buildings the major consumers are the heating, ventilation and air-conditioning (HVAC) systems. They usually run at constant speed without efficient control, and in most cases the HVAC equipment is also oversized: traditionally heating, ventilation and air-conditioning systems have been sized to meet conditions that rarely occur. The theoretical part of this thesis presents the basics of life cycle costs and the calculations covering the whole life cycle of a system. It also presents HVAC systems, equipment, system controls and ways to save energy in these systems. The empirical part of this thesis presents life cycle cost calculations for HVAC systems. With these calculations it is possible to compute the costs over the whole life cycle for the chosen variables, which makes it possible to compare which variable causes most of the costs from a whole-life point of view. Life cycle costs were studied through two real-life cases focusing on two different kinds of HVAC systems. In both cases the renovations had already been made, so that the comparison between the old system and the new, now existing one would be easier. The study indicates that energy can be saved in HVAC systems by using a variable speed drive as the control method.
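
The life cycle cost calculation itself reduces to a discounted sum of investment, energy and maintenance costs over the system lifetime. The sketch below compares a constant-speed fan with a variable-speed-drive alternative using that formula; all prices, powers and rates are illustrative assumptions, not figures from the thesis cases.

```python
# Minimal sketch of a life cycle cost (LCC) comparison for an HVAC fan drive:
# constant-speed operation versus a variable speed drive (VSD), using the usual
# present-value sum LCC = investment + sum_t (energy_t + maintenance_t) / (1 + r)^t.
# All prices, powers and rates are illustrative assumptions, not case-study data.

def life_cycle_cost(investment, annual_energy_kwh, price_per_kwh,
                    annual_maintenance, years, discount_rate):
    """Discounted life cycle cost over the system lifetime."""
    lcc = investment
    yearly = annual_energy_kwh * price_per_kwh + annual_maintenance
    for t in range(1, years + 1):
        lcc += yearly / (1.0 + discount_rate) ** t
    return lcc

# Constant-speed fan: runs at full power whenever it is on (15 kW x 6000 h/a).
constant = life_cycle_cost(investment=5_000, annual_energy_kwh=15 * 6000,
                           price_per_kwh=0.12, annual_maintenance=300,
                           years=15, discount_rate=0.05)

# VSD-controlled fan: by the affinity laws power falls roughly with the cube of
# speed, so part-load operation is assumed to cut average energy use to 55 %.
vsd = life_cycle_cost(investment=9_000, annual_energy_kwh=15 * 6000 * 0.55,
                      price_per_kwh=0.12, annual_maintenance=350,
                      years=15, discount_rate=0.05)

print(f"constant-speed LCC: {constant:,.0f} EUR, VSD LCC: {vsd:,.0f} EUR")
```

In a real comparison the part-load energy figure would come from the measured load profile rather than a single assumed percentage.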

Relevance: 100.00%

Publisher:

Abstract:

The objective of this thesis is to study the Manufacturing Planning and Control (MPC) system and Master Scheduling (MS) in a manufacturing firm. The study is conducted at Ensto Finland Corporation, which operates in the field of electrical systems and supplies. The paper consists of a theoretical and an empirical part. The empirical part is based on weekly work at Ensto and includes inter-firm material analysis, learning and meetings. Master Scheduling is an important module of an MPC system, since it helps transform strategic production plans based on demand forecasting into operational schedules. Furthermore, capacity planning tools can contribute remarkably to production planning: with a Rough-Cut Capacity Planning (RCCP) tool, an MS plan can be critically analyzed in terms of the key resources available in the real manufacturing environment. Currently there are remarkable inefficiencies in Ensto's practices: the system is not able to take seasonal demand into consideration and react to market changes in time, which can cause significant lost sales. These inefficiencies could, however, be eliminated through the appropriate utilization of MS and RCCP tools. To utilize MS and RCCP tools in Ensto's production environment, further testing in the real production environment is required. Moreover, data accuracy, appropriate commitment to adopting and learning the new tools, and continuous development of the functions closely related to MS, such as sales forecasting, need to be ensured.
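
A rough-cut capacity check of a master schedule is essentially a load-versus-capacity comparison at a few key resources. The sketch below illustrates the bill-of-labour variant of RCCP; the product names, hour figures and capacities are invented for the example and are not Ensto data.

```python
# Minimal sketch of a rough-cut capacity check of a master schedule using the
# bill-of-labour approach: load = planned units x hours per unit at each key
# resource, compared with the capacity available per period. Product names,
# hour figures and capacities are hypothetical, not Ensto data.

master_schedule = {                 # planned units per week
    "enclosure_A": [400, 450, 500, 650],
    "enclosure_B": [200, 200, 250, 300],
}
hours_per_unit = {                  # key-resource hours needed per unit
    "enclosure_A": {"assembly": 0.20, "coating": 0.10},
    "enclosure_B": {"assembly": 0.35, "coating": 0.15},
}
capacity = {"assembly": 180.0, "coating": 90.0}   # hours available per week

weeks = len(next(iter(master_schedule.values())))
for week in range(weeks):
    for resource, available in capacity.items():
        load = sum(master_schedule[p][week] * hours_per_unit[p][resource]
                   for p in master_schedule)
        flag = "OVERLOAD" if load > available else "ok"
        print(f"week {week + 1}  {resource:<9} load {load:6.1f} h "
              f"/ capacity {available:5.1f} h  {flag}")
```

An overloaded week such as the seasonal peak in this toy schedule is exactly the kind of situation the thesis argues MS and RCCP should expose early enough to act on.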

Relevance: 100.00%

Publisher:

Abstract:

In recent years the analysis and synthesis of (mechanical) control systems in descriptor form has become established. This general description of dynamical systems is important for many applications in mechanics and mechatronics, in electrical and electronic engineering, and in chemical engineering as well. This contribution deals with linear mechanical descriptor systems and their control design with respect to a quadratic performance criterion. Here, the notion of properness plays an important role in deciding whether the standard Riccati approach can be applied as usual or not. Properness and non-properness distinguish between the cases in which the descriptor system is governed exclusively by the control input or additionally by its higher-order time derivatives. In the unusual case of non-proper systems a quite different optimal control design problem has to be considered. Both cases are solved completely.
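
For the proper case, in which the abstract notes that the standard Riccati approach applies, the design reduces to the familiar LQR computation sketched below for an ordinary state-space model. The single-degree-of-freedom example and the weighting matrices are arbitrary illustrations, not the descriptor systems treated in the paper, and the non-proper case requires a different construction that this sketch does not cover.

```python
# Minimal sketch of the standard Riccati / LQR design that applies in the proper
# case, shown for an ordinary second-order mechanical system M q'' + D q' + K q = B0 u
# written in state-space form. The numbers are an arbitrary single-degree-of-freedom
# example, not the descriptor systems treated in the paper; the non-proper case
# requires a different construction and is not sketched here.
import numpy as np
from scipy.linalg import solve_continuous_are

M, D, K, B0 = 1.0, 0.2, 4.0, 1.0          # mass, damping, stiffness, input gain (assumed)
A = np.array([[0.0, 1.0],
              [-K / M, -D / M]])
B = np.array([[0.0],
              [B0 / M]])
Q = np.diag([10.0, 1.0])                  # state weighting of the quadratic criterion
R = np.array([[0.1]])                     # control weighting

P = solve_continuous_are(A, B, Q, R)      # stabilising solution of A'P + PA - PB R^-1 B'P + Q = 0
F = np.linalg.solve(R, B.T @ P)           # optimal state feedback u = -F x

print("LQR gain F =", F.ravel())
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ F))
```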

Relevance: 100.00%

Publisher:

Abstract:

This work describes a lumped-parameter mathematical model for the prediction of transients in the aerodynamic circuit of a transonic wind tunnel; control actions to properly handle those perturbations are also assessed. The tunnel circuit technology is up to date and incorporates a novel feature: high-enthalpy air injection to extend the tunnel's Reynolds number capability. The model solves the continuity, energy and momentum equations, taking density, internal energy and mass flow as the basic parameters of the aerodynamic study, with Mach number, stagnation pressure and stagnation temperature, all referred to test-section conditions, as the main control variables. The response of the tunnel circuit to control actions and the stability of the flow are numerically investigated. For validation purposes, the code was first applied to the AWT (the Altitude Wind Tunnel of NASA Lewis). Subsequently the Brazilian transonic wind tunnel was investigated, with all the main control systems modeled, including injection.
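
The lumped-parameter bookkeeping can be illustrated, in highly simplified form, by the mass and energy balances of a single control volume with injection and bleed flows, from which pressure and temperature follow via the ideal-gas law. The volume, flow rates and injection schedule below are assumptions made for the example; the real model chains many such volumes around the tunnel circuit and adds the momentum equation and the control loops.

```python
# Minimal sketch of lumped-parameter mass and energy balances for one control
# volume with a high-enthalpy injection pulse and a bleed flow. Geometry, flow
# rates and the injection schedule are illustrative assumptions only; the real
# tunnel model couples many such volumes plus the momentum equation.
R_air, cv = 287.0, 718.0              # gas constant and cv for air [J/(kg K)]
cp = cv + R_air
V = 50.0                              # control-volume size [m^3] (assumed)
rho = 1.2                             # initial density [kg/m^3]
u = cv * 293.0                        # initial specific internal energy (293 K)

dt = 1e-3
for step in range(int(2.0 / dt)):     # simulate 2 s of tunnel time
    t = step * dt
    mdot_in = 40.0 if 0.5 <= t < 1.0 else 0.0    # high-enthalpy injection pulse [kg/s]
    h_in = cp * 450.0                            # injected flow at 450 K stagnation temperature
    mdot_out = 0.08 * rho * V                    # bleed flow, proportional to stored mass
    h_out = u + R_air * u / cv                   # outflow enthalpy u + p/rho
    m = rho * V
    m_new = m + (mdot_in - mdot_out) * dt                        # continuity
    U_new = m * u + (mdot_in * h_in - mdot_out * h_out) * dt     # energy balance
    u = U_new / m_new
    rho = m_new / V

T = u / cv                            # temperature and pressure from the ideal-gas law
p = rho * R_air * T
print(f"after 2 s: T = {T:.1f} K, p = {p / 1000:.1f} kPa, rho = {rho:.3f} kg/m^3")
```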

Relevance: 100.00%

Publisher:

Abstract:

This paper studies the effect of time delay on the active non-linear control of dynamically loaded flexible structures. The behavior of non-linear systems under state feedback control, considering a fixed time delay in the control force, is investigated. A control method based on non-linear optimal control, using a tensorial formulation and state feedback, is employed. The state equations and the control forces are expressed in polynomial form, and a performance index that is quadratic in both the state vector and the control forces is used. General polynomial representations of the non-linear control law are obtained and implemented for control algorithms up to the fifth order. The methodology is applied to systems with quadratic and cubic non-linearities. Strongly non-linear systems are tested and the effectiveness of the control system, including a delay in the application of the control forces, is discussed. Numerical results indicate that the adopted control algorithm can be efficient for non-linear systems, chiefly in the presence of strong non-linearities, but that increasing the time delay reduces the efficiency of the control system. The numerical results emphasize the importance of considering time delay in the design of active structural control systems.
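
The effect described can be reproduced qualitatively with a much simpler setup than the paper's tensorial formulation: a harmonically loaded oscillator with a cubic (hardening) non-linearity under plain linear state feedback, where the control force is applied after a fixed delay. The parameters, gains and delays below are illustrative assumptions only.

```python
# Minimal sketch of delayed state feedback applied to a harmonically loaded
# oscillator with a cubic (hardening) non-linearity. The plant parameters, the
# plain linear feedback law and the delays are illustrative assumptions; the
# paper's tensorial polynomial optimal-control formulation is not reproduced.
import math

def rms_response(delay, t_end=60.0, dt=1e-3):
    """RMS displacement when the control force is applied 'delay' seconds late."""
    x = v = 0.0
    buf = [0.0] * max(1, round(delay / dt))       # ring buffer of past control values
    acc = n = 0
    for k in range(int(t_end / dt)):
        t = k * dt
        u_now = -(25.0 * x + 8.0 * v)             # feedback computed from the current state
        u_applied = buf[k % len(buf)]             # ...but applied with the actuation delay
        buf[k % len(buf)] = u_now
        # Plant: x'' + 0.2 x' + 4 x + 50 x^3 = 3 cos(2 t) + u(t - delay)
        a = -0.2 * v - 4.0 * x - 50.0 * x ** 3 + 3.0 * math.cos(2.0 * t) + u_applied
        v += a * dt                               # semi-implicit Euler keeps the
        x += v * dt                               # oscillator integration well behaved
        if t > 20.0:                              # discard the start-up transient
            acc += x * x
            n += 1
    return math.sqrt(acc / n)

for delay in (0.0, 0.10, 0.20):
    print(f"delay {delay:.2f} s -> RMS displacement {rms_response(delay):.3f}")
```

For small delays the feedback still adds strong damping and the response stays close to the undelayed case; beyond a critical delay the delayed velocity term stops damping the loop and the response grows sharply, mirroring the loss of efficiency reported in the abstract.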

Relevance: 100.00%

Publisher:

Abstract:

The purpose of this study is to explore the possibilities of utilizing business intelligence (BI) systems in management control (MC). The topic is explored through four research questions. Firstly, what kinds of management control systems (MCS) use, or could use, the data and information enabled by the BI system? Secondly, how is the BI system utilized, or how could it be? Thirdly, has the BI system enabled new forms of control or changed old ones? The fourth and final research question is whether the BI system supports forms of control that the literature has not thought of, or whether the BI system is not used for forms of control for which the literature suggests it should be. The study is conducted as an extensive case study; three different organizations were interviewed. For the theoretical basis of the study, central theories in the field of management control are introduced, the term business intelligence is discussed in detail, and mechanisms for the governance of business intelligence are presented. A literature analysis of the uses of BI for management control is introduced, and the theoretical part of the study ends with the construction of a framework for business intelligence in management control. In the empirical part of the study the case organizations, their BI systems, and the ways they utilize these systems for management control are presented. The main findings of the study are that BI systems can be utilized in the fields suggested in the literature, namely in planning, cybernetic, reward, boundary, and interactive control. The systems are used both as data or information feeders and directly as tools. Using BI systems has also enabled entirely new forms of control in the studied organizations, most significantly in the area of interactive control, and has changed the old control systems by making information more readily available to the whole organization. No evidence was found of BI systems being used for forms of control that the literature had not suggested. The systems were mostly used for cybernetic and interactive control, whereas support for other types of control was not as prevalent. The main contribution of the study to the existing literature is the insight it provides into how BI systems, both theoretically and empirically, are used for management control. The framework for business intelligence in management control presented in the study can also be utilized in further studies on the subject.

Relevance: 100.00%

Publisher:

Abstract:

Glutathione is the major intracellular antioxidant thiol protecting mammalian cells against oxidative stress induced by oxygen- and nitrogen-derived reactive species. In trypanosomes and leishmanias, trypanothione plays a central role in parasite protection against mammalian host defence systems through the recycling of trypanothione disulphide by the enzyme trypanothione reductase. Although Kinetoplastida parasites lack glutathione reductase, they maintain significant levels of glutathione. The aim of this study was to use Leishmania donovani trypanothione reductase gene mutant clones and different Leishmania species to examine the role of these two individual thiol systems in the protection mechanism against S-nitroso-N-acetyl-D,L-penicillamine (SNAP), a nitrogen-derived reactive species donor. We found that the resistance of the different Leishmania species to SNAP was directly correlated with their glutathione concentration but not with their total low-molecular-weight thiol content (about 0.18 nmol/10(7) parasites, regardless of Leishmania species). The glutathione concentrations in L. amazonensis, L. donovani, L. major, and L. braziliensis were 0.12, 0.10, 0.08, and 0.04 nmol/10(7) parasites, respectively. L. amazonensis, which has the highest glutathione level, was less susceptible to SNAP (30 and 100 µM). The IC50 values of SNAP determined for L. amazonensis, L. donovani, L. major, and L. braziliensis were 207.8, 188.5, 160.9, and 83 µM, respectively. We also observed that L. donovani mutants carrying only one trypanothione reductase allele had a decreased capacity to survive (~40%) in the presence of SNAP (30-150 µM). In conclusion, the present data suggest that both antioxidant systems, glutathione and trypanothione/trypanothione reductase, participate in the protection of Leishmania against the toxic effect of nitrogen-derived reactive species.

Relevance: 100.00%

Publisher:

Abstract:

Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip; however, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications, and with their computational power these platforms are likely to be used in various application domains, from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the utilization of the resources has to be efficient in terms of performance and power consumption. The high level of on-chip integration, however, increases the probability of various faults and creates hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults and eventually induce faulty execution of applications. It is therefore crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in a hardware description language, namely VHDL.

Relevance: 100.00%

Publisher:

Abstract:

The arterial partial pressure of carbon dioxide (PCO2) is virtually constant because of the close match between the metabolic production of this gas and its excretion via breathing. Blood gas homeostasis does not rely solely on changes in lung ventilation, but also to a considerable extent on circulatory adjustments that regulate the transport of CO2 from its sites of production to the lungs. The neural mechanisms that coordinate circulatory and ventilatory changes to achieve blood gas homeostasis are the subject of this review. Emphasis is placed on the control of sympathetic outflow by central chemoreceptors. High levels of CO2 exert an excitatory effect on sympathetic outflow that is mediated by specialized chemoreceptors such as the neurons located in the retrotrapezoid region. In addition, high CO2 causes an aversive awareness in conscious animals, activating wake-promoting pathways such as the noradrenergic neurons. These neuronal groups, which may also be directly activated by brain acidification, have projections that contribute to the CO2-induced rise in breathing and sympathetic outflow. However, since the level of activity of the retrotrapezoid nucleus is regulated by converging inputs from wake-promoting systems, by behavior-specific inputs from higher centers and by chemical drive, the main focus of the present manuscript is to review the contribution of central chemoreceptors to the control of autonomic and respiratory mechanisms.

Relevance: 100.00%

Publisher:

Abstract:

Whereas the role of the anterior cingulate cortex (ACC) in cognitive control has received considerable attention, much less work has been done on the role of the ACC in autonomic regulation. Its connections through the vagus nerve to the sinoatrial node of the heart are thought to exert modulatory control over cardiovascular arousal. The ACC is therefore responsible not only for the implementation of cognitive control, but also for the dynamic regulation of cardiovascular activity that characterizes healthy heart rate and adaptive behaviour. However, cognitive control and autonomic regulation are rarely examined together. Moreover, the studies that have examined the role of phasic vagal cardiac control in conjunction with cognitive performance have produced mixed results, finding relations for specific age groups and types of tasks but not consistently. So, while autonomic regulatory control appears to support effective cognitive performance under some conditions, it is not presently clear which factors contribute to these relations. The goal of the present study was therefore to examine the relations between autonomic arousal, neural responsivity, and cognitive performance in the context of a task that required ACC support. Participants completed a primary inhibitory control task with an embedded working memory load. Pre-test cardiovascular measures were obtained, and on-task ERPs associated with response control (N2/P3) and error-related processes (ERN/Pe) were analyzed. Results indicated that response inhibition was unrelated to phasic vagal cardiac control, as indexed by respiratory sinus arrhythmia (RSA). However, higher resting RSA was associated with larger ERN amplitude in the highest working memory load condition. This finding suggests that individuals with greater autonomic regulatory control exhibited more robust ACC error-related responses in the most challenging task condition. On the other hand, exploratory analyses with rate pressure product (RPP), a measure of sympathetic arousal, indicated that higher pre-test RPP (i.e., more sympathetic influence) was associated with more errors on "catch" NoGo trials, i.e., NoGo trials that immediately followed other NoGo trials and consequently required enhanced response control. Higher pre-test RPP was also associated with smaller-amplitude ERNs for all three working memory loads and smaller-amplitude P3s for the low and medium working memory load conditions. Thus, higher pre-test sympathetic arousal was associated with poorer performance on the more demanding "catch" NoGo trials and less robust ACC-related electrocortical responses. The findings from the present study highlight the interdependence of electrocortical and cardiovascular processes. While higher pre-test parasympathetic control seemed to relate to more robust ACC error-related responses, higher pre-test sympathetic arousal was associated with poorer inhibitory control performance and smaller ACC-generated electrocortical responses. Furthermore, these results provide a base from which to explore the relation between ACC and neuro/cardiac responses in older adults, who may display greater variance due to the vulnerability of these systems to the normal aging process.

Relevance: 100.00%

Publisher:

Abstract:

In this Letter we numerically investigate the dynamics of a system of two coupled chaotic multimode Nd:YAG lasers with two-mode and three-mode outputs. Unidirectional and bidirectional coupling schemes are adopted; intensity time series, phase space plots and synchronization plots are used to study the dynamics, and the quality of synchronization is measured using correlation index plots. For the laser with two-mode output, the bidirectional direct coupling scheme is found to be effective in achieving complete synchronization, control of chaos and amplification of the output intensity. For the laser with three-mode output, the bidirectional difference coupling scheme gives much better chaotic synchronization than unidirectional difference coupling, but at the cost of a higher coupling strength. We also conclude that the coupling scheme and the system properties play an important role in determining the type of synchronization exhibited by the system.
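
Since the multimode Nd:YAG rate equations are not given in the abstract, the sketch below illustrates the coupling structure and the correlation-index measurement on a pair of generic chaotic oscillators (slightly detuned Lorenz systems) with bidirectional difference coupling; the equations and coupling strengths are stand-in assumptions, not the laser model of the Letter.

```python
# Minimal sketch of bidirectional difference coupling between two chaotic
# oscillators, with a correlation index used to quantify synchronization.
# Slightly detuned Lorenz systems stand in for the multimode Nd:YAG rate
# equations, which the abstract does not give; the coupling strengths are
# illustrative assumptions.
import numpy as np

def lorenz(s, sigma, rho, beta):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def coupled_run(eps, t_end=50.0, dt=1e-3):
    """Integrate two bidirectionally coupled Lorenz systems; return x1(t), x2(t)."""
    s1 = np.array([1.0, 1.0, 1.0])
    s2 = np.array([-2.0, 3.0, 10.0])
    n = int(t_end / dt)
    x1, x2 = np.empty(n), np.empty(n)
    for i in range(n):
        d = s2 - s1                                        # difference coupling term
        f1 = lorenz(s1, 10.0, 28.0, 8.0 / 3.0) + eps * d
        f2 = lorenz(s2, 10.0, 28.5, 8.0 / 3.0) - eps * d   # slight parameter mismatch
        s1, s2 = s1 + dt * f1, s2 + dt * f2                # explicit Euler, small step
        x1[i], x2[i] = s1[0], s2[0]
    return x1[n // 2:], x2[n // 2:]                        # discard the transient half

for eps in (0.0, 0.2, 2.0):
    x1, x2 = coupled_run(eps)
    r = np.corrcoef(x1, x2)[0, 1]                          # correlation index
    print(f"coupling strength {eps:3.1f} -> correlation index {r:+.3f}")
```

As the coupling strength is raised past the synchronization threshold, the correlation index of the coupled variables approaches one, which is the kind of behaviour the Letter quantifies with correlation index plots.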