882 results for advanced control technology
Abstract:
Increased penetration of distributed generation and decentralised control are considered a feasible and effective solution for reducing the cost and emissions associated with power generation and distribution, and hence for improving efficiency. Distributed generation in combination with multi-agent technology is a perfect candidate for this solution. The pro-active and autonomous nature of multi-agent systems can provide an effective platform for decentralised control whilst improving the reliability and flexibility of the grid.
Abstract:
This paper looks at how implant and electrode technology can be employed to create biological brains for robots, to enable human enhancement, and to diminish the effects of certain neural illnesses. In all cases the end result is to increase the range of abilities of the recipients. An indication is given of a number of areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking a biological brain directly with computer technology. The emphasis is placed on practical scientific studies that have been and are being undertaken and reported on. The area of focus is the use of electrode technology, where either a connection is made directly with the cerebral cortex and/or nervous system, or where implants into the human body are involved. The paper also considers robots that have biological brains, in which human neurons can be employed as the sole thinking machine for a real-world robot body.
Abstract:
Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitors or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.
Abstract:
In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, and so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered to be a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors occur due to complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers have an understanding of the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we attempt to analyse and model interactions that take place in en route ATC systems based on distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying certain measures by which to address these problems. This research focuses on the analysis of air traffic controllers' tasks for en route ATC and modelling controllers' cognitive processes.
Abstract:
Nanoscience and technology (NST) are widely cited to be the defining technology for the 21st century. In recent years, the debate surrounding NST has become increasingly public, with much of this interest stemming from two radically opposing long-term visions of a NST-enabled future: ‘nano-optimism’ and ‘nano-pessimism’. This paper demonstrates that NST is a complex and wide-ranging discipline, the future of which is characterised by uncertainty. It argues that consideration of the present-day issues surrounding NST is essential if the public debate is to move forwards. In particular, the social constitution of an emerging technology is crucial if any meaningful discussion surrounding costs and benefits is to be realised. An exploration of the social constitution of NST raises a number of issues, of which unintended consequences and the interests of those who own and control new technologies are highlighted.
Abstract:
Paraplegic subjects lack trunk stability due to the loss of voluntary muscle control. This leads to a restriction of the volume of bi-manual workspace available, and hence has a detrimental impact on activities of daily living. Electrical stimulation of paralysed muscles can be used to stabilize the trunk, but has never been applied in closed loop for this purpose. This paper describes the development of two closed-loop controllers (PID and LQR), and their experimental evaluation on a human subject. Advantages and disadvantages of the two are discussed, considering a potential use of this technology during daily activities.
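As a minimal sketch of what a discrete-time PID loop of the kind evaluated here might look like — the gains, sampling rate, and the mapping from trunk-lean angle to stimulation intensity are illustrative assumptions, not values from the paper:

```python
class PID:
    """Minimal discrete-time PID controller (illustrative gains)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # finite-difference D term
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical use: regulate trunk pitch via stimulation intensity at 100 Hz.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
stimulation = pid.update(setpoint=0.0, measurement=0.05)  # 0.05 rad of lean
```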
Abstract:
In this paper, various types of fault detection methods for fuel cells are compared: for example, those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT), and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem carries over to the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay of the singular values for three examples of fault classes.
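As a minimal, self-contained sketch of Fisher's linear discriminant applied to two fault classes — the synthetic data, sensor count, and ridge regularization term are assumptions for illustration, not the paper's setup:

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Fisher's linear discriminant direction for two classes.

    X0, X1: (n_samples, n_features) arrays of feature vectors,
    e.g. magnetic field measurements for normal and faulty cells.
    """
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (sum of per-class scatters)
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    # Small ridge term to cope with ill-posedness
    Sw += 1e-6 * np.eye(Sw.shape[0])
    w = np.linalg.solve(Sw, m1 - m0)   # discriminant direction
    threshold = w @ (m0 + m1) / 2      # midpoint decision threshold
    return w, threshold

# Synthetic example: 50 measurement vectors per class, 20 "sensors"
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, (50, 20))          # "healthy" class
X1 = rng.normal(0.5, 1.0, (50, 20))          # "faulty" class
w, t = fisher_discriminant(X0, X1)
is_faulty = rng.normal(0.5, 1.0, 20) @ w > t  # classify a new vector
```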
Abstract:
Industrial control systems increasingly depend on technology to maintain and monitor industrial, infrastructural, or environmental processes, and the need to secure these systems and identify threats to them is equally critical. Securing Critical Infrastructures and Critical Control Systems: Approaches for Threat Protection provides a full and detailed understanding of the vulnerabilities and security threats that exist within an industrial control system. This collection of research defines and analyzes the technical, procedural, and managerial responses to securing these systems.
Abstract:
The physical pendulum treated with a Hamiltonian formulation is a natural topic for study in a course in advanced classical mechanics. For the past three years, we have been offering a series of problem sets studying this system numerically in our third-year undergraduate courses in mechanics. The problem sets investigate the physics of the pendulum in ways not easily accessible without computer technology and explore various algorithms for solving mechanics problems. Our computational physics is based on Mathematica with some C communicating with Mathematica, although nothing in this paper is dependent on that choice. We have nonetheless found this system, and particularly its graphics, to be a good one for use with undergraduates.
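As a minimal sketch of the kind of numerical study described — integrating Hamilton's equations for the pendulum, H = p²/(2ml²) − mgl·cos θ, with a fixed-step fourth-order Runge–Kutta scheme; the course uses Mathematica with some C, but nothing depends on that choice, so Python is used here:

```python
import numpy as np

# Pendulum Hamiltonian: H(theta, p) = p**2 / (2*m*l**2) - m*g*l*cos(theta)
m, l, g = 1.0, 1.0, 9.81

def derivs(state):
    theta, p = state
    return np.array([p / (m * l**2),                # dtheta/dt =  dH/dp
                     -m * g * l * np.sin(theta)])   # dp/dt     = -dH/dtheta

def rk4_step(state, dt):
    k1 = derivs(state)
    k2 = derivs(state + 0.5 * dt * k1)
    k3 = derivs(state + 0.5 * dt * k2)
    k4 = derivs(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2*k2 + 2*k3 + k4)

state = np.array([2.5, 0.0])   # large-amplitude swing, released at rest
for _ in range(10_000):        # 10 s of motion at dt = 1 ms
    state = rk4_step(state, dt=1e-3)
```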
Abstract:
In the ten years since the first edition of this book appeared there have been significant developments in food process engineering, notably in biotechnology and membrane application. Advances have been made in the use of sensors for process control, and the growth of information technology and on-line computer applications continues apace. In addition, plant investment decisions are increasingly determined by quality assurance considerations and have to incorporate a greater emphasis on health and safety issues. The content of this edition has been rearranged to include descriptions of recent developments and to reflect the influence of new technology on the control and operations of automated plant. Original examples have been retained where relevant and these, together with many new illustrations, provide a comprehensive guide to good practice.
Abstract:
The rising share of intangibles in economies worldwide highlights the crucial role of knowledge-intensive and creative industries in current and future wealth generation. The recognition of this trend has led to intense competition in these industries. At the micro-level, firms from both advanced and emerging economies are globally dispersing their value chains to control costs and leverage capabilities. The geography of innovation is the outcome of a dynamic process whereby firms from emerging economies strive to catch up with advanced economy competitors, creating strong pressures for continued innovation. However, two distinct strategies can be discerned with regard to the control of the value chain. A vertical integration strategy emphasizes taking advantage of ‘linkage economies’ whereby controlling multiple value chain activities enhances the efficiency and effectiveness of each one of them. In contrast, a specialization strategy focuses on identifying and controlling the creative heart of the value chain, while outsourcing all other activities. The global mobile handset industry is used as the template to illustrate the theory.
Abstract:
There are well-known difficulties in measuring the moisture content of baked goods (such as bread, buns, biscuits, crackers and cake) during baking or at the oven exit. In this paper several sensing methods are discussed, but none of them is able to provide direct measurement with sufficient precision. An alternative is to use indirect inferential methods. Some of these methods involve dynamic modelling, incorporating thermal properties and using techniques familiar from computational fluid dynamics (CFD); a method of this class that has been used to model heat and mass transfer in one direction during baking is summarized, and it may be extended to model the transport of moisture within the product and within the surrounding atmosphere. The concept of injecting heat during the baking process in proportion to the calculated heat load on the oven, taking advantage of the high latent heat of evaporation of water, has been implemented in a control scheme based on a zone-by-zone heat balance through a continuous baking oven. Tests on biscuit production ovens are reported, with results that support the claim that the scheme gives a more reproducible water distribution in the final product than conventional closed-loop control of zone ambient temperatures, thus enabling the water content to be held more closely within tolerance.
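As a rough sketch of the zone-by-zone heat-balance idea — feedforward heat input sized to each zone's sensible and (dominant) latent load; all symbols and numbers are illustrative assumptions, not values from the paper:

```python
# Illustrative zone-by-zone heat balance for a continuous baking oven.
# All parameters are assumed for the sketch, not taken from the paper.
L_V = 2.26e6   # latent heat of evaporation of water, J/kg
C_P = 2.5e3    # approximate specific heat of dough, J/(kg*K)

def zone_heat_demand(mass_rate, moisture_loss, delta_T):
    """Heat load of one zone (W): sensible heating plus evaporation.

    mass_rate:      product throughput, kg/s
    moisture_loss:  water evaporated in this zone, kg water / kg product
    delta_T:        product temperature rise across the zone, K
    """
    sensible = mass_rate * C_P * delta_T
    latent = mass_rate * moisture_loss * L_V   # dominant term
    return sensible + latent

# Feedforward setpoints: inject heat in proportion to each zone's load.
zones = [(0.5, 0.02, 60.0), (0.5, 0.05, 30.0), (0.5, 0.03, 10.0)]
setpoints = [zone_heat_demand(*z) for z in zones]
```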
Abstract:
Embedded computer systems equipped with wireless communication transceivers are nowadays used in a vast number of application scenarios. Energy consumption is important in many of these scenarios, as systems are battery operated and long maintenance-free operation is required. To achieve this goal, embedded systems employ low-power communication transceivers and protocols. However, currently used protocols cannot operate efficiently when communication channels are highly erroneous. In this study, we show how average diversity combining (ADC) can be used in state-of-the-art low-power communication protocols. This novel approach improves transmission reliability and in consequence energy consumption and transmission latency in the presence of erroneous channels. Using a testbed, we show that highly erroneous channels are indeed a common occurrence in situations, where low-power systems are used and we demonstrate that ADC improves low-power communication dramatically.
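As a minimal sketch of the averaging idea behind diversity combining — soft values from repeated receptions of the same packet are averaged before the bit decision, raising the effective SNR; the BPSK signal model and function names are assumptions for illustration, not the protocol integration the study describes:

```python
import numpy as np

def average_diversity_combine(receptions):
    """Combine soft values from repeated transmissions of one packet.

    receptions: (n_copies, n_bits) array of noisy soft values
    (e.g. +1/-1 BPSK symbols plus channel noise). Averaging the
    copies before the hard decision raises the effective SNR.
    """
    combined = np.mean(receptions, axis=0)
    return combined > 0.0              # hard bit decision

# Illustration: three noisy copies of the same 64-bit packet.
rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 64)
symbols = 2.0 * bits - 1.0                        # BPSK mapping
copies = symbols + rng.normal(0, 1.2, (3, 64))    # three erroneous receptions
decoded = average_diversity_combine(copies)
errors = np.count_nonzero(decoded != bits.astype(bool))
```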
Abstract:
Mankind is facing an unprecedented health challenge in the current pandemic of obesity and diabetes. We propose that this is the inevitable (and predictable) consequence of the evolution of intelligence, which itself could be an expression of life being an information system driven by entropy. Because of its ability to make life more adaptable and robust, intelligence evolved as an efficient adaptive response to the stresses arising from an ever-changing environment. These adaptive responses are encapsulated by the epiphenomenon of “hormesis”, a phenomenon we believe to be central to the evolution of intelligence and essential for the maintenance of optimal physiological function and health. Thus, as intelligence evolved, it would eventually reach a cognitive level with the ability to control its environment through technology and to remove all stressors. In effect, it would act to remove the very hormetic factors that had driven its evolution. Mankind may have reached this point, creating an environmental utopia that has reduced the very stimuli necessary for optimal health and the evolution of intelligence – “the intelligence paradox”. One of the hallmarks of this paradox is of course the rising incidence of obesity, diabetes and the metabolic syndrome. This leads to the conclusion that wherever life evolves, here on earth or in another part of the galaxy, the “intelligence paradox” would be the inevitable side-effect of the evolution of intelligence. ET may not need to just “phone home” but may also need to “phone the local gym”. This suggests another possible reason to explain Fermi’s paradox: Enrico Fermi, the famous physicist, suggested in the 1950s that if extra-terrestrial intelligence was so prevalent, as was commonly believed at the time, then where was it? Our suggestion is that if advanced life has got going elsewhere in our galaxy, it can’t afford to explore the galaxy because it has to pay its healthcare costs.