970 results for Command
Abstract:
G-Rex is lightweight Java middleware that allows scientific applications deployed on remote computer systems to be launched and controlled as if they were running on the user's own computer. G-Rex is particularly suited to ocean and climate modelling applications because output from the model is transferred back to the user while the run is in progress, which prevents the accumulation of large amounts of data on the remote cluster. The G-Rex server is a RESTful Web application that runs inside a servlet container on the remote system, and the client component is a Java command-line program that can easily be incorporated into existing scientific workflow scripts. The NEMO and POLCOMS ocean models have been deployed as G-Rex services in the NERC Cluster Grid, and G-Rex is the core grid middleware in the GCEP and GCOMS e-science projects.
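Because the server is RESTful and output is returned while the run is in progress, a client can in principle watch a job with nothing more than plain HTTP requests. The Python sketch below illustrates this; the instance URL and JSON status fields are assumptions for illustration, not G-Rex's documented resource layout.

```python
import json
import time
import urllib.request

# Hypothetical instance URL; the real G-Rex resource layout is not given in this abstract.
BASE = "http://cluster.example.org/G-Rex/nemo/instances/42"

def poll_until_finished(interval_s: float = 30.0) -> None:
    """Poll a (hypothetical) status resource until the remote run ends."""
    while True:
        with urllib.request.urlopen(BASE + "/status") as resp:
            status = json.load(resp)  # assumed JSON body, e.g. {"state": ..., "progress": ...}
        print("state:", status.get("state"), "progress:", status.get("progress"))
        if status.get("state") in ("FINISHED", "FAILED"):
            break
        time.sleep(interval_s)

if __name__ == "__main__":
    poll_until_finished()
```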
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours on 40 processors and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission.

Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain.

G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans.

A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun" (see the sketch below). (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files using familiar analysis and visualization tools on his or her own local machine.

G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
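Step (3) above amounts to swapping the launch command inside an otherwise unchanged workflow. A minimal Python sketch of that drop-in substitution follows; the GRexRun argument structure and the service name are assumptions for illustration, not the documented G-Rex interface.

```python
import subprocess

USE_GREX = True  # launch through G-Rex rather than directly on the local cluster

def run_model(input_dir: str, processors: int = 40) -> None:
    """Launch one year of the model, either locally or via the G-Rex client."""
    if USE_GREX:
        # Hypothetical GRexRun arguments: service name, processor count, input directory.
        cmd = ["GRexRun", "nemo-service", "--np", str(processors), input_dir]
    else:
        # The direct launch the workflow script would otherwise have made.
        cmd = ["mpirun", "-np", str(processors), "./nemo.exe"]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_model("run_inputs/")
```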
Abstract:
A pamphlet published by the British Army's Strategic and Combat Studies Institute on the then Captain Orde Wingate's formation and command of the Anglo-Jewish Special Night Squads in the Palestine Arab revolt of 1936-1939, with a discussion of their long-term strategic and political implications.
Abstract:
The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows, and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
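The key efficiency claim is that output from one wrapped program can be streamed straight into the next rather than staged through intermediate files. The Python sketch below illustrates that idea with placeholder command names; it is not the actual Styx Grid Service interface.

```python
import subprocess

def streamed_pipeline() -> None:
    """Stream one program's output straight into the next, with no intermediate file."""
    # Placeholder commands standing in for two wrapped services.
    producer = subprocess.Popen(["extract_region"], stdout=subprocess.PIPE)
    consumer = subprocess.Popen(["compute_mean"], stdin=producer.stdout)
    producer.stdout.close()  # let the producer receive SIGPIPE if the consumer exits early
    consumer.wait()
    producer.wait()

if __name__ == "__main__":
    streamed_pipeline()
```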
Abstract:
A new autonomous ship collision-free (ASCF) trajectory navigation and control system is introduced, with a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free ship guidance. The underlying assumption is that the geometric information of the ship's environment is available in the form of a polygon-shaped free space, which may easily be generated from a 2D image or from plots relating to physical hazards or other constraints such as collision avoidance regulations. The navigation command is given as a heading command sequence based on generating a way point which falls within a small neighborhood of the current position, and the sequence of way points along the trajectory is guaranteed, using convex set theory, to lie within a bounded obstacle-free region. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by on-board sensors or external sensors (or a sensor fusion algorithm) and which controls the ship heading angle via the rudder deflection angle, is utilised in the simulation of an ESSO 190,000 dwt tanker model to demonstrate the effectiveness of the system.
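As a rough illustration of the convex-set constraint described above, the Python sketch below accepts a candidate way point only if it lies inside a convex obstacle-free polygon and within a small neighborhood of the current position. The geometry here is generic; it is not the paper's recursive navigation algorithm.

```python
from math import hypot

Point = tuple[float, float]

def inside_convex(poly: list[Point], p: Point) -> bool:
    """True if p lies inside a convex polygon given in counter-clockwise order."""
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        # Cross product of edge vector and vector to p; negative means p is right of the edge.
        if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) < 0:
            return False
    return True

def acceptable_way_point(free_space: list[Point], current: Point,
                         candidate: Point, radius: float) -> bool:
    """Accept a candidate way point only if it is near the current position and in free space."""
    near = hypot(candidate[0] - current[0], candidate[1] - current[1]) <= radius
    return near and inside_convex(free_space, candidate)
```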
Abstract:
Latin had no word for "strategy", but the East Romans, whom we call the Byzantines, did. This book tracks the evolution, from Antiquity to the present, of the concept that warfare should be subordinated to higher political aims, using Greek, Latin, French, Spanish, Italian, English and German sources. It tracks the rise, fall, and resurrection of the belief, in the Roman and later the medieval and early modern world, that warfare was only legitimate if it pursued the higher goal of a just peace, a belief which in the 19th century gave way to a blinkered concentration on military victory as the only war aim. It explains why one school of thought, from Antiquity to the present, emphasised eternal principles of warfare, while others emphasised, in Clausewitz's term, the "changing character of war". It traces ideas from land warfare to naval warfare, air power and nuclear thinking, but it also stresses great leaps and discontinuities in thinking about strategy. It covers asymmetric wars both from the point of view of the weaker power seeking to overthrow a stronger power, and from that of the stronger power dealing with insurgents and other numerically inferior forces. It concludes with a commentary on the long-known problems of bureaucratic politics, non-centralised command and inter-service rivalry, which since the 16th century or earlier have created obstacles to coherent strategy making.
Abstract:
This paper describes the design and implementation of an agent-based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. This work includes aspects from several research disciplines, including operational analysis, human-computer interaction, finite state modelling techniques, intelligent agents and computer-supported co-operative work. Aspects of these procedures have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite state automata. These automata have then been embedded within an agent network, together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.
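As a rough illustration of the kind of local finite state automaton such an agent might embed, the Python sketch below uses invented states and events for a switching task; the actual models in the paper are derived from task and user analysis.

```python
class SwitchingTaskFSM:
    """A tiny local automaton; states and events are invented for illustration."""
    TRANSITIONS = {
        ("IDLE", "request_isolation"): "AWAITING_CONFIRMATION",
        ("AWAITING_CONFIRMATION", "peer_confirms"): "SWITCHING",
        ("AWAITING_CONFIRMATION", "peer_rejects"): "IDLE",
        ("SWITCHING", "switch_complete"): "IDLE",
    }

    def __init__(self) -> None:
        self.state = "IDLE"

    def handle(self, event: str) -> str:
        # Unknown events leave the state unchanged.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```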
Abstract:
Across the world there are many bodies currently researching the design of autonomous guided vehicles (AGVs). One of the greatest problems at present, however, is that much of the research work is being conducted in isolated groups, with the resulting AGV sensor/control/command systems being almost completely non-transferable to other AGV designs. This paper describes a new modular method for robot design which, when applied to AGVs, overcomes these problems. The method is explained here with respect to all forms of robotics, but the examples have been specifically chosen to reflect typical AGV systems.
Abstract:
The National Grid Company plc. owns and operates the electricity transmission network in England and Wales, the day-to-day running of the network being carried out by teams of engineers within the national control room. The task of monitoring and operating the transmission network involves the transfer of large amounts of data and a high degree of cooperation between these engineers. The purpose of the research detailed in this paper is to investigate the use of interfacing techniques within the control room scenario, in particular the development of an agent-based architecture for the support of cooperative tasks. The proposed architecture revolves around the use of interface and user supervisor agents. Primarily, these agents are responsible for the flow of information to and from individual users and user groups. The agents are also responsible for tackling the synchronisation and control issues arising during the completion of cooperative tasks. In this paper a novel approach to human-computer interaction (HCI) for power systems incorporating an embedded agent infrastructure is presented. The agent architectures used to form the base of the cooperative task support system are discussed, as is the nature of the support system and the tasks it is intended to support.
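A minimal sketch of the routing role described above: a user supervisor agent fanning a message out to the interface agents of a cooperating user group. The class and method names are illustrative only, not taken from the paper.

```python
class InterfaceAgent:
    """Holds the messages destined for one control room engineer."""
    def __init__(self, user: str) -> None:
        self.user = user
        self.inbox: list[str] = []

    def deliver(self, message: str) -> None:
        self.inbox.append(message)

class UserSupervisorAgent:
    """Fans information out to the interface agents of a cooperating group."""
    def __init__(self) -> None:
        self.group: dict[str, InterfaceAgent] = {}

    def register(self, agent: InterfaceAgent) -> None:
        self.group[agent.user] = agent

    def broadcast(self, message: str) -> None:
        for agent in self.group.values():
            agent.deliver(message)
```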
Abstract:
The current research agenda for construction process improvement is heavily influenced by the rhetoric of business process re-engineering (BPR). In contrast to the wider literature on BPR, there is little evidence of critical thought within the construction management research community. A postmodernist interpretation is advocated whereby the reality of management practice is defined by the dominant management discourse. The persuasiveness of BPR rhetoric is analysed with particular reference to the way in which it plays on the insecurity of modern managers. Despite the lip service given to ‘empowerment’ and ‘teamwork’, the dominant theme of the re-engineering movement is that of technocratic totalitarianism. From a critical perspective, it is suggested that BPR is imposed on construction organizations to ensure continued control by the industry's dominant power groups. Whilst industry leaders are fond of calling for ‘attitudinal and cultural improvement’, the language of the accepted research agenda continually reinforces the industry's dominant culture of ‘control and command’. Therefore, current research directions in process improvement perpetuate existing attitudes rather than facilitating cultural change. The concept of lean construction is seen to be the latest manifestation of this phenomenon.
Abstract:
Intelligent viewing systems are required if efficient and productive teleoperation is to be applied to dynamic manufacturing environments. These systems must automatically provide remote views to an operator which assist in the completion of the task. This assistance increases the productivity of the teleoperation task if the robot controller is responsive to the unpredictable dynamic evolution of the workcell. Behavioral controllers can be utilized to give reactive 'intelligence'. The inherent complex structure of current systems, however, places considerable time overheads on any redesign of the emergent behavior. In industry, where the remote environment and task frequently change, this continual redesign process becomes inefficient. We introduce a novel behavioral controller, based on an 'ego-behavior' architecture, to command an active camera (a camera mounted on a robot) within a remote workcell. Using this ego-behavioral architecture, the responses from individual behaviors are rapidly combined to produce an 'intelligent', responsive viewing system. The architecture is single-layered, each behavior being autonomous with no explicit knowledge of the number, description or activity of the other behaviors present (if any). This lack of imposed structure decreases the development time, as it allows each behavior to be designed and tested independently before insertion into the architecture. The fusion mechanism provides the ability for each behavior to compete and/or co-operate with other behaviors for full or partial control of the active camera. Each behavior continually reassesses this degree of competition or co-operation by measuring its own success in controlling the active camera against pre-defined constraints. The ego-behavioral architecture is demonstrated through simulation and experimentation.
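A minimal sketch of the fusion idea described above, assuming a weighted blend of per-behavior camera commands and a simple self-adjusted weight; the numerical scheme is an illustration, not the paper's mechanism.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    pan_rate: float   # desired pan velocity for the active camera
    tilt_rate: float  # desired tilt velocity
    weight: float     # the behavior's own claim on the camera, in [0, 1]

def fuse(proposals: list[Proposal]) -> tuple[float, float]:
    """Blend the proposals into one camera command, weighted by each behavior's claim."""
    total = sum(p.weight for p in proposals) or 1.0
    pan = sum(p.weight * p.pan_rate for p in proposals) / total
    tilt = sum(p.weight * p.tilt_rate for p in proposals) / total
    return pan, tilt

def update_weight(p: Proposal, success: float, rate: float = 0.1) -> None:
    """Let a behavior rescale its own weight from its measured success, in [0, 1]."""
    p.weight = min(1.0, max(0.0, p.weight + rate * (success - 0.5)))
```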
Abstract:
Environmental policy in the United Kingdom (UK) is witnessing a shift from command-and-control approaches towards more innovation-orientated environmental governance arrangements. These governance approaches require institutions that support actors within a domain in learning not only about policy options, but also about their own interests and preferences. The need for construction actors to understand, engage with and influence this process is critical to establishing policies which support innovation that satisfies each constituent's needs. This capacity is particularly salient in an era where the expanding raft of environmental regulation is ushering in system-wide innovation in the construction sector. In this paper, the Code for Sustainable Homes (the Code) in the UK is used to demonstrate the emergence and operation of these new governance arrangements. The Code sets out a significant innovation challenge for the house-building sector with, for example, a requirement that all new houses must be zero-carbon by 2016. Drawing upon boundary organisation theory, the journey from the Code as a government aspiration to the Code as a catalyst for the formation of the Zero Carbon Hub (ZCH), a new institution, is traced and discussed. The case study reveals that the ZCH has demonstrated boundary organisation properties in its ability to be flexible to the needs and constraints of its constituent actors, yet robust enough to maintain and promote a common identity across regulation and industry boundaries.
Abstract:
Stroke is a medical emergency and can cause neurological damage affecting the motor and sensory systems. Harnessing brain plasticity should make it possible to reconstruct the closed loop between the brain and the body, i.e., associating the generation of the motor command with the somatic sensory feedback might enhance motor recovery. In order to aid reconstruction of this loop with a robotic device, it is necessary to assist the paretic side of the body at the right moment, to achieve simultaneity between the motor command and the feedback signal to the somatic sensory area of the brain. To this end, we propose an integrated EEG-driven assistive robotic system for stroke rehabilitation. Depending on the level of motor recovery, it is important to provide adequate stimulation for upper limb motion. Thus, we propose an assist arm incorporating a Magnetic Levitation Joint that can generate compliant motion owing to its levitation and mechanical redundancy. This paper reports on a feasibility study carried out to verify the validity of the robot sensing, and on EEG measurements conducted with healthy volunteers while performing a spontaneous arm flexion/extension movement. A characteristic feature was found in the temporal evolution of the EEG signal in single motions prior to the executed motion, which can aid in coordinating the timing of the onset of robotic arm assistance.
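A minimal sketch of how such a pre-motion EEG feature might be turned into an assistance trigger, assuming a simple amplitude-threshold detector on a single pre-processed channel; the paper only reports that a characteristic feature exists, not this detection rule.

```python
import numpy as np

def detect_onset(eeg: np.ndarray, fs: float, threshold: float) -> float | None:
    """Return the time (s) at which |eeg| first exceeds the threshold, or None if it never does."""
    above = np.abs(eeg) > threshold
    if not above.any():
        return None
    return int(np.argmax(above)) / fs

def maybe_trigger_assist(eeg: np.ndarray, fs: float, threshold: float) -> bool:
    """Start the assist arm when a pre-motion feature is detected in the EEG window."""
    onset = detect_onset(eeg, fs, threshold)
    if onset is not None:
        print(f"pre-motion feature at {onset:.3f} s -> start assist arm")
        return True
    return False
```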
Abstract:
This article explores the factors that affect the decisions of firm managers to adopt management practices in order to green their supply chain management. In the context of environmental policy, the relationship between policy instruments (‘command and control’, market-based, and self-regulated) and managers' decisions to adopt green supply chain management (G-SCM) practices is examined. The results show that in some cases environmental legislation, market-based instruments and self-regulated incentives could play a critical role in managers' decisions to adopt some specific G-SCM practices, while in other cases environmental policy instruments do not appear to affect managers' decisions regarding other G-SCM practices.