929 results for "Current systems"
Abstract:
In this paper, we report data drawn from a larger project on the functioning of the Queensland community service delivery system, particularly that providing services to people with disabilities. Our reasoning for focusing at this level is that, from the service user's perspective, support is derived from the service delivery system, not just individual service providers. Defining the service delivery system as formal services and informal support networks, we undertook interviews and focus groups with service providers in six areas in Queensland: inner urban, outer urban, rural and remote. The period on which we report is one in which considerable reform activity had been undertaken by funding bodies of the Commonwealth and State governments. We report on those factors we identified which promote the integrated functioning of the service delivery system, as well as those factors that disrupt it. We conclude with a brief evaluative analysis of the current status of the system.
Abstract:
For quantum systems with linear dynamics in phase space, much of classical feedback control theory applies. However, there are some questions that are sensible only for the quantum case: given a fixed interaction between the system and the environment, what is the optimal measurement on the environment for a particular control problem? We show that for a broad class of optimal (state-based) control problems (the stationary linear-quadratic-Gaussian class), this question is a semidefinite program. Moreover, the answer also applies to Markovian (current-based) feedback.
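The classical half of this correspondence can be made concrete: for linear dynamics, the optimal state-based (LQG) controller follows from an algebraic Riccati equation. A minimal Python sketch, with toy system matrices chosen purely for illustration (none of them come from the paper):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy linear system dx = (A x + B u) dt + noise: a damped double integrator.
A = np.array([[0.0, 1.0],
              [0.0, -0.1]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)           # quadratic state cost
R = np.array([[1.0]])   # quadratic control cost

# Stabilizing solution of the continuous-time algebraic Riccati equation.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal feedback gain u = -K x

# The closed loop A - B K must be stable for the LQG cost to be finite.
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(all(e.real < 0 for e in closed_loop_eigs))  # True
```

The quantum refinement the paper addresses, choosing the environment measurement itself, sits on top of this machinery and is solved as a semidefinite program rather than a Riccati equation.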
Abstract:
Several mechanisms for self-enhancing feedback instabilities in marine ecosystems are identified and briefly elaborated. It appears that adverse phases of operation may be abruptly triggered by explosive breakouts in abundance of one or more previously suppressed populations. Moreover, an evident capacity of marine organisms to accomplish extensive geographic habitat expansions may expand and perpetuate a breakout event. This set of conceptual elements provides a framework for interpretation of a sequence of events that has occurred in the Northern Benguela Current Large Marine Ecosystem (off south-western Africa). This history can illustrate how multiple feedback loops might interact with one another in unanticipated and quite malignant ways, leading not only to collapse of customary resource stocks but also to degradation of the ecosystem to such an extent that disruption of customary goods and services may go beyond fisheries alone to adversely affect other major global ecosystem concerns (e.g. proliferations of jellyfish and other slimy, stinging, toxic and/or noxious organisms, perhaps even climate change itself, etc.). The wisdom of management interventions designed to interrupt an adverse mode of feedback operation is pondered. Research pathways are proposed that may lead to improved insights needed: (i) to avoid potential 'triggers' that might set adverse phases of feedback loop operation into motion; and (ii) to diagnose and properly evaluate plausible actions to reverse adverse phases of feedback operation that might already have been set in motion. These pathways include the drawing of inferences from available 'quasi-experiments' produced either by short-term climatic variation or inadvertently in the course of biased exploitation practices, and inter-regional applications of the comparative method of science.
Abstract:
Real-time software systems are rarely developed once and left to run. They are subject to changes of requirements as the applications they support expand, and they commonly outlive the platforms they were designed to run on. A successful real-time system is duplicated and adapted to a variety of applications - it becomes a product line. Current methods for real-time software development are commonly based on low-level programming languages and involve considerable duplication of effort when a similar system is to be developed or the hardware platform changes. To provide more dependable, flexible and maintainable real-time systems at a lower cost, what is needed is a platform-independent approach to real-time systems development. The development process is composed of two phases: a platform-independent phase, which defines the desired system behaviour and develops a platform-independent design and implementation, and a platform-dependent phase, which maps the implementation onto the target platform. The latter phase should be highly automated. For critical systems, assessing dependability is crucial. The partitioning into platform-dependent and platform-independent phases has to support verification of system properties through both phases.
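The two-phase idea can be illustrated with a small sketch: the behaviour (here, a periodic task) is written against an abstract clock interface, and only the platform-dependent mapping of that interface changes per target. All names below (Clock, PeriodicTask, SimulatedClock) are invented for illustration, not taken from the paper:

```python
from abc import ABC, abstractmethod

# Platform-independent phase: behaviour is specified against an abstract
# interface, with no reference to any concrete RTOS or hardware timer.
class Clock(ABC):
    @abstractmethod
    def now_ms(self) -> int: ...

class PeriodicTask:
    """Runs an action once per period, expressed only in abstract time."""
    def __init__(self, clock: Clock, period_ms: int, action):
        self.clock, self.period_ms, self.action = clock, period_ms, action
        self.next_release = clock.now_ms()

    def step(self):
        if self.clock.now_ms() >= self.next_release:
            self.action()
            self.next_release += self.period_ms

# Platform-dependent phase: only this mapping changes per target platform.
# A simulated clock also makes the behaviour verifiable off-target.
class SimulatedClock(Clock):
    def __init__(self): self.t = 0
    def advance(self, ms): self.t += ms
    def now_ms(self): return self.t

clock = SimulatedClock()
ticks = []
task = PeriodicTask(clock, period_ms=10,
                    action=lambda: ticks.append(clock.now_ms()))
for _ in range(5):
    task.step()
    clock.advance(10)
print(ticks)  # [0, 10, 20, 30, 40]: one release per 10 ms period
```

Retargeting then means supplying a hardware-backed Clock; the PeriodicTask behaviour, and any properties verified against it, carry over unchanged.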
Abstract:
Land-surface processes encompass a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtain a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over an Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits with a field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
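As a sketch of the underlying numerics (not of MapScript itself, whose operators the paper defines), the following Python fragment propagates a circular front outward with a first-order upwind level set update; the grid size, speed, and time step are illustrative choices:

```python
import numpy as np

def evolve_front(phi, speed, dx, dt, steps):
    """First-order upwind update for phi_t + speed * |grad phi| = 0.

    For speed > 0 the zero contour of phi (the 'front') moves outward
    along its normal, as in the Osher-Sethian level set scheme.
    """
    for _ in range(steps):
        dmx = (phi - np.roll(phi, 1, axis=0)) / dx   # backward x difference
        dpx = (np.roll(phi, -1, axis=0) - phi) / dx  # forward x difference
        dmy = (phi - np.roll(phi, 1, axis=1)) / dx
        dpy = (np.roll(phi, -1, axis=1) - phi) / dx
        grad = np.sqrt(np.maximum(dmx, 0)**2 + np.minimum(dpx, 0)**2 +
                       np.maximum(dmy, 0)**2 + np.minimum(dpy, 0)**2)
        phi = phi - dt * speed * grad
    return phi

# Signed distance to a circle of radius 1 on a 2D grid (domain [-5, 5]^2).
n, dx = 200, 0.05
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
phi0 = np.sqrt(X**2 + Y**2) - 1.0

# CFL number speed*dt/dx = 0.4 keeps the explicit update stable.
phi = evolve_front(phi0, speed=1.0, dx=dx, dt=0.02, steps=25)
front_radius = np.sqrt(X**2 + Y**2)[np.abs(phi) < dx].mean()
print(round(front_radius, 2))  # close to 1.5 after t = 0.5 at unit speed
```

In the GIS setting described in the paper, phi becomes a raster field and the update becomes a sequence of map algebra operations over that field.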
Abstract:
E-Business Information Systems (eBIS) are Information Systems (IS) that support organizations in realizing their e-Business strategy, resulting in various benefits. Those systems therefore focus strongly on fulfilment of the e-Business requirements. In order to realize the expected benefits, organizations need to turn to their eBIS and measure the maturity of those systems. In doing so, they need to identify the status of those systems with regard to their suitability to support the e-Business strategy, while also identifying required IS improvements. In our research we aim to develop a maturity model, dedicated particularly to the area of e-Business Information Systems, which can be used easily and objectively to measure the current maturity of any Information System that supports e-Business. This research-in-progress paper presents the initial results of our research.
Abstract:
This paper proposes an architecture for pervasive computing which utilizes context information to provide adaptations based on vertical handovers (handovers between heterogeneous networks) while supporting application Quality of Service (QoS). The future of mobile computing will see an increase in ubiquitous network connectivity which allows users to roam freely between heterogeneous networks. One of the requirements for pervasive computing is to adapt computing applications or their environment if current applications can no longer be provided with the requested QoS. One possible adaptation is a vertical handover to a different network. Vertical handover operations include changing network interfaces on a single device or changes between different devices. Such handovers should be performed with minimal user distraction and minimal violation of communication QoS for user applications. The solution utilizes context information regarding user devices, user location, application requirements, and network environment. The paper shows how vertical handover adaptations are incorporated into the whole infrastructure of a pervasive system.
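A context-based handover decision of this kind can be sketched in a few lines. The network names, attributes, and thresholds below are illustrative assumptions, not part of the paper's architecture:

```python
from dataclasses import dataclass

@dataclass
class Network:
    name: str
    bandwidth_mbps: float   # currently measured downlink
    latency_ms: float
    cost: float             # relative monetary cost, 0..1

def meets_qos(net, app):
    """Does this network satisfy the application's QoS requirements?"""
    return (net.bandwidth_mbps >= app["min_bandwidth_mbps"]
            and net.latency_ms <= app["max_latency_ms"])

def select_network(current, candidates, app):
    """Trigger a vertical handover only when the current network can no
    longer provide the requested QoS; otherwise stay, minimising
    disruption to the user."""
    if meets_qos(current, app):
        return current
    viable = [n for n in candidates if meets_qos(n, app)]
    if not viable:
        return current           # degrade gracefully: no better option
    # Among viable networks prefer low cost, then low latency.
    return min(viable, key=lambda n: (n.cost, n.latency_ms))

wlan = Network("WLAN", bandwidth_mbps=0.3, latency_ms=40, cost=0.0)
umts = Network("UMTS", bandwidth_mbps=2.0, latency_ms=120, cost=0.5)
app = {"min_bandwidth_mbps": 1.0, "max_latency_ms": 200}
print(select_network(wlan, [umts], app).name)  # UMTS
```

A fuller realisation would feed this decision from the context sources the paper lists (device state, location, application requirements, network environment).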
Abstract:
NASA is working on complex future missions that require cooperation between multiple satellites or rovers. To implement these systems, developers are proposing and using intelligent and autonomous systems. These autonomous missions are new to NASA, and the software development community is just learning to develop such systems. With these new systems, new verification and validation techniques must be used. Current techniques have been developed based on large monolithic systems. These techniques have worked well and reliably, but do not translate to the new autonomous systems that are highly parallel and nondeterministic.
Abstract:
Although the current level of organic production in industrialised countries amounts to little more than 1-2 percent, it is recognised that one of the major issues shaping agricultural output over the next several decades will be the demand for organic produce (Dixon et al. 2001). In Australia, the issues of healthy food and environmental concern contribute to increasing demand and market volumes for organic produce. In Indonesia, however, the use of more economical inputs is a supply-side factor driving organic production. For individual growers and processors, conversion from conventional to organic agriculture is often a challenging step, entailing a thorough revision of established practices and heightened market insecurity. This paper examines the potential for a systems approach to the analysis of the conversion process, to yield insights for household and community decisions. A framework for applying farming systems research to investigate the benefits of organic production in both Australia and Indonesia is discussed. The framework incorporates scope for farmer participation, crucial to the understanding of farming systems; analysis of production; and relationships to resources, technologies, markets, services, policies and institutions in their local cultural context. A systems approach offers the potential to internalise the external effects that may be constraining decisions to convert to organic production, and for the design of decision-making tools to assist households and the community. Systems models can guide policy design and serve as a mechanism for predicting the impact of changes to the policy and market environments. The increasing emphasis of farming systems research on community and environment in recent years is in keeping with the proposed application to organic production, processing and marketing issues.
The approach will also facilitate the analysis of critical aspects of the Australian production, marketing and policy environment, and the investigation of these same features in an Indonesian context.
Abstract:
The development of models in the Earth Sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from finalized. Therefore there is a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. After being verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually, this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. the Newton-Raphson scheme or the Crank-Nicolson scheme), or to choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, which is a prototype implementation of such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class.
Objects of the Data class allow generating, holding, accessing, and manipulating data in such a way that the actual representation, whichever is best in the particular context, is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples which illustrate the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
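The three-layer split can be sketched in Python with a generic sparse solver standing in for the numerical algorithm layer and a 1D Poisson problem standing in for the model; this illustrates the layering idea only, and is not escript's actual API:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# --- Numerical algorithm layer: a generic linear solver kernel. ---
def solve_linear(A, b):
    return spsolve(A.tocsr(), b)

# --- Mathematical layer: the model developer states the PDE. ---
def poisson_1d(f, n):
    """Assemble -u'' = f on (0,1) with u(0) = u(1) = 0, n interior points,
    using second-order central finite differences."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2
    return x, A, f(x)

# --- Application layer: parameters a user could set through a GUI. ---
x, A, b = poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x), n=99)
u = solve_linear(A, b)

# Exact solution is sin(pi x); the scheme is second-order accurate.
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err < 2e-3)  # True
```

Swapping the application-layer parameters, the model definition, or the solver kernel each touches exactly one layer, which is the maintainability argument the abstract makes.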
Abstract:
The physical implementation of quantum information processing is one of the major challenges of current research. In the last few years, several theoretical proposals and experimental demonstrations on a small number of qubits have been carried out, but a quantum computing architecture that is straightforwardly scalable, universal, and realizable with state-of-the-art technology is still lacking. In particular, a major ultimate objective is the construction of quantum simulators, yielding massively increased computational power in simulating quantum systems. Here we investigate promising routes towards the actual realization of a quantum computer, based on spin systems. The first one employs molecular nanomagnets with a doublet ground state to encode each qubit and exploits the wide chemical tunability of these systems to obtain the proper topology of inter-qubit interactions. Indeed, recent advances in coordination chemistry allow us to arrange these qubits in chains, with tailored interactions mediated by magnetic linkers. These act as switches of the effective qubit-qubit coupling, thus enabling the implementation of one- and two-qubit gates. Molecular qubits can be controlled either by uniform magnetic pulses or by local electric fields. We introduce here two different schemes for quantum information processing with either global or local control of the inter-qubit interaction and demonstrate the high performance of these platforms by simulating the system time evolution with state-of-the-art parameters. The second architecture we propose is based on a hybrid spin-photon qubit encoding, which combines the best characteristics of photons, whose mobility is exploited to efficiently establish long-range entanglement, and of spin systems, which ensure long coherence times. The setup consists of spin ensembles coherently coupled to single photons within superconducting coplanar waveguide resonators.
The tunability of the resonators' frequency is exploited as the only manipulation tool to implement a universal set of quantum gates, by bringing the photons into and out of resonance with the spin transition. The time evolution of the system subject to the pulse sequence used to implement complex quantum algorithms has been simulated by numerically integrating the master equation for the system density matrix, thus including the harmful effects of decoherence. Finally, a scheme to overcome the leakage of information due to inhomogeneous broadening of the spin ensemble is pointed out. Both of the proposed setups are based on state-of-the-art technological achievements. By extensive numerical experiments we show that their performance is remarkably good, even for the implementation of long sequences of gates used to simulate interesting physical models. Therefore, the systems examined here are promising building blocks of future scalable architectures and can be used for proof-of-principle experiments of quantum information processing and quantum simulation.
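The kind of master-equation integration mentioned above can be sketched for the simplest possible case: a single qubit with one decay (amplitude-damping) channel and no drive. This is an illustrative toy in the Lindblad form, not the paper's full pulse-sequence simulation:

```python
import numpy as np

# Lowering operator |0><1| and decay rate (illustrative values).
sm = np.array([[0, 1], [0, 0]], dtype=complex)
sp = sm.conj().T
gamma = 0.5
H = np.zeros((2, 2), dtype=complex)   # no coherent drive in this toy

def lindblad_rhs(rho):
    """Right-hand side of the Lindblad master equation, one decay channel:
    drho/dt = -i[H, rho] + gamma (sm rho sp - {sp sm, rho}/2)."""
    comm = -1j * (H @ rho - rho @ H)
    diss = gamma * (sm @ rho @ sp - 0.5 * (sp @ sm @ rho + rho @ sp @ sm))
    return comm + diss

# Start in the excited state |1><1| and integrate with classical RK4.
rho = np.array([[0, 0], [0, 1]], dtype=complex)
dt, T = 0.001, 2.0
for _ in range(int(T / dt)):
    k1 = lindblad_rhs(rho)
    k2 = lindblad_rhs(rho + 0.5 * dt * k1)
    k3 = lindblad_rhs(rho + 0.5 * dt * k2)
    k4 = lindblad_rhs(rho + dt * k3)
    rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Excited-state population should decay as exp(-gamma t).
excited = rho[1, 1].real
print(abs(excited - np.exp(-gamma * T)) < 1e-6)  # True
```

The simulations described in the abstract follow the same pattern but with multi-qubit Hamiltonians, several decay and dephasing channels, and time-dependent pulse terms in H.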
Abstract:
Original Paper: European Journal of Information Systems (2001) 10, 135–146; doi:10.1057/palgrave.ejis.3000394. "Organisational learning—a critical systems thinking discipline", P Panagiotidis (Deloitte and Touche, Athens, Greece) and J S Edwards (Aston Business School, Aston University, Birmingham, UK).
This paper deals with the application of critical systems thinking in the domain of organisational learning and knowledge management. Its viewpoint is that deep organisational learning only takes place when the business systems' stakeholders reflect on their actions and thus inquire about their purpose(s) in relation to the business system and the other stakeholders they perceive to exist. This is done by reflecting both on the sources of motivation and/or deception that are contained in their purpose, and also on the sources of collective motivation and/or deception that are contained in the business system's purpose. The development of an organisational information system that captures, manages and institutionalises meaningful information—a knowledge management system—cannot be separated from organisational learning practices, since it should be the result of these very practices. Although Senge's five disciplines provide a useful starting-point in looking at organisational learning, we argue for a critical systems approach, instead of an uncritical System Dynamics one that concentrates only on the organisational learning practices. We proceed to outline a methodology called Business Systems Purpose Analysis (BSPA) that offers a participatory structure for team and organisational learning, upon which the stakeholders can take legitimate action that is based on the force of the better argument. In addition, the organisational learning process in BSPA leads to the development of an intrinsically motivated organisational information system that allows for the institutionalisation of the learning process itself in the form of an organisational knowledge management system.
This could be a specific application, or something as wide-ranging as an Enterprise Resource Planning (ERP) implementation. Examples of the use of BSPA in two ERP implementations are presented.
Abstract:
Purpose - To consider the role of technology in knowledge management in organizations, both actual and desired. Design/methodology/approach - Facilitated, computer-supported group workshops were conducted with 78 people from ten different organizations. The objective of each workshop was to review the current state of knowledge management in that organization and develop an action plan for the future. Findings - Only three organizations had adopted a strongly technology-based "solution" to knowledge management problems, and these followed three substantially different routes. There was a clear emphasis on the use of general information technology tools to support knowledge management activities, rather than the use of tools specific to knowledge management. Research limitations/implications - Further research is needed to help organizations make best use of generally available software such as intranets and e-mail for knowledge management. Many issues, especially human, relate to the implementation of any technology. Participation was restricted to organizations that wished to produce an action plan for knowledge management. The findings may therefore represent only "average" organizations, not the very best practice. Practical implications - Each organization must resolve four tensions: Between the quantity and quality of information/knowledge, between centralized and decentralized organization, between head office and organizational knowledge, and between "push" and "pull" processes. Originality/value - Although it is the group rather than an individual that determines what counts as knowledge, hardly any previous studies of knowledge management have collected data in a group context.
Abstract:
This chapter discusses the current state of biomass-based combined heat and power (CHP) production in the UK. It presents an overview of the UK's energy policy and targets which are relevant to the deployment of biomass-based CHP and summarises the current state of renewables, biomass and CHP. A number of small-scale biomass-based CHP projects are described, together with some indicative capital costs for combustion, pyrolysis and gasification technologies. For comparison purposes, it presents an overview of the respective situation in Europe, particularly in Sweden, Finland and Denmark. There is also a brief comment on novel CHP technologies in Austria. Finally, it draws some conclusions on the potential of small-scale biomass CHP in the UK. © 2011 Woodhead Publishing Limited. All rights reserved.
Abstract:
This thesis is devoted to the tribology of the head-to-tape interface of linear tape recording systems, the OnStream ADR system being used as an experimental platform. Combining experimental characterisation with computer modelling, a comprehensive picture of the mechanisms involved in a tape recording system is drawn. The work is designed to isolate the mechanisms responsible for the physical spacing between head and tape, with the aim of minimising spacing losses and errors and optimising signal output. Standard heads (used in current ADR products) and prototype heads (DLC- and SPL-coated heads, and dummy heads built from Al2O3-TiC and alternative single-phase ceramics intended to constitute the head tape-bearing surface) are tested in a controlled environment for up to 500 hours (exceptionally 1000 hours). Evidence of wear on the standard head is mainly observable as preferential wear of the TiC phase of the Al2O3-TiC ceramic. The TiC grains are believed to delaminate due to a fatigue wear mechanism, a hypothesis further confirmed via modelling, which locates the maximum von Mises equivalent stress at a depth equivalent to the TiC recession (20 to 30 nm). Debris of delaminated TiC residues is moreover found trapped within the pole-tip recession, and is therefore assumed to provide three-body abrasive particles, thus increasing the pole-tip recession. Iron-rich stain is found over the cycled standard head surface (preferentially over the pole-tip and, to a lesser extent, over the TiC grains) under all environmental conditions except high temperature/humidity, where mainly organic stain is apparent. Temperature (locally or globally) affects staining rate and aspect; stain transfer is generally promoted at high temperature. Humidity affects transfer rate and quantity; low humidity produces thinner stains at a higher rate. Stain generally targets preferentially those head materials with high electrical conductivity, i.e. Permalloy and TiC.
Stains are found to decrease the friction at the head-to-tape interface, delay the TiC recession hollow-out and act as a protective soft coating reducing the pole-tip recession. This is, however, at the expense of an additional spacing at the head-to-tape interface of the order of 20 nm. Two kinds of wear-resistant coating are tested: diamond-like carbon (DLC) and a superprotective layer (SPL), 10 nm and 20 to 40 nm thick, respectively. The DLC coating disappears within 100 hours, possibly due to abrasive and fatigue wear. SPL coatings are generally more resistant, particularly at high temperature and low humidity, possibly in relation to stain transfer. 20 nm coatings are found to rely on the substrate wear behaviour, whereas 40 nm coatings are found to rely on the adhesive strength at the coating/substrate interface. These observations seem to locate the wear-driving forces 40 nm below the surface, and hence indicate that for coatings in the 10 nm thickness range (i.e. compatible with high-density recording) the substrate resistance must be taken into account. Single-phase ceramics as candidates for a wear-resistant tape-bearing surface are tested in the form of full-contour dummy heads. The absence of a second phase eliminates the preferential wear observed at the Al2O3-TiC surface; very low wear rates and no evidence of brittle fracture are observed.
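The modelling claim that the maximum von Mises stress sits below the surface reflects a standard feature of Hertzian contact. A short sketch of the on-axis stresses for a sphere-on-flat contact, using the classical contact-mechanics expressions in normalised units with an assumed Poisson ratio of 0.3 (the thesis's actual head/tape geometry and materials are not reproduced here):

```python
import numpy as np

nu = 0.3    # Poisson's ratio (assumed)
p0 = 1.0    # peak Hertzian pressure (normalised)
a = 1.0     # contact radius (normalised)

# Stresses on the symmetry axis of a Hertzian sphere-on-flat contact;
# by symmetry sigma_r = sigma_theta there.
z = np.linspace(1e-6, 3 * a, 3000)
u = z / a
sig_z = -p0 / (1 + u**2)
sig_r = -p0 * ((1 + nu) * (1 - u * np.arctan(1 / u)) - 0.5 / (1 + u**2))

# With two equal principal stresses, von Mises reduces to |sig_z - sig_r|.
von_mises = np.abs(sig_z - sig_r)

i = np.argmax(von_mises)
print(round(z[i] / a, 2), round(von_mises[i] / p0, 2))  # ~0.48, ~0.62
```

The maximum falls at roughly half the contact radius below the surface, which is why fatigue-driven delamination, as hypothesised for the TiC grains, initiates subsurface rather than at the interface itself.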