811 results for Distribution Management System


Relevance:

90.00%

Publisher:

Abstract:

The aim of this study was to explore how the remote control of appliances/lights (active energy management system) affected household well-being, compared to in-home displays (passive energy management system). A six-week exploratory study was conducted with 14 participants divided into the following three groups: active; passive; and no equipment. The effect on well-being was measured through thematic analysis of two semi-structured interviews for each participant, administered at the start and end of the study. The well-being themes were based on existing measures of Satisfaction and Affect. The energy demand for each participant was also measured for two weeks without intervention, and then compared after four weeks with either the passive or active energy management systems. These measurements were used to complement the well-being analysis. Overall, the measure of Affect increased in the passive group but Satisfaction decreased; however, all three measures on average decreased in the active group. The measured energy demand also highlighted a disconnect between well-being and domestic energy consumption. The results point to a need for further investigation in this field; otherwise, there is a risk that nationally implemented energy management solutions may negatively affect our happiness and well-being. © 2013 Elsevier Ltd.

Relevance:

90.00%

Publisher:

Abstract:

Due to concerns about environmental protection and resource utilization, product lifecycle management for end-of-life (EOL) has received increasing attention in many industrial sectors including manufacturing, maintenance/repair, and recycling/refurbishing of the product. To support these functions, crucial issues are studied to realize a product recovery management system (PRMS), including: (1) an architecture design for EOL services, such as remanufacturing and recycling; (2) a product data model required for EOL activity based on international standards; and (3) an infrastructure for information acquisition and mapping to product lifecycle information. The presented works are illustrated via a realistic scenario. © 2008 Elsevier B.V. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

Knowledge management is a critical issue for next-generation web applications, because the next-generation web is becoming a semantic web: a knowledge-intensive network. XML Topic Map (XTM), a new standard, is emerging in this field as one of the structures for the semantic web. It organizes information in a way that can be optimized for navigation. In this paper, a new set of hyper-graph operations on XTM (HyO-XTM) is proposed to manage distributed knowledge resources. HyO-XTM is based on the XTM hyper-graph model and is readily applied on top of XTM to simplify the workload of knowledge management. The application of the XTM hyper-graph operations is demonstrated through the knowledge management system of a consulting firm. HyO-XTM shows the potential to lead knowledge management toward the next-generation web.
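The hyper-graph view of a topic map can be sketched in a few lines: topics are nodes, and an association is a hyper-edge linking any number of topics. The sketch below is purely illustrative (the class and method names are ours, not HyO-XTM's actual operation set, which the abstract does not specify); it shows why hyper-graph union is a natural way to merge distributed knowledge resources.

```python
# Minimal sketch of a topic map as a hyper-graph (illustrative, not HyO-XTM's API).
# Topics are nodes; associations are hyper-edges linking any number of topics.

class TopicMap:
    def __init__(self):
        self.topics = set()        # nodes
        self.associations = []     # hyper-edges: frozensets of topic names

    def associate(self, *names):
        """Add a hyper-edge linking the given topics (creating them as needed)."""
        self.topics.update(names)
        self.associations.append(frozenset(names))

    def merge(self, other):
        """Hyper-graph union: one way two distributed maps could be combined."""
        merged = TopicMap()
        merged.topics = self.topics | other.topics
        merged.associations = list({*self.associations, *other.associations})
        return merged

    def neighbours(self, name):
        """Topics reachable from `name` through any shared hyper-edge --
        the basis of navigation-optimized browsing."""
        out = set()
        for edge in self.associations:
            if name in edge:
                out |= edge
        out.discard(name)
        return out
```

Merging two maps built by different knowledge sources then reduces to one `merge` call, after which navigation queries see both sources at once.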

Relevance:

90.00%

Publisher:

Abstract:

The national science project HIRFL-CSR has recently been officially approved. As a cyclotron and synchrotron complex, it places particularly high demands on the control system: hundreds of pieces of equipment need to be synchronized. An integrated timing control system has been built to meet these demands. The output rate and resolution of the controller are 16 bit/μs, and the accuracy of the time delay reaches 40 ns. The timing control system is based on a typical event distribution system, adopting a new event generation and distribution scheme. The scheme of the timing control system, its innovations, its architecture, and the implementation method are presented in this paper.

Relevance:

90.00%

Publisher:

Abstract:

This paper discusses the definition and use of the term 'integrated management' in the context of coastal and ocean resources. It identifies several components which appear to be needed to establish an integrated management system for a large area subject to multiple use and jurisdiction. It suggests that the basis of integrated management should be a clear articulation of common purpose which addresses long-term needs and vision. Once developed, this common purpose should be securely established to provide the setting against which sectoral and agency managers and the community conduct and co-ordinate their activities.

Relevance:

90.00%

Publisher:

Abstract:

Through extensive study and design, a technical plan for establishing the exploration database center was developed, combining imported and in-house techniques. Through research and repeated experiment, a modern database center has been set up, with high-performance hardware and networking, a well-configured system, complete data storage and management, and fast, direct data support. Based on study of decision theory, methods, and models, an exploration decision-assistance schema was designed, and one decision application, the well location decision support system, was evaluated and put into operation.

1. Establishment of the Shengli exploration database center. The hardware configuration of the database center, including its workstations and all connected hardware and systems, was studied. The hardware of the center is formed by connecting workstations, microcomputer workstations, disk arrays, and the equipment used for seismic processing and interpretation. Research on data storage and management covered analysis of the contents to be managed, data flow, data standards, data quality control, backup and restore policy, and optimization of the database system. Reasonable data management regulations and workflows were established, creating a scientific exploration data management system. Data loading followed a fixed schedule, and more than 200 seismic survey projects, amounting to 25 TB, have been loaded.

2. Exploration work support system and its application. The seismic data processing support system provides automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of arbitrarily sized data cubes, a pseudo huge-capacity disk array, standard output exchange formats, and more. Prestack data can be accessed directly by the processing system, or transferred to other processing systems through a standard exchange format. Support for seismic interpretation includes automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database center for real-time access to seismic, formation, and well data. Comprehensive geological study support is provided through the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyze, and display production data, with its core technology in controlled data collection and the creation of multiple standard report forms.

3. Design of the exploration decision support system. By classifying the workflows and data flows of all exploration stages, and studying decision theory and methods, together with the target, model, and requirements of each decision step, three conceptual models were formed for the Shengli exploration decision support system: the exploration distribution support system, the well location support system, and the production management support system. The well location decision support system has passed evaluation and been put into operation.

4. Technical advances. The hardware and software of the database center are matched for high performance. By combining a parallel computing system, database servers, a huge-capacity ATL, disk arrays, networking, and a firewall, the first exploration database center in China was created, with a reasonable configuration, high performance, and the ability to manage the complete exploration data set. Technology for managing very large exploration data sets has been developed, with exploration data standards and management regulations guaranteeing data quality, safety, and security. A multifunction query and support system provides comprehensive exploration information support, covering geological study, seismic processing and interpretation, and production management. Throughout the system, new database and computing technologies provide real-time information support for exploration work. Finally, the Shengli exploration decision support system was designed.

5. Application and benefits. Data storage has reached 25 TB, with thousands of users in the Shengli oil field accessing data and improving work efficiency severalfold. The technology has also been adopted by many other units of SINOPEC. Providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area" shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with full information support from the database center. Providing previously processed results to the project "Pre-stack Depth Migration in the Guxi Fracture Zone" reduced repeated processing, shortened the work period by one month, improved processing precision and quality, and saved about 30 million yuan in data processing investment. Automatically providing a project database for the project "Geological and Seismic Study of the Southern Slope Zone of the Dongying Sag" shortened data preparation time so that researchers had more time for research, improving interpretation precision and quality.

Relevance:

90.00%

Publisher:

Abstract:

The Synthetic Geology Information System (SGIS) is an important constituent of the theory of Engineering Geomechanics Meta-Synthesis (EGMS), and is the information system best suited to the collection, storage, management, analysis, and processing of information from engineering geology, geological engineering, and geotechnical engineering. Its contents involve the various tasks and methods of investigation, design, and construction in the different stages of geological engineering. Engineering-geological three-dimensional modeling and visualization is the fundamental part of SGIS: a theory, method, and technique by which, using computer graphics and image processing, data from engineering geological surveys and results from geomechanical numerical simulation and analysis are converted into graphics and images that are displayed on screen and can be processed interactively. In this paper, the significance of, and approaches to, three-dimensional modeling and visualization of complex geological masses in engineering geology are discussed, and methods of interpolating and fitting scattered field-survey data to simulate geological layers, such as the topography and ground surface, the groundwater table, and stratum boundaries, are investigated. Bearing in mind the structure of the basic data for three-dimensional modeling, visual data management can be decomposed into an engineering survey database management module, a plot parameter management module, and a data output module, which together fulfil the requirements of basic data management.

The paper also tentatively explores the establishment and development of a three-dimensional geological information system, and presents an instance of a three-dimensional visual Engineering Distribution Information System (EDIS), the Construction Management Information System for an airport. Developed with component-based three-dimensional virtual-reality geological information system (GIS) software development kits (SDK), it provides functions such as real-time browsing of three-dimensional virtual-reality landscapes of the airport construction from start to finish, information queries on the airport facilities and the buildings in the housing district, and recording and playback of animation sequences for browsing and for the takeoff and landing of planes, thereby providing a three-dimensional visual management platform for the airport construction. Moreover, in connection with three-dimensional topography visualization and its application on the Sichuan-Tibet Highways, the paper describes the collection of digital elevation model (DEM) data from topographic maps, and achieves three-dimensional visualization and roaming of the terrain along the highway through computer programming. This deepens understanding of the important role played by the varied and unique topographic conditions in the gestation and germination of the densely distributed, frequently occurring, and severely damaging geological hazards.
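The core of simulating a geological layer from scattered survey points is interpolation over irregular data. As a hedged illustration (the paper does not name its interpolation scheme; inverse distance weighting is just one common choice, and the function names are ours), a surface such as a groundwater table can be gridded like this:

```python
# Illustrative sketch: interpolate a geological surface (e.g. a stratum
# boundary or groundwater table) from scattered survey points using inverse
# distance weighting -- one simple interpolation/fitting method, not
# necessarily the one used by SGIS.
import math

def idw(points, x, y, power=2.0):
    """points: list of (px, py, elevation); returns interpolated elevation."""
    num = den = 0.0
    for px, py, z in points:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return z                    # exact hit on a survey point
        w = 1.0 / d ** power
        num += w * z
        den += w
    return num / den

def grid_surface(points, xs, ys):
    """Sample the surface on a regular grid (rows = y, cols = x),
    e.g. to build a DEM-style layer for 3-D visualization."""
    return [[idw(points, x, y) for x in xs] for y in ys]
```

A regular grid produced this way can then be triangulated and rendered, which is how scattered borehole or survey data typically becomes a displayable 3-D layer.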

Relevance:

90.00%

Publisher:

Abstract:

Dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Quality Management.

Relevance:

90.00%

Publisher:

Abstract:

The exploding demand for services like the World Wide Web reflects the potential presented by globally distributed information systems. The number of WWW servers worldwide has doubled every 3 to 5 months since 1993, outstripping even the growth of the Internet. At each of these self-managed sites, the Common Gateway Interface (CGI) and Hypertext Transfer Protocol (HTTP) already constitute a rudimentary basis for contributing local resources to remote collaborations. However, the Web has serious deficiencies that make it unsuited for use as a true medium for metacomputing: the process of bringing hardware, software, and expertise from many geographically dispersed sources to bear on large-scale problems. These deficiencies are, paradoxically, the direct result of the very simple design principles that enabled its exponential growth. There are many symptoms of the problems exhibited by the Web: disk and network resources are consumed extravagantly; information search and discovery are difficult; protocols are aimed at data movement rather than task migration, and ignore the potential for distributing computation. All of these, however, can be seen as aspects of a single problem: as a distributed system for metacomputing, the Web offers unpredictable performance and unreliable results. The goal of our project is to use the Web as a medium (within either the global Internet or an enterprise intranet) for metacomputing in a reliable way, with performance guarantees. We attack this problem on four levels:

(1) Resource Management Services. Globally distributed computing allows novel approaches to the old problems of performance guarantees and reliability. Our first set of ideas involves setting up a family of real-time resource management models organized by the Web Computing Framework, with a standard Resource Management Interface (RMI), a Resource Registry, a Task Registry, and resource management protocols that allow resource needs and availability information to be collected and disseminated, so that a family of algorithms with varying computational precision and accuracy of representation can be chosen to meet real-time and reliability constraints.

(2) Middleware Services. Complementary to techniques for allocating and scheduling available resources to serve application needs under real-time and reliability constraints, the second set of ideas aims at reducing communication latency, traffic congestion, server workload, and so on. We develop customizable middleware services that exploit application characteristics in traffic analysis to drive new server/browser design strategies (e.g., exploiting the self-similarity of Web traffic), derive document access patterns via multi-server cooperation, and use them in speculative prefetching, document caching, and aggressive replication to reduce server load and bandwidth requirements.

(3) Communication Infrastructure. To achieve any guarantee of quality of service or performance, one must reach the network layer, which provides the basic guarantees of bandwidth, latency, and reliability. The third area is therefore a set of new techniques in network service and protocol design.

(4) Object-Oriented Web Computing Framework. A useful resource management system must deal with job priority, fault tolerance, quality of service, complex resources such as ATM channels, probabilistic models, and so on, and models must be tailored to represent the best tradeoff for a particular setting. This requires a family of models organized within an object-oriented framework, because no one-size-fits-all approach is appropriate. This presents a software engineering challenge requiring the integration of solutions at all levels: algorithms, models, protocols, and profiling and monitoring tools. The framework captures the abstract class interfaces of the collection of cooperating components, but allows the concretization of each component to be driven by the requirements of a specific approach and environment.
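The middleware idea of deriving document access patterns and using them for speculative prefetching can be sketched with a first-order successor model: count which document tends to follow which, then prefetch the most likely successor. This is only an illustration of the general technique, under our own naming; the project's actual pattern-mining design is not specified here.

```python
# Hedged sketch of access-pattern-driven speculative prefetching:
# a first-order model of "which document follows which" in the request log.
from collections import defaultdict

class PrefetchPredictor:
    def __init__(self):
        # follows[a][b] = number of times request b directly followed request a
        self.follows = defaultdict(lambda: defaultdict(int))
        self.last = None

    def observe(self, doc):
        """Record one request from the access log."""
        if self.last is not None:
            self.follows[self.last][doc] += 1
        self.last = doc

    def predict(self, doc):
        """Most frequent successor of `doc` (the prefetch candidate),
        or None if `doc` has no observed successors."""
        succ = self.follows.get(doc)
        if not succ:
            return None
        return max(succ, key=succ.get)
```

A server or proxy would call `observe` on each request and, on a hit for `doc`, speculatively fetch `predict(doc)` into its cache, trading bandwidth for latency exactly as the abstract describes.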

Relevance:

90.00%

Publisher:

Abstract:

Wireless sensor networks (WSN) are becoming widely adopted for many applications, including complicated tasks like building energy management. However, one major concern for WSN technologies is the short lifetime and high maintenance cost caused by limited battery energy. One solution is to scavenge ambient energy, which is then rectified to power the WSN. The objective of this thesis was to investigate the feasibility of an ultra-low-energy-consumption power management system suitable for harvesting sub-mW photovoltaic and thermoelectric energy to power WSNs. To achieve this goal, energy harvesting system architectures have been analyzed. Detailed analysis of energy storage units (ESU) led to an innovative ESU solution for the target applications. A battery-less, long-lifetime ESU and its associated power management circuitry, including a fast-charge circuit, a self-start circuit, an output voltage regulation circuit, and a hybrid ESU combining a supercapacitor with a thin-film battery, were developed to achieve continuous operation of the energy harvester. Low start-up voltage DC/DC converters were developed for 1 mW-level thermoelectric energy harvesting. A novel method of altering the thermoelectric generator (TEG) configuration in order to match impedance has been verified in this work. Novel maximum power point tracking (MPPT) circuits, exploiting the fractional open-circuit voltage method, were developed specifically for sub-1 mW photovoltaic energy harvesting applications. The MPPT energy model has been developed and verified against both SPICE simulation and implemented prototypes. Both the indoor-light and thermoelectric energy harvesting methods proposed in this thesis have been implemented in prototype devices. The improved indoor-light energy harvester prototype demonstrates 81% MPPT conversion efficiency with 0.5 mW input power. This improvement makes light energy harvesting from small energy sources (i.e., a credit-card-sized solar panel under 500 lux indoor lighting) a feasible approach. The 50 mm × 54 mm thermoelectric energy harvester prototype generates 0.95 mW when placed on a 60 °C heat source, with 28% conversion efficiency. Both prototypes can continuously power a WSN for building energy management applications in a typical office building environment. In addition to the hardware development, a comprehensive system energy model has been developed. This model not only predicts the available and consumed energy from real-world ambient conditions, but can also be employed to optimize the system design and configuration. The energy model has been verified against indoor photovoltaic energy harvesting prototypes in long-term deployment experiments.
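The fractional open-circuit voltage (FOCV) method mentioned above rests on a simple relation: a photovoltaic cell's maximum-power-point voltage is approximately a fixed fraction k of its open-circuit voltage, so the tracker periodically samples V_oc and regulates the panel at k·V_oc. The sketch below illustrates that control relation only; the value k ≈ 0.76 and all names are our illustrative assumptions, not the thesis's circuit or numbers.

```python
# Illustrative sketch of fractional open-circuit voltage (FOCV) MPPT.
# Assumption: V_mpp ≈ k * V_oc, with k around 0.7-0.8 for silicon cells.

K_FRACTION = 0.76   # illustrative k; the real value depends on the cell technology

def mpp_reference(v_oc, k=K_FRACTION):
    """Target operating voltage derived from a sampled open-circuit voltage."""
    return k * v_oc

def track(sample_voc, steps, k=K_FRACTION):
    """Periodically re-sample V_oc (e.g. as light levels change) and emit
    the regulation setpoint for each sampling step."""
    return [mpp_reference(sample_voc(t), k) for t in range(steps)]
```

The appeal for sub-1 mW harvesting is that this needs only a periodic voltage sample and a multiplication, rather than the continuous perturb-and-observe power measurements of heavier MPPT schemes.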

Relevance:

90.00%

Publisher:

Abstract:

Open environments involve distributed entities interacting with each other in an open manner. Many of these entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually, resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities access requested resources. Every trust management system has limitations, and these limitations can be exploited by malicious entities. One vulnerability is due to the lack of a globally unique interpretation for permission specifications. As a consequence, a malicious entity which receives a permission in one domain may misuse it in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL), which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation for knowledge related to an SSAL-based security policy. SSALO enables the integration of heterogeneous security policies, which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with a different implementation. A further advantage of the ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to cover further security policies that might be discovered in an open distributed environment. We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. SSAL is also shown to be a suitable policy language for expressing subterfuge-safe policy statements, thanks to its well-defined semantics, ease of use, and integrability.
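The localPermission idea can be illustrated compactly: a bare permission string such as "read:accounts" can be replayed in another domain, whereas a permission qualified by its issuer's globally unique identifier cannot. The class and field names below are our own illustration of the principle, not SSAL syntax.

```python
# Illustrative sketch of a "local permission with a globally unique
# interpretation": binding each permission to its issuer's globally
# unique identifier so a checker in any domain can tell where it is valid.

class LocalPermission:
    def __init__(self, issuer, action, resource):
        self.issuer = issuer      # globally unique, e.g. the issuer's URI
        self.action = action
        self.resource = resource

    def qualified(self):
        """Globally unique form of the permission."""
        return f"{self.issuer}#{self.action}:{self.resource}"

def authorize(granted, issuer, action, resource):
    """Honour a request only if the permission was issued by the very domain
    being asked -- blocking cross-domain replay (subterfuge)."""
    wanted = LocalPermission(issuer, action, resource).qualified()
    return any(p.qualified() == wanted for p in granted)
```

With this qualification, a permission obtained from one domain simply fails the check in any other domain, which is the misuse route the thesis calls subterfuge.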

Relevance:

90.00%

Publisher:

Abstract:

More and more often, universities decide to implement integrated learning management systems. Nevertheless, these technological developments are not achieved without difficulty, and meet with varying degrees of success and user satisfaction (Valenduc, 2000). This is why the present study aims at identifying the factors influencing learning management system satisfaction and acceptance among students. The technology acceptance model created by Wixom and Todd (2005) studies information system acceptance through user satisfaction, and has the benefit of incorporating several ergonomic factors. More precisely, the survey based on this model investigates behavioral attitudes towards the system, perceived ease of use, and perceived usefulness, as well as system satisfaction and information satisfaction, and also incorporates two groups of factors that separately affect the two types of satisfaction. The study was conducted on a representative sample of 593 students from a Brussels university which had recently implemented an integrated learning management system. The results show, on the one hand, the impact of system reliability, accessibility, flexibility, layout, and the functionalities offered on system satisfaction, and on the other hand, the impact of information accuracy, intelligibility, relevance, exhaustiveness, and currency on information satisfaction. In conclusion, the results indicate the applicability of the theoretical model to learning management systems, and highlight the importance of each of these factors for a successful implementation of such a system in universities.

Relevance:

90.00%

Publisher:

Abstract:

The main lines of work on the Semantic Web in the field of television archives are analyzed and described. To this end, the Semantic Web is first analyzed and contextualized from a general perspective, and then the main initiatives working with audiovisual material are examined: the MuNCH project, the S5T project, Semantic Television, and VideoActive.

Relevance:

90.00%

Publisher:

Abstract:

Purpose: The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department using a multiple stakeholder perspective and an accompanying theoretical framework.

Design/methodology/approach: An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups.

Findings: Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event, rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation.

Research limitations/implications: Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial.

Practical implications: The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation.

Originality/value: This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation, as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. This paper provides a concerted attempt to link theory with practice.

Relevance:

90.00%

Publisher:

Abstract:

The Balanced Scorecard of Kaplan and Norton is a management tool that supports the successful implementation of corporate strategies. It has been discussed and considered widely in both practice and research. By linking operational and non-financial corporate activities through causal chains to the firm's long-term strategy, the Balanced Scorecard supports the alignment and management of all corporate activities according to their strategic relevance. It makes it possible to take into account non-monetary strategic success factors that significantly impact the economic success of a business, and is thus a promising starting point for incorporating environmental and social aspects into the main management system of a firm. Sustainability management with the Balanced Scorecard helps to overcome the shortcomings of conventional approaches to environmental and social management systems by integrating the three pillars of sustainability into a single, overarching strategic management tool. After a brief discussion of the different possible forms of a Sustainability Balanced Scorecard, the article takes a closer look at the process and steps of formulating a Sustainability Balanced Scorecard for a business unit. Before doing so, the basic conventional approach of the Balanced Scorecard and its suitability for sustainability management are briefly outlined.