778 results for service-oriented grid computing systems


Relevance: 40.00%

Publisher:

Abstract:

The analysis of investment in the electric power industry has been the subject of intensive research for many years. The efficient generation and distribution of electrical energy is a difficult task involving the operation of a complex network of facilities, often located over very large geographical regions. Electric power utilities have made use of an enormous range of mathematical models. Some models address time spans that last for a fraction of a second, such as those dealing with lightning strikes on transmission lines, while at the other end of the scale there are models that address time horizons of ten or twenty years; these usually involve long range planning issues. This thesis addresses the optimal long term capacity expansion of an interconnected power system. The aim of this study has been to derive a new, long term planning model which recognises the regional differences that exist in energy demand and in the construction and operation of power plant and transmission line equipment. Perhaps the most innovative feature of the new model is the direct inclusion of regional energy demand curves in nonlinear form. This results in a nonlinear capacity expansion model. After a review of the relevant literature, the thesis first develops a model for the optimal operation of a power grid. This model directly incorporates regional demand curves and is a nonlinear programming problem containing both integer and continuous variables. A solution algorithm is developed based upon a resource decomposition scheme that separates the integer variables from the continuous ones. The decomposition of the operating problem leads to an iterative scheme which employs a mixed integer programming problem, known as the master, to generate trial operating configurations. The optimum operating conditions of each trial configuration are found using a smooth nonlinear programming model. The dual vector recovered from this model is subsequently used by the master to generate the next trial configuration. The solution algorithm progresses until lower and upper bounds converge. A range of numerical experiments is conducted and discussed. Using the operating model as a basis, a regional capacity expansion model is then developed. It determines the type, location and capacity of the additional power plants and transmission lines required to meet predicted electricity demands. A generalised resource decomposition scheme, similar to that used to solve the operating problem, is employed. The solution algorithm is used to solve a range of test problems and the results of these numerical experiments are reported. Finally, the expansion problem is applied to the Queensland electricity grid in Australia.
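
The master/subproblem exchange described above can be summarised as a loop. The sketch below is only a structural outline, not the thesis's algorithm: the master MIP and the nonlinear operating subproblem are hypothetical placeholders to be supplied by solvers, and the variable names are assumptions.

```python
# Structural outline only (not the thesis's algorithm): the iterative resource
# decomposition described above, with the master MIP and the nonlinear operating
# subproblem left as hypothetical placeholders.

def solve_master(cuts):
    """Hypothetical master MIP: propose an integer operating configuration
    using the dual information gathered so far. Returns (configuration, lower_bound)."""
    raise NotImplementedError("plug in a mixed integer programming solver")

def solve_operating_subproblem(configuration):
    """Hypothetical smooth NLP: optimal operation for the fixed configuration.
    Returns (operating_cost, dual_vector)."""
    raise NotImplementedError("plug in a nonlinear programming solver")

def resource_decomposition(tolerance=1e-4, max_iterations=100):
    cuts = []                    # (configuration, cost, duals) triples fed back to the master
    best_cost = float("inf")     # upper bound: best feasible operating cost so far
    best_configuration = None
    for _ in range(max_iterations):
        configuration, lower_bound = solve_master(cuts)
        cost, duals = solve_operating_subproblem(configuration)
        if cost < best_cost:
            best_cost, best_configuration = cost, configuration
        if best_cost - lower_bound <= tolerance:     # lower and upper bounds have converged
            break
        cuts.append((configuration, cost, duals))    # duals steer the next trial configuration
    return best_configuration, best_cost
```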

Relevance: 40.00%

Publisher:

Abstract:

Computer simulation is a versatile and commonly used tool for the design and evaluation of systems of varying complexity. Power distribution systems and electric railway networks are areas in which computer simulation is heavily applied. A dominant factor in evaluating the performance of a software simulator is its processing time, especially in the case of real-time simulation. Parallel processing provides a viable means of reducing computing time and is therefore suitable for building real-time simulators. In this paper, we present the issues involved in solving the power distribution system with parallel computing on a multiple-CPU server, concentrating in particular on the speedup performance of such an approach.
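
As an aside on how such speedup can be assessed, the snippet below is a minimal, self-contained sketch rather than the paper's simulator: a stand-in workload representing independent sub-network solves is distributed over worker processes, and serial versus parallel wall-clock times are compared. The workload function and section count are assumptions.

```python
# Minimal sketch (not the paper's simulator): measure the speedup of distributing
# independent sub-network solves across worker processes on a multi-CPU machine.

import time
from multiprocessing import Pool, cpu_count

def solve_subnetwork(seed: int) -> float:
    """Stand-in for solving one section of the distribution network (CPU-bound work)."""
    total = 0.0
    for i in range(1, 500_000):
        total += (seed % 7 + i) ** 0.5
    return total

def run(workers: int, sections: int = 16) -> float:
    """Solve all sections with the given number of workers; return elapsed seconds."""
    start = time.perf_counter()
    if workers == 1:
        [solve_subnetwork(s) for s in range(sections)]
    else:
        with Pool(processes=workers) as pool:
            pool.map(solve_subnetwork, range(sections))
    return time.perf_counter() - start

if __name__ == "__main__":
    serial = run(1)
    parallel = run(cpu_count())
    print(f"serial {serial:.2f}s, parallel {parallel:.2f}s, "
          f"speedup {serial / parallel:.2f}x on {cpu_count()} CPUs")
```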

Relevance: 40.00%

Publisher:

Abstract:

There is currently a migration trend from traditional electrical supervisory control and data acquisition (SCADA) systems towards a smart grid based approach to critical infrastructure management. This project provides an evaluation of existing and proposed implementations for both traditional electrical SCADA and smart grid based architectures, and proposes a set of reference requirements which test bed implementations should satisfy. A high-level design for smart grid test beds is proposed and an initial implementation performed, based on the proposed design, using open source and freely available software tools. The project examines the move towards smart grid based critical infrastructure management and illustrates the increased security requirements. The implemented test bed provides a basic framework for testing network requirements in a smart grid environment, as well as a platform for further research and development, particularly for developing, implementing and testing network security scenarios such as intrusion detection and network forensics. The project proposes and develops an architecture for emulating some smart grid functionality. The Common Open Research Emulator (CORE) platform was used to emulate the communication network of the smart grid; specifically, CORE was used to virtualise and emulate the TCP/IP networking stack. This is intended to be used for further evaluation and analysis, for example the analysis of application protocol messages. As a proof of concept, software libraries were designed, developed and documented to enable and support the design and development of further emulated smart grid components, such as reclosers, switches and smart meters. As part of the testing and evaluation, a Modbus based smart meter emulator was developed to provide the basic functionality of a smart meter. Further code was developed to send Modbus request messages to the emulated smart meter and receive Modbus responses from it. Although the functionality of the emulated components is limited, it provides a starting point for further research and development, and the design is extensible to allow additional SCADA protocols to be implemented. The project also defines evaluation criteria for the implemented test bed, and experiments are designed to evaluate the test bed against those criteria. The results of the experiments are collated and presented, and conclusions are drawn to facilitate discussion of the test bed implementation. The discussion also presents possible future work.
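
For illustration, the snippet below shows how a Modbus TCP "read holding registers" request of the kind exchanged with an emulated smart meter can be constructed and sent. The host, port, unit identifier and register addresses are assumptions; the frame layout follows the published Modbus application protocol, and this is not the project's own code.

```python
# Illustrative sketch only: build and send a Modbus TCP "read holding registers"
# request (function code 0x03) to an emulated smart meter. Host, port, unit id and
# register addresses are hypothetical.

import socket
import struct

def read_holding_registers_request(transaction_id: int, unit_id: int,
                                   start_address: int, quantity: int) -> bytes:
    """MBAP header (7 bytes) followed by the PDU for function code 0x03."""
    pdu = struct.pack(">BHH", 0x03, start_address, quantity)   # function, start, count
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

if __name__ == "__main__":
    request = read_holding_registers_request(transaction_id=1, unit_id=1,
                                             start_address=0, quantity=4)
    with socket.create_connection(("127.0.0.1", 502), timeout=5) as sock:
        sock.sendall(request)
        response = sock.recv(260)          # maximum Modbus TCP ADU is 260 bytes
    print("response bytes:", response.hex())
```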

Relevance: 40.00%

Publisher:

Abstract:

The first use of computing technologies and the development of land use models to support decision-making processes in urban planning date back to the mid-20th century. The main thrust of computing applications in urban planning is their contribution to sound decision-making and planning practices. During the last couple of decades many new computing tools and technologies, including geospatial technologies, have been designed to enhance planners' capability to deal with complex urban environments and to plan for prosperous and healthy communities. This chapter therefore examines the role of information technologies, particularly internet-based geographic information systems, as decision support systems to aid public participatory planning. The chapter discusses challenges and opportunities for the use of internet-based mapping applications and tools in collaborative decision-making, and introduces a prototype internet-based geographic information system developed to integrate public-oriented interactive decision mechanisms into urban planning practice. This system, referred to as the 'Community-based Internet GIS' model, incorporates advanced information technologies, distance learning, sustainable urban development principles and community involvement techniques in decision-making processes, and is piloted in Shibuya, Tokyo, Japan.

Relevance: 40.00%

Publisher:

Abstract:

With the emergence of multi-core processors into the mainstream, parallel programming is no longer the specialized domain it once was. There is a growing need for systems that allow programmers to more easily reason about data dependencies and inherent parallelism in general purpose programs. Many of these programs are written in popular imperative programming languages like Java and C#. In this thesis I present a system for reasoning about side-effects of evaluation in an abstract and composable manner that is suitable for use by both programmers and automated tools such as compilers. The goal of developing such a system is both to facilitate the automatic exploitation of the inherent parallelism present in imperative programs and to allow programmers to reason about dependencies which may be limiting the parallelism available for exploitation in their applications. Previous work on languages and type systems for parallel computing has tended to focus on providing the programmer with tools to facilitate the manual parallelization of programs; programmers must decide when and where it is safe to employ parallelism without the assistance of the compiler or other automated tools. None of the existing systems combine abstraction and composition with parallelization and correctness checking to produce a framework which helps both programmers and automated tools to reason about inherent parallelism. In this work I present a system for abstractly reasoning about side-effects and data dependencies in modern, imperative, object-oriented languages using a type and effect system based on ideas from Ownership Types. I have developed sufficient conditions for the safe, automated detection and exploitation of a number of task, data and loop parallelism patterns in terms of ownership relationships. To validate my work, I have applied my ideas to the C# version 3.0 language to produce a language extension called Zal. I have implemented a compiler for the Zal language as an extension of the GPC# research compiler as a proof of concept of my system, and have used it to parallelize a number of real-world applications to demonstrate the feasibility of my proposed approach. In addition to this empirical validation, I present an argument for the correctness of the type system and language semantics I have proposed, as well as sketches of proofs for the correctness of the sufficient conditions for parallelization.
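
The central safety condition behind such effect-based parallelization can be pictured with a tiny sketch. The following is not the thesis's Zal/ownership formalism; it is only a generic illustration, with hypothetical region names, of the rule that two computations may run in parallel when neither writes a region the other reads or writes.

```python
# Generic illustration only (not Zal or Ownership Types): decide whether two tasks
# may run in parallel by checking that their abstract read/write effect sets do not
# conflict.

from dataclasses import dataclass

@dataclass(frozen=True)
class Effect:
    """Read and write sets over abstract memory regions (stand-ins for ownership contexts)."""
    reads: frozenset
    writes: frozenset

def may_run_in_parallel(a: Effect, b: Effect) -> bool:
    """Safe iff neither task writes a region the other reads or writes."""
    return (a.writes.isdisjoint(b.reads | b.writes) and
            b.writes.isdisjoint(a.reads | a.writes))

if __name__ == "__main__":
    update_orders = Effect(reads=frozenset({"orders"}), writes=frozenset({"orders"}))
    render_report = Effect(reads=frozenset({"catalog"}), writes=frozenset())
    audit_orders  = Effect(reads=frozenset({"orders"}),  writes=frozenset())
    print(may_run_in_parallel(update_orders, render_report))  # True: disjoint regions
    print(may_run_in_parallel(update_orders, audit_orders))   # False: write/read conflict
```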

Relevance: 40.00%

Publisher:

Abstract:

The decision to represent the USDL abstract syntax as a metamodel, shown as a set of UML diagrams, has two main benefits: the ability to show a well-understood standard graphical representation of the concepts and their relationships to one another, and the ability to use object-oriented frameworks such as the Eclipse Modeling Framework (EMF) to assist in the automated generation of tool support for USDL service descriptions.

Relevance: 40.00%

Publisher:

Abstract:

With the growing significance of services in most developed economies, there is increased interest in the role of service innovation in service firms' competitive strategy. Despite a growing literature on service innovation, it remains fragmented, reflecting the need for a model that captures the key antecedents driving the service innovation-based competitive advantage process. Building on the extant literature and using thirteen in-depth interviews with CEOs of project-oriented service firms, this paper presents a model of innovation-based competitive advantage. The emergent model suggests that entrepreneurial service firms pursuing innovation carefully select and use dynamic capabilities that enable them to achieve greater innovation and sustained competitive advantage. Our findings indicate that firms purposefully use creation, extension and modification processes to build and nurture key dynamic capabilities. The paper presents a set of theoretical propositions to guide future research. Implications for theory and practice are discussed, and directions for future research are outlined.

Relevance: 40.00%

Publisher:

Abstract:

Security and privacy concerns in electronic health record systems have hindered the growth of e-health systems since their emergence. The development of policies that satisfy the security and privacy requirements of the different stakeholders in healthcare has proven to be difficult, yet these requirements have to be met if the systems developed are to achieve their intended goals. Access control is a fundamental security mechanism for securing data in healthcare information systems. In this paper we present an access control model for electronic health records (EHRs). We address patient privacy requirements, the confidentiality of private information and health professionals' need for flexible access to electronic health records. We carefully combine three existing access control models and present a novel access control model for EHRs that satisfies these requirements.
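
The abstract does not name the three access control models it combines, so the sketch below is only a generic illustration of what a layered EHR access decision can look like, here combining a role-permission check with a patient-consent check and an audited emergency override. All role names, actions and the decision rule are assumptions, not the paper's model.

```python
# Generic illustration only (not the paper's combined model): an EHR access decision
# that layers a role-permission check with a patient-consent check, with an optional
# emergency override. Names and rules are hypothetical.

ROLE_PERMISSIONS = {
    "treating_physician": {"read_record", "write_record"},
    "nurse": {"read_record"},
    "billing_clerk": {"read_billing"},
}

def is_access_allowed(role: str, action: str, patient_consents: set[str],
                      emergency_override: bool = False) -> bool:
    """Allow an action only if the role permits it and the patient has consented to
    that role's access, unless an audited emergency override applies."""
    role_ok = action in ROLE_PERMISSIONS.get(role, set())
    consent_ok = role in patient_consents
    return role_ok and (consent_ok or emergency_override)

if __name__ == "__main__":
    consents = {"treating_physician"}
    print(is_access_allowed("treating_physician", "read_record", consents))               # True
    print(is_access_allowed("nurse", "read_record", consents))                            # False: no consent
    print(is_access_allowed("nurse", "read_record", consents, emergency_override=True))   # True
```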

Relevance: 40.00%

Publisher:

Abstract:

This series of research vignettes is aimed at sharing current and interesting research findings from our team of international Entrepreneurship researchers. In this vignette, Dr Sandeep Salunke shares insights from his research on project-oriented firms.

Relevance: 40.00%

Publisher:

Abstract:

Successful firms use business model innovation to rethink the way they do business and to transform industries. However, current research on business model innovation lacks theoretical underpinnings and is in need of new insights. The objective of this paper is to advance our understanding of both the business model concept and business model innovation based on service logic as a foundation for customer value and value creation. We present and discuss a rationale for business models based on 'service logic', with service as a value-supporting process, and compare it with a business model based on 'goods logic', with goods as value-supporting resources. The implications for each of the business model dimensions (customer, value proposition, organizational architecture and revenue model) are described and discussed in detail.

Relevance: 40.00%

Publisher:

Abstract:

The criticality of service innovation in building and sustaining competitive advantage is gaining increasing recognition in the marketplace. Using empirical data from US and Australian project-oriented firms, the study employs a multi-staged, multi-method research program to demonstrate how entrepreneurial service firms strategically combine resources at hand (bricolage) to innovate and stay ahead of rivals. The research shows that service entrepreneurship (SE) and bricolage influence two forms of service innovation (interactive and supportive), which in turn are associated with sustained competitive advantage (SCA). The results suggest that SE and bricolage relate to SCA indirectly through service innovation. The findings offer novel insights into how project-oriented service firms engage in innovation. In short, the findings encourage 'making do by combining resources at hand', as higher levels of entrepreneurial bricolage are associated with higher levels of interactive and supportive innovation enabling SCA, suggesting a new model.