887 results for Requirements engineering process
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technologies. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems today are largely computerised, we are still undertaking commissioning testing using the same philosophy as if each signal were hard-wired. This is slow and tedious and doesn’t do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it “abstracts” the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks and made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology rather than the traditional positional method (such as DNP 3.0) is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely revolutionise the testing of such systems when compared to traditional point-to-point methods.
Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. This tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the server-client association with the relay.
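The comparison the tool performs can be sketched in outline. This is a minimal illustration, not the thesis tool itself (which is implemented in Java against an open-source IEC 61850 library); here both the SCL/CID file and the configuration extracted from the relay are assumed to have already been reduced to simple dataset-to-member mappings, and all names are hypothetical.

```python
# Sketch: compare the data sets declared in a relay's CID (SCL) file against
# the data sets read back from the relay. Both sides are assumed to have been
# reduced to {dataset_name: [member references]} mappings; how that extraction
# is done depends on the SCL parser and the IEC 61850 client library used.

def compare_datasets(scl_datasets, relay_datasets):
    """Return a report of mismatches between configured and actual data sets."""
    issues = []
    for name, members in scl_datasets.items():
        if name not in relay_datasets:
            issues.append(f"missing on relay: {name}")
        elif relay_datasets[name] != members:
            issues.append(f"member mismatch in {name}")
    for name in relay_datasets:
        if name not in scl_datasets:
            issues.append(f"unexpected on relay: {name}")
    return issues

# Hypothetical example: a data set whose members were modified after design.
scl = {"MeasDS": ["MMXU1.TotW", "MMXU1.TotVAr"], "StatusDS": ["XCBR1.Pos"]}
relay = {"MeasDS": ["MMXU1.TotW"], "StatusDS": ["XCBR1.Pos"]}
print(compare_datasets(scl, relay))  # ['member mismatch in MeasDS']
```

An empty report would indicate that no unexpected modifications were made during commissioning, which is the check the abstract describes.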
Abstract:
‘Complexity’ is a term that is increasingly prevalent in conversations about building capacity for 21st Century professional engineers. Society is grappling with the urgent and challenging reality of accommodating seven billion people, meeting needs and innovating lifestyle improvements in ways that do not destroy atmospheric, biological and oceanic systems critical to life. Over the last two decades in particular, engineering educators have been active in attempting to build capacity amongst professionals to deliver ‘sustainable development’ in this rapidly changing global context. However, curriculum literature clearly points to a lack of significant progress, with efforts best described as ad hoc and highly varied. Given the limited timeframes for action to curb environmental degradation proposed by scientists and intergovernmental agencies, the authors of this paper propose it is imperative that curriculum renewal towards education for sustainable development proceeds rapidly, systemically, and in a transformational manner. Within this context, the paper discusses the need to consider a multiple-track approach to building capacity for 21st Century engineering, including priorities and timeframes for undergraduate and postgraduate curriculum renewal. The paper begins with a contextual discussion of the term complexity and how it relates to life in the 21st Century. The authors then present a whole-of-system approach for planning and implementing rapid curriculum renewal that addresses the critical roles of several generations of engineering professionals over the next three decades. The paper concludes with observations regarding engaging with this approach in the context of emerging accreditation requirements and existing curriculum renewal frameworks.
Abstract:
This paper reflects on the critical need for an urgent transformation of higher education curriculum globally, to equip society with professionals who can address our 21st Century sustainable living challenges. Specifically, it discusses a toolkit called the ‘Engineering Sustainable Solutions Program’, which is a freely available, rigorously reviewed and robust content resource for higher education institutions to access content on innovations and opportunities in the process of evolving the curriculum...
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them by taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A number of optimisation techniques are proposed to explore and assess alternative execution scenarios. The objective function is represented by a cost structure that captures different process dimensions. An experimental evaluation is conducted to analyse the performance and scalability of the optimisation techniques: integer linear programming (ILP), hill climbing, tabu search, and our earlier proposed hybrid genetic algorithm approach. The findings demonstrate that the hybrid genetic algorithm is scalable and performs better than the other techniques. Moreover, we argue that the use of ILP is unrealistic in this setup and cannot handle complex cost functions such as the ones we propose. Finally, we show how cost-related insights can be gained from improved execution scenarios and how these can be utilised to put forward recommendations for reducing process-related cost and overhead within organisations.
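Of the search techniques the abstract compares, hill climbing is the simplest to illustrate. The sketch below is a generic hill climber over a toy time/cost trade-off, not the paper's implementation: the scenario encoding (a resource allocation vector), the cost function and the neighbourhood move are all invented for illustration.

```python
# Sketch of hill climbing over alternative execution scenarios.
# A scenario is modelled as a per-activity resource allocation vector;
# the weighted time/cost objective below is illustrative only.

def cost(scenario, w_time=1.0, w_cost=0.5):
    time = sum(10.0 / r for r in scenario)      # more resources -> faster
    expense = sum(3.0 * r for r in scenario)    # more resources -> pricier
    return w_time * time + w_cost * expense

def neighbours(scenario):
    # Neighbouring scenarios differ by one resource unit on one activity.
    for i in range(len(scenario)):
        for delta in (-1, 1):
            r = scenario[i] + delta
            if 1 <= r <= 5:
                yield scenario[:i] + [r] + scenario[i + 1:]

def hill_climb(scenario):
    # Greedily move to the cheapest neighbour until no neighbour improves.
    while True:
        best = min(neighbours(scenario), key=cost)
        if cost(best) >= cost(scenario):
            return scenario
        scenario = best

print(hill_climb([1, 1, 1]))  # [3, 3, 3]
```

Tabu search extends this loop with a memory of recently visited scenarios, and the paper's hybrid genetic algorithm searches the same scenario space population-wise; both escape the local optima at which a plain hill climber stops.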
Abstract:
Companies standardise and automate their business processes in order to improve process efficiency and minimise operational risks. However, it is difficult to eliminate all process risks during the process design stage due to the fact that processes often run in complex and changeable environments and rely on human resources. Timely identification of process risks is crucial in order to ensure the achievement of process goals. Business processes are often supported by information systems that record information about their executions in event logs. In this article we present an approach and a supporting tool for the evaluation of the overall process risk and for the prediction of process outcomes based on the analysis of information recorded in event logs. It can help managers evaluate the overall risk exposure of their business processes, track the evolution of overall process risk, identify changes and predict process outcomes based on the current value of overall process risk. The approach was implemented and validated using synthetic event logs and through a case study with a real event log.
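The core idea of evaluating overall process risk from an event log can be sketched very simply. This is a minimal illustration only: the log format and the choice of "fraction of recently completed cases that ended in an undesired outcome" as the risk measure are assumptions, and the article's own estimator may differ.

```python
# Sketch: estimate overall process risk from an event log as the fraction of
# recently completed cases that ended in an undesired outcome (e.g. a missed
# deadline). Tracking this value over time shows the evolution of process risk.

def overall_risk(log, window=5):
    """log: list of (case_id, outcome) pairs with outcome 'ok' or 'fault',
    ordered by completion time. Returns the risk over the last `window` cases."""
    recent = log[-window:]
    faults = sum(1 for _, outcome in recent if outcome == "fault")
    return faults / len(recent)

# Hypothetical log of six completed cases, oldest first.
log = [("c1", "ok"), ("c2", "fault"), ("c3", "ok"),
       ("c4", "fault"), ("c5", "fault"), ("c6", "ok")]
print(overall_risk(log))  # 0.6 -> three of the last five cases ended badly
```

A rising value of this indicator over successive windows is the kind of signal the abstract describes managers using to detect changes and predict outcomes.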
Abstract:
We and others have published on the rapid manufacture of micropellet tissues, typically formed from 100-500 cells each. The micropellet geometry enhances cellular biological properties, and in many cases the micropellets can subsequently be utilized as building blocks to assemble complex macrotissues. Generally, micropellets are formed from cells alone; however, when replicating matrix-rich tissues such as cartilage it would be ideal if matrix or biomaterial supplements could be incorporated directly into the micropellet during the manufacturing process. Herein we describe a method to efficiently incorporate donor cartilage matrix into tissue engineered cartilage micropellets. We lyophilized bovine cartilage matrix and then shattered it into microscopic pieces with average dimensions of < 10 μm diameter; we termed this microscopic donor matrix "cartilage dust (CD)". Using a microwell platform, we show that ~0.83 μg CD can be rapidly and efficiently incorporated into single multicellular aggregates formed from 180 bone marrow mesenchymal stem/stromal cells (MSC) each. The microwell platform enabled the rapid manufacture of thousands of replica composite micropellets, with each micropellet having a material/CD core and a cellular surface. This micropellet organization enabled the rapid bulking up of the micropellet core matrix content, and left an adhesive cellular outer surface. This morphological organization enabled the ready assembly of the composite micropellets into macroscopic tissues. Generically, this is a versatile method that enables the rapid and uniform integration of biomaterials into multicellular micropellets that can then be used as tissue building blocks. In this study, the addition of CD resulted in an approximately 8-fold volume increase in the micropellets, with the donor matrix contributing to an increase in total cartilage matrix content.
Composite micropellets were readily assembled into macroscopic cartilage tissues; the incorporation of CD enhanced tissue size and matrix content, but did not enhance chondrogenic gene expression.
Abstract:
This report provides the Commonwealth Department of Resources, Energy and Tourism (RET) with a summary of consultation undertaken with representatives from industry and academia around Australia regarding mainstreaming energy efficiency within engineering education. Specifically, the report documents the purpose of the consultation process, key messages and emerging themes, industry-perceived gaps in energy efficiency related knowledge and skills, and academic considerations regarding graduate attributes and learning pathways to close these gaps. This information complements previous reports by presenting the current thoughts and ideas of more than 100 engineering academic and practising professionals who are actively involved in building capacity through the education system or implementing energy efficiency improvements in companies/the workplace. Furthermore, the report describes the emergence of a potential ‘community of practice’ in energy efficiency capacity building that arose during the project.
Abstract:
Service compositions enable users to realize their complex needs as a single request. Despite intensive research, especially in the areas of business processes, web services and grids, an open and valid question is still how to manage service compositions in order to satisfy both functional and non-functional requirements as well as adapt to dynamic changes. In this paper we propose a (functional) architecture for adaptive management of QoS-aware service compositions. Compared to other existing architectures, this one offers two major advantages. Firstly, this architecture supports various execution strategies based on dynamic selection and negotiation of services included in a service composition, contracting based on service level agreements, service enactment with flexible support for exception handling, monitoring of service level objectives, and profiling of execution data. Secondly, the architecture is built on the basis of well-known existing standards to communicate and exchange data, which significantly reduces the effort to integrate existing solutions and tools from different vendors. A first prototype of this architecture has been implemented within an EU-funded Adaptive Service Grid project. © 2006 Springer-Verlag.
Abstract:
Web service and business process technologies are widely adopted to facilitate business automation and collaboration. Given the complexity of business processes, it is a sought-after feature to show a business process with different views to cater for the diverse interests, authority levels, etc., of different users. Aiming to implement such flexible process views in the Web service environment, this paper presents a novel framework named FlexView to support view abstraction and concretisation of WS-BPEL processes. In the FlexView framework, a rigorous view model is proposed to specify the dependency and correlation between structural components of process views with emphasis on the characteristics of WS-BPEL, and a set of rules is defined to guarantee the structural consistency between process views during transformations. A set of algorithms is developed to shift the abstraction and concretisation operations to the operational level. A prototype is also implemented as a proof of concept. © 2010 Springer Science+Business Media, LLC.
Abstract:
Broad knowledge is required when a business process is modeled by a business analyst. We argue that existing Business Process Management methodologies do not consider business goals at the appropriate level. In this paper we present an approach to integrate business goals and business process models. We design a Business Goal Ontology for modeling business goals. Furthermore, we devise a modeling pattern for linking the goals to process models and show how the ontology can be used in query answering. In this way, we integrate the intentional perspective into our business process ontology framework, enriching the process description and enabling new types of business process analysis. © 2008 IEEE.
Abstract:
Process variability in pollutant build-up and wash-off generates inherent uncertainty that affects the outcomes of stormwater quality models. Poor characterisation of process variability constrains the accurate accounting of the uncertainty associated with pollutant processes. This acts as a significant limitation to effective decision making in relation to stormwater pollution mitigation. The study undertaken developed three theoretical scenarios based on research findings that variations in particle size fractions <150 µm and >150 µm during pollutant build-up and wash-off primarily determine the variability associated with these processes. These scenarios, which combine pollutant build-up and wash-off processes that take place on a continuous timeline, are able to explain process variability under different field conditions. Given the variability characteristics of a specific build-up or wash-off event, the theoretical scenarios help to infer the variability characteristics of the associated pollutant process that follows. Mathematical formulation of the theoretical scenarios enables the incorporation of variability characteristics of pollutant build-up and wash-off processes in stormwater quality models. The research study outcomes will contribute to the quantitative assessment of uncertainty as an integral part of the interpretation of stormwater quality modelling outcomes.
Abstract:
This research contributes novel techniques for identifying and evaluating business process risks and analysing human resource behaviour. The developed techniques use predefined indicators to identify process risks in individual process instances, evaluate overall process risk, predict process outcomes and analyse human resource behaviour based on the analysis of information about process executions recorded in event logs by information systems. The results of this research can help managers to more accurately evaluate the risk exposure of their business processes, to more objectively evaluate the performance of their employees, and to identify opportunities for improvement of resource and process performance.
Abstract:
This paper demonstrates the integration and usage of Process Query Language (PQL), a special-purpose programming language for querying large collections of process models based on process model behavior, in the Apromore open-source process model repository. The resulting environment provides a unique user experience when carrying out process model querying tasks. The tool is useful for researchers and practitioners working with large process model collections, and specifically for those with an interest in model retrieval tasks as part of process compliance, process redesign and process standardization initiatives.
Abstract:
Remote networked collaboration around business process model documentation suffers from many communication problems. The aim of this project is to solve some of these communication problems by using digital 3D representations of human visual cues. Results from this project increased our understanding of the role and effects of visual cues in remote collaboration, specifically for validating business process models. Technology designs to support such cues across a distance have been proposed in this thesis, with qualitative and quantitative methods of analysis being combined to analyse the impact of these cues on the communication, coordination and performance of a team collaborating remotely.
Abstract:
The co-curing process for advanced grid-stiffened (AGS) composite structures is a promising manufacturing process, which could reduce the manufacturing cost, augment the advantages and improve the performance of AGS composite structures. An improved method named the soft-mold aided co-curing process, which replaces the expansion molds with a single whole rubber mold, is adopted in this paper. This co-curing process is capable of co-curing a typical AGS composite structure with the manufacturer’s recommended cure cycle (MRCC). Numerical models are developed to evaluate the variation of temperature and the degree of cure in the AGS composite structure during the soft-mold aided co-curing process. The simulation results were validated by experimental results obtained from embedded temperature sensors. Based on the validated modeling framework, the cure cycle can be optimized to less than half the time of the MRCC while still obtaining a reliable degree of cure. The shape and size effects of the AGS composite structure on the distribution of temperature and degree of cure are also investigated to provide insights for the optimization of the soft-mold aided co-curing process.
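The degree-of-cure evolution that such numerical models compute is typically governed by an Arrhenius-type cure kinetics equation. The sketch below integrates a simple nth-order cure model with forward Euler at constant temperature; the kinetic parameters (A, E, n) and temperatures are illustrative placeholders, not values fitted in the paper.

```python
import math

# Sketch: nth-order cure kinetics, da/dt = A * exp(-E/(R*T)) * (1 - a)^n,
# where a is the degree of cure, integrated with forward Euler at a constant
# hold temperature T. Parameters A, E, n are illustrative placeholders only.

def degree_of_cure(T, t_end, dt=1.0, A=1.0e5, E=6.0e4, n=1.5):
    """Degree of cure after t_end seconds at temperature T (kelvin)."""
    R = 8.314                          # gas constant, J/(mol*K)
    k = A * math.exp(-E / (R * T))     # Arrhenius rate constant at T
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha += dt * k * (1.0 - alpha) ** n
        alpha = min(alpha, 1.0)        # degree of cure cannot exceed 1
        t += dt
    return alpha

# A hotter hold cures faster, the kind of trade-off a cure-cycle optimization
# over the validated model would explore:
print(degree_of_cure(T=450.0, t_end=3600) > degree_of_cure(T=420.0, t_end=3600))
```

In the paper's setting the temperature field varies in space and time through the rubber mold and the grid structure, so the same kinetics equation is solved together with a heat-transfer model rather than at a single constant temperature.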