861 results for Software Process
Abstract:
A curriculum for a university-level course called Business Process Modeling is presented in order to provide guidance for the increasing number of institutions that are currently developing such content. The course caters to undergraduate and postgraduate students. Its content is drawn from recent research, industry practice, and established teaching material, and teaches ways of specifying business processes for the analysis and design of process-aware information systems. The teaching approach is a blend of lectures and classroom exercises with innovative case studies, as well as reviews of research material. Students are asked to conceptualize, analyze, and articulate real-life process scenarios. Tutorials and cheat sheets assist with the learning experience. Course evaluations from 40 students suggest the adequacy of the teaching approach. Specifically, evaluations show a high degree of satisfaction with course relevance, content presentation, and teaching approach.
Abstract:
Automated Scheduler is a prototype software tool that automatically prepares a construction schedule together with a 4D simulation of the construction process from a 3D CAD building model.
Abstract:
The ability to assess a commercial building for its impact on the environment at the earliest stage of design is a goal which is achievable by integrating several approaches into a single procedure directly from the 3D CAD representation. Such an approach enables building design professionals to make informed decisions on the environmental impact of a building and its alternatives during the design development stage, instead of at the post-design stage where options become limited. The indicators of interest are those which relate to consumption of resources and energy, contributions to pollution of air, water and soil, and impacts on the health and wellbeing of people in the built environment as a result of constructing and operating buildings. 3D object-oriented CAD files contain a wealth of building information which can be interrogated for details required for analysis of the performance of a design. The quantities of all components in the building can be automatically obtained from the 3D CAD objects and their constituent materials identified to calculate a complete list of the amounts of all building products such as concrete, steel, timber, plastic etc. When this information is combined with a life cycle inventory database, key internationally recognised environmental indicators can be estimated. Such a fully integrated tool, known as LCADesign, has been created for automated eco-efficiency assessment of commercial buildings directly from 3D CAD. This paper outlines the key features of LCADesign and its application to environmental assessment of commercial buildings.
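The core calculation the abstract describes, combining a bill of quantities taken off the CAD model with a life cycle inventory database, can be sketched as follows. This is a minimal illustration only: the material names, factor values, and function names are hypothetical and are not drawn from the actual LCADesign tool or any real LCI database.

```python
# Sketch: combine material quantities extracted from 3D CAD objects with
# life cycle inventory (LCI) factors to estimate an environmental indicator.
# All names and numbers below are illustrative assumptions.

# Quantities taken off the CAD model (material -> tonnes)
bill_of_quantities = {"concrete": 1200.0, "steel": 85.0, "timber": 40.0}

# Hypothetical LCI factors (kg CO2-equivalent per tonne of material)
lci_co2_factors = {"concrete": 130.0, "steel": 1800.0, "timber": 50.0}

def embodied_co2(quantities, factors):
    """Return total embodied CO2-e (kg) for a bill of quantities."""
    return sum(qty * factors[material] for material, qty in quantities.items())

total = embodied_co2(bill_of_quantities, lci_co2_factors)
print(f"Embodied CO2-e: {total:,.0f} kg")  # → Embodied CO2-e: 311,000 kg
```

In practice each indicator (energy, water, emissions to soil) would be a separate factor table applied over the same bill of quantities.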
Abstract:
Understanding the differences between the temporal and physical aspects of the building life cycle is an essential ingredient in the development of Building Environmental Assessment (BEA) tools. This paper illustrates a theoretical Life Cycle Assessment (LCA) framework aligning temporal decision-making with that of material flows over building development phases. It was derived during development of a prototype commercial building design tool that was based on a 3-D CAD information and communications technology (ICT) platform and LCA software. The framework aligns stakeholder BEA needs and the decision-making process against characteristics of leading green building tools. The paper explores related integration of BEA tool development applications on such ICT platforms. Key framework modules are depicted and practical examples for BEA are provided for:
• Definition of investment and service goals at project initiation;
• Design integrated to avoid overlaps/confusion over the project life cycle;
• Detailing the supply chain considering building life cycle impacts;
• Delivery of quality metrics for occupancy post-construction/handover;
• Deconstruction profiling at end of life to facilitate recovery.
Abstract:
Collaborative networks have come to form a large part of the public sector’s strategy to address ongoing and often complex social problems. The relational power of networks, with its emphasis on trust, reciprocity and mutuality, provides the mechanism to integrate previously dispersed and even competitive entities into a collective venture (Agranoff 2003; Agranoff and McGuire 2003; Mandell 1994; Mandell and Harrington 1999). It is argued that the refocusing of a single body of effort to a collective contributes to reducing duplication and overlap of services, maximizes increasingly scarce resources and contributes to solving intractable or ‘wicked’ problems (Clarke and Stewart 1997). Given the current proliferation of collaborative networks and the fact that they are likely to continue for some time, concerns with the management and leadership of such arrangements for optimal outcomes are increasingly relevant. This is especially important for public sector managers who are used to working in a top-down, hierarchical manner. While the management of networks (Agranoff and McGuire 2001, 2003), including collaborative or complex networks (Kickert et al. 1997; Koppenjan and Klijn 2004), has been the subject of considerable attention, there has been much less explicit discussion of leadership approaches in this context. It is argued in this chapter that the traditional use of the terms ‘leader’ or ‘leadership’ does not apply to collaborative networks. There are no ‘followers’ in collaborative networks, nor supervisor-subordinate relations. Instead there are equal, horizontal relationships that are focused on delivering systems change. In this way, emergent organizational forms such as collaborative networks challenge older models of leadership. However, despite the questionable relevance of old leadership styles to the contemporary work environment, no clear alternative has emerged to take their place.
Abstract:
Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems have recently been added to the already wide collection of wireless sensor network applications. The PCS/SCADA environment is somewhat more amenable to the use of heavy cryptographic mechanisms, such as public key cryptography, than other sensor application environments. The sensor nodes in the environment, however, are still open to devastating attacks such as node capture, which makes designing a secure key management scheme challenging. In this paper, a key management scheme is proposed to defeat node capture attacks by offering both forward and backward secrecy. Our scheme overcomes the pitfalls from which Nilsson et al.'s scheme suffers, and is no more expensive than their scheme.
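The abstract does not give the scheme's construction, but the forward/backward secrecy properties it names can be illustrated with a generic one-way key-ratchet sketch. This is a textbook illustration under stated assumptions, not the scheme proposed in the paper; the function names and the nonce value are hypothetical.

```python
import hashlib

def update_key(current_key: bytes, server_nonce: bytes = b"") -> bytes:
    """One-way key update: K_{i+1} = SHA-256(K_i || nonce).

    Without a nonce this gives forward secrecy only: an attacker who
    captures K_{i+1} cannot invert SHA-256 to recover K_i, so past
    traffic stays protected. Mixing in a fresh nonce from the key
    manager also gives backward secrecy: a captured node cannot
    predict future keys without learning the new nonce.
    """
    return hashlib.sha256(current_key + server_nonce).digest()

k0 = bytes(32)                # initial shared key (demo value only)
k1 = update_key(k0)           # forward-secure ratchet step
k2 = update_key(k1, b"fresh-nonce-from-key-manager")  # adds backward secrecy
```

In a real PCS/SCADA deployment the nonce would be delivered over an authenticated channel from the key management server, which is exactly the step a captured node cannot replay.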
Abstract:
This appendix describes the Order Fulfillment process followed by a fictitious company named Genko Oil. The process is freely inspired by the VICS (Voluntary Inter-industry Commerce Solutions) reference model and provides a demonstration of YAWL’s capabilities in modelling complex control-flow, data and resourcing requirements.
Abstract:
This chapter describes how the YAWL meta-model was extended to support the definition of variation points. These variation points can be used to describe different variants of a YAWL process model in a unified, configurable model. The model can then be configured to suit the needs of specific settings, e.g., for a new organization or project.
Abstract:
The YAWL system is structured as a service-oriented architecture. It is composed of an extensible set of YAWL Services [1], each of which is deployed at a certain endpoint and offers one or multiple interfaces. Some of these services are user-facing, meaning that they offer interfaces to end users, while others offer interfaces to applications or other services.
Abstract:
Historically, asset management focused primarily on the reliability and maintainability of assets; organisations have since accepted the notion that a much larger array of processes governs the life and use of an asset. With this, asset management’s new paradigm seeks a holistic, multi-disciplinary approach to the management of physical assets. A growing number of organisations now seek to develop integrated asset management frameworks and bodies of knowledge. This research seeks to complement the existing outputs of these organisations through the development of an asset management ontology. Ontologies define a common vocabulary for both researchers and practitioners who need to share information in a chosen domain. A by-product of ontology development is the realisation of a process architecture, of which there is also no evidence in published literature. To develop the ontology and subsequent asset management process architecture, a standard knowledge-engineering methodology is followed. This involves text analysis, definition and classification of terms, and visualisation through an appropriate tool (in this case, the Protégé application was used). The result of this research is the first attempt at developing an asset management ontology and process architecture.
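The "definition and classification of terms" step the abstract mentions can be pictured as building an is-a hierarchy over domain vocabulary. The toy taxonomy below is illustrative only; the concept names are assumptions, not terms from the ontology developed in this research or from Protégé.

```python
# Toy is-a taxonomy in the spirit of an asset management ontology.
# Concept names are illustrative assumptions, not the paper's ontology.
IS_A = {
    "PreventiveMaintenance": "Maintenance",
    "CorrectiveMaintenance": "Maintenance",
    "Maintenance": "AssetManagementProcess",
    "RiskAssessment": "AssetManagementProcess",
}

def ancestors(term):
    """Walk the is-a chain from a term up to the root concept."""
    chain = []
    while term in IS_A:
        term = IS_A[term]
        chain.append(term)
    return chain

print(ancestors("PreventiveMaintenance"))
# → ['Maintenance', 'AssetManagementProcess']
```

A tool such as Protégé captures the same subclass relations graphically and adds consistency checking on top.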
Abstract:
This publication is the culmination of a 2-year Australian Learning and Teaching Council Project Priority Programs Research Grant which investigates key issues and challenges in developing flexible guidelines for best practice in Australian Doctoral and Masters by Research examination, encompassing the two modes of investigation, written and multi-modal (practice-led/based) theses, their distinctiveness and their potential interplay. The aims of the project were to address issues of assessment legitimacy raised by the entry of practice-orientated dance studies into Australian higher degrees; examine literal embodiment and presence, as opposed to cultural studies about states of embodiment; foreground the validity of questions around subjectivity and corporeal intelligence/s and the reliability of artistic/aesthetic communications; and finally to celebrate ‘performance mastery’ (Melrose 2003) as a rigorous and legitimate mode of higher research. The project began with questions which centred around: the functions of higher degree dance research; concepts of ‘master-ness’ and ‘doctorateness’; the kinds of languages, structures and processes which may guide candidates, supervisors, examiners and research personnel; the purpose of evaluation/examination; and addressing positive and negative attributes of examination. Finally, the study examined ways in which academic/professional, writing/dancing, tradition/creation and diversity/consistency relationships might be fostered to embrace change. Over two years, the authors undertook a qualitative national study encompassing a triangulation of semi-structured face-to-face interviews and industry forums to gather views from the profession, together with an analysis of existing guidelines and recent literature in the field.
The most significant primary data emerged from 74 qualitative interviews with supervisors, examiners, research deans and administrators, and candidates in dance and more broadly across the creative arts. Qualitative data gathered from the two primary sources was coded and analysed using the NVivo software program. Further perspectives were drawn from international consultant and dance researcher Susan Melrose, as well as publications in the field, and initial feedback from a draft document circulated at the World Dance Alliance Global Summit in July 2008 in Brisbane. Refinement of data occurred in a continual sifting process until the final publication was produced. This process resulted in a set of guidelines in the form of a complex dynamic system for both product- and process-oriented outcomes of multi-modal theses, along with short position papers on issues which arose from the research, such as contested definitions, embodiment and ephemerality, ‘liveness’ in performance research higher degrees, dissolving theory/practice binaries, the relationship between academe and industry, documenting practices, and a re-consideration of the viva voce.
Abstract:
A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modelling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for configurable process modelling is restricted, thus hindering their applicability. Specifically, these notations focus on capturing tasks and control-flow dependencies, neglecting equally important ingredients of business processes such as data and resources. This research fills this gap by proposing a configurable process modelling notation incorporating features for capturing resources, data and physical objects involved in the performance of tasks. The proposal has been implemented in a toolset that assists analysts during the configuration phase and guarantees the correctness of the resulting process models. The approach has been validated by means of a case study from the film industry.
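Deriving an individual process model from a configurable one, as the abstract describes, can be sketched as selecting which optional tasks (with their attached resources) survive configuration. The task names, the `optional` flag, and the configuration vocabulary below are illustrative assumptions, not the notation proposed in this research.

```python
# Sketch: derive a process variant from a configurable model by keeping
# mandatory tasks plus the selected optional ones. Names are illustrative.
configurable_model = [
    {"task": "ReceiveOrder", "resources": ["Clerk"],     "optional": False},
    {"task": "CheckCredit",  "resources": ["Analyst"],   "optional": True},
    {"task": "ShipGoods",    "resources": ["Warehouse"], "optional": False},
]

def configure(model, enabled_optional_tasks):
    """Return the variant containing mandatory and selected optional tasks."""
    return [t for t in model
            if not t["optional"] or t["task"] in enabled_optional_tasks]

variant = configure(configurable_model, enabled_optional_tasks=set())
print([t["task"] for t in variant])  # → ['ReceiveOrder', 'ShipGoods']
```

The correctness guarantee the abstract mentions would, in a real toolset, additionally check that dropping a task does not orphan the data or resources that remaining tasks depend on.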
Abstract:
Organizations increasingly seek to achieve operational excellence by standardizing business processes. Standardization initiatives may have different purposes, such as process streamlining, process automation, or even process outsourcing. However, standardization of processes is easier said than done. Standardization success depends on various factors, such as existing IT capabilities, available standard frameworks, the market situation, and the processes’ nature, such as their level of routine or structuredness. This paper uncovers the complex nature and relative influence of process-internal and process-environmental factors relevant to process standardization by discussing three case studies from different industries. The findings are summarized in a set of initial conjectures about successful process standardization. This exploratory research is a first step towards uncovering the characteristics of successful process standardization efforts.
Abstract:
Most infrastructure projects share the same characteristics in terms of management aspects and shortcomings. Human factors are believed to be the major drawback, owing to the nature of unstructured problems, which can further contribute to management conflicts. This growing complexity in infrastructure projects has shifted the paradigm of policy makers towards adopting Information and Communication Technology (ICT) as a driving force. For this reason, it is vital to fully utilise recent technologies to accelerate the management process, particularly in the planning phase. Consequently, many tools have been developed to assist decision making in construction project management. The variety of uncertainties and alternatives in decision making can be addressed by tools such as Decision Support Systems (DSSs). However, the recent trend shows that most DSSs in this area concentrate only on model development and neglect fundamentals of computing. Thus, most of them have been found complicated and inefficient at supporting decision making within project teams. Given the current shortcomings of many software aspects, it is desirable for a DSS to provide greater simplicity, a better collaborative platform, efficient data manipulation, and responsiveness to user needs. Considering these factors, the paper outlines four challenges for future DSS development: requirements engineering, communication framework, data management and interoperability, and software usability.