Abstract:
Selecting an appropriate business process modelling technique is an important task within the methodological challenges of a business process management project. While a plethora of techniques has been developed over the last decades, there is an obvious shortage of well-accepted reference frameworks that can be used to evaluate and compare the capabilities of the different techniques. Academic progress has been made at least in the area of representational analyses, which use ontology as a benchmark for such evaluations. This paper reflects on comprehensive experiences with the application of a model based on the Bunge ontology in this context. A brief overview of the underlying research model characterizes the different steps in such a research project. A comparative summary of previous representational analyses of process modelling techniques over time gives insights into the relative maturity of selected process modelling techniques. Based on these experiences, suggestions are made as to where ontology-based representational analyses could be further developed and what limitations are inherent to such analyses.
Abstract:
The availability of innumerable intelligent building (IB) products, and the current dearth of inclusive building component selection methods, suggest that decision makers may be confronted with the quandary of forming a particular combination of components to suit the needs of a specific IB project. Despite this problem, few empirical studies have so far been undertaken to analyse the selection of IB systems and to identify key selection criteria for major IB systems. This study is designed to fill these research gaps. Two surveys, a general survey and an analytic hierarchy process (AHP) survey, were conducted to achieve these objectives. The general survey collected views from IB experts and practitioners to identify the perceived critical selection criteria, while the AHP survey prioritized the perceived criteria from the general survey and assigned importance weightings to them. Results generally suggest that each IB system is determined by a disparate set of selection criteria with different weightings. ‘Work efficiency’ is perceived to be the most important core selection criterion for various IB systems, while ‘user comfort’, ‘safety’ and ‘cost effectiveness’ are also considered significant. Two sub-criteria, ‘reliability’ and ‘operating and maintenance costs’, are regarded as prime factors to consider in selecting IB systems. The current study contributes to the industry and to IB research in at least two respects. First, it widens the understanding of the selection criteria for IB systems, as well as their degree of importance. Second, it adopts a multi-criteria AHP approach, a new method for analysing and selecting building systems in IB. Further research should investigate the inter-relationships amongst the selection criteria.
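For readers unfamiliar with the AHP step described above, the following is a minimal sketch of how pairwise comparison judgements are turned into priority weights. The comparison matrix and the three criteria shown are placeholders for illustration only, not the survey data or weightings reported in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (work efficiency, user comfort, cost effectiveness) on the Saaty 1-9 scale.
# These judgements are invented for illustration.
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# Priority weights: normalised principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58  # 0.58 is Saaty's random index for n = 3

print("criterion weights :", np.round(weights, 3))
print("consistency ratio :", round(cr, 3))  # < 0.1 is conventionally acceptable
```

In a full AHP survey, one such matrix is elicited from each respondent for each level of the criteria hierarchy, and the resulting weights are aggregated across respondents.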
Abstract:
More than a century ago in their definitive work “The Right to Privacy” Samuel D. Warren and Louis D. Brandeis highlighted the challenges posed to individual privacy by advancing technology. Today’s workplace is characterised by its reliance on computer technology, particularly the use of email and the Internet to perform critical business functions. Increasingly these and other workplace activities are the focus of monitoring by employers. There is little formal regulation of electronic monitoring in Australian or United States workplaces. Without reasonable limits or controls, this has the potential to adversely affect employees’ privacy rights. Australia has a history of legislating to protect privacy rights, whereas the United States has relied on a combination of constitutional guarantees, federal and state statutes, and the common law. This thesis examines a number of existing and proposed statutory and other workplace privacy laws in Australia and the United States. The analysis demonstrates that existing measures fail to adequately regulate monitoring or provide employees with suitable remedies where unjustifiable intrusions occur. The thesis ultimately supports the view that enacting uniform legislation at the national level provides a more effective and comprehensive solution for both employers and employees. Chapter One provides a general introduction and briefly discusses issues relevant to electronic monitoring in the workplace. Chapter Two contains an overview of privacy law as it relates to electronic monitoring in Australian and United States workplaces. In Chapter Three there is an examination of the complaint process and remedies available to a hypothetical employee (Mary) who is concerned about protecting her privacy rights at work. Chapter Four provides an analysis of the major themes emerging from the research, and also discusses the draft national uniform legislation. Chapter Five details the proposed legislation in the form of the Workplace Surveillance and Monitoring Act, and Chapter Six contains the conclusion.
Abstract:
Although the benefits of service orientation are prevalent in literature, a review, analysis, and evaluation of the 30 existing service analysis approaches presented in this paper have shown that a comprehensive approach to the identification and analysis of both business and supporting software services is missing. Based on this evaluation of existing approaches and additional sources, we close this gap by proposing an integrated, consolidated approach to business and software service analysis that combines and extends the strengths of the examined methodologies.
Abstract:
The service-orientation paradigm has not only become prevalent in the software systems domain in recent years, but is also increasingly applied at the business level to restructure organisational capabilities. In this paper, we present the results of an extensive literature review of 30 approaches related to service identification and analysis for both domains. Based on the consolidation of a superset of comparison criteria for service-oriented methodologies found in related literature, we compare and evaluate the different characteristics of service engineering methods with a focus on service analysis. Although close business and IT alignment is regarded as one of the core beneficial promises of service-orientation, our analysis suggests that there is a lack of a unified, comprehensive methodology for service identification and analysis that integrates and addresses both domains. Thus, we discuss how our results can inform directions for future research in this area.
Negotiating multiple identities between school and the outside world: A critical discourse analysis
Abstract:
This article examines interview talk of three students in an Australian high school to show how they negotiate their young adult identities between school and the outside world. It draws on Bakhtin’s concepts of dialogism and heteroglossia to argue that identities are linguistically and corporeally constituted. A critical discourse analysis of segments of transcribed interviews and student-related public documents finds a mismatch between a social justice curriculum at school and its transfer into students’ accounts of outside school lived realities. The article concludes that a productive social justice pedagogy must use its key principles of (con)textual interrogation to engage students in reflexive practice about their positioning within and against discourses of social justice in their student and civic lives. An impending national curriculum must decide whether or not it negotiates the discursive divide any better.
Abstract:
This document provides an overview of the differences and similarities in the objectives and implementation frameworks of the training and employment policies applying to public construction projects in Western Australia and Queensland. The material in the document clearly demonstrates the extent to which approaches to the pursuit of training objectives in particular have been informed by the experiences of other jurisdictions. The two State governments now have very similar approaches to the promotion of training with the WA government basing a good part of its policy approach on the “Queensland model”. As the two States share many similar economic and other characteristics, and have very similar social and economic goals, this similarity is to be expected. The capacity to benefit from the experiences of other jurisdictions is to be welcomed. The similarity in policy approach also suggests a potential for ongoing collaborations between the State governments on research aimed at further improving training and employment outcomes via public construction projects.
Abstract:
The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures that require rehabilitation. The methodology was developed after a review of current methods established that a life-cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are identified as initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and establish the distribution of whole-of-life-cycle cost. In performing the simulation, the need for a powerful software package that works with a spreadsheet program was identified; after exploring several products on the market, the @RISK software was selected for the simulation. In conclusion, the report presents a typical decision-making scenario considering two alternative treatment options.
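As a rough illustration of the Monte Carlo step described in the abstract, the sketch below combines assumed probability distributions for the identified cost components into a distribution of whole-of-life-cycle cost. The distribution shapes and parameter values are placeholders, not figures from the report, and plain NumPy stands in for the @RISK spreadsheet add-in.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo trials

# Assumed distributions for the cost components of one treatment option.
initial     = rng.normal(loc=500_000, scale=50_000, size=N)                       # initial cost
maintenance = 30 * rng.triangular(left=8_000, mode=12_000, right=20_000, size=N)  # 30 years of maintenance/monitoring/repair
user_cost   = rng.lognormal(mean=np.log(60_000), sigma=0.4, size=N)               # user cost
failure     = rng.binomial(1, 0.02, size=N) * 2_000_000                           # low-probability failure cost

whole_of_life = initial + maintenance + user_cost + failure

print(f"mean whole-of-life cost : {whole_of_life.mean():,.0f}")
print(f"90th percentile cost    : {np.percentile(whole_of_life, 90):,.0f}")
```

Repeating the simulation for each treatment option yields comparable cost distributions, which is the basis of the decision-making scenario mentioned at the end of the abstract.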
Abstract:
This project, as part of a broader Sustainable Sub-divisions research agenda, addresses the role of natural ventilation in reducing the energy required to cool dwellings.
Abstract:
In the previous phase of this project, 2002-059-B Case-Based Reasoning in Construction and Infrastructure Projects, demonstration software was developed using a case-based reasoning engine to access a number of sources of information on the lifetime of metallic building components. One source of information was data from the Queensland Department of Public Housing relating to maintenance operations over a number of years. Maintenance information is seen as a particularly useful source of data about the service life of building components, as it relates to the actual performance of materials in their working environment. If a building was constructed in 1984 and the maintenance records indicate that the guttering was replaced in 2006, then the service life of the gutters was 22 years in that environment. This phase of the project aims to look more deeply at the Department of Housing data, as an example of maintenance records, and to formulate methods for using these data to inform knowledge of service lifetimes.
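The guttering example above can be expressed directly as a query over maintenance records. The sketch below uses invented records and column names; the actual Department of Housing data and schema are not reproduced here.

```python
import pandas as pd

# Hypothetical extract of maintenance records (columns are assumptions).
records = pd.DataFrame({
    "building_id":      [101, 101, 102, 103],
    "component":        ["guttering", "roof sheeting", "guttering", "guttering"],
    "year_constructed": [1984, 1984, 1990, 1979],
    "year_replaced":    [2006, 2010, 2008, 2001],
})

# Service life inferred as the interval between construction and first replacement,
# e.g. guttering installed in 1984 and replaced in 2006 gives 22 years.
records["service_life_years"] = records["year_replaced"] - records["year_constructed"]

# Observed service life per component type.
print(records.groupby("component")["service_life_years"].agg(["count", "mean", "min", "max"]))
```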
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information are paramount. A central communication issue in construction is providing a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time critical and help maintain or improve efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers have focused on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve construction problems that reduce overall productivity. However, to date very little observation has been conducted of the deployment of mobile computers by construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors which need to be considered in making rational economic choices. In principle, a cost benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summing of all costs and benefits to see which is greater. Even though many cost benefit analysis methodologies are available for general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of mobile computing devices on the construction site. Hence, the proposed methodology in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages and then detailed in ten steps. The methodology is provided for the CRC CI 2002-057-C Project: Enabling Team Collaboration with Pervasive and Mobile Computing, and can be seen in detail in Section 3.
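As an illustration of the discounting and comparison steps of the generic procedure described above, the sketch below brings a stream of assumed costs and benefits back to a common base year and compares the totals. The cash flows and discount rate are placeholders, not figures from the CRC CI project.

```python
# Assumed five-year cash flows for an on-site mobile computing deployment.
discount_rate = 0.07
costs    = [120_000, 15_000, 15_000, 15_000, 15_000]  # year 0 purchase plus annual support
benefits = [0, 55_000, 60_000, 60_000, 60_000]        # e.g. time saved on site data capture

# Discount each year's cash flow to present value, then compare the totals.
pv_costs    = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs))
pv_benefits = sum(b / (1 + discount_rate) ** t for t, b in enumerate(benefits))

print(f"PV of costs        : {pv_costs:,.0f}")
print(f"PV of benefits     : {pv_benefits:,.0f}")
print(f"net present value  : {pv_benefits - pv_costs:,.0f}")
print(f"benefit-cost ratio : {pv_benefits / pv_costs:.2f}")
```

A positive net present value (equivalently, a benefit-cost ratio above one) indicates that the discounted benefits of the investment exceed its discounted costs.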
Abstract:
Realistic estimates of short- and long-term (strategic) budgets for maintenance and rehabilitation in road asset management should consider the stochastic characteristics of asset conditions of the road networks, so that the overall variability of road asset condition data is taken into account. Probability theory has been used for assessing life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodology that uses the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002, Salem et al. 2003, Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented by each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI), project no. 2003-029-C.
Abstract:
Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainties and variability in road asset condition and in the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs of large networks for strategic investment. The first task is to assess the variability of road data. This report presents initial results of the analysis of the variability of road data. A case study of the analysis for dry non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing this variability, large road networks were divided into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Typically, only the mean values of asset data for each category are used as input values for investment analysis, and the variability of asset data within each category is not taken into account. The analysis demonstrates that the method can be applied in practice, taking the variability of road data into account when analysing large road networks for maintenance/rehabilitation investment analysis.
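The categorisation and per-category summary step described above can be sketched as follows. The segment data, category labels and condition variable are synthetic stand-ins; the real analysis uses measured network data and additional categories (pavement condition, pavement type, surface type).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic segment-level data (values and column names are assumptions).
segments = pd.DataFrame({
    "soil":      rng.choice(["dry non-reactive", "wet reactive"], size=500),
    "roughness": rng.gamma(shape=4.0, scale=20.0, size=500),   # asset condition index
    "aadt":      rng.lognormal(mean=7.5, sigma=0.6, size=500), # annual average daily traffic
})

# Means and standard deviations of condition and traffic per category: the
# statistical inputs later used to assess variation in budget/cost estimates.
stats = segments.groupby("soil").agg(
    roughness_mean=("roughness", "mean"),
    roughness_std=("roughness", "std"),
    aadt_mean=("aadt", "mean"),
    aadt_std=("aadt", "std"),
    n_segments=("roughness", "size"),
)
print(stats)
```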