Abstract:
Experience plays an important role in building management. “How often will this asset need repair?” or “How much time is this repair going to take?” are the types of questions that project and facility managers face daily in planning activities. Failure or success in developing good schedules, budgets and other project management tasks depends on the project manager's ability to obtain reliable information in order to answer these types of questions. Young practitioners tend to rely on information based on regional averages and provided by publishing companies, whereas experienced project managers tend to rely heavily on personal experience. Another aspect of building management is that many practitioners are seeking to improve available scheduling algorithms, estimating spreadsheets and other project management tools. Such “micro-scale” research is important in providing the tools required for the project manager's tasks. However, even with such tools, low quality input information will produce inaccurate schedules and budgets as output. Thus, it is also important to take a broader, more “macro-scale” approach to research. Recent trends show that the Architecture, Engineering and Construction (AEC) industry is experiencing explosive growth in its capabilities to generate and collect data. A great deal of valuable knowledge can be obtained from the appropriate use of this data, and the need has therefore arisen to analyse this increasing amount of available data. Data mining can be applied as a powerful tool to extract relevant and useful information from this sea of data. Knowledge Discovery in Databases (KDD) and Data Mining (DM) are tools that allow the identification of valid, useful, and previously unknown patterns, so that large amounts of project data may be analysed. These technologies combine techniques from machine learning, artificial intelligence, pattern recognition, statistics, databases, and visualization to automatically extract concepts, interrelationships, and patterns of interest from large databases. The project involves the development of a prototype tool to support facility managers, building owners and designers. This industry-focused report presents the AIMM™ prototype system and documents which data mining techniques can be applied and how, the results of their application, and the benefits gained from the system. The AIMM™ system is capable of searching for useful patterns of knowledge and correlations within existing building maintenance data to support decision making about future maintenance operations. The application of the AIMM™ prototype system to building models and their maintenance data (supplied by industry partners) utilises various data mining algorithms, and the maintenance data is analysed using interactive visual tools. Applying the AIMM™ prototype system to help improve maintenance management and the building life cycle involves: (i) data preparation and cleaning; (ii) integrating meaningful domain attributes; (iii) performing extensive data mining experiments using visual analysis (stacked histograms), classification and clustering techniques, and association rule mining algorithms such as Apriori; and (iv) filtering and refining the data mining results, including the potential implications of these results for improving maintenance management.
Maintenance data for a variety of asset types were selected for demonstration, with the aim of discovering meaningful patterns to assist facility managers in strategic planning and providing a knowledge base to help shape future requirements and design briefing. Using the prototype system developed here, positive and interesting results regarding patterns and structures in the data have been obtained; a minimal sketch of the association rule mining step appears below.
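The abstract does not disclose how AIMM implements its mining step, so the following is only a minimal sketch of the association rule mining stage it names, written against the open-source mlxtend library; the work-order records, attribute names, and support/confidence thresholds are all invented for illustration.

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical maintenance work orders: one "transaction" of attributes per job
jobs = [
    ["guttering", "coastal", "corrosion", "replace"],
    ["guttering", "coastal", "corrosion", "repair"],
    ["roof_sheeting", "coastal", "corrosion", "replace"],
    ["guttering", "inland", "blockage", "repair"],
    ["roof_sheeting", "coastal", "corrosion", "replace"],
]

# One-hot encode the transactions for the Apriori algorithm
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(jobs).transform(jobs), columns=te.columns_)

# Frequent attribute sets, then rules such as {coastal, corrosion} -> {replace}
frequent = apriori(onehot, min_support=0.4, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```

In a fuller workflow the same one-hot table would also feed the classification, clustering and stacked-histogram steps the abstract lists.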
Negotiating multiple identities between school and the outside world: A critical discourse analysis
Abstract:
This article examines interview talk of three students in an Australian high school to show how they negotiate their young adult identities between school and the outside world. It draws on Bakhtin’s concepts of dialogism and heteroglossia to argue that identities are linguistically and corporeally constituted. A critical discourse analysis of segments of transcribed interviews and student-related public documents finds a mismatch between the social justice curriculum at school and its transfer into students’ accounts of their lived realities outside school. The article concludes that a productive social justice pedagogy must use its key principles of (con)textual interrogation to engage students in reflexive practice about their positioning within and against discourses of social justice in their student and civic lives. An impending national curriculum must decide whether or not it negotiates the discursive divide any better.
Abstract:
This document provides an overview of the differences and similarities in the objectives and implementation frameworks of the training and employment policies applying to public construction projects in Western Australia and Queensland. The material in the document clearly demonstrates the extent to which approaches to the pursuit of training objectives in particular have been informed by the experiences of other jurisdictions. The two State governments now have very similar approaches to the promotion of training with the WA government basing a good part of its policy approach on the “Queensland model”. As the two States share many similar economic and other characteristics, and have very similar social and economic goals, this similarity is to be expected. The capacity to benefit from the experiences of other jurisdictions is to be welcomed. The similarity in policy approach also suggests a potential for ongoing collaborations between the State governments on research aimed at further improving training and employment outcomes via public construction projects.
Abstract:
The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures that require rehabilitation. The methodology was developed after a review of current methods established that a life cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are identified as initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and so establish the distribution of whole-of-life-cycle cost. In performing the simulation, the need was identified for a powerful software package that works with a spreadsheet program; after exploring several products on the market, @RISK was selected for the simulation. In conclusion, the report presents a typical decision making scenario considering two alternative treatment options.
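The report ran its simulations in the commercial @RISK add-in; as a library-free illustration of the same Monte Carlo idea, the sketch below combines the named cost components into a whole-of-life-cycle cost distribution. Every distribution shape, cost figure, discount rate and horizon here is a hypothetical placeholder, not a value from the report.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo trials

# Hypothetical input distributions for each cost component (present-day dollars)
initial     = rng.triangular(1.8e6, 2.0e6, 2.5e6, N)     # initial/treatment cost
maintenance = rng.normal(40_000, 8_000, N).clip(min=0)   # annual maintenance/monitoring/repair
user_cost   = rng.lognormal(np.log(15_000), 0.4, N)      # annual user cost
failure     = rng.binomial(1, 0.02, N) * 1.0e6           # failure cost, 2% chance

# Discount the recurring costs over a 30-year horizon at 5% p.a.
pv_factor = (1 / 1.05 ** np.arange(1, 31)).sum()
lcc = initial + (maintenance + user_cost) * pv_factor + failure

print(f"mean whole-of-life cost: {lcc.mean():,.0f}")
print(f"90th percentile:         {np.percentile(lcc, 90):,.0f}")
```

Running the same simulation for each treatment option and comparing the resulting distributions gives the kind of decision-making scenario the report concludes with.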
Abstract:
This project, as part of a broader Sustainable Sub-divisions research agenda, addresses the role of natural ventilation in reducing the energy required to cool dwellings.
Abstract:
In the previous phase of this project, 2002-059-B Case-Based Reasoning in Construction and Infrastructure Projects, demonstration software was developed that used a case-based reasoning engine to access a number of sources of information on the lifetime of metallic building components. One source was data from the Queensland Department of Public Housing relating to maintenance operations over a number of years. Maintenance information is seen as a particularly useful source of data about the service life of building components, as it reflects the actual performance of materials in the working environment. If a building was constructed in 1984 and the maintenance records indicate that the guttering was replaced in 2006, then the service life of the gutters was 22 years in that environment. This phase of the project aims to look more deeply at the Department of Housing data, as an example of maintenance records, and to formulate methods for using this data to inform knowledge of service lifetimes.
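As a minimal sketch of the derivation described above, the snippet below computes a service life as the interval from construction to recorded replacement; the record layout is invented, and only the 1984/2006 guttering figures come from the abstract's own example.

```python
from datetime import date

# Hypothetical maintenance records: (building_id, component, work_date, action)
records = [
    ("B001", "guttering", date(2006, 5, 14), "replace"),
    ("B001", "roof_sheeting", date(1999, 3, 2), "repair"),
]
construction = {"B001": date(1984, 1, 1)}  # construction dates by building

# A replacement closes out one service life for that component
for bid, component, when, action in records:
    if action == "replace":
        years = (when - construction[bid]).days / 365.25
        print(f"{component} at {bid}: ~{years:.0f} years service life")
```

Aggregating such intervals across many buildings and environments is what would turn raw maintenance records into service-lifetime knowledge.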
Abstract:
Construction is an information-intensive industry in which the accuracy and timeliness of information are paramount. The main communication issue observed in construction is the need for a method to exchange data between the site operation, the site office and the head office. The information needs under consideration are time critical and assist in maintaining or improving efficiency at the jobsite; without appropriate computing support, problem solving becomes more difficult. Many researchers have focused on the use of mobile computing devices in the construction industry, believing that mobile computers have the potential to solve some of the construction problems that reduce overall productivity. However, to date very limited observation has been conducted of the deployment of mobile computers for construction workers on-site. Providing field workers with accurate, reliable and timely information at the location where it is needed supports effectiveness and efficiency at the job site. Bringing a new technology into the construction industry requires not only a better understanding of the application, but also proper preparation for the allocation of resources such as people and investment. With this in mind, an accurate analysis is needed to provide a clear idea of the overall costs and benefits of the new technology. A cost benefit analysis is a method of evaluating the relative merits of a proposed investment project in order to achieve an efficient allocation of resources. It is a way of identifying, portraying and assessing the factors that need to be considered in making rational economic choices. In principle, a cost benefit analysis is a rigorous, quantitative and data-intensive procedure, which requires identification of all potential effects, categorisation of these effects as costs and benefits, quantitative estimation of the extent of each cost and benefit associated with an action, translation of these into a common metric such as dollars, discounting of future costs and benefits into the terms of a given year, and summation of all costs and benefits to see which is greater. Even though many cost benefit analysis methodologies are available for general assessment, there is no specific methodology that can be applied to analysing the costs and benefits of mobile computing devices on the construction site. Hence, the methodology proposed in this document is predominantly adapted from Baker et al. (2000), Department of Finance (1995), and Office of Investment Management (2005). The methodology is divided into four main stages, detailed in ten steps. It was developed for CRC CI Project 2002-057-C, Enabling Team Collaboration with Pervasive and Mobile Computing, and is described in detail in Section 3.
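A minimal sketch of the discounting-and-summing core of the procedure just described; the cash flows, five-year horizon and 7% discount rate are hypothetical placeholders rather than figures from the cited methodologies.

```python
# Hypothetical year-by-year cash flows for on-site mobile computing (year 0 first)
costs    = [120_000, 30_000, 30_000, 30_000, 30_000]   # devices, training, support
benefits = [0, 80_000, 90_000, 95_000, 95_000]         # time savings, less rework
rate = 0.07                                            # assumed discount rate

def present_value(flows, rate):
    """Discount a list of year-indexed cash flows to a single present value."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

npv = present_value(benefits, rate) - present_value(costs, rate)
bcr = present_value(benefits, rate) / present_value(costs, rate)
print(f"net present value: {npv:,.0f}")
print(f"benefit-cost ratio: {bcr:.2f}")
```

A positive net present value (or a benefit-cost ratio above one) indicates that the benefits outweigh the costs in discounted terms.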
Abstract:
Realistic estimates of short- and long-term (strategic) budgets for the maintenance and rehabilitation of road assets should consider the stochastic characteristics of asset conditions across road networks, so that the overall variability of road asset condition data is taken into account. Probability theory has been used for assessing life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002, Salem et al. 2003, Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented in each publication and also suggests a methodology for the current research project, funded under the Cooperative Research Centre for Construction Innovation as CRC CI project no. 2003-029-C.
Abstract:
Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainty and variability in road asset conditions and in the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs for large networks for strategic investment. The first task is to assess the variability of road data. This report presents initial results of that analysis. A case study for dry non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing this variability, large road networks were divided into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means, and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Generally, mean values of the asset data for each category are used as input values for investment analysis, and the variability of the asset data within each category is not taken into account. The analysis demonstrated that the method can be applied in practice, taking the variability of road data into account when analysing large road networks for maintenance/rehabilitation investment.
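A minimal sketch of the per-category quantification step described above: compute the mean and standard deviation of one condition variable and fit a candidate probability distribution. The roughness readings and the choice of a lognormal model are invented for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical roughness readings (IRI, m/km) for one network category,
# e.g. dry non-reactive soil, sealed granular pavement
iri = np.array([2.1, 2.4, 1.9, 3.0, 2.6, 2.2, 2.8, 3.3, 2.0, 2.5])

print(f"mean = {iri.mean():.2f}, std = {iri.std(ddof=1):.2f}")

# Fit a lognormal distribution and check it with a Kolmogorov-Smirnov test
shape, loc, scale = stats.lognorm.fit(iri, floc=0)
ks = stats.kstest(iri, "lognorm", args=(shape, loc, scale))
print(f"lognormal fit: KS p-value = {ks.pvalue:.2f}")
```

The fitted distribution, rather than the category mean alone, is what carries the variability forward into the budget/cost reliability analysis.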
Abstract:
Enhancing children's self-concepts is widely accepted as a critical educational outcome of schooling and is postulated as a mediating variable that facilitates the attainment of other desired outcomes such as improved academic achievement. Despite considerable advances in self-concept research, there has been limited progress in devising teacher-administered enhancement interventions. This is unfortunate as teachers are crucial change agents during important developmental periods when self-concept is formed. The primary aim of the present investigation is to build on the promising features of previous self-concept enhancement studies by: (a) combining two exciting research directions developed by Burnett and Craven to develop a potentially powerful cognitive-based intervention; (b) incorporating recent developments in theory and measurement to ensure that the multidimensionality of self-concept is accounted for in the research design; (c) fully investigating the effects of a potentially strong cognitive intervention on reading, mathematics, school and learning self-concepts by using a large sample size and a sophisticated research design; (d) evaluating the effects of the intervention on affective and cognitive subcomponents of reading, mathematics, school and learning self-concepts over time to test for differential effects of the intervention; (e) modifying and extending current procedures to maximise the successful implementation of a teacher-mediated intervention in a naturalistic setting by incorporating sophisticated teacher training as suggested by Hattie (1992) and including an assessment of the efficacy of implementation; and (f) examining the durability of effects associated with the intervention.
Abstract:
In this paper, the stability of an autonomous microgrid with multiple distributed generators (DG) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through Voltage Source Converters (VSC) and all connected loads are passive. The VSCs are controlled by a state feedback controller to achieve the desired voltage and current outputs, which are decided by a droop controller. The state space models of each of the converters with its associated feedback are derived. These are then connected with the state space models of the droop, network and loads to form a homogeneous model, through which the eigenvalues are evaluated. The system stability is then investigated as a function of the droop controller's real and reactive power coefficients. These observations are then verified through simulation studies using PSCAD/EMTDC. It is shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
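The paper's homogeneous model combines converter, droop, network and load states; the sketch below uses only a hypothetical two-state linearisation (all gains invented) to illustrate the method itself: form the state matrix, compute its eigenvalues, and sweep the real power droop coefficient until a real part crosses zero.

```python
import numpy as np

def droop_eigs(m_p, omega_c=31.4, k_pv=8.0, k_load=50.0):
    """Eigenvalues of a toy two-state droop linearisation (not the paper's model).
    States: [angle deviation, frequency deviation]; m_p is the real power droop gain."""
    A = np.array([[0.0, 1.0],
                  [-omega_c * k_pv, -omega_c * (1.0 - m_p * k_load)]])
    return np.linalg.eigvals(A)

# Stability requires every eigenvalue to have a negative real part
for m_p in (0.005, 0.02, 0.05):
    eig = droop_eigs(m_p)
    stable = bool(np.all(eig.real < 0))
    print(f"m_p = {m_p:.3f}: eigenvalues = {np.round(eig, 2)}, stable = {stable}")
```

Sweeping the reactive power coefficient in the same way maps out the stable region of the droop gain plane that the paper investigates.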
Abstract:
In the past decade, the use of ambulance data to inform the prevalence of nonfatal heroin overdose has increased. These data can assist public health policymakers, law enforcement agencies, and health providers in planning and allocating resources. This study examined the 672 ambulance attendances at nonfatal heroin overdoses in Queensland, Australia, in 2000. Gender distribution showed a typical 70/30 male-to-female ratio. Equal numbers of persons with nonfatal heroin overdose were aged 15 to 24 and 25 to 34 years. Police were present in only one in six cases, and 28.1% of patients reported using drugs alone. Ambulance data are proving to be a valuable population-based resource for describing the incidence and characteristics of nonfatal heroin overdose episodes. Future studies could focus on the differences between nonfatal and fatal heroin overdose samples.
Abstract:
LEX is a stream cipher that progressed to Phase 3 of the eSTREAM stream cipher project. In this paper, we show that the security of LEX against algebraic attacks relies on a small equation system not being solvable faster than exhaustive search. We use the byte leakage in LEX to construct a system of 21 equations in 17 variables. This is very close to the requirement for an efficient attack, i.e. a system containing 16 variables. The system requires only 36 bytes of keystream, which is very low.
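The attack's real equations are nonlinear in the LEX state bytes, so the following is only a toy: a linear GF(2) system with the same 21-equation/17-variable shape, solved by Gaussian elimination, to illustrate why an overdefined algebraic system can be attacked far faster than exhausting all 2^17 assignments of its variables. The matrix, secret and seed are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def solve_gf2(A, b):
    """Gaussian elimination over GF(2): return x with A @ x = b (mod 2), or None."""
    M = np.hstack([A, b.reshape(-1, 1)]).astype(np.uint8)
    rows, nvars = M.shape[0], A.shape[1]
    pivots, r = [], 0
    for c in range(nvars):
        hit = next((i for i in range(r, rows) if M[i, c]), None)
        if hit is None:
            continue                      # no pivot in this column
        M[[r, hit]] = M[[hit, r]]         # move pivot row into place
        for i in range(rows):
            if i != r and M[i, c]:
                M[i] ^= M[r]              # eliminate column c from every other row
        pivots.append(c)
        r += 1
    if any(not M[i, :nvars].any() and M[i, nvars] for i in range(rows)):
        return None                       # a 0 = 1 row: inconsistent system
    x = np.zeros(nvars, dtype=np.uint8)
    for row, c in enumerate(pivots):
        x[c] = M[row, nvars]              # any free variables stay 0
    return x

# Toy overdefined system mirroring the 21 equations in 17 variables
secret = rng.integers(0, 2, 17, dtype=np.uint8)
A = rng.integers(0, 2, (21, 17), dtype=np.uint8)
b = (A @ secret) % 2

x = solve_gf2(A, b)
print("solves the system:", x is not None and bool(np.array_equal((A @ x) % 2, b)))
print("matches the secret:", bool(np.array_equal(x, secret)))
```

For the quadratic systems that actually arise in such attacks, the analogous tools are linearisation or Gröbner-basis methods rather than plain elimination.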