106 results for Analysis Tools


Relevance:

40.00%

Publisher:

Abstract:

There is a growing need for parametric design software that communicates building performance feedback during early architectural exploration to support decision-making. This paper examines how the circuit between design and analysis can be closed to provide active, concurrent feedback between the architecture and services engineering domains. It presents the structure of an openly customisable design system that couples parametric modelling and energy analysis software to allow designers to quickly assess the performance of early design iterations. Finally, it discusses how user interactions with the system foster information exchanges that facilitate the sharing of design intelligence across disciplines.

Relevance:

40.00%

Publisher:

Abstract:

Good daylighting design in buildings not only provides a comfortable luminous environment, but also delivers energy savings and comfortable, healthy environments for building occupants. Yet there is still no consensus on how to assess what constitutes good daylighting design. Among current building performance guidelines, daylight factors (DF) or minimum illuminance values are the standard; however, previous research has shown the shortcomings of these metrics. New computer software for daylighting analysis implements more advanced, climate-based daylight metrics (CBDM). Yet these tools (both the new metrics and the simulation tools) are not currently well understood by architects and are not used within architectural firms in Australia. A survey of architectural firms in Brisbane identified the tools most used by industry. The purpose of this paper is to assess and compare these computer simulation tools and the new tools available to architects and designers for daylighting. The tools are assessed in terms of their ease of use (e.g. previous knowledge required, complexity of geometry input), efficiency (e.g. speed, render capabilities) and outcomes (e.g. presentation of results). The study shows that the tools most accessible to architects are those that import a wide variety of file types or can be integrated into current 3D modelling software or packages. These packages need to be able to calculate both point-in-time simulations and annual analyses. There is a current need among these software solutions for an open-source program able to read raw data (in the form of spreadsheets) and display it graphically within a 3D medium. Development of plug-in based software is attempting to meet this need through third-party analysis; however, some of these packages are heavily reliant on their host program.
These plug-ins nevertheless allow dynamic daylighting simulation, making it easier to calculate accurate daylighting regardless of which modelling platform the designer uses, while producing more tangible analysis without the need to process raw data.

Relevance:

40.00%

Publisher:

Abstract:

Background: Appropriate disposition of emergency department (ED) patients with chest pain depends on clinical evaluation of risk. A number of chest pain risk stratification tools have been proposed. The aim of this study was to compare predictive performance for major adverse cardiac events (MACE) using risk assessment tools from the National Heart Foundation of Australia (HFA), the Goldman risk score and the Thrombolysis in Myocardial Infarction risk score (TIMI RS). Methods: This prospective observational study evaluated ED patients aged ≥30 years with non-traumatic chest pain for which no definitive non-ischemic cause was found. Data collected included demographic and clinical information, investigation findings and occurrence of MACE by 30 days. The outcome of interest was the comparative predictive performance of the risk tools for MACE at 30 days, as analyzed by receiver operating characteristic (ROC) curves. Results: Two hundred eighty-one patients were studied; the rate of MACE was 14.1%. The area under the curve (AUC) of the HFA, TIMI RS and Goldman tools for the endpoint of MACE was 0.54, 0.71 and 0.67, respectively, with the difference between the tools in predictive ability for MACE being highly significant [χ²(3) = 67.21, N = 276, p < 0.0001]. Conclusion: The TIMI RS and Goldman tools performed better than the HFA in this undifferentiated ED chest pain population, but selection of cutoffs balancing sensitivity and specificity was problematic. There is an urgent need for validated risk stratification tools specific to the ED chest pain population.
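The ROC comparison above can be illustrated with a minimal sketch: the AUC of a risk score equals the probability that a randomly chosen MACE case receives a higher score than a randomly chosen non-case (the Mann-Whitney statistic). The scores and outcomes below are invented toy data, not data from the study.

```python
def auc(scores, outcomes):
    """AUC = probability a random positive case scores higher than a random non-case."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    # Count pairwise "wins", with ties counted as half a win.
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data: TIMI-style integer risk scores and 30-day MACE outcomes (0/1).
timi_scores = [0, 1, 1, 2, 3, 3, 4, 5, 2, 0]
mace        = [0, 0, 0, 0, 1, 0, 1, 1, 0, 0]
print(round(auc(timi_scores, mace), 2))
```

An AUC of 0.5 corresponds to a score with no discrimination (as with the HFA tool's 0.54 in the study), while 1.0 is perfect separation.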

Relevance:

30.00%

Publisher:

Abstract:

This document reviews international and national practice in investment decision support tools for road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies and criteria adopted by current tools, with particular emphasis on how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies supporting decision-making in road asset management. The complexity of their application differs significantly across international practice, and there is continuing discussion among practitioners and researchers regarding which is more appropriate for supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competing. Multiple Criteria Analysis may be particularly helpful in early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from among a set of alternatives. Benefit Cost Analysis is a useful tool for investment decision-making from an economic perspective, and an extension of the approach that includes social and environmental externalities is currently used to support Triple Bottom Line decision-making in the road sector. However, several issues in its application deserve attention. First, there is a need to reach a degree of commonality in how social and environmental externalities are considered, which may be achieved by aggregating best practices. The level of detail at which externalities are considered should vary with the decision-making level. It is intended to develop a generic framework to coordinate the range of existing practices; a standard framework will also help reduce the double counting that appears in some current practices.
Caution is also warranted in the methods used to determine the value of social and environmental externalities. A number of methods, such as market prices, resource costs and willingness to pay, were found in the review. The use of unreasonable monetisation methods has in some cases discredited Benefit Cost Analysis in the eyes of decision makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practice due to the lack of information and credible models; it may be appropriate to consider these externalities qualitatively within a Multiple Criteria Analysis. Consensus has been reached internationally on considering noise and air pollution, yet Australian practice generally omits these externalities. Equity is an important consideration in road asset management, whether between regions or between social groups defined by income, age, gender or disability. Current practice lacks a well-developed quantitative measure for equity issues, and more research is needed on this point. Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for choosing modelling methods and treating the various externalities, so different analysts are unlikely to reach consistent conclusions about a policy measure. Some current practices favour methods that can prioritise alternatives, such as Goal Programming, the Goal Achievement Matrix and the Analytic Hierarchy Process; others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses, yet the processes of assigning weights and scores have been criticised as highly arbitrary and subjective. It is essential that the process be as transparent as possible.
Obtaining weights and scores by consulting local communities is common practice, but is likely to bias results towards local interests. Interactive approaches can help decision-makers elaborate their preferences, but the computational burden may cause decision-makers to lose interest during the solution of a large-scale problem, such as a large state road network. Current practice tends to use cardinal or ordinal scales to measure non-monetised externalities. Distorted valuations can occur where variables measured in physical units are converted to scales: for example, if decibels of noise are converted to a scale of -4 to +4 by a linear transformation, the difference between 3 and 4 represents a far greater increase in discomfort than the increase from 0 to 1. Assigning different weights to individual scores is one suggested remedy. Due to overlapping goals, double counting also appears in some Multiple Criteria Analyses; the situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have received scant attention in current practice. This report suggests establishing a common analytic framework to deal with these issues.
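The weighted-sum scoring that most Multiple Criteria Analyses rely on can be sketched in a few lines. The criteria, weights and normalised scores below are purely illustrative assumptions, not values from the review; the normalisation step from physical units (e.g. decibels) to 0-1 scores is exactly where the distortions described above can creep in.

```python
def weighted_score(scores, weights):
    """Weighted-sum MCA score; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical criteria and weights for two road investment options.
weights  = {"cost": 0.4, "noise": 0.3, "safety": 0.3}
option_a = {"cost": 0.8, "noise": 0.5, "safety": 0.7}  # normalised 0-1 scores
option_b = {"cost": 0.6, "noise": 0.9, "safety": 0.6}
print(weighted_score(option_a, weights), weighted_score(option_b, weights))
```

Because the ranking depends entirely on the chosen weights and on how physical measurements were mapped onto the 0-1 scores, transparency about both steps is essential, as the review argues.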

Relevance:

30.00%

Publisher:

Abstract:

The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures that require rehabilitation. The methodology was developed after a review of current methods established that a life cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are identified as initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and establish the distribution of whole-of-life-cycle cost. In performing the simulation, the need for a powerful software package that works with a spreadsheet program was identified; after exploring several products on the market, the @RISK software was selected. In conclusion, the report presents a typical decision-making scenario considering two alternative treatment options.
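A minimal sketch of the Monte Carlo approach the report describes: each cost component is drawn from a probability distribution and the draws are summed over many trials to build up the distribution of whole-of-life-cycle cost. The distributions and parameters below are illustrative assumptions, not figures from the report (which used the @RISK package).

```python
import random

def simulate_lcc(n_trials=100_000, seed=1):
    """Return a sorted sample of total life-cycle costs ($k), one per trial."""
    random.seed(seed)
    totals = []
    for _ in range(n_trials):
        initial     = random.triangular(900, 1200, 1000)  # low, high, mode
        maintenance = random.gauss(300, 50)               # symmetric uncertainty
        repair      = random.expovariate(1 / 150)         # right-skewed
        user_cost   = random.uniform(50, 120)
        totals.append(initial + maintenance + repair + user_cost)
    totals.sort()
    return totals

costs = simulate_lcc()
mean = sum(costs) / len(costs)
p90 = costs[int(0.9 * len(costs))]  # 90th percentile of total cost
print(f"mean ~ {mean:,.0f}k, 90th percentile ~ {p90:,.0f}k")
```

The output distribution, rather than a single point estimate, is what lets the decision maker compare treatment options under uncertainty.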

Relevance:

30.00%

Publisher:

Abstract:

Realistic estimates of short- and long-term (strategic) budgets for the maintenance and rehabilitation of road assets should consider the stochastic characteristics of asset conditions across road networks, so that the overall variability of road asset condition data is taken into account. Probability theory has been used to assess life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004) and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data for assessing budgets/costs for road maintenance and rehabilitation (Abaza 2002, Salem et al. 2003, Zhao et al. 2004). Given this gap, this report describes and summarises the methodologies presented by each publication and also suggests a methodology for the current research project funded under the Cooperative Research Centre for Construction Innovation (CRC CI) project no. 2003-029-C.

Relevance:

30.00%

Publisher:

Abstract:

Industrial employment growth has been one of the most dynamic areas of expansion in Asia; however, current trends in industrialised working environments have resulted in greater employee stress. Despite research showing that cultural values affect the way people cope with stress, there is a dearth of psychometrically established tools for measuring these constructs in non-Western countries. Studies of the Ways of Coping Checklist-Revised (WCCL-R) in the West suggest that it has good psychometric properties, but its applicability in the East remains understudied. Confirmatory factor analysis (CFA) was used to validate the WCCL-R constructs in an Asian population of 1,314 participants from Indonesia, Sri Lanka, Singapore, and Thailand. The original factor structure was not confirmed initially; however, a subsequent exploratory factor analysis (EFA) and CFA supported a 38-item, five-factor model. The revised WCCL-R in the Asian sample was also found to have good reliability and sound construct and concurrent validity. The 38-item WCCL-R has considerable potential for future occupational stress research in Asian countries.

Relevance:

30.00%

Publisher:

Abstract:

This report presents the current state of, and approaches to, Building Information Modelling (BIM). It is focused on providing a desktop audit of the current state and capabilities of the products and applications supporting BIM, including BIM model servers as well as discipline-specific applications, a distinction explained below. The report aims to give a broad overview of the tools and applications with respect to their BIM capabilities and in no way claims to be exhaustive for individual tools. Chapter 4 presents research and development agendas pertaining to the BIM approach, based on the observations and analysis from the desktop audit.

Relevance:

30.00%

Publisher:

Abstract:

Many studies in the areas of project management and social networks have identified the significance of knowledge transfer within and between projects. However, only a few studies have examined intra- and inter-project knowledge transfer activities. Knowledge in projects can be transferred via face-to-face interactions on the one hand, and via IT-based tools on the other. Although companies have allocated many resources to IT tools, these are not always effectively utilised, and people prefer to seek knowledge through social, face-to-face interactions. This paper explores how to effectively leverage the two knowledge transfer techniques, face-to-face interaction and IT-based tools, to facilitate knowledge transfer and enhance knowledge creation within and between projects. The paper extends previous research on relationships between and within teams by examining a project's external and internal knowledge networks concurrently. Qualitative social network analysis of a case study within a small-to-medium enterprise was used to examine knowledge transfer activities within and between projects and to investigate knowledge transfer techniques. The paper demonstrates the significance of overlapping employees who work simultaneously on two or more projects, and their impact on facilitating knowledge transfer between projects within a small-to-medium organisation. The research also contributes to a better understanding of the different knowledge transfer techniques used for intra- and inter-project knowledge exchange, and provides recommendations on how to achieve better knowledge transfer within and between projects in order to fully utilise a project's knowledge and achieve better project performance.

Relevance:

30.00%

Publisher:

Abstract:

With increasing pressure to provide environmentally responsible infrastructure products and services, stakeholders are placing significant focus on the early identification of the financial viability and outcomes of infrastructure projects. Traditionally, there has been an imbalance between sustainability measures and project budgets. On one hand, the industry tends to apply a first-cost mentality to developing infrastructure projects; on the other, environmental experts and technology innovators often push for ultimately green products and systems with little concern for cost. This situation is changing quickly as the industry comes under pressure to continue returning profit while better adapting to current and emerging global sustainability issues. For the infrastructure sector to contribute to sustainable development, it will need to increase value and efficiency. There is thus a great need for tools that enable decision makers to evaluate competing initiatives and identify the most sustainable approaches to procuring infrastructure projects. To ensure these objectives are achieved, life-cycle costing analysis (LCCA) will play a significant role in the economics of an infrastructure project. Recently, a few research initiatives have applied LCCA models to road infrastructure, focusing on the traditional economics of a project; there is little coverage of life-cycle costing as a method to evaluate the criteria and assess the economic implications of pursuing sustainability in road infrastructure projects. To rectify this, this paper reviews the theoretical basis of previous LCCA models before discussing their inability to capture sustainability indicators in road infrastructure projects. It then introduces ongoing research aimed at developing a new model that integrates new cost elements, based on sustainability indicators, with the traditional and proven LCCA approach.
It is expected that the research will generate a working model for sustainability-based life-cycle cost analysis.
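A conventional LCCA computation of the kind the paper proposes to extend can be sketched as discounting each year's costs to present value and summing. The discount rate, cost streams and the notional "carbon cost" sustainability element below are hypothetical illustrations, not the paper's model.

```python
def present_value(cost, year, rate):
    """Discount a future cost back to year 0."""
    return cost / (1 + rate) ** year

def lcca(initial, annual_costs, rate=0.04):
    """Life-cycle cost: initial outlay plus discounted (year, cost) pairs."""
    return initial + sum(present_value(c, y, rate) for y, c in annual_costs)

# Toy example: $10M construction, $0.2M/yr maintenance for 30 years, plus a
# hypothetical $0.05M/yr carbon cost as an added sustainability cost element.
maintenance = [(y, 0.20) for y in range(1, 31)]
carbon      = [(y, 0.05) for y in range(1, 31)]
base  = lcca(10.0, maintenance)
green = lcca(10.0, maintenance + carbon)
print(f"baseline LCC ~ {base:.2f}M, with carbon cost ~ {green:.2f}M")
```

The proposed extension amounts to adding further discounted cost streams, priced from sustainability indicators, alongside the traditional ones.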

Relevance:

30.00%

Publisher:

Abstract:

Today’s evolving networks are experiencing a large number of different attacks, ranging from system break-ins to infection by automated attack tools such as worms, viruses and trojan horses, and denial of service (DoS). One important aspect of such attacks is that they are often indiscriminate and target Internet addresses without regard to whether they are bona fide allocated or not. Due to the absence of any advertised host services, the traffic observed on unused IP addresses is by definition unsolicited and likely to be either opportunistic or malicious. The analysis of large repositories of such traffic can be used to extract useful information about both ongoing and new attack patterns and to unearth unusual attack behaviors. However, such analysis is difficult due to the size and nature of the traffic collected on unused address spaces. In this dissertation, we present a network traffic analysis technique which uses traffic collected from unused address spaces and relies on its statistical properties to accurately and quickly detect new and ongoing network anomalies. Detection is based on the concept that anomalous activity usually transforms the network parameters in such a way that their statistical properties no longer remain constant, resulting in abrupt changes. We use sequential analysis techniques to identify changes in the behavior of network traffic targeting unused address spaces and so unveil both ongoing and new attack patterns. Specifically, we have developed a dynamic sliding-window, non-parametric cumulative sum (CUSUM) change detection technique for identifying changes in network traffic. Furthermore, we have introduced dynamic thresholds to detect changes in network traffic behavior and to detect when a particular change has ended.
Experimental results are presented that demonstrate the operational effectiveness and efficiency of the proposed approach, using both synthetically generated datasets and real network traces collected from a dedicated block of unused IP addresses.
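The non-parametric CUSUM idea underlying the detector can be illustrated with a simplified one-sided version: accumulate deviations of each observation above a reference level and flag a change when the cumulative sum crosses a threshold. The sliding-window and dynamic-threshold machinery of the dissertation is reduced here to fixed constants, and the traffic counts are invented.

```python
def cusum(series, reference, threshold):
    """Return the index where the upper CUSUM first exceeds threshold, else None."""
    s = 0.0
    for i, x in enumerate(series):
        s = max(0.0, s + (x - reference))  # one-sided (upper) CUSUM
        if s > threshold:
            return i
    return None

# Toy traffic counts (packets/s seen on an unused address block): the jump at
# index 10 mimics the onset of scanning activity.
traffic = [5, 6, 4, 5, 7, 5, 6, 5, 4, 6, 30, 28, 33, 31, 29]
print(cusum(traffic, reference=8.0, threshold=40.0))
```

Because the statistic only accumulates while observations stay above the reference level, quiet background traffic keeps the sum at zero and a sustained shift is detected within a few samples.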

Relevance:

30.00%

Publisher:

Abstract:

Cleaning of sugar mill evaporators is an expensive exercise. Identifying the scale components assists in determining which chemical cleaning agents will result in effective evaporator cleaning. The current methods used for scale characterisation (based on x-ray diffraction, ion exchange/high performance liquid chromatography and thermogravimetry/differential thermal analysis) are difficult, time consuming and expensive, and cannot be performed in a conventional analytical laboratory or by mill staff. The present study examined the use of simple descriptor tests for characterising Australian sugar mill evaporator scales. Scale samples were obtained from seven Australian sugar mill evaporators by mechanical means. The appearance, texture and colour of the scale were noted before the samples were characterised using x-ray fluorescence and x-ray powder diffraction to determine the compounds present. A number of commercial analytical test kits were used to determine the phosphate and calcium contents of the scale samples. Dissolution experiments were carried out on the scale samples with selected cleaning agents to provide relevant information about their effect on different evaporator scales. The results show that a prediction of the scale composition can be made simply by identifying the colour and appearance of the scale and its elemental composition, and by knowing from which effect the scale originates. These descriptors, together with dissolution experiments on scale samples, give factory staff a rapid on-site process for predicting the most effective chemicals for chemical cleaning of the evaporators.

Relevance:

30.00%

Publisher:

Abstract:

Road agencies require comprehensive, relevant and quality data describing their road assets to support their investment decisions. An investment decision support system for road maintenance and rehabilitation mainly comprises three important supporting elements: road asset data, decision support tools, and criteria for decision-making. Probability-based methods have played a crucial role in helping decision makers understand the relationships among road-related data, asset performance and the uncertainties in estimating budgets/costs for road management investment. This paper presents applications of the probability-based method for road asset management.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is devoted to the study of linear relationships in symmetric block ciphers. A block cipher is designed so that the ciphertext is produced as a nonlinear function of the plaintext and secret master key. However, linear relationships within the cipher can still exist if the texts and components of the cipher are manipulated in a number of ways, as shown in this thesis. There are four main contributions. The first is the extension of the applicability of integral attacks from word-based to bit-based block ciphers. Integral attacks exploit linear relationships between texts at intermediate stages of encryption; such a relationship can be used to recover subkey bits in a key recovery attack. In principle, integral attacks can be applied to bit-based block ciphers, but specific tools to define the attack on these ciphers were not available. This problem is addressed by introducing a refined set of notations to describe the attack. The bit pattern-based integral attack is successfully demonstrated on reduced-round variants of the block ciphers Noekeon, Present and Serpent. The second contribution is the discovery of a very small system of equations describing the LEX-AES stream cipher. LEX-AES is based heavily on the 128-bit-key (16-byte) Advanced Encryption Standard (AES) block cipher. In one instance, the system contains 21 equations in 17 unknown bytes, very close to the upper limit for an exhaustive key search, which is 16 bytes; only 36 bytes of keystream are needed to generate the equations. The security of this cipher therefore depends on the difficulty of solving this small system. The third contribution is the proposal of an alternative method to measure diffusion in the linear transformation of Substitution-Permutation Network (SPN) block ciphers. Currently, the branch number is widely used for this purpose.
The branch number is useful for estimating the possible success of differential and linear attacks on a particular SPN cipher, but it gives no information on the number of input bits left unchanged by the transformation when producing the output bits. The new measure introduced in this thesis is intended to complement the branch number. It is based on fixed points and simple linear relationships between the input and output words of the linear transformation, and represents the average fraction of input words to a linear diffusion transformation that are not effectively changed by the transformation. The measure is applied to the block ciphers AES, ARIA, Serpent and Present; it is shown that, except for Serpent, the linear transformations used in these ciphers do not behave as expected for a random linear transformation. The fourth contribution is the identification of linear paths in the nonlinear round function of the SMS4 block cipher. SMS4 is used as a standard in the Chinese WLAN Authentication and Privacy Infrastructure (WAPI), and hence its round function should exhibit a high level of nonlinearity. However, the findings in this thesis on the existence of linear relationships show that this is not the case: in some exceptional cases, the first four rounds of SMS4 are effectively linear, reducing the effective number of rounds from 32 to 28. These findings raise questions about the security provided by SMS4 and might provide clues to the existence of a flaw in the cipher's design.
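The flavour of the fixed-point-based diffusion measure can be conveyed with a toy estimate over GF(2): sample random inputs to a binary linear transformation and measure the average fraction of bits that pass through unchanged. The 8x8 matrix below is a contrived example, not a real cipher's linear layer, and single bits stand in for the word-level quantities the thesis actually measures.

```python
import random

def apply_gf2(matrix, vec):
    """Multiply a binary matrix by a bit vector over GF(2)."""
    return [sum(m & v for m, v in zip(row, vec)) % 2 for row in matrix]

def unchanged_fraction(matrix, trials=2000, seed=0):
    """Estimate the average fraction of input bits left unchanged by the map."""
    rng = random.Random(seed)
    n = len(matrix)
    total = 0.0
    for _ in range(trials):
        vec = [rng.randrange(2) for _ in range(n)]
        out = apply_gf2(matrix, vec)
        total += sum(a == b for a, b in zip(vec, out)) / n
    return total / trials

# A deliberately weak, identity-heavy transformation: almost every bit is
# passed straight through, so diffusion is poor and the fraction is high.
n = 8
weak = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
weak[0][1] = 1  # a single off-diagonal tap
print(round(unchanged_fraction(weak), 2))
```

A transformation with good diffusion would drive this fraction toward what a random linear map achieves, which is the kind of comparison the thesis makes for AES, ARIA, Serpent and Present.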

Relevance:

30.00%

Publisher:

Abstract:

Now in its second edition, this book describes tools that are commonly used in transportation data analysis. The first part of the text provides statistical fundamentals while the second part presents continuous dependent variable models. With a focus on count and discrete dependent variable models, the third part features new chapters on mixed logit models, logistic regression, and ordered probability models. The last section provides additional coverage of Bayesian statistical modeling, including Bayesian inference and Markov chain Monte Carlo methods. Data sets are available online to use with the modeling techniques discussed.