817 results for Cost-benefit
Abstract:
A pressing cost issue facing construction is the procurement of off-site pre-manufactured assemblies. In order to encourage Australian adoption of off-site manufacture (OSM), a new approach to the underlying processes is required. The advent of object-oriented digital models for construction design assumes intelligent use of data. However, the construction production system still relies on traditional methods and data sources and is expected to benefit from the application of well-established business process management techniques. Integrating the old and new data sources allows for the development of business process models which, by capturing typical construction processes involving OSM, provide insights into such processes. This integrative approach is the foundation of research into the use of OSM to increase construction productivity in Australia. The purpose of this study is to develop business process models capturing the procurement, resources and information flow of construction projects. For each stage of the construction value chain, a number of sub-processes are identified. Business Process Modelling Notation (BPMN), a mainstream business process modelling standard, is used to create baseline generic construction process models. These models identify OSM decision-making points that could provide cost reductions in procurement workflow and management systems. This paper reports on phase one of ongoing research aiming to develop a prototype workflow application that can provide semi-automated support to construction processes involving OSM and assist in decision-making on the adoption of OSM, thus contributing to a sustainable built environment.
Abstract:
Organizations from every industry sector seek to enhance their business performance and competitiveness through the deployment of contemporary information systems (IS), such as Enterprise Systems (ERP). Investments in ERP are complex and costly, attracting scrutiny and pressure to justify their cost. Thus, IS researchers highlight the need for systematic evaluation of information system success, or impact, which has resulted in the introduction of varied models for evaluating information systems. One of these systematic measurement approaches is the IS-Impact Model introduced by a team of researchers at Queensland University of Technology (QUT) (Gable, Sedera, & Chan, 2008). The IS-Impact Model is conceptualized as a formative, multidimensional index that consists of four dimensions. Gable et al. (2008) define IS-Impact as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (p. 381). The IT Evaluation Research Program (ITE-Program) at QUT has grown the IS-Impact Research Track with the central goal of conducting further studies to enhance and extend the IS-Impact Model. The overall goal of the IS-Impact research track at QUT is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable, 2009). To achieve that, the IS-Impact research track advocates programmatic research guided by the principles of tenacity, holism, and generalizability through extension research strategies. This study was conducted within the IS-Impact Research Track to further generalize the IS-Impact Model by extending it to the Saudi Arabian context. According to Hofstede (2012), the national culture of Saudi Arabia differs significantly from the Australian national culture, making Saudi Arabia an interesting context for testing the external validity of the IS-Impact Model.
The study re-visits the IS-Impact Model from the ground up. Rather than assume the existing instrument is valid in the new context, or simply assess its validity through quantitative data collection, the study takes a qualitative, inductive approach to re-assessing the necessity and completeness of existing dimensions and measures. This is done in two phases: an Exploratory Phase and a Confirmatory Phase. The Exploratory Phase addresses the first research question of the study: "Is the IS-Impact Model complete and able to capture the impact of information systems in Saudi Arabian organizations?". The content analysis, used to analyze the Identification Survey data, indicated that two of the 37 measures of the IS-Impact Model are not applicable to the Saudi Arabian context. Moreover, no new measures or dimensions were identified, evidencing the completeness and content validity of the IS-Impact Model. In addition, the Identification Survey data suggested several concepts related to IS-Impact, the most prominent of which was "Computer Network Quality" (CNQ). The literature supported the existence of a theoretical link between IS-Impact and CNQ (CNQ is viewed as an antecedent of IS-Impact). With the primary goal of validating the IS-Impact Model within its extended nomological network, CNQ was introduced to the research model. The Confirmatory Phase addresses the second research question of the study: "Is the extended IS-Impact Model valid as a hierarchical multidimensional formative measurement model?". The objective of the Confirmatory Phase was to test the validity of the IS-Impact Model and the CNQ Model. To achieve that, IS-Impact, CNQ, and IS-Satisfaction were operationalized in a survey instrument, and the research model was then assessed using the Partial Least Squares (PLS) approach. The CNQ Model was validated as a formative model. Similarly, the IS-Impact Model was validated as a hierarchical multidimensional formative construct.
However, the analysis indicated that one of the IS-Impact Model indicators was insignificant and could be removed from the model. Thus, the resulting extended IS-Impact Model consists of 4 dimensions and 34 measures. Finally, the structural model was also assessed on two aspects: explanatory and predictive power. The analysis revealed that the path coefficient between CNQ and IS-Impact is significant (t = 4.826) and relatively strong (β = 0.426), with CNQ explaining 18% of the variance in IS-Impact. These results supported the hypothesis that CNQ is an antecedent of IS-Impact. The study demonstrates that the quality of the computer network affects the quality of the Enterprise System (ERP) and, consequently, the impacts of the system. Therefore, practitioners should pay attention to computer network quality. Similarly, the path coefficient between IS-Impact and IS-Satisfaction was significant (t = 17.79) and strong (β = 0.744), with IS-Impact alone explaining 55% of the variance in IS-Satisfaction, consistent with the results of the original IS-Impact study (Gable et al., 2008). The research contributions include: (a) supporting the completeness and validity of the IS-Impact Model as a hierarchical multidimensional formative measurement model in the Saudi Arabian context, (b) operationalizing Computer Network Quality as conceptualized in the ITU-T Recommendation E.800 (ITU-T, 1993), (c) validating CNQ as a formative measurement model and as an antecedent of IS-Impact, and (d) conceptualizing and validating IS-Satisfaction as a reflective measurement model and as an immediate consequence of IS-Impact. The CNQ Model provides a framework to perceptually measure computer network quality from multiple perspectives, and features an easy-to-understand, easy-to-use, and economical survey instrument.
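The reported coefficients and variance figures are internally consistent: in a structural model where a construct has a single standardized predictor, the variance explained equals the squared path coefficient (R² = β²). A quick check using the figures from the abstract (the single-predictor reading is our assumption, not a claim about the full model):

```python
# Check that the reported R^2 values follow from the path coefficients,
# assuming each endogenous construct has a single predictor (R^2 = beta^2).
# Figures are taken from the abstract above.

paths = {
    "CNQ -> IS-Impact": {"beta": 0.426, "reported_r2": 0.18},
    "IS-Impact -> IS-Satisfaction": {"beta": 0.744, "reported_r2": 0.55},
}

for name, p in paths.items():
    implied_r2 = p["beta"] ** 2
    print(f"{name}: beta = {p['beta']}, implied R^2 = {implied_r2:.3f}, "
          f"reported R^2 = {p['reported_r2']}")
```

Both implied values (0.181 and 0.554) round to the percentages quoted in the abstract.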
Abstract:
Low-cost level crossings are often criticized as being unsafe. Does a SIL (safety integrity level) rating make a railway crossing any safer? This paper discusses how a supporting argument might be made for low-cost level crossing warning devices with lower levels of safety integrity, addressing issues such as risk tolerability and the derivation of tolerable hazard rates for system-level hazards. As part of the design of such systems according to fail-safe principles, the paper considers the assumptions around the pre-defined safe states of existing warning devices and how human factors issues around such states can give rise to additional hazards.
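The derivation of a tolerable hazard rate mentioned above can be illustrated with a simple risk-apportionment calculation: a tolerable individual-risk target is divided across exposure frequency and consequence severity. The structure is a common one in railway safety argumentation; every number below is an illustrative assumption for the sketch, not a figure from the paper.

```python
# Illustrative tolerable-hazard-rate (THR) apportionment for a level-crossing
# warning device. All numbers are assumptions made for this sketch.

tolerable_individual_risk = 1e-6   # assumed tolerable fatalities per person per year
crossings_per_person_year = 500    # assumed crossing events per regular user per year
p_fatality_given_hazard = 0.01     # assumed P(fatality | hazardous failure at a crossing)

# THR: the rate of hazardous failures per demand the system may exhibit
# while keeping each regular user within the individual-risk target.
thr_per_demand = tolerable_individual_risk / (
    crossings_per_person_year * p_fatality_given_hazard)

print(f"Tolerable hazard rate: {thr_per_demand:.1e} per demand")  # 2.0e-07
```

A target in this range would then be compared against the hazard rates achievable at each SIL to decide whether a lower-integrity, low-cost device can be justified.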
Abstract:
The ultimate goal of an access control system is to allocate each user the precise level of access they need to complete their job - no more and no less. This proves to be challenging in an organisational setting. On one hand, employees need enough access to the organisation’s resources to perform their jobs; on the other, more access brings an increased risk of misuse - either intentional, where an employee uses the access for personal benefit, or unintentional, through carelessness or being socially engineered into giving access to an adversary. This thesis investigates issues with existing approaches to access control in allocating the optimal level of access to users and proposes solutions in the form of new access control models. These issues are most evident when uncertainty surrounding users’ access needs, incentive to misuse and accountability are considered, hence the title of the thesis. We first analyse access control in environments where the administrator is unable to identify the users who may need access to resources. To resolve this uncertainty, an administrative model with delegation support is proposed. Further, a detailed technical enforcement mechanism is introduced to ensure delegated resources cannot be misused. We then explicitly consider that users are self-interested and capable of misusing resources if they choose to. We propose a novel game-theoretic access control model to reason about and influence the factors that may affect users’ incentive to misuse. Next, we study access control in environments where neither users’ access needs can be predicted nor users held accountable for misuse. It is shown that by allocating a budget to users - a virtual currency through which they can pay for the resources they deem necessary - the need for a precise pre-allocation of permissions can be relaxed. The budget also imposes an upper bound on users’ ability to misuse.
A generalised budget allocation function is proposed, and it is shown that, given the context information, the optimal level of budget for users can always be determined numerically. Finally, the Role Based Access Control (RBAC) model is analysed under the explicit assumption of administrators’ uncertainty about self-interested users’ access needs and their incentives to misuse. A novel Budget-oriented Role Based Access Control (B-RBAC) model is proposed. The new model introduces the notion of users’ behaviour into RBAC and provides means to influence users’ incentives. It is shown how RBAC policy can be used to individualise the cost of access to resources and also to determine users’ budgets. The implementation overheads of B-RBAC are examined and several low-cost sub-models are proposed.
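The core budget mechanism can be sketched in a few lines: each user holds a stock of virtual currency, every access to a resource costs a price, and a request is granted only if the user can pay. Total misuse is then bounded by the allocated budget. The class, names, and prices below are our illustrative assumptions, not the thesis's formal model.

```python
# Minimal sketch of budget-based access control: users pay a virtual-currency
# price per access, so total (mis)use is bounded by the allocated budget.
# All names and numbers here are illustrative assumptions.

class BudgetAccessControl:
    def __init__(self, budgets, prices):
        self.budgets = dict(budgets)   # user -> remaining virtual currency
        self.prices = dict(prices)     # resource -> cost per access

    def request(self, user, resource):
        price = self.prices[resource]
        if self.budgets.get(user, 0) >= price:
            self.budgets[user] -= price   # grant: deduct the price
            return True
        return False                      # deny: budget exhausted

acl = BudgetAccessControl(budgets={"alice": 10}, prices={"report": 4})
print(acl.request("alice", "report"))  # True  (budget 10 -> 6)
print(acl.request("alice", "report"))  # True  (budget 6 -> 2)
print(acl.request("alice", "report"))  # False (2 < 4)
```

Note how no pre-allocation of permissions is needed: any user with remaining budget may access any priced resource, and the administrator shapes incentives by setting prices and budgets rather than enumerating permissions.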
Abstract:
Objective: The aim of the study was to assess the relationship between dimensions of perfectionism and suicide ideation in a tertiary student population in Australia. Method: 405 students completed the General Health Questionnaire (GHQ-28), which includes a subset of questions that can be used to assess suicide ideation, and the Multidimensional Perfectionism Scale. Results: The presence of suicide ideation was associated with higher scores on total perfectionism, two perfectionism dimensions, and total GHQ scores. There were significant differences between participants with high levels of perfectionism and participants with moderate to low levels of perfectionism on a measure of suicide ideation. Neither gender nor age was associated with differences in the scores, with results indicating that high levels of perfectionism may indicate a vulnerability to suicide ideation. Conclusions: Perfectionism is a valued attribute in high-achieving populations. The question needs to be asked, however: at what cost? The findings indicate that high levels of perfectionism may be associated with an increased vulnerability to suicide ideation. Future research is needed to gain a better understanding of the complex interrelationship between personality and temperament, environmental factors and self-destructive behaviour.
Abstract:
This paper examines the relationship between financial performance and ethical screening intensity of a special class of ethical funds that is rooted in Islamic values – Islamic equity funds (IEFs). These faith-based ethical funds screen investments on compliance with Islamic values where conventional interest expense (riba), gambling (maysir), excessive uncertainty (gharar), and non-ethical (non-halal) products are prohibited. We test whether these extra screens affect the financial performance of IEFs relative to non-Islamic funds. Based on a large survivorship-free international sample of 387 Islamic funds, our results show that IEFs on average underperform conventional funds by 40 basis points per month, or 4.8% per year (supporting the underperformance hypothesis). While Islamic funds do not generally perform better during crisis periods, they outperformed conventional funds during the recent sub-prime crisis (supporting the outperformance hypothesis). Using holdings-based measures for ethical screening intensity, results show IEFs that apply more intensive screening perform worse, suggesting that there is a cost to being ethical.
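The annualization of the 40-basis-point monthly underperformance can be verified directly: the abstract's 4.8% per year is the simple (arithmetic) annualization, while geometric compounding gives a slightly smaller figure. A quick check:

```python
# Verify the annualized underperformance figure quoted in the abstract:
# 40 basis points per month.
monthly_underperformance = 0.0040          # 40 bps as a decimal fraction

simple_annual = monthly_underperformance * 12                  # arithmetic
compound_annual = 1 - (1 - monthly_underperformance) ** 12     # geometric

print(f"simple annualization:   {simple_annual:.1%}")    # 4.8%
print(f"compound annualization: {compound_annual:.2%}")
```

The difference between the two conventions is small at this magnitude (roughly a tenth of a percentage point), so the quoted 4.8% is robust to the choice.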
Abstract:
In March 2008, the Australian Government announced its intention to introduce a national Emissions Trading Scheme (ETS), now expected to start in 2015. This impending development provides an ideal setting to investigate the impact an ETS in Australia will have on the market valuation of Australian Securities Exchange (ASX) firms. This is the first empirical study into the pricing effects of the ETS in Australia. Primarily, we hypothesize that firm value will be negatively related to a firm's carbon intensity profile. That is, there will be a greater impact on firm value for high carbon emitters in the period prior (2007) to the introduction of the ETS, whether for reasons relating to the existence of unbooked liabilities associated with future compliance and/or abatement costs, or for reasons relating to reduced future earnings. Using a sample of 58 Australian listed firms (constrained by the current availability of emissions data) which comprise larger, more profitable and less risky listed Australian firms, we first undertake an event study focusing on five distinct information events argued to impact the probability of the proposed ETS being enacted. Here, we find direct evidence that the capital market is indeed pricing the proposed ETS. Second, using a modified version of the Ohlson (1995) valuation model, we undertake a valuation analysis designed not only to complement the event study results, but more importantly to provide insights into the capital market's assessment of the magnitude of the economic impact of the proposed ETS as reflected in market capitalization. Here, our results show that the market assesses the most carbon intensive sample firms a market value decrement relative to other sample firms of between 7% and 10% of market capitalization. Further, based on the carbon emission profile of the sample firms we imply a ‘future carbon permit price’ of between AUD$17 per tonne and AUD$26 per tonne of carbon dioxide emitted. 
This estimate is more precise than those of industry reports, which set a carbon price of between AUD$15 and AUD$74 per tonne.
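The implied-permit-price reasoning can be sketched as a back-of-envelope calculation: if the market-value decrement attributed to the ETS reflects capitalized permit liabilities, the implied price per tonne is that decrement divided by the emissions the market is assumed to be pricing. Every input below is an illustrative assumption, not data from the study's sample.

```python
# Illustrative back-of-envelope for an implied carbon permit price:
# divide the ETS-related market-value decrement by the emissions the
# market is assumed to be capitalizing. All inputs are made up.

market_cap = 2.0e9               # assumed firm market capitalization (AUD)
decrement_fraction = 0.08        # assumed ETS decrement (midpoint of the 7-10% range)
priced_emissions_tonnes = 8.0e6  # assumed tonnes of CO2 the market prices in

implied_price = market_cap * decrement_fraction / priced_emissions_tonnes
print(f"Implied permit price: AUD${implied_price:.0f} per tonne")  # AUD$20
```

With these made-up inputs the implied price falls inside the AUD$17-26 range the study reports, illustrating the mechanics rather than reproducing the study's valuation model.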
Abstract:
Martin Skitmore introduces a most "remarkable couple", Rod and Annie Stewart of Huntsville, Alabama (and elsewhere), and their post-retirement business, Mobile Data Services. Contrary to popular expectations, Rod and Annie are not only computer-friendly, but are almost entirely dependent on the new technology for their survival.
Abstract:
Background: Surgical site infection (SSI) is associated with substantial costs for health services, reduced quality of life, and poorer functional outcomes. The aim of this study was to evaluate the cost-effectiveness of strategies claiming to reduce the risk of SSI in hip arthroplasty in Australia. Methods: Baseline use of antibiotic prophylaxis (AP) was compared with no antibiotic prophylaxis (no AP), antibiotic-impregnated cement (AP + ABC), and laminar air operating rooms (AP + LOR). A Markov model was used to simulate long-term health and cost outcomes of a hypothetical cohort of 30,000 total hip arthroplasty patients from a health services perspective. Model parameters were informed by the best available evidence. Uncertainty was explored in probabilistic sensitivity and scenario analyses. Results: Stopping the routine use of AP resulted in over Australian dollars (AUD) $1.5 million in extra costs and a loss of 163 quality-adjusted life years (QALYs). Using antibiotic cement in addition to AP (AP + ABC) generated an extra 32 QALYs while saving over AUD $123,000. The use of laminar air operating rooms combined with routine AP (AP + LOR) resulted in an AUD $4.59 million cost increase and 127 QALYs lost compared with the baseline comparator. Conclusion: Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalized patients, save lives, and enhance resource allocation. Based on this evidence, the use of laminar air operating rooms is not recommended.
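The Markov cohort approach described above can be sketched as a small state-transition simulation: the cohort is distributed over health states, and each cycle it moves according to a transition matrix while accumulating costs and QALYs. The states, cycle length, and all transition probabilities, costs, and utilities below are illustrative assumptions, not the study's calibrated parameters.

```python
import numpy as np

# Minimal Markov cohort sketch for an SSI cost-effectiveness model.
# States: 0 = well post-arthroplasty, 1 = surgical site infection, 2 = dead.
# All probabilities, costs, and utilities are illustrative assumptions.

P = np.array([
    [0.97, 0.02, 0.01],   # well -> well / SSI / dead, per yearly cycle
    [0.80, 0.15, 0.05],   # SSI may resolve, persist, or lead to death
    [0.00, 0.00, 1.00],   # dead is absorbing
])
cost = np.array([500.0, 20000.0, 0.0])   # assumed AUD per person-year per state
utility = np.array([0.85, 0.60, 0.0])    # assumed QALY weights per state

cohort = np.array([30000.0, 0.0, 0.0])   # all 30,000 patients start well
total_cost = total_qalys = 0.0
for _ in range(20):                      # 20 yearly cycles
    total_cost += cohort @ cost          # accumulate costs this cycle
    total_qalys += cohort @ utility      # accumulate QALYs this cycle
    cohort = cohort @ P                  # transition to next cycle

print(f"Total cost: AUD${total_cost:,.0f}, QALYs: {total_qalys:,.0f}")
```

In a full analysis, two such models (one per strategy) would differ in their SSI transition probability and intervention cost; the differences in accumulated cost and QALYs give the incremental cost-effectiveness comparison reported in the abstract.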
Abstract:
Teachers of construction economics and estimating have long recognised that there is more to construction pricing than detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning - neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose paper in 1974 is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner.
A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors into estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors to produce relatively very accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the building's specification, and even sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect. The problem with this is that our experts do not seem able to verbalise their requirements in this respect - a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study.
It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held very close views to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems, such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources as, typically, only a few gene sequences can be stored simultaneously in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations, so minimizing the number of these operations yields much faster run-times. This paper develops an approach for faster and scalable computing of large correlation matrices through full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over existing approaches is demonstrated through benchmark examples.
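The general blocking idea behind such approaches can be sketched as follows: standardize each sequence's profile once, then fill the correlation matrix tile by tile from pairs of row-blocks sized to fit in memory, exploiting symmetry so each tile is computed only once. In a real out-of-core setting the array would be memory-mapped from disk and each block read would be one I/O operation; the small in-memory array and block size below are stand-ins for the sketch, not the paper's algorithm.

```python
import numpy as np

# Sketch of blocked correlation-matrix computation: process the data in
# row-blocks that fit in memory and fill the matrix tile by tile. In an
# out-of-core setting `data` would be an np.memmap over an on-disk array.

def blocked_corrcoef(data, block):
    n, m = data.shape
    # Standardize rows once (sample std, ddof=1) so the Pearson correlation
    # reduces to a dot product divided by (m - 1).
    z = (data - data.mean(axis=1, keepdims=True)) / \
        data.std(axis=1, keepdims=True, ddof=1)
    corr = np.empty((n, n))
    for i in range(0, n, block):
        zi = z[i:i + block]               # one block-sized read of rows
        for j in range(i, n, block):      # upper-triangular tiles only
            zj = z[j:j + block]
            tile = zi @ zj.T / (m - 1)    # correlations for this tile
            corr[i:i + block, j:j + block] = tile
            corr[j:j + block, i:i + block] = tile.T  # mirror by symmetry
    return corr

rng = np.random.default_rng(0)
seqs = rng.standard_normal((10, 200))     # 10 stand-in sequence profiles
assert np.allclose(blocked_corrcoef(seqs, block=4), np.corrcoef(seqs))
```

With block size b, each row participates in O(n/b) tile computations instead of being re-read once per pairing, which is where the reduction in I/O operations comes from.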