952 results for Quality costs
Abstract:
Previous research has put forward a number of properties of business process models that have an impact on their understandability. Two such properties are compactness and (block-)structuredness. What has not been sufficiently appreciated to date is that these desirable properties may be at odds with one another. This paper presents the results of a two-pronged study aimed at exploring the trade-off between compactness and structuredness of process models. The first prong of the study is a comparative analysis of the complexity of a set of unstructured process models from industrial practice and of their corresponding structured versions. The second prong is an experiment wherein a cohort of students was exposed to semantically equivalent unstructured and structured process models. The key finding is that structuredness is not an absolute desideratum for process model understandability. Instead, subtle trade-offs between structuredness and other model properties are at play.
Abstract:
The case study 3 team viewed the mitigation of noise and air pollution generated in the transport corridor that borders the study site as a paramount driver of the urban design solution. The following key urban planning strategies were adopted:
* Spatial separation from the transport corridor pollution source. A linear green zone and environmental buffer was proposed adjacent to the transport corridor to mitigate the environmental noise and air quality impacts of the corridor, and to offer residents opportunities for recreation.
* Open space forming the key structural principle for neighbourhood design. A significant open space system underpins the planning and manages surface water flows.
* Urban blocks running on an east-west axis. The open space rationale emphasises an east-west pattern for local streets. Street alignment allows for predominantly north-south facing terrace-type buildings which both face the street and overlook the green courtyard formed by the perimeter buildings.
The results of the ESD assessment of the typologies conclude that the design will achieve good outcomes through:
* Lower than average construction costs compared with other similar projects.
* Thermal comfort: a good balance between daylight access and solar gains is achieved.
* An energy rating of 8.5 stars for the units.
Abstract:
Sourcing funding for the provision of new urban infrastructure has been a policy dilemma for governments around the world for decades. This is particularly relevant in high-growth areas where new services are required to support swelling populations. Existing communities resist the introduction of new taxes to fund such infrastructure, and hence charges levied on developers have flourished. The Australian infrastructure funding policy dilemmas mirror, to some extent, those of the United Kingdom and, to a greater extent, those of the United States of America. In these countries, infrastructure cost recovery policies have been in place since the 1940s and 1970s respectively. There is an extensive body of theoretical and empirical literature that discusses the passing on (to home buyers) or passing back (to the englobo land seller) of these increased infrastructure charges, and the corresponding impact on housing cost and supply. The purpose of this research is to examine the international evidence that suggests infrastructure charges contribute to increased house prices as well as reduced land supply. The paper concludes that whilst the theoretical work is largely consistent, the empirical research to date is inconclusive and further research is required into these impacts in Australia.
Abstract:
Article in the Courier Mail, Friday 22 July 2011.
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view. In this latter case, models facilitate stakeholder communication and software system design. Research has investigated several proposals for measures of business process models, largely from a correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet, design decisions usually have to build on thresholds, which can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted by using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
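The core idea of deriving a cut-off from a ROC analysis can be sketched as follows. The abstract does not spell out the authors' exact adaptation of the ROC curves method, so this is a generic sketch using Youden's J statistic on hypothetical data (the measure name "model size" and the numbers are illustrative, not taken from the paper):

```python
# Sketch: deriving an error-prediction threshold for a structural measure
# (here, hypothetically, model size) by maximizing Youden's J statistic
# (sensitivity + specificity - 1) over candidate cut-offs.

def roc_threshold(values, has_error):
    """Return the cut-off on `values` that best separates error-prone
    models (has_error=True) from error-free ones."""
    pos = sum(has_error)                 # number of models with errors
    neg = len(has_error) - pos           # number of error-free models
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):        # each observed value is a candidate
        tp = sum(1 for v, e in zip(values, has_error) if v >= t and e)
        fp = sum(1 for v, e in zip(values, has_error) if v >= t and not e)
        j = (tp / pos) + (1 - fp / neg) - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t

# Toy data: model sizes and whether each model contained an error.
sizes = [12, 18, 25, 31, 40, 44, 52, 60, 71, 80]
errors = [False, False, False, False, True, False, True, True, True, True]
print(roc_threshold(sizes, errors))  # -> 40
```

A guideline derived from such a threshold would then read, e.g., "models larger than the cut-off warrant decomposition".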
Abstract:
Several authors stress the crucial role of data as a foundation for operational, tactical and strategic decisions (e.g., Redman 1998, Tee et al. 2007). Data provides the basis for decision making, as data collection and processing is typically associated with reducing uncertainty in order to make more effective decisions (Daft and Lengel 1986). While the first series of Information Systems/Information Technology (IS/IT) investments in organizations improved data collection, restricted computational capacity and limited processing power created challenges (Simon 1960). Fifty years on, capacity and processing problems are increasingly less relevant; in fact, the opposite problem exists. Determining data relevance and usefulness is complicated by increased data capture and storage capacity, as well as continual improvements in information processing capability. As the IT landscape changes, businesses are inundated with ever-increasing volumes of data from both internal and external sources, available on both an ad-hoc and real-time basis. More data, however, does not necessarily translate into more effective and efficient organizations, nor does it increase the likelihood of better or timelier decisions. This raises questions about what data managers require to assist their decision making processes.
Abstract:
[Quality Management in Construction Projects by Abdul Razzak Rumane, CRC Press, Boca Raton, FL, 2011, 434 pp, ISBN 9781439838716] Issues of quality management, quality control and performance against specification have long been the focus of various business sectors. Recently there has been an additional drive to achieve the continuous improvement and customer satisfaction promised by the 20th-century ‘gurus’ some six or seven decades ago. The engineering and construction industries have generally taken somewhat longer than their counterparts in the manufacturing, service and production sectors to achieve these espoused levels of quality. The construction and engineering sectors stand to realize major rewards from better managing quality in projects. More effort is being put into instructing future participants in the industry as well as assisting existing professionals. This book comes at an opportune time.
Abstract:
The decision in QCOAL Pty Ltd v Cliffs Australia Coal Pty Ltd [2010] QSC 479 involved an examination of a number of issues relating to the assessment of costs under the Legal Profession Act 2007 (Qld). The decision highlights a range of issues which, in slightly different circumstances, may have deprived the successful party of the right to recover costs by reference to the costs agreement.
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it will be advantageous to have the users' answers ranked according to their quality. This paper proposes a novel approach to evaluating and ranking the users' answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its answer author's reputation level in the network. The proposed approach is evaluated on a dataset collected from a live cQA, namely, Yahoo! Answers. To compare against the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content and content-based scores were also used in comparing results. Empirical analysis shows that the proposed method is able to rank the users' answers and recommend the top-n answers with good accuracy. Results of the proposed method outperform the content-based methods, various combinations, and the results obtained by the popular link analysis method, HITS.
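The reputation-based ranking described above can be sketched in a few lines. The abstract does not give the paper's actual reputation formula, so this sketch assumes a simple one: an author's reputation is the fraction of their past answers that were selected as "best answer" (the names `history`, `rank_answers` and the toy data are illustrative):

```python
# Sketch: rank candidate answers by their author's reputation, where
# reputation = fraction of the author's past answers chosen as "best answer".
# This is an assumed formula, not the one from the paper.

from collections import defaultdict

def reputation_scores(history):
    """history: list of (author, was_best_answer) pairs from past activity."""
    totals, bests = defaultdict(int), defaultdict(int)
    for author, was_best in history:
        totals[author] += 1
        bests[author] += was_best
    return {a: bests[a] / totals[a] for a in totals}

def rank_answers(answers, history, n=3):
    """Return the top-n (author, answer_text) pairs, highest reputation first."""
    rep = reputation_scores(history)
    return sorted(answers, key=lambda a: rep.get(a[0], 0.0), reverse=True)[:n]

history = [("alice", True), ("alice", True), ("alice", False),
           ("bob", False), ("bob", True), ("carol", False)]
answers = [("bob", "try X"), ("carol", "try Y"), ("alice", "try Z")]
print(rank_answers(answers, history, n=2))
# -> [('alice', 'try Z'), ('bob', 'try X')]
```

Content-based methods would instead (or additionally) score the answer text itself; the paper combines both kinds of score.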
Abstract:
This article examines the decision in Turner v Mitchells Solicitors [2011] QDC 61 and the issue of whether an application for assessment of costs under an interim bill at the time of a final bill is subject to the usual 12-month restriction.
Abstract:
This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was undertaken based on selected urban catchments on the Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three different types, namely, high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This provides an innovative approach compared to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics.
Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much less than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for the generation of a major fraction of the annual pollutants load compared to the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered for treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutants load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in the context of cost-effectiveness, efficiency in treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, other than conventional catchment characteristics such as land use and impervious area percentage, other catchment characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage, would be inadequate. It was also noted that small uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volume, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach.
The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use had relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses. However, commercial and residential land uses had relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities associated with the fine and coarse particle size fractions, even within the same land use, and hence the variations in stormwater quality in relation to pollutants adsorbing to different sizes of particles.
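The three-way rainfall event classification described in this abstract can be sketched as a simple rule over average intensity and duration. The thesis derives its classification from the Gold Coast catchment data; the numeric cut-offs below (`HIGH_I`, `LONG_D`) are purely illustrative assumptions, not values from the study:

```python
# Sketch: classifying rainfall events into the three types named above.
# HIGH_I and LONG_D are hypothetical thresholds; the actual boundaries
# would come from the catchment data analysis.

HIGH_I = 20.0  # mm/h, assumed cut-off for "high average intensity"
LONG_D = 2.0   # hours, assumed cut-off for "long duration"

def classify_event(avg_intensity_mm_h, duration_h):
    high_i = avg_intensity_mm_h >= HIGH_I
    long_d = duration_h >= LONG_D
    if high_i and not long_d:
        return "Type 1: high average intensity, short duration"
    if high_i and long_d:
        return "Type 2: high average intensity, long duration"
    if not high_i and long_d:
        return "Type 3: low average intensity, long duration"
    return "unclassified: low average intensity, short duration"

print(classify_event(35.0, 0.5))  # a Type 1 event
```

Under this scheme, the finding above is that Type 1 events cumulatively generate the major fraction of the annual pollutants load.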
Abstract:
Background: Trauma resulting from traffic crashes poses a significant problem in highly motorised countries. Over a million people worldwide are killed annually and 50 million are critically injured as a result of traffic collisions. In Australia, road crashes cost an average of $17 billion annually in personal loss of income and quality of life, organisational losses in productivity and workplace quality, and health care costs. Driver aggression has been identified as a key factor contributing to crashes, and many motorists report experiencing mild forms of aggression (e.g., rude gestures, horn honking). However, despite this concern, driver aggression has received relatively little attention in empirical research, and existing research has been hampered by a number of methodological and conceptual shortcomings. Specifically, there has been substantial disagreement regarding what constitutes aggressive driving and a failure to examine both the situational factors and the emotional and cognitive processes underlying driver aggression. To enhance current understanding of aggressive driving, a model of driver aggression that highlights the cognitive and emotional processes at play in aggressive driving incidents is proposed. Aims: The research aims to improve current understanding of the complex nature of driver aggression by testing and refining a model of aggressive driving that incorporates the person-related and situational factors and the cognitive and emotional appraisal processes fundamental to driver aggression. In doing so, the research will help to provide a clear definition of what constitutes aggressive driving, help to identify on-road incidents that trigger driver aggression, and identify the emotional and cognitive appraisal processes that underlie driver aggression. Methods: The research involves three studies.
Firstly, to contextualise the model and explore the cognitive and emotional aspects of driver aggression, a diary-based study using self-reports of aggressive driving events will be conducted with a general population of drivers. This data will be supplemented by in-depth follow-up interviews with a sub-sample of participants. Secondly, to test the generalisability of the model, a large sample of drivers will be asked to respond to video-based scenarios depicting driving contexts derived from incidents identified in Study 1 as inciting aggression. Finally, to further operationalise and test the model, an advanced driving simulator will be used with a sample of drivers. These drivers will be exposed to various driving scenarios that would be expected to trigger negative emotional responses. Results: Work on the project has commenced and progress on the first study will be reported.
Abstract:
Prevailing video adaptation solutions change the quality of the video uniformly throughout the whole frame in the bitrate adjustment process, while region-of-interest (ROI)-based solutions selectively retain the quality in the areas of the frame to which viewers are more likely to pay attention. ROI-based coding can improve perceptual quality and viewer satisfaction while trading off some bandwidth. However, there has so far been no comprehensive study measuring the bitrate vs. perceptual quality trade-off. The paper proposes an ROI detection scheme for videos, characterized by low computational complexity and robustness, and measures the bitrate vs. quality trade-off for ROI-based encoding using a state-of-the-art H.264/AVC encoder to justify the viability of this type of encoding method. The results from the subjective quality test reveal that ROI-based encoding achieves a significant perceptual quality improvement over encoding with uniform quality at the cost of slightly more bits. Based on the bitrate measurements and subjective quality assessments, bitrate and perceptual quality estimation models for non-scalable ROI-based video coding (AVC) are developed, which are found to be similar to the models for scalable video coding (SVC).
Abstract:
Zero energy buildings (ZEB) and zero energy homes (ZEH) are currently a hot topic globally for policy makers (what are the benefits and costs), designers (how do we design them), the construction industry (can we build them), marketing (will consumers buy them) and researchers (do they work and what are the implications). This paper presents initial findings from actual measured data from a 9-star (as built), off-ground detached family home constructed in south-east Queensland in 2008. The integrated systems approach to the design of the house is analysed in terms of each of its three main goals: maximising the thermal performance of the building envelope, minimising energy demand whilst maintaining energy service levels, and implementing a multi-pronged low carbon approach to energy supply. The performance outcomes of each of these stages are evaluated against definitions of Net Zero Carbon / Net Zero Emissions (Site and Source) and Net Zero Energy (onsite generation vs primary energy imports). The paper concludes with a summary of the multiple benefits of combining very high efficiency building envelopes with diverse energy management strategies: a robustness, resilience, affordability and autonomy not generally seen in housing.