906 results for Benefit analysis


Relevance: 20.00%

Abstract:

This report summarises a project designed to enhance commercial real estate performance, in both operational and investment contexts, through the development of a model that supports improved decision-making. The model is based on a risk-adjusted discounted cash flow and provides a toolkit with which building managers, owners and potential investors can evaluate individual building performance against financial, social and environmental criteria over the complete life-cycle of the asset. This ‘triple bottom line’ approach to the evaluation of commercial property is particularly significant for the administrators of public property portfolios. It also has wider application across the real estate industry, since the advent of ‘green’ construction requires new methods for evaluating both new and existing building stock. The research is unique in its focus on the accuracy of the input variables required by the model. These key variables were largely determined through market-based research and an extensive literature review, and were fine-tuned through extensive testing. In essence, the project applied probability-based risk analysis techniques that required market-based assessment. The projections listed in the partner engineers' building audit reports for the four case-study buildings were fed into the property evaluation model developed by the research team. The results are strongly consistent with existing, less robust evaluation techniques. Importantly, the model pioneers an approach that takes full account of the triple bottom line, establishing a benchmark for related research to follow. The project's industry partners expressed a high degree of satisfaction with the project outcomes at a recent demonstration seminar.
The project in its existing form has not been geared towards commercial applications, but it is anticipated that QDPW and other industry partners will benefit greatly from using this tool for the performance evaluation of property assets. The project met the objectives of the original proposal as well as all the specified milestones, and was completed within budget and on time. It achieved its objective by establishing research foci on the model structure, the identification of key input variables, the drivers of the relevant property markets and the determinants of the key variables (Research Engine no. 1); the examination of risk measurement and the incorporation of risk simulation exercises (Research Engine no. 2); and the importance of environmental and social factors and the impact of triple bottom line measures on the asset (Research Engine no. 3).
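The risk-adjusted discounted cash flow at the heart of the model can be sketched as a Monte Carlo exercise: draw uncertain annual cash flows from an assumed distribution, discount them, and inspect the resulting NPV distribution. The sketch below is illustrative only; the distribution shape, parameters and discount rate are invented, not taken from the report.

```python
import random

def npv(cash_flows, discount_rate):
    """Net present value of a series of end-of-year cash flows."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))

def simulate_npv(base_cash_flow, years, discount_rate, volatility, trials=10_000, seed=42):
    """Risk-adjusted DCF sketch: each year's net cash flow is drawn from a
    normal distribution around the base estimate (an illustrative assumption,
    not the report's calibration)."""
    rng = random.Random(seed)
    return [
        npv([rng.gauss(base_cash_flow, volatility) for _ in range(years)], discount_rate)
        for _ in range(trials)
    ]

results = simulate_npv(base_cash_flow=100_000, years=20, discount_rate=0.08, volatility=15_000)
mean_npv = sum(results) / len(results)
```

The spread of `results`, rather than the single mean, is what a risk-adjusted evaluation reports; social and environmental criteria would enter as further adjustments to the cash flows.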

Abstract:

The report presents a methodology for whole-of-life-cycle cost analysis of alternative treatment options for bridge structures that require rehabilitation. The methodology was developed after a review of current methods established that a life-cycle analysis based on a probabilistic risk approach has many advantages, including the essential ability to consider the variability of input parameters. The input parameters for the analysis are identified as initial cost; maintenance, monitoring and repair cost; user cost; and failure cost. The methodology uses Monte Carlo simulation to combine a number of probability distributions and so establish the distribution of whole-of-life-cycle cost. Performing the simulation requires a powerful software package that can work with a spreadsheet program; after several products on the market were explored, the @RISK software was selected. In conclusion, the report presents a typical decision-making scenario considering two alternative treatment options.
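The simulation step, combining a probability distribution for each cost component to obtain the distribution of whole-of-life-cycle cost, can be sketched in plain Python rather than @RISK. All distribution shapes and parameters below are invented for illustration; they are not the report's figures.

```python
import random

def simulate_life_cycle_cost(trials=20_000, seed=1):
    """Draw each cost component from its own distribution and sum them,
    yielding the whole-of-life-cycle cost distribution (all figures in $k;
    shapes and parameters are illustrative assumptions)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        initial = rng.triangular(900, 1_300, 1_000)       # low, high, mode
        maintenance = rng.gauss(300, 50)                   # present value of upkeep
        user_cost = rng.lognormvariate(5.0, 0.4)           # right-skewed user delay cost
        failure = rng.gauss(40, 10) if rng.random() < 0.05 else 0.0  # rare failure event
        totals.append(initial + maintenance + user_cost + failure)
    return totals

costs = simulate_life_cycle_cost()
mean_cost = sum(costs) / len(costs)
p90 = sorted(costs)[int(0.9 * len(costs))]
```

Running the same simulation with a second set of component distributions gives the two-option decision scenario the report concludes with: compare the two cost distributions, not just their means.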

Abstract:

The resolution of insurance disputes can focus solely on quantum. Where insurers adopt integrative solutions, they can enjoy cost savings and higher customer satisfaction, and an integratively managed process can expand the negotiation options. The potential for plaintiffs' emotions to resolve matters on an emotional basis, rather than an economic one, is explored. Drawing on research, the author demonstrates that mediations are more likely to produce integrative outcomes than unmediated conferences. Using a combination of governmental reports, published studies and academic publications, the paper shows that mediation is more likely to foster an environment in which the parties communicate and cooperate, and that mediators can reduce hostilities in circumstances where negotiating parties alone would likely fail. Overall, the paper constructs an argument in support of the proposition that mediation offers insurers an effective mechanism to reduce costs and increase customer satisfaction.
INTRODUCTION
Mediation can offer insurers an effective mechanism to reduce costs and increase customer satisfaction. This paper first demonstrates the differences between distributive and integrative outcomes, arguing that insurers' interests are far better served by obtaining an integrative solution. It then explains how a mediator can assist both parties to reach an integrative outcome, while exploring the extreme difficulty conference participants face in obtaining such an outcome without a mediator in an adversarial climate. The mediator's ability to facilitate integrative information exchange, defuse hostilities and reality-check expectations is discussed and compared with the inability of conference participants to achieve similar results.
This paper concludes that the potential financial benefit offered by integrative solutions, combined with the ability of mediation to deliver such outcomes where unmediated conferences cannot, leads to the recommendation that insurers opt for mediation to best serve their commercial interests.

Abstract:

In architectural design and the construction industry, there is insufficient evidence about the way designers collaborate in their normal working environments using both traditional and digital media. It is this gap in empirical evidence that the CRC project “Team Collaboration in High Bandwidth Virtual Environments” addresses. The project is primarily, but not exclusively, concerned with the conceptual stages of design carried out by professional designers working in different offices. The aim is to increase opportunities for communication and interaction between people in geographically distant locations in order to improve the quality of collaboration. To understand the practical implications of introducing new digital tools into working practices, research into how designers work collaboratively using both traditional and digital media is being undertaken. This will involve a series of empirical studies in the workplaces of the project's industry partners. The studies of collaboration processes will provide empirical results leading to more effective use of virtual environments in design and construction processes. The report describes the research approach, the industry study, the methods for data collection and analysis, and the foundation research methodologies. A distinctive aspect is that the research has been devised so that field studies can be undertaken in a live industrial environment, where the participant designers carry out real projects alongside their colleagues and in familiar locations. There are two basic research objectives: the first is to obtain evidence about design practice that will inform the architecture and construction industries about the impact and potential benefit of using digital collaboration technologies; the second is to add to long-term research knowledge of human cognitive and behavioural processes based on real-world data.
To achieve this, the research methods must be able to acquire a rich and heterogeneous set of data from design activities as they are carried out in the normal working environment. This places different demands upon the data collection and analysis methods from those of laboratory studies, where controlled conditions are required. To address this, the research approach adopted is ethnographic in nature and case-study based. The plan is to carry out a series of in-depth studies in order to provide baseline results for future research across a wider community of user groups. An important objective has been to develop a methodology that will produce valid, significant and transferable results. The research will contribute to knowledge about how architectural design and the construction industry may benefit from the introduction of leading-edge collaboration technologies. The outcomes will provide a sound foundation for the production of guidelines for the assessment of high-bandwidth tools and their future deployment. The knowledge will form the basis for the specification of future collaboration products and processes. This project directly addresses the industry-identified focus on cultural change, image, e-project management and innovative methods.

Abstract:

This project, as part of a broader Sustainable Sub-divisions research agenda, addresses the role of natural ventilation in reducing the energy required to cool dwellings.

Abstract:

In the previous phase of this project, 2002-059-B Case-Based Reasoning in Construction and Infrastructure Projects, demonstration software was developed that used a case-based reasoning engine to access a number of sources of information on the lifetime of metallic building components. One source was data from the Queensland Department of Public Housing relating to maintenance operations over a number of years. Maintenance information is a particularly useful source of data about the service life of building components because it reflects the actual performance of materials in the working environment. If a building was constructed in 1984 and the maintenance records indicate that the guttering was replaced in 2006, then the service life of the gutters in that environment was 22 years. This phase of the project aims to look more deeply at the Department of Housing data, as an example of maintenance records, and to formulate methods for using such data to inform knowledge of service lifetimes.
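The gutter example generalises directly: pairing a component's construction (or previous replacement) year with its replacement year in the maintenance records yields one observed service life, and grouping these observations by component type gives an empirical lifetime estimate. A minimal sketch, using the report's gutter example alongside invented records:

```python
def service_lives(records):
    """records: (component, construction_or_last_replacement_year, replacement_year).
    Returns the mean observed service life, in years, per component type."""
    lives = {}
    for component, built, replaced in records:
        lives.setdefault(component, []).append(replaced - built)
    return {component: sum(v) / len(v) for component, v in lives.items()}

# The report's gutter example (built 1984, replaced 2006) plus invented records:
records = [
    ("guttering", 1984, 2006),
    ("guttering", 1978, 1999),
    ("roof sheeting", 1984, 2010),
]
mean_lives = service_lives(records)
# mean_lives["guttering"] is 21.5; mean_lives["roof sheeting"] is 26.0
```

Real maintenance data would need cleaning first (partial replacements, missing construction dates, right-censored components still in service), which is precisely the formulation work this phase addresses.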

Abstract:

Realistic estimates of short- and long-term (strategic) budgets for the maintenance and rehabilitation of road assets should consider the stochastic characteristics of asset conditions across the road network, so that the overall variability of road asset condition data is taken into account. Probability theory has been used to assess life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these assets. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the literature review that there is very limited information on methodologies that use the stochastic characteristics of asset condition data to assess budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap, this report describes and summarises the methodologies presented in each publication and suggests a methodology for the current research project, funded under the Cooperative Research Centre for Construction Innovation (CRC CI) project no. 2003-029-C.

Abstract:

Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainty and variability in road asset conditions and in the characteristics of road users. The CRC CI research project 2003-029-C ‘Maintenance Cost Prediction for Road’ developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation, based on probability-based reliability theory and statistical methods. The next stage of the project is to apply the method to predict maintenance/rehabilitation budgets/costs for large networks for strategic investment, and the first task is to assess the variability of road data. This report presents initial results of that analysis. A case study for dry non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In the analysis, large road networks were grouped into categories with common characteristics according to soil and climatic conditions, pavement condition, pavement type, surface type and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and statistical information obtained will be used to assess the variation and reliability in budget/cost estimates at a later stage. Conventionally, the mean values of the asset data in each category are used as inputs for investment analysis, so the variability of the asset data within each category is not taken into account. The analysis demonstrates that the method can be applied in practice to take that variability into account when analysing large road networks for maintenance/rehabilitation investment.
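The categorisation step can be sketched as grouping road segments by their shared characteristics and computing the mean and standard deviation of a condition measure per category. The field names and values below are invented for illustration; the project's actual categories also cover climatic conditions, surface type and annual average daily traffic.

```python
from statistics import mean, stdev

def categorise(segments):
    """Group road segments by (soil, pavement type) and return the mean and
    sample standard deviation of the roughness condition index per category.
    Categories with a single segment are dropped (no spread to estimate)."""
    groups = {}
    for seg in segments:
        key = (seg["soil"], seg["pavement"])
        groups.setdefault(key, []).append(seg["roughness"])
    return {k: (mean(v), stdev(v)) for k, v in groups.items() if len(v) > 1}

segments = [  # invented condition records
    {"soil": "dry non-reactive", "pavement": "granular", "roughness": 2.1},
    {"soil": "dry non-reactive", "pavement": "granular", "roughness": 2.5},
    {"soil": "dry non-reactive", "pavement": "granular", "roughness": 1.9},
    {"soil": "wet reactive", "pavement": "granular", "roughness": 3.0},
    {"soil": "wet reactive", "pavement": "granular", "roughness": 3.4},
]
stats = categorise(segments)
```

The per-category mean and spread, rather than a single network-wide mean, are what feed the reliability analysis at the later stage.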

Abstract:

Enhancing children's self-concepts is widely accepted as a critical educational outcome of schooling and is postulated as a mediating variable that facilitates the attainment of other desired outcomes such as improved academic achievement. Despite considerable advances in self-concept research, there has been limited progress in devising teacher-administered enhancement interventions. This is unfortunate as teachers are crucial change agents during important developmental periods when self-concept is formed. The primary aim of the present investigation is to build on the promising features of previous self-concept enhancement studies by: (a) combining two exciting research directions developed by Burnett and Craven to develop a potentially powerful cognitive-based intervention; (b) incorporating recent developments in theory and measurement to ensure that the multidimensionality of self-concept is accounted for in the research design; (c) fully investigating the effects of a potentially strong cognitive intervention on reading, mathematics, school and learning self-concepts by using a large sample size and a sophisticated research design; (d) evaluating the effects of the intervention on affective and cognitive subcomponents of reading, mathematics, school and learning self-concepts over time to test for differential effects of the intervention; (e) modifying and extending current procedures to maximise the successful implementation of a teacher-mediated intervention in a naturalistic setting by incorporating sophisticated teacher training as suggested by Hattie (1992) and including an assessment of the efficacy of implementation; and (f) examining the durability of effects associated with the intervention.

Abstract:

In this paper, the stability of an autonomous microgrid with multiple distributed generators (DGs) is studied through eigenvalue analysis. It is assumed that all the DGs are connected through Voltage Source Converters (VSCs) and that all connected loads are passive. The VSCs are controlled by state feedback controllers to achieve the desired voltage and current outputs, which are decided by a droop controller. The state-space models of each of the converters with its associated feedback are derived. These are then connected with the state-space models of the droop controller, network and loads to form a homogeneous model, through which the eigenvalues are evaluated. The system stability is then investigated as a function of the droop controller's real and reactive power coefficients. These observations are then verified through simulation studies using PSCAD/EMTDC. It will be shown that the simulation results closely agree with the stability behavior predicted by the eigenvalue analysis.
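The stability test applied to the homogeneous model, checking that every eigenvalue of the combined state matrix has a negative real part, can be illustrated on a toy two-state system. The matrix below is invented for illustration (the paper's model combines converter, droop, network and load states and is far larger); only the criterion itself carries over.

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic polynomial
    s^2 - (a + d)s + (ad - bc)."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def is_stable(matrix):
    """A linear system is small-signal stable iff every eigenvalue of its
    state matrix has a negative real part."""
    return all(ev.real < 0 for ev in eigenvalues_2x2(*matrix))

def droop_matrix(m, omega_c=30.0, k=1.0):
    """Toy two-state matrix in which m stands in for a droop gain
    (an illustrative construction, not the paper's model)."""
    return (-omega_c, -omega_c * m * k, 1.0, 0.0)

stable_gain = is_stable(droop_matrix(m=0.5))     # both eigenvalues in the left half-plane
unstable_gain = is_stable(droop_matrix(m=-0.5))  # sign flip pushes one eigenvalue positive
```

Sweeping `m` and recording where `is_stable` flips is, in miniature, the droop-coefficient study the paper performs on the full model.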

Abstract:

In the past decade, the use of ambulance data to inform the prevalence of nonfatal heroin overdose has increased. These data can assist public health policymakers, law enforcement agencies and health providers in planning and allocating resources. This study examined the 672 ambulance attendances at nonfatal heroin overdoses in Queensland, Australia, in 2000. The gender distribution showed a typical 70/30 male-to-female ratio. Equal numbers of persons with nonfatal heroin overdose were aged 15–24 and 25–34 years. Police were present in only 1 in 6 cases, and 28.1% of patients reported using drugs alone. Ambulance data are proving to be a valuable population-based resource for describing the incidence and characteristics of nonfatal heroin overdose episodes. Future studies could focus on the differences between nonfatal and fatal heroin overdose samples.

Abstract:

Community awareness and perception of traffic-noise-related health impacts have increased significantly over the last decade, resulting in a large volume of public inquiries to road authorities for planning advice. Traffic noise management in the urban environment is therefore becoming a “social obligation”, essentially due to noise-related health concerns. Although various aspects of urban noise pollution and mitigation have been researched independently, an integrated approach by stakeholders has not been attempted. While current treatment and mitigation strategies are predominantly handled by road agencies, a concerted effort by all stakeholders is becoming essential for effective and tangible outcomes in the future. A research project is underway at RMIT University, Australia, led by the second author, to consider the use of “hedonic pricing” for alternative noise amelioration treatments within and outside the road reserve. The project aims to foster a full-range noise abatement strategy encompassing the source, the path and the receiver. The benefit of such a study would be to mitigate the problem where treatment is most effective, cutting across traditional “authority” boundaries to produce the optimum outcome. The project is conducted in collaboration with the Department of Main Roads, Queensland, Australia, and funded by the CRC for Construction Innovation. As part of this study, a comprehensive literature search is currently underway to investigate advances in community health research related to environmental noise pollution, and in technical and engineering research on mitigating the issue. This paper presents the outcomes of this work, outlining the state of the art, national and international good practice, and a gap analysis to identify major anomalies and developments.
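Hedonic pricing infers the implicit price of a non-market attribute, here traffic noise, from its statistical effect on property prices. A minimal sketch is a one-variable least-squares regression of sale price on noise exposure; the data points below are invented for illustration, and a real study would control for many other property attributes.

```python
def ols(x, y):
    """One-variable least-squares fit: y = intercept + slope * x.
    In a hedonic pricing study the slope estimates the implicit price of
    one extra dB of traffic noise."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

noise_db = [55, 60, 65, 70, 75]       # hypothetical facade noise exposure
price_k = [520, 500, 475, 455, 430]   # hypothetical sale prices, $k
intercept, slope = ols(noise_db, price_k)
# slope is -4.5: each extra dB is associated with a $4.5k lower price here
```

That implicit price per dB is what lets alternative treatments, inside and outside the road reserve, be compared on a common monetary scale.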

Abstract:

LEX is a stream cipher that progressed to Phase 3 of the eSTREAM stream cipher project. In this paper, we show that the security of LEX against algebraic attacks relies on a small equation system not being solvable faster than exhaustive search. We use the byte leakage in LEX to construct a system of 21 equations in 17 variables. This is very close to the requirement for an efficient attack, i.e. a system containing 16 variables. The system requires only 36 bytes of keystream, which is very low.
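The point of comparison, solving the equation system no faster than exhaustive search, can be illustrated with a toy system over GF(2): try every assignment of the variables and keep those satisfying all equations, at a cost of 2^n trials for n variables. The system below is a 3-equation, 2-variable toy of our own; the actual LEX system has 21 equations in 17 byte variables, far beyond this brute-force sketch.

```python
from itertools import product

def solve_gf2_by_search(equations, n_vars):
    """Exhaustive search over all 2**n_vars assignments of GF(2) variables,
    returning every assignment that satisfies all equations."""
    return [bits for bits in product((0, 1), repeat=n_vars)
            if all(eq(bits) for eq in equations)]

# Toy system in variables x0, x1 (all arithmetic mod 2):
equations = [
    lambda v: (v[0] ^ v[1]) == 1,  # x0 + x1 = 1
    lambda v: (v[0] & v[1]) == 0,  # x0 * x1 = 0
    lambda v: v[0] == 1,           # x0 = 1
]
sols = solve_gf2_by_search(equations, n_vars=2)
# sols is [(1, 0)]
```

An algebraic attack is "efficient" exactly when the structure of the system lets it be solved with substantially fewer than the 2^n trials this sketch spends.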