922 results for Asset Management, Decision, Taxonomy, Context Analysis


Relevance: 100.00%

Abstract:

Digital data sets constitute rich sources of information, which can be extracted and evaluated using computational tools, such as those for Information Visualization. Web-based applications, such as social network environments, forums, and virtual environments for Distance Learning, are good examples of such sources. The large volume of data has a direct impact on processing and analysis tasks. This paper presents the computational tool Mapper, defined and implemented to use visual representations - maps, graphics, and diagrams - to support the decision-making process by analyzing data stored in the Virtual Learning Environment TelEduc-Unesp. © 2012 IEEE.
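
The abstract does not describe Mapper's internals. As a minimal, hypothetical sketch of the general idea - turning interaction logs from a virtual learning environment into a visual map that supports decisions - the following Python snippet builds a reply graph with the networkx library; the data and student names are invented and this is not the Mapper tool itself.

```python
# Minimal illustration (not the Mapper tool itself): turn logged forum replies
# from a virtual learning environment into a graph so an instructor can see
# at a glance which students interact and who is isolated.
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical interaction log: (author, replied_to) pairs.
replies = [("Ana", "Bruno"), ("Bruno", "Ana"), ("Carla", "Ana"),
           ("Carla", "Bruno"), ("Diego", "Carla")]

G = nx.DiGraph()
G.add_edges_from(replies)

# Node size proportional to how often a student is replied to (in-degree).
sizes = [300 + 600 * G.in_degree(n) for n in G.nodes()]
pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, with_labels=True, node_size=sizes, node_color="lightblue")
plt.savefig("interaction_map.png")
```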

Relevance: 100.00%

Abstract:

During a project, managers encounter numerous contingencies and face the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and their processes are irreversible. Therefore, it is critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot easily be applied to projects with different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies to manage a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, to manage a project at the planning stage. The developed methodology also lays the foundation for an algorithm for continuously and automatically generating satisfactory schedules and strategies throughout the construction life of a project. Rather than studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA, developed previously, has an emulator that duplicates the construction process and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
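
As a toy sketch of the core idea - comparing decision strategies by emulating a schedule under random disruptions - the following Python snippet runs two hypothetical resource-allocation strategies and compares mean project durations. The activity list, disruption probability, and delay values are invented assumptions; this is not ICDMA or the dissertation's framework.

```python
# Illustrative sketch only (not ICDMA): compare two hypothetical decision
# strategies -- sequences of resource-allocation choices -- by simulating a
# simple schedule under random disruptions and measuring project duration.
import random

ACTIVITIES = [("foundation", 5), ("framing", 8), ("roofing", 4)]  # (name, base days)

def simulate(strategy, seed):
    """Run one emulation of the schedule; `strategy` maps a disruption to a decision."""
    rng = random.Random(seed)
    day = 0
    for name, base in ACTIVITIES:
        duration = base
        if rng.random() < 0.3:                 # random disruptive event
            decision = strategy(name)
            if decision == "add_labor":
                duration += 1                  # disruption mostly absorbed
            else:                              # "wait"
                duration += 3                  # delay cascades downstream
        day += duration
    return day

aggressive = lambda activity: "add_labor"      # always commit extra crews
conservative = lambda activity: "wait"         # never re-allocate resources

for label, strategy in [("aggressive", aggressive), ("conservative", conservative)]:
    runs = [simulate(strategy, seed) for seed in range(1000)]
    print(label, "mean duration:", sum(runs) / len(runs))
```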

Relevance: 100.00%

Abstract:

Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO's Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU's DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to the changing, complex ecological and socio-economic causes of land degradation, and the WOCAT tools are designed to reflect and capture this capacity. To take account of new challenges and meet the emerging needs of WOCAT users, the tools are continually being further developed and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to the UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national levels, for example advisory services and implementation projects. Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.

Relevance: 100.00%

Abstract:

According to the PMBOK (Project Management Body of Knowledge), project management is “the application of knowledge, skills, tools, and techniques to project activities to meet the project requirements” [1]. Project Management has proven to be one of the most important disciplines in determining the success of any project [2][3][4]. Given that many of the activities covered by this discipline can be said to be “horizontal” across domains, the importance of knowing its concepts and practices becomes even more obvious. Projects in the domain of Software Engineering are no exception to the great influence of Project Management on their success. The critical role that this discipline plays in industry has been quantified. A report by McKinsey & Co [4] shows that establishing programs for teaching critical project management skills can improve project performance in time and cost. As an example, the report states: “One defense organization used these programs to train several waves of project managers and leaders who together administered a portfolio of more than 1,000 capital projects ranging in size from $100,000 to $500 million. Managers who successfully completed the training were able to cut costs on most projects by between 20 and 35 percent. Over time, the organization expects savings of about 15 percent of its entire baseline spending”. In a white paper by the PMI (Project Management Institute) about the value of project management [5], it is stated that: “Leading organizations across sectors and geographic borders have been steadily embracing project management as a way to control spending and improve project results”. According to the research conducted by the PMI for the paper, after the economic crisis “Executives discovered that adhering to project management methods and strategies reduced risks, cut costs and improved success rates—all vital to surviving the economic crisis”. In every leading company, proper execution of the project management discipline has become a must. Several members of the software industry have put effort into finding ways of assuring high-quality results from projects; many standards, best practices, methodologies, and other resources have been produced by experts from different fields of expertise. In both industry and the academic community, there is continuous research on how to better teach software engineering together with project management [4][6]. For the general practices of Project Management, the PMI produced a guide to the knowledge that any project manager should have in their toolbox to lead any kind of project; this guide is the PMBOK. On the side of best practices and required knowledge for the Software Engineering discipline, the IEEE (Institute of Electrical and Electronics Engineers) developed the SWEBOK (Software Engineering Body of Knowledge) in collaboration with software industry experts and academic researchers, introducing into the guide much of the knowledge needed by a software engineer with five years of experience [7]. The SWEBOK also covers management from the perspective of a software project. This thesis is developed to provide guidance to practitioners and members of the academic community about project management applied to software engineering.
The approach used in this thesis to obtain useful information for practitioners is to take an industry-approved guide for software engineering professionals, the SWEBOK, and compare its content to what is found in the PMBOK. After comparing the contents of the SWEBOK and the PMBOK, what is found to be missing from the SWEBOK is used to give recommendations on how to enrich the project management skills of a software engineering professional. Recommendations for members of the academic community, on the other hand, are given taking into account the GSwE2009 (Graduate Software Engineering 2009) standard [8]. GSwE2009 is often used as a main reference for software engineering master's programs [9]. The standard is mostly based on the content of the SWEBOK, plus some contents that are considered to reinforce the education of software engineers. Given the similarities between the SWEBOK and the GSwE2009, the results of comparing the SWEBOK and the PMBOK are also considered valid for enriching what the GSwE2009 proposes. In the end, the recommendations for practitioners are therefore also useful for the academic community and its strategies for teaching project management in the context of software engineering.

Relevance: 100.00%

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 100.00%

Abstract:

This study demonstrates a quantitative approach to construction risk management using the analytic hierarchy process and decision tree analysis. All the risk factors are identified, their effects are quantified by determining probability and severity, and various alternative responses, with their cost implications, are generated for mitigating the quantified risks. The expected monetary values are then derived for each alternative in a decision tree framework, and subsequent probability analysis aids the decision process in managing risks. The entire methodology is explained through a case application of a cross-country petroleum pipeline project in India, and its effectiveness in project management is demonstrated.
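
The abstract does not give the case figures. As a minimal sketch of the expected-monetary-value (EMV) step it describes, the Python snippet below compares hypothetical response alternatives, each with an invented mitigation cost and invented residual-risk outcomes; it is an illustration of the technique, not the study's decision tree.

```python
# Hypothetical illustration of the expected-monetary-value (EMV) step:
# each response alternative has a mitigation cost and a set of residual-risk
# outcomes (probability, impact); the decision tree favours the lowest EMV.
alternatives = {
    "reroute_pipeline": {"cost": 2.0e6, "outcomes": [(0.1, 5.0e6), (0.9, 0.0)]},
    "extra_inspection": {"cost": 0.5e6, "outcomes": [(0.3, 5.0e6), (0.7, 0.0)]},
    "accept_risk":      {"cost": 0.0,   "outcomes": [(0.5, 5.0e6), (0.5, 0.0)]},
}

def emv(alt):
    # EMV = mitigation cost + sum(probability * impact) over residual outcomes
    return alt["cost"] + sum(p * impact for p, impact in alt["outcomes"])

best = min(alternatives, key=lambda name: emv(alternatives[name]))
for name, alt in alternatives.items():
    print(f"{name}: EMV = {emv(alt):,.0f}")
print("preferred alternative:", best)
```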

Relevance: 100.00%

Abstract:

It is widely accepted that the Thatcher years and their immediate aftermath were associated with substantive social and organizational change. The privatisation programme, 'the rolling back of the State', prosecuted by the successive Conservative Governments from 1979 to 1997, was a central pillar of Government policy. This thesis seeks to engage with privatisation through the case of CoastElectric, a newly privatised Regional Electricity Company. The thesis contributes to the extant understanding of the dynamics of organizational change in four major ways. Firstly, the study of CoastElectric addresses senior management decision making within the organization: in particular, it attempts to make sense of 'why' particular decisions were made. The theoretical backdrop to this concern draws on the concepts of normalization, cultural capital and corporate fashion. The argument presented in this thesis is that the decision-making broadly corresponded with what could be considered to be at the vanguard of managerialist thought. However, a detailed analysis suggested that at different junctures in CoastElectric's history there were differences in the approach to decision making that warranted further analysis. The most notable finding was that the relative level of new-managerialist cultural capital possessed by the decision-making elite had an important bearing upon whether a decision was formulated endogenously or exogenously, with the assistance of cultural intermediaries such as management consultants. The thesis demonstrates the importance of the broader discourse of new managerialism in shaping what is considered to be a 'commonsensical, rational' strategy. The second concern of this thesis is the process of organizational change. The study of CoastElectric attempts to provide a rich account of the dynamics of organizational change. This is realized, first, by examining the pre-existing context of the organization and, second, by analyzing the power politics of change interventions. The master concepts utilised in this endeavour are: dividing practices, the establishment of violent hierarchies between competing discourses, symbolic violence, critical turning points, recursiveness, creative destruction, legitimation strategies, and the reconstitution of subjects in the workplace.

Relevance: 100.00%

Abstract:

Purpose – The objective of this exploratory study is to investigate the “flow-through”, or relationship, between top-line measures of hotel operating performance (occupancy, average daily rate and revenue per available room) and bottom-line measures of profitability (gross operating profit and net operating income), before and during the recent Great Recession. Design/methodology/approach – This study uses data provided by PKF Hospitality Research for the period 2007-2009. A total of 714 hotels were analyzed, and various top-line and bottom-line profitability changes were computed using both absolute levels and percentages. Multiple regression analysis was used to examine the relationship between top-line and bottom-line measures and to derive flow-through ratios. Findings – The results show that average daily rate (ADR) and occupancy are significantly and positively related to gross operating profit per available room (GOPPAR) and net operating income per available room (NOIPAR). The evidence indicates that ADR, rather than occupancy, appears to be the stronger predictor and better measure of RevPAR growth and bottom-line profitability. The correlations and explained variances are also higher than those reported in prior research. Flow-through ratios range between 1.83 and 1.91 for NOIPAR, and between 1.55 and 1.65 for GOPPAR, across all chain-scales. Research limitations/implications – Limitations of this study include the limited number of years in the study period, the limited number of hotels in a competitive set, and the self-selection of hotels by the researchers. Practical implications – While ADR and occupancy work in combination to drive profitability, the authors' study shows that ADR is the stronger predictor of profitability. Hotel managers can use flow-through ratios to make financial forecasts, or use them as inputs in valuation models, to forecast future profitability. Originality/value – This paper extends prior research on the relationship between top-line measures and bottom-line profitability and serves to inform lodging owners, operators and asset managers about flow-through ratios, and how these ratios impact hotel profitability.
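
The paper derives flow-through ratios from regressions across 714 hotels. As a simpler, hypothetical illustration of what a flow-through ratio expresses, the snippet below computes the ratio of the percentage change in NOIPAR to the percentage change in RevPAR for a single invented hotel; the figures are assumptions, not the study's data.

```python
# Hypothetical flow-through calculation in the spirit of the paper: the ratio
# of the percentage change in a bottom-line measure (NOIPAR) to the percentage
# change in a top-line measure (RevPAR). All figures are invented.
def pct_change(new, old):
    return (new - old) / old

revpar_2007, revpar_2009 = 95.0, 76.0      # revenue per available room ($)
noipar_2007, noipar_2009 = 30.0, 19.5      # net operating income per available room ($)

flow_through = pct_change(noipar_2009, noipar_2007) / pct_change(revpar_2009, revpar_2007)
print(f"RevPAR change:  {pct_change(revpar_2009, revpar_2007):+.1%}")
print(f"NOIPAR change:  {pct_change(noipar_2009, noipar_2007):+.1%}")
print(f"Flow-through ratio: {flow_through:.2f}")  # > 1: profit falls faster than revenue
```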

Relevance: 100.00%

Abstract:

Maintenance of transport infrastructure assets is widely advocated as key to minimizing the current and future costs of the transportation network. While effective maintenance decisions are often the result of engineering skills and practical knowledge, efficient decisions must also account for the net result over an asset's life cycle. One essential aspect of the long-term perspective on transport infrastructure maintenance is proactively estimating maintenance needs. In dealing with immediate maintenance actions, support tools that can prioritize potential maintenance candidates are important for obtaining an efficient maintenance strategy. This dissertation consists of five individual research papers presenting a microdata analysis approach to transport infrastructure maintenance. Microdata analysis is a multidisciplinary field in which large quantities of data are collected, analyzed, and interpreted to improve decision-making. Increased access to transport infrastructure data enables a deeper understanding of causal effects and the possibility of making predictions of future outcomes. The microdata analysis approach covers the complete process from data collection to actual decisions and is therefore well suited to the task of improving efficiency in transport infrastructure maintenance. Statistical modeling was the selected analysis method in this dissertation and provided solutions to the different problems presented in each of the five papers. In Paper I, a time-to-event model was used to estimate remaining road pavement lifetimes in Sweden. In Paper II, an extension of the model in Paper I assessed the impact of latent variables on road lifetimes, revealing the sections in a road network that are weaker due to, for example, subsoil conditions or undetected heavy traffic. The study in Paper III incorporated a probabilistic parametric distribution as a representation of road lifetimes into an equation for the marginal cost of road wear. Differentiated road wear marginal costs for heavy and light vehicles are an important information basis for decisions regarding vehicle miles traveled (VMT) taxation policies. In Paper IV, a distribution-based clustering method was used to distinguish between road segments that are deteriorating and road segments that have a stationary road condition. Within railway networks, temporary speed restrictions are often imposed because of maintenance and must be addressed in order to maintain punctuality. The study in Paper V evaluated the empirical effect of speed restrictions on running time on a Norwegian railway line using a generalized linear mixed model.
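
As a minimal sketch of the time-to-event idea used in Paper I (not the dissertation's actual model, whose covariates and data the abstract does not specify), the snippet below fits a Weibull survival model to invented pavement-lifetime data with right-censored observations, using the lifelines library.

```python
# Minimal illustration (not the dissertation's model): estimate pavement lifetime
# with a Weibull time-to-event model that handles right-censoring, i.e. sections
# that had not yet failed when last surveyed. Data below are invented.
from lifelines import WeibullFitter

lifetimes = [12, 15, 9, 20, 17, 14, 22, 11, 18, 16]   # years since last rehabilitation
observed  = [1,  1,  1,  0,  1,  1,  0,  1,  0,  1]   # 1 = failure observed, 0 = censored

wf = WeibullFitter()
wf.fit(lifetimes, event_observed=observed)
print("estimated median pavement lifetime (years):", wf.median_survival_time_)
```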

Relevance: 100.00%

Abstract:

This report presents a summary of the research conducted by the research team of CRC project 2002-005-C, “Decision support tools for concrete infrastructure rehabilitation”. The project scope, objectives, significance, innovation, and research methodology are outlined in the introduction, which is followed by five chapters covering different aspects of the research completed. Major findings of a literature review covering both the use of fibre-reinforced polymer (FRP) composites in the rehabilitation of concrete bridge structures and decision support frameworks in civil infrastructure asset management are presented in chapter two. A case study of the development of a strengthening scheme for the “Tenthill Creek bridge” is covered in the third chapter, which summarises the capacity assessment, the traditional strengthening solution, and the innovative solution using FRP composites. The fourth chapter presents the methodology for the development of a user guide covering the selection of materials and the design and application of FRP in the strengthening of concrete structures, demonstrated using design examples. The fifth chapter presents the methodology developed for evaluating the whole-of-life-cycle costs of treatment options for concrete bridge structures. The decision support software tool developed to compare different treatment options based on reliability-based whole-of-life-cycle costing is also briefly described in this chapter. The report concludes with a summary of findings and recommendations for future research.
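
The abstract does not reproduce the report's costing model. As a hypothetical sketch of the whole-of-life-cycle costing comparison it describes, the snippet below discounts the cash flows of two invented treatment options to a present cost; option names, amounts, and the discount rate are assumptions.

```python
# Hypothetical sketch of whole-of-life-cycle costing: discount each treatment
# option's cash flows (initial cost plus scheduled maintenance) to a net
# present cost and compare. Figures and options are invented.
def present_cost(cash_flows, rate=0.05):
    # cash_flows: list of (year, cost) pairs
    return sum(cost / (1 + rate) ** year for year, cost in cash_flows)

options = {
    "conventional_strengthening": [(0, 400_000), (15, 120_000), (30, 120_000)],
    "frp_strengthening":          [(0, 550_000), (25, 60_000)],
}

for name, flows in options.items():
    print(f"{name}: present cost = {present_cost(flows):,.0f}")
```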

Relevance: 100.00%

Abstract:

In the previous research, CRC CI 2001-010-C “Investment Decision Framework for Infrastructure Asset Management”, a method for assessing variation in cost estimates for road maintenance and rehabilitation was developed. The variability of pavement strength data collected from a 92 km national highway was used in the analysis to demonstrate the concept. Further analysis was conducted to identify critical input parameters that significantly affect the prediction of road deterioration. In addition to pavement strength, rut depth, annual traffic loading, and initial roughness were found to be critical input parameters for road deterioration. This report presents a method developed to incorporate other critical parameters into the analysis, such as unit costs, which are suspected to contribute to some degree to cost estimate variation; the variability of unit costs is therefore incorporated in this analysis. The Bruce Highway, located on the tropical east coast of Queensland, was identified as the network for the analysis. This report presents a step-by-step methodology for assessing variation in road maintenance and rehabilitation cost estimates.
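
The report's own formulation is not given in the abstract. As an illustrative Monte Carlo sketch of the general idea of propagating input variability (pavement strength, unit cost) into a distribution of the cost estimate, the snippet below uses invented probability distributions and a deliberately simplified treatment-cost formula.

```python
# Illustrative Monte Carlo sketch (not the report's model): propagate variability
# in pavement strength and unit cost through a simple rehabilitation-cost formula
# to obtain a distribution of the cost estimate. Distributions are assumptions.
import random, statistics

def sample_cost(rng):
    strength = rng.gauss(120, 15)            # pavement strength index
    unit_cost = rng.gauss(35.0, 4.0)         # $ per m^2 of treatment
    area = 7.0 * 1000                        # 7 m wide, 1 km segment
    treatment_factor = max(0.2, (150 - strength) / 100)  # weaker pavement -> more work
    return area * treatment_factor * unit_cost

rng = random.Random(1)
costs = [sample_cost(rng) for _ in range(10_000)]
print("mean estimate:", round(statistics.mean(costs)))
print("90th percentile:", round(sorted(costs)[int(0.9 * len(costs))]))
```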

Relevance: 100.00%

Abstract:

With increasingly complex engineering assets and tight economic requirements, asset reliability becomes more crucial in Engineering Asset Management (EAM). Improving the reliability of systems has always been a major aim of EAM. Reliability assessment using degradation data has become a significant approach to evaluating the reliability and safety of critical systems. Degradation data often provide more information than failure time data for assessing reliability and predicting the remnant life of systems. In general, degradation is the reduction in performance, reliability, and life span of assets. Many failure mechanisms can be traced to an underlying degradation process. Degradation is a stochastic process and can therefore be modelled using several approaches. Degradation modelling techniques have generated a great amount of research in the reliability field. Although degradation models play a significant role in reliability analysis, there are few review papers on the topic. This paper presents a review of the existing literature on commonly used degradation models in reliability analysis. Current research and developments in degradation models are reviewed and summarised, and the models are synthesised and classified into groups. Additionally, the paper attempts to identify the merits, limitations, and applications of each model, and it outlines potential applications of these degradation models in asset health and reliability prediction.
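
As an illustrative sketch of one commonly used class of degradation model, a Wiener process with linear drift, the snippet below simulates degradation paths to a failure threshold and summarises the resulting lifetimes; the parameters are invented and the model is an example of the general technique, not one specifically recommended by the review.

```python
# Illustrative sketch of a Wiener-process degradation model with linear drift:
# simulate degradation paths and record the first time each crosses a failure
# threshold, giving an empirical lifetime distribution. Parameters are invented.
import random, statistics

def first_passage_time(drift=0.8, sigma=0.3, threshold=50.0, dt=1.0, rng=None):
    rng = rng or random.Random()
    level, t = 0.0, 0.0
    while level < threshold:
        level += drift * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
        t += dt
    return t

rng = random.Random(7)
lives = [first_passage_time(rng=rng) for _ in range(2000)]
print("mean time to failure:", round(statistics.mean(lives), 1))
print("standard deviation:", round(statistics.stdev(lives), 1))
```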

Relevance: 100.00%

Abstract:

Most infrastructure projects share the same characteristics in terms of management aspects and shortcomings. The human factor is believed to be the major drawback, owing to the unstructured nature of the problems involved, which can further contribute to management conflicts. This growing complexity in infrastructure projects has shifted the paradigm of policy makers towards adopting Information and Communication Technology (ICT) as a driving force. For this reason, it is vital to fully utilise recent technologies to accelerate the management process, particularly in the planning phase. Accordingly, many tools have been developed to assist decision making in construction project management. The variety of uncertainties and alternatives in decision making can be handled using tools such as Decision Support Systems (DSS). However, the recent trend shows that most DSS in this area have concentrated only on model development and have neglected several fundamentals of computing. Thus, most of them were found to be complicated and inefficient in supporting decision making among project team members. Given the current shortcomings of many software aspects, it is desirable for a DSS to provide greater simplicity, a better collaborative platform, efficient data manipulation, and responsiveness to user needs. Considering these factors, the paper discusses four challenges for future DSS development: requirements engineering, communication frameworks, data management and interoperability, and software usability.