963 results for event management


Relevance: 30.00%

Abstract:

The research team recognized the value of network-level Falling Weight Deflectometer (FWD) testing for evaluating the structural condition trends of flexible pavements. However, practical limitations, including the cost of testing, traffic control and safety concerns, and the difficulty of testing a large network, may discourage some agencies from conducting network-level FWD testing. For this reason, the Structural Condition Index (SCI) is suggested as a surrogate measure. The main purpose of the research presented in this paper is to investigate data mining strategies and to develop a method for predicting structural condition trends in network-level applications that does not require FWD testing. The research team first evaluated the existing and historical pavement condition, distress, ride, traffic and other data attributes in the Texas Department of Transportation (TxDOT) Pavement Maintenance Information System (PMIS), applied data mining strategies to the data, discovered useful patterns and knowledge for SCI value prediction, and finally provided a reasonable measure of pavement structural condition that is correlated with the SCI. To evaluate the performance of the developed prediction approach, a case study was conducted using SCI data calculated from FWD data collected on flexible pavements over a 5-year period (2005–09) from 354 PMIS sections representing 37 pavement sections on the Texas highway system. The preliminary results showed that the proposed approach can serve as a supportive pavement structural index when FWD deflection data are not available, helping pavement managers identify the timing and appropriate treatment level of preventive maintenance activities.
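
To give a flavour of the prediction step described above, here is a minimal Python sketch that trains a regression model on PMIS-style condition attributes to predict SCI. The file name, column names, and choice of a random forest are illustrative assumptions, not the paper's actual feature set or algorithm.

```python
# Sketch: predicting SCI from PMIS-style condition attributes without FWD data.
# File and column names are hypothetical; the paper's exact features differ.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

pmis = pd.read_csv("pmis_sections.csv")  # hypothetical extract of PMIS records
features = ["distress_score", "ride_score", "rut_depth", "alligator_cracking",
            "annual_esals"]  # assumed attributes
X_train, X_test, y_train, y_test = train_test_split(
    pmis[features], pmis["sci"], test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out sections:", r2_score(y_test, model.predict(X_test)))
```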

Relevance: 30.00%

Abstract:

The purpose of this paper is to investigate the essential elements of sport management in Australia in the 1990s, viewed from a legal perspective. In the past 12 months there have been at least three conferences in the sports law area. The majority of this paper is devoted to legal liability, especially the legal relationships evolving between the player and his co-participant, the player and his club, the player and his coach, and the duties and liabilities of the coach and the club. The area of insurance is also discussed, as it is a vital element in protecting players, coaches and clubs in the event of any litigation. A well-publicised case was Rogers v Bugden, in which the plaintiff Steven Rogers, a first-grade rugby league football player for Cronulla, suffered a broken jaw and sued his co-participant Mark Bugden and Bugden's employer, the Canterbury/Bankstown District Rugby League Football Club. It was held that there was a contract of employment; Canterbury/Bankstown was found vicariously liable and ordered to pay Rogers the sum of $68,154.00. Legal actions in tort and negligence are increasing, and sports managers will need to investigate thoroughly the protection available for their clients.

Relevance: 30.00%

Abstract:

Effective risk management is crucial for any organisation. One of its key steps is risk identification, but few tools exist to support this process. Here we present a method for the automatic discovery of a particular type of process-related risk, the danger of deadline transgressions or overruns, based on the analysis of event logs. We define a set of time-related process risk indicators, i.e., patterns observable in event logs that highlight the likelihood of an overrun, and then show how instances of these patterns can be identified automatically using statistical principles. To demonstrate its feasibility, the approach has been implemented as a plug-in module to the process mining framework ProM and tested using an event log from a Dutch financial institution.
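
As an illustration of a time-related risk indicator, the following Python sketch flags running cases whose elapsed time already exceeds a threshold learned from completed cases. The log schema and the mean-plus-two-standard-deviations rule are assumptions for illustration, not the paper's actual indicators or statistics.

```python
# Sketch of one time-related process risk indicator: flag running cases whose
# elapsed time exceeds a threshold learned from completed cases. The log schema
# (case_id, timestamp, boolean "complete" flag) is assumed for illustration.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])
spans = log.groupby("case_id")["timestamp"].agg(start="min", end="max")
spans["duration_h"] = (spans["end"] - spans["start"]).dt.total_seconds() / 3600

completed = spans[spans.index.isin(log.loc[log["complete"], "case_id"])]
threshold = completed["duration_h"].mean() + 2 * completed["duration_h"].std()

now = log["timestamp"].max()
running = spans[~spans.index.isin(completed.index)]
elapsed = (now - running["start"]).dt.total_seconds() / 3600
print("Cases at risk of overrun:", list(running.index[elapsed > threshold]))
```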

Relevance: 30.00%

Abstract:

Organisations are constantly seeking efficiency improvements for their business processes in terms of time and cost. Management accounting enables reporting of the detailed cost of operations for decision-making purposes, although significant effort is required to gather accurate operational data. Business process management is concerned with systematically documenting, managing, automating, and optimising processes. Process mining gives valuable insight into processes through analysis of events recorded by an IT system in the form of an event log, with an emphasis on efficient utilisation of time and resources; its primary focus, however, is not on cost implications. In this paper, we propose a framework to support management accounting decisions on cost control by automatically incorporating cost data with historical data from event logs for monitoring, predicting and reporting process-related costs. We also illustrate how accurate, relevant and timely management-accounting-style cost reports can be produced on demand by extending the open-source process mining framework ProM.
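
A minimal sketch of the merging step, assuming a hypothetical event log with one row per activity instance and an assumed resource rate table; the authors' ProM-based implementation is not reproduced here.

```python
# Sketch: merging a cost schedule with an event log to report process-related
# costs per activity. Schemas and the rate table are assumed for illustration.
import pandas as pd

log = pd.read_csv("event_log.csv", parse_dates=["start", "end"])
# assumed columns: case_id, activity, resource, start, end
rates = pd.read_csv("resource_rates.csv")  # assumed: resource, hourly_rate

events = log.merge(rates, on="resource")
events["hours"] = (events["end"] - events["start"]).dt.total_seconds() / 3600
events["cost"] = events["hours"] * events["hourly_rate"]

report = (events.groupby("activity")["cost"]
          .agg(total="sum", mean="mean")
          .sort_values("total", ascending=False))
print(report)  # an on-demand, management-accounting-style cost breakdown
```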

Relevance: 30.00%

Abstract:

Process mining encompasses the research area concerned with knowledge discovery from information system event logs. Within the process mining research area, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is considered highly valuable for the development and assessment of both process discovery and conformance techniques.
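
To make the notion of a conformance metric concrete, here is a toy Python sketch of one simple metric: the fraction of traces a model can replay, with the model reduced to a set of allowed directly-follows pairs. Real conformance metrics in the literature (e.g. token-based fitness) are considerably more sophisticated; this is only an illustrative simplification.

```python
# Toy conformance metric: the fraction of log traces that a model can replay,
# with the model reduced to allowed directly-follows pairs.
START, END = "<start>", "<end>"

model = {(START, "register"), ("register", "check"), ("check", "approve"),
         ("check", "reject"), ("approve", END), ("reject", END)}  # toy model

log = [["register", "check", "approve"],
       ["register", "check", "reject"],
       ["register", "approve"]]  # third trace skips "check"

def fits(trace):
    steps = zip([START] + trace, trace + [END])
    return all(step in model for step in steps)

fitting = sum(fits(t) for t in log)
print(f"trace fitness: {fitting}/{len(log)} = {fitting/len(log):.2f}")
```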

Relevance: 30.00%

Abstract:

Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance, and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to predict process delays via a method for configuring so-called Process Risk Indicators (PRIs). The method learns suitable configurations from past process behaviour recorded in event logs. To validate the approach, we have implemented it as a plug-in of the ProM process mining framework and have conducted experiments using various data sets from a major insurance company.
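
A loose illustration of learning a configuration from past behaviour: the sketch below sweeps a single duration threshold and keeps the one that best separates delayed from on-time cases. The data and the single-threshold form are assumptions for illustration, not the paper's PRI definitions.

```python
# Sketch: learning a configuration (here, one duration threshold) for a risk
# indicator from historical cases labelled delayed/on-time. Illustrative data.
waiting_h = [2, 5, 8, 20, 30, 4, 50, 6, 45, 3]   # waiting time per case (h)
delayed   = [0, 0, 0, 1, 1, 0, 1, 0, 1, 0]       # did the case overrun?

def accuracy(threshold):
    preds = [w > threshold for w in waiting_h]
    return sum(p == bool(d) for p, d in zip(preds, delayed)) / len(delayed)

best = max(sorted(set(waiting_h)), key=accuracy)
print(f"best threshold: {best} h, accuracy: {accuracy(best):.2f}")
```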

Relevance: 30.00%

Abstract:

Organisations are constantly seeking efficiency gains for their business processes in terms of time and cost. Management accounting enables detailed cost reporting of business operations for decision-making purposes, although significant effort is required to gather accurate operational data. Process mining, on the other hand, may provide valuable insight into processes through analysis of events recorded in logs by IT systems, but its primary focus is not on cost implications. In this paper, a framework is proposed which aims to exploit the strengths of both fields in order to better support management decisions on cost control. This is achieved by automatically merging cost data with historical data from event logs for the purposes of monitoring, predicting, and reporting process-related costs. The on-demand generation of accurate, relevant and timely cost reports, in a style akin to reports in the area of management accounting, is also illustrated, through an extension of the open-source process mining framework ProM.

Relevance: 30.00%

Abstract:

It is only in recent years that the critical role spatial data can play in disaster management and in strengthening community resilience has been recognised. This importance is singularly evident in the fact that, in Australia, spatial data is considered soft infrastructure. In the aftermath of each disaster this importance is further reinforced, with state agencies paying greater attention to ensuring the availability of accurate spatial data based on the lessons learnt. For example, the major flooding in Queensland during the summer of 2011 resulted in a comprehensive review of responsibilities and accountability for the provision of spatial information during such natural disasters. A high-level commission of enquiry completed a comprehensive investigation of the 2011 Brisbane flood inundation event and made specific recommendations concerning the collection of, and accessibility to, spatial information for disaster management and for strengthening community resilience during and after a natural disaster. The lessons learnt and processes implemented were subsequently tested by natural disasters in later years. This paper provides an overview of the practical implementation of the commission of enquiry's recommendations. It focuses particularly on the measures adopted by the state agencies with primary responsibility for managing spatial data, and on the evolution of this role in the State of Queensland, Australia. The paper concludes with a review of the development of this role and the increasing importance of spatial data as an infrastructure for disaster planning and management that promotes the strengthening of community resilience.

Relevance: 30.00%

Abstract:

The advances made within the aviation industry over the past several decades have significantly improved the availability, affordability and convenience of air travel, and have been greatly beneficial in both social and economic terms. Air transport has developed into an irreplaceable service relied on by millions of people each day, and airports have thus become critical elements of national infrastructure facilitating the movement of people and goods. As components of critical infrastructure (CI), airports are integral parts of a national economy, supporting regional as well as national trade, commercial activity and employment. Therefore, any disruption or crisis which impacts the continuity of operations at airports can have significant negative consequences for the airport as a business, for the local economy and other nodes of transport infrastructure, and for society. Due to the highly dynamic and volatile environment in which airports operate, the aviation industry has faced many different challenges over the years, ranging from terrorist attacks such as September 11, to health crises such as the SARS epidemic, to system breakdowns such as the recent computer system outage at Virgin Blue Airlines in Australia. All these events have highlighted the vulnerability of airport systems to a range of disturbances, as well as the gravity and widespread impact of any kind of discontinuity in airport functions. Such incidents thus emphasise the need to increase the resilience and reliability of airports and to ensure business continuity in the event of a crisis...

Relevance: 30.00%

Abstract:

A novel intelligent online demand side management system is proposed for peak load management. The method also regulates the network voltage, balances the power across the three phases, and coordinates battery storage discharge within the network. It uses low-cost controllers with low-bandwidth two-way communication, installed on customers' premises and at distribution transformers, to manage the peak load while maximizing customer satisfaction. A multi-objective decision-making process is proposed to select the load(s) to be delayed or controlled. The efficacy of the proposed control system is verified through an event-based simulation developed in Matlab.
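
A toy sketch of the multi-objective selection step, using a simple weighted score that favours large loads and penalises customer inconvenience; the load data, weights, and scalarised scoring are illustrative assumptions rather than the paper's actual decision-making process.

```python
# Sketch of a multi-objective load-selection step: score each controllable
# load (peak reduction vs. customer inconvenience) and delay the best-scoring
# ones until the required reduction is met. All values are illustrative.
loads = [  # (name, kW, inconvenience 0..1)
    ("pool pump",    1.5, 0.1),
    ("water heater", 3.6, 0.4),
    ("air con",      2.8, 0.9),
    ("dryer",        2.4, 0.3),
]
W_POWER, W_COMFORT = 0.7, 0.3
max_kw = max(kw for _, kw, _ in loads)

def score(load):
    _, kw, inconvenience = load
    # favour large loads, penalise customer inconvenience
    return W_POWER * (kw / max_kw) - W_COMFORT * inconvenience

needed_kw, shed = 4.0, []
for load in sorted(loads, key=score, reverse=True):
    if needed_kw <= 0:
        break
    shed.append(load[0])
    needed_kw -= load[1]
print("loads selected for delay:", shed)
```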

Relevance: 30.00%

Abstract:

Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC simulation technique is quite sensitive to the probability distributions of the input variables. This phenomenon becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique constitutes a more complex simulation method (relative to sMC), wherein a structured sampling algorithm is employed in place of completely randomized sampling; consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
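
The computational impracticality of sMC for small regions of interest follows from the standard result that estimating a failure probability p_f with coefficient of variation delta requires roughly N = (1 - p_f) / (delta^2 * p_f) samples. The short Python sketch below computes this and runs a direct sMC estimate of a rare event; the example event is an assumption chosen only to show the sampling cost.

```python
# Why standard Monte Carlo struggles with small failure regions: the required
# sample count grows inversely with the failure probability p_f.
import random

def required_samples(p_f, delta=0.1):
    return (1 - p_f) / (delta**2 * p_f)

for p_f in (1e-2, 1e-4, 1e-6):
    print(f"p_f = {p_f:.0e}: ~{required_samples(p_f):.0f} samples for 10% c.o.v.")

# Direct sMC estimate of a rare event: P(standard normal > 3.5) ~ 2.3e-4
n = 100_000
hits = sum(random.gauss(0, 1) > 3.5 for _ in range(n))
print(f"sMC estimate with {n} samples: {hits / n:.1e}")  # noisy at this size
```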

Relevance: 30.00%

Abstract:

With the growing size and variety of social media files on the web, it is becoming critical to organize them efficiently into clusters for further processing. This paper presents a novel scalable constrained document clustering method that harnesses the power of search engines capable of dealing with large text data. Instead of calculating the distance between the documents and all of the clusters' centroids, a neighborhood of best cluster candidates is chosen using a document ranking scheme. To make the method faster and less memory-dependent, in-memory and in-database processing are combined in a semi-incremental manner. The method has been extensively tested in the social event detection application. Empirical analysis shows that it is efficient in both computation and memory usage while producing notable accuracy.
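
A toy sketch of the candidate-narrowing idea: instead of comparing a document against every cluster centroid, an inverted index over cluster terms ranks a small neighbourhood of candidate clusters. The data and term-voting score are illustrative, not the paper's ranking scheme.

```python
# Sketch: rank a small neighbourhood of candidate clusters via an inverted
# index over cluster terms, instead of computing distances to all centroids.
from collections import defaultdict, Counter

clusters = {0: ["music", "festival", "stage"],
            1: ["football", "match", "goal"],
            2: ["festival", "food", "market"]}  # top terms per cluster (toy)

index = defaultdict(set)  # term -> clusters containing it
for cid, terms in clusters.items():
    for term in terms:
        index[term].add(cid)

def candidate_clusters(doc_terms, k=2):
    votes = Counter()
    for term in doc_terms:
        for cid in index.get(term, ()):
            votes[cid] += 1  # search-engine-style term matching
    return [cid for cid, _ in votes.most_common(k)]

doc = ["festival", "food", "stage"]
print("best cluster candidates:", candidate_clusters(doc))
```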

Relevance: 30.00%

Abstract:

Organisations are constantly seeking new ways to improve operational efficiency. This research investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the feasibility of the approach. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
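
A minimal genetic-algorithm sketch in this spirit: chromosomes encode which resource executes each task, and fitness is a weighted cost structure trading total duration against resource expense. The task data, rates, and GA settings are illustrative assumptions, not the paper's cost structure.

```python
# Sketch: GA over process execution scenarios. A chromosome assigns a resource
# to each task; cost trades off total duration against resource expense.
import random

DURATION = [[4, 2], [3, 5], [6, 3]]   # hours: task x resource (toy data)
RATE = [20, 35]                        # hourly rate per resource
W_TIME, W_COST = 10.0, 1.0             # weights of the cost structure

def cost(chrom):
    hours = [DURATION[t][r] for t, r in enumerate(chrom)]
    money = sum(h * RATE[r] for h, r in zip(hours, chrom))
    return W_TIME * sum(hours) + W_COST * money

def evolve(pop_size=20, generations=50):
    pop = [[random.randrange(2) for _ in DURATION] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(DURATION))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.1:          # mutation
                i = random.randrange(len(child))
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
print("best assignment:", best, "cost:", cost(best))
```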

Relevance: 30.00%

Abstract:

This paper presents an event-based failure model to predict the number of failures that occur in water distribution assets. Such models have often been based on analysis of historical failure data combined with pipe characteristics and environmental conditions. In this paper, weather data have been added to the model to take into account the commonly observed seasonal variation of the failure rate. The theoretical basis of existing logistic regression models is briefly described, along with the refinements made to the model to incorporate seasonal weather variation. The performance of these refinements is tested using data from two Australian water authorities.
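
A minimal sketch of a logistic regression failure model with weather covariates added to pipe attributes, in the spirit of the paper. The file, feature names, and use of scikit-learn are assumptions; the authors' exact covariates and model refinements are not reproduced here.

```python
# Sketch: logistic regression for pipe failure with weather covariates added
# to pipe attributes. Column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

pipes = pd.read_csv("pipe_failures.csv")  # hypothetical monthly records
features = ["age_years", "diameter_mm", "length_m",
            "rainfall_mm", "soil_moisture", "max_temp_c"]  # weather assumed
X_train, X_test, y_train, y_test = train_test_split(
    pipes[features], pipes["failed"], test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
print(dict(zip(features, model.coef_[0].round(3))))  # sign hints at seasonality
```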

Relevance: 30.00%

Abstract:

Through the application of process mining, valuable evidence-based insights can be obtained about business processes in organisations. As a result, the field has seen increased uptake in recent years, as evidenced by success stories and increased tool support. Despite this impact, however, current performance analysis capabilities remain somewhat limited in the context of information-poor event logs; for example, natural daily and weekly patterns are not considered. In this paper, a new framework for analysing event logs is defined, based on the concept of an event gap. The framework allows for a systematic approach to sophisticated performance-related analysis of event logs containing varying degrees of information. The paper formalises a range of event gap types and then presents an implementation as well as an evaluation of the proposed approach.
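
To illustrate the basic notion of an event gap, the sketch below computes the idle time between consecutive events of the same case in a toy log; the paper's full typology of gap types (e.g. gaps explained by daily or weekly patterns) is not reproduced here.

```python
# Sketch: an event gap as the idle time between consecutive events of the
# same case. The toy log is illustrative.
import pandas as pd

log = pd.DataFrame({
    "case_id":   [1, 1, 1, 2, 2],
    "activity":  ["register", "check", "approve", "register", "check"],
    "timestamp": pd.to_datetime(["2024-01-01 09:00", "2024-01-01 17:00",
                                 "2024-01-03 09:00", "2024-01-02 10:00",
                                 "2024-01-02 11:30"]),
}).sort_values(["case_id", "timestamp"])

log["gap"] = log.groupby("case_id")["timestamp"].diff()
print(log[["case_id", "activity", "gap"]])
# Large gaps (e.g. the 40 h before "approve") flag waiting time worth analysing
# once expected gaps such as nights and weekends are accounted for.
```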