806 results for intelligent decision support systems


Relevance:

100.00%

Publisher:

Abstract:

With organisations facing significant challenges to remain competitive, Business Process Improvement (BPI) initiatives are often conducted to improve the efficiency and effectiveness of their business processes, focussing on time, cost, and quality improvements. Event logs, which contain a detailed record of business operations over a certain time period as captured by an organisation's information systems, are the first step towards initiating evidence-based BPI activities. Given an (original) event log as a starting point, an approach was developed to explore better ways to execute a business process, resulting in an improved (perturbed) event log. Identifying the differences between the original event log and the perturbed event log can provide valuable insights that help organisations improve their processes. However, there is a lack of automated techniques to detect the differences between two event logs. This research therefore aims to develop visualisation techniques that provide targeted analysis of resource reallocation and activity rescheduling. The differences between the two event logs are first identified; the changes are then conceptualised and realised through a number of visualisations. With the proposed visualisations, analysts are able to identify the changes related to resources and time, leading to a more efficient business process. Ultimately, analysts can use this comparative information to initiate evidence-based BPI activities.
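
To make the comparison concrete, here is a minimal Python sketch (not the technique developed in the research) showing how resource reallocations and activity reschedules between an original and a perturbed event log could be tabulated before being visualised; the event schema (case id, activity, resource, start time) and the sample events are invented for illustration.

```python
from collections import Counter
from datetime import datetime

# Each event: (case_id, activity, resource, start_time) -- illustrative schema.
original_log = [
    ("c1", "Assess claim",  "Alice", datetime(2023, 1, 2, 9, 0)),
    ("c1", "Approve claim", "Bob",   datetime(2023, 1, 2, 11, 0)),
    ("c2", "Assess claim",  "Alice", datetime(2023, 1, 3, 9, 0)),
]
perturbed_log = [
    ("c1", "Assess claim",  "Carol", datetime(2023, 1, 2, 8, 0)),
    ("c1", "Approve claim", "Bob",   datetime(2023, 1, 2, 9, 30)),
    ("c2", "Assess claim",  "Alice", datetime(2023, 1, 3, 9, 0)),
]

def resource_reallocations(before, after):
    """Count activity-level changes in who performs the work."""
    before_map = {(c, a): r for c, a, r, _ in before}
    after_map = {(c, a): r for c, a, r, _ in after}
    changes = Counter()
    for key, old_res in before_map.items():
        new_res = after_map.get(key)
        if new_res is not None and new_res != old_res:
            changes[(key[1], old_res, new_res)] += 1
    return changes

def activity_reschedules(before, after):
    """Report how much each activity instance was shifted in time (hours)."""
    before_map = {(c, a): t for c, a, _, t in before}
    after_map = {(c, a): t for c, a, _, t in after}
    shifts = {}
    for key, old_t in before_map.items():
        new_t = after_map.get(key)
        if new_t is not None and new_t != old_t:
            shifts[key] = (new_t - old_t).total_seconds() / 3600.0
    return shifts

print(resource_reallocations(original_log, perturbed_log))
print(activity_reschedules(original_log, perturbed_log))
```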

Relevance:

100.00%

Publisher:

Abstract:

This paper addresses the problem of predicting the outcome of an ongoing case of a business process based on event logs. In this setting, the outcome of a case may refer, for example, to the achievement of a performance objective or the fulfillment of a compliance rule upon completion of the case. Given a log consisting of traces of completed cases, a trace of an ongoing case, and two or more possible outcomes (e.g., a positive and a negative outcome), the paper addresses the problem of determining the most likely outcome for the case in question. Previous approaches to this problem are largely based on simple symbolic sequence classification, meaning that they extract features from traces seen as sequences of event labels and use these features to construct a classifier for runtime prediction. In doing so, these approaches ignore the data payload associated with each event. This paper approaches the problem from a different angle by treating traces as complex symbolic sequences, that is, sequences of events each carrying a data payload. In this context, the paper outlines different feature encodings of complex symbolic sequences and compares their predictive accuracy on real-life business process event logs.
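
As an illustration of treating traces as complex symbolic sequences, the sketch below encodes a trace prefix as a fixed-length vector of event labels and their data payloads and feeds it to an off-the-shelf classifier. This is a simplified index-based style of encoding under assumed field names and toy data, not the exact encodings evaluated in the paper.

```python
# Index-based encoding of trace prefixes: each prefix becomes a fixed-length
# vector of event label indices interleaved with the data payload carried by
# each event. Activity names and payload values are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import LabelEncoder

PREFIX_LEN = 3  # number of events taken from the start of each trace

traces = [  # each event: (activity_label, payload_amount)
    [("register", 100), ("check", 120), ("approve", 120)],
    [("register", 900), ("check", 950), ("reject", 950)],
    [("register", 150), ("check", 150), ("approve", 150)],
]
outcomes = ["positive", "negative", "positive"]  # outcome upon completion

labels = LabelEncoder().fit([a for t in traces for a, _ in t])

def encode(trace):
    """Pad/truncate to PREFIX_LEN and interleave label index with payload."""
    vec = []
    for i in range(PREFIX_LEN):
        if i < len(trace):
            act, amount = trace[i]
            vec += [labels.transform([act])[0], amount]
        else:
            vec += [-1, 0.0]  # padding for short prefixes
    return vec

X = np.array([encode(t) for t in traces])
y = np.array(outcomes)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
running_case = [("register", 400), ("check", 420)]
print(clf.predict([encode(running_case)]))  # most likely outcome so far
```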

Relevance:

100.00%

Publisher:

Abstract:

All the materials used in our two experimental studies evaluating the BPMVM modeller.

Relevance:

100.00%

Publisher:

Abstract:

Large power transformers are important parts of the power supply chain. These highly critical networks of engineering assets are an essential base of a nation's energy resource infrastructure. This research identifies the key factors influencing transformer normal operating conditions and predicts the asset management lifespan. Engineering asset research has produced few lifespan forecasting methods that combine real-time monitoring solutions for transformer maintenance and replacement. Utilizing the rich data source from a remote terminal unit (RTU) system for sensor-data-driven analysis, this research develops an innovative real-time lifespan forecasting approach applying logistic regression based on the Weibull distribution. The methodology and the implementation prototype are verified using a data series from 161 kV transformers to evaluate the efficiency and accuracy for energy sector applications. Asset stakeholders and suppliers benefit significantly from real-time power transformer lifespan evaluation for maintenance and replacement decision support.
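
The sketch below is one simple way, under stated assumptions and not the paper's exact logistic-regression formulation, to ground a lifespan estimate in the Weibull distribution: fit shape and scale to historical transformer lifetimes with SciPy and report the survival probability at a unit's current age. The lifetime values are invented.

```python
# A minimal sketch (not the paper's exact method): fit a Weibull distribution
# to historical transformer lifetimes and report the survival probability of
# an in-service unit at its current age. Lifetimes below are made up.
import numpy as np
from scipy.stats import weibull_min

lifetimes_years = np.array([31.0, 35.5, 28.2, 40.1, 33.7, 38.4, 29.9, 36.8])

# Fix the location parameter at zero so only shape and scale are estimated.
shape, loc, scale = weibull_min.fit(lifetimes_years, floc=0)

current_age = 25.0
survival = weibull_min.sf(current_age, shape, loc=loc, scale=scale)
median_life = weibull_min.median(shape, loc=loc, scale=scale)

print(f"Weibull shape={shape:.2f}, scale={scale:.1f} years")
print(f"P(survive beyond {current_age:.0f} years) = {survival:.2f}")
print(f"Estimated median lifespan = {median_life:.1f} years")
```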

Relevance:

100.00%

Publisher:

Abstract:

This paper describes a new knowledge acquisition method using a generic design environment in which context-sensitive knowledge is used to build specific decision support systems (DSSs) for rural businesses. Although standard knowledge acquisition methods have been applied in rural business applications, uptake remains low and familiar weaknesses such as obsolescence and brittleness apply. We describe a DSS-building environment in which contextual factors relevant to the end users are taken directly into consideration. This "end user enabled design environment" (EUEDE) engages both domain experts, in creating an expert knowledge base, and business operators/end users (such as farmers), in using this knowledge to build their specific DSS. We document the knowledge organisation for the problem domain, namely a dairy industry application. This development involved a case-study research approach used to explore dairy operational knowledge. In this system end users can tailor their decision-making requirements, using their own judgement, to build specific DSSs. In a given end user's farming context, each DSS provides expert suggestions to assist farmers in improving their farming practice. The paper also demonstrates the environment's generic capability.
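
A toy sketch of the general idea, with all rules, thresholds and field names invented for illustration: domain experts contribute a rule base, and an end user assembles a DSS from the rules relevant to their own context and queries it for suggestions.

```python
# Illustrative only: a tiny context-sensitive rule base in the spirit of the
# EUEDE idea. Domain experts supply rules, the end user supplies the farm
# context, and the assembled DSS returns tailored suggestions. All rule
# content and field names here are invented for the example.

expert_rules = [
    {"if": lambda ctx: ctx["pasture_cover_kg_dm_ha"] < 1800,
     "then": "Consider supplementary feeding until pasture cover recovers."},
    {"if": lambda ctx: ctx["soil_moisture_pct"] < 20 and ctx["irrigated"],
     "then": "Schedule irrigation within the next two days."},
    {"if": lambda ctx: ctx["somatic_cell_count"] > 250_000,
     "then": "Review milking hygiene and consider herd testing."},
]

def build_dss(selected_rules):
    """End users pick the rules relevant to their own operation."""
    def advise(context):
        return [r["then"] for r in selected_rules if r["if"](context)]
    return advise

# A farmer tailors the DSS to their context and queries it.
my_dss = build_dss(expert_rules[:2])
farm_context = {"pasture_cover_kg_dm_ha": 1650,
                "soil_moisture_pct": 15,
                "irrigated": True,
                "somatic_cell_count": 180_000}
for suggestion in my_dss(farm_context):
    print(suggestion)
```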

Relevance:

100.00%

Publisher:

Abstract:

When exposed to hot (22-35 °C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P ≤ 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P ≤ 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared with the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface for the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in their crops (r = 0.95, P ≤ 0.01). These results suggest that ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as in a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
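
The accumulation rule described above can be sketched as follows; the triangular temperature response is a placeholder standing in for APSIM's four temperature response functions, and the daily inputs are invented.

```python
# A simplified daily accumulation of the aflatoxin risk index (ARI) following
# the rule described above: risk accrues only when fractional available soil
# water < 0.20 and the crop is in the last 0.40 of pod filling. The triangular
# temperature response below is a placeholder, not APSIM's actual functions.

def temp_response(mean_temp_c, t_min=22.0, t_opt=28.5, t_max=35.0):
    """0..1 response, peaking at t_opt and zero outside [t_min, t_max]."""
    if mean_temp_c <= t_min or mean_temp_c >= t_max:
        return 0.0
    if mean_temp_c <= t_opt:
        return (mean_temp_c - t_min) / (t_opt - t_min)
    return (t_max - mean_temp_c) / (t_max - t_opt)

def aflatoxin_risk_index(daily):
    """daily: list of (mean_temp_c, fractional_asw, pod_fill_fraction)."""
    ari = 0.0
    for temp, fasw, pod_fill in daily:
        if fasw < 0.20 and pod_fill >= 0.60:  # last 0.40 of pod filling
            ari += temp_response(temp)
    return ari

season = [(30.0, 0.15, 0.65), (33.0, 0.10, 0.75), (27.0, 0.25, 0.85),
          (34.0, 0.08, 0.95)]
print(f"ARI = {aflatoxin_risk_index(season):.2f}")
```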

Relevance:

100.00%

Publisher:

Abstract:

The Department of Forest Resource Management at the University of Helsinki carried out the SIMO project in 2004-2007 to develop a new-generation planning system for forest management. The project parties are the organisations that carry out most Finnish forest planning in government, industry and privately owned forests. The aim of this study was to identify the needs and requirements for the new forest planning system and to clarify how the parties see the targets and processes of today's forest planning. Representatives responsible for forest planning in each organisation were interviewed individually. According to the study, the stand-based system for managing and treating forests will continue in the future. Because of varied data acquisition methods with differing accuracy and sources, and the development of single-tree interpretation, an increasing share of forest data is collected without field work. The benefits of more specific forest data also call for information units smaller than the tree stand. In Finland the traditional way to arrange forest planning computation is divided into two elements. After the forest data have been updated to the present situation, the growth of every stand unit is simulated under several alternative treatment schedules. After simulation, optimisation selects one treatment schedule for every stand so that the management programme satisfies the owner's goals as well as possible. This arrangement will be maintained in the future system. The parties' requirements to add multi-criteria problem solving, group decision support methods, and heuristic and spatial optimisation to the system make the programming work more challenging. In general, the new system is expected to be adjustable and transparent; strict documentation and free source code help to bring these expectations into effect. Different growth models and treatment schedules, with varying source information, accuracy, methods and processing speed, are expected to work easily in the system. Possibilities to calibrate models regionally and to set local, time-varying parameters are also required. In the future the forest planning system will be integrated into comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular implementation and a simple data transmission interface between modules and with other systems. No major differences in the parties' views of the system requirements were noticed in this study; rather, the interviews completed the full picture from slightly different angles. Within the organisations, forest management planning is considered rather inflexible and only draws the strategic lines; it does not yet have a role in operative activity, although the need for and benefits of team-level forest planning are acknowledged. The demands and opportunities of varied forest data, new planning goals and developments in information technology are recognised, and the party organisations want to keep pace with this development. One example is their engagement in the extensive SIMO project, which connects the whole field of forest planning in Finland.
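
The simulate-then-optimise arrangement can be illustrated with a toy example (all stand data and the harvest constraint are invented): alternative treatment schedules per stand stand in for simulation output, and a brute-force optimisation picks one schedule per stand to best satisfy the owner's goal.

```python
# A toy version of the two-stage arrangement described above: growth under
# alternative treatment schedules is "simulated" per stand (here just given as
# data), and optimisation then picks one schedule per stand so that the
# forest-level plan best satisfies the owner's goal. Numbers are invented.
from itertools import product

# schedules per stand: (name, net_present_value, first_period_harvest_m3)
stands = {
    "stand_1": [("no_thinning", 12_000, 0), ("thin_now", 14_500, 120)],
    "stand_2": [("no_thinning", 9_000, 0), ("thin_now", 10_200, 90)],
    "stand_3": [("clearcut_now", 22_000, 400), ("postpone", 19_500, 0)],
}
HARVEST_CAP_M3 = 450  # owner's constraint on first-period harvest volume

best_plan, best_npv = None, float("-inf")
for combo in product(*stands.values()):
    npv = sum(s[1] for s in combo)
    harvest = sum(s[2] for s in combo)
    if harvest <= HARVEST_CAP_M3 and npv > best_npv:
        best_plan, best_npv = combo, npv

for stand, schedule in zip(stands, best_plan):
    print(stand, "->", schedule[0])
print("total NPV:", best_npv)
```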

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become common practice in dairy husbandry, and in 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper spends monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and set up at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier was chosen for the task. The data were divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The model was evaluated for its ability to detect lameness in the validation dataset, which contained 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame cows, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
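
A probabilistic neural network is essentially a Parzen-window classifier; the sketch below shows a minimal version applied to invented leg-load feature vectors (four normalised leg weights plus a kick count per milking), not the actual data or model of the thesis.

```python
# A minimal probabilistic neural network (Parzen-window) classifier of the
# kind described above, applied to invented leg-load feature vectors.
import numpy as np

def pnn_predict(x, train_X, train_y, sigma=0.5):
    """Return the class whose Gaussian-kernel density at x is largest."""
    scores = {}
    for label in np.unique(train_y):
        samples = train_X[train_y == label]
        d2 = np.sum((samples - x) ** 2, axis=1)
        scores[label] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy training data: normalised [LF, RF, LR, RR leg load, kicks] per milking.
train_X = np.array([
    [0.25, 0.25, 0.26, 0.24, 0.0],   # sound: even load, no kicks
    [0.26, 0.24, 0.25, 0.25, 0.1],   # sound
    [0.32, 0.30, 0.28, 0.10, 0.6],   # lame: unloads right rear leg, kicks
    [0.31, 0.31, 0.08, 0.30, 0.7],   # lame
])
train_y = np.array(["sound", "sound", "lame", "lame"])

new_milking = np.array([0.30, 0.31, 0.12, 0.27, 0.5])
print(pnn_predict(new_milking, train_X, train_y))  # -> "lame"
```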

Relevance:

100.00%

Publisher:

Abstract:

Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage lowered total PPN populations, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995) and are costly. The cost of nematicide used in R1 was approximately $320-$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Our study also demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands; for example, the soil in plots growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots growing Q245. The authors see great value in investment in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.

Relevance:

100.00%

Publisher:

Abstract:

Reductionist thinking will no longer suffice to address contemporary, complex challenges that defy sectoral, national, or disciplinary boundaries. Furthermore, lessons learned from the past cannot be confidently used to predict outcomes or help guide future actions. The authors propose that the confluence of a number of technology and social disruptors presents a pivotal moment in history to enable real-time, accelerated and integrated action that can adequately support a ‘future earth’ through transformational solutions. Building on more than a decade of dialogues hosted by the International Society for Digital Earth (ISDE), and evolving a briefing note presented to delegates of Pivotal2015, the paper presents an emergent context for collectively addressing spatial information, sustainable development and good governance through three guiding principles for enabling prosperous living in the 21st century. These are: (1) open data, (2) real world context and (3) informed visualization for decision support. The paper synthesizes an interdisciplinary dialogue to create a credible and positive future vision of collaborative and transparent action for the betterment of humanity and the planet. It is intended that the three Pivotal Principles can be used as an elegant framework for action towards the Digital Earth vision, across local, regional, and international communities and organizations.

Relevance:

100.00%

Publisher:

Abstract:

Predicting temporal responses of ecosystems to disturbances associated with industrial activities is critical for their management and conservation. However, prediction of ecosystem responses is challenging due to the complexity and potential non-linearities stemming from interactions between system components and multiple environmental drivers. Prediction is particularly difficult for marine ecosystems due to their often highly variable and complex natures and the large uncertainties surrounding their dynamic responses. Consequently, current management of such systems often relies on expert judgement and/or complex quantitative models that consider only a subset of the relevant ecological processes. Hence there is an urgent need for whole-of-system predictive models to support decision and policy makers in managing complex marine systems in the context of industry-based disturbances. This paper presents Dynamic Bayesian Networks (DBNs) for predicting the temporal response of a marine ecosystem to anthropogenic disturbances. The DBN provides a visual representation of the problem domain in terms of factors (parts of the ecosystem) and their relationships. These relationships are quantified via Conditional Probability Tables (CPTs), which estimate the variability and uncertainty in the distribution of each factor. The combination of qualitative visual and quantitative elements in a DBN facilitates the integration of a wide array of data, published and expert knowledge, and other models. Such multiple sources are often essential, as a single source of information is rarely sufficient to cover the diverse range of factors relevant to a management task. Here, a DBN model is developed for tropical, annual Halophila and temperate, persistent Amphibolis seagrass meadows to inform dredging management and help meet environmental guidelines. Specifically, the impacts of capital (e.g. new port development) and maintenance (e.g. maintaining channel depths in established ports) dredging are evaluated with respect to the risk of permanent loss, defined as no recovery within 5 years (Environmental Protection Agency guidelines). The model is developed using expert knowledge, existing literature, statistical models of environmental light, and experimental data. The model is then demonstrated in a case study through the analysis of a variety of dredging, environmental and seagrass ecosystem recovery scenarios. In spatial zones significantly affected by dredging, such as the zone of moderate impact, shoot density has a very high probability of being driven to zero by capital dredging because of the duration of such dredging. Here, fast-growing Halophila species can recover; however, the probability of recovery depends on the presence of seed banks. In contrast, slow-growing Amphibolis meadows have a high probability of suffering permanent loss. In the maintenance dredging scenario, owing to the shorter duration of dredging, Amphibolis is better able to resist the impacts of dredging. For both types of seagrass meadows, the probability of loss was strongly dependent on the biological and ecological status of the meadow, as well as environmental conditions post-dredging. The ability to predict ecosystem responses under cumulative, non-linear interactions across a complex ecosystem highlights the utility of DBNs for decision support and environmental management.
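
A deliberately tiny sketch of the DBN idea: a single shoot-density node whose transition probabilities depend on whether dredging occurs in each time slice, propagated forward over a capital-dredging scenario. All probabilities and the schedule are invented and are not the values in the published model.

```python
# Two-state dynamic Bayesian network sketch: shoot density is a single
# discrete node ("present"/"lost") whose transition CPT depends on whether
# dredging occurs in that time slice. All probabilities are illustrative.
import numpy as np

STATES = ["present", "lost"]

# P(shoot density at t+1 | shoot density at t, dredging at t)
transition = {
    # (state_t, dredging): [P(present_t+1), P(lost_t+1)]
    ("present", True):  [0.55, 0.45],
    ("present", False): [0.95, 0.05],
    ("lost", True):     [0.02, 0.98],
    ("lost", False):    [0.15, 0.85],   # slow natural recovery
}

def forward(belief, dredging_schedule):
    """Propagate P(state) through time given a dredging on/off schedule."""
    beliefs = [belief]
    for dredging in dredging_schedule:
        new_belief = np.zeros(2)
        for i, state in enumerate(STATES):
            new_belief += belief[i] * np.array(transition[(state, dredging)])
        belief = new_belief
        beliefs.append(belief)
    return beliefs

# Capital dredging: long duration (8 slices on), then a recovery period.
schedule = [True] * 8 + [False] * 10
for t, b in enumerate(forward(np.array([1.0, 0.0]), schedule)):
    print(f"t={t:2d}  P(present)={b[0]:.2f}")
```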

Relevance:

100.00%

Publisher:

Abstract:

A comprehensive study was conducted on potential systems of integrated building utilities and transport power solutions that can simultaneously contain rising electricity, hot water and personal transport costs for apartment residents. The research developed the Commuter Energy and Building Utilities System (CEBUS) and quantified the economic, social and environmental benefits of incorporating such a system in future apartment developments. A decision support tool was produced to assist the exploration of the CEBUS design variants. A set of implementation guidelines for CEBUS was also developed for the property development industry.

Relevance:

100.00%

Publisher:

Abstract:

Acoustics is a rich source of environmental information that can reflect ecological dynamics. To deal with the escalating volume of acoustic data, a variety of automated classification techniques have been used for acoustic pattern or scene recognition, including urban soundscapes such as streets and restaurants, and natural soundscapes such as rain and thunder. It is common to classify acoustic patterns under the assumption that a single type of soundscape is present in an audio clip. This assumption is reasonable for some carefully selected audio. However, few experiments have focused on classifying simultaneous acoustic patterns in long-duration recordings. This paper proposes a binary relevance based multi-label classification approach to recognise simultaneous acoustic patterns in one-minute audio clips. By utilising acoustic indices as global features and a multilayer perceptron as the base classifier, we achieve good classification performance on in-the-field data. Compared with single-label classification, the multi-label classification approach provides more detailed information about the distribution of various acoustic patterns in long-duration recordings. These results will support further biodiversity investigations, such as bird species surveys.
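
Binary relevance trains one binary classifier per acoustic pattern; with scikit-learn this can be sketched by wrapping a multilayer perceptron in MultiOutputClassifier. The acoustic index values and labels below are invented placeholders.

```python
# Binary relevance with an MLP base classifier, as outlined above: one binary
# MLP is trained per acoustic pattern. Feature values (acoustic indices) and
# labels below are invented placeholders.
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.neural_network import MLPClassifier

# Rows: one-minute clips; columns: acoustic indices (illustrative values).
X = np.array([
    [0.61, 0.20, 0.15],
    [0.12, 0.75, 0.40],
    [0.58, 0.72, 0.35],
    [0.10, 0.18, 0.90],
])
# Columns: presence of [birds, rain, insects] in the same clip (multi-label).
Y = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
])

base = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model = MultiOutputClassifier(base).fit(X, Y)  # one MLP per label column

new_clip = np.array([[0.55, 0.70, 0.20]])
print(model.predict(new_clip))  # e.g. [[1 1 0]] -> birds and rain together
```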

Relevance:

100.00%

Publisher:

Abstract:

Economic valuation of ecosystem services is widely advocated as a useful decision-support tool for ecosystem management. However, the extent to which economic valuation of ecosystem services is actually used or considered useful in decision-making is poorly documented. This literature blind spot is explored with an application to coastal and marine ecosystem management in Australia. Based on a nation-wide survey of eighty-eight decision-makers representing a diversity of management organizations, the perceived usefulness and level of use of ecosystem services economic valuation in support of coastal and marine management are examined. A large majority of decision-makers are found to be familiar with economic valuation and consider it useful, even necessary, in decision-making, although this varies across decision-maker groups. However, most decision-makers never or rarely use it. The perceived level of importance of, and trust in, estimated dollar values differ across ecosystem services, and are especially high for values that relate to commercial activities. A number of factors are also found to influence respondents' use of economic valuation. These findings concur with conclusions from other existing work and are instructive for reflecting on the usefulness of ecosystem services valuation (ESV) in environmental management decision-making. They also confirm that the survey-based approach developed in this application represents a sound strategy for examining this issue at various scales and management levels.

Relevance:

100.00%

Publisher:

Abstract:

Modern database systems incorporate a query optimizer to identify the most efficient "query execution plan" for executing the declarative SQL queries submitted by users. A dynamic-programming-based approach is used to exhaustively enumerate the combinatorially large search space of plan alternatives and, using a cost model, to identify the optimal choice. While dynamic programming (DP) works very well for moderately complex queries with up to around a dozen base relations, it usually fails to scale beyond this stage due to its inherent exponential space and time complexity. Therefore, DP becomes practically infeasible for complex queries with a large number of base relations, such as those found in current decision-support and enterprise management applications. To address the above problem, a variety of approaches have been proposed in the literature. Some completely jettison the DP approach and resort to alternative techniques such as randomized algorithms, whereas others have retained DP by using heuristics to prune the search space to computationally manageable levels. In the latter class, a well-known strategy is "iterative dynamic programming" (IDP) wherein DP is employed bottom-up until it hits its feasibility limit, and then iteratively restarted with a significantly reduced subset of the execution plans currently under consideration. The experimental evaluation of IDP indicated that by appropriate choice of algorithmic parameters, it was possible to almost always obtain "good" (within a factor of twice of the optimal) plans, and in the few remaining cases, mostly "acceptable" (within an order of magnitude of the optimal) plans, and rarely, a "bad" plan. While IDP is certainly an innovative and powerful approach, we have found that there are a variety of common query frameworks wherein it can fail to consistently produce good plans, let alone the optimal choice. This is especially so when star or clique components are present, increasing the complexity of the join graphs. Worse, this shortcoming is exacerbated when the number of relations participating in the query is scaled upwards.
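
A compact sketch of the bottom-up DP enumeration discussed above: every subset of relations retains its cheapest plan, which is why space and time grow exponentially with the number of base relations. The cost model (estimated output cardinality with a fixed join selectivity) is deliberately crude and not that of any particular optimizer.

```python
# Bottom-up dynamic programming over join orders: every subset of relations
# keeps its cheapest plan. The cost model (output cardinality as cost, one
# fixed selectivity per join) is a deliberately crude placeholder.
from itertools import combinations

relations = {"R": 1000, "S": 500, "T": 2000, "U": 100}   # base cardinalities
SELECTIVITY = 0.01                                       # every join pair

def card(subset):
    size = 1.0
    for r in subset:
        size *= relations[r]
    # apply one selectivity factor per join performed
    return size * SELECTIVITY ** (len(subset) - 1)

best = {frozenset([r]): (0.0, r) for r in relations}      # subset -> (cost, plan)

for k in range(2, len(relations) + 1):
    for subset in map(frozenset, combinations(relations, k)):
        for left_size in range(1, k):
            for left in map(frozenset, combinations(subset, left_size)):
                right = subset - left
                if left not in best or right not in best:
                    continue
                cost = best[left][0] + best[right][0] + card(subset)
                if subset not in best or cost < best[subset][0]:
                    plan = f"({best[left][1]} JOIN {best[right][1]})"
                    best[subset] = (cost, plan)

total_cost, total_plan = best[frozenset(relations)]
print(f"best plan: {total_plan}  estimated cost: {total_cost:.0f}")
```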