27 results for business-intelligence-system
Abstract:
We consider a multi-market framework in which a set of firms compete on two oligopolistic markets. Each firm's production cost allows for spillovers across markets, so that output decisions for the two markets have to be made jointly. Prior to competing in these markets, firms can establish links to gather business intelligence about other firms. A link formed by a firm generates two types of externalities, for competitors and for consumers. We characterize the business intelligence equilibrium networks as well as the networks that maximize social welfare. In contrast with single-market competition, we show that under multi-market competition there exist situations where intelligence-gathering activities are underdeveloped from a social-welfare standpoint and should be tolerated, if not encouraged, by public authorities.
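To make the joint-output point concrete, here is one simple cost specification consistent with the description above; the functional form and the symbol γ are illustrative assumptions, not the paper's stated model:

```latex
% Illustrative cost with cross-market spillovers (assumed functional form).
% Firm i supplies q_i^A and q_i^B on markets A and B:
\[
  C_i\bigl(q_i^A, q_i^B\bigr) \;=\; c\,\bigl(q_i^A + q_i^B\bigr) \;-\; \gamma\, q_i^A q_i^B,
  \qquad \gamma > 0,
\]
\[
  \frac{\partial^2 C_i}{\partial q_i^A \, \partial q_i^B} \;=\; -\gamma \;<\; 0.
\]
```

The negative cross-partial means that producing more on one market lowers marginal cost on the other, which is exactly why the two quantity decisions cannot be made independently.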
Abstract:
Key Performance Indicators (KPIs) and their predictions are widely used by enterprises for informed decision making. Nevertheless, a very important factor, which is generally overlooked, is that top-level strategic KPIs are actually driven by operational-level business processes. These two domains are, however, mostly segregated and analysed in silos with different Business Intelligence solutions. In this paper, we propose an approach for advanced Business Simulations, which converges the two domains by utilising process execution & business data, and concepts from Business Dynamics (BD) and Business Ontologies, to promote better system understanding and detailed KPI predictions. Our approach incorporates the automated creation of Causal Loop Diagrams, thus empowering the analyst to critically examine the complex dependencies hidden in the massive amounts of available enterprise data. We further evaluated the proposed approach in the context of a retail use case that involved verification of the automatically generated causal models by a domain expert.
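As a rough illustration of automated Causal Loop Diagram creation, the sketch below derives a signed dependency graph from aligned KPI time series using plain correlations; all names and the threshold are assumptions, and the paper's BD- and ontology-based machinery is substantially richer:

```python
import numpy as np

def build_causal_loop_diagram(series: dict[str, np.ndarray], threshold: float = 0.5):
    """Derive a signed dependency graph (a crude stand-in for a Causal Loop
    Diagram) from aligned KPI/process time series. Edges carry '+' when two
    measures move together and '-' when they move in opposition; pairs with
    |correlation| below `threshold` are ignored."""
    names = list(series)
    edges = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = np.corrcoef(series[a], series[b])[0, 1]
            if abs(r) >= threshold:
                edges.append((a, b, '+' if r > 0 else '-'))
    return edges

# Toy retail-style example: on-time delivery drives satisfaction,
# and stockouts move against delivery performance.
rng = np.random.default_rng(0)
delivery = rng.normal(0.9, 0.05, 52)
satisfaction = delivery * 0.8 + rng.normal(0, 0.02, 52)
stockouts = -delivery + rng.normal(0, 0.05, 52)
print(build_causal_loop_diagram(
    {"on_time_delivery": delivery,
     "customer_satisfaction": satisfaction,
     "stockouts": stockouts}))
```

In the paper's setting such a graph would then be reviewed by a domain expert, as the retail evaluation describes.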
Abstract:
The Balanced Scorecard of Kaplan and Norton is a management tool that supports the successful implementation of corporate strategies. It has been discussed and considered widely in both practice and research. By linking operational and non-financial corporate activities through causal chains to the firm's long-term strategy, the Balanced Scorecard supports the alignment and management of all corporate activities according to their strategic relevance. It makes it possible to take into account non-monetary strategic success factors that significantly impact the economic success of a business, and it is thus a promising starting point from which to also incorporate environmental and social aspects into the main management system of a firm. Sustainability management with the Balanced Scorecard helps to overcome the shortcomings of conventional approaches to environmental and social management systems by integrating the three pillars of sustainability into a single, overarching strategic management tool. After a brief discussion of the different possible forms of a Sustainability Balanced Scorecard, the article takes a closer look at the process and steps of formulating a Sustainability Balanced Scorecard for a business unit. Before doing so, the basic conventional approach of the Balanced Scorecard and its suitability for sustainability management are outlined in brief.
Abstract:
A novel methodology is proposed for the development of neural network models for complex engineering systems exhibiting nonlinearity. The method performs neural network modeling by first establishing some fundamental nonlinear functions from a priori engineering knowledge, which are then constructed and coded into appropriate chromosome representations. Given a suitable fitness function, and using evolutionary approaches such as genetic algorithms, a population of chromosomes evolves over a certain number of generations to finally produce the neural network model that best fits the system data. The objective is to improve the transparency of the neural networks, i.e. to produce physically meaningful models.
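A compact sketch of the evolutionary scheme just described, under our own simplifying assumptions: binary chromosomes select among hypothetical basis functions, weights come from least squares, and the GA is mutation-only (the paper's encoding and operators will differ):

```python
import numpy as np

# Candidate nonlinear basis functions standing in for "a priori
# engineering knowledge" (hypothetical choices for illustration).
BASES = [np.sin, np.tanh, np.square, np.abs, lambda x: np.exp(-x ** 2)]

def fitness(chrom, x, y):
    """Least-squares fit of the bases a chromosome switches on; higher is better."""
    cols = [f(x) for f, gene in zip(BASES, chrom) if gene]
    A = np.column_stack(cols + [np.ones_like(x)])  # always include a bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return -np.mean((A @ w - y) ** 2)

def evolve(x, y, pop_size=30, gens=40, seed=1):
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, len(BASES)))
    for _ in range(gens):
        scores = np.array([fitness(c, x, y) for c in pop])
        parents = pop[np.argsort(scores)][pop_size // 2:]        # keep the best half
        kids = parents[rng.integers(0, len(parents), pop_size)]  # resample parents
        flips = rng.random(kids.shape) < 0.1                     # bit-flip mutation
        pop = np.where(flips, 1 - kids, kids)
    return max(pop, key=lambda c: fitness(c, x, y))

x = np.linspace(-2.0, 2.0, 200)
y = np.sin(x) + 0.3 * x ** 2    # system data to be modelled
print(evolve(x, y))             # genes flag which bases the evolved model uses
```

Because each gene maps to a named engineering function, the winning chromosome is directly readable, which is the transparency argument the abstract makes.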
Abstract:
PEGS (Production and Environmental Generic Scheduler) is a generic production scheduler that produces good schedules over a wide range of problems. It is centralised, using search strategies with the Shifting Bottleneck algorithm. We have also developed an alternative distributed approach using software agents. In some cases this reduces run times by a factor of 10 or more. In most cases, the agent-based program also produces good solutions for published benchmark data, and the short run times make our program useful for a large range of problems. Test results show that the agents can produce schedules comparable to the best found so far for some benchmark datasets and actually better schedules than PEGS on our own random datasets. The flexibility that agents can provide for today's dynamic scheduling is also appealing. We suggest that in this sort of generic or commercial system, the agent-based approach is a good alternative.
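The following toy dispatcher gestures at the agent-based alternative described above; it is a simplification (a single greedy rule, no negotiation between agents) and not the paper's actual agent design:

```python
# Jobs are sequences of (machine, duration) operations. At each step a
# "machine agent" starts the released operation of the job with the most
# remaining work -- one classic dispatching rule among many.

def agent_schedule(jobs):
    remaining = {j: list(ops) for j, ops in jobs.items()}
    job_ready = {j: 0 for j in jobs}   # earliest start of each job's next op
    machine_free = {}                  # earliest idle time of each machine
    schedule = []
    while any(remaining.values()):
        # Pick the pending job with the largest remaining processing time.
        j = max((j for j in remaining if remaining[j]),
                key=lambda j: sum(d for _, d in remaining[j]))
        m, dur = remaining[j].pop(0)
        start = max(job_ready[j], machine_free.get(m, 0))
        schedule.append((j, m, start, start + dur))
        job_ready[j] = machine_free[m] = start + dur
    return schedule

jobs = {"J1": [("M1", 3), ("M2", 2)], "J2": [("M2", 2), ("M1", 4)]}
for row in agent_schedule(jobs):
    print(row)  # (job, machine, start, end)
```

Local rules like this are cheap to evaluate, which is consistent with the large run-time reductions the abstract reports relative to centralised search.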
Abstract:
Exam timetabling is one of the most important administrative activities that take place in academic institutions. In this paper we present a critical discussion of the research on exam timetabling in the last decade or so. The last ten years have seen an increased level of attention on this important topic, with a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that have been generated by this body of work.
We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by the goal of raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss some issues concerning decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community in the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a re-naming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.
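As a pointer to what the early "graph heuristics" family looks like in practice, here is a minimal largest-degree-first colouring sketch, with invented exam names (the survey's methods range far beyond this):

```python
# Exams are vertices, shared students create conflict edges, and a
# largest-degree-first greedy colouring assigns time slots.

def greedy_timetable(conflicts: dict[str, set[str]]) -> dict[str, int]:
    order = sorted(conflicts, key=lambda e: -len(conflicts[e]))  # hardest first
    slot = {}
    for exam in order:
        used = {slot[n] for n in conflicts[exam] if n in slot}
        slot[exam] = next(s for s in range(len(conflicts)) if s not in used)
    return slot

conflicts = {
    "maths":     {"physics", "chemistry"},
    "physics":   {"maths", "biology"},
    "chemistry": {"maths"},
    "biology":   {"physics"},
}
print(greedy_timetable(conflicts))  # e.g. {'maths': 0, 'physics': 1, ...}
```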
Abstract:
Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimates of the costs of possible queries so that the one with the minimum cost can be selected. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of both network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using the multiple regression process. When a new query is submitted, its system contention state is first estimated using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas, and the estimate is further adjusted to improve its accuracy. Our experiments show that our methods can produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
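The sketch below caricatures that pipeline: cluster the observed costs of a sample query into contention states, fit a per-state regression cost formula, then estimate a new query under its current state. All numbers, the single feature, and the hand-rolled one-dimensional k-means are assumptions for illustration:

```python
import numpy as np

def kmeans_1d(costs, k=3, iters=50):
    """Tiny 1-D k-means: cluster observed costs into k contention states."""
    centers = np.quantile(costs, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(costs[:, None] - centers[None, :]), axis=1)
        centers = np.array([costs[labels == i].mean() for i in range(k)])
    return centers, labels

rng = np.random.default_rng(0)
# Observed costs of one sample query under light/medium/heavy contention.
costs = np.concatenate([rng.normal(1.0, 0.1, 40),
                        rng.normal(2.5, 0.2, 40),
                        rng.normal(5.0, 0.4, 40)])
centers, labels = kmeans_1d(costs)

# Per-state regression: cost ~ a * result_size + b (toy unary-query model).
sizes = rng.uniform(1, 10, costs.size)
observed = costs + 0.3 * sizes                  # pretend size inflates cost
models = {}
for s in range(len(centers)):
    mask = labels == s
    A = np.column_stack([sizes[mask], np.ones(mask.sum())])
    models[s], *_ = np.linalg.lstsq(A, observed[mask], rcond=None)

state = int(np.argmin(np.abs(4.8 - centers)))   # classify current contention
a, b = models[state]
print(f"estimated cost for size 6 in state {state}: {a * 6 + b:.2f}")
```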
Abstract:
We present a practical approach to Natural Language Generation (NLG) for spoken dialogue systems. The approach is based on small template fragments (mini-templates). The system's object architecture facilitates generation of phrases across pre-defined business domains and registers, as well as into different languages. The architecture simplifies NLG in well-understood application contexts, while providing the flexibility, for both the developer and the system, to vary linguistic output according to dialogue context, including any intended affective impact. Mini-templates are used with a suite of domain term objects, resulting in an NLG system (MINTGEN – MINi-Template GENerator) whose extensibility and ease of maintenance are enhanced by the sparsity of information devoted to individual domains. The system also avoids the need for specialist linguistic competence on the part of the system maintainer.
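A toy rendering of the mini-template idea, with invented domain, register, and slot names (MINTGEN's actual object architecture is not reproduced here):

```python
# Small template fragments are selected per (domain, register), with
# domain term objects supplying default lexical fillers.

TEMPLATES = {
    ("banking", "formal"):   "Your {account} balance is {amount}.",
    ("banking", "friendly"): "You've got {amount} in your {account}!",
}

DOMAIN_TERMS = {"banking": {"account": "current account"}}

def generate(domain: str, register: str, **slots) -> str:
    terms = dict(DOMAIN_TERMS.get(domain, {}))  # domain defaults...
    terms.update(slots)                         # ...overridden by call-time slots
    return TEMPLATES[(domain, register)].format(**terms)

print(generate("banking", "formal", amount="€120.50"))
print(generate("banking", "friendly", amount="€120.50"))
```

Varying only the register key changes tone without touching domain data, which mirrors the abstract's point about varying output by dialogue context and affect.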
Abstract:
This study concerns the spatial allocation of material flows, with emphasis on construction material in the Irish housing sector. It addresses some of the key issues concerning anthropogenic impact on the environment through spatio-temporal visualisation of the flows of materials, wastes and emissions at different spatial levels. This is presented in the form of a spatial model, Spatial Allocation of Material Flow Analysis (SAMFA), which enables the simulation of construction material flows and associated energy use. SAMFA parallels the Island Limits project (EPA funded under 2004-SD-MS-22-M2), which aimed to create a material flow analysis of the Irish economy classified by industrial sector. SAMFA develops this further by attempting to establish the material flows at the subnational geographical scale that could be used in the development of local authority (LA) sustainability strategies and spatial planning frameworks, by highlighting the cumulative environmental impacts of the development of the built environment. By drawing on the idea of planning support systems, SAMFA also aims to provide a cross-disciplinary, integrative medium for involving stakeholders in strategies for a sustainable built environment and, as such, would help illustrate the sustainability consequences of alternative development scenarios.

The pilot run of the model in Kildare has shown that the model can be successfully calibrated and applied to develop alternative material-flow and energy-use scenarios at the electoral division (ED) level. This has been demonstrated through the development of an integrated and a business-as-usual scenario, with the former integrating a range of potential material-efficiency and energy-saving policy options and the latter replicating conditions that best describe the current trend. Their comparison shows that the integrated scenario outperforms business-as-usual in terms of both material and energy use. This report also identifies a number of potential areas of future research and broader application. These include improving the accuracy of the SAMFA model (e.g. by establishing the actual life expectancy of buildings in the Irish context through field surveys) and extending the model to other Irish counties. This would establish SAMFA as a valuable prediction and monitoring tool capable of integrating national and local spatial planning objectives with actual environmental impacts. Furthermore, should the model prove successful at this level, the modelling approach could be transferred to other areas of the built environment, such as commercial development and other key contributors to greenhouse gas emissions. The ultimate aim is to develop a meta-model for predicting the consequences of consumption patterns at the local scale, thereby offering the possibility of creating critical links between socio-technical systems and the most important challenge of all: the limits of the biophysical environment.
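In the spirit of the scenario comparison above, here is a toy calculation; every figure, factor, and ED name is invented for illustration and does not come from the report:

```python
# Compare construction material and energy use across electoral divisions
# (EDs) under business-as-usual vs an integrated efficiency scenario.

HOUSES_BUILT = {"ED_A": 420, "ED_B": 180}   # hypothetical dwelling counts
MATERIAL_PER_HOUSE_T = 95.0                 # tonnes per dwelling, assumed
ENERGY_PER_HOUSE_GJ = 310.0                 # gigajoules per dwelling, assumed

SCENARIOS = {
    "business_as_usual": {"material": 1.00, "energy": 1.00},
    "integrated":        {"material": 0.85, "energy": 0.80},  # assumed savings
}

total_houses = sum(HOUSES_BUILT.values())
for name, f in SCENARIOS.items():
    material = total_houses * MATERIAL_PER_HOUSE_T * f["material"]
    energy = total_houses * ENERGY_PER_HOUSE_GJ * f["energy"]
    print(f"{name}: {material:,.0f} t material, {energy:,.0f} GJ energy")
```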