200 results for data warehouse tuning aggregato business intelligence performance


Relevance: 40.00%

Abstract:

The purpose of this paper is to identify and empirically examine the key features, purposes, uses, and benefits of performance dashboards. We find that only about a quarter of the sales managers surveyed in Finland used a dashboard, which was lower than previously reported. Dashboards were used for four distinct purposes: (i) monitoring, (ii) problem solving, (iii) rationalizing, and (iv) communication and consistency. There was a high correlation between the different uses of dashboards and user productivity, indicating that dashboards were perceived as effective tools in performance management, not just for monitoring one's own performance but also for other purposes, including communication. The quality of the data in dashboards did not seem to be a concern (except for completeness), but it was a critical driver of dashboard use. This is the first empirical study on performance dashboards in terms of adoption rates, key features, and benefits. The study highlights the research potential and benefits of dashboards, which could be valuable for future researchers and practitioners.

Relevance: 40.00%

Abstract:

The emergence of semantic technologies to deal with the underlying meaning of things, instead of a purely syntactical representation, has led to new developments in various fields, including business process modeling. Inspired by artificial intelligence research, technologies for semantic Web services have been proposed and extended to process modeling. However, the applicability of semantic Web services for semantic business processes is limited because business processes encompass wider business requirements than Web services. In particular, processes are concerned with the composition of tasks, that is, the order in which activities are carried out, regardless of their implementation details; the resources assigned to carry out tasks, such as machinery, people, and goods; data exchange; and security and compliance concerns.

Relevance: 40.00%

Abstract:

Information Technology (IT) is an important resource that facilitates growth and development in both developed and emerging economies. The increasing forces of globalization are creating a wider digital divide between developed and emerging economies, and the smaller emerging economies are the most vulnerable. Intense competition for IT resources means that these emerging economies need to acquire a deeper understanding of how to source and evaluate their IT-related efforts, which would put them in a better position to source funding from various stakeholders. This research presents a complementary approach to securing better IT-related business value in organizations in the South Pacific Island countries, a case of emerging economies. Analysis of data collected from six South Pacific Island countries suggests that organizations that invest in IT and related complementarities are able to improve their business processes. The data also suggest that improved business processes lead to overall business improvements.

Relevance: 40.00%

Abstract:

This year marks the completion of data collection for year three (Wave 3) of the CAUSEE study. This report uses data from the first three years and focuses on the process of learning and adaptation in the business creation process. Most start-ups need to change their business model, their product, their marketing plan, their market or something else about the business to be successful. PayPal changed its product at least five times, moving from handheld security, to enterprise apps, to consumer apps, to a digital wallet, to payments between handhelds, before finally stumbling on the email-based payments model that made it a multi-billion dollar company. PayPal is not alone, and anecdotes abound of start-ups changing direction: Symantec started as an artificial intelligence company, Apple started out selling plans to build computers, and Microsoft tried to peddle compilers before licensing an operating system out of New Mexico. To what extent do Australian new ventures change and adapt as their ideas and businesses develop? As a longitudinal study, CAUSEE was designed specifically to observe development in the venture creation process. In this research briefing paper, we compare development over time of randomly sampled Nascent Firms (NF) and Young Firms (YF), concentrating on the surviving cases. We also compare NFs with YFs at each yearly interval. The 'high potential' oversample is not used in this report.

Relevance: 40.00%

Abstract:

Most approaches to business process compliance are restricted to the analysis of the structure of processes. It has been argued that full regulatory compliance requires information not only on the structure of processes but also on what the tasks in a process do. To this end, Governatori and Sadiq [2007] proposed to extend business processes with semantic annotations. We propose a methodology to automatically extract one kind of such annotations, in particular the annotations related to the data schema and templates linked to the various tasks in a business process.

Relevance: 40.00%

Abstract:

Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), both classes of process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design time what knowledge is available at run time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches.
In some respects there is a lack of maturity when it comes to capturing the semantic aspects involved and reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM) and Service Oriented Computing (SOC), can be combined with the aim of improving the modeling and enactment phases of a knowledge-intensive process. The workshop was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. It was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Università di Roma, with financial support of the University, through grant 2010-C26A107CN9 TESTMED, and of the EU Commission, through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks, one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lespérance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization", completed the scientific program.
We would like to thank all the Program Committee members for the valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.

Relevance: 40.00%

Abstract:

Gazelles, or very rapidly growing firms, are important because they contribute disproportionately to economic growth. There is a concern that some of these firms pursue growth too aggressively, resulting in lower subsequent performance. We investigate the relationship between growth and subsequent profitability for gazelle firms, and how this is moderated by firm strategy. Previous empirical research regarding the growth-profitability relationship for firms in general is rather inconclusive, with only one study specifically investigating gazelle firms. Likewise, there are theoretical arguments both for and against growth leading to profitability that equally apply to gazelle firms. Further, while contingency theory might suggest the relationship depends on the firm's strategy, earlier studies have not investigated this relationship. We address these questions using longitudinal data (seven years) for a sample of 964 Danish gazelle firms. Our study finds a clear positive relationship between growth and subsequent profitability among gazelle firms. Moreover, this relationship is stronger for firms pursuing a broad market strategy rather than a focus or niche strategy. An important managerial implication is that the growth strategy should be clearly integrated with the general strategic orientation of the firm.

Relevance: 40.00%

Abstract:

The relationships between business planning and performance have divided the entrepreneurship research community for decades (Brinckmann et al., 2010). On one side of this debate is the assumption that business plans may lock the firm into a specific direction early on, impede the firm's ability to adapt to changing market conditions (Dencker et al., 2009) and, eventually, cause escalation of commitment by introducing rigidity (Vesper, 1993). Conversely, feedback received from the production and presentation of business plans may also lead the firm to take corrective actions. However, the mechanisms underlying the relationships between changes in business ideas, business plans and the performance of nascent firms are still largely unknown. While too many business idea changes may confuse stakeholders, exhaust the firm's resources and hinder the ongoing legitimization process, some flexibility during the early stages of the venture may be beneficial in coping with the uncertainties surrounding new venture creation (Knight, 1921; March, 1982; Stinchcombe, 1965; Weick, 1979). Previous research has emphasized adaptability and flexibility as key success factors, through effectual logic and interaction with the market (Sarasvathy, 2001; 2007) or improvisation and trial-and-error (Miner et al., 2001). However, those studies did not specifically investigate the role of business planning. Our objective is to reconcile those seemingly opposing views (flexibility versus rigidity) by undertaking a more fine-grained analysis of the relationships between business planning and changes in business ideas on a large longitudinal sample of nascent firms.

Relevance: 40.00%

Abstract:

Purpose: There is a lack of theory relating to destination brand performance measurement in the destination branding literature, which emerged in the late 1990s (see, for example, Dosen, Vransevic, & Prebezac, 1998). Additionally, there is a lack of research about the importance of travel context in consumers' destination decision making (Hu & Ritchie, 1993). This study develops a structural model to measure destination brand performance across different travel situations. The theory of planned behaviour (TPB) was utilised as a framework to underpin the consumer-based brand equity (CBBE) hierarchy to develop a model of destination brand performance. Research approach: A proposed model of destination brand performance was developed through a review of the literature. The first study was used to identify destination image attributes (the core construct) using an analysis of the literature, a document analysis, and personal interviews using the Repertory Test qualitative technique. Underpinned by Personal Construct Theory (PCT), the Repertory Test enables the elicitation of attributes consumers use to evaluate destinations when considering travel. Data were examined in the first study to (i) identify any attribute differences across travel contexts and (ii) create a scale for use in a questionnaire. A second study was conducted to test the proposed model using a questionnaire with eight groups of participants to assess four destinations across two travel contexts. The model was tested utilising structural equation modelling. Findings: The first study resulted in a list of 29 destination image attributes for use in a scale index. Attributes were assessed across travel contexts and few differences were identified. The second study assessed the congruence of destination brand identity (the destination marketing organisation's desired image) and destination brand image (the actual perceptions held by consumers) using importance-performance analyses.
Finally, the proposed model of destination brand performance was tested. Overall, the data supported the model of destination brand performance across travel contexts and destinations. Additionally, this was compared to consumers' decision sets, further supporting the model. Value: This research contributes to the destination marketing literature through the development of a measure of destination brand performance underpinned by TPB. Practically, it will provide destination marketing organisations with a tool to track destination brand performance, relative to key competing places, over time. This is important given that the development of a destination brand is a long-term endeavour.

Relevance: 40.00%

Abstract:

Having a reliable understanding of the behaviours, problems, and performance of existing processes is important in enabling a targeted process improvement initiative. Recently, there has been an increase in the application of innovative process mining techniques to facilitate evidence-based understanding of organizations' business processes. Nevertheless, the application of these techniques in the domain of finance in Australia is, at best, scarce. This paper details a 6-month case study on the application of process mining in one of the largest insurance companies in Australia. In particular, the challenges encountered, the lessons learned, and the results obtained from this case study are detailed. Through this case study, we not only validated existing 'lessons learned' from other similar case studies, but also added new insights that can be beneficial to other practitioners applying process mining in their respective fields.

Relevance: 40.00%

Abstract:

This paper proposes a technique that supports process participants in making risk-informed decisions, with the aim of reducing process risks. Risk reduction involves decreasing the likelihood of a process fault occurring and the severity of its effects. Given a process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we prompt the participant with the expected risk that a given fault will occur given that particular input. These risks are predicted by traversing decision trees generated from the logs of past process executions, considering process data, involved resources, task durations and contextual information such as task frequencies. The approach has been implemented in the YAWL system and its effectiveness evaluated. The results show that process instances executed in the tests complete with substantially fewer faults and with lower fault severities when taking into account the recommendations provided by our technique.
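The prediction step this abstract describes (learning a decision tree from past execution logs, then scoring candidate inputs by expected fault risk) can be sketched with an off-the-shelf learner. The log features, fault labels and candidate inputs below are invented for illustration; they are not the actual YAWL implementation:

```python
# Minimal sketch: predict fault likelihood from historical process logs
# with a decision tree, then rank the inputs a participant could choose.
# Feature columns and labels are hypothetical, not from the paper's data.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# columns: [task_duration_h, resource_id, task_frequency]; label: fault occurred?
past_executions = np.array([
    [1, 0, 5], [2, 0, 5], [8, 1, 1], [9, 1, 1],
    [1, 1, 4], [2, 0, 6], [7, 0, 2], [8, 1, 2],
])
faults = np.array([0, 0, 1, 1, 0, 0, 1, 1])

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(past_executions, faults)

# At a decision point, score each candidate input and surface the
# expected fault probability to the participant.
candidates = np.array([[2, 0, 5], [8, 1, 1]])
risk = model.predict_proba(candidates)[:, 1]   # P(fault) per candidate
best = candidates[int(np.argmin(risk))]        # lowest-risk choice
```

In this toy log, short-duration executions never faulted, so the tree assigns the first candidate a lower risk than the second.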

Relevance: 40.00%

Abstract:

This paper presents an input-orientated data envelopment analysis (DEA) framework which allows the measurement and decomposition of economic, environmental and ecological efficiency levels in agricultural production across different countries. Economic, environmental and ecological optimisations search for optimal input combinations that minimise total costs, total amount of nutrients, and total amount of cumulative exergy contained in inputs respectively. The application of the framework to an agricultural dataset of 30 OECD countries revealed that (i) there was significant scope to make their agricultural production systems more environmentally and ecologically sustainable; (ii) the improvement in the environmental and ecological sustainability could be achieved by being more technically efficient and, even more significantly, by changing the input combinations; (iii) the rankings of sustainability varied significantly across OECD countries within frontier-based environmental and ecological efficiency measures and between frontier-based measures and indicators.
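The efficiency measure at the heart of such a framework is a linear program solved per decision-making unit (country). Below is a sketch of the standard input-oriented CCR model (not the paper's full economic/environmental/ecological decomposition), using scipy and a two-unit toy dataset invented for illustration:

```python
# Input-oriented CCR DEA: for DMU o, find the smallest theta such that a
# convex combination of all DMUs uses at most theta * x_o inputs while
# producing at least y_o outputs. Toy data only; not the OECD dataset.
import numpy as np
from scipy.optimize import linprog

def input_dea_efficiency(X, Y, o):
    """X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta for DMU o."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                    # minimise theta
    # inputs:  X^T lambda - theta * x_o <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # outputs: -Y^T lambda <= -y_o   (i.e. Y^T lambda >= y_o)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

X = np.array([[2.0], [4.0]])   # one input per DMU
Y = np.array([[1.0], [1.0]])   # one output per DMU
eff = [input_dea_efficiency(X, Y, o) for o in range(2)]
```

DMU 0 produces the same output with half the input, so it is efficient (theta = 1.0) while DMU 1 scores 0.5; swapping the cost vector for nutrient or exergy contents of inputs gives the environmental and ecological variants the abstract describes.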

Relevance: 40.00%

Abstract:

This paper proposes a concrete approach for the automatic mitigation of risks that are detected during process enactment. Given a process model exposed to risks, e.g. a financial process exposed to the risk of approval fraud, we enact this process and as soon as the likelihood of the associated risk(s) is no longer tolerable, we generate a set of possible mitigation actions to reduce the risks' likelihood, ideally annulling the risks altogether. A mitigation action is a sequence of controlled changes applied to the running process instance, taking into account a snapshot of the process resources and data, and the current status of the system in which the process is executed. These actions are proposed as recommendations to help process administrators mitigate process-related risks as soon as they arise. The approach has been implemented in the YAWL environment and its performance evaluated. The results show that it is possible to mitigate process-related risks within a few minutes.
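The recommendation loop this abstract outlines (detect intolerable risk, enumerate candidate changes to the running instance, rank them by residual risk) can be sketched as follows. The state fields, actions and risk estimator are hypothetical illustrations, not the YAWL implementation:

```python
# Minimal sketch of risk mitigation as ranking of candidate actions.
# All names below (approver, amount, the toy risk rules) are invented.

def estimate_risk(state):
    """Toy fault-likelihood estimate from a snapshot of instance state."""
    risk = 0.0
    if state["approver"] == state["requester"]:
        risk += 0.6                      # same person approves own request
    if state["amount"] > 10_000:
        risk += 0.3                      # large payment
    return min(risk, 1.0)

def apply_action(state, action):
    """Simulate a mitigation action on a copy of the instance state."""
    new_state = dict(state)
    new_state.update(action["changes"])
    return new_state

def recommend(state, actions, tolerance=0.5):
    """When risk exceeds the tolerance, rank actions by residual risk."""
    if estimate_risk(state) <= tolerance:
        return []
    return sorted(actions, key=lambda a: estimate_risk(apply_action(state, a)))

instance = {"approver": "alice", "requester": "alice", "amount": 20_000}
actions = [
    {"name": "reassign approval", "changes": {"approver": "bob"}},
    {"name": "split payment", "changes": {"amount": 8_000}},
]
recommendations = recommend(instance, actions)
```

Here the instance's risk (0.9) exceeds the tolerance, and reassigning the approval removes the larger risk contribution, so it is ranked first; the real approach additionally checks that each change sequence is a controlled, valid modification of the running instance.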

Relevance: 40.00%

Abstract:

Open the sports or business section of your daily newspaper, and you are immediately bombarded with an array of graphs, tables, diagrams, and statistical reports that require interpretation. Across all walks of life, the need to understand statistics is fundamental. Given that our youngsters' future world will be increasingly data laden, scaffolding their statistical understanding and reasoning is imperative, from the early grades on. The National Council of Teachers of Mathematics (NCTM) continues to emphasize the importance of early statistical learning; data analysis and probability was the Council's professional development "Focus of the Year" for 2007–2008. We need such a focus, especially given the results of the statistics items from the 2003 NAEP. As Shaughnessy (2007) noted, students' performance was weak on more complex items involving interpretation or application of information in graphs and tables. Furthermore, little or no gains were made between the 2000 NAEP and the 2003 NAEP studies. One approach I have taken to promote young children's statistical reasoning is through data modeling. Having implemented in grades 3–9 a number of model-eliciting activities involving working with data (e.g., English 2010), I observed how competently children could create their own mathematical ideas and representations before being instructed how to do so. I thus wished to introduce data-modeling activities to younger children, confident that they would likewise generate their own mathematics. I recently implemented data-modeling activities in a cohort of three first-grade classrooms of six-year-olds. I report on some of the children's responses and discuss the components of data modeling the children engaged in.