699 results for: data warehouse tuning aggregato business intelligence performance


Relevance: 100.00%

Abstract:

The rapid rise of social media on the Internet is profoundly changing analytical Customer Relationship Management (CRM). Against this backdrop, the purpose of this paper is to advance the conceptual design of Business Intelligence (BI) systems with data identified from social networks. We develop an integrated social network data model based on an in-depth analysis of Facebook. The data model can inform the design of data warehouses and thereby offer new opportunities for CRM analyses, leading to a more consistent and richer picture of customers' characteristics, needs, wants, and demands. Four major contributions are offered. First, Social CRM and Social BI are introduced as emerging fields of research. Second, we develop a conceptual data model to identify and systematize the data available on online social networks. Third, based on the identified data, we design a multidimensional data model as an early contribution to the conceptual design of Social BI systems and demonstrate its application by developing management reports in a retail scenario. Fourth, intellectual challenges for advancing Social CRM and Social BI are discussed.
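
The multidimensional model proposed in the paper is specific to the Facebook-derived data and is not reproduced here. As a rough illustration of what a Social BI star schema of this kind could look like, the sketch below defines hypothetical fact and dimension tables with sqlite3 and runs one aggregate query of the sort a retail management report might use. All table and column names (fact_interaction, dim_customer, dim_post, dim_date) are assumptions, not the authors' model.

```python
import sqlite3

# Illustrative star schema for Social BI/CRM analyses; names are assumptions,
# not the data model proposed in the paper.
ddl = """
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY,
                           name TEXT, gender TEXT, age_band TEXT, location TEXT);
CREATE TABLE dim_post     (post_id INTEGER PRIMARY KEY,
                           channel TEXT, topic TEXT, campaign TEXT);
CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY,
                           day TEXT, month TEXT, year INTEGER);
CREATE TABLE fact_interaction (customer_id INTEGER REFERENCES dim_customer,
                               post_id     INTEGER REFERENCES dim_post,
                               date_id     INTEGER REFERENCES dim_date,
                               likes INTEGER, comments INTEGER, shares INTEGER);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# A management report in a retail scenario could then aggregate the fact table
# over the dimensions, e.g. engagement per topic and month.
report = conn.execute("""
    SELECT p.topic, d.month, SUM(f.likes + f.comments + f.shares) AS engagement
    FROM fact_interaction f
    JOIN dim_post p ON p.post_id = f.post_id
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY p.topic, d.month
""").fetchall()
```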

Relevance: 100.00%

Abstract:

Debate about the relationship between business planning and performance has been active for decades (Bhidé, 2000; Mintzberg, 1994). While results have been inconclusive, this topic still strongly divides the research community (Brinckmann et al., 2010; Chwolka & Raith, 2011; Delmar & Shane, 2004; Frese, 2009; Gruber, 2007; Honig & Karlsson, 2004). Previous research has explored the relationships between innovation and the venture creation process (Amason et al., 2006; Dewar & Dutton, 1986; Jennings et al., 2009). However, the relationships between business planning and innovation have mostly been invoked indirectly in the strategy and entrepreneurship literatures through the notion of uncertainty surrounding the development of innovation. Some have posited that planning may be irrelevant given the iterative nature of innovation development, the numerous changes it entails and the need to remain flexible (Brews & Hunt, 1999). Others have suggested that planning may facilitate the achievement of goals and the overcoming of obstacles (Locke & Latham, 2000), guide the venture in its allocation of resources (Delmar & Shane, 2003) and help foster communication about the innovation being developed (Liao & Welsh, 2008). However, the nature and extent of the relationships between business planning, innovation and performance remain largely unknown. Moreover, while the reasons why ventures should (Frese, 2009) or should not (Honig, 2004) engage in business planning have been investigated quite extensively (Brinckmann et al., 2010), the specific value of business planning for nascent firms developing innovation is still unclear. The objective of this paper is to shed some light on these issues by investigating two questions on a large random sample of nascent firms: 1) how is business planning used over time by new ventures developing different types and degrees of innovation? 2) how do business planning and innovation affect the performance of nascent firms?

Methods & Key propositions: This PSED-type study draws its data from the first three waves of the CAUSEE project, in which 30,105 Australian households were randomly contacted by phone using a methodology designed to capture emerging firms (Davidsson, Steffens, Gordon, & Reynolds, 2008). This screening led to the identification of 594 nascent ventures (i.e., firms that were not yet operating at the time of identification) that were willing to participate in the study. Comprehensive phone interviews were conducted with these 594 ventures, and two comprehensive follow-ups were organised 12 and 24 months later, in which 80% of the eligible cases from the previous wave completed the interview. The questionnaire contains specific sections on business plans, covering their presence or absence, degree of formality and updates. Four types of innovation are measured along three degrees of intensity to produce a comprehensive continuous measure ranging from 0 to 12 (Dahlqvist & Wiklund, 2011). Other sections covering gestation activities, industry and different types of experience are used as controls when measuring the relationships and the impact of business planning and innovation on the performance of nascent firms over time. Results from two rounds of pre-testing informed the design of the instrument included in the main survey. The three waves of data are used first to test and compare the use of planning among nascent firms by their degree of innovation, and then to examine its impact on performance over time through regression analyses.

Results and Implications: Three waves of data collection have been completed. Preliminary results show that, on average, innovative firms are more likely to have a business plan than their less innovative counterparts. They are also more likely to update their plan, suggesting a more continuous use of the plan over time than previously thought. Further analyses of the relationships between business planning, innovation and performance are under way. This paper is expected to contribute to the literature on business planning and innovation by quantitatively measuring their impact on nascent firms' activities and performance at different stages of their development. In addition, this study will shed new light on the business planning-performance relationship by disentangling plans, types of nascent firms according to their degree of innovation, and their performance over time. Finally, we expect to increase understanding of the venture creation process by analysing these questions on nascent firms drawn from a large longitudinal sample of randomly selected ventures. We acknowledge that the results from this study are preliminary and must be interpreted with caution, as the business planning-performance link is not a straightforward relationship (Brinckmann et al., 2010). Meanwhile, we believe this study is important to the field of entrepreneurship, as it provides much-needed insight into the processes used by nascent firms during their creation and early operating stages.
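
The composite innovation measure is described above only at a high level. A minimal sketch of one plausible construction, assuming each of the four innovation types is rated on a 0-3 intensity scale and the ratings are summed to give the 0-12 score, is shown below. The type labels and the scoring function are illustrative assumptions, not the instrument from Dahlqvist & Wiklund (2011).

```python
# Hypothetical scoring of the 0-12 innovation measure: four innovation types,
# each rated on a 0-3 intensity scale, summed into one continuous score.
INNOVATION_TYPES = ("product", "process", "market", "sourcing")  # illustrative labels

def innovation_score(ratings: dict) -> int:
    """Sum the 0-3 intensity ratings over the four types (assumed construction)."""
    score = 0
    for t in INNOVATION_TYPES:
        r = ratings.get(t, 0)
        if not 0 <= r <= 3:
            raise ValueError(f"intensity for {t} must be in 0..3, got {r}")
        score += r
    return score  # ranges from 0 (imitative) to 12 (highly innovative)

print(innovation_score({"product": 3, "process": 1, "market": 2, "sourcing": 0}))  # 6
```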

Relevance: 100.00%

Abstract:

The global business environment is witnessing tough times, and this situation has significant implications for how organizations manage their processes and resources. The accounting information system (AIS) plays a critical role here in ensuring the appropriate processing of financial transactions and the availability of relevant information for decision-making. We suggest the need for a dynamic AIS environment for today’s turbulent business environment. This environment is made possible by a dynamic AIS, complementary business intelligence systems, and technical human capability. Data collected through a field survey suggest that the dynamic AIS environment contributes to an organization’s accounting functions of processing transactions, providing information for decision making, and ensuring an appropriate control environment. These accounting processes in turn contribute to the firm-level performance of the organization. From these outcomes, one can infer that a dynamic AIS environment contributes to organizational performance in today’s challenging business environment.

Relevance: 100.00%

Abstract:

Information visualization is a process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize patterns, trends and anomalies inherent in the data with little effort in a visual display. Such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we examine the research on the principles of human visual perception and information visualization as well as the application of visualization in a business decision support system. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.
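
The guidelines derived in the study are not reproduced in this abstract. As one hedged illustration of the kind of principle such a review typically yields (prefer position and length encodings over decorative gauges, label data directly, minimise non-data ink), the sketch below draws a plain horizontal bar chart with matplotlib. The KPI names and values are invented.

```python
import matplotlib.pyplot as plt

# Invented KPI figures for a reporting dashboard panel.
kpis = {"North": 1.24, "South": 0.97, "East": 1.41, "West": 0.88}  # revenue, $M

fig, ax = plt.subplots(figsize=(5, 2.5))
ax.barh(list(kpis), list(kpis.values()), color="#4878a8")

# Direct labels instead of a legend; minimal non-data ink (no grid, no full frame).
for i, v in enumerate(kpis.values()):
    ax.text(v + 0.02, i, f"{v:.2f}", va="center")
for spine in ("top", "right"):
    ax.spines[spine].set_visible(False)
ax.set_xlabel("Revenue ($M)")
ax.set_title("Quarterly revenue by region")
fig.tight_layout()
fig.savefig("kpi_panel.png", dpi=150)
```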

Relevance: 100.00%

Abstract:

Accounting information systems (AIS) capture and process accounting data and provide valuable information for decision-makers. However, in a rapidly changing environment, continual management of the AIS is necessary for organizations to optimise performance outcomes. We suggest that building a dynamic AIS capability enhances both accounting process performance and organizational performance. Using the dynamic capabilities framework (Teece 2007), we propose that a dynamic AIS capability can be developed through the synergy of three competencies: a flexible AIS, a complementary business intelligence system, and accounting professionals with IT technical competency. Using survey data, we find evidence of a positive association between a dynamic AIS capability, accounting process performance, and overall firm performance. The results suggest that developing a dynamic AIS resource can add value to an organization. This study provides guidance for organizations looking to leverage the performance outcomes of their AIS environment.

Relevance: 100.00%

Abstract:

This work compiles relevant academic literature on entry strategies and on methodologies for deciding whether to contract outsourcing services, for the case of companies planning to expand into foreign markets. The way a company plans its entry into a foreign market, considers and evaluates the relevant information, and designs its strategy determines whether or not that entry succeeds. The methodologies considered, for their part, focus on the strategic level of the organizational pyramid. The review starts from simple methods and moves on to those based on Multi-Criteria Decision Theory, both individual and hybrid. Finally, System Dynamics is presented as a valuable tool in the process, since it can be combined with multi-criteria methods.
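
As a minimal illustration of the simpler end of the multicriteria methods surveyed, the sketch below scores hypothetical entry/outsourcing alternatives with a normalised weighted sum. The criteria, weights, alternatives and raw scores are all invented; real applications would use the richer methods discussed in the literature (AHP, TOPSIS, hybrids, or a combination with System Dynamics).

```python
# Weighted-sum scoring of hypothetical market-entry / outsourcing alternatives.
# Criteria, weights and raw scores are invented for illustration only.
criteria = {"cost": 0.35, "quality": 0.25, "time_to_market": 0.20, "risk": 0.20}

alternatives = {
    "in-house entry":       {"cost": 4, "quality": 8, "time_to_market": 5, "risk": 7},
    "outsource operations": {"cost": 8, "quality": 6, "time_to_market": 8, "risk": 5},
    "joint venture":        {"cost": 6, "quality": 7, "time_to_market": 6, "risk": 6},
}

def weighted_score(scores: dict) -> float:
    """Aggregate 0-10 criterion scores using the criterion weights (which sum to 1)."""
    return sum(criteria[c] * scores[c] for c in criteria)

ranking = sorted(alternatives, key=lambda a: weighted_score(alternatives[a]), reverse=True)
for a in ranking:
    print(f"{a}: {weighted_score(alternatives[a]):.2f}")
```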

Relevance: 100.00%

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historical patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertising campaigns; and finance experts are interested in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes poses a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
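
The chapter covers a range of parallelisation strategies; the sketch below shows only the simplest data-parallel pattern as a stand-in, counting item frequencies over partitions of a dataset with Python's multiprocessing and merging the partial counts. The toy transaction data and the partitioning scheme are invented, and a real deployment would distribute partitions across Grid or Cloud nodes rather than local processes.

```python
from collections import Counter
from multiprocessing import Pool

# Toy transaction data; each transaction is a set of purchased items.
TRANSACTIONS = [
    {"bread", "milk"}, {"bread", "beer"}, {"milk", "beer", "bread"},
    {"milk"}, {"beer", "bread"}, {"milk", "bread"},
] * 1000  # replicate to mimic a larger dataset

def count_items(partition):
    """Map step: count item occurrences within one data partition."""
    counts = Counter()
    for transaction in partition:
        counts.update(transaction)
    return counts

def parallel_item_counts(data, workers=4):
    """Split the data, count in parallel, then reduce the partial counts."""
    chunk = (len(data) + workers - 1) // workers
    partitions = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with Pool(workers) as pool:
        partials = pool.map(count_items, partitions)
    total = Counter()
    for p in partials:
        total.update(p)
    return total

if __name__ == "__main__":
    print(parallel_item_counts(TRANSACTIONS).most_common(3))
```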

Relevance: 100.00%

Abstract:

In this work, we study the performance evaluation of resource-aware business process models. We define a new framework that allows analytical models for performance evaluation to be generated from business process models annotated with resource management information. This framework is composed of a new notation that allows the specification of resource management constraints and a method to convert a business process specification and its resource constraints into Stochastic Automata Networks (SANs). We show that the analysis of the generated SAN model provides several performance indices, such as the average throughput of the system, average waiting time, average queue size, and resource utilization rate. Using the BP2SAN tool (our implementation of the proposed framework) and a SAN solver (such as the PEPS tool), we show through a simple use case how a business specialist with no background in stochastic modeling can easily obtain performance indices that, in turn, help to identify bottlenecks in the model, perform workload characterization, define the provisioning of resources, and study other performance-related aspects of the business process.
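
BP2SAN and PEPS are the tooling described in the paper. As a much simpler stand-in, the sketch below computes the same kind of indices (utilisation, throughput, mean queue length, mean waiting time) for a single resource modelled as an M/M/1 queue, using the standard closed-form results rather than a SAN solver. The arrival and service rates are invented.

```python
# Closed-form performance indices for a single resource modelled as an M/M/1
# queue -- a simplified stand-in for the SAN-based analysis in the paper.
# lam = task arrival rate, mu = service rate of the resource (invented values).

def mm1_indices(lam: float, mu: float) -> dict:
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu                      # utilisation rate of the resource
    return {
        "utilisation":       rho,
        "throughput":        lam,                     # steady state: output rate = input rate
        "mean_queue_length": rho ** 2 / (1 - rho),    # jobs waiting (excluding the one in service)
        "mean_waiting_time": rho / (mu - lam),        # time spent in queue before service
    }

# e.g. 8 process instances/hour arriving at a resource that handles 10/hour
for name, value in mm1_indices(lam=8.0, mu=10.0).items():
    print(f"{name}: {value:.3f}")
```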

Relevance: 100.00%

Abstract:

Even when data repositories exhibit near-perfect data quality, users may formulate queries that do not correspond to the information requested. Users' poor information retrieval performance may arise either from problems in understanding the data models that represent the real-world systems or from limited query skills. This research focuses on users' understanding of the data structures, i.e., their ability to map the information request to the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users' performance when undertaking component-, record-, and aggregate-level tasks. For the hypotheses concerning representations with different structures but equivalent semantics, the results indicate that participants using the parsimonious data model performed better on component-level tasks, whereas participants using the ontologically clearer data model performed better on record- and aggregate-level tasks.

Relevance: 100.00%

Abstract:

This thesis explores organizations' attitudes towards the business processes that sustain them: from the near-absence of structure, to the functional organization, up to the advent of Business Process Reengineering and Business Process Management, the latter born to overcome the limits and problems of the earlier model. Within the BPM life cycle sits the process mining methodology, which enables process analysis starting from event logs, i.e., the records of events relating to all the activities supported by an enterprise information system. Process mining can be seen as a natural bridge connecting the process-based (but not data-driven) management disciplines and the recent developments in business intelligence, which can manage and manipulate the enormous amount of data available to companies (but are not process-driven). The thesis describes the requirements and technologies that enable the discipline, as well as the three techniques it supports: process discovery, conformance checking and process enhancement. Process mining was used as the main tool in a consulting project carried out by HSPI S.p.A. on behalf of a major Italian client, a provider of IT platforms and solutions. The project I took part in, described in this thesis, aims to support the organization in its plan to improve internal performance, and it made it possible to verify the applicability and the limits of process mining techniques. Finally, the appendix contains a paper I wrote that collects the applications of the discipline in real business contexts, drawing data and information from working papers, business cases and direct channels. For its validity and completeness, this document has been published on the website of the IEEE Task Force on Process Mining.
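
Discovery, conformance checking and enhancement are described above only conceptually. As a minimal, library-free sketch of the discovery step, the code below extracts a directly-follows relation and its frequencies from an invented event log of (case id, activity, timestamp) triples; such a relation is the raw material consumed by discovery algorithms like the Alpha or Inductive miner in real process mining tools. The log contents are invented.

```python
from collections import defaultdict

# Invented event log: (case_id, activity, timestamp). Real logs come from the
# organisation's information systems, e.g. exported as XES or CSV.
EVENT_LOG = [
    ("c1", "register", 1), ("c1", "check", 2), ("c1", "approve", 3),
    ("c2", "register", 1), ("c2", "check", 2), ("c2", "reject", 4),
    ("c3", "register", 2), ("c3", "check", 3), ("c3", "approve", 5),
]

def directly_follows(log):
    """Count how often activity a is directly followed by activity b within a case."""
    traces = defaultdict(list)
    for case, activity, ts in sorted(log, key=lambda e: (e[0], e[2])):
        traces[case].append(activity)
    dfg = defaultdict(int)
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

for (a, b), n in directly_follows(EVENT_LOG).items():
    print(f"{a} -> {b}: {n}")
```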

Relevance: 100.00%

Abstract:

During the past decade, there has been a dramatic increase by postsecondary institutions in providing academic programs and course offerings in a multitude of formats and venues (Biemiller, 2009; Kucsera & Zimmaro, 2010; Lang, 2009; Mangan, 2008). Strategies pertaining to reapportionment of course-delivery seat time have been a major facet of these institutional initiatives; most notably, within many open-door 2-year colleges. Often, these enrollment-management decisions are driven by the desire to increase market-share, optimize the usage of finite facility capacity, and contain costs, especially during these economically turbulent times. So, while enrollments have surged to the point where nearly one in three 18-to-24 year-old U.S. undergraduates are community college students (Pew Research Center, 2009), graduation rates, on average, still remain distressingly low (Complete College America, 2011). Among the learning-theory constructs related to seat-time reapportionment efforts is the cognitive phenomenon commonly referred to as the spacing effect, the degree to which learning is enhanced by a series of shorter, separated sessions as opposed to fewer, more massed episodes. This ex post facto study explored whether seat time in a postsecondary developmental-level algebra course is significantly related to: course success; course-enrollment persistence; and, longitudinally, the time to successfully complete a general-education-level mathematics course. Hierarchical logistic regression and discrete-time survival analysis were used to perform a multi-level, multivariable analysis of a student cohort (N = 3,284) enrolled at a large, multi-campus, urban community college. The subjects were retrospectively tracked over a 2-year longitudinal period. The study found that students in long seat-time classes tended to withdraw earlier and more often than did their peers in short seat-time classes (p < .05). Additionally, a model comprised of nine statistically significant covariates (all with p-values less than .01) was constructed. However, no longitudinal seat-time group differences were detected nor was there sufficient statistical evidence to conclude that seat time was predictive of developmental-level course success. A principal aim of this study was to demonstrate—to educational leaders, researchers, and institutional-research/business-intelligence professionals—the advantages and computational practicability of survival analysis, an underused but more powerful way to investigate changes in students over time.
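
The study's models are fitted to institutional data not reproduced here. The sketch below only illustrates the mechanics of discrete-time survival analysis: a toy cohort is expanded into a person-period data set and a logistic regression is fitted on period indicators plus a seat-time covariate with scikit-learn. The variable names, data and coding scheme are invented, not the study's specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy cohort: terms until the event (completing the gateway course) or censoring,
# whether the event occurred, and a long/short seat-time indicator. Invented data.
students = [
    {"terms": 2, "event": 1, "long_seat_time": 0},
    {"terms": 4, "event": 0, "long_seat_time": 1},
    {"terms": 3, "event": 1, "long_seat_time": 1},
    {"terms": 1, "event": 1, "long_seat_time": 0},
    {"terms": 4, "event": 1, "long_seat_time": 0},
    {"terms": 2, "event": 0, "long_seat_time": 1},
] * 50  # replicate so the toy model has something to fit

MAX_TERM = 4
X, y = [], []
for s in students:
    for t in range(1, s["terms"] + 1):
        period = [1 if t == k else 0 for k in range(1, MAX_TERM + 1)]  # period dummies
        X.append(period + [s["long_seat_time"]])
        y.append(1 if (t == s["terms"] and s["event"]) else 0)  # event in this period?

model = LogisticRegression(fit_intercept=False).fit(np.array(X), np.array(y))
print("per-period logits:", model.coef_[0][:MAX_TERM])
print("seat-time coefficient:", model.coef_[0][MAX_TERM])
```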

Relevance: 100.00%

Abstract:

Road networks are a national critical infrastructure. The road assets need to be monitored and maintained efficiently as their conditions deteriorate over time. The condition of one of such assets, road pavement, plays a major role in the road network maintenance programmes. Pavement conditions depend upon many factors such as pavement types, traffic and environmental conditions. This paper presents a data analytics case study for assessing the factors affecting the pavement deflection values measured by the traffic speed deflectometer (TSD) device. The analytics process includes acquisition and integration of data from multiple sources, data pre-processing, mining useful information from them and utilising data mining outputs for knowledge deployment. Data mining techniques are able to show how TSD outputs vary in different roads, traffic and environmental conditions. The generated data mining models map the TSD outputs to some classes and define correction factors for each class.
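
The actual mining models built in the case study are not given in the abstract. As a hedged sketch of the general idea (grouping TSD measurements into classes by road, traffic and environmental conditions and attaching a correction factor to each class), the code below clusters invented condition features with k-means and sets each class's factor to the ratio of a reference mean deflection to the class mean. The features, synthetic data and factor definition are all assumptions, not the paper's models.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Invented TSD records: [pavement_temperature_C, traffic_speed_kmh, asphalt_thickness_mm]
conditions = rng.normal(loc=[25, 60, 120], scale=[8, 15, 30], size=(300, 3))
deflection = 200 + 2.0 * conditions[:, 0] - 0.5 * conditions[:, 2] + rng.normal(0, 10, 300)

# Group measurement conditions into classes.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(conditions)
labels = kmeans.labels_

# Assumed correction-factor definition: reference mean deflection / class mean.
reference = deflection.mean()
for k in range(3):
    factor = reference / deflection[labels == k].mean()
    print(f"class {k}: {np.sum(labels == k)} records, correction factor {factor:.3f}")
```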

Relevance: 100.00%

Abstract:

From 2014, QUT will be adopting a life-cycle approach to Course Quality Assurance informed by a wider and richer range of historic, ‘live’ and ‘predictive’ course data. Key data elements continue to be grouped according to the three broad categories – Viability, Quality of Learning Environment and Outcomes – and are further supported with analytic data presented within tables and charts. Course Quality Assurance and this Consolidated Courses Performance Report illuminate aspects of courses from a data evidence base, highlighting the strengths and weaknesses of our courses. They provide the framework and tools to achieve QUT's commitment to excellent graduate outcomes by drawing attention and focus to the quality of our courses and providing a structured approach for bringing about change. Our portfolio of courses forms a vital part of QUT, generating almost $600 million in 2013 alone. Real world courses are fundamental to the strength of the Institution; they are what our many thousands of current and future students are drawn to and invest their time and aspirations in. As we move through a period of regulatory and deregulatory uncertainty, there is a greater need for QUT to monitor and respond to the needs and expectations of our students. The life-cycle approach, with its rich and predictive data, provides the best source of evidence we have had to date to assure the quality of our courses and their relevance in a rapidly changing higher education context.

Relevance: 100.00%

Abstract:

The conventional measures of benchmarking focus mainly on the water produced or water delivered, and ignore the service quality, and as a result the 'low-cost and low-quality' utilities are rated as efficient units. Benchmarking must credit utilities for improvements in service delivery. This study measures the performance of 20 urban water utilities using data from an Asian Development Bank survey of Indian water utilities in 2005. It applies data envelopment analysis to measure the performance of utilities. The results reveal that incorporation of a quality dimension into the analysis significantly increases the average performance of utilities. The difference between conventional quantity-based measures and quality-adjusted estimates implies that there are significant opportunity costs of maintaining the quality of services in water delivery.
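
The paper's DEA specification (inputs, outputs and the quality adjustment) is not detailed in the abstract. The sketch below solves the standard input-oriented CCR envelopment linear programme with scipy for a few invented utilities, using operating cost and staff as inputs and water delivered plus a service-quality index as outputs; adding the quality column as an extra output is only in the spirit of the study, not its actual model. All data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Invented data for five utilities: inputs (operating cost, staff) and outputs
# (water delivered, service-quality index). Columns = utilities.
X = np.array([[100.0, 120.0, 90.0, 150.0, 110.0],    # operating cost
              [ 50.0,  70.0, 40.0,  90.0,  60.0]])   # staff
Y = np.array([[ 80.0, 100.0, 70.0, 120.0,  95.0],    # water delivered
              [  0.9,   0.6,  0.8,   0.5,   0.7]])   # service-quality index

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR DEA efficiency of unit o (theta in (0, 1])."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta; variables = [theta, lambdas]
    A_in = np.hstack([-X[:, [o]], X])           # sum_j lambda_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -sum_j lambda_j y_rj <= -y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

for o in range(X.shape[1]):
    print(f"utility {o}: efficiency {ccr_efficiency(o, X, Y):.3f}")
```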