774 results for data warehouse tuning aggregato business intelligence performance


Relevance:

100.00%

Publisher:

Abstract:

Companies face new challenges almost every day. To stay competitive, they must strive for continuous development and improvement. Describing a company through its processes gives a clear overview of the entire operation, which can contribute to a well-established overall understanding of the company. This case study is based on Stort AB, a small logistics company specialized in international transportation and logistics solutions. The purpose of this study is to perform value stream mapping in order to create a more efficient production process and to propose improvements that reduce processing time. After the value stream mapping, data envelopment analysis is used to calculate how lean Stort AB is today and how lean the company could become by implementing the proposed improvements. The results show that the production process can improve efficiency by minimizing waste caused by a poor workplace layout and by over-processing. The authors' suggested solution is to introduce standardized processes and to invest in technical instruments that automate the process and reduce process time. According to the data envelopment analysis, the business is 41 percent lean at present, may become 55 percent lean in the near term, and could reach an optimal 100 percent lean mode if the process is automated.
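Leanness scores like the 41 percent figure above are typically produced by solving a small linear program per unit being evaluated. The sketch below is a minimal input-oriented CCR data envelopment analysis model using `scipy.optimize.linprog`; the input/output numbers are invented for illustration and are not Stort AB's data.

```python
# Minimal input-oriented CCR data envelopment analysis (DEA) sketch.
# Inputs X and outputs Y are illustrative, not real company data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [3.0]])  # one input per decision-making unit (DMU)
Y = np.array([[4.0], [4.0], [3.0]])  # one output per DMU

def ccr_efficiency(k: int) -> float:
    """Efficiency of DMU k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k."""
    n = X.shape[0]
    # Decision variables: [theta, lam_1 .. lam_n]
    c = np.zeros(n + 1)
    c[0] = 1.0  # minimize theta
    # Input constraints: sum_j lam_j * x_j - theta * x_k <= 0
    A_in = np.hstack([-X[k].reshape(-1, 1), X.T])
    b_in = np.zeros(X.shape[1])
    # Output constraints: -sum_j lam_j * y_j <= -y_k
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

for k in range(3):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.2f}")
```

With these toy figures, the first unit is fully efficient (score 1.0) and the others score 0.5; a leanness percentage can be read off the same way.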

Relevance:

100.00%

Publisher:

Abstract:

Twitter is one of the biggest social networks in the world, and every day millions of tweets are posted and discussed, expressing various views and opinions. A large variety of research activities has studied how these opinions can be clustered and analyzed so that tendencies can be uncovered. Due to the inherent weaknesses of tweets, very short texts and a very informal style of writing, it is hard to analyze tweet data with good performance and accuracy. In this paper, we attack the problem from another angle, using a two-layer structure to analyze the Twitter data: LDA combined with topic map modelling. The experimental results demonstrate that this approach makes progress in Twitter data analysis. However, more experiments with this method are needed to ensure that accurate analytic results can be maintained.
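As a rough illustration of the first layer, topics can be extracted from short texts with LDA; the sketch below uses scikit-learn on a few invented tweet-like strings. The corpus and topic count are assumptions, and the paper's topic-map layer is not shown.

```python
# First-layer sketch: LDA topic extraction over toy tweet-like texts.
# Corpus, preprocessing, and topic count are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "great match tonight football win",
    "football fans celebrate the win",
    "new phone battery lasts all day",
    "phone camera quality is amazing",
]

# Bag-of-words counts; short texts keep the vocabulary small.
vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(tweets)

# Two latent topics (an assumption for this toy corpus).
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)  # rows: tweets, cols: topic weights

print(doc_topics.shape)  # (4, 2); each row sums to 1
```

The resulting per-tweet topic distributions are what a second layer (here, topic maps) would then organize and relate.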

Relevance:

100.00%

Publisher:

Abstract:

This bachelor's thesis is a literature review whose goal is to examine the uses of data analytics and the impact of data utilization on business. The work covers the use of data analytics and the challenges of exploiting data effectively. The scope is limited to corporate financial control, where analytics is used in management accounting and financial accounting. The exponential growth of data volume creates new challenges and opportunities for the use of data analytics. Data in itself, however, has little value to a company; value arises through processing. Although data analytics is already widely studied and used, it offers opportunities far greater than its current applications. One of the key findings of this work is that data analytics can make management accounting more effective and ease the tasks of financial accounting. The amount of available data, however, grows so fast that the available technology and level of expertise cannot keep pace. In particular, broader adoption of big data and its effective utilization will increasingly shape financial control practices and applications in the future.

Relevance:

100.00%

Publisher:

Abstract:

This research focuses on the tourism sector, which is bombarded daily with a considerable amount of data and information. Nowadays, technology is used significantly more to promote and sell the products/services available on the market. Alongside technological evolution, users/customers can increasingly buy the tourism products they want at the distance of a click. There is, moreover, a wide range of tourism applications that make it possible to understand tourists' tastes and needs, as well as their attitude towards tourism. However, neither tourism entities nor tourism managers make intelligent use of the data made available to them. They normally tend to focus on tourism in Portugal and on how their own entity is presented, forgetting that the data can and should be used to expand the market and to understand potential markets. The main aim of this research is therefore the creation of an info-communicational platform that fully analyzes the data obtained, and that provides the relevant tools for this analysis, namely through an adequate infographic representation and strategies for communicating it to stakeholders. To this end, the action-research methodology was applied in this dissertation, seen as a cyclical process which, besides including these two strands simultaneously, alternates between action and critical reflection, supported by theoretical foundations. The prototype of the Smart Tourism platform resulted in an innovative system that tries to respond to the indicators chosen for the dashboard and to the info-communicational problem, laying the groundwork for entities to analyze tourism activity in a more integrated, systematized, and rational way.
A visual info-communicational prototype (a visual dashboard) was therefore developed and qualitatively evaluated; beyond what has already been mentioned, it supports the management of products, customers, staff, and partners, thereby increasing the value of this sector.

Relevance:

100.00%

Publisher:

Abstract:

The business environment points to the necessity of new forms of management for the sustainable competitiveness of organizations over time. Coopetition is characterized as an alternative form of interaction between different actors, which compete and cooperate simultaneously in the pursuit of common goals. This dual relation, within a gain-increasing perspective, converts competitors into partners and fosters competitiveness, especially within a specific sector. The field of competitive intelligence has, in turn, assisted organizations individually in the systematization of information valuable to decision-making processes, which benefits competitiveness. It follows that it is possible to combine coopetition and competitive intelligence in a systematized process of sectorial intelligence for coopetitive relations. The general aim of this study is, therefore, to put forth a model of sectorial coopetitive intelligence. The methodological outline of the study is characterized as a mixed (quantitative and qualitative) approach, of an applied nature, with exploratory and descriptive aims. The Coordination of the Strategic Roadmapping Project for the Future of Paraná's Industry is the selected object of investigation. Protocols were designed to collect primary and secondary data. For the primary data, online questionnaires were sent to the sectors selected for examination. A total of 149 questionnaire responses were obtained, and interviews were performed with all members of the technical team of the Coordination, five interviewees in total. After collection, all the data were tabulated, analyzed, and validated by means of focus groups with the same five members of the Coordination's technical team, and interviews were performed with a representative of each of the four selected sectors, for a total of nine participants in the validation.
The results allowed the systematization of a sectorial coopetitive intelligence model called ICoops. The model comprises the stages of planning, collection, analysis, project development, and dissemination and evaluation; each stage is detailed in inputs, activities, and outputs. The results suggest that sectorial coopetition is motivated mainly by knowledge sharing, technological development, investment in R&D, innovation, chain integration, and resource complementation. The importance of a neutral institution is recognized as a facilitator of and incentive to the approximation of organizations. Among the main difficulties are the financing of projects, the adhesion of new members, the lack of tools for the analysis of information, and the dissemination of the actions.

Relevance:

100.00%

Publisher:

Abstract:

The process of building Data Warehouses (DW) is well known, with well-defined stages, but at the same time it is mostly carried out manually by IT people in conjunction with business people. Web Warehouses (WW) are DWs whose data sources are taken from the web. We define a flexible WW, which can be configured according to different domains through the selection of the web sources and the definition of data processing characteristics. A Business Process Management (BPM) system allows modeling and executing Business Processes (BPs), providing support for the automation of processes. To support the process of building flexible WWs, we propose two levels of BPs: a configuration process that supports the selection of web sources and the definition of schemas and mappings, and a feeding process that takes the defined configuration and loads the data into the WW. In this paper we present a proof of concept of both processes, with a focus on the configuration process and the data it defines.
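The two-level idea can be caricatured in a few lines of Python: the configuration process produces a record of chosen sources and schema mappings, and the feeding process applies it to raw data. The source names, mappings, and records here are invented for illustration; the paper's actual processes run inside a BPM system.

```python
# Toy sketch of the two BP levels: the output of a "configuration"
# process (the config dict) and a "feeding" process that loads data
# according to it. All names and mappings are illustrative assumptions.

# Configuration: chosen web sources plus mappings from source fields
# to warehouse schema columns.
config = {
    "sources": ["weather_site"],
    "mappings": {"weather_site": {"t": "temperature_c", "c": "city"}},
}

def feed(config, raw_by_source):
    """Feeding process: apply the configured mappings and load rows."""
    warehouse = []
    for source in config["sources"]:
        mapping = config["mappings"][source]
        for record in raw_by_source.get(source, []):
            # Keep only mapped fields, renamed to warehouse columns.
            warehouse.append({mapping[k]: v for k, v in record.items() if k in mapping})
    return warehouse

rows = feed(config, {"weather_site": [{"t": 21, "c": "Montevideo", "junk": 0}]})
print(rows)  # [{'temperature_c': 21, 'city': 'Montevideo'}]
```

Changing the domain then amounts to swapping the config, not rewriting the feeding logic, which is the flexibility the paper aims at.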

Relevance:

100.00%

Publisher:

Abstract:

Beef businesses in northern Australia are facing increased pressure to be productive and profitable, with challenges such as climate variability and poor financial performance over the past decade. Declining terms of trade, limited recent gains in on-farm productivity, and low profit margins under current management systems and climatic conditions leave little capacity for businesses to absorb climate change-induced losses. In order to generate a whole-of-business focus on management change, the Climate Clever Beef project in the Maranoa-Balonne region of Queensland trialled the use of business analysis with beef producers to improve financial literacy, provide a greater understanding of current business performance, and initiate changes to current management practices. Demonstration properties were engaged, and a systematic approach was used to assess current business performance, evaluate the impacts of management changes on the business, trial practices, and promote successful outcomes to the wider industry. Focus was concentrated on improving financial literacy skills, understanding the business's key performance indicators, and modifying practices to improve both productivity and profitability. To best achieve the desired outcomes, several extension models were employed: the ‘group facilitation/empowerment model’, the ‘individual consultant/mentor model’ and the ‘technology development model’. Providing producers with a whole-of-business approach, and using business analysis in conjunction with on-farm trials and various extension methods, proved a successful way to encourage producers in the region to adopt new practices in the areas of greatest impact. The areas targeted for development generally led to improvements in animal performance and grazing land management, further improving prospects for climate resilience.

Relevance:

100.00%

Publisher:

Abstract:

The volume of data in libraries has grown enormously in recent years, as has the complexity of its sources and information formats, making management and access difficult, especially as support for decision making. Knowing that good library management involves the integration of strategic indicators, the implementation of a Data Warehouse (DW) that adequately manages such a quantity of information, as well as its complex mix of data sources, becomes an interesting alternative to consider. The article describes the design and implementation of a decision support system (DSS) based on DW techniques for the library of the Universidad de Cuenca. For this, the study uses a holistic methodology proposed by Siguenza-Guzman et al. (2014) for the comprehensive evaluation of libraries. This methodology evaluates the collection and the services, incorporating important elements for library management, such as service performance, quality control, collection use, and interaction with the user. Based on this analysis, a DW architecture is proposed that integrates, processes, and stores the data. Finally, the stored data are analyzed and visualized through online analytical processing (OLAP) tools. Initial implementation tests confirm the viability and effectiveness of the proposed approach, successfully integrating multiple heterogeneous data sources and formats, enabling library directors to generate customized reports and even allowing the daily transactional processes to mature.
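An OLAP-style roll-up of the kind described can be imitated with a pivot over a fact table; the sketch below uses pandas on an invented loans fact table. The columns and values are assumptions for illustration, not the Universidad de Cuenca data.

```python
# OLAP-style roll-up sketch: aggregate a toy loans fact table by
# year and branch. All data are invented for illustration.
import pandas as pd

loans = pd.DataFrame({
    "year":   [2023, 2023, 2024, 2024, 2024],
    "branch": ["Central", "Science", "Central", "Science", "Central"],
    "loans":  [120, 80, 150, 90, 60],
})

# The pivot acts as a tiny cube slice: rows = year, columns = branch.
cube = loans.pivot_table(index="year", columns="branch",
                         values="loans", aggfunc="sum", fill_value=0)
print(cube)
```

Drilling down or slicing then corresponds to adding index levels or filtering the fact table before pivoting.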

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.

Relevance:

100.00%

Publisher:

Abstract:

Part 14: Interoperability and Integration

Relevance:

100.00%

Publisher:

Abstract:

Part 2: Behaviour and Coordination

Relevance:

100.00%

Publisher:

Abstract:

Libraries, since their inception 4,000 years ago, have been in a process of constant change. Although change was slow for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is employed. Firstly, a holistic structure and the toolset required to holistically assess academic libraries are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used, in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system: the library's performance and the costs incurred and resources consumed by library services are analyzed. The second quadrant evaluates the external perspective of the library system: users' perception of service quality is judged here. The third quadrant analyzes the external perspective of the library collection, evaluating the impact of the current collection on its users.
Finally, the fourth quadrant evaluates the internal perspective of the library collection: the usage patterns followed to manipulate the collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. Secondly, therefore, a data warehousing approach is designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and applied for several purposes, including the following: 1) Data visualization and reporting, allowing library managers to publish library indicators simply and quickly using online reporting tools. 2) Sophisticated data analysis through data mining tools; three data mining techniques are examined in this research study: regression, clustering, and classification. They are applied to the case study as follows: predicting future investment in library development; finding clusters of users who share common interests and similar profiles but belong to different faculties; and predicting library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) Input for optimization models; early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study. Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed: an optimization model is defined with the objective of maximizing the usage of the digital collection over all library divisions subject to a single collection budget.
By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution that assists library managers in making economic decisions based on an "as realistic as possible" view of the library's situation.
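The budget-allocation problem in item 3 has the flavor of a fractional knapsack: each division has an expected usage per monetary unit and an upper bound on useful spending, and one shared budget is distributed. The greedy sketch below is an illustration under those assumptions, not the study's actual model; division names, rates, and caps are invented.

```python
# Fractional-knapsack sketch of allocating one digital-collection budget
# across library divisions. Usage rates and caps are invented assumptions.

def allocate(budget, divisions):
    """divisions: list of (name, usage_per_unit, max_useful_spend).
    Greedily fund the highest usage-per-unit divisions first."""
    allocation = {}
    remaining = budget
    for name, rate, cap in sorted(divisions, key=lambda d: d[1], reverse=True):
        spend = min(cap, remaining)
        allocation[name] = spend
        remaining -= spend
    return allocation

divisions = [
    ("Engineering", 3.0, 40.0),  # expected uses per currency unit, cap
    ("Humanities",  2.0, 50.0),
    ("Law",         1.5, 30.0),
]
print(allocate(100.0, divisions))
# {'Engineering': 40.0, 'Humanities': 50.0, 'Law': 10.0}
```

For this linear objective with per-division caps, the greedy order is optimal; richer models (integer purchases, cross-division interactions) would call for a proper solver.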

Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation, Universidade de Brasília, Faculdade de Economia, Administração e Contabilidade, Programa de Pós-Graduação em Administração, 2016.

Relevance:

100.00%

Publisher:

Abstract:

The era of big data brings new challenges to network traffic classification, an essential tool for network management and security. To deal with the problems of dynamic ports and encrypted payloads in traditional port-based and payload-based methods, the state of the art employs flow statistical features and machine learning techniques to identify network traffic. This chapter reviews the statistical-feature-based traffic classification methods proposed in the last decade. We also examine a new problem: unclean traffic in the training stage of machine learning, caused by labeling mistakes and the complex composition of big Internet data. The chapter further evaluates the performance of typical machine learning algorithms with unclean training data. The review and the empirical study can provide a guide for academia and practitioners in choosing proper traffic classification methods in real-world scenarios.
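The unclean-training-data setting can be sketched in a few lines: train a classifier on flow statistics where a fraction of the training labels has been flipped, then score it on clean test labels. The features, classes, and 10% noise level below are invented for illustration, not the chapter's experimental setup.

```python
# Sketch: flow-statistics-based traffic classification with "unclean"
# training labels. Features, classes, and noise level are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic flow features: [mean packet size (bytes), flow duration (s)].
web = rng.normal([500.0, 1.0], [50.0, 0.3], size=(200, 2))
p2p = rng.normal([1200.0, 30.0], [100.0, 5.0], size=(200, 2))
X = np.vstack([web, p2p])
y = np.array([0] * 200 + [1] * 200)  # 0 = web, 1 = p2p

# Train/test split, then flip 10% of the TRAINING labels to mimic
# labeling mistakes in big Internet data.
idx = rng.permutation(400)
train, test = idx[:300], idx[300:]
y_train = y[train].copy()
flip = rng.choice(300, size=30, replace=False)
y_train[flip] = 1 - y_train[flip]

clf = DecisionTreeClassifier(random_state=0).fit(X[train], y_train)
acc = clf.score(X[test], y[test])  # scored against CLEAN labels
print(f"accuracy on clean test set: {acc:.2f}")
```

Varying the flip rate and swapping in other learners is essentially the shape of the empirical comparison the chapter describes.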