36 results for Business Intelligence, Data Warehouse, Sistemi Informativi


Relevance:

100.00%

Publisher:

Abstract:

This article discusses the application of Information and Communication Technologies and best-practice strategies to capture and maintain students' attention. It is based on a ten-year case study using a complete information system. This system, in addition to being considered an ERP supporting academic management activities, also has a strong SRM component that supports academic and administrative activities. The article describes the extent to which the presented system facilitates interaction and communication among members of the academic community over the Internet, with services available on the Web complemented by email, SMS, and CTI. Backed by empirical analysis and the results of investigations, it demonstrates how this type of practice can raise the community's level of satisfaction. In particular, it is possible to combat academic failure, prevent students from abandoning their course before completion, and encourage them to recommend it to prospective students. In addition, such a strategy allows substantial savings in the management of the institution, increasing its value. As future work, we present the new phase of the project: the implementation of Business Intelligence to optimize the management process and make it proactive. The technological vision guiding new developments toward an architecture based on Web services and procedural languages is also presented.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a Multi-Agent Market simulator designed for developing new agent market strategies based on a complete understanding of buyer and seller behaviors, preference models, and pricing algorithms, considering user risk preferences and game theory for scenario analysis. The tool studies negotiations based on different market mechanisms and on time- and behavior-dependent strategies. The results of the negotiations between agents are analyzed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies. The system also includes agents capable of improving their performance through their own experience, adapting to market conditions and taking other agents' reactions into account.
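
As an illustrative aside, one simple way an agent can "improve its performance with its own experience" is a feedback rule that adjusts its asking price based on past negotiation outcomes. A minimal sketch follows; the update rule and figures are illustrative assumptions, not the simulator's actual algorithm:

```python
# Illustrative feedback rule, not the simulator's actual strategy:
# raise the ask after a successful sale, lower it after a failed deal.
def adapt_price(price, sold, step=0.05):
    """Return the agent's next asking price given the last outcome."""
    return price * (1 + step) if sold else price * (1 - step)

price = 100.0
for sold in [True, True, False]:  # outcomes of three negotiation rounds
    price = adapt_price(price, sold)
# price is now 100 * 1.05 * 1.05 * 0.95, roughly 104.74
```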

Relevance:

30.00%

Publisher:

Abstract:

Introduction: This paper deals with the increasing use of merger and acquisition strategies within the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcome of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, sampled for their successful use of the tactics in question. Materials and Methods: To create an overview of development through mergers and acquisitions, the historical data of the two corporations were consulted on their official websites. The most relevant events were then associated with corresponding information from the corporations' financial reports and statements, supplied by web-based financial data providers. Results and Discussion: In the past few decades, Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify their product and service portfolios, and survive and surpass competitors. The consequences proved positive, although this approach requires considerable capital availability. Conclusions: The results reveal that, as far as the two sampled companies are concerned, acquisitions and mergers are reactions to the pressure of a highly competitive environment. Moreover, the continuous diversification of the market's needs is also a consistent motive. However, the prevalence and prominence of merger and acquisition strategies are conditioned by the tender offer, the announcer's caliber, research and development status, and other factors determined by the internal and external actors of the market.

Relevance:

30.00%

Publisher:

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, deregulation was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. Sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses its safety limits, then cheap but distant generation may have to be replaced by more expensive, closer generation to reduce the excess power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation, or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power is lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas and may provide economic indicator signals to market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks.
To test our methodology, we used an LMP database from the California Independent System Operator (CAISO) for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
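
A minimal sketch of the bus-grouping step, using a plain one-dimensional k-means; the bus prices below are illustrative values with two clear price zones, not CAISO data:

```python
import random

def kmeans_1d(values, k, iters=50, seed=0):
    """Plain 1-D k-means: returns a cluster label per value and the centers."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    labels = [0] * len(values)
    for _ in range(iters):
        # assignment step: each value goes to its nearest center
        labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
        # update step: each center moves to the mean of its cluster
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels, centers

# Illustrative LMPs ($/MWh) at six hypothetical buses forming two price zones.
lmps = [31.0, 32.5, 30.8, 55.2, 54.1, 56.0]
labels, centers = kmeans_1d(lmps, k=2)
```

On clearly separated data like this, the two resulting classes correspond to the low- and high-price zones regardless of the random initialization.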

Relevance:

30.00%

Publisher:

Abstract:

Cyber-Physical Intelligence is a new concept integrating Cyber-Physical Systems and Intelligent Systems. The paradigm is centered on incorporating intelligent behavior into cyber-physical systems, which until now have been oriented mainly toward operational technological aspects. In this paper we describe the use of Cyber-Physical Intelligence in the context of Power Systems, namely the use of Intelligent SCADA (Supervisory Control and Data Acquisition) systems at different levels of the Power System, from the Generation, Transmission, and Distribution Control Centers down to the customers' houses.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. The tool investigates the long-term hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. The variance estimate and the expected return are based on a forecasted scenario interval determined by a long-term price range forecast model, developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparison with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the mainland Spanish market is presented to demonstrate the effectiveness of the proposed methodology.
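
A minimal sketch of the utility-driven search, assuming the common mean-variance form U = E[R] − λ·Var[R] and a single spot/forward weight; the price scenarios, risk-aversion value, and PSO settings are illustrative assumptions, not the authors' model:

```python
import random

def utility(w, spot_scenarios, forward_price, risk_aversion):
    """U = E[R] - lambda * Var[R] for a mix of spot (weight w) and forward (1 - w)."""
    returns = [w * s + (1 - w) * forward_price for s in spot_scenarios]
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / len(returns)
    return mean - risk_aversion * var

def pso_weight(spot, fwd, lam, particles=20, iters=60, seed=1):
    """Minimal PSO over the spot weight w in [0, 1], maximizing utility."""
    rng = random.Random(seed)
    pos = [rng.random() for _ in range(particles)]
    vel = [0.0] * particles
    best = list(pos)
    best_u = [utility(p, spot, fwd, lam) for p in pos]
    g = max(range(particles), key=lambda i: best_u[i])
    gbest, gbest_u = best[g], best_u[g]
    for _ in range(iters):
        for i in range(particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i]) + 1.5 * r2 * (gbest - pos[i])
            pos[i] = min(1.0, max(0.0, pos[i] + vel[i]))  # keep the weight in [0, 1]
            u = utility(pos[i], spot, fwd, lam)
            if u > best_u[i]:
                best[i], best_u[i] = pos[i], u
                if u > gbest_u:
                    gbest, gbest_u = pos[i], u
    return gbest

# Illustrative scenarios: a volatile spot price versus a flat forward at 50.
w = pso_weight(spot=[30, 45, 50, 55, 70], fwd=50.0, lam=0.1)
```

With these numbers the expected return is the same for any mix, so a risk-averse producer should hedge fully with the forward, and the swarm drives w toward zero.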

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses the optimal involvement of a power producer in derivatives electricity markets to hedge against pool price volatility. To achieve this aim, a long-term risk management tool based on a swarm intelligence metaheuristic optimization technique is proposed. The tool investigates the long-term risk-hedging opportunities available to electric power producers through the use of contracts with physical (spot and forward) and financial (options) settlement. The producer's risk preference is formulated as a utility function (U) expressing the trade-off between the expectation and the variance of the return. Both are based on a forecasted scenario interval determined by a long-term price range forecasting model. This model also makes use of particle swarm optimization (PSO) to find the parameters that achieve the best forecasting results. Since the price estimation depends on load forecasting, this work also presents a regressive long-term load forecast model that uses PSO to find the best parameters, just as in the price estimation. The performance of the PSO technique has been evaluated by comparison with a Genetic Algorithm (GA) based approach. A case study is presented and the results are discussed, taking into account real historical price and load data from the mainland Spanish electricity market and demonstrating the effectiveness of the methodology in handling this type of problem. Finally, conclusions are duly drawn.
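
A regressive load model can be illustrated by its simplest possible instance, an ordinary least-squares linear trend (the paper fits its parameters with PSO; plain OLS is used here for brevity). The yearly demand figures below are illustrative, not real Spanish data:

```python
def fit_linear_trend(years, loads):
    """Least-squares fit of load = a + b * year; returns (a, b)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(loads) / n
    b = sum((x - mx) * (y - my) for x, y in zip(years, loads)) \
        / sum((x - mx) ** 2 for x in years)
    a = my - b * mx
    return a, b

years = [2005, 2006, 2007, 2008]
loads = [246.0, 254.0, 261.0, 265.0]  # illustrative annual demand (TWh)
a, b = fit_linear_trend(years, loads)
forecast_2010 = a + b * 2010  # extrapolate the trend two years ahead
```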

Relevance:

30.00%

Publisher:

Abstract:

The emergence of new business models, namely the establishment of partnerships between organizations and the possibility for companies to add existing Web data, especially from the semantic Web, to their own information, has highlighted some problems existing in databases, particularly related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them, and may even lead to their disappearance, since many decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must be able to understand the data, i.e., an associated semantics is needed. The solution presented in this paper includes the use of ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level, and given existing mappings between domain ontologies and an ontology derived from a database, the operations may be instantiated and proposed to the expert/specialist for execution over that database, thus enabling their interoperability.
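
A minimal sketch of the idea: a cleaning rule stated against ontology concepts can be instantiated on a concrete table through a concept-to-column mapping. The concept names, mapping, rule syntax, and records are illustrative assumptions, not the paper's formalism:

```python
import re

# Conceptual-level rule against an ontology concept (illustrative syntax).
rule = ("Person.email", "must_match", r"^[^@\s]+@[^@\s]+$")

# Mapping from the ontology concept to the concrete database column.
mapping = {"Person.email": "email_addr"}

def violations(rows, rule, mapping):
    """Instantiate the conceptual rule on real rows via the mapping."""
    concept, _, pattern = rule
    column = mapping[concept]
    return [row for row in rows if not re.match(pattern, row[column])]

rows = [{"email_addr": "ana@example.com"}, {"email_addr": "not-an-email"}]
bad = violations(rows, rule, mapping)  # only the malformed record is flagged
```

The same conceptual rule could be reused on another database simply by supplying a different mapping, which is the interoperability point the paper makes.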

Relevance:

30.00%

Publisher:

Abstract:

Industrial activity is inevitably associated with a certain degradation of environmental quality, because it is not possible to guarantee that a manufacturing process is totally innocuous. The eco-efficiency concept is globally accepted as a philosophy of enterprise management that encourages companies to become more competitive, innovative, and environmentally responsible by linking their objectives for business excellence with their objectives for environmental excellence. This link requires an organizational methodology in which the performance of the company is consistent with sustainable development. The main purpose of this project is to apply the concept of eco-efficiency to the particular case of the metallurgical and metalworking industries, through the development of the particular indicators needed, and to produce a manual of procedures for implementing the appropriate solution.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the creation and development of technological schools directly linked to the business community and to public higher education. Establishing themselves as the key interface between the two sectors, they make a significant contribution by providing a greater competitive edge in the face of increasing competition in traditional markets. The development of new business strategies supported by references of excellence, quality, and competitiveness also provides a good link for establishing partnerships aimed at the qualification of intermediate-level staff, bridging the technological school and technologically based higher education. We present a case study depicting the success of Escola Tecnológica de Vale de Cambra.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we describe a low cost distributed system intended to increase the positioning accuracy of outdoor navigation systems based on the Global Positioning System (GPS). Since the accuracy of absolute GPS positioning is insufficient for many outdoor navigation tasks, another GPS based methodology – the Differential GPS (DGPS) – was developed in the nineties. The differential or relative positioning approach is based on the calculation and dissemination of the range errors of the received GPS satellites. GPS/DGPS receivers correlate the broadcasted GPS data with the DGPS corrections, granting users increased accuracy. DGPS data can be disseminated using terrestrial radio beacons, satellites and, more recently, the Internet. Our goal is to provide mobile platforms within our campus with DGPS data for precise outdoor navigation. To achieve this objective, we designed and implemented a three-tier client/server distributed system that, first, establishes Internet links with remote DGPS sources and, then, performs campus-wide dissemination of the obtained data. The Internet links are established between data servers connected to remote DGPS sources and the client, which is the data input module of the campus-wide DGPS data provider. The campus DGPS data provider allows the establishment of both Intranet and wireless links within the campus. This distributed system is expected to provide adequate support for accurate outdoor navigation tasks.
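
The core of the differential correction is simple arithmetic: the rover subtracts the broadcast per-satellite range error from its own measured pseudorange. A minimal sketch, with illustrative satellite IDs and range values rather than real measurements:

```python
def apply_dgps_corrections(pseudoranges, corrections):
    """Subtract the broadcast range error (meters) from each measured pseudorange.

    Satellites with no available correction are left unchanged.
    """
    return {sat: rng - corrections.get(sat, 0.0) for sat, rng in pseudoranges.items()}

measured = {"G01": 21_000_123.4, "G07": 23_500_456.9}  # rover pseudoranges (m)
errors = {"G01": 3.4, "G07": -1.1}                     # reference-station corrections (m)
corrected = apply_dgps_corrections(measured, errors)
```

Because nearby receivers see nearly the same satellite and atmospheric errors, subtracting the reference station's measured error removes most of the rover's error as well, which is what grants the increased accuracy described above.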

Relevance:

30.00%

Publisher:

Abstract:

The effects of the ISO 9001 quality certification process inside an organization may be considered from a wide range of perspectives. This article places the phenomenon in a theoretical perspective and aims to analyze the impact of the ISO 9001 quality standard on business performance according to financial data. More specifically, the impact of ISO 9001 on productivity, business value, and sales growth is examined through an econometric model applied to panel data of Portuguese companies from the agro-food and construction sectors. The analysis has produced interestingly varied results and, in brief, revealed that a direct connection between ISO 9001 certification and improved business performance cannot be deterministically ascertained. Many other variables contribute to its success.

Relevance:

30.00%

Publisher:

Abstract:

Continuous process improvement programs are increasingly the approach companies take to face the market. By implementing these programs, it is possible to bring simplicity and standardization to processes and, consequently, to reduce the costs of internal waste related to their quality. Quality improvement tools and the tools associated with Lean Thinking represent an important pillar in the success of any continuous process improvement program. These tools are useful means for analyzing, controlling, and organizing the data needed for correct decision making in organizations. The main objective of this project is the design and implementation of a quality improvement program at Eurico Ferreira, S.A., based on the evaluation of customer satisfaction and the application of the 5S methodology. In this context, the work's theoretical foundation covered Quality Management, Lean Thinking, and some tools from both fields. The company's business area to be addressed was then selected. After the selection, an initial diagnosis of the process was carried out, identifying the various points for improvement, to which some Lean Thinking tools were applied, namely Value Stream Mapping and the 5S methodology. The former made it possible to build a map of the current state of the process, representing all the participants as well as the flow of materials and information throughout the process. The 5S methodology made it possible to act on waste, identifying and implementing several improvements in the process. It was concluded that the implementation of these tools contributed effectively to the continuous improvement of process quality, and management decided to extend the scope of the project to the remaining warehouses of the company's logistics center.
Judging by customer satisfaction, expressed in the favorable evolution of the service-level agreement, it can be stated that the implemented tools have generated very positive results in the short term.

Relevance:

30.00%

Publisher:

Abstract:

Current manufacturing systems challenges, arising from the international economic crisis, market globalization, and e-business trends, incite the development of intelligent systems to support decision making, allowing managers to concentrate on high-level task management while improving decision response and effectiveness toward manufacturing agility. This paper presents a novel negotiation mechanism for dynamic scheduling based on social and collective intelligence. Under the proposed negotiation mechanism, agents must interact and collaborate in order to improve the global schedule. Swarm Intelligence (SI) is a general term for several computational techniques that draw ideas and inspiration from the social behaviors of insects and other biological systems. This work is primarily concerned with negotiation, in which multiple self-interested agents can reach agreement over the exchange of operations on competitive resources. An experimental analysis was performed to validate the influence of the negotiation mechanism on system performance and to assess the SI technique. Empirical results and statistical evidence illustrate that the negotiation mechanism significantly influences overall system performance, and show the effectiveness of the Artificial Bee Colony algorithm for makespan minimization and machine occupation maximization.
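
A minimal sketch of the objective the agents negotiate over: the makespan of a machine-indexed schedule, with illustrative durations showing how exchanging one operation between machines changes it. The schedule representation and numbers are assumptions for illustration, not the paper's model:

```python
# Makespan here is the busiest machine's completion time, assuming
# operations on a machine run back to back with no idle time.
def makespan(schedule):
    """Return the completion time of the machine that finishes last."""
    return max(sum(durations) for durations in schedule.values())

before = {"M1": [4, 6, 5], "M2": [3, 2]}  # M1 is overloaded: makespan 15
after = {"M1": [4, 6], "M2": [3, 2, 5]}   # one operation exchanged: makespan 10
```

An exchange of operations that lowers this global value is exactly the kind of agreement the self-interested agents are negotiating toward.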

Relevance:

30.00%

Publisher:

Abstract:

This communication aims to present some reflections on the importance of information in the organizational context, especially in the business context. The ability to produce and share expertise and knowledge among employees is now a key factor in the success of any organization. However, it is also true that workers increasingly feel that too much information can hurt their performance. The existence of skilled professionals able to organize, evaluate, select, and disseminate information in organizations appears to be a prerequisite for success. The skills necessary for the formation of a professional devoted to the management of information and knowledge in the context of business organizations will be analysed. Then, data collected in two focus group discussions with students from a graduate course in Business Information at the Polytechnic Institute of Porto, Portugal, will be examined.