914 results for Knowledge-intensive Industry


Relevance:

20.00%

Publisher:

Abstract:

The dominant discourse in education and training policies at the turn of the millennium was on lifelong learning (LLL) in the context of a knowledge-based society. As Green (2002, pp. 611-612) points out, several factors contribute to this global trend: demographic change, as in most advanced countries the average age of the population is increasing because people live longer; the effects of globalisation, including both economic restructuring and cultural change, which have impacts on the world of education; and global economic restructuring, which causes, for example, a more intense demand for higher-order skills, with intensified economic competition forcing a wave of restructuring and creating enormous pressure to train and retrain the workforce. In parallel, the “significance of the international division of labour cannot be underestimated for higher education”, as pointed out by Jarvis (1999, p. 250). The same author goes on to argue that globalisation has exacerbated differentiation in the labour market, with the First World converting faster to a knowledge economy and a service society, while a great deal of the actual manufacturing is done elsewhere.

Relevance:

20.00%

Publisher:

Abstract:

We work at the confluence of knowledge management, organizational memory and emergent knowledge, through the lens of complex adaptive systems. To be fundamentally sustainable, organizations need to manage the ambidexterity of day-to-day work and innovation. An organization is an entity of a systemic nature, composed of groups of people who interact to achieve common objectives, which makes it necessary to capture, store and share the knowledge produced by those interactions; this knowledge can be generated at the intra-organizational or the inter-organizational level. Organizations keep an organizational memory of knowledge supported by information technology and systems. Each organization, especially in times of uncertainty and radical change, needs timely and adequately sized knowledge, both tacit and explicit, to meet the demands of its environment. This sizing is a learning process resulting from the interaction that emerges from the relationship between tacit and explicit knowledge, which we frame within the approach of complex adaptive systems. Using complex adaptive systems to build these emerging interdependent relationships produces emergent knowledge that improves the organization's unique development.

Relevance:

20.00%

Publisher:

Abstract:

This internship report aims to analyse and discuss the competences acquired and developed, during the internship, in microbiological quality control in the pharmaceutical industry. The internship took place in the Microbiology Laboratory of Laboratórios Atral, Grupo AtralCipan, located in Vala do Carregado, Castanheira do Ribatejo. It lasted approximately ten and a half months, starting on 21 September 2011 and ending on 10 August 2012. In an initial phase, the skills needed to apply the methodologies used in the microbiology laboratory were acquired through the study of the applicable standards, procedures and legislation, with a focus on the importance of pharmacopoeias in the pharmaceutical industry, and through training for carrying out assays in Aseptic Processing Areas, learning and understanding the procedures required in them. One of the main methodologies carried out, and developed in this report, was the analysis of sterile pharmaceutical products through Sterility Tests, using the STERITEST method and the direct method, with results that demonstrated the importance of these techniques, of the assessment of the environment in which they are performed, and of the operator who executes them. Growth Promotion Tests were also explored in this work, performed not only to evaluate culture media, where it was possible to analyse the requirements and the results obtained, but also to validate methodologies, namely the validation of media and fluids of different brands for use in the STERITEST method and the validation of the membrane filtration method for Microbial Enumeration of a non-sterile product. These validations made it possible to reduce costs and to improve the conditions of the analyses and methodologies applied. The internship made it possible to acquire practical skills which, combined with the theoretical knowledge obtained in the master's programme, fostered personal, scientific and professional growth.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to the Instituto Superior de Contabilidade to obtain the degree of Master in Auditing. Supervisor: Doutora Alcina Dias.

Relevance:

20.00%

Publisher:

Abstract:

The use of distributed energy resources based on naturally intermittent power sources, such as wind generation, requires the development of new and adequate operation management and control methodologies for power systems. This paper proposes a short-term Energy Resource Management (ERM) methodology performed in two phases: the first addresses day-ahead ERM scheduling and the second deals with five-minute-ahead ERM scheduling. ERM scheduling is a complex optimization problem due to the large number of variables and constraints. The main goal in this paper is to minimize operation costs from the point of view of a virtual power player that manages the network and the existing resources. The optimization problem is solved by a deterministic mixed-integer non-linear programming approach. A case study considering a distribution network with 33 buses, 66 distributed generators, 32 loads with demand response contracts, 7 storage units and 1,000 electric vehicles has been implemented in a simulator developed in the context of the present work, in order to validate the proposed short-term ERM methodology under dynamic power system behavior.
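To make the cost-minimization goal concrete, the sketch below dispatches a handful of resources in merit order (cheapest first) for a single scheduling period. It is only an illustration under simplifying assumptions: the paper solves a full mixed-integer non-linear program with network constraints, and all resource names, costs and capacities here are hypothetical.

```python
# Merit-order dispatch for a single scheduling period. All resource
# names, marginal costs and capacities are hypothetical; the paper's
# method is a full mixed-integer non-linear program, not this greedy rule.

def dispatch(demand_mw, resources):
    """Greedily schedule the cheapest resources until demand is met."""
    schedule, remaining, total_cost = {}, demand_mw, 0.0
    for name, cost, capacity in sorted(resources, key=lambda r: r[1]):
        used = min(capacity, remaining)
        if used > 0:
            schedule[name] = used
            total_cost += used * cost
            remaining -= used
    if remaining > 1e-9:
        raise ValueError("insufficient capacity to meet demand")
    return schedule, total_cost

resources = [                 # (name, marginal cost EUR/MWh, capacity MW)
    ("wind", 0.0, 40.0),
    ("storage_discharge", 35.0, 10.0),
    ("dg_unit", 60.0, 30.0),
    ("demand_response", 80.0, 15.0),
]
schedule, cost = dispatch(70.0, resources)
print(schedule, cost)   # wind and storage fully used, dg_unit partially
```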

Relevance:

20.00%

Publisher:

Abstract:

With electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information about the consumption patterns of their electricity customers. In this environment, all consumers are free to choose their electricity supplier, and a fair insight into customer behaviour permits the definition of specific contract aspects based on the different consumption patterns. In this paper, Data Mining (DM) techniques are applied to electricity consumption data from a utility's client database. To form the different customer classes and find a set of representative consumption patterns, we use the Two-Step algorithm, a hierarchical clustering algorithm. Each consumer class is represented by the load profile resulting from the clustering operation. Next, to characterize each consumer class, a classification model is constructed with the C5.0 classification algorithm.
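The two-stage pipeline (cluster consumers into classes, then learn a classifier that assigns new consumers to a class) can be sketched as below. Note the stand-ins: scikit-learn provides neither the Two-Step algorithm nor C5.0, so KMeans and a CART decision tree are used instead, on synthetic daily load profiles.

```python
# Sketch of the two-stage pipeline on synthetic 24-point daily load
# profiles. KMeans and a decision tree are stand-ins for the paper's
# Two-Step clustering and C5.0 classifier, which scikit-learn lacks.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
hours = np.arange(24)
# Two hypothetical consumption patterns: residential evening peak
# and commercial daytime plateau.
residential = 1.0 + 0.8 * np.exp(-((hours - 20) ** 2) / 8.0)
commercial = 1.0 + 0.6 * ((hours >= 9) & (hours <= 18))
profiles = np.vstack([
    residential + 0.05 * rng.standard_normal((50, 24)),
    commercial + 0.05 * rng.standard_normal((50, 24)),
])

# Stage 1: cluster consumers; each cluster centroid is a load profile.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
load_profiles = kmeans.cluster_centers_

# Stage 2: train a classifier that assigns new consumers to a class.
clf = DecisionTreeClassifier(max_depth=3).fit(profiles, kmeans.labels_)
print(clf.predict(profiles[:3]))
```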

Relevance:

20.00%

Publisher:

Abstract:

This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns in the network are codified using formulas in a Łukasiewicz logic. For this, we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network whose activation function is the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg-Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to the descriptive power of the Łukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase, using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used in the generation of a given truth table. For tests with real data, we selected the Mushroom data set, available in the UCI Machine Learning Repository.
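The key fact, that every Łukasiewicz connective can be evaluated by a single neuron whose activation is the identity truncated to zero and one, can be checked directly. The sketch below evaluates the standard definitions of strong conjunction, implication and negation with such neurons; the weights and biases follow from the algebraic definitions.

```python
# Each Lukasiewicz connective is computed exactly by one neuron whose
# activation is the identity truncated to [0, 1] (clipped linear unit).
def clip01(t):
    return min(1.0, max(0.0, t))

def neuron(weights, bias, inputs):
    return clip01(sum(w * x for w, x in zip(weights, inputs)) + bias)

def luk_and(x, y):      # strong conjunction: max(0, x + y - 1)
    return neuron((1.0, 1.0), -1.0, (x, y))

def luk_implies(x, y):  # implication: min(1, 1 - x + y)
    return neuron((-1.0, 1.0), 1.0, (x, y))

def luk_not(x):         # negation: 1 - x
    return neuron((-1.0,), 1.0, (x,))

# Spot-check against the algebraic definitions on a small grid.
for x in (0.0, 0.3, 1.0):
    for y in (0.0, 0.7, 1.0):
        assert abs(luk_and(x, y) - max(0.0, x + y - 1.0)) < 1e-12
        assert abs(luk_implies(x, y) - min(1.0, 1.0 - x + y)) < 1e-12
        assert abs(luk_not(x) - (1.0 - x)) < 1e-12
print("all connectives reproduced by clipped-identity neurons")
```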

Relevance:

20.00%

Publisher:

Abstract:

Introduction: This paper deals with the increasing use of corporate merger and acquisition strategies in the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcomes of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, which were sampled for their successful use of the tactics in question. Materials and Methods: To build an overview of their development through mergers and acquisitions, the historical data of the two corporations were consulted on their official websites. The most relevant events were then matched with corresponding information from the corporations' financial reports and statements, as published by web-based financial data providers. Results and Discussion: In the past few decades, Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify their product and service portfolios, and survive and surpass competitors. The consequences proved positive, although this approach requires considerable capital availability. Conclusions: The results reveal that, for the two sampled companies, acquisitions and mergers are reactions to the pressure of a highly competitive environment. The continuous diversification of the market's needs is another consistent motive. However, the prevalence and prominence of merger and acquisition strategies are conditioned by the tender offer, the announcer's caliber, research and development status, and other factors determined by the internal and external actors of the market.

Relevance:

20.00%

Publisher:

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, the deregulation process was conducted in stages, beginning with the clients at higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, the electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive closer generation to reduce the exceeded power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power will be lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas that may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
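A minimal version of the zone-identification step can be sketched as follows: each bus is described by a vector of hourly LMPs, and buses are grouped with k-means, one of the two clustering algorithms named above. The LMP data below are synthetic (an uncongested group plus a congested group priced above it), not CAISO data.

```python
# Sketch of zone identification: cluster buses by their hourly LMP
# vectors with k-means. The prices are synthetic; a bus's LMP is
# commonly viewed as energy + congestion + loss components.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
energy = 40.0 + 5.0 * np.sin(np.linspace(0, 2 * np.pi, 24))  # $/MWh
lmp = np.vstack([
    energy + 0.5 * rng.standard_normal((20, 24)),          # uncongested buses
    energy + 12.0 + 0.5 * rng.standard_normal((10, 24)),   # congested zone
])

zones = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(lmp)
for z in range(2):
    print(f"zone {z}: {np.sum(zones == z)} buses, "
          f"mean LMP {lmp[zones == z].mean():.1f} $/MWh")
```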

Relevance:

20.00%

Publisher:

Abstract:

Power system planning, control and operation require an adequate use of existing resources in order to increase system efficiency. The use of optimal solutions in power systems allows huge savings, stressing the need for adequate optimization and control methods. These must be able to solve the envisaged optimization problems in time scales compatible with operational requirements. Power systems are complex, uncertain and changing environments, which makes the use of traditional optimization methodologies impracticable in most real situations. Computational intelligence methods present good characteristics to address this kind of problem and have already proved efficient for very diverse power system optimization problems. Evolutionary computation, fuzzy systems, swarm intelligence, artificial immune systems, neural networks, and hybrid approaches are presently seen as the most adequate methodologies to address several planning, control and operation problems in power systems. Future power systems, with intensive use of distributed generation and electricity market liberalization, will be more complex and bring huge challenges to the forefront of the power industry. Decentralized intelligence and decision making require more effective optimization and control techniques, so that the involved players can make the most adequate use of existing resources in the new context. This chapter presents the application of computational intelligence methods to several problems of future power systems; four different applications illustrate the promise of these methods and their potential.
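As one concrete instance of the swarm-intelligence family mentioned above, the sketch below applies a basic particle swarm optimisation loop to a toy economic dispatch problem: two generators with quadratic cost curves jointly covering a fixed demand. This is not one of the chapter's four applications; the problem and all coefficients are hypothetical.

```python
# Minimal particle swarm optimisation on a toy economic dispatch:
# two generators with quadratic costs must cover a fixed demand.
# All cost coefficients and PSO parameters are hypothetical.
import random

random.seed(0)
DEMAND = 100.0  # MW

def cost(p1):
    """Total cost with generator 2 covering the residual demand."""
    p2 = DEMAND - p1
    if not (10.0 <= p1 <= 90.0 and 10.0 <= p2 <= 90.0):
        return float("inf")  # penalise generator-limit violations
    return 0.05 * p1**2 + 2.0 * p1 + 0.08 * p2**2 + 1.5 * p2

# Particles search over p1 only; p2 follows from the power balance.
pos = [random.uniform(10.0, 90.0) for _ in range(20)]
vel = [0.0] * 20
pbest = pos[:]
gbest = min(pos, key=cost)

for _ in range(200):
    for i in range(20):
        r1, r2 = random.random(), random.random()
        vel[i] = (0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i])
                  + 1.5 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i]
    gbest = min(pbest, key=cost)

print(f"p1={gbest:.1f} MW, p2={DEMAND - gbest:.1f} MW, cost={cost(gbest):.1f}")
```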

Relevance:

20.00%

Publisher:

Abstract:

Artificial intelligence techniques are being widely used to face the new reality and to provide solutions that can carry power systems through all these changes while assuring high-quality power. In this way, the agents that act in the power industry are gaining access to a generation of more intelligent applications, making use of a wide set of AI techniques. Knowledge-based systems and decision-support systems have been applied in the power and energy industry. This article is intended to offer an updated overview of the application of artificial intelligence in power systems, and it is organized so that readers can easily understand the problems and the adequacy of the proposed solutions. Because of space constraints, this approach can be neither complete nor sufficiently deep to satisfy all readers' needs. As this is a multidisciplinary area, able to attract both software and computer engineering people and power system people, this article tries to give an insight into the most important concepts involved in these applications. Complementary material can be found in the reference list, providing deeper and more specific approaches.

Relevance:

20.00%

Publisher:

Abstract:

Demand response can play a very relevant role in future power systems, in which distributed generation can help assure service continuity in some fault situations. This paper deals with the demand response concept and discusses its use in the context of competitive electricity markets and intensive use of distributed generation. The paper presents DemSi, a demand response simulator that allows the study of demand response actions and schemes using a realistic network simulation based on PSCAD. Demand response opportunities are used in an optimized way, considering flexible contracts between consumers and suppliers. A case study evidences the advantages of using flexible contracts and of optimizing the available generation when there is a lack of supply.
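One elementary reading of the optimization step under a supply shortfall can be sketched as follows: flexible contracted loads are curtailed in ascending order of the price the supplier pays per curtailed kWh until the deficit is covered. This is only an illustration of flexible contracts; DemSi couples such decisions with a PSCAD network simulation, and the contract data below are hypothetical.

```python
# Sketch of one curtailment decision under a supply shortfall:
# flexible contracts are curtailed in ascending order of the price
# the supplier pays per curtailed kWh. Contract data are hypothetical.

def curtail(deficit_kw, contracts):
    """Return per-consumer curtailments covering the deficit cheaply."""
    plan, remaining = {}, deficit_kw
    for consumer, price, flexible_kw in sorted(contracts, key=lambda c: c[1]):
        cut = min(flexible_kw, remaining)
        if cut > 0:
            plan[consumer] = cut
            remaining -= cut
    if remaining > 1e-9:
        raise ValueError("flexible load insufficient; firm load at risk")
    return plan

contracts = [        # (consumer, EUR/kWh paid for curtailment, flexible kW)
    ("factory_A", 0.08, 120.0),
    ("mall_B", 0.12, 60.0),
    ("office_C", 0.05, 40.0),
]
print(curtail(130.0, contracts))  # office_C first, then factory_A
```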

Relevance:

20.00%

Publisher:

Abstract:

The emergence of new business models, namely the establishment of partnerships between organizations, and the opportunity companies have to add existing data from the web, especially from the semantic web, to their own information have highlighted some problems in databases, particularly those related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them and may even lead to their disappearance, since many decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must be able to understand the data, i.e., an associated semantics is needed. The solution presented in this paper uses ontologies: (i) for the specification of data cleaning operations and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level, and given mappings between domain ontologies and an ontology derived from a database, the operations can be instantiated and proposed to the expert/specialist for execution over that database, thus enabling their interoperability.
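The core idea, defining a cleaning operation once at the conceptual level and instantiating it over different schemas through mappings, can be illustrated minimally. In the sketch below, plain Python dicts stand in for the domain ontology, the database-derived ontologies and the mappings between them; all concept, source and column names are hypothetical.

```python
# Sketch: a cleaning operation defined once at the ontology level is
# instantiated over two differently named database schemas through
# concept-to-column mappings. Dicts stand in for the ontology machinery;
# all concept, source and column names are hypothetical.

# Conceptual-level operation: normalise the domain concept "Country".
def normalise_country(value):
    aliases = {"pt": "Portugal", "prt": "Portugal", "portugal": "Portugal"}
    return aliases.get(value.strip().lower(), value.strip())

CLEANING_OPS = {"Country": normalise_country}

# Mappings from the domain concept to each source's schema.
MAPPINGS = {
    "crm_db": {"Country": "cust_country"},
    "billing_db": {"Country": "country_code"},
}

def clean(source, rows):
    """Instantiate every conceptual operation over one source's rows."""
    for concept, op in CLEANING_OPS.items():
        column = MAPPINGS[source].get(concept)
        if column:
            for row in rows:
                row[column] = op(row[column])
    return rows

print(clean("crm_db", [{"cust_country": " pt "}]))
print(clean("billing_db", [{"country_code": "PRT"}]))
```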

Relevance:

20.00%

Publisher:

Abstract:

In recent years there has been a considerable increase in the number of people in need of intensive care, especially among the elderly, a phenomenon related to population ageing (Brown 2003). However, this is not exclusive to the elderly, as diseases such as obesity, diabetes, and high blood pressure have been increasing among young adults (Ford and Capewell 2007). As a new fact, it has to be dealt with by the healthcare sector, and particularly by the public one. Finding new and cost-effective ways of delivering healthcare is therefore of particular importance, especially when the patients are not to be detached from their environments (WHO 2004). Following this line of thinking, a VirtualECare Multiagent System is presented in section 2, with our efforts centered on its Group Decision modules (Costa, Neves et al. 2007) (Camarinha-Matos and Afsarmanesh 2001). On the other hand, there has been a growing interest in combining the technological advances of the information society - computing, telecommunications and knowledge - to create new methodologies for problem solving, namely those based on Group Decision Support Systems (GDSS) and agent perception. Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life cycle. However, effective planning depends on the generation and analysis of ideas (innovative or not), so the idea generation and management processes are crucial. Our objective is to apply the GDSS referred to above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP). This attainment is vital, given the arrival on the market of new drugs and medical practices, which compete for the use of limited resources.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To identify potential prognostic factors for neonatal mortality among newborns referred to intensive care units. METHODS: A live-birth cohort study was carried out in Goiânia, Central Brazil, from November 1999 to October 2000. Linked birth and infant death certificates were used to ascertain the cohort of live-born infants, and an additional active surveillance system of neonatal mortality was implemented. Exposure variables were collected from birth and death certificates. The outcomes were survival (n=713) and death (n=162) in all intensive care units in the study period. Cox's proportional hazards model was applied, and a Receiver Operating Characteristic (ROC) curve was used to compare the performance of the statistically significant variables in the multivariable model. Mortality rates adjusted by birth weight and 5-min Apgar score were calculated for each intensive care unit. RESULTS: Low birth weight and 5-min Apgar score remained independently associated with death. A birth weight of 2,500 g had an accuracy of 0.71 (95% CI: 0.65-0.77) for predicting neonatal death (sensitivity = 72.2%). A wide variation in mortality rates was found among intensive care units (9.5-48.1%), and two of them retained significantly high mortality rates even after adjusting for birth weight and 5-min Apgar score. CONCLUSIONS: This study corroborates birth weight as a sensitive screening variable in surveillance programs for neonatal death, and as a means to target intensive care units with high mortality rates for preventive actions and interventions during the delivery period.
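The kind of ROC analysis described above (birth weight as a screening variable for neonatal death, with sensitivity read off at the 2,500 g cut-off) can be sketched as follows. The cohort below is synthetic, with the same group sizes as the study but otherwise invented values, so it does not reproduce the reported 0.71 accuracy.

```python
# Sketch of the ROC analysis: birth weight as a screening variable for
# neonatal death. The data are synthetic (same group sizes as the study,
# invented distributions); results will differ from the published ones.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
# Hypothetical cohort: deaths skew toward lower birth weights.
weight = np.concatenate([rng.normal(3100, 500, 713),    # survivors
                         rng.normal(2300, 600, 162)])   # deaths
died = np.concatenate([np.zeros(713), np.ones(162)])

# Lower weight should predict death, so score with the negated weight.
fpr, tpr, thresholds = roc_curve(died, -weight)
auc = roc_auc_score(died, -weight)

# Sensitivity at the conventional 2,500 g cut-off.
cut = np.argmin(np.abs(-thresholds - 2500.0))
print(f"AUC={auc:.2f}, sensitivity at 2500 g: {tpr[cut]:.2f}")
```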