997 results for multi-agent systems
Abstract:
This study sought to verify the influence of supply chain agents on new product development performance when these agents are analyzed jointly. The motivation for this research came from studies that called for treating supply chain integration as a multidimensional construct, encompassing the involvement of manufacturing, suppliers and customers in new product development, and from the lack of information on the individual influence of these agents on new product development. Under these considerations, an analytical model was built based on Social Capital Theory and Absorptive Capacity, hypotheses were formulated from the literature review, and constructs such as cooperation, supplier involvement in new product development (NPD), customer involvement in NPD, manufacturing involvement in NPD, anticipation of new technologies, continuous improvement, NPD operational performance, NPD market performance and NPD business performance were connected. To test the hypotheses, three moderating variables were considered: environmental turbulence (low, medium and high), industry (electronics, machinery and transport equipment) and location (America, Europe and Asia). The model was tested with data from the High Performance Manufacturing project, which contains 339 companies from the electronics, machinery and transport equipment industries located in eleven countries. The hypotheses were tested through Confirmatory Factor Analysis (CFA), including multi-group moderation for the three moderating variables mentioned above. The main results indicated that the hypotheses related to cooperation were confirmed in medium-turbulence environments, while the hypotheses related to NPD performance were confirmed in low-turbulence environments and in Asian countries. Additionally, under the same conditions, suppliers, customers and manufacturing influence new product performance differently. Supplier involvement directly influences operational performance and indirectly influences market and business performance at low levels of environmental turbulence, in the transport equipment industry and in American and European countries. Similarly, customer involvement directly influenced operational performance and indirectly influenced market and business performance at medium levels of environmental turbulence, in the machinery industry and in Asian countries. Suppliers and customers do not directly influence market and business performance and do not indirectly influence operational performance. Manufacturing involvement did not influence any type of new product development performance in any of the tested scenarios.
Abstract:
Aiming to establish a methodology capable of segregating market regimes and identifying the predominant characteristics of the investors operating in a given financial market, this work employs simulations generated by an agent-based Artificial Financial Market, using a Genetic Algorithm to fit these simulations to the observed historical data. To that end, an application was developed for the Bovespa index futures market. The methodology could easily be extended to other financial markets through simple parameterization of the model. Building on the foundations established by Toriumi et al. (2011), significant contributions were achieved, adding knowledge about the chosen target market, about modeling techniques for Artificial Financial Markets and about the application of Genetic Algorithms to financial markets, resulting in experiments and analyses that suggest the effectiveness of the proposed method.
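The abstract does not give implementation details, but the core loop it describes – a genetic algorithm adjusting the parameters of an agent-based artificial market until its simulated price series matches an observed history – can be sketched as below. All names (`simulate_market`, the agent-mix parameters, the negative-RMSE fitness) are illustrative assumptions, not the thesis's actual model of the Bovespa index futures market.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_market(params, n_steps, p0=100.0):
    """Toy agent-based price dynamics: fundamentalists pull the price toward a
    fundamental value, chartists extrapolate the last return, noise traders add shocks."""
    w_fund, w_chart, noise = params
    prices = [p0, p0]
    fundamental = p0
    for _ in range(n_steps - 2):
        last_ret = prices[-1] - prices[-2]
        demand = (w_fund * (fundamental - prices[-1])
                  + w_chart * last_ret
                  + noise * rng.normal())
        prices.append(prices[-1] + 0.1 * demand)
    return np.array(prices)

def fitness(params, observed):
    sim = simulate_market(params, len(observed), p0=observed[0])
    return -np.sqrt(np.mean((sim - observed) ** 2))   # negative RMSE: higher is better

def genetic_fit(observed, pop_size=40, generations=60):
    bounds = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 5.0]])  # per-parameter ranges
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(ind, observed) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)   # uniform crossover
            child += rng.normal(0, 0.05, size=3)          # Gaussian mutation
            children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
        pop = np.vstack([parents, children])
    return max(pop, key=lambda ind: fitness(ind, observed))

observed = 100 + np.cumsum(rng.normal(0, 1, size=250))   # stand-in for the real price history
print("fitted agent-mix parameters:", genetic_fit(observed))
```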
Abstract:
Information and communication technologies (ICT) are present in the most diverse areas and everyday activities, but despite the actions of governments and private institutions, the computerization of healthcare remains an open challenge in Brazil. The current situation raises questions about the difficulties associated with computerizing health practices and about the effects these difficulties have had on Brazilian society. In order to discuss these issues, this thesis presents four articles on the health informatization process in Brazil. The first article reviews the literature on ICT in health and, based on two theoretical perspectives – European studies on Health Information Systems (HIS) in developing countries and studies on Health Information and Informatics within the Sanitary Reform Movement – formulates an integrated model that combines dimensions of analysis and contextual factors for understanding HIS in Brazil. The second article presents the theoretical and methodological concepts of Actor-Network Theory (ANT), an approach for studying the controversies associated with scientific discoveries and technological innovations through the networks of actors involved in such actions. This approach has underpinned IS studies since the 1990s and inspired the analyses of the two empirical articles of this thesis. The last two articles were written from the analysis of the implementation of an HIS in a Brazilian public hospital between 2010 and 2012. For the case analysis, the actors involved in the controversies that arose during the HIS implementation were followed. The third article focused on the activities of the systems analysts and users involved in the HIS implementation. The changes observed during the system implementation reveal that the success of the HIS was not achieved through the strict, technical execution of the initially planned activities. On the contrary, success was built collectively, through negotiation among the actors and through devices of interessement introduced during the project. The fourth article, based on the concept of Information Infrastructures, discussed how the CATMAT system was incorporated into E-Hosp. The analysis revealed how the installed base of CATMAT was a relevant condition for its choice during the E-Hosp implementation. In addition, the heterogeneous negotiations and operations that took place during the incorporation of CATMAT into the E-Hosp system are described. Thus, this thesis argues that the implementation of an HIS is a collective construction endeavor, involving systems analysts, health professionals, politicians and technical artifacts. Furthermore, it showed how HIS inscribe definitions and agreements, influencing the preferences of actors in the health field.
Abstract:
The present study is an ethnographic investigation based on the Theory of Social Representations and its complementary approach, the Central Core Theory, drawing on the Bourdieusian concepts of field and habitus, on the understanding that these concepts, articulated with the socially constructed representation, may contribute to the study of social identities. Its aim is to identify which identity references community health agents (CHA) – agents of the Community Health Agent Program (CHAP) and of the Family Health Program (FHP) of João Pessoa, PB – hold, and which social representation they construct of health education. The study had the participation of 119 CHAs, of whom 90.3% were female and 9.7% were male. Since identity is also built from the representation others hold of the group, 63 professionals of the FHP teams (16 nurses, 16 nursing assistants, 12 doctors, 9 dentists, 6 dental office assistants, 4 coordinators, 1 psychologist and 1 receptionist) and 1 nurse from the CHAP also took part in the study; official documents from the Ministry of Health were analyzed, verbal information from its representatives was taken into consideration, as well as reports from the various beneficiaries of the CHA, CHAP and FHP. For data collection we combined (a) direct observation and participant observation in the micro-areas where the CHAs work, at the Family Health Units and at the agents' union; (b) free association of words and expressions to the stimuli "CHA", "health education" and "health"; (c) a questionnaire; and (d) interviews. The interviews were submitted to thematic content analysis. The free associations were analyzed following the Vergès approach (a combination of the frequency and the average order of evocation), whose treatment enabled the identification of the central and peripheral systems of the social representations of health education and of the community health agent. A central-core refutation test, associated with the analysis of the evocations indicated as most important, provided empirical evidence of the social representation of health education as "orientation", "prevention" and "hygiene", and of the identity of the CHA as "supervisor", "friend", "help", "important" and the "link" between the community and the Family Health Team. The other professionals from the CHAP, the FHP and the Ministry of Health share these representational contents, especially the notions of "friend" and "link", which are also shared by the community. A habitus of the community health agents was identified, as a representation based on trust and friendship, which gives these professionals great importance in the face of the daily difficulties experienced by the community.
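The free-association treatment mentioned here (the Vergès approach, combining evocation frequency with average order of evocation) is essentially a small computation. Below is a minimal sketch of that four-quadrant analysis; the word lists and cutoffs are invented for illustration and are not the study's data.

```python
from collections import defaultdict

# Each participant's evocations for a stimulus such as "health education", in the order produced.
evocations = [
    ["orientation", "prevention", "hygiene"],
    ["prevention", "help", "orientation"],
    ["hygiene", "orientation", "friend"],
    ["prevention", "orientation", "link"],
]

freq = defaultdict(int)        # how many times each word was evoked
rank_sum = defaultdict(float)  # sum of the positions in which the word appeared

for answer in evocations:
    for position, word in enumerate(answer, start=1):
        freq[word] += 1
        rank_sum[word] += position

mean_rank = {w: rank_sum[w] / freq[w] for w in freq}
freq_cut = sum(freq.values()) / len(freq)   # average frequency as the frequency cutoff
rank_cut = 2.0                              # average order-of-evocation cutoff (illustrative)

# Vergès four-quadrant classification: frequent and early-evoked words are central-core candidates.
for word in sorted(freq, key=lambda w: (-freq[w], mean_rank[w])):
    if freq[word] >= freq_cut and mean_rank[word] <= rank_cut:
        zone = "central core"
    elif freq[word] >= freq_cut:
        zone = "first periphery"
    elif mean_rank[word] <= rank_cut:
        zone = "contrast zone"
    else:
        zone = "second periphery"
    print(f"{word:12s} freq={freq[word]} mean_rank={mean_rank[word]:.2f} -> {zone}")
```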
Abstract:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) is presented, highlighting its properties, key features and industrial applications. Bilinear GPC, the basis for the development of this thesis, is presented through the time-step quasilinearization approach. Some results obtained with this controller are shown to demonstrate its better performance compared to linear GPC, since bilinear models better represent the dynamics of certain processes. Because it is an approximation, time-step quasilinearization introduces a prediction error that limits the performance of this controller as the prediction horizon increases. To minimize this error, Bilinear GPC with iterative compensation is presented, seeking better performance than classic Bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is discussed in order to overcome the shortcomings of controllers based on a single model when they are applied over wide operating ranges. Methods for measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications to simulated distillation columns, whose behavior is close to that of real columns, were carried out, and the results were satisfactory.
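To make the quasilinearization idea concrete: a bilinear model contains a product term between input and output, and time-step quasilinearization freezes that product at the most recent measurement, so the multi-step prediction error grows with the horizon. The sketch below illustrates this effect on a toy single-input single-output bilinear model; the coefficients are arbitrary assumptions and unrelated to the distillation column studies of the thesis.

```python
import numpy as np

# Toy SISO bilinear model: y(k+1) = a*y(k) + b*u(k) + c*y(k)*u(k)
a, b, c = 0.8, 0.5, 0.3

def true_bilinear(y0, u_seq):
    """Exact multi-step response of the bilinear model."""
    y, traj = y0, []
    for u in u_seq:
        y = a * y + b * u + c * y * u
        traj.append(y)
    return np.array(traj)

def quasilinear_prediction(y0, u_seq):
    """Time-step quasilinearization: the bilinear term is frozen at the current
    measurement y0, giving a linear-in-u predictor over the whole horizon."""
    y, traj = y0, []
    for u in u_seq:
        y = a * y + (b + c * y0) * u   # y0 held constant -> source of the prediction error
        traj.append(y)
    return np.array(traj)

y0 = 1.0
u_seq = 0.6 * np.ones(10)             # constant input over a 10-step horizon
exact = true_bilinear(y0, u_seq)
approx = quasilinear_prediction(y0, u_seq)
for k, (ye, ya) in enumerate(zip(exact, approx), start=1):
    print(f"step {k:2d}: exact={ye:6.3f}  quasilinear={ya:6.3f}  error={abs(ye - ya):6.3f}")
```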
Abstract:
This work presents a new multi-model identification technique based on ANFIS for nonlinear systems. The structure used is the Takagi-Sugeno fuzzy system, in which the consequents are local linear models that represent the system at different operating points and the antecedents are membership functions whose adjustment is performed by the learning phase of the ANFIS neuro-fuzzy technique. The models that represent the system at different operating points can be found with linearization techniques such as the Least Squares method, which is robust against noise and simple to apply. The fuzzy system is responsible for indicating, through the membership functions, the proportion of each model that should be used. The membership functions can be adjusted by ANFIS using neural network algorithms, such as error backpropagation, so that the models found for each region are correctly interpolated and define the contribution of each model for the possible system inputs. In multi-models, this definition of each model's contribution is known as the metric and, since this work is based on ANFIS, it is referred to here as the ANFIS metric. In this way, the ANFIS metric is used to interpolate the various models that compose the system to be identified. Unlike traditional ANFIS, the proposed technique necessarily represents the system in several well-defined regions by unaltered local models whose weighted activation is given by the membership functions. The regions where the Least Squares method is applied are selected manually from graphical analysis of the system behavior or from the physical characteristics of the plant. This selection serves as a basis for defining the local linear models and for generating the initial configuration of the membership functions. The experiments are conducted in a didactic tank with multiple sections, designed and built to highlight the characteristics of the technique. The results obtained with this tank illustrate the performance achieved by the technique in the identification task using different ANFIS configurations, comparing the developed technique with several models using simple metrics and with the NNARX technique, also adapted for identification.
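A minimal sketch of the blending idea described here: local linear models identified by least squares over manually selected operating regions, interpolated by membership functions that indicate how much each model contributes at a given operating point. The Gaussian memberships and the toy nonlinear plant are assumptions for illustration; the thesis adjusts the memberships through ANFIS learning rather than fixing them by hand.

```python
import numpy as np

rng = np.random.default_rng(1)

def plant(u):
    """Toy static nonlinear plant to be identified."""
    return np.sqrt(u) + 0.02 * rng.normal(size=np.shape(u))

# Manually selected operating regions (chosen, as in the thesis, from inspection of the plant).
regions = [(0.0, 2.0), (2.0, 6.0), (6.0, 12.0)]
centers, local_models = [], []

for lo, hi in regions:
    u = np.linspace(lo, hi, 50)
    y = plant(u)
    X = np.column_stack([u, np.ones_like(u)])
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares local linear model y ≈ k*u + b
    local_models.append(theta)
    centers.append((lo + hi) / 2)

centers = np.array(centers)
widths = np.array([hi - lo for lo, hi in regions]) / 2

def membership(u):
    """Gaussian memberships, normalized so the weights sum to one (the 'ANFIS metric')."""
    w = np.exp(-0.5 * ((u - centers) / widths) ** 2)
    return w / w.sum()

def multi_model_output(u):
    w = membership(u)
    local_y = np.array([k * u + b for k, b in local_models])
    return float(w @ local_y)                        # weighted interpolation of local models

for u in [0.5, 3.0, 5.0, 9.0]:
    print(f"u={u:4.1f}  true={np.sqrt(u):5.3f}  multi-model={multi_model_output(u):5.3f}")
```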
Abstract:
Simulations based on cognitively rich agents can become a very intensive computing task, especially when the simulated environment represents a complex system, and the situation becomes worse when time constraints are present. Such simulations would benefit from a mechanism that improves the way agents perceive and react to changes in these environments; in other words, an approach that improves the efficiency (performance and accuracy) of the decision process of autonomous agents in a simulation. In complex environments full of variables, not all information available to the agent is necessarily needed for its decision-making process; what is needed depends on the task being performed. The agent therefore needs to filter incoming perceptions in the same way we do with our attention focus. By using a focus of attention, only the information that really matters to the agent's running context is perceived (cognitively processed), which can improve the decision-making process. The architecture proposed herein structures cognitive agents into two parts: 1) the main part, which contains the reasoning/planning process, the knowledge and the affective state of the agent; and 2) a set of behaviors that are triggered by planning in order to achieve the agent's goals. Each of these behaviors has a focus of attention that is dynamically adjustable at runtime, according to the variation of the agent's affective state. The focus of each behavior is divided into a qualitative focus, responsible for the quality of the perceived data, and a quantitative focus, responsible for the quantity of the perceived data. Thus, the behavior is able to filter the information sent by the agent's sensors and build a list of perceived elements containing only the information necessary to the agent, according to the context of the behavior currently running. Inspired by the human attention focus, the agent is also endowed with an affective state, based on theories of human emotion, mood and personality. This model serves as the basis for the mechanism of continuous adjustment of the agent's attention focus, for both the qualitative and the quantitative focus. With this mechanism, the agent can adjust its focus of attention during the execution of a behavior in order to become more efficient in the face of environmental changes. The proposed architecture can be used very flexibly: the focus of attention can be kept fixed (neither the qualitative nor the quantitative focus changes) or different combinations of qualitative and quantitative focus variation can be used. The architecture was built on a platform for BDI agents, but its design allows it to be used with any other type of agent, since the implementation is made only in the perception layer of the agent. In order to evaluate the proposed contribution, an extensive series of experiments was conducted on an agent-based simulation of a fire-spreading scenario. In the simulations, agents using the proposed architecture are compared with similar agents (with the same reasoning model) that process all the information sent by the environment. Intuitively, the omniscient agents would be expected to be more efficient, since they can consider every possible option before making a decision. However, the experiments showed that attention-focus-based agents can be as efficient as the omniscient ones, with the advantage of being able to solve the same problems in significantly less time. Thus, the experiments indicate the efficiency of the proposed architecture.
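The qualitative/quantitative attention-focus filter can be pictured as a thin layer between the environment and the agent's reasoning. The sketch below is a deliberately simplified assumption about the architecture described above: the affective state is collapsed into a single arousal value, the qualitative focus selects which attributes of a percept survive, and the quantitative focus caps how many percepts per cycle are passed on to deliberation.

```python
from dataclasses import dataclass, field

@dataclass
class AffectiveState:
    arousal: float = 0.5   # 0 = calm, 1 = highly aroused (stands in for emotion/mood/personality)

@dataclass
class AttentionFocus:
    relevant_keys: set = field(default_factory=lambda: {"position", "heat"})  # qualitative focus
    max_percepts: int = 5                                                     # quantitative focus

    def adjust(self, affect: AffectiveState):
        """Continuously adjust the focus: higher arousal -> larger but narrower intake."""
        self.max_percepts = int(3 + 7 * affect.arousal)
        if affect.arousal > 0.8:                      # under stress, drop low-priority attributes
            self.relevant_keys = {"position", "heat"}
        else:
            self.relevant_keys = {"position", "heat", "wind", "humidity"}

    def filter(self, raw_percepts):
        """Keep only the attributes and the number of percepts the current focus allows."""
        trimmed = [{k: v for k, v in p.items() if k in self.relevant_keys}
                   for p in raw_percepts]
        trimmed.sort(key=lambda p: p.get("heat", 0), reverse=True)  # hottest cells first
        return trimmed[: self.max_percepts]

# Usage: a behavior receives only the filtered view of the environment.
affect = AffectiveState(arousal=0.9)
focus = AttentionFocus()
focus.adjust(affect)
environment = [{"position": (x, 0), "heat": x * 10, "wind": 3, "humidity": 40} for x in range(20)]
print(focus.filter(environment))
```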
Abstract:
We propose a new approach to the reduction and abstraction of visual information for robot vision applications. Basically, we propose the use of a multi-resolution representation in combination with a moving fovea to reduce the amount of information in an image. We introduce the mathematical formalization of the moving-fovea approach and the mapping functions that support the model. Two indexes (resolution and cost) are proposed to help choose the model variables. With this theoretical approach it is possible to apply several filters, to compute disparity and to perform motion analysis in real time (less than 33 ms to process an image pair on a notebook with an AMD Turion Dual Core 2 GHz processor). As the main result, the moving fovea allows the robot, most of the time, to keep a possible region of interest visible in both images without physically moving its robotic devices. We validate the proposed model with experimental results.
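A minimal sketch of the multi-resolution moving-fovea representation: concentric windows of roughly fixed output size are taken around the fovea center, each covering a progressively larger area of the image at progressively coarser resolution, so the total pixel count stays roughly constant regardless of image size. The level count, window size and the plain subsampling used here are assumptions; the thesis formalizes the mapping functions and proposes resolution and cost indexes to choose these variables.

```python
import numpy as np

def foveate(image, center, levels=4, win=64):
    """Build a multi-resolution moving-fovea representation of a grayscale image.

    Level 0 covers the whole image at the coarsest resolution; the last level is a
    full-resolution window of roughly win x win pixels around the fovea center."""
    h, w = image.shape
    cy, cx = center
    pyramid = []
    for k in range(levels):
        # Region size shrinks geometrically from the whole image down to the fovea window.
        frac = (win / min(h, w)) ** (k / (levels - 1))
        rh, rw = int(h * frac), int(w * frac)
        y0 = int(np.clip(cy - rh // 2, 0, h - rh))
        x0 = int(np.clip(cx - rw // 2, 0, w - rw))
        region = image[y0:y0 + rh, x0:x0 + rw]
        step_y = max(rh // win, 1)
        step_x = max(rw // win, 1)
        pyramid.append(region[::step_y, ::step_x])    # plain subsampling to ~win x win
    return pyramid

img = np.arange(480 * 640, dtype=np.uint8).reshape(480, 640)
for k, level in enumerate(foveate(img, center=(240, 400))):
    print(f"level {k}: {level.shape} pixels kept")
```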
Abstract:
Postsurgical hypertension may occur as a complication in cardiac patients. To decrease the chance of complications, it is necessary to reduce elevated blood pressure as soon as possible. Continuous infusion of vasodilator drugs, such as sodium nitroprusside (Nipride), quickly lowers the blood pressure in most patients. However, each patient has a different sensitivity to the infusion of Nipride: the parameters and the time delays of the system are initially unknown, and the parameters of the transfer function associated with a particular patient are time-varying. The objective of this study is to develop a procedure for blood pressure control in the presence of parameter uncertainty and considerable time delays. Thus, a multi-model methodology was developed, in which a Predictive Controller is designed a priori for each model. An adaptive mechanism is then needed to decide which controller should be dominant for a given plant.
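A minimal sketch of the supervisory idea described above, under assumptions: a bank of first-order candidate models with different sensitivities to sodium nitroprusside, a supervisor that at each step picks the model with the smallest recent prediction error, and a simple controller (here a proportional law on the pressure error, standing in for the thesis's predictive controller) associated with each model.

```python
import numpy as np

rng = np.random.default_rng(2)

dt = 1.0                            # minutes per control step
target_drop = 30.0                  # desired reduction of mean arterial pressure (mmHg)

# Bank of candidate patient models: drop(k+1) = a*drop(k) + gain*u(k)
model_bank = [(0.9, 0.25), (0.9, 0.75), (0.9, 2.0)]      # low / medium / high drug sensitivity
controller_gains = [1.0 / g for _, g in model_bank]      # simple per-model proportional gains

true_a, true_gain = 0.9, 1.4        # the (unknown) patient, not exactly in the bank
drop, u = 0.0, 0.0
predictions = np.zeros(len(model_bank))
errors = np.zeros(len(model_bank))

for k in range(60):
    # Supervisor: score each model by a forgetting-factor sum of squared prediction errors.
    errors = 0.9 * errors + (predictions - drop) ** 2
    best = int(np.argmin(errors)) if k > 0 else 1
    # Controller associated with the dominant model (proportional stand-in for predictive control).
    u = np.clip(controller_gains[best] * (target_drop - drop) * 0.1, 0.0, 10.0)
    # Each model predicts the next measurement for the next supervisory decision.
    predictions = np.array([a * drop + g * u for a, g in model_bank])
    # True patient response with measurement noise.
    drop = true_a * drop + true_gain * u + rng.normal(0, 0.5)
    if k % 10 == 0:
        print(f"t={k*dt:4.0f} min  dominant model={best}  infusion={u:5.2f}  pressure drop={drop:6.1f}")
```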
Abstract:
The introduction of new digital services in cellular networks, at ever higher transmission rates, has stimulated recent research into ways of increasing data communication capacity and reducing delays in the forward and reverse links of third-generation WCDMA systems. These studies have resulted in new standards, known as 3.5G, published by the 3GPP group, for the evolution of the third generation of cellular systems. In this Master's Thesis, the performance of a 3G WCDMA system, with several base stations and thousands of users, is evaluated with the aid of the planning tool NPSW. Moreover, the performance of the 3.5G techniques of hybrid automatic retransmission and multi-user detection with interference cancellation, candidates for enhancing WCDMA uplink capacity, is verified by means of Matlab simulations of the increase in data communication capacity and the reduction in packet retransmission delays.
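To illustrate one of the 3.5G techniques mentioned (hybrid automatic retransmission), the sketch below simulates Chase-combining HARQ: failed packets are retransmitted and the receiver accumulates the received energy until decoding succeeds, trading extra delay per retransmission against fewer residual losses. The SNR values, the decoding threshold and the Rayleigh fading model are illustrative assumptions, not the NPSW/Matlab setup of the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

def harq_chase(snr_linear, n_packets=10000, max_tx=4, decode_threshold=4.0):
    """Chase combining: each retransmission is soft-combined, so effective SNR accumulates."""
    delays, failures = [], 0
    for _ in range(n_packets):
        combined_snr, tx = 0.0, 0
        while tx < max_tx:
            tx += 1
            combined_snr += snr_linear * rng.exponential()   # Rayleigh-faded SNR of this attempt
            if combined_snr >= decode_threshold:             # decoded successfully
                delays.append(tx)
                break
        else:
            failures += 1                                    # packet lost after max_tx attempts
    return np.mean(delays), failures / n_packets

for snr_db in [0, 3, 6, 9]:
    mean_tx, loss = harq_chase(10 ** (snr_db / 10))
    print(f"SNR {snr_db:2d} dB: mean transmissions = {mean_tx:.2f}, residual loss = {loss:.3%}")
```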
Abstract:
Individual disposal systems for domestic sewage still predominate in Natal, since only 29% of the city is served by a sewerage system. The waste that accumulates in these individual treatment systems must be exhausted periodically, a service provided by private collection companies, some of which cause major damage to the environment. In Natal, only two companies have their own septage (RESTI) treatment systems, which were designed with parameters taken from domestic sewage, resulting in overloaded and inefficient systems. Characterization of the septage therefore becomes essential as a source of design parameters. Thus, this work presents the physical-chemical and microbiological characterization of waste pumped from individual sewage treatment systems. Samples were collected weekly from 5 different trucks at the reception point of the treatment plant, at the preliminary treatment stage. From each truck, 5 samples were taken during discharge in order to form a composite sample. The samples were then taken to the laboratory and analyzed to determine temperature, pH, conductivity, BOD, COD, nitrogen (ammonia and organic), alkalinity, oils, phosphorus, solids, faecal coliforms and helminth eggs. The results were treated as a single database and classified according to generating source (multi- and single-family housing, lodging, health, services and/or food), area of origin (metropolitan, south and north) and type of system (cesspit, septic tank and/or soak pit). These data showed that the type of system most widely adopted in Natal and the metropolitan region is the cesspit, and confirmed the difference between septage from areas whose populations have different social and economic characteristics. The septage was found to have higher concentrations than domestic sewage, except for thermotolerant coliforms, which showed concentrations of 1.38E+07. Among the parameters studied are the median values identified for COD (3,549 mg/L), BOD (973 mg/L) and total solids (3,557 mg/L). The volatile fraction constitutes about 70% of the total solids of the septage. For helminths, the median was 7 eggs/L. In general, the characteristics of the waste followed the variability found in the reviewed literature for all variables, showing wide ranges.
Abstract:
The Car Rental Salesman Problem (CaRS) is a variant of the classical Traveling Salesman Problem, not previously described in the literature, in which a tour of visits can be decomposed into contiguous paths that may be driven in different rental cars. The aim is to determine the Hamiltonian cycle with minimum final cost, considering the cost of the route plus an expected penalty paid for each exchange of vehicles on the route, a penalty due to returning the dropped car to its rental base. This work introduces the general problem and illustrates some examples, also presenting some of its associated variants. An overview of the complexity of this combinatorial problem is outlined to justify its classification as NP-hard. A database of instances for the problem is presented, together with the methodology for its construction. The problem is also the subject of an experimental algorithmic study of six metaheuristic solutions, representing adaptations of the state of the art in heuristic programming. New neighborhoods, construction procedures, search operators, evolutionary agents and cooperation through multiple pheromones are created for this problem. Furthermore, computational experiments and comparative performance tests are conducted on a sample of 60 instances of the created database, aiming to provide an efficient algorithm for the problem. The results show that the transgenetic algorithm achieved the best performance on all instances of the dataset.
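The cost structure described here – route cost plus a penalty for returning each dropped car to the city where it was rented – can be made concrete with a small evaluation function. The sketch below assumes per-car travel-cost matrices and per-car return-penalty matrices, all with invented values; it only evaluates a given tour and car assignment, which is the building block the metaheuristics of the thesis (including the transgenetic algorithm) optimize.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cities, n_cars = 6, 2

# One travel-cost matrix and one return-penalty matrix per rental car (illustrative values).
travel = rng.integers(10, 60, size=(n_cars, n_cities, n_cities))
penalty = rng.integers(5, 30, size=(n_cars, n_cities, n_cities))   # penalty[car, rent_at, drop_at]

def cars_tour_cost(tour, legs):
    """Cost of a Hamiltonian cycle split into contiguous legs, each driven by one car.

    tour -- permutation of cities, e.g. [0, 3, 1, 4, 2, 5] (implicitly returns to tour[0])
    legs -- list of (car, start_index, end_index) covering the cycle contiguously
    """
    cycle = tour + [tour[0]]
    total = 0
    for car, start, end in legs:
        rent_at = cycle[start]
        for i in range(start, end):
            total += travel[car, cycle[i], cycle[i + 1]]
        total += penalty[car, rent_at, cycle[end]]   # return the dropped car to its rental base
    return total

tour = [0, 3, 1, 4, 2, 5]
legs = [(0, 0, 3), (1, 3, 6)]   # car 0 drives the first three edges, car 1 drives the rest
print("total cost:", cars_tour_cost(tour, legs))
```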