927 results for Capability Hierarchy


Relevance:

10.00%

Publisher:

Abstract:

In a Wastewater Treatment Plant (WWTP), optimising the Anaerobic Digestion (AD) process is essential to increase biogas production, which in turn is converted into energy and is central to the operating profitability of the plant. However, the complexity of sludge Anaerobic Digestion is an obstacle to its optimisation. This work analyses and processes Anaerobic Digestion data using Artificial Neural Networks (ANN), thereby contributing to the understanding of the process and of the impact of selected variables on biogas production. Artificial Neural Networks are computational mathematical models inspired by the functioning of the human brain, capable of capturing complex relationships within a given data set, which is why they were chosen in the search for solutions capable of predicting the behaviour of an AD process. The ANNs were developed with the NeuralTools™ package from Palisade™. As a case study, the methodology was applied to Digester A of SIMRIA's ETAR Sul plant, the company where the curricular internship that gave rise to this work took place. In that context, data covering the last two years of digester operation, available at the company, were used. Despite some limitations in the predictions for a few particular cases, the results obtained generally indicate that the modelled neural networks show good generalisation capability in reproducing the anaerobic process. It is therefore concluded that this study can be a useful contribution to optimising biogas production in the AD of SIMRIA's ETAR Sul, and that ANNs may be a tool worth exploring, both in this area and in other areas of sanitation systems management.
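
The abstract above describes predicting digester behaviour with Artificial Neural Networks trained in NeuralTools™. As a rough illustration of the kind of regression involved, the sketch below trains a small feed-forward network with scikit-learn on synthetic data; the input variables, their values and the library choice are assumptions for illustration only and do not reproduce the models developed in the thesis.

```python
# Minimal sketch of an ANN regression for biogas prediction (not the
# NeuralTools(TM) models used in the thesis). Input variables and data
# are hypothetical placeholders for illustration only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical daily digester records: sludge feed (m3/d), temperature (C),
# pH and volatile solids load (kg/d); target is biogas production (m3/d).
n = 500
X = np.column_stack([
    rng.uniform(50, 150, n),    # feed volume
    rng.uniform(33, 39, n),     # temperature
    rng.uniform(6.8, 7.6, n),   # pH
    rng.uniform(800, 2000, n),  # volatile solids load
])
# Synthetic target: biogas roughly proportional to the volatile solids load,
# modulated by temperature, plus noise.
y = 0.45 * X[:, 3] * (1 + 0.02 * (X[:, 1] - 35)) + rng.normal(0, 50, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X_train, y_train)
print("R2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```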

Relevance:

10.00%

Publisher:

Abstract:

Agent-based modelling and simulation is gaining importance and followers due to its flexibility and its ability to reproduce behaviours and to study a system either from a global perspective or from that of individual interactions. In this work, an agent-based system was built in Repast Simphony with the goal of analysing the diffusion of a new product or service through a network of potential customers, seeking to understand how this transfer of information (innovation) occurs through direct contact between people and how long it takes under different network topologies. The simulation is based on the notion of initiators, the first consumers to adopt a product when it reaches the market, and followers, potential consumers who, despite some predisposition to adopt a new product, usually only do so after being subjected to some form of influence. With the resulting application, several scenarios were simulated in order to obtain and observe the results generated from different initial settings, and charts representing the various scenarios were produced from the simulation output. In practice, the application can be used in the classroom to simulate case studies and, in real cases, as a decision-support tool for companies.
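
For readers unfamiliar with the initiator/follower mechanism described above, the following sketch reproduces the general idea in plain Python with networkx rather than Repast Simphony; the network topology, the initiator fraction and the adoption probability are illustrative assumptions, not parameters from the original model.

```python
# Minimal sketch of innovation diffusion with initiators and followers on a
# network (plain Python/networkx, not the Repast Simphony model described
# above). Topology, thresholds and probabilities are illustrative.
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=200, k=6, p=0.1)   # one of several possible topologies

INITIATOR_FRACTION = 0.05      # first adopters when the product reaches the market
ADOPTION_PROBABILITY = 0.3     # chance a follower adopts per adopting-neighbour contact

adopted = set(random.sample(list(G.nodes),
                            int(INITIATOR_FRACTION * G.number_of_nodes())))

step = 0
while len(adopted) < G.number_of_nodes() and step < 100:
    step += 1
    newly = set()
    for node in G.nodes:
        if node in adopted:
            continue
        # Followers adopt only after direct contact (influence) with adopters.
        for neighbour in G.neighbors(node):
            if neighbour in adopted and random.random() < ADOPTION_PROBABILITY:
                newly.add(node)
                break
    adopted |= newly
    print(f"step {step}: {len(adopted)} adopters")
```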

Relevance:

10.00%

Publisher:

Abstract:

Objectives: The aim of this study is to describe the innovation landscape of the health sector in Portugal, to identify the critical success factors of innovation, and to investigate the impact of innovation on health-sector organisations. Methodology: A quantitative approach was adopted, combining document analysis with statistical analysis of the data collected through the Community Innovation Survey, resulting in an exploratory, descriptive, cross-sectional case study. Main results: The organisations analysed operate mainly in local and regional markets, which account for most of their turnover, 80% of which comes from pre-existing products. Most introduced product, process, organisational or marketing innovations, revealing innovative potential. Most new or significantly improved products were developed in-house, with suppliers, consultants, private R&D institutions and higher-education institutions, located mainly in Portugal and other European countries, as the preferred cooperation partners. The reasons that drive these organisations to innovate are improving product quality and responsiveness to customers and suppliers, diversifying the product range and strengthening the capability to develop new products. Conclusions: The sector shows dynamism in introducing products that are new to the market and to the firm, relying mostly on a closed innovation process. External cooperation is strongly R&D-oriented, and market agents have little involvement in R&D activities through partnerships. They are nevertheless considered important as a source of information, and the organisations seek to respond to their needs. Different types of organisations adopt different innovation strategies according to their market and current situation, reflecting contextual innovation policies in line with current theoretical developments.

Relevance:

10.00%

Publisher:

Abstract:

Final Master's project submitted to obtain the degree of Master in Chemical Engineering.

Relevance:

10.00%

Publisher:

Abstract:

In search of an efficient but simple, low-cost procedure for the serodiagnosis of toxoplasmosis, especially suited for routine laboratories facing technical and budget limitations, as in less developed countries, the diagnostic capability of Hematoxo®, a hemagglutination test for toxoplasmosis, was evaluated against a battery of tests including IgG- and IgM-immunofluorescence tests, hemagglutination and an IgM-capture enzymatic assay. Detecting as little as 5 I.U. of IgG anti-Toxoplasma antibodies, Hematoxo® showed full agreement as to reactivity and non-reactivity for the 443 non-reactive and the 387 reactive serum samples included in this study. In 23 cases presenting a serological pattern of acute toxoplasmosis and showing IgM antibodies, Hematoxo® could detect IgM antibodies in 18, indicated by negativation or a significant decrease in titers after treating the samples with 2-mercaptoethanol. However, a marked increase in sensitivity for IgM-specific antibodies could be achieved by first removing IgG from the sample, as demonstrated in a series of acute toxoplasmosis sera. A simple procedure was developed for this purpose, by reconstituting a lyophilized suspension of Protein A-rich Staphylococcus with the lowest serum dilution to be tested. Low in cost and easy to perform, Hematoxo® affords not only a practical qualitative procedure for screening reactors and non-reactors, as in prenatal services, but also quantitative assays that permit antibody titration as well as the identification of IgM antibodies.

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses a gap in the literature concerning the management of Intellectual Capital (IC) in a port, which is a network of independent organisations that act together in the provision of a set of services. As far as the authors are aware, this type of empirical context has remained unexplored with regard to knowledge management or IC creation/destruction. Indeed, most IC research still focuses on individual firms, despite the more recent interest in the analysis of macro-level units such as regions or nations. In this study, we conceptualise the port as a meta-organisation, which has the generic goal of economic development, both for itself and for the region where it is located. It provides a unique environment due to its complexity as an “organisation” composed of several organisations, connected by interdependency relationships and, typically, with no formal hierarchy. Accordingly, actors’ interests are not always aligned and, in some situations, their individual interests can be misaligned with the collective goals of the port. Moreover, besides having their own interests, port actors also have different sources of influence and different levels of power, which can impact the port’s Collective Intellectual Capital (CIC). Consequently, managing the port’s CIC can be crucial for its goals to be met. With this paper we intend to discuss how the network coordinator (the port authority) manages these complex relations of interest and power in order to foster collaboration and mitigate conflict, thus creating collective intellectual assets or avoiding intellectual liabilities that may emerge for the port as a whole. The fact that we are studying complex and dynamic processes, about which there is a lack of understanding, in a complex and atypical organisation, leads us to consider the case study an appropriate research method. Evidence presented in this study results from preliminary interviews and from document analysis. Findings suggest that the alignment of interests and actions, at both the dyadic and network levels, is critical to develop a context of collaboration/cooperation within the port community and that, accordingly, the port coordinator should make use of different types of power to ensure that the port’s goals are achieved.

Relevance:

10.00%

Publisher:

Abstract:

In a real world multiagent system, where the agents are faced with partial, incomplete and intrinsically dynamic knowledge, conflicts are inevitable. Frequently, different agents have goals or beliefs that cannot hold simultaneously. Conflict resolution methodologies have to be adopted to overcome such undesirable occurrences. In this paper we investigate the application of distributed belief revision techniques as the support for conflict resolution in the analysis of the validity of the candidate beams to be produced in the CERN particle accelerators. This CERN multiagent system contains a higher hierarchy agent, the Specialist agent, which makes use of meta-knowledge (on how the conflicting beliefs have been produced by the other agents) in order to detect which beliefs should be abandoned. Upon solving a conflict, the Specialist instructs the involved agents to revise their beliefs accordingly. Conflicts in the problem domain are mapped into conflicting beliefs of the distributed belief revision system, where they can be handled by proven formal methods. This technique builds on well established concepts and combines them in a new way to solve important problems. We find this approach generally applicable in several domains.
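
As a toy illustration of a higher-hierarchy Specialist agent using meta-knowledge to decide which conflicting belief to abandon, the sketch below implements a much simplified version in Python; the belief structure, the reliability scores and the resolution rule are assumptions for illustration and are not the actual belief-revision machinery of the CERN system.

```python
# Toy sketch of conflict resolution via a "Specialist" agent that uses
# meta-knowledge about how beliefs were produced (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Belief:
    proposition: str          # e.g. "beam_valid"
    value: bool
    source: str               # meta-knowledge: how the belief was derived
    reliability: float        # assumed reliability of that source

@dataclass
class Agent:
    name: str
    beliefs: dict = field(default_factory=dict)

    def assert_belief(self, belief: Belief):
        self.beliefs[belief.proposition] = belief

    def revise(self, proposition: str):
        # Instructed by the Specialist: abandon the conflicting belief.
        self.beliefs.pop(proposition, None)

class Specialist:
    def resolve(self, agents):
        # Detect propositions on which agents hold contradictory values.
        by_prop = {}
        for agent in agents:
            for prop, belief in agent.beliefs.items():
                by_prop.setdefault(prop, []).append((agent, belief))
        for prop, holders in by_prop.items():
            if len({b.value for _, b in holders}) > 1:
                # Keep the belief produced by the most reliable source and
                # instruct the other agents to revise.
                holders.sort(key=lambda ab: ab[1].reliability, reverse=True)
                for agent, _ in holders[1:]:
                    agent.revise(prop)

a1, a2 = Agent("monitor"), Agent("planner")
a1.assert_belief(Belief("beam_valid", True, source="sensor_reading", reliability=0.9))
a2.assert_belief(Belief("beam_valid", False, source="stale_cache", reliability=0.4))
Specialist().resolve([a1, a2])
print(a1.beliefs, a2.beliefs)  # only the sensor-backed belief remains
```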

Relevance:

10.00%

Publisher:

Abstract:

Recent integrated circuit technologies have opened the possibility to design parallel architectures with hundreds of cores on a single chip. The design space of these parallel architectures is huge, with many architectural options. Exploring the design space gets even more difficult if, beyond performance and area, we also consider extra metrics like performance and area efficiency, where the designer tries to obtain the best performance per chip area and the best sustainable performance. In this paper we present an algorithm-oriented approach to designing a many-core architecture. Instead of exploring the design space of the many-core architecture based on experimental execution results for a particular benchmark of algorithms, our approach is to perform a formal analysis of the algorithms, considering the main architectural aspects, and to determine how each architectural aspect relates to the performance of the architecture when running an algorithm or set of algorithms. The architectural aspects considered include the number of cores, the local memory available in each core, the communication bandwidth between the many-core architecture and the external memory, and the memory hierarchy. To exemplify the approach, we carried out a theoretical analysis of a dense matrix multiplication algorithm and derived an equation that relates the number of execution cycles to the architectural parameters. Based on this equation, a many-core architecture was designed. The results obtained indicate that a 100 mm² integrated circuit design of the proposed architecture, using a 65 nm technology, is able to achieve 464 GFLOPs (double-precision floating point) for a memory bandwidth of 16 GB/s. This corresponds to a performance efficiency of 71%. Considering a 45 nm technology, a 100 mm² chip attains 833 GFLOPs, which corresponds to 84% of peak performance. These figures are better than those obtained by previous many-core architectures, except for the area efficiency, which is limited by the lower memory bandwidth considered. The results achieved are also better than those of previous state-of-the-art many-core architectures designed specifically to achieve high performance for matrix multiplication.
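
The equation derived in the paper is not reproduced in the abstract, but the sketch below shows the general shape of such an analytical model for blocked dense matrix multiplication: a roofline-style estimate in which sustained performance is limited by either compute throughput or off-chip bandwidth. The block-size rule, the traffic estimate and all parameter values are assumptions for illustration only.

```python
# Roofline-style sketch of an analytical performance model for blocked dense
# matrix multiplication on a many-core chip. The parameters and the traffic
# estimate are illustrative assumptions, not the exact equation from the paper.
import math

def attainable_gflops(n, cores, local_mem_bytes, bandwidth_gbs,
                      freq_ghz=1.0, flops_per_cycle_per_core=2):
    # Largest square block such that three b x b double-precision tiles
    # (A, B and C) fit in each core's local memory.
    b = int(math.sqrt(local_mem_bytes / (3 * 8)))

    flops = 2 * n ** 3                              # multiply-adds of C = A * B
    # Approximate off-chip traffic of a blocked algorithm: each operand block
    # is reloaded about n / b times, plus one write of the result.
    traffic_bytes = 2 * (n ** 3 / b) * 8 + n ** 2 * 8

    compute_s = flops / (cores * flops_per_cycle_per_core * freq_ghz * 1e9)
    memory_s = traffic_bytes / (bandwidth_gbs * 1e9)
    return flops / max(compute_s, memory_s) / 1e9   # GFLOP/s actually sustained

# Hypothetical configuration; not the design point evaluated in the paper.
print(attainable_gflops(n=4096, cores=256, local_mem_bytes=64 * 1024,
                        bandwidth_gbs=16))
```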

Relevance:

10.00%

Publisher:

Abstract:

Thesis submitted to the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa, in partial fulfillment of the requirements for the degree of Master in Computer Science.

Relevance:

10.00%

Publisher:

Abstract:

Background: The role of persistent organic pollutants (POPs) with endocrine-disrupting activity in the aetiology of obesity and other metabolic dysfunctions has recently been highlighted. Adipose tissue (AT) is a common site of POP accumulation, where they can induce adverse effects on human health. Objectives: To evaluate the presence of POPs in human visceral (vAT) and subcutaneous (scAT) adipose tissue in a sample of Portuguese obese patients who underwent bariatric surgery, and to assess their putative association with metabolic disruption preoperatively, as well as with subsequent body mass index (BMI) reduction. Methods: AT samples (n=189) from obese patients (BMI ≥35) were collected and the levels of 13 POPs were determined by gas chromatography with electron-capture detection (GC-ECD). Anthropometric and biochemical data were collected at the time of surgery. BMI variation was evaluated after 12 months and adipocyte size was measured in the AT samples. Results: Our data confirm that POPs are pervasive in this obese population (96.3% detection in both tissues), their abundance increasing with age (RS=0.310, p<0.01) and duration of obesity (RS=0.170, p<0.05). We observed a difference in POP storage capability between AT depots, with higher ΣPOP levels in vAT (213.9±204.2 versus 155.1±147.4 ng/g of fat, p<0.001), which is extremely relevant when evaluating their metabolic impact. Furthermore, there was a positive correlation between POP levels and the presence of metabolic syndrome components, namely dysglycaemia and hypertension, and, more importantly, with cardiovascular risk (RS=0.277, p<0.01), particularly for vAT (RS=0.315, p<0.01). Finally, we observed an interesting association between higher POP levels and lower weight loss in older patients. Conclusion: Our sample of obese subjects allowed us to highlight the importance of POPs stored in AT in the development of metabolic dysfunction in the context of obesity, shifting the focus to their metabolic effects and not only to their recognition as environmental obesogens.

Relevance:

10.00%

Publisher:

Abstract:

The monitoring of undesirable effects following vaccination is complex. Several confounding factors can lead to merely temporal but spurious associations, which can alter risk perception and cause generalised distrust about the safe use of vaccines. Indeed, vaccines are complex medicines with unique characteristics, so their monitoring requires specifically designed methodological approaches. It is therefore understandable that, since the development of pharmacovigilance, there has been a drive to develop new methodologies that complement the Spontaneous Reporting Systems already in place. We set out to develop and test a new model for vaccine adverse-reaction monitoring, based on self-reporting by users of events following vaccination, and to test its capability to generate disproportionality signals by applying quantitative signal-generation methods to data mining. For that purpose we set up an uncontrolled cohort of users vaccinated in healthcare centres, with a follow-up period of fifteen days. Adverse vaccine events were registered by the users themselves in a paper diary. The data were analysed using descriptive statistics and two quantitative methods of signal generation: the Proportional Reporting Ratio and the Information Component. The methodology allowed the generation of a sufficient body of evidence for signal generation, and four signals were generated. Regarding data mining, the use of the Information Component as a method for generating disproportionality signals appears to increase scientific efficiency by reducing the number of events needed for signal detection. The information reported by users seems valid as an indicator of non-serious adverse vaccine reactions, allowing events to be registered without the bias introduced by the reporter's assessment of the causal relationship. The main adverse events reported were injection-site reactions (62.7%) and fever (31.4%).
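
The two disproportionality measures named in the abstract can be computed from a 2×2 contingency table of reports. The sketch below shows the standard Proportional Reporting Ratio and a simple observed/expected Information Component; the counts are invented, and the IC used in practice (BCPNN) adds Bayesian shrinkage that this simplified version omits.

```python
# Sketch of the two disproportionality measures named in the abstract,
# computed from a 2x2 contingency table of reports. The counts below are
# invented for illustration.
import math

def prr(a, b, c, d):
    """Proportional Reporting Ratio.

    a: reports of the event for the vaccine of interest
    b: reports of other events for that vaccine
    c: reports of the event for all other vaccines
    d: reports of other events for all other vaccines
    """
    return (a / (a + b)) / (c / (c + d))

def information_component(a, b, c, d):
    """Simple (non-shrunk) Information Component: log2(observed / expected)."""
    n = a + b + c + d
    expected = (a + b) * (a + c) / n
    return math.log2(a / expected)

# Hypothetical counts: 12 reports of fever after vaccine X, 188 other events
# for X, 40 fever reports for other vaccines, 1760 other events for them.
a, b, c, d = 12, 188, 40, 1760
print(f"PRR = {prr(a, b, c, d):.2f}")
print(f"IC  = {information_component(a, b, c, d):.2f}")
```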

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a step-count algorithm designed to work in real time using low computational power. This proposal is our first step towards the development of an indoor navigation system based on Pedestrian Dead Reckoning (PDR). We present two approaches to the problem and compare them based on their step-counting error, as well as their suitability for use in a real-time system.
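
The abstract does not specify the two approaches compared; as one common low-cost baseline for this kind of problem, the sketch below counts steps by thresholded peak detection on the accelerometer magnitude. The threshold and debounce interval are assumptions for illustration.

```python
# Illustrative step counter using peak detection on the accelerometer
# magnitude; a common low-cost approach, not necessarily one of the two
# compared in the paper. Thresholds and timing are assumptions.
import math

def count_steps(samples, sample_rate_hz=50,
                threshold=1.2, min_step_interval_s=0.3):
    """samples: sequence of (ax, ay, az) in g. Returns the step count."""
    min_gap = int(min_step_interval_s * sample_rate_hz)
    steps = 0
    last_step_index = -min_gap
    prev_mag = 0.0
    for i, (ax, ay, az) in enumerate(samples):
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        # A step is counted on an upward crossing of the threshold, provided
        # enough time has passed since the previous step (debouncing).
        if prev_mag < threshold <= mag and i - last_step_index >= min_gap:
            steps += 1
            last_step_index = i
        prev_mag = mag
    return steps

# Hypothetical trace: quiet stance (~1 g) interleaved with spikes from steps.
trace = [(0, 0, 1.0)] * 25 + ([(0, 0, 1.5)] + [(0, 0, 1.0)] * 24) * 4
print(count_steps(trace))   # -> 4
```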

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto to obtain the degree of Master in Accounting and Finance, under the supervision of Adalmiro Álvaro Malheiro de Castro Andrade Pereira (MSc).

Relevance:

10.00%

Publisher:

Abstract:

The capability to anticipate a contact with another device can greatly improve the performance and user satisfaction not only of mobile social network applications but of any other application relying on some form of data harvesting or hoarding. One of the most promising approaches to contact prediction is to extrapolate from past experience. This paper investigates the recurring contact patterns observed between groups of devices using an 8-year dataset of wireless access logs produced by more than 70,000 devices. This effort made it possible to model the probability of occurrence of a contact at a predefined date between groups of devices using a power-law distribution that varies according to neighbourhood size and recurrence period. In the general case, the model can be used by applications that need to disseminate large datasets to groups of devices. As an example, the paper presents and evaluates an algorithm that provides daily contact predictions based on the history of past pairwise contacts and their duration. Copyright © 2015 ICST.
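
As an illustration of how a daily contact predictor might consume a history of pairwise contacts, the sketch below scores each device pair with a power-law decay over the recurrence period; the functional form and its parameters are assumptions and do not reproduce the distribution fitted in the paper.

```python
# Sketch of a daily contact predictor driven by past pairwise contacts. The
# power-law form and its parameters are illustrative assumptions.
from collections import defaultdict
from datetime import date

def predict_contacts(contact_log, target_day, alpha=1.2, c=0.5):
    """contact_log: iterable of (device_a, device_b, day) past contacts.
    Returns {pair: probability of a contact on target_day}."""
    history = defaultdict(list)
    for a, b, day in contact_log:
        history[tuple(sorted((a, b)))].append(day)

    predictions = {}
    for pair, days in history.items():
        # Assumed model: each past contact contributes a probability that
        # decays as a power law of the recurrence period (days elapsed).
        score = sum(c * (target_day - d).days ** (-alpha)
                    for d in days if d < target_day)
        predictions[pair] = min(1.0, score)
    return predictions

log = [
    ("dev1", "dev2", date(2015, 3, 2)),
    ("dev1", "dev2", date(2015, 3, 9)),
    ("dev1", "dev3", date(2015, 1, 15)),
]
print(predict_contacts(log, date(2015, 3, 10)))
```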