94 results for External Knowledge Source


Relevance: 20.00%

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance: 20.00%

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance: 20.00%

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics

Relevance: 20.00%

Abstract:

Dissertation submitted to obtain the Degree of Master in Electrical and Computer Engineering

Relevance: 20.00%

Abstract:

The figure and the oeuvre of Carlos Seixas (1704-42) were rediscovered in the 1930s, chiefly by the British musicologist Macario Santiago Kastner (1908-92), whose writings and editions drew the attention of international scholars and performers to the Portuguese composer. Nevertheless, and in spite of an editorial tradition that is unusual by Portuguese standards, a critical appraisal of the body of sources of Seixas' keyboard sonatas has never been attempted, so that not even their exact number is known. This article deals briefly with the source situation, the attribution of authorship and the distinctive characteristics of Seixas' style, offering a preliminary catalogue of the sonatas and a description of four of the manuscript collections containing them, housed at the National Library of Portugal. Textual peculiarities and problems of works surviving in more than one source are examined, and the question of textual 'banalization' through the incorporation of performance-practice gestures is also discussed, advancing the hypothesis that two different traditions, one written and one 'oral', existed for some of the sonatas and merge in the texts known to us.

Relevance: 20.00%

Abstract:

Eradication of code smells is often pointed out as a way to improve the readability, extensibility and design of existing software. However, code smell detection remains time-consuming and error-prone, partly due to the inherent subjectivity of the detection processes presently available. To mitigate this subjectivity problem, this dissertation presents a tool, developed as an Eclipse plugin, that automates a technique for the detection and assessment of code smells in Java source code. The technique is based upon a Binary Logistic Regression model that uses complexity metrics as independent variables and is calibrated by experts' knowledge. An overview of the technique is provided, and the tool is described and validated with an example case study.
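
As a rough illustration of this kind of detection model (not the dissertation's actual calibration), the sketch below trains a binary logistic regression on a few hypothetical complexity metrics labelled by experts and then scores an unseen method for smell likelihood. All metric names, sample data and the 0.5 decision threshold are illustrative assumptions.

```python
# Hedged sketch: logistic-regression-based code smell scoring on complexity metrics.
# Metric names, sample data and the 0.5 threshold are illustrative assumptions,
# not the expert-calibrated model described in the dissertation.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [lines_of_code, cyclomatic_complexity, number_of_parameters]
X_train = np.array([
    [ 12,  2, 1],
    [ 35,  4, 2],
    [120, 15, 6],
    [200, 25, 8],
    [ 18,  3, 1],
    [150, 18, 5],
])
# Expert labels: 1 = method judged smelly, 0 = clean.
y_train = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Score an unseen method: probability that it exhibits the smell.
candidate = np.array([[90, 12, 4]])
p_smell = model.predict_proba(candidate)[0, 1]
print(f"Smell probability: {p_smell:.2f}")
if p_smell > 0.5:   # the decision threshold would itself be tuned with expert knowledge
    print("Flag for review")
```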

Relevance: 20.00%

Abstract:

This research aims to provide a better understanding of how firms stimulate knowledge sharing through the use of collaboration tools, in particular Emergent Social Software Platforms (ESSPs). It focuses on the distinctive applications of ESSPs and on the initiatives that contribute to maximizing their advantages. In the first part of the research, I itemized the existing types of collaboration tools and classified them into categories according to their capabilities, their objectives and their capacity to promote knowledge sharing. In the second part, based on an exploratory case study at Cisco Systems, I identified the main applications of an existing enterprise social software platform named Webex Social. By combining qualitative and quantitative approaches, as well as data collected from survey results and from the analysis of the company's documents, I expect to maximize the outcome of this investigation and reduce the risk of bias. Although the effects cannot be generalized from a single case study, some utilization patterns emerge from the data collected and potential trends in managing knowledge are observed. The results of the research also make it possible to identify most of the constraints experienced by the users of the firm's social software platform. Ultimately, this research should provide a primary framework for firms planning to create or implement a social software platform and for firms willing to increase adoption levels and promote the overall participation of users. It highlights the common traps that developers should avoid when designing a social software platform and the capabilities that it should inherently carry to support an effective knowledge management strategy.

Relevance: 20.00%

Abstract:

The importance of data warehousing and business intelligence systems is increasingly pronounced, as they give organizations the ability to store, explore and produce value-added information for their decision-making processes. This reality clearly applies to the Portuguese public administration and, in particular, to the bodies with central responsibilities within the Ministry of Health. In the case of Serviços Partilhados do Ministério da Saúde (SPMS), whose mission includes providing the SNS (the Portuguese National Health Service) with central business intelligence systems, the demand from its clients for analytical capabilities in those central systems has been felt very strongly. However, both the cost and the complexity of a large share of these projects have posed a serious threat to their adoption and success. On the one hand, public administration has been strongly encouraged to integrate and adopt open source solutions (with a free licensing model) in its information systems projects. On the other hand, new methodologies for developing software projects, namely Agile methodologies, which present themselves as more flexible, less formal and more likely to succeed, have gained widespread acceptance. To assess the applicability of open source tools and Agile methodologies to business intelligence systems, this work documents the implementation of an organizational project for SPMS, carried out with free, open source tools and following an Agile development methodology.

Relevance: 20.00%

Abstract:

Epistemology in philosophy of mind is a difficult endeavor. Those who believe that our phenomenal life is different from other domains suggest that self-knowledge about phenomenal properties is certain and therefore privileged. Usually, this so-called privileged access is explained by the idea that we have direct access to our phenomenal life. This means that, in contrast to perceptual knowledge, self-knowledge is non-inferential. It is widely believed that this kind of directness involves two different senses: an epistemic sense and a metaphysical sense. Proponents of this view often claim that this is due to the fact that we are acquainted with our current experiences. The acquaintance thesis, therefore, is the backbone of the justification of privileged access. Unfortunately, the whole approach has a profound flaw. For the thesis to work, acquaintance has to be a genuine explanation. Since it is usually assumed that any knowledge relation between judgments and the corresponding objects is merely causal and contingent (e.g. in perception), the proponent of the privileged access view needs to show that acquaintance can do the job. In this thesis, however, I claim that the latter cannot be done. Based on considerations introduced by Levine, I conclude that this approach involves either the introduction of ontologically independent properties or a rather obscure knowledge relation. A proper explanation, however, cannot employ either of the two options. The acquaintance thesis is, therefore, bound to fail. Since the privileged access intuition seems to be vital to epistemology within the philosophy of mind, I explore alternative justifications. After discussing a number of options, I focus on the so-called revelation thesis. This approach states that by simply having an experience with phenomenal properties, one is in a position to know the essence of those phenomenal properties. I argue that, once a solution for the controversial essence claim is found, this thesis is a successful replacement that maintains all the virtues of the acquaintance account without introducing ontologically independent properties or an obscure knowledge relation. The overall solution consists in qualifying the essence claim in the relevant sense, leaving us with an appropriate ontology for phenomenal properties. On the one hand, this avoids employing mysterious independent properties, since the ontological view is physicalist in nature. On the other hand, the approach has the right kind of structure to explain privileged self-knowledge of our phenomenal life. My final conclusion is that the privileged access intuition is in fact veridical. It cannot, however, be justified by the popular acquaintance approach; rather, it is explained by the controversial revelation thesis.

Relevance: 20.00%

Abstract:

Hybrid knowledge bases combine ontologies with non-monotonic rules, joining the best of open-world ontologies and closed-world rules. Ontologies provide a good mechanism for sharing knowledge on the Web that can be understood by both humans and machines, while rules can be used, e.g., to encode legislation or to map between sources of information. Given the dynamics of today's Web, it is important for these hybrid knowledge bases to capture those dynamics and adapt accordingly. To achieve that, it is necessary to create mechanisms capable of monitoring the information flow on the Web. To date, there are no such mechanisms that allow events to be monitored and modifications of hybrid knowledge bases to be performed autonomously. The goal of this thesis is therefore to create a system that combines hybrid knowledge bases with reactive rules, in order to monitor events and perform actions over a knowledge base. To achieve this goal, a reactive system for the Semantic Web is developed in a logic-programming-based approach, accompanied by a language for heterogeneous rule-base evolution whose basis is the RIF Production Rule Dialect, a standard for exchanging rules over the Web.
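
As a rough, hypothetical illustration of the event-condition-action style of reactive rule that such a system combines with a knowledge base (not the actual RIF-PRD-based language developed in the thesis), the sketch below monitors incoming events and evolves a toy fact store when a rule's condition holds. All predicate and event names are invented for the example.

```python
# Hedged sketch of an event-condition-action (ECA) loop over a toy fact store.
# The real system builds on RIF-PRD and logic programming; the predicates and
# event names here are purely illustrative.
facts = {("employee", "alice"), ("department", "sales")}

def condition_new_hire(event, kb):
    # Condition: the event reports a hire and the person is not yet known.
    return event["type"] == "hire" and ("employee", event["name"]) not in kb

def action_assert_employee(event, kb):
    # Action: evolve the knowledge base by asserting a new fact.
    kb.add(("employee", event["name"]))
    print(f"Asserted employee({event['name']})")

reactive_rules = [(condition_new_hire, action_assert_employee)]

def on_event(event, kb):
    """Dispatch an incoming event against every reactive rule."""
    for condition, action in reactive_rules:
        if condition(event, kb):
            action(event, kb)

on_event({"type": "hire", "name": "bob"}, facts)   # fires the rule once
on_event({"type": "hire", "name": "bob"}, facts)   # condition no longer holds
```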

Relevance: 20.00%

Abstract:

This thesis introduces a novel conceptual framework to support the creation of knowledge representations based on enriched semantic vectors, using the classical vector space model extended with ontological support. One of the primary research challenges addressed here relates to the formalization and representation of document contents, where most existing approaches are limited and only take into account the explicit, word-based information in the document. This research explores how traditional knowledge representations can be enriched by incorporating implicit information derived from the complex relationships (semantic associations) modelled by domain ontologies, in combination with the information presented in the documents. The relevant achievements pursued by this thesis are the following: (i) conceptualization of a model that enables the semantic enrichment of knowledge sources supported by domain experts; (ii) development of a method for extending the traditional vector space using domain ontologies; (iii) development of a method to support ontology learning, based on the discovery of new ontological relations expressed in unstructured information sources; (iv) development of a process to evaluate the semantic enrichment; (v) implementation of a proof of concept, named SENSE (Semantic Enrichment kNowledge SourcEs), which makes it possible to validate the ideas established within the scope of this thesis; and (vi) publication of several scientific articles and support for four master's dissertations carried out at the Department of Electrical and Computer Engineering of FCT/UNL. It is worth mentioning that the work developed under the semantic framework covered by this thesis reuses relevant achievements from European research projects, in order to build on approaches that are scientifically sound and coherent and avoid "reinventing the wheel".
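
A minimal sketch of the general idea of enriching a term-based vector with concepts implied by an ontology follows; it is a simplified stand-in for the SENSE approach, with an invented toy ontology and a fixed propagation weight, not the method actually developed in the thesis.

```python
# Hedged sketch: enrich a bag-of-words vector with concepts implied by a toy ontology.
# The ontology, the 0.5 propagation weight and the example text are illustrative assumptions.
from collections import Counter

# Toy "is-a" ontology: each term maps to a broader concept.
ontology_is_a = {
    "sonata": "musical_work",
    "guitar": "instrument",
    "piano": "instrument",
}

def semantic_vector(text, propagation_weight=0.5):
    """Classic term-frequency vector extended with weights for inferred concepts."""
    vector = Counter(text.lower().split())
    for term, count in list(vector.items()):
        concept = ontology_is_a.get(term)
        if concept is not None:
            # Implicit information: part of the term's weight propagates to its superclass.
            vector[concept] += propagation_weight * count
    return dict(vector)

print(semantic_vector("piano sonata piano"))
# {'piano': 2, 'sonata': 1, 'instrument': 1.0, 'musical_work': 0.5}
```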

Relevance: 20.00%

Abstract:

Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may heal cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during the treatment of the other breast, and is therefore at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformity in the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions rely largely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left breast cancer using different radiation therapy techniques, f-IMRT (forwardly planned intensity-modulated RT), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate Monte Carlo model of the linear accelerator used (a Trilogy, Varian Medical Systems) was built with the EGSnrc MC code to determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never before been simulated; the model developed was subsequently included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed with IBM SPSS v20. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and in the pattern of dose spread into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted, while the MC algorithms agreed with each other to within 2%. The PBC algorithm was considered inaccurate for determining dose in heterogeneous media and in build-up regions; therefore, a major effort is under way at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when non-tangential techniques are used. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
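
Since the comparison between techniques above rests on dose-volume histograms, the following is a minimal sketch of how a cumulative DVH can be computed from a set of voxel doses for one structure; the dose values and bin width are invented for illustration and are unrelated to the clinical data analysed in the thesis.

```python
# Hedged sketch: cumulative dose-volume histogram (DVH) for one structure.
# Voxel doses and bin width are illustrative, not clinical data from the thesis.
import numpy as np

voxel_doses_gy = np.array([48.2, 50.1, 49.7, 51.0, 47.5, 50.4, 49.9, 46.8])  # one voxel per entry

bin_width = 0.5
bins = np.arange(0, voxel_doses_gy.max() + bin_width, bin_width)

# Cumulative DVH: fraction of the structure volume receiving at least each dose level.
volume_fraction = [(voxel_doses_gy >= d).mean() * 100 for d in bins]

for dose, vol in zip(bins, volume_fraction):
    print(f"D >= {dose:4.1f} Gy : {vol:5.1f} % of volume")
```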

Relevance: 20.00%

Abstract:

Enhanced biological phosphorus removal (EBPR) is the most economical and sustainable option used in wastewater treatment plants (WWTPs) for phosphorus removal. In this process it is important to control the competition between polyphosphate accumulating organisms (PAOs) and glycogen accumulating organisms (GAOs), since EBPR deterioration or failure can be related to the proliferation of GAOs over PAOs. This thesis focuses on the effect of operational conditions (volatile fatty acid (VFA) composition, dissolved oxygen (DO) concentration and organic carbon loading) on PAO and GAO metabolism. Knowledge of the effect of these operational conditions on EBPR metabolism is very important, since they represent key factors that impact WWTP performance and sustainability. Substrate competition between the anaerobic uptake of acetate and propionate (the main VFAs present in WWTPs) was shown in this work to be a relevant factor affecting PAO metabolism, and a metabolic model was developed that successfully describes this effect. Interestingly, the aerobic metabolism of PAOs was not affected by different VFA compositions, since the aerobic kinetic parameters for phosphorus uptake, polyhydroxyalkanoate (PHA) degradation and glycogen production were relatively independent of the acetate or propionate concentration. This is very relevant for WWTPs, since it simplifies the calibration procedure for metabolic models, facilitating their use in full-scale systems. The DO concentration and the aerobic hydraulic retention time (HRT) affected the PAO-GAO competition, with low DO levels or a lower aerobic HRT being more favourable for PAOs than for GAOs. Indeed, the oxygen affinity coefficient was significantly higher for GAOs than for PAOs, showing that PAOs were far superior at scavenging the often limited oxygen available in WWTPs. Operating WWTPs with low aeration is of high importance for full-scale systems, since it decreases energy costs and can potentially improve WWTP sustainability. Extended periods of low organic carbon load, the most common condition in full-scale WWTPs, also had an impact on PAO and GAO activity. GAOs exhibited a substantially higher biomass decay rate than PAOs under these conditions, revealing a higher survival capacity for PAOs and representing an advantage for PAOs in EBPR processes. This superior survival capacity of PAOs under conditions that closely resemble the full-scale environment was linked to their ability to maintain a residual level of PHA reserves for longer than GAOs, providing them with an effective energy source for aerobic maintenance processes. Overall, this work shows that each of these key operational conditions plays an important role in the PAO-GAO competition and should be considered in WWTP models in order to improve EBPR processes.
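
To make the oxygen-affinity argument concrete: aerobic uptake in activated sludge models is commonly written as a Monod switching function of the dissolved oxygen concentration, in which a lower half-saturation (affinity) coefficient K_O2 keeps the term close to its maximum even at low DO. The expression below is the standard Monod form, shown only as an illustration of the mechanism and not as the calibrated model from this thesis:

\[
  r_{\mathrm{O_2}} \;=\; r_{\max}\,\frac{S_{\mathrm{O_2}}}{K_{\mathrm{O_2}} + S_{\mathrm{O_2}}}
\]

With the coefficient K_O2 reported here as substantially higher for GAOs than for PAOs, this ratio stays closer to its maximum for PAOs under oxygen-limited aeration, which is the mechanistic reading of the PAO advantage at low DO.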

Relevance: 20.00%

Abstract:

Digital Microfluidics (DMF) is a second-generation technique, derived from conventional microfluidics, that instead of using continuous liquid flows manipulates individual droplets driven by external electric signals. In this thesis a new DMF control and sensing system for visualization, droplet control (movement, dispensing, merging and splitting) and real-time impedance measurement was developed. The software for the proposed system was implemented in MATLAB with a graphical user interface. An Arduino was used as the control board, and dedicated circuits for voltage switching and contacts were designed and implemented on printed circuit boards. A high-resolution camera was integrated for visualization. In this new approach, the DMF chips are driven by a dual-tone signal in which the sum of two independent AC signals (one for droplet operations and the other for impedance sensing) is applied to the electrodes and afterwards independently evaluated by a lock-in amplifier. This makes it possible to choose appropriate amplitudes and frequencies for the different purposes (actuation and sensing). The measurements were used to evaluate the droplet impedance in real time, giving access to its position and velocity. This approach opens new possibilities for impedance sensing and feedback control in DMF devices.
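
As a rough numerical illustration of the dual-tone idea (a low-frequency, high-amplitude component for droplet actuation summed with a higher-frequency, low-amplitude component for sensing, recovered by lock-in-style demodulation), the sketch below builds such a signal and extracts the amplitude of the sensing tone. Frequencies, amplitudes and the demodulation scheme are illustrative assumptions, not the parameters of the system built in the thesis.

```python
# Hedged sketch: dual-tone signal (actuation + sensing) and lock-in-style recovery
# of the sensing-tone amplitude. All numbers are illustrative assumptions.
import numpy as np

fs = 1_000_000                      # sampling rate, Hz
t = np.arange(0, 0.01, 1 / fs)      # 10 ms window

f_act, a_act = 1_000, 100.0         # actuation tone: low frequency, high amplitude
f_sense, a_sense = 100_000, 1.0     # sensing tone: high frequency, low amplitude

# Signal applied to the electrode: sum of the two independent AC components.
v = a_act * np.sin(2 * np.pi * f_act * t) + a_sense * np.sin(2 * np.pi * f_sense * t)

# Lock-in-style demodulation: multiply by in-phase/quadrature references at f_sense
# and low-pass (here: average) to reject the actuation tone.
ref_i = np.sin(2 * np.pi * f_sense * t)
ref_q = np.cos(2 * np.pi * f_sense * t)
amp_sense = 2 * np.hypot(np.mean(v * ref_i), np.mean(v * ref_q))

print(f"Recovered sensing-tone amplitude: {amp_sense:.3f} (expected {a_sense})")
# In the real system, demodulating the measured sensing current rather than the applied
# voltage yields the droplet impedance, from which position and velocity are inferred.
```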

Relevance: 20.00%

Abstract:

The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions, are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies have demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they give rise to particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants, and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience are primarily shaped by a relatively small number of producers. In recent years, however, three interconnected interest groups (the makers, hackerspaces, and open source hardware communities) have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.