957 results for External Knowledge Source
Abstract:
Epistemology in the philosophy of mind is a difficult endeavor. Those who believe that our phenomenal life differs from other domains suggest that self-knowledge about phenomenal properties is certain and therefore privileged. Usually, this so-called privileged access is explained by the idea that we have direct access to our phenomenal life; in contrast to perceptual knowledge, self-knowledge is non-inferential. It is widely believed that this kind of directness involves two different senses: an epistemic sense and a metaphysical sense. Proponents of this view often claim that this is because we are acquainted with our current experiences. The acquaintance thesis is therefore the backbone of the justification of privileged access. Unfortunately, the whole approach has a profound flaw. For the thesis to work, acquaintance has to be a genuine explanation. Since it is usually assumed that any knowledge relation between judgments and the corresponding objects is merely causal and contingent (e.g. in perception), the proponent of the privileged access view needs to show that acquaintance can do the job. In this thesis, however, I claim that this cannot be done. Based on considerations introduced by Levine, I conclude that the approach involves either the introduction of ontologically independent properties or a rather obscure knowledge relation. A proper explanation, however, cannot employ either of the two options. The acquaintance thesis is therefore bound to fail. Since the privileged access intuition seems vital to epistemology within the philosophy of mind, I will explore alternative justifications. After discussing a number of options, I will focus on the so-called revelation thesis, which states that simply by having an experience with phenomenal properties, one is in a position to know the essence of those phenomenal properties. I will argue that, once a solution for the controversial essence claim is found, this thesis is a successful replacement explanation that maintains all the virtues of the acquaintance account without necessarily introducing ontologically independent properties or an obscure knowledge relation. The overall solution consists in qualifying the essence claim in the relevant sense, leaving us with an appropriate ontology for phenomenal properties. On the one hand, this avoids employing mysterious independent properties, since the ontological view is physicalist in nature. On the other hand, the approach has the right kind of structure to explain privileged self-knowledge of our phenomenal life. My final conclusion is that the privileged access intuition is in fact veridical. It cannot, however, be justified by the popular acquaintance approach, but is instead explained by the controversial revelation thesis.
Abstract:
Hybrid knowledge bases are knowledge bases that combine ontologies with non-monotonic rules, joining the best of open-world ontologies and closed-world rules. Ontologies provide a good mechanism for sharing knowledge on the Web in a form that both humans and machines can understand; rules, on the other hand, can be used, e.g., to encode laws or to map between sources of information. Given the dynamics present today on the Web, it is important for these hybrid knowledge bases to capture all these dynamics and adapt themselves accordingly. To achieve that, it is necessary to create mechanisms capable of monitoring the information flow on the Web. To date, there are no mechanisms that allow events to be monitored and hybrid knowledge bases to be modified autonomously. The goal of this thesis is therefore to create a system that combines these hybrid knowledge bases with reactive rules, aiming to monitor events and perform actions over a knowledge base. To this end, a reactive system for the Semantic Web is developed in a logic-programming-based approach, accompanied by a language for heterogeneous rule base evolution based on the RIF Production Rule Dialect, a standard for exchanging rules over the Web.
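To make the notion of reactive rules over a knowledge base concrete, the following is a minimal sketch of an event-condition-action loop in Python; every name and fact in it is hypothetical, and it stands in for, rather than reproduces, the RIF-PRD-based logic-programming system developed in the thesis.

```python
# Minimal event-condition-action (ECA) sketch over a toy knowledge base.
# Hypothetical illustration only; the thesis system is based on RIF-PRD
# and logic programming, not on this Python code.

class KnowledgeBase:
    def __init__(self, facts):
        self.facts = set(facts)

    def assert_fact(self, fact):
        self.facts.add(fact)

    def retract_fact(self, fact):
        self.facts.discard(fact)

# A reactive rule: ON event IF condition DO action (assert/retract).
rules = [
    {
        "on": "price_changed",  # event name to watch for
        "if": lambda kb, e: ("supplier", e["item"]) in kb.facts,
        "do": lambda kb, e: kb.assert_fact(("price", e["item"], e["value"])),
    }
]

def dispatch(kb, event, rules):
    """Fire every rule whose event name and condition both match."""
    for rule in rules:
        if rule["on"] == event["name"] and rule["if"](kb, event):
            rule["do"](kb, event)

kb = KnowledgeBase({("supplier", "widget")})
dispatch(kb, {"name": "price_changed", "item": "widget", "value": 10}, rules)
print(kb.facts)  # the new price fact has been asserted autonomously
```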
Abstract:
This thesis introduces a novel conceptual framework to support the creation of knowledge representations based on enriched Semantic Vectors, using the classical vector space model extended with ontological support. One of the primary research challenges addressed here relates to the process of formalizing and representing document contents, where most existing approaches are limited to the explicit, word-based information in the document. This research explores how traditional knowledge representations can be enriched by incorporating implicit information derived from the complex relationships (semantic associations) modelled by domain ontologies, in addition to the information present in the documents. The relevant achievements pursued by this thesis are the following: (i) conceptualization of a model that enables the semantic enrichment of knowledge sources, supported by domain experts; (ii) development of a method for extending the traditional vector space using domain ontologies; (iii) development of a method to support ontology learning, based on the discovery of new ontological relations expressed in unstructured information sources; (iv) development of a process to evaluate the semantic enrichment; (v) implementation of a proof of concept, named SENSE (Semantic Enrichment kNowledge SourcEs), which makes it possible to validate the ideas established within the scope of this thesis; and (vi) publication of several scientific articles and support for four master's dissertations carried out in the Department of Electrical and Computer Engineering of FCT/UNL. It is worth mentioning that the work developed within the semantic framework covered by this thesis reuses relevant achievements from European research projects, in order to build on approaches considered scientifically sound and coherent and to avoid "reinventing the wheel".
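To illustrate what an ontology-enriched vector can look like, here is a minimal sketch under invented data; the concepts, weights and the `enrich` helper are hypothetical and are not the SENSE implementation.

```python
# Toy sketch of semantic enrichment of a term vector: a document's
# explicit term counts are expanded with implicit, ontology-related
# concepts, weighted by relation strength. All data is hypothetical.

from collections import Counter

# Explicit, word-based representation of a document.
doc_vector = Counter({"heart": 3, "attack": 2})

# Toy domain ontology: (concept, related concept, relation weight).
ontology = [
    ("heart", "myocardium", 0.8),
    ("attack", "infarction", 0.6),
]

def enrich(vector, ontology, factor=0.5):
    """Add implicit concepts to the vector, scaled by relation weight."""
    enriched = Counter(vector)
    for concept, related, weight in ontology:
        if concept in vector:
            enriched[related] += vector[concept] * weight * factor
    return enriched

print(enrich(doc_vector, ontology))
# 'myocardium' and 'infarction' now appear with fractional weights,
# even though neither word occurs in the document itself.
```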
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have produced many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ liable to absorb doses during the treatment of the other breast, putting it at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformation of the target volumes in breast radiotherapy? What happens to the target volumes and to the surrounding healthy tissues? How accurate is the dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions rely largely on Monte Carlo (MC) simulations, using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and the material composition of the organs and tissues involved.
This work investigates the impact of treating left breast cancer using different RT techniques, namely f-IMRT (forwardly planned intensity-modulated RT), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo algorithm (iMC). Furthermore, an accurate MC model of the linear accelerator used (a VARIAN Trilogy) was built with the EGSnrc MC code to determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never before been simulated; the model developed is now included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted, while the MC algorithms presented similar results (within 2% of each other). The PBC algorithm was considered inaccurate in determining the dose in heterogeneous media and in build-up regions; therefore, a major effort is under way at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when using non-tangential techniques. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT.
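For readers unfamiliar with the Monte Carlo method referred to above, the following is a deliberately oversimplified sketch of photon-transport dose scoring in a homogeneous phantom; the attenuation coefficient and geometry are invented, and real RT dose engines such as EGSnrc model far more physics (scattering, electron transport, beam geometry, materials).

```python
# Toy Monte Carlo dose deposition in a 1D homogeneous phantom.
# Purely illustrative; not the EGSnrc/iMC models used in the thesis.

import numpy as np

rng = np.random.default_rng(0)

mu = 0.007          # illustrative attenuation coefficient (1/mm)
depth_mm = 300      # phantom thickness
voxel_mm = 10       # scoring voxel size
n_photons = 100_000

# Sample each photon's free path from an exponential distribution and
# score its energy at the first interaction site (a crude absorption model).
paths = rng.exponential(scale=1.0 / mu, size=n_photons)
bins = np.arange(0, depth_mm + voxel_mm, voxel_mm)
dose, _ = np.histogram(paths[paths < depth_mm], bins=bins)

# Statistical noise falls as the number of simulated photons grows,
# which is why production MC runs use very large histories.
print("relative dose per voxel:", np.round(dose / dose.max(), 2))
```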
Abstract:
Enhanced biological phosphorus removal (EBPR) is the most economical and sustainable option used in wastewater treatment plants (WWTPs) for phosphorus removal. In this process it is important to control the competition between polyphosphate accumulating organisms (PAOs) and glycogen accumulating organisms (GAOs), since EBPR deterioration or failure can be related to the proliferation of GAOs over PAOs. This thesis focuses on the effect of operational conditions (volatile fatty acid (VFA) composition, dissolved oxygen (DO) concentration and organic carbon loading) on PAO and GAO metabolism. Knowledge about the effect of these operational conditions on EBPR metabolism is very important, since they represent key factors that impact WWTP performance and sustainability. Substrate competition between the anaerobic uptake of acetate and propionate (the main VFAs present in WWTPs) was shown in this work to be a relevant factor affecting PAO metabolism, and a metabolic model was developed that successfully describes this effect. Interestingly, the aerobic metabolism of PAOs was not affected by different VFA compositions, since the aerobic kinetic parameters for phosphorus uptake, polyhydroxyalkanoate (PHA) degradation and glycogen production were relatively independent of the acetate or propionate concentration. This is very relevant for WWTPs, since it simplifies the calibration procedure for metabolic models, facilitating their use in full-scale systems. The DO concentration and the aerobic hydraulic retention time (HRT) affected the PAO-GAO competition: low DO levels or a lower aerobic HRT were more favourable to PAOs than to GAOs. Indeed, the oxygen affinity coefficient was significantly higher for GAOs than for PAOs, showing that PAOs are far superior at scavenging the often limited oxygen available in WWTPs. Operating WWTPs with low aeration is of high importance for full-scale systems, since it decreases energy costs and can potentially improve WWTP sustainability. Extended periods of low organic carbon load, the most common condition in full-scale WWTPs, also had an impact on PAO and GAO activity. GAOs exhibited a substantially higher biomass decay rate than PAOs under these conditions, revealing a higher survival capacity for PAOs and representing an advantage for PAOs in EBPR processes. This superior survival capacity of PAOs under conditions closely resembling a full-scale environment was linked to their ability to maintain a residual level of PHA reserves for longer than GAOs, providing them with an effective energy source for aerobic maintenance processes. Overall, this work shows that each of these key operational conditions plays an important role in the PAO-GAO competition and should be considered in WWTP models in order to improve EBPR processes.
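A minimal sketch of how a difference in oxygen affinity can be expressed in a growth model, using Monod kinetics with different half-saturation coefficients; all parameter values below are invented for illustration and are not the ones calibrated in this thesis.

```python
# Toy Monod-kinetics sketch of PAO-GAO competition for limited oxygen.
# Parameter values are invented; the thesis calibrates a full metabolic
# model, not this simplified growth model.

from scipy.integrate import solve_ivp

MU_MAX = {"PAO": 0.04, "GAO": 0.04}  # maximum growth rates (1/h)
K_O2 = {"PAO": 0.05, "GAO": 0.30}    # oxygen half-saturation (mg O2/L);
                                     # a lower K_O2 means higher affinity

def rates(t, y, dissolved_oxygen):
    pao, gao = y
    mu_pao = MU_MAX["PAO"] * dissolved_oxygen / (K_O2["PAO"] + dissolved_oxygen)
    mu_gao = MU_MAX["GAO"] * dissolved_oxygen / (K_O2["GAO"] + dissolved_oxygen)
    return [mu_pao * pao, mu_gao * gao]

# At a low DO level, the population with the lower K_O2 (PAO) grows faster,
# mirroring the competitive advantage reported in the abstract.
sol = solve_ivp(rates, (0, 48), [1.0, 1.0], args=(0.2,))
pao, gao = sol.y[:, -1]
print(f"after 48 h at DO = 0.2 mg/L: PAO = {pao:.2f}, GAO = {gao:.2f}")
```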
Abstract:
INTRODUCTION: Human papillomavirus (HPV) infection is one of the most common sexually transmitted diseases, and persistent HPV infection is considered the most important cause of cervical cancer: it is detected in more than 98% of cases of this type of cancer. This study aimed to determine the level of knowledge concerning human papillomavirus among nursing college students of a private educational institution located in the city of Bauru, SP, and to correlate their knowledge with the course year. METHODS: A descriptive study with a quantitative approach, performed with a questionnaire that permitted the quantification of data and opinions, thus guaranteeing the precision of the results without distortions in analysis or interpretation. The survey was applied to randomly selected 1st-, 2nd-, 3rd- and 4th-year nursing college students. Twenty students from each year were selected during August 2009, totaling 80 students of both genders. RESULTS: 4th-year students had greater knowledge than 1st-year students, reflecting their longer period of study; the lack of knowledge among 1st-year students was attributed to the low level of information acquired before entering college. CONCLUSIONS: The study established the need for complementary studies to determine the profile and knowledge of a larger number of adolescents in relation to HPV. The need for educational programs that can overcome this lack of information is undeniable, especially programs aimed at making adolescents less susceptible to HPV and other STDs.
Abstract:
Digital Microfluidics (DMF) is a second-generation technique derived from conventional microfluidics that, instead of using continuous liquid fluxes, uses individual droplets driven by external electric signals. In this thesis a new DMF control/sensing system for visualization, droplet control (movement, dispensing, merging and splitting) and real-time impedance measurement was developed. The software for the proposed system was implemented in MATLAB with a graphical user interface. An Arduino was used as the control board, and dedicated circuits for voltage switching and contacts were designed and implemented on printed circuit boards. A high-resolution camera was integrated for visualization. In our new approach, the DMF chips are driven by a dual-tone signal: the sum of two independent AC signals (one for droplet operations and the other for impedance sensing) is applied to the electrodes and afterwards independently evaluated by a lock-in amplifier. With this new approach we were able to choose the appropriate amplitudes and frequencies for the different purposes (actuation and sensing). The measurements were used to evaluate the droplet impedance in real time, enabling its position and velocity to be determined. This new approach opens new possibilities for impedance sensing and feedback control in DMF devices.
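A minimal numerical sketch of the dual-tone idea, assuming invented frequencies and amplitudes: two tones are summed on the electrode and a software lock-in recovers the sensing tone's amplitude while rejecting the much larger actuation tone. The thesis uses a hardware lock-in amplifier; this is only an illustration of the principle.

```python
# Toy sketch of dual-tone actuation/sensing with lock-in detection.
# All frequencies and amplitudes are invented for illustration.

import numpy as np

fs = 100_000                    # sampling rate (Hz)
t = np.arange(0, 0.1, 1 / fs)   # 100 ms of signal

f_act, f_sense = 1_000, 10_000  # actuation and sensing frequencies (Hz)
a_act, a_sense = 100.0, 1.0     # actuation dominates the applied signal

# Dual-tone signal applied to the electrode: sum of two independent tones.
applied = (a_act * np.sin(2 * np.pi * f_act * t)
           + a_sense * np.sin(2 * np.pi * f_sense * t))

# Lock-in detection: multiply by in-phase/quadrature references at the
# sensing frequency and average, which rejects the actuation tone.
i = np.mean(applied * np.sin(2 * np.pi * f_sense * t))
q = np.mean(applied * np.cos(2 * np.pi * f_sense * t))
amplitude = 2 * np.hypot(i, q)

print(f"recovered sensing amplitude: {amplitude:.3f} (expected {a_sense})")
```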
Abstract:
The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In still other instances, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio, which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through regulation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions, are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies have demonstrated that technologies are in fact primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs, which determines the affordances of each technology, is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they facilitate particular behaviors and modes of organization but not others requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants, and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience are primarily shaped by a relatively small number of producers. In recent years, however, three interconnected interest groups (the maker, hackerspace, and open source hardware communities) have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and their ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.
Abstract:
The hospital decision-maker's function is to decide on the resources of a health organization, whether financial, material or human, and the knowledge and information that support decision-making and problem-solving are decisive. Decisions rest on models reproduced by decision-makers, on processes, models and principles, which may or may not involve intuition, objectivity, rationality and ethics, and on various techniques that can be limited or conditioned by several factors, such as: the lack of information inherent in the multidisciplinary nature of the process; organizational constraints, internal or external, associated with the organizational environment and culture and with political and macroeconomic influences; the time factor; technology; organizational structure and design; the authority/power and autonomy to decide; leadership; and the legal status of the hospital. This last point is examined in depth in this study. Through this study, we seek to understand whether the components of the decisions taken in hospitals are or are not adapted in line with different hospital governance policies, in differentiated organizational contexts and dynamics, under the different hospital legal statuses: EPE, SPA, PPP and private. An exploratory, descriptive-correlational and cross-sectional study was carried out, based on a questionnaire applied to hospital decision-makers and focusing on the two central vectors of the study: decision-making and hospital legal status. Decision-making is thus an extremely valuable vehicle in the pursuit of the strategies and plans formulated by the hospital, which are expected to produce efficient, efficacious and effective results when applied.
Abstract:
This research seeks to design and implement a WebGIS application that allows high school students to work with information related to the disciplinary competencies of the competency-based teaching model in Mexico. This paradigm assumes that knowledge is acquired through the application of new technologies and linked to situations in students' everyday lives. The WebGIS provides access to maps regarding natural hazards in Mexico, e.g. volcanism, seismic activity, or hurricanes; the prototype's user interface was designed with special emphasis on the needs of high school students.
Abstract:
The existing parking simulations, like most simulations, are intended to gain insights into a system or to make predictions. The knowledge they provide has built up over the years, and several research works have devised detailed parking system models. This thesis describes the use of an agent-based parking simulation in the context of a larger parking system development. It focuses more on flexibility than on fidelity, showing a case where it is relevant for a parking simulation to consume dynamically changing GIS data from external, online sources, and how to address this case. The simulation generates the parking occupancy information that sensing technologies should eventually produce and supplies it to the larger parking system. It is built as a Java application based on the MASON toolkit and consumes GIS data from an ArcGIS Server. The application context of the implemented parking simulation is a university campus with free, on-street parking places.
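As a rough illustration of a simulation that generates the occupancy information sensors would otherwise produce, here is a minimal agent-based sketch; the thesis implementation is a Java/MASON application, and every number and name below is hypothetical.

```python
# Toy agent-based sketch: driver agents arrive, occupy a free on-street
# place for a random duration, and leave; the loop emits the occupancy
# snapshots a sensing infrastructure would otherwise produce.
# Purely illustrative; the thesis uses Java with the MASON toolkit.

import random

random.seed(42)

N_PLACES = 50     # on-street parking places on the campus
ARRIVAL_P = 0.3   # probability that a new driver arrives each step
MAX_STAY = 20     # maximum parking duration (steps)

places = [0] * N_PLACES  # remaining stay per place; 0 means free

for step in range(100):
    # Departures: decrement the remaining stay of occupied places.
    places = [max(0, remaining - 1) for remaining in places]

    # Arrivals: a driver takes the first free place, if any.
    if random.random() < ARRIVAL_P:
        for i, remaining in enumerate(places):
            if remaining == 0:
                places[i] = random.randint(1, MAX_STAY)
                break

    occupied = sum(1 for r in places if r > 0)
    if step % 20 == 0:
        print(f"step {step:3d}: {occupied}/{N_PLACES} occupied")
```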
Abstract:
In the last few years we have observed an exponential increase in information systems, and parking information is one more example. Reliable and up-to-date information on parking slot availability is very important for the goal of reducing traffic, and parking slot prediction is a new topic that has already started to be applied; San Francisco in the USA and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking slot prediction and their integration into a web application, where all kinds of users can see the current parking status as well as future status according to the model's predictions. The source of the data is ancillary in this work, but it still needs to be understood in order to understand parking behaviour. There are many modelling techniques used for this purpose, such as time series analysis, decision trees, neural networks and clustering. In this work, the author describes the techniques best suited to the problem, analyzes the results, and points out the advantages and disadvantages of each one. The model learns the periodic and seasonal patterns of the parking status, and with this knowledge it can predict future status values for a given date. The data comes from Smart Park Ontinyent and consists of parking occupancy status together with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed for the model implementations. The first test used a boosting ensemble classifier, employed over a set of decision trees created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work provides error measurements that indicate how reliable the outcome predictions are. The second test used the function-fitting seasonal exponential smoothing TBATS model. Finally, as the last test, a model combining the previous two was tried, to see the result of this combination. The results were quite good for all of them, with average errors of 6.2, 6.6 and 5.4 vacancies in the predictions of the three models respectively; for a car park of 47 places this means roughly a 10% average error in parking slot predictions. This result could be even better with more data available. In order to make this kind of information visible and reachable by anyone with an Internet-connected device, a web application was built. Besides displaying the data, this application also offers different functions to improve the task of searching for parking. The new functions, apart from parking prediction, were:
- Distances from the user's current location to the different car parks in the city.
- Geocoding: matching a literal description or an address to a concrete location.
- Geolocation: positioning the user.
- A parking list panel: neither a service nor a function, but a better visualization and handling of the information.
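To give a flavour of the seasonal exponential smoothing approach mentioned above, here is a minimal sketch that uses the Holt-Winters seasonal model from statsmodels as a stand-in for TBATS; the synthetic hourly occupancy series and all parameters are invented, not the Smart Park Ontinyent data or the thesis code.

```python
# Minimal sketch: fit a seasonal exponential smoothing model to a
# synthetic hourly occupancy series and forecast the next day.
# Holt-Winters stands in here for the TBATS model used in the thesis.

import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)

# Two weeks of synthetic hourly occupancy (0-47 occupied places),
# with a daily cycle peaking around midday plus some noise.
hours = pd.date_range("2024-01-01", periods=14 * 24, freq="h")
daily = 20 + 15 * np.sin(2 * np.pi * (hours.hour - 6) / 24)
occupancy = pd.Series(
    np.clip(daily + rng.normal(0, 3, len(hours)), 0, 47), index=hours
)

# Fit an additive seasonal model with a 24-hour period and forecast
# the next 24 hours of occupancy.
model = ExponentialSmoothing(
    occupancy, seasonal="add", seasonal_periods=24
).fit()
forecast = model.forecast(24)
print(forecast.round(1).head())
```

Comparing such forecasts against held-out observations, e.g. as an average error in predicted vacancies, is one way to reproduce the kind of evaluation reported above.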