90 results for Medical Knowledge
Abstract:
Clinical analyses are a valuable element among complementary diagnostic and therapeutic tests, as they provide a wide range of information on the state of health of a user. The aim of the laboratory is to supply reliable, relevant and timely analytical information on biological samples. In health-related matters, and given this purpose, the laboratory's importance is clear, as is the assurance that all the tools are in place for the fulfillment of that purpose. A well-run laboratory cycle, comprising the pre-analytical, analytical and post-analytical phases, is crucial for fulfilling the laboratory's mission rapidly and rigorously. The present work, "Error in the Pre-Analytical Phase: Non-Compliant Samples versus Procedures", carried out within the Master's in Quality and Organization in the Clinical Analysis Laboratory, set out to emphasize the importance of the pre-analytical phase, identified as the phase containing most errors, errors which delay the issuing of results or make them less reliable than desired, and which can lead to false diagnoses and wrong clinical decisions. This phase, which starts with the medical request and ends with the arrival of the biological samples at the laboratory, entails a variety of procedures requiring the intervention of different players, in addition to a great number of factors that influence the sample and its results. These factors, capable of somehow altering the "truth" of the analytical results, must be identified and taken into consideration so that we may be confident that the results support precise diagnoses and a correct evaluation of the user's condition. Collections which, for whatever reason, do not yield samples capable of fulfilling the purpose of their collection, and are therefore non-compliant, constitute an important source of error in the pre-analytical phase. In the present study, we reviewed data on non-compliant blood and urine samples detected at the laboratory under study during the 1st quarter of 2012, in order to establish the types of fault that occur and their frequency. The clinical analysis technicians working at the laboratory were asked to fill out a questionnaire on their daily procedures, thus forming the population for the second part of the project. Completed and returned anonymously, the questionnaire aimed to characterize collection procedures and, hypothetically, confront them with the non-compliant samples observed. In the first semester of 2012, out of a total of 25,319 users, 146 collections had to be repeated due to non-compliance. "Sample not collected" was the most frequent non-compliance (50%), whereas "incorrect identification" had only one occurrence. There were also non-compliances that went unrecorded, such as "inadequate preparation" and "improperly packaged sample". The technicians proved to be competent professionals, knowledgeable about the tasks they perform and committed to carrying them out with quality.
We will certainly not be able to eliminate error, but acknowledging its presence, detecting it and evaluating its frequency will help to decrease its occurrence and improve quality in the pre-analytical phase, giving this phase the relevance it holds within the laboratory process.
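A quick sanity check on the headline figures above, as a hypothetical sketch (the rate calculation is ours, not part of the study's reported analysis):

```python
# Hypothetical back-of-the-envelope check of the figures quoted above.
total_users = 25_319   # users registered in the first semester of 2012
repeats = 146          # collections repeated because the sample was non-compliant

rate = repeats / total_users
print(f"Overall non-compliance rate: {rate:.2%}")      # ~0.58%, roughly 6 per 1,000 users

# "Sample not collected" accounted for about half of the non-compliances:
print(f"Samples not collected: ~{0.50 * repeats:.0f} of {repeats}")  # ~73
```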
Abstract:
We present a qualitative analysis of organizational improvisation and provide a preliminary insight into the following question: how is improvisation present in tightly controlled work environments? We conducted in situ observations of, and interviews with, several emergency medical teams and complemented this information with statistical and media data. Using grounded theory, we developed four propositions, arranged into a model that identifies two levels of use of established routines: (1) a visible side that accommodates contextual requirements, and (2) an improvisational side that responds to the characteristics of the activity. This dual process links pressures operating at the institutional level with practical needs emerging from the operational domain. In contrast with most of the literature, this study reveals that the presence of a broad procedural organizational memory does not restrict improvisation but enables a bureaucratic system to produce flexible improvised performance.
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
A Master's Thesis, presented as part of the requirements for the award of a Research Master's Degree in Economics from NOVA – School of Business and Economics
Abstract:
Dissertation submitted in fulfillment of the requirements for the Degree of Master in Electrical and Computer Engineering
Abstract:
The work presented in this thesis was developed in collaboration with a Portuguese company, BeyonDevices, devoted to the pharmaceutical packaging, medical technology and device industry. Specifically, the composition impact and surface modification of two of the company's polymeric medical devices were studied: inhalers and vaginal applicators. The polyethylene-based vaginal applicator was modified using supercritical fluid technology to acquire self-cleaning properties and prevent the transport of bacteria and yeasts to the vaginal flora. To that end, in-situ polymerization of 2-substituted oxazolines was performed within the polyethylene matrix using supercritical carbon dioxide. The cationic ring-opening polymerization was followed by end-capping with N,N-dimethyldodecylamine. For the same purpose, the polyethylene matrix was also impregnated with lavender oil in supercritical medium. The obtained materials were characterized physically and morphologically, and their antimicrobial activity against bacteria and yeasts was assessed. Materials modified with 2-substituted oxazolines showed effective killing ability against all tested microorganisms, while the materials modified with lavender oil showed no antimicrobial activity. Only materials modified with oligo(2-ethyl-2-oxazoline) maintained their activity in long-term stability tests. Furthermore, the cytotoxicity of the materials was tested, confirming their biocompatibility. Regarding the inhaler, its surface was modified in order to improve powder flowability and, consequently, to reduce powder retention in the inhaler's nozzle. New dry powder inhalers (DPIs), with different needle diameters, were evaluated in terms of internal resistance and uniformity of the emitted dose. They presented a mean resistance of 0.06 cmH2O^0.5/(L/min), and the maximum emitted dose obtained was 68.9% for the inhaler with the larger needle diameter (2 mm). This inhaler was therefore used as a test case and modified by coating with a commonly used force-control agent, magnesium stearate, dried with supercritical carbon dioxide (scCO2), and the uniformity-of-delivered-dose tests were repeated. The modified inhaler showed an increase in emitted dose from 68.9% to 71.3% for lactose and from 30.0% to 33.7% for Foradil.
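For context, internal resistance in the DPI literature is commonly defined as R = √ΔP / Q, so the quoted mean resistance implies a pressure drop across the device at a given inspiratory flow. A minimal sketch under that assumed convention (the formula and the flow values are standard assumptions, not taken from the thesis):

```python
import math

# Sketch under an assumed convention: internal resistance of a dry powder
# inhaler is often defined as R = sqrt(dP) / Q, with dP in cmH2O and Q in L/min.
R = 0.06  # cmH2O^0.5 / (L/min), mean resistance reported above

# Pressure drop across the device at an assumed inspiratory flow of 60 L/min:
Q = 60.0                      # L/min (illustrative value, not from the thesis)
dP = (R * Q) ** 2             # cmH2O
print(f"Pressure drop at {Q:.0f} L/min: {dP:.1f} cmH2O")  # ~13.0 cmH2O

# Flow achieved at the 4 kPa pressure drop used in pharmacopoeial dose tests
# (1 kPa ~ 10.2 cmH2O):
dP_test = 4 * 10.2            # cmH2O
Q_test = math.sqrt(dP_test) / R
print(f"Flow at 4 kPa: {Q_test:.0f} L/min")   # ~106 L/min, i.e. a low-resistance device
```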
Abstract:
This research aims to provide a better understanding of how firms stimulate knowledge sharing through the use of collaboration tools, in particular Emergent Social Software Platforms (ESSPs). It focuses on the distinctive applications of ESSPs and on the initiatives that help maximize their advantages. In the first part of the research, I itemized all types of existing collaboration tools and classified them into categories according to their capabilities, their objectives, and their capacity for promoting knowledge sharing. In the second part, based on an exploratory case study at Cisco Systems, I identified the main applications of an existing enterprise social software platform named Webex Social. By combining qualitative and quantitative approaches, and by combining data collected from survey results with the analysis of the company's documents, I expect to maximize the outcome of this investigation and reduce the risk of bias. Although effects cannot be generalized from a single case study, some usage patterns emerged from the data collected and potential trends in managing knowledge were observed. The results of the research also made it possible to identify most of the constraints experienced by users of the firm's social software platform. Ultimately, this research should provide a preliminary framework for firms planning to create or implement a social software platform, and for firms seeking to increase adoption levels and promote overall user participation. It highlights the common traps that developers should avoid when designing a social software platform, and the capabilities such a platform should inherently carry to support an effective knowledge management strategy.
Abstract:
Epistemology in the philosophy of mind is a difficult endeavor. Those who believe that our phenomenal life is different from other domains suggest that self-knowledge about phenomenal properties is certain and therefore privileged. Usually, this so-called privileged access is explained by the idea that we have direct access to our phenomenal life; that is, in contrast to perceptual knowledge, self-knowledge is non-inferential. It is widely believed that this kind of directness involves two different senses: an epistemic sense and a metaphysical sense. Proponents of this view often claim that this is because we are acquainted with our current experiences. The acquaintance thesis, therefore, is the backbone of the justification of privileged access. Unfortunately, the whole approach has a profound flaw. For the thesis to work, acquaintance has to be a genuine explanation. Since it is usually assumed that any knowledge relation between judgments and the corresponding objects is merely causal and contingent (e.g. in perception), the proponent of the privileged access view needs to show that acquaintance can do the job. In this thesis, however, I claim that this cannot be done. Based on considerations introduced by Levine, I conclude that the approach involves either the introduction of ontologically independent properties or a rather obscure knowledge relation. A proper explanation, however, cannot employ either of the two options. The acquaintance thesis is, therefore, bound to fail. Since the privileged access intuition seems vital to epistemology within the philosophy of mind, I will explore alternative justifications. After discussing a number of options, I will focus on the so-called revelation thesis. This approach states that by simply having an experience with phenomenal properties, one is in a position to know the essence of those phenomenal properties. I will argue that, once a solution is found for the controversial essence claim, this thesis is a successful replacement explanation which maintains all the virtues of the acquaintance account without introducing ontologically independent properties or an obscure knowledge relation. The overall solution consists in qualifying the essence claim in the relevant sense, leaving us with an appropriate ontology for phenomenal properties. On the one hand, this avoids employing mysterious independent properties, since the ontological view is physicalist in nature. On the other hand, the approach has the right kind of structure to explain privileged self-knowledge of our phenomenal life. My final conclusion is that the privileged access intuition is in fact veridical; it cannot, however, be justified by the popular acquaintance approach, but is rather explained by the controversial revelation thesis.
Abstract:
Hybrid knowledge bases are knowledge bases that combine ontologies with non-monotonic rules, joining the best of open-world ontologies and closed-world rules. Ontologies provide a good mechanism for sharing knowledge on the Web in a form that both humans and machines can understand; rules, on the other hand, can be used, e.g., to encode legislation or to map between sources of information. Given the dynamics of today's Web, it is important for these hybrid knowledge bases to capture those dynamics and adapt themselves accordingly. To achieve that, it is necessary to create mechanisms capable of monitoring the information flow on the Web. To date, there are no mechanisms that allow events to be monitored and hybrid knowledge bases to be modified autonomously. The goal of this thesis is therefore to create a system that combines hybrid knowledge bases with reactive rules, aiming to monitor events and perform actions over a knowledge base. To this end, a reactive system for the Semantic Web is developed in a logic-programming-based approach, accompanied by a language for heterogeneous rule base evolution based on the RIF Production Rule Dialect (RIF-PRD), a standard for exchanging rules over the Web.
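To make the intended event-condition-action behavior concrete, here is a minimal, hypothetical sketch of a reactive rule acting over a toy knowledge base (the dictionary-based KB, rule format and event stream are illustrative assumptions; the thesis's actual language builds on RIF-PRD, not this Python shape):

```python
# Minimal event-condition-action (ECA) loop over a toy knowledge base.
# All names here are illustrative inventions, not the thesis's syntax.

kb = {("sensor42", "status"): "ok"}          # toy KB: (subject, property) -> value

def condition(event):
    # Fire only on failure reports for sources currently marked "ok".
    return event["type"] == "failure" and kb.get((event["source"], "status")) == "ok"

def action(event):
    # Update the knowledge base in response to the event.
    kb[(event["source"], "status")] = "failed"
    print(f"KB updated: {event['source']} marked as failed")

events = [{"type": "heartbeat", "source": "sensor42"},
          {"type": "failure",  "source": "sensor42"}]

for event in events:                          # stand-in for monitoring a Web event stream
    if condition(event):
        action(event)
```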
Abstract:
This thesis introduces a novel conceptual framework to support the creation of knowledge representations based on enriched Semantic Vectors, using the classical vector space model extended with ontological support. One of the primary research challenges addressed here is the formalization and representation of document contents, where most existing approaches are limited to the explicit, word-based information in the document. This research explores how traditional knowledge representations can be enriched with implicit information derived from the complex relationships (semantic associations) modelled by domain ontologies, in addition to the information present in the documents themselves. The relevant achievements pursued by this thesis are the following: (i) conceptualization of a model that enables the semantic enrichment of knowledge sources supported by domain experts; (ii) development of a method for extending the traditional vector space using domain ontologies; (iii) development of a method to support ontology learning, based on the discovery of new ontological relations expressed in unstructured information sources; (iv) development of a process to evaluate the semantic enrichment; (v) implementation of a proof of concept, named SENSE (Semantic Enrichment kNowledge SourcEs), which validates the ideas established within the scope of this thesis; (vi) publication of several scientific articles and support for 4 master's dissertations carried out at the Department of Electrical and Computer Engineering of FCT/UNL. It is worth mentioning that the work developed under the semantic framework covered by this thesis reuses relevant results from European research projects, in order to build on approaches that are scientifically sound and coherent and to avoid "reinventing the wheel".
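As a rough illustration of the core idea, extending a bag-of-words vector with ontology-derived concepts, here is a hypothetical sketch (the toy ontology, the discount weight, and the function names are assumptions for illustration, not SENSE's actual design):

```python
from collections import Counter

# Toy ontology: each term maps to semantically associated concepts.
# Structure and weights are illustrative assumptions, not SENSE's design.
ontology = {"aspirin": ["analgesic", "nsaid"],
            "ibuprofen": ["analgesic", "nsaid"]}

def semantic_vector(tokens, alpha=0.5):
    """Classical term-frequency vector, enriched with ontology concepts.

    Explicit terms keep their raw counts; each related concept is added
    with a discounted weight alpha, encoding implicit (ontological) content.
    """
    vec = Counter(tokens)                         # explicit, word-based part
    for term in tokens:
        for concept in ontology.get(term, []):    # implicit, ontology-derived part
            vec[concept] += alpha
    return dict(vec)

print(semantic_vector(["aspirin", "dosage"]))
# {'aspirin': 1, 'dosage': 1, 'analgesic': 0.5, 'nsaid': 0.5}
# Two documents mentioning different drugs now overlap on shared concepts.
```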
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may heal cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during treatment of the other breast, and is therefore at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformity in the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions largely rely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left-sided breast cancer using different RT techniques, f-IMRT (forwardly planned intensity-modulated RT), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the BrainLAB iPlan TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate MC model of the linear accelerator used (a VARIAN Trilogy) was built with the EGSnrc MC code to accurately determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never been simulated before.
The model developed has since been included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR, and in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART presented no advantages over the other techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted. The MC algorithms agreed with each other (differences within 2%). The PBC algorithm was considered inaccurate for determining dose in heterogeneous media and in build-up regions; accordingly, a major effort is under way at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when non-tangential techniques are used. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
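Since the comparison above hinges on dose-volume histograms, here is a minimal, hypothetical sketch of how cumulative DVH points are computed from a dose grid and a structure mask (the array names and toy data are assumptions for illustration, not the study's actual pipeline):

```python
import numpy as np

# Minimal cumulative dose-volume histogram (DVH) sketch.
# 'dose' and 'mask' are toy stand-ins for a TPS dose grid and a structure mask.
rng = np.random.default_rng(0)
dose = rng.gamma(shape=4.0, scale=10.0, size=(50, 50, 50))   # Gy, toy dose grid
mask = np.zeros_like(dose, dtype=bool)
mask[10:30, 10:30, 10:30] = True                              # toy PTV region

structure_doses = dose[mask]                                  # voxel doses inside the structure

def dvh_point(doses_gy, threshold_gy):
    """Fraction of the structure volume receiving at least threshold_gy."""
    return float(np.mean(doses_gy >= threshold_gy))

for d in (20.0, 40.0, 60.0):
    print(f"V{d:.0f}Gy = {100 * dvh_point(structure_doses, d):.1f}% of volume")
```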
Abstract:
Background: Integration of health care services is emerging as a central challenge of health care delivery, particularly for elderly patients and those with complex chronic conditions. In 2003, the World Health Organization (WHO) had already identified it as one of the key pathways to improving primary care. In 2005, the European Commission declared integrated care vital for the sustainability of social protection systems in Europe. Nowadays, it is recognized as a core component of health and social care reforms across European countries. Implementing integrated care requires coordination between settings, organizations, providers and professionals. To address the challenge of integration in such a complex scenario, an effective workforce is required, capable of working across interdependent settings. The World Health Report 2006 noted that governments should prepare their workforce and explore which tasks the different levels of health workers are trained to do and are capable of performing (skills mix). Compared with other European countries, Portugal is at an early stage where integrated care is concerned, facing a growing elderly population and the resulting increase in pressure on institutions and professionals to provide social and medical care in the most cost-effective way. In 2006, the Portuguese government created the Portuguese Network for Integrated Care Development (PNICD) to close the long-standing gap in social support and healthcare. Regarding the health workforce, the Portuguese government has already recognized the importance of redefining careers while maintaining professional motivation and satisfaction. Aim of the study: This study aims to contribute new evidence to the debate surrounding integrated care and skills mix policies in Europe. It also seeks to provide the first evidence that incorporates both the current dynamics of implementing integrated care in Portugal and the developments in the international literature. The first ambition of our study is to contribute to the growing interest in integrated care and to ongoing research in this area by identifying its different approaches and reviewing a number of experiences in several European countries. The second goal of this research is to provide an update on the knowledge developed on skills mix for the international healthcare management community and for policy makers involved in reforming healthcare systems and organizations. To better inform Portuguese health policy makers, in a third stage we explore the current dynamics of implementing integrated care in Portugal and contextualize them against the developments reported in the international literature. Methodology: This is essentially an exploratory and descriptive study using qualitative methodology. In order to identify integrated care approaches in Europe, a systematic literature review was undertaken, which resulted in a paper published in the Journal of Management and Marketing in Healthcare titled "Approaches to developing integrated care in Europe: a systematic literature review". This article was recommended and included in a list of references identified by The King's Fund Library. A second systematic literature review was undertaken, which resulted in a paper published in the International Journal of Healthcare Management titled "Skills mix in healthcare: An international update for the management debate".
Semi-structured interviews were conducted with experts representing the regional coordination teams of the Portuguese Network for Integrated Care Development. In a final stage, a questionnaire survey was developed based on the findings of both systematic literature reviews and the semi-structured interviews. Conclusions: Even though integrated care is a worldwide trend in health care reforms, there is no unique definition. Definitions can be grouped according to their sectorial focus: community-based care, combined health and social care, combined acute and primary care, the integration of providers, and, in a more comprehensive approach, the whole health system. Indeed, models that seek to apply the principles of integrated care share a similar background, are continually evolving, and depend on the different initiatives taken at national level. Although there is no single typology of models for integrated care, it is possible to identify and categorize some of the basic approaches that have been taken in attempts to implement it: changes in organizational structure, workforce reconfiguration, and changes in the financing system. The systematic literature review on skills mix showed that, despite the widely acknowledged interest in skills mix initiatives, there is a lack of evidence on their implications, constraints, outcomes and quality impact that would allow policy makers to take sustained, evidence-based decisions. Within the Portuguese health system, the integrated care approach is mainly organizational and financial, whereas little attention is given to workforce integration. As regards workforce planning, Portugal is still at the stage of analyzing the acceptability of a health workforce skills mix. In line with international approaches, the integration of health and social services and the bridging of primary and acute care are the main goals of the national government strategy. The findings from our interviews clarify perceptions that show no discrepancy with the related literature but are rather scarce compared with international experience. Informants hold a realistic but narrow view of integrated care issues; they seem limited to the regional context and require a more comprehensive perspective. The questionnaire developed in this thesis is an instrument which, when applied, will allow policy makers to understand the basic set of concepts and managerial motivations behind national and regional integrated care programs. The instrument can foster evidence on the three essential components of integrated care policies (organizational, financial, and human resources development) and can give additional input on the context in which integrated care is being developed, the types of providers and organizations involved, barriers and constraints, and strategies related to workforce skills mix planning. The thesis succeeded in recognizing differences between countries and interventions, and the instrument developed will allow a better understanding of the international options available and of how to address the vital components of integrated care programs.
Abstract:
This research seeks to design and implement a WebGIS application allowing high school students to work with information related to the disciplinary competencies of the competency-based teaching model in Mexico. This paradigm assumes that knowledge is acquired through the application of new technologies and by linking it to students' everyday life situations. The WebGIS provides access to maps of natural risks in Mexico, e.g. volcanism, seismic activity, or hurricanes; the prototype's user interface was designed with special emphasis on the educational needs of high school students.
Abstract:
Instituto Politécnico de Lisboa (IPL) and Instituto Superior de Engenharia de Lisboa (ISEL): support granted through grant SPRH/PROTEC/67580/2010, which partially supported this work
Abstract:
Nowadays, the consumption of goods and services on the Internet is steadily increasing. Small and Medium Enterprises (SMEs), mostly from traditional industry sectors, usually do business in weak and fragile market sectors where customized products and services prevail. To survive and compete in today's markets, they have to readjust their business strategies by creating new manufacturing processes and establishing new business networks through new technological approaches. In order to compete with large enterprises, these partnerships aim at sharing resources, knowledge and strategies to boost the sector's business consolidation through the creation of dynamic manufacturing networks. To meet this demand, the development of a centralized information system is proposed, which allows enterprises to select and create dynamic manufacturing networks capable of monitoring the whole manufacturing process, including the assembly, packaging and distribution phases. Even networking partners from the same area have multiple, heterogeneous representations of the same knowledge, reflecting their own view of the domain. Thus, conceptually, semantically and, consequently, lexically diverse knowledge representations may occur in the network, causing non-transparent sharing of information and interoperability inconsistencies. What is required is a framework, supported by a tool, that flexibly enables the identification, classification and resolution of such semantic heterogeneities. This tool will support the network in establishing semantic mappings, facilitating the integration of the various enterprises' information systems.
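As a concrete illustration of resolving lexical heterogeneity between partner vocabularies, here is a minimal, hypothetical sketch of string-similarity-based term mapping (the vocabularies, the threshold, and the use of difflib are assumptions for illustration, not the proposed tool's actual method):

```python
from difflib import SequenceMatcher

# Toy vocabularies from two hypothetical network partners describing the same domain.
partner_a = ["ProductOrder", "AssemblyStep", "PackagingUnit"]
partner_b = ["product_order", "assembly_stage", "shipping_box"]

def normalize(term):
    # Strip separators and case so "ProductOrder" and "product_order" align.
    return term.replace("_", "").replace("-", "").lower()

def map_terms(source, target, threshold=0.8):
    """Propose source->target mappings whose normalized similarity clears a threshold."""
    mappings = {}
    for s in source:
        best = max(target, key=lambda t: SequenceMatcher(None, normalize(s), normalize(t)).ratio())
        score = SequenceMatcher(None, normalize(s), normalize(best)).ratio()
        if score >= threshold:
            mappings[s] = (best, round(score, 2))
    return mappings

print(map_terms(partner_a, partner_b))
# {'ProductOrder': ('product_order', 1.0), 'AssemblyStep': ('assembly_stage', 0.88)}
# "PackagingUnit" finds no match above the threshold and is left for manual resolution.
```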