Abstract:
Medical charts and radiographs from 38 HIV-infected patients with cultures positive for Mycobacterium tuberculosis in sputum or bronchoalveolar lavage were reviewed in order to compare the clinical, radiographic, and sputum-bacilloscopy characteristics of HIV-infected patients with pulmonary tuberculosis according to CD4+ lymphocyte count (CD4). The mean age of the patients was 32 years and 76% were male. The median CD4 was 106 cells/mm³ and 71% had CD4 < 200 cells/mm³. Sputum bacilloscopy was positive in 45% of the patients. Patients with CD4 < 200 cells/mm³ showed the post-primary pattern significantly less often (7% vs. 63%; p = 0.02) and reported weight loss more frequently (p = 0.04). Although not statistically significant, patients with lower CD4 showed lower sputum-bacilloscopy positivity (37% vs. 64%; p = 0.18). HIV-infected patients with culture-confirmed pulmonary tuberculosis had a high proportion of non-post-primary patterns on chest radiographs. Patients with CD4 below 200 cells/mm³ showed post-primary patterns less frequently and reported weight loss more frequently.
Abstract:
This research aims to provide a better understanding of how firms stimulate knowledge sharing through the use of collaboration tools, in particular Emergent Social Software Platforms (ESSPs). It focuses on the distinctive applications of ESSPs and on the initiatives that help maximize their advantages. In the first part of the research, I itemize all types of existing collaboration tools and classify them into categories according to their capabilities, their objectives, and their capacity to promote knowledge sharing. In the second part, based on an exploratory case study at Cisco Systems, I identify the main applications of an existing enterprise social software platform named Webex Social. By combining qualitative and quantitative approaches, and by combining data collected from survey results with the analysis of the company's documents, I expect to maximize the outcome of this investigation and reduce the risk of bias. Although effects cannot be generalized from a single case study, some usage patterns emerge from the data collected, and potential trends in knowledge management have been observed. The results of the research have also made it possible to identify most of the constraints experienced by the users of the firm's social software platform. Ultimately, this research should provide a preliminary framework for firms planning to create or implement a social software platform, and for firms seeking to increase adoption levels and promote overall user participation. It highlights the common traps that developers should avoid when designing a social software platform and the capabilities it should inherently carry to support an effective knowledge-management strategy.
Abstract:
Software development is a discipline almost as old as the history of computers. With the advent of the Internet and its related technologies, software development has been in high demand. But, especially in SMEs (small and medium enterprises), this has not been accompanied by a comparable effort to develop a set of sustainable, standardized project-management activities, which leads to growing inefficiencies and costs. Given the current economic situation, it makes sense to engage in an effort to reduce those inefficiencies and rising costs. To that end, this work analyzes the current state of software development project-management processes in a Portuguese SME, along with their problems and inefficiencies, in an effort to create a standardized model for managing software development, with special attention given to critical success factors in an agile software development environment, while using best practices in process modeling. This work also aims to create guidelines to correctly integrate these changes into the company's existing IS structure.
Abstract:
During drilling operations, cuttings are produced downhole and must be removed to avoid issues that can lead to Non-Productive Time (NPT). Most stuck-pipe events, and the Bottom-Hole Assembly (BHA) losses that follow them, are related to poor hole cleaning. Many parameters help determine hole-cleaning conditions, but a proper selection of the key parameters makes it easier to monitor those conditions and to intervene. The aim of hole-cleaning monitoring is to keep track of borehole conditions, including hole-cleaning efficiency and wellbore-stability issues, during drilling operations. Adequate hole cleaning is one of the main concerns in underbalanced drilling operations, especially for directional and horizontal wells. This dissertation addresses some hole-cleaning fundamentals, which serve as the basis for recommended practice during drilling operations. It examines how parameters such as flow rate, rotation per minute (RPM), rate of penetration (ROP) and mud weight can be used to improve hole-cleaning performance, and how Equivalent Circulating Density (ECD), Torque & Drag (T&D) and the volume of cuttings returned from downhole indicate how clean and stable the well is. In the case study, hole-cleaning performance (cuttings-removal monitoring) is based on real-time measurements of the cuttings volume returned from downhole over time, taking into account flow rate, RPM, ROP and drilling-fluid (mud) properties; these measurements are then plotted and compared with the volume expected from the drilled interval. ECD monitoring indicates hole-stability conditions, while T&D and returned-cuttings monitoring indicate how clean the well is. T&D modeling software provides theoretically calculated T&D trends, which are plotted and compared with the real-time measurements; the measured hookloads are used to back-calculate friction factors along the wellbore.
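The comparison between measured and expected cuttings volume described above can be sketched in a few lines. This is an illustrative sketch, not the dissertation's monitoring software; the function names and the 8.5-in / 60 ft/hr figures are hypothetical, and only the geometric relation (volume rate = hole cross-sectional area × ROP) comes from standard practice:

```python
import math

def expected_cuttings_rate(bit_diameter_in: float, rop_ft_per_hr: float) -> float:
    """Expected rate of drilled-cuttings generation, in bbl/hr.

    Geometric relation: volume rate = hole cross-sectional area x ROP.
    The diameter is converted from inches to feet; 1 bbl = 5.615 ft^3.
    """
    area_ft2 = math.pi * (bit_diameter_in / 12.0) ** 2 / 4.0
    return area_ft2 * rop_ft_per_hr / 5.615

def hole_cleaning_ratio(measured_bbl_hr: float, expected_bbl_hr: float) -> float:
    """Cuttings returned at surface vs. cuttings generated downhole.

    A ratio persistently well below 1.0 suggests cuttings are
    accumulating in the annulus, i.e. poor hole cleaning.
    """
    return measured_bbl_hr / expected_bbl_hr

# Example: an 8.5-in hole drilled at 60 ft/hr
expected = expected_cuttings_rate(8.5, 60.0)  # ~4.21 bbl/hr
ratio = hole_cleaning_ratio(3.2, expected)    # ~0.76: cuttings being left behind
```

In a real-time system, this ratio would be trended against time alongside ECD and T&D, rather than evaluated at a single point.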
Abstract:
The zebrafish is used as a vertebrate model for in vivo studies of various pathologies of genetic origin. This work set out to study primary ciliary dyskinesia through in vivo study of the left-right organizer of these fish, known as Kupffer's vesicle. In particular, after analysing image sequences captured at high speed, the dynamic behaviour of normal cilia was evaluated and compared with that of cilia carrying genetic alterations, with the aim of better understanding the processes that influence the asymmetric placement of the internal organs associated with this disease. From videos of the interior of Kupffer's vesicle obtained by high-speed microscopy, 32 cilia were analysed: 8 normal and the remainder belonging to three different genetic alterations: under-expression of Pkd2, over-expression of Arl13b, and mutation of the deltaD gene. For each cilium the beat frequency was computed and the movement was characterised quantitatively. The latter analysis was based on manual segmentation of the cilium at four points defining the base, the middle and the tip. The dynamics of each of the straight lines defined by these three structures was then studied over time. Using ANOVA statistical analysis, we were able to confirm differences in movement between the altered cilia and the normal ones. The frequency analysis showed that all the cilia studied have a mean frequency of 34.9 Hz. It was further shown that, compared with normal cilia, Pkd2 cilia have a 60% stiffer movement, Arl13b cilia are characterised by 26% larger angular amplitudes in their movement and, finally, the base/middle amplitude ratio is 23% higher in deltaD cilia. To carry out these studies, a tool was developed as an ImageJ plugin together with R scripts, which may come to be used in investigations of primary ciliary dyskinesia in zebrafish.
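A one-way ANOVA of the kind used to compare motion metrics between mutant and wild-type cilia can be sketched with the standard library alone. The analysis in the thesis was done in R; this Python sketch only illustrates the computation, and the two groups below are made-up numbers, not the thesis data:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic of a one-way ANOVA across independent groups.

    F = (between-group mean square) / (within-group mean square);
    large values indicate at least one group mean differs from the rest.
    """
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Made-up beat-amplitude samples for two genotypes
f_stat = one_way_anova_f([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])  # F = 13.5
```

The F statistic would then be compared against the F distribution with (k−1, n−k) degrees of freedom to obtain a p-value.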
Abstract:
Aim: This study aimed to assess the burden experienced by the caregivers of patients with schizophrenia and to evaluate its correlation with some demographic characteristics of the patients and their caregivers, as well as with the level of expressed emotion in the family. Methods: This descriptive-analytic study was conducted on 172 patients with schizophrenia and their primary caregivers, selected from the outpatient clinic of a mental hospital in Tehran, Iran, using convenience sampling. Caregivers were interviewed with the Zarit Burden Interview and the Family Questionnaire to assess the burden experienced by the caregivers and the level of expressed emotion in the family, respectively. Data were analyzed using Mann-Whitney, Kruskal-Wallis, and Spearman's tests. Results: The level of burden experienced by the majority of primary caregivers was moderate to severe. The scores obtained in the subscales of emotional over-involvement and critical comments were higher than the cutoff point in 51.7% and 64.5% of caregivers, respectively. The scores obtained in the two subscales of the Family Questionnaire had a significant, direct correlation with the burden experienced by the caregivers. The level of burden differed significantly between subgroups defined by the age and marital status of the caregivers, and by the gender, occupational status and marital status of the patients. The number of family members, home ownership status, the time spent by the caregiver with the patient daily, the level of family income and the duration of the disorder significantly affected the level of burden experienced by the caregiver. Conclusion: Based on the results, some demographic factors of the primary caregivers, patients and their families significantly affect the burden experienced by the primary caregivers.
Most patients' families show high expressed emotion, and a significant, direct association exists between expressed emotion and the burden experienced.
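Of the nonparametric tests named above, Spearman's reduces to a Pearson correlation computed on ranks, which can be sketched with the standard library (the function names and sample data are illustrative):

```python
from statistics import mean

def rankdata(xs):
    """1-based ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

rho = spearman_rho([1, 2, 3, 4], [10, 20, 40, 30])  # 0.8
```

A monotone but non-linear relation between burden and expressed-emotion scores would still yield a rho close to 1, which is why rank correlation suits ordinal scale data like these questionnaires.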
Abstract:
Fires in buildings are a phenomenon that can have devastating consequences when uncontrolled, not only in terms of loss of human life but also in economic terms. In the past, the occurrence of large fires demonstrated the effects of uncontrolled fire on buildings, as well as the inefficiency of active fire-safety measures. Over the last two decades, these issues have motivated the study and understanding of fire action on building structures. This work studies the modelling of fire action on steel and composite structures, with the aim of contributing to their better characterisation. The thesis focuses on validating and understanding the implementation of thermo-mechanical analyses of composite structures in the finite-element software OpenSees (Open System for Earthquake Engineering Simulation), thereby contributing to future studies not only of composite structures subjected to fire, but also of composite structures subjected to consecutive events, such as an earthquake followed by fire. The work briefly describes the fire phenomenon, as well as the processes inherent to fire dynamics that constitute a source of uncertainty when modelling fire scenarios in a building. It then discusses some fire models present in the Eurocodes, as well as the recent travelling-fires model. Application examples are carried out in the software, together with two case studies. The first consists of modelling two fire tests performed in Germany in 1986 on steel structures at 1/4 scale. The second consists of modelling a fire test on a simply supported reinforced-concrete beam, performed at Instituto Superior Técnico in 2010. The numerical models developed in OpenSees account for material and geometric nonlinearities, using distributed-plasticity finite elements with a displacement-based formulation. The numerical results are then compared with the experimental ones in order to validate the thermo-mechanical analyses in OpenSees.
Abstract:
The corporate world is becoming more and more competitive. This leads organisations to adapt to this reality by adopting more efficient processes, which result in lower costs and higher product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimate for the project. This estimation is the main focus of this work. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis was carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where plenty of objective information is available to feed an estimation model, model-based methods usually yield better results than expert judgement. More frequently, however, this volume and quality of information is not available, which hurts the performance of model-based methods and favours the use of expert judgement. In practice, most organisations use expert judgement, making themselves dependent on the expert. A common problem is that the accuracy of an expert's estimate depends on their previous experience with similar projects; when new types of projects arrive, the estimate will have unpredictable accuracy. Moreover, different experts will make different estimates, based on their individual experience, so the company does not directly build a continuously growing body of knowledge about how estimation should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database and the resources available. Altran currently does not store the input information from previous projects in a systematic way; it has a small project database and a team of experts.
Our work targets companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied given the organisation's context. A gap analysis is used to understand what type of information the company would have to collect for other approaches to become available. Based on our assessment, expert judgement is the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This led to the identification of common estimation deviations, errors and patterns, which in turn led to the proposal of metrics to help estimators produce estimates that leverage quantitative and qualitative information from past projects in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process through gathering as much relevant information as possible from each finished project.
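Standard accuracy metrics for comparing estimates with the actual effort of finished projects — MRE, MMRE and PRED(25) — can be sketched as follows. This is a generic illustration of widely used metrics, not the specific metric set proposed in the dissertation:

```python
def mre(actual: float, estimate: float) -> float:
    """Magnitude of Relative Error for a single finished project."""
    return abs(actual - estimate) / actual

def mmre(actuals, estimates):
    """Mean MRE over a portfolio of finished projects (lower is better)."""
    return sum(mre(a, e) for a, e in zip(actuals, estimates)) / len(actuals)

def pred(actuals, estimates, level=0.25):
    """PRED(level): fraction of projects estimated within `level` of actual."""
    hits = sum(1 for a, e in zip(actuals, estimates) if mre(a, e) <= level)
    return hits / len(actuals)

# Two hypothetical projects: 100 and 200 person-days actual effort
accuracy = mmre([100, 200], [110, 150])        # (0.10 + 0.25) / 2 = 0.175
within_25 = pred([100, 200], [110, 150])       # both within 25% -> 1.0
```

Tracking such metrics per project type is one convenient way to expose the systematic deviations and expert-dependence discussed above.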
Abstract:
Conventionally, the problem of finding the best path in a network is treated as the shortest-path problem. For the vast majority of today's networks, however, this solution has limitations that directly affect their proper functioning and lead to inefficient use of their potential. Problems at the level of large networks, where highly complex graphs are common, together with the appearance of new services and their respective requirements, are intrinsically related to the inadequacy of this solution. To meet the needs of these networks, a new approach to the best-path problem must be explored. One solution that has attracted growing interest in the scientific community considers the use of multiple paths between two network nodes, all of which can now be regarded as best paths between those nodes. Routing therefore stops being performed by minimizing a single metric, where only one path between nodes is chosen, and is instead performed by selecting one of many paths, thereby exploiting a greater diversity of the available paths (when the network allows it). Establishing multi-path routing in a given network has several advantages for its operation: it can improve the distribution of network traffic, improve recovery time after failures, and offer the administrator greater control over the network. These factors are even more relevant when networks are large and highly complex, as in the Internet, where multiple networks managed by different entities are interconnected. A large part of the growing need for multipath protocols is associated with policy-based routing, in which paths with different characteristics can be considered with equal levels of preference and thus become part of the solution to the best-path problem.
Performing multi-path routing with protocols based only on the destination address has some limitations, but it is possible. Concepts from graph theory and algebraic structures can be used to describe how routes are calculated and classified, making it possible to model the routing problem. This thesis studies and analyzes multi-path routing protocols from the literature and derives a new algebraic condition that allows the correct operation of these protocols without any restriction on the network. It also develops a set of software tools that supports the planning and the verification/validation of new protocol models according to the study made.
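One concrete instance of destination-based multi-path routing is classic equal-cost multi-path (ECMP), where every neighbour lying on some shortest path becomes a valid next hop. The sketch below illustrates that idea only; it is not the algebraic framework derived in the thesis, and the graph and function names are made up:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest-path distances. graph: node -> {nbr: cost}."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def ecmp_next_hops(graph, source, dest):
    """All first hops that lie on *some* shortest path: the multi-path set.

    A neighbour n qualifies iff cost(source, n) + dist(n, dest)
    equals dist(source, dest).
    """
    d_src = dijkstra(graph, source)
    hops = set()
    for n, w in graph[source].items():
        d_n = dijkstra(graph, n)
        if dest in d_n and w + d_n[dest] == d_src.get(dest):
            hops.add(n)
    return hops

# Diamond topology: two equally good 2-hop routes from s to t
graph = {"s": {"a": 1, "b": 1}, "a": {"t": 1}, "b": {"t": 1}, "t": {}}
hops = ecmp_next_hops(graph, "s", "t")  # {'a', 'b'}
```

Traffic can then be spread over the returned next-hop set, which is exactly the load-distribution benefit discussed above.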
Abstract:
In Portugal, the introduction of the seven-valent pneumococcal conjugate vaccine (PCV7) has led to significant changes in the population structure of Streptococcus pneumoniae. However, the levels of antimicrobial resistance have not decreased and have been a matter of concern. (...)
Abstract:
Primary ciliary dyskinesia (PCD) results from ciliary dysfunction in humans and is associated with a very diverse set of symptoms. It is a rare respiratory disease characterised by respiratory infections, situs inversus, infertility and hydrocephalus. In Portugal there is no diagnostic centre for the disease, but the intention to create one has emerged, following the approach of PCD diagnostic centres in other countries. The diagnosis consists of collecting samples of cilia from the nose by nasal brushing and recording the beating of the ciliated cells with a high-speed camera coupled to a microscope with high-resolution objectives. PCD can be studied by analysing the physical behaviour of the cilia and, to support this, an executable program was developed in C# to analyse these samples. After the user selects a region of interest (ROI) in the image sequence, the program detects the ciliary beat frequencies, outputs a list with the percentages of the frequencies obtained, and creates a frequency map of the ROI. The tool also makes it possible to measure the length of a cilium and to study its movement, something not yet addressed by other programs. The code developed will thus make it possible to obtain a PCD diagnosis in Portugal that is fast and, in some cases, performs better than the visual inspection used in other diagnostic centres.
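The core of the frequency analysis — finding the dominant ciliary beat frequency in the intensity signal of a ROI — can be sketched with a naive discrete Fourier transform. The thesis tool is written in C#; this Python sketch only illustrates the idea, and the synthetic 5 Hz signal is made up:

```python
import cmath
import math

def dominant_frequency(signal, fps):
    """Dominant frequency (Hz) of a frame-by-frame intensity series.

    Naive DFT over the mean-subtracted signal; returns the non-zero
    frequency bin (up to the Nyquist limit fps/2) with the largest
    spectral magnitude.
    """
    n = len(signal)
    avg = sum(signal) / n
    centred = [s - avg for s in signal]  # remove the DC component
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):
        coeff = sum(c * cmath.exp(-2j * math.pi * k * t / n)
                    for t, c in enumerate(centred))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k * fps / n

# Synthetic 5 Hz beat sampled at 100 frames/s for 1 s
signal = [math.sin(2 * math.pi * 5 * t / 100) for t in range(100)]
freq = dominant_frequency(signal, fps=100)  # 5.0 Hz
```

A per-pixel version of this computation, run over the whole ROI, is what produces a frequency map like the one the program outputs; in practice an FFT would replace the O(n²) loop.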
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during treatment of the other breast, and is therefore at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformation of the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)?
The answers to these questions largely rely on the use of Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and the material composition of the organs and tissues involved. This work aims at investigating the impact of treating left breast cancer using different radiation therapy (RT) techniques — f-IMRT (forwardly-planned intensity-modulated), inversely-planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc (DCART) RT — and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB TPS were used: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Furthermore, an accurate Monte Carlo (MC) model of the linear accelerator used (a Trilogy, VARIAN Medical Systems) was built with the EGSnrc MC code to accurately determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never before been simulated. The model developed was then included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the dose to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all the techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, the OAR and also in the pattern of dose distribution spreading into normal tissues.
IMRT5 and DCART spread low doses into greater volumes of normal tissue, right breast, right lung, heart and even the left lung than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved dose distributions for the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the other techniques investigated. Differences were also found when comparing the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted. The MC algorithms presented similar results (within 2% differences). The PBC algorithm was considered not accurate in determining the dose in heterogeneous media and in build-up regions. Therefore, a major effort is being made at the clinic to acquire data to move from PBC to another calculation algorithm. Despite better PTV homogeneity and conformity, there is an increased risk of CLB cancer development when using non-tangential techniques. The overall results of the studies performed confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues rendered possible by the use and implementation of MC simulation techniques in RT TPS.
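The dose-volume histograms used above to compare techniques can be sketched from a structure's per-voxel doses. This is a stdlib-only illustration with made-up voxel doses; a real TPS computes DVHs over full 3D dose grids:

```python
def cumulative_dvh(doses, bins=4):
    """Cumulative DVH: fraction of organ volume receiving >= each dose level.

    `doses` holds the dose (Gy) in each voxel of one structure; returns
    parallel lists of dose levels and volume fractions.
    """
    dmax = max(doses)
    levels = [dmax * i / (bins - 1) for i in range(bins)]
    fracs = [sum(1 for d in doses if d >= lvl) / len(doses) for lvl in levels]
    return levels, fracs

# Four voxels of a hypothetical organ at 0, 1, 2 and 3 Gy
levels, fracs = cumulative_dvh([0.0, 1.0, 2.0, 3.0])
# fracs == [1.0, 0.75, 0.5, 0.25]: every voxel gets >= 0 Gy, one gets >= 3 Gy
```

Comparing two techniques then amounts to comparing such curves per structure (e.g. a lower tail for the contralateral breast means less low-dose spillage).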
Abstract:
In this thesis, an online multi-user platform was developed whose main goal is to compare image-analysis algorithms in order to determine their degree of effectiveness. One example application is the comparison of retinal image-analysis algorithms for drusen detection. The comparison is made by taking one of the algorithms as the reference standard, against which the others are evaluated. The platform works much like a forum, where a user can create topics by publishing images and their description. Once a topic is created, any user can view it and is given the chance to comment or to add images processed with their own analysis algorithms. As the number of processed images grows, a database of image-analysis algorithms is obtained over which their effectiveness can be evaluated. The platform also aims to create communities where users can interact with each other by commenting on topics, thus contributing to the improvement of the algorithms. In this way, besides a database that any user can use, a source of information provided by other professionals in the field is obtained.
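Evaluating a candidate algorithm against the reference-standard algorithm can be done per pixel; the Dice coefficient is one common overlap measure for binary detection masks. This is an illustrative metric choice — the abstract does not say which measure the platform uses:

```python
def dice(reference, candidate):
    """Dice overlap between two binary masks given as flat 0/1 sequences.

    1.0 means perfect agreement with the reference-standard segmentation;
    0.0 means no overlap at all.
    """
    inter = sum(1 for r, c in zip(reference, candidate) if r and c)
    size = sum(reference) + sum(candidate)
    return 2 * inter / size if size else 1.0

# Reference marks two drusen pixels; the candidate finds one of them
score = dice([1, 1, 0, 0], [1, 0, 0, 0])  # 2*1 / (2+1) = 2/3
```

Aggregating such scores over all images uploaded to a topic gives the per-algorithm effectiveness ranking the platform is meant to support.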
Abstract:
Nowadays, participatory processes that address the need for real democracy and transparency in governments and collectives are more necessary than ever. Immediate participation through channels such as social networks enables people to voice their opinions and become proactive citizens who seek applications through which to interact with each other. The application described in this dissertation is a hybrid communication channel for questions, petitions and participatory processes, based on the Public Participation Geographic Information System (PPGIS), Participatory Geographic Information System (PGIS) and 'soft' (subjective data) Geographic Information System (SoftGIS) methodologies. To achieve a new kind of application, its entire design is focused on the spatial component related to user interests. The spatial component is treated as the main feature of the system, with all other features built on top of it, enabling capabilities not previously seen in social actions (questions, petitions and participatory processes). The results show that it is possible to develop a working application mainly using open-source software, with support for spatial and subject filtering, visualization and free download of the actions within the application. The resulting application empowers society by releasing soft data and defines a novel approach.
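The spatial filtering of actions can be sketched as a simple bounding-box test over georeferenced records. The data layout, field names and coordinates below are hypothetical; a production system would delegate this to a spatial database index:

```python
def in_bbox(point, bbox):
    """True if point=(lon, lat) lies in bbox=(min_lon, min_lat, max_lon, max_lat)."""
    lon, lat = point
    min_lon, min_lat, max_lon, max_lat = bbox
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

def filter_actions(actions, bbox, subject=None):
    """Spatial filter over actions, optionally narrowed by subject type."""
    return [a for a in actions
            if in_bbox(a["location"], bbox)
            and (subject is None or a["subject"] == subject)]

# Hypothetical actions: one near Lisbon, one near Porto
actions = [
    {"title": "Fix the park lights", "subject": "petition", "location": (-9.14, 38.71)},
    {"title": "New bus route?", "subject": "question", "location": (-8.61, 41.15)},
]
lisbon_bbox = (-9.25, 38.6, -9.0, 38.85)
hits = filter_actions(actions, lisbon_bbox)  # only the Lisbon petition
```

Making location the primary key of every query, as above, is the design choice the dissertation describes: all other filters hang off the spatial one.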
Abstract:
Since the invention of photography, humans have used images to capture, store and analyse the events they are interested in. With developments in this field, assisted by better computers, it is possible to use image-processing technology as an accurate method of analysis and measurement. The principal qualities of image processing are flexibility, adaptability and the ability to process large amounts of information easily and quickly. Successful applications can be found in several areas of human life, such as biomedicine, industry, surveillance, the military and mapping; indeed, several Nobel prizes are related to imaging. The accurate measurement of deformations, displacements, strain fields and surface defects is challenging in many material tests in civil engineering, because traditionally these measurements require complex and expensive equipment plus time-consuming calibration. Image processing can be an inexpensive and effective tool for load-displacement measurements: using an adequate image-acquisition system and taking advantage of the computational power of modern computers, it is possible to measure very small displacements with high precision. Several commercial software packages already exist on the market, but at high cost. In this work, block-matching algorithms are used to compare the results of image processing with the data obtained from physical transducers during laboratory load tests. To test the proposed solutions, several load tests were carried out in partnership with researchers from the Civil Engineering Department at Universidade Nova de Lisboa (UNL).
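A minimal sum-of-absolute-differences (SAD) search, the simplest member of the block-matching family used in this work, can be sketched as follows. This is a pure-Python illustration on nested lists with a made-up 4×4 image; production code would use optimized array operations and sub-pixel refinement:

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized pixel blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def best_match(ref_block, image, candidates):
    """Locate `ref_block` inside `image` by exhaustive SAD search.

    `candidates` lists (row, col) positions for the block's top-left
    corner; the position with the lowest SAD wins.
    """
    h, w = len(ref_block), len(ref_block[0])
    best_score, best_pos = float("inf"), None
    for r, c in candidates:
        window = [row[c:c + w] for row in image[r:r + h]]
        if len(window) < h or any(len(row) < w for row in window):
            continue  # candidate window would fall outside the image
        score = sad(ref_block, window)
        if score < best_score:
            best_score, best_pos = score, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 5, 6, 0],
         [0, 7, 8, 0],
         [0, 0, 0, 0]]
ref = [[5, 6],
       [7, 8]]
pos = best_match(ref, image, [(0, 0), (0, 1), (1, 1), (2, 2)])  # (1, 1)
```

Tracking the matched position of a marker block across frames of a load test yields the displacement curve that is compared against the physical transducers.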