31 results for "Improved sequential algebraic algorithm"
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
Dissertation presented to obtain the Master's Degree in Biotechnology
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
Dissertation presented to obtain the Ph.D. degree in Engineering and Technology Sciences – Biotechnology
Abstract:
The work presented in this thesis was developed in collaboration with a Portuguese company, BeyonDevices, devoted to the pharmaceutical packaging, medical technology and device industry. Specifically, the impact of composition and the surface modification of two polymeric medical devices from the company were studied: inhalers and vaginal applicators. The polyethylene-based vaginal applicator was modified using supercritical fluid technology to acquire self-cleaning properties and prevent the transport of bacteria and yeasts to the vaginal flora. To that end, in-situ polymerization of 2-substituted oxazolines was performed within the polyethylene matrix using supercritical carbon dioxide; the cationic ring-opening polymerization was followed by end-capping with N,N-dimethyldodecylamine. For the same purpose, the polyethylene matrix was also impregnated with lavender oil in supercritical medium. The obtained materials were characterized physically and morphologically, and their antimicrobial activity against bacteria and yeasts was assessed. Materials modified with 2-substituted oxazolines showed an effective killing ability against all tested microorganisms, whereas the materials modified with lavender oil showed no antimicrobial activity. Only the materials modified with oligo(2-ethyl-2-oxazoline) maintained their activity in the long-term stability study. Furthermore, the cytotoxicity of the materials was tested, confirming their biocompatibility. Regarding the inhaler, its surface was modified to improve powder flowability and, consequently, to reduce powder retention in the inhaler's nozzle. New dry powder inhalers (DPIs) with different needle diameters were evaluated in terms of internal resistance and uniformity of the emitted dose. They presented a mean resistance of 0.06 cmH2O^0.5/(L/min), and the maximum emitted dose, 68.9%, was obtained with the inhaler with the larger needle diameter (2 mm). This inhaler was therefore used as a test case and modified by coating with a commonly used force-control agent, magnesium stearate, dried with supercritical carbon dioxide (scCO2), after which the uniformity-of-delivered-dose tests were repeated. The modified inhaler showed an increase in emitted dose from 68.9% to 71.3% for lactose and from 30.0% to 33.7% for Foradil.
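For reference, the intrinsic resistance quoted above is commonly obtained as the square root of the pressure drop across the device divided by the flow rate, and the emitted dose is expressed as a percentage of the loaded (metered) dose. The minimal Python sketch below shows both calculations; the numerical inputs are hypothetical and only chosen to reproduce the order of magnitude reported in the abstract.

```python
import math

def device_resistance(pressure_drop_cmH2O: float, flow_L_min: float) -> float:
    """Intrinsic resistance of a dry powder inhaler, R = sqrt(dP) / Q,
    expressed in cmH2O^0.5 / (L/min) as in the abstract."""
    return math.sqrt(pressure_drop_cmH2O) / flow_L_min

def emitted_dose_fraction(emitted_mg: float, loaded_mg: float) -> float:
    """Emitted dose as a percentage of the loaded dose."""
    return 100.0 * emitted_mg / loaded_mg

# Hypothetical inputs chosen to reproduce a resistance of ~0.06 cmH2O^0.5/(L/min):
# a 13 cmH2O pressure drop at 60 L/min.
print(round(device_resistance(13.0, 60.0), 3))      # ~0.06
print(round(emitted_dose_fraction(17.2, 25.0), 1))  # ~68.8 %
```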
Abstract:
Diffusion Kurtosis Imaging (DKI) is a fairly new magnetic resonance imaging (MRI) technique that tackles the non-Gaussian motion of water in biological tissues by taking into account the restrictions imposed by tissue microstructure, which are not considered in Diffusion Tensor Imaging (DTI), where water diffusion is assumed to be purely Gaussian. As a result, DKI provides more accurate information on biological structures and is able to detect important abnormalities that are not visible in standard DTI analysis. This work concerns the development of a tool for DKI computation to be implemented as an OsiriX plugin. Since OsiriX runs under Mac OS X, the program is written in Objective-C, makes use of Apple's Cocoa framework, and is developed in the Xcode integrated development environment (IDE). The plugin implements a fast heuristic constrained linear least squares algorithm (CLLS-H) for estimating the diffusion and kurtosis tensors, and lets the user choose which maps to generate: not only the standard DTI quantities Mean Diffusivity (MD), Radial Diffusivity (RD), Axial Diffusivity (AD) and Fractional Anisotropy (FA), but also the DKI metrics Mean Kurtosis (MK), Radial Kurtosis (RK) and Axial Kurtosis (AK). The plugin was subjected to both a qualitative and a semi-quantitative analysis, which yielded convincing results. A more accurate validation process is still being developed, after which, with a few minor adjustments, the plugin shall become a valid option for DKI computation.
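The standard DTI scalar maps mentioned above are simple functions of the eigenvalues of the per-voxel diffusion tensor. The NumPy sketch below shows those formulas for a single tensor; it is only an illustration of the quantities involved, not the plugin's Objective-C/CLLS-H implementation, and the kurtosis metrics (MK, RK, AK), which additionally require the kurtosis tensor, are not shown.

```python
import numpy as np

def dti_metrics(D: np.ndarray) -> dict:
    """Standard DTI scalar measures from a 3x3 diffusion tensor."""
    evals = np.sort(np.linalg.eigvalsh(D))[::-1]   # l1 >= l2 >= l3
    md = evals.mean()                              # Mean Diffusivity
    ad = evals[0]                                  # Axial Diffusivity
    rd = evals[1:].mean()                          # Radial Diffusivity
    fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))
    return {"MD": md, "AD": ad, "RD": rd, "FA": fa}

# Example: a prolate tensor typical of white matter (units: mm^2/s).
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
print({k: round(v, 5) for k, v in dti_metrics(D).items()})
```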
Abstract:
Nanotechnology plays a central role in 'tailoring' materials' properties and thus improving their performance for a wide range of applications. Coupling nature's nano-objects with nanotechnology results in materials with enhanced functionalities. The main objective of this master's thesis was the synthesis of nanocrystalline cellulose (NCC) and its further incorporation into a cellulosic matrix, in order to produce a moisture-responsive material. The induced behaviour (bending/unbending) of the samples was investigated in depth in order to determine structure/property relationships. Using microcrystalline cellulose as the starting material, acid hydrolysis was performed and NCC was obtained. Anisotropic aqueous solutions of HPC and NCC were prepared, and films with thicknesses ranging from 22 μm to 61 μm were produced using a shear-casting technique. Microscopic and spectroscopic techniques, as well as mechanical and rheological assays, were used to characterize the transparent and flexible films produced. Upon application of a stimulus (moisture), the bending/unbending response times were measured. The use of NCC allowed films to be obtained with response times on the order of 6 seconds for bending and 5 seconds for unbending, improving on previously reported results. These promising results open new horizons for building improved soft steam engines.
Abstract:
Breast cancer is the most frequently diagnosed cancer in women. Scientific knowledge and technology have created many different strategies to treat this pathology. Radiotherapy (RT) is part of the current standard guidelines for most breast cancer treatments. However, radiation is a double-edged sword: although it may cure cancer, it may also induce secondary cancer. The contralateral breast (CLB) is an organ susceptible to absorbing dose during treatment of the other breast, and is therefore at significant risk of developing a secondary tumor. New radiation techniques, with more complex delivery strategies and promising results, are being implemented and used in radiotherapy departments. However, some questions have to be properly addressed, such as: Is it safe to move to complex techniques to achieve better conformity in the target volumes in breast radiotherapy? What happens to the target volumes and surrounding healthy tissues? How accurate is dose delivery? What are the shortcomings and limitations of currently used treatment planning systems (TPS)? The answers to these questions rely largely on Monte Carlo (MC) simulations using state-of-the-art computer programs to accurately model the different components of the equipment (target, filters, collimators, etc.) and obtain an adequate description of the radiation fields used, as well as a detailed geometric representation and material composition of the organs and tissues involved. This work investigates the impact of treating left-sided breast cancer using different radiotherapy techniques, namely f-IMRT (forwardly planned intensity-modulated RT), inversely planned IMRT (IMRT2, using 2 beams; IMRT5, using 5 beams) and dynamic conformal arc RT (DCART), and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the BrainLAB iPlan TPS were used: Pencil Beam Convolution (PBC) and the commercial Monte Carlo (iMC). Furthermore, an accurate Monte Carlo model of the linear accelerator used (a VARIAN Trilogy) was built with the EGSnrc MC code to determine the doses that reach the CLB. For this purpose it was necessary to model the new High Definition multileaf collimator, which had never been simulated before. The model developed was then included in the EGSnrc MC package of the National Research Council Canada (NRC). The linac model was benchmarked against water measurements and later validated against the TPS calculations. The dose distributions in the planning target volume (PTV) and the doses to the organs at risk (OAR) were compared by analyzing dose-volume histograms; further statistical analysis was performed using IBM SPSS v20. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques in the PTV, in the OAR and in the pattern of dose spread into normal tissues. IMRT5 and DCART spread low doses into greater volumes of normal tissue (right breast, right lung, heart and even the left lung) than the tangential techniques (f-IMRT and IMRT2). However, IMRT5 plans improved the dose distribution in the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found between the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the MC algorithms predicted, while the MC algorithms produced similar results (within 2% of each other). The PBC algorithm was considered inaccurate in determining dose in heterogeneous media and in build-up regions; therefore, a major effort is underway at the clinic to acquire the data needed to move from PBC to another calculation algorithm. Despite the better PTV homogeneity and conformity, there is an increased risk of CLB cancer when using non-tangential techniques. The overall results of the studies confirm the outstanding predictive power and accuracy in the assessment and calculation of dose distributions in organs and tissues made possible by the use of MC simulation techniques in RT TPS.
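The dose-volume histograms used above to compare techniques and algorithms are straightforward to compute once a dose grid and a structure mask are available. The sketch below is a generic, illustrative implementation on synthetic data, not the iPlan/EGSnrc workflow used in the thesis.

```python
import numpy as np

def cumulative_dvh(dose: np.ndarray, mask: np.ndarray, n_bins: int = 100):
    """Cumulative dose-volume histogram for one structure.

    dose : 3-D dose grid (Gy); mask : boolean array of the same shape selecting
    the structure's voxels (e.g. PTV, heart, contralateral breast).
    Returns dose levels and the percentage of the structure volume receiving
    at least each level, i.e. the curves compared between techniques."""
    d = dose[mask]
    edges = np.linspace(0.0, d.max(), n_bins)
    volume_pct = np.array([(d >= e).mean() * 100.0 for e in edges])
    return edges, volume_pct

# Toy example with a synthetic dose grid and a cubic "organ".
rng = np.random.default_rng(0)
dose = rng.normal(50.0, 5.0, size=(32, 32, 32)).clip(min=0)
mask = np.zeros_like(dose, dtype=bool)
mask[8:24, 8:24, 8:24] = True
edges, vol = cumulative_dvh(dose, mask)
print(f"V47.5Gy = {vol[np.searchsorted(edges, 47.5)]:.1f}% of the volume")
```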
Abstract:
Generating personalized movie recommendations for users is a problem that most commonly relies on user-movie ratings. These ratings are generally used either to understand the user's preferences or to recommend movies that users with similar rating patterns have rated highly. However, movie recommenders are often subject to the Cold-Start problem: new movies have not been rated by anyone, so they will not be recommended to anyone; likewise, the preferences of new users who have not rated any movie cannot be learned. In parallel, social media platforms such as Twitter, which are very popular nowadays, collect great amounts of user feedback on movies. This thesis proposes to explore feedback shared on Twitter to predict the popularity of new movies and shows how it can be used to tackle the Cold-Start problem. It also proposes, at a finer grain, to explore the reputation of directors and actors on IMDb to tackle the Cold-Start problem. To assess these aspects, a Reputation-enhanced Recommendation Algorithm is implemented and evaluated on a crawled IMDb dataset with previous user ratings of old movies, together with Twitter data crawled from January 2014 to March 2014, to recommend 60 movies affected by the Cold-Start problem. Twitter proved to be a strong reputation predictor, and the Reputation-enhanced Recommendation Algorithm improved over several baseline methods. Additionally, the algorithm also proved useful when recommending movies in an extreme Cold-Start scenario, where both new movies and users are affected by the Cold-Start problem.
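As an illustration of the idea, a cold-start movie can be scored purely from reputation signals when no ratings exist yet. The sketch below blends hypothetical IMDb-derived reputations of the director and cast with a normalized Twitter signal; the features and weights are illustrative assumptions, not the thesis' actual Reputation-enhanced Recommendation Algorithm.

```python
from dataclasses import dataclass

@dataclass
class Movie:
    title: str
    director_rep: float   # average IMDb rating of the director's past movies (0-10)
    cast_rep: float       # average IMDb rating of the main cast's past movies (0-10)
    twitter_score: float  # normalized volume/sentiment of tweets about the movie (0-1)

def cold_start_score(m: Movie, w_dir=0.3, w_cast=0.3, w_tw=0.4) -> float:
    """Reputation-based score for a movie with no ratings yet: fall back on
    the reputation of the people involved (IMDb) and the buzz observed on
    Twitter. Weights here are illustrative, not fitted values."""
    return w_dir * m.director_rep + w_cast * m.cast_rep + w_tw * (10.0 * m.twitter_score)

new_movie = Movie("Untitled (2014)", director_rep=7.4, cast_rep=6.9, twitter_score=0.82)
print(round(cold_start_score(new_movie), 2))   # rank cold-start movies by this score
```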
Abstract:
Nigeria has an estimated population of about 170 million people, but the number of mental health professionals is very small, with only about 150 psychiatrists; this translates roughly to a psychiatrist-to-population ratio of more than 1:1 million. The National Mental Health Policy of 1991 recognized this deficiency and recommended the integration of mental health into the primary health care (PHC) delivery system. After more than two decades, this policy has yet to be implemented. This study aimed to map the organizational structure of the mental health system in Nigeria and to explore the challenges and barriers preventing the successful integration of mental health into primary health care, from the perspective of primary health care workers. A mixed-methods exploratory sequential study design was employed, combining qualitative and quantitative data-collection methods in sequence to understand the problems of integrating mental health services into PHC in Nigeria. The initial qualitative phase used free-listing interviews with 30 PHC workers, followed by two focus-group discussions with primary care workers from two Local Government Areas (LGAs) of Oyo State, to gain insight into the local perspectives of PHC workers on the challenges and barriers preventing successful integration of mental health care services into PHC. Subsequently, four key-informant interviews with PHC coordinators and mental health experts were carried out. The findings from the qualitative study were used to develop a quantitative questionnaire capturing the opinions of a larger and more representative sample of PHC staff in two more LGAs of Oyo State, as well as two LGAs of Osun State. The common barriers identified in this study include stigma and misconceptions about mental illness, inadequate training of PHC staff in mental health, low government priority, fear of aggression and violence among PHC staff, and the non-availability of medications. Recommendations for overcoming these challenges include improved and sustained mental health advocacy to gain governmental attention and support, organized training and retraining for primary care staff, the establishment of referral and support networks with neighbouring tertiary facilities, and community engagement to improve service utilization and the rehabilitation of mentally ill persons. These findings provide useful insight into the barriers to the successful integration of mental health into PHC, while recommending a holistic and comprehensive approach. This information can guide future attempts to implement the integration of mental health into primary care in Nigeria.
Abstract:
The aim of this work project is to analyze the current algorithm used by EDP to estimate their clients' electrical energy consumption, to create a new algorithm, and to compare the advantages and disadvantages of both. The new algorithm differs from the current one in that it incorporates the effects of temperature variations. The results of the comparison show that the algorithm with temperature variables performed better than the same algorithm without them, although there is still potential for further improvement if the prediction model is estimated using a sample of daily data, as is the case for the current EDP algorithm.
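One common way to 'add temperature variables' to such a consumption model is to regress daily consumption on a calendar trend plus heating and cooling degree-days. The sketch below, on synthetic daily data, is only an illustration of that idea; EDP's actual algorithm and variables are not described in the abstract.

```python
import numpy as np

def fit_consumption_model(days, consumption, temperature, base_temp=18.0):
    """Ordinary least squares fit of daily consumption on a calendar trend
    plus heating/cooling degree-days derived from temperature."""
    hdd = np.clip(base_temp - temperature, 0, None)   # heating degree-days
    cdd = np.clip(temperature - base_temp, 0, None)   # cooling degree-days
    X = np.column_stack([np.ones_like(days, dtype=float), days, hdd, cdd])
    coef, *_ = np.linalg.lstsq(X, consumption, rcond=None)
    return coef   # intercept, trend, heating and cooling sensitivities

# Synthetic daily data: a mild trend plus temperature sensitivity.
rng = np.random.default_rng(1)
days = np.arange(365)
temp = 16 + 8 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 365)
cons = 10 + 0.005 * days + 0.4 * np.clip(18 - temp, 0, None) + rng.normal(0, 0.5, 365)
print(np.round(fit_consumption_model(days, cons, temp), 3))
```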
Abstract:
Combinatorial optimization problems occur in a wide variety of contexts and are generally NP-hard. At the corporate level, solving these problems is of great importance, since it contributes to the optimization of operational costs. In this thesis we propose to solve the Public Transport Bus Assignment problem, considering a heterogeneous fleet and line exchanges, a variant of the Multi-Depot Vehicle Scheduling Problem in which additional constraints are enforced to model a real-life scenario. The number of constraints involved and the large number of variables make it impractical to solve to optimality using complete search techniques. Therefore, we explore metaheuristics, which sacrifice optimality to produce solutions in feasible time. More concretely, we focus on the development of algorithms based on a sophisticated metaheuristic, Ant Colony Optimization (ACO), which relies on a stochastic learning mechanism. For complex problems with a considerable number of constraints, sophisticated metaheuristics may fail to produce quality solutions in a reasonable amount of time. Thus, we developed parallel shared-memory (SM) synchronous ACO algorithms; however, synchronism gives rise to the straggler problem. We therefore proposed three SM asynchronous algorithms that relax the original algorithm's semantics and differ in the degree of concurrency allowed while manipulating the learned information. Our results show that our sequential ACO algorithms produced better solutions than a Restarts metaheuristic, that the ACO algorithms were able to learn, and that better solutions were achieved by increasing the amount of cooperation (number of search agents). Regarding the parallel algorithms, our asynchronous ACO algorithms outperformed the synchronous ones in terms of speedup and solution quality, achieving speedups of 17.6x. The cooperation scheme imposed by asynchronism also achieved a better learning rate than the original one.
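For context, the core ACO loop alternates between constructing solutions from pheromone and heuristic information and reinforcing the best solution found. The minimal sequential sketch below does this for a toy TSP instance; the thesis' bus-assignment algorithms add depot and line-exchange constraints and parallel, asynchronous pheromone updates that are not shown here.

```python
import numpy as np

def aco_tsp(dist, n_ants=20, n_iters=200, alpha=1.0, beta=2.0, rho=0.1, seed=0):
    """Minimal sequential Ant Colony Optimization on a small symmetric TSP."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    tau = np.ones((n, n))                       # pheromone trails (learned information)
    eta = 1.0 / (dist + np.eye(n))              # heuristic desirability (avoid /0 on diagonal)
    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        for _ in range(n_ants):
            tour = [rng.integers(n)]
            while len(tour) < n:                # build a tour city by city
                i = tour[-1]
                mask = np.ones(n, dtype=bool)
                mask[tour] = False
                w = (tau[i] ** alpha) * (eta[i] ** beta) * mask
                tour.append(rng.choice(n, p=w / w.sum()))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1 - rho)                        # evaporation
        for k in range(n):                      # reinforce the best-so-far tour
            tau[best_tour[k], best_tour[(k + 1) % n]] += 1.0 / best_len
    return best_tour, best_len

dist = np.array([[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]], float)
print(aco_tsp(dist, n_ants=10, n_iters=100))
```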
Abstract:
Contains abstract
Abstract:
Mutable state can be useful in certain algorithms, to structure programs, or for efficiency purposes. However, when shared mutable state is used in non-local or non-obvious ways, the interactions that can occur via aliases to that shared memory can be a source of program errors. Undisciplined uses of shared state may unsafely interfere with local reasoning, as other aliases may interleave their changes to the shared state in unexpected ways. We propose a novel technique, rely-guarantee protocols, that structures the interactions between aliases and ensures that only safe interference is possible. We present a linear type system outfitted with our novel sharing mechanism that enables controlled interference over shared mutable resources. Each alias is assigned a separate, local role, encoded in a protocol abstraction that constrains how that alias can legally use the shared state. By following the spirit of rely-guarantee reasoning, our rely-guarantee protocols ensure that only safe interference can occur while still allowing many interesting uses of shared state, such as going beyond invariant and monotonic usages. This thesis describes the three core mechanisms that enable our type-based technique to work: 1) we show how a protocol models an alias's perspective on how the shared state evolves and constrains that alias's interactions with the shared state; 2) we show how protocols can be used while enforcing the agreed interference contract; and finally, 3) we show how to check that all local protocols to some shared state can be safely composed to ensure globally safe interference over that shared memory. The interference caused by shared state is rooted in how the uses of different aliases to that state may be interleaved (perhaps even in non-deterministic ways) at run-time. Therefore, our technique is mostly agnostic as to whether this interference is the result of alias interleaving caused by sequential or concurrent semantics. We show implementations of our technique in both settings and highlight their differences. Because sharing is "first-class" (and not tied to a module), we show a polymorphic procedure that enables abstract compositions of protocols. Thus, protocols can be specialized or extended without requiring specific knowledge of the interference produced by other protocols on that state. We show that protocol composition can ensure safety even when considering abstracted protocols. We show that this core composition mechanism is sound and decidable (without the need for manual intervention), and we provide an algorithm implementation.
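A very small way to picture protocol composition is to enumerate interleavings of per-alias steps over an abstract shared state and check that every alias always observes a state it relied on. The toy sketch below does exactly that by brute force; it is only an intuition aid under that simplified model, not the thesis' type-level composition check.

```python
def composes(initial: str, protocols) -> bool:
    """Brute-force check that a set of per-alias protocols composes safely.

    Each protocol is a list of steps; a step is a dict mapping every shared
    state the alias relies on seeing at that point to the state it guarantees
    afterwards. Composition is deemed safe if, in every interleaving of the
    aliases' steps, each alias always finds the shared state inside its
    current step's rely set."""
    def explore(state, positions):
        if all(pos == len(p) for pos, p in zip(positions, protocols)):
            return True                       # every alias finished its protocol
        ok = True
        for i, (pos, proto) in enumerate(zip(positions, protocols)):
            if pos == len(proto):
                continue
            step = proto[pos]
            if state not in step:             # alias observes a state it did not rely on
                return False
            nxt = list(positions)
            nxt[i] += 1
            ok = ok and explore(step[state], tuple(nxt))
        return ok

    return explore(initial, tuple(0 for _ in protocols))

# Two aliases over a shared cell that moves Empty -> Filled -> Done.
producer = [{"Empty": "Filled"}]                      # writes once
consumer = [{"Empty": "Empty", "Filled": "Done"}]     # tolerates being scheduled first or second
print(composes("Empty", [producer, consumer]))        # True: all interleavings are safe
impatient = [{"Filled": "Done"}]                      # relies on always being scheduled second
print(composes("Empty", [producer, impatient]))       # False: an unsafe interleaving exists
```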
Abstract:
Search is now going beyond looking for factual information; people also wish to search for the opinions of others to help them in their own decision-making. Sentiment expressions, or opinion expressions, are used by users to express their opinions and embody important pieces of information, particularly in online commerce. The main problem that the present dissertation addresses is how to model text to find meaningful words that express a sentiment. In this context, I investigate the viability of automatically generating a sentiment lexicon for opinion retrieval and sentiment classification applications. For this research objective, we propose to capture sentiment words derived from online users' reviews. In this approach, we tackle a major challenge in sentiment analysis: the detection of words that express subjective preference and of domain-specific sentiment words such as jargon. To this end, we present a fully generative method that automatically learns a domain-specific lexicon and is fully independent of external sources. Sentiment lexicons can be applied to a broad set of applications; however, popular recommendation algorithms have been somewhat disconnected from sentiment analysis. Therefore, we present a study that explores the viability of applying sentiment analysis techniques to infer ratings in a recommendation algorithm. Furthermore, an entity's reputation is intrinsically associated with the sentiment words that have a positive or negative relation with that entity. Hence, we provide a study that examines the viability of using a domain-specific lexicon to compute entities' reputation. Finally, a recommendation algorithm is improved with the use of sentiment-based ratings and entities' reputation.
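A simple way to picture learning a domain-specific lexicon from rated reviews is to score each word by the smoothed log-odds of its occurrence in positive versus negative reviews. The sketch below illustrates that baseline; it is not the dissertation's fully generative model.

```python
import math
import re
from collections import Counter

def learn_sentiment_lexicon(reviews, min_count=5, smoothing=1.0):
    """Derive a domain-specific sentiment lexicon from rated reviews.

    reviews : iterable of (text, rating) pairs; ratings >= 4 are treated as
    positive and <= 2 as negative. Each word receives a polarity score equal
    to the smoothed log-odds of appearing in positive versus negative reviews."""
    pos, neg = Counter(), Counter()
    for text, rating in reviews:
        words = re.findall(r"[a-z']+", text.lower())
        if rating >= 4:
            pos.update(words)
        elif rating <= 2:
            neg.update(words)
    lexicon = {}
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    for w in set(pos) | set(neg):
        if pos[w] + neg[w] < min_count:
            continue
        p = (pos[w] + smoothing) / (n_pos + smoothing)
        q = (neg[w] + smoothing) / (n_neg + smoothing)
        lexicon[w] = math.log(p / q)          # > 0 positive, < 0 negative
    return lexicon

reviews = [("great battery, razor sharp lens", 5),
           ("lens arrived scratched, terrible support", 1),
           ("great value, sharp photos", 4),
           ("terrible battery life", 2)]
lex = learn_sentiment_lexicon(reviews, min_count=1)
print(sorted(lex.items(), key=lambda kv: -kv[1])[:3])   # most positive domain words
```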