11 results for policy implementation analysis
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Peer-reviewed international conference paper: 2012 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 22-27 July 2012, Munich, Germany
Abstract:
Endmember extraction (EE) is a fundamental and crucial task in hyperspectral unmixing. Among other methods, vertex component analysis (VCA) has become a very popular and useful tool to unmix hyperspectral data. VCA is a geometry-based method that extracts endmember signatures from large hyperspectral datasets without using any a priori knowledge about the constituent spectra. Many hyperspectral imagery applications require a response in real time or near-real time; to meet this requirement, this paper proposes a parallel implementation of VCA developed for graphics processing units (GPUs). The impact of the proposed parallel implementation on the complexity and accuracy of VCA is examined using both simulated and real hyperspectral datasets.
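To make the geometric idea concrete, the following minimal NumPy sketch reproduces the core VCA iteration: at each step the data are projected onto a random direction orthogonal to the subspace spanned by the endmembers already found, and the pixel with the largest projection is taken as the next simplex vertex. It omits the SNR-dependent dimensionality reduction of the full algorithm, and the function and variable names are ours, not the paper's.

import numpy as np

def vca_sketch(Y, p, seed=0):
    """Simplified VCA-style endmember extraction.
    Y: (bands, pixels) data matrix; p: number of endmembers to extract."""
    rng = np.random.default_rng(seed)
    L, N = Y.shape
    E = np.zeros((L, p))            # endmember signatures found so far
    indices = []
    for k in range(p):
        w = rng.standard_normal(L)  # random direction
        if k > 0:                   # make it orthogonal to span(E[:, :k])
            Q, _ = np.linalg.qr(E[:, :k])
            w -= Q @ (Q.T @ w)
        proj = w @ Y                # dominant cost: one pass over all pixels
        j = int(np.argmax(np.abs(proj)))
        E[:, k] = Y[:, j]
        indices.append(j)
    return E, indices

The projection w @ Y touches every pixel and dominates the run time, which is why this step is the natural candidate for the GPU parallelization the paper describes.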
Abstract:
A previously developed model is used to numerically simulate real clinical cases of the surgical correction of scoliosis. This model consists of one-dimensional finite elements with spatial deformation in which (i) the column is represented by its axis; (ii) the vertebrae are assumed to be rigid; and (iii) the deformability of the column is concentrated in springs that connect the successive rigid elements. The metallic rods used for the surgical correction are modeled by beam elements with linear elastic behavior. To obtain the forces at the connections between the metallic rods and the vertebrae, geometrically non-linear finite element analyses are performed. The tightening sequence determines the magnitude of the forces applied to the patient's column, and it is desirable to keep those forces as small as possible. In this study, a genetic algorithm (GA) optimization is applied to this model in order to determine the sequence that minimizes the corrective forces applied during the surgery. This amounts to finding the optimal permutation of the integers 1, ..., n, with n being the number of vertebrae involved; we are thus faced with a combinatorial optimization problem isomorphic to the Traveling Salesman Problem. The fitness evaluation requires one computationally intensive finite element analysis per candidate solution, and thus a parallel implementation of the genetic algorithm is developed, as sketched below.
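As an illustration of this approach, here is a minimal Python sketch of a permutation GA with parallel fitness evaluation. The real fitness, one geometrically non-linear finite element analysis per candidate tightening sequence, is replaced by a cheap synthetic surrogate so the sketch is runnable; the population size, rates, and all names are illustrative choices, not the paper's.

import numpy as np
from multiprocessing import Pool

N_VERT = 10  # number of instrumented vertebrae (hypothetical)

def fitness(order):
    """Placeholder for the real objective: one geometrically non-linear
    FE analysis returning the corrective forces for this tightening
    sequence. A cheap synthetic surrogate stands in for it here."""
    return float(np.sum(np.abs(np.diff(order))))

def ox_crossover(p1, p2, rng):
    """Order crossover (OX): copy a slice of p1, fill the remaining slots
    in p2's order, so the child is always a valid permutation."""
    n = len(p1)
    a, b = sorted(rng.choice(n, size=2, replace=False))
    child = [-1] * n
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child[a:b]]
    for i in list(range(a)) + list(range(b, n)):
        child[i] = fill.pop(0)
    return child

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pop = [list(rng.permutation(N_VERT)) for _ in range(40)]
    with Pool() as pool:                    # candidate evaluations run in parallel
        for gen in range(50):
            scores = pool.map(fitness, pop)
            ranked = [p for _, p in sorted(zip(scores, pop))]
            elite = ranked[: len(pop) // 2]
            children = []
            for _ in range(len(pop) - len(elite)):
                c = ox_crossover(elite[rng.integers(len(elite))],
                                 elite[rng.integers(len(elite))], rng)
                if rng.random() < 0.2:      # swap mutation keeps a valid permutation
                    i, j = rng.choice(N_VERT, size=2, replace=False)
                    c[i], c[j] = c[j], c[i]
                children.append(c)
            pop = elite + children
    print("best sequence:", ranked[0])

Because each real evaluation is an independent FE run, the pool.map step scales almost linearly with the number of workers, which is the motivation for the paper's parallel GA.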
Abstract:
At the end of 2009, Portugal committed to new energy policy objectives and established energy efficiency as a priority, namely through programmes to reduce energy consumption in the Public Administration and through the promotion of behaviours and choices that minimize energy consumption. In this context, the main objective of this work is to contribute to the Government's new energy policy by studying the impact of new technologies and new procedures on increasing the efficiency of public lighting. This dissertation analyses the existing public lighting systems and also studies the technological innovations available on the market. Based on this characterization, and on economic and technical criteria supported by the national and international standards in force, the solutions that maximize overall energy efficiency are identified and proposed for field implementation. The impact of the proposed changes on the existing electrical system is also analysed. The experimental component of this dissertation was carried out at Parque das Nações with the collaboration of Parque Expo – Gestão Urbana do Parque das Nações, S.A., the entity that manages the public space of Parque das Nações. Parque das Nações is of undeniable interest for the objectives of this dissertation because it offers a wide variety of public lighting solutions, is a short distance from ISEL, and frequently serves as a technological "showcase" for the country.
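As a back-of-the-envelope illustration of the kind of economic criterion involved in comparing lighting solutions, the snippet below computes the annual energy saving and simple payback of replacing one luminaire. All figures are hypothetical placeholders, not values from the dissertation.

# Hypothetical figures for a single-luminaire retrofit (placeholders,
# not values from the dissertation)
old_power_w = 250        # existing high-pressure sodium luminaire
new_power_w = 120        # candidate LED replacement
hours_per_year = 4100    # assumed annual burning hours of public lighting
tariff_eur_kwh = 0.12    # illustrative electricity tariff
retrofit_cost_eur = 380.0

saving_kwh = (old_power_w - new_power_w) * hours_per_year / 1000
saving_eur = saving_kwh * tariff_eur_kwh
payback_years = retrofit_cost_eur / saving_eur
print(f"{saving_kwh:.0f} kWh/year saved, {saving_eur:.2f} EUR/year, "
      f"payback {payback_years:.1f} years")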
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa to obtain the degree of Master in Educational Sciences, specialization in School Administration
Abstract:
Cancer is a national and international health care concern. It is important to find strategies for early diagnosis, as well as to optimize the various therapeutic options currently available in Portugal. Cancer is the second leading cause of death in Portugal; this study focuses on radiotherapy because of its importance in cancer treatment and because it is the therapy used in 40% of oncology patients. Radiation therapy has evolved technologically, enabling new treatment techniques that are more efficient and that also promote greater professional satisfaction. Hadrons are charged particles used in cancer therapy, and these particles can bring a paradigm shift in the therapeutic approach to radiotherapy. The technique studied here is proton therapy, which has been shown to be more accurate, more efficacious, and less toxic to the surrounding tissue. Proton therapy may thus be a promising development for oncology and for how radiotherapy treatment is delivered. Although the benefits of proton therapy in oncology are recognized, it is also important to take into consideration its costs, which are considerably higher than those of conventional radiotherapy treatments. Given the lack of a proton therapy service in Portugal, this study is a documentary analysis of clinical records with the following objectives: to identify the number of cancer patients diagnosed in 2010 in Portugal, and to calculate the estimated number of patients who could have been treated with proton therapy according to the Health Council of the Netherlands registration document.
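A minimal sketch of the kind of estimate the second objective implies: applying an eligibility fraction to the national incidence. Both the incidence count and the eligibility fraction below are hypothetical placeholders, not figures from the study or from the Health Council of the Netherlands document.

# Illustrative estimate only: incidence and eligibility are placeholders.
new_cancer_cases_2010 = 45_000   # hypothetical number of patients diagnosed in 2010
radiotherapy_fraction = 0.40     # share treated with radiotherapy (from the abstract)
proton_eligibility = 0.10        # hypothetical fraction eligible for proton therapy

candidates = new_cancer_cases_2010 * radiotherapy_fraction * proton_eligibility
print(f"Estimated proton therapy candidates: {candidates:.0f}")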
Abstract:
Accounting and Management of Financial Institutions
Abstract:
Article based on the paper delivered at the 1st International Symposium on Media Studies, held at Akdeniz Universitesi Yayınları, Antalya, Turkey, 21-23 November 2013
Abstract:
The scope of this paper is to adapt the standard mean-variance model of Harry Markowitz's portfolio theory, creating a simulation tool that finds the optimal configuration of a portfolio aggregator and calculates its profitability and risk. There is currently a deep discussion within the power systems community about the structure and architecture of the future electric system. In this environment, policy makers and electric utilities seek new approaches to access the electricity market, which creates challenging new positions that call for innovative strategies and methodologies. Decentralized power generation is gaining relevance in liberalized markets, and small and medium-size electricity consumers are also becoming producers ("prosumers"). In this scenario, an electric aggregator is an entity that joins a group of electric clients, customers, producers, and "prosumers" together as a single purchasing unit to negotiate the purchase and sale of electricity. The aggregator researches electricity prices and contract terms and conditions in order to obtain better energy prices for its clients, allowing small and medium customers to benefit from improved market prices.
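The following NumPy sketch shows the equality-constrained core of such a mean-variance tool: it minimizes portfolio variance w'Cw subject to a target expected return and fully invested weights, by solving the KKT linear system directly. It allows short positions and uses toy data; the aggregator-specific constraints of the paper's tool would require an inequality-constrained quadratic programming solver. Names and figures are ours.

import numpy as np

def mean_variance_weights(returns, target):
    """Markowitz mean-variance weights: minimize w' C w subject to
    w' mu = target and sum(w) = 1, via the KKT linear system.
    Short positions are allowed (no inequality constraints)."""
    mu = returns.mean(axis=0)
    C = np.cov(returns, rowvar=False)
    n = len(mu)
    A = np.zeros((n + 2, n + 2))
    A[:n, :n] = 2.0 * C          # stationarity: 2 C w + l1 mu + l2 1 = 0
    A[:n, n] = mu
    A[n, :n] = mu                # return constraint: mu' w = target
    A[:n, n + 1] = 1.0
    A[n + 1, :n] = 1.0           # budget constraint: 1' w = 1
    b = np.zeros(n + 2)
    b[n] = target
    b[n + 1] = 1.0
    w = np.linalg.solve(A, b)[:n]
    risk = float(np.sqrt(w @ C @ w))   # portfolio standard deviation
    return w, risk

# toy data: 250 periods of returns for 4 hypothetical aggregator positions
rng = np.random.default_rng(0)
R = rng.normal(0.0005, 0.01, size=(250, 4))
w, risk = mean_variance_weights(R, target=0.0006)
print("weights:", w, "risk:", risk)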
Abstract:
Brain dopamine transporter imaging by Single Photon Emission Computed Tomography (SPECT) with 123I-FP-CIT (DaTScanTM) has become an important tool in the diagnosis and evaluation of Parkinson syndromes. This diagnostic method allows the visualization of a portion of the striatum, where the healthy pattern resembles two symmetric commas, enabling the evaluation of the presynaptic dopaminergic system, in which dopamine transporters are responsible for the release of dopamine into the synaptic cleft and for its reuptake into the nigrostriatal nerve terminals, where it is stored or degraded. In daily practice, the assessment of DaTScanTM commonly relies on visual inspection alone. However, this process is complex and subjective, as it depends on the observer's experience, and it is associated with high intra- and inter-observer variability. Studies have shown that semiquantification can improve the diagnosis of Parkinson syndromes. Semiquantification requires image segmentation methods based on regions of interest (ROIs): ROIs are drawn over specific (striatum) and nonspecific (background) uptake areas, and specific binding ratios are then calculated. The low adoption of semiquantification in the diagnosis of Parkinson syndromes is related not only to the time it requires, but also to the need for a database of reference values adapted to the population concerned and to each department's examination protocol; studies have concluded that such a database increases the reproducibility of semiquantification. The aim of this investigation was to create and validate a database of healthy controls for dopamine transporter imaging with DaTScanTM, named DBRV. The created database is adapted to the protocol of the Nuclear Medicine Department and to the population of the Infanta Cristina Hospital in Badajoz, Spain.
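For concreteness, the specific binding ratio mentioned above is computed as (striatal − background)/background mean counts. A minimal sketch, assuming the mean ROI counts have already been extracted and using the occipital cortex as the nonspecific reference region (a common but not universal choice); the numbers are illustrative, not data from the study.

import numpy as np

def specific_binding_ratio(striatal_counts, background_counts):
    """Standard DaTScan semiquantification ratio:
    SBR = (mean striatal counts - mean background counts) / mean background counts."""
    s = np.mean(striatal_counts)
    b = np.mean(background_counts)
    return (s - b) / b

# hypothetical mean counts from manually drawn ROIs (illustrative values only)
left_striatum_roi = [152.0, 148.3, 150.9]
occipital_background_roi = [48.7, 50.1, 49.4]
print(f"SBR (left striatum): "
      f"{specific_binding_ratio(left_striatum_roi, occipital_background_roi):.2f}")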
Abstract:
Hyperspectral imaging can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems of hyperspectral data analysis is the presence of mixed pixels, due to the low spatial resolution of such images: several spectrally pure signatures (endmembers) are combined into the same mixed pixel. Linear spectral unmixing follows an unsupervised approach that aims at inferring pure spectral signatures and their material fractions at each pixel of the scene. The huge data volumes acquired by such sensors put stringent requirements on processing and unmixing methods. This paper proposes an efficient GPU implementation, using CUDA, of an unsupervised linear unmixing method, SISAL (simplex identification via split augmented Lagrangian). The method finds the smallest simplex enclosing the data by solving a sequence of nonsmooth convex subproblems, using variable splitting to obtain a constrained formulation and then applying an augmented Lagrangian technique. The parallel implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses. The results presented herein indicate that the GPU implementation can significantly accelerate the method's execution on large datasets while maintaining its accuracy.
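To fix ideas, here is a minimal NumPy sketch of the SISAL-style objective: a negative log-determinant term, proportional to the negative log-volume of the simplex, plus a hinge penalty that softly enforces nonnegative abundances. It uses a plain subgradient step instead of the variable-splitting/augmented-Lagrangian solver of the actual method, and none of the GPU-level optimizations; names are ours.

import numpy as np

def sisal_objective(Q, Y, lam):
    """SISAL-style objective: -log|det Q| plus a hinge penalty on
    negative abundances (soft positivity constraint).
    Q: (p, p) inverse endmember matrix, so abundances are A = Q @ Y.
    Y: (p, N) data already projected onto the p-dimensional signal subspace."""
    A = Q @ Y
    hinge = np.maximum(-A, 0.0)          # penalizes only negative abundances
    _, logdet = np.linalg.slogdet(Q)
    return -logdet + lam * hinge.sum()

def subgradient_step(Q, Y, lam, lr=1e-3):
    """One plain subgradient step. The actual SISAL solver instead uses
    variable splitting plus an augmented Lagrangian, which converges much
    faster; this step is only to make the objective concrete."""
    A = Q @ Y
    g_hinge = lam * (np.where(A < 0.0, -1.0, 0.0) @ Y.T)   # d(penalty)/dQ
    g = -np.linalg.inv(Q).T + g_hinge                      # d(-log|det Q|)/dQ
    return Q - lr * g

The dominant cost in both functions is the dense product Q @ Y over all N pixels; that product, repeated at every iteration, is the kind of operation the paper's CUDA implementation maps onto shared memory and coalesced accesses.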