Resumo:
In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines hosts one of the most important deep-water ports, with oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructure face the ocean to the southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by the maximum wave height, flow depth, drawback, inundation area, and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site for the individual scenarios at mean sea level, for the aggregate scenario, and for the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults (HSMPF) as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario in about 60 % of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km². © Author(s) 2015.
Resumo:
Materials selection is of great importance to engineering design, and software tools are valuable for informing decisions in the early stages of product development. However, when a set of alternative materials is available for the different parts a product is made of, choosing the optimal material mix for a group of parts is not trivial, so the engineer/designer typically proceeds part by part. Optimizing each part per se can lead to a globally sub-optimal solution from the product point of view. An optimization procedure is therefore needed that handles products with multiple parts, each with discrete design variables, and that can determine the optimal solution under different objectives. To solve this multiobjective optimization problem, a new routine based on the Direct MultiSearch (DMS) algorithm is created. Results from the Pareto front can help the designer align the materials selection for a complete set of materials with the product attribute objectives, depending on the relative importance of each objective.
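The discrete, multi-part nature of this selection problem can be illustrated with a minimal sketch (not the DMS routine of the paper): enumerate every material assignment for a toy two-part product and keep only the non-dominated combinations. All material data, part counts and objective values below are hypothetical.

```python
from itertools import product

# Hypothetical per-part material options, each scored as (cost, mass).
# Illustrative numbers only; the paper uses Direct MultiSearch (DMS),
# which is not reproduced here.
part_options = [
    [(2.0, 5.0), (4.0, 3.0), (6.0, 2.0)],   # part 1: three candidate materials
    [(1.0, 4.0), (3.0, 2.5)],               # part 2: two candidate materials
]

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(options):
    # Product-level objectives = sum of the chosen materials' objectives.
    candidates = [tuple(map(sum, zip(*combo))) for combo in product(*options)]
    return sorted({c for c in candidates
                   if not any(dominates(o, c) for o in candidates)})

print(pareto_front(part_options))
# → [(3.0, 9.0), (5.0, 7.0), (7.0, 5.5), (9.0, 4.5)]
```

Note how optimizing part 1 alone (cheapest material) need not land on the product-level trade-off curve; exhaustive enumeration is only viable for small option sets, which is why a derivative-free method such as DMS is used in practice.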
Resumo:
In the last decade, local image features have been widely used in robot visual localization. To assess image similarity, a strategy exploiting these features compares raw descriptors extracted from the current image with those in the models of places. This paper addresses the ensuing step in this process, where a combining function must be used to aggregate results and assign each place a score. Casting the problem in the multiple classifier systems framework, we compare several candidate combiners with respect to their performance in the visual localization task. For this evaluation, we selected the most popular methods in the class of non-trained combiners, namely the sum rule and the product rule. A deeper insight into the potential of these combiners is provided through a discriminativity analysis involving the algebraic rules and two extensions of these methods: the thresholded and the weighted modifications. In addition, a voting method previously used in robot visual localization is assessed. Furthermore, we address the process of constructing a model of the environment by describing how the model granularity impacts performance. All combiners are tested on a visual localization task carried out on a public dataset. It is experimentally demonstrated that the sum rule extensions achieve the best overall performance, confirming the general agreement on the robustness of this rule in other classification problems. The voting method, whilst competitive with the product rule in its standard form, is shown to be outperformed by its modified versions.
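The non-trained combiners discussed above admit a compact illustration. The sketch below, with purely hypothetical scores, applies the sum rule, the product rule, and majority voting over per-classifier scores for three candidate places; it is not the paper's evaluation code.

```python
import numpy as np

# Illustrative scores only: rows = individual classifiers (e.g. matched
# local features), columns = candidate places; each row holds
# posterior-like scores for the three places.
scores = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.4, 0.1],
    [0.2, 0.7, 0.1],
])

sum_rule = scores.sum(axis=0)        # add per-classifier scores per place
product_rule = scores.prod(axis=0)   # multiply per-classifier scores per place
# Voting: each classifier votes for its top-scoring place.
votes = np.bincount(scores.argmax(axis=1), minlength=scores.shape[1])

print(sum_rule.argmax(), product_rule.argmax(), votes.argmax())
# → 1 1 0  (voting can disagree with the algebraic rules)
```

The toy example already shows the qualitative point of such comparisons: voting discards score magnitudes, so it can pick a different place than the sum and product rules when one classifier is strongly confident.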
Resumo:
The investigation of the web of relationships between the different elements of the immune system has proven instrumental to better understand this complex biological system. This is particularly true in the case of the interactions between B and T lymphocytes, both during cellular development and at the stage of cellular effector functions. Understanding the B–T cell interdependency, and the possibility of manipulating this relationship, may be directly applicable to situations where immunity is deficient, as is the case in cancer or in immune suppression after radiotherapy and chemotherapy. The work presented here started with the development of a novel and accurate tool to directly assess the diversity of the cellular repertoire (Chapter III). Contractions of T cell receptor diversity have been related to a deficient immune status. This method uses gene chip platforms to which nucleic acids coding for lymphocyte receptors are hybridized, and is based on the fact that the frequency of hybridization of nucleic acids to the oligonucleotides on a gene chip varies in direct proportion to diversity. Subsequently, using this new method and other techniques of cell quantification, I examined, in an animal model, the role that polyclonal B cells and immunoglobulin exert upon T cell development in the thymus, specifically on the acquisition of a broader repertoire diversity by the T cell receptors (Chapters IV and V). The hypothesis tested was whether the presence of more diverse peptides in the thymus, namely polyclonal immunoglobulin, would induce the generation of more diverse T cell precursors. The results obtained demonstrated that the diversity of the T cell compartment is increased by the presence of polyclonal immunoglobulin. 
Polyclonal immunoglobulin, and particularly the Fab fragments of the molecule, represent the most diverse self-molecules in the body, and their peptides are presented by antigen presenting cells to precursor T cells in the thymus during their development. This probably contributes significantly to the generation of receptor diversity. Furthermore, we also demonstrated that a more diverse repertoire of T lymphocytes is associated with a more effective and robust T cell immune function in vivo, as mice with more diverse T cell receptors reject minor-histocompatibility-discordant skin grafts faster than mice with a shrunken T cell receptor repertoire (Chapter V). We believe that a broader T cell receptor diversity allows a more efficient recognition and rejection of a wider range of external and internal aggressions. In this work it is demonstrated that a reduction of TCR diversity by thymectomy in wild-type mice significantly increased the survival of H-Y-incompatible skin grafts, indicating a decrease in T cell function. In addition, reconstitution of T cell diversity with immunoglobulin Fab fragments in mice with a decreased T cell repertoire diversity led to an increase in TCR diversity and to a significantly decreased survival of the skin grafts (Chapter V). These results strongly suggest that increases in T cell repertoire diversity contribute to improved T cell function. Our results may have important implications for therapy and immune reconstitution in the context of AIDS, cancer, autoimmunity and post-myeloablative treatments. Based on the previous results, we tested the clinical hypothesis that patients with haematological malignancies subjected to stem cell transplantation who recovered a robust immune system would have better survival than patients who did not. 
This study was undertaken by examining the progression and overall survival of 42 patients with mantle cell non-Hodgkin lymphoma receiving autologous hematopoietic stem cell transplantation (Chapter VI). The results obtained show that patients who recovered higher numbers of lymphocytes soon after autologous transplantation had statistically significantly longer progression-free and overall survival. These results demonstrate the positive impact that a more robust immune system reconstitution after stem cell transplantation may have upon the survival of patients with haematological malignancies. In a similar clinical research framework, this dissertation also includes the study of the impact of recovering normal serum levels of polyclonal immunoglobulin on the survival of patients with another B cell haematological malignancy, multiple myeloma, after autologous stem cell transplantation (Chapter VII). The relapse-free survival of the 110 patients with multiple myeloma analysed was associated with their ability to recover normal serum levels of the polyclonal compartment of immunoglobulin. These results again suggest the important effect of polyclonal immunoglobulin on the (re)generation of immune competence. We also studied the impact of robust immunity on the response to treatment with the anti-CD20 antibody rituximab in patients with non-Hodgkin lymphoma (NHL) (Chapter VIII). Patients with higher absolute counts of CD4+ T lymphocytes respond better (in terms of longer progression-free survival) to rituximab than patients with lower numbers of CD4+ T lymphocytes. These observations highlight again the fact that a competent immune system is required for the clinical benefit of rituximab therapy in NHL patients. In conclusion, the work presented in this dissertation demonstrates, for the first time, that diverse B cells and polyclonal immunoglobulin promote T cell diversification in the thymus and improve T lymphocyte function. 
Also, it shows that in the setting of immune reconstitution, as after autologous stem cell transplantation for mantle cell lymphoma and in the setting of immune therapy for NHL, the absolute lymphocyte counts are an independent factor predicting progression-free and overall survival. These results can have an important application in clinical practice, since the majority of current treatments for cancer are immunosuppressive and entail subsequent immune recovery. Also, the effects of a number of antineoplastic treatments, including biological agents, depend on immune system activity. In this way, studies similar to the ones presented here, in which methods to improve immune reconstitution are examined, may prove instrumental for a better understanding of the immune system and may guide more efficient treatment options and the design of future clinical trials.

Resumo: The study of the network of inter-relationships between the different elements of the immune system has proven to be an essential instrument for a better understanding of this complex biological system. This is particularly true for the interactions between B and T lymphocytes, both during cell development and at the level of effector cell functions. Understanding the interdependence between B and T lymphocytes, and the possibility of manipulating this relationship, may be directly applicable to situations of deficient immunity, such as neoplastic disease or immunosuppression after radiotherapy or chemotherapy. The work presented in this dissertation began with the development of a new laboratory method to measure directly the diversity of the cellular repertoire (Chapter III). Reductions in the diversity of the T cell receptor repertoire have been related to immunodeficiency. The method developed uses gene chips, to which the nucleic acids encoding the protein chains of the lymphocyte receptors hybridize. Diversity is calculated from the frequency of hybridization of the sample's nucleic acid to the oligonucleotides present on the gene chip. Next, using this new method and other cell-quantification techniques, I examined, in an animal model, the role that polyclonal B cells and immunoglobulin play in T lymphocyte development in the thymus, specifically in the acquisition of a diverse repertoire of T cell receptors (Chapters IV and V). I then tested the hypothesis that the presence in the thymus of more diverse peptides, such as polyclonal immunoglobulin, would induce the generation of more diverse T cell precursors. We demonstrated that the diversity of the T cell compartment is increased by the presence of polyclonal immunoglobulin. Polyclonal immunoglobulin, and particularly the Fab fragments of this molecule, represent the most diverse autologous molecules present in vertebrate organisms. These peptides are presented by antigen-presenting cells to T cell precursors in the thymus during T cell development, which probably contributes to the generation of receptor diversity. We also demonstrated that a more diverse repertoire of T lymphocytes is associated with increased T cell immune function in vivo. A broader T cell receptor diversity appears to allow more efficient recognition and rejection of a greater number of internal and external aggressors. We showed that mice with more diverse T cell receptors (TCR) reject skin grafts discordant for minor histocompatibility antigens faster than mice with a smaller T cell repertoire (Chapter V). Conversely, a reduction in TCR diversity, caused by thymectomy of wild-type mice, significantly increased the survival of skin grafts incompatible for the H-Y antigen (a minor histocompatibility antigen), indicating a decrease in T lymphocyte function. Moreover, reconstitution of T lymphocyte diversity in mice with reduced repertoire diversity, induced by the administration of immunoglobulin Fab fragments, led to an increase in TCR diversity and a significant decrease in skin graft survival (Chapter V). These results suggest that increasing the T cell repertoire improves T cell function and may have important implications for therapy and immune reconstitution in the context of AIDS, neoplasia, autoimmunity and after myeloablative treatments. Based on these results, we tested the clinical hypothesis that patients with haematological malignancies undergoing haematopoietic stem cell transplantation who show early immune recovery would have longer survival than patients with poorer immune recovery. We analysed the overall and disease-free survival of 42 patients with mantle cell non-Hodgkin lymphoma who underwent autologous haematopoietic stem cell transplantation (Chapter VI). The results showed that patients who recovered higher lymphocyte counts immediately after autologous transplantation had longer overall and progression-free survival than patients whose lymphocyte counts recovered less promptly. These results demonstrate the positive effect of robust immune reconstitution after haematopoietic stem cell transplantation on the survival of patients with haematological malignancies. Likewise, we studied the effect that recovery of normal serum levels of polyclonal immunoglobulin has on the survival of patients with another B cell haematological malignancy, multiple myeloma, after autologous haematopoietic stem cell transplantation (Chapter VII). The disease-free survival of the 110 multiple myeloma patients analysed was associated with their ability to recover normal serum levels of the polyclonal immunoglobulin compartment. These pioneering results indicate the importance of polyclonal immunoglobulin for the generation of immune competence. We also studied the impact of an efficient immune system on the response to treatment with the anti-CD20 antibody rituximab in patients with non-Hodgkin lymphoma (NHL) (Chapter VIII). The results show that patients with higher CD4+ T lymphocyte counts respond better (in terms of longer disease-free survival) to rituximab than patients with lower counts. These observations illustrate the need for a competent immune system for the clinical benefit of rituximab therapy in NHL patients. In conclusion, the work presented in this dissertation demonstrates that B cells and polyclonal immunoglobulin promote T cell diversity in the thymus and improve peripheral T lymphocyte function. Concomitantly, we also demonstrated that, in the context of immune reconstitution, for example after autologous haematopoietic stem cell transplantation in patients with mantle cell lymphoma, the absolute lymphocyte count is an independent factor for survival. The results also demonstrate the importance of T lymphocyte counts in the response to rituximab treatment in NHL patients. The same principle is shown by the fact that multiple myeloma patients undergoing autologous stem cell transplantation who recover normal serum levels of polyclonal immunoglobulins have better response rates than patients who do not. These results may have important applications in clinical practice, since most treatments for neoplastic disease involve immunosuppression and subsequent immune recovery. These studies may be a fundamental instrument for a better understanding of the immune system, guiding a more efficient choice of therapeutic options and contributing to the design of future clinical trials.
Resumo:
Work presented within the scope of the European Master in Computational Logics, as a partial requirement for obtaining the Master's degree in Computational Logics.
Resumo:
The development of high spatial resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixing of components originated by the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], the spectral signature matching [21], the spectral angle mapper [22], and the subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. 
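The projection idea just described can be sketched as follows, under simplifying assumptions (noise-free synthetic data, random signatures); this is an illustration of the orthogonal subspace projection concept, not the implementation of [23].

```python
import numpy as np

rng = np.random.default_rng(0)

L, p = 50, 3                      # bands, endmembers (toy sizes)
M = rng.random((L, p))            # endmember signatures in columns
d = M[:, 0]                       # target signature
U = M[:, 1:]                      # undesired signatures

# Projector onto the orthogonal complement of the undesired-signature
# subspace: P annihilates every column of U.
P = np.eye(L) - U @ np.linalg.pinv(U)

# A noise-free pixel mixed with abundances (0.5, 0.3, 0.2).
x = M @ np.array([0.5, 0.3, 0.2])

# OSP detector output, normalized by the projected target energy.
osp = d @ P @ x
print(round(osp / (d @ P @ d), 3))   # ≈ 0.5, the target abundance
```

Because P suppresses the undesired signatures exactly, correlating the projected pixel with the target and normalizing recovers the target abundance in this noise-free setting; with noise, the estimate degrades gracefully.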
As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, to feature extraction, and to unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data yielding statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where the sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward. In the second approach, ICA is based on the assumption of mutually independent sources, which does not hold for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. 
IFA [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises IFA performance, as in the ICA case. Under the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are complex from the computational point of view: usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. Aiming at a lower computational complexity, algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requirement that may not hold in some data sets. In any case, these algorithms find the set of purest pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, the processing of hyperspectral data, including unmixing, is very often preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). 
Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model. This model takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55]. We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints. The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. 
This approach is in the vein of references 39 and 56, replacing the independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief review of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms with experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
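As a side illustration of the dimensionality reduction step discussed above, the following sketch (synthetic data, all sizes hypothetical) applies PCA via the SVD to linearly mixed spectra: because abundances sum to one, the centred data concentrate in a (p − 1)-dimensional subspace.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data cube flattened to (pixels, bands): linear mixtures of p endmembers.
n_pix, n_bands, p = 200, 60, 3
M = rng.random((n_bands, p))                     # endmember signatures
A = rng.dirichlet(np.ones(p), size=n_pix)        # abundances sum to 1
X = A @ M.T + 0.001 * rng.standard_normal((n_pix, n_bands))

# PCA via SVD of the mean-centred data; keep p - 1 components, since
# mixtures on a simplex span an affine subspace of dimension p - 1.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
energy = (s**2) / (s**2).sum()
print(round(energy[:p - 1].sum(), 3))   # ≈ 1: two components capture the data
```

The retained variance shows why projecting onto a few principal components preserves the mixing geometry while sharply reducing the band count handed to the unmixing stage.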
Resumo:
The egg of Anopheles (Anopheles) intermedius (Peryassu, 1908) is described and illustrated with scanning electron micrographs. Literature data on An. (Ano.) maculipes (Theobald, 1903) are provided.
Resumo:
Dissertation presented to obtain the degree of Master in Mathematics and Applications, specialization in Actuarial Science, Statistics and Operational Research.
Resumo:
Dissertation presented in fulfillment of the requirements for the Degree of Doctor of Philosophy in Biology (Molecular Genetics) at the Instituto de Tecnologia Química e Biológica da Universidade Nova de Lisboa
Resumo:
The aim of this case series was to describe the clinical, laboratory and epidemiological characteristics and the presentation of the bacillary angiomatosis cases (and/or parenchymal bacillary peliosis) identified in five public hospitals of Rio de Janeiro state between 1990 and 1997; these cases were compared with those previously described in the medical literature. Thirteen case-patients were enrolled in the study; the median age was 39 years and all patients were male. All patients were infected with human immunodeficiency virus type 1 (HIV-1) and had previous or concomitant HIV-associated opportunistic infections or malignancies diagnosed at the time bacillary angiomatosis was diagnosed. The median T4 helper lymphocyte count was 96 cells per mm³. Cutaneous involvement was the most common clinical manifestation of bacillary angiomatosis in this study. Clinical remission following appropriate treatment was more common in our case series than reported in the medical literature, while the incidence of relapse was similar. The frequency of bacillary angiomatosis in HIV patients, calculated from two of the hospitals included in our study, was 1.42 cases per 1000 patients, similar to the frequencies reported in the medical literature. Bacillary angiomatosis is an unusual opportunistic disease in our setting.
Resumo:
A thirty-three-year-old male patient was admitted to the Hospital of the São Paulo University School of Medicine, in the city of São Paulo, Brazil, complaining of pain, tingling and decreased sensation in the right hand over the previous four months. This had progressed to the left hand, left foot and right foot, in addition to difficulty flexing and extending the left foot. Tests were positive for HBeAg, IgM anti-HBc and HBsAg, thus characterizing acute hepatitis B. The serum ALT level was 15 times the upper normal limit. Blood glucose, cerebrospinal fluid, antinuclear antibodies (ANA) and anti-HIV and anti-HCV serum tests were either normal or negative. Electroneuromyography disclosed severe peripheral neuropathy with axonal predominance and signs of denervation; nerve biopsy disclosed intense vasculitis. A diagnosis of multiple confluent mononeuropathy associated with acute hepatitis B was made. This association is not often reported in the international literature, and its probable cause is either the direct action of the hepatitis B virus on the nerves or a vasculitis of the vasa nervorum brought about by deposits of immune complexes.
Resumo:
Humanity currently faces the great challenge of making the transition to a sustainable future. The energy sector plays a key role in this transition, with particular emphasis on solar energy, one of the most promising renewable sources, which in the medium to long term may become one of the main sources in the energy mix of many countries. Concentrating solar power (CSP), although still little known in Portugal, has relevant potential in specific regions of the country. The objective of this work is therefore a detailed analysis of concentrating solar systems for electricity production, covering topics such as the potential of solar energy, the definition of the solar concentration process, the description of the existing technologies, the state of the art of CSP, the world CSP market and, finally, the technical and economic feasibility of installing a 20 MW solar tower plant in Portugal. To this end, a thermodynamic simulation tool for CSP plants, the Solar Advisor Model (SAM), was used. The case study was developed for the city of Faro, where four different configurations of a 20 MW solar tower plant were simulated. Results are presented for the daily and annual performance of the plant. An analysis was carried out to evaluate the influence of geographic location, solar multiple, thermal storage capacity and hybridization fraction on the levelized cost of energy (LCOE), the capacity factor and the annual energy production. A sensitivity analysis is also presented to determine which parameters most strongly influence the LCOE. Finally, an economic feasibility analysis of an investment of this type is presented.
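The levelized cost of energy mentioned above is, in its standard discounted form, the ratio of discounted lifetime costs to discounted lifetime energy production. The sketch below uses this textbook formula with purely illustrative figures; it does not reproduce SAM's far more detailed financial model.

```python
def lcoe(capex, opex_per_year, energy_per_year, years, discount_rate):
    """Levelized cost of energy: discounted costs over discounted energy.

    All inputs are illustrative; real CSP studies (e.g. with SAM) model
    year-by-year generation, degradation and financing in far more detail.
    """
    costs = capex + sum(opex_per_year / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    energy = sum(energy_per_year / (1 + discount_rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Hypothetical 20 MW solar tower: 120 M€ capex, 4 M€/yr O&M,
# 48 GWh/yr output, 25-year life, 7 % discount rate → LCOE in €/kWh.
print(round(lcoe(120e6, 4e6, 48e6, 25, 0.07), 3))
```

Varying the inputs one at a time is exactly the kind of sensitivity analysis the study performs: capex and discount rate dominate the LCOE, while storage and hybridization act mainly through the annual energy term.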
Resumo:
In-network storage of data in wireless sensor networks helps reduce communications inside the network and favors data aggregation. In this paper, we consider the use of n-out-of-m codes and data dispersal in combination with in-network storage. In particular, we provide an abstract model of in-network storage to show how n-out-of-m codes can be used, and we discuss how this can be achieved in five case studies. We also define a model aimed at evaluating the probability of correct data encoding and decoding; using this model and simulations, we show how, in the case studies, the parameters of the n-out-of-m codes and of the network should be configured to achieve correct data encoding and decoding with high probability.
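The decoding-probability idea can be illustrated with a simple sketch: if the data are dispersed into m fragments, any n of which suffice to decode, and each fragment independently survives with probability p, then decoding succeeds with the binomial tail probability below. This is an illustrative independence model, not the paper's exact one.

```python
from math import comb

def decode_probability(n, m, p):
    """P(at least n of m fragments survive), i.e. the probability that an
    n-out-of-m code can reconstruct the data when each fragment survives
    independently with probability p (illustrative model only)."""
    return sum(comb(m, k) * p**k * (1 - p)**(m - k) for k in range(n, m + 1))

# Dispersing into m = 10 fragments with any n = 6 sufficing tolerates
# node losses far better than storing a single full copy with the same p.
print(round(decode_probability(6, 10, 0.9), 4))   # → 0.9984
```

Sweeping n, m and p in this model is how one would choose code and network parameters so that correct decoding holds with high probability, which is the configuration question the paper addresses with its more detailed model and simulations.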