852 results for robust estimator
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa for the degree of Master in Informatics Engineering
Abstract:
The application of femtosecond laser interferometry to the direct patterning of thin-film magnetic alloys is demonstrated. Stripe gratings with submicron periodicities are formed in Fe1-xVx (x = 18-34 wt.%) layers, with a difference in magnetic moments of up to Δμ/μ ∼ 20 between adjacent stripes but without any significant development of topographical relief (<1% of the film thickness). The produced gratings exhibit a robust shape-anisotropy effect on the in-plane magnetization curves. The data evidence ultrafast diffusive transformations associated with the process of spinodal decomposition and demonstrate an opportunity for producing magnetic nanostructures with engineered properties on this basis.
Abstract:
The new generations of SRAM-based FPGA (field-programmable gate array) devices are the preferred choice for implementing reconfigurable computing platforms intended to accelerate processing in real-time systems. However, the vulnerability of FPGAs to hard and soft errors is a major weakness for robust configurable system design. In this paper, a novel built-in self-healing (BISH) methodology, based on run-time self-reconfiguration, is proposed. A soft microprocessor core implemented in the FPGA is responsible for managing and executing all the BISH procedures. Fault detection and diagnosis are followed by repair actions, taking advantage of the dynamic reconfiguration features offered by new FPGA families. Meanwhile, modular redundancy ensures that the system continues to work correctly.
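The modular-redundancy safeguard mentioned above can be sketched as a 2-out-of-3 majority voter: while one replica is faulty (or being repaired by reconfiguration), the other two outvote it. This is an illustrative software model, not the paper's hardware design.

```python
# Illustrative 2-out-of-3 (triple modular redundancy) majority voter.
# In the paper this would be a hardware voter on three redundant FPGA
# module outputs; here we model the outputs as integers voted bitwise.
def tmr_vote(a, b, c):
    """Bitwise 2-out-of-3 majority of three redundant outputs."""
    return (a & b) | (a & c) | (b & c)

good, faulty = 0b1011, 0b0011        # one replica has a flipped bit (soft error)
result = tmr_vote(good, good, faulty)
print(bin(result))                   # the single wrong replica is outvoted
```

The voter masks any single faulty replica, which is what lets the BISH repair procedures run while the system keeps producing correct outputs.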
Abstract:
One of the most important measures to prevent wild forest fires is the use of prescribed and controlled burning, as it reduces the available fuel mass. The impact of these management activities on soil physical and chemical properties varies according to the type of both soil and vegetation. Decisions in forest management plans are often based on the results of soil-monitoring campaigns, which are typically labor-intensive and expensive. In this paper we have successfully used the multivariate statistical technique Robust Principal Component Analysis (ROBPCA) to investigate the effectiveness of the sampling procedure for two different methodologies, in order to assess the possibility of simplifying and reducing the sample collection process and its auxiliary laboratory analysis work, towards a cost-effective and competent forest soil characterization.
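The robust-PCA idea can be sketched as follows. This is a minimal Python sketch, not the exact ROBPCA algorithm: it substitutes a robust covariance estimate (Minimum Covariance Determinant) for ROBPCA's projection-pursuit step, so that a few gross outliers do not dominate the principal components. All data are synthetic, for illustration only.

```python
# Robust PCA sketch: eigendecomposition of a robust (MCD) covariance
# estimate instead of the classical sample covariance.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
# 60 synthetic "soil samples" with 4 correlated properties
mix = np.array([[2, 1, 0, 0],
                [0, 1, 1, 0],
                [0, 0, 1, 1],
                [0, 0, 0, 1]], float)
clean = rng.normal(size=(60, 4)) @ mix
outliers = rng.normal(loc=15.0, size=(5, 4))      # a few gross outliers
X = np.vstack([clean, outliers])

mcd = MinCovDet(random_state=0).fit(X)            # robust location and scatter
eigvals, eigvecs = np.linalg.eigh(mcd.covariance_)
order = np.argsort(eigvals)[::-1]                 # sort by explained variance
components = eigvecs[:, order].T                  # robust principal axes

scores = (X - mcd.location_) @ components.T       # robust PC scores
explained = eigvals[order] / eigvals.sum()
print(components.shape)
```

Because the MCD estimator downweights the outlying rows, the leading axes reflect the structure of the clean samples, which is the property that makes ROBPCA suitable for noisy field campaigns.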
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa for the degree of Master in Electrical and Computer Engineering
Abstract:
Dissertation presented as a partial requirement for the degree of Master in Geographic Information Science and Systems
Abstract:
This work focuses on determining the construction costs of small- and medium-diameter high-density polyethylene (HDPE) pipelines for basic sanitation, based on the methodology described in the book Custos de Construção e Exploração, Volume 9 of the series Gestão de Sistemas de Saneamento Básico, by Lencastre et al. (1994). This methodology was applied following the book's construction-management procedures, and to that end unit costs were estimated for several groups of work items. According to Lencastre et al. (1994), "these groups cover earthworks, piping, fittings and the corresponding operating devices, paving, and the work site, the latter including ancillary works associated with the job." The costs were obtained by analysing several budgets for sanitation works resulting from recently held public works tenders. To turn this methodology into an effective tool, spreadsheets were organised that yield realistic estimates of the execution costs of a given work at stages prior to project development, namely when preparing the master plan of a system or when carrying out economic and financial feasibility studies, that is, even before any preliminary sizing of the system's elements exists. Another technique implemented to evaluate the input data was "Robust Data Analysis", Pestana (1992). This methodology made it possible to examine the data in greater detail before formulating hypotheses for the risk analysis. The main idea is a highly flexible examination of the data, often even before comparing them to a probabilistic model. For a large data set, this technique thus made it possible to analyse the dispersion of the values found for the various work items referred to above.
With the collected data processed, a risk-analysis methodology was then applied through Monte Carlo simulation. This risk analysis was carried out with a software tool from Palisade, @Risk, available at the Department of Civil Engineering. This quantitative risk-analysis technique translates the uncertainty of the input data, represented through the probability distributions the software provides. To put the methodology into practice, the spreadsheets built following the approach proposed in Lencastre et al. (1994) were used. Preparing and analysing these estimates can support decisions on the viability of the works to be carried out, namely regarding their economic aspects, allowing a well-founded decision analysis concerning the investments.
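The Monte Carlo cost-risk step can be sketched as follows. This minimal example mimics what @Risk does over the spreadsheet: each work group gets a unit-cost probability distribution (triangular here), and repeated sampling yields a distribution of the total cost. All figures are hypothetical, not taken from Lencastre et al. (1994).

```python
# Monte Carlo simulation of pipeline construction cost with
# triangular unit-cost distributions per work group (hypothetical data).
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo iterations

# (min, mode, max) unit cost in EUR/m for a hypothetical HDPE pipeline
work_groups = {
    "earthworks": (18.0, 25.0, 40.0),
    "piping":     (30.0, 35.0, 45.0),
    "fittings":   (4.0,  6.0,  10.0),
    "paving":     (10.0, 14.0, 22.0),
    "site setup": (3.0,  4.0,  7.0),
}
length_m = 2_500.0  # pipeline length

# Sample every work group independently and sum to a total unit cost
cost_per_m = sum(rng.triangular(lo, mode, hi, size=N)
                 for lo, mode, hi in work_groups.values())
total = cost_per_m * length_m

p10, p50, p90 = np.percentile(total, [10, 50, 90])
print(f"P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f} EUR")
```

Reporting percentiles of the simulated total (rather than a single point estimate) is what turns the spreadsheet into a decision-support tool for feasibility studies.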
Abstract:
Consolidation consists of scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, causing application slowdown. In this work, we propose a performance-enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown data samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
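The deadline check that such a slowdown estimator performs can be sketched as follows. The smoothing (a plain sample mean) and all numbers are illustrative assumptions, not the paper's estimator.

```python
# Sketch of a deadline check driven by noisy slowdown samples:
# estimate the task's slowdown, scale its isolated runtime by it,
# and compare the predicted runtime against the deadline.
import statistics

def meets_deadline(samples, isolated_runtime, deadline):
    """True if the task is predicted to finish within its deadline."""
    slowdown = statistics.fmean(samples)   # >= 1.0; 1.5 means 50% slower
    predicted_runtime = isolated_runtime * slowdown
    return predicted_runtime <= deadline

# Noisy slowdown readings, e.g. from a slowdown meter
samples = [1.42, 1.55, 1.48, 1.60, 1.51]   # mean = 1.512
ok = meets_deadline(samples, isolated_runtime=100.0, deadline=160.0)
print(ok)  # predicted runtime 151.2 s, within the 160 s deadline -> True
```

When the check fails, the scheduler would be invoked to rebuild the interference- and power-aware virtual clusters; here the function only reports the verdict.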
Abstract:
Project work presented for the degree of Master in Informatics and Computer Engineering
Abstract:
A dot enzyme-linked immunosorbent assay (DOT-ELISA) was developed to detect specific antibodies in cerebrospinal fluid (CSF) for human neurocysticercosis immunodiagnosis, with Cysticercus cellulosae antigen dotted onto a new solid phase: sheets of a synthetic polyester fabric impregnated with a polymerized resin (N-methylol-acrylamide). A very stable preparation was thus obtained, the antigen being covalently bound by cross-linking with free N-methylol groups on the resin. Being robust, the solid phase required no special handling, and the test could be performed at room temperature. Of the 30 CSF samples assayed, 14 of the 15 from neurocysticercosis cases were positive, with titers from 1 to 128; the other 15 samples, from normal subjects or patients with other neurological diseases, were all negative. The test characteristics indicate it is adequate for epidemiological surveys. A more detailed study of its sensitivity, specificity, reproducibility and use with serum samples is being conducted.
Abstract:
The development of robust multilingual resources to meet the growing complexity of intra- and inter-organizational processes is a complex undertaking that demands higher quality in the ways organizations interact and share their resources, for example through a greater involvement of the different stakeholders in effective and innovative forms of collaboration. Several problems and difficulties arise in this process, such as, in the creation of multilingual lexical databases, the development of an architecture capable of answering a vast set of linguistic questions, including polysemy, lexical patterns and translation equivalents. These questions arise in the construction both of terminological resources and of multilingual ontologies. In the construction of an ontology in different languages, the process on which we focus our attention, the questions and the complexity increase, given the type and purposes of the semantic artefact, the elements to be localized (concepts and conceptual relations) and the context in which the localization process occurs. With this article we thus intend to analyse the concept and process of localization in the context of ontology-based knowledge management systems, bearing in mind the central role of terminology in the localization process, the different approaches and models proposed, as well as the linguistically based tools that support the implementation of the process. Finally, we seek to establish some parallels between the traditional localization process and the ontology localization process, in order to better situate and define the latter.
Abstract:
Purpose – Our paper analyzes how different European countries cope with the European Energy Policy, which proposes a set of measures (free energy market, smart meters, energy certificates) to improve energy utilization and management in Europe. Design/methodology/approach – The paper first reports the general vision, regulations and goals set up by Europe to implement the European Energy Policy. It then analyzes how some European countries are pursuing these goals through financial, legal, economic and regulatory measures. Finally, the paper draws a comparison between the countries to present a view of how Europe is responding to the emerging energy emergency of the modern world. Findings – Our analysis of different use cases (countries) showed that European countries are converging towards a common energy policy, even though some countries appear to be later than others. In particular, Southern European countries were slowed down by the world financial and economic crisis. Still, it appears that contingency plans were put into action, and Europe as a whole is proceeding steadily towards the common vision. Research limitations/implications – European countries are applying yet more cuts to the financing of green technologies, and it is not possible to predict clearly how each country will evolve its support for the European energy policy. Practical implications – Different countries applied the concepts and measures in different ways. The implementation of the European energy policy has to cope with the resulting plethora of regulations, and a company proposing enhancements to energy management must still possess robust knowledge of each individual country before being able to export experience and know-how between European countries. Originality/value – Even though a few surveys on energy measures in Europe already exist in the state of the art, an organic analysis cutting across the different topics of the European Energy Policy is missing. Moreover, this paper highlights how European countries are converging on a common view, and provides some details on the differences between the countries, thus helping parties interested in cross-country export of experience and technology for energy management.
Abstract:
A solar photovoltaic (PV) panel simulator can be a valuable tool for the design and evaluation of the various components of a photovoltaic system. Such a simulator is based on a power electronic converter controlled in such a way that it behaves as a PV panel. In this paper, a PV panel simulator based on a two-quadrant DC/DC power converter is proposed. This topology allows fast responses, such as to sudden changes in irradiation and temperature. A fast and robust sliding-mode controller is used to control the power converter. With the proposed system, the I-V curve of a PV panel is thus simulated. Experimental results from a laboratory prototype are presented to confirm the theoretical operation.
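The I-V curve such a simulator must reproduce can be sketched with the simplified single-diode PV model (series and shunt resistances neglected); all panel parameters below are illustrative assumptions, not values from the paper's prototype.

```python
# Simplified single-diode PV model: I(V) = Iph - I0*(exp(V/(n*Ns*Vt)) - 1),
# clipped at zero current. Parameters are illustrative.
import math

I_PH = 5.0      # photogenerated current at the assumed irradiance [A]
I_0  = 1e-9     # diode saturation current [A]
N    = 1.3      # diode ideality factor
N_S  = 60       # cells in series
V_T  = 0.0257   # thermal voltage at about 25 degC [V]

def pv_current(v):
    """Panel current for terminal voltage v (clipped at zero)."""
    i = I_PH - I_0 * (math.exp(v / (N * N_S * V_T)) - 1.0)
    return max(i, 0.0)

# Sweep the curve from 0 V to 45 V and locate the maximum-power point
curve = [(v / 10.0, pv_current(v / 10.0)) for v in range(0, 451)]
v_mpp, i_mpp = max(curve, key=lambda p: p[0] * p[1])
print(f"Isc = {pv_current(0.0):.2f} A, Vmpp = {v_mpp:.1f} V")
```

In the proposed simulator, the sliding-mode controller drives the DC/DC converter so that its terminal voltage and current track points on exactly this kind of curve, including after step changes in irradiance (here, a change in I_PH).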
Abstract:
Final Master's project presented for the degree of Master in Communication Networks and Multimedia Engineering
Abstract:
The development of biopharmaceutical manufacturing processes faces critical constraints, the major one being that these molecules are synthesized by living cells, whose behavior is inherently variable due to their high sensitivity to small fluctuations in the cultivation environment. To speed up the development process and to control this critical manufacturing step, it is relevant to develop high-throughput and in situ monitoring techniques, respectively. Here, high-throughput mid-infrared (MIR) spectral analysis of dehydrated cell pellets and in situ near-infrared (NIR) spectral analysis of the whole culture broth were compared for monitoring plasmid production in recombinant Escherichia coli cultures. Good partial least squares (PLS) regression models were built on either MIR or NIR spectral data, yielding high coefficients of determination (R²) and low predictive errors (root mean square error, RMSE) for estimating host cell growth, plasmid production, carbon source consumption (glucose and glycerol), and by-product acetate production and consumption. The predictive errors for biomass, plasmid, glucose, glycerol, and acetate based on MIR data were 0.7 g/L, 9 mg/L, 0.3 g/L, 0.4 g/L, and 0.4 g/L, respectively, whereas for NIR data they were 0.4 g/L, 8 mg/L, 0.3 g/L, 0.2 g/L, and 0.4 g/L, respectively. The models obtained are robust, being valid for cultivations conducted with different media compositions and with different cultivation strategies (batch and fed-batch). Besides being conducted in situ with a sterilized fiber-optic probe, NIR spectroscopy allows building PLS models for estimating plasmid, glucose, and acetate that are as accurate as those obtained from the high-throughput MIR setup, and better models for estimating biomass and glycerol, with decreases of 57% and 50% in the RMSE, respectively, compared to the MIR setup.
However, MIR spectroscopy could be a valid alternative for optimization protocols, given possible space constraints or the high cost of multiple fiber-optic probes for multi-bioreactor systems. In this case, MIR could be conducted in a high-throughput manner, analyzing hundreds of culture samples rapidly and automatically.