952 results for Doubly robust estimation


Relevance:

20.00%

Publisher:

Abstract:

In distributed soft real-time systems, maximizing the aggregate quality of service (QoS) is a typical system-wide goal, and addressing the problem through distributed optimization is challenging. Subtasks are subject to unpredictable failures in many practical environments, which makes the problem considerably harder. In this paper, we present a robust optimization framework for maximizing the aggregate QoS in the presence of random failures. We introduce the notion of K-failure to bound the effect of random failures on schedulability, and use it to define K-robustness, which quantifies the degree of robustness of the QoS guarantee in a probabilistic sense. The parameter K trades off achievable QoS against robustness. The proposed framework produces optimal solutions through distributed computations based on Lagrangian duality, and we present several implementation techniques. Our simulation results show that the framework can probabilistically guarantee a sub-optimal QoS that remains feasible even in the presence of random failures.
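
A minimal sketch of the dual-decomposition pattern such a framework builds on (the concave log-utilities, weights, capacity, and step size are illustrative assumptions, not the paper's formulation): each subtask solves a local problem given a resource price, and the price is updated by subgradient ascent until the shared capacity constraint is met.

```python
import numpy as np

def dual_decomposition_qos(weights, capacity, iters=200, step=0.05):
    """Maximize sum_i w_i*log(1+q_i) s.t. sum_i q_i <= capacity
    via Lagrangian duality (illustrative stand-in for the paper's problem)."""
    lam = 1.0  # dual price on the shared resource
    for _ in range(iters):
        # local solutions: d/dq [w*log(1+q) - lam*q] = 0  ->  q = w/lam - 1
        q = np.maximum(weights / lam - 1.0, 0.0)
        # subgradient ascent on the dual: raise the price if over capacity
        lam = max(lam + step * (q.sum() - capacity), 1e-6)
    return q, lam

q, lam = dual_decomposition_qos(np.array([1.0, 2.0, 3.0]), capacity=4.0)
print(q, q.sum())  # allocations approach the capacity as the price converges
```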

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to obtain the degree of Doctor in Mathematics (Statistics) by the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance:

20.00%

Publisher:

Abstract:

A robot’s drive has to exert driving forces that keep its arm and end effector at the proper position, velocity, and acceleration, while simultaneously compensating for the contact forces that arise between the tool and the workpiece, depending on the needs of the actual technological operation. Balancing the effects of a priori unknown external disturbance forces and the inaccuracies of the available dynamic model of the robot is also important. Technological tasks that simultaneously require well-prescribed end-effector trajectories and contact forces pose challenging control problems that can be tackled in various ways.
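
As a hedged illustration of one common way to tackle such tasks, the sketch below implements a one-joint computed-torque law with feedforward compensation of a measured contact force; the plant parameters, gains, and constant contact force are all assumptions, and the paper's actual control approach may differ.

```python
import numpy as np

# One-joint sketch of computed-torque control with contact-force compensation
# (an illustrative assumption, not the paper's specific control law).
m_hat, b_hat = 1.2, 0.3        # assumed model: inertia and viscous friction
kp, kd = 100.0, 20.0           # PD gains on the tracking error
dt, q, qd = 1e-3, 0.0, 0.0

for k in range(2000):
    t = k * dt
    q_ref, qd_ref, qdd_ref = np.sin(t), np.cos(t), -np.sin(t)
    f_contact = 2.0            # measured tool-workpiece force (assumed constant)
    # feedback-linearizing torque + feedforward compensation of the contact force
    v = qdd_ref + kd * (qd_ref - qd) + kp * (q_ref - q)
    tau = m_hat * v + b_hat * qd + f_contact
    # "true" plant (here identical to the model, so the tracking error decays)
    qdd = (tau - b_hat * qd - f_contact) / m_hat
    qd += qdd * dt
    q += qd * dt

print(f"tracking error at t=2s: {abs(q - np.sin(2.0)):.2e}")
```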

Relevance:

20.00%

Publisher:

Abstract:

Aim: To optimise a set of exposure factors, at the lowest effective dose (E), for delineating spinal curvature with the modified Cobb method in a full-spine examination using computed radiography (CR) on a 5-year-old paediatric anthropomorphic phantom. Methods: Images were acquired by varying a set of parameters: position (antero-posterior (AP), postero-anterior (PA) and lateral), kilovoltage peak (kVp) (66-90), source-to-image distance (SID) (150 to 200 cm), broad focus, and the use of a grid (grid in/out), to analyse the impact on E and image quality (IQ). IQ was analysed with two approaches: objective (contrast-to-noise ratio, CNR) and perceptual, using 5 observers. Monte Carlo modelling was used for dose estimation. Cohen’s Kappa coefficient was used to quantify inter-observer variability. The angle was measured using Cobb’s method on lateral projections under different imaging conditions. Results: PA yielded the lowest effective dose (0.013 mSv) compared to AP (0.048 mSv) and lateral (0.025 mSv). The exposure parameters that allowed the lowest dose were 200 cm SID, 90 kVp, broad focus, and grid out for paediatrics using an Agfa CR system. Thirty-seven images were assessed for IQ and thirty-two were classified as adequate. Cobb angle measurements varied between 16°±2.9° and 19.9°±0.9°. Conclusion: Cobb angle measurements can be performed at the lowest dose with a low contrast-to-noise ratio. The variation in measurements was ±2.9°, which is within the range of acceptable clinical error and has no impact on clinical diagnosis. Further work is recommended on increasing the sample size and on a more robust perceptual IQ assessment protocol for observers.
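
The objective IQ approach can be illustrated with a short sketch of one common CNR definition (the study may use a variant); the ROI statistics below are simulated, not phantom data.

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio from two regions of interest
    (one common definition; the study may use a variant)."""
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std()

rng = np.random.default_rng(0)
signal = rng.normal(120.0, 8.0, size=(64, 64))      # simulated spine ROI
background = rng.normal(100.0, 8.0, size=(64, 64))  # simulated soft-tissue ROI
print(f"CNR = {cnr(signal, background):.2f}")
```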

Relevance:

20.00%

Publisher:

Abstract:

Clinical methods that rely on imaging technologies have grown in popularity over the last two decades. Traditional surgical procedures have been replaced by minimally invasive methods in order to reduce the associated costs and improve productivity-related factors. Modern clinical procedures such as bronchoscopy and cardiology are characterized by a focus on minimizing invasive actions, with C-arms playing a relevant role in this area. Although the C-arm is a widely used technology for navigation guidance in minimally invasive interventions, it falls short in the quality of the information it provides to the surgeon. Two-dimensional information is not sufficient for a full understanding of the three-dimensional location of the region of interest, which makes establishing a method for acquiring three-dimensional information an essential task. The first step towards this goal was taken by defining a method for estimating the position and orientation of an object with respect to the C-arm. In order to run the tests with the C-arm, its geometry first had to be defined and the system calibrated. The work developed and presented in this thesis focuses on a method that proved sufficiently sound and efficient to serve as a starting point towards the main goal: the development of a technique to improve the quality of the information acquired with the C-arm during a clinical intervention.
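
Estimating the position and orientation of an object relative to a calibrated imaging chain can be cast as a Perspective-n-Point (PnP) problem; the sketch below uses OpenCV's generic solver as a stand-in for the thesis' method, with illustrative intrinsics and fiducial positions.

```python
import numpy as np
import cv2  # OpenCV; a generic PnP stand-in for the thesis' pose-estimation method

# Assumed intrinsics of the calibrated C-arm imaging chain (illustrative values)
K = np.array([[1000.0, 0.0, 320.0],
              [0.0, 1000.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# 3D fiducial markers on the object (object frame, millimetres, assumed layout)
obj_pts = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0],
                    [50, 50, 0], [25, 25, 30]], dtype=np.float64)

# Simulate a projection under a known ground-truth pose
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([10.0, -5.0, 600.0])
img_pts, _ = cv2.projectPoints(obj_pts, rvec_true, tvec_true, K, dist)

# Recover the position and orientation of the object relative to the C-arm
ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
print(ok, rvec.ravel(), tvec.ravel())  # matches the ground-truth pose
```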

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

20.00%

Publisher:

Abstract:

The work presented focuses on determining the construction costs of small- and medium-diameter High-Density Polyethylene (HDPE) pipelines for basic sanitation, based on the methodology described in the book Custos de Construção e Exploração – Volume 9 of the series Gestão de Sistemas de Saneamento Básico, by Lencastre et al. (1994). This methodology was applied to the works-management procedures, and to that end unit costs were estimated for several sets of works. According to Lencastre et al. (1994), "these sets concern earthworks, piping, fittings and their operating devices, paving, and the construction site, with the ancillary works corresponding to the job included in the site component." The costs were obtained by analysing several budgets for sanitation works resulting from recently held public works tenders. To turn this methodology into an effective tool, spreadsheets were organized that make it possible to obtain realistic estimates of the execution costs of a given work at stages prior to project development, namely when preparing the master plan of a system or when drafting economic-financial feasibility studies, that is, even before any preliminary sizing of the system elements exists. Another technique implemented to evaluate the input data was "Robust Data Analysis", Pestana (1992). This methodology allowed the data to be analysed in more detail before hypotheses were formulated for developing the risk analysis. The main idea is a highly flexible examination of the data, often even before comparing them with a probabilistic model. Thus, for a large data set, this technique made it possible to analyse the spread of the values found for the various works referenced above. With the data collected and processed, a Risk Analysis methodology was then applied through Monte Carlo simulation. This risk analysis was carried out with a software tool from Palisade, @Risk, available at the Civil Engineering Department. This quantitative risk analysis technique makes it possible to translate the uncertainty of the input data, represented through the probability distributions provided by the software. To put this methodology into practice, the spreadsheets prepared following the approach proposed in Lencastre et al. (1994) were used. The preparation and analysis of these estimates may support decisions on the viability of the work or works to be carried out, particularly with respect to economic aspects, allowing a well-founded decision analysis regarding the investments.
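
The Monte Carlo step (performed with @Risk in the work) can be sketched with NumPy alone: uncertain unit costs are drawn from triangular distributions and propagated to the total cost, whose percentiles summarize the risk. All distributions, unit costs, and the pipeline length below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Illustrative unit-cost distributions (EUR/m) for an HDPE pipeline job;
# triangular(min, mode, max) mimics the uncertain inputs modelled in @Risk.
earthworks = rng.triangular(12.0, 18.0, 30.0, N)
piping     = rng.triangular(20.0, 25.0, 35.0, N)
paving     = rng.triangular(8.0, 10.0, 16.0, N)
site_setup = rng.triangular(3.0, 4.0, 7.0, N)

length_m = 2_500  # assumed pipeline length
total = (earthworks + piping + paving + site_setup) * length_m

p5, p50, p95 = np.percentile(total, [5, 50, 95])
print(f"P5={p5:,.0f}  median={p50:,.0f}  P95={p95:,.0f} EUR")
```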

Relevance:

20.00%

Publisher:

Abstract:

This paper is on an onshore variable-speed wind turbine with a doubly fed induction generator under supervisory control. The control architecture is equipped with an event-based supervisor at the supervision level and with fuzzy proportional-integral or discrete adaptive linear-quadratic controllers proposed for the execution level. The supervisory control assesses the operational state of the variable-speed wind turbine and sends the state to the execution level. The controllers operate in the full-load region to extract energy from the wind at full power while ensuring the safety conditions required to inject the energy into the electric grid. A comparison between simulations of the proposed controllers, with the supervisory control included, on the variable-speed wind turbine benchmark model is presented to assess the advantages of these controls. (C) 2015 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
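
As a rough illustration of an execution-level controller in the full-load region, the sketch below implements a discrete PI pitch controller with anti-windup regulating per-unit generator speed around rated; the gains and the first-order plant are toy assumptions, and the paper's fuzzy gain scheduling is omitted.

```python
# Discrete PI controller of the kind used at the execution level in the
# full-load region (illustrative gains; the fuzzy gain scheduling is omitted).
class DiscretePI:
    def __init__(self, kp, ki, dt, u_min, u_max):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def step(self, err):
        u = self.kp * err + self.ki * self.integral
        if self.u_min < u < self.u_max:     # anti-windup: freeze the integrator
            self.integral += err * self.dt  # while the actuator saturates
        return min(max(u, self.u_min), self.u_max)

# Pitch-to-feather regulation of per-unit generator speed around rated (1.0 pu)
pi = DiscretePI(kp=10.0, ki=5.0, dt=0.01, u_min=0.0, u_max=90.0)
speed, dt = 1.15, 0.01
for _ in range(5000):
    pitch = pi.step(speed - 1.0)                         # overspeed -> more pitch
    speed += dt * (0.2 * (1.15 - speed) - 0.02 * pitch)  # crude first-order plant

print(f"speed ≈ {speed:.3f} pu with pitch ≈ {pitch:.2f} deg")
```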

Relevance:

20.00%

Publisher:

Abstract:

In this work, we present results from teleseismic P-wave receiver functions (PRFs) obtained in Portugal, Western Iberia. A dense seismic station deployment conducted between 2010 and 2012, in the scope of the WILAS project and covering the entire country, allowed the most spatially extensive probing of the bulk crustal seismic properties of Portugal to date. Applying the H-κ stacking algorithm to the PRFs enabled us to estimate the crustal thickness (H) and the average crustal ratio of P- to S-wave velocities, Vp/Vs (κ), for the region. Observations of Moho conversions indicate that this interface is relatively smooth, with the crustal thickness ranging between 24 and 34 km and averaging 30 km. The highest Vp/Vs values are found in the Mesozoic-Cenozoic crust beneath the western and southern coastal domains of Portugal, whereas the lowest values correspond to the Palaeozoic crust underlying the remaining part of the study area. The average Vp/Vs is 1.72, ranging from 1.63 to 1.86 across the study area, indicating a predominantly felsic composition. Overall, we systematically observe a decrease of Vp/Vs with increasing crustal thickness. Taken as a whole, our results show a clear distinction between the geological zones of the Variscan Iberian Massif in Portugal, with the overall shape of the anomalies conditioned by the Ibero-Armorican Arc and the associated Late Paleozoic suture zones, and the Meso-Cenozoic basins associated with the Atlantic rifting stages. The thickened crust (30-34 km) across the studied region may be inherited from continental collision during the Paleozoic Variscan orogeny. An anomalous crustal thinning to around 28 km is observed beneath the central part of the Central Iberian Zone and the eastern part of the South Portuguese Zone.
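
The core of H-κ stacking can be sketched as a grid search that sums weighted PRF amplitudes at the predicted Ps, PpPs, and PpSs+PsPs delay times (the standard Zhu & Kanamori-style moveout equations); the weights, Vp, and grids below are typical choices, not necessarily the study's settings.

```python
import numpy as np

def hk_stack(rf, dt, p, vp=6.2, H_grid=None, k_grid=None, w=(0.7, 0.2, 0.1)):
    """H-kappa grid search on one receiver function.
    rf: PRF amplitude samples, dt: sampling interval (s), p: ray parameter (s/km).
    The weights and vp are typical choices, not the study's exact settings."""
    H_grid = np.arange(20.0, 40.0, 0.1) if H_grid is None else H_grid
    k_grid = np.arange(1.6, 1.9, 0.005) if k_grid is None else k_grid
    best = (-np.inf, None, None)
    eta_p = np.sqrt(1.0 / vp**2 - p**2)
    for H in H_grid:
        for k in k_grid:
            eta_s = np.sqrt((k / vp)**2 - p**2)
            t_ps   = H * (eta_s - eta_p)     # Ps conversion
            t_ppps = H * (eta_s + eta_p)     # PpPs multiple
            t_ppss = 2.0 * H * eta_s         # PpSs+PsPs multiple
            amp = lambda t: rf[min(int(round(t / dt)), len(rf) - 1)]
            s = w[0] * amp(t_ps) + w[1] * amp(t_ppps) - w[2] * amp(t_ppss)
            if s > best[0]:
                best = (s, H, k)
    return best  # (stack value, crustal thickness H, Vp/Vs kappa)

# Toy check: a synthetic PRF with arrivals consistent with H=30 km, kappa=1.72
dt, p, vp, H, k = 0.1, 0.06, 6.2, 30.0, 1.72
eta_p, eta_s = np.sqrt(1 / vp**2 - p**2), np.sqrt((k / vp)**2 - p**2)
rf = np.zeros(400)
for t, a in [(H * (eta_s - eta_p), 1.0), (H * (eta_s + eta_p), 0.5),
             (2 * H * eta_s, -0.5)]:
    rf[int(round(t / dt))] = a
print(hk_stack(rf, dt, p))  # recovers H ≈ 30, kappa ≈ 1.72
```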

Relevance:

20.00%

Publisher:

Abstract:

A new algorithm is proposed for estimating the velocity vector of moving ships using Single Look Complex (SLC) SAR data acquired in strip-map mode. The algorithm exploits both the amplitude and the phase information of the Doppler-decompressed data spectrum, with the aim of estimating both the azimuth antenna pattern and the backscattering coefficient as a function of the look angle. The antenna pattern estimation provides information about the target velocity, while the backscattering coefficient can be used for vessel classification. The range velocity is retrieved in the slow-time frequency domain by estimating the antenna pattern effects induced by the target motion, while the azimuth velocity is calculated from the estimated range velocity and the ship orientation. Finally, the algorithm is tested on simulated SAR SLC data.
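
One ingredient of such estimators is the link between a moving target's Doppler centroid and its slant-range velocity, v_r ≈ -λ f_dc / 2 (sign conventions vary). The sketch below estimates the centroid from the azimuth spectrum of a synthetic signal; the radar parameters are illustrative and the paper's full amplitude-and-phase method is not reproduced.

```python
import numpy as np

# Illustrative C-band parameters (assumed, not the paper's sensor)
wavelength = 0.055     # m
prf = 1700.0           # Hz, azimuth sampling (pulse repetition frequency)
n = 1024

# Synthetic azimuth signal of a target whose range motion shifts the centroid
v_r_true = 5.0                                # m/s towards the sensor
f_dc_true = -2.0 * v_r_true / wavelength      # Doppler shift it induces
t = np.arange(n) / prf
sig = np.exp(2j * np.pi * f_dc_true * t) + 0.1 * (
    np.random.default_rng(1).standard_normal(n))

# Energy-weighted centroid of the azimuth spectrum
f = np.fft.fftfreq(n, d=1.0 / prf)
S = np.abs(np.fft.fft(sig)) ** 2
f_dc = np.sum(f * S) / np.sum(S)
print(f"estimated v_r = {-wavelength * f_dc / 2.0:.2f} m/s")  # ≈ 5.0
```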

Relevance:

20.00%

Publisher:

Abstract:

This paper extends the by-now classic two-sensor complementary filter (CF) design to the case where three sensors providing measurements in different frequency bands are available. The paper shows that using classical CF techniques to tackle a generic three-sensor fusion problem, based solely on the sensors' frequency-domain characteristics, leads to a minimal-realization, stable, sub-optimal solution, denoted Complementary Filters3 (CF3). A new approach to the estimation problem is then pursued, based on optimal linear Kalman filtering techniques. The solution is shown to preserve the complementary property, i.e. the three transfer functions of the respective sensors sum to one, in both the continuous- and discrete-time domains. This new class of filters is denoted Complementary Kalman Filters3 (CKF3). The attitude estimation of a mobile robot is addressed, based on data from a rate gyroscope, a digital compass, and odometry. The experimental results obtained are reported.
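
The complementary property itself is easy to exhibit. With crossover frequencies a < b, the three transfer functions below sum to one at every frequency; assigning the compass to the low band, odometry to the mid band, and the integrated gyro to the high band is an illustrative choice, not necessarily the paper's design.

```python
import numpy as np

# Three complementary transfer functions with crossover frequencies a < b:
#   LP(s) = ab/D(s),  BP(s) = (a+b)s/D(s),  HP(s) = s^2/D(s),  D = (s+a)(s+b).
# By construction LP + BP + HP = 1 at every frequency (the CF3 property).
a, b = 0.5, 5.0                      # rad/s crossovers (assumed)
w = np.logspace(-2, 3, 200)          # frequency grid
s = 1j * w
D = (s + a) * (s + b)
LP, BP, HP = a * b / D, (a + b) * s / D, s ** 2 / D

print(np.allclose(LP + BP + HP, 1.0))  # True: complementary property
print(abs(LP[0]), abs(HP[-1]))         # LP ~ 1 at low w, HP ~ 1 at high w
```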

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an ankle-mounted Inertial Navigation System (INS) used to estimate the distance traveled by a pedestrian. The distance is estimated from the number of steps taken by the user. The proposed method uses force sensors to enhance the results obtained from the INS. Experimental results show that, depending on the step frequency, the traveled-distance error varies between 2.7% and 5.6%.
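
A minimal sketch of the step-counting idea (the threshold and step length are illustrative assumptions): each rising crossing of a force threshold is counted as a step, and the distance is the step count times an assumed stride.

```python
import numpy as np

# Step counting from a foot-mounted force sensor: a step is registered on each
# rising crossing of a force threshold (illustrative threshold and stride).
def count_steps(force, threshold=50.0):
    above = force > threshold
    return int(np.sum(above[1:] & ~above[:-1]))  # rising edges

rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)                       # 10 s at 100 Hz
force = 60 * (np.sin(2 * np.pi * 1.8 * t) > 0.3) + rng.normal(0, 2, t.size)

steps = count_steps(force)
stride = 0.7                                     # assumed metres per step
print(f"{steps} steps, ≈ {steps * stride:.1f} m traveled")
```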

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the estimation of surfaces from a set of 3D points using the unified framework described in [1]. This framework proposes the use of competitive learning for curve estimation: a set of points is defined on a deformable curve and they all compete to represent the available data. This paper extends the unified framework to surface estimation. It is shown that competitive learning performs better than snakes, improving the model's performance in the presence of concavities and allowing close surfaces to be discriminated. The proposed model is evaluated using synthetic data and medical images (MRI and ultrasound).
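
A minimal 2-D sketch of competitive learning for curve estimation, the mechanism the paper extends to surfaces: model points compete for each datum and only the winner moves (the learning rate and annealing schedule are illustrative).

```python
import numpy as np

# Competitive learning for curve estimation: model points compete for each
# data point; the winner (the closest model point) moves towards the datum.
rng = np.random.default_rng(3)
theta = rng.uniform(0, np.pi, 300)
data = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(0, 0.02, (300, 2))

# Deformable curve: 12 control points, initialized on data to avoid dead units
model = data[rng.choice(len(data), 12, replace=False)].copy()
lr = 0.2
for epoch in range(50):
    for x in rng.permutation(data):
        winner = np.argmin(np.sum((model - x) ** 2, axis=1))
        model[winner] += lr * (x - model[winner])   # only the winner adapts
    lr *= 0.95                                      # annealed learning rate

# Model points should now lie near the noisy semicircle the data came from
print(np.abs(np.linalg.norm(model, axis=1) - 1.0).max())  # small residual
```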

Relevance:

20.00%

Publisher:

Abstract:

Dimensionality reduction plays a crucial role in many hyperspectral data processing and analysis algorithms. This paper proposes a new mean-squared-error-based approach to determine the signal subspace in hyperspectral imagery. The method first estimates the signal and noise correlation matrices and then selects the subset of eigenvalues that best represents the signal subspace in the least-squares sense. The effectiveness of the proposed method is illustrated using simulated and real hyperspectral images.
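
In the same spirit, a simplified sketch of eigenvalue-based subspace selection (not the paper's exact MSE estimator): eigendecompose the observed correlation matrix and keep the directions whose power rises above the noise floor, here assumed known.

```python
import numpy as np

# Simplified signal-subspace selection: estimate the correlation matrix,
# eigendecompose it, and keep eigen-directions above the noise floor.
# (A sketch of the idea, not the paper's exact mean-squared-error criterion.)
rng = np.random.default_rng(7)
bands, pixels, rank, sigma = 50, 5000, 5, 0.1

# Synthetic hyperspectral-like data: rank-5 signal plus white noise
A = rng.normal(size=(bands, rank))
S = rng.uniform(size=(rank, pixels))
X = A @ S + sigma * rng.normal(size=(bands, pixels))

R = X @ X.T / pixels                 # observed correlation matrix
evals = np.linalg.eigvalsh(R)[::-1]  # descending eigenvalues
noise_power = sigma**2               # assumed known here; estimated in practice
k = int(np.sum(evals > 2 * noise_power))
print(f"estimated signal-subspace dimension: {k}")  # expected: 5
```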