994 results for Harmonic balance algorithm


Relevance: 20.00%

Publisher:

Abstract:

Normal human metabolism leads to the daily production of large amounts of volatile and non-volatile acids. Maintaining the pH within physiological limits is a demanding task in which several mechanisms are involved. The most immediate response comes from several physiological buffers that quickly neutralize pH deviations caused by the addition of strong acids or bases to the body. Bicarbonate/carbonic acid is the most important buffer pair of the extracellular milieu, but it is chemically inefficient and depends on the continuous activity of the lung and kidney. Other physiological buffers have higher efficacy and are very important in the intracellular environment and renal tubules. The capacity of the various chemical buffers is maintained by operating in an open system and by several controlling mechanisms. The lung is responsible for the elimination of the carbon dioxide (CO2) produced in the body. In metabolic disorders, respiratory adjustment of CO2 elimination prolongs the effect of the bicarbonate/carbonic acid buffer, but this process consumes bicarbonate. The kidney contributes to acid-base balance through several mechanisms: 1) it controls the reabsorption of filtered bicarbonate; 2) it regenerates bicarbonate consumed in buffer reactions; 3) it eliminates non-volatile acids. Renal acid elimination and bicarbonate regeneration are only possible due to the existence of several urinary buffers and to the ability of the kidneys to produce ammonia.
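The behaviour of the bicarbonate/carbonic acid pair described above is commonly summarized by the Henderson-Hasselbalch equation (a standard textbook relation, added here for illustration; the numerical values are typical normal values, not taken from the abstract):

```latex
\mathrm{pH} = \mathrm{p}K_a + \log_{10}\frac{[\mathrm{HCO}_3^-]}{0.03 \times p\mathrm{CO}_2}
\approx 6.1 + \log_{10}\frac{24}{0.03 \times 40} = 6.1 + \log_{10}20 \approx 7.40
```

The "open system" advantage mentioned above is visible here: the lung regulates pCO2 and the kidney regulates [HCO3-], so both terms of the ratio are controlled independently.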


Dissertation submitted for the degree of Master in Biomedical Engineering


Introduction: Many of the instruments available for assessing balance, fall risk and fear of falling measure only simple activities in the home and tend to show a "ceiling effect" in community-dwelling older adults. The Activities-specific Balance Confidence (ABC) Scale was designed to assess balance comprehensively, across a set of activities of daily living spanning a wide range of difficulty. Objectives: To translate the Activities-specific Balance Confidence (ABC) Scale into Portuguese, adapt it culturally for Portugal, and assess its reliability. Population and Methods: Translation and cultural adaptation of the instrument, followed by its application to an elderly Portuguese population to determine its inter-rater reliability, intra-rater reliability and internal consistency. Results: The results were highly homogeneous in the large majority of the comparisons performed, both intra-rater and inter-rater. The assessment of internal consistency revealed very high values. These reliability levels were maintained even when any one of the 16 items of the questionnaire was removed, with the values remaining practically identical to those obtained with the complete questionnaire. Conclusions: The Portuguese version of the ABC scale showed good intra-rater reliability, inter-rater reliability and internal consistency in assessing self-perceived balance across several activities of daily living in an elderly Portuguese population. Further work is needed to assess the usefulness of this scale in evaluating fall risk and the effect of therapeutic interventions in this population.
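Internal consistency of a multi-item scale like the ABC is typically quantified with Cronbach's alpha; a minimal sketch (the formula is standard, and the item scores below are invented illustration data, not the study's):

```python
def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(it) for it in items)
    # Per-respondent total score across all items.
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three perfectly correlated items give the maximum alpha of 1.0.
base = [50, 60, 70, 80, 90]
items = [[b + d for b in base] for d in (0, 1, 2)]
alpha = cronbach_alpha(items)  # 1.0
```

Removing one item and recomputing alpha, as the study does for each of the 16 items, is a loop over `items` with one entry left out.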


Distribution networks typically have a partially meshed topology but are operated radially. The radial topology is obtained by opening the meshes at the locations that optimize the network's operating point, through the installation of normally open switching devices. Besides maintaining the radial topology, these devices also allow load transfers between feeders when faults occur. The radial feeders are further equipped with normally closed switching devices, whose purpose is to maximize reliability and isolate faults, minimizing the area they affect. This dissertation therefore develops two deterministic algorithms for the optimal placement of normally open and normally closed switching devices, minimizing active power losses and the cost of energy not supplied. The algorithm for placing normally open switches seeks the optimal radial topology that minimizes active power losses. The method is developed in the Matlab–Tomlab environment and is formulated as a mixed-integer quadratic programming problem. The optimal radial topology is guaranteed by computing an optimal power flow based on the DC model. The objective function is given by the Joule losses, and the problem is constrained by Kirchhoff's first law, the generation limits of the substations, the thermal limits of the conductors, unidirectional power flow, and the radiality condition. The normally closed switches are then placed along the radial feeders obtained by the previous algorithm, minimizing the cost of energy not supplied. In the limit, a normally closed switch can be placed on every line of a distribution network, which is the solution that minimizes the energy not supplied. However, since each switching device carries an investment cost, it is essential to strike a balance between reliability improvement and investment. The algorithm developed therefore evaluates the benefits obtained from installing normally closed switches and returns the number and locations of the devices that minimize the cost of energy not supplied. The presented methods are tested on two real distribution networks, operated at voltage levels of 15 kV and 30 kV, respectively. The first network is located in the Porto district and is characterized by a mixed, urban topology. The second network is located in the Bragança district and is characterized by a mostly overhead, rural topology.
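The idea of choosing where to open the meshes can be illustrated with a brute-force toy: enumerate the radial topologies of a tiny looped feeder and keep the one with the smallest quadratic loss proxy. This is an illustration only, not the dissertation's Tomlab MIQP formulation; the three-bus network, resistances and loads are invented:

```python
from itertools import combinations

# One loop: three buses A (substation), B, C joined by three lines.
lines = {("A", "B"): 0.10, ("B", "C"): 0.10, ("A", "C"): 0.40}  # resistances
loads = {"B": 1.0, "C": 2.0}  # MW drawn at each bus

def tree_losses(kept):
    """Loss proxy sum(r * P^2) for a radial topology given by `kept` lines."""
    adj = {}
    for (u, v) in kept:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    losses = 0.0

    def downstream(node, parent):
        # Power on the line into `node` equals the total load downstream of it
        # (DC-style: losses neglected when computing flows).
        nonlocal losses
        p = loads.get(node, 0.0)
        for nb in adj.get(node, []):
            if nb != parent:
                p += downstream(nb, node)
        if parent is not None:
            r = lines.get((parent, node)) or lines[(node, parent)]
            losses += r * p * p
        return p

    if downstream("A", None) < sum(loads.values()) - 1e-9:
        return float("inf")  # some bus is disconnected: not a valid topology
    return losses

# A radial topology keeps n - 1 = 2 of the 3 lines (one switch stays open).
best = min(combinations(lines, 2), key=tree_losses)
```

Here the best choice is to open the high-resistance line A-C, i.e. keep A-B and B-C; a real network replaces this enumeration with the MIQP search described above.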


Classical serological screening assays for Chagas' disease are time-consuming and subjective. The objective of the present work is to evaluate the enzyme immunoassay (ELISA) methodology and to propose an algorithm for blood banks to be applied to Chagas' disease. Seven thousand nine hundred and ninety-nine blood donor samples were screened by both reverse passive hemagglutination (RPHA) and indirect immunofluorescence assay (IFA). Samples reactive on RPHA and/or IFA were submitted to supplementary RPHA, IFA and complement fixation (CFA) tests. This strategy allowed us to create a panel of 60 samples to evaluate the ELISA methodology from 3 different manufacturers. The sensitivity of screening by IFA and by the 3 different ELISAs was 100%. The specificity was better with the ELISA methodology. For Chagas' disease, ELISA seems to be the best test for blood donor screening, because it showed high sensitivity and specificity, is not subjective, and can be automated. It was therefore possible to propose an algorithm to screen samples and confirm donor results at the blood bank.
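Sensitivity and specificity, the two figures the assays are compared on, are straightforward to compute from a reference panel; a sketch (the panel below is invented for illustration, not the study's 60-sample panel):

```python
def sensitivity_specificity(results):
    """results: list of (test_positive, truly_infected) boolean pairs."""
    tp = sum(1 for t, d in results if t and d)        # true positives
    fn = sum(1 for t, d in results if not t and d)    # false negatives
    tn = sum(1 for t, d in results if not t and not d)  # true negatives
    fp = sum(1 for t, d in results if t and not d)    # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Invented panel: 20 infected samples all detected, 40 uninfected of
# which 2 react falsely.
panel = ([(True, True)] * 20
         + [(False, False)] * 38
         + [(True, False)] * 2)
sens, spec = sensitivity_specificity(panel)  # 1.0 and 0.95
```

A screening assay is tuned for sensitivity (no infected donation may pass), which is why the supplementary confirmatory tests handle specificity.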


Diffusion Kurtosis Imaging (DKI) is a fairly new magnetic resonance imaging (MRI) technique that tackles the non-Gaussian motion of water in biological tissues by taking into account the restrictions imposed by tissue microstructure, which are not considered in Diffusion Tensor Imaging (DTI), where water diffusion is considered purely Gaussian. As a result, DKI provides more accurate information on biological structures and is able to detect important abnormalities which are not visible in standard DTI analysis. This work concerns the development of a tool for DKI computation to be implemented as an OsiriX plugin. As OsiriX runs under Mac OS X, the program is written in Objective-C and also makes use of Apple's Cocoa framework. The whole program is developed in the Xcode integrated development environment (IDE). The plugin implements a fast heuristic constrained linear least squares algorithm (CLLS-H) for estimating the diffusion and kurtosis tensors, and offers the user the possibility to choose which maps are to be generated, not only for standard DTI quantities such as Mean Diffusion (MD), Radial Diffusion (RD), Axial Diffusion (AD) and Fractional Anisotropy (FA), but also for the DKI metrics Mean Kurtosis (MK), Radial Kurtosis (RK) and Axial Kurtosis (AK). The plugin was subjected to both a qualitative and a semi-quantitative analysis, which yielded convincing results. A more accurate validation process is still being developed, after which, and with a few minor adjustments, the plugin shall become a valid option for DKI computation.
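Once the diffusion tensor has been estimated, the standard DTI scalar maps follow directly from its eigenvalues; a sketch of MD and FA using the standard formulas (Python for brevity, while the plugin itself is Objective-C; the eigenvalues in the example are invented illustration values):

```python
import math

def md_fa(l1, l2, l3):
    """Mean diffusion and fractional anisotropy from tensor eigenvalues."""
    md = (l1 + l2 + l3) / 3.0
    # FA = sqrt(3/2) * |lambda - MD| / |lambda|
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 * l1 + l2 * l2 + l3 * l3
    return md, math.sqrt(1.5 * num / den)

md_iso, fa_iso = md_fa(1.0, 1.0, 1.0)   # isotropic voxel: FA = 0
md_ani, fa_ani = md_fa(2.0, 1.0, 1.0)   # anisotropic voxel: FA > 0
```

RD and AD are simpler still: AD is the largest eigenvalue and RD the mean of the other two; the kurtosis metrics (MK, RK, AK) additionally require the fourth-order kurtosis tensor estimated by CLLS-H.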


Double Degree. A Work Project presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics and a Master's Degree in Finance from Louvain School of Management.


The aim of this work project is to analyze the current algorithm used by EDP to estimate their clients' electrical energy consumption, to create a new algorithm, and to compare the advantages and disadvantages of both. The new algorithm differs from the current one in that it incorporates some effects of temperature variations. The results of the comparison show that the new algorithm with temperature variables performed better than the same algorithm without them, although there is still potential for further improvement of the current algorithm if the prediction model is estimated using a sample of daily data, as is the case for the current EDP algorithm.
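The effect of adding a temperature variable can be sketched with a one-regressor least-squares comparison: fit consumption on temperature and compare the fit against a baseline without the temperature term. This is an illustration only, with invented toy data; it is not EDP's model:

```python
def ols_slope_intercept(x, y):
    """Closed-form ordinary least squares for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

temp = [5, 10, 15, 20, 25, 30]       # daily mean temperature (invented)
cons = [30, 25, 20, 15, 10, 5]       # daily consumption (invented, linear)
a, b = ols_slope_intercept(temp, cons)

def sse(model):  # sum of squared prediction errors
    return sum((yi - model(xi)) ** 2 for xi, yi in zip(temp, cons))

sse_with_temp = sse(lambda t: a + b * t)
sse_baseline = sse(lambda t: sum(cons) / len(cons))  # mean-only model
```

On temperature-driven data the temperature model leaves a much smaller residual than the mean-only baseline, which is the kind of comparison the work project performs at scale.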


The purpose of this study was to assess the flexor-extensor group of muscles of the knee in young athletes diagnosed with a total rupture of the anterior cruciate ligament (ACL). Eighteen knees of 18 athletes (14 men and 4 women) with an average age of 21.6 years (range 16-32 years) were assessed with a Cybex 6000 model isokinetic apparatus. The average interval between occurrence of the injury and assessment was 10.2 months (range 2-48 months). There was an associated meniscal injury in eight of the knees. Athletes with any other kind of associated injury, limitation or blockage of joint movement, significant pain during the exam, or an interval between injury and exam of less than two months were excluded from the study. The parameters studied were the peak torque-velocity and flexor-extensor relationships at the constant angular velocities of 60°/sec and 240°/sec. Warm-up was done by means of an ergometric bicycle and adaptation with 3 submaximal repetitions. The contralateral side, which presented no injury, was used as control. Peak torque (PT) at the constant velocity of 60°/sec was greater than that at 240°/sec for knees with and without injuries. However, there was no significant difference between the injured and uninjured sides at 60°/sec or at 240°/sec. The average value for the flexor-extensor relationship at 60°/sec on the injured side was 60% (±6), compared to 57% (±10) on the contralateral side. At 240°/sec, the average value was 75% (±10) on the injured side and 65% (±12) on the contralateral side. In conclusion, despite the complete rupture of the ACL of one knee, the average values for the flexor-extensor relationship were similar on the injured and uninjured sides at the velocity of 60°/sec. As the velocity increased, the values for the flexor-extensor relationship of the knee also increased, indicating a tendency of the performance of the flexor muscle group to approach that of the extensor muscle group; this tendency was more pronounced on the side of the injury.
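The flexor-extensor relationship reported above is simply the flexor peak torque expressed as a percentage of the extensor peak torque; a sketch (the torque values are invented illustration numbers, not the study's measurements):

```python
def flexor_extensor_ratio(flexor_pt, extensor_pt):
    """Flexor peak torque as a percentage of extensor peak torque."""
    return 100.0 * flexor_pt / extensor_pt

# Invented peak torques in N*m at the two test velocities.
ratio_60 = flexor_extensor_ratio(90.0, 150.0)   # 60.0 %
ratio_240 = flexor_extensor_ratio(75.0, 100.0)  # 75.0 %
```

The rise from 60% to 75% here mimics the pattern in the abstract: at higher angular velocity, extensor torque drops faster than flexor torque, so the ratio climbs.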


Contains abstract


Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, the geographical coverage of ship tracking platforms has increased significantly in recent years, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a Maritime Safety expert.
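The genetic-algorithm idea can be sketched in a toy form: evolve a candidate route, here encoded as a sequence of latitudes at fixed longitude steps, so that it fits a cloud of historical positions. Everything below, the encoding, the operators and the data, is an invented illustration, not EMSA's actual method:

```python
import random

random.seed(0)

LONS = list(range(6))                          # fixed longitude steps
HISTORY = {lon: 10.0 + lon for lon in LONS}    # observed latitude per step

def fitness(route):
    # Higher is better: negative squared distance to historical positions.
    return -sum((lat - HISTORY[lon]) ** 2 for lon, lat in zip(LONS, route))

def crossover(a, b):
    cut = random.randrange(1, len(a))          # one-point crossover
    return a[:cut] + b[cut:]

def mutate(route):
    route = list(route)
    i = random.randrange(len(route))
    route[i] += random.uniform(-1.0, 1.0)      # nudge one latitude
    return route

pop = [[random.uniform(5.0, 20.0) for _ in LONS] for _ in range(30)]
init_best = max(map(fitness, pop))
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                         # elitist truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
best = max(pop, key=fitness)                   # estimated route
```

Because the ten best individuals survive every generation, the best fitness is monotonically non-decreasing; the real prototype replaces the toy fitness with a score against the stored ship positions.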


The present paper reports the precipitation process of Al3Sc structures in an aluminum-scandium alloy, which has been simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method has been employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with spkMC. DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The simulation results achieved are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
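DBSCAN's core idea, growing clusters from points that have at least min_pts neighbours within radius eps and marking sparse points as noise, fits in a short sketch. This is pure Python for illustration; the authors' implementation is in C with MPI, and the points below are invented:

```python
def dbscan(points, eps, min_pts):
    """Label each point with a cluster id; -1 marks noise."""
    labels = {}

    def neighbors(i):  # indices within eps of point i (includes i itself)
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    cluster = 0
    for i in range(len(points)):
        if i in labels:
            continue
        if len(neighbors(i)) < min_pts:
            labels[i] = -1        # noise (may still be claimed by a cluster)
            continue
        labels[i] = cluster       # i is a core point: start a new cluster
        queue = neighbors(i)
        while queue:              # expand the cluster through core points
            j = queue.pop()
            if labels.get(j, -1) != -1:
                continue          # already assigned to a cluster
            unvisited = j not in labels
            labels[j] = cluster
            if unvisited and len(neighbors(j)) >= min_pts:
                queue.extend(neighbors(j))
        cluster += 1
    return labels

# Two dense groups (the "precipitates") and one isolated point (noise).
pts = [(0, 0), (0.5, 0), (0, 0.5), (10, 10), (10.5, 10), (10, 10.5), (50, 50)]
labels = dbscan(pts, eps=1.0, min_pts=3)
```

In the paper's setting the points are atom positions from the spkMC snapshots, and each recovered cluster is one Al3Sc precipitate.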


The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb−1. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT^jet < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, of less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT^jet < 500 GeV. For central jets at lower pT, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for pT^jet > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6% for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3%.
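For reference, the anti-kt algorithm named above clusters objects by iteratively merging the pair with the smallest of the standard distances (textbook definition, not spelled out in the abstract):

```latex
d_{ij} = \min\!\left(p_{T,i}^{-2},\, p_{T,j}^{-2}\right)\frac{\Delta R_{ij}^{2}}{R^{2}},
\qquad
d_{iB} = p_{T,i}^{-2},
\qquad
\Delta R_{ij}^{2} = (y_i - y_j)^{2} + (\phi_i - \phi_j)^{2}
```

where R is the distance parameter (0.4 or 0.6 here); when some beam distance d_iB is the smallest, object i is declared a jet and removed. The inverse-pT² weighting makes hard particles drive the clustering, giving the regular cone-like jets the calibration relies on.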


This paper deals with a computing simulation of an offshore wind energy system, taking into account the influence of marine wave action on the floating platform. The wind energy system has a variable-speed turbine equipped with a permanent magnet synchronous generator and a full-power five-level converter, injecting energy into the electric grid through a high-voltage alternating current link. A reduction of the voltage unbalance in the DC-link capacitors of the five-level converter is proposed by a strategic selection of the output voltage vectors. The model for the drive train of the wind energy system is a two-mass model, including the dynamics of the floating platform. A case study is presented, and the assessment of the quality of the energy injected into the electric grid is discussed.
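A two-mass drive-train model of the kind mentioned above is commonly written as follows (generic textbook form; the symbols are assumptions for illustration, not taken from the paper):

```latex
\begin{aligned}
J_t\,\dot{\omega}_t &= T_t - K_s\,\theta - D_s\,(\omega_t - \omega_g)\\
J_g\,\dot{\omega}_g &= K_s\,\theta + D_s\,(\omega_t - \omega_g) - T_g\\
\dot{\theta} &= \omega_t - \omega_g
\end{aligned}
```

where J_t and J_g are the turbine and generator inertias, ω_t and ω_g their angular speeds, θ the shaft twist angle, K_s and D_s the shaft stiffness and damping, T_t the aerodynamic torque and T_g the generator torque. In the paper's setting the aerodynamic torque additionally carries the wave-induced motion of the floating platform.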


The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard. In the last decades several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected by using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure aiming to increase the profit by adding more items to the knapsack. A cyclic reinitialization of 50% of the population, and a simple local search that allows the progress of a small percentage of points towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances, and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method can be an alternative method for solving these problems.
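The drop-item/add-item repair step described above can be sketched for a tiny 0-1 multidimensional knapsack instance: randomly drop items until every resource constraint holds, then greedily re-add the most profitable items that still fit. The instance data are invented, and the greedy add order is one plausible choice, not necessarily the paper's:

```python
import random

random.seed(1)

profits = [10, 7, 6, 4]
weights = [[4, 3, 2, 1],   # resource 1 consumption per item
           [2, 4, 1, 3]]   # resource 2 consumption per item
capacity = [6, 6]

def feasible(x):
    return all(sum(w[i] * x[i] for i in range(len(x))) <= c
               for w, c in zip(weights, capacity))

def repair(x):
    x = list(x)
    # Drop phase: remove randomly chosen items until all constraints hold.
    while not feasible(x):
        ones = [i for i, v in enumerate(x) if v]
        x[random.choice(ones)] = 0
    # Add phase: re-add items (highest profit first) while feasibility holds.
    for i in sorted(range(len(x)), key=lambda i: -profits[i]):
        if not x[i]:
            x[i] = 1
            if not feasible(x):
                x[i] = 0
    return x

x = repair([1, 1, 1, 1])   # repair an infeasible all-ones trial point
```

In the full algorithm this repair is applied to every trial point produced by the crossover and mutation moves, so the swarm only ever evaluates feasible knapsack fillings.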