80 results for Simple methods

at Instituto Politécnico do Porto, Portugal


Abstract:

The pathophysiological sequelae of oxidative stress are difficult to quantify. Despite the obstacles, the medical relevance of oxidative stress has been increasingly recognized, and it is nowadays regarded as a key component of virtually all diseases. Erectile dysfunction (ED) emerges in this context as a kind of barometer of endothelial function and oxidative damage. The quantification of oxidative stress biomarkers may have an enormous impact on the evaluation of patients with ED. The reduced/oxidized glutathione ratio (GSH/GSSG) and nitrotyrosine (3-NT) have been shown to be clinically relevant. The consideration of genetic polymorphisms is also a promising approach for evaluating these relationships in the future. A highly sensitive high-performance liquid chromatography (HPLC) method was developed for the determination of 3-NT in human plasma. The 3-NT concentrations measured in individuals with ED were 6.6 ± 2.1 µM (mean ± S.D., n = 46). Measuring the plasma concentration of 3-NT may prove useful as a marker of nitric oxide (NO)-dependent oxidative damage. The level of oxidative stress can also be quantified by measuring the decrease in the GSH/GSSG ratio, which has shown alterations in a myriad of pathologies, such as ED and diabetes mellitus. The proposed HPLC method for quantifying the GSH/GSSG ratio has the advantage of assessing both parameters concomitantly in a single run. The GSH/GSSG ratio obtained from the blood of individuals with ED was 11.9 ± 9.8 (mean ± S.D., n = 49). Statistical analysis revealed significant differences (p < 0.001) between both the plasma 3-NT concentration and the blood GSH/GSSG ratio of individuals with ED and the respective measurements in healthy individuals.
Statistically significant differences (p ≈ 0.027) were also observed between the blood GSH/GSSG ratio of patients diagnosed with ED alone and the respective measurement in individuals with ED and cardiovascular comorbidities. These results emphasize the role of oxidative damage in the biopathology of ED, elucidated with the aid of these two methodologies, which may have a broad field of application in the future, since they proved simple, inexpensive and fast, and may eventually be suitable for large-scale screening studies.
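As a small illustration of the ratio summarized in this abstract, the sketch below derives a GSH/GSSG mean ± SD from per-subject measurements; the concentration values are hypothetical, not data from the study.

```python
from statistics import mean, stdev

# Hypothetical reduced (GSH) and oxidized (GSSG) glutathione
# concentrations, in µM, for four subjects -- illustrative only.
gsh = [850.0, 920.0, 780.0, 990.0]
gssg = [70.0, 65.0, 80.0, 75.0]

ratios = [g / o for g, o in zip(gsh, gssg)]
print(f"GSH/GSSG = {mean(ratios):.1f} ± {stdev(ratios):.1f} "
      f"(mean ± SD, n = {len(ratios)})")
```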

Abstract:

The present research paper presents five different clustering methods to identify typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely the data pre-processing phase, the application of the clustering algorithms, and the evaluation of the quality of the partition, which is supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
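The clustering step of such a framework can be sketched with a minimal k-means implementation (one of many clustering algorithms a framework like this could use; the profiles below are synthetic, not the 208-consumer database):

```python
from math import dist  # Euclidean distance (Python 3.8+)

def kmeans(profiles, k, iters=20):
    """Minimal k-means: group daily load profiles (lists of hourly loads)."""
    # Naive deterministic initialization: k evenly spaced profiles.
    centroids = [list(profiles[i * len(profiles) // k]) for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each profile joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in profiles:
            nearest = min(range(k), key=lambda i: dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Update step: each centroid moves to the mean of its cluster.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = [sum(h) / len(members) for h in zip(*members)]
    return centroids, clusters

# Synthetic profiles: flat "industrial" loads vs evening-peaking "residential" ones.
industrial = [[5.0, 5.1, 5.0, 4.9 + 0.1 * i] for i in range(4)]
residential = [[1.0, 1.2, 3.9, 4.1 + 0.1 * i] for i in range(4)]
centroids, clusters = kmeans(industrial + residential, k=2)
print([len(c) for c in clusters])  # → [4, 4]
```

A real deployment would of course work on 24- or 96-point profiles and validate the partition with cluster validity indices, as the paper describes.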

Abstract:

Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decision making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be assured. This paper presents a new approach to solve the economic dispatch in smart grids. The proposed methodology for resource management involves two stages. The first one uses fuzzy set theory to define the forecast range of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
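As a loose sketch of the two-stage idea (not the paper's actual algorithm), the first stage can be represented by interval forecasts and the second by a simple merit-order heuristic; all numbers and unit names are made up:

```python
# Stage 1 (stand-in for the fuzzy forecast): represent wind and load as
# (pessimistic, expected, optimistic) triples, a crude triangular analogue.
wind_forecast = (10.0, 20.0, 30.0)    # MW
load_forecast = (80.0, 100.0, 120.0)  # MW

def dispatch(load, wind, units):
    """Stage 2 (heuristic stand-in): merit-order dispatch of the residual load."""
    residual = max(0.0, load - wind)  # zero-cost wind is dispatched first
    plan, cost = {}, 0.0
    for name, capacity, price in sorted(units, key=lambda u: u[2]):
        take = min(capacity, residual)
        plan[name] = take
        cost += take * price
        residual -= take
    return plan, cost

# Hypothetical dispatchable units: (name, capacity MW, price €/MWh).
units = [("gas", 60.0, 50.0), ("coal", 40.0, 30.0), ("diesel", 30.0, 90.0)]
plan, cost = dispatch(load_forecast[1], wind_forecast[1], units)
print(plan, cost)  # cheapest unit runs first; diesel stays off
```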

Abstract:

In the context of electricity markets, transmission pricing is an important tool for achieving efficient operation of the electricity system. The electricity market is influenced by several factors; however, transmission network management is one of the most important, because the network is a natural monopoly. Transmission tariffs can help to regulate the market; for this reason they must follow strict criteria. This paper presents the following methods for pricing the use of transmission networks by electricity market players: the Postage Stamp method; the MW-Mile method; Distribution Factors methods; the Tracing methodology; Bialek's Tracing method; and Locational Marginal Pricing. A nine-bus transmission network is used to illustrate the application of the tariff methods.
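For instance, the Postage Stamp method (the simplest of those listed) charges each player in proportion to its use of the system regardless of distance; a minimal sketch with made-up numbers:

```python
def postage_stamp(total_network_cost, loads):
    """Postage Stamp method: allocate the network cost pro rata to each
    player's load (MW), ignoring the actual paths the power flows take."""
    total_load = sum(loads.values())
    return {player: total_network_cost * mw / total_load
            for player, mw in loads.items()}

# Hypothetical yearly network cost (€) and player loads (MW).
tariffs = postage_stamp(1_000_000.0, {"A": 50.0, "B": 30.0, "C": 20.0})
print(tariffs)  # → {'A': 500000.0, 'B': 300000.0, 'C': 200000.0}
```

Flow-based methods such as MW-Mile instead weight each player's charge by the network elements its transactions actually load, which requires a power flow solution.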

Abstract:

This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of Ancillary Services Dispatch in an Electricity Market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are their most relevant advantages when compared with the Linear Programming approach.
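A toy genetic algorithm for a bid-based reserve dispatch can look like the sketch below. This is an illustration of the technique, not the paper's formulation: bids and the requirement are invented, the chromosome simply accepts or rejects whole bids, and the population is seeded with the all-accept individual (with elitism) so that a feasible solution is always retained.

```python
import random

def ga_dispatch(bids, requirement, pop=30, gens=80, seed=1):
    """Toy GA: pick a subset of reserve bids (capacity MW, price €/MW)
    covering `requirement` MW at low cost. A shortfall is heavily penalized."""
    rng = random.Random(seed)
    n = len(bids)

    def cost(ind):
        cap = sum(b[0] for b, g in zip(bids, ind) if g)
        money = sum(b[0] * b[1] for b, g in zip(bids, ind) if g)
        return money + 1e6 * max(0.0, requirement - cap)

    # All-accept individual is always feasible; elitism below keeps it alive.
    population = [[1] * n] + [[rng.randint(0, 1) for _ in range(n)]
                              for _ in range(pop - 1)]
    for _ in range(gens):
        population.sort(key=cost)
        elite = population[: pop // 2]
        children = []
        for _ in range(pop - len(elite)):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]      # one-point crossover
            if rng.random() < 0.2:         # bit-flip mutation
                child[rng.randrange(n)] ^= 1
            children.append(child)
        population = elite + children
    return min(population, key=cost)

bids = [(40, 12.0), (30, 8.0), (30, 15.0), (20, 7.0), (50, 20.0), (25, 9.0)]
best = ga_dispatch(bids, requirement=100)
print(best, sum(b[0] for b, g in zip(bids, best) if g))
```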

Abstract:

Electricity market players operating in a liberalized environment require access to adequate decision support tools that allow them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this reason, decision support tools must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data, concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services, is included in this paper.

Abstract:

Objectives: The purpose of this article is to identify differences between surveys using paper and online questionnaires. The author has extensive experience with the development of survey-based research, e.g. the limits of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors who graduated in 1982-1991), 2000 (graduates of 1982-1996), 2005 (graduates of 1982-2001) and 2011 (graduates of 1977-2006), and among 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The physician studies showed that there were differences between the methods, connected with the use of paper-based versus online questionnaires and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low financial resources needed, and the fact that the data were loaded directly into the data analysis software, saving the time and resources associated with the data entry process. Conclusions: The current article helps researchers plan the study design and choose the right data collection method.

Abstract:

In real optimization problems, the analytical expression of the objective function, and of its derivatives, is usually unknown or complex. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: Direct Search Methods, or Derivative-free Methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, choosing the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem, in which a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and of filter methods. The method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behaviour of the algorithm through some examples. The proposed methods were implemented in Java.
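The filter acceptance rule at the heart of such methods can be sketched as follows (a generic filter in Python, not the authors' Java implementation): a trial point with objective value f and aggregated constraint violation h is acceptable if no stored pair dominates it.

```python
class Filter:
    """Filter for constrained optimization: stores (h, f) pairs, where
    h aggregates the constraint violations and f is the objective value."""

    def __init__(self):
        self.entries = []  # non-dominated (h, f) pairs seen so far

    def acceptable(self, h, f):
        # A point is rejected only if some stored entry is at least as
        # good in BOTH violation and objective (i.e. dominates it).
        return all(not (he <= h and fe <= f) for he, fe in self.entries)

    def add(self, h, f):
        # Drop entries the new pair dominates, then store it.
        self.entries = [(he, fe) for he, fe in self.entries
                        if not (h <= he and f <= fe)]
        self.entries.append((h, f))

flt = Filter()
flt.add(h=1.0, f=5.0)
print(flt.acceptable(0.5, 6.0))  # True: less infeasible, worse objective
print(flt.acceptable(2.0, 6.0))  # False: dominated by (1.0, 5.0)
```

A derivative-free search (e.g. a simplex method) would call `acceptable` on each trial vertex instead of comparing a penalized objective, which is what removes the need for penalty parameters.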

Abstract:

In this work, a microwave-assisted extraction (MAE) methodology was compared with several conventional extraction methods (Soxhlet, Bligh & Dyer, modified Bligh & Dyer, Folch, modified Folch, Hara & Radin, Roese-Gottlieb) for the quantification of the total lipid content of three fish species: horse mackerel (Trachurus trachurus), chub mackerel (Scomber japonicus) and sardine (Sardina pilchardus). The influence of species, extraction method and frozen storage time (varying from fresh to 9 months of freezing) on total lipid content was analysed in detail. The efficiencies of the MAE, Bligh & Dyer, Folch, modified Folch and Hara & Radin methods were the highest and, although they were not statistically different, differences existed in terms of variability, with MAE showing the highest repeatability (CV = 0.034). The Roese-Gottlieb, Soxhlet and modified Bligh & Dyer methods were very poor in terms of both efficiency and repeatability (CV between 0.13 and 0.18).
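The repeatability figures quoted above are coefficients of variation; a minimal sketch of the computation on hypothetical replicate lipid yields:

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean (dimensionless)."""
    return stdev(values) / mean(values)

# Hypothetical replicate total-lipid yields (g/100 g) for one sample.
replicates = [10.0, 10.2, 9.8]
print(round(coefficient_of_variation(replicates), 3))  # → 0.02
```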

Abstract:

GOAL: The manufacture and distribution of instant thin-layer chromatography strips with silica gel (ITLC-SG) (reference method) is currently discontinued, so there is a need for an alternative method for the determination of the radiochemical purity (RCP) of 99mTc-tetrofosmin. This study aims to compare five alternative methods proposed by the producer to determine the RCP of 99mTc-tetrofosmin. METHODS: Nineteen vials of tetrofosmin were radiolabelled with 99mTc and the RCP percentages were determined. Five different methods were compared with the standard RCP testing method (ITLC-SG, 2x20 cm): Whatman 3MM (1x10 cm) with acetone and dichloromethane (method 1); Whatman 3MM (1x10 cm) with ethyl acetate (method 2); aluminium oxide-coated plastic thin-layer chromatography (TLC) plate (1x10 cm) with ethanol (method 3); Whatman 3MM (2x20 cm) with acetone and dichloromethane (method 4); and a solid-phase extraction C18 cartridge (method 5). RESULTS: The average RCP values were 95.30% ± 1.28% (method 1), 93.95% ± 0.61% (method 2), 96.85% ± 0.93% (method 3), 92.94% ± 0.99% (method 4) and 96.25% ± 2.57% (method 5) (n = 12 each), and 93.15% ± 1.13% for the standard method (n = 19). There were statistically significant differences in the values obtained for methods 1 (P = 0.001), 3 (P < 0.001) and 5 (P = 0.004), and no statistically significant differences in the values obtained for methods 2 (P = 0.113) and 4 (P = 0.327). CONCLUSION: From the results obtained, methods 2 and 4 showed the best agreement with the standard method. Unlike method 4, method 2 is less time-consuming than the reference method and can overcome the problems associated with solvent toxicity. The remaining methods (1, 3 and 5) tended to overestimate the RCP value compared with the standard method.
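The agreement reported in the conclusion can be read directly from the mean bias of each method relative to the standard, using the mean values quoted in the abstract:

```python
standard = 93.15  # mean RCP (%) of the reference ITLC-SG method
method_means = {"method 1": 95.30, "method 2": 93.95, "method 3": 96.85,
                "method 4": 92.94, "method 5": 96.25}

# Bias = method mean - standard mean; methods 2 and 4 sit closest to
# zero, matching the study's conclusion, while 1, 3 and 5 overestimate.
bias = {m: round(v - standard, 2) for m, v in method_means.items()}
print(bias)
```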

Abstract:

Introduction: Paper and thin-layer chromatography methods are frequently used in classical Nuclear Medicine for the determination of the radiochemical purity (RCP) of radiopharmaceutical preparations. An aliquot of the radiopharmaceutical to be tested is spotted at the origin of a chromatographic strip (stationary phase), which in turn is placed in a chromatographic chamber in order to separate and quantify the radiochemical species present in the preparation. There are several methods for RCP measurement, based on equipment such as dose calibrators, well scintillation counters, radiochromatographic scanners and gamma cameras. The purpose of this study was to compare these quantification methods for the determination of RCP. Material and Methods: 99mTc-tetrofosmin and 99mTc-HDP were the radiopharmaceuticals chosen as the basis for this study. For the determination of the RCP of 99mTc-tetrofosmin we used ITLC-SG (2.5 x 10 cm) and 2-butanone (99mTc-tetrofosmin Rf = 0.55; 99mTcO4- Rf = 1.0; other labelled impurities, 99mTc-RH, Rf = 0.0). For the determination of the RCP of 99mTc-HDP, Whatman 31ET and acetone were used (99mTc-HDP Rf = 0.0; 99mTcO4- Rf = 1.0; other labelled impurities Rf = 0.0). After development of the solvent front, the strips were allowed to dry and then imaged on the gamma camera (256x256 matrix; zoom 2; LEHR parallel-hole collimator; 5-minute image) and on the radiochromatogram scanner. The strips were then cut at Rf 0.8 in the case of 99mTc-tetrofosmin and at Rf 0.5 in the case of 99mTc-HDP. The resulting pieces were crushed in an assay tube (to minimize the effect of counting geometry) and counted in the dose calibrator and in the well scintillation counter (for 1 minute). The RCP was calculated using the formula: % 99mTc-complex = [(99mTc-complex) / (total amount of 99mTc-labelled species)] x 100. Statistical analysis was performed using the test of hypotheses for the difference between means in independent samples.
Results: The gamma camera based method showed higher operator-dependency (especially concerning the drawing of the ROIs), and the measurements obtained with the dose calibrator are very sensitive to the amount of activity spotted on the chromatographic strip, so a minimum activity of 3.7 MBq is essential to minimize quantification errors. The radiochromatographic scanner and the well scintillation counter gave concordant results and demonstrated the highest level of precision. Conclusions: The radiochromatographic scanner and well scintillation counter based methods proved to be the most accurate and least operator-dependent.
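The RCP formula used in this study amounts to a one-line computation; the counts below are hypothetical:

```python
def rcp_percent(complex_counts, total_counts):
    """% 99mTc-complex = (99mTc-complex / total 99mTc-labelled species) x 100."""
    return 100.0 * complex_counts / total_counts

# Hypothetical counts from the two pieces of a cut strip: the complex
# stays below the cut, while pertechnetate migrates with the solvent front.
below_cut, above_cut = 46_000, 4_000  # counts per minute
print(rcp_percent(below_cut, below_cut + above_cut))  # → 92.0
```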

Abstract:

Introduction: Although relative uptake values are not the main objective of a 99mTc-DMSA scan, they are important quantitative information. In most dynamic renal scintigraphies, attenuation correction is essential for a reliable quantification result. In DMSA scans, however, the absence of significant background and the lower attenuation in paediatric patients mean that attenuation correction techniques are usually not applied. The geometric mean is the most common method, but it requires the acquisition of an (extra) anterior projection, which is not acquired in a large number of NM departments. This method and the attenuation factors proposed by Tonnesen were correlated with the absence of attenuation correction procedures. Material and Methods: Images from 20 individuals (aged 3 years ± 2) were used and the two attenuation correction methods applied. The mean acquisition time (time post DMSA administration) was 3.5 hours ± 0.8 h. Results: The absence of attenuation correction showed a good correlation with both attenuation methods (r = 0.73 ± 0.11), and the mean difference in the uptake values between the different methods was 4 ± 3. The correlation was higher at lower ages. The two attenuation correction methods correlated better with each other than with the "no attenuation correction" method (r = 0.82 ± 0.8), and the mean difference in the uptake values was 2 ± 2. Conclusion: The decision not to apply any attenuation correction method can be justified by the minor differences verified in the relative kidney uptake values. Nevertheless, if an accurate value of the relative kidney uptake is required, an attenuation correction method should be used.
The attenuation correction factors proposed by Tonnesen can be easily implemented and thus become a practical alternative, namely when the anterior projection needed for the geometric mean methodology is not acquired.
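The geometric mean method combines anterior and posterior counts per kidney before computing the relative uptake; a minimal sketch with hypothetical counts (the Tonnesen factors themselves are not reproduced here):

```python
from math import sqrt

def relative_uptake(left_ant, left_post, right_ant, right_post):
    """Relative kidney uptake (%) from the geometric mean of the
    anterior and posterior counts of each kidney."""
    gm_left = sqrt(left_ant * left_post)
    gm_right = sqrt(right_ant * right_post)
    total = gm_left + gm_right
    return 100 * gm_left / total, 100 * gm_right / total

# Hypothetical counts: the posterior view sees the kidneys with less
# attenuation than the anterior view, hence the higher counts.
left, right = relative_uptake(9_000, 16_000, 8_100, 14_400)
print(round(left, 1), round(right, 1))  # → 52.6 47.4
```

Using posterior counts alone (the "no attenuation correction" approach compared in the study) simply drops the square root and the anterior terms.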

Abstract:

The notion of a Hydrogen Economy in the vocabulary of political and business leaders has been changing, driven mainly by concerns over global pollution, energy security and climate change, in addition to the growing technical mastery of scientists and engineers. Interest in this compound, the simplest and most abundant element in the universe, is growing due to the technological advances of fuel cells, the potential successors of the batteries in portable electronic devices, of power plants and of internal combustion engines. There are already well-developed methods for producing hydrogen. Among them, water electrolysis stands out, not only because it is a simple method but because it can use renewable energy resources, such as wind or photovoltaic panels, and increase their efficiency. The challenges in improving the use of this method consist in reducing consumption, maintenance and energy costs and in increasing reliability, durability and safety. They also consist in monetizing the oxygen by-product, a very important industrial and medicinal gas. In this work, the economic viability of installing a unit for the production of pure hydrogen and oxygen by water electrolysis, powered by solar energy, was studied at the company Gasoxmed – Gases Medicinais S.A., which intends in the near future to commercialize hydrogen as an energy source and, on the other hand, to exploit the oxygen by-product for industrial use. A unit was thus designed around a Proton C30 electrolyser, with a gas production capacity of 3 kg/h (30 m3/h) of hydrogen and 20 kg/h (15 m3/h) of oxygen. The gases produced are compressed in a RIX compressor to 200 bar for subsequent storage in pressurized cylinders. A 250 kW photovoltaic mini-generation system was also sized to supply the installation with electricity.
The project in the new production area will require €1,713,963, to be obtained through a bank loan. All fixed costs associated with the project were defined, totalling €62,554/month for the first 5 years (the duration of the bank loan), after which they will decrease to €21,204/month. From the sale of hydrogen, industrial oxygen and the electricity produced by the 250 kW mini-generation system, a monthly income of €117,925 is expected, yielding a positive net monthly total of €55,371 during the first 5 years and of €96,721/month thereafter, with the initial investment amortized by the end of the 3rd year.
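The payback implied by these figures can be checked with a few lines of arithmetic using the abstract's own numbers (a simple-payback view that ignores interest and discounting, so it is only a rough cross-check of the stated amortization):

```python
investment = 1_713_963        # € initial investment (bank loan)
net_monthly = 55_371          # € net monthly result during the loan period

# Simple payback: months until cumulative net income covers the investment.
months, cumulative = 0, 0
while cumulative < investment:
    cumulative += net_monthly
    months += 1
print(months)  # → 31 months, i.e. during the 3rd year of operation
```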

Abstract:

One of the most debated topics in today's society is security. Security levels, and the tools to achieve them, stand in opposition to the methods used to break them. As in the past, the quality/service ratio holds today, and will hold in the future, providing greater security to those who protect themselves best. Simple real-life problems such as theft or the use of a false identity take on, in the computing world, a fast and sometimes undetectable form of organized crime. This study investigates social-engineering methods and common software applications used to break the security of a generic computer system. In this way, with an understanding of the modus operandi of malicious entities, the instability and insecurity of a computer system can be demonstrated, and the system can subsequently be hardened so that it is placed in a security posture which, while perhaps not infallible, is greatly improved. One of the central goals of this work is to implement and configure a complete system through a study of market solutions, free or commercial, for the deployment of a networked system with all common services installed, i.e. a "turnkey" package comprising machines, operating system, applications, and networking with e-mail, enterprise management, anti-virus and firewall services, among others. It will then be possible to demonstrate an instance of a functional, secure system offering the services required of a modern system, without resorting to third parties, subjected to a set of tests that help reinforce its security.