Abstract:
The pathophysiological sequelae of oxidative stress are difficult to quantify. Despite the obstacles, the medical relevance of oxidative stress has been increasingly recognised, and it is nowadays regarded as a key component of virtually all diseases. In this context, erectile dysfunction (ED) emerges as a kind of barometer of endothelial function and oxidative damage. The quantification of oxidative stress biomarkers may therefore have an enormous impact on the evaluation of patients with ED. The reduced/oxidised glutathione ratio (GSH/GSSG) and nitrotyrosine (3-NT) have shown clinical relevance, and taking genetic polymorphisms into account is a promising approach for evaluating these relationships in the future. A highly sensitive high-performance liquid chromatography (HPLC) method was developed for the determination of 3-NT in human plasma. The 3-NT concentrations measured in subjects with ED were 6.6±2.1 μM (mean±S.D., n = 46). Measuring the plasma 3-NT concentration may prove useful as a marker of nitric oxide (NO)-dependent oxidative damage. The level of oxidative stress can also be quantified by measuring the decrease in the GSH/GSSG ratio, which has been shown to change in a myriad of pathologies, such as ED and diabetes mellitus. The proposed HPLC method for quantifying the GSH/GSSG ratio has the advantage of assessing both parameters concomitantly in a single run. The GSH/GSSG ratio obtained from the blood of subjects with ED was 11.9±9.8 (mean±S.D., n = 49). Statistical analysis revealed significant differences (p<0.001) between both the plasma 3-NT concentration and the blood GSH/GSSG ratio of subjects with ED and the respective measurements in healthy subjects. Statistically significant differences (p≈0.027) were also observed between the blood GSH/GSSG ratio of patients diagnosed with ED alone and the respective measurement in subjects with ED and cardiovascular comorbidities. These results emphasise the role of oxidative damage in the biopathology of ED, elucidated with the aid of these two methodologies, which may find a broad field of application in the future, since they proved to be simple, inexpensive and fast, and may eventually be suitable for large-scale screening studies.
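Group comparisons of this kind can be reproduced from summary statistics alone. The following is a minimal sketch using the ED-group 3-NT values given above (6.6±2.1 μM, n = 46); the healthy-control values are hypothetical placeholders, since the abstract does not report them, and the abstract does not state which test the authors used.

```python
# Welch's t-test from summary statistics (mean, SD, n) with scipy.
from scipy.stats import ttest_ind_from_stats

# Plasma 3-NT (uM) in the ED group, as reported in the abstract
ed_mean, ed_sd, ed_n = 6.6, 2.1, 46
# Healthy controls: HYPOTHETICAL values, for illustration only
ctrl_mean, ctrl_sd, ctrl_n = 1.2, 0.8, 40

t_stat, p_value = ttest_ind_from_stats(ed_mean, ed_sd, ed_n,
                                       ctrl_mean, ctrl_sd, ctrl_n,
                                       equal_var=False)  # Welch's variant
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```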
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making them hard to discriminate using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration- (MS, SIRM) and grain-size-sensitive (χ(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. To study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of ~7 m. The magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate that the tsunami-induced deposit is best connected to a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop the littoral dune, erode a substantial amount of sand from it, and mix it with reworked materials from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
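As a rough illustration of the multivariate step described above, the sketch below ordinates samples by their magnetic proxies with a principal component analysis, so that a tsunami layer can be compared with the estuary's sedimentologic units in a reduced score space. The data are synthetic placeholders, not measurements from the study, and PCA is one common choice among the multivariate analyses the abstract leaves unnamed.

```python
# Ordination of samples described by five magnetic proxies
# (e.g. MS, SIRM, chi(ARM), ARM/SIRM, B1/2) via PCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# rows = samples, columns = magnetic proxies (SYNTHETIC values)
X = rng.lognormal(mean=0.0, sigma=0.5, size=(30, 5))

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# Tsunami-layer samples plotting between two units in this score space
# would indicate mixing of both sources, as argued in the abstract.
print(scores[:3])
```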
Abstract:
Personal memories composed of digital pictures are very popular at the moment. To retrieve these media items, annotation is required. In recent years, several approaches have been proposed to overcome the image annotation problem. This paper presents our proposals to address this problem: automatic and semi-automatic learning methods for semantic concepts. The automatic method estimates semantic concepts from visual content, context metadata and audio information. The semi-automatic method is based on results provided by a computer game. The paper describes both proposals and presents their evaluation.
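One plausible reading of the automatic method above is a late fusion of per-modality concept scores. The sketch below illustrates only that idea; the weights, scores and concept names are hypothetical, and the paper's actual estimator is not specified in this abstract.

```python
# Late fusion: combine per-modality concept scores with fixed weights.
# All numbers are HYPOTHETICAL illustration values.
weights = {"visual": 0.5, "context": 0.3, "audio": 0.2}
scores = {"beach": {"visual": 0.8, "context": 0.9, "audio": 0.4},
          "party": {"visual": 0.3, "context": 0.2, "audio": 0.7}}

fused = {concept: sum(weights[m] * s for m, s in mods.items())
         for concept, mods in scores.items()}
print(max(fused, key=fused.get), fused)  # highest-scoring concept wins
```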
Abstract:
This paper presents five different clustering methods to identify typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely the data pre-processing phase, the application of the clustering algorithms, and the evaluation of the quality of the partition, which is supported by cluster validity indices; the process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
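A minimal sketch of those three phases follows: normalise the daily load curves, cluster them, and score candidate partitions with validity indices. The abstract does not name the five algorithms or the indices used, so k-means, the silhouette index and the Davies-Bouldin index are stand-ins, and the load data are synthetic.

```python
# Pre-process -> cluster -> validate, on synthetic daily load curves.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, davies_bouldin_score

rng = np.random.default_rng(1)
profiles = rng.random((208, 96))                 # 96 points = 15-min daily curve
profiles /= profiles.max(axis=1, keepdims=True)  # peak-normalised pre-processing

for k in range(2, 7):                            # compare candidate partitions
    labels = KMeans(n_clusters=k, n_init=10, random_state=1).fit_predict(profiles)
    print(k, silhouette_score(profiles, labels),
          davies_bouldin_score(profiles, labels))
```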
Abstract:
OBJECTIVE: To evaluate the fluoride and aluminum concentrations in herbal, black, ready-to-drink, and imported teas available in Brazil, considering the risks fluoride and aluminum pose to oral and general health, respectively. METHODS: One hundred and seventy-seven samples of herbal and black tea, 11 types of imported tea and 21 samples of ready-to-drink tea were divided into four groups: I-herbal tea; II-Brazilian black tea (Camellia sinensis); III-imported tea (Camellia sinensis); IV-ready-to-drink tea-based beverages. Fluoride and aluminum were analyzed using an ion-selective electrode and atomic absorption, respectively. RESULTS: Fluoride and aluminum levels in herbal teas were very low, but high amounts were found in black and ready-to-drink teas. The aluminum found in all samples analyzed can be considered safe for general health. However, considering 0.07 mg F/kg/day as the upper limit of fluoride intake with regard to undesirable dental fluorosis, some teas exceed the daily intake limit for children. CONCLUSIONS: Brazilian and imported teas made from Camellia sinensis, as well as some tea-based beverages, are sources of significant amounts of fluoride, and their intake may increase the risk of developing dental fluorosis.
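The 0.07 mg F/kg/day threshold above reduces to a simple dose calculation. The sketch below works one hypothetical example; the tea fluoride concentration, intake volume and body weight are illustrative values, not figures from the study.

```python
# Daily fluoride dose from tea versus the fluorosis upper limit.
def fluoride_dose(conc_mg_per_l: float, volume_l: float, weight_kg: float) -> float:
    """Daily fluoride dose in mg F per kg of body weight."""
    return conc_mg_per_l * volume_l / weight_kg

LIMIT = 0.07  # mg F/kg/day upper limit with regard to dental fluorosis
# HYPOTHETICAL example: a 20 kg child drinking 750 mL of tea at 2.5 mg F/L
dose = fluoride_dose(conc_mg_per_l=2.5, volume_l=0.75, weight_kg=20)
print(f"{dose:.3f} mg F/kg/day -> exceeds limit: {dose > LIMIT}")
```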
Abstract:
The intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making decision making about small-scale energy generation and storage relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, should be assured. This paper presents a new approach to solve the economic dispatch problem in smart grids. The proposed methodology for resource management involves two stages. The first uses fuzzy set theory to define the range forecasts for the natural resources as well as the load forecast. The second uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
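A compact sketch of such a two-stage scheme is given below: stage one represents the renewable and load forecasts as triangular fuzzy numbers, and stage two runs a simple merit-order heuristic against the defuzzified net load. The unit data, costs and the centroid defuzzification are all illustrative assumptions; the paper's actual heuristic is not specified in this abstract.

```python
# Stage 1: fuzzy range forecasts as triangular numbers (min, mode, max).
wind_fuzzy = (10.0, 18.0, 24.0)   # MW: pessimistic, most likely, optimistic
load_fuzzy = (60.0, 70.0, 85.0)

def defuzzify(tri):               # centroid of a triangular fuzzy number
    return sum(tri) / 3.0

residual = defuzzify(load_fuzzy) - defuzzify(wind_fuzzy)  # net load to dispatch

# Stage 2: greedy merit-order dispatch (HYPOTHETICAL units: name, MW, EUR/MWh).
units = [("gas", 30.0, 45.0), ("diesel", 20.0, 80.0), ("storage", 15.0, 60.0)]
units.sort(key=lambda u: u[2])    # cheapest first
dispatch = {}
for name, cap, _cost in units:
    p = min(cap, max(residual, 0.0))
    dispatch[name] = p
    residual -= p
print(dispatch)
```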
Abstract:
In the context of electricity markets, transmission pricing is an important tool to achieve an efficient operation of the electricity system. The electricity market is influenced by several factors; however, transmission network management is one of the most important, because the network is a natural monopoly. Transmission tariffs can help to regulate the market; for this reason, they must follow strict criteria. This paper presents the following methods to tariff the use of transmission networks by electricity market players: the Postage-Stamp Method; the MW-Mile Method; Distribution Factors Methods; the Tracing Methodology; Bialek's Tracing Method; and Locational Marginal Price. A nine-bus transmission network is used to illustrate the application of the tariff methods.
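The two simplest of those methods reduce to short allocation formulas: Postage-Stamp splits the total network cost in proportion to each player's peak demand, while MW-Mile weighs each player's flow on every line by the line's length. The sketch below uses hypothetical players, lines and costs rather than the paper's nine-bus case study.

```python
# Postage-Stamp and MW-Mile tariff allocation (HYPOTHETICAL data).
total_cost = 1_000_000.0                      # annual network cost (EUR)
peak = {"A": 120.0, "B": 80.0}                # player demand (MW) at system peak

postage_stamp = {p: total_cost * mw / sum(peak.values()) for p, mw in peak.items()}

# flows[player][line] in MW; line lengths in miles
flows = {"A": {"L1": 50.0, "L2": 30.0}, "B": {"L1": 10.0, "L2": 60.0}}
length = {"L1": 100.0, "L2": 40.0}

usage = {p: sum(mw * length[l] for l, mw in fl.items()) for p, fl in flows.items()}
mw_mile = {p: total_cost * u / sum(usage.values()) for p, u in usage.items()}
print(postage_stamp, mw_mile)
```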
Abstract:
This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of ancillary services dispatch in an electricity market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics when compared with the Linear Programming approach.
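To make the underlying optimization problem concrete, here is a stripped-down evolutionary sketch (selection and mutation only, far simpler than the GA and EPSO variants the paper uses) that allocates a single reserve requirement among priced bids. The four bids and the 100 MW requirement are hypothetical.

```python
# Evolutionary dispatch of one reserve service among priced bids.
import random

random.seed(42)
caps  = [40.0, 35.0, 50.0, 30.0]      # bid capacities (MW), HYPOTHETICAL
price = [12.0, 15.0, 10.0, 20.0]      # bid prices (EUR/MW), HYPOTHETICAL
REQ   = 100.0                         # service requirement (MW)

def fitness(x):                       # cost plus penalty for unmet requirement
    cost = sum(p * q for p, q in zip(price, x))
    return cost + 1e4 * max(0.0, REQ - sum(x))

def mutate(x):                        # Gaussian perturbation, clipped to [0, cap]
    return [min(c, max(0.0, q + random.gauss(0, 3))) for q, c in zip(x, caps)]

pop = [[random.uniform(0, c) for c in caps] for _ in range(30)]
for _ in range(200):                  # keep the 10 best, refill with mutants
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
best = min(pop, key=fitness)
print([round(q, 1) for q in best], round(fitness(best), 1))
```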
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all the business opportunities and to take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players, so the decision support tool must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data, concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services, is included in this paper.
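For comparison with the evolutionary sketch above, the Linear Programming formulation of the same single-service toy problem takes a few lines with scipy and reaches the optimum directly. The bid data are the same hypothetical values, not CAISO's.

```python
# LP dispatch of one reserve service: minimise bid cost subject to
# meeting the requirement and respecting bid capacity limits.
from scipy.optimize import linprog

price = [12.0, 15.0, 10.0, 20.0]           # EUR/MW per bid (HYPOTHETICAL)
caps  = [(0, 40.0), (0, 35.0), (0, 50.0), (0, 30.0)]
REQ   = 100.0                              # MW required

# minimise price.x  s.t.  sum(x) >= REQ  (written as -sum(x) <= -REQ)
res = linprog(c=price, A_ub=[[-1, -1, -1, -1]], b_ub=[-REQ], bounds=caps)
print(res.x, res.fun)                      # optimal quantities and total cost
```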
Abstract:
Objectives: The purpose of this article is to identify differences between surveys using paper and online questionnaires. The author has extensive experience with questions concerning opinions in the development of survey-based research, e.g. the limits of postal and online questionnaires. Methods: Paper and online questionnaires were used in the physician studies carried out in 1995 (doctors graduated in 1982-1991), 2000 (doctors graduated in 1982-1996), 2005 (doctors graduated in 1982-2001) and 2011 (doctors graduated in 1977-2006), and in a study of 457 family doctors in 2000. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The physician studies showed that there were differences between the methods, connected with the use of a paper-based versus an online questionnaire and with the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time; the very low financial resource needs; and the fact that the data were loaded directly into the data analysis software, which saved the time and resources associated with the data entry process. Conclusions: This article helps researchers to plan their study design and to choose the right data collection method.
Abstract:
Tomographic images can be degraded, partially by patient-based attenuation. The aim of this paper is to quantitatively verify the effects of the Chang and CT attenuation correction methods in 111In studies, through the analysis of profiles from abdominal SPECT corresponding to an organ with uniform radionuclide uptake, the left kidney.
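For reference, the first-order Chang method multiplies each reconstructed pixel by the reciprocal of its attenuation factor averaged over the projection angles. The sketch below assumes a circular body contour and a uniform effective attenuation coefficient; both numbers are illustrative assumptions, not values from the paper.

```python
# First-order Chang correction factor for a point inside a circular contour.
import numpy as np

MU = 0.12   # cm^-1, ASSUMED effective linear attenuation coefficient
R  = 10.0   # cm, ASSUMED circular body contour radius

def chang_factor(x, y, mu=MU, r=R, n_angles=64):
    """Multiplicative Chang correction at point (x, y)."""
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    # distance from (x, y) to the circular contour along each angle
    b = x * np.cos(angles) + y * np.sin(angles)
    d = -b + np.sqrt(b**2 + (r**2 - x**2 - y**2))
    return 1.0 / np.mean(np.exp(-mu * d))

print(chang_factor(0.0, 0.0))   # centre of the circle: exp(mu*R) ~ 3.32
```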
Abstract:
Master's in Occupational Safety and Hygiene.
Abstract:
OBJECTIVE: Myocardial infarction is an acute and severe cardiovascular disease that generally leads to patient admissions to intensive care units; few cases are initially admitted to infirmaries. The objective of the study was to assess whether estimates of air pollution effects on myocardial infarction morbidity are modified by the source of health information. METHODS: The study was carried out in hospitals of the Brazilian Health System in the city of São Paulo, Southeastern Brazil. A time series study (1998-1999) was performed using two outcomes: infarction admissions to infirmaries and to intensive care units, both for people older than 64 years of age. Generalized linear models controlling for seasonality (long- and short-term trends) and weather were used. The eight-day cumulative effects of air pollutants were assessed using third-degree polynomial distributed lag models. RESULTS: Almost 70% of daily hospital admissions due to myocardial infarction were to infirmaries. Despite that, the effects of air pollutants on infarction were higher for intensive care unit admissions. All pollutants were positively associated with the study outcomes, but SO2 presented the strongest statistically significant association. An interquartile-range increase in SO2 concentration was associated with increases of 13% (95% CI: 6-19) and 8% (95% CI: 2-13) in intensive care unit and infirmary infarction admissions, respectively. CONCLUSIONS: It may be assumed that there is a misclassification of myocardial infarction admissions to infirmaries, leading to overestimation. Also, despite the smaller absolute number of events, intensive care unit admissions data provide a more adequate estimate of the magnitude of air pollution effects on infarction admissions.
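As a worked illustration of how such percentages are read off a Poisson-family generalized linear model: an estimated coefficient β per unit of pollutant translates into a (e^(β·IQR) − 1) × 100% change in admissions per interquartile-range increase. The β and IQR below are hypothetical values chosen to land near the 13% ICU estimate; the study's actual estimates are in the abstract.

```python
# Percent change in admissions per IQR increase, from a Poisson GLM slope.
import math

beta = 0.0122        # HYPOTHETICAL log-rate increase per ug/m3 of SO2
iqr  = 10.0          # HYPOTHETICAL SO2 interquartile range (ug/m3)

pct = (math.exp(beta * iqr) - 1.0) * 100.0
print(f"{pct:.0f}% increase per IQR")   # ~13%, matching the ICU estimate above
```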
Abstract:
In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex to handle. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: Direct Search Methods, or Derivative-free Methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem; in this problem, a step is accepted if it reduces either the objective function or the constraint violation. This implies that filter methods are less parameter-dependent than penalty functions. In this work, we present a new direct search method, based on simplex methods, for general constrained optimization, which combines the features of simplex methods and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
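The acceptance rule above can be captured in a few lines. The sketch below (in Python rather than the authors' Java, and with the simplex step that generates trial points omitted) shows one common way to aggregate violations and test a trial point against the filter; the toy objective and constraint are illustrative.

```python
# Filter acceptance: accept a trial point unless some stored (f, h) pair
# is at least as good in both the objective f and the violation h.
def h(x, constraints):
    """Aggregated violation: sum of positive parts of g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

def acceptable(f_new, h_new, filter_set):
    return not any(f_old <= f_new and h_old <= h_new
                   for f_old, h_old in filter_set)

# toy problem: minimise f subject to g(x) = x[0] + x[1] - 1 <= 0
f = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2
cons = [lambda x: x[0] + x[1] - 1.0]

filter_set = set()
for trial in ([2.0, 1.0], [0.5, 0.5], [0.7, 0.3]):  # points a simplex step might propose
    fn, hn = f(trial), h(trial, cons)
    if acceptable(fn, hn, filter_set):
        filter_set.add((fn, hn))                    # accept and update the filter
print(sorted(filter_set))
```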
Abstract:
Master's in Chemical Engineering.