23 results for DNA Error Correction

at Instituto Politécnico do Porto, Portugal


Relevance: 80.00%

Abstract:

This paper addresses the impact of the CO2 opportunity cost on the wholesale electricity price in the context of the Iberian electricity market (MIBEL), namely on the Portuguese system, for the period corresponding to Phase II of the European Union Emission Trading Scheme (EU ETS). In the econometric analysis, a vector error correction model (VECM) is specified to estimate both long-run equilibrium relations and short-run interactions between the electricity price and the fuel (natural gas and coal) and carbon prices. The model is estimated using daily spot market prices, and the four commodity prices are jointly modelled as endogenous variables. Moreover, a set of exogenous variables is incorporated in order to account for the electricity demand conditions (temperature) and the electricity generation mix (quantity of electricity traded according to the technology used). The outcomes for the Portuguese electricity system suggest that the dynamic pass-through of carbon prices into electricity prices is strongly significant, and the estimated long-run elasticity (equilibrium relation) is in line with studies conducted for other markets.
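As a hedged illustration of how such a model can be specified, the sketch below uses the `statsmodels` VECM implementation on hypothetical daily series for electricity, gas, coal, and carbon prices; the variable names, the single cointegrating relation, and the lag order are illustrative assumptions, not the paper's exact specification.

```python
# Minimal VECM sketch with statsmodels (illustrative only; the paper's exact
# lag order, rank, and deterministic terms are not reproduced here).
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# Hypothetical daily price data: electricity, natural gas, coal, CO2.
rng = np.random.default_rng(0)
n = 500
common = np.cumsum(rng.normal(size=n))          # shared stochastic trend
prices = pd.DataFrame({
    "elec": common + rng.normal(size=n),
    "gas":  0.8 * common + rng.normal(size=n),
    "coal": 0.5 * common + rng.normal(size=n),
    "co2":  0.6 * common + rng.normal(size=n),
})
temp = rng.normal(15, 5, size=n)                 # exogenous: temperature

model = VECM(prices, exog=temp.reshape(-1, 1),
             k_ar_diff=2,          # short-run lags (assumed)
             coint_rank=1,         # one long-run equilibrium relation (assumed)
             deterministic="ci")   # constant inside the cointegration relation
res = model.fit()
print(res.beta)    # long-run (equilibrium) coefficients
print(res.alpha)   # speed-of-adjustment coefficients
```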

Relevance: 80.00%

Abstract:

The European Union Emissions Trading Scheme (EU ETS) is a cornerstone of the European Union's policy to combat climate change and its key tool for reducing industrial greenhouse gas emissions cost-effectively. The purpose of the present work is to evaluate the influence of the CO2 opportunity cost on the Spanish wholesale electricity price. Our sample covers all of Phase II of the EU ETS and the first year of Phase III implementation, from January 2008 to December 2013. A vector error correction model (VECM) is applied to estimate not only long-run equilibrium relations, but also short-run interactions between the electricity price and the fuel (natural gas and coal) and carbon prices. The four commodity prices are modeled as joint endogenous variables, with air temperature and renewable energy as exogenous variables. We found a long-run relationship (cointegration) between the electricity price, the carbon price, and fuel prices. By estimating the dynamic pass-through of the carbon price into the electricity price for different periods of our sample, it is possible to observe the weakening of the link between carbon and electricity prices as a result of the collapse in CO2 prices, which compromises the efficacy of the system in reaching the proposed environmental goals. This conclusion is in line with the need to shape new policies within the framework of the EU ETS that prevent excessively low carbon prices over extended periods of time.
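A cointegration finding of this kind is typically established with the Johansen procedure. The hedged sketch below shows the generic form of such a test with `statsmodels`, reusing the hypothetical `prices` frame from the previous sketch as a stand-in for the actual Spanish market series; the deterministic order and lag choice are assumptions.

```python
# Johansen cointegration test sketch (illustrative; not the paper's exact setup).
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# `prices` is assumed to hold the four daily price series (elec, gas, coal, co2).
result = coint_johansen(prices, det_order=0, k_ar_diff=2)

# Compare each trace statistic against its 5% critical value (column index 1)
# to decide the cointegration rank r.
for r, (stat, crit) in enumerate(zip(result.lr1, result.cvt[:, 1])):
    print(f"H0: rank <= {r}: trace = {stat:.2f}, 5% crit = {crit:.2f}")
```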

Relevance: 20.00%

Abstract:

Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. The fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
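A minimal sketch of the kind of pipeline described, pairing Shannon entropy of the base histogram with a Fourier spectrum of a numeric encoding; the toy sequence and the base-to-number mapping are assumptions for illustration, not the paper's method.

```python
# Entropy + Fourier sketch for a DNA symbol sequence (the mapping below and
# the use of plain Shannon entropy are illustrative assumptions).
import numpy as np
from collections import Counter

seq = "ATGCGATACGCTTGAGCTAAGCTACGATCG" * 10   # placeholder sequence

# Shannon entropy of the base histogram.
counts = np.array(list(Counter(seq).values()), dtype=float)
p = counts / counts.sum()
entropy = -np.sum(p * np.log2(p))
print(f"Shannon entropy: {entropy:.3f} bits/base")

# Fourier spectrum of a numeric encoding (A=0, C=1, G=2, T=3 is an assumption).
mapping = {"A": 0, "C": 1, "G": 2, "T": 3}
signal = np.array([mapping[b] for b in seq], dtype=float)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
print("Dominant frequency bin:", np.argmax(spectrum[1:]) + 1)
```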

Relevance: 20.00%

Abstract:

Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator designed to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives that make their own decisions and interact with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses its forecasting error patterns. By recognizing the occurrence of such patterns, the method predicts the expected error for the next forecast and uses it to adapt the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
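The following sketch illustrates the general idea in a deliberately simplified form: past forecast errors are grouped by a context key (here, day of week, an assumption) and the mean error for the matching context is used to shift the next raw forecast. The actual MASCEM method and its error-pattern recognition are more elaborate.

```python
# Hedged sketch of forecast adaptation via error-pattern recognition.
# Assumption: errors cluster by day of week; the real method's pattern
# recognition is more sophisticated than a per-context mean.
from collections import defaultdict

class ErrorAdaptedForecaster:
    def __init__(self):
        self.errors_by_context = defaultdict(list)

    def record(self, context, forecast, actual):
        """Store the signed error observed for a given context."""
        self.errors_by_context[context].append(actual - forecast)

    def adjust(self, context, raw_forecast):
        """Shift the raw forecast by the mean past error for this context."""
        errs = self.errors_by_context.get(context, [])
        expected_error = sum(errs) / len(errs) if errs else 0.0
        return raw_forecast + expected_error

# Usage: an ANN (or other tool) produces raw_forecast; we adapt it.
f = ErrorAdaptedForecaster()
f.record("Monday", forecast=50.0, actual=54.0)
f.record("Monday", forecast=48.0, actual=51.0)
print(f.adjust("Monday", raw_forecast=52.0))  # 52 + mean(+4, +3) = 55.5
```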

Relevance: 20.00%

Abstract:

This paper analyzes DNA information using entropy and phase plane concepts. First, the DNA code is converted into a numerical format by means of histograms of DNA sub-sequences with lengths ranging from one up to ten bases. This strategy measures dynamical evolutions from 4 up to 4^10 signal states. The resulting histograms are analyzed using three distinct entropy formulations, namely the Shannon, Rényi and Tsallis definitions. Charts of entropy versus sequence length are applied to a set of twenty-four species, characterizing 486 chromosomes. The information is synthesized and visualized by adapting phase plane concepts, leading to a categorical representation of chromosomes and species.
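The three entropy definitions are straightforward to compute from a word histogram. The sketch below applies the standard formulas (Shannon; Rényi of order α; Tsallis of index q) to a toy sequence; the word lengths shown and the α and q values are illustrative choices, not the paper's settings.

```python
# Shannon, Rényi, and Tsallis entropies of DNA word histograms (standard
# textbook formulas; word length and alpha/q values are illustrative).
import numpy as np
from collections import Counter

def word_probs(seq, n):
    """Probability distribution over length-n words (sliding window)."""
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    total = sum(counts.values())
    return np.array([c / total for c in counts.values()])

def shannon(p):
    return -np.sum(p * np.log2(p))

def renyi(p, alpha):
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis(p, q):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

seq = "ATGCGATACGCTTGAGCTAAGCTACGATCG" * 50
for n in range(1, 6):   # word lengths 1..5 for brevity (paper uses 1..10)
    p = word_probs(seq, n)
    print(n, shannon(p), renyi(p, alpha=2.0), tsallis(p, q=2.0))
```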

Relevance: 20.00%

Abstract:

This paper addresses DNA code analysis from the perspective of dynamics and fractional calculus. Several mathematical tools are selected to establish a quantitative method without distorting the alphabet represented by the sequence of DNA bases. The association of the Gray code, the Fourier transform and fractional calculus leads to a categorical representation of species and chromosomes.
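A brief sketch of the first two steps of such a pipeline: encoding the four bases with a 2-bit Gray code (so adjacent codes differ by one bit) and taking a Fourier transform of the resulting signal. The particular base-to-Gray assignment below is an assumption for illustration; the paper's exact mapping is not reproduced.

```python
# Gray-code encoding of DNA bases followed by a Fourier transform
# (the base-to-Gray assignment here is an illustrative assumption).
import numpy as np

# 2-bit Gray code sequence: 00, 01, 11, 10 -> decimal 0, 1, 3, 2.
gray = {"A": 0, "C": 1, "G": 3, "T": 2}

seq = "ATGCGATACGCTTGAGCTAAGCTACGATCG" * 20
signal = np.array([gray[b] for b in seq], dtype=float)
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(len(signal))
print(freqs[1:][np.argmax(spectrum[1:])])  # dominant nonzero frequency
```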

Relevance: 20.00%

Abstract:

This paper studies human DNA from the perspective of signal processing. Six wavelets are tested for analyzing the information content of the human DNA. By adopting the real Shannon wavelet, several fundamental properties of the code are revealed. A quantitative comparison of the chromosomes and their visualization through multidimensional scaling and dendrograms is developed.
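The real Shannon wavelet has the closed form ψ(t) = 2 sinc(2t) − sinc(t), with sinc(x) = sin(πx)/(πx). The sketch below runs a simple scale sweep by direct convolution as an illustrative stand-in for a full wavelet analysis; the numeric encoding, the truncated support, and the chosen scales are all assumptions.

```python
# Real Shannon wavelet analysis sketch (illustrative scales and truncation).
# psi(t) = 2*sinc(2t) - sinc(t); np.sinc is the normalized sinc.
import numpy as np

def shannon_wavelet(t):
    return 2.0 * np.sinc(2.0 * t) - np.sinc(t)

mapping = {"A": 0, "C": 1, "G": 3, "T": 2}    # assumed numeric encoding
seq = "ATGCGATACGCTTGAGCTAAGCTACGATCG" * 20
signal = np.array([mapping[b] for b in seq], dtype=float)
signal -= signal.mean()

t = np.linspace(-8, 8, 257)                   # truncated support (assumption)
for scale in (1.0, 2.0, 4.0, 8.0):
    psi = shannon_wavelet(t / scale) / np.sqrt(scale)
    coeffs = np.convolve(signal, psi, mode="same")
    print(scale, np.sum(coeffs ** 2))         # energy per scale
```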

Relevance: 20.00%

Abstract:

We describe a novel approach to explore DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes and species. The article starts by analyzing chromosomal data through histograms using fixed length DNA sequences. After creating the DNA-related histograms, a correlation between pairs of histograms is computed, producing a global correlation matrix. These data are then used as input to several data processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics.
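A compact sketch of the first two stages described above: fixed-length word histograms per chromosome, then a global correlation matrix between histogram pairs. The toy sequences and the word length are stand-ins for the real chromosomal data.

```python
# Word-frequency histograms and a global correlation matrix between
# chromosomes (toy sequences; word length n=3 is an illustrative choice).
import numpy as np
from itertools import product
from collections import Counter

def histogram(seq, n=3):
    """Fixed-length word histogram over all 4**n possible words."""
    words = ["".join(w) for w in product("ACGT", repeat=n)]
    counts = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    return np.array([counts.get(w, 0) for w in words], dtype=float)

# Stand-ins for chromosome sequences.
chromosomes = {
    "chr1": "ATGCGATACG" * 200,
    "chr2": "TTAGGCATCG" * 200,
    "chr3": "ATGCGATACC" * 200,
}
hists = np.array([histogram(s) for s in chromosomes.values()])
corr = np.corrcoef(hists)   # global correlation matrix between histograms
print(corr.round(3))
```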

Relevance: 20.00%

Abstract:

This paper studies the DNA code of eleven mammals from the perspective of fractional dynamics. The application of the Fourier transform and power law trendlines leads to a categorical representation of species and chromosomes. The DNA information reveals long-range memory characteristics.
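A power-law trendline on a Fourier amplitude spectrum is a log-log straight-line fit: |F(f)| ∝ f^β, with the exponent β read off as the slope. The sketch below shows this standard fit on a toy encoded sequence; the encoding is an assumption.

```python
# Power-law trendline on the Fourier amplitude spectrum: fit
# log|F(f)| ~ beta * log f; the slope beta hints at long-range memory.
import numpy as np

mapping = {"A": 0, "C": 1, "G": 3, "T": 2}    # assumed encoding
seq = "ATGCGATACGCTTGAGCTAAGCTACGATCG" * 100
signal = np.array([mapping[b] for b in seq], dtype=float)
signal -= signal.mean()

amp = np.abs(np.fft.rfft(signal))[1:]         # skip the DC component
freq = np.fft.rfftfreq(len(signal))[1:]
beta, intercept = np.polyfit(np.log(freq), np.log(amp + 1e-12), 1)
print(f"power-law exponent: {beta:.3f}")      # slope of the log-log trendline
```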

Relevance: 20.00%

Abstract:

This paper aims to study the relationships between the chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and an MDS technique to visualize structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski, Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, like DNA sequences, with almost no assumptions other than the predefined DNA “word length.” It is able to produce comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four test correlation scenarios show that the high-level information clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the methods.
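A hedged sketch of the four-measure comparison: each measure is turned into a dissimilarity matrix over the histograms and embedded in three dimensions with MDS. The random matrices stand in for the 421 real chromosome histograms, and converting correlations to dissimilarities as 1 − r is a common convention assumed here.

```python
# Four dissimilarity measures between word histograms, fed into 3-D MDS
# (random stand-in data; real inputs would be the chromosome histograms).
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import kendalltau
from sklearn.manifold import MDS

rng = np.random.default_rng(1)
hists = rng.random((8, 64))                   # 8 stand-in CR histograms

measures = {
    "minkowski": squareform(pdist(hists, "minkowski", p=2)),
    "cosine":    squareform(pdist(hists, "cosine")),
    "pearson":   squareform(pdist(hists, "correlation")),  # 1 - Pearson r
    "kendall":   squareform(pdist(hists, lambda u, v: 1 - kendalltau(u, v)[0])),
}

for name, d in measures.items():
    xyz = MDS(n_components=3, dissimilarity="precomputed",
              random_state=0).fit_transform(d)
    print(name, xyz.shape)                    # 3-D coordinates per chromosome
```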

Relevance: 20.00%

Abstract:

Introduction: Myocardial Perfusion Imaging (MPI) is a very important tool in the assessment of Coronary Artery Disease (CAD) patients, and worldwide data demonstrate an increasingly wide use and clinical acceptance. Nevertheless, it is a complex process and quite vulnerable with respect to the amount and type of possible artefacts, some of which seriously affect the overall quality and clinical utility of the obtained data. One of the most inconvenient, yet relatively frequent (20% of cases), artefacts is related to patient motion during image acquisition. In most such situations, the specific data are evaluated and a decision is made between (A) accepting the results as they are, considering that the "noise" so introduced does not affect the final clinical information too seriously, or (B) repeating the acquisition process. Another possibility is to use the motion correction software provided within the software package of any current gamma camera. The aim of this study is to compare the quality of the final images obtained after applying motion correction software with those obtained by repeating the image acquisition. Material and Methods: Thirty MPI cases affected by motion artefacts and subsequently repeated were used. A group of three independent expert Nuclear Medicine clinicians (blinded to the differences of origin) was invited to evaluate the 30 sets of three images, one set per patient: (A) the original, motion-uncorrected image, (B) the original image after motion correction, and (C) the second acquisition image, without motion. The results were statistically analysed. Results and Conclusion: The results demonstrate that motion correction software is useful essentially when the amplitude of movement is not too large (a precise quantification of this threshold proved hard to define, due to discrepancies between clinicians and other factors, namely differences between camera brands); when the amplitude of movement is too large, the percentage of agreement between clinicians is much higher and repetition of the examination is unanimously considered indispensable.

Relevance: 20.00%

Abstract:

Introduction: The quantification of differential renal function in adults can be difficult due to many factors, one of which is the variation in kidney depth and the attenuation caused by all the tissue between the kidney and the camera. Some authors state that the lower attenuation in paediatric patients makes the use of attenuation correction algorithms unnecessary. This study compares the values of differential renal function obtained with and without attenuation correction techniques. Material and Methods: Images from a group of 15 individuals (aged 3 years +/- 2) were used, and two attenuation correction methods were applied: Tonnesen correction factors and the geometric mean method. The mean acquisition time (post 99mTc-DMSA administration) was 3.5 hours +/- 0.8 h. Results: The absence of any attenuation correction method apparently leads to consistent values that correlate well with those obtained when attenuation correction methods are incorporated. The differences between the values obtained with and without attenuation correction were not significant. Conclusion: The decision not to apply any attenuation correction method can apparently be justified by the minor differences observed in the relative kidney uptake values. Nevertheless, if a truly accurate value of the relative kidney uptake is required, an attenuation correction method should be used.
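For reference, the geometric mean method itself is simple: counts from the anterior and posterior projections are combined as their geometric mean, which cancels first-order depth-dependent attenuation, and the differential function follows directly. The sketch below uses hypothetical counts; the Tonnesen factors involve published depth-estimation coefficients that are not reproduced here.

```python
# Geometric-mean method for differential renal function (hypothetical counts;
# Tonnesen depth-correction coefficients are deliberately not reproduced).
import math

# Background-corrected kidney counts from each projection (assumed values).
left_post, left_ant = 41200.0, 36800.0
right_post, right_ant = 45100.0, 43900.0

# The geometric mean cancels first-order depth-dependent attenuation.
gm_left = math.sqrt(left_post * left_ant)
gm_right = math.sqrt(right_post * right_ant)

drf_left = 100.0 * gm_left / (gm_left + gm_right)
print(f"Left kidney: {drf_left:.1f}%  Right kidney: {100 - drf_left:.1f}%")
```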

Relevance: 20.00%

Abstract:

Introduction: Although relative uptake values are not the most important objective of a 99mTc-DMSA scan, they are important quantitative information. In most dynamic renal scintigraphies, attenuation correction is essential to obtain a reliable result from the quantification process. In DMSA scans, however, the absence of significant background and the lower attenuation in paediatric patients mean that these attenuation correction techniques are in practice not applied. The geometric mean is the most common method, but it requires the acquisition of an additional anterior projection, which is not acquired by a large number of NM departments. This method and the attenuation factors proposed by Tonnesen are correlated here with the absence of attenuation correction procedures. Material and Methods: Images from 20 individuals (aged 3 years +/- 2) were used and the two attenuation correction methods applied. The mean acquisition time (post-DMSA administration) was 3.5 hours +/- 0.8 h. Results: The absence of attenuation correction showed a good correlation with both attenuation methods (r = 0.73 +/- 0.11), and the mean difference in the uptake values between the methods was 4 +/- 3. The correlation was higher at lower ages. The two attenuation correction methods correlated more strongly with each other than with the "no attenuation correction" approach (r = 0.82 +/- 0.8), and the mean difference in uptake values was 2 +/- 2. Conclusion: The decision not to apply any attenuation correction method can be justified by the minor differences observed in the relative kidney uptake values. Nevertheless, if an accurate value of the relative kidney uptake is required, an attenuation correction method should be used. The attenuation correction factors proposed by Tonnesen are easy to apply and thus become a practical alternative, namely when the anterior projection needed for the geometric mean method is not acquired.

Relevance: 20.00%

Abstract:

The immobilization of biological molecules is one of the most important steps in the construction of a biosensor. In the case of DNA, the way it exposes its bases can produce electrochemical signals at acceptable levels. The use of a self-assembled monolayer (SAM), which allows attachment to gold through a thiol group and DNA binding to an aldehydic ligand, made it possible to detect DNA hybridization. Single-stranded DNA (ssDNA) from calf thymus was immobilized on a pre-formed alkanethiol film, created by incubation in a solution of 2-aminoethanethiol (Cys) followed by glutaraldehyde (Glu). Cyclic voltammetry (CV) was used to characterize the self-assembled monolayer on the gold electrode and also to study the immobilization of the ssDNA probe and its hybridization with the complementary sequence (target ssDNA). The ssDNA probe presents a well-defined oxidation peak at +0.158 V. When hybridization occurs, this peak disappears, which confirms the efficacy of the annealing and the formation of the DNA double helix without the presence of electroactive indicators. The use of the SAM resulted in stable immobilization of the ssDNA probe, enabling label-free hybridization detection. This study represents a promising approach for molecular biosensors with sensitive and reproducible results.

Relevance: 20.00%

Abstract:

Reactive oxygen species (ROS) are produced as a consequence of normal aerobic metabolism and are able to induce oxidative DNA damage. At the cellular level, the evaluation of the protective effect of antioxidants can be achieved by examining the integrity of the DNA nucleobases using electrochemical techniques. Herein, the use of an adenine-rich oligonucleotide (dA21) adsorbed on carbon paste electrodes for the assessment of antioxidant capacity is proposed. The method is based on the partial damage of a DNA layer adsorbed on the electrode surface by OH• radicals generated by the Fenton reaction, and the subsequent electrochemical oxidation of the intact adenine bases to generate an oxidation product that is able to catalyze the oxidation of NADH. The presence of antioxidant compounds scavenges the hydroxyl radicals, leaving more adenines unoxidized and thus increasing the electrocatalytic current of NADH measured by differential pulse voltammetry (DPV). Using ascorbic acid (AA) as a model antioxidant species, the detection of as little as 50 nM of AA in aqueous solution was possible. The protection efficiency was evaluated for several antioxidant compounds. The biosensor was applied to the determination of the total antioxidant capacity (TAC) in beverages.