14 results for normalization
in CentAUR: Central Archive University of Reading - UK
Abstract:
We argue that impulsiveness is characterized by compromised timing functions such as premature motor timing, decreased tolerance to delays, poor temporal foresight and steeper temporal discounting. A model illustration of the association between impulsiveness and timing deficits is the impulsiveness disorder of attention-deficit hyperactivity disorder (ADHD). Children with ADHD have deficits in timing processes across several temporal domains, and the neural substrates of these compromised timing functions are strikingly similar to the neuropathology of ADHD. We review our published data and present novel functional magnetic resonance imaging data to demonstrate that children with ADHD show dysfunctions in key timing regions of prefrontal, cingulate, striatal and cerebellar location during temporal processes of several time domains, including time discrimination in milliseconds, motor timing in seconds and temporal discounting over longer time intervals. Given that impulsiveness, timing abnormalities and, more specifically, ADHD have been related to dopamine dysregulation, we tested for and demonstrated a normalization effect on all brain dysfunctions in children with ADHD during time discrimination with the dopamine agonist and treatment of choice, methylphenidate. This review, together with the new empirical findings, demonstrates that neurocognitive dysfunctions in temporal processes are crucial to the impulsiveness disorder of ADHD and provides the first evidence of normalization with a dopamine reuptake inhibitor.
Abstract:
The current study aims to assess the applicability of direct or indirect normalization for the analysis of fractional anisotropy (FA) maps in the context of diffusion-weighted images (DWIs) contaminated by ghosting artifacts. We found that FA maps acquired by direct normalization showed generally higher anisotropy than indirect normalization, and the disparities were aggravated by the presence of ghosting artifacts in DWIs. The voxel-wise statistical comparisons demonstrated that indirect normalization reduced the influence of artifacts and enhanced the sensitivity of detecting anisotropy differences between groups. This suggested that images contaminated with ghosting artifacts can be sensibly analyzed using indirect normalization.
Abstract:
A stochastic parameterization scheme for deep convection is described, suitable for use in both climate and NWP models. Theoretical arguments and the results of cloud-resolving models are discussed in order to motivate the form of the scheme. In the deterministic limit, it tends to a spectrum of entraining/detraining plumes and is similar to other current parameterizations. The stochastic variability describes the local fluctuations about a large-scale equilibrium state. Plumes are drawn at random from a probability distribution function (pdf) that defines the chance of finding a plume of given cloud-base mass flux within each model grid box. The normalization of the pdf is given by the ensemble-mean mass flux, and this is computed with a CAPE closure method. The characteristics of each plume produced are determined using an adaptation of the plume model from the Kain-Fritsch parameterization. Initial tests in the single column version of the Unified Model verify that the scheme is effective in producing the desired distributions of convective variability without adversely affecting the mean state.
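The stochastic sampling step described in this abstract can be sketched roughly as follows; the exponential pdf shape, the Poisson plume count, and all numerical values are illustrative assumptions, not the scheme's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_plumes(mean_total_flux, mean_plume_flux, grid_area):
    """Draw a random plume ensemble whose expected total cloud-base mass
    flux matches the large-scale value supplied by the CAPE closure."""
    # Expected number of plumes in the grid box
    expected_n = mean_total_flux * grid_area / mean_plume_flux
    n = rng.poisson(expected_n)
    # Individual plume mass fluxes drawn from an exponential pdf; its
    # normalization is fixed by the ensemble-mean mass flux
    return rng.exponential(mean_plume_flux, size=n)

# One realization for a 100 km grid box (all numbers hypothetical)
fluxes = draw_plumes(mean_total_flux=0.02,      # kg m^-2 s^-1 (closure value)
                     mean_plume_flux=1.0e7,     # kg s^-1 per plume
                     grid_area=(100e3) ** 2)    # m^2
```

Averaged over many grid boxes, the total mass flux per unit area converges to the closure value, while individual realizations fluctuate about it, which is the point of the scheme.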
Abstract:
The expression of two metallothionein genes (Mt-I and Mt-II) in the liver, kidney, and gonad of bank voles collected at four metal-contaminated sites (Cd, Zn, Pb, and Fe) was measured using the quantitative real-time PCR method (QPCR). Relative Mt gene expression was calculated by applying a normalization factor (NF) using the expression of two housekeeping genes, ribosomal 18S and beta-actin. Relative Mt expression in tissues of animals from contaminated sites was up to 54.8-fold higher than those from the reference site for Mt-I and up to 91.6-fold higher for Mt-II. Mt-II gene expression in the livers of bank voles from contaminated sites was higher than Mt-I gene expression. Inversely, Mt-II expression in the kidneys of voles was lower than Mt-I expression. Positive correlations between cadmium levels in the tissues and Mt-I were obtained in all studied tissues. Zinc, which undergoes homeostatic regulation, correlated positively with both Mt-I and Mt-II gene expression only in the kidney. Results showed that animals living in chronically contaminated environments intensively activate detoxifying mechanisms such as metallothionein expression. This is the first time that QPCR techniques to measure MT gene expression have been applied to assess the impact of environmental metal pollution on field-collected bank voles.
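The housekeeping-gene normalization described above can be sketched as follows, assuming the common approach in which the normalization factor (NF) is the geometric mean of the housekeeping-gene expression levels; the Ct values and the amplification efficiency below are hypothetical, purely for illustration:

```python
from math import prod

def relative_expression(ct_target, ct_housekeeping, efficiency=2.0):
    """Relative expression of a target gene, normalized by a factor (NF)
    computed as the geometric mean of housekeeping-gene levels."""
    # Raw (unnormalized) levels from cycle thresholds: lower Ct means
    # more transcript, so level ~ efficiency ** (-Ct)
    target_level = efficiency ** (-ct_target)
    hk_levels = [efficiency ** (-ct) for ct in ct_housekeeping]
    # Normalization factor: geometric mean of the housekeeping levels
    nf = prod(hk_levels) ** (1.0 / len(hk_levels))
    return target_level / nf

# Hypothetical Ct values: Mt-I as target, 18S and beta-actin as housekeeping
sample = relative_expression(ct_target=22.0, ct_housekeeping=[15.0, 24.0])
reference = relative_expression(ct_target=27.0, ct_housekeeping=[15.2, 23.8])
fold_change = sample / reference   # fold induction relative to reference site
```

Dividing the sample's normalized expression by the reference site's gives the fold change reported in the abstract.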
Abstract:
The ability of human postprandial triacylglycerol-rich lipoproteins (TRLs), isolated after meals enriched in saturated fatty acids (SFAs), n-6 PUFAs, and MUFAs, to inhibit the uptake of I-125-labeled LDL by the LDL receptor was investigated in HepG2 cells. Addition of TRLs resulted in a dose-dependent inhibition of heparin-releasable binding, cell-associated radioactivity, and degradation products of I-125-labeled LDL (P < 0.001). SFA-rich Svedberg flotation rate (Sf) 60-400 particles resulted in significantly greater inhibition of cell-associated radioactivity than PUFA-rich particles (P = 0.016) and of total uptake of I-125-labeled LDL compared with PUFA- and MUFA-rich particles (P = 0.02). Normalization of the apolipoprotein (apo)E but not apoC-III content of the TRLs removed the effect of meal fatty acid composition, and addition of an anti-apoE antibody reversed the inhibitory effect of TRLs on the total uptake of I-125-labeled LDL. Real time RT-PCR showed that the SFA-rich Sf 60-400 particles increased the expression of genes involved in hepatic lipid synthesis (P < 0.05) and decreased the expression of the LDL receptor-related protein 1 compared with MUFAs (P = 0.008). In conclusion, these findings suggest an alternative or additional mechanism whereby acute fat ingestion can influence LDL clearance via competitive apoE-dependent effects of TRL on the LDL receptor.-Jackson, K. G., V. Maitin, D. S. Leake, P. Yaqoob, and C. M. Williams. Saturated fat-induced changes in Sf 60-400 particle composition reduces uptake of LDL by HepG2 cells.
Abstract:
It is well known that gut bacteria contribute significantly to the host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interaction between the host and its gut microbiota is becoming an important challenge of modern biology.1-4 Colonization (also referred to as the normalization process) designates the establishment of micro-organisms in a formerly germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on the host metabolism. A common procedure to control the colonization process is to use the gavage method with a single or a mixture of micro-organisms. This method results in a very quick colonization and presents the disadvantage of being extremely stressful5. It is therefore useful to minimize the stress and to obtain a slower colonization process in order to gradually observe the impact of bacterial establishment on the host metabolism. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected by the urinary excretion of microbial co-metabolites by 1H NMR-based metabolic profiling.
This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem usually assessed by monitoring fecal bacteria by DGGE (denaturing gradient gel electrophoresis).6 The colonization takes place in a conventional open environment and is initiated by dirty litter soiled by conventional animals, which will serve as controls. Since rodents are coprophagous animals, this ensures a homogeneous colonization as previously described.7 Hepatic metabolic profiling is measured directly from an intact liver biopsy using 1H High Resolution Magic Angle Spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, the major metabolites such as triglycerides, glucose and glycogen in order to further estimate the complex interaction between the colonization process and the hepatic metabolism7-10. This method can also be applied to any tissue biopsy11,12.
Abstract:
The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
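A minimal sketch of the histogram-modification idea, assuming the standard CDF-based construction (this is not the paper's optimal or smoothed regrading, just the basic single-valued, monotone mapping it builds on):

```python
import numpy as np

def match_histogram(data, reference, levels=256):
    """Monotone regrading of `data` so that its histogram approximates
    that of `reference` (discrete histogram specification)."""
    # Cumulative (normalized) histograms of data and reference
    d_hist, _ = np.histogram(data, bins=levels, range=(0, levels))
    r_hist, _ = np.histogram(reference, bins=levels, range=(0, levels))
    d_cdf = np.cumsum(d_hist) / d_hist.sum()
    r_cdf = np.cumsum(r_hist) / r_hist.sum()
    # For each grey level, pick the smallest reference level whose
    # cumulative count reaches the data's -- a monotone transformation
    mapping = np.searchsorted(r_cdf, d_cdf)
    mapping = np.minimum(mapping, levels - 1)
    return mapping[np.clip(data, 0, levels - 1).astype(int)]
```

Because both CDFs are non-decreasing, the resulting grey-level mapping is single-valued and monotone, which is the class of transformations over which the paper's regrading is shown to be optimal.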
Abstract:
Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
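In its simplest form, the "equidistributing" regrading is classical histogram equalization; a rough NumPy sketch, ignoring the continuously variable smoothing parameter the abstract mentions:

```python
import numpy as np

def equalize(data, levels=256):
    """Regrading that 'equidistributes' the histogram: each grey level is
    mapped to its scaled cumulative frequency, flattening the histogram."""
    hist, _ = np.histogram(data, bins=levels, range=(0, levels))
    cdf = np.cumsum(hist) / hist.sum()
    mapping = np.round(cdf * (levels - 1)).astype(int)
    return mapping[np.clip(data, 0, levels - 1).astype(int)]
```

The mapping stretches heavily populated grey-level ranges across the full output range, which is why the technique is useful for scene enhancement.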
Abstract:
The transcriptome of an organism is its set of gene transcripts (mRNAs) at a defined spatial and temporal locus. Because gene expression is affected markedly by environmental and developmental perturbations, it is widely assumed that transcriptome divergence among taxa represents adaptive phenotypic selection. This assumption has been challenged by neutral theories which propose that stochastic processes drive transcriptome evolution. To test for evidence of neutral transcriptome evolution in plants, we quantified 18 494 gene transcripts in nonsenescent leaves of 14 taxa of Brassicaceae using robust cross-species transcriptomics which includes a two-step physical and in silico-based normalization procedure based on DNA similarity among taxa. Transcriptome divergence correlates positively with evolutionary distance between taxa and with variation in gene expression among samples. Results are similar for pseudogenes and chloroplast genes evolving at different rates. Remarkably, variation in transcript abundance among root-cell samples correlates positively with transcriptome divergence among root tissues and among taxa. Because neutral processes affect transcriptome evolution in plants, many differences in gene expression among or within taxa may be nonfunctional, reflecting ancestral plasticity and founder effects. Appropriate null models are required when comparing transcriptomes in space and time.
Abstract:
Recent studies showed that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem with the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves the accuracy of binary classification.
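A filter-plus-wrapper pipeline of the kind described can be sketched as below; the paper's Random Forest filter and SVM wrapper are replaced here with much simpler stand-ins (correlation ranking and a nearest-centroid classifier), purely to illustrate the two-stage structure on synthetic data:

```python
import numpy as np

def filter_rank(X, y, k):
    """Filter step: rank features by |correlation| with the binary label
    (a stand-in for the Random Forest importance filter in the paper)."""
    Xc = (X - X.mean(0)) / (X.std(0) + 1e-12)
    yc = (y - y.mean()) / (y.std() + 1e-12)
    scores = np.abs(Xc.T @ yc) / len(y)
    return np.argsort(scores)[::-1][:k]

def wrapper_select(X, y, candidates, classify):
    """Wrapper step: greedy forward selection, keeping a candidate
    feature only if it improves the classifier's accuracy."""
    chosen, best = [], 0.0
    for f in candidates:
        acc = classify(X[:, chosen + [f]], y)
        if acc > best:
            chosen.append(f)
            best = acc
    return chosen, best

def centroid_accuracy(X, y):
    """Nearest-centroid training accuracy (illustrative scorer only;
    a real pipeline would use cross-validated SVM accuracy)."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)
    return (pred == y).mean()
```

The filter cheaply prunes the feature pool, after which the wrapper evaluates the surviving subsets against the classifier itself, which is the comparison the abstract refers to.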
Abstract:
Parkinson's is a neurodegenerative disease in which tremor is the main symptom. This paper investigates the use of different classification methods to identify tremors experienced by Parkinsonian patients. Some previous research has focussed tremor analysis on external body signals (e.g., electromyography, accelerometer signals, etc.). Our advantage is that we have access to sub-cortical data, which facilitates the applicability of the obtained results in real medical devices, since we are dealing with brain signals directly. Local field potentials (LFP) were recorded in the subthalamic nucleus of 7 Parkinsonian patients through the implanted electrodes of a deep brain stimulation (DBS) device prior to its internalization. Measured LFP signals were preprocessed by means of splitting, down-sampling, filtering, normalization and rectification. Then, feature extraction was conducted through a multi-level decomposition via a wavelet transform. Finally, artificial intelligence techniques were applied to feature selection, clustering of tremor types, and tremor detection. The key contribution of this paper is to present initial results which indicate, to a high degree of certainty, that there appear to be two distinct subgroups of patients within group 1 of patients according to the Consensus Statement of the Movement Disorder Society on Tremor. Such results may well lead to different treatments for the patients involved, depending on how their tremor has been classified. Moreover, we propose a new approach for demand-driven stimulation, in which tremor detection is also based on the subtype of tremor the patient has. Applying this knowledge to the tremor detection problem, it can be concluded that the results improve when patient clustering is applied prior to detection.
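The preprocessing and wavelet feature-extraction chain can be sketched roughly as follows; the de-trending step, decimation factor, Haar wavelet choice and band-energy features are illustrative simplifications, not the authors' actual pipeline:

```python
import numpy as np

def preprocess(lfp, decimate=4):
    """Illustrative preprocessing chain: down-sampling, mean removal
    (a crude stand-in for filtering), z-score normalization, rectification."""
    x = lfp[::decimate]            # down-sampling
    x = x - x.mean()               # de-trend (stand-in for a band-pass filter)
    x = x / (x.std() + 1e-12)      # normalization
    return np.abs(x)               # rectification

def haar_decompose(x, levels=3):
    """Multi-level Haar wavelet decomposition: returns the detail
    coefficients per level plus the final approximation."""
    details = []
    for _ in range(levels):
        if len(x) % 2:                                # pad to even length
            x = np.append(x, x[-1])
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        details.append(detail)
        x = approx
    return details, x

def band_energy_features(details, approx):
    """Feature vector: mean energy in each wavelet band."""
    return np.array([np.mean(d ** 2) for d in details] + [np.mean(approx ** 2)])
```

Each decomposition level halves the sampling rate, so the per-band energies act as a coarse frequency profile of the rectified LFP, which can then feed the clustering and detection stages.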
Abstract:
Background: Serotonin is under-researched in attention deficit hyperactivity disorder (ADHD), despite accumulating evidence for its involvement in impulsiveness and the disorder. Serotonin further modulates temporal discounting (TD), which is typically abnormal in ADHD relative to healthy subjects, underpinned by reduced fronto-striato-limbic activation. This study tested whether a single acute dose of the selective serotonin reuptake inhibitor (SSRI) fluoxetine up-regulates and normalizes reduced fronto-striato-limbic neurofunctional activation in ADHD during TD.
Method: Twelve boys with ADHD were scanned twice in a placebo-controlled randomized design under either fluoxetine (between 8 and 15 mg, titrated to weight) or placebo while performing an individually adjusted functional magnetic resonance imaging TD task. Twenty healthy controls were scanned once. Brain activation was compared in patients under either drug condition and compared to controls to test for normalization effects.
Results: Repeated-measures whole-brain analysis in patients revealed significant up-regulation with fluoxetine in a large cluster comprising right inferior frontal cortex, insula, premotor cortex and basal ganglia, which further correlated trend-wise with TD performance, which was impaired relative to controls under placebo, but normalized under fluoxetine. Fluoxetine further down-regulated default mode areas of posterior cingulate and precuneus. Comparisons between controls and patients under either drug condition revealed normalization with fluoxetine in right premotor-insular-parietal activation, which was reduced in patients under placebo.
Conclusions: The findings show that a serotonin agonist up-regulates activation in typical ADHD dysfunctional areas in right inferior frontal cortex, insula and striatum as well as down-regulating default mode network regions in the context of impulsivity and TD.