970 results for quantitative trait loci (QTLs)


Relevance:

20.00%

Publisher:

Abstract:

Objective: Limited information is available on the quantitative relationship between family history and the corresponding underlying traits. We analyzed these associations for blood pressure, fasting blood glucose, and cholesterol levels. Methods: Data were obtained from 6,102 Caucasian participants (2,903 men and 3,199 women) aged 35-75 years in a population-based cross-sectional survey in Switzerland. Cardiovascular disease risk factors were measured, and the corresponding family history was self-reported using a structured questionnaire. Results: The prevalence of a positive family history (in first-degree relatives) was 39.6% for hypertension, 22.3% for diabetes, and 29.0% for hypercholesterolemia. Family history was not known for at least one family member in 41.8% of participants for hypertension, 14.4% for diabetes, and 50.2% for hypercholesterolemia. A positive family history was strongly associated with higher levels of the corresponding trait, but not with the other traits. Participants who reported not knowing their family history of hypertension had higher systolic blood pressure than participants with a negative family history. Sibling histories had higher positive predictive values than parental histories. Discrimination, calibration, and reclassification were best for family history of hypertension. Conclusions: Family history of hypertension, diabetes, and hypercholesterolemia was strongly associated with the corresponding dichotomized and continuous phenotypes.

Relevance:

20.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
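To make the gradual-deformation proposal concrete, the sketch below (my own illustration, not code from the thesis; the names and the toy inverse problem are assumptions) perturbs a Gaussian model realization by mixing it with an independent realization, m' = m cosθ + z sinθ, which preserves the prior statistics so that only the likelihood ratio enters the Metropolis acceptance test; the deformation angle θ plays the role of the perturbation strength mentioned in the text.

```python
import numpy as np

def gradual_deformation(m_current, m_independent, theta):
    # Mixing two realizations of the same Gaussian prior with cos/sin weights
    # yields another realization of that prior; theta sets the perturbation strength.
    return np.cos(theta) * m_current + np.sin(theta) * m_independent

def mcmc_step(m_current, log_likelihood, draw_prior, theta, rng):
    # One Metropolis step with a prior-preserving gradual-deformation proposal
    # (equivalent to a preconditioned Crank-Nicolson move for Gaussian priors),
    # so the acceptance ratio reduces to the likelihood ratio.
    m_proposed = gradual_deformation(m_current, draw_prior(rng), theta)
    if np.log(rng.uniform()) < log_likelihood(m_proposed) - log_likelihood(m_current):
        return m_proposed, True
    return m_current, False

# Toy usage: a 100-cell Gaussian "hydraulic parameter" field observed sparsely.
rng = np.random.default_rng(0)
draw_prior = lambda r: r.standard_normal(100)      # i.i.d. prior, only for brevity
m_true = draw_prior(rng)
data = m_true[::10] + 0.1 * rng.standard_normal(10)
log_lik = lambda m: -0.5 * np.sum(((data - m[::10]) / 0.1) ** 2)

m = draw_prior(rng)
for _ in range(2000):
    m, accepted = mcmc_step(m, log_lik, draw_prior, theta=0.2, rng=rng)
```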

Relevance:

20.00%

Publisher:

Abstract:

The tourism consumer’s purchase decision process is, to a great extent, conditioned by the image the tourist has of the different destinations that make up his or her choice set. In a highly competitive international tourist market, those responsible for destinations’ promotion and development policies seek differentiation strategies so that they may position the destinations in the most suitable market segments for their product in order to improve their attractiveness to visitors and increase or consolidate the economic benefits that tourism activity generates in their territory. To this end, the main objective we set ourselves in this paper is the empirical analysis of the factors that determine the image formation of Tarragona city as a cultural heritage destination. Without a doubt, UNESCO’s declaration of Tarragona’s artistic and monumental legacies as World Heritage site in the year 2000 meant important international recognition of the quality of the cultural and patrimonial elements offered by the city to the visitors who choose it as a tourist destination. It also represents a strategic opportunity to boost the city’s promotion of tourism and its consolidation as a unique destination given its cultural and patrimonial characteristics. Our work is based on the use of structured and unstructured techniques to identify the factors that determine Tarragona’s tourist destination image and that have a decisive influence on visitors’ process of choice of destination. In addition to being able to ascertain Tarragona’s global tourist image, we consider that the heterogeneity of its visitors requires a more detailed study that enables us to segment visitor typology. We consider that the information provided by these results may prove of great interest to those responsible for local tourism policy, both when designing products and when promoting the destination.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The recurrent ~600 kb 16p11.2 BP4-BP5 deletion is among the most frequent known genetic aetiologies of autism spectrum disorder (ASD) and related neurodevelopmental disorders. OBJECTIVE: To define the medical, neuropsychological, and behavioural phenotypes in carriers of this deletion. METHODS: We collected clinical data on 285 deletion carriers and performed detailed evaluations on 72 carriers and 68 intrafamilial non-carrier controls. RESULTS: When compared to intrafamilial controls, full scale intelligence quotient (FSIQ) is two standard deviations lower in carriers, and there is no difference between carriers referred for neurodevelopmental disorders and carriers identified through cascade family testing. Verbal IQ (mean 74) is lower than non-verbal IQ (mean 83) and a majority of carriers require speech therapy. Over 80% of individuals exhibit psychiatric disorders, including ASD, which is present in 15% of the paediatric carriers. Increase in head circumference (HC) during infancy is similar to the HC and brain growth patterns observed in idiopathic ASD. Obesity, a major comorbidity present in 50% of the carriers by the age of 7 years, does not correlate with FSIQ or any behavioural trait. Seizures are present in 24% of carriers and occur independently of other symptoms. Malformations are infrequently found, confirming only a few of the previously reported associations. CONCLUSIONS: The 16p11.2 deletion has quantitative and independent effects on FSIQ, behaviour, and body mass index, possibly through direct influences on neural circuitry. Although non-specific, these features are clinically significant and reproducible. Lastly, this study demonstrates the necessity of studying large patient cohorts ascertained through multiple methods to characterise the clinical consequences of rare variants involved in common diseases.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with similar standardised risk ratios to dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. METHODS: 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. RESULTS: QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC) of QUS measurements at the calcaneus remained highly significant even after adjustment for other confounding variables (0.70 for SI and 0.72 for QUI, compared with 0.67 for DXA at the lumbar spine). CONCLUSIONS: QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
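For orientation, the snippet below sketches the kind of age-adjusted logistic regression and AUC computation described in the methods; the generated arrays are placeholders, not BOS data, and the variable names are mine.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

# Placeholder cohort: one row per woman (illustrative values only).
rng = np.random.default_rng(1)
n = 400
age = rng.uniform(60, 80, n)
stiffness = rng.normal(85, 15, n)                 # calcaneal QUS stiffness index
true_logit = -2.5 + 0.04 * (age - 70) - 0.05 * (stiffness - 85)
fracture = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_logit)))

# Age-adjusted logistic regression on the standardized QUS measurement.
si_std = (stiffness - stiffness.mean()) / stiffness.std()
X = sm.add_constant(np.column_stack([age, si_std]))
fit = sm.Logit(fracture, X).fit(disp=0)

# Odds ratio per SD decrease in stiffness (analogous to the age-adjusted RRs above)
# and the discriminative value (AUC) of the fitted model.
or_per_sd_decrease = np.exp(-fit.params[2])
auc = roc_auc_score(fracture, fit.predict(X))
print(round(or_per_sd_decrease, 2), round(auc, 2))
```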

Relevance:

20.00%

Publisher:

Abstract:

Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
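As a back-of-the-envelope illustration (mine, not the article's), the estimated starting concentration follows N0 = Nq / E^Cq, where Nq is the fluorescence at the quantification threshold, E the per-cycle amplification efficiency and Cq the quantification cycle; because E is raised to the power Cq, a small baseline-induced error in E translates into a several-fold error in N0.

```python
# N0 = Nq / E**Cq : small efficiency errors are amplified exponentially.
def starting_concentration(nq, efficiency, cq):
    return nq / efficiency ** cq

nq, cq = 0.2, 25.0
for e in (2.00, 1.95, 1.90):
    print(f"E = {e:.2f}  ->  N0 = {starting_concentration(nq, e, cq):.3e}")
# Going from E = 2.00 to E = 1.90 (a 5% change) shifts N0 by about
# (2.00 / 1.90) ** 25, i.e. roughly 3.6-fold, which is why using the mean
# PCR efficiency per amplicon reduces both bias and variability.
```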

Relevance:

20.00%

Publisher:

Abstract:

In the last decade microsatellites have become one of the most useful genetic markers used in a large number of organisms due to their abundance and high level of polymorphism. Microsatellites have been used for individual identification, paternity tests, forensic studies and population genetics. Data on microsatellite abundance come mainly from microsatellite-enriched libraries and DNA sequence databases. We have conducted a search in GenBank of more than 16,000 Schistosoma mansoni ESTs and 42,000 BAC sequences. In addition, we obtained 300 sequences from CA and AT microsatellite-enriched genomic libraries. The sequences were searched for simple repeats using the RepeatMasker software. Of 16,022 ESTs, we detected 481 (3%) sequences that contained 622 microsatellites (434 perfect, 164 imperfect and 24 compounds). Of the 481 ESTs, 194 were grouped in 63 clusters containing 2 to 15 ESTs per cluster. Polymorphisms were observed in 16 clusters. The 287 remaining ESTs were orphan sequences. Of the 42,017 BAC end sequences, 1,598 (3.8%) contained microsatellites (2,335 perfect, 287 imperfect and 79 compounds). Of the 1,598 BAC end sequences, 80 were grouped into 17 clusters containing 3 to 17 BAC end sequences per cluster. Microsatellites were present in 67 out of 300 sequences from the microsatellite-enriched libraries (55 perfect, 38 imperfect and 15 compounds). Of all the observed loci, 55 were selected for having the longest perfect repeats and flanking regions that allowed the design of primers for PCR amplification. Additionally, we describe two new polymorphic microsatellite loci.
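The repeat search itself was done with RepeatMasker; purely to illustrate what scanning sequences for perfect simple repeats involves, here is a minimal regex-based sketch (thresholds, names, and the example sequence are mine, not from the study).

```python
import re

def find_perfect_microsatellites(seq, min_copies=6):
    # Report perfect di- and trinucleotide tandem repeats as (position, motif, copies).
    # Imperfect and compound repeats, also counted in the text, need a more
    # tolerant tool such as RepeatMasker.
    seq = seq.upper()
    for unit in (2, 3):
        pattern = re.compile(rf"([ACGT]{{{unit}}})\1{{{min_copies - 1},}}")
        for m in pattern.finditer(seq):
            motif = m.group(1)
            if len(set(motif)) > 1:                # skip homopolymer runs
                yield m.start(), motif, len(m.group(0)) // unit

# Example on a made-up EST fragment containing a (CA)8 repeat:
print(list(find_perfect_microsatellites("GGTTCACACACACACACACATTGA")))
```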

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: The optimal coronary MR angiography sequence has yet to be determined. We sought to quantitatively and qualitatively compare four coronary MR angiography sequences. SUBJECTS AND METHODS: Free-breathing coronary MR angiography was performed in 12 patients using four imaging sequences (turbo field-echo, fast spin-echo, balanced fast field-echo, and spiral turbo field-echo). Quantitative comparisons, including signal-to-noise ratio, contrast-to-noise ratio, vessel diameter, and vessel sharpness, were performed using a semiautomated analysis tool. Accuracy for detection of hemodynamically significant disease (> 50%) was assessed in comparison with radiographic coronary angiography. RESULTS: Signal-to-noise and contrast-to-noise ratios were markedly increased using the spiral (25.7 ± 5.7 and 15.2 ± 3.9) and balanced fast field-echo (23.5 ± 11.7 and 14.4 ± 8.1) sequences compared with the turbo field-echo (12.5 ± 2.7 and 8.3 ± 2.6) sequence (p < 0.05). Vessel diameter was smaller with the spiral sequence (2.6 ± 0.5 mm) than with the other techniques (turbo field-echo, 3.0 ± 0.5 mm, p = 0.6; balanced fast field-echo, 3.1 ± 0.5 mm, p < 0.01; fast spin-echo, 3.1 ± 0.5 mm, p < 0.01). Vessel sharpness was highest with the balanced fast field-echo sequence (61.6% ± 8.5% compared with turbo field-echo, 44.0% ± 6.6%; spiral, 44.7% ± 6.5%; fast spin-echo, 18.4% ± 6.7%; p < 0.001). The overall accuracies of the sequences were similar (range, 74% for turbo field-echo to 79% for spiral). Scanning time was longest for the fast spin-echo sequence (10.5 ± 0.6 min) and shortest for the spiral acquisitions (5.2 ± 0.3 min). CONCLUSION: Advantages in signal-to-noise and contrast-to-noise ratios, vessel sharpness, and the qualitative results appear to favor the spiral and balanced fast field-echo coronary MR angiography sequences, although subjective accuracy for the detection of coronary artery disease was similar across sequences.
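For readers unfamiliar with these image-quality metrics, the sketch below shows one common way to compute signal-to-noise and contrast-to-noise ratios from regions of interest; the ROI values are placeholders, and the semiautomated vessel-sharpness measure used in the study is not reproduced here.

```python
import numpy as np

def snr(vessel_roi, noise_roi):
    # Mean signal in the coronary lumen over the standard deviation of a
    # background (noise-only) region.
    return np.mean(vessel_roi) / np.std(noise_roi)

def cnr(vessel_roi, myocardium_roi, noise_roi):
    # Blood-to-myocardium contrast normalized by the background noise.
    return (np.mean(vessel_roi) - np.mean(myocardium_roi)) / np.std(noise_roi)

# Placeholder ROI samples from a magnitude image (illustrative numbers only).
rng = np.random.default_rng(2)
vessel = rng.normal(520, 30, 200)
myocardium = rng.normal(310, 30, 200)
background = rng.normal(0, 20, 400)
print(round(snr(vessel, background), 1), round(cnr(vessel, myocardium, background), 1))
```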

Relevance:

20.00%

Publisher:

Abstract:

Calcium is vital to the normal functioning of multiple organ systems and its serum concentration is tightly regulated. Apart from CASR, the genes associated with serum calcium are largely unknown. We conducted a genome-wide association meta-analysis of 39,400 individuals from 17 population-based cohorts and investigated the 14 most strongly associated loci in up to 21,679 additional individuals. Seven loci (six new regions) associated with serum calcium were identified and replicated. Rs1570669 near CYP24A1 (P = 9.1E-12), rs10491003 upstream of GATA3 (P = 4.8E-09) and rs7481584 in CARS (P = 1.2E-10) implicate regions involved in Mendelian calcemic disorders. Rs1550532 in DGKD (P = 8.2E-11), also associated with bone density, and rs7336933 near DGKH/KIAA0564 (P = 9.1E-10) are near genes that encode distinct isoforms of diacylglycerol kinase. Rs780094 is in GCKR. We characterized the expression of these genes in gut, kidney, and bone, and demonstrated modulation of gene expression in bone in response to dietary calcium in mice. Our results shed new light on the genetics of calcium homeostasis.

Relevance:

20.00%

Publisher:

Abstract:

The Genetic Investigation of Anthropometric Traits (GIANT) consortium identified 14 loci in European Ancestry (EA) individuals associated with waist-to-hip ratio (WHR) adjusted for body mass index. These loci are wide, and narrowing the signals remains necessary. Twelve of the 14 loci identified in GIANT EA samples retained strong associations with WHR in our joint EA/African Ancestry (AA) analysis (log-Bayes factor >6.1). Trans-ethnic analyses at five loci (TBX15-WARS2, LYPLAL1, ADAMTS9, LY86 and ITPR2-SSPN) substantially narrowed the signals to smaller sets of variants, some of which lie in regions with evidence of regulatory activity. By leveraging varying linkage disequilibrium structures across populations, single-nucleotide polymorphisms (SNPs) with strong signals and narrower credible sets from the trans-ethnic meta-analysis of central obesity provide more precise localizations of potential functional variants and suggest a possible regulatory role. Meta-analysis results for WHR were obtained from 77,167 EA participants from GIANT and 23,564 AA participants from the African Ancestry Anthropometry Genetics Consortium. For fine-mapping, we interrogated SNPs within ±250 kb flanking regions of the 14 previously reported index SNPs from loci discovered in EA populations by performing trans-ethnic meta-analysis of the EA and AA meta-analysis results. We applied a Bayesian approach that leverages allelic heterogeneity across populations to combine meta-analysis results and aids in fine-mapping shared variants at these locations. We annotated variants using information from the ENCODE Consortium and Roadmap Epigenomics Project to prioritize variants for possible functionality.
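As a pointer for readers new to Bayesian fine-mapping: under the simplifying assumptions of a single causal variant per locus and equal priors, per-SNP posterior probabilities and a 99% credible set follow directly from the (log) Bayes factors, as sketched below; the trans-ethnic analysis described above is more involved, and the numbers shown are illustrative, not study values.

```python
import numpy as np

def credible_set(log10_bf, coverage=0.99):
    # Posterior inclusion probabilities assuming one causal SNP and equal priors,
    # plus the smallest set of SNPs whose cumulative posterior reaches `coverage`.
    log10_bf = np.asarray(log10_bf, dtype=float)
    post = 10.0 ** (log10_bf - log10_bf.max())     # stabilized before normalizing
    post /= post.sum()
    order = np.argsort(post)[::-1]
    n_keep = int(np.searchsorted(np.cumsum(post[order]), coverage) + 1)
    return post, order[:n_keep]

# Illustrative log10 Bayes factors for SNPs in one region.
posteriors, kept = credible_set([6.3, 5.9, 2.1, 1.0, 0.2, -0.5])
print(kept, posteriors.round(3))
```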

Relevance:

20.00%

Publisher:

Abstract:

Neuropathic pain is defined as pain caused by a lesion of the somatosensory nervous system. It is characterised by exaggerated, spontaneous pain, or by pain triggered by normally non-painful (allodynia) or painful (hyperalgesia) stimuli. Although it affects 7% of the population, its biological mechanisms have not yet been elucidated. Studying changes in gene expression in the key tissues of the sensory pathways (notably the dorsal root ganglion and the dorsal horn of the spinal cord) at different time points after a peripheral nerve lesion would help identify new therapeutic targets. Such changes can be detected with high sensitivity by reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR). To guarantee reliable results, recent guidelines recommend validating the reference genes used for data normalisation ("Minimum information for publication of quantitative real-time PCR experiments", Bustin et al. 2009). After searching the literature for the reference genes most frequently used in our spared nerve injury (SNI) model of peripheral neuropathic pain and in nervous tissue in general, we established a list of promising candidates: actin beta (Actb), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), ribosomal proteins 18S (18S), L13a (RPL13a) and L29 (RPL29), hypoxanthine phosphoribosyltransferase 1 (HPRT1) and hydroxymethylbilane synthase (HMBS). We evaluated the expression stability of these genes in the dorsal root ganglion and in the dorsal horn at different time points after nerve injury (SNI) by calculating coefficients of variation and by using the geNorm algorithm, which compares expression levels across candidates and determines the most stable remaining pair of genes. It was also possible to rank the genes by stability and to identify the number of genes required for the most accurate normalisation. The genes most often cited as references in the SNI model were GAPDH, HMBS, Actb, HPRT1 and 18S. Only HPRT1 and 18S had previously been validated in RT-qPCR arrays. In our study, all genes tested in the dorsal root ganglion and in the dorsal horn met the stability criterion of an M-value below 1. However, with a coefficient of variation (CV) above 50% in the dorsal root ganglion, 18S cannot be retained. The most stable pair of genes was HPRT1 and Actb in the dorsal root ganglion, and RPL29 and RPL13a in the dorsal horn. Using two stable reference genes is sufficient for reliable normalisation. We therefore ranked and validated Actb, RPL29, RPL13a, HMBS, GAPDH, HPRT1 and 18S as reference genes usable in the dorsal horn for the SNI model in the rat. In the dorsal root ganglion, 18S did not meet our criteria. We also determined that combining two stable reference genes is sufficient for accurate normalisation. Expression changes of potential genes of interest under identical experimental conditions (SNI, tissue, and time points post SNI) can now be measured on the basis of reliable normalisation. This will make it possible not only to identify regulations potentially important in the genesis of neuropathic pain, but also to observe the different phenotypes evolving over time after nerve injury.
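To make the geNorm step concrete, the sketch below computes the gene-stability measure M of Vandesompele et al. (2002), i.e. the mean standard deviation of the pairwise log2 expression ratios of each candidate with all other candidates, and removes the least stable gene iteratively until the most stable pair remains; the expression matrix and the resulting ranking are illustrative, not our data.

```python
import numpy as np

def genorm_m(expression):
    # expression: (n_samples, n_genes) relative, non-normalized quantities.
    # M_j = mean over k != j of SD_samples( log2(expr_j / expr_k) ); lower = more stable.
    log_expr = np.log2(expression)
    n_genes = log_expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        m[j] = np.mean([np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                        for k in range(n_genes) if k != j])
    return m

def genorm_ranking(expression, genes):
    # Iteratively drop the gene with the highest M until the most stable pair remains.
    genes, removed = list(genes), []
    while len(genes) > 2:
        m = genorm_m(expression)
        worst = int(np.argmax(m))
        removed.append((genes.pop(worst), round(float(m[worst]), 3)))
        expression = np.delete(expression, worst, axis=1)
    return removed[::-1], genes

# Illustrative data: 8 samples x 7 candidate reference genes.
rng = np.random.default_rng(3)
expr = 2.0 ** rng.normal(0.0, 0.3, size=(8, 7))
candidates = ["Actb", "GAPDH", "18S", "RPL13a", "RPL29", "HPRT1", "HMBS"]
print(genorm_ranking(expr, candidates))
```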

Relevance:

20.00%

Publisher:

Abstract:

Barnardos supports children whose well-being is under threat by working with them, their families and communities, and by campaigning for the rights of children. Barnardos was established in Ireland in 1962 and is Ireland's leading independent children's charity.

Relevance:

20.00%

Publisher:

Abstract:

There is enormous interest in designing training methods for reducing cognitive decline in healthy older adults. Because it is impaired with aging, multitasking has often been targeted and has been shown to be malleable with appropriate training. Investigating the effects of cognitive training on functional brain activation might provide critical indications regarding the mechanisms that underlie those positive effects, as well as provide models for selecting appropriate training methods. The few studies that have looked at brain correlates of cognitive training indicate a variable pattern and location of brain changes - a result that might relate to differences in training formats. The goal of this study was to measure the neural substrates as a function of whether divided-attention training programs induced the use of alternative processes or relied on repeated practice. Forty-eight older adults were randomly allocated to one of three training programs. In the SINGLE REPEATED training, participants practiced an alphanumeric equation task and a visual detection task, each under focused attention. In the DIVIDED FIXED training, participants practiced combining verification and detection under divided attention, with equal attention allocated to both tasks. In the DIVIDED VARIABLE training, participants completed the tasks under divided attention, but were taught to vary the attentional priority allocated to each task. Brain activation was measured with fMRI pre- and post-training while participants completed each task individually and the two tasks combined. The three training programs resulted in markedly different brain changes. Practice on individual tasks in the SINGLE REPEATED training resulted in reduced brain activation, whereas DIVIDED VARIABLE training resulted in larger recruitment of the right superior and middle frontal gyrus, a region that has been implicated in multitasking. The type of training is a critical factor in determining the pattern of brain activation.