959 results for FUNCTIONAL PERFORMANCE
Abstract:
The paper industry is constantly looking for new ideas to improve paper products while competition and raw material prices are increasing. Many paper products are pigment coated. The coating layer is the top layer of paper, so by modifying the coating pigment the paper itself can be altered and value added to the final product. This thesis reports the synthesis of new plastic and hybrid pigments and their performance in paper and paperboard coating. Two types of plastic pigments were studied: core-shell latexes and solid beads of maleimide copolymers. Core-shell latexes with a partially crosslinked hydrophilic polymer core of poly(n-butyl acrylate-co-methacrylic acid) and a hard hydrophobic polystyrene shell were prepared to improve the optical properties of coated paper. In addition, the effect of different crosslinkers was analyzed, and the best overall performance was achieved with ethylene glycol dimethacrylate (EGDMA). Furthermore, the possibility of modifying the core-shell latex was investigated by introducing a new polymerizable optical brightening agent, 1-[(4-vinylphenoxy)methyl]-4-(2-phenylethylenyl)benzene, which gave promising results. The prepared core-shell latex pigments also performed smoothly in pilot coating and printing trials. The results demonstrated that by optimizing the polymer composition, the optical and surface properties of coated paper can be significantly enhanced. Optimal reaction conditions were established for the thermal imidization of poly(styrene-co-maleimide) (SMI) and poly(octadecene-co-maleimide) (OMI) from the respective maleic anhydride copolymer precursors and ammonia in a solvent-free process. The obtained aqueous dispersions of nanoparticle copolymers exhibited glass transition temperatures (Tg) between 140 and 170 °C and particle sizes from 50 to 230 nm. Furthermore, the maleimide copolymers were evaluated as additional pigments in paperboard coating. The maleimide copolymer nanoparticles were partly embedded into the porous coating structure, and therefore the full potential for enhancing the optical properties of the paperboard was not achieved by this method. The possibility of modifying the maleimide copolymers was also studied. Modifications were carried out via N-substitution by replacing part of the ammonia in the imidization reaction with amines such as triacetonediamine (TAD), aspartic acid (ASP), and fluorinated amines (2,2,2-trifluoroethylamine, TFEA, and 2,2,3,3,4,4,4-heptafluorobutylamine, HFBA). The obtained functional nanoparticles varied in size from 50 to 217 nm and in Tg from 150 to 180 °C. During the coating process the produced plastic pigments exhibited good runnability. No significant improvement in light stability was achieved with the TAD-modified copolymers, whereas nanoparticles modified with aspartic acid and those containing fluorinated groups showed the desired changes in the surface properties of the coated paperboard. Finally, preliminary studies with organic-inorganic hybrids are reported. The hybrids, prepared by an in situ polymerization reaction, consisted of 30 wt% poly(styrene-co-maleimide) (SMI) and a high level (70 wt%) of inorganic components, kaolin and/or alumina trihydrate. Scanning Electron Microscopy (SEM) images and characterization by Fourier Transform Infrared Spectroscopy (FTIR) and X-Ray Diffraction (XRD) revealed that the hybrids had a conventional composite structure in which the inorganic components were covered with precipitated SMI nanoparticles attached to the surface via hydrogen bonding.
In paper coating, the hybrids had a beneficial effect, increasing gloss levels.
Abstract:
Protein engineering aims to improve the properties of enzymes and affinity reagents through genetic changes. Typical engineered properties are affinity, specificity, stability, expression, and solubility. Because proteins are complex biomolecules, the effects of specific genetic changes are seldom predictable. Consequently, a popular strategy in protein engineering is to create a library of genetic variants of the target molecule and subject the population to a selection process that sorts the variants by the desired property. This technique, called directed evolution, is a central tool for tailoring protein-based products used in a wide range of applications, from laundry detergents to anti-cancer drugs. New methods are continuously needed to generate larger gene repertoires and compatible selection platforms in order to shorten the development timeline for new biochemicals. In the first study of this thesis, primer extension mutagenesis was revisited to establish higher-quality gene variant libraries in Escherichia coli cells. In the second study, recombination was explored as a method to expand the number of screenable enzyme variants. A selection platform was developed to improve antigen binding fragment (Fab) display on filamentous phages in the third article, and in the fourth study novel design concepts were tested with two differentially randomized recombinant antibody libraries. Finally, in the last study, the performance of the same antibody repertoire was compared in phage display selections as a genetic fusion to different phage capsid proteins and in different antibody formats, Fab vs. single-chain variable fragment (scFv), in order to identify the most suitable display platform for the library at hand. As a result of these studies, a novel gene library construction method, termed selective rolling circle amplification (sRCA), was developed. The method increases the mutagenesis frequency to nearly 100% in the final library and the number of transformants over 100-fold compared with traditional primer extension mutagenesis. In the second study, Cre/loxP recombination was found to be an appropriate tool for resolving the DNA concatemer resulting from error-prone RCA (epRCA) mutagenesis into monomeric circular DNA units for higher-efficiency transformation into E. coli. Library selections against antigens of various sizes in the fourth study demonstrated that diversity placed closer to the antigen binding site of antibodies supports the generation of antibodies against haptens and peptides, whereas diversity at more peripheral locations is better suited for targeting proteins. The comparison of display formats showed that the truncated capsid protein three (p3Δ) of filamentous phage was superior to full-length p3 and protein nine (p9) in yielding a high number of uniquely specific clones. Especially for digoxigenin, a difficult hapten target, the antibody repertoire displayed as scFv-p3Δ provided the clones with the highest binding affinity. This thesis on the construction, design, and selection of gene variant libraries contributes to the practical know-how of directed evolution and contains useful information to support scientists working in the field.
Abstract:
The main objective of the present study was to evaluate the diagnostic value (clinical applicability) of brain measures and cognitive function. Patients with Alzheimer's disease or multi-infarct dementia (N = 30) and normal subjects over the age of 50 (N = 40) underwent medical, neurological, and cognitive investigation. The cognitive tests applied were the Mini-Mental, word span, digit span, logical memory, spatial recognition span, Boston naming test, and praxis and calculation tests. The brain ratios calculated were the ventricle-brain, bifrontal, bicaudate, third ventricle, and suprasellar cistern measures. These data were obtained from brain computed tomography scans, and the cutoff values from receiver operating characteristic curves. We analyzed the diagnostic parameters provided by these ratios and compared them with those obtained by cognitive evaluation. The sensitivity and specificity of the cognitive tests were higher than those of the brain measures, although dementia patients presented higher ratios and poorer cognitive performance than normal individuals. Normal controls over the age of 70 presented higher measures than the younger groups, but similar cognitive performance. We found diffuse loss of central nervous system tissue, reflected in the distribution of cerebrospinal fluid, in dementia patients. The likelihood of case identification was higher with functional impairment than with changes in the structure of the central nervous system. Cognitive evaluation still seems to be the best method to screen individuals in the community, especially in developing countries, where the cost of brain imaging precludes its use for screening and the initial assessment of dementia.
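As a concrete illustration of how such cutoffs and diagnostic parameters are derived, here is a minimal, hypothetical sketch (with invented numbers, not the study data) that computes an ROC-based cutoff for a single brain ratio and reads off its sensitivity and specificity; the use of Youden's J to choose the cutoff is an assumption made for the example.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical data: 1 = dementia, 0 = control; score = e.g. a ventricle-brain ratio
y_true = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
score  = np.array([0.31, 0.28, 0.35, 0.22, 0.30, 0.18, 0.21, 0.19, 0.25, 0.17, 0.20, 0.23])

# ROC curve: false-positive rate, true-positive rate, and candidate cutoffs
fpr, tpr, thresholds = roc_curve(y_true, score)

# Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1
best = np.argmax(tpr - fpr)
cutoff = thresholds[best]

sensitivity = tpr[best]        # proportion of patients at or above the cutoff
specificity = 1 - fpr[best]    # proportion of controls below the cutoff
print(f"cutoff={cutoff:.2f}  sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")
```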
Abstract:
Although echocardiography has been used in rats, few studies have determined its efficacy for estimating myocardial infarct size. Our objective was to estimate myocardial infarct size and to evaluate anatomic and functional variables of the left ventricle. Myocardial infarction was produced in 43 female Wistar rats by ligature of the left coronary artery. Echocardiography was performed 5 weeks later to measure left ventricular diameter and transverse area (mean of 3 transverse planes), infarct size (percentage of the arc with infarct on the 3 transverse planes), systolic function by the fractional area change, and diastolic function by mitral inflow parameters. The histologic measurement of myocardial infarct size was similar to that obtained by the echocardiographic method. Myocardial infarct size ranged from 4.8 to 66.6% when determined by histology and from 5 to 69.8% when determined by echocardiography, with good correlation (r = 0.88; P < 0.05; Pearson correlation coefficient). Left ventricular diameter and mean diastolic transverse area correlated with myocardial infarct size by histology (r = 0.57 and r = 0.78; P < 0.0005). The fractional area change ranged from 28.5 ± 5.6% (large myocardial infarction) to 53.1 ± 1.5% (control) and correlated with myocardial infarct size by echocardiography (r = -0.87; P < 0.00001) and histology (r = -0.78; P < 0.00001). The E/A wave ratio of mitral inflow velocity for animals with a large myocardial infarction (5.6 ± 2.7) was significantly higher than for all other groups (control: 1.9 ± 0.1; small myocardial infarction: 1.9 ± 0.4; moderate myocardial infarction: 2.8 ± 2.3). There was good agreement between echocardiographic and histologic estimates of myocardial infarct size in rats.
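For illustration only, a minimal sketch of the kind of correlation and agreement computation reported above, using made-up paired estimates rather than the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired infarct-size estimates (% of left ventricle) for the same animals
histology = np.array([4.8, 12.0, 25.5, 33.2, 41.0, 52.3, 66.6])
echo      = np.array([5.0, 14.1, 22.9, 36.0, 44.5, 50.1, 69.8])

r, p = pearsonr(histology, echo)    # linear correlation between the two methods
bias = np.mean(echo - histology)    # mean difference, a simple agreement check
print(f"r = {r:.2f}, P = {p:.4f}, mean bias = {bias:.1f} percentage points")
```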
Abstract:
Intercellular adhesion molecule-1 (ICAM-1) is an important factor in the progression of inflammatory responses in vivo. To develop a new anti-inflammatory drug that blocks the biological activity of ICAM-1, we produced a monoclonal antibody (Ka = 4.19 × 10⁻⁸ M) against human ICAM-1. The anti-ICAM-1 single-chain variable antibody fragment (scFv) was expressed at a high level as inclusion bodies in Escherichia coli. We refolded the scFv (Ka = 2.35 × 10⁻⁷ M) by ion-exchange chromatography, dialysis, and dilution. The results showed that column chromatography refolding on high-performance Q Sepharose had remarkable advantages over the conventional dilution and dialysis methods; furthermore, it gave a higher anti-ICAM-1 scFv yield of about 60 mg/L. The purity of the final product was greater than 90%, as shown by denaturing gel electrophoresis. Enzyme-linked immunosorbent assay, cell culture, and animal experiments were used to assess the immunological properties and biological activities of the renatured scFv.
Abstract:
Software is a key component of many of the devices and products we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, and so on. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires a great deal of manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects such as performance or security apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how to go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or absent tool support. The second contribution of this thesis is therefore proper tool support for the proposed approach, integrated with leading industry tools: we offer stand-alone tools, tools integrated with other industry-leading tools, and complete tool chains where necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. To demonstrate the applicability of our approach, we apply our research to several systems, including industrial ones.
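To illustrate the core idea (generating test cases from a behavioral model while keeping a link back to requirements), here is a minimal, hypothetical sketch; the state machine, requirement IDs, and coverage criterion (all transitions) are invented for the example and do not represent the tool chain described in the thesis.

```python
from collections import deque

# A toy behavioral model: transitions labeled with an input event and the
# requirement they trace to (hypothetical IDs).
transitions = {
    ("Idle", "insertCard"):         ("CardInserted", "REQ-1"),
    ("CardInserted", "enterPin"):   ("Authenticated", "REQ-2"),
    ("Authenticated", "withdraw"):  ("Dispensing", "REQ-3"),
    ("Dispensing", "takeCash"):     ("Idle", "REQ-4"),
    ("CardInserted", "cancel"):     ("Idle", "REQ-5"),
}

def generate_tests(initial="Idle"):
    """Breadth-first traversal of the model; each previously uncovered transition
    yields one test case: the event sequence reaching it, tagged with its requirement."""
    tests, covered = [], set()
    queue, seen = deque([(initial, [])]), {initial}
    while queue:
        state, path = queue.popleft()
        for (src, event), (dst, req) in transitions.items():
            if src != state:
                continue
            if (src, event) not in covered:
                covered.add((src, event))
                tests.append({"steps": path + [event], "requirement": req})
            if dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [event]))
    return tests

for test in generate_tests():
    print(test["requirement"], "->", " / ".join(test["steps"]))
```

Each generated test carries its requirement ID, which is the essence of requirements traceability: a failing test points directly back to the part of the specification it exercises.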
Abstract:
The aim of this work was to evaluate spices and industrial ingredients for the development of functional foods with high phenolic content and antioxidant capacity. Basil, bay, chives, onion, oregano, parsley, rosemary, turmeric, and powdered industrial ingredients (β-carotene, green tea extract, lutein, lycopene, and olive extract) had their in vitro antioxidant capacity evaluated by means of the Folin-Ciocalteu reducing capacity and DPPH scavenging ability. Flavonoid identification and quantification were performed by High Performance Liquid Chromatography (HPLC). The results showed that the spices presented a large variation in flavonoid content and in vitro antioxidant capacity according to kind, brand, and batch. Oregano had the highest antioxidant capacity and parsley the highest flavonoid content. The industrial ingredient with the highest antioxidant capacity was green tea extract, which presented a high content of epigallocatechin gallate. Olive extract also showed high antioxidant activity and was a good source of chlorogenic acid. This study suggests that oregano, parsley, olive extract, and green tea extract have excellent potential for the development of functional foods rich in flavonoids as antioxidants, as long as the variability between batches and brands is controlled.
Abstract:
This work presents a synopsis of efficient power management strategies for achieving the most economical power and energy consumption in multicore systems, FPGAs, and NoC platforms. A practical approach was taken in order to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. The system comprises an arithmetic and logic unit, up and down counters, an adder, a state machine, and a multiplexer. The project had three aims: first, to develop the system used for the power management work; second, to perform area and power analyses of the system on several scalable technology platforms (UMC 90 nm nanotechnology at 1.2 V, UMC 90 nm nanotechnology at 1.32 V, and UMC 0.18 µm nanotechnology at 1.8 V) in order to examine the differences in area and power consumption across the platforms; and third, to explore strategies for reducing the system's power consumption and to propose an adaptive power management algorithm for this purpose. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board (essentially an NoC platform) and synthesized on the technology platforms listed above. Synthesis was successfully accomplished, the simulation results show that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in chapter 7 of this work. The thesis also extensively reviews strategies for managing power consumption drawn from quantitative research by many researchers and companies; it combines literature analysis with experimental lab work and condenses the basic concepts of power management strategy from quality technical papers.
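To make the DVFS idea concrete, here is a minimal, hypothetical sketch of an adaptive policy that scales the operating point with measured utilization and estimates dynamic power with the classic CMOS relation P ≈ a·C·V²·f; the operating points, thresholds, and capacitance value are invented for illustration and are not taken from the thesis.

```python
# Hypothetical voltage/frequency operating points (V, Hz), from lowest to highest.
OPERATING_POINTS = [(0.9, 200e6), (1.1, 400e6), (1.2, 600e6), (1.32, 800e6)]
C_EFF = 1.0e-9  # assumed effective switched capacitance in farads (illustrative)

def dynamic_power(v, f, activity=0.5):
    """Classic CMOS dynamic-power estimate: P = a * C * V^2 * f."""
    return activity * C_EFF * v * v * f

def choose_operating_point(utilization, current_idx):
    """Simple adaptive policy: step up when the core is busy, step down when idle."""
    if utilization > 0.85 and current_idx < len(OPERATING_POINTS) - 1:
        return current_idx + 1   # raise V/f to meet demand
    if utilization < 0.40 and current_idx > 0:
        return current_idx - 1   # lower V/f to save power
    return current_idx           # keep the current setting

# Example trace of measured utilization over successive control intervals.
idx = 0
for util in [0.2, 0.5, 0.9, 0.95, 0.7, 0.3, 0.1]:
    idx = choose_operating_point(util, idx)
    v, f = OPERATING_POINTS[idx]
    print(f"util={util:.2f} -> {v:.2f} V @ {f/1e6:.0f} MHz, "
          f"P~{dynamic_power(v, f)*1e3:.1f} mW")
```

Because dynamic power scales with V²·f, even a modest reduction in the operating point during idle intervals yields a disproportionately large power saving, which is the motivation behind adaptive policies of this kind.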
Abstract:
The ability to monitor and evaluate the consequences of ongoing behaviors and to coordinate behavioral adjustments seems to rely on networks including the anterior cingulate cortex (ACC) and on phasic changes in dopamine activity. Activity (and presumably functional maturation) of the ACC may be indirectly measured using the error-related negativity (ERN), an event-related potential (ERP) component hypothesized to reflect activity of the automatic response monitoring system. To date, no studies have examined the measurement reliability of the ERN as a trait-like measure of response monitoring, its development in mid- and late adolescence, or its relation to risk-taking and empathic ability, two traits linked to dopaminergic and ACC activity. Using a large sample of 15- and 18-year-old males, the present study examined the test-retest reliability of the ERN, age-related changes in the ERN and in the other error-monitoring components of the ERP (the Pe and CRN), and the relations of the error-related ERP components to the personality traits of risk propensity and empathy. The results indicated good test-retest reliability of the ERN, providing important validation of the ERN as a stable and possibly trait-like electrophysiological correlate of performance monitoring. Of the three components, only the ERN was of greater amplitude in the older adolescents, suggesting that its ACC network is functionally late to mature, owing to either structural or neurochemical changes with age. Finally, the ERN was smaller in those with high risk propensity and low empathy, while the other error-monitoring components were not, which suggests that poor ACC function may be associated with the desire to engage in risky behaviors and that the ERN may be influenced by the extent of individuals' concern with the outcome of events.
Abstract:
Functional electrical stimulation (FES) arm cycle ergometry is a relatively new exercise technique for individuals with impairments of the upper limbs. The purpose of this study was to determine the effects of 12 weeks of FES arm cycle ergometry on upper limb function and cardiovascular fitness in individuals with tetraplegia. Five subjects (4 M/1 F; mean age 43.8 ± 15.4 years) with a spinal cord injury of the cervical spine (C3-C7; ASIA B-D) participated in 12 weeks of three-times-per-week FES arm cycle ergometry training. Exercise performance measures (time to fatigue, distance to fatigue, work rate) were taken at baseline, at 6 weeks, and after 12 weeks of training. Cardiovascular measures (MAP, resting HR, average and peak HR during exercise, cardiovascular efficiency) and self-reported upper limb function (as determined by the CUE, sf-QIF, and SCI-SET questionnaires) were taken at baseline and after 12 weeks of training. Increases were found in time to fatigue (84.4%), distance to fatigue (111.7%), and work rate (51.3%); these changes were non-significant. There was a significant decrease in MAP (91.1 ± 13.9 vs. 87.7 ± 14.7 mmHg) following the 12 weeks of FES arm cycle ergometry. There was no significant change in resting HR or in average and peak HR during exercise. Cardiovascular efficiency showed a non-significant increase following the 12 weeks of FES training (142.9%). There were no significant changes in the measures of upper limb function and spasticity. Overall, FES arm cycle ergometry is an effective method of cardiovascular exercise for individuals with tetraplegia, as evidenced by a significant decrease in MAP; however, it is unclear whether 12 weeks of thrice-weekly FES arm cycle ergometry can effectively improve upper limb function in all individuals with a cervical SCI.
Abstract:
Activity of the medial frontal cortex (MFC) has been implicated in attention regulation and performance monitoring. The MFC is thought to generate several event-related potential (ERP) components, known as medial frontal negativities (MFNs), that are elicited when a behavioural response becomes difficult to control (e.g., following an error or when shifting from a frequently executed response). The functional significance of MFNs has traditionally been interpreted in the context of the paradigm used to elicit a specific response, such as errors. In a series of studies, we consider the functional similarity of multiple MFC brain responses by designing novel performance monitoring tasks and exploiting advanced methods for electroencephalography (EEG) signal processing and robust estimation statistics for hypothesis testing. In study 1, we designed a response cueing task and used Independent Component Analysis (ICA) to show that the latent factors describing an MFN to stimuli that cued the potential need to inhibit a response on upcoming trials also accounted for the medial frontal brain responses that occurred when individuals made a mistake or inhibited an incorrect response. It was also found that increases in theta power occurred for each of these task events, and that the effects were evident both at the group level and in single cases. In study 2, we replicated our method of classifying MFC activity to cues in our response task and showed again, using additional tasks, that error commission, response inhibition, and, to a lesser extent, the processing of performance feedback all elicited similar changes across MFNs and theta power. In the final study, we converted our response cueing paradigm into a saccade cueing task in order to examine the oscillatory dynamics of response preparation. We found that, compared to easy pro-saccades, successfully preparing a difficult anti-saccadic response was characterized by an increase in MFC theta and the suppression of posterior alpha power prior to executing the eye movement. These findings align with a large body of literature on performance monitoring and ERPs, and indicate that MFNs, along with their signature in theta power, reflect the general process of controlling attention and adapting behaviour without the need to induce error commission, the inhibition of responses, or the presentation of negative feedback.
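For readers unfamiliar with the signal-processing steps mentioned above, here is a minimal, hypothetical sketch of decomposing multichannel EEG-like data with ICA and estimating theta-band (4-8 Hz) power; the synthetic data, the FastICA implementation, and the Welch-based band-power estimate are illustrative assumptions, not the pipeline used in these studies.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

fs = 250                          # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)      # 10 s of data

# Synthetic sources: a 6 Hz "theta" burst plus broadband noise, mixed into 4 channels.
rng = np.random.default_rng(0)
theta_burst = np.sin(2 * np.pi * 6 * t) * ((t >= 4) & (t < 6))
noise = rng.normal(scale=0.5, size=(3, t.size))
sources = np.vstack([theta_burst, noise])
mixing = rng.normal(size=(4, 4))
eeg = mixing @ sources            # shape: (channels, samples)

# ICA expects (samples, features); unmix into independent components.
ica = FastICA(n_components=4, random_state=0)
components = ica.fit_transform(eeg.T).T   # shape: (components, samples)

# Theta-band power of each component via Welch's method.
for i, comp in enumerate(components):
    f, pxx = welch(comp, fs=fs, nperseg=fs * 2)
    band = (f >= 4) & (f <= 8)
    print(f"component {i}: mean theta-band power = {pxx[band].mean():.4f}")
```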
Abstract:
This thesis pursues a dual objective: first, to measure and situate the level of physical fitness, motor performance, and participation in physical activity of children with ADHD; second, to assess the impact of a structured physical activity program on the physical fitness, motor performance, certain target behaviours, and cognitive functions of these children. To meet these objectives, three experimental studies were completed and reported in articles submitted for publication. The first article evaluates physical fitness and motor performance in children with ADHD, with or without medication. The results show that the physical fitness of these children, here comprising variables related to body composition, muscular endurance, and flexibility, does not differ from that of a control group. Only body mass index was significantly lower in children with ADHD taking medication. No between-group difference was observed in aerobic capacity as measured on a treadmill test. However, when assessed with a shuttle-run test, the aerobic performance of all participants was significantly lower, underscoring the importance of the choice of measurement instrument. Finally, children with ADHD had significantly more gross motor problems than children in the control group, and these difficulties were particularly marked for locomotion. In the context of a structured and supervised physical activity program, the second article examines, on the one hand, the intensity of participation in the proposed exercises among children with ADHD and, on the other hand, the potential impact of factors such as weight problems and motor difficulties. The results suggest that these children reach an exercise intensity and duration that do not differ from those of children in the control group. Likewise, the intensity and duration of participation of children with a weight problem or motor difficulties did not differ from those of control participants. On the basis of these data, children with ADHD can achieve a degree of exercise participation sufficient to benefit from physical activity. The third article addresses the potential impact of a physical activity program on the physical fitness, motor performance, certain behaviours, and cognitive functions of children with ADHD. The results indicate that participation in such a program improves muscular capacity, motor skills, certain behaviours observed by parents and teachers, and attentional capacity, which could have a significant impact on the functional adaptation of these youths. These results underline the need for further research in the areas of physical activity and ADHD. The general discussion presents the links between the manuscripts in terms of a model of engagement in physical activity.
The gross motor deficit and the potential clinical impact of physical activity in the treatment of ADHD are the two lines of research that appear most promising for future work.
Abstract:
Heart failure is a condition that markedly reduces patients' functional capacity and drastically lowers their quality of life. Functional capacity is generally assessed with a maximal exercise test; for many patients, however, such an effort is difficult to complete. The objectives of the study presented in this thesis are: (1) to validate three methods of assessing the functional and aerobic capacity of subjects with heart failure and a widened QRS complex; (2) to establish the profile of patients showing better exercise tolerance despite an identical maximal oxygen uptake; and (3) to demonstrate the consequences of the presence and magnitude of cardiac dyssynchrony for functional capacity and exercise tolerance. All subjects underwent a six-minute walk test, a constant-load treadmill endurance test, and a maximal exercise test with measurement of gas exchange at the mouth. The results showed a significant association between the maximal and, more specifically, the submaximal tests. Moreover, better exercise tolerance was significantly associated with greater left ventricular mass. Finally, the results of our study showed no effect of cardiac dyssynchrony on exercise performance as assessed by our protocols.
Abstract:
This project was carried out entirely with free and open-source software.
Abstract:
Functional magnetic resonance imaging (fMRI) studies generally rest on the premise that the BOLD signal can be used as a direct surrogate for neural activation. Studies of cognitive aging often directly compare the amplitude and extent of the BOLD signal between groups of younger and older adults. Such studies therefore carry an additional assumption: that the relationship between neural activity and the hemodynamic response it elicits remains unchanged by aging. However, the BOLD signal arises from an ambiguous combination of changes in oxidative metabolism, blood flow, and blood volume. Moreover, several studies have shown that many of the factors influencing the properties of the BOLD signal change with aging. Acquiring physiologically specific information such as cerebral blood flow and oxidative metabolism would provide a better understanding of the changes underlying the BOLD contrast, as well as of the physiological and cognitive alterations specific to aging. The work presented here demonstrates the application of new techniques for measuring oxidative metabolism at rest and during task execution. These techniques are extensions of existing calibrated fMRI methods. The first method presented is a generalization of existing models for estimating task-evoked oxidative metabolism that accounts for both arbitrary changes in blood flow and changes in arterial O2 content. Improvements in robustness and precision are demonstrated in grey matter and in visual cortex when this method is combined with a respiratory manipulation including both a hypercapnic and a hyperoxic component. The second technique extends the first and uses a combination of respiratory manipulations, including hypercapnia, hyperoxia, and the simultaneous administration of both, to obtain experimental values of the resting oxygen extraction fraction and resting oxidative metabolism. In the second part of this thesis, age-related vascular and metabolic changes are explored in a group of younger and older adults using the calibrated fMRI framework combined with a hypercapnic respiratory manipulation and a modified Stroop task. Changes in resting blood flow, in vascular reactivity to CO2, and in the calibration parameter M were identified in the older adults. The biases that these physiological changes introduce into BOLD signal measurements obtained in older participants are also discussed. Finally, the relationship between these cerebral changes and Stroop task performance, central vascular health, and cardiovascular fitness is explored. The results presented here are consistent with the hypothesis that better cardiovascular fitness is associated with better central vascular function, which in turn contributes to improved cerebral vascular and cognitive health.
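For context, a minimal sketch of the standard calibrated-BOLD (Davis) model that this kind of work builds on; the generalized model described in the abstract additionally accounts for changes in arterial O2 content, which is not shown here, and the exponent values quoted are common literature assumptions rather than values from the thesis.

```latex
% Davis model: BOLD signal change as a function of oxidative metabolism (CMRO2) and flow (CBF)
\[
\frac{\Delta S}{S_0}
  = M \left[ 1 -
      \left(\frac{\mathrm{CMRO_2}}{\mathrm{CMRO_2}\big|_0}\right)^{\beta}
      \left(\frac{\mathrm{CBF}}{\mathrm{CBF}_0}\right)^{\alpha-\beta}
    \right],
\qquad \alpha \approx 0.38,\ \beta \approx 1.5 .
\]

% The calibration parameter M is typically estimated from a hypercapnia run,
% assuming CMRO2 is unchanged by mild CO2 inhalation:
\[
M = \frac{\left(\Delta S / S_0\right)_{\mathrm{CO_2}}}
         {1 - \left(\mathrm{CBF}/\mathrm{CBF}_0\right)_{\mathrm{CO_2}}^{\,\alpha-\beta}} .
\]
```

With M in hand, the task-evoked change in CMRO2 can be solved from the measured BOLD and CBF responses; the methods described in the abstract generalize this step to arbitrary flow changes and to hyperoxic conditions.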