940 results for Compression Parallel to Grain tests


Relevance: 100.00%

Abstract:

Glioblastoma multiforme (GBM) is the most malignant variant of human glial tumors. A prominent feature of this tumor is the occurrence of necrosis and vascular proliferation. The regulation of glial neovascularization is still poorly understood, and the characterization of factors involved in this process is of major clinical interest. Macrophage migration inhibitory factor (MIF) is a pleiotropic cytokine released by leukocytes and by a variety of cells outside the immune system. Recent work has shown that MIF may regulate cellular differentiation and proliferation in normal and tumor-derived cell lines and may also contribute to the neovascularization of tumors. Our immunohistological analysis of MIF distribution in GBM tissues revealed strong accumulation of MIF protein in close association with necrotic areas and in tumor cells surrounding blood vessels. In addition, MIF expression was frequently associated with the presence of the tumor-suppressor gene p53. To substantiate the concept that MIF might be involved in the regulation of angiogenesis in GBM, we analyzed MIF gene and protein expression under hypoxic and hypoglycemic stress conditions in vitro. Northern blot analysis showed a clear increase of MIF mRNA after hypoxia and hypoglycemia. We could also demonstrate that the increase of MIF transcripts upon hypoxic stress can be explained by a profound transcriptional activation of the MIF gene. In parallel to the increase of MIF transcripts, we observed a significant rise in extracellular MIF protein upon angiogenic stimulation. The data of our preliminary study suggest that the up-regulation of MIF expression during hypoxic and hypoglycemic stress might play a critical role in the neovascularization of glial tumors.

Relevance: 100.00%

Abstract:

Introduction: Accurate and reproducible tibial tunnel placement that minimizes the risk of neurovascular damage is a crucial condition for successful arthroscopic reconstruction of the posterior cruciate ligament (PCL). This step is commonly performed under fluoroscopic control. Hypothesis: Drilling the tibial tunnel under exclusive arthroscopic control allows accurate and reliable tunnel placement according to recommendations in the literature. Materials and Methods: Between February 2007 and December 2009, 108 arthroscopic single-bundle PCL reconstructions in tibial tunnel technique were performed. The routine postoperative radiographs were screened according to previously defined quality criteria. After critical analysis, the radiographs of 48 patients (48 knees) were enrolled in the study. Ten patients had simultaneous ACL reconstruction and seven had PCL revision surgery. The tibial tunnel was placed under direct arthroscopic control through a posteromedial portal using a standard tibial aiming device. Key anatomical landmarks were the exposed tibial insertion of the PCL and the posterior horn of the medial meniscus. First, the centre of the posterior tibial tunnel outlet on the anteroposterior (a-p) view was determined by digital analysis of the postoperative radiographs. Its distance to the medial tibial spine was measured parallel to the tibial plateau. The mediolateral position was expressed as the ratio between the distance of the tunnel outlet to the medial border and the total width of the tibial plateau. On the lateral view, the vertical tunnel position was measured perpendicular to a tangent of the medial tibial plateau. All measurements were repeated at least twice and carried out by two examiners. Results: The mean mediolateral tunnel position was 49.3 ± 4.6% (ratio), 6.7 ± 3.6 mm lateral to the medial tibial spine. On the lateral view, the tunnel centre was 10.1 ± 4.5 mm distal to the bony surface of the medial tibial plateau. Neurovascular damage was observed in none of our patients. Conclusion: The results of this radiological study confirm that exclusive arthroscopic control for tibial tunnel placement in PCL reconstruction yields reproducible and accurate results consistent with recommendations in the literature. Our technique avoids radiation exposure, simplifies the operating room setup and enables the surgeon to visualize the key anatomical landmarks for tibial tunnel placement.
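As a rough illustration of the two radiographic measures described in this abstract (the mediolateral ratio across the tibial plateau and the distance from the medial tibial spine measured parallel to the plateau), the following Python sketch computes them from digitized landmark coordinates. The function names, the 2-D landmark coordinates and the assumption that the images are already calibrated to millimetres are illustrative and are not taken from the study.

# Illustrative sketch: radiographic tunnel-position measures of the kind described above.
# Landmark coordinates (mm, after calibration) are hypothetical example values.
import numpy as np

def mediolateral_ratio(medial_border, lateral_border, tunnel_center):
    """Tunnel position across the plateau: (medial border -> tunnel) distance over total plateau width."""
    medial = np.asarray(medial_border, float)
    axis = np.asarray(lateral_border, float) - medial
    width = np.linalg.norm(axis)
    # scalar projection of the tunnel centre onto the plateau axis, i.e. measured parallel to the plateau
    along = np.dot(np.asarray(tunnel_center, float) - medial, axis) / width
    return along / width

def offset_parallel_to_plateau(reference, point, plateau_axis):
    """Signed distance (mm) from a reference landmark to a point, measured parallel to the plateau."""
    u = np.asarray(plateau_axis, float)
    u = u / np.linalg.norm(u)
    return float(np.dot(np.asarray(point, float) - np.asarray(reference, float), u))

# Hypothetical calibrated landmark coordinates (mm) on an a-p radiograph
medial_border, lateral_border = np.array([0.0, 0.0]), np.array([80.0, 2.0])
tunnel_center = np.array([40.0, -8.0])
medial_spine = np.array([33.0, 1.0])

print(f"mediolateral ratio: {mediolateral_ratio(medial_border, lateral_border, tunnel_center):.1%}")
print(f"offset from medial spine: {offset_parallel_to_plateau(medial_spine, tunnel_center, lateral_border - medial_border):.1f} mm")

With these made-up coordinates the ratio comes out near 50% and the spine offset near 7 mm, i.e. in the range the authors report.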

Relevance: 100.00%

Abstract:

The aim of this paper is to analyse how learning assessment, particularly the Continuous Assessment (CA) system, has been defined in the Public Administration and Management Diploma Course of the University of Barcelona (Spain). This course was a pioneering experiment at this university in implementing the guidelines of the European Higher Education Area (EHEA), and thus represents a good case study for verifying whether one of the cornerstones of the EHEA has been accomplished successfully. Using data obtained from the Teaching Plans elaborated by the lecturers of each subject, we are able to establish that the CA system has been progressively accepted to such an extent that it is now the assessment formula used by practically all of the lecturers, conforming in this way to the protocols laid down by the Faculty of Law in which this diploma course is taught. Nevertheless, we find high dispersion in how Continuous Assessment is actually defined. Indeed, there seems to be no unified view of how Continuous Assessment should be performed. This dispersion, however, seems to diminish over time and raises some questions about the advisability of agreeing on criteria, considering the potential that CA has as a pedagogical tool. Moreover, we find that the Unique Assessment system, which students may also apply for, is an option chosen only by a minority, with lecturers usually defining it as merely a theoretical and/or practical test, offering little innovation relative to traditional tests.

Relevance: 100.00%

Abstract:

We present a new hypothesis that relates global plate tectonics to the formation of marginal basins, island arcs, spreading ridges and arc-shaped mountain belts around the North Pacific Ocean. According to our model, the ellipsoidal Paleogene basins of the South China Sea, Parece-Vela Basin, Shikoku Basin, Sea of Japan and Sea of Okhotsk, as well as those of the North American Cordillera, can be attributed to the change in convergence direction between the Indoaustralian and Eurasian plates at 42 Ma. The new direction of convergence was parallel to the eastern continental margin of Asia and resulted in widespread extension perpendicular to this margin and to the western margin of North America. Both margins form part of a circle parallel to the Indoaustralian-Eurasian direction of convergence.

Relevance: 100.00%

Abstract:

An understanding of the details of the interaction mechanisms of bacterial endotoxins (lipopolysaccharide, LPS) with the oxygen transport protein hemoglobin is still lacking, despite its high biological relevance. Here, a biophysical investigation of the endotoxin:hemoglobin interaction is presented which comprises the use of various rough mutant LPS as well as free lipid A; in addition to the complete hemoglobin molecule from fetal sheep extract, the partial structure of the alpha-chain and the heme-free sample are also studied. The investigations comprise the determination of the gel-to-liquid crystalline phase behaviour of the LPS acyl chains, the ultrastructure (type of aggregate structure and morphology) of the endotoxins, and the incorporation of the hemoglobins into artificial immune cell membranes and into LPS. Our data suggest a model for the interaction between Hb and LPS in which hemoglobins do not react strongly with the hydrophilic or the hydrophobic moiety of LPS alone, but rather with the complete endotoxin aggregate. Hb is able to incorporate into LPS with its longitudinal axis parallel to the lipid A double layer. Although this does not lead to a strong disturbance of the LPS acyl chain packing, the change of curvature leads to a slightly conical molecular shape with a change of the three-dimensional arrangement from unilamellar into cubic LPS aggregates. Our previous results show that cubic LPS structures exhibit strong endotoxic activity. The effect of Hb on the physical state of LPS described here may explain the observed increase in LPS-mediated endotoxicity due to the action of Hb.

Relevance: 100.00%

Abstract:

The main sources of coarse aggregate for secondary slip form paving in Southwest Iowa exhibit undesirable "D" cracking. "D" cracking is a discoloration of the concrete caused by fine, hairline cracks. These cracks are caused by the freezing and thawing of moisture inside the coarse aggregate. The cracks are often hourglass shaped, are parallel to each other, and occur along saw joints. The B-4, a typical secondary mix, utilizes 50% fine aggregate and 50% coarse aggregate. It has been proposed that a concrete mix with less coarse aggregate and more fine aggregate might impede this type of deterioration. The Nebraska Standard 47B Mix, a 70% fine aggregate and 30% coarse aggregate mix as used by the Nebraska Department of Roads, produces concrete with ultimate strengths in excess of 4500 psi, but because of the higher cost of cement (it is a six bag per cubic yard mix) it is not competitive with our present secondary mixes. The sands of Southwest Iowa generally have poorer mortar strengths than the average Iowa sand. Class V aggregate, also found in Southwest Iowa, has a coarser sand fraction and therefore a better mortar strength, but exhibits an acidic reaction and therefore must be used with limestone. This illustrates the need to find a mix for use in Southwest Iowa that possesses adequate strength and satisfactory durability at a low cost. The purpose of this study is to determine a concrete mix with an acceptable cement content which will produce physical properties similar to those of our present secondary paving mixes.
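To make the mix comparison above concrete, the short Python sketch below tabulates the aggregate splits and cement contents mentioned in the abstract. The 94 lb-per-bag figure is the customary U.S. cement bag weight, and the bag count shown for the B-4 mix is a placeholder assumption; the abstract gives a bag count only for the Nebraska 47B mix.

# Aggregate splits are from the abstract; the B-4 cement content is an assumed placeholder.
LB_PER_BAG = 94.0  # customary U.S. bag of portland cement

mixes = {
    "B-4 (Iowa secondary)": {"fine_pct": 50, "coarse_pct": 50, "bags_per_cu_yd": 5.0},  # bag count assumed
    "Nebraska 47B":         {"fine_pct": 70, "coarse_pct": 30, "bags_per_cu_yd": 6.0},  # per the abstract
}

for name, m in mixes.items():
    cement_lb = m["bags_per_cu_yd"] * LB_PER_BAG
    print(f"{name}: {m['fine_pct']}% fine / {m['coarse_pct']}% coarse aggregate, "
          f"{m['bags_per_cu_yd']:.1f} bags/cu yd = {cement_lb:.0f} lb cement per cubic yard")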

Relevance: 100.00%

Abstract:

Due to frequent accidental damage to prestressed concrete (P/C) bridges caused by impact from overheight vehicles, a project was initiated to evaluate the strength and load distribution characteristics of damaged P/C bridges. A comprehensive literature review was conducted. It was concluded that only a few references pertain to the assessment and repair of damaged P/C beams. No reference was found that involved testing of a damaged bridge as well as of the damaged beams following their removal. Structural testing of two bridges was conducted in the field. The first bridge tested, damaged by accidental impact, was the westbound (WB) I-680 bridge in Beebeetown, Iowa. This bridge had significant damage to the first and second beams consisting of extensive loss of section and the exposure of numerous strands. The second bridge, the adjacent eastbound (EB) structure, was used as a baseline of the behavior of an undamaged bridge. Load testing concluded that a redistribution of load away from the damaged beams of the WB bridge was occurring. Subsequent to these tests, the damaged beams in the WB bridge were replaced and the bridge retested. The repaired WB bridge behaved, for the most part, like the undamaged EB bridge, indicating that the beam replacement restored the original live load distribution patterns. A large-scale bridge model constructed for a previous project was tested to study the changes in behavior due to incrementally applied damage consisting initially of only concrete removal and then of concrete removal and strand damage. A total of 180 tests were conducted, with the general conclusion that for exterior beam damage, the bridge load distribution characteristics were relatively unchanged until significant portions of the bottom flange were removed along with several strands. A large amount of the total moment applied to the exterior beam was redistributed to the interior beam of the model. Four isolated P/C beams were tested, two removed from the Beebeetown bridge and two from the aforementioned bridge model. For the Beebeetown beams, the first beam, Beam 1W, was tested in an "as removed" condition to obtain the baseline characteristics of a damaged beam. The second beam, Beam 2W, was retrofitted with carbon fiber reinforced polymer (CFRP) longitudinal plates and transverse stirrups to strengthen the section. The strengthened beam was 12% stronger than Beam 1W. Beams 1 and 2 from the bridge model were also tested. Beam 1 was not damaged and served as the baseline for the behavior of a "new" beam, while Beam 2 was damaged and repaired, again using CFRP plates. Prior to debonding of the plates from the beam, the behavior of Beams 1 and 2 was similar. The retrofitted beam attained a capacity greater than that of a theoretically undamaged beam prior to plate debonding. Analytical models were created for the undamaged and damaged center spans of the WB bridge; stiffened plate and refined grillage models were used. Both models were accurate at predicting the deflections in the tested bridge and should be similarly accurate in modeling other P/C bridges. The moment fractions per beam were computed using both models for the undamaged and damaged bridges. The damaged model indicates a significant decrease in moment in the damaged beams and a redistribution of load to the adjacent curb and rail as well as to the undamaged beam lines.
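One simple way to express the load-redistribution findings described above in numbers is to take each beam's fraction of the total measured response under a given load position. The Python sketch below does this for hypothetical bottom-flange strains; the values are invented for illustration, and the proportional-response assumption is a simplification rather than the report's stiffened-plate or grillage analysis.

# Hypothetical example: per-beam distribution fractions taken as each beam's share of the
# summed measured response (bottom-flange strain here); not the report's analytical models.
def distribution_fractions(responses):
    total = sum(responses)
    return [r / total for r in responses]

# Invented microstrain readings for four beam lines with the load over the exterior beam,
# before and after damage to that exterior beam.
undamaged = [120.0, 95.0, 60.0, 35.0]
damaged   = [70.0, 118.0, 78.0, 44.0]

for label, resp in (("undamaged", undamaged), ("damaged", damaged)):
    fracs = distribution_fractions(resp)
    print(label + ":", ", ".join(f"{f:.2f}" for f in fracs))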

Relevance: 100.00%

Abstract:

Due to the hazardous nature of chemical asphalt extraction agents, nuclear gauges have become an increasingly popular method of determining the asphalt content of a bituminous mix. This report details the results of comparisons made between intended, tank stick, extracted, and nuclear asphalt content determinations. A total of 315 sets of comparisons were made on samples that represented 110 individual mix designs and 99 paving projects. All samples were taken from 1987 construction projects. In addition to the comparisons made, seventeen asphalt cement samples were recovered for determination of penetration and viscosity. Results were compared to similar tests performed on the asphalt assurance samples in an attempt to determine the amount of asphalt hardening that can be expected due to the hot mix process. Conclusions of the report are: 1. Compared to the reflux extraction procedure, nuclear asphalt content gauges determine the asphalt content of bituminous mixes with much greater accuracy and comparable precision. 2. As a means of determining asphalt content, the nuclear procedure should be used as an alternative to chemical extractions whenever possible. 3. Based on penetration and viscosity results, softer grade asphalts undergo a greater degree of hardening due to hot mix processing than do harder grades, and asphalt viscosity changes caused by the mixing process are subject to much more variability than are changes in penetration. 4. Based on changes in penetration and viscosity, the Thin Film Oven Test provides a reasonable means of estimating how much asphalt hardening can be anticipated due to exposure to the hot mix processing environment.
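The kind of paired comparison the report describes (each determination against the intended asphalt content) can be summarized by the mean and standard deviation of the differences. The Python sketch below shows the calculation on three invented samples; it does not reproduce the report's 315 comparison sets.

# Invented asphalt contents (% by weight of mix) for three samples.
from statistics import mean, stdev

intended  = [5.50, 6.00, 5.75]
nuclear   = [5.47, 6.05, 5.78]
extracted = [5.30, 6.20, 5.55]

def bias_and_spread(measured, reference):
    """Mean difference (accuracy) and standard deviation of differences (precision) vs. the reference."""
    diffs = [m - r for m, r in zip(measured, reference)]
    return mean(diffs), stdev(diffs)

for name, values in (("nuclear gauge", nuclear), ("reflux extraction", extracted)):
    bias, spread = bias_and_spread(values, intended)
    print(f"{name}: mean difference {bias:+.3f}%, std. dev. of differences {spread:.3f}%")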

Relevance: 100.00%

Abstract:

Spermiogenesis and the ultrastructure of the spermatozoon of the bothriocephalidean cestode Clestobothrium crassiceps (Rudolphi, 1819), a parasite of the teleost fish Merluccius merluccius (Linnaeus, 1758), have been studied by means of transmission electron microscopy. Spermiogenesis begins with the formation of a differentiation zone, characterized by the presence of two centrioles associated with striated rootlets, an intercentriolar body and electron-dense material in the apical region of this zone. Later, two flagella develop from the centrioles, growing orthogonally in relation to the median cytoplasmic process. The flagella then undergo a rotation of 90° until they become parallel to the median cytoplasmic process, followed by the proximodistal fusion of the flagella with the median cytoplasmic process. The nucleus elongates and afterwards migrates along the spermatid body. Spermiogenesis finishes with the appearance of the apical cone surrounded by the single helical crested body at the base of the spermatid. Finally, the narrowing of the ring of arched membranes detaches the fully formed spermatozoon. The mature spermatozoon of C. crassiceps is filiform and contains two axonemes of the 9 + '1' trepaxonematan pattern, a parallel nucleus, parallel cortical microtubules, and electron-dense granules of glycogen. The anterior extremity of the gamete exhibits a short electron-dense apical cone and one crested body, which turns once around the sperm cell. The first axoneme is surrounded by a ring of thick cortical microtubules that persists until the appearance of the second axoneme. Later, these thick cortical microtubules disappear, and thus the mature spermatozoon exhibits two bundles of thin cortical microtubules. The posterior extremity of the male gamete contains only the nucleus. Results are discussed and compared particularly with the available ultrastructural data on the former 'pseudophyllideans'. Two differences can be established between the spermatozoa of Bothriocephalidea and Diphyllobothriidea: the type of spermatozoon (II vs I) and the presence/absence of the ring of cortical microtubules.

Relevance: 100.00%

Abstract:

We describe the spatial distribution of tree height of Pinus uncinata at two undisturbed altitudinal treeline ecotones in the southern Pyrenees (Ordesa, O, and Tessó, T). At each site, a rectangular plot (30 x 140 m) was located with its longest side parallel to the slope and encompassing treeline and timberline. At site O, height increased abruptly going downslope with a high spatial autocorrelation at short distances. In contrast, the changes of tree height across the ecotone at site T were gradual, and tree height was less spatially autocorrelated. These results can be explained by the greater importance of wind and snow avalanches at sites O and T, respectively.
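As an illustration of the kind of spatial-autocorrelation statement made above, the Python sketch below computes Moran's I for tree height over a short distance band in a simulated 30 x 140 m plot. The coordinates, heights and the 5 m neighbour band are invented, and the study itself may have used a different autocorrelation statistic.

# Illustrative Moran's I for tree height in a rectangular plot, using a simple
# distance-band weight matrix (neighbours within 5 m). Coordinates and heights are made up.
import numpy as np

rng = np.random.default_rng(0)
n = 60
xy = np.column_stack([rng.uniform(0, 30, n), rng.uniform(0, 140, n)])   # 30 x 140 m plot
height = 0.05 * xy[:, 1] + rng.normal(0, 0.5, n)                        # taller downslope, plus noise

d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)              # pairwise distances
w = ((d > 0) & (d <= 5.0)).astype(float)                                 # 5 m distance band, no self-pairs

z = height - height.mean()
moran_i = (n / w.sum()) * (z @ w @ z) / (z @ z)
print(f"Moran's I (0-5 m band): {moran_i:.3f}")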

Relevance: 100.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation, used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable slow-clockspeed context, in contrast to the two mainstream computing industries of information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike CISC processors, RISC processor architecture is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base, due to strong technology-enabled customer lock-in and to customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents: firstly, through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; and secondly, by researching process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control point and business model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

An efficient screening strategy for the identification of potentially interesting low-abundance antifungal natural products in crude extracts, combining a sensitive bioautography assay with high performance liquid chromatography (HPLC) microfractionation, was developed. This method relies on high performance thin layer chromatography (HPTLC) bioautography with a hypersusceptible engineered strain of Candida albicans (DSY2621) for bioactivity detection, followed by the evaluation of wild type strains in standard microdilution antifungal assays. Active extracts were microfractionated by HPLC in 96-well plates, and the fractions were subsequently submitted to the bioassay. This procedure enabled precise localisation of the antifungal compounds directly in the HPLC chromatograms of the crude extracts. HPLC-PDA-mass spectrometry (MS) data obtained in parallel to the HPLC antifungal profiles provided an initial chemical screening of the bioactive constituents. Transposition of the HPLC analytical conditions to medium-pressure liquid chromatography (MPLC) allowed the efficient isolation of the active constituents in mg amounts for structure confirmation and more extensive characterisation of their biological activities. The antifungal properties of the isolated natural products were evaluated by determining their minimum inhibitory concentration (MIC) in a dilution assay against both wild type and engineered strains of C. albicans. The biological activity of the most promising agents was further evaluated in vitro by electron microscopy and in vivo in a Galleria mellonella model of C. albicans infection. The overall procedure represents a rational and comprehensive means of evaluating antifungal activity from various perspectives for the selection of initial hits that can be explored in more in-depth mode-of-action studies. This strategy is illustrated by the identification and bioactivity evaluation of a series of antifungal compounds from the methanolic extract of a Rubiaceae plant, Morinda tomentosa, which was used as a model in these studies.
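For the MIC read-out step mentioned above, the Python sketch below shows one common way to pick the MIC from a two-fold broth microdilution series: the lowest concentration reaching a chosen inhibition threshold relative to the drug-free growth control. The 80% threshold, concentrations and optical densities are illustrative assumptions, not values from this study.

# Hypothetical MIC read-out from a two-fold microdilution series: the MIC is taken as the
# lowest concentration whose growth signal falls below a chosen inhibition threshold
# (80% inhibition relative to the drug-free control here; thresholds vary between protocols).
def mic(concentrations_ug_ml, od_values, od_control, inhibition_threshold=0.80):
    """Return the lowest concentration achieving >= threshold growth inhibition, or None."""
    for conc, od in sorted(zip(concentrations_ug_ml, od_values)):
        if 1.0 - od / od_control >= inhibition_threshold:
            return conc
    return None

concs = [0.25, 0.5, 1, 2, 4, 8, 16, 32]                     # two-fold series, ug/mL
ods   = [0.52, 0.50, 0.45, 0.30, 0.08, 0.05, 0.04, 0.04]    # made-up optical densities
print(mic(concs, ods, od_control=0.55))                     # -> 4 with these invented readings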

Relevance: 100.00%

Abstract:

LAY SUMMARY The central nervous system is composed mainly of two types of cells: neurons and glial cells. The latter, although outnumbering neurons, were long regarded by neuroscientists as cells of little interest. However, current knowledge about them indicates that they take part in most of the brain's physiological tasks. In particular, they participate in cerebral energy processes. These processes, besides being vital, are particularly intriguing, since the brain accounts for only 2% of body mass but consumes about 25% of the body's glucose (its energy substrate). Astrocytes, a type of glial cell, play a key role in this remarkable use of glucose by the brain. Indeed, neuronal activity (the transmission of nerve impulses) is accompanied by an increase in the uptake of glucose from the blood circulation by astrocytes. This phenomenon is called the "neurometabolic coupling" between neurons and astrocytes. The sodium ion is one of the cellular players engaged during these processes. In this thesis, the dynamic aspects of astrocytic sodium regulation and their implications for neurometabolic coupling were therefore studied using cellular imaging techniques. These studies demonstrated that mitochondria, the cellular machinery that converts the energy contained in glucose, participate in the regulation of astrocytic sodium. In addition, this thesis work revealed that astrocytes are able to transmit to one another, in the form of sodium waves propagating from cell to cell, a message instructing them to increase their energy consumption. This signalling pathway would allow them to supply energy to neurons following their activation.
SUMMARY Glutamate, released in the synaptic cleft during neuronal activity, is removed by surrounding astrocytes. Glutamate is taken up together with Na+ ions by specific transporters, inducing an intracellular Na+ (Na+i) elevation in astrocytes which triggers a cascade of molecular mechanisms that provides metabolic substrates to neurons. Thus, astrocytic Na+i homeostasis represents a key component of the so-called neurometabolic coupling. In this context, the first part of this thesis work was aimed at investigating whether cytosolic Na+ changes are transmitted to mitochondria, which could therefore influence their function and contribute to the overall intracellular Na+ regulation. Simultaneous monitoring of both mitochondrial Na+ (Na+mit) and cytosolic Na+ changes with fluorescent dyes revealed that glutamate-evoked cytosolic Na+ elevations are indeed transmitted to mitochondria. The mitochondrial Na+/Ca2+ exchangers have a prominent role in the regulation of the Na+mit influx pathway, and Na+mit extrusion appears to be mediated by Na+/H+ exchangers. Demonstrating the implication of the Na+/Ca2+ exchangers required the technical development of a UV-flash photolysis system. Because light sources for flash photolysis have to be powerful and in the near-UV range, UV lasers or flash lamps are usually required. As an alternative to these UV sources, which have several drawbacks, we developed a compact, efficient and low-cost flash photolysis system which employs a high-power 365 nm light-emitting diode. In addition to their role in neurometabolic coupling, astrocytes participate in multicellular signaling by transmitting intercellular Ca2+ waves. The third part of this thesis shows that intercellular Na+ waves can be evoked in parallel to Ca2+ waves. Glutamate released by a Ca2+ wave-dependent mechanism is taken up by glutamate transporters, resulting in a regenerative propagation of cytosolic Na+ increases. Na+ waves in turn lead to a spatially correlated increase in glucose uptake. In conclusion, the present thesis demonstrates that glutamate-induced Na+ changes occurring in the cytosol of astrocytes propagate to both the mitochondrial matrix and the astrocytic network. These results further support the view that astrocytic Na+ is a signal coupled to brain energy metabolism.