14 results for Mixed-mode end load split
at Université de Lausanne, Switzerland
Abstract:
Introduction: The general strategy for anti-doping analysis starts with a screening step, followed by a confirmatory step when a sample is suspected to be positive. The screening step should be fast, generic and able to highlight any sample that may contain a prohibited substance, avoiding false negatives and reducing false positives. The confirmatory step is a dedicated procedure comprising a selective sample preparation and detection mode. Aim: The purpose of the study is to develop a rapid screening strategy and selective confirmatory strategies to detect and identify 103 doping agents in urine. Methods: For the screening, urine samples were simply diluted by a factor of 2 with ultra-pure water and directly injected ("dilute and shoot") into the ultra-high-pressure liquid chromatography (UHPLC) system. The UHPLC separation was performed with two gradients (ESI positive and negative) from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. The gradient analysis time is 9 min, including 3 min re-equilibration. Analyte detection was performed in full-scan mode on a quadrupole time-of-flight (QTOF) mass spectrometer by acquiring the exact mass of the protonated (ESI positive) or deprotonated (ESI negative) molecular ion. For the confirmatory analysis, urine samples were extracted on an SPE 96-well plate with mixed-mode cation-exchange (MCX) sorbents for basic and neutral compounds or mixed-mode anion-exchange (MAX) sorbents for acidic molecules. The analytes were eluted in 3 min (including 1.5 min re-equilibration) with a gradient from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. Analyte confirmation was performed in MS and MS/MS mode on a QTOF mass spectrometer. Results: In the screening and confirmatory analyses, basic and neutral analytes were analysed in positive ESI mode, whereas acidic compounds were analysed in negative mode. Analyte identification was based on retention time (tR) and exact mass measurement. [Ann Toxicol Anal. 2009; 21(S1), Abstracts, S1-25]
"Dilute and shoot" was used as a generic sample treatment in the screening procedure, but matrix effect (e.g., ion suppression) cannot be avoided. However, the sensitivity was sufficient for all analytes to reach the minimal required performance limit (MRPL) required by the World Anti Doping Agency (WADA). To avoid time-consuming confirmatory analysis of false positive samples, a pre-confirmatory step was added. It consists of the sample re-injection, the acquisition of MS/MS spectra and the comparison to reference material. For the confirmatory analysis, urine samples were extracted by SPE allowing a pre-concentration of the analyte. A fast chromatographic separation was developed as a single analyte has to be confirmed. A dedicated QTOF-MS and MS/MS acquisition was performed to acquire within the same run a parallel scanning of two functions. Low collision energy was applied in the first channel to obtain the protonated molecular ion (QTOF-MS), while dedicated collision energy was set in the second channel to obtain fragmented ions (QTOF-MS/MS). Enough identification points were obtained to compare the spectra with reference material and negative urine sample. Finally, the entire process was validated and matrix effects quantified. Conclusion: Thanks to the coupling of UHPLC with the QTOF mass spectrometer, high tR repeatability, sensitivity, mass accuracy and mass resolution over a broad mass range were obtained. The method was sensitive, robust and reliable enough to detect and identify doping agents in urine. Keywords: screening, confirmatory analysis, UHPLC, QTOF, doping agents
Abstract:
A sensitive and specific ultra-performance liquid chromatography-tandem mass spectrometry method for the simultaneous quantification of nicotine, its metabolites cotinine and trans-3'-hydroxycotinine, and varenicline in human plasma was developed and validated. Sample preparation was performed by solid-phase extraction of the target compounds and of the internal standards (nicotine-d4, cotinine-d3, trans-3'-hydroxycotinine-d3 and CP-533,633, a structural analogue of varenicline) from 0.5 mL of plasma, using a mixed-mode cation-exchange support. Chromatographic separations were performed on a hydrophilic interaction liquid chromatography column (HILIC BEH 2.1×100 mm, 1.7 μm). A gradient program was used, with a 10 mM ammonium formate buffer pH 3/acetonitrile mobile phase at a flow rate of 0.4 mL/min. The compounds were detected on a triple quadrupole mass spectrometer operated with an electrospray interface in positive ionization mode, and quantification was performed using multiple reaction monitoring. Matrix effects were quantitatively evaluated with success, with coefficients of variation below 8%. The procedure was fully validated according to Food and Drug Administration guidelines and Société Française des Sciences et Techniques Pharmaceutiques recommendations. The concentration ranges were 2-500 ng/mL for nicotine, 1-1000 ng/mL for cotinine, 2-1000 ng/mL for trans-3'-hydroxycotinine and 1-500 ng/mL for varenicline, according to levels usually measured in plasma. Trueness (86.2-113.6%), repeatability (1.9-12.3%) and intermediate precision (4.4-15.9%) were found to be satisfactory, as was stability in plasma. The procedure was successfully used to quantify nicotine, its metabolites and varenicline in more than 400 plasma samples from participants in a clinical study on smoking cessation.
Abstract:
The anti-diuretic neurohypophysial hormone vasopressin (Vp) and its synthetic analogue desmopressin (Dp, 1-desamino-vasopressin) have received considerable attention from doping control authorities due to their impact on physiological blood parameters. Accordingly, the illicit use of desmopressin in elite sport is sanctioned by the World Anti-Doping Agency (WADA), and the drug is classified as a masking agent. Vp and Dp are small peptides (8-9 amino acids) administered orally as well as intranasally. In the present study, a method was developed to determine Dp and Vp in urinary doping control samples by means of liquid chromatography coupled to quadrupole high-resolution time-of-flight mass spectrometry. After addition of Lys-vasopressin as internal standard and efficient sample clean-up with mixed-mode solid-phase extraction (weak cation exchange), the samples were directly injected into the LC-MS system. The method was validated for specificity, linearity, recovery (80-100%), accuracy, robustness, limit of detection/quantification (20/50 pg/mL), precision (inter-/intra-day < 10%), ion suppression and stability. The analysis of administration-study urine samples collected after a single intranasal or oral application of Dp yielded detection windows for the unchanged target analyte of up to 20 h, at concentrations between 50 and 600 pg/mL. Endogenous Vp was detected at concentrations of approximately 20-200 pg/mL in spontaneous urine samples obtained from healthy volunteers. The general characteristics of the developed method allow an easy transfer to other anti-doping laboratories and support closing another potential gap for cheating athletes.
Abstract:
A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable-isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1×100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover the large concentration ranges of 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline, 1-1000 ng/ml for fluoxetine and fluvoxamine, and 2-1000 ng/ml for norfluoxetine. Good quantitative performance was achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal-standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) were mainly included within the acceptance limits of ±30% for biological samples. The method was successfully applied to routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months.
The β-expectation tolerance intervals determined during the validation phase were consistent with the results of quality control samples analyzed during routine use. The method is therefore precise and suitable both for therapeutic drug monitoring and for pharmacokinetic studies in most clinical laboratories.
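A two-sided β-expectation tolerance interval of the kind used in accuracy profiles can be computed as mean ± t·s·√(1 + 1/n), which is equivalent to a prediction interval for one future measurement. The sketch below uses invented QC data and β = 90%; it is an illustration of the statistic, not the authors' validation software.

```python
import math
from statistics import mean, stdev

# 95th-percentile Student t values for small degrees of freedom
# (standard tables; 0.95 because beta = 0.90 is two-sided)
T_095 = {3: 2.353, 4: 2.132, 5: 2.015, 6: 1.943, 7: 1.895}

def beta_expectation_interval(measurements):
    """Interval expected to contain a single future measurement
    with probability beta = 0.90: mean +/- t * s * sqrt(1 + 1/n)."""
    n = len(measurements)
    m, s = mean(measurements), stdev(measurements)
    half = T_095[n - 1] * s * math.sqrt(1 + 1 / n)
    return m - half, m + half

# Invented QC results (ng/mL) at a nominal level of 100 ng/mL
qc = [98.5, 101.2, 99.8, 102.0, 97.9, 100.6]
lo, hi = beta_expectation_interval(qc)
print(f"{lo:.1f} - {hi:.1f} ng/mL")
# Acceptance check against +/-30% of nominal, i.e. 70-130 ng/mL
print(70 <= lo and hi <= 130)
```

When the whole interval stays inside the ±30% acceptance limits at every concentration level, the procedure is declared valid at that level.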
Abstract:
Ethyl glucuronide (EtG) is a minor and direct metabolite of ethanol. EtG is incorporated into the growing hair, allowing retrospective investigation of chronic alcohol abuse. In this study, we report the development and validation of a method using gas chromatography-negative chemical ionization tandem mass spectrometry (GC-NCI-MS/MS) for the quantification of EtG in hair. EtG was extracted from about 30 mg of hair by aqueous incubation, purified by solid-phase extraction (SPE) using mixed-mode extraction cartridges, and derivatized with perfluoropentanoic anhydride (PFPA). The analysis was performed in selected reaction monitoring (SRM) mode using the transitions m/z 347→163 (for quantification) and m/z 347→119 (for identification) for EtG, and m/z 352→163 for EtG-d5 used as internal standard. For validation, we prepared quality controls (QC) using hair samples taken post mortem from 2 subjects with a known history of alcoholism. These samples were confirmed by a proficiency test with 7 participating laboratories. The assay linearity of EtG was confirmed over the range from 8.4 to 259.4 pg/mg hair, with a coefficient of determination (r²) above 0.999. The limit of detection (LOD) was estimated at 3.0 pg/mg. The lower limit of quantification (LLOQ) of the method was fixed at 8.4 pg/mg. Repeatability and intermediate precision (relative standard deviation, RSD%), tested at 4 QC levels, were less than 13.2%. The analytical method was applied to several hair samples obtained from autopsy cases with a history of alcoholism and/or lesions caused by alcohol. EtG concentrations in hair ranged from 60 to 820 pg/mg hair.
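The linearity check described above amounts to fitting a least-squares calibration line of response against concentration and inspecting r². The calibration data, area ratios and back-calculation below are invented for illustration and are not the study's numbers.

```python
# Sketch: least-squares calibration of a peak-area ratio (analyte / internal
# standard, here EtG / EtG-d5) against concentration, with the coefficient
# of determination r^2 as the linearity criterion.
def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Invented calibration levels (pg/mg hair) and measured area ratios
conc = [8.4, 25, 50, 100, 200, 259.4]
ratio = [0.042, 0.124, 0.251, 0.498, 1.003, 1.295]
slope, intercept, r2 = linear_fit(conc, ratio)

# Back-calculate an unknown sample from its area ratio
unknown = (0.40 - intercept) / slope
print(r2, unknown)
```

Any measured concentration between the LLOQ and the top calibrator can be read off the fitted line in the same way.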
Abstract:
Objective. Vibration training (VT) is a new exercise method with good acceptance among sedentary subjects, due to its passive principle: the machine moves the subject, not the opposite. We hypothesize that untrained subjects can benefit from a greater cardiovascular and metabolic stimulation than trained athletes, resembling classical aerobic-type activity, in addition to the strength gains shown in diverse studies. Methods. 3 groups of male subjects, inactive (SED), endurance-trained athletes (END) and strength-trained athletes (STR), underwent fitness (VO2max) and lower-body strength tests (isokinetic). Subjects were submitted to a session of oscillating VT composed of 3 exercises (isometric half-squat, dynamic squat, dynamic squat with added load), each of 3 minutes' duration and repeated at 3 frequencies. VO2, heart rate and Borg scale were monitored. Results. 27 healthy subjects (10 SED, 9 END and 8 STR), mean age 24.5 (SED), 25.0 (STR) and 29.8 (END), were included. VO2max was significantly different, as expected (47.9 vs. 52.9 vs. 63.9 ml/kg/min for SED, STR and END, respectively). Isokinetic dominant-leg extensor strength was higher in STR (3.32 Nm/kg vs. 2.60 and 2.74 in SED and END). During VT, peak oxygen consumption (% of VO2max) was 59.3 in SED, 50.8 in STR and 48.0 in END (P<0.001 between SED and the other subjects). Peak heart rate (% of maximal heart rate) was 82.7 in SED, 80.4 in STR and 72.4 in END. In SED, dynamic exercises without extra load elicited 51.0% of VO2max and 72.1% of maximal heart rate, and perceived effort reached 15.1/20. Conclusions. VT is an unconventional type of exercise that has been shown to enhance strength, bone density, balance and flexibility. Users are attracted by its relative passivity. In SED, we show that VT elicits a sufficient cardiovascular response to benefit overall fitness in addition to the known strength effects.
VT's higher acceptance as an exercise among sedentary people, compared with jogging or cycling for example, can lead to better adherence to physical activity. Although data on the long-term health effects of VT are not available, we believe this combination of aerobic and resistance-type exercise can be beneficial for multiple health parameters, especially cardiovascular health.
Abstract:
In the closing years of the 20th century, aluminium was the subject of many excessive and divergent pronouncements endorsed by scientists and authoritative bodies. In 1986, the PECHINEY company declared it perpetual, like perpetual motion: "Aluminium is eternal. It can be recycled indefinitely without any alteration of its properties", which irritated us at the time. Shortly afterwards, in 1990, an equally excessive and irritating statement from a major environmental organization, the World Wild Fund, decreed that "aluminium recycling is the worst threat to the environment. It must be abandoned." Then, from the end of the 1990s, came the explosion of publications on "sustainable development" ("développement durable"), a poorly chosen name. To "development", synonymous with obligatory growth, we prefer "society" or "human organization"; and to "durable", a poor French rendering of the English "sustainable", we prefer "supportable": ideally we would have spoken of a durable society, but, to be understood by everyone, we confine ourselves hereafter to speaking of supportable development. Essentially, these publications acknowledge the very serious shortcomings of extractive aluminium metallurgy from ore, and also the extraordinary merits of aluminium recycling, since it requires less than 10% of the energy consumed by extractive metallurgy from ore (we shall see that it also accounts for less than 10% of the pollution and of the capital). It is precisely on recycling that the promotional campaigns for beverage packaging are based, in Switzerland in particular. However, the data on aluminium recycling published by the aluminium industry reflect these merits only in part. In the 1970s, the growth rates of recycled production became higher than those of electrolytic production.
By contrast, recycling rates, computed with an identical indicator, are unanimously mediocre compared with other materials such as copper and iron. As a component of the aluminium industry, recycling enjoys a favourable image with the general public, demonstrating the success of the communication campaigns. Within the aluminium industry, on the other hand, its image is a devalued one. The opinions expressed by all the players (traders, technicians, executives), still collected during this work, are the following: a ragman's trade, a wretched trade, a trade of little technical content yet very difficult (did not one aluminium recycler say that his trade was a man's job whereas that of the copper recycler was child's play?). In our view these opinions belong to a bygone past, which they nevertheless faithfully reflect, for recycling is today recognized as a major contribution to the supportable development of aluminium. It is precisely for this reason that, in 2000, the world aluminium industry decided to abandon the qualifier "secondary" hitherto used to designate recycled metal. It is because of all these discordant and sometimes contradictory data that this work began, encouraged by many prominent figures. Our commitment was unquestionably facilitated by our command of the requisite fields of knowledge (metallurgy, economics, statistics) and above all by the experience acquired over a professional life conducted on a world scale within (research and development, production), for (research, development, marketing, strategy) and around (marketing, strategy of related products, the ferro-alloys, and of competing ones, iron) the aluminium industry.
Our objective is to establish the truth about the recycling of aluminium, a material that contributed greatly to shaping the 20th century, through a critical review embracing every aspect of this little-known activity; there is, for instance, no history of aluminium recycling even though the activity is more than a century old. More than a mere compilation, this critical review was conducted as a scientific, technical, economic, historical and socio-ecological inquiry bringing out the main facts that have marked the evolution of aluminium recycling. It concludes on the real state of recycling, which proves globally satisfactory, with its strengths and weaknesses, and, beyond recycling, on the suitability of aluminium for supportable development, a suitability that is largely insufficient. It therefore suggests topics of study for all those (scientists, technicians, historians, economists, jurists) concerned with an industry highly representative of our world in the making, a world in which the place of aluminium will depend on its ability to satisfy the criteria of supportable development. ABSTRACT Owing to recycling, the aluminium industry's global energy and environmental footprints are much lower than those of its ore-based extractive metallurgy. Likewise, recycling will allow the complete use of the expected avalanche of old scrap resulting from the dramatic explosion of aluminium consumption since the 1950s. The state of recycling is characterized by: i) raw materials split into two groups: one, the new scrap, internal and prompt, proportional to the quantities of semi-finished and finished products, exhibits a fairly good and regular quality; the other, the old scrap, proportional to the finished products arriving at their end of life, about 22 years later on average, exhibits a variable quality depending on the collection mode; ii) a poor recycling rate, close to that of steel.
The aluminium industry generates too much new internal scrap and does not collect all the available old scrap: about 50% of it is not recycled (whereas steel recycles about 70% of the old scrap flow); iii) recycling techniques, all based on melting, are well mastered in spite of aluminium's affinity for oxygen and the practical impossibility of purifying aluminium of any impurity; sorting and primary collection are critical issues before melting; iv) products and markets of recycled aluminium: new scrap has always been recycled in the production lines from which it comes (closed loop); old scrap, mainly mixed scrap, was first recycled in different production lines (open loop): steel deoxidation products, followed during the 1930s, with the development of the foundry alloys, by foundry castings whose main market is the automotive industry; during the 1980s, the commercial development of the beverage can in North America permitted the first closed loop for old scrap recycling, which is still developing; v) an economy with low and erratic margins, because the electrolytic aluminium quotation fixes both the scrap purchase price and the recycled aluminium selling price; vi) an industrial organisation historically based on the scrap group and the loop mode: new scrap is recycled either by the transformation industry itself or by the recycling industry (the remelters), old scrap by the refiners, the other component of the recycling industry; the big companies, the "majors", are often involved in closed-loop recycling and very seldom in open-loop recycling. Today, the aluminium industry's global energy and environmental footprints are still too heavy, and the sustainable-development criteria are not fully met. The critical issues for the aluminium industry are to produce better, to consume better and to recycle better in order to become a truly sustainable industry.
The issues specific to recycling are a very efficient recycling industry, a "sustainable development" economy, and a complete collection of old scrap favouring the closed loop. Also, indirectly connected with recycling, are a very efficient transformation industry generating much less new scrap and a finished-products industry delivering only products fulfilling sustainable-development criteria.
Abstract:
PURPOSE: We hypothesize that untrained subjects can benefit from a greater cardiovascular stimulation than trained athletes, resembling classical aerobic-type activity, in addition to eliciting strength gains. METHODS: 3 groups of male subjects, inactive (SED), endurance-trained (END) and strength-trained (STR), underwent fitness (VO2max) and lower-body strength tests (isokinetic). Subjects were submitted to a session of oscillating VT composed of 3 exercises (isometric half-squat, dynamic squat, dynamic squat with added load), each of 3 minutes' duration and repeated at 3 vibration frequencies (20, 26 and 32 Hz). VO2, heart rate and Borg scale were monitored. RESULTS: 27 healthy subjects (10 SED, 9 END and 8 STR), mean age 24.5 (SED), 25.0 (STR) and 29.8 (END), were included. VO2max was significantly different, as expected (47.9 vs. 52.9 vs. 63.9 mL·min⁻¹·kg⁻¹ for SED, STR and END, respectively). Isokinetic dominant-leg extensor strength was higher in STR (3.32 N·m·kg⁻¹ vs. 2.60 and 2.74 in SED and END). During VT, peak oxygen consumption (% of VO2max) was 59.3 in SED, 50.8 in STR and 48.0 in END (P<0.001 between SED and the other subjects). Peak heart rate (% of maximal heart rate) was 82.7 in SED, 80.4 in STR and 72.4 in END. In SED, dynamic exercises without extra load elicited 51.0% of VO2max and 72.1% of maximal heart rate, and perceived effort reached 15.1/20. CONCLUSIONS: VT is an unconventional type of exercise, known to enhance strength, bone density, balance and flexibility. Users are attracted by its relative passivity. In SED, VT elicits a sufficient cardiovascular response to benefit overall fitness in addition to the strength effects. VT's higher acceptance as an exercise among sedentary people, compared with jogging or cycling, can lead to better adherence to physical activity.
Although data on the long-term health effects of VT are not available, we believe this type of mixed aerobic and resistance-type exercise can be beneficial for multiple health parameters, especially cardiovascular health.
Abstract:
Testosterone can benefit individual fitness by increasing ornament colour, aggressiveness, and sperm quality, but it can also impose both metabolic and immunological costs. However, evidence that testosterone causes immunosuppression in free-living populations is scant. We studied the effects of testosterone on one component of the immune system (the cell-mediated response to phytohaemagglutinin), on parasite load, and on metabolic rate in the common wall lizard, Podarcis muralis (Laurenti, 1768). For the analyses of immunocompetence and parasitism, male lizards were implanted at the end of the breeding season with either empty or testosterone implants and were returned to their site of capture for 5-6 weeks before recapture. For the analyses of the effects of testosterone on metabolic rate, male lizards were captured and implanted before hibernation and were held in the laboratory for 1 week prior to calorimetry. Experimental treatment with testosterone decreased the cell-mediated response to the T-cell mitogen phytohaemagglutinin and increased mean metabolic rate. No effects of testosterone on the number of ectoparasites, on hemoparasites, or on resting metabolic rate could be detected. These results are discussed in the framework of the immunocompetence handicap hypothesis and the immuno-redistribution process hypothesis.
Abstract:
Postsynaptic density-95/discs large/zonula occludens-1 (PDZ) domains are relatively small (80-120 residues) protein-binding modules central to the organization of receptor clusters and the association of cellular proteins. Their main function is to bind the C-termini of selected proteins, which are recognized through specific amino acids at their carboxyl end. Binding is associated with a deformation of the PDZ native structure and is responsible for dynamical changes in regions not in direct contact with the target. We investigate how this deformation is related to the harmonic dynamics of the PDZ structure and show that one low-frequency collective normal mode, characterized by the concerted movements of different secondary structures, is involved in the binding process. Our results suggest that even minimal structural changes are responsible for communication between distant regions of the protein, in agreement with recent NMR experiments. Thus, PDZ domains are a very clear example of how collective normal modes are able to characterize the relation between function and dynamics of proteins, and to provide indications on the precursors of binding/unbinding events.
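Low-frequency collective modes of this kind are commonly extracted from coarse-grained harmonic models. As a hedged illustration of the general technique (a Gaussian network model built on an invented toy structure, not the authors' method or a real PDZ domain):

```python
# Minimal Gaussian network model (GNM) sketch: build a connectivity
# (Kirchhoff) matrix from pairwise distances, diagonalize it, and read the
# slowest non-trivial mode. A real analysis would use Calpha coordinates
# from a PDB file; the helix below is a toy stand-in.
import numpy as np

def gnm_modes(coords, cutoff=7.0):
    """Eigenvalues/eigenvectors of the Kirchhoff matrix (ascending order)."""
    n = len(coords)
    kirchhoff = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(coords[i] - coords[j]) <= cutoff:
                kirchhoff[i, j] = kirchhoff[j, i] = -1.0
    np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))  # degree on diagonal
    return np.linalg.eigh(kirchhoff)

# Toy "structure": 30 points along a loose helix
t = np.linspace(0, 4 * np.pi, 30)
coords = np.c_[3 * np.cos(t), 3 * np.sin(t), 1.5 * t]
evals, evecs = gnm_modes(coords)

# First eigenvalue is ~0 (rigid-body mode); the next eigenvector is the
# slowest collective mode, whose amplitude profile highlights which
# residues move together.
slow_mode = evecs[:, 1]
print(evals[0], evals[1])
```

In an all-atom or anisotropic variant, the same eigendecomposition yields directional displacements rather than scalar amplitudes; the slow-mode reasoning is identical.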
Abstract:
By the end of the 1970s, contaminated sites had emerged as one of the most complex and urgent environmental issues affecting industrialized countries. The authors show that small and prosperous Switzerland is no exception to the pervasive problem of site contamination, the legacy of past waste-management practices having left some 38,000 contaminated sites throughout the country. This book outlines the problem, offering evidence that open and polycentric environmental decision-making that includes civil-society actors is valuable. The authors propose an understanding of the environmental management of contaminated sites as a political process in which institutions frame interactions between strategic actors pursuing sometimes conflicting interests. In the opening chapter, they describe the influence of politics and the power relationships between actors involved in decision-making on contaminated-site management, which they term a "wicked problem." Chapter Two offers a theoretical framework for understanding institutions and the environmental management of contaminated sites. The next five chapters present a detailed case study of environmental management and contaminated sites in Switzerland, focused on the Bonfol Chemical Landfill. The study covers the establishment of the landfill under the first generation of environmental regulations, its closure and early remediation efforts, and the bargaining over the remediation objectives, methods and funding in the first decade of the 21st century. The concluding chapter discusses whether the strength of environmental regulations and the type of interactions between public, private, and civil-society actors can explain the environmental choices in contaminated-site management. Drawing lessons from this research, the authors debate the value of institutional flexibility for dealing with environmental issues such as contaminated sites.
Abstract:
1. The relation between the dietary carbohydrate:lipid ratio and the fuel mixture oxidized over 24 h was investigated in eleven healthy volunteers (six females and five males) in a respiration chamber. The fuel mixture oxidized was estimated by continuous indirect calorimetry and urinary nitrogen measurements. 2. The subjects were first given a mixed diet for 7 d and spent the last 24 h of the 7 d period in a respiration chamber for continuous gas-exchange measurement. The fuels oxidized during 2.5 h of moderate exercise were also measured in the respiration chamber. After an interval of 2 weeks from the end of the mixed-diet period, the same subjects were given an isoenergetic high-carbohydrate, low-fat diet for 7 d, and the same experimental regimen was repeated. 3. Dietary composition markedly influenced the fuel mixture oxidized during 24 h, and this effect was still present 12 h after the last meal in the postabsorptive state. However, the diets had no influence on the substrates oxidized above resting levels during exercise. With both diets, the 24 h energy balance was slightly negative and the energy deficit was covered by lipid oxidation. 4. With the high-carbohydrate, low-fat diet, energy expenditure during sleep was found to be higher than with the mixed diet. 5. It is concluded that: (a) the composition of the diet did not influence the fuel mixture utilized for moderate exercise, (b) the energy deficit calculated for a 24 h period was compensated by lipid oxidation irrespective of the carbohydrate content of the diet, and (c) energy expenditure during sleep was higher with the high-carbohydrate, low-fat diet than with the mixed diet.
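Estimating the fuel mixture from gas exchange and urinary nitrogen rests on standard stoichiometric equations. The sketch below uses Frayn's classical coefficients and invented resting-state values; it illustrates the calculation, not necessarily the exact equations the authors used.

```python
# Frayn-type indirect calorimetry: substrate oxidation rates from oxygen
# uptake, carbon dioxide production, and urinary nitrogen excretion.
def substrate_oxidation(vo2, vco2, n_urinary):
    """vo2, vco2 in L/min; n_urinary in g/min of urinary nitrogen.
    Returns (carbohydrate, fat, protein) oxidation in g/min."""
    cho = 4.55 * vco2 - 3.21 * vo2 - 2.87 * n_urinary
    fat = 1.67 * vo2 - 1.67 * vco2 - 1.92 * n_urinary
    protein = 6.25 * n_urinary  # protein is 16% nitrogen by mass
    return cho, fat, protein

# Assumed resting-state example: VO2 = 0.25 L/min, VCO2 = 0.21 L/min,
# urinary nitrogen = 0.008 g/min
cho, fat, protein = substrate_oxidation(0.25, 0.21, 0.008)
print(round(cho, 3), round(fat, 3), round(protein, 3))
```

Integrated over 24 h (and over the exercise period separately), these rates give exactly the kind of diet-dependent fuel-mixture comparison reported above.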
Abstract:
Internet governance is a recent theme in world politics. Nevertheless, over the years it has become an important economic and political issue, and it has taken on particular importance in recent months as a recurrent news topic. Building on this observation, this research retraces the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. Rather than focusing on one or another of the institutions involved in regulating the worldwide computer network, it analyses the emergence and historical evolution of a space of struggle bringing together a growing number of different actors. This evolution is described through the prism of the dialectical relation between elites and non-elites and of the struggle over the definition of Internet governance. The thesis thus explores how the relations within the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution and structuring of a relatively autonomous field of world politics centred on Internet governance. Against the dominant realist and liberal perspectives, this research is grounded in an approach combining the heterodox traditions of international political economy with the contributions of international political sociology, articulated around the concepts of field, elites and hegemony. The concept of field, developed by Bourdieu, inspires a growing number of studies of world politics; it permits both a differentiated study of globalization and an account of the emergence of transnational spaces of struggle and domination. Elite sociology, for its part, allows a pragmatic, actor-centred approach to questions of power in globalization.
More specifically, this research uses Wright Mills's concept of the power elite to study the unification of a priori different elites around common projects. Finally, it takes up the neo-Gramscian concept of hegemony in order to study both the relative stability of an elite's power, guaranteed by the consensual dimension of domination, and the seeds of change contained in any international order. Through the study of the documents produced during the period under review, and drawing on databases of actor networks built for the purpose, the study examines the debates that followed the commercialization of the network in the early 1990s and the negotiations during the WSIS. The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation was the outcome of the search for a consensus among the dominant discourses of the 1990s, and also the fruit of a coalition of interests within a power elite of Internet governance. However, this institutionalization of the Internet around ICANN excluded a number of actors and discourses, which have since tried to overturn that order. The WSIS provided the setting for the questioning of the mode of Internet governance by the states excluded from the system, by academics, and by certain NGOs and international organizations; it therefore constitutes the second historical period studied in this thesis. The confrontation at the WSIS led to a reconfiguration of the power elite of Internet governance and to a redefinition of the boundaries of the field. A new hegemonic project took shape around discursive elements such as the multistakeholder approach and around institutions such as the Internet Governance Forum.
Le succès relatif de ce projet a permis une stabilité insitutionnelle inédite depuis la fin du SMSI et une acceptation du discours des élites par un grand nombre d'acteurs du champ. Ce n'est que récemment que cet ordre a été remis en cause par les pouvoirs émergents dans la gouvernance de l'Internet. Cette thèse cherche à contribuer au débat scientifique sur trois plans. Sur le plan théorique, elle contribue à l'essor d'un dialogue entre approches d'économie politique mondiale et de sociologie politique internationale afin d'étudier à la fois les dynamiques structurelles liées au processus de mondialisation et les pratiques localisées des acteurs dans un domaine précis. Elle insiste notamment sur l'apport de les notions de champ et d'élite du pouvoir et sur leur compatibilité avec les anlayses néo-gramsciennes de l'hégémonie. Sur le plan méthodologique, ce dialogue se traduit par une utilisation de méthodes sociologiques telles que l'anlyse de réseaux d'acteurs et de déclarations pour compléter l'analyse qualitative de documents. Enfin, sur le plan empirique, cette recherche offre une perspective originale sur la gouvernance de l'Internet en insistant sur sa dimension historique, en démontrant la fragilité du concept de gouvernance multipartenaire (multistakeholder) et en se focalisant sur les rapports de pouvoir et les liens entre gouvernance de l'Internet et mondialisation. - Internet governance is a recent issue in global politics. However, it gradually became a major political and economic issue. It recently became even more important and now appears regularly in the news. Against this background, this research outlines the history of Internet governance from its emergence as a political issue in the 1980s to the end of the World Summit on the Information Society (WSIS) in 2005. 
Rather than focusing on one or another of the institutions involved in Internet governance, this research analyses the emergence and historical evolution of a space of struggle bringing together a growing number of different actors. This evolution is described through the analysis of the dialectical relation between elites and non-elites and through the struggle around the definition of Internet governance. The thesis explores how the relations among the elites of Internet governance, and between these elites and non-elites, explain the emergence, evolution, and structuration of a relatively autonomous field of world politics centred on Internet governance. Against dominant realist and liberal perspectives, this research draws upon a cross-fertilisation of heterodox international political economy and international political sociology. This approach focuses on the concepts of field, elites, and hegemony. The concept of field, as developed by Bourdieu, is increasingly used in International Relations to build a differentiated analysis of globalisation and to describe the emergence of transnational spaces of struggle and domination. Elite sociology allows for a pragmatic, actor-centred analysis of the issue of power in the globalisation process. This research particularly draws on Wright Mills's concept of the power elite in order to explore the unification of different elites around shared projects. Finally, this thesis uses the Neo-Gramscian concept of hegemony in order to study both the consensual dimension of domination and the prospects of change contained in any international order. Through the analysis of the documents produced within the period studied, and through the creation of databases of networks of actors, this research focuses on the debates that followed the commercialisation of the Internet in the early 1990s and on the negotiations during the WSIS.
The first period led to the creation of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1998. This creation resulted from consensus-building among the dominant discourses of the time, as well as from a coalition of interests within an emerging power elite. However, this institutionalisation of Internet governance around ICANN excluded a number of actors and discourses that resisted this mode of governance. The WSIS became the institutional framework within which the governance system was questioned by some excluded states, scholars, NGOs, and intergovernmental organisations. The confrontation between the power elite and counter-elites during the WSIS triggered a reconfiguration of the power elite as well as a redefinition of the boundaries of the field. A new hegemonic project emerged around discursive elements such as the idea of multistakeholderism and institutional elements such as the Internet Governance Forum. The relative success of this hegemonic project allowed for a certain stability within the field and an acceptance of the new order by most non-elites. It is only recently that this order began to be questioned by the emerging powers of Internet governance. This research makes three main contributions to the scientific debate. On the theoretical level, it contributes to the emergence of a dialogue between International Political Economy and International Political Sociology perspectives in order to analyse both the structural trends of the globalisation process and the situated practices of actors in a given issue-area. It notably stresses the contribution of concepts such as field and power elite, and their compatibility with a Neo-Gramscian framework for analysing hegemony. On the methodological level, this perspective relies on mixed methods, combining qualitative content analysis with social network analysis of actors and statements.
Finally, on the empirical level, this research provides an original perspective on Internet governance. It stresses the historical dimension of current Internet governance arrangements, criticises the notion of multistakeholderism, and focuses instead on power dynamics and the relation between Internet governance and globalisation.
Abstract:
AIM: To develop and test the Parental PELICAN Questionnaire, an instrument to retrospectively assess parental experiences and needs during their child's end-of-life care. BACKGROUND: To offer appropriate care for dying children, healthcare professionals need to understand the illness experience from the family's perspective. A questionnaire specific to the end-of-life experiences and needs of parents losing a child is needed to evaluate the perceived quality of paediatric end-of-life care. DESIGN: This is an instrument development study applying mixed methods based on recommendations for questionnaire design and validation. METHOD: The Parental PELICAN Questionnaire was developed in four phases between August 2012 and March 2014: phase 1, item generation; phase 2, validity testing; phase 3, translation; phase 4, pilot testing. Psychometric properties were assessed after administering the Parental PELICAN Questionnaire to a sample of 224 bereaved parents in April 2014. Validity testing covered evidence based on content, internal structure, and relations to other variables. RESULTS: The Parental PELICAN Questionnaire consists of approximately 90 items in four slightly different versions, each accounting for the particularities of one of the four diagnostic groups. The questionnaire's items are structured according to six quality domains described in the literature. Initial evidence of validity and reliability was demonstrated with the involvement of healthcare professionals and bereaved parents. CONCLUSION: The Parental PELICAN Questionnaire holds promise as a measure of parental experiences and needs, and it is applicable to a broad range of paediatric specialties and settings. Future validation is needed to evaluate its suitability in different cultures.