25 results for Efficient Production Scale
Abstract:
Polyhydroxyalkanoates (PHAs) are bacterial polyesters having the properties of biodegradable thermoplastics and elastomers. Synthesis of PHAs has been demonstrated in transgenic plants. Both polyhydroxybutyrate and the co-polymer poly(hydroxybutyrate-co-hydroxyvalerate) have been synthesized in the plastids of Arabidopsis thaliana and Brassica napus. Furthermore, a range of medium-chain-length PHAs has also been produced in plant peroxisomes. Development of agricultural crops to produce PHA on a large scale and at low cost will be a challenging task requiring a coordinated and stable expression of several genes. Novel extraction methods designed to maximize the use of harvested plants for PHA, oil, carbohydrate, and feed production will be needed. In addition to their use as plastics, PHAs can also be used to modify fiber properties in plants such as cotton. Furthermore, PHA can be exploited as a novel tool to study the carbon flux through various metabolic pathways, such as the fatty acid beta-oxidation cycle.
Abstract:
Retroviral vectors have many favorable properties for gene therapies, but their use remains limited by safety concerns and/or by relatively lower titers for some of the safer self-inactivating (SIN) derivatives. In this study, we evaluated whether increased production of SIN retroviral vectors can be achieved from the use of matrix attachment region (MAR) epigenetic regulators. Two MAR elements of human origin were found to increase and to stabilize the expression of the green fluorescent protein transgene in stably transfected HEK-293 packaging cells. Introduction of one of these MAR elements in retroviral vector-producing plasmids yielded higher expression of the viral vector RNA. Consistently, viral titers obtained from transient transfection of MAR-containing plasmids were increased up to sixfold as compared with the parental construct, when evaluated in different packaging cell systems and transfection conditions. Thus, use of MAR elements opens new perspectives for the efficient generation of gene therapy vectors.
Abstract:
Background and Aims: Vitamin D is an important modulator of numerous cellular processes. Some of us recently observed an association of the 1α-hydroxylase promoter polymorphism CYP27B1-1260 rs10877012 with sustained virologic response (SVR) in a relatively small number of German patients with chronic hepatitis C. In the present study, we aimed to validate this association in a large and well-characterized patient cohort, the Swiss Hepatitis C Cohort Study (SCCS). In addition, we examined the effect of vitamin D on the hepatitis C virus (HCV) life cycle in vitro. Methods: CYP27B1-1260 rs10877012 and IL28B rs12979860 single-nucleotide polymorphisms (SNPs) were genotyped in 1049 patients with chronic hepatitis C from the SCCS, of whom 698 were treated with pegylated interferon-α (PEG-IFN-α) and ribavirin. In addition, 112 patients with spontaneous clearance of HCV were examined. SNPs were correlated with variables reflecting the natural course and treatment outcome of chronic hepatitis C. The effect of 1,25-(OH)2D3 (calcitriol) on HCV replication and viral particle production was investigated in vitro using human hepatoma cell lines (Huh-7.5) harbouring subgenomic replicons and cell culture-derived HCV. Results: The CYP27B1-1260 rs10877012 genotype was not associated with SVR in patients with the good-response IL28B rs12979860 CC genotype. However, in patients with the poor-response IL28B rs12979860 genotypes CT and TT, CYP27B1-1260 rs10877012 was a significant independent predictor of SVR (15% difference in SVR between rs10877012 genotypes AA and CC, p = 0.030, OR = 1.495, 95% CI = 1.038-2.152). The CYP27B1-1260 rs10877012 genotype was associated neither with spontaneous clearance of HCV, nor with liver fibrosis progression rate, inflammatory activity of chronic hepatitis C, or HCV viral load.
Physiological doses of 1,25-(OH)2D3 did not significantly affect HCV RNA replication or infectious particle production in vitro. Conclusions: The results of this large-scale genetic validation study reveal a role of vitamin D metabolism in the response to treatment in chronic hepatitis C, but 1,25-(OH)2D3 does not exhibit a significant direct inhibitory antiviral effect. Thus, the ability of vitamin D to modulate immunity against HCV should be investigated.
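The association statistics reported above (p = 0.030, OR = 1.495, 95% CI = 1.038-2.152) are the kind of estimates derived from a genotype-by-outcome contingency table. As a minimal sketch of the arithmetic, using invented counts rather than the study's data, the odds ratio and its Wald confidence interval can be computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = responders with genotype AA, b = non-responders with AA,
    c = responders with genotype CC, d = non-responders with CC."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(60, 40, 45, 55)
```

The Wald interval is symmetric on the log-odds scale, which is why the bounds are exponentiated at the end.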
Abstract:
BACKGROUND: Silicone breast implants are used to a wide extent in the field of plastic surgery. However, capsular contracture remains a considerable concern. This study aimed to analyze the effectiveness and applicability of an ultracision knife for capsulectomy breast surgery. METHODS: A prospective, single-center, randomized study was performed in 2009. The inclusion criteria specified female patients 20-80 years of age with capsular contracture (Baker 3-4). Ventral capsulectomy was performed using an ultracision knife on one side and the conventional Metzenbaum-type scissors and surgical knife on the contralateral side of the breast. Measurements of the resected capsular ventral fragment, operative time, remaining breast tissue, drainage time, seroma and hematoma formation, visual analog scale pain score, and sensory function of the nipple-areola complex were assessed. In addition, histologic analysis of the resected capsule was performed. RESULTS: Five patients (median age, 59.2 years) were included in this study, with a mean follow-up period of 6 months. Three patients had Baker grade 3 capsular contracture, and two patients had Baker grade 4 capsular contracture. The ultracision knife was associated with a significantly lower pain score, shorter operative time, smaller drainage volume, and shorter drainage time, and resulted in a larger amount of remaining breast tissue. Histologic analysis of the resected capsule showed no apoptotic cells in the study group or control group. CONCLUSIONS: The results suggest that ventral capsulectomy with Baker grade 3 or 4 contracture using the ultracision knife is feasible, safe, and more efficient than blunt dissection and monopolar cutting diathermy and has a short learning curve. LEVEL OF EVIDENCE II: This journal requires that authors assign a level of evidence to each article.
Abstract:
Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations forming production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, aiming to establish fixed planning, production and delivery cycles for the whole production unit. PFA is traditionally applied to job-shops with functional layouts; after reorganization into groups, lead times shorten, quality improves and personnel motivation rises. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, through real cases, that PFA can be applied not only to job-shop and assembly operations but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks and diminishes process variability, all of which contribute to efficient operations management.
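The "natural clusters" that PFA extracts from part routings are commonly found by reordering the binary part-machine incidence matrix. A minimal sketch using rank order clustering, one standard published algorithm for this step (the paper does not specify its exact procedure, and the matrix below is invented for illustration):

```python
def rank_order_clustering(matrix):
    """Minimal rank order clustering sketch for production flow analysis:
    rows = machines, columns = parts, 1 = the part visits that machine.
    Rows and columns are repeatedly re-sorted by the binary weight of
    their 0/1 entries until stable, exposing block-diagonal cells."""
    rows = list(range(len(matrix)))
    cols = list(range(len(matrix[0])))

    def weight(bits):  # read a 0/1 vector as a binary number
        return int("".join(map(str, bits)), 2)

    for _ in range(20):  # converges quickly; cap iterations defensively
        new_rows = sorted(rows, key=lambda r: weight([matrix[r][c] for c in cols]),
                          reverse=True)
        new_cols = sorted(cols, key=lambda c: weight([matrix[r][c] for r in new_rows]),
                          reverse=True)
        if new_rows == rows and new_cols == cols:
            break
        rows, cols = new_rows, new_cols
    return [[matrix[r][c] for c in cols] for r in rows]

# Two parts visit machines {0, 2}, two visit {1, 3}: reordering exposes
# two production cells as diagonal blocks.
cells = rank_order_clustering([
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
])
```

Each diagonal block of ones then corresponds to one candidate production cell (a machine group plus its part family).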
Abstract:
In order to study the various health-influencing parameters related to engineered nanoparticles, as well as to soot emitted by diesel engines, there is an urgent need for appropriate sampling devices and methods for cell exposure studies that simulate the respiratory system and facilitate associated biological and toxicological tests. The objective of the present work was the further advancement of a Multiculture Exposure Chamber (MEC) into a dose-controlled system for efficient delivery of nanoparticles to cells. It was validated with various types of nanoparticles (diesel engine soot aggregates, engineered nanoparticles for various applications) and with state-of-the-art nanoparticle measurement instrumentation to assess the local deposition of nanoparticles on the cell cultures. The dose of nanoparticles to which cell cultures are exposed was evaluated in the normal operation of the in vitro cell culture exposure chamber, based on measurements of the size-specific nanoparticle collection efficiency of a cell-free device. The average efficiency in delivering nanoparticles in the MEC was approximately 82%. Nanoparticle deposition was demonstrated by Transmission Electron Microscopy (TEM). Analysis and design of the MEC employ Computational Fluid Dynamics (CFD) and true-to-geometry representations of nanoparticles, with the aim of assessing the uniformity of nanoparticle deposition among the culture wells. Final testing of the dose-controlled cell exposure system was performed by exposing A549 lung cell cultures to fluorescently labeled nanoparticles. Delivery of aerosolized nanoparticles was demonstrated by visualization of the nanoparticle fluorescence in the cell cultures following exposure. The potential of the aerosolized nanoparticles to generate reactive oxygen species (ROS, e.g. free radicals and peroxides), and thus to induce oxidative stress that can cause extensive cellular damage or damage to DNA, was also monitored.
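As a back-of-the-envelope illustration of how a dose-controlled chamber relates deposition efficiency to the dose cells receive, the following sketch estimates deposited mass per unit culture area. All numerical values and the uniform-deposition assumption are illustrative, not taken from the study:

```python
def delivered_dose(c_aerosol_ug_m3, flow_lpm, minutes, efficiency, well_area_cm2):
    """Hypothetical dose estimate for an in vitro exposure chamber:
    mass deposited per unit cell-culture area (ug/cm^2), assuming the
    size-integrated deposition efficiency applies uniformly to all wells."""
    volume_m3 = flow_lpm * minutes / 1000.0       # litres sampled -> m^3
    deposited_ug = c_aerosol_ug_m3 * volume_m3 * efficiency
    return deposited_ug / well_area_cm2

# e.g. 200 ug/m^3 aerosol, 1 L/min for 60 min, 82% efficiency, 4 cm^2 well
dose = delivered_dose(200, 1.0, 60, 0.82, 4.0)
```

In practice the size-specific efficiency curve, rather than a single average, would be folded into the calculation.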
Abstract:
Long synthetic peptides (LSPs) have a variety of important clinical uses as synthetic vaccines and drugs. Techniques for peptide synthesis were revolutionized in the 1960s and 1980s, after which efficient techniques for purification and characterization of the product were developed. These improved techniques allowed the stepwise synthesis of increasingly longer products at a faster rate, greater purity, and lower cost for clinical use. A synthetic peptide approach, coupled with bioinformatics analysis of genomes, can tremendously expand the search for clinically relevant products. In this Review, we discuss efforts to develop a malaria vaccine from LSPs, among other clinically directed work.
Abstract:
Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrieve the distribution of the continents through the ages.
If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger set comprising both oceanic and continental crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea has not yet received a sufficient echo in the reconstruction community. However, we are convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, in all necessary detail, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a modern methodology placing the plates and their kinematics at the centre of the issue. Using assemblies of continents (referred to as "key assemblies") as anchors distributed along the whole scope of our study (ranging from Eocene to Cambrian times), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding oceanic material (symbolized by synthetic isochrons) to, or removing it from, the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved, follow a consistent geodynamic evolution through time, and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy factors, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides severe constraints for the model.
This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are modern informatics tools specifically devoted to storing, analyzing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted the mass of scattered data offered by the geological record into valuable geodynamic information, easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model (by greatly increasing the kinematic control of plate motions thanks to plate velocity models). Based on the 340 newly defined terranes, we developed a revised set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we are now able to tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a linear distribution along a band going from the Northern Pacific through northern South America, the Central Atlantic, North Africa and Central Asia up to Japan. Basically, this signifies that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted this as the potential secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we attached particular interest to reconstructing it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next.
The crustal material is symbolised by synthetic isochrons whose ages we know. We also reconstruct the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
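The rigid-plate motions and rotation poles described above are Euler rotations on the sphere. As a minimal sketch of the basic operation (not the authors' actual toolchain), a geographic point can be rotated about an Euler pole with Rodrigues' rotation formula:

```python
import math

def rotate_about_euler_pole(lat, lon, pole_lat, pole_lon, angle_deg):
    """Rotate a point on the unit sphere about an Euler pole by a given
    angle (degrees, anticlockwise seen from the pole), via Rodrigues'
    rotation formula -- the core step of moving a rigid plate."""
    def to_xyz(la, lo):
        la, lo = map(math.radians, (la, lo))
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    v, k = to_xyz(lat, lon), to_xyz(pole_lat, pole_lon)
    t = math.radians(angle_deg)
    kxv = cross(k, v)
    # v' = v cos(t) + (k x v) sin(t) + k (k . v)(1 - cos(t))
    r = tuple(v[i] * math.cos(t) + kxv[i] * math.sin(t)
              + k[i] * dot(k, v) * (1 - math.cos(t)) for i in range(3))
    return math.degrees(math.asin(r[2])), math.degrees(math.atan2(r[1], r[0]))

# A 90-degree rotation of (0N, 0E) about the North Pole moves it to (0N, 90E)
lat2, lon2 = rotate_about_euler_pole(0, 0, 90, 0, 90)
```

A finite reconstruction is then a composition of such rotations, one per plate per time step.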
Abstract:
Engineered nanomaterials (ENMs) exhibit special physicochemical properties and thus are finding their way into an increasing number of industries, enabling products with improved properties. Their increased use brings a greater likelihood of exposure to the nanoparticles (NPs) that could be released during the life cycle of nano-enabled products. The field of nanotoxicology has emerged as a consequence of the development of these novel materials, and it has gained ever more attention due to the urgent need to gather information on exposure to them and to understand the potential hazards they engender. However, current studies on nanotoxicity tend to focus on pristine ENMs, and they use these toxicity results to generalize risk assessments on human exposure to NPs. ENMs released into the environment can interact with their surroundings, change characteristics and exhibit toxicity effects distinct from those of pristine ENMs. Furthermore, NPs' large surface areas provide extra-large potential interfaces, thus promoting more significant interactions between NPs and other co-existing species. In such processes, other species can attach to a NP's surface and modify its surface functionality, as well as the toxicity it normally exhibits. One particular occupational health scenario involves NPs and low-volatility organic compounds (LVOC), a common type of pollutant existing around many potential sources of NPs. LVOC can coat a NP's surface and then dominate its toxicity. One important mechanism in nanotoxicology is the creation of reactive oxygen species (ROS) on a NP's surface; LVOC can modify the production of these ROS. In summary, nanotoxicity research should not be limited to the toxicity of pristine NPs, nor use their toxicity to evaluate the health effects of exposure to environmental NPs. Instead, the interactions which NPs have with other environmental species should also be considered and researched.
The potential health effects of exposure to NPs should be derived from these real-world NPs, with their environmentally modified characteristics and distinct toxicity. Failure to suitably address toxicity results could lead to an inappropriate treatment of nano-release, affect the environment and public health, and put a blemish on the development of sustainable nanotechnologies as a whole. The main objective of this thesis is to demonstrate a process for coating NP surfaces with LVOC using a well-controlled laboratory design and, with regard to these NPs' capacity to generate ROS, to explore the consequences for particle toxicity. The dynamic coating system developed yielded stable and replicable coating performance, simulating an important realistic scenario. Clear changes in the size distribution of airborne NPs were observed using a scanning mobility particle sizer, were confirmed using both liquid nanotracking analyses and transmission electron microscopy (TEM) imaging, and were verified to originate from the LVOC coating. Coating thicknesses corresponded to the amount of coating material used and were controlled through the parameters of the LVOC generator. The capacity of pristine silver NPs (Ag NPs) to generate ROS was reduced when they were given a passive coating of inert paraffin: this coating blocked the reactive zones on the particle surfaces. In contrast, a coating of active reduced anthraquinone contributed to redox reactions and generated ROS itself, even though ROS generation due to oxidation by the Ag NPs themselves was quenched. Further objectives of this thesis included the development of ROS methodology and the analysis of ROS case studies. Since the capacity of NPs to create ROS is an important effect in nanotoxicity, we attempted to refine and standardize the use of 2',7'-dichlorodihydrofluorescin (DCFH) as a chemical tailored for the characterization of NPs' capacity for ROS generation.
Previous studies had reported a wide variety of results, due to a number of insufficiently well-controlled factors. We therefore cross-compared chemicals and concentrations, explored ways of dispersing NP samples in liquid solutions, identified sources of contradictions in the literature and investigated ways of reducing artificial results. The most robust results were obtained by sonicating an optimal sample of NPs in a DCFH-HRP solution made of 5 µM DCFH and 0.5 unit/ml horseradish peroxidase (HRP). Our findings explained how the major reasons for previously conflicting results were the different experimental approaches used and the potential artifacts appearing when using high sample concentrations. Applying our advanced DCFH protocol together with other physicochemical characterizations and biological analyses, we conducted several case studies characterizing aerosols and NP samples. Exposure to aged brake-wear dust engenders a risk of potential deleterious health effects in occupational scenarios. We performed microscopy and elemental analyses, as well as ROS measurements, with acellular and cellular DCFH assays. TEM images revealed the samples to be heterogeneous mixtures with few particles at the nano-scale. Metallic and non-metallic elements were identified, primarily iron, carbon and oxygen. Moderate amounts of ROS were detected in the cell-free fluorescent tests; however, exposed cells were not dramatically activated. The reason aged brake-wear samples caused less oxidative stress than fresh brake-wear samples may be their larger size, and thus smaller relative reactive surface area, in addition to their highly aged, oxidized state. Other case studies, involving welding fumes and differently charged NPs, confirmed the performance of our DCFH assay and found ROS generation linked to varying characteristics, especially the surface functionality of the samples.
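The size argument in the last paragraph rests on simple geometry: for spherical particles, surface area per unit mass scales as 1/diameter (SSA = 6/(ρ·d)). A small sketch of that scaling, using an iron-like density of 7.87 g/cm³ purely as an illustrative value:

```python
def specific_surface_area(d_nm, density_g_cm3):
    """Surface area per unit mass (m^2/g) of a spherical particle,
    SSA = 6 / (rho * d). Illustrates why, at equal mass dose, smaller
    particles expose proportionally more reactive surface."""
    d_m = d_nm * 1e-9                 # diameter: nm -> m
    rho = density_g_cm3 * 1e6         # density: g/cm^3 -> g/m^3
    return 6.0 / (rho * d_m)

# A 20 nm particle has 10x the specific surface area of a 200 nm one
ratio = specific_surface_area(20, 7.87) / specific_surface_area(200, 7.87)
```

Real brake-wear particles are aggregates rather than spheres, so this is only the limiting geometric trend, not a measured value.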
Abstract:
Worldwide, about half the adult population is considered overweight, as defined by a body mass index (BMI, body weight divided by height squared) in excess of 25 kg·m⁻². Of these individuals, half are clinically obese (BMI in excess of 30 kg·m⁻²), and these numbers are still increasing, notably in developing countries such as those of the Middle East region. Obesity is a disorder characterised by an increased mass of adipose tissue (excessive fat accumulation) resulting from a systemic imbalance between food intake and energy expenditure. Although factors such as family history, sedentary lifestyle, urbanisation, income and family diet patterns determine obesity prevalence, the main underlying causes are poor knowledge about food choice and lack of physical activity. Current obesity treatments include dietary restriction, pharmacological interventions and, ultimately, bariatric surgery. The beneficial effects of physical activity on weight loss, through increased energy expenditure and appetite modulation, are also firmly established. Another viable option to induce a negative energy balance is to incorporate hypoxia per se, or combine it with exercise, into an individual's daily schedule. This article presents recent evidence suggesting that combining hypoxic exposure and exercise training might provide a cost-effective strategy for reducing body weight and improving cardio-metabolic health in obese individuals. The efficacy of this approach is further reinforced by epidemiological studies using large-scale databases, which evidence a negative relationship between altitude of habitation and obesity. In the United States, for instance, obesity prevalence is inversely associated with altitude of residence and urbanisation, after adjusting for temperature, diet, physical activity, smoking and demographic factors.
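The BMI definition and cut-offs cited above are simple arithmetic; a minimal sketch:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight divided by height squared (kg/m^2)."""
    return weight_kg / height_m ** 2

def classify(b):
    """The cut-offs used in the text: >= 25 overweight, >= 30 obese."""
    if b >= 30:
        return "obese"
    if b >= 25:
        return "overweight"
    return "not overweight"

# e.g. 85 kg at 1.75 m gives a BMI of about 27.8 -> overweight
b = bmi(85, 1.75)
```

Note that BMI is a population screening ratio; it does not distinguish fat mass from lean mass in an individual.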