954 results for "reconstructed epidermis"


Relevância:

10.00%

Publicador:

Resumo:

The main purpose of our work was to study the ability of the liver, the first organ to metabolise xenobiotics, to degrade cocaine in the presence of ethanol, using two experimental models: a cellular model (rat hepatocytes in suspension) and an acellular model (a model reconstituted in vitro from purified human liver enzymes). The first part of the study aimed to identify the cocaine-metabolising pathways that are inhibited and/or stimulated by ethanol in isolated rat hepatocytes. To this end, an original method for simultaneously separating and quantifying cocaine, cocaethylene and eight of their respective metabolites was developed by Gas Chromatography coupled with Mass Spectrometry (GC/MS). Our preliminary results indicate that ethanol at the three concentrations tested (20, 40 and 80 mM) has no effect on the kinetics of cocaine metabolisation.

Our study confirms that adding ethanol to rat hepatocytes in suspension supplemented with cocaine results in the early formation of benzoylecgonine and cocaethylene. The delayed appearance of ecgonine methyl ester shows that a second detoxification pathway is activated. The late production of ecgonine indicates degradation of benzoylecgonine and ecgonine methyl ester. Moreover, the oxidation pathway involved in the induction of oxidative stress through the production of norcocaine is stimulated at a late stage. Finally, our study shows complete metabolisation of the initial ethanol concentration by rat hepatocytes in suspension. The second part set out to determine whether enzymes other than human carboxylesterases 1 and 2 are able to metabolise cocaine, alone or combined with ethanol. To do this, a micropurification method using liquid chromatography (Smart System®) was developed. For the in situ assays of cocaine, cocaethylene, benzoylecgonine, benzoic acid and lidocaine, a High Performance Liquid Chromatography method with Diode Array Detection (HPLC/DAD) was developed, together with an ethanol assay by headspace Gas Chromatography with Flame Ionisation Detection (headspace GC/FID). The purification procedure led us to suspect the involvement in cocaine metabolism of enzymes other than the previously isolated human liver carboxylesterases 1 and 2. Using an enzymatic model reconstituted in vitro, our preliminary results indicate that esterases other than human liver forms 1 and 2 are involved in the elimination of cocaine, producing benzoylecgonine and ecgonine methyl ester. Moreover, we showed that the sensitivities of these enzymes to ethanol vary.
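The metabolite sequence described above (early benzoylecgonine and cocaethylene, delayed ecgonine methyl ester, late ecgonine) can be caricatured as a chain of first-order reactions. A minimal sketch follows; all rate constants are illustrative placeholders, not values measured in this work, and ethanol is assumed present throughout so that cocaethylene can form.

```python
# Minimal first-order kinetic sketch of sequential cocaine metabolite
# formation (illustrative rate constants, not fitted to the study).
def simulate(k_be=0.8, k_ce=0.3, k_eme=0.1, k_e=0.05, dt=0.01, t_end=24.0):
    coc, be, ce, eme, e = 1.0, 0.0, 0.0, 0.0, 0.0
    history = []
    t = 0.0
    while t <= t_end:
        history.append((t, coc, be, ce, eme, e))
        d_coc = -(k_be + k_ce + k_eme) * coc   # cocaine consumed
        d_be = k_be * coc - k_e * be           # benzoylecgonine (early)
        d_ce = k_ce * coc                      # cocaethylene (ethanol assumed present)
        d_eme = k_eme * coc - k_e * eme        # ecgonine methyl ester (delayed)
        d_e = k_e * (be + eme)                 # ecgonine (late end product)
        coc += d_coc * dt; be += d_be * dt; ce += d_ce * dt
        eme += d_eme * dt; e += d_e * dt
        t += dt
    return history

hist = simulate()
# Late in the incubation, ecgonine keeps rising while cocaine is nearly gone.
```

Mass is conserved by construction, so the metabolite fractions always sum to the initial cocaine dose; the qualitative early/delayed/late ordering mirrors the pattern reported in the abstract.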


Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climate studies. As historians of the Earth, "reconstructers" try to decipher its past. Since learning that the continents move, geologists have been trying to retrieve their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea still does not receive sufficient echo within the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology that places the plates and their kinematics at the centre of the problem. Using continental assemblies (referred to as "key assemblies") as anchors distributed across the scope of our study (ranging from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past towards the present. In between, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolised by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities.

Through time, the only evolving elements are the plate boundaries: they are preserved, follow a consistent geodynamic evolution, and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates multiple factors, among them plate buoyancy, ridge spreading rates, subsidence curves, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It thus offers good control on plate kinematics and provides strict constraints for the model.

This multi-source approach requires efficient data organisation and management. Before this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are software tools specifically devoted to storing, managing and analysing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data into valuable geodynamic information easily accessible for building reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (task automation) and enhanced the model by greatly increasing the kinematic control of plate motions through plate velocity models. On the basis of 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now tackle major issues of modern geology, such as global sea-level variations and climate change.

We began with another major (and not definitively solved!) issue of modern tectonics: the mechanisms driving plate motions. We observed that, throughout the Earth's history, the plates' rotation poles (which describe plate motions across the Earth's surface) tend to be distributed along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa and Central Asia up to Japan. Essentially, this distribution means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpret this phenomenon as reflecting a secular influence of the Moon on plate motions.

The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolised by synthetic isochrons of known age. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed oceanic dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
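Moving a rigid plate between reconstructions, as described above, amounts to a finite rotation about an Euler pole. A minimal sketch of the underlying spherical rotation follows (the generic Rodrigues formula, not the PaleoDyn implementation; the test point and pole are arbitrary):

```python
import math

def rotate(point_lat, point_lon, pole_lat, pole_lon, angle_deg):
    """Rotate a point (degrees) about an Euler pole by angle_deg,
    using Rodrigues' rotation formula on unit vectors."""
    def to_xyz(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (math.cos(la) * math.cos(lo), math.cos(la) * math.sin(lo), math.sin(la))
    def to_latlon(v):
        x, y, z = v
        return (math.degrees(math.asin(z)), math.degrees(math.atan2(y, x)))
    k = to_xyz(pole_lat, pole_lon)        # rotation axis (unit vector)
    p = to_xyz(point_lat, point_lon)
    a = math.radians(angle_deg)
    kxp = (k[1]*p[2] - k[2]*p[1], k[2]*p[0] - k[0]*p[2], k[0]*p[1] - k[1]*p[0])
    kdp = sum(ki * pi for ki, pi in zip(k, p))
    v = tuple(p[i]*math.cos(a) + kxp[i]*math.sin(a) + k[i]*kdp*(1 - math.cos(a))
              for i in range(3))
    return to_latlon(v)

# Rotating an equatorial point 90 degrees about the north pole
# shifts its longitude eastward by 90 degrees.
lat, lon = rotate(0.0, 0.0, 90.0, 0.0, 90.0)
```

A stage rotation between two reconstructions is then just one such call per plate, which is the sense in which plates are "moved as single rigid entities".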


Estimation of the dimensions of fluvial geobodies from core data is a notoriously difficult problem in reservoir modeling. To improve such estimates and, hence, reduce uncertainty in geomodels, data on dunes, unit bars, cross-bar channels, and compound bars and their associated deposits are presented herein from the sand-bed braided South Saskatchewan River, Canada. These data are used to test models that relate the scale of the formative bed forms to the dimensions of the preserved deposits and, therefore, provide insight into how such deposits may be preserved over geologic time. The preservation of bed-form geometry is quantified by comparing the alluvial architecture above and below the maximum erosion depth of the modern channel deposits. This comparison shows that there is no significant difference in the mean set thickness of dune cross-strata above and below the basal erosion surface of the contemporary channel, thus suggesting that dimensional relationships between dune deposits and the formative bed-form dimensions are likely to be valid for both recent and older deposits. The data show that estimates of mean bankfull flow depth derived from dune, unit bar, and cross-bar channel deposits are all very similar. Thus, the use of all these metrics together can provide a useful check that all components and scales of the alluvial architecture have been identified correctly when building reservoir models. The data also highlight several practical issues with identifying and applying data relating to cross-strata. For example, the deposits of unit bars were found to be severely truncated in length and width, with only approximately 10% of the mean bar-form length remaining, making identification in section difficult. For similar reasons, the deposits of compound bars were found to be especially difficult to recognize, and hence estimates of channel depth based on this method may be problematic.
Where only core data are available (i.e., no outcrop data exist), formative flow depths are suggested to be best reconstructed using cross-strata formed by dunes. However, theoretical relationships between the distribution of set thicknesses and formative dune height are found to result in slight overestimates of the latter and, hence, of the mean bankfull flow depths derived from these measurements. This article illustrates that the preservation of fluvial cross-strata, and thus the paleohydraulic inferences that can be drawn from them, is a function of the ratio of the size and migration rate of bed forms to the time scale of aggradation and channel migration. These factors must thus be considered when deciding on appropriate length:thickness ratios for the purposes of object-based modeling in reservoir characterization.
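The dune-based depth reconstruction discussed above typically chains two empirical scalings. The coefficients below (mean dune height roughly 2.9 times mean set thickness; bankfull depth roughly 6 to 10 times dune height) are commonly cited literature values used here only as an illustrative sketch, not values from this article:

```python
def estimate_flow_depth(mean_set_thickness_m,
                        height_per_set=2.9,            # dune height ~ 2.9 x mean set thickness
                        depth_per_height=(6.0, 10.0)):  # bankfull depth ~ 6-10 x dune height
    """Back-of-envelope bankfull flow-depth range from dune cross-set
    thickness. Coefficients are generic literature scalings, not data
    from this study."""
    dune_height = height_per_set * mean_set_thickness_m
    return tuple(c * dune_height for c in depth_per_height)

# A 20 cm mean cross-set thickness implies roughly 3.5-5.8 m bankfull depth.
lo, hi = estimate_flow_depth(0.20)
```

The article's caveat applies directly here: if set-thickness statistics overestimate formative dune height, both ends of this range are biased high.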


There is widespread agreement from patient and professional organisations alike that the safety of stem cell therapeutics is of paramount importance, particularly for ex vivo autologous gene therapy. Yet current technology makes it difficult to thoroughly evaluate the behaviour of genetically corrected stem cells before they are transplanted. To address this, we have developed a strategy that permits transplantation of a clonal population of genetically corrected autologous stem cells that meet stringent selection criteria and the principle of precaution. As a proof of concept, we have stably transduced epidermal stem cells (holoclones) obtained from a patient suffering from recessive dystrophic epidermolysis bullosa. Holoclones were infected with self-inactivating retroviruses bearing a COL7A1 cDNA and cloned before the progeny of individual stem cells were characterised using a number of criteria. Clonal analysis revealed a great deal of heterogeneity among transduced stem cells in their capacity to produce functional type VII collagen (COLVII). Selected transduced stem cells transplanted onto immunodeficient mice regenerated a non-blistering epidermis for months and produced functional COLVII. Safety was assessed by determining the sites of proviral integration, rearrangements and hit genes, and by whole-genome sequencing. The progeny of the selected stem cells also had a diploid karyotype, was not tumorigenic, and did not disseminate after long-term transplantation onto immunodeficient mice. In conclusion, a clonal strategy is a powerful and efficient means of bypassing the heterogeneity of a transduced stem cell population. It guarantees a safe and homogeneous medicinal product, fulfilling the principle of precaution and the requirements of regulatory affairs. Furthermore, a clonal strategy makes it possible to envision exciting gene-editing technologies such as zinc finger nucleases, TALENs and homologous recombination for next-generation gene therapy.


A study of the angular distributions of leptons from decays of J/ψ's produced in p-C and p-W collisions at √s = 41.6 GeV has been performed in the J/ψ Feynman-x region −0.34 < xF < 0.14. The results, obtained from J/ψ's reconstructed in both the e+e− and μ+μ− decay channels, indicate that J/ψ's are produced polarized. The magnitude of the effect is maximal at low pT. For pT > 1 GeV/c a significant dependence on the reference frame is found: the polar anisotropy is more pronounced in the Collins-Soper frame and almost vanishes in the helicity frame, where, instead, a significant azimuthal anisotropy arises.
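For context, the polar and azimuthal anisotropies mentioned above are conventionally defined through the standard dilepton angular-distribution parametrization (a general convention in such polarization analyses, not notation copied from this paper):

```latex
% Standard frame-dependent dilepton angular distribution:
\frac{dN}{d\Omega} \;\propto\; 1
  \;+\; \lambda_{\theta}\,\cos^{2}\theta
  \;+\; \lambda_{\phi}\,\sin^{2}\theta\,\cos 2\phi
  \;+\; \lambda_{\theta\phi}\,\sin 2\theta\,\cos\phi
```

Here λθ is the polar and λφ the azimuthal anisotropy; their values depend on the chosen polarization frame (helicity vs. Collins-Soper), which is why the measured pattern differs between the two frames.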


The genetic divergence among peach genotypes in their reaction to brown rot was evaluated. Twenty-six and 29 peach genotypes were evaluated in the 2009/2010 and 2010/2011 production cycles, respectively. The experiment was carried out at the Laboratório de Fitossanidade, UTFPR - Campus Dois Vizinhos. The experimental design was completely randomized, with each peach genotype as a treatment and three replications of nine fruits each; the control treatment used three replications of three fruits. The fruit epidermis was inoculated individually with 0.15 mL of a M. fructicola conidial suspension (1.0 x 10^5 spores mL-1), while the control treatment was sprayed with 0.15 mL of distilled water. The fruits were examined 72 and 120 hours after inoculation, and disease incidence and severity were evaluated. These results were used in a genetic divergence study, with the generalized Mahalanobis distance as the dissimilarity measure. Cluster analysis using Tocher's optimization method and projection of distances in the plane were applied. Genetic divergence for brown rot among the evaluated peach trees was small, which may make it difficult to obtain resistant genotypes.
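For reference, the generalized Mahalanobis distance weights trait differences by the inverse of their covariance, so correlated traits are not double-counted. A minimal sketch follows, with made-up incidence/severity values and a made-up covariance matrix (not the study's data):

```python
import numpy as np

def mahalanobis_sq(x, y, cov):
    """Squared generalized Mahalanobis distance D^2 between two trait vectors,
    given a pooled (residual) covariance matrix."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return float(d @ np.linalg.inv(cov) @ d)

# Hypothetical (incidence %, severity score) means for two genotypes,
# with an illustrative pooled covariance matrix.
cov = np.array([[4.0, 1.0],
                [1.0, 2.0]])
d2 = mahalanobis_sq([80.0, 3.1], [72.0, 2.5], cov)
```

A matrix of such D² values over all genotype pairs is what clustering methods like Tocher's optimization operate on.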


PURPOSE: All methods presented to date to map both conductivity and permittivity rely on multiple acquisitions to compute quantitatively the magnitude of the radiofrequency transmit field, B1+. In this work, we propose a method to compute both conductivity and permittivity based solely on relative receive coil sensitivities (B1−), which can be obtained in a single measurement without the need to explicitly perform transmit/receive phase separation or to make assumptions regarding those phases. THEORY AND METHODS: To demonstrate the validity and the noise sensitivity of our method, we used electromagnetic finite-difference simulations of a 16-channel transceiver array. To validate our methodology experimentally at 7 Tesla, multi-compartment phantom data were acquired using a standard 32-channel receive coil system and two-dimensional (2D) and 3D gradient echo acquisitions. The reconstructed electric properties were correlated to those measured using dielectric probes. RESULTS: The method was demonstrated both in simulations and in phantom data, with correlations to both the modeled and bench measurements close to identity. The noise properties were modeled and understood. CONCLUSION: The proposed methodology allows the electrical properties of a sample to be determined quantitatively using any MR contrast, the only constraints being the need for 4 or more receive coils and high SNR. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
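For intuition, Helmholtz-based electrical property tomography recovers the complex permittivity from the Laplacian of a field component. The sketch below uses a synthetic 1-D plane wave and is only a didactic stand-in for the paper's receive-sensitivity-based method; all numbers (tissue values, geometry) are illustrative:

```python
import numpy as np

mu0 = 4e-7 * np.pi
eps0 = 8.854e-12

def ept_1d(B, dx, omega):
    """Recover relative permittivity and conductivity from a 1-D complex
    field profile via the homogeneous Helmholtz equation:
    eps_c = -lap(B) / (mu0 * omega^2 * B),  eps_c = eps - i*sigma/omega.
    Didactic sketch only; the paper's method is more involved."""
    lap = (B[:-2] - 2 * B[1:-1] + B[2:]) / dx**2        # finite-difference Laplacian
    eps_c = -lap / (mu0 * omega**2 * B[1:-1])
    eps_r = float(np.mean(eps_c.real)) / eps0
    sigma = float(np.mean(-omega * eps_c.imag))
    return eps_r, sigma

# Synthetic plane wave in a medium with eps_r = 60, sigma = 0.5 S/m,
# at roughly the 7 T proton Larmor frequency (~298 MHz).
omega = 2 * np.pi * 298e6
eps_true = 60 * eps0 - 1j * 0.5 / omega
k = omega * np.sqrt(mu0 * eps_true)
x = np.linspace(0.0, 0.1, 2001)
B = np.exp(-1j * k * x)
eps_r, sigma = ept_1d(B, x[1] - x[0], omega)
```

In the noise-free synthetic case the true properties are recovered almost exactly; with real data, the noise amplification of the Laplacian is exactly why the paper's noise analysis matters.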


The performance of natural extracts obtained from underutilized and residual vegetal and macroalgal biomass, processed with food-grade green solvents, was compared with that of commercial antioxidants. Selected extracts were obtained from two terrestrial sources, a winery byproducts concentrate (WBC) and a hydrothermally fractionated chestnut bur extract (CBAE), and from two underutilized seaweeds: Sargassum muticum, either extracted with ethanol (SmEE) or processed after alginate extraction and hydrothermal fractionation (SmAE), and Ulva lactuca, processed by mild acid extraction and membrane concentration (UlAE). These extracts showed in vitro antioxidant properties comparable to commercial antioxidants and were safe for topical use, based on the absence of skin-irritant effects at 0.1% on reconstructed human tissues. The stability of several cosmetic model emulsions was assessed during accelerated oxidation assays. The incorporation of natural extracts produced from renewable underutilized resources at 0.4-0.5% in oil-in-water emulsions reduced lipid oxidation during storage.


The present study evaluated the anatomy, chlorophyll content and photosynthetic potential of grapevine leaves grown under plastic cover. The experiment was carried out in vineyards of the Moscato Giallo cultivar, covered and uncovered with plastic. A block design with 10 selected plants was used for each area (covered and uncovered). Twelve leaves (six fully exposed to solar radiation and six grown under shaded conditions) were collected from each area, fixed, and analyzed microscopically (thickness of the adaxial and abaxial epidermis and of the palisade and spongy parenchyma). Chlorophyll content and photosynthetic potential were determined in the vineyard at veraison and after harvest. Plastic covering increased the thickness of the palisade parenchyma in both exposed and shaded leaves; however, the leaves from the covered vineyard did not respond to the restriction of solar radiation in the same way as those in the uncovered vineyard. The thickness of the adaxial and abaxial epidermis and of the spongy parenchyma did not vary with solar radiation restriction. Chlorophyll content increased in the leaves of covered plants. The photosynthetic potential of the vines was not affected by the solar radiation restriction imposed by the plastic cover, owing to anatomical modifications in the leaves.


Mayflies (Ephemeroptera) are known to generally present a high degree of insular endemism: half of the 28 species known from Corsica and Sardinia are considered as endemic. We sequenced the DNA barcode (a fragment of the mitochondrial COI gene) of 349 specimens from 50 localities in Corsica, Sardinia, continental Europe and North Africa. We reconstructed gene trees of eight genera or species groups representing the main mayfly families. Alternative topologies were built to test if our reconstructions suggested a single or multiple Corsican/Sardinian colonization event(s) in each genus or species group. A molecular clock calibrated with different evolution rates was used to try to link speciation processes with geological events. Our results confirm the high degree of endemism of Corsican and Sardinian mayflies and the close relationship between these two faunas. Moreover, we have evidence that the mayfly diversity of the two islands is highly underestimated as at least six new putative species occur on the two islands. We demonstrated that the Corsican and Sardinian mayfly fauna reveals a complex history mainly related to geological events. The Messinian Salinity Crisis, which is thought to have reduced marine barriers, thus facilitating gene flow between insular and continental populations, was detected as the most important event in the speciation of most lineages. Vicariance processes related to the split and rotation of the Corso-Sardinian microplate had a minor impact as they involved only two genera with limited dispersal and ecological range. Colonization events posterior to the Messinian Salinity Crisis had only marginal effects as we had indication of recent gene flow only in two clades. With very limited recent gene flow and a high degree of endemism, mayflies from Corsica and Sardinia present all the criteria for conservation prioritization.
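For intuition, a strict molecular clock of the kind mentioned above converts a pairwise COI distance into a divergence time. The rate used below (2.3% pairwise divergence per million years, a commonly cited arthropod COI figure) is an illustrative assumption, not this study's calibration:

```python
def divergence_time_myr(p_distance, pairwise_rate_per_myr=0.023):
    """Divergence time (Myr) from a pairwise COI distance under a strict
    molecular clock. The default rate is a commonly cited arthropod
    figure, used here only as an illustrative assumption."""
    return p_distance / pairwise_rate_per_myr

# A hypothetical insular/continental haplotype pair differing by 12%
# would have split roughly 5.2 Myr ago under this clock, i.e. close to
# the Messinian Salinity Crisis (~5.96-5.33 Ma).
t = divergence_time_myr(0.12)
```

In practice, studies like this one use calibrated clock models with rate uncertainty rather than a single fixed rate, so such point estimates only bracket the geological event being tested.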


Electrical impedance tomography (EIT) allows the measurement of intra-thoracic impedance changes related to cardiovascular activity. As a safe and low-cost imaging modality, EIT is an appealing candidate for non-invasive and continuous haemodynamic monitoring. EIT has recently been shown to allow the assessment of aortic blood pressure via the estimation of the aortic pulse arrival time (PAT). However, finding the aortic signal within EIT image sequences is a challenging task: the signal has a small amplitude and is difficult to locate due to the small size of the aorta and the inherent low spatial resolution of EIT. In order to most reliably detect the aortic signal, our objective was to understand the effect of EIT measurement settings (electrode belt placement, reconstruction algorithm). This paper investigates the influence of three transversal belt placements and two commonly-used difference reconstruction algorithms (Gauss-Newton and GREIT) on the measurement of aortic signals in view of aortic blood pressure estimation via EIT. A magnetic resonance imaging based three-dimensional finite element model of the haemodynamic bio-impedance properties of the human thorax was created. Two simulation experiments were performed with the aim to (1) evaluate the timing error in aortic PAT estimation and (2) quantify the strength of the aortic signal in each pixel of the EIT image sequences. Both experiments reveal better performance for images reconstructed with Gauss-Newton (with a noise figure of 0.5 or above) and a belt placement at the height of the heart or higher. According to the noise-free scenarios simulated, the uncertainty in the analysis of the aortic EIT signal is expected to induce blood pressure errors of at least ± 1.4 mmHg.
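The Gauss-Newton difference reconstruction mentioned above is, in the linearized case, a one-step regularized least-squares solve. A minimal sketch follows, with a made-up Jacobian and data (not the GREIT algorithm, and not the paper's thorax model):

```python
import numpy as np

def gauss_newton_difference(J, dv, lam):
    """One-step regularized Gauss-Newton for linear difference EIT:
    d_sigma = (J^T J + lam^2 I)^-1 J^T dv
    J: sensitivity (Jacobian) matrix, dv: voltage-difference data,
    lam: Tikhonov regularization parameter."""
    JtJ = J.T @ J
    reg = lam**2 * np.eye(JtJ.shape[0])
    return np.linalg.solve(JtJ + reg, J.T @ dv)

# Tiny made-up example: 6 boundary measurements, 4 image pixels, with a
# conductivity change in a single pixel (noise-free, so the solve is
# nearly exact for small lam).
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 4))
true = np.array([0.0, 1.0, 0.0, 0.0])
dv = J @ true
rec = gauss_newton_difference(J, dv, lam=1e-3)
```

The regularization strength plays the role that the "noise figure" plays in the paper's comparison: larger lam (or noise figure) smooths the image, trading spatial resolution for noise robustness.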


Anthropogenic disturbance of wildlife is of growing conservation concern, but comprehensive approaches to its multiple negative effects are lacking. We investigated several effects of disturbance by winter outdoor sports on free-ranging alpine Black Grouse by simultaneously measuring their physiological and behavioral responses. We experimentally flushed radio-tagged Black Grouse from their snow burrows, once a day, during several successive days, and quantified their stress hormone levels (corticosterone metabolites in feces [FCM] collected from individual snow burrows). We also measured feeding time allocation (activity budgets reconstructed from radio-emitted signals) in response to anthropogenic disturbance. Finally, we estimated the related extra energy expenditure that may be incurred: based on activity budgets, energy expenditure was modeled from measures of metabolism obtained from captive birds subjected to different ambient temperatures. The pattern of FCM excretion indicated the existence of a funneling effect, as predicted by the allostatic theory of stress: initial stress hormone concentrations showed wide inter-individual variation, which decreased during experimental flushing. Individuals with low initial pre-flushing FCM values raised their concentrations, while individuals with high initial FCM values lowered them. Experimental disturbance resulted in an extension of feeding duration during the following evening foraging bout, confirming the prediction that Black Grouse must compensate for the extra energy expenditure elicited by human disturbance. Birds with low initial baseline FCM concentrations were those that spent more time foraging. These FCM excretion and foraging patterns suggest that birds with high initial FCM concentrations might have been experiencing a situation of allostatic overload. The energetic model provides quantitative estimates of extra energy expenditure.
A longer exposure to ambient temperatures outside the shelter of snow burrows, following disturbance, could increase the daily energy expenditure by >10%, depending principally on ambient temperature and duration of exposure. This study confirms the predictions of allostatic theory and, to the best of our knowledge, constitutes the first demonstration of a funneling effect. It further establishes that winter recreation activities incur costly allostatic behavioral and energetic adjustments, which call for the creation of winter refuge areas together with the implementation of visitor-steering measures for sensitive wildlife.
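The reported >10% increase in daily energy expenditure follows from a simple exposure model of the kind described above. The sketch below uses made-up metabolic rates and budgets, not the study's measurements:

```python
def extra_expenditure_fraction(mr_outside_w, mr_burrow_w, exposure_h, dee_kj):
    """Fractional increase in daily energy expenditure (DEE) from extra
    time spent outside the snow burrow after disturbance.
    All input numbers here are illustrative, not the study's data."""
    extra_kj = (mr_outside_w - mr_burrow_w) * exposure_h * 3.6  # W * h -> kJ
    return extra_kj / dee_kj

# Hypothetical example: 6 W extra thermoregulatory cost outside the
# burrow, 2 h of exposure, ~1000 kJ daily expenditure.
frac = extra_expenditure_fraction(10.0, 4.0, 2.0, 1000.0)
```

The model makes the study's point explicit: the cost scales with both the metabolic penalty of being outside the burrow (which grows as ambient temperature drops) and the duration of exposure.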


Peroxisome proliferator-activated receptors (PPARs) are ligand-activated transcription factors belonging to the nuclear hormone receptor superfamily. PPARγ is involved in many different activities in the epidermis, such as keratinocyte differentiation, permeability barrier recovery, dermal wound closure, sebaceous gland formation, sebocyte differentiation, and melanogenesis. Preclinical studies with PPARγ ligands have been performed in various skin diseases, and these ligands could represent a new strategy in the treatment of scarring alopecia. PPARγ deserves further study as a therapeutic target, likely not with the current drugs but with future, safer classes of molecules and in combination therapies.


This work determined the profitability of different energy renovation options for a "rintamamiestalo" (a Finnish post-war veteran's house) located in south-eastern Finland. The energy balance calculation was mainly based on Part D5 of the National Building Code of Finland. The analysis was also extended to rintamamiestalo houses in general, with the case-study house serving as a reference in a configuration where the envelope insulation was original and the heating system was due for replacement. A working calculation algorithm was developed that can determine the profitability of energy renovations of typical rintamamiestalo houses with different heating systems and in different climate zones. At the same time, the effects of renovating individual building components on the house's energy demand can be determined. In practice, improvements increasing living comfort have been made over the years to almost all rintamamiestalo houses built in the 1950s, but energy-saving investments have received less attention. Their energy demand is typically 400 kWh per square metre of living area per year, whereas an equivalent house built today would consume 55-60% less energy.
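At its core, the envelope part of a D5-style energy balance is a sum of transmission losses (U-value times area) multiplied by the heating degree hours. A minimal sketch follows, with illustrative U-values, areas and degree hours for a hypothetical 1950s house, not the thesis's actual inputs:

```python
def envelope_heat_demand_kwh(components, degree_hours_kkh):
    """Annual conduction heat loss: sum(U*A) [W/K] * degree hours [kKh] -> kWh.

    components: list of (U-value W/(m2*K), area m2).
    All numbers used below are illustrative assumptions."""
    ua = sum(u * a for u, a in components)   # total specific heat loss, W/K
    return ua * degree_hours_kkh             # W/K * 1000 K*h = kWh

# Hypothetical 1950s envelope with original insulation; ~100 kKh of
# heating degree hours assumed for south-eastern Finland.
components = [(0.6, 110.0),   # walls, original sawdust insulation
              (0.5, 70.0),    # roof
              (0.4, 60.0),    # base floor
              (2.8, 15.0)]    # windows and doors
demand = envelope_heat_demand_kwh(components, degree_hours_kkh=100.0)
```

Re-running the same sum with renovated U-values is exactly how the effect of insulating an individual building component on the house's energy demand can be isolated.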