953 results for NUCLEAR-STRUCTURE INVESTIGATIONS
Abstract:
The effect of DNA cytosine methylation on H-ras promoter activity was assessed using a transient expression system employing the plasmid H-rasCAT (NaeI H-ras promoter linked to the chloramphenicol acetyltransferase (CAT) gene). This 551 bp promoter is 80% GC-rich, contains 168 CpG dinucleotides, and carries six functional GC box elements which represent major DNA methylation target sites. The prokaryotic methyltransferases HhaI (CGm⁵CG) and HpaII (Cm⁵CGG), alone or in combination with a human placental methyltransferase (HP MTase), were used to introduce methyl groups at different CpG sites within the promoter. To test for functional promoter activity, the methylated plasmids were introduced into CV-1 cells and CAT activity was assessed 48 h post-transfection. Methylation at specific HhaI and HpaII sites reduced CAT expression by 70%, whereas more extensive methylation at generalized CpG sites with HP MTase inactivated the promoter by >95%. The inhibition of H-ras promoter activity was not attributable to methylation-induced differences in DNA uptake or stability in the cell, the topological form of the plasmid, or methylation effects in nonpromoter regions. We also observed that DNA cytosine methylation of a 360 bp promoter fragment by HP MTase induced a local change in DNA conformation. Using three independent methodologies (nitrocellulose filter binding assays, gel mobility shifts, and Southwestern blots), we determined that this change in promoter conformation affected the interaction of nuclear proteins with cis-regulatory sequences residing in the promoter region. The results provide evidence to suggest that DNA methylation may regulate gene expression by inducing changes in local promoter conformation which in turn alter the interactions between DNA and protein factors required for transcription.
The results provide supportive evidence for the hypothesis of Cedar and Riggs, who postulated that DNA methylation may regulate gene expression by altering the binding affinities of proteins for DNA.
Abstract:
Cardiovascular disease (CVD) is the leading cause of death in the United States. One manifestation of CVD known to increase mortality is an enlarged, or hypertrophic, heart. Hypertrophic cardiomyocytes adapt to increased contractile demand at the genetic level with a re-emergence of the fetal gene program and a downregulation of fatty acid oxidation genes, with a concomitant increased reliance on glucose-based metabolism. To understand the transcriptional regulatory pathways that implement hypertrophic directives, we analyzed the upstream promoter region of the muscle-specific isoform of the nuclear-encoded mitochondrial gene carnitine palmitoyltransferase-1β (CPT-1β) in cultured neonatal rat cardiac myocytes. This enzyme catalyzes the rate-limiting step of fatty acid entry into β-oxidation and is downregulated in cardiac hypertrophy and failure, making it an attractive model for the study of hypertrophic gene regulation and metabolic adaptation. We demonstrate that the muscle-enriched transcription factors GATA-4 and SRF synergistically activate CPT-1β; moreover, DNA binding to cognate sites and intact protein structure are required. This mechanism coordinates upregulation of energy-generating processes with activation of the energy-consuming contractile promoter for cardiac α-actin. We hypothesized that fatty acid- or glucose-responsive transcription factors may also regulate CPT-1β. Oleate weakly stimulates CPT-1β activity; in contrast, the glucose-responsive Upstream Stimulatory Factors (USF) dramatically depress the CPT-1β reporter. USF regulates CPT-1β through a novel physical interaction with the cofactor PGC-1 and abrogation of the MEF2A/PGC-1 synergistic stimulation. In this way, USF can inversely regulate metabolic gene programs and may play a role in the shift of metabolic substrate preference seen in hypertrophy. Failing hearts have elevated expression of the nuclear hormone receptor COUP-TF.
We report that COUP-TF significantly suppresses reporter transcription independent of DNA binding and specific interactions with GATA-4, Nkx2.5 or USF. In summary, CPT-1β transcriptional regulation integrates mitochondrial gene expression with two essential cardiac functions: contraction and metabolic substrate oxidation.
Abstract:
Glacial millennial-scale paleoceanographic changes in the Southeast Pacific and the adjacent Southern Ocean are poorly known due to the scarcity of well-dated, high-resolution sediment records. Here we present new surface water records from sediment core MD07-3128, recovered at 53°S off the Pacific entrance of the Strait of Magellan. The alkenone-derived sea surface temperature (SST) record reveals a very strong warming of ca. 8°C over the last Termination and substantial millennial-scale variability in the glacial section, largely consistent with our planktonic foraminifera oxygen isotope (δ18O) record of Neogloboquadrina pachyderma (sinistral). The timing and structure of the Termination and some of the millennial-scale SST fluctuations are very similar to those observed in the well-dated SST record from ODP Site 1233 (41°S) and the temperature record from the Dronning Maud Land Antarctic ice core, supporting a hemisphere-wide Antarctic timing of SST changes. However, our new SST record also shows differences, including a long-term warming trend over Marine Isotope Stage (MIS) 3 followed by a cooling toward the Last Glacial Maximum (LGM). We suggest that these differences reflect regional cooling related to the proximal location of the southern Patagonian Ice Sheet and the related meltwater supply, at least during the LGM, consistent with the fact that no comparable long-term SST cooling trend is observed at ODP Site 1233 or in any other Chilean SST record. This proximal ice-sheet location is documented by generally higher contents of ice-rafted debris (IRD) and tetra-unsaturated alkenones, and a slight trend toward lighter planktonic δ18O during late MIS 3 and MIS 2.
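The alkenone-derived SSTs above rest on a linear calibration of the UK'37 unsaturation index. A minimal sketch using the widely cited global core-top calibration of Müller et al. (1998) is shown below; the index values are purely illustrative, not data from core MD07-3128.

```python
# Converting the alkenone unsaturation index UK'37 to SST with the
# global core-top calibration of Mueller et al. (1998):
#   UK'37 = 0.033 * SST + 0.044   =>   SST = (UK'37 - 0.044) / 0.033

def uk37_to_sst(uk37: float) -> float:
    """Convert a UK'37 index value to sea surface temperature (deg C)."""
    return (uk37 - 0.044) / 0.033

glacial = uk37_to_sst(0.20)    # a cold, glacial-like index value (hypothetical)
holocene = uk37_to_sst(0.46)   # a warmer, interglacial-like value (hypothetical)
print(f"glacial SST ~ {glacial:.1f} C, Holocene SST ~ {holocene:.1f} C")
print(f"deglacial warming ~ {holocene - glacial:.1f} C")
```

With these illustrative index values, the implied deglacial warming is close to the ca. 8°C reported in the abstract.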
Abstract:
Joint interpretation of magnetotelluric and geomagnetic depth sounding data in the western European Alps offers new insights into the conductivity structure of the Earth's crust and mantle. This first large-scale electromagnetic study in the Alps covers a cross-section from Germany to northern Italy and shows the importance of the Alpine mountain chain as an interrupter of continuous conductors. Poor data quality due to the highly crystalline underground is overcome by Remote Reference and Robust Processing techniques. 3-D forward modelling reveals, on the one hand, interrupted dipping crustal conductors with a maximum conductance of 4960 S and, on the other hand, a lithosphere thickening to up to 208 km beneath the central western Alps. Graphite networks arising from Paleozoic sedimentary deposits are considered to account for the occurrence of high conductivity and the distribution pattern of crustal conductors. The influence of the huge sedimentary molasse basins on the electromagnetic data is suggested to be minor compared with that of the crustal conductors. In conclusion, the electromagnetic results can be attributed to the geological, tectonic and palaeogeographical background. The dipping direction (S-SE) and maximum dip angle (10.1°) of the northern crustal conductor reveal the main thrusting conditions beneath the Helvetic Alps, whereas the existence of a crustal conductor in the Briançonnais supports hypotheses about its palaeogeographic affiliation with the Iberian Peninsula.
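The depth of investigation behind statements such as a 208 km lithospheric thickening is set by the electromagnetic skin depth, which grows with the sounding period and the ground resistivity. A minimal sketch (the resistivity and period values are illustrative assumptions, not parameters from this study):

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum magnetic permeability (H/m)

def skin_depth_m(resistivity_ohm_m: float, period_s: float) -> float:
    """Electromagnetic skin depth delta = sqrt(2*rho / (mu0 * omega)):
    the depth at which the field amplitude has decayed by a factor e."""
    omega = 2.0 * math.pi / period_s
    return math.sqrt(2.0 * resistivity_ohm_m / (MU0 * omega))

# For an assumed 100 Ohm*m half-space, probing ~200 km depth requires
# sounding periods of order 10^3-10^4 s:
for period in (10.0, 1000.0, 10000.0):
    print(f"T = {period:7.0f} s -> skin depth ~ {skin_depth_m(100.0, period) / 1e3:5.0f} km")
```

This is why mantle-depth conductivity structure is accessible only from long-period magnetotelluric and geomagnetic depth sounding data.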
Abstract:
The book is devoted to investigations of the benthic fauna and geology of the South Atlantic Ocean. These works have been carried out as part of exploring the biological structure of the ocean and are of great importance for the development of this fundamental problem. They are based on material collected during Cruise 43 of R/V Akademik Kurchatov in 1985-1986 and Cruise 43 of R/V Dmitry Mendeleev in 1989. The book considers problems of the quantitative distribution, group composition and trophic structure of benthos in the southern Scotia Sea, along the east-west trans-Atlantic section at 31°30'S, and offshore Namibia in the area of the Benguela upwelling. The authors present new data on the fauna of several groups of deep-sea bottom animals and their zoogeography. Much attention is paid to analysis of the morphological structure of the Scotia Sea floor, considered in terms of plate tectonics. Bottom sediments along the trans-Atlantic section and the facies variation of sediments in the area of the South Shetland Islands and the continental margin of Namibia are also considered.
Abstract:
During the international "Overflow" expedition of 1973 on R.V. "Meteor", oxygen concentrations in the surface layers were measured in order to determine the oxygen gradients within the first two meters and to add some information on the mechanisms of oxygen exchange at the air-sea interface. These investigations may also be of interest with regard to long-term observations of the oxygen distribution in the Atlantic, especially the problem of determining the A.O.U. (apparent oxygen utilization). To measure oxygen gradients, a special sampler was built which takes water samples every 20 cm over the first 2 meters. These data were supplemented by further samples down to 150 m, taken by conventional water samplers, from which samples were also taken to measure N2/O2 ratios. By comparing these ratios with the theoretical ratios in air-saturated water, the influence of biological production and consumption on the oxygen content of the water could be estimated. A simple glass apparatus was built to extract gas from the water samples, after which the N2/O2 ratios were determined by mass spectrometry. Most distributions of the oxygen anomaly show a negative oxygen balance which varies widely, probably due to strong mixing processes in the Iceland-Faroe Ridge area. The distributions of surface oxygen saturation values are of two different types. The values from stations 260, 262 and 270 stem from mixed water and show homogeneous supersaturation, as is found immediately when whitecaps appear. The values from 9 other stations are from water sampled during calm periods, which had been mixed and supersaturated before. These show a decreasing oxygen saturation towards the sea surface and often undersaturation in the upper decimeters, down to 98% and even 91%. Thus, at the air-sea interface, oxygen saturations of less than 100% can be found even after supersaturation during heavy-weather periods.
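The saturation and A.O.U. bookkeeping used above can be sketched in a few lines. All concentration values below are hypothetical illustrations, not measurements from the Overflow expedition stations.

```python
# Percent saturation and apparent oxygen utilisation (AOU) from an observed
# O2 concentration and the theoretical air-equilibrium (saturation) value.

def percent_saturation(o2_obs: float, o2_sat: float) -> float:
    """Oxygen saturation in percent of the air-equilibrium value."""
    return 100.0 * o2_obs / o2_sat

def aou(o2_obs: float, o2_sat: float) -> float:
    """Apparent oxygen utilisation: saturation minus observed concentration.
    Positive AOU indicates net consumption; negative indicates supersaturation."""
    return o2_sat - o2_obs

o2_sat = 6.5  # hypothetical air-saturated O2 concentration (ml/l)
surface = percent_saturation(6.37, o2_sat)  # an undersaturated near-surface sample
print(f"surface saturation {surface:.0f} %, AOU {aou(6.37, o2_sat):+.2f} ml/l")
```

A value of 98% as in this sketch corresponds to the mild undersaturation the expedition observed in the upper decimeters during calm periods.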
Abstract:
The elaboration of a generic decision-making strategy to address the evolution of an emergency situation, from the stages of response to recovery and including a planning stage, can facilitate timely, effective and consistent decision making by the response organisations at every level within the emergency management structure and between countries, helping to ensure optimal protection of health, the environment and society. The degree of stakeholder involvement in this process is a key strategic element for strengthening local preparedness and response, and can contribute to a successful countermeasures strategy. Significant progress was made with the multi-national European project EURANOS (2004-2009), which brought together best practice, knowledge and technology to enhance Europe's preparedness to respond to any radiation emergency and long-term contamination. The subsequent establishment of a European Technology Platform and the recent launch of the research project NERIS-TP ("Towards a self-sustaining European Technology Platform (NERIS-TP) on Preparedness for Nuclear and Radiological Emergency Response and Recovery") aim to continue the remaining tasks for attaining appropriate levels of emergency preparedness at the local level in most European countries. One of the objectives of the NERIS-TP project is to strengthen preparedness at the local/national level by setting up dedicated fora and by developing new tools, or adapting the tools developed within the EURANOS project (such as the governance framework for preparedness, the handbooks on countermeasures, the RODOS system, and the MOIRA DSS for long-term contamination in catchments), to meet the needs of local communities.
CIEMAT and UPM, in close interaction with the Nuclear Safety Council, will explore within this project the use and application in Spain of such technical tools, including other national tools and information and communication strategies, to foster cooperation between local, national and international stakeholders. The aim is to identify and involve relevant stakeholders in emergency preparedness in order to improve the development and implementation of appropriate protection strategies as part of consequence management and the transition to recovery. This paper presents an overview of the state of the art in this area in Spain, together with the methodology and work plan proposed by the Spanish group within the NERIS project to strengthen stakeholder involvement in preparedness for emergency response and recovery.
Abstract:
Hydrogen isotopes play a critical role in both inertial and magnetic confinement nuclear fusion, since the preferred fuel for this technology is a mixture of deuterium and tritium. The study of these isotopes, particularly at very low temperatures, is also of technological interest for other applications. The present research line promotes an in-depth study of the structural configurations that hydrogen and deuterium adopt at cryogenic temperatures and high pressures, which are typical conditions in present inertial fusion target designs. Our approach aims to determine the crystal structure characteristics, phase transitions and other parameters strongly correlated with variations of temperature and pressure.
Abstract:
For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations predict critical response functions during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor.
Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. It is also necessary to understand the current status of nuclear data and their uncertainties in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using these data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for the development of this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on the three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed. The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of the nuclear data with uncertainties. Different approaches are presented depending on the cross-section energy group structure: one-group, one-group with correlated sampling, and multi-group. Differences and applicability criteria are presented.
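The Monte Carlo sampling idea at the core of the Hybrid Method can be sketched in a few lines. Everything below is a toy stand-in: the decay constant, fission yield, covariance values and single-nuclide decay-heat model are hypothetical, and the real method drives the ACAB depletion code rather than this closed-form model.

```python
# Sketch: sample correlated nuclear data from a covariance matrix, propagate
# each sample through a (toy) depletion/decay-heat model, and take the spread
# of the response as its uncertainty.
import numpy as np

rng = np.random.default_rng(42)

mean = np.array([1.0e-3, 0.06])  # [decay constant (1/s), fission yield] - hypothetical
rel_cov = np.array([[0.02**2, 0.5 * 0.02 * 0.05],
                    [0.5 * 0.02 * 0.05, 0.05**2]])  # 2%/5% rel. unc., corr = 0.5
cov = rel_cov * np.outer(mean, mean)                # absolute covariance matrix

def decay_heat(lam: float, fy: float, n_fissions: float = 1.0e18,
               e_per_decay_j: float = 1.0e-13, t: float = 600.0) -> float:
    """Toy single-nuclide decay heat (W) at time t after a fission pulse."""
    n0 = fy * n_fissions                     # nuclei produced by the pulse
    return lam * n0 * np.exp(-lam * t) * e_per_decay_j

samples = rng.multivariate_normal(mean, cov, size=5000)
heats = np.array([decay_heat(lam, fy) for lam, fy in samples])
print(f"decay heat = {heats.mean():.3e} W "
      f"+/- {100 * heats.std() / heats.mean():.1f} %")
```

In the real method each sample is a full perturbed nuclear data library, and the response functions are the isotopic composition, decay heat and radiotoxicity computed by the depletion code.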
Sequences have been developed for using different nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries), COVERX (for the multi-group libraries of SCALE), and EAF (for activation libraries). A review of the state of the art of fission yield data shows a lack of uncertainty information, specifically of complete covariance matrices. Furthermore, the international community has expressed renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) Subgroup 37 (SG37), which is dedicated to assessing the needs for nuclear data improvement. This motivates the present review of the state of the art of methodologies for generating covariance data for fission yields. A Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to answer this need. Once the Hybrid Method had been implemented, developed and extended, along with the fission yield covariance generation capability, different applications were studied. The fission pulse decay heat problem is tackled first because of its importance for events after reactor shutdown, and because it is a clean exercise for showing the impact and importance of decay data and fission yield uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are then studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), for which response function uncertainties such as the isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparison with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer.
These comparisons reveal the advantages, limitations and the range of application of the Hybrid Method.
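The Bayesian/GLS update used to generate complete fission yield covariance matrices can be illustrated with a toy example: prior, uncorrelated yields (hypothetical numbers) are constrained to sum to 2 per fission, and the update both adjusts the yields and fills in the off-diagonal covariance terms, producing a complete matrix.

```python
# Sketch of a Bayesian/GLS update with a linear constraint:
#   x' = x + C A^T (A C A^T + V)^-1 (y - A x)
#   C' = C - C A^T (A C A^T + V)^-1 A C
import numpy as np

x = np.array([0.70, 0.80, 0.45])            # prior yields (hypothetical)
C = np.diag([0.05**2, 0.06**2, 0.04**2])    # prior covariance: uncorrelated

A = np.ones((1, 3))      # constraint operator: sum of the yields
y = np.array([2.0])      # fission yields must sum to 2 per fission
V = np.array([[1e-6]])   # tiny variance -> a near-exact constraint

K = C @ A.T @ np.linalg.inv(A @ C @ A.T + V)  # GLS gain
x_post = x + K @ (y - A @ x)                  # updated yields, summing to ~2
C_post = C - K @ A @ C                        # complete (correlated) covariance

print("updated yields:", x_post)
print("posterior covariance (off-diagonals now non-zero):")
print(C_post)
```

The normalisation constraint induces negative correlations between the yields, which is exactly the qualitative structure that complete fission yield covariance matrices must capture.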