940 results for FGGE-Equator '79 - First GARP Global Experiment


Relevance:

30.00%

Publisher:

Abstract:

A procedure has been proposed by Ciotti and Bricaud (2006) to retrieve spectral absorption coefficients of phytoplankton and colored detrital matter (CDM) from satellite radiance measurements. This was also the first procedure to estimate a size factor for phytoplankton, based on the shape of the retrieved algal absorption spectrum, and the spectral slope of CDM absorption. Applying this method to the global ocean color data set acquired by SeaWiFS over twelve years (1998-2009) allowed for a comparison of the spatial variations of chlorophyll concentration ([Chl]), algal size factor (S-f), CDM absorption coefficient (a(cdm)) at 443 nm, and spectral slope of CDM absorption (S-cdm). As expected, correlations between the derived parameters were characterized by a large scatter at the global scale. We compared the temporal variability of the spatially averaged parameters over the twelve-year period for three oceanic areas of biogeochemical importance: the Eastern Equatorial Pacific, the North Atlantic and the Mediterranean Sea. In all areas, both S-f and a(cdm)(443) showed large seasonal and interannual variations, generally correlated with those of algal biomass. The CDM maxima appeared on some occasions to last longer than those of [Chl]. The spectral slope of CDM absorption showed very large seasonal cycles consistent with photobleaching, challenging the assumption of a constant slope commonly used in bio-optical models. In the Equatorial Pacific, the seasonal cycles of [Chl], S-f, a(cdm)(443) and S-cdm, as well as the relationships between these parameters, were strongly affected by the 1997-98 El Niño/La Niña event.

Relevance:

30.00%

Publisher:

Abstract:

The regional monsoons of the world have long been viewed as seasonal atmospheric circulation reversals, analogous to a thermally driven land-sea breeze on a continental scale. This conventional view of monsoons is now being integrated at a global scale; accordingly, a new paradigm has emerged which considers regional monsoons to be manifestations of global-scale seasonal changes in response to overturning of the atmospheric circulation in the tropics and subtropics, and hence interactive components of a single Global Monsoon (GM) system. The paleoclimate community, however, tends to view the 'paleomonsoon' (PM) largely in terms of regional circulation phenomena. In the past decade, many high-quality speleothem oxygen isotope (delta O-18) records have been established from the Asian Monsoon and the South American Monsoon regions that primarily reflect changes in the integrated intensities of monsoons on orbital-to-decadal timescales. With the emergence of these high-resolution, absolutely dated records from both sides of the Equator, it is now possible to test the concept of a 'Global Paleo-Monsoon' (GPM) on a wide range of timescales. Here we present a comprehensive synthesis of globally distributed speleothem delta O-18 records and highlight three aspects of the GPM that are comparable to the modern GM: (1) GPM intensity swings on different timescales; (2) their global extent; and (3) an anti-phased inter-hemispheric relationship between the Asian and South American monsoon systems on a wide range of timescales.

Relevance:

30.00%

Publisher:

Abstract:

Background: Xylella fastidiosa, a Gram-negative fastidious bacterium, grows in the xylem of several plants, causing diseases such as citrus variegated chlorosis. As xylem sap contains low concentrations of amino acids and other compounds, X. fastidiosa needs to cope with nitrogen limitation in its natural habitat. Results: In this work, we performed a whole-genome microarray analysis of the X. fastidiosa nitrogen starvation response. A time-course experiment (2, 8 and 12 hours) of cultures grown in defined medium under nitrogen starvation revealed many differentially expressed genes, such as those related to transport, nitrogen assimilation, amino acid biosynthesis and transcriptional regulation, as well as many genes encoding hypothetical proteins. In addition, a decrease in the expression levels of many genes involved in carbon metabolism and energy generation pathways was observed. Comparison of gene expression profiles between the wild-type strain and an rpoN null mutant allowed the identification of genes directly or indirectly induced by nitrogen starvation in a σ54-dependent manner. A more complete picture of the σ54 regulon was achieved by combining the transcriptome data with an in silico search for potential σ54-dependent promoters, using a position weight matrix approach. One of these predicted σ54 binding sites, located upstream of the glnA gene (encoding glutamine synthetase), was validated by primer extension assays, confirming that this gene has a σ54-dependent promoter. Conclusions: Together, these results show that nitrogen starvation causes intense changes in the X. fastidiosa transcriptome, and some of these differentially expressed genes belong to the σ54 regulon.
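A position weight matrix scan of the kind used for the in silico promoter search can be sketched as follows. The matrix values, motif length, test sequence and threshold below are invented for illustration; they are not the σ54 matrix used in the study.

```python
# Sketch of a position weight matrix (PWM) scan over a DNA sequence.
# Each entry is an illustrative log-odds score for a base at a position.
PWM = {
    "A": [0.2, -1.5, -1.5, 0.1],
    "C": [-1.5, 0.8, -1.5, 0.3],
    "G": [1.0, 0.5, 1.2, -1.5],
    "T": [-1.5, -1.5, -1.5, 0.9],
}
MOTIF_LEN = 4

def pwm_score(window: str) -> float:
    """Sum the log-odds scores of a window of length MOTIF_LEN."""
    return sum(PWM[base][i] for i, base in enumerate(window))

def scan(sequence: str, threshold: float):
    """Slide the PWM along the sequence; report windows scoring above threshold."""
    hits = []
    for i in range(len(sequence) - MOTIF_LEN + 1):
        window = sequence[i:i + MOTIF_LEN]
        score = pwm_score(window)
        if score >= threshold:
            hits.append((i, window, score))
    return hits

hits = scan("TTGCGAGGCT", threshold=2.0)
```

In a real promoter search the matrix would be trained on aligned known σ54 binding sites and the threshold calibrated against a background model; the sliding-window scoring itself is the part sketched here.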

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to evaluate the chemical composition and dry matter in vitro digestibility of the stem, leaf, straw, cob and kernel fractions of eleven corn (Zea mays) cultivars, harvested at two cutting heights. The experiment was designed as randomized blocks with three replicates, in a 2 × 11 factorial arrangement (eleven cultivars and two cutting heights). The corn cultivars evaluated were D 766, D 657, D 1000, P 3021, P 3041, C 805, C 333, AG 5011, FOR 01, CO 9621 and BR 205, harvested at a low cutting height (5 cm above ground) and a high cutting height (5 cm below the first ear insertion). Cutting height influenced the dry matter content of the stem fraction, which was lower in plants harvested at the low cutting height (23.95%) than in plants harvested at the high cutting height (26.28%). The kernel fraction had the highest dry matter in vitro digestibility (85.13%), and the cultivars did not differ from each other. Cob and straw were the fractions with the highest levels of neutral detergent fiber (80.74% and 79.77%, respectively) and the lowest levels of crude protein (3.84% and 3.69%, respectively). The leaf fraction had the highest crude protein content at both the low and the high cutting height (15.55% and 16.20%, respectively). Raising the cutting height increased the dry matter content and the dry matter in vitro digestibility of the stem fraction, but did not affect the dry matter content of the leaf fraction.

Relevance:

30.00%

Publisher:

Abstract:

The Amazon basin is a region of constant scientific interest due to its environmental importance and the global-scale relevance of its biodiversity and climate. The seasonal variation in water volume is one example of a topic studied nowadays. In general, variations in river levels depend primarily on the climatic and physical characteristics of the corresponding basins. The main factor influencing the water level in the Amazon basin is the intense rainfall over the region, a consequence of the humid tropical climate. Unfortunately, the Amazon basin is an area lacking water level information, owing to the difficulty of access for local operations. The purpose of this study is to compare and evaluate the Equivalent Water Height (Ewh) from the GRACE (Gravity Recovery And Climate Experiment) mission, in order to study the connection between water loading and the hydrologically induced vertical variations of the crust. To achieve this goal, the Ewh is compared with in-situ information from limnimeters. For the analysis, the correlation coefficients, phase and amplitude of the GRACE Ewh solutions and the in-situ data were computed, as well as the timing of drought periods in different parts of the basin. The results indicated that vertical variations of the lithosphere due to water mass loading can reach 5 to 7 cm per year in the sedimentary and flooded areas of the region, where water level variations can reach 8 to 10 m.
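The comparison described above (a correlation coefficient plus the annual amplitude and phase of two water-storage series) can be sketched on synthetic data. Everything below is illustrative: the series are idealized sinusoids, not actual GRACE or limnimeter records.

```python
import numpy as np

# Synthetic monthly series: a GRACE Ewh anomaly and a lagged gauge level.
t = np.arange(48) / 12.0                      # 4 years, monthly, in years
ewh = 0.30 * np.sin(2 * np.pi * t)            # Ewh anomaly (m), illustrative
gauge = 8.0 * np.sin(2 * np.pi * (t - 1/12))  # gauge anomaly (m), 1-month lag

# Pearson correlation between the two series
r = np.corrcoef(ewh, gauge)[0, 1]

def annual_fit(y, t):
    """Least-squares fit of the annual harmonic; returns (amplitude, phase in rad)."""
    X = np.column_stack([np.cos(2 * np.pi * t),
                         np.sin(2 * np.pi * t),
                         np.ones_like(t)])
    (a, b, _), *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.hypot(a, b), np.arctan2(-b, a)

amp_ewh, ph_ewh = annual_fit(ewh, t)
amp_gauge, ph_gauge = annual_fit(gauge, t)
```

With a one-month lag between two annual sinusoids, the correlation drops to cos(30°) ≈ 0.87 even though the two cycles are physically the same signal, which is why phase is reported separately from the correlation coefficient.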

Relevance:

30.00%

Publisher:

Abstract:

Too Big to Ignore (TBTI; www.toobigtoignore.net) is a research network and knowledge mobilization partnership established to elevate the profile of small-scale fisheries (SSF), to argue against their marginalization in national and international policies, and to develop research and governance capacity to address global fisheries challenges. Network participants and partners are conducting global and comparative analyses, as well as in-depth studies of SSF in the context of local complexity and dynamics, along with a thorough examination of governance challenges, to encourage careful consideration of this sector in local, regional and global policy arenas. Comprising 15 partners and 62 researchers from 27 countries, TBTI conducts activities in five regions of the world. In the Latin America and the Caribbean (LAC) region, we are taking a participatory approach to investigate and promote stewardship and self-governance in SSF, seeking best practices and success stories that could be replicated elsewhere. The region will also focus on promoting sustainable livelihoods in coastal communities. Key activities include workshops and stakeholder meetings, facilitation of policy dialogue and networking, and assessment of local capacity needs and training. Currently, LAC members are putting together publications that examine key issues concerning SSF in the region and best practices, with a first focus on ecosystem stewardship. Other planned deliverables include a comparative analysis, a regional profile of the top research issues on SSF, and a synthesis of SSF knowledge in LAC.

Relevance:

30.00%

Publisher:

Abstract:

Rickettsia rickettsii is an obligate intracellular tick-borne bacterium that causes Rocky Mountain Spotted Fever (RMSF), the most lethal spotted fever rickettsiosis. When an infected starving tick begins blood feeding from a vertebrate host, R. rickettsii is exposed to a temperature elevation and to components in the blood meal. These two environmental stimuli have been previously associated with the reactivation of rickettsial virulence in ticks, but the factors responsible for this phenotype conversion have not been completely elucidated. Using customized oligonucleotide microarrays and high-throughput microfluidic qRT-PCR, we analyzed the effects of a 10 degrees C temperature elevation and of a blood meal on the transcriptional profile of R. rickettsii infecting the tick Amblyomma aureolatum. This is the first study of the transcriptome of a bacterium in the genus Rickettsia infecting a natural tick vector. Although both stimuli significantly increased bacterial load, blood feeding had a greater effect, modulating five-fold more genes than the temperature upshift. Certain components of the Type IV Secretion System (T4SS) were up-regulated by blood feeding. This suggests that this important bacterial transport system may be utilized to secrete effectors during the tick vector's blood meal. Blood feeding also up-regulated the expression of antioxidant enzymes, which might correspond to an attempt by R. rickettsii to protect itself against the deleterious effects of free radicals produced by fed ticks. The modulated genes identified in this study, including those encoding hypothetical proteins, require further functional analysis and may have potential as future targets for vaccine development.

Relevance:

30.00%

Publisher:

Abstract:

Since the first underground nuclear explosion, carried out in 1958, the analysis of the seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying its compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise determination of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. In case a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is supposed to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test. In fact, high-quality seismological systems are thought to be capable of detecting and locating the very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale. The locations obtained by this method are characterized by high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both of the above-mentioned techniques, we have used cross-correlation between digital waveforms in order to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station, at the global scale, and on the similarity between the waveforms of the same event at two different sensors of the tripartite array, at the local scale. After preliminary, simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using the data recorded by the IMS. At first, the algorithm was applied to the differences among the original arrival times of the P phases, so cross-correlation was not used. We found that the considerable geometrical spreading noticeable in the standard locations (namely the locations produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was substantially reduced by the application of our technique. This is what we expected, since the methodology was applied to a sequence of events for which we can assume real closeness among the hypocenters, as they belong to the same seismic structure. Our results point out the main advantage of this methodology: the systematic errors affecting the arrival times have been removed or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (with and without the application of the cross-correlation technique) are very similar to each other. This suggests that cross-correlation did not substantially improve the precision of the manual picks: the picks reported by the IDC are probably good enough to make the random picking error less important than the systematic error on the travel times. As a further explanation for the limited benefit of cross-correlation, it should be noted that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller) and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed a signal interpolation in order to improve the time resolution. The algorithm thus developed was applied to the data collected during an experiment carried out in Israel between 1998 and 1999. The results pointed to the following relevant conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly in poor SNR conditions). Another remarkable feature of our procedure is that it does not take long to process the data, so the user can check the results immediately. During a field survey, this feature will make possible a quasi-real-time check, allowing immediate optimization of the array geometry if suggested by the results at an early stage.
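The core operation used at both scales, measuring a relative delay by cross-correlating two digital waveforms and interpolating the correlation peak for sub-sample resolution, can be sketched as follows. The function name and the synthetic Gaussian wavelets are invented for illustration; they are not the thesis code or data.

```python
import numpy as np

def xcorr_delay(x, y, dt):
    """Return the delay of y relative to x, in seconds, via cross-correlation
    with parabolic interpolation of the peak for sub-sample resolution."""
    n = len(x)
    cc = np.correlate(y - y.mean(), x - x.mean(), mode="full")
    k = int(np.argmax(cc))
    lag = float(k - (n - 1))
    # Parabolic interpolation around the integer-lag maximum.
    if 0 < k < len(cc) - 1:
        c0, c1, c2 = cc[k - 1], cc[k], cc[k + 1]
        denom = c0 - 2 * c1 + c2
        if denom != 0:
            lag += 0.5 * (c0 - c2) / denom
    return lag * dt

dt = 0.01                                    # 100 Hz sampling
t = np.arange(0, 2, dt)
wavelet = np.exp(-((t - 0.50) / 0.05) ** 2)  # Gaussian "phase" at 0.50 s
shifted = np.exp(-((t - 0.53) / 0.05) ** 2)  # same phase arriving 30 ms later

delay = xcorr_delay(wavelet, shifted, dt)
```

This echoes the role of interpolation in the text: the integer-lag correlation peak is limited to the sampling interval, while fitting the peak recovers delays finer than one sample.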

Relevance:

30.00%

Publisher:

Abstract:

This doctoral work gains deeper insight into the dynamics of knowledge flows within and across clusters, unfolding their features, directions and strategic implications. Alliances, networks and personnel mobility are acknowledged as the three main channels of inter-firm knowledge flows, thus offering three heterogeneous measures for analyzing the phenomenon. The interplay between the three channels and the richness of available research methods has allowed for the elaboration of three different papers and perspectives. The common empirical setting is the IT cluster in Bangalore, chosen for its distinctive features as a high-tech cluster and for its steady double-digit yearly growth around the service-based business model. The first paper deploys both a firm-level and a tie-level analysis, exploring the cases of 4 domestic companies and of 2 MNCs active in the cluster, according to a cluster-based perspective. The distinction between business-domain knowledge and technical knowledge emerges from the qualitative evidence and is further confirmed by quantitative analyses at the tie level. At the firm level, the degree of specialization seems to influence the kind of knowledge shared, while at the tie level both the frequency of interaction and the governance mode prove to determine differences in the distribution of knowledge flows. The second paper zooms out and considers inter-firm networks; focusing particularly on the role of the cluster boundary, internal and external networks are analyzed in their size, long-term orientation and degree of exploration. The research method is purely qualitative and allows for the observation of the evolving strategic role of the internal network: from exploitation-based to exploration-based. Moreover, a causal pattern is emphasized, linking the evolution and features of the external network to the evolution and features of the internal network. The final paper addresses the softer and more micro-level side of knowledge flows: personnel mobility.
A social capital perspective is developed here, which considers both the acquisition and the loss of employees as building inter-firm ties, thus enhancing a company's overall social capital. Negative binomial regression analyses at the dyad level test the significant impact of cluster affiliation (cluster firms vs non-cluster firms), industry affiliation (IT firms vs non-IT firms) and foreign affiliation (MNCs vs domestic firms) in shaping the uneven distribution of personnel mobility, and thus of knowledge flows, among companies.

Relevance:

30.00%

Publisher:

Abstract:

The research project presented in this dissertation is about text and memory. The title of the work is "Text and memory between Semiotics and Cognitive Science: an experimental setting about remembering a movie". The object of the research is the relationship between texts, or "textuality" in a more general semiotic term, and memory. The goal is to analyze the link between those semiotic artifacts that a culture defines as autonomous meaningful objects, namely texts, and the cognitive performance of memory that allows us to remember them. An active dialogue between Semiotics and Cognitive Science is the theoretical paradigm in which this research is set; the main intent is to establish a productive alignment between the "theory of text" developed in Semiotics and the "theory of memory" outlined in Cognitive Science. In particular, the research is an attempt to study, as a specific case study, how human subjects remember and/or misremember a film; in semiotics, films are "cinematographic texts". The research is based on the production of a corpus of data gained through the qualitative method of interviewing. After an initial screening of a full-length feature film, each participant in the experiment was interviewed twice, according to a pre-established set of questions: the first interview immediately after the screening, the follow-up interview three months after the screening. The purpose of this design is to elicit two types of recall from the participants. In order to conduct a comparative inquiry, three films were used in the experimental setting. Each film was watched by thirteen subjects, each of whom was interviewed twice. The corpus of data thus consists of seventy-eight interviews. The present dissertation displays the results of the investigation of these interviews. It is divided into six main parts. Chapter one presents a theoretical framework for the two main issues: memory and text.
The issue of memory is introduced through many studies carried out in the fields of Cognitive Science and Neuroscience, and a possible relationship with a semiotic approach is developed at the same time. The theoretical debate about textuality characterizing the field of Semiotics is examined in the same chapter. Chapter two deals with methodology, showing the process of definition of the method used for the production of the corpus of data. The interview is explored in detail: how it arose, what the expected results are, and what the main underlying hypotheses are. Chapter three begins the investigation of the answers given by the spectators. The phenomenon of the outstanding details in the process of remembering is examined, with an attempt to define them in semiotic terms; the most remembered scenes of the movies are also investigated. Chapter four considers how the spectators deal with the narrative as a whole, and examines what they think about the global meaning of the film. Chapter five is about affects: it tries to define the role of emotions in the processes of comprehension and remembering. Chapter six presents a study of how the spectators account for a single scene of the movie. The complete work offers a broad perspective on the semiotic issue of textuality, drawing on both semiotic and cognitive competences. At the same time, it presents a new outlook on the issue of memory, opening several directions for research.

Relevance:

30.00%

Publisher:

Abstract:

The experimental study of the relation established independently by Gerasimov, Drell and Hearn in 1966, known as the GDH sum rule, requires the measurement of total photoabsorption cross sections of circularly polarized photons on longitudinally polarized nucleons over a wide energy range. The measurement carried out at the Mainz Microtron in the summer of 1998 constitutes the first experiment of this kind with real photons for the measurement of the GDH integral on the proton. The use of a frozen-spin butanol target, employed in order to reach the highest possible degree of proton polarization, entails the additional experimental difficulty that the carbon nuclei contained in the butanol target also produce reaction products, which are detected together with those produced on the proton. The aim of this work was the determination of cross sections on the free proton from measurements on a complex target (CH2), such as the one present in the polarized target. The pilot experiments carried out for this purpose served both the development of methods for reaction identification and the calibration of the detector system. By reproducing the already known and measured unpolarized differential and total single-pion cross sections on the proton (gamma p -> p pi0 and gamma p -> n pi+), which constitute the main contribution to the GDH integral up to a photon energy of about 400 MeV, it could be shown that a separation of hydrogen events from carbon events is possible. The techniques necessary for this were developed in the course of this work into a generally usable tool. Furthermore, it could be shown that the fraction of reactions originating from carbon has no helicity dependence. Under this condition, the determination of the helicity-dependent cross-section difference reduces to a simple subtraction. 
From the results of the intensive analysis of data obtained with an unpolarized target, first results for the measurements taken with the polarized frozen-spin target could thus be delivered quickly. It turns out that these first results for polarized differential and total (gamma N) cross sections in the Delta region are in good agreement with theoretical analyses.

Relevance:

30.00%

Publisher:

Abstract:

This thesis is about three major aspects of the identification of top quarks. First comes the understanding of their production mechanism and their decay channels, and how to translate theoretical formulae into programs that can simulate such physical processes using Monte Carlo techniques. In particular, the author has been involved in the introduction of the POWHEG generator into the framework of the ATLAS experiment. POWHEG is now fully used as the benchmark program for the simulation of ttbar pair production and decay, along with MC@NLO and AcerMC; this is shown in chapter one. The second chapter illustrates the ATLAS detector and its sub-units, such as the calorimeters and the muon chambers. It is very important to evaluate their efficiency in order to fully understand what happens during the passage of radiation through the detector, and to use this knowledge in the calculation of final quantities such as the ttbar production cross section. The last part of this thesis concerns the evaluation of this quantity using the so-called "golden channel" of ttbar decays, which yields one energetic charged lepton, four particle jets and a significant amount of missing transverse energy due to the neutrino. The most important systematic errors arising from the various parts of the calculation are studied in detail. The jet energy scale, trigger efficiency, Monte Carlo models, reconstruction algorithms and luminosity measurement are examples of what can contribute to the uncertainty on the cross section.
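A cross-section evaluation in a counting experiment of this kind reduces to sigma = (N_obs - N_bkg) / (efficiency x integrated luminosity). A minimal sketch with invented numbers, not ATLAS results:

```python
import math

def cross_section(n_obs, n_bkg, eff, lumi_inv_pb):
    """Cross section in pb, with a simple Poisson statistical uncertainty.
    n_obs: observed events; n_bkg: estimated background events;
    eff: overall selection efficiency x acceptance;
    lumi_inv_pb: integrated luminosity in pb^-1."""
    norm = eff * lumi_inv_pb
    sigma = (n_obs - n_bkg) / norm
    stat = math.sqrt(n_obs) / norm   # statistical error from sqrt(N) counting
    return sigma, stat

# Illustrative inputs only: 1200 selected events, 300 expected background,
# 5% efficiency, 100 pb^-1 of data.
sigma, stat = cross_section(n_obs=1200, n_bkg=300, eff=0.05, lumi_inv_pb=100.0)
```

The systematic uncertainties listed above (jet energy scale, trigger efficiency, generator models, luminosity) would enter as variations of `eff`, `n_bkg` and `lumi_inv_pb`, each propagated to `sigma` separately.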

Relevance:

30.00%

Publisher:

Abstract:

This PhD thesis addresses the topic of large-scale interactions between climate and marine biogeochemistry. To this end, centennial simulations are performed under present and projected future climate conditions with a coupled ocean-atmosphere model containing a complex marine biogeochemistry model. The role of marine biogeochemistry in the climate system is first investigated. Phytoplankton solar radiation absorption in the upper ocean enhances sea surface temperatures and upper ocean stratification. The associated increase in ocean latent heat losses raises atmospheric temperatures and water vapor. Atmospheric circulation is modified at tropical and extratropical latitudes with impacts on precipitation, incoming solar radiation, and ocean circulation which cause upper-ocean heat content to decrease at tropical latitudes and to increase at middle latitudes. Marine biogeochemistry is tightly related to physical climate variability, which may vary in response to internal natural dynamics or to external forcing such as anthropogenic carbon emissions. Wind changes associated with the North Atlantic Oscillation (NAO), the dominant mode of climate variability in the North Atlantic, affect ocean properties by means of momentum, heat, and freshwater fluxes. Changes in upper ocean temperature and mixing impact the spatial structure and seasonality of North Atlantic phytoplankton through light and nutrient limitations. These changes affect the capability of the North Atlantic Ocean of absorbing atmospheric CO2 and of fixing it inside sinking particulate organic matter. Low-frequency NAO phases determine a delayed response of ocean circulation, temperature and salinity, which in turn affects stratification and marine biogeochemistry. 
In 20th and 21st century simulations, natural wind fluctuations in the North Pacific, related to the two dominant modes of atmospheric variability, affect the spatial structure and the magnitude of the phytoplankton spring bloom through changes in upper-ocean temperature and mixing. The impacts of human-induced emissions in the 21st century are generally larger than natural climate fluctuations, with the phytoplankton spring bloom starting one month earlier than in the 20th century and with ~50% lower magnitude. This PhD thesis advances the knowledge of bio-physical interactions within the global climate, highlighting the intrinsic coupling between physical climate and biosphere, and providing a framework on which future studies of Earth System change can be built.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation comprises applications of quantum chemistry and methodological developments in the area of coupled-cluster theory on the following topics: 1.) The determination of geometry parameters in hydrogen-bonded complexes with picometer accuracy, by coupling NMR experiments with quantum-chemical calculations, is presented for two examples. 2.) The differences between theory and experiment that arise here are discussed. To this end, the vibrational averaging of the dipolar coupling tensor was implemented in order to account for zero-point effects. 3.) A further aspect of the work deals with the structure elucidation of discotic liquid crystals. The quantum-chemical model building and its interplay with experimental methods, above all solid-state NMR, is presented. 4.) Within this work, the parallelization of the quantum chemistry package ACESII was begun. The basic strategy and first results are presented. 5.) For the scaling reduction of the CCSD(T) method through factorization, various decompositions of the energy denominator were tested. A resulting scheme for the calculation of the CCSD(T) energy was implemented. 6.) The elucidation of the mechanism of formation of HSOH from di-tert-butyl sulfoxide is presented. For this purpose, the thermodynamics of the reaction steps was calculated with quantum-chemical methods.

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we describe in detail the Monte Carlo simulation (LVDG4) built to interpret the experimental data collected by LVD and to measure the muon-induced neutron yield in iron and liquid scintillator. A full Monte Carlo simulation, based on the Geant4 (v 9.3) toolkit, has been developed, and validation tests have been performed. We used LVDG4 to determine the active vetoing and shielding power of LVD. The idea was to evaluate the feasibility of hosting a dark matter detector in the innermost part, called the Core Facility (LVD-CF). The first conclusion is that LVD is a good moderator, but the iron supporting structure produces a large number of neutrons near the core. The second conclusion is that if LVD is used as an active veto for muons, the neutron flux in the LVD-CF is reduced by a factor of 50, to the same order of magnitude as the neutron flux in the deepest laboratory in the world, Sudbury. Finally, the muon-induced neutron yield has been measured. In liquid scintillator we found $(3.2 \pm 0.2) \times 10^{-4}$ n/g/cm$^2$, in agreement with previous measurements performed at different depths and with the general trend predicted by theoretical calculations and Monte Carlo simulations. Moreover, we present the first measurement, to our knowledge, of the neutron yield in iron: $(1.9 \pm 0.1) \times 10^{-3}$ n/g/cm$^2$. This measurement provides an important check for MC simulations of neutron production in heavy materials, which are often used as shielding in low-background experiments.
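The normalization implied by the yield units (neutrons per muon per g/cm^2 of traversed material) can be sketched as follows. All input numbers (counts, efficiency, column density) are invented for illustration and are not LVD values.

```python
import math

def neutron_yield(n_detected, n_muons, eff, column_density):
    """Muon-induced neutron yield in n per muon per g/cm^2.
    n_detected: detected neutron candidates; n_muons: tagged muons;
    eff: neutron detection efficiency;
    column_density: average traversed material in g/cm^2.
    Returns (yield, Poisson statistical error)."""
    norm = n_muons * eff * column_density
    y = n_detected / norm
    return y, math.sqrt(n_detected) / norm

# Illustrative inputs: 640 neutrons detected for 1e5 muons, 20% efficiency,
# 100 g/cm^2 of traversed scintillator.
y, err = neutron_yield(n_detected=640, n_muons=1.0e5, eff=0.2,
                       column_density=100.0)
```

A real analysis would add background subtraction and efficiency systematics to the counting error, but the division by muon count, efficiency and column density is the step that puts measurements at different depths and in different materials on a common scale.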