Abstract:
Hisonotus bockmanni, new species, is described based on specimens collected in a sandbank in the Rio Cururu, a tributary of the Rio Teles Pires, one of the rivers forming the Rio Tapajos in the Amazon Basin. The new taxon is distinguished from its congeners by a unique color pattern, whose most striking features are: two elliptical white spots anterior to the nostrils; a darkly pigmented predorsal region with five unpigmented spots arranged as an anteriorly pointed chevron; and a rostrocaudally elongate cross along most of the caudal peduncle. The placement of the new species in Hisonotus, as well as its possible affinities within that genus, is discussed in light of the current knowledge of the phylogenetic relationships among the Hypoptopomatinae.
Abstract:
Adult stem cells are distributed throughout the whole organism and hold great potential for the therapy of different types of disease. For the design of efficient therapeutic strategies, it is important to have a more detailed understanding of their basic biological characteristics, as well as of the signals produced by damaged tissues to which they respond. Myocardial infarction (MI), a disease caused by lack of blood supply to the heart, represents the most common cause of morbidity and mortality in the Western world. Stem cell therapy arises as a promising alternative to conventional treatments, which are often ineffective in preventing loss of cardiomyocytes and fibrosis. Cell therapy protocols must take into account the molecular events that occur in the regenerative niche of MI. In the present study, we investigated the expression profile of ten genes coding for chemokines or cytokines in a murine model of MI, aiming at the characterization of the regenerative niche. MI was induced in adult C57BL/6 mice and heart samples were collected after 24 h and 30 days, as well as from control animals, for quantitative RT-PCR. Expression of the chemokine genes CCL2, CCL3, CCL4, CCL7, CXCL2 and CXCL10 was significantly increased 24 h after infarction, returning to baseline levels on day 30. Expression of the CCL8 gene significantly increased only on day 30, whereas gene expression of CXCL12 and CX3CL1 was not significantly increased in either ischemic period. Finally, expression of the IL-6 gene increased 24 h after infarction and was maintained at a significantly higher level than in control samples 30 days later. These results contribute to a better knowledge of the regenerative niche in MI, allowing a more efficient selection or genetic manipulation of cells in therapeutic protocols.
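The abstract does not state the quantification scheme used for the qRT-PCR comparisons; a common choice for relative expression is the 2^(-ΔΔCt) method, sketched below with invented Ct values purely for illustration.

```python
# Illustrative 2^(-ddCt) fold-change calculation (Livak method), a common way to
# quantify relative gene expression from qRT-PCR Ct values. The Ct numbers below
# are invented for illustration, not data from the study.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Relative expression of a target gene vs. a reference gene,
    in a treated sample vs. an untreated control."""
    d_ct_sample = ct_target_sample - ct_ref_sample    # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                # compare sample to control
    return 2 ** (-dd_ct)                              # PCR doubles product each cycle

# e.g. a hypothetical chemokine whose Ct drops by 3 cycles relative to control
# corresponds to an ~8-fold up-regulation
print(fold_change(22.0, 18.0, 25.0, 18.0))  # 8.0
```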
Abstract:
Background The generalized odds ratio (GOR) was recently suggested as a genetic model-free measure for association studies. However, its properties have not been extensively investigated. We used Monte Carlo simulations to investigate type-I error rates, power and bias in both effect-size and between-study variance estimates of meta-analyses using the GOR as a summary effect, and compared these results to those obtained by the usual approaches of model specification. We further applied the GOR in a real meta-analysis of three genome-wide association studies in Alzheimer's disease. Findings For bi-allelic polymorphisms, the GOR performs virtually identically to a standard multiplicative model of analysis (e.g. per-allele odds ratio) for variants acting multiplicatively, but slightly increases the power to detect variants with a dominant mode of action, while reducing the probability of detecting recessive variants. Although there were differences between the GOR and the usual approaches in terms of bias and type-I error rates, both the simulation- and real-data-based results provided little indication that these differences will be substantial in practice for meta-analyses involving bi-allelic polymorphisms. However, the use of the GOR may be slightly more powerful for the synthesis of data from tri-allelic variants, particularly when susceptibility alleles are less common in the populations (≤10%). This gain in power may depend on knowledge of the direction of the effects. Conclusions For the synthesis of data from bi-allelic variants, the GOR may be regarded as a multiplicative-like model of analysis. The use of the GOR may be slightly more powerful in the tri-allelic case, particularly when susceptibility alleles are less common in the populations.
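The genetic model-free idea behind the GOR can be sketched numerically: it is the odds that a randomly drawn case carries a higher-risk genotype than a randomly drawn control. The counts below are invented, and the simple ratio shown omits the variance estimation used in an actual meta-analysis.

```python
# A minimal sketch of the generalized odds ratio (GOR) for a single marker:
# P(case genotype > control genotype) / P(case genotype < control genotype),
# with genotypes ordered from low- to high-risk. Counts are invented.

def generalized_odds_ratio(cases, controls):
    """cases/controls: genotype counts ordered from low- to high-risk,
    e.g. (aa, Aa, AA) for a bi-allelic marker."""
    n_cases, n_controls = sum(cases), sum(controls)
    p_case = [c / n_cases for c in cases]
    p_ctrl = [c / n_controls for c in controls]
    p_greater = sum(p_case[i] * p_ctrl[j]
                    for i in range(len(cases)) for j in range(len(controls)) if i > j)
    p_less = sum(p_case[i] * p_ctrl[j]
                 for i in range(len(cases)) for j in range(len(controls)) if i < j)
    return p_greater / p_less

# Cases enriched in the high-risk genotype -> GOR > 1
print(generalized_odds_ratio(cases=(30, 50, 20), controls=(50, 40, 10)))
```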
Abstract:
Persistent organic pollutants (POPs) are a group of chemicals that are toxic, undergo long-range transport and accumulate in biota. Due to their persistence, their distribution and recirculation in the environment often continue for a long period of time. They thereby appear virtually everywhere within the biosphere and pose a toxic stress to living organisms. In this thesis, attempts are made to contribute to the understanding of factors that influence the distribution of POPs, with a focus on processes in the marine environment. Bioavailability and spatial distribution are central topics for the environmental risk management of POPs. In order to study these topics, various field studies were undertaken. To determine the bioavailable fraction of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), polychlorinated naphthalenes (PCNs), and polychlorinated biphenyls (PCBs), the aqueous dissolved phase was sampled and analysed. In the same samples, we also measured how much of these POPs was associated with suspended particles. Different models predicting the phase distribution of these POPs were then evaluated. It was found that important water characteristics influencing the solid-water phase distribution of POPs were particulate organic matter (POM), particulate soot carbon (PSC), and dissolved organic matter (DOM). The bioavailable dissolved POP phase in the water was lower when these sorbing phases were present. Furthermore, sediments were sampled and the spatial distribution of the POPs was examined. The results showed that the concentrations of PCDD/Fs and PCNs were better described using the PSC content than the POM content of the sediment. In parallel with these field studies, we synthesized knowledge of the processes affecting the distribution of POPs in a multimedia mass balance model. This model predicted concentrations of PCDD/Fs throughout our study area, the Grenlandsfjords in Norway, within a factor of ten.
This makes the model capable of evaluating the effect of suitable remedial actions intended to decrease the exposure of biota to these POPs in the Grenlandsfjords, which was the aim of the project. In addition, to evaluate the influence of eutrophication on the marine occurrence of POPs, PCB data from the US Musselwatch and Benthic Surveillance Programs are examined in this thesis. The dry-weight-based concentrations of PCBs in bivalves were found to correlate positively with the organic matter content of nearby sediments, whereas the organic-matter-based concentrations of PCBs in sediments were negatively correlated with the organic matter content of the sediment.
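The solid-water phase-distribution idea described above can be sketched with a simple three-phase sorption model: the truly dissolved (bioavailable) fraction shrinks as sorbing phases are added. The partition coefficients and phase concentrations below are invented round numbers, not values from the thesis.

```python
# Sketch of a three-phase solid-water partitioning model: the dissolved fraction
# of a POP given sorption to POM, particulate soot carbon (PSC) and DOM.
# All coefficients and concentrations here are invented for illustration.

def dissolved_fraction(k_pom, c_pom, k_psc, c_psc, k_dom, c_dom):
    """Fraction of total POP truly dissolved, given partition coefficients
    (L/kg) and sorbent concentrations (kg/L) for POM, soot carbon and DOM."""
    return 1.0 / (1.0 + k_pom * c_pom + k_psc * c_psc + k_dom * c_dom)

# With no sorbents, everything is dissolved; adding POM/PSC/DOM lowers the
# bioavailable fraction, as observed in the field samples.
print(dissolved_fraction(1e5, 0.0, 1e6, 0.0, 1e4, 0.0))      # 1.0
print(dissolved_fraction(1e5, 1e-6, 1e6, 1e-7, 1e4, 1e-6))   # < 1.0
```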
Abstract:
Thanks to the Chandra and XMM–Newton surveys, the hard X-ray sky is now probed down to a flux limit where the bulk of the X-ray background is almost completely resolved into discrete sources, at least in the 2–8 keV band. Extensive programs of multiwavelength follow-up observations showed that the large majority of hard X-ray selected sources are identified with Active Galactic Nuclei (AGN) spanning a broad range of redshifts, luminosities and optical properties. A sizable fraction of relatively luminous X-ray sources hosting an active, presumably obscured, nucleus would not have been easily recognized as such on the basis of optical observations, because they are characterized by "peculiar" optical properties. In my PhD thesis, I focus on the nature of two classes of hard X-ray selected "elusive" sources: those characterized by high X-ray-to-optical flux ratios and red optical-to-near-infrared colors, a fraction of which are associated with Type 2 quasars, and the X-ray bright optically normal galaxies, also known as XBONGs. In order to characterize the properties of these classes of elusive AGN, the datasets of several deep and large-area surveys have been fully exploited. The first class of "elusive" sources is characterized by X-ray-to-optical flux ratios (X/O) significantly higher than what is generally observed from unobscured quasars and Seyfert galaxies. The properties of well-defined samples of high-X/O sources detected at bright X-ray fluxes suggest that X/O selection is highly efficient in sampling high-redshift obscured quasars. At the limits of deep Chandra surveys (∼10^-16 erg cm^-2 s^-1), high-X/O sources are generally characterized by extremely faint optical magnitudes, hence their spectroscopic identification is hardly feasible even with the largest telescopes. In this framework, a detailed investigation of their X-ray properties may provide useful information on the nature of this important component of the X-ray source population.
The X-ray data of the deepest X-ray observations ever performed, the Chandra deep fields, allow us to characterize the average X-ray properties of the high-X/O population. The results of the spectral analysis clearly indicate that the high-X/O sources represent the most obscured component of the X-ray background. Their spectra are harder (Γ ∼ 1) than those of any other class of sources in the deep fields, and also than the XRB spectrum (Γ ≈ 1.4). In order to better understand AGN physics and evolution, a much better knowledge of the redshift, luminosity and spectral energy distributions (SEDs) of elusive AGN is of paramount importance. The recent COSMOS survey provides the necessary multiwavelength database to characterize the SEDs of a statistically robust sample of obscured sources. The combination of high X/O and red colors offers a powerful tool to select obscured luminous objects at high redshift. A large sample of X-ray emitting extremely red objects (R−K > 5) has been collected and their optical-infrared properties have been studied. In particular, using an appropriate SED-fitting procedure, the nuclear and the host galaxy components have been deconvolved over a large range of wavelengths, and optical nuclear extinctions, black hole masses and Eddington ratios have been estimated. It is important to remark that the combination of hard X-ray selection and extremely red colors is highly efficient in picking up highly obscured, luminous sources at high redshift. Although the XBONGs do not represent a new source population, interest in the nature of these sources has been renewed after the discovery of several examples in recent Chandra and XMM–Newton surveys. Even though several possibilities were proposed in the recent literature to explain why a relatively luminous (L_X = 10^42 − 10^43 erg s^-1) hard X-ray source does not leave any significant signature of its presence in terms of optical emission lines, the very nature of XBONGs is still a subject of debate.
Good-quality near-infrared photometric data (ISAAC/VLT) of 4 low-redshift XBONGs from the HELLAS2XMM survey have been used to search for the presence of the putative nucleus, applying the surface-brightness decomposition technique. In two of the four sources, the presence of a weak nuclear component hosted by a bright galaxy has been revealed. The results indicate that moderate amounts of gas and dust, covering a large solid angle (possibly 4π) at the nuclear source, may explain the lack of optical emission lines. A weak nucleus unable to produce sufficient UV photons may provide an alternative or additional explanation. On the basis of an admittedly small sample, we conclude that XBONGs constitute a mixed bag rather than a new source population. When the presence of a nucleus is revealed, it turns out to be mildly absorbed and hosted by a bright galaxy.
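The X/O selection used above is commonly written in a log form combining the X-ray flux with an optical magnitude; the band zero point varies with the survey, so the constant C = 5.5 below (an R-band convention) and all fluxes/magnitudes are assumptions for illustration only.

```python
import math

# Hedged sketch of the X-ray-to-optical flux ratio (X/O) selection:
# X/O = log10(f_X) + m_R/2.5 + C, where C encodes the optical band zero point.
# C = 5.5 is an assumed R-band value; fluxes and magnitudes are invented.

def x_over_o(fx_cgs, r_mag, zero_point=5.5):
    """log10 of the X-ray-to-optical flux ratio for an X-ray flux in
    erg cm^-2 s^-1 and an R-band magnitude."""
    return math.log10(fx_cgs) + r_mag / 2.5 + zero_point

# A faint optical counterpart of a moderately bright X-ray source lands in the
# high-X/O regime (X/O > 1), beyond typical unobscured AGN (-1 < X/O < 1).
print(x_over_o(1e-14, 24.0))
```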
Abstract:
The g-factor is a constant which connects the magnetic moment $\vec{\mu}$ of a charged particle, of charge q and mass m, with its angular momentum $\vec{J}$. Thus, the magnetic moment can be written $\vec{\mu}_J = g_J \frac{q}{2m} \vec{J}$. The g-factor of a free particle of spin s=1/2 should take the value g=2. Due to quantum-electrodynamical effects, however, it deviates from this value by a small amount, the so-called g-factor anomaly $a_e$, which is of the order of $10^{-3}$ for the free electron. This deviation is even bigger if the electron is exposed to high electric fields. Therefore highly charged ions, where the electric field strength reaches values of the order of $10^{13}$–$10^{16}$ V/cm at the position of the bound electron, are an interesting field of investigation for testing QED calculations. In previous experiments [Häf00, Ver04] using a single hydrogen-like ion confined in a Penning trap, a relative accuracy of a few parts in $10^{9}$ was obtained. In the present work a new method for the precise measurement of the electronic g-factor of hydrogen-like ions is discussed. Due to the unavoidable magnetic field inhomogeneity in a Penning trap, a very important contribution to the systematic uncertainty in the previous measurements arose from the elevated energy of the ion required for the measurement of its motional frequencies; it was then necessary to extrapolate the result to vanishing energies. In the new method the energy in the cyclotron degree of freedom is reduced to the minimum attainable energy. The method consists in measuring the reduced cyclotron frequency $\nu_{+}$ indirectly, by coupling the axial to the reduced cyclotron motion through irradiation at the radio frequency $\nu_{\mathrm{coup}} = \nu_{+} - \nu_{\mathrm{ax}} + \delta$, where $\delta$ is, in principle, an unknown detuning that can be obtained from knowledge of the coupling process. The only unknown parameter is then the desired value of $\nu_{+}$.
As a test, a measurement with artificially increased axial energy (for simplicity) was performed, yielding the result $g_{\mathrm{exp}} = 2.000\,047\,020\,8\,(24)(44)$. This is in perfect agreement with both the theoretical result $g_{\mathrm{theo}} = 2.000\,047\,020\,2\,(6)$ and the previous experimental result $g_{\mathrm{exp1}} = 2.000\,047\,025\,4\,(15)(44)$. In the experimental results, the second error bar is due to the uncertainty in the accepted value of the electron's mass. Thus, with the new method, a higher accuracy in the g-factor could, by comparison with the theoretical value, lead to an improved value of the electron's mass. [Häf00] H. Häffner et al., Phys. Rev. Lett. 85 (2000) 5308; [Ver04] J. Verdú et al., Phys. Rev. Lett. 92 (2004) 093002-1
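The frequency bookkeeping behind the new method can be sketched numerically: the coupling relation above gives ν₊ from the measured ν_coup, and the free cyclotron frequency then follows from the Brown–Gabrielse invariance theorem. All frequency values below are invented round numbers, not measurement data.

```python
import math

# Sketch of the indirect determination of the reduced cyclotron frequency:
# nu_coup = nu_+ - nu_ax + delta  =>  nu_+ = nu_coup - delta + nu_ax.
# The free cyclotron frequency follows from the invariance theorem,
# nu_c^2 = nu_+^2 + nu_ax^2 + nu_-^2. All values here are invented.

def nu_plus(nu_coup, nu_ax, delta):
    """Reduced cyclotron frequency from the sideband-coupling relation."""
    return nu_coup - delta + nu_ax

def nu_c(nu_p, nu_ax, nu_minus):
    """Free cyclotron frequency from the invariance theorem
    (robust against small trap imperfections)."""
    return math.sqrt(nu_p**2 + nu_ax**2 + nu_minus**2)

nup = nu_plus(nu_coup=23.499e6, nu_ax=0.9e6, delta=-1.0e3)  # Hz, toy values
print(nu_c(nup, 0.9e6, 17.0e3))                             # slightly above nu_+
```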
Abstract:
The term "Brain Imaging" identifies a set of techniques to analyze the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are widely used in the study of brain activity. In addition to clinical usage, analysis of brain activity is gaining popularity in other recent fields, e.g. Brain-Computer Interfaces (BCI) and the study of cognitive processes. In this context, the use of classical solutions (e.g. fMRI, PET-CT) can be unfeasible, due to their low temporal resolution, high cost and limited portability. For these reasons, alternative low-cost techniques are the object of research, typically based on simple recording hardware and on an intensive data elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG the potentials are directly generated by neuronal activity, while in EIT they result from the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from the measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body, obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a trade-off between physical accuracy and technical feasibility which currently severely limits the capabilities of these techniques. Moreover, the elaboration of the recorded data requires computationally intensive regularization techniques, which affects applications with tight temporal constraints (such as BCI). This work focuses on the parallel implementation of a workflow for EEG and EIT data processing.
The resulting software is accelerated using multi-core GPUs, in order to provide solutions in reasonable times and to meet the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
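The regularization step mentioned above can be illustrated in its simplest form, Tikhonov regularization of an ill-posed linear inverse problem; the matrix A and data b below are random toys standing in for a head model, not anything from the thesis.

```python
import numpy as np

# Minimal sketch of regularized inversion as used in EEG/EIT reconstruction:
# the underdetermined problem A x = b is stabilized with a Tikhonov penalty.
# A, b and lambda are toy values, not a head model.

def tikhonov(A, b, lam):
    """Solve min ||Ax - b||^2 + lam ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))          # 20 measurements, 50 unknowns
x_true = np.zeros(50)
x_true[[5, 17]] = 1.0                      # two active "sources"
b = A @ x_true + 0.01 * rng.standard_normal(20)
x_hat = tikhonov(A, b, lam=0.1)
print(np.linalg.norm(A @ x_hat - b))       # small residual despite ill-posedness
```

In practice the same normal-equation solve (or an iterative equivalent) is what gets offloaded to the GPU, since it is the computational bottleneck for large, finely meshed head models.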
Abstract:
Several coralligenous reefs occur on the soft bottoms of the northern Adriatic continental shelf. Mediterranean coralligenous habitats are characterised by high species diversity and are intrinsically valuable for their biological diversity and for the ecological processes they support. The conservation and management of these habitats require quantifying the spatial and temporal variability of their benthic assemblages. This PhD thesis aims to give a relevant contribution to the knowledge of the structure and dynamics of the epibenthic assemblages on the coralligenous subtidal reefs of the northern Adriatic Sea. The epibenthic assemblages showed spatial variation larger than their temporal changes, with a temporal persistence of reef-forming organisms. The spatial heterogeneity of the assemblages was related to the morphological features and geographical location of the reefs, together with variation in the hydrological conditions. Manipulative experiments help to understand the ecological processes structuring benthic assemblages and maintaining their diversity. In this regard, short- and long-term experiments on the colonization of artificial substrata over a 3-year period were performed in three reefs, corresponding to the three main types of assemblages detected in the previous study. The first colonisers, depending largely on the different larval supply, played a key role in determining the heterogeneity of the assemblages in the early stage of colonisation. Lateral invasion from the surrounding assemblages was the driver structuring the mature assemblages. These complex colonisation dynamics explain the high heterogeneity of the assemblages dwelling on the northern Adriatic biogenic reefs. The build-up of these coralligenous reefs depends mainly on bioconstruction-erosion processes, which were analysed through a field experiment.
Bioconstruction, largely due to serpulid polychaetes, prevailed over erosion processes and occurred at similar rates at all sites. Similarly, the total energy content of the benthic communities did not differ among sites, despite being provided by different species. Therefore, we can hypothesise that both bioconstruction processes and energy storage may be limited by the availability of resources. Finally, the major contribution of the zoobenthos, compared to the phytobenthos, to the total energy content of the assemblages suggests that the energy flow in these benthic habitats is primarily supported by the planktonic food web through the filter-feeding invertebrates.
Abstract:
In this thesis we describe in detail the Monte Carlo simulation (LVDG4) built to interpret the experimental data collected by LVD and to measure the muon-induced neutron yield in iron and liquid scintillator. A full Monte Carlo simulation, based on the Geant4 (v9.3) toolkit, has been developed and validation tests have been performed. We used LVDG4 to determine the active vetoing and shielding power of LVD. The idea was to evaluate the feasibility of hosting a dark matter detector in the innermost part, called the Core Facility (LVD-CF). The first conclusion is that LVD is a good moderator, but the iron supporting structure produces a large number of neutrons near the core. The second conclusion is that if LVD is used as an active muon veto, the neutron flux in the LVD-CF is reduced by a factor of 50, to the same order of magnitude as the neutron flux in the deepest laboratory in the world, Sudbury. Finally, the muon-induced neutron yield has been measured. In liquid scintillator we found $(3.2 \pm 0.2) \times 10^{-4}$ n/g/cm$^2$, in agreement with previous measurements performed at different depths and with the general trend predicted by theoretical calculations and Monte Carlo simulations. Moreover, we present the first measurement, to our knowledge, of the neutron yield in iron: $(1.9 \pm 0.1) \times 10^{-3}$ n/g/cm$^2$. This measurement provides an important check for the MC modeling of neutron production in heavy materials, which are often used as shields in low-background experiments.
Abstract:
The objectives of this thesis are to develop new methodologies for the study of the space and time variability of the Italian upper-ocean ecosystem through the combined use of multi-sensor satellite data and in situ observations, and to identify the capabilities and limits of remote sensing observations for monitoring the marine state at short and long time scales. Three oceanographic basins were selected and subjected to different types of analyses. The first region is the Tyrrhenian Sea, where a comparative analysis of altimetry and Lagrangian measurements was carried out to study the surface circulation. The results allowed us to deepen the knowledge of the Tyrrhenian Sea surface dynamics and its variability and to define the limitations of satellite altimetry measurements in detecting small-scale marine circulation features. The Channel of Sicily study aimed to identify the spatial-temporal variability of phytoplankton biomass and to understand the impact of the upper-ocean circulation on the marine ecosystem. A combined analysis of long-term satellite time series of chlorophyll, Sea Surface Temperature and Sea Level data was applied. The results identified the key role of the Atlantic water inflow in modulating the seasonal variability of phytoplankton biomass in the region. Finally, the Italian coastal marine system was studied with the objective of exploring the potential of Ocean Color data for detecting chlorophyll trends in coastal areas. The most appropriate methodology to detect long-term environmental changes was defined through intercomparison of chlorophyll trends detected by in situ and satellite data. Italian coastal areas subject to eutrophication problems were then identified. This work has demonstrated that satellite data constitute a unique opportunity to define the features and forcings influencing upper-ocean ecosystem dynamics, and can also be used to monitor environmental variables capable of influencing phytoplankton productivity.
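The simplest form of the trend detection mentioned above is an ordinary least-squares slope fitted to a chlorophyll time series; the abstract does not specify the estimator used, so the synthetic annual series below is only a stand-in for a satellite record.

```python
import numpy as np

# Hedged sketch of chlorophyll trend detection: an ordinary least-squares
# slope per year fitted to a time series. The series below is synthetic,
# with a weak imposed increase, not data from the thesis.

def linear_trend(t_years, chl):
    """Slope (e.g. mg m^-3 per year) of a least-squares line through the series."""
    slope, _intercept = np.polyfit(t_years, chl, 1)
    return slope

t = np.arange(1998, 2013)            # 15 years of annual means
chl = 0.5 + 0.01 * (t - 1998)        # imposed trend of 0.01 per year
print(linear_trend(t, chl))          # recovers the imposed slope
```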
Abstract:
The primary goal of volcanological studies is to reconstruct the eruptive history of active volcanoes, by correlating and dating volcanic deposits, in order to depict future scenarios and determine the volcanic hazard of an area. However, alternative methods are necessary where the lack of outcrops and the variability and discontinuity of the deposits make correlation difficult, and where suitable materials for accurate dating are lacking. In this thesis, paleomagnetism (a branch of geophysics studying the remanent magnetization preserved in rocks) is used as a correlation and dating tool. The correlation is based on the assumption that coeval rocks record similar paleomagnetic directions; the dating relies upon the comparison of the paleomagnetic directions recorded by rocks with the values expected from reference Paleo-Secular Variation (PSV) curves, which describe the variation of the geomagnetic field through time. I first used paleomagnetism to refine the knowledge of the pre-50 ka geologic history of Pantelleria island (Strait of Sicily, Italy), by correlating five ignimbrites and two breccia deposits emplaced during that period. Since the use of paleomagnetic dating is limited by the availability of PSV curves for the studied area, I then recovered both paleomagnetic directions and intensities (using a modified Thellier method) from radiocarbon-dated lava flows on São Miguel (Azores Islands, Portugal), reconstructing the first PSV reference curve for the Atlantic Ocean for the last 3 ka. Afterwards, I applied paleomagnetism to unravel the chronology and characteristics of Holocene volcanic activity on Faial (Azores), where geochronological age constraints are lacking.
I correlated scoria cones and lava flows produced by the same eruptions on the Capelo Peninsula and dated the eruptive events (by comparing paleomagnetic directions with PSV curves from France and the United Kingdom), finding that the volcanics exposed on the Capelo Peninsula are younger than previously believed, and entirely confined to the last 4 ka.
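The basic comparison behind this kind of correlation and dating is the angular distance between a measured paleomagnetic direction (declination, inclination) and a reference PSV-curve direction; the example directions below are invented for illustration.

```python
import math

# Sketch of comparing paleomagnetic directions: the great-circle angle between
# two (declination, inclination) pairs. Directions here are invented; a real
# comparison would also account for the alpha-95 confidence cones.

def angular_distance(dec1, inc1, dec2, inc2):
    """Great-circle angle (degrees) between two paleomagnetic directions."""
    def to_xyz(dec, inc):
        d, i = math.radians(dec), math.radians(inc)
        return (math.cos(i) * math.cos(d), math.cos(i) * math.sin(d), math.sin(i))
    dot = sum(a * b for a, b in zip(to_xyz(dec1, inc1), to_xyz(dec2, inc2)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

# Identical directions -> ~0; a small secular-variation offset -> a few degrees
print(angular_distance(10.0, 55.0, 10.0, 55.0))
print(angular_distance(10.0, 55.0, 14.0, 57.0))
```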
Abstract:
The electromagnetic form factors of the proton are fundamental quantities sensitive to the distribution of charge and magnetization inside the proton. Precise knowledge of the form factors, in particular of the charge and magnetization radii, provides strong tests for theory in the non-perturbative regime of QCD. However, the existing data at Q^2 below 1 (GeV/c)^2 are not precise enough for a hard test of theoretical predictions.

For a more precise determination of the form factors, within this work more than 1400 cross sections of the reaction H(e,e′)p were measured at the Mainz Microtron MAMI using the 3-spectrometer facility of the A1 collaboration. The data were taken in three periods in the years 2006 and 2007 using beam energies of 180, 315, 450, 585, 720 and 855 MeV. They cover the Q^2 region from 0.004 to 1 (GeV/c)^2 with counting-rate uncertainties below 0.2% for most of the data points. The relative luminosity of the measurements was determined using one of the spectrometers as a luminosity monitor. The overlapping acceptances of the measurements maximize the internal redundancy of the data and allow, together with several additions to the standard experimental setup, for tight control of systematic uncertainties.

To account for the radiative processes, an event generator was developed and implemented in the simulation package of the analysis software; it works without peaking approximation by explicitly calculating the Bethe-Heitler and Born Feynman diagrams for each event.

To separate the form factors and to determine the radii, the data were analyzed by fitting a wide selection of form factor models directly to the measured cross sections. These fits also determined the absolute normalization of the different data subsets. The validity of this method was tested with extensive simulations.
The results were compared to an extraction via the standard Rosenbluth technique.

The dip structure in G_E that was seen in the analysis of the previous world data shows up in a modified form. When compared to the standard-dipole form factor as a smooth curve, the extracted G_E exhibits a strong change of slope around 0.1 (GeV/c)^2, and in the magnetic form factor a dip around 0.2 (GeV/c)^2 is found. This may be taken as an indication of a pion cloud. For higher Q^2, the fits yield larger values for G_M than previous measurements, in agreement with form factor ratios from recent precise polarized measurements in the Q^2 region up to 0.6 (GeV/c)^2.

The charge and magnetic rms radii are determined as
⟨r_e⟩ = 0.879 ± 0.005(stat.) ± 0.004(syst.) ± 0.002(model) ± 0.004(group) fm,
⟨r_m⟩ = 0.777 ± 0.013(stat.) ± 0.009(syst.) ± 0.005(model) ± 0.002(group) fm.
This charge radius is significantly larger than theoretical predictions and than the radius of the standard dipole. However, it is in agreement with earlier results measured at the Mainz linear accelerator and with determinations from hydrogen Lamb shift measurements. The extracted magnetic radius is smaller than previous determinations and than the standard-dipole value.
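The standard-dipole benchmark against which the quoted charge radius is compared follows from the textbook relation r² = −6 dG_E/dQ²|₀; for the standard dipole G(Q²) = (1 + Q²/0.71 GeV²)⁻² this gives the ~0.811 fm value that the measured 0.879 fm exceeds.

```python
import math

# Radius of a dipole form factor G(Q^2) = (1 + Q^2/lambda2)^-2:
# dG/dQ^2 at Q^2 = 0 is -2/lambda2, so r^2 = -6 dG/dQ^2|_0 = 12/lambda2,
# converted to fm with hbar*c. lambda2 = 0.71 GeV^2 is the standard dipole.

HBARC = 0.1973269804   # GeV fm

def dipole_radius(lambda2_gev2=0.71):
    """rms radius (fm) implied by a dipole form factor."""
    return math.sqrt(12.0 / lambda2_gev2) * HBARC

print(dipole_radius())   # ~0.811 fm, below the measured 0.879 fm
```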
Abstract:
Precision measurements of observables in neutron beta decay address important open questions of particle physics and cosmology. In this thesis, a measurement of the proton recoil spectrum with the spectrometer aSPECT is described. From this spectrum the antineutrino-electron angular correlation coefficient a can be derived. In our first beam time at the FRM II in Munich, background instabilities prevented us from presenting a new value for a. In the latest beam time at the ILL in Grenoble, the background was reduced sufficiently. During the data analysis, we identified and fixed a problem in the detector electronics which caused a significant systematic error. The aim of the latest beam time was a new value for a with an error well below the 4% uncertainty of the present literature value. A statistical accuracy of about 1.4% was reached, but we could only set upper limits on the correction for the problem in the detector electronics, too high to extract a meaningful result. This thesis therefore focused on the investigation of different systematic effects. With the knowledge of the systematics gained in this thesis, we are able to improve aSPECT to perform a 1% measurement of a in a further beam time.
Abstract:
In this thesis the measurement of the effective weak mixing angle wma in proton-proton collisions is described. The results are extracted from the forward-backward asymmetry (AFB) in electron-positron final states at the ATLAS experiment at the LHC. The AFB is defined upon the distribution of the polar angle between the incoming quark and the outgoing lepton. The signal process used in this study is the reaction pp to zgamma + X to ee + X, taking a total integrated luminosity of 4.8 fb^(-1) of data into account. The data were recorded at a proton-proton center-of-mass energy of sqrt(s) = 7 TeV. The weak mixing angle is a central parameter of the electroweak theory of the Standard Model (SM) and relates the neutral-current interactions of the electromagnetic and weak forces. The higher-order corrections to wma are related to other SM parameters like the mass of the Higgs boson.

Because of the symmetric initial state of the colliding protons, there is no favoured forward or backward direction in the experimental setup. The reference axis used in the definition of the polar angle is therefore chosen with respect to the longitudinal boost of the electron-positron final state. As a consequence, events with low absolute rapidity have a higher chance of being assigned to the opposite direction of the reference axis. This effect, called dilution, is reduced when events at higher rapidities are used. It can be studied by including electrons and positrons in the forward regions of the ATLAS calorimeters. Electrons and positrons are further referred to as electrons. To include the electrons from the forward region, the energy calibration for the forward calorimeters had to be redone. This calibration is performed by inter-calibrating the forward electron energy scale using pairs of a central and a forward electron and the previously derived central electron energy calibration.
The uncertainty is shown to be dominated by the systematic variations.

The extraction of wma is performed using chi^2 tests, comparing the measured distribution of AFB in data to a set of template distributions with varied values of wma. The templates are built in a forward-folding technique using modified generator-level samples and the official fully simulated signal sample with full detector simulation and particle reconstruction and identification. The analysis is performed in two different channels: pairs of central electrons, or one central and one forward electron. The results of the two channels are in good agreement and are the first measurements of wma at the Z resonance using electron final states in proton-proton collisions at sqrt(s) = 7 TeV. The precision of the measurement is already systematically limited, mostly by the uncertainties resulting from the knowledge of the parton distribution functions (PDFs) and the systematic uncertainties of the energy calibration.

The extracted results of wma are combined and yield a value of wma_comb = 0.2288 +- 0.0004 (stat.) +- 0.0009 (syst.) = 0.2288 +- 0.0010 (tot.). The measurements are compared to the results of previous measurements at the Z boson resonance. The deviation with respect to the combined result provided by the LEP and SLC experiments is up to 2.7 standard deviations.
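The template chi^2 extraction described above can be sketched with a toy model: per-bin AFB values are compared to templates generated at trial values of the mixing angle, and the minimum of the chi^2 curve gives the estimate. The linear AFB dependence, sensitivities and uncertainties below are all synthetic, not ATLAS templates.

```python
import numpy as np

# Toy version of a template chi^2 fit: scan trial values of sin^2(theta_eff),
# build an AFB template for each, and locate the chi^2 minimum. All numbers
# here are invented for illustration.

def chi2(measured, template, sigma):
    return np.sum(((measured - template) / sigma) ** 2)

def afb_template(sin2theta, sensitivity):
    """Toy linear dependence of per-bin AFB on sin^2(theta_eff)."""
    return 0.1 - sensitivity * (sin2theta - 0.23)

sensitivity = np.array([1.0, 1.5, 2.0, 2.5])   # per-bin sensitivity (toy)
sigma = np.full(4, 0.002)                      # per-bin uncertainty (toy)
measured = afb_template(0.2288, sensitivity)   # pretend data at the true value

trial = np.linspace(0.225, 0.233, 81)
chi2s = [chi2(measured, afb_template(s, sensitivity), sigma) for s in trial]
print(trial[int(np.argmin(chi2s))])            # recovers ~0.2288
```

In the real analysis the templates come from forward-folded, fully simulated samples rather than an analytic formula, and the chi^2 curve is interpolated around the minimum to quote the statistical uncertainty.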
Abstract:
The association of several favorable factors has resulted in the development of a wide barchan dune field that stands out as a fundamental element of the coastal landscape of southern Santa Catarina state in Brazil. This original ecosystem is being destroyed and heavily modified by urbanization. This work identifies and discusses its basic characteristics and analyzes the factors favorable to its preservation, with a view to both a sustainable future and potential income from ecotourism. Knowledge of the geologic evolution allows us to associate this transgressive Holocene dune formation with more dissipative beach conditions. Spatial differences in morphodynamics are related to local and regional contrasts in the sediment budget, which influence gradients of wave attenuation on the inner shelf and consequently the level of coastal erosion. The link between relative sea-level changes and coastal eolian sedimentation can be used to integrate coastal eolian systems into the sequence stratigraphy model. The main accumulation phase of eolian sediments would occur during the late transgressive and highstand systems tracts. Considering the global character of Quaternary relative sea-level changes, the Laguna transgressive dune field should be correlated with similar eolian deposits developed along other parts of the Brazilian coast, compatible with the model of dune field initiation during rising and highstand sea-level phases.