998 results for Lago Titicaca
Abstract:
Background: The malaria parasite Plasmodium falciparum exhibits abundant genetic diversity, and this diversity is key to its success as a pathogen. Previous efforts to study genetic diversity in P. falciparum have begun to elucidate the demographic history of the species, as well as patterns of population structure and patterns of linkage disequilibrium within its genome. Such studies will be greatly enhanced by new genomic tools and recent large-scale efforts to map genomic variation. To that end, we have developed a high throughput single nucleotide polymorphism (SNP) genotyping platform for P. falciparum. Results: Using an Affymetrix 3,000 SNP assay array, we found roughly half the assays (1,638) yielded high quality, 100% accurate genotyping calls for both major and minor SNP alleles. Genotype data from 76 global isolates confirm significant genetic differentiation among continental populations and varying levels of SNP diversity and linkage disequilibrium according to geographic location and local epidemiological factors. We further discovered that nonsynonymous and silent (synonymous or noncoding) SNPs differ with respect to within-population diversity, interpopulation differentiation, and the degree to which allele frequencies are correlated between populations. Conclusions: The distinct population profile of nonsynonymous variants indicates that natural selection has a significant influence on genomic diversity in P. falciparum, and that many of these changes may reflect functional variants deserving of follow-up study. Our analysis demonstrates the potential for new high-throughput genotyping technologies to enhance studies of population structure, natural selection, and ultimately enable genome-wide association studies in P. falciparum to find genes underlying key phenotypic traits.
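As an illustration of how per-SNP "interpopulation differentiation" is typically quantified, the sketch below computes a simple two-population Wright's F_ST from allele frequencies. The function and the example frequencies are hypothetical and are not taken from the study, whose estimator and data may differ.

```python
# Illustrative only: a minimal two-population, per-SNP F_ST calculation of the
# kind used to quantify interpopulation differentiation.  Frequencies are made up.

def fst_two_pops(p1: float, p2: float) -> float:
    """Wright's F_ST for one biallelic SNP given allele frequencies p1 and p2."""
    p_bar = (p1 + p2) / 2                                  # mean allele frequency
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2      # mean within-population heterozygosity
    h_t = 2 * p_bar * (1 - p_bar)                          # total expected heterozygosity
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

print(fst_two_pops(0.9, 0.3))    # strongly differentiated SNP -> ~0.375
print(fst_two_pops(0.55, 0.45))  # similar frequencies -> ~0.01
```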
Abstract:
The energy spectrum of cosmic rays above 2.5 x 10(18) eV, derived from 20 000 events recorded at the Pierre Auger Observatory, is described. The spectral index gamma of the particle flux, J proportional to E(-gamma), at energies between 4 x 10(18) eV and 4 x 10(19) eV is 2.69 +/- 0.02(stat) +/- 0.06(syst), steepening to 4.2 +/- 0.4(stat) +/- 0.06(syst) at higher energies. The hypothesis of a single power law is rejected with a significance greater than 6 standard deviations. The data are consistent with the prediction by Greisen and by Zatsepin and Kuz'min.
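Restating the numbers above as a piecewise power law (no new information, just the abstract's values written out):

$$ J(E) \propto E^{-\gamma}, \qquad \gamma = \begin{cases} 2.69 \pm 0.02\,(\mathrm{stat}) \pm 0.06\,(\mathrm{syst}), & 4\times10^{18}\ \mathrm{eV} < E < 4\times10^{19}\ \mathrm{eV},\\ 4.2 \pm 0.4\,(\mathrm{stat}) \pm 0.06\,(\mathrm{syst}), & E > 4\times10^{19}\ \mathrm{eV}. \end{cases} $$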
Abstract:
The surface detector array of the Pierre Auger Observatory is sensitive to Earth-skimming tau neutrinos that interact in Earth's crust. Tau leptons from nu(tau) charged-current interactions can emerge and decay in the atmosphere to produce a nearly horizontal shower with a significant electromagnetic component. The data collected between 1 January 2004 and 31 August 2007 are used to place an upper limit on the diffuse flux of nu(tau) at EeV energies. Assuming an E(nu)(-2) differential energy spectrum the limit set at 90% C.L. is E(nu)(2)dN(nu tau)/dE(nu) < 1.3 x 10(-7) GeV cm(-2) s(-1) sr(-1) in the energy range 2 x 10(17) eV < E(nu) < 2 x 10(19) eV.
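As a brief clarification of the quoted quantity (nothing beyond what the abstract states): for an assumed E^-2 spectrum, E(nu)^2 dN/dE is simply the constant normalization k of the flux, so the limit reads

$$ \frac{dN_{\nu_\tau}}{dE_\nu} = k\,E_\nu^{-2}, \qquad k = E_\nu^{2}\,\frac{dN_{\nu_\tau}}{dE_\nu} < 1.3\times10^{-7}\ \mathrm{GeV\,cm^{-2}\,s^{-1}\,sr^{-1}} \quad (90\%\ \mathrm{C.L.}), $$

valid over 2 x 10(17) eV < E(nu) < 2 x 10(19) eV.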
Abstract:
We describe the measurement of the depth of maximum, X(max), of the longitudinal development of air showers induced by cosmic rays. Almost 4000 events above 10(18) eV observed by the fluorescence detector of the Pierre Auger Observatory in coincidence with at least one surface detector station are selected for the analysis. The average shower maximum was found to evolve with energy at a rate of (106 +35/-21) g/cm(2)/decade below 10(18.24 +/- 0.05) eV, and (24 +/- 3) g/cm(2)/decade above this energy. The measured shower-to-shower fluctuations decrease from about 55 to 26 g/cm(2). The interpretation of these results in terms of the cosmic ray mass composition is briefly discussed.
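In the notation commonly used for such measurements, with D_10 denoting the elongation rate (the change of the mean X_max per decade of energy), the abstract's values correspond to the broken-line fit

$$ D_{10} \equiv \frac{d\langle X_{\max}\rangle}{d\log_{10}E} = \begin{cases} 106^{+35}_{-21}\ \mathrm{g\,cm^{-2}\ per\ decade}, & E < 10^{18.24\pm0.05}\ \mathrm{eV},\\ 24 \pm 3\ \mathrm{g\,cm^{-2}\ per\ decade}, & E > 10^{18.24\pm0.05}\ \mathrm{eV}. \end{cases} $$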
Abstract:
Data collected at the Pierre Auger Observatory are used to establish an upper limit on the diffuse flux of tau neutrinos in the cosmic radiation. Earth-skimming nu(tau) may interact in the Earth's crust and produce a tau lepton by means of charged-current interactions. The tau lepton may emerge from the Earth and decay in the atmosphere to produce a nearly horizontal shower with a typical signature, a persistent electromagnetic component even at very large atmospheric depths. The search procedure to select events induced by tau decays against the background of normal showers induced by cosmic rays is described. The method used to compute the exposure for a detector continuously growing with time is detailed. Systematic uncertainties in the exposure from the detector, the analysis, and the involved physics are discussed. No tau neutrino candidates have been found. For neutrinos in the energy range 2 x 10(17) eV < E(nu) < 2 x 10(19) eV, assuming a diffuse spectrum of the form E(nu)(-2), data collected between 1 January 2004 and 30 April 2008 yield a 90% confidence-level upper limit of E(nu)(2)dN(nu tau)/dE(nu) < 9 x 10(-8) GeV cm(-2) s(-1) sr(-1).
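A hedged sketch of how such a limit follows from a null result: with zero candidates and negligible expected background, the 90% C.L. Feldman-Cousins bound on the expected number of events is 2.44 (whether the paper uses exactly this prescription is an assumption here). For an assumed flux dN/dE = k E^-2 and an energy-dependent exposure epsilon(E), whose computation the abstract says is detailed in the paper,

$$ N_{\mathrm{exp}} = \int k\,E_\nu^{-2}\,\varepsilon(E_\nu)\,dE_\nu < 2.44 \quad\Longrightarrow\quad k < \frac{2.44}{\int E_\nu^{-2}\,\varepsilon(E_\nu)\,dE_\nu}. $$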
Abstract:
Background: Identifying local similarity between two or more sequences, or identifying repeats occurring at least twice in a sequence, is an essential part of the analysis of biological sequences and of their phylogenetic relationship. Finding such fragments while allowing for a certain number of insertions, deletions, and substitutions is, however, known to be a computationally expensive task, and consequently exact methods can usually not be applied in practice. Results: The filter TUIUIU that we introduce in this paper provides a possible solution to this problem. It can be used as a preprocessing step to any multiple alignment or repeats inference method, eliminating a possibly large fraction of the input that is guaranteed not to contain any approximate repeat. It consists of the verification of several strong necessary conditions that can be checked quickly. We implemented three versions of the filter. The first is simply a straightforward extension, to the case of multiple sequences, of conditions already existing in the literature. The second uses a stronger condition which, as our results show, enables considerably more filtering with negligible (if any) additional time. The third version uses an additional condition and pushes the filtering efficiency even further, at a non-negligible additional time cost in many circumstances; our experiments show that it is particularly useful with large error rates. The latter version was applied as a preprocessing step for a multiple alignment tool, obtaining an overall time (filter plus alignment) on average 63 and at best 530 times smaller than before (direct alignment), with, in most cases, a better-quality alignment. Conclusion: To the best of our knowledge, TUIUIU is the first filter designed for multiple repeats and for dealing with error rates greater than 10% of the repeat length.
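As a hedged illustration of the kind of fast necessary condition such filters rely on (this is the classical q-gram lemma, not TUIUIU's actual conditions, and the parameters are illustrative): two occurrences of a repeat of length m with at most k differences between them must share at least m + 1 - (k + 1)q q-grams, so any pair of candidate blocks sharing fewer q-grams can be discarded without ever running the expensive alignment.

```python
# Illustrative sketch of a q-gram-lemma filter; not the TUIUIU algorithm itself.
from collections import Counter

def qgrams(s: str, q: int) -> Counter:
    """Multiset of the overlapping q-grams of s."""
    return Counter(s[i:i + q] for i in range(len(s) - q + 1))

def may_contain_repeat(block_a: str, block_b: str, m: int, k: int, q: int = 3) -> bool:
    """Necessary condition: occurrences of a common repeat of length m with at
    most k differences share >= m + 1 - (k + 1) * q q-grams, and those shared
    q-grams are also shared by the blocks containing the occurrences.  If the
    blocks share fewer, the pair can safely be filtered out."""
    threshold = m + 1 - (k + 1) * q
    if threshold <= 0:
        return True  # condition too weak to filter anything at these parameters
    shared = sum((qgrams(block_a, q) & qgrams(block_b, q)).values())
    return shared >= threshold

# Hypothetical usage: keep only block pairs that may contain a repeat of
# length 20 with at most 2 differences.
print(may_contain_repeat("ACGTACGTACGTACGTACGTTTT", "ACGTACGTACGAACGTACGTGGG", m=20, k=2))
```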
Abstract:
Changes in the oxygen isotopic composition of the planktonic foraminifer Globigerinoides ruber and in the foraminifera faunal composition in a core retrieved from the southeastern Brazilian continental margin were used to infer past changes in the hydrological balance and monsoon precipitation in the western South Atlantic since the Last Glacial Maximum (LGM). The results suggest a first-order orbital (precessional) control on the South American Monsoon precipitation. This agrees with previous studies based on continental proxies except for LGM estimates provided by pollen records. The causes for this disagreement are discussed.
Abstract:
Although H(+) and OH(-) are the most common ions in aqueous media, they are not usually observable in capillary electrophoresis (CE) experiments, because of the extensive use of buffer solutions as the background electrolyte. In the present work, we introduce CE equipment designed to allow the determination of such ions in the same way as any other ion. Basically, it consists of a four-compartment piece of equipment for electrolysis-separated experiments (D. P. de Jesus et al., Anal. Chem., 2005, 77, 607). In such a system, the ends of the capillary are placed in two reservoirs, which are connected to two other reservoirs through electrolyte-filled tubes. The electrodes of the high-voltage power source are positioned in these reservoirs. Thus, the electrolysis products are kept away from the inputs of the capillary. The detection was provided by two capacitively coupled contactless conductivity detectors (CD), each one positioned about 11 cm from the end of the capillary. Two applications were demonstrated: titration-like procedures for nanolitre samples and mobility measurements. Strong and weak acids (pK(a) < 5), pure or in mixtures, could be titrated. The analytical curve is linear from 50 mu M up to 10 mM of total dissociable hydrogen (r = 0.99899 for n = 10) in 10-nL samples. By including D(2)O in the running electrolyte, we could demonstrate how to measure the mixed proton/deuteron mobility. When H(2)O/D(2)O (9 : 1 v/v) was used as the solvent, the mobility was (289.6 +/- 0.5) x 10(-5) cm(2) V(-1) s(-1). Due to the fast conversion of the species, this value is related to the overall behaviour of all isotopologues and isotopomers of the Zundel and Eigen structures, as well as the Stokesian mobility of the proton and deuteron. The effect of neutral (o-phenanthroline) and negatively charged (chloroacetate) bases and of an aprotic solvent (DMSO) on the H(+) mobility was also demonstrated.
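For context, the electrophoretic mobility in CE is conventionally derived from the migration time via the standard textbook relation below; this is general background rather than the specific expression used with the four-compartment, dual-detector instrument described here.

$$ \mu = \frac{L_d\,L_t}{t_m\,V}, $$

where L_d is the migration distance to the detector, L_t the total capillary length across which the voltage V is applied, and t_m the migration time (any electroosmotic contribution must still be subtracted to obtain the effective mobility).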
Abstract:
In repair works of reinforced concrete, patch repairs tend to crack in the interfacial zone between the mortar and the old concrete. This occurs basically due to the high degree of restriction that acts on a patch repair. For this reason, the technology of patch repair needs to be the subject of a discussion involving professionals who work with design, construction maintenance and the mix proportioning of repair mortars. In the present work, a study is presented on the benefits that the ethylene vinyl acetate copolymer (EVA) and acrylate polymers can provide in the mix proportioning of a repair mortar with respect to compressive, tensile and direct-shear bond strength. The results indicated that the increase in bond strength and the reduction in the influence of deficient curing conditions are the main contributions offered by the polymers studied here.
Abstract:
Carbonation is one of the main concerns for concrete service life in tropical countries. The mechanism and the materials that produce it have been widely studied, as well as natural and accelerated methods to report and analyze it. In spite of the reported investigations, there is a need for information that could allow an adequate interpretation of the results of the standardization process. This lack of information can produce variations not only in the interpretation but also in the predictions of service life. The purpose of this paper is to analyze and discuss variables that could be sources of error, especially when performing accelerated tests. As a result, a methodology to minimize variations when interpreting and comparing results is proposed, addressing factors such as specimen geometry and preconditioning, spacing, relative humidity, and CO(2) concentration.
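For reference, service-life predictions from carbonation measurements commonly rely on the square-root-of-time model below; this is general background, not a formula stated in the abstract, and the variables listed above (relative humidity, CO(2) concentration, preconditioning) all enter through the coefficient k.

$$ x_c = k\,\sqrt{t}, $$

where x_c is the carbonation depth, t the exposure time, and k a coefficient depending on the concrete and on the exposure conditions.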
Abstract:
Chloride migration tests are used to measure the capacity of concrete to inhibit chloride attack. Many researchers carry out this test on a slice of concrete extracted from the central part of cylindrical specimens, discarding about 75% of the concrete used to mold the specimens. This fact raised the question: would it be possible to extract more slices from the same specimen without losing confidence in the results? The main purpose of this work is to answer this question. Another aim of this study was to show the difference in chloride penetration between the finished faces and the formwork surfaces of concrete beams and slabs. The results indicated that it is possible to use more slices of a single specimen for a chloride migration test. Moreover, it was demonstrated that there is a significant difference in chloride penetration between the finished surface and the formwork surface of the specimens.
Abstract:
There are currently many types of protective materials for reinforced concrete structures, and the influence of these materials on the chloride diffusion coefficient still needs more research. The aim of this paper is to study the efficacy of certain surface treatments (such as hydrophobic agents, acrylic coating, polyurethane coating and double systems) in inhibiting chloride penetration in concrete. The results indicated that all the surface treatments tested significantly reduced the sorptivity of concrete (reduction rate > 70%). However, only the polyurethane coating was highly effective in reducing the chloride diffusion coefficient (reduction rate of 86%).
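As a reminder of the quantity being reduced (general background, not a formula given in the abstract), the sorptivity S is the slope of the standard capillary-absorption relation

$$ i = S\,\sqrt{t}, $$

where i is the cumulative volume of water absorbed per unit area of the exposed surface and t is the elapsed time; the >70% reductions quoted above are reductions in S.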
Abstract:
Chloride attack in marine environments, or in structures where deicing salts are used, will not always show profiles with concentrations that decrease from the external surface to the interior of the concrete. Some profiles show chloride concentrations that increase from the external surface up to a certain depth, where a peak is formed. This type of profile must be analyzed in a different way from the traditional model based on Fick's second law in order to generate more precise service life models. A model has previously been proposed for forecasting the penetration of chloride ions as a function of time for profiles that have formed a peak. To confirm the efficiency of this model, it is necessary to observe the behavior of a chloride profile with a peak in a specific structure over a period of time. To achieve this, two chloride profiles with different ages (22 and 27 years) were extracted from the same structure. The profile obtained from the 22-year sample was used to estimate the chloride profile at 27 years using three models: a) the traditional model using Fick's second law and extrapolating the value of C(S), the external surface chloride concentration; b) the traditional model using Fick's second law and shifting the x-axis to the peak depth; c) the previously proposed model. The results from these models were compared with the actual profile measured in the 27-year sample, and the results were analyzed. The proposed model showed good precision in this case study, but it still needs to be tested on other structures in use.
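A minimal sketch of the fitting approaches compared above, assuming the standard error-function solution of Fick's second law; the shift-to-peak variant (model b) simply replaces x by x minus the peak depth. All parameter values are illustrative, not the study's data.

```python
# Illustrative sketch (not the authors' code): chloride profiles from the
# error-function solution of Fick's second law, with the x-axis optionally
# shifted to the peak depth as in model (b) above.
from math import erf, sqrt

def chloride_profile(x_mm, cs, d_mm2_per_yr, t_yr, x_peak_mm=0.0):
    """C(x, t) = Cs * (1 - erf((x - x_peak) / (2*sqrt(D*t)))), evaluated for x >= x_peak."""
    x_eff = max(x_mm - x_peak_mm, 0.0)
    return cs * (1.0 - erf(x_eff / (2.0 * sqrt(d_mm2_per_yr * t_yr))))

# Hypothetical numbers: surface concentration 0.6% by mass of concrete,
# D = 20 mm^2/year, peak at 5 mm depth, profile after 22 years.
for x in (5, 10, 20, 40, 60):
    print(x, round(chloride_profile(x, cs=0.6, d_mm2_per_yr=20.0, t_yr=22.0, x_peak_mm=5.0), 3))
```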
Abstract:
Hydrophobic agents are surface protection materials capable of increasing the contact angle between water and the concrete surface. For this reason, hydrophobic agents reduce the penetration of water (in liquid form) into concrete. Therefore, many European construction regulating agencies recommend this treatment in their maintenance policy. Nonetheless, there continues to be a gap in the understanding of which transport mechanisms of the concrete are modified by the hydrophobic agents. The aim of this study was to fill this gap with regard to reinforced concrete structures located in a marine environment. To this end, certain tests were used: two involving the permeability mechanism, one determining capillary absorption, and the last a migration test used to estimate the chloride diffusion coefficient in the saturated condition. The results indicated the efficacy of the hydrophobic agents in cases where capillary suction is the mechanism of water penetration (reduced by 2.12 and 7.0 times, depending on the product). However, when the transport mechanism is permeability, these products are not advisable. Moreover, it was demonstrated that the chloride diffusion coefficient (in the saturated condition) is reduced by the hydrophobic agents; however, the magnitude of this reduction is small (11% and 17%, depending on the product).
Abstract:
Background-In the Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) trial, an initial strategy of coronary revascularization and optimal medical treatment (REV) compared with an initial optimal medical treatment with the option of subsequent revascularization (MED) did not reduce all-cause mortality or the composite of cardiovascular death, myocardial infarction, and stroke in patients with type 2 diabetes mellitus and stable ischemic heart disease. In the same population, we tested whether the REV strategy was superior to the MED strategy in preventing worsening and new angina and subsequent coronary revascularizations. Methods and Results-Among the 2364 men and women (mean age, 62.4 years) with type 2 diabetes mellitus, documented coronary artery disease, and myocardial ischemia, 1191 were randomized to the MED and 1173 to the REV strategy preselected in the percutaneous coronary intervention (796) and coronary artery bypass graft (377) strata. Compared with the MED strategy, the REV strategy at the 3-year follow-up had a lower rate of worsening angina (8% versus 13%; P < 0.001), new angina (37% versus 51%; P = 0.001), and subsequent coronary revascularizations (18% versus 33%; P < 0.001) and a higher rate of angina-free status (66% versus 58%; P = 0.003). The coronary artery bypass graft stratum patients were at higher risk than those in the percutaneous coronary intervention stratum, and had the greatest benefits from REV. Conclusions-In these patients, the REV strategy reduced the occurrence of worsening angina, new angina, and subsequent coronary revascularizations more than the MED strategy. The symptomatic benefits were observed particularly for high-risk patients.