924 results for human-populations
Abstract:
Living at high altitude is one of the most difficult challenges that humans had to cope with during their evolution. Whereas several genomic studies have revealed some of the genetic bases of adaptation in Tibetan, Andean, and Ethiopian populations, relatively little evidence of convergent evolution to altitude on different continents has accumulated. This lack of evidence can be due to truly different evolutionary responses, but it can also be due to the low power of previous studies, which have mainly focused on populations from a single geographical region or performed separate analyses on multiple pairs of populations to avoid problems linked to shared histories between some populations. Here we introduce a hierarchical Bayesian method to detect local adaptation that can deal with complex demographic histories. Our method can identify selection occurring at different scales, as well as convergent adaptation in different regions. We apply our approach to the analysis of a large SNP data set from low- and high-altitude human populations from America and Asia. The simultaneous analysis of these two geographic areas allows us to identify several candidate genome regions for altitudinal selection, and we show that convergent evolution among continents has been quite common. In addition to identifying several genes and biological processes involved in high-altitude adaptation, we identify two specific biological pathways that could have evolved in both continents to counter the toxic effects induced by hypoxia.
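The abstract only names the method, so as a rough illustration of why a joint two-region analysis gains power, here is a minimal Python sketch. It is emphatically not the authors' hierarchical Bayesian F-model: it substitutes a simple per-SNP Hudson-style FST outlier scan for the Bayesian machinery, and all data, population labels, and cutoffs are simulated assumptions.

```python
# Toy illustration ONLY -- not the authors' hierarchical Bayesian F-model.
# It mimics the logic of a joint two-region scan: SNPs that are strong
# outliers for low/high-altitude differentiation in BOTH regions at once
# are candidate convergent-adaptation loci.
import numpy as np

rng = np.random.default_rng(0)
n_snps, n_chrom = 5000, 40  # SNPs; chromosomes sampled per population

# Two regions, each with a low- and a high-altitude population; per-SNP
# allele frequencies drawn around a shared ancestral frequency.
anc = rng.uniform(0.05, 0.95, n_snps)
freqs = {region: {pop: rng.beta(anc * 20, (1 - anc) * 20)
                  for pop in ("low", "high")}
         for region in ("asia", "america")}

# Plant ten truly convergent loci: the high-altitude population in BOTH
# regions carries a shifted allele frequency at these SNPs.
sel = rng.choice(n_snps, 10, replace=False)
for region in ("asia", "america"):
    freqs[region]["high"][sel] = np.clip(freqs[region]["high"][sel] + 0.4, 0, 1)

def fst_per_snp(p1, p2):
    """Hudson-style per-SNP FST estimate between two populations."""
    num = ((p1 - p2) ** 2
           - p1 * (1 - p1) / (n_chrom - 1)
           - p2 * (1 - p2) / (n_chrom - 1))
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return np.divide(num, den, out=np.zeros_like(num), where=den > 0)

fst_asia = fst_per_snp(freqs["asia"]["low"], freqs["asia"]["high"])
fst_america = fst_per_snp(freqs["america"]["low"], freqs["america"]["high"])

# Joint scan: require an outlier signal in both regions simultaneously,
# which is where a simultaneous analysis gains power over pairwise scans.
convergent = np.flatnonzero((fst_asia > np.quantile(fst_asia, 0.99))
                            & (fst_america > np.quantile(fst_america, 0.99)))
print(f"{convergent.size} candidate convergent SNPs; "
      f"{np.intersect1d(convergent, sel).size} of the 10 planted loci recovered")
```

A real analysis would model the shared demographic history of populations within each region explicitly rather than thresholding raw FST, which is precisely what the hierarchical structure of the authors' method is for.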
Abstract:
Fluctuations in the Δ14C curve and subsequent gaps in archaeological findings at 800–650 and 400–100 BC in western and central Europe may indicate major climate-driven land-abandonment phases. To address this hypothesis, radiocarbon-dated sediments from four lakes in Switzerland were studied palynologically. Pollen analysis indicates contemporaneous phases of forest clearances and of intensified land-use at 1450–1250 BC, 650–450 BC, 50 BC–100 AD and around 700 AD. These land-use expansions coincided with periods of warm climate as recorded by the Alpine dendroclimatic and Greenland oxygen isotope records. Our results suggest that harvest yields would have increased synchronously over wide areas of central and southern Europe during periods of warm and dry climate. Combined interpretation of palaeoecological and archaeological findings suggests that higher food production led to increased human populations. Positive long-term trends in pollen values of Cerealia and Plantago lanceolata indicate that technical innovations during the Bronze and Iron Age (e.g. metal ploughs, scythes, hay production, fertilising methods) gradually increased agricultural productivity. The successful adoption of yield-increasing advances cannot be explained by climatic determinism alone. Combined with archaeological evidence, our results suggest that despite considerable cycles of spatial and demographic reorganisation (repeated land abandonments and expansions, as well as large-scale migrations and population decreases), human societies were able to shift to lower subsistence levels without dramatic ruptures in material culture. However, our data imply that human societies were not able to compensate rapidly for harvest failures when the climate deteriorated. Agriculture in marginal areas was abandoned, and spontaneous reforestation took place on abandoned land south and north of the Alps. Only when the climate changed again to drier and warmer conditions did a new widespread phase of forest clearances and field extensions occur, allowing the reoccupation of previously abandoned areas. The spatial distribution of cereal cultivation and the growth requirements of Cerealia species suggest that increases in precipitation were far more decisive in driving crop failures over central and southern Europe than temperature decreases.
Abstract:
Yakutia, Sakha Republic, in the Siberian Far East, represents one of the coldest places on Earth, with winter record temperatures dropping below -70 °C. Nevertheless, Yakutian horses survive all year round in the open air due to striking phenotypic adaptations, including compact body conformations, extremely hairy winter coats, and acute seasonal differences in metabolic activities. The evolutionary origins of Yakutian horses and the genetic basis of their adaptations remain, however, contentious. Here, we present the complete genomes of nine present-day Yakutian horses and two ancient specimens dating from the early 19th century and ∼5,200 y ago. By comparing these genomes with the genomes of two Late Pleistocene, 27 domesticated, and three wild Przewalski's horses, we find that contemporary Yakutian horses do not descend from the native horses that populated the region until the mid-Holocene, but were most likely introduced following the migration of the Yakut people a few centuries ago. Thus, they represent one of the fastest cases of adaptation to the extreme temperatures of the Arctic. We find cis-regulatory mutations to have contributed more than nonsynonymous changes to their adaptation, likely due to the comparatively limited standing variation within gene bodies at the time the population was founded. Genes involved in hair development, body size, and metabolic and hormone signaling pathways represent an essential part of the Yakutian horse adaptive genetic toolkit. Finally, we find evidence for convergent evolution with native human populations and woolly mammoths, suggesting that only a few evolutionary strategies are compatible with survival in extremely cold environments.
Abstract:
The history of human genetics was long a neglected area of research in the history of medicine and science. Only in the recent past have a number of historical studies appeared that are devoted to the history of this field of medical research and practice. One important research question concerns the relationship of human genetics to eugenics. The present contribution takes up this question and, using a Swiss case study on the inheritance of goiter, shows that close but also contradictory relationships existed between human genetics and eugenics in the 20th century: results from heredity studies frequently contradicted eugenic postulates, yet at the same time the very same human-genetic investigations could fuel visions of hereditary-biological population surveillance.
Abstract:
Expanding populations incur a mutation burden – the so-called expansion load. Previous studies of expansion load have focused on codominant mutations. An important consequence of this assumption is that expansion load stems exclusively from the accumulation of new mutations occurring in individuals living at the wave front. Using individual-based simulations, we study here the dynamics of standing genetic variation at the front of expansions, and its consequences on mean fitness if mutations are recessive. We find that deleterious genetic diversity is quickly lost at the front of the expansion, but the loss of deleterious mutations at some loci is compensated by an increase of their frequencies at other loci. The frequency of deleterious homozygotes therefore increases along the expansion axis, whereas the average number of deleterious mutations per individual remains nearly constant across the species range. This reveals two important differences from codominant models: (i) mean fitness at the front of the expansion drops much faster if mutations are recessive, and (ii) mutation load can increase during the expansion even if the total number of deleterious mutations per individual remains constant. We use our model to make predictions about the shape of the site frequency spectrum at the front of a range expansion, and about correlations between heterozygosity and fitness in different parts of the species range. Importantly, these predictions provide opportunities to empirically validate our theoretical results. We discuss our findings in the light of recent results on the distribution of deleterious genetic variation across human populations and link them to empirical results on the correlation of heterozygosity and fitness found in many natural range expansions.
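The dynamics described above lend themselves to a compact numerical illustration. The sketch below is an assumed serial-founder caricature of a range expansion, not the paper's individual-based model: each new deme on the expansion axis is founded by a small sample of the previous deme, and mean fitness is tracked at many unlinked loci under recessive versus codominant deleterious mutations. All parameter values are arbitrary.

```python
# Serial-founder caricature of a range expansion (assumed simplification,
# not the paper's individual-based model): each new deme is founded by a
# small sample of the previous one, and mean fitness is tracked at many
# unlinked loci carrying deleterious alleles that are either recessive or
# codominant.
import numpy as np

rng = np.random.default_rng(1)
n_loci, founders, n_demes, s = 2000, 20, 50, 0.005

q = np.full(n_loci, 0.05)      # standing deleterious allele frequencies
fit_recessive, fit_codominant, heterozygosity = [], [], []

for _ in range(n_demes):
    # Founder event: resample each allele frequency from 2N founder genes.
    q = rng.binomial(2 * founders, q) / (2 * founders)
    # Recessive: only homozygotes (expected frequency q^2) pay the cost s.
    fit_recessive.append(np.prod(1 - s * q ** 2))
    # Codominant (h = 0.5): expected cost per locus is simply s * q.
    fit_codominant.append(np.prod(1 - s * q))
    heterozygosity.append(np.mean(2 * q * (1 - q)))

print(f"front fitness, recessive:  {fit_recessive[-1]:.3f}")
print(f"front fitness, codominant: {fit_codominant[-1]:.3f}")
print(f"heterozygosity, core -> front: {heterozygosity[0]:.3f} -> {heterozygosity[-1]:.3f}")
```

Drift at each founder event keeps the expected allele frequency E[q] constant but inflates its variance, so the expected homozygote frequency E[q²] grows along the axis: fitness under recessivity drops even though the average number of deleterious alleles per individual stays flat, which is exactly the contrast the abstract draws with codominant models.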
Abstract:
Australia is unique as a populated continent in that canine rabies is exotic, with only one likely incursion in 1867. This is despite the presence of a widespread free-ranging dog population, which includes the naturalized dingo, feral domestic dogs and dingo-dog cross-breeds. To Australia's immediate north, rabies has recently spread within the Indonesian archipelago, with outbreaks occurring in historically free islands to the east, including Bali, Flores, Ambon and the Tanimbar Islands. Australia depends on strict quarantine protocols to prevent the importation of a rabid animal, but the risk remains of illegal animal movements by fishing and recreational vessels circumventing quarantine. Predicting where rabies will enter Australia is important, but understanding dog population dynamics and interactions, including contact rates in and around human populations, is essential for rabies preparedness. The interactions among and between Australia's large populations of wild, free-roaming and restrained domestic dogs require quantification if rabies incursions are to be detected and controlled. The imminent risk of rabies breaching Australian borders makes it vitally important to develop disease spread models that will assist in the deployment of cost-effective surveillance, improve preventive strategies and guide disease management protocols. Here, we critically review Australia's preparedness for rabies, discuss prevailing assumptions and models, identify knowledge deficits in free-roaming dog ecology relating to rabies maintenance and speculate on the likely consequences of endemic rabies for Australia.
Abstract:
This study used canine sentinel surveillance and collected a sample of adult mosquitoes to investigate the potential impact of West Nile virus (WNV) on human populations in the Rio Grande Valley along the Texas-Mexico border. Samples for this study were collected from juvenile dogs two months to one year of age in animal shelters located in the Rio Grande Valley. The sample comprised stray dogs in order to include animals with maximum nighttime exposure to Culex mosquitoes. Serum samples were collected following the 2007 WNV transmission season and were tested for IgG antibodies against WNV. Evidence of antibodies to WNV was found in 35.1% of the sample population of 74 dogs. During this same time period, mosquitoes in Brownsville were trapped and morphologically identified to develop a greater understanding of the mosquito populations in the region and of other potential mosquito vectors for disease transmission. The vast majority of mosquitoes living in this area were Aedes albopictus (47.6%), Culex quinquefasciatus (23.7%), and Aedes aegypti (20.1%). This study shows that WNV and the vector responsible for WNV transmission are active in the Rio Grande Valley and pose a threat to human and animal populations.
Abstract:
Global climate change is becoming an increasing concern within the public health community. Some researchers believe the earth is rapidly undergoing changes in temperature, sea level, population movement, and extreme weather phenomena. With these geographic, meteorological, and social changes come increased threats to human health. One of these threats is the spread of vector-borne infectious diseases. The changes mentioned above are believed to contribute to increased arthropod survival, transmission, and habitation. These changes, in turn, lead to increased incidence among neighboring human populations. It is also argued that human action may play more of a role than climate change. This systematic review served to determine whether or not climate change poses a significant risk of increased human exposure to, and incidence of, vector-borne disease.
Abstract:
Background. Acute diarrhea (AD) is an important cause of morbidity and mortality among both children and adults. An ideal antidiarrheal treatment should be safe, effective, compatible with Oral Rehydration Solution, and inexpensive. Herbal medicines, if effective, should fit these criteria as well as or better than standard treatment. Objective. The objective of the present study was to assess the effectiveness of plant preparations in patients with AD in reports of randomized and non-randomized controlled trials. Aims. The aims of the present study were to identify effective antidiarrheal herbs and to identify potential antidiarrheal herbs for future studies of efficacy through well-designed clinical trials in human populations. Methods. Nineteen published studies of herbal management of AD were examined to identify effective plant preparations. Ten plant preparations, including berberine (Berberis aristata), tormentil root (Potentilla tormentilla), baohauhau (from the baobaosan plant), carob (Ceratonia siliqua), pectin (Malus domestica), wood creosote (Creosote bush), guava (Psidium guajava L.), belladonna (Atropa belladonna), white bean (Phaseolus vulgaris), and wheat (Triticum aestivum), were identified. Results. Qualitative data analysis of nineteen clinical trials indicated berberine's potentially valuable antisecretory effects against diarrhea caused by Vibrio cholerae and enterotoxigenic Escherichia coli. Tormentil root showed significant efficacy against rotavirus-induced diarrhea; carob exhibited antidiarrheal properties not only by acting to detoxify and constipate but also by providing a rich source of calories; guava and belladonna are antispasmodics and have been shown to relieve the symptoms of AD. Finally, white bean and wheat yielded favorable clinical and dietary outcomes in children with diarrhea. Conclusion. The present study is the first to review the evidence for the use of herbal compounds for treatment of AD. Future randomized controlled trials are needed to evaluate their efficacy and safety.
Abstract:
The potential for significant human populations to experience long-term inhalation of formaldehyde, together with reports of symptomatology due to this exposure, has led to considerable interest in the toxicologic assessment of risk from subchronic formaldehyde exposures using animal models. Since formaldehyde inhalation depresses certain respiratory parameters in addition to its other forms of toxicity, this respiratory effect could alter the actual dose received by the exposed individual (and the resulting toxicity). The respiratory responses to formaldehyde inhalation and the subsequent pattern of deposition were therefore investigated in animals that had received subchronic exposure to the compound, and the potential for long-term inhalation to change the received formaldehyde dose was evaluated. Male Sprague-Dawley rats were exposed to 0, 0.5, 3, or 15 ppm formaldehyde for 6 hours/day, 5 days/week for up to 6 months. The patterns of respiratory response and deposition and the compensation mechanisms involved were then determined in a series of formaldehyde test challenges to both the upper and lower respiratory tracts in separate groups of subchronically exposed animals and age-specific controls (four concentration groups, two time points). In both the control and pre-exposed animals, respiratory parameters initially depressed by formaldehyde inhalation characteristically recovered to levels at or approaching pre-exposure values within 10 minutes of the initiation of exposure. Formaldehyde deposition was also found to remain very high in the upper and lower tracts after long-term exposure. There was therefore probably little effect on the dose received by the exposed individual attributable to the repeated exposures. A diminished initial minute volume response was observed in test challenges of both the upper and lower tracts of animals that had received at least 16 weeks of exposure to 15 ppm, with compensatory increases in tidal volume in the upper tract and respiratory rate in the lower tract. However, this dose-related effect is probably not relevant to human risk estimation because this formaldehyde dose is in excess of that experienced by human populations.
Abstract:
The effects of conversion treatments, as a function of ecological factors and silvicultural parameters (thinning intensity, thinning type and rotation, among others), have been studied over the last fifteen years in an experimental trial in Central Spain. The general climate is continental Mediterranean; soils are shallow and limy; the vegetation is a homogeneous dense coppice of Quercus ilex with isolated Pinus nigra trees. The experimental design (three locations) includes different thinning intensities (from 0 to 100% of extracted basal area). Inventories were carried out in 1994 and 2010; thinning treatments were applied in 1995 and 2011. Analysis of the effects of the conversion treatment shows increased diameter and height growth rates, canopy recovery and stand resprouting, with differences in these effects between thinning treatments. Beyond the changes induced in the holm oak stand, the application of the conversion treatment clearly changed the woodland dynamics. Fifteen years after the thinnings, the floristic composition had varied and abundant pine regeneration had established in the woodland. In this work we describe the changes between inventories in tree species composition and diameter distribution, especially in the case of black pine. The conversion treatment caused changes in forest dynamics in the short term, increasing biodiversity and diversifying the forest structure. The rapid establishment of Pinus regeneration suggests the potential of the zone for the establishment of multipurpose mixed Quercus-Pinus stands in wide areas where Quercus species were favoured by human populations for firewood production. Conversion treatment of coppices, with the creation of mixed stands, constitutes a good management alternative for extensive areas and an interesting technique for adaptation to global change.
Abstract:
The Caribbean and Central America are among the regions with the highest HIV-1B prevalence worldwide. Despite this high virus burden, little is known about the timing and the migration patterns of HIV-1B in these regions. Migration is one of the major processes shaping the genetic structure of virus populations. Thus, reconstruction of the epidemiological network may contribute to understanding HIV-1B evolution and reducing virus prevalence. We have investigated the spatio-temporal dynamics of the HIV-1B epidemic in the Caribbean and Central America using 1,610 HIV-1B partial pol sequences from 13 Caribbean and 5 Central American countries. The timing of HIV-1B introduction and virus evolutionary rates, as well as the spatial genetic structure of the HIV-1B populations and the virus migration patterns, were inferred. Results revealed that in the Caribbean and Central America most of the HIV-1B variability has been generated since the 1980s. At odds with previous data suggesting that Haiti was the origin of the epidemic in the Caribbean, our reconstruction indicated that the virus could have been disseminated from Puerto Rico and Antigua. These two countries connected two distinguishable migration areas corresponding to the (mainly Spanish-colonized) Eastern and (mainly British-colonized) Western islands, which indicates that virus migration patterns are determined by geographical barriers and by the movement of human populations among culturally related countries. Similar factors shaped the migration of HIV-1B in Central America. The HIV-1B population was significantly structured according to the country of origin, and the genetic diversity in each country was associated with the virus prevalence in both regions, which suggests that virus populations evolve mainly through genetic drift. Thus, our work contributes to the understanding of HIV-1B evolution and dispersal patterns in the Americas, and their relationship with the geography of the area and the movements of human populations.
Abstract:
Melon is traditionally cultivated in fertigated farmlands in the center of Spain with high inputs of water and N fertilizer. Excess N can have negative impacts: from the economic point of view, it can diminish the production and quality of the fruit; from the environmental point of view, N is a very mobile element in the soil and can contaminate groundwater; and from the health point of view, nitrate can accumulate in the fruit pulp, while groundwater is the fundamental supply source for human populations. Best management practices are particularly necessary in this region, as many zones have been declared vulnerable to NO3- pollution (Directive 91/676/CEE). During successive years, a melon crop (Cucumis melo L.) was grown under field conditions applying mineral and organic fertilizers under drip irrigation. Different doses of ammonium nitrate were used, as well as compost derived from the wine-distillery industry, which is relevant in this area. The present study reviews the most common N efficiency indexes under the different management options with a view to maximizing yield and minimizing N loss.
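The abstract does not say which N efficiency indexes are meant; as context, here is a short Python sketch of three indexes that are standard in the agronomy literature. The function name and the plot numbers are hypothetical and not taken from the study.

```python
# Standard-textbook N-efficiency indexes (the abstract does not list which
# indexes the study uses; these three are common in the agronomy
# literature). Function name and plot numbers are hypothetical.

def n_efficiency_indexes(yield_fert, yield_zero, n_applied,
                         n_uptake_fert, n_uptake_zero):
    """Yields in kg/ha; N rate and crop N uptakes in kg N/ha."""
    return {
        # Partial factor productivity: kg of produce per kg N applied.
        "PFP": yield_fert / n_applied,
        # Agronomic efficiency: extra yield per kg N applied.
        "AE": (yield_fert - yield_zero) / n_applied,
        # Apparent recovery efficiency: fraction of applied N taken up.
        "RE": (n_uptake_fert - n_uptake_zero) / n_applied,
    }

# Hypothetical melon plot: 100 kg N/ha versus an unfertilized control.
print(n_efficiency_indexes(yield_fert=45000, yield_zero=38000, n_applied=100,
                           n_uptake_fert=150, n_uptake_zero=95))
```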
Abstract:
The observation of high frequencies of certain inherited disorders in the population of Saguenay–Lac Saint Jean can be explained in terms of the variance and the correlation of effective family size (EFS) from one generation to the next. We have shown this effect by using the branching process approach with real demographic data. When variance of EFS is included in the model, despite its profound effect on mutant allele frequency, any mutant introduced into the population never reaches the known carrier frequencies (between 0.035 and 0.05). Only when the EFS correlation between generations is introduced into the model can we explain the rise of the mutant alleles. This correlation is described by a parameter c that reflects the dependency of children's EFS on their parents' EFS, and it can be considered to reflect social transmission of demographic behavior. We show that such social transmission dramatically reduces the effective population size. This could explain particular allele frequency distributions and the unusually high frequency of certain inherited disorders in some human populations.
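As a rough, assumed reimplementation of the idea (not the authors' code), the Python sketch below simulates a branching process in which each child's expected EFS keeps a fraction c of its parent's deviation from the population mean. With c > 0, large families beget large families, so a mutant lineage that starts in a prolific family can reach frequencies that EFS variance alone rarely produces. All parameter values are illustrative.

```python
# Assumed reimplementation of the idea (not the authors' code): a branching
# process in which each child's expected effective family size (EFS) keeps
# a fraction c of its parent's deviation from the population mean, i.e.
# social transmission of demographic behavior.
import numpy as np

rng = np.random.default_rng(2)

def simulate(c, n=1000, generations=12, mean_efs=2.0):
    """Return the final frequency of a mutant carried by one founder."""
    efs = np.full(n, mean_efs)      # per-individual expected family size
    mutant = np.zeros(n, dtype=bool)
    mutant[0] = True                # single mutant founder
    for _ in range(generations):
        kids = rng.poisson(efs)     # realized family sizes
        pool = np.repeat(np.arange(n), kids)   # child -> parent index
        if pool.size == 0:
            return 0.0
        parent = rng.choice(pool, size=n)      # cap population size at n
        # Child EFS regresses to the mean but keeps fraction c of the
        # parental deviation; with c > 0, prolific families persist.
        efs = mean_efs + c * (efs[parent] - mean_efs) + rng.normal(0, 0.5, n)
        efs = np.clip(efs, 0.1, None)
        mutant = mutant[parent]
    return mutant.mean()

for c in (0.0, 0.5):
    freqs = np.array([simulate(c) for _ in range(200)])
    print(f"c={c}: mean freq {freqs.mean():.4f}, "
          f"sd {freqs.std():.4f}, max {freqs.max():.4f}")
```

Raising c widens the distribution of final mutant frequencies, which is the signature of a reduced effective population size: inflated variance in long-term reproductive success is what can let a mutant drift toward the 0.035–0.05 carrier range that EFS variance alone cannot explain.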
Abstract:
Immunodeficiency typically appears many years after initial HIV infection. This long, essentially asymptomatic period contributes to the transmission of HIV in human populations. In rare instances, clearance of HIV-1 infection has been observed, particularly in infants. There are also reports of individuals who have been frequently exposed to HIV-1 but remain seronegative for the virus, and it has been hypothesized that these individuals are resistant to infection by HIV-1. However, little is known about the mechanism of immune clearance or protection against HIV-1 in these high-risk individuals because it is difficult to directly demonstrate in vivo protective immunity. Although most of these high-risk individuals show an HIV-1-specific cell-mediated immune response using in vitro assays, their peripheral blood lymphocytes (PBLs) are still susceptible to HIV infection in tissue culture. To study this further in vivo, we have established a humanized SCID mouse infection model whereby T-, B-, and natural killer-cell defective SCID/beige mice that have been reconstituted with normal human PBLs can be infected with HIV-1. When the SCID/beige mice were reconstituted with PBLs from two different multiply exposed HIV-1 seronegative individuals, the mice showed resistance to infection by two strains of HIV-1 (macrophage tropic and T cell tropic), although the same PBLs were easily infected in vitro. Mice reconstituted with PBLs from non-HIV-exposed controls were readily infected. When the same reconstituted mice were depleted of human CD8 T cells, however, they became susceptible to HIV-1 infection, indicating that the in vivo protection required CD8 T cells. This provides clear experimental evidence that some multiply exposed, HIV-1-negative individuals have in vivo protective immunity that is CD8 T cell-dependent. Understanding the mechanism of such protective immunity is critical to the design and testing of effective prophylactic vaccines and immunotherapeutic regimens.