938 results for "Late early Oligocene"
Abstract:
Aim We used combined palaeobotanical and genetic data to assess whether Norway spruce (Picea abies) and Siberian spruce (Picea obovata), two major components of the Eurasian boreal forests, occupied separate glacial refugia, and to test previous hypotheses on their distinction, geographical delimitation and introgression. Location The range of Norway spruce in northern Europe and Siberian spruce in northern Asia. Methods Pollen data and recently compiled macrofossil records were summarized for the Last Glacial Maximum (LGM), late glacial and Holocene. Genetic variation was assessed in 50 populations using one maternally (mitochondrial nad1) and one paternally (chloroplast trnT–trnL) inherited marker and analysed using spatial analyses of molecular variance (SAMOVA). Results Macrofossils showed that spruce was present in both northern Europe and Siberia at the LGM. Congruent macrofossil and pollen data from the late glacial suggested widespread expansions of spruce in the East European Plain, West Siberian Plain, southern Siberian mountains and the Baikal region. Colonization was largely completed during the early Holocene, except in the formerly glaciated area of northern Europe. Both DNA markers distinguished two highly differentiated groups that correspond to Norway spruce and Siberian spruce and coincide spatially with separate LGM spruce occurrences. The division of the mtDNA variation was geographically well defined and occurred to the east of the Ural Mountains along the Ob River, whereas the cpDNA variation showed widespread admixture. Genetic diversity of both DNA markers was higher in western than in eastern populations. Main conclusions North Eurasian Norway spruce and Siberian spruce are genetically distinct and occupied separate LGM refugia, Norway spruce on the East European Plain and Siberian spruce in southern Siberia, where they were already widespread during the late glacial. 
They came into contact in the basin of the Ob River and probably hybridized. The lower genetic diversity in the eastern populations may indicate that Siberian spruce suffered more from past climatic fluctuations than Norway spruce.
Abstract:
The abrupt Northern Hemispheric warming at the end of the twentieth century has been attributed to an enhanced greenhouse effect. Yet Greenland and the surrounding subpolar North Atlantic remained anomalously cold from the 1970s to the early 1990s. Here we reconstructed robust Greenland temperature records (North Greenland Ice Core Project and Greenland Ice Sheet Project 2) over the past 2100 years using argon and nitrogen isotopes in air trapped within ice cores, and we show that this cold anomaly was part of a recursive pattern of antiphase Greenland temperature responses to solar variability with a possible multidecadal lag. We hypothesize that high solar activity during the modern solar maximum (approximately 1950s–1980s) resulted in cooling over Greenland and the surrounding subpolar North Atlantic through a slowdown of the Atlantic Meridional Overturning Circulation and associated atmospheric feedback processes.
Abstract:
1 The Early Holocene sediment of a lake at tree line (Gouillé Rion, 2343 m a.s.l.) in the Swiss Central Alps was sampled for plant macrofossils. Thin (0.5 cm) slices, representing time intervals of c. 50 years each from 11 800 to 7800 cal. year bp, were analysed and the data compared with independent palaeoclimatic proxies to study vegetational responses to environmental change. 2 Alpine plant communities (e.g. with Salix herbacea) were established at 11 600–11 500 cal. year bp, when oxygen-isotope records showed that temperatures increased by c. 3–4 °C within decades. Larix decidua trees reached the site at c. 11 350 cal. year bp, probably in response to further warming by 1–2 °C. Forests dominated by L. decidua persisted until 9600 cal. year bp, when Pinus cembra became more important. 3 The dominance of Larix decidua for two millennia is explained by dry summer conditions, and possibly low winter temperatures, which favoured it over the late-successional Pinus cembra. Environmental conditions were a result of variations in the earth's orbit, leading to a maximum of summer and a minimum of winter solar radiation. Other heliophilous and drought-adapted species, such as Dryas octopetala and Juniperus nana, could persist in the open L. decidua forests, but were out-competed when the shade-tolerant P. cembra expanded. 4 The relative importance of Larix decidua decreased during periods of diminished solar radiation at 11 100, 10 100 and 9400 cal. year bp. Stable concentrations of L. decidua indicate that these percentage oscillations were caused by temporary increases of Pinus cembra, Dryas octopetala and Juniperus nana that can be explained by increases in moisture and/or decreases in summer temperature. 5 The final collapse of Larix decidua at 8400 cal. year bp was possibly related to abrupt climatic cooling as a consequence of a large meltwater input to the North Atlantic. Similarly, the temporary exclusion of Pinus cembra from tree line at 10 600–10 200 cal. 
year bp may be related to slowing down of thermohaline circulation at 10 700–10 300 cal. year bp. 6 Our results show that tree line vegetation was in dynamic equilibrium with climate, even during periods of extraordinarily rapid climatic change. They also imply that forecasted global warming may trigger rapid upslope movements of the tree line of up to 800 m within a few decades or centuries at most, probably inducing large-scale displacements of plant species as well as irrecoverable biodiversity losses.
Abstract:
To reconstruct the vegetation and fire history of the Upper Engadine, two continuous sediment cores from Lej da Champfèr and Lej da San Murezzan (Upper Engadine Valley, southeastern Switzerland) were analysed for pollen, plant macrofossils, charcoal and kerogen. The chronologies of the cores are based on 38 radiocarbon dates. Pollen and macrofossil data suggest a rapid afforestation with Betula, Pinus sylvestris, Pinus cembra, and Larix decidua after the retreat of the glaciers from the lake catchments 11,000 cal years ago. This vegetation type persisted until ca. 7300 cal b.p. (5350 b.c.), when Picea replaced Pinus cembra. Pollen indicative of human impact suggests that in this high-mountain region of the central Alps strong anthropogenic activities began during the Early Bronze Age (3900 cal b.p., 1950 b.c.). Local human settlements led to vegetational changes, promoting the expansion of Larix decidua and Alnus viridis. In the case of Larix, continuing land use and especially grazing after fire led to the formation of Larix meadows. The expansion of Alnus viridis was directly induced by fire, as evidenced by time-series analysis. Subsequently, the process of forest conversion into open landscapes continued for millennia and reached its maximum at the end of the Middle Ages, at around 500 cal b.p. (a.d. 1450).
Abstract:
Field: Surgery Abstract: Background: Preservation of cardiac grafts for transplantation is not standardized, and most centers use a single administration of crystalloid solution at the time of harvesting. We investigated possible benefits of an additional dose of cardioplegia dispensed immediately before implantation. Methods: Consecutive adult cardiac transplantations (2005–2012) were reviewed. Hearts were harvested following a standard protocol (Celsior 2 L, 4–8°C). In 2008, 100 ml of crystalloid cardioplegic solution was added and administered immediately before implantation. Univariate and logistic regression analyses were used to investigate risk factors for post-operative graft failure and mid-term outcome. Results: A total of 81 patients, 44 standard ("Cardio−") vs. 37 with additional cardioplegia ("CardioC"), were analyzed. Recipients and donors were comparable in both groups. CardioC patients demonstrated a reduced need for defibrillation (24 vs. 48%, p = 0.03), post-operative ratio of CK-MB/CK (10.1 ± 3.9 vs. 13.3 ± 4.2%, p = 0.001), intubation time (2.0 ± 1.6 vs. 7.2 ± 11.5 days, p = 0.05), and ICU stay (3.9 ± 2.1 vs. 8.5 ± 7.8 days, p = 0.001). Actuarial survival was reduced when graft ischemic time was >180 min in Cardio− but not in CardioC patients (p = 0.033). Organ ischemic time >180 min (OR: 5.48, CI: 1.08–27.75), donor female gender (OR: 5.84, CI: 1.13–33.01), and recipient/donor age >60 (OR: 6.33, CI: 0.86–46.75), but not the additional cardioplegia or the observation period, appeared to be independent predictors of post-operative acute graft failure. Conclusion: An additional dose of cardioplegia administered immediately before implantation may be a simple way to improve early and late outcome of cardiac transplantation, especially in situations of prolonged graft ischemia. A large, ideally multicentric, randomized study is desirable to verify this preliminary observation.
Abstract:
BACKGROUND The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment within a 2-year period from the date of CD diagnosis, was associated with the development of fewer disease complications when compared to 'late treatment', defined as treatment initiation >2 years after CD diagnosis. METHODS Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazard modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery and any of the aforementioned complications. RESULTS The 'early treatment' group of 292 CD patients was compared to the 'late treatment' group of 248 CD patients. We found that 'early treatment' with IM or TNF antagonists alone was associated with reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with reduced risk of undergoing intestinal surgery (HR 0.322, P = 0.005) and perianal surgery (HR 0.361, P = 0.042), as well as of developing any complication (HR 0.567, P = 0.006). CONCLUSIONS Treatment with immunomodulators or TNF antagonists within the first 2 years of CD diagnosis was associated with reduced risk of developing bowel strictures, when compared to initiating these drugs >2 years after diagnosis. Furthermore, early immunomodulator treatment was associated with reduced risk of intestinal surgery, perianal surgery and any complication.
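The hazard ratios above are estimates from Cox proportional hazards models. As a minimal illustration of what such a ratio expresses, the sketch below (entirely synthetic data with made-up rates, not the Swiss IBD Cohort analysis) estimates the ratio of two constant event hazards, where a value near 0.5 means the event rate is roughly halved:

```python
import random

random.seed(42)  # deterministic synthetic data

def exp_times(rate, n):
    """Draw n exponential event times with the given constant hazard."""
    return [random.expovariate(rate) for _ in range(n)]

# Hypothetical scenario: 'early treatment' halves the stricture hazard.
early = exp_times(0.5, 500)   # true hazard 0.5 events per unit time
late = exp_times(1.0, 500)    # true hazard 1.0 events per unit time

def hazard(times):
    """MLE of a constant hazard: events divided by total follow-up time."""
    return len(times) / sum(times)

hr = hazard(early) / hazard(late)   # lands near the true ratio of 0.5
print(round(hr, 2))
```

A Cox model generalizes this idea by leaving the baseline hazard unspecified and adjusting for covariates and censoring; the exponential assumption here merely keeps the estimator to a single line.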
Abstract:
Since the late 19th century different social actors have played an important role in providing social security in Switzerland. Cooperatives, philanthropic organisations, social insurances, and the poor relief of the communes were all part of a "mixed economy of welfare". This article examines how the different actors in this "mixed economy" worked together, and asks what forms of help they supplied. It raises the question of whether a dichotomy between public and private forms of relief can be traced in the Swiss case. Did democratically legitimised processes of redistribution shape the social security system? Or was social security rather funded by private relief programs? The author argues that in the early 20th century, a complex public-private mix structured the Swiss welfare state and the poor often depended on both public and private funding. In this system, financially potent philanthropic organisations successfully contested the legal power of public actors.
Abstract:
BACKGROUND The pathomechanisms underlying very late stent thrombosis (VLST) after implantation of drug-eluting stents (DES) are incompletely understood. Using optical coherence tomography, we investigated potential causes of this adverse event. METHODS AND RESULTS Between August 2010 and December 2014, 64 patients were investigated at the time point of VLST as part of an international optical coherence tomography registry. Optical coherence tomography pullbacks were performed after restoration of flow and analyzed at 0.4-mm intervals. A total of 38 early- and 20 newer-generation drug-eluting stents were suitable for analysis. VLST occurred at a median of 4.7 years (interquartile range, 3.1-7.5 years). A putative underlying cause was identified by optical coherence tomography in 98% of cases. The most frequent findings were strut malapposition (34.5%), neoatherosclerosis (27.6%), uncovered struts (12.1%), and stent underexpansion (6.9%). Uncovered and malapposed struts were more frequent in thrombosed than in nonthrombosed regions (ratio of percentages, 8.26; 95% confidence interval, 6.82-10.04; P<0.001, and 13.03; 95% confidence interval, 10.13-16.93; P<0.001, respectively). The maximal length of malapposed or uncovered struts (3.40 mm; 95% confidence interval, 2.55-4.25; versus 1.29 mm; 95% confidence interval, 0.81-1.77; P<0.001), but not the maximal or average axial malapposition distance, was greater in thrombosed than in nonthrombosed segments. The associations of both uncovered and malapposed struts with thrombus were consistent between early- and newer-generation drug-eluting stents. CONCLUSIONS The leading associated findings in VLST patients, in descending order, were malapposition, neoatherosclerosis, uncovered struts, and stent underexpansion, without differences between patients treated with early- and new-generation drug-eluting stents.
The longitudinal extent of malapposed and uncovered struts was the most important correlate of thrombus formation in VLST.
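The "ratio of percentages" with its 95% confidence interval reported in this abstract is typically computed on the log scale. A small sketch with hypothetical strut counts (the per-region denominators are not given in the abstract, so these numbers are invented for illustration) shows the standard Wald interval for a ratio of two proportions:

```python
import math

def ratio_ci(x1, n1, x2, n2, z=1.96):
    """Ratio of two proportions with a Wald confidence interval on the log scale."""
    p1, p2 = x1 / n1, x2 / n2
    rr = p1 / p2
    # Delta-method standard error of log(rr) for independent binomial counts.
    se = math.sqrt(1/x1 - 1/n1 + 1/x2 - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 330/2000 malapposed struts in thrombosed regions
# vs 40/2000 in nonthrombosed regions.
rr, lo, hi = ratio_ci(330, 2000, 40, 2000)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

With these made-up counts the point estimate lands near the reported 8.26; the actual registry analysis would additionally need to handle clustering of struts within stents, which this sketch ignores.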
Abstract:
The palynostratigraphy of two sediment cores from Soppensee, Central Switzerland (596 m asl) was correlated with nine regional pollen assemblage zones defined for the Swiss Plateau. This biostratigraphy shows that the sedimentary record of Soppensee includes the last 15 000 years, i.e. the entire Late-glacial and Holocene environmental history. The vegetation history of the Soppensee catchment was inferred by pollen and plant-macrofossil analyses of three different cores taken in the deepest part of the lake basin (27 m). On the basis of a high-resolution varve and calibrated radiocarbon chronology it was possible to estimate pollen accumulation rates, which, together with the pollen percentage data, formed the basis for the interpretation of past vegetation dynamics. The basal sediment dates back to the last glacial. After reforestation with juniper and birch at ca. 12 700 B.P., the vegetation changed at around 12 000 B.P. to a pine-birch woodland and at the onset of the Holocene to a mixed deciduous forest. At ca. 7000 B.P., fir expanded and dominated the vegetation, with beech becoming predominant ca. 500 ¹⁴C years later and remaining so until sometime during the Iron Age. Large-scale deforestation, especially during the Middle Ages, altered the vegetation cover drastically. During the Late-glacial period two distinct regressive phases in vegetation development are demonstrated, namely the Aegelsee oscillation (equivalent to the Older Dryas biozone) and the Younger Dryas biozone. No unambiguous evidence for Holocene climatic change was detected at Soppensee. Human presence is indicated by early cereal pollen, and distinct pulses of forest clearance as a result of human activity can be observed from the Neolithic period onwards.
Abstract:
Beginning in the late 1980s, lobster (Homarus americanus) landings for the state of Maine and the Bay of Fundy increased to levels more than three times their previous 20-year means. Reduced predation may have permitted the expansion of lobsters into previously inhospitable territory, but we argue that in this region the spatial patterns of recruitment and the abundance of lobsters are substantially driven by events governing the earliest life history stages, including the abundance and distribution of planktonic stages and their initial settlement as Young-of-Year (YOY) lobsters. Settlement densities appear to be strongly driven by abundance of the pelagic postlarvae. Postlarvae and YOY show large-scale spatial patterns commensurate with coastal circulation, but also multi-year trends in abundance and abrupt shifts in abundance and spatial patterns that signal strong environmental forcing. The extent of the coastal shelf that defines the initial settlement grounds for lobsters is important to future population modeling. We address one part of this definition by examining patterns of settlement with depth, and discuss a modeling framework for the full life history of lobsters in the Gulf of Maine.
Abstract:
My dissertation focuses mainly on Bayesian adaptive designs for phase I and phase II clinical trials. It includes three specific topics: (1) proposing a novel two-dimensional dose-finding algorithm for biological agents, (2) developing Bayesian adaptive screening designs to provide more efficient and ethical clinical trials, and (3) incorporating missing late-onset responses to make an early stopping decision. Treating patients with novel biological agents is becoming a leading trend in oncology. Unlike cytotoxic agents, for which toxicity and efficacy monotonically increase with dose, biological agents may exhibit non-monotonic patterns in their dose-response relationships. Using a trial with two biological agents as an example, we propose a phase I/II trial design to identify the biologically optimal dose combination (BODC), which is defined as the dose combination of the two agents with the highest efficacy and tolerable toxicity. A change-point model is used to reflect the fact that the dose-toxicity surface of the combinational agents may plateau at higher dose levels, and a flexible logistic model is proposed to accommodate the possible non-monotonic pattern for the dose-efficacy relationship. During the trial, we continuously update the posterior estimates of toxicity and efficacy and assign patients to the most appropriate dose combination. We propose a novel dose-finding algorithm to encourage sufficient exploration of untried dose combinations in the two-dimensional space. Extensive simulation studies show that the proposed design has desirable operating characteristics in identifying the BODC under various patterns of dose-toxicity and dose-efficacy relationships. Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. 
To more efficiently handle the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to simultaneously select among possible treatment combinations involving multiple agents. Our design is based on formulating the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During trial conduct, we use the current values of the posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multi-arm balanced factorial trial design: it yields a significantly higher probability of selecting the best treatment while at the same time allocating substantially more patients to efficacious treatments, and it provides higher power to identify the best treatment at the end of the trial. The proposed design is most appropriate for trials combining multiple agents, screening out the efficacious combinations to be investigated further. Phase II trials are usually single-arm trials conducted to test the efficacy of experimental agents and decide whether those agents are promising enough to be sent to phase III trials. Interim monitoring is employed to stop the trial early for futility, to avoid assigning an unacceptable number of patients to inferior treatments. We propose a Bayesian single-arm phase II design with continuous monitoring for estimating the response rate of the experimental drug.
To address the issue of late-onset responses, we use a piece-wise exponential model to estimate the hazard function of the time-to-response data and handle the missing responses using a multiple imputation approach. We evaluate the operating characteristics of the proposed method through extensive simulation studies, and show that it reduces the total trial duration and yields desirable operating characteristics for different physician-specified lower bounds of the response rate under different true response rates.
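The continuous futility monitoring described above rests on the posterior of the response rate. A stripped-down sketch (conjugate Beta-Binomial only; it omits the piece-wise exponential model and multiple imputation for late-onset responses, and the 0.05 stopping threshold is a made-up design parameter) computes P(response rate > p0 | data) and the resulting stop/continue decision:

```python
import math

def beta_post_prob_greater(successes, n, p0, a=1.0, b=1.0, steps=20000):
    """P(p > p0 | data) under a Beta(a, b) prior, by midpoint integration."""
    a_post, b_post = a + successes, b + n - successes
    # Log normalizing constant of the Beta(a_post, b_post) density.
    log_norm = math.lgamma(a_post + b_post) - math.lgamma(a_post) - math.lgamma(b_post)
    width = (1.0 - p0) / steps
    total = 0.0
    for i in range(steps):
        p = p0 + (i + 0.5) * width
        total += math.exp(log_norm + (a_post - 1) * math.log(p)
                          + (b_post - 1) * math.log(1 - p)) * width
    return total

# Hypothetical interim look: 3 responses in 20 patients, null rate p0 = 0.30.
prob = beta_post_prob_greater(3, 20, 0.30)
futility_stop = prob < 0.05   # hypothetical futility threshold
print(round(prob, 3), futility_stop)
```

In a running trial this probability would be recomputed after each accrual, stopping for futility once it drops below the chosen bound.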
Abstract:
There are two practical challenges in phase I clinical trial conduct: lack of transparency to physicians, and late-onset toxicity. In my dissertation, Bayesian approaches are used to address these two problems in clinical trial designs. The proposed simple optimal designs cast the dose-finding problem as a decision-making process for dose escalation and de-escalation, and they minimize the incorrect-decision error rate in finding the maximum tolerated dose (MTD). For the late-onset toxicity problem, a Bayesian adaptive dose-finding design for drug combinations is proposed. The dose-toxicity relationship is modeled using the Finney model. The unobserved delayed toxicity outcomes are treated as missing data, and Bayesian data augmentation is employed to handle the resulting missing data. Extensive simulation studies have been conducted to examine the operating characteristics of the proposed designs and demonstrate their good performance in various practical scenarios.
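The dissertation's exact decision rule is not spelled out in this abstract, but the flavor of an error-minimising escalate/stay/de-escalate design can be sketched with the published BOIN interval design (Liu & Yuan, 2015), which is a rule of exactly this type; treat it as a representative stand-in rather than the proposed design itself:

```python
import math

def boin_boundaries(phi, phi1=None, phi2=None):
    """Escalation/de-escalation boundaries of the BOIN interval design,
    chosen to minimise the probability of an incorrect dosing decision."""
    phi1 = 0.6 * phi if phi1 is None else phi1   # highest 'underdosing' rate
    phi2 = 1.4 * phi if phi2 is None else phi2   # lowest 'overdosing' rate
    lam_e = math.log((1 - phi1) / (1 - phi)) / math.log(phi * (1 - phi1) / (phi1 * (1 - phi)))
    lam_d = math.log((1 - phi) / (1 - phi2)) / math.log(phi2 * (1 - phi) / (phi * (1 - phi2)))
    return lam_e, lam_d

def decide(toxicities, n, phi):
    """Dose decision from the observed toxicity rate at the current dose."""
    lam_e, lam_d = boin_boundaries(phi)
    p_hat = toxicities / n
    if p_hat <= lam_e:
        return "escalate"
    if p_hat >= lam_d:
        return "de-escalate"
    return "stay"

# Target toxicity rate 30%: boundaries are roughly (0.236, 0.358),
# so 1 toxicity in 6 patients triggers escalation.
print(boin_boundaries(0.30))
print(decide(1, 6, 0.30))
```

The boundaries depend only on the target rate and two point hypotheses around it, which is what makes such interval designs transparent to physicians: the full decision table can be printed before the trial starts.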
Abstract:
Histone acetylation is a central event in transcriptional activation. The importance of this modification in mammalian development is highlighted by knockout studies revealing that loss of the histone acetyltransferases GCN5, p300, or CBP results in embryonic lethality. Furthermore, early embryogenesis is sensitive to the dosage of p300 and CBP, since double p300+/− CBP+/− heterozygotes die in utero, although either single heterozygote survives. PCAF and GCN5 physically interact with p300 and CBP in vitro. To determine whether these two groups of HATs interact functionally in vivo, we created mice lacking one or more alleles of p300, GCN5 or PCAF. As expected, we found that mice heterozygous for any one of these null alleles are viable. The majority of GCN5 p300 double heterozygotes also survive to adulthood with no apparent abnormalities. However, a portion of these mice die prior to birth. These embryos are developmentally stunted and exhibit increased apoptosis compared with wild-type or single GCN5 or p300 heterozygous littermates at E8.5. Tissue specification is unaffected in these embryos, but organ formation is compromised. In contrast, no abnormalities were observed in mice harboring mutations in both PCAF and p300, emphasizing the specificity of HAT functions in mammalian development. Since GCN5 null embryos die early in embryogenesis because of a marked increase in apoptosis, studies of its function and mechanism in late development and in tissue-specific differentiation are precluded. Here, we also report the establishment of a GCN5 null embryonic stem cell line and a conditional floxGCN5 mouse line, which will serve as powerful genetic tools to examine in depth the function of GCN5 in mammalian development and in adult tissues.
Abstract:
High tunnels have been successfully used in Iowa to modify the climate and extend the growing season for tomatoes and other crops. Without the use of supplemental heat these ventilated, single layered plastic structures have typically increased average inside air temperatures by 10°F or more over outside temperatures for the growing season. The same tunnel, however, will only increase the daily low temperature by about 1 or 2°F, thus making early season high tunnel plantings without additional heat or plant coverings risky in Iowa. Fabric row covers are commonly used in high tunnels to provide for an additional 2-4°F frost protection during cold evenings. The recommended planting date for high tunnel tomatoes in Iowa has been about April 16 (4 to 5 weeks ahead of the recommended outside planting date). Producers are also advised to have some sort of plant covering material available to protect plants during a late spring frost.